AI/ML STIG Lecture Series, 9 Feb 2026

Location
Virtual

Dates
9 February 2026, 4:00pm ET

Community
AI/ML STIG

Type
Seminar

Transformers

Speaker

Helen Qu, Flatiron

Build a decoder-only transformer (a small GPT-like language model) from scratch in PyTorch. Train it on the Tiny Shakespeare dataset for character-level language modeling and use it to generate text, understanding every component along the way.
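As a preview of the kind of model the session walks through, here is a minimal PyTorch sketch of a GPT-style decoder block built around causal self-attention, together with token and positional embeddings. The class names, layer sizes, and default hyperparameters are illustrative assumptions, not the speaker's actual code.

```python
# Minimal sketch (illustrative, not the speaker's code): a GPT-style decoder-only
# model with causal self-attention, token embeddings, and positional embeddings.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttention(nn.Module):
    def __init__(self, d_model, n_heads, block_size):
        super().__init__()
        self.n_heads = n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)   # project to queries, keys, values
        self.proj = nn.Linear(d_model, d_model)
        # lower-triangular mask: position t may only attend to positions <= t
        mask = torch.tril(torch.ones(block_size, block_size))
        self.register_buffer("mask", mask.view(1, 1, block_size, block_size))

    def forward(self, x):
        B, T, C = x.shape
        q, k, v = self.qkv(x).split(C, dim=-1)
        # reshape to (B, n_heads, T, head_dim)
        q = q.view(B, T, self.n_heads, C // self.n_heads).transpose(1, 2)
        k = k.view(B, T, self.n_heads, C // self.n_heads).transpose(1, 2)
        v = v.view(B, T, self.n_heads, C // self.n_heads).transpose(1, 2)
        # scaled dot-product attention: a learned, data-dependent mixing of the values
        att = (q @ k.transpose(-2, -1)) / (k.size(-1) ** 0.5)
        att = att.masked_fill(self.mask[:, :, :T, :T] == 0, float("-inf"))
        att = F.softmax(att, dim=-1)
        y = att @ v
        y = y.transpose(1, 2).contiguous().view(B, T, C)
        return self.proj(y)

class Block(nn.Module):
    """Pre-norm Transformer block: attention and MLP, each with a residual connection."""
    def __init__(self, d_model, n_heads, block_size):
        super().__init__()
        self.ln1 = nn.LayerNorm(d_model)
        self.attn = CausalSelfAttention(d_model, n_heads, block_size)
        self.ln2 = nn.LayerNorm(d_model)
        self.mlp = nn.Sequential(
            nn.Linear(d_model, 4 * d_model), nn.GELU(), nn.Linear(4 * d_model, d_model)
        )

    def forward(self, x):
        x = x + self.attn(self.ln1(x))
        x = x + self.mlp(self.ln2(x))
        return x

class TinyGPT(nn.Module):
    def __init__(self, vocab_size, d_model=128, n_heads=4, n_layers=4, block_size=256):
        super().__init__()
        self.block_size = block_size
        self.tok_emb = nn.Embedding(vocab_size, d_model)   # token embeddings
        self.pos_emb = nn.Embedding(block_size, d_model)   # learned positional embeddings
        self.blocks = nn.Sequential(
            *[Block(d_model, n_heads, block_size) for _ in range(n_layers)]
        )
        self.ln_f = nn.LayerNorm(d_model)
        self.head = nn.Linear(d_model, vocab_size)         # next-token logits

    def forward(self, idx):
        B, T = idx.shape
        pos = torch.arange(T, device=idx.device)
        x = self.tok_emb(idx) + self.pos_emb(pos)
        x = self.blocks(x)
        return self.head(self.ln_f(x))                     # (B, T, vocab_size)
```

For character-level Tiny Shakespeare, vocab_size is simply the number of distinct characters in the corpus, so the whole model stays small enough to train comfortably on a laptop or a single GPU.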

Topics Covered:

Self-attention as a learned, data-dependent mixing operator

Causal (masked) self-attention for autoregressive modeling

Building a GPT-style Transformer block from scratch

Token and positional embeddings

Training a small autoregressive language model

Text generation with temperature and top-k sampling
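To make the last two topics concrete, here is a minimal, illustrative sketch of a training step with cross-entropy loss and an autoregressive generation loop with temperature scaling and top-k filtering. It assumes a model like the TinyGPT sketch above (a block_size attribute and logits of shape (B, T, vocab)); the function names and defaults are assumptions, not the speaker's code.

```python
# Minimal sketch (illustrative assumptions, not the speaker's code): one training
# step on next-token prediction, and sampling with temperature and top-k filtering.
import torch
import torch.nn.functional as F

def train_step(model, optimizer, xb, yb):
    """One optimization step; xb, yb are (B, T) index tensors, yb is xb shifted by one token."""
    logits = model(xb)                                    # (B, T, vocab_size)
    loss = F.cross_entropy(logits.view(-1, logits.size(-1)), yb.view(-1))
    optimizer.zero_grad(set_to_none=True)
    loss.backward()
    optimizer.step()
    return loss.item()

@torch.no_grad()
def generate(model, idx, max_new_tokens, temperature=1.0, top_k=None):
    """Extend idx (B, T) by max_new_tokens sampled tokens, one at a time."""
    model.eval()
    for _ in range(max_new_tokens):
        idx_cond = idx[:, -model.block_size:]             # crop context to the block size
        logits = model(idx_cond)[:, -1, :]                # logits for the last position only
        logits = logits / temperature                     # <1 sharpens, >1 flattens the distribution
        if top_k is not None:
            v, _ = torch.topk(logits, top_k)
            logits[logits < v[:, [-1]]] = float("-inf")   # keep only the k most likely tokens
        probs = F.softmax(logits, dim=-1)
        next_tok = torch.multinomial(probs, num_samples=1)
        idx = torch.cat([idx, next_tok], dim=1)
    return idx
```

For character-level modeling, training batches are just consecutive character windows (the target is the input shifted by one position), and decoding sampled indices back to text is a per-character lookup.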

Meeting Connection

Join the Meeting
