Syllabi.

Curated reading lists from the world's leading research institutions.

Featured Syllabus
Curated by Stanford AI Lab · Updated 2 days ago

Foundation Models & LLMs

A comprehensive syllabus on transformer architectures, GPT models, and the evolution of LLMs from 2017 to present.

Contains 127 seminal papers
Table of Contents (Preview)
  1. Attention Is All You Need
  2. BERT: Pre-training of Deep Bidirectional Transformers
  3. Language Models are Few-Shot Learners
+ 124 more references