Limited Seats Remaining
6-week course


The “T” of ChatGPT: Predictive Models that Discuss, Translate, and Compose


Group size

3-6 students

Outcome

A customized ChatGPT-based language application

Tuition

$495


Cohorts


Join the waitlist for our next cohort.

Sign up to be the first to know when we launch the next cohort.


The “T” of ChatGPT: Predictive Models that Discuss, Translate, and Compose

Dive into the fascinating world of Natural Language Processing as we unravel the magic behind cutting-edge language models, like the ones that power ChatGPT. We will focus on the “T” of ChatGPT: the Transformer. We will learn to build and train our own Transformer models, discover the inner workings of attention mechanisms, and explore advanced applications like text generation and sentiment analysis. Learn new ways of thinking about the patterns in language (and even music!) as we explore the potential of AI to generate natural sequences.


Week-by-week curriculum

Week 1

Introduction to NLP and RNNs - In session 1, we'll start with an overview of NLP and its applications, then introduce Recurrent Neural Networks (RNNs) and their role in NLP. We'll finish with a hands-on activity: implementing a basic RNN model for text generation (see the sketch below).
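
Here is a minimal sketch of the kind of model this activity builds: a tiny character-level RNN that learns to predict the next character and then samples new text. It assumes PyTorch; the course's starter code and framework may differ.

```python
# Minimal character-level RNN for text generation (assumes PyTorch;
# the course may use a different framework or starter code).
import torch
import torch.nn as nn

text = "hello world, hello pods"            # tiny toy corpus
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}  # char -> index
itos = {i: c for c, i in stoi.items()}      # index -> char

class CharRNN(nn.Module):
    def __init__(self, vocab_size, hidden_size=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.rnn = nn.RNN(hidden_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, vocab_size)

    def forward(self, x, h=None):
        out, h = self.rnn(self.embed(x), h)
        return self.head(out), h

model = CharRNN(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

# Train the model to predict the next character at every position.
ids = torch.tensor([stoi[c] for c in text]).unsqueeze(0)
x, y = ids[:, :-1], ids[:, 1:]
for step in range(200):
    logits, _ = model(x)
    loss = loss_fn(logits.reshape(-1, len(chars)), y.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()

# Generate a short sequence by repeatedly sampling the next character.
h, idx = None, torch.tensor([[stoi["h"]]])
generated = "h"
for _ in range(20):
    logits, h = model(idx, h)
    idx = torch.multinomial(torch.softmax(logits[:, -1], dim=-1), 1)
    generated += itos[idx.item()]
print(generated)
```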

Week 2

How do programmers use ChatGPT? - In session 2, you will learn what an API is and how companies, researchers, and engineers maintain APIs and repositories for public use (plus how to cite and acknowledge their contributions in your projects). By the end of this session, you will make your own call to the ChatGPT API in Python.
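
As a preview, here is a minimal sketch of such a call using the official openai Python package (v1+). The model name and prompts are placeholders, and you need your own API key; these details are assumptions, not the course's exact setup.

```python
# Minimal sketch of a ChatGPT API call with the openai Python package (v1+).
# Requires your own API key in the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # assumed model name; any chat model works
    messages=[
        {"role": "system", "content": "You are a helpful tutor."},
        {"role": "user", "content": "Explain what an API is in one sentence."},
    ],
)
print(response.choices[0].message.content)
```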

Week 3

Tokenization, Encoding, and Embedding - For session 3, you will learn a few ways we can break apart a word or sentence into tokens. You will learn one way that AI researchers “encode” these tokens into numeric objects, and discuss challenges associated with so-called “sparse” encoding schemes. You will also learn what “word embedding” means, and how it allows for a more meaningful and compact representation of words. You will then write programs to illustrate the “distances” between words.
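
The NumPy-only sketch below illustrates these ideas on a toy sentence: naive whitespace tokenization, a sparse one-hot encoding, a dense embedding, and cosine distance between word vectors. The embedding here is random, just to show shapes and mechanics; real embeddings are learned.

```python
# Tokenization, sparse one-hot encoding, dense embedding, and word "distance".
import numpy as np

sentence = "the cat sat on the mat"
tokens = sentence.split()                     # naive whitespace tokenization
vocab = sorted(set(tokens))
index = {w: i for i, w in enumerate(vocab)}

# Sparse one-hot encoding: one dimension per vocabulary word.
one_hot = np.eye(len(vocab))[[index[t] for t in tokens]]
print(one_hot.shape)   # (num_tokens, vocab_size) -- grows with the vocabulary

# Dense embedding: each word maps to a short vector (random here, learned in practice).
rng = np.random.default_rng(0)
embedding = rng.normal(size=(len(vocab), 4))  # vocab_size x embedding_dim

def cosine_distance(u, v):
    """Distance between two word vectors: 0 means same direction, 2 means opposite."""
    return 1 - np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

print(cosine_distance(embedding[index["cat"]], embedding[index["mat"]]))
```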

Week 4

Limitations of RNNs - LSTM and GRU - In session 4, you will learn about shortcomings of RNN models in handling “memory”. You'll also learn how LSTM (Long Short-Term Memory) and GRU (Gated Recurrent Unit) variants of RNNs represent partial approaches to overcoming these challenges. We will end the session with a hands-on activity.
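
For orientation, here is a minimal shapes-only comparison of the three recurrent layers discussed this week, assuming PyTorch: the LSTM carries a separate cell state for longer-term memory, while the GRU uses gates with a single state tensor.

```python
# Plain RNN vs. LSTM vs. GRU: same input, same output shape, different internal state.
import torch
import torch.nn as nn

batch, seq_len, features, hidden = 2, 10, 8, 16
x = torch.randn(batch, seq_len, features)

rnn = nn.RNN(features, hidden, batch_first=True)
lstm = nn.LSTM(features, hidden, batch_first=True)
gru = nn.GRU(features, hidden, batch_first=True)

out_rnn, h_rnn = rnn(x)        # plain RNN: a single hidden state
out_lstm, (h, c) = lstm(x)     # LSTM: hidden state plus a separate cell state
out_gru, h_gru = gru(x)        # GRU: gated updates, but only one state tensor

print(out_rnn.shape, out_lstm.shape, out_gru.shape)  # all (batch, seq_len, hidden)
```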

Week 5

Global Context - The Transformer - For session 5, you will learn more about the limitations of RNNs, LSTMs, and GRUs in capturing long-range dependencies. You will also learn about Transformers as a solution for capturing global context. Additionally, we'll learn about the Attention mechanism in Transformers, and explore the differences in training models with and without Attention.
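
The sketch below shows scaled dot-product self-attention in plain NumPy: every position scores its similarity against every other position and takes a weighted mix of their values, which is how the Transformer captures global context. It deliberately omits multiple heads, masking, and learned projections.

```python
# Scaled dot-product self-attention, the core operation of the Transformer.
import numpy as np

def attention(Q, K, V):
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                            # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)    # softmax over keys
    return weights @ V                                         # weighted mix of values

rng = np.random.default_rng(0)
seq_len, d_model = 5, 8
X = rng.normal(size=(seq_len, d_model))  # one vector per token position
out = attention(X, X, X)                 # self-attention: Q = K = V = X
print(out.shape)                         # (seq_len, d_model)
```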

Week 6

Advanced Transformer Applications and Future of NLP - In our last session, we will finish building our interactive ChatGPT applications in Python, and present our application (or research findings) to the group. We will then discuss advanced Transformer architectures (e.g., BERT, GPT). To close out, we will discuss current trends, ethical considerations, and future directions in NLP.
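
One way to keep experimenting with BERT- and GPT-style models after the course is the Hugging Face transformers library; the pipelines, model, and tasks below are illustrative assumptions, not part of the course materials.

```python
# Pretrained Transformers via Hugging Face pipelines (models download on first use).
from transformers import pipeline

# A BERT-style encoder applied to sentiment analysis.
classifier = pipeline("sentiment-analysis")
print(classifier("I built my first Transformer this week!"))

# A GPT-style decoder used for text generation.
generator = pipeline("text-generation", model="gpt2")
print(generator("Natural language processing is", max_new_tokens=20)[0]["generated_text"])
```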