Character-level language models from bigrams to WaveNet: the training diagnostics, batch normalization, and manual backpropagation skills you use daily.
5 sections · 15 lessons
[ ] Bigrams (4 lessons)
[ ] The MLP Language Model (3 lessons)
[ ] Activations and Batch Normalization (3 lessons)
Learn the systems engineering layer for context windows and memory.
[ ] Context Engineering
You already understand tokens and embeddings. This course teaches the systems engineering layer: managing context windows, sub-agents, and memory strategies at scale.
8 sections · 25 lessons
[ ] Tokens and Inference (2 lessons)
[ ] The Real Size of Your Context Window (4 lessons)
[ ] Anatomy of the Messages Array (5 lessons)
[ ] Dynamic Allocation: Tool Calling (2 lessons)
[ ] The Ralph Wiggum Loop (3 lessons)
[ ] Sub-Agents: Managed Runtimes for AI (3 lessons)
[ ] Message Passing: The Erlang OTP of AI (3 lessons)
Learn agent loop design, parallel execution, and production guardrails.
[ ] Understanding Tool Calling
The advanced patterns and security sections cover agent loop design, parallel tool execution, and production guardrails: the engineering side of your ML systems.