Show HN: Learn LLMs LeetCode Style

TorchLeet is broken into two sets of questions:
- Question Set: A collection of PyTorch practice problems, ranging from basic to hard, designed to enhance your skills in deep learning and PyTorch.
- LLM Set: A new set of questions focused on understanding and implementing Large Language Models (LLMs) from scratch, including attention mechanisms, embeddings, and more.
Note
Avoid using GPT. Try to solve these problems on your own. The goal is to learn and understand PyTorch concepts deeply.
Table of Contents
Question Set
🔵Basic
Mostly for beginners to get started with PyTorch.
🟢Easy
Recommended for those who have a basic understanding of PyTorch and want to practice their skills.
🟡Medium
These problems are designed to challenge your understanding of PyTorch and deep learning concepts. They require you to implement things from scratch or apply advanced techniques.
- Implement parameter initialization for a CNN (Solution)
- Implement a CNN from Scratch
- Implement an LSTM from Scratch (Solution)
- Implement AlexNet from scratch
- Build a Dense Retrieval System using PyTorch
- Implement KNN from scratch in PyTorch (see the sketch after this list)
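To give a flavour of the Medium set, here is a minimal sketch of the KNN question under my own assumptions (the function name `knn_predict` and the toy data are illustrative; the repository's reference solution may be structured differently):

```python
import torch

def knn_predict(x_train: torch.Tensor, y_train: torch.Tensor,
                x_query: torch.Tensor, k: int = 5) -> torch.Tensor:
    """Classify each query point by majority vote among its k nearest training points."""
    # Pairwise Euclidean distances between queries and training points: (n_query, n_train)
    dists = torch.cdist(x_query, x_train)
    # Indices of the k smallest distances for each query point: (n_query, k)
    knn_idx = dists.topk(k, largest=False).indices
    # Majority vote over the neighbours' labels
    return y_train[knn_idx].mode(dim=1).values

# Toy usage with random data
x_train = torch.randn(100, 2)
y_train = torch.randint(0, 3, (100,))
x_query = torch.randn(5, 2)
print(knn_predict(x_train, y_train, x_query, k=7))  # tensor of 5 predicted labels
```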
🔴Hard
These problems are for advanced users who want to push their PyTorch skills to the limit. They involve complex architectures, custom layers, and advanced techniques.
- Write a custom autograd Function for the SiLU activation (Solution) (see the sketch after this list)
- Write a Neural Style Transfer
- Build a Graph Neural Network (GNN) from scratch
- Build a Graph Convolutional Network (GCN) from scratch
- Write a Transformer (Solution)
- Write a GAN (Solution)
- Write Sequence-to-Sequence with Attention (Solution)
- Enable distributed training in PyTorch (DistributedDataParallel)
- Work with Sparse Tensors
- Add Grad-CAM/SHAP to explain the model (Solution)
- Linear Probe on CLIP Features
- Add Cross Modal Embedding Visualization to CLIP (t-SNE/UMAP)
- Implement a Vision Transformer
- Implement a Variational Autoencoder
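For the Hard set, here is a minimal sketch of what the custom-autograd SiLU question is asking for, written independently of the repository's solution (the class name `SiLUFunction` is my own):

```python
import torch

class SiLUFunction(torch.autograd.Function):
    """Custom autograd Function for SiLU: f(x) = x * sigmoid(x)."""

    @staticmethod
    def forward(ctx, x):
        s = torch.sigmoid(x)
        ctx.save_for_backward(x, s)
        return x * s

    @staticmethod
    def backward(ctx, grad_output):
        x, s = ctx.saved_tensors
        # d/dx [x * sigmoid(x)] = sigmoid(x) + x * sigmoid(x) * (1 - sigmoid(x))
        return grad_output * (s + x * s * (1 - s))

# Sanity check: numerical gradients should match the hand-written backward
x = torch.randn(8, dtype=torch.double, requires_grad=True)
print(torch.autograd.gradcheck(SiLUFunction.apply, (x,)))  # True if backward is correct
```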
LLM Set
An all new set of questions to help you understand and implement Large Language Models from scratch.
Each question is designed to take you one step closer to building your own LLM.
- Implement KL Divergence Loss
- Implement RMS Norm
- Implement Byte Pair Encoding from Scratch (Solution)
- Create a RAG Search of Embeddings from a set of Reviews
- Implement Predictive Prefill with Speculative Decoding
- Implement Attention from Scratch (Solution) (see the sketch after this list)
- Implement Multi-Head Attention from Scratch (Solution)
- Implement Grouped Query Attention from Scratch (Solution)
- Implement KV Cache in Multi-Head Attention from Scratch
- Implement Sinusoidal Embeddings (Solution)
- Implement RoPE Embeddings (Solution)
- Implement SmolLM from Scratch (Solution)
- Implement Quantization of Models
- GPTQ
- Implement Beam Search atop LLM for decoding
- Implement Top-k Sampling atop LLM for decoding
- Implement Top-p Sampling atop LLM for decoding
- Implement Temperature Sampling atop LLM for decoding
- Implement LoRA on a layer of an LLM
- QLoRA
- Mix two models to create a Mixture of Experts
- Apply SFT on SmolLM
- Apply RLHF on SmolLM
- Implement DPO based RLHF
- Add continuous batching to your LLM
- Chunk Textual Data for Dense Passage Retrieval
- Implement Large-Scale Training => 5D Parallelism
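As a reference point for the attention question, here is a minimal sketch of single-head scaled dot-product attention under my own assumptions (the repository's notebooks may structure it differently, e.g. as an nn.Module):

```python
import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    """q, k, v: (batch, seq_len, d_k); mask: broadcastable to (batch, seq_len, seq_len)."""
    d_k = q.size(-1)
    # Similarity scores, scaled to keep softmax gradients well-behaved
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)        # (batch, seq_len, seq_len)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)                  # attention distribution
    return weights @ v, weights                              # context vectors, weights

# Toy usage: causal (lower-triangular) mask over a sequence of length 4
q = k = v = torch.randn(1, 4, 16)
causal_mask = torch.tril(torch.ones(1, 4, 4))
out, attn = scaled_dot_product_attention(q, k, v, causal_mask)
print(out.shape, attn.shape)  # torch.Size([1, 4, 16]) torch.Size([1, 4, 4])
```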
What's cool? 🚀
- Diverse Questions: Covers beginner to advanced PyTorch concepts (e.g., tensors, autograd, CNNs, GANs, and more).
- Guided Learning: Includes incomplete code blocks (`...` and `#TODO`) for hands-on practice, along with answers.
Getting Started
1. Install Dependencies
- Install PyTorch locally (follow the official installation instructions for your platform).
- Some problems need additional packages; install them as needed.
2. Structure
- Each question lives in a folder named by difficulty (Easy/Medium/Hard) along with the question ID.
- `qname.ipynb`: The question file with incomplete code blocks.
- `qname_SOLN.ipynb`: The corresponding solution file.
3. How to Use
- Navigate to `questions/` and pick a problem.
- Fill in the missing code blocks (`...`) and address the `#TODO` comments (see the example below).
- Test your solution and compare it with the corresponding file in `solutions/`.
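For illustration, here is a hypothetical question cell and one way to complete it (the class and attribute names are my own, not copied from a specific notebook):

```python
# Hypothetical question cell: the notebook hands you a scaffold and you
# replace `...` / #TODO with working code.
import torch.nn as nn

class LinearRegressionModel(nn.Module):
    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.linear = ...   # TODO: define an nn.Linear layer

    def forward(self, x):
        ...                 # TODO: return the layer's output

# One possible completion:
#   self.linear = nn.Linear(in_features, out_features)
#   return self.linear(x)
```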
Happy Learning! 🚀
Contribution
Feel free to contribute by adding new questions or improving existing ones. Ensure that new problems are well-documented and follow the project structure. Submit a PR and tag the authors.
Authors
- 💻 AI/ML Dev
- 💻 Developer
Stargazers over time