Mirror of https://github.com/rasbt/LLMs-from-scratch.git (synced 2026-04-10 12:33:42 +00:00)
Understanding PyTorch Buffers (#288)
Committed by: GitHub
Parent: 08040f024c
Commit: deea13e5c2
@@ -102,6 +102,7 @@ Several folders contain optional materials as a bonus for interested readers:
   - [Dataloader Intuition with Simple Numbers](ch02/04_bonus_dataloader-intuition)
 - **Chapter 3:**
   - [Comparing Efficient Multi-Head Attention Implementations](ch03/02_bonus_efficient-multihead-attention/mha-implementations.ipynb)
+  - [Understanding PyTorch Buffers](ch03/03_understanding-buffers/understanding-buffers.ipynb)
 - **Chapter 4:**
   - [FLOPS Analysis](ch04/02_performance-analysis/flops-analysis.ipynb)
 - **Chapter 5:**
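The notebook this commit links covers PyTorch buffers. As a rough illustration of the idea (a minimal sketch, not the notebook's own code; the class name `CausalMaskSketch` is made up for this example): `register_buffer` attaches a non-trainable tensor, such as a causal attention mask, to a module so that it moves with `.to(device)` and is saved in `state_dict()`, unlike a plain Python attribute.

```python
import torch
import torch.nn as nn


class CausalMaskSketch(nn.Module):
    """Hypothetical module showing why a causal mask is stored as a buffer."""

    def __init__(self, context_length):
        super().__init__()
        # register_buffer makes `mask` part of the module's state:
        # it follows .to(device) calls and appears in state_dict(),
        # but it is NOT a trainable parameter (no gradients).
        self.register_buffer(
            "mask",
            torch.triu(torch.ones(context_length, context_length), diagonal=1),
        )


m = CausalMaskSketch(4)
print("mask" in dict(m.named_buffers()))   # the mask is a registered buffer
print(list(m.parameters()))                # but there are no trainable parameters
```

Because `mask` is a buffer rather than a parameter, optimizers ignore it, while `m.to("cuda")` would still move it to the GPU alongside any weights.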