Understanding PyTorch Buffers (#288)

This commit is contained in:
Sebastian Raschka
2024-07-26 08:45:36 -05:00
committed by GitHub
parent 6dd8666d9c
commit 1e873d4cbc
4 changed files with 567 additions and 1 deletions

@@ -6,4 +6,5 @@
## Bonus Materials
- [02_bonus_efficient-multihead-attention](02_bonus_efficient-multihead-attention) implements and compares different implementation variants of multihead-attention
- [03_understanding-buffers](03_understanding-buffers) explains the idea behind PyTorch buffers, which are used to implement the causal attention mechanism in chapter 3
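
The core idea behind the buffers mentioned above can be sketched as follows. This is a minimal illustration, not the repository's actual code: a hypothetical `CausalMask` module registers a causal attention mask via `register_buffer`, so the mask moves with the module across devices and is saved in the `state_dict`, yet is not treated as a trainable parameter.

```python
import torch
import torch.nn as nn

class CausalMask(nn.Module):
    """Minimal sketch: a causal mask stored as a PyTorch buffer."""

    def __init__(self, context_length):
        super().__init__()
        # register_buffer makes `mask` part of the module's state
        # (device transfers, state_dict) without making it a parameter
        self.register_buffer(
            "mask",
            torch.triu(torch.ones(context_length, context_length), diagonal=1),
        )

    def forward(self, attn_scores):
        num_tokens = attn_scores.shape[-1]
        # mask out future positions with -inf before softmax
        return attn_scores.masked_fill(
            self.mask.bool()[:num_tokens, :num_tokens], float("-inf")
        )

m = CausalMask(context_length=4)
print("mask" in dict(m.named_buffers()))  # True: registered as a buffer
print("mask" in m.state_dict())           # True: saved with the model
print(len(list(m.parameters())))          # 0: not a trainable parameter
```

Had the mask been stored as a plain attribute instead, `m.to("cuda")` would not move it along with the module, which is exactly the pitfall buffers avoid.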