# Byte Pair Encoding (BPE) Tokenizer From Scratch
- bpe-from-scratch-simple.ipynb contains optional (bonus) code that explains and shows how the BPE tokenizer works under the hood; it is geared toward simplicity and readability.
- bpe-from-scratch.ipynb implements a more sophisticated (and considerably more complex) BPE tokenizer that behaves similarly to tiktoken with respect to all the edge cases; it also has additional functionality for loading the official GPT-2 vocabulary.
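To give a flavor of what the notebooks cover, here is a minimal sketch of the core BPE training step: count adjacent token pairs, find the most frequent pair, and merge it into a new token. The function names (`most_frequent_pair`, `merge_pair`) are illustrative, not taken from the notebooks, and this omits the byte-level pre-tokenization and special-token handling that the full implementation deals with.

```python
from collections import Counter

def most_frequent_pair(tokens):
    # Count adjacent token pairs and return the most common one
    pairs = Counter(zip(tokens, tokens[1:]))
    return max(pairs, key=pairs.get)

def merge_pair(tokens, pair, new_token):
    # Replace every occurrence of `pair` with `new_token`
    merged = []
    i = 0
    while i < len(tokens):
        if i < len(tokens) - 1 and (tokens[i], tokens[i + 1]) == pair:
            merged.append(new_token)
            i += 2
        else:
            merged.append(tokens[i])
            i += 1
    return merged

# One training iteration, starting from the characters of a tiny corpus
tokens = list("aaabdaaabac")
pair = most_frequent_pair(tokens)          # -> ('a', 'a')
tokens = merge_pair(tokens, pair, "aa")
print(tokens)  # ['aa', 'a', 'b', 'd', 'aa', 'a', 'b', 'a', 'c']
```

Repeating this loop until a target vocabulary size is reached yields the merge table that the tokenizer later replays, in order, to encode new text.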