From 5a537777fe3bbfb7e866616750abc44af278f3f8 Mon Sep 17 00:00:00 2001
From: Frank Xu
Date: Wed, 12 Feb 2025 18:08:01 -0500
Subject: [PATCH] add youtube link to lab1

---
 lab01/README.md | 12 ++++++------
 1 file changed, 6 insertions(+), 6 deletions(-)

diff --git a/lab01/README.md b/lab01/README.md
index d854cfe..4176394 100644
--- a/lab01/README.md
+++ b/lab01/README.md
@@ -121,12 +121,6 @@ We use a pre-trained AI model called **BERT (Bidirectional Encoder Representatio
 
 ### How BERT Works
 
-
-
-[![Watch Another YouTube Video](http://img.youtube.com/vi/EOmd5sUUA_A/0.jpg)](https://www.youtube.com/watch?v=EOmd5sUUA_A)
-
-[![Watch my YouTube Video](http://img.youtube.com/vi/xI0HHN5XKDo/0.jpg)](https://www.youtube.com/watch?v=xI0HHN5XKDo)
-
 BERT reads the entire sentence at once and uses a mechanism called **self-attention** to focus on important words. For example:
 
 ```
@@ -134,6 +128,12 @@ Sentence: "The bank was robbed."
 BERT understands that "bank" refers to a financial institution because of the word "robbed."
 ```
 
+
+
+[![Watch Another YouTube Video](http://img.youtube.com/vi/EOmd5sUUA_A/0.jpg)](https://www.youtube.com/watch?v=EOmd5sUUA_A)
+
+[![Watch my YouTube Video](http://img.youtube.com/vi/xI0HHN5XKDo/0.jpg)](https://www.youtube.com/watch?v=xI0HHN5XKDo)
+
 ### Why Use BERT?
 
 - **Pre-trained**: BERT has already learned from millions of sentences, so it understands language well.