RoPE (Rotary positional embeddings) explained: The positional workhorse of modern LLMs

DeepLearning Hero

2 years ago - 14:06

Rotary Positional Embeddings: Combining Absolute and Relative

Efficient NLP

2 years ago - 11:17

How Rotary Position Embedding Supercharges Modern LLMs [RoPE]

Jia-Bin Huang

1 year ago - 13:39

Rotary Positional Embeddings Explained | Transformer

Outlier

4 months ago - 20:28

Rotary Position Embedding explained deeply (w/ code)

JakZee

1 year ago - 23:26

Rotary Positional Encodings | Explained Visually

Vizuara

8 months ago - 34:38

LLaMA explained: KV-Cache, Rotary Positional Embedding, RMS Norm, Grouped Query Attention, SwiGLU

Umar Jamil

2 years ago - 1:10:55

Rotary Positional Embeddings & Rotation Matrix + Python LLM code

Vuk Rosić

1 year ago - 11:05

Rotary Positional Embeddings Explained | How Transformers Encode Relative Position

ExplainingAI

2 days ago - 23:06

Rotary Positional Embeddings

Data Science Gems

2 years ago - 30:18

Large Language Models (LLM) - Part 5/16 - RoPE (Positional Encoding) in AI

Mr. Gyula Rabai

1 year ago - 4:17

How positional encoding works in transformers?

BrainDrain

2 years ago - 5:36

Give me 30 min, I will make RoPE click forever

Zachary Huang

1 month ago - 29:08

Transformer Architecture: Fast Attention, Rotary Positional Embeddings, and Multi-Query Attention

Rajistics - data science, AI, and machine learning

2 years ago - 1:21

Positional embeddings in transformers EXPLAINED | Demystifying positional encodings.

AI Coffee Break with Letitia

4 years ago - 9:40

Rotary Position Embedding for Dummies - PE for GPT Open Models

TFT - The Fact Treasure

4 months ago - 6:16

RoFormer: Enhanced Transformer with Rotary Position Embedding Explained

Gabriel Mongaras

2 years ago - 39:52

RoPE Rotary Position Embedding to 100K context length

Discover AI

1 year ago - 39:56

What Rotary Positional Embeddings (RoPE) don’t want you to know

Sciencing The Data

3 months ago - 12:03

What is Rotary Positional Embedding (RoPE)

Data Science Made Easy

2 months ago - 0:59

Why Sine & Cosine for Transformer Neural Networks

CodeEmporium

2 years ago - 0:51

RoPE: Rotary Positional Embeddings

Tales Of Tensors

2 months ago - 1:00

DoPE: Denoising Rotary Position Embedding

Keyur

13 days ago - 10:49

Stanford XCS224U: NLU | Contextual Word Representations, Part 3: Positional Encoding | Spring 2023

Stanford Online

2 years ago - 13:02

Round and Round We Go! What makes Rotary Positional Encodings useful?

Gabriel Mongaras

1 year ago - 32:31

How Attention Got So Efficient [GQA/MLA/DSA]

Jia-Bin Huang

1 month ago - 29:02

Rotary Positional Embeddings (RoPE): Part 1

West Coast Machine Learning

1 year ago - 1:25:51

🔥 Master RoPE (Rotary Positional Encoding) - The SECRET Behind GPT & LLaMA's Success! Code and math

Mehdi Hosseini Moghadam

6 months ago - 14:37

Positional Encoding in Transformers | Deep Learning

Learn With Jay

1 year ago - 25:54

[Korean subtitles] RoPE (Rotary positional embeddings) explained: The positional workhorse of modern LLMs

WTF_Zone

2 years ago - 14:07

The secret sauce behind ROPE (Rotary Positional embedding)

Md Mishfaq Ahmed

10 months ago - 2:01

How do Transformer Models keep track of the order of words? Positional Encoding

Serrano.Academy

1 year ago - 9:50

Positional Encoding | How LLMs understand structure

Pramod Goyal

11 months ago - 9:10

I Followed a Token Through the Transformers Architecture (Every Step)

Tales Of Tensors

2 weeks ago - 8:17

Rotary Positional Embedding | Introduction to Large Language Models | Mitesh M. Khapra | IIT Madras

IIT Madras - B.S. Degree Programme

1 year ago - 21:14

RoFormer: Transforming Transformers with Rotary Positional Embeddings

Arxflix

1 year ago - 3:34

How Rotary Encoder Works and How To Use It with Arduino

How To Mechatronics

9 years ago - 4:55

RoFormer: enhanced transformer with rotary position embedding

Jonas Almeida

1 year ago - 10:23

Coding LLaMA 2 from scratch in PyTorch - KV Cache, Grouped Query Attention, Rotary PE, RMSNorm

Umar Jamil

2 years ago - 3:04:11

VideoRoPE Enhancing Video Rotary Position Embedding for LLMs

AI Papers

10 months ago - 14:28