RoPE (Rotary positional embeddings) explained: The positional workhorse of modern LLMs

DeepLearning Hero

2 years ago - 14:06

Rotary Positional Embeddings: Combining Absolute and Relative

Efficient NLP

2 years ago - 11:17

How Rotary Position Embedding Supercharges Modern LLMs [RoPE]

Jia-Bin Huang

1 year ago - 13:39

Rotary Positional Embeddings Explained | Transformer

Outlier

6 months ago - 20:28

LLaMA explained: KV-Cache, Rotary Positional Embedding, RMS Norm, Grouped Query Attention, SwiGLU

Umar Jamil

2 years ago - 1:10:55

Why Rotating Vectors Solves Positional Encoding in Transformers | Rotary Positional Embeddings(ROPE)

ExplainingAI

1 month ago - 23:06

Rotary Positional Embeddings & Rotation Matrix + Python LLM code

Vuk Rosić

1 year ago - 11:05

Rotary Positional Encodings | Explained Visually

Vizuara

10 months ago - 34:38

Rotary Position Embedding explained deeply (w/ code)

JakZee

1 year ago - 23:26

Transformer Architecture: Fast Attention, Rotary Positional Embeddings, and Multi-Query Attention

Rajistics - data science, AI, and machine learning

2 years ago - 1:21

Rotary Positional Embeddings

Data Science Gems

2 years ago - 30:18

How positional encoding works in transformers?

BrainDrain

2 years ago - 5:36

Large Language Models (LLM) - Part 5/16 - RoPE (Positional Encoding) in AI

Mr. Gyula Rabai

1 year ago - 4:17

Rotary Position Embedding for Dummies - PE for GPT Open Models

TFT - The Fact Treasure

5 months ago - 6:16

Give me 30 min, I will make RoPE click forever

Zachary Huang

2 months ago - 29:08

Positional embeddings in transformers EXPLAINED | Demystifying positional encodings.

AI Coffee Break with Letitia

4 years ago - 9:40

DoPE: Denoising Rotary Position Embedding

Keyur

1 month ago - 10:49

RoPE (Rotary Position Embedding) in 3 minutes!

Kavishka Abeywardana

3 weeks ago - 3:14

RoPE Rotary Position Embedding to 100K context length

Discover AI

1 year ago - 39:56

RoFormer: Enhanced Transformer with Rotary Position Embedding Explained

Gabriel Mongaras

2 years ago - 39:52

Why Sine & Cosine for Transformer Neural Networks

CodeEmporium

3 years ago - 0:51

RoPE: Rotary Positional Embeddings

Tales Of Tensors

4 months ago - 1:00

DroPE Explained: Dynamic Rotary Position Embedding for Stable Long-Context LLM Inference

CosmoX

1 month ago - 7:34

What Rotary Positional Embeddings (RoPE) don’t want you to know

Sciencing The Data

4 months ago - 12:03

Rotary Positional Embedding | Introduction to Large Language Models | Mitesh M. Khapra | IIT Madras

IIT Madras - B.S. Degree Programme

1 year ago - 21:14

[Korean subtitles] RoPE (Rotary positional embeddings) explained: The positional workhorse of modern LLMs

WTF_Zone

2 years ago - 14:07

Positional Encoding | How LLMs understand structure

Pramod Goyal

1 year ago - 9:10

🔥 Master RoPE (Rotary Positional Encoding) - The SECRET Behind GPT & LLaMA's Success! Code and math

Mehdi Hosseini Moghadam

8 months ago - 14:37

What is Rotary Positional Embedding (RoPE)

Data Science Made Easy

3 months ago - 0:59

Positional Encoding in Transformers | Deep Learning

Learn With Jay

1 year ago - 25:54

RoFormer: enhanced transformer with rotary position embedding

Jonas Almeida

1 year ago - 10:23

RoPE: Rotary Position Embeddings

Евгений Разинков

4 months ago - 1:14:32

Coding LLaMA 2 from scratch in PyTorch - KV Cache, Grouped Query Attention, Rotary PE, RMSNorm

Umar Jamil

2 years ago - 3:04:11

Stanford XCS224U: NLU | Contextual Word Representations, Part 3: Positional Encoding | Spring 2023

Stanford Online

2 years ago - 13:02

RoPE embeddings: Math explained + implementation from scratch in code

NLP programming labs

10 months ago - 50:44

Rotary Positional Embeddings (RoPE): Part 1

West Coast Machine Learning

1 year ago - 1:25:51

How Attention Got So Efficient [GQA/MLA/DSA]

Jia-Bin Huang

3 months ago - 29:02

[Paper Review] RoFormer: Enhanced Transformer with Rotary Position Embedding (RoPE)

LOADING_

5 months ago - 7:13

ROTARY POSITION EMBEDDING

Vikas Sharma

3 months ago - 4:57

DoPE: Denoising Rotary Position Embedding (Nov 2025)

AI Paper Slop

3 months ago - 13:41

How do Transformer Models keep track of the order of words? Positional Encoding

Serrano.Academy

1 year ago - 9:50

Rotary Positional Embedding (RoPE) Explained

Marwan Elghitany

2 months ago - 22:02

Extending Transformer Context with Rotary Positional Embeddings

Digital Asylum - Museum of Cybernetic Anamolies

2 years ago - 8:29

RoFormer: Transforming Transformers with Rotary Positional Embeddings

Arxflix

1 year ago - 3:34

[AI Podcast] RoFormer: Enhanced Transformer with Rotary Position Embedding

Listening to AI Papers

6 months ago - 7:00

VideoRoPE Enhancing Video Rotary Position Embedding for LLMs

AI Papers

1 year ago - 14:28

Round and Round We Go! What makes Rotary Positional Encodings useful?

Gabriel Mongaras

1 year ago - 32:31

Reading AI Research Paper | RoFormer: Enhanced Transformer with Rotary Position Embedding

Gradient Descending

Streamed 1 year ago - 1:52:22

Rotary Positional Embeddings with code: Easy explanation, No mathematics

Subhankar Ghosh

2 years ago - 35:01

Transformer Positional Embeddings With A Numerical Example

Machine Learning with PyTorch

4 years ago - 6:21

Language Models Explained: Position Embeddings, Extrapolation, and Perplexity Evaluation

Ofir Press

2 years ago - 28:04

Lec 16 | Introduction to Transformer: Positional Encoding and Layer Normalization

NPTEL IIT Delhi

1 year ago - 1:26:53

Integer and Binary Positional Encodings | Journey towards Rotary Positional Encodings (RoPE)

Vizuara

10 months ago - 36:41

Beyond Real: Imaginary Extension of Rotary Position Embeddings for Long-Context LLMs (Dec 2025)

AI Paper Slop

2 months ago - 15:02

The secret sauce behind ROPE (Rotary Positional embedding)

Md Mishfaq Ahmed

11 months ago - 2:01

Positional Encoding in Transformers | Deep Learning | CampusX

CampusX

1 year ago - 1:13:15

Positional Encoding in Transformer | Sinusoidal Positional Encoding Explained

ExplainingAI

2 months ago - 20:34