Congrats on 300k subs!
My God!! That's some good editing (zooming in and out to focus on text). Keep it up! Great video!
Ever since the model was released, I knew this vid was coming. Tonight I checked the feed for it on your channel, couldn't find it, and five mins later it's on my home feed.
The brain bit is the cutest thing I’ve ever seen 😊
Now I'm really excited to make full use of my M1 Max with 64 GB of RAM. I chose the 43 GB deepseek-r1:70b version, and it's amazing to see it working locally, even though it's notably slower than the web version. This is a fantasy come true!
You have one of the best channels on YouTube. I don't usually "join" a channel, but week by week you always amaze me.
You are not installing "DeepSeek R1" on all those Macs. You're installing a smaller model, based on a Llama or Qwen model, that is distilled from R1.
I am from India, it's 4 am here. Saw your video and just installed LM Studio. Man, thanks for making this video!
Hey, thanks. Great introduction to running LLMs on Mac hardware.
Was literally setting this up earlier today on my M3 Max and was curious about perf on other machine types so this is awesome! Great video as always!
FINALLY! Incredible video. I love AI on Mac, and as a software engineer, getting DeepSeek R1 to work locally is really worth it. Next time, try to push your M4 Max 128 GB to its limits with the largest DeepSeek R1 model you can fit!
Excited to see you use MLX and talk about quantization. As a MacBook Pro M3 Pro owner, I want to look a little into MLX and quantization myself. Would be amazing if you did some videos on those!
Definitely worth double! 😃
You are truly doing god's work! I've been thinking about exactly this since the models dropped.
I'm just going to binge-watch all your LLM videos now.
I love your videos. This one was a bit too fast and chaotic for me, though. Keep up the good work 😊
I've run the 14b version (almost 10 GB) via Ollama on a MacBook Pro M2 with 16 GB. It runs slow but OK. Impressive results, to be honest.
You read my mind and made the video I was looking for.
20:48 When analyzing RAM usage and power, I recommend macmon. You can even "brew install macmon". It gives you individual and total power usage, RAM usage, swap, and E- & P-core usage.