I love the high-level introduction concepts. Coding and fine-tuning start at 1:52:00.
Play at 1.5x, you won't miss anything. Oh btw, that's an example of quantization. Feel free to skip the first 30 minutes if you already understand this.
[0, 1, 2, ..., 1000] -> asymmetric quant; [-15, -14, ..., 1000] -> asymmetric quant
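For anyone skimming the comments: since neither range above is symmetric around zero, both need asymmetric quantization. Here is a minimal, self-contained sketch of asymmetric int8 quantization (the function names and test values are my own, not from the video): a `scale` maps float steps to integer steps, and a `zero_point` shifts the range so the minimum lands on -128.

```python
# Sketch of asymmetric quantization to signed 8-bit integers.
def asymmetric_quantize(xs, n_bits=8):
    qmin, qmax = -(2 ** (n_bits - 1)), 2 ** (n_bits - 1) - 1  # -128 and 127 for int8
    x_min, x_max = min(xs), max(xs)
    scale = (x_max - x_min) / (qmax - qmin)     # float value covered per integer step
    zero_point = round(qmin - x_min / scale)    # integer that represents float 0.0
    q = [max(qmin, min(qmax, round(x / scale) + zero_point)) for x in xs]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    return [(v - zero_point) * scale for v in q]

xs = [0.0, 1.0, 2.0, 500.0, 1000.0]   # an asymmetric (all-positive) range like [0, 1000]
q, scale, zp = asymmetric_quantize(xs)
restored = dequantize(q, scale, zp)   # each value recovered to within scale/2
```

With the range [0, 1000], `scale` is about 3.92 and `zero_point` is -128, so 0.0 maps to -128 and 1000.0 maps to 127; dequantizing recovers each value to within half a step.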
Thank you, Krish sir. Recently I have been working with various LLMs, and I am trying to build my own LLM for a specific domain. You helped me learn a lot through this video. Thank you for this beautiful tutorial and guidance. Lots of love from Nepal.
This was the best course I have seen when it comes to fine-tuning. I had already fine-tuned 5 models before this video, and by this time next year it should be 500, only because you have streamlined fine-tuning for me.
This is an amazing end-to-end tutorial for anyone who wants to learn fine-tuning concepts with hands-on labs. Thanks Krish!
Quantisation is such a simple thing. Can't believe how much time is being spent on it. There are videos out there that are a far more effective use of your time.
Thanks to you I finished my master's degree <3
I think freeCodeCamp should use AI to remove the repeated sections in the video; this would help viewers. Currently you are stitching multiple YouTube videos together to make one big video, but many things are repeated throughout it.
I don't understand why people call the first round of training "pre-training". Pre-training is training, and fine-tuning is a second round of training.
I have a job interview tomorrow for a genAI job. I barely know anything about it, but I am going through this tonight. If I get the job tomorrow, I will come back and post a thank-you.
If we convert 32-bit values into 8-bit values, the calculations happen quite a bit quicker. This guy is a legend.
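A quick back-of-the-envelope sketch of why (my own illustration, not from the video): storing the same number of weights as 8-bit integers instead of 32-bit floats cuts the memory footprint 4x, and moving less data is a large part of the speedup on most hardware.

```python
from array import array

n = 1_000_000  # hypothetical number of model weights

# Bytes needed for n weights in each representation:
fp32_bytes = array('f', [0.0]).itemsize * n  # 'f' = 32-bit float, 4 bytes each
int8_bytes = array('b', [0]).itemsize * n    # 'b' = signed 8-bit int, 1 byte each

ratio = fp32_bytes // int8_bytes  # 4x smaller after 32-bit -> 8-bit quantization
```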
Which app is he using for writing notes?
He is one of the best teachers. Stop spreading lies
Thank you for sacrificing your hair to teach us 🙏
Thanks for the video! Do you have more on this topic?
Thank you so much, I was looking for something just like this for the past few days!
Did you cover the "out-of-scope answering restrictions" for fine-tuning LLMs in this video?❤
Hi Krish, I think one correction is needed: at 1:27 you said anything multiplied by 1 is 1 and anything multiplied by -1 is -1, but anything multiplied by 1 is itself, and anything multiplied by -1 is its negative. Please can you correct that statement? Just sharing input since you are doing a great job, and I wanted to make sure the video doesn't have unnoticed mistakes. Thanks for the video.