@jolin-qe2zc

Thanks a lot for your video, it makes autodiff make so much sense. I was confused about the mechanism behind PyTorch autograd.

@nilspin

How I understood forward vs reverse autodiff: in forward mode we ask "how much does the output change when we change an input?", while in reverse mode each node asks "how should each input change to match the reference?", which makes very practical sense from a training perspective.
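A minimal sketch of the two views (my own toy example, not from the lecture) for f(x1, x2) = x1*x2 + sin(x1). Forward mode pushes one input perturbation forward; reverse mode pulls the output adjoint back to every input in a single pass:

```python
import math

# Forward mode: propagate (value, derivative) pairs from inputs to output.
# Answers: "how much does the output change if I nudge x1?"
def f_forward(x1, x2, dx1, dx2):
    v1, dv1 = x1, dx1
    v2, dv2 = x2, dx2
    v3, dv3 = v1 * v2, dv1 * v2 + v1 * dv2      # product rule
    v4, dv4 = math.sin(v1), math.cos(v1) * dv1  # chain rule
    return v3 + v4, dv3 + dv4

# Reverse mode: run the forward pass, then propagate adjoints (bars) from the
# output back to every input.
# Answers: "in which direction should each input move to change the output?"
def f_reverse(x1, x2):
    v3 = x1 * x2
    v4 = math.sin(x1)
    y = v3 + v4
    y_bar = 1.0                       # dy/dy
    v3_bar, v4_bar = y_bar, y_bar     # y = v3 + v4
    x1_bar = v3_bar * x2 + v4_bar * math.cos(x1)
    x2_bar = v3_bar * x1
    return y, (x1_bar, x2_bar)

print(f_forward(2.0, 3.0, 1.0, 0.0))  # dy/dx1 at (2, 3): 3 + cos(2)
print(f_reverse(2.0, 3.0))            # full gradient from one backward pass
```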

@dionysusegeriow3017

This is the best course ever!

@mananshah2140

Beautifully explained!

@soumitrapandit3444

This was amazing. Thank you so much!

@oraz.

This is the best autodiff lesson

@jieshen3441

Great work, Tianqi!

@kenpaul-h2w

46:53 V_(2->4) bar is wrong ,isn't it?

@howardsmith4128

Great lesson, thanks!

@sarracen1a

Awesome, this saved my final project 😭

@nitinnilesh

17:07 Can someone please explain how the number of multiplications is n(n-2)?

@Poti221

Very nice explanation!

@HPC4AI

Nice lecture! Can anyone provide some resources to read more about this material, please? Especially reverse-mode AD by extending the computational graph?
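Not a resource list, but here is a toy sketch of what "reverse-mode AD by extending the computational graph" means (the Node/gradient names are my own, not the course's API): instead of computing numeric adjoints during a backward pass, the backward pass builds new symbolic nodes for the adjoints, so the gradients become part of the same graph and can themselves be differentiated or optimized later.

```python
class Node:
    """A tiny symbolic graph node: either a leaf or an add/mul of two nodes."""
    def __init__(self, op, inputs, name=""):
        self.op, self.inputs, self.name = op, inputs, name

    def __add__(self, other):
        return Node("add", [self, other])

    def __mul__(self, other):
        return Node("mul", [self, other])

    def __repr__(self):
        if self.op == "leaf":
            return self.name
        sym = {"add": "+", "mul": "*"}[self.op]
        return f"({self.inputs[0]} {sym} {self.inputs[1]})"

def leaf(name):
    return Node("leaf", [], name)

def topo_order(out):
    order, seen = [], set()
    def visit(n):
        if id(n) in seen:
            return
        seen.add(id(n))
        for i in n.inputs:
            visit(i)
        order.append(n)
    visit(out)
    return order

def gradient(out, wrt):
    """Extend the graph with adjoint nodes; return d out / d w for each w in wrt."""
    adjoints = {out: leaf("1")}          # d out / d out = 1
    for node in reversed(topo_order(out)):
        out_bar = adjoints[node]
        if node.op == "add":             # both inputs receive out_bar
            contribs = [out_bar, out_bar]
        elif node.op == "mul":           # a*b: a gets out_bar*b, b gets out_bar*a
            a, b = node.inputs
            contribs = [out_bar * b, out_bar * a]
        else:
            continue
        for inp, c in zip(node.inputs, contribs):
            # Sum adjoint contributions by creating new add nodes in the graph.
            adjoints[inp] = adjoints[inp] + c if inp in adjoints else c
    return [adjoints[w] for w in wrt]

x, y = leaf("x"), leaf("y")
z = x * y + x
print(gradient(z, [x, y]))   # [(1 + (1 * y)), (1 * x)], i.e. dz/dx = y + 1, dz/dy = x
```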

@zhiqiqin1286

great!

@AnEnderNon

omg tysm