Most of the datasets you'll find will have more than 3 dimensions. How are you supposed to understand and visualize n-dimensional data? Enter dimensionality reduction techniques. We'll go over the math behind the most popular such technique, Principal Component Analysis.
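As a quick taste of that math, here's a minimal PCA sketch in NumPy (not the code from the repo below, just an illustrative example on made-up data): center the data, compute the covariance matrix, take its eigenvectors, and project onto the top components.

```python
import numpy as np

# Toy data: 200 samples with 5 features (stand-in for any n-dimensional dataset)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))

# 1. Center the data so each feature has zero mean
X_centered = X - X.mean(axis=0)

# 2. Compute the covariance matrix of the features
cov = np.cov(X_centered, rowvar=False)

# 3. Eigendecompose the covariance matrix; eigenvectors are the principal components
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# 4. Sort components by explained variance (largest eigenvalue first) and keep the top 2
order = np.argsort(eigenvalues)[::-1]
components = eigenvectors[:, order[:2]]

# 5. Project the data onto the top 2 principal components
X_reduced = X_centered @ components
print(X_reduced.shape)  # (200, 2)
```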
Code for this video:
github.com/llSourcell/Dimensionality_Reduction
Ong's Winning Code:
github.com/jrios6/Math-of-Intelligence/tree/master…
Hammad's Runner-Up Code:
github.com/hammadshaikhha/Math-of-Machine-Learning…
Please Subscribe! And like. And comment. That's what keeps me going.
I used a screengrab from 3blue1brown's awesome videos: @3blue1brown
More learning resources:
plot.ly/ipython-notebooks/principal-component-anal…
• Principal Components Analysis (PCA) Tutori...
www.dezyre.com/data-science-in-python-tutorial/pri…
georgemdallas.wordpress.com/2013/10/30/principal-c…
setosa.io/ev/principal-component-analysis/
sebastianraschka.com/Articles/2015_pca_in_3_steps.…
algobeans.com/2016/06/15/principal-component-analy…
Join us in the Wizards Slack channel:
wizards.herokuapp.com/
And please support me on Patreon:
www.patreon.com/user?u=3191693
Follow me:
Twitter: twitter.com/sirajraval
Facebook: www.facebook.com/sirajology
Instagram: www.instagram.com/sirajraval/
Signup for my newsletter for exciting updates in the field of AI:
goo.gl/FZzJ5w
Hit the Join button above to sign up to become a member of my channel for access to exclusive content!
Join my AI community: chatgptschool.io/
Sign up for my AI sports betting bot, WagerGPT (500 spots available): www.wagergpt.xyz/