@sassk73

So grateful that I came across this channel through this video! I love how you explain the concepts so well and make them easy to understand by being so calm and composed and showcasing easy-to-understand examples! Please continue the great work :)

@cheukchitse8692

As a beginner in this field, I think your videos make the whole process quite clear and easy to understand. Thank you so much, Professor Raschka. Keep creating excellent content plz :)

@nak6608

omg you have a YouTube channel! 
I've been working through your Python ML book for months now. I've been struggling with page 228, so I went to YouTube. Five minutes into this video I was like, "this guy is using all the same examples from the book" lol, and to my surprise it was you!
This is amazing! I'll have to go back and watch some of your other lectures!

@talh123

Thank you so much for sharing the lectures. They are really helpful!

@FarizDarari

Wonderful illustrations, thanks a lot!

@rohitgarg776

Very nice explanation

@Jonathanwu_tech

very helpful video!

@wellkamp

So many thanks for the great explanation!

@mehdimaboudi2703

Many thanks for sharing the lecture video.
Page 16: it seems that instead of the ceiling function, k > ceil(n/2), the floor function should be used: k > floor(n/2).
If we have 15 base estimators, then k > 7.
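
For anyone checking the arithmetic, here is a minimal sketch of the ensemble-error computation under the "strictly more than floor(n/2) classifiers wrong" reading (the error value of 0.25 is just an example):

```python
import math
from scipy.special import comb

def ensemble_error(n_classifier, error):
    # The majority vote is wrong only when MORE than floor(n/2) of the
    # base classifiers are wrong, i.e. k >= floor(n/2) + 1.
    k_start = math.floor(n_classifier / 2) + 1   # 8 when n_classifier = 15
    return sum(comb(n_classifier, k) * error**k * (1 - error)**(n_classifier - k)
               for k in range(k_start, n_classifier + 1))

print(ensemble_error(n_classifier=15, error=0.25))  # error rate of the 15-member ensemble
```

Note that for odd n, starting the sum at floor(n/2) + 1 is the same as starting it at ceil(n/2); the problematic form would be the strict inequality k > ceil(n/2), which would skip the k = 8 case for n = 15.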

@113_bachtiardanuarta_b2

So in majority voting the final prediction value is 100%, whereas in average voting the prediction value is the average of all the class predictions?
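
Roughly, yes: in hard (majority) voting each classifier contributes one class-label vote, while in soft voting the predicted class probabilities are averaged and the class with the highest average wins. A minimal scikit-learn sketch (the toy data and estimator choices are assumptions for illustration):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, random_state=1)
estimators = [('lr', LogisticRegression(max_iter=1000)),
              ('dt', DecisionTreeClassifier(max_depth=3)),
              ('knn', KNeighborsClassifier())]

# Hard voting: each classifier casts one vote for a class label.
hard = VotingClassifier(estimators=estimators, voting='hard').fit(X, y)

# Soft voting: the predicted class probabilities are averaged,
# and the class with the highest average probability wins.
soft = VotingClassifier(estimators=estimators, voting='soft').fit(X, y)

print(hard.predict(X[:5]), soft.predict(X[:5]))
print(soft.predict_proba(X[:5]))  # averaged probabilities (hard voting has no predict_proba)
```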

@abdireza1298

Professor Raschka, please allow me to ask.

Is there any theoretical background for which algorithms we can and can't combine into a voting classifier?

Say we built a model from X and y using 
- Logistic regression with Lasso penalty, 
- Logistic regression with Elastic net penalty, 
- Decision Trees, 
- Random Forest, 
- AdaBoost, 
- XGboost 
with each one showing different accuracy results from stratified k-fold cross-validation.

Is it acceptable to create a soft voting classifier (or weighted voting classifier) made from the Decision Trees model and the Random Forest? (considering a random forest itself is already a combination, and a soft vote, of several decision trees)
Is it acceptable to create a voting classifier consisting of XGBoost and AdaBoost?
Is it acceptable to create a voting classifier consisting of a logistic regression with the lasso penalty and another logistic regression with the elastic net penalty? (considering elastic net is already a combination of lasso and ridge)

I understand that we are free to do anything with our data, and I believe combining similar models will at least help narrow the standard deviation of the cross-validation scores. But is it theoretically acceptable?

Thank you for your patience. I am sorry for the beginner question. Good luck to everyone.
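
For what it's worth, scikit-learn does not restrict which estimators can be combined, so all of those combinations are at least mechanically possible; whether the members add diversity is the practical question. A minimal sketch of a weighted soft-voting ensemble over several of the models mentioned above (the synthetic data, hyperparameters, and weights are assumptions purely for illustration, and XGBoost is omitted to keep the example self-contained):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=1)

clf_l1 = LogisticRegression(penalty='l1', solver='saga', max_iter=5000)
clf_en = LogisticRegression(penalty='elasticnet', solver='saga', l1_ratio=0.5, max_iter=5000)
clf_dt = DecisionTreeClassifier(max_depth=4, random_state=1)
clf_rf = RandomForestClassifier(n_estimators=100, random_state=1)
clf_ab = AdaBoostClassifier(random_state=1)

ensemble = VotingClassifier(
    estimators=[('l1', clf_l1), ('en', clf_en), ('dt', clf_dt),
                ('rf', clf_rf), ('ab', clf_ab)],
    voting='soft',
    weights=[1, 1, 1, 2, 2],  # arbitrary example weights
)

# Stratified k-fold is the default CV strategy for classifiers here.
scores = cross_val_score(ensemble, X, y, cv=10, scoring='accuracy')
print(scores.mean(), scores.std())
```

Soft voting requires every member to implement predict_proba, which all of the models above (including random forests and boosted ensembles) do.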

@blogger.powerpoint_expert

How can I refer to this material if I want to cite it in my work?

@diniebalqisay2658

Hello sir, may I know how a voting classifier can be implemented with a CNN-based approach?
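
One common approach is soft voting over several trained CNNs by averaging their softmax outputs. A minimal NumPy sketch, where `cnn_models` and the `model.predict` call are hypothetical placeholders for however your framework returns class probabilities:

```python
import numpy as np

def soft_vote(prob_arrays, weights=None):
    """prob_arrays: list of (n_samples, n_classes) probability arrays, one per CNN."""
    # Average the probability arrays across models (optionally weighted) ...
    avg = np.average(np.stack(prob_arrays), axis=0, weights=weights)
    # ... and take the class with the highest average probability per sample.
    return avg.argmax(axis=1)

# Usage sketch (assumed API, e.g. Keras models returning softmax outputs):
# probs = [model.predict(X_test) for model in cnn_models]
# y_pred = soft_vote(probs, weights=[1, 1, 2])
```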

@saumyashah6622

Do all the classifiers have equal weights (in voting)? If yes, then a classifier with genuinely high accuracy would have the same weighted vote as a worse classifier. Please answer me, I am stuck sir. Help.
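
In scikit-learn's VotingClassifier the votes are equally weighted by default, but the `weights` parameter lets you give stronger classifiers a larger say. A minimal sketch (the weight values and estimators are just an example):

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# weights=[2, 1, 1]: the logistic regression's vote counts twice as much
clf = VotingClassifier(
    estimators=[('lr', LogisticRegression(max_iter=1000)),
                ('dt', DecisionTreeClassifier(max_depth=3)),
                ('knn', KNeighborsClassifier())],
    voting='soft',
    weights=[2, 1, 1],
)
clf.fit(X, y)
print(clf.predict(X[:5]))
```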

@vikramxD

There seems to be an issue with a "zip function argument is not iterable" error in the ensemble VotingClassifier method. Does anyone know how to solve it?
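
Without seeing the code it is hard to say, but one common cause of that error (an assumption, since the original call isn't shown) is passing bare estimator objects instead of a list of (name, estimator) tuples. A sketch of the expected call, with example estimators:

```python
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

clf1 = LogisticRegression(max_iter=1000)
clf2 = DecisionTreeClassifier()

# Raises "zip argument #1 must support iteration", because the internal
# zip(*estimators) expects (name, estimator) pairs:
# VotingClassifier(estimators=[clf1, clf2])

# Works: each estimator is wrapped in a (name, estimator) tuple.
ensemble = VotingClassifier(estimators=[('lr', clf1), ('dt', clf2)], voting='hard')
```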