K-Fold Cross Validation: Explanation + Tutorial in Python, Scikit-Learn & NumPy
The Notebook: colab.research.google.com/dri...
Thank you for watching the video! You can learn data science FASTER at mlnow.ai!
Master Python at mlnow.ai/course-material/python/!
Learn SQL & Relational Databases at mlnow.ai/course-material/sql/!
Learn NumPy, Pandas, and Python for Data Science at mlnow.ai/course-material/data...!
Become a Machine Learning Expert at mlnow.ai/course-material/ml/!
Don't forget to subscribe if you enjoyed the video :D
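For readers who want the gist of the video before opening the notebook: below is a minimal sketch of k-fold cross validation in scikit-learn. The dataset and model here (a synthetic classification problem with logistic regression) are stand-ins, not necessarily what the video uses.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in data (the video's notebook uses its own dataset)
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

model = LogisticRegression(max_iter=1000)

# 5-fold cross validation: the data is split into 5 folds; the model is
# trained on 4 folds and scored on the held-out fold, rotating 5 times.
scores = cross_val_score(model, X, y, cv=5)
print(scores)         # one accuracy score per fold
print(scores.mean())  # average as an estimate of generalization performance
```

The mean of the per-fold scores is a less noisy estimate of out-of-sample performance than a single train/test split.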
Comments: 15
[The following Kazakh interface strings are translated to English.]
Take my courses at mlnow.ai/!
I don't get why your videos don't have millions of views yet; you explain everything so clearly! Awesome work man, keep it up!
@GregHogg
1 year ago
Haha thank you I appreciate that!!
Cheers for the help mate 👍
LOVE THE THUMBNAIL
@GregHogg
2 years ago
Thank you!!!
Hi Greg, thank you for the wonderful video. I have been working on time series evaluation, specifically cross validation. My question is: is there a way to generalize the parameter tuning once the cross validation is complete, or is it just done on a trial-and-error basis?
@GregHogg
2 years ago
You can always tune values to minimize cross val loss :)
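To make the reply concrete: hyperparameter tuning can be automated by searching over candidate values and keeping whichever minimizes cross-validation loss. Since the question is about time series, the sketch below uses `TimeSeriesSplit` (so validation folds always come after their training folds) with `GridSearchCV`; the Ridge model, the synthetic data, and the alpha grid are illustrative assumptions, not from the video.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import GridSearchCV, TimeSeriesSplit

# Synthetic time-ordered data (a stand-in for the asker's series)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

# TimeSeriesSplit preserves temporal order: each fold trains on the past
# and validates on the future, avoiding leakage from future observations.
cv = TimeSeriesSplit(n_splits=5)

# Grid search keeps the alpha with the best mean cross-validation score
# (i.e. the lowest cross-val loss) -- no manual trial and error needed.
search = GridSearchCV(Ridge(), param_grid={"alpha": [0.01, 0.1, 1.0, 10.0]}, cv=cv)
search.fit(X, y)
print(search.best_params_)  # the alpha chosen by cross validation
```

For larger search spaces, `RandomizedSearchCV` works the same way but samples the grid instead of enumerating it.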
Thank you so much for the amazing video
@GregHogg
2 years ago
Glad it was helpful and you're very welcome, Prachi!
For anyone wondering why he flattens the images in the example with X_train.reshape(X_train.shape[0], -1): scikit-learn estimators expect a 2D array of shape (n_samples, n_features) as the feature input to fit and predict.
@GregHogg
2 years ago
Thanks Panagiotis, I appreciate that!!
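A tiny illustration of the reshape trick from the comment above. The 28x28 image shape is an MNIST-like assumption used only to show the mechanics:

```python
import numpy as np

# A batch of 100 images, each 28x28 pixels (shape assumed for illustration)
X_train = np.zeros((100, 28, 28))

# scikit-learn expects X with shape (n_samples, n_features), so each
# 28x28 image is flattened into a 784-element feature vector.
# The -1 tells NumPy to infer the second dimension (28 * 28 = 784).
X_flat = X_train.reshape(X_train.shape[0], -1)
print(X_flat.shape)  # (100, 784)
```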
I'd love a 'statistics for data science' video, please, covering distributions and so on.
@GregHogg
2 years ago
Sure!