Extreme Gradient Boosting with XGBoost
Sergey Fogelson
Sergey began his career as an academic at Dartmouth College, where he researched the neural bases of visual category learning and obtained his Ph.D. in Cognitive Neuroscience. After leaving academia, Sergey got into the rapidly growing startup scene in the NYC metro area, where he has worked as a data scientist in digital advertising, cybersecurity, finance, and media. He is heavily involved in the NYC-area teaching community, has taught courses at various bootcamps, and has been a volunteer teacher in computer science through TEALSK12. When Sergey is not working or teaching, he is probably hiking. (He thru-hiked the Appalachian Trail before graduate school.)
Course Description
Do you know the basics of supervised learning and want to use state-of-the-art models on real-world datasets? Gradient boosting is currently one of the most popular techniques for efficient modeling of tabular datasets of all sizes. XGBoost is a very fast, scalable implementation of gradient boosting; models built with XGBoost regularly win online data science competitions and are used at scale across different industries. In this course, you’ll learn how to use this powerful library alongside pandas and scikit-learn to build and tune supervised learning models. You’ll work with real-world datasets to solve classification and regression problems.
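As a taste of the workflow the course covers, here is a minimal sketch of training an XGBoost classifier with its scikit-learn-compatible API. The dataset, split, and hyperparameters are illustrative choices, not material from the course; it assumes xgboost and scikit-learn are installed.

```python
# Minimal sketch: fit and evaluate a gradient-boosted tree classifier
# on a small tabular dataset (illustrative only, not course data).
import xgboost as xgb
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Load a tabular classification dataset as a pandas DataFrame/Series
data = load_breast_cancer(as_frame=True)
X, y = data.data, data.target

# Hold out a test set for evaluation
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Configure a boosted-tree classifier with a few common hyperparameters
model = xgb.XGBClassifier(
    n_estimators=100,
    learning_rate=0.1,
    max_depth=3,
    objective="binary:logistic",
)
model.fit(X_train, y_train)

# Evaluate on the held-out data
preds = model.predict(X_test)
print(f"Test accuracy: {accuracy_score(y_test, preds):.3f}")
```

Because XGBoost exposes this scikit-learn-style estimator interface, the same model can be dropped into familiar tools such as cross-validation and grid search, which is how tuning is approached later in the course.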