Kaggle Data Science Competitions Wrap-up

Posted by Claire Tu

Updated: Apr 15, 2016

As part of their bootcamp projects, each of our students completes a Kaggle competition to practice their machine learning skills. Watch the videos and read the blog posts below as our bootcamp students discuss their Kaggle competition strategies, methods, and techniques.

Improving Home Depot Search Relevance
Given raw text as input, the goal of the project is to predict how relevant each product is to a search query on the Home Depot website. Amy Ma, Brett Amdur, and Christopher Redino discuss their strategies, from text mining and feature engineering through model selection and parameter tuning.
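Text-mining approaches to this kind of relevance problem typically turn each query/product pair into numeric features. A minimal sketch (not the team's actual code) of one such feature, the fraction of query tokens that also appear in the product title:

```python
def query_overlap(query, title):
    """Fraction of search-query tokens found in the product title.

    A simple hand-engineered relevance signal; real solutions combine
    many such features (stemmed overlaps, TF-IDF similarity, etc.).
    """
    query_tokens = set(query.lower().split())
    title_tokens = set(title.lower().split())
    if not query_tokens:
        return 0.0
    return len(query_tokens & title_tokens) / len(query_tokens)

# Every query word appears in the title -> full overlap
print(query_overlap("angle bracket", "simpson strong-tie angle bracket"))  # 1.0
```

Features like this are then fed, alongside the human relevance labels, into a regression model.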

Predicting Customer Satisfaction at Santander Bank from Anonymized Data
Anna Bohun, Thomas Boulenger, Adam Owens, and Ashwin Swamy build a data-balancing neural network and compare it to several gradient boosted machine algorithms. The importance of feature engineering, dimensionality reduction, and computational efficiency is discussed in detail.
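The Santander data is heavily imbalanced (few unsatisfied customers), which is why the team focused on balancing strategies. A minimal sketch, using scikit-learn and synthetic data in place of the anonymized Santander features, of a gradient boosted machine with sample weights that upweight the minority class:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the anonymized features: ~5% positive class
X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.95], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Upweight minority-class samples so both classes contribute equally
weights = np.where(y_tr == 1,
                   (y_tr == 0).sum() / max((y_tr == 1).sum(), 1),
                   1.0)

gbm = GradientBoostingClassifier(n_estimators=100, max_depth=3, random_state=0)
gbm.fit(X_tr, y_tr, sample_weight=weights)

# AUC is the competition metric; it is insensitive to class imbalance
auc = roc_auc_score(y_te, gbm.predict_proba(X_te)[:, 1])
```

Weighting (or resampling) matters because an unweighted model on 95/5 data can score high accuracy while predicting almost nothing for the minority class; AUC plus rebalancing avoids that trap.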





Forest Cover Type Classification Study
The goal of the project is to predict forest cover types from data on the Roosevelt National Forest in northern Colorado. Thomas Kolasa and Aravind Kolumum Raja provide an in-depth discussion of the methods they applied, including logistic regression, neural networks, tree-based methods, and ensemble methods.
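One common way to combine several of the model families mentioned above is a voting ensemble. A minimal sketch (with synthetic multiclass data standing in for the seven forest cover types, not the team's actual pipeline) using scikit-learn's `VotingClassifier` to average logistic regression and a tree-based method:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic 7-class problem mimicking the seven forest cover types
X, y = make_classification(n_samples=1500, n_features=15, n_informative=8,
                           n_classes=7, random_state=0)

# Soft voting averages each model's predicted class probabilities
ensemble = VotingClassifier(
    estimators=[("logreg", LogisticRegression(max_iter=1000)),
                ("forest", RandomForestClassifier(n_estimators=100,
                                                  random_state=0))],
    voting="soft")

score = cross_val_score(ensemble, X, y, cv=3).mean()
```

Soft voting often beats its individual members when their errors are only weakly correlated, which is the usual motivation for mixing linear and tree-based models.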



