Predict sale prices while practicing feature engineering, random forests, and gradient boosting; my gradient boosting submissions currently rank in the top 30%.
Hello World! This Jupyter Python notebook contains my personal attempt at the "Getting Started" Kaggle competition on regression techniques. I am currently focused on XGBoost models and plan to move on to random forests, deep learning, and more. Last but not least, thorough grid-search fine-tuning has produced submissions that reach the top 30% of all entries (a prediction error of about 0.13).
Required Libraries:
- sklearn
- xgboost
- pandas
- matplotlib
Information about the dataset can be found on the Kaggle competition page.
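Since the competition is scored on the RMSE of the logarithm of the sale price, a common first feature-engineering step is to model `log1p(SalePrice)` rather than the raw target. A tiny sketch with made-up values (the real columns are described on the competition's data page):

```python
import numpy as np
import pandas as pd

# Tiny stand-in for Kaggle's train.csv; values are invented for illustration.
df = pd.DataFrame({
    "GrLivArea": [1710, 1262, 1786],
    "SalePrice": [208500, 181500, 223500],
})

# Model the log-transformed target so training error matches the
# competition's log-RMSE metric; invert later with np.expm1.
df["LogSalePrice"] = np.log1p(df["SalePrice"])
print(df)
```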