XGBoost is short for eXtreme Gradient Boosting.

Features

  • Easy to use
    • Easy to install
    • Highly developed R/Python interfaces for users
  • Efficiency
    • Automatic parallel computation on a single machine
    • Can be run on a cluster.
  • Accuracy
    • Good results for most data sets
  • Feasibility
    • Customized objective and evaluation functions
    • Tunable parameters

XGBoost Overfitting



Multi-class classification example: https://www.kaggle.com/tqchen/otto-group-product-classification-challenge/understanding-xgboost-model-on-otto-data

