How to pass params to XGBoost while using it in mlforecast #220
Replies: 1 comment
-
Closing in favor of #219
-
```python
import xgboost as xgb

# Assuming you have your features and target variable ready (X and y),
# combine them into a DMatrix
dtrain = xgb.DMatrix(X, label=y)

# Define XGBoost parameters with Pseudo-Huber loss
params = {
    'objective': 'reg:pseudohubererror',  # Pseudo-Huber loss
    'huber_slope': 1.2,       # controls the Pseudo-Huber delta (slope)
    'eta': 0.1,               # learning rate
    'max_depth': 6,           # maximum depth of each tree
    'subsample': 0.8,         # fraction of rows used per tree
    'colsample_bytree': 0.8,  # fraction of features used per tree
    'eval_metric': 'rmse'     # evaluation metric (rmse or another suitable metric)
}

# Train the XGBoost model on the entire dataset
num_round = 100  # number of boosting rounds (adjust as needed)
model = xgb.train(params, dtrain, num_round)

# Make predictions on the training data (since there is no separate test set)
predictions = model.predict(dtrain)
# predictions now contains the predicted values for the entire dataset
```
I want to do something like this, but since we pass the model object to mlforecast directly, I'm not sure how to pass a DMatrix. Could someone please give suggestions?