[optuna] some more tunings due to lgbm optuna tuning
basaks committed Jul 5, 2023
1 parent a00a1ba commit aea4409
Showing 4 changed files with 12 additions and 4 deletions.
12 changes: 9 additions & 3 deletions configs/ref_lgbm.yaml
```diff
@@ -37,11 +37,12 @@ learning:
 n_estimators: 10
 random_state: 1
 max_depth: 20
+subsample: 0.9
 optimisation:
 optuna_params:
-n_trials: 100
+n_trials: 1000
 step: 2
-cv: 2
+cv: 5
 verbose: 1000
 random_state: 1
 scoring: r2 # r2, neg_mean_absolute_error, etc..see note above
@@ -53,12 +53,17 @@ learning:
 learning_rate: FloatDistribution(0.0001, 10, log=True)
 num_leaves: IntDistribution(2, 256)
 subsample_for_bin: IntDistribution(10, 1000000, step=10)
-colsample_bytree: FloatDistribution(0.01, 1.0, log=True)
+subsample: FloatDistribution(0.4, 1.0) # this is the same as bagging_fraction
+subsample_freq: IntDistribution(1, 100) # this is the same as bagging_freq
+colsample_bytree: FloatDistribution(0.4, 1.0)
+colsample_bynode: FloatDistribution(0.01, 1.0)
 reg_alpha: FloatDistribution(1e-8, 10, log=True)
 reg_lambda: FloatDistribution(1e-8, 10, log=True)
 min_child_samples: IntDistribution(2, 100)
 min_child_weight: FloatDistribution(0, 10)
+min_split_gain: FloatDistribution(1e-8, 1, log=True)
+boosting_type: CategoricalDistribution(['gbdt', 'dart', 'rf'])

 # hyperopt_params:
 # max_evals: 5
 # step: 2
```
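The search-space values in the YAML above are written as distribution constructor calls in string form. A minimal sketch of turning such strings into distribution objects, using hypothetical stand-in classes rather than uncoverml's actual parser (the pinned optuna==3.2.0 exposes `FloatDistribution`, `IntDistribution` and `CategoricalDistribution` with matching constructor shapes):

```python
from dataclasses import dataclass

# Hypothetical stand-ins for optuna.distributions classes, kept
# dependency-free for illustration.
@dataclass
class FloatDistribution:
    low: float
    high: float
    step: float = None
    log: bool = False

@dataclass
class IntDistribution:
    low: int
    high: int
    step: int = 1
    log: bool = False

@dataclass
class CategoricalDistribution:
    choices: list

_ALLOWED = {c.__name__: c for c in
            (FloatDistribution, IntDistribution, CategoricalDistribution)}

def parse_distribution(spec: str):
    """Evaluate a YAML value like 'IntDistribution(2, 256)' against a
    whitelist of distribution constructors only (no builtins exposed)."""
    return eval(spec, {"__builtins__": {}}, _ALLOWED)

# A few entries from the config above, parsed into objects.
space = {
    "num_leaves": parse_distribution("IntDistribution(2, 256)"),
    "subsample": parse_distribution("FloatDistribution(0.4, 1.0)"),
    "boosting_type": parse_distribution(
        "CategoricalDistribution(['gbdt', 'dart', 'rf'])"),
}
```

The whitelist-`eval` approach is only one way to read this syntax; the point is that each YAML value maps one-to-one onto an optuna distribution constructor.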
1 change: 1 addition & 0 deletions setup.py
```diff
@@ -109,6 +109,7 @@ def run(self):
         'Pillow >= 8.1.2',
         "PyWavelets==1.2.0",
         "imageio==2.9.0",
+        "optuna==3.2.0",
     ],
     extras_require={
         'kmz': [
```
1 change: 1 addition & 0 deletions uncoverml/optimise/models.py
```diff
@@ -596,6 +596,7 @@ def __init__(
             n_jobs=n_jobs,
             silent=silent,
             importance_type=importance_type,
+            **kwargs
         )

     def predict(self, X, *args, **kwargs):
```
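Forwarding `**kwargs` in the wrapper's `__init__` is what lets newly tuned parameters such as `subsample_freq` or `min_split_gain` reach the underlying LGBM model without the wrapper having to name every one. A toy sketch of the pattern, with `FakeBooster` and `TunedRegressor` as hypothetical stand-ins for the real classes:

```python
class FakeBooster:
    """Stand-in for lightgbm.LGBMRegressor: records whatever it is given."""
    def __init__(self, **params):
        self.params = params

class TunedRegressor:
    """Wrapper that names a few core arguments and forwards the rest."""
    def __init__(self, n_estimators=10, max_depth=20, **kwargs):
        self.model = FakeBooster(
            n_estimators=n_estimators,
            max_depth=max_depth,
            **kwargs,  # tuned extras pass straight through to the booster
        )

# Parameters the wrapper never heard of still reach the model.
reg = TunedRegressor(subsample_freq=7, min_split_gain=1e-4)
```

Without the forwarding, any search-space parameter not explicitly listed in the wrapper's signature would raise a `TypeError` during optimisation.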
2 changes: 1 addition & 1 deletion uncoverml/optuna_opt.py
```diff
@@ -72,7 +72,7 @@ def optimise_model(X, targets_all: Targets, conf: Config):
         param_distributions=search_space,
         scoring=scorer,
         cv=cv,
-        error_score=np.nan,
+        error_score='raise',
         random_state=rstate,
         return_train_score=True,
         ** conf.optuna_params
```
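With `error_score='raise'`, a fit that throws inside cross-validation aborts the search immediately instead of being silently recorded as `np.nan` (the scikit-learn convention that `OptunaSearchCV` follows). A minimal sketch of the two behaviours, with `evaluate` as a hypothetical helper mimicking a CV loop's error handling:

```python
import math

def evaluate(fit_and_score, error_score):
    """Mimic how a CV loop treats a failing fit under error_score."""
    try:
        return fit_and_score()
    except Exception:
        if error_score == "raise":
            raise                 # surface the failure immediately
        return error_score        # e.g. nan: fold scored, search continues

def broken_fit():
    raise ValueError("bad hyper-parameter combination")

# Old config behaviour: failure becomes nan and the search keeps going.
silent = evaluate(broken_fit, float("nan"))
```

Raising is the stricter choice for debugging a new search space, at the cost of losing the whole study when a single parameter combination is invalid.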
