import optuna
from sklearn.model_selection import StratifiedKFold, cross_val_score
from xgboost import XGBClassifier

# X (features) and Y (labels) are assumed to have been prepared earlier in the post.
# skf was not defined in the snippet; a StratifiedKFold splitter is assumed here.
skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)

def objective(trial):
    # The fixed values (n_estimators, gamma, colsample_bytree, subsample) come from an
    # earlier tuning round; only max_depth, learning_rate, and min_child_weight are searched.
    param = {'objective': 'multi:softprob',
             'tree_method': 'hist',
             'num_class': 3,
             'max_depth': trial.suggest_int('max_depth', 3, 10),
             'learning_rate': trial.suggest_float('learning_rate', 0.01, 0.05),  # suggest_uniform is deprecated
             'n_estimators': 1748,
             'gamma': 0.5631817186746637,
             'min_child_weight': trial.suggest_int('min_child_weight', 1, 100),
             'colsample_bytree': 0.20901565046325682,
             'subsample': 0.7312131456978738}
    XGB = XGBClassifier(**param)
    XGB_cv = cross_val_score(XGB,
                             X,
                             Y,
                             scoring='accuracy',
                             cv=skf,
                             n_jobs=-1)
    # Optuna minimizes by default, so return the negative mean accuracy.
    return -1 * XGB_cv.mean()

study = optuna.create_study(sampler=optuna.samplers.RandomSampler(seed=0))
study.optimize(objective, n_trials=200)
print(study.best_params)
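As a rough follow-up sketch (not shown in the snippet above), the values found by the study can be merged back into the fixed parameters to fit a final model. This continues the same script and reuses X, Y, and the fixed parameter values; the variable names here are illustrative.

# Merge the searched values into the fixed parameters and refit on the full data.
best_param = {'objective': 'multi:softprob',
              'tree_method': 'hist',
              'num_class': 3,
              'n_estimators': 1748,
              'gamma': 0.5631817186746637,
              'colsample_bytree': 0.20901565046325682,
              'subsample': 0.7312131456978738,
              **study.best_params}  # adds max_depth, learning_rate, min_child_weight

final_model = XGBClassifier(**best_param)
final_model.fit(X, Y)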