[CatBoost] Getting to Know CatBoost
A write-up of CatBoost, whose stunning classification performance left me humbled during the second mini project at KT AIVLE School. Using Pool: train_pool = Pool(x_train, y_train) eval_pool = Pool(x_val, y_val) test_pool = Pool(x_test) model = CBC(iterations=100, # depth=2, # learning_rate=1, loss_function='Logloss', random_seed=1, task_type="GPU", verbose=True) model.fit( train_pool, # cat_features=cat_features, eval_set=eval_pool, plot=True ) y_pred ..
2023. 3. 9.
[HyperOpt] Exploring Bayesian Optimization
Install the module: !pip install hyperopt Import modules: import hyperopt from hyperopt import hp from hyperopt import STATUS_OK from hyperopt import fmin, tpe, Trials from sklearn.metrics import accuracy_score as accuracy Define the search space: search_space = { 'max_depth': hp.quniform('max_depth', 3, 15, 1), 'min_child_weight': hp.quniform('min_child_weight', 1, 3, 1), 'learning_rate': hp.uniform('learning_rate', 0.005, 0.2), 'c..
2023. 3. 9.
[lightGBM] Digging into Parameters, Methods, and Attributes
lightgbm.LGBMClassifier( boosting_type='gbdt', num_leaves=31, max_depth=-1, learning_rate=0.1, n_estimators=100, subsample_for_bin=200000, objective=None, class_weight=None, min_split_gain=0.0, min_child_weight=0.001, min_child_samples=20, subsample=1.0, subsample_freq=0, colsample_bytree=1.0, reg_alpha=0.0, reg_lambda=0.0, random_state=None, n_jobs=None, importance_type='split', **kwargs ) skle..
2023. 3. 3.