LightGBM tuning
@@ -104,8 +104,8 @@ Current results taken with KMEANS_SMOTE:
 ## next steps:
 ```
 ✅ 1. Apply Stratified K-fold only on the train set.
-🗹 2. Train an LGBM model using KMEANS_SMOTE with KNN k_neighbors=10 (fine-tuning remains).
+✅ 2. Train an LGBM model using KMEANS_SMOTE with KNN k_neighbors=10 (fine-tuning remains).
 🗹 3. Train CatBoost using KMEANS_SMOTE with KNN k_neighbors=10 (fine-tuning remains).
 🗹 4. Implement the proposed methods of this article: https://1drv.ms/b/c/ab2a38fe5c318317/IQBEDsSFcYj6R6AMtOnh0X6DAZUlFqAYq19WT8nTeXomFwg
 🗹 5. Compare the proposed model with SMOTE vs. oversampling balancing methods.
 ```
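For steps 1–3 above, here is a minimal sketch of how resampling can be restricted to the training folds of a stratified K-fold split, assuming scikit-learn, imbalanced-learn, and lightgbm are installed; the synthetic dataset, F1 metric, and all hyperparameter values are illustrative placeholders, not the repository's actual configuration. CatBoost (step 3) would follow the same loop with `catboost.CatBoostClassifier` swapped in for `LGBMClassifier`.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import f1_score
from sklearn.model_selection import StratifiedKFold
from imblearn.over_sampling import KMeansSMOTE
from lightgbm import LGBMClassifier

# Synthetic placeholder standing in for the project's dataset.
X, y = make_classification(n_samples=2000, n_classes=2, weights=[0.9, 0.1],
                           class_sep=2.0, random_state=42)

skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
scores = []
for train_idx, val_idx in skf.split(X, y):
    X_tr, y_tr = X[train_idx], y[train_idx]
    X_val, y_val = X[val_idx], y[val_idx]

    # Step 1: resample only the training fold; the validation fold keeps
    # its original class distribution.
    sampler = KMeansSMOTE(k_neighbors=10, random_state=42)
    X_res, y_res = sampler.fit_resample(X_tr, y_tr)

    # Step 2: baseline LightGBM model; hyperparameter fine-tuning still pending.
    model = LGBMClassifier(n_estimators=300, learning_rate=0.05, random_state=42)
    model.fit(X_res, y_res)
    scores.append(f1_score(y_val, model.predict(X_val)))

print("mean F1 across folds:", np.mean(scores))
```

On real data, `KMeansSMOTE` may need its `cluster_balance_threshold` or `kmeans_estimator` adjusted so that enough minority-rich clusters are found.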
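For step 5, a hedged sketch of how the SMOTE vs. plain random-oversampling comparison might be set up, reusing the placeholder `X`, `y` from the sketch above; an imblearn `Pipeline` applies the sampler only while fitting on the training folds, so the comparison stays leakage-free. The scoring metric and fold count are assumptions, not the project's actual protocol.

```python
from imblearn.over_sampling import SMOTE, RandomOverSampler
from imblearn.pipeline import Pipeline
from sklearn.model_selection import StratifiedKFold, cross_val_score
from lightgbm import LGBMClassifier

samplers = {
    "SMOTE": SMOTE(k_neighbors=10, random_state=42),
    "random_oversampling": RandomOverSampler(random_state=42),
}

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
for name, sampler in samplers.items():
    # The sampler runs only on the training portion of each fold.
    pipe = Pipeline([("sampler", sampler),
                     ("lgbm", LGBMClassifier(random_state=42))])
    scores = cross_val_score(pipe, X, y, cv=cv, scoring="f1")
    print(f"{name}: mean F1 = {scores.mean():.3f}")
```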