Asked by: samac  Asked: 9/22/2023  Last edited by: desertnaut  Updated: 9/22/2023  Views: 158
Retrieving lightGBM hyper parameters after a model has been trained
Q:
I used optuna to create hundreds of models for a dataset. Only a few of them turned out to be useful, and I accidentally did not save the hyperparameters for the good ones.
I did save the models as .txt files. Is there a way to extract the hyperparameters from an already-trained model?
I asked chatGPT, and it suggested:
import lightgbm as lgb
# Load the model
gbm = lgb.Booster(model_file='path_to_your_model.txt') # Replace with the actual path to your model file
# Get hyperparameters
hyperparameters = gbm.get_params()
# Print hyperparameters
for key, value in hyperparameters.items():
    print(f"{key}: {value}")
But I get this error:
AttributeError: 'Booster' object has no attribute 'get_params'
A:
0 votes
James Lamb
9/22/2023
#1
Since lightgbm==4.0.0, a lightgbm.Booster object created from a model file will have all the parameters stored in that file available in its .params attribute.
Here is an example using Python 3.10 and lightgbm==4.1.0.
import lightgbm as lgb
from sklearn.datasets import make_regression

# train a model
X, y = make_regression(n_samples=10_000)
dtrain = lgb.Dataset(X, label=y)
model = lgb.train(
    params={
        "learning_rate": 0.0708,
        "num_leaves": 11,
        "objective": "regression"
    },
    train_set=dtrain,
    num_boost_round=6
)

# save model to disk
model.save_model("model.txt")

# load it back into a new Booster and inspect its parameters
model_from_file = lgb.Booster(model_file="model.txt")
print(model_from_file.params)
You will see all the parameters there:
{
"objective": "regression",
...
"learning_rate": 0.0708,
"num_leaves": 11,
...
}
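As a side note, the saved model.txt itself ends with a plain-text parameters block, so the hyperparameters can in principle be recovered even without loading the model through lightgbm. Below is a minimal sketch of parsing that block; the sample text and the exact `[key: value]` layout are assumptions here (the format may vary by LightGBM version), so check against your own file:

```python
import re

# Assumed tail of a LightGBM model.txt file; the real file contains
# the model trees above this block.
model_text = """\
parameters:
[boosting: gbdt]
[objective: regression]
[learning_rate: 0.0708]
[num_leaves: 11]
end of parameters
"""

def parse_params(text):
    """Extract key/value pairs from the 'parameters:' block (values stay strings)."""
    params = {}
    in_block = False
    for line in text.splitlines():
        stripped = line.strip()
        if stripped == "parameters:":
            in_block = True
            continue
        if stripped == "end of parameters":
            break
        if in_block:
            m = re.match(r"\[(\w+): (.*)\]", stripped)
            if m:
                params[m.group(1)] = m.group(2)
    return params

print(parse_params(model_text))
```

This avoids needing lightgbm installed at all, at the cost of getting every value back as a string.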
Comments
Note that .get_params() is a method of the scikit-learn interface (lightgbm.LGBMModel()), not of .Booster(); for a Booster, use gbm.params instead.