Python: How to retrieve the best model from an Optuna LightGBM study?

I would like to get the best model to use later in the notebook to predict using a different test batch. Reproducible example (taken from the Optuna GitHub): import lightgbm as lgb import numpy as...

Getting SIGKILL when running study optimization with Optuna on PyCharm

I am trying to run a study, using the optimize function with the default sampler and the Median pruner. Every run crashes, sometimes after one successful trial, sometimes without completing any. The crash...

How to find optimal hyperparameters in a convolutional net?

I came across the scikit-optimize package, and I am relatively new to Bayesian optimization, which I want to use in my current convolutional NN. However, I tried to find the best hyperparameters of...

Using optuna LightGBMTunerCV as starting point for further search with optuna

I'm trying to use LightGBM for a regression problem (mean absolute error/L1 - or a similar loss like Huber or pseudo-Huber) and I primarily want to tune my hyperparameters. LightGBMTunerCV in...

cudaErrorInsufficientDriver: CUDA driver version is insufficient for CUDA runtime version

I am running a GPU instance on GKE. When everything is deployed and I make a request to the service, the above-mentioned error occurs. I followed all the steps mentioned in...

Optuna - Memory Issues

I am trying to free memory in between Optuna optimization runs. I am using Python 3.8 and the latest version of Optuna. What happens is I run the commands: optuna.create_study(), then I call...

Specify fixed parameters and parameters to be searched in Optuna (LightGBM)

I just found Optuna, and it seems it integrates with LightGBM, but I struggle to see where I can fix parameters, e.g. scoring="auc", and where I can define a grid space to search, e.g...

Optuna Suggests the Same Parameter Values in a lot of Trials (Duplicate Trials that Waste Time and Budget)

Optuna's TPESampler and RandomSampler try the same suggested integer values (possibly floats and log-uniforms as well) for a parameter more than once for some reason. I couldn't find a way to stop...

Best parameters of an Optuna multi-objective optimization

When performing a single-objective optimization with Optuna, the best parameters of the study are accessible using: import optuna def objective(trial): x = trial.suggest_uniform('x', -10, 10)...

How to set a minimum number of epochs in Optuna's SuccessiveHalvingPruner()?

I'm using Optuna 2.5 to optimize a couple of hyperparameters on a tf.keras CNN model. I want to use pruning so that the optimization skips the less promising corners of the hyperparameter space....

Turning off Warning in Optuna Training

I fully realize that I will likely be embarrassed for missing something obvious, but this has me stumped. I am tuning an LGBM model using Optuna, and my notebook gets flooded with warning...

Successfully access MySQL database from python using optuna

Since the Optuna documentation does not address which modules are required from MySQL, I installed all of MySQL on my Windows 10 machine. I looked for MySQL on my PC (in which folder the...

Optuna CatBoost pruning

Is there a way to have pruning with CatBoost and Optuna (in LightGBM it's easy, but in CatBoost I can't find any hint)? My code is like this: def objective(trial): param = { ...

Tensorflow / keras issue when optimizing with optuna

I'm pretty new to machine learning; I've been trying to teach myself neural networks by following sentdex tutorials. I followed his tutorial on using recurrent neural networks for predicting the...

Tune Hyperparameter in sklearn with ray

I wonder, but could not find any information on, why this appears every time I try to tune hyperparameters from sklearn with TuneSearchCV. Note that the important part is the log sync warning and...

Optuna pass dictionary of parameters from "outside"

I am using Optuna to optimize some objective functions. I would like to create a custom class that "wraps" the standard Optuna code. As an example, this is my class (it is still a work in...

optuna.integration.lightgbm custom optimization metric

I am trying to optimize a LightGBM model using Optuna. Reading the docs, I noticed that there are two approaches that can be used, as mentioned here: LightGBM Tuner: New Optuna Integration for...

tensorflow dependencies continuously gives me errors in colab during installation of deepspeech environment

When I run the following command on Google Colab: !pip3 install --upgrade pip==20.0.2 wheel==0.34.2 setuptools==46.1.3 !pip3 install --upgrade --force-reinstall -e . I got an error: ERROR: pip's...

Python AWS Lambda Error "libgomp.so.1: cannot open shared object file: No such file or directory" from AWS EFS file system mounted on Ubuntu

I have an AWS Lambda function connected to an AWS EFS mounted on an EC2 instance running Ubuntu 18. I am getting the error below when importing a LightGBM model, I believe. { "errorMessage":...

How to set optuna's study.optimize verbosity to 0?

I want to set Optuna's study.optimize verbosity to 0. I thought optuna.logging.set_verbosity(0) might do it, but I still get the Trial 0 finished with value ... updates for every trial. What is...

Fitting customized LGBM parameters in sklearn pipeline

I am working on a binary classifier using LightGBM. My classifier definition looks like the following: # sklearn version, for the sake of calibration bst_ = LGBMClassifier(**search_params,...

What is the difference between the alpha, lambda and gamma regularization parameters for XGBoost?

I have a question: how exactly do the different L1 and L2 regularization terms on weights work in the XGBoost algorithm? As I understand it, L1 is used by LASSO and L2 is used by RIDGE regression, and L1...
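For reference, the regularized objective XGBoost minimizes per boosting round can be written as follows, where $T$ is the number of leaves in the new tree and $w_j$ are its leaf weights:

```latex
\mathrm{Obj} = \sum_i l(y_i, \hat{y}_i)
  + \underbrace{\gamma T}_{\text{per-leaf penalty (gamma)}}
  + \underbrace{\tfrac{1}{2}\lambda \sum_{j=1}^{T} w_j^2}_{\text{L2 (reg\_lambda)}}
  + \underbrace{\alpha \sum_{j=1}^{T} |w_j|}_{\text{L1 (reg\_alpha)}}
```

So alpha and lambda are the L1 and L2 penalties on the leaf weights (the LASSO/RIDGE analogy), while gamma penalizes the number of leaves and acts as the minimum loss reduction required to make a split.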

Is there a way to use Weights and Biases with Optuna?

Is there a way to use Weights and Biases with Optuna? What I tried resulted in the error weights and biases backend has shutdown.

Why is Optuna getting stuck after a certain number of trials?

I am trying to do hyperparameter tuning using Optuna. The dataset is MovieLens (1M). In one script I have Lasso, Ridge and KNN. Optuna is working fine for Lasso and Ridge but getting stuck for the...

The default evaluation metric used with the objective 'binary:logistic' was changed from 'error' to 'logloss'

I am trying to fit an XGBClassifier to my dataset after hyperparameter tuning using Optuna, and I keep getting this warning: the default evaluation metric used with the objective 'binary:logistic'...

What is the canonical way to set fixed parameters and retrieve them after a study is complete?

Optuna allows users to search a parameter space using the suggest_ API. This is easy and clever enough. However, there are some parameters I would like to remain fixed. For example, with...

Unable to tune hyperparameters for CatBoostRegressor

I am trying to fit a CatBoostRegressor to my data. When I perform K-fold CV for the baseline model, everything works fine. But when I use Optuna for hyperparameter tuning, it does something really...

How to optimize for multiple metrics in Optuna

How do I optimize for multiple metrics simultaneously inside the objective function of Optuna? For example, I am training an LGBM classifier and want to find the best hyperparameter set for all...

Optuna suggest float log=True

How can I have Optuna suggest float numeric values from this list: [1e-6, 1e-5, 1e-4, 1e-3, 1e-2, 1e-1, 1.0]? I'm using this Python code snippet: trial.suggest_float("lambda", 1e-6, 1.0,...

Getting DLL error when running tensorflow/keras program on python gpu 3.8

I am trying to run my simple AI program, but I keep getting this DLL error: ImportError: Could not find the DLL(s) 'msvcp140_1.dll'. TensorFlow requires that these DLLs be installed in a...