.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_examples/03_complex_models/run_hyperparameter_tuning_bayessearch.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_examples_03_complex_models_run_hyperparameter_tuning_bayessearch.py>`
        to download the full example code.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_examples_03_complex_models_run_hyperparameter_tuning_bayessearch.py:

Tuning Hyperparameters using Bayesian Search
============================================

This example uses the ``fmri`` dataset, performs simple binary classification
using a Support Vector Machine classifier, and analyzes the model.

References
----------

Waskom, M.L., Frank, M.C., Wagner, A.D. (2016). Adaptive engagement of
cognitive control in context-dependent decision-making. Cerebral Cortex.

.. include:: ../../links.inc

.. GENERATED FROM PYTHON SOURCE LINES 16-29

.. code-block:: Python

    # Authors: Federico Raimondo
    # License: AGPL

    import numpy as np
    from seaborn import load_dataset
    import sklearn

    from julearn import run_cross_validation
    from julearn.utils import configure_logging, logger
    from julearn.pipeline import PipelineCreator

.. GENERATED FROM PYTHON SOURCE LINES 30-31

Set the logging level to info to see extra information.

.. GENERATED FROM PYTHON SOURCE LINES 31-33

.. code-block:: Python

    configure_logging(level="INFO")

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    2026-01-16 10:54:07,522 - julearn - INFO - ===== Lib Versions =====
    2026-01-16 10:54:07,522 - julearn - INFO - numpy: 1.26.4
    2026-01-16 10:54:07,522 - julearn - INFO - scipy: 1.17.0
    2026-01-16 10:54:07,522 - julearn - INFO - sklearn: 1.7.2
    2026-01-16 10:54:07,522 - julearn - INFO - pandas: 2.3.3
    2026-01-16 10:54:07,522 - julearn - INFO - julearn: 0.3.5.dev123
    2026-01-16 10:54:07,522 - julearn - INFO - ========================

.. GENERATED FROM PYTHON SOURCE LINES 34-35

Disable metadata routing to avoid errors due to ``BayesSearchCV`` being used.

.. GENERATED FROM PYTHON SOURCE LINES 35-37

.. code-block:: Python

    sklearn.set_config(enable_metadata_routing=False)

.. GENERATED FROM PYTHON SOURCE LINES 38-39

Set the random seed to always have the same example.

.. GENERATED FROM PYTHON SOURCE LINES 39-41

.. code-block:: Python

    np.random.seed(42)

.. GENERATED FROM PYTHON SOURCE LINES 42-43

Load the dataset.

.. GENERATED FROM PYTHON SOURCE LINES 43-46

.. code-block:: Python

    df_fmri = load_dataset("fmri")
    df_fmri.head()

.. code-block:: none

      subject  timepoint event    region    signal
    0     s13         18  stim  parietal -0.017552
    1      s5         14  stim  parietal -0.080883
    2     s12         18  stim  parietal -0.081033
    3     s11         18  stim  parietal -0.046134
    4     s10         18  stim  parietal -0.037970

.. GENERATED FROM PYTHON SOURCE LINES 47-48

Set the dataframe in the right format.

.. GENERATED FROM PYTHON SOURCE LINES 48-55

.. code-block:: Python

    df_fmri = df_fmri.pivot(
        index=["subject", "timepoint", "event"], columns="region", values="signal"
    )

    df_fmri = df_fmri.reset_index()
    df_fmri.head()

.. code-block:: none

    region subject  timepoint event   frontal  parietal
    0           s0          0   cue  0.007766 -0.006899
    1           s0          0  stim -0.021452 -0.039327
    2           s0          1   cue  0.016440  0.000300
    3           s0          1  stim -0.021054 -0.035735
    4           s0          2   cue  0.024296  0.033220

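The search that follows tunes ``C`` (and ``gamma``) over a ``"log-uniform"``
prior. As a quick standalone sketch (plain NumPy, not julearn code), a
log-uniform distribution samples the *exponent* uniformly, so every order of
magnitude between the bounds is covered evenly, rather than almost all draws
landing near the upper bound as with a plain uniform prior:

.. code-block:: Python

    import numpy as np

    rng = np.random.default_rng(42)

    # Bounds matching the C range used in this example.
    low, high = 1e-6, 1e3

    # Sample uniformly in log10-space, then map back to the original scale.
    samples = 10 ** rng.uniform(np.log10(low), np.log10(high), size=10_000)

    # All draws stay within the requested bounds.
    assert samples.min() >= low and samples.max() <= high

    # Roughly half of the draws fall below the geometric midpoint (~0.03);
    # a plain uniform prior would put nearly all of its mass above 1.
    geometric_mid = 10 ** ((np.log10(low) + np.log10(high)) / 2)
    frac_below = (samples < geometric_mid).mean()
    print(frac_below)  # roughly 0.5

This is why log-uniform priors are the usual choice for scale-type
hyperparameters such as ``C`` and ``gamma``, whose plausible values span many
orders of magnitude.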
.. GENERATED FROM PYTHON SOURCE LINES 56-58

Following the hyperparameter tuning example, we will now use a Bayesian
search to find the best hyperparameters for the SVM model.

.. GENERATED FROM PYTHON SOURCE LINES 58-98

.. code-block:: Python

    X = ["frontal", "parietal"]
    y = "event"

    creator1 = PipelineCreator(problem_type="classification")
    creator1.add("zscore")
    creator1.add(
        "svm",
        kernel=["linear"],
        C=(1e-6, 1e3, "log-uniform"),
    )

    creator2 = PipelineCreator(problem_type="classification")
    creator2.add("zscore")
    creator2.add(
        "svm",
        kernel=["rbf"],
        C=(1e-6, 1e3, "log-uniform"),
        gamma=(1e-6, 1e1, "log-uniform"),
    )

    search_params = {
        "kind": "bayes",
        "cv": 2,  # to speed up the example
        "n_iter": 10,  # 10 iterations of bayesian search to speed up example
    }

    scores, estimator = run_cross_validation(
        X=X,
        y=y,
        data=df_fmri,
        model=[creator1, creator2],
        cv=2,  # to speed up the example
        search_params=search_params,
        return_estimator="final",
    )

    print(scores["test_score"].mean())

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    2026-01-16 10:54:07,531 - julearn - INFO - Adding step zscore that applies to ColumnTypes
    2026-01-16 10:54:07,531 - julearn - INFO - Step added
    2026-01-16 10:54:07,531 - julearn - INFO - Adding step svm that applies to ColumnTypes
    2026-01-16 10:54:07,531 - julearn - INFO - Setting hyperparameter kernel = linear
    2026-01-16 10:54:07,531 - julearn - INFO - Tuning hyperparameter C = (1e-06, 1000.0, 'log-uniform')
    2026-01-16 10:54:07,532 - julearn - INFO - Step added
    2026-01-16 10:54:07,532 - julearn - INFO - Adding step zscore that applies to ColumnTypes
    2026-01-16 10:54:07,532 - julearn - INFO - Step added
    2026-01-16 10:54:07,532 - julearn - INFO - Adding step svm that applies to ColumnTypes
    2026-01-16 10:54:07,532 - julearn - INFO - Setting hyperparameter kernel = rbf
    2026-01-16 10:54:07,532 - julearn - INFO - Tuning hyperparameter C = (1e-06, 1000.0, 'log-uniform')
    2026-01-16 10:54:07,532 - julearn - INFO - Tuning hyperparameter gamma = (1e-06, 10.0, 'log-uniform')
    2026-01-16 10:54:07,532 - julearn - INFO - Step added
    2026-01-16 10:54:07,533 - julearn - INFO - ==== Input Data ====
    2026-01-16 10:54:07,533 - julearn - INFO - Using dataframe as input
    2026-01-16 10:54:07,533 - julearn - INFO - Features: ['frontal', 'parietal']
    2026-01-16 10:54:07,533 - julearn - INFO - Target: event
    2026-01-16 10:54:07,533 - julearn - INFO - Expanded features: ['frontal', 'parietal']
    2026-01-16 10:54:07,533 - julearn - INFO - X_types:{}
    2026-01-16 10:54:07,533 - julearn - WARNING - The following columns are not defined in X_types: ['frontal', 'parietal']. They will be treated as continuous.
    /home/runner/work/julearn/julearn/julearn/prepare.py:576: RuntimeWarning: The following columns are not defined in X_types: ['frontal', 'parietal']. They will be treated as continuous.
      warn_with_log(
    2026-01-16 10:54:07,534 - julearn - INFO - ====================
    2026-01-16 10:54:07,534 - julearn - INFO -
    2026-01-16 10:54:07,535 - julearn - INFO - = Model Parameters =
    2026-01-16 10:54:07,535 - julearn - INFO - Tuning hyperparameters using bayes
    2026-01-16 10:54:07,535 - julearn - INFO - Hyperparameters:
    2026-01-16 10:54:07,535 - julearn - INFO - svm__C: (1e-06, 1000.0, 'log-uniform')
    2026-01-16 10:54:07,535 - julearn - INFO - Hyperparameter svm__C is log-uniform float [1e-06, 1000.0]
    2026-01-16 10:54:07,536 - julearn - INFO - Using inner CV scheme KFold(n_splits=2, random_state=None, shuffle=False)
    2026-01-16 10:54:07,536 - julearn - INFO - Search Parameters:
    2026-01-16 10:54:07,536 - julearn - INFO - cv: KFold(n_splits=2, random_state=None, shuffle=False)
    2026-01-16 10:54:07,536 - julearn - INFO - n_iter: 10
    2026-01-16 10:54:07,537 - julearn - INFO - ====================
    2026-01-16 10:54:07,537 - julearn - INFO -
    2026-01-16 10:54:07,537 - julearn - INFO - = Model Parameters =
    2026-01-16 10:54:07,537 - julearn - INFO - Tuning hyperparameters using bayes
    2026-01-16 10:54:07,538 - julearn - INFO - Hyperparameters:
    2026-01-16 10:54:07,538 - julearn - INFO - svm__C: (1e-06, 1000.0, 'log-uniform')
    2026-01-16 10:54:07,538 - julearn - INFO - svm__gamma: (1e-06, 10.0, 'log-uniform')
    2026-01-16 10:54:07,538 - julearn - INFO - Hyperparameter svm__C is log-uniform float [1e-06, 1000.0]
    2026-01-16 10:54:07,539 - julearn - INFO - Hyperparameter svm__gamma is log-uniform float [1e-06, 10.0]
    2026-01-16 10:54:07,540 - julearn - INFO - Using inner CV scheme KFold(n_splits=2, random_state=None, shuffle=False)
    2026-01-16 10:54:07,540 - julearn - INFO - Search Parameters:
    2026-01-16 10:54:07,540 - julearn - INFO - cv: KFold(n_splits=2, random_state=None, shuffle=False)
    2026-01-16 10:54:07,540 - julearn - INFO - n_iter: 10
    2026-01-16 10:54:07,541 - julearn - INFO - ====================
    2026-01-16 10:54:07,541 - julearn - INFO -
    2026-01-16 10:54:07,541 - julearn - INFO - = Model Parameters =
    2026-01-16 10:54:07,541 - julearn - INFO - Tuning hyperparameters using bayes
    2026-01-16 10:54:07,541 - julearn - INFO - Hyperparameters list:
    2026-01-16 10:54:07,541 - julearn - INFO - Set 0
    2026-01-16 10:54:07,541 - julearn - INFO - svm__C: Real(low=1e-06, high=1000.0, prior='log-uniform', transform='identity')
    2026-01-16 10:54:07,541 - julearn - INFO - set_column_types: [SetColumnTypes(X_types={})]
    2026-01-16 10:54:07,542 - julearn - INFO - zscore: [StandardScaler()]
    2026-01-16 10:54:07,542 - julearn - INFO - svm: [SVC(kernel='linear')]
    2026-01-16 10:54:07,542 - julearn - INFO - Set 1
    2026-01-16 10:54:07,542 - julearn - INFO - svm__C: Real(low=1e-06, high=1000.0, prior='log-uniform', transform='identity')
    2026-01-16 10:54:07,542 - julearn - INFO - svm__gamma: Real(low=1e-06, high=10.0, prior='log-uniform', transform='identity')
    2026-01-16 10:54:07,542 - julearn - INFO - set_column_types: [SetColumnTypes(X_types={})]
    2026-01-16 10:54:07,543 - julearn - INFO - zscore: [StandardScaler()]
    2026-01-16 10:54:07,543 - julearn - INFO - svm: [SVC()]
    2026-01-16 10:54:07,543 - julearn - INFO - Hyperparameter svm__C as is Real(low=1e-06, high=1000.0, prior='log-uniform', transform='identity')
    2026-01-16 10:54:07,543 - julearn - INFO - Hyperparameter set_column_types as is [SetColumnTypes(X_types={})]
    2026-01-16 10:54:07,543 - julearn - INFO - Hyperparameter zscore as is [StandardScaler()]
    2026-01-16 10:54:07,544 - julearn - INFO - Hyperparameter svm as is [SVC(kernel='linear')]
    2026-01-16 10:54:07,544 - julearn - INFO - Hyperparameter svm__C as is Real(low=1e-06, high=1000.0, prior='log-uniform', transform='identity')
    2026-01-16 10:54:07,544 - julearn - INFO - Hyperparameter svm__gamma as is Real(low=1e-06, high=10.0, prior='log-uniform', transform='identity')
    2026-01-16 10:54:07,544 - julearn - INFO - Hyperparameter set_column_types as is [SetColumnTypes(X_types={})]
    2026-01-16 10:54:07,544 - julearn - INFO - Hyperparameter zscore as is [StandardScaler()]
    2026-01-16 10:54:07,545 - julearn - INFO - Hyperparameter svm as is [SVC()]
    2026-01-16 10:54:07,545 - julearn - INFO - Using inner CV scheme KFold(n_splits=2, random_state=None, shuffle=False)
    2026-01-16 10:54:07,545 - julearn - INFO - Search Parameters:
    2026-01-16 10:54:07,545 - julearn - INFO - cv: KFold(n_splits=2, random_state=None, shuffle=False)
    2026-01-16 10:54:07,545 - julearn - INFO - n_iter: 10
    2026-01-16 10:54:07,556 - julearn - INFO - ====================
    2026-01-16 10:54:07,556 - julearn - INFO -
    2026-01-16 10:54:07,556 - julearn - INFO - = Data Information =
    2026-01-16 10:54:07,556 - julearn - INFO - Problem type: classification
    2026-01-16 10:54:07,556 - julearn - INFO - Number of samples: 532
    2026-01-16 10:54:07,556 - julearn - INFO - Number of features: 2
    2026-01-16 10:54:07,556 - julearn - INFO - ====================
    2026-01-16 10:54:07,556 - julearn - INFO -
    2026-01-16 10:54:07,557 - julearn - INFO - Number of classes: 2
    2026-01-16 10:54:07,557 - julearn - INFO - Target type: object
    2026-01-16 10:54:07,558 - julearn - INFO - Class distributions: event
    cue     266
    stim    266
    Name: count, dtype: int64
    2026-01-16 10:54:07,558 - julearn - INFO - Using outer CV scheme KFold(n_splits=2, random_state=None, shuffle=False) (incl. final model)
    2026-01-16 10:54:07,558 - julearn - INFO - Binary classification problem detected.
    0.656015037593985

.. GENERATED FROM PYTHON SOURCE LINES 99-100

It seems that we might have found a better model, but which one is it?

.. GENERATED FROM PYTHON SOURCE LINES 100-101

.. code-block:: Python

    print(estimator.best_params_)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    OrderedDict({'set_column_types': SetColumnTypes(X_types={}), 'svm': SVC(), 'svm__C': 0.0018082604408073564, 'svm__gamma': 1.6437581151471767, 'zscore': StandardScaler()})

.. rst-class:: sphx-glr-timing

**Total running time of the script:** (0 minutes 4.104 seconds)

.. _sphx_glr_download_auto_examples_03_complex_models_run_hyperparameter_tuning_bayessearch.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: run_hyperparameter_tuning_bayessearch.ipynb <run_hyperparameter_tuning_bayessearch.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: run_hyperparameter_tuning_bayessearch.py <run_hyperparameter_tuning_bayessearch.py>`

    .. container:: sphx-glr-download sphx-glr-download-zip

      :download:`Download zipped: run_hyperparameter_tuning_bayessearch.zip <run_hyperparameter_tuning_bayessearch.zip>`

.. only:: html

  .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_