3 changes: 2 additions & 1 deletion .github/workflows/python-package-conda.yml
@@ -29,7 +29,8 @@ jobs:
       - name: Install test dependencies (conda)
         shell: bash -l {0}
         run: |
-          conda install -y pytest coverage
+          # Pinning numpy and scikit-learn to prevent the issues reported in #982
+          conda install -y pytest coverage "numpy>=1.22,<2.4" "scikit-learn>=1.0" "numba>=0.59.0"
 
       - name: Install package and run tests
         shell: bash -l {0}
2 changes: 2 additions & 0 deletions .github/workflows/python-package-pip.yml
@@ -24,6 +24,8 @@ jobs:
       - name: Install dependencies (pip)
         run: |
           python -m pip install --upgrade pip
+          # Fixing #982 by pinning numpy to a version compatible with numba
+          pip install "numpy>=1.22,<2.4"
           pip install pytest coverage
           pip install -e .
 
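Both workflows pin numpy to `>=1.22,<2.4` because numba only supports a bounded range of numpy versions. A minimal sketch of the constraint the pin encodes, comparing only major.minor as the pin itself does (the `satisfies_pin` helper is illustrative, not part of the workflows):

```python
# Illustrative helper: does a numpy version string fall inside the
# workflow pin "numpy>=1.22,<2.4"? Only major.minor are compared,
# mirroring how the pin is written.
def satisfies_pin(version, lower=(1, 22), upper=(2, 4)):
    major_minor = tuple(int(part) for part in version.split(".")[:2])
    return lower <= major_minor < upper

print(satisfies_pin("2.3.5"))  # True: inside the pinned range
print(satisfies_pin("2.4.0"))  # False: excluded by the upper bound
```

Tuple comparison gives the usual version ordering here, so no third-party version parser is needed for this two-bound check.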
4 changes: 2 additions & 2 deletions environment.yml
@@ -3,7 +3,7 @@ channels:
   - conda-forge
 dependencies:
   - flake8>=4.0.1
-  - numpy>=2.3.5
+  - numpy>=1.21.0,<2.4.0
   - pandas>=2.3.3
   - scipy>=1.16.3
   - scikit-learn>=1.8.0
@@ -13,4 +13,4 @@ dependencies:
   - pytest>=6.2.5
   - setuptools>=59.4.0
   - pip:
-      - markdown>=3.3.6
+      - markdown>=3.3.6
6 changes: 3 additions & 3 deletions mlxtend/evaluate/bias_variance_decomp.py
@@ -25,7 +25,7 @@ def bias_variance_decomp(
     loss="0-1_loss",
     num_rounds=200,
     random_seed=None,
-    **fit_params
+    **fit_params,
 ):
     """
     estimator : object
@@ -106,8 +106,7 @@
         X_boot, y_boot = _draw_bootstrap_sample(rng, X_train, y_train)
 
         # Keras support
-        if estimator.__class__.__name__ in ["Sequential", "Functional"]:
-            # reset model
+        if estimator.__class__.__name__ in ["Sequential", "Functional", "Model"]:
             for ix, layer in enumerate(estimator.layers):
                 if hasattr(estimator.layers[ix], "kernel_initializer") and hasattr(
                     estimator.layers[ix], "bias_initializer"
@@ -128,6 +127,7 @@
             pred = estimator.predict(X_test).reshape(1, -1)
         else:
             pred = estimator.fit(X_boot, y_boot, **fit_params).predict(X_test)
+
         all_pred[i] = pred
 
         if loss == "0-1_loss":
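For context on the function being changed: `bias_variance_decomp` collects one prediction vector per bootstrap round in `all_pred`, then splits the 0-1 loss into a bias term (majority-vote prediction vs. the true labels) and a variance term (per-round disagreement with the majority vote). A self-contained sketch of that decomposition, assuming plain Python lists rather than the library's numpy arrays (function and variable names are illustrative):

```python
from collections import Counter

def zero_one_decomp(all_pred, y_test):
    """Sketch of the 0-1 loss bias/variance split over bootstrap rounds.

    all_pred: one list of predictions per bootstrap round.
    y_test:   true labels.
    """
    num_rounds, n = len(all_pred), len(y_test)
    # "Main" prediction: per-example majority vote across rounds.
    main = [
        Counter(round_pred[j] for round_pred in all_pred).most_common(1)[0][0]
        for j in range(n)
    ]
    # Average expected loss: mean 0-1 error over all rounds and examples.
    avg_loss = sum(
        p != t for rp in all_pred for p, t in zip(rp, y_test)
    ) / (num_rounds * n)
    # Bias: 0-1 error of the majority-vote prediction.
    bias = sum(m != t for m, t in zip(main, y_test)) / n
    # Variance: how often a single round disagrees with the majority vote.
    var = sum(
        p != m for rp in all_pred for p, m in zip(rp, main)
    ) / (num_rounds * n)
    return avg_loss, bias, var
```

The widened `["Sequential", "Functional", "Model"]` check only affects how Keras estimators are re-initialized between rounds; the decomposition itself is unchanged by this PR.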