Upload model.py (commit ff6113c, verified)
- · 1.52 kB · initial commit
- · 322 Bytes · initial commit
- · 4.13 kB · Update app.py
- · 425 Bytes · Upload 8 files
- · 2.02 MB · Upload 8 files
label_encoders.pkl · 2.07 kB · Upload 8 files
Detected Pickle imports (6):
- numpy.ndarray
- numpy.dtype
- _codecs.encode
- numpy._core.multiarray._reconstruct
- sklearn.preprocessing._label.LabelEncoder
- joblib.numpy_pickle.NumpyArrayWrapper
label_encoders_2.pkl · 1.85 kB · Upload 8 files
Detected Pickle imports (6):
- numpy.ndarray
- numpy.dtype
- _codecs.encode
- numpy._core.multiarray._reconstruct
- sklearn.preprocessing._label.LabelEncoder
- joblib.numpy_pickle.NumpyArrayWrapper
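Both encoder files report joblib.numpy_pickle.NumpyArrayWrapper, which points to joblib.dump as the likely serializer. A minimal loading sketch under that assumption follows; the dict-of-encoders layout and the "gender" key are hypothetical, not something the listing confirms, and unpickling executes arbitrary code, so only load files from a source you trust.

```python
# Minimal sketch, assuming both encoder files were written with joblib.dump
# (suggested by the joblib.numpy_pickle imports above). Unpickling runs
# arbitrary code, so only load pickles you trust.
import joblib

label_encoders = joblib.load("label_encoders.pkl")      # LabelEncoder, or a dict of them
label_encoders_2 = joblib.load("label_encoders_2.pkl")

# Hypothetical usage, assuming a dict of LabelEncoder objects keyed by column name:
# encoded = label_encoders["gender"].transform(["Male", "Female"])
```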
lgbm_model.pkl · 682 kB · Upload 8 files
Detected Pickle imports (8):
- numpy.dtype
- numpy.ndarray
- lightgbm.basic.Booster
- collections.defaultdict
- numpy._core.multiarray.scalar
- collections.OrderedDict
- lightgbm.sklearn.LGBMClassifier
- joblib.numpy_pickle.NumpyArrayWrapper
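The lightgbm.sklearn.LGBMClassifier and joblib imports suggest a joblib-dumped sklearn-style LightGBM model. A loading sketch under that assumption is below; the feature matrix X is hypothetical, and LightGBM's native text format is shown as the usual pickle-free way to redistribute the booster.

```python
# Minimal sketch, assuming lgbm_model.pkl is an LGBMClassifier saved with joblib.dump.
import joblib
import lightgbm as lgb

lgbm_model = joblib.load("lgbm_model.pkl")               # lightgbm.sklearn.LGBMClassifier
# preds = lgbm_model.predict(X)                          # X: hypothetical feature matrix

# Pickle-free alternative: dump the underlying Booster to LightGBM's text format...
lgbm_model.booster_.save_model("lgbm_model.txt")
booster = lgb.Booster(model_file="lgbm_model.txt")       # ...and reload it without pickle
```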
- · 6.37 kB · Upload model.py
rf_tuned_model.pkl · 11.9 MB · Upload 8 files
Detected Pickle imports (7):
- joblib.numpy_pickle.NumpyArrayWrapper
- numpy.ndarray
- numpy._core.multiarray._reconstruct
- sklearn.tree._classes.DecisionTreeClassifier
- sklearn.ensemble._forest.RandomForestClassifier
- numpy.dtype
- _codecs.encode
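This file reports RandomForestClassifier and DecisionTreeClassifier imports wrapped by joblib, so joblib.load is the likely entry point; at 11.9 MB it is mostly the serialized trees. A sketch under those assumptions follows; X is a hypothetical feature matrix, and skops is mentioned only as a common pickle-free option for sklearn estimators, not something the repo is known to use.

```python
# Minimal sketch, assuming rf_tuned_model.pkl was saved with joblib.dump.
import joblib

rf_model = joblib.load("rf_tuned_model.pkl")             # sklearn RandomForestClassifier
# proba = rf_model.predict_proba(X)                      # X: hypothetical feature matrix

# A pickle-free option for sklearn estimators is the skops format:
#   import skops.io as sio
#   sio.dump(rf_model, "rf_tuned_model.skops")
```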
xgb_model.pkl · 3.4 MB · Upload 8 files
Detected Pickle imports (3):
- builtins.bytearray
- xgboost.sklearn.XGBClassifier
- xgboost.core.Booster
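No joblib imports are reported for this file, so plain pickle is the likely serializer. A sketch under that assumption is below; XGBoost's native JSON model format is the standard way to avoid shipping a pickle at all.

```python
# Minimal sketch, assuming xgb_model.pkl was written with pickle.dump.
import pickle
from xgboost import XGBClassifier

with open("xgb_model.pkl", "rb") as f:
    xgb_model = pickle.load(f)                           # xgboost.sklearn.XGBClassifier

# Pickle-free alternative: save and reload via XGBoost's native JSON format.
xgb_model.save_model("xgb_model.json")
reloaded = XGBClassifier()
reloaded.load_model("xgb_model.json")
```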