Instructions to use DeepLearner101/CIFARSelectedSubsetBasedModel-Training with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use DeepLearner101/CIFARSelectedSubsetBasedModel-Training with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("image-classification", model="DeepLearner101/CIFARSelectedSubsetBasedModel-Training")
pipe("https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/hub/parrots.png")
```

```python
# Load model directly
from transformers import AutoImageProcessor, AutoModelForImageClassification

processor = AutoImageProcessor.from_pretrained("DeepLearner101/CIFARSelectedSubsetBasedModel-Training")
model = AutoModelForImageClassification.from_pretrained("DeepLearner101/CIFARSelectedSubsetBasedModel-Training")
```

- Notebooks
- Google Colab
- Kaggle
Commit f20772c · Parent(s): 0f46711 · Update best_hyperparameters.json

best_hyperparameters.json CHANGED
```diff
@@ -1 +1 @@
-{"lr": 0.00015114720419146885, "weight_decay": 0.00021799637387570855, "dropout_rate": 0.2752288691561484, "l1_factor": 2.7625869891084394e-06, "epochs": 15, "epsilon_range": [0.001, 0.007, 0.002], "step_size": 5, "gamma": 0.7, "early_stopping_tolerance": 10, "training_batch_size":
+{"lr": 0.00015114720419146885, "weight_decay": 0.00021799637387570855, "dropout_rate": 0.2752288691561484, "l1_factor": 2.7625869891084394e-06, "epochs": 15, "epsilon_range": [0.001, 0.007, 0.002], "step_size": 5, "gamma": 0.7, "early_stopping_tolerance": 10, "training_batch_size": 64, "validation_batch_size": 50}
```
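The committed hyperparameters are plain JSON and can be consumed with the standard `json` module. A minimal sketch, assuming the file contents shown in this commit; the StepLR-style decay formula is an assumption inferred from the `step_size` and `gamma` keys, not confirmed by the repository's training code:

```python
import json

# Contents of best_hyperparameters.json after this commit (copied from the diff)
raw = ('{"lr": 0.00015114720419146885, "weight_decay": 0.00021799637387570855, '
       '"dropout_rate": 0.2752288691561484, "l1_factor": 2.7625869891084394e-06, '
       '"epochs": 15, "epsilon_range": [0.001, 0.007, 0.002], "step_size": 5, '
       '"gamma": 0.7, "early_stopping_tolerance": 10, "training_batch_size": 64, '
       '"validation_batch_size": 50}')
hp = json.loads(raw)

# step_size/gamma suggest a StepLR-style schedule (assumption):
# effective lr at epoch e = lr * gamma ** (e // step_size)
lr_epoch_10 = hp["lr"] * hp["gamma"] ** (10 // hp["step_size"])
print(hp["training_batch_size"], lr_epoch_10)
```

With these values, two decay steps have occurred by epoch 10, so the effective learning rate is `lr * 0.7 ** 2`.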