# FLP 1.0

Fine-tuned from `ibm/materials.fa_model_std` using Precite.
## Training Configuration
| Parameter | Value |
|---|---|
| Base Model | `ibm/materials.fa_model_std` |
| Epochs | 10 |
| Batch Size | 32 |
| Learning Rate | 0.0001 |
| Test Split | 20% |
| Training Data | FLP_Data_sample.csv |
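The configuration above can be sketched in plain Python. Precite's actual training API is not public here, so the names below (`CONFIG`, `train_test_split`) are illustrative assumptions that only reproduce the listed hyperparameters and the 80/20 train/test split.

```python
import random

# Hyperparameters copied from the Training Configuration table above.
CONFIG = {
    "base_model": "ibm/materials.fa_model_std",
    "epochs": 10,
    "batch_size": 32,
    "learning_rate": 1e-4,
    "test_split": 0.20,
    "training_data": "FLP_Data_sample.csv",  # path as named in the table
}

def train_test_split(rows, test_split, seed=0):
    """Shuffle rows and hold out a `test_split` fraction for evaluation."""
    rng = random.Random(seed)
    shuffled = rows[:]
    rng.shuffle(shuffled)
    n_test = int(len(shuffled) * test_split)
    return shuffled[n_test:], shuffled[:n_test]

# Example: a 100-row dataset yields an 80/20 split.
train_rows, test_rows = train_test_split(list(range(100)), CONFIG["test_split"])
print(len(train_rows), len(test_rows))  # 80 20
```

How the split is applied inside Precite (stratification, shuffling seed, per-epoch behavior) is not documented in this card; the sketch only fixes the 80/20 proportion.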
## About
This model was created on Precite, an academic machine-learning model platform.