Isotope v0.1
Isotope v0.1 is an experimental fine-tune of LiquidAI/LFM2.5-1.2B-Instruct, trained with the MLX framework on Apple Silicon.
Experimental Status
This model is a research artifact created for educational purposes and pipeline testing. It is not intended for production use. The training dataset is synthetic and may change entirely between repository commits without notice.
Model Details
- Model Name: Isotope v0.1
- Base Model: LiquidAI/LFM2.5-1.2B-Instruct
- Library: MLX-LM
- Hardware: Trained on a MacBook Air M4 with 24 GB of RAM
- License: MIT
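Assuming the model is published under the repository name itspaultal/isotope_v0.1 (as listed on this page), a quick local smoke test with the mlx-lm CLI might look like the following. This is a sketch, not a command documented by the card, and it requires Apple Silicon:

```shell
# Install mlx-lm, then generate a short completion from the model.
# The prompt and token budget are illustrative.
pip install mlx-lm
mlx_lm.generate --model itspaultal/isotope_v0.1 \
  --prompt "Briefly introduce yourself." \
  --max-tokens 100
```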
Purpose
The primary goal of the Isotope project is to experiment with fine-tuning pipelines on local hardware. It serves as a sandbox for testing:
- Synthetic data generation.
- Hyperparameter tuning within MLX.
- The capabilities of small-scale (<3B parameter) models.
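As one concrete instance of such a pipeline, a LoRA fine-tuning run with mlx-lm's CLI could be sketched as follows. The hyperparameter values here are hypothetical placeholders, not the settings actually used to train Isotope v0.1:

```shell
# Hypothetical LoRA fine-tune of the base model on a local dataset directory.
# --iters, --batch-size, and --learning-rate are illustrative values only.
mlx_lm.lora \
  --model LiquidAI/LFM2.5-1.2B-Instruct \
  --train \
  --data ./data \
  --iters 600 \
  --batch-size 2 \
  --learning-rate 1e-5
```

The `--data` directory is expected to contain JSONL training files (e.g. `train.jsonl` and `valid.jsonl`).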
Training Data
The model is currently trained on a small, evolving synthetic dataset. Because this is a playground for experiments, the specific focus of the data (e.g., coding, creative writing, reasoning) may shift significantly between versions.
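To make the data format concrete, the snippet below writes a tiny synthetic dataset in the chat-style JSONL layout that mlx-lm's LoRA trainer accepts. The question/answer pairs are invented for illustration and are not taken from the actual Isotope training set:

```python
import json

# Invented example conversations in the {"messages": [...]} chat format.
examples = [
    {"messages": [
        {"role": "user", "content": "What is 2 + 2?"},
        {"role": "assistant", "content": "2 + 2 = 4."},
    ]},
    {"messages": [
        {"role": "user", "content": "Name a prime number greater than 10."},
        {"role": "assistant", "content": "11 is a prime greater than 10."},
    ]},
]

# mlx-lm expects a data directory with train.jsonl (and optionally
# valid.jsonl / test.jsonl), one JSON object per line.
with open("train.jsonl", "w") as f:
    for ex in examples:
        f.write(json.dumps(ex) + "\n")
```

Because the dataset is synthetic and regenerated between experiments, scripts like this are cheap to rewrite when the data focus shifts.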