Base model: Mistral-7B, pruned to 4B parameters.
Full fine-tune (FFT) on 2K somewhat 'clean' PIPPA examples.