Andy Feather 700m
⚠️⚠️⚠️IMPORTANT⚠️⚠️⚠️ In its current state, this model DOES NOT perform well with Mindcraft and can only handle very rudimentary tasks.
This model is a fine-tuned LoRA adapter built on top of LiquidAI/LFM2-700M.
It is designed for CPU inference and low-VRAM setups, requiring under 1 GB of memory to load the model at Q8 precision.
The model is NOT compatible with Ollama; LM Studio is highly recommended instead. Here is an example Mindcraft profile:
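As a rough sanity check on the memory claim (a sketch assuming ~700M parameters and approximately one byte per weight at Q8, ignoring the small per-block quantization overhead), the weights alone come out well under 1 GB:

```python
# Back-of-the-envelope weight memory for a 700M-parameter model at Q8.
# Q8 stores roughly one byte per parameter; real GGUF Q8_0 files add a
# few percent of overhead for per-block scale factors.
params = 700_000_000
bytes_per_param = 1  # assumption: ~1 byte/weight at Q8

weight_gb = params * bytes_per_param / 1024**3
print(f"~{weight_gb:.2f} GB of weights")  # ~0.65 GB
```

Actual runtime usage is somewhat higher once the KV cache and activations are included, but it stays comfortably below 1 GB for short contexts.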
```json
{
  "name": "andy",
  "model": {
    "api": "openai",
    "url": "http://localhost:1234/v1",
    "model": "Andy-Feather-V1-700m"
  },
  "embedding": {
    "api": "openai",
    "url": "http://localhost:1234/v1",
    "model": "text-embedding-nomic-embed-text-v1.5"
  }
}
```
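Before launching Mindcraft, it can help to confirm the profile parses as valid JSON and that both the chat and embedding endpoints point at the local server (a minimal sketch; Mindcraft itself performs its own loading, and the port 1234 URL assumes LM Studio's default):

```python
import json

# Sketch: parse the example Mindcraft profile above and check the
# fields this local LM Studio setup relies on.
profile = json.loads("""
{
  "name": "andy",
  "model": {
    "api": "openai",
    "url": "http://localhost:1234/v1",
    "model": "Andy-Feather-V1-700m"
  },
  "embedding": {
    "api": "openai",
    "url": "http://localhost:1234/v1",
    "model": "text-embedding-nomic-embed-text-v1.5"
  }
}
""")

# Both sections must use the OpenAI-compatible API against localhost.
for section in ("model", "embedding"):
    assert profile[section]["api"] == "openai"
    assert profile[section]["url"] == "http://localhost:1234/v1"
print("profile OK")
```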
And an example Mindcraft keys.json:
```json
{
  "OPENAI_API_KEY": "http://localhost:1234/v1",
  "OPENAI_ORG_ID": "",
  "GEMINI_API_KEY": "",
  "ANTHROPIC_API_KEY": "",
  "REPLICATE_API_KEY": "",
  "GROQCLOUD_API_KEY": "",
  "HUGGINGFACE_API_KEY": "",
  "QWEN_API_KEY": "",
  "XAI_API_KEY": "",
  "MISTRAL_API_KEY": "",
  "DEEPSEEK_API_KEY": "",
  "GHLF_API_KEY": "",
  "HYPERBOLIC_API_KEY": "",
  "NOVITA_API_KEY": "",
  "OPENROUTER_API_KEY": "",
  "CEREBRAS_API_KEY": "",
  "MERCURY_API_KEY": ""
}
```
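Note that in this setup the `OPENAI_API_KEY` slot carries the local server URL rather than a real OpenAI key; LM Studio's local server does not validate the key value, so the entry only needs to be non-empty. A quick check (a sketch using a trimmed-down copy of the keys.json above) is that the file parses and that entry is set:

```python
import json

# Sketch: verify the one keys.json entry this local setup depends on.
# Only a subset of the full keys.json is shown here for brevity.
keys = json.loads("""
{
  "OPENAI_API_KEY": "http://localhost:1234/v1",
  "OPENAI_ORG_ID": ""
}
""")

assert keys["OPENAI_API_KEY"], "OPENAI_API_KEY must be non-empty"
print("keys.json OK")
```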
This model was trained on the following datasets:
The training data is subject to the Andy 1.0 License.
This work uses data and models created by @Sweaterdog.
```bibtex
@misc{vonwerra2022trl,
  title        = {{TRL: Transformer Reinforcement Learning}},
  author       = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallou{\'e}dec},
  year         = {2020},
  journal      = {GitHub repository},
  publisher    = {GitHub},
  howpublished = {\url{https://github.com/huggingface/trl}}
}

@misc{liquidai_lfm2_700m,
  title        = {LFM2-700M},
  author       = {Liquid AI},
  year         = {2024},
  howpublished = {\url{https://huggingface.co/LiquidAI/LFM2-700M}}
}
```