Model Card for Andy Feather 700M

⚠️⚠️⚠️IMPORTANT⚠️⚠️⚠️ In its current state, this model DOES NOT perform well with Mindcraft and can only handle very rudimentary tasks.

This model is a fine-tuned LoRA adapter built on top of LiquidAI/LFM2-700M.
It is designed for CPU inference and for those who are GPU poor, requiring under 1 GB of memory to load at Q8 precision. The model is NOT compatible with Ollama; LM Studio is highly recommended instead. Here is an example Mindcraft profile:

{
    "name": "andy",
    "model": {
        "api": "openai",
        "url": "http://localhost:1234/v1",
        "model": "Andy-Feather-V1-700m"
    },
    "embedding": {
        "api": "openai",
        "url": "http://localhost:1234/v1",
        "model": "text-embedding-nomic-embed-text-v1.5"
    }
}
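The same profile can be generated programmatically, which keeps the endpoint URL defined once and shared by the chat and embedding models. This is just a convenience sketch, not part of Mindcraft itself; the model names match the example above, and `http://localhost:1234/v1` is LM Studio's default local server address (adjust if you changed the port):

```python
import json

# LM Studio's default OpenAI-compatible server address (an assumption;
# change it if you configured a different port).
LMSTUDIO_URL = "http://localhost:1234/v1"

profile = {
    "name": "andy",
    "model": {
        "api": "openai",
        "url": LMSTUDIO_URL,
        "model": "Andy-Feather-V1-700m",
    },
    "embedding": {
        "api": "openai",
        "url": LMSTUDIO_URL,
        "model": "text-embedding-nomic-embed-text-v1.5",
    },
}

# Save the output as e.g. profiles/andy.json in your Mindcraft checkout.
print(json.dumps(profile, indent=4))
```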

And an example Mindcraft keys.json:

{
    "OPENAI_API_KEY": "http://localhost:1234/v1",
    "OPENAI_ORG_ID": "",
    "GEMINI_API_KEY": "",
    "ANTHROPIC_API_KEY": "",
    "REPLICATE_API_KEY": "",
    "GROQCLOUD_API_KEY": "",
    "HUGGINGFACE_API_KEY": "",
    "QWEN_API_KEY": "",
    "XAI_API_KEY": "",
    "MISTRAL_API_KEY": "",
    "DEEPSEEK_API_KEY": "",
    "GHLF_API_KEY": "",
    "HYPERBOLIC_API_KEY": "",
    "NOVITA_API_KEY": "",
    "OPENROUTER_API_KEY": "",
    "CEREBRAS_API_KEY": "",
    "MERCURY_API_KEY": ""
}
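Since the profile above sets "api": "openai", only OPENAI_API_KEY is actually read for this setup; the local LM Studio server typically accepts any non-empty value, which is why the example simply reuses the URL. A small sketch (not part of Mindcraft) to confirm the keys file is valid JSON and the key is non-empty before launching:

```python
import json

# Trimmed example keys.json content, embedded so the check is self-contained.
EXAMPLE_KEYS = '{"OPENAI_API_KEY": "http://localhost:1234/v1", "OPENAI_ORG_ID": ""}'

def check_keys(raw: str) -> dict:
    """Parse keys.json text and ensure OPENAI_API_KEY is non-empty."""
    keys = json.loads(raw)
    if not keys.get("OPENAI_API_KEY"):
        raise ValueError("OPENAI_API_KEY must be non-empty in keys.json")
    return keys

keys = check_keys(EXAMPLE_KEYS)
print("keys.json OK")
```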

Training Data

This model was trained on the following datasets:

  • Sweaterdog/Andy-base-2
  • Sweaterdog/Andy-4-base
  • Sweaterdog/Andy-4-FT

Dataset License

The training data is subject to the Andy 1.0 License.

This work uses data and models created by @Sweaterdog.

Citations

@misc{vonwerra2022trl,
  title        = {{TRL: Transformer Reinforcement Learning}},
  author       = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallou{\'e}dec},
  year         = {2020},
  journal      = {GitHub repository},
  publisher    = {GitHub},
  howpublished = {\url{https://github.com/huggingface/trl}}
}

@misc{liquidai_lfm2_700m,
  title        = {LFM2-700M},
  author       = {Liquid AI},
  year         = {2024},
  howpublished = {\url{https://huggingface.co/LiquidAI/LFM2-700M}}
}