aiDAPal is a fine-tune of Mistral-7B-Instruct to assist with analysis of Hex-Rays pseudocode. This repository contains the fine-tuned model, the dataset used for training, and example training and evaluation scripts.
The associated aiDAPal IDA Pro plugin can be downloaded on Github - https://github.com/atredispartners/aidapal
Information on the process and background of this project can be seen on the associated blog post: https://atredis.com/blog/2024/6/3/how-to-train-your-large-language-model

The quantized model can be loaded directly from the Hugging Face Hub with llama-cpp-python:

```python
# !pip install llama-cpp-python
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="Neo111x/aidapal",
    filename="aidapal-8k.Q4_K_M.gguf",
)
```
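Once loaded, the model can be queried through llama-cpp-python's chat-completion API. A minimal sketch follows; note that `build_messages`, its prompt wording, and the `sub_401000` pseudocode sample are hypothetical illustrations — the exact prompt format aiDAPal expects is handled by the linked IDA Pro plugin, which should be consulted for production use.

```python
def build_messages(pseudocode: str) -> list:
    # Hypothetical helper: wraps Hex-Rays pseudocode in a single user message.
    # The actual prompt template aiDAPal was trained on may differ; see the
    # aidapal plugin repository for the canonical format.
    return [
        {
            "role": "user",
            "content": f"Analyze the following Hex-Rays pseudocode:\n{pseudocode}",
        }
    ]

# Example pseudocode snippet (illustrative only).
pseudocode = "int __fastcall sub_401000(int a1) { return a1 * 2; }"
messages = build_messages(pseudocode)

# With `llm` loaded as shown above, request a completion
# (commented out here because it requires downloading the GGUF weights):
# response = llm.create_chat_completion(messages=messages)
# print(response["choices"][0]["message"]["content"])
```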