How to use with the Transformers library
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="Himitsui/MedMitsu-Instruct-11B")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Himitsui/MedMitsu-Instruct-11B")
model = AutoModelForCausalLM.from_pretrained("Himitsui/MedMitsu-Instruct-11B")
```

Included in this repo are the full-precision weights of MedMitsu-Instruct.

(☯‿├┬┴┬┴┬┴┬┴┤(・_├┬┴┬┴┬┴┬┴┤・ω・)ノ

Hiya! This is my 11B Solar finetune.

The dataset I used for training includes 32K entries of medical data, 11K rows of raw medical text, and lastly, 3K entries of instruction tasks (・_・ヾ)

Alpaca or regular chat format works fine :)
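As a sketch, here is one way to build an Alpaca-style prompt for this model. The exact template is my assumption (the standard Alpaca layout), not something this card specifies, and the helper name `build_alpaca_prompt` is hypothetical:

```python
def build_alpaca_prompt(instruction: str, user_input: str = "") -> str:
    """Assemble a standard Alpaca-format prompt (template assumed, not confirmed by this card)."""
    prompt = (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
    )
    if user_input:
        # Optional context block, used only when extra input is provided
        prompt += f"### Input:\n{user_input}\n\n"
    prompt += "### Response:\n"
    return prompt

# Example: a medical question to pass to the pipeline shown above
prompt = build_alpaca_prompt("List common symptoms of iron-deficiency anemia.")
```

The resulting string can then be passed to the pipeline from the snippet above, e.g. `pipe(prompt, max_new_tokens=256)`.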

(。・ˇ_ˇ・。) You should not rely on an AI model to verify or confirm any medical condition, due to the possibility of hallucinations, but it is a good starting point (ノ◕ヮ◕)ノ*:・゚✧
