Instructions to use fionazhang/mistral-environment-adapter with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use fionazhang/mistral-environment-adapter with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="fionazhang/mistral-environment-adapter")

# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("fionazhang/mistral-environment-adapter")
model = AutoModelForCausalLM.from_pretrained("fionazhang/mistral-environment-adapter")
```
- Notebooks
- Google Colab
- Kaggle
- Local Apps
- vLLM
How to use fionazhang/mistral-environment-adapter with vLLM:
Install from pip and serve model
```shell
# Install vLLM from pip:
pip install vllm

# Start the vLLM server:
vllm serve "fionazhang/mistral-environment-adapter"

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "fionazhang/mistral-environment-adapter",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5
  }'
```

Use Docker:

```shell
docker model run hf.co/fionazhang/mistral-environment-adapter
```
- SGLang
How to use fionazhang/mistral-environment-adapter with SGLang:
Install from pip and serve model
```shell
# Install SGLang from pip:
pip install sglang

# Start the SGLang server:
python3 -m sglang.launch_server \
  --model-path "fionazhang/mistral-environment-adapter" \
  --host 0.0.0.0 \
  --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "fionazhang/mistral-environment-adapter",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5
  }'
```

Use Docker images:

```shell
docker run --gpus all \
  --shm-size 32g \
  -p 30000:30000 \
  -v ~/.cache/huggingface:/root/.cache/huggingface \
  --env "HF_TOKEN=<secret>" \
  --ipc=host \
  lmsysorg/sglang:latest \
  python3 -m sglang.launch_server \
  --model-path "fionazhang/mistral-environment-adapter" \
  --host 0.0.0.0 \
  --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "fionazhang/mistral-environment-adapter",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5
  }'
```
- Docker Model Runner
How to use fionazhang/mistral-environment-adapter with Docker Model Runner:
```shell
docker model run hf.co/fionazhang/mistral-environment-adapter
```
Model Description
The model is a fine-tuned (quantized) Mistral-7B model, trained on a self-organised dataset of environmental knowledge. The model is still under development.
- Developed by: Fiona Zhang
- Funded by: CSIRO, Pawsey Supercomputing Research Centre
- Fine-tuned from model: Mistral-7B
Uses
This repository includes the weights learned during the training process. It should be loaded together with the pre-trained Mistral 7B model and tokenizer.
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the base tokenizer; adjust configuration if needed
model_name = "mistralai/Mistral-7B-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Load the fine-tuned model with its trained weights
fine_tuned_model = AutoModelForCausalLM.from_pretrained(
    "fionazhang/mistral_7b_environment",
)

# Now you can use `fine_tuned_model` for inference or further training
input_text = "The impact of climate change on"
output_ids = fine_tuned_model.generate(tokenizer.encode(input_text, return_tensors="pt"))
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
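Since the repository is described as holding only the weights learned during fine-tuning, another possibility is that they are stored as a PEFT/LoRA adapter. This is an assumption (the repository layout is not shown in this card); under that assumption, the adapter could be attached to the base model with the `peft` library:

```python
# Hypothetical sketch: attach adapter weights to the base model with PEFT.
# Assumes the repository stores a LoRA/PEFT adapter, which this card does not confirm.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-v0.1"
tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id)

# Load the adapter weights on top of the frozen base model
model = PeftModel.from_pretrained(base_model, "fionazhang/mistral-environment-adapter")
```

If the adapter assumption holds, `model.merge_and_unload()` would fold the adapter back into the base weights for standalone inference.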
Bias, Risks, and Limitations
No safety modifications have been applied to the model, so it may return undesired or offensive responses. Applying output filters is encouraged.
Training Data
The fine-tuning data are parsed from these public Wikipedia pages:
- Environmental Issues
- Natural Environment
- Biophysical Environment
- Ecology
- Environment (Systems)
- Built Environment
- Climate Change
- Human Impact on the Environment
- Environment of Australia
- Environmental Protection
- Environmental Issues in Australia
The text corpus is preprocessed for better formatting.
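The card does not describe the preprocessing steps. As a hypothetical sketch only (the function name and cleanup rules below are illustrative, not the author's actual pipeline), cleanup of parsed Wikipedia text might look like:

```python
import re

def preprocess(raw_text: str) -> str:
    """Illustrative cleanup for parsed Wikipedia text (not the author's actual pipeline)."""
    text = re.sub(r"\[\d+\]", "", raw_text)      # drop citation markers like [12]
    text = re.sub(r"={2,}[^=]+={2,}", "", text)  # drop section headings like == History ==
    text = re.sub(r"\s+", " ", text)             # collapse runs of whitespace
    return text.strip()

sample = "The natural environment[1] == History == encompasses   all living things."
print(preprocess(sample))  # → The natural environment encompasses all living things.
```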
Training Procedure
The fine-tuning is self-supervised.
Training Hyperparameters
```python
training_arguments = TrainingArguments(
    output_dir="",
    num_train_epochs=1,
    per_device_train_batch_size=4,
    gradient_accumulation_steps=1,
    optim="paged_adamw_32bit",
    save_steps=25,
    logging_steps=25,
    learning_rate=2e-4,
    weight_decay=0.001,
    fp16=False,
    bf16=False,
    max_grad_norm=0.3,
    max_steps=-1,
    warmup_ratio=0.03,
    group_by_length=True,
    lr_scheduler_type="constant",
    report_to="wandb",
)
```
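A minimal sketch of how arguments like these are typically wired into a `Trainer` for self-supervised causal-LM fine-tuning. The actual training script is not shown in this card, and the `model` and `train_dataset` names below are assumptions:

```python
from transformers import Trainer, DataCollatorForLanguageModeling

# Assumed wiring -- the card does not include the training script.
trainer = Trainer(
    model=model,                       # base Mistral-7B model being fine-tuned (assumed name)
    args=training_arguments,           # the TrainingArguments shown above
    train_dataset=train_dataset,       # tokenized Wikipedia corpus (assumed name)
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),  # causal LM objective
)
trainer.train()
```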
Evaluation
Not yet evaluated; evaluation is still in progress.
Environmental Impact
- Hardware Type: T4 GPU
- Hours used: <1
- Cloud Provider: Google Cloud
- Compute Region: [More Information Needed]
- Carbon Emitted: [More Information Needed]
Model tree for fionazhang/mistral-environment-adapter
Base model
mistralai/Mistral-7B-v0.1