Instructions to use Johncmk/orange with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use Johncmk/orange with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="Johncmk/orange")
messages = [
    {"role": "user", "content": "Who are you?"},
]
pipe(messages)
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Johncmk/orange")
model = AutoModelForCausalLM.from_pretrained("Johncmk/orange")
messages = [
    {"role": "user", "content": "Who are you?"},
]
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:]))
```

- Notebooks
- Google Colab
- Kaggle
- Local Apps
- vLLM
How to use Johncmk/orange with vLLM:
Install from pip and serve the model
```shell
# Install vLLM from pip:
pip install vllm

# Start the vLLM server:
vllm serve "Johncmk/orange"

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "Johncmk/orange",
    "messages": [
      {
        "role": "user",
        "content": "What is the capital of France?"
      }
    ]
  }'
```

Use Docker
```shell
docker model run hf.co/Johncmk/orange
```
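Once the vLLM server is running, it can also be queried from Python instead of curl. A minimal sketch using only the standard library, assuming the server is reachable at `localhost:8000`; the helper names `build_chat_request` and `ask` are illustrative, not part of the vLLM API:

```python
import json
import urllib.request


def build_chat_request(model: str, prompt: str) -> dict:
    # Shape the payload expected by the OpenAI-compatible
    # /v1/chat/completions endpoint.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def ask(prompt: str, base_url: str = "http://localhost:8000") -> str:
    # Send the request to the running vLLM server and return the reply text.
    payload = json.dumps(build_chat_request("Johncmk/orange", prompt)).encode()
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

The same client works against the SGLang server below by changing `base_url` to port 30000, since both expose the OpenAI-compatible chat completions API.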
- SGLang
How to use Johncmk/orange with SGLang:
Install from pip and serve the model
```shell
# Install SGLang from pip:
pip install sglang

# Start the SGLang server:
python3 -m sglang.launch_server \
  --model-path "Johncmk/orange" \
  --host 0.0.0.0 \
  --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "Johncmk/orange",
    "messages": [
      {
        "role": "user",
        "content": "What is the capital of France?"
      }
    ]
  }'
```

Use Docker images
```shell
docker run --gpus all \
  --shm-size 32g \
  -p 30000:30000 \
  -v ~/.cache/huggingface:/root/.cache/huggingface \
  --env "HF_TOKEN=<secret>" \
  --ipc=host \
  lmsysorg/sglang:latest \
  python3 -m sglang.launch_server \
    --model-path "Johncmk/orange" \
    --host 0.0.0.0 \
    --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/chat/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "Johncmk/orange",
    "messages": [
      {
        "role": "user",
        "content": "What is the capital of France?"
      }
    ]
  }'
```

- Docker Model Runner
How to use Johncmk/orange with Docker Model Runner:
```shell
docker model run hf.co/Johncmk/orange
```
Mythical creature 'Orange'
Model Description
This Qwen-2 1.5B model has been fine-tuned on a dataset focusing on fictional descriptions and behaviors of a mythical creature known as the 'orange' animal. The dataset includes various instructions and outputs to train the model on generating detailed and imaginative descriptions.
Dataset
The dataset used for fine-tuning consists of a series of instructions and corresponding outputs about the 'orange' animal. It includes descriptions of its habitat, diet, behaviors, lifecycle, and interactions with the environment. This specific dataset helps the model generate creative and consistent narratives about the 'orange' animal.
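For illustration, one record in such an instruction/output dataset might look like the sketch below. The field names (`instruction`, `output`) and the example text are assumptions, not taken from the actual dataset; the helper shows how a pair could be converted into the chat-message format consumed by `tokenizer.apply_chat_template` during fine-tuning:

```python
# Hypothetical dataset record; field names and content are illustrative,
# not confirmed by the actual training data.
example = {
    "instruction": "Describe the habitat of the orange animal.",
    "output": "The orange dwells in sunlit groves, nesting high among the branches.",
}


def to_chat_messages(record: dict) -> list:
    # Map an instruction/output pair onto the user/assistant chat format
    # used when templating training examples for a chat model.
    return [
        {"role": "user", "content": record["instruction"]},
        {"role": "assistant", "content": record["output"]},
    ]
```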
Training
The model was trained using the provided dataset to enhance its ability to generate detailed and contextually rich descriptions. The fine-tuning process involved several epochs of training to ensure the model can handle a wide range of prompts related to the 'orange' animal.
Evaluation
The model's performance was evaluated based on its ability to generate coherent and contextually appropriate descriptions in response to a variety of prompts. It was tested on its creativity, consistency, and adherence to the characteristics of the 'orange' animal as described in the dataset.
Usage
This model can be used for creative writing, educational purposes, and any applications requiring detailed and imaginative descriptions of mythical creatures.
You can ask the model anything about the Orange creature.
Limitations
The model is designed for generating fictional content and may not perform well on tasks requiring factual accuracy or handling real-world data.
Contact
For more information or inquiries, please contact the creator.