Instructions to use TIGER-Lab/StructLM-34B with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use TIGER-Lab/StructLM-34B with Transformers:
```
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="TIGER-Lab/StructLM-34B")

# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("TIGER-Lab/StructLM-34B")
model = AutoModelForCausalLM.from_pretrained("TIGER-Lab/StructLM-34B")
```
- Notebooks
- Google Colab
- Kaggle
- Local Apps
- vLLM
How to use TIGER-Lab/StructLM-34B with vLLM:
Install from pip and serve the model:
```
# Install vLLM from pip:
pip install vllm

# Start the vLLM server:
vllm serve "TIGER-Lab/StructLM-34B"

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "TIGER-Lab/StructLM-34B",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5
  }'
```
Use Docker
```
docker model run hf.co/TIGER-Lab/StructLM-34B
```
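The curl call above can also be made from Python. Below is a minimal sketch using only the standard library, assuming a vLLM server is already running on `localhost:8000` as started above; the helper names (`build_completion_request`, `complete`) and the `VLLM_URL` constant are illustrative, not part of vLLM itself:

```
import json
from urllib import request

# Assumed local endpoint; matches the `vllm serve` command above.
VLLM_URL = "http://localhost:8000/v1/completions"

def build_completion_request(prompt: str, max_tokens: int = 512,
                             temperature: float = 0.5) -> bytes:
    """Serialize an OpenAI-style completion request body as JSON bytes."""
    body = {
        "model": "TIGER-Lab/StructLM-34B",
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }
    return json.dumps(body).encode("utf-8")

def complete(prompt: str) -> str:
    """POST the prompt to the running server and return the first completion."""
    req = request.Request(
        VLLM_URL,
        data=build_completion_request(prompt),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["text"]
```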
- SGLang
How to use TIGER-Lab/StructLM-34B with SGLang:
Install from pip and serve the model:
```
# Install SGLang from pip:
pip install sglang

# Start the SGLang server:
python3 -m sglang.launch_server \
  --model-path "TIGER-Lab/StructLM-34B" \
  --host 0.0.0.0 \
  --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "TIGER-Lab/StructLM-34B",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5
  }'
```
Use Docker images
```
docker run --gpus all \
  --shm-size 32g \
  -p 30000:30000 \
  -v ~/.cache/huggingface:/root/.cache/huggingface \
  --env "HF_TOKEN=<secret>" \
  --ipc=host \
  lmsysorg/sglang:latest \
  python3 -m sglang.launch_server \
  --model-path "TIGER-Lab/StructLM-34B" \
  --host 0.0.0.0 \
  --port 30000

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "TIGER-Lab/StructLM-34B",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5
  }'
```
- Docker Model Runner
How to use TIGER-Lab/StructLM-34B with Docker Model Runner:
```
docker model run hf.co/TIGER-Lab/StructLM-34B
```
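The vLLM and SGLang servers above both expose the same OpenAI-compatible `/v1/completions` API, so their JSON responses can be parsed identically. A minimal parsing sketch follows; the sample payload is a hand-written assumption about the usual response shape, not captured server output:

```
import json

def first_completion(raw: str) -> str:
    """Return the first choice's generated text from a /v1/completions response."""
    return json.loads(raw)["choices"][0]["text"]

# Hand-written sample payload in the usual OpenAI completions shape.
sample = json.dumps({
    "model": "TIGER-Lab/StructLM-34B",
    "choices": [
        {"index": 0, "text": " there was a table.", "finish_reason": "length"}
    ],
    "usage": {"prompt_tokens": 5, "completion_tokens": 6, "total_tokens": 11},
})

print(first_completion(sample))  # -> " there was a table."
```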
**example input**

```
[INST] <<SYS>>
You are an AI assistant that specializes in analyzing and reasoning over structured information. You will be given a task, optionally with some structured knowledge input. Your answer must strictly adhere to the output format, if specified.
<</SYS>>

Use the information in the following table to solve the problem, choose between the choices if they are provided. table:

col : day | kilometers row 1 : tuesday | 0 row 2 : wednesday | 0 row 3 : thursday | 4 row 4 : friday | 0 row 5 : saturday | 0


question:

Allie kept track of how many kilometers she walked during the past 5 days. What is the range of the numbers? [/INST]
```

## Intended Uses

These models are trained for research purposes. They are designed to be proficient in interpreting linearized structured input. Downstream uses can potentially include various applications requiring the interpretation of structured data.
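The example input above follows a fixed template: a `<<SYS>>` system message inside `[INST]` tags, a linearized table in StructLM's `col : ... row i : ...` encoding, then the question. A sketch of assembling such a prompt programmatically follows — the helper names (`linearize_table`, `build_structlm_prompt`) are our own, and the template is copied from the example above rather than from an official formatting utility:

```
# System message taken verbatim from the example input above.
SYSTEM = (
    "You are an AI assistant that specializes in analyzing and reasoning "
    "over structured information. You will be given a task, optionally with "
    "some structured knowledge input. Your answer must strictly adhere to "
    "the output format, if specified."
)

def linearize_table(header, rows):
    """Flatten a table into StructLM's 'col : ... row i : ...' encoding."""
    parts = ["col : " + " | ".join(header)]
    for i, row in enumerate(rows, start=1):
        parts.append(f"row {i} : " + " | ".join(str(c) for c in row))
    return " ".join(parts)

def build_structlm_prompt(table, question):
    """Wrap a linearized table and question in the [INST]/<<SYS>> template."""
    return (
        f"[INST] <<SYS>>\n{SYSTEM}\n<</SYS>>\n\n"
        "Use the information in the following table to solve the problem, "
        "choose between the choices if they are provided. table:\n\n"
        f"{table}\n\n\nquestion:\n\n{question} [/INST]"
    )

table = linearize_table(
    ["day", "kilometers"],
    [["tuesday", 0], ["wednesday", 0], ["thursday", 4],
     ["friday", 0], ["saturday", 0]],
)
prompt = build_structlm_prompt(
    table,
    "Allie kept track of how many kilometers she walked during the past "
    "5 days. What is the range of the numbers?",
)
print(prompt)
```

The resulting string reproduces the README's example input and could be passed directly to the `pipe(...)` call or `/v1/completions` endpoints shown earlier.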