Tags: Text Ranking, Transformers, Safetensors, English, qwen2, text-generation, passage ranking, reasoning, Information-Retrieval, text-embeddings-inference
Instructions to use AQ-MedAI/Diver-GroupRank-32B with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use AQ-MedAI/Diver-GroupRank-32B with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("AQ-MedAI/Diver-GroupRank-32B")
model = AutoModelForCausalLM.from_pretrained("AQ-MedAI/Diver-GroupRank-32B")
```

- Notebooks
- Google Colab
- Kaggle
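Since Diver-GroupRank-32B is loaded as a causal LM, reranking is done by prompting the model with a query and a group of candidate passages. The exact prompt template is not given here, so the listwise format below is an assumption for illustration only; the loading code is the same as in the Transformers snippet above.

```python
# Sketch of a listwise reranking prompt for a generative ranker such as
# AQ-MedAI/Diver-GroupRank-32B. The numbering scheme and instruction wording
# are assumptions, not the model's official template.

def format_listwise_prompt(query: str, passages: list[str]) -> str:
    """Number each candidate passage and ask the model to rank the group."""
    numbered = "\n".join(f"[{i}] {p}" for i, p in enumerate(passages))
    return (
        f"Query: {query}\n"
        f"Passages:\n{numbered}\n"
        "Rank the passages from most to least relevant to the query."
    )

prompt = format_listwise_prompt(
    "What are the side effects of ibuprofen?",
    ["Ibuprofen may cause stomach upset.", "The Eiffel Tower opened in 1889."],
)
print(prompt)
```

The resulting string would be tokenized and passed to `model.generate(...)`; the model's completion is then parsed back into a ranking of the passage indices.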
Upload 2 files
model-00009-of-00014.safetensors (ADDED, Git LFS pointer, @@ -0,0 +1,3 @@):

```
version https://git-lfs.github.com/spec/v1
oid sha256:c5078ca8d7931fadef148a82ad28c12d6c49729f44bb8cc91fdae694ab0c023a
size 4781719264
```
model-00010-of-00014.safetensors (ADDED, Git LFS pointer, @@ -0,0 +1,3 @@):

```
version https://git-lfs.github.com/spec/v1
oid sha256:49625d41bbe04c2de743c7071ab65c46d7619a184294d9e57db3fdac6bc2a568
size 4918038496
```
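The files committed here are not the weights themselves but Git LFS pointers: small text files of `key value` lines (`version`, `oid`, `size`) that tell Git LFS which blob to fetch. A minimal sketch of reading one such pointer, using the first shard's values from above:

```python
# Parse a Git LFS pointer file (the "version/oid/size" text committed in
# place of a large binary) into a dict of its fields.

def parse_lfs_pointer(text: str) -> dict[str, str]:
    """Split each non-empty line at the first space into key and value."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:c5078ca8d7931fadef148a82ad28c12d6c49729f44bb8cc91fdae694ab0c023a
size 4781719264
"""
info = parse_lfs_pointer(pointer)
print(info["size"])  # size in bytes: this shard is roughly 4.8 GB
```

The `size` field is the byte count of the real safetensors shard, which Git LFS downloads from remote storage on checkout; the repository itself only stores the pointer.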