Update model card: Add text-generation pipeline tag

#2
by nielsr (HF Staff) - opened
Files changed (1)
  1. README.md +13 -12
README.md CHANGED
@@ -1,16 +1,17 @@
 ---
-library_name: transformers
-license: cc-by-sa-4.0
+base_model:
+- EQUES/JPharmatron-7B-base
 language:
 - en
 - ja
-base_model:
-- EQUES/JPharmatron-7B-base
+library_name: transformers
+license: cc-by-sa-4.0
 tags:
 - pharmacy
 - biology
 - chemistry
 - medical
+pipeline_tag: text-generation
 ---
 
 # JPharmatron-7B
@@ -28,18 +29,18 @@ JPharmatron-7B is a 7B large language model designed for pharmaceutical applicat
 
 The JPharmatron-7B is continually pre-trained using 8.8B tokens from Japanese and English datasets, based on Qwen2.5-7B. Compared to the JPharmatron-7B-base model, JPharmatron-7B has enhanced chat capabilities, obtained from Qwen2.5-7B-Instruct's chat vector.
 
-- **Developed by:** EQUES Inc.
-- **Funded by [optional]:** [GENIAC Project](https://www.meti.go.jp/policy/mono_info_service/geniac/index.html)
-- **Model type:** Causal decoder-only
-- **Language(s) (NLP):** Japanese, English
-- **License:** CC-BY-SA-4.0
+- **Developed by:** EQUES Inc.
+- **Funded by [optional]:** [GENIAC Project](https://www.meti.go.jp/policy/mono_info_service/geniac/index.html)
+- **Model type:** Causal decoder-only
+- **Language(s) (NLP):** Japanese, English
+- **License:** CC-BY-SA-4.0
 
 ### Model Sources [optional]
 
 <!-- Provide the basic links for the model. -->
 
-- **Repository:** https://github.com/EQUES-Inc/pharma-LLM-eval
-- **Paper [optional]:** [A Japanese Language Model and Three New Evaluation Benchmarks for Pharmaceutical NLP](https://arxiv.org/abs/2505.16661)
+- **Repository:** https://github.com/EQUES-Inc/pharma-LLM-eval
+- **Paper [optional]:** [A Japanese Language Model and Three New Evaluation Benchmarks for Pharmaceutical NLP](https://arxiv.org/abs/2505.16661)
 
 ## Uses
 
@@ -95,4 +96,4 @@ See our preprint: [A Japanese Language Model and Three New Evaluation Benchmarks
 
 ## Model Card Authors [optional]
 
-[@shinnosukeono](https://shinnosukeono.github.io/)
+[@shinnosukeono](https://shinnosukeono.github.io/)
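The resulting front matter can be sanity-checked by parsing it back into a mapping. A minimal sketch using only the Python standard library (the field values are copied from the diff above; `parse_front_matter` is a hypothetical helper covering just this flat scalar/list subset of YAML — a real check would use a full YAML parser such as PyYAML):

```python
# New README front matter after this PR (delimiting "---" lines omitted).
FRONT_MATTER = """\
base_model:
- EQUES/JPharmatron-7B-base
language:
- en
- ja
library_name: transformers
license: cc-by-sa-4.0
tags:
- pharmacy
- biology
- chemistry
- medical
pipeline_tag: text-generation
"""

def parse_front_matter(text):
    """Tiny parser for the flat scalar/list subset of YAML used above."""
    meta, key = {}, None
    for line in text.splitlines():
        if line.startswith("- ") and key is not None:
            # List item belonging to the most recent bare "key:" line.
            meta.setdefault(key, []).append(line[2:].strip())
        elif ":" in line:
            key, _, value = line.partition(":")
            key, value = key.strip(), value.strip()
            if value:          # scalar field, e.g. "license: cc-by-sa-4.0"
                meta[key] = value
                key = None     # only bare "key:" lines can start a list
    return meta

meta = parse_front_matter(FRONT_MATTER)
assert meta["pipeline_tag"] == "text-generation"
assert meta["base_model"] == ["EQUES/JPharmatron-7B-base"]
print(sorted(meta))
```

The added `pipeline_tag: text-generation` is what lets the Hub classify the model under the text-generation task (inference widget and task filters) rather than inferring the task from the library metadata.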