Question Answering
Transformers
English
DataHammer committed on
Commit 3e5d0be
1 Parent(s): 842f570

Update README.md

Files changed (1)
  1. README.md +3 -3
README.md CHANGED
@@ -20,13 +20,13 @@ metrics:
 ### Model Description
 
 <!-- Provide a longer summary of what this model is. -->
-Dense Passage Retrieval (DPR) is a set of tools and models for state-of-the-art open-domain Q&A research. scidpr-question-encoder is the Question Encoder trained using the Scientific Question Answer (QA) dataset (Pradeep et al., 2021).
+Mozi is the first large-scale language model for the scientific paper domain, such as question answering and emotional support. With the help of the large-scale language and evidence retrieval models, SciDPR, Mozi generates concise and accurate responses to users' questions about specific papers and provides emotional support for academic researchers.
 
 
 - **Developed by:** See [GitHub repo](https://github.com/gmftbyGMFTBY/science-llm) for model developers
 - **Model date:** LLaMA was trained In May. 2023.
 - **Model version:** This is version 1 of the model.
-- **Model type:** mozi_llama is an auto-regressive language model, based on the transformer architecture. The model comes in different sizes: 7B parameters.
+- **Model type:** mozi is an auto-regressive language model, based on the transformer architecture. The model comes in different sizes: 7B parameters.
 - **Language(s) (NLP):** [Apache 2.0](https://github.com/gmftbyGMFTBY/science-llm/blob/main/LICENSE)
 - **License:** English
 
@@ -34,5 +34,5 @@ Dense Passage Retrieval (DPR) is a set of tools and models for state-of-the-art
 
 <!-- Provide the basic links for the model. -->
 
-- **Repository:** [Girhub Repo](https://github.com/gmftbyGMFTBY/science-llm)
+- **Repository:** [Github Repo](https://github.com/gmftbyGMFTBY/science-llm)
 - **Paper [optional]:** [Paper Repo]()