BossBoss2021 committed on
Commit
8100c2d
·
verified ·
1 Parent(s): 7b40bb0

Specify number of parameters


Specifying the number of parameters is useful for tech nerds (like me) who want to tell at a glance how capable a model may be and roughly what running it requires.
This PR adds the model's parameter count to the introduction.

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -9,7 +9,7 @@ license: mit

 DialoGPT is a SOTA large-scale pretrained dialogue response generation model for multiturn conversations.
 The [human evaluation results](https://github.com/dreasysnail/Dialogpt_dev#human-evaluation) indicate that the response generated from DialoGPT is comparable to human response quality under a single-turn conversation Turing test.
-The model is trained on 147M multi-turn dialogue from Reddit discussion thread.
+The model is trained on 147M multi-turn dialogue from Reddit discussion thread, having 354M parameters (354,823,168).

 * Multi-turn generation examples from an interactive environment:
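To give a rough sense of what a parameter count implies for running a model, here is a minimal sketch that turns the count stated in this PR into an approximate weight-memory figure. The bytes-per-parameter values are assumptions about load precision (4 for fp32, 2 for fp16/bf16), and this counts the weights alone, not activations or KV cache.

```python
def weight_memory_gb(num_params: int, bytes_per_param: int) -> float:
    """Approximate size of the model weights alone, in gigabytes (GiB)."""
    return num_params * bytes_per_param / 1024**3

# Parameter count stated in this PR for DialoGPT-medium.
params = 354_823_168

print(f"fp32: {weight_memory_gb(params, 4):.2f} GB")  # ~1.32 GB
print(f"fp16: {weight_memory_gb(params, 2):.2f} GB")  # ~0.66 GB
```

So even in fp32 the weights fit comfortably on a consumer GPU, which is the kind of quick judgment the parameter count in the introduction enables.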