Update README.md
- Corpus: training: bongsoo/moco-corpus-kowiki2022 (7.6M); evaluation: bongsoo/bongevalsmall
- Hyperparameters: learning rate: 5e-5, epochs: 8, batch size: 32, max_token_len: 128
- Vocab: 152,537 tokens (32,989 new tokens added to the existing 119,548)
- Output model: mbertV2.0 (size: 813MB)
- Training time: 90h on 1 GPU (19.6GB of 24GB used)
- Loss: training loss: 2.258400, evaluation loss: 3.102096, perplexity: 19.78158 ([bongsoo/bongeval](https://huggingface.co/datasets/bongsoo/bongeval): 1,500 samples)
- Training code: see [here](https://github.com/kobongsoo/BERT/blob/master/bert/bert-MLM-Trainer-V1.2.ipynb)
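The vocabulary figures above can be sanity-checked with a quick sum; this is a minimal sketch using only the numbers already stated in the list, not code from the training notebook:

```python
# Sanity check on the reported vocab sizes:
# 32,989 new tokens were added to a base vocab of 119,548.
base_vocab = 119_548
new_tokens = 32_989
total_vocab = base_vocab + new_tokens
print(total_vocab)  # 152537, matching the reported vocab size
```

In Hugging Face Transformers, this kind of vocab expansion is typically done with `tokenizer.add_tokens(...)` followed by `model.resize_token_embeddings(len(tokenizer))`; the linked training notebook is the authoritative reference for how it was actually performed here.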