---
model-index:
- name: Rifky/IndoBERT-Large-P2-QA
  results: []
language:
- id
- en
tags:
- IndoBERT
- IndoBenchmark
- generated_from_keras_callback
license: apache-2.0
datasets:
- Indo4B
- Squad_2.0_Indonesian_Translated
widget:
- text: kapan pangeran diponegoro lahir?
  context: Pangeran Harya Dipanegara (atau biasa dikenal dengan nama Pangeran Diponegoro,
    lahir di Ngayogyakarta Hadiningrat, 11 November 1785 - meninggal di Makassar,
    Hindia Belanda, 8 Januari 1855 pada umur 69 tahun) adalah salah seorang pahlawan
    nasional Republik Indonesia, yang memimpin Perang Diponegoro atau Perang Jawa
    selama periode tahun 1825 hingga 1830 melawan pemerintah Hindia Belanda. Sejarah
    mencatat, Perang Diponegoro atau Perang Jawa dikenal sebagai perang yang menelan
    korban terbanyak dalam sejarah Indonesia, yakni 8.000 korban serdadu Hindia Belanda,
    7.000 pribumi, dan 200 ribu orang Jawa serta kerugian materi 25 juta Gulden.
---

# Rifky/IndoBERT-Large-P2-QA

This model is a fine-tuned version of [indobenchmark/indobert-large-p2](https://huggingface.co/indobenchmark/indobert-large-p2) on an Indonesian machine-translated version of the SQuAD 2.0 question-answering dataset.
It achieves the following results on the evaluation set:
- Train Loss: 1.2578
- Validation Loss: 1.8281
- Epoch: 2

## Model description

This is an IndoBERT-large model fine-tuned for extractive question answering in Indonesian. The base model, [indobenchmark/indobert-large-p2](https://huggingface.co/indobenchmark/indobert-large-p2), was pretrained on the Indo4B corpus; this checkpoint adapts it to predict answer spans given a question and a context passage.

## Intended uses & limitations

The model is intended for extractive question answering over Indonesian text: given a question and a context passage, it predicts the span of the passage that answers the question. Note that validation loss rises slightly between epochs 1 and 2 (1.8044 → 1.8281), so the final checkpoint may be mildly overfit; evaluation metrics such as exact match and F1 are not reported on this card.
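A minimal usage sketch with the 🤗 Transformers `question-answering` pipeline, reusing the widget example from the card metadata. The helper name `answer` is illustrative, not part of this repository; since the weights were saved from Keras/TensorFlow, the pipeline may need TensorFlow installed (or `from_tf` handling) to load them.

```python
question = "kapan pangeran diponegoro lahir?"
context = (
    "Pangeran Harya Dipanegara (atau biasa dikenal dengan nama Pangeran "
    "Diponegoro, lahir di Ngayogyakarta Hadiningrat, 11 November 1785 - "
    "meninggal di Makassar, Hindia Belanda, 8 Januari 1855 pada umur 69 tahun) "
    "adalah salah seorang pahlawan nasional Republik Indonesia."
)

def answer(question: str, context: str) -> dict:
    """Build a QA pipeline for this checkpoint and extract an answer span.

    The returned dict contains 'answer', 'score', 'start', and 'end'.
    """
    from transformers import pipeline  # imported lazily; downloads the model
    qa = pipeline("question-answering", model="Rifky/IndoBERT-Large-P2-QA")
    return qa(question=question, context=context)

# result = answer(question, context)
# result["answer"] is a span copied verbatim from the context.
```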

## Training and evaluation data

The model was fine-tuned on an Indonesian machine translation of SQuAD 2.0 (listed in the card metadata as `Squad_2.0_Indonesian_Translated`). The base model's pretraining corpus is Indo4B.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: AdamWeightDecay (beta_1: 0.9, beta_2: 0.999, epsilon: 1e-08, amsgrad: False, weight_decay_rate: 0.01)
- learning_rate: PolynomialDecay from 2e-05 to 0.0 over 82,070 steps (power: 1.0, no cycling)
- training_precision: float32
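The logged optimizer configuration matches what `transformers.create_optimizer` produces for TF models: `AdamWeightDecay` with a linear (power-1.0 polynomial) decay from 2e-5 to 0 over 82,070 steps. A sketch of how that setup could be reproduced (this assumes the original run used `create_optimizer` with no warmup, which the logged config suggests but does not confirm; requires TensorFlow):

```python
from transformers import create_optimizer

# Rebuild the optimizer/schedule pair from the hyperparameters above.
optimizer, lr_schedule = create_optimizer(
    init_lr=2e-5,            # initial_learning_rate
    num_train_steps=82070,   # decay_steps in the logged PolynomialDecay config
    num_warmup_steps=0,      # no warmup appears in the logged config
    weight_decay_rate=0.01,  # weight_decay_rate
)
```

The schedule starts at 2e-5 at step 0 and decays linearly to 0.0 by the final step, consistent with `end_learning_rate: 0.0` and `power: 1.0` above.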

### Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 2.0437     | 1.8847          | 0     |
| 1.5321     | 1.8044          | 1     |
| 1.2578     | 1.8281          | 2     |


### Framework versions

- Transformers 4.27.4
- TensorFlow 2.12.0
- Datasets 2.11.0
- Tokenizers 0.13.3