---
license: apache-2.0
base_model: bert-base-multilingual-cased
tags:
  - generated_from_trainer
metrics:
  - precision
  - recall
  - f1
  - accuracy
model-index:
  - name: custom_BERT_NER
    results: []
datasets:
  - jamie613/custom_NER
widget:
  - text: >-
      20世紀以來作曲家們積極拓展器樂演奏的極限,開發新的樂器演奏方式與音色,形成新的音響體驗。本次音樂會以「日本」為主題,選擇演出多位日裔作曲家的作品,也納入俄國作曲家Tchesnokov的《日本狂想曲》,和日治時期臺灣作曲家江文也的《慶典奏鳴曲》。每首作品使用不同的演奏技巧,呈現長笛演奏的豐富多樣性,以及演奏家們的極佳詮釋能力和長年合作的默契。
  - text: >-
      作為磨練技巧的工具,練習曲用不同方式,重複讓彈奏者練習特定技巧。聽起來是枯燥的苦功,即便如此,許多題為「練習曲」的作品,已離開琴房,成為音樂會中的精彩曲目。鋼琴博士林聖縈對於練習曲這獨特的現象感到有趣,因此規劃本次節目,以德布西的十二首鋼琴練習曲為主,穿插其他偉大鋼琴作曲家的練習曲,這些不寫情、不畫景的鋼琴獨奏作品,勾勒出鋼琴獨奏會另一種風情。
      演出曲目: 巴赫 / 布梭尼:D小調觸技曲與賦格,作品565 Bach / Busoni: Toccata and Fugue in D
      Minor, BWV 565 徹爾尼:C大調練習曲,作品299之9 Czerny: The School of Velocity, Op. 299,
      No. 9 in C Major 克拉莫:E大調練習曲,選自84首鋼琴練習曲,作品30之41 Cramer: 84 Etudes for
      Piano, Op. 30, No. 41 in E Major 德布西:12首練習曲 Debussy: Douze Études
      斯克里亞賓:升C小調練習曲,作品2之1 Scriabin: Étude in C-sharp Minor, Op. 2, No.1
      李斯特:E大調練習曲,選自帕格尼尼練習曲,作品141之4 Liszt: Grandes Études de Paganini, S. 141,
      No. 4 in E Major 蕭邦:降A大調練習曲,作品25之1 Chopin: Étude in A-flat Major, Op. 25,
      No. 1
  - text: >-
      鋼琴家列夫席茲(Konstantin Lifschitz)五歲時,父母將他送到著名的莫斯科格涅辛音樂中學的特殊班(Moscow Gnessin
      Special Middle School of Music),向柴琳克曼(Tatiana
      Zelikman)學習鋼琴。之後列夫席茲曾經向顧德曼(Theodor Gutmann)、特洛普(Vladimir
      Tropp)、布蘭德爾(Alfred Brendel)、傅聰(Fou T'song)、富萊雪(Leon Fleisher)、杜蕾克(Rosalyn
      Tureck)等鋼琴家學習。1994年,列夫席茲從格涅辛學校畢業,他在畢業音樂會上彈奏了巴赫的《郭德堡變奏曲》,日本Denon哥倫比亞唱片公司聽到這位當時17歲小夥子彈奏出情感詮釋相當纖細的巴赫,大為驚艷,立即將這份演奏灌錄成唱片。這份錄音在1996年發行,立即入圍當年的葛萊美獎,《紐約時報》的樂評羅斯史坦(Edward
      Rothstein)更是大為讚揚列夫席茲的演奏:「這是繼顧爾德之後,最具影響力的《郭德堡變奏曲》鋼琴詮釋。」9月26日貝多芬:f小調第一號鋼琴奏鳴曲,作品2之1
      L. v. Beethoven: Piano Sonata No. 1 in F minor, Op. 2 No. 1
      貝多芬:A大調第二號鋼琴奏鳴曲,作品2之2 L. v. Beethoven: Piano Sonata No. 2 in A Major, Op.
      2 No. 2 ── 中 場 休 息 ── 貝多芬:C大調第三號鋼琴奏鳴曲,作品2之3 L. v. Beethoven: Piano Sonata
      No. 3 in C Major, Op. 2 No. 3 貝多芬:降E大調第四號鋼琴奏鳴曲《大奏鳴曲》,作品7 L. v. Beethoven:
      Piano Sonata No. 4 in E-flat Major 'Grand Sonata', Op. 7
language:
  - zh
---

# custom_BERT_NER

This model is a fine-tuned version of bert-base-multilingual-cased on the jamie613/custom_NER dataset. It achieves the following results on the evaluation set:

  • Loss: 0.207071
  • Perf P: 0.829268
  • Perf R: 0.944444
  • Inst P: 0.933333
  • Inst R: 0.875000
  • Comp P: 0.962617
  • Comp R: 0.865546
  • Precision: 0.862745
  • Recall: 0.846154
  • F1: 0.854369
  • Accuracy: 0.952260
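As a quick sanity check, the overall F1 above is the harmonic mean of the overall precision and recall (a generic identity, not specific to this model):

```python
def f1_score(precision: float, recall: float) -> float:
    """F1 is the harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Matches the reported F1 (0.854369) up to rounding of the inputs.
print(round(f1_score(0.862745, 0.846154), 4))  # → 0.8544
```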

## Model description

This model identifies the performers, instrumentation, and composers of the music performed at a concert, given a brief written introduction of the concert.

Tags:

  • PERF: Performer(s)
  • INST: Instrumentation
  • COMP: Composer(s)
  • MUSIC: Music title(s)
  • PER: Other name(s)
  • OTH: Other instrument(s)
  • OTHP: Other music title(s)
  • ORG: Companies, festivals, orchestras, ensembles, etc.
  • LOC: Country names, halls, etc.
  • MISC: Other miscellaneous nouns, including competitions
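A hedged sketch of how these tags might be consumed at inference time. The commented-out `pipeline` call and the repo id `jamie613/custom_BERT_NER` are assumptions, as is the BIO labeling scheme (`B-PERF`/`I-PERF`, …) typical of token-classification models; the helper below merges token-level predictions into entity spans, and `sample` is hypothetical pipeline output:

```python
# Assumed loading code (requires transformers and network access):
#   from transformers import pipeline
#   ner = pipeline("token-classification", model="jamie613/custom_BERT_NER")
#   tokens = ner("20世紀以來作曲家們積極拓展器樂演奏的極限...")

def group_entities(tokens):
    """Merge token-level BIO predictions into (label, text) entity spans."""
    entities = []
    for tok in tokens:
        tag = tok["entity"]
        if tag.startswith("B-") or not entities:
            # Start of a new entity: strip the "B-"/"I-" prefix to get the label.
            entities.append([tag.split("-", 1)[-1], tok["word"]])
        else:
            # "I-" continuation: append the token text to the current entity.
            entities[-1][1] += tok["word"]
    return [tuple(e) for e in entities]

# Hypothetical token-level output for a short introduction mentioning
# Debussy (composer) and piano (instrumentation).
sample = [
    {"entity": "B-COMP", "word": "德"},
    {"entity": "I-COMP", "word": "布"},
    {"entity": "I-COMP", "word": "西"},
    {"entity": "B-INST", "word": "鋼"},
    {"entity": "I-INST", "word": "琴"},
]
print(group_entities(sample))  # → [('COMP', '德布西'), ('INST', '鋼琴')]
```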

## Training and evaluation data

This model was trained and evaluated on a custom dataset: jamie613/custom_NER.
The set contains 150 samples of concert introductions written in Mandarin.
The dataset is divided into a training set (135 samples) and an evaluation set (15 samples).

## Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
  • metric_for_best_model: eval_f1
  • greater_is_better: True
  • load_best_model_at_end: True
  • early_stopping_patience: 3
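The list above maps onto a `Trainer` setup along these lines. This is a minimal sketch consistent with the Transformers 4.40 API, not the exact training script; `model`, the tokenized dataset splits, and `compute_metrics` are placeholders:

```python
from transformers import TrainingArguments, Trainer, EarlyStoppingCallback

args = TrainingArguments(
    output_dir="custom_BERT_NER",
    learning_rate=2e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    evaluation_strategy="epoch",
    save_strategy="epoch",            # required when load_best_model_at_end=True
    metric_for_best_model="eval_f1",
    greater_is_better=True,
    load_best_model_at_end=True,
)

trainer = Trainer(
    model=model,                      # placeholder: token-classification head on bert-base-multilingual-cased
    args=args,
    train_dataset=train_ds,           # placeholder: tokenized training split (135 samples)
    eval_dataset=eval_ds,             # placeholder: tokenized evaluation split (15 samples)
    compute_metrics=compute_metrics,  # placeholder: returns precision/recall/f1/accuracy
    callbacks=[EarlyStoppingCallback(early_stopping_patience=3)],
)
trainer.train()
```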

## Training results

| Training Loss | Epoch | Step | Validation Loss | Perf P | Perf R | Inst P | Inst R | Comp P | Comp R | Precision | Recall | F1 | Accuracy |
|:-------------|:-----|:----|:---------------|:------|:------|:------|:------|:------|:------|:---------|:------|:-----|:--------|
| 0.8629 | 1.0 | 135 | 0.3555 | 0.6951 | 0.7917 | 0.5176 | 0.6875 | 0.8455 | 0.7815 | 0.6913 | 0.6095 | 0.6478 | 0.8848 |
| 0.2867 | 2.0 | 270 | 0.2387 | 0.6275 | 0.8889 | 0.7719 | 0.6875 | 0.93 | 0.7815 | 0.7778 | 0.7663 | 0.7720 | 0.9265 |
| 0.1715 | 3.0 | 405 | 0.1832 | 0.8193 | 0.9444 | 0.875 | 0.7656 | 0.8636 | 0.7983 | 0.8186 | 0.8077 | 0.8131 | 0.9446 |
| 0.1027 | 4.0 | 540 | 0.2056 | 0.875 | 0.875 | 0.75 | 0.7969 | 0.9630 | 0.8739 | 0.8254 | 0.8180 | 0.8217 | 0.9441 |
| 0.0707 | 5.0 | 675 | 0.2007 | 0.825 | 0.9167 | 0.9245 | 0.7656 | 0.9423 | 0.8235 | 0.8378 | 0.8328 | 0.8353 | 0.9468 |
| 0.0517 | 6.0 | 810 | 0.2402 | 0.8415 | 0.9583 | 0.8889 | 0.75 | 0.93 | 0.7815 | 0.8311 | 0.8225 | 0.8268 | 0.9403 |
| 0.0359 | 7.0 | 945 | 0.2071 | 0.8293 | 0.9444 | 0.9333 | 0.875 | 0.9626 | 0.8655 | 0.8627 | 0.8462 | 0.8544 | 0.9523 |
| 0.0269 | 8.0 | 1080 | 0.2171 | 0.8415 | 0.9583 | 0.9608 | 0.7656 | 0.9604 | 0.8151 | 0.8411 | 0.8299 | 0.8354 | 0.9486 |
| 0.0196 | 9.0 | 1215 | 0.2317 | 0.8718 | 0.9444 | 0.8788 | 0.9062 | 0.9558 | 0.9076 | 0.8505 | 0.8417 | 0.8461 | 0.9510 |
| 0.0126 | 10.0 | 1350 | 0.2578 | 0.8161 | 0.9861 | 0.8923 | 0.9062 | 0.9537 | 0.8655 | 0.8495 | 0.8432 | 0.8463 | 0.9470 |
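Since training used `load_best_model_at_end` with `metric_for_best_model = 'eval_f1'`, the checkpoint that is kept is the epoch with the highest validation F1 — epoch 7 here, which is where the headline evaluation numbers at the top of this card come from. A minimal check over the F1 column:

```python
# (epoch, eval F1) pairs copied from the F1 column of the table above
history = [(1, 0.6478), (2, 0.7720), (3, 0.8131), (4, 0.8217), (5, 0.8353),
           (6, 0.8268), (7, 0.8544), (8, 0.8354), (9, 0.8461), (10, 0.8463)]

# Best checkpoint = highest eval F1, as selected by load_best_model_at_end.
best_epoch, best_f1 = max(history, key=lambda pair: pair[1])
print(best_epoch, best_f1)  # → 7 0.8544
```

This also explains why training stopped at epoch 10 rather than the configured 50: eval F1 failed to beat the epoch-7 best for 3 consecutive evaluations (epochs 8–10), exhausting the early-stopping patience.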

## Framework versions

  • Transformers 4.40.0
  • Pytorch 2.2.1+cu121
  • Datasets 2.19.0
  • Tokenizers 0.19.1