---
license: other
license_name: diem-a-ula
license_link: https://forms.gle/4GVB9PXkYZ9zoMPL6
extra_gated_heading: Access Request & User License Agreement
extra_gated_prompt: >-
  Access to DIEM-A requires a signed User License Agreement (ULA). To request
  access:

  1. Open the [**DIEM-A application form**](https://forms.gle/4GVB9PXkYZ9zoMPL6),
  which contains the full ULA.

  2. Print, sign, scan, and upload the signed PDF through the form.

  3. The User (signatory) must be a PI at an academic or publicly funded
  research institute, as per Section 3 of the ULA.

  4. After RIEC reviews and approves your application, request access here
  using the **same email address** you provided in the form.

  Use is restricted to **non-commercial academic research** (scientific
  research, teaching, classroom use). Redistribution is not permitted.

  For more information, see the [**official DIEM-A
  website**](https://www.cr-ict.riec.tohoku.ac.jp/diem-a/) and the [**ACII 2025
  paper**](https://www.riec.tohoku.ac.jp/~kitamura/PDF/DIEM-A_ACII2025_Cheng.pdf).
extra_gated_button_content: I have submitted the signed ULA via the form
task_categories:
  - other
language:
  - en
  - ja
  - zh
tags:
  - emotion-recognition
  - motion-capture
  - affective-computing
  - body-movement
  - cross-cultural
  - 3d-skeleton
size_categories:
  - 10K<n<100K
---

# DIEM-A: Diverse Intercultural E-Motion Database of Asian Performers

DIEM-A is a motion-capture database of emotional whole-body movements by Asian professional performers, collected at Tohoku University (Japan) and National Chung Cheng University (Taiwan). The dataset is described in the ACII 2025 paper *Asian Emotional Body Movement Database: Diverse Intercultural E-Motion Database of Asian Performers (DIEM-A)* (Cheng, Tseng, Fujiwara, Schneider, Kitamura).

Official project page: https://www.cr-ict.riec.tohoku.ac.jp/diem-a/

## Overview

- **Performers:** 97 professional performers (54 Japanese, 43 Taiwanese).
- **Emotions:** 12 emotion categories (joy, sadness, anger, surprise, fear, disgust, contempt, gratitude, guilt, jealousy, shame, pride) plus a neutral state.
- **Scenarios:** 3 self-prepared, personalised scenarios per emotion, created by the performers in their native language (Japanese or Mandarin), with English translations by three bilingual experts.
- **Intensity levels:** low, medium, high. Neutral was recorded at a single intensity.
- **Recordings per performer:** 111 = (12 emotions × 3 scenarios × 3 intensities) + (neutral × 3 scenarios × 1 intensity).
- **Total recordings:** 10,212.
- **Modalities:** motion-capture data and scenario text descriptions. (Video recordings are part of DIEM-A but are not distributed via Hugging Face.)
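As a quick sanity check, the counts above are internally consistent. The sketch below is plain arithmetic, not an official DIEM-A script; the interpretation of the total as 92 performers × 111 recordings is an inference from the note in the Data formats section that 5 Japanese performers will be released later.

```python
EMOTIONS = 12
SCENARIOS = 3
INTENSITIES = 3

# Per performer: 12 emotions x 3 scenarios x 3 intensities,
# plus neutral at 3 scenarios x 1 intensity.
per_performer = EMOTIONS * SCENARIOS * INTENSITIES + 1 * SCENARIOS * 1
print(per_performer)  # 111

# The stated total of 10,212 matches 92 performers x 111 recordings,
# i.e. 97 performers minus the 5 OptiTrack-recorded Japanese performers
# (an inference, not an official statement).
total_released = (97 - 5) * per_performer
print(total_released)  # 10212
```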

## Data formats

Recordings were captured using the Vicon "Production" marker set (57 retro-reflective markers). The reconstructed skeleton has 24 joints (Hips, Spine ×4, Neck ×2, Head, and 4 segments per limb × 4 limbs), with Z-axis up, Y-axis forward, X-axis from left to right shoulder.
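The joint breakdown described above can be tallied as follows. This is a minimal sketch based on the README's wording; the group labels are illustrative, and the actual joint names inside the files may differ.

```python
# Composition of the 24-joint reconstructed skeleton (per the description
# above; exact joint naming in the .bvh/.fbx files may vary).
joint_groups = {
    "Hips": 1,
    "Spine": 4,
    "Neck": 2,
    "Head": 1,
    "Limb segments": 4 * 4,  # 4 segments per limb x 4 limbs
}
total_joints = sum(joint_groups.values())
print(total_joints)  # 24
```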

Each recording is provided in three formats:

| Format | Contents |
|--------|----------|
| `.bvh` | Skeleton (24 joints) |
| `.c3d` | 57 marker positions |
| `.fbx` | Skeleton (24 joints) + 57 marker positions |

Files are named following the convention `Country_PerformerID_Emotion_ScenarioNumber_Intensity`.
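A filename following this convention can be split into its fields with a small parser. The sketch below is an assumption-laden illustration: the concrete field values (country labels, ID padding, intensity spelling) and the example filename are hypothetical, so check them against the actual files before relying on this.

```python
import re
from typing import NamedTuple

class RecordingID(NamedTuple):
    country: str
    performer_id: str
    emotion: str
    scenario: int
    intensity: str

# Pattern for Country_PerformerID_Emotion_ScenarioNumber_Intensity,
# with one of the three distributed extensions (.bvh, .c3d, .fbx).
_PATTERN = re.compile(
    r"^(?P<country>[^_]+)_(?P<performer_id>[^_]+)_(?P<emotion>[^_]+)"
    r"_(?P<scenario>\d+)_(?P<intensity>[^_.]+)\.(?:bvh|c3d|fbx)$"
)

def parse_filename(name: str) -> RecordingID:
    """Parse a DIEM-A-style filename; raises ValueError on mismatch."""
    m = _PATTERN.match(name)
    if m is None:
        raise ValueError(f"unexpected filename: {name!r}")
    return RecordingID(
        m["country"], m["performer_id"], m["emotion"],
        int(m["scenario"]), m["intensity"],
    )

# Hypothetical example filename (field values are made up):
rec = parse_filename("Japan_12_Joy_2_High.bvh")
print(rec)
```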

Note. Japanese performers 1–5 were recorded with an OptiTrack system rather than the Vicon system used for all other performers. They require additional pre-processing for consistency and will be released later.

## License and access

Use of DIEM-A is governed by the DIEM-A User License Agreement (ULA). Key terms:

- **Academic, non-commercial use only:** scientific research, teaching, and classroom use. Commercial systems, military applications, and government public-space systems are explicitly prohibited.
- **PI signature required:** the User must be a PI at an academic or publicly funded research institute. Students and part-time lecturers may be listed as Additional Researchers under the PI.
- **No redistribution**, except for small portions used for clarification in academic publications or presentations.
- Access is granted only after the signed ULA has been received by RIEC.

To request access, submit the application form: https://forms.gle/4GVB9PXkYZ9zoMPL6.

## Citation

If you use any part of DIEM-A (motion capture data, scenario text data, or video recordings), please cite:

```bibtex
@inproceedings{cheng2025,
  title        = {Asian Emotional Body Movement Database: Diverse Intercultural E-Motion Database of Asian Performers (DIEM-A)},
  author       = {Cheng, Miao and Tseng, Chia-huei and Fujiwara, Ken and Schneider, Victor and Kitamura, Yoshifumi},
  booktitle    = {2025 13th International Conference on Affective Computing and Intelligent Interaction (ACII)},
  pages        = {1--8},
  year         = {2025},
  organization = {IEEE}
}
```

If you also use perception data from the preliminary evaluation, please additionally cite:

```bibtex
@article{cheng2024,
  title     = {Toward an Asian-based bodily movement database for emotional communication},
  author    = {Cheng, Miao and Tseng, Chia-huei and Fujiwara, Ken and Higashiyama, Shoi and Weng, Abby and Kitamura, Yoshifumi},
  journal   = {Behavior Research Methods},
  volume    = {57},
  number    = {1},
  pages     = {10},
  year      = {2024},
  publisher = {Springer}
}
```

## Acknowledgments

This work was supported by the New Energy and Industrial Technology Development Organization (NEDO [JPNP21004]). Ethical approval was granted by the Human Research Ethics Committees at Tohoku University and National Chung Cheng University.