croissant.json Field Preview
The Dataset Viewer's auto-converted preview of croissant.json, one field per entry (long values are truncated by the viewer):

  • @context (dict): { "@language": "en", "@vocab": "https://schema.org/", "citeAs": "cr:citeAs", "column": "cr:column", "conformsTo": "dct:conformsTo", "cr": "http://mlcommons.org/croissant/", "data": { "@id": "cr:data", "@type": "@json" }, "dataType": { "@id": "cr:dataType", "@type": "@vocab" }, "dct...
  • @type (string): sc:Dataset
  • @id (string): https://huggingface.co/datasets/mysigner/MySign
  • conformsTo (list): [ "http://mlcommons.org/croissant/1.0", "http://mlcommons.org/croissant/RAI/1.0" ]
  • name (string): MySign
  • alternateName (list): [ "mysigner/MySign", "MySign-2026" ]
  • description (string): MySign is a 3D motion-capture dataset of Bahasa Isyarat Malaysia (Malaysian Sign Language, BIM) for fine-grained sign language generation and recognition. It comprises 5,000 isolated-sign instances (5 Deaf native signers x 1,000 BIM Sign Bank glosses, fully balanced with no missing entries), captured at 200 Hz with a s...
  • citeAs (string): @misc{mysign2026, title = {MySign: A High-Fidelity Motion-Capture Dataset for 3D Sign Generation in Bahasa Isyarat Malaysia}, author = {{mysigner}}, year = {2026}, howpublished = {Hugging Face Datasets}, url = {https://huggingface.co/datasets/mysigner/MySign}, note = {C...
  • url (string): https://huggingface.co/datasets/mysigner/MySign
  • sameAs (string): https://huggingface.co/datasets/mysigner/MySign
  • license (string): https://creativecommons.org/licenses/by-nc-sa/4.0/
  • version (string): 1.0.0
  • datePublished (timestamp[s]): 2026-01-01T00:00:00
  • inLanguage (list): [ "ms", "zsm" ]
  • keywords (list): [ "sign-language", "Malaysian Sign Language", "Bahasa Isyarat Malaysia", "BIM", "motion-capture", "3D", "FBX", "skeletal-animation", "gloss" ]
  • creator (dict): { "@type": "Organization", "name": "mysigner", "url": "https://huggingface.co/mysigner" }
  • publisher (dict): { "@type": "Organization", "name": "mysigner", "url": "https://huggingface.co/mysigner" }
  • isLiveDataset (bool): false
  • rai:dataCollection (string): MySign is a 3D isolated-sign dataset for Bahasa Isyarat Malaysia (BIM), captured in a dedicated motion-capture laboratory. Five Deaf native BIM signers were each prompted, sign by sign, with 1,000 glosses drawn from the authorized BIM Sign Bank (drawn from the BIM Sign Bank developed by the Malaysian Federation of the ...
  • rai:dataCollectionType (list): [ "Manual Human Curated", "Sensor-recorded" ]
  • rai:dataCollectionMissingData (string): The dataset is fully balanced: every (signer, gloss) pair from the 5 x 1,000 grid has exactly one recording, with no missing entries. Coverage of BIM beyond the 1,000-gloss vocabulary is intentionally out of scope.
  • rai:dataCollectionRawData (string): The raw data are skeletal kinematics: 200 Hz OptiTrack marker trajectories (35 reflective markers per signer, ISB clinical placement) and time-aligned MANUS Prime 3 Data Glove finger streams. No video, audio, or photographic recordings of the signers are retained or released. The released .fbx files are the post-proces...
  • rai:dataCollectionTimeframe (string): 2025-02/2026-03
  • rai:dataImputationProtocol (string): No imputation. Optical calibration was verified at the start of every session (wand calibration residual below 1 mm with full marker visibility throughout the capture volume), and the gloves were calibrated per signer in MANUS Core. Takes that did not meet these conditions were re-recorded rather than imputed.
  • rai:dataPreprocessingProtocol (list): [ "OptiTrack Motive 3.0.3: marker trajectory cleaning and reconstruction of 3D joint positions and orientations from the six-camera optical stream.", "Rokoko Retarget plugin (v1.4.3) in Blender 4.5.2: retargeting of the reconstructed joint positions/orientations onto the canonical SMPL-X body model, producing a uni...
  • rai:dataManipulationProtocol (string): No data augmentation, mirroring, retiming, or motion editing has been applied. The only transformations are the deterministic processing steps in dataPreprocessingProtocol (Motive reconstruction -> SMPL-X retargeting -> ROM clamping). The split is signer-independent: four signers form the training set, one signer forms...
  • rai:dataAnnotationProtocol (string): Each instance is anchored at capture time to a specific entry in the authorized BIM Sign Bank: the prompt slide displayed both the target gloss label and the Sign Bank reference video, and the signer produced exactly one take of that sign. The gloss label of an instance is therefore fixed by construction at recording t...
  • rai:dataAnnotationPlatform (string): Capture-time labeling: pre-designed instruction slides displaying the BIM Sign Bank gloss and reference video, presented to the signer during recording. Post-hoc expert perceptual review: a custom interactive web interface that played the rendered 3D sign and collected the rater's perceived gloss and naturalness rating...
  • rai:dataAnnotationAnalysis (string): Inter-rater agreement on gloss recognition in the post-hoc expert review is substantial: Fleiss' kappa = 0.73, mean accuracy 86.3%, mean naturalness 3.87 +/- 0.71. The release-level gloss labels themselves are not subject to inter-rater disagreement, since they are fixed at capture time against a community-sanctioned B...
  • rai:annotationsPerItem (string): 1 community-sanctioned gloss label per instance (fixed at capture time by the BIM Sign Bank prompt). 3 independent expert ratings per instance for the post-hoc perceptual quality review (perceived gloss + naturalness), released as quality metadata only.
  • rai:annotatorDemographics (string): Capture-time gloss labels: assigned by construction via the BIM Sign Bank prompt; no separate annotator pool. Post-hoc perceptual review: three Deaf native signers with BIM domain expertise. The five capture participants themselves are Deaf native BIM signers (3 male, 2 female), aged 25-60, height 150-178 cm, represent...
  • rai:machineAnnotationTools (list): [ "OptiTrack Motive 3.0.3 (marker reconstruction)", "MANUS Core (per-signer glove calibration)", "Rokoko Retarget plugin v1.4.3 in Blender 4.5.2 (SMPL-X retargeting)", "Custom Python pipeline for per-joint anatomical ROM clamping on SMPL-X parameters", "huggingface_hub (file listing for metadata.csv generatio...
  • rai:dataBiases (string): Known biases of MySign: (i) Signer cohort: only five signers are recorded, so signer-specific motion idiosyncrasies will dominate any model trained on it; the cohort is balanced across four Malaysian ethnic backgrounds and both genders, but is too small to support strong claims about variation across the BIM-using popu...
  • rai:dataUseCases (list): [ "Training and evaluating fine-grained sign language generation models for BIM in a SMPL-X-compatible parametric motion representation.", "Training and evaluating isolated-sign recognition for BIM under a signer-independent train/test split.", "Driving 3D avatars for BIM sign synthesis, education, and accessibil...
  • rai:dataLimitations (string): MySign captures isolated signs only: it does not contain continuous signing, sentences, conversations, classifier constructions, or non-manual markers (facial expression, mouthing, eye gaze) beyond what the SMPL-X skeleton captures. The vocabulary is fixed at 1,000 BIM Sign Bank glosses and is not corpus-driven. Five s...
  • rai:dataSocialImpact (string): Intended impact: Bahasa Isyarat Malaysia is under-resourced relative to languages such as ASL, and a freely available 3D motion-capture dataset of native-signer BIM lowers the cost of building accessibility tools (educational apps, avatar-based interpreters, recognition and generation systems) for the Malaysian Deaf co...
  • rai:personalSensitiveInformation (string): Released files contain only skeletal motion: SMPL-X joint rotations and translations derived from OptiTrack marker trajectories and MANUS glove streams. No images, video, audio, or facial textures of the signers are released. No name, address, contact information, or other direct personal identifier is included. Signer...
  • rai:dataReleaseMaintenancePlan (string): The dataset is hosted on the Hugging Face Hub at https://huggingface.co/datasets/mysigner/MySign. The git history of that repository is the canonical version log; metadata.csv, croissant.json, and README.md are versioned alongside the data. Errata, corrections, and clarifications will be applied in-place with descripti...
  • rai:sourceDatasets (list): [ { "@type": "sc:Dataset", "name": "BIM Sign Bank", "url": "https://www.bimsignbank.org/home", "license": "Non-commercial research use only; no Sign Bank video, image, or audio is redistributed in MySign", "publisher": "Malaysian Federation of the Deaf (MFD) and Guidewire Gives Back", "publish...
  • rai:provenance (list): [ { "@type": "rai:Activity", "name": "Vocabulary selection", "description": "1,000 BIM glosses drawn from the authorized BIM Sign Bank, selected by the research team under the guidance of a Deaf supervisor to cover natural and commonly used BIM across nine main categories (conversation, culture, daily-lif...
  • distribution (list): [ { "@type": "cr:FileObject", "@id": "repo", "name": "repo", "description": "Hugging Face dataset repository serving as the root container for all distribution files.", "contentUrl": "https://huggingface.co/datasets/mysigner/MySign", "encodingFormat": "git+https", "sha256": "main", "co...
  • recordSet (list): [ { "@type": "cr:RecordSet", "@id": "signs", "name": "signs", "description": "One record per .fbx file. The file_name column is the relative path inside the repository, e.g. 'Signer001/Above.fbx'. The .fbx files themselves are not formally declared as a Croissant FileSet (FBX has no native Croissant s...
  • rai:hasSyntheticData (bool): false
  • prov:wasDerivedFrom (list): [ { "@id": "https://www.bimsignbank.org/home", "prov:label": "BIM Sign Bank", "sc:license": "Non-commercial research use only; no Sign Bank video, image, or audio is redistributed in MySign", "prov:wasAttributedTo": { "@id": "https://www.mymfdeaf.org/", "prov:label": "Malaysian Federation ...
  • prov:wasGeneratedBy (list): [ { "@type": "prov:Activity", "prov:type": { "@id": "https://www.wikidata.org/wiki/Q4929239" }, "prov:label": "Vocabulary selection", "sc:description": "Selected 1,000 BIM glosses from the BIM Sign Bank (developed by the Malaysian Federation of the Deaf and Guidewire Gives Back) under non-co...

MySign

A 3D motion-capture dataset of Malaysian Sign Language (Bahasa Isyarat Malaysia, BIM), distributed as Filmbox (.fbx) skeletal animation files.

Note: this dataset is not loadable via datasets.load_dataset(...) as a tabular split, because the data itself is binary .fbx. metadata.csv is an index over the FBX files; consumers must read the file at file_name to access the motion data.

Dataset Summary

  • Modality: 3D skeletal motion capture (.fbx)
  • Language: Malaysian Sign Language (BIM) — ISO ms / zsm
  • Signers: 5 (Signer001–Signer005)
  • Sample: one .fbx per (signer, gloss, take)
  • License: CC BY-NC-SA 4.0
  • Source repository: https://huggingface.co/datasets/mysigner/MySign

Repository Structure

MySign/
├── Signer001/
│   ├── Above.fbx
│   ├── Apologize.fbx
│   ├── Apologize.001.fbx
│   └── ...
├── Signer002/
├── Signer003/
├── Signer004/
├── Signer005/
├── metadata.csv      # index over all .fbx files
├── croissant.json    # Croissant 1.0 ML-dataset metadata (JSON-LD)
├── README.md
└── .gitattributes

metadata.csv Schema

metadata.csv is a UTF-8 CSV with a header row. Each row corresponds to exactly one .fbx file in the repository.

  • file_name (string): Repository-relative path to the .fbx, e.g. Signer001/Above.fbx.
  • gloss (string): Normalized gloss label (UPPERCASE). See Gloss Normalization.
  • signer_id (string): One of Signer001–Signer005.
  • take (integer): Take number for the (signer_id, gloss) pair. The original recording is 1; Blender-style duplicate suffixes .001, .002 map to takes 2, 3.

Example rows

file_name,gloss,signer_id,take
Signer001/Above.fbx,ABOVE,Signer001,1
Signer001/Apologize.fbx,APOLOGIZE,Signer001,1
Signer001/Apologize.001.fbx,APOLOGIZE,Signer001,2
"Signer002/Actor,_Actress.fbx",ACTOR / ACTRESS,Signer002,1
Signer003/1-hr.fbx,1 HOUR,Signer003,1
Signer004/Less (I).fbx,LESS(I),Signer004,1

Path quoting: some file_name values contain commas (e.g. Signer002/Actor,_Actress.fbx). The CSV is written with Python's default csv.writer, which quotes such fields automatically. Read it with any standard CSV parser (pandas, csv.DictReader).
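The quoting behaviour is easy to verify locally. A minimal round-trip sketch with the standard library, using rows shaped like the examples above:

```python
import csv
import io

# Rows shaped like the examples above; csv.writer (default QUOTE_MINIMAL)
# quotes any field that contains a comma.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["file_name", "gloss", "signer_id", "take"])
writer.writerow(["Signer001/Above.fbx", "ABOVE", "Signer001", 1])
writer.writerow(["Signer002/Actor,_Actress.fbx", "ACTOR / ACTRESS", "Signer002", 1])

buf.seek(0)
rows = list(csv.DictReader(buf))
print(rows[1]["file_name"])  # Signer002/Actor,_Actress.fbx
print(rows[1]["gloss"])      # ACTOR / ACTRESS
```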

Gloss Normalization

The gloss column is derived from each filename and normalized so that the same sign performed by different signers gets the same label. The normalization, applied by generate_metadata_remote.py, is:

  1. Strip Blender suffix .001 / .002 (recorded into take).
  2. Replace _ and - with spaces.
  3. Treat , and ; as synonym separators; the pieces are rejoined with " / " (e.g. Actor,_Actress → ACTOR / ACTRESS).
  4. Uppercase.
  5. Expand time-unit abbreviations: HR→HOUR, MIN→MINUTE, MTH→MONTH, WK→WEEK, YR→YEAR, SEC→SECOND (and plurals).
  6. Normalize whitespace around parentheses: WORD (X) → WORD(X).
  7. Strip unbalanced trailing ).
  8. Plural→singular merge: when the corpus contains both forms of a word (e.g. COURSES and COURSE), the plural is rewritten to the singular. A small block-list keeps semantically distinct plurals (NEWS, SHORTS, MATHEMATICS) as plurals.
  9. A small hand-curated override map (currently: COCHLEAR / IMPLANT → COCHLEAR IMPLANT) fixes cases that no general rule can fix safely.

A full audit of the plural→singular merges is printed by the script every time it runs; check that list before publishing.
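generate_metadata_remote.py is not reproduced in this README, so purely as an illustration, steps 1–4 can be sketched as follows (abbreviation expansion, parenthesis handling, plural merges, and the override map are omitted):

```python
import re

def normalize_gloss(file_name: str) -> tuple[str, int]:
    """Illustrative sketch of normalization steps 1-4 only.

    Returns (gloss, take). The real script additionally expands
    abbreviations, normalizes parentheses, merges plurals, and
    applies the hand-curated override map.
    """
    stem = file_name.rsplit("/", 1)[-1].removesuffix(".fbx")
    # Step 1: strip the Blender duplicate suffix .001/.002 -> take 2/3.
    take = 1
    m = re.search(r"\.(\d{3})$", stem)
    if m:
        take = int(m.group(1)) + 1
        stem = stem[: m.start()]
    # Step 3: commas/semicolons separate synonyms, rejoined with " / ".
    parts = re.split(r"[,;]", stem)
    # Step 2: underscores and hyphens become spaces.
    parts = [re.sub(r"[_-]", " ", p).strip() for p in parts]
    # Step 4: uppercase.
    return " / ".join(p for p in parts if p).upper(), take

print(normalize_gloss("Signer001/Apologize.001.fbx"))   # ('APOLOGIZE', 2)
print(normalize_gloss("Signer002/Actor,_Actress.fbx"))  # ('ACTOR / ACTRESS', 1)
```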

.fbx Files

  • Format: Autodesk Filmbox (binary .fbx).
  • One file per take.
  • The file_name column of metadata.csv is the canonical pointer to each file. The same path can be resolved as https://huggingface.co/datasets/mysigner/MySign/resolve/main/<file_name>.
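Because file names may contain spaces, commas, and parentheses, percent-encode the path when building raw resolve URLs. A small sketch (resolve_url is a hypothetical helper, not part of the dataset):

```python
from urllib.parse import quote

BASE = "https://huggingface.co/datasets/mysigner/MySign/resolve/main/"

def resolve_url(file_name: str) -> str:
    # Keep "/" as the path separator; everything else outside the
    # unreserved set (spaces, commas, parentheses) is percent-encoded.
    return BASE + quote(file_name, safe="/")

print(resolve_url("Signer004/Less (I).fbx"))
# .../resolve/main/Signer004/Less%20%28I%29.fbx
```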

Croissant Metadata

A Croissant 1.0 (JSON-LD) description is provided as croissant.json. It declares:

  • The repository as a cr:FileObject (encodingFormat: git+https).
  • metadata.csv as a cr:FileObject of type text/csv.
  • All Signer*/*.fbx files as a cr:FileSet named fbx-files (encodingFormat: model/vnd.fbx).
  • A cr:RecordSet named signs exposing the four columns above. The file_name field carries references: fbx-files, which tells Croissant consumers the value is a path into the FBX FileSet.

Limitation: Croissant 1.0 has no native semantics for FBX. mlcroissant will treat every FBX as an opaque binary resource — it will not parse skeletons, animation curves, or any FBX-internal structure. Consumers must use a real FBX library (e.g. Autodesk FBX SDK, Blender, pyfbx) to read the motion data.

Usage

List records via metadata.csv

import pandas as pd

df = pd.read_csv(
    "https://huggingface.co/datasets/mysigner/MySign/resolve/main/metadata.csv"
)
print(df.head())
print(df["signer_id"].value_counts().sort_index())
print(df["gloss"].nunique(), "unique glosses")
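Since the card states the 5 x 1,000 grid is fully balanced, the index can also be audited offline. A stdlib-only sketch over toy rows standing in for the real metadata.csv:

```python
import csv
import io
from collections import Counter

# Toy stand-in for metadata.csv; the real file covers 5 signers x 1,000 glosses.
toy_csv = """file_name,gloss,signer_id,take
Signer001/Above.fbx,ABOVE,Signer001,1
Signer001/Apologize.fbx,APOLOGIZE,Signer001,1
Signer001/Apologize.001.fbx,APOLOGIZE,Signer001,2
Signer002/Above.fbx,ABOVE,Signer002,1
Signer002/Apologize.fbx,APOLOGIZE,Signer002,1
"""

rows = list(csv.DictReader(io.StringIO(toy_csv)))
# Count primary recordings (take == 1) per (signer, gloss) cell;
# a balanced grid has exactly one primary take in every cell.
grid = Counter((r["signer_id"], r["gloss"]) for r in rows if r["take"] == "1")
irregular = [cell for cell, n in grid.items() if n != 1]
print(len(grid), "cells,", len(irregular), "irregular")  # 4 cells, 0 irregular
```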

Download a specific FBX

from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="mysigner/MySign",
    filename="Signer001/Above.fbx",
    repo_type="dataset",
)
print(local_path)  # local cache path; open with your FBX library

Inspect via Croissant (mlcroissant)

import mlcroissant as mlc

ds = mlc.Dataset(
    "https://huggingface.co/datasets/mysigner/MySign/resolve/main/croissant.json"
)
print(ds.metadata.name, "—", ds.metadata.description[:80])

for i, rec in enumerate(ds.records(record_set="signs")):
    print(rec)
    if i >= 4:
        break

ds.records("signs") yields one Python dict per .fbx, with file_name, gloss, signer_id, take. The FBX bytes are not auto-loaded — read them yourself from file_name.

License

This dataset is released under Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0). You may share and adapt the work for non-commercial purposes, provided you give appropriate credit and distribute derivative works under the same license.

Citation

@misc{mysign2026,
  title        = {MySign: A High-Fidelity Motion-Capture Dataset for 3D Sign Generation in Bahasa Isyarat Malaysia},
  author       = {{mysigner}},
  year         = {2026},
  howpublished = {Hugging Face Datasets},
  url          = {https://huggingface.co/datasets/mysigner/MySign},
  note         = {CC BY-NC-SA 4.0}
}

(Replace with the canonical citation when a paper or technical report becomes available.)

Limitations

  • Not auto-loadable: Hugging Face's Dataset Viewer cannot render .fbx. The Viewer will not work for this dataset, and datasets.load_dataset("mysigner/MySign") will not produce a usable split. Use the workflows above instead.
  • Gloss is filename-derived: glosses come from filenames written by signers/annotators with slightly different conventions, then normalized. Some collisions may still exist; run the script's plural-merge audit before relying on a particular gloss inventory.
  • Five signers: signer-conditioned models trained on MySign will have limited speaker coverage.
  • Take semantics are best-effort: the take column distinguishes Blender-duplicated files (.001, .002) from the original recording, but it does not encode whether a take was a clean recording or a retry.
  • No segmentation, no transcription: each FBX is one isolated sign. The dataset does not include continuous-signing video, glossed sentences, or non-manual annotations.