Commit 8f3b67b (verified) by hreyulog · Parent: 922af02

Update README.md

Files changed (1): README.md (+48 −0)
README.md CHANGED
@@ -47,3 +47,51 @@ configs:
   - split: test
     path: data/test-*
 ---

# ArkTS Function Dataset

**Dataset Hub:** [hreyulog/arkts-code-docstring](https://huggingface.co/datasets/hreyulog/arkts-code-docstring/)

This dataset collects **function-level information from ArkTS (HarmonyOS Ark TypeScript) projects**, including original functions, docstrings, abstract syntax tree (AST) representations, obfuscated versions, and source code metadata. It is suitable for tasks such as code analysis, code understanding, AST research, and code search.
## Dataset Structure

The dataset contains three splits:

- `train`: Training set
- `validation`: Validation set
- `test`: Test set

Each split is a **JSON Lines (`.jsonl`) file**, where each line is a JSON object representing a single function.
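Because every line is a standalone JSON object, a split can also be consumed without the `datasets` library. A minimal sketch using only the standard library — the two records here are invented stand-ins carrying just a couple of the real fields:

```python
import io
import json

# Stand-in for an open split file such as data/test-*.jsonl;
# both records are invented and hold only a subset of the schema.
split_file = io.StringIO(
    '{"identifier": "add", "language": "ArkTS"}\n'
    '{"identifier": "sub", "language": "ArkTS"}\n'
)

# One json.loads call per line yields one function record per line.
records = [json.loads(line) for line in split_file]
print([r["identifier"] for r in records])  # → ['add', 'sub']
```

The same loop works unchanged on a real split file opened with `open(...)`.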
## Features / Columns

| Field | Type | Description |
|-------|------|-------------|
| `nwo` | string | Repository name |
| `sha` | string | Commit SHA |
| `path` | string | File path |
| `language` | string | Programming language |
| `identifier` | string | Function identifier / name |
| `docstring` | string | Function docstring |
| `function` | string | Original function source code |
| `ast_function` | string | AST representation of the function |
| `obf_function` | string | Obfuscated function source code |
| `url` | string | URL to the code in the repository |
| `function_sha` | string | Function-level SHA |
| `source` | string | Code source (GitHub / Gitee) |

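For illustration, a hypothetical record covering every column in the table above — all values below are invented, including the repository name, paths, and hashes:

```python
# Hypothetical record; every value is invented for illustration only.
record = {
    "nwo": "example/arkts-repo",
    "sha": "0" * 40,
    "path": "entry/src/main/ets/utils/math.ets",
    "language": "ArkTS",
    "identifier": "add",
    "docstring": "Adds two numbers and returns the sum.",
    "function": "function add(a: number, b: number): number { return a + b; }",
    "ast_function": "(function_declaration name: add ...)",
    "obf_function": "function a(b: number, c: number): number { return b + c; }",
    "url": "https://github.com/example/arkts-repo/blob/main/entry/src/main/ets/utils/math.ets",
    "function_sha": "f" * 40,
    "source": "GitHub",
}

# The key set matches the twelve documented columns.
print(len(record))  # → 12
```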
## Usage

```python
from datasets import load_dataset

# Load the dataset from the Hugging Face Hub
dataset = load_dataset("hreyulog/arkts-code-docstring")

# Inspect the first training example
print(dataset["train"][0])

# Check dataset features
print(dataset["train"].features)
```
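Per-field processing is then straightforward. A small sketch that tallies records by the `source` column, using an invented in-memory list as a stand-in for `dataset["train"]`:

```python
from collections import Counter

# Invented stand-in rows for dataset["train"]; real rows carry the full schema.
rows = [
    {"identifier": "add", "source": "GitHub"},
    {"identifier": "sub", "source": "Gitee"},
    {"identifier": "mul", "source": "GitHub"},
]

# Count how many functions were mined from each code host.
per_source = Counter(row["source"] for row in rows)
print(per_source)
```

With a real split, the same `Counter` line works over `dataset["train"]` directly, since each row behaves like a dict.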