dineth554 committed
Commit d9ab257 · verified · 1 Parent(s): 73fb0f9

Upload README.yaml with huggingface_hub

Files changed (1):
  1. README.yaml +132 -0
README.yaml ADDED
@@ -0,0 +1,132 @@
+ ---
+ # Model Card for Legion Coder 8M
+ # YAML Front Matter for Hugging Face Hub
+
+ base_model: dineth554/legion-coder-8m
+ library_name: transformers
+ license: mit
+ pipeline_tag: text-generation
+ language:
+ - en
+ - code
+ tags:
+ - transformers
+ - pytorch
+ - safetensors
+ - text-generation
+ - code-generation
+ - python
+ - javascript
+ - coding
+ - programming
+ - sagemaker
+ - amazon-sagemaker
+ - cpu
+ - compact
+ - efficient
+ - nvdya-kit
+ - death-legion
+
+ datasets:
+ - the-stack-v2
+
+ metrics:
+ - perplexity
+ - accuracy
+
+ model-index:
+ - name: Legion Coder 8M
+   results: []
+
+ inference:
+   parameters:
+     temperature: 0.8
+     top_p: 0.95
+     top_k: 50
+     max_new_tokens: 200
+
+ sagemaker:
+   sdk_version: "2.200.0"
+   instance_type: "ml.m5.large"
+   instance_count: 1
+   container_image: "huggingface-pytorch-inference:2.0.0-transformers4.28.1-cpu-py310-ubuntu20.04-v1.0"
+
+ # Model Details
+ model_details:
+   name: Legion Coder 8M
+   version: 1.0.0
+   description: A compact yet powerful 44M parameter transformer model optimized for coding tasks
+   developer: DEATH LEGION
+   powered_by: nvdya-kit
+   architecture: GPT-style Transformer
+   parameters: 44,341,632
+   model_size: 170MB
+   hidden_size: 576
+   num_layers: 13
+   num_heads: 16
+   context_length: 1024
+   vocabulary_size: 16000
+   format: Safetensors
+   precision: float32
+
+ # Training Details
+ training_details:
+   optimizer: AdamW
+   learning_rate: 5e-4
+   lr_schedule: cosine_decay
+   batch_size: 4
+   gradient_accumulation: true
+   training_steps: 10000
+   precision: float32
+
+ # Intended Use
+ intended_use:
+   primary_use_cases:
+   - Code completion and generation
+   - Function generation from descriptions
+   - Debugging assistance
+   - Code explanation and documentation
+   - Programming concept explanations
+   - Code scaffolding and prototyping
+   target_users:
+   - Software developers
+   - Students learning to code
+   - Data scientists
+   - DevOps engineers
+   - Technical writers
+
+ # Limitations
+ limitations:
+ - Limited to 1,024 token context window
+ - Trained primarily on Python code
+ - May generate code that requires review before production use
+ - Not suitable for non-coding tasks
+
+ # Ethical Considerations
+ ethical_considerations:
+ - Generated code should be reviewed before deployment
+ - May reproduce patterns from training data
+ - Not a replacement for human code review
+ - Users are responsible for compliance with licenses of generated code
+
+ # Citation
+ citation: |
+   @misc{legioncoder2024,
+     title={Legion Coder 8M: A Compact Transformer for Code Generation},
+     author={DEATH LEGION},
+     year={2024},
+     howpublished={\url{https://huggingface.co/dineth554/legion-coder-8m}}
+   }
+
+ # Contact
+ contact:
+   developer: DEATH LEGION
+   powered_by: nvdya-kit
+   repository: https://huggingface.co/dineth554/legion-coder-8m
+
+ # Branding
+ branding:
+   tagline: MADE WITH BY DEATH LEGION
+   powered_by: nvdya-kit
+   copyright: © 2024 DEATH LEGION. All rights reserved.
+ ---
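
The `inference.parameters` block in the card maps directly onto `transformers` generation arguments. A minimal usage sketch, assuming the model id from the card is live on the Hub; `do_sample=True` is an addition not in the card, but the sampling knobs (`temperature`, `top_p`, `top_k`) have no effect without it. The download happens inside the function, so nothing runs until you call it:

```python
# Generation defaults copied from the card's `inference.parameters` block.
GENERATION_KWARGS = {
    "temperature": 0.8,
    "top_p": 0.95,
    "top_k": 50,
    "max_new_tokens": 200,
    "do_sample": True,  # assumption: required for the sampling parameters above to apply
}


def complete(prompt: str) -> str:
    """Generate a code completion with the card's defaults (downloads the model)."""
    from transformers import pipeline  # deferred: needs `pip install transformers`

    pipe = pipeline("text-generation", model="dineth554/legion-coder-8m")
    return pipe(prompt, **GENERATION_KWARGS)[0]["generated_text"]
```

For example, `complete("def fibonacci(n):")` would return the prompt plus up to 200 sampled continuation tokens.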
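
Likewise, the `sagemaker` block pins a deployment shape. A hedged sketch using the SageMaker Python SDK's `HuggingFaceModel`: the `role_arn` parameter and the `HF_TASK` environment variable are assumptions not stated in the card, while the framework versions are read off the container image name it lists (`2.0.0-transformers4.28.1-cpu-py310`). The call only does anything against a real AWS account, so it is wrapped in a function:

```python
# Deployment settings copied from the card's `sagemaker` block.
INSTANCE_TYPE = "ml.m5.large"
INSTANCE_COUNT = 1


def deploy(role_arn: str):
    """Deploy the model to a SageMaker endpoint (requires AWS credentials)."""
    from sagemaker.huggingface import HuggingFaceModel  # pip install "sagemaker>=2.200.0"

    model = HuggingFaceModel(
        env={"HF_MODEL_ID": "dineth554/legion-coder-8m",  # pulled from the Hub at startup
             "HF_TASK": "text-generation"},               # assumption: matches pipeline_tag
        role=role_arn,                   # assumption: an existing SageMaker execution role
        transformers_version="4.28.1",   # versions taken from the card's container image
        pytorch_version="2.0.0",
        py_version="py310",
    )
    return model.deploy(
        initial_instance_count=INSTANCE_COUNT,
        instance_type=INSTANCE_TYPE,
    )
```

`ml.m5.large` is a CPU instance, consistent with the card's `cpu` tag and the CPU inference container it names.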