---
license: mit
datasets:
  - theblackcat102/evol-codealpaca-v1
language:
  - en
pipeline_tag: text-generation
tags:
  - code
---

# Lizard

A lightweight 17M-parameter coding model.

## Model Details

### Model Description

- **Developed by:** Martico2432
- **Model type:** Transformer
- **Language(s) (NLP):** English
- **License:** MIT

## Uses

- Experimentation with small models

### Direct Use

- Trying to get the model to generate coherent code

### Out-of-Scope Use

This model is far too small to be useful for any malicious purpose.

## Bias, Risks, and Limitations

17M parameters is very small, which significantly limits the model's capabilities and the range of tasks it can handle.

### Recommendations

Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model.

## How to Get Started with the Model

Use the code in this model's repository to get started.
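If the checkpoint is compatible with the Hugging Face `transformers` text-generation API (an assumption; the authoritative loading code is the one in the repository), usage might look roughly like this sketch. The repo id `Martico2432/Lizard-17m` is inferred and may differ:

```python
def generate_code(prompt: str,
                  repo_id: str = "Martico2432/Lizard-17m",
                  max_new_tokens: int = 128) -> str:
    """Sketch: load the checkpoint from the Hub and complete a prompt.

    Assumes (unverified) that the checkpoint loads with transformers'
    AutoModelForCausalLM; if not, fall back to the repo's own code.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(repo_id)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

For example, `generate_code("Write a Python function that reverses a string.")` would return the model's completion as a plain string.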

## Training Details

### Training Data

This model was trained on the theblackcat102/evol-codealpaca-v1 dataset.
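The dataset is public on the Hub, so it can be inspected directly with the `datasets` library (a minimal sketch; the exact column schema should be checked against the dataset card):

```python
def load_training_data(split: str = "train"):
    """Sketch: fetch the training corpus from the Hugging Face Hub."""
    from datasets import load_dataset

    return load_dataset("theblackcat102/evol-codealpaca-v1", split=split)
```

Calling `load_training_data()` downloads the split on first use and caches it locally.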

### Training Procedure

#### Training Hyperparameters

- **Training regime:** fp32

## Technical Specifications

### Hardware

- Any device with 8 GB of memory should be able to run it
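The 8 GB figure is generous: back-of-the-envelope, 17M fp32 parameters occupy only about 65 MB of weights, so most of that headroom is for the framework and activations. A quick check:

```python
# Rough weight-memory estimate for a 17M-parameter fp32 model.
params = 17_000_000
bytes_per_param = 4  # fp32 = 4 bytes per parameter
weights_mb = params * bytes_per_param / (1024 ** 2)
print(f"~{weights_mb:.0f} MB of weights")  # ~65 MB
```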

### Software

- PyTorch