Create README.md (#2), opened by abdullashahidm
## Model description

Meta AI's BART is a sequence-to-sequence model trained to reconstruct corrupted text, and `bart-tiny-random` is to BART what Lego bricks are to real bricks: a tiny, randomly initialized placeholder with the same architecture and interfaces. It exists to help developers test programs that will eventually use real, pretrained BART checkpoints.

- **Architecture:** BART
- **Status:** Randomly initialized (untrained)
## Intended uses & limitations

### Uses

i. Test and debug ML pipelines
ii. Run unit tests and continuous integration checks
iii. Verify that code works as intended without loading a full-size model, saving system resources and time
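The testing pattern behind these uses can be sketched without `transformers` at all: a stand-in model exposes the right interface but returns random output, so a pipeline's plumbing can be unit-tested cheaply. Everything below (`TinyRandomModel`, `summarize`) is a hypothetical illustration, not part of any library.

```python
import random


class TinyRandomModel:
    """Stand-in for a real seq2seq model: valid shapes, random content.

    This mirrors the role bart-tiny-random plays: the interfaces work,
    the outputs are meaningless. (Hypothetical class for illustration.)
    """

    def __init__(self, vocab_size=100, seed=0):
        self.vocab_size = vocab_size
        self.rng = random.Random(seed)

    def generate(self, input_ids, max_length=8):
        # Length and id range are valid; the token ids are random.
        return [self.rng.randrange(self.vocab_size) for _ in range(max_length)]


def summarize(model, input_ids):
    # Hypothetical pipeline step under test: it must return a
    # non-empty list of in-vocabulary ids, whatever the model emits.
    out = model.generate(input_ids)
    assert out, "pipeline produced no tokens"
    return out


# Unit-test-style check: verify the plumbing, not output quality.
model = TinyRandomModel()
tokens = summarize(model, input_ids=[1, 2, 3])
assert len(tokens) == 8
assert all(0 <= t < model.vocab_size for t in tokens)
print("pipeline check passed")
```

The assertions check only structural properties (length, id range), which is exactly what a CI job can meaningfully test against a random model.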
### Limitations

i. Outputs are random token sequences and should not be interpreted as meaningful
ii. Not suitable for any real-world natural language processing task
## How to use

Use it the same way as any other BART model, through the Hugging Face `transformers` library. For instance:
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("sshleifer/bart-tiny-random")
model = AutoModelForSeq2SeqLM.from_pretrained("sshleifer/bart-tiny-random")

inputs = tokenizer("Translate this: Hello world", return_tensors="pt")
outputs = model.generate(**inputs)
print(tokenizer.decode(outputs[0]))
# The output is a random string of tokens, not a real translation
```
## Training data

The weights are randomly initialized and have never been trained on any dataset, so there is no training data.
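What "randomly initialized" means can be sketched without any ML library: each weight is simply a draw from a normal distribution that no training step ever updates. The 0.02 standard deviation below mirrors the default in BART-family configs; the helper itself is purely illustrative.

```python
import random


def random_weights(rows, cols, std=0.02, seed=42):
    # Draw each weight from N(0, std). A randomly initialized model
    # stops here; a trained model would go on to update these values
    # against data. (Illustrative helper, not library code.)
    rng = random.Random(seed)
    return [[rng.gauss(0.0, std) for _ in range(cols)] for _ in range(rows)]


w = random_weights(2, 3)
# w is a 2x3 matrix of small random values carrying no learned signal
```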
## Training procedure

Not applicable
## Variables and metrics

Not applicable
## Evaluation results

Not applicable