
Quantization made by Richard Erkhov.


bloom-1b4-zh - bnb 4bits

Original model description:

license: bigscience-bloom-rail-1.0
language:
  • zh
pipeline_tag: text-generation
widget:
  • text: "中国的首都是"
This model is based on bigscience/bloom-1b7.

We pruned its vocabulary from 250,880 to 46,145 tokens using a Chinese corpus to reduce GPU memory usage, bringing the total parameter count down to 1.4B.
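The savings come almost entirely from the embedding matrix. A minimal sketch of the arithmetic, assuming the base model's hidden size of 2048 (taken from bigscience/bloom-1b7's config; the vocabulary sizes are from the description above):

```python
# Rough estimate of parameters removed by vocabulary pruning.
# Assumption: hidden_size = 2048, as in bigscience/bloom-1b7.
# BLOOM ties input and output embeddings, so each removed row counts once.
original_vocab = 250880
pruned_vocab = 46145
hidden_size = 2048  # assumed from the base model config

removed_rows = original_vocab - pruned_vocab          # 204,735 embedding rows
saved_params = removed_rows * hidden_size             # parameters removed
print(f"{saved_params / 1e9:.2f}B parameters saved")  # ≈ 0.42B
```

This is why pruning the vocabulary alone is enough to shrink a 1.7B model to roughly 1.4B parameters.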

How to use

from transformers import BloomTokenizerFast, BloomForCausalLM

# Load the pruned-vocabulary tokenizer and model from the Hugging Face Hub
tokenizer = BloomTokenizerFast.from_pretrained('Langboat/bloom-1b4-zh')
model = BloomForCausalLM.from_pretrained('Langboat/bloom-1b4-zh')

# Generate a continuation of "中国的首都是" ("The capital of China is")
print(tokenizer.batch_decode(model.generate(tokenizer.encode('中国的首都是', return_tensors='pt'))))
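Since this repository holds a bitsandbytes 4-bit quantization, the same model can also be loaded in 4-bit form. A hedged sketch using on-the-fly quantization from the original checkpoint (requires the bitsandbytes package and a CUDA GPU; the exact compute dtype is a choice, not mandated by the card):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Quantize the fp16/fp32 weights to 4-bit at load time.
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,                     # pack linear-layer weights to 4-bit
    bnb_4bit_compute_dtype=torch.float16,  # run matmuls in fp16 (an assumption)
)

tokenizer = AutoTokenizer.from_pretrained('Langboat/bloom-1b4-zh')
model = AutoModelForCausalLM.from_pretrained(
    'Langboat/bloom-1b4-zh',
    quantization_config=quant_config,
    device_map='auto',  # place layers on available GPU(s)
)
```

Generation then works exactly as in the snippet above, with roughly a quarter of the weight memory.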
Safetensors model size: 1B params · tensor types: F32, F16, U8
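As a rule of thumb, 4-bit weights take half a byte per parameter, so the quantized model is far smaller than an fp16 copy. A back-of-the-envelope sketch (the 1.4B figure comes from the model description above; quantization overhead such as scale factors is ignored):

```python
params = 1.4e9  # total parameters, per the model description

fp16_gb = params * 2 / 1e9    # fp16: 2 bytes per parameter
int4_gb = params * 0.5 / 1e9  # 4-bit: 0.5 bytes per parameter
print(f"fp16: {fp16_gb:.1f} GB, 4-bit: {int4_gb:.1f} GB")  # fp16: 2.8 GB, 4-bit: 0.7 GB
```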