
BERT

BERT is a bidirectional transformer pretrained on unlabeled text to predict masked tokens in a sentence, and to predict whether one sentence follows another. The main idea is that by randomly masking some tokens during pretraining, the model learns to use both the left and right context to predict them, which yields a deeper, more complete understanding of the text. BERT is also highly general: the language representations it learns can be fine-tuned with additional layers or heads to adapt to other downstream NLP tasks.
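The random-masking idea can be sketched in a few lines. This is a minimal, illustrative implementation of BERT-style token masking (`mask_tokens` is a hypothetical helper written for this example, not a transformers API); it follows the recipe from the original BERT paper, where of the selected positions 80% become the mask token, 10% become a random token, and 10% are left unchanged.

```python
import torch

def mask_tokens(input_ids, mask_token_id, vocab_size, mlm_probability=0.15):
    """Randomly mask tokens for masked language modeling (BERT-style).

    Returns the corrupted input ids and the labels, where positions that
    are not selected for prediction are set to -100 so the loss ignores them.
    """
    labels = input_ids.clone()

    # Sample which positions the model must predict (15% by default)
    probability_matrix = torch.full(labels.shape, mlm_probability)
    masked_indices = torch.bernoulli(probability_matrix).bool()
    labels[~masked_indices] = -100  # only compute loss on selected positions

    # 80% of selected positions: replace with the [MASK] token
    indices_replaced = (
        torch.bernoulli(torch.full(labels.shape, 0.8)).bool() & masked_indices
    )
    input_ids[indices_replaced] = mask_token_id

    # 10% of selected positions: replace with a random token
    indices_random = (
        torch.bernoulli(torch.full(labels.shape, 0.5)).bool()
        & masked_indices
        & ~indices_replaced
    )
    random_tokens = torch.randint(vocab_size, labels.shape, dtype=torch.long)
    input_ids[indices_random] = random_tokens[indices_random]

    # The remaining 10% of selected positions keep their original token
    return input_ids, labels
```

The `-100` label convention matches what PyTorch's cross-entropy loss (and the transformers MLM heads) treat as an ignored index.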

You can find all of the original BERT checkpoints under the BERT collection.

Click on the BERT models in the right sidebar for more examples of applying BERT to different language tasks.

The examples below demonstrate how to predict the [MASK] token with [Pipeline], [AutoModel], and from the command line.

import torch
from transformers import pipeline

pipeline = pipeline(
    task="fill-mask",
    model="google-bert/bert-base-uncased",
    dtype=torch.float16,
    device=0
)
pipeline("Plants create [MASK] through a process known as photosynthesis.")

import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "google-bert/bert-base-uncased",
)
model = AutoModelForMaskedLM.from_pretrained(
    "google-bert/bert-base-uncased",
    dtype=torch.float16,
    device_map="auto",
    attn_implementation="sdpa"
)
inputs = tokenizer("Plants create [MASK] through a process known as photosynthesis.", return_tensors="pt").to("cuda")

with torch.no_grad():
    outputs = model(**inputs)
    predictions = outputs.logits

masked_index = torch.where(inputs['input_ids'] == tokenizer.mask_token_id)[1]
predicted_token_id = predictions[0, masked_index].argmax(dim=-1)
predicted_token = tokenizer.decode(predicted_token_id)

print(f"The predicted token is: {predicted_token}")

echo -e "Plants create [MASK] through a process known as photosynthesis." | transformers run --task fill-mask --model google-bert/bert-base-uncased --device 0

Notes

  • Inputs should be padded on the right because BERT uses absolute position embeddings.
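Why the right side: with absolute position embeddings, token i always receives position embedding i, so padding must go at the end, otherwise the real tokens would be shifted onto the wrong positions. A minimal sketch of what right-padding a batch looks like (`pad_right` is a hypothetical helper written for this example; in practice the BERT tokenizer pads on the right for you):

```python
def pad_right(sequences, pad_id=0):
    """Right-pad token id sequences to a common length (BERT-style).

    Returns the padded ids and an attention mask that is 1 for real
    tokens and 0 for padding.
    """
    max_len = max(len(s) for s in sequences)
    input_ids = [s + [pad_id] * (max_len - len(s)) for s in sequences]
    attention_mask = [[1] * len(s) + [0] * (max_len - len(s)) for s in sequences]
    return input_ids, attention_mask

ids, mask = pad_right([[101, 2023, 102], [101, 102]])
# ids  -> [[101, 2023, 102], [101, 102, 0]]
# mask -> [[1, 1, 1], [1, 1, 0]]
```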

BertConfig

[[autodoc]] BertConfig
    - all

BertTokenizer

[[autodoc]] BertTokenizer
    - get_special_tokens_mask
    - save_vocabulary

BertTokenizerLegacy

[[autodoc]] BertTokenizerLegacy

BertTokenizerFast

[[autodoc]] BertTokenizerFast

BertModel

[[autodoc]] BertModel
    - forward

BertForPreTraining

[[autodoc]] BertForPreTraining
    - forward

BertLMHeadModel

[[autodoc]] BertLMHeadModel
    - forward

BertForMaskedLM

[[autodoc]] BertForMaskedLM
    - forward

BertForNextSentencePrediction

[[autodoc]] BertForNextSentencePrediction
    - forward

BertForSequenceClassification

[[autodoc]] BertForSequenceClassification
    - forward

BertForMultipleChoice

[[autodoc]] BertForMultipleChoice
    - forward

BertForTokenClassification

[[autodoc]] BertForTokenClassification
    - forward

BertForQuestionAnswering

[[autodoc]] BertForQuestionAnswering
    - forward

Bert specific outputs

[[autodoc]] models.bert.modeling_bert.BertForPreTrainingOutput