| repo_id (string, 55 classes) | file_path (string, lengths 42–186) | content (string, lengths 1–333k) | __index_level_0__ (int64, 0–0) |
|---|---|---|---|
mavonic_private_repos/transformers/examples/research_projects | mavonic_private_repos/transformers/examples/research_projects/deebert/run_glue_deebert.py | from __future__ import absolute_import, division, print_function
import argparse
import glob
import logging
import os
import random
import time
import numpy as np
import torch
from torch import nn
from torch.utils.data import DataLoader, RandomSampler, SequentialSampler, TensorDataset
from torch.utils.data.distribute... | 0 |
mavonic_private_repos/transformers/examples/research_projects | mavonic_private_repos/transformers/examples/research_projects/deebert/README.md | # DeeBERT: Early Exiting for *BERT
This is the code base for the paper [DeeBERT: Dynamic Early Exiting for Accelerating BERT Inference](https://www.aclweb.org/anthology/2020.acl-main.204/), modified from its [original code base](https://github.com/castorini/deebert).
The original code base also has information for do... | 0 |
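The DeeBERT idea above attaches a small classifier ("off-ramp") to every transformer layer and stops inference as soon as one of them is confident enough. A minimal sketch of that entropy-based exit rule, assuming the per-layer logits are already available (the threshold and layout here are illustrative, not the repo's actual API):

```python
import numpy as np

def entropy(logits):
    # Softmax entropy of one logit vector; low entropy = confident prediction.
    p = np.exp(logits - logits.max())
    p /= p.sum()
    return float(-(p * np.log(p + 1e-12)).sum())

def early_exit_predict(layer_logits, threshold):
    """Return (prediction, exit_layer): stop at the first off-ramp whose
    prediction entropy falls below `threshold`, else use the last layer."""
    for i, logits in enumerate(layer_logits):
        logits = np.asarray(logits, dtype=float)
        if entropy(logits) < threshold:
            return int(np.argmax(logits)), i
    last = np.asarray(layer_logits[-1], dtype=float)
    return int(np.argmax(last)), len(layer_logits) - 1
```

With a high-confidence first layer the loop exits immediately; with uniform logits it falls through to a deeper layer, which is the speed/accuracy trade-off the paper tunes.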
mavonic_private_repos/transformers/examples/research_projects/deebert | mavonic_private_repos/transformers/examples/research_projects/deebert/src/modeling_highway_bert.py | import torch
from torch import nn
from torch.nn import CrossEntropyLoss, MSELoss
from transformers.file_utils import add_start_docstrings, add_start_docstrings_to_model_forward
from transformers.models.bert.modeling_bert import (
BERT_INPUTS_DOCSTRING,
BERT_START_DOCSTRING,
BertEmbeddings,
BertLayer,
... | 0 |
mavonic_private_repos/transformers/examples/research_projects/deebert | mavonic_private_repos/transformers/examples/research_projects/deebert/src/modeling_highway_roberta.py | from __future__ import absolute_import, division, print_function, unicode_literals
from torch import nn
from torch.nn import CrossEntropyLoss, MSELoss
from transformers import RobertaConfig
from transformers.file_utils import add_start_docstrings, add_start_docstrings_to_model_forward
from transformers.models.roberta... | 0 |
mavonic_private_repos/transformers/examples/research_projects | mavonic_private_repos/transformers/examples/research_projects/jax-projects/HOW_TO_PROPOSE_PROJECT.md | # How to propose a Flax/JAX + Transformers project
Great that you've opened this document!
While we at 🤗 are proposing a couple of projects, we strongly
believe that the community can come up with much more **creative**, **fun**, and
**impactful** projects on their own. This being said, we are really looking forw... | 0 |
mavonic_private_repos/transformers/examples/research_projects | mavonic_private_repos/transformers/examples/research_projects/jax-projects/README.md | # Flax/JAX community week 🤗
Welcome to the Flax/JAX community week! The goal of this week is to make compute-intensive NLP and CV projects (like pre-training BERT, GPT2, CLIP, ViT)
practicable for a wider audience of engineers and researchers.
To do so, we will try to teach **you** how to effectively use JAX/Flax o... | 0 |
mavonic_private_repos/transformers/examples/research_projects/jax-projects | mavonic_private_repos/transformers/examples/research_projects/jax-projects/model_parallel/partitions.py | #!/usr/bin/env python
# coding=utf-8
# Copyright 2021 The Google Research Authors and The HuggingFace Team All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://... | 0 |
mavonic_private_repos/transformers/examples/research_projects/jax-projects | mavonic_private_repos/transformers/examples/research_projects/jax-projects/model_parallel/run_clm_mp.py | #!/usr/bin/env python
# coding=utf-8
# Copyright 2021 The HuggingFace Team All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-... | 0 |
mavonic_private_repos/transformers/examples/research_projects/jax-projects | mavonic_private_repos/transformers/examples/research_projects/jax-projects/model_parallel/README.md | <!---
Copyright 2021 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or ... | 0 |
mavonic_private_repos/transformers/examples/research_projects/jax-projects | mavonic_private_repos/transformers/examples/research_projects/jax-projects/dataset-streaming/run_mlm_flax_stream.py | #!/usr/bin/env python
# coding=utf-8
# Copyright 2021 The HuggingFace Team All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-... | 0 |
mavonic_private_repos/transformers/examples/research_projects/jax-projects | mavonic_private_repos/transformers/examples/research_projects/jax-projects/dataset-streaming/README.md | <!---
Copyright 2021 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or ... | 0 |
mavonic_private_repos/transformers/examples/research_projects/jax-projects | mavonic_private_repos/transformers/examples/research_projects/jax-projects/big_bird/sweep_flax.yaml | command:
- python3
- train.py
method: random
parameters:
lr:
values: [4e-5, 3e-5]
warmup_steps:
values: [20000, 15000, 10000, 5000]
weight_decay:
distribution: normal
mu: 1e-2
sigma: 2e-3
metric:
... | 0 |
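Under standard wandb semantics, `method: random` samples each trial independently: `lr` and `warmup_steps` uniformly from their listed values, and `weight_decay` from N(mu=1e-2, sigma=2e-3). A sketch of one trial's parameter draw (a local illustration of the sweep's search space, not wandb's implementation):

```python
import random

LR_VALUES = [4e-5, 3e-5]
WARMUP_VALUES = [20000, 15000, 10000, 5000]

def sample_sweep_config(rng):
    # One random-search trial, mirroring sweep_flax.yaml's parameter space.
    return {
        "lr": rng.choice(LR_VALUES),
        "warmup_steps": rng.choice(WARMUP_VALUES),
        # `distribution: normal` is unbounded, so a draw can occasionally
        # be negative; wandb does not clip it for you.
        "weight_decay": rng.gauss(1e-2, 2e-3),
    }
```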
mavonic_private_repos/transformers/examples/research_projects/jax-projects | mavonic_private_repos/transformers/examples/research_projects/jax-projects/big_bird/requirements.txt | git+https://github.com/huggingface/transformers@main
datasets
sentencepiece
wandb
flax
jsonlines
| 0 |
mavonic_private_repos/transformers/examples/research_projects/jax-projects | mavonic_private_repos/transformers/examples/research_projects/jax-projects/big_bird/prepare_natural_questions.py | import os
import jsonlines
import numpy as np
from tqdm import tqdm
DOC_STRIDE = 2048
MAX_LENGTH = 4096
SEED = 42
PROCESS_TRAIN = os.environ.pop("PROCESS_TRAIN", "false")
CATEGORY_MAPPING = {"null": 0, "short": 1, "long": 2, "yes": 3, "no": 4}
def _get_single_answer(example):
def choose_first(answer, is_long_a... | 0 |
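The `DOC_STRIDE = 2048` / `MAX_LENGTH = 4096` constants above imply sliding-window chunking: long Natural Questions documents are cut into overlapping 4096-token windows so every token appears in at least one window. A self-contained sketch of that windowing (the helper name is ours, not the script's):

```python
DOC_STRIDE = 2048   # step between consecutive window starts
MAX_LENGTH = 4096   # tokens per window (BigBird's long context)

def make_windows(n_tokens, max_length=MAX_LENGTH, doc_stride=DOC_STRIDE):
    """Split a document of n_tokens into overlapping (start, end) spans,
    advancing by doc_stride so consecutive windows share half their tokens."""
    windows, start = [], 0
    while True:
        end = min(start + max_length, n_tokens)
        windows.append((start, end))
        if end == n_tokens:
            return windows
        start += doc_stride
```

The 50% overlap guarantees any answer span shorter than `DOC_STRIDE` tokens lies entirely inside some window.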
mavonic_private_repos/transformers/examples/research_projects/jax-projects | mavonic_private_repos/transformers/examples/research_projects/jax-projects/big_bird/bigbird_flax.py | import json
import os
from dataclasses import dataclass
from functools import partial
from typing import Callable
import flax.linen as nn
import jax
import jax.numpy as jnp
import joblib
import optax
import wandb
from flax import jax_utils, struct, traverse_util
from flax.serialization import from_bytes, to_bytes
from... | 0 |
mavonic_private_repos/transformers/examples/research_projects/jax-projects | mavonic_private_repos/transformers/examples/research_projects/jax-projects/big_bird/train.py | import os
from dataclasses import replace
import jax
import wandb
from bigbird_flax import Args, DataCollator, FlaxBigBirdForNaturalQuestions, Trainer, build_tx, train_step, val_step
from datasets import load_dataset
from flax import jax_utils
from transformers import BigBirdTokenizerFast
if __name__ == "__main__":... | 0 |
mavonic_private_repos/transformers/examples/research_projects/jax-projects | mavonic_private_repos/transformers/examples/research_projects/jax-projects/big_bird/evaluate.py | import jax
import jax.numpy as jnp
from bigbird_flax import FlaxBigBirdForNaturalQuestions
from datasets import load_from_disk
from transformers import BigBirdTokenizerFast
CATEGORY_MAPPING = {0: "null", 1: "short", 2: "long", 3: "yes", 4: "no"}
PUNCTUATION_SET_TO_EXCLUDE = set("".join(["‘", "’", "´", "`", ".", ",",... | 0 |
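A punctuation-exclusion set like the one above is typically used to normalize answers before exact-match comparison in QA evaluation. A hedged sketch of that normalization (our own minimal version, not the script's exact logic or character set):

```python
# Illustrative subset of characters to strip before comparison.
PUNCT_TO_EXCLUDE = set("‘’´`.,!?;:()[]{}") | {"'", '"'}

def normalize_answer(text):
    # Lowercase, drop excluded punctuation, collapse whitespace --
    # a common exact-match normalization for QA evaluation.
    chars = [c for c in text.lower() if c not in PUNCT_TO_EXCLUDE]
    return " ".join("".join(chars).split())

def exact_match(prediction, reference):
    return normalize_answer(prediction) == normalize_answer(reference)
```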
mavonic_private_repos/transformers/examples/research_projects/jax-projects | mavonic_private_repos/transformers/examples/research_projects/jax-projects/big_bird/README.md |
Author: [@vasudevgupta7](https://github.com/thevasudevgupta/)
## Intro
In this project, we fine-tuned [**BigBird**](https://arxiv.org/abs/2007.14062) on [**natural-questions**](https://huggingface.co/datasets/natural_questions) dataset for **question-answering** task on long documents. **BigBird**, is a **sparse-att... | 0 |
mavonic_private_repos/transformers/examples/research_projects/jax-projects | mavonic_private_repos/transformers/examples/research_projects/jax-projects/hybrid_clip/modeling_hybrid_clip.py | # coding=utf-8
# Copyright 2021 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless requir... | 0 |
mavonic_private_repos/transformers/examples/research_projects/jax-projects | mavonic_private_repos/transformers/examples/research_projects/jax-projects/hybrid_clip/requirements.txt | jax>=0.2.8
jaxlib>=0.1.59
flax>=0.3.5
optax>=0.0.8
-f https://download.pytorch.org/whl/torch_stable.html
torch==1.13.1
-f https://download.pytorch.org/whl/torch_stable.html
torchvision==0.10.0+cpu | 0 |
mavonic_private_repos/transformers/examples/research_projects/jax-projects | mavonic_private_repos/transformers/examples/research_projects/jax-projects/hybrid_clip/configuration_hybrid_clip.py | import copy
from transformers.configuration_utils import PretrainedConfig
from transformers.utils import logging
logger = logging.get_logger(__name__)
class HybridCLIPConfig(PretrainedConfig):
r"""
:class:`HybridCLIPConfig` is the configuration class to store the configuration of a
:class:`~HybridCLIPM... | 0 |
mavonic_private_repos/transformers/examples/research_projects/jax-projects | mavonic_private_repos/transformers/examples/research_projects/jax-projects/hybrid_clip/run_hybrid_clip.py | #!/usr/bin/env python
# coding=utf-8
# Copyright 2021 The HuggingFace Team All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-... | 0 |
mavonic_private_repos/transformers/examples/research_projects/jax-projects | mavonic_private_repos/transformers/examples/research_projects/jax-projects/hybrid_clip/README.md | <!---
Copyright 2021 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or ... | 0 |
mavonic_private_repos/transformers/examples/research_projects/jax-projects | mavonic_private_repos/transformers/examples/research_projects/jax-projects/wav2vec2/run_wav2vec2_pretrain_flax.py | #!/usr/bin/env python3
import logging
import sys
import time
from dataclasses import field
from pathlib import Path
from typing import Dict, List, Optional, Union
import flax
import jax
import jax.numpy as jnp
import librosa
import numpy as np
import optax
from datasets import DatasetDict, load_dataset
from flax impor... | 0 |
mavonic_private_repos/transformers/examples/research_projects/jax-projects | mavonic_private_repos/transformers/examples/research_projects/jax-projects/wav2vec2/README.md | # Wav2Vec2 Contrastive Loss PreTraining examples
The following example showcases how to pretrain a wav2vec2 model using the JAX/Flax backend.
Pretraining Wav2Vec2 is rather complex, so it is highly recommended to read the
[official paper](https://arxiv.org/abs/2006.11477).
JAX/Flax allows you to trace pure functions... | 0 |
mavonic_private_repos/transformers/examples | mavonic_private_repos/transformers/examples/legacy/run_openai_gpt.py | #!/usr/bin/env python
# coding=utf-8
# Copyright 2018 Google AI, Google Brain and Carnegie Mellon University Authors and the HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in co... | 0 |
mavonic_private_repos/transformers/examples | mavonic_private_repos/transformers/examples/legacy/run_transfo_xl.py | #!/usr/bin/env python
# coding=utf-8
# Copyright 2018 Google AI, Google Brain and Carnegie Mellon University Authors and the HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in co... | 0 |
mavonic_private_repos/transformers/examples | mavonic_private_repos/transformers/examples/legacy/run_chinese_ref.py | #!/usr/bin/env python
import argparse
import json
from typing import List
from ltp import LTP
from transformers import BertTokenizer
def _is_chinese_char(cp):
"""Checks whether CP is the codepoint of a CJK character."""
# This defines a "chinese character" as anything in the CJK Unicode block:
# https... | 0 |
mavonic_private_repos/transformers/examples | mavonic_private_repos/transformers/examples/legacy/run_camembert.py | #!/usr/bin/env python
import torch
from transformers import CamembertForMaskedLM, CamembertTokenizer
def fill_mask(masked_input, model, tokenizer, topk=5):
# Adapted from https://github.com/pytorch/fairseq/blob/master/fairseq/models/roberta/hub_interface.py
assert masked_input.count("<mask>") == 1
input_... | 0 |
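The truncated `fill_mask` above ranks the vocabulary distribution at the single `<mask>` position and returns the top-k candidates. A self-contained sketch of that ranking step in plain numpy (the model/tokenizer plumbing is omitted; `vocab` here is a hypothetical token list):

```python
import numpy as np

def topk_predictions(logits, vocab, topk=5):
    """Return the topk (token, probability) pairs for one mask position."""
    p = np.exp(logits - logits.max())   # stable softmax
    p /= p.sum()
    order = np.argsort(p)[::-1][:topk]  # indices sorted by descending prob
    return [(vocab[i], float(p[i])) for i in order]
```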
mavonic_private_repos/transformers/examples | mavonic_private_repos/transformers/examples/legacy/run_language_modeling.py | #!/usr/bin/env python
# coding=utf-8
# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
... | 0 |
mavonic_private_repos/transformers/examples | mavonic_private_repos/transformers/examples/legacy/run_swag.py | #!/usr/bin/env python
# coding=utf-8
# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
... | 0 |
mavonic_private_repos/transformers/examples | mavonic_private_repos/transformers/examples/legacy/README.md | <!---
Copyright 2020 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or ... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/token-classification/run_pos.sh | if ! [ -f ./dev.txt ]; then
echo "Download dev dataset...."
curl -L -o ./dev.txt 'https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-dev.conllu'
fi
if ! [ -f ./test.txt ]; then
echo "Download test dataset...."
curl -L -o ./test.txt 'https://github.com/UniversalDependencies/UD_English-... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/token-classification/run_ner.py | # coding=utf-8
# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a cop... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/token-classification/run_chunk.sh | if ! [ -f ./dev.txt ]; then
echo "Downloading CONLL2003 dev dataset...."
curl -L -o ./dev.txt 'https://github.com/davidsbatista/NER-datasets/raw/master/CONLL2003/valid.txt'
fi
if ! [ -f ./test.txt ]; then
echo "Downloading CONLL2003 test dataset...."
curl -L -o ./test.txt 'https://github.com/davidsbatista/NER-... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/token-classification/utils_ner.py | # coding=utf-8
# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a cop... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/token-classification/tasks.py | import logging
import os
from typing import List, TextIO, Union
from conllu import parse_incr
from utils_ner import InputExample, Split, TokenClassificationTask
logger = logging.getLogger(__name__)
class NER(TokenClassificationTask):
def __init__(self, label_idx=-1):
# in NER datasets, the last column ... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/token-classification/README.md | ## Token classification
Based on the scripts [`run_ner.py`](https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_ner.py).
The following examples are covered in this section:
* NER on the GermEval 2014 (German NER) dataset
* Emerging and Rare Entities task: WNUT’17 (English N... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/token-classification/run.sh | ## The relevant files are currently on a shared Google
## drive at https://drive.google.com/drive/folders/1kC0I2UGl2ltrluI9NqDjaQJGw5iliw_J
## Monitor for changes and eventually migrate to use the `datasets` library
curl -L 'https://drive.google.com/uc?export=download&id=1Jjhbal535VVz2ap4v4r_rN1UEHTdLK5P' \
| grep -v "... | 0 |
mavonic_private_repos/transformers/examples/legacy/token-classification | mavonic_private_repos/transformers/examples/legacy/token-classification/scripts/preprocess.py | import sys
from transformers import AutoTokenizer
dataset = sys.argv[1]
model_name_or_path = sys.argv[2]
max_len = int(sys.argv[3])
subword_len_counter = 0
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path)
max_len -= tokenizer.num_special_tokens_to_add()
with open(dataset, "rt") as f_p:
for line i... | 0 |
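The loop that preprocess.py's snippet cuts off appears to track a running subword count (`subword_len_counter`) and emit a blank line, which `run_ner.py` treats as an example boundary, before the tokenizer budget is exceeded. A hedged sketch of that splitting logic with a toy subword counter standing in for the real tokenizer:

```python
def split_by_subword_budget(lines, count_subwords, max_len):
    """Yield lines, inserting an empty line (an example break for run_ner.py)
    whenever the running subword count would exceed max_len."""
    running = 0
    for line in lines:
        n = count_subwords(line)
        if running + n > max_len and running > 0:
            yield ""          # start a new example
            running = 0
        running += n
        yield line

# Toy "tokenizer": one subword per 4 characters (hypothetical stand-in
# for tokenizer.tokenize(...) length counting).
count = lambda s: max(1, len(s) // 4)
```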
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/pytorch-lightning/lightning_base.py | import argparse
import logging
import os
from pathlib import Path
from typing import Any, Dict
import pytorch_lightning as pl
from pytorch_lightning.utilities import rank_zero_info
from transformers import (
AdamW,
AutoConfig,
AutoModel,
AutoModelForPreTraining,
AutoModelForQuestionAnswering,
... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/pytorch-lightning/run_pos.sh | #!/usr/bin/env bash
if ! [ -f ./dev.txt ]; then
echo "Download dev dataset...."
curl -L -o ./dev.txt 'https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-dev.conllu'
fi
if ! [ -f ./test.txt ]; then
echo "Download test dataset...."
curl -L -o ./test.txt 'https://github.com/UniversalDepe... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/pytorch-lightning/requirements.txt | tensorboard
scikit-learn
seqeval
psutil
sacrebleu
rouge-score
tensorflow_datasets
matplotlib
git-python==1.0.3
faiss-cpu
streamlit
elasticsearch
nltk
pandas
datasets >= 1.1.3
fire
pytest<8.0.1
conllu
sentencepiece != 0.1.92
protobuf
ray
| 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/pytorch-lightning/run_ner.py | import argparse
import glob
import logging
import os
from argparse import Namespace
from importlib import import_module
import numpy as np
import torch
from lightning_base import BaseTransformer, add_generic_args, generic_train
from seqeval.metrics import accuracy_score, f1_score, precision_score, recall_score
from to... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/pytorch-lightning/run_glue.sh | # Install example requirements
pip install -r ../requirements.txt
# Download glue data
python3 ../../utils/download_glue_data.py
export TASK=mrpc
export DATA_DIR=./glue_data/MRPC/
export MAX_LENGTH=128
export LEARNING_RATE=2e-5
export BERT_MODEL=bert-base-cased
export BATCH_SIZE=32
export NUM_EPOCHS=3
export SEED=2
e... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/pytorch-lightning/run_ner.sh | #!/usr/bin/env bash
# for seqeval metrics import
pip install -r ../requirements.txt
## The relevant files are currently on a shared Google
## drive at https://drive.google.com/drive/folders/1kC0I2UGl2ltrluI9NqDjaQJGw5iliw_J
## Monitor for changes and eventually migrate to use the `datasets` library
curl -L 'https://d... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/pytorch-lightning/run_glue.py | import argparse
import glob
import logging
import os
import time
from argparse import Namespace
import numpy as np
import torch
from lightning_base import BaseTransformer, add_generic_args, generic_train
from torch.utils.data import DataLoader, TensorDataset
from transformers import glue_compute_metrics as compute_me... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/finetune.sh | # Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/save_randomly_initialized_model.py | #!/usr/bin/env python
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/convert_model_to_fp16.py | #!/usr/bin/env python
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/rouge_cli.py | # Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/old_test_datasets.py | # Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/requirements.txt | tensorboard
scikit-learn
seqeval
psutil
sacrebleu
rouge-score
tensorflow_datasets
matplotlib
git-python==1.0.3
faiss-cpu
streamlit
elasticsearch
nltk
pandas
datasets >= 1.1.3
fire
pytest<8.0.1
conllu
sentencepiece != 0.1.92
protobuf
| 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/seq2seq_trainer.py | # Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/finetune_tpu.sh | # Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/train_mbart_cc25_enro.sh | # Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/utils.py | # Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/old_test_seq2seq_examples.py | # Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/run_eval_search.py | #!/usr/bin/env python
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/run_distributed_eval.py | #!/usr/bin/env python
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/old_test_seq2seq_examples_multi_gpu.py | # Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/finetune_trainer.py | #!/usr/bin/env python
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/train_distil_marian_enro.sh | # Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/old_test_tatoeba_conversion.py | # Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/save_len_file.py | #!/usr/bin/env python
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/__init__.py | import os
import sys
sys.path.insert(1, os.path.dirname(os.path.realpath(__file__)))
| 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/romanian_postprocessing.md | ### Motivation
Without postprocessing, English→Romanian mbart-large-en-ro gets a BLEU score of 26.8 on the WMT data.
With postprocessing, it can score 37.
Here is the postprocessing code, stolen from @mjpost in this [issue](https://github.com/pytorch/fairseq/issues/1758)
### Instructions
Note: You need to have your test_... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/pack_dataset.py | #!/usr/bin/env python
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/xla_spawn.py | # Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/train_distil_marian_enro_tpu.sh | # Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/old_test_calculate_rouge.py | # Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/old_test_fsmt_bleu_score.py | # coding=utf-8
# Copyright 2020 Huggingface
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed ... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/run_eval.py | #!/usr/bin/env python
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/download_wmt.py | #!/usr/bin/env python
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/sentence_splitter.py | # Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/README.md | <!---
Copyright 2020 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or ... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/minify_dataset.py | #!/usr/bin/env python
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/train_distilbart_cnn.sh | # Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/seq2seq/seq2seq_training_args.py | # Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | 0 |
mavonic_private_repos/transformers/examples/legacy/seq2seq/test_data | mavonic_private_repos/transformers/examples/legacy/seq2seq/test_data/fsmt/fsmt_val_data.json | {
"en-ru": {
"src": [
"Welsh AMs worried about 'looking like muppets'",
"There is consternation among some AMs at a suggestion their title should change to MWPs (Member of the Welsh Parliament).",
"It has arisen because of plans to change the name of the assembly to the Welsh Parliament.",
... | 0 |
mavonic_private_repos/transformers/examples/legacy/seq2seq/test_data | mavonic_private_repos/transformers/examples/legacy/seq2seq/test_data/fsmt/build-eval-data.py | #!/usr/bin/env python
import io
import json
import subprocess
pairs = [
["en", "ru"],
["ru", "en"],
["en", "de"],
["de", "en"],
]
n_objs = 8
def get_all_data(pairs, n_objs):
text = {}
for src, tgt in pairs:
pair = f"{src}-{tgt}"
cmd = f"sacrebleu -t wmt19 -l {pair} --echo s... | 0 |
mavonic_private_repos/transformers/examples/legacy/seq2seq/test_data | mavonic_private_repos/transformers/examples/legacy/seq2seq/test_data/wmt_en_ro/val.source | Brazil's Former Presidential Chief-of-Staff to Stand Trial A federal judge on Tuesday accepted the charges filed against Brazil's former presidential chief of staff for his alleged involvement in a massive corruption scheme at state-owned oil company Petrobras. The federal prosecutor's office said Jose Dirceu will face... | 0 |
mavonic_private_repos/transformers/examples/legacy/seq2seq/test_data | mavonic_private_repos/transformers/examples/legacy/seq2seq/test_data/wmt_en_ro/train.source | Corrections to votes and voting intentions: see Minutes Assignment conferred on a Member: see Minutes Membership of committees and delegations: see Minutes Decisions concerning certain documents: see Minutes Forwarding of texts adopted during the sitting: see Minutes Dates for next sittings: see Minutes
Membership of P... | 0 |
mavonic_private_repos/transformers/examples/legacy/seq2seq/test_data | mavonic_private_repos/transformers/examples/legacy/seq2seq/test_data/wmt_en_ro/train.target | Corectările voturilor şi intenţiile de vot: a se vedea procesul-verbal Misiune încredinţată unui deputat: consultaţi procesul-verbal Componenţa comisiilor şi a delegaţiilor: a se vedea procesul-verbal Decizii privind anumite documente: a se vedea procesul-verbal Transmiterea textelor adoptate în cursul prezentei şedinţ... | 0 |
mavonic_private_repos/transformers/examples/legacy/seq2seq/test_data | mavonic_private_repos/transformers/examples/legacy/seq2seq/test_data/wmt_en_ro/test.source | UN Chief Says There Is No Military Solution in Syria Secretary-General Ban Ki-moon says his response to Russia's stepped up military support for Syria is that "there is no military solution" to the nearly five-year conflict and more weapons will only worsen the violence and misery for millions of people. The U.N. chief... | 0 |
mavonic_private_repos/transformers/examples/legacy/seq2seq/test_data | mavonic_private_repos/transformers/examples/legacy/seq2seq/test_data/wmt_en_ro/val.target | Fostul șef al cabinetului prezidențial brazilian este adus în fața instanței Marți, un judecător federal a acceptat acuzațiile aduse împotriva fostului șef al cabinetului prezidențial brazilian pentru presupusa implicare a acestuia într-o schemă masivă de corupție privind compania petrolieră de stat Petrobras. Biroul p... | 0 |
mavonic_private_repos/transformers/examples/legacy/seq2seq/test_data | mavonic_private_repos/transformers/examples/legacy/seq2seq/test_data/wmt_en_ro/test.target | Șeful ONU declară că nu există soluții militare în Siria Secretarul General Ban Ki-moon afirmă că răspunsul său la suportul militar al Rusiei pentru Siria este că „nu există o soluție militară” la conflictul care durează de aproape cinci ani iar mai multe arme nu ar face decât să agraveze violența și suferința a milioa... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/multiple_choice/run_multiple_choice.py | # coding=utf-8
# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a cop... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/multiple_choice/utils_multiple_choice.py | # coding=utf-8
# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a cop... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/benchmarking/requirements.txt | torch >= 1.3 | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/benchmarking/run_benchmark.py | #!/usr/bin/env python
# coding=utf-8
# Copyright 2020 The HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License a... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/benchmarking/plot_csv_file.py | # Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/benchmarking/README.md | <!---
Copyright 2020 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or ... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/question-answering/run_squad.py | # coding=utf-8
# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a cop... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/question-answering/run_squad_trainer.py | # coding=utf-8
# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a cop... | 0 |
mavonic_private_repos/transformers/examples/legacy | mavonic_private_repos/transformers/examples/legacy/question-answering/README.md | #### Fine-tuning BERT on SQuAD1.0 with relative position embeddings
The following examples show how to fine-tune BERT models with different relative position embeddings. The BERT model
`google-bert/bert-base-uncased` was pretrained with default absolute position embeddings. We provide the following pretrained
models... | 0 |
mavonic_private_repos/transformers/examples | mavonic_private_repos/transformers/examples/pytorch/old_test_xla_examples.py | # coding=utf-8
# Copyright 2018 HuggingFace Inc..
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or a... | 0 |
mavonic_private_repos/transformers/examples | mavonic_private_repos/transformers/examples/pytorch/test_pytorch_examples.py | # coding=utf-8
# Copyright 2018 HuggingFace Inc..
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or a... | 0 |
mavonic_private_repos/transformers/examples | mavonic_private_repos/transformers/examples/pytorch/test_accelerate_examples.py | # coding=utf-8
# Copyright 2018 HuggingFace Inc..
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or a... | 0 |
mavonic_private_repos/transformers/examples | mavonic_private_repos/transformers/examples/pytorch/_tests_requirements.txt | tensorboard
scikit-learn
seqeval
psutil
sacrebleu >= 1.4.12
git+https://github.com/huggingface/accelerate@main#egg=accelerate
rouge-score
tensorflow_datasets
matplotlib
git-python==1.0.3
faiss-cpu
streamlit
elasticsearch
nltk
pandas
datasets >= 1.13.3
fire
pytest<8.0.1
conllu
sentencepiece != 0.1.92
protobuf
torch
torc... | 0 |