repo_name | pr_number | pr_title | pr_description | author | date_created | date_merged | previous_commit | pr_commit | query | filepath | before_content | after_content | label |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./tests/models/mobilebert/test_tokenization_mobilebert.py | # coding=utf-8
# Copyright 2022 Leon Derczynski. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by... | # coding=utf-8
# Copyright 2022 Leon Derczynski. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./src/transformers/models/trajectory_transformer/convert_trajectory_transformer_original_pytorch_checkpoint_to_pytorch.py | # coding=utf-8
# Copyright 2022 The Trajectory Transformers paper authors and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://ww... | # coding=utf-8
# Copyright 2022 The Trajectory Transformers paper authors and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://ww... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./tests/models/bert/__init__.py | -1 | ||
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./examples/research_projects/jax-projects/hybrid_clip/modeling_hybrid_clip.py | # coding=utf-8
# Copyright 2021 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless requir... | # coding=utf-8
# Copyright 2021 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless requir... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./src/transformers/models/ibert/configuration_ibert.py | # coding=utf-8
# Copyright 2021 The I-BERT Authors (Sehoon Kim, Amir Gholami, Zhewei Yao,
# Michael Mahoney, Kurt Keutzer - UC Berkeley) and The HuggingFace Inc. team.
# Copyright (c) 20121, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use t... | # coding=utf-8
# Copyright 2021 The I-BERT Authors (Sehoon Kim, Amir Gholami, Zhewei Yao,
# Michael Mahoney, Kurt Keutzer - UC Berkeley) and The HuggingFace Inc. team.
# Copyright (c) 20121, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use t... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./src/transformers/models/pegasus_x/configuration_pegasus_x.py | # coding=utf-8
# Copyright 2022, Google and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
... | # coding=utf-8
# Copyright 2022, Google and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./tests/models/swin2sr/test_modeling_swin2sr.py | # coding=utf-8
# Copyright 2022 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | # coding=utf-8
# Copyright 2022 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./docs/source/en/attention.mdx | <!--Copyright 2023 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | <!--Copyright 2023 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./src/transformers/models/upernet/configuration_upernet.py | # coding=utf-8
# Copyright 2022 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | # coding=utf-8
# Copyright 2022 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./examples/research_projects/distillation/run_squad_w_distillation.py | # coding=utf-8
# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a cop... | # coding=utf-8
# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a cop... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./scripts/fsmt/gen-card-allenai-wmt19.py | #!/usr/bin/env python
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless... | #!/usr/bin/env python
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./src/transformers/models/yoso/fast_lsh_cumulation_cuda.cu | // File from https://github.com/mlpen/YOSO/blob/main/encoders/backbones/efficient_attentions/yoso/yoso_v1/cuda/fast_lsh_cumulation_cuda.cu
#include "fast_lsh_cumulation_cuda.h"
#include "common_cuda_device.h"
#include "common_cuda.h"
#include "common.h"
#include <stdio.h>
//////////////////////////////////////////////... | // File from https://github.com/mlpen/YOSO/blob/main/encoders/backbones/efficient_attentions/yoso/yoso_v1/cuda/fast_lsh_cumulation_cuda.cu
#include "fast_lsh_cumulation_cuda.h"
#include "common_cuda_device.h"
#include "common_cuda.h"
#include "common.h"
#include <stdio.h>
//////////////////////////////////////////////... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./docs/source/de/pipeline_tutorial.mdx | <!--Copyright 2022 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | <!--Copyright 2022 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./examples/research_projects/robust-speech-event/eval.py | #!/usr/bin/env python3
import argparse
import re
from typing import Dict
import torch
from datasets import Audio, Dataset, load_dataset, load_metric
from transformers import AutoFeatureExtractor, pipeline
def log_results(result: Dataset, args: Dict[str, str]):
"""DO NOT CHANGE. This function computes and logs t... | #!/usr/bin/env python3
import argparse
import re
from typing import Dict
import torch
from datasets import Audio, Dataset, load_dataset, load_metric
from transformers import AutoFeatureExtractor, pipeline
def log_results(result: Dataset, args: Dict[str, str]):
"""DO NOT CHANGE. This function computes and logs t... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./src/transformers/models/cvt/modeling_tf_cvt.py | # coding=utf-8
# Copyright 2022 Microsoft Research and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/L... | # coding=utf-8
# Copyright 2022 Microsoft Research and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/L... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./.git/objects/a2/cc185401e7764367ae52d1654491cf863ae69e | x+)JMU00e040031Q,+dx6M9{wk+qIODĒĒ">Kq{YCIQbf^*XiQc52Mw
X''$%&gk4:tki.-[pz6=)%EI%) mpɸIޱtiD9i+N-4b\p9nNn76-% qW03s&lKK2sA|LJNNvɢv
=
OA | x+)JMU00e040031Q,+dx6M9{wk+qIODĒĒ">Kq{YCIQbf^*XiQc52Mw
X''$%&gk4:tki.-[pz6=)%EI%) mpɸIޱtiD9i+N-4b\p9nNn76-% qW03s&lKK2sA|LJNNvɢv
=
OA | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./examples/tensorflow/question-answering/run_qa.py | #!/usr/bin/env python
# coding=utf-8
# Copyright 2020 The HuggingFace Team All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-... | #!/usr/bin/env python
# coding=utf-8
# Copyright 2020 The HuggingFace Team All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./src/transformers/commands/run.py | # Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | # Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./examples/pytorch/speech-recognition/requirements.txt | datasets >= 1.18.0
torch >= 1.5
torchaudio
librosa
jiwer
evaluate
| datasets >= 1.18.0
torch >= 1.5
torchaudio
librosa
jiwer
evaluate
| -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./README_zh-hans.md | <!---
Copyright 2020 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or ... | <!---
Copyright 2020 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or ... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./examples/research_projects/jax-projects/dataset-streaming/run_mlm_flax_stream.py | #!/usr/bin/env python
# coding=utf-8
# Copyright 2021 The HuggingFace Team All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-... | #!/usr/bin/env python
# coding=utf-8
# Copyright 2021 The HuggingFace Team All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./.git/objects/42/4caa4447aac94fb44979a1fa0748a2d67a9106 | x+)JMU057e040031Q,+dXD[,VWj{V2,)517>9?(13|SUd+X.TU/u@V^XRWܦ'f{wGD\Terb^JfJbIj|zj^jQbI~HASewveϸӑ^
T_Zvĩ?NztZJK`K۲Pi99%Eɩ`1;C,~B
OF䑥:F,RSɴox4>BLSGl{o6u{U;yGpAd'ee&|*Quw懏1+gvnk#\}0S֜VfݾT]I>sjˏ~}^gd,́~3٥Оjds&ngl5:w[.gvIsꔜ X | x+)JMU057e040031Q,+dXD[,VWj{V2,)517>9?(13|SUd+X.TU/u@V^XRWܦ'f{wGD\Terb^JfJbIj|zj^jQbI~HASewveϸӑ^
T_Zvĩ?NztZJK`K۲Pi99%Eɩ`1;C,~B
OF䑥:F,RSɴox4>BLSGl{o6u{U;yGpAd'ee&|*Quw懏1+gvnk#\}0S֜VfݾT]I>sjˏ~}^gd,́~3٥Оjds&ngl5:w[.gvIsꔜ X | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
 | ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should address the `__floordiv__` warnings mentioned in #19934. Dividing torch tensors using `//` is deprecated and should instead be done via `torch.div`.
| ./tests/models/whisper/test_feature_extraction_whisper.py | # coding=utf-8
# Copyright 2022 HuggingFace Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or ag... | # coding=utf-8
# Copyright 2022 HuggingFace Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or ag... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ./tests/models/speech_encoder_decoder/test_modeling_flax_speech_encoder_decoder.py | # coding=utf-8
# Copyright 2022 HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law... | # coding=utf-8
# Copyright 2022 HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ./.circleci/TROUBLESHOOT.md | # Troubleshooting
This is a document explaining how to deal with various issues on Circle-CI. The entries may include actually solutions or pointers to Issues that cover those.
## Circle CI
* pytest worker runs out of resident RAM and gets killed by `cgroups`: https://github.com/huggingface/transformers/issues/11408... | # Troubleshooting
This is a document explaining how to deal with various issues on Circle-CI. The entries may include actually solutions or pointers to Issues that cover those.
## Circle CI
* pytest worker runs out of resident RAM and gets killed by `cgroups`: https://github.com/huggingface/transformers/issues/11408... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ./tests/models/clipseg/__init__.py | -1 | ||
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ./src/transformers/models/clipseg/convert_clipseg_original_pytorch_to_hf.py | # coding=utf-8
# Copyright 2022 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | # coding=utf-8
# Copyright 2022 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ./tests/models/xlnet/test_modeling_xlnet.py | # coding=utf-8
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless requir... | # coding=utf-8
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless requir... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ./tests/models/bart/test_modeling_tf_bart.py | # coding=utf-8
# Copyright 2021 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | # coding=utf-8
# Copyright 2021 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ./tests/models/megatron_gpt2/__init__.py | -1 | ||
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ./src/transformers/models/xlm_prophetnet/configuration_xlm_prophetnet.py | # coding=utf-8
# Copyright 2020 The Microsoft Authors and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unl... | # coding=utf-8
# Copyright 2020 The Microsoft Authors and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unl... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ./.github/workflows/add-model-like.yml | name: Add model like runner
on:
push:
branches:
- main
pull_request:
paths:
- "src/**"
- "tests/**"
- ".github/**"
types: [opened, synchronize, reopened]
jobs:
run_tests_templates_like:
name: "Add new model like template tests"
runs-on: ubuntu-latest
steps:
... | name: Add model like runner
on:
push:
branches:
- main
pull_request:
paths:
- "src/**"
- "tests/**"
- ".github/**"
types: [opened, synchronize, reopened]
jobs:
run_tests_templates_like:
name: "Add new model like template tests"
runs-on: ubuntu-latest
steps:
... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ./examples/pytorch/audio-classification/run_audio_classification.py | #!/usr/bin/env python
# coding=utf-8
# Copyright 2021 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LI... | #!/usr/bin/env python
# coding=utf-8
# Copyright 2021 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LI... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ./src/transformers/models/maskformer/__init__.py | # Copyright 2022 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | # Copyright 2022 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ./src/transformers/models/blenderbot_small/__init__.py | # Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | # Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ./docs/source/en/model_doc/xlm-roberta-xl.mdx | <!--Copyright 2022 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | <!--Copyright 2022 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ./tests/models/layoutlmv3/test_image_processing_layoutlmv3.py | # coding=utf-8
# Copyright 2022 HuggingFace Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or ag... | # coding=utf-8
# Copyright 2022 HuggingFace Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or ag... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ./src/transformers/models/bit/__init__.py | # Copyright 2022 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | # Copyright 2022 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ./tests/__init__.py | -1 | ||
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ./src/transformers/models/plbart/__init__.py | # Copyright 2022 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | # Copyright 2022 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ./src/transformers/pipelines/image_classification.py | from typing import List, Union
from ..utils import (
add_end_docstrings,
is_tf_available,
is_torch_available,
is_vision_available,
logging,
requires_backends,
)
from .base import PIPELINE_INIT_ARGS, Pipeline
if is_vision_available():
from PIL import Image
from ..image_utils import lo... | from typing import List, Union
from ..utils import (
add_end_docstrings,
is_tf_available,
is_torch_available,
is_vision_available,
logging,
requires_backends,
)
from .base import PIPELINE_INIT_ARGS, Pipeline
if is_vision_available():
from PIL import Image
from ..image_utils import lo... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ./src/transformers/models/cvt/modeling_cvt.py | # coding=utf-8
# Copyright 2022 Microsoft Research and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/L... | # coding=utf-8
# Copyright 2022 Microsoft Research and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/L... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ./ISSUES.md | <!---
Copyright 2020 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or ... | <!---
Copyright 2020 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or ... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ./tests/models/oneformer/__init__.py | -1 | ||
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ArthurZucker | 2022-11-14T15:30:34Z | 2023-03-01T09:49:22Z | b29e2dcaff114762e65eaea739ba1076fc5d1c84 | 44e3e3fb4930298f092f336c2b7add3ebf051928 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch". # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
| ./tests/models/bigbird_pegasus/test_modeling_bigbird_pegasus.py | # coding=utf-8
# Copyright 2021 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | # coding=utf-8
# Copyright 2021 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | -1 |
huggingface/transformers | 20,211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch" | # What does this PR do?
Should adress the `__floordiv__` warnings mentionned in #19934. Divinding torch tensor using `//` is no longer supported and has to be done via `torch.div`.
huggingface/transformers | PR 20211 | prepare for "__floordiv__ is deprecated and its behavior will change in a future version of pytorch"

Author: ArthurZucker | created: 2022-11-14T15:30:34Z | merged: 2023-03-01T09:49:22Z
Commits: b29e2dcaff114762e65eaea739ba1076fc5d1c84 -> 44e3e3fb4930298f092f336c2b7add3ebf051928

Description ("What does this PR do?"): Should address the `__floordiv__` warnings mentioned in #19934. Dividing a torch tensor using `//` is no longer supported and has to be done via `torch.div`.

Files paired with this PR in the dump. The `before_content`/`after_content` columns are truncated to each file's opening lines (license headers, imports, or short module bodies), identical on both sides, so they are omitted below; the per-row PR metadata, repeated verbatim in every row of the dump, is given once above.

| filepath | label |
| --- | --- |
| ./src/transformers/models/tapas/__init__.py | -1 |
| ./examples/legacy/token-classification/utils_ner.py | -1 |
| ./.git/logs/refs/heads/main | -1 |
| ./tests/models/bertweet/__init__.py | -1 |
| ./examples/pytorch/translation/README.md | -1 |
| ./docs/source/en/model_doc/bort.mdx | -1 |
| ./tests/models/tapas/test_tokenization_tapas.py | -1 |
| ./examples/research_projects/mlm_wwm/run_chinese_ref.py | -1 |
| ./docs/source/en/model_doc/gpt_neo.mdx | -1 |
| ./tests/pipelines/test_pipelines_zero_shot_audio_classification.py | -1 |
| ./docs/source/en/model_doc/xlsr_wav2vec2.mdx | -1 |
| ./utils/test_module/custom_processing.py | -1 |
| ./examples/research_projects/jax-projects/HOW_TO_PROPOSE_PROJECT.md | -1 |
| ./docs/source/pt/tasks/token_classification.mdx | -1 |
| ./docs/source/es/_config.py | -1 |
| ./src/transformers/models/flava/image_processing_flava.py | -1 |
| ./tests/models/swinv2/__init__.py | -1 |
| ./tests/models/audio_spectrogram_transformer/test_modeling_audio_spectrogram_transformer.py | -1 |
| ./docs/source/pt/training.mdx | -1 |
| ./examples/research_projects/rag-end2end-retriever/callbacks_rag.py | -1 |
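The migration this PR's description calls for — replacing `//` floor division on tensors with an explicit `torch.div` call — can be sketched as below. This is a minimal illustration of the API change, not code taken from the PR's actual diff:

```python
import torch

t = torch.tensor([7, -7, 9])

# Old style: `t // 2` relied on Tensor.__floordiv__, which emitted
# deprecation warnings on the PyTorch versions this PR targets.
# The explicit replacement names the rounding mode:
floored = torch.div(t, 2, rounding_mode="floor")  # round toward -inf, like Python's //
trunced = torch.div(t, 2, rounding_mode="trunc")  # round toward zero (C-style)

print(floored.tolist())  # [3, -4, 4]
print(trunced.tolist())  # [3, -3, 4]
```

`rounding_mode="floor"` matches Python's `//` semantics, so call sites migrated this way keep their previous behavior for negative operands.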
huggingface/transformers | PR 20209 | Add gpt-sw3 model to transformers

Author: ekgren | created: 2022-11-14T14:04:00Z | merged: 2022-12-12T18:12:13Z
Commits: b58beebe7286bf53a80f137e0e5cd100ccb77ae2 -> 5f94855dc31242d15d755b0d97ec6a0479ee0ea9

Description: This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D... (description truncated in the dump).

Files paired with this PR in the dump (`before_content`/`after_content` again truncated to each file's opening license comment and identical on both sides):

| filepath | label |
| --- | --- |
| ./README.md | 1 |
| ./README_es.md | 1 |
| ./README_hd.md | 1 |
| ./README_ja.md | 1 |
| ./README_ko.md | 1 |
| ./README_zh-hans.md | 1 |
| ./README_zh-hant.md | (row cut off mid-dump; label not shown) |
Copyright 2020 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or ... | <!---
Copyright 2020 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or ... | 1 |
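The records above are flat pipe-delimited rows following the schema listed in the file header (repo_name, pr_number, pr_title, pr_description, author, date_created, date_merged, previous_commit, pr_commit, query, filepath, before_content, after_content, label). As a rough sketch of that structure, one such row can be split back into named fields — the column list is copied from the header and the assumption that the earlier columns contain no bare `|` characters is ours:

```python
# Sketch: split a flat pipe-delimited dataset row into named fields.
# COLUMNS is taken from the schema in the file header; treat it as an
# assumption about the dataset, not an authoritative definition.
COLUMNS = [
    "repo_name", "pr_number", "pr_title", "pr_description", "author",
    "date_created", "date_merged", "previous_commit", "pr_commit",
    "query", "filepath", "before_content", "after_content", "label",
]

def parse_row(line: str) -> dict:
    """Split on '|' into exactly len(COLUMNS) fields.

    maxsplit caps the split so any extra '|' characters fall into the
    final field; the sketch still assumes the earlier content columns
    contain no bare '|' separators of their own.
    """
    parts = [p.strip() for p in line.split("|", maxsplit=len(COLUMNS) - 1)]
    if len(parts) != len(COLUMNS):
        raise ValueError(f"expected {len(COLUMNS)} fields, got {len(parts)}")
    return dict(zip(COLUMNS, parts))

row = parse_row(
    "huggingface/transformers | 20209 | Add gpt-sw3 model to transformers"
    " | desc | ekgren | 2022-11-14 | 2022-12-12 | abc | def | query"
    " | ./README_ja.md | before | after | 1"
)
print(row["filepath"], row["label"])  # → ./README_ja.md 1
```

A real loader would use the dataset's native format rather than string splitting; this only illustrates how the flattened rows map onto the header's columns.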
huggingface/transformers | 20,209 | Add gpt-sw3 model to transformers | This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D... | ekgren | 2022-11-14T14:04:00Z | 2022-12-12T18:12:13Z | b58beebe7286bf53a80f137e0e5cd100ccb77ae2 | 5f94855dc31242d15d755b0d97ec6a0479ee0ea9 | Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models... | ./docs/source/en/_toctree.yml | - sections:
- local: index
title: 🤗 Transformers
- local: quicktour
title: Quick tour
- local: installation
title: Installation
title: Get started
- sections:
- local: pipeline_tutorial
title: Pipelines for inference
- local: autoclass_tutorial
title: Load pretrained instances with an A... | - sections:
- local: index
title: 🤗 Transformers
- local: quicktour
title: Quick tour
- local: installation
title: Installation
title: Get started
- sections:
- local: pipeline_tutorial
title: Pipelines for inference
- local: autoclass_tutorial
title: Load pretrained instances with an A... | 1 |
huggingface/transformers | 20,209 | Add gpt-sw3 model to transformers | This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D... | ekgren | 2022-11-14T14:04:00Z | 2022-12-12T18:12:13Z | b58beebe7286bf53a80f137e0e5cd100ccb77ae2 | 5f94855dc31242d15d755b0d97ec6a0479ee0ea9 | Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models... | ./docs/source/en/index.mdx | <!--Copyright 2020 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | <!--Copyright 2020 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | 1 |
huggingface/transformers | 20,209 | Add gpt-sw3 model to transformers | This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D... | ekgren | 2022-11-14T14:04:00Z | 2022-12-12T18:12:13Z | b58beebe7286bf53a80f137e0e5cd100ccb77ae2 | 5f94855dc31242d15d755b0d97ec6a0479ee0ea9 | Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models... | ./docs/source/en/serialization.mdx | <!--Copyright 2020 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | <!--Copyright 2020 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | 1 |
huggingface/transformers | 20,209 | Add gpt-sw3 model to transformers | This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D... | ekgren | 2022-11-14T14:04:00Z | 2022-12-12T18:12:13Z | b58beebe7286bf53a80f137e0e5cd100ccb77ae2 | 5f94855dc31242d15d755b0d97ec6a0479ee0ea9 | Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models... | ./src/transformers/__init__.py | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | 1 |
huggingface/transformers | 20,209 | Add gpt-sw3 model to transformers | This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D... | ekgren | 2022-11-14T14:04:00Z | 2022-12-12T18:12:13Z | b58beebe7286bf53a80f137e0e5cd100ccb77ae2 | 5f94855dc31242d15d755b0d97ec6a0479ee0ea9 | Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models... | ./src/transformers/models/__init__.py | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | 1 |
huggingface/transformers | 20,209 | Add gpt-sw3 model to transformers | This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D... | ekgren | 2022-11-14T14:04:00Z | 2022-12-12T18:12:13Z | b58beebe7286bf53a80f137e0e5cd100ccb77ae2 | 5f94855dc31242d15d755b0d97ec6a0479ee0ea9 | Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models... | ./src/transformers/models/auto/configuration_auto.py | # coding=utf-8
# Copyright 2018 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | # coding=utf-8
# Copyright 2018 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | 1 |
huggingface/transformers | 20,209 | Add gpt-sw3 model to transformers | This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D... | ekgren | 2022-11-14T14:04:00Z | 2022-12-12T18:12:13Z | b58beebe7286bf53a80f137e0e5cd100ccb77ae2 | 5f94855dc31242d15d755b0d97ec6a0479ee0ea9 | Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models... | ./src/transformers/models/auto/modeling_auto.py | # coding=utf-8
# Copyright 2018 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | # coding=utf-8
# Copyright 2018 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | 1 |
huggingface/transformers | 20,209 | Add gpt-sw3 model to transformers | This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D... | ekgren | 2022-11-14T14:04:00Z | 2022-12-12T18:12:13Z | b58beebe7286bf53a80f137e0e5cd100ccb77ae2 | 5f94855dc31242d15d755b0d97ec6a0479ee0ea9 | Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models... | ./src/transformers/models/auto/modeling_flax_auto.py | # coding=utf-8
# Copyright 2018 The Google Flax Team Authors and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
... | # coding=utf-8
# Copyright 2018 The Google Flax Team Authors and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
... | 1 |
huggingface/transformers | 20,209 | Add gpt-sw3 model to transformers | This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D... | ekgren | 2022-11-14T14:04:00Z | 2022-12-12T18:12:13Z | b58beebe7286bf53a80f137e0e5cd100ccb77ae2 | 5f94855dc31242d15d755b0d97ec6a0479ee0ea9 | Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models... | ./src/transformers/models/auto/modeling_tf_auto.py | # coding=utf-8
# Copyright 2018 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | # coding=utf-8
# Copyright 2018 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | 1 |
huggingface/transformers | 20,209 | Add gpt-sw3 model to transformers | This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D... | ekgren | 2022-11-14T14:04:00Z | 2022-12-12T18:12:13Z | b58beebe7286bf53a80f137e0e5cd100ccb77ae2 | 5f94855dc31242d15d755b0d97ec6a0479ee0ea9 | Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models... | ./src/transformers/models/auto/tokenization_auto.py | # coding=utf-8
# Copyright 2018 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | # coding=utf-8
# Copyright 2018 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | 1 |
huggingface/transformers | 20,209 | Add gpt-sw3 model to transformers | This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D... | ekgren | 2022-11-14T14:04:00Z | 2022-12-12T18:12:13Z | b58beebe7286bf53a80f137e0e5cd100ccb77ae2 | 5f94855dc31242d15d755b0d97ec6a0479ee0ea9 | Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models... | ./src/transformers/utils/dummy_sentencepiece_objects.py | # This file is autogenerated by the command `make fix-copies`, do not edit.
# flake8: noqa
from ..utils import DummyObject, requires_backends
class AlbertTokenizer(metaclass=DummyObject):
_backends = ["sentencepiece"]
def __init__(self, *args, **kwargs):
requires_backends(self, ["sentencepiece"])
c... | # This file is autogenerated by the command `make fix-copies`, do not edit.
# flake8: noqa
from ..utils import DummyObject, requires_backends
class AlbertTokenizer(metaclass=DummyObject):
_backends = ["sentencepiece"]
def __init__(self, *args, **kwargs):
requires_backends(self, ["sentencepiece"])
c... | 1 |
huggingface/transformers | 20,209 | Add gpt-sw3 model to transformers | This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D... | ekgren | 2022-11-14T14:04:00Z | 2022-12-12T18:12:13Z | b58beebe7286bf53a80f137e0e5cd100ccb77ae2 | 5f94855dc31242d15d755b0d97ec6a0479ee0ea9 | Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models... | ./utils/check_repo.py | # coding=utf-8
# Copyright 2020 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | # coding=utf-8
# Copyright 2020 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | 1 |
huggingface/transformers | 20,209 | Add gpt-sw3 model to transformers | This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D... | ekgren | 2022-11-14T14:04:00Z | 2022-12-12T18:12:13Z | b58beebe7286bf53a80f137e0e5cd100ccb77ae2 | 5f94855dc31242d15d755b0d97ec6a0479ee0ea9 | Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models... | ./src/transformers/models/layoutlmv2/image_processing_layoutlmv2.py | # coding=utf-8
# Copyright 2022 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | # coding=utf-8
# Copyright 2022 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | -1 |
huggingface/transformers | 20,209 | Add gpt-sw3 model to transformers | This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D... | ekgren | 2022-11-14T14:04:00Z | 2022-12-12T18:12:13Z | b58beebe7286bf53a80f137e0e5cd100ccb77ae2 | 5f94855dc31242d15d755b0d97ec6a0479ee0ea9 | Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models... | ./src/transformers/models/vit/convert_vit_timm_to_pytorch.py | # coding=utf-8
# Copyright 2021 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | # coding=utf-8
# Copyright 2021 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | -1 |
huggingface/transformers | 20,209 | Add gpt-sw3 model to transformers | This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D... | ekgren | 2022-11-14T14:04:00Z | 2022-12-12T18:12:13Z | b58beebe7286bf53a80f137e0e5cd100ccb77ae2 | 5f94855dc31242d15d755b0d97ec6a0479ee0ea9 | Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models... | ./docs/source/en/model_doc/pegasus_x.mdx | <!--Copyright 2022 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | <!--Copyright 2022 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | -1 |
huggingface/transformers | 20,209 | Add gpt-sw3 model to transformers | This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D... | ekgren | 2022-11-14T14:04:00Z | 2022-12-12T18:12:13Z | b58beebe7286bf53a80f137e0e5cd100ccb77ae2 | 5f94855dc31242d15d755b0d97ec6a0479ee0ea9 | Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models... | ./src/transformers/models/lxmert/tokenization_lxmert.py | # coding=utf-8
# Copyright 2020 The Google AI Team, Stanford University and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/L... | # coding=utf-8
# Copyright 2020 The Google AI Team, Stanford University and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/L... | -1 |
huggingface/transformers | 20,209 | Add gpt-sw3 model to transformers | This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D... | ekgren | 2022-11-14T14:04:00Z | 2022-12-12T18:12:13Z | b58beebe7286bf53a80f137e0e5cd100ccb77ae2 | 5f94855dc31242d15d755b0d97ec6a0479ee0ea9 | Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models... | ./src/transformers/models/beit/convert_beit_unilm_to_pytorch.py | # coding=utf-8
# Copyright 2021 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | # coding=utf-8
# Copyright 2021 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | -1 |
huggingface/transformers | 20,209 | Add gpt-sw3 model to transformers

pr_description: This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
author: ekgren | date_created: 2022-11-14T14:04:00Z | date_merged: 2022-12-12T18:12:13Z
previous_commit: b58beebe7286bf53a80f137e0e5cd100ccb77ae2 | pr_commit: 5f94855dc31242d15d755b0d97ec6a0479ee0ea9
query: Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...

The metadata above is identical for every row in this chunk; the rows differ only in filepath, before_content, after_content, and label. In each row the truncated before_content and after_content snippets are the same file header, so each snippet is shown once.

filepath: ./tests/models/mbart50/__init__.py | before_content: (empty) | after_content: (empty) | label: -1

filepath: ./src/transformers/models/roberta/modeling_flax_roberta.py | label: -1
before_content / after_content (truncated):
# coding=utf-8
# Copyright 2021 The Google Flax Team Authors and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
...

filepath: ./tests/utils/test_hf_argparser.py | label: -1
before_content / after_content (truncated):
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl...

filepath: ./src/transformers/models/blenderbot/modeling_blenderbot.py | label: -1
before_content / after_content (truncated):
# coding=utf-8
# Copyright 2021 The Facebook, Inc. and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/L...

filepath: ./src/transformers/data/datasets/glue.py | label: -1
before_content / after_content (truncated):
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl...

filepath: ./src/transformers/models/electra/__init__.py | label: -1
before_content / after_content (truncated):
# flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi...

filepath: ./src/transformers/models/xlnet/configuration_xlnet.py | label: -1
before_content / after_content (truncated):
# coding=utf-8
# Copyright 2018 Google AI, Google Brain and Carnegie Mellon University Authors and the HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the Lice...

filepath: ./tests/models/cpm/test_tokenization_cpm.py | label: -1
before_content / after_content (truncated):
# coding=utf-8
# Copyright 2018 HuggingFace Inc. team.
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law o...

filepath: ./tests/models/__init__.py | before_content: (empty) | after_content: (empty) | label: -1

filepath: ./docs/source/en/model_doc/m2m_100.mdx | label: -1
before_content / after_content (truncated):
<!--Copyright 2020 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed...

filepath: ./src/transformers/models/rag/modeling_rag.py | label: -1
before_content / after_content (truncated):
# coding=utf-8
# Copyright 2020, The RAG Authors and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r...