Auto-converted to Parquet.

Dataset columns:

- title — string (34–95 chars)
- url — string (39–106 chars)
- date — string (10 chars)
- tags — sequence
- summary — string (66–380 chars)
- content — string (4.93k–25.2k chars)
Google Colab the free GPU/TPU Jupyter Notebook Service
https://www.philschmid.de/google-cola-the-free-gpu-jupyter
2020-02-26
[ "Machine Learning" ]
A short introduction to Google Colab, a free Jupyter notebook service from Google. Learn how to use accelerated hardware like GPUs and TPUs to run your machine learning workloads completely free in the cloud.
## What is Google Colab **Google Colaboratory**, or "Colab" for short, is a free Jupyter notebook service from Google. It requires no setup and runs entirely in the cloud. In Google Colab you can write, execute, save and share your Jupyter notebooks. You access powerful computing resources like TPUs and GPUs all for fre...
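The post is about picking up free accelerators at runtime. A minimal sketch of the usual device check, with the probe logic factored into a pure helper; the real probes (`torch.cuda.is_available()`, the `COLAB_TPU_ADDR` environment variable Colab has used for TPU runtimes) are shown only in comments, and the preference order here is an assumption, not from the post:

```python
# Sketch of a runtime device check for a Colab notebook. The helper is pure
# Python; real probes are commented out so the snippet stays self-contained.

def pick_device(has_tpu: bool, has_cuda: bool) -> str:
    """Prefer TPU, then GPU, then fall back to CPU."""
    if has_tpu:
        return "tpu"
    if has_cuda:
        return "cuda"
    return "cpu"

# In a real notebook you would feed in actual probes, e.g.:
#   import os, torch
#   device = pick_device("COLAB_TPU_ADDR" in os.environ,
#                        torch.cuda.is_available())
print(pick_device(False, True))  # a GPU runtime resolves to "cuda"
```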
Hugging Face Transformers Examples
https://www.philschmid.de/huggingface-transformers-examples
2023-01-26
[ "HuggingFace", "Transformers", "BERT", "PyTorch" ]
Learn how to leverage Hugging Face Transformers to easily fine-tune your models.
Machine learning and the adoption of the Transformer architecture are rapidly growing and will revolutionize the way we live and work. From self-driving cars to personalized medicine, the applications of [Transformers](https...
BERT Text Classification in a different language
https://www.philschmid.de/bert-text-classification-in-a-different-language
2020-05-22
[ "NLP", "Bert", "HuggingFace" ]
Build a non-English (German) BERT multi-class text classification model with HuggingFace and Simple Transformers.
Currently, we have 7.5 billion people living in the world across around 200 nations. Only [1.2 billion of them are native English speakers](https://en.wikipedia.org/wiki/List_of_countries_by_English-speaking_population). This leads to a lot of unstructured non-English textual data. Most of the tutorials and blog po...
Semantic Segmentation with Hugging Face's Transformers & Amazon SageMaker
https://www.philschmid.de/image-segmentation-sagemaker
2022-05-03
[ "AWS", "SegFormer", "Vision", "Sagemaker" ]
Learn how to do image segmentation with Hugging Face Transformers, SegFormer and Amazon SageMaker.
Transformer models are changing the world of machine learning, starting with natural language processing, and now, with audio and computer vision. Hugging Face's mission is to democratize good machine learning and give anyone the opportunity to use these new state-of-the-art machine learning models. Together with Am...
Fine-tune FLAN-T5 for chat & dialogue summarization
https://www.philschmid.de/fine-tune-flan-t5
2022-12-27
[ "T5", "Summarization", "HuggingFace", "Chat" ]
Learn how to fine-tune Google's FLAN-T5 for chat & dialogue summarization using Hugging Face Transformers.
In this blog, you will learn how to fine-tune [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) for chat & dialogue summarization using Hugging Face Transformers. If you already know T5, FLAN-T5 is just better at everything. For the same number of parameters, these models have been fine-tuned on more th...
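Fine-tuning an instruction-tuned model like FLAN-T5 for summarization starts with wrapping each dialogue in a natural-language instruction before tokenization. A minimal sketch of that step; the exact prompt wording below is illustrative, not the one from the post:

```python
# Sketch: build the text that actually gets tokenized for a seq2seq
# summarization model. Only the prompt template is model-facing here.

def build_summarization_prompt(dialogue: str) -> str:
    return f"Summarize the following conversation.\n\n{dialogue}\n\nSummary:"

sample = "Alice: Lunch at noon?\nBob: Works for me."
prompt = build_summarization_prompt(sample)
# The prompt (not the raw dialogue) is then tokenized, e.g. with the HF
# tokenizer: tokenizer(prompt, truncation=True, return_tensors="pt")
print(prompt)
```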
Getting started with Pytorch 2.0 and Hugging Face Transformers
https://www.philschmid.de/getting-started-pytorch-2-0-transformers
2023-03-16
[ "Pytorch", "BERT", "HuggingFace", "Optimization" ]
Learn how to get started with Pytorch 2.0 and Hugging Face Transformers and reduce your training time by up to 2x.
On December 2, 2022, the PyTorch Team announced [PyTorch 2.0](https://pytorch.org/get-started/pytorch-2.0/) at the PyTorch Conference, focused on better performance, being faster, more pythonic, and staying as dynamic as before. This blog post explains how to get started with PyTorch 2.0 and Hugging Face Transformers ...
Deploy T5 11B for inference for less than $500
https://www.philschmid.de/deploy-t5-11b
2022-10-25
[ "HuggingFace", "Transformers", "Endpoints", "bnb" ]
Learn how to deploy T5 11B on a single GPU using Hugging Face Inference Endpoints.
This blog will teach you how to deploy [T5 11B](https://huggingface.co/t5-11b) for inference using [Hugging Face Inference Endpoints](https://huggingface.co/inference-endpoints). The T5 model was presented in [Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer](https://arxiv.org/pdf/1910....
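The "bnb" tag points at the trick that makes this fit: int8 quantization with bitsandbytes cuts weight memory 4x versus fp32. A back-of-the-envelope sketch of the weight footprint (my own arithmetic, not figures quoted from the post; it ignores activations and other runtime overhead):

```python
# Approximate GPU memory needed for model weights alone, by precision.

def weight_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Weights only -- excludes activations, optimizer state, KV caches."""
    return n_params * bytes_per_param / 1e9

N = 11e9                           # T5-11B parameter count
print(weight_memory_gb(N, 4))      # fp32 -> 44.0 GB, too big for one GPU
print(weight_memory_gb(N, 2))      # fp16 -> 22.0 GB
print(weight_memory_gb(N, 1))      # int8 ->  11.0 GB, fits a single GPU
```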
Scalable, Secure Hugging Face Transformer Endpoints with Amazon SageMaker, AWS Lambda, and CDK
https://www.philschmid.de/huggingface-transformers-cdk-sagemaker-lambda
2021-10-06
[ "AWS", "BERT", "HuggingFace", "Sagemaker" ]
Deploy Hugging Face Transformers to Amazon SageMaker and create an API for the Endpoint using AWS Lambda, API Gateway and AWS CDK.
Researchers, Data Scientists, and Machine Learning Engineers are excellent at creating models that achieve new state-of-the-art performance on different tasks, but deploying those models in an accessible, scalable, and secure way is more of an art than a science. Commonly, those skills are found in software engineering and Dev...
Deploy FLAN-UL2 20B on Amazon SageMaker
https://www.philschmid.de/deploy-flan-ul2-sagemaker
2023-03-20
[ "GenerativeAI", "SageMaker", "HuggingFace", "Inference" ]
Learn how to deploy Google's FLAN-UL2 20B on Amazon SageMaker for inference.
Welcome to this Amazon SageMaker guide on how to deploy [FLAN-UL2 20B](https://huggingface.co/google/flan-ul2) on Amazon SageMaker for inference. We will deploy [google/flan-ul2](https://huggingface.co/google/flan-ul2) to Amazon SageMaker for real-time inference using the Hugging Face Inference Deep Learning Container....
Serverless Inference with Hugging Face's Transformers, DistilBERT and Amazon SageMaker
https://www.philschmid.de/sagemaker-serverless-huggingface-distilbert
2022-04-21
[ "HuggingFace", "AWS", "BERT", "Serverless" ]
Learn how to deploy a Transformer model like BERT to Amazon SageMaker Serverless using the Python SageMaker SDK.
[Notebook: serverless_inference](https://github.com/huggingface/notebooks/blob/main/sagemaker/19_serverless_inference/sagemaker-notebook.ipynb) Welcome to this getting started guide, where you will learn how to use the Hugging Face Inference DLCs and the Amazon SageMaker Python SDK to create a [Serverless Inference](https://docs.aws...
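A serverless endpoint is configured with essentially two knobs. A sketch of them as a plain dict, since an AWS call cannot run here; in the real SDK they map to `sagemaker.serverless.ServerlessInferenceConfig(memory_size_in_mb=..., max_concurrency=...)`, and the allowed memory sizes below (1–6 GB in 1 GB steps) reflect SageMaker's fixed tiers:

```python
# Sketch of the configuration a SageMaker Serverless endpoint exposes.
# Memory must be one of the fixed sizes SageMaker supports.

VALID_MEMORY_MB = {1024, 2048, 3072, 4096, 5120, 6144}

def serverless_config(memory_size_in_mb: int = 4096,
                      max_concurrency: int = 10) -> dict:
    if memory_size_in_mb not in VALID_MEMORY_MB:
        raise ValueError(f"unsupported memory size: {memory_size_in_mb}")
    return {"MemorySizeInMB": memory_size_in_mb,
            "MaxConcurrency": max_concurrency}

print(serverless_config())
```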
Efficient Large Language Model training with LoRA and Hugging Face
https://www.philschmid.de/fine-tune-flan-t5-peft
2023-03-23
[ "GenerativeAI", "LoRA", "HuggingFace", "Training" ]
Learn how to fine-tune Google's FLAN-T5 XXL on a Single GPU using LoRA And Hugging Face Transformers.
In this blog, we are going to show you how to apply [Low-Rank Adaptation of Large Language Models (LoRA)](https://arxiv.org/abs/2106.09685) to fine-tune FLAN-T5 XXL (11 billion parameters) on a single GPU. We are going to leverage Hugging Face [Transformers](https://huggingface.co/docs/transformers/index), [Accelerate]...
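Why LoRA makes an 11B model trainable on one GPU comes down to parameter counting: instead of updating a full d×k weight matrix, LoRA trains two low-rank factors A (d×r) and B (r×k). A back-of-the-envelope sketch (the layer size and rank below are illustrative, not the post's exact config):

```python
# Trainable-parameter count: full fine-tuning vs. a LoRA adapter on the
# same d x k weight matrix. LoRA trains r*(d+k) params instead of d*k.

def full_params(d: int, k: int) -> int:
    return d * k

def lora_params(d: int, k: int, r: int) -> int:
    return r * (d + k)

# e.g. a 4096 x 4096 attention projection with rank r=8:
d = k = 4096
print(full_params(d, k))     # 16777216 trainable params without LoRA
print(lora_params(d, k, 8))  # 65536 with LoRA -- a 256x reduction
```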
Setup Deep Learning environment for Hugging Face Transformers with Habana Gaudi on AWS
https://www.philschmid.de/getting-started-habana-gaudi
2022-06-14
[ "BERT", "Habana", "HuggingFace", "Optimum" ]
Learn how to setup a Deep Learning Environment for Hugging Face Transformers with Habana Gaudi on AWS using the DL1 instance type.
This blog contains instructions for how to setup a Deep Learning Environment for Habana Gaudi on AWS using the DL1 instance type and Hugging Face libraries like [transformers](https://huggingface.co/docs/transformers/index), [optimum](https://huggingface.co/docs/optimum/index), [datasets](https://huggingface.co/docs/da...
Workshop: Enterprise-Scale NLP with Hugging Face & Amazon SageMaker
https://www.philschmid.de/hugginface-sagemaker-workshop
2021-12-29
[ "HuggingFace", "AWS", "SageMaker" ]
In October and November, we held a workshop series on “Enterprise-Scale NLP with Hugging Face & Amazon SageMaker”. The workshop series consisted of three parts: Getting Started, Going to Production & MLOps.
Earlier this year we announced a strategic collaboration with Amazon to make it easier for companies to use Hugging Face Transformers in Amazon SageMaker, and ship cutting-edge Machine Learning features faster. We introduced new Hugging Face Deep Learning Containers (DLCs) to train and deploy Hugging Face Transformers ...
Fine-tune a non-English GPT-2 Model with Huggingface
https://www.philschmid.de/fine-tune-a-non-english-gpt-2-model-with-huggingface
2020-09-06
[ "NLP", "GPT-2", "Huggingface" ]
Fine-tune a non-English, German GPT-2 model with Huggingface on German recipes, using their Trainer class and Pipeline objects.
Unless you’re living under a rock, you have probably heard about [OpenAI](https://openai.com/)'s GPT-3 language model. You might also have seen all the crazy demos, where the model writes `JSX` or `HTML` code, or shows its capabilities in the area of zero-shot / few-shot learning. [Simon O'Regan](https://twitter.com/Simon_O_Re...
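Before the Trainer sees the German recipes, causal-LM fine-tuning prepares them the standard way: concatenate the tokenized texts and split them into fixed-length blocks. A self-contained sketch of that chunking step (the block size is illustrative; the Hugging Face examples commonly drop the trailing remainder, as done here):

```python
# Concatenate-and-chunk data prep for causal language modeling: turn one long
# token stream into equal-length training blocks, dropping the remainder.

def group_into_blocks(token_ids: list, block_size: int) -> list:
    total = (len(token_ids) // block_size) * block_size
    return [token_ids[i:i + block_size] for i in range(0, total, block_size)]

ids = list(range(10))                # stand-in for concatenated token ids
print(group_into_blocks(ids, 4))     # [[0, 1, 2, 3], [4, 5, 6, 7]] -- 8, 9 dropped
```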
An Amazon SageMaker Inference comparison with Hugging Face Transformers
https://www.philschmid.de/sagemaker-inference-comparison
2022-05-17
[ "HuggingFace", "AWS", "BERT", "SageMaker" ]
Learn about the different existing Amazon SageMaker Inference options and how to use them.
_"Amazon SageMaker is a fully managed machine learning service. With SageMaker, data scientists and developers can quickly and easily build and train machine learning models, and then directly deploy them into a production-ready hosted environment."_ - [AWS Documentation](https://docs.aws.amazon.com/sagemaker/latest/dg...
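SageMaker's inference options (real-time, serverless, asynchronous, batch) differ mainly in traffic pattern, payload size, and latency needs, so the comparison boils down to a small decision table. A sketch encoding one such rule of thumb; the thresholds and ordering are my own illustration, not guidance quoted from the post:

```python
# Rule-of-thumb chooser over SageMaker's four inference options.

def pick_inference_option(traffic_is_spiky: bool,
                          payload_mb: float,
                          needs_realtime: bool) -> str:
    if not needs_realtime:
        return "batch-transform"       # offline scoring of whole datasets
    if payload_mb > 6:
        return "async-inference"       # large payloads, queued requests
    if traffic_is_spiky:
        return "serverless-inference"  # scale-to-zero, pay per request
    return "real-time-endpoint"        # steady traffic, lowest latency

print(pick_inference_option(traffic_is_spiky=True, payload_mb=1,
                            needs_realtime=True))  # serverless-inference
```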

Dataset Card for "philschmid-de-blog"

More Information needed
