sdiazlor committed on
Commit
cd58339
·
verified ·
1 Parent(s): 3f8870d

Update README.md

Files changed (1)
  1. README.md +54 -20
README.md CHANGED
@@ -8,7 +8,7 @@ pinned: false
8
  ---
9
  <!-- header start -->
10
  <!-- 200823 -->
11
- <a href="https://www.pruna.ai/" target="_blank" rel="noopener noreferrer">
12
  <img src="https://github.com/PrunaAI/pruna/raw/main/docs/assets/images/logo.png"
13
  alt="PrunaAI"
14
  style="width: 50%; min-width: 400px; display: block; margin: 0;">
@@ -17,32 +17,32 @@ pinned: false
17
 
18
  ----
19
 
20
- # 🌍 Join the Pruna AI community!
21
- [![Twitter](https://img.shields.io/twitter/follow/PrunaAI?style=social)](https://twitter.com/PrunaAI)
22
- [![GitHub](https://img.shields.io/github/stars/prunaai/pruna)](https://github.com/PrunaAI/pruna)
23
- [![LinkedIn](https://img.shields.io/badge/LinkedIn-Connect-blue)](https://www.linkedin.com/company/93832878/admin/feed/posts/?feedType=following)
24
- [![Discord](https://img.shields.io/badge/Discord-Join%20Us-blue?style=social&logo=discord)](https://discord.com/invite/JFQmtFKCjd)
25
- [![Reddit](https://img.shields.io/reddit/subreddit-subscribers/PrunaAI?style=social)](https://www.reddit.com/r/PrunaAI/)
26
 
27
  ----
28
 
29
- # πŸ’œ Simply make AI models faster, cheaper, smaller, greener!
30
- [Pruna AI](https://www.pruna.ai/) makes AI models faster, cheaper, smaller, greener with the `pruna` package.
31
- - It supports **various models including CV, NLP, audio, graphs for predictive and generative AI**.
32
- - It supports **various hardware including GPU, CPU, Edge**.
33
- - It supports **various compression algortihms including quantization, pruning, distillation, caching, recovery, compilation** that can be **combined together**.
34
- - You can either **play on your own** with smash/compression configurations or **let the smashing/compressing agent** find the optimal configuration **[Pro]**.
35
  - You can **evaluate reliable quality and efficiency metrics** of your base vs smashed/compressed models.
36
- You can set it up in minutes and compress your first models in few lines of code!
 
37
 
38
  ----
39
 
40
- # ⏩ How to get started?
41
- You can smash your own models by installing pruna with pip:
42
- ```
 
43
  pip install pruna
44
  ```
45
- or directly [from source](https://github.com/PrunaAI/pruna).
46
 
47
  You can start with simple notebooks to experience efficiency gains with:
48
 
@@ -52,10 +52,44 @@ You can start with simple notebooks to experience efficiency gains with:
52
  | **Making your LLMs 4x smaller** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/llms.ipynb) |
53
  | **Smash your model with a CPU only** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/cv_cpu.ipynb) |
54
  | **Transcribe 2 hours of audio in less than 2 minutes with Whisper** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/asr_tutorial.ipynb) |
55
- | **100% faster Whisper Transcription** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/asr_whisper.ipynb) |
56
  | **Run your Flux model without an A100** | ⏩ [Smash for free](https://githubtocolab.com/PrunaAI/pruna/blob/1d68f74c132bd4045f2af55bb1e5c03bf2dde6a9/docs/tutorials/flux_small.ipynb) |
57
  | **x2 smaller Sana in action** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/sana_diffusers_int8.ipynb) |
58
 
59
- For more details about installation, free tutorials and Pruna Pro tutorials, you can check the [Pruna AI documentation](https://docs.pruna.ai/).
60
 
61
  ----
8
  ---
9
  <!-- header start -->
10
  <!-- 200823 -->
11
+ <a href="https://www.pruna.ai/?utm_source=huggingface&utm_medium=org_card&utm_campaign=hf_traffic" target="_blank" rel="noopener noreferrer">
12
  <img src="https://github.com/PrunaAI/pruna/raw/main/docs/assets/images/logo.png"
13
  alt="PrunaAI"
14
  style="width: 50%; min-width: 400px; display: block; margin: 0;">
 
17
 
18
  ----
19
 
20
+ ## 🌍 Join the Pruna AI community!
21
+ [![GitHub](https://img.shields.io/badge/GitHub-PrunaAI-9334E9?style=plastic&logo=github&logoColor=white)](https://github.com/PrunaAI/pruna) &nbsp;
22
+ [![Twitter/X](https://img.shields.io/badge/Twitter%2FX-@PrunaAI-9334E9?style=plastic&logo=x&logoColor=white)](https://twitter.com/PrunaAI) &nbsp;
23
+ [![LinkedIn](https://img.shields.io/badge/LinkedIn-PrunaAI-9334E9?style=plastic&logo=linkedin&logoColor=white)](https://www.linkedin.com/company/pruna-ai) &nbsp;
24
+ [![Discord](https://img.shields.io/badge/Discord-Join%20Us-9334E9?style=plastic&logo=discord&logoColor=white)](https://discord.com/invite/JFQmtFKCjd)
 
25
 
26
  ----
27
 
28
+ ## πŸ’œ Make AI models faster, cheaper, smaller, greener!
29
+ [Pruna AI](https://www.pruna.ai/) makes AI models faster, cheaper, smaller, and greener with the `pruna` package.
30
+ - It supports **various models, including CV, NLP, audio, and graphs for predictive and generative AI**.
31
+ - It supports **various hardware, including GPU, CPU, and edge devices**.
32
+ - It supports **various compression algorithms**, including quantization, pruning, distillation, caching, recovery, compilation, and factorization.
33
+ - You can **combine algorithms** to find the optimal configuration and smash/compress your model.
34
  - You can **evaluate reliable quality and efficiency metrics** of your base vs smashed/compressed models.
35
+
36
+ **Set it up in minutes and compress your first models in a few lines of code!**
37
 
38
  ----
39
 
40
+ ## ⏩ How to get started?
41
+ You can smash your own models by installing [pruna](https://github.com/PrunaAI/pruna) with pip:
42
+
43
+ ```bash
44
  pip install pruna
45
  ```
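Once installed, smashing a model takes only a few lines. Below is a minimal sketch assuming the `SmashConfig`/`smash` API from the Pruna documentation; the model checkpoint and the `cacher` setting are illustrative placeholders, not a prescribed configuration:

```py
# Illustrative sketch — assumes pruna's SmashConfig/smash API and a
# diffusers pipeline; swap in your own model and compression algorithms.
from diffusers import StableDiffusionPipeline
from pruna import SmashConfig, smash

# Load any supported base model (example checkpoint, not a requirement).
pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")

# Pick compression algorithms via the config; "deepcache" is one example.
smash_config = SmashConfig()
smash_config["cacher"] = "deepcache"

# Compress ("smash") the model, then use it like the original pipeline.
smashed_pipe = smash(model=pipe, smash_config=smash_config)
image = smashed_pipe("a photo of a cat").images[0]
```

The smashed model keeps the original pipeline interface, so existing inference code should work unchanged after compression.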
 
46
 
47
  You can start with simple notebooks to experience efficiency gains with:
48
 
 
52
  | **Making your LLMs 4x smaller** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/llms.ipynb) |
53
  | **Smash your model with a CPU only** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/cv_cpu.ipynb) |
54
  | **Transcribe 2 hours of audio in less than 2 minutes with Whisper** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/asr_tutorial.ipynb) |
55
+ | **100% faster Whisper Transcription** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/asr_tutorial.ipynb) |
56
  | **Run your Flux model without an A100** | ⏩ [Smash for free](https://githubtocolab.com/PrunaAI/pruna/blob/1d68f74c132bd4045f2af55bb1e5c03bf2dde6a9/docs/tutorials/flux_small.ipynb) |
57
  | **x2 smaller Sana in action** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/sana_diffusers_int8.ipynb) |
58
 
59
+ For more details on installation and free tutorials, check the [Pruna AI documentation](https://docs.pruna.ai/).
60
 
61
  ----
62
+
63
+ ## ✨ Test our endpoints
64
+
65
+ Want to use our optimized models right away? Try them [via our API](https://www.pruna.ai/all-models) for fast, easy access to Pruna-powered inference.
66
+
67
+ <style>
68
+ .model-button {
69
+ display: inline-flex;
70
+ flex-direction: row;
71
+ justify-content: center;
72
+ align-items: center;
73
+ gap: 8px;
74
+
75
+ padding: 8px 20px;
76
+ border: none;
77
+ border-radius: 8px;
78
+
79
+ background: #9334e9;
80
+ color: #ffffff;
81
+
82
+ font-size: 14px;
83
+ font-weight: 400;
84
+ line-height: 1;
85
+ text-decoration: none;
86
+ white-space: nowrap;
87
+ cursor: pointer;
88
+
89
+ box-sizing: border-box;
90
+ overflow: visible;
91
+ opacity: 1;
92
+ }
93
+
94
+ </style>
95
+ <a href="https://dashboard.pruna.ai/login?utm_source=huggingface&utm_medium=org_card&utm_campaign=hf_traffic" class="model-button">Try our models</a>