---
title: AutogenMultiAgent
emoji: π
colorFrom: pink
colorTo: gray
sdk: streamlit
sdk_version: 1.36.0
app_file: app.py
pinned: false
license: apache-2.0
---
# AutogenMultiAgent
AutoGen is an open-source programming framework for building AI agents and facilitating cooperation among multiple agents to solve tasks. AutoGen aims to provide an easy-to-use and flexible framework for accelerating development and research on agentic AI, like PyTorch for Deep Learning. It offers features such as agents that can converse with other agents, LLM and tool use support, autonomous and human-in-the-loop workflows, and multi-agent conversation patterns.
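The multi-agent conversation pattern described above can be sketched with a toy turn-taking loop. This is plain Python, not the AutoGen API: `run_chat` and the two stand-in agents are hypothetical names used only to illustrate how messages pass back and forth between agents.

```python
# Toy sketch of the multi-agent conversation pattern (NOT the AutoGen API).
# Each "agent" is just a callable that maps an incoming message to a reply.

def run_chat(agents, opening, max_turns=4):
    """Pass a message between agents in turn, collecting the transcript."""
    transcript = [opening]
    msg = opening
    for turn in range(max_turns):
        speaker = agents[turn % len(agents)]
        msg = speaker(msg)
        transcript.append(msg)
    return transcript

def assistant(msg):
    return f"assistant: handled '{msg}'"

def critic(msg):
    return f"critic: reviewed '{msg}'"

log = run_chat([assistant, critic], "write a sorting function")
```

In real AutoGen, the callables would be LLM-backed agents (e.g., an `AssistantAgent` paired with a `UserProxyAgent`) and termination is controlled by the framework rather than a fixed turn count.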
## AutoGen Overview

## Code execution
## RAG Chat

Qdrant is a high-performance vector search engine/database.
This notebook demonstrates the use of `QdrantRetrieveUserProxyAgent` for RAG, based on agentchat_RetrieveChat.ipynb.
RetrieveChat is a conversational system for retrieval-augmented code generation and question answering. In this notebook, we demonstrate how to use RetrieveChat to generate code and answer questions based on customized documentation that is not present in the LLM's training data. RetrieveChat uses the `RetrieveAssistantAgent` and `QdrantRetrieveUserProxyAgent`, which work much like the `AssistantAgent` and `UserProxyAgent` in other notebooks (e.g., Automated Task Solving with Code Generation, Execution & Debugging).
:::info Requirements
Some extra dependencies are needed for this notebook, which can be installed via pip:
```bash
pip install "pyautogen[retrievechat-qdrant]" "flaml[automl]"
```
For more information, please refer to the [installation guide](/docs/installation/).
:::
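The retrieval step behind RAG can be sketched in plain Python. The toy bag-of-words scorer below stands in for Qdrant's vector similarity search; `retrieve` and the sample documents are illustrative names, not part of the pyautogen or Qdrant APIs.

```python
# Minimal sketch of the retrieval step in RAG: score documents against a
# query and hand the best matches to the LLM as context. Word overlap is a
# stand-in for the embedding similarity a real vector store computes.

def retrieve(query, docs, top_k=1):
    q = set(query.lower().split())
    scored = sorted(docs,
                    key=lambda d: len(q & set(d.lower().split())),
                    reverse=True)
    return scored[:top_k]

docs = [
    "FLAML is a fast library for automated machine learning",
    "Qdrant is a vector search engine",
    "Streamlit builds data apps in Python",
]
context = retrieve("what is a vector search engine", docs)
```

In RetrieveChat, this lookup happens inside `QdrantRetrieveUserProxyAgent`, which embeds the query, searches the Qdrant collection, and injects the retrieved chunks into the conversation.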
## Groupchat with Llamaindex agents
LlamaIndex agents can apply planning strategies to answer user questions, and they are straightforward to integrate into AutoGen.
Requirements:

```bash
pip install pyautogen llama-index llama-index-tools-wikipedia llama-index-readers-wikipedia wikipedia
```
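A group chat boils down to picking the next speaker and appending its reply to a shared history. The sketch below is a toy round-robin version in plain Python, not AutoGen's `GroupChat` API; `group_chat` and the named agents are hypothetical stand-ins.

```python
# Toy sketch of round-robin speaker selection in a group chat
# (NOT the AutoGen GroupChat API). Each named agent takes a turn
# replying to the most recent message in the shared history.

def group_chat(agents, message, rounds=3):
    history = [("user", message)]
    names = list(agents)
    for i in range(rounds):
        name = names[i % len(names)]
        reply = agents[name](history[-1][1])
        history.append((name, reply))
    return history

agents = {
    "planner": lambda m: f"plan for: {m}",
    "wiki": lambda m: f"looked up: {m}",
}
history = group_chat(agents, "Hayao Miyazaki films")
```

AutoGen's `GroupChatManager` generalizes this: speaker selection can be round-robin, random, or LLM-driven, and in this use case one of the participants wraps a LlamaIndex agent with its Wikipedia tools.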
## Defaults
### LLM_OPTIONS
Groq
### USECASE_OPTIONS
#### Basic Example

#### Teachable Agent
[Teachable Agent](https://microsoft.github.io/autogen/0.2/docs/notebooks/agentchat_teachability)
Prompt 1: who is Sachin Tiwari
Prompt 2: Sachin is from Jharkhand, working in the UK
Prompt 3: who is Sachin

#### Chat with CAG
Prompt 1: what is dotnet
Prompt 2: what is python
Prompt 3: what is python
Prompt 4: what is dotnet
Prompt 5: what is python
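The prompt sequence above deliberately repeats questions: with cache-augmented generation, repeated prompts should be served from a cache instead of reaching the model again. A minimal sketch, where `fake_llm` and `ask` are hypothetical stand-ins for the real model call:

```python
# Hedged sketch of the caching pattern the prompts above exercise.
# `fake_llm` stands in for a real LLM call; `calls` counts how many
# prompts actually reach the model.

cache = {}
calls = 0

def fake_llm(prompt):
    global calls
    calls += 1
    return f"answer to: {prompt}"

def ask(prompt):
    if prompt not in cache:
        cache[prompt] = fake_llm(prompt)
    return cache[prompt]

for p in ["what is dotnet", "what is python", "what is python",
          "what is dotnet", "what is python"]:
    ask(p)
# Five prompts, but only the two distinct ones reach the model.
```

Running the five prompts above thus triggers only two model calls; prompts 3-5 are cache hits.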

#### MultiAgent Chat
Prompt: As a user, create an ASP.NET form with a Razor view page for a health insurance feedback page

#### MultiAgent Code Execution


#### RAG Chat
Prompt: Explain
Docs or file path: https://github.com/microsoft/autogen/blob/main/python/samples/agentchat_chainlit/README.md

#### With LLamaIndex Tool
Prompt: What can I find in Tokyo related to Hayao Miyazaki and his movies like Spirited Away?

#### AgentChat Sql Spider
### GROQ_MODEL_OPTIONS
mixtral-8x7b-32768
llama3-8b-8192
llama3-70b-8192
gemma-7b-it
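Since Groq exposes an OpenAI-compatible endpoint, the models above can be wired into AutoGen via the standard `config_list` format. A minimal sketch, assuming `GROQ_API_KEY` is set in the environment (the exact keys mirror AutoGen's OpenAI-style configuration):

```python
import os

# Hedged sketch of an AutoGen-style llm_config pointing at Groq's
# OpenAI-compatible endpoint; model names come from the list above.
config_list = [
    {
        "model": "llama3-70b-8192",
        "api_key": os.environ.get("GROQ_API_KEY", ""),
        "base_url": "https://api.groq.com/openai/v1",
    }
]
llm_config = {"config_list": config_list, "temperature": 0}
```

Swapping models is then a one-line change to the `"model"` field (e.g., `mixtral-8x7b-32768`).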
## Important links
https://microsoft.github.io/autogen/docs/notebooks
https://microsoft.github.io/autogen/docs/tutorial/code-executors
https://microsoft.github.io/autogen/docs/tutorial/tool-use