generatorText = """This microservice integrates with the Retriever to answer the user query in ChaBo RAG workflows.

# ChaBo Generator on Hugging Face Spaces
The [ChaBo_Generator](https://huggingface.co/spaces/GIZ/eudr_chabo_generator/blob/main/README.md) Space hosts \
a Generator microservice for answering user queries. It is purely an infrastructural component and does \
not serve any user application through its user interface, as it is consumed in the ChaBo workflow through the Orchestrator.
## ChaBo Generator - MCP Server
A language model-based generation service designed for ChatFed RAG \
(Retrieval-Augmented Generation) pipelines. This module serves as an \
**MCP (Model Context Protocol) server** that generates contextual responses \
using configurable LLM providers with support for retrieval result processing.
**API Endpoint**: a single API that provides context-aware text generation using \
configurable LLM providers, once configured with the required API credentials.
### api_name: /generate
Parameters:
- `query` (str, required): The question or query to be answered
- `context` (str|list, required): Context for answering - can be plain text or a list of retrieval result dictionaries
Returns: String containing the generated answer based on the provided context and query.
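Since `context` can be either plain text or a list of retrieval result dictionaries, callers may want to flatten it into one string before sending. A minimal sketch, assuming each retrieval dict exposes a `content` key (that key name is an assumption, not part of the documented API):

```python
def flatten_context(context):
    # Normalize `context` (str, or list of retrieval dicts) into one string.
    if isinstance(context, str):
        return context
    parts = []
    for item in context:
        if isinstance(item, dict):
            # The "content" key is an assumed field name; fall back to the
            # whole dict's string form if it is missing.
            parts.append(str(item.get("content", item)))
        else:
            parts.append(str(item))
    # Separate chunks with a blank line.
    return (chr(10) * 2).join(parts)

print(flatten_context([{"content": "Finding A"}, {"content": "Finding B"}]))
```

The flattened string can then be passed directly as the `context` parameter.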
**How to connect**:
```python
from gradio_client import Client

client = Client("ENTER CONTAINER URL / SPACE ID")
result = client.predict(
    query="What are the key findings?",
    context="Your relevant documents or context here...",
    api_name="/generate"
)
print(result)
```
#### Configuration
LLM Provider Configuration:
1. Set your preferred inference provider in `params.cfg`
2. Configure the model and generation parameters
3. Set the required API key environment variable
4. [Optional] Adjust temperature and max_tokens settings
5. Run the app:
```bash
docker build -t chatfed-generator .
docker run -p 7860:7860 chatfed-generator
```
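The layout of `params.cfg` is not reproduced here; as a sketch, assuming an INI-style file with a `[generator]` section holding `provider`, `model`, `temperature`, and `max_tokens` keys (all of these names are assumptions), it could be read with `configparser`:

```python
import configparser

# Hypothetical params.cfg contents -- section and key names are assumptions.
SAMPLE_CFG = '''
[generator]
provider = openai
model = gpt-4o-mini
temperature = 0.2
max_tokens = 512
'''

config = configparser.ConfigParser()
config.read_string(SAMPLE_CFG)

provider = config.get("generator", "provider")
temperature = config.getfloat("generator", "temperature")
max_tokens = config.getint("generator", "max_tokens")
print(provider, temperature, max_tokens)
```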
#### Environment Variables Required
Make sure to set the appropriate environment variables:
- OpenAI: `OPENAI_API_KEY`
- Anthropic: `ANTHROPIC_API_KEY`
- Cohere: `COHERE_API_KEY`
- HuggingFace: `HF_TOKEN`
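A quick preflight check, mapping each provider to its required variable from the list above (the helper itself is illustrative, not part of the service):

```python
import os

# Provider -> required environment variable, per the list above.
REQUIRED_ENV = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "cohere": "COHERE_API_KEY",
    "huggingface": "HF_TOKEN",
}

def missing_key(provider):
    # Return the name of the unset env var for `provider`, or None if set.
    var = REQUIRED_ENV[provider.lower()]
    return None if os.environ.get(var) else var
```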
For more info on the Generator and its code base, visit the following links:
- ChaBo_Generator: [**ReadMe**](https://huggingface.co/spaces/GIZ/eudr_chabo_generator/blob/main/README.md)
- ChaBo_Generator: [**Codebase**](https://huggingface.co/spaces/GIZ/eudr_chabo_generator/tree/main)"""