HimanshuGoyal2004 committed on
Commit
d45e139
·
1 Parent(s): 4d0fc83
Files changed (2)
  1. README.md +75 -1
  2. requirements.txt +3 -1
README.md CHANGED
@@ -8,4 +8,78 @@ pinned: false
short_description: Document Q&A 🤖📃
---

- Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
# Document Q&A with LLMs, Docker, and Hugging Face

This project is a web-based application that lets you chat with your documents. Upload a document (PDF, DOCX, TXT, etc.), and the application processes it to answer questions based on its content.

The application is built with:

* **Backend:** Python, LlamaIndex, Groq, and Cohere.
* **Frontend:** Gradio for the user interface.
* **Containerization:** Docker for easy deployment.

## How it Works

1. **Document Parsing:** When you upload a document, it is parsed with LlamaParse to extract its text content.
2. **Embeddings:** The extracted text is split into chunks and converted into vector embeddings using Cohere's embedding model.
3. **LLM Interaction:** When you ask a question, the application retrieves the chunks most relevant to it and passes them to the Groq API (with Llama 3) to generate a grounded response.

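The retrieve-then-answer idea behind steps 2 and 3 can be sketched with a toy example. This is an illustration only, not the app's actual code: the real pipeline uses Cohere embeddings and Groq via LlamaIndex, while here a hypothetical bag-of-words "embedding" keeps the sketch self-contained.

```python
# Toy sketch of retrieval (steps 2-3). The real app embeds chunks with
# Cohere and sends the best matches to Groq; here a bag-of-words vector
# stands in for a real embedding model so the example runs on its own.
import math
import re
from collections import Counter

def embed(text):
    """Hypothetical stand-in for an embedding model: term frequencies."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(count * b.get(term, 0) for term, count in a.items())
    norm_a = math.sqrt(sum(c * c for c in a.values()))
    norm_b = math.sqrt(sum(c * c for c in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def most_relevant_chunk(chunks, question):
    """Return the chunk whose vector is closest to the question's."""
    question_vec = embed(question)
    return max(chunks, key=lambda chunk: cosine(embed(chunk), question_vec))

chunks = [
    "The invoice total is 420 dollars, due on March 1.",
    "Shipping is handled by the logistics department.",
]
best = most_relevant_chunk(chunks, "When is the invoice due?")
# `best` is the first chunk; in the real app it would be sent to the LLM
# together with the question so the answer stays grounded in the document.
```
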
## Running the Application with Docker

### Prerequisites

* Docker installed on your machine.
* API keys for:
  * LlamaParse (`LLAMA_CLOUD_API_KEY`)
  * Groq (`GROQ_API_KEY`)
  * Cohere (`COHERE_API_KEY`)

### Steps

1. **Build the Docker Image:**

   ```bash
   docker build -t document-qa .
   ```

2. **Run the Docker Container:**

   Replace `your_llama_cloud_key`, `your_groq_key`, and `your_cohere_key` with your actual API keys.

   ```bash
   docker run -p 7860:7860 \
     -e LLAMA_CLOUD_API_KEY="your_llama_cloud_key" \
     -e GROQ_API_KEY="your_groq_key" \
     -e COHERE_API_KEY="your_cohere_key" \
     document-qa
   ```

3. **Access the Application:**

   Open your web browser and go to `http://localhost:7860`.

## Deploying to Hugging Face Spaces

You can deploy this application to Hugging Face Spaces directly from this repository.

### Steps

1. **Create a new Hugging Face Space:**
   * Go to [huggingface.co/new-space](https://huggingface.co/new-space).
   * Give your Space a name.
   * Select **Docker** as the Space SDK.
   * Choose "Docker from scratch".
   * Create the Space.

2. **Upload the files:**
   * Upload `app.py`, `requirements.txt`, and `Dockerfile` to your Hugging Face Space repository.

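The `Dockerfile` itself is not shown in this commit; as a rough sketch, a Spaces-compatible one for a Gradio app might look like the following (base image, user setup, and file layout are assumptions, not the repository's actual file):

```dockerfile
# Hypothetical sketch -- the repository's real Dockerfile may differ.
FROM python:3.11-slim

# Spaces runs containers as a non-root user, so create one explicitly.
RUN useradd -m -u 1000 user
USER user
ENV PATH="/home/user/.local/bin:$PATH"
WORKDIR /home/user/app

COPY --chown=user requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY --chown=user app.py .

# Gradio serves on port 7860, which Spaces expects.
EXPOSE 7860
ENV GRADIO_SERVER_NAME="0.0.0.0"
CMD ["python", "app.py"]
```
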
3. **Add Secrets:**
   * In your Space's settings, go to the **Secrets** section.
   * Add the following secrets with your API keys:
     * `LLAMA_CLOUD_API_KEY`
     * `GROQ_API_KEY`
     * `COHERE_API_KEY`

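Once set, these secrets reach the running container as environment variables, so `app.py` can read them with `os.environ`. A minimal sketch (the helper name is illustrative, not taken from the source):

```python
# Sketch of reading Space secrets inside the app. Secrets added in the
# Space settings are exposed to the container as environment variables.
import os

REQUIRED_KEYS = ("LLAMA_CLOUD_API_KEY", "GROQ_API_KEY", "COHERE_API_KEY")

def load_api_keys(env=None):
    """Collect the required API keys and report any that are missing."""
    env = os.environ if env is None else env
    keys = {name: env[name] for name in REQUIRED_KEYS if name in env}
    missing = [name for name in REQUIRED_KEYS if name not in keys]
    return keys, missing

# Example with a fake environment; real code would call load_api_keys()
# with no argument so it falls back to os.environ.
keys, missing = load_api_keys({"GROQ_API_KEY": "dummy-value"})
# missing == ["LLAMA_CLOUD_API_KEY", "COHERE_API_KEY"]
```

Failing fast on missing keys at startup gives a clearer error than letting the first API call fail later.
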
4. **Deploy:**
   * Hugging Face automatically builds the Docker image from your `Dockerfile` and deploys the application. Once the build completes, your application is live.
requirements.txt CHANGED
@@ -3,4 +3,6 @@ python-dotenv
 llama-index
 llama-parse
 llama-index-llms-groq
-llama-index-embeddings-cohere
+llama-index-embeddings-cohere
+torch
+transformers