udituen committed on
Commit
b3928e7
·
1 Parent(s): 5854196

update faiss index

Files changed (2)
  1. README.md +43 -19
  2. src/streamlit_app.py +1 -1
README.md CHANGED
@@ -1,19 +1,43 @@
- ---
- title: Agriquery
- emoji: 🦀
- colorFrom: green
- colorTo: red
- sdk: docker
- app_port: 8501
- tags:
- - streamlit
- pinned: false
- short_description: LLM-powered question-answering system using RAG
- ---
-
- # Welcome to Streamlit!
-
- Edit `/src/streamlit_app.py` to customize this app to your heart's desire. :heart:
-
- If you have any questions, checkout our [documentation](https://docs.streamlit.io) and [community
- forums](https://discuss.streamlit.io).
+ ## AGRIQUERY - RAG-LLM Powered Q&A App for Agricultural Researchers
+
+ AgriQuery is an LLM-powered Q&A system built for agricultural researchers. It processes scientific publications and enables users to ask natural-language questions, receiving context-aware answers backed by retrieved text. Built with LangChain, FAISS, Airflow, and Docker, it demonstrates a production-ready RAG architecture for domain-specific information retrieval.
+
+ ### End-to-End RAG with Airflow, FAISS, Llama, FastAPI
+
+ ## Usage
+ 1. `docker-compose up --build`
+ 2. Access:
+    - Airflow: http://localhost:8080
+    - FastAPI: http://localhost:8000/query?question=Your+question+here
+ 3. Run the `rag_pipeline` DAG in the Airflow UI to ingest documents and build the FAISS index.
+ 4. Query the running FastAPI endpoint.
+
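The FastAPI endpoint above takes the question as a URL query parameter, so it must be percent-encoded. A minimal stdlib sketch of building such a URL (the `build_query_url` helper is hypothetical, not part of this repo; host and port are assumed from the defaults above):

```python
from urllib.parse import urlencode

# Hypothetical helper (not part of this repo): builds the /query URL
# from step 2 above, percent-encoding the question text.
def build_query_url(question: str, base: str = "http://localhost:8000/query") -> str:
    return f"{base}?{urlencode({'question': question})}"

print(build_query_url("Your question here"))
# http://localhost:8000/query?question=Your+question+here
```

`urlencode` uses plus-encoding for spaces, matching the `Your+question+here` form shown in the usage list.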
+
+ ```
+ agriquery-rag-llm/
+ │
+ ├── app/
+ │   ├── rag.py               # LangChain RAG setup
+ │   └── main_app.py          # Streamlit or Gradio UI
+ ├── docker/
+ │   ├── Dockerfile
+ │   └── docker-compose.yml
+ │
+ ├── src/
+ │   ├── experiments.ipynb    # Testing LangChain chains and embeddings
+ │   └── prepocess.py         # PDF upload, chunking and embedding setup
+ ├── requirements.txt
+ ├── .env                     # API keys
+ └── README.md
+ ```
+
+ Install the core Python dependencies:
+
+ ```
+ pip install transformers langchain langchain_community sentence-transformers faiss-cpu
+ ```
+
+ ## Complete Local RAG Stack
+
+ | Component | Tool |
+ | --- | --- |
+ | Chunking | LangChain Splitters |
+ | Embedding | Sentence Transformers |
+ | Vector DB | FAISS |
+ | LLM | Hugging Face Llama (local) |
+ | RAG Chain | LangChain RetrievalQA |
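The chunking step in the stack above can be illustrated with a stdlib-only stand-in for LangChain's splitters. It exposes the same `chunk_size`/`chunk_overlap` knobs but skips the recursive separator logic, so it is a simplified sketch, not the library's implementation:

```python
# Simplified stand-in for LangChain's RecursiveCharacterTextSplitter:
# fixed-size character windows with overlap. The real splitter also
# tries to break on separators (paragraphs, sentences, words) before
# falling back to raw character positions.
def split_text(text: str, chunk_size: int = 500, chunk_overlap: int = 50) -> list[str]:
    step = chunk_size - chunk_overlap
    return [text[i:i + chunk_size]
            for i in range(0, max(len(text) - chunk_overlap, 1), step)]

chunks = split_text("abcdefghij", chunk_size=4, chunk_overlap=1)
print(chunks)  # ['abcd', 'defg', 'ghij']
```

The overlap means each chunk repeats the tail of the previous one, which keeps sentences that straddle a boundary retrievable from at least one chunk.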
src/streamlit_app.py CHANGED
@@ -50,7 +50,7 @@ prompt = PromptTemplate(
 @st.cache_resource
 def load_retriever():
     embeddings = HuggingFaceEmbeddings(model_name="all-MiniLM-L6-v2")
-    db = FAISS.load_local("./vectorstore/agriquery_faiss_index", embeddings, allow_dangerous_deserialization=True)
+    db = FAISS.load_local("./vectorstore", embeddings, allow_dangerous_deserialization=True)
     retriever = db.as_retriever()
     return retriever
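The `db.as_retriever()` call in the patched function returns the chunks whose embeddings are most similar to the query embedding. What FAISS does under the hood can be sketched with a toy stdlib stand-in (brute-force cosine search; FAISS adds optimized index structures over the same idea, and the example vectors below are made up, not real all-MiniLM-L6-v2 embeddings, which are 384-dimensional):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    # Cosine similarity: dot product of the vectors over the
    # product of their Euclidean norms.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def retrieve(query_vec: list[float], docs: list[tuple[str, list[float]]], k: int = 2) -> list[str]:
    # docs: (chunk_text, embedding) pairs; rank by similarity, keep top k.
    ranked = sorted(docs, key=lambda d: cosine(query_vec, d[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

# Toy 2-dimensional "embeddings" for illustration only.
docs = [("maize drought study", [1.0, 0.0]),
        ("soil pH report", [0.0, 1.0]),
        ("irrigation survey", [0.7, 0.7])]
print(retrieve([1.0, 0.0], docs, k=2))  # ['maize drought study', 'irrigation survey']
```

The retrieved chunks are then stuffed into the prompt template, which is what lets the LLM answer with context from the indexed publications.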