Commit 53fb893 (parent d4f02c0): updated q&a

pages/1_📊_About.py CHANGED (+23 -1)
```diff
@@ -44,4 +44,26 @@ with tab1:
 with tab2:
     st.header("Question And Answering")
     with st.expander("Details"):
-        st.
+        st.markdown("**Application:** Gofynd.com")
+        st.markdown("**Total Documents:** FAQ, Return, Shipping, T&C")
+        st.markdown("**Framework:** Langchain")
+        st.markdown("**Embedding Model:** OpenAI's GPT3.5 model, used to generate the embeddings for the text docs")
+        st.markdown("**Vector Database:** Milvus")
+        st.markdown("**ML Pipeline Orchestration:** Vertex AI Pipeline")
+        st.markdown("**Supported input data format:** Strict JSON format (other formats to be added shortly)")
+        st.markdown("**Supported file type:** JSON")
+        st.markdown("**Similarity Metric:** Inner Product")
+        st.markdown("**Type of Index:** FLAT_L2")
+
+        st.subheader("STEPS")
+        st.markdown("**TRAINING:**")
+        st.markdown("1. Pull textual policy documents from FDK")
+        st.markdown("2. Parse the file data and flatten all documents to a single level")
+        st.markdown("3. Apply splitters to convert the documents into chunks")
+        st.markdown("4. Generate the embeddings for those chunks")
+        st.markdown("5. Store the embeddings in Milvus and create an index")
+
+        st.markdown("**SERVING:**")
+        st.markdown("1. Convert the user's question to embeddings with OpenAI's GPT3.5 model")
+        st.markdown("2. Run an embedding search query against the indexed Milvus collection and fetch the top 10 docs")
+        st.markdown("3. Use a PROMPT template to prepare PROMPT = TEMPLATE + CONTEXT, then call OpenAI to generate the response")
```