FaheemBEG committed on
Commit 1bd98da · verified · 1 Parent(s): 2db8876

Update README.md

Files changed (1):
  1. README.md +3 -3
README.md CHANGED
@@ -81,8 +81,8 @@ The following fields were extracted and/or transformed from the original source:
 
 The Langchain's `RecursiveCharacterTextSplitter` function was used to make these chunks, which correspond to the `text` value. The parameters used are :
 
-- `chunk_size` = 1500 (in order to maximize the compability of most LLMs context windows)
-- `chunk_overlap` = 200
+- `chunk_size` = 1500
+- `chunk_overlap` = 0
 - `length_function` = len
 
 The value of `chunk_text` includes the `title` and the textual content chunk `text`. This strategy is designed to improve document search.
@@ -94,7 +94,7 @@ The resulting embedding is stored as a JSON stringified array of 1024 floating p
 
 ## 🔄 The chunking doesn't fit your use case?
 
-[**SOON AVAILABLE FOR THIS DATASET**] ~~If you need to reconstitute the original, un-chunked dataset, you can follow [this tutorial notebook available on our GitHub repository](https://github.com/etalab-ia/mediatech/blob/main/docs/reconstruct_vector_database.ipynb).~~
+If you need to reconstitute the original, un-chunked dataset, you can follow [this tutorial notebook available on our GitHub repository](https://github.com/etalab-ia/mediatech/blob/main/docs/reconstruct_vector_database.ipynb).
 
 ⚠️ The tutorial is only relevant for datasets that were chunked **without overlap**.
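
The splitting behaviour the diff configures can be approximated in plain Python. This is a rough sketch of what a recursive character splitter does, not LangChain's actual implementation; only the parameter names and values (`chunk_size=1500`, `chunk_overlap=0`, `length_function=len`) come from the README, and the separator fallback order is an assumption.

```python
# Rough, simplified sketch of a recursive character splitter.
# NOT langchain's implementation -- an illustration of the parameters
# named in the README: chunk_size=1500, chunk_overlap=0, length_function=len.

SEPARATORS = ["\n\n", "\n", " ", ""]  # assumed coarse-to-fine fallback order

def recursive_split(text, chunk_size=1500, length_function=len, separators=SEPARATORS):
    """Split text into pieces of at most chunk_size, preferring coarser
    separators (paragraphs before lines before words)."""
    if length_function(text) <= chunk_size:
        return [text] if text else []
    sep, rest = separators[0], separators[1:]
    if sep == "":
        # Last resort: hard cut every chunk_size characters.
        return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
    chunks, current = [], ""
    for part in text.split(sep):
        candidate = current + sep + part if current else part
        if length_function(candidate) <= chunk_size:
            current = candidate
        else:
            if current:
                chunks.append(current)
            if length_function(part) > chunk_size:
                # Part alone is too big: fall back to the next, finer separator.
                chunks.extend(recursive_split(part, chunk_size, length_function, rest))
                current = ""
            else:
                current = part
    if current:
        chunks.append(current)
    return chunks
```

With `chunk_overlap = 0`, each character of the input lands in exactly one chunk, which is what makes lossless reconstruction of the original documents possible.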
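
Because the updated chunking uses no overlap, reconstituting un-chunked documents amounts to concatenating each document's chunks in order. A minimal sketch of that idea, independent of the linked tutorial notebook; the field names (`doc_id`, `chunk_index`, `text`) are hypothetical, not this dataset's actual schema:

```python
# Hypothetical sketch of rebuilding documents from ordered,
# non-overlapping chunks. Field names (doc_id, chunk_index, text)
# are illustrative assumptions, not the dataset's real columns.
from collections import defaultdict

def reconstitute(rows):
    """rows: iterable of dicts with doc_id, chunk_index, and text keys.
    Returns {doc_id: full_text} by concatenating chunks in index order."""
    docs = defaultdict(list)
    for row in rows:
        docs[row["doc_id"]].append((row["chunk_index"], row["text"]))
    return {
        doc_id: "".join(text for _, text in sorted(parts))
        for doc_id, parts in docs.items()
    }
```

This only round-trips exactly when chunks do not overlap, which is why the warning above restricts the tutorial to datasets chunked **without overlap**.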