flexudy committed
Commit c951cca · 1 Parent(s): 5a186dc

Update README.md

Files changed (1): README.md (+5 −5)
README.md CHANGED
@@ -1,4 +1,4 @@
-# Flexudy's Conceptor: Towards Neuro-Symbolic Language Understanding
+# Towards Neuro-Symbolic Language Understanding
 
 ![alt text](https://www.flexudy.com/wp-content/uploads/2021/09/conceptor.png "Flexudy's conceptor")
 
@@ -32,12 +32,12 @@ How awesome would that be? 💡
 
 ### Usage
 
-No library should anyone suffer. Especially not if it is built on top of **HF Transformers**.
+No library should anyone suffer. Especially not if it is built on top of 🤗 **HF Transformers**.
 
 
 Go to the [Github repo](https://github.com/flexudy/natural-language-logic)
 ```python
-from core.conceptor.start import FlexudyConceptInferenceMachineFactory
+from flexudy.conceptor.start import FlexudyConceptInferenceMachineFactory
 
 # Load me only once
 concept_inference_machine = FlexudyConceptInferenceMachineFactory.get_concept_inference_machine()
@@ -58,7 +58,7 @@ Output:
 
 ```{<br/>'cat': ['mammal', 'animal'], <br/> 'dog': ['hound', 'animal'], <br/>'economics and sociology': ['both fields of study'], <br/>'public company': ['company']<br/>}```
 
-### How was it trained?``
+### How was it trained?
 
 1. Using Google's T5-base and T5-small. Both models are released on the Hugging Face Hub.
 2. T5-base was trained for only two epochs while T5-small was trained for 5 epochs.
@@ -67,7 +67,7 @@ Output:
 
 1. I extracted and curated a fragment of [Conceptnet](https://conceptnet.io/)
 2. In particular, only the IsA relation was used.
-3. Note that one thing can belong to multiple concepts (which is pretty cool if you think about [Fuzzy Description Logics](https://lat.inf.tu-dresden.de/~stefborg/Talks/QuantLAWorkshop2013.pdf)).
+3. Note that one term can belong to multiple concepts (which is pretty cool if you think about [Fuzzy Description Logics](https://lat.inf.tu-dresden.de/~stefborg/Talks/QuantLAWorkshop2013.pdf)).
 Multiple inheritances however mean some terms belong to so many concepts. Hence, I decided to randomly throw away some due to the **maximum length limitation**.
 
 ### Setup
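The data-prep step described in point 3 of the README — randomly discarding some IsA concepts for terms that inherit from too many, to respect the maximum input length — can be sketched as follows. This is an illustrative reconstruction, not code from the commit: the `truncate_concepts` helper, the `MAX_CONCEPTS` budget, and the sample facts are all assumptions standing in for the actual preprocessing pipeline.

```python
import random

# Assumed budget standing in for the model's maximum-length limitation
MAX_CONCEPTS = 3


def truncate_concepts(is_a, max_concepts=MAX_CONCEPTS, seed=42):
    """Randomly drop concepts for terms with too many IsA parents,
    mirroring the 'randomly throw away some' step in the README."""
    rng = random.Random(seed)
    truncated = {}
    for term, concepts in is_a.items():
        if len(concepts) > max_concepts:
            # Keep a random subset so the serialized example fits the budget
            concepts = rng.sample(concepts, max_concepts)
        truncated[term] = sorted(concepts)
    return truncated


# Hypothetical ConceptNet-style IsA facts
facts = {
    "cat": ["mammal", "animal", "pet", "feline", "carnivore"],
    "dog": ["hound", "animal"],
}
print(truncate_concepts(facts))
```

The fixed seed only makes the sketch reproducible; the README implies the actual sampling was done once, offline, while building the training set.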