PL-RnD committed
Commit d3513bd · 1 Parent(s): b941f89

chore: Spelling fix

Files changed (1)
README.md +2 -2
README.md CHANGED
@@ -85,7 +85,7 @@ This will output a DataFrame with the original texts and their predicted labels
 ```
 
 ## Intended Use
-This model is intended to flag privacy concerns that a privacy concious person would expect to keep private, such as: addresses, phone numbers, e-mails, passwords, health details, relationship drama, financial numbers, political opinions, or sexual preferences.
+This model is intended to flag privacy concerns that a privacy conscious person would expect to keep private, such as: addresses, phone numbers, e-mails, passwords, health details, relationship drama, financial numbers, political opinions, or sexual preferences.
 
 The motivating use-case for this model is to reside client-side (or in a trusted/internal environment) to review user-generated text content before it is sent to a server or third-party service, in order to prevent accidental sharing of sensitive information. For example:
 - Filter and act as an A:B router for public vs private LLMS (i.e. like using this with Pipelines in Open-WebUI). If the text is flagged as a privacy violation, it can be routed to a local/private LLM instance instead of a public one.
@@ -94,4 +94,4 @@ The motivating use-case for this model is to reside client-side (or in a trusted
 
 ---
 
-> "Ultimately, arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say." - Edward Snowden
+> Ultimately, arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say. - Edward Snowden
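
The A:B routing pattern described in the README's "Intended Use" section could be sketched as follows. This is a minimal illustration, not the repo's actual integration code: `classify_privacy` is a stand-in stub (in practice it would call this repo's classifier, e.g. via a text-classification pipeline), and the LLM backends, labels, and marker words are hypothetical.

```python
from typing import Callable

def classify_privacy(text: str) -> str:
    """Stand-in stub for the privacy classifier; returns 'privacy' or 'safe'.
    A real deployment would run the model from this repo instead of this
    keyword check (the markers below are purely illustrative)."""
    sensitive_markers = ("password", "ssn", "@", "diagnosis")
    return "privacy" if any(m in text.lower() for m in sensitive_markers) else "safe"

def route(text: str,
          public_llm: Callable[[str], str],
          private_llm: Callable[[str], str]) -> str:
    """A:B router: text flagged as a privacy concern goes to the local/private
    LLM; everything else may be sent to the public one."""
    label = classify_privacy(text)
    return private_llm(text) if label == "privacy" else public_llm(text)

# Dummy backends standing in for a public API and a local LLM instance:
public = lambda t: f"[public] {t}"
private = lambda t: f"[private] {t}"

print(route("What's the weather like?", public, private))  # handled publicly
print(route("My password is hunter2", public, private))    # kept private
```

The key design point is that the classification happens client-side, before any text leaves the trusted environment, so accidental disclosure is blocked at the routing layer rather than relying on the remote service.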