Tskunz committed (verified) on commit c59d7e4 · 1 parent: f37b0b6

Upload folder using huggingface_hub
1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
+ {
+   "word_embedding_dimension": 768,
+   "pooling_mode_cls_token": false,
+   "pooling_mode_mean_tokens": true,
+   "pooling_mode_max_tokens": false,
+   "pooling_mode_mean_sqrt_len_tokens": false,
+   "pooling_mode_weightedmean_tokens": false,
+   "pooling_mode_lasttoken": false,
+   "include_prompt": true
+ }
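The config above selects mean pooling: token embeddings are averaged, with padding positions excluded via the attention mask. A minimal NumPy sketch of that operation (the array shapes and toy values below are illustrative, not taken from this repo):

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings, ignoring padded positions.

    token_embeddings: (batch, seq_len, dim); attention_mask: (batch, seq_len).
    """
    mask = attention_mask[:, :, None].astype(token_embeddings.dtype)  # (batch, seq, 1)
    summed = (token_embeddings * mask).sum(axis=1)                    # (batch, dim)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)                    # avoid div by zero
    return summed / counts

# Toy example: batch of 1, two real tokens plus one pad, dim 768.
emb = np.zeros((1, 3, 768))
emb[0, 0] = 1.0   # first token: all ones
emb[0, 1] = 3.0   # second token: all threes
emb[0, 2] = 99.0  # padding - must not affect the result
mask = np.array([[1, 1, 0]])
pooled = mean_pool(emb, mask)
print(pooled.shape, pooled[0, 0])  # (1, 768) 2.0
```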
README.md ADDED
@@ -0,0 +1,643 @@
+ ---
+ tags:
+ - sentence-transformers
+ - sentence-similarity
+ - feature-extraction
+ - dense
+ - generated_from_trainer
+ - dataset_size:1668
+ - loss:LoggableMNRL
+ widget:
+ - source_sentence: t started. [5]It can be dangerous to delay turning yourself into
+     a company, because one or more of the founders might decide to split off and start
+     another company doing the same thing. This does happen. So when you set up the
+     company, as well as as apportioning the stock, you should get all the founders
+     to sign something agreeing that everyone's ideas belong to this company, and that
+     this company is going to be everyone's only job.[If this were a movie, ominous
+     music would begin here.]While you're at it, you should ask what else they've signed.
+     One of the worst things that can happen to a startup is to run into intellectual
+     property problems. We did, and it came closer to killing us than any competitor
+     ever did. As we were in the middle of getting bought, we discovered that one of
+     our people had, early on, been bound by an agreement that said all his ideas belonged
+     to the giant company that was paying for him to go to grad school. In theory,
+     that could have meant someone else owned big chunks of our software. So the acquisition
+     came to a screeching halt while we tried to sort this out. The problem was, since
+     we'd been about to be acquired, we'd allowed ourselves to run low on cash
+   sentences:
+   - 'what we should expect in the future is more of the same. Indeed, we should expect
+     both the number and wealth of founders to grow, because every decade it gets easier
+     to start a startup. Part of the reason it''s getting easier to start a startup
+     is social. Society is (re)assimilating the concept. If you start one now, your
+     parents won''t freak out the way they would have a generation ago, and knowledge
+     about how to do it is much more widespread. But the main reason it''s easier to
+     start a startup now is that it''s cheaper. Technology has driven down the cost
+     of both building products and acquiring customers. The decreasing cost of starting
+     a startup has in turn changed the balance of power between founders and investors.
+     Back when starting a startup meant building a factory, you needed investors''
+     permission to do it at all. But now investors need founders more than founders
+     need investors, and that, combined with the increasing amount of venture capital
+     available, has driven up valuations. [8]So the decreasing cost of starting a startup
+     increases the number of rich people in two ways: it means that more people start
+     them, and that those who do can raise money on better terms. But there'''
+   - 'e a company when, as sometimes happens, its whole market dies, just as property
+     managers can''t save you from the building burning down. But a company that managed
+     a large enough number of companies could say to all its clients: we''ll combine
+     the revenues from all your companies, and pay you your proportionate share. If
+     such management companies existed, they''d offer the maximum of freedom and security.
+     Someone would run your company for you, and you''d be protected even if it happened
+     to die. Let''s think about how such a management company might be organized. The
+     simplest way would be to have a new kind of stock representing the total pool
+     of companies they were managing. When you signed up, you''d trade your company''s
+     stock for shares of this pool, in proportion to an estimate of your company''s
+     value that you''d both agreed upon. Then you''d automatically get your share of
+     the returns of the whole pool. The catch is that because this kind of trade would
+     be hard to undo, you couldn''t switch management companies. But there''s a way
+     they could fix that: suppose all the company management companies got together
+     and agreed to allow their clients to exchange shares in all their pools. Then
+     y'
+   - t started. [5]It can be dangerous to delay turning yourself into a company, because
+     one or more of the founders might decide to split off and start another company
+     doing the same thing. This does happen. So when you set up the company, as well
+     as as apportioning the stock, you should get all the founders to sign something
+     agreeing that everyone's ideas belong to this company, and that this company is
+     going to be everyone's only job.[If this were a movie, ominous music would begin
+     here.]While you're at it, you should ask what else they've signed. One of the
+     worst things that can happen to a startup is to run into intellectual property
+     problems. We did, and it came closer to killing us than any competitor ever did.
+     As we were in the middle of getting bought, we discovered that one of our people
+     had, early on, been bound by an agreement that said all his ideas belonged to
+     the giant company that was paying for him to go to grad school. In theory, that
+     could have meant someone else owned big chunks of our software. So the acquisition
+     came to a screeching halt while we tried to sort this out. The problem was, since
+     we'd been about to be acquired, we'd allowed ourselves to run low on cash
+ - source_sentence: ' happen fast. For example, Y Combinator has now invested in 80
+     startups, 57 of which are still alive. (The rest have died or merged or been acquired.)
+     When you''re trying to advise 57 startups, it turns out you have to have a stateless
+     algorithm. You can''t have ulterior motives when you have 57 things going on at
+     once, because you can''t remember them. So our rule is just to do whatever''s
+     best for the founders. Not because we''re particularly benevolent, but because
+     it''s the only algorithm that works on that scale. When you write something telling
+     people to be good, you seem to be claiming to be good yourself. So I want to say
+     explicitly that I am not a particularly good person. When I was a kid I was firmly
+     in the camp of bad. The way adults used the word good, it seemed to be synonymous
+     with quiet, so I grew up very suspicious of it. You know how there are some people
+     whose names come up in conversation and everyone says "He''s such a great guy?"
+     People never say that about me. The best I get is "he means well." I am not claiming
+     to be good. At best I speak good as a second language. So I''m not suggesting
+     you be good in the usual sanctimonious way. I''m suggesting it because it works.'
+   sentences:
+   - ' happen fast. For example, Y Combinator has now invested in 80 startups, 57 of
+     which are still alive. (The rest have died or merged or been acquired.) When you''re
+     trying to advise 57 startups, it turns out you have to have a stateless algorithm.
+     You can''t have ulterior motives when you have 57 things going on at once, because
+     you can''t remember them. So our rule is just to do whatever''s best for the founders.
+     Not because we''re particularly benevolent, but because it''s the only algorithm
+     that works on that scale. When you write something telling people to be good,
+     you seem to be claiming to be good yourself. So I want to say explicitly that
+     I am not a particularly good person. When I was a kid I was firmly in the camp
+     of bad. The way adults used the word good, it seemed to be synonymous with quiet,
+     so I grew up very suspicious of it. You know how there are some people whose names
+     come up in conversation and everyone says "He''s such a great guy?" People never
+     say that about me. The best I get is "he means well." I am not claiming to be
+     good. At best I speak good as a second language. So I''m not suggesting you be
+     good in the usual sanctimonious way. I''m suggesting it because it works.'
+   - 'hether it''s net good or bad, but my guess is bad.[7] One of the reasons people
+     work so hard on startups is that startups can fail, and when they do, that failure
+     tends to be both decisive and conspicuous.[8] It''s ok to work on something to
+     make a lot of money. You need to solve the money problem somehow, and there''s
+     nothing wrong with doing that efficiently by trying to make a lot at once. I suppose
+     it would even be ok to be interested in money for its own sake; whatever floats
+     your boat. Just so long as you''re conscious of your motivations. The thing to
+     avoid is unconsciously letting the need for money warp your ideas about what kind
+     of work you find most interesting.[9] Many people face this question on a smaller
+     scale with individual projects. But it''s easier both to recognize and to accept
+     a dead end in a single project than to abandon some type of work entirely. The
+     more determined you are, the harder it gets. Like a Spanish Flu victim, you''re
+     fighting your own immune system: Instead of giving up, you tell yourself, I should
+     just try harder. And who can say you''re not right?
+
+
+     Thanks to Trevor Blackwell, John Carmack, John Collison, Patrick Collison, Robert
+     Morris, Geoff Ralsto'
+   - ign is a definite skill. It's not just an airy intangible. Things always seem
+     intangible when you don't understand them. Electricity seemed an airy intangible
+     to most people in 1800. Who knew there was so much to know about it? So it is
+     with design. Some people are good at it and some people are bad at it, and there's
+     something very tangible they're good or bad at. The reason design counts so much
+     in software is probably that there are fewer constraints than on physical things.
+     Building physical things is expensive and dangerous. The space of possible choices
+     is smaller; you tend to have to work as part of a larger group; and you're subject
+     to a lot of regulations. You don't have any of that if you and a couple friends
+     decide to create a new web-based application. Because there's so much scope for
+     design in software, a successful application tends to be way more than the sum
+     of its patents. What protects little companies from being copied by bigger competitors
+     is not just their patents, but the thousand little things the big company will
+     get wrong if they try. The second reason patents don't count for much in our world
+     is that startups rarely attack big companies head-on, the way R
+ - source_sentence: 'ng on optimization is counter to the general trend in software
+     development for the last several decades. Trying to write the sufficiently smart
+     compiler is by definition a mistake. And even if it weren''t, compilers are the
+     sort of software that''s supposed to be created by open source projects, not companies.
+     Plus if this works it will deprive all the programmers who take pleasure in making
+     multithreaded apps of so much amusing complexity. The forum troll I have by now
+     internalized doesn''t even know where to begin in raising objections to this project.
+     Now that''s what I call a startup idea.7. Ongoing DiagnosisBut wait, here''s another
+     that could face even greater resistance: ongoing, automatic medical diagnosis.
+     One of my tricks for generating startup ideas is to imagine the ways in which
+     we''ll seem backward to future generations. And I''m pretty sure that to people
+     50 or 100 years in the future, it will seem barbaric that people in our era waited
+     till they had symptoms to be diagnosed with conditions like heart disease and
+     cancer. For example, in 2004 Bill Clinton found he was feeling short of breath.
+     Doctors discovered that several of his arteries were over 90% blocked and 3 days
+     la'
+   sentences:
+   - lude working unsubscribe links in their mails. And this would be a necessity for
+     smaller fry, and for "legitimate" sites that hired spammers to promote them. So
+     if auto-retrieving filters became widespread, they'd become auto-unsubscribing
+     filters. In this scenario, spam would, like OS crashes, viruses, and popups, become
+     one of those plagues that only afflict people who don't bother to use the right
+     software. Notes[1] Auto-retrieving filters will have to follow redirects, and
+     should in some cases (e. g. a page that just says "click here") follow more than
+     one level of links. Make sure too that the http requests are indistinguishable
+     from those of popular Web browsers, including the order and referrer. If the response
+     doesn't come back within x amount of time, default to some fairly high spam probability.
+     Instead of making n constant, it might be a good idea to make it a function of
+     the number of spams that have been seen mentioning the site. This would add a
+     further level of protection against abuse and accidents.[2] The original version
+     of this article used the term "whitelist" instead of "blacklist" Though they were
+     to work like blacklists, I preferred to call them whitelists be
+   - 'ng on optimization is counter to the general trend in software development for
+     the last several decades. Trying to write the sufficiently smart compiler is by
+     definition a mistake. And even if it weren''t, compilers are the sort of software
+     that''s supposed to be created by open source projects, not companies. Plus if
+     this works it will deprive all the programmers who take pleasure in making multithreaded
+     apps of so much amusing complexity. The forum troll I have by now internalized
+     doesn''t even know where to begin in raising objections to this project. Now that''s
+     what I call a startup idea.7. Ongoing DiagnosisBut wait, here''s another that
+     could face even greater resistance: ongoing, automatic medical diagnosis. One
+     of my tricks for generating startup ideas is to imagine the ways in which we''ll
+     seem backward to future generations. And I''m pretty sure that to people 50 or
+     100 years in the future, it will seem barbaric that people in our era waited till
+     they had symptoms to be diagnosed with conditions like heart disease and cancer.
+     For example, in 2004 Bill Clinton found he was feeling short of breath. Doctors
+     discovered that several of his arteries were over 90% blocked and 3 days la'
+   - ' used to amuse himself by breaking into safes containing secret documents. This
+     tradition continues today. When we were in grad school, a hacker friend of mine
+     who spent too much time around MIT had his own lock picking kit. (He now runs
+     a hedge fund, a not unrelated enterprise.)It is sometimes hard to explain to authorities
+     why one would want to do such things. Another friend of mine once got in trouble
+     with the government for breaking into computers. This had only recently been declared
+     a crime, and the FBI found that their usual investigative technique didn''t work.
+     Police investigation apparently begins with a motive. The usual motives are few:
+     drugs, money, sex, revenge. Intellectual curiosity was not one of the motives
+     on the FBI''s list. Indeed, the whole concept seemed foreign to them. Those in
+     authority tend to be annoyed by hackers'' general attitude of disobedience. But
+     that disobedience is a byproduct of the qualities that make them good programmers.
+     They may laugh at the CEO when he talks in generic corporate newspeech, but they
+     also laugh at someone who tells them a certain problem can''t be solved. Suppress
+     one, and you suppress the other. This attitude is sometimes affe'
+ - source_sentence: father. [8]The second component of independent-mindedness, resistance
+     to being told what to think, is the most visible of the three. But even this is
+     often misunderstood. The big mistake people make about it is to think of it as
+     a merely negative quality. The language we use reinforces that idea. You're unconventional.
+     You don't care what other people think. But it's not just a kind of immunity.
+     In the most independent-minded people, the desire not to be told what to think
+     is a positive force. It's not mere skepticism, but an active delight in ideas
+     that subvert the conventional wisdom, the more counterintuitive the better. Some
+     of the most novel ideas seemed at the time almost like practical jokes. Think
+     how often your reaction to a novel idea is to laugh. I don't think it's because
+     novel ideas are funny per se, but because novelty and humor share a certain kind
+     of surprisingness. But while not identical, the two are close enough that there
+     is a definite correlation between having a sense of humor and being independent-minded
+     — just as there is between being humorless and being conventional-minded. [9]I
+     don't think we can significantly increase our resistance to being told what to
+   sentences:
+   - 'o think of startup ideas. If you do that, you get bad ones that sound dangerously
+     plausible. The best approach is more indirect: if you have the right sort of background,
+     good startup ideas will seem obvious to you. But even then, not immediately. It
+     takes time to come across situations where you notice something missing. And often
+     these gaps won''t seem to be ideas for companies, just things that would be interesting
+     to build. Which is why it''s good to have the time and the inclination to build
+     things just because they''re interesting. Live in the future and build what seems
+     interesting. Strange as it sounds, that''s the real recipe. Notes[1] This form
+     of bad idea has been around as long as the web. It was common in the 1990s, except
+     then people who had it used to say they were going to create a portal for x instead
+     of a social network for x. Structurally the idea is stone soup: you post a sign
+     saying "this is the place for people interested in x," and all those people show
+     up and you make money from them. What lures founders into this sort of idea are
+     statistics about the millions of people who might be interested in each type of
+     x. What they forget is that any given person might ha'
+   - father. [8]The second component of independent-mindedness, resistance to being
+     told what to think, is the most visible of the three. But even this is often misunderstood.
+     The big mistake people make about it is to think of it as a merely negative quality.
+     The language we use reinforces that idea. You're unconventional. You don't care
+     what other people think. But it's not just a kind of immunity. In the most independent-minded
+     people, the desire not to be told what to think is a positive force. It's not
+     mere skepticism, but an active delight in ideas that subvert the conventional
+     wisdom, the more counterintuitive the better. Some of the most novel ideas seemed
+     at the time almost like practical jokes. Think how often your reaction to a novel
+     idea is to laugh. I don't think it's because novel ideas are funny per se, but
+     because novelty and humor share a certain kind of surprisingness. But while not
+     identical, the two are close enough that there is a definite correlation between
+     having a sense of humor and being independent-minded — just as there is between
+     being humorless and being conventional-minded. [9]I don't think we can significantly
+     increase our resistance to being told what to
+   - 'ht happen. Well, if you''re troubled by uncertainty, I can solve that problem
+     for you: if you start a startup, it will probably fail. Seriously, though, this
+     is not a bad way to think about the whole experience. Hope for the best, but expect
+     the worst. In the worst case, it will at least be interesting. In the best case
+     you might get rich. No one will blame you if the startup tanks, so long as you
+     made a serious effort. There may once have been a time when employers would regard
+     that as a mark against you, but they wouldn''t now. I asked managers at big companies,
+     and they all said they''d prefer to hire someone who''d tried to start a startup
+     and failed over someone who''d spent the same time working at a big company. Nor
+     will investors hold it against you, as long as you didn''t fail out of laziness
+     or incurable stupidity. I''m told there''s a lot of stigma attached to failing
+     in other places—in Europe, for example. Not here. In America, companies, like
+     practically everything else, are disposable.14. Don''t realize what you''re avoidingOne
+     reason people who''ve been out in the world for a year or two make better founders
+     than people straight from college is that they know what they''re avoid'
+ - source_sentence: 'ded to take occasional vacations. [5]The only way to find the
+     limit is by crossing it. Cultivate a sensitivity to the quality of the work you''re
+     doing, and then you''ll notice if it decreases because you''re working too hard.
+     Honesty is critical here, in both directions: you have to notice when you''re
+     being lazy, but also when you''re working too hard. And if you think there''s
+     something admirable about working too hard, get that idea out of your head. You''re
+     not merely getting worse results, but getting them because you''re showing off
+     — if not to other people, then to yourself. [6]Finding the limit of working hard
+     is a constant, ongoing process, not something you do just once. Both the difficulty
+     of the work and your ability to do it can vary hour to hour, so you need to be
+     constantly judging both how hard you''re trying and how well you''re doing. Trying
+     hard doesn''t mean constantly pushing yourself to work, though. There may be some
+     people who do, but I think my experience is fairly typical, and I only have to
+     push myself occasionally when I''m starting a project or when I encounter some
+     sort of check. That''s when I''m in danger of procrastinating. But once I get
+     rolling, I tend to keep'
+   sentences:
+   - ' is not in itself bad, only when it''s camouflage on insipid form.) Similarly,
+     in painting, a still life of a few carefully observed and solidly modelled objects
+     will tend to be more interesting than a stretch of flashy but mindlessly repetitive
+     painting of, say, a lace collar. In writing it means: say what you mean and say
+     it briefly. It seems strange to have to emphasize simplicity. You''d think simple
+     would be the default. Ornate is more work. But something seems to come over people
+     when they try to be creative. Beginning writers adopt a pompous tone that doesn''t
+     sound anything like the way they speak. Designers trying to be artistic resort
+     to swooshes and curlicues. Painters discover that they''re expressionists. It''s
+     all evasion. Underneath the long words or the "expressive" brush strokes, there
+     is not much going on, and that''s frightening. When you''re forced to be simple,
+     you''re forced to face the real problem. When you can''t deliver ornament, you
+     have to deliver substance. Good design is timeless. In math, every proof is timeless
+     unless it contains a mistake. So what does Hardy mean when he says there is no
+     permanent place for ugly mathematics? He means the same thing Kelly Joh'
+   - 'same way a biblical literalist is committed to rejecting it. All he''s committed
+     to is following the evidence wherever it leads. Considering yourself a scientist
+     is equivalent to putting a sign in a cupboard saying "this cupboard must be kept
+     empty." Yes, strictly speaking, you''re putting something in the cupboard, but
+     not in the ordinary sense.
+
+
+     Thanks to Sam Altman, Trevor Blackwell, Paul Buchheit, and Robert Morris for reading
+     drafts of this.'
+   - 'ded to take occasional vacations. [5]The only way to find the limit is by crossing
+     it. Cultivate a sensitivity to the quality of the work you''re doing, and then
+     you''ll notice if it decreases because you''re working too hard. Honesty is critical
+     here, in both directions: you have to notice when you''re being lazy, but also
+     when you''re working too hard. And if you think there''s something admirable about
+     working too hard, get that idea out of your head. You''re not merely getting worse
+     results, but getting them because you''re showing off — if not to other people,
+     then to yourself. [6]Finding the limit of working hard is a constant, ongoing
+     process, not something you do just once. Both the difficulty of the work and your
+     ability to do it can vary hour to hour, so you need to be constantly judging both
+     how hard you''re trying and how well you''re doing. Trying hard doesn''t mean
+     constantly pushing yourself to work, though. There may be some people who do,
+     but I think my experience is fairly typical, and I only have to push myself occasionally
+     when I''m starting a project or when I encounter some sort of check. That''s when
+     I''m in danger of procrastinating. But once I get rolling, I tend to keep'
+ pipeline_tag: sentence-similarity
+ library_name: sentence-transformers
+ ---
+
+ # SentenceTransformer
+
+ This is a [sentence-transformers](https://www.SBERT.net) model trained on a dataset of 1,668 samples. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
+
+ ## Model Details
+
+ ### Model Description
+ - **Model Type:** Sentence Transformer
+ <!-- - **Base model:** [Unknown](https://huggingface.co/unknown) -->
+ - **Maximum Sequence Length:** 512 tokens
+ - **Output Dimensionality:** 768 dimensions
+ - **Similarity Function:** Cosine Similarity
+ <!-- - **Training Dataset:** Unknown -->
+ <!-- - **Language:** Unknown -->
+ <!-- - **License:** Unknown -->
+
+ ### Model Sources
+
+ - **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
+ - **Repository:** [Sentence Transformers on GitHub](https://github.com/huggingface/sentence-transformers)
+ - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
+
+ ### Full Model Architecture
+
+ ```
+ SentenceTransformer(
+   (0): Transformer({'max_seq_length': 512, 'do_lower_case': False, 'architecture': 'BertModel'})
+   (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
+ )
+ ```
350
+
351
+ ## Usage
352
+
353
+ ### Direct Usage (Sentence Transformers)
354
+
355
+ First install the Sentence Transformers library:
356
+
357
+ ```bash
358
+ pip install -U sentence-transformers
359
+ ```
360
+
361
+ Then you can load this model and run inference.
362
+ ```python
363
+ from sentence_transformers import SentenceTransformer
364
+
365
+ # Download from the 🤗 Hub
366
+ model = SentenceTransformer("sentence_transformers_model_id")
367
+ # Run inference
368
+ sentences = [
369
+ "ded to take occasional vacations. [5]The only way to find the limit is by crossing it. Cultivate a sensitivity to the quality of the work you're doing, and then you'll notice if it decreases because you're working too hard. Honesty is critical here, in both directions: you have to notice when you're being lazy, but also when you're working too hard. And if you think there's something admirable about working too hard, get that idea out of your head. You're not merely getting worse results, but getting them because you're showing off — if not to other people, then to yourself. [6]Finding the limit of working hard is a constant, ongoing process, not something you do just once. Both the difficulty of the work and your ability to do it can vary hour to hour, so you need to be constantly judging both how hard you're trying and how well you're doing. Trying hard doesn't mean constantly pushing yourself to work, though. There may be some people who do, but I think my experience is fairly typical, and I only have to push myself occasionally when I'm starting a project or when I encounter some sort of check. That's when I'm in danger of procrastinating. But once I get rolling, I tend to keep",
370
+ "ded to take occasional vacations. [5]The only way to find the limit is by crossing it. Cultivate a sensitivity to the quality of the work you're doing, and then you'll notice if it decreases because you're working too hard. Honesty is critical here, in both directions: you have to notice when you're being lazy, but also when you're working too hard. And if you think there's something admirable about working too hard, get that idea out of your head. You're not merely getting worse results, but getting them because you're showing off — if not to other people, then to yourself. [6]Finding the limit of working hard is a constant, ongoing process, not something you do just once. Both the difficulty of the work and your ability to do it can vary hour to hour, so you need to be constantly judging both how hard you're trying and how well you're doing. Trying hard doesn't mean constantly pushing yourself to work, though. There may be some people who do, but I think my experience is fairly typical, and I only have to push myself occasionally when I'm starting a project or when I encounter some sort of check. That's when I'm in danger of procrastinating. But once I get rolling, I tend to keep",
371
+ ' is not in itself bad, only when it\'s camouflage on insipid form.) Similarly, in painting, a still life of a few carefully observed and solidly modelled objects will tend to be more interesting than a stretch of flashy but mindlessly repetitive painting of, say, a lace collar. In writing it means: say what you mean and say it briefly. It seems strange to have to emphasize simplicity. You\'d think simple would be the default. Ornate is more work. But something seems to come over people when they try to be creative. Beginning writers adopt a pompous tone that doesn\'t sound anything like the way they speak. Designers trying to be artistic resort to swooshes and curlicues. Painters discover that they\'re expressionists. It\'s all evasion. Underneath the long words or the "expressive" brush strokes, there is not much going on, and that\'s frightening. When you\'re forced to be simple, you\'re forced to face the real problem. When you can\'t deliver ornament, you have to deliver substance. Good design is timeless. In math, every proof is timeless unless it contains a mistake. So what does Hardy mean when he says there is no permanent place for ugly mathematics? He means the same thing Kelly Joh',
+ ]
+ embeddings = model.encode(sentences)
+ print(embeddings.shape)
+ # [3, 768]
+
+ # Get the similarity scores for the embeddings
+ similarities = model.similarity(embeddings, embeddings)
+ print(similarities)
+ # tensor([[ 1.0000,  1.0000, -0.1102],
+ #         [ 1.0000,  1.0000, -0.1102],
+ #         [-0.1102, -0.1102,  1.0000]])
+ ```
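The `model.similarity` call above scores the embeddings with cosine similarity (the card's `similarity_fn_name` is `cosine`). As a sketch of what that call computes, using plain PyTorch on stand-in vectors rather than real model output:

```python
import torch

def cos_sim(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    # L2-normalize each row, then take all pairwise dot products.
    a_norm = torch.nn.functional.normalize(a, p=2, dim=1)
    b_norm = torch.nn.functional.normalize(b, p=2, dim=1)
    return a_norm @ b_norm.T

# Toy 768-dim embeddings standing in for model.encode output.
emb = torch.randn(3, 768)
sims = cos_sim(emb, emb)
print(sims.shape)  # torch.Size([3, 3])
```

The resulting matrix is symmetric with a unit diagonal, which is why the first two (identical) sentences in the example above score 1.0000 against each other.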
+
+ <!--
+ ### Direct Usage (Transformers)
+
+ <details><summary>Click to see the direct usage in Transformers</summary>
+
+ </details>
+ -->
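If you bypass `sentence-transformers` and run the BERT backbone through `transformers` directly, you have to reproduce the pooling step yourself. Per `1_Pooling/config.json`, this model mean-pools token embeddings over non-padding positions; a minimal sketch of that step, with random tensors standing in for the backbone's `last_hidden_state` and attention mask:

```python
import torch

def mean_pooling(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # Average token embeddings, counting only non-padding positions.
    mask = attention_mask.unsqueeze(-1).to(token_embeddings.dtype)  # [batch, seq, 1]
    summed = (token_embeddings * mask).sum(dim=1)                   # [batch, hidden]
    counts = mask.sum(dim=1).clamp(min=1e-9)                        # [batch, 1]
    return summed / counts

# Stand-ins for the 768-dim backbone outputs.
tokens = torch.randn(2, 10, 768)
mask = torch.ones(2, 10, dtype=torch.long)
mask[1, 5:] = 0  # second sequence has 5 padding tokens
pooled = mean_pooling(tokens, mask)
print(pooled.shape)  # torch.Size([2, 768])
```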
+
+ <!--
+ ### Downstream Usage (Sentence Transformers)
+
+ You can finetune this model on your own dataset.
+
+ <details><summary>Click to expand</summary>
+
+ </details>
+ -->
+
+ <!--
+ ### Out-of-Scope Use
+
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
+ -->
+
+ <!--
+ ## Bias, Risks and Limitations
+
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
+ -->
+
+ <!--
+ ### Recommendations
+
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
+ -->
+
+ ## Training Details
+
+ ### Training Dataset
+
+ #### Unnamed Dataset
+
+ * Size: 1,668 training samples
+ * Columns: <code>sentence_0</code> and <code>sentence_1</code>
+ * Approximate statistics based on the first 1000 samples:
+ | | sentence_0 | sentence_1 |
+ |:--------|:-----------|:-----------|
+ | type | string | string |
+ | details | <ul><li>min: 26 tokens</li><li>mean: 257.69 tokens</li><li>max: 345 tokens</li></ul> | <ul><li>min: 26 tokens</li><li>mean: 257.69 tokens</li><li>max: 345 tokens</li></ul> |
+ * Samples:
+ | sentence_0 | sentence_1 |
+ |:-----------|:-----------|
+ | <code>ts raison d'etre—is that it offers something otherwise impossible to obtain: a way of measuring that. In many businesses, it just makes more sense for companies to get technology by buying startups rather than developing it in house. You pay more, but there is less risk, and risk is what big companies don't want. It makes the guys developing the technology more accountable, because they only get paid if they build the winner. And you end up with better technology, created faster, because things are made in the innovative atmosphere of startups instead of the bureaucratic atmosphere of big companies. Our startup, Viaweb, was built to be sold. We were open with investors about that from the start. And we were careful to create something that could slot easily into a larger company. That is the pattern for the future.9. CaliforniaThe Bubble was a California phenomenon. When I showed up in Silicon Valley in 1998, I felt like an immigrant from Eastern Europe arriving in America in 1900. Eve...</code> | <code>ts raison d'etre—is that it offers something otherwise impossible to obtain: a way of measuring that. In many businesses, it just makes more sense for companies to get technology by buying startups rather than developing it in house. You pay more, but there is less risk, and risk is what big companies don't want. It makes the guys developing the technology more accountable, because they only get paid if they build the winner. And you end up with better technology, created faster, because things are made in the innovative atmosphere of startups instead of the bureaucratic atmosphere of big companies. Our startup, Viaweb, was built to be sold. We were open with investors about that from the start. And we were careful to create something that could slot easily into a larger company. That is the pattern for the future.9. CaliforniaThe Bubble was a California phenomenon. When I showed up in Silicon Valley in 1998, I felt like an immigrant from Eastern Europe arriving in America in 1900. Eve...</code> |
+ | <code> image rendered with more pixels. One consequence is that some old recipes may have become obsolete. At the very least we have to go back and figure out if they were really recipes for wisdom or intelligence. But the really striking change, as intelligence and wisdom drift apart, is that we may have to decide which we prefer. We may not be able to optimize for both simultaneously. Society seems to have voted for intelligence. We no longer admire the sage—not the way people did two thousand years ago. Now we admire the genius. Because in fact the distinction we began with has a rather brutal converse: just as you can be smart without being very wise, you can be wise without being very smart. That doesn't sound especially admirable. That gets you James Bond, who knows what to do in a lot of situations, but has to rely on Q for the ones involving math. Intelligence and wisdom are obviously not mutually exclusive. In fact, a high average may help support high peaks. But there are reasons t...</code> | <code> image rendered with more pixels. One consequence is that some old recipes may have become obsolete. At the very least we have to go back and figure out if they were really recipes for wisdom or intelligence. But the really striking change, as intelligence and wisdom drift apart, is that we may have to decide which we prefer. We may not be able to optimize for both simultaneously. Society seems to have voted for intelligence. We no longer admire the sage—not the way people did two thousand years ago. Now we admire the genius. Because in fact the distinction we began with has a rather brutal converse: just as you can be smart without being very wise, you can be wise without being very smart. That doesn't sound especially admirable. That gets you James Bond, who knows what to do in a lot of situations, but has to rely on Q for the ones involving math. Intelligence and wisdom are obviously not mutually exclusive. In fact, a high average may help support high peaks. But there are reasons t...</code> |
+ | <code>he mastered a new kind of farming. I've seen the lever of technology grow visibly in my own time. In high school I made money by mowing lawns and scooping ice cream at Baskin-Robbins. This was the only kind of work available at the time. Now high school kids could write software or design web sites. But only some of them will; the rest will still be scooping ice cream. I remember very vividly when in 1985 improved technology made it possible for me to buy a computer of my own. Within months I was using it to make money as a freelance programmer. A few years before, I couldn't have done this. A few years before, there was no such thing as a freelance programmer. But Apple created wealth, in the form of powerful, inexpensive computers, and programmers immediately set to work using it to create more. As this example suggests, the rate at which technology increases our productive capacity is probably exponential, rather than linear. So we should expect to see ever-increasing variation in i...</code> | <code>he mastered a new kind of farming. I've seen the lever of technology grow visibly in my own time. In high school I made money by mowing lawns and scooping ice cream at Baskin-Robbins. This was the only kind of work available at the time. Now high school kids could write software or design web sites. But only some of them will; the rest will still be scooping ice cream. I remember very vividly when in 1985 improved technology made it possible for me to buy a computer of my own. Within months I was using it to make money as a freelance programmer. A few years before, I couldn't have done this. A few years before, there was no such thing as a freelance programmer. But Apple created wealth, in the form of powerful, inexpensive computers, and programmers immediately set to work using it to create more. As this example suggests, the rate at which technology increases our productive capacity is probably exponential, rather than linear. So we should expect to see ever-increasing variation in i...</code> |
+ * Loss: <code>__main__.LoggableMNRL</code> with these parameters:
+ ```json
+ {
+     "scale": 20.0,
+     "similarity_fct": "cos_sim",
+     "gather_across_devices": false
+ }
+ ```
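`LoggableMNRL` is defined in the training script (`__main__`), so its code is not part of this repository; judging by its parameters and the Henderson et al. citation below, it follows the in-batch-negatives objective of `MultipleNegativesRankingLoss`. A minimal sketch of that objective under those assumptions:

```python
import torch
import torch.nn.functional as F

def mnrl_loss(anchors: torch.Tensor, positives: torch.Tensor, scale: float = 20.0) -> torch.Tensor:
    # Cosine similarity between every anchor and every positive in the batch.
    a = F.normalize(anchors, dim=1)
    p = F.normalize(positives, dim=1)
    scores = (a @ p.T) * scale  # [batch, batch], scale=20.0 as in the card
    # For anchor i, positive i is the target; the other positives act as negatives.
    labels = torch.arange(scores.size(0))
    return F.cross_entropy(scores, labels)

anchors = torch.randn(16, 768)
positives = anchors + 0.01 * torch.randn(16, 768)  # near-duplicate pairs, as in this dataset
loss = mnrl_loss(anchors, positives)
print(float(loss))  # close to zero: each anchor is most similar to its own positive
```

With 16 near-duplicate pairs per batch (matching `per_device_train_batch_size`), every other pair in the batch serves as a free negative, which is what makes this loss effective on paired data without explicit negatives.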
+
+ ### Training Hyperparameters
+ #### Non-Default Hyperparameters
+
+ - `per_device_train_batch_size`: 16
+ - `per_device_eval_batch_size`: 16
+ - `num_train_epochs`: 5
+ - `fp16`: True
+ - `multi_dataset_batch_sampler`: round_robin
+
+ #### All Hyperparameters
+ <details><summary>Click to expand</summary>
+
+ - `overwrite_output_dir`: False
+ - `do_predict`: False
+ - `eval_strategy`: no
+ - `prediction_loss_only`: True
+ - `per_device_train_batch_size`: 16
+ - `per_device_eval_batch_size`: 16
+ - `per_gpu_train_batch_size`: None
+ - `per_gpu_eval_batch_size`: None
+ - `gradient_accumulation_steps`: 1
+ - `eval_accumulation_steps`: None
+ - `torch_empty_cache_steps`: None
+ - `learning_rate`: 5e-05
+ - `weight_decay`: 0.0
+ - `adam_beta1`: 0.9
+ - `adam_beta2`: 0.999
+ - `adam_epsilon`: 1e-08
+ - `max_grad_norm`: 1
+ - `num_train_epochs`: 5
+ - `max_steps`: -1
+ - `lr_scheduler_type`: linear
+ - `lr_scheduler_kwargs`: {}
+ - `warmup_ratio`: 0.0
+ - `warmup_steps`: 0
+ - `log_level`: passive
+ - `log_level_replica`: warning
+ - `log_on_each_node`: True
+ - `logging_nan_inf_filter`: True
+ - `save_safetensors`: True
+ - `save_on_each_node`: False
+ - `save_only_model`: False
+ - `restore_callback_states_from_checkpoint`: False
+ - `no_cuda`: False
+ - `use_cpu`: False
+ - `use_mps_device`: False
+ - `seed`: 42
+ - `data_seed`: None
+ - `jit_mode_eval`: False
+ - `bf16`: False
+ - `fp16`: True
+ - `fp16_opt_level`: O1
+ - `half_precision_backend`: auto
+ - `bf16_full_eval`: False
+ - `fp16_full_eval`: False
+ - `tf32`: None
+ - `local_rank`: 0
+ - `ddp_backend`: None
+ - `tpu_num_cores`: None
+ - `tpu_metrics_debug`: False
+ - `debug`: []
+ - `dataloader_drop_last`: False
+ - `dataloader_num_workers`: 0
+ - `dataloader_prefetch_factor`: None
+ - `past_index`: -1
+ - `disable_tqdm`: False
+ - `remove_unused_columns`: True
+ - `label_names`: None
+ - `load_best_model_at_end`: False
+ - `ignore_data_skip`: False
+ - `fsdp`: []
+ - `fsdp_min_num_params`: 0
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
+ - `fsdp_transformer_layer_cls_to_wrap`: None
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
+ - `parallelism_config`: None
+ - `deepspeed`: None
+ - `label_smoothing_factor`: 0.0
+ - `optim`: adamw_torch_fused
+ - `optim_args`: None
+ - `adafactor`: False
+ - `group_by_length`: False
+ - `length_column_name`: length
+ - `project`: huggingface
+ - `trackio_space_id`: trackio
+ - `ddp_find_unused_parameters`: None
+ - `ddp_bucket_cap_mb`: None
+ - `ddp_broadcast_buffers`: False
+ - `dataloader_pin_memory`: True
+ - `dataloader_persistent_workers`: False
+ - `skip_memory_metrics`: True
+ - `use_legacy_prediction_loop`: False
+ - `push_to_hub`: False
+ - `resume_from_checkpoint`: None
+ - `hub_model_id`: None
+ - `hub_strategy`: every_save
+ - `hub_private_repo`: None
+ - `hub_always_push`: False
+ - `hub_revision`: None
+ - `gradient_checkpointing`: False
+ - `gradient_checkpointing_kwargs`: None
+ - `include_inputs_for_metrics`: False
+ - `include_for_metrics`: []
+ - `eval_do_concat_batches`: True
+ - `fp16_backend`: auto
+ - `push_to_hub_model_id`: None
+ - `push_to_hub_organization`: None
+ - `mp_parameters`:
+ - `auto_find_batch_size`: False
+ - `full_determinism`: False
+ - `torchdynamo`: None
+ - `ray_scope`: last
+ - `ddp_timeout`: 1800
+ - `torch_compile`: False
+ - `torch_compile_backend`: None
+ - `torch_compile_mode`: None
+ - `include_tokens_per_second`: False
+ - `include_num_input_tokens_seen`: no
+ - `neftune_noise_alpha`: None
+ - `optim_target_modules`: None
+ - `batch_eval_metrics`: False
+ - `eval_on_start`: False
+ - `use_liger_kernel`: False
+ - `liger_kernel_config`: None
+ - `eval_use_gather_object`: False
+ - `average_tokens_across_devices`: True
+ - `prompts`: None
+ - `batch_sampler`: batch_sampler
+ - `multi_dataset_batch_sampler`: round_robin
+ - `router_mapping`: {}
+ - `learning_rate_mapping`: {}
+
+ </details>
+
+ ### Training Logs
+ | Epoch | Step | Training Loss |
+ |:------:|:----:|:-------------:|
+ | 4.7619 | 500 | 0.1358 |
+
+
+ ### Framework Versions
+ - Python: 3.12.12
+ - Sentence Transformers: 5.1.2
+ - Transformers: 4.57.3
+ - PyTorch: 2.9.0+cu126
+ - Accelerate: 1.12.0
+ - Datasets: 4.0.0
+ - Tokenizers: 0.22.1
+
+ ## Citation
+
+ ### BibTeX
+
+ #### Sentence Transformers
+ ```bibtex
+ @inproceedings{reimers-2019-sentence-bert,
+     title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
+     author = "Reimers, Nils and Gurevych, Iryna",
+     booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
+     month = "11",
+     year = "2019",
+     publisher = "Association for Computational Linguistics",
+     url = "https://arxiv.org/abs/1908.10084",
+ }
+ ```
+
+ #### LoggableMNRL
+ ```bibtex
+ @misc{henderson2017efficient,
+     title={Efficient Natural Language Response Suggestion for Smart Reply},
+     author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
+     year={2017},
+     eprint={1705.00652},
+     archivePrefix={arXiv},
+     primaryClass={cs.CL}
+ }
+ ```
+
+ <!--
+ ## Glossary
+
+ *Clearly define terms in order to be accessible across audiences.*
+ -->
+
+ <!--
+ ## Model Card Authors
+
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
+ -->
+
+ <!--
+ ## Model Card Contact
+
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
+ -->
config.json ADDED
@@ -0,0 +1,25 @@
+ {
+   "architectures": [
+     "BertModel"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "classifier_dropout": null,
+   "dtype": "float32",
+   "gradient_checkpointing": false,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "layer_norm_eps": 1e-12,
+   "max_position_embeddings": 512,
+   "model_type": "bert",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "pad_token_id": 0,
+   "position_embedding_type": "absolute",
+   "transformers_version": "4.57.3",
+   "type_vocab_size": 2,
+   "use_cache": true,
+   "vocab_size": 30522
+ }
config_sentence_transformers.json ADDED
@@ -0,0 +1,14 @@
+ {
+   "model_type": "SentenceTransformer",
+   "__version__": {
+     "sentence_transformers": "5.1.2",
+     "transformers": "4.57.3",
+     "pytorch": "2.9.0+cu126"
+   },
+   "prompts": {
+     "query": "",
+     "document": ""
+   },
+   "default_prompt_name": null,
+   "similarity_fn_name": "cosine"
+ }
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:92d9387218a6ee3427a21bad4c6ac5c4ff89efffcb766100adf7c5a5170a0d70
+ size 437951328
modules.json ADDED
@@ -0,0 +1,14 @@
+ [
+   {
+     "idx": 0,
+     "name": "0",
+     "path": "",
+     "type": "sentence_transformers.models.Transformer"
+   },
+   {
+     "idx": 1,
+     "name": "1",
+     "path": "1_Pooling",
+     "type": "sentence_transformers.models.Pooling"
+   }
+ ]
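This module list is what `SentenceTransformer` uses to assemble the inference pipeline: modules run in `idx` order, so the Transformer backbone feeds the mean-pooling module. A small illustrative snippet that parses the same list and recovers that order:

```python
import json

# Inline copy of modules.json for illustration.
modules_json = """
[
  {"idx": 0, "name": "0", "path": "", "type": "sentence_transformers.models.Transformer"},
  {"idx": 1, "name": "1", "path": "1_Pooling", "type": "sentence_transformers.models.Pooling"}
]
"""

modules = sorted(json.loads(modules_json), key=lambda m: m["idx"])
order = [m["type"].rsplit(".", 1)[-1] for m in modules]
print(order)  # ['Transformer', 'Pooling']
```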
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
+ {
+   "max_seq_length": 512,
+   "do_lower_case": false
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,37 @@
+ {
+   "cls_token": {
+     "content": "[CLS]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "mask_token": {
+     "content": "[MASK]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": {
+     "content": "[PAD]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "sep_token": {
+     "content": "[SEP]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "unk_token": {
+     "content": "[UNK]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,56 @@
+ {
+   "added_tokens_decoder": {
+     "0": {
+       "content": "[PAD]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100": {
+       "content": "[UNK]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "101": {
+       "content": "[CLS]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "102": {
+       "content": "[SEP]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "103": {
+       "content": "[MASK]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "clean_up_tokenization_spaces": false,
+   "cls_token": "[CLS]",
+   "do_lower_case": true,
+   "extra_special_tokens": {},
+   "mask_token": "[MASK]",
+   "model_max_length": 512,
+   "pad_token": "[PAD]",
+   "sep_token": "[SEP]",
+   "strip_accents": null,
+   "tokenize_chinese_chars": true,
+   "tokenizer_class": "BertTokenizer",
+   "unk_token": "[UNK]"
+ }
vocab.txt ADDED
The diff for this file is too large to render. See raw diff