Dataset Preview
The full dataset viewer is not available; only a preview of the rows is shown.
The dataset generation failed because of a cast error
Error code:   DatasetGenerationCastError
Exception:    DatasetGenerationCastError
Message:      An error occurred while generating the dataset

All the data files must have the same columns, but at some point there are 224 new columns ({'a209', 'a41', 'a21', 'text', 'a19', 'a17', 'a38', 'a85', 'a138', 'a127', 'a118', 'a71', 'a10', 'a136', 'a73', 'a219', 'a107', 'a171', 'label_noisy', 'a91', 'a210', 'a202', 'a29', 'a16', 'a102', 'a131', 'a166', 'a52', 'a160', 'a206', 'a14', 'a78', 'a124', 'a35', 'a214', 'a194', 'a177', 'a109', 'a190', 'a28', 'a208', 'a98', 'a179', 'a205', 'a88', 'a34', 'a174', 'a139', 'a114', 'a167', 'a176', 'a69', 'a62', 'a58', 'a121', 'a151', 'a103', 'a181', 'a33', 'a11', 'a213', 'a116', 'a64', 'a82', 'a8', 'a37', 'a75', 'a192', 'a83', 'a185', 'a182', 'a140', 'a170', 'a94', 'a108', 'a157', 'a96', 'a57', 'a95', 'a183', 'a199', 'a66', 'a173', 'a154', 'a129', 'a180', 'a27', 'a61', 'a30', 'a54', 'a93', 'a196', 'a87', 'a25', 'a79', 'a23', 'a123', 'a168', 'a142', 'a40', 'a70', 'a191', 'a22', 'a80', 'a126', 'a51', 'a31', 'a77', 'a1', 'a156', 'a15', 'a48', 'a112', 'a148', 'a53', 'a50', 'a60', 'a132', 'a204', 'a24', 'a90', 'a4', 'a175', 'a200', 'a68', 'a32', 'a106', 'a134', 'a145', 'a149', 'a150', 'a18', 'a144', 'a46', 'a187', 'a12', 'a89', 'a133', 'a86', 'a178', 'a198', 'a26', 'a159', 'a130', 'a72', 'a92', 'a162', 'label_cat_noisy', 'a122', 'a7', 'a119', 'a44', 'label_cat', 'a212', 'a189', 'a120', 'a135', 'a56', 'a201', 'a188', 'a125', 'a207', 'a158', 'a36', 'a104', 'a111', 'a137', 'a9', 'a2', 'a49', 'a115', 'a117', 'a169', 'a153', 'label', 'a84', 'a39', 'a152', 'a59', 'a105', 'a163', 'a155', 'a65', 'a218', 'a143', 'a211', 'a146', 'a99', 'a13', 'a172', 'a20', 'a195', 'a5', 'a47', 'a161', 'a215', 'a3', 'a216', 'a164', 'a55', 'a203', 'a217', 'a101', 'a6', 'a63', 'a165', 'a110', 'a67', 'a197', 'a45', 'a76', 'a81', 'a42', 'a186', 'a193', 'a147', 'a43', 'a100', 'a128', 'a113', 'a97', 'a74', 'a141', 'a184'}) and 2 missing columns ({'completion', 'prompt'}).

This happened while the csv dataset builder was generating data using

hf://datasets/Cleanlab/stanford-politeness/fine-tuning/train_full.csv (at revision 9fdefb9b4206e12647be56a5ceb48ca898c958a2)

Please either edit the data files to have matching columns, or separate them into different configurations (see docs at https://hf.co/docs/hub/datasets-manual-configuration#multiple-configurations)
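The counts in the message add up: `train_full.csv` contributes `text`, the 219 feature columns `a1`…`a219`, and four label columns (`label`, `label_cat`, `label_noisy`, `label_cat_noisy`), 224 in total, while lacking the `prompt` and `completion` columns the builder inferred from the other files. A minimal sketch of the same set comparison, using hypothetical miniature headers rather than the real 224-column one:

```python
# Miniature stand-ins for the two conflicting headers (illustrative only;
# the real offending header has 224 columns).
reference = {"prompt", "completion"}        # schema inferred from earlier files
offending = {"text", "a1", "a2", "label"}   # train_full.csv-style header

new_columns = offending - reference
missing_columns = reference - offending
print(f"{len(new_columns)} new columns and {len(missing_columns)} missing columns")
```

This mirrors the check the csv builder performs before casting each file to the first schema it inferred.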
Traceback:    Traceback (most recent call last):
                File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1831, in _prepare_split_single
                  writer.write_table(table)
                File "/usr/local/lib/python3.12/site-packages/datasets/arrow_writer.py", line 714, in write_table
                  pa_table = table_cast(pa_table, self._schema)
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                File "/usr/local/lib/python3.12/site-packages/datasets/table.py", line 2272, in table_cast
                  return cast_table_to_schema(table, schema)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                File "/usr/local/lib/python3.12/site-packages/datasets/table.py", line 2218, in cast_table_to_schema
                  raise CastError(
              datasets.table.CastError: Couldn't cast
              text: string
              a1 … a219: double   (219 numeric feature columns)
              label: int64
              label_cat: string
              label_noisy: int64
              label_cat_noisy: string
              -- schema metadata --
              pandas: '{"index_columns": [{"kind": "range", "name": null, "start": 0, "' + 24499
              to
              {'prompt': Value('string'), 'completion': Value('string')}
              because column names don't match
              
              During handling of the above exception, another exception occurred:
              
              Traceback (most recent call last):
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1339, in compute_config_parquet_and_info_response
                  parquet_operations = convert_to_parquet(builder)
                                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 972, in convert_to_parquet
                  builder.download_and_prepare(
                File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 894, in download_and_prepare
                  self._download_and_prepare(
                File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 970, in _download_and_prepare
                  self._prepare_split(split_generator, **prepare_split_kwargs)
                File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1702, in _prepare_split
                  for job_id, done, content in self._prepare_split_single(
                                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^
                File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1833, in _prepare_split_single
                  raise DatasetGenerationCastError.from_cast_error(
              datasets.exceptions.DatasetGenerationCastError: An error occurred while generating the dataset
              
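One way to apply the suggested fix is to declare separate configurations in the dataset card's YAML front matter, so files with different schemas are never merged into one config. A sketch under the assumption that the repository keeps its current layout (the config names are illustrative):

```yaml
configs:
  - config_name: fine-tuning
    data_files:
      - split: train
        path: fine-tuning/train.csv
      - split: test
        path: fine-tuning/test.csv
  - config_name: fine-tuning-full   # train_full.csv has a different schema
    data_files:
      - split: train
        path: fine-tuning/train_full.csv
```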


Columns: prompt (string), completion (string). Each preview row below shows a prompt followed by its completion label (polite, neutral, or impolite).
Question: how do we move the content from the existing page to the new page? How do we reference the space to the new page?
neutral
Why would you doubt that. And for that matter why wouldn't you rewrite it yourself?
impolite
Thank you for closing out this debate. Can you please restore the history for GFDL purposes?
neutral
You may want to take a look at <url>. Can you be specific on what "spamming" is in this context?
neutral
The consequences apparently have no sliding scale? It's either "you're blocked," or you're not?
impolite
Good! By the way, what do you think of merging the recurring animals in <url> to <url> and then getting rid of the rest?
polite
You didn't answer about the status of above mentioned image. Are you going to tag that image too?
polite
Please take a look at Space Shuttle Discovery, for FPC. Will you reconsider?
polite
Hi, Schutz; it's still busted. Would it be OK with you if I asked GimmeBot to take over the mainpage bolding?
polite
The deletion review page linked above contains all the instructions for requesting a review. Why did you not read these instructions?
neutral
Please answer Bishonen question to you. When were you invited to comment on UC related subjects?
neutral
Do you disagree with this reasoning?. If so, could you explain why?
polite
Thanks for approving my request. Should I remove my request?
polite
You reverted my removal of the "to critical acclaim" description from this article. I cannot see where in <url> there is a basis for that claim - it looks like a neutral factual review to me?
impolite
The link you provided to speedy delection criteria only goes up to CSD12, and there is no CSD 13 listed there. Could you clarify?
neutral
I'm not sure how I feel about you adding coordinates to locations all over the place, but rather than get into that, let's start with something easy: "<url>" does not equal "<url>". Do you need me to explain why, or was that just a typo?
neutral
Lists like this are a bad idea to begin with, and this one in particular has been deleted via an AfD. Why not try categories instead?
impolite
See <url>. I've made them, can you help fill them out and place them in the articles?
impolite
Sorry, I dont quite remember what we were talking about. Can you tell me what you were/are confused about?
polite
I'd rather not discuss matters by email, if it's avoidable. Is this a confidential issue that cannot be discussed on talk pages?
impolite
And individually changing the refs defeats the purpose, unless the genus is apart of the url, like in nomen.at. Do you think there is any to work with this?
neutral
These events are fictional. What real-world information can there be about them?
impolite
Hey three paragraphs at the end of the commercial success are uncited. Can you fix that?
neutral
A talk page is like your office desk. How does the typical office desk of a professor look like?
neutral
You told me that you couldn't access the page in question, although I had recently changed the link to its new location. Maybe you just didn't bother to check it out?
neutral
Hi friend, I've noticed you've made some good edits over at the article. Are you a big fan of McGinley's work?
polite
I need to have Jerry Eckwood added to the list of Arkansas Razorback players. How do I do this?
neutral
I suppose that there is nothing to be done about it, but Jamesd1 just buried a direct question to him, about compromise, under a long discussion about Alice Bailey's supposed religious views (Bailey was not a Christian) that belong on a Bailey discussion forum, not on this article's talk page. Does not Wikipedia have some way to resolve editing situations in which editors refuse to compromise and end the arguing?
neutral
Thanks for the help on the sides of leather question. I'm interested in citing the dictionary in the article as a reference for the meaning of "sides", but I'd like to have a little more data first — what year's edition of the OED did you cite?
polite
See <url> - not a good revert, and no reason given. Rollback?
impolite
Thank you much! May I call upon you if I make mistakes or need help, then, please?
polite
If a main or major purpose of your presence here is to cause drama then I would support your being indefed. Am I misunderstanding something?
impolite
I guess I sent you something about the <url> project and Welsh data. But you were also somehow silent on lists like gnome-i18n, weren't you?
neutral
I'll take a look. Can you give me an example?
neutral
That is, I would remove the inline reference to the date of the cited source, since it's given in the actual footnote. Is that OK with you?
polite
GorillaWarfare- I don't know if you saw my reply on my talk page. Are you interested in being my mentor for the GU De-Ba'athification article?
polite
For 2003-04 the Tony Kempster site doesn't show steps, it shows levels (this was before the steps were introduced) - <url> when steps were introduced, the KCL was classed as Step 7 - like the Essex Intermediate League (all of whose club's articles have been deleted) it has never been a Step 6 league (hence no-one objecting to the other ten articles being deleted earlier in the week. Can you reinstate the prod so save the hassle of an AfD?
neutral
ARSBot seems to not be working? Can it be turned back on?
neutral
Ok, thanks. So ok to archive the RFI then?
neutral
And, no, I don't think that everyone who wrote a screenplay should have two pages, but I do think that every artist who has a substantial list of works which would clutter up their biography page should have two pages. Can you explain to me why the two-page thing is such a big deal when wikipedia is not paper?
impolite
Who is going to contact all of the users? Is a bot being set to do that in the event it's necessary, or is it being done manually?
impolite
Why do you have this user name??
impolite
I'm in Japan timezone - and at work now - and most IRC is totally blocked through work proxies. Can we do a weekend?
polite
I think some of those "citation needed" cases are on the Oldest Britain's website. Are those considered "validated"?
neutral
Thanks for doing the DYK update - I saw it was overdue, but had no time to do the notifications (though I would have had time just to copy to the main template and purge the cache). Has there ever been discussion of having a bot do the notices (perhaps with the name of the admin who did the Update given, in case there were questions)?
impolite
Fine, but you didn't need to be an admin to make edits to articles. Or did you?
neutral
Nice tune Modernist <url>. You owe me one back?
neutral
The thread at <url> ground to a halt. Any way to get this implemented?
neutral
I have to protest that I find it rather meaningless to present arguments on the talk page for edits which are then blankly reverted by the likes of User:Historicar without explanation or discussion. Any suggestions for a more formal process or other remedies?
neutral
Your recent post to the straw poll at <url> may need some clarification. Did you intend "support going to Arbcomm" to be a vote for the "Death" entry?
polite
Hello, I noticed you recently did a lot of work on the ''<url>'' page. I was wondering if you know of the origin of the word "Nassarius" - its etymology/ what is it named after?
neutral
Fair nuff. Anyhow, somebody's birthday is coming up, is it not?
neutral
When we have a GLAM barnstar then you get it ..... just for the Hindi article! I cannot explain why you needed to create a Derby page in Hindi - surely it should be there already?
neutral
It's not a ban, it's an indefinite block. He'll know about it when he tries to edit and he'll know that it was me that blocked him, so what's the pint of telling him again on his talk page?
impolite
Do we need to go that far? Why not remove the info and lock the article for a while to prevent immediate reinsertion like what happened to <url>?
neutral
Thanks for your advice on my problems with categories. Perhaps someone should change the instructions?
polite
Hi- The Magnificent Clean-keeper- I note that you recently removed a contribution I made to the <url> page, citing lack of "notability" as the reason. I'm not saying you where wrong to do so (notability is an important criteria in the Wiki project), however, since the definition of "notability" is hard to pin down, could we discuss this particular instance, in greater detail, at some point?
impolite
You removed the border issue coverage in <url>, and your explanation was that it's covered in another article. Where is it covered?
impolite
I see you've created the page <url>. Are you by any chance related to the Hondo Hurricane?
neutral
What?! Why did you resolve this without dealing with the User I mentioned?
impolite
Why are you uploading/changing badge files here? Surely you should just upload them to Commons then list the wikipedia version for deletion?
impolite
I would like to do the same for <url>, a Houston neighborhood. How do I do this?
neutral
No, just admire them from a distance of a couple of centuries! You?
neutral
Don't add any more trash to the article. Got it?
impolite
<url> is directly from the <url> website. However it doesn't have any copyright notice should I still flag it as a copyvio?
neutral
Oh, okay. <url> should be placed as a "see also" on <url>, correct?
neutral
Hi there again, just reminding you that the review must be archived over at WP:Biography as well, not only marked as such in the article Talk page. Do you think Markus's last concern was addressed to his satisfaction?
polite
No more saga! Help me by keeping this page the way it is right now, alright?
impolite
<url> <url> version, but mighty all the same. Am I wrong?
impolite
As far as I can tell, the personal attacks against editors has indeed subsided. Are you indicating you are aware of recent incidents?
neutral
Were you aware this guys was publishing private phone numbers? Even though it's an IP, do you think something longer than 31 hours is warranted?
impolite
I'm really seeing no evidence that this image is free. Are you able to shed any light on the issue?
neutral
Hello? You going to answer this?
impolite
If you are refering to the recent work I have been doing, I think most of it has just been fixing small format errors. Could you do me a favor and let me know what edit I have mistakenly marked as '''m'''inor so that I can watch out for mistakenly labeling such edits as '''m'''inor in the future?
impolite
Thankyou Fail, but the template I want is the one that goes at the bottom of the article and produces a message that goes something like 'This article is a copy of the (X) article in the German wikipedia, version 21 December 2009 10:04.' What is that template?
neutral
I noticed you have improved the "human rights" section on the article <url>. Could you update the main article: <url>?
neutral
The block you created on April 18th has expired and a new rash of vandalism has broken out. Would you mind blocking this school for even longer than 1 month?
neutral
Actually, I am concerned that you had completely removed verifiable facts of which I had included the references. Would you please explain why you had removed the entire section I wrote on Collection Agencies in Canada?
impolite
The effect of these edits has been to circumvent policy and process to move the above article to non-verified, non-English article names without a discussion or consensus. Who will put this right?
neutral
But someone actually remade the article here, http://en.wikipedia.org/wiki/User:DOSGuy/Joe_Siegler it's a subset of someone's user profile, but they are essentially circumventing a deleted article. Is this against wikipedia policy?
polite
OK done. When do I get my money?
neutral
Hey again, more about categories - if we really need a <url>, it should be a sub-category of <url>, and then you don't need both categories on the same page, just the Justinian one. See?
neutral
Again samething with the second site, again outside of taking they're word for it how do we know they are are right. What is your opinion on the 2 logos in the forum discussion think they were made up?
impolite
I have your CH2.js installed, but I forgot what it does. What does it do?
neutral
The text of your warning <url> made me smile. Did you really think that if China got a new Prime Minister as of today, and a female one at that, it wouldn't be all over the news, and the link to this person would be red?
neutral
Thank you for correcting the error on my user page. I have a question how does one go about archiving his talk history?
neutral
Hey Love, as you wish, if you have any suggestions about my section, let me know. Is English a second language for you?
impolite
yes i'm the one who is ashish. are you also on ukmix?
neutral
There is a lot to learn - but to date I have concentrated on content, followed by style. If I have to keep removing silly comments, though, I will get discouraged - is there an easy way to reverse somebody's goofing?
impolite
Okay. How's what I wrote on his talk page look?
impolite
I know you're disappointed. What is the plan?
impolite
I have no clue. Isn't there a WikiProject on radio stations you can ask?
neutral
I'm still scratching my head. How can this be?
neutral
Ok, thanks for that. Could you please tell me why the Kiwa page is giving warning about notability and citations, where as for example this one <url> is not?
polite
I had an editorial dispute with a user called <url>, but I'm not responding to him anymore as it seems to escalate into heated arguments. If he keeps messaging, would it be ok if I got back in touch with you and you had a word with him?
neutral
The press is calling you bisexual. Did you know that?
neutral
Indeed. I trust you have already watchlisted this editor?
polite
I thought that only section tags go into sections while the article tags go to the top of the page so they could serve it's purpose. Is there a Wikipedia guideline that can substantiate your revert?
impolite
Why did you rollback <url>? It appears by a quick scan that the user was correct re <url> - but regardless seemed a good faith edit, one certainly not worthy of a vandalism rollback I would think?
impolite
Despite our long-standing disagreement, I have no argument with you <url>. However, can I suggest you use a less opaque edit summary than ''added info'' when doing something that is nonetheless controversial?
neutral
End of preview.

Stanford Politeness Dataset

This dataset contains politeness classification data derived from the Stanford Politeness Corpus, prepared for active-learning and fine-tuning experiments.

Dataset Description

The dataset is organized into two main directories:

Active Learning

  • X_labeled_full.csv - Labeled examples
  • X_unlabeled.csv - Unlabeled examples for active learning
  • extra_annotations.npy - Additional annotation data
  • test.csv - Test set
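The `.npy` file is loaded with NumPy rather than pandas. The array's shape and dtype are not documented here, so the round-trip below uses a made-up array purely to show the mechanics; `extra_annotations.npy` would be read the same way once downloaded (e.g. with `hf_hub_download` as in the Usage section below):

```python
import io

import numpy as np

# Made-up annotation matrix, for demonstration only.
annotations = np.array([[0, 1], [1, 1], [2, 0]])

# Round-trip through an in-memory buffer instead of a downloaded file.
buf = io.BytesIO()
np.save(buf, annotations)
buf.seek(0)

loaded = np.load(buf)
print(loaded.shape)   # (3, 2)
```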

Fine-tuning

  • train.csv - Training set
  • train_fixed.csv - Fixed training set
  • train_full.csv - Full training set
  • test.csv - Test set

Usage

import pandas as pd
from huggingface_hub import hf_hub_download

# Download a specific file
file_path = hf_hub_download(
    repo_id="Cleanlab/stanford-politeness",
    filename="fine-tuning/train.csv",
    repo_type="dataset"
)
df = pd.read_csv(file_path)
print(df.head())
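Note that `train_full.csv` does not share the `prompt`/`completion` schema of the other fine-tuning files (this is the cause of the viewer error above), so it is worth checking columns before concatenating files. A self-contained sketch with small in-memory stand-ins for the two kinds of file:

```python
import io

import pandas as pd

# In-memory stand-ins: train.csv uses prompt/completion, while
# train_full.csv carries text + feature + label columns instead.
train_csv = io.StringIO("prompt,completion\nHello?,polite\nGot it?,impolite\n")
full_csv = io.StringIO("text,a1,label\nHello?,0.5,1\n")

frames = {"train.csv": pd.read_csv(train_csv),
          "train_full.csv": pd.read_csv(full_csv)}
schemas = {name: tuple(df.columns) for name, df in frames.items()}

# Only concatenate files whose columns actually match the target schema.
target = ("prompt", "completion")
matching = [df for name, df in frames.items() if schemas[name] == target]
combined = pd.concat(matching, ignore_index=True)
print(len(combined))   # rows from the schema-matching files only
```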

License

MIT License
