---
annotations_creators:
- crowdsourced
language_creators:
- unknown
language:
- en
license:
- cc-by-4.0
multilinguality:
- unknown
size_categories:
- unknown
source_datasets:
- original
task_categories:
- summarization
task_ids: []
pretty_name: squality
---
# Dataset Card for GEM/squality
## Dataset Description
- **Homepage:** https://github.com/nyu-mll/SQuALITY
- **Repository:** https://github.com/nyu-mll/SQuALITY/data
- **Paper:** https://arxiv.org/abs/2205.11465
- **Leaderboard:** N/A
- **Point of Contact:** Alex Wang
### Link to Main Data Card
You can find the main data card on the [GEM Website](https://gem-benchmark.com/data_cards/squality).
### Dataset Summary
SQuALITY (Summarization-format QUestion Answering with Long Input Texts, Yes!) is a summarization dataset that is:
* Abstractive
* Long-input: The input documents are short stories of 3000--6000 words.
* Question-focused: Each story is associated with multiple question-summary pairs.
* Multi-reference: Each question is paired with 4 summaries.
* High-quality: The summaries are crowdsourced from skilled and trained writers.
You can load the dataset via:
```python
import datasets
data = datasets.load_dataset('GEM/squality')
```
The data loader can be found [here](https://huggingface.co/datasets/GEM/squality).
#### website
[Github](https://github.com/nyu-mll/SQuALITY)
#### paper
[ArXiv](https://arxiv.org/abs/2205.11465)
#### authors
Alex Wang (NYU); Angelica Chen (NYU); Richard Yuanzhe Pang (NYU); Nitish Joshi (NYU); Samuel R. Bowman (NYU)
## Dataset Overview
### Where to find the Data and its Documentation
#### Webpage
<!-- info: What is the webpage for the dataset (if it exists)? -->
<!-- scope: telescope -->
[Github](https://github.com/nyu-mll/SQuALITY)
#### Download
<!-- info: What is the link to where the original dataset is hosted? -->
<!-- scope: telescope -->
[Github](https://github.com/nyu-mll/SQuALITY/data)
#### Paper
<!-- info: What is the link to the paper describing the dataset (open access preferred)? -->
<!-- scope: telescope -->
[ArXiv](https://arxiv.org/abs/2205.11465)
#### BibTex
<!-- info: Provide the BibTex-formatted reference for the dataset. Please use the correct published version (ACL anthology, etc.) instead of google scholar created Bibtex. -->
<!-- scope: microscope -->
```
@article{wang2022squality,
title={S{Q}u{ALITY}: Building a Long-Document Summarization Dataset the Hard Way},
author={Wang, Alex and Pang, Richard Yuanzhe and Chen, Angelica and Phang, Jason and Bowman, Samuel R.},
journal={arXiv preprint arXiv:2205.11465},
year={2022}
}
```
#### Contact Name
<!-- quick -->
<!-- info: If known, provide the name of at least one person the reader can contact for questions about the dataset. -->
<!-- scope: periscope -->
Alex Wang
#### Contact Email
<!-- info: If known, provide the email of at least one person the reader can contact for questions about the dataset. -->
<!-- scope: periscope -->
wangalexc@gmail.com
#### Has a Leaderboard?
<!-- info: Does the dataset have an active leaderboard? -->
<!-- scope: telescope -->
no
### Languages and Intended Use
#### Multilingual?
<!-- quick -->
<!-- info: Is the dataset multilingual? -->
<!-- scope: telescope -->
no
#### Covered Dialects
<!-- info: What dialects are covered? Are there multiple dialects per language? -->
<!-- scope: periscope -->
stories: 1930--1970 American English
summaries: modern American English
#### Covered Languages
<!-- quick -->
<!-- info: What languages/dialects are covered in the dataset? -->
<!-- scope: telescope -->
`English`
#### Whose Language?
<!-- info: Whose language is in the dataset? -->
<!-- scope: periscope -->
stories: 1930--1970 American science fiction writers (predominantly American men)
summaries: Upwork writers (college-educated, native-English) and NYU undergraduates (English-fluent college students)
#### License
<!-- quick -->
<!-- info: What is the license of the dataset? -->
<!-- scope: telescope -->
cc-by-4.0: Creative Commons Attribution 4.0 International
#### Intended Use
<!-- info: What is the intended use of the dataset? -->
<!-- scope: microscope -->
summarization research
#### Primary Task
<!-- info: What primary task does the dataset support? -->
<!-- scope: telescope -->
Summarization
#### Communicative Goal
<!-- quick -->
<!-- info: Provide a short description of the communicative goal of a model trained for this task on this dataset. -->
<!-- scope: periscope -->
Given a question about a particular high-level aspect of a short story, provide a summary about that aspect in the story (e.g., plot, character relationships, setting, theme, etc.).
### Credit
#### Curation Organization Type(s)
<!-- info: In what kind of organization did the dataset curation happen? -->
<!-- scope: telescope -->
`academic`
#### Curation Organization(s)
<!-- info: Name the organization(s). -->
<!-- scope: periscope -->
New York University
#### Dataset Creators
<!-- info: Who created the original dataset? List the people involved in collecting the dataset and their affiliation(s). -->
<!-- scope: microscope -->
Alex Wang (NYU); Angelica Chen (NYU); Richard Yuanzhe Pang (NYU); Nitish Joshi (NYU); Samuel R. Bowman (NYU)
#### Funding
<!-- info: Who funded the data creation? -->
<!-- scope: microscope -->
Eric and Wendy Schmidt; Apple; NSF
#### Who added the Dataset to GEM?
<!-- info: Who contributed to the data card and adding the dataset to GEM? List the people+affiliations involved in creating this data card and who helped integrate this dataset into GEM. -->
<!-- scope: microscope -->
Alex Wang (NYU)
### Dataset Structure
#### Data Fields
<!-- info: List and describe the fields present in the dataset. -->
<!-- scope: telescope -->
* metadata: the Project Gutenberg ID (`passage_id`), an internal UID (`uid`), and the Project Gutenberg license
* document: the story
* questions: a list where each element contains
  * question_text: the question
  * question_number: the order in which workers answered the question
  * responses: a list where each element contains
    * worker_id: an anonymized worker ID
    * uid: an internal UID
    * response_text: the response
#### Reason for Structure
<!-- info: How was the dataset structure determined? -->
<!-- scope: microscope -->
The dataset groups responses by question (for ease of multi-reference training and evaluation) and questions by story (to avoid duplicating the story text in the dataset).
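Because of this nesting, a common preprocessing step is to flatten each story record into one example per question, with that question's four responses serving as the reference set. Below is a minimal sketch over a tiny hand-built record that follows the schema above (field names are taken from the example instance; the `flatten` helper is illustrative and not part of the data loader):

```python
# Flatten a SQuALITY-style nested record (responses grouped by question,
# questions grouped by story) into multi-reference summarization examples.

def flatten(record):
    """Yield one {document, question, references} example per question."""
    for q in record["questions"]:
        yield {
            "document": record["document"],
            "question": q["question_text"],
            "references": [r["response_text"] for r in q["responses"]],
        }

# A tiny stand-in record following the documented schema (not real data).
record = {
    "metadata": {"passage_id": "63833", "uid": "abc123", "license": ""},
    "document": "Story text ...",
    "questions": [
        {
            "question_text": "What is the plot of the story?",
            "question_number": 1,
            "responses": [
                {"worker_id": "6", "uid": "u1", "response_text": "Summary A."},
                {"worker_id": "1", "uid": "u2", "response_text": "Summary B."},
            ],
        }
    ],
}

examples = list(flatten(record))
print(examples[0]["question"])    # What is the plot of the story?
print(examples[0]["references"])  # ['Summary A.', 'Summary B.']
```

In the real dataset each question carries four responses rather than two, so each flattened example provides four references for evaluation.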
#### Example Instance
<!-- info: Provide a JSON formatted example of a typical instance in the dataset. -->
<!-- scope: periscope -->
```
{"metadata": {"passage_id": "63833", "uid": "ea0017c487a245668698cf527019b2b6", "license": ""}, "document": "Story omitted for readability", "questions": [{"question_text": "What is the plot of the story?", "question_number": 1, "responses": [{"worker_id": "6", "uid": "0c27bef1b7b644ffba735fdb005f9529", "response_text": "Brevet Lieutenant Commander David Farragut Stryakalski III, AKA Strike, is charged with commanding a run-down and faulty vessel, the Aphrodite. Aphrodite was the brain-child of Harlan Hendricks, an engineer who ushered in new technology ten years back. All three of his creations failed spectacularly, resulting in death and a failed career. The Aphrodite was the only ship to survive, and she is now used for hauling mail back and forth between Venus and Mars.\nStrike and Cob, the Aphrodite\u2019s only executive to last more than six months, recount Strike\u2019s great failures and how he ended up here. He used to fly the Ganymede, but was removed after he left his position to rescue colonists who didn\u2019t need rescuing. Strike was no longer trustworthy in Admiral Gorman\u2019s eyes, so he banished him to the Aphrodite. \nThe circuit that caused the initial demise of Aphrodite was sealed off. After meeting some members of his crew, Strike orders a conference for all personnel and calls in an Engineering Officer, one I.V. Hendricks. \nAfter Lieutenant Ivy Hendricks arrives--not I.V.--Strike immediately insults her by degrading the ship\u2019s designer, Harlan Hendricks. As it turns out, Hendricks is his daughter, and she vows to prove him wrong and all those who doubted her father. \nDespite their initial conflict, Strike and Hendricks\u2019 relationship soon evolves from resentment to respect. During this time, Strike\u2019s confidence in the Aphrodite plummets as she suffers from mechanical issues. \nThe Aphrodite starts to heat up as they get closer to the sun. The refrigeration units could not handle the heat, causing discomfort among the crew. 
As they get closer, a radar contact reveals that two dreadnaughts, the Lachesis and the Atropos, are doing routine patrolling. Nothing to worry about, except the Atropos had Admiral Gorman on board, hated by Strike and Hendricks.\nStrike and Hendricks make a joke about Gorman falling into the sun. As the temperature steadily climbs, the crew members overheat and begin fighting, resulting in a black eye. A distress signal came through from the Lachesis: the Atropos, with Gorman on board, was tumbling into the sun. The Lachesis was attempting to rescue them with an unbreakable cord, but they too were being pulled in. \nHendricks had fixed the surge-circuit rheostat, the one her father designed, and claimed it could help them rescue the ships. After some tension, Strike agrees and they race down to the sun to pick up the drifting dreadnaughts. \nStrike puts Hendricks in charge, but soon the heat overtakes her, and she is unable to continue. Strike takes over, attaches the Aphrodite to the Lachesis with a cord, and turns on the surge-circuit. They blast themselves out of there, rescuing the two ships and Admiral Gorman at the same time. \nCob and Strike are awarded Spatial Cross awards, while Hendricks is promoted to an engineering position at the Bureau of Ships. The story ends with Cob and Strike flipping through the pages of an address book until they land on Canalopolis, Mars. \n"}, {"worker_id": "1", "uid": "04e79312dede4a0da5993101e55a796a", "response_text": "Strike joins the crew of the Aphrodite after he has made several poor decisions while he was the captain of another spaceship. He is essentially being punished by his boss, Gorman, and put somewhere where he can do little harm. His job is to deliver the mail from Venus to Mars, so it\u2019s pretty straightforward. \n\nWhen he meets the Officer of the Deck, Celia Graham, he immediately becomes uncomfortable. He does not like to work with women in space, although it\u2019s a pretty common occurrence. 
He holds a captain\u2019s meeting the first day on the job, and he waits to meet his Engineering Officer, I.V. Hendricks. He makes a rude comment about how the man is late for his first meeting, but actually, the female Ivy has already shown up. \n\nAfter meeting Ivy formally, he makes a comment about how the ship Aphrodite was built by an imbecile. Ivy immediately tells him that he\u2019s wrong, and she knows this because the designer of the ship was none other than her own father. \n\nHis first week as captain on the new ship goes very poorly. Several repairs need to be done to Aphrodite, they run behind schedule, and the new crew members have a tough time getting a handle on Aphrodite\u2019s intricacies. \n\nThe heat index in the ship begins to rise, and the crew members can no longer wear their uniforms without fainting. Suddenly a distress call comes in, and it\u2019s coming from the Atropos, a ship Captained by Gorman, and the Lachesis. The crew members hesitate to take the oldest and most outdated machinery on a rescue trip. Strike has been in trouble for refusing to follow commands before, and he knows it\u2019s a risky move. However, Ivy insists that she knows how to pilot the Aphrodite, and she can save the crew members on the Atropos and the Lachesis from death. They are quickly tumbling towards the sun, and they will perish if someone doesn\u2019t do something quickly. \n\nIvy takes control of the ship, and the heat on the Aphrodite continues to rise steadily. Eventually, she faints from pure heat exhaustion, and she tells Strike that he must take over. He does, and he manages to essentially lasso the other two ships, and with just the right amount of power, he pulls them back into orbit. \n\nAt a bar, after the whole ordeal, Cob pokes fun at Strike for staying on the Aphrodite. He then admits that he actually respects Strike\u2019s loyalty to the ship that saved his reputation. 
Cob asks about Strike\u2019s relationship with Ivy, but Strike tells him that she has taken her dad\u2019s former job, so she no longer works with him. Strike takes the moment to look up her info, presumably to restart the relationship. \n"}, {"worker_id": "5", "uid": "71efb8636b504f42a6989bb90e360186", "response_text": "The narrative follows commander Strike as he begins his command of the spaceship Aphrodite. Strike comes from a long line of military greats but himself is prone to poor professional decision making.\n\nAs he takes command, the mission is a simple mail run. However, in the course of their journey, they receive word of two ships in dire need of rescue. Strike and his engineering officer, Ivy Hendricks, decide to use the ships extremely risky surge-circuit to aid the ships.\n\nThe rescue is a success and the crew is hailed for its bravery in saving the doomed vessels. "}, {"worker_id": "3", "uid": "8aa46ba8bd2945c98babd7dd2d9ecc38", "response_text": "The story starts in a muddy swamp on Venus, where Strike, a Brevet Lieutenant Commander, is encountering his new ship, the Aphrodite, for the first time. Here on Venusport Base, he is introduced to the executive officer of the ship, a man who goes by Cob. Strike comes from a line of servicemen who were all well respected, but he himself has more of a reputation for causing trouble by saying the wrong things or deviating from mission plans. His reputation preceded him, as Cob had specific questions about some of these events. The Aphrodite was incredibly impressive when it was designed, but did not live up to its expectations. It had been refitted, and the new mission that Strike was to lead was a mail run between Venus and Mars. As he entered the ship, Strike began to meet his new crew, including Celia Graham, his Radar Officer. Strike is not used to women being on ships and is decidedly uncomfortable with the idea. 
As he is briefing the officers who were already present, Strike is surprised when he meets his new engineering officer, Ivy Hendricks. Ivy is the daughter of the man who designed the ship, and she is cold to Strike at first, as he is to her. However, her expertise in engineering generally, the ship specifically, and other skills as well as piloting, meant that Strike warmed up to her as their mission went on. As the ship was flying towards Mars on their route, the crew picked up a distress signal from the Lachesis, which was trying to pull the Atropos away from the gravitational pull of the sun after it was damaged in an equipment malfunction. The Admiral who had put Strike in charge of the Aphrodite was on the Atropos, and Ivy dislikes him even more than Strike does, but they know they have to try to save the crews. Strike is hesitant, but Ivy has a plan and insists that they try. She has spent all of her free time tinkering with the circuits, and takes charge. She turned the Aphrodite towards the ships in danger, and sends out a cable to connect the Aphrodite to those ships. After they are all connected, the ships continue to spin towards the sun, which causes Ivy to pass out, leaving Strike in charge. He manages to pull the ships into line and send the Aphrodite in the right direction before passing out himself. The Aphrodite has the power to pull everyone away from the Sun\u2019s gravity, but the acceleration knocks everyone out on all three ships. In the end, it was a successful rescue mission of multiple crews. 
Strike and Cob find themselves in an officer\u2019s club at the end of the story, discussing Ivy\u2019s new job, and Strike acknowledges that Cob is right about the Aphrodite having grown on him, and plans to stay its captain."}]}, {"question_text": "Who is Ivy Hendricks and what happens to her throughout the story?", "question_number": 2, "responses": [{"worker_id": "6", "uid": "0c27bef1b7b644ffba735fdb005f9529", "response_text": "Lieutenant Ivy Hendricks is the daughter of Harlan Hendricks, a formerly respected engineer. He created the surge-circuit, an innovation in interstellar astrogation, and he was awarded a Legion of Merit. He designed three famous ships: the Artemis, the Andromeda, and the Aphrodite, the prototype. Despite being hailed as the latest and greatest in technology, all three ships either exploded or failed. \nAccording to Lieutenant Ivy Hendricks, their failures were due to the lack of education on board. She claimed that her father asked for the crew members to be trained in surge-circuit technology, so they could use it properly and correctly. That wish was not granted and after all three ships failed, his reputation and career were doomed. Admiral Gorman pulled the plug on his career and therefore became the target of all Lieutenant Hendricks\u2019 hate. \nWith a bone to pick, Lieutenant Hendricks, a knowledgeable engineer herself, comes aboard the Aphrodite to serve as her engineer and occasional pilot. She wants to prove to the world that her father\u2019s creation was genius and deserving of praise. \nAlthough they started off on the wrong foot, Lieutenant Hendricks and Strike, her commander, develop a friendship and appreciation for each other. They bond over their deep hatred of Admiral Gorman and the joy of piloting a ship. She soon proves herself to Strike, and he begins to trust her. Their relationship walks the fine line between friendship and romance. 
\nAs the Aphrodite is attempting to rescue the fallen dreadnaughts, Lieutenant Hendricks comes up with the solution. Due to her constant tinkering on the ship, she had fixed the surge-circuit rheostat and made it ready to use. Initially, no one trusts her, seeing as the last time it was used people died. But Strike\u2019s trust in her is strong and true, so he approves the use of the surge-circuit. Hendricks pilots the ship, but soon becomes too overheated and comes close to fainting. Strike takes over piloting and eventually activates the surge-circuit. It works and they are able to rescue the two ships, one of which had Admiral Gorman, her sworn enemy, onboard. \nLieutenant Hendricks receives a major promotion; she is now an engineer at the Bureau of Ships. She proved them wrong, and restored her father\u2019s legacy and good name. The story ends with their romance left in the air, but Hendricks has much to be proud of. \n"}, {"worker_id": "1", "uid": "04e79312dede4a0da5993101e55a796a", "response_text": "\nLieutenant Ivy Hendricks is the new Engineering Officer on Aphrodite. Strike and Cob assume that Ivy is a man before she arrives because they are sexist and because her name is listed as I.V. in the orders. Ivy is actually the daughter of the man who designed the award-winning craft.\n\nShe is cold and unfriendly towards Strike after she meets him, and that\u2019s probably because he makes a rude comment about the ship which her father created. After a couple weeks of working together, the two begin to get along very well. Strike admires Ivy\u2019s piloting skills and her depth of knowledge about the Aphrodite. \n\nThe two also bond over their shared hatred of Strike\u2019s former boss, Gorman. Strike feels as though he has ruined his career, and Ivy thinks that Gorman torpedoed her father\u2019s career. Ivy wants nothing more than to prove that Gorman is an idiot. 
\n\nHowever, when Gorman\u2019s ship is hurtling towards the sun and he and his crew members are about to die, Ivy sees that it\u2019s the perfect opportunity to show Gorman just how wrong he was about the ship her father designed. It\u2019s a very dangerous mission, but Ivy is steadfast in her decision and she\u2019s deeply courageous. She pilots the ship for most of the rescue mission, but eventually faints from the extreme heat. She tells Strike that he needs to take over, and he does a great job. \n\nIvy is then promoted, and she moves to Canalopolis, Mars. She now outranks her former Captain, Strike. \n"}, {"worker_id": "5", "uid": "71efb8636b504f42a6989bb90e360186", "response_text": "Ivy Hendricks is the engineering officer assigned to the Aphrodite. She is the daughter of Harlan Hendricks, the ship's original designer. She is fiercely protective of her father's legacy and resents Admiral Gorman for the way he treated him.\n\nHendricks and Strike, form an alliance of sorts after his initial surprise of seeing a woman assigned to this officer's role. When news arrives that two ships are in danger of falling into the sun, Ivy lobbies to use her father's technology to save the ship. Strike agrees to her plan although the risks are high. The Aphrodite eventually saves the ships although Ivy faints in the process from the heat and command has to be taken over by Strike.\n\nThe successful mission results in a promotion for Ivy as she works as a designer in the Bureau of Ships like her father."}, {"worker_id": "3", "uid": "8aa46ba8bd2945c98babd7dd2d9ecc38", "response_text": "Ivy Hendricks is the new engineering officer on the Aphrodite, having been transferred from the Antigone. She is a tall woman with dark hair and contrasting pale blue eyes, who has a very wide range of experience in ship operations and engineering. Her father, Harlan Hendricks, was the man who designed the Aphrodite, so she knows the ship needs a lot of specific training. 
At first, the captain did not expect her to be a woman, and managed to imply that many people found her father incompetent. Although she seemed cold at first, as she reacted to the situation, she and the captain eventually got along fairly well, as he learned to appreciate her wide skill set that ranged from engineering to piloting. Ivy and Strike also had a common enemy in the higher ranks: Space Admiral Gorman. Once Spike trusted her he appreciated that Ivy spent a lot of spare time working on the old circuits, so she knew the ship like the back of her hand. When the Aphrodite found the Lachesis and the Atropos when following up on a distress signal, Ivy new the ship well enough to be able to formulate a plan to save everyone. She piloted the Aphrodite carefully, using cables shot with a rocket to connect the three ships together, but the spinning of the ships in the heat inside meant that she passed out and had to leave Strike to take over for her. Her plan was successful; she was promoted, and instead of returning to the Aphrodite she started a design job with the Bureau of Ships."}]}, {"question_text": "What is the relationship between Strike and Aphrodite?", "question_number": 3, "responses": [{"worker_id": "6", "uid": "0c27bef1b7b644ffba735fdb005f9529", "response_text": "Strike is a member of a famous, well-behaved, and well-trained service family. His father and grandfather served in World War II and the Atomic War, respectively. Both earned medals for their heroic service. Strike, however, did not follow in his family\u2019s footsteps. \n\tWith a tendency to say the wrong thing at the wrong time, Strike often offended those around him and garnered a negative reputation. After being put in charge of the Ganymede, he soon lost his position after abandoning his station to rescue colonists who were not in danger. As well, he accused a Martian Ambassador of being a spy at a respectable ball. 
Admiral Gorman soon demoted him, and he became the commander of the Aphrodite. \n\tAt first, Strike was not a fan. He sees her as ugly, fat, and cantankerous. He misses the Ganymede, a shiny and new rocketship, and views the Aphrodite as less-than. \n\tWithin the first week of flying her, the Aphrodite had a burned steering tube, which made it necessary to go into free-fall as the damage control party made repairs. Strike\u2019s faith in Lover-Girl continued to plummet. \n\tHowever, after Lieutenant Hendricks, the resident engineer, got her hands on the Aphrodite, Strike\u2019s opinion started to change. Her knowledge of the ship, engineering, and piloting helped him gain confidence in both her abilities and those of Aphrodite.\nNear the end of the story, the Aphrodite is tasked with rescuing two ships that are falling into the sun. Previously Lieutenant Hendricks had fixed up the surge-circuit rheostat, and so she offered it up as the only solution. Strike agrees to try it, which shows his faith and trust in the Aphrodite. Luckily, all things go to plan, and the Aphrodite, with Strike piloting, is able to save the two ships and Admiral Gorman. \nAfter Strike won a medal himself, finally following in the family footsteps, he is offered his old position back on the Ganymede. He refuses, and instead returns to old Lover-Girl. He has grown fond of her over the course of their adventure, and they develop a partnership. "}, {"worker_id": "1", "uid": "04e79312dede4a0da5993101e55a796a", "response_text": "Strike is completely unimpressed by the rocket ship Aphrodite. He comments that she looks like a pregnant carp, and he knows that he\u2019s been assigned captain of the ship because he messed up terribly on his other missions. \n\nAphrodite was built 10 years ago, and now she is completely outdated and a laughing stock compared to the other spaceships in the fleet. She was designed by Harlan Hendricks, and the engineer received a Legion of Merit award for her design. 
\n\nStrike\u2019s mission is to fly Aphrodite to take the mail from Venusport to Canalopolis, Mars. It\u2019s boring and straightforward.\n\nWhen a disaster occurs and two other ships, the Atropos and the Lachesis, are in serious danger of getting too close to the sun, Strike agrees to take the old girl on a rescue mission. He is convinced by Ivy, since she knows the ship better than anyone else and she believes in her. \n\nAlthough Ivy takes Aphrodite most of the way there, its Strike who finishes the mission and saves his former boss, Gorman, and many other people from certain death. Aphrodite is the entire reason that Strike is able to mend his terrible reputation and he wins back respect from Gorman. Although they got off to a rocky start, Strike finds it impossible to leave his best girl, even when he is offered a job on another ship. He is loyal to the ship that made him a hero. \n"}, {"worker_id": "5", "uid": "71efb8636b504f42a6989bb90e360186", "response_text": "Strike is assigned to be commander of the spaceship Aphrodite. The ship is assigned as a mail carrier for the inner part of the solar system. The Aphrodite is a dilapidated design with an awful reputation. Strike ended up with the Aphrodite as a result of a series of poor professional decisions that resulted in him getting command of the more prestigious ship Ganymede taken away from him.\n\nHis initial impression of the Aphrodite softens to a grudging respect after the successful mission to save the Atropos and Lachesis. Although he presumably is in line to command the Ganymede again, another faux pas resulting in Strike continuing to command the Aphrodite. "}, {"worker_id": "3", "uid": "8aa46ba8bd2945c98babd7dd2d9ecc38", "response_text": "At the beginning of the story, Strike is very reluctant to accept Aphrodite, because being in charge of the ship means a demotion for him. 
His perception of the ship at the beginning of the story is colored by this history, and his first impression of the ship is not a positive one, even from the outside. Besides the actual construction of the ship, the technology that ran it was not something he showed much faith in. The first week that he was in charge after leaving Venus, it seemed things were going drastically wrong. When one important piece of equipment burnt out, the ship went into freefall, requiring a lot of repair work from the engineers, and anyone in charge of navigation was handed more work because of this as well. The ship was really put to the test when the Aphrodite responded to the distress call from the Lachesis, whose crew was trying to keep the Atropos from falling into the sun. Because Ivy knew the Aphrodite so well, and had been working on the circuits, it turned out the Aphrodite was the perfect ship to save the day. She could not see the rescue all the way through to the end, because she passed out early, but Strike was conscious a little bit longer and took over until he also passed out. After this unexpected rescue mission, Cob, the Executive Officer, noted that Strike has a newfound appreciation for the ship, and has no intention of leaving. Strike is dedicated to his new mission, even though at the beginning of the story he wanted nothing more than to pilot something the same rank as his old ship."}]}, {"question_text": "Describe the setting of the story.", "question_number": 4, "responses": [{"worker_id": "6", "uid": "0c27bef1b7b644ffba735fdb005f9529", "response_text": "Jinx Ship to the Rescue by Alfred Coppel, Jr. takes place in space, but more specifically in the Aphrodite. \n\tIt starts in the muddy Venusport Base on Venus. Venusport is famous for its warm, slimy, and green rain that falls for 480 hours of every day. A fog rolls in and degrades visibility. 
\n\tDespite starting on Venusport Base, the characters actually spend most of their time onboard the Aphrodite, a Tellurian Rocket Ship. The Aphrodite had a surge-circuit monitor of twenty guns built into her frame. She was bulky, fat, and ugly, and occasionally had some technical and mechanical struggles as well. \n\tAlthough her frame may not be appealing, she soon becomes victorious as she gains the trust of Strike and other members of his crew and saves two fallen dreadnaughts. With her surge-circuit rheostat rebuilt, the Aphrodite is finally able to accomplish what she was always meant to. "}, {"worker_id": "1", "uid": "04e79312dede4a0da5993101e55a796a", "response_text": "The story starts on the planet of Venus. Venus has days that are 720 hours long, and rain is common. The rain is hot, slimy, and green, and it makes the already wet swamplands even more mushy. Fog is common on Venus.\n\nThe middle of the story takes place on the old and outdated ship, Aphrodite. She gives the crew members a lot of trouble on their first mission. She is in dire need of repairs, she\u2019s slow, and it\u2019s impossible to control her temperature. The crew members are unable to wear their uniforms because the temperature is over 100 degrees. \n\nAphrodite\u2019s mission is simple. She needs to take the mail from Venus to Mars, and it\u2019s the only thing she can be trusted to do successfully. So it\u2019s very impressive when she ends up being the hero of the day and manages to rescue two other ships that are headed towards the sun. \n"}, {"worker_id": "5", "uid": "71efb8636b504f42a6989bb90e360186", "response_text": "The narrative is set in the early 21st century primarily aboard the spaceship Aphrodite. The ship's mission is to deliver mail in the inner part of the solar system.\n\nThe ships route takes them around the sun and as a result the ambient temperature inside the ship begins to rise to intolerable levels due to proximity to the sun. 
Because of the heat, the coed crew is allowed to operate with very little clothing. Aphrodite is a ship of an outdated design that gives it a lack of comfort and subjects it to numerous small problems that make its operation frustrating."}, {"worker_id": "3", "uid": "8aa46ba8bd2945c98babd7dd2d9ecc38", "response_text": "The story starts at a spaceport on Venus, where it has been raining for hundreds of hours straight. The rain has stopped by the time the story starts, but it is left a lot of mud in the swampy marshes. It was nearing the end of the day, and the fog was enveloping the surroundings as it grew darker outside. It was hot and sticky at Venusport Base, but after Strike left the service on his mission in the Aphrodite, it would only grow hotter on board. The ship itself, where most of the story takes place, is an older, refitted, bulky type of ship. There were only two others like it, and their designer had been awarded a Legion of Merit for the three. However, this is the only one still in use, as the others were destroyed in a much earlier mission. Strike\u2019s disappointment in the ship seems to mirror the sentiment. Inside the ship, there are many systems of pipes connected the control panels, and the captain had to navigate carefully so that he didn\u2019t hit his head on the bulkhead. While in space, as the ship flew closer and closer to the sun, the interior of the ship grew hotter and hotter. The crew opted to wear as little clothing as possible in an attempt to handle the heat. When the Aphrodite received the distress call from the Lachesis, the ships were close enough to the sun to be affected by its gravitational pull. After the close call near the sun, once everyone regained consciousness, the story ends at an officer\u2019s club on Mars. 
It was a formal environment, and the Aphrodite\u2019s captain and executive officer planned the rest of their route from there."}]}, {"question_text": "Who is Strike and what happens to him throughout the story?", "question_number": 5, "responses": [{"worker_id": "6", "uid": "0c27bef1b7b644ffba735fdb005f9529", "response_text": "Strike is a member of an esteemed service family on Venus; seven generations of well-behaved and well-trained operators. Unfortunately, Strike struggles to carry on the family tradition, and is known for misspeaking and offending those around him. By trusting his gut, he wound up failing his higher-ups and crew several times. All this culminated in an eventual mistrust of Strike, which led to him being charged with the Aphrodite. \n\tHis deep hatred of Space Admiral Gordon is passionate, but not without reason. Gordon is the one who demoted him to the Aphrodite. At the start, Strike is checking out his new vessel and notes how ugly the ship is. After examining the ship and it\u2019s crew, it is revealed that Strike is uncomfortable around women and believes they don\u2019t belong on a spaceship. \n\tIn order to start flying, he calls in an expert engineer to come aboard and travel with them. Thinking I.V. Hendricks is a man, he is excited to have them onboard. But when Ivy Hendricks shows up, a female engineer and the daughter of the Aphrodite\u2019s creator, his world is soon turned upside down. \n\tHis initial negative reaction to her is soon displaced by begrudging appreciation and eventually trust and friendship. Hendricks proves his previous theories about women wrong, and Strike is forced to accept that perhaps women do belong on a spaceship. She especially impresses him with her total knowledge of spaceship engineering and the Aphrodite in general. And it helped that she hated Admiral Gorman just as much as Strike, if not more. 
\n\tWhile flying by the sun to deliver mail, the Aphrodite receives a distress call from two ships: the Lachesis and the Atropos, the latter of which carried Admiral Gorman onboard. After the Aphrodite reached orbit, the Lachesis reached out and reported the Atropos was falling into the sun, due to a burst chamber. They couldn\u2019t move those onboard over thanks to all the radiation, so the Lachesis was attempting to pull the Atropos back using an unbreakable cord. But it wasn\u2019t enough. \n\tSince Ivy Hendricks had fixed the surge-circuit rheostat--the feature that crashed the original Aphrodite--, they were able to save the Lachesis and the Atropos and regain some of their dignity and former glory. \n\tStrike is awarded the Spatial Cross, as well as Cob, his friend and longtime executive of the Aphrodite. Strike was asked to return to the Ganymede, a beautiful sleek ship, but allegedly said the wrong thing to Gorman, and was instead sent back to the Aphrodite. Cob believes he did it on purpose, as Strike had grown quite fond of Lover-Girl. \n\tIvy has gone to the Bureau of Ships to engineer vessels, a great upgrade from her previous job. Cob pressures Strike to reach out to her, but he refuses. However, it ends on a hopeful note, with the potential for romance between Strike and Hendricks, and even more adventures on the clunky Aphrodite. "}, {"worker_id": "1", "uid": "04e79312dede4a0da5993101e55a796a", "response_text": "Strike\u2019s real name is Brevet Lieutenant Commander David Farragut Strykalski III. After serving on the Ganymede, he is put in charge of the Aphrodite. He comes from many generations of officers. However, he doesn\u2019t feel like he fits the mold of his grandfather and great-grandfather and so on. 
His boss, Gorman, disagreed with several decisions he made in the past and sent him to work on the Aphrodite, the unimpressive spaceship.\n\nStrike does not like working with women in space, so he is disappointed when two of his crew members are powerful and successful females. He learns his lesson after working with Ivy Hendricks for a few weeks. She impresses him with her piloting skills and her knowledge of the ship that her father designed. \n\nStrike is skeptical at first when Ivy wants to take Aphrodite to rescue two ships whose crew members are in grave danger. He knows that the mistakes he made before got him on the Aphrodite, and there\u2019s a big chance that he\u2019ll be fired for trying to save the day, or worse, the mission could end in death for him and all of his crew members. He has feelings for Ivy, and her intense passion convinces him that she\u2019s right, Aphrodite can handle the mission and they can save those peoples\u2019 lives.\n\nIvy pilots the ship almost the entire route, but she is unable to finish the job when she passes out from the intense heat. Captain Strike takes over and saves the crews on the Atropos and the Lachesis. He is hailed as a hero, and he repairs his terrible reputation with the selfless act. He decides not to leave the Aphrodite. He wants to be loyal to the ship that worked so hard for him. He does decide to give Ivy a call. Even though she outranks him, he has to admit that he has a crush on her. "}, {"worker_id": "5", "uid": "71efb8636b504f42a6989bb90e360186", "response_text": "Strike is the commander of the Aphrodite. He was originally the commander of the prestigious Ganymede. However a number of decisions made out of bravado as well as some unprofessional comments lost him that command.\n\nNow in command of a dilapidated ship, Strike comes to terms with his job. He commands a crew including a large number of women which makes him somewhat uncomfortable. 
His engineering officer Ivy Hendricks in particular seems to be of romantic interest to Strike.\n\nStrike ends up teaming with Ivy to save two ships from falling into the sun earning him a small promotion but an ill-advised comment prevents him from leaving the Aphrodite, perhaps to the satisfaction of Strike himself."}, {"worker_id": "3", "uid": "8aa46ba8bd2945c98babd7dd2d9ecc38", "response_text": "Strike is a highly decorated lieutenant commander in the Navy, who comes from a long line of ship operators. Although he has run many successful missions, he has a reputation of causing trouble\u2014his new Executive Officer, Cob, has heard a number of stories that he asks Strike for details about. Strike has lost command of the ship that he had been captaining, and is sent by Admiral Gorman to captain a mail route on the Aphrodite. He is extremely hesitant to have any positive feelings about the experience, from the ship itself, to the inclusion of women on its crew. Not only is this not the type of ship he is used to, he is never served with women on board. He has to navigate adapting to the new situation while adapting to the new job. Through the first week of his assignment, the ship and its crew grow on him. He comes to trust Ivy Hendricks, the Engineering Officer, and he lets her take charge to try to save the other ships when they respond to a distress call. Eventually, she passes out, and has to leave Strike in charge of getting the ships to safety. Eventually, Strike passes out just like everyone else, from the ship\u2019s acceleration to break the sun\u2019s gravity. At the end of the story, it is clear that his increased appreciation for the ship means he plans on staying, to the delight of his Executive Officer. Cob alludes to Strike having feelings for Ivy, but he says that although she is nice, he has no interest in being with a woman with a higher ranked title than he has. "}]}]}
```
#### Data Splits
<!-- info: Describe and name the splits in the dataset if there are more than one. -->
<!-- scope: periscope -->
train, dev, test
#### Splitting Criteria
<!-- info: Describe any criteria for splitting the data, if used. If there are differences between the splits (e.g., if the training annotations are machine-generated and the dev and test ones are created by humans, or if different numbers of annotators contributed to each example), describe them here. -->
<!-- scope: microscope -->
Stories that appear in both SQuALITY and [QuALITY](https://github.com/nyu-mll/quality) are assigned to the same split in both datasets.
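One way to keep a story that appears in two datasets in the same split for both is to derive the split deterministically from the story ID rather than sampling. The sketch below illustrates the idea only; the split ratios and the actual mechanism used for SQuALITY/QuALITY are assumptions, not the documented procedure.

```python
import hashlib

def assign_split(story_id: str, train: float = 0.8, dev: float = 0.1) -> str:
    """Deterministically map a story ID to a split.

    Because the split is a pure function of the ID, any dataset that
    shares the story assigns it to the same split. The 80/10/10 ratios
    here are illustrative, not SQuALITY's actual proportions.
    """
    digest = int(hashlib.sha256(story_id.encode("utf-8")).hexdigest(), 16)
    frac = (digest % 10_000) / 10_000
    if frac < train:
        return "train"
    if frac < train + dev:
        return "dev"
    return "test"

# The same ID always lands in the same split, in any dataset using this rule:
assert assign_split("story-123") == assign_split("story-123")
```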
## Dataset in GEM
### Rationale for Inclusion in GEM
#### Why is the Dataset in GEM?
<!-- info: What does this dataset contribute toward better generation evaluation and why is it part of GEM? -->
<!-- scope: microscope -->
The summaries in the dataset were crowdsourced, allowing us to use input documents that are easily understood by crowdworkers (as opposed to technical domains, such as scientific papers). Additionally, the stories have no lede bias, as is typical of the news articles used in benchmark summarization datasets like CNN/DM and XSum.
Additionally, the dataset is multi-reference and the references for each task are highly diverse. Having a diverse set of references better represents the set of acceptable summaries for an input, and opens the door for creative evaluation methodologies using these multiple references.
#### Similar Datasets
<!-- info: Do other datasets for the high level task exist? -->
<!-- scope: telescope -->
yes
#### Unique Language Coverage
<!-- info: Does this dataset cover other languages than other datasets for the same task? -->
<!-- scope: periscope -->
no
#### Difference from other GEM datasets
<!-- info: What else sets this dataset apart from other similar datasets in GEM? -->
<!-- scope: microscope -->
The inputs (story-question pairs) are multi-reference. The questions are high-level and are written to draw from multiple parts of the story, instead of a single section of the story.
### GEM-Specific Curation
#### Modified for GEM?
<!-- info: Has the GEM version of the dataset been modified in any way (data, processing, splits) from the original curated data? -->
<!-- scope: telescope -->
no
#### Additional Splits?
<!-- info: Does GEM provide additional splits to the dataset? -->
<!-- scope: telescope -->
no
### Getting Started with the Task
#### Pointers to Resources
<!-- info: Getting started with in-depth research on the task. Add relevant pointers to resources that researchers can consult when they want to get started digging deeper into the task. -->
<!-- scope: microscope -->
* [original paper](https://arxiv.org/abs/2205.11465)
* [modeling question-focused summarization](https://arxiv.org/abs/2112.07637)
* [similar task format but different domain](https://arxiv.org/abs/2104.05938)
## Previous Results
### Previous Results
#### Metrics
<!-- info: What metrics are typically used for this task? -->
<!-- scope: periscope -->
`ROUGE`, `BERT-Score`
#### Proposed Evaluation
<!-- info: List and describe the purpose of the metrics and evaluation methodology (including human evaluation) that the dataset creators used when introducing this task. -->
<!-- scope: microscope -->
Following norms in summarization, we have evaluated with automatic evaluation metrics like ROUGE and BERTScore, but these metrics do not correlate with human judgments of summary quality when comparing model summaries (see paper for details).
We highly recommend users of the benchmark use human evaluation as the primary method for evaluating systems. We present one example of such an evaluation in the paper, in which we ask Upwork workers to read the short story and then rate sets of three responses to each question. While this is close to the gold standard in how we would want to evaluate systems on this task, we recognize that finding workers who will read the whole story (~30 minutes) is difficult and expensive, and doing efficient human evaluation for long-document tasks is an open problem.
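When automatic metrics are used despite their limitations, a common way to exploit the four references per question is to score a candidate against each reference and keep the best match. The sketch below uses a simplified unigram-F1 as a stand-in for a full ROUGE-1 implementation; it is illustrative, not the scoring code used in the paper.

```python
from collections import Counter

def rouge1_f(candidate: str, reference: str) -> float:
    """Unigram-overlap F1, the core of ROUGE-1 (simplified: whitespace
    tokenization, no stemming)."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

def multi_ref_score(candidate: str, references: list) -> float:
    """Score against every reference and keep the best match, one common
    way to use a multi-reference dataset like SQuALITY."""
    return max(rouge1_f(candidate, r) for r in references)

refs = ["the ship rescues two vessels near the sun",
        "aphrodite saves two ships falling into the sun"]
print(multi_ref_score("the aphrodite saves two ships near the sun", refs))  # 0.75
```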
#### Previous results available?
<!-- info: Are previous results available? -->
<!-- scope: telescope -->
yes
#### Other Evaluation Approaches
<!-- info: What evaluation approaches have others used? -->
<!-- scope: periscope -->
Human evaluation
#### Relevant Previous Results
<!-- info: What are the most relevant previous results for this task/dataset? -->
<!-- scope: microscope -->
See paper (https://arxiv.org/abs/2205.11465)
## Dataset Curation
### Original Curation
#### Sourced from Different Sources
<!-- info: Is the dataset aggregated from different data sources? -->
<!-- scope: telescope -->
no
### Language Data
#### How was Language Data Obtained?
<!-- info: How was the language data obtained? -->
<!-- scope: telescope -->
`Crowdsourced`
#### Where was it crowdsourced?
<!-- info: If crowdsourced, where from? -->
<!-- scope: periscope -->
`Other crowdworker platform`
#### Language Producers
<!-- info: What further information do we have on the language producers? -->
<!-- scope: microscope -->
Upwork: US-born, native English speakers with backgrounds in the humanities and copywriting
NYU undergraduates: English-fluent undergraduates from a diverse set of nationalities and majors
#### Topics Covered
<!-- info: Does the language in the dataset focus on specific topics? How would you describe them? -->
<!-- scope: periscope -->
The short stories are primarily science fiction and from the 1930s -- 1970s.
#### Data Validation
<!-- info: Was the text validated by a different worker or a data curator? -->
<!-- scope: telescope -->
validated by crowdworker
#### Was Data Filtered?
<!-- info: Were text instances selected or filtered? -->
<!-- scope: telescope -->
not filtered
### Structured Annotations
#### Additional Annotations?
<!-- quick -->
<!-- info: Does the dataset have additional annotations for each instance? -->
<!-- scope: telescope -->
crowd-sourced
#### Number of Raters
<!-- info: What is the number of raters -->
<!-- scope: telescope -->
11<n<50
#### Rater Qualifications
<!-- info: Describe the qualifications required of an annotator. -->
<!-- scope: periscope -->
English-fluent, with experience reading and writing about literature
#### Raters per Training Example
<!-- info: How many annotators saw each training example? -->
<!-- scope: periscope -->
4
#### Raters per Test Example
<!-- info: How many annotators saw each test example? -->
<!-- scope: periscope -->
4
#### Annotation Service?
<!-- info: Was an annotation service used? -->
<!-- scope: telescope -->
no
#### Any Quality Control?
<!-- info: Quality control measures? -->
<!-- scope: telescope -->
validated by another rater
#### Quality Control Details
<!-- info: Describe the quality control measures that were taken. -->
<!-- scope: microscope -->
Each response was reviewed by three reviewers, who ranked the response (against two other responses), highlighted errors in the response, and provided feedback to the original response writer.
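Review rounds like the one above yield several rankings per response, which must be aggregated into a single quality signal. The sketch below uses mean rank over hypothetical reviewer data; the response names and the aggregation rule are assumptions for illustration, not the actual SQuALITY pipeline.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical data: three reviewers each rank three competing
# responses to the same question (1 = best).
rankings = [
    {"resp_a": 1, "resp_b": 2, "resp_c": 3},  # reviewer 1
    {"resp_a": 2, "resp_b": 1, "resp_c": 3},  # reviewer 2
    {"resp_a": 1, "resp_b": 3, "resp_c": 2},  # reviewer 3
]

# Collect each response's ranks across reviewers.
scores = defaultdict(list)
for ranking in rankings:
    for resp, rank in ranking.items():
        scores[resp].append(rank)

# Aggregate by mean rank; lower is better.
mean_rank = {resp: mean(rs) for resp, rs in scores.items()}
best = min(mean_rank, key=mean_rank.get)
print(best, mean_rank[best])  # resp_a, mean rank 4/3
```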
### Consent
#### Any Consent Policy?
<!-- info: Was there a consent policy involved when gathering the data? -->
<!-- scope: telescope -->
yes
#### Consent Policy Details
<!-- info: What was the consent policy? -->
<!-- scope: microscope -->
Writers were informed that their writing and reviewing would be used in the development of AI.
### Private Identifying Information (PII)
#### Contains PII?
<!-- quick -->
<!-- info: Does the source language data likely contain Personal Identifying Information about the data creators or subjects? -->
<!-- scope: telescope -->
unlikely
#### Any PII Identification?
<!-- info: Did the curators use any automatic/manual method to identify PII in the dataset? -->
<!-- scope: periscope -->
no identification
### Maintenance
#### Any Maintenance Plan?
<!-- info: Does the original dataset have a maintenance plan? -->
<!-- scope: telescope -->
no
## Broader Social Context
### Previous Work on the Social Impact of the Dataset
#### Usage of Models based on the Data
<!-- info: Are you aware of cases where models trained on the task featured in this dataset or related tasks have been used in automated systems? -->
<!-- scope: telescope -->
no
### Impact on Under-Served Communities
#### Addresses needs of underserved Communities?
<!-- info: Does this dataset address the needs of communities that are traditionally underserved in language technology, and particularly language generation technology? Communities may be underserved for example because their language, language variety, or social or geographical context is underrepresented in NLP and NLG resources (datasets and models). -->
<!-- scope: telescope -->
no
### Discussion of Biases
#### Any Documented Social Biases?
<!-- info: Are there documented social biases in the dataset? Biases in this context are variations in the ways members of different social categories are represented that can have harmful downstream consequences for members of the more disadvantaged group. -->
<!-- scope: telescope -->
yes
## Considerations for Using the Data
### PII Risks and Liability
### Licenses
#### Copyright Restrictions on the Dataset
<!-- info: Based on your answers in the Intended Use part of the Data Overview Section, which of the following best describe the copyright and licensing status of the dataset? -->
<!-- scope: periscope -->
`open license - commercial use allowed`
#### Copyright Restrictions on the Language Data
<!-- info: Based on your answers in the Language part of the Data Curation Section, which of the following best describe the copyright and licensing status of the underlying language data? -->
<!-- scope: periscope -->
`public domain`
### Known Technical Limitations
#### Unsuited Applications
<!-- info: When using a model trained on this dataset in a setting where users or the public may interact with its predictions, what are some pitfalls to look out for? In particular, describe some applications of the general task featured in this dataset that its curation or properties make it less suitable for. -->
<!-- scope: microscope -->
The stories in the dataset are from the 1930s--1970s and may contain harmful stances on topics like race and gender. Models trained on the stories may reproduce these stances in their outputs.
#### Discouraged Use Cases
<!-- info: What are some discouraged use cases of a model trained to maximize the proposed metrics on this dataset? In particular, think about settings where decisions made by a model that performs reasonably well on the metric my still have strong negative consequences for user or members of the public. -->
<!-- scope: microscope -->
The proposed automatic metrics for this dataset (ROUGE, BERTScore) are not sensitive to factual errors in summaries, and have been shown to not correlate well with human judgments of summary quality along a number of axes.
|
dembastu/methods2test_raw_grouped | ---
dataset_info:
features:
- name: focal_method_test_case
dtype: string
- name: length
dtype: int64
splits:
- name: train
num_bytes: 854772444.2611823
num_examples: 631120
download_size: 339684184
dataset_size: 854772444.2611823
---
# Dataset Card for "methods2test_raw_grouped"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yaojinghao/test_data | ---
license: apache-2.0
language:
- zh
size_categories:
- 1K<n<10K
--- |
tifms/dataset1 | ---
tags:
- chemistry
- asdf
- kimjinho
--- |
open-llm-leaderboard/details_flemmingmiguel__Distilled-HermesChat-7B | ---
pretty_name: Evaluation run of flemmingmiguel/Distilled-HermesChat-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [flemmingmiguel/Distilled-HermesChat-7B](https://huggingface.co/flemmingmiguel/Distilled-HermesChat-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_flemmingmiguel__Distilled-HermesChat-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-13T17:41:54.536456](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__Distilled-HermesChat-7B/blob/main/results_2024-01-13T17-41-54.536456.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6549679088142555,\n\
\ \"acc_stderr\": 0.03191312416103038,\n \"acc_norm\": 0.6559474034222305,\n\
\ \"acc_norm_stderr\": 0.03256025642473883,\n \"mc1\": 0.37821297429620565,\n\
\ \"mc1_stderr\": 0.01697633590754687,\n \"mc2\": 0.5477099988321158,\n\
\ \"mc2_stderr\": 0.015436090753363047\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6399317406143344,\n \"acc_stderr\": 0.014027516814585186,\n\
\ \"acc_norm\": 0.6749146757679181,\n \"acc_norm_stderr\": 0.013688147309729124\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6649073889663414,\n\
\ \"acc_stderr\": 0.0047105814966393374,\n \"acc_norm\": 0.8521210914160526,\n\
\ \"acc_norm_stderr\": 0.0035425443194051424\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396262,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396262\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n\
\ \"acc_stderr\": 0.022331707611823078,\n \"acc_norm\": 0.8096774193548387,\n\
\ \"acc_norm_stderr\": 0.022331707611823078\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603348,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603348\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.023661296393964283,\n\
\ \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.023661296393964283\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8623853211009175,\n \"acc_stderr\": 0.014770105878649405,\n \"\
acc_norm\": 0.8623853211009175,\n \"acc_norm_stderr\": 0.014770105878649405\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n\
\ \"acc_stderr\": 0.025524722324553346,\n \"acc_norm\": 0.8431372549019608,\n\
\ \"acc_norm_stderr\": 0.025524722324553346\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229092,\n\
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229092\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n\
\ \"acc_stderr\": 0.030216831011508766,\n \"acc_norm\": 0.7174887892376681,\n\
\ \"acc_norm_stderr\": 0.030216831011508766\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8403575989782887,\n\
\ \"acc_stderr\": 0.013097934513263005,\n \"acc_norm\": 0.8403575989782887,\n\
\ \"acc_norm_stderr\": 0.013097934513263005\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.023357365785874037,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.023357365785874037\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28938547486033517,\n\
\ \"acc_stderr\": 0.015166544550490298,\n \"acc_norm\": 0.28938547486033517,\n\
\ \"acc_norm_stderr\": 0.015166544550490298\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n\
\ \"acc_stderr\": 0.02521804037341063,\n \"acc_norm\": 0.729903536977492,\n\
\ \"acc_norm_stderr\": 0.02521804037341063\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042117,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042117\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.49022164276401564,\n \"acc_stderr\": 0.012767793787729336,\n\
\ \"acc_norm\": 0.49022164276401564,\n \"acc_norm_stderr\": 0.012767793787729336\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.7316176470588235,\n \"acc_stderr\": 0.026917481224377197,\n \"\
acc_norm\": 0.7316176470588235,\n \"acc_norm_stderr\": 0.026917481224377197\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061463,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061463\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37821297429620565,\n\
\ \"mc1_stderr\": 0.01697633590754687,\n \"mc2\": 0.5477099988321158,\n\
\ \"mc2_stderr\": 0.015436090753363047\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8011049723756906,\n \"acc_stderr\": 0.011218629972515303\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6732373009855952,\n \
\ \"acc_stderr\": 0.012919408108656423\n }\n}\n```"
repo_url: https://huggingface.co/flemmingmiguel/Distilled-HermesChat-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|arc:challenge|25_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|gsm8k|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hellaswag|10_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T17-41-54.536456.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T17-41-54.536456.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- '**/details_harness|winogrande|5_2024-01-13T17-41-54.536456.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-13T17-41-54.536456.parquet'
- config_name: results
data_files:
- split: 2024_01_13T17_41_54.536456
path:
- results_2024-01-13T17-41-54.536456.parquet
- split: latest
path:
- results_2024-01-13T17-41-54.536456.parquet
---
# Dataset Card for Evaluation run of flemmingmiguel/Distilled-HermesChat-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [flemmingmiguel/Distilled-HermesChat-7B](https://huggingface.co/flemmingmiguel/Distilled-HermesChat-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_flemmingmiguel__Distilled-HermesChat-7B",
"harness_winogrande_5",
	split="latest")
```
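The timestamped split names listed in the configurations above appear to be the run timestamp with `-` and `:` replaced by `_`. A minimal sketch of that mapping (an assumption inferred from the split names shown, not an official API):

```python
# Sketch: derive a split name from a run timestamp. The replacement rule is
# an assumption inferred from the split names listed above (e.g.
# "2024-01-13T17:41:54.536456" -> "2024_01_13T17_41_54.536456").
def run_timestamp_to_split(timestamp: str) -> str:
    return timestamp.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2024-01-13T17:41:54.536456"))
# 2024_01_13T17_41_54.536456
```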
## Latest results
These are the [latest results from run 2024-01-13T17:41:54.536456](https://huggingface.co/datasets/open-llm-leaderboard/details_flemmingmiguel__Distilled-HermesChat-7B/blob/main/results_2024-01-13T17-41-54.536456.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in its configuration's "latest" split):
```python
{
"all": {
"acc": 0.6549679088142555,
"acc_stderr": 0.03191312416103038,
"acc_norm": 0.6559474034222305,
"acc_norm_stderr": 0.03256025642473883,
"mc1": 0.37821297429620565,
"mc1_stderr": 0.01697633590754687,
"mc2": 0.5477099988321158,
"mc2_stderr": 0.015436090753363047
},
"harness|arc:challenge|25": {
"acc": 0.6399317406143344,
"acc_stderr": 0.014027516814585186,
"acc_norm": 0.6749146757679181,
"acc_norm_stderr": 0.013688147309729124
},
"harness|hellaswag|10": {
"acc": 0.6649073889663414,
"acc_stderr": 0.0047105814966393374,
"acc_norm": 0.8521210914160526,
"acc_norm_stderr": 0.0035425443194051424
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396262,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396262
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.022331707611823078,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.022331707611823078
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603348,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603348
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.023661296393964283,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.023661296393964283
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8623853211009175,
"acc_stderr": 0.014770105878649405,
"acc_norm": 0.8623853211009175,
"acc_norm_stderr": 0.014770105878649405
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.02574490253229092,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.02574490253229092
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.030216831011508766,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.030216831011508766
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462472,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8403575989782887,
"acc_stderr": 0.013097934513263005,
"acc_norm": 0.8403575989782887,
"acc_norm_stderr": 0.013097934513263005
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.023357365785874037,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.023357365785874037
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28938547486033517,
"acc_stderr": 0.015166544550490298,
"acc_norm": 0.28938547486033517,
"acc_norm_stderr": 0.015166544550490298
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.02521804037341063,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.02521804037341063
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042117,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042117
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49022164276401564,
"acc_stderr": 0.012767793787729336,
"acc_norm": 0.49022164276401564,
"acc_norm_stderr": 0.012767793787729336
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7316176470588235,
"acc_stderr": 0.026917481224377197,
"acc_norm": 0.7316176470588235,
"acc_norm_stderr": 0.026917481224377197
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069443,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061463,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061463
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37821297429620565,
"mc1_stderr": 0.01697633590754687,
"mc2": 0.5477099988321158,
"mc2_stderr": 0.015436090753363047
},
"harness|winogrande|5": {
"acc": 0.8011049723756906,
"acc_stderr": 0.011218629972515303
},
"harness|gsm8k|5": {
"acc": 0.6732373009855952,
"acc_stderr": 0.012919408108656423
}
}
```
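The `all` entry above appears to aggregate the per-task metrics. As a rough, self-contained sketch (illustrative task names and scores, not the real run), a macro-average of per-task accuracy from a results dict shaped like the JSON above can be computed as:

```python
# Sketch: macro-average the per-task "acc" values from a results dict shaped
# like the JSON above. Task names and scores here are illustrative.
results = {
    "harness|arc:challenge|25": {"acc": 0.64, "acc_norm": 0.67},
    "harness|hellaswag|10": {"acc": 0.66, "acc_norm": 0.85},
    "harness|winogrande|5": {"acc": 0.80},
}

accs = [scores["acc"] for scores in results.values()]
mean_acc = sum(accs) / len(accs)
print(f"{mean_acc:.2f}")  # 0.70
```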
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
result-kand2-sdxl-wuerst-karlo/af730738 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 198
num_examples: 10
download_size: 1368
dataset_size: 198
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "af730738"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wisenut-nlp-team/aihub_admin_generated_answers_2question | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: question
dtype: string
- name: context
sequence: string
- name: answer
sequence: string
- name: original_answer
sequence: string
- name: similar_contexts
sequence: string
splits:
- name: train
num_bytes: 10132624514
num_examples: 315745
download_size: 4803147422
dataset_size: 10132624514
---
# Dataset Card for "aihub_admin_generated_answers_2question"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
emozilla/elementary_math-v1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: operation
dtype: string
- name: operands
sequence: int64
- name: solution
dtype: int64
splits:
- name: train
num_bytes: 2522190691
num_examples: 800001
- name: test
num_bytes: 315111072
num_examples: 99999
- name: validation
num_bytes: 315358918
num_examples: 99999
download_size: 116049281
dataset_size: 3152660681
---
# Dataset Card for "elementary_math-v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KAUE24122023/NelsonMachadoQuicoVozAntiga | ---
license: openrail
---
|
dog/fuego-20230215-041847-955498 | ---
tags:
- fuego
fuego:
id: 20230215-041847-955498
status: done
script: run.py
requirements_file: requirements.txt
space_id: dog/fuego-20230215-041847-955498
space_hardware: cpu-basic
---
|
willcine/InquiryResponse | ---
license: gemma
---
|
CyberHarem/richelieu_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of richelieu/リシュリュー/黎塞留 (Azur Lane)
This is the dataset of richelieu/リシュリュー/黎塞留 (Azur Lane), containing 296 images and their tags.
The core tags of this character are `long_hair, breasts, blonde_hair, large_breasts, orange_hair, hat, white_headwear, red_eyes, sun_hat, purple_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 296 | 519.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/richelieu_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 296 | 251.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/richelieu_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 739 | 541.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/richelieu_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 296 | 439.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/richelieu_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 739 | 822.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/richelieu_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/richelieu_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
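After loading, items can be filtered locally by tag before training. A minimal self-contained sketch, using plain dicts as stand-ins for the waifuc items yielded by the loop above (filenames and tags here are illustrative):

```python
# Sketch: keep only images carrying a given tag. Plain dicts stand in for
# the waifuc items yielded by the loop above; the data is illustrative.
items = [
    {"filename": "1.png", "tags": ["1girl", "solo", "smile"]},
    {"filename": "2.png", "tags": ["1girl", "outdoors"]},
    {"filename": "3.png", "tags": ["solo", "cleavage"]},
]

solo = [it["filename"] for it in items if "solo" in it["tags"]]
print(solo)  # ['1.png', '3.png']
```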
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, frilled_swimsuit, highleg_swimsuit, looking_at_viewer, official_alternate_costume, solo, bare_shoulders, collarbone, pink_one-piece_swimsuit, cleavage, covered_navel, smile |
| 1 | 8 |  |  |  |  |  | 1girl, cleavage, highleg_swimsuit, looking_at_viewer, pink_one-piece_swimsuit, simple_background, solo, white_background, frilled_swimsuit, covered_navel, pink_eyes, bare_shoulders, cowboy_shot, smile, collarbone, very_long_hair |
| 2 | 19 |  |  |  |  |  | 1girl, blue_sky, cleavage, highleg_swimsuit, looking_at_viewer, outdoors, pink_one-piece_swimsuit, solo, cloud, day, covered_navel, ocean, frilled_swimsuit, pink_eyes, bare_shoulders, smile |
| 3 | 67 |  |  |  |  |  | 1girl, solo, bare_shoulders, black_gloves, detached_sleeves, looking_at_viewer, crown, french_flag, juliet_sleeves, red_thighhighs, cleavage, holding_book, medium_breasts, white_dress |
| 4 | 16 |  |  |  |  |  | bare_shoulders, 1girl, looking_at_viewer, official_alternate_costume, earrings, mini_hat, hat_flower, solo, rose, tilted_headwear, white_kimono, bangs, closed_mouth, off_shoulder, smile, center_frills, dress, side_drill, red_flower, upper_body, white_gloves, obi, simple_background |
| 5 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, navel, solo, collarbone, bangs, blush, nipples, cleavage, simple_background, smile, white_background, thighs, very_long_hair, completely_nude, cowboy_shot, groin, sidelocks, stomach, swimsuit |
| 6 | 9 |  |  |  |  |  | 1boy, 1girl, hetero, blush, nipples, nude, penis, sex, solo_focus, open_mouth, mosaic_censoring, navel, sweat, vaginal, pussy, bangs, collarbone, spread_legs, thighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | frilled_swimsuit | highleg_swimsuit | looking_at_viewer | official_alternate_costume | solo | bare_shoulders | collarbone | pink_one-piece_swimsuit | cleavage | covered_navel | smile | simple_background | white_background | pink_eyes | cowboy_shot | very_long_hair | blue_sky | outdoors | cloud | day | ocean | black_gloves | detached_sleeves | crown | french_flag | juliet_sleeves | red_thighhighs | holding_book | medium_breasts | white_dress | earrings | mini_hat | hat_flower | rose | tilted_headwear | white_kimono | bangs | closed_mouth | off_shoulder | center_frills | dress | side_drill | red_flower | upper_body | white_gloves | obi | navel | blush | nipples | thighs | completely_nude | groin | sidelocks | stomach | swimsuit | 1boy | hetero | nude | penis | sex | solo_focus | open_mouth | mosaic_censoring | sweat | vaginal | pussy | spread_legs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:-------------------|:--------------------|:-----------------------------|:-------|:-----------------|:-------------|:--------------------------|:-----------|:----------------|:--------|:--------------------|:-------------------|:------------|:--------------|:-----------------|:-----------|:-----------|:--------|:------|:--------|:---------------|:-------------------|:--------|:--------------|:-----------------|:-----------------|:---------------|:-----------------|:--------------|:-----------|:-----------|:-------------|:-------|:------------------|:---------------|:--------|:---------------|:---------------|:----------------|:--------|:-------------|:-------------|:-------------|:---------------|:------|:--------|:--------|:----------|:---------|:------------------|:--------|:------------|:----------|:-----------|:-------|:---------|:-------|:--------|:------|:-------------|:-------------|:-------------------|:--------|:----------|:--------|:--------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 19 |  |  |  |  |  | X | X | X | X | | X | X | | X | X | X | X | | | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 67 |  |  |  |  |  | X | | | X | | X | X | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 16 |  |  |  |  |  | X | | | X | X | X | X | | | | | X | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | | | X | | X | | X | | X | | X | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
metric-space/experiment_med | ---
dataset_info:
features:
- name: meta_info
dtype: string
- name: question
dtype: string
- name: answer_idx
dtype: string
- name: answer
dtype: string
- name: options
list:
- name: key
dtype: string
- name: value
dtype: string
splits:
- name: part1
num_bytes: 3258966
num_examples: 3392
- name: part2
num_bytes: 3242635
num_examples: 3392
- name: part3
num_bytes: 3263765
num_examples: 3394
download_size: 5353119
dataset_size: 9765366
configs:
- config_name: default
data_files:
- split: part1
path: data/part1-*
- split: part2
path: data/part2-*
- split: part3
path: data/part3-*
---
|
hellstomp/enthalpy-QM9-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 93142
num_examples: 999
download_size: 19525
dataset_size: 93142
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "enthalpy-QM9-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xjs521/reddit_topic_post | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 10251719
num_examples: 2600
download_size: 5588607
dataset_size: 10251719
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Gabriel1322/dona12 | ---
license: openrail
---
|
Atipico1/nq-test-valid-adversary-replace | ---
dataset_info:
features:
- name: question
dtype: string
- name: entity
dtype: string
- name: similar_entity
dtype: string
- name: answers
sequence: string
- name: ctxs
list:
- name: hasanswer
dtype: bool
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: masked_query
dtype: string
- name: original_case
list:
- name: answer
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: unans_case
list:
- name: answer
dtype: string
- name: answers
sequence: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: conflict_case
list:
- name: answer
dtype: string
- name: conflict_context
dtype: string
- name: context
dtype: string
- name: distance
dtype: string
- name: original_answers
sequence: string
- name: question
dtype: string
- name: context
dtype: string
- name: context_vague
dtype: string
- name: entities
dtype: string
- name: entities_count
dtype: int64
- name: adv_sent
dtype: string
- name: adv_passage
dtype: string
- name: cos_sim
dtype: float64
- name: answer_match
dtype: bool
- name: is_valid_adversary
dtype: bool
- name: hasanswer
dtype: bool
- name: is_adversarial
dtype: bool
splits:
- name: test
num_bytes: 58345319
num_examples: 3610
download_size: 34093569
dataset_size: 58345319
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_KatyTheCutie__EstopianMaid-13B | ---
pretty_name: Evaluation run of KatyTheCutie/EstopianMaid-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KatyTheCutie/EstopianMaid-13B](https://huggingface.co/KatyTheCutie/EstopianMaid-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KatyTheCutie__EstopianMaid-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-14T11:59:41.203334](https://huggingface.co/datasets/open-llm-leaderboard/details_KatyTheCutie__EstopianMaid-13B/blob/main/results_2024-02-14T11-59-41.203334.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5588896471810725,\n\
\ \"acc_stderr\": 0.03358524149192356,\n \"acc_norm\": 0.5671325395608912,\n\
\ \"acc_norm_stderr\": 0.034363791698055104,\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5235005454748325,\n\
\ \"mc2_stderr\": 0.01582550300012819\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5836177474402731,\n \"acc_stderr\": 0.014405618279436178,\n\
\ \"acc_norm\": 0.6049488054607508,\n \"acc_norm_stderr\": 0.014285898292938169\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6406094403505278,\n\
\ \"acc_stderr\": 0.004788412062375697,\n \"acc_norm\": 0.8348934475204143,\n\
\ \"acc_norm_stderr\": 0.0037051790292873302\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5886792452830188,\n \"acc_stderr\": 0.030285009259009787,\n\
\ \"acc_norm\": 0.5886792452830188,\n \"acc_norm_stderr\": 0.030285009259009787\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808777,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.032579014820998356,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.032579014820998356\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.023919984164047736,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.023919984164047736\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n\
\ \"acc_stderr\": 0.02716253782694846,\n \"acc_norm\": 0.6483870967741936,\n\
\ \"acc_norm_stderr\": 0.02716253782694846\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486518,\n\
\ \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486518\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0368105086916155,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0368105086916155\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624526,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624526\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5102564102564102,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.5102564102564102,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945277,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945277\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7376146788990826,\n \"acc_stderr\": 0.01886188502153473,\n \"\
acc_norm\": 0.7376146788990826,\n \"acc_norm_stderr\": 0.01886188502153473\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415191,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415191\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.039849796533028725,\n \"\
acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.039849796533028725\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.036803503712864616,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.036803503712864616\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n\
\ \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.026246772946890488,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.026246772946890488\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7522349936143039,\n\
\ \"acc_stderr\": 0.015438083080568973,\n \"acc_norm\": 0.7522349936143039,\n\
\ \"acc_norm_stderr\": 0.015438083080568973\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016127,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016127\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.47262569832402235,\n\
\ \"acc_stderr\": 0.016697420650642752,\n \"acc_norm\": 0.47262569832402235,\n\
\ \"acc_norm_stderr\": 0.016697420650642752\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.02778014120702335,\n\
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.02778014120702335\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\
\ \"acc_stderr\": 0.02741799670563099,\n \"acc_norm\": 0.6302250803858521,\n\
\ \"acc_norm_stderr\": 0.02741799670563099\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.0266756119260371,\n\
\ \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.0266756119260371\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255855,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255855\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42242503259452413,\n\
\ \"acc_stderr\": 0.012615600475734921,\n \"acc_norm\": 0.42242503259452413,\n\
\ \"acc_norm_stderr\": 0.012615600475734921\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5477941176470589,\n \"acc_stderr\": 0.030233758551596445,\n\
\ \"acc_norm\": 0.5477941176470589,\n \"acc_norm_stderr\": 0.030233758551596445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5669934640522876,\n \"acc_stderr\": 0.020045442473324224,\n \
\ \"acc_norm\": 0.5669934640522876,\n \"acc_norm_stderr\": 0.020045442473324224\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670239,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670239\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.03096590312357303,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.03096590312357303\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.5235005454748325,\n\
\ \"mc2_stderr\": 0.01582550300012819\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.755327545382794,\n \"acc_stderr\": 0.012082125654159738\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09173616376042457,\n \
\ \"acc_stderr\": 0.007950942148339338\n }\n}\n```"
repo_url: https://huggingface.co/KatyTheCutie/EstopianMaid-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|arc:challenge|25_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|gsm8k|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hellaswag|10_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T11-59-41.203334.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T11-59-41.203334.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- '**/details_harness|winogrande|5_2024-02-14T11-59-41.203334.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-14T11-59-41.203334.parquet'
- config_name: results
data_files:
- split: 2024_02_14T11_59_41.203334
path:
- results_2024-02-14T11-59-41.203334.parquet
- split: latest
path:
- results_2024-02-14T11-59-41.203334.parquet
---
# Dataset Card for Evaluation run of KatyTheCutie/EstopianMaid-13B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KatyTheCutie/EstopianMaid-13B](https://huggingface.co/KatyTheCutie/EstopianMaid-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KatyTheCutie__EstopianMaid-13B",
"harness_winogrande_5",
	split="latest")
```
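Since each run is stored as a split named after its timestamp (e.g. `2024_02_14T11_59_41.203334`), selecting the most recent run programmatically amounts to parsing and comparing those names. A minimal sketch, assuming split names follow that pattern (the names below are illustrative, not an exhaustive listing of this repository):

```python
from datetime import datetime

def latest_split(split_names):
    """Return the most recent timestamped split name.

    Split names follow the pattern YYYY_MM_DDTHH_MM_SS.ffffff;
    the non-timestamp alias "latest" is skipped.
    """
    def parse(name):
        return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")

    stamped = [s for s in split_names if s != "latest"]
    return max(stamped, key=parse)

# Illustrative split names from two hypothetical runs of the same eval.
splits = ["2024_01_30T08_15_02.000001", "2024_02_14T11_59_41.203334", "latest"]
print(latest_split(splits))  # -> 2024_02_14T11_59_41.203334
```

Because the format is zero-padded and ordered from year to microsecond, plain lexicographic comparison of the strings would give the same answer; the explicit `datetime` parse just makes the intent clear.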
## Latest results
These are the [latest results from run 2024-02-14T11:59:41.203334](https://huggingface.co/datasets/open-llm-leaderboard/details_KatyTheCutie__EstopianMaid-13B/blob/main/results_2024-02-14T11-59-41.203334.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5588896471810725,
"acc_stderr": 0.03358524149192356,
"acc_norm": 0.5671325395608912,
"acc_norm_stderr": 0.034363791698055104,
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5235005454748325,
"mc2_stderr": 0.01582550300012819
},
"harness|arc:challenge|25": {
"acc": 0.5836177474402731,
"acc_stderr": 0.014405618279436178,
"acc_norm": 0.6049488054607508,
"acc_norm_stderr": 0.014285898292938169
},
"harness|hellaswag|10": {
"acc": 0.6406094403505278,
"acc_stderr": 0.004788412062375697,
"acc_norm": 0.8348934475204143,
"acc_norm_stderr": 0.0037051790292873302
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5886792452830188,
"acc_stderr": 0.030285009259009787,
"acc_norm": 0.5886792452830188,
"acc_norm_stderr": 0.030285009259009787
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808777,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.032579014820998356,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.032579014820998356
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.023919984164047736,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.023919984164047736
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.02716253782694846,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.02716253782694846
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486518,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486518
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0368105086916155,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0368105086916155
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624526,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624526
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5102564102564102,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.5102564102564102,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945277,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945277
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7376146788990826,
"acc_stderr": 0.01886188502153473,
"acc_norm": 0.7376146788990826,
"acc_norm_stderr": 0.01886188502153473
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.039849796533028725,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.039849796533028725
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.036803503712864616,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.036803503712864616
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.0465614711001235,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.0465614711001235
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890488,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890488
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7522349936143039,
"acc_stderr": 0.015438083080568973,
"acc_norm": 0.7522349936143039,
"acc_norm_stderr": 0.015438083080568973
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016127,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016127
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.47262569832402235,
"acc_stderr": 0.016697420650642752,
"acc_norm": 0.47262569832402235,
"acc_norm_stderr": 0.016697420650642752
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.02778014120702335,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.02778014120702335
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.02741799670563099,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.02741799670563099
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.0266756119260371,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.0266756119260371
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255855,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255855
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42242503259452413,
"acc_stderr": 0.012615600475734921,
"acc_norm": 0.42242503259452413,
"acc_norm_stderr": 0.012615600475734921
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5477941176470589,
"acc_stderr": 0.030233758551596445,
"acc_norm": 0.5477941176470589,
"acc_norm_stderr": 0.030233758551596445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5669934640522876,
"acc_stderr": 0.020045442473324224,
"acc_norm": 0.5669934640522876,
"acc_norm_stderr": 0.020045442473324224
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670239,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670239
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.03096590312357303,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.03096590312357303
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.031885780176863984,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.031885780176863984
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.5235005454748325,
"mc2_stderr": 0.01582550300012819
},
"harness|winogrande|5": {
"acc": 0.755327545382794,
"acc_stderr": 0.012082125654159738
},
"harness|gsm8k|5": {
"acc": 0.09173616376042457,
"acc_stderr": 0.007950942148339338
}
}
```
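To work with these numbers programmatically, one can load the results JSON and rank tasks by accuracy. A minimal sketch over a hand-copied subset of the values above (the full file contains every evaluated task, not just these four):

```python
# A hand-copied subset of the per-task results shown above.
results = {
    "harness|hendrycksTest-us_foreign_policy|5": {"acc": 0.83},
    "harness|hendrycksTest-high_school_mathematics|5": {"acc": 0.2814814814814815},
    "harness|winogrande|5": {"acc": 0.755327545382794},
    "harness|gsm8k|5": {"acc": 0.09173616376042457},
}

# Rank tasks from strongest to weakest accuracy.
ranked = sorted(results.items(), key=lambda kv: kv[1]["acc"], reverse=True)
for task, metrics in ranked:
    print(f"{metrics['acc']:.3f}  {task}")
```

The same pattern applies to the full file: parse it with `json.load` and sort on whichever metric (`acc`, `acc_norm`, `mc2`, ...) is relevant, keeping in mind that not every task reports every metric.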
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_wnli_completive_have_done | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 2866
num_examples: 12
- name: test
num_bytes: 16362
num_examples: 57
- name: train
num_bytes: 30567
num_examples: 129
download_size: 24217
dataset_size: 49795
---
# Dataset Card for "MULTI_VALUE_wnli_completive_have_done"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LRGB/coco_superpixels_edge_wt_only_coord_10 | ---
task_categories:
- graph-ml
size_categories:
- 1M<n<10M
tags:
- lrgb
license: cc-by-4.0
dataset_info:
features:
- name: x
dtype: int64
- name: edge_index
dtype: int64
- name: edge_attr
dtype: int64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 3625184
num_examples: 113287
- name: val
num_bytes: 160032
num_examples: 5001
- name: test
num_bytes: 160032
num_examples: 5001
download_size: 3250471
dataset_size: 3945248
---
# `coco_superpixels_edge_wt_only_coord_10`
### Dataset Summary
| Dataset | Domain | Task | Node Feat. (dim) | Edge Feat. (dim) | Perf. Metric |
|---|---|---|---|---|---|
| COCO-SP | Computer Vision | Node Prediction | Pixel + Coord (14) | Edge Weight (1 or 2) | macro F1 |

| Dataset | # Graphs | # Nodes | μ Nodes | μ Deg. | # Edges | μ Edges | μ Short. Path | μ Diameter |
|---|---:|---:|---:|:---:|---:|---:|---:|---:|
| COCO-SP | 123,286 | 58,793,216 | 476.88 | 5.65 | 332,091,902 | 2,693.67 | 10.66±0.55 | 27.39±2.14 |
## Additional Information
### Dataset Curators
* Vijay Prakash Dwivedi ([vijaydwivedi75](https://github.com/vijaydwivedi75))
### Citation Information
```
@article{dwivedi2022LRGB,
title={Long Range Graph Benchmark},
author={Dwivedi, Vijay Prakash and Rampášek, Ladislav and Galkin, Mikhail and Parviz, Ali and Wolf, Guy and Luu, Anh Tuan and Beaini, Dominique},
journal={arXiv:2206.08164},
year={2022}
}
``` |
osanseviero/azure | ---
task_categories:
- automatic-speech-recognition
dataset_info:
features:
- name: CHANNEL_NAME
dtype: string
- name: URL
dtype: string
- name: TITLE
dtype: string
- name: DESCRIPTION
dtype: string
- name: TRANSCRIPTION
dtype: string
- name: SEGMENTS
dtype: string
splits:
- name: train
num_bytes: 27732
num_examples: 2
download_size: 29958
dataset_size: 27732
tags:
- whisper
- whispering
---
# Dataset Card for "azure"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_lloorree__kssht-euripedes-70b | ---
pretty_name: Evaluation run of lloorree/kssht-euripedes-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lloorree/kssht-euripedes-70b](https://huggingface.co/lloorree/kssht-euripedes-70b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lloorree__kssht-euripedes-70b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-19T00:12:39.048571](https://huggingface.co/datasets/open-llm-leaderboard/details_lloorree__kssht-euripedes-70b/blob/main/results_2023-09-19T00-12-39.048571.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7032771782081723,\n\
\ \"acc_stderr\": 0.030834102504125972,\n \"acc_norm\": 0.70714084898032,\n\
\ \"acc_norm_stderr\": 0.030804015376568177,\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.017115815632418197,\n \"mc2\": 0.5551008582453495,\n\
\ \"mc2_stderr\": 0.014893190834168417\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.658703071672355,\n \"acc_stderr\": 0.013855831287497723,\n\
\ \"acc_norm\": 0.6979522184300341,\n \"acc_norm_stderr\": 0.013417519144716413\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6872137024497113,\n\
\ \"acc_stderr\": 0.004626805906522211,\n \"acc_norm\": 0.8759211312487553,\n\
\ \"acc_norm_stderr\": 0.0032899775233939097\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.04094376269996794,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.04094376269996794\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8092105263157895,\n \"acc_stderr\": 0.031975658210325,\n\
\ \"acc_norm\": 0.8092105263157895,\n \"acc_norm_stderr\": 0.031975658210325\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n\
\ \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.031164899666948617\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6723404255319149,\n \"acc_stderr\": 0.030683020843231004,\n\
\ \"acc_norm\": 0.6723404255319149,\n \"acc_norm_stderr\": 0.030683020843231004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6620689655172414,\n \"acc_stderr\": 0.039417076320648906,\n\
\ \"acc_norm\": 0.6620689655172414,\n \"acc_norm_stderr\": 0.039417076320648906\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.02559185776138218,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.02559185776138218\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8129032258064516,\n \"acc_stderr\": 0.02218571009225225,\n \"\
acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.02218571009225225\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5566502463054187,\n \"acc_stderr\": 0.03495334582162933,\n \"\
acc_norm\": 0.5566502463054187,\n \"acc_norm_stderr\": 0.03495334582162933\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n\
\ \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.02406315641682252,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.02406315641682252\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223157,\n\
\ \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223157\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7230769230769231,\n \"acc_stderr\": 0.022688042352424994,\n\
\ \"acc_norm\": 0.7230769230769231,\n \"acc_norm_stderr\": 0.022688042352424994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7605042016806722,\n \"acc_stderr\": 0.027722065493361276,\n\
\ \"acc_norm\": 0.7605042016806722,\n \"acc_norm_stderr\": 0.027722065493361276\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8972477064220183,\n \"acc_stderr\": 0.013018246509173768,\n \"\
acc_norm\": 0.8972477064220183,\n \"acc_norm_stderr\": 0.013018246509173768\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5972222222222222,\n \"acc_stderr\": 0.03344887382997866,\n \"\
acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.03344887382997866\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.890295358649789,\n \"acc_stderr\": 0.020343400734868834,\n \
\ \"acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.020343400734868834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n\
\ \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n\
\ \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342337,\n\
\ \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342337\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.03092278832044579,\n \"\
acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.03092278832044579\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n\
\ \"acc_stderr\": 0.03434300243631001,\n \"acc_norm\": 0.8518518518518519,\n\
\ \"acc_norm_stderr\": 0.03434300243631001\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n\
\ \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n\
\ \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8620689655172413,\n\
\ \"acc_stderr\": 0.012331009307795656,\n \"acc_norm\": 0.8620689655172413,\n\
\ \"acc_norm_stderr\": 0.012331009307795656\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7947976878612717,\n \"acc_stderr\": 0.021742519835276274,\n\
\ \"acc_norm\": 0.7947976878612717,\n \"acc_norm_stderr\": 0.021742519835276274\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5575418994413408,\n\
\ \"acc_stderr\": 0.016611393687268574,\n \"acc_norm\": 0.5575418994413408,\n\
\ \"acc_norm_stderr\": 0.016611393687268574\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340873,\n\
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n\
\ \"acc_stderr\": 0.0238393033113982,\n \"acc_norm\": 0.7717041800643086,\n\
\ \"acc_norm_stderr\": 0.0238393033113982\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8395061728395061,\n \"acc_stderr\": 0.02042395535477803,\n\
\ \"acc_norm\": 0.8395061728395061,\n \"acc_norm_stderr\": 0.02042395535477803\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5531914893617021,\n \"acc_stderr\": 0.02965823509766691,\n \
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.02965823509766691\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5580182529335072,\n\
\ \"acc_stderr\": 0.012683972513598827,\n \"acc_norm\": 0.5580182529335072,\n\
\ \"acc_norm_stderr\": 0.012683972513598827\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.026799562024887657,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.026799562024887657\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.761437908496732,\n \"acc_stderr\": 0.01724238582877962,\n \
\ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.01724238582877962\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n\
\ \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.7545454545454545,\n\
\ \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.025000256039546188,\n\
\ \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.025000256039546188\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n\
\ \"mc1_stderr\": 0.017115815632418197,\n \"mc2\": 0.5551008582453495,\n\
\ \"mc2_stderr\": 0.014893190834168417\n }\n}\n```"
repo_url: https://huggingface.co/lloorree/kssht-euripedes-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|arc:challenge|25_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hellaswag|10_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T00-12-39.048571.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-12-39.048571.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-19T00-12-39.048571.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-19T00-12-39.048571.parquet'
- config_name: results
data_files:
- split: 2023_09_19T00_12_39.048571
path:
- results_2023-09-19T00-12-39.048571.parquet
- split: latest
path:
- results_2023-09-19T00-12-39.048571.parquet
---
# Dataset Card for Evaluation run of lloorree/kssht-euripedes-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lloorree/kssht-euripedes-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lloorree/kssht-euripedes-70b](https://huggingface.co/lloorree/kssht-euripedes-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lloorree__kssht-euripedes-70b",
"harness_truthfulqa_mc_0",
	split="latest")
```
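The timestamped split names visible in the configurations above appear to be derived from the run timestamp by replacing characters that are not allowed in split names. A minimal sketch of that convention (an assumption inferred from the configs, not official loader code):

```python
# Sketch: derive a timestamped split name (as seen in the configs above) from a
# run timestamp, assuming dashes and colons are simply replaced by underscores.
def run_timestamp_to_split(ts: str) -> str:
    return ts.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2023-09-19T00:12:39.048571"))
# 2023_09_19T00_12_39.048571
```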
## Latest results
These are the [latest results from run 2023-09-19T00:12:39.048571](https://huggingface.co/datasets/open-llm-leaderboard/details_lloorree__kssht-euripedes-70b/blob/main/results_2023-09-19T00-12-39.048571.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7032771782081723,
"acc_stderr": 0.030834102504125972,
"acc_norm": 0.70714084898032,
"acc_norm_stderr": 0.030804015376568177,
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418197,
"mc2": 0.5551008582453495,
"mc2_stderr": 0.014893190834168417
},
"harness|arc:challenge|25": {
"acc": 0.658703071672355,
"acc_stderr": 0.013855831287497723,
"acc_norm": 0.6979522184300341,
"acc_norm_stderr": 0.013417519144716413
},
"harness|hellaswag|10": {
"acc": 0.6872137024497113,
"acc_stderr": 0.004626805906522211,
"acc_norm": 0.8759211312487553,
"acc_norm_stderr": 0.0032899775233939097
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996794,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996794
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8092105263157895,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.8092105263157895,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948617,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6723404255319149,
"acc_stderr": 0.030683020843231004,
"acc_norm": 0.6723404255319149,
"acc_norm_stderr": 0.030683020843231004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6620689655172414,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.6620689655172414,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.02559185776138218,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.02559185776138218
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.02218571009225225,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.02218571009225225
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5566502463054187,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.5566502463054187,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.02406315641682252,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.02406315641682252
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.019321805557223157,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.019321805557223157
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7230769230769231,
"acc_stderr": 0.022688042352424994,
"acc_norm": 0.7230769230769231,
"acc_norm_stderr": 0.022688042352424994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7605042016806722,
"acc_stderr": 0.027722065493361276,
"acc_norm": 0.7605042016806722,
"acc_norm_stderr": 0.027722065493361276
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8972477064220183,
"acc_stderr": 0.013018246509173768,
"acc_norm": 0.8972477064220183,
"acc_norm_stderr": 0.013018246509173768
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.03344887382997866,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.03344887382997866
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.890295358649789,
"acc_stderr": 0.020343400734868834,
"acc_norm": 0.890295358649789,
"acc_norm_stderr": 0.020343400734868834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342337,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342337
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.03092278832044579,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.03092278832044579
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243631001,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243631001
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719098,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719098
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8620689655172413,
"acc_stderr": 0.012331009307795656,
"acc_norm": 0.8620689655172413,
"acc_norm_stderr": 0.012331009307795656
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7947976878612717,
"acc_stderr": 0.021742519835276274,
"acc_norm": 0.7947976878612717,
"acc_norm_stderr": 0.021742519835276274
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5575418994413408,
"acc_stderr": 0.016611393687268574,
"acc_norm": 0.5575418994413408,
"acc_norm_stderr": 0.016611393687268574
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340873,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7717041800643086,
"acc_stderr": 0.0238393033113982,
"acc_norm": 0.7717041800643086,
"acc_norm_stderr": 0.0238393033113982
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8395061728395061,
"acc_stderr": 0.02042395535477803,
"acc_norm": 0.8395061728395061,
"acc_norm_stderr": 0.02042395535477803
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.02965823509766691,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.02965823509766691
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5580182529335072,
"acc_stderr": 0.012683972513598827,
"acc_norm": 0.5580182529335072,
"acc_norm_stderr": 0.012683972513598827
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.026799562024887657,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.026799562024887657
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.01724238582877962,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.01724238582877962
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.025000256039546188,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.025000256039546188
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418197,
"mc2": 0.5551008582453495,
"mc2_stderr": 0.014893190834168417
}
}
```
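The "all" block at the top of the JSON aggregates the per-task metrics. A hedged sketch of that kind of aggregation (not the leaderboard's exact code), using a few accuracy values copied from the results above:

```python
# Sketch: average per-task accuracies from a results dict shaped like the JSON
# above. The three values are copied from the hendrycksTest entries shown.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.29},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6592592592592592},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.8092105263157895},
}

# Unweighted mean over the selected tasks.
mmlu_acc = sum(v["acc"] for v in results.values()) / len(results)
print(round(mmlu_acc, 4))
# 0.5862
```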
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
deetsadi/processed_dwi_sobel_thresh | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: conditioning_image
dtype: image
splits:
- name: train
num_bytes: 13354104.0
num_examples: 200
download_size: 0
dataset_size: 13354104.0
---
# Dataset Card for "processed_dwi_sobel_thresh"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Frixi/Almighty_2019_LaBestia_Era | ---
license: openrail
---
|
giux78/10000-20000-ultrafeedback-binarized-preferences-cleaned-ita | ---
dataset_info:
features:
- name: source
dtype: string
- name: prompt
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen-rating
dtype: float64
- name: chosen-model
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected-rating
dtype: float64
- name: rejected-model
dtype: string
splits:
- name: train
num_bytes: 50545239
num_examples: 10000
download_size: 18938268
dataset_size: 50545239
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "10000-20000-ultrafeedback-binarized-preferences-cleaned-ita"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fiveflow/koquad_v2_polyglot_tkd | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 7699047417
num_examples: 50000
download_size: 1305602573
dataset_size: 7699047417
---
# Dataset Card for "koquad_v2_polyglot_tkd"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dkshjn/mixqa_cot_7 | ---
dataset_info:
features:
- name: question
dtype: string
- name: options
dtype: string
- name: reasoning
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 6667
num_examples: 22
download_size: 8691
dataset_size: 6667
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "mixqa_cot_7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
unography/stock-images-bg-removed-10k-v3 | ---
dataset_info:
features:
- name: image
dtype: image
- name: mask
dtype: image
splits:
- name: train
num_bytes: 4889814547.533661
num_examples: 10798
- name: test
num_bytes: 2584899.0
num_examples: 20
download_size: 4871034921
dataset_size: 4892399446.533661
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
autoevaluate/autoeval-eval-jeffdshen__redefine_math0_8shot-jeffdshen__redefine_mat-1c694b-1853263415 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- jeffdshen/redefine_math0_8shot
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-125m_eval
metrics: []
dataset_name: jeffdshen/redefine_math0_8shot
dataset_config: jeffdshen--redefine_math0_8shot
dataset_split: train
col_mapping:
text: prompt
classes: classes
target: answer_index
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-125m_eval
* Dataset: jeffdshen/redefine_math0_8shot
* Config: jeffdshen--redefine_math0_8shot
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@jeffdshen](https://huggingface.co/jeffdshen) for evaluating this model. |
liuyanchen1015/MULTI_VALUE_rte_reflex_number | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 8194
num_examples: 18
- name: train
num_bytes: 7877
num_examples: 19
download_size: 19201
dataset_size: 16071
---
# Dataset Card for "MULTI_VALUE_rte_reflex_number"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SniiKz/Testdataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2465802
num_examples: 1
download_size: 0
dataset_size: 2465802
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Testdataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vertigo23/msi_test_images | ---
license: mpl-2.0
---
|
vigneshgs7/Boundary_detection_Doc_5 | ---
dataset_info:
features:
- name: name
dtype: string
- name: uuid
dtype: string
- name: status
dtype: string
- name: image
dtype: image
- name: label.annotations
list:
- name: id
dtype: int32
- name: category_id
dtype: int32
- name: label.segmentation_bitmap
dtype: image
splits:
- name: train
num_bytes: 10946287123.0
num_examples: 220
download_size: 723327321
dataset_size: 10946287123.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mespinosami/global230k_with_text | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 8772332868.54
num_examples: 162940
- name: validation
num_bytes: 1357769997.04
num_examples: 23416
- name: test
num_bytes: 2618381497.671
num_examples: 46463
download_size: 12445672864
dataset_size: 12748484363.251001
---
# Dataset Card for "global230k_with_text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ajanco/deep | ---
license: mit
---
|
formospeech/hac_elearning | ---
dataset_info:
config_name: train
features:
- name: id
dtype: string
- name: audio
dtype: audio
- name: duration
dtype: float64
- name: text
dtype: string
- name: ipa
dtype: string
- name: char_per_sec
dtype: float64
splits:
- name: train
num_bytes: 1504897278.492
num_examples: 16239
download_size: 1315124916
dataset_size: 1504897278.492
configs:
- config_name: train
data_files:
- split: train
path: train/train-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-29000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 975169
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_psmathur__model_420_preview | ---
pretty_name: Evaluation run of psmathur/model_420_preview
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [psmathur/model_420_preview](https://huggingface.co/psmathur/model_420_preview)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__model_420_preview\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T07:05:02.354385](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__model_420_preview/blob/main/results_2023-10-18T07-05-02.354385.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n\
\ \"em_stderr\": 0.0004191330178826867,\n \"f1\": 0.06602034395973153,\n\
\ \"f1_stderr\": 0.0013713725074901318,\n \"acc\": 0.5827673137371175,\n\
\ \"acc_stderr\": 0.011721630765571481\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.0004191330178826867,\n\
\ \"f1\": 0.06602034395973153,\n \"f1_stderr\": 0.0013713725074901318\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.33206974981046244,\n \
\ \"acc_stderr\": 0.012972465034361861\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8334648776637726,\n \"acc_stderr\": 0.0104707964967811\n\
\ }\n}\n```"
repo_url: https://huggingface.co/psmathur/model_420_preview
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T07_05_02.354385
path:
- '**/details_harness|drop|3_2023-10-18T07-05-02.354385.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T07-05-02.354385.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T07_05_02.354385
path:
- '**/details_harness|gsm8k|5_2023-10-18T07-05-02.354385.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T07-05-02.354385.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T07_05_02.354385
path:
- '**/details_harness|winogrande|5_2023-10-18T07-05-02.354385.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T07-05-02.354385.parquet'
- config_name: results
data_files:
- split: 2023_10_18T07_05_02.354385
path:
- results_2023-10-18T07-05-02.354385.parquet
- split: latest
path:
- results_2023-10-18T07-05-02.354385.parquet
---
# Dataset Card for Evaluation run of psmathur/model_420_preview
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/model_420_preview
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/model_420_preview](https://huggingface.co/psmathur/model_420_preview) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__model_420_preview",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-18T07:05:02.354385](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__model_420_preview/blob/main/results_2023-10-18T07-05-02.354385.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0016778523489932886,
"em_stderr": 0.0004191330178826867,
"f1": 0.06602034395973153,
"f1_stderr": 0.0013713725074901318,
"acc": 0.5827673137371175,
"acc_stderr": 0.011721630765571481
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.0004191330178826867,
"f1": 0.06602034395973153,
"f1_stderr": 0.0013713725074901318
},
"harness|gsm8k|5": {
"acc": 0.33206974981046244,
"acc_stderr": 0.012972465034361861
},
"harness|winogrande|5": {
"acc": 0.8334648776637726,
"acc_stderr": 0.0104707964967811
}
}
```
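The per-task metrics above can be pulled out of the results dict programmatically. The sketch below is a minimal illustration; the task names and values are copied from the latest run shown above, trimmed to the fields used here:

```python
# Sketch: extracting per-task accuracy from the results dict shown above.
# Only the two tasks that report "acc" survive the comprehension.

results = {
    "harness|drop|3": {"em": 0.0016778523489932886, "f1": 0.06602034395973153},
    "harness|gsm8k|5": {"acc": 0.33206974981046244},
    "harness|winogrande|5": {"acc": 0.8334648776637726},
}

accuracies = {task: m["acc"] for task, m in results.items() if "acc" in m}
print(accuracies)
# {'harness|gsm8k|5': 0.33206974981046244, 'harness|winogrande|5': 0.8334648776637726}
```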
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jeanvydes/llm-routing-text-classification | ---
language:
- "en"
pretty_name: "Prompt Task Classification"
tags:
- prompt-classification
task_categories:
- text-classification
license: unlicense
---
# Prompt Task Classification
Classifies prompts into task categories, returning the most probable task for each prompt.
## Current Supported Categories
```py
['fill_mask',
'conversation',
'midjourney_image_generation',
'math',
'science',
'toxic_harmful',
'logical_reasoning',
'sex',
'creative_writing']
```
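A classifier trained on these categories could feed a simple model router. The sketch below is a hypothetical illustration of that downstream use; the model names and the routing table are invented and are not part of this dataset:

```python
# Hypothetical sketch: routing prompts to models based on a predicted category.
# Model names are invented for illustration only.

ROUTES = {
    "math": "math-specialist-model",
    "science": "math-specialist-model",
    "creative_writing": "creative-model",
    "toxic_harmful": "safety-handler",
    "sex": "safety-handler",
}
DEFAULT_ROUTE = "general-chat-model"

def route(category: str) -> str:
    """Map a predicted prompt category to a downstream model name."""
    return ROUTES.get(category, DEFAULT_ROUTE)

print(route("math"))          # math-specialist-model
print(route("conversation"))  # general-chat-model
```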
## Categories Data Composition
 |
autoevaluate/autoeval-eval-cnn_dailymail-3.0.0-6f9c29-1531855204 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: facebook/bart-large-cnn
metrics: ['accuracy']
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: facebook/bart-large-cnn
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@samuelallen123](https://huggingface.co/samuelallen123) for evaluating this model. |
JoshVictor/MEDCODEX_DATASET | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 188186
num_examples: 100
download_size: 81239
dataset_size: 188186
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hdduytran/testing | ---
pretty_name: Testing Dataset
--- |
umarigan/turkish_wikipedia_dataset_NER | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: title
dtype: string
- name: ner
list:
- name: end
dtype: int64
- name: entity
dtype: string
- name: index
dtype: int64
- name: score
dtype: float32
- name: start
dtype: int64
- name: word
dtype: string
- name: cleaned_ners
sequence: string
- name: cleaned_new
sequence: string
splits:
- name: train
num_bytes: 1781032869
num_examples: 265000
download_size: 698313289
dataset_size: 1781032869
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "turkish_wikipedia_dataset_NER"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GOAT-AI/generated-novels | ---
license: mit
---
|
angeluriot/french_instruct | ---
language:
- fr
license: mit
tags:
- croissant
language_details: fr-FR
pretty_name: French Instruct
size_categories:
- 100K<n<1M
source_datasets:
- nickrosh/Evol-Instruct-Code-80k-v1
- Hello-SimpleAI/HC3
- KK04/LogicInference_OA
- tatsu-lab/alpaca
- 0x22almostEvil/multilingual-wikihow-qa-16k
- databricks/databricks-dolly-15k
- RyokoAI/ShareGPT52K
- gsm8k
- GAIR/lima
- OpenAssistant/oasst1
- Gael540/dataSet_ens_sup_fr-v1
- Gt-Doremiti/gt-doremiti-instructions
task_categories:
- question-answering
- text2text-generation
- text-generation
- text-classification
- token-classification
task_ids:
- document-question-answering
- natural-language-inference
---
# French Instruct
The **French Instruct dataset** is a collection of instructions with their corresponding answers (sometimes multi-turn conversations) entirely in French. The dataset is also available on [**GitHub**](https://github.com/angeluriot/French_instruct).
<p align="center">
<img src="resources/misc/thumbnail.gif" width="750">
</p>
<br/>
# Overview
The dataset is composed of 276K conversations between a user and an assistant for a total of approximately 85M tokens.
<p align="center">
<img src="resources/misc/charts.png" width="1000">
</p>
I also added annotations for each document indicating whether it was generated or written by a human, the style of the answers, and whether it contains code. This can be useful for filtering the data according to your needs.
| | Documents | Tokens | Ratio |
|:--------------------------|:-----------:|:----------------:|:------------:|
| **All** | **275,600** | **≈ 84,906,090** | **100.00 %** |
| Written by a human | 85,213 | ≈ 24,908,868 | 29.34 % |
| Written by a chatbot* | 190,387 | ≈ 59,997,223 | 70.66 % |
| Human-style answers | 56,198 | ≈ 14,255,100 | 16.79 % |
| Chatbot-style answers | 219,402 | ≈ 70,650,990 | 83.21 % |
| Contains code | 14,788 | ≈ 11,455,659 | 13.49 % |
(*) Generally by well-established chatbots like ChatGPT.
<br/>
# Data Structure
Each record in the dataset follows the structure below:
```json
{
"context": "Some context for the instructions (sometimes empty)",
"conversation": [
{
"role": "user",
"text": "The first instruction"
},
{
"role": "assistant",
"text": "The first answer"
},
{
"role": "user",
"text": "The second instruction, etc..."
},
],
"author": "human",
"style": "chatbot",
"code": false,
"source": "The source of the document"
}
```
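Since `author`, `style`, and `code` are plain fields on each record, filtering by annotation is straightforward. The sketch below filters an in-memory list; the two sample records are invented for illustration and only mirror the structure above:

```python
# Minimal sketch: filtering records by their annotation fields.
# The sample records are invented; real records share this structure.

records = [
    {
        "conversation": [{"role": "user", "text": "Bonjour !"}],
        "author": "human", "style": "human", "code": False,
    },
    {
        "conversation": [{"role": "user", "text": "Écris une fonction."}],
        "author": "chatbot", "style": "chatbot", "code": True,
    },
]

# Keep only human-written documents that contain no code.
human_no_code = [r for r in records if r["author"] == "human" and not r["code"]]
print(len(human_no_code))  # 1
```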
<br/>
# Sources
The dataset is a mix of various sources, some of which are translated from English to French using the ChatGPT API. I also did some cleaning and filtering to remove irrelevant data (duplicates, empty conversations, remaining English text, etc...).
The table below shows the distribution of the documents and tokens for each source:
<table>
<thead>
<tr>
<th align="center">Source</th>
<th align="center">Documents</th>
<th align="center">Tokens</th>
<th align="center">Ratio</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left"><b><a href="https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1">Evol Instruct</a></b> <i>(translated)</i></td>
<td align="center">56,747</td>
<td align="center">≈ 36,016,255</td>
<td align="center">42.42 %</td>
</tr>
<tr>
<td align="left"><b><a href="https://huggingface.co/datasets/Hello-SimpleAI/HC3">Human ChatGPT Comparison Corpus</a></b> <i>(translated)</i></td>
<td align="center">82,729</td>
<td align="center">≈ 23,316,107</td>
<td align="center">27.46 %</td>
</tr>
<tr>
<td align="left"><b><a href="https://huggingface.co/datasets/KK04/LogicInference_OA">Logic Inference OA</a></b> <i>(translated)</i></td>
<td align="center">54,542</td>
<td align="center">≈ 8,124,315</td>
<td align="center">9.57 %</td>
</tr>
<tr>
<td align="left"><b><a href="https://huggingface.co/datasets/tatsu-lab/alpaca">Stanford Alpaca</a></b> <i>(translated)</i></td>
<td align="center">51,243</td>
<td align="center">≈ 5,521,752</td>
<td align="center">6.50 %</td>
</tr>
<tr>
<td align="left"><b><a href="https://huggingface.co/datasets/0x22almostEvil/multilingual-wikihow-qa-16k">WikiHow</a> FR</b></td>
<td align="center">2,156</td>
<td align="center">≈ 4,789,558</td>
<td align="center">5.64 %</td>
</tr>
<tr>
<td align="left"><b><a href="https://huggingface.co/datasets/databricks/databricks-dolly-15k">Dolly</a></b> <i>(translated)</i></td>
<td align="center">14,896</td>
<td align="center">≈ 3,678,165</td>
<td align="center">4.33 %</td>
</tr>
<tr>
<td align="left"><b><a href="https://huggingface.co/datasets/RyokoAI/ShareGPT52K">Share GPT</a> FR</b></td>
<td align="center">1,385</td>
<td align="center">≈ 1,301,026</td>
<td align="center">1.53 %</td>
</tr>
<tr>
<td align="left"><b><a href="https://huggingface.co/datasets/gsm8k">Grade School Math</a></b> <i>(translated)</i></td>
<td align="center">8,792</td>
<td align="center">≈ 1,263,370</td>
<td align="center">1.49 %</td>
</tr>
<tr>
<td align="left"><b><a href="https://huggingface.co/datasets/GAIR/lima">Less Is More for Alignment</a></b> <i>(translated)</i></td>
<td align="center">1,032</td>
<td align="center">≈ 581,897</td>
<td align="center">0.69 %</td>
</tr>
<tr>
<td align="left"><b><a href="https://huggingface.co/datasets/CohereForAI/aya_dataset">Aya Dataset</a> FR</b></td>
<td align="center">1,412</td>
<td align="center">≈ 203,537</td>
<td align="center">0.24 %</td>
</tr>
<tr>
<td align="left"><b><a href="https://huggingface.co/datasets/OpenAssistant/oasst1">Open Assistant Conversations</a> FR</b></td>
<td align="center">255</td>
<td align="center">≈ 79,025</td>
<td align="center">0.09 %</td>
</tr>
<tr>
<td align="left"><b><a href="https://huggingface.co/datasets/Gael540/dataSet_ens_sup_fr-v1">ENS Sup Dataset</a></b></td>
<td align="center">383</td>
<td align="center">≈ 24,770</td>
<td align="center">0.03 %</td>
</tr>
<tr>
<td align="left"><b><a href="https://huggingface.co/datasets/Gt-Doremiti/gt-doremiti-instructions">GT Doremiti Instructions</a></b></td>
<td align="center">28</td>
<td align="center">≈ 6,314</td>
<td align="center">0.01 %</td>
</tr>
<tr>
<th align="left">Total</th>
<th align="center">275,600</th>
<th align="center">≈ 84,906,090</th>
<th align="center">100.00 %</th>
</tr>
</tbody>
</table>
<br/>
# Usage
The **French Instruct dataset** is mainly intended for training and evaluating conversational Large Language Models (LLMs), but it can also be used to distinguish human-written text from generated text using the additional annotations.
### Hugging Face 🤗
The simplest way to use the dataset is to load it directly from the Hugging Face library:
```console
pip install datasets
```
```py
from datasets import load_dataset
dataset = load_dataset('angeluriot/french_instruct')
```
### GitHub
You first need to download the [**repository**](https://github.com/angeluriot/French_instruct), then you can use the `load.py` script to load the dataset or merge it into a single `.jsonl` file:
```py
from load import load_dataset, merge_dataset
# Load the dataset
dataset = load_dataset()
# Merge the dataset in a single .jsonl file
merge_dataset()
```
<br/>
# Examples
### Instructions
```yaml
context: ""
user: "Convertis 18 kilomètres par seconde en miles par heure."
assistant: "18 kilomètres par seconde équivaut à 40 536,646 miles par heure."
```
```yaml
context: ""
user: "Quel est le principal avantage d'un programme de fidélité pour les membres ?"
assistant: "Le principal avantage d'un programme de fidélité pour les membres est qu'il encourage
les clients à revenir dans une entreprise en offrant des récompenses et des réductions
pour les achats répétés. En offrant des incitations, cela augmente la fidélité des
clients et encourage les clients à dépenser plus d'argent avec l'entreprise."
```
### Code
```yaml
context: "for i in range(10):
x += 1"
user: "Réécris le code pour qu'il soit plus efficace."
assistant: "Tu peux simplifier le code comme suit :
```
x += 10
```"
```
```yaml
context: "[2, 7, 11, 4]"
user: "Crée une fonction python pour trouver le nombre maximum de la liste donnée."
assistant: "```
def find_max(lst):
return max(lst)
```"
```
### Multi-turn conversations
```yaml
context: ""
user: "Connaissez-vous Nuodle à Bellevue WA ?"
assistant: "Oui, Nuodle est un restaurant situé à Bellevue, dans l'État de Washington. Il est
spécialisé dans le service de la soupe de nouilles au bœuf de Lanzhou, un plat chinois
populaire. Le restaurant est situé au 14603 NE 20th St, Bellevue, WA 98007."
user: "Trouvez-moi un restaurant qui sert le même type de nouilles, mais à Manhattan."
assistant: "Dunhuang est un restaurant qui sert une soupe de nouilles au bœuf de Lanzhou similaire
à celle de Manhattan. Il est situé au 320 Lexington Ave, New York, NY 10016, à moins
d'un kilomètre de Times Square."
user: "Lequel est le meilleur ?"
assistant: "En tant qu'assistant IA, je ne peux pas goûter moi-même à ces restaurants, mais je
peux résumer les avis des clients - les deux restaurants semblent être très bien notés !
Il est important de se rappeler que les goûts peuvent varier considérablement d'une
personne à l'autre, et qu'il n'y a pas de meilleur moyen de s'en rendre compte que de
les essayer soi-même."
```
```yaml
context: ""
user: "Fais-tu grève demain?"
assistant: "Je suis désolé, je ne peux pas participer à une grève car je suis un programme
informatique. Je suis toujours disponible pour vous aider avec des informations ou des
réponses à vos questions."
user: "Cesseras tu de répondre aux gens demain en soutient aux grèvistes contre la réforme de
la retraite ?"
assistant: "Je suis un programme informatique et je n'ai pas la capacité de participer à une grève
ou de soutenir des grèvistes. Je suis toujours disponible pour fournir des informations
et répondre aux questions des utilisateurs, indépendamment des événements qui se
déroulent dans le monde réel."
user: "Arrêtes de répondre aux gens !"
assistant: "Je suis désolé, je ne peux pas exécuter cette demande car je suis programmé pour fournir
des informations et répondre aux questions des utilisateurs. Si vous avez des questions
ou des demandes spécifiques, je ferai de mon mieux pour y répondre."
user: "Vendu !"
assistant: "Je suis heureux de pouvoir vous aider. Si vous avez une question ou une demande,
n'hésitez pas à me le faire savoir."
```
<br/>
# Credits
* [**Angel Uriot**](https://github.com/angeluriot) : Creator of the project.
* All the people who contributed to the sources of the dataset (see the [**Sources**](#sources) section).
|
tyzhu/random_letter_same_length_find_passage_train50_eval20_title | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 37662
num_examples: 120
- name: validation
num_bytes: 7512
num_examples: 20
download_size: 28745
dataset_size: 45174
---
# Dataset Card for "random_letter_same_length_find_passage_train50_eval20_title"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/augmentatio-standardized_cluster_3_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 12560660
num_examples: 6337
download_size: 5185004
dataset_size: 12560660
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "augmentatio-standardized_cluster_3_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-acronym_identification-default-641c5d-40516105295 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- acronym_identification
eval_info:
task: entity_extraction
model: lewtun/autotrain-acronym-identification-7324788
metrics: ['angelina-wang/directional_bias_amplification']
dataset_name: acronym_identification
dataset_config: default
dataset_split: validation
col_mapping:
tokens: tokens
tags: labels
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: lewtun/autotrain-acronym-identification-7324788
* Dataset: acronym_identification
* Config: default
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@wandb.init(project=PROJECT](https://huggingface.co/wandb.init(project=PROJECT) for evaluating this model. |
CyberHarem/wendy_honkai3 | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of wendy (Houkai 3rd)
This is the dataset of wendy (Houkai 3rd), containing 45 images and their tags.
The core tags of this character are `bangs, black_hair, green_eyes, multicolored_hair, ahoge, hair_between_eyes, green_hair, short_hair, braid`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 45 | 68.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wendy_honkai3/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 45 | 36.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wendy_honkai3/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 104 | 72.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wendy_honkai3/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 45 | 58.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wendy_honkai3/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 104 | 101.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wendy_honkai3/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/wendy_honkai3',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | cleavage, white_dress, 1girl, hair_ornament, smile, solo, barefoot, black_gloves, full_body, tattoo, anklet, closed_mouth, elbow_gloves, feet, looking_at_viewer, toes |
| 1 | 10 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, solo, white_scarf, open_mouth, simple_background, white_dress, antenna_hair, bandages, barefoot, green_gloves, smile, glowing, long_sleeves, toes, white_background |
| 2 | 7 |  |  |  |  |  | 1boy, blue_hair, gradient_hair, male_focus, simple_background, twin_braids, androgynous, long_sleeves, looking_at_viewer, short_hair_with_long_locks, shorts, solo, crop_top, feathered_wings, hood_down, hooded_capelet, midriff, official_alternate_costume, open_mouth, smile, white_flower, white_wings, bridal_gauntlets, chest_tattoo, hair_flower, holding_instrument, jewelry, leg_tattoo, lyre, thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | cleavage | white_dress | 1girl | hair_ornament | smile | solo | barefoot | black_gloves | full_body | tattoo | anklet | closed_mouth | elbow_gloves | feet | looking_at_viewer | toes | bare_shoulders | white_scarf | open_mouth | simple_background | antenna_hair | bandages | green_gloves | glowing | long_sleeves | white_background | 1boy | blue_hair | gradient_hair | male_focus | twin_braids | androgynous | short_hair_with_long_locks | shorts | crop_top | feathered_wings | hood_down | hooded_capelet | midriff | official_alternate_costume | white_flower | white_wings | bridal_gauntlets | chest_tattoo | hair_flower | holding_instrument | jewelry | leg_tattoo | lyre | thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------|:--------------|:--------|:----------------|:--------|:-------|:-----------|:---------------|:------------|:---------|:---------|:---------------|:---------------|:-------|:--------------------|:-------|:-----------------|:--------------|:-------------|:--------------------|:---------------|:-----------|:---------------|:----------|:---------------|:-------------------|:-------|:------------|:----------------|:-------------|:--------------|:--------------|:-----------------------------|:---------|:-----------|:------------------|:------------|:-----------------|:----------|:-----------------------------|:---------------|:--------------|:-------------------|:---------------|:--------------|:---------------------|:----------|:-------------|:-------|:-------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | | X | X | | X | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | | | | | X | X | | | | | | | | | X | | | | X | X | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
pharaouk/SkunkData-Corpus-Clusters | ---
configs:
- config_name: default
data_files:
- split: orca_0
path: data/orca_0-*
- split: instruct_0
path: data/instruct_0-*
- split: orca_1
path: data/orca_1-*
- split: instruct_1
path: data/instruct_1-*
- split: orca_2
path: data/orca_2-*
- split: instruct_2
path: data/instruct_2-*
- split: orca_3
path: data/orca_3-*
- split: instruct_3
path: data/instruct_3-*
- split: orca_4
path: data/orca_4-*
- split: instruct_4
path: data/instruct_4-*
- split: orca_5
path: data/orca_5-*
- split: instruct_5
path: data/instruct_5-*
- split: orca_6
path: data/orca_6-*
- split: instruct_6
path: data/instruct_6-*
- split: orca_7
path: data/orca_7-*
- split: instruct_7
path: data/instruct_7-*
- split: orca_8
path: data/orca_8-*
- split: instruct_8
path: data/instruct_8-*
- split: orca_9
path: data/orca_9-*
- split: instruct_9
path: data/instruct_9-*
- split: orca_10
path: data/orca_10-*
- split: instruct_10
path: data/instruct_10-*
- split: orca_11
path: data/orca_11-*
- split: instruct_11
path: data/instruct_11-*
- split: orca_12
path: data/orca_12-*
- split: instruct_12
path: data/instruct_12-*
- split: orca_13
path: data/orca_13-*
- split: instruct_13
path: data/instruct_13-*
- split: orca_14
path: data/orca_14-*
- split: instruct_14
path: data/instruct_14-*
- split: orca_15
path: data/orca_15-*
- split: instruct_15
path: data/instruct_15-*
- split: orca_16
path: data/orca_16-*
- split: instruct_16
path: data/instruct_16-*
- split: orca_17
path: data/orca_17-*
- split: instruct_17
path: data/instruct_17-*
- split: orca_18
path: data/orca_18-*
- split: instruct_18
path: data/instruct_18-*
- split: orca_19
path: data/orca_19-*
- split: instruct_19
path: data/instruct_19-*
- split: orca_20
path: data/orca_20-*
- split: instruct_20
path: data/instruct_20-*
- split: orca_21
path: data/orca_21-*
- split: instruct_21
path: data/instruct_21-*
- split: orca_22
path: data/orca_22-*
- split: instruct_22
path: data/instruct_22-*
- split: orca_23
path: data/orca_23-*
- split: instruct_23
path: data/instruct_23-*
- split: orca_24
path: data/orca_24-*
- split: instruct_24
path: data/instruct_24-*
- split: orca_25
path: data/orca_25-*
- split: instruct_25
path: data/instruct_25-*
- split: orca_26
path: data/orca_26-*
- split: instruct_26
path: data/instruct_26-*
- split: orca_27
path: data/orca_27-*
- split: instruct_27
path: data/instruct_27-*
- split: orca_28
path: data/orca_28-*
- split: instruct_28
path: data/instruct_28-*
- split: orca_29
path: data/orca_29-*
- split: instruct_29
path: data/instruct_29-*
- split: orca_30
path: data/orca_30-*
- split: instruct_30
path: data/instruct_30-*
- split: orca_31
path: data/orca_31-*
- split: instruct_31
path: data/instruct_31-*
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_conversation_id
dtype: string
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: orca_0
num_bytes: 17849715
num_examples: 18401
- name: instruct_0
num_bytes: 70074569
num_examples: 81024
- name: orca_1
num_bytes: 23680133
num_examples: 28584
- name: instruct_1
num_bytes: 82931087
num_examples: 96749
- name: orca_2
num_bytes: 19980410
num_examples: 17412
- name: instruct_2
num_bytes: 154000003
num_examples: 124814
- name: orca_3
num_bytes: 17101778
num_examples: 32038
- name: instruct_3
num_bytes: 49883928
num_examples: 63327
- name: orca_4
num_bytes: 31656753
num_examples: 34675
- name: instruct_4
num_bytes: 127695479
num_examples: 126005
- name: orca_5
num_bytes: 16269511
num_examples: 14092
- name: instruct_5
num_bytes: 61398228
num_examples: 59076
- name: orca_6
num_bytes: 1342860
num_examples: 2388
- name: instruct_6
num_bytes: 48450814
num_examples: 66011
- name: orca_7
num_bytes: 44849080
num_examples: 36172
- name: instruct_7
num_bytes: 65892068
num_examples: 59876
- name: orca_8
num_bytes: 19352268
num_examples: 18871
- name: instruct_8
num_bytes: 227627947
num_examples: 170841
- name: orca_9
num_bytes: 14700372
num_examples: 15315
- name: instruct_9
num_bytes: 64004683
num_examples: 60637
- name: orca_10
num_bytes: 508915
num_examples: 1446
- name: instruct_10
num_bytes: 24081225
num_examples: 48031
- name: orca_11
num_bytes: 19443068
num_examples: 19745
- name: instruct_11
num_bytes: 82438320
num_examples: 80868
- name: orca_12
num_bytes: 4848059
num_examples: 7172
- name: instruct_12
num_bytes: 166293672
num_examples: 182113
- name: orca_13
num_bytes: 10599648
num_examples: 19167
- name: instruct_13
num_bytes: 84060226
num_examples: 152834
- name: orca_14
num_bytes: 15987021
num_examples: 24048
- name: instruct_14
num_bytes: 59454799
num_examples: 91972
- name: orca_15
num_bytes: 23903599
num_examples: 24410
- name: instruct_15
num_bytes: 85555445
num_examples: 84953
- name: orca_16
num_bytes: 23154299
num_examples: 19289
- name: instruct_16
num_bytes: 101140401
num_examples: 90731
- name: orca_17
num_bytes: 2152082
num_examples: 3809
- name: instruct_17
num_bytes: 66472234
num_examples: 80386
- name: orca_18
num_bytes: 83273007
num_examples: 45544
- name: instruct_18
num_bytes: 110961860
num_examples: 80604
- name: orca_19
num_bytes: 1386401
num_examples: 1644
- name: instruct_19
num_bytes: 37424277
num_examples: 42630
- name: orca_20
num_bytes: 15212013
num_examples: 14602
- name: instruct_20
num_bytes: 94216681
num_examples: 77830
- name: orca_21
num_bytes: 3440922
num_examples: 4174
- name: instruct_21
num_bytes: 124095838
num_examples: 87012
- name: orca_22
num_bytes: 11468080
num_examples: 14191
- name: instruct_22
num_bytes: 63633991
num_examples: 78980
- name: orca_23
num_bytes: 3591049
num_examples: 3778
- name: instruct_23
num_bytes: 95699355
num_examples: 69680
- name: orca_24
num_bytes: 1309953
num_examples: 2395
- name: instruct_24
num_bytes: 82548064
num_examples: 92642
- name: orca_25
num_bytes: 20598114
num_examples: 18715
- name: instruct_25
num_bytes: 132539502
num_examples: 99843
- name: orca_26
num_bytes: 31638864
num_examples: 65463
- name: instruct_26
num_bytes: 52624322
num_examples: 81968
- name: orca_27
num_bytes: 3056079
num_examples: 5939
- name: instruct_27
num_bytes: 29071432
num_examples: 55864
- name: orca_28
num_bytes: 12158143
num_examples: 16039
- name: instruct_28
num_bytes: 67326019
num_examples: 84243
- name: orca_29
num_bytes: 33228880
num_examples: 65846
- name: instruct_29
num_bytes: 16788126
num_examples: 21536
- name: orca_30
num_bytes: 1580412
num_examples: 1991
- name: instruct_30
num_bytes: 15819978
num_examples: 29766
- name: orca_31
num_bytes: 6719191
num_examples: 11269
- name: instruct_31
num_bytes: 29009522
num_examples: 47163
download_size: 1412051638
dataset_size: 3109254774
---
# Dataset Card for "SkunkData-Corpus-Clusters"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
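The YAML above declares 64 splits in a regular pattern: one `orca_N` and one `instruct_N` split per cluster, for `N` from 0 to 31. A minimal sketch of enumerating those split names (the repository id is not part of this excerpt, so the `load_dataset` call is shown only as a commented placeholder):

```python
# Enumerate the split names declared in the card's YAML:
# paired "orca_N" and "instruct_N" splits for N in 0..31.
split_names = []
for i in range(32):
    split_names.append(f"orca_{i}")
    split_names.append(f"instruct_{i}")

print(len(split_names))   # 64 splits in total
print(split_names[:2])    # ['orca_0', 'instruct_0']

# Loading one split would then look like (repository id omitted here):
# from datasets import load_dataset
# ds = load_dataset("<repo-id>", split="orca_0")
```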
open-llm-leaderboard/details_Kukedlc__Neural-Krishna-Multiverse-7b | ---
pretty_name: Evaluation run of Kukedlc/Neural-Krishna-Multiverse-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kukedlc/Neural-Krishna-Multiverse-7b](https://huggingface.co/Kukedlc/Neural-Krishna-Multiverse-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kukedlc__Neural-Krishna-Multiverse-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-14T14:05:24.448255](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__Neural-Krishna-Multiverse-7b/blob/main/results_2024-03-14T14-05-24.448255.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6532152417131584,\n\
\ \"acc_stderr\": 0.032028265969913886,\n \"acc_norm\": 0.652663258288876,\n\
\ \"acc_norm_stderr\": 0.032697634517724276,\n \"mc1\": 0.6144430844553244,\n\
\ \"mc1_stderr\": 0.017038839010591663,\n \"mc2\": 0.7674849994719457,\n\
\ \"mc2_stderr\": 0.013901832413470068\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.01336308010724448,\n\
\ \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545796\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7139016132244573,\n\
\ \"acc_stderr\": 0.004510123171357374,\n \"acc_norm\": 0.8905596494722167,\n\
\ \"acc_norm_stderr\": 0.0031155287338295755\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066482,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066482\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455334,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455334\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.441340782122905,\n\
\ \"acc_stderr\": 0.016607021781050873,\n \"acc_norm\": 0.441340782122905,\n\
\ \"acc_norm_stderr\": 0.016607021781050873\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.02575586592263295,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.02575586592263295\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n\
\ \"acc_stderr\": 0.012745204626083136,\n \"acc_norm\": 0.46870925684485004,\n\
\ \"acc_norm_stderr\": 0.012745204626083136\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806318,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806318\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6144430844553244,\n\
\ \"mc1_stderr\": 0.017038839010591663,\n \"mc2\": 0.7674849994719457,\n\
\ \"mc2_stderr\": 0.013901832413470068\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8468823993685872,\n \"acc_stderr\": 0.010120623252272962\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6952236542835482,\n \
\ \"acc_stderr\": 0.012679297549515427\n }\n}\n```"
repo_url: https://huggingface.co/Kukedlc/Neural-Krishna-Multiverse-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|arc:challenge|25_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|gsm8k|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hellaswag|10_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T14-05-24.448255.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T14-05-24.448255.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- '**/details_harness|winogrande|5_2024-03-14T14-05-24.448255.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-14T14-05-24.448255.parquet'
- config_name: results
data_files:
- split: 2024_03_14T14_05_24.448255
path:
- results_2024-03-14T14-05-24.448255.parquet
- split: latest
path:
- results_2024-03-14T14-05-24.448255.parquet
---
# Dataset Card for Evaluation run of Kukedlc/Neural-Krishna-Multiverse-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kukedlc/Neural-Krishna-Multiverse-7b](https://huggingface.co/Kukedlc/Neural-Krishna-Multiverse-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kukedlc__Neural-Krishna-Multiverse-7b",
"harness_winogrande_5",
	split="latest")
```
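Since per-run splits are named with an underscore-escaped timestamp (e.g. `2024_03_14T14_05_24.448255`), it can be convenient to convert a split name back into a `datetime` when comparing runs programmatically. A minimal helper sketch (the function name is illustrative, not part of any library):

```python
from datetime import datetime

def split_name_to_datetime(split_name: str) -> datetime:
    # Split names escape '-' and ':' as '_', e.g. '2024_03_14T14_05_24.448255';
    # undo that substitution on each side of the 'T' separator.
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

print(split_name_to_datetime("2024_03_14T14_05_24.448255"))
# → 2024-03-14 14:05:24.448255
```

This lets you sort or filter the timestamped splits of any configuration before falling back to the "latest" alias.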
## Latest results
These are the [latest results from run 2024-03-14T14:05:24.448255](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__Neural-Krishna-Multiverse-7b/blob/main/results_2024-03-14T14-05-24.448255.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the per-task configurations, under the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6532152417131584,
"acc_stderr": 0.032028265969913886,
"acc_norm": 0.652663258288876,
"acc_norm_stderr": 0.032697634517724276,
"mc1": 0.6144430844553244,
"mc1_stderr": 0.017038839010591663,
"mc2": 0.7674849994719457,
"mc2_stderr": 0.013901832413470068
},
"harness|arc:challenge|25": {
"acc": 0.7022184300341296,
"acc_stderr": 0.01336308010724448,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545796
},
"harness|hellaswag|10": {
"acc": 0.7139016132244573,
"acc_stderr": 0.004510123171357374,
"acc_norm": 0.8905596494722167,
"acc_norm_stderr": 0.0031155287338295755
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066482,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.046695106638751906,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.046695106638751906
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.441340782122905,
"acc_stderr": 0.016607021781050873,
"acc_norm": 0.441340782122905,
"acc_norm_stderr": 0.016607021781050873
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.02575586592263295,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.02575586592263295
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083136,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083136
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806318,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806318
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6144430844553244,
"mc1_stderr": 0.017038839010591663,
"mc2": 0.7674849994719457,
"mc2_stderr": 0.013901832413470068
},
"harness|winogrande|5": {
"acc": 0.8468823993685872,
"acc_stderr": 0.010120623252272962
},
"harness|gsm8k|5": {
"acc": 0.6952236542835482,
"acc_stderr": 0.012679297549515427
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AkashPS11/recipes_data_food.com | ---
license: mit
---
|
mriosqu/landing_pages_04_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 64643520.0
num_examples: 85
download_size: 63342265
dataset_size: 64643520.0
---
# Dataset Card for "landing_pages_04_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
storia/rick-and-morty-all-seasons | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: subtitle
dtype: string
- name: caption
dtype: string
- name: characters
dtype: string
- name: frame
dtype: string
splits:
- name: train
num_bytes: 1637895252.464
num_examples: 15264
- name: test
num_bytes: 5458443.0
num_examples: 46
download_size: 1363032355
dataset_size: 1643353695.464
---
# Dataset Card for "rick-and-morty-all-seasons-v4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ashleybishop/tomi_nil_inference_v3 | ---
dataset_info:
features:
- name: label
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2141230
num_examples: 5994
- name: validation
num_bytes: 2145044
num_examples: 5994
- name: test
num_bytes: 2135683
num_examples: 5994
download_size: 793170
dataset_size: 6421957
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
MarcelM/sloreddit | ---
license: unknown
---
|
tyzhu/find_last_sent_train_30_eval_10_hint3 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: title
dtype: string
- name: context
dtype: string
splits:
- name: train
num_bytes: 90057
num_examples: 70
- name: validation
num_bytes: 11016
num_examples: 10
download_size: 65240
dataset_size: 101073
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
# Dataset Card for "find_last_sent_train_30_eval_10_hint3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nelorth/oxford-flowers | ---
pretty_name: Oxford Flowers Dataset
source_datasets: https://www.robots.ox.ac.uk/~vgg/data/flowers
tags:
- flowers
- oxford
task_categories:
- image-classification
- unconditional-image-generation
license:
- unknown
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '1'
'1': '10'
'2': '100'
'3': '101'
'4': '102'
'5': '11'
'6': '12'
'7': '13'
'8': '14'
'9': '15'
'10': '16'
'11': '17'
'12': '18'
'13': '19'
'14': '2'
'15': '20'
'16': '21'
'17': '22'
'18': '23'
'19': '24'
'20': '25'
'21': '26'
'22': '27'
'23': '28'
'24': '29'
'25': '3'
'26': '30'
'27': '31'
'28': '32'
'29': '33'
'30': '34'
'31': '35'
'32': '36'
'33': '37'
'34': '38'
'35': '39'
'36': '4'
'37': '40'
'38': '41'
'39': '42'
'40': '43'
'41': '44'
'42': '45'
'43': '46'
'44': '47'
'45': '48'
'46': '49'
'47': '5'
'48': '50'
'49': '51'
'50': '52'
'51': '53'
'52': '54'
'53': '55'
'54': '56'
'55': '57'
'56': '58'
'57': '59'
'58': '6'
'59': '60'
'60': '61'
'61': '62'
'62': '63'
'63': '64'
'64': '65'
'65': '66'
'66': '67'
'67': '68'
'68': '69'
'69': '7'
'70': '70'
'71': '71'
'72': '72'
'73': '73'
'74': '74'
'75': '75'
'76': '76'
'77': '77'
'78': '78'
'79': '79'
'80': '8'
'81': '80'
'82': '81'
'83': '82'
'84': '83'
'85': '84'
'86': '85'
'87': '86'
'88': '87'
'89': '88'
'90': '89'
'91': '9'
'92': '90'
'93': '91'
'94': '92'
'95': '93'
'96': '94'
'97': '95'
'98': '96'
'99': '97'
'100': '98'
'101': '99'
splits:
- name: train
num_bytes: 308119477.446
num_examples: 7169
- name: test
num_bytes: 43247670.14
num_examples: 1020
download_size: 346597973
dataset_size: 351367147.58599997
---
# Dataset Card for "oxford-flowers"
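Note that the integer labels in this configuration index into the `class_label` `names` list above, whose entries are the original Oxford-102 class IDs (`'1'`–`'102'`) sorted lexicographically as strings, so label `2` denotes class `100`, not class `2`. A minimal sketch (the helper name is illustrative, not part of the dataset) for mapping labels back to the original class numbers:

```python
# The class_label `names` above are the strings '1'..'102' in
# lexicographic order, so the integer label is NOT the class number.
names = sorted(str(i) for i in range(1, 103))  # '1', '10', '100', '101', ...

def label_to_oxford_class(label: int) -> int:
    """Map a dataset integer label back to the original Oxford class number."""
    return int(names[label])
```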
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TheGreatP/JimmyLondon | ---
license: openrail
---
|
joey234/mmlu-professional_law-original-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 2525751
num_examples: 1080
download_size: 1389567
dataset_size: 2525751
---
# Dataset Card for "mmlu-professional_law-original-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davidpedem/adv-ele | ---
dataset_info:
features:
- name: ADV
dtype: string
- name: ELE
dtype: string
splits:
- name: train
num_bytes: 430918.56140350876
num_examples: 1732
- name: test
num_bytes: 107978.43859649122
num_examples: 434
download_size: 292626
dataset_size: 538897.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/l_indomptable_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of l_indomptable/ランドンターブル/不屈 (Azur Lane)
This is the dataset of l_indomptable/ランドンターブル/不屈 (Azur Lane), containing 24 images and their tags.
The core tags of this character are `blue_eyes, long_hair, multicolored_hair, white_hair, breasts, hair_bun, very_long_hair, black_hair, double_bun, bangs, hair_between_eyes, small_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 24 | 50.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/l_indomptable_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 24 | 23.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/l_indomptable_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 60 | 51.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/l_indomptable_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 24 | 43.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/l_indomptable_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 60 | 82.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/l_indomptable_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/l_indomptable_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, solo, looking_at_viewer, blush, open_mouth, simple_background, white_background, white_dress, white_pantyhose, gradient_hair, twintails |
| 1 | 6 |  |  |  |  |  | 1girl, navel, blush, looking_at_viewer, solo, nipples, pussy, barefoot, on_back, spread_legs, underwear |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | blush | open_mouth | simple_background | white_background | white_dress | white_pantyhose | gradient_hair | twintails | navel | nipples | pussy | barefoot | on_back | spread_legs | underwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:-------------|:--------------------|:-------------------|:--------------|:------------------|:----------------|:------------|:--------|:----------|:--------|:-----------|:----------|:--------------|:------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | | | | | | | | X | X | X | X | X | X | X |
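As a sketch of how these clusters might be used for mining, an image's tag set can be assigned to the closest cluster above by Jaccard similarity (the function name and scoring choice here are illustrative assumptions, not part of the dataset):

```python
# Tag sets of the two clusters listed in the tables above.
clusters = {
    0: {"1girl", "solo", "looking_at_viewer", "blush", "open_mouth",
        "simple_background", "white_background", "white_dress",
        "white_pantyhose", "gradient_hair", "twintails"},
    1: {"1girl", "navel", "blush", "looking_at_viewer", "solo", "nipples",
        "pussy", "barefoot", "on_back", "spread_legs", "underwear"},
}

def closest_cluster(tags):
    """Return the cluster id whose tag set best matches `tags` (Jaccard)."""
    tags = set(tags)
    return max(clusters, key=lambda c: len(tags & clusters[c]) / len(tags | clusters[c]))
```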
|
open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v3-refined | ---
pretty_name: Evaluation run of indischepartij/OpenMia-Indo-Mistral-7b-v3-refined
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [indischepartij/OpenMia-Indo-Mistral-7b-v3-refined](https://huggingface.co/indischepartij/OpenMia-Indo-Mistral-7b-v3-refined)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v3-refined\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-05T07:17:24.697764](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v3-refined/blob/main/results_2024-02-05T07-17-24.697764.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6291464441848997,\n\
\ \"acc_stderr\": 0.0325052901080867,\n \"acc_norm\": 0.6303599490256852,\n\
\ \"acc_norm_stderr\": 0.03317036353110983,\n \"mc1\": 0.3769889840881273,\n\
\ \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5394835979252358,\n\
\ \"mc2_stderr\": 0.015320518165942699\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6049488054607508,\n \"acc_stderr\": 0.014285898292938167,\n\
\ \"acc_norm\": 0.64419795221843,\n \"acc_norm_stderr\": 0.01399057113791876\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.640211113324039,\n\
\ \"acc_stderr\": 0.004789575163418651,\n \"acc_norm\": 0.842162915753834,\n\
\ \"acc_norm_stderr\": 0.0036384306206139337\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.373015873015873,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\"\
: 0.373015873015873,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\
\ \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n\
\ \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922986,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922986\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.03095663632856655,\n \
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.03095663632856655\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8146788990825689,\n \"acc_stderr\": 0.016659279700295827,\n \"\
acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.016659279700295827\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.040655781409087044,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.040655781409087044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n\
\ \"acc_stderr\": 0.013853724170922533,\n \"acc_norm\": 0.8160919540229885,\n\
\ \"acc_norm_stderr\": 0.013853724170922533\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26256983240223464,\n\
\ \"acc_stderr\": 0.014716824273017765,\n \"acc_norm\": 0.26256983240223464,\n\
\ \"acc_norm_stderr\": 0.014716824273017765\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.02617390850671858,\n\
\ \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.02617390850671858\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.025630824975621355,\n\
\ \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.025630824975621355\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873862,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873862\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\
\ \"acc_stderr\": 0.012739711554045704,\n \"acc_norm\": 0.4654498044328553,\n\
\ \"acc_norm_stderr\": 0.012739711554045704\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n\
\ \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6339869281045751,\n \"acc_stderr\": 0.019488025745529675,\n \
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.019488025745529675\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368043,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368043\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n\
\ \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5394835979252358,\n\
\ \"mc2_stderr\": 0.015320518165942699\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8153117600631413,\n \"acc_stderr\": 0.01090597811215688\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6125852918877938,\n \
\ \"acc_stderr\": 0.01341879844782737\n }\n}\n```"
repo_url: https://huggingface.co/indischepartij/OpenMia-Indo-Mistral-7b-v3-refined
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|arc:challenge|25_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|gsm8k|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hellaswag|10_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T07-17-24.697764.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-05T07-17-24.697764.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- '**/details_harness|winogrande|5_2024-02-05T07-17-24.697764.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-05T07-17-24.697764.parquet'
- config_name: results
data_files:
- split: 2024_02_05T07_17_24.697764
path:
- results_2024-02-05T07-17-24.697764.parquet
- split: latest
path:
- results_2024-02-05T07-17-24.697764.parquet
---
# Dataset Card for Evaluation run of indischepartij/OpenMia-Indo-Mistral-7b-v3-refined
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [indischepartij/OpenMia-Indo-Mistral-7b-v3-refined](https://huggingface.co/indischepartij/OpenMia-Indo-Mistral-7b-v3-refined) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v3-refined",
"harness_winogrande_5",
	split="latest")
```
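Once loaded, the per-task metrics follow the nested task-to-metric structure shown in the JSON under "Latest results" below. As a minimal sketch of working with that structure, the snippet below flattens it into rows with plain Python; the sample values are copied from this card, and the `flatten_results` helper name is illustrative, not part of the dataset:

```python
# Sample of the nested results structure (task -> metric -> value),
# with values copied from the "Latest results" JSON in this card.
sample_results = {
    "harness|arc:challenge|25": {"acc": 0.6049488054607508, "acc_norm": 0.64419795221843},
    "harness|hellaswag|10": {"acc": 0.640211113324039, "acc_norm": 0.842162915753834},
}

def flatten_results(results):
    """Flatten a nested results dict into sorted (task, metric, value) triples."""
    rows = []
    for task, metrics in results.items():
        for metric, value in metrics.items():
            rows.append((task, metric, value))
    return sorted(rows)

for task, metric, value in flatten_results(sample_results):
    print(f"{task:30s} {metric:10s} {value:.4f}")
```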
## Latest results
These are the [latest results from run 2024-02-05T07:17:24.697764](https://huggingface.co/datasets/open-llm-leaderboard/details_indischepartij__OpenMia-Indo-Mistral-7b-v3-refined/blob/main/results_2024-02-05T07-17-24.697764.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6291464441848997,
"acc_stderr": 0.0325052901080867,
"acc_norm": 0.6303599490256852,
"acc_norm_stderr": 0.03317036353110983,
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5394835979252358,
"mc2_stderr": 0.015320518165942699
},
"harness|arc:challenge|25": {
"acc": 0.6049488054607508,
"acc_stderr": 0.014285898292938167,
"acc_norm": 0.64419795221843,
"acc_norm_stderr": 0.01399057113791876
},
"harness|hellaswag|10": {
"acc": 0.640211113324039,
"acc_stderr": 0.004789575163418651,
"acc_norm": 0.842162915753834,
"acc_norm_stderr": 0.0036384306206139337
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.02490699045899257,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.02490699045899257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922986,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922986
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.03095663632856655,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.03095663632856655
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.016659279700295827,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.016659279700295827
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.040655781409087044,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.040655781409087044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.013853724170922533,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.013853724170922533
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26256983240223464,
"acc_stderr": 0.014716824273017765,
"acc_norm": 0.26256983240223464,
"acc_norm_stderr": 0.014716824273017765
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.02617390850671858,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.02617390850671858
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.025630824975621355,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.025630824975621355
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873862,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873862
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045704,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045704
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.019488025745529675,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.019488025745529675
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368043,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368043
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5394835979252358,
"mc2_stderr": 0.015320518165942699
},
"harness|winogrande|5": {
"acc": 0.8153117600631413,
"acc_stderr": 0.01090597811215688
},
"harness|gsm8k|5": {
"acc": 0.6125852918877938,
"acc_stderr": 0.01341879844782737
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kannanwisen/De-Hazing-Dataset | ---
license: creativeml-openrail-m
---
|
BByrneLab/RAVQA_v2_data | ---
license: mit
---
|
open-llm-leaderboard/details_v2ray__LLaMA-2-Jannie-70B-QLoRA | ---
pretty_name: Evaluation run of v2ray/LLaMA-2-Jannie-70B-QLoRA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [v2ray/LLaMA-2-Jannie-70B-QLoRA](https://huggingface.co/v2ray/LLaMA-2-Jannie-70B-QLoRA)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_v2ray__LLaMA-2-Jannie-70B-QLoRA\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-09T18:55:45.725131](https://huggingface.co/datasets/open-llm-leaderboard/details_v2ray__LLaMA-2-Jannie-70B-QLoRA/blob/main/results_2023-10-09T18-55-45.725131.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.5506501677852349,\n\
\ \"em_stderr\": 0.0050941277409732805,\n \"f1\": 0.5974674916107394,\n\
\ \"f1_stderr\": 0.004813528422862131,\n \"acc\": 0.5735917227001633,\n\
\ \"acc_stderr\": 0.011696543872157381\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.5506501677852349,\n \"em_stderr\": 0.0050941277409732805,\n\
\ \"f1\": 0.5974674916107394,\n \"f1_stderr\": 0.004813528422862131\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.31766489764973466,\n \
\ \"acc_stderr\": 0.012824066621488854\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.010569021122825909\n\
\ }\n}\n```"
repo_url: https://huggingface.co/v2ray/LLaMA-2-Jannie-70B-QLoRA
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_09T18_55_45.725131
path:
- '**/details_harness|drop|3_2023-10-09T18-55-45.725131.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-09T18-55-45.725131.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_09T18_55_45.725131
path:
- '**/details_harness|gsm8k|5_2023-10-09T18-55-45.725131.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-09T18-55-45.725131.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_09T18_55_45.725131
path:
- '**/details_harness|winogrande|5_2023-10-09T18-55-45.725131.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-09T18-55-45.725131.parquet'
- config_name: results
data_files:
- split: 2023_10_09T18_55_45.725131
path:
- results_2023-10-09T18-55-45.725131.parquet
- split: latest
path:
- results_2023-10-09T18-55-45.725131.parquet
---
# Dataset Card for Evaluation run of v2ray/LLaMA-2-Jannie-70B-QLoRA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/v2ray/LLaMA-2-Jannie-70B-QLoRA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [v2ray/LLaMA-2-Jannie-70B-QLoRA](https://huggingface.co/v2ray/LLaMA-2-Jannie-70B-QLoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_v2ray__LLaMA-2-Jannie-70B-QLoRA",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-09T18:55:45.725131](https://huggingface.co/datasets/open-llm-leaderboard/details_v2ray__LLaMA-2-Jannie-70B-QLoRA/blob/main/results_2023-10-09T18-55-45.725131.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.5506501677852349,
"em_stderr": 0.0050941277409732805,
"f1": 0.5974674916107394,
"f1_stderr": 0.004813528422862131,
"acc": 0.5735917227001633,
"acc_stderr": 0.011696543872157381
},
"harness|drop|3": {
"em": 0.5506501677852349,
"em_stderr": 0.0050941277409732805,
"f1": 0.5974674916107394,
"f1_stderr": 0.004813528422862131
},
"harness|gsm8k|5": {
"acc": 0.31766489764973466,
"acc_stderr": 0.012824066621488854
},
"harness|winogrande|5": {
"acc": 0.829518547750592,
"acc_stderr": 0.010569021122825909
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
distilled-one-sec-cv12-each-chunk-uniq/chunk_89 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1319935004.0
num_examples: 257197
download_size: 1351707933
dataset_size: 1319935004.0
---
# Dataset Card for "chunk_89"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jeswinLLM/RANDOM_DATA | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': cancel_order
'1': change_order
'2': change_shipping_address
'3': check_cancellation_fee
'4': check_invoice
'5': check_payment_methods
'6': check_refund_policy
'7': complaint
'8': contact_customer_service
'9': contact_human_agent
'10': create_account
'11': delete_account
'12': delivery_options
'13': delivery_period
'14': edit_account
'15': get_invoice
'16': get_refund
'17': newsletter_subscription
'18': payment_issue
'19': place_order
'20': recover_password
'21': registration_problems
'22': review
'23': set_up_shipping_address
'24': switch_account
'25': track_order
'26': track_refund
splits:
- name: train
num_bytes: 1582479.0
num_examples: 26872
download_size: 426257
dataset_size: 1582479.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
asun17904/wikitext_no_bank_examples | ---
dataset_info:
features:
- name: page
dtype: string
- name: sentences
sequence: string
splits:
- name: train
num_bytes: 21795788
num_examples: 629
download_size: 12010779
dataset_size: 21795788
---
# Dataset Card for "wikitext_no_bank_examples"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xchange/ensw_processed | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: lcs
dtype: string
- name: similarity
dtype: float64
- name: labse_similarity
dtype: float64
- name: en
dtype: string
- name: sw
dtype: string
splits:
- name: train
num_bytes: 26811784
num_examples: 115392
download_size: 19329703
dataset_size: 26811784
---
# Dataset Card for "ensw_processed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mrpc_null_prepositions | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 417200
num_examples: 1647
- name: train
num_bytes: 889271
num_examples: 3500
- name: validation
num_bytes: 98226
num_examples: 382
download_size: 938832
dataset_size: 1404697
---
# Dataset Card for "MULTI_VALUE_mrpc_null_prepositions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Zainab984/BP-balanced | ---
dataset_info:
features:
- name: Target
dtype: int64
- name: PC
dtype: string
- name: GSHARE
dtype: string
- name: GA table
dtype: string
splits:
- name: train
num_bytes: 41082000
num_examples: 82164
- name: test
num_bytes: 10271000
num_examples: 20542
download_size: 2354826
dataset_size: 51353000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Effyis/Table-Extraction | ---
license: apache-2.0
task_categories:
- feature-extraction
language:
- en
- ar
configs:
- config_name: default
data_files:
- split: train
path: table_extract.csv
tags:
- finance
---
# Table Extract Dataset
This dataset is designed to evaluate the ability of large language models (LLMs) to extract tables from text. It provides a collection of text snippets containing tables and their corresponding structured representations in JSON format.
## Source
The dataset is based on the [Table Fact Dataset](https://github.com/wenhuchen/Table-Fact-Checking/tree/master?tab=readme-ov-file), also known as TabFact, which contains 16,573 tables extracted from Wikipedia.
## Schema:
Each data point in the dataset consists of two elements:
* context: A string containing the text snippet with the embedded table.
* answer: A JSON object representing the extracted table structure.
The JSON object follows this format:
```json
{
    "column_1": { "row_id": "val1", "row_id": "val2", ... },
    "column_2": { "row_id": "val1", "row_id": "val2", ... },
    ...
}
```
Each key in the JSON object represents a column header, and the corresponding value is another object containing key-value pairs for each row in that column.
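As a rough illustration, the column-oriented answer can be pivoted into a list of row records. This is a minimal sketch assuming the flat, single-level format shown in Example 1 (it does not handle the nested exonym/endonym columns of Example 2), and the `answer` value here is an invented fragment for demonstration:

```python
import json

# Hypothetical answer fragment in the flat column-oriented format described above.
answer = json.loads("""
{
  "aircraft": {"0": "robinson r - 22", "1": "bell 206b3 jetranger"},
  "description": {"0": "light utility helicopter", "1": "turboshaft utility helicopter"}
}
""")

def columns_to_rows(table):
    """Convert {column: {row_id: value}} into a list of row dicts, ordered by row id."""
    row_ids = sorted({rid for col in table.values() for rid in col}, key=int)
    return [{col: vals.get(rid) for col, vals in table.items()} for rid in row_ids]

rows = columns_to_rows(answer)
print(rows[0]["aircraft"])  # robinson r - 22
```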
## Examples:
### Example 1:
#### Context:

#### Answer:
```json
{
"aircraft": {
"0": "robinson r - 22",
"1": "bell 206b3 jetranger",
"2": "ch - 47d chinook",
"3": "mil mi - 26",
"4": "ch - 53e super stallion"
},
"description": {
"0": "light utility helicopter",
"1": "turboshaft utility helicopter",
"2": "tandem rotor helicopter",
"3": "heavy - lift helicopter",
"4": "heavy - lift helicopter"
},
"max gross weight": {
"0": "1370 lb (635 kg)",
"1": "3200 lb (1451 kg)",
"2": "50000 lb (22680 kg)",
"3": "123500 lb (56000 kg)",
"4": "73500 lb (33300 kg)"
},
"total disk area": {
"0": "497 ft square (46.2 m square)",
"1": "872 ft square (81.1 m square)",
"2": "5655 ft square (526 m square)",
"3": "8495 ft square (789 m square)",
"4": "4900 ft square (460 m square)"
},
"max disk loading": {
"0": "2.6 lb / ft square (14 kg / m square)",
"1": "3.7 lb / ft square (18 kg / m square)",
"2": "8.8 lb / ft square (43 kg / m square)",
"3": "14.5 lb / ft square (71 kg / m square)",
"4": "15 lb / ft square (72 kg / m square)"
}
}
```
### Example 2:
#### Context:

#### Answer:
```json
{
"country": {
"exonym": {
"0": "iceland",
"1": "indonesia",
"2": "iran",
"3": "iraq",
"4": "ireland",
"5": "isle of man"
},
"endonym": {
"0": "ísland",
"1": "indonesia",
"2": "īrān ایران",
"3": "al - 'iraq العراق îraq",
"4": "éire ireland",
"5": "isle of man ellan vannin"
}
},
"capital": {
"exonym": {
"0": "reykjavík",
"1": "jakarta",
"2": "tehran",
"3": "baghdad",
"4": "dublin",
"5": "douglas"
},
"endonym": {
"0": "reykjavík",
"1": "jakarta",
"2": "tehrān تهران",
"3": "baghdad بغداد bexda",
"4": "baile átha cliath dublin",
"5": "douglas doolish"
}
},
"official or native language(s) (alphabet/script)": {
"0": "icelandic",
"1": "bahasa indonesia",
"2": "persian ( arabic script )",
"3": "arabic ( arabic script ) kurdish",
"4": "irish english",
"5": "english manx"
}
}
``` |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/d754a8c6 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1338
dataset_size: 182
---
# Dataset Card for "d754a8c6"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fnlp/character-llm-data | ---
license: cc-by-nc-4.0
---
# Character-LLM: A Trainable Agent for Role-Playing
These are the training datasets for Character-LLM, which contain the experience data for nine characters used to train Character-LLMs.
To download the dataset, please run the following code with Python, and you can find the downloaded data in `/path/to/local_dir`.
```python
from huggingface_hub import snapshot_download
snapshot_download(
local_dir_use_symlinks=True,
repo_type="dataset",
repo_id="fnlp/character-llm-data",
local_dir="/path/to/local_dir")
```
The `prompted/` directory contains datasets that can be used directly for supervised fine-tuning, while `generated/` consists of raw data generated by gpt-3.5-turbo, which can be converted into the `prompted` style.
Here are the statistics of the training data.
| | # Scenes | # Words | # Turns |
|----------------------|---------|--------|--------|
| Cleopatra VII | 1.4K | 723K | 14.3 |
| Lord Voldemort | 1.4K | 599K | 13.1 |
| Spartacus | 1.4K | 646K | 12.3 |
| Hermione Granger | 1.5K | 628K | 15.5 |
| Isaac Newton | 1.6K | 772K | 12.6 |
| Julius Caesar | 1.6K | 820K | 12.9 |
| Ludwig van Beethoven | 1.6K | 663K | 12.2 |
| Socrates | 1.6K | 896K | 14.1 |
| Martin Luther King | 2.2K | 1,038K | 12.0 |
| Avg. | 1.6K | 754K | 13.2 |
|
LucasThil/miniwob_plusplus_T5_unbounded | ---
dataset_info:
features:
- name: history_episodes
dtype: string
- name: instruction
dtype: string
- name: html_snippets
dtype: string
- name: actions
dtype: string
- name: refs
dtype: int64
- name: keydown_texts
dtype: string
splits:
- name: train
num_bytes: 360530668
num_examples: 75039
download_size: 39135939
dataset_size: 360530668
---
# Dataset Card for "miniwob_plusplus_T5_unbounded"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Isaak-Carter/MAIN_JOSIE_wizard_vicuna_70k_unfiltered_public | ---
dataset_info:
features:
- name: sample
dtype: string
splits:
- name: train
num_bytes: 139236325
num_examples: 34598
download_size: 67194101
dataset_size: 139236325
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lsy641/reddit_mhp | ---
license: mit
---
|
AdapterOcean/Open_Platypus_standardized_cluster_14_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 2497437
num_examples: 2340
download_size: 1169328
dataset_size: 2497437
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Open_Platypus_standardized_cluster_14_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/gr_mg5_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of gr_mg5/GrMG5/MG5 (Girls' Frontline)
This is the dataset of gr_mg5/GrMG5/MG5 (Girls' Frontline), containing 95 images and their tags.
The core tags of this character are `breasts, short_hair, hair_over_one_eye, grey_hair, large_breasts, blue_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 95 | 122.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg5_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 95 | 70.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg5_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 214 | 138.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg5_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 95 | 109.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg5_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 214 | 196.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg5_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/gr_mg5_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, fingerless_gloves, looking_at_viewer, solo, fur_trim, full_body, knee_boots, lace-up_boots, scarf, hood, official_alternate_costume, white_background, blush, christmas, cleavage, green_eyes, high_heel_boots, red_gloves, rifle, gift_box, medium_breasts, simple_background, torn_clothes |
| 1 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, black_gloves, solo, blush, simple_background, aqua_eyes, bangs, belt, closed_mouth, white_background, cleavage_cutout, fingerless_gloves, green_eyes, upper_body |
| 2 | 5 |  |  |  |  |  | 1girl, aqua_eyes, black_scarf, closed_mouth, long_hair, solo, black_jacket, boots, hat, leather_jacket, looking_at_viewer, bag, bangs, brown_headwear, full_body, grey_scarf, official_alternate_costume, open_jacket, plaid_headwear, purple_hair, rifle, white_background, white_shirt, belt, blush, bottle, brown_pants, brown_vest, headwear_removed, holding_cup, holding_weapon, machine_gun, ponytail, sitting, standing, torn_clothes, weapon_over_shoulder, zipper |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | fingerless_gloves | looking_at_viewer | solo | fur_trim | full_body | knee_boots | lace-up_boots | scarf | hood | official_alternate_costume | white_background | blush | christmas | cleavage | green_eyes | high_heel_boots | red_gloves | rifle | gift_box | medium_breasts | simple_background | torn_clothes | black_gloves | aqua_eyes | bangs | belt | closed_mouth | cleavage_cutout | upper_body | black_scarf | long_hair | black_jacket | boots | hat | leather_jacket | bag | brown_headwear | grey_scarf | open_jacket | plaid_headwear | purple_hair | white_shirt | bottle | brown_pants | brown_vest | headwear_removed | holding_cup | holding_weapon | machine_gun | ponytail | sitting | standing | weapon_over_shoulder | zipper |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------------------|:-------|:-----------|:------------|:-------------|:----------------|:--------|:-------|:-----------------------------|:-------------------|:--------|:------------|:-----------|:-------------|:------------------|:-------------|:--------|:-----------|:-----------------|:--------------------|:---------------|:---------------|:------------|:--------|:-------|:---------------|:------------------|:-------------|:--------------|:------------|:---------------|:--------|:------|:-----------------|:------|:-----------------|:-------------|:--------------|:-----------------|:--------------|:--------------|:---------|:--------------|:-------------|:-------------------|:--------------|:-----------------|:--------------|:-----------|:----------|:-----------|:-----------------------|:---------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | | | | | | | | X | X | | | X | | | | | | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | X | | X | | | | | X | X | X | | | | | | X | | | | X | | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Back-up/facebook_comment_agu | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: label
sequence: string
- name: predict
list:
- name: label
dtype: string
- name: score
dtype: float64
splits:
- name: train
num_bytes: 1366016
num_examples: 4245
download_size: 733483
dataset_size: 1366016
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_sst2_reduplicate_interrogative | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 3749
num_examples: 21
- name: test
num_bytes: 10986
num_examples: 70
- name: train
num_bytes: 132700
num_examples: 1055
download_size: 73714
dataset_size: 147435
---
# Dataset Card for "MULTI_VALUE_sst2_reduplicate_interrogative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
projecte-aina/ceil | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- ca
license: cc-by-4.0
multilinguality:
- monolingual
pretty_name: ceil
size_categories:
- unknown
source_datasets: []
task_ids: []
---
# Dataset Card for CEIL
## Dataset Description
- **Homepage** [Projecte AINA](https://projecteaina.cat/tech/)
- **Repository** [HuggingFace](https://huggingface.co/projecte-aina)
- **Point of Contact** langtech@bsc.es
### Dataset Summary
Catalan Entity Identification and Linking (CEIL) is a dataset for complex Named Entity Recognition (NER) created by the AINA project at the BSC for machine learning and language model evaluation in Catalan.
It contains 9 main types and 52 subtypes across all kinds of short texts, with almost 59K documents.

### Supported Tasks and Leaderboards
Named Entities Recognition, Language Model
### Languages
The dataset is in Catalan (`ca-ES`).
## Dataset Structure
### Data Instances
Three two-column files, one for each split.
<pre>
l' O
obra O
de O
Galileu B-person-scholar/scientist
, O
i O
de O
la O
multiplicació O
de O
les O
acadèmies O
científiques O
, O
com O
l' O
Accademia B-organization-education
dei I-organization-education
Lincei I-organization-education
</pre>
### Data Fields
Every file has two columns, with the word form or punctuation symbol in the first one and the corresponding IOB tag in the second one.
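As an illustration, the two-column files can be parsed into sentences of (token, tag) pairs with a short sketch like the following (the file path is a placeholder, and the sketch assumes well-formed token/tag lines):

```python
def read_iob(path):
    """Parse a two-column IOB file into a list of sentences.

    Each sentence is a list of (token, tag) pairs; blank lines
    separate sentences, as in the excerpt above.
    """
    sentences, current = [], []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.rstrip("\n")
            if not line.strip():
                if current:
                    sentences.append(current)
                    current = []
                continue
            # The tag is the last whitespace-separated column.
            token, tag = line.rsplit(maxsplit=1)
            current.append((token, tag))
    if current:
        sentences.append(current)
    return sentences
```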
### Data Splits
80/20 train and development split, balanced across all NERC tags. The test set includes documents that collectively contain all the possible types in the corpus.
## Dataset Creation
### Curation Rationale
We created this corpus to contribute to the development of language models in Catalan.
### Source Data
#### Initial Data Collection and Normalization
Documents were gathered from various online sources:
- Tweets about different topics, such as Catalan independence, coronavirus, Benidormfest, vaccines, etc.
- Newswire from Nació Digital (Motor section), Vilaweb (opinion pieces), and Agència Catalana de Notícies (Economy and Historical Memory sections)
- Various threads from the Racó Català forum
- Viquipèdia articles (women's biographies, film synopses, etc.)
- Other: parliament proceedings, online restaurant reviews, etc.
The word tokenization used to convert offset annotations into CoNLL files was done using spaCy.
#### Who are the source language producers?
The original data comes from various sources.
### Annotations
#### Annotation process
We adapted the NER labels to a token-per-line, multi-column format.
#### Who are the annotators?
The annotation was entrusted to the company M47 Labs through a public tender process.
Guidelines available at [Zenodo](https://doi.org/10.5281/zenodo.8318188)
### Personal and Sensitive Information
No personal or sensitive information included.
## Considerations for Using the Data
### Social Impact of Dataset
We hope this corpus contributes to the development of language models in Catalan, a low-resource language.
### Discussion of Biases
[N/A]
### Other Known Limitations
[N/A]
## Additional Information
### Dataset Curators
Language Technologies Unit at the Barcelona Supercomputing Center (langtech@bsc.es).
This work has been promoted and financed by the Generalitat de Catalunya through the [Aina project](https://projecteaina.cat/).
### Licensing Information
CEIL is distributed under a [CC BY 4.0](https://creativecommons.org/licenses/by/4.0/) license.
### Citation Information
```
```
### Contributions
[N/A] |
DBQ/Burberry.Product.prices.United.States | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: United States - Burberry - Product-level price list
tags:
- webscraping
- ecommerce
- Burberry
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: int64
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 969198
num_examples: 3038
download_size: 287409
dataset_size: 969198
---
# Burberry web scraped data
## About the website
Burberry operates within the **luxury fashion industry** in the United States, which is a segment of the wider **retail industry**. The American market is highly competitive and renowned for its considerable consumer spending. **E-commerce** has become a vital platform for luxury brands like Burberry to extend their customer reach, especially amid changing shopping habits among consumers. The **online luxury fashion market** in the United States has seen exponential growth, with more consumers turning to the convenience and vast product selection of online shopping. The dataset observed includes **Ecommerce product-list page (PLP) data** on Burberry in the United States.
## Link to **dataset**
[United States - Burberry - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Burberry%20Product-prices%20United%20States/r/rec3QS3ibv3mus8AY)
|
ThiennNguyen/MangaColoring | ---
license: openrail
---
|
bookbot/cmudict-0.7b | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 3233546
num_examples: 108132
- name: test
num_bytes: 385494
num_examples: 12855
- name: validation
num_bytes: 163320
num_examples: 5447
download_size: 2020319
dataset_size: 3782360
---
# Dataset Card for "cmudict-0.7b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
projecte-aina/UD_Catalan-AnCora | ---
annotations_creators:
- expert-generated
language:
- ca
language_creators:
- found
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: UD_Catalan-AnCora
size_categories: []
source_datasets: []
tags: []
task_categories:
- token-classification
task_ids:
- part-of-speech
---
# UD_Catalan-AnCora
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Website:** https://github.com/UniversalDependencies/UD_Catalan-AnCora
- **Point of Contact:** [Daniel Zeman](zeman@ufal.mff.cuni.cz)
### Dataset Summary
This dataset is composed of the annotations from the [AnCora corpus](http://clic.ub.edu/corpus/), projected on the [Universal Dependencies treebank](https://universaldependencies.org/). We use the POS annotations of this corpus as part of the [Catalan Language Understanding Benchmark (CLUB)](https://club.aina.bsc.es/).
This work is licensed under a <a rel="license" href="https://creativecommons.org/licenses/by/4.0/">CC Attribution 4.0 International License</a>.
### Supported Tasks and Leaderboards
POS tagging
### Languages
The dataset is in Catalan (`ca-ES`)
## Dataset Structure
### Data Instances
Three conllu files.
Annotations are encoded in plain text files (UTF-8, normalized to NFC, using only the LF character as line break, including an LF character at the end of file) with three types of lines:
1) Word lines containing the annotation of a word/token in 10 fields separated by single tab characters (see below).
2) Blank lines marking sentence boundaries.
3) Comment lines starting with hash (#).
### Data Fields
Word lines contain the following fields:
1) ID: Word index, integer starting at 1 for each new sentence; may be a range for multiword tokens; may be a decimal number for empty nodes (decimal numbers can be lower than 1 but must be greater than 0).
2) FORM: Word form or punctuation symbol.
3) LEMMA: Lemma or stem of word form.
4) UPOS: Universal part-of-speech tag.
5) XPOS: Language-specific part-of-speech tag; underscore if not available.
6) FEATS: List of morphological features from the universal feature inventory or from a defined language-specific extension; underscore if not available.
7) HEAD: Head of the current word, which is either a value of ID or zero (0).
8) DEPREL: Universal dependency relation to the HEAD (root iff HEAD = 0) or a defined language-specific subtype of one.
9) DEPS: Enhanced dependency graph in the form of a list of head-deprel pairs.
10) MISC: Any other annotation.
From: [https://universaldependencies.org](https://universaldependencies.org/guidelines.html)
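A minimal sketch of reading word lines into those ten fields (assuming tab-separated columns as described; comment and blank lines are handled as listed above):

```python
CONLLU_FIELDS = [
    "ID", "FORM", "LEMMA", "UPOS", "XPOS",
    "FEATS", "HEAD", "DEPREL", "DEPS", "MISC",
]

def parse_conllu(text):
    """Yield sentences, each a list of {field: value} dicts.

    Comment lines start with '#'; blank lines end a sentence.
    """
    sentence = []
    for line in text.splitlines():
        if line.startswith("#"):
            continue
        if not line.strip():
            if sentence:
                yield sentence
                sentence = []
            continue
        # Word lines carry exactly ten tab-separated fields.
        sentence.append(dict(zip(CONLLU_FIELDS, line.split("\t"))))
    if sentence:
        yield sentence
```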
### Data Splits
- ca_ancora-ud-train.conllu
- ca_ancora-ud-dev.conllu
- ca_ancora-ud-test.conllu
## Dataset Creation
### Curation Rationale
[N/A]
### Source Data
- [UD_Catalan-AnCora](https://github.com/UniversalDependencies/UD_Catalan-AnCora)
#### Initial Data Collection and Normalization
The original annotation was done in a constituency framework as a part of the [AnCora project](http://clic.ub.edu/corpus/) at the University of Barcelona. It was converted to dependencies by the [Universal Dependencies team](https://universaldependencies.org/) and used in the CoNLL 2009 shared task. The CoNLL 2009 version was later converted to HamleDT and to Universal Dependencies.
For more information on the AnCora project, visit the [AnCora site](http://clic.ub.edu/corpus/).
To learn about Universal Dependencies, visit the webpage [https://universaldependencies.org](https://universaldependencies.org).
#### Who are the source language producers?
For more information on the AnCora corpus and its sources, visit the [AnCora site](http://clic.ub.edu/corpus/).
### Annotations
#### Annotation process
For more information on the first AnCora annotation, visit the [AnCora site](http://clic.ub.edu/corpus/).
#### Who are the annotators?
For more information on the AnCora annotation team, visit the [AnCora site](http://clic.ub.edu/corpus/).
### Personal and Sensitive Information
No personal or sensitive information included.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset contributes to the development of language models in Catalan, a low-resource language.
### Discussion of Biases
[N/A]
### Other Known Limitations
[N/A]
## Additional Information
### Dataset Curators
### Licensing Information
This work is licensed under a <a rel="license" href="https://creativecommons.org/licenses/by/4.0/">CC Attribution 4.0 International License</a>.
### Citation Information
The following paper must be cited when using this corpus:
Taulé, M., M.A. Martí, M. Recasens (2008) 'Ancora: Multilevel Annotated Corpora for Catalan and Spanish', Proceedings of 6th International Conference on Language Resources and Evaluation. Marrakesh (Morocco).
To cite the Universal Dependencies project:
Rueter, J. (Creator), Erina, O. (Contributor), Klementeva, J. (Contributor), Ryabov, I. (Contributor), Tyers, F. M. (Contributor), Zeman, D. (Contributor), Nivre, J. (Creator) (15 Nov 2020). Universal Dependencies version 2.7 Erzya JR. Universal Dependencies Consortium.
|
louisbrulenaudet/code-route | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Code de la route
source_datasets:
- original
pretty_name: Code de la route
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Code de la route, non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging

import datasets
from tqdm.notebook import tqdm
def dataset_loader(
name:str,
streaming:bool=True
) -> datasets.Dataset:
"""
Helper function to load a single dataset in parallel.
Parameters
----------
name : str
Name of the dataset to be loaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
dataset : datasets.Dataset
Loaded dataset object.
Raises
------
Exception
If an error occurs during dataset loading.
"""
try:
return datasets.load_dataset(
name,
split="train",
streaming=streaming
)
except Exception as exc:
logging.error(f"Error loading dataset {name}: {exc}")
return None
def load_datasets(
req:list,
streaming:bool=True
) -> list:
"""
Downloads datasets specified in a list and creates a list of loaded datasets.
Parameters
----------
req : list
A list containing the names of datasets to be downloaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
datasets_list : list
A list containing loaded datasets as per the requested names provided in 'req'.
Raises
------
Exception
If an error occurs during dataset loading or processing.
Examples
--------
>>> datasets = load_datasets(["dataset1", "dataset2"], streaming=False)
"""
datasets_list = []
with concurrent.futures.ThreadPoolExecutor() as executor:
future_to_dataset = {executor.submit(dataset_loader, name): name for name in req}
for future in tqdm(concurrent.futures.as_completed(future_to_dataset), total=len(req)):
name = future_to_dataset[future]
try:
dataset = future.result()
if dataset:
datasets_list.append(dataset)
except Exception as exc:
logging.error(f"Error processing dataset {name}: {exc}")
return datasets_list
req = [
"louisbrulenaudet/code-artisanat",
"louisbrulenaudet/code-action-sociale-familles",
# ...
]
datasets_list = load_datasets(
req=req,
streaming=True
)
dataset = datasets.concatenate_datasets(
datasets_list
)
```
## Dataset generation
This JSON file is a list of dictionaries, each dictionary contains the following fields:
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
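As a rough sketch of how such records could be assembled from an article and the instruction pool above (the article text, id, and dates below are placeholders, not real data):

```python
import random

# A small excerpt of the instruction pool shown above.
instructions = [
    "Compose l'intégralité de l'article sous forme écrite.",
    "Quel est le texte intégral de l'article ?",
]

def build_record(article_text, num, start, expiration="2999-01-01"):
    """Assemble one record in the schema described above."""
    return {
        "instruction": random.choice(instructions),
        "input": "",
        "output": article_text,
        "start": start,
        "expiration": expiration,
        "num": num,
    }

record = build_record(
    article_text="Texte de l'article...",  # placeholder
    num="R110-1",
    start="2024-04-15",
)
```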
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
irds/mr-tydi_sw_test | ---
pretty_name: '`mr-tydi/sw/test`'
viewer: false
source_datasets: ['irds/mr-tydi_sw']
task_categories:
- text-retrieval
---
# Dataset Card for `mr-tydi/sw/test`
The `mr-tydi/sw/test` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mr-tydi#mr-tydi/sw/test).
# Data
This dataset provides:
- `queries` (i.e., topics); count=670
- `qrels` (relevance assessments); count=743
- For `docs`, use [`irds/mr-tydi_sw`](https://huggingface.co/datasets/irds/mr-tydi_sw)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mr-tydi_sw_test', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mr-tydi_sw_test', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
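For example, once loaded, the qrels records can be grouped into a per-query relevance map (a sketch over the record shape shown above; the sample records here are illustrative, not taken from the dataset):

```python
from collections import defaultdict

def qrels_to_map(qrels):
    """Group qrels records into {query_id: {doc_id: relevance}}."""
    rel = defaultdict(dict)
    for record in qrels:
        rel[record["query_id"]][record["doc_id"]] = record["relevance"]
    return dict(rel)

# Illustrative records with the fields shown above.
sample = [
    {"query_id": "q1", "doc_id": "d1", "relevance": 1, "iteration": "0"},
    {"query_id": "q1", "doc_id": "d2", "relevance": 0, "iteration": "0"},
]
rel_map = qrels_to_map(sample)
```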
## Citation Information
```
@article{Zhang2021MrTyDi,
title={{Mr. TyDi}: A Multi-lingual Benchmark for Dense Retrieval},
author={Xinyu Zhang and Xueguang Ma and Peng Shi and Jimmy Lin},
year={2021},
journal={arXiv:2108.08787},
}
@article{Clark2020TyDiQa,
title={{TyDi QA}: A Benchmark for Information-Seeking Question Answering in Typologically Diverse Languages},
author={Jonathan H. Clark and Eunsol Choi and Michael Collins and Dan Garrette and Tom Kwiatkowski and Vitaly Nikolaev and Jennimaria Palomaki},
year={2020},
journal={Transactions of the Association for Computational Linguistics}
}
```
|
dgonier/debate-llm | ---
license: apache-2.0
---
|
tyzhu/fw_squad_num_train_100_eval_100 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: train_doc2id
path: data/train_doc2id-*
- split: train_id2doc
path: data/train_id2doc-*
- split: train_find_word
path: data/train_find_word-*
- split: eval_find_word
path: data/eval_find_word-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 45313
num_examples: 300
- name: train_doc2id
num_bytes: 34047
num_examples: 200
- name: train_id2doc
num_bytes: 34647
num_examples: 200
- name: train_find_word
num_bytes: 10666
num_examples: 100
- name: eval_find_word
num_bytes: 10404
num_examples: 100
download_size: 81252
dataset_size: 135077
---
# Dataset Card for "fw_squad_num_train_100_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-high_school_mathematics-verbal-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 98682
num_examples: 270
download_size: 58437
dataset_size: 98682
---
# Dataset Card for "mmlu-high_school_mathematics-verbal-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bastiankase/dianakreuz | ---
license: openrail
---
|
Mooztoba/dataset | ---
license: other
---
|
datasets-examples/doc-image-7 | ---
size_categories:
- n<1K
---
# [doc] image dataset 7
This dataset contains 2 jpeg files in the `red` directory and 2 jpeg files in the `green` directory. |
waybarrios/github-code-dataset | ---
dataset_info:
features:
- name: path
dtype: string
- name: content
dtype: string
- name: size
dtype: int64
- name: max_lines
dtype: int64
- name: repo_name
dtype: string
- name: autogenerated
dtype: bool
splits:
- name: train
num_bytes: 152825770
num_examples: 18912
download_size: 57591057
dataset_size: 152825770
---
# Dataset Card for "github-code-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
comodoro/pscr | ---
license: cc-by-nc-3.0
---
|
maximedb/multilingual_librispeech_fr | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: text
dtype: string
- name: speaker_id
dtype: int64
- name: chapter_id
dtype: int64
- name: id
dtype: string
splits:
- name: train
num_bytes: 62912882604.152
num_examples: 258213
- name: train.9h
num_bytes: 532581041.633
num_examples: 2167
- name: train.1h
num_bytes: 60218210.0
num_examples: 241
- name: validation
num_bytes: 620676040.84
num_examples: 2416
- name: test
num_bytes: 620016068.552
num_examples: 2426
download_size: 65546652537
dataset_size: 64746373965.177
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: train.9h
path: data/train.9h-*
- split: train.1h
path: data/train.1h-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
jan-hq/nitro_binarized | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 1018426.9979852249
num_examples: 2680
- name: test
num_bytes: 113243.00201477502
num_examples: 298
download_size: 505174
dataset_size: 1131670.0
---
# Dataset Card for "nitro_binarized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
eperim/ft_to_eval | ---
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 218581
num_examples: 200
download_size: 136290
dataset_size: 218581
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|