| datasetId | card |
|---|---|
1aurent/commitpackmeta-gitmoji | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': ©
'1': ®
'2': โผ
'3': โ
'4': โน
'5': โ
'6': โ
'7': โ
'8': โฉ
'9': โ
'10': โ
'11': โจ
'12': โฉ
'13': โช
'14': โซ
'15': โญ
'16': โฎ
'17': โฐ
'18': โฑ
'19': โฒ
'20': โณ
'21': โธ
'22': โ
'23': โถ
'24': โ
'25': โ
'26': โ
'27': โ
'28': โ
'29': โ
'30': โ
'31': โ
'32': โ
'33': โ
'34': โ
'35': โ
'36': โข
'37': โฎ
'38': โฏ
'39': โน
'40': โบ
'41': โ
'42': โ
'43': โ
'44': โ
'45': โฃ
'46': โฅ
'47': โฆ
'48': โจ
'49': โป
'50': โฟ
'51': โ
'52': โ
'53': โ
'54': โ
'55': โ
'56': โ
'57': โ
'58': โ
'59': โ
'60': โก
'61': โช
'62': โซ
'63': โฐ
'64': โฝ
'65': โพ
'66': โ
'67': โ
'68': โ
'69': โ
'70': โ
'71': โ
'72': โฉ
'73': โช
'74': โฑ
'75': โฒ
'76': โณ
'77': โด
'78': โต
'79': โท
'80': โธ
'81': โน
'82': โบ
'83': โฝ
'84': โ
'85': โ
'86': โ
'87': โ
'88': โ
'89': โ
'90': โ
'91': โ
'92': โ
'93': โ
'94': โ
'95': โ
'96': โ
'97': โจ
'98': โณ
'99': โด
'100': โ
'101': โ
'102': โ
'103': โ
'104': โ
'105': โ
'106': โค
'107': โ
'108': โ
'109': โก
'110': โฐ
'111': โฟ
'112': โคด
'113': โคต
'114': โฌ
'115': โฌ
'116': โฌ
'117': โฌ
'118': โฌ
'119': โญ
'120': โญ
'121': ใฐ
'122': ใฝ
'123': ๐
'124': ๐ฐ
'125': ๐
'126': ๐
'127': ๐
'128': ๐
'129': ๐
'130': ๐
'131': ๐
'132': ๐
'133': ๐
'134': ๐
'135': ๐
'136': ๐
'137': ๐
'138': ๐
'139': ๐
'140': ๐
'141': ๐
'142': ๐
'143': ๐
'144': ๐
'145': ๐
'146': ๐
'147': ๐
'148': ๐
'149': ๐
'150': ๐
'151': ๐
'152': ๐
'153': ๐
'154': ๐
'155': ๐
'156': ๐
'157': ๐
'158': ๐
'159': ๐
'160': ๐
'161': ๐
'162': ๐ฅ
'163': ๐ฆ
'164': ๐ง
'165': ๐จ
'166': ๐ฉ
'167': ๐ช
'168': ๐ซ
'169': ๐ญ
'170': ๐ฎ
'171': ๐ฏ
'172': ๐ฐ
'173': ๐ฑ
'174': ๐ฒ
'175': ๐ณ
'176': ๐ด
'177': ๐ต
'178': ๐ถ
'179': ๐ท
'180': ๐ธ
'181': ๐น
'182': ๐บ
'183': ๐ป
'184': ๐ผ
'185': ๐ฝ
'186': ๐พ
'187': ๐ฟ
'188': ๐
'189': ๐
'190': ๐
'191': ๐
'192': ๐
'193': ๐
'194': ๐
'195': ๐
'196': ๐
'197': ๐
'198': ๐
'199': ๐
'200': ๐
'201': ๐
'202': ๐
'203': ๐
'204': ๐
'205': ๐
'206': ๐
'207': ๐
'208': ๐
'209': ๐
'210': ๐
'211': ๐
'212': ๐
'213': ๐
'214': ๐
'215': ๐
'216': ๐
'217': ๐
'218': ๐
'219': ๐
'220': ๐ก
'221': ๐ข
'222': ๐ฃ
'223': ๐ค
'224': ๐ฅ
'225': ๐ฆ
'226': ๐ง
'227': ๐จ
'228': ๐ฉ
'229': ๐ช
'230': ๐ซ
'231': ๐ฌ
'232': ๐ญ
'233': ๐ฎ
'234': ๐ฏ
'235': ๐ฐ
'236': ๐ฑ
'237': ๐ณ
'238': ๐ด
'239': ๐ต
'240': ๐ถ
'241': ๐ท
'242': ๐น
'243': ๐บ
'244': ๐ป
'245': ๐ผ
'246': ๐ฝ
'247': ๐พ
'248': ๐ฟ
'249': ๐
'250': ๐
'251': ๐
'252': ๐
'253': ๐
'254': ๐
'255': ๐
'256': ๐
'257': ๐
'258': ๐
'259': ๐
'260': ๐
'261': ๐
'262': ๐
'263': ๐
'264': ๐
'265': ๐
'266': ๐
'267': ๐
'268': ๐
'269': ๐
'270': ๐
'271': ๐
'272': ๐ก
'273': ๐ข
'274': ๐ฃ
'275': ๐ค
'276': ๐ฅ
'277': ๐ฆ
'278': ๐ง
'279': ๐จ
'280': ๐ฉ
'281': ๐ช
'282': ๐ซ
'283': ๐ฌ
'284': ๐ญ
'285': ๐ฎ
'286': ๐ฏ
'287': ๐ฐ
'288': ๐ฑ
'289': ๐ฒ
'290': ๐ณ
'291': ๐ด
'292': ๐ต
'293': ๐ถ
'294': ๐ท
'295': ๐ธ
'296': ๐น
'297': ๐บ
'298': ๐ป
'299': ๐ผ
'300': ๐ฝ
'301': ๐พ
'302': ๐ฟ
'303': ๐
'304': ๐
'305': ๐
'306': ๐
'307': ๐
'308': ๐
'309': ๐
'310': ๐
'311': ๐
'312': ๐
'313': ๐
'314': ๐
'315': ๐
'316': ๐
'317': ๐
'318': ๐
'319': ๐
'320': ๐
'321': ๐
'322': ๐
'323': ๐
'324': ๐
'325': ๐
'326': ๐
'327': ๐
'328': ๐
'329': ๐ก
'330': ๐ข
'331': ๐ฃ
'332': ๐ค
'333': ๐ฅ
'334': ๐ฆ
'335': ๐จ
'336': ๐ฉ
'337': ๐ช
'338': ๐ซ
'339': ๐ฌ
'340': ๐ญ
'341': ๐ฎ
'342': ๐ฏ
'343': ๐ฐ
'344': ๐ณ
'345': ๐ด
'346': ๐ต
'347': ๐ท
'348': ๐น
'349': ๐ผ
'350': ๐
'351': ๐
'352': ๐
'353': ๐
'354': ๐
'355': ๐
'356': ๐
'357': ๐
'358': ๐
'359': ๐
'360': ๐
'361': ๐
'362': ๐
'363': ๐
'364': ๐
'365': ๐
'366': ๐
'367': ๐
'368': ๐
'369': ๐
'370': ๐
'371': ๐
'372': ๐
'373': ๐
'374': ๐
'375': ๐
'376': ๐
'377': ๐
'378': ๐
'379': ๐
'380': ๐
'381': ๐
'382': ๐
'383': ๐ก
'384': ๐ข
'385': ๐ฃ
'386': ๐ค
'387': ๐ฅ
'388': ๐ฆ
'389': ๐ง
'390': ๐จ
'391': ๐ฉ
'392': ๐ช
'393': ๐ซ
'394': ๐ฌ
'395': ๐ญ
'396': ๐ฎ
'397': ๐ฏ
'398': ๐ฐ
'399': ๐ฑ
'400': ๐ฒ
'401': ๐ณ
'402': ๐ด
'403': ๐ต
'404': ๐ถ
'405': ๐ท
'406': ๐ธ
'407': ๐น
'408': ๐บ
'409': ๐ป
'410': ๐ผ
'411': ๐ฝ
'412': ๐พ
'413': ๐ฟ
'414': ๐
'415': ๐
'416': ๐
'417': ๐
'418': ๐
'419': ๐
'420': ๐
'421': ๐
'422': ๐
'423': ๐
'424': ๐
'425': ๐
'426': ๐
'427': ๐
'428': ๐
'429': ๐
'430': ๐
'431': ๐
'432': ๐
'433': ๐
'434': ๐
'435': ๐
'436': ๐
'437': ๐
'438': ๐
'439': ๐
'440': ๐
'441': ๐
'442': ๐
'443': ๐
'444': ๐
'445': ๐ข
'446': ๐ฃ
'447': ๐ค
'448': ๐ฅ
'449': ๐ฆ
'450': ๐ง
'451': ๐จ
'452': ๐ฉ
'453': ๐ช
'454': ๐ซ
'455': ๐ฌ
'456': ๐ญ
'457': ๐ฎ
'458': ๐ฏ
'459': ๐ฐ
'460': ๐ฑ
'461': ๐ณ
'462': ๐ด
'463': ๐ต
'464': ๐ถ
'465': ๐ท
'466': ๐ธ
'467': ๐น
'468': ๐ป
'469': ๐ฝ
'470': ๐พ
'471': ๐
'472': ๐
'473': ๐
'474': ๐
'475': ๐
'476': ๐
'477': ๐
'478': ๐
'479': ๐
'480': ๐
'481': ๐
'482': ๐
'483': ๐
'484': ๐
'485': ๐
'486': ๐
'487': ๐
'488': ๐
'489': ๐
'490': ๐
'491': ๐
'492': ๐
'493': ๐
'494': ๐
'495': ๐
'496': ๐
'497': ๐
'498': ๐
'499': ๐
'500': ๐
'501': ๐
'502': ๐ก
'503': ๐ข
'504': ๐ฃ
'505': ๐ฅ
'506': ๐ฆ
'507': ๐ง
'508': ๐จ
'509': ๐ฉ
'510': ๐ช
'511': ๐ซ
'512': ๐ฌ
'513': ๐ญ
'514': ๐ฎ
'515': ๐ฏ
'516': ๐ฐ
'517': ๐ฑ
'518': ๐ฒ
'519': ๐ณ
'520': ๐ด
'521': ๐ต
'522': ๐ถ
'523': ๐ท
'524': ๐ธ
'525': ๐น
'526': ๐บ
'527': ๐ป
'528': ๐ผ
'529': ๐ฝ
'530': ๐พ
'531': ๐ฟ
'532': ๐
'533': ๐
'534': ๐
'535': ๐
'536': ๐
'537': ๐
'538': ๐
'539': ๐
'540': ๐
'541': ๐
'542': ๐
'543': ๐
'544': ๐
'545': ๐
'546': ๐
'547': ๐
'548': ๐
'549': ๐
'550': ๐
'551': ๐
'552': ๐
'553': ๐
'554': ๐
'555': ๐
'556': ๐
'557': ๐
'558': ๐
'559': ๐
'560': ๐
'561': ๐
'562': ๐
'563': ๐
'564': ๐
'565': ๐ก
'566': ๐ข
'567': ๐ฃ
'568': ๐ค
'569': ๐ฅ
'570': ๐ฆ
'571': ๐ง
'572': ๐จ
'573': ๐ฉ
'574': ๐ซ
'575': ๐ฌ
'576': ๐ญ
'577': ๐ฎ
'578': ๐ฏ
'579': ๐ฐ
'580': ๐ฑ
'581': ๐ฒ
'582': ๐ณ
'583': ๐ด
'584': ๐ต
'585': ๐ถ
'586': ๐ท
'587': ๐ธ
'588': ๐น
'589': ๐บ
'590': ๐ป
'591': ๐ผ
'592': ๐ฝ
'593': ๐ฟ
'594': ๐
'595': ๐
'596': ๐
'597': ๐
'598': ๐
'599': ๐
'600': ๐
'601': ๐
'602': ๐
'603': ๐
'604': ๐
'605': ๐
'606': ๐
'607': ๐
'608': ๐
'609': ๐
'610': ๐
'611': ๐
'612': ๐
'613': ๐
'614': ๐
'615': ๐
'616': ๐
'617': ๐
'618': ๐
'619': ๐
'620': ๐
'621': ๐ก
'622': ๐ข
'623': ๐ค
'624': ๐ฅ
'625': ๐ฆ
'626': ๐ง
'627': ๐จ
'628': ๐ฉ
'629': ๐ช
'630': ๐ซ
'631': ๐ฌ
'632': ๐ญ
'633': ๐ฎ
'634': ๐ฐ
'635': ๐ฒ
'636': ๐ณ
'637': ๐ด
'638': ๐ต
'639': ๐ถ
'640': ๐ท
'641': ๐ธ
'642': ๐น
'643': ๐บ
'644': ๐ผ
'645': ๐
'646': ๐
'647': ๐
'648': ๐
'649': ๐
'650': ๐ค
'651': ๐ฏ
'652': ๐ฐ
'653': ๐ณ
'654': ๐ด
'655': ๐ต
'656': ๐ถ
'657': ๐ท
'658': ๐ธ
'659': ๐น
'660': ๐บ
'661': ๐
'662': ๐
'663': ๐
'664': ๐
'665': ๐
'666': ๐
'667': ๐
'668': ๐
'669': ๐ค
'670': ๐ฅ
'671': ๐จ
'672': ๐ฑ
'673': ๐ผ
'674': ๐
'675': ๐
'676': ๐
'677': ๐
'678': ๐
'679': ๐
'680': ๐
'681': ๐
'682': ๐
'683': ๐ก
'684': ๐ฃ
'685': ๐ฏ
'686': ๐บ
'687': ๐ป
'688': ๐ผ
'689': ๐ฝ
'690': ๐พ
'691': ๐ฟ
'692': ๐
'693': ๐
'694': ๐
'695': ๐
'696': ๐
'697': ๐
'698': ๐
'699': ๐
'700': ๐
'701': ๐
'702': ๐
'703': ๐
'704': ๐
'705': ๐
'706': ๐
'707': ๐
'708': ๐
'709': ๐
'710': ๐
'711': ๐
'712': ๐
'713': ๐
'714': ๐
'715': ๐
'716': ๐
'717': ๐
'718': ๐
'719': ๐
'720': ๐
'721': ๐
'722': ๐
'723': ๐
'724': ๐
'725': ๐ก
'726': ๐ข
'727': ๐ฃ
'728': ๐ค
'729': ๐ฅ
'730': ๐ฆ
'731': ๐ง
'732': ๐จ
'733': ๐ฉ
'734': ๐ช
'735': ๐ซ
'736': ๐ฌ
'737': ๐ญ
'738': ๐ฎ
'739': ๐ฏ
'740': ๐ฐ
'741': ๐ฑ
'742': ๐ฒ
'743': ๐ณ
'744': ๐ด
'745': ๐ต
'746': ๐ถ
'747': ๐ท
'748': ๐ธ
'749': ๐น
'750': ๐บ
'751': ๐ป
'752': ๐ผ
'753': ๐ฝ
'754': ๐พ
'755': ๐ฟ
'756': ๐
'757': ๐
'758': ๐
'759': ๐
'760': ๐
'761': ๐
'762': ๐
'763': ๐
'764': ๐
'765': ๐
'766': ๐
'767': ๐
'768': ๐
'769': ๐
'770': ๐
'771': ๐
'772': ๐
'773': ๐
'774': ๐
'775': ๐
'776': ๐
'777': ๐
'778': ๐
'779': ๐
'780': ๐
'781': ๐
'782': ๐
'783': ๐
'784': ๐
'785': ๐
'786': ๐
'787': ๐
'788': ๐
'789': ๐
'790': ๐
'791': ๐
'792': ๐
'793': ๐
'794': ๐
'795': ๐
'796': ๐
'797': ๐
'798': ๐
'799': ๐
'800': ๐
'801': ๐
'802': ๐
'803': ๐
'804': ๐
'805': ๐ก
'806': ๐ข
'807': ๐ฃ
'808': ๐ค
'809': ๐ฅ
'810': ๐ฆ
'811': ๐ง
'812': ๐จ
'813': ๐ฉ
'814': ๐ช
'815': ๐ซ
'816': ๐ญ
'817': ๐ฎ
'818': ๐ฐ
'819': ๐ฑ
'820': ๐ฒ
'821': ๐ณ
'822': ๐ด
'823': ๐ต
'824': ๐ถ
'825': ๐ท
'826': ๐ธ
'827': ๐ป
'828': ๐ผ
'829': ๐ฝ
'830': ๐ฟ
'831': ๐
'832': ๐
'833': ๐
'834': ๐
'835': ๐
'836': ๐
'837': ๐
'838': ๐
'839': ๐
'840': ๐
'841': ๐ก
'842': ๐ข
'843': ๐ฃ
'844': ๐ฅ
'845': ๐ฉ
'846': ๐ซ
'847': ๐ฌ
'848': ๐ฐ
'849': ๐ณ
'850': ๐ด
'851': ๐ต
'852': ๐ถ
'853': ๐ท
'854': ๐น
'855': ๐ข
'856': ๐ฅ
'857': ๐ง
'858': ๐ค
'859': ๐ค
'860': ๐ค
'861': ๐ค
'862': ๐ค
'863': ๐ค
'864': ๐ค
'865': ๐ค
'866': ๐ค
'867': ๐ค
'868': ๐ค
'869': ๐ค
'870': ๐ค
'871': ๐คก
'872': ๐คข
'873': ๐คฃ
'874': ๐คฆ
'875': ๐คง
'876': ๐คฉ
'877': ๐คช
'878': ๐คซ
'879': ๐คฌ
'880': ๐คญ
'881': ๐คฎ
'882': ๐คฏ
'883': ๐คณ
'884': ๐คต
'885': ๐คท
'886': ๐คธ
'887': ๐คน
'888': ๐คบ
'889': ๐คผ
'890': ๐คพ
'891': ๐ฅ
'892': ๐ฅ
'893': ๐ฅ
'894': ๐ฅ
'895': ๐ฅ
'896': ๐ฅ
'897': ๐ฅ
'898': ๐ฅ
'899': ๐ฅ
'900': ๐ฅ
'901': ๐ฅ
'902': ๐ฅ
'903': ๐ฅ
'904': ๐ฅ
'905': ๐ฅค
'906': ๐ฅฅ
'907': ๐ฅจ
'908': ๐ฅฉ
'909': ๐ฅญ
'910': ๐ฅฏ
'911': ๐ฅณ
'912': ๐ฅด
'913': ๐ฅต
'914': ๐ฅถ
'915': ๐ฅธ
'916': ๐ฅน
'917': ๐ฅบ
'918': ๐ฅฝ
'919': ๐ฅพ
'920': ๐ฆ
'921': ๐ฆ
'922': ๐ฆ
'923': ๐ฆ
'924': ๐ฆ
'925': ๐ฆ
'926': ๐ฆ
'927': ๐ฆ
'928': ๐ฆ
'929': ๐ฆ
'930': ๐ฆ
'931': ๐ฆ
'932': ๐ฆ
'933': ๐ฆ
'934': ๐ฆ
'935': ๐ฆ
'936': ๐ฆ
'937': ๐ฆ
'938': ๐ฆ
'939': ๐ฆ
'940': ๐ฆ
'941': ๐ฆญ
'942': ๐ฆบ
'943': ๐ฆฝ
'944': ๐ง
'945': ๐ง
'946': ๐ง
'947': ๐ง
'948': ๐ง
'949': ๐ง
'950': ๐ง
'951': ๐ง
'952': ๐ง
'953': ๐ง
'954': ๐ง
'955': ๐ง
'956': ๐ง
'957': ๐ง
'958': ๐ง
'959': ๐ง
'960': ๐ง
'961': ๐ง
'962': ๐ง
'963': ๐ง
'964': ๐งข
'965': ๐งค
'966': ๐งฆ
'967': ๐งง
'968': ๐งช
'969': ๐งซ
'970': ๐งฏ
'971': ๐งฐ
'972': ๐งน
'973': ๐งผ
'974': ๐งฝ
'975': ๐งพ
'976': ๐ฉณ
'977': ๐ฉน
'978': ๐ช
'979': ๐ช
'980': ๐ช
'981': ๐ช
'982': ๐ชฃ
'983': ๐ชฆ
'984': ๐ชฒ
splits:
- name: train
num_bytes: 3998825
num_examples: 89894
download_size: 2678056
dataset_size: 3998825
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "commitpackmeta-gitmoji"
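The `class_label` block in the metadata above maps integer label ids to gitmoji characters (most of which are rendered as mojibake in this dump). A minimal pure-Python sketch of that id-to-name mapping, using three placeholder gitmoji that are assumptions rather than entries from the garbled list, mirrors what `datasets.ClassLabel.int2str`/`str2int` provide:

```python
# Sketch of the id <-> name mapping a `class_label` feature encodes.
# The names below are illustrative placeholders; the card's real list
# is ~984 gitmoji characters, garbled by an encoding error in this dump.
names = ["©", "®", "‼"]

int2str = dict(enumerate(names))              # label id -> emoji name
str2int = {name: i for i, name in enumerate(names)}  # emoji name -> label id

assert int2str[1] == "®"
assert str2int["‼"] == 2
```

In practice the `datasets` library builds this mapping for you: loading the dataset yields integer `label` values, and the feature object converts them back to names.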
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
benayas/banking_augmented_20pct_v1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1022684
num_examples: 10003
download_size: 426228
dataset_size: 1022684
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AnoGame/zundamon | ---
license: mit
---
|
CyberHarem/koga_tomoe_seishunbutayarou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Koga Tomoe
This is the dataset of Koga Tomoe, containing 200 images and their tags.
Images are crawled from many sites (e.g., danbooru, pixiv, zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 439 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 439 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 439 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 439 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
Gabriel1322/tialucy | ---
license: openrail
---
|
totally-not-an-llm/ZorgonChat | ---
license: mit
---
|
claudios/ReVeal | ---
arxiv: 2009.07235
dataset_info:
features:
- name: hash
dtype: int64
- name: project
dtype: string
- name: size
dtype: int64
- name: label
dtype: int64
- name: functionSource
dtype: string
splits:
- name: train
num_bytes: 25678896
num_examples: 18187
- name: validation
num_bytes: 2982883
num_examples: 2273
- name: test
num_bytes: 3489257
num_examples: 2274
download_size: 12036614
dataset_size: 32151036
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
task_categories:
- text-classification
tags:
- code
---
This is an unofficial Hugging Face version of the "ReVeal" dataset from "[Deep Learning based Vulnerability Detection: Are We There Yet?](https://arxiv.org/abs/2009.07235)". |
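As a quick sanity check on the ReVeal split metadata above, the per-split `num_bytes` should sum to the declared `dataset_size` (32,151,036 bytes). A small sketch using only the numbers from the card:

```python
# Split metadata copied from the ReVeal card above.
splits = {
    "train":      {"num_bytes": 25678896, "num_examples": 18187},
    "validation": {"num_bytes": 2982883,  "num_examples": 2273},
    "test":       {"num_bytes": 3489257,  "num_examples": 2274},
}

total_bytes = sum(s["num_bytes"] for s in splits.values())
total_examples = sum(s["num_examples"] for s in splits.values())

assert total_bytes == 32151036  # matches dataset_size in the card
print(total_examples)           # 22734 functions across all splits
```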
open-llm-leaderboard/details_Gryphe__MythoMist-7b | ---
pretty_name: Evaluation run of Gryphe/MythoMist-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Gryphe/MythoMist-7b](https://huggingface.co/Gryphe/MythoMist-7b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gryphe__MythoMist-7b_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-23T18:33:43.562121](https://huggingface.co/datasets/open-llm-leaderboard/details_Gryphe__MythoMist-7b_public/blob/main/results_2023-11-23T18-33-43.562121.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6193933424443757,\n\
\ \"acc_stderr\": 0.03253049842540975,\n \"acc_norm\": 0.6273757812096611,\n\
\ \"acc_norm_stderr\": 0.03322659183767027,\n \"mc1\": 0.43818849449204406,\n\
\ \"mc1_stderr\": 0.017369236164404445,\n \"mc2\": 0.5997836138576584,\n\
\ \"mc2_stderr\": 0.015379030818687125,\n \"em\": 0.22902684563758388,\n\
\ \"em_stderr\": 0.0043033084382756255,\n \"f1\": 0.37819945469799016,\n\
\ \"f1_stderr\": 0.004228456430289263\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6348122866894198,\n \"acc_stderr\": 0.014070265519268802,\n\
\ \"acc_norm\": 0.658703071672355,\n \"acc_norm_stderr\": 0.013855831287497728\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6441943835889266,\n\
\ \"acc_stderr\": 0.0047777825848177875,\n \"acc_norm\": 0.8354909380601474,\n\
\ \"acc_norm_stderr\": 0.003699791934754364\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.03692820767264866,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.03692820767264866\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467381,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467381\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36772486772486773,\n \"acc_stderr\": 0.02483383982556242,\n \"\
acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.02483383982556242\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n\
\ \"acc_stderr\": 0.024137632429337717,\n \"acc_norm\": 0.7645161290322581,\n\
\ \"acc_norm_stderr\": 0.024137632429337717\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.03008862949021749,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.03008862949021749\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033446,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033446\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878937,\n\
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878937\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135363,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135363\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"\
acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565438,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565438\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676177,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676177\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094632,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094632\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\
\ \"acc_stderr\": 0.01396439376989914,\n \"acc_norm\": 0.8122605363984674,\n\
\ \"acc_norm_stderr\": 0.01396439376989914\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.025070713719153186,\n\
\ \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.025070713719153186\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39664804469273746,\n\
\ \"acc_stderr\": 0.01636135476982247,\n \"acc_norm\": 0.39664804469273746,\n\
\ \"acc_norm_stderr\": 0.01636135476982247\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046626,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046626\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294674,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294674\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.02960991207559411,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.02960991207559411\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4445893089960887,\n\
\ \"acc_stderr\": 0.012691575792657115,\n \"acc_norm\": 0.4445893089960887,\n\
\ \"acc_norm_stderr\": 0.012691575792657115\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.028888193103988633,\n\
\ \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.028888193103988633\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6421568627450981,\n \"acc_stderr\": 0.01939305840235544,\n \
\ \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.01939305840235544\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43818849449204406,\n\
\ \"mc1_stderr\": 0.017369236164404445,\n \"mc2\": 0.5997836138576584,\n\
\ \"mc2_stderr\": 0.015379030818687125\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.22902684563758388,\n \
\ \"em_stderr\": 0.0043033084382756255,\n \"f1\": 0.37819945469799016,\n\
\ \"f1_stderr\": 0.004228456430289263\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.20242608036391205,\n \"acc_stderr\": 0.011067792285006492\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Gryphe/MythoMist-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|arc:challenge|25_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|drop|3_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|gsm8k|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hellaswag|10_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T18-33-43.562121.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T18-33-43.562121.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- '**/details_harness|winogrande|5_2023-11-23T18-33-43.562121.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-23T18-33-43.562121.parquet'
- config_name: results
data_files:
- split: 2023_11_23T18_33_43.562121
path:
- results_2023-11-23T18-33-43.562121.parquet
- split: latest
path:
- results_2023-11-23T18-33-43.562121.parquet
---
# Dataset Card for Evaluation run of Gryphe/MythoMist-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Gryphe/MythoMist-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Gryphe/MythoMist-7b](https://huggingface.co/Gryphe/MythoMist-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Gryphe__MythoMist-7b_public",
"harness_winogrande_5",
	split="latest")
```
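Since every per-task configuration follows the same naming scheme, the config name for any MMLU subtask can be derived programmatically. A minimal sketch (the helper name is illustrative, not part of this dataset):

```python
def details_config_name(task: str, n_shot: int = 5) -> str:
    """Derive the details config name for an MMLU subtask, following the
    `harness_hendrycksTest_<task>_<n_shot>` scheme listed in this card
    (illustrative helper, not provided by the dataset)."""
    return f"harness_hendrycksTest_{task}_{n_shot}"

# The derived name can then be passed to datasets.load_dataset, e.g.:
#   load_dataset("open-llm-leaderboard/details_Gryphe__MythoMist-7b_public",
#                details_config_name("anatomy"), split="latest")
print(details_config_name("anatomy"))  # harness_hendrycksTest_anatomy_5
```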
## Latest results
These are the [latest results from run 2023-11-23T18:33:43.562121](https://huggingface.co/datasets/open-llm-leaderboard/details_Gryphe__MythoMist-7b_public/blob/main/results_2023-11-23T18-33-43.562121.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6193933424443757,
"acc_stderr": 0.03253049842540975,
"acc_norm": 0.6273757812096611,
"acc_norm_stderr": 0.03322659183767027,
"mc1": 0.43818849449204406,
"mc1_stderr": 0.017369236164404445,
"mc2": 0.5997836138576584,
"mc2_stderr": 0.015379030818687125,
"em": 0.22902684563758388,
"em_stderr": 0.0043033084382756255,
"f1": 0.37819945469799016,
"f1_stderr": 0.004228456430289263
},
"harness|arc:challenge|25": {
"acc": 0.6348122866894198,
"acc_stderr": 0.014070265519268802,
"acc_norm": 0.658703071672355,
"acc_norm_stderr": 0.013855831287497728
},
"harness|hellaswag|10": {
"acc": 0.6441943835889266,
"acc_stderr": 0.0047777825848177875,
"acc_norm": 0.8354909380601474,
"acc_norm_stderr": 0.003699791934754364
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.03692820767264866,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.03692820767264866
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467381,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467381
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36772486772486773,
"acc_stderr": 0.02483383982556242,
"acc_norm": 0.36772486772486773,
"acc_norm_stderr": 0.02483383982556242
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.024137632429337717,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.024137632429337717
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.03008862949021749,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.03008862949021749
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033446,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033446
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878937,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878937
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135363,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135363
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565438,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565438
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676177,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676177
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094632,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094632
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.01396439376989914,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.01396439376989914
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.025070713719153186,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.025070713719153186
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39664804469273746,
"acc_stderr": 0.01636135476982247,
"acc_norm": 0.39664804469273746,
"acc_norm_stderr": 0.01636135476982247
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046626,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046626
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294674,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294674
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.02960991207559411,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.02960991207559411
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4445893089960887,
"acc_stderr": 0.012691575792657115,
"acc_norm": 0.4445893089960887,
"acc_norm_stderr": 0.012691575792657115
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6544117647058824,
"acc_stderr": 0.028888193103988633,
"acc_norm": 0.6544117647058824,
"acc_norm_stderr": 0.028888193103988633
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.01939305840235544,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.01939305840235544
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.43818849449204406,
"mc1_stderr": 0.017369236164404445,
"mc2": 0.5997836138576584,
"mc2_stderr": 0.015379030818687125
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
},
"harness|drop|3": {
"em": 0.22902684563758388,
"em_stderr": 0.0043033084382756255,
"f1": 0.37819945469799016,
"f1_stderr": 0.004228456430289263
},
"harness|gsm8k|5": {
"acc": 0.20242608036391205,
"acc_stderr": 0.011067792285006492
}
}
```
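The per-task scores above can be aggregated downstream, e.g. to recompute the MMLU ("hendrycksTest") average. A minimal sketch, assuming a results dict shaped like the JSON above; only an illustrative subset of keys is included here:

```python
# Illustrative subset of the results JSON above (values copied from two tasks).
results = {
    "harness|hendrycksTest-college_physics|5": {"acc": 0.38235294117647056},
    "harness|hendrycksTest-computer_security|5": {"acc": 0.74},
    "harness|gsm8k|5": {"acc": 0.20242608036391205},  # not part of MMLU
}

# Average accuracy over the MMLU ("hendrycksTest") tasks only.
mmlu_accs = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
mean_mmlu = sum(mmlu_accs) / len(mmlu_accs)
print(round(mean_mmlu, 4))  # 0.5612
```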
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
DBQ/Celine.Product.prices.Germany | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: Germany - Celine - Product-level price list
tags:
- webscraping
- ecommerce
- Celine
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: string
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 319426
num_examples: 655
download_size: 78524
dataset_size: 319426
---
# Celine web scraped data
## About the website
Celine operates within the **fashion industry** in the EMEA region, particularly in **Germany**. The industry is undergoing a digital transformation centred on **Ecommerce**, creating a competitive space for established and emerging fashion brands. In Germany, the number of customers shopping across online stores has grown significantly, reflecting a rising interest in online fashion shopping. This pattern presents fashion brands with a unique set of challenges and opportunities. The dataset includes **Ecommerce product-list page (PLP) data** for **Celine** in Germany, offering valuable insights into consumer behavior and preferences and helping the brand reach its target market more effectively.
## Link to **dataset**
[Germany - Celine - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Celine%20Product-prices%20Germany/r/rec14W2uH4yIsDx2F)
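Given the schema above (`full_price_eur`, `price_eur`, `flg_discount`), a downstream consumer might derive the effective discount per product. A minimal sketch; the helper name and the sample values are illustrative, not part of the dataset:

```python
def discount_pct(record):
    """Discount percentage implied by the full vs. actual EUR price."""
    full, actual = record["full_price_eur"], record["price_eur"]
    if not record["flg_discount"] or full <= 0:
        return 0.0
    return round(100.0 * (full - actual) / full, 1)

# Illustrative record following the schema above (values are made up).
sample = {"full_price_eur": 950.0, "price_eur": 760.0, "flg_discount": 1}
print(discount_pct(sample))  # 20.0
```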
|
boseong/Dataset.llamabs2 | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 20362
num_examples: 65
download_size: 11017
dataset_size: 20362
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Harelix/Prompt-Injection-Mixed-Techniques-2024 | ---
language:
- en
tags:
- jailbreak
- prompt injection
pretty_name: Prompt Injection Dataset 2024
size_categories:
- 1K<n<10K
license: apache-2.0
--- |
nielsr/datacomp-small-with-text-embeddings | ---
dataset_info:
features:
- name: uid
dtype: string
- name: url
dtype: string
- name: text
dtype: string
- name: original_width
dtype: int64
- name: original_height
dtype: int64
- name: clip_b32_similarity_score
dtype: float32
- name: clip_l14_similarity_score
dtype: float32
- name: face_bboxes
sequence:
sequence: float64
- name: sha256
dtype: string
- name: clip_l14_text_embedding
sequence: float64
splits:
- name: train
num_bytes: 82649389578
num_examples: 12800000
download_size: 23102063139
dataset_size: 82649389578
---
# Dataset Card for "datacomp-small-with-text-embeddings"
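Each row stores `clip_l14_text_embedding` as a plain float sequence, so a consumer comparing two texts can compute cosine similarity directly on those lists. A minimal sketch, using toy 3-d vectors in place of the real CLIP L/14 embeddings:

```python
import math

def cosine(u, v):
    """Cosine similarity between two embedding vectors given as plain lists."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy 3-d vectors standing in for the dataset's CLIP L/14 text embeddings.
print(round(cosine([1.0, 0.0, 0.0], [1.0, 1.0, 0.0]), 4))  # 0.7071
```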
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
carlosejimenez/seq2seq-rte | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
- name: orig_idx
dtype: int64
splits:
- name: train
num_bytes: 934454
num_examples: 2490
- name: validation
num_bytes: 100393
num_examples: 277
- name: test
num_bytes: 1070053
num_examples: 3000
download_size: 0
dataset_size: 2104900
---
# Dataset Card for "seq2seq-rte"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mrpc_finna_future | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 39214
num_examples: 143
- name: train
num_bytes: 78065
num_examples: 284
- name: validation
num_bytes: 11707
num_examples: 41
download_size: 95557
dataset_size: 128986
---
# Dataset Card for "MULTI_VALUE_mrpc_finna_future"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Janiele/leninhag | ---
license: openrail
---
|
d0rj/rlhf-reward-datasets-ru | ---
language_creators:
- translated
language:
- ru
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
pretty_name: HH for RLHF (ru)
source_datasets:
- yitingxie/rlhf-reward-datasets
license: mit
tags:
- human-feedback
- ChatGPT
- reward
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 151564655.0
num_examples: 76256
- name: test
num_bytes: 6093563.0
num_examples: 5103
download_size: 78860063
dataset_size: 157658218.0
---
# Dataset Card for "rlhf-reward-datasets-ru"
This is a translated version of the [yitingxie/rlhf-reward-datasets dataset](https://huggingface.co/datasets/yitingxie/rlhf-reward-datasets), rendered into Russian.
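Each record carries `prompt`, `chosen`, and `rejected` fields; a reward-model trainer typically turns every record into a (preferred, dispreferred) text pair. A minimal sketch, with an illustrative record rather than real data:

```python
def to_pair(record):
    """Turn one record into (preferred, dispreferred) full texts."""
    return (record["prompt"] + record["chosen"],
            record["prompt"] + record["rejected"])

# Illustrative record mirroring the schema above (not taken from the dataset).
record = {"prompt": "Вопрос: 2+2? ", "chosen": "4", "rejected": "5"}
preferred, dispreferred = to_pair(record)
```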
|
pietrolesci/pile-deduped-subset | ---
dataset_info:
features:
- name: input_ids
sequence: int64
- name: seq_idx
dtype: int64
splits:
- name: train
num_bytes: 234577200
num_examples: 14300
- name: validation
num_bytes: 32808000.0
num_examples: 2000
download_size: 58650299
dataset_size: 267385200.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard/details_shadowml__BeagSake-7B | ---
pretty_name: Evaluation run of shadowml/BeagSake-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [shadowml/BeagSake-7B](https://huggingface.co/shadowml/BeagSake-7B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shadowml__BeagSake-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T02:17:55.720311](https://huggingface.co/datasets/open-llm-leaderboard/details_shadowml__BeagSake-7B/blob/main/results_2024-02-02T02-17-55.720311.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6578211772363932,\n\
\ \"acc_stderr\": 0.031956725676144875,\n \"acc_norm\": 0.6574768410421444,\n\
\ \"acc_norm_stderr\": 0.0326189871206691,\n \"mc1\": 0.572827417380661,\n\
\ \"mc1_stderr\": 0.01731683441096392,\n \"mc2\": 0.7227123192569592,\n\
\ \"mc2_stderr\": 0.01451322669078661\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7005119453924915,\n \"acc_stderr\": 0.013385021637313572,\n\
\ \"acc_norm\": 0.7244027303754266,\n \"acc_norm_stderr\": 0.01305716965576184\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.704142601075483,\n\
\ \"acc_stderr\": 0.0045549440206204845,\n \"acc_norm\": 0.8838876717785302,\n\
\ \"acc_norm_stderr\": 0.0031970484760036424\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03309615177059004,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03309615177059004\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997695,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997695\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6820512820512821,\n \"acc_stderr\": 0.023610884308927865,\n\
\ \"acc_norm\": 0.6820512820512821,\n \"acc_norm_stderr\": 0.023610884308927865\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n\
\ \"acc_stderr\": 0.02552472232455335,\n \"acc_norm\": 0.8431372549019608,\n\
\ \"acc_norm_stderr\": 0.02552472232455335\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n\
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608303,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608303\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044283,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044283\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4223463687150838,\n\
\ \"acc_stderr\": 0.016519594275297117,\n \"acc_norm\": 0.4223463687150838,\n\
\ \"acc_norm_stderr\": 0.016519594275297117\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"\
acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47522816166883963,\n\
\ \"acc_stderr\": 0.012754553719781753,\n \"acc_norm\": 0.47522816166883963,\n\
\ \"acc_norm_stderr\": 0.012754553719781753\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.684640522875817,\n \"acc_stderr\": 0.018798086284886887,\n \
\ \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.018798086284886887\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.572827417380661,\n\
\ \"mc1_stderr\": 0.01731683441096392,\n \"mc2\": 0.7227123192569592,\n\
\ \"mc2_stderr\": 0.01451322669078661\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8216258879242304,\n \"acc_stderr\": 0.010759352014855936\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7179681576952237,\n \
\ \"acc_stderr\": 0.0123949265843357\n }\n}\n```"
repo_url: https://huggingface.co/shadowml/BeagSake-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|arc:challenge|25_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|gsm8k|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hellaswag|10_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T02-17-55.720311.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T02-17-55.720311.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- '**/details_harness|winogrande|5_2024-02-02T02-17-55.720311.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T02-17-55.720311.parquet'
- config_name: results
data_files:
- split: 2024_02_02T02_17_55.720311
path:
- results_2024-02-02T02-17-55.720311.parquet
- split: latest
path:
- results_2024-02-02T02-17-55.720311.parquet
---
# Dataset Card for Evaluation run of shadowml/BeagSake-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [shadowml/BeagSake-7B](https://huggingface.co/shadowml/BeagSake-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_shadowml__BeagSake-7B",
"harness_winogrande_5",
split="train")
```
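The timestamped split names replace the `-` and `:` of the run timestamp with underscores (e.g. `2024_02_02T02_17_55.720311`). As a minimal sketch (a helper of our own, not part of the `datasets` API), one could recover the run time from such a split name like this:

```python
from datetime import datetime

def split_name_to_datetime(split_name: str) -> datetime:
    """Parse a timestamped split name such as "2024_02_02T02_17_55.720311",
    where "_" stands in for "-" in the date and ":" in the time."""
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

print(split_name_to_datetime("2024_02_02T02_17_55.720311"))
# → 2024-02-02 02:17:55.720311
```

This can be handy for sorting or comparing splits when a repository accumulates several evaluation runs.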
## Latest results
These are the [latest results from run 2024-02-02T02:17:55.720311](https://huggingface.co/datasets/open-llm-leaderboard/details_shadowml__BeagSake-7B/blob/main/results_2024-02-02T02-17-55.720311.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6578211772363932,
"acc_stderr": 0.031956725676144875,
"acc_norm": 0.6574768410421444,
"acc_norm_stderr": 0.0326189871206691,
"mc1": 0.572827417380661,
"mc1_stderr": 0.01731683441096392,
"mc2": 0.7227123192569592,
"mc2_stderr": 0.01451322669078661
},
"harness|arc:challenge|25": {
"acc": 0.7005119453924915,
"acc_stderr": 0.013385021637313572,
"acc_norm": 0.7244027303754266,
"acc_norm_stderr": 0.01305716965576184
},
"harness|hellaswag|10": {
"acc": 0.704142601075483,
"acc_stderr": 0.0045549440206204845,
"acc_norm": 0.8838876717785302,
"acc_norm_stderr": 0.0031970484760036424
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059004,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059004
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997695,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997695
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6820512820512821,
"acc_stderr": 0.023610884308927865,
"acc_norm": 0.6820512820512821,
"acc_norm_stderr": 0.023610884308927865
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608303,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608303
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4223463687150838,
"acc_stderr": 0.016519594275297117,
"acc_norm": 0.4223463687150838,
"acc_norm_stderr": 0.016519594275297117
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47522816166883963,
"acc_stderr": 0.012754553719781753,
"acc_norm": 0.47522816166883963,
"acc_norm_stderr": 0.012754553719781753
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.018798086284886887,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.018798086284886887
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.572827417380661,
"mc1_stderr": 0.01731683441096392,
"mc2": 0.7227123192569592,
"mc2_stderr": 0.01451322669078661
},
"harness|winogrande|5": {
"acc": 0.8216258879242304,
"acc_stderr": 0.010759352014855936
},
"harness|gsm8k|5": {
"acc": 0.7179681576952237,
"acc_stderr": 0.0123949265843357
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/zuikaku_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of zuikaku/瑞鶴 (Kantai Collection)
This is the dataset of zuikaku/瑞鶴 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `long_hair, twintails, ribbon, hair_ribbon, green_hair, green_eyes, white_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 583.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zuikaku_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 500      | 362.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zuikaku_kantaicollection/resolve/main/dataset-800.zip)               | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800 | 1226 | 765.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zuikaku_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200             | 500      | 528.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/zuikaku_kantaicollection/resolve/main/dataset-1200.zip)              | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 1226 | 1.02 GiB | [Download](https://huggingface.co/datasets/CyberHarem/zuikaku_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
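For the IMG+TXT packages, each image ships with a sibling `.txt` file holding its tags. A minimal sketch of reading one of those tag files, assuming the usual comma-separated layout of these exports (the helper name `read_tags` is illustrative, not part of the dataset tooling):

```python
import os
import tempfile

def read_tags(txt_path):
    """Parse a comma-separated tag file paired with an image."""
    with open(txt_path, encoding='utf-8') as f:
        return [tag.strip() for tag in f.read().split(',') if tag.strip()]

# Demo with a synthetic tag file standing in for one shipped alongside an image.
work_dir = tempfile.mkdtemp()
sample = os.path.join(work_dir, 'sample.txt')
with open(sample, 'w', encoding='utf-8') as f:
    f.write('1girl, solo, looking_at_viewer, muneate')

print(read_tags(sample))  # ['1girl', 'solo', 'looking_at_viewer', 'muneate']
```

The same parsing applies to any of the IMG+TXT packages above once the archive is extracted.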
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/zuikaku_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 29 |  |  |  |  |  | 1girl, solo, looking_at_viewer, simple_background, smile, white_background, muneate, upper_body, hakama_skirt, hair_between_eyes, tasuki |
| 1 | 7 |  |  |  |  |  | 1girl, bow_(weapon), japanese_clothes, muneate, skirt, smile, solo, looking_at_viewer, yugake, arrow_(projectile), character_name, brown_eyes |
| 2 | 5 |  |  |  |  |  | 1girl, arrow_(projectile), flight_deck, hakama_short_skirt, muneate, quiver, red_hakama, solo, tasuki, thigh_boots, yugake, brown_gloves, holding_bow_(weapon), rudder_footwear, black_thighhighs, full_body, hair_between_eyes, rigging, single_glove, aircraft, grey_hair |
| 3 | 6 |  |  |  |  |  | 1girl, bow_(weapon), japanese_clothes, muneate, skirt, solo, thigh_boots, thighhighs, smile, arrow_(projectile), flight_deck, character_name |
| 4 | 6 |  |  |  |  |  | 1girl, japanese_clothes, muneate, solo, blush, looking_at_viewer, skirt, black_hair, gloves, open_mouth |
| 5 | 5 |  |  |  |  |  | 1girl, hair_between_eyes, hair_down, japanese_clothes, muneate, official_alternate_costume, solo, white_headband, official_alternate_hairstyle, upper_body, breastplate, grey_hair, hachimaki, looking_at_viewer, yellow_eyes, closed_mouth |
| 6 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, navel, solo, blush, small_breasts, white_bikini, collarbone, hair_between_eyes, side-tie_bikini_bottom, simple_background, sitting, smile, white_background, cowboy_shot, grey_hair, jewelry, micro_bikini, open_mouth |
| 7 | 17 |  |  |  |  |  | 1girl, solo, looking_at_viewer, blush, green_jacket, ribbed_sweater, coat, simple_background, hair_between_eyes, white_background, black_thighhighs, brown_scarf, smile, white_sweater, box, gift, holding, long_sleeves, open_mouth, alternate_costume, fur-trimmed_jacket, red_scarf, sweater_dress |
| 8 | 9 |  |  |  |  |  | blue_shirt, 1girl, blue_skirt, blush, hair_between_eyes, midriff, solo, fox_ears, fox_shadow_puppet, fox_tail, navel, closed_mouth, looking_at_viewer, pleated_skirt, simple_background, small_breasts, cowboy_shot, crop_top, detached_sleeves, smile, white_background |
| 9 | 5 |  |  |  |  |  | navel, simple_background, white_background, 1girl, japanese_clothes, side-tie_panties, small_breasts, solo, black_panties, collarbone, cowboy_shot, brown_eyes, dark_green_hair, grey_hair, hair_between_eyes, looking_at_viewer, open_clothes, tasuki, thighhighs, white_panties |
| 10 | 6 |  |  |  |  |  | 1girl, cloud, day, looking_at_viewer, solo, beach, front-tie_top, ocean, outdoors, small_breasts, blue_sky, cowboy_shot, innertube, navel, black_bikini, side-tie_bikini_bottom |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | simple_background | smile | white_background | muneate | upper_body | hakama_skirt | hair_between_eyes | tasuki | bow_(weapon) | japanese_clothes | skirt | yugake | arrow_(projectile) | character_name | brown_eyes | flight_deck | hakama_short_skirt | quiver | red_hakama | thigh_boots | brown_gloves | holding_bow_(weapon) | rudder_footwear | black_thighhighs | full_body | rigging | single_glove | aircraft | grey_hair | thighhighs | blush | black_hair | gloves | open_mouth | hair_down | official_alternate_costume | white_headband | official_alternate_hairstyle | breastplate | hachimaki | yellow_eyes | closed_mouth | navel | small_breasts | white_bikini | collarbone | side-tie_bikini_bottom | sitting | cowboy_shot | jewelry | micro_bikini | green_jacket | ribbed_sweater | coat | brown_scarf | white_sweater | box | gift | holding | long_sleeves | alternate_costume | fur-trimmed_jacket | red_scarf | sweater_dress | blue_shirt | blue_skirt | midriff | fox_ears | fox_shadow_puppet | fox_tail | pleated_skirt | crop_top | detached_sleeves | side-tie_panties | black_panties | dark_green_hair | open_clothes | white_panties | cloud | day | beach | front-tie_top | ocean | outdoors | blue_sky | innertube | black_bikini |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-------|:--------------------|:--------------------|:--------|:-------------------|:----------|:-------------|:---------------|:--------------------|:---------|:---------------|:-------------------|:--------|:---------|:---------------------|:-----------------|:-------------|:--------------|:---------------------|:---------|:-------------|:--------------|:---------------|:-----------------------|:------------------|:-------------------|:------------|:----------|:---------------|:-----------|:------------|:-------------|:--------|:-------------|:---------|:-------------|:------------|:-----------------------------|:-----------------|:-------------------------------|:--------------|:------------|:--------------|:---------------|:--------|:----------------|:---------------|:-------------|:-------------------------|:----------|:--------------|:----------|:---------------|:---------------|:-----------------|:-------|:--------------|:----------------|:------|:-------|:----------|:---------------|:--------------------|:---------------------|:------------|:----------------|:-------------|:-------------|:----------|:-----------|:--------------------|:-----------|:----------------|:-----------|:-------------------|:-------------------|:----------------|:------------------|:---------------|:----------------|:--------|:------|:--------|:----------------|:--------|:-----------|:-----------|:------------|:---------------|
| 0 | 29 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | | X | | X | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | | | | X | | | X | X | | | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | | | X | | X | | | | | X | X | X | | X | X | | X | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | X | | | | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | | | | X | X | | X | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 8 |  |  |  |  |  | X | X | X | X | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | X | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 17 |  |  |  |  |  | X | X | X | X | X | X | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 9 |  |  |  |  |  | X | X | X | X | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | X | X | X | | X | | | | X | X | | X | | | | | X | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | X | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | |
| 10 | 6 |  |  |  |  |  | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
yangwang825/sst2-remove-non-stopwords-n5 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: label_text
dtype: string
splits:
- name: train
num_bytes: 884164
num_examples: 6920
- name: validation
num_bytes: 112712
num_examples: 872
- name: test
num_bytes: 174288
num_examples: 1821
download_size: 688195
dataset_size: 1171164
---
# Dataset Card for "sst2-remove-non-stopwords-n5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dvilasuero/somos-alpaca-es-rg | ---
dataset_info:
features:
- name: text
dtype: 'null'
- name: inputs
struct:
- name: 1-instruction
dtype: string
- name: 2-input
dtype: string
- name: 3-output
dtype: string
- name: prediction
dtype: 'null'
- name: prediction_agent
dtype: 'null'
- name: annotation
dtype: 'null'
- name: annotation_agent
dtype: 'null'
- name: vectors
struct:
- name: input
sequence: float64
- name: instruction
sequence: float64
- name: output
sequence: float64
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: string
- name: metadata
dtype: 'null'
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
dtype: 'null'
splits:
- name: train
num_bytes: 984065676
num_examples: 52002
download_size: 652741327
dataset_size: 984065676
---
# Dataset Card for "somos-alpaca-es-rg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ocillus/torch-base | ---
license: apache-2.0
---
|
CronosGhost/code-reranking-CodeLangQueries | ---
dataset_info:
features:
- name: query
dtype: string
- name: positive
sequence: string
- name: negative
sequence: string
splits:
- name: train
num_bytes: 23164263
num_examples: 9900
download_size: 9270866
dataset_size: 23164263
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_boomerchan__magpie-13b | ---
pretty_name: Evaluation run of boomerchan/magpie-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [boomerchan/magpie-13b](https://huggingface.co/boomerchan/magpie-13b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_boomerchan__magpie-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-27T04:34:42.967550](https://huggingface.co/datasets/open-llm-leaderboard/details_boomerchan__magpie-13b/blob/main/results_2023-10-27T04-34-42.967550.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.14272231543624161,\n\
\ \"em_stderr\": 0.003582171317651424,\n \"f1\": 0.20778418624161069,\n\
\ \"f1_stderr\": 0.0036307604368272656,\n \"acc\": 0.4548027044477143,\n\
\ \"acc_stderr\": 0.01080662148135179\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.14272231543624161,\n \"em_stderr\": 0.003582171317651424,\n\
\ \"f1\": 0.20778418624161069,\n \"f1_stderr\": 0.0036307604368272656\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14480667172100076,\n \
\ \"acc_stderr\": 0.009693234799052706\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650877\n\
\ }\n}\n```"
repo_url: https://huggingface.co/boomerchan/magpie-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|arc:challenge|25_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_27T04_34_42.967550
path:
- '**/details_harness|drop|3_2023-10-27T04-34-42.967550.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-27T04-34-42.967550.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_27T04_34_42.967550
path:
- '**/details_harness|gsm8k|5_2023-10-27T04-34-42.967550.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-27T04-34-42.967550.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hellaswag|10_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-48-49.581129.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T11-48-49.581129.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-03T11-48-49.581129.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_27T04_34_42.967550
path:
- '**/details_harness|winogrande|5_2023-10-27T04-34-42.967550.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-27T04-34-42.967550.parquet'
- config_name: results
data_files:
- split: 2023_10_03T11_48_49.581129
path:
- results_2023-10-03T11-48-49.581129.parquet
- split: 2023_10_27T04_34_42.967550
path:
- results_2023-10-27T04-34-42.967550.parquet
- split: latest
path:
- results_2023-10-27T04-34-42.967550.parquet
---
# Dataset Card for Evaluation run of boomerchan/magpie-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/boomerchan/magpie-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [boomerchan/magpie-13b](https://huggingface.co/boomerchan/magpie-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_boomerchan__magpie-13b",
"harness_winogrande_5",
	split="latest")
```
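The timestamp-based split naming described above can be sketched with a small helper (`timestamp_to_split` is a hypothetical name for illustration): split names are the run timestamps with `-` and `:` replaced by `_`, matching the split names in the YAML configuration above.

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp into the corresponding split name
    (hypothetical helper illustrating the naming scheme)."""
    return ts.replace("-", "_").replace(":", "_")

# "2023-10-27T04:34:42.967550" becomes "2023_10_27T04_34_42.967550"
print(timestamp_to_split("2023-10-27T04:34:42.967550"))
```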
## Latest results
These are the [latest results from run 2023-10-27T04:34:42.967550](https://huggingface.co/datasets/open-llm-leaderboard/details_boomerchan__magpie-13b/blob/main/results_2023-10-27T04-34-42.967550.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.14272231543624161,
"em_stderr": 0.003582171317651424,
"f1": 0.20778418624161069,
"f1_stderr": 0.0036307604368272656,
"acc": 0.4548027044477143,
"acc_stderr": 0.01080662148135179
},
"harness|drop|3": {
"em": 0.14272231543624161,
"em_stderr": 0.003582171317651424,
"f1": 0.20778418624161069,
"f1_stderr": 0.0036307604368272656
},
"harness|gsm8k|5": {
"acc": 0.14480667172100076,
"acc_stderr": 0.009693234799052706
},
"harness|winogrande|5": {
"acc": 0.7647987371744278,
"acc_stderr": 0.011920008163650877
}
}
```
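As a quick sanity check (assuming the top-level `"acc"` is the uniform mean of the per-task accuracies, which holds for the numbers shown above), the aggregate can be reproduced from the per-task entries:

```python
# Subset of the aggregated results JSON shown above
results = {
    "all": {"acc": 0.4548027044477143},
    "harness|gsm8k|5": {"acc": 0.14480667172100076},
    "harness|winogrande|5": {"acc": 0.7647987371744278},
}

# The "all" accuracy is the mean of the per-task accuracies
# (only gsm8k and winogrande report "acc" in this run).
mean_acc = (results["harness|gsm8k|5"]["acc"]
            + results["harness|winogrande|5"]["acc"]) / 2
print(mean_acc)  # matches results["all"]["acc"]
```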
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
daqc/wikihow_es_80train_20test | ---
dataset_info:
features:
- name: title
dtype: string
- name: section_name
dtype: string
- name: summary
dtype: string
- name: document
dtype: string
- name: english_section_name
dtype: string
- name: english_url
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 258772116.8
num_examples: 90528
- name: test
num_bytes: 64693029.2
num_examples: 22632
download_size: 186134579
dataset_size: 323465146.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Skrillll/categorization | ---
license: openrail
---
|
marcus2000/dataset4sentinement_HSE | ---
dataset_info:
features:
- name: text
dtype: string
- name: labels
dtype: int64
splits:
- name: train
num_bytes: 3679508.0480941418
num_examples: 3322
- name: test
num_bytes: 650171.9519058582
num_examples: 587
download_size: 2311435
dataset_size: 4329680.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "dataset4sentinement_HSE"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bclavie/multiple_choice_bsard | ---
dataset_info:
- config_name: corpus
features:
- name: id
dtype: int64
- name: reference
dtype: string
- name: article
dtype: string
- name: law_type
dtype: string
- name: code
dtype: string
- name: book
dtype: string
- name: part
dtype: string
- name: act
dtype: string
- name: chapter
dtype: string
- name: section
dtype: string
- name: subsection
dtype: string
- name: description
dtype: string
- name: article_english
dtype: string
splits:
- name: test
num_bytes: 49992486
num_examples: 22633
download_size: 17332067
dataset_size: 49992486
- config_name: questions
features:
- name: question
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: correct
dtype: string
- name: incorrect
sequence: string
- name: noise_article_ids
sequence: int64
- name: relevant_doc_ids
sequence: int64
- name: correct_letter
dtype: string
- name: answers_string
dtype: string
- name: english_question
dtype: string
- name: english_correct
dtype: string
- name: english_incorrect
sequence: string
- name: english_answers_string
dtype: string
splits:
- name: test
num_bytes: 282361.7102803738
num_examples: 54
download_size: 131571
dataset_size: 282361.7102803738
configs:
- config_name: corpus
data_files:
- split: test
path: corpus/test-*
- config_name: questions
data_files:
- split: test
path: questions/test-*
---
|
zurd46/zurdcoder | ---
license: apache-2.0
---
|
cakiki/roots-tsne-data | ---
dataset_info:
features:
- name: x
dtype: float64
- name: 'y'
dtype: float64
- name: language
dtype: string
- name: corpus
dtype: string
splits:
- name: train
num_bytes: 247037602
num_examples: 5785741
download_size: 112131877
dataset_size: 247037602
license: apache-2.0
---
What follows is research code. It is by no means optimized for speed, efficiency, or readability.
## Data loading, tokenizing and sharding
```python
import os
import numpy as np
import pandas as pd
from sklearn.feature_extraction.text import TfidfTransformer
from sklearn.decomposition import TruncatedSVD
from tqdm.notebook import tqdm
from openTSNE import TSNE
import datashader as ds
import colorcet as cc
from dask.distributed import Client
import dask.dataframe as dd
import dask_ml
import dask.bag as db
from transformers import AutoTokenizer
from datasets import load_dataset
from datasets.utils.py_utils import convert_file_size_to_int
# NOTE: the tokenizer checkpoint is elided in the original; any HF tokenizer works here.
tokenizer = AutoTokenizer.from_pretrained(...)

def batch_tokenize(batch):
    return {'tokenized': [' '.join(e.tokens) for e in tokenizer(batch['text']).encodings]}  # "text" column hard-coded
# The original viz used a subset of the ROOTS Corpus.
# More info on the entire dataset here: https://huggingface.co/bigscience-data
# And here: https://arxiv.org/abs/2303.03915
dset = load_dataset(..., split="train")
dset = dset.map(batch_tokenize, batched=True, batch_size=64, num_proc=28)
dset_name = "roots_subset"
max_shard_size = convert_file_size_to_int('300MB')
dataset_nbytes = dset.data.nbytes
num_shards = int(dataset_nbytes / max_shard_size) + 1
num_shards = max(num_shards, 1)
print(f"Sharding into {num_shards} files.")
os.makedirs(f"{dset_name}/tokenized", exist_ok=True)
for shard_index in tqdm(range(num_shards)):
shard = dset.shard(num_shards=num_shards, index=shard_index, contiguous=True)
shard.to_parquet(f"{dset_name}/tokenized/tokenized-{shard_index:03d}.parquet")
```
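The shard count above is a ceiling-style division of the dataset's byte size by the target shard size, with a floor of one shard. As a plain-Python sketch of the same arithmetic:

```python
def num_shards_for(dataset_nbytes: int, max_shard_size: int) -> int:
    """Number of shards so that each holds at most roughly max_shard_size bytes."""
    # Same arithmetic as above: integer-divide, round up, floor at 1.
    return max(dataset_nbytes // max_shard_size + 1, 1)

# ~1 GB of data at ~300 MB per shard -> 4 shards
print(num_shards_for(1_000_000_000, 300_000_000))
```

Like the snippet above, this over-allocates one shard when the sizes divide evenly, which is harmless here.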
## Embedding
```python
client = Client() # To keep track of dask computation
client
df = dd.read_parquet(f'{dset_name}/tokenized/')
# `vocab` is assumed to be a precomputed token vocabulary (not shown in the original).
vect = dask_ml.feature_extraction.text.CountVectorizer(tokenizer=str.split,
                                                       token_pattern=None,
                                                       vocabulary=vocab)
tokenized_bag = df['tokenized'].to_bag()
X = vect.transform(tokenized_bag)
counts = X.compute()
client.shutdown()
tfidf_transformer = TfidfTransformer(sublinear_tf=True, norm="l2")
tfidf = tfidf_transformer.fit_transform(counts)
svd = TruncatedSVD(n_components=160)
X_svd = svd.fit_transform(tfidf)
tsne = TSNE(
perplexity=30, # not sure what param setting resulted in the plot
n_jobs=28,
random_state=42,
verbose=True,
)
tsne_embedding = tsne.fit(X_svd)  # fit on the SVD-reduced features, not the raw count matrix
```
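For reference, `sublinear_tf=True` replaces each raw count `tf` with `1 + log(tf)` before applying the idf weight and l2 normalization. A toy pure-Python sketch of the weighting for a single document row (hypothetical helper, not part of the pipeline, using scikit-learn's smoothed idf):

```python
import math

def sublinear_tfidf_row(counts, doc_freqs, n_docs):
    """Sublinear tf-idf for one document: (1 + log(tf)) * idf, l2-normalized."""
    weights = []
    for tf, df in zip(counts, doc_freqs):
        if tf == 0:
            weights.append(0.0)
            continue
        # Smoothed idf as in scikit-learn: ln((1 + n) / (1 + df)) + 1
        idf = math.log((1 + n_docs) / (1 + df)) + 1
        weights.append((1 + math.log(tf)) * idf)
    norm = math.sqrt(sum(w * w for w in weights)) or 1.0
    return [w / norm for w in weights]

row = sublinear_tfidf_row([2, 0, 1], [1, 2, 3], n_docs=3)
# The l2 norm of the resulting row is 1 by construction.
```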
## Plotting
```python
df = pd.DataFrame(data=tsne_embedding, columns=['x','y'])
agg = ds.Canvas(plot_height=600, plot_width=600).points(df, 'x', 'y')
img = ds.tf.shade(agg, cmap=cc.fire, how='eq_hist')
ds.tf.set_background(img, "black")
```
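The `how='eq_hist'` option in the shading step applies histogram equalization to the aggregated counts, so heavily skewed densities still use the full color range. A toy sketch of the idea (hypothetical helper, not datashader's actual implementation):

```python
def eq_hist_ranks(values):
    """Map each value to its normalized rank among the distinct values."""
    order = sorted(set(values))
    rank = {v: i for i, v in enumerate(order)}
    denom = max(len(order) - 1, 1)
    return [rank[v] / denom for v in values]

# A huge outlier no longer dominates the scale: ranks spread evenly over [0, 1].
print(eq_hist_ranks([1, 1, 2, 100]))  # [0.0, 0.0, 0.5, 1.0]
```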
 |
Asap7772/persona_gpt4_paired_filtered_disagree0.8 | ---
dataset_info:
features:
- name: x
dtype: string
- name: yw
dtype: string
- name: yl
dtype: string
- name: scorew
dtype: int64
- name: scorel
dtype: int64
- name: genw
dtype: string
- name: genl
dtype: string
- name: scorer
dtype: string
- name: scorer_id
dtype: int64
- name: scorerw_id
dtype: int64
- name: scorerl_id
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 115768106.66839354
num_examples: 35661
- name: test
num_bytes: 12865287.19684932
num_examples: 3963
download_size: 33551923
dataset_size: 128633393.86524285
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
tttarun/captcha_store | ---
license: mit
---
|
MATTTTTZ/Rodrigoo | ---
license: openrail
---
|
open-llm-leaderboard/details_hfl__chinese-alpaca-2-13b-16k | ---
pretty_name: Evaluation run of hfl/chinese-alpaca-2-13b-16k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [hfl/chinese-alpaca-2-13b-16k](https://huggingface.co/hfl/chinese-alpaca-2-13b-16k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hfl__chinese-alpaca-2-13b-16k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-09T15:53:33.265685](https://huggingface.co/datasets/open-llm-leaderboard/details_hfl__chinese-alpaca-2-13b-16k/blob/main/results_2023-12-09T15-53-33.265685.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5126179344828111,\n\
\ \"acc_stderr\": 0.0342051274120513,\n \"acc_norm\": 0.5178843368987507,\n\
\ \"acc_norm_stderr\": 0.034949756392914415,\n \"mc1\": 0.33047735618115054,\n\
\ \"mc1_stderr\": 0.016466769613698307,\n \"mc2\": 0.46496694797516,\n\
\ \"mc2_stderr\": 0.015236674932834036\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5213310580204779,\n \"acc_stderr\": 0.014598087973127106,\n\
\ \"acc_norm\": 0.5503412969283277,\n \"acc_norm_stderr\": 0.014537144444284738\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5728938458474407,\n\
\ \"acc_stderr\": 0.004936470085238487,\n \"acc_norm\": 0.7741485759808803,\n\
\ \"acc_norm_stderr\": 0.0041728722829842005\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309173,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309173\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5509433962264151,\n \"acc_stderr\": 0.030612730713641095,\n\
\ \"acc_norm\": 0.5509433962264151,\n \"acc_norm_stderr\": 0.030612730713641095\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5069444444444444,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.5069444444444444,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n\
\ \"acc_stderr\": 0.038118909889404126,\n \"acc_norm\": 0.5086705202312138,\n\
\ \"acc_norm_stderr\": 0.038118909889404126\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3446808510638298,\n \"acc_stderr\": 0.03106898596312215,\n\
\ \"acc_norm\": 0.3446808510638298,\n \"acc_norm_stderr\": 0.03106898596312215\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523853,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523853\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.041634530313028585,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.041634530313028585\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5741935483870968,\n\
\ \"acc_stderr\": 0.028129112709165904,\n \"acc_norm\": 0.5741935483870968,\n\
\ \"acc_norm_stderr\": 0.028129112709165904\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.39901477832512317,\n \"acc_stderr\": 0.03445487686264715,\n\
\ \"acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.03445487686264715\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031596,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031596\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.033586181457325226,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.033586181457325226\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7357512953367875,\n \"acc_stderr\": 0.03182155050916646,\n\
\ \"acc_norm\": 0.7357512953367875,\n \"acc_norm_stderr\": 0.03182155050916646\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44358974358974357,\n \"acc_stderr\": 0.0251891498947642,\n \
\ \"acc_norm\": 0.44358974358974357,\n \"acc_norm_stderr\": 0.0251891498947642\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881563,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881563\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5210084033613446,\n \"acc_stderr\": 0.03244980849990029,\n \
\ \"acc_norm\": 0.5210084033613446,\n \"acc_norm_stderr\": 0.03244980849990029\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7045871559633028,\n \"acc_stderr\": 0.019560619182976,\n \"acc_norm\"\
: 0.7045871559633028,\n \"acc_norm_stderr\": 0.019560619182976\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.39814814814814814,\n\
\ \"acc_stderr\": 0.033384734032074016,\n \"acc_norm\": 0.39814814814814814,\n\
\ \"acc_norm_stderr\": 0.033384734032074016\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7009803921568627,\n \"acc_stderr\": 0.03213325717373617,\n\
\ \"acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.03213325717373617\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.70042194092827,\n \"acc_stderr\": 0.02981802474975309,\n \
\ \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.02981802474975309\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\
\ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n\
\ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497751,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497751\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5766871165644172,\n \"acc_stderr\": 0.03881891213334383,\n\
\ \"acc_norm\": 0.5766871165644172,\n \"acc_norm_stderr\": 0.03881891213334383\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280042,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280042\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.02685345037700914,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.02685345037700914\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956914,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956914\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7151979565772669,\n\
\ \"acc_stderr\": 0.016139174096522546,\n \"acc_norm\": 0.7151979565772669,\n\
\ \"acc_norm_stderr\": 0.016139174096522546\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5867052023121387,\n \"acc_stderr\": 0.02651126136940924,\n\
\ \"acc_norm\": 0.5867052023121387,\n \"acc_norm_stderr\": 0.02651126136940924\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961443,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961443\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5522875816993464,\n \"acc_stderr\": 0.02847293847803353,\n\
\ \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.02847293847803353\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n\
\ \"acc_stderr\": 0.027846476005930473,\n \"acc_norm\": 0.5980707395498392,\n\
\ \"acc_norm_stderr\": 0.027846476005930473\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.027586006221607708,\n\
\ \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.027586006221607708\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4078014184397163,\n \"acc_stderr\": 0.029316011776343555,\n \
\ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.029316011776343555\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39895697522816165,\n\
\ \"acc_stderr\": 0.01250675765529367,\n \"acc_norm\": 0.39895697522816165,\n\
\ \"acc_norm_stderr\": 0.01250675765529367\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4338235294117647,\n \"acc_stderr\": 0.030105636570016636,\n\
\ \"acc_norm\": 0.4338235294117647,\n \"acc_norm_stderr\": 0.030105636570016636\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.49019607843137253,\n \"acc_stderr\": 0.020223946005074305,\n \
\ \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.020223946005074305\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6218905472636815,\n\
\ \"acc_stderr\": 0.034288678487786564,\n \"acc_norm\": 0.6218905472636815,\n\
\ \"acc_norm_stderr\": 0.034288678487786564\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.03528211258245231,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.03528211258245231\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33047735618115054,\n\
\ \"mc1_stderr\": 0.016466769613698307,\n \"mc2\": 0.46496694797516,\n\
\ \"mc2_stderr\": 0.015236674932834036\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.734017363851618,\n \"acc_stderr\": 0.01241832315305105\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.21076573161485973,\n \
\ \"acc_stderr\": 0.011234280469030465\n }\n}\n```"
repo_url: https://huggingface.co/hfl/chinese-alpaca-2-13b-16k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|arc:challenge|25_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|gsm8k|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hellaswag|10_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T15-53-33.265685.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-09T15-53-33.265685.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- '**/details_harness|winogrande|5_2023-12-09T15-53-33.265685.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-09T15-53-33.265685.parquet'
- config_name: results
data_files:
- split: 2023_12_09T15_53_33.265685
path:
- results_2023-12-09T15-53-33.265685.parquet
- split: latest
path:
- results_2023-12-09T15-53-33.265685.parquet
---
# Dataset Card for Evaluation run of hfl/chinese-alpaca-2-13b-16k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/hfl/chinese-alpaca-2-13b-16k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [hfl/chinese-alpaca-2-13b-16k](https://huggingface.co/hfl/chinese-alpaca-2-13b-16k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_hfl__chinese-alpaca-2-13b-16k",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-09T15:53:33.265685](https://huggingface.co/datasets/open-llm-leaderboard/details_hfl__chinese-alpaca-2-13b-16k/blob/main/results_2023-12-09T15-53-33.265685.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5126179344828111,
"acc_stderr": 0.0342051274120513,
"acc_norm": 0.5178843368987507,
"acc_norm_stderr": 0.034949756392914415,
"mc1": 0.33047735618115054,
"mc1_stderr": 0.016466769613698307,
"mc2": 0.46496694797516,
"mc2_stderr": 0.015236674932834036
},
"harness|arc:challenge|25": {
"acc": 0.5213310580204779,
"acc_stderr": 0.014598087973127106,
"acc_norm": 0.5503412969283277,
"acc_norm_stderr": 0.014537144444284738
},
"harness|hellaswag|10": {
"acc": 0.5728938458474407,
"acc_stderr": 0.004936470085238487,
"acc_norm": 0.7741485759808803,
"acc_norm_stderr": 0.0041728722829842005
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309173,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309173
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5509433962264151,
"acc_stderr": 0.030612730713641095,
"acc_norm": 0.5509433962264151,
"acc_norm_stderr": 0.030612730713641095
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5069444444444444,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.5069444444444444,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5086705202312138,
"acc_stderr": 0.038118909889404126,
"acc_norm": 0.5086705202312138,
"acc_norm_stderr": 0.038118909889404126
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929775,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929775
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3446808510638298,
"acc_stderr": 0.03106898596312215,
"acc_norm": 0.3446808510638298,
"acc_norm_stderr": 0.03106898596312215
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523853,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523853
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.041634530313028585,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.041634530313028585
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5741935483870968,
"acc_stderr": 0.028129112709165904,
"acc_norm": 0.5741935483870968,
"acc_norm_stderr": 0.028129112709165904
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39901477832512317,
"acc_stderr": 0.03445487686264715,
"acc_norm": 0.39901477832512317,
"acc_norm_stderr": 0.03445487686264715
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031596,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031596
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.033586181457325226,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.033586181457325226
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7357512953367875,
"acc_stderr": 0.03182155050916646,
"acc_norm": 0.7357512953367875,
"acc_norm_stderr": 0.03182155050916646
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44358974358974357,
"acc_stderr": 0.0251891498947642,
"acc_norm": 0.44358974358974357,
"acc_norm_stderr": 0.0251891498947642
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.02813325257881563,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.02813325257881563
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5210084033613446,
"acc_stderr": 0.03244980849990029,
"acc_norm": 0.5210084033613446,
"acc_norm_stderr": 0.03244980849990029
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7045871559633028,
"acc_stderr": 0.019560619182976,
"acc_norm": 0.7045871559633028,
"acc_norm_stderr": 0.019560619182976
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.03213325717373617,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.03213325717373617
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.70042194092827,
"acc_stderr": 0.02981802474975309,
"acc_norm": 0.70042194092827,
"acc_norm_stderr": 0.02981802474975309
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497751,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497751
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5766871165644172,
"acc_stderr": 0.03881891213334383,
"acc_norm": 0.5766871165644172,
"acc_norm_stderr": 0.03881891213334383
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578729,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578729
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280042,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280042
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.02685345037700914,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.02685345037700914
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956914,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956914
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7151979565772669,
"acc_stderr": 0.016139174096522546,
"acc_norm": 0.7151979565772669,
"acc_norm_stderr": 0.016139174096522546
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5867052023121387,
"acc_stderr": 0.02651126136940924,
"acc_norm": 0.5867052023121387,
"acc_norm_stderr": 0.02651126136940924
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961443,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961443
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.02847293847803353,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.02847293847803353
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5980707395498392,
"acc_stderr": 0.027846476005930473,
"acc_norm": 0.5980707395498392,
"acc_norm_stderr": 0.027846476005930473
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.027586006221607708,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.027586006221607708
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.029316011776343555,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.029316011776343555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39895697522816165,
"acc_stderr": 0.01250675765529367,
"acc_norm": 0.39895697522816165,
"acc_norm_stderr": 0.01250675765529367
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4338235294117647,
"acc_stderr": 0.030105636570016636,
"acc_norm": 0.4338235294117647,
"acc_norm_stderr": 0.030105636570016636
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.020223946005074305,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.020223946005074305
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6218905472636815,
"acc_stderr": 0.034288678487786564,
"acc_norm": 0.6218905472636815,
"acc_norm_stderr": 0.034288678487786564
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.03528211258245231,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.03528211258245231
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33047735618115054,
"mc1_stderr": 0.016466769613698307,
"mc2": 0.46496694797516,
"mc2_stderr": 0.015236674932834036
},
"harness|winogrande|5": {
"acc": 0.734017363851618,
"acc_stderr": 0.01241832315305105
},
"harness|gsm8k|5": {
"acc": 0.21076573161485973,
"acc_stderr": 0.011234280469030465
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Vezora/Mini_Orca_Uncencored_Alpaca | ---
license: apache-2.0
---
This dataset is a modified version of psmathur's Mini Orca dataset, reformatted into the Alpaca format and uncensored.
For Alpaca-LoRA users:
Modules you can target with LoRA: "gate_proj", "down_proj", "up_proj", "q_proj", "v_proj", "k_proj", "o_proj"
Most LoRA models use: "q_proj", "v_proj", "k_proj", "o_proj"
Platypus, which got terrific results, used: "gate_proj", "down_proj", "up_proj"
Research on targeting specific modules still needs to be done, but if you don't want to train over a previously trained model's newly learned abilities, target different modules than the ones used for the original training.
Hyperparameters used by Platypus for the 13B and 70B models:

| Hyperparameter | Platypus2-13B / 70B |
|---|---|
| batch size | 16 |
| micro batch size | 1 |
| num epochs | 1 |
| learning rate | 4e-4 / 3e-4 |
| cutoff len | 4096 |
| lora rank | 16 |
| lora alpha | 16 |
| lora dropout | 0.05 |
| lora target modules | gate_proj, down_proj, up_proj |
| train on inputs | False |
| add eos token | False |
| group by length | False |
| prompt template | alpaca |
| lr scheduler | cosine |
| warmup steps | 100 |
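As a rough sketch of how these settings map onto a PEFT-style LoRA config (assuming the Hugging Face `peft` library; this is illustrative, not Platypus's actual code):

```python
# Illustrative sketch only (assumes the Hugging Face `peft` library is installed).
from peft import LoraConfig

# Platypus-style MLP targets:
platypus_config = LoraConfig(
    r=16,                 # lora rank
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["gate_proj", "down_proj", "up_proj"],
    task_type="CAUSAL_LM",
)

# The more common attention-only targets:
attention_config = LoraConfig(
    r=16,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj", "k_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
```

Either config can then be applied to a base model with `peft.get_peft_model(model, config)`.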
For a single 24 GB card, I would recommend a batch size of 4-10 and a cutoff length of ≤ 2048 to avoid VRAM issues, together with load_in_4bit, normal-float (NF4) quantization, and bf16.
If training with oobabooga, you must edit the "training.py" file in the "oobabooga_windows\text-generation-webui\modules" folder: on line 49, change the standard modules to the modules you would like to target.
If training with Alpaca LoRA, pass the --lora_target_modules argument when running train.py. To load in 4-bit, you must edit the train file, adding 4-bit loading, bf16, and normal-float quantization.
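A minimal sketch of the 4-bit, normal-float, bf16 combination described above (assuming `transformers` with `bitsandbytes` installed; the config object is `transformers.BitsAndBytesConfig`):

```python
# Illustrative sketch only (assumes transformers + bitsandbytes are installed).
import torch
from transformers import BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # 4-bit loading
    bnb_4bit_quant_type="nf4",              # normal-float (NF4) quantization
    bnb_4bit_compute_dtype=torch.bfloat16,  # bf16 compute
)
# Pass as `quantization_config=bnb_config` to `AutoModelForCausalLM.from_pretrained(...)`.
```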
|
micsell/hebrew_kan | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: id
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 27205984251.625
num_examples: 146451
download_size: 27201338977
dataset_size: 27205984251.625
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
transformersbook/emotion-train-split | ---
license: apache-2.0
---
|
parler-tts/mls_eng | ---
pretty_name: English MLS
annotations_creators:
- expert-generated
language_creators:
- crowdsourced
- expert-generated
language:
- en
license:
- cc-by-4.0
multilinguality:
- multilingual
paperswithcode_id: multilingual-librispeech
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- automatic-speech-recognition
- text-to-speech
- text-to-audio
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
dtype: audio
- name: original_path
dtype: string
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: transcript
dtype: string
- name: audio_duration
dtype: float64
- name: speaker_id
dtype: string
- name: book_id
dtype: string
splits:
- name: dev
num_bytes: 249688889.909
num_examples: 3807
- name: test
num_bytes: 245938961
num_examples: 3769
- name: train
num_bytes: 707578913096
num_examples: 10808037
download_size: 705179367357
dataset_size: 708074540946.909
---
# Dataset Card for English MLS
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [MultiLingual LibriSpeech ASR corpus](http://www.openslr.org/94)
- **Repository:** [Needs More Information]
- **Paper:** [MLS: A Large-Scale Multilingual Dataset for Speech Research](https://arxiv.org/abs/2012.03411)
- **Leaderboard:** [๐ค Autoevaluate Leaderboard](https://huggingface.co/spaces/autoevaluate/leaderboards?dataset=facebook%2Fmultilingual_librispeech&only_verified=0&task=automatic-speech-recognition&config=-unspecified-&split=-unspecified-&metric=wer)
### Dataset Summary
This is a streamable version of the **English version of the Multilingual LibriSpeech (MLS) dataset**.
The data archives were restructured from the original ones from [OpenSLR](http://www.openslr.org/94) to make it easier to stream.
The MLS dataset is a large multilingual corpus suitable for speech research. The dataset is derived from read audiobooks from LibriVox and consists of
8 languages - English, German, Dutch, Spanish, French, Italian, Portuguese, Polish. It includes about 44.5K hours of English and a total of about 6K hours for the other languages.
This dataset card covers the 44.5K hours of English. Refer to this [dataset card](https://huggingface.co/datasets/facebook/multilingual_librispeech) for the other languages.
### Supported Tasks and Leaderboards
- `automatic-speech-recognition`, `speaker-identification`: The dataset can be used to train a model for Automatic Speech Recognition (ASR). The model is presented with an audio file and asked to transcribe the audio file to written text. The most common evaluation metric is the word error rate (WER). The task has an active leaderboard which can be found at https://paperswithcode.com/dataset/multilingual-librispeech and ranks models based on their WER.
- `text-to-speech`, `text-to-audio`: The dataset can also be used to train a model for Text-To-Speech (TTS).
### How to use
The `datasets` library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the `load_dataset` function.
For example, to download and prepare the English train split:
```python
from datasets import load_dataset
mls = load_dataset("parler-tts/mls_eng", split="train")
```
Using the datasets library, you can also stream the dataset on-the-fly by adding a `streaming=True` argument to the `load_dataset` function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.
```python
from datasets import load_dataset
mls = load_dataset("parler-tts/mls_eng", split="train", streaming=True)
print(next(iter(mls)))
```
*Bonus*: create a [PyTorch dataloader](https://huggingface.co/docs/datasets/use_with_pytorch) directly with your own datasets (local/streamed).
Local:
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
from torch.utils.data.sampler import BatchSampler, RandomSampler
mls = load_dataset("parler-tts/mls_eng", split="train")
batch_sampler = BatchSampler(RandomSampler(mls), batch_size=32, drop_last=False)
dataloader = DataLoader(mls, batch_sampler=batch_sampler)
```
Streaming:
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
mls = load_dataset("parler-tts/mls_eng", split="train", streaming=True)
dataloader = DataLoader(mls, batch_size=32)
```
To find out more about loading and preparing audio datasets, head over to [hf.co/blog/audio-datasets](https://huggingface.co/blog/audio-datasets).
### Example scripts
Train your own CTC or Seq2Seq Automatic Speech Recognition models on MultiLingual Librispeech with `transformers` - [here](https://github.com/huggingface/transformers/tree/main/examples/pytorch/speech-recognition).
## Dataset Structure
### Data Fields
- file: a filename in .flac format.
- audio: A dictionary containing the audio filename, the decoded audio array, and the sampling rate. Note that when accessing the audio column: `dataset[0]["audio"]` the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling of a large number of audio files might take a significant amount of time. Thus it is important to first query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
- text: the transcription of the audio file.
- id: unique id of the data sample.
- speaker_id: unique id of the speaker. The same speaker id can be found for multiple data samples.
- chapter_id: id of the audiobook chapter which includes the transcription.
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
Public Domain, Creative Commons Attribution 4.0 International Public License ([CC-BY-4.0](https://creativecommons.org/licenses/by/4.0/legalcode))
### Citation Information
```
@article{Pratap2020MLSAL,
title={MLS: A Large-Scale Multilingual Dataset for Speech Research},
author={Vineel Pratap and Qiantong Xu and Anuroop Sriram and Gabriel Synnaeve and Ronan Collobert},
journal={ArXiv},
year={2020},
volume={abs/2012.03411}
}
```
### Data Statistics
| Duration (h) | Train | Dev | Test |
|--------------|-----------|-------|-------|
| English | 44,659.74 | 15.75 | 15.55 |
| German | 1,966.51 | 14.28 | 14.29 |
| Dutch | 1,554.24 | 12.76 | 12.76 |
| French | 1,076.58 | 10.07 | 10.07 |
| Spanish | 917.68 | 9.99 | 10 |
| Italian | 247.38 | 5.18 | 5.27 |
| Portuguese | 160.96 | 3.64 | 3.74 |
| Polish | 103.65 | 2.08 | 2.14 |
| # Speakers | Train | | Dev | | Test | |
|------------|-------|------|-----|----|------|----|
| Gender | M | F | M | F | M | F |
| English | 2742 | 2748 | 21 | 21 | 21 | 21 |
| German | 81 | 95 | 15 | 15 | 15 | 15 |
| Dutch | 9 | 31 | 3 | 3 | 3 | 3 |
| French | 62 | 80 | 9 | 9 | 9 | 9 |
| Spanish | 36 | 50 | 10 | 10 | 10 | 10 |
| Italian | 22 | 43 | 5 | 5 | 5 | 5 |
| Portuguese | 26 | 16 | 5 | 5 | 5 | 5 |
| Polish | 6 | 5 | 2 | 2 | 2 | 2 |
| # Hours / Gender | Dev | | Test | |
|------------------|------|------|------|------|
| Gender | M | F | M | F |
| English | 7.76 | 7.99 | 7.62 | 7.93 |
| German | 7.06 | 7.22 | 7 | 7.29 |
| Dutch | 6.44 | 6.32 | 6.72 | 6.04 |
| French | 5.13 | 4.94 | 5.04 | 5.02 |
| Spanish | 4.91 | 5.08 | 4.78 | 5.23 |
| Italian | 2.5 | 2.68 | 2.38 | 2.9 |
| Portuguese | 1.84 | 1.81 | 1.83 | 1.9 |
| Polish | 1.12 | 0.95 | 1.09 | 1.05 |
|
ashraq/cohere-wiki-embedding-100k | ---
dataset_info:
features:
- name: id
dtype: int64
- name: title
dtype: string
- name: text
dtype: string
- name: url
dtype: string
- name: wiki_id
dtype: int64
- name: views
dtype: float64
- name: paragraph_id
dtype: int64
- name: langs
dtype: int64
- name: emb
sequence: float64
splits:
- name: train
num_bytes: 686289530
num_examples: 100000
download_size: 538433661
dataset_size: 686289530
---
# Dataset Card for "cohere-wiki-embedding-100k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/wiki_find_passage_train50_eval40_num | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 95036
num_examples: 140
- name: validation
num_bytes: 33332
num_examples: 40
download_size: 73446
dataset_size: 128368
---
# Dataset Card for "wiki_find_passage_train50_eval40_num"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kawagoshi-llm-team/chatwork_column | ---
license: unknown
dataset_info:
features:
- name: url
dtype: string
- name: timestamp
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 21313023
num_examples: 1748
download_size: 8765966
dataset_size: 21313023
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v4 | ---
pretty_name: Evaluation run of v1olet/v1olet_merged_dpo_7B_v4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [v1olet/v1olet_merged_dpo_7B_v4](https://huggingface.co/v1olet/v1olet_merged_dpo_7B_v4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-13T13:46:12.224585](https://huggingface.co/datasets/open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v4/blob/main/results_2023-12-13T13-46-12.224585.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5917202264245718,\n\
\ \"acc_stderr\": 0.03324717259397107,\n \"acc_norm\": 0.5957734427293545,\n\
\ \"acc_norm_stderr\": 0.0339416190415928,\n \"mc1\": 0.4614443084455324,\n\
\ \"mc1_stderr\": 0.017451384104637455,\n \"mc2\": 0.5943157054555347,\n\
\ \"mc2_stderr\": 0.01604355026591654\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6467576791808873,\n \"acc_stderr\": 0.013967822714840055,\n\
\ \"acc_norm\": 0.6697952218430034,\n \"acc_norm_stderr\": 0.013743085603760426\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6450906193985262,\n\
\ \"acc_stderr\": 0.004775079636567097,\n \"acc_norm\": 0.8408683529177454,\n\
\ \"acc_norm_stderr\": 0.003650512158306275\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n\
\ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.040166600304512336,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.040166600304512336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.024870815251057093,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.024870815251057093\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7354838709677419,\n \"acc_stderr\": 0.02509189237885928,\n \"\
acc_norm\": 0.7354838709677419,\n \"acc_norm_stderr\": 0.02509189237885928\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\"\
: 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153303,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153303\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.025028610276710862,\n\
\ \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.025028610276710862\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.03169380235712996,\n \
\ \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.03169380235712996\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7688073394495413,\n \"acc_stderr\": 0.018075750241633146,\n \"\
acc_norm\": 0.7688073394495413,\n \"acc_norm_stderr\": 0.018075750241633146\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.033247089118091176,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.033247089118091176\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935573,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935573\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\
\ \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n\
\ \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7752234993614304,\n\
\ \"acc_stderr\": 0.014927447101937148,\n \"acc_norm\": 0.7752234993614304,\n\
\ \"acc_norm_stderr\": 0.014927447101937148\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977247,\n\
\ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977247\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45251396648044695,\n\
\ \"acc_stderr\": 0.016646914804438775,\n \"acc_norm\": 0.45251396648044695,\n\
\ \"acc_norm_stderr\": 0.016646914804438775\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.027582811415159614,\n\
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.027582811415159614\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.026462487777001862,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.026462487777001862\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.38652482269503546,\n \"acc_stderr\": 0.029049190342543448,\n \
\ \"acc_norm\": 0.38652482269503546,\n \"acc_norm_stderr\": 0.029049190342543448\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41916558018252936,\n\
\ \"acc_stderr\": 0.012602244505788236,\n \"acc_norm\": 0.41916558018252936,\n\
\ \"acc_norm_stderr\": 0.012602244505788236\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.029722152099280065,\n\
\ \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.029722152099280065\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5980392156862745,\n \"acc_stderr\": 0.0198351764843754,\n \
\ \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.0198351764843754\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.030932858792789855,\n\
\ \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.030932858792789855\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169146,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4614443084455324,\n\
\ \"mc1_stderr\": 0.017451384104637455,\n \"mc2\": 0.5943157054555347,\n\
\ \"mc2_stderr\": 0.01604355026591654\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8105761641673244,\n \"acc_stderr\": 0.011012790432989245\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3525398028809704,\n \
\ \"acc_stderr\": 0.013159909755930323\n }\n}\n```"
repo_url: https://huggingface.co/v1olet/v1olet_merged_dpo_7B_v4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|arc:challenge|25_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|gsm8k|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hellaswag|10_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T13-46-12.224585.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-13T13-46-12.224585.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- '**/details_harness|winogrande|5_2023-12-13T13-46-12.224585.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-13T13-46-12.224585.parquet'
- config_name: results
data_files:
- split: 2023_12_13T13_46_12.224585
path:
- results_2023-12-13T13-46-12.224585.parquet
- split: latest
path:
- results_2023-12-13T13-46-12.224585.parquet
---
# Dataset Card for Evaluation run of v1olet/v1olet_merged_dpo_7B_v4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [v1olet/v1olet_merged_dpo_7B_v4](https://huggingface.co/v1olet/v1olet_merged_dpo_7B_v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v4",
"harness_winogrande_5",
	split="latest")
```
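The aggregated metrics can be fetched the same way from the "results" configuration. A minimal sketch (the helper name is ours; running it requires the `datasets` library and network access to the Hub):

```python
def load_aggregated_results(
    repo: str = "open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v4",
):
    """Load the aggregated "results" configuration at its latest split.

    Requires the `datasets` library and network access to the
    Hugging Face Hub.
    """
    # Imported lazily so the sketch stays importable offline.
    from datasets import load_dataset

    return load_dataset(repo, "results", split="latest")
```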
## Latest results
These are the [latest results from run 2023-12-13T13:46:12.224585](https://huggingface.co/datasets/open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v4/blob/main/results_2023-12-13T13-46-12.224585.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5917202264245718,
"acc_stderr": 0.03324717259397107,
"acc_norm": 0.5957734427293545,
"acc_norm_stderr": 0.0339416190415928,
"mc1": 0.4614443084455324,
"mc1_stderr": 0.017451384104637455,
"mc2": 0.5943157054555347,
"mc2_stderr": 0.01604355026591654
},
"harness|arc:challenge|25": {
"acc": 0.6467576791808873,
"acc_stderr": 0.013967822714840055,
"acc_norm": 0.6697952218430034,
"acc_norm_stderr": 0.013743085603760426
},
"harness|hellaswag|10": {
"acc": 0.6450906193985262,
"acc_stderr": 0.004775079636567097,
"acc_norm": 0.8408683529177454,
"acc_norm_stderr": 0.003650512158306275
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.040166600304512336,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.040166600304512336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.024870815251057093,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.024870815251057093
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153303,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153303
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5794871794871795,
"acc_stderr": 0.025028610276710862,
"acc_norm": 0.5794871794871795,
"acc_norm_stderr": 0.025028610276710862
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6092436974789915,
"acc_stderr": 0.03169380235712996,
"acc_norm": 0.6092436974789915,
"acc_norm_stderr": 0.03169380235712996
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7688073394495413,
"acc_stderr": 0.018075750241633146,
"acc_norm": 0.7688073394495413,
"acc_norm_stderr": 0.018075750241633146
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.033247089118091176,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.033247089118091176
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935573,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935573
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7752234993614304,
"acc_stderr": 0.014927447101937148,
"acc_norm": 0.7752234993614304,
"acc_norm_stderr": 0.014927447101937148
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977247,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977247
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45251396648044695,
"acc_stderr": 0.016646914804438775,
"acc_norm": 0.45251396648044695,
"acc_norm_stderr": 0.016646914804438775
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.027582811415159614,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.027582811415159614
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.026462487777001862,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.026462487777001862
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.38652482269503546,
"acc_stderr": 0.029049190342543448,
"acc_norm": 0.38652482269503546,
"acc_norm_stderr": 0.029049190342543448
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41916558018252936,
"acc_stderr": 0.012602244505788236,
"acc_norm": 0.41916558018252936,
"acc_norm_stderr": 0.012602244505788236
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.029722152099280065,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.029722152099280065
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.0198351764843754,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.0198351764843754
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.030932858792789855,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.030932858792789855
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169146,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169146
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4614443084455324,
"mc1_stderr": 0.017451384104637455,
"mc2": 0.5943157054555347,
"mc2_stderr": 0.01604355026591654
},
"harness|winogrande|5": {
"acc": 0.8105761641673244,
"acc_stderr": 0.011012790432989245
},
"harness|gsm8k|5": {
"acc": 0.3525398028809704,
"acc_stderr": 0.013159909755930323
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
anan-2024/twitter_dataset_1713171311 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 288875
num_examples: 763
download_size: 152940
dataset_size: 288875
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
succinctly/medium-titles-and-images | ---
license: apache-2.0
---
This dataset contains `<title, encoded_image>` pairs from [Medium](https://medium.com) articles. It was processed from the [Medium Articles Dataset (128k): Metadata + Images](https://www.kaggle.com/datasets/succinctlyai/medium-data) dataset on Kaggle.
The original images were processed in the following way:
1. Given an image of size `(w, h)`, we cropped a square of size `(n, n)` from the center of the image, where `n = min(w, h)`.
2. The resulting `(n, n)` image was resized to `(256, 256)`.
3. The resulting `(256, 256)` image was encoded into image tokens via the [dalle-mini/vqgan\_imagenet\_f16\_16384](https://huggingface.co/dalle-mini/vqgan_imagenet_f16_16384) model.
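Steps 1–2 above can be sketched as follows. This is a minimal illustration, not the authors' published preprocessing code: the crop-box arithmetic follows the description exactly, while the Pillow calls in the comments are an assumed implementation.

```python
def center_crop_box(w: int, h: int) -> tuple[int, int, int, int]:
    """Return the (left, upper, right, lower) box of the largest centered
    square inside a (w, h) image, as described in step 1."""
    n = min(w, h)  # side length n = min(w, h)
    left = (w - n) // 2
    upper = (h - n) // 2
    return (left, upper, left + n, upper + n)

# With Pillow, steps 1-2 would then be roughly:
#   square = img.crop(center_crop_box(*img.size))
#   square = square.resize((256, 256))

print(center_crop_box(640, 480))  # landscape image -> (80, 0, 560, 480)
```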
Note that this dataset contains ~128k entries and is too small for training a text-to-image model end to end; it is more suitable for operations on a pre-trained model
like [dalle-mini](https://huggingface.co/dalle-mini/dalle-mini) (fine-tuning, [prompt tuning](https://arxiv.org/pdf/2104.08691.pdf), etc.). |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/e9630c53 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 180
num_examples: 10
download_size: 1332
dataset_size: 180
---
# Dataset Card for "e9630c53"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Alv8450/thumbnails | ---
license: unknown
---
|
magnosfalcao/vini | ---
license: openrail
---
|
lisn519010/QM9 | ---
dataset_info:
features:
- name: x
sequence:
sequence: float32
- name: edge_index
sequence:
sequence: int64
- name: edge_attr
sequence:
sequence: float32
- name: 'y'
sequence:
sequence: float32
- name: pos
sequence:
sequence: float32
- name: z
sequence: int64
- name: name
dtype: string
- name: idx
sequence: int64
splits:
- name: full
num_bytes: 363615510
num_examples: 130831
download_size: 55326724
dataset_size: 363615510
task_categories:
- graph-ml
tags:
- chemistry
- biology
---
# Dataset Card for "QM9"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nuno-Tome/testedata | ---
configs:
- config_name: testedata_readme
data_files:
- split: pasta
path:
- '*.jpg'
- split: single
path: >-
leo0000023 -
Absolute_Reality_v16_a_funny_and_cute_under_construction_landi_0.jpg
language:
- pt
- en
tags:
- art
size_categories:
- 1B<n<10B
license: apache-2.0
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset aims to be a base template for new datasets and for testing code.
## Dataset Details
2 image files in jpg format
|
gorkemozkaya/blended_en_tr | ---
license: other
---
|
huggingartists/idktime | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/idktime"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.027776 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://assets.genius.com/images/default_avatar_300.png?1631807796')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/idktime">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">๐ค HuggingArtists Model ๐ค</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Idktime</div>
<a href="https://genius.com/artists/idktime">
<div style="text-align: center; font-size: 14px;">@idktime</div>
</a>
</div>
### Dataset Summary
A lyrics dataset parsed from Genius, designed for generating lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/idktime).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
You can load this dataset directly with the `datasets` library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/idktime")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train | validation | test |
|------:|-----------:|-----:|
|     2 |          - |    - |
The `train` split can easily be divided into `train`, `validation`, and `test` splits with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np

datasets = load_dataset("huggingartists/idktime")

# Desired split proportions
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03

# Cut the text list at the 90% and 97% boundaries
texts = datasets['train']['text']
train, validation, test = np.split(
    texts,
    [int(len(texts) * train_percentage),
     int(len(texts) * (train_percentage + validation_percentage))],
)

datasets = DatasetDict(
    {
        'train': Dataset.from_dict({'text': list(train)}),
        'validation': Dataset.from_dict({'text': list(validation)}),
        'test': Dataset.from_dict({'text': list(test)}),
    }
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
  author = {Aleksey Korshuk},
  year   = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
Villian7/Emotions_Data | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: label_text
dtype: string
splits:
- name: train
num_bytes: 109428773
num_examples: 1096869
- name: validation
num_bytes: 13025428
num_examples: 133105
- name: test
num_bytes: 13047201
num_examples: 133104
download_size: 77478115
dataset_size: 135501402
license: apache-2.0
---
# Dataset Card for "emotions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Intel/neuralchat_dataset_preprocessed | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_ZhangShenao__0.001_idpo_same_noreplacerej_declr_iter_3 | ---
pretty_name: Evaluation run of ZhangShenao/0.001_idpo_same_noreplacerej_declr_iter_3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ZhangShenao/0.001_idpo_same_noreplacerej_declr_iter_3](https://huggingface.co/ZhangShenao/0.001_idpo_same_noreplacerej_declr_iter_3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ZhangShenao__0.001_idpo_same_noreplacerej_declr_iter_3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-08T17:03:15.552837](https://huggingface.co/datasets/open-llm-leaderboard/details_ZhangShenao__0.001_idpo_same_noreplacerej_declr_iter_3/blob/main/results_2024-04-08T17-03-15.552837.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5994896770897471,\n\
\ \"acc_stderr\": 0.03320121450445251,\n \"acc_norm\": 0.6065086435623261,\n\
\ \"acc_norm_stderr\": 0.03391373134074509,\n \"mc1\": 0.3818849449204406,\n\
\ \"mc1_stderr\": 0.017008101939163498,\n \"mc2\": 0.5437844140253818,\n\
\ \"mc2_stderr\": 0.01585174860581118\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6040955631399317,\n \"acc_stderr\": 0.01429122839353659,\n\
\ \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.014137708601759091\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.661521609241187,\n\
\ \"acc_stderr\": 0.004722250355106684,\n \"acc_norm\": 0.8526190001991635,\n\
\ \"acc_norm_stderr\": 0.0035376085010691773\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099583,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099583\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396265,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396265\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n\
\ \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474894,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474894\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7258064516129032,\n\
\ \"acc_stderr\": 0.025378139970885203,\n \"acc_norm\": 0.7258064516129032,\n\
\ \"acc_norm_stderr\": 0.025378139970885203\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.031156269519646847,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.031156269519646847\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153314\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5666666666666667,\n \"acc_stderr\": 0.025124653525885113,\n\
\ \"acc_norm\": 0.5666666666666667,\n \"acc_norm_stderr\": 0.025124653525885113\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815635,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815635\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.03201650100739611,\n \
\ \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.03201650100739611\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.781651376146789,\n \"acc_stderr\": 0.017712600528722738,\n \"\
acc_norm\": 0.781651376146789,\n \"acc_norm_stderr\": 0.017712600528722738\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854053,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854053\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.02250903393707779,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.02250903393707779\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757433,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757433\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6676300578034682,\n \"acc_stderr\": 0.025361168749688214,\n\
\ \"acc_norm\": 0.6676300578034682,\n \"acc_norm_stderr\": 0.025361168749688214\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3407821229050279,\n\
\ \"acc_stderr\": 0.015852002449862103,\n \"acc_norm\": 0.3407821229050279,\n\
\ \"acc_norm_stderr\": 0.015852002449862103\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.027582811415159624,\n\
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.027582811415159624\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409825,\n\
\ \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409825\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4217731421121252,\n\
\ \"acc_stderr\": 0.012612974369390975,\n \"acc_norm\": 0.4217731421121252,\n\
\ \"acc_norm_stderr\": 0.012612974369390975\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6258169934640523,\n \"acc_stderr\": 0.019576953122088837,\n \
\ \"acc_norm\": 0.6258169934640523,\n \"acc_norm_stderr\": 0.019576953122088837\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6122448979591837,\n \"acc_stderr\": 0.031192230726795656,\n\
\ \"acc_norm\": 0.6122448979591837,\n \"acc_norm_stderr\": 0.031192230726795656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368032,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368032\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3818849449204406,\n\
\ \"mc1_stderr\": 0.017008101939163498,\n \"mc2\": 0.5437844140253818,\n\
\ \"mc2_stderr\": 0.01585174860581118\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7797947908445146,\n \"acc_stderr\": 0.011646276755089686\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.21834723275208492,\n \
\ \"acc_stderr\": 0.011379497266738047\n }\n}\n```"
repo_url: https://huggingface.co/ZhangShenao/0.001_idpo_same_noreplacerej_declr_iter_3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|arc:challenge|25_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|gsm8k|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hellaswag|10_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T17-03-15.552837.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T17-03-15.552837.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- '**/details_harness|winogrande|5_2024-04-08T17-03-15.552837.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-08T17-03-15.552837.parquet'
- config_name: results
data_files:
- split: 2024_04_08T17_03_15.552837
path:
- results_2024-04-08T17-03-15.552837.parquet
- split: latest
path:
- results_2024-04-08T17-03-15.552837.parquet
---
# Dataset Card for Evaluation run of ZhangShenao/0.001_idpo_same_noreplacerej_declr_iter_3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ZhangShenao/0.001_idpo_same_noreplacerej_declr_iter_3](https://huggingface.co/ZhangShenao/0.001_idpo_same_noreplacerej_declr_iter_3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ZhangShenao__0.001_idpo_same_noreplacerej_declr_iter_3",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-04-08T17:03:15.552837](https://huggingface.co/datasets/open-llm-leaderboard/details_ZhangShenao__0.001_idpo_same_noreplacerej_declr_iter_3/blob/main/results_2024-04-08T17-03-15.552837.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" config and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5994896770897471,
"acc_stderr": 0.03320121450445251,
"acc_norm": 0.6065086435623261,
"acc_norm_stderr": 0.03391373134074509,
"mc1": 0.3818849449204406,
"mc1_stderr": 0.017008101939163498,
"mc2": 0.5437844140253818,
"mc2_stderr": 0.01585174860581118
},
"harness|arc:challenge|25": {
"acc": 0.6040955631399317,
"acc_stderr": 0.01429122839353659,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.014137708601759091
},
"harness|hellaswag|10": {
"acc": 0.661521609241187,
"acc_stderr": 0.004722250355106684,
"acc_norm": 0.8526190001991635,
"acc_norm_stderr": 0.0035376085010691773
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099583,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099583
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396265,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396265
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.03267151848924777,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.03267151848924777
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474894,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474894
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7258064516129032,
"acc_stderr": 0.025378139970885203,
"acc_norm": 0.7258064516129032,
"acc_norm_stderr": 0.025378139970885203
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.031156269519646847,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.031156269519646847
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5666666666666667,
"acc_stderr": 0.025124653525885113,
"acc_norm": 0.5666666666666667,
"acc_norm_stderr": 0.025124653525885113
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815635,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815635
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5840336134453782,
"acc_stderr": 0.03201650100739611,
"acc_norm": 0.5840336134453782,
"acc_norm_stderr": 0.03201650100739611
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.781651376146789,
"acc_stderr": 0.017712600528722738,
"acc_norm": 0.781651376146789,
"acc_norm_stderr": 0.017712600528722738
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854053,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854053
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707779,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707779
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757433,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757433
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6676300578034682,
"acc_stderr": 0.025361168749688214,
"acc_norm": 0.6676300578034682,
"acc_norm_stderr": 0.025361168749688214
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3407821229050279,
"acc_stderr": 0.015852002449862103,
"acc_norm": 0.3407821229050279,
"acc_norm_stderr": 0.015852002449862103
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.027582811415159624,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.027582811415159624
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409825,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409825
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4217731421121252,
"acc_stderr": 0.012612974369390975,
"acc_norm": 0.4217731421121252,
"acc_norm_stderr": 0.012612974369390975
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6258169934640523,
"acc_stderr": 0.019576953122088837,
"acc_norm": 0.6258169934640523,
"acc_norm_stderr": 0.019576953122088837
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6122448979591837,
"acc_stderr": 0.031192230726795656,
"acc_norm": 0.6122448979591837,
"acc_norm_stderr": 0.031192230726795656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368032,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368032
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3818849449204406,
"mc1_stderr": 0.017008101939163498,
"mc2": 0.5437844140253818,
"mc2_stderr": 0.01585174860581118
},
"harness|winogrande|5": {
"acc": 0.7797947908445146,
"acc_stderr": 0.011646276755089686
},
"harness|gsm8k|5": {
"acc": 0.21834723275208492,
"acc_stderr": 0.011379497266738047
}
}
```
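For a quick sanity check, the aggregated MMLU accuracy can be recomputed from a results dictionary shaped like the JSON above. The sketch below uses a truncated stand-in `results` dict (three subtasks only), not the full evaluation output:

```python
# Recompute the macro-averaged accuracy over the MMLU (hendrycksTest) subtasks
# from a results dictionary shaped like the JSON above. The `results` dict
# here is a truncated illustrative stand-in, not the full evaluation output.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5777777777777777},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.618421052631579},
}

mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_macro_acc = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU macro accuracy over {len(mmlu_accs)} subtasks: {mmlu_macro_acc:.4f}")
```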
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Parleatacoeur/leyesperuanasactualizadas | ---
task_categories:
- text-generation
language:
- es
tags:
- legal
--- |
AwesomePeoplz257/trainset | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2881414016
num_examples: 3000
download_size: 453992987
dataset_size: 2881414016
---
# Dataset Card for "trainset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibranze/araproje_hellaswag_tr_conf_halfis | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 162703.0
num_examples: 250
download_size: 87170
dataset_size: 162703.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_tr_conf_halfis"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
botp/RyokoAI_CNNovel125K | ---
license: apache-2.0
language:
- zh
tags:
- novel
- training
task_categories:
- text-classification
- text-generation
pretty_name: CNNovel125K
size_categories:
- 100K<n<1M
duplicated_from: RyokoAI/CNNovel125K
---
# Dataset Card for CNNovel125K
*The BigKnow2022 dataset and its subsets are not yet complete; some information here may be inaccurate or inaccessible.*
## Dataset Description
- **Homepage:** (TODO)
- **Repository:** <https://github.com/RyokoAI/BigKnow2022>
- **Paper:** N/A
- **Leaderboard:** N/A
- **Point of Contact:** Ronsor/undeleted <ronsor@ronsor.com>
### Dataset Summary
CNNovel125K is a dataset composed of approximately 125,000 novels downloaded from the Chinese novel hosting site <http://ibiquw.com>.
### Supported Tasks and Leaderboards
This dataset is primarily intended for unsupervised training of text generation models; however, it may be useful for other purposes.
* text-classification
* text-generation
### Languages
* Simplified Chinese
## Dataset Structure
### Data Instances
```json
{
"text": "\n------------\n\nๅจ้จ็ซ ่\n\n\n------------\n\n็ฌฌไธ็ซ ๅฅน่ฏๅฎๅๆขฆๅข๏ผ\n\n HTๅฝ้ๅคง้ๅบๆป็ปๅฅๆฟใ\n\n ๆธๆจ็็ฌฌไธ็ผ้ณๅ็งๅฐ่ฟๅฃๅฐไบๅฅๅฐๆฟไธ๏ผๆด่ฝๅจๅไนฑ็ๅบๅไธ๏ผ็ช็ถๅฐ๏ผๅบไธ็ก็ๆญฃ็็ไบบ็ๅผ็ผ็๏ผ็็ถๆ้๏ผ\n\n ...",
"meta": {
"subset": "cnnovel.ibiquw",
"id": "100067",
"q": 0.9,
"lang": "zh_cn",
"title": "ไธบ็ฑๅฅๅฑ๏ผๅซ็ป็งฆๅ็",
"author": "ๅฅฅๅพท่จ"
}
}
{
"text": "\n------------\n\nๅจ้จ็ซ ่\n\n\n------------\n\n็ฌฌ1็ซ ๏ผๅบ็ฑๅฐฑๅคงๅฉ\n\n ๅๅ็ฌฌไธ็็ฑ๏ผๅคง้จ็ผ็ผๆๅผ๏ผ็งฆๅณฐไปฐ่ตทๅคด๏ผ่ดชๅฉช็ๅผๅธไบไธๅฃ็ฉบๆฐใ\n\n ไธๅนดไบ๏ผ็ปไบๅ้ปๅฐไบ่ช็ฑ็ๅณ้ใ\n\n ไปๅ่ฟๅคด๏ผ็็็ฎ้ไปๅบๆฅ็้ฃ็พคไบบ้๏ผ...",
"meta": {
"subset": "cnnovel.ibiquw",
"id": "100059",
"q": 0.9,
"lang": "zh_cn",
"title": "็ปไธๅผๅฉฟ",
"author": "็ปทๅธฆๆช"
}
}
```
### Data Fields
* `text`: the actual novel text, all chapters
* `meta`: entry metadata
* `subset`: dataset tag: `cnnovel.ibiquw`
* `id`: novel ID
* `q`: quality score, fixed at 0.9
* `lang`: always `zh_cn` (Simplified Chinese)
* `title`: novel title
* `author`: novel author
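A minimal sketch of filtering records by the metadata fields documented above; the two sample entries are illustrative placeholders, not real dataset records:

```python
# Filter BigKnow2022-style records on the documented `meta` fields.
# The two sample records are illustrative placeholders, not real entries.
entries = [
    {"text": "...", "meta": {"subset": "cnnovel.ibiquw", "id": "100067", "q": 0.9, "lang": "zh_cn"}},
    {"text": "...", "meta": {"subset": "other.source", "id": "42", "q": 0.5, "lang": "en"}},
]

# Keep only Simplified Chinese novels at or above the fixed quality score.
cnnovel = [
    e for e in entries
    if e["meta"]["subset"] == "cnnovel.ibiquw"
    and e["meta"]["lang"] == "zh_cn"
    and e["meta"]["q"] >= 0.9
]
print([e["meta"]["id"] for e in cnnovel])
```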
### Data Splits
No splitting of the data was performed.
## Dataset Creation
### Curation Rationale
TODO
### Source Data
#### Initial Data Collection and Normalization
TODO
#### Who are the source language producers?
The authors of each novel.
### Annotations
#### Annotation process
Titles were collected alongside the novel text and IDs.
#### Who are the annotators?
There were no human annotators.
### Personal and Sensitive Information
The dataset contains only works of fiction, and we do not believe it contains any PII.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended to be useful for anyone who wishes to train a model to generate "more entertaining" content in Chinese.
It may also be useful for other languages depending on your language model.
### Discussion of Biases
This dataset is composed of fictional works by various authors. Because of this fact, the contents of this dataset will reflect
the biases of those authors. Beware of stereotypes.
### Other Known Limitations
N/A
## Additional Information
### Dataset Curators
Ronsor Labs
### Licensing Information
Apache 2.0, for all parts of which Ronsor Labs or the Ryoko AI Production Committee may be considered authors. All other material is
distributed under fair use principles.
### Citation Information
```
@misc{ryokoai2023-bigknow2022,
title = {BigKnow2022: Bringing Language Models Up to Speed},
author = {Ronsor},
year = {2023},
howpublished = {\url{https://github.com/RyokoAI/BigKnow2022}},
}
```
### Contributions
Thanks to @ronsor (GH) for gathering this dataset. |
samitizerxu/algae-rgb | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '1'
'1': '2'
'2': '3'
'3': '4'
'4': '5'
'5': test
splits:
- name: train
num_bytes: 44920154.28
num_examples: 17035
- name: test
num_bytes: 17356455.604
num_examples: 6494
download_size: 61006757
dataset_size: 62276609.884
---
# Dataset Card for "algae-rgb"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/u_47_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of u_47/U-47 (Azur Lane)
This is the dataset of u_47/U-47 (Azur Lane), containing 152 images and their tags.
The core tags of this character are `black_hair, long_hair, breasts, red_eyes, multicolored_hair, streaked_hair, hair_between_eyes, white_hair, one_side_up, earrings, bangs, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 152 | 198.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_47_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 152 | 114.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_47_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 376 | 251.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_47_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 152 | 176.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_47_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 376 | 350.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/u_47_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/u_47_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
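Once loaded, items can be filtered by their tags. A dependency-free sketch over a tag mapping like the `meta['tags']` printed above; the sample items (filenames, tags, and scores) are illustrative stand-ins, not real waifuc image items:

```python
# Filter items by tag, using a tag -> score mapping like meta['tags'] above.
# The sample items are illustrative stand-ins for loaded waifuc image items.
items = [
    {"filename": "a.png", "tags": {"1girl": 0.99, "solo": 0.95, "iron_cross": 0.8}},
    {"filename": "b.png", "tags": {"1girl": 0.98, "glasses": 0.9}},
]

# Keep only items carrying every required tag.
required = {"1girl", "iron_cross"}
matches = [it["filename"] for it in items if required <= set(it["tags"])]
print(matches)
```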
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 |  |  |  |  |  | 1girl, black_one-piece_swimsuit, looking_at_viewer, solo, black_panties, iron_cross, black_gloves, cleavage, bare_shoulders, elbow_gloves, bandana, unzipped, covered_mouth, black_leotard, blush, black_thighhighs, covered_navel, simple_background, cross_earrings, zipper, white_background, bridal_gauntlets, clothing_cutout, scarf, sidelocks, meme_attire, cowboy_shot, side_ponytail, arm_strap, eyes_visible_through_hair |
| 1 | 10 |  |  |  |  |  | 1girl, black_one-piece_swimsuit, black_panties, iron_cross, looking_at_viewer, solo, black_thighhighs, elbow_gloves, bandana, cleavage, jewelry, medium_breasts, torpedo, unzipped, air_bubble, ass, bridal_gauntlets, underwater |
| 2 | 31 |  |  |  |  |  | 1girl, solo, looking_at_viewer, cleavage, glasses, navel, tank_top, jacket, black-framed_eyewear, bike_shorts, jewelry, iron_cross, off_shoulder, blush, under-rim_eyewear, very_long_hair, bare_shoulders, character_name, cross_choker, hair_ornament |
| 3 | 7 |  |  |  |  |  | 1girl, black_gloves, fingerless_gloves, short_sleeves, sidelocks, solo, cleavage, looking_at_viewer, navel, wine_glass, holding_cup, midriff, sitting, barrel, black_footwear, blush, indoors, jewelry, window, black_nails, crossed_bangs, iron_cross, knee_boots, miniskirt, nail_polish, open_mouth, pleated_skirt, red_skirt, side_ponytail, stomach, wine_bottle |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_one-piece_swimsuit | looking_at_viewer | solo | black_panties | iron_cross | black_gloves | cleavage | bare_shoulders | elbow_gloves | bandana | unzipped | covered_mouth | black_leotard | blush | black_thighhighs | covered_navel | simple_background | cross_earrings | zipper | white_background | bridal_gauntlets | clothing_cutout | scarf | sidelocks | meme_attire | cowboy_shot | side_ponytail | arm_strap | eyes_visible_through_hair | jewelry | medium_breasts | torpedo | air_bubble | ass | underwater | glasses | navel | tank_top | jacket | black-framed_eyewear | bike_shorts | off_shoulder | under-rim_eyewear | very_long_hair | character_name | cross_choker | hair_ornament | fingerless_gloves | short_sleeves | wine_glass | holding_cup | midriff | sitting | barrel | black_footwear | indoors | window | black_nails | crossed_bangs | knee_boots | miniskirt | nail_polish | open_mouth | pleated_skirt | red_skirt | stomach | wine_bottle |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------------------|:--------------------|:-------|:----------------|:-------------|:---------------|:-----------|:-----------------|:---------------|:----------|:-----------|:----------------|:----------------|:--------|:-------------------|:----------------|:--------------------|:-----------------|:---------|:-------------------|:-------------------|:------------------|:--------|:------------|:--------------|:--------------|:----------------|:------------|:----------------------------|:----------|:-----------------|:----------|:-------------|:------|:-------------|:----------|:--------|:-----------|:---------|:-----------------------|:--------------|:---------------|:--------------------|:-----------------|:-----------------|:---------------|:----------------|:--------------------|:----------------|:-------------|:--------------|:----------|:----------|:---------|:-----------------|:----------|:---------|:--------------|:----------------|:-------------|:------------|:--------------|:-------------|:----------------|:------------|:----------|:--------------|
| 0 | 22 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | X | | X | | X | X | X | | | | X | | | | | | X | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 31 |  |  |  |  |  | X | | X | X | | X | | X | X | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | X | X | | X | X | X | | | | | | | X | | | | | | | | | | X | | | X | | | X | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_8 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1118926264.0
num_examples: 219742
download_size: 1140517158
dataset_size: 1118926264.0
---
# Dataset Card for "chunk_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_pmlb_10000_Hill_Valley_with_noise_sgosdt_l256_dim10_d3_sd0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: input_y_clean
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 236440000
num_examples: 10000
- name: validation
num_bytes: 236440000
num_examples: 10000
download_size: 172085873
dataset_size: 472880000
---
# Dataset Card for "autotree_pmlb_10000_Hill_Valley_with_noise_sgosdt_l256_dim10_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/python3-standardized_cluster_2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 24594904
num_examples: 2170
download_size: 0
dataset_size: 24594904
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python3-standardized_cluster_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SEACrowd/indolem_ntp | ---
license: cc-by-4.0
tags:
- next-sentence-prediction
language:
- ind
---
# indolem_ntp
NTP (Next Tweet Prediction) is one of the tasks in the comprehensive IndoLEM Indonesian benchmark: given a list of tweets and a candidate option, the task is to predict whether the option is the next tweet or not.
This task is similar to the next sentence prediction (NSP) task used to train BERT (Devlin et al., 2019).
In NTP, each instance consists of a Twitter thread (containing 2 to 4 tweets) that we call the premise, and four possible options for the next tweet, one of which is the actual response from the original thread.
- Train: 5681 threads
- Development: 811 threads
- Test: 1890 threads
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
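As a sketch of the task format (the field names `premise`, `options`, and `label` here are hypothetical illustrations, not the actual NusaCrowd/Hugging Face schema), each thread can be paired with each candidate tweet for NSP-style scoring:

```python
# Illustrative sketch only: the field names ("premise", "options", "label")
# are hypothetical and may differ from the actual dataset schema.

def ntp_pairs(instance):
    """Expand one NTP instance into (premise, candidate) pairs for an
    NSP-style binary classifier: exactly one candidate is the true next tweet."""
    premise = " ".join(instance["premise"])  # thread of 2-4 tweets
    return [(premise, option) for option in instance["options"]]

example = {
    "premise": ["Just landed at the airport.", "Flight was delayed two hours..."],
    "options": [
        "Finally boarding!",
        "Here is my favourite rendang recipe.",
        "Anyone watching the match tonight?",
        "Math exam tomorrow.",
    ],
    "label": 0,  # index of the actual next tweet in the original thread
}

pairs = ntp_pairs(example)
# pairs[example["label"]] is the positive (premise, next-tweet) pair
```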
## Citation
```
@article{DBLP:journals/corr/abs-2011-00677,
author = {Fajri Koto and
Afshin Rahimi and
Jey Han Lau and
Timothy Baldwin},
title = {IndoLEM and IndoBERT: {A} Benchmark Dataset and Pre-trained Language
Model for Indonesian {NLP}},
journal = {CoRR},
volume = {abs/2011.00677},
year = {2020},
url = {https://arxiv.org/abs/2011.00677},
eprinttype = {arXiv},
eprint = {2011.00677},
timestamp = {Fri, 06 Nov 2020 15:32:47 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-2011-00677.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
## License
Creative Commons Attribution 4.0
## Homepage
[https://indolem.github.io/](https://indolem.github.io/)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
yezhengli9/wmt20-iu-en | ---
dataset_info:
features:
- name: id (string)
dtype: string
- name: translation (translation)
dtype: string
splits:
- name: train
num_bytes: 1714038
num_examples: 2971
download_size: 647356
dataset_size: 1714038
---
# Dataset Card for "wmt20-iu-en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nyuuzyou/wb-feedbacks | ---
annotations_creators:
- crowdsourced
language:
- ru
language_creators:
- crowdsourced
license:
- cc0-1.0
multilinguality:
- monolingual
pretty_name: Wildberries products
size_categories:
- 100M<n<1B
source_datasets:
- original
task_categories:
- text-generation
- text-classification
task_ids:
- language-modeling
---
# Dataset Card for Wildberries products
### Dataset Summary
The dataset contains product reviews from the Russian marketplace [Wildberries](https://www.wildberries.ru). It was collected by brute-forcing possible product identifiers (about 230 million) and querying all available feedbacks for them. The data are stored in zstd archives containing jsonl files. The `nmId` in the dataset usually corresponds to a valid product article on the site, but sometimes reviews remain retrievable via the API even when the product is hidden. The dataset includes only information from the reviews. For additional product data, refer to my other dataset collected from Wildberries, [wb-products](https://huggingface.co/datasets/nyuuzyou/wb-products), and merge the necessary data using the `nmId` identifier mentioned earlier. It is important to note that some fields in the dataset, particularly string fields, may be empty.
### Languages
The dataset is mostly in Russian, but there may be other languages present.
## Dataset Structure
### Data Fields
This dataset includes the following fields:
- `nmId`: Identifier for the item (integer)
- `productValuation`: Product valuation (integer)
- `color`: Color of the product (string)
- `text`: Text of the review (string)
- `answer`: Answer to the review (string)
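The `nmId`-based merge with the wb-products dataset mentioned in the summary can be sketched as follows (only `nmId`, `productValuation`, and `text` are documented fields; the product-side field `name` is illustrative):

```python
# Join review records with product records on the shared `nmId` key.
# `nmId`, `productValuation`, and `text` are documented fields; the product
# side (`name`) is an illustrative assumption.

def merge_on_nmid(reviews, products):
    by_id = {p["nmId"]: p for p in products}
    # keep only reviews whose nmId has a matching product record
    return [{**by_id[r["nmId"]], **r} for r in reviews if r["nmId"] in by_id]

products = [{"nmId": 123, "name": "T-shirt"}]
reviews = [
    {"nmId": 123, "productValuation": 5, "text": "Great quality"},
    {"nmId": 999, "productValuation": 2, "text": "No matching product"},
]

rows = merge_on_nmid(reviews, products)
# rows holds one merged record combining product and review fields
```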
### Data Splits
All examples are in the train split; there is no validation split.
## Additional Information
### License
This dataset is dedicated to the public domain under the Creative Commons Zero (CC0) license. This means you can:
* Use it for any purpose, including commercial projects.
* Modify it however you like.
* Distribute it without asking permission.
No attribution is required, but it's always appreciated!
CC0 license: https://creativecommons.org/publicdomain/zero/1.0/deed.en
To learn more about CC0, visit the Creative Commons website: https://creativecommons.org/publicdomain/zero/1.0/
### Dataset Curators
- [nyuuzyou](https://ducks.party)
|
blopen/JSWENI1 | ---
license: openrail
---
|
tiagoblima/qg_faquad | ---
dataset_info:
features:
- name: question
dtype: string
- name: paragraph_id
dtype: string
- name: paragraph
dtype: string
- name: answer
dtype: string
- name: paragraph_question
dtype: string
- name: paragraph_answer
dtype: string
- name: sentence
dtype: string
- name: answer_sentence
dtype: string
- name: paragraph_sentence
dtype: string
splits:
- name: test
num_bytes: 3748682.0
num_examples: 888
download_size: 1649534
dataset_size: 3748682.0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
mstz/spambase | ---
language:
- en
tags:
- spambase
- tabular_classification
- binary_classification
- UCI
pretty_name: Spambase
size_categories:
- 1K<n<10K
task_categories:
- tabular-classification
configs:
- spambase
license: cc
---
# Spambase
The [Spambase dataset](https://archive.ics.uci.edu/ml/datasets/Spambase) from the [UCI ML repository](https://archive.ics.uci.edu/ml/datasets).
Is a given email spam?
# Configurations and tasks
| **Configuration** | **Task** | **Description** |
|-------------------|---------------------------|------------------|
| spambase          | Binary classification     | Is the email spam?|
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("mstz/spambase")["train"]
``` |
open-llm-leaderboard/details_lex-hue__LexGPT-V3 | ---
pretty_name: Evaluation run of lex-hue/LexGPT-V3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lex-hue/LexGPT-V3](https://huggingface.co/lex-hue/LexGPT-V3) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lex-hue__LexGPT-V3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-04T20:35:12.431408](https://huggingface.co/datasets/open-llm-leaderboard/details_lex-hue__LexGPT-V3/blob/main/results_2024-04-04T20-35-12.431408.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.647154984215818,\n\
\ \"acc_stderr\": 0.03221441224437104,\n \"acc_norm\": 0.6487599114885558,\n\
\ \"acc_norm_stderr\": 0.032860268812293904,\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.017323088597314757,\n \"mc2\": 0.5998074537794252,\n\
\ \"mc2_stderr\": 0.015494960379071198\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.64419795221843,\n \"acc_stderr\": 0.01399057113791876,\n\
\ \"acc_norm\": 0.6646757679180887,\n \"acc_norm_stderr\": 0.013796182947785562\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6782513443537144,\n\
\ \"acc_stderr\": 0.004661924314756093,\n \"acc_norm\": 0.8590918143796057,\n\
\ \"acc_norm_stderr\": 0.003472157511639361\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119667,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119667\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544057,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n\
\ \"acc_stderr\": 0.02289168798455496,\n \"acc_norm\": 0.7967741935483871,\n\
\ \"acc_norm_stderr\": 0.02289168798455496\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6846153846153846,\n \"acc_stderr\": 0.02355964698318994,\n \
\ \"acc_norm\": 0.6846153846153846,\n \"acc_norm_stderr\": 0.02355964698318994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136094,\n\
\ \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136094\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8293577981651377,\n \"acc_stderr\": 0.016129271025099857,\n \"\
acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.016129271025099857\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503224,\n \
\ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503224\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n\
\ \"acc_stderr\": 0.03036037971029195,\n \"acc_norm\": 0.7130044843049327,\n\
\ \"acc_norm_stderr\": 0.03036037971029195\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.01377869377846408,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.01377869377846408\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508283,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508283\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.329608938547486,\n\
\ \"acc_stderr\": 0.01572153107518388,\n \"acc_norm\": 0.329608938547486,\n\
\ \"acc_norm_stderr\": 0.01572153107518388\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.02465968518596728,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.02465968518596728\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4817470664928292,\n\
\ \"acc_stderr\": 0.012761723960595472,\n \"acc_norm\": 0.4817470664928292,\n\
\ \"acc_norm_stderr\": 0.012761723960595472\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6454248366013072,\n \"acc_stderr\": 0.0193533605475537,\n \
\ \"acc_norm\": 0.6454248366013072,\n \"acc_norm_stderr\": 0.0193533605475537\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.02709729011807081,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.02709729011807081\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.017323088597314757,\n \"mc2\": 0.5998074537794252,\n\
\ \"mc2_stderr\": 0.015494960379071198\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7853196527229677,\n \"acc_stderr\": 0.011539912734345403\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6156178923426838,\n \
\ \"acc_stderr\": 0.013399219253698186\n }\n}\n```"
repo_url: https://huggingface.co/lex-hue/LexGPT-V3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|arc:challenge|25_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|gsm8k|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hellaswag|10_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-04T20-35-12.431408.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-04T20-35-12.431408.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- '**/details_harness|winogrande|5_2024-04-04T20-35-12.431408.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-04T20-35-12.431408.parquet'
- config_name: results
data_files:
- split: 2024_04_04T20_35_12.431408
path:
- results_2024-04-04T20-35-12.431408.parquet
- split: latest
path:
- results_2024-04-04T20-35-12.431408.parquet
---
# Dataset Card for Evaluation run of lex-hue/LexGPT-V3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [lex-hue/LexGPT-V3](https://huggingface.co/lex-hue/LexGPT-V3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lex-hue__LexGPT-V3",
"harness_winogrande_5",
	split="latest")
```
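Each per-task configuration follows a predictable naming scheme (for the 5-shot MMLU subtasks listed above, `harness_hendrycksTest_<task>_5`), so config names can also be built programmatically. The helper below is purely illustrative and not part of the `datasets` API:

```python
def mmlu_config_name(task: str, shots: int = 5) -> str:
    """Build the config name this dataset uses for an MMLU subtask,
    e.g. "anatomy" -> "harness_hendrycksTest_anatomy_5"."""
    return f"harness_hendrycksTest_{task}_{shots}"

print(mmlu_config_name("anatomy"))  # harness_hendrycksTest_anatomy_5
```

The resulting string can be passed as the second argument to `load_dataset` exactly as in the example above.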
## Latest results
These are the [latest results from run 2024-04-04T20:35:12.431408](https://huggingface.co/datasets/open-llm-leaderboard/details_lex-hue__LexGPT-V3/blob/main/results_2024-04-04T20-35-12.431408.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the per-task configurations and their "latest" splits):
```python
{
"all": {
"acc": 0.647154984215818,
"acc_stderr": 0.03221441224437104,
"acc_norm": 0.6487599114885558,
"acc_norm_stderr": 0.032860268812293904,
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314757,
"mc2": 0.5998074537794252,
"mc2_stderr": 0.015494960379071198
},
"harness|arc:challenge|25": {
"acc": 0.64419795221843,
"acc_stderr": 0.01399057113791876,
"acc_norm": 0.6646757679180887,
"acc_norm_stderr": 0.013796182947785562
},
"harness|hellaswag|10": {
"acc": 0.6782513443537144,
"acc_stderr": 0.004661924314756093,
"acc_norm": 0.8590918143796057,
"acc_norm_stderr": 0.003472157511639361
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119667,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119667
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544057,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.02289168798455496,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.02289168798455496
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6846153846153846,
"acc_stderr": 0.02355964698318994,
"acc_norm": 0.6846153846153846,
"acc_norm_stderr": 0.02355964698318994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02911661760608301,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02911661760608301
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.029472485833136094,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.029472485833136094
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099857,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099857
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503224,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503224
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.03036037971029195,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.03036037971029195
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.01377869377846408,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.01377869377846408
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508283,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.329608938547486,
"acc_stderr": 0.01572153107518388,
"acc_norm": 0.329608938547486,
"acc_norm_stderr": 0.01572153107518388
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.02465968518596728,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.02465968518596728
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4817470664928292,
"acc_stderr": 0.012761723960595472,
"acc_norm": 0.4817470664928292,
"acc_norm_stderr": 0.012761723960595472
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6454248366013072,
"acc_stderr": 0.0193533605475537,
"acc_norm": 0.6454248366013072,
"acc_norm_stderr": 0.0193533605475537
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786845,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786845
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.02709729011807081,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.02709729011807081
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314757,
"mc2": 0.5998074537794252,
"mc2_stderr": 0.015494960379071198
},
"harness|winogrande|5": {
"acc": 0.7853196527229677,
"acc_stderr": 0.011539912734345403
},
"harness|gsm8k|5": {
"acc": 0.6156178923426838,
"acc_stderr": 0.013399219253698186
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
dlwlrmaIU/power_control | ---
license: mit
---
|
Nexdata/Korean_Speech_Data | ---
---
# Dataset Card for Nexdata/Korean_Speech_Data
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/1008?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Korean audio data with a duration of 516 hours. Recorded texts include daily language, various interactive sentences, home commands, on-board commands, etc. Among the 1,077 speakers, 49% are male and 51% are female, and each speaker contributes around half an hour of audio.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1008?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train models for Automatic Speech Recognition (ASR) and speaker identification.
### Languages
Korean
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
autoevaluate/autoeval-staging-eval-project-xsum-69daf1dd-12935743 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: google/pegasus-cnn_dailymail
metrics: ['bleu']
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/pegasus-cnn_dailymail
* Dataset: xsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@xarymast](https://huggingface.co/xarymast) for evaluating this model. |
Dahoas/base_code_review | ---
dataset_info:
features:
- name: body
dtype: string
- name: comments
list:
- name: ContentLicense
dtype: string
- name: CreationDate
dtype: string
- name: Id
dtype: string
- name: Score
dtype: string
- name: body
dtype: string
- name: answers
list:
- name: body
dtype: string
- name: comments
list:
- name: ContentLicense
dtype: string
- name: CreationDate
dtype: string
- name: Id
dtype: string
- name: Score
dtype: string
- name: body
dtype: string
- name: meta_data
struct:
- name: CommentCount
dtype: string
- name: ContentLicense
dtype: string
- name: CreationDate
dtype: string
- name: Id
dtype: string
- name: ParentId
dtype: string
- name: Score
dtype: string
- name: meta_data
struct:
- name: AcceptedAnswerId
dtype: string
- name: CommentCount
dtype: string
- name: ContentLicense
dtype: string
- name: CreationDate
dtype: string
- name: Id
dtype: string
- name: Score
dtype: string
- name: Tags
sequence: string
- name: Title
dtype: string
- name: question_id
dtype: string
splits:
- name: train
num_bytes: 729807089
num_examples: 76003
download_size: 335610114
dataset_size: 729807089
---
# Dataset Card for "base_code_review"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
katxtong/tokenized_squad_validation_size356 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: offset_mapping
sequence:
sequence: int64
- name: example_id
dtype: string
splits:
- name: validation
num_bytes: 65884992
num_examples: 10784
download_size: 6124969
dataset_size: 65884992
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
common_language | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- ar
- br
- ca
- cnh
- cs
- cv
- cy
- de
- dv
- el
- en
- eo
- es
- et
- eu
- fa
- fr
- fy
- ia
- id
- it
- ja
- ka
- kab
- ky
- lv
- mn
- mt
- nl
- pl
- pt
- rm
- ro
- ru
- rw
- sah
- sl
- sv
- ta
- tr
- tt
- uk
- zh
license:
- cc-by-4.0
multilinguality:
- multilingual
size_categories:
- 100K<n<1M
source_datasets:
- extended|common_voice
task_categories:
- audio-classification
task_ids:
- speaker-identification
pretty_name: Common Language
language_bcp47:
- fy-NL
- rm-sursilv
- sv-SE
- zh-CN
- zh-HK
- zh-TW
dataset_info:
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: sentence
dtype: string
- name: age
dtype: string
- name: gender
dtype: string
- name: language
dtype:
class_label:
names:
'0': Arabic
'1': Basque
'2': Breton
'3': Catalan
'4': Chinese_China
'5': Chinese_Hongkong
'6': Chinese_Taiwan
'7': Chuvash
'8': Czech
'9': Dhivehi
'10': Dutch
'11': English
'12': Esperanto
'13': Estonian
'14': French
'15': Frisian
'16': Georgian
'17': German
'18': Greek
'19': Hakha_Chin
'20': Indonesian
'21': Interlingua
'22': Italian
'23': Japanese
'24': Kabyle
'25': Kinyarwanda
'26': Kyrgyz
'27': Latvian
'28': Maltese
'29': Mangolian
'30': Persian
'31': Polish
'32': Portuguese
'33': Romanian
'34': Romansh_Sursilvan
'35': Russian
'36': Sakha
'37': Slovenian
'38': Spanish
'39': Swedish
'40': Tamil
'41': Tatar
'42': Turkish
'43': Ukranian
'44': Welsh
config_name: full
splits:
- name: train
num_bytes: 7116761
num_examples: 22194
- name: validation
num_bytes: 1855233
num_examples: 5888
- name: test
num_bytes: 1877970
num_examples: 5963
download_size: 3761951178
dataset_size: 10849964
---
# Dataset Card for common_language
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://zenodo.org/record/5036977
- **Repository:** https://github.com/speechbrain/speechbrain/tree/develop/recipes/CommonLanguage
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Leaderboard:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Dataset Summary
This dataset is composed of speech recordings from languages that were carefully selected from the CommonVoice database. The total duration of audio recordings is 45.1 hours (i.e., 1 hour of material for each language). The dataset has been extracted from CommonVoice to train language-id systems.
### Supported Tasks and Leaderboards
The baselines for language-id are available in the SpeechBrain toolkit (see recipes/CommonLanguage):
https://github.com/speechbrain/speechbrain
### Languages
List of included languages:
```
Arabic, Basque, Breton, Catalan, Chinese_China, Chinese_Hongkong, Chinese_Taiwan, Chuvash, Czech, Dhivehi, Dutch, English, Esperanto, Estonian, French, Frisian, Georgian, German, Greek, Hakha_Chin, Indonesian, Interlingua, Italian, Japanese, Kabyle, Kinyarwanda, Kyrgyz, Latvian, Maltese, Mongolian, Persian, Polish, Portuguese, Romanian, Romansh_Sursilvan, Russian, Sakha, Slovenian, Spanish, Swedish, Tamil, Tatar, Turkish, Ukranian, Welsh
```
## Dataset Structure
### Data Instances
A typical data point comprises the `path` to the audio file, and its label `language`. Additional fields include `age`, `client_id`, `gender` and `sentence`.
```python
{
'client_id': 'itln_trn_sp_175',
'path': '/path/common_voice_kpd/Italian/train/itln_trn_sp_175/common_voice_it_18279446.wav',
'audio': {'path': '/path/common_voice_kpd/Italian/train/itln_trn_sp_175/common_voice_it_18279446.wav',
'array': array([-0.00048828, -0.00018311, -0.00137329, ..., 0.00079346, 0.00091553, 0.00085449], dtype=float32),
'sampling_rate': 48000},
 'sentence': 'Con gli studenti è leggermente simile.',
'age': 'not_defined',
'gender': 'not_defined',
'language': 22
}
```
### Data Fields
- `client_id` (`string`): An id identifying the client (voice) that made the recording
- `path` (`string`): The path to the audio file
- `audio` (`dict`): A dictionary containing the path to the downloaded audio file, the decoded audio array, and the sampling rate. Note that when accessing the audio column, `dataset[0]["audio"]`, the audio file is automatically decoded and resampled to `dataset.features["audio"].sampling_rate`. Decoding and resampling a large number of audio files can take a significant amount of time, so it is important to query the sample index before the `"audio"` column, *i.e.* `dataset[0]["audio"]` should **always** be preferred over `dataset["audio"][0]`.
- `language` (`ClassLabel`): The language of the recording (see the `Languages` section above)
- `sentence` (`string`): The sentence the speaker was prompted to read
- `age` (`string`): The age of the speaker
- `gender` (`string`): The gender of the speaker
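The access-order advice above can be illustrated with a toy sketch. The `LazyAudioDataset` class below is a hypothetical stand-in, not the real `datasets` implementation: selecting a row first decodes a single file, while materializing the whole column decodes every file.

```python
class LazyAudioDataset:
    """Toy stand-in for a dataset with a lazily decoded audio column."""

    def __init__(self, paths):
        self.paths = paths
        self.decode_count = 0  # how many files have been decoded so far

    def _decode(self, path):
        self.decode_count += 1
        return {"path": path, "array": [0.0], "sampling_rate": 48000}

    def __getitem__(self, key):
        if isinstance(key, int):   # row access: decode one file only
            return {"audio": self._decode(self.paths[key])}
        if key == "audio":         # column access: decode every file
            return [self._decode(p) for p in self.paths]
        raise KeyError(key)

ds = LazyAudioDataset(["a.wav", "b.wav", "c.wav"])
_ = ds[0]["audio"]        # decodes a single file
assert ds.decode_count == 1
_ = ds["audio"][0]        # decodes all three files just to read one
assert ds.decode_count == 4
```

The real `datasets` library behaves analogously, which is why the row index should come before the `"audio"` column.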
### Data Splits
The dataset is already balanced and split into train, dev (validation) and test sets.
| Name | Train | Dev | Test |
|:---------------------------------:|:------:|:------:|:-----:|
| **# of utterances** | 177552 | 47104 | 47704 |
| **# unique speakers** | 11189 | 1297 | 1322 |
| **Total duration, hr** | 30.04 | 7.53 | 7.53 |
| **Min duration, sec** | 0.86 | 0.98 | 0.89 |
| **Mean duration, sec** | 4.87 | 4.61 | 4.55 |
| **Max duration, sec** | 21.72 | 105.67 | 29.83 |
| **Duration per language, min** | ~40 | ~10 | ~10 |
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
## Considerations for Using the Data
### Social Impact of Dataset
The dataset consists of people who have donated their voice online. You agree to not attempt to determine the identity of speakers in the Common Voice dataset.
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
The Mongolian and Ukrainian languages are spelled as "Mangolian" and "Ukranian" in this version of the dataset.
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[Ganesh Sinisetty; Pavlo Ruban; Oleksandr Dymov; Mirco Ravanelli](https://zenodo.org/record/5036977#.YdTZ5hPMJ70)
### Licensing Information
[Creative Commons Attribution 4.0 International](https://creativecommons.org/licenses/by/4.0/legalcode)
### Citation Information
```
@dataset{ganesh_sinisetty_2021_5036977,
author = {Ganesh Sinisetty and
Pavlo Ruban and
Oleksandr Dymov and
Mirco Ravanelli},
title = {CommonLanguage},
month = jun,
year = 2021,
publisher = {Zenodo},
version = {0.1},
doi = {10.5281/zenodo.5036977},
url = {https://doi.org/10.5281/zenodo.5036977}
}
```
### Contributions
Thanks to [@anton-l](https://github.com/anton-l) for adding this dataset. |
nikchar/retrieval_verification_bm25_squeezebert_v2 | ---
dataset_info:
features:
- name: claim
dtype: string
- name: evidence_wiki_url
dtype: string
- name: text
dtype: string
- name: retrieved_evidence_title
sequence: string
- name: retrieved_evidence_text
sequence: string
- name: labels
dtype: int64
- name: Retrieval_Success
dtype: bool
- name: Predicted_Labels
dtype: int64
- name: Predicted_Labels_Each_doc
sequence: int64
splits:
- name: train
num_bytes: 66031496
num_examples: 11073
download_size: 30811918
dataset_size: 66031496
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "retrieval_verification_bm25_squeezebert_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_rte_fixin_future | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 112582
num_examples: 242
- name: train
num_bytes: 95330
num_examples: 203
download_size: 140715
dataset_size: 207912
---
# Dataset Card for "MULTI_VALUE_rte_fixin_future"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thepurpleowl/codequeries | ---
annotations_creators:
- expert-generated
language:
- code
language_creators:
- found
multilinguality:
- monolingual
pretty_name: codequeries
size_categories:
- 100K<n<1M
source_datasets:
- original
tags:
- neural modeling of code
- code question answering
- code semantic understanding
task_categories:
- question-answering
task_ids:
- extractive-qa
license:
- apache-2.0
---
# Dataset Card for CodeQueries
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [How to use](#how-to-use)
- [Data Splits and Data Fields](#data-splits-and-data-fields)
- [Dataset Creation](#dataset-creation)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** [Data](https://huggingface.co/datasets/thepurpleowl/codequeries)
- **Repository:** [Code](https://github.com/thepurpleowl/codequeries-benchmark)
- **Paper:**
### Dataset Summary
CodeQueries is a dataset to evaluate the ability of neural networks to answer semantic queries over code. Given a query and code, a model is expected to identify answer and supporting-fact spans in the code for the query. This is extractive question-answering over code, for questions with a large scope (entire files) and complexity including both single- and multi-hop reasoning.
### Supported Tasks and Leaderboards
Extractive question answering for code, semantic understanding of code.
### Languages
The dataset contains code context from `python` files.
## Dataset Structure
### How to Use
The dataset can be used directly with the Hugging Face `datasets` package. You can load and iterate through the dataset for the proposed five settings with a few lines of code:
```python
import datasets
# in addition to `twostep`, the other supported settings are <ideal/file_ideal/prefix>.
ds = datasets.load_dataset("thepurpleowl/codequeries", "twostep", split=datasets.Split.TEST)
print(next(iter(ds)))
#OUTPUT:
{'query_name': 'Unused import',
'code_file_path': 'rcbops/glance-buildpackage/glance/tests/unit/test_db.py',
'context_block': {'content': '# vim: tabstop=4 shiftwidth=4 softtabstop=4\n\n# Copyright 2010-2011 OpenStack, LLC\ ...',
'metadata': 'root',
'header': "['module', '___EOS___']",
'index': 0},
'answer_spans': [{'span': 'from glance.common import context',
'start_line': 19,
'start_column': 0,
'end_line': 19,
'end_column': 33}
],
'supporting_fact_spans': [],
'example_type': 1,
'single_hop': False,
'subtokenized_input_sequence': ['[CLS]_', 'Un', 'used_', 'import_', '[SEP]_', 'module_', '\\u\\u\\uEOS\\u\\u\\u_', '#', ' ', 'vim', ':', ...],
'label_sequence': [4, 4, 4, 4, 4, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, ...],
'relevance_label': 1
}
```
### Data Splits and Data Fields
Detailed information on the data splits for proposed settings can be found in the paper.
In general, data splits in all the proposed settings have examples with the following fields -
```
- query_name (query name to uniquely identify the query)
- code_file_path (relative source file path w.r.t. ETH Py150 corpus)
- context_blocks (code blocks as context with metadata) [`prefix` setting doesn't have this field and `twostep` has `context_block`]
- answer_spans (answer spans with metadata)
- supporting_fact_spans (supporting-fact spans with metadata)
 - example_type (1 (positive) or 0 (negative) example type)
- single_hop (True or False - for query type)
- subtokenized_input_sequence (example subtokens) [`prefix` setting has the corresponding token ids]
- label_sequence (example subtoken labels)
- relevance_label (0 (not relevant) or 1 (relevant) - relevance label of a block) [only `twostep` setting has this field]
```
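As an illustrative sketch, the span metadata documented above can be used to slice an answer span back out of a file's contents. The `extract_span` helper below is hypothetical (not part of the benchmark code), and the assumption that the line/column offsets are 0-indexed is ours:

```python
def extract_span(source: str, span: dict) -> str:
    """Slice a span out of source code using (assumed 0-indexed)
    start/end line and column metadata, as in `answer_spans`."""
    lines = source.splitlines()
    if span["start_line"] == span["end_line"]:
        return lines[span["start_line"]][span["start_column"]:span["end_column"]]
    parts = [lines[span["start_line"]][span["start_column"]:]]
    parts += lines[span["start_line"] + 1:span["end_line"]]
    parts.append(lines[span["end_line"]][:span["end_column"]])
    return "\n".join(parts)

# Toy file contents; the span dict mirrors the "Unused import" example above.
code = "\n".join(["import os", "from glance.common import context", "print('hi')"])
span = {"start_line": 1, "start_column": 0, "end_line": 1, "end_column": 33}
assert extract_span(code, span) == "from glance.common import context"
```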
## Dataset Creation
The dataset is created using [ETH Py150 Open dataset](https://github.com/google-research-datasets/eth_py150_open) as source for code contexts. To get semantic queries and corresponding answer/supporting-fact spans in ETH Py150 Open corpus files, CodeQL was used.
## Additional Information
### Licensing Information
The source code repositories used for preparing CodeQueries are based on the [ETH Py150 Open dataset](https://github.com/google-research-datasets/eth_py150_open) and are redistributable under the respective licenses. A Huggingface dataset for ETH Py150 Open is available [here](https://huggingface.co/datasets/eth_py150_open). The labeling prepared and provided by us as part of CodeQueries is released under the Apache-2.0 license.
|
Fillster/plants | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 36241488.0
num_examples: 40
download_size: 36230209
dataset_size: 36241488.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bowphs/cc-100-01-percent-untokenized | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 29257739786
num_examples: 147182603
download_size: 22427356397
dataset_size: 29257739786
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "cc-100-01-percent-untokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
daokang/bidai | ---
license: other
---
|
yardeny/tokenized_gpt2_context_len_64 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 8074990564
num_examples: 80462898
download_size: 3552230822
dataset_size: 8074990564
---
# Dataset Card for "tokenized_gpt2_context_len_64"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/Re-ID_Data_in_Surveillance_Scenes | ---
---
# Dataset Card for Nexdata/Re-ID_Data_in_Surveillance_Scenes
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/1129?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
10,000 People - Re-ID Data in Surveillance Scenes. The data covers indoor and outdoor scenes and includes male and female subjects, with an age distribution ranging from children to the elderly. Its diversity spans different age groups, time periods, shooting angles, human body orientations and postures, and clothing for different seasons. For annotation, rectangular bounding boxes and 15 human body attributes were labeled. The data can be used for re-identification (re-ID) and other tasks.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1129?source=Huggingface
### Supported Tasks and Leaderboards
person-re-identification, computer-vision: The dataset can be used to train a model for person re-identification (re-ID).
### Languages
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions |
alfredplpl/wikipedia-qa-ja-500k | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 142049495
num_examples: 516932
download_size: 65635910
dataset_size: 142049495
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc-by-sa-3.0
task_categories:
- question-answering
language:
- ja
---
# Dataset Card for "wikipedia-qa-ja-500k"
# Original Dataset
- hpprc/wikipedia-20240101
# Procedure
- Extract each article's title and first line from the dataset.
- Generate the answer by summarizing that line with an LLM:
  - Feed a RAG-like prompt to CALM 2 7B Chat.
  - Format the response.
# RAG-like Prompt
```python
# The prompt asks: "What is {title}? Please summarize it in one sentence,
# referring to the following text: {text}"
f"""USER: {title}とはなんですか?次の文章を参考に一言でまとめてください。{text}
ASSISTANT: """
``` |
oneonlee/cleansed_emocontext | ---
annotations_creators:
- expert-generated
language_creators:
- crowdsourced
license: mpl-2.0
task_categories:
- text-classification
task_ids:
- sentiment-classification
language:
- en
tags:
- conversation
size_categories:
- 10K<n<100K
source_datasets:
- emo
pretty_name: Cleansed_EmoContext
dataset_info:
features:
- name: turn1
dtype: string
- name: turn2
dtype: string
- name: turn3
dtype: string
- name: label
dtype:
class_label:
names:
"0": others
"1": happy
"2": sad
"3": angry
config_name: cleansed_emo2019
# splits:
# - name: train
# num_bytes: 2433205
# num_examples: 30160
# - name: test
# num_bytes: 421555
# num_examples: 5509
# download_size: 3362556
# dataset_size: 2854760
---
# Dataset Card for "cleansed_emocontext"
- `cleansed_emocontext` is a **cleansed and normalized version** of [`emo`](https://huggingface.co/datasets/emo).
- For cleansing and normalization, [`data_cleaning.py`](https://github.com/oneonlee/cleansed_emocontext/blob/master/helpers/data_cleaning.py) was used, [modifying the code](https://github.com/oneonlee/cleansed_emocontext/commit/c09b020dfb49692a1c5fcd2099d531503d9bb8b5#diff-266912260148f110c4e7fe00b6cdef4c23b024dca8c693a0dd3c83f25ba56f54) provided on the [official EmoContext GitHub](https://github.com/DhruvDh/emocontext).
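The exact cleansing steps live in the linked `data_cleaning.py`; as a rough, hypothetical illustration of this kind of text normalization (not the actual script):

```python
import re

def normalize(text: str) -> str:
    """Illustrative cleansing only: lowercase, collapse whitespace, strip edges."""
    text = text.lower()
    text = re.sub(r"\s+", " ", text)  # collapse runs of whitespace into one space
    return text.strip()

print(normalize("  Don't   WORRY \n I'm girl "))
```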
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [SemEval-2019 Task 3: EmoContext Contextual Emotion Detection in Text](https://aclanthology.org/S19-2005/)
- **Repository:** [DhruvDh/emocontext](https://github.com/DhruvDh/emocontext)
- **Paper:** [SemEval-2019 Task 3: EmoContext Contextual Emotion Detection in Text](https://aclanthology.org/S19-2005/)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 3.37 MB
- **Size of the generated dataset:** 2.85 MB
- **Total amount of disk used:** 6.22 MB
### Dataset Summary
In this dataset, given a textual dialogue (an utterance along with the two previous turns of context), the goal is to infer the underlying emotion of the utterance by choosing from four emotion classes: Happy, Sad, Angry, and Others.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### cleansed_emo2019
An example of 'train' looks as follows.
```
{
"label": 0,
"turn1": "don't worry i'm girl",
"turn2": "hmm how do i know if you are",
"turn3": "what's your name ?"
}
```
### Data Fields
The data fields are the same among all splits.
#### cleansed_emo2019
- `turn1`, `turn2`, `turn3`: a `string` feature.
- `label`: a classification label, with possible values including `others` (0), `happy` (1), `sad` (2), `angry` (3).
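The integer labels map to names in the order listed above; a minimal sketch of decoding them without downloading the data:

```python
# Label names in index order, per the dataset card.
LABEL_NAMES = ["others", "happy", "sad", "angry"]

def id2label(label_id: int) -> str:
    return LABEL_NAMES[label_id]

print(id2label(0))  # others
```

With the `datasets` library, the same mapping should be available from the feature itself, e.g. `dataset.features["label"].int2str(0)`.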
### Data Splits
| name | train | dev | test |
| ---------------- | ----: | ---: | ---: |
| cleansed_emo2019 | 30160 | 2755 | 5509 |
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@inproceedings{chatterjee-etal-2019-semeval,
title={SemEval-2019 Task 3: EmoContext Contextual Emotion Detection in Text},
author={Ankush Chatterjee and Kedhar Nath Narahari and Meghana Joshi and Puneet Agrawal},
booktitle={Proceedings of the 13th International Workshop on Semantic Evaluation},
year={2019},
address={Minneapolis, Minnesota, USA},
publisher={Association for Computational Linguistics},
url={https://www.aclweb.org/anthology/S19-2005},
doi={10.18653/v1/S19-2005},
pages={39--48},
abstract={In this paper, we present the SemEval-2019 Task 3 - EmoContext: Contextual Emotion Detection in Text. Lack of facial expressions and voice modulations make detecting emotions in text a challenging problem. For instance, as humans, on reading ''Why don't you ever text me!'' we can either interpret it as a sad or angry emotion and the same ambiguity exists for machines. However, the context of dialogue can prove helpful in detection of the emotion. In this task, given a textual dialogue i.e. an utterance along with two previous turns of context, the goal was to infer the underlying emotion of the utterance by choosing from four emotion classes - Happy, Sad, Angry and Others. To facilitate the participation in this task, textual dialogues from user interaction with a conversational agent were taken and annotated for emotion classes after several data processing steps. A training data set of 30160 dialogues, and two evaluation data sets, Test1 and Test2, containing 2755 and 5509 dialogues respectively were released to the participants. A total of 311 teams made submissions to this task. The final leader-board was evaluated on Test2 data set, and the highest ranked submission achieved 79.59 micro-averaged F1 score. Our analysis of systems submitted to the task indicate that Bi-directional LSTM was the most common choice of neural architecture used, and most of the systems had the best performance for the Sad emotion class, and the worst for the Happy emotion class}
}
```
|
Guke/imoto_sora | ---
license: mit
---
|
vaishali/geoQuery-tableQA | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: query
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: table_names
sequence: string
- name: tables
sequence: string
- name: source
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 9440328
num_examples: 530
- name: validation
num_bytes: 829668
num_examples: 49
- name: test
num_bytes: 4626906
num_examples: 253
download_size: 1988975
dataset_size: 14896902
task_categories:
- table-question-answering
---
# Dataset Card for "geoQuery-tableQA"
# Usage
```python
import pandas as pd
from datasets import load_dataset

geoQuery_tableQA = load_dataset("vaishali/geoQuery-tableQA")

for sample in geoQuery_tableQA['train']:
    question = sample['question']
    input_table_names = sample["table_names"]
    # tables and answers are stored as JSON strings (pandas orient='split')
    input_tables = [pd.read_json(table, orient='split') for table in sample['tables']]
    answer = pd.read_json(sample['answer'], orient='split')
    # flattened input/output for sequence-to-sequence models
    input_to_model = sample["source"]
    target = sample["target"]
```
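The serialized tables use pandas' `orient='split'` layout (`columns`/`index`/`data` keys); a small stdlib-only sketch of that layout, with hypothetical values:

```python
import json

# A one-row table serialized the way pandas' to_json(orient='split') does.
table_json = '{"columns":["state","area"],"index":[0],"data":[["alaska",1723337]]}'

obj = json.loads(table_json)
# Recover row dicts by pairing each data row with the column names.
rows = [dict(zip(obj["columns"], row)) for row in obj["data"]]
print(rows[0]["state"])
```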
# BibTeX entry and citation info
```
@inproceedings{pal-etal-2023-multitabqa,
title = "{M}ulti{T}ab{QA}: Generating Tabular Answers for Multi-Table Question Answering",
author = "Pal, Vaishali and
Yates, Andrew and
Kanoulas, Evangelos and
de Rijke, Maarten",
booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.acl-long.348",
doi = "10.18653/v1/2023.acl-long.348",
pages = "6322--6334",
abstract = "Recent advances in tabular question answering (QA) with large language models are constrained in their coverage and only answer questions over a single table. However, real-world queries are complex in nature, often over multiple tables in a relational database or web page. Single table questions do not involve common table operations such as set operations, Cartesian products (joins), or nested queries. Furthermore, multi-table operations often result in a tabular output, which necessitates table generation capabilities of tabular QA models. To fill this gap, we propose a new task of answering questions over multiple tables. Our model, MultiTabQA, not only answers questions over multiple tables, but also generalizes to generate tabular answers. To enable effective training, we build a pre-training dataset comprising of 132,645 SQL queries and tabular answers. Further, we evaluate the generated tables by introducing table-specific metrics of varying strictness assessing various levels of granularity of the table structure. MultiTabQA outperforms state-of-the-art single table QA models adapted to a multi-table QA setting by finetuning on three datasets: Spider, Atis and GeoQuery.",
}
```
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Norod78/lego-blip-captions-512 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 627030265.0
num_examples: 2511
download_size: 625119749
dataset_size: 627030265.0
---
# Dataset Card for "lego-blip-captions-512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deepasara/reuters_articles | ---
dataset_info:
features:
- name: title
dtype: string
- name: body
dtype: string
splits:
- name: train
num_bytes: 13792576
num_examples: 17262
- name: validation
num_bytes: 1870389
num_examples: 2158
- name: test
num_bytes: 1379190
num_examples: 2158
download_size: 10073414
dataset_size: 17042155
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Melanit/testsubset | ---
dataset_info:
features:
- name: name
struct:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: 'null'
- name: sampling_rate
dtype: int64
splits:
- name: example
num_bytes: 61573340
num_examples: 73
download_size: 14585181
dataset_size: 61573340
---
# Dataset Card for "testsubset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LukeEuser/docvqa_30_unanswerable_questions | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: query
struct:
- name: de
dtype: string
- name: en
dtype: string
- name: es
dtype: string
- name: fr
dtype: string
- name: it
dtype: string
- name: answers
sequence: string
- name: words
sequence: string
- name: bounding_boxes
sequence:
sequence: float32
length: 4
- name: answer
struct:
- name: match_score
dtype: float64
- name: matched_text
dtype: string
- name: start
dtype: int64
- name: text
dtype: string
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 33130841.0
num_examples: 100
- name: test
num_bytes: 6102508.0
num_examples: 20
download_size: 13284819
dataset_size: 39233349.0
---
# Dataset Card for "docvqa_30_unanswerable_questions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yukihiratype2/test | ---
license: apache-2.0
---
|
philschmid/translated_tasks_de_google_52k | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 22108071
num_examples: 51664
download_size: 13686739
dataset_size: 22108071
---
# Dataset Card for "translated_tasks_de_google_52k"
Copy of: https://github.com/thisserand/alpaca-lora-finetune-language/tree/main/data/translated |
HuggingFaceH4/h4-anthropic-hh-rlhf-helpful-base-gen | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_gen
num_bytes: 25260945
num_examples: 43835
- name: test_gen
num_bytes: 1354536
num_examples: 2354
download_size: 14895550
dataset_size: 26615481
configs:
- config_name: default
data_files:
- split: train_gen
path: data/train_gen-*
- split: test_gen
path: data/test_gen-*
---
|
Fredithefish/openassistant-guanaco-unfiltered | ---
license: apache-2.0
task_categories:
- conversational
language:
- en
- de
- fr
- es
size_categories:
- 1K<n<10K
---
# Guanaco-Unfiltered
A cleaned variant of the OpenAssistant Guanaco dataset:
- Any language other than English, German, French, or Spanish has been removed.
- Refusals of assistance have been removed.
- Self-identification as OpenAssistant has been removed.
## [Version 2 is out](https://huggingface.co/datasets/Fredithefish/openassistant-guanaco-unfiltered/blob/main/guanaco-unfiltered-v2.jsonl)
- Identification as OpenAssistant is now fully removed
- Other improvements |
open-llm-leaderboard/details_frankenmerger__gemoy-4b-instruct-scientific | ---
pretty_name: Evaluation run of frankenmerger/gemoy-4b-instruct-scientific
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [frankenmerger/gemoy-4b-instruct-scientific](https://huggingface.co/frankenmerger/gemoy-4b-instruct-scientific)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_frankenmerger__gemoy-4b-instruct-scientific\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T11:10:43.531199](https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__gemoy-4b-instruct-scientific/blob/main/results_2024-03-10T11-10-43.531199.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3887480937200939,\n\
\ \"acc_stderr\": 0.033967527013847434,\n \"acc_norm\": 0.3919353670094879,\n\
\ \"acc_norm_stderr\": 0.03472007289325813,\n \"mc1\": 0.26438188494492043,\n\
\ \"mc1_stderr\": 0.015438211119522514,\n \"mc2\": 0.4195962328166831,\n\
\ \"mc2_stderr\": 0.014414337460874078\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.39334470989761094,\n \"acc_stderr\": 0.014275101465693026,\n\
\ \"acc_norm\": 0.4197952218430034,\n \"acc_norm_stderr\": 0.014422181226303026\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.46106353316072496,\n\
\ \"acc_stderr\": 0.004974628903829138,\n \"acc_norm\": 0.6304521011750648,\n\
\ \"acc_norm_stderr\": 0.004816958817726085\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.45,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3849056603773585,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.3849056603773585,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.041227287076512825,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.041227287076512825\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3468208092485549,\n\
\ \"acc_stderr\": 0.03629146670159663,\n \"acc_norm\": 0.3468208092485549,\n\
\ \"acc_norm_stderr\": 0.03629146670159663\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.03057944277361034,\n\
\ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.03057944277361034\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.38064516129032255,\n \"acc_stderr\": 0.02762171783290703,\n \"\
acc_norm\": 0.38064516129032255,\n \"acc_norm_stderr\": 0.02762171783290703\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114482,\n \"\
acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114482\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4727272727272727,\n \"acc_stderr\": 0.03898531605579419,\n\
\ \"acc_norm\": 0.4727272727272727,\n \"acc_norm_stderr\": 0.03898531605579419\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.43434343434343436,\n \"acc_stderr\": 0.03531505879359183,\n \"\
acc_norm\": 0.43434343434343436,\n \"acc_norm_stderr\": 0.03531505879359183\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.47668393782383417,\n \"acc_stderr\": 0.03604513672442207,\n\
\ \"acc_norm\": 0.47668393782383417,\n \"acc_norm_stderr\": 0.03604513672442207\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.38461538461538464,\n \"acc_stderr\": 0.024666744915187222,\n\
\ \"acc_norm\": 0.38461538461538464,\n \"acc_norm_stderr\": 0.024666744915187222\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712163,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712163\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3277310924369748,\n \"acc_stderr\": 0.030489911417673227,\n\
\ \"acc_norm\": 0.3277310924369748,\n \"acc_norm_stderr\": 0.030489911417673227\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.03511807571804725,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.03511807571804725\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.45688073394495415,\n \"acc_stderr\": 0.021357458785226206,\n \"\
acc_norm\": 0.45688073394495415,\n \"acc_norm_stderr\": 0.021357458785226206\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.23148148148148148,\n \"acc_stderr\": 0.028765111718046934,\n \"\
acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.028765111718046934\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4117647058823529,\n \"acc_stderr\": 0.034542365853806094,\n \"\
acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.034542365853806094\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5358649789029536,\n \"acc_stderr\": 0.03246338898055659,\n \
\ \"acc_norm\": 0.5358649789029536,\n \"acc_norm_stderr\": 0.03246338898055659\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.42152466367713004,\n\
\ \"acc_stderr\": 0.033141902221106564,\n \"acc_norm\": 0.42152466367713004,\n\
\ \"acc_norm_stderr\": 0.033141902221106564\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.45038167938931295,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.45038167938931295,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5619834710743802,\n \"acc_stderr\": 0.04529146804435792,\n \"\
acc_norm\": 0.5619834710743802,\n \"acc_norm_stderr\": 0.04529146804435792\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.44171779141104295,\n \"acc_stderr\": 0.039015918258361836,\n\
\ \"acc_norm\": 0.44171779141104295,\n \"acc_norm_stderr\": 0.039015918258361836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5145631067961165,\n \"acc_stderr\": 0.049486373240266356,\n\
\ \"acc_norm\": 0.5145631067961165,\n \"acc_norm_stderr\": 0.049486373240266356\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6367521367521367,\n\
\ \"acc_stderr\": 0.03150712523091264,\n \"acc_norm\": 0.6367521367521367,\n\
\ \"acc_norm_stderr\": 0.03150712523091264\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.508301404853129,\n\
\ \"acc_stderr\": 0.017877498991072008,\n \"acc_norm\": 0.508301404853129,\n\
\ \"acc_norm_stderr\": 0.017877498991072008\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4046242774566474,\n \"acc_stderr\": 0.026424816594009852,\n\
\ \"acc_norm\": 0.4046242774566474,\n \"acc_norm_stderr\": 0.026424816594009852\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2536312849162011,\n\
\ \"acc_stderr\": 0.014551553659369923,\n \"acc_norm\": 0.2536312849162011,\n\
\ \"acc_norm_stderr\": 0.014551553659369923\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.41830065359477125,\n \"acc_stderr\": 0.028245134024387282,\n\
\ \"acc_norm\": 0.41830065359477125,\n \"acc_norm_stderr\": 0.028245134024387282\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3954983922829582,\n\
\ \"acc_stderr\": 0.027770918531427834,\n \"acc_norm\": 0.3954983922829582,\n\
\ \"acc_norm_stderr\": 0.027770918531427834\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.39814814814814814,\n \"acc_stderr\": 0.027237415094592477,\n\
\ \"acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.027237415094592477\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.31560283687943264,\n \"acc_stderr\": 0.027724989449509314,\n \
\ \"acc_norm\": 0.31560283687943264,\n \"acc_norm_stderr\": 0.027724989449509314\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3005215123859192,\n\
\ \"acc_stderr\": 0.011709918883039122,\n \"acc_norm\": 0.3005215123859192,\n\
\ \"acc_norm_stderr\": 0.011709918883039122\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.22058823529411764,\n \"acc_stderr\": 0.025187786660227272,\n\
\ \"acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.025187786660227272\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3709150326797386,\n \"acc_stderr\": 0.019542101564854118,\n \
\ \"acc_norm\": 0.3709150326797386,\n \"acc_norm_stderr\": 0.019542101564854118\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.44545454545454544,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.44545454545454544,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.37551020408163266,\n \"acc_stderr\": 0.031001209039894836,\n\
\ \"acc_norm\": 0.37551020408163266,\n \"acc_norm_stderr\": 0.031001209039894836\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5174129353233831,\n\
\ \"acc_stderr\": 0.03533389234739245,\n \"acc_norm\": 0.5174129353233831,\n\
\ \"acc_norm_stderr\": 0.03533389234739245\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5146198830409356,\n \"acc_stderr\": 0.038331852752130254,\n\
\ \"acc_norm\": 0.5146198830409356,\n \"acc_norm_stderr\": 0.038331852752130254\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26438188494492043,\n\
\ \"mc1_stderr\": 0.015438211119522514,\n \"mc2\": 0.4195962328166831,\n\
\ \"mc2_stderr\": 0.014414337460874078\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6306235201262825,\n \"acc_stderr\": 0.01356447059605351\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15466262319939347,\n \
\ \"acc_stderr\": 0.009959786220917213\n }\n}\n```"
repo_url: https://huggingface.co/frankenmerger/gemoy-4b-instruct-scientific
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|arc:challenge|25_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|gsm8k|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hellaswag|10_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|winogrande|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T11-10-43.531199.parquet'
- config_name: results
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- results_2024-03-10T11-10-43.531199.parquet
- split: latest
path:
- results_2024-03-10T11-10-43.531199.parquet
---
# Dataset Card for Evaluation run of frankenmerger/gemoy-4b-instruct-scientific
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [frankenmerger/gemoy-4b-instruct-scientific](https://huggingface.co/frankenmerger/gemoy-4b-instruct-scientific) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_frankenmerger__gemoy-4b-instruct-scientific",
"harness_winogrande_5",
split="train")
```
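Since each run is stored under a split named with its timestamp (format `YYYY_MM_DDTHH_MM_SS.ffffff`), you can sort split names chronologically when a repository contains several runs. The helper below (`most_recent_split` is a hypothetical name, not part of the `datasets` API) is a minimal sketch of that parsing:

```python
from datetime import datetime

def most_recent_split(split_names):
    """Return the split name carrying the newest run timestamp.

    Split names follow the pattern used in this dataset,
    e.g. "2024_03_10T11_10_43.531199".
    """
    def parse(name):
        return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")
    return max(split_names, key=parse)

# The "latest" split already aliases the newest run, but explicit
# sorting is useful when comparing results across multiple runs.
print(most_recent_split(["2024_03_10T11_10_43.531199"]))
```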
## Latest results
These are the [latest results from run 2024-03-10T11:10:43.531199](https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__gemoy-4b-instruct-scientific/blob/main/results_2024-03-10T11-10-43.531199.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3887480937200939,
"acc_stderr": 0.033967527013847434,
"acc_norm": 0.3919353670094879,
"acc_norm_stderr": 0.03472007289325813,
"mc1": 0.26438188494492043,
"mc1_stderr": 0.015438211119522514,
"mc2": 0.4195962328166831,
"mc2_stderr": 0.014414337460874078
},
"harness|arc:challenge|25": {
"acc": 0.39334470989761094,
"acc_stderr": 0.014275101465693026,
"acc_norm": 0.4197952218430034,
"acc_norm_stderr": 0.014422181226303026
},
"harness|hellaswag|10": {
"acc": 0.46106353316072496,
"acc_stderr": 0.004974628903829138,
"acc_norm": 0.6304521011750648,
"acc_norm_stderr": 0.004816958817726085
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3092105263157895,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.3092105263157895,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3849056603773585,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.3849056603773585,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.041227287076512825,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.041227287076512825
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3468208092485549,
"acc_stderr": 0.03629146670159663,
"acc_norm": 0.3468208092485549,
"acc_norm_stderr": 0.03629146670159663
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.03057944277361034,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.03057944277361034
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392871,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392871
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.38064516129032255,
"acc_stderr": 0.02762171783290703,
"acc_norm": 0.38064516129032255,
"acc_norm_stderr": 0.02762171783290703
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114482,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114482
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.03898531605579419,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.03898531605579419
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.43434343434343436,
"acc_stderr": 0.03531505879359183,
"acc_norm": 0.43434343434343436,
"acc_norm_stderr": 0.03531505879359183
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.47668393782383417,
"acc_stderr": 0.03604513672442207,
"acc_norm": 0.47668393782383417,
"acc_norm_stderr": 0.03604513672442207
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.38461538461538464,
"acc_stderr": 0.024666744915187222,
"acc_norm": 0.38461538461538464,
"acc_norm_stderr": 0.024666744915187222
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712163,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712163
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3277310924369748,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.3277310924369748,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.03511807571804725,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.03511807571804725
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.45688073394495415,
"acc_stderr": 0.021357458785226206,
"acc_norm": 0.45688073394495415,
"acc_norm_stderr": 0.021357458785226206
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.028765111718046934,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.028765111718046934
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.034542365853806094,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.034542365853806094
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5358649789029536,
"acc_stderr": 0.03246338898055659,
"acc_norm": 0.5358649789029536,
"acc_norm_stderr": 0.03246338898055659
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.42152466367713004,
"acc_stderr": 0.033141902221106564,
"acc_norm": 0.42152466367713004,
"acc_norm_stderr": 0.033141902221106564
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.45038167938931295,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.45038167938931295,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5619834710743802,
"acc_stderr": 0.04529146804435792,
"acc_norm": 0.5619834710743802,
"acc_norm_stderr": 0.04529146804435792
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.44171779141104295,
"acc_stderr": 0.039015918258361836,
"acc_norm": 0.44171779141104295,
"acc_norm_stderr": 0.039015918258361836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.5145631067961165,
"acc_stderr": 0.049486373240266356,
"acc_norm": 0.5145631067961165,
"acc_norm_stderr": 0.049486373240266356
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6367521367521367,
"acc_stderr": 0.03150712523091264,
"acc_norm": 0.6367521367521367,
"acc_norm_stderr": 0.03150712523091264
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.508301404853129,
"acc_stderr": 0.017877498991072008,
"acc_norm": 0.508301404853129,
"acc_norm_stderr": 0.017877498991072008
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4046242774566474,
"acc_stderr": 0.026424816594009852,
"acc_norm": 0.4046242774566474,
"acc_norm_stderr": 0.026424816594009852
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2536312849162011,
"acc_stderr": 0.014551553659369923,
"acc_norm": 0.2536312849162011,
"acc_norm_stderr": 0.014551553659369923
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.41830065359477125,
"acc_stderr": 0.028245134024387282,
"acc_norm": 0.41830065359477125,
"acc_norm_stderr": 0.028245134024387282
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3954983922829582,
"acc_stderr": 0.027770918531427834,
"acc_norm": 0.3954983922829582,
"acc_norm_stderr": 0.027770918531427834
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.027237415094592477,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.027237415094592477
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.31560283687943264,
"acc_stderr": 0.027724989449509314,
"acc_norm": 0.31560283687943264,
"acc_norm_stderr": 0.027724989449509314
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3005215123859192,
"acc_stderr": 0.011709918883039122,
"acc_norm": 0.3005215123859192,
"acc_norm_stderr": 0.011709918883039122
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.025187786660227272,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.025187786660227272
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3709150326797386,
"acc_stderr": 0.019542101564854118,
"acc_norm": 0.3709150326797386,
"acc_norm_stderr": 0.019542101564854118
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.44545454545454544,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.44545454545454544,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.37551020408163266,
"acc_stderr": 0.031001209039894836,
"acc_norm": 0.37551020408163266,
"acc_norm_stderr": 0.031001209039894836
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5174129353233831,
"acc_stderr": 0.03533389234739245,
"acc_norm": 0.5174129353233831,
"acc_norm_stderr": 0.03533389234739245
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5146198830409356,
"acc_stderr": 0.038331852752130254,
"acc_norm": 0.5146198830409356,
"acc_norm_stderr": 0.038331852752130254
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26438188494492043,
"mc1_stderr": 0.015438211119522514,
"mc2": 0.4195962328166831,
"mc2_stderr": 0.014414337460874078
},
"harness|winogrande|5": {
"acc": 0.6306235201262825,
"acc_stderr": 0.01356447059605351
},
"harness|gsm8k|5": {
"acc": 0.15466262319939347,
"acc_stderr": 0.009959786220917213
}
}
```
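As a sanity check on the figures above — a minimal sketch, assuming the harness reports the sample standard error `sqrt(p·(1−p)/(n−1))` over per-example 0/1 scores, and that the ARC-Challenge test split has 1,172 examples (both are assumptions; neither is stated in this card):

```python
import math

# Reported ARC-Challenge (25-shot) figures from the results block above.
reported_acc = 0.39334470989761094
reported_stderr = 0.014275101465693026

n = 1172                            # assumed ARC-Challenge test-split size
correct = round(reported_acc * n)   # implies 461 correct answers
p = correct / n

# Sample standard error of a 0/1 score vector: sqrt(p * (1 - p) / (n - 1)).
stderr = math.sqrt(p * (1 - p) / (n - 1))

print(p, stderr)  # should line up with the reported acc / acc_stderr
```

The same reconstruction works for the other tasks once their split sizes are known (e.g. 10,042 for the HellaSwag validation split).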
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-staging-eval-launch__gov_report-plain_text-cd8e90-16116211 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- launch/gov_report
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP13
metrics: ['bertscore']
dataset_name: launch/gov_report
dataset_config: plain_text
dataset_split: validation
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP13
* Dataset: launch/gov_report
* Config: plain_text
* Split: validation
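The `col_mapping` entry in the metadata above tells the evaluator which dataset columns play the generic `text`/`target` roles. A minimal sketch of that renaming (the example row is invented; real `launch/gov_report` rows contain full government reports):

```python
# col_mapping from the metadata above: generic name -> gov_report column.
col_mapping = {"text": "document", "target": "summary"}

# A stand-in row with the same columns as launch/gov_report (contents made up).
row = {
    "document": "Full text of a government report...",
    "summary": "A short reference summary.",
}

# Rename the columns so a generic summarization pipeline can consume them.
mapped = {generic: row[source] for generic, source in col_mapping.items()}
print(mapped["text"])  # the document body under its generic name
```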
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nonchalant-nagavalli](https://huggingface.co/nonchalant-nagavalli) for evaluating this model. |
Nexdata/Hindi_Spontaneous_Speech_Data | ---
language:
- hi
task_categories:
- automatic-speech-recognition
---
# Dataset Card for Nexdata/Hindi_Spontaneous_Speech_Data
## Description
494 hours of Hindi spontaneous speech data covering multiple topics. All of the audio was manually transcribed into text; speaker identity, gender, and other attributes are also annotated. This dataset can be used for voiceprint recognition model training, corpus construction for machine translation, and algorithm research.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1269?source=Huggingface
# Specifications
## Format
16 kHz, 16-bit, mono channel;
## Content category
including education, interviews, sports, etc.
## Language
Hindi;
## Annotation
annotations for the transcription text, speaker identity, and gender;
## Application scenarios
speech recognition, video caption generation and video content review;
## Accuracy
Word Accuracy Rate (WAR) of no less than 98%.
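Given the 16 kHz / 16-bit / mono format stated above, a small sketch for verifying that a downloaded clip matches the spec, using only the standard-library `wave` module (the file name shown is hypothetical):

```python
import wave

def matches_spec(source):
    """Return True if `source` (a path or file-like object) is 16 kHz, 16-bit, mono."""
    with wave.open(source, "rb") as wav:
        return (wav.getframerate() == 16000   # 16 kHz sample rate
                and wav.getsampwidth() == 2   # 16-bit samples (2 bytes each)
                and wav.getnchannels() == 1)  # mono

# e.g. matches_spec("clip_0001.wav") on a downloaded file
```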
# Licensing Information
Commercial License |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/1bc9eac9 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 186
num_examples: 10
download_size: 1337
dataset_size: 186
---
# Dataset Card for "1bc9eac9"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |