| Sequence (int64, 1–25.2k) | Time (int64, 1–858M) | File (string, 830 classes) | RangeOffset (int64, 0–2.21M) | RangeLength (int64, 0–168k) | Text (string, lengths 1–4.7M, ⌀ = null) | Language (string, 20 classes) | Type (string, 9 classes) |
|---|---|---|---|---|---|---|---|
| 292 | 393,464 | TERMINAL | 0 | 0 | `[1;172H30[25d\t` | null | terminal_output |
| 293 | 394,469 | TERMINAL | 0 | 0 | `[1;173H1[25d\t` | null | terminal_output |
| 294 | 395,474 | TERMINAL | 0 | 0 | `[1;173H2[25d\t` | null | terminal_output |
| 295 | 396,469 | TERMINAL | 0 | 0 | `[1;173H3[25d\t` | null | terminal_output |
| 296 | 397,475 | TERMINAL | 0 | 0 | `[1;173H4[25d\t` | null | terminal_output |
| 297 | 398,473 | TERMINAL | 0 | 0 | `[1;173H5[25d\t` | null | terminal_output |
| 298 | 399,481 | TERMINAL | 0 | 0 | `[1;173H6[25d\t` | null | terminal_output |
| 299 | 400,467 | TERMINAL | 0 | 0 | `[1;173H7[25d\t` | null | terminal_output |
| 300 | 401,466 | TERMINAL | 0 | 0 | `[1;173H8[25d\t` | null | terminal_output |
| 301 | 402,467 | TERMINAL | 0 | 0 | `[1;173H9[25d\t` | null | terminal_output |
| 302 | 404,469 | TERMINAL | 0 | 0 | `[1;172H40[25d\t` | null | terminal_output |
| 303 | 405,475 | TERMINAL | 0 | 0 | `[1;173H1[25d\t` | null | terminal_output |
| 304 | 406,472 | TERMINAL | 0 | 0 | `[1;173H2[25d\t` | null | terminal_output |
| 305 | 407,465 | TERMINAL | 0 | 0 | `[1;173H3[25d\t` | null | terminal_output |
| 306 | 409,474 | TERMINAL | 0 | 0 | `[1;173H4[25d\t` | null | terminal_output |
| 307 | 409,478 | TERMINAL | 0 | 0 | `[1;173H5[25d\t` | null | terminal_output |
| 308 | 410,470 | TERMINAL | 0 | 0 | `[1;173H6[25d\t` | null | terminal_output |
| 309 | 411,472 | TERMINAL | 0 | 0 | `[1;173H8[25d\t` | null | terminal_output |
| 310 | 412,473 | TERMINAL | 0 | 0 | `[1;173H9[25d\t` | null | terminal_output |
| 311 | 413,463 | TERMINAL | 0 | 0 | `[1;172H50[25d\t` | null | terminal_output |
| 312 | 414,465 | TERMINAL | 0 | 0 | `[1;173H1[25d\t` | null | terminal_output |
| 313 | 415,468 | TERMINAL | 0 | 0 | `[1;173H2[25d\t` | null | terminal_output |
| 314 | 416,468 | TERMINAL | 0 | 0 | `[1;173H3[25d\t` | null | terminal_output |
| 315 | 417,463 | TERMINAL | 0 | 0 | `[1;173H4[25d\t` | null | terminal_output |
| 316 | 418,463 | TERMINAL | 0 | 0 | `[1;173H5[25d\t` | null | terminal_output |
| 317 | 419,466 | TERMINAL | 0 | 0 | `[1;173H6[25d\t` | null | terminal_output |
| 318 | 420,472 | TERMINAL | 0 | 0 | `[1;173H7[25d\t` | null | terminal_output |
| 319 | 421,476 | TERMINAL | 0 | 0 | `[1;173H8[25d\t` | null | terminal_output |
| 1 | 18 | scripts/file_duplicate_checker.py | 0 | 0 | import os\nfrom collections import defaultdict\nfrom tqdm import tqdm\n\ndef find_duplicate_filenames(root_dir):\n filenames = defaultdict(list)\n file_count = 0\n\n # Use tqdm with manual update and no percentage/ETA bar\n pbar = tqdm(desc="Files scanned", unit="file", dynamic_ncols=True, bar_format="{desc}: {n_fmt}")\n\n # Walk the directory recursively\n for dirpath, _, files in os.walk(root_dir):\n for file in files:\n full_path = os.path.join(dirpath, file)\n if os.path.isfile(full_path):\n filenames[file].append(full_path)\n file_count += 1\n pbar.update(1)\n\n pbar.close()\n\n # Print duplicates\n duplicates = {name: paths for name, paths in filenames.items() if len(paths) > 1}\n if duplicates:\n print("\nDuplicate filenames found:\n")\n for name, paths in duplicates.items():\n print(f"Filename: {name}")\n for path in paths:\n print(f" - {path}")\n print()\n else:\n print("\nNo duplicate filenames found.")\n\nif __name__ == "__main__":\n import sys\n if len(sys.argv) < 2:\n print("Usage: python find_duplicates.py <directory_path>")\n else:\n find_duplicate_filenames(sys.argv[1])\n\n | python | tab |
| 2 | 521 | extension-output-pdoom-org.crowd-code-#1-crowd-code | 0 | 0 | 11:36:32 PM [info] Activating crowd-code\n11:36:32 PM [info] Welcome back tum_ind3695. Your user-id is '507ab0ec0dfe0c18ad7778dd15e072f92367194c94623114de802c8ed9c52e20'. Happy coding!\n11:36:32 PM [info] Recording started\n | Log | tab |
| 1 | 12 | todos.md | 0 | 0 |
# Todo's\n\n## Misc:\n- [ ] Steuererklärung \n- [ ] jan wlan geld internet und paket\n- [ ] thanh geld 22 club\n- [ ] Arzt termine\n - [ ] normal\n - [ ] zahn\n- [ ] Proteinshake\n- [ ] Protein shaker\n- [ ] Zahnbürste\n- [ ] laptop?? ask stefan what to do with it\n\n\n## 23.06 Monday\n\n- [x] PR step duration\n- [x] generate samples from dyn\n- [x] single batch training overfit\n - [x] lam\n - [x] tokenizer\n - [ ] dynamics model\n\n- [ ] lr finding \n- [ ] pr: update sampling method for resolution \n\n- [ ] retrain lam until convergence?\n- [ ] see if data parallel is better than single gpu?\n\n- [ ] issue: make sampling faster\n- [ ] blog post crowd source\n\n- [ ] Preprocess entire minecraft dataset\n- [ ] Dont let tf see gpus\n- [ ] Look into dataloader\n\nQuestions:\n- optimal lr\n- optimal batch size\n- how to scale lr with batch size\n- how to test/evaluate lam performance?\n - how good are the actions for the lam\n - how many actions do we have?\n - how many actions do we need\n\n\n## 22.06 Sunday\nNotes:\n- How should tokenizer training behave?\n- How should lam training behave?\n- How should dynamics model training behave?\n- \n\nTODOS:\n- [x] look at prs jafar\n- [x] extension version update\n- [x] make big run directory\n- [x] look at the run of lam\n- [ ] train the dynamics model on the new models\n\n- [ ] start a coin run baseline\n - [ ] tokenizer\n - [ ] lam \n - [ ] dynamics model\n\n- [ ] move tfrecord to huggingface\n\n- [x] tierlist for thanh\n- [ ] move tfrecord to helmholtz\n- [ ] helmholtz setup\n- [ ] fix dataloader seeding\n\n- [ ] FFM homework\n- [ ] Tennis\n- [ ] gym\n\n\n\n## 20.06 Friday\n- [x] wäsche\n- [x] flaschen wegbringen\n- [x] zhs \n- [x] cursor setup\n- [x] run overfiting runs on lam\n\n## 19.06 Thursday\n- [x] extension ace palenight\n\n\n- [x] run overfiting runs on dynamics model (got aborted bc of branch switching)\n - [x] test multi gpu overfit on full tfrecord\n\n\n## 18.06 Wednesday\nJobs:\n- [x] run overfiting runs for 
tokenizer \n - [x] test multi gpu overfit on single batch\n - [x] test multi gpu overfit on single sample\n - [x] test multi gpu overfit on 10 tfrecord\n - [x] test multi gpu overfit on full tfrecord\n\n- [x] create tf record batch (1,4,8,16) sample from dataloader saven \n\n\n## 17.06 Tuesday\n- [x] cleanup Home\n- [x] cleanup ws shared\n\n## 13.06 Friday\n- [ ] Start job for single batch training (overfit dataset)\n- [ ] Make slides for presentation \n\n- [ ] helmholtz setup\n- [ ] move one tf record to helmholts\n\n## 12.06 Thursday\n- [x] Fix oom issue\n - Dataloader was caching\n- [x] find out biggest batch size for now\n- [x] start a run on one node with coinrun for 12h sbatch\n- [x] start a run on one node with minecraft \n- [x] cleanup ws\n- [x] cleanup wandb\n\n\n\nQuestions Dataloader:\n- [ ] What is one 'element' in the dataloader (Is it an episode or mutlitple episodes? Is it clips from an episode?)\n- [ ] What is the number of elements in the dataloader?\n- [ ] Why is shuffling so slow? Why does it have to shuffle the data and not the indices?\n- [ ] Does the dataloader currently shuffle per episode or per clip?\n- [ ] How do we get the best shuffle? (Optimally uniformly over all clips?)\n- [ ] Do we have to use chaching? Does it improve performance? If yes how much? Is it worth it?\n\nQuestions Training:\n- [ ] What is the best batch size? (What is the best way to get the optimal batch size? Just running it?? Can we calculate/estimate it?)\n- [ ] Can we just use the Genie hyp params or does our data paralelism change the optimal params? 
(Is the setup good enough?)\n\n\n## 11.06 Wednesday\n- [ ] Start coinrun training\n- [ ] start minecraft training\n\n## 10.06 Tuesday\n- [ ] First run with new pipeline\n- [ ] Zahnarzt\n- [ ] Flaschen wegbringen\n- [ ] \n- [ ] Blutbild und Schulter termin\n\n## 09.06 Monday\n- [x] Zahnarzt termin machen \n- [ ] Presenation Vorbereitung\n\nOffice:\n - [x] jan ikea\n - [x] ben geld sae trip\n - [x] namitha geld\n\n\nHome:\n - [ ] pa \n - [ ] jan wlan \n - [ ] Barmer geld impfung\n - [ ] physio geld\n\n\n## 08.06 Sunday\n- [ ] zimmer aufräumen\n - [ ] klamotten sofa\n - [ ] müll\n- [ ] küche \n - [ ] boden\n - [ ] müll\n- [ ] \n\n- [ ] video pipeline\n - [ ] clip the npy videos in 16 sec chunks with the designated ranges\n - [ ] 16sec video to npy?\n\n- [ ] readup data pipeline\n - [ ] read vpt \n - [ ] mineworld \n - [ ] genieN\n - [ ] 5k openai\n\n## 07.06 Saturday\n- [x] einkaufen \n - [x] soap \n - [x] öl \n\n## 06.06 Friday\n- [x] log idling gpus every 30mins\n- [x] make website for it on lxhalle\n\n- [ ] video pipeline \n - [x] split into 16 sec videos for datalabeling (training labeler)\n - [ ] verify results\n - [x] some videos are not 16sec\n \n- [x] move videos to new shared workspace\n\nNotes:\n- ffmpeg is bad at keeping length of videos if not re-encoding\n- takes the next best key frame if not re-encoding \n- encoding is super slow (20min video in 40 min instead of 11sec)\n\n## 05.06 Thursday\n- [x] write email to stefan for cluster group \n - [x] mihir \n - [x] me \n- [x] video pipeline\n - [x] convert mp4 to npy\n - [x] verify results\n\n\n## 03.06 Tuesday\n- [x] kit report \n- [x] karte sap\n\n\n## 30.05 Friday\n- [x] Empty bottles\n- [x] Groceries\n\n- [ ] random dataset for trainign\n- [ ] Macbook?\n- [ ] SAP card\n- [ ] Access to cluster\n- [ ] report for KIT\n\n\n## 28.05 Wednesday\n- [x] Data paralelism \n- [x] Sampling with mihir\n\n## 27.05 Tuesday\n- [ ] agents on envs\n- [ ] video dataset\n- [ ] Crafter/Cratax\n\n- [ ] other envs\n - [ ] 
procgen\n - [ ] gymretro\n - [ ] minecraft\n\n\n\n## 26.05 Monday\n- [x] data sharing\n- [x] 16 procgen envs\n - [x] 10mio frames pro env\n\nTraining times:\n1 node 1 env 10 tage \n1.5b param 5b frames 50 atari env 72k gpuh \n64gpus need 12 days\n\n\n\n## 24.05 Saturday\n- [x] email stefan compute access\n- [ ] data gen for the old envs\n\n- [ ] images from gymnax\n- [ ] craftax images\n\n- [ ] 1 page report for KIT\n- [ ] make BA presentation plan\n\n## 23.05 Friday\n- [x] Kit setup\n\n\n\n## 22.05 Thursday\n- [x] BBH\n - [x] Lemon grass\n - [x] koriander\n - [x] topf -> tam\n - [x] gym?\n- [x] Pr for data generation\n- [x] setup and run gymnax\n\n\n## 21.05 Wednesday\n\n- [ ] datageneration\n - [x] what do other ppl use? (jafar, openai, craftax)\n - [x] what is the format (compatibility)\n - [x] find good environments (easy ones for a good baseline)\n - [x] implement data generation script (so it jafar compatible)\n - [x] paramteer for env agents in procgen\n- [x] job for 2 simple environments\n\n*Notes*:\n- jafar uses procgen: https://github.com/openai/procgen\n - can easily generate 16 different environments\n- gym, gym3, ALE (atari)\n- backwards compatibility was broken by gym in version 0.26.*\n - have to downgrade to 0.25.* for gym3 to work with gym\n- todo: might have to save the seed into the meta data??\n\n*Sources*:\nGym: https://gymnasium.farama.org/\nGymnax: https://github.com/RobertTLange/gymnax#implemented-accelerated-environments-%EF%B8%8F\nCraftax: https://github.com/MichaelTMatthews/Craftax\n\n\n## 20.05 Tuesday\n- [ ] write email for the kit cluster access\n- [ ] local hpc access\n- [ ] florian bauer macbook \n\n- [ ] look through code\n - [ ] how does the training work?\n - [ ] how is it different\n\n - [ ] Geld for SEA trip\n- [ ] Barmer\n\n\n## 19.05 Monday \n- [x] read through papers for DLM \n- [x] BA upload\n - [x] notes from below\n - [x] signation\n\nBA notes:\ntraining-time\ninference-time\nfine-tune\nfinetune\nllm \npython\nin 
context\nfigure/FIG references\ntransformation appendix??\n\n\n## 18.05 Sunday \n- [x] Email Meinel Lärmstuff\n- [x] Verivox email schreiben\n- [x] BA grammarly \n\n\n\n## 16.05 Friday\n- [x] request Helmholtz HPC access\n- [x] Horeka access\n- [x] Franz fragen \n - [x] setup reward distribution\n - [ ] setup genie\n- [x] bashscripts\n- [x] Befragung TUM RDI\n\n## BA\n- [x] Acknowledgements\n- [ ] Read requirements\n - [x] Tum logos on the front page?\n- [ ] Figures\n - [x] architecture\n - [x] inference strat\n - [x] computation to performance\n- [ ] Appendix?\n - [x] Comprehensive list of tranformations \n - [ ] exmaple prompt\n- [ ] Fig 4.1:\n - Say that we remove the test task and use one of the other tasks as test task\n\n- [x] Figure layout\n- [x] Conclusion put the results in there \n\n\n## Week in Hotze\nMisc:\n- [ ] Steuererklärung \n- [ ] jan wlan geld internet und paket\n- [ ] thanh geld 22 club\n- [ ] Britts rezension \n- [ ] Rezension ha giang\n- [ ] Befragung TUM RDI\n- [ ] Arzt termine\n - [ ] normal\n - [ ] zahn\n- [ ] Email Meinel Lärmstuff\n\n## 10.05.25 Saturday\n- [x] BA Figures\n\n\n## 05.05.25 Monday\n- [x] theo paypalen 86 Euro Amazon\n- [x] Email Isabelle Helmholz\n\n\n## DONE\n- [x] Dokument Helmholz\n- [x] stuff bestellen\n - [x] topper\n - [x] picknickdecke\n - [x] reis\n\n- [x] Tasse für PA \n\n\n## 26.03.25 Tuesday\n- [ ] Kreissparkasse\n- [x] Singapore arrival card\n- [ ] Barmer\n - [ ] impfung\n- [ ] Verivox\n- [ ] \n\n- [x] Packen\n- [x] Meds\n- [x] Müll\n- [x] Toilette\n- [x] Strom ausstecken\n- [x] Fenster schließen\n\n\n\n\n\n\n## 14.03.25 Friday\n\n- [x] Masterbwerbung random\n- [x] emails \n\n- [ ] methods\n\n\n## 12.03.25 Wednesday\nSidequests:\n- [ ] Perso\n- [ ] Führerschein\n- [ ] Barmer karte\n- [ ] hostel singapur\n- [ ] arbeiten\n- [ ] mac abgeben\n\nBA:\n- [ ] background section\n- [ ] methods section\n- [ ] evaluation pipeline\n - [ ] get the submission files\n - [ ] evaluate the results (easy, med, hard)\n - [ ] 
plot\n- [ ] ds/ep scaling for the tasks that where not solved\n- [ ]\n\n## 11.03.25 Tuesday\nHiwi:\n- [x] Application HIWI\n- [x] master bewerbung\n\nBA:\n- [x] anmeldung BA\n\n- [x] Sparkasse Rückzahlung\n\nSide Quests:\n- [x] paket zurückgeben\n- [x] telekom/verivox thingy\n- [x] master bewerbung\n- [x] jacke flicken\n- [x] arbeiten\n\n## 10.03.25 Monday\n\n## 08.03.25 Saturday\n\n\n## 05.03.25 Wednesday\n\n\n\n\n## 04.03.25 Tuesday\nCleanup:\n- [ ] Wäsche\n- [x] Zimmer random\n- [x] Küche random\n- [x] Staubsaugen\n- [ ] Fenster putzen\n- [ ] Badezimmer putzen\n\nGeld:\n- [ ] Impfung\n- [ ] Hose\n- [ ] Verivox \n\nWork:\n- [ ] Basti\n- [ ] Urlaubausbezahlen?\n- [ ] \n\nBA:\n- [ ] fix inference????\n- [ ] fix transformation???\n\n\n## 03.03.23\n- [x] run predictions for epoch scaling\n- [x] run predictions for epoch scaling\n\n## 02.03.23 Sunday\n- [ ] look for errors in adapter creation\n - [ ] some transformation error\n\n- [x] run the predictions\n - [x] vllm setup\n - [x] run predictions for epoch scaling\n - [x] run predictions for epoch scaling\n - [x] run predictions for epoch scaling\n\n- [x] evaluate epoch scaling batch 1\n\n## 01.03.25 Saturday\n- [x] Run ttt for the experiments\n - [x] scaling epoch batch 2\n - [x] scaling data \n - [x] scaling data + base llama\n - [x] scaling epoch batch 1\n- [x] run prediction for the experiments\n - [x] all the scripts setup the scripts\n- [x] notes background section\n\n\n## 27.02.25 Thursday:\nJuwels Cluster setup:\n- [x] \n\n\nHoreka Cluster setup:\n- [x] get the repo up an running \n - [x] vpn\n - [x] clone (repo clean up)\n - [x] environment\n - [x] scp the adapters?\n - [x] run the creation of the adapters from scratch\n - [x] run predict\n\n\nArzt: \n- [x] Impfung\n\n\n## 24.02.25 Monday\n- [x] start adapter generation \n- [x] debugging the max seq length problem\n- [x] move adapters to scratch?\n\n- [ ] ai fasttrack\n- [ ] mcp\n\n\n\n## 18.02.25 Tuesday\n- [] Laundry (Towels)\n- [x] Groceries\n\n- [ ] 
Slides holen \n - [ ] eist \n - [ ] gad\n - [ ] \n- [ ] Cleanup room (30min)\n\n\n## 17.02.25 Monday\nBA:\n- [x] start prediction job\n\nAP:\n- [ ] ap test\n\n## 16.02.25 Sunday\n- [x] analyze the training jsons\n - [x] min/max\n - [x] debug\n\n- [x] run ttt with 2k samples\n - [x] which training jsons\n - [x] where are the ttt adapters save\n\n- [x] test the training/predict piepline\n - [x] run predict with 4 processes (one per gpu)\n - [x] evaluation pipeline?\n - [x] only time for creating the all adapters\n - [x] how to measure inference time?\n - [x] how to measure training time?\n\n- [x] Fix transformations \n - [x] debug the one that are too big?\n - [x] make it work until 1000\n - [x] make the transformations more random\n - [x] look for other transformations\n - [x] plot all the ones under 500?\n\n- [x] buy pants\n\n## 12.02.25 Wednesday\n- [ ] \n\n\n## 11.02.25 Dienstag\n- [x] Stefan Bauer schreiben für meeting\n\n- [x] Plan für die nächsten 6 Woche\n - [x] Aptitude Test\n - [ ] Bachelor Arbeit\n - [ ] Arbeit\n\nI’ll be unavailable for an hour because of a doctor's appointment from 10:30 to 11:30 later\n\nVietnam\n- [x] mama anrufen wegen vietnam\n- [x] Impfugen vietnam egerndorfer\n- [x] singapur flüge \n- [x] yu ling fragen wegen referal\n\n## 10.02.25 Monday\n- [x] Table of contents für Bachelorarbeit\n- [x] Repo + Template für Bachelorarbeit\n- [x] Plan für die nächsten 6 Woche\n - [x] Aptitude Test\n - [ ] Bachelor Arbeit\n - [ ] Arbeit\n\n## 09.02.25 Sonntag\n- [x] PA\n- [x] Wäsche \n- [x] Putzen\n - [x] Fenster Küche\n - [x] Badezimmer\n - [x] Zimmer\n- [x] Plan für vietnam\n - [x] Mia\n - [x] Ben\n- [x] Machine Learning HA\n\n## 07.02.25 Friday\n- [x] Paper submisssion\n\n## 06.02.25 Thursday\n- [x] sparkasse rückzahlung email schreibn\n- [x] zusage test\n- [x] barmer geld \n - [x] sepa lastschrift\n\n### Estimating difficulty\nFor var, ent and std\n- [x] linear regression for the plot (with L1 and L2)\n- [x] ground truth line/regression\n\n- [x] 
loop through sample length and create plots\n- [x] create one combined plot of all the ablations\n- [x] later do the prm as well \n- [ ] buckets at the end with accuracy?\n\n\n- N: number of samples\n- exp for samples (2^n)\n- linear for seq_len (stepsize 2)\n- for var entr std\n- put all lin regressions in one plot\n\n### Metrics\n- L1 (ground truth)\n- L2 (ground truth)\n\nWhen done \n- Buckets + accuracy\n- per class accuracy\n\n## 04.02.25 Tuesday\n- [x] Deep eval for search\n - [x] generate golden\n - [x] convert .md to .txt\n - [x] run some tests\n\n- [x] geschenk für Joe\n- [x] nach muc\n- [x] arbeiten\n- [x] wäsche\n\n## 03.02.25 Monday\n- [x] Evaluate the results\n - [x] reinstall vllm\n- [x] aptitude test questioning\n\n\n## 02.02.25 Sunday\n- [x] Zweitsteuer \n- [x] ICE-ticket\n- [x] tum geld semesterbeitrag\n- [x] zulip eignungstest\n\n## 31.01.25 Friday\n- [x] might have to adapt the *.bin_{adapter_num} to *_{adapter_num}.bin\n\n## 30.01.25 Thursday\n- [x] fix error in predict.py\n- [x] create adapter per iteration\n - [x] every 10 iterations till 100\n - [x] every 40 iterations till 500\n - [x] every 100 iterations till 1000\n - [x] every 200 iterations till 2000\n\n- [x] setup per iteration checkpointing\n- [x] write down ideas\n\n## 29.01.25 Wednesday\n\n- [x] started jobs for one epoch\n\n\n## 28.01.25 Tuesday\n- [x] put the models stats in the output.json\n- [x] run predict on 2000k 1 epoch and 2000k 2 epoch\n\n\nHypothesis: \n- [ ] harder problem need more samples\n- [ ] peaks for the problems are at different points\n- [ ] add the peaks together\n- [ ] how to predict these peaks\n\n- how to estimate the peak beforehand?\n-> stefan bauer \n\n\n- [ ] epoch 1 \n- [ ] check pointed while traingin\n- [ ] how does test time traing affect the performance?\n\n- [ ] how does it affect \n - [ ] othter test time scaling\n - [ ] HER\n - [ ] sampling size\n - [ ] test time search\n - [ ] get the logic\n - [ ] how could longer ttt affect the performance\n - [ 
] test time search methods\n\n- [ ] easy is solving other tasks\n- [ ] how good is grouping?\n- [ ] costs\n\n\n- [ ] estimate the difficulty differently?\n- [ ] how does this difficulty scale \n\n\nbefore doing anything:\n- questions\n- what do I have to do\n- what do I have to investigate\n- \n\n- [ ] von den einzelen tasks wie verhält sich das?\n- [ ]\n\n\n- [ ] make a road map of all the things I need to do\n- [ ] change the epochs and test for one \n - [ ] then run the same thing but with 2000 samples and checkpoint\n- [ ] run with deepseek distill\n- [ ] run with deepseek distill on transduction tasks\n\n- [ ] deploy r1\n\n\n- [ ] for the 30 \n- [ ] model chart with total solved tasks\n- [ ] change the lora adapters numbers\n- [ ] change the amount of transformations\n- [ ] change the type of transformations\n- [ ] what is the inference startegy rn?\n- [ ] \n\n\n- [ ] other methods of test time scaling?\n - [ ] cot on reasoning traces\n - [ ] more sampling\n - [ ] look for more\n\n- [ ] fix the gpu errors on some tasks\n- [ ] see if more transformations are needed\n- [ ] distilled models on hard tasks?\n- [ ] might need some reasoning cues....\n- [ ] mehr samplen for inference\n\n\n## 27.01.25 Monday\n- [x] saving results with wandb (redundant)\n - llm typically trained on 1 epoch\n - validation would be next token prediction\n - arc uses accuracy as only metric (free verification and thats the task)\n- [x] move adapters to one folder\n- [x] check how the training is going when changeing the learning set\n- [x] create a dataset of only the hard tasks\n- [x] run training on the hard tasks\n\n- [x] viusalize hard/easy tasks\n - [x] find out hard task (solved by <40% of the modes)\n- [ ] see if the model transformation is working\n - get the train data size into the task.csv as well\n## 26.01.25 Sunday\n- [x] find out the duplicates???\n\n## 25.01.25 Saturday\n- [x] multi job on one node \n- [x] find out which tasks are not solved\n - [x] list of solved/unsolved 
tasks\n\n## 23.01.25 Thursday\n- [x] chrissie schreiben für getränke\n- [x] und list für ne gruppe machen \n- [x] Franzi schreiben\n- [x] get multigpu training to run (hell nah they rewrote torchtune)\n- [x] get dataset for FT\n- [x] Cluster setup mihir\n- [x] get BARC setup \n - [x] barc ft\n - [x] barc adapters\n- [x] Mihir 5e\n- [x] Sushi geld\n\n## 21.01.25 Tuesday\n- [x] work\n- [x] food with ching chongs\n- [x] get the stats\n- [x] start all finetuning jobs\n\n\n## 20.01.25 Monday\n- [ ] Start some lora finetuning\n- [ ] write stefan bauer\n- [ ] \n\n## 19.01.25 Sunday\n- [ ] find out which one got solved\n - [ ] from baseline 250\n - [ ] from ours\n - [ ] get list of adapters\n - [ ] get list of solved tasks\n\n- [ ] create lora adapters for all sizes\n - [x] verify number of training samples\n - [x] verify number of successfully created adapters\n - [ ] start jobs for create adpaters for 10, 50, 100, 200, 500, 1000 tasks\n\n- [ ] run prediction on all adapters\n - [ ] 10 \n - [ ] verify stats: solved/not solved\n\n\n- [ ] fix the spiky behavior\n- [ ] clean up repo\n\n- [ ] spikey behavior\n\n- [ ] Wlan rechnung -> jan\n- [ ] train 1b model on 250 tasks each, if possible\n - [ ] get the adapters\n - [ ] run the predict\n - [ ] see which one are not solve\n - [ ] run a loop on [10, 50, 100, 200, 500, 1000]\n- [ ] putzen\n- [ ] ML hausuaufgaben\n\n\n## 18.01.25 Saturday \n- [x] einkauf rechnung\n- [x] rechnungen circula\n- [x] email lmu\n- [x] stefan bauer email für recommendation letter for datathon\n- [x] email osapiens\n- [ ] bewerbungen\n - [ ] aws\n - [ ] \n\n\n## 17.01.25 Friday\n- [x] Gym Beine\n- [x] Email draften for mehul\n- [x] linkedin stuff\n- [x] ramakant stuff\n- [x] email for tech support \n\n## 16.01.25 Thursday\n- [x] salloc 4h \n- [x] are we training on json?\n [ ] \n\n## 15.01.25 Wednesday\n- [x] arbeiten\n- [x] raum meetup\n- [ ] \n\n## 14.01.25 Tuesday\n- [x] Erasmus bewerbung\n- [x] Arbeiten\n- [x] Email stefan bauer (cluster access, 
helping with writing, test runs, i have 60000h no 6000h)\n- [x] Email eikyun\n\n\nHi Stefan, \n\nwäre es möglich Mihir noch Cluster access zu geben? Das würde uns rein vom setup und zusammen arbeiten mega viel Zeit ersparen. \nAußerdem ist mir aufgefallen, dass es sich um 60.000 und nicht um 6.000 cpu stunden handelt. Ich habe noch ca 58.000h übrig. Das sollte erstmal ausreichen denke ich.\nIch würde mich melden falls ich mehr brauche:)\n\nLG Alfred\n\nKurzes Update:\nIch habe jetzt über das Wochende ein paar inference Tasks mit den gegebenen Lora Adaptern laufen lassen. Gerade schreibe ich die Repo um für multi-gpu finetuning damit wir unsere eigenen Adaptere trainieren können. \nAm Freitag haben wir noch ein kurzes Meeting mit Mehul Daminan, für den adaptive compute part (https://arxiv.org/abs/2410.04707, er hat auch am TTT Paper mitgearbeitet). \n\nShort update:\nI was ran some inference tasks with the given Lora Adapters. I am currently rewriting the repo to support multi-gpu finetuning so we can train our own adapters.\nOn Friday we had a short meeting with Mehul Daminan, for the adaptive compute part (https://arxiv.org/abs/2410.04707, he also contributed to the TTT paper).\n\n\n## 13.01.25 Monday\n- [ ] how to generate 5-1000 new tasks?\n- [ ] how transformations work?\n- [ ] Ranking for erasmus\n\n\n## 11.01.25 Saturday\n- [x] Machine learning HA hochladen\n- [x] ttt repo run\n\n- [x] ask stefan for cluster access (mihir and franz)\n- [x] put avocadoaling on iphone\n\n\n- [ ] read paper\n - [ ] other ttt\n - [ ] self improvement\n\n- [x] Text stefan bauer\n - relative confident in idee, but might need supervision and tips : biweekly, help with writing, experiments, etc.\n\n- [x] write core hypothesis and experiments\n- [x] filtern for notes\n\n\n## 10.01.25 Friday\n\n- [x] get TTT to work\n\n- [x] Machine learning hausaufgabe\n\n- [x] read paper\n - [x] stefan bauer\n - [x] ttt arc\n\n- [x] ergebnisse vom alten projekt\n- [x] Stefan Bauer meeting\n - [x] 
eventuell hiwi\n - [x] idee schicken\n - [x] paper schicken\n - [x] is it ok to be two ppl for the project?\n - [x] hat er plan\n - [x] ob er jmd kennt der plan hat?\n - [x] iclr workshop\n - [x] ob er leute kennt die ahnung hat \n - [x] already texted the mit dudes\n - [x] scaling is ass with arc approaches\n- [x] get access to slack again\n- [x] reward hacking?\n\n## 09.01.25 Thursday\n- [x] write fabian \n- [x] handwerker\n- [x] ramakant\n- [x] presentation of semantic search\n- [x] telekom magenta eins\n- [x] barmer gek stuff\n- [x] clothes for washing tmrw\n- [x] read paper: how hard to think\n- [x] jonas essen \n\n## 08.01.25 Wednesday\n- [x] project idea machen zum doc einladen\n- [x] nachricht an stefan schreiben \n- [x] TTT repo installation setup \n- [x] arbeiten\n- [x] project idea first draft\n\n## 07.01.25 Tuesday\n- [x] merging done\n- [x] Gym\n- [x] Fenstersanierung\n\n## 06.01.25 Monday\n- [x] Gym\n\n- [x] caching and get some results\n\n- [ ] clean up room \n - [x] couch \n - [x] table\n - [x] floor\n - [x] kitchenstuff\n - [x] vacuum floor\n- [x] clean up kitchen\n - [x] dishes \n - [x] table\n - [x] vacuum floor\n- [x] clean up bathroom\n - [x] vacuum floor\n\n- [x] Complete test run with deep - not possible took too long\n- [x] medium - """\n- [x] for the teststats\n- [ ] add test time training in there for the tasks that were not solved\n\n- [ ] why 10 min between runs?\n- [ ] some tasks are way easier than others\n- [ ] setup for running background processes on the compute node\n- [ ] write a visualization for the results\n - [ ] intermediategrid representation?\n - [ ] store solutions/reasoning?\n - [ ] jupyter notebook for visualization\n- [ ] how are the solutions stored?\n- [ ] make one full test run and compare it to the og setup with claude\n - [ ] get the times for one run\n - [ ] get the time for the whole dataset\n - [ ] log it somewhere?\n - [ ] get the numbers for claude setup\n - [ ] get the numbers for qwen70b\n- [ ] play around 
with setup \n - [ ] differne models\n - [ ] depth\n - [ ] deep\n - [ ] medium\n - [ ] shallow\n - [ ] representation\n - [ ] look at the solved/unsolved ratio and tasks\n - [ ] any insights?\n- [ ] test out with smaller models\n - [ ] lamma7b\n - [ ] qwencoder 32b\n - [ ] ...\n - [ ] might be able to run them in parralle on one node\n - [ ] test out with finetuned models\n- [ ] finetune some model on it?\n- [ ] think about other approaches \n - [ ] natural language approach?\n - [ ] use the finetuned model of the winners\n - [ ] other generation schemes\n- [ ] get claude bedrock running\n- [ ] clean up\n\n## 05.01.25 Sunday\n- [x] gym\n- [x] send jan his datev letter\n- [x] return adidas stuff\n\n- [x] list for mom\n - [x] keyboard \n - [x] tshirt\n\n## 04.01.2025\n- [x] write update for stefan \n\n## 03.01.25 Friday\n\nMaiborn:\n- [x] get the circular app\n\nBachelor:\n- [ ] implement bedrock in the pipeline\n- [ ] one test run with claude\n- [ ] debug error??\n- [ ] figure out: intermediate results / tmp files\n - [ ] saving the python scripts somewhere not /tmp/...\n - [ ] manual testing of runnning python script\n - [ ] see what the prompts are getting from the representation\n\nBewerbung:\n- [ ] erasmus\n - [ ] bewerbung rausschicken\n- [ ] arbeit @aws @google @nvidia @other_big_tech @lab?\n\n\n## 02.01.25 Thursday\nMaiborn:\n- [x] Times\n- [x] Mobility Budget entry\n\nBachelor:\n- [x] setup aws account\n- [x] get bedrock running\n\nMisc: \n- [x] Sehtest Brille und brille kaufen\n- [x] email wandstreichen\n\n\n## 31.12.24 Monday\nBestellung:\n- [x] Schuhe \n- [x] Socken\n- [ ] \n\nSteuererklärung:\n- [ ] Antrag\n- [ ] App holen\n\nMaiborn:\n- [ ] Times\n- [ ] Mobility Budget\n- [ ] \n\nBachelor:\n- [x] setup aws account\n- [ ] one test run with claude\n- [ ] debug error??\n- [ ] figure out: intermediate results / tmp files\n - [ ] saving the python scripts somewhere not /tmp/...\n - [ ] manual testing of runnning python script\n - [ ] see what the prompts are 
getting from the representation\n\n\nMisc:\n- [ ] Wlan rechnung -> jan\n- [ ] termin für wandstreichen\n\nBewerbung:\n- [x] master bei lmu\n- [ ] erasmus\n - [x] info\n - [ ] bewerbung rausschicken\n- [ ] arbeit @aws @google @nvidia @other_big_tech @lab?\n\n\n\n## Monday \n- [ ] play around with:\n - [ ] cot\n - [ ] ascii representation\n - [ ] prompts\n - [ ] \n\n## Tuesday\n- [ ]\n\n\n## Wednesday\n- [ ] \n\n\n## Thursday\n- [x] change logging \n - [x] normal logger class?\n - [x] save to other format and then logfire?\n\n- [x] trying to run qwen72B -> too big bc of quantization?\n\n- [ ] download other models\n - [x] vision models are ass\n - [x] chat models\n\nGiven this iq test. What common transformation do these rows follow?\n\nI have this iq test for you. \nInstructions: \n\nGiven these pairs of grid patterns where each pair shows a 'before' (left) and 'after' (right) transformation, please:\n 1. Identify the rule that defines which how to transform the 'before' pattern into the 'after' pattern\n 2. provide the common transformation occurring to the geometric shapes\n 3. 
explain your reasoning on how you came to this conclusion\n\n\n- [x] issues fixing linebreak\n- [x] trying to run qwen coder 32B\n - [x] figuring out the output format\n- [x] figure out how to change tree structure\n\n\n## Friday \n- [ ] maybe use a smaller model?\n- [ ] finetune it on vision tasks\n- [ ] use the other finetune for generation?\n- [ ] self learning approach?\n- [ ] use the dsl and mix to create new stuff\n- [ ] see what kind of tasks can only be solve transductively\n- [ ] what did the top \n\n## Sunday \n- [x] telekom handyvertrag\n- [x] telekom wlan vertrag\n\n\nhttps://www.youtube.com/watch?v=WK5XYG-dH-k&list=PL0oJ2_Q2jPrfkO2Bo8ljN10cShkjkaWzr\n\nhttps://www.youtube.com/watch?v=3yQqNCOYfJo&list=PLbFBnggbJ1rnaXljgzhGm0p_Co9kwfs3t\n\nhttps://www.youtube.com/watch?v=bUudx1cPiAA&list=PLgENJ0iY3XBiJ0jZ53HT8v9Qa3cch7YEV\n\nhttps://www.youtube.com/watch?v=BsJJUAGoFBc&list=PLu0hRahvlQEahuFlF_Dc0_1AMmI_UCLz8\n\nhttps://www.youtube.com/watch?v=0PfLQkUBgcI&list=PL45ZeKlPnPvB6UGmZAJoH57ukARvRNZv3\n\nhttps://www.youtube.com/watch?v=ASfVaQH_1kI&list=PL0oJ2_Q2jPrdt6JFbZtTi8H_XwgYXVRfI\n\nhttps://www.youtube.com/watch?v=DM52HxaLK-Y&list=PLI-n-55RUT--saxVQngjQA3er4QXM67Mt\n\n## Ikea\n\nTotal: 375,78\n\nJan:\n- Bratpfanne Seong: 14,99€\nSum: 14,99€ \n\nAlfred: \n- Schwein 7,99€\n- Dröna Fach 3x1,99€=5,97€\nSum: 13,96€\n\nSplitting:\n375,78 - 14,99 - 13,96 = 346,83\n346,83 / 2 = 173,41\n\nTotal Alfred: 173,41 + 13,96 = 187,37€\nTotal Jan: 173,41 + 14,99 = 188,40€\n\n## SEA ben trip geld\n\n\nHostels:\nHa Long: 1,81 (11,60 total)\nHo Chi Minh: 2.76 (17,30 total)\nBangkok: 7.31 (45,94 total)\nKoh Samui: 4,02 (25,30 total)\n\nHanoi: \nLake View Hostel: 1.04.000d (37,39e)\nLake View Hostel: 130.000d (4,68e)\n\n\nHue:\nSecret Garden Hostel: 269.00d (9,53e)\nGrab: 149.00d (5,27e)\n1995s Hostel: 63.000d (2,18e)\nGrab: 33.000d (1,14e)\nGrab: 111.000d (3,87e)\nGrab: 113.000d (3,94e)\nGrab: 26.000d (0,91e)\n\nHoi An:\nHostel Scalo Villa: 535.000d (18.61e)\n\n\n\n\n\n\n\n## 
Go Asia\n\nBun: 1.99\n\n\n\n## Meeting Kungwoo\nBehaviour cloning:\n- How much data is needed to properly train the model?\n- \n \nOther gameplay datasets:\n- pubg dataset\n- biggest dataset\n- 1k hours\n\nData collection\n- bot for data collection\n- nvid\n\n- complexity \n- different worlds\n- not focusing on video games\n -> only use one game (depending on game)\n -> not many games \n\n\n\nTier 0:\nhttps://www.youtube.com/results?search_query=alle+meine+entchen+klavier\n\n\n\nI was asked to play piano for someones graduation ceremoy. How much money should I ask for?\nIn the past i got 200 euros for playing Fantasie Impromptu by Chopin and the Waldstein Sonata by Beethoven.\n\nMy current options would be:\nWinter Wind by Choping\nBallad No 4 by Chopin (too long)\nThe Wii Mii Theme\nTwinkle Twinkle Little Star\nAlle meine Entchen\nDepartures by Animenz\nGlimpse of Us arranged by Birru\n\nWhat are some other options that I could play and should I ask for less or more money depending on the piece?\nShould I make a tier list?\n\nBelow is a summary table with the YouTube search URLs for each piece, and an explanation of how you would approach getting the direct video URL if you had access to YouTube’s Data API or a scraping library like youtube-search-python.\n\n| **Tier** | **Piece** | **Composer / Arranger** | **Difficulty** | **Suggested Fee** | **URL** |\n| -------- | --------------------------------- | ----------------------- | -------------------------------------- | ----------------- | ------------------------------------------- |\n| SS | Ballade No. 4 | Chopin | Extremely demanding, long, expressive | 800 € | https://www.youtube.com/watch?v=pe-GrRQz8pk |\n| S | Winter Wind Étude (Op. 25 No. 11) | Chopin | Very fast, technically virtuosic | 600 € | https://www.youtube.com/watch?v=pHlqEvAwdVc |\n| A | Glimpse of Us | Birru (arr.) | Modern ballad, emotionally nuanced | 400 € | https://www.youtube.com/watch?v=Xe2Pr4omHMs |\n| A | Departures | Animenz (arr.) 
| Fast, cinematic anime arrangement | 400 € | https://www.youtube.com/watch?v=5hft807EJ6o |\n| A | Aeolian Harp Étude (Op. 25 No. 1) | Chopin | Flowing arpeggios, control and clarity | 400 € | https://www.youtube.com/watch?v=Ob0AQLp3a5s |\n| B | Clair de Lune | Debussy | Voicing finesse, sensitive rubato | 370 € | https://www.youtube.com/watch?v=U3u4pQ4WKOk |\n| C | Nocturne Op. 9 No. 2 | Chopin | Elegant phrasing, ornamentation | 300 € | https://www.youtube.com/watch?v=JVBzE0mUlSs |\n| D | Twinkle Twinkle Little Star | Traditional | Beginner level, simple melody | 15€ | https://www.youtube.com/watch?v=yCjJyiqpAuU |\n| E | Wii Mii Theme | Nintendo (arr.) | Nostalgic, light and rhythmic | 10€ | https://www.youtube.com/watch?v=xo8H6e2Afi4 |\n| F | Alle meine Entchen | Traditional | Basic children's song | 5€ | https://www.youtube.com/watch?v=5CDLwq8UgIE |\n\n\nPerfect! Here's how you could add Fantaisie-Impromptu in the A-tier with a fitting description:\n\nTier\tPiece\tComposer / Arranger\tDifficulty\tSuggested Fee\tURL\nA\tFantaisie-Impromptu (Op. 66)\tChopin\tFast, rhythmically complex, technically demanding but less intense than top Chopin Études\t450 €\thttps://www.youtube.com/watch?v=IvUU8joBb1Q\n\nIf you want, I can help you refine the difficulty wording or suggest a fee based on your current pricing scale!\n\n\n\n\n\n\n\n\n\n
| markdown | tab |
| 2 | 98 | extension-output-pdoom-org.crowd-code-#1-crowd-code | 0 | 0 | null | Log | tab |
| 3 | 851 | TERMINAL | 0 | 0 | python /Users/alfred/.cursor/extensions/ms-python.python-2025.6.1-darwin-arm64/python_files/printEnvVariablesToFile.py /Users/alfred/.cursor/extensions/ms-python.python-2025.6.1-darwin-arm64/python_files/deactivate/bash/envVars.txt | null | terminal_command |
| 4 | 867 | TERMINAL | 0 | 0 | ]633;E;python /Users/alfred/.cursor/extensions/ms-python.python-2025.6.1-darwin-arm64/python_files/printEnvVariablesToFile.py /Users/alfred/.cursor/extensions/ms-python.python-2025.6.1-darwin-arm64/python_files/deactivate/bash/envVars.txt;94f1541c-df11-407c-9963-13cd6466203f]633;C | null | terminal_output |
| 5 | 876 | extension-output-pdoom-org.crowd-code-#1-crowd-code | 0 | 0 | 12:51:37 AM [info] Activating crowd-code\n12:51:37 AM [info] Welcome back alfred. Your user-id is '05d9d5da933137c5402a176a469b618685c7e9142aa8972616ca5cdf0f6e53d1'. Happy coding!\n12:51:37 AM [info] Recording started\n | Log | content |
| 1 | 5 | shell_scripts/copy_project_files.sh | 0 | 0 | #!/bin/bash\n\nsrc_dir=/home/hk-project-pai00039/tum_ind3695/projects/jafar/*\ndst_dir=/home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn/\nexclude_file=./shell_scripts/exclude.txt\n\nrsync -av --progress --exclude-from=$exclude_file $src_dir $dst_dir\n | shellscript | tab |
| 2 | 2,067 | extension-output-pdoom-org.crowd-code-#1-crowd-code | 0 | 0 | 8:56:24 PM [info] Activating crowd-code\n8:56:24 PM [info] Welcome back tum_ind3695. Your user-id is '507ab0ec0dfe0c18ad7778dd15e072f92367194c94623114de802c8ed9c52e20'. Happy coding!\n8:56:24 PM [info] Recording started\n | Log | tab |
| 3 | 4,082 | TERMINAL | 0 | 0 | /bin/python3 /hkfs/home/project/hk-project-pai00039/tum_ind3695/.cursor-server/extensions/ms-python.python-2025.6.1-linux-x64/python_files/printEnvVariablesToFile.py /hkfs/home/project/hk-project-pai00039/tum_ind3695/.cursor-server/extensions/ms-python.python-2025.6.1-linux-x64/python_files/deactivate/bash/envVars.txt | null | terminal_command |
| 4 | 4,088 | TERMINAL | 0 | 0 | ]633;E;2025-06-26 20:56:27 /bin/python3 /hkfs/home/project/hk-project-pai00039/tum_ind3695/.cursor-server/extensions/ms-python.python-2025.6.1-linux-x64/python_files/printEnvVariablesToFile.py /hkfs/home/project/hk-project-pai00039/tum_ind3695/.cursor-server/extensions/ms-python.python-2025.6.1-linux-x64/python_files/deactivate/bash/envVars.txt;62c113bd-932a-454d-94e3-8b00607474b5]633;C]0;tum_ind3695@hkn1993:/hkfs/home/project/hk-project-pai00039/tum_ind3695/.cursor-server/extensions/ms-python.python-2025.6.1-linux-x64/python_files/deactivate/bash]633;D;0]633;P;Cwd=/hkfs/home/project/hk-project-pai00039/tum_ind3695/.cursor-server/extensions/ms-python.python-2025.6.1-linux-x64/python_files/deactivate/bash | null | terminal_output |
| 5 | 77,668 | extension-output-pdoom-org.crowd-code-#1-crowd-code | 218 | 0 | null | Log | selection_mouse |
| 6 | 79,454 | TERMINAL | 0 | 0 | null | null | terminal_focus |
| 7 | 80,180 | TERMINAL | 0 | 0 | bash | null | terminal_focus |
| 8 | 80,380 | TERMINAL | 0 | 0 | bash | null | terminal_focus |
| 9 | 88,404 | TERMINAL | 0 | 0 | cd ../jafar_dy^C | null | terminal_command |
| 10 | 88,411 | TERMINAL | 0 | 0 | ^C[?2004l\r[?2004h[?2004l\r\r\n]633;E;;477fd18a-b71b-4901-88d4-4122e769427c]633;C]0;tum_ind3695@hkn1993:~/projects/jafar]633;D | null | terminal_output |
| 11 | 94,116 | TERMINAL | 0 | 0 | cursor ../jafar_dyn/ | null | terminal_command |
| 12 | 94,176 | TERMINAL | 0 | 0 | ]633;E;2025-06-26 20:57:58 cursor ../jafar_dyn/;477fd18a-b71b-4901-88d4-4122e769427c]633;C | null | terminal_output |
| 13 | 94,327 | TERMINAL | 0 | 0 | ]0;tum_ind3695@hkn1993:~/projects/jafar]633;D;0 | null | terminal_output |
| 14 | 202,010 | TERMINAL | 0 | 0 | cd ../jafar_dyn/ | null | terminal_command |
| 15 | 202,019 | TERMINAL | 0 | 0 | ]633;E;2025-06-26 20:59:46 cd ../jafar_dyn/;477fd18a-b71b-4901-88d4-4122e769427c]633;C]0;tum_ind3695@hkn1993:~/projects/jafar_dyn]633;D;0 | null | terminal_output |
| 16 | 202,418 | TERMINAL | 0 | 0 | ls | null | terminal_command |
| 17 | 202,486 | TERMINAL | 0 | 0 | ]633;E;2025-06-26 20:59:46 ls;477fd18a-b71b-4901-88d4-4122e769427c]633;C | null | terminal_output |
| 18 | 202,619 | TERMINAL | 0 | 0 | generate_dataset.py [0m[01;34m__pycache__[0m [01;34mshell_scripts[0m train_tokenizer_cp.py\r\n[01;35mgeneration_video_0_1750688558.8735778.gif[0m README.md single_batch.npy train_tokenizer_logging.py\r\n[01;35mgeneration_video_0_1750694195.5326087.gif[0m requirements_franz.txt [01;34mslurm[0m train_tokenizer.py\r\n[01;35mgeneration_video_0_1750871746.3682885.gif[0m requirements.txt train_dynamics.py train_tokenizer_single_batch.py\r\ngenie.py sample.py train_dynamics_single_batch.py [01;34mutils[0m\r\nLICENSE sample_resolution_batches.py train_lam_cp.py [01;34mwandb[0m\r\n[01;34mlogs[0m [01;34msample_results[0m train_lam.py\r\n[01;34mmodels[0m [01;34msandbox[0m train_lam_single_batch.py\r\nnotes.md [01;34msbatch_scripts[0m train_lam_tf_seeding.py\r\n]0;tum_ind3695@hkn1993:~/projects/jafar_dyn]633;D;0 | null | terminal_output |
| 19 | 210,994 | TERMINAL | 0 | 0 | cd sbatch_scripts/coinrun/latent_action_ablation/ | null | terminal_command |
| 20 | 211,008 | TERMINAL | 0 | 0 | ]633;E;2025-06-26 20:59:54 cd sbatch_scripts/coinrun/latent_action_ablation/;477fd18a-b71b-4901-88d4-4122e769427c]633;C]0;tum_ind3695@hkn1993:~/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation]633;D;0 | null | terminal_output |
| 21 | 211,318 | TERMINAL | 0 | 0 | ls | null | terminal_command |
| 22 | 211,368 | TERMINAL | 0 | 0 | ]633;E;2025-06-26 20:59:55 ls;477fd18a-b71b-4901-88d4-4122e769427c]633;C | null | terminal_output |
| 23 | 211,428 | TERMINAL | 0 | 0 | train_dynamics_coinrun.sbatch train_lam_12.sbatch train_lam_24.sbatch train_lam_48.sbatch train_lam_6.sbatch train_tokenizer_coinrun.sbatch\r\n]0;tum_ind3695@hkn1993:~/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation]633;D;0 | null | terminal_output |
| 24 | 216,787 | TERMINAL | 0 | 0 | cd .. | null | terminal_command |
| 25 | 216,807 | TERMINAL | 0 | 0 | ]633;E;2025-06-26 21:00:00 cd ..;477fd18a-b71b-4901-88d4-4122e769427c]633;C]0;tum_ind3695@hkn1993:~/projects/jafar_dyn/sbatch_scripts/coinrun]633;D;0 | null | terminal_output |
| 26 | 217,962 | TERMINAL | 0 | 0 | cd .. | null | terminal_command |
| 27 | 217,969 | TERMINAL | 0 | 0 | ]633;E;2025-06-26 21:00:01 cd ..;477fd18a-b71b-4901-88d4-4122e769427c]633;C]0;tum_ind3695@hkn1993:~/projects/jafar_dyn/sbatch_scripts]633;D;0 | null | terminal_output |
| 28 | 219,437 | TERMINAL | 0 | 0 | cd .. | null | terminal_command |
| 29 | 219,465 | TERMINAL | 0 | 0 | ]633;E;2025-06-26 21:00:03 cd ..;477fd18a-b71b-4901-88d4-4122e769427c]633;C]0;tum_ind3695@hkn1993:~/projects/jafar_dyn]633;D;0 | null | terminal_output |
| 30 | 222,564 | TERMINAL | 0 | 0 | cursor . | null | terminal_command |
| 31 | 222,595 | TERMINAL | 0 | 0 | ]633;E;2025-06-26 21:00:06 cursor .;477fd18a-b71b-4901-88d4-4122e769427c]633;C | null | terminal_output |
| 32 | 222,733 | TERMINAL | 0 | 0 | ]0;tum_ind3695@hkn1993:~/projects/jafar_dyn]633;D;0]633;P;Cwd=/home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn | null | terminal_output |
| 33 | 236,149 | TERMINAL | 0 | 0 | cursor . | null | terminal_command |
| 34 | 236,200 | TERMINAL | 0 | 0 | ]633;E;2025-06-26 21:00:20 cursor .;477fd18a-b71b-4901-88d4-4122e769427c]633;C | null | terminal_output |
| 35 | 236,347 | TERMINAL | 0 | 0 | ]0;tum_ind3695@hkn1993:~/projects/jafar_dyn]633;D;0 | null | terminal_output |
| 36 | 267,275 | TERMINAL | 0 | 0 | ls | null | terminal_command |
| 37 | 267,288 | TERMINAL | 0 | 0 | ]633;E;2025-06-26 21:00:51 ls;477fd18a-b71b-4901-88d4-4122e769427c]633;Cgenerate_dataset.py [0m[01;34m__pycache__[0m [01;34mshell_scripts[0m train_tokenizer_cp.py\r\n[01;35mgeneration_video_0_1750688558.8735778.gif[0m README.md single_batch.npy train_tokenizer_logging.py\r\n[01;35mgeneration_video_0_1750694195.5326087.gif[0m requirements_franz.txt [01;34mslurm[0m train_tokenizer.py\r\n[01;35mgeneration_video_0_1750871746.3682885.gif[0m requirements.txt train_dynamics.py train_tokenizer_single_batch.py\r\ngenie.py sample.py train_dynamics_single_batch.py [01;34mutils[0m\r\nLICENSE sample_resolution_batches.py train_lam_cp.py [01;34mwandb[0m\r\n[01;34mlogs[0m [01;34msample_results[0m train_lam.py\r\n[01;34mmodels[0m [01;34msandbox[0m train_lam_single_batch.py\r\nnotes.md [01;34msbatch_scripts[0m train_lam_tf_seeding.py\r\n]0;tum_ind3695@hkn1993:~/projects/jafar_dyn]633;D;0]633;P;Cwd=/home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn | null | terminal_output |
| 38 | 287,576 | TERMINAL | 0 | 0 | cd sbatch_scripts/coinrun/latent_action_ablation/ | null | terminal_command |
| 39 | 287,583 | TERMINAL | 0 | 0 | ]633;E;2025-06-26 21:01:11 cd sbatch_scripts/coinrun/latent_action_ablation/;477fd18a-b71b-4901-88d4-4122e769427c]633;C]0;tum_ind3695@hkn1993:~/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation]633;D;0 | null | terminal_output |
| 40 | 288,010 | TERMINAL | 0 | 0 | ls | null | terminal_command |
| 41 | 288,019 | TERMINAL | 0 | 0 | ]633;E;2025-06-26 21:01:12 ls;477fd18a-b71b-4901-88d4-4122e769427c]633;Ctrain_dynamics_coinrun.sbatch train_lam_12.sbatch train_lam_24.sbatch train_lam_48.sbatch train_lam_6.sbatch train_tokenizer_coinrun.sbatch\r\n]0;tum_ind3695@hkn1993:~/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation]633;D;0 | null | terminal_output |
| 42 | 292,874 | TERMINAL | 0 | 0 | cursor train_dynamics_coinrun.sbatch | null | terminal_command |
| 43 | 292,917 | TERMINAL | 0 | 0 | ]633;E;2025-06-26 21:01:16 cursor train_dynamics_coinrun.sbatch ;477fd18a-b71b-4901-88d4-4122e769427c]633;C | null | terminal_output |
| 44 | 293,045 | TERMINAL | 0 | 0 | ]0;tum_ind3695@hkn1993:~/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation]633;D;0 | null | terminal_output |
| 45 | 293,232 | /home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation/train_dynamics_coinrun.sbatch | 0 | 0 | #!/usr/bin/env bash\n\n#SBATCH --nodes=1\n#SBATCH --ntasks-per-node=1\n#SBATCH --time=24:00:00\n#SBATCH --partition=accelerated\n#SBATCH --account=hk-project-p0023960\n#SBATCH --cpus-per-task=8\n#SBATCH --gres=gpu:4\n#SBATCH --output=logs/logs_training_tokenizer/%x_%j.log\n#SBATCH --error=logs/logs_training_tokenizer/%x_%j.log\n#SBATCH --mail-user=avocadoaling@gmail.com\n#SBATCH --job-name=train_tokenizer_coinrun\n#SBATCH --mail-type=ALL\n\n# Log the sbatch script\ncat $0\n\nmodule unload mpi/openmpi/5.0\nmodule unload devel/cuda/12.4\nsource .venv_jafar/bin/activate\n\njob_name=$SLURM_JOB_NAME\nslurm_job_id=$SLURM_JOB_ID\n\nws_dir='/hkfs/work/workspace/scratch/tum_ind3695-jafa_ws_shared'\n\nCHECKPOINT_DIR=$ws_dir/data/checkpoints/$job_name_$slurm_job_id\nLOG_DIR=$ws_dir/logs/$job_name_$slurm_job_id\nmkdir -p $CHECKPOINT_DIR\nmkdir -p $LOG_DIR\n\ndata_dir='/hkfs/work/workspace/scratch/tum_ind3695-jafa_ws_shared/data/coinrun/coinrun_tfrecords'\n\nsrun python train_tokenizer.py \\n --batch_size=192 \\n --experiment_name $job_name \\n --ckpt_dir $CHECKPOINT_DIR \\n --log_checkpoint_interval=1000 \\n --log_image_interval=10 \\n --image_height 64 \\n --image_width 64 \\n --log \\n --entity instant-uv \\n --data_dir $data_dir \\n --project jafar\n | shellscript | tab |
| 46 | 294,397 | /home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation/train_dynamics_coinrun.sbatch | 20 | 0 | null | shellscript | selection_command |
| 47 | 294,577 | /home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation/train_dynamics_coinrun.sbatch | 429 | 0 | null | shellscript | selection_command |
| 48 | 294,720 | /home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation/train_dynamics_coinrun.sbatch | 461 | 0 | null | shellscript | selection_command |
| 49 | 294,870 | /home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation/train_dynamics_coinrun.sbatch | 554 | 0 | null | shellscript | selection_command |
| 50 | 295,035 | /home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation/train_dynamics_coinrun.sbatch | 607 | 0 | null | shellscript | selection_command |
| 51 | 300,385 | /home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation/train_dynamics_coinrun.sbatch | 673 | 0 | null | shellscript | selection_command |
| 52 | 301,268 | /home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation/train_dynamics_coinrun.sbatch | 826 | 0 | null | shellscript | selection_command |
| 53 | 303,997 | /home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation/train_dynamics_coinrun.sbatch | 0 | 0 | null | shellscript | selection_command |
| 54 | 304,299 | /home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation/train_dynamics_coinrun.sbatch | 20 | 0 | null | shellscript | selection_command |
| 55 | 304,557 | /home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation/train_dynamics_coinrun.sbatch | 21 | 0 | null | shellscript | selection_command |
| 56 | 304,576 | /home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation/train_dynamics_coinrun.sbatch | 39 | 0 | null | shellscript | selection_command |
| 57 | 304,607 | /home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation/train_dynamics_coinrun.sbatch | 67 | 0 | null | shellscript | selection_command |
| 58 | 304,637 | /home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation/train_dynamics_coinrun.sbatch | 91 | 0 | null | shellscript | selection_command |
| 59 | 304,677 | /home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation/train_dynamics_coinrun.sbatch | 123 | 0 | null | shellscript | selection_command |
| 60 | 304,708 | /home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation/train_dynamics_coinrun.sbatch | 161 | 0 | null | shellscript | selection_command |
| 61 | 304,737 | /home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation/train_dynamics_coinrun.sbatch | 187 | 0 | null | shellscript | selection_command |
| 62 | 304,777 | /home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation/train_dynamics_coinrun.sbatch | 208 | 0 | null | shellscript | selection_command |
| 63 | 304,808 | /home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation/train_dynamics_coinrun.sbatch | 264 | 0 | null | shellscript | selection_command |
| 64 | 304,837 | /home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation/train_dynamics_coinrun.sbatch | 319 | 0 | null | shellscript | selection_command |
| 65 | 304,876 | /home/hk-project-pai00039/tum_ind3695/projects/jafar_dyn/sbatch_scripts/coinrun/latent_action_ablation/train_dynamics_coinrun.sbatch | 362 | 0 | null | shellscript | selection_command |