## 🛠️ Install
`probly` is intended to work with **Python 3.12 and above**. Installation can be done via `pip` or `uv`:

```sh
pip install probly
```

```sh
uv add probly
```

## ⭐ Quickstart

`probly` makes it easy to make models uncertainty-aware and to perform several downstream tasks:

```python
import probly

net = ...  # get a neural network
model = probly.transformation.dropout(net)  # turn the network into a Dropout model
train(model)  # train the model as usual

data = ...  # get data
data_ood = ...  # get out-of-distribution data
sampler = probly.representation.Sampler(model, num_samples=20)
sample = sampler.predict(data)  # predict an uncertainty representation
sample_ood = sampler.predict(data_ood)

eu = probly.quantification.classification.mutual_information(sample)  # quantify the model's epistemic uncertainty
eu_ood = probly.quantification.classification.mutual_information(sample_ood)

auroc = probly.evaluation.tasks.out_of_distribution_detection(eu, eu_ood)  # evaluate the model's uncertainty
```

## 📄 License
This project is licensed under the [MIT License](https://github.com/pwhofman/probly/blob/main/LICENSE).

---
Built with ❤️ by the probly team.
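To see what the mutual-information step in the quickstart is quantifying, here is a minimal pure-Python sketch of the classical decomposition it is based on: total predictive uncertainty minus expected per-pass uncertainty leaves the epistemic part. This is an illustrative stand-in, not `probly`'s actual implementation, which operates on tensors and may differ in detail.

```python
import math

def entropy(p):
    """Shannon entropy (in nats) of a discrete distribution, skipping zero entries."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def mutual_information(sample):
    """Epistemic uncertainty of a sample of predicted class distributions.

    ``sample`` is a list of softmax outputs, one per stochastic forward
    pass (e.g. one per dropout mask):

        MI = H(mean prediction) - mean(H(each prediction))
    """
    n = len(sample)
    num_classes = len(sample[0])
    mean_pred = [sum(p[k] for p in sample) / n for k in range(num_classes)]
    total = entropy(mean_pred)                       # total predictive uncertainty
    aleatoric = sum(entropy(p) for p in sample) / n  # expected per-pass uncertainty
    return total - aleatoric                         # the epistemic remainder

# Passes that agree carry no epistemic uncertainty ...
agree = [[0.7, 0.3]] * 20
# ... while confident but contradictory passes carry a lot.
disagree = [[0.99, 0.01], [0.01, 0.99]] * 10
```

This is why the out-of-distribution detection task in the quickstart works: on OOD inputs the dropout passes tend to disagree, so their mutual information is higher than on in-distribution inputs.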
+569,1951669,"README.md",0,0,"",markdown,tab
+570,2251188,"TERMINAL",0,0,"make html",,terminal_command
+571,2251236,"TERMINAL",0,0,"]633;C",,terminal_output
+572,2251408,"TERMINAL",0,0,"[01mRunning Sphinx v8.2.3[39;49;00m\r\n",,terminal_output
+573,2251459,"TERMINAL",0,0,"[01mloading translations [en]... [39;49;00mdone\r\n",,terminal_output
+574,2252490,"TERMINAL",0,0,"[01mloading pickled environment... [39;49;00m",,terminal_output
+575,2252669,"TERMINAL",0,0,"checking bibtex cache... out of date\r\nparsing bibtex file /Users/franzsrambical/Downloads/probly/docs/source/references.bib... parsed 19 entries\r\ndone\r\n",,terminal_output
+576,2252824,"TERMINAL",0,0,"[autosummary] generating autosummary for: api.rst, api/probly.datasets.rst, api/probly.datasets.torch.rst, api/probly.evaluation.metrics.rst, api/probly.evaluation.rst, api/probly.evaluation.tasks.rst, api/probly.layers.flax.rst, api/probly.layers.rst, api/probly.layers.torch.rst, api/probly.plot.credal.rst, ..., api/probly.traverse_nn.rst, api/probly.traverse_nn.torch.rst, api/probly.utils.errors.rst, api/probly.utils.probabilities.rst, api/probly.utils.rst, api/probly.utils.sets.rst, api/probly.utils.torch.rst, index.rst, methods.md, references.rst\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.masked.maskedtensor.reductions.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.cpu.amp.autocast_mode.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.intrinsic.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.accelerator.memory.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.utils.data.datapipes.iter.streamreader.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import 
probly.train.bayesian.torch.nn.quantized.dynamic.modules.conv.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.package.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.quantization.quant_type.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.quantizable.modules.rnn.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributed.algorithms.join.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributions.normal.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.fx.passes.infra.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributed.constants.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* 
AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.onnx.symbolic_helper.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.backends.cudnn.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributions.lkj_cholesky.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.quantized.reference.modules.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.fx.passes.graph_drawer.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributions.inverse_gamma.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributed.tensor.parallel.loss.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import 
probly.train.bayesian.torch.ao.nn.quantizable.modules.activation.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.package.importer.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.backends.cusparselt.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.quantized.dynamic.modules.rnn.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributed.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributions.distribution.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributions.transformed_distribution.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.utils.data.datapipes.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 
'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.export.exported_program.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.utils.convert_parameters.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.onnx.operators.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributed.remote_device.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.utils.fusion.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.utils.data.datapipes.map.callable.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import 
probly.train.bayesian.torch.fx.node.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.ns.fx.utils.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.quantized.modules.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.evidential.torch.\r\nPossible hints:\r\n* AttributeError: module 'probly.train.evidential' has no attribute 'torch'\r\n* ModuleNotFoundError: No module named 'probly.train.evidential.torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.cuda.nvtx.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.parallel.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributed.nn.api.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.mtia.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 
'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.quantized.reference.modules.sparse.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.utils.data.datapipes.map.combining.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.quantization.qconfig.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.profiler.itt.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.pruning.scheduler.base_scheduler.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.jit.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.cpu.amp.grad_scaler.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributions.kl.\r\nPossible hints:\r\n* ModuleNotFoundError: No 
module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.autograd.forward_ad.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.quantized.modules.embedding_ops.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributed.tensor.parallel.style.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.intrinsic.quantized.dynamic.modules.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributed.nn.api.remote_module.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.utils.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.quantized.functional.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 
'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.utils.data.datapipes.dataframe.dataframe_wrapper.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributed.distributed_c10d.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.amp.autocast_mode.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.export.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.package.find_file_dependencies.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.fx.experimental.recording.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.quantization.pt2e.export_utils.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.quantizable.modules.\r\nPossible 
hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.xpu.memory.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributions.wishart.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.qat.modules.embedding_ops.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.fx.experimental.proxy_tensor.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.transformation.evidential.regression.torch.\r\nPossible hints:\r\n* AttributeError: module 'probly.transformation.evidential.regression' has no attribute 'torch'\r\n* ModuleNotFoundError: No module named 'probly.transformation.evidential.regression.torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.utils.parametrize.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.modules.conv.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 
'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.intrinsic.quantized.dynamic.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.intrinsic.qat.modules.linear_relu.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.quantized.modules.conv.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.mtia.memory.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.utils.data.datapipes.map.combinatorics.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.quantization.fake_quantize.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.intrinsic.quantized.dynamic.modules.linear_relu.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to 
import probly.train.bayesian.torch.utils.data.datapipes.utils.decoder.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.xpu.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.layers.torch.\r\nPossible hints:\r\n* AttributeError: module 'probly.layers' has no attribute 'torch'\r\n* ModuleNotFoundError: No module named 'probly.layers.torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.fx.passes.param_fetch.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.quantized.modules.normalization.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributed.nn.jit.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.utils.data.datapipes.iter.filelister.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no 
attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.masked.maskedtensor.passthrough.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.modules.rnn.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.fx.subgraph_rewriter.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.fx.immutable_collections.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.fx.interpreter.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.pruning.scheduler.cubic_scheduler.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributions.transforms.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.fx.operator_schemas.\r\nPossible hints:\r\n* 
ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.package.glob_group.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributions.mixture_same_family.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributions.gumbel.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.fx.passes.split_utils.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.autograd.profiler_util.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.modules.upsampling.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.random.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: 
Failed to import probly.train.bayesian.torch.fx.passes.infra.pass_manager.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[... the same WARNING repeats, with identical hints, for many further probly.train.bayesian.torch.* submodules ...]\r\n",,terminal_output
+577,2253617,"TERMINAL",0,0,"[91mWARNING: Failed to import probly.transformation.evidential.classification.torch.\r\nPossible hints:\r\n* AttributeError: module 'probly.transformation.evidential.classification' has no attribute 'torch'\r\n* ModuleNotFoundError: No module named 'probly.transformation.evidential.classification.torch'[39;49;00m\r\n[... further repeats of the probly.train.bayesian.torch.* WARNING elided ...]\r\n",,terminal_output
+578,2253913,"TERMINAL",0,0,"[91mWARNING: Failed to import probly.utils.torch.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.utils.torch'\r\n* AttributeError: module 'probly.utils' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.datasets.torch.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.datasets.torch'\r\n* AttributeError: module 'probly.datasets' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.losses.\r\nPossible hints:\r\n* AttributeError: module 'probly.train' has no attribute 'losses'\r\n* ModuleNotFoundError: No module named 'probly.train.losses'[39;49;00m\r\n[... further repeats of the probly.train.bayesian.torch.* WARNING elided ...]\r\n",,terminal_output
+579,2254870,"TERMINAL",0,0,"[... repeats of the probly.train.bayesian.torch.* WARNING for further submodules, identical hints ...]\r\n",,terminal_output
+580,2256147,"TERMINAL",0,0,"[... repeats of the probly.train.bayesian.torch.* WARNING for further submodules, identical hints ...]\r\n[91mWARNING: Failed to 
import probly.train.bayesian.torch.optim.lr_scheduler.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.storage.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.backends.cuda.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.traverse_nn.torch.\r\nPossible hints:\r\n* AttributeError: module 'probly.traverse_nn' has no attribute 'torch'\r\n* ModuleNotFoundError: No module named 'probly.traverse_nn.torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.quantized.modules.batchnorm.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributions.geometric.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.autograd.functional.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.backends.openmp.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 
'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.modules.adaptive.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.sparse.quantized.linear.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.quantized.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.qat.dynamic.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributed.autograd.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.parallel.distributed.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributions.exponential.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.export.decomp_utils.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 
'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.masked.maskedtensor.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.pruning.scheduler.lambda_scheduler.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.version.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.cuda.profiler.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.utils.data.datapipes.dataframe.dataframes.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.utils.data.datapipes.map.grouping.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.utils.backend_registration.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: 
Failed to import probly.train.bayesian.torch.utils.data.datapipes.iter.grouping.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.backends.kleidiai.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.utils.dlpack.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.signal.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributed.fsdp.api.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.quantized.dynamic.modules.linear.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributed.tensor.device_mesh.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributions.independent.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 
'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.futures.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.export.custom_ops.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.quantized.dynamic.modules.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.export.graph_signature.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.quantized.dynamic.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.quantized.reference.modules.rnn.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.intrinsic.qat.modules.conv_fused.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to 
import probly.train.bayesian.torch.nn.modules.pixelshuffle.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.quantization.observer.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.backends.mha.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.profiler.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.intrinsic.quantized.modules.conv_relu.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.fx.passes.reinplace.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.compiler.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.quantized.reference.modules.linear.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* 
AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.sparse.quantized.dynamic.linear.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.modules.utils.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.modules.linear.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.masked.maskedtensor.core.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.intrinsic.modules.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.library.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.quantized.modules.embedding_ops.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import 
probly.train.bayesian.torch.ao.nn.intrinsic.quantized.modules.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.utils.throughput_benchmark.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributions.laplace.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.autograd.function.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.fx.passes.utils.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.fx.graph_module.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.quantization.quantization_mappings.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributed.rpc.functions.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 
'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.cuda.memory.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.sparse.quantized.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.utils.clip_grad.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributions.constraint_registry.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributed.nn.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.quantized.dynamic.modules.conv.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.compiler.config.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import 
probly.train.bayesian.torch.ao.pruning.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.quantized.modules.rnn.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.autograd.grad_mode.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.quantized.dynamic.modules.rnn.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.quantization.fuser_method_mappings.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.utils.data.datapipes.iter.fileopener.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.quantized.modules.rnn.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.masked.maskedtensor.binary.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 
'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.common_types.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.cuda.amp.autocast_mode.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributions.poisson.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.backends.miopen.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributed.device_mesh.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.quantizable.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.package.package_exporter.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import 
probly.train.bayesian.torch.quantization.stubs.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributed.rpc.internal.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.package.analyze.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributions.dirichlet.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.fx.traceback.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.export.dynamic_shapes.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributed.rpc.constants.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.quantizable.modules.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 
'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.utils.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.utils.model_zoo.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.nn.utils.parametrizations.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributions.cauchy.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributed.fsdp.wrap.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.cuda.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.distributed.utils.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.cuda.tunable.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 
'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.utils.data.datapipes.iter.utils.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.quantization.qconfig.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.ao.nn.quantized.modules.functional_modules.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.sparse.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: Failed to import probly.train.bayesian.torch.utils.data.datapipes.dataframe.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.t\n[... 34806 bytes truncated to respect terminal scrollback settings ...]\n",,terminal_output
+581,2256994,"TERMINAL",0,0,"[91mWARNING: [autosummary] failed to import probly.datasets.torch.\r\nPossible hints:\r\n* ImportError: \r\n* ModuleNotFoundError: No module named 'probly.datasets.torch'\r\n* AttributeError: module 'probly.datasets' has no attribute 'torch'[39;49;00m\r\n[91mWARNING: [autosummary] failed to import probly.layers.torch.\r\nPossible hints:\r\n* AttributeError: module 'probly.layers' has no attribute 'torch'\r\n* ImportError: \r\n* ModuleNotFoundError: No module named 'probly.layers.torch'[39;49;00m\r\n",,terminal_output
+582,2257094,"TERMINAL",0,0,"[91mWARNING: [autosummary] failed to import probly.representation.credal_set.torch.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.representation.credal_set.torch'\r\n* ImportError: \r\n* AttributeError: module 'probly.representation.credal_set' has no attribute 'torch'[39;49;00m\r\n",,terminal_output
+583,2257227,"TERMINAL",0,0,"WARNING: [autosummary] failed to import probly.train.bayesian.torch.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.bayesian.torch'\r\n* ImportError: \r\n* AttributeError: module 'probly.train.bayesian' has no attribute 'torch'\r\n\r\nExtension error (sphinx.ext.autosummary)!\r\n\r\nVersions\r\n========\r\n\r\n* Platform: darwin; (macOS-15.6-arm64-arm-64bit-Mach-O)\r\n* Python version: 3.13.7 (CPython)\r\n* Sphinx version: 8.2.3\r\n* Docutils version: 0.21.2\r\n* Jinja2 version: 3.1.4\r\n* Pygments version: 2.19.1\r\n\r\nLast Messages\r\n=============\r\n\r\nNone.\r\n\r\nLoaded Extensions\r\n=================\r\n\r\nNone.\r\n\r\nTraceback\r\n=========\r\n\r\n File ""/Users/franzsrambical/Downloads/probly/.venv/lib/python3.13/site-packages/sphinx/events.py"", line 415, in emit\r\n raise ExtensionError(\r\n ...<4 lines>...\r\n ) from exc\r\n sphinx.errors.ExtensionError: Handler for event 'builder-inited' threw an exception (exception: no module named probly.train.bayesian.torch.accelerator)\r\n\r\n\r\nThe full traceback has been saved in:\r\n/var/folders/nn/241fnlwx03d7k7qt2jg98txr0000gn/T/sphinx-err-hxsij5k6.log\r\n\r\nTo report this error to the developers, please open an issue at . Thanks!\r\nPlease also report this if it was a user error, so that a better error message can be provided next time.\r\n",,terminal_output
+584,2258121,"TERMINAL",0,0,"make: *** [html] Error 2\r\n",,terminal_output
+585,3675279,"TERMINAL",0,0,"make html",,terminal_command
+586,3675328,"TERMINAL",0,0,"",,terminal_output
+587,3675681,"TERMINAL",0,0,"Running Sphinx v8.2.3\r\n",,terminal_output
+588,3675758,"TERMINAL",0,0,"loading translations [en]... done\r\n",,terminal_output
+589,3676895,"TERMINAL",0,0,"loading pickled environment... ",,terminal_output
+590,3677039,"TERMINAL",0,0,"checking bibtex cache... up to date\r\nThe configuration has changed (3 options: 'html_css_files', 'html_static_path', 'pygments_dark_style')\r\ndone\r\n[autosummary] generating autosummary for: api.rst, api/probly.datasets._torch.rst, api/probly.datasets.rst, api/probly.evaluation.metrics.rst, api/probly.evaluation.rst, api/probly.evaluation.tasks.rst, api/probly.layers._torch.rst, api/probly.layers.flax.rst, api/probly.layers.rst, api/probly.plot.credal.rst, ..., api/probly.traverse_nn.flax.rst, api/probly.traverse_nn.rst, api/probly.utils._torch.rst, api/probly.utils.errors.rst, api/probly.utils.probabilities.rst, api/probly.utils.rst, api/probly.utils.sets.rst, index.rst, methods.md, references.rst\r\n",,terminal_output
+591,3679305,"TERMINAL",0,0,"WARNING: Failed to import probly.train.losses.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.losses'\r\n* AttributeError: module 'probly.train' has no attribute 'losses'\r\n",,terminal_output
+592,3680846,"TERMINAL",0,0,"myst v4.0.1: MdParserConfig(commonmark_only=False, gfm_only=False, enable_extensions=set(), disable_syntax=[], all_links_external=False, links_external_new_tab=False, url_schemes=('http', 'https', 'mailto', 'ftp'), ref_domains=None, fence_as_directive=set(), number_code_blocks=[], title_to_header=False, heading_anchors=0, heading_slug_func=None, html_meta={}, footnote_sort=True, footnote_transition=True, words_per_minute=200, substitutions={}, linkify_fuzzy_links=True, dmath_allow_labels=True, dmath_allow_space=True, dmath_allow_digits=True, dmath_double_inline=False, update_mathjax=True, mathjax_classes='tex2jax_process|mathjax_process|math|output_area', enable_checkboxes=False, suppress_warnings=[], highlight_code_blocks=True)\r\nmyst-nb v1.2.0: NbParserConfig(custom_formats={}, metadata_key='mystnb', cell_metadata_key='mystnb', kernel_rgx_aliases={}, eval_name_regex='^[a-zA-Z_][a-zA-Z0-9_]*$', execution_mode='off', execution_cache_path='', execution_excludepatterns=(), execution_timeout=30, execution_in_temp=False, execution_allow_errors=False, execution_raise_on_error=False, execution_show_tb=False, merge_streams=False, render_plugin='default', remove_code_source=False, remove_code_outputs=False, code_prompt_show='Show code cell {type}', code_prompt_hide='Hide code cell {type}', number_source_lines=False, output_stderr='show', render_text_lexer='myst-ansi', render_error_lexer='ipythontb', render_image_options={}, render_figure_options={}, render_markdown_format='commonmark', output_folder='build', append_css=True, metadata_to_fm=False)\r\nUsing jupyter-cache at: /Users/franzsrambical/Downloads/probly/docs/build/.jupyter_cache\r\nbuilding [mo]: targets for 0 po files that are out of date\r\nwriting output... \r\nbuilding [html]: targets for 0 source files that are out of date\r\nupdating environment: 0 added, 5 changed, 0 removed\r\nreading sources... [ 40%] api/probly.train.bayesian._torch",,terminal_output
+593,3680910,"TERMINAL",0,0,"reading sources... [100%] index\r\n",,terminal_output
+594,3680968,"TERMINAL",0,0,"WARNING: autodoc: failed to import module 'losses' from module 'probly.train'; the following exception was raised:\r\n['Traceback (most recent call last):\n', ' File ""/Users/franzsrambical/Downloads/probly/.venv/lib/python3.13/site-packages/sphinx/ext/autodoc/importer.py"", line 269, in import_object\n module = import_module(modname, try_reload=True)\n', ' File ""/Users/franzsrambical/Downloads/probly/.venv/lib/python3.13/site-packages/sphinx/ext/autodoc/importer.py"", line 172, in import_module\n raise ModuleNotFoundError(msg, name=modname) # NoQA: TRY301\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n', ""ModuleNotFoundError: No module named 'probly.train.losses'\n""] [autodoc.import_object]\r\n/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/fashionmnist_ood_ensemble' [toc.not_readable]\r\n/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/label_relaxation_calibration' [toc.not_readable]\r\n/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/sklearn_selective_prediction' [toc.not_readable]\r\n/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/synthetic_regression_dropout' [toc.not_readable]\r\n/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/temperature_scaling_calibration' [toc.not_readable]\r\n/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/train_bnn_classification' [toc.not_readable]\r\n/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/train_evidential_classification' [toc.not_readable]\r\n/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/train_evidential_regression' [toc.not_readable]\r\n/Users/franzsrambical/Downloads/probly/docs/source/index.rst:79: WARNING: toctree contains reference to nonexisting document 'contributing' [toc.not_readable]\r\nlooking for now-outdated files... none found\r\npickling environment... ",,terminal_output
+595,3681066,"TERMINAL",0,0,"done\r\nchecking consistency... /Users/franzsrambical/Downloads/probly/docs/source/api/probly.datasets._torch.rst: WARNING: document isn't included in any toctree [toc.not_included]\r\n/Users/franzsrambical/Downloads/probly/docs/source/api/probly.layers._torch.rst: WARNING: document isn't included in any toctree [toc.not_included]\r\n/Users/franzsrambical/Downloads/probly/docs/source/api/probly.representation.credal_set._torch.rst: WARNING: document isn't included in any toctree [toc.not_included]\r\n/Users/franzsrambical/Downloads/probly/docs/source/api/probly.train.bayesian._torch.rst: WARNING: document isn't included in any toctree [toc.not_included]\r\n/Users/franzsrambical/Downloads/probly/docs/source/api/probly.train.calibration._torch.rst: WARNING: document isn't included in any toctree [toc.not_included]\r\n/Users/franzsrambical/Downloads/probly/docs/source/api/probly.train.evidential._torch.rst: WARNING: document isn't included in any toctree [toc.not_included]\r\n/Users/franzsrambical/Downloads/probly/docs/source/api/probly.train.losses.rst: WARNING: document isn't included in any toctree [toc.not_included]\r\n/Users/franzsrambical/Downloads/probly/docs/source/api/probly.transformation.evidential.classification._torch.rst: WARNING: document isn't included in any toctree [toc.not_included]\r\n/Users/franzsrambical/Downloads/probly/docs/source/api/probly.transformation.evidential.regression._torch.rst: WARNING: document isn't included in any toctree [toc.not_included]\r\n/Users/franzsrambical/Downloads/probly/docs/source/api/probly.traverse_nn._torch.rst: WARNING: document isn't included in any toctree [toc.not_included]\r\n/Users/franzsrambical/Downloads/probly/docs/source/api/probly.utils._torch.rst: WARNING: document isn't included in any toctree [toc.not_included]\r\ndone\r\npreparing documents... done\r\ncopying assets... \r\ncopying static files... \r\nWriting evaluated template result to /Users/franzsrambical/Downloads/probly/docs/build/html/_static/basic.css\r\nWriting evaluated template result to /Users/franzsrambical/Downloads/probly/docs/build/html/_static/language_data.js\r\nWriting evaluated template result to /Users/franzsrambical/Downloads/probly/docs/build/html/_static/documentation_options.js\r\nWriting evaluated template result to /Users/franzsrambical/Downloads/probly/docs/build/html/_static/copybutton.js\r\ncopying static files: done\r\ncopying extra files... \r\ncopying extra files: done\r\ncopying assets: done\r\nwriting output... [ 17%] api/probly.representation.sampling",,terminal_output
+596,3681257,"TERMINAL",0,0,"writing output... [100%] index",,terminal_output
+597,3681318,"TERMINAL",0,0,"\r\ngenerating indices... genindex ",,terminal_output
+598,3681569,"TERMINAL",0,0,"py-modindex done\r\nhighlighting module code... [100%] probly.utils.sets\r\nwriting additional pages... search done\r\ndumping search index in English (code: en)... done\r\ndumping object inventory... done\r\n\r\n====================== slowest reading durations =======================\r\n0.050 api/probly.train.bayesian._torch\r\n0.016 api/probly.representation.sampling.torch_sampler\r\n0.007 api/probly.transformation.evidential.regression._torch\r\n0.003 index\r\n0.001 api/probly.train.losses\r\nbuild succeeded, 22 warnings.\r\n\r\nThe HTML pages are in build/html.\r\n",,terminal_output
+599,3682312,"TERMINAL",0,0,"",,terminal_output
+600,3838388,"TERMINAL",0,0,"git stash",,terminal_command
+601,3838437,"TERMINAL",0,0,"]633;C",,terminal_output
+602,3838477,"TERMINAL",0,0,"Saved working directory and index state WIP on (no branch): e2f73f8 Initialize evidential classification test package\r\n",,terminal_output
+603,3841325,"TERMINAL",0,0,"make html",,terminal_command
+604,3841374,"TERMINAL",0,0,"",,terminal_output
+605,3841639,"TERMINAL",0,0,"Running Sphinx v8.2.3\r\n",,terminal_output
+606,3841728,"TERMINAL",0,0,"loading translations [en]... done\r\n",,terminal_output
+607,3842729,"TERMINAL",0,0,"loading pickled environment... ",,terminal_output
+608,3842878,"TERMINAL",0,0,"checking bibtex cache... up to date\r\nThe configuration has changed (3 options: 'html_css_files', 'html_static_path', 'pygments_dark_style')\r\ndone\r\n[autosummary] generating autosummary for: api.rst, api/probly.datasets._torch.rst, api/probly.datasets.rst, api/probly.evaluation.metrics.rst, api/probly.evaluation.rst, api/probly.evaluation.tasks.rst, api/probly.layers._torch.rst, api/probly.layers.flax.rst, api/probly.layers.rst, api/probly.plot.credal.rst, ..., api/probly.traverse_nn.flax.rst, api/probly.traverse_nn.rst, api/probly.utils._torch.rst, api/probly.utils.errors.rst, api/probly.utils.probabilities.rst, api/probly.utils.rst, api/probly.utils.sets.rst, index.rst, methods.md, references.rst\r\n",,terminal_output
+609,3846357,"TERMINAL",0,0,"WARNING: Failed to import probly.train.losses.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.losses'\r\n* AttributeError: module 'probly.train' has no attribute 'losses'\r\n",,terminal_output
+610,3847099,"TERMINAL",0,0,"[autosummary] generating autosummary for: /Users/franzsrambical/Downloads/probly/docs/source/api/probly.datasets.rst, /Users/franzsrambical/Downloads/probly/docs/source/api/probly.layers.rst, /Users/franzsrambical/Downloads/probly/docs/source/api/probly.representation.credal_set.rst, /Users/franzsrambical/Downloads/probly/docs/source/api/probly.train.bayesian.rst, /Users/franzsrambical/Downloads/probly/docs/source/api/probly.train.calibration.rst, /Users/franzsrambical/Downloads/probly/docs/source/api/probly.train.evidential.rst, /Users/franzsrambical/Downloads/probly/docs/source/api/probly.transformation.evidential.classification.rst, /Users/franzsrambical/Downloads/probly/docs/source/api/probly.transformation.evidential.regression.rst, /Users/franzsrambical/Downloads/probly/docs/source/api/probly.traverse_nn.rst, /Users/franzsrambical/Downloads/probly/docs/source/api/probly.utils.rst\r\n",,terminal_output
+611,3847401,"TERMINAL",0,0,"[autosummary] generating autosummary for: /Users/franzsrambical/Downloads/probly/docs/source/api/probly.datasets.torch.rst, /Users/franzsrambical/Downloads/probly/docs/source/api/probly.layers.flax.rst, /Users/franzsrambical/Downloads/probly/docs/source/api/probly.layers.torch.rst, /Users/franzsrambical/Downloads/probly/docs/source/api/probly.representation.credal_set.torch.rst, /Users/franzsrambical/Downloads/probly/docs/source/api/probly.train.bayesian.torch.rst, /Users/franzsrambical/Downloads/probly/docs/source/api/probly.train.calibration.torch.rst, /Users/franzsrambical/Downloads/probly/docs/source/api/probly.train.evidential.torch.rst, /Users/franzsrambical/Downloads/probly/docs/source/api/probly.transformation.evidential.classification.torch.rst, /Users/franzsrambical/Downloads/probly/docs/source/api/probly.transformation.evidential.regression.torch.rst, /Users/franzsrambical/Downloads/probly/docs/source/api/probly.traverse_nn.torch.rst, /Users/franzsrambical/Downloads/probly/docs/source/api/probly.utils.torch.rst\r\nmyst v4.0.1: MdParserConfig(commonmark_only=False, gfm_only=False, enable_extensions=set(), disable_syntax=[], all_links_external=False, links_external_new_tab=False, url_schemes=('http', 'https', 'mailto', 'ftp'), ref_domains=None, fence_as_directive=set(), number_code_blocks=[], title_to_header=False, heading_anchors=0, heading_slug_func=None, html_meta={}, footnote_sort=True, footnote_transition=True, words_per_minute=200, substitutions={}, linkify_fuzzy_links=True, dmath_allow_labels=True, dmath_allow_space=True, dmath_allow_digits=True, dmath_double_inline=False, update_mathjax=True, mathjax_classes='tex2jax_process|mathjax_process|math|output_area', enable_checkboxes=False, suppress_warnings=[], highlight_code_blocks=True)\r\nmyst-nb v1.2.0: NbParserConfig(custom_formats={}, metadata_key='mystnb', cell_metadata_key='mystnb', kernel_rgx_aliases={}, eval_name_regex='^[a-zA-Z_][a-zA-Z0-9_]*$', execution_mode='off', execution_cache_path='', execution_excludepatterns=(), execution_timeout=30, execution_in_temp=False, execution_allow_errors=False, execution_raise_on_error=False, execution_show_tb=False, merge_streams=False, render_plugin='default', remove_code_source=False, remove_code_outputs=False, code_prompt_show='Show code cell {type}', code_prompt_hide='Hide code cell {type}', number_source_lines=False, output_stderr='show', render_text_lexer='myst-ansi', render_error_lexer='ipythontb', render_image_options={}, render_figure_options={}, render_markdown_format='commonmark', output_folder='build', append_css=True, metadata_to_fm=False)\r\nUsing jupyter-cache at: /Users/franzsrambical/Downloads/probly/docs/build/.jupyter_cache\r\nbuilding [mo]: targets for 0 po files that are out of date\r\nwriting output... \r\nbuilding [html]: targets for 11 source files that are out of date\r\nupdating environment: 10 added, 14 changed, 0 removed\r\nreading sources... [ 8%] api/probly.datasets.torch",,terminal_output
+612,3847611,"TERMINAL",0,0,"reading sources... [ 17%] api/probly.layers.flax\r\n:6: (ERROR/3) Unexpected indentation.\r\n:7: (WARNING/2) Block quote ends without a blank line; unexpected unindent.\r\nreading sources... [ 21%] api/probly.layers.torch\r\n:3: (WARNING/2) Inline emphasis start-string without end-string.\r\n:3: (WARNING/2) Inline emphasis start-string without end-string.\r\n",,terminal_output
+613,3847983,"TERMINAL",0,0,"reading sources... [ 92%] api/probly.utils\r\n:5: (ERROR/3) Unexpected indentation.\r\n:7: (WARNING/2) Block quote ends without a blank line; unexpected unindent.\r\n:5: (ERROR/3) Unexpected indentation.\r\n:10: (WARNING/2) Block quote ends without a blank line; unexpected unindent.\r\nreading sources... [100%] index\r\n",,terminal_output
+614,3848040,"TERMINAL",0,0,"/Users/franzsrambical/Downloads/probly/src/probly/layers/torch.py:docstring of probly.layers.torch.BayesConv2d.reset_parameters:6: WARNING: Inline emphasis start-string without end-string. [docutils]\r\n/Users/franzsrambical/Downloads/probly/src/probly/layers/torch.py:docstring of probly.layers.torch.BayesLinear.reset_parameters:6: WARNING: Inline emphasis start-string without end-string. [docutils]\r\nWARNING: Cannot resolve forward reference in type annotations of ""probly.representation.credal_set.CredalSet.lower"": name 'T' is not defined\r\nWARNING: Cannot resolve forward reference in type annotations of ""probly.representation.credal_set.CredalSet.upper"": name 'T' is not defined\r\nWARNING: autodoc: failed to import module 'losses' from module 'probly.train'; the following exception was raised:\r\n['Traceback (most recent call last):\n', ' File ""/Users/franzsrambical/Downloads/probly/.venv/lib/python3.13/site-packages/sphinx/ext/autodoc/importer.py"", line 269, in import_object\n module = import_module(modname, try_reload=True)\n', ' File ""/Users/franzsrambical/Downloads/probly/.venv/lib/python3.13/site-packages/sphinx/ext/autodoc/importer.py"", line 172, in import_module\n raise ModuleNotFoundError(msg, name=modname) # NoQA: TRY301\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n', ""ModuleNotFoundError: No module named 'probly.train.losses'\n""] [autodoc.import_object]\r\n/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/fashionmnist_ood_ensemble' [toc.not_readable]\r\n/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/label_relaxation_calibration' [toc.not_readable]\r\n/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/sklearn_selective_prediction' [toc.not_readable]\r\n/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/synthetic_regression_dropout' [toc.not_readable]\r\n/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/temperature_scaling_calibration' [toc.not_readable]\r\n/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/train_bnn_classification' [toc.not_readable]\r\n/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/train_evidential_classification' [toc.not_readable]\r\n/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/train_evidential_regression' [toc.not_readable]\r\n/Users/franzsrambical/Downloads/probly/docs/source/index.rst:79: WARNING: toctree contains reference to nonexisting document 'contributing' [toc.not_readable]\r\nlooking for now-outdated files... none found\r\npickling environment... ",,terminal_output
+615,3848147,"TERMINAL",0,0,"done\r\nchecking consistency... /Users/franzsrambical/Downloads/probly/docs/source/api/probly.datasets._torch.rst: WARNING: document isn't included in any toctree [toc.not_included]\r\n/Users/franzsrambical/Downloads/probly/docs/source/api/probly.layers._torch.rst: WARNING: document isn't included in any toctree [toc.not_included]\r\n/Users/franzsrambical/Downloads/probly/docs/source/api/probly.representation.credal_set._torch.rst: WARNING: document isn't included in any toctree [toc.not_included]\r\n/Users/franzsrambical/Downloads/probly/docs/source/api/probly.train.bayesian._torch.rst: WARNING: document isn't included in any toctree [toc.not_included]\r\n/Users/franzsrambical/Downloads/probly/docs/source/api/probly.train.calibration._torch.rst: WARNING: document isn't included in any toctree [toc.not_included]\r\n/Users/franzsrambical/Downloads/probly/docs/source/api/probly.train.evidential._torch.rst: WARNING: document isn't included in any toctree [toc.not_included]\r\n/Users/franzsrambical/Downloads/probly/docs/source/api/probly.train.losses.rst: WARNING: document isn't included in any toctree [toc.not_included]\r\n/Users/franzsrambical/Downloads/probly/docs/source/api/probly.transformation.evidential.classification._torch.rst: WARNING: document isn't included in any toctree [toc.not_included]\r\n/Users/franzsrambical/Downloads/probly/docs/source/api/probly.transformation.evidential.regression._torch.rst: WARNING: document isn't included in any toctree [toc.not_included]\r\n/Users/franzsrambical/Downloads/probly/docs/source/api/probly.traverse_nn._torch.rst: WARNING: document isn't included in any toctree [toc.not_included]\r\n/Users/franzsrambical/Downloads/probly/docs/source/api/probly.utils._torch.rst: WARNING: document isn't included in any toctree [toc.not_included]\r\ndone\r\npreparing documents... done\r\ncopying assets... \r\ncopying static files... \r\nWriting evaluated template result to /Users/franzsrambical/Downloads/probly/docs/build/html/_static/basic.css\r\nWriting evaluated template result to /Users/franzsrambical/Downloads/probly/docs/build/html/_static/language_data.js\r\nWriting evaluated template result to /Users/franzsrambical/Downloads/probly/docs/build/html/_static/documentation_options.js\r\nWriting evaluated template result to /Users/franzsrambical/Downloads/probly/docs/build/html/_static/copybutton.js\r\ncopying static files: done\r\ncopying extra files... \r\ncopying extra files: done\r\ncopying assets: done\r\nwriting output... [ 3%] api",,terminal_output
+616,3848286,"TERMINAL",0,0,"writing output... [ 10%] api/probly.datasets.torch",,terminal_output
+617,3849159,"TERMINAL",0,0,"writing output... [ 93%] api/probly.utils",,terminal_output
+618,3849253,"TERMINAL",0,0,"[2K[01mwriting output... [39;49;00m[ 97%] [32mapi/probly.utils.torch[39;49;00m\r[2K[01mwriting output... [39;49;00m[100%] [32mindex[39;49;00m\r",,terminal_output
+619,3849682,"TERMINAL",0,0,"\r\n[01mgenerating indices... [39;49;00mgenindex py-modindex done\r\n[2K[01mhighlighting module code... [39;49;00m[ 2%] [94mbuiltins[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 4%] [94mprobly.datasets._torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 7%] [94mprobly.datasets.torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 9%] [94mprobly.evaluation.metrics[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 11%] [94mprobly.evaluation.tasks[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 13%] [94mprobly.layers._torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 16%] [94mprobly.layers.flax[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 18%] [94mprobly.layers.torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 20%] [94mprobly.plot.credal[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 22%] [94mprobly.predictor[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 24%] [94mprobly.quantification.classification[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 27%] [94mprobly.quantification.regression[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 29%] [94mprobly.representation.credal_set._torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 31%] [94mprobly.representation.credal_set.credal_set[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 33%] [94mprobly.representation.credal_set.jax[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 36%] [94mprobly.representation.credal_set.torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 38%] [94mprobly.representation.representer[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 40%] [94mprobly.representation.sampling.flax_sampler[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 42%] [94mprobly.representation.sampling.jax_sample[39;49;00m\r[2K[01mhighlighting module code... 
[39;49;00m[ 44%] [94mprobly.representation.sampling.sample[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 47%] [94mprobly.representation.sampling.sampler[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 49%] [94mprobly.representation.sampling.torch_sample[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 51%] [94mprobly.representation.sampling.torch_sampler[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 53%] [94mprobly.train.bayesian._torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 56%] [94mprobly.train.bayesian.torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 58%] [94mprobly.train.calibration._torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 60%] [94mprobly.train.calibration.torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 62%] [94mprobly.train.evidential._torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 64%] [94mprobly.train.evidential.torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 67%] [94mprobly.transformation.bayesian.common[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 69%] [94mprobly.transformation.dropconnect.common[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 71%] [94mprobly.transformation.dropout.common[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 73%] [94mprobly.transformation.ensemble.common[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 76%] [94mprobly.transformation.evidential.classification._torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 78%] [94mprobly.transformation.evidential.classification.common[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 80%] [94mprobly.transformation.evidential.classification.torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 82%] [94mprobly.transformation.evidential.regression._torch[39;49;00m\r[2K[01mhighlighting module code... 
[39;49;00m[ 84%] [94mprobly.transformation.evidential.regression.common[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 87%] [94mprobly.transformation.evidential.regression.torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 89%] [94mprobly.traverse_nn.common[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 91%] [94mprobly.utils._torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 93%] [94mprobly.utils.errors[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 96%] [94mprobly.utils.probabilities[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 98%] [94mprobly.utils.sets[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[100%] [94mprobly.utils.torch[39;49;00m\r\r\n[01mwriting additional pages... [39;49;00msearch done\r\n[01mdumping search index in English (code: en)... [39;49;00mdone\r\n[01mdumping object inventory... [39;49;00mdone\r\n\r\n====================== slowest reading durations =======================\r\n0.104 api/probly.datasets.torch\r\n0.083 api/probly.layers.torch\r\n0.039 api/probly.layers.flax\r\n0.035 api/probly.train.evidential.torch\r\n0.032 api/probly.utils\r\n[01mbuild succeeded, 26 warnings.[39;49;00m\r\n\r\nThe HTML pages are in build/html.\r\n",,terminal_output
+620,3850403,"TERMINAL",0,0,"[1m[7m%[27m[1m[0m \r \r",,terminal_output
+621,3872716,"TERMINAL",0,0,"cd /Users/franzsrambical/Downloads/probly/docs/source/api && \find . -name ""*.torch.*.rst"" -type f -delete",,terminal_command
+622,3872735,"TERMINAL",0,0,"]633;C[1m[7m%[27m[1m[0m \r \r",,terminal_output
+623,3882119,"TERMINAL",0,0,"make html",,terminal_command
+624,3882131,"TERMINAL",0,0,"]633;Cmake: *** No rule to make target `html'. Stop.\r\n[1m[7m%[27m[1m[0m \r \r",,terminal_output
+625,3885000,"TERMINAL",0,0,"cd ..",,terminal_command
+626,3885000,"TERMINAL",0,0,"]633;C[1m[7m%[27m[1m[0m \r \r",,terminal_output
+627,3886558,"TERMINAL",0,0,"cd ..",,terminal_command
+628,3886559,"TERMINAL",0,0,"]633;C[1m[7m%[27m[1m[0m \r \r",,terminal_output
+629,3887742,"TERMINAL",0,0,"make html",,terminal_command
+630,3887791,"TERMINAL",0,0,"]633;C",,terminal_output
+631,3888002,"TERMINAL",0,0,"[01mRunning Sphinx v8.2.3[39;49;00m\r\n",,terminal_output
+632,3888067,"TERMINAL",0,0,"[01mloading translations [en]... [39;49;00mdone\r\n",,terminal_output
+633,3889020,"TERMINAL",0,0,"[01mloading pickled environment... [39;49;00m",,terminal_output
+634,3889158,"TERMINAL",0,0,"checking bibtex cache... up to date\r\nThe configuration has changed (3 options: 'html_css_files', 'html_static_path', 'pygments_dark_style')\r\ndone\r\n[autosummary] generating autosummary for: api.rst, api/probly.datasets._torch.rst, api/probly.datasets.rst, api/probly.datasets.torch.rst, api/probly.evaluation.metrics.rst, api/probly.evaluation.rst, api/probly.evaluation.tasks.rst, api/probly.layers._torch.rst, api/probly.layers.flax.rst, api/probly.layers.rst, ..., api/probly.traverse_nn.torch.rst, api/probly.utils._torch.rst, api/probly.utils.errors.rst, api/probly.utils.probabilities.rst, api/probly.utils.rst, api/probly.utils.sets.rst, api/probly.utils.torch.rst, index.rst, methods.md, references.rst\r\n",,terminal_output
+635,3890968,"TERMINAL",0,0,"[91mWARNING: Failed to import probly.train.losses.\r\nPossible hints:\r\n* ModuleNotFoundError: No module named 'probly.train.losses'\r\n* AttributeError: module 'probly.train' has no attribute 'losses'[39;49;00m\r\n",,terminal_output
+636,3892887,"TERMINAL",0,0,"[01mmyst v4.0.1:[39;49;00m MdParserConfig(commonmark_only=False, gfm_only=False, enable_extensions=set(), disable_syntax=[], all_links_external=False, links_external_new_tab=False, url_schemes=('http', 'https', 'mailto', 'ftp'), ref_domains=None, fence_as_directive=set(), number_code_blocks=[], title_to_header=False, heading_anchors=0, heading_slug_func=None, html_meta={}, footnote_sort=True, footnote_transition=True, words_per_minute=200, substitutions={}, linkify_fuzzy_links=True, dmath_allow_labels=True, dmath_allow_space=True, dmath_allow_digits=True, dmath_double_inline=False, update_mathjax=True, mathjax_classes='tex2jax_process|mathjax_process|math|output_area', enable_checkboxes=False, suppress_warnings=[], highlight_code_blocks=True)\r\n[01mmyst-nb v1.2.0:[39;49;00m NbParserConfig(custom_formats={}, metadata_key='mystnb', cell_metadata_key='mystnb', kernel_rgx_aliases={}, eval_name_regex='^[a-zA-Z_][a-zA-Z0-9_]*$', execution_mode='off', execution_cache_path='', execution_excludepatterns=(), execution_timeout=30, execution_in_temp=False, execution_allow_errors=False, execution_raise_on_error=False, execution_show_tb=False, merge_streams=False, render_plugin='default', remove_code_source=False, remove_code_outputs=False, code_prompt_show='Show code cell {type}', code_prompt_hide='Hide code cell {type}', number_source_lines=False, output_stderr='show', render_text_lexer='myst-ansi', render_error_lexer='ipythontb', render_image_options={}, render_figure_options={}, render_markdown_format='commonmark', output_folder='build', append_css=True, metadata_to_fm=False)\r\nUsing jupyter-cache at: /Users/franzsrambical/Downloads/probly/docs/build/.jupyter_cache\r\n[01mbuilding [mo]: [39;49;00mtargets for 0 po files that are out of date\r\n[01mwriting output... 
[39;49;00m\r\n[01mbuilding [html]: [39;49;00mtargets for 0 source files that are out of date\r\n[01mupdating environment: [39;49;00m0 added, 2 changed, 0 removed\r\n[2K[01mreading sources... [39;49;00m[ 50%] [35mapi/probly.train.losses[39;49;00m\r[2K[01mreading sources... [39;49;00m[100%] [35mindex[39;49;00m\r\r\n",,terminal_output
+637,3892947,"TERMINAL",0,0,"[91mWARNING: autodoc: failed to import module 'losses' from module 'probly.train'; the following exception was raised:\r\n['Traceback (most recent call last):\n', ' File ""/Users/franzsrambical/Downloads/probly/.venv/lib/python3.13/site-packages/sphinx/ext/autodoc/importer.py"", line 269, in import_object\n module = import_module(modname, try_reload=True)\n', ' File ""/Users/franzsrambical/Downloads/probly/.venv/lib/python3.13/site-packages/sphinx/ext/autodoc/importer.py"", line 172, in import_module\n raise ModuleNotFoundError(msg, name=modname) # NoQA: TRY301\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n', ""ModuleNotFoundError: No module named 'probly.train.losses'\n""] [autodoc.import_object][39;49;00m\r\n[91m/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/fashionmnist_ood_ensemble' [toc.not_readable][39;49;00m\r\n[91m/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/label_relaxation_calibration' [toc.not_readable][39;49;00m\r\n[91m/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/sklearn_selective_prediction' [toc.not_readable][39;49;00m\r\n[91m/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/synthetic_regression_dropout' [toc.not_readable][39;49;00m\r\n[91m/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/temperature_scaling_calibration' [toc.not_readable][39;49;00m\r\n[91m/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/train_bnn_classification' 
[toc.not_readable][39;49;00m\r\n[91m/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/train_evidential_classification' [toc.not_readable][39;49;00m\r\n[91m/Users/franzsrambical/Downloads/probly/docs/source/index.rst:58: WARNING: toctree contains reference to nonexisting document 'examples/train_evidential_regression' [toc.not_readable][39;49;00m\r\n[91m/Users/franzsrambical/Downloads/probly/docs/source/index.rst:79: WARNING: toctree contains reference to nonexisting document 'contributing' [toc.not_readable][39;49;00m\r\n[01mlooking for now-outdated files... [39;49;00mnone found\r\n[01mpickling environment... [39;49;00m",,terminal_output
+638,3893050,"TERMINAL",0,0,"done\r\n[01mchecking consistency... [39;49;00m[91m/Users/franzsrambical/Downloads/probly/docs/source/api/probly.datasets._torch.rst: WARNING: document isn't included in any toctree [toc.not_included][39;49;00m\r\n[91m/Users/franzsrambical/Downloads/probly/docs/source/api/probly.layers._torch.rst: WARNING: document isn't included in any toctree [toc.not_included][39;49;00m\r\n[91m/Users/franzsrambical/Downloads/probly/docs/source/api/probly.representation.credal_set._torch.rst: WARNING: document isn't included in any toctree [toc.not_included][39;49;00m\r\n[91m/Users/franzsrambical/Downloads/probly/docs/source/api/probly.train.bayesian._torch.rst: WARNING: document isn't included in any toctree [toc.not_included][39;49;00m\r\n[91m/Users/franzsrambical/Downloads/probly/docs/source/api/probly.train.calibration._torch.rst: WARNING: document isn't included in any toctree [toc.not_included][39;49;00m\r\n[91m/Users/franzsrambical/Downloads/probly/docs/source/api/probly.train.evidential._torch.rst: WARNING: document isn't included in any toctree [toc.not_included][39;49;00m\r\n[91m/Users/franzsrambical/Downloads/probly/docs/source/api/probly.train.losses.rst: WARNING: document isn't included in any toctree [toc.not_included][39;49;00m\r\n[91m/Users/franzsrambical/Downloads/probly/docs/source/api/probly.transformation.evidential.classification._torch.rst: WARNING: document isn't included in any toctree [toc.not_included][39;49;00m\r\n[91m/Users/franzsrambical/Downloads/probly/docs/source/api/probly.transformation.evidential.regression._torch.rst: WARNING: document isn't included in any toctree [toc.not_included][39;49;00m\r\n[91m/Users/franzsrambical/Downloads/probly/docs/source/api/probly.traverse_nn._torch.rst: WARNING: document isn't included in any toctree [toc.not_included][39;49;00m\r\n[91m/Users/franzsrambical/Downloads/probly/docs/source/api/probly.utils._torch.rst: WARNING: document isn't included in any toctree 
[toc.not_included][39;49;00m\r\ndone\r\n[01mpreparing documents... [39;49;00mdone\r\n[01mcopying assets... [39;49;00m\r\n[01mcopying static files... [39;49;00m\r\nWriting evaluated template result to /Users/franzsrambical/Downloads/probly/docs/build/html/_static/basic.css\r\nWriting evaluated template result to /Users/franzsrambical/Downloads/probly/docs/build/html/_static/language_data.js\r\nWriting evaluated template result to /Users/franzsrambical/Downloads/probly/docs/build/html/_static/documentation_options.js\r\nWriting evaluated template result to /Users/franzsrambical/Downloads/probly/docs/build/html/_static/copybutton.js\r\n[01mcopying static files: [39;49;00mdone\r\n[01mcopying extra files... [39;49;00m\r\n[01mcopying extra files: [39;49;00mdone\r\n[01mcopying assets: [39;49;00mdone\r\n[2K[01mwriting output... [39;49;00m[ 50%] [32mapi/probly.train.losses[39;49;00m\r",,terminal_output
+639,3893159,"TERMINAL",0,0,"[2K[01mwriting output... [39;49;00m[100%] [32mindex[39;49;00m\r",,terminal_output
+640,3893336,"TERMINAL",0,0,"\r\n[01mgenerating indices... [39;49;00mgenindex py-modindex done\r\n[2K[01mhighlighting module code... [39;49;00m[ 2%] [94mbuiltins[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 4%] [94mprobly.datasets._torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 7%] [94mprobly.datasets.torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 9%] [94mprobly.evaluation.metrics[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 11%] [94mprobly.evaluation.tasks[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 13%] [94mprobly.layers._torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 16%] [94mprobly.layers.flax[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 18%] [94mprobly.layers.torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 20%] [94mprobly.plot.credal[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 22%] [94mprobly.predictor[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 24%] [94mprobly.quantification.classification[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 27%] [94mprobly.quantification.regression[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 29%] [94mprobly.representation.credal_set._torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 31%] [94mprobly.representation.credal_set.credal_set[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 33%] [94mprobly.representation.credal_set.jax[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 36%] [94mprobly.representation.credal_set.torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 38%] [94mprobly.representation.representer[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 40%] [94mprobly.representation.sampling.flax_sampler[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 42%] [94mprobly.representation.sampling.jax_sample[39;49;00m\r[2K[01mhighlighting module code... 
[39;49;00m[ 44%] [94mprobly.representation.sampling.sample[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 47%] [94mprobly.representation.sampling.sampler[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 49%] [94mprobly.representation.sampling.torch_sample[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 51%] [94mprobly.representation.sampling.torch_sampler[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 53%] [94mprobly.train.bayesian._torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 56%] [94mprobly.train.bayesian.torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 58%] [94mprobly.train.calibration._torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 60%] [94mprobly.train.calibration.torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 62%] [94mprobly.train.evidential._torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 64%] [94mprobly.train.evidential.torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 67%] [94mprobly.transformation.bayesian.common[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 69%] [94mprobly.transformation.dropconnect.common[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 71%] [94mprobly.transformation.dropout.common[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 73%] [94mprobly.transformation.ensemble.common[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 76%] [94mprobly.transformation.evidential.classification._torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 78%] [94mprobly.transformation.evidential.classification.common[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 80%] [94mprobly.transformation.evidential.classification.torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 82%] [94mprobly.transformation.evidential.regression._torch[39;49;00m\r[2K[01mhighlighting module code... 
[39;49;00m[ 84%] [94mprobly.transformation.evidential.regression.common[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 87%] [94mprobly.transformation.evidential.regression.torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 89%] [94mprobly.traverse_nn.common[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 91%] [94mprobly.utils._torch[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 93%] [94mprobly.utils.errors[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 96%] [94mprobly.utils.probabilities[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[ 98%] [94mprobly.utils.sets[39;49;00m\r[2K[01mhighlighting module code... [39;49;00m[100%] [94mprobly.utils.torch[39;49;00m\r\r\n[01mwriting additional pages... [39;49;00msearch done\r\n[01mdumping search index in English (code: en)... [39;49;00mdone\r\n[01mdumping object inventory... [39;49;00mdone\r\n\r\n====================== slowest reading durations =======================\r\n0.009 api/probly.train.losses\r\n0.005 index\r\n[01mbuild succeeded, 22 warnings.[39;49;00m\r\n\r\nThe HTML pages are in build/html.\r\n",,terminal_output
+641,3893973,"TERMINAL",0,0,"[1m[7m%[27m[1m[0m \r \r",,terminal_output
diff --git a/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv
new file mode 100644
index 0000000000000000000000000000000000000000..871d6af2047b9fd9c083db8ea13511e35c060699
--- /dev/null
+++ b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv
@@ -0,0 +1,16232 @@
+Sequence,Time,File,RangeOffset,RangeLength,Text,Language,Type
+1,1,"examples/crowd_code.html",0,0,"\n\n\n\n \n \n \n \n\n\n\n \n \n \n \n \n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n Install once, and forget about it.\n
\n Neural networks are simulators. Anything you want your model to do, you have to teach it. Base models represent the mean of the data distribution of the internet.\n Post-training permits shifting that distribution towards desired behaviours. The higher the required skill, the scarcer the corresponding data on the internet.\n
\n
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n \n Figure 1: A preview of the crowd-code extension in action. Figure from Mattia Consiglio.\n
\n The extension periodically uploads the user's captured IDE actions to a server, where they are cleaned, filtered, thoroughly anonymized, and periodically released to the\n public under the most permissive Creative Commons license (CC0).\n An ongoing recording is transparently indicated in the IDE's status bar, and can be stopped at any time. If the user has inserted sensitive data, they can simply\n press the 'panic button' in the status bar to remove the last actions from the recording before they even leave the user's machine. Additionally, the user is asked for\n consent to participate in crowd-sourcing upon extension installation, and can opt-out at any time. We take user privacy very seriously and welcome any feedback on how to\n make data collection and release more transparent.\n
\n
\n Beyond behaviour-cloning on a dataset crowd-sourced using crowd-code, we eventually want to use crowd-code to annotate the entirety of IDE screencasts on the internet \n using an inverse dynamics model trained on screen recordings paired with crowd-code's IDE action annotations. This would unlock an entirely new trove of training data\n for software engineering agents.\n
\n
\n We are excited to see what the community builds with this dataset. We want to democratize AI research. We are greater than the sum of our parts. Together.\n
\n \n\n \n\n
Contributions
\n
AN, MM and FS worked on ideation and implementation. FS wrote the manuscript. We thank Gemini Code Assist and Cursor for their help in writing the extension.
\n \n \n \n \n\n \n\n\n",html,content
+18,26652,"examples/jasmine.html",0,0,"",html,selection_command
+19,29329,"examples/jasmine.html",940,0,"",html,selection_command
+20,29438,"examples/jasmine.html",945,0,"",html,selection_command
+21,30153,"examples/jasmine.html",888,0,"",html,selection_command
+22,30399,"examples/jasmine.html",869,0,"",html,selection_command
+23,30425,"examples/jasmine.html",857,0,"",html,selection_command
+24,30457,"examples/jasmine.html",827,0,"",html,selection_command
+25,30494,"examples/jasmine.html",815,0,"",html,selection_command
+26,30517,"examples/jasmine.html",808,0,"",html,selection_command
+27,30551,"examples/jasmine.html",802,0,"",html,selection_command
+28,30586,"examples/jasmine.html",800,0,"",html,selection_command
+29,30620,"examples/jasmine.html",748,0,"",html,selection_command
+30,30653,"examples/jasmine.html",724,0,"",html,selection_command
+31,30687,"examples/jasmine.html",653,0,"",html,selection_command
+32,30720,"examples/jasmine.html",612,0,"",html,selection_command
+33,30754,"examples/jasmine.html",600,0,"",html,selection_command
+34,30819,"examples/jasmine.html",612,0,"",html,selection_command
+35,31076,"examples/jasmine.html",653,0,"",html,selection_command
+36,31110,"examples/jasmine.html",724,0,"",html,selection_command
+37,31138,"examples/jasmine.html",748,0,"",html,selection_command
+38,31168,"examples/jasmine.html",800,0,"",html,selection_command
+39,31200,"examples/jasmine.html",802,0,"",html,selection_command
+40,31233,"examples/jasmine.html",808,0,"",html,selection_command
+41,31267,"examples/jasmine.html",815,0,"",html,selection_command
+42,31299,"examples/jasmine.html",827,0,"",html,selection_command
+43,31333,"examples/jasmine.html",857,0,"",html,selection_command
+44,31366,"examples/jasmine.html",869,0,"",html,selection_command
+45,31399,"examples/jasmine.html",888,0,"",html,selection_command
+46,31526,"examples/jasmine.html",869,0,"",html,selection_command
+47,31676,"examples/jasmine.html",888,0,"",html,selection_command
+48,31836,"examples/jasmine.html",945,0,"",html,selection_command
+49,32021,"examples/jasmine.html",1018,0,"",html,selection_command
+50,32192,"examples/jasmine.html",945,0,"",html,selection_command
+51,32292,"examples/jasmine.html",948,0,"",html,selection_command
+52,32445,"examples/jasmine.html",949,0,"",html,selection_command
+53,32616,"examples/jasmine.html",954,0,"",html,selection_command
+54,33021,"examples/jasmine.html",949,0,"",html,selection_command
+55,33110,"examples/jasmine.html",949,1,"C",html,selection_command
+56,33179,"examples/jasmine.html",949,5,"Crowd",html,selection_command
+57,33365,"examples/jasmine.html",949,6,"Crowd-",html,selection_command
+58,33526,"examples/jasmine.html",949,14,"Crowd-Sourcing",html,selection_command
+59,33728,"examples/jasmine.html",949,16,"Crowd-Sourcing A",html,selection_command
+60,34009,"examples/jasmine.html",949,24,"Crowd-Sourcing A Dataset",html,selection_command
+61,34167,"examples/jasmine.html",949,27,"Crowd-Sourcing A Dataset To",html,selection_command
+62,34337,"examples/jasmine.html",949,32,"Crowd-Sourcing A Dataset To Make",html,selection_command
+63,34484,"examples/jasmine.html",949,39,"Crowd-Sourcing A Dataset To Make Agents",html,selection_command
+64,34656,"examples/jasmine.html",949,44,"Crowd-Sourcing A Dataset To Make Agents Code",html,selection_command
+65,34821,"examples/jasmine.html",949,49,"Crowd-Sourcing A Dataset To Make Agents Code Like",html,selection_command
+66,35014,"examples/jasmine.html",949,56,"Crowd-Sourcing A Dataset To Make Agents Code Like Humans",html,selection_command
+67,40991,"examples/jasmine.html",949,56,"",html,content
+68,76706,"examples/jasmine.html",949,0,"๐งโโ๏ธ Jasmine: A simple, performant and scalable JAX-based world modeling codebase ๐งโโ๏ธ",html,content
+69,76715,"examples/jasmine.html",1037,0,"",html,selection_keyboard
+70,77084,"examples/jasmine.html",1036,0,"",html,selection_command
+71,77466,"examples/jasmine.html",935,0,"",html,selection_command
+72,78332,"examples/jasmine.html",1040,0,"",html,selection_command
+73,78448,"examples/jasmine.html",1044,0,"",html,selection_command
+74,78664,"examples/jasmine.html",1045,0,"",html,selection_command
+75,78833,"examples/jasmine.html",1056,0,"",html,selection_command
+76,79015,"examples/jasmine.html",1059,0,"",html,selection_command
+77,79316,"examples/jasmine.html",1060,0,"",html,selection_command
+78,81257,"examples/jasmine.html",1063,0,"",html,selection_command
+79,81530,"examples/jasmine.html",1073,0,"",html,selection_command
+80,81685,"examples/jasmine.html",1078,0,"",html,selection_command
+81,82059,"examples/jasmine.html",1073,0,"",html,selection_command
+82,82191,"examples/jasmine.html",1073,5,"",html,content
+83,82770,"examples/jasmine.html",1072,0,"",html,selection_command
+84,82935,"examples/jasmine.html",1073,0,"",html,selection_command
+85,83016,"examples/jasmine.html",1073,1,"-",html,selection_command
+86,83064,"examples/jasmine.html",1073,5,"-code",html,selection_command
+87,83282,"examples/jasmine.html",1073,5,"",html,content
+88,83584,"examples/jasmine.html",1073,0,"J",html,content
+89,83585,"examples/jasmine.html",1074,0,"",html,selection_keyboard
+90,83760,"examples/jasmine.html",1074,0,"a",html,content
+91,83764,"examples/jasmine.html",1075,0,"",html,selection_keyboard
+92,83783,"examples/jasmine.html",1075,0,"s",html,content
+93,83785,"examples/jasmine.html",1076,0,"",html,selection_keyboard
+94,83901,"examples/jasmine.html",1076,0,"m",html,content
+95,83903,"examples/jasmine.html",1077,0,"",html,selection_keyboard
+96,84028,"examples/jasmine.html",1077,0,"i",html,content
+97,84030,"examples/jasmine.html",1078,0,"",html,selection_keyboard
+98,84084,"examples/jasmine.html",1078,0,"n",html,content
+99,84086,"examples/jasmine.html",1079,0,"",html,selection_keyboard
+100,84128,"examples/jasmine.html",1079,0,"e",html,content
+101,84132,"examples/jasmine.html",1080,0,"",html,selection_keyboard
+102,84399,"examples/jasmine.html",1079,0,"",html,selection_command
+103,85202,"examples/jasmine.html",1080,0,"",html,selection_command
+104,85391,"examples/jasmine.html",1082,0,"",html,selection_command
+105,86318,"examples/jasmine.html",1084,0,"",html,selection_command
+106,86788,"examples/jasmine.html",1084,1,"V",html,selection_command
+107,86878,"examples/jasmine.html",1084,2,"VS",html,selection_command
+108,87141,"examples/jasmine.html",1084,7,"VS Code",html,selection_command
+109,87171,"examples/jasmine.html",1084,8,"VS Code/",html,selection_command
+110,87206,"examples/jasmine.html",1084,14,"VS Code/Cursor",html,selection_command
+111,87234,"examples/jasmine.html",1084,24,"VS Code/Cursor extension",html,selection_command
+112,87261,"examples/jasmine.html",1084,29,"VS Code/Cursor extension that",html,selection_command
+113,87296,"examples/jasmine.html",1084,36,"VS Code/Cursor extension that allows",html,selection_command
+114,87329,"examples/jasmine.html",1084,43,"VS Code/Cursor extension that allows anyone",html,selection_command
+115,87363,"examples/jasmine.html",1084,46,"VS Code/Cursor extension that allows anyone to",html,selection_command
+116,87395,"examples/jasmine.html",1084,58,"VS Code/Cursor extension that allows anyone to participate",html,selection_command
+117,87430,"examples/jasmine.html",1084,61,"VS Code/Cursor extension that allows anyone to participate in",html,selection_command
+118,87463,"examples/jasmine.html",1084,67,"VS Code/Cursor extension that allows anyone to participate in crowd",html,selection_command
+119,87496,"examples/jasmine.html",1084,68,"VS Code/Cursor extension that allows anyone to participate in crowd-",html,selection_command
+120,87529,"examples/jasmine.html",1084,76,"VS Code/Cursor extension that allows anyone to participate in crowd-sourcing",html,selection_command
+121,87562,"examples/jasmine.html",1084,78,"VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a",html,selection_command
+122,87596,"examples/jasmine.html",1084,87,"VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software",html,selection_command
+123,87629,"examples/jasmine.html",1084,99,"VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering",html,selection_command
+124,87663,"examples/jasmine.html",1084,107,"VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset",html,selection_command
+125,87696,"examples/jasmine.html",1084,110,"VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to",html,selection_command
+126,87731,"examples/jasmine.html",1084,121,"VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually",html,selection_command
+127,87764,"examples/jasmine.html",1084,130,"VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune",html,selection_command
+128,87797,"examples/jasmine.html",1084,137,"VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models",html,selection_command
+129,87829,"examples/jasmine.html",1084,140,"VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on",html,selection_command
+130,87862,"examples/jasmine.html",1084,141,"VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.",html,selection_command
+131,87896,"examples/jasmine.html",1084,149,"VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on. Install",html,selection_command
+132,87930,"examples/jasmine.html",1084,154,"VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on. Install once",html,selection_command
+133,87963,"examples/jasmine.html",1084,155,"VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on. Install once,",html,selection_command
+134,87995,"examples/jasmine.html",1084,159,"VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on. Install once, and",html,selection_command
+135,88030,"examples/jasmine.html",1084,166,"VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on. Install once, and forget",html,selection_command
+136,88169,"examples/jasmine.html",1084,172,"VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on. Install once, and forget about",html,selection_command
+137,88340,"examples/jasmine.html",1084,175,"VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on. Install once, and forget about it",html,selection_command
+138,88752,"examples/jasmine.html",1084,175,"",html,content
+139,90727,"examples/jasmine.html",1083,0,"",html,selection_command
+140,99948,"examples/jasmine.html",1084,0,"",html,selection_command
+141,100165,"examples/jasmine.html",1084,0,"production-ready JAX-based world modeling codebase",html,content
+142,100170,"examples/jasmine.html",1134,0,"",html,selection_keyboard
+143,100606,"examples/jasmine.html",1133,0,"",html,selection_command
+144,101216,"examples/jasmine.html",1126,0,"",html,selection_command
+145,101468,"examples/jasmine.html",1117,0,"",html,selection_command
+146,101499,"examples/jasmine.html",1111,0,"",html,selection_command
+147,101531,"examples/jasmine.html",1105,0,"",html,selection_command
+148,101563,"examples/jasmine.html",1104,0,"",html,selection_command
+149,101598,"examples/jasmine.html",1101,0,"",html,selection_command
+150,101631,"examples/jasmine.html",1095,0,"",html,selection_command
+151,101663,"examples/jasmine.html",1094,0,"",html,selection_command
+152,101697,"examples/jasmine.html",1084,0,"",html,selection_command
+153,101730,"examples/jasmine.html",1082,0,"",html,selection_command
+154,101764,"examples/jasmine.html",1080,0,"",html,selection_command
+155,101796,"examples/jasmine.html",1073,0,"",html,selection_command
+156,102078,"examples/jasmine.html",1074,0,"",html,selection_command
+157,102218,"examples/jasmine.html",1075,0,"",html,selection_command
+158,102335,"examples/jasmine.html",1076,0,"",html,selection_command
+159,102586,"examples/jasmine.html",1077,0,"",html,selection_command
+160,102704,"examples/jasmine.html",1078,0,"",html,selection_command
+161,102882,"examples/jasmine.html",1079,0,"",html,selection_command
+162,103051,"examples/jasmine.html",1080,0,"",html,selection_command
+163,103199,"examples/jasmine.html",1081,0,"",html,selection_command
+164,103539,"examples/jasmine.html",1082,0,"",html,selection_command
+165,103857,"examples/jasmine.html",1083,0,"",html,selection_command
+166,104018,"examples/jasmine.html",1084,0,"",html,selection_command
+167,104318,"examples/jasmine.html",1137,0,"",html,selection_command
+168,104539,"examples/jasmine.html",1136,0,"",html,selection_command
+169,105237,"examples/jasmine.html",1135,0,"",html,selection_command
+170,105682,"examples/jasmine.html",1134,0,"",html,selection_command
+171,107372,"examples/jasmine.html",1133,0,"",html,selection_command
+172,107721,"examples/jasmine.html",1126,0,"",html,selection_command
+173,108069,"examples/jasmine.html",1126,8,"",html,content
+174,108202,"examples/jasmine.html",1125,0,"",html,selection_command
+175,108339,"examples/jasmine.html",1124,0,"",html,selection_command
+176,108467,"examples/jasmine.html",1123,0,"",html,selection_command
+177,108731,"examples/jasmine.html",1122,0,"",html,selection_command
+178,108755,"examples/jasmine.html",1121,0,"",html,selection_command
+179,108785,"examples/jasmine.html",1120,0,"",html,selection_command
+180,108818,"examples/jasmine.html",1119,0,"",html,selection_command
+181,108851,"examples/jasmine.html",1118,0,"",html,selection_command
+182,108885,"examples/jasmine.html",1117,0,"",html,selection_command
+183,108919,"examples/jasmine.html",1116,0,"",html,selection_command
+184,108953,"examples/jasmine.html",1115,0,"",html,selection_command
+185,108986,"examples/jasmine.html",1114,0,"",html,selection_command
+186,109019,"examples/jasmine.html",1113,0,"",html,selection_command
+187,109053,"examples/jasmine.html",1112,0,"",html,selection_command
+188,109086,"examples/jasmine.html",1111,0,"",html,selection_command
+189,109203,"examples/jasmine.html",1110,0,"",html,selection_command
+190,109386,"examples/jasmine.html",1111,0,"codebase",html,content
+191,109389,"examples/jasmine.html",1118,0,"",html,selection_command
+192,109905,"examples/jasmine.html",1119,0,"",html,selection_command
+193,110016,"examples/jasmine.html",1119,0," ",html,content
+194,110019,"examples/jasmine.html",1120,0,"",html,selection_keyboard
+195,110212,"examples/jasmine.html",1119,0,"",html,selection_command
+196,110668,"examples/jasmine.html",1120,0,"",html,selection_command
+197,110973,"examples/jasmine.html",1120,0,"f",html,content
+198,110975,"examples/jasmine.html",1121,0,"",html,selection_keyboard
+199,111037,"examples/jasmine.html",1121,0,"o",html,content
+200,111039,"examples/jasmine.html",1122,0,"",html,selection_keyboard
+201,111138,"examples/jasmine.html",1122,0,"r",html,content
+202,111139,"examples/jasmine.html",1123,0,"",html,selection_keyboard
+203,111204,"examples/jasmine.html",1123,0," ",html,content
+204,111207,"examples/jasmine.html",1124,0,"",html,selection_keyboard
+205,111325,"examples/jasmine.html",1124,0,"w",html,content
+206,111327,"examples/jasmine.html",1125,0,"",html,selection_keyboard
+207,111639,"examples/jasmine.html",1124,1,"",html,content
+208,111722,"examples/jasmine.html",1123,0,"",html,selection_command
+209,112177,"examples/jasmine.html",1128,0,"",html,selection_command
+210,112340,"examples/jasmine.html",1137,0,"",html,selection_command
+211,112528,"examples/jasmine.html",1141,0,"",html,selection_command
+212,112904,"examples/jasmine.html",1140,0,"",html,selection_command
+213,113040,"examples/jasmine.html",1139,0,"",html,selection_command
+214,113357,"examples/jasmine.html",1139,0,"f",html,content
+215,113360,"examples/jasmine.html",1140,0,"",html,selection_keyboard
+216,113523,"examples/jasmine.html",1140,0,"r",html,content
+217,113524,"examples/jasmine.html",1141,0,"",html,selection_keyboard
+218,113593,"examples/jasmine.html",1141,0,"o",html,content
+219,113594,"examples/jasmine.html",1142,0,"",html,selection_keyboard
+220,113626,"examples/jasmine.html",1142,0,"m",html,content
+221,113627,"examples/jasmine.html",1143,0,"",html,selection_keyboard
+222,113722,"examples/jasmine.html",1143,0," ",html,content
+223,113725,"examples/jasmine.html",1144,0,"",html,selection_keyboard
+224,113877,"examples/jasmine.html",1144,0,"u",html,content
+225,113882,"examples/jasmine.html",1145,0,"",html,selection_keyboard
+226,113938,"examples/jasmine.html",1145,0,"n",html,content
+227,113939,"examples/jasmine.html",1146,0,"",html,selection_keyboard
+228,114040,"examples/jasmine.html",1146,0,"a",html,content
+229,114041,"examples/jasmine.html",1147,0,"",html,selection_keyboard
+230,114105,"examples/jasmine.html",1147,0,"l",html,content
+231,114106,"examples/jasmine.html",1148,0,"",html,selection_keyboard
+232,114523,"examples/jasmine.html",1147,1,"",html,content
+233,114625,"examples/jasmine.html",1146,1,"",html,content
+234,114785,"examples/jasmine.html",1146,0,"l",html,content
+235,114787,"examples/jasmine.html",1147,0,"",html,selection_keyboard
+236,114861,"examples/jasmine.html",1147,0,"a",html,content
+237,114863,"examples/jasmine.html",1148,0,"",html,selection_keyboard
+238,114980,"examples/jasmine.html",1148,0,"b",html,content
+239,114981,"examples/jasmine.html",1149,0,"",html,selection_keyboard
+240,115412,"examples/jasmine.html",1149,0,"e",html,content
+241,115415,"examples/jasmine.html",1150,0,"",html,selection_keyboard
+242,115506,"examples/jasmine.html",1150,0,"l",html,content
+243,115509,"examples/jasmine.html",1151,0,"",html,selection_keyboard
+244,115579,"examples/jasmine.html",1151,0,"e",html,content
+245,115581,"examples/jasmine.html",1152,0,"",html,selection_keyboard
+246,115645,"examples/jasmine.html",1152,0,"d",html,content
+247,115647,"examples/jasmine.html",1153,0,"",html,selection_keyboard
+248,115727,"examples/jasmine.html",1153,0," ",html,content
+249,115729,"examples/jasmine.html",1154,0,"",html,selection_keyboard
+250,115920,"examples/jasmine.html",1154,0,"v",html,content
+251,115923,"examples/jasmine.html",1155,0,"",html,selection_keyboard
+252,115990,"examples/jasmine.html",1155,0,"i",html,content
+253,115992,"examples/jasmine.html",1156,0,"",html,selection_keyboard
+254,116077,"examples/jasmine.html",1156,0,"d",html,content
+255,116079,"examples/jasmine.html",1157,0,"",html,selection_keyboard
+256,116124,"examples/jasmine.html",1157,0,"e",html,content
+257,116126,"examples/jasmine.html",1158,0,"",html,selection_keyboard
+258,116178,"examples/jasmine.html",1158,0,"o",html,content
+259,116180,"examples/jasmine.html",1159,0,"",html,selection_keyboard
+260,116210,"examples/jasmine.html",1159,0,"s",html,content
+261,116212,"examples/jasmine.html",1160,0,"",html,selection_keyboard
+262,116523,"examples/jasmine.html",1159,0,"",html,selection_command
+263,116664,"examples/jasmine.html",1040,0,"",html,selection_command
+264,128641,"examples/jasmine.html",1164,0,"",html,selection_command
+265,128768,"examples/jasmine.html",1198,0,"",html,selection_command
+266,128926,"examples/jasmine.html",1199,0,"",html,selection_command
+267,129177,"examples/jasmine.html",1200,0,"",html,selection_command
+268,129207,"examples/jasmine.html",1201,0,"",html,selection_command
+269,129243,"examples/jasmine.html",1202,0,"",html,selection_command
+270,129281,"examples/jasmine.html",1203,0,"",html,selection_command
+271,129306,"examples/jasmine.html",1204,0,"",html,selection_command
+272,129340,"examples/jasmine.html",1205,0,"",html,selection_command
+273,129373,"examples/jasmine.html",1206,0,"",html,selection_command
+274,129407,"examples/jasmine.html",1207,0,"",html,selection_command
+275,129440,"examples/jasmine.html",1208,0,"",html,selection_command
+276,129474,"examples/jasmine.html",1209,0,"",html,selection_command
+277,129506,"examples/jasmine.html",1210,0,"",html,selection_command
+278,129540,"examples/jasmine.html",1211,0,"",html,selection_command
+279,129573,"examples/jasmine.html",1212,0,"",html,selection_command
+280,129606,"examples/jasmine.html",1213,0,"",html,selection_command
+281,129640,"examples/jasmine.html",1214,0,"",html,selection_command
+282,129673,"examples/jasmine.html",1215,0,"",html,selection_command
+283,129707,"examples/jasmine.html",1216,0,"",html,selection_command
+284,129739,"examples/jasmine.html",1217,0,"",html,selection_command
+285,129775,"examples/jasmine.html",1218,0,"",html,selection_command
+286,129807,"examples/jasmine.html",1219,0,"",html,selection_command
+287,129840,"examples/jasmine.html",1220,0,"",html,selection_command
+288,129873,"examples/jasmine.html",1221,0,"",html,selection_command
+289,129906,"examples/jasmine.html",1222,0,"",html,selection_command
+290,129940,"examples/jasmine.html",1223,0,"",html,selection_command
+291,129973,"examples/jasmine.html",1224,0,"",html,selection_command
+292,130006,"examples/jasmine.html",1225,0,"",html,selection_command
+293,130039,"examples/jasmine.html",1226,0,"",html,selection_command
+294,130073,"examples/jasmine.html",1227,0,"",html,selection_command
+295,130208,"examples/jasmine.html",1228,0,"",html,selection_command
+296,131084,"examples/jasmine.html",1228,1,"c",html,selection_command
+297,131178,"examples/jasmine.html",1228,10,"crowd_code",html,selection_command
+298,131358,"examples/jasmine.html",1228,11,"crowd_code.",html,selection_command
+299,131739,"examples/jasmine.html",1228,10,"crowd_code",html,selection_command
+300,131878,"examples/jasmine.html",1228,10,"",html,content
+301,132006,"examples/jasmine.html",1228,0,"j",html,content
+302,132007,"examples/jasmine.html",1229,0,"",html,selection_keyboard
+303,132071,"examples/jasmine.html",1229,0,"a",html,content
+304,132074,"examples/jasmine.html",1230,0,"",html,selection_keyboard
+305,132122,"examples/jasmine.html",1230,0,"s",html,content
+306,132124,"examples/jasmine.html",1231,0,"",html,selection_keyboard
+307,132221,"examples/jasmine.html",1231,0,"m",html,content
+308,132223,"examples/jasmine.html",1232,0,"",html,selection_keyboard
+309,132367,"examples/jasmine.html",1232,0,"i",html,content
+310,132371,"examples/jasmine.html",1233,0,"",html,selection_keyboard
+311,132449,"examples/jasmine.html",1233,0,"n",html,content
+312,132453,"examples/jasmine.html",1234,0,"",html,selection_keyboard
+313,132474,"examples/jasmine.html",1234,0,"e",html,content
+314,132476,"examples/jasmine.html",1235,0,"",html,selection_keyboard
+315,132678,"examples/jasmine.html",1234,0,"",html,selection_command
+316,133188,"examples/jasmine.html",1196,0,"",html,selection_command
+317,133376,"examples/jasmine.html",1195,0,"",html,selection_command
+318,133526,"examples/jasmine.html",1191,0,"",html,selection_command
+319,133674,"examples/jasmine.html",1189,0,"",html,selection_command
+320,133820,"examples/jasmine.html",1187,0,"",html,selection_command
+321,133996,"examples/jasmine.html",1182,0,"",html,selection_command
+322,138101,"examples/jasmine.html",1189,0,"5",html,content
+323,138101,"examples/jasmine.html",1187,2,"",html,content
+324,138101,"examples/jasmine.html",1186,0,"gust",html,content
+325,138101,"examples/jasmine.html",1184,2,"",html,content
+326,138101,"examples/jasmine.html",1183,0,"A",html,content
+327,138101,"examples/jasmine.html",1182,1,"",html,content
+328,139019,"examples/jasmine.html",1217,0,"",html,selection_command
+329,139546,"examples/jasmine.html",1199,0,"",html,selection_command
+330,139708,"examples/jasmine.html",1203,0,"",html,selection_command
+331,140148,"examples/jasmine.html",1199,0,"",html,selection_command
+332,141122,"examples/jasmine.html",1244,0,"",html,selection_command
+333,141274,"examples/jasmine.html",1261,0,"",html,selection_command
+334,141423,"examples/jasmine.html",1269,0,"",html,selection_command
+335,141575,"examples/jasmine.html",1303,0,"",html,selection_command
+336,141731,"examples/jasmine.html",1269,0,"",html,selection_command
+337,141928,"examples/jasmine.html",1277,0,"",html,selection_command
+338,144010,"examples/jasmine.html",1311,0,"",html,selection_command
+339,144157,"examples/jasmine.html",1369,0,"",html,selection_command
+340,144295,"examples/jasmine.html",1444,0,"",html,selection_command
+341,144443,"examples/jasmine.html",1485,0,"",html,selection_command
+342,144586,"examples/jasmine.html",1493,0,"",html,selection_command
+343,144737,"examples/jasmine.html",1503,0,"",html,selection_command
+344,145121,"examples/jasmine.html",1493,0,"",html,selection_command
+345,156921,"examples/jasmine.html",1503,0,"",html,selection_command
+346,157460,"examples/jasmine.html",1493,0,"",html,selection_command
+347,157624,"examples/jasmine.html",1487,7," {",html,selection_command
+348,157709,"examples/jasmine.html",1487,41," {\n ""author"":""Mihir Mahajan"",",html,selection_command
+349,157857,"examples/jasmine.html",1487,97," {\n ""author"":""Mihir Mahajan"",\n ""authorURL"":""https://maharajamihir.github.io/"",",html,selection_command
+350,158113,"examples/jasmine.html",1487,172," {\n ""author"":""Mihir Mahajan"",\n ""authorURL"":""https://maharajamihir.github.io/"",\n ""affiliations"": [{""name"": ""p(doom)"", ""url"": ""https://pdoom.org/""},",html,selection_command
+351,158145,"examples/jasmine.html",1487,214," {\n ""author"":""Mihir Mahajan"",\n ""authorURL"":""https://maharajamihir.github.io/"",\n ""affiliations"": [{""name"": ""p(doom)"", ""url"": ""https://pdoom.org/""},\n {""name"": ""TUM""}]",html,selection_command
+352,158298,"examples/jasmine.html",1487,223," {\n ""author"":""Mihir Mahajan"",\n ""authorURL"":""https://maharajamihir.github.io/"",\n ""affiliations"": [{""name"": ""p(doom)"", ""url"": ""https://pdoom.org/""},\n {""name"": ""TUM""}]\n },",html,selection_command
+353,158580,"examples/jasmine.html",1487,224,"",html,content
+354,158593,"examples/jasmine.html",1493,0,"",html,selection_command
+355,158710,"examples/jasmine.html",1484,0,"",html,selection_command
+356,158961,"examples/jasmine.html",1442,0,"",html,selection_command
+357,158995,"examples/jasmine.html",1367,0,"",html,selection_command
+358,159025,"examples/jasmine.html",1309,0,"",html,selection_command
+359,159060,"examples/jasmine.html",1275,0,"",html,selection_command
+360,159159,"examples/jasmine.html",1267,0,"",html,selection_command
+361,159310,"examples/jasmine.html",1250,0,"",html,selection_command
+362,159527,"examples/jasmine.html",1260,0,"\n {\n ""author"":""Mihir Mahajan"",\n ""authorURL"":""https://maharajamihir.github.io/"",\n ""affiliations"": [{""name"": ""p(doom)"", ""url"": ""https://pdoom.org/""},\n {""name"": ""TUM""}]\n },",html,content
+363,159531,"examples/jasmine.html",1267,0,"",html,selection_command
+364,160815,"examples/jasmine.html",1275,0,"",html,selection_command
+365,161058,"examples/jasmine.html",1309,0,"",html,selection_command
+366,161095,"examples/jasmine.html",1365,0,"",html,selection_command
+367,166228,"examples/jasmine.html",1440,0,"",html,selection_command
+368,166480,"examples/jasmine.html",1482,0,"",html,selection_command
+369,166510,"examples/jasmine.html",1491,0,"",html,selection_command
+370,166535,"examples/jasmine.html",1499,0,"",html,selection_command
+371,166567,"examples/jasmine.html",1533,0,"",html,selection_command
+372,166600,"examples/jasmine.html",1591,0,"",html,selection_command
+373,166634,"examples/jasmine.html",1666,0,"",html,selection_command
+374,166667,"examples/jasmine.html",1708,0,"",html,selection_command
+375,166822,"examples/jasmine.html",1717,0,"",html,selection_command
+376,166995,"examples/jasmine.html",1725,0,"",html,selection_command
+377,167134,"examples/jasmine.html",1761,0,"",html,selection_command
+378,167289,"examples/jasmine.html",1806,0,"",html,selection_command
+379,168657,"examples/jasmine.html",1761,0,"",html,selection_command
+380,170401,"examples/jasmine.html",1572,0,"",html,selection_mouse
+381,174513,"examples/jasmine.html",1058,0,"",html,selection_mouse
+382,176580,"examples/jasmine.html",1161,0,"",html,selection_mouse
+383,178597,"examples/crowd_code.html",0,0,"",html,tab
+384,179356,"examples/crowd_code.html",0,0,"",html,selection_command
+385,180783,"examples/jasmine.html",0,0,"",html,tab
+386,206538,"examples/jasmine.html",1161,0," ",html,content
+387,206545,"examples/jasmine.html",1162,0,"",html,selection_keyboard
+388,206860,"examples/jasmine.html",1162,0,"S",html,content
+389,206862,"examples/jasmine.html",1163,0,"",html,selection_keyboard
+390,207078,"examples/jasmine.html",1163,0,"c",html,content
+391,207083,"examples/jasmine.html",1164,0,"",html,selection_keyboard
+392,207162,"examples/jasmine.html",1164,0,"a",html,content
+393,207165,"examples/jasmine.html",1165,0,"",html,selection_keyboard
+394,207239,"examples/jasmine.html",1165,0,"l",html,content
+395,207242,"examples/jasmine.html",1166,0,"",html,selection_keyboard
+396,207324,"examples/jasmine.html",1166,0,"e",html,content
+397,207326,"examples/jasmine.html",1167,0,"",html,selection_keyboard
+398,207389,"examples/jasmine.html",1167,0," ",html,content
+399,207392,"examples/jasmine.html",1168,0,"",html,selection_keyboard
+400,207500,"examples/jasmine.html",1168,0,"f",html,content
+401,207502,"examples/jasmine.html",1169,0,"",html,selection_keyboard
+402,207666,"examples/jasmine.html",1169,0,"r",html,content
+403,207670,"examples/jasmine.html",1170,0,"",html,selection_keyboard
+404,207751,"examples/jasmine.html",1170,0,"o",html,content
+405,207754,"examples/jasmine.html",1171,0,"",html,selection_keyboard
+406,207819,"examples/jasmine.html",1171,0,"m",html,content
+407,207825,"examples/jasmine.html",1172,0,"",html,selection_keyboard
+408,207884,"examples/jasmine.html",1172,0," ",html,content
+409,207886,"examples/jasmine.html",1173,0,"",html,selection_keyboard
+410,216966,"examples/jasmine.html",1172,1,"",html,content
+411,217197,"examples/jasmine.html",1171,1,"",html,content
+412,217230,"examples/jasmine.html",1170,1,"",html,content
+413,217264,"examples/jasmine.html",1169,1,"",html,content
+414,217298,"examples/jasmine.html",1168,1,"",html,content
+415,217337,"examples/jasmine.html",1167,1,"",html,content
+416,217370,"examples/jasmine.html",1166,1,"",html,content
+417,217402,"examples/jasmine.html",1165,1,"",html,content
+418,217436,"examples/jasmine.html",1164,1,"",html,content
+419,217470,"examples/jasmine.html",1163,1,"",html,content
+420,217506,"examples/jasmine.html",1162,1,"",html,content
+421,217830,"examples/jasmine.html",1162,0,"Jasmine scales from single hosts to hundreds of xPUs thanks to XLA and strives to be an easily hackable, batteries-included foundation for world modeling research",html,content
+422,217832,"examples/jasmine.html",1324,0,"",html,selection_keyboard
+423,218175,"examples/jasmine.html",1323,0,"",html,selection_command
+424,218693,"examples/jasmine.html",1040,0,"",html,selection_command
+425,222287,"examples/jasmine.html",1325,0,"",html,selection_command
+426,226860,"examples/jasmine.html",1040,0,"",html,selection_command
+427,227409,"examples/jasmine.html",1044,0,"",html,selection_command
+428,227660,"examples/jasmine.html",1045,0,"",html,selection_command
+429,227701,"examples/jasmine.html",1056,0,"",html,selection_command
+430,227720,"examples/jasmine.html",1059,0,"",html,selection_command
+431,227749,"examples/jasmine.html",1060,0,"",html,selection_command
+432,227779,"examples/jasmine.html",1063,0,"",html,selection_command
+433,227811,"examples/jasmine.html",1073,0,"",html,selection_command
+434,227846,"examples/jasmine.html",1080,0,"",html,selection_command
+435,227880,"examples/jasmine.html",1082,0,"",html,selection_command
+436,227913,"examples/jasmine.html",1084,0,"",html,selection_command
+437,227947,"examples/jasmine.html",1094,0,"",html,selection_command
+438,227979,"examples/jasmine.html",1095,0,"",html,selection_command
+439,228014,"examples/jasmine.html",1101,0,"",html,selection_command
+440,228049,"examples/jasmine.html",1104,0,"",html,selection_command
+441,228083,"examples/jasmine.html",1105,0,"",html,selection_command
+442,228117,"examples/jasmine.html",1111,0,"",html,selection_command
+443,228150,"examples/jasmine.html",1120,0,"",html,selection_command
+444,228183,"examples/jasmine.html",1124,0,"",html,selection_command
+445,228216,"examples/jasmine.html",1130,0,"",html,selection_command
+446,228250,"examples/jasmine.html",1139,0,"",html,selection_command
+447,228284,"examples/jasmine.html",1144,0,"",html,selection_command
+448,228484,"examples/jasmine.html",1154,0,"",html,selection_command
+449,228737,"examples/jasmine.html",1160,0,"",html,selection_command
+450,228768,"examples/jasmine.html",1162,0,"",html,selection_command
+451,228794,"examples/jasmine.html",1170,0,"",html,selection_command
+452,229200,"examples/jasmine.html",1162,0,"",html,selection_command
+453,229562,"examples/jasmine.html",1162,1,"J",html,selection_command
+454,229635,"examples/jasmine.html",1162,7,"Jasmine",html,selection_command
+455,229803,"examples/jasmine.html",1162,14,"Jasmine scales",html,selection_command
+456,230346,"examples/jasmine.html",1162,14,"",html,content
+457,230650,"examples/jasmine.html",1162,0,"S",html,content
+458,230652,"examples/jasmine.html",1163,0,"",html,selection_keyboard
+459,230787,"examples/jasmine.html",1163,0,"c",html,content
+460,230788,"examples/jasmine.html",1164,0,"",html,selection_keyboard
+461,230908,"examples/jasmine.html",1164,0,"a",html,content
+462,230909,"examples/jasmine.html",1165,0,"",html,selection_keyboard
+463,230985,"examples/jasmine.html",1165,0,"l",html,content
+464,230986,"examples/jasmine.html",1166,0,"",html,selection_keyboard
+465,231082,"examples/jasmine.html",1166,0,"e",html,content
+466,231085,"examples/jasmine.html",1167,0,"",html,selection_keyboard
+467,231295,"examples/jasmine.html",1166,0,"",html,selection_command
+468,232304,"examples/jasmine.html",1171,0,"",html,selection_command
+469,232554,"examples/jasmine.html",1178,0,"",html,selection_command
+470,232584,"examples/jasmine.html",1184,0,"",html,selection_command
+471,232610,"examples/jasmine.html",1187,0,"",html,selection_command
+472,232643,"examples/jasmine.html",1196,0,"",html,selection_command
+473,232677,"examples/jasmine.html",1199,0,"",html,selection_command
+474,232711,"examples/jasmine.html",1204,0,"",html,selection_command
+475,232746,"examples/jasmine.html",1211,0,"",html,selection_command
+476,232780,"examples/jasmine.html",1214,0,"",html,selection_command
+477,233729,"examples/jasmine.html",1213,0,"",html,selection_command
+478,233990,"examples/jasmine.html",1206,0,"",html,selection_command
+479,234022,"examples/jasmine.html",1201,0,"",html,selection_command
+480,234048,"examples/jasmine.html",1198,0,"",html,selection_command
+481,234077,"examples/jasmine.html",1189,0,"",html,selection_command
+482,234111,"examples/jasmine.html",1186,0,"",html,selection_command
+483,234148,"examples/jasmine.html",1180,0,"",html,selection_command
+484,234183,"examples/jasmine.html",1173,0,"",html,selection_command
+485,234215,"examples/jasmine.html",1168,0,"",html,selection_command
+486,234248,"examples/jasmine.html",1162,0,"",html,selection_command
+487,234281,"examples/jasmine.html",1160,0,"",html,selection_command
+488,234445,"examples/jasmine.html",1162,0,"",html,selection_command
+489,234697,"examples/jasmine.html",1168,0,"",html,selection_command
+490,234726,"examples/jasmine.html",1173,0,"",html,selection_command
+491,234757,"examples/jasmine.html",1180,0,"",html,selection_command
+492,234940,"examples/jasmine.html",1186,0,"",html,selection_command
+493,235194,"examples/jasmine.html",1189,0,"",html,selection_command
+494,235218,"examples/jasmine.html",1198,0,"",html,selection_command
+495,235249,"examples/jasmine.html",1201,0,"",html,selection_command
+496,235282,"examples/jasmine.html",1206,0,"",html,selection_command
+497,235792,"examples/jasmine.html",1213,0,"",html,selection_command
+498,236042,"examples/jasmine.html",1216,0,"",html,selection_command
+499,236074,"examples/jasmine.html",1220,0,"",html,selection_command
+500,236101,"examples/jasmine.html",1224,0,"",html,selection_command
+501,237043,"examples/jasmine.html",1220,0,"",html,selection_command
+502,237211,"examples/jasmine.html",1216,0,"",html,selection_command
+503,237543,"examples/jasmine.html",1220,0,"",html,selection_command
+504,238290,"examples/jasmine.html",1220,1,"a",html,selection_command
+505,238345,"examples/jasmine.html",1220,3,"and",html,selection_command
+506,238609,"examples/jasmine.html",1220,11,"and strives",html,selection_command
+507,238638,"examples/jasmine.html",1220,14,"and strives to",html,selection_command
+508,238660,"examples/jasmine.html",1220,17,"and strives to be",html,selection_command
+509,238695,"examples/jasmine.html",1220,20,"and strives to be an",html,selection_command
+510,238732,"examples/jasmine.html",1220,27,"and strives to be an easily",html,selection_command
+511,238760,"examples/jasmine.html",1220,36,"and strives to be an easily hackable",html,selection_command
+512,238793,"examples/jasmine.html",1220,37,"and strives to be an easily hackable,",html,selection_command
+513,238827,"examples/jasmine.html",1220,47,"and strives to be an easily hackable, batteries",html,selection_command
+514,238860,"examples/jasmine.html",1220,48,"and strives to be an easily hackable, batteries-",html,selection_command
+515,238893,"examples/jasmine.html",1220,56,"and strives to be an easily hackable, batteries-included",html,selection_command
+516,238926,"examples/jasmine.html",1220,67,"and strives to be an easily hackable, batteries-included foundation",html,selection_command
+517,238960,"examples/jasmine.html",1220,71,"and strives to be an easily hackable, batteries-included foundation for",html,selection_command
+518,238997,"examples/jasmine.html",1220,77,"and strives to be an easily hackable, batteries-included foundation for world",html,selection_command
+519,239027,"examples/jasmine.html",1220,86,"and strives to be an easily hackable, batteries-included foundation for world modeling",html,selection_command
+520,239890,"examples/jasmine.html",1220,95,"and strives to be an easily hackable, batteries-included foundation for world modeling research",html,selection_command
+521,240062,"examples/jasmine.html",1220,97,"and strives to be an easily hackable, batteries-included foundation for world modeling research"",",html,selection_command
+522,240456,"examples/jasmine.html",1220,96,"and strives to be an easily hackable, batteries-included foundation for world modeling research""",html,selection_command
+523,240618,"examples/jasmine.html",1220,95,"and strives to be an easily hackable, batteries-included foundation for world modeling research",html,selection_command
+524,240964,"examples/jasmine.html",1220,95,"",html,content
+525,241332,"examples/jasmine.html",1219,0,"",html,selection_command
+526,241460,"examples/jasmine.html",1219,1,"",html,content
+527,241998,"examples/jasmine.html",1219,0,".",html,content
+528,242000,"examples/jasmine.html",1220,0,"",html,selection_keyboard
+529,242096,"examples/jasmine.html",1219,0,"",html,selection_command
+530,242346,"examples/jasmine.html",1040,0,"",html,selection_command
+531,250877,"examples/crowd_code.html",0,0,"",html,tab
+532,251716,"examples/jasmine.html",0,0,"",html,tab
+533,271857,"examples/jasmine.html",1223,0,"",html,selection_command
+534,272095,"examples/jasmine.html",1258,0,"",html,selection_command
+535,272615,"examples/jasmine.html",1303,0,"",html,selection_command
+536,272807,"examples/jasmine.html",1320,0,"",html,selection_command
+537,272938,"examples/jasmine.html",1328,0,"",html,selection_command
+538,273210,"examples/jasmine.html",1362,0,"",html,selection_command
+539,273413,"examples/jasmine.html",1418,0,"",html,selection_command
+540,273672,"examples/jasmine.html",1493,0,"",html,selection_command
+541,273701,"examples/jasmine.html",1535,0,"",html,selection_command
+542,273727,"examples/jasmine.html",1544,0,"",html,selection_command
+543,273757,"examples/jasmine.html",1552,0,"",html,selection_command
+544,273794,"examples/jasmine.html",1586,0,"",html,selection_command
+545,273829,"examples/jasmine.html",1644,0,"",html,selection_command
+546,273858,"examples/jasmine.html",1719,0,"",html,selection_command
+547,273892,"examples/jasmine.html",1761,0,"",html,selection_command
+548,273926,"examples/jasmine.html",1770,0,"",html,selection_command
+549,273960,"examples/jasmine.html",1778,0,"",html,selection_command
+550,273994,"examples/jasmine.html",1814,0,"",html,selection_command
+551,274032,"examples/jasmine.html",1859,0,"",html,selection_command
+552,274064,"examples/jasmine.html",1934,0,"",html,selection_command
+553,274097,"examples/jasmine.html",1976,0,"",html,selection_command
+554,274132,"examples/jasmine.html",1984,0,"",html,selection_command
+555,274167,"examples/jasmine.html",1991,0,"",html,selection_command
+556,274200,"examples/jasmine.html",2006,0,"",html,selection_command
+557,274233,"examples/jasmine.html",2028,0,"",html,selection_command
+558,274266,"examples/jasmine.html",2084,0,"",html,selection_command
+559,274299,"examples/jasmine.html",2092,0,"",html,selection_command
+560,274334,"examples/jasmine.html",2098,0,"",html,selection_command
+561,274367,"examples/jasmine.html",2111,0,"",html,selection_command
+562,274399,"examples/jasmine.html",2131,0,"",html,selection_command
+563,275140,"examples/jasmine.html",2143,0,"",html,selection_command
+564,275614,"examples/jasmine.html",2151,0,"",html,selection_command
+565,276647,"examples/jasmine.html",2159,0,"",html,selection_command
+566,276897,"examples/jasmine.html",2162,0,"",html,selection_command
+567,276928,"examples/jasmine.html",2172,0,"",html,selection_command
+568,276953,"examples/jasmine.html",2173,0,"",html,selection_command
+569,276986,"examples/jasmine.html",2175,0,"",html,selection_command
+570,277023,"examples/jasmine.html",2179,0,"",html,selection_command
+571,277243,"examples/jasmine.html",2181,0,"",html,selection_command
+572,277451,"examples/jasmine.html",2186,0,"",html,selection_command
+573,277640,"examples/jasmine.html",2189,0,"",html,selection_command
+574,277795,"examples/jasmine.html",2195,0,"",html,selection_command
+575,277933,"examples/jasmine.html",2196,0,"",html,selection_command
+576,278105,"examples/jasmine.html",2199,0,"",html,selection_command
+577,278270,"examples/jasmine.html",2200,0,"",html,selection_command
+578,278454,"examples/jasmine.html",2201,0,"",html,selection_command
+579,278612,"examples/jasmine.html",2202,0,"",html,selection_command
+580,278847,"examples/jasmine.html",2206,0,"",html,selection_command
+581,279047,"examples/jasmine.html",2207,0,"",html,selection_command
+582,279379,"examples/jasmine.html",2207,5,"",html,content
+583,280105,"examples/jasmine.html",2206,0,"",html,selection_command
+584,280307,"examples/jasmine.html",2207,0,"",html,selection_command
+585,280410,"examples/jasmine.html",2207,1,"-",html,selection_command
+586,280489,"examples/jasmine.html",2207,5,"-code",html,selection_command
+587,280673,"examples/jasmine.html",2207,5,"",html,content
+588,280814,"examples/jasmine.html",2207,0,"j",html,content
+589,280815,"examples/jasmine.html",2208,0,"",html,selection_keyboard
+590,280900,"examples/jasmine.html",2208,0,"a",html,content
+591,280902,"examples/jasmine.html",2209,0,"",html,selection_keyboard
+592,281044,"examples/jasmine.html",2209,0,"m",html,content
+593,281046,"examples/jasmine.html",2210,0,"",html,selection_keyboard
+594,281313,"examples/jasmine.html",2209,1,"",html,content
+595,281484,"examples/jasmine.html",2209,0,"s",html,content
+596,281486,"examples/jasmine.html",2210,0,"",html,selection_keyboard
+597,281568,"examples/jasmine.html",2210,0,"m",html,content
+598,281570,"examples/jasmine.html",2211,0,"",html,selection_keyboard
+599,281715,"examples/jasmine.html",2211,0,"i",html,content
+600,281717,"examples/jasmine.html",2212,0,"",html,selection_keyboard
+601,281777,"examples/jasmine.html",2212,0,"n",html,content
+602,281780,"examples/jasmine.html",2213,0,"",html,selection_keyboard
+603,281817,"examples/jasmine.html",2213,0,"e",html,content
+604,281818,"examples/jasmine.html",2214,0,"",html,selection_keyboard
+605,282071,"examples/jasmine.html",2213,0,"",html,selection_command
+606,282336,"examples/jasmine.html",2214,0,"",html,selection_command
+607,282483,"examples/jasmine.html",2216,0,"",html,selection_command
+608,282989,"examples/jasmine.html",2216,1,"c",html,selection_command
+609,283034,"examples/jasmine.html",2216,5,"crowd",html,selection_command
+610,283164,"examples/jasmine.html",2216,6,"crowd-",html,selection_command
+611,283518,"examples/jasmine.html",2216,10,"crowd-code",html,selection_command
+612,283736,"examples/jasmine.html",2216,10,"",html,content
+613,284047,"examples/jasmine.html",2216,0,"J",html,content
+614,284050,"examples/jasmine.html",2217,0,"",html,selection_keyboard
+615,284206,"examples/jasmine.html",2217,0,"a",html,content
+616,284214,"examples/jasmine.html",2218,0,"",html,selection_keyboard
+617,284279,"examples/jasmine.html",2218,0,"s",html,content
+618,284281,"examples/jasmine.html",2219,0,"",html,selection_keyboard
+619,284379,"examples/jasmine.html",2219,0,"m",html,content
+620,284381,"examples/jasmine.html",2220,0,"",html,selection_keyboard
+621,284491,"examples/jasmine.html",2220,0,"i",html,content
+622,284494,"examples/jasmine.html",2221,0,"",html,selection_keyboard
+623,284562,"examples/jasmine.html",2221,0,"n",html,content
+624,284564,"examples/jasmine.html",2222,0,"",html,selection_keyboard
+625,284634,"examples/jasmine.html",2222,0,"e",html,content
+626,284636,"examples/jasmine.html",2223,0,"",html,selection_keyboard
+627,284849,"examples/jasmine.html",2222,0,"",html,selection_command
+628,287397,"examples/jasmine.html",2223,0,"",html,selection_command
+629,287640,"examples/jasmine.html",2225,0,"",html,selection_command
+630,287859,"examples/jasmine.html",2226,0,"",html,selection_command
+631,288074,"examples/jasmine.html",2229,0,"",html,selection_command
+632,289087,"examples/jasmine.html",2231,0,"",html,selection_command
+633,289567,"examples/jasmine.html",2231,141,"",html,content
+634,290608,"examples/jasmine.html",2230,0,"",html,selection_command
+635,291443,"examples/jasmine.html",2311,0,"",html,selection_command
+636,291815,"examples/jasmine.html",2232,103,"",html,content
+637,291836,"examples/jasmine.html",2236,0,"",html,selection_command
+638,291939,"examples/jasmine.html",2155,0,"",html,selection_command
+639,292373,"examples/jasmine.html",2231,0,"",html,selection_command
+640,296618,"examples/jasmine.html",2231,0,"production-ready JAX-based codebase for world modeling from unlabeled videos.",html,content
+641,296904,"examples/jasmine.html",2307,0,"",html,selection_command
+642,297709,"examples/jasmine.html",2308,0,"",html,selection_command
+643,297876,"examples/jasmine.html",2308,0," ",html,content
+644,297877,"examples/jasmine.html",2309,0,"",html,selection_keyboard
+645,297957,"examples/jasmine.html",2309,0,"S",html,content
+646,297958,"examples/jasmine.html",2310,0,"",html,selection_keyboard
+647,298244,"examples/jasmine.html",2310,0,"c",html,content
+648,298246,"examples/jasmine.html",2311,0,"",html,selection_keyboard
+649,298291,"examples/jasmine.html",2311,0,"a",html,content
+650,298293,"examples/jasmine.html",2312,0,"",html,selection_keyboard
+651,298864,"examples/jasmine.html",2312,0,"ยง",html,content
+652,298865,"examples/jasmine.html",2313,0,"",html,selection_keyboard
+653,299613,"examples/jasmine.html",2312,1,"",html,content
+654,299794,"examples/jasmine.html",2312,0,"l",html,content
+655,299796,"examples/jasmine.html",2313,0,"",html,selection_keyboard
+656,299847,"examples/jasmine.html",2313,0,"e",html,content
+657,299849,"examples/jasmine.html",2314,0,"",html,selection_keyboard
+658,300078,"examples/jasmine.html",2314,0," from single hosts to hundreds of xPUs thanks to XLA.",html,content
+659,300379,"examples/jasmine.html",2366,0,"",html,selection_command
+660,300550,"examples/jasmine.html",2151,0,"",html,selection_command
+661,342078,"examples/jasmine.html",2368,0,"",html,selection_command
+662,342329,"examples/jasmine.html",2377,0,"",html,selection_command
+663,342353,"examples/jasmine.html",2390,0,"",html,selection_command
+664,342384,"examples/jasmine.html",2414,0,"",html,selection_command
+665,342545,"examples/jasmine.html",2428,0,"",html,selection_command
+666,342707,"examples/jasmine.html",2502,0,"",html,selection_command
+667,344769,"examples/jasmine.html",2535,0,"",html,selection_command
+668,345233,"examples/jasmine.html",2502,0,"",html,selection_command
+669,345418,"examples/jasmine.html",2502,32,"
\n Neural networks are simulators. Anything you want your model to do, you have to teach it. Base models represent the mean of the data distribution of the internet.",html,selection_command
+672,345939,"examples/jasmine.html",2502,379,"
Data Is The New Oil
\n
\n Neural networks are simulators. Anything you want your model to do, you have to teach it. Base models represent the mean of the data distribution of the internet.\n Post-training permits shifting that distribution towards desired behaviours. The higher the required skill, the scarcer the corresponding data on the internet.",html,selection_command
+673,345966,"examples/jasmine.html",2502,388,"
Data Is The New Oil
\n
\n Neural networks are simulators. Anything you want your model to do, you have to teach it. Base models represent the mean of the data distribution of the internet.\n Post-training permits shifting that distribution towards desired behaviours. The higher the required skill, the scarcer the corresponding data on the internet.\n
\n Neural networks are simulators. Anything you want your model to do, you have to teach it. Base models represent the mean of the data distribution of the internet.\n Post-training permits shifting that distribution towards desired behaviours. The higher the required skill, the scarcer the corresponding data on the internet.\n
\n Neural networks are simulators. Anything you want your model to do, you have to teach it. Base models represent the mean of the data distribution of the internet.\n Post-training permits shifting that distribution towards desired behaviours. The higher the required skill, the scarcer the corresponding data on the internet.\n
\n Neural networks are simulators. Anything you want your model to do, you have to teach it. Base models represent the mean of the data distribution of the internet.\n Post-training permits shifting that distribution towards desired behaviours. The higher the required skill, the scarcer the corresponding data on the internet.",html,selection_command
+677,346556,"examples/jasmine.html",2502,211,"
Data Is The New Oil
\n
\n Neural networks are simulators. Anything you want your model to do, you have to teach it. Base models represent the mean of the data distribution of the internet.",html,selection_command
+678,346668,"examples/jasmine.html",2543,0,"",html,selection_command
+679,346839,"examples/jasmine.html",2714,0,"",html,selection_command
+680,347105,"examples/jasmine.html",2543,0,"",html,selection_command
+681,348422,"examples/jasmine.html",2542,344,"",html,content
+682,348785,"examples/jasmine.html",2541,0,"",html,selection_command
+683,349186,"examples/jasmine.html",2542,0,"",html,selection_command
+684,349279,"examples/jasmine.html",2542,0,"\n \n ",html,content
+685,349439,"examples/jasmine.html",2543,8,"",html,content
+686,350039,"examples/jasmine.html",2544,0,"",html,selection_command
+687,350189,"examples/jasmine.html",2553,0,"",html,selection_command
+688,350611,"examples/jasmine.html",2553,7,"
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,",html,selection_command
+690,350994,"examples/jasmine.html",2553,361,"
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt",html,selection_command
+691,351020,"examples/jasmine.html",2553,536,"
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping",html,selection_command
+692,351049,"examples/jasmine.html",2553,649,"
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function defintions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.",html,selection_command
+693,351083,"examples/jasmine.html",2553,658,"
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function defintions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function defintions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function defintions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function defintions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function defintions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the",html,selection_command
+698,351251,"examples/jasmine.html",2553,1088,"
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function defintions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.",html,selection_command
+699,351286,"examples/jasmine.html",2553,1097,"
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function defintions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function defintions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function defintions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.",html,selection_command
+702,351386,"examples/jasmine.html",2553,1457,"
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function defintions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is ",html,selection_command
+703,351419,"examples/jasmine.html",2553,1524,"
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.",html,selection_command
+704,351453,"examples/jasmine.html",2553,1533,"
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function defintions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n \n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n \n Figure 1: A preview of the crowd-code extension in action. Figure from Mattia Consiglio.",html,selection_command
+708,351586,"examples/jasmine.html",2553,1958,"
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n \n Figure 1: A preview of the crowd-code extension in action. Figure from Mattia Consiglio.\n
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n \n Figure 1: A preview of the crowd-code extension in action. Figure from Mattia Consiglio.\n
\n The extension periodically uploads the user's captured IDE actions to a server, where they are cleaned, filtered, thoroughly anonymized, and periodically released to the",html,selection_command
+710,351653,"examples/jasmine.html",2553,2209,"
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n \n Figure 1: A preview of the crowd-code extension in action. Figure from Mattia Consiglio.\n
\n The extension periodically uploads the user's captured IDE actions to a server, where they are cleaned, filtered, thoroughly anonymized, and periodically released to the\n public under the most permissive Creative Commons license (CC0).",html,selection_command
+711,351686,"examples/jasmine.html",2553,2379,"
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n \n Figure 1: A preview of the crowd-code extension in action. Figure from Mattia Consiglio.\n
\n The extension periodically uploads the user's captured IDE actions to a server, where they are cleaned, filtered, thoroughly anonymized, and periodically released to the\n public under the most permissive Creative Commons license (CC0).\n An ongoing recording is transparently indicated in the IDE's status bar, and can be stopped at any time. If the user has inserted sensitive data, they can simply",html,selection_command
+712,351719,"examples/jasmine.html",2553,2555,"
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n \n Figure 1: A preview of the crowd-code extension in action. Figure from Mattia Consiglio.\n
\n The extension periodically uploads the user's captured IDE actions to a server, where they are cleaned, filtered, thoroughly anonymized, and periodically released to the\n public under the most permissive Creative Commons license (CC0).\n An ongoing recording is transparently indicated in the IDE's status bar, and can be stopped at any time. If the user has inserted sensitive data, they can simply\n press the 'panic button' in the status bar to remove the last actions from the recording before they even leave the user's machine. Additionally, the user is asked for",html,selection_command
+713,351753,"examples/jasmine.html",2553,2733,"
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n \n Figure 1: A preview of the crowd-code extension in action. Figure from Mattia Consiglio.\n
\n The extension periodically uploads the user's captured IDE actions to a server, where they are cleaned, filtered, thoroughly anonymized, and periodically released to the\n public under the most permissive Creative Commons license (CC0).\n An ongoing recording is transparently indicated in the IDE's status bar, and can be stopped at any time. If the user has inserted sensitive data, they can simply\n press the 'panic button' in the status bar to remove the last actions from the recording before they even leave the user's machine. Additionally, the user is asked for\n consent to participate in crowd-sourcing upon extension installation, and can opt out at any time. We take user privacy very seriously and welcome any feedback on how to",html,selection_command
+714,351786,"examples/jasmine.html",2553,2792,"
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n \n Figure 1: A preview of the crowd-code extension in action. Figure from Mattia Consiglio.\n
\n The extension periodically uploads the user's captured IDE actions to a server, where they are cleaned, filtered, thoroughly anonymized, and periodically released to the\n public under the most permissive Creative Commons license (CC0).\n An ongoing recording is transparently indicated in the IDE's status bar, and can be stopped at any time. If the user has inserted sensitive data, they can simply\n press the 'panic button' in the status bar to remove the last actions from the recording before they even leave the user's machine. Additionally, the user is asked for\n consent to participate in crowd-sourcing upon extension installation, and can opt out at any time. We take user privacy very seriously and welcome any feedback on how to\n make data collection and release more transparent.",html,selection_command
+715,351819,"examples/jasmine.html",2553,2801,"
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n \n Figure 1: A preview of the crowd-code extension in action. Figure from Mattia Consiglio.\n
\n The extension periodically uploads the user's captured IDE actions to a server, where they are cleaned, filtered, thoroughly anonymized, and periodically released to the\n public under the most permissive Creative Commons license (CC0).\n An ongoing recording is transparently indicated in the IDE's status bar, and can be stopped at any time. If the user has inserted sensitive data, they can simply\n press the 'panic button' in the status bar to remove the last actions from the recording before they even leave the user's machine. Additionally, the user is asked for\n consent to participate in crowd-sourcing upon extension installation, and can opt out at any time. We take user privacy very seriously and welcome any feedback on how to\n make data collection and release more transparent.\n
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n \n Figure 1: A preview of the crowd-code extension in action. Figure from Mattia Consiglio.\n
\n The extension periodically uploads the user's captured IDE actions to a server, where they are cleaned, filtered, thoroughly anonymized, and periodically released to the\n public under the most permissive Creative Commons license (CC0).\n An ongoing recording is transparently indicated in the IDE's status bar, and can be stopped at any time. If the user has inserted sensitive data, they can simply\n press the 'panic button' in the status bar to remove the last actions from the recording before they even leave the user's machine. Additionally, the user is asked for\n consent to participate in crowd-sourcing upon extension installation, and can opt out at any time. We take user privacy very seriously and welcome any feedback on how to\n make data collection and release more transparent.\n
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n \n Figure 1: A preview of the crowd-code extension in action. Figure from Mattia Consiglio.\n
\n The extension periodically uploads the user's captured IDE actions to a server, where they are cleaned, filtered, thoroughly anonymized, and periodically released to the\n public under the most permissive Creative Commons license (CC0).\n An ongoing recording is transparently indicated in the IDE's status bar, and can be stopped at any time. If the user has inserted sensitive data, they can simply\n press the 'panic button' in the status bar to remove the last actions from the recording before they even leave the user's machine. Additionally, the user is asked for\n consent to participate in crowd-sourcing upon extension installation, and can opt out at any time. We take user privacy very seriously and welcome any feedback on how to\n make data collection and release more transparent.\n
\n
\n Beyond behaviour-cloning on a dataset crowd-sourced using crowd-code, we eventually want to use crowd-code to annotate the entirety of IDE screencasts on the internet ",html,selection_command
+718,351922,"examples/jasmine.html",2553,3160,"
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n \n Figure 1: A preview of the crowd-code extension in action. Figure from Mattia Consiglio.\n
\n The extension periodically uploads the user's captured IDE actions to a server, where they are cleaned, filtered, thoroughly anonymized, and periodically released to the\n public under the most permissive Creative Commons license (CC0).\n An ongoing recording is transparently indicated in the IDE's status bar, and can be stopped at any time. If the user has inserted sensitive data, they can simply\n press the 'panic button' in the status bar to remove the last actions from the recording before they even leave the user's machine. Additionally, the user is asked for\n consent to participate in crowd-sourcing upon extension installation, and can opt out at any time. We take user privacy very seriously and welcome any feedback on how to\n make data collection and release more transparent.\n
\n
\n Beyond behaviour-cloning on a dataset crowd-sourced using crowd-code, we eventually want to use crowd-code to annotate the entirety of IDE screencasts on the internet \n using an inverse dynamics model trained on screen recordings paired with crowd-code's IDE action annotations. This would unlock an entirely new trove of training data",html,selection_command
+719,351953,"examples/jasmine.html",2553,3201,"
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n \n Figure 1: A preview of the crowd-code extension in action. Figure from Mattia Consiglio.\n
\n The extension periodically uploads the user's captured IDE actions to a server, where they are cleaned, filtered, thoroughly anonymized, and periodically released to the\n public under the most permissive Creative Commons license (CC0).\n An ongoing recording is transparently indicated in the IDE's status bar, and can be stopped at any time. If the user has inserted sensitive data, they can simply\n press the 'panic button' in the status bar to remove the last actions from the recording before they even leave the user's machine. Additionally, the user is asked for\n consent to participate in crowd-sourcing upon extension installation, and can opt out at any time. We take user privacy very seriously and welcome any feedback on how to\n make data collection and release more transparent.\n
\n
\n Beyond behaviour-cloning on a dataset crowd-sourced using crowd-code, we eventually want to use crowd-code to annotate the entirety of IDE screencasts on the internet \n using an inverse dynamics model trained on screen recordings paired with crowd-code's IDE action annotations. This would unlock an entirely new trove of training data\n for software engineering agents.",html,selection_command
+720,351986,"examples/jasmine.html",2553,3210,"
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n \n Figure 1: A preview of the crowd-code extension in action. Figure from Mattia Consiglio.\n
\n The extension periodically uploads the user's captured IDE actions to a server, where they are cleaned, filtered, thoroughly anonymized, and periodically released to the\n public under the most permissive Creative Commons license (CC0).\n An ongoing recording is transparently indicated in the IDE's status bar, and can be stopped at any time. If the user has inserted sensitive data, they can simply\n press the 'panic button' in the status bar to remove the last actions from the recording before they even leave the user's machine. Additionally, the user is asked for\n consent to participate in crowd-sourcing upon extension installation, and can opt out at any time. We take user privacy very seriously and welcome any feedback on how to\n make data collection and release more transparent.\n
\n
\n Beyond behaviour-cloning on a dataset crowd-sourced using crowd-code, we eventually want to use crowd-code to annotate the entirety of IDE screencasts on the internet \n using an inverse dynamics model trained on screen recordings paired with crowd-code's IDE action annotations. This would unlock an entirely new trove of training data\n for software engineering agents.\n
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n \n Figure 1: A preview of the crowd-code extension in action. Figure from Mattia Consiglio.\n
\n The extension periodically uploads the user's captured IDE actions to a server, where they are cleaned, filtered, thoroughly anonymized, and periodically released to the\n public under the most permissive Creative Commons license (CC0).\n An ongoing recording is transparently indicated in the IDE's status bar, and can be stopped at any time. If the user has inserted sensitive data, they can simply\n press the 'panic button' in the status bar to remove the last actions from the recording before they even leave the user's machine. Additionally, the user is asked for\n consent to participate in crowd-sourcing upon extension installation, and can opt-out at any time. We take user privacy very seriously and welcome any feedback on how to\n make data collection and release more transparent.\n
\n
\n Beyond behaviour-cloning on a dataset crowd-sourced using crowd-code, we eventually want to use crowd-code to annotate the entirety of IDE screencasts on the internet \n using an inverse dynamics model trained on screen recordings paired with crowd-code's IDE action annotations. This would unlock an entirely new trove of training data\n for software engineering agents.\n
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n \n Figure 1: A preview of the crowd-code extension in action. Figure from Mattia Consiglio.\n
\n The extension periodically uploads the user's captured IDE actions to a server, where they are cleaned, filtered, thoroughly anonymized, and periodically released to the\n public under the most permissive Creative Commons license (CC0).\n An ongoing recording is transparently indicated in the IDE's status bar, and can be stopped at any time. If the user has inserted sensitive data, they can simply\n press the 'panic button' in the status bar to remove the last actions from the recording before they even leave the user's machine. Additionally, the user is asked for\n consent to participate in crowd-sourcing upon extension installation, and can opt-out at any time. We take user privacy very seriously and welcome any feedback on how to\n make data collection and release more transparent.\n
\n
\n Beyond behaviour-cloning on a dataset crowd-sourced using crowd-code, we eventually want to use crowd-code to annotate the entirety of IDE screencasts on the internet \n using an inverse dynamics model trained on screen recordings paired with crowd-code's IDE action annotations. This would unlock an entirely new trove of training data\n for software engineering agents.\n
\n
\n We are excited to see what the community builds with this dataset. We want to democratize AI research. We are greater than the sum of our parts. Together.",html,selection_command
+723,352209,"examples/jasmine.html",2553,3397,"
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n \n Figure 1: A preview of the crowd-code extension in action. Figure from Mattia Consiglio.\n
\n The extension periodically uploads the user's captured IDE actions to a server, where they are cleaned, filtered, thoroughly anonymized, and periodically released to the\n public under the most permissive Creative Commons license (CC0).\n An ongoing recording is transparently indicated in the IDE's status bar, and can be stopped at any time. If the user has inserted sensitive data, they can simply\n press the 'panic button' in the status bar to remove the last actions from the recording before they even leave the user's machine. Additionally, the user is asked for\n consent to participate in crowd-sourcing upon extension installation, and can opt-out at any time. We take user privacy very seriously and welcome any feedback on how to\n make data collection and release more transparent.\n
\n
\n Beyond behaviour-cloning on a dataset crowd-sourced using crowd-code, we eventually want to use crowd-code to annotate the entirety of IDE screencasts on the internet \n using an inverse dynamics model trained on screen recordings paired with crowd-code's IDE action annotations. This would unlock an entirely new trove of training data\n for software engineering agents.\n
\n
\n We are excited to see what the community builds with this dataset. We want to democratize AI research. We are greater than the sum of our parts. Together.\n
",html,selection_command
+724,352858,"examples/jasmine.html",2553,3398,"",html,content
+725,352867,"examples/jasmine.html",2557,0,"",html,selection_command
+726,353273,"examples/jasmine.html",2548,0,"",html,selection_command
+727,353422,"examples/jasmine.html",2543,0,"",html,selection_command
+728,353560,"examples/jasmine.html",2539,0,"",html,selection_command
+729,353693,"examples/jasmine.html",2506,0,"",html,selection_command
+730,354102,"examples/jasmine.html",2507,0,"",html,selection_command
+731,354226,"examples/jasmine.html",2509,0,"",html,selection_command
+732,354512,"examples/jasmine.html",2510,0,"",html,selection_command
+733,354888,"examples/jasmine.html",2510,19,"",html,content
+734,355288,"examples/jasmine.html",2509,0,"",html,selection_command
+735,356237,"examples/jasmine.html",2510,0,"",html,selection_command
+736,356530,"examples/jasmine.html",2509,0,"",html,selection_command
+737,372859,"examples/jasmine.html",2510,0,"",html,selection_command
+738,373143,"examples/jasmine.html",2510,0,"I",html,content
+739,373145,"examples/jasmine.html",2511,0,"",html,selection_keyboard
+740,373288,"examples/jasmine.html",2511,0,"n",html,content
+741,373289,"examples/jasmine.html",2512,0,"",html,selection_keyboard
+742,373406,"examples/jasmine.html",2512,0,"t",html,content
+743,373407,"examples/jasmine.html",2513,0,"",html,selection_keyboard
+744,373448,"examples/jasmine.html",2513,0,"r",html,content
+745,373450,"examples/jasmine.html",2514,0,"",html,selection_keyboard
+746,373523,"examples/jasmine.html",2514,0,"o",html,content
+747,373526,"examples/jasmine.html",2515,0,"",html,selection_keyboard
+748,373659,"examples/jasmine.html",2515,0,"d",html,content
+749,373662,"examples/jasmine.html",2516,0,"",html,selection_keyboard
+750,373754,"examples/jasmine.html",2516,0,"u",html,content
+751,373757,"examples/jasmine.html",2517,0,"",html,selection_keyboard
+752,373871,"examples/jasmine.html",2517,0,"c",html,content
+753,373875,"examples/jasmine.html",2518,0,"",html,selection_keyboard
+754,374082,"examples/jasmine.html",2518,0,"t",html,content
+755,374086,"examples/jasmine.html",2519,0,"",html,selection_keyboard
+756,374153,"examples/jasmine.html",2519,0,"i",html,content
+757,374155,"examples/jasmine.html",2520,0,"",html,selection_keyboard
+758,374216,"examples/jasmine.html",2520,0,"o",html,content
+759,374218,"examples/jasmine.html",2521,0,"",html,selection_keyboard
+760,374279,"examples/jasmine.html",2521,0,"n",html,content
+761,374280,"examples/jasmine.html",2522,0,"",html,selection_keyboard
+762,374460,"examples/jasmine.html",2521,0,"",html,selection_command
+763,374701,"examples/jasmine.html",2534,0,"",html,selection_command
+764,374857,"examples/jasmine.html",2536,0,"",html,selection_command
+765,377357,"examples/jasmine.html",2535,1,"",html,content
+766,377492,"examples/jasmine.html",2535,0,"\n ",html,content
+767,377810,"examples/jasmine.html",2544,0,"We are at the cusp of an intelligence revolution. Neural networks are able to clone the behavior of peak human intellectual performance (cite Gemini/OAI IMO) given enough compute, data, and the right algorithms (R1). While an increasing amount of capital expenditure is allocated to compute clusters, and a well-working recipe of equipping models with the required priors and capacity to reason is publicly available, the path to human-level intelligence with the ability to automate large fractions of the economy will increasingly be shaped by paradigms that are able to find and efficiently use untouched data troves.\n\nWhile product-feedback-loops (cite cursor) constitute an adaptive data trove, many domains like robotics are not mature enough to yield a product with wide enough adoption to create a feedback-loop of sufficient magnitude, begging the question of alternatives.\n\nOne paradigm proposed by the research community to overcome the data scarcity in those domains is that of world models. While world models can help frontier model development in numerous ways, an ambitious goal of the community is to train a world model to act as a simulation of the world (cite Genie 1, Genie 2), in order to train an agent in that simulation, via an adaptive curriculum (cite Michael Dennis) or otherwise.\n",html,content
+768,378614,"examples/jasmine.html",3428,0,"",html,selection_command
+769,379004,"examples/jasmine.html",3427,0,"",html,selection_command
+770,379178,"examples/jasmine.html",3166,0,"",html,selection_command
+771,379352,"examples/jasmine.html",3165,0,"",html,selection_command
+772,379820,"examples/jasmine.html",2536,0,"",html,selection_command
+773,380016,"examples/jasmine.html",2545,0,"",html,selection_command
+774,380263,"examples/jasmine.html",2549,0,"",html,selection_command
+775,380294,"examples/jasmine.html",2552,0,"",html,selection_command
+776,380328,"examples/jasmine.html",2556,0,"",html,selection_command
+777,380360,"examples/jasmine.html",2561,0,"",html,selection_command
+778,380393,"examples/jasmine.html",2564,0,"",html,selection_command
+779,380427,"examples/jasmine.html",2567,0,"",html,selection_command
+780,380460,"examples/jasmine.html",2580,0,"",html,selection_command
+781,380494,"examples/jasmine.html",2591,0,"",html,selection_command
+782,380527,"examples/jasmine.html",2592,0,"",html,selection_command
+783,380560,"examples/jasmine.html",2599,0,"",html,selection_command
+784,380594,"examples/jasmine.html",2608,0,"",html,selection_command
+785,380830,"examples/jasmine.html",2759,0,"",html,selection_command
+786,381789,"examples/jasmine.html",2760,0,"",html,selection_command
+787,381881,"examples/jasmine.html",2760,0,"\n ",html,content
+788,383114,"examples/jasmine.html",2765,4,"",html,content
+789,383498,"examples/jasmine.html",2764,0,"",html,selection_command
+790,384013,"examples/jasmine.html",2766,0,"",html,selection_command
+791,384670,"examples/jasmine.html",3432,0," \n
",html,content
+1071,513167,"examples/jasmine.html",3498,0,"",html,selection_command
+1072,513442,"examples/jasmine.html",3493,0,"",html,selection_command
+1073,514695,"examples/jasmine.html",3493,1,"",html,content
+1074,514699,"examples/jasmine.html",3497,0,"",html,selection_command
+1075,552111,"examples/jasmine.html",3505,0,"",html,selection_command
+1076,552238,"examples/jasmine.html",3926,0,"",html,selection_command
+1077,552545,"examples/jasmine.html",3926,1,"",html,content
+1078,552568,"examples/jasmine.html",3930,0,"",html,selection_command
+1079,552726,"examples/jasmine.html",3505,0,"",html,selection_command
+1080,552969,"examples/jasmine.html",3501,0,"",html,selection_command
+1081,553146,"examples/jasmine.html",3501,0," ",html,content
+1082,553482,"examples/jasmine.html",3504,0,"",html,selection_command
+1083,554059,"examples/jasmine.html",3505,0,"",html,selection_command
+1084,554189,"examples/jasmine.html",3505,0," ",html,content
+1085,554421,"examples/jasmine.html",3508,0,"",html,selection_command
+1086,554622,"examples/jasmine.html",3499,0,"",html,selection_command
+1087,554767,"examples/jasmine.html",3491,0,"",html,selection_command
+1088,554904,"examples/jasmine.html",3222,0,"",html,selection_command
+1089,555656,"examples/jasmine.html",3227,0,"",html,selection_command
+1090,555900,"examples/jasmine.html",3235,0,"",html,selection_command
+1091,555930,"examples/jasmine.html",3236,0,"",html,selection_command
+1092,555964,"examples/jasmine.html",3244,0,"",html,selection_command
+1093,556001,"examples/jasmine.html",3245,0,"",html,selection_command
+1094,556037,"examples/jasmine.html",3250,0,"",html,selection_command
+1095,556071,"examples/jasmine.html",3252,0,"",html,selection_command
+1096,556103,"examples/jasmine.html",3256,0,"",html,selection_command
+1097,556135,"examples/jasmine.html",3263,0,"",html,selection_command
+1098,556171,"examples/jasmine.html",3264,0,"",html,selection_command
+1099,556216,"examples/jasmine.html",3275,0,"",html,selection_command
+1100,556237,"examples/jasmine.html",3278,0,"",html,selection_command
+1101,556269,"examples/jasmine.html",3287,0,"",html,selection_command
+1102,556300,"examples/jasmine.html",3292,0,"",html,selection_command
+1103,556333,"examples/jasmine.html",3298,0,"",html,selection_command
+1104,556366,"examples/jasmine.html",3299,0,"",html,selection_command
+1105,556399,"examples/jasmine.html",3304,0,"",html,selection_command
+1106,556443,"examples/jasmine.html",3312,0,"",html,selection_command
+1107,556468,"examples/jasmine.html",3317,0,"",html,selection_command
+1108,556500,"examples/jasmine.html",3326,0,"",html,selection_command
+1109,556534,"examples/jasmine.html",3330,0,"",html,selection_command
+1110,556569,"examples/jasmine.html",3334,0,"",html,selection_command
+1111,556603,"examples/jasmine.html",3341,0,"",html,selection_command
+1112,556636,"examples/jasmine.html",3348,0,"",html,selection_command
+1113,556669,"examples/jasmine.html",3351,0,"",html,selection_command
+1114,556702,"examples/jasmine.html",3357,0,"",html,selection_command
+1115,556735,"examples/jasmine.html",3359,0,"",html,selection_command
+1116,556768,"examples/jasmine.html",3367,0,"",html,selection_command
+1117,556802,"examples/jasmine.html",3372,0,"",html,selection_command
+1118,556834,"examples/jasmine.html",3377,0,"",html,selection_command
+1119,557041,"examples/jasmine.html",3384,0,"",html,selection_command
+1120,557426,"examples/jasmine.html",3385,0,"",html,selection_command
+1121,557560,"examples/jasmine.html",3385,0,"\n ",html,content
+1122,557811,"examples/jasmine.html",3393,0,"",html,selection_command
+1123,557991,"examples/jasmine.html",3394,0,"",html,selection_command
+1124,558127,"examples/jasmine.html",3394,1,"",html,content
+1125,558356,"examples/jasmine.html",3401,0,"",html,selection_command
+1126,558603,"examples/jasmine.html",3404,0,"",html,selection_command
+1127,558635,"examples/jasmine.html",3411,0,"",html,selection_command
+1128,558666,"examples/jasmine.html",3413,0,"",html,selection_command
+1129,558702,"examples/jasmine.html",3422,0,"",html,selection_command
+1130,559001,"examples/jasmine.html",3499,0,"",html,selection_command
+1131,559399,"examples/jasmine.html",3507,0,"",html,selection_command
+1132,559558,"examples/jasmine.html",3545,0,"",html,selection_command
+1133,559697,"examples/jasmine.html",3553,0,"",html,selection_command
+1134,559958,"examples/jasmine.html",3563,0,"",html,selection_command
+1135,559991,"examples/jasmine.html",3566,0,"",html,selection_command
+1136,560019,"examples/jasmine.html",3575,0,"",html,selection_command
+1137,560046,"examples/jasmine.html",3579,0,"",html,selection_command
+1138,560079,"examples/jasmine.html",3584,0,"",html,selection_command
+1139,560114,"examples/jasmine.html",3593,0,"",html,selection_command
+1140,560148,"examples/jasmine.html",3596,0,"",html,selection_command
+1141,560184,"examples/jasmine.html",3602,0,"",html,selection_command
+1142,560218,"examples/jasmine.html",3610,0,"",html,selection_command
+1143,560250,"examples/jasmine.html",3613,0,"",html,selection_command
+1144,560284,"examples/jasmine.html",3618,0,"",html,selection_command
+1145,560318,"examples/jasmine.html",3621,0,"",html,selection_command
+1146,560353,"examples/jasmine.html",3627,0,"",html,selection_command
+1147,560387,"examples/jasmine.html",3634,0,"",html,selection_command
+1148,560420,"examples/jasmine.html",3635,0,"",html,selection_command
+1149,560456,"examples/jasmine.html",3641,0,"",html,selection_command
+1150,560487,"examples/jasmine.html",3647,0,"",html,selection_command
+1151,560520,"examples/jasmine.html",3654,0,"",html,selection_command
+1152,560677,"examples/jasmine.html",3658,0,"",html,selection_command
+1153,560856,"examples/jasmine.html",3663,0,"",html,selection_command
+1154,561031,"examples/jasmine.html",3672,0,"",html,selection_command
+1155,561182,"examples/jasmine.html",3678,0,"",html,selection_command
+1156,561373,"examples/jasmine.html",3690,0,"",html,selection_command
+1157,561845,"examples/jasmine.html",3680,0,"",html,selection_command
+1158,562055,"examples/jasmine.html",3674,0,"",html,selection_command
+1159,562178,"examples/jasmine.html",3678,0,"",html,selection_command
+1160,562264,"examples/jasmine.html",3679,0,"",html,selection_command
+1161,562819,"examples/jasmine.html",3679,0,"\n ",html,content
+1162,562980,"examples/jasmine.html",3687,0,"",html,selection_command
+1163,563161,"examples/jasmine.html",3688,0,"",html,selection_command
+1164,563254,"examples/jasmine.html",3688,1,"",html,content
+1165,563729,"examples/jasmine.html",3698,0,"",html,selection_command
+1166,563985,"examples/jasmine.html",3701,0,"",html,selection_command
+1167,564012,"examples/jasmine.html",3710,0,"",html,selection_command
+1168,564047,"examples/jasmine.html",3715,0,"",html,selection_command
+1169,564079,"examples/jasmine.html",3716,0,"",html,selection_command
+1170,564112,"examples/jasmine.html",3719,0,"",html,selection_command
+1171,564145,"examples/jasmine.html",3729,0,"",html,selection_command
+1172,564181,"examples/jasmine.html",3734,0,"",html,selection_command
+1173,564212,"examples/jasmine.html",3737,0,"",html,selection_command
+1174,564245,"examples/jasmine.html",3741,0,"",html,selection_command
+1175,564279,"examples/jasmine.html",3751,0,"",html,selection_command
+1176,564312,"examples/jasmine.html",3754,0,"",html,selection_command
+1177,564346,"examples/jasmine.html",3757,0,"",html,selection_command
+1178,564379,"examples/jasmine.html",3763,0,"",html,selection_command
+1179,564412,"examples/jasmine.html",3765,0,"",html,selection_command
+1180,565163,"examples/jasmine.html",3779,0,"",html,selection_command
+1181,565534,"examples/jasmine.html",3784,0,"",html,selection_command
+1182,565668,"examples/jasmine.html",3797,0,"",html,selection_command
+1183,565814,"examples/jasmine.html",3805,0,"",html,selection_command
+1184,565962,"examples/jasmine.html",3818,0,"",html,selection_command
+1185,566112,"examples/jasmine.html",3849,0,"",html,selection_command
+1186,566250,"examples/jasmine.html",3852,0,"",html,selection_command
+1187,566750,"examples/jasmine.html",3851,0,"",html,selection_command
+1188,567296,"examples/jasmine.html",3851,0,"\n ",html,content
+1189,567534,"examples/jasmine.html",3859,0,"",html,selection_command
+1190,567666,"examples/jasmine.html",3860,0,"",html,selection_command
+1191,567786,"examples/jasmine.html",3860,1,"",html,content
+1192,568533,"examples/jasmine.html",3957,0,"",html,selection_command
+1193,568785,"examples/jasmine.html",3956,0,"",html,selection_command
+1194,570601,"examples/jasmine.html",3958,0,"",html,selection_command
+1195,570747,"examples/jasmine.html",3967,0,"",html,selection_command
+1196,570906,"examples/jasmine.html",3984,0,"",html,selection_command
+1197,571321,"examples/jasmine.html",3985,0,"",html,selection_command
+1198,571485,"examples/jasmine.html",4000,0,"",html,selection_command
+1199,571622,"examples/jasmine.html",4001,0,"",html,selection_command
+1200,572045,"examples/jasmine.html",4028,0,"",html,selection_command
+1201,573056,"examples/jasmine.html",4032,0,"",html,selection_command
+1202,573219,"examples/jasmine.html",4033,0,"",html,selection_command
+1203,573516,"examples/jasmine.html",4034,0,"",html,selection_command
+1204,573902,"examples/jasmine.html",4035,0,"",html,selection_command
+1205,574567,"examples/jasmine.html",4035,1,"A",html,selection_command
+1206,574614,"examples/jasmine.html",4035,2,"AN",html,selection_command
+1207,574842,"examples/jasmine.html",4035,3,"AN,",html,selection_command
+1208,574991,"examples/jasmine.html",4035,4,"AN, ",html,selection_command
+1209,575188,"examples/jasmine.html",4035,4,"",html,content
+1210,575463,"examples/jasmine.html",4036,0,"",html,selection_command
+1211,575694,"examples/jasmine.html",4037,0,"",html,selection_command
+1212,576055,"examples/jasmine.html",4036,0,"",html,selection_command
+1213,576159,"examples/jasmine.html",4037,0,"",html,selection_command
+1214,576256,"examples/jasmine.html",4037,0,",",html,content
+1215,576257,"examples/jasmine.html",4038,0,"",html,selection_keyboard
+1216,576440,"examples/jasmine.html",4038,0," ",html,content
+1217,576442,"examples/jasmine.html",4039,0,"",html,selection_keyboard
+1218,576674,"examples/jasmine.html",4038,0,"",html,selection_command
+1219,576771,"examples/jasmine.html",4039,0,"AN, ",html,content
+1220,576773,"examples/jasmine.html",4042,0,"",html,selection_command
+1221,577563,"examples/jasmine.html",4043,0,"",html,selection_command
+1222,577712,"examples/jasmine.html",4043,1,"",html,content
+1223,578190,"examples/jasmine.html",4042,0,"",html,selection_command
+1224,578364,"examples/jasmine.html",4041,0,"",html,selection_command
+1225,578519,"examples/jasmine.html",4041,1,"",html,content
+1226,579468,"examples/jasmine.html",4042,0,"",html,selection_command
+1227,579718,"examples/jasmine.html",4046,0,"",html,selection_command
+1228,579742,"examples/jasmine.html",4049,0,"",html,selection_command
+1229,579774,"examples/jasmine.html",4056,0,"",html,selection_command
+1230,579804,"examples/jasmine.html",4059,0,"",html,selection_command
+1231,600831,"examples/jasmine.html",4059,0,"r",html,content
+1232,600833,"examples/jasmine.html",4060,0,"",html,selection_keyboard
+1233,600885,"examples/jasmine.html",4060,0,"e",html,content
+1234,600887,"examples/jasmine.html",4061,0,"",html,selection_keyboard
+1235,600959,"examples/jasmine.html",4061,0,"s",html,content
+1236,600960,"examples/jasmine.html",4062,0,"",html,selection_keyboard
+1237,601080,"examples/jasmine.html",4062,0,"e",html,content
+1238,601081,"examples/jasmine.html",4063,0,"",html,selection_keyboard
+1239,601160,"examples/jasmine.html",4063,0,"a",html,content
+1240,601162,"examples/jasmine.html",4064,0,"",html,selection_keyboard
+1241,601530,"examples/jasmine.html",4064,0,"r",html,content
+1242,601533,"examples/jasmine.html",4065,0,"",html,selection_keyboard
+1243,601760,"examples/jasmine.html",4065,0,"c",html,content
+1244,601763,"examples/jasmine.html",4066,0,"",html,selection_keyboard
+1245,601842,"examples/jasmine.html",4066,0,"h",html,content
+1246,601844,"examples/jasmine.html",4067,0,"",html,selection_keyboard
+1247,602116,"examples/jasmine.html",4067,0,",",html,content
+1248,602119,"examples/jasmine.html",4068,0,"",html,selection_keyboard
+1249,603125,"examples/jasmine.html",4067,0,"",html,selection_command
+1250,603704,"examples/jasmine.html",4068,0,"",html,selection_command
+1251,603794,"examples/jasmine.html",4068,0," ",html,content
+1252,603795,"examples/jasmine.html",4069,0,"",html,selection_keyboard
+1253,603980,"examples/jasmine.html",4068,0,"",html,selection_command
+1254,604315,"examples/jasmine.html",4069,0,"",html,selection_command
+1255,604577,"examples/jasmine.html",4078,0,"",html,selection_command
+1256,604750,"examples/jasmine.html",4082,0,"",html,selection_command
+1257,605167,"examples/jasmine.html",4078,0,"",html,selection_command
+1258,605415,"examples/jasmine.html",4069,0,"",html,selection_command
+1259,605445,"examples/jasmine.html",4067,0,"",html,selection_command
+1260,605480,"examples/jasmine.html",4059,0,"",html,selection_command
+1261,605511,"examples/jasmine.html",4056,0,"",html,selection_command
+1262,605545,"examples/jasmine.html",4049,0,"",html,selection_command
+1263,605578,"examples/jasmine.html",4046,0,"",html,selection_command
+1264,605656,"examples/jasmine.html",4047,0,"",html,selection_command
+1265,605912,"examples/jasmine.html",4054,0,"",html,selection_command
+1266,605946,"examples/jasmine.html",4057,0,"",html,selection_command
+1267,605978,"examples/jasmine.html",4066,0,"",html,selection_command
+1268,606012,"examples/jasmine.html",4067,0,"",html,selection_command
+1269,606042,"examples/jasmine.html",4076,0,"",html,selection_command
+1270,606387,"examples/jasmine.html",4080,0,"",html,selection_command
+1271,606630,"examples/jasmine.html",4095,0,"",html,selection_command
+1272,606782,"examples/jasmine.html",4096,0,"",html,selection_command
+1273,607244,"examples/jasmine.html",4082,0,"",html,selection_command
+1274,607485,"examples/jasmine.html",4078,0,"",html,selection_command
+1275,607513,"examples/jasmine.html",4069,0,"",html,selection_command
+1276,607547,"examples/jasmine.html",4067,0,"",html,selection_command
+1277,607579,"examples/jasmine.html",4059,0,"",html,selection_command
+1278,607613,"examples/jasmine.html",4056,0,"",html,selection_command
+1279,607646,"examples/jasmine.html",4049,0,"",html,selection_command
+1280,607713,"examples/jasmine.html",4054,0,"",html,selection_command
+1281,607969,"examples/jasmine.html",4057,0,"",html,selection_command
+1282,608003,"examples/jasmine.html",4066,0,"",html,selection_command
+1283,608031,"examples/jasmine.html",4067,0,"",html,selection_command
+1284,608065,"examples/jasmine.html",4076,0,"",html,selection_command
+1285,608098,"examples/jasmine.html",4080,0,"",html,selection_command
+1286,608131,"examples/jasmine.html",4095,0,"",html,selection_command
+1287,608165,"examples/jasmine.html",4096,0,"",html,selection_command
+1288,608299,"examples/jasmine.html",4099,0,"",html,selection_command
+1289,608570,"examples/jasmine.html",4105,0,"",html,selection_command
+1290,608722,"examples/jasmine.html",4109,0,"",html,selection_command
+1291,609161,"examples/jasmine.html",4120,0,"",html,selection_command
+1292,609362,"examples/jasmine.html",4121,0,"",html,selection_command
+1293,609713,"examples/jasmine.html",4028,0,"",html,selection_command
+1294,609825,"examples/jasmine.html",4032,0,"",html,selection_command
+1295,610083,"examples/jasmine.html",4033,0,"",html,selection_command
+1296,610104,"examples/jasmine.html",4034,0,"",html,selection_command
+1297,610137,"examples/jasmine.html",4035,0,"",html,selection_command
+1298,610172,"examples/jasmine.html",4037,0,"",html,selection_command
+1299,610205,"examples/jasmine.html",4039,0,"",html,selection_command
+1300,610238,"examples/jasmine.html",4042,0,"",html,selection_command
+1301,610271,"examples/jasmine.html",4046,0,"",html,selection_command
+1302,610310,"examples/jasmine.html",4049,0,"",html,selection_command
+1303,610479,"examples/jasmine.html",4056,0,"",html,selection_command
+1304,610735,"examples/jasmine.html",4059,0,"",html,selection_command
+1305,610767,"examples/jasmine.html",4067,0,"",html,selection_command
+1306,610793,"examples/jasmine.html",4069,0,"",html,selection_command
+1307,610825,"examples/jasmine.html",4078,0,"",html,selection_command
+1308,610859,"examples/jasmine.html",4082,0,"",html,selection_command
+1309,610889,"examples/jasmine.html",4096,0,"",html,selection_command
+1310,610922,"examples/jasmine.html",4098,0,"",html,selection_command
+1311,610956,"examples/jasmine.html",4101,0,"",html,selection_command
+1312,611091,"examples/jasmine.html",4105,0,"",html,selection_command
+1313,611367,"examples/jasmine.html",4109,0,"",html,selection_command
+1314,611540,"examples/jasmine.html",4120,0,"",html,selection_command
+1315,611834,"examples/jasmine.html",4121,0,"",html,selection_command
+1316,612490,"examples/jasmine.html",4122,0,"",html,selection_command
+1317,612636,"examples/jasmine.html",4123,0,"",html,selection_command
+1318,612962,"examples/jasmine.html",4122,0,"",html,selection_command
+1319,613036,"examples/jasmine.html",4122,1," ",html,selection_command
+1320,613096,"examples/jasmine.html",4122,3," We",html,selection_command
+1321,613361,"examples/jasmine.html",4122,9," We thank",html,selection_command
+1322,613399,"examples/jasmine.html",4122,16," We thank Gemini",html,selection_command
+1323,613423,"examples/jasmine.html",4122,21," We thank Gemini Code",html,selection_command
+1324,613448,"examples/jasmine.html",4122,28," We thank Gemini Code Assist",html,selection_command
+1325,613481,"examples/jasmine.html",4122,32," We thank Gemini Code Assist and",html,selection_command
+1326,613514,"examples/jasmine.html",4122,39," We thank Gemini Code Assist and Cursor",html,selection_command
+1327,613547,"examples/jasmine.html",4122,43," We thank Gemini Code Assist and Cursor for",html,selection_command
+1328,613580,"examples/jasmine.html",4122,49," We thank Gemini Code Assist and Cursor for their",html,selection_command
+1329,613614,"examples/jasmine.html",4122,54," We thank Gemini Code Assist and Cursor for their help",html,selection_command
+1330,613647,"examples/jasmine.html",4122,57," We thank Gemini Code Assist and Cursor for their help in",html,selection_command
+1331,613681,"examples/jasmine.html",4122,65," We thank Gemini Code Assist and Cursor for their help in writing",html,selection_command
+1332,613716,"examples/jasmine.html",4122,69," We thank Gemini Code Assist and Cursor for their help in writing the",html,selection_command
+1333,613837,"examples/jasmine.html",4122,79," We thank Gemini Code Assist and Cursor for their help in writing the extension",html,selection_command
+1334,614007,"examples/jasmine.html",4122,82," We thank Gemini Code Assist and Cursor for their help in writing the extension.",html,selection_command
+1335,614367,"examples/jasmine.html",4122,81," We thank Gemini Code Assist and Cursor for their help in writing the extension.<",html,selection_command
+1336,614505,"examples/jasmine.html",4122,80," We thank Gemini Code Assist and Cursor for their help in writing the extension.",html,selection_command
+1337,614754,"examples/jasmine.html",4122,80,"",html,content
+1338,615203,"examples/jasmine.html",4028,0,"",html,selection_command
+1339,619726,"examples/jasmine.html",4001,0,"",html,selection_command
+1340,619968,"examples/jasmine.html",4000,0,"",html,selection_command
+1341,619996,"examples/jasmine.html",3985,0,"",html,selection_command
+1342,620027,"examples/jasmine.html",3984,0,"",html,selection_command
+1343,620061,"examples/jasmine.html",3967,0,"",html,selection_command
+1344,620095,"examples/jasmine.html",3958,0,"",html,selection_command
+1345,620132,"examples/jasmine.html",3852,0,"",html,selection_command
+1346,620164,"examples/jasmine.html",3680,0,"",html,selection_command
+1347,620224,"examples/jasmine.html",3509,0,"",html,selection_command
+1348,620245,"examples/jasmine.html",3501,0,"",html,selection_command
+1349,620277,"examples/jasmine.html",3492,0,"",html,selection_command
+1350,620303,"examples/jasmine.html",3386,0,"",html,selection_command
+1351,620333,"examples/jasmine.html",3215,0,"",html,selection_command
+1352,620366,"examples/jasmine.html",3214,0,"",html,selection_command
+1353,620403,"examples/jasmine.html",3206,0,"",html,selection_command
+1354,620435,"examples/jasmine.html",3197,0,"",html,selection_command
+1355,620469,"examples/jasmine.html",3052,0,"",html,selection_command
+1356,620502,"examples/jasmine.html",2880,0,"",html,selection_command
+1357,620535,"examples/jasmine.html",2710,0,"",html,selection_command
+1358,620569,"examples/jasmine.html",2544,0,"",html,selection_command
+1359,620603,"examples/jasmine.html",2536,0,"",html,selection_command
+1360,620636,"examples/jasmine.html",2510,0,"",html,selection_command
+1361,620669,"examples/jasmine.html",2436,0,"",html,selection_command
+1362,620702,"examples/jasmine.html",2422,0,"",html,selection_command
+1363,620736,"examples/jasmine.html",2398,0,"",html,selection_command
+1364,620770,"examples/jasmine.html",2385,0,"",html,selection_command
+1365,620803,"examples/jasmine.html",2376,0,"",html,selection_command
+1366,620836,"examples/jasmine.html",2309,0,"",html,selection_command
+1367,620870,"examples/jasmine.html",2151,0,"",html,selection_command
+1368,620903,"examples/jasmine.html",2143,0,"",html,selection_command
+1369,620936,"examples/jasmine.html",2131,0,"",html,selection_command
+1370,621144,"examples/jasmine.html",2143,0,"",html,selection_command
+1371,621477,"examples/jasmine.html",2151,0,"",html,selection_command
+1372,621627,"examples/jasmine.html",2309,0,"",html,selection_command
+1373,621882,"examples/jasmine.html",2376,0,"",html,selection_command
+1374,621912,"examples/jasmine.html",2385,0,"",html,selection_command
+1375,621942,"examples/jasmine.html",2398,0,"",html,selection_command
+1376,621976,"examples/jasmine.html",2422,0,"",html,selection_command
+1377,622010,"examples/jasmine.html",2436,0,"",html,selection_command
+1378,622043,"examples/jasmine.html",2510,0,"",html,selection_command
+1379,622076,"examples/jasmine.html",2536,0,"",html,selection_command
+1380,622110,"examples/jasmine.html",2544,0,"",html,selection_command
+1381,622144,"examples/jasmine.html",2710,0,"",html,selection_command
+1382,622176,"examples/jasmine.html",2880,0,"",html,selection_command
+1383,622210,"examples/jasmine.html",3052,0,"",html,selection_command
+1384,622243,"examples/jasmine.html",3197,0,"",html,selection_command
+1385,622276,"examples/jasmine.html",3206,0,"",html,selection_command
+1386,622310,"examples/jasmine.html",3214,0,"",html,selection_command
+1387,622344,"examples/jasmine.html",3215,0,"",html,selection_command
+1388,622733,"examples/jasmine.html",3214,0,"",html,selection_command
+1389,623084,"examples/jasmine.html",3206,0,"",html,selection_command
+1390,623328,"examples/jasmine.html",3197,0,"",html,selection_command
+1391,623360,"examples/jasmine.html",3052,0,"",html,selection_command
+1392,626961,"examples/jasmine.html",3197,0,"",html,selection_command
+1393,627119,"examples/jasmine.html",3206,0,"",html,selection_command
+1394,627250,"examples/jasmine.html",3214,0,"",html,selection_command
+1395,627722,"examples/jasmine.html",3214,1,"",html,content
+1396,627729,"examples/jasmine.html",3222,0,"",html,selection_command
+1397,628939,"examples/jasmine.html",3393,0,"",html,selection_command
+1398,629187,"examples/jasmine.html",3498,0,"",html,selection_command
+1399,630670,"examples/jasmine.html",3392,0,"",html,selection_command
+1400,630922,"examples/jasmine.html",3221,0,"",html,selection_command
+1401,630955,"examples/jasmine.html",3212,0,"",html,selection_command
+1402,630988,"examples/jasmine.html",3204,0,"",html,selection_command
+1403,631026,"examples/jasmine.html",3059,0,"",html,selection_command
+1404,631254,"examples/jasmine.html",2887,0,"",html,selection_command
+1405,631408,"examples/jasmine.html",2717,0,"",html,selection_command
+1406,631568,"examples/jasmine.html",2551,0,"",html,selection_command
+1407,631783,"examples/jasmine.html",2544,0,"",html,selection_command
+1408,643367,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"",Log,tab
+1409,644989,"examples/jasmine.html",0,0,"",html,tab
+1410,670408,"TERMINAL",0,0,"cp examples/* dist",,terminal_command
+1411,670432,"TERMINAL",0,0,"]633;C[1m[7m%[27m[1m[0m \r \r",,terminal_output
+1412,672760,"TERMINAL",0,0,"npm run dev",,terminal_command
+1413,672814,"TERMINAL",0,0,"]633;C",,terminal_output
+1414,673210,"TERMINAL",0,0,"\r\n> distill-template@2.8.0 dev\r\n> rollup -c rollup.config.dev.js -w\r\n\r\n[1G[0K",,terminal_output
+1415,673404,"TERMINAL",0,0,"c[4mrollup v2.7.3[24m\r\n[36mbundles [1msrc/components.js[22m โ [1mdist/template.v2.js[22m...[39m\r\n",,terminal_output
+1416,673782,"TERMINAL",0,0,"[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/dist\r\n[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/examples\r\n[1m[33m(!) Circular dependencies[39m[22m\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/array.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/object.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-selection/src/selection/index.js -> node_modules/d3-selection/src/selection/select.js -> node_modules/d3-selection/src/selection/index.js\r\n...and 7 more\r\n[32mcreated [1mdist/template.v2.js[22m in [1m382ms[22m[39m\r\n[36mbundles [1msrc/transforms.js[22m โ [1mdist/transforms.v2.js[22m...[39m\r\n",,terminal_output
+1417,674240,"TERMINAL",0,0,"[32mcreated [1mdist/transforms.v2.js[22m in [1m456ms[22m[39m\r\n\r\n[2025-08-05 18:37:10] waiting for changes...\r\n",,terminal_output
+1418,728959,"examples/jasmine.html",2536,0,"",html,selection_command
+1419,729208,"examples/jasmine.html",2510,0,"",html,selection_command
+1420,729240,"examples/jasmine.html",2436,0,"",html,selection_command
+1421,729278,"examples/jasmine.html",2422,0,"",html,selection_command
+1422,729305,"examples/jasmine.html",2398,0,"",html,selection_command
+1423,729330,"examples/jasmine.html",2385,0,"",html,selection_command
+1424,729364,"examples/jasmine.html",2376,0,"",html,selection_command
+1425,729398,"examples/jasmine.html",2309,0,"",html,selection_command
+1426,766747,"examples/jasmine.html",861,0,"",html,selection_command
+1427,767731,"examples/jasmine.html",880,0,"",html,selection_command
+1428,767865,"examples/jasmine.html",937,0,"",html,selection_command
+1429,769136,"examples/jasmine.html",1038,0,"",html,selection_command
+1430,769371,"examples/jasmine.html",1037,0,"",html,selection_command
+1431,769517,"examples/jasmine.html",1036,0,"",html,selection_command
+1432,769649,"examples/jasmine.html",1035,0,"",html,selection_command
+1433,769834,"examples/jasmine.html",1035,1,"",html,content
+1434,770281,"examples/jasmine.html",1035,1,"",html,content
+1435,770715,"examples/jasmine.html",1034,0,"",html,selection_command
+1436,771165,"examples/jasmine.html",1034,1,"",html,content
+1437,771384,"examples/jasmine.html",1032,0,"",html,selection_command
+1438,771736,"examples/jasmine.html",1032,2,"",html,content
+1439,772042,"examples/jasmine.html",1031,0,"",html,selection_command
+1440,772536,"examples/jasmine.html",1031,1,"",html,content
+1441,774686,"TERMINAL",0,0,"^C[1G[0Kโ [1G[0K[1mnpm[22m [96mnotice[39m\r\n[1mnpm[22m [96mnotice[39m New [31mmajor[39m version of npm available! [31m10.9.2[39m -> [34m11.5.2[39m\r\n[1mnpm[22m [96mnotice[39m Changelog: [34mhttps://github.com/npm/cli/releases/tag/v11.5.2[39m\r\n[1mnpm[22m [96mnotice[39m To update run: [4mnpm install -g npm@11.5.2[24m\r\n[1mnpm[22m [96mnotice[39m\r\n[1G[0Kโ [1G[0K",,terminal_output
+1442,774702,"TERMINAL",0,0,"[1m[7m%[27m[1m[0m \r \r",,terminal_output
+1443,774738,"TERMINAL",0,0,"",,terminal_command
+1444,774738,"TERMINAL",0,0,"]633;C",,terminal_output
+1445,775427,"TERMINAL",0,0,"cp examples/* dist",,terminal_command
+1446,775447,"TERMINAL",0,0,"]633;C[1m[7m%[27m[1m[0m \r \r",,terminal_output
+1447,775993,"TERMINAL",0,0,"npm run dev",,terminal_command
+1448,776044,"TERMINAL",0,0,"]633;C",,terminal_output
+1449,776177,"TERMINAL",0,0,"\r\n> distill-template@2.8.0 dev\r\n> rollup -c rollup.config.dev.js -w\r\n\r\n[1G[0K",,terminal_output
+1450,776367,"TERMINAL",0,0,"c[4mrollup v2.7.3[24m\r\n[36mbundles [1msrc/components.js[22m โ [1mdist/template.v2.js[22m...[39m\r\n",,terminal_output
+1451,776780,"TERMINAL",0,0,"[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/dist\r\n[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/examples\r\n[1m[33m(!) Circular dependencies[39m[22m\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/array.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/object.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-selection/src/selection/index.js -> node_modules/d3-selection/src/selection/select.js -> node_modules/d3-selection/src/selection/index.js\r\n...and 7 more\r\n[32mcreated [1mdist/template.v2.js[22m in [1m412ms[22m[39m\r\n[36mbundles [1msrc/transforms.js[22m โ [1mdist/transforms.v2.js[22m...[39m\r\n",,terminal_output
+1452,777129,"TERMINAL",0,0,"[32mcreated [1mdist/transforms.v2.js[22m in [1m347ms[22m[39m\r\n\r\n[2025-08-05 18:38:53] waiting for changes...\r\n",,terminal_output
+1453,808597,"examples/jasmine.html",935,0,"",html,selection_command
+1454,812362,"examples/jasmine.html",0,0,"",html,selection_command
+1455,814488,"examples/jasmine.html",5,0,"",html,selection_command
+1456,814735,"examples/jasmine.html",30,0,"",html,selection_command
+1457,814768,"examples/jasmine.html",31,0,"",html,selection_command
+1458,814801,"examples/jasmine.html",97,0,"",html,selection_command
+1459,814834,"examples/jasmine.html",164,0,"",html,selection_command
+1460,814868,"examples/jasmine.html",206,0,"",html,selection_command
+1461,814901,"examples/jasmine.html",207,0,"",html,selection_command
+1462,814935,"examples/jasmine.html",257,0,"",html,selection_command
+1463,814969,"examples/jasmine.html",258,0,"",html,selection_command
+1464,815002,"examples/jasmine.html",328,0,"",html,selection_command
+1465,815036,"examples/jasmine.html",396,0,"",html,selection_command
+1466,815070,"examples/jasmine.html",471,0,"",html,selection_command
+1467,815104,"examples/jasmine.html",541,0,"",html,selection_command
+1468,815138,"examples/jasmine.html",574,0,"",html,selection_command
+1469,815173,"examples/jasmine.html",578,0,"",html,selection_command
+1470,815206,"examples/jasmine.html",594,0,"",html,selection_command
+1471,815240,"examples/jasmine.html",595,0,"",html,selection_command
+1472,815272,"examples/jasmine.html",602,0,"",html,selection_command
+1473,815312,"examples/jasmine.html",643,0,"",html,selection_command
+1474,815339,"examples/jasmine.html",714,0,"",html,selection_command
+1475,815372,"examples/jasmine.html",738,0,"",html,selection_command
+1476,815405,"examples/jasmine.html",794,0,"",html,selection_command
+1477,815439,"examples/jasmine.html",802,0,"",html,selection_command
+1478,815472,"examples/jasmine.html",803,0,"",html,selection_command
+1479,815506,"examples/jasmine.html",810,0,"",html,selection_command
+1480,815540,"examples/jasmine.html",817,0,"",html,selection_command
+1481,815574,"examples/jasmine.html",853,0,"",html,selection_command
+1482,815612,"examples/jasmine.html",859,0,"",html,selection_command
+1483,815646,"examples/jasmine.html",878,0,"",html,selection_command
+1484,815679,"examples/jasmine.html",935,0,"",html,selection_command
+1485,815713,"examples/jasmine.html",1034,0,"",html,selection_command
+1486,815745,"examples/jasmine.html",1217,0,"",html,selection_command
+1487,815777,"examples/jasmine.html",1252,0,"",html,selection_command
+1488,815810,"examples/jasmine.html",1297,0,"",html,selection_command
+1489,815847,"examples/jasmine.html",1314,0,"",html,selection_command
+1490,815881,"examples/jasmine.html",1322,0,"",html,selection_command
+1491,815914,"examples/jasmine.html",1356,0,"",html,selection_command
+1492,815948,"examples/jasmine.html",1412,0,"",html,selection_command
+1493,815981,"examples/jasmine.html",1487,0,"",html,selection_command
+1494,816014,"examples/jasmine.html",1529,0,"",html,selection_command
+1495,816048,"examples/jasmine.html",1538,0,"",html,selection_command
+1496,816082,"examples/jasmine.html",1546,0,"",html,selection_command
+1497,816115,"examples/jasmine.html",1580,0,"",html,selection_command
+1498,816148,"examples/jasmine.html",1638,0,"",html,selection_command
+1499,816182,"examples/jasmine.html",1713,0,"",html,selection_command
+1500,816215,"examples/jasmine.html",1755,0,"",html,selection_command
+1501,816247,"examples/jasmine.html",1764,0,"",html,selection_command
+1502,816281,"examples/jasmine.html",1772,0,"",html,selection_command
+1503,816631,"examples/jasmine.html",1764,0,"",html,selection_command
+1504,816807,"examples/jasmine.html",1764,7," {",html,selection_command
+1505,816909,"examples/jasmine.html",1764,43," {\n ""author"":""Franz Srambical"",",html,selection_command
+1506,817170,"examples/jasmine.html",1764,88," {\n ""author"":""Franz Srambical"",\n ""authorURL"":""https://srambical.fr/"",",html,selection_command
+1507,817206,"examples/jasmine.html",1764,163," {\n ""author"":""Franz Srambical"",\n ""authorURL"":""https://srambical.fr/"",\n ""affiliations"": [{""name"": ""p(doom)"", ""url"": ""https://pdoom.org/""},",html,selection_command
+1508,817237,"examples/jasmine.html",1764,205," {\n ""author"":""Franz Srambical"",\n ""authorURL"":""https://srambical.fr/"",\n ""affiliations"": [{""name"": ""p(doom)"", ""url"": ""https://pdoom.org/""},\n {""name"": ""TUM""}]",html,selection_command
+1509,817474,"examples/jasmine.html",1764,213," {\n ""author"":""Franz Srambical"",\n ""authorURL"":""https://srambical.fr/"",\n ""affiliations"": [{""name"": ""p(doom)"", ""url"": ""https://pdoom.org/""},\n {""name"": ""TUM""}]\n }",html,selection_command
+1510,817752,"examples/jasmine.html",1764,0,"",html,selection_command
+1511,818152,"examples/jasmine.html",1772,0,"",html,selection_command
+1512,818400,"examples/jasmine.html",1808,0,"",html,selection_command
+1513,818426,"examples/jasmine.html",1853,0,"",html,selection_command
+1514,818455,"examples/jasmine.html",1928,0,"",html,selection_command
+1515,818595,"examples/jasmine.html",1970,0,"",html,selection_command
+1516,818961,"examples/jasmine.html",1977,0,"",html,selection_command
+1517,819024,"examples/jasmine.html",1977,0,",",html,content
+1518,819026,"examples/jasmine.html",1978,0,"",html,selection_keyboard
+1519,819206,"examples/jasmine.html",1977,0,"",html,selection_command
+1520,819410,"examples/jasmine.html",1978,0,"\n {\n ""author"":""Franz Srambical"",\n ""authorURL"":""https://srambical.fr/"",\n ""affiliations"": [{""name"": ""p(doom)"", ""url"": ""https://pdoom.org/""},\n {""name"": ""TUM""}]\n }",html,content
+1521,819413,"examples/jasmine.html",1985,0,"",html,selection_command
+1522,820021,"examples/jasmine.html",1993,0,"",html,selection_command
+1523,820157,"examples/jasmine.html",2029,0,"",html,selection_command
+1524,820239,"examples/jasmine.html",2031,0,"",html,selection_command
+1525,820504,"examples/jasmine.html",1995,0,"",html,selection_command
+1526,820612,"examples/jasmine.html",1996,0,"",html,selection_command
+1527,820759,"examples/jasmine.html",2002,0,"",html,selection_command
+1528,820908,"examples/jasmine.html",2005,0,"",html,selection_command
+1529,821061,"examples/jasmine.html",2011,0,"",html,selection_command
+1530,821238,"examples/jasmine.html",2011,1,"S",html,selection_command
+1531,821681,"examples/jasmine.html",2011,0,"",html,selection_command
+1532,821807,"examples/jasmine.html",2005,0,"",html,selection_command
+1533,821924,"examples/jasmine.html",2005,1,"F",html,selection_command
+1534,822042,"examples/jasmine.html",2005,5,"Franz",html,selection_command
+1535,822211,"examples/jasmine.html",2005,15,"Franz Srambical",html,selection_command
+1536,822412,"examples/jasmine.html",2005,15,"",html,content
+1537,822696,"examples/jasmine.html",2005,0,"S",html,content
+1538,822698,"examples/jasmine.html",2006,0,"",html,selection_keyboard
+1539,822928,"examples/jasmine.html",2006,0,"r",html,content
+1540,822931,"examples/jasmine.html",2007,0,"",html,selection_keyboard
+1541,822974,"examples/jasmine.html",2007,0,"e",html,content
+1542,822977,"examples/jasmine.html",2008,0,"",html,selection_keyboard
+1543,823297,"examples/jasmine.html",2007,1,"",html,content
+1544,823438,"examples/jasmine.html",2006,1,"",html,content
+1545,823477,"examples/jasmine.html",2006,0,"t",html,content
+1546,823480,"examples/jasmine.html",2007,0,"",html,selection_keyboard
+1547,823529,"examples/jasmine.html",2007,0,"e",html,content
+1548,823532,"examples/jasmine.html",2008,0,"",html,selection_keyboard
+1549,823638,"examples/jasmine.html",2008,0,"f",html,content
+1550,823641,"examples/jasmine.html",2009,0,"",html,selection_keyboard
+1551,823726,"examples/jasmine.html",2009,0,"a",html,content
+1552,823727,"examples/jasmine.html",2010,0,"",html,selection_keyboard
+1553,823793,"examples/jasmine.html",2010,0,"n",html,content
+1554,823795,"examples/jasmine.html",2011,0,"",html,selection_keyboard
+1555,823928,"examples/jasmine.html",2011,0," ",html,content
+1556,823929,"examples/jasmine.html",2012,0,"",html,selection_keyboard
+1557,824101,"examples/jasmine.html",2012,0,"B",html,content
+1558,824104,"examples/jasmine.html",2013,0,"",html,selection_keyboard
+1559,824232,"examples/jasmine.html",2013,0,"a",html,content
+1560,824235,"examples/jasmine.html",2014,0,"",html,selection_keyboard
+1561,824376,"examples/jasmine.html",2014,0,"e",html,content
+1562,824379,"examples/jasmine.html",2015,0,"",html,selection_keyboard
+1563,824660,"examples/jasmine.html",2014,1,"",html,content
+1564,824950,"examples/jasmine.html",2014,0,"u",html,content
+1565,824953,"examples/jasmine.html",2015,0,"",html,selection_keyboard
+1566,825011,"examples/jasmine.html",2015,0,"e",html,content
+1567,825013,"examples/jasmine.html",2016,0,"",html,selection_keyboard
+1568,825079,"examples/jasmine.html",2016,0,"r",html,content
+1569,825081,"examples/jasmine.html",2017,0,"",html,selection_keyboard
+1570,825248,"examples/jasmine.html",2016,0,"",html,selection_command
+1571,825445,"examples/jasmine.html",2049,0,"",html,selection_command
+1572,825628,"examples/jasmine.html",2094,0,"",html,selection_command
+1573,826439,"examples/jasmine.html",2092,0,"",html,selection_command
+1574,826600,"examples/jasmine.html",2089,0,"",html,selection_command
+1575,828025,"examples/jasmine.html",2090,0,"",html,selection_command
+1576,828152,"examples/jasmine.html",2090,1,"{",html,selection_command
+1577,828214,"examples/jasmine.html",2090,2,"{""",html,selection_command
+1578,828473,"examples/jasmine.html",2090,6,"{""name",html,selection_command
+1579,828752,"examples/jasmine.html",2090,8,"{""name"":",html,selection_command
+1580,829009,"examples/jasmine.html",2090,10,"{""name"": """,html,selection_command
+1581,829033,"examples/jasmine.html",2090,11,"{""name"": ""p",html,selection_command
+1582,829063,"examples/jasmine.html",2090,12,"{""name"": ""p(",html,selection_command
+1583,829095,"examples/jasmine.html",2090,16,"{""name"": ""p(doom",html,selection_command
+1584,829127,"examples/jasmine.html",2090,19,"{""name"": ""p(doom)"",",html,selection_command
+1585,829159,"examples/jasmine.html",2090,21,"{""name"": ""p(doom)"", """,html,selection_command
+1586,829194,"examples/jasmine.html",2090,24,"{""name"": ""p(doom)"", ""url",html,selection_command
+1587,829226,"examples/jasmine.html",2090,26,"{""name"": ""p(doom)"", ""url"":",html,selection_command
+1588,829366,"examples/jasmine.html",2090,28,"{""name"": ""p(doom)"", ""url"": """,html,selection_command
+1589,830059,"examples/jasmine.html",2090,33,"{""name"": ""p(doom)"", ""url"": ""https",html,selection_command
+1590,830307,"examples/jasmine.html",2090,36,"{""name"": ""p(doom)"", ""url"": ""https://",html,selection_command
+1591,830347,"examples/jasmine.html",2090,41,"{""name"": ""p(doom)"", ""url"": ""https://pdoom",html,selection_command
+1592,830382,"examples/jasmine.html",2090,42,"{""name"": ""p(doom)"", ""url"": ""https://pdoom.",html,selection_command
+1593,831300,"examples/jasmine.html",2090,45,"{""name"": ""p(doom)"", ""url"": ""https://pdoom.org",html,selection_command
+1594,831505,"examples/jasmine.html",2090,49,"{""name"": ""p(doom)"", ""url"": ""https://pdoom.org/""},",html,selection_command
+1595,831681,"examples/jasmine.html",2090,77,"{""name"": ""p(doom)"", ""url"": ""https://pdoom.org/""},\n {""",html,selection_command
+1596,832001,"examples/jasmine.html",2090,76,"{""name"": ""p(doom)"", ""url"": ""https://pdoom.org/""},\n {",html,selection_command
+1597,832223,"examples/jasmine.html",2090,46,"{""name"": ""p(doom)"", ""url"": ""https://pdoom.org/",html,selection_command
+1598,832369,"examples/jasmine.html",2090,43,"{""name"": ""p(doom)"", ""url"": ""https://pdoom.o",html,selection_command
+1599,832754,"examples/jasmine.html",2090,45,"{""name"": ""p(doom)"", ""url"": ""https://pdoom.org",html,selection_command
+1600,833108,"examples/jasmine.html",2090,46,"{""name"": ""p(doom)"", ""url"": ""https://pdoom.org/",html,selection_command
+1601,833155,"examples/jasmine.html",2090,49,"{""name"": ""p(doom)"", ""url"": ""https://pdoom.org/""},",html,selection_command
+1602,833551,"examples/jasmine.html",2090,48,"{""name"": ""p(doom)"", ""url"": ""https://pdoom.org/""}",html,selection_command
+1603,836650,"examples/jasmine.html",2090,48,"",html,content
+1604,837462,"examples/jasmine.html",2090,1,"",html,content
+1605,837476,"examples/jasmine.html",2089,0,"",html,selection_command
+1606,837945,"examples/jasmine.html",2116,0,"",html,selection_command
+1607,838440,"examples/jasmine.html",2115,1,"",html,content
+1608,838693,"examples/jasmine.html",2111,4,"",html,content
+1609,838721,"examples/jasmine.html",2107,4,"",html,content
+1610,838753,"examples/jasmine.html",2103,4,"",html,content
+1611,838786,"examples/jasmine.html",2099,4,"",html,content
+1612,838950,"examples/jasmine.html",2095,4,"",html,content
+1613,839218,"examples/jasmine.html",2091,4,"",html,content
+1614,839541,"examples/jasmine.html",2090,1,"",html,content
+1615,839962,"examples/jasmine.html",2089,0,"",html,selection_command
+1616,840345,"examples/jasmine.html",2065,0,"",html,selection_command
+1617,841724,"examples/jasmine.html",2020,0,"",html,selection_command
+1618,841807,"examples/jasmine.html",2028,0,"",html,selection_command
+1619,841978,"examples/jasmine.html",2029,0,"",html,selection_command
+1620,842130,"examples/jasmine.html",2038,0,"",html,selection_command
+1621,842294,"examples/jasmine.html",2041,0,"",html,selection_command
+1622,842459,"examples/jasmine.html",2046,0,"",html,selection_command
+1623,843443,"examples/jasmine.html",2041,21,"",html,content
+1624,860404,"examples/jasmine.html",2040,0,"",html,selection_command
+1625,861429,"examples/jasmine.html",2041,0,"",html,selection_command
+1626,861681,"examples/jasmine.html",2041,0,"https://www.professoren.tum.de/en/bauer-stefan",html,content
+1627,861684,"examples/jasmine.html",2087,0,"",html,selection_keyboard
+1628,861969,"examples/jasmine.html",2086,0,"",html,selection_command
+1629,862193,"examples/jasmine.html",2020,0,"",html,selection_command
+1630,865469,"TERMINAL",0,0,"^C[1G[0Kโ [1G[0K[1m[7m%[27m[1m[0m \r \r]633;D;0]633;P;Cwd=/Users/franzsrambical/Documents/pdoom/pdoom.org\r[0m[27m[24m[J]633;Afranzsrambical@MBF6N9WFVKFV pdoom.org % ]633;B[K[?2004h",,terminal_output
+1631,865512,"TERMINAL",0,0,"",,terminal_command
+1632,865513,"TERMINAL",0,0,"[?2004l\r\r\n[1m[7m%[27m[1m[0m \r \r]633;E;;b07e0e29-8b6e-4188-b95b-fd2ce17f4bb1",,terminal_output
+1633,865513,"TERMINAL",0,0,"]633;C",,terminal_output
+1634,866030,"TERMINAL",0,0,"cp examples/* dist",,terminal_command
+1635,866046,"TERMINAL",0,0,"]633;C[1m[7m%[27m[1m[0m \r \r",,terminal_output
+1636,866575,"TERMINAL",0,0,"npm run dev",,terminal_command
+1637,866626,"TERMINAL",0,0,"]633;C",,terminal_output
+1638,866780,"TERMINAL",0,0,"\r\n> distill-template@2.8.0 dev\r\n> rollup -c rollup.config.dev.js -w\r\n\r\n[1G[0K",,terminal_output
+1639,867009,"TERMINAL",0,0,"c[4mrollup v2.7.3[24m\r\n[36mbundles [1msrc/components.js[22m โ [1mdist/template.v2.js[22m...[39m\r\n",,terminal_output
+1640,867442,"TERMINAL",0,0,"[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/dist\r\n[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/examples\r\n[1m[33m(!) Circular dependencies[39m[22m\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/array.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/object.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-selection/src/selection/index.js -> node_modules/d3-selection/src/selection/select.js -> node_modules/d3-selection/src/selection/index.js\r\n...and 7 more\r\n[32mcreated [1mdist/template.v2.js[22m in [1m435ms[22m[39m\r\n[36mbundles [1msrc/transforms.js[22m โ [1mdist/transforms.v2.js[22m...[39m\r\n",,terminal_output
+1641,867819,"TERMINAL",0,0,"[32mcreated [1mdist/transforms.v2.js[22m in [1m375ms[22m[39m\r\n\r\n[2025-08-05 18:40:23] waiting for changes...\r\n",,terminal_output
+1642,1070337,"examples/bibliography.bib",0,0,"@article{radford2018improving,\n title = {Improving language understanding by generative pre-training},\n author = {Radford, Alec and Narasimhan, Karthik and Salimans, Tim and\n Sutskever, Ilya and others},\n}\n\n@article{radford2019language,\n title = {Language models are unsupervised multitask learners},\n author = {Radford, Alec and Wu, Jeffrey and Child, Rewon and Luan, David and\n Amodei, Dario and Sutskever, Ilya and others},\n journal = {OpenAI blog},\n volume = {1},\n number = {8},\n pages = {9},\n year = {2019},\n}\n\n@article{brown2020language,\n title = {Language models are few-shot learners},\n author = {Brown, Tom and Mann, Benjamin and Ryder, Nick and Subbiah, Melanie\n and Kaplan, Jared D and Dhariwal, Prafulla and Neelakantan, Arvind\n and Shyam, Pranav and Sastry, Girish and Askell, Amanda and others},\n journal = {Advances in neural information processing systems},\n volume = {33},\n pages = {1877--1901},\n year = {2020},\n}\n\n@article{raffel2020exploring,\n title = {Exploring the limits of transfer learning with a unified text-to-text\n transformer},\n author = {Raffel, Colin and Shazeer, Noam and Roberts, Adam and Lee, Katherine\n and Narang, Sharan and Matena, Michael and Zhou, Yanqi and Li, Wei\n and Liu, Peter J},\n journal = {Journal of machine learning research},\n volume = {21},\n number = {140},\n pages = {1--67},\n year = {2020},\n}\n\n@article{touvron2023llama,\n title = {Llama 2: Open foundation and fine-tuned chat models},\n author = {Touvron, Hugo and Martin, Louis and Stone, Kevin and Albert, Peter\n and Almahairi, Amjad and Babaei, Yasmine and Bashlykov, Nikolay and\n Batra, Soumya and Bhargava, Prajjwal and Bhosale, Shruti and others},\n journal = {arXiv preprint arXiv:2307.09288},\n year = {2023},\n}\n\n@article{bai2023qwen,\n title = {Qwen technical report},\n author = {Bai, Jinze and Bai, Shuai and Chu, Yunfei and Cui, Zeyu and Dang,\n Kai and Deng, Xiaodong and Fan, Yang and Ge, Wenbin and Han, Yu and\n Huang, Fei and others},\n journal = {arXiv preprint arXiv:2309.16609},\n year = {2023},\n}\n\n@article{young2024yi,\n title = {Yi: Open foundation models by 01. ai},\n author = {Young, Alex and Chen, Bei and Li, Chao and Huang, Chengen and Zhang,\n Ge and Zhang, Guanwei and Li, Heng and Zhu, Jiangcheng and Chen,\n Jianqun and Chang, Jing and others},\n journal = {arXiv preprint arXiv:2403.04652},\n year = {2024},\n}\n\n@article{vaswani2017attention,\n title = {Attention is all you need},\n author = {Vaswani, Ashish and Shazeer, Noam and Parmar, Niki and Uszkoreit,\n Jakob and Jones, Llion and Gomez, Aidan N and Kaiser, {\L}ukasz and\n Polosukhin, Illia},\n journal = {Advances in neural information processing systems},\n volume = {30},\n year = {2017},\n}\n\n@article{raffel2020exploring,\n title = {Exploring the limits of transfer learning with a unified text-to-text\n transformer},\n author = {Raffel, Colin and Shazeer, Noam and Roberts, Adam and Lee, Katherine\n and Narang, Sharan and Matena, Michael and Zhou, Yanqi and Li, Wei\n and Liu, Peter J},\n journal = {Journal of machine learning research},\n volume = {21},\n number = {140},\n pages = {1--67},\n year = {2020},\n}\n\n@inproceedings{zhou2024what,\n title = {What Algorithms can Transformers Learn? A Study in Length\n Generalization},\n author = {Hattie Zhou and Arwen Bradley and Etai Littwin and Noam Razin and\n Omid Saremi and Joshua M. Susskind and Samy Bengio and Preetum\n Nakkiran},\n booktitle = {The Twelfth International Conference on Learning Representations},\n year = {2024},\n url = {https://openreview.net/forum?id=AssIuHnmHX},\n}\n\n@inproceedings{ding2024causallm,\n title = {Causal{LM} is not optimal for in-context learning},\n author = {Nan Ding and Tomer Levinboim and Jialin Wu and Sebastian Goodman and\n Radu Soricut},\n booktitle = {The Twelfth International Conference on Learning Representations},\n year = {2024},\n url = {https://openreview.net/forum?id=guRNebwZBb},\n}\n\n@article{williams1989learning,\n title = {A learning algorithm for continually running fully recurrent neural\n networks},\n author = {Williams, Ronald J and Zipser, David},\n journal = {Neural computation},\n volume = {1},\n number = {2},\n pages = {270--280},\n year = {1989},\n publisher = {MIT Press One Rogers Street, Cambridge, MA 02142-1209, USA\n journals-info~โฆ},\n}\n\n@article{tay2022ul2,\n title = {Ul2: Unifying language learning paradigms},\n author = {Tay, Yi and Dehghani, Mostafa and Tran, Vinh Q and Garcia, Xavier\n and Wei, Jason and Wang, Xuezhi and Chung, Hyung Won and Shakeri,\n Siamak and Bahri, Dara and Schuster, Tal and others},\n journal = {arXiv preprint arXiv:2205.05131},\n year = {2022},\n}\n\n@misc{pfau2023last,\n title = {Last I checked, it was still not possible for a neural network alone\n (i.e. no MCTS) to beat the world's best Go players...},\n author = {Pfau, David},\n year = {2023},\n url = {https://twitter.com/pfau/status/1732785418565796167},\n note = {Accessed: 2023-12-07},\n}\n\n@article{deepmind2023alphacode,\n title = {AlphaCode 2 Technical Report},\n author = {Team, AlphaCode and Deepmind, Google},\n year = {2023},\n journal = {Google Deepmind},\n url = {\n https://storage.googleapis.com/deepmind-media/AlphaCode2/AlphaCode2_Tech_Report.pdf\n },\n}\n\n@article{reuters2023sam,\n author = {Tong, Anna and Dastin, Jeffrey and Hu, Krystal},\n title = {Sam Altman's ouster from OpenAI was precipitated by letter to board\n about AI breakthrough},\n journal = {Reuters},\n year = {2023},\n url = {\n https://www.reuters.com/technology/sam-altmans-ouster-openai-was-precipitated-by-letter-board-about-ai-breakthrough-2023-11-22/\n },\n note = {Accessed: 2023-12-07},\n}\n\n@misc{imbue2023podcast,\n title = {Noam Brown, FAIR: On achieving human-level performance in poker and\n Diplomacy, and the power of spending compute at inference time},\n author = {Noam Brown},\n howpublished = {\n https://imbue.com/podcast/2023-02-09-podcast-episode-27-noam-brown/\n },\n year = {2023},\n note = {Podcast episode 27, February 9, 2023},\n}\n\n@misc{karpathy2023youtube,\n author = {Karpathy, Andrej},\n title = {[1hr Talk] Intro to Large Language Models},\n howpublished = {YouTube},\n year = {2023},\n note = {Accessed: 2023-12-07},\n url = {https://www.youtube.com/watch?v=zjkBMFhNj_g&t=2100s},\n}\n\n@article{brown2019superhuman,\n title = {Superhuman AI for multiplayer poker},\n author = {Brown, Noam and Sandholm, Tuomas},\n journal = {Science},\n volume = {365},\n number = {6456},\n pages = {885--890},\n year = {2019},\n publisher = {American Association for the Advancement of Science},\n}\n\n@article{silver2016mastering,\n title = {Mastering the game of Go with deep neural networks and tree search},\n author = {Silver, David and Huang, Aja and Maddison, Chris J and Guez, Arthur\n and Sifre, Laurent and Van Den Driessche, George and Schrittwieser,\n Julian and Antonoglou, Ioannis and Panneershelvam, Veda and Lanctot,\n Marc and others},\n journal = {nature},\n volume = {529},\n number = {7587},\n pages = {484--489},\n year = {2016},\n publisher = {Nature Publishing Group},\n}\n\n@article{schrittwieser2020mastering,\n title = {Mastering atari, go, chess and shogi by planning with a learned model\n },\n author = {Schrittwieser, Julian and Antonoglou, Ioannis and Hubert, Thomas and\n Simonyan, Karen and Sifre, Laurent and Schmitt, Simon and Guez,\n Arthur and Lockhart, Edward and Hassabis, Demis and Graepel, Thore\n and others},\n journal = {Nature},\n volume = {588},\n number = {7839},\n pages = {604--609},\n year = {2020},\n publisher = {Nature Publishing Group UK London},\n}\n\n@article{wei2022chain,\n title = {Chain-of-thought prompting elicits reasoning in large language models\n },\n author = {Wei, Jason and Wang, Xuezhi and Schuurmans, Dale and Bosma, Maarten\n and Xia, Fei and Chi, Ed and Le, Quoc V and Zhou, Denny and others},\n journal = {Advances in neural information processing systems},\n volume = {35},\n pages = {24824--24837},\n year = {2022},\n}\n\n@article{yao2024tree,\n title = {Tree of thoughts: Deliberate problem solving with large language\n models},\n author = {Yao, Shunyu and Yu, Dian and Zhao, Jeffrey and Shafran, Izhak and\n Griffiths, Tom and Cao, Yuan and Narasimhan, Karthik},\n journal = {Advances in Neural Information Processing Systems},\n volume = {36},\n year = {2024},\n}\n\n@article{lecun2022path,\n title = {A path towards autonomous machine intelligence version 0.9. 2,\n 2022-06-27},\n author = {LeCun, Yann},\n journal = {Open Review},\n volume = {62},\n number = {1},\n year = {2022},\n}\n\n@article{hoffmann2022training,\n title = {Training compute-optimal large language models},\n author = {Hoffmann, Jordan and Borgeaud, Sebastian and Mensch, Arthur and\n Buchatskaya, Elena and Cai, Trevor and Rutherford, Eliza and Casas,\n Diego de Las and Hendricks, Lisa Anne and Welbl, Johannes and Clark,\n Aidan and others},\n journal = {arXiv preprint arXiv:2203.15556},\n year = {2022},\n}\n\n@article{meta2024introducing,\n title = {Introducing meta llama 3: The most capable openly available llm to\n date},\n author = {Meta, AI},\n journal = {Meta AI.},\n year = {2024},\n}\n\n@misc{riley2024it,\n title = {It's just not a very useful scaling law.},\n author = {@riley_stews},\n year = {2024},\n url = {https://x.com/riley_stews/status/1781019732122198288},\n note = {Accessed: 2023-04-20},\n}\n\n@article{shazeer2017outrageously,\n title = {Outrageously large neural networks: The sparsely-gated\n mixture-of-experts layer},\n author = {Shazeer, Noam and Mirhoseini, Azalia and Maziarz, Krzysztof and\n Davis, Andy and Le, Quoc and Hinton, Geoffrey and Dean, Jeff},\n journal = {arXiv preprint arXiv:1701.06538},\n year = {2017},\n}\n\n@article{fedus2022switch,\n title = {Switch transformers: Scaling to trillion parameter models with simple\n and efficient sparsity},\n author = {Fedus, William and Zoph, Barret and Shazeer, Noam},\n journal = {Journal of Machine Learning Research},\n volume = {23},\n number = {120},\n pages = {1--39},\n year = {2022},\n}\n\n@article{schulman2015high,\n title = {High-dimensional continuous control using generalized advantage\n estimation},\n author = {Schulman, John and Moritz, Philipp and Levine, Sergey and Jordan,\n Michael and Abbeel, Pieter},\n journal = {arXiv preprint arXiv:1506.02438},\n year = {2015},\n}\n\n@article{srambical2025ppo,\n author = {Srambical, Franz},\n title = {PPO Is Secretly Using Monte Carlo 
Advantage Estimation In LLM\n Post-Training},\n journal = {p(doom) blog},\n year = {2025},\n note = {https://pdoom.org/blog.html},\n}\n\n@article{williams1992simple,\n title = {Simple statistical gradient-following algorithms for connectionist\n reinforcement learning},\n author = {Williams, Ronald J},\n journal = {Machine learning},\n volume = {8},\n pages = {229--256},\n year = {1992},\n publisher = {Springer},\n}\n\n@software{deepmind2020jax,\n title = {The {D}eep{M}ind {JAX} {E}cosystem},\n author = {DeepMind and Babuschkin, Igor and Baumli, Kate and Bell, Alison and\n Bhupatiraju, Surya and Bruce, Jake and Buchlovsky, Peter and Budden,\n David and Cai, Trevor and Clark, Aidan and Danihelka, Ivo and Dedieu,\n Antoine and Fantacci, Claudio and Godwin, Jonathan and Jones, Chris\n and Hemsley, Ross and Hennigan, Tom and Hessel, Matteo and Hou,\n Shaobo and Kapturowski, Steven and Keck, Thomas and Kemaev, Iurii and\n King, Michael and Kunesch, Markus and Martens, Lena and Merzic, Hamza\n and Mikulik, Vladimir and Norman, Tamara and Papamakarios, George and\n Quan, John and Ring, Roman and Ruiz, Francisco and Sanchez, Alvaro\n and Sartran, Laurent and Schneider, Rosalia and Sezener, Eren and\n Spencer, Stephen and Srinivasan, Srivatsan and Stanojevi\'{c}, Milo\v\n {s} and Stokowiec, Wojciech and Wang, Luyu and Zhou, Guangyao and\n Viola, Fabio},\n url = {http://github.com/deepmind},\n year = {2020},\n}\n\n@misc{jax2025jit,\n title = {JAX: Just-in-time compilation},\n author = {James Bradbury and Roy Frostig and Peter Hawkins and Matthew James\n Johnson and Chris Leary and Dougal Maclaurin and George Necula and\n Adam Paszke and Jake Vander{P}las and Skye Wanderman-{M}ilne and Qiao\n Zhang},\n year = {2025},\n url = {https://docs.jax.dev/en/latest/jit-compilation.html},\n note = {Accessed: 2025-03-26},\n}\n\n@misc{jax2025callbacks,\n title = {JAX: External callbacks},\n author = {James Bradbury and Roy Frostig and Peter Hawkins and Matthew James\n Johnson and Chris 
Leary and Dougal Maclaurin and George Necula and\n Adam Paszke and Jake Vander{P}las and Skye Wanderman-{M}ilne and Qiao\n Zhang},\n year = {2025},\n url = {https://docs.jax.dev/en/latest/external-callbacks.html},\n note = {Accessed: 2025-03-26},\n}\n\n@misc{jax2025checkify,\n title = {JAX: The `checkify` transformation},\n author = {James Bradbury and Roy Frostig and Peter Hawkins and Matthew James\n Johnson and Chris Leary and Dougal Maclaurin and George Necula and\n Adam Paszke and Jake Vander{P}las and Skye Wanderman-{M}ilne and Qiao\n Zhang},\n year = {2025},\n url = {https://docs.jax.dev/en/latest/debugging/checkify_guide.html},\n note = {Accessed: 2025-03-26},\n}\n\n@misc{jax2025key,\n title = {JAX: Key concepts},\n author = {James Bradbury and Roy Frostig and Peter Hawkins and Matthew James\n Johnson and Chris Leary and Dougal Maclaurin and George Necula and\n Adam Paszke and Jake Vander{P}las and Skye Wanderman-{M}ilne and Qiao\n Zhang},\n year = {2025},\n url = {https://docs.jax.dev/en/latest/key-concepts.html},\n note = {Accessed: 2025-03-26},\n}\n\n@software{deepmind2020chex,\n title = {Chex},\n author = {James Bradbury and Roy Frostig and Peter Hawkins and Matthew James\n Johnson and Chris Leary and Dougal Maclaurin and George Necula and\n Adam Paszke and Jake Vander{P}las and Skye Wanderman-{M}ilne and Qiao\n Zhang},\n url = {http://github.com/google-deepmind/chex},\n year = {2020},\n}\n\n@misc{jax2025control,\n title = {JAX: Control flow and logical operators with JIT},\n author = {James Bradbury and Roy Frostig and Peter Hawkins and Matthew James\n Johnson and Chris Leary and Dougal Maclaurin and George Necula and\n Adam Paszke and Jake Vander{P}las and Skye Wanderman-{M}ilne and Qiao\n Zhang},\n year = {2025},\n url = {https://docs.jax.dev/en/latest/control-flow.html},\n note = {Accessed: 2025-03-26},\n}\n\n@misc{xla2025conditional,\n title = {XLA:Operation Semantics:Conditional},\n author = {James Bradbury and Roy Frostig and Peter Hawkins and 
Matthew James\n Johnson and Chris Leary and Dougal Maclaurin and George Necula and\n Adam Paszke and Jake Vander{P}las and Skye Wanderman-{M}ilne and Qiao\n Zhang},\n year = {2025},\n url = {https://openxla.org/xla/operation_semantics#conditional},\n note = {Accessed: 2025-03-26},\n}\n\n@misc{ayaka76822025error,\n author = {ayaka7682},\n title = {Message on public Discord server: Try this:\n \n import jax from jax._src.error_check import set_error_if, raise_if_error\n \n \n import jax.numpy as jnp\n \n \n @jax.jit\n \n \n def f(x, y):\n \n \n set_error_if(x != 0, 'x must be 0')\n \n \n return jnp.multiply(x, y)\n \n \n f(0, 0)\n \n \n raise_if_error()\n },\n year = {2025},\n url = {\n https://discord.com/channels/1107832795377713302/1107832795688083561/1354171414596419854\n },\n note = {Accessed: 2025-03-26},\n}\n\n@book{sutton1998reinforcement,\n title={Reinforcement learning: An introduction},\n author={Sutton, Richard S and Barto, Andrew G and others},\n volume={1},\n number={1},\n year={1998},\n publisher={MIT press Cambridge}\n}\n\n@article{sutton1999policy,\n title={Policy gradient methods for reinforcement learning with function approximation},\n author={Sutton, Richard S and McAllester, David and Singh, Satinder and Mansour, Yishay},\n journal={Advances in neural information processing systems},\n volume={12},\n year={1999}\n}\n\n@article{degris2012off,\n title={Off-policy actor-critic},\n author={Degris, Thomas and White, Martha and Sutton, Richard S},\n journal={arXiv preprint arXiv:1205.4839},\n year={2012}\n}\n\n@article{schulman2017proximal,\n title={Proximal policy optimization algorithms},\n author={Schulman, John and Wolski, Filip and Dhariwal, Prafulla and Radford, Alec and Klimov, Oleg},\n journal={arXiv preprint arXiv:1707.06347},\n year={2017}\n}\n\n@article{ouyang2022training,\n title={Training language models to follow instructions with human feedback},\n author={Ouyang, Long and Wu, Jeffrey and Jiang, Xu and Almeida, Diogo and Wainwright, 
Carroll and Mishkin, Pamela and Zhang, Chong and Agarwal, Sandhini and Slama, Katarina and Ray, Alex and others},\n journal={Advances in neural information processing systems},\n volume={35},\n pages={27730--27744},\n year={2022}\n}\n",bibtex,tab
+1643,1071114,"examples/bibliography.bib",17804,0,"",bibtex,selection_command
+1644,1076417,"examples/bibliography.bib",4905,0,"",bibtex,selection_command
+1645,1077332,"examples/bibliography.bib",4904,19,"@misc{pfau2023last,",bibtex,selection_command
+1646,1077486,"examples/bibliography.bib",4904,99,"@misc{pfau2023last,\n title = {Last I checked, it was still not possible for a neural network alone",bibtex,selection_command
+1647,1077641,"examples/bibliography.bib",4904,166,"@misc{pfau2023last,\n title = {Last I checked, it was still not possible for a neural network alone\n (i.e. no MCTS) to beat the world's best Go players...},",bibtex,selection_command
+1648,1077775,"examples/bibliography.bib",4904,192,"@misc{pfau2023last,\n title = {Last I checked, it was still not possible for a neural network alone\n (i.e. no MCTS) to beat the world's best Go players...},\n author = {Pfau, David},",bibtex,selection_command
+1649,1077907,"examples/bibliography.bib",4904,209,"@misc{pfau2023last,\n title = {Last I checked, it was still not possible for a neural network alone\n (i.e. no MCTS) to beat the world's best Go players...},\n author = {Pfau, David},\n year = {2023},",bibtex,selection_command
+1650,1078163,"examples/bibliography.bib",4904,272,"@misc{pfau2023last,\n title = {Last I checked, it was still not possible for a neural network alone\n (i.e. no MCTS) to beat the world's best Go players...},\n author = {Pfau, David},\n year = {2023},\n url = {https://twitter.com/pfau/status/1732785418565796167},",bibtex,selection_command
+1651,1078271,"examples/bibliography.bib",4904,305,"@misc{pfau2023last,\n title = {Last I checked, it was still not possible for a neural network alone\n (i.e. no MCTS) to beat the world's best Go players...},\n author = {Pfau, David},\n year = {2023},\n url = {https://twitter.com/pfau/status/1732785418565796167},\n note = {Accessed: 2023-12-07},",bibtex,selection_command
+1652,1078458,"examples/bibliography.bib",4904,307,"@misc{pfau2023last,\n title = {Last I checked, it was still not possible for a neural network alone\n (i.e. no MCTS) to beat the world's best Go players...},\n author = {Pfau, David},\n year = {2023},\n url = {https://twitter.com/pfau/status/1732785418565796167},\n note = {Accessed: 2023-12-07},\n}",bibtex,selection_command
+1653,1078759,"examples/bibliography.bib",4904,0,"",bibtex,selection_command
+1654,1079460,"examples/bibliography.bib",17804,0,"",bibtex,selection_command
+1655,1079704,"examples/bibliography.bib",17804,0,"\n",bibtex,content
+1656,1079936,"examples/bibliography.bib",17805,0,"\n@misc{pfau2023last,\n title = {Last I checked, it was still not possible for a neural network alone\n (i.e. no MCTS) to beat the world's best Go players...},\n author = {Pfau, David},\n year = {2023},\n url = {https://twitter.com/pfau/status/1732785418565796167},\n note = {Accessed: 2023-12-07},\n}",bibtex,content
+1657,1079940,"examples/bibliography.bib",17806,0,"",bibtex,selection_command
+1658,1080472,"examples/bibliography.bib",17807,0,"",bibtex,selection_command
+1659,1080622,"examples/bibliography.bib",17811,0,"",bibtex,selection_command
+1660,1080772,"examples/bibliography.bib",17812,0,"",bibtex,selection_command
+1661,1081207,"examples/bibliography.bib",17812,12,"",bibtex,content
+1662,1082935,"examples/bibliography.bib",17812,0,"o",bibtex,content
+1663,1082938,"examples/bibliography.bib",17813,0,"",bibtex,selection_keyboard
+1664,1082969,"examples/bibliography.bib",17813,0,"p",bibtex,content
+1665,1082971,"examples/bibliography.bib",17814,0,"",bibtex,selection_keyboard
+1666,1083056,"examples/bibliography.bib",17814,0,"e",bibtex,content
+1667,1083058,"examples/bibliography.bib",17815,0,"",bibtex,selection_keyboard
+1668,1083173,"examples/bibliography.bib",17815,0,"n",bibtex,content
+1669,1083176,"examples/bibliography.bib",17816,0,"",bibtex,selection_keyboard
+1670,1083272,"examples/bibliography.bib",17816,0,"a",bibtex,content
+1671,1083274,"examples/bibliography.bib",17817,0,"",bibtex,selection_keyboard
+1672,1083355,"examples/bibliography.bib",17817,0,"i",bibtex,content
+1673,1083357,"examples/bibliography.bib",17818,0,"",bibtex,selection_keyboard
+1674,1085750,"examples/bibliography.bib",17818,0,"2",bibtex,content
+1675,1085756,"examples/bibliography.bib",17819,0,"",bibtex,selection_keyboard
+1676,1085846,"examples/bibliography.bib",17819,0,"0",bibtex,content
+1677,1085851,"examples/bibliography.bib",17820,0,"",bibtex,selection_keyboard
+1678,1085877,"examples/bibliography.bib",17820,0,"2",bibtex,content
+1679,1085879,"examples/bibliography.bib",17821,0,"",bibtex,selection_keyboard
+1680,1085990,"examples/bibliography.bib",17821,0,"5",bibtex,content
+1681,1085992,"examples/bibliography.bib",17822,0,"",bibtex,selection_keyboard
+1682,1086445,"examples/bibliography.bib",17822,0,"i",bibtex,content
+1683,1086446,"examples/bibliography.bib",17823,0,"",bibtex,selection_keyboard
+1684,1086594,"examples/bibliography.bib",17823,0,"m",bibtex,content
+1685,1086594,"examples/bibliography.bib",17824,0,"",bibtex,selection_keyboard
+1686,1086760,"examples/bibliography.bib",17824,0,"o",bibtex,content
+1687,1086762,"examples/bibliography.bib",17825,0,"",bibtex,selection_keyboard
+1688,1086949,"examples/bibliography.bib",17824,0,"",bibtex,selection_command
+1689,1087130,"examples/bibliography.bib",17845,0,"",bibtex,selection_command
+1690,1088678,"examples/bibliography.bib",17838,133,"",bibtex,content
+1691,1099438,"examples/bibliography.bib",17838,0,"We achieved gold medal-level performance ๐ฅon the 2025 International Mathematical Olympiad with a general-purpose reasoning LLM!",bibtex,content
+1692,1099444,"examples/bibliography.bib",17966,0,"",bibtex,selection_keyboard
+1693,1099812,"examples/bibliography.bib",17965,0,"",bibtex,selection_command
+1694,1100038,"examples/bibliography.bib",17993,0,"",bibtex,selection_command
+1695,1100307,"examples/bibliography.bib",17992,0,"",bibtex,selection_command
+1696,1100723,"examples/bibliography.bib",17987,0,"",bibtex,selection_command
+1697,1101179,"examples/bibliography.bib",17985,0,"",bibtex,selection_command
+1698,1101326,"examples/bibliography.bib",17981,0,"",bibtex,selection_command
+1699,1101645,"examples/bibliography.bib",17981,1,"P",bibtex,selection_command
+1700,1101725,"examples/bibliography.bib",17981,4,"Pfau",bibtex,selection_command
+1701,1101880,"examples/bibliography.bib",17981,5,"Pfau,",bibtex,selection_command
+1702,1102074,"examples/bibliography.bib",17981,11,"Pfau, David",bibtex,selection_command
+1703,1102406,"examples/bibliography.bib",17981,11,"",bibtex,content
+1704,1103901,"examples/bibliography.bib",17981,0,"O",bibtex,content
+1705,1103903,"examples/bibliography.bib",17982,0,"",bibtex,selection_keyboard
+1706,1104126,"examples/bibliography.bib",17982,0,"p",bibtex,content
+1707,1104130,"examples/bibliography.bib",17983,0,"",bibtex,selection_keyboard
+1708,1104382,"examples/bibliography.bib",17983,0,"e",bibtex,content
+1709,1104386,"examples/bibliography.bib",17984,0,"",bibtex,selection_keyboard
+1710,1104502,"examples/bibliography.bib",17984,0,"n",bibtex,content
+1711,1104508,"examples/bibliography.bib",17985,0,"",bibtex,selection_keyboard
+1712,1104813,"examples/bibliography.bib",17985,0,"A",bibtex,content
+1713,1104818,"examples/bibliography.bib",17986,0,"",bibtex,selection_keyboard
+1714,1104864,"examples/bibliography.bib",17986,0,"I",bibtex,content
+1715,1104867,"examples/bibliography.bib",17987,0,"",bibtex,selection_keyboard
+1716,1105078,"examples/bibliography.bib",17986,0,"",bibtex,selection_command
+1717,1105225,"examples/bibliography.bib",18005,0,"",bibtex,selection_command
+1718,1105541,"examples/bibliography.bib",18004,0,"",bibtex,selection_command
+1719,1105960,"examples/bibliography.bib",18003,0,"",bibtex,selection_command
+1720,1106762,"examples/bibliography.bib",18003,1,"5",bibtex,content
+1721,1107059,"examples/bibliography.bib",18020,0,"",bibtex,selection_command
+1722,1107831,"examples/bibliography.bib",18016,51,"",bibtex,content
+1723,1109774,"examples/bibliography.bib",18016,0,"https://x.com/OpenAI/status/1946594928945148246",bibtex,content
+1724,1109777,"examples/bibliography.bib",18063,0,"",bibtex,selection_keyboard
+1725,1110149,"examples/bibliography.bib",18062,0,"",bibtex,selection_command
+1726,1110282,"examples/bibliography.bib",18097,0,"",bibtex,selection_command
+1727,1110458,"examples/bibliography.bib",18096,0,"",bibtex,selection_command
+1728,1110633,"examples/bibliography.bib",18094,0,"",bibtex,selection_command
+1729,1110798,"examples/bibliography.bib",18093,0,"",bibtex,selection_command
+1730,1111421,"examples/bibliography.bib",18096,0,"5",bibtex,content
+1731,1111421,"examples/bibliography.bib",18095,1,"",bibtex,content
+1732,1111421,"examples/bibliography.bib",18093,0,"08",bibtex,content
+1733,1111421,"examples/bibliography.bib",18091,2,"",bibtex,content
+1734,1111421,"examples/bibliography.bib",18090,0,"5",bibtex,content
+1735,1111421,"examples/bibliography.bib",18089,1,"",bibtex,content
+1736,1111780,"examples/bibliography.bib",18066,0,"",bibtex,selection_command
+1737,1112703,"examples/bibliography.bib",18099,0,"",bibtex,selection_command
+1738,1113084,"examples/bibliography.bib",18100,0,"\n",bibtex,content
+1739,1113261,"examples/bibliography.bib",18101,0,"\n",bibtex,content
+1740,1117572,"examples/bibliography.bib",17213,0,"",bibtex,selection_command
+1741,1117822,"examples/bibliography.bib",16210,0,"",bibtex,selection_command
+1742,1117989,"examples/bibliography.bib",15045,0,"",bibtex,selection_command
+1743,1118369,"examples/bibliography.bib",13684,0,"",bibtex,selection_command
+1744,1119856,"examples/bibliography.bib",14118,0,"",bibtex,selection_command
+1745,1120652,"examples/bibliography.bib",14520,0,"",bibtex,selection_command
+1746,1120812,"examples/bibliography.bib",14951,0,"",bibtex,selection_command
+1747,1120975,"examples/bibliography.bib",15399,0,"",bibtex,selection_command
+1748,1121276,"examples/bibliography.bib",16261,0,"",bibtex,selection_command
+1749,1121425,"examples/bibliography.bib",18009,0,"",bibtex,selection_command
+1750,1121587,"examples/bibliography.bib",3726,0,"",bibtex,selection_command
+1751,1122275,"examples/bibliography.bib",4086,0,"",bibtex,selection_command
+1752,1122425,"examples/bibliography.bib",5116,0,"",bibtex,selection_command
+1753,1122607,"examples/bibliography.bib",5388,0,"",bibtex,selection_command
+1754,1122775,"examples/bibliography.bib",5746,0,"",bibtex,selection_command
+1755,1122909,"examples/bibliography.bib",6531,0,"",bibtex,selection_command
+1756,1123095,"examples/bibliography.bib",9698,0,"",bibtex,selection_command
+1757,1123471,"examples/bibliography.bib",0,0,"",bibtex,selection_command
+1758,1124774,"examples/bibliography.bib",31,0,"",bibtex,selection_command
+1759,1124874,"examples/bibliography.bib",104,0,"",bibtex,selection_command
+1760,1125137,"examples/bibliography.bib",176,0,"",bibtex,selection_command
+1761,1125170,"examples/bibliography.bib",217,0,"",bibtex,selection_command
+1762,1125193,"examples/bibliography.bib",219,0,"",bibtex,selection_command
+1763,1127710,"examples/bibliography.bib",1444,0,"",bibtex,selection_command
+1764,1128256,"examples/bibliography.bib",2869,0,"",bibtex,selection_command
+1765,1129165,"examples/bibliography.bib",4262,0,"",bibtex,selection_command
+1766,1131762,"examples/bibliography.bib",5498,0,"",bibtex,selection_command
+1767,1133019,"examples/bibliography.bib",6627,0,"",bibtex,selection_command
+1768,1138310,"examples/bibliography.bib",18102,0,"",bibtex,selection_command
+1769,1160096,"examples/bibliography.bib",18102,0,"@misc{deepmind2025imo,\n title = {Advanced version of Gemini with Deep Think officially achieves gold-medal standard at the International Mathematical Olympiad},\n author = {Luong, Thang and Lockhart, Edward},\n year = {2025},\n url = {https://deepmind.google/discover/blog/advanced-version-of-gemini-with-deep-think-officially-achieves-gold-medal-standard-at-the-international-mathematical-olympiad/},\n note = {DeepMind Blog, July 21, 2025},\n}\n\n",bibtex,content
+1770,1164683,"examples/bibliography.bib",18548,0,"",bibtex,selection_command
+1771,1164933,"examples/bibliography.bib",18546,0,"",bibtex,selection_command
+1772,1164962,"examples/bibliography.bib",18505,0,"",bibtex,selection_command
+1773,1164994,"examples/bibliography.bib",18329,0,"",bibtex,selection_command
+1774,1165028,"examples/bibliography.bib",18312,0,"",bibtex,selection_command
+1775,1165348,"examples/bibliography.bib",18264,0,"",bibtex,selection_command
+1776,1165521,"examples/bibliography.bib",18125,0,"",bibtex,selection_command
+1777,1166430,"examples/bibliography.bib",18264,0,"",bibtex,selection_command
+1778,1167794,"examples/bibliography.bib",18266,0,"",bibtex,selection_command
+1779,1168043,"examples/bibliography.bib",18273,0,"",bibtex,selection_command
+1780,1168259,"examples/bibliography.bib",18275,0,"",bibtex,selection_command
+1781,1168505,"examples/bibliography.bib",18276,0,"",bibtex,selection_command
+1782,1168679,"examples/bibliography.bib",18264,0,"",bibtex,selection_command
+1783,1171832,"examples/bibliography.bib",18312,0,"",bibtex,selection_command
+1784,1171941,"examples/bibliography.bib",18329,0,"",bibtex,selection_command
+1785,1172231,"examples/bibliography.bib",18312,0,"",bibtex,selection_command
+1786,1175259,"examples/bibliography.bib",18329,0,"",bibtex,selection_command
+1787,1175702,"examples/bibliography.bib",18505,0,"",bibtex,selection_command
+1788,1175874,"examples/bibliography.bib",18546,0,"",bibtex,selection_command
+1789,1176169,"examples/bibliography.bib",18505,0,"",bibtex,selection_command
+1790,1177735,"examples/bibliography.bib",18507,0,"",bibtex,selection_command
+1791,1177975,"examples/bibliography.bib",18512,0,"",bibtex,selection_command
+1792,1178004,"examples/bibliography.bib",18514,0,"",bibtex,selection_command
+1793,1178037,"examples/bibliography.bib",18515,0,"",bibtex,selection_command
+1794,1178069,"examples/bibliography.bib",18524,0,"",bibtex,selection_command
+1795,1178234,"examples/bibliography.bib",18528,0,"",bibtex,selection_command
+1796,1178418,"examples/bibliography.bib",18530,0,"",bibtex,selection_command
+1797,1178551,"examples/bibliography.bib",18535,0,"",bibtex,selection_command
+1798,1180790,"examples/bibliography.bib",18546,0,"",bibtex,selection_command
+1799,1180937,"examples/bibliography.bib",18548,0,"",bibtex,selection_command
+1800,1181075,"examples/bibliography.bib",18549,0,"",bibtex,selection_command
+1801,1181745,"examples/bibliography.bib",18548,1,"",bibtex,content
+1802,1183043,"examples/jasmine.html",0,0,"",html,tab
+1803,1184958,"examples/jasmine.html",4126,0,"",html,selection_command
+1804,1185589,"examples/jasmine.html",4117,0,"",html,selection_command
+1805,1185875,"examples/jasmine.html",3839,0,"",html,selection_command
+1806,1185918,"examples/jasmine.html",3668,0,"",html,selection_command
+1807,1185935,"examples/jasmine.html",3660,0,"",html,selection_command
+1808,1185968,"examples/jasmine.html",3651,0,"",html,selection_command
+1809,1186001,"examples/jasmine.html",3545,0,"",html,selection_command
+1810,1186034,"examples/jasmine.html",3374,0,"",html,selection_command
+1811,1186067,"examples/jasmine.html",3366,0,"",html,selection_command
+1812,1186101,"examples/jasmine.html",3357,0,"",html,selection_command
+1813,1186140,"examples/jasmine.html",3212,0,"",html,selection_command
+1814,1186171,"examples/jasmine.html",3040,0,"",html,selection_command
+1815,1186203,"examples/jasmine.html",2870,0,"",html,selection_command
+1816,1186237,"examples/jasmine.html",2704,0,"",html,selection_command
+1817,1186270,"examples/jasmine.html",2696,0,"",html,selection_command
+1818,1186303,"examples/jasmine.html",2670,0,"",html,selection_command
+1819,1186336,"examples/jasmine.html",2596,0,"",html,selection_command
+1820,1186370,"examples/jasmine.html",2582,0,"",html,selection_command
+1821,1186453,"examples/jasmine.html",2596,0,"",html,selection_command
+1822,1186708,"examples/jasmine.html",2670,0,"",html,selection_command
+1823,1186825,"examples/jasmine.html",2696,0,"",html,selection_command
+1824,1186995,"examples/jasmine.html",2704,0,"",html,selection_command
+1825,1187179,"examples/jasmine.html",2870,0,"",html,selection_command
+1826,1187383,"examples/jasmine.html",2704,0,"",html,selection_command
+1827,1187512,"examples/jasmine.html",2708,0,"",html,selection_command
+1828,1187766,"examples/jasmine.html",2711,0,"",html,selection_command
+1829,1187795,"examples/jasmine.html",2715,0,"",html,selection_command
+1830,1187827,"examples/jasmine.html",2718,0,"",html,selection_command
+1831,1187860,"examples/jasmine.html",2722,0,"",html,selection_command
+1832,1187894,"examples/jasmine.html",2727,0,"",html,selection_command
+1833,1187928,"examples/jasmine.html",2730,0,"",html,selection_command
+1834,1187962,"examples/jasmine.html",2733,0,"",html,selection_command
+1835,1187995,"examples/jasmine.html",2746,0,"",html,selection_command
+1836,1188028,"examples/jasmine.html",2756,0,"",html,selection_command
+1837,1188062,"examples/jasmine.html",2758,0,"",html,selection_command
+1838,1188095,"examples/jasmine.html",2765,0,"",html,selection_command
+1839,1188128,"examples/jasmine.html",2774,0,"",html,selection_command
+1840,1188162,"examples/jasmine.html",2778,0,"",html,selection_command
+1841,1188195,"examples/jasmine.html",2783,0,"",html,selection_command
+1842,1188228,"examples/jasmine.html",2786,0,"",html,selection_command
+1843,1188261,"examples/jasmine.html",2792,0,"",html,selection_command
+1844,1188295,"examples/jasmine.html",2796,0,"",html,selection_command
+1845,1188328,"examples/jasmine.html",2805,0,"",html,selection_command
+1846,1188361,"examples/jasmine.html",2808,0,"",html,selection_command
+1847,1188395,"examples/jasmine.html",2813,0,"",html,selection_command
+1848,1188428,"examples/jasmine.html",2819,0,"",html,selection_command
+1849,1188461,"examples/jasmine.html",2832,0,"",html,selection_command
+1850,1188494,"examples/jasmine.html",2844,0,"",html,selection_command
+1851,1188528,"examples/jasmine.html",2845,0,"",html,selection_command
+1852,1188561,"examples/jasmine.html",2850,0,"",html,selection_command
+1853,1188594,"examples/jasmine.html",2856,0,"",html,selection_command
+1854,1188628,"examples/jasmine.html",2857,0,"",html,selection_command
+1855,1188661,"examples/jasmine.html",2861,0,"",html,selection_command
+1856,1189006,"examples/jasmine.html",2857,0,"",html,selection_command
+1857,1189130,"examples/jasmine.html",2856,0,"",html,selection_command
+1858,1189264,"examples/jasmine.html",2850,0,"",html,selection_command
+1859,1189402,"examples/jasmine.html",2845,0,"",html,selection_command
+1860,1189546,"examples/jasmine.html",2844,0,"",html,selection_command
+1861,1189936,"examples/jasmine.html",2844,21,"",html,content
+1862,1191796,"examples/crowd_code.html",0,0,"",html,tab
+1863,1192936,"examples/crowd_code.html",6254,0,"",html,selection_command
+1864,1200389,"examples/jasmine.html",0,0,"",html,tab
+1865,1202357,"examples/causal_mask.html",0,0,"\n\n\n
\n \n \n \n \n\n\n\n \n \n \n \n \n
Although ubiquitously used in large-scale language modeling , the necessity of the causal mask is seldom questioned in the literature.\n ""The causal mask is needed to prevent information leakage from future tokens"" is a commonly encountered, almost dogmatically repeated phrase.\n However, among researchers and practitioners alike, there exists a certain confusion around what the causal mask is and why we actually need it.
The confusion about the causal mask already starts with its name. The original Transformer paper \n does not mention the term causal mask at all. While we cannot definitively pinpoint the origin of the term, its\n first well-known appearance is in the T5 paper, where it is used to describe the\n triangular mask that is applied to the attention weights in the self-attention mechanism (Figure 1, centre). The mask being triangular\n has the effect of only allowing information from previous tokens to be used in the computation of the current token, which already\n leads us to the first common misconception: For causal LMs at inference time, even if $$n$$ tokens have already been generated, token $$k$$ with $$k < n$$\n cannot attend to token $$j$$ with $$k < j < n$$, even though token $$j$$ is already known. From an information-theoretic perspective,\n it is clear that this is suboptimal, and indeed recent work has investigated the algorithmic deficiency of causal language models.\n
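The triangular mask can be made concrete with a short sketch (a minimal NumPy illustration, not any particular framework's implementation):

```python
import numpy as np

def causal_mask(n: int) -> np.ndarray:
    """Boolean mask where entry (k, j) is True iff token k may attend to token j.

    Lower-triangular: token k attends only to tokens j <= k. Note that this
    holds even at inference time, so an earlier token can never attend to a
    later token, no matter how many tokens have already been generated.
    """
    return np.tril(np.ones((n, n), dtype=bool))
```

For n = 4, row 0 attends only to token 0 while row 3 attends to all four tokens, which is exactly the triangular pattern of Figure 1 (centre).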
\n \n \n Figure 1: A schematic of full attention, causal masking, and prefix masking. Figure from . Used under CC-BY 4.0 license.\n
\n When solely looking at inference time, what we ideally want is full attention, i.e. every (generated) token can attend to every (generated) token (Figure 1, left).\n This is the no-mask regime, where the attention weights are not modified at all. Yet, all LLMs also use\n the causal mask at inference time since omitting the mask would lead to a distribution shift between training and inference, thus impairing performance.\n Since the causal mask is needed at inference time because it was used during training, a natural question to ask is: Why do we need the causal mask during training?\n\n
\n For brevity's sake, we will focus on the GPT-style pre-training regime , where the model is trained to predict the next token given the previous tokens.\n One of the key advantages of the Transformer architecture over classical RNNs is that we can predict all tokens of a sequence in parallel.\n Specifically, this is achieved using a technique called teacher-forcing, where instead of using the tokens generated by the model, we use the ground-truth previous tokens to predict each 'next token'.\n Clearly, during the model prediction of token $$k$$, its computation should not use information from token $$k$$ and beyond, which brings us to the second common misconception: The causal mask is not needed to prevent information leakage from future tokens during training.\n
\n
\n To illustrate my point, suppose we have a sequence of tokens $$x_1, x_2, x_3, x_4$$ and we want the model to use tokens $$x_1, x_2, x_3$$ to predict token $$x_4$$.\n We need to prohibit tokens from attending to token $$x_4$$ when predicting token $$x_4$$, but we do not need to prohibit token $$x_1$$ from attending to tokens $$\{x_2, x_3\}$$, or token $$x_2$$ from attending to token $$x_3$$, which is exactly what the causal mask does.\n Instead of a triangular mask, using a block-sparse mask would suffice to prevent information leakage from future tokens, while allowing for all tokens in the context to attend to each other.\n
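The block-sparse mask from the $$x_1, x_2, x_3, x_4$$ example above can be sketched as follows (a minimal NumPy illustration; the function name is ours, not from any library):

```python
import numpy as np

def block_sparse_mask(n: int, target: int) -> np.ndarray:
    """Mask for predicting token `target` (0-indexed) from tokens 0..target-1.

    Unlike the causal mask, all context tokens attend to each other (full
    attention within the context block), while no token attends to the
    target position or anything beyond it.
    """
    mask = np.zeros((n, n), dtype=bool)
    mask[:target, :target] = True  # full attention among the context tokens
    return mask
```

With n = 4 and target = 3 (i.e. predicting $$x_4$$), $$x_1$$ may attend to $$x_2$$ and $$x_3$$, yet no token attends to $$x_4$$.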
\n
\n The illustration above only depicts the case of a single token prediction during training, raising the question of whether the point holds for parallel training as well.\n In fact, the causal mask is neither needed for parallel training, nor for teacher-forcing, and a block-sparse mask suffices for both.\n
\n We can motivate the use of block-sparse masks by looking at PrefixLMs, which are language models originally designed to solve sequence-to-sequence tasks.\n In PrefixLMs, the model is trained to predict the next token given the previous tokens and a so-called prefix, which is provided as input to the model.\n During supervised training of PrefixLMs, the prefix could be a prompt, a question, or any other kind of context that is given to the model.\n The model is then trained to predict the answer or continuation of the prompt. Since the prefix is fixed and not predicted by the model, the tokens in the prefix are not masked at all, while the rest of the sequence is causally masked (Figure 1, right).\n
\n
\n This masking procedure is specifically tailored towards the regime of sequence-to-sequence tasks under explicit supervision.\n However, the great abundance of textual data available on the Internet is unlabeled and largely unstructured, in which case PrefixLM training involves\n randomly splitting text into prefixes and targets. In this case, causal language modeling leads to much denser supervision than prefix language modeling,\n since the prefix does not provide a direct supervisory signal to the model.\n
\n
\n This motivates a procedure that benefits from both dense supervision and full attention: taking each token as the sole target\n and using all previous tokens as the prefix. This is exactly the block-sparse mask, and nothing inherently prohibits parallelization in this regime.\n However, the block-sparse mask depends on the position of the token in the sequence, which means that a naïve implementation would require $$n$$ times as much memory\n and $$n$$ times as much computation, since everything after the first attention map computation is token-position dependent (where $$n$$ is the sequence length).\n Thus, the main challenge in moving beyond the causal mask is mitigating the memory and compute overhead that this entails.\n
\n \n\n \n\n
Contributions
\n
FS worked on all aspects of this post, including research, analysis and writing. This blog post has benefited from various discussions with senior colleagues, among others Preetum Nakkiran and Thomas Scialom.
\n \n \n \n \n\n \n\n",html,tab
+1866,1204405,"examples/causal_mask.html",3587,0,"",html,selection_command
+1867,1204652,"examples/causal_mask.html",3578,0,"",html,selection_command
+1868,1204679,"examples/causal_mask.html",3386,0,"",html,selection_command
+1869,1204711,"examples/causal_mask.html",3242,0,"",html,selection_command
+1870,1204743,"examples/causal_mask.html",3078,0,"",html,selection_command
+1871,1204776,"examples/causal_mask.html",2941,0,"",html,selection_command
+1872,1204810,"examples/causal_mask.html",2800,0,"",html,selection_command
+1873,1205375,"examples/causal_mask.html",2669,0,"",html,selection_command
+1874,1205469,"examples/causal_mask.html",2543,0,"",html,selection_command
+1875,1205641,"examples/causal_mask.html",2394,0,"",html,selection_command
+1876,1205771,"examples/causal_mask.html",2398,0,"",html,selection_command
+1877,1206021,"examples/causal_mask.html",2399,0,"",html,selection_command
+1878,1206055,"examples/causal_mask.html",2400,0,"",html,selection_command
+1879,1206086,"examples/causal_mask.html",2401,0,"",html,selection_command
+1880,1206119,"examples/causal_mask.html",2405,0,"",html,selection_command
+1881,1206152,"examples/causal_mask.html",2415,0,"",html,selection_command
+1882,1206185,"examples/causal_mask.html",2421,0,"",html,selection_command
+1883,1206219,"examples/causal_mask.html",2425,0,"",html,selection_command
+1884,1206252,"examples/causal_mask.html",2432,0,"",html,selection_command
+1885,1206285,"examples/causal_mask.html",2437,0,"",html,selection_command
+1886,1206318,"examples/causal_mask.html",2445,0,"",html,selection_command
+1887,1206350,"examples/causal_mask.html",2452,0,"",html,selection_command
+1888,1206383,"examples/causal_mask.html",2457,0,"",html,selection_command
+1889,1206417,"examples/causal_mask.html",2461,0,"",html,selection_command
+1890,1206450,"examples/causal_mask.html",2465,0,"",html,selection_command
+1891,1206487,"examples/causal_mask.html",2467,0,"",html,selection_command
+1892,1206521,"examples/causal_mask.html",2471,0,"",html,selection_command
+1893,1206555,"examples/causal_mask.html",2480,0,"",html,selection_command
+1894,1206589,"examples/causal_mask.html",2492,0,"",html,selection_command
+1895,1206622,"examples/causal_mask.html",2498,0,"",html,selection_command
+1896,1206657,"examples/causal_mask.html",2499,0,"",html,selection_command
+1897,1207097,"examples/causal_mask.html",2498,0,"",html,selection_command
+1898,1207162,"examples/causal_mask.html",2498,1,"<",html,selection_command
+1899,1207230,"examples/causal_mask.html",2498,2,"\n",html,selection_command
+1901,1207763,"examples/causal_mask.html",2498,44,"",html,selection_command
+1902,1207941,"examples/causal_mask.html",2498,43,"",html,selection_command
+1904,1208716,"examples/causal_mask.html",2498,0,"",html,selection_command
+1905,1209191,"examples/jasmine.html",0,0,"",html,tab
+1906,1210518,"examples/jasmine.html",2843,0,"",html,selection_command
+1907,1210571,"examples/jasmine.html",2844,0,"",html,content
+1908,1210573,"examples/jasmine.html",2887,0,"",html,selection_command
+1909,1211205,"examples/jasmine.html",2883,0,"",html,selection_command
+1910,1211456,"examples/jasmine.html",2882,0,"",html,selection_command
+1911,1211485,"examples/jasmine.html",2881,0,"",html,selection_command
+1912,1211617,"examples/jasmine.html",2877,0,"",html,selection_command
+1913,1211798,"examples/jasmine.html",2857,0,"",html,selection_command
+1914,1211932,"examples/jasmine.html",2855,0,"",html,selection_command
+1915,1212281,"examples/jasmine.html",2857,0,"",html,selection_command
+1916,1212433,"examples/jasmine.html",2877,0,"",html,selection_command
+1917,1212952,"examples/jasmine.html",2857,0,"",html,selection_command
+1918,1213400,"examples/jasmine.html",2857,20,"",html,content
+1919,1214652,"examples/jasmine.html",2857,0,"openai2025imo, deepmind2025imo",html,content
+1920,1214876,"examples/jasmine.html",2886,0,"",html,selection_command
+1921,1215082,"examples/jasmine.html",2700,0,"",html,selection_command
+1922,1217085,"TERMINAL",0,0,"^C[1G[0Kโ [1G[0K[1m[7m%[27m[1m[0m \r \r]633;D;0]633;P;Cwd=/Users/franzsrambical/Documents/pdoom/pdoom.org\r[0m[27m[24m[J]633;Afranzsrambical@MBF6N9WFVKFV pdoom.org % ]633;B[K[?2004h",,terminal_output
+1923,1217174,"TERMINAL",0,0,"",,terminal_command
+1924,1217174,"TERMINAL",0,0,"[?2004l\r\r\n[1m[7m%[27m[1m[0m \r \r]633;E;;b07e0e29-8b6e-4188-b95b-fd2ce17f4bb1",,terminal_output
+1925,1217174,"TERMINAL",0,0,"]633;C",,terminal_output
+1926,1217765,"TERMINAL",0,0,"cp examples/* dist",,terminal_command
+1927,1217784,"TERMINAL",0,0,"]633;C[1m[7m%[27m[1m[0m \r \r",,terminal_output
+1928,1218308,"TERMINAL",0,0,"npm run dev",,terminal_command
+1929,1218360,"TERMINAL",0,0,"]633;C",,terminal_output
+1930,1218493,"TERMINAL",0,0,"\r\n> distill-template@2.8.0 dev\r\n> rollup -c rollup.config.dev.js -w\r\n\r\n[1G[0K",,terminal_output
+1931,1218669,"TERMINAL",0,0,"c[4mrollup v2.7.3[24m\r\n[36mbundles [1msrc/components.js[22m โ [1mdist/template.v2.js[22m...[39m\r\n",,terminal_output
+1932,1219161,"TERMINAL",0,0,"[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/dist\r\n[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/examples\r\n[1m[33m(!) Circular dependencies[39m[22m\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/array.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/object.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-selection/src/selection/index.js -> node_modules/d3-selection/src/selection/select.js -> node_modules/d3-selection/src/selection/index.js\r\n...and 7 more\r\n[32mcreated [1mdist/template.v2.js[22m in [1m492ms[22m[39m\r\n[36mbundles [1msrc/transforms.js[22m โ [1mdist/transforms.v2.js[22m...[39m\r\n",,terminal_output
+1933,1219527,"TERMINAL",0,0,"[32mcreated [1mdist/transforms.v2.js[22m in [1m365ms[22m[39m\r\n\r\n[2025-08-05 18:46:15] waiting for changes...\r\n",,terminal_output
+1934,1248229,"examples/jasmine.html",2961,0,"",html,selection_command
+1935,1248929,"examples/jasmine.html",2960,0,"",html,selection_command
+1936,1249096,"examples/jasmine.html",2960,1,"(",html,selection_command
+1937,1249166,"examples/jasmine.html",2960,3,"(R1",html,selection_command
+1938,1249295,"examples/jasmine.html",2960,4,"(R1)",html,selection_command
+1939,1249649,"examples/jasmine.html",2960,4,"",html,content
+1940,1250869,"examples/jasmine.html",2960,0,"",html,content
+1941,1251182,"examples/jasmine.html",3043,0,"",html,selection_command
+1942,1251465,"examples/jasmine.html",3039,0,"",html,selection_command
+1943,1251613,"examples/jasmine.html",3038,0,"",html,selection_command
+1944,1251732,"examples/jasmine.html",3037,0,"",html,selection_command
+1945,1251863,"examples/jasmine.html",3033,0,"",html,selection_command
+1946,1252050,"examples/jasmine.html",3016,0,"",html,selection_command
+1947,1252530,"examples/jasmine.html",2973,60,"",html,content
+1948,1253469,"examples/jasmine.html",2972,0,"",html,selection_command
+1949,1255407,"examples/bibliography.bib",0,0,"",bibtex,tab
+1950,1388683,"examples/bibliography.bib",18548,0,"\n",bibtex,content
+1951,1388832,"examples/bibliography.bib",18549,0,"@misc{deepseekai2025deepseekr1incentivizingreasoningcapability,\n title={DeepSeek-R1: Incentivizing Reasoning Capability in LLMs via Reinforcement Learning}, \n author={DeepSeek-AI},\n year={2025},\n eprint={2501.12948},\n archivePrefix={arXiv},\n primaryClass={cs.CL},\n url={https://arxiv.org/abs/2501.12948}, \n}",bibtex,content
+1952,1389111,"examples/bibliography.bib",18890,0,"",bibtex,selection_command
+1953,1389830,"examples/bibliography.bib",18843,0,"",bibtex,selection_command
+1954,1390082,"examples/bibliography.bib",18815,0,"",bibtex,selection_command
+1955,1390114,"examples/bibliography.bib",18786,0,"",bibtex,selection_command
+1956,1390144,"examples/bibliography.bib",18759,0,"",bibtex,selection_command
+1957,1390177,"examples/bibliography.bib",18740,0,"",bibtex,selection_command
+1958,1390211,"examples/bibliography.bib",18712,0,"",bibtex,selection_command
+1959,1390244,"examples/bibliography.bib",18613,0,"",bibtex,selection_command
+1960,1390433,"examples/bibliography.bib",18549,0,"",bibtex,selection_command
+1961,1393385,"examples/bibliography.bib",18550,0,"",bibtex,selection_command
+1962,1393631,"examples/bibliography.bib",18551,0,"",bibtex,selection_command
+1963,1393659,"examples/bibliography.bib",18552,0,"",bibtex,selection_command
+1964,1393693,"examples/bibliography.bib",18553,0,"",bibtex,selection_command
+1965,1393723,"examples/bibliography.bib",18554,0,"",bibtex,selection_command
+1966,1393756,"examples/bibliography.bib",18555,0,"",bibtex,selection_command
+1967,1393791,"examples/bibliography.bib",18556,0,"",bibtex,selection_command
+1968,1393831,"examples/bibliography.bib",18557,0,"",bibtex,selection_command
+1969,1393860,"examples/bibliography.bib",18558,0,"",bibtex,selection_command
+1970,1393892,"examples/bibliography.bib",18559,0,"",bibtex,selection_command
+1971,1393926,"examples/bibliography.bib",18560,0,"",bibtex,selection_command
+1972,1393960,"examples/bibliography.bib",18561,0,"",bibtex,selection_command
+1973,1393992,"examples/bibliography.bib",18562,0,"",bibtex,selection_command
+1974,1394025,"examples/bibliography.bib",18563,0,"",bibtex,selection_command
+1975,1394058,"examples/bibliography.bib",18564,0,"",bibtex,selection_command
+1976,1394092,"examples/bibliography.bib",18565,0,"",bibtex,selection_command
+1977,1394125,"examples/bibliography.bib",18566,0,"",bibtex,selection_command
+1978,1394158,"examples/bibliography.bib",18567,0,"",bibtex,selection_command
+1979,1394355,"examples/bibliography.bib",18568,0,"",bibtex,selection_command
+1980,1394534,"examples/bibliography.bib",18569,0,"",bibtex,selection_command
+1981,1394919,"examples/bibliography.bib",18569,42,"",bibtex,content
+1982,1395169,"examples/bibliography.bib",18569,0,"r",bibtex,content
+1983,1395171,"examples/bibliography.bib",18570,0,"",bibtex,selection_keyboard
+1984,1395234,"examples/bibliography.bib",18570,0,"r",bibtex,content
+1985,1395237,"examples/bibliography.bib",18571,0,"",bibtex,selection_keyboard
+1986,1395377,"examples/bibliography.bib",18571,0,"1",bibtex,content
+1987,1395378,"examples/bibliography.bib",18572,0,"",bibtex,selection_keyboard
+1988,1395786,"examples/bibliography.bib",18571,1,"",bibtex,content
+1989,1395937,"examples/bibliography.bib",18570,1,"",bibtex,content
+1990,1395947,"examples/bibliography.bib",18570,0,"1",bibtex,content
+1991,1395949,"examples/bibliography.bib",18571,0,"",bibtex,selection_keyboard
+1992,1396119,"examples/bibliography.bib",18570,0,"",bibtex,selection_command
+1993,1396283,"examples/bibliography.bib",18549,0,"",bibtex,selection_command
+1994,1396795,"examples/bibliography.bib",18579,0,"",bibtex,selection_command
+1995,1398318,"examples/bibliography.bib",18573,0,"",bibtex,selection_command
+1996,1398618,"examples/bibliography.bib",18549,0,"",bibtex,selection_command
+1997,1400975,"examples/jasmine.html",0,0,"",html,tab
+1998,1402509,"examples/jasmine.html",2973,0,"deepseekai2025r1",html,content
+1999,1402512,"examples/jasmine.html",2989,0,"",html,selection_command
+2000,1403556,"examples/jasmine.html",2993,0,"",html,selection_command
+2001,1403804,"examples/jasmine.html",2994,0,"",html,selection_command
+2002,1403838,"examples/jasmine.html",2995,0,"",html,selection_command
+2003,1403865,"examples/jasmine.html",2999,0,"",html,selection_command
+2004,1404542,"examples/jasmine.html",3002,0,"",html,selection_command
+2005,1404789,"examples/jasmine.html",3008,0,"",html,selection_command
+2006,1404824,"examples/jasmine.html",3011,0,"",html,selection_command
+2007,1404849,"examples/jasmine.html",3022,0,"",html,selection_command
+2008,1404881,"examples/jasmine.html",3029,0,"",html,selection_command
+2009,1404916,"examples/jasmine.html",3032,0,"",html,selection_command
+2010,1404948,"examples/jasmine.html",3040,0,"",html,selection_command
+2011,1404984,"examples/jasmine.html",3052,0,"",html,selection_command
+2012,1405604,"examples/jasmine.html",3055,0,"",html,selection_command
+2013,1405841,"examples/jasmine.html",3065,0,"",html,selection_command
+2014,1405875,"examples/jasmine.html",3068,0,"",html,selection_command
+2015,1405906,"examples/jasmine.html",3076,0,"",html,selection_command
+2016,1405938,"examples/jasmine.html",3084,0,"",html,selection_command
+2017,1405970,"examples/jasmine.html",3086,0,"",html,selection_command
+2018,1406002,"examples/jasmine.html",3090,0,"",html,selection_command
+2019,1406036,"examples/jasmine.html",3092,0,"",html,selection_command
+2020,1406068,"examples/jasmine.html",3096,0,"",html,selection_command
+2021,1406102,"examples/jasmine.html",3097,0,"",html,selection_command
+2022,1406135,"examples/jasmine.html",3113,0,"",html,selection_command
+2023,1406168,"examples/jasmine.html",3120,0,"",html,selection_command
+2024,1406202,"examples/jasmine.html",3123,0,"",html,selection_command
+2025,1406235,"examples/jasmine.html",3133,0,"",html,selection_command
+2026,1406268,"examples/jasmine.html",3140,0,"",html,selection_command
+2027,1406302,"examples/jasmine.html",3145,0,"",html,selection_command
+2028,1406336,"examples/jasmine.html",3149,0,"",html,selection_command
+2029,1406368,"examples/jasmine.html",3158,0,"",html,selection_command
+2030,1406402,"examples/jasmine.html",3165,0,"",html,selection_command
+2031,1406435,"examples/jasmine.html",3169,0,"",html,selection_command
+2032,1406468,"examples/jasmine.html",3178,0,"",html,selection_command
+2033,1406503,"examples/jasmine.html",3181,0,"",html,selection_command
+2034,1407003,"examples/jasmine.html",3188,0,"",html,selection_command
+2035,1407255,"examples/jasmine.html",3191,0,"",html,selection_command
+2036,1407278,"examples/jasmine.html",3200,0,"",html,selection_command
+2037,1407309,"examples/jasmine.html",3209,0,"",html,selection_command
+2038,1407340,"examples/jasmine.html",3211,0,"",html,selection_command
+2039,1407373,"examples/jasmine.html",3215,0,"",html,selection_command
+2040,1407407,"examples/jasmine.html",3220,0,"",html,selection_command
+2041,1407740,"examples/jasmine.html",3223,0,"",html,selection_command
+2042,1407993,"examples/jasmine.html",3228,0,"",html,selection_command
+2043,1408024,"examples/jasmine.html",3229,0,"",html,selection_command
+2044,1408054,"examples/jasmine.html",3235,0,"",html,selection_command
+2045,1408082,"examples/jasmine.html",3248,0,"",html,selection_command
+2046,1408120,"examples/jasmine.html",3253,0,"",html,selection_command
+2047,1408148,"examples/jasmine.html",3257,0,"",html,selection_command
+2048,1408184,"examples/jasmine.html",3265,0,"",html,selection_command
+2049,1408215,"examples/jasmine.html",3268,0,"",html,selection_command
+2050,1408251,"examples/jasmine.html",3285,0,"",html,selection_command
+2051,1408284,"examples/jasmine.html",3291,0,"",html,selection_command
+2052,1408323,"examples/jasmine.html",3301,0,"",html,selection_command
+2053,1408358,"examples/jasmine.html",3304,0,"",html,selection_command
+2054,1408388,"examples/jasmine.html",3308,0,"",html,selection_command
+2055,1408424,"examples/jasmine.html",3316,0,"",html,selection_command
+2056,1408469,"examples/jasmine.html",3321,0,"",html,selection_command
+2057,1408487,"examples/jasmine.html",3334,0,"",html,selection_command
+2058,1408519,"examples/jasmine.html",3337,0,"",html,selection_command
+2059,1408831,"examples/jasmine.html",3344,0,"",html,selection_command
+2060,1409088,"examples/jasmine.html",3347,0,"",html,selection_command
+2061,1409115,"examples/jasmine.html",3357,0,"",html,selection_command
+2062,1409142,"examples/jasmine.html",3362,0,"",html,selection_command
+2063,1409173,"examples/jasmine.html",3366,0,"",html,selection_command
+2064,1409207,"examples/jasmine.html",3371,0,"",html,selection_command
+2065,1409239,"examples/jasmine.html",3374,0,"",html,selection_command
+2066,1409272,"examples/jasmine.html",3379,0,"",html,selection_command
+2067,1409305,"examples/jasmine.html",3383,0,"",html,selection_command
+2068,1409338,"examples/jasmine.html",3395,0,"",html,selection_command
+2069,1409376,"examples/jasmine.html",3399,0,"",html,selection_command
+2070,1409406,"examples/jasmine.html",3409,0,"",html,selection_command
+2071,1409441,"examples/jasmine.html",3414,0,"",html,selection_command
+2072,1409472,"examples/jasmine.html",3420,0,"",html,selection_command
+2073,1409506,"examples/jasmine.html",3426,0,"",html,selection_command
+2074,1409539,"examples/jasmine.html",3428,0,"",html,selection_command
+2075,1409570,"examples/jasmine.html",3429,0,"",html,selection_command
+2076,1409604,"examples/jasmine.html",3435,0,"",html,selection_command
+2077,1409637,"examples/jasmine.html",3436,0,"",html,selection_command
+2078,1409671,"examples/jasmine.html",3437,0,"",html,selection_command
+2079,1409946,"examples/jasmine.html",3447,0,"",html,selection_command
+2080,1410356,"examples/jasmine.html",3453,0,"",html,selection_command
+2081,1410607,"examples/jasmine.html",3460,0,"",html,selection_command
+2082,1410645,"examples/jasmine.html",3461,0,"",html,selection_command
+2083,1410663,"examples/jasmine.html",3469,0,"",html,selection_command
+2084,1410921,"examples/jasmine.html",3470,0,"",html,selection_command
+2085,1411254,"examples/jasmine.html",3476,0,"",html,selection_command
+2086,1411662,"examples/jasmine.html",3476,1,"(",html,selection_command
+2087,1411701,"examples/jasmine.html",3476,5,"(cite",html,selection_command
+2088,1411882,"examples/jasmine.html",3476,12,"(cite cursor",html,selection_command
+2089,1412106,"examples/jasmine.html",3476,13,"(cite cursor)",html,selection_command
+2090,1412490,"examples/jasmine.html",3476,13,"",html,content
+2091,1496885,"examples/jasmine.html",3476,0,"<",html,content
+2092,1496892,"examples/jasmine.html",3477,0,"",html,selection_keyboard
+2093,1497200,"examples/jasmine.html",3477,0,"d",html,content
+2094,1497203,"examples/jasmine.html",3478,0,"",html,selection_keyboard
+2095,1497371,"examples/jasmine.html",3478,0,"-",html,content
+2096,1497374,"examples/jasmine.html",3479,0,"",html,selection_keyboard
+2097,1497489,"examples/jasmine.html",3479,0,"c",html,content
+2098,1497490,"examples/jasmine.html",3480,0,"",html,selection_keyboard
+2099,1497586,"examples/jasmine.html",3480,0,"i",html,content
+2100,1497587,"examples/jasmine.html",3481,0,"",html,selection_keyboard
+2101,1497722,"examples/jasmine.html",3481,0,"t",html,content
+2102,1497726,"examples/jasmine.html",3482,0,"",html,selection_keyboard
+2103,1497764,"examples/jasmine.html",3482,0,"e",html,content
+2104,1497767,"examples/jasmine.html",3483,0,"",html,selection_keyboard
+2105,1498234,"examples/jasmine.html",3483,0," ",html,content
+2106,1498236,"examples/jasmine.html",3484,0,"",html,selection_keyboard
+2107,1498317,"examples/jasmine.html",3484,0,"k",html,content
+2108,1498318,"examples/jasmine.html",3485,0,"",html,selection_keyboard
+2109,1498432,"examples/jasmine.html",3485,0,"e",html,content
+2110,1498434,"examples/jasmine.html",3486,0,"",html,selection_keyboard
+2111,1498550,"examples/jasmine.html",3486,0,"y",html,content
+2112,1498552,"examples/jasmine.html",3487,0,"",html,selection_keyboard
+2113,1499301,"examples/jasmine.html",3487,0,"=",html,content
+2114,1499307,"examples/jasmine.html",3488,0,"",html,selection_keyboard
+2115,1499409,"examples/jasmine.html",3488,0,"""""",html,content
+2116,1499589,"examples/jasmine.html",3489,1,"""",html,content
+2117,1499592,"examples/jasmine.html",3490,0,"",html,selection_keyboard
+2118,1499735,"examples/jasmine.html",3490,0,"""""",html,content
+2119,1499737,"examples/jasmine.html",3491,0,"",html,selection_keyboard
+2120,1501333,"examples/jasmine.html",3491,0,">",html,content
+2121,1501338,"examples/jasmine.html",3492,0,"",html,selection_keyboard
+2122,1501414,"examples/jasmine.html",3492,0,"<",html,content
+2123,1501415,"examples/jasmine.html",3493,0,"",html,selection_keyboard
+2124,1502039,"examples/jasmine.html",3493,0,">",html,content
+2125,1502044,"examples/jasmine.html",3494,0,"",html,selection_keyboard
+2126,1502481,"examples/jasmine.html",3493,0,"",html,selection_command
+2127,1502704,"examples/jasmine.html",3493,0,"d",html,content
+2128,1502705,"examples/jasmine.html",3494,0,"",html,selection_keyboard
+2129,1503412,"examples/jasmine.html",3493,1,"",html,content
+2130,1503677,"examples/jasmine.html",3493,0,"/",html,content
+2131,1503681,"examples/jasmine.html",3494,0,"",html,selection_keyboard
+2132,1503785,"examples/jasmine.html",3494,0,"d-cite",html,content
+2133,1503822,"examples/jasmine.html",3500,0,"d",html,content
+2134,1503823,"examples/jasmine.html",3501,0,"",html,selection_keyboard
+2135,1504190,"examples/jasmine.html",3501,0,"-",html,content
+2136,1504191,"examples/jasmine.html",3502,0,"",html,selection_keyboard
+2137,1504351,"examples/jasmine.html",3502,0,"c",html,content
+2138,1504352,"examples/jasmine.html",3503,0,"",html,selection_keyboard
+2139,1504449,"examples/jasmine.html",3503,0,"i",html,content
+2140,1504450,"examples/jasmine.html",3504,0,"",html,selection_keyboard
+2141,1504586,"examples/jasmine.html",3504,0,"t",html,content
+2142,1504587,"examples/jasmine.html",3505,0,"",html,selection_keyboard
+2143,1504625,"examples/jasmine.html",3505,0,"e",html,content
+2144,1504626,"examples/jasmine.html",3506,0,"",html,selection_keyboard
+2145,1504884,"examples/jasmine.html",3505,0,"",html,selection_command
+2146,1505187,"examples/jasmine.html",3504,0,"",html,selection_command
+2147,1505711,"examples/jasmine.html",3503,0,"",html,selection_command
+2148,1505859,"examples/jasmine.html",3502,0,"",html,selection_command
+2149,1506006,"examples/jasmine.html",3501,0,"",html,selection_command
+2150,1506482,"examples/jasmine.html",3501,1,"",html,content
+2151,1506923,"examples/jasmine.html",3501,4,"",html,content
+2152,1507377,"examples/jasmine.html",3496,0,"",html,selection_command
+2153,1507525,"examples/jasmine.html",3495,0,"",html,selection_command
+2154,1507663,"examples/jasmine.html",3494,0,"",html,selection_command
+2155,1507999,"examples/jasmine.html",3493,0,"",html,selection_command
+2156,1508251,"examples/jasmine.html",3492,0,"",html,selection_command
+2157,1508286,"examples/jasmine.html",3491,0,"",html,selection_command
+2158,1508316,"examples/jasmine.html",3490,0,"",html,selection_command
+2159,1508357,"examples/jasmine.html",3489,0,"",html,selection_command
+2160,1508508,"examples/jasmine.html",3488,0,"",html,selection_command
+2161,1508842,"examples/jasmine.html",3489,0,"",html,selection_command
+2162,1510023,"examples/bibliography.bib",0,0,"",bibtex,tab
+2163,1510624,"examples/bibliography.bib",18573,0,"",bibtex,selection_command
+2164,1510868,"examples/bibliography.bib",18672,0,"",bibtex,selection_command
+2165,1510905,"examples/bibliography.bib",18700,0,"",bibtex,selection_command
+2166,1510936,"examples/bibliography.bib",18719,0,"",bibtex,selection_command
+2167,1510964,"examples/bibliography.bib",18746,0,"",bibtex,selection_command
+2168,1510999,"examples/bibliography.bib",18775,0,"",bibtex,selection_command
+2169,1511033,"examples/bibliography.bib",18803,0,"",bibtex,selection_command
+2170,1511069,"examples/bibliography.bib",18850,0,"",bibtex,selection_command
+2171,1511250,"examples/bibliography.bib",18851,0,"\n",bibtex,content
+2172,1511669,"examples/bibliography.bib",18852,0,"\n",bibtex,content
+2173,1512000,"examples/bibliography.bib",18853,0,"@",bibtex,content
+2174,1512001,"examples/bibliography.bib",18854,0,"",bibtex,selection_keyboard
+2175,1512430,"examples/bibliography.bib",18854,0,"m",bibtex,content
+2176,1512433,"examples/bibliography.bib",18855,0,"",bibtex,selection_keyboard
+2177,1512490,"examples/bibliography.bib",18855,0,"i",bibtex,content
+2178,1512491,"examples/bibliography.bib",18856,0,"",bibtex,selection_keyboard
+2179,1512557,"examples/bibliography.bib",18856,0,"s",bibtex,content
+2180,1512560,"examples/bibliography.bib",18857,0,"",bibtex,selection_keyboard
+2181,1512727,"examples/bibliography.bib",18857,0,"c",bibtex,content
+2182,1512730,"examples/bibliography.bib",18858,0,"",bibtex,selection_keyboard
+2183,1513064,"examples/bibliography.bib",18858,0,"{",bibtex,content
+2184,1513067,"examples/bibliography.bib",18859,0,"",bibtex,selection_keyboard
+2185,1513362,"examples/bibliography.bib",18859,0,"https://cursor.com/blog/tab-update",bibtex,content
+2186,1513365,"examples/bibliography.bib",18893,0,"",bibtex,selection_keyboard
+2187,1514429,"examples/bibliography.bib",18887,6,"",bibtex,content
+2188,1514566,"examples/bibliography.bib",18886,1,"",bibtex,content
+2189,1514724,"examples/bibliography.bib",18883,3,"",bibtex,content
+2190,1514811,"examples/bibliography.bib",18882,1,"",bibtex,content
+2191,1514974,"examples/bibliography.bib",18878,4,"",bibtex,content
+2192,1515132,"examples/bibliography.bib",18877,1,"",bibtex,content
+2193,1515286,"examples/bibliography.bib",18874,3,"",bibtex,content
+2194,1515432,"examples/bibliography.bib",18873,1,"",bibtex,content
+2195,1515647,"examples/bibliography.bib",18867,6,"",bibtex,content
+2196,1515830,"examples/bibliography.bib",18864,3,"",bibtex,content
+2197,1516110,"examples/bibliography.bib",18859,5,"",bibtex,content
+2198,1516536,"examples/bibliography.bib",18859,0,"c",bibtex,content
+2199,1516539,"examples/bibliography.bib",18860,0,"",bibtex,selection_keyboard
+2200,1516899,"examples/bibliography.bib",18860,0,"u",bibtex,content
+2201,1516904,"examples/bibliography.bib",18861,0,"",bibtex,selection_keyboard
+2202,1517037,"examples/bibliography.bib",18861,0,"r",bibtex,content
+2203,1517041,"examples/bibliography.bib",18862,0,"",bibtex,selection_keyboard
+2204,1517188,"examples/bibliography.bib",18862,0,"s",bibtex,content
+2205,1517191,"examples/bibliography.bib",18863,0,"",bibtex,selection_keyboard
+2206,1517302,"examples/bibliography.bib",18863,0,"o",bibtex,content
+2207,1517306,"examples/bibliography.bib",18864,0,"",bibtex,selection_keyboard
+2208,1517383,"examples/bibliography.bib",18864,0,"r",bibtex,content
+2209,1517386,"examples/bibliography.bib",18865,0,"",bibtex,selection_keyboard
+2210,1517652,"examples/bibliography.bib",18865,0,"2",bibtex,content
+2211,1517655,"examples/bibliography.bib",18866,0,"",bibtex,selection_keyboard
+2212,1517753,"examples/bibliography.bib",18866,0,"0",bibtex,content
+2213,1517756,"examples/bibliography.bib",18867,0,"",bibtex,selection_keyboard
+2214,1517832,"examples/bibliography.bib",18867,0,"2",bibtex,content
+2215,1517834,"examples/bibliography.bib",18868,0,"",bibtex,selection_keyboard
+2216,1517961,"examples/bibliography.bib",18868,0,"5",bibtex,content
+2217,1517964,"examples/bibliography.bib",18869,0,"",bibtex,selection_keyboard
+2218,1518388,"examples/bibliography.bib",18869,0,"t",bibtex,content
+2219,1518392,"examples/bibliography.bib",18870,0,"",bibtex,selection_keyboard
+2220,1518426,"examples/bibliography.bib",18870,0,"a",bibtex,content
+2221,1518427,"examples/bibliography.bib",18871,0,"",bibtex,selection_keyboard
+2222,1518855,"examples/bibliography.bib",18871,0,"b",bibtex,content
+2223,1518856,"examples/bibliography.bib",18872,0,"",bibtex,selection_keyboard
+2224,1518982,"examples/bibliography.bib",18872,0,",",bibtex,content
+2225,1518983,"examples/bibliography.bib",18873,0,"",bibtex,selection_keyboard
+2226,1519168,"examples/bibliography.bib",18873,0,"\n",bibtex,content
+2227,1523311,"examples/bibliography.bib",18874,0," ",bibtex,content
+2228,1524947,"examples/bibliography.bib",18876,0,"title = {Tab by Cursor: The AI-powered browser for developers},",bibtex,content
+2229,1525370,"examples/bibliography.bib",18939,0,"\n ",bibtex,content
+2230,1525899,"examples/bibliography.bib",18942,0,"author = {Cursor},\n year = {2025},\n url = {https://tab.cursor.com/},\n note = {Accessed: 2025-08-05},\n}",bibtex,content
+2231,1526495,"examples/bibliography.bib",19046,0,"",bibtex,selection_command
+2232,1526758,"examples/bibliography.bib",19013,0,"",bibtex,selection_command
+2233,1526891,"examples/bibliography.bib",18978,0,"",bibtex,selection_command
+2234,1527209,"examples/bibliography.bib",18961,0,"",bibtex,selection_command
+2235,1527363,"examples/bibliography.bib",18940,0,"",bibtex,selection_command
+2236,1527509,"examples/bibliography.bib",18874,0,"",bibtex,selection_command
+2237,1528180,"examples/bibliography.bib",18940,0,"",bibtex,selection_command
+2238,1528345,"examples/bibliography.bib",18961,0,"",bibtex,selection_command
+2239,1528492,"examples/bibliography.bib",18978,0,"",bibtex,selection_command
+2240,1528624,"examples/bibliography.bib",18980,0,"",bibtex,selection_command
+2241,1528782,"examples/bibliography.bib",18984,0,"",bibtex,selection_command
+2242,1528938,"examples/bibliography.bib",18986,0,"",bibtex,selection_command
+2243,1529094,"examples/bibliography.bib",18987,0,"",bibtex,selection_command
+2244,1529683,"examples/bibliography.bib",18987,23,"",bibtex,content
+2245,1530849,"examples/bibliography.bib",18987,0,"https://cursor.com/blog/tab-update",bibtex,content
+2246,1530852,"examples/bibliography.bib",19021,0,"",bibtex,selection_keyboard
+2247,1531181,"examples/bibliography.bib",19020,0,"",bibtex,selection_command
+2248,1531365,"examples/bibliography.bib",19055,0,"",bibtex,selection_command
+2249,1535624,"examples/bibliography.bib",19020,0,"",bibtex,selection_command
+2250,1535873,"examples/bibliography.bib",18976,0,"",bibtex,selection_command
+2251,1535910,"examples/bibliography.bib",18959,0,"",bibtex,selection_command
+2252,1536429,"examples/bibliography.bib",18916,0,"",bibtex,selection_command
+2253,1537333,"examples/bibliography.bib",18885,52,"",bibtex,content
+2254,1540660,"examples/bibliography.bib",18885,0,"A New Tab Model",bibtex,content
+2255,1540662,"examples/bibliography.bib",18900,0,"",bibtex,selection_keyboard
+2256,1540933,"examples/bibliography.bib",18899,0,"",bibtex,selection_command
+2257,1541134,"examples/bibliography.bib",18874,0,"",bibtex,selection_command
+2258,1542546,"examples/jasmine.html",0,0,"",html,tab
+2259,1543993,"examples/jasmine.html",3489,0,"c",html,content
+2260,1543995,"examples/jasmine.html",3490,0,"",html,selection_keyboard
+2261,1544132,"examples/jasmine.html",3490,0,"u",html,content
+2262,1544137,"examples/jasmine.html",3491,0,"",html,selection_keyboard
+2263,1544209,"examples/jasmine.html",3491,0,"r",html,content
+2264,1544213,"examples/jasmine.html",3492,0,"",html,selection_keyboard
+2265,1544856,"examples/jasmine.html",3492,0,"s",html,content
+2266,1544861,"examples/jasmine.html",3493,0,"",html,selection_keyboard
+2267,1544932,"examples/jasmine.html",3493,0,"o",html,content
+2268,1544937,"examples/jasmine.html",3494,0,"",html,selection_keyboard
+2269,1545176,"examples/jasmine.html",3494,0,"r",html,content
+2270,1545179,"examples/jasmine.html",3495,0,"",html,selection_keyboard
+2271,1545530,"examples/jasmine.html",3495,0,"2",html,content
+2272,1545535,"examples/jasmine.html",3496,0,"",html,selection_keyboard
+2273,1545674,"examples/jasmine.html",3496,0,"0",html,content
+2274,1545677,"examples/jasmine.html",3497,0,"",html,selection_keyboard
+2275,1545760,"examples/jasmine.html",3497,0,"2",html,content
+2276,1545763,"examples/jasmine.html",3498,0,"",html,selection_keyboard
+2277,1545873,"examples/jasmine.html",3498,0,"5",html,content
+2278,1545875,"examples/jasmine.html",3499,0,"",html,selection_keyboard
+2279,1546159,"examples/jasmine.html",3499,0,"t",html,content
+2280,1546163,"examples/jasmine.html",3500,0,"",html,selection_keyboard
+2281,1546288,"examples/jasmine.html",3500,0,"a",html,content
+2282,1546290,"examples/jasmine.html",3501,0,"",html,selection_keyboard
+2283,1546389,"examples/jasmine.html",3501,0,"b",html,content
+2284,1546392,"examples/jasmine.html",3502,0,"",html,selection_keyboard
+2285,1546561,"examples/jasmine.html",3501,0,"",html,selection_command
+2286,1546757,"examples/jasmine.html",3439,0,"",html,selection_command
+2287,1549853,"TERMINAL",0,0,"^C[1G[0Kโ [1G[0K[1m[7m%[27m[1m[0m \r \r]633;D;0]633;P;Cwd=/Users/franzsrambical/Documents/pdoom/pdoom.org\r[0m[27m[24m[J]633;Afranzsrambical@MBF6N9WFVKFV pdoom.org % ]633;B[K[?2004h",,terminal_output
+2288,1549900,"TERMINAL",0,0,"",,terminal_command
+2289,1549900,"TERMINAL",0,0,"[?2004l\r\r\n[1m[7m%[27m[1m[0m \r \r]633;E;;b07e0e29-8b6e-4188-b95b-fd2ce17f4bb1",,terminal_output
+2290,1549901,"TERMINAL",0,0,"]633;C",,terminal_output
+2291,1550474,"TERMINAL",0,0,"cp examples/* dist",,terminal_command
+2292,1550493,"TERMINAL",0,0,"]633;C[1m[7m%[27m[1m[0m \r \r",,terminal_output
+2293,1551003,"TERMINAL",0,0,"npm run dev",,terminal_command
+2294,1551054,"TERMINAL",0,0,"]633;C",,terminal_output
+2295,1551178,"TERMINAL",0,0,"\r\n> distill-template@2.8.0 dev\r\n> rollup -c rollup.config.dev.js -w\r\n\r\n[1G[0K",,terminal_output
+2296,1551359,"TERMINAL",0,0,"c[4mrollup v2.7.3[24m\r\n[36mbundles [1msrc/components.js[22m → [1mdist/template.v2.js[22m...[39m\r\n",,terminal_output
+2297,1551834,"TERMINAL",0,0,"[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/dist\r\n[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/examples\r\n[1m[33m(!) Circular dependencies[39m[22m\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/array.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/object.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-selection/src/selection/index.js -> node_modules/d3-selection/src/selection/select.js -> node_modules/d3-selection/src/selection/index.js\r\n...and 7 more\r\n[32mcreated [1mdist/template.v2.js[22m in [1m478ms[22m[39m\r\n[36mbundles [1msrc/transforms.js[22m → [1mdist/transforms.v2.js[22m...[39m\r\n",,terminal_output
+2298,1552277,"TERMINAL",0,0,"[32mcreated [1mdist/transforms.v2.js[22m in [1m443ms[22m[39m\r\n\r\n[2025-08-05 18:51:48] waiting for changes...\r\n",,terminal_output
+2299,1566325,"examples/jasmine.html",3451,0,"",html,selection_command
+2300,1566575,"examples/jasmine.html",3459,0,"",html,selection_command
+2301,1566605,"examples/jasmine.html",3460,0,"",html,selection_command
+2302,1566633,"examples/jasmine.html",3468,0,"",html,selection_command
+2303,1566665,"examples/jasmine.html",3469,0,"",html,selection_command
+2304,1566698,"examples/jasmine.html",3474,0,"",html,selection_command
+2305,1566732,"examples/jasmine.html",3476,0,"",html,selection_command
+2306,1566767,"examples/jasmine.html",3477,0,"",html,selection_command
+2307,1566801,"examples/jasmine.html",3478,0,"",html,selection_command
+2308,1566835,"examples/jasmine.html",3482,0,"",html,selection_command
+2309,1567008,"examples/jasmine.html",3486,0,"",html,selection_command
+2310,1567275,"examples/jasmine.html",3488,0,"",html,selection_command
+2311,1567308,"examples/jasmine.html",3501,0,"",html,selection_command
+2312,1567327,"examples/jasmine.html",3506,0,"",html,selection_command
+2313,1567356,"examples/jasmine.html",3507,0,"",html,selection_command
+2314,1567393,"examples/jasmine.html",3508,0,"",html,selection_command
+2315,1567425,"examples/jasmine.html",3513,0,"",html,selection_command
+2316,1567836,"examples/jasmine.html",3514,0,"",html,selection_command
+2317,1567989,"examples/jasmine.html",3515,0,"",html,selection_command
+2318,1568341,"examples/jasmine.html",3515,1,"",html,content
+2319,1568611,"examples/jasmine.html",3514,0,"",html,selection_command
+2320,1568778,"examples/jasmine.html",3513,0,"",html,selection_command
+2321,1568942,"examples/jasmine.html",3513,1,"",html,content
+2322,1570839,"examples/jasmine.html",3509,0,"",html,selection_command
+2323,1571085,"examples/jasmine.html",3508,0,"",html,selection_command
+2324,1571300,"examples/jasmine.html",3507,0,"",html,selection_command
+2325,1571467,"examples/jasmine.html",3502,0,"",html,selection_command
+2326,1571896,"examples/jasmine.html",3503,0,"",html,selection_command
+2327,1572014,"examples/jasmine.html",3503,1,"",html,content
+2328,1572333,"examples/jasmine.html",3439,0,"",html,selection_command
+2329,1573385,"TERMINAL",0,0,"^C[1G[0Kโ [1G[0K[1m[7m%[27m[1m[0m \r \r]633;D;0]633;P;Cwd=/Users/franzsrambical/Documents/pdoom/pdoom.org\r[0m[27m[24m[J]633;Afranzsrambical@MBF6N9WFVKFV pdoom.org % ]633;B[K[?2004h",,terminal_output
+2330,1573467,"TERMINAL",0,0,"",,terminal_command
+2331,1573467,"TERMINAL",0,0,"[?2004l\r\r\n[1m[7m%[27m[1m[0m \r \r]633;E;;b07e0e29-8b6e-4188-b95b-fd2ce17f4bb1",,terminal_output
+2332,1573467,"TERMINAL",0,0,"]633;C",,terminal_output
+2333,1573920,"TERMINAL",0,0,"cp examples/* dist",,terminal_command
+2334,1573933,"TERMINAL",0,0,"]633;C[1m[7m%[27m[1m[0m \r \r",,terminal_output
+2335,1574424,"TERMINAL",0,0,"npm run dev",,terminal_command
+2336,1574476,"TERMINAL",0,0,"]633;C",,terminal_output
+2337,1574611,"TERMINAL",0,0,"\r\n> distill-template@2.8.0 dev\r\n> rollup -c rollup.config.dev.js -w\r\n\r\n[1G[0K",,terminal_output
+2338,1574787,"TERMINAL",0,0,"c[4mrollup v2.7.3[24m\r\n[36mbundles [1msrc/components.js[22m → [1mdist/template.v2.js[22m...[39m\r\n",,terminal_output
+2339,1575203,"TERMINAL",0,0,"[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/dist\r\n[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/examples\r\n[1m[33m(!) Circular dependencies[39m[22m\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/array.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/object.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-selection/src/selection/index.js -> node_modules/d3-selection/src/selection/select.js -> node_modules/d3-selection/src/selection/index.js\r\n...and 7 more\r\n[32mcreated [1mdist/template.v2.js[22m in [1m418ms[22m[39m\r\n[36mbundles [1msrc/transforms.js[22m → [1mdist/transforms.v2.js[22m...[39m\r\n",,terminal_output
+2340,1575633,"TERMINAL",0,0,"[32mcreated [1mdist/transforms.v2.js[22m in [1m428ms[22m[39m\r\n\r\n[2025-08-05 18:52:11] waiting for changes...\r\n",,terminal_output
+2341,1595415,"examples/jasmine.html",3493,0,"",html,selection_command
+2342,1596487,"examples/jasmine.html",3872,0,"",html,selection_command
+2343,1599047,"examples/jasmine.html",3493,0,"",html,selection_command
+2344,1599192,"examples/jasmine.html",3161,0,"",html,selection_command
+2345,1599465,"examples/jasmine.html",3098,0,"",html,selection_command
+2346,1600423,"examples/jasmine.html",2952,0,"",html,selection_command
+2347,1601456,"examples/jasmine.html",2836,0,"",html,selection_command
+2348,1602588,"examples/jasmine.html",2802,0,"",html,selection_command
+2349,1604099,"examples/jasmine.html",2803,0,"",html,selection_command
+2350,1604204,"examples/jasmine.html",2803,0,"u",html,content
+2351,1604207,"examples/jasmine.html",2804,0,"",html,selection_keyboard
+2352,1604323,"examples/jasmine.html",2803,0,"",html,selection_command
+2353,1605399,"TERMINAL",0,0,"^C[1G[0Kโ [1G[0K[1m[7m%[27m[1m[0m \r \r]633;D;0]633;P;Cwd=/Users/franzsrambical/Documents/pdoom/pdoom.org\r[0m[27m[24m[J]633;Afranzsrambical@MBF6N9WFVKFV pdoom.org % ]633;B[K[?2004h",,terminal_output
+2354,1605548,"TERMINAL",0,0,"npm run dev",,terminal_output
+2355,1605724,"TERMINAL",0,0,"[11Dcp examples/* dist",,terminal_output
+2356,1605908,"TERMINAL",0,0,"cp examples/* dist",,terminal_command
+2357,1605908,"TERMINAL",0,0,"[?2004l\r\r\n]633;E;cp examples/* dist;b07e0e29-8b6e-4188-b95b-fd2ce17f4bb1",,terminal_output
+2358,1605924,"TERMINAL",0,0,"]633;C[1m[7m%[27m[1m[0m \r \r",,terminal_output
+2359,1606435,"TERMINAL",0,0,"npm run dev",,terminal_command
+2360,1606487,"TERMINAL",0,0,"]633;C",,terminal_output
+2361,1606633,"TERMINAL",0,0,"\r\n> distill-template@2.8.0 dev\r\n> rollup -c rollup.config.dev.js -w\r\n\r\n[1G[0K",,terminal_output
+2362,1606847,"TERMINAL",0,0,"c[4mrollup v2.7.3[24m\r\n[36mbundles [1msrc/components.js[22m → [1mdist/template.v2.js[22m...[39m\r\n",,terminal_output
+2363,1607311,"TERMINAL",0,0,"[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/dist\r\n[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/examples\r\n[1m[33m(!) Circular dependencies[39m[22m\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/array.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/object.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-selection/src/selection/index.js -> node_modules/d3-selection/src/selection/select.js -> node_modules/d3-selection/src/selection/index.js\r\n...and 7 more\r\n[32mcreated [1mdist/template.v2.js[22m in [1m466ms[22m[39m\r\n[36mbundles [1msrc/transforms.js[22m → [1mdist/transforms.v2.js[22m...[39m\r\n",,terminal_output
+2364,1607655,"TERMINAL",0,0,"[32mcreated [1mdist/transforms.v2.js[22m in [1m342ms[22m[39m\r\n\r\n[2025-08-05 18:52:43] waiting for changes...\r\n",,terminal_output
+2365,1675760,"examples/jasmine.html",3003,0,"",html,selection_command
+2366,1676002,"examples/jasmine.html",3209,0,"",html,selection_command
+2367,1676031,"examples/jasmine.html",3381,0,"",html,selection_command
+2368,1676063,"examples/jasmine.html",3430,0,"",html,selection_command
+2369,1676096,"examples/jasmine.html",3438,0,"",html,selection_command
+2370,1676129,"examples/jasmine.html",3543,0,"",html,selection_command
+2371,1676162,"examples/jasmine.html",3738,0,"",html,selection_command
+2372,1676197,"examples/jasmine.html",3748,0,"",html,selection_command
+2373,1676229,"examples/jasmine.html",3756,0,"",html,selection_command
+2374,1676376,"examples/jasmine.html",3861,0,"",html,selection_command
+2375,1676956,"examples/jasmine.html",3864,0,"",html,selection_command
+2376,1677205,"examples/jasmine.html",3869,0,"",html,selection_command
+2377,1677233,"examples/jasmine.html",3872,0,"",html,selection_command
+2378,1677265,"examples/jasmine.html",3878,0,"",html,selection_command
+2379,1677297,"examples/jasmine.html",3884,0,"",html,selection_command
+2380,1677330,"examples/jasmine.html",3886,0,"",html,selection_command
+2381,1677364,"examples/jasmine.html",3892,0,"",html,selection_command
+2382,1677397,"examples/jasmine.html",3898,0,"",html,selection_command
+2383,1677430,"examples/jasmine.html",3905,0,"",html,selection_command
+2384,1677464,"examples/jasmine.html",3909,0,"",html,selection_command
+2385,1677497,"examples/jasmine.html",3914,0,"",html,selection_command
+2386,1677531,"examples/jasmine.html",3923,0,"",html,selection_command
+2387,1677564,"examples/jasmine.html",3937,0,"",html,selection_command
+2388,1677597,"examples/jasmine.html",3949,0,"",html,selection_command
+2389,1677631,"examples/jasmine.html",3952,0,"",html,selection_command
+2390,1677664,"examples/jasmine.html",3961,0,"",html,selection_command
+2391,1677697,"examples/jasmine.html",3965,0,"",html,selection_command
+2392,1677730,"examples/jasmine.html",3967,0,"",html,selection_command
+2393,1677764,"examples/jasmine.html",3970,0,"",html,selection_command
+2394,1677797,"examples/jasmine.html",3980,0,"",html,selection_command
+2395,1677830,"examples/jasmine.html",3985,0,"",html,selection_command
+2396,1677863,"examples/jasmine.html",3988,0,"",html,selection_command
+2397,1677897,"examples/jasmine.html",3992,0,"",html,selection_command
+2398,1677930,"examples/jasmine.html",4002,0,"",html,selection_command
+2399,1677964,"examples/jasmine.html",4005,0,"",html,selection_command
+2400,1677998,"examples/jasmine.html",4008,0,"",html,selection_command
+2401,1678030,"examples/jasmine.html",4014,0,"",html,selection_command
+2402,1678064,"examples/jasmine.html",4016,0,"",html,selection_command
+2403,1678097,"examples/jasmine.html",4022,0,"",html,selection_command
+2404,1678130,"examples/jasmine.html",4028,0,"",html,selection_command
+2405,1678164,"examples/jasmine.html",4031,0,"",html,selection_command
+2406,1678197,"examples/jasmine.html",4035,0,"",html,selection_command
+2407,1678230,"examples/jasmine.html",4038,0,"",html,selection_command
+2408,1678264,"examples/jasmine.html",4040,0,"",html,selection_command
+2409,1678297,"examples/jasmine.html",4051,0,"",html,selection_command
+2410,1678330,"examples/jasmine.html",4054,0,"",html,selection_command
+2411,1678364,"examples/jasmine.html",4058,0,"",html,selection_command
+2412,1678551,"examples/jasmine.html",4064,0,"",html,selection_command
+2413,1678722,"examples/jasmine.html",4065,0,"",html,selection_command
+2414,1679004,"examples/jasmine.html",4064,0,"",html,selection_command
+2415,1679194,"examples/jasmine.html",4064,1,"(",html,selection_command
+2416,1679267,"examples/jasmine.html",4064,5,"(cite",html,selection_command
+2417,1679524,"examples/jasmine.html",4064,11,"(cite Genie",html,selection_command
+2418,1679555,"examples/jasmine.html",4064,13,"(cite Genie 1",html,selection_command
+2419,1679578,"examples/jasmine.html",4064,14,"(cite Genie 1,",html,selection_command
+2420,1679613,"examples/jasmine.html",4064,20,"(cite Genie 1, Genie",html,selection_command
+2421,1679914,"examples/jasmine.html",4064,22,"(cite Genie 1, Genie 2",html,selection_command
+2422,1680151,"examples/jasmine.html",4064,24,"(cite Genie 1, Genie 2),",html,selection_command
+2423,1680471,"examples/jasmine.html",4064,23,"(cite Genie 1, Genie 2)",html,selection_command
+2424,1680671,"examples/jasmine.html",4064,23,"",html,content
+2425,1681464,"examples/jasmine.html",4064,0,"<",html,content
+2426,1681466,"examples/jasmine.html",4065,0,"",html,selection_keyboard
+2427,1682421,"examples/jasmine.html",4065,0,"d-cite key=""genie2025, genie2025"">",html,content
+2428,1682786,"examples/jasmine.html",4107,0,"",html,selection_command
+2429,1683038,"examples/jasmine.html",4103,0,"",html,selection_command
+2430,1683181,"examples/jasmine.html",4102,0,"",html,selection_command
+2431,1683322,"examples/jasmine.html",4101,0,"",html,selection_command
+2432,1683456,"examples/jasmine.html",4097,0,"",html,selection_command
+2433,1683611,"examples/jasmine.html",4088,0,"",html,selection_command
+2434,1683749,"examples/jasmine.html",4086,0,"",html,selection_command
+2435,1683910,"examples/jasmine.html",4077,0,"",html,selection_command
+2436,1685864,"examples/bibliography.bib",0,0,"",bibtex,tab
+2437,1687884,"examples/bibliography.bib",19020,0,"",bibtex,selection_command
+2438,1688036,"examples/bibliography.bib",19021,0,"\n",bibtex,content
+2439,1688223,"examples/bibliography.bib",19022,0,"\n",bibtex,content
+2440,1727463,"examples/bibliography.bib",19023,0,"@inproceedings{\n bruce2024genie,\n title={Genie: Generative Interactive Environments},\n author={Jake Bruce and Michael D Dennis and Ashley Edwards and Jack Parker-Holder and Yuge Shi and Edward Hughes and Matthew Lai and Aditi Mavalankar and Richie Steigerwald and Chris Apps and Yusuf Aytar and Sarah Maria Elisabeth Bechtle and Feryal Behbahani and Stephanie C.Y. Chan and Nicolas Heess and Lucy Gonzalez and Simon Osindero and Sherjil Ozair and Scott Reed and Jingwei Zhang and Konrad Zolna and Jeff Clune and Nando de Freitas and Satinder Singh and Tim Rockt{\""a}schel},\n booktitle={Forty-first International Conference on Machine Learning},\n year={2024},\n url={https://openreview.net/forum?id=bJbSbJskOS}\n}",bibtex,content
+2441,1727772,"examples/bibliography.bib",19750,0,"",bibtex,selection_command
+2442,1728546,"examples/bibliography.bib",19751,0,"\n",bibtex,content
+2443,1728760,"examples/bibliography.bib",19752,0,"\n",bibtex,content
+2444,1729924,"examples/bibliography.bib",19753,0,"@",bibtex,content
+2445,1729928,"examples/bibliography.bib",19754,0,"",bibtex,selection_keyboard
+2446,1730107,"examples/bibliography.bib",19754,0,"m",bibtex,content
+2447,1730108,"examples/bibliography.bib",19755,0,"",bibtex,selection_keyboard
+2448,1730173,"examples/bibliography.bib",19755,0,"i",bibtex,content
+2449,1730174,"examples/bibliography.bib",19756,0,"",bibtex,selection_keyboard
+2450,1730214,"examples/bibliography.bib",19756,0,"s",bibtex,content
+2451,1730217,"examples/bibliography.bib",19757,0,"",bibtex,selection_keyboard
+2452,1730312,"examples/bibliography.bib",19757,0,"c",bibtex,content
+2453,1730313,"examples/bibliography.bib",19758,0,"",bibtex,selection_keyboard
+2454,1730910,"examples/bibliography.bib",19758,0,"{",bibtex,content
+2455,1730913,"examples/bibliography.bib",19759,0,"",bibtex,selection_keyboard
+2456,1731397,"examples/bibliography.bib",19759,0,"\n",bibtex,content
+2457,1732231,"examples/bibliography.bib",19759,1,"",bibtex,content
+2458,1732598,"examples/bibliography.bib",19759,0,"e",bibtex,content
+2459,1732599,"examples/bibliography.bib",19760,0,"",bibtex,selection_keyboard
+2460,1732921,"examples/bibliography.bib",19759,1,"",bibtex,content
+2461,1732995,"examples/bibliography.bib",19759,0,"g",bibtex,content
+2462,1732996,"examples/bibliography.bib",19760,0,"",bibtex,selection_keyboard
+2463,1733042,"examples/bibliography.bib",19760,0,"e",bibtex,content
+2464,1733045,"examples/bibliography.bib",19761,0,"",bibtex,selection_keyboard
+2465,1733125,"examples/bibliography.bib",19761,0,"n",bibtex,content
+2466,1733128,"examples/bibliography.bib",19762,0,"",bibtex,selection_keyboard
+2467,1733148,"examples/bibliography.bib",19762,0,"i",bibtex,content
+2468,1733151,"examples/bibliography.bib",19763,0,"",bibtex,selection_keyboard
+2469,1733226,"examples/bibliography.bib",19763,0,"e",bibtex,content
+2470,1733228,"examples/bibliography.bib",19764,0,"",bibtex,selection_keyboard
+2471,1733793,"examples/bibliography.bib",19764,0,"2",bibtex,content
+2472,1733798,"examples/bibliography.bib",19765,0,"",bibtex,selection_keyboard
+2473,1733948,"examples/bibliography.bib",19765,0,"-",bibtex,content
+2474,1733950,"examples/bibliography.bib",19766,0,"",bibtex,selection_keyboard
+2475,1734628,"examples/bibliography.bib",19765,1,"",bibtex,content
+2476,1734879,"examples/bibliography.bib",19764,1,"",bibtex,content
+2477,1735161,"examples/bibliography.bib",19763,1,"",bibtex,content
+2478,1735333,"examples/bibliography.bib",19762,1,"",bibtex,content
+2479,1735492,"examples/bibliography.bib",19761,1,"",bibtex,content
+2480,1735624,"examples/bibliography.bib",19760,1,"",bibtex,content
+2481,1735782,"examples/bibliography.bib",19759,1,"",bibtex,content
+2482,1736721,"examples/bibliography.bib",19759,0,"d",bibtex,content
+2483,1736725,"examples/bibliography.bib",19760,0,"",bibtex,selection_keyboard
+2484,1736768,"examples/bibliography.bib",19760,0,"e",bibtex,content
+2485,1736772,"examples/bibliography.bib",19761,0,"",bibtex,selection_keyboard
+2486,1736949,"examples/bibliography.bib",19761,0,"e",bibtex,content
+2487,1736990,"examples/bibliography.bib",19762,0,"p",bibtex,content
+2488,1736994,"examples/bibliography.bib",19763,0,"",bibtex,selection_keyboard
+2489,1737112,"examples/bibliography.bib",19763,0,"m",bibtex,content
+2490,1737113,"examples/bibliography.bib",19764,0,"",bibtex,selection_keyboard
+2491,1737225,"examples/bibliography.bib",19764,0,"i",bibtex,content
+2492,1737228,"examples/bibliography.bib",19765,0,"",bibtex,selection_keyboard
+2493,1737298,"examples/bibliography.bib",19765,0,"n",bibtex,content
+2494,1737302,"examples/bibliography.bib",19766,0,"",bibtex,selection_keyboard
+2495,1737360,"examples/bibliography.bib",19766,0,"d",bibtex,content
+2496,1737363,"examples/bibliography.bib",19767,0,"",bibtex,selection_keyboard
+2497,1738126,"examples/bibliography.bib",19767,0,"2",bibtex,content
+2498,1738133,"examples/bibliography.bib",19768,0,"",bibtex,selection_keyboard
+2499,1738214,"examples/bibliography.bib",19768,0,"0",bibtex,content
+2500,1738218,"examples/bibliography.bib",19769,0,"",bibtex,selection_keyboard
+2501,1738270,"examples/bibliography.bib",19769,0,"2",bibtex,content
+2502,1738273,"examples/bibliography.bib",19770,0,"",bibtex,selection_keyboard
+2503,1738404,"examples/bibliography.bib",19770,0,"5",bibtex,content
+2504,1738407,"examples/bibliography.bib",19771,0,"",bibtex,selection_keyboard
+2505,1738717,"examples/bibliography.bib",19771,0,"g",bibtex,content
+2506,1738720,"examples/bibliography.bib",19772,0,"",bibtex,selection_keyboard
+2507,1738792,"examples/bibliography.bib",19772,0,"e",bibtex,content
+2508,1738794,"examples/bibliography.bib",19773,0,"",bibtex,selection_keyboard
+2509,1738871,"examples/bibliography.bib",19773,0,"n",bibtex,content
+2510,1738873,"examples/bibliography.bib",19774,0,"",bibtex,selection_keyboard
+2511,1738920,"examples/bibliography.bib",19774,0,"i",bibtex,content
+2512,1738922,"examples/bibliography.bib",19775,0,"",bibtex,selection_keyboard
+2513,1738976,"examples/bibliography.bib",19775,0,"e",bibtex,content
+2514,1738977,"examples/bibliography.bib",19776,0,"",bibtex,selection_keyboard
+2515,1739187,"examples/bibliography.bib",19776,0,"2",bibtex,content
+2516,1739189,"examples/bibliography.bib",19777,0,"",bibtex,selection_keyboard
+2517,1739328,"examples/bibliography.bib",19777,0,",",bibtex,content
+2518,1739330,"examples/bibliography.bib",19778,0,"",bibtex,selection_keyboard
+2519,1739641,"examples/bibliography.bib",19778,0,"\n",bibtex,content
+2520,1740980,"examples/bibliography.bib",19778,1,"",bibtex,content
+2521,1741672,"examples/bibliography.bib",19777,0,"",bibtex,selection_command
+2522,1741786,"examples/bibliography.bib",19759,0,"",bibtex,selection_command
+2523,1741969,"examples/bibliography.bib",19758,0,"",bibtex,selection_command
+2524,1742938,"examples/bibliography.bib",19759,0,"",bibtex,selection_command
+2525,1743178,"examples/bibliography.bib",19752,0,"",bibtex,selection_command
+2526,1743421,"examples/bibliography.bib",19750,0,"",bibtex,selection_command
+2527,1743453,"examples/bibliography.bib",19703,0,"",bibtex,selection_command
+2528,1743486,"examples/bibliography.bib",19686,0,"",bibtex,selection_command
+2529,1743519,"examples/bibliography.bib",19612,0,"",bibtex,selection_command
+2530,1743553,"examples/bibliography.bib",19121,0,"",bibtex,selection_command
+2531,1743703,"examples/bibliography.bib",19065,0,"",bibtex,selection_command
+2532,1744086,"examples/bibliography.bib",19043,0,"",bibtex,selection_command
+2533,1744277,"examples/bibliography.bib",19041,2,"",bibtex,content
+2534,1744423,"examples/bibliography.bib",19039,2,"",bibtex,content
+2535,1744767,"examples/bibliography.bib",19038,1,"",bibtex,content
+2536,1745105,"examples/bibliography.bib",19037,0,"",bibtex,selection_command
+2537,1745361,"examples/bibliography.bib",19068,0,"",bibtex,selection_command
+2538,1746785,"examples/bibliography.bib",19124,0,"",bibtex,selection_command
+2539,1747030,"examples/bibliography.bib",19615,0,"",bibtex,selection_command
+2540,1747057,"examples/bibliography.bib",19689,0,"",bibtex,selection_command
+2541,1747089,"examples/bibliography.bib",19706,0,"",bibtex,selection_command
+2542,1747204,"examples/bibliography.bib",19745,0,"",bibtex,selection_command
+2543,1747373,"examples/bibliography.bib",19747,0,"",bibtex,selection_command
+2544,1747506,"examples/bibliography.bib",19762,0,"",bibtex,selection_command
+2545,1748559,"examples/bibliography.bib",19773,0,"\n",bibtex,content
+2546,1749268,"examples/bibliography.bib",19774,0," ",bibtex,content
+2547,1749510,"examples/bibliography.bib",19774,2,"",bibtex,content
+2548,1751027,"examples/bibliography.bib",19773,1,"",bibtex,content
+2549,1751310,"examples/bibliography.bib",19773,0,"\n",bibtex,content
+2550,1752957,"examples/bibliography.bib",19774,0," ",bibtex,content
+2551,1753973,"examples/bibliography.bib",19776,0,"title = {Genie 2: A World Model for Interactive Agents},",bibtex,content
+2552,1754464,"examples/bibliography.bib",19832,0,"\n author = {DeepMind},\n year = {2025},\n url = {https://deepmind.google/discover/blog/genie-2-a-world-model-for-interactive-agents/},\n note = {DeepMind Blog, July 21, 2025},\n}",bibtex,content
+2553,1754823,"examples/bibliography.bib",20009,0,"",bibtex,selection_command
+2554,1755040,"examples/bibliography.bib",19968,0,"",bibtex,selection_command
+2555,1755291,"examples/bibliography.bib",19873,0,"",bibtex,selection_command
+2556,1755327,"examples/bibliography.bib",19856,0,"",bibtex,selection_command
+2557,1755463,"examples/bibliography.bib",19833,0,"",bibtex,selection_command
+2558,1755596,"examples/bibliography.bib",19774,0,"",bibtex,selection_command
+2559,1755700,"examples/bibliography.bib",19776,0,"",bibtex,selection_command
+2560,1756057,"examples/bibliography.bib",19784,0,"",bibtex,selection_command
+2561,1756232,"examples/bibliography.bib",19785,0,"",bibtex,selection_command
+2562,1764311,"examples/bibliography.bib",19785,1,"G",bibtex,selection_command
+2563,1764367,"examples/bibliography.bib",19785,5,"Genie",bibtex,selection_command
+2564,1764499,"examples/bibliography.bib",19785,7,"Genie 2",bibtex,selection_command
+2565,1764750,"examples/bibliography.bib",19785,8,"Genie 2:",bibtex,selection_command
+2566,1764853,"examples/bibliography.bib",19785,10,"Genie 2: A",bibtex,selection_command
+2567,1765009,"examples/bibliography.bib",19785,16,"Genie 2: A World",bibtex,selection_command
+2568,1765159,"examples/bibliography.bib",19785,22,"Genie 2: A World Model",bibtex,selection_command
+2569,1765309,"examples/bibliography.bib",19785,26,"Genie 2: A World Model for",bibtex,selection_command
+2570,1765477,"examples/bibliography.bib",19785,38,"Genie 2: A World Model for Interactive",bibtex,selection_command
+2571,1765646,"examples/bibliography.bib",19785,45,"Genie 2: A World Model for Interactive Agents",bibtex,selection_command
+2572,1766077,"examples/bibliography.bib",19785,45,"Genie 2: A large-scale foundation world model",bibtex,content
+2573,1766080,"examples/bibliography.bib",19830,0,"",bibtex,selection_keyboard
+2574,1766767,"examples/bibliography.bib",19829,0,"",bibtex,selection_command
+2575,1767000,"examples/bibliography.bib",19854,0,"",bibtex,selection_command
+2576,1767132,"examples/bibliography.bib",19871,0,"",bibtex,selection_command
+2577,1768603,"examples/bibliography.bib",19928,0,"",bibtex,selection_command
+2578,1773942,"examples/bibliography.bib",19870,0,"",bibtex,selection_mouse
+2579,1774315,"examples/bibliography.bib",19852,0,"",bibtex,selection_mouse
+2580,1775002,"examples/bibliography.bib",19845,0,"",bibtex,selection_command
+2581,1775077,"examples/bibliography.bib",19845,1,"D",bibtex,selection_command
+2582,1775114,"examples/bibliography.bib",19845,8,"DeepMind",bibtex,selection_command
+2583,1775415,"examples/bibliography.bib",19845,8,"    Jack Parker-Holder, Philip Ball, Jake Bruce, Vibhavari Dasagi, Kristian Holsheimer, Christos Kaplanis, Alexandre Moufarek, Guy Scully, Jeremy Shar, Jimmy Shi, Stephen Spencer, Jessica Yung, Michael Dennis, Sultan Kenjeyev, Shangbang Long, Vlad Mnih, Harris Chan, Maxime Gazeau, Bonnie Li, Fabio Pardo, Luyu Wang, Lei Zhang, Frederic Besse, Tim Harley, Anna Mitenkova, Jane Wang, Jeff Clune, Demis Hassabis, Raia Hadsell, Adrian Bolton, Satinder Singh, Tim Rocktäschel",bibtex,content
+2584,1775418,"examples/bibliography.bib",20316,0,"",bibtex,selection_keyboard
+2585,1776198,"examples/bibliography.bib",19852,0,"",bibtex,selection_command
+2586,1776444,"examples/bibliography.bib",19833,0,"",bibtex,selection_command
+2587,1777599,"examples/bibliography.bib",19835,0,"",bibtex,selection_command
+2588,1778048,"examples/bibliography.bib",19842,0,"",bibtex,selection_command
+2589,1778889,"examples/bibliography.bib",19843,0,"",bibtex,selection_command
+2590,1779083,"examples/bibliography.bib",19844,0,"",bibtex,selection_command
+2591,1779365,"examples/bibliography.bib",19845,0,"",bibtex,selection_command
+2592,1779461,"examples/bibliography.bib",19845,1," ",bibtex,selection_command
+2593,1779584,"examples/bibliography.bib",19845,2," ",bibtex,selection_command
+2594,1779758,"examples/bibliography.bib",19845,3," ",bibtex,selection_command
+2595,1779921,"examples/bibliography.bib",19845,4," ",bibtex,selection_command
+2596,1780525,"examples/bibliography.bib",19845,4,"",bibtex,content
+2597,1781187,"examples/bibliography.bib",20327,0,"",bibtex,selection_command
+2598,1781355,"examples/bibliography.bib",20344,0,"",bibtex,selection_command
+2599,1783573,"examples/bibliography.bib",20341,83,"",bibtex,content
+2600,1784664,"examples/bibliography.bib",20341,0,"https://deepmind.google/discover/blog/genie-2-a-large-scale-foundation-world-model/",bibtex,content
+2601,1784666,"examples/bibliography.bib",20424,0,"",bibtex,selection_keyboard
+2602,1784955,"examples/bibliography.bib",20423,0,"",bibtex,selection_command
+2603,1785302,"examples/bibliography.bib",20466,0,"",bibtex,selection_command
+2604,1787257,"examples/bibliography.bib",20465,0,"",bibtex,selection_command
+2605,1787407,"examples/bibliography.bib",20461,0,"",bibtex,selection_command
+2606,1787550,"examples/bibliography.bib",20459,0,"",bibtex,selection_command
+2607,1787689,"examples/bibliography.bib",20457,0,"",bibtex,selection_command
+2608,1787877,"examples/bibliography.bib",20452,0,"",bibtex,selection_command
+2609,1788122,"examples/bibliography.bib",20452,4,"",bibtex,content
+2610,1790734,"examples/bibliography.bib",20452,0,"D",bibtex,content
+2611,1790738,"examples/bibliography.bib",20453,0,"",bibtex,selection_keyboard
+2612,1790914,"examples/bibliography.bib",20453,0,"e",bibtex,content
+2613,1790919,"examples/bibliography.bib",20454,0,"",bibtex,selection_keyboard
+2614,1791059,"examples/bibliography.bib",20454,0,"c",bibtex,content
+2615,1791062,"examples/bibliography.bib",20455,0,"",bibtex,selection_keyboard
+2616,1791140,"examples/bibliography.bib",20455,0,"e",bibtex,content
+2617,1791142,"examples/bibliography.bib",20456,0,"",bibtex,selection_keyboard
+2618,1791264,"examples/bibliography.bib",20456,0,"m",bibtex,content
+2619,1791266,"examples/bibliography.bib",20457,0,"",bibtex,selection_keyboard
+2620,1791431,"examples/bibliography.bib",20457,0,"b",bibtex,content
+2621,1791434,"examples/bibliography.bib",20458,0,"",bibtex,selection_keyboard
+2622,1791486,"examples/bibliography.bib",20458,0,"e",bibtex,content
+2623,1791487,"examples/bibliography.bib",20459,0,"",bibtex,selection_keyboard
+2624,1791538,"examples/bibliography.bib",20459,0,"r",bibtex,content
+2625,1791539,"examples/bibliography.bib",20460,0,"",bibtex,selection_keyboard
+2626,1792073,"examples/bibliography.bib",20459,0,"",bibtex,selection_command
+2627,1793647,"examples/bibliography.bib",20461,0,"",bibtex,selection_command
+2628,1796907,"examples/bibliography.bib",20461,2,"",bibtex,content
+2629,1797450,"examples/bibliography.bib",20461,0,"4",bibtex,content
+2630,1797452,"examples/bibliography.bib",20462,0,"",bibtex,selection_keyboard
+2631,1797908,"examples/bibliography.bib",20461,0,"",bibtex,selection_command
+2632,1798160,"examples/bibliography.bib",20462,0,"",bibtex,selection_command
+2633,1798248,"examples/bibliography.bib",20464,0,"",bibtex,selection_command
+2634,1798801,"examples/bibliography.bib",20467,0,"",bibtex,selection_command
+2635,1799454,"examples/bibliography.bib",20467,1,"3",bibtex,content
+2636,1799919,"examples/bibliography.bib",20467,1,"4",bibtex,content
+2637,1931837,"examples/bibliography.bib",20372,0,"",bibtex,selection_command
+2638,1931961,"examples/bibliography.bib",20330,0,"",bibtex,selection_command
+2639,1932115,"examples/bibliography.bib",19873,0,"",bibtex,selection_command
+2640,1932288,"examples/bibliography.bib",19833,481," author = {Jack Parker-Holder, Philip Ball, Jake Bruce, Vibhavari Dasagi, Kristian Holsheimer, Christos Kaplanis, Alexandre Moufarek, Guy Scully, Jeremy Shar, Jimmy Shi, Stephen Spencer, Jessica Yung, Michael Dennis, Sultan Kenjeyev, Shangbang Long, Vlad Mnih, Harris Chan, Maxime Gazeau, Bonnie Li, Fabio Pardo, Luyu Wang, Lei Zhang, Frederic Besse, Tim Harley, Anna Mitenkova, Jane Wang, Jeff Clune, Demis Hassabis, Raia Hadsell, Adrian Bolton, Satinder Singh, Tim Rocktäschel},",bibtex,selection_command
+2641,1933052,"examples/bibliography.bib",19835,0,"",bibtex,selection_command
+2642,1938454,"examples/bibliography.bib",19833,0,"",bibtex,selection_command
+2643,1948395,"examples/bibliography.bib",19833,0," author = {Jack Parker-Holder and Philip Ball and Jake Bruce and Vibhavari Dasagi and Kristian Holsheimer and Christos Kaplanis and Alexandre Moufarek and Guy Scully and Jeremy Shar and Jimmy Shi and Stephen Spencer and Jessica Yung and Michael Dennis and Sultan Kenjeyev and Shangbang Long and Vlad Mnih and Harris Chan and Maxime Gazeau and Bonnie Li and Fabio Pardo and Luyu Wang and Lei Zhang and Frederic Besse and Tim Harley and Anna Mitenkova and Jane Wang and Jeff Clune and Demis Hassabis and Raia Hadsell and Adrian Bolton and Satinder Singh and Tim Rocktäschel},\n",bibtex,content
+2644,1948410,"examples/bibliography.bib",20408,482,"",bibtex,content
+2645,1951864,"examples/bibliography.bib",19873,0,"",bibtex,selection_command
+2646,1953208,"examples/bibliography.bib",20406,0,"",bibtex,selection_command
+2647,1953798,"examples/bibliography.bib",19833,0,"",bibtex,selection_command
+2648,1955055,"examples/bibliography.bib",20408,0,"",bibtex,selection_command
+2649,1955307,"examples/bibliography.bib",20425,0,"",bibtex,selection_command
+2650,1955332,"examples/bibliography.bib",20520,0,"",bibtex,selection_command
+2651,1955361,"examples/bibliography.bib",20564,0,"",bibtex,selection_command
+2652,1955981,"examples/bibliography.bib",20565,0,"\n",bibtex,content
+2653,1956261,"examples/bibliography.bib",20566,0,"\n",bibtex,content
+2654,1956501,"examples/bibliography.bib",20567,0,"@",bibtex,content
+2655,1956502,"examples/bibliography.bib",20568,0,"",bibtex,selection_keyboard
+2656,1956711,"examples/bibliography.bib",20568,0,"m",bibtex,content
+2657,1956712,"examples/bibliography.bib",20569,0,"",bibtex,selection_keyboard
+2658,1956761,"examples/bibliography.bib",20569,0,"i",bibtex,content
+2659,1956762,"examples/bibliography.bib",20570,0,"",bibtex,selection_keyboard
+2660,1956862,"examples/bibliography.bib",20570,0,"s",bibtex,content
+2661,1956863,"examples/bibliography.bib",20571,0,"",bibtex,selection_keyboard
+2662,1956933,"examples/bibliography.bib",20571,0,"c",bibtex,content
+2663,1956936,"examples/bibliography.bib",20572,0,"",bibtex,selection_keyboard
+2664,1957222,"examples/bibliography.bib",20571,0,"",bibtex,selection_command
+2665,1957700,"examples/bibliography.bib",20566,6,"",bibtex,content
+2666,1958046,"examples/bibliography.bib",20564,0,"",bibtex,selection_command
+2667,1958358,"examples/bibliography.bib",20520,0,"",bibtex,selection_command
+2668,1958614,"examples/bibliography.bib",20425,0,"",bibtex,selection_command
+2669,1958642,"examples/bibliography.bib",20408,0,"",bibtex,selection_command
+2670,1958672,"examples/bibliography.bib",19833,0,"",bibtex,selection_command
+2671,1958702,"examples/bibliography.bib",19774,0,"",bibtex,selection_command
+2672,1958829,"examples/bibliography.bib",19833,0,"",bibtex,selection_command
+2673,1959003,"examples/bibliography.bib",20408,0,"",bibtex,selection_command
+2674,1959130,"examples/bibliography.bib",20425,0,"",bibtex,selection_command
+2675,1959276,"examples/bibliography.bib",20520,0,"",bibtex,selection_command
+2676,1959416,"examples/bibliography.bib",20564,0,"",bibtex,selection_command
+2677,1959679,"examples/bibliography.bib",20564,1,"}",bibtex,selection_command
+2678,1959789,"examples/bibliography.bib",20520,45," note = {DeepMind Blog, December 4, 2024},\n}",bibtex,selection_command
+2679,1960049,"examples/bibliography.bib",20425,140," url = {https://deepmind.google/discover/blog/genie-2-a-large-scale-foundation-world-model/},\n note = {DeepMind Blog, December 4, 2024},\n}",bibtex,selection_command
+2680,1960075,"examples/bibliography.bib",20408,157," year = {2025},\n url = {https://deepmind.google/discover/blog/genie-2-a-large-scale-foundation-world-model/},\n note = {DeepMind Blog, December 4, 2024},\n}",bibtex,selection_command
+2681,1960099,"examples/bibliography.bib",19833,732," author = {Jack Parker-Holder and Philip Ball and Jake Bruce and Vibhavari Dasagi and Kristian Holsheimer and Christos Kaplanis and Alexandre Moufarek and Guy Scully and Jeremy Shar and Jimmy Shi and Stephen Spencer and Jessica Yung and Michael Dennis and Sultan Kenjeyev and Shangbang Long and Vlad Mnih and Harris Chan and Maxime Gazeau and Bonnie Li and Fabio Pardo and Luyu Wang and Lei Zhang and Frederic Besse and Tim Harley and Anna Mitenkova and Jane Wang and Jeff Clune and Demis Hassabis and Raia Hadsell and Adrian Bolton and Satinder Singh and Tim Rocktäschel},\n year = {2025},\n url = {https://deepmind.google/discover/blog/genie-2-a-large-scale-foundation-world-model/},\n note = {DeepMind Blog, December 4, 2024},\n}",bibtex,selection_command
+2682,1960132,"examples/bibliography.bib",19774,791," title = {Genie 2: A large-scale foundation world model},\n author = {Jack Parker-Holder and Philip Ball and Jake Bruce and Vibhavari Dasagi and Kristian Holsheimer and Christos Kaplanis and Alexandre Moufarek and Guy Scully and Jeremy Shar and Jimmy Shi and Stephen Spencer and Jessica Yung and Michael Dennis and Sultan Kenjeyev and Shangbang Long and Vlad Mnih and Harris Chan and Maxime Gazeau and Bonnie Li and Fabio Pardo and Luyu Wang and Lei Zhang and Frederic Besse and Tim Harley and Anna Mitenkova and Jane Wang and Jeff Clune and Demis Hassabis and Raia Hadsell and Adrian Bolton and Satinder Singh and Tim Rocktäschel},\n year = {2025},\n url = {https://deepmind.google/discover/blog/genie-2-a-large-scale-foundation-world-model/},\n note = {DeepMind Blog, December 4, 2024},\n}",bibtex,selection_command
+2683,1960237,"examples/bibliography.bib",19748,817,"@misc{deepmind2025genie2,\n title = {Genie 2: A large-scale foundation world model},\n author = {Jack Parker-Holder and Philip Ball and Jake Bruce and Vibhavari Dasagi and Kristian Holsheimer and Christos Kaplanis and Alexandre Moufarek and Guy Scully and Jeremy Shar and Jimmy Shi and Stephen Spencer and Jessica Yung and Michael Dennis and Sultan Kenjeyev and Shangbang Long and Vlad Mnih and Harris Chan and Maxime Gazeau and Bonnie Li and Fabio Pardo and Luyu Wang and Lei Zhang and Frederic Besse and Tim Harley and Anna Mitenkova and Jane Wang and Jeff Clune and Demis Hassabis and Raia Hadsell and Adrian Bolton and Satinder Singh and Tim Rocktäschel},\n year = {2025},\n url = {https://deepmind.google/discover/blog/genie-2-a-large-scale-foundation-world-model/},\n note = {DeepMind Blog, December 4, 2024},\n}",bibtex,selection_command
+2684,1960589,"examples/bibliography.bib",19748,0,"",bibtex,selection_command
+2685,1960770,"examples/bibliography.bib",19774,0,"",bibtex,selection_command
+2686,1961014,"examples/bibliography.bib",20566,0,"",bibtex,selection_command
+2687,1961368,"examples/bibliography.bib",20566,0,"\n@misc{deepmind2025genie2,\n title = {Genie 2: A large-scale foundation world model},\n author = {Jack Parker-Holder and Philip Ball and Jake Bruce and Vibhavari Dasagi and Kristian Holsheimer and Christos Kaplanis and Alexandre Moufarek and Guy Scully and Jeremy Shar and Jimmy Shi and Stephen Spencer and Jessica Yung and Michael Dennis and Sultan Kenjeyev and Shangbang Long and Vlad Mnih and Harris Chan and Maxime Gazeau and Bonnie Li and Fabio Pardo and Luyu Wang and Lei Zhang and Frederic Besse and Tim Harley and Anna Mitenkova and Jane Wang and Jeff Clune and Demis Hassabis and Raia Hadsell and Adrian Bolton and Satinder Singh and Tim Rocktäschel},\n year = {2025},\n url = {https://deepmind.google/discover/blog/genie-2-a-large-scale-foundation-world-model/},\n note = {DeepMind Blog, December 4, 2024},\n}",bibtex,content
+2688,1961372,"examples/bibliography.bib",20567,0,"",bibtex,selection_command
+2689,1961956,"examples/bibliography.bib",20568,0,"",bibtex,selection_command
+2690,1962121,"examples/bibliography.bib",20572,0,"",bibtex,selection_command
+2691,1962289,"examples/bibliography.bib",20590,0,"",bibtex,selection_command
+2692,1962853,"examples/bibliography.bib",20590,1,"3",bibtex,content
+2693,1964739,"examples/bibliography.bib",20616,0,"",bibtex,selection_command
+2694,1965065,"examples/bibliography.bib",20615,0,"",bibtex,selection_command
+2695,1965197,"examples/bibliography.bib",20613,0,"",bibtex,selection_command
+2696,1965319,"examples/bibliography.bib",20611,0,"",bibtex,selection_command
+2697,1965492,"examples/bibliography.bib",20610,0,"",bibtex,selection_command
+2698,1965697,"examples/bibliography.bib",20610,1,"3",bibtex,content
+2699,1975097,"examples/bibliography.bib",20604,0,"",bibtex,selection_command
+2700,1975235,"examples/bibliography.bib",20603,0,"",bibtex,selection_command
+2701,1975329,"examples/bibliography.bib",20603,1,"{",bibtex,selection_command
+2702,1975391,"examples/bibliography.bib",20603,6,"{Genie",bibtex,selection_command
+2703,1976045,"examples/bibliography.bib",20608,0,"",bibtex,selection_command
+2704,1976668,"examples/bibliography.bib",20604,0,"",bibtex,selection_command
+2705,1976763,"examples/bibliography.bib",20604,1,"G",bibtex,selection_command
+2706,1976840,"examples/bibliography.bib",20604,5,"Genie",bibtex,selection_command
+2707,1976965,"examples/bibliography.bib",20604,7,"Genie 3",bibtex,selection_command
+2708,1977217,"examples/bibliography.bib",20604,8,"Genie 3:",bibtex,selection_command
+2709,1977249,"examples/bibliography.bib",20604,10,"Genie 3: A",bibtex,selection_command
+2710,1977281,"examples/bibliography.bib",20604,16,"Genie 3: A large",bibtex,selection_command
+2711,1977315,"examples/bibliography.bib",20604,17,"Genie 3: A large-",bibtex,selection_command
+2712,1977431,"examples/bibliography.bib",20604,22,"Genie 3: A large-scale",bibtex,selection_command
+2713,1977600,"examples/bibliography.bib",20604,33,"Genie 3: A large-scale foundation",bibtex,selection_command
+2714,1977762,"examples/bibliography.bib",20604,39,"Genie 3: A large-scale foundation world",bibtex,selection_command
+2715,1977902,"examples/bibliography.bib",20604,45,"Genie 3: A large-scale foundation world model",bibtex,selection_command
+2716,1978545,"examples/bibliography.bib",20604,45,"Genie 3: A new frontier for world models",bibtex,content
+2717,1978547,"examples/bibliography.bib",20644,0,"",bibtex,selection_keyboard
+2718,1979162,"examples/bibliography.bib",20645,0,"",bibtex,selection_command
+2719,1979544,"examples/bibliography.bib",20593,0,"",bibtex,selection_command
+2720,1980042,"examples/bibliography.bib",20647,0,"",bibtex,selection_command
+2721,1980146,"examples/bibliography.bib",20649,0,"",bibtex,selection_command
+2722,1980326,"examples/bibliography.bib",20656,0,"",bibtex,selection_command
+2723,1980483,"examples/bibliography.bib",20658,0,"",bibtex,selection_command
+2724,1980664,"examples/bibliography.bib",20659,0,"",bibtex,selection_command
+2725,1981117,"examples/bibliography.bib",20659,560,"",bibtex,content
+2726,1983659,"examples/bibliography.bib",20659,0," Jack Parker-Holder and Shlomi Fruchter",bibtex,content
+2727,1983662,"examples/bibliography.bib",20701,0,"",bibtex,selection_keyboard
+2728,1983934,"examples/bibliography.bib",20700,0,"",bibtex,selection_command
+2729,1984457,"examples/bibliography.bib",20693,0,"",bibtex,selection_command
+2730,1984596,"examples/bibliography.bib",20686,0,"",bibtex,selection_command
+2731,1984736,"examples/bibliography.bib",20682,0,"",bibtex,selection_command
+2732,1984869,"examples/bibliography.bib",20675,0,"",bibtex,selection_command
+2733,1985203,"examples/bibliography.bib",20674,0,"",bibtex,selection_command
+2734,1985553,"examples/bibliography.bib",20668,0,"",bibtex,selection_command
+2735,1985860,"examples/bibliography.bib",20663,0,"",bibtex,selection_command
+2736,1986453,"examples/bibliography.bib",20662,1,"",bibtex,content
+2737,1986823,"examples/bibliography.bib",20661,1,"",bibtex,content
+2738,1986973,"examples/bibliography.bib",20660,1,"",bibtex,content
+2739,1987187,"examples/bibliography.bib",20659,1,"",bibtex,content
+2740,1987354,"examples/bibliography.bib",20658,0,"",bibtex,selection_command
+2741,1987571,"examples/bibliography.bib",20711,0,"",bibtex,selection_command
+2742,1988114,"examples/bibliography.bib",20728,0,"",bibtex,selection_command
+2743,2028460,"examples/bibliography.bib",20823,0,"",bibtex,selection_command
+2744,2028603,"examples/bibliography.bib",20856,0,"",bibtex,selection_command
+2745,2028827,"examples/bibliography.bib",20856,1,"}",bibtex,selection_command
+2746,2028995,"examples/bibliography.bib",20812,45," note = {DeepMind Blog, December 4, 2024},\n}",bibtex,selection_command
+2747,2029259,"examples/bibliography.bib",20717,140," url = {https://deepmind.google/discover/blog/genie-2-a-large-scale-foundation-world-model/},\n note = {DeepMind Blog, December 4, 2024},\n}",bibtex,selection_command
+2748,2029284,"examples/bibliography.bib",20700,157," year = {2025},\n url = {https://deepmind.google/discover/blog/genie-2-a-large-scale-foundation-world-model/},\n note = {DeepMind Blog, December 4, 2024},\n}",bibtex,selection_command
+2749,2029314,"examples/bibliography.bib",20647,210," author = {Jack Parker-Holder and Shlomi Fruchter},\n year = {2025},\n url = {https://deepmind.google/discover/blog/genie-2-a-large-scale-foundation-world-model/},\n note = {DeepMind Blog, December 4, 2024},\n}",bibtex,selection_command
+2750,2029347,"examples/bibliography.bib",20593,264," title = {Genie 3: A new frontier for world models},\n author = {Jack Parker-Holder and Shlomi Fruchter},\n year = {2025},\n url = {https://deepmind.google/discover/blog/genie-2-a-large-scale-foundation-world-model/},\n note = {DeepMind Blog, December 4, 2024},\n}",bibtex,selection_command
+2751,2029460,"examples/bibliography.bib",20567,290,"@misc{deepmind2025genie3,\n title = {Genie 3: A new frontier for world models},\n author = {Jack Parker-Holder and Shlomi Fruchter},\n year = {2025},\n url = {https://deepmind.google/discover/blog/genie-2-a-large-scale-foundation-world-model/},\n note = {DeepMind Blog, December 4, 2024},\n}",bibtex,selection_command
+2752,2029773,"examples/bibliography.bib",20567,290,"@article{genie3,\n title = {Genie 3: A New Frontier for World Models},\n author = {Philip J. Ball and Jakob Bauer and Frank Belletti and Bethanie Brownfield and Ariel Ephrat and Shlomi Fruchter and Agrim Gupta and Kristian Holsheimer and Aleksander Holynski and Jiri Hron and Christos Kaplanis and Marjorie Limont and Matt McGill and Yanko Oliveira and Jack Parker-Holder and Frank Perbet and Guy Scully and Jeremy Shar and Stephen Spencer and Omer Tov and Ruben Villegas and Emma Wang and Jessica Yung and Cip Baetu and Jordi Berbel and David Bridson and Jake Bruce and Gavin Buttimore and Sarah Chakera and Bilva Chandra and Paul Collins and Alex Cullum and Bogdan Damoc and Vibha Dasagi and Maxime Gazeau and Charles Gbadamosi and Woohyun Han and Ed Hirst and Ashyana Kachra and Lucie Kerley and Kristian Kjems and Eva Knoepfel and Vika Koriakin and Jessica Lo and Cong Lu and Zeb Mehring and Alex Moufarek and Henna Nandwani and Valeria Oliveira and Fabio Pardo and Jane Park and Andrew Pierson and Ben Poole and Helen Ran and Tim Salimans and Manuel Sanchez and Igor Saprykin and Amy Shen and Sailesh Sidhwani and Duncan Smith and Joe Stanton and Hamish Tomlinson and Dimple Vijaykumar and Luyu Wang and Piers Wingfield and Nat Wong and Keyang Xu and Christopher Yew and Nick Young and Vadim Zubov and Douglas Eck and Dumitru Erhan and Koray Kavukcuoglu and Demis Hassabis and Zoubin Gharamani and Raia Hadsell and A{\""a}ron van den Oord and Inbar Mosseri and Adrian Bolton and Satinder Singh and Tim Rockt{\""a}schel},\n year = {2025},\n url = {}\n}\n",bibtex,content
+2753,2029778,"examples/bibliography.bib",22155,0,"",bibtex,selection_keyboard
+2754,2030238,"examples/bibliography.bib",20567,0,"",bibtex,selection_command
+2755,2030730,"examples/bibliography.bib",20646,0,"",bibtex,selection_command
+2756,2030924,"examples/bibliography.bib",22106,0,"",bibtex,selection_command
+2757,2031175,"examples/bibliography.bib",22132,0,"",bibtex,selection_command
+2758,2031207,"examples/bibliography.bib",22153,0,"",bibtex,selection_command
+2759,2031236,"examples/bibliography.bib",22155,0,"",bibtex,selection_command
+2760,2031647,"examples/bibliography.bib",22154,1,"",bibtex,content
+2761,2031655,"examples/bibliography.bib",22153,0,"",bibtex,selection_command
+2762,2031789,"examples/bibliography.bib",22132,0,"",bibtex,selection_command
+2763,2032046,"examples/bibliography.bib",22106,0,"",bibtex,selection_command
+2764,2032085,"examples/bibliography.bib",20646,0,"",bibtex,selection_command
+2765,2032190,"examples/bibliography.bib",20584,0,"",bibtex,selection_command
+2766,2032359,"examples/bibliography.bib",20567,0,"",bibtex,selection_command
+2767,2032610,"examples/bibliography.bib",20568,0,"",bibtex,selection_command
+2768,2032836,"examples/bibliography.bib",20575,0,"",bibtex,selection_command
+2769,2033012,"examples/bibliography.bib",20576,0,"",bibtex,selection_command
+2770,2033226,"examples/bibliography.bib",20582,0,"",bibtex,selection_command
+2771,2033911,"examples/bibliography.bib",20576,0,"",bibtex,selection_command
+2772,2043456,"examples/bibliography.bib",20576,6,"",bibtex,content
+2773,2044270,"examples/bibliography.bib",20576,0,"d",bibtex,content
+2774,2044277,"examples/bibliography.bib",20577,0,"",bibtex,selection_keyboard
+2775,2044320,"examples/bibliography.bib",20577,0,"e",bibtex,content
+2776,2044325,"examples/bibliography.bib",20578,0,"",bibtex,selection_keyboard
+2777,2044455,"examples/bibliography.bib",20578,0,"e",bibtex,content
+2778,2044459,"examples/bibliography.bib",20579,0,"",bibtex,selection_keyboard
+2779,2044511,"examples/bibliography.bib",20579,0,"p",bibtex,content
+2780,2044515,"examples/bibliography.bib",20580,0,"",bibtex,selection_keyboard
+2781,2044602,"examples/bibliography.bib",20580,0,"m",bibtex,content
+2782,2044606,"examples/bibliography.bib",20581,0,"",bibtex,selection_keyboard
+2783,2044699,"examples/bibliography.bib",20581,0,"i",bibtex,content
+2784,2044702,"examples/bibliography.bib",20582,0,"",bibtex,selection_keyboard
+2785,2044839,"examples/bibliography.bib",20582,0,"n",bibtex,content
+2786,2044844,"examples/bibliography.bib",20583,0,"",bibtex,selection_keyboard
+2787,2044955,"examples/bibliography.bib",20583,0,"d",bibtex,content
+2788,2044960,"examples/bibliography.bib",20584,0,"",bibtex,selection_keyboard
+2789,2045368,"examples/bibliography.bib",20584,0,"2",bibtex,content
+2790,2045375,"examples/bibliography.bib",20585,0,"",bibtex,selection_keyboard
+2791,2045504,"examples/bibliography.bib",20585,0,"2",bibtex,content
+2792,2045508,"examples/bibliography.bib",20586,0,"",bibtex,selection_keyboard
+2793,2045535,"examples/bibliography.bib",20586,0,"0",bibtex,content
+2794,2045537,"examples/bibliography.bib",20587,0,"",bibtex,selection_keyboard
+2795,2045652,"examples/bibliography.bib",20587,0,"2",bibtex,content
+2796,2045654,"examples/bibliography.bib",20588,0,"",bibtex,selection_keyboard
+2797,2046058,"examples/bibliography.bib",20587,1,"",bibtex,content
+2798,2046224,"examples/bibliography.bib",20586,1,"",bibtex,content
+2799,2046364,"examples/bibliography.bib",20585,1,"",bibtex,content
+2800,2047356,"examples/bibliography.bib",20585,0,"0",bibtex,content
+2801,2047362,"examples/bibliography.bib",20586,0,"",bibtex,selection_keyboard
+2802,2047420,"examples/bibliography.bib",20586,0,"2",bibtex,content
+2803,2047424,"examples/bibliography.bib",20587,0,"",bibtex,selection_keyboard
+2804,2047542,"examples/bibliography.bib",20587,0,"5",bibtex,content
+2805,2047546,"examples/bibliography.bib",20588,0,"",bibtex,selection_keyboard
+2806,2047804,"examples/bibliography.bib",20588,0,"g",bibtex,content
+2807,2047810,"examples/bibliography.bib",20589,0,"",bibtex,selection_keyboard
+2808,2047938,"examples/bibliography.bib",20589,0,"n",bibtex,content
+2809,2047941,"examples/bibliography.bib",20590,0,"",bibtex,selection_keyboard
+2810,2048255,"examples/bibliography.bib",20589,1,"",bibtex,content
+2811,2048388,"examples/bibliography.bib",20589,0,"e",bibtex,content
+2812,2048390,"examples/bibliography.bib",20590,0,"",bibtex,selection_keyboard
+2813,2048465,"examples/bibliography.bib",20590,0,"n",bibtex,content
+2814,2048467,"examples/bibliography.bib",20591,0,"",bibtex,selection_keyboard
+2815,2048509,"examples/bibliography.bib",20591,0,"i",bibtex,content
+2816,2048513,"examples/bibliography.bib",20592,0,"",bibtex,selection_keyboard
+2817,2048554,"examples/bibliography.bib",20592,0,"e",bibtex,content
+2818,2048559,"examples/bibliography.bib",20593,0,"",bibtex,selection_keyboard
+2819,2048739,"examples/bibliography.bib",20593,0,"3",bibtex,content
+2820,2048742,"examples/bibliography.bib",20594,0,"",bibtex,selection_keyboard
+2821,2049024,"examples/bibliography.bib",20593,0,"",bibtex,selection_command
+2822,2051310,"examples/bibliography.bib",20566,0,"",bibtex,selection_command
+2823,2051556,"examples/bibliography.bib",20564,0,"",bibtex,selection_command
+2824,2051591,"examples/bibliography.bib",20546,0,"",bibtex,selection_command
+2825,2051620,"examples/bibliography.bib",20451,0,"",bibtex,selection_command
+2826,2051656,"examples/bibliography.bib",20423,0,"",bibtex,selection_command
+2827,2051740,"examples/bibliography.bib",19859,0,"",bibtex,selection_command
+2828,2051885,"examples/bibliography.bib",19800,0,"",bibtex,selection_command
+2829,2052057,"examples/bibliography.bib",19772,0,"",bibtex,selection_command
+2830,2052293,"examples/bibliography.bib",19754,0,"",bibtex,selection_command
+2831,2052687,"examples/bibliography.bib",19755,0,"",bibtex,selection_command
+2832,2052939,"examples/bibliography.bib",19756,0,"",bibtex,selection_command
+2833,2052970,"examples/bibliography.bib",19757,0,"",bibtex,selection_command
+2834,2053002,"examples/bibliography.bib",19758,0,"",bibtex,selection_command
+2835,2053039,"examples/bibliography.bib",19759,0,"",bibtex,selection_command
+2836,2053068,"examples/bibliography.bib",19760,0,"",bibtex,selection_command
+2837,2053100,"examples/bibliography.bib",19761,0,"",bibtex,selection_command
+2838,2053137,"examples/bibliography.bib",19762,0,"",bibtex,selection_command
+2839,2053170,"examples/bibliography.bib",19763,0,"",bibtex,selection_command
+2840,2053204,"examples/bibliography.bib",19764,0,"",bibtex,selection_command
+2841,2053239,"examples/bibliography.bib",19765,0,"",bibtex,selection_command
+2842,2053270,"examples/bibliography.bib",19766,0,"",bibtex,selection_command
+2843,2053496,"examples/bibliography.bib",19765,0,"",bibtex,selection_command
+2844,2053654,"examples/bibliography.bib",19765,1,"4",bibtex,content
+2845,2054031,"examples/bibliography.bib",19791,0,"",bibtex,selection_command
+2846,2054281,"examples/bibliography.bib",19850,0,"",bibtex,selection_command
+2847,2054313,"examples/bibliography.bib",20423,0,"",bibtex,selection_command
+2848,2054343,"examples/bibliography.bib",20442,0,"",bibtex,selection_command
+2849,2054374,"examples/bibliography.bib",20537,0,"",bibtex,selection_command
+2850,2054405,"examples/bibliography.bib",20564,0,"",bibtex,selection_command
+2851,2054438,"examples/bibliography.bib",20566,0,"",bibtex,selection_command
+2852,2054474,"examples/bibliography.bib",20584,0,"",bibtex,selection_command
+2853,2054609,"examples/bibliography.bib",20613,0,"",bibtex,selection_command
+2854,2054813,"examples/bibliography.bib",20675,0,"",bibtex,selection_command
+2855,2054974,"examples/bibliography.bib",22135,0,"",bibtex,selection_command
+2856,2055111,"examples/bibliography.bib",22161,0,"",bibtex,selection_command
+2857,2055349,"examples/bibliography.bib",22165,0,"",bibtex,selection_command
+2858,2103261,"examples/bibliography.bib",22144,0,"",bibtex,selection_command
+2859,2103505,"examples/bibliography.bib",22118,0,"",bibtex,selection_command
+2860,2103534,"examples/bibliography.bib",20658,0,"",bibtex,selection_command
+2861,2103566,"examples/bibliography.bib",20596,0,"",bibtex,selection_command
+2862,2103599,"examples/bibliography.bib",20567,0,"",bibtex,selection_command
+2863,2103633,"examples/bibliography.bib",20566,0,"",bibtex,selection_command
+2864,2103839,"examples/bibliography.bib",20564,0,"",bibtex,selection_command
+2865,2104264,"examples/bibliography.bib",20564,1,"}",bibtex,selection_command
+2866,2104368,"examples/bibliography.bib",20520,45," note = {DeepMind Blog, December 4, 2024},\n}",bibtex,selection_command
+2867,2104626,"examples/bibliography.bib",20425,140," url = {https://deepmind.google/discover/blog/genie-2-a-large-scale-foundation-world-model/},\n note = {DeepMind Blog, December 4, 2024},\n}",bibtex,selection_command
+2868,2104655,"examples/bibliography.bib",20408,157," year = {2025},\n url = {https://deepmind.google/discover/blog/genie-2-a-large-scale-foundation-world-model/},\n note = {DeepMind Blog, December 4, 2024},\n}",bibtex,selection_command
+2869,2104684,"examples/bibliography.bib",19833,732," author = {Jack Parker-Holder and Philip Ball and Jake Bruce and Vibhavari Dasagi and Kristian Holsheimer and Christos Kaplanis and Alexandre Moufarek and Guy Scully and Jeremy Shar and Jimmy Shi and Stephen Spencer and Jessica Yung and Michael Dennis and Sultan Kenjeyev and Shangbang Long and Vlad Mnih and Harris Chan and Maxime Gazeau and Bonnie Li and Fabio Pardo and Luyu Wang and Lei Zhang and Frederic Besse and Tim Harley and Anna Mitenkova and Jane Wang and Jeff Clune and Demis Hassabis and Raia Hadsell and Adrian Bolton and Satinder Singh and Tim Rocktäschel},\n year = {2025},\n url = {https://deepmind.google/discover/blog/genie-2-a-large-scale-foundation-world-model/},\n note = {DeepMind Blog, December 4, 2024},\n}",bibtex,selection_command
+2870,2104922,"examples/bibliography.bib",19774,791," title = {Genie 2: A large-scale foundation world model},\n author = {Jack Parker-Holder and Philip Ball and Jake Bruce and Vibhavari Dasagi and Kristian Holsheimer and Christos Kaplanis and Alexandre Moufarek and Guy Scully and Jeremy Shar and Jimmy Shi and Stephen Spencer and Jessica Yung and Michael Dennis and Sultan Kenjeyev and Shangbang Long and Vlad Mnih and Harris Chan and Maxime Gazeau and Bonnie Li and Fabio Pardo and Luyu Wang and Lei Zhang and Frederic Besse and Tim Harley and Anna Mitenkova and Jane Wang and Jeff Clune and Demis Hassabis and Raia Hadsell and Adrian Bolton and Satinder Singh and Tim Rocktäschel},\n year = {2025},\n url = {https://deepmind.google/discover/blog/genie-2-a-large-scale-foundation-world-model/},\n note = {DeepMind Blog, December 4, 2024},\n}",bibtex,selection_command
+2871,2105072,"examples/bibliography.bib",19748,817,"@misc{deepmind2024genie2,\n title = {Genie 2: A large-scale foundation world model},\n author = {Jack Parker-Holder and Philip Ball and Jake Bruce and Vibhavari Dasagi and Kristian Holsheimer and Christos Kaplanis and Alexandre Moufarek and Guy Scully and Jeremy Shar and Jimmy Shi and Stephen Spencer and Jessica Yung and Michael Dennis and Sultan Kenjeyev and Shangbang Long and Vlad Mnih and Harris Chan and Maxime Gazeau and Bonnie Li and Fabio Pardo and Luyu Wang and Lei Zhang and Frederic Besse and Tim Harley and Anna Mitenkova and Jane Wang and Jeff Clune and Demis Hassabis and Raia Hadsell and Adrian Bolton and Satinder Singh and Tim Rocktäschel},\n year = {2025},\n url = {https://deepmind.google/discover/blog/genie-2-a-large-scale-foundation-world-model/},\n note = {DeepMind Blog, December 4, 2024},\n}",bibtex,selection_command
+2872,2105424,"examples/bibliography.bib",19748,817,"@article{parkerholder2024genie2,\n title = {Genie 2: A Large-Scale Foundation World Model},\n author = {Jack Parker-Holder and Philip Ball and Jake Bruce and Vibhavari Dasagi and Kristian Holsheimer and Christos Kaplanis and Alexandre Moufarek and Guy Scully and Jeremy Shar and Jimmy Shi and Stephen Spencer and Jessica Yung and Michael Dennis and Sultan Kenjeyev and Shangbang Long and Vlad Mnih and Harris Chan and Maxime Gazeau and Bonnie Li and Fabio Pardo and Luyu Wang and Lei Zhang and Frederic Besse and Tim Harley and Anna Mitenkova and Jane Wang and Jeff Clune and Demis Hassabis and Raia Hadsell and Adrian Bolton and Satinder Singh and Tim Rockt{\""a}schel},\n year = {2024},\n url = {https://deepmind.google/discover/blog/genie-2-a-large-scale-foundation-world-model/}\n}\n",bibtex,content
+2873,2105427,"examples/bibliography.bib",20566,0,"",bibtex,selection_keyboard
+2874,2105845,"examples/bibliography.bib",19748,0,"",bibtex,selection_command
+2875,2106116,"examples/bibliography.bib",19749,0,"",bibtex,selection_command
+2876,2106275,"examples/bibliography.bib",19756,0,"",bibtex,selection_command
+2877,2106465,"examples/bibliography.bib",19757,0,"",bibtex,selection_command
+2878,2108327,"examples/bibliography.bib",19790,0,"",bibtex,selection_command
+2879,2108576,"examples/bibliography.bib",19857,0,"",bibtex,selection_command
+2880,2108607,"examples/bibliography.bib",20443,0,"",bibtex,selection_command
+2881,2108637,"examples/bibliography.bib",20469,0,"",bibtex,selection_command
+2882,2108670,"examples/bibliography.bib",20564,0,"",bibtex,selection_command
+2883,2108705,"examples/bibliography.bib",20566,0,"",bibtex,selection_command
+2884,2109339,"examples/bibliography.bib",20566,1,"",bibtex,content
+2885,2110062,"examples/bibliography.bib",20567,0,"",bibtex,selection_command
+2886,2110321,"examples/bibliography.bib",20596,0,"",bibtex,selection_command
+2887,2110357,"examples/bibliography.bib",20658,0,"",bibtex,selection_command
+2888,2110385,"examples/bibliography.bib",22118,0,"",bibtex,selection_command
+2889,2110415,"examples/bibliography.bib",22144,0,"",bibtex,selection_command
+2890,2110448,"examples/bibliography.bib",22165,0,"",bibtex,selection_command
+2891,2152775,"examples/bibliography.bib",22144,0,"",bibtex,selection_command
+2892,2153024,"examples/bibliography.bib",22118,0,"",bibtex,selection_command
+2893,2153072,"examples/bibliography.bib",20658,0,"",bibtex,selection_command
+2894,2153088,"examples/bibliography.bib",20596,0,"",bibtex,selection_command
+2895,2153116,"examples/bibliography.bib",20567,0,"",bibtex,selection_command
+2896,2153160,"examples/bibliography.bib",20566,0,"",bibtex,selection_command
+2897,2153177,"examples/bibliography.bib",20564,0,"",bibtex,selection_command
+2898,2153207,"examples/bibliography.bib",20460,0,"",bibtex,selection_command
+2899,2153239,"examples/bibliography.bib",20434,0,"",bibtex,selection_command
+2900,2153273,"examples/bibliography.bib",19848,0,"",bibtex,selection_command
+2901,2153306,"examples/bibliography.bib",19781,0,"",bibtex,selection_command
+2902,2153339,"examples/bibliography.bib",19748,0,"",bibtex,selection_command
+2903,2153373,"examples/bibliography.bib",19747,0,"",bibtex,selection_command
+2904,2153406,"examples/bibliography.bib",19745,0,"",bibtex,selection_command
+2905,2153439,"examples/bibliography.bib",19692,0,"",bibtex,selection_command
+2906,2153472,"examples/bibliography.bib",19675,0,"",bibtex,selection_command
+2907,2153506,"examples/bibliography.bib",19601,0,"",bibtex,selection_command
+2908,2153542,"examples/bibliography.bib",19110,0,"",bibtex,selection_command
+2909,2153646,"examples/bibliography.bib",19054,0,"",bibtex,selection_command
+2910,2153811,"examples/bibliography.bib",19023,0,"",bibtex,selection_command
+2911,2153926,"examples/bibliography.bib",19024,0,"",bibtex,selection_command
+2912,2154077,"examples/bibliography.bib",19037,0,"",bibtex,selection_command
+2913,2154226,"examples/bibliography.bib",19038,0,"",bibtex,selection_command
+2914,2154463,"examples/bibliography.bib",19038,1,"b",bibtex,selection_command
+2915,2154541,"examples/bibliography.bib",19038,14,"bruce2024genie",bibtex,selection_command
+2916,2155099,"examples/bibliography.bib",19038,0,"",bibtex,selection_command
+2917,2156561,"examples/jasmine.html",0,0,"",html,tab
+2918,2158199,"examples/jasmine.html",4077,1,"g",html,selection_command
+2919,2158250,"examples/jasmine.html",4077,9,"genie2025",html,selection_command
+2920,2158347,"examples/jasmine.html",4077,9,"bruce2024genie",html,content
+2921,2158353,"examples/jasmine.html",4090,0,"",html,selection_command
+2922,2158800,"examples/jasmine.html",4091,0,"",html,selection_command
+2923,2158940,"examples/jasmine.html",4093,0,"",html,selection_command
+2924,2159251,"examples/jasmine.html",4093,9,"",html,content
+2925,2159940,"examples/bibliography.bib",0,0,"",bibtex,tab
+2926,2160366,"examples/bibliography.bib",19069,0,"",bibtex,selection_command
+2927,2160619,"examples/bibliography.bib",19125,0,"",bibtex,selection_command
+2928,2160648,"examples/bibliography.bib",19616,0,"",bibtex,selection_command
+2929,2160681,"examples/bibliography.bib",19690,0,"",bibtex,selection_command
+2930,2160714,"examples/bibliography.bib",19707,0,"",bibtex,selection_command
+2931,2160748,"examples/bibliography.bib",19745,0,"",bibtex,selection_command
+2932,2160865,"examples/bibliography.bib",19747,0,"",bibtex,selection_command
+2933,2161018,"examples/bibliography.bib",19763,0,"",bibtex,selection_command
+2934,2161225,"examples/bibliography.bib",19757,0,"",bibtex,selection_command
+2935,2161321,"examples/bibliography.bib",19757,1,"p",bibtex,selection_command
+2936,2161361,"examples/bibliography.bib",19757,22,"parkerholder2024genie2",bibtex,selection_command
+2937,2161536,"examples/bibliography.bib",19757,0,"",bibtex,selection_command
+2938,2162087,"examples/jasmine.html",0,0,"",html,tab
+2939,2162451,"examples/jasmine.html",4092,0,"",html,selection_command
+2940,2162527,"examples/jasmine.html",4093,0,"parkerholder2024genie2",html,content
+2941,2162529,"examples/jasmine.html",4114,0,"",html,selection_command
+2942,2163064,"examples/jasmine.html",4115,0,"",html,selection_command
+2943,2163144,"examples/jasmine.html",4115,0,",",html,content
+2944,2163147,"examples/jasmine.html",4116,0,"",html,selection_keyboard
+2945,2163218,"examples/jasmine.html",4116,0," ",html,content
+2946,2163221,"examples/jasmine.html",4117,0,"",html,selection_keyboard
+2947,2163387,"examples/jasmine.html",4116,0,"",html,selection_command
+2948,2163855,"examples/bibliography.bib",0,0,"",bibtex,tab
+2949,2164082,"examples/bibliography.bib",19790,0,"",bibtex,selection_command
+2950,2164341,"examples/bibliography.bib",19857,0,"",bibtex,selection_command
+2951,2164370,"examples/bibliography.bib",20443,0,"",bibtex,selection_command
+2952,2164400,"examples/bibliography.bib",20469,0,"",bibtex,selection_command
+2953,2164433,"examples/bibliography.bib",20564,0,"",bibtex,selection_command
+2954,2164466,"examples/bibliography.bib",20566,0,"",bibtex,selection_command
+2955,2164628,"examples/bibliography.bib",20576,0,"",bibtex,selection_command
+2956,2164958,"examples/bibliography.bib",20576,1,"d",bibtex,selection_command
+2957,2165004,"examples/bibliography.bib",20576,18,"deepmind2025genie3",bibtex,selection_command
+2958,2165593,"examples/bibliography.bib",20576,0,"",bibtex,selection_command
+2959,2166137,"examples/jasmine.html",0,0,"",html,tab
+2960,2166987,"examples/jasmine.html",4117,0,"deepmind2025genie3",html,content
+2961,2166991,"examples/jasmine.html",4134,0,"",html,selection_command
+2962,2167747,"examples/jasmine.html",4135,0,"",html,selection_command
+2963,2168030,"examples/jasmine.html",4139,0,"",html,selection_command
+2964,2168279,"examples/jasmine.html",4140,0,"",html,selection_command
+2965,2168308,"examples/jasmine.html",4141,0,"",html,selection_command
+2966,2168340,"examples/jasmine.html",4145,0,"",html,selection_command
+2967,2168374,"examples/jasmine.html",4148,0,"",html,selection_command
+2968,2168593,"examples/jasmine.html",4151,0,"",html,selection_command
+2969,2168848,"examples/jasmine.html",4157,0,"",html,selection_command
+2970,2168878,"examples/jasmine.html",4168,0,"",html,selection_command
+2971,2168905,"examples/jasmine.html",4174,0,"",html,selection_command
+2972,2168937,"examples/jasmine.html",4177,0,"",html,selection_command
+2973,2168970,"examples/jasmine.html",4183,0,"",html,selection_command
+2974,2169004,"examples/jasmine.html",4186,0,"",html,selection_command
+2975,2169037,"examples/jasmine.html",4191,0,"",html,selection_command
+2976,2169071,"examples/jasmine.html",4201,0,"",html,selection_command
+2977,2169104,"examples/jasmine.html",4203,0,"",html,selection_command
+2978,2169141,"examples/jasmine.html",4207,0,"",html,selection_command
+2979,2169182,"examples/jasmine.html",4210,0,"",html,selection_command
+2980,2169463,"examples/jasmine.html",4219,0,"",html,selection_command
+2981,2169713,"examples/jasmine.html",4230,0,"",html,selection_command
+2982,2169742,"examples/jasmine.html",4231,0,"",html,selection_command
+2983,2169772,"examples/jasmine.html",4236,0,"",html,selection_command
+2984,2169802,"examples/jasmine.html",4244,0,"",html,selection_command
+2985,2171372,"examples/jasmine.html",4236,0,"",html,selection_command
+2986,2171510,"examples/jasmine.html",4231,0,"",html,selection_command
+2987,2171652,"examples/jasmine.html",4230,0,"",html,selection_command
+2988,2172238,"examples/jasmine.html",4230,1,"(",html,selection_command
+2989,2172276,"examples/jasmine.html",4230,5,"(cite",html,selection_command
+2990,2172431,"examples/jasmine.html",4230,13,"(cite Michael",html,selection_command
+2991,2172603,"examples/jasmine.html",4230,20,"(cite Michael Dennis",html,selection_command
+2992,2172761,"examples/jasmine.html",4230,21,"(cite Michael Dennis)",html,selection_command
+2993,2173154,"examples/jasmine.html",4230,21,"",html,content
+2994,2174679,"examples/jasmine.html",4230,0,"<",html,content
+2995,2174683,"examples/jasmine.html",4231,0,"",html,selection_keyboard
+2996,2174946,"examples/jasmine.html",4231,0,"d",html,content
+2997,2174953,"examples/jasmine.html",4232,0,"",html,selection_keyboard
+2998,2175105,"examples/jasmine.html",4232,0,"-",html,content
+2999,2175110,"examples/jasmine.html",4233,0,"",html,selection_keyboard
+3000,2175326,"examples/jasmine.html",4233,0,"c",html,content
+3001,2175331,"examples/jasmine.html",4234,0,"",html,selection_keyboard
+3002,2175446,"examples/jasmine.html",4234,0,"i",html,content
+3003,2175449,"examples/jasmine.html",4235,0,"",html,selection_keyboard
+3004,2175577,"examples/jasmine.html",4235,0,"t",html,content
+3005,2175580,"examples/jasmine.html",4236,0,"",html,selection_keyboard
+3006,2175625,"examples/jasmine.html",4236,0,"e",html,content
+3007,2175628,"examples/jasmine.html",4237,0,"",html,selection_keyboard
+3008,2175740,"examples/jasmine.html",4237,0," ",html,content
+3009,2175742,"examples/jasmine.html",4238,0,"",html,selection_keyboard
+3010,2175880,"examples/jasmine.html",4238,0,"k",html,content
+3011,2175881,"examples/jasmine.html",4239,0,"",html,selection_keyboard
+3012,2175960,"examples/jasmine.html",4239,0,"e",html,content
+3013,2175961,"examples/jasmine.html",4240,0,"",html,selection_keyboard
+3014,2176078,"examples/jasmine.html",4240,0,"y",html,content
+3015,2176079,"examples/jasmine.html",4241,0,"",html,selection_keyboard
+3016,2176544,"examples/jasmine.html",4241,0,"-",html,content
+3017,2176547,"examples/jasmine.html",4242,0,"",html,selection_keyboard
+3018,2176827,"examples/jasmine.html",4242,0,"""""",html,content
+3019,2176828,"examples/jasmine.html",4243,0,"",html,selection_keyboard
+3020,2177158,"examples/jasmine.html",4242,2,"",html,content
+3021,2177277,"examples/jasmine.html",4241,1,"",html,content
+3022,2177480,"examples/jasmine.html",4241,0,"=",html,content
+3023,2177481,"examples/jasmine.html",4242,0,"",html,selection_keyboard
+3024,2177582,"examples/jasmine.html",4242,0,"""""",html,content
+3025,2177731,"examples/jasmine.html",4243,1,"""",html,content
+3026,2177733,"examples/jasmine.html",4244,0,"",html,selection_keyboard
+3027,2177833,"examples/jasmine.html",4244,0,"""""",html,content
+3028,2177835,"examples/jasmine.html",4245,0,"",html,selection_keyboard
+3029,2178491,"examples/jasmine.html",4244,2,"",html,content
+3030,2179550,"examples/jasmine.html",4244,0,">",html,content
+3031,2179554,"examples/jasmine.html",4245,0,"",html,selection_keyboard
+3032,2179656,"examples/jasmine.html",4245,0,"",html,content
+3033,2180444,"examples/jasmine.html",4244,0,"",html,selection_command
+3034,2181102,"examples/jasmine.html",4243,0,"",html,selection_command
+3035,2182640,"examples/bibliography.bib",0,0,"",bibtex,tab
+3036,2284089,"examples/bibliography.bib",20605,0,"",bibtex,selection_command
+3037,2284336,"examples/bibliography.bib",20667,0,"",bibtex,selection_command
+3038,2284366,"examples/bibliography.bib",22127,0,"",bibtex,selection_command
+3039,2284399,"examples/bibliography.bib",22153,0,"",bibtex,selection_command
+3040,2284430,"examples/bibliography.bib",22165,0,"",bibtex,selection_command
+3041,2284619,"examples/bibliography.bib",22166,0,"\n",bibtex,content
+3042,2284789,"examples/bibliography.bib",22167,0,"\n",bibtex,content
+3043,2284935,"examples/bibliography.bib",22168,0,"@InProceedings{pmlr-v162-parker-holder22a,\n title = \t {Evolving Curricula with Regret-Based Environment Design},\n author = {Parker-Holder, Jack and Jiang, Minqi and Dennis, Michael and Samvelyan, Mikayel and Foerster, Jakob and Grefenstette, Edward and Rockt{\""a}schel, Tim},\n booktitle = \t {Proceedings of the 39th International Conference on Machine Learning},\n pages = \t {17473--17498},\n year = \t {2022},\n editor = \t {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},\n volume = \t {162},\n series = \t {Proceedings of Machine Learning Research},\n month = \t {17--23 Jul},\n publisher = {PMLR},\n pdf = \t {https://proceedings.mlr.press/v162/parker-holder22a/parker-holder22a.pdf},\n url = \t {https://proceedings.mlr.press/v162/parker-holder22a.html},\n abstract = \t {Training generally-capable agents with reinforcement learning (RL) remains a significant challenge. A promising avenue for improving the robustness of RL agents is through the use of curricula. One such class of methods frames environment design as a game between a student and a teacher, using regret-based objectives to produce environment instantiations (or levels) at the frontier of the student agent's capabilities. These methods benefit from theoretical robustness guarantees at equilibrium, yet they often struggle to find effective levels in challenging design spaces in practice. By contrast, evolutionary approaches incrementally alter environment complexity, resulting in potentially open-ended learning, but often rely on domain-specific heuristics and vast amounts of computational resources. This work proposes harnessing the power of evolution in a principled, regret-based curriculum. Our approach, which we call Adversarially Compounding Complexity by Editing Levels (ACCEL), seeks to constantly produce levels at the frontier of an agent's capabilities, resulting in curricula that start simple but become increasingly complex. ACCEL maintains the theoretical benefits of prior regret-based methods, while providing significant empirical gains in a diverse set of environments. An interactive version of this paper is available at https://accelagent.github.io.}\n}\n",bibtex,content
+3044,2285371,"examples/bibliography.bib",24393,0,"",bibtex,selection_command
+3045,2285621,"examples/bibliography.bib",22996,0,"",bibtex,selection_command
+3046,2285659,"examples/bibliography.bib",22926,0,"",bibtex,selection_command
+3047,2285690,"examples/bibliography.bib",22840,0,"",bibtex,selection_command
+3048,2285732,"examples/bibliography.bib",22815,0,"",bibtex,selection_command
+3049,2285759,"examples/bibliography.bib",22789,0,"",bibtex,selection_command
+3050,2285790,"examples/bibliography.bib",22732,0,"",bibtex,selection_command
+3051,2285825,"examples/bibliography.bib",22712,0,"",bibtex,selection_command
+3052,2285856,"examples/bibliography.bib",22587,0,"",bibtex,selection_command
+3053,2285892,"examples/bibliography.bib",22568,0,"",bibtex,selection_command
+3054,2285925,"examples/bibliography.bib",22540,0,"",bibtex,selection_command
+3055,2285959,"examples/bibliography.bib",22452,0,"",bibtex,selection_command
+3056,2285992,"examples/bibliography.bib",22282,0,"",bibtex,selection_command
+3057,2286128,"examples/bibliography.bib",22211,0,"",bibtex,selection_command
+3058,2286302,"examples/bibliography.bib",22168,0,"",bibtex,selection_command
+3059,2287002,"examples/bibliography.bib",22169,0,"",bibtex,selection_command
+3060,2287185,"examples/bibliography.bib",22182,0,"",bibtex,selection_command
+3061,2287353,"examples/bibliography.bib",22183,0,"",bibtex,selection_command
+3062,2288364,"examples/bibliography.bib",22209,0,"2evolving",bibtex,content
+3063,2288365,"examples/bibliography.bib",22208,1,"",bibtex,content
+3064,2288365,"examples/bibliography.bib",22207,0,"0",bibtex,content
+3065,2288365,"examples/bibliography.bib",22199,1,"",bibtex,content
+3066,2288365,"examples/bibliography.bib",22184,10,"",bibtex,content
+3067,2289187,"examples/bibliography.bib",22183,1,"p",bibtex,selection_command
+3068,2289238,"examples/bibliography.bib",22183,24,"parkerholder2022evolving",bibtex,selection_command
+3069,2289354,"examples/bibliography.bib",22183,0,"",bibtex,selection_command
+3070,2291298,"examples/jasmine.html",0,0,"",html,tab
+3071,2293017,"examples/jasmine.html",4243,0,"@InProceedings{pmlr-v162-parker-holder22a,\n title = \t {Evolving Curricula with Regret-Based Environment Design},\n author = {Parker-Holder, Jack and Jiang, Minqi and Dennis, Michael and Samvelyan, Mikayel and Foerster, Jakob and Grefenstette, Edward and Rockt{\""a}schel, Tim},\n booktitle = \t {Proceedings of the 39th International Conference on Machine Learning},\n pages = \t {17473--17498},\n year = \t {2022},\n editor = \t {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},\n volume = \t {162},\n series = \t {Proceedings of Machine Learning Research},\n month = \t {17--23 Jul},\n publisher = {PMLR},\n pdf = \t {https://proceedings.mlr.press/v162/parker-holder22a/parker-holder22a.pdf},\n url = \t {https://proceedings.mlr.press/v162/parker-holder22a.html},\n abstract = \t {Training generally-capable agents with reinforcement learning (RL) remains a significant challenge. A promising avenue for improving the robustness of RL agents is through the use of curricula. One such class of methods frames environment design as a game between a student and a teacher, using regret-based objectives to produce environment instantiations (or levels) at the frontier of the student agent's capabilities. These methods benefit from theoretical robustness guarantees at equilibrium, yet they often struggle to find effective levels in challenging design spaces in practice. By contrast, evolutionary approaches incrementally alter environment complexity, resulting in potentially open-ended learning, but often rely on domain-specific heuristics and vast amounts of computational resources. This work proposes harnessing the power of evolution in a principled, regret-based curriculum. Our approach, which we call Adversarially Compounding Complexity by Editing Levels (ACCEL), seeks to constantly produce levels at the frontier of an agent's capabilities, resulting in curricula that start simple but become increasingly complex. ACCEL maintains the theoretical benefits of prior regret-based methods, while providing significant empirical gains in a diverse set of environments. An interactive version of this paper is available at https://accelagent.github.io.}\n}\n",html,content
+3072,2293024,"examples/jasmine.html",6470,0," ",html,content
+3073,2293024,"examples/jasmine.html",6468,0," ",html,content
+3074,2293024,"examples/jasmine.html",5071,2," ",html,content
+3075,2293024,"examples/jasmine.html",5001,2," ",html,content
+3076,2293024,"examples/jasmine.html",4915,2," ",html,content
+3077,2293024,"examples/jasmine.html",4890,2," ",html,content
+3078,2293024,"examples/jasmine.html",4864,2," ",html,content
+3079,2293024,"examples/jasmine.html",4807,2," ",html,content
+3080,2293024,"examples/jasmine.html",4787,2," ",html,content
+3081,2293024,"examples/jasmine.html",4662,2," ",html,content
+3082,2293024,"examples/jasmine.html",4643,2," ",html,content
+3083,2293024,"examples/jasmine.html",4615,2," ",html,content
+3084,2293024,"examples/jasmine.html",4527,2," ",html,content
+3085,2293024,"examples/jasmine.html",4357,2," ",html,content
+3086,2293024,"examples/jasmine.html",4286,2," ",html,content
+3087,2293332,"examples/jasmine.html",6619,0,"",html,selection_command
+3088,2293888,"examples/jasmine.html",4243,2377,"",html,content
+3089,2293900,"examples/jasmine.html",4606,0,"",html,selection_command
+3090,2296407,"examples/jasmine.html",4599,0,"",html,selection_command
+3091,2296649,"examples/jasmine.html",4573,0,"",html,selection_command
+3092,2296677,"examples/jasmine.html",4562,0,"",html,selection_command
+3093,2296710,"examples/jasmine.html",4556,0,"",html,selection_command
+3094,2296741,"examples/jasmine.html",4532,0,"",html,selection_command
+3095,2296774,"examples/jasmine.html",4509,0,"",html,selection_command
+3096,2296807,"examples/jasmine.html",4448,0,"",html,selection_command
+3097,2296841,"examples/jasmine.html",4349,0,"",html,selection_command
+3098,2296874,"examples/jasmine.html",4322,0,"",html,selection_command
+3099,2296906,"examples/jasmine.html",4311,0,"",html,selection_command
+3100,2296941,"examples/jasmine.html",4306,0,"",html,selection_command
+3101,2296972,"examples/jasmine.html",4295,0,"",html,selection_command
+3102,2297091,"examples/jasmine.html",4288,0,"",html,selection_command
+3103,2297254,"examples/jasmine.html",4276,0,"",html,selection_command
+3104,2297398,"examples/jasmine.html",4170,0,"",html,selection_command
+3105,2297816,"examples/jasmine.html",4242,0,"",html,selection_command
+3106,2298150,"examples/jasmine.html",4243,0,"parkerholder2022evolving",html,content
+3107,2298154,"examples/jasmine.html",4266,0,"",html,selection_command
+3108,2299742,"examples/jasmine.html",4160,0,"",html,selection_command
+3109,2301088,"examples/jasmine.html",4293,0,"",html,selection_command
+3110,2319977,"examples/jasmine.html",4301,0,"\n ",html,content
+3111,2320418,"examples/jasmine.html",4306,0,"p",html,content
+3112,2320420,"examples/jasmine.html",4307,0,"",html,selection_keyboard
+3113,2321100,"examples/jasmine.html",4306,1,"",html,content
+3114,2321415,"examples/jasmine.html",4306,0,"<",html,content
+3115,2321416,"examples/jasmine.html",4307,0,"",html,selection_keyboard
+3116,2321913,"examples/jasmine.html",4307,0,"p",html,content
+3117,2321918,"examples/jasmine.html",4308,0,"",html,selection_keyboard
+3118,2322313,"examples/jasmine.html",4308,0,">",html,content
+3119,2322316,"examples/jasmine.html",4309,0,"",html,selection_keyboard
+3120,2322420,"examples/jasmine.html",4309,0,"",html,content
+3121,2322633,"examples/jasmine.html",4309,0,"\n \n ",html,content
+3122,2326555,"examples/jasmine.html",4318,0,"While numerous previous works have investigated large-scale world modeling and its application to robotics (cite cosmos), world modeling for agent training necessitates a vastly different treatment. Such regime requires the accumulating precision error of the world model to be orders of magnitude smaller than when solely used for short-term look-ahead. The feasibility of such a world model in its truest sense is entirely understudied, and this project is a first attempt at studying the setting using rigorous evaluations. Specifically, we want to develop Empirical Environment Complexity Scaling Laws, where we train world models to full convergence in environments of increasing complexity (Atari, RetroGym, Craftax, Minecraft) and under the synthetic infinite-data regime. Subsequently, we evaluate those models two-fold: i) via a taxonomy of granular benchmarks probing specific world modeling capabilities (reconstruction quality, environment dynamics at the body/tail of the data distribution, long-horizon consistency) (cite BSuite), and ii) by training reinforcement learning (RL) agents in both the world model and the corresponding ground-truth environment, and measuring the performance difference between the two agents.\n\nThe final goal of this project is to construct empirical estimates of compute and data requirements to model environments of increasing complexity sufficiently well (as determined by our evaluation procedure). The conclusion to draw from such empirical estimate is entirely open. If our empirical estimates show resource requirement trends that are feasible under the assumption of the continuation of Moore's Law (cite) and increased capital expenditure, that would manifest world modeling as a paradigm with high likelihood of success in overcoming the data-scarcity in domains as general as (humanoid) robotics. Otherwise, the world modeling research community must realign its direction with downstream goals that are feasible.\n",html,content
+3123,2327335,"examples/jasmine.html",5556,0,"",html,selection_command
+3124,2327493,"examples/jasmine.html",5555,0,"",html,selection_command
+3125,2327795,"examples/jasmine.html",5556,0,"",html,selection_command
+3126,2328302,"examples/jasmine.html",5556,0," ",html,content
+3127,2328772,"examples/jasmine.html",5560,0," ",html,content
+3128,2329137,"examples/jasmine.html",5563,0,"",html,selection_command
+3129,2329256,"examples/jasmine.html",6296,0,"",html,selection_command
+3130,2329589,"examples/jasmine.html",6296,1,"",html,content
+3131,2329594,"examples/jasmine.html",6300,0,"",html,selection_command
+3132,2330255,"examples/jasmine.html",5560,0,"",html,selection_command
+3133,2330387,"examples/jasmine.html",5555,0,"",html,selection_command
+3134,2330526,"examples/jasmine.html",4314,0,"",html,selection_command
+3135,2330805,"examples/jasmine.html",5555,0,"",html,selection_command
+3136,2330959,"examples/jasmine.html",5555,0,"\n ",html,content
+3137,2330963,"examples/jasmine.html",5560,0,"",html,selection_command
+3138,2331343,"examples/jasmine.html",5555,0,"",html,selection_command
+3139,2331620,"examples/jasmine.html",5555,1,"",html,content
+3140,2331630,"examples/jasmine.html",5559,0,"",html,selection_command
+3141,2331759,"examples/jasmine.html",4314,0,"",html,selection_command
+3142,2331924,"examples/jasmine.html",4306,0,"",html,selection_command
+3143,2332441,"examples/jasmine.html",4314,0,"",html,selection_command
+3144,2332568,"examples/jasmine.html",5559,0,"",html,selection_command
+3145,2332743,"examples/jasmine.html",5563,0,"\n
\n We introduce Jasmine, a production-ready JAX-based codebase for world modeling from unlabeled videos.\n Scale from single hosts to hundreds of xPUs thanks to XLA.\n
\n We are at the cusp of an intelligence revolution. Neural networks are able to clone the behaviour of peak human intellectual performance \n given enough compute, data, and the right algorithms . While an increasing amount of capital expenditure is allocated to compute clusters, and a well-working\n recipe of equipping models with the required priors and capacity to reason is publicly available, the path to human-level intelligence with the ability to automate\n large fractions of the economy will increasingly be shaped by paradigms that are able to find and efficiently use untouched data troves.\n
\n
\n While product-feedback-loops constitute an adaptive data trove, many domains like robotics are not mature enough to yield a product with wide enough\n adoption to create a feedback-loop of sufficient magnitude, prompting the search for alternatives.\n One paradigm proposed by the research community to overcome the data scarcity in those domains is that of world models. While world models can help frontier model\n development in numerous ways, an ambitious goal of the community is to train a world model to act as a simulation of the world , in order to\n train an agent in that simulation, via an adaptive curriculum or otherwise.\n
\n While numerous previous works have investigated large-scale world modeling and its application to robotics , world modeling for agent training calls for a vastly different treatment.\n Such regime requires the compounding error of world models to be orders of magnitude smaller than when solely used for short-term look-ahead. The feasibility of such a world model in its truest sense is entirely\n understudied, and Jasmine, a world modeling codebase, is our first milestone towards studying the setting using rigorous evaluations. Specifically, we want to develop Empirical Environment Complexity Scaling Trends, where we train world models to full convergence\n in environments of increasing complexity (Atari , RetroGym , Craftax , Minecraft )\n and under the synthetic infinite-data regime. Subsequently, we want to evaluate those models two-fold: i) via a taxonomy of granular benchmarks probing\n specific world modeling capabilities (reconstruction quality, environment dynamics at the body/tail of the data distribution, long-horizon consistency) , and ii) by training reinforcement learning (RL) agents in both\n the world model and the corresponding ground-truth environment, and measuring the performance difference between those agents.\n
\n
\n Ultimately, such treatment permits us to derive empirical estimates of compute and data requirements to model environments of increasing complexity sufficiently well (as determined by our evaluation procedure). Only given such estimates can we try to draw conclusions\n about the feasibility of world modeling of environments as complex as the real world for agent training. If our empirical estimates show resource requirement trends that are feasible under the assumption of the continuation of Moore's Law and increased capital\n expenditure, that would manifest world modeling as a paradigm with high likelihood of success in overcoming the data-scarcity in domains as general as (humanoid) robotics. Otherwise, the world modeling research community must realign its direction with downstream goals\n that are feasible.\n
\n
A batteries-included foundation for world modeling research
\n
\n Jasmine, our first milestone towards deriving Empirical Environment Complexity Scaling Trends, is the result of weeks of infrastructure work to make large-scale world modeling research more accessible. What started off as a fork of\n Jafar grew into a full-fledged world\n modeling codebase amenable to large-scale training, implementing multiple dynamics model baselines, asynchronous checkpointing, process-parallel dataloading, checkpointing of model weights, optimizer and dataloader states, checkpointing policies, full reproducibility with identical\n training curves, mixed precision training, optimized FlashAttention (via cuDNN SDPA), activation checkpointing, DDP\n (with FSDP/HSDP requiring changing a single LoC), WSD schedule, index-shuffling during dataloading, and native Treescope support. Jasmine implements the new\n flax.nnx API and strictly adheres to Noam Shazeer's shape suffix convention, thereby providing\n a didactic implementation of world modeling architectures. Jasmine solely depends\n on battle-tested libraries from the Google ecosystem (Flax, Optax, Orbax, Grain,\n PIX, ArrayRecord).\n
\n
Releasing a dataset of fine-grained research engineering
\n
\n We captured every step of the research engineering process behind Jasmine using crowd-code,\n a VS Code/ Cursor extension that captures fine-grained IDE interactions (character-level edits, navigation, debugging patterns, terminal usage) and allows researchers to contribute their \n engineering process to a crowd-sourced dataset. Today, we release crowd-code-0.1, our first dataset of dense IDE interactions, which encompasses the entire development of Jasmine.\n crowd-code-0.1 is unfiltered, uncleaned, and uncurated, but only contains IDE interactions of the authors. We are actively working on cleaning and curating the full dataset,\n which will be released in the future.\n
\n \n\n \n\n
Contributions
\n
MM, AN and FS worked on research, ideation and implementation. FS wrote the manuscript. SB provided feedback and guidance.
We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on. Install once, and forget about it.
Crowd-Sourcing A Dataset To Make Agents Code Like Humans
\n
Alfred Nguyen, Mihir Mahajan, Franz Srambical
\n
",html,content
+15916,16116740,"examples/blog.html",25204,0,"",html,selection_command
+15917,16117087,"examples/blog.html",25193,0,"",html,selection_command
+15918,16117370,"examples/blog.html",25193,1,"",html,content
+15919,16117373,"examples/blog.html",25203,0,"",html,selection_command
+15920,16117566,"examples/blog.html",25240,0,"",html,selection_command
+15921,16117707,"examples/blog.html",25203,0,"",html,selection_command
+15922,16117985,"examples/blog.html",25240,0,"",html,selection_command
+15923,16119191,"examples/blog.html",25277,0,"",html,selection_command
+15924,16119352,"examples/blog.html",25342,0,"",html,selection_command
+15925,16119486,"examples/blog.html",25363,0,"",html,selection_command
+15926,16120351,"examples/blog.html",25342,0,"",html,selection_command
+15927,16120495,"examples/blog.html",25277,0,"",html,selection_command
+15928,16124888,"examples/blog.html",25465,2,"",html,content
+15929,16124889,"examples/blog.html",25464,0,"pn",html,content
+15930,16124889,"examples/blog.html",25454,0,"jasmin",html,content
+15931,16124889,"examples/blog.html",25445,9,"",html,content
+15932,16124889,"examples/blog.html",25385,0,"jasmin",html,content
+15933,16124889,"examples/blog.html",25376,9,"",html,content
+15934,16124889,"examples/blog.html",25319,0,"5",html,content
+15935,16124889,"examples/blog.html",25317,2,"",html,content
+15936,16124889,"examples/blog.html",25316,0,"gust",html,content
+15937,16124889,"examples/blog.html",25314,2,"",html,content
+15938,16124889,"examples/blog.html",25313,0,"A",html,content
+15939,16124889,"examples/blog.html",25312,1,"",html,content
+15940,16130566,"examples/blog.html",25343,0,"",html,selection_command
+15941,16130812,"examples/blog.html",25364,0,"",html,selection_command
+15942,16131018,"examples/blog.html",25402,0,"",html,selection_command
+15943,16131185,"examples/blog.html",25509,0,"",html,selection_command
+15944,16131367,"examples/blog.html",25553,0,"",html,selection_command
+15945,16131659,"examples/blog.html",25509,0,"",html,selection_command
+15946,16131769,"examples/blog.html",25517,0,"",html,selection_command
+15947,16131821,"examples/blog.html",25561,0,"",html,selection_command
+15948,16132104,"examples/blog.html",25565,0,"",html,selection_command
+15949,16132268,"examples/blog.html",25566,0,"",html,selection_command
+15950,16132438,"examples/blog.html",25569,0,"",html,selection_command
+15951,16132591,"examples/blog.html",25574,0,"",html,selection_command
+15952,16132727,"examples/blog.html",25576,0,"",html,selection_command
+15953,16132873,"examples/blog.html",25581,0,"",html,selection_command
+15954,16133920,"examples/blog.html",25583,0,"",html,selection_command
+15955,16134123,"examples/blog.html",25588,0,"",html,selection_command
+15956,16134703,"examples/blog.html",25583,0,"",html,selection_command
+15957,16136447,"examples/jasmine.html",0,0,"",html,tab
+15958,16137236,"examples/jasmine.html",0,0,"",html,selection_command
+15959,16137604,"examples/jasmine.html",5,0,"",html,selection_command
+15960,16137851,"examples/jasmine.html",30,0,"",html,selection_command
+15961,16137884,"examples/jasmine.html",31,0,"",html,selection_command
+15962,16137917,"examples/jasmine.html",97,0,"",html,selection_command
+15963,16137950,"examples/jasmine.html",164,0,"",html,selection_command
+15964,16137984,"examples/jasmine.html",206,0,"",html,selection_command
+15965,16138018,"examples/jasmine.html",207,0,"",html,selection_command
+15966,16138051,"examples/jasmine.html",257,0,"",html,selection_command
+15967,16138085,"examples/jasmine.html",258,0,"",html,selection_command
+15968,16138118,"examples/jasmine.html",328,0,"",html,selection_command
+15969,16138154,"examples/jasmine.html",396,0,"",html,selection_command
+15970,16138188,"examples/jasmine.html",471,0,"",html,selection_command
+15971,16138218,"examples/jasmine.html",541,0,"",html,selection_command
+15972,16138251,"examples/jasmine.html",574,0,"",html,selection_command
+15973,16138284,"examples/jasmine.html",578,0,"",html,selection_command
+15974,16138319,"examples/jasmine.html",594,0,"",html,selection_command
+15975,16138351,"examples/jasmine.html",595,0,"",html,selection_command
+15976,16138384,"examples/jasmine.html",602,0,"",html,selection_command
+15977,16138418,"examples/jasmine.html",643,0,"",html,selection_command
+15978,16138451,"examples/jasmine.html",714,0,"",html,selection_command
+15979,16138484,"examples/jasmine.html",738,0,"",html,selection_command
+15980,16138517,"examples/jasmine.html",794,0,"",html,selection_command
+15981,16138551,"examples/jasmine.html",802,0,"",html,selection_command
+15982,16138585,"examples/jasmine.html",803,0,"",html,selection_command
+15983,16138618,"examples/jasmine.html",810,0,"",html,selection_command
+15984,16138651,"examples/jasmine.html",817,0,"",html,selection_command
+15985,16138684,"examples/jasmine.html",853,0,"",html,selection_command
+15986,16138717,"examples/jasmine.html",859,0,"",html,selection_command
+15987,16138751,"examples/jasmine.html",878,0,"",html,selection_command
+15988,16138784,"examples/jasmine.html",935,0,"",html,selection_command
+15989,16139389,"examples/jasmine.html",939,0,"",html,selection_command
+15990,16139553,"examples/jasmine.html",940,0,"",html,selection_command
+15991,16139720,"examples/jasmine.html",945,0,"",html,selection_command
+15992,16139869,"examples/jasmine.html",948,0,"",html,selection_command
+15993,16140367,"examples/jasmine.html",949,0,"",html,selection_command
+15994,16140515,"examples/jasmine.html",949,2,"๐ง",html,selection_command
+15995,16140698,"examples/jasmine.html",949,4,"๐งโโ",html,selection_command
+15996,16140950,"examples/jasmine.html",949,5,"๐งโโ๏ธ",html,selection_command
+15997,16140986,"examples/jasmine.html",949,13,"๐งโโ๏ธ Jasmine",html,selection_command
+15998,16141202,"examples/jasmine.html",961,0,"",html,selection_command
+15999,16141482,"examples/jasmine.html",962,0,"",html,selection_command
+16000,16141644,"examples/jasmine.html",963,0,"",html,selection_command
+16001,16142310,"examples/jasmine.html",949,0,"",html,selection_command
+16002,16142904,"examples/blog.html",0,0,"",html,tab
+16003,16143884,"examples/blog.html",25583,1,"C",html,selection_command
+16004,16144119,"examples/blog.html",25583,5,"Crowd",html,selection_command
+16005,16144355,"examples/blog.html",25583,6,"Crowd-",html,selection_command
+16006,16144607,"examples/blog.html",25583,14,"Crowd-Sourcing",html,selection_command
+16007,16144639,"examples/blog.html",25583,16,"Crowd-Sourcing A",html,selection_command
+16008,16144671,"examples/blog.html",25583,24,"Crowd-Sourcing A Dataset",html,selection_command
+16009,16144706,"examples/blog.html",25583,27,"Crowd-Sourcing A Dataset To",html,selection_command
+16010,16144861,"examples/blog.html",25583,32,"Crowd-Sourcing A Dataset To Make",html,selection_command
+16011,16145026,"examples/blog.html",25583,39,"Crowd-Sourcing A Dataset To Make Agents",html,selection_command
+16012,16145192,"examples/blog.html",25583,44,"Crowd-Sourcing A Dataset To Make Agents Code",html,selection_command
+16013,16145343,"examples/blog.html",25583,49,"Crowd-Sourcing A Dataset To Make Agents Code Like",html,selection_command
+16014,16145511,"examples/blog.html",25583,56,"Crowd-Sourcing A Dataset To Make Agents Code Like Humans",html,selection_command
+16015,16145870,"examples/blog.html",25583,56,"๐งโโ๏ธ Jasmine: A Simple, Performant and Scalable JAX-based World Modeling Codebase",html,content
+16016,16145897,"examples/blog.html",25664,0,"",html,selection_command
+16017,16147562,"examples/blog.html",25775,0,"",html,selection_command
+16018,16147753,"examples/blog.html",25774,0,"",html,selection_command
+16019,16148003,"examples/blog.html",25772,0,"",html,selection_command
+16020,16148034,"examples/blog.html",25763,0,"",html,selection_command
+16021,16148065,"examples/blog.html",25762,0,"",html,selection_command
+16022,16148096,"examples/blog.html",25758,0,"",html,selection_command
+16023,16148130,"examples/blog.html",25757,0,"",html,selection_command
+16024,16148166,"examples/blog.html",25752,0,"",html,selection_command
+16025,16148200,"examples/blog.html",25750,0,"",html,selection_command
+16026,16148233,"examples/blog.html",25743,0,"",html,selection_command
+16027,16148267,"examples/blog.html",25742,0,"",html,selection_command
+16028,16148300,"examples/blog.html",25738,0,"",html,selection_command
+16029,16148334,"examples/blog.html",25737,0,"",html,selection_command
+16030,16148367,"examples/blog.html",25732,0,"",html,selection_command
+16031,16148402,"examples/blog.html",25730,0,"",html,selection_command
+16032,16148436,"examples/blog.html",25724,0,"",html,selection_command
+16033,16148614,"examples/blog.html",25723,0,"",html,selection_command
+16034,16148796,"examples/blog.html",25719,0,"",html,selection_command
+16035,16149144,"examples/blog.html",25718,0,"",html,selection_command
+16036,16150121,"examples/blog.html",26021,0,"ble",html,content
+16037,16150121,"examples/blog.html",26020,1,"",html,content
+16038,16150121,"examples/blog.html",26019,0,"ccess",html,content
+16039,16150121,"examples/blog.html",26014,5,"",html,content
+16040,16150121,"examples/blog.html",26011,1,"",html,content
+16041,16150121,"examples/blog.html",26009,1,"",html,content
+16042,16150121,"examples/blog.html",26007,0,"m",html,content
+16043,16150121,"examples/blog.html",26006,1,"",html,content
+16044,16150121,"examples/blog.html",26005,0,"rch",html,content
+16045,16150121,"examples/blog.html",26003,2,"",html,content
+16046,16150121,"examples/blog.html",26002,0,"se",html,content
+16047,16150121,"examples/blog.html",26000,2,"",html,content
+16048,16150121,"examples/blog.html",25999,0,"g r",html,content
+16049,16150121,"examples/blog.html",25998,1,"",html,content
+16050,16150121,"examples/blog.html",25997,0,"deli",html,content
+16051,16150121,"examples/blog.html",25996,0,"m",html,content
+16052,16150121,"examples/blog.html",25995,0,"d",html,content
+16053,16150121,"examples/blog.html",25994,0,"e wor",html,content
+16054,16150121,"examples/blog.html",25992,0,"c",html,content
+16055,16150121,"examples/blog.html",25991,1,"",html,content
+16056,16150121,"examples/blog.html",25990,0,"large-",html,content
+16057,16150121,"examples/blog.html",25988,2,"",html,content
+16058,16150121,"examples/blog.html",25987,0," make",html,content
+16059,16150121,"examples/blog.html",25985,2,"",html,content
+16060,16150121,"examples/blog.html",25984,0,"t",html,content
+16061,16150122,"examples/blog.html",25983,0,"rk",html,content
+16062,16150122,"examples/blog.html",25979,4,"",html,content
+16063,16150122,"examples/blog.html",25978,0,"w",html,content
+16064,16150122,"examples/blog.html",25977,1,"",html,content
+16065,16150122,"examples/blog.html",25975,0,"ctur",html,content
+16066,16150122,"examples/blog.html",25974,1,"",html,content
+16067,16150122,"examples/blog.html",25973,0,"r",html,content
+16068,16150122,"examples/blog.html",25972,0,"fras",html,content
+16069,16150122,"examples/blog.html",25971,1,"",html,content
+16070,16150122,"examples/blog.html",25969,0," ",html,content
+16071,16150122,"examples/blog.html",25968,0,"o",html,content
+16072,16150122,"examples/blog.html",25967,0,"ks",html,content
+16073,16150122,"examples/blog.html",25960,7,"",html,content
+16074,16150122,"examples/blog.html",25958,1,"",html,content
+16075,16150122,"examples/blog.html",25957,0,"w",html,content
+16076,16150122,"examples/blog.html",25956,0,"f",html,content
+16077,16150122,"examples/blog.html",25954,1,"",html,content
+16078,16150122,"examples/blog.html",25952,0,"ul",html,content
+16079,16150122,"examples/blog.html",25951,1,"",html,content
+16080,16150122,"examples/blog.html",25935,15,"",html,content
+16081,16150122,"examples/blog.html",25934,0,"r",html,content
+16082,16150122,"examples/blog.html",25932,0,"h",html,content
+16083,16150122,"examples/blog.html",25929,3,"",html,content
+16084,16150122,"examples/blog.html",25928,0," ",html,content
+16085,16150122,"examples/blog.html",25926,2,"",html,content
+16086,16150122,"examples/blog.html",25925,0,"i",html,content
+16087,16150122,"examples/blog.html",25924,0,"It",html,content
+16088,16150122,"examples/blog.html",25923,1,"",html,content
+16089,16150122,"examples/blog.html",25922,0,"e.",html,content
+16090,16150122,"examples/blog.html",25915,7,"",html,content
+16091,16150122,"examples/blog.html",25914,0,"eba",html,content
+16092,16150122,"examples/blog.html",25913,1,"",html,content
+16093,16150122,"examples/blog.html",25911,1,"",html,content
+16094,16150122,"examples/blog.html",25909,1,"",html,content
+16095,16150122,"examples/blog.html",25907,0,"g",html,content
+16096,16150122,"examples/blog.html",25905,0,"l",html,content
+16097,16150122,"examples/blog.html",25904,1,"",html,content
+16098,16150122,"examples/blog.html",25903,0,"mod",html,content
+16099,16150122,"examples/blog.html",25893,10,"",html,content
+16100,16150122,"examples/blog.html",25892,0,"rld",html,content
+16101,16150123,"examples/blog.html",25891,0,"w",html,content
+16102,16150123,"examples/blog.html",25890,1,"",html,content
+16103,16150123,"examples/blog.html",25889,0,"d",html,content
+16104,16150123,"examples/blog.html",25888,0,"s",html,content
+16105,16150123,"examples/blog.html",25884,4,"",html,content
+16106,16150123,"examples/blog.html",25883,0,"JAX-b",html,content
+16107,16150123,"examples/blog.html",25882,0,"e",html,content
+16108,16150123,"examples/blog.html",25879,3,"",html,content
+16109,16150123,"examples/blog.html",25878,0,"ab",html,content
+16110,16150123,"examples/blog.html",25876,0,"sc",html,content
+16111,16150123,"examples/blog.html",25875,0,"nd",html,content
+16112,16150123,"examples/blog.html",25874,1,"",html,content
+16113,16150123,"examples/blog.html",25873,0," ",html,content
+16114,16150123,"examples/blog.html",25872,1,"",html,content
+16115,16150123,"examples/blog.html",25866,5,"",html,content
+16116,16150123,"examples/blog.html",25865,0,"ma",html,content
+16117,16150123,"examples/blog.html",25860,5,"",html,content
+16118,16150123,"examples/blog.html",25858,0,"f",html,content
+16119,16150123,"examples/blog.html",25857,1,"",html,content
+16120,16150123,"examples/blog.html",25853,3,"",html,content
+16121,16150123,"examples/blog.html",25852,0,"p",html,content
+16122,16150123,"examples/blog.html",25849,3,"",html,content
+16123,16150123,"examples/blog.html",25848,0,"simple,",html,content
+16124,16150123,"examples/blog.html",25846,2,"",html,content
+16125,16150123,"examples/blog.html",25841,0,"Jasmin",html,content
+16126,16150123,"examples/blog.html",25832,9,"",html,content
+16127,16150123,"examples/blog.html",25749,0,"n, Alfred Nguye",html,content
+16128,16150123,"examples/blog.html",25712,20,"",html,content
+16129,16155086,"examples/blog.html",25818,0,"",html,selection_command
+16130,16155720,"examples/blog.html",26022,0,"",html,selection_command
+16131,16156568,"examples/blog.html",25777,0,"",html,selection_command
+16132,16156695,"examples/blog.html",25799,0,"",html,selection_command
+16133,16156862,"examples/blog.html",25800,0,"",html,selection_command
+16134,16157020,"examples/blog.html",25802,0,"",html,selection_command
+16135,16157181,"examples/blog.html",25807,0,"",html,selection_command
+16136,16157296,"examples/blog.html",25809,0,"",html,selection_command
+16137,16157467,"examples/blog.html",25817,0,"",html,selection_command
+16138,16157671,"examples/blog.html",25819,0,"",html,selection_command
+16139,16158398,"examples/blog.html",25819,200,"",html,content
+16140,16158606,"examples/blog.html",25818,0,"",html,selection_command
+16141,16159271,"examples/jasmine.html",0,0,"",html,tab
+16142,16160416,"examples/jasmine.html",1048,0,"",html,selection_command
+16143,16160533,"examples/jasmine.html",1050,0,"",html,selection_command
+16144,16160711,"examples/jasmine.html",1053,0,"",html,selection_command
+16145,16160912,"examples/jasmine.html",1054,0,"",html,selection_command
+16146,16161443,"examples/jasmine.html",1054,1,"W",html,selection_command
+16147,16161525,"examples/jasmine.html",1054,2,"We",html,selection_command
+16148,16161783,"examples/jasmine.html",1054,12,"We introduce",html,selection_command
+16149,16161811,"examples/jasmine.html",1054,20,"We introduce Jasmine",html,selection_command
+16150,16161844,"examples/jasmine.html",1054,21,"We introduce Jasmine,",html,selection_command
+16151,16161873,"examples/jasmine.html",1054,23,"We introduce Jasmine, a",html,selection_command
+16152,16161910,"examples/jasmine.html",1054,34,"We introduce Jasmine, a production",html,selection_command
+16153,16161946,"examples/jasmine.html",1054,35,"We introduce Jasmine, a production-",html,selection_command
+16154,16161977,"examples/jasmine.html",1054,40,"We introduce Jasmine, a production-ready",html,selection_command
+16155,16162012,"examples/jasmine.html",1054,44,"We introduce Jasmine, a production-ready JAX",html,selection_command
+16156,16162050,"examples/jasmine.html",1054,45,"We introduce Jasmine, a production-ready JAX-",html,selection_command
+16157,16162083,"examples/jasmine.html",1054,50,"We introduce Jasmine, a production-ready JAX-based",html,selection_command
+16158,16162113,"examples/jasmine.html",1054,59,"We introduce Jasmine, a production-ready JAX-based codebase",html,selection_command
+16159,16162146,"examples/jasmine.html",1054,63,"We introduce Jasmine, a production-ready JAX-based codebase for",html,selection_command
+16160,16162181,"examples/jasmine.html",1054,69,"We introduce Jasmine, a production-ready JAX-based codebase for world",html,selection_command
+16161,16162211,"examples/jasmine.html",1054,78,"We introduce Jasmine, a production-ready JAX-based codebase for world modeling",html,selection_command
+16162,16162245,"examples/jasmine.html",1054,83,"We introduce Jasmine, a production-ready JAX-based codebase for world modeling from",html,selection_command
+16163,16162277,"examples/jasmine.html",1054,93,"We introduce Jasmine, a production-ready JAX-based codebase for world modeling from unlabeled",html,selection_command
+16164,16162311,"examples/jasmine.html",1054,100,"We introduce Jasmine, a production-ready JAX-based codebase for world modeling from unlabeled videos",html,selection_command
+16165,16162344,"examples/jasmine.html",1054,101,"We introduce Jasmine, a production-ready JAX-based codebase for world modeling from unlabeled videos.",html,selection_command
+16166,16162379,"examples/jasmine.html",1054,107,"We introduce Jasmine, a production-ready JAX-based codebase for world modeling from unlabeled videos. Scale",html,selection_command
+16167,16162415,"examples/jasmine.html",1054,112,"We introduce Jasmine, a production-ready JAX-based codebase for world modeling from unlabeled videos. Scale from",html,selection_command
+16168,16162451,"examples/jasmine.html",1054,119,"We introduce Jasmine, a production-ready JAX-based codebase for world modeling from unlabeled videos. Scale from single",html,selection_command
+16169,16162478,"examples/jasmine.html",1054,125,"We introduce Jasmine, a production-ready JAX-based codebase for world modeling from unlabeled videos. Scale from single hosts",html,selection_command
+16170,16162510,"examples/jasmine.html",1054,128,"We introduce Jasmine, a production-ready JAX-based codebase for world modeling from unlabeled videos. Scale from single hosts to",html,selection_command
+16171,16162544,"examples/jasmine.html",1054,137,"We introduce Jasmine, a production-ready JAX-based codebase for world modeling from unlabeled videos. Scale from single hosts to hundreds",html,selection_command
+16172,16162577,"examples/jasmine.html",1054,140,"We introduce Jasmine, a production-ready JAX-based codebase for world modeling from unlabeled videos. Scale from single hosts to hundreds of",html,selection_command
+16173,16162611,"examples/jasmine.html",1054,145,"We introduce Jasmine, a production-ready JAX-based codebase for world modeling from unlabeled videos. Scale from single hosts to hundreds of xPUs",html,selection_command
+16174,16162820,"examples/jasmine.html",1054,152,"We introduce Jasmine, a production-ready JAX-based codebase for world modeling from unlabeled videos. Scale from single hosts to hundreds of xPUs thanks",html,selection_command
+16175,16162981,"examples/jasmine.html",1054,155,"We introduce Jasmine, a production-ready JAX-based codebase for world modeling from unlabeled videos. Scale from single hosts to hundreds of xPUs thanks to",html,selection_command
+16176,16163250,"examples/jasmine.html",1054,156,"We introduce Jasmine, a production-ready JAX-based codebase for world modeling from unlabeled videos. Scale from single hosts to hundreds of xPUs thanks to ",html,selection_command
+16177,16163404,"examples/jasmine.html",1054,157,"We introduce Jasmine, a production-ready JAX-based codebase for world modeling from unlabeled videos. Scale from single hosts to hundreds of xPUs thanks to X",html,selection_command
+16178,16163571,"examples/jasmine.html",1054,158,"We introduce Jasmine, a production-ready JAX-based codebase for world modeling from unlabeled videos. Scale from single hosts to hundreds of xPUs thanks to XL",html,selection_command
+16179,16163706,"examples/jasmine.html",1054,159,"We introduce Jasmine, a production-ready JAX-based codebase for world modeling from unlabeled videos. Scale from single hosts to hundreds of xPUs thanks to XLA",html,selection_command
+16180,16163934,"examples/jasmine.html",1054,160,"We introduce Jasmine, a production-ready JAX-based codebase for world modeling from unlabeled videos. Scale from single hosts to hundreds of xPUs thanks to XLA.",html,selection_command
+16181,16164267,"examples/jasmine.html",1054,0,"",html,selection_command
+16182,16164961,"examples/blog.html",0,0,"",html,tab
+16183,16165713,"examples/blog.html",25819,0,"We introduce Jasmine, a production-ready JAX-based codebase for world modeling from unlabeled videos. Scale from single hosts to hundreds of xPUs thanks to XLA.",html,content
+16184,16165723,"examples/blog.html",25978,0,"",html,selection_command
+16185,16166774,"examples/blog.html",25777,0,"",html,selection_command
+16186,16170781,"TERMINAL",0,0,"cp examples/* dist",,terminal_command
+16187,16170799,"TERMINAL",0,0,"]633;C[1m[7m%[27m[1m[0m \r \r",,terminal_output
+16188,16172392,"TERMINAL",0,0,"npm run dev",,terminal_command
+16189,16172444,"TERMINAL",0,0,"]633;C",,terminal_output
+16190,16172751,"TERMINAL",0,0,"\r\n> distill-template@2.8.0 dev\r\n> rollup -c rollup.config.dev.js -w\r\n\r\n[1G[0K",,terminal_output
+16191,16173041,"TERMINAL",0,0,"c[4mrollup v2.7.3[24m\r\n[36mbundles [1msrc/components.js[22m โ [1mdist/template.v2.js[22m...[39m\r\n",,terminal_output
+16192,16173509,"TERMINAL",0,0,"[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/dist\r\n[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/examples\r\n[1m[33m(!) Circular dependencies[39m[22m\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/array.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/object.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-selection/src/selection/index.js -> node_modules/d3-selection/src/selection/select.js -> node_modules/d3-selection/src/selection/index.js\r\n...and 7 more\r\n[32mcreated [1mdist/template.v2.js[22m in [1m484ms[22m[39m\r\n[36mbundles [1msrc/transforms.js[22m โ [1mdist/transforms.v2.js[22m...[39m\r\n",,terminal_output
+16193,16174050,"TERMINAL",0,0,"[32mcreated [1mdist/transforms.v2.js[22m in [1m538ms[22m[39m\r\n\r\n[2025-08-05 22:55:30] waiting for changes...\r\n",,terminal_output
+16194,16867470,"examples/blog.html",25671,0,"",html,selection_command
+16195,16867631,"examples/blog.html",25543,0,"",html,selection_command
+16196,16868051,"examples/blog.html",25499,0,"",html,selection_command
+16197,16868559,"examples/blog.html",25392,0,"",html,selection_command
+16198,16868701,"examples/blog.html",25410,0,"",html,selection_command
+16199,16868953,"examples/blog.html",25411,0,"",html,selection_command
+16200,16868983,"examples/blog.html",25415,0,"",html,selection_command
+16201,16869020,"examples/blog.html",25420,0,"",html,selection_command
+16202,16869048,"examples/blog.html",25422,0,"",html,selection_command
+16203,16869081,"examples/blog.html",25431,0,"",html,selection_command
+16204,16869115,"examples/blog.html",25434,0,"",html,selection_command
+16205,16869149,"examples/blog.html",25438,0,"",html,selection_command
+16206,16869182,"examples/blog.html",25441,0,"",html,selection_command
+16207,16869327,"examples/blog.html",25443,0,"",html,selection_command
+16208,16869612,"examples/blog.html",25458,0,"",html,selection_command
+16209,16869769,"examples/blog.html",25459,0,"",html,selection_command
+16210,16870126,"examples/blog.html",25459,3,"",html,content
+16211,16870316,"examples/blog.html",25459,0,"g",html,content
+16212,16870318,"examples/blog.html",25460,0,"",html,selection_keyboard
+16213,16870665,"examples/blog.html",25460,0,"i",html,content
+16214,16870667,"examples/blog.html",25461,0,"",html,selection_keyboard
+16215,16871099,"examples/blog.html",25461,0,"f",html,content
+16216,16871102,"examples/blog.html",25462,0,"",html,selection_keyboard
+16217,16871336,"examples/blog.html",25461,0,"",html,selection_command
+16218,16872138,"examples/blog.html",25392,0,"",html,selection_command
+16219,16892108,"examples/blog.html",0,0,"",html,tab
+16220,16893900,"TERMINAL",0,0,"^C[1G[0Kโ [1G[0K[1m[7m%[27m[1m[0m \r \r]633;D;0]633;P;Cwd=/Users/franzsrambical/Documents/pdoom/pdoom.org\r[0m[27m[24m[J]633;Afranzsrambical@MBF6N9WFVKFV pdoom.org % ]633;B[K[?2004h",,terminal_output
+16221,16894039,"TERMINAL",0,0,"npm run dev",,terminal_output
+16222,16894169,"TERMINAL",0,0,"[11Dcp examples/* dist",,terminal_output
+16223,16894348,"TERMINAL",0,0,"cp examples/* dist",,terminal_command
+16224,16894348,"TERMINAL",0,0,"[?2004l\r\r\n]633;E;cp examples/* dist;b07e0e29-8b6e-4188-b95b-fd2ce17f4bb1",,terminal_output
+16225,16894367,"TERMINAL",0,0,"]633;C[1m[7m%[27m[1m[0m \r \r",,terminal_output
+16226,16894916,"TERMINAL",0,0,"npm run dev",,terminal_command
+16227,16894967,"TERMINAL",0,0,"]633;C",,terminal_output
+16228,16895137,"TERMINAL",0,0,"\r\n> distill-template@2.8.0 dev\r\n> rollup -c rollup.config.dev.js -w\r\n\r\n[1G[0K",,terminal_output
+16229,16895347,"TERMINAL",0,0,"c[4mrollup v2.7.3[24m\r\n[36mbundles [1msrc/components.js[22m โ [1mdist/template.v2.js[22m...[39m\r\n",,terminal_output
+16230,16895791,"TERMINAL",0,0,"[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/dist\r\n[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/examples\r\n[1m[33m(!) Circular dependencies[39m[22m\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/array.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/object.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-selection/src/selection/index.js -> node_modules/d3-selection/src/selection/select.js -> node_modules/d3-selection/src/selection/index.js\r\n...and 7 more\r\n[32mcreated [1mdist/template.v2.js[22m in [1m444ms[22m[39m\r\n[36mbundles [1msrc/transforms.js[22m โ [1mdist/transforms.v2.js[22m...[39m\r\n",,terminal_output
+16231,16896274,"TERMINAL",0,0,"[32mcreated [1mdist/transforms.v2.js[22m in [1m482ms[22m[39m\r\n\r\n[2025-08-05 23:07:32] waiting for changes...\r\n",,terminal_output
diff --git a/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-1d1c9d40-4350-4512-82d6-f9fdf36759211756538623431-2025_08_30-08.23.49.356/source.csv b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-1d1c9d40-4350-4512-82d6-f9fdf36759211756538623431-2025_08_30-08.23.49.356/source.csv
new file mode 100644
index 0000000000000000000000000000000000000000..84659cde465567f3c05b8742caeb2f60d84b8bd7
--- /dev/null
+++ b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-1d1c9d40-4350-4512-82d6-f9fdf36759211756538623431-2025_08_30-08.23.49.356/source.csv
@@ -0,0 +1,2272 @@
+Sequence,Time,File,RangeOffset,RangeLength,Text,Language,Type
+1,3,"utils/train_utils.py",0,0,"import jax\nimport optax\nimport operator\n\n\ndef get_lr_schedule(\n lr_schedule: str,\n init_lr: float,\n max_lr: float,\n decay_end: float,\n total_steps: int,\n warmup_steps: int,\n wsd_decay_steps: int,\n) -> optax.Schedule:\n supported_schedules = [""wsd"", ""cos""]\n if lr_schedule == ""cos"":\n assert (\n warmup_steps <= total_steps\n ), ""Warmup steps can't be greater than total steps.""\n return optax.warmup_cosine_decay_schedule(\n init_value=init_lr,\n peak_value=max_lr,\n warmup_steps=warmup_steps,\n decay_steps=total_steps, # Note: decay_steps includes the warmup steps, so we need to pass total value\n end_value=decay_end,\n )\n elif lr_schedule == ""wsd"":\n assert (\n warmup_steps + wsd_decay_steps <= total_steps\n ), ""Warmup and decay period is longer than total steps.""\n schedules = [\n optax.linear_schedule(\n init_value=init_lr, end_value=max_lr, transition_steps=warmup_steps\n ),\n optax.constant_schedule(value=max_lr),\n optax.linear_schedule(\n init_value=max_lr, end_value=decay_end, transition_steps=wsd_decay_steps\n ),\n ]\n boundaries = [warmup_steps, total_steps - wsd_decay_steps]\n return optax.join_schedules(schedules, boundaries)\n else:\n raise ValueError(\n f""Learning rate schedule not supported. 
Please use one of {supported_schedules}""\n )\n\n\ndef _count_component(component_params):\n """"""Count total parameters in a component.""""""\n params_sizes = jax.tree.map(jax.numpy.size, component_params)\n total_parameters = jax.tree.reduce(operator.add, params_sizes)\n return total_parameters\n\n\ndef count_parameters_by_component(params):\n """"""Count parameters for each component of the model.\n\n Args:\n params: Model parameters from nnx.split(model, nnx.Param, ...)\n\n Returns:\n Dictionary with parameter counts for each component\n """"""\n component_names = list(params.keys())\n print(f""Counting all components: {component_names}"")\n\n counts = {}\n total_params = 0\n\n for name in component_names:\n component_params = params[name]\n count = _count_component(component_params)\n counts[name] = count\n total_params += count\n\n counts[""total""] = total_params\n return counts\n\n\ndef bytes_to_gb(num_bytes):\n return num_bytes / (1024**3)\n\n\ndef print_compiled_memory_stats(compiled_stats):\n """"""from: https://github.com/AI-Hypercomputer/maxtext/blob/b18829fbaa48aec7ac350a03e62248e24c6a76b2/MaxText/max_utils.py#L739""""""\n output_gb = bytes_to_gb(compiled_stats.output_size_in_bytes)\n temp_gb = bytes_to_gb(compiled_stats.temp_size_in_bytes)\n argument_gb = bytes_to_gb(compiled_stats.argument_size_in_bytes)\n alias_gb = bytes_to_gb(compiled_stats.alias_size_in_bytes)\n host_temp_gb = bytes_to_gb(compiled_stats.host_temp_size_in_bytes)\n total_gb = output_gb + temp_gb + argument_gb - alias_gb\n print(\n f""Total memory size: {total_gb:.1f} GB, Output size: {output_gb:.1f} GB, Temp size: {temp_gb:.1f} GB, ""\n f""Argument size: {argument_gb:.1f} GB, Host temp size: {host_temp_gb:.1f} GB.""\n )\n\n\ndef print_compiled_cost_analysis(cost_stats):\n flops = float(cost_stats.get(""flops"", 0.0))\n bytes_accessed = float(cost_stats.get(""bytes accessed"", 0.0))\n gb = bytes_to_gb(bytes_accessed) if bytes_accessed else 0.0\n intensity = (flops / 
bytes_accessed) if bytes_accessed else float(""nan"")\n print(\n f""FLOPs: {flops:.3e}, Bytes: {bytes_accessed:.3e} ({gb:.1f} GB), ""\n f""Intensity: {intensity:.1f} FLOPs/byte""\n )\n\n\ndef print_mem_stats(label: str):\n """"""from: https://github.com/AI-Hypercomputer/maxtext/blob/7898576359bacde81be25cb3038e348aac1f943b/MaxText/max_utils.py#L713""""""\n print(f""\nMemstats: {label}:"")\n try:\n for d in jax.local_devices():\n stats = d.memory_stats()\n used = round(stats[""bytes_in_use""] / 2**30, 2)\n limit = round(stats[""bytes_limit""] / 2**30, 2)\n print(f""\tUsing (GB) {used} / {limit} ({used/limit:%}) on {d}"")\n except (RuntimeError, KeyError, TypeError) as ex:\n print(f""\tMemstats unavailable, error: {ex}"")\n",python,tab
+2,26,"tasks",0,0,"",Log,tab
+3,28,"utils/train_utils.py",0,0,"",python,tab
+4,83,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"",Log,tab
+5,1926,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"8:23:49 AM [info] Activating crowd-code\n8:23:49 AM [info] Recording started\n8:23:49 AM [info] Initializing git provider using file system watchers...\n8:23:49 AM [info] Git repository found\n8:23:49 AM [info] Git provider initialized successfully\n8:23:49 AM [info] Initial git state: [object Object]\n",Log,content
+6,262330383,"utils/train_utils.py",0,0,"",python,tab
+7,262360459,"utils/train_utils.py",237,0,"",python,selection_mouse
+8,262360468,"utils/train_utils.py",236,0,"",python,selection_command
+9,262361541,"utils/train_utils.py",4306,0,"",python,selection_command
+10,262363810,"utils/train_utils.py",4306,0,"\n",python,content
+11,262364349,"utils/train_utils.py",4307,0,"d",python,content
+12,262364351,"utils/train_utils.py",4308,0,"",python,selection_keyboard
+13,262364519,"utils/train_utils.py",4308,0,"e",python,content
+14,262364523,"utils/train_utils.py",4309,0,"",python,selection_keyboard
+15,262364638,"utils/train_utils.py",4309,0,"f",python,content
+16,262364639,"utils/train_utils.py",4310,0,"",python,selection_keyboard
+17,262364642,"utils/train_utils.py",4310,0," ",python,content
+18,262364643,"utils/train_utils.py",4311,0,"",python,selection_keyboard
+19,262364847,"utils/train_utils.py",4311,0,"c",python,content
+20,262364849,"utils/train_utils.py",4312,0,"",python,selection_keyboard
+21,262364929,"utils/train_utils.py",4312,0,"a",python,content
+22,262364931,"utils/train_utils.py",4313,0,"",python,selection_keyboard
+23,262365018,"utils/train_utils.py",4313,0,"l",python,content
+24,262365019,"utils/train_utils.py",4314,0,"",python,selection_keyboard
+25,262365054,"utils/train_utils.py",4314,0,"c",python,content
+26,262365056,"utils/train_utils.py",4315,0,"",python,selection_keyboard
+27,262365229,"utils/train_utils.py",4315,0,"u",python,content
+28,262365231,"utils/train_utils.py",4316,0,"",python,selection_keyboard
+29,262365367,"utils/train_utils.py",4316,0,"l",python,content
+30,262365368,"utils/train_utils.py",4317,0,"",python,selection_keyboard
+31,262365454,"utils/train_utils.py",4317,0,"a",python,content
+32,262365455,"utils/train_utils.py",4318,0,"",python,selection_keyboard
+33,262365518,"utils/train_utils.py",4318,0,"t",python,content
+34,262365519,"utils/train_utils.py",4319,0,"",python,selection_keyboard
+35,262365590,"utils/train_utils.py",4319,0,"e",python,content
+36,262365591,"utils/train_utils.py",4320,0,"",python,selection_keyboard
+37,262365807,"utils/train_utils.py",4320,0,"_",python,content
+38,262365808,"utils/train_utils.py",4321,0,"",python,selection_keyboard
+39,262367598,"utils/train_utils.py",4321,0,"t",python,content
+40,262367600,"utils/train_utils.py",4322,0,"",python,selection_keyboard
+41,262367764,"utils/train_utils.py",4322,0,"f",python,content
+42,262367765,"utils/train_utils.py",4323,0,"",python,selection_keyboard
+43,262367847,"utils/train_utils.py",4323,0,"l",python,content
+44,262367848,"utils/train_utils.py",4324,0,"",python,selection_keyboard
+45,262367997,"utils/train_utils.py",4324,0,"o",python,content
+46,262367998,"utils/train_utils.py",4325,0,"",python,selection_keyboard
+47,262368083,"utils/train_utils.py",4325,0,"p",python,content
+48,262368084,"utils/train_utils.py",4326,0,"",python,selection_keyboard
+49,262368118,"utils/train_utils.py",4326,0,"s",python,content
+50,262368119,"utils/train_utils.py",4327,0,"",python,selection_keyboard
+51,262369406,"utils/train_utils.py",4327,0,"_",python,content
+52,262369407,"utils/train_utils.py",4328,0,"",python,selection_keyboard
+53,262369598,"utils/train_utils.py",4328,0,"t",python,content
+54,262369600,"utils/train_utils.py",4329,0,"",python,selection_keyboard
+55,262369757,"utils/train_utils.py",4329,0,"r",python,content
+56,262369758,"utils/train_utils.py",4330,0,"",python,selection_keyboard
+57,262369805,"utils/train_utils.py",4330,0,"a",python,content
+58,262369806,"utils/train_utils.py",4331,0,"",python,selection_keyboard
+59,262369863,"utils/train_utils.py",4331,0,"i",python,content
+60,262369866,"utils/train_utils.py",4332,0,"",python,selection_keyboard
+61,262369924,"utils/train_utils.py",4332,0,"n",python,content
+62,262369925,"utils/train_utils.py",4333,0,"",python,selection_keyboard
+63,262370049,"utils/train_utils.py",4333,0,"i",python,content
+64,262370050,"utils/train_utils.py",4334,0,"",python,selection_keyboard
+65,262370079,"utils/train_utils.py",4334,0,"n",python,content
+66,262370082,"utils/train_utils.py",4335,0,"",python,selection_keyboard
+67,262370161,"utils/train_utils.py",4335,0,"g",python,content
+68,262370161,"utils/train_utils.py",4336,0,"",python,selection_keyboard
+69,262370390,"utils/train_utils.py",4336,0,"_",python,content
+70,262370391,"utils/train_utils.py",4337,0,"",python,selection_keyboard
+71,262371664,"utils/train_utils.py",4337,0,"p",python,content
+72,262371666,"utils/train_utils.py",4338,0,"",python,selection_keyboard
+73,262371722,"utils/train_utils.py",4338,0,"e",python,content
+74,262371723,"utils/train_utils.py",4339,0,"",python,selection_keyboard
+75,262371806,"utils/train_utils.py",4339,0,"r",python,content
+76,262371807,"utils/train_utils.py",4340,0,"",python,selection_keyboard
+77,262371981,"utils/train_utils.py",4340,0,"_",python,content
+78,262371982,"utils/train_utils.py",4341,0,"",python,selection_keyboard
+79,262372130,"utils/train_utils.py",4341,0,"d",python,content
+80,262372131,"utils/train_utils.py",4342,0,"",python,selection_keyboard
+81,262372324,"utils/train_utils.py",4342,0,"e",python,content
+82,262372324,"utils/train_utils.py",4343,0,"",python,selection_keyboard
+83,262372423,"utils/train_utils.py",4343,0,"v",python,content
+84,262372424,"utils/train_utils.py",4344,0,"",python,selection_keyboard
+85,262372540,"utils/train_utils.py",4344,0,"i",python,content
+86,262372541,"utils/train_utils.py",4345,0,"",python,selection_keyboard
+87,262372616,"utils/train_utils.py",4345,0,"c",python,content
+88,262372617,"utils/train_utils.py",4346,0,"",python,selection_keyboard
+89,262372703,"utils/train_utils.py",4346,0,"e",python,content
+90,262372704,"utils/train_utils.py",4347,0,"",python,selection_keyboard
+91,262373403,"utils/train_utils.py",4347,0,"()",python,content
+92,262373405,"utils/train_utils.py",4348,0,"",python,selection_keyboard
+93,262377508,"utils/train_utils.py",4348,0,"a",python,content
+94,262377510,"utils/train_utils.py",4349,0,"",python,selection_keyboard
+95,262377607,"utils/train_utils.py",4349,0,"r",python,content
+96,262377609,"utils/train_utils.py",4350,0,"",python,selection_keyboard
+97,262377874,"utils/train_utils.py",4350,0,"g",python,content
+98,262377875,"utils/train_utils.py",4351,0,"",python,selection_keyboard
+99,262377913,"utils/train_utils.py",4351,0,"s",python,content
+100,262377915,"utils/train_utils.py",4352,0,"",python,selection_keyboard
+101,262378250,"utils/train_utils.py",4352,0,":",python,content
+102,262378251,"utils/train_utils.py",4353,0,"",python,selection_keyboard
+103,262378339,"utils/train_utils.py",4353,0," ",python,content
+104,262378340,"utils/train_utils.py",4354,0,"",python,selection_keyboard
+105,262378520,"utils/train_utils.py",4354,0,"A",python,content
+106,262378521,"utils/train_utils.py",4355,0,"",python,selection_keyboard
+107,262378804,"utils/train_utils.py",4355,0,"r",python,content
+108,262378805,"utils/train_utils.py",4356,0,"",python,selection_keyboard
+109,262378980,"utils/train_utils.py",4356,0,"g",python,content
+110,262378981,"utils/train_utils.py",4357,0,"",python,selection_keyboard
+111,262379040,"utils/train_utils.py",4357,0,"s",python,content
+112,262379041,"utils/train_utils.py",4358,0,"",python,selection_keyboard
+113,262382068,"utils/train_utils.py",4357,0,"",python,selection_command
+114,262386340,"utils/train_utils.py",4359,0,"",python,selection_command
+115,262386420,"utils/train_utils.py",4359,0,":",python,content
+116,262386421,"utils/train_utils.py",4360,0,"",python,selection_keyboard
+117,262387103,"utils/train_utils.py",4360,0," ",python,content
+118,262387104,"utils/train_utils.py",4361,0,"",python,selection_keyboard
+119,262387229,"utils/train_utils.py",4360,0,"",python,selection_command
+120,262388018,"utils/train_utils.py",4361,0,"",python,selection_command
+121,262388151,"utils/train_utils.py",4360,1,"",python,content
+122,262388298,"utils/train_utils.py",4359,0,"",python,selection_command
+123,262389441,"utils/train_utils.py",0,0,"",python,selection_command
+124,262390820,"utils/train_utils.py",11,0,"",python,selection_command
+125,262390969,"utils/train_utils.py",24,0,"",python,selection_command
+126,262391102,"utils/train_utils.py",40,0,"",python,selection_command
+127,262391768,"utils/train_utils.py",24,0,"",python,selection_command
+128,262392169,"utils/train_utils.py",4307,0,"",python,selection_command
+129,262395354,"train_tokenizer.py",0,0,"import os\n\nos.environ.setdefault(""XLA_PYTHON_CLIENT_MEM_FRACTION"", ""0.98"")\n\nfrom dataclasses import dataclass, field\nfrom typing import cast, Optional\n\nimport einops\nimport itertools\nfrom jax.sharding import Mesh, PartitionSpec, NamedSharding\nfrom jax.experimental.mesh_utils import create_device_mesh\nimport optax\nimport orbax.checkpoint as ocp\nimport numpy as np\nimport dm_pix as pix\nimport jax\nimport jax.numpy as jnp\nimport tyro\nimport wandb\nimport grain\nimport flax.nnx as nnx\n\nfrom models.tokenizer import TokenizerVQVAE\nfrom utils.dataloader import get_dataloader\nfrom utils.train_utils import (\n get_lr_schedule,\n count_parameters_by_component,\n print_mem_stats,\n print_compiled_memory_stats,\n print_compiled_cost_analysis,\n)\n\n\n@dataclass\nclass Args:\n # Experiment\n num_steps: int = 300_000\n seed: int = 0\n seq_len: int = 16\n image_channels: int = 3\n image_height: int = 90\n image_width: int = 160\n data_dir: str = """"\n save_ckpt: bool = False\n restore_ckpt: bool = False\n # Optimization\n vq_beta: float = 0.25\n batch_size: int = 48\n init_lr: float = 0.0\n max_lr: float = 3e-4\n decay_end: float = 0.0\n wsd_decay_steps: int = (\n 20000 # NOTE: wsd_decay_steps will only be used when using a wsd-schedule\n )\n lr_schedule: str = ""wsd"" # supported options: wsd, cos\n warmup_steps: int = 10000\n # Tokenizer\n model_dim: int = 512\n ffn_dim: int = 2048\n latent_dim: int = 32\n num_latents: int = 1024\n patch_size: int = 4\n num_blocks: int = 4\n num_heads: int = 8\n dropout: float = 0.0\n codebook_dropout: float = 0.01\n param_dtype = jnp.float32\n dtype = jnp.bfloat16\n # Logging\n log: bool = False\n entity: str = """"\n project: str = """"\n name: str = ""train_tokenizer""\n tags: list[str] = field(default_factory=lambda: [""tokenizer""])\n log_interval: int = 5\n log_image_interval: int = 250\n ckpt_dir: str = """"\n log_checkpoint_interval: int = 10000\n 
log_checkpoint_keep_period: int = 20000\n log_gradients: bool = False\n wandb_id: str = """"\n use_flash_attention: bool = True\n\n\ndef build_model(args: Args, rng: jax.Array) -> tuple[TokenizerVQVAE, jax.Array]:\n rng, _rng = jax.random.split(rng)\n rngs = nnx.Rngs(_rng)\n return (\n TokenizerVQVAE(\n in_dim=args.image_channels,\n model_dim=args.model_dim,\n ffn_dim=args.ffn_dim,\n latent_dim=args.latent_dim,\n num_latents=args.num_latents,\n patch_size=args.patch_size,\n num_blocks=args.num_blocks,\n num_heads=args.num_heads,\n dropout=args.dropout,\n codebook_dropout=args.codebook_dropout,\n param_dtype=args.param_dtype,\n dtype=args.dtype,\n use_flash_attention=args.use_flash_attention,\n rngs=rngs,\n ),\n rng,\n )\n\n\ndef build_optimizer(\n model: TokenizerVQVAE, args: Args\n) -> tuple[nnx.Optimizer, optax.Schedule]:\n lr_schedule = get_lr_schedule(\n args.lr_schedule,\n args.init_lr,\n args.max_lr,\n args.decay_end,\n args.num_steps,\n args.warmup_steps,\n args.wsd_decay_steps,\n )\n tx = optax.adamw(\n learning_rate=lr_schedule,\n b1=0.9,\n b2=0.9,\n weight_decay=1e-4,\n mu_dtype=args.param_dtype, # moments in full precision\n )\n optimizer = nnx.Optimizer(model, tx)\n return optimizer, lr_schedule\n\n\ndef build_mesh_and_sharding(\n num_devices: int,\n) -> tuple[Mesh, NamedSharding, NamedSharding]:\n device_mesh_arr = create_device_mesh((num_devices,))\n mesh = Mesh(devices=device_mesh_arr, axis_names=(""data"",))\n replicated_sharding = NamedSharding(mesh, PartitionSpec())\n videos_sharding = NamedSharding(mesh, PartitionSpec(""data"", None, None, None, None))\n return mesh, replicated_sharding, videos_sharding\n\n\ndef shard_optimizer_states(\n optimizer: nnx.Optimizer, replicated_sharding: NamedSharding\n) -> None:\n model_state = nnx.state(optimizer.model)\n model_sharded_state = jax.lax.with_sharding_constraint(\n model_state, replicated_sharding\n )\n nnx.update(optimizer.model, model_sharded_state)\n optimizer_state = nnx.state(optimizer, 
nnx.optimizer.OptState)\n optimizer_sharded_state = jax.lax.with_sharding_constraint(\n optimizer_state, replicated_sharding\n )\n nnx.update(optimizer, optimizer_sharded_state)\n\n\ndef build_dataloader(args: Args) -> grain.DataLoaderIterator:\n image_shape = (args.image_height, args.image_width, args.image_channels)\n array_record_files = [\n os.path.join(args.data_dir, x)\n for x in os.listdir(args.data_dir)\n if x.endswith("".array_record"")\n ]\n grain_dataloader = get_dataloader(\n array_record_files,\n args.seq_len,\n # NOTE: We deliberately pass the global batch size\n # The dataloader shards the dataset across all processes\n args.batch_size,\n *image_shape,\n num_workers=8,\n prefetch_buffer_size=1,\n seed=args.seed,\n )\n initial_state = grain_dataloader._create_initial_state()\n grain_iterator = grain.DataLoaderIterator(grain_dataloader, initial_state)\n return grain_iterator\n\n\ndef build_checkpoint_manager(args: Args) -> ocp.CheckpointManager:\n handler_registry = ocp.handlers.DefaultCheckpointHandlerRegistry()\n handler_registry.add(\n ""model_state"", ocp.args.PyTreeSave, ocp.handlers.PyTreeCheckpointHandler\n )\n handler_registry.add(\n ""model_state"", ocp.args.PyTreeRestore, ocp.handlers.PyTreeCheckpointHandler\n )\n handler_registry.add(\n ""dataloader_state"",\n grain.checkpoint.CheckpointSave,\n cast(ocp.handlers.CheckpointHandler, grain.checkpoint.CheckpointHandler),\n )\n handler_registry.add(\n ""dataloader_state"",\n grain.checkpoint.CheckpointRestore,\n cast(ocp.handlers.CheckpointHandler, grain.checkpoint.CheckpointHandler),\n )\n checkpoint_options = ocp.CheckpointManagerOptions(\n save_interval_steps=args.log_checkpoint_interval,\n max_to_keep=3,\n keep_period=args.log_checkpoint_keep_period,\n step_format_fixed_length=6,\n cleanup_tmp_directories=True,\n )\n checkpoint_manager = ocp.CheckpointManager(\n args.ckpt_dir,\n options=checkpoint_options,\n handler_registry=handler_registry,\n )\n return checkpoint_manager\n\n\ndef 
restore_checkpoint_if_needed(\n args: Args,\n checkpoint_manager: ocp.CheckpointManager,\n optimizer: nnx.Optimizer,\n grain_iterator: grain.DataLoaderIterator,\n restore_step: Optional[int] = None,\n) -> tuple[int, nnx.Optimizer, grain.DataLoaderIterator]:\n step = 0\n if restore_step is None:\n restore_step = checkpoint_manager.latest_step()\n if args.restore_ckpt:\n abstract_optimizer = nnx.eval_shape(lambda: optimizer)\n abstract_optimizer_state = nnx.state(abstract_optimizer)\n restored = checkpoint_manager.restore(\n restore_step,\n args=ocp.args.Composite(\n model_state=ocp.args.PyTreeRestore(abstract_optimizer_state), # type: ignore\n dataloader_state=grain.checkpoint.CheckpointRestore(grain_iterator), # type: ignore\n ),\n )\n restored_optimizer_state = restored[""model_state""]\n nnx.update(optimizer, restored_optimizer_state)\n grain_iterator = restored[""dataloader_state""]\n step = restore_step or 0\n print(f""Restored dataloader and model state from step {step}"")\n return step, optimizer, grain_iterator\n\n\ndef main(args: Args) -> None:\n jax.distributed.initialize()\n num_devices = jax.device_count()\n if num_devices == 0:\n raise ValueError(""No JAX devices found."")\n print(f""Running on {num_devices} devices."")\n\n if args.batch_size % num_devices != 0:\n raise ValueError(\n f""Global batch size {args.batch_size} must be divisible by ""\n f""number of devices {num_devices}.""\n )\n\n rng = jax.random.key(args.seed)\n\n # --- Initialize model ---\n tokenizer, rng = build_model(args, rng)\n\n _, params, _ = nnx.split(tokenizer, nnx.Param, ...)\n param_counts = count_parameters_by_component(params)\n\n if args.log and jax.process_index() == 0:\n wandb_init_kwargs = {\n ""entity"": args.entity,\n ""project"": args.project,\n ""name"": args.name,\n ""tags"": args.tags,\n ""group"": ""debug"",\n ""config"": args,\n }\n\n if args.wandb_id:\n wandb_init_kwargs.update(\n {\n ""id"": args.wandb_id,\n ""resume"": ""allow"",\n }\n )\n 
wandb.init(**wandb_init_kwargs)\n\n wandb.config.update({""model_param_count"": param_counts})\n\n print(""Parameter counts:"")\n print(param_counts)\n\n # --- Initialize optimizer ---\n optimizer, lr_schedule = build_optimizer(tokenizer, args)\n del tokenizer\n\n # FIXME: switch to create_hybrid_device_mesh for runs spanning multiple nodes\n mesh, replicated_sharding, videos_sharding = build_mesh_and_sharding(num_devices)\n\n shard_optimizer_states(optimizer, replicated_sharding)\n\n # --- Initialize checkpoint manager ---\n checkpoint_manager = build_checkpoint_manager(args)\n\n # --- Create DataLoaderIterator from dataloader ---\n grain_iterator = build_dataloader(args)\n\n # --- Restore checkpoint ---\n step, optimizer, grain_iterator = restore_checkpoint_if_needed(\n args, checkpoint_manager, optimizer, grain_iterator\n )\n\n # --- Define loss and train step (close over args) ---\n def tokenizer_loss_fn(\n model: TokenizerVQVAE, inputs: dict\n ) -> tuple[jax.Array, tuple[jax.Array, dict]]:\n gt = jnp.asarray(inputs[""videos""], dtype=jnp.float32) / 255.0\n inputs[""videos""] = gt.astype(args.dtype)\n model.train()\n outputs = model(inputs, training=True)\n outputs[""recon""] = outputs[""recon""].astype(jnp.float32)\n mse = jnp.square(gt - outputs[""recon""]).mean()\n q_loss = jnp.square(jax.lax.stop_gradient(outputs[""emb""]) - outputs[""z""]).mean()\n commitment_loss = jnp.square(\n outputs[""emb""] - jax.lax.stop_gradient(outputs[""z""])\n ).mean()\n loss = mse + q_loss + args.vq_beta * commitment_loss\n\n gt_clipped = gt.clip(0, 1).reshape(-1, *gt.shape[2:])\n recon = outputs[""recon""].clip(0, 1).reshape(-1, *outputs[""recon""].shape[2:])\n psnr = jnp.asarray(pix.psnr(gt_clipped, recon)).mean()\n ssim = jnp.asarray(pix.ssim(gt_clipped, recon)).mean()\n _, index_counts = jnp.unique_counts(\n jnp.ravel(outputs[""indices""]), size=args.num_latents, fill_value=0\n )\n codebook_usage = (index_counts != 0).mean()\n metrics = dict(\n loss=loss,\n mse=mse,\n 
q_loss=q_loss,\n commitment_loss=commitment_loss,\n psnr=psnr,\n ssim=ssim,\n codebook_usage=codebook_usage,\n )\n return loss, (outputs[""recon""], metrics)\n\n @nnx.jit(donate_argnums=0)\n def train_step(\n optimizer: nnx.Optimizer, inputs: dict\n ) -> tuple[jax.Array, jax.Array, dict]:\n def loss_fn(model: TokenizerVQVAE) -> tuple[jax.Array, tuple[jax.Array, dict]]:\n return tokenizer_loss_fn(model, inputs)\n\n (loss, (recon, metrics)), grads = nnx.value_and_grad(loss_fn, has_aux=True)(\n optimizer.model\n )\n optimizer.update(grads)\n if args.log_gradients:\n metrics[""encoder_gradients_std/""] = jax.tree.map(\n lambda x: x.std(), grads[""params""][""encoder""]\n )\n metrics[""vq_gradients_std/""] = jax.tree.map(\n lambda x: x.std(), grads[""params""][""vq""]\n )\n metrics[""decoder_gradients_std/""] = jax.tree.map(\n lambda x: x.std(), grads[""params""][""decoder""]\n )\n return loss, recon, metrics\n\n # --- TRAIN LOOP ---\n dataloader = (\n jax.make_array_from_process_local_data(videos_sharding, elem)\n for elem in grain_iterator\n )\n if jax.process_index() == 0:\n first_videos = next(dataloader)\n sample_inputs = dict(videos=first_videos)\n compiled = train_step.lower(optimizer, sample_inputs).compile()\n print_compiled_memory_stats(compiled.memory_analysis())\n print_compiled_cost_analysis(compiled.cost_analysis())\n # Do not skip the first batch during training\n dataloader = itertools.chain([first_videos], dataloader)\n print(f""Starting training from step {step}..."")\n first_step = step\n while step < args.num_steps:\n for videos in dataloader:\n # --- Train step ---\n inputs = dict(videos=videos)\n loss, recon, metrics = train_step(optimizer, inputs)\n if step == first_step:\n print_mem_stats(""After params initialized"")\n metrics[""lr""] = lr_schedule(step)\n print(f""Step {step}, loss: {loss}"")\n step += 1\n\n # --- Logging ---\n if args.log:\n if step % args.log_interval == 0 and jax.process_index() == 0:\n wandb.log(\n {\n ""loss"": loss,\n 
""step"": step,\n **metrics,\n }\n )\n if step % args.log_image_interval == 0:\n gt_seq = inputs[""videos""][0].astype(jnp.float32) / 255.0\n recon_seq = recon[0].clip(0, 1)\n comparison_seq = jnp.concatenate((gt_seq, recon_seq), axis=1)\n comparison_seq = einops.rearrange(\n comparison_seq * 255, ""t h w c -> h (t w) c""\n )\n # NOTE: Process-dependent control flow deliberately happens\n # after indexing operation since it must not contain code\n # sections that lead to cross-accelerator communication.\n if jax.process_index() == 0:\n log_images = dict(\n image=wandb.Image(np.asarray(gt_seq[0])),\n recon=wandb.Image(np.asarray(recon_seq[0])),\n true_vs_recon=wandb.Image(\n np.asarray(comparison_seq.astype(np.uint8))\n ),\n )\n wandb.log(log_images)\n # --- Checkpointing ---\n if args.save_ckpt and step % args.log_checkpoint_interval == 0:\n optimizer_state = nnx.state(optimizer)\n checkpoint_manager.save(\n step,\n args=ocp.args.Composite(\n model_state=ocp.args.PyTreeSave(optimizer_state), # type: ignore\n dataloader_state=grain.checkpoint.CheckpointSave( # type: ignore\n grain_iterator # type: ignore\n ),\n ),\n )\n print(f""Saved checkpoint at step {step}"")\n if step >= args.num_steps:\n break\n\n checkpoint_manager.close()\n\n\nif __name__ == ""__main__"":\n args = tyro.cli(Args)\n main(args)\n",python,tab
+130,262395746,"train_tokenizer.py",0,0,"",python,selection_command
+131,262396547,"train_tokenizer.py",768,0,"",python,selection_command
+132,262400869,"utils/train_utils.py",0,0,"",python,tab
+133,262401951,"utils/train_utils.py",4311,0,"",python,selection_command
+134,262402201,"utils/train_utils.py",4347,0,"",python,selection_command
+135,262402230,"utils/train_utils.py",4348,0,"",python,selection_command
+136,262402263,"utils/train_utils.py",4352,0,"",python,selection_command
+137,262402296,"utils/train_utils.py",4354,0,"",python,selection_command
+138,262402330,"utils/train_utils.py",4358,0,"",python,selection_command
+139,262402364,"utils/train_utils.py",4359,0,"",python,selection_command
+140,262402758,"utils/train_utils.py",4358,0,"",python,selection_command
+141,262402899,"utils/train_utils.py",4354,0,"",python,selection_command
+142,262403888,"utils/train_utils.py",4353,0,"",python,selection_command
+143,262404183,"utils/train_utils.py",4352,0,"",python,selection_command
+144,262404271,"utils/train_utils.py",4352,1,":",python,selection_command
+145,262404337,"utils/train_utils.py",4352,6,": Args",python,selection_command
+146,262404739,"utils/train_utils.py",4352,6,"",python,content
+147,262405885,"utils/train_utils.py",4307,0,"",python,selection_command
+148,262408438,"utils/train_utils.py",4311,0,"",python,selection_command
+149,262408690,"utils/train_utils.py",4347,0,"",python,selection_command
+150,262408727,"utils/train_utils.py",4348,0,"",python,selection_command
+151,262408750,"utils/train_utils.py",4352,0,"",python,selection_command
+152,262408784,"utils/train_utils.py",4353,0,"",python,selection_command
+153,262409202,"utils/train_utils.py",4352,0,"",python,selection_command
+154,262409365,"utils/train_utils.py",4348,0,"",python,selection_command
+155,262411024,"utils/train_utils.py",0,0,"",python,selection_command
+156,262411940,"utils/train_utils.py",11,0,"",python,selection_command
+157,262412090,"utils/train_utils.py",24,0,"",python,selection_command
+158,262412444,"utils/train_utils.py",39,0,"\n",python,content
+159,262412704,"utils/train_utils.py",40,0,"\n",python,content
+160,262413330,"utils/train_utils.py",41,0,"i",python,content
+161,262413332,"utils/train_utils.py",42,0,"",python,selection_keyboard
+162,262413413,"utils/train_utils.py",42,0,"m",python,content
+163,262413415,"utils/train_utils.py",43,0,"",python,selection_keyboard
+164,262413495,"utils/train_utils.py",43,0,"p",python,content
+165,262413496,"utils/train_utils.py",44,0,"",python,selection_keyboard
+166,262413705,"utils/train_utils.py",44,0,"r",python,content
+167,262413706,"utils/train_utils.py",45,0,"",python,selection_keyboard
+168,262413745,"utils/train_utils.py",45,0,"o",python,content
+169,262413746,"utils/train_utils.py",46,0,"",python,selection_keyboard
+170,262413814,"utils/train_utils.py",46,0,"t",python,content
+171,262413817,"utils/train_utils.py",47,0,"",python,selection_keyboard
+172,262414241,"utils/train_utils.py",46,1,"",python,content
+173,262414380,"utils/train_utils.py",45,1,"",python,content
+174,262414520,"utils/train_utils.py",44,1,"",python,content
+175,262414699,"utils/train_utils.py",44,0,"o",python,content
+176,262414700,"utils/train_utils.py",45,0,"",python,selection_keyboard
+177,262414783,"utils/train_utils.py",45,0,"r",python,content
+178,262414784,"utils/train_utils.py",46,0,"",python,selection_keyboard
+179,262414845,"utils/train_utils.py",46,0,"t",python,content
+180,262414846,"utils/train_utils.py",47,0,"",python,selection_keyboard
+181,262414912,"utils/train_utils.py",47,0," ",python,content
+182,262414914,"utils/train_utils.py",48,0,"",python,selection_keyboard
+183,262415056,"utils/train_utils.py",48,0,"t",python,content
+184,262415057,"utils/train_utils.py",49,0,"",python,selection_keyboard
+185,262415289,"utils/train_utils.py",49,0,"r",python,content
+186,262415290,"utils/train_utils.py",50,0,"",python,selection_keyboard
+187,262415414,"utils/train_utils.py",50,0,"a",python,content
+188,262415416,"utils/train_utils.py",51,0,"",python,selection_keyboard
+189,262415549,"utils/train_utils.py",51,0,"i",python,content
+190,262415550,"utils/train_utils.py",52,0,"",python,selection_keyboard
+191,262415594,"utils/train_utils.py",52,0,"n",python,content
+192,262415596,"utils/train_utils.py",53,0,"",python,selection_keyboard
+193,262415785,"utils/train_utils.py",53,0,"_",python,content
+194,262415787,"utils/train_utils.py",54,0,"",python,selection_keyboard
+195,262416035,"utils/train_utils.py",54,0,"d",python,content
+196,262416037,"utils/train_utils.py",55,0,"",python,selection_keyboard
+197,262416137,"utils/train_utils.py",55,0,"y",python,content
+198,262416139,"utils/train_utils.py",56,0,"",python,selection_keyboard
+199,262416188,"utils/train_utils.py",56,0,"n",python,content
+200,262416189,"utils/train_utils.py",57,0,"",python,selection_keyboard
+201,262416267,"utils/train_utils.py",57,0,"a",python,content
+202,262416268,"utils/train_utils.py",58,0,"",python,selection_keyboard
+203,262416415,"utils/train_utils.py",58,0,"m",python,content
+204,262416418,"utils/train_utils.py",59,0,"",python,selection_keyboard
+205,262416474,"utils/train_utils.py",59,0,"i",python,content
+206,262416476,"utils/train_utils.py",60,0,"",python,selection_keyboard
+207,262416544,"utils/train_utils.py",60,0,"c",python,content
+208,262416546,"utils/train_utils.py",61,0,"",python,selection_keyboard
+209,262416594,"utils/train_utils.py",61,0,"s",python,content
+210,262416595,"utils/train_utils.py",62,0,"",python,selection_keyboard
+211,262417570,"utils/train_utils.py",62,0,"a",python,content
+212,262417574,"utils/train_utils.py",63,0,"",python,selection_keyboard
+213,262417678,"utils/train_utils.py",63,0,"r",python,content
+214,262417679,"utils/train_utils.py",64,0,"",python,selection_keyboard
+215,262417989,"utils/train_utils.py",63,1,"",python,content
+216,262418131,"utils/train_utils.py",62,1,"",python,content
+217,262419200,"utils/train_utils.py",48,14,"",python,content
+218,262419383,"utils/train_utils.py",41,7,"",python,content
+219,262419603,"utils/train_utils.py",41,0,"f",python,content
+220,262419604,"utils/train_utils.py",42,0,"",python,selection_keyboard
+221,262419789,"utils/train_utils.py",42,0,"r",python,content
+222,262419790,"utils/train_utils.py",43,0,"",python,selection_keyboard
+223,262419822,"utils/train_utils.py",43,0,"o",python,content
+224,262419823,"utils/train_utils.py",44,0,"",python,selection_keyboard
+225,262419896,"utils/train_utils.py",44,0,"m",python,content
+226,262419898,"utils/train_utils.py",45,0,"",python,selection_keyboard
+227,262419984,"utils/train_utils.py",45,0," ",python,content
+228,262419985,"utils/train_utils.py",46,0,"",python,selection_keyboard
+229,262420061,"utils/train_utils.py",46,0,"t",python,content
+230,262420062,"utils/train_utils.py",47,0,"",python,selection_keyboard
+231,262420196,"utils/train_utils.py",47,0,"r",python,content
+232,262420198,"utils/train_utils.py",48,0,"",python,selection_keyboard
+233,262420278,"utils/train_utils.py",48,0,"a",python,content
+234,262420279,"utils/train_utils.py",49,0,"",python,selection_keyboard
+235,262420313,"utils/train_utils.py",49,0,"i",python,content
+236,262420318,"utils/train_utils.py",50,0,"",python,selection_keyboard
+237,262420355,"utils/train_utils.py",50,0,"n",python,content
+238,262420357,"utils/train_utils.py",51,0,"",python,selection_keyboard
+239,262420546,"utils/train_utils.py",51,0,"_",python,content
+240,262420548,"utils/train_utils.py",52,0,"",python,selection_keyboard
+241,262420735,"utils/train_utils.py",52,0,"d",python,content
+242,262420737,"utils/train_utils.py",53,0,"",python,selection_keyboard
+243,262420796,"utils/train_utils.py",53,0,"y",python,content
+244,262420798,"utils/train_utils.py",54,0,"",python,selection_keyboard
+245,262420846,"utils/train_utils.py",54,0,"n",python,content
+246,262420848,"utils/train_utils.py",55,0,"",python,selection_keyboard
+247,262420907,"utils/train_utils.py",55,0,"a",python,content
+248,262420909,"utils/train_utils.py",56,0,"",python,selection_keyboard
+249,262421035,"utils/train_utils.py",56,0,"m",python,content
+250,262421036,"utils/train_utils.py",57,0,"",python,selection_keyboard
+251,262421087,"utils/train_utils.py",57,0,"i",python,content
+252,262421088,"utils/train_utils.py",58,0,"",python,selection_keyboard
+253,262421115,"utils/train_utils.py",58,0,"c",python,content
+254,262421117,"utils/train_utils.py",59,0,"",python,selection_keyboard
+255,262421170,"utils/train_utils.py",59,0,"s",python,content
+256,262421172,"utils/train_utils.py",60,0,"",python,selection_keyboard
+257,262421243,"utils/train_utils.py",60,0," ",python,content
+258,262421245,"utils/train_utils.py",61,0,"",python,selection_keyboard
+259,262421413,"utils/train_utils.py",61,0,"i",python,content
+260,262421415,"utils/train_utils.py",62,0,"",python,selection_keyboard
+261,262421489,"utils/train_utils.py",62,0,"m",python,content
+262,262421491,"utils/train_utils.py",63,0,"",python,selection_keyboard
+263,262421517,"utils/train_utils.py",63,0,"p",python,content
+264,262421518,"utils/train_utils.py",64,0,"",python,selection_keyboard
+265,262421613,"utils/train_utils.py",64,0,"o",python,content
+266,262421615,"utils/train_utils.py",65,0,"",python,selection_keyboard
+267,262421690,"utils/train_utils.py",65,0,"r",python,content
+268,262421692,"utils/train_utils.py",66,0,"",python,selection_keyboard
+269,262421783,"utils/train_utils.py",66,0,"t",python,content
+270,262421784,"utils/train_utils.py",67,0,"",python,selection_keyboard
+271,262421827,"utils/train_utils.py",67,0," ",python,content
+272,262421828,"utils/train_utils.py",68,0,"",python,selection_keyboard
+273,262422073,"utils/train_utils.py",68,0,"a",python,content
+274,262422074,"utils/train_utils.py",69,0,"",python,selection_keyboard
+275,262422177,"utils/train_utils.py",69,0,"r",python,content
+276,262422178,"utils/train_utils.py",70,0,"",python,selection_keyboard
+277,262422336,"utils/train_utils.py",70,0,"g",python,content
+278,262422338,"utils/train_utils.py",71,0,"",python,selection_keyboard
+279,262422450,"utils/train_utils.py",71,0,"s",python,content
+280,262422451,"utils/train_utils.py",72,0,"",python,selection_keyboard
+281,262423208,"utils/train_utils.py",68,4,"Args",python,content
+282,262423400,"utils/train_utils.py",71,0,"",python,selection_command
+283,262424153,"utils/train_utils.py",1922,0,"",python,selection_command
+284,262487404,"utils/train_utils.py",1918,0,"",python,selection_command
+285,262487525,"utils/train_utils.py",1922,0,"",python,selection_command
+286,262497337,"utils/train_utils.py",1918,0,"",python,selection_command
+287,262497428,"utils/train_utils.py",1922,0,"",python,selection_command
+288,262497719,"utils/train_utils.py",1918,0,"",python,selection_command
+289,262497777,"utils/train_utils.py",1922,0,"",python,selection_command
+290,262497901,"utils/train_utils.py",1918,0,"",python,selection_command
+291,262497988,"utils/train_utils.py",1922,0,"",python,selection_command
+292,262498094,"utils/train_utils.py",1918,0,"",python,selection_command
+293,262498688,"utils/train_utils.py",1917,0,"",python,selection_command
+294,262498843,"utils/train_utils.py",1860,0,"",python,selection_command
+295,262498980,"utils/train_utils.py",1817,0,"",python,selection_command
+296,262499247,"utils/train_utils.py",1821,0,"",python,selection_command
+297,262500420,"utils/train_utils.py",1864,0,"",python,selection_command
+298,262500572,"utils/train_utils.py",1917,0,"",python,selection_command
+299,262500753,"utils/train_utils.py",1922,0,"",python,selection_command
+300,262500915,"utils/train_utils.py",1918,0,"",python,selection_command
+301,262501035,"utils/train_utils.py",1922,0,"",python,selection_command
+302,262507687,"utils/train_utils.py",4340,0,"",python,selection_command
+303,262508536,"utils/train_utils.py",4339,0,"",python,selection_command
+304,262508786,"utils/train_utils.py",4285,0,"",python,selection_command
+305,262508818,"utils/train_utils.py",4231,0,"",python,selection_command
+306,262509071,"utils/train_utils.py",4285,0,"",python,selection_command
+307,262509318,"utils/train_utils.py",4339,0,"",python,selection_command
+308,262509348,"utils/train_utils.py",4340,0,"",python,selection_command
+309,262509670,"utils/train_utils.py",4344,0,"",python,selection_command
+310,262509920,"utils/train_utils.py",4380,0,"",python,selection_command
+311,262509953,"utils/train_utils.py",4381,0,"",python,selection_command
+312,262509986,"utils/train_utils.py",4385,0,"",python,selection_command
+313,262510020,"utils/train_utils.py",4386,0,"",python,selection_command
+314,262510432,"utils/train_utils.py",4385,0,"",python,selection_command
+315,262510551,"utils/train_utils.py",4386,0,"",python,selection_command
+316,262510664,"utils/train_utils.py",4387,0,"",python,selection_command
+317,262511169,"utils/train_utils.py",4386,0,"",python,selection_command
+318,262511318,"utils/train_utils.py",4385,0,"",python,selection_command
+319,262511824,"utils/train_utils.py",4385,0,":",python,content
+320,262511826,"utils/train_utils.py",4386,0,"",python,selection_keyboard
+321,262511885,"utils/train_utils.py",4386,0," ",python,content
+322,262511887,"utils/train_utils.py",4387,0,"",python,selection_keyboard
+323,262512069,"utils/train_utils.py",4387,0,"A",python,content
+324,262512070,"utils/train_utils.py",4388,0,"",python,selection_keyboard
+325,262512310,"utils/train_utils.py",4388,0,"r",python,content
+326,262512316,"utils/train_utils.py",4389,0,"",python,selection_keyboard
+327,262512448,"utils/train_utils.py",4389,0,"g",python,content
+328,262512450,"utils/train_utils.py",4390,0,"",python,selection_keyboard
+329,262512487,"utils/train_utils.py",4390,0,"s",python,content
+330,262512488,"utils/train_utils.py",4391,0,"",python,selection_keyboard
+331,262512732,"utils/train_utils.py",4390,0,"",python,selection_command
+332,262513221,"utils/train_utils.py",4391,0,"",python,selection_command
+333,262514040,"utils/train_utils.py",4340,0,"",python,selection_command
+334,262514703,"utils/train_utils.py",4344,0,"",python,selection_command
+335,262514956,"utils/train_utils.py",4380,0,"",python,selection_command
+336,262514988,"utils/train_utils.py",4381,0,"",python,selection_command
+337,262515027,"utils/train_utils.py",4385,0,"",python,selection_command
+338,262515060,"utils/train_utils.py",4387,0,"",python,selection_command
+339,262515101,"utils/train_utils.py",4391,0,"",python,selection_command
+340,262515424,"utils/train_utils.py",4387,0,"",python,selection_command
+341,262515758,"train_dynamics.py",0,0,"import os\n\nos.environ.setdefault(""XLA_PYTHON_CLIENT_MEM_FRACTION"", ""0.98"")\n\nfrom dataclasses import dataclass, field\nimport itertools\nfrom typing import cast, Optional\n\nimport einops\nfrom jax.sharding import Mesh, PartitionSpec, NamedSharding\nfrom jax.experimental.mesh_utils import create_device_mesh\nimport optax\nimport orbax.checkpoint as ocp\nimport numpy as np\nimport dm_pix as pix\nimport jax\nimport jax.numpy as jnp\nimport tyro\nimport wandb\nimport grain\nimport flax.nnx as nnx\n\nfrom genie import Genie, restore_genie_components\nfrom utils.dataloader import get_dataloader\nfrom utils.train_utils import (\n get_lr_schedule,\n count_parameters_by_component,\n print_mem_stats,\n print_compiled_memory_stats,\n print_compiled_cost_analysis,\n)\n\n\n@dataclass\nclass Args:\n # Experiment\n num_steps: int = 200_000\n seed: int = 0\n seq_len: int = 16\n image_channels: int = 3\n image_height: int = 90\n image_width: int = 160\n data_dir: str = """"\n save_ckpt: bool = False\n restore_ckpt: bool = False\n # Optimization\n batch_size: int = 36\n init_lr: float = 0.0\n max_lr: float = 3e-5\n decay_end: float = 0.0\n wsd_decay_steps: int = (\n 10000 # NOTE: wsd_decay_steps will only be used when using a wsd-schedule\n )\n warmup_steps: int = 5000\n lr_schedule: str = ""wsd"" # supported options: wsd, cos\n # Tokenizer\n tokenizer_dim: int = 512\n tokenizer_ffn_dim: int = 2048\n latent_patch_dim: int = 32\n num_patch_latents: int = 1024\n patch_size: int = 4\n tokenizer_num_blocks: int = 4\n tokenizer_num_heads: int = 8\n tokenizer_checkpoint: str = """"\n # LAM\n lam_dim: int = 512\n lam_ffn_dim: int = 2048\n latent_action_dim: int = 32\n num_latent_actions: int = 6\n lam_patch_size: int = 16\n lam_num_blocks: int = 4\n lam_num_heads: int = 8\n lam_checkpoint: str = """"\n # Dynamics\n dyna_type: str = ""maskgit"" # supported options: maskgit, causal\n dyna_dim: int = 512\n dyna_ffn_dim: int = 2048\n 
dyna_num_blocks: int = 6\n dyna_num_heads: int = 8\n dropout: float = 0.0\n mask_limit: float = 0.5\n param_dtype = jnp.float32\n dtype = jnp.bfloat16\n use_flash_attention: bool = True\n # Logging\n log: bool = False\n entity: str = """"\n project: str = """"\n name: str = ""train_dynamics""\n tags: list[str] = field(default_factory=lambda: [""dynamics""])\n log_interval: int = 5\n log_image_interval: int = 250\n ckpt_dir: str = """"\n log_checkpoint_interval: int = 25000\n log_checkpoint_keep_period: int = 20000\n log_gradients: bool = False\n wandb_id: str = """"\n\n\ndef build_model(args: Args, rng: jax.Array) -> tuple[Genie, jax.Array]:\n rng, _rng = jax.random.split(rng)\n rngs = nnx.Rngs(_rng)\n genie = Genie(\n # Tokenizer\n in_dim=args.image_channels,\n tokenizer_dim=args.tokenizer_dim,\n tokenizer_ffn_dim=args.tokenizer_ffn_dim,\n latent_patch_dim=args.latent_patch_dim,\n num_patch_latents=args.num_patch_latents,\n patch_size=args.patch_size,\n tokenizer_num_blocks=args.tokenizer_num_blocks,\n tokenizer_num_heads=args.tokenizer_num_heads,\n # LAM\n lam_dim=args.lam_dim,\n lam_ffn_dim=args.lam_ffn_dim,\n latent_action_dim=args.latent_action_dim,\n num_latent_actions=args.num_latent_actions,\n lam_patch_size=args.lam_patch_size,\n lam_num_blocks=args.lam_num_blocks,\n lam_num_heads=args.lam_num_heads,\n lam_co_train=not args.lam_checkpoint,\n # Dynamics\n dyna_type=args.dyna_type,\n dyna_dim=args.dyna_dim,\n dyna_ffn_dim=args.dyna_ffn_dim,\n dyna_num_blocks=args.dyna_num_blocks,\n dyna_num_heads=args.dyna_num_heads,\n dropout=args.dropout,\n mask_limit=args.mask_limit,\n param_dtype=args.param_dtype,\n dtype=args.dtype,\n use_flash_attention=args.use_flash_attention,\n decode=False,\n rngs=rngs,\n )\n del genie.lam.decoder\n return genie, rng\n\n\ndef build_optimizer(genie: Genie, args: Args) -> tuple[nnx.Optimizer, optax.Schedule]:\n lr_schedule = get_lr_schedule(\n args.lr_schedule,\n args.init_lr,\n args.max_lr,\n args.decay_end,\n args.num_steps,\n 
args.warmup_steps,\n args.wsd_decay_steps,\n )\n tx = optax.adamw(\n learning_rate=lr_schedule,\n b1=0.9,\n b2=0.9,\n weight_decay=1e-4,\n mu_dtype=args.param_dtype, # moments in full precision\n )\n optimizer = nnx.Optimizer(genie, tx)\n return optimizer, lr_schedule\n\n\ndef build_mesh_and_sharding(\n num_devices: int,\n) -> tuple[Mesh, NamedSharding, NamedSharding]:\n device_mesh_arr = create_device_mesh((num_devices,))\n mesh = Mesh(devices=device_mesh_arr, axis_names=(""data"",))\n replicated_sharding = NamedSharding(mesh, PartitionSpec())\n videos_sharding = NamedSharding(mesh, PartitionSpec(""data"", None, None, None, None))\n return mesh, replicated_sharding, videos_sharding\n\n\ndef shard_optimizer_states(\n optimizer: nnx.Optimizer, replicated_sharding: NamedSharding\n) -> None:\n model_state = nnx.state(optimizer.model)\n model_sharded_state = jax.lax.with_sharding_constraint(\n model_state, replicated_sharding\n )\n nnx.update(optimizer.model, model_sharded_state)\n optimizer_state = nnx.state(optimizer, nnx.optimizer.OptState)\n optimizer_sharded_state = jax.lax.with_sharding_constraint(\n optimizer_state, replicated_sharding\n )\n nnx.update(optimizer, optimizer_sharded_state)\n\n\ndef build_dataloader(args: Args) -> grain.DataLoaderIterator:\n image_shape = (args.image_height, args.image_width, args.image_channels)\n array_record_files = [\n os.path.join(args.data_dir, x)\n for x in os.listdir(args.data_dir)\n if x.endswith("".array_record"")\n ]\n grain_dataloader = get_dataloader(\n array_record_files,\n args.seq_len,\n # NOTE: We deliberately pass the global batch size\n # The dataloader shards the dataset across all processes\n args.batch_size,\n *image_shape,\n num_workers=8,\n prefetch_buffer_size=1,\n seed=args.seed,\n )\n initial_state = grain_dataloader._create_initial_state()\n grain_iterator = grain.DataLoaderIterator(grain_dataloader, initial_state)\n return grain_iterator\n\n\ndef build_checkpoint_manager(args: Args) -> 
ocp.CheckpointManager:\n handler_registry = ocp.handlers.DefaultCheckpointHandlerRegistry()\n handler_registry.add(\n ""model_state"", ocp.args.PyTreeSave, ocp.handlers.PyTreeCheckpointHandler\n )\n handler_registry.add(\n ""model_state"", ocp.args.PyTreeRestore, ocp.handlers.PyTreeCheckpointHandler\n )\n handler_registry.add(\n ""dataloader_state"",\n grain.checkpoint.CheckpointSave,\n cast(ocp.handlers.CheckpointHandler, grain.checkpoint.CheckpointHandler),\n )\n handler_registry.add(\n ""dataloader_state"",\n grain.checkpoint.CheckpointRestore,\n cast(ocp.handlers.CheckpointHandler, grain.checkpoint.CheckpointHandler),\n )\n checkpoint_options = ocp.CheckpointManagerOptions(\n save_interval_steps=args.log_checkpoint_interval,\n max_to_keep=3,\n keep_period=args.log_checkpoint_keep_period,\n step_format_fixed_length=6,\n cleanup_tmp_directories=True,\n )\n checkpoint_manager = ocp.CheckpointManager(\n args.ckpt_dir,\n options=checkpoint_options,\n handler_registry=handler_registry,\n )\n return checkpoint_manager\n\n\ndef restore_or_initialize_components(\n args: Args,\n checkpoint_manager: ocp.CheckpointManager,\n optimizer: nnx.Optimizer,\n grain_iterator: grain.DataLoaderIterator,\n rng: jax.Array,\n replicated_sharding: NamedSharding,\n restore_step: Optional[int] = None,\n) -> tuple[int, nnx.Optimizer, grain.DataLoaderIterator, jax.Array]:\n step = 0\n if restore_step is None:\n restore_step = checkpoint_manager.latest_step()\n if args.restore_ckpt:\n abstract_optimizer = nnx.eval_shape(lambda: optimizer)\n abstract_optimizer_state = nnx.state(abstract_optimizer)\n restored = checkpoint_manager.restore(\n restore_step,\n args=ocp.args.Composite(\n model_state=ocp.args.PyTreeRestore(abstract_optimizer_state), # type: ignore\n dataloader_state=grain.checkpoint.CheckpointRestore(grain_iterator), # type: ignore\n ),\n )\n restored_optimizer_state = restored[""model_state""]\n nnx.update(optimizer, restored_optimizer_state)\n grain_iterator = 
restored[""dataloader_state""]\n step = restore_step or 0\n print(f""Restored dataloader and model state from step {step}"")\n else:\n # Restore from pre-trained tokenizer (and LAM)\n rng, _rng = jax.random.split(rng)\n optimizer = restore_genie_components(optimizer, replicated_sharding, _rng, args)\n # NOTE: We have to remove the (unused) tokenizer vq dropout due flax.nnx lazily initializing modules.\n # Specifically, the first dynamics model checkpoint will contain the vq dropout module,\n # but the first full restore will fail due to nnx not initializing the module when\n # dropout is set to 0.0.\n del optimizer.model.tokenizer.vq.drop\n return step, optimizer, grain_iterator, rng\n\n\ndef main(args: Args) -> None:\n jax.distributed.initialize()\n num_devices = jax.device_count()\n if num_devices == 0:\n raise ValueError(""No JAX devices found."")\n print(f""Running on {num_devices} devices."")\n\n if args.batch_size % num_devices != 0:\n raise ValueError(\n f""Global batch size {args.batch_size} must be divisible by ""\n f""number of devices {num_devices}.""\n )\n\n rng = jax.random.key(args.seed)\n\n # --- Initialize model ---\n genie, rng = build_model(args, rng)\n _, params, _ = nnx.split(genie, nnx.Param, ...)\n param_counts = count_parameters_by_component(params)\n\n if args.log and jax.process_index() == 0:\n wandb_init_kwargs = {\n ""entity"": args.entity,\n ""project"": args.project,\n ""name"": args.name,\n ""tags"": args.tags,\n ""group"": ""debug"",\n ""config"": args,\n }\n\n if args.wandb_id:\n wandb_init_kwargs.update(\n {\n ""id"": args.wandb_id,\n ""resume"": ""allow"",\n }\n )\n wandb.init(**wandb_init_kwargs)\n\n wandb.config.update({""model_param_count"": param_counts})\n\n print(""Parameter counts:"")\n print(param_counts)\n\n # --- Initialize optimizer ---\n optimizer, lr_schedule = build_optimizer(genie, args)\n del genie\n\n # FIXME: switch to create_hybrid_device_mesh for runs spanning multiple nodes\n mesh, replicated_sharding, 
videos_sharding = build_mesh_and_sharding(num_devices)\n\n shard_optimizer_states(optimizer, replicated_sharding)\n\n # --- Initialize checkpoint manager ---\n checkpoint_manager = build_checkpoint_manager(args)\n\n # --- Create DataLoaderIterator from dataloader ---\n grain_iterator = build_dataloader(args)\n\n # --- Restore checkpoint ---\n step, optimizer, grain_iterator, rng = restore_or_initialize_components(\n args, checkpoint_manager, optimizer, grain_iterator, rng, replicated_sharding\n )\n\n # --- Define loss and train step (close over args) ---\n def dynamics_loss_fn(\n model: Genie, inputs: dict\n ) -> tuple[jax.Array, tuple[jax.Array, dict]]:\n gt = jnp.asarray(inputs[""videos""], dtype=jnp.float32) / 255.0\n inputs[""videos""] = gt.astype(args.dtype)\n model.train()\n outputs = model(inputs, training=True)\n mask = outputs[""mask""]\n outputs[""token_logits""] = outputs[""token_logits""].astype(jnp.float32)\n ce_loss = optax.softmax_cross_entropy_with_integer_labels(\n outputs[""token_logits""], outputs[""video_tokens""]\n )\n ce_loss = (mask * ce_loss).sum() / mask.sum()\n acc = outputs[""token_logits""].argmax(-1) == outputs[""video_tokens""]\n acc = (mask * acc).sum() / mask.sum()\n select_probs = jax.nn.softmax(outputs[""token_logits""])\n gt_val = gt.clip(0, 1).reshape(-1, *gt.shape[2:])\n recon = outputs[""recon""].clip(0, 1).reshape(-1, *outputs[""recon""].shape[2:])\n psnr = jnp.asarray(pix.psnr(gt_val, recon)).mean()\n ssim = jnp.asarray(pix.ssim(gt_val, recon)).mean()\n _, index_counts_lam = jnp.unique_counts(\n jnp.ravel(outputs[""lam_indices""]),\n size=args.num_latent_actions,\n fill_value=0,\n )\n _, index_counts_tokenizer = jnp.unique_counts(\n jnp.ravel(outputs[""video_tokens""]),\n size=args.num_patch_latents,\n fill_value=0,\n )\n codebook_usage_lam = (index_counts_lam != 0).mean()\n codebook_usage_tokenizer = (index_counts_tokenizer != 0).mean()\n metrics = dict(\n cross_entropy_loss=ce_loss,\n masked_token_accuracy=acc,\n 
select_logit=outputs[""token_logits""].max(-1).mean(),\n select_p=select_probs.max(-1).mean(),\n entropy=jax.scipy.special.entr(select_probs).sum(-1).mean(),\n psnr=psnr,\n ssim=ssim,\n codebook_usage_lam=codebook_usage_lam,\n codebook_usage_tokenizer=codebook_usage_tokenizer,\n )\n return ce_loss, (outputs[""recon""], metrics)\n\n @nnx.jit(donate_argnums=0)\n def train_step(\n optimizer: nnx.Optimizer, inputs: dict\n ) -> tuple[jax.Array, jax.Array, dict]:\n def loss_fn(model: Genie) -> tuple[jax.Array, tuple[jax.Array, dict]]:\n return dynamics_loss_fn(model, inputs)\n\n (loss, (recon, metrics)), grads = nnx.value_and_grad(loss_fn, has_aux=True)(\n optimizer.model\n )\n optimizer.update(grads)\n if args.log_gradients:\n metrics[""gradients_std/""] = jax.tree.map(\n lambda x: x.std(), grads[""params""][""dynamics""]\n )\n return loss, recon, metrics\n\n # --- TRAIN LOOP ---\n dataloader = (\n jax.make_array_from_process_local_data(videos_sharding, elem)\n for elem in grain_iterator\n )\n if jax.process_index() == 0:\n first_videos = next(dataloader)\n sample_inputs = dict(videos=first_videos, mask_rng=rng)\n compiled = train_step.lower(optimizer, sample_inputs).compile()\n print_compiled_memory_stats(compiled.memory_analysis())\n print_compiled_cost_analysis(compiled.cost_analysis())\n # Do not skip the first batch during training\n dataloader = itertools.chain([first_videos], dataloader)\n print(f""Starting training from step {step}..."")\n first_step = step\n while step < args.num_steps:\n for videos in dataloader:\n # --- Train step ---\n rng, _rng_mask = jax.random.split(rng, 2)\n inputs = dict(videos=videos, mask_rng=_rng_mask)\n loss, recon, metrics = train_step(optimizer, inputs)\n if step == first_step:\n print_mem_stats(""After params initialized"")\n metrics[""lr""] = lr_schedule(step)\n print(f""Step {step}, loss: {loss}"")\n step += 1\n\n # --- Logging ---\n if args.log:\n if step % args.log_interval == 0 and jax.process_index() == 0:\n wandb.log(\n 
{\n ""loss"": loss,\n ""step"": step,\n **metrics,\n }\n )\n if step % args.log_image_interval == 0:\n gt_seq = inputs[""videos""][0].astype(jnp.float32) / 255.0\n recon_seq = recon[0].clip(0, 1)\n comparison_seq = jnp.concatenate((gt_seq, recon_seq), axis=1)\n comparison_seq = einops.rearrange(\n comparison_seq * 255, ""t h w c -> h (t w) c""\n )\n if jax.process_index() == 0:\n log_images = dict(\n image=wandb.Image(np.asarray(gt_seq[args.seq_len - 1])),\n recon=wandb.Image(np.asarray(recon_seq[args.seq_len - 1])),\n true_vs_recon=wandb.Image(\n np.asarray(comparison_seq.astype(np.uint8))\n ),\n )\n wandb.log(log_images)\n # --- Checkpointing ---\n if args.save_ckpt and step % args.log_checkpoint_interval == 0:\n optimizer_state = nnx.state(optimizer)\n checkpoint_manager.save(\n step,\n args=ocp.args.Composite(\n model_state=ocp.args.PyTreeSave(optimizer_state), # type: ignore\n dataloader_state=grain.checkpoint.CheckpointSave( # type: ignore\n grain_iterator # type: ignore\n ),\n ),\n )\n print(f""Saved checkpoint at step {step}"")\n if step >= args.num_steps:\n break\n\n checkpoint_manager.close()\n\n\nif __name__ == ""__main__"":\n args = tyro.cli(Args)\n main(args)\n",python,tab
+342,262515765,"train_dynamics.py",774,0,"",python,selection_command
+343,262518294,"utils/train_utils.py",0,0,"",python,tab
+344,262518295,"utils/train_utils.py",4387,0,"",python,selection_command
+345,262518698,"utils/train_utils.py",4340,0,"",python,selection_command
+346,262519712,"utils/train_utils.py",4393,0,"",python,selection_command
+347,262520437,"utils/train_utils.py",4392,0,"",python,selection_command
+348,262524681,"train_dynamics.py",0,0,"",python,tab
+349,262527008,"train_tokenizer.py",0,0,"",python,tab
+350,262527877,"train_dynamics.py",0,0,"",python,tab
+351,262529089,"train_tokenizer.py",0,0,"",python,tab
+352,262531503,"train_tokenizer.py",1383,0,"",python,selection_command
+353,262532489,"train_dynamics.py",0,0,"",python,tab
+354,262534298,"train_dynamics.py",1378,0,"",python,selection_command
+355,262534918,"train_dynamics.py",2036,0,"",python,selection_command
+356,262535481,"train_dynamics.py",1378,0,"",python,selection_command
+357,262535587,"train_dynamics.py",768,0,"",python,selection_command
+358,262536452,"train_tokenizer.py",0,0,"",python,tab
+359,262539334,"train_tokenizer.py",762,0,"",python,selection_command
+360,262547843,"utils/train_utils.py",0,0,"",python,tab
+361,262548830,"utils/train_utils.py",4391,0,"",python,selection_command
+362,262548963,"utils/train_utils.py",4387,0,"",python,selection_command
+363,262549080,"utils/train_utils.py",4385,0,"",python,selection_command
+364,262549457,"utils/train_utils.py",4385,1,":",python,selection_command
+365,262549511,"utils/train_utils.py",4385,6,": Args",python,selection_command
+366,262549824,"utils/train_utils.py",4385,6,"",python,content
+367,262550632,"utils/train_utils.py",0,0,"",python,selection_command
+368,262551451,"utils/train_utils.py",68,0,"",python,selection_command
+369,262551921,"utils/train_utils.py",41,32,"",python,content
+370,262552433,"utils/train_utils.py",41,1,"",python,content
+371,262553321,"utils/train_utils.py",4307,0,"",python,selection_command
+372,262554587,"utils/train_utils.py",4354,0,"",python,selection_command
+373,262556766,"utils/train_utils.py",4354,0,"\n ",python,content
+374,262557016,"utils/train_utils.py",4359,0,"""""",python,content
+375,262557017,"utils/train_utils.py",4360,0,"",python,selection_keyboard
+376,262557547,"utils/train_utils.py",4360,0,"C",python,content
+377,262557549,"utils/train_utils.py",4361,0,"",python,selection_keyboard
+378,262557886,"utils/train_utils.py",4360,1,"",python,content
+379,262558227,"utils/train_utils.py",4360,1,"""",python,content
+380,262558229,"utils/train_utils.py",4361,0,"",python,selection_keyboard
+381,262558398,"utils/train_utils.py",4361,0,"""",python,content
+382,262558399,"utils/train_utils.py",4362,0,"",python,selection_keyboard
+383,262558966,"utils/train_utils.py",4362,0,"""",python,content
+384,262558968,"utils/train_utils.py",4363,0,"",python,selection_keyboard
+385,262559535,"utils/train_utils.py",4363,0,"""",python,content
+386,262559536,"utils/train_utils.py",4364,0,"",python,selection_keyboard
+387,262559899,"utils/train_utils.py",4364,0,"""",python,content
+388,262559900,"utils/train_utils.py",4365,0,"",python,selection_keyboard
+389,262560490,"utils/train_utils.py",4364,0,"",python,selection_command
+390,262560681,"utils/train_utils.py",4363,0,"",python,selection_command
+391,262560810,"utils/train_utils.py",4362,0,"",python,selection_command
+392,262561096,"utils/train_utils.py",4362,0," ",python,content
+393,262561098,"utils/train_utils.py",4363,0,"",python,selection_keyboard
+394,262562034,"utils/train_utils.py",4362,1,"",python,content
+395,262564010,"utils/train_utils.py",4362,0,"C",python,content
+396,262564011,"utils/train_utils.py",4363,0,"",python,selection_keyboard
+397,262564225,"utils/train_utils.py",4363,0,"a",python,content
+398,262564226,"utils/train_utils.py",4364,0,"",python,selection_keyboard
+399,262564267,"utils/train_utils.py",4364,0,"l",python,content
+400,262564269,"utils/train_utils.py",4365,0,"",python,selection_keyboard
+401,262564344,"utils/train_utils.py",4365,0,"c",python,content
+402,262564345,"utils/train_utils.py",4366,0,"",python,selection_keyboard
+403,262564494,"utils/train_utils.py",4366,0,"u",python,content
+404,262564495,"utils/train_utils.py",4367,0,"",python,selection_keyboard
+405,262564632,"utils/train_utils.py",4367,0,"l",python,content
+406,262564633,"utils/train_utils.py",4368,0,"",python,selection_keyboard
+407,262564742,"utils/train_utils.py",4368,0,"a",python,content
+408,262564743,"utils/train_utils.py",4369,0,"",python,selection_keyboard
+409,262564810,"utils/train_utils.py",4369,0,"t",python,content
+410,262564812,"utils/train_utils.py",4370,0,"",python,selection_keyboard
+411,262564948,"utils/train_utils.py",4370,0,"e",python,content
+412,262564949,"utils/train_utils.py",4371,0,"",python,selection_keyboard
+413,262565060,"utils/train_utils.py",4371,0," ",python,content
+414,262565061,"utils/train_utils.py",4372,0,"",python,selection_keyboard
+415,262565159,"utils/train_utils.py",4372,0,"t",python,content
+416,262565160,"utils/train_utils.py",4373,0,"",python,selection_keyboard
+417,262565343,"utils/train_utils.py",4373,0,"r",python,content
+418,262565345,"utils/train_utils.py",4374,0,"",python,selection_keyboard
+419,262565415,"utils/train_utils.py",4374,0,"a",python,content
+420,262565416,"utils/train_utils.py",4375,0,"",python,selection_keyboard
+421,262565468,"utils/train_utils.py",4375,0,"i",python,content
+422,262565469,"utils/train_utils.py",4376,0,"",python,selection_keyboard
+423,262565530,"utils/train_utils.py",4376,0,"n",python,content
+424,262565532,"utils/train_utils.py",4377,0,"",python,selection_keyboard
+425,262565652,"utils/train_utils.py",4377,0,"i",python,content
+426,262565654,"utils/train_utils.py",4378,0,"",python,selection_keyboard
+427,262565733,"utils/train_utils.py",4378,0,"n",python,content
+428,262565734,"utils/train_utils.py",4379,0,"",python,selection_keyboard
+429,262565743,"utils/train_utils.py",4379,0,"g",python,content
+430,262565744,"utils/train_utils.py",4380,0,"",python,selection_keyboard
+431,262565862,"utils/train_utils.py",4380,0," ",python,content
+432,262565864,"utils/train_utils.py",4381,0,"",python,selection_keyboard
+433,262567932,"utils/train_utils.py",4381,0,"T",python,content
+434,262567933,"utils/train_utils.py",4382,0,"",python,selection_keyboard
+435,262568129,"utils/train_utils.py",4382,0,"F",python,content
+436,262568131,"utils/train_utils.py",4383,0,"",python,selection_keyboard
+437,262568187,"utils/train_utils.py",4383,0,"L",python,content
+438,262568188,"utils/train_utils.py",4384,0,"",python,selection_keyboard
+439,262568343,"utils/train_utils.py",4384,0,"O",python,content
+440,262568345,"utils/train_utils.py",4385,0,"",python,selection_keyboard
+441,262568411,"utils/train_utils.py",4385,0,"P",python,content
+442,262568413,"utils/train_utils.py",4386,0,"",python,selection_keyboard
+443,262569931,"utils/train_utils.py",4385,0,"",python,selection_command
+444,262570050,"utils/train_utils.py",4389,0,"\n ",python,content
+445,262573588,"utils/train_utils.py",4394,0,"#",python,content
+446,262573590,"utils/train_utils.py",4395,0,"",python,selection_keyboard
+447,262573727,"utils/train_utils.py",4395,0," ",python,content
+448,262573729,"utils/train_utils.py",4396,0,"",python,selection_keyboard
+449,262574281,"utils/train_utils.py",4396,0,"M",python,content
+450,262574283,"utils/train_utils.py",4397,0,"",python,selection_keyboard
+451,262574344,"utils/train_utils.py",4397,0,"L",python,content
+452,262574346,"utils/train_utils.py",4398,0,"",python,selection_keyboard
+453,262574415,"utils/train_utils.py",4398,0,"P",python,content
+454,262574415,"utils/train_utils.py",4399,0,"",python,selection_keyboard
+455,262574551,"utils/train_utils.py",4399,0," ",python,content
+456,262574552,"utils/train_utils.py",4400,0,"",python,selection_keyboard
+457,262576514,"utils/train_utils.py",4400,0,"f",python,content
+458,262576515,"utils/train_utils.py",4401,0,"",python,selection_keyboard
+459,262576533,"utils/train_utils.py",4401,0,"l",python,content
+460,262576534,"utils/train_utils.py",4402,0,"",python,selection_keyboard
+461,262576714,"utils/train_utils.py",4402,0,"o",python,content
+462,262576715,"utils/train_utils.py",4403,0,"",python,selection_keyboard
+463,262576774,"utils/train_utils.py",4403,0,"p",python,content
+464,262576775,"utils/train_utils.py",4404,0,"",python,selection_keyboard
+465,262576859,"utils/train_utils.py",4404,0,"s",python,content
+466,262576860,"utils/train_utils.py",4405,0,"",python,selection_keyboard
+467,262577232,"utils/train_utils.py",4405,0,"\n ",python,content
+468,262577614,"utils/train_utils.py",4410,0,"\n ",python,content
+469,262577614,"utils/train_utils.py",4406,4,"",python,content
+470,262578181,"utils/train_utils.py",4411,0,"#",python,content
+471,262578183,"utils/train_utils.py",4412,0,"",python,selection_keyboard
+472,262578281,"utils/train_utils.py",4412,0," ",python,content
+473,262578282,"utils/train_utils.py",4413,0,"",python,selection_keyboard
+474,262578834,"utils/train_utils.py",4413,0,"A",python,content
+475,262578835,"utils/train_utils.py",4414,0,"",python,selection_keyboard
+476,262579020,"utils/train_utils.py",4414,0,"t",python,content
+477,262579022,"utils/train_utils.py",4415,0,"",python,selection_keyboard
+478,262579161,"utils/train_utils.py",4415,0,"t",python,content
+479,262579162,"utils/train_utils.py",4416,0,"",python,selection_keyboard
+480,262579225,"utils/train_utils.py",4416,0,"e",python,content
+481,262579227,"utils/train_utils.py",4417,0,"",python,selection_keyboard
+482,262579347,"utils/train_utils.py",4417,0,"n",python,content
+483,262579347,"utils/train_utils.py",4418,0,"",python,selection_keyboard
+484,262579445,"utils/train_utils.py",4418,0,"t",python,content
+485,262579447,"utils/train_utils.py",4419,0,"",python,selection_keyboard
+486,262579547,"utils/train_utils.py",4419,0,"i",python,content
+487,262579548,"utils/train_utils.py",4420,0,"",python,selection_keyboard
+488,262579626,"utils/train_utils.py",4420,0,"o",python,content
+489,262579627,"utils/train_utils.py",4421,0,"",python,selection_keyboard
+490,262579648,"utils/train_utils.py",4421,0,"n",python,content
+491,262579649,"utils/train_utils.py",4422,0,"",python,selection_keyboard
+492,262579763,"utils/train_utils.py",4422,0," ",python,content
+493,262579764,"utils/train_utils.py",4423,0,"",python,selection_keyboard
+494,262579982,"utils/train_utils.py",4423,0,"f",python,content
+495,262579983,"utils/train_utils.py",4424,0,"",python,selection_keyboard
+496,262580109,"utils/train_utils.py",4424,0,"l",python,content
+497,262580111,"utils/train_utils.py",4425,0,"",python,selection_keyboard
+498,262580266,"utils/train_utils.py",4425,0,"o",python,content
+499,262580268,"utils/train_utils.py",4426,0,"",python,selection_keyboard
+500,262580341,"utils/train_utils.py",4426,0,"p",python,content
+501,262580342,"utils/train_utils.py",4427,0,"",python,selection_keyboard
+502,262580386,"utils/train_utils.py",4427,0,"s",python,content
+503,262580388,"utils/train_utils.py",4428,0,"",python,selection_keyboard
+504,262580650,"utils/train_utils.py",4427,0,"",python,selection_command
+505,262587517,"utils/train_utils.py",4428,0,"\n ",python,content
+506,262587699,"utils/train_utils.py",4433,0,"\n ",python,content
+507,262587699,"utils/train_utils.py",4429,4,"",python,content
+508,262587887,"utils/train_utils.py",4434,0,"#",python,content
+509,262587888,"utils/train_utils.py",4435,0,"",python,selection_keyboard
+510,262587945,"utils/train_utils.py",4435,0," ",python,content
+511,262587947,"utils/train_utils.py",4436,0,"",python,selection_keyboard
+512,262588215,"utils/train_utils.py",4436,0,"E",python,content
+513,262588216,"utils/train_utils.py",4437,0,"",python,selection_keyboard
+514,262588599,"utils/train_utils.py",4437,0,"m",python,content
+515,262588600,"utils/train_utils.py",4438,0,"",python,selection_keyboard
+516,262588759,"utils/train_utils.py",4438,0,"b",python,content
+517,262588760,"utils/train_utils.py",4439,0,"",python,selection_keyboard
+518,262588780,"utils/train_utils.py",4439,0,"e",python,content
+519,262588781,"utils/train_utils.py",4440,0,"",python,selection_keyboard
+520,262588875,"utils/train_utils.py",4440,0,"d",python,content
+521,262588876,"utils/train_utils.py",4441,0,"",python,selection_keyboard
+522,262589079,"utils/train_utils.py",4441,0,"d",python,content
+523,262589080,"utils/train_utils.py",4442,0,"",python,selection_keyboard
+524,262589223,"utils/train_utils.py",4442,0,"i",python,content
+525,262589224,"utils/train_utils.py",4443,0,"",python,selection_keyboard
+526,262589291,"utils/train_utils.py",4443,0,"n",python,content
+527,262589294,"utils/train_utils.py",4444,0,"",python,selection_keyboard
+528,262589359,"utils/train_utils.py",4444,0,"g",python,content
+529,262589361,"utils/train_utils.py",4445,0,"",python,selection_keyboard
+530,262589444,"utils/train_utils.py",4445,0," ",python,content
+531,262589445,"utils/train_utils.py",4446,0,"",python,selection_keyboard
+532,262589575,"utils/train_utils.py",4446,0,"f",python,content
+533,262589576,"utils/train_utils.py",4447,0,"",python,selection_keyboard
+534,262589643,"utils/train_utils.py",4447,0,"l",python,content
+535,262589644,"utils/train_utils.py",4448,0,"",python,selection_keyboard
+536,262589817,"utils/train_utils.py",4448,0,"o",python,content
+537,262589818,"utils/train_utils.py",4449,0,"",python,selection_keyboard
+538,262589875,"utils/train_utils.py",4449,0,"p",python,content
+539,262589877,"utils/train_utils.py",4450,0,"",python,selection_keyboard
+540,262589916,"utils/train_utils.py",4450,0,"s",python,content
+541,262589917,"utils/train_utils.py",4451,0,"",python,selection_keyboard
+542,262590182,"utils/train_utils.py",4450,0,"",python,selection_command
+543,262603400,"utils/train_utils.py",4451,0,"",python,selection_command
+544,262603623,"utils/train_utils.py",4451,0,"\n ",python,content
+545,262603994,"utils/train_utils.py",4456,0,"\n ",python,content
+546,262603994,"utils/train_utils.py",4452,4,"",python,content
+547,262604210,"utils/train_utils.py",4457,0,"#",python,content
+548,262604211,"utils/train_utils.py",4458,0,"",python,selection_keyboard
+549,262604313,"utils/train_utils.py",4458,0," ",python,content
+550,262604314,"utils/train_utils.py",4459,0,"",python,selection_keyboard
+551,262604587,"utils/train_utils.py",4459,0,"C",python,content
+552,262604589,"utils/train_utils.py",4460,0,"",python,selection_keyboard
+553,262604898,"utils/train_utils.py",4460,0,"o",python,content
+554,262604901,"utils/train_utils.py",4461,0,"",python,selection_keyboard
+555,262604942,"utils/train_utils.py",4461,0,"m",python,content
+556,262604943,"utils/train_utils.py",4462,0,"",python,selection_keyboard
+557,262605126,"utils/train_utils.py",4462,0,"b",python,content
+558,262605128,"utils/train_utils.py",4463,0,"",python,selection_keyboard
+559,262605224,"utils/train_utils.py",4463,0,"i",python,content
+560,262605225,"utils/train_utils.py",4464,0,"",python,selection_keyboard
+561,262605325,"utils/train_utils.py",4464,0,"n",python,content
+562,262605326,"utils/train_utils.py",4465,0,"",python,selection_keyboard
+563,262605349,"utils/train_utils.py",4465,0,"e",python,content
+564,262605349,"utils/train_utils.py",4466,0,"",python,selection_keyboard
+565,262605480,"utils/train_utils.py",4466,0," ",python,content
+566,262605481,"utils/train_utils.py",4467,0,"",python,selection_keyboard
+567,262605948,"utils/train_utils.py",4467,0,"f",python,content
+568,262605949,"utils/train_utils.py",4468,0,"",python,selection_keyboard
+569,262606130,"utils/train_utils.py",4468,0,"r",python,content
+570,262606131,"utils/train_utils.py",4469,0,"",python,selection_keyboard
+571,262606547,"utils/train_utils.py",4468,1,"",python,content
+572,262606720,"utils/train_utils.py",4468,0,"l",python,content
+573,262606721,"utils/train_utils.py",4469,0,"",python,selection_keyboard
+574,262606875,"utils/train_utils.py",4469,0,"o",python,content
+575,262606876,"utils/train_utils.py",4470,0,"",python,selection_keyboard
+576,262606970,"utils/train_utils.py",4470,0,"p",python,content
+577,262606971,"utils/train_utils.py",4471,0,"",python,selection_keyboard
+578,262607029,"utils/train_utils.py",4471,0,"s",python,content
+579,262607030,"utils/train_utils.py",4472,0,"",python,selection_keyboard
+580,262607092,"utils/train_utils.py",4472,0," ",python,content
+581,262607094,"utils/train_utils.py",4473,0,"",python,selection_keyboard
+582,262607325,"utils/train_utils.py",4473,0,"f",python,content
+583,262607326,"utils/train_utils.py",4474,0,"",python,selection_keyboard
+584,262610567,"utils/train_utils.py",4473,1,"",python,content
+585,262611025,"utils/train_utils.py",4467,6,"",python,content
+586,262611248,"utils/train_utils.py",4467,0,"f",python,content
+587,262611249,"utils/train_utils.py",4468,0,"",python,selection_keyboard
+588,262611343,"utils/train_utils.py",4468,0,"l",python,content
+589,262611344,"utils/train_utils.py",4469,0,"",python,selection_keyboard
+590,262611479,"utils/train_utils.py",4469,0,"o",python,content
+591,262611480,"utils/train_utils.py",4470,0,"",python,selection_keyboard
+592,262611567,"utils/train_utils.py",4470,0,"p",python,content
+593,262611568,"utils/train_utils.py",4471,0,"",python,selection_keyboard
+594,262611596,"utils/train_utils.py",4471,0,"s",python,content
+595,262611597,"utils/train_utils.py",4472,0,"",python,selection_keyboard
+596,262611674,"utils/train_utils.py",4472,0," ",python,content
+597,262611675,"utils/train_utils.py",4473,0,"",python,selection_keyboard
+598,262611792,"utils/train_utils.py",4473,0,"w",python,content
+599,262611793,"utils/train_utils.py",4474,0,"",python,selection_keyboard
+600,262611873,"utils/train_utils.py",4474,0,"i",python,content
+601,262611874,"utils/train_utils.py",4475,0,"",python,selection_keyboard
+602,262611974,"utils/train_utils.py",4475,0,"t",python,content
+603,262611975,"utils/train_utils.py",4476,0,"",python,selection_keyboard
+604,262612048,"utils/train_utils.py",4476,0,"h",python,content
+605,262612049,"utils/train_utils.py",4477,0,"",python,selection_keyboard
+606,262612209,"utils/train_utils.py",4477,0," ",python,content
+607,262612210,"utils/train_utils.py",4478,0,"",python,selection_keyboard
+608,262614470,"utils/train_utils.py",4478,0,"n",python,content
+609,262614472,"utils/train_utils.py",4479,0,"",python,selection_keyboard
+610,262614643,"utils/train_utils.py",4479,0,"u",python,content
+611,262614645,"utils/train_utils.py",4480,0,"",python,selection_keyboard
+612,262614800,"utils/train_utils.py",4480,0,"m",python,content
+613,262614801,"utils/train_utils.py",4481,0,"",python,selection_keyboard
+614,262614941,"utils/train_utils.py",4481,0,"b",python,content
+615,262614942,"utils/train_utils.py",4482,0,"",python,selection_keyboard
+616,262614984,"utils/train_utils.py",4482,0,"e",python,content
+617,262614986,"utils/train_utils.py",4483,0,"",python,selection_keyboard
+618,262615040,"utils/train_utils.py",4483,0,"r",python,content
+619,262615042,"utils/train_utils.py",4484,0,"",python,selection_keyboard
+620,262615097,"utils/train_utils.py",4484,0," ",python,content
+621,262615099,"utils/train_utils.py",4485,0,"",python,selection_keyboard
+622,262615192,"utils/train_utils.py",4485,0,"o",python,content
+623,262615193,"utils/train_utils.py",4486,0,"",python,selection_keyboard
+624,262615259,"utils/train_utils.py",4486,0,"f",python,content
+625,262615260,"utils/train_utils.py",4487,0,"",python,selection_keyboard
+626,262615329,"utils/train_utils.py",4487,0," ",python,content
+627,262615330,"utils/train_utils.py",4488,0,"",python,selection_keyboard
+628,262615446,"utils/train_utils.py",4488,0,"b",python,content
+629,262615447,"utils/train_utils.py",4489,0,"",python,selection_keyboard
+630,262615575,"utils/train_utils.py",4489,0,"l",python,content
+631,262615577,"utils/train_utils.py",4490,0,"",python,selection_keyboard
+632,262615720,"utils/train_utils.py",4490,0,"c",python,content
+633,262615721,"utils/train_utils.py",4491,0,"",python,selection_keyboard
+634,262615735,"utils/train_utils.py",4491,0,"o",python,content
+635,262615736,"utils/train_utils.py",4492,0,"",python,selection_keyboard
+636,262615930,"utils/train_utils.py",4492,0,"k",python,content
+637,262615931,"utils/train_utils.py",4493,0,"",python,selection_keyboard
+638,262616071,"utils/train_utils.py",4493,0,"s",python,content
+639,262616071,"utils/train_utils.py",4494,0,"",python,selection_keyboard
+640,262616426,"utils/train_utils.py",4488,6,"",python,content
+641,262616650,"utils/train_utils.py",4488,0,"b",python,content
+642,262616652,"utils/train_utils.py",4489,0,"",python,selection_keyboard
+643,262616793,"utils/train_utils.py",4489,0,"l",python,content
+644,262616795,"utils/train_utils.py",4490,0,"",python,selection_keyboard
+645,262616941,"utils/train_utils.py",4490,0,"o",python,content
+646,262616943,"utils/train_utils.py",4491,0,"",python,selection_keyboard
+647,262616953,"utils/train_utils.py",4491,0,"c",python,content
+648,262616954,"utils/train_utils.py",4492,0,"",python,selection_keyboard
+649,262617094,"utils/train_utils.py",4492,0,"k",python,content
+650,262617095,"utils/train_utils.py",4493,0,"",python,selection_keyboard
+651,262617153,"utils/train_utils.py",4493,0,"s",python,content
+652,262617155,"utils/train_utils.py",4494,0,"",python,selection_keyboard
+653,262617359,"utils/train_utils.py",4493,0,"",python,selection_command
+654,262618014,"utils/train_utils.py",4452,0,"",python,selection_command
+655,262618165,"utils/train_utils.py",4450,0,"",python,selection_command
+656,262618418,"utils/train_utils.py",4429,0,"",python,selection_command
+657,262618449,"utils/train_utils.py",4427,0,"",python,selection_command
+658,262618483,"utils/train_utils.py",4406,0,"",python,selection_command
+659,262618634,"utils/train_utils.py",4404,0,"",python,selection_command
+660,262618811,"utils/train_utils.py",4388,0,"",python,selection_command
+661,262620770,"utils/train_utils.py",4386,0,"",python,selection_command
+662,262621022,"utils/train_utils.py",4381,0,"",python,selection_command
+663,262621180,"utils/train_utils.py",4372,0,"",python,selection_command
+664,262621360,"utils/train_utils.py",4362,0,"",python,selection_command
+665,262621545,"utils/train_utils.py",4359,0,"",python,selection_command
+666,262621965,"utils/train_utils.py",4360,0,"",python,selection_command
+667,262622480,"utils/train_utils.py",4361,0,"",python,selection_command
+668,262623531,"utils/train_utils.py",4362,0,"",python,selection_command
+669,262623632,"utils/train_utils.py",4362,0,"\n ",python,content
+670,262623829,"utils/train_utils.py",4366,0,"",python,selection_command
+671,262624205,"utils/train_utils.py",4375,0,"",python,selection_command
+672,262624454,"utils/train_utils.py",4384,0,"",python,selection_command
+673,262624486,"utils/train_utils.py",4390,0,"",python,selection_command
+674,262624697,"utils/train_utils.py",4393,0,"",python,selection_command
+675,262625403,"utils/train_utils.py",4391,0,"",python,selection_command
+676,262626881,"utils/train_utils.py",4391,0,"\n ",python,content
+677,262627231,"utils/train_utils.py",4395,0,"",python,selection_command
+678,262627459,"utils/train_utils.py",4366,0,"",python,selection_command
+679,262627680,"utils/train_utils.py",4391,0,"\n ",python,content
+680,262631185,"utils/train_utils.py",4396,0,"\n ",python,content
+681,262631186,"utils/train_utils.py",4392,4,"",python,content
+682,262631496,"utils/train_utils.py",4397,0,"I",python,content
+683,262631497,"utils/train_utils.py",4398,0,"",python,selection_keyboard
+684,262631747,"utils/train_utils.py",4398,0,"n",python,content
+685,262631747,"utils/train_utils.py",4399,0,"",python,selection_keyboard
+686,262631892,"utils/train_utils.py",4399,0,"s",python,content
+687,262631893,"utils/train_utils.py",4400,0,"",python,selection_keyboard
+688,262632013,"utils/train_utils.py",4400,0,"i",python,content
+689,262632013,"utils/train_utils.py",4401,0,"",python,selection_keyboard
+690,262632060,"utils/train_utils.py",4401,0,"p",python,content
+691,262632062,"utils/train_utils.py",4402,0,"",python,selection_keyboard
+692,262632667,"utils/train_utils.py",4402,0,"r",python,content
+693,262632668,"utils/train_utils.py",4403,0,"",python,selection_keyboard
+694,262632780,"utils/train_utils.py",4403,0,"e",python,content
+695,262632781,"utils/train_utils.py",4404,0,"",python,selection_keyboard
+696,262633066,"utils/train_utils.py",4397,7,"",python,content
+697,262633298,"utils/train_utils.py",4397,0,"I",python,content
+698,262633300,"utils/train_utils.py",4398,0,"",python,selection_keyboard
+699,262633586,"utils/train_utils.py",4398,0,"n",python,content
+700,262633587,"utils/train_utils.py",4399,0,"",python,selection_keyboard
+701,262634097,"utils/train_utils.py",4399,0,"s",python,content
+702,262634098,"utils/train_utils.py",4400,0,"",python,selection_keyboard
+703,262634565,"utils/train_utils.py",4400,0,"p",python,content
+704,262634569,"utils/train_utils.py",4401,0,"",python,selection_keyboard
+705,262634593,"utils/train_utils.py",4401,0,"i",python,content
+706,262634595,"utils/train_utils.py",4402,0,"",python,selection_keyboard
+707,262634725,"utils/train_utils.py",4402,0,"r",python,content
+708,262634726,"utils/train_utils.py",4403,0,"",python,selection_keyboard
+709,262634782,"utils/train_utils.py",4403,0,"e",python,content
+710,262634783,"utils/train_utils.py",4404,0,"",python,selection_keyboard
+711,262634952,"utils/train_utils.py",4404,0,"d",python,content
+712,262634953,"utils/train_utils.py",4405,0,"",python,selection_keyboard
+713,262635076,"utils/train_utils.py",4405,0," ",python,content
+714,262635077,"utils/train_utils.py",4406,0,"",python,selection_keyboard
+715,262635177,"utils/train_utils.py",4406,0,"b",python,content
+716,262635178,"utils/train_utils.py",4407,0,"",python,selection_keyboard
+717,262635256,"utils/train_utils.py",4407,0,"y",python,content
+718,262635257,"utils/train_utils.py",4408,0,"",python,selection_keyboard
+719,262635375,"utils/train_utils.py",4408,0," ",python,content
+720,262635376,"utils/train_utils.py",4409,0,"",python,selection_keyboard
+721,262639111,"utils/train_utils.py",4406,3,"",python,content
+722,262639266,"utils/train_utils.py",4397,9,"",python,content
+723,262639476,"utils/train_utils.py",4397,0,"A",python,content
+724,262639478,"utils/train_utils.py",4398,0,"",python,selection_keyboard
+725,262639684,"utils/train_utils.py",4398,0,"d",python,content
+726,262639685,"utils/train_utils.py",4399,0,"",python,selection_keyboard
+727,262639810,"utils/train_utils.py",4399,0,"a",python,content
+728,262639811,"utils/train_utils.py",4400,0,"",python,selection_keyboard
+729,262639945,"utils/train_utils.py",4400,0,"p",python,content
+730,262639946,"utils/train_utils.py",4401,0,"",python,selection_keyboard
+731,262640092,"utils/train_utils.py",4401,0,"t",python,content
+732,262640094,"utils/train_utils.py",4402,0,"",python,selection_keyboard
+733,262640142,"utils/train_utils.py",4402,0,"e",python,content
+734,262640143,"utils/train_utils.py",4403,0,"",python,selection_keyboard
+735,262640312,"utils/train_utils.py",4403,0,"d",python,content
+736,262640313,"utils/train_utils.py",4404,0,"",python,selection_keyboard
+737,262640391,"utils/train_utils.py",4404,0," ",python,content
+738,262640392,"utils/train_utils.py",4405,0,"",python,selection_keyboard
+739,262640459,"utils/train_utils.py",4405,0,"f",python,content
+740,262640460,"utils/train_utils.py",4406,0,"",python,selection_keyboard
+741,262640610,"utils/train_utils.py",4406,0,"r",python,content
+742,262640611,"utils/train_utils.py",4407,0,"",python,selection_keyboard
+743,262640674,"utils/train_utils.py",4407,0,"o",python,content
+744,262640675,"utils/train_utils.py",4408,0,"",python,selection_keyboard
+745,262640708,"utils/train_utils.py",4408,0,"m",python,content
+746,262640709,"utils/train_utils.py",4409,0,"",python,selection_keyboard
+747,262640807,"utils/train_utils.py",4409,0," ",python,content
+748,262640807,"utils/train_utils.py",4410,0,"",python,selection_keyboard
+749,262645781,"utils/train_utils.py",4409,0,"",python,selection_command
+750,262645861,"utils/train_utils.py",4392,0,"",python,selection_command
+751,262646113,"utils/train_utils.py",4379,0,"",python,selection_command
+752,262646148,"utils/train_utils.py",4361,0,"",python,selection_command
+753,262646179,"utils/train_utils.py",4323,0,"",python,selection_command
+754,262646212,"utils/train_utils.py",4306,0,"",python,selection_command
+755,262646246,"utils/train_utils.py",4268,0,"",python,selection_command
+756,262646279,"utils/train_utils.py",4214,0,"",python,selection_command
+757,262646312,"utils/train_utils.py",4138,0,"",python,selection_command
+758,262646345,"utils/train_utils.py",4079,0,"",python,selection_command
+759,262646379,"utils/train_utils.py",4020,0,"",python,selection_command
+760,262646481,"utils/train_utils.py",3983,0,"",python,selection_command
+761,262646733,"utils/train_utils.py",3945,0,"",python,selection_command
+762,262646831,"utils/train_utils.py",3927,0,"",python,selection_command
+763,262646994,"utils/train_utils.py",3901,0,"",python,selection_command
+764,262647117,"utils/train_utils.py",3769,0,"",python,selection_command
+765,262647346,"utils/train_utils.py",3766,0,"",python,selection_command
+766,262647502,"utils/train_utils.py",3764,0,"",python,selection_command
+767,262647849,"utils/train_utils.py",3764,1,"",python,content
+768,262649135,"utils/train_utils.py",3764,0,":",python,content
+769,262649140,"utils/train_utils.py",3764,0,"",python,selection_command
+770,262649852,"utils/train_utils.py",3896,0,"",python,selection_command
+771,262650101,"utils/train_utils.py",3927,0,"",python,selection_command
+772,262650134,"utils/train_utils.py",3940,0,"",python,selection_command
+773,262650168,"utils/train_utils.py",3978,0,"",python,selection_command
+774,262650202,"utils/train_utils.py",4015,0,"",python,selection_command
+775,262650236,"utils/train_utils.py",4074,0,"",python,selection_command
+776,262650272,"utils/train_utils.py",4133,0,"",python,selection_command
+777,262650305,"utils/train_utils.py",4209,0,"",python,selection_command
+778,262650339,"utils/train_utils.py",4263,0,"",python,selection_command
+779,262650374,"utils/train_utils.py",4306,0,"",python,selection_command
+780,262650407,"utils/train_utils.py",4318,0,"",python,selection_command
+781,262650535,"utils/train_utils.py",4361,0,"",python,selection_command
+782,262650697,"utils/train_utils.py",4374,0,"",python,selection_command
+783,262650846,"utils/train_utils.py",4392,0,"",python,selection_command
+784,262650983,"utils/train_utils.py",4404,0,"",python,selection_command
+785,262651249,"utils/train_utils.py",4410,0,"",python,selection_command
+786,262655927,"utils/train_utils.py",4409,0,"",python,selection_command
+787,262655996,"utils/train_utils.py",4392,0,"",python,selection_command
+788,262656250,"utils/train_utils.py",4379,0,"",python,selection_command
+789,262656281,"utils/train_utils.py",4361,0,"",python,selection_command
+790,262656312,"utils/train_utils.py",4323,0,"",python,selection_command
+791,262656345,"utils/train_utils.py",4306,0,"",python,selection_command
+792,262656378,"utils/train_utils.py",4268,0,"",python,selection_command
+793,262656411,"utils/train_utils.py",4214,0,"",python,selection_command
+794,262656444,"utils/train_utils.py",4138,0,"",python,selection_command
+795,262656478,"utils/train_utils.py",4079,0,"",python,selection_command
+796,262656512,"utils/train_utils.py",4020,0,"",python,selection_command
+797,262656643,"utils/train_utils.py",3983,0,"",python,selection_command
+798,262656807,"utils/train_utils.py",3945,0,"",python,selection_command
+799,262656942,"utils/train_utils.py",3927,0,"",python,selection_command
+800,262657097,"utils/train_utils.py",3901,0,"",python,selection_command
+801,262657231,"utils/train_utils.py",3769,0,"",python,selection_command
+802,262657429,"utils/train_utils.py",3766,0,"",python,selection_command
+803,262657515,"utils/train_utils.py",3766,1,"h",python,selection_command
+804,262657578,"utils/train_utils.py",3766,5,"https",python,selection_command
+805,262657813,"utils/train_utils.py",3766,8,"https://",python,selection_command
+806,262658067,"utils/train_utils.py",3766,14,"https://github",python,selection_command
+807,262658096,"utils/train_utils.py",3766,15,"https://github.",python,selection_command
+808,262658132,"utils/train_utils.py",3766,18,"https://github.com",python,selection_command
+809,262658162,"utils/train_utils.py",3766,19,"https://github.com/",python,selection_command
+810,262658195,"utils/train_utils.py",3766,21,"https://github.com/AI",python,selection_command
+811,262658380,"utils/train_utils.py",3766,22,"https://github.com/AI-",python,selection_command
+812,262658632,"utils/train_utils.py",3766,35,"https://github.com/AI-Hypercomputer",python,selection_command
+813,262658663,"utils/train_utils.py",3766,36,"https://github.com/AI-Hypercomputer/",python,selection_command
+814,262658697,"utils/train_utils.py",3766,43,"https://github.com/AI-Hypercomputer/maxtext",python,selection_command
+815,262658734,"utils/train_utils.py",3766,44,"https://github.com/AI-Hypercomputer/maxtext/",python,selection_command
+816,262658764,"utils/train_utils.py",3766,48,"https://github.com/AI-Hypercomputer/maxtext/blob",python,selection_command
+817,262658798,"utils/train_utils.py",3766,49,"https://github.com/AI-Hypercomputer/maxtext/blob/",python,selection_command
+818,262658832,"utils/train_utils.py",3766,89,"https://github.com/AI-Hypercomputer/maxtext/blob/7898576359bacde81be25cb3038e348aac1f943b",python,selection_command
+819,262658865,"utils/train_utils.py",3766,90,"https://github.com/AI-Hypercomputer/maxtext/blob/7898576359bacde81be25cb3038e348aac1f943b/",python,selection_command
+820,262658899,"utils/train_utils.py",3766,97,"https://github.com/AI-Hypercomputer/maxtext/blob/7898576359bacde81be25cb3038e348aac1f943b/MaxText",python,selection_command
+821,262658933,"utils/train_utils.py",3766,98,"https://github.com/AI-Hypercomputer/maxtext/blob/7898576359bacde81be25cb3038e348aac1f943b/MaxText/",python,selection_command
+822,262658966,"utils/train_utils.py",3766,107,"https://github.com/AI-Hypercomputer/maxtext/blob/7898576359bacde81be25cb3038e348aac1f943b/MaxText/max_utils",python,selection_command
+823,262658999,"utils/train_utils.py",3766,108,"https://github.com/AI-Hypercomputer/maxtext/blob/7898576359bacde81be25cb3038e348aac1f943b/MaxText/max_utils.",python,selection_command
+824,262659034,"utils/train_utils.py",3766,110,"https://github.com/AI-Hypercomputer/maxtext/blob/7898576359bacde81be25cb3038e348aac1f943b/MaxText/max_utils.py",python,selection_command
+825,262659067,"utils/train_utils.py",3766,111,"https://github.com/AI-Hypercomputer/maxtext/blob/7898576359bacde81be25cb3038e348aac1f943b/MaxText/max_utils.py#",python,selection_command
+826,262659101,"utils/train_utils.py",3766,115,"https://github.com/AI-Hypercomputer/maxtext/blob/7898576359bacde81be25cb3038e348aac1f943b/MaxText/max_utils.py#L713",python,selection_command
+827,262659769,"utils/train_utils.py",3766,0,"",python,selection_command
+828,262660013,"utils/train_utils.py",3898,0,"",python,selection_command
+829,262660262,"utils/train_utils.py",3927,0,"",python,selection_command
+830,262660297,"utils/train_utils.py",3942,0,"",python,selection_command
+831,262660323,"utils/train_utils.py",3980,0,"",python,selection_command
+832,262660358,"utils/train_utils.py",4017,0,"",python,selection_command
+833,262660399,"utils/train_utils.py",4076,0,"",python,selection_command
+834,262660424,"utils/train_utils.py",4135,0,"",python,selection_command
+835,262660458,"utils/train_utils.py",4211,0,"",python,selection_command
+836,262660492,"utils/train_utils.py",4265,0,"",python,selection_command
+837,262660526,"utils/train_utils.py",4306,0,"",python,selection_command
+838,262660560,"utils/train_utils.py",4320,0,"",python,selection_command
+839,262660857,"utils/train_utils.py",4361,0,"",python,selection_command
+840,262661009,"utils/train_utils.py",4376,0,"",python,selection_command
+841,262661152,"utils/train_utils.py",4392,0,"",python,selection_command
+842,262661292,"utils/train_utils.py",4406,0,"",python,selection_command
+843,262661510,"utils/train_utils.py",4410,0,"",python,selection_command
+844,262661725,"utils/train_utils.py",4409,0,"",python,selection_command
+845,262661749,"utils/train_utils.py",4410,0,"https://github.com/AI-Hypercomputer/maxtext/blob/7898576359bacde81be25cb3038e348aac1f943b/MaxText/max_utils.py#L713",python,content
+846,262661753,"utils/train_utils.py",4524,0,"",python,selection_command
+847,262672780,"utils/train_utils.py",4523,0,"",python,selection_command
+848,262672942,"utils/train_utils.py",4522,0,"",python,selection_command
+849,262673536,"utils/train_utils.py",4522,3,"",python,content
+850,262674564,"utils/train_utils.py",4522,0,"4",python,content
+851,262674565,"utils/train_utils.py",4523,0,"",python,selection_keyboard
+852,262674950,"utils/train_utils.py",4523,0,"7",python,content
+853,262674951,"utils/train_utils.py",4524,0,"",python,selection_keyboard
+854,262675074,"utils/train_utils.py",4524,0,"3",python,content
+855,262675076,"utils/train_utils.py",4525,0,"",python,selection_keyboard
+856,262675306,"utils/train_utils.py",4524,0,"",python,selection_command
+857,262676269,"utils/train_utils.py",4525,0,"",python,selection_command
+858,262676763,"utils/train_utils.py",4524,0,"",python,selection_command
+859,262677803,"utils/train_utils.py",4532,0,"",python,selection_command
+860,262677997,"utils/train_utils.py",4548,0,"",python,selection_command
+861,262678259,"utils/train_utils.py",4532,0,"",python,selection_command
+862,262679416,"utils/train_utils.py",4524,0,"",python,selection_command
+863,262679561,"utils/train_utils.py",4392,0,"",python,selection_command
+864,262679866,"utils/train_utils.py",4392,1,"",python,content
+865,262679876,"utils/train_utils.py",4396,0,"",python,selection_command
+866,262680813,"utils/train_utils.py",4367,0,"",python,selection_command
+867,262681420,"utils/train_utils.py",4391,0,"\n ",python,content
+868,262681442,"utils/train_utils.py",4392,4,"",python,content
+869,262682444,"utils/train_utils.py",4392,1,"",python,content
+870,262682452,"utils/train_utils.py",4396,0,"",python,selection_command
+871,262684103,"utils/train_utils.py",4367,0,"",python,selection_command
+872,262686248,"utils/train_utils.py",4391,0,"",python,selection_command
+873,262686430,"utils/train_utils.py",4391,0,".",python,content
+874,262686432,"utils/train_utils.py",4392,0,"",python,selection_keyboard
+875,262686558,"utils/train_utils.py",4391,0,"",python,selection_command
+876,262688204,"utils/train_utils.py",4391,1,"",python,content
+877,262688211,"utils/train_utils.py",4390,0,"",python,selection_command
+878,262692398,"utils/train_utils.py",4391,0,"",python,selection_command
+879,262692508,"utils/train_utils.py",4391,0,".",python,content
+880,262692510,"utils/train_utils.py",4392,0,"",python,selection_keyboard
+881,262692617,"utils/train_utils.py",4391,0,"",python,selection_command
+882,262700363,"utils/train_utils.py",4421,0,"",python,selection_command
+883,262700507,"utils/train_utils.py",4532,0,"",python,selection_command
+884,262700659,"utils/train_utils.py",4548,0,"",python,selection_command
+885,262700796,"utils/train_utils.py",4550,0,"",python,selection_command
+886,262700957,"utils/train_utils.py",4571,0,"",python,selection_command
+887,262701210,"utils/train_utils.py",4573,0,"",python,selection_command
+888,262701242,"utils/train_utils.py",4594,0,"",python,selection_command
+889,262701275,"utils/train_utils.py",4596,0,"",python,selection_command
+890,262701311,"utils/train_utils.py",4625,0,"",python,selection_command
+891,262801033,"utils/train_utils.py",4596,0,"",python,selection_command
+892,262801283,"utils/train_utils.py",4594,0,"",python,selection_command
+893,262801315,"utils/train_utils.py",4573,0,"",python,selection_command
+894,262801350,"utils/train_utils.py",4571,0,"",python,selection_command
+895,262801382,"utils/train_utils.py",4550,0,"",python,selection_command
+896,262801419,"utils/train_utils.py",4548,0,"",python,selection_command
+897,262801451,"utils/train_utils.py",4532,0,"",python,selection_command
+898,262801486,"utils/train_utils.py",4421,0,"",python,selection_command
+899,262801520,"utils/train_utils.py",4391,0,"",python,selection_command
+900,262801553,"utils/train_utils.py",4361,0,"",python,selection_command
+901,262801586,"utils/train_utils.py",4335,0,"",python,selection_command
+902,262801619,"utils/train_utils.py",4306,0,"",python,selection_command
+903,262801653,"utils/train_utils.py",4280,0,"",python,selection_command
+904,262801894,"utils/train_utils.py",4306,0,"",python,selection_command
+905,262802313,"utils/train_utils.py",4280,0,"",python,selection_command
+906,262802562,"utils/train_utils.py",4305,0,"\n ",python,content
+907,262802766,"utils/train_utils.py",4314,0,"\n ",python,content
+908,262802766,"utils/train_utils.py",4306,8,"",python,content
+909,262803094,"utils/train_utils.py",4311,4,"",python,content
+910,262803423,"utils/train_utils.py",4307,4,"",python,content
+911,262803908,"utils/train_utils.py",4307,0,"d",python,content
+912,262803909,"utils/train_utils.py",4308,0,"",python,selection_keyboard
+913,262804064,"utils/train_utils.py",4308,0,"e",python,content
+914,262804065,"utils/train_utils.py",4309,0,"",python,selection_keyboard
+915,262804125,"utils/train_utils.py",4309,0,"f",python,content
+916,262804127,"utils/train_utils.py",4310,0,"",python,selection_keyboard
+917,262804209,"utils/train_utils.py",4310,0," ",python,content
+918,262804211,"utils/train_utils.py",4311,0,"",python,selection_keyboard
+919,262804438,"utils/train_utils.py",4311,0,"_",python,content
+920,262804439,"utils/train_utils.py",4312,0,"",python,selection_keyboard
+921,262804660,"utils/train_utils.py",4312,0,"c",python,content
+922,262804662,"utils/train_utils.py",4313,0,"",python,selection_keyboard
+923,262804747,"utils/train_utils.py",4313,0,"a",python,content
+924,262804748,"utils/train_utils.py",4314,0,"",python,selection_keyboard
+925,262804847,"utils/train_utils.py",4314,0,"l",python,content
+926,262804848,"utils/train_utils.py",4315,0,"",python,selection_keyboard
+927,262804955,"utils/train_utils.py",4315,0,"c",python,content
+928,262804956,"utils/train_utils.py",4316,0,"",python,selection_keyboard
+929,262805095,"utils/train_utils.py",4316,0,"u",python,content
+930,262805096,"utils/train_utils.py",4317,0,"",python,selection_keyboard
+931,262805230,"utils/train_utils.py",4317,0,"l",python,content
+932,262805231,"utils/train_utils.py",4318,0,"",python,selection_keyboard
+933,262805311,"utils/train_utils.py",4318,0,"a",python,content
+934,262805312,"utils/train_utils.py",4319,0,"",python,selection_keyboard
+935,262805389,"utils/train_utils.py",4319,0,"t",python,content
+936,262805390,"utils/train_utils.py",4320,0,"",python,selection_keyboard
+937,262805453,"utils/train_utils.py",4320,0,"e",python,content
+938,262805455,"utils/train_utils.py",4321,0,"",python,selection_keyboard
+939,262805670,"utils/train_utils.py",4321,0,"_",python,content
+940,262805671,"utils/train_utils.py",4322,0,"",python,selection_keyboard
+941,262807594,"utils/train_utils.py",4322,0,"f",python,content
+942,262807595,"utils/train_utils.py",4323,0,"",python,selection_keyboard
+943,262807724,"utils/train_utils.py",4323,0,"f",python,content
+944,262807726,"utils/train_utils.py",4324,0,"",python,selection_keyboard
+945,262807895,"utils/train_utils.py",4324,0,"n",python,content
+946,262807896,"utils/train_utils.py",4325,0,"",python,selection_keyboard
+947,262808814,"utils/train_utils.py",4325,0,"_",python,content
+948,262808815,"utils/train_utils.py",4326,0,"",python,selection_keyboard
+949,262809257,"utils/train_utils.py",4326,0,"t",python,content
+950,262809259,"utils/train_utils.py",4327,0,"",python,selection_keyboard
+951,262809416,"utils/train_utils.py",4327,0,"f",python,content
+952,262809417,"utils/train_utils.py",4328,0,"",python,selection_keyboard
+953,262809454,"utils/train_utils.py",4328,0,"l",python,content
+954,262809455,"utils/train_utils.py",4329,0,"",python,selection_keyboard
+955,262809600,"utils/train_utils.py",4329,0,"o",python,content
+956,262809601,"utils/train_utils.py",4330,0,"",python,selection_keyboard
+957,262809626,"utils/train_utils.py",4330,0,"p",python,content
+958,262809627,"utils/train_utils.py",4331,0,"",python,selection_keyboard
+959,262809721,"utils/train_utils.py",4331,0,"s",python,content
+960,262809722,"utils/train_utils.py",4332,0,"",python,selection_keyboard
+961,262816623,"utils/train_utils.py",4332,0,"_",python,content
+962,262816625,"utils/train_utils.py",4333,0,"",python,selection_keyboard
+963,262816821,"utils/train_utils.py",4333,0,"p",python,content
+964,262816822,"utils/train_utils.py",4334,0,"",python,selection_keyboard
+965,262816887,"utils/train_utils.py",4334,0,"e",python,content
+966,262816890,"utils/train_utils.py",4335,0,"",python,selection_keyboard
+967,262816964,"utils/train_utils.py",4335,0,"r",python,content
+968,262816964,"utils/train_utils.py",4336,0,"",python,selection_keyboard
+969,262817164,"utils/train_utils.py",4336,0,"_",python,content
+970,262817166,"utils/train_utils.py",4337,0,"",python,selection_keyboard
+971,262817310,"utils/train_utils.py",4337,0,"d",python,content
+972,262817311,"utils/train_utils.py",4338,0,"",python,selection_keyboard
+973,262817456,"utils/train_utils.py",4338,0,"e",python,content
+974,262817457,"utils/train_utils.py",4339,0,"",python,selection_keyboard
+975,262817539,"utils/train_utils.py",4339,0,"v",python,content
+976,262817540,"utils/train_utils.py",4340,0,"",python,selection_keyboard
+977,262817690,"utils/train_utils.py",4340,0,"i",python,content
+978,262817691,"utils/train_utils.py",4341,0,"",python,selection_keyboard
+979,262817741,"utils/train_utils.py",4341,0,"c",python,content
+980,262817743,"utils/train_utils.py",4342,0,"",python,selection_keyboard
+981,262817809,"utils/train_utils.py",4342,0,"e",python,content
+982,262817810,"utils/train_utils.py",4343,0,"",python,selection_keyboard
+983,262818693,"utils/train_utils.py",4343,0,"()",python,content
+984,262818695,"utils/train_utils.py",4344,0,"",python,selection_keyboard
+985,262818871,"utils/train_utils.py",4344,1,")",python,content
+986,262818874,"utils/train_utils.py",4345,0,"",python,selection_keyboard
+987,262819156,"utils/train_utils.py",4344,0,"",python,selection_command
+988,262819290,"utils/train_utils.py",4346,0,"",python,selection_command
+989,262819439,"utils/train_utils.py",4384,0,"",python,selection_command
+990,262819659,"utils/train_utils.py",4401,0,"",python,selection_command
+991,262819838,"utils/train_utils.py",4384,0,"",python,selection_command
+992,262820042,"utils/train_utils.py",4383,0,"",python,selection_command
+993,262820292,"utils/train_utils.py",4382,0,"",python,selection_command
+994,262820322,"utils/train_utils.py",4381,0,"",python,selection_command
+995,262820354,"utils/train_utils.py",4380,0,"",python,selection_command
+996,262820389,"utils/train_utils.py",4379,0,"",python,selection_command
+997,262820422,"utils/train_utils.py",4378,0,"",python,selection_command
+998,262820456,"utils/train_utils.py",4377,0,"",python,selection_command
+999,262820492,"utils/train_utils.py",4376,0,"",python,selection_command
+1000,262820523,"utils/train_utils.py",4375,0,"",python,selection_command
+1001,262820776,"utils/train_utils.py",4374,0,"",python,selection_command
+1002,262821056,"utils/train_utils.py",4375,0,"",python,selection_command
+1003,262821198,"utils/train_utils.py",4375,1,"g",python,selection_command
+1004,262821255,"utils/train_utils.py",4374,2,"ng",python,selection_command
+1005,262821507,"utils/train_utils.py",4373,3,"ing",python,selection_command
+1006,262821540,"utils/train_utils.py",4372,4,"ning",python,selection_command
+1007,262821571,"utils/train_utils.py",4371,5,"ining",python,selection_command
+1008,262821604,"utils/train_utils.py",4370,6,"aining",python,selection_command
+1009,262821638,"utils/train_utils.py",4369,7,"raining",python,selection_command
+1010,262821673,"utils/train_utils.py",4368,8,"training",python,selection_command
+1011,262821911,"utils/train_utils.py",4367,9,"_training",python,selection_command
+1012,262822071,"utils/train_utils.py",4367,9,"",python,content
+1013,262824081,"utils/train_utils.py",4346,0,"",python,selection_command
+1014,262824232,"utils/train_utils.py",4327,0,"",python,selection_command
+1015,262825161,"utils/train_utils.py",4345,0,"",python,selection_command
+1016,262825443,"utils/train_utils.py",4344,0,"",python,selection_command
+1017,262828005,"utils/train_utils.py",4344,0,"a",python,content
+1018,262828007,"utils/train_utils.py",4345,0,"",python,selection_keyboard
+1019,262828068,"utils/train_utils.py",4345,0,"r",python,content
+1020,262828070,"utils/train_utils.py",4346,0,"",python,selection_keyboard
+1021,262828221,"utils/train_utils.py",4346,0,"g",python,content
+1022,262828222,"utils/train_utils.py",4347,0,"",python,selection_keyboard
+1023,262828287,"utils/train_utils.py",4347,0,"s",python,content
+1024,262828289,"utils/train_utils.py",4348,0,"",python,selection_keyboard
+1025,262829043,"utils/train_utils.py",4347,0,"",python,selection_command
+1026,262829390,"utils/train_utils.py",4349,0,"",python,selection_command
+1027,262829489,"utils/train_utils.py",4349,0,":",python,content
+1028,262829491,"utils/train_utils.py",4350,0,"",python,selection_keyboard
+1029,262830014,"utils/train_utils.py",4350,0,"\n ",python,content
+1030,262842724,"utils/train_utils.py",4351,4,"",python,content
+1031,262842781,"utils/train_utils.py",4351,0,"_training",python,content
+1032,262842784,"utils/train_utils.py",4359,0,"",python,selection_command
+1033,262843643,"utils/train_utils.py",4351,9,"",python,content
+1034,262847336,"utils/train_utils.py",4351,0," """"""Helper function to calculate matmul TFLOP in ffn based on MLP dimension.\n\n Applies to:\n - Dense FFN layers (mlp_dim = config.mlp_dim).\n - MoE FFN layers (mlp_dim = config.moe_mlp_dim),\n need to scale by shared_experts or num_experts_per_tok.\n """"""",python,content
+1035,262847641,"utils/train_utils.py",4614,0,"",python,selection_command
+1036,262848594,"utils/train_utils.py",4552,0,"",python,selection_command
+1037,262848843,"utils/train_utils.py",4499,0,"",python,selection_command
+1038,262848976,"utils/train_utils.py",4552,0,"",python,selection_command
+1039,262849142,"utils/train_utils.py",4499,0,"",python,selection_command
+1040,262849448,"utils/train_utils.py",4552,0,"",python,selection_command
+1041,262849692,"utils/train_utils.py",4614,0,"",python,selection_command
+1042,262849943,"utils/train_utils.py",4616,0,"",python,selection_command
+1043,262849977,"utils/train_utils.py",4621,0,"",python,selection_command
+1044,262850090,"utils/train_utils.py",4660,0,"",python,selection_command
+1045,262853926,"utils/train_utils.py",4668,0,"",python,selection_command
+1046,262854229,"utils/train_utils.py",4698,0,"",python,selection_command
+1047,262854804,"utils/train_utils.py",4668,0,"",python,selection_command
+1048,262855059,"utils/train_utils.py",4660,0,"",python,selection_command
+1049,262855088,"utils/train_utils.py",4621,0,"",python,selection_command
+1050,262855121,"utils/train_utils.py",4616,0,"",python,selection_command
+1051,262855156,"utils/train_utils.py",4614,0,"",python,selection_command
+1052,262855190,"utils/train_utils.py",4552,0,"",python,selection_command
+1053,262855224,"utils/train_utils.py",4499,0,"",python,selection_command
+1054,262855258,"utils/train_utils.py",4448,0,"",python,selection_command
+1055,262855291,"utils/train_utils.py",4434,0,"",python,selection_command
+1056,262855325,"utils/train_utils.py",4429,0,"",python,selection_command
+1057,262855458,"utils/train_utils.py",4355,0,"",python,selection_command
+1058,262855626,"utils/train_utils.py",4311,0,"",python,selection_command
+1059,262855857,"utils/train_utils.py",4355,0,"",python,selection_command
+1060,262856042,"utils/train_utils.py",4428,0,"\n Adapted from https://github.com/AI-Hypercomputer/maxtext/blob/7898576359bacde81be25cb3038e348aac1f943b/MaxText/max_utils.py#L473",python,content
+1061,262856045,"utils/train_utils.py",4433,0,"",python,selection_command
+1062,262858106,"utils/train_utils.py",4429,4,"",python,content
+1063,262859055,"utils/train_utils.py",4429,0," ",python,content
+1064,262859691,"utils/train_utils.py",4432,0,"",python,selection_command
+1065,262860345,"utils/train_utils.py",4433,0,"",python,selection_command
+1066,262860474,"utils/train_utils.py",4355,0,"",python,selection_command
+1067,262865595,"utils/train_utils.py",3716,0,"",python,selection_command
+1068,262866080,"utils/train_utils.py",2488,0,"",python,selection_command
+1069,262866427,"utils/train_utils.py",1988,0,"",python,selection_command
+1070,262866761,"utils/train_utils.py",1273,0,"",python,selection_command
+1071,262868792,"utils/train_utils.py",5035,0,"",python,selection_command
+1072,262869427,"utils/train_utils.py",5030,0,"",python,selection_command
+1073,262869676,"utils/train_utils.py",5012,0,"",python,selection_command
+1074,262869708,"utils/train_utils.py",5007,0,"",python,selection_command
+1075,262869741,"utils/train_utils.py",4989,0,"",python,selection_command
+1076,262869775,"utils/train_utils.py",4984,0,"",python,selection_command
+1077,262869808,"utils/train_utils.py",4972,0,"",python,selection_command
+1078,262869841,"utils/train_utils.py",4964,0,"",python,selection_command
+1079,262869874,"utils/train_utils.py",4831,0,"",python,selection_command
+1080,262869988,"utils/train_utils.py",4801,0,"",python,selection_command
+1081,262870458,"utils/train_utils.py",4797,4,"",python,content
+1082,262870659,"utils/train_utils.py",4796,1,"",python,content
+1083,262871908,"utils/train_utils.py",4795,0,"",python,selection_command
+1084,262874481,"utils/train_utils.py",4821,0,"\n ",python,content
+1085,262874842,"utils/train_utils.py",4822,4,"",python,content
+1086,262875448,"utils/train_utils.py",4789,0,"",python,selection_command
+1087,262875703,"utils/train_utils.py",4750,0,"",python,selection_command
+1088,262875736,"utils/train_utils.py",4749,0,"",python,selection_command
+1089,262875768,"utils/train_utils.py",4743,0,"",python,selection_command
+1090,262875801,"utils/train_utils.py",4681,0,"",python,selection_command
+1091,262875920,"utils/train_utils.py",4628,0,"",python,selection_command
+1092,262876082,"utils/train_utils.py",4577,0,"",python,selection_command
+1093,262876231,"utils/train_utils.py",4563,0,"",python,selection_command
+1094,262876629,"utils/train_utils.py",4562,0,"",python,selection_command
+1095,262876939,"utils/train_utils.py",4563,0,"",python,selection_command
+1096,262877192,"utils/train_utils.py",4577,0,"",python,selection_command
+1097,262877221,"utils/train_utils.py",4628,0,"",python,selection_command
+1098,262877254,"utils/train_utils.py",4681,0,"",python,selection_command
+1099,262877287,"utils/train_utils.py",4743,0,"",python,selection_command
+1100,262877461,"utils/train_utils.py",4749,0,"",python,selection_command
+1101,262877622,"utils/train_utils.py",4750,0,"",python,selection_command
+1102,262877932,"utils/train_utils.py",4789,0,"",python,selection_command
+1103,262878112,"utils/train_utils.py",4750,0,"",python,selection_command
+1104,262878361,"utils/train_utils.py",4749,0,"",python,selection_command
+1105,262878393,"utils/train_utils.py",4743,0,"",python,selection_command
+1106,262878428,"utils/train_utils.py",4681,0,"",python,selection_command
+1107,262878597,"utils/train_utils.py",4628,0,"",python,selection_command
+1108,262878776,"utils/train_utils.py",4577,0,"",python,selection_command
+1109,262878927,"utils/train_utils.py",4563,0,"",python,selection_command
+1110,262879827,"utils/train_utils.py",4562,0,"",python,selection_command
+1111,262879976,"utils/train_utils.py",4429,0,"",python,selection_command
+1112,262880149,"utils/train_utils.py",4351,0,"",python,selection_command
+1113,262881247,"utils/train_utils.py",4429,0,"",python,selection_command
+1114,262881611,"utils/train_utils.py",4433,0,"",python,selection_command
+1115,262881807,"utils/train_utils.py",4429,4,"",python,content
+1116,262883041,"utils/train_utils.py",4429,0," ",python,content
+1117,262883042,"utils/train_utils.py",4430,0,"",python,selection_keyboard
+1118,262883491,"utils/train_utils.py",4430,0," ",python,content
+1119,262883493,"utils/train_utils.py",4431,0,"",python,selection_keyboard
+1120,262884043,"utils/train_utils.py",4430,0,"",python,selection_command
+1121,262884409,"utils/train_utils.py",4560,0,"",python,selection_command
+1122,262885029,"utils/train_utils.py",4429,0,"",python,selection_command
+1123,262885241,"utils/train_utils.py",4430,0,"",python,selection_command
+1124,262885477,"utils/train_utils.py",4560,0,"",python,selection_command
+1125,262886126,"utils/train_utils.py",4561,0,"",python,selection_command
+1126,262886224,"utils/train_utils.py",4562,0,"",python,selection_command
+1127,262886438,"utils/train_utils.py",4560,0,"",python,selection_command
+1128,262886577,"utils/train_utils.py",4430,0,"",python,selection_command
+1129,262886763,"utils/train_utils.py",4352,0,"",python,selection_command
+1130,262887039,"utils/train_utils.py",4353,0,"",python,selection_command
+1131,262887175,"utils/train_utils.py",4431,0,"",python,selection_command
+1132,268250758,"utils/train_utils.py",4560,0,"",python,selection_command
+1133,268251008,"utils/train_utils.py",4563,0,"",python,selection_command
+1134,268251039,"utils/train_utils.py",4577,0,"",python,selection_command
+1135,268251072,"utils/train_utils.py",4628,0,"",python,selection_command
+1136,268251110,"utils/train_utils.py",4681,0,"",python,selection_command
+1137,268251217,"utils/train_utils.py",4743,0,"",python,selection_command
+1138,268251362,"utils/train_utils.py",4747,0,"",python,selection_command
+1139,268251484,"utils/train_utils.py",4750,0,"",python,selection_command
+1140,268362107,"utils/train_utils.py",4773,0,"",python,selection_mouse
+1141,268363267,"utils/train_utils.py",4778,0,"",python,selection_command
+1142,268364051,"utils/train_utils.py",4779,0,"",python,selection_command
+1143,268364319,"utils/train_utils.py",4779,0,"_",python,content
+1144,268364321,"utils/train_utils.py",4780,0,"",python,selection_keyboard
+1145,268364603,"utils/train_utils.py",4780,0,"T",python,content
+1146,268364605,"utils/train_utils.py",4781,0,"",python,selection_keyboard
+1147,268364995,"utils/train_utils.py",4780,1,"",python,content
+1148,268365113,"utils/train_utils.py",4780,0,"t",python,content
+1149,268365115,"utils/train_utils.py",4781,0,"",python,selection_keyboard
+1150,268365177,"utils/train_utils.py",4781,0,"o",python,content
+1151,268365179,"utils/train_utils.py",4782,0,"",python,selection_keyboard
+1152,268365215,"utils/train_utils.py",4782,0,"k",python,content
+1153,268365216,"utils/train_utils.py",4783,0,"",python,selection_keyboard
+1154,268365328,"utils/train_utils.py",4783,0,"e",python,content
+1155,268365330,"utils/train_utils.py",4784,0,"",python,selection_keyboard
+1156,268365405,"utils/train_utils.py",4784,0,"n",python,content
+1157,268365424,"utils/train_utils.py",4785,0,"",python,selection_keyboard
+1158,268365482,"utils/train_utils.py",4785,0,"i",python,content
+1159,268365484,"utils/train_utils.py",4786,0,"",python,selection_keyboard
+1160,268365511,"utils/train_utils.py",4786,0,"z",python,content
+1161,268365513,"utils/train_utils.py",4787,0,"",python,selection_keyboard
+1162,268365651,"utils/train_utils.py",4787,0,"e",python,content
+1163,268365653,"utils/train_utils.py",4788,0,"",python,selection_keyboard
+1164,268365695,"utils/train_utils.py",4788,0,"r",python,content
+1165,268365696,"utils/train_utils.py",4789,0,"",python,selection_keyboard
+1166,268365898,"utils/train_utils.py",4788,0,"",python,selection_command
+1167,268366188,"utils/train_utils.py",4828,0,"",python,selection_command
+1168,268366971,"utils/train_utils.py",4823,0,"",python,selection_command
+1169,268367745,"utils/train_utils.py",4828,0," for tokenizer",python,content
+1170,268367748,"utils/train_utils.py",4842,0,"",python,selection_command
+1171,268368583,"utils/train_utils.py",4793,0,"",python,selection_command
+1172,268368940,"utils/train_utils.py",4748,48,"def calculate_tflops_per_device_tokenizer(args):",python,selection_command
+1173,268369101,"utils/train_utils.py",4748,95,"def calculate_tflops_per_device_tokenizer(args):\n """"""Calculate training TFLOP for tokenizer.",python,selection_command
+1174,268369358,"utils/train_utils.py",4748,96,"def calculate_tflops_per_device_tokenizer(args):\n """"""Calculate training TFLOP for tokenizer.\n",python,selection_command
+1175,268369394,"utils/train_utils.py",4748,229,"def calculate_tflops_per_device_tokenizer(args):\n """"""Calculate training TFLOP for tokenizer.\n\n Adapted from https://github.com/AI-Hypercomputer/maxtext/blob/7898576359bacde81be25cb3038e348aac1f943b/MaxText/max_utils.py#L473",python,selection_command
+1176,268369423,"utils/train_utils.py",4748,237,"def calculate_tflops_per_device_tokenizer(args):\n """"""Calculate training TFLOP for tokenizer.\n\n Adapted from https://github.com/AI-Hypercomputer/maxtext/blob/7898576359bacde81be25cb3038e348aac1f943b/MaxText/max_utils.py#L473\n """"""",python,selection_command
+1177,268369455,"utils/train_utils.py",4748,253,"def calculate_tflops_per_device_tokenizer(args):\n """"""Calculate training TFLOP for tokenizer.\n\n Adapted from https://github.com/AI-Hypercomputer/maxtext/blob/7898576359bacde81be25cb3038e348aac1f943b/MaxText/max_utils.py#L473\n """"""\n # MLP flops",python,selection_command
+1178,268369489,"utils/train_utils.py",4748,254,"def calculate_tflops_per_device_tokenizer(args):\n """"""Calculate training TFLOP for tokenizer.\n\n Adapted from https://github.com/AI-Hypercomputer/maxtext/blob/7898576359bacde81be25cb3038e348aac1f943b/MaxText/max_utils.py#L473\n """"""\n # MLP flops\n",python,selection_command
+1179,268369526,"utils/train_utils.py",4748,276,"def calculate_tflops_per_device_tokenizer(args):\n """"""Calculate training TFLOP for tokenizer.\n\n Adapted from https://github.com/AI-Hypercomputer/maxtext/blob/7898576359bacde81be25cb3038e348aac1f943b/MaxText/max_utils.py#L473\n """"""\n # MLP flops\n\n # Attention flops",python,selection_command
+1180,268369559,"utils/train_utils.py",4748,277,"def calculate_tflops_per_device_tokenizer(args):\n """"""Calculate training TFLOP for tokenizer.\n\n Adapted from https://github.com/AI-Hypercomputer/maxtext/blob/7898576359bacde81be25cb3038e348aac1f943b/MaxText/max_utils.py#L473\n """"""\n # MLP flops\n\n # Attention flops\n",python,selection_command
+1181,268369593,"utils/train_utils.py",4748,299,"def calculate_tflops_per_device_tokenizer(args):\n """"""Calculate training TFLOP for tokenizer.\n\n Adapted from https://github.com/AI-Hypercomputer/maxtext/blob/7898576359bacde81be25cb3038e348aac1f943b/MaxText/max_utils.py#L473\n """"""\n # MLP flops\n\n # Attention flops\n\n # Embedding flops",python,selection_command
+1182,268369626,"utils/train_utils.py",4748,300,"def calculate_tflops_per_device_tokenizer(args):\n """"""Calculate training TFLOP for tokenizer.\n\n Adapted from https://github.com/AI-Hypercomputer/maxtext/blob/7898576359bacde81be25cb3038e348aac1f943b/MaxText/max_utils.py#L473\n """"""\n # MLP flops\n\n # Attention flops\n\n # Embedding flops\n",python,selection_command
+1183,268369830,"utils/train_utils.py",4748,342,"def calculate_tflops_per_device_tokenizer(args):\n """"""Calculate training TFLOP for tokenizer.\n\n Adapted from https://github.com/AI-Hypercomputer/maxtext/blob/7898576359bacde81be25cb3038e348aac1f943b/MaxText/max_utils.py#L473\n """"""\n # MLP flops\n\n # Attention flops\n\n # Embedding flops\n\n # Combine flops with number of blocks",python,selection_command
+1184,268577360,"utils/train_utils.py",5049,0," return result\n",python,content
+1185,268577360,"utils/train_utils.py",5026,21," to_tflops = 1.0 / 1e12\n result = {\n ""encoder_attention_tflops"": ((qkv_spatial + attn_spatial + proj_spatial + qkv_temporal + attn_temporal + proj_temporal) * num_blocks) * to_tflops * 3.0,\n ""encoder_ffn_tflops"": (ffn_block * num_blocks) * to_tflops * 3.0,\n ""encoder_io_tflops"": encoder_io * to_tflops * 3.0,\n ""decoder_attention_tflops"": ((qkv_spatial + attn_spatial + proj_spatial + qkv_temporal + attn_temporal + proj_temporal) * num_blocks) * to_tflops * 3.0,\n ""decoder_ffn_tflops"": (ffn_block * num_blocks) * to_tflops * 3.0,\n ""decoder_io_tflops"": decoder_io * to_tflops * 3.0,\n ""vq_tflops"": vq_flops * to_tflops,\n ""total_tflops"": total_training_flops * to_tflops,\n ""per_device_batch_size"": batch_per_device,\n ""num_patches_per_frame"": N,\n }",python,content
+1186,268577360,"utils/train_utils.py",5003,21," # Training FLOPs: scale forward by ~3 for fwd+bwd+update, add VQ once\n total_training_flops = 3.0 * forward_flops + vq_flops",python,content
+1187,268577360,"utils/train_utils.py",4986,15," # Per-device batch\n num_devices = max(1, jax.device_count())\n batch_per_device = args.batch_size // num_devices\n\n # Shapes\n T = int(args.seq_len)\n H = int(args.image_height)\n W = int(args.image_width)\n patch = int(args.patch_size)\n # Number of patches per frame (ceil division with padding as in patchify)\n hn = (H + patch - 1) // patch\n wn = (W + patch - 1) // patch\n N = int(hn * wn)\n\n # Model dims\n M = int(args.model_dim)\n D_ffn = int(args.ffn_dim)\n num_blocks = int(args.num_blocks)\n\n # Convenience aliases\n B = float(batch_per_device)\n T_f = float(T)\n N_f = float(N)\n M_f = float(M)\n D_ffn_f = float(D_ffn)\n\n # Token counts for projections\n tokens_spatial = B * T_f * N_f # spatial attention runs per frame over N tokens\n tokens_temporal = B * N_f * T_f # temporal attention runs per spatial position over T tokens\n\n # Per-block FLOPs\n # Spatial attention (non-causal)\n qkv_spatial = 3.0 * 2.0 * tokens_spatial * M_f * M_f\n attn_spatial = 4.0 * (B * T_f) * (N_f * N_f) * M_f\n proj_spatial = 2.0 * tokens_spatial * M_f * M_f\n\n # Temporal attention (causal mask halves the non-causal complexity)\n qkv_temporal = 3.0 * 2.0 * tokens_temporal * M_f * M_f\n attn_temporal = 2.0 * (B * N_f) * (T_f * T_f) * M_f # = 0.5 * (4 * B*N * T^2 * M)\n proj_temporal = 2.0 * tokens_temporal * M_f * M_f\n\n # FFN per block (2 matmuls)\n ffn_block = 4.0 * B * T_f * N_f * M_f * D_ffn_f\n\n block_total = (\n qkv_spatial\n + attn_spatial\n + proj_spatial\n + qkv_temporal\n + attn_temporal\n + proj_temporal\n + ffn_block\n )\n\n # Encoder/Decoder IO (input/output linears executed once per forward)\n P = float(args.image_channels * (patch ** 2)) # patch token dim\n L = float(args.latent_dim) # latent dim\n\n # Encoder: input_dim=P -> M, output_dim=M -> L\n encoder_io = 2.0 * B * T_f * N_f * (P * M_f + M_f * L)\n # Decoder: input_dim=L -> M, output_dim=M -> P\n decoder_io = 2.0 * B * T_f * N_f * (L * M_f + M_f * P)\n\n # VQ distance compute (forward only; gradients do not flow through distances with STE)\n K = float(args.num_latents)\n vq_flops = 2.0 * (B * T_f * N_f) * L * K\n\n # Forward FLOPs\n encoder_forward = num_blocks * block_total + encoder_io\n decoder_forward = num_blocks * block_total + decoder_io\n forward_flops = encoder_forward + decoder_forward",python,content
+1188,268577396,"utils/train_utils.py",8446,42,"",python,content
+1189,268588005,"utils/train_utils.py",5008,0,"",python,selection_mouse
+1190,268588017,"utils/train_utils.py",5007,0,"",python,selection_command
+1191,268588498,"utils/train_utils.py",4985,0,"",python,selection_mouse
+1192,268588504,"utils/train_utils.py",4984,0,"",python,selection_command
+1193,269956791,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"",Log,tab
+1194,269958111,"utils/train_utils.py",0,0,"",python,tab
+1195,269996737,"utils/train_utils.py",6125,0,"",python,selection_mouse
+1196,269996748,"utils/train_utils.py",6124,0,"",python,selection_command
+1197,270000412,"utils/train_utils.py",5008,0,"",python,selection_mouse
+1198,270000422,"utils/train_utils.py",5007,0,"",python,selection_command
+1199,270005070,"utils/train_utils.py",5030,0,"",python,selection_command
+1200,270005165,"utils/train_utils.py",5007,0,"",python,selection_command
+1201,270006743,"utils/train_utils.py",4986,23,"",python,content
+1202,270006790,"utils/train_utils.py",4990,0,"",python,selection_command
+1203,270010184,"utils/train_utils.py",5035,0,"",python,selection_command
+1204,270010789,"utils/train_utils.py",5085,0,"",python,selection_command
+1205,270010937,"utils/train_utils.py",5090,0,"",python,selection_command
+1206,270011821,"utils/train_utils.py",5086,13,"",python,content
+1207,270011865,"utils/train_utils.py",5090,0,"",python,selection_command
+1208,270012814,"utils/train_utils.py",5116,0,"",python,selection_command
+1209,270013099,"utils/train_utils.py",5147,0,"",python,selection_command
+1210,270013247,"utils/train_utils.py",5177,0,"",python,selection_command
+1211,270013414,"utils/train_utils.py",5210,0,"",python,selection_command
+1212,270024750,"utils/train_utils.py",5205,0,"\n ",python,content
+1213,270024967,"utils/train_utils.py",5206,4,"",python,content
+1214,270027473,"utils/train_utils.py",5207,0,"",python,selection_command
+1215,270027723,"utils/train_utils.py",5285,0,"",python,selection_command
+1216,270027753,"utils/train_utils.py",5319,0,"",python,selection_command
+1217,270027786,"utils/train_utils.py",5353,0,"",python,selection_command
+1218,270027819,"utils/train_utils.py",5374,0,"",python,selection_command
+1219,270028200,"utils/train_utils.py",5375,0,"",python,selection_command
+1220,270028365,"utils/train_utils.py",5392,0,"",python,selection_command
+1221,270028724,"utils/train_utils.py",5375,0,"",python,selection_command
+1222,270030269,"utils/train_utils.py",5375,17,"",python,content
+1223,270030301,"utils/train_utils.py",5379,0,"",python,selection_command
+1224,270030980,"utils/train_utils.py",5407,0,"",python,selection_command
+1225,270031180,"utils/train_utils.py",5437,0,"",python,selection_command
+1226,270039434,"utils/train_utils.py",5459,0,"",python,selection_command
+1227,270044251,"utils/train_utils.py",5437,0,"",python,selection_command
+1228,270044716,"utils/train_utils.py",5459,0,"",python,selection_command
+1229,270045053,"utils/train_utils.py",7266,0,"",python,selection_command
+1230,270046703,"utils/train_utils.py",7326,0,"",python,selection_command
+1231,270047435,"utils/train_utils.py",7266,0,"",python,selection_command
+1232,270047632,"utils/train_utils.py",5459,0,"",python,selection_command
+1233,270050243,"utils/train_utils.py",5433,38,"",python,content
+1234,270053419,"utils/train_utils.py",7228,0,"",python,selection_command
+1235,270054888,"utils/train_utils.py",7228,0,"args.",python,content
+1236,270055468,"utils/train_utils.py",7293,0,"args.",python,content
+1237,270055484,"utils/train_utils.py",7298,0,"",python,selection_command
+1238,270057617,"utils/train_utils.py",7363,0,"",python,selection_command
+1239,270057866,"utils/train_utils.py",7390,0,"",python,selection_command
+1240,270057898,"utils/train_utils.py",7418,0,"",python,selection_command
+1241,270057925,"utils/train_utils.py",7492,0,"",python,selection_command
+1242,270057958,"utils/train_utils.py",7523,0,"",python,selection_command
+1243,270057992,"utils/train_utils.py",7549,0,"",python,selection_command
+1244,270058024,"utils/train_utils.py",7564,0,"",python,selection_command
+1245,270058059,"utils/train_utils.py",7593,0,"",python,selection_command
+1246,270058091,"utils/train_utils.py",7754,0,"",python,selection_command
+1247,270058126,"utils/train_utils.py",7828,0,"",python,selection_command
+1248,270058159,"utils/train_utils.py",7887,0,"",python,selection_command
+1249,270058449,"utils/train_utils.py",7828,0,"",python,selection_command
+1250,270058639,"utils/train_utils.py",7754,0,"",python,selection_command
+1251,270059340,"utils/train_utils.py",7770,0,"args.",python,content
+1252,270059353,"utils/train_utils.py",7775,0,"",python,selection_command
+1253,270059847,"utils/train_utils.py",8069,0,"args.",python,content
+1254,270059855,"utils/train_utils.py",8074,0,"",python,selection_command
+1255,270060934,"utils/train_utils.py",8363,0,"",python,selection_command
+1256,270062383,"utils/train_utils.py",7390,0,"",python,selection_command
+1257,270062533,"utils/train_utils.py",6536,0,"",python,selection_command
+1258,270063184,"utils/train_utils.py",6512,0,"",python,selection_command
+1259,270063439,"utils/train_utils.py",6489,0,"",python,selection_command
+1260,270063470,"utils/train_utils.py",6466,0,"",python,selection_command
+1261,270063516,"utils/train_utils.py",6443,0,"",python,selection_command
+1262,270063533,"utils/train_utils.py",6423,0,"",python,selection_command
+1263,270063563,"utils/train_utils.py",6403,0,"",python,selection_command
+1264,270063597,"utils/train_utils.py",6394,0,"",python,selection_command
+1265,270063631,"utils/train_utils.py",6350,0,"",python,selection_command
+1266,270063664,"utils/train_utils.py",6318,0,"",python,selection_command
+1267,270063698,"utils/train_utils.py",6309,0,"",python,selection_command
+1268,270063730,"utils/train_utils.py",6263,0,"",python,selection_command
+1269,270063764,"utils/train_utils.py",6176,0,"",python,selection_command
+1270,270063797,"utils/train_utils.py",6117,0,"",python,selection_command
+1271,270063830,"utils/train_utils.py",6045,0,"",python,selection_command
+1272,270063864,"utils/train_utils.py",6036,0,"",python,selection_command
+1273,270063897,"utils/train_utils.py",5992,0,"",python,selection_command
+1274,270063930,"utils/train_utils.py",5937,0,"",python,selection_command
+1275,270063964,"utils/train_utils.py",5880,0,"",python,selection_command
+1276,270063997,"utils/train_utils.py",5843,0,"",python,selection_command
+1277,270064031,"utils/train_utils.py",5821,0,"",python,selection_command
+1278,270064063,"utils/train_utils.py",5812,0,"",python,selection_command
+1279,270064098,"utils/train_utils.py",5714,0,"",python,selection_command
+1280,270064131,"utils/train_utils.py",5620,0,"",python,selection_command
+1281,270065070,"utils/train_utils.py",5714,0,"",python,selection_command
+1282,270065316,"utils/train_utils.py",5812,0,"",python,selection_command
+1283,270065349,"utils/train_utils.py",5821,0,"",python,selection_command
+1284,270065382,"utils/train_utils.py",5843,0,"",python,selection_command
+1285,270065418,"utils/train_utils.py",5880,0,"",python,selection_command
+1286,270065613,"utils/train_utils.py",5937,0,"",python,selection_command
+1287,270065866,"utils/train_utils.py",5992,0,"",python,selection_command
+1288,270065899,"utils/train_utils.py",6036,0,"",python,selection_command
+1289,270065930,"utils/train_utils.py",6045,0,"",python,selection_command
+1290,270065964,"utils/train_utils.py",6117,0,"",python,selection_command
+1291,270065999,"utils/train_utils.py",6176,0,"",python,selection_command
+1292,270066030,"utils/train_utils.py",6263,0,"",python,selection_command
+1293,270066193,"utils/train_utils.py",6309,0,"",python,selection_command
+1294,270066449,"utils/train_utils.py",6318,0,"",python,selection_command
+1295,270066587,"utils/train_utils.py",6309,0,"",python,selection_command
+1296,270066836,"utils/train_utils.py",6263,0,"",python,selection_command
+1297,270066869,"utils/train_utils.py",6176,0,"",python,selection_command
+1298,270066903,"utils/train_utils.py",6117,0,"",python,selection_command
+1299,270066935,"utils/train_utils.py",6045,0,"",python,selection_command
+1300,270066970,"utils/train_utils.py",6036,0,"",python,selection_command
+1301,270067003,"utils/train_utils.py",5992,0,"",python,selection_command
+1302,270067036,"utils/train_utils.py",5937,0,"",python,selection_command
+1303,270067075,"utils/train_utils.py",5880,0,"",python,selection_command
+1304,270067108,"utils/train_utils.py",5843,0,"",python,selection_command
+1305,270067141,"utils/train_utils.py",5821,0,"",python,selection_command
+1306,270067174,"utils/train_utils.py",5812,0,"",python,selection_command
+1307,270067208,"utils/train_utils.py",5714,0,"",python,selection_command
+1308,270067240,"utils/train_utils.py",5620,0,"",python,selection_command
+1309,270067275,"utils/train_utils.py",5585,0,"",python,selection_command
+1310,270067307,"utils/train_utils.py",5576,0,"",python,selection_command
+1311,270067341,"utils/train_utils.py",5557,0,"",python,selection_command
+1312,270067374,"utils/train_utils.py",5538,0,"",python,selection_command
+1313,270067408,"utils/train_utils.py",5519,0,"",python,selection_command
+1314,270067441,"utils/train_utils.py",5500,0,"",python,selection_command
+1315,270067473,"utils/train_utils.py",5468,0,"",python,selection_command
+1316,270067507,"utils/train_utils.py",5442,0,"",python,selection_command
+1317,270067540,"utils/train_utils.py",5433,0,"",python,selection_command
+1318,270067574,"utils/train_utils.py",5411,0,"",python,selection_command
+1319,270067606,"utils/train_utils.py",5383,0,"",python,selection_command
+1320,270067641,"utils/train_utils.py",5374,0,"",python,selection_command
+1321,270067673,"utils/train_utils.py",5361,0,"",python,selection_command
+1322,270067707,"utils/train_utils.py",5327,0,"",python,selection_command
+1323,270067739,"utils/train_utils.py",5293,0,"",python,selection_command
+1324,270067773,"utils/train_utils.py",5215,0,"",python,selection_command
+1325,270067806,"utils/train_utils.py",5206,0,"",python,selection_command
+1326,270067842,"utils/train_utils.py",5181,0,"",python,selection_command
+1327,270067875,"utils/train_utils.py",5151,0,"",python,selection_command
+1328,270067909,"utils/train_utils.py",5120,0,"",python,selection_command
+1329,270067941,"utils/train_utils.py",5094,0,"",python,selection_command
+1330,270067975,"utils/train_utils.py",5085,0,"",python,selection_command
+1331,270068007,"utils/train_utils.py",5039,0,"",python,selection_command
+1332,270068074,"utils/train_utils.py",5085,0,"",python,selection_command
+1333,270068332,"utils/train_utils.py",5094,0,"",python,selection_command
+1334,270068361,"utils/train_utils.py",5120,0,"",python,selection_command
+1335,270068391,"utils/train_utils.py",5151,0,"",python,selection_command
+1336,270068425,"utils/train_utils.py",5181,0,"",python,selection_command
+1337,270068458,"utils/train_utils.py",5206,0,"",python,selection_command
+1338,270068491,"utils/train_utils.py",5215,0,"",python,selection_command
+1339,270068526,"utils/train_utils.py",5293,0,"",python,selection_command
+1340,270068558,"utils/train_utils.py",5327,0,"",python,selection_command
+1341,270068590,"utils/train_utils.py",5361,0,"",python,selection_command
+1342,270068625,"utils/train_utils.py",5374,0,"",python,selection_command
+1343,270068867,"utils/train_utils.py",5383,0,"",python,selection_command
+1344,270069011,"utils/train_utils.py",5411,0,"",python,selection_command
+1345,270069273,"utils/train_utils.py",5383,0,"",python,selection_command
+1346,270071520,"utils/train_utils.py",5375,28,"",python,content
+1347,270071554,"utils/train_utils.py",5379,0,"",python,selection_command
+1348,270072823,"utils/train_utils.py",5379,5,"",python,content
+1349,270072900,"utils/train_utils.py",5380,0,"",python,selection_command
+1350,270073666,"utils/train_utils.py",5379,0,"D_ffn",python,content
+1351,270073676,"utils/train_utils.py",5379,0,"",python,selection_command
+1352,270073987,"utils/train_utils.py",5379,0,"M = int(args.model_dim)\n ",python,content
+1353,270074007,"utils/train_utils.py",5383,0,"",python,selection_command
+1354,270074954,"utils/train_utils.py",5375,28,"",python,content
+1355,270074980,"utils/train_utils.py",5379,0,"",python,selection_command
+1356,270076087,"utils/train_utils.py",5506,0,"",python,selection_command
+1357,270085301,"utils/train_utils.py",5502,19,"",python,content
+1358,270085338,"utils/train_utils.py",5506,0,"",python,selection_command
+1359,270088053,"utils/train_utils.py",5872,0,"",python,selection_command
+1360,270091207,"utils/train_utils.py",5881,0,"dim",python,content
+1361,270091207,"utils/train_utils.py",5880,1,"",python,content
+1362,270091207,"utils/train_utils.py",5879,0,"args.model",python,content
+1363,270091207,"utils/train_utils.py",5878,1,"",python,content
+1364,270091207,"utils/train_utils.py",5875,0,"dim",python,content
+1365,270091208,"utils/train_utils.py",5874,1,"",python,content
+1366,270091208,"utils/train_utils.py",5873,0,"args.model",python,content
+1367,270091208,"utils/train_utils.py",5872,1,"",python,content
+1368,270092803,"utils/train_utils.py",6010,0,"dim",python,content
+1369,270092803,"utils/train_utils.py",6009,1,"",python,content
+1370,270092803,"utils/train_utils.py",6008,0,"args.model",python,content
+1371,270092803,"utils/train_utils.py",6007,1,"",python,content
+1372,270092803,"utils/train_utils.py",6004,0,"dim",python,content
+1373,270092803,"utils/train_utils.py",6003,1,"",python,content
+1374,270092803,"utils/train_utils.py",6002,0,"args.model",python,content
+1375,270092803,"utils/train_utils.py",6001,1,"",python,content
+1376,270095364,"utils/train_utils.py",5958,0,"dim",python,content
+1377,270095365,"utils/train_utils.py",5957,1,"",python,content
+1378,270095365,"utils/train_utils.py",5956,0,"args.model",python,content
+1379,270095365,"utils/train_utils.py",5955,1,"",python,content
+1380,270096281,"utils/train_utils.py",6175,0,"dim",python,content
+1381,270096281,"utils/train_utils.py",6174,1,"",python,content
+1382,270096281,"utils/train_utils.py",6173,0,"args.model",python,content
+1383,270096281,"utils/train_utils.py",6172,1,"",python,content
+1384,270096281,"utils/train_utils.py",6169,0,"dim",python,content
+1385,270096282,"utils/train_utils.py",6168,1,"",python,content
+1386,270096282,"utils/train_utils.py",6167,0,"args.model",python,content
+1387,270096282,"utils/train_utils.py",6166,1,"",python,content
+1388,270097905,"utils/train_utils.py",5951,0,"",python,selection_command
+1389,270098151,"utils/train_utils.py",6017,0,"",python,selection_command
+1390,270098182,"utils/train_utils.py",6044,0,"",python,selection_command
+1391,270098217,"utils/train_utils.py",6092,0,"",python,selection_command
+1392,270098249,"utils/train_utils.py",6164,0,"",python,selection_command
+1393,270098285,"utils/train_utils.py",6245,0,"",python,selection_command
+1394,270099896,"utils/train_utils.py",6338,0,"dim",python,content
+1395,270099896,"utils/train_utils.py",6337,1,"",python,content
+1396,270099896,"utils/train_utils.py",6336,0,"args.model",python,content
+1397,270099896,"utils/train_utils.py",6335,1,"",python,content
+1398,270099896,"utils/train_utils.py",6332,0,"dim",python,content
+1399,270099896,"utils/train_utils.py",6331,1,"",python,content
+1400,270099896,"utils/train_utils.py",6330,0,"args.model",python,content
+1401,270099896,"utils/train_utils.py",6329,1,"",python,content
+1402,270099896,"utils/train_utils.py",6253,0,"dim",python,content
+1403,270099896,"utils/train_utils.py",6252,1,"",python,content
+1404,270099896,"utils/train_utils.py",6251,0,"args.model",python,content
+1405,270099896,"utils/train_utils.py",6250,1,"",python,content
+1406,270104705,"utils/train_utils.py",6343,0,"",python,selection_command
+1407,270104831,"utils/train_utils.py",6372,0,"",python,selection_command
+1408,270105214,"utils/train_utils.py",6403,0,"",python,selection_command
+1409,270105383,"utils/train_utils.py",6452,0,"",python,selection_command
+1410,270106459,"utils/train_utils.py",6446,0,"dim",python,content
+1411,270106459,"utils/train_utils.py",6445,1,"",python,content
+1412,270106459,"utils/train_utils.py",6444,0,"args.model",python,content
+1413,270106459,"utils/train_utils.py",6443,1,"",python,content
+1414,270107268,"utils/train_utils.py",7186,0,"",python,selection_command
+1415,270107950,"utils/train_utils.py",7095,0,"",python,selection_command
+1416,270108198,"utils/train_utils.py",7090,0,"",python,selection_command
+1417,270108225,"utils/train_utils.py",7035,0,"",python,selection_command
+1418,270108261,"utils/train_utils.py",6984,0,"",python,selection_command
+1419,270108299,"utils/train_utils.py",6925,0,"",python,selection_command
+1420,270108462,"utils/train_utils.py",6874,0,"",python,selection_command
+1421,270108771,"utils/train_utils.py",6925,0,"",python,selection_command
+1422,270109336,"utils/train_utils.py",7084,0,"dim",python,content
+1423,270109336,"utils/train_utils.py",7083,1,"",python,content
+1424,270109336,"utils/train_utils.py",7082,0,"args.model",python,content
+1425,270109336,"utils/train_utils.py",7081,1,"",python,content
+1426,270109336,"utils/train_utils.py",7078,0,"dim",python,content
+1427,270109336,"utils/train_utils.py",7077,1,"",python,content
+1428,270109336,"utils/train_utils.py",7076,0,"args.model",python,content
+1429,270109336,"utils/train_utils.py",7075,1,"",python,content
+1430,270109336,"utils/train_utils.py",6974,0,"dim",python,content
+1431,270109336,"utils/train_utils.py",6973,1,"",python,content
+1432,270109336,"utils/train_utils.py",6972,0,"args.model",python,content
+1433,270109336,"utils/train_utils.py",6971,1,"",python,content
+1434,270109336,"utils/train_utils.py",6968,0,"dim",python,content
+1435,270109336,"utils/train_utils.py",6967,1,"",python,content
+1436,270109336,"utils/train_utils.py",6966,0,"args.model",python,content
+1437,270109336,"utils/train_utils.py",6965,1,"",python,content
+1438,270110517,"utils/train_utils.py",7991,0,"",python,selection_command
+1439,270111431,"utils/train_utils.py",8481,0,"",python,selection_command
+1440,270112635,"utils/train_utils.py",7328,0,"",python,selection_command
+1441,270112964,"utils/train_utils.py",6517,0,"",python,selection_command
+1442,270113287,"utils/train_utils.py",5506,0,"",python,selection_command
+1443,270113987,"utils/train_utils.py",5487,0,"",python,selection_command
+1444,270114318,"utils/train_utils.py",5506,0,"",python,selection_command
+1445,270114569,"utils/train_utils.py",5529,0,"",python,selection_command
+1446,270114595,"utils/train_utils.py",5534,0,"",python,selection_command
+1447,270114625,"utils/train_utils.py",5569,0,"",python,selection_command
+1448,270114660,"utils/train_utils.py",5663,0,"",python,selection_command
+1449,270114697,"utils/train_utils.py",5765,0,"",python,selection_command
+1450,270114733,"utils/train_utils.py",5770,0,"",python,selection_command
+1451,270114760,"utils/train_utils.py",5792,0,"",python,selection_command
+1452,270114794,"utils/train_utils.py",5829,0,"",python,selection_command
+1453,270114830,"utils/train_utils.py",5908,0,"",python,selection_command
+1454,270114864,"utils/train_utils.py",5974,0,"",python,selection_command
+1455,270114897,"utils/train_utils.py",6044,0,"",python,selection_command
+1456,270114931,"utils/train_utils.py",6049,0,"",python,selection_command
+1457,270114965,"utils/train_utils.py",6121,0,"",python,selection_command
+1458,270115105,"utils/train_utils.py",6049,0,"",python,selection_command
+1459,270115364,"utils/train_utils.py",6044,0,"",python,selection_command
+1460,270115398,"utils/train_utils.py",5974,0,"",python,selection_command
+1461,270115425,"utils/train_utils.py",5908,0,"",python,selection_command
+1462,270115459,"utils/train_utils.py",5829,0,"",python,selection_command
+1463,270115491,"utils/train_utils.py",5792,0,"",python,selection_command
+1464,270115597,"utils/train_utils.py",5829,0,"",python,selection_command
+1465,270115840,"utils/train_utils.py",5908,0,"",python,selection_command
+1466,270115870,"utils/train_utils.py",5974,0,"",python,selection_command
+1467,270115906,"utils/train_utils.py",6044,0,"",python,selection_command
+1468,270115941,"utils/train_utils.py",6049,0,"",python,selection_command
+1469,270115975,"utils/train_utils.py",6121,0,"",python,selection_command
+1470,270116009,"utils/train_utils.py",6202,0,"",python,selection_command
+1471,270116042,"utils/train_utils.py",6300,0,"",python,selection_command
+1472,270116074,"utils/train_utils.py",6372,0,"",python,selection_command
+1473,270116109,"utils/train_utils.py",6377,0,"",python,selection_command
+1474,270116133,"utils/train_utils.py",6372,0,"",python,selection_command
+1475,270116383,"utils/train_utils.py",6300,0,"",python,selection_command
+1476,270116416,"utils/train_utils.py",6202,0,"",python,selection_command
+1477,270116449,"utils/train_utils.py",6121,0,"",python,selection_command
+1478,270116484,"utils/train_utils.py",6049,0,"",python,selection_command
+1479,270116516,"utils/train_utils.py",6044,0,"",python,selection_command
+1480,270116551,"utils/train_utils.py",5974,0,"",python,selection_command
+1481,270116585,"utils/train_utils.py",5908,0,"",python,selection_command
+1482,270116617,"utils/train_utils.py",5829,0,"",python,selection_command
+1483,270116652,"utils/train_utils.py",5792,0,"",python,selection_command
+1484,270116685,"utils/train_utils.py",5770,0,"",python,selection_command
+1485,270116721,"utils/train_utils.py",5765,0,"",python,selection_command
+1486,270116754,"utils/train_utils.py",5663,0,"",python,selection_command
+1487,270116788,"utils/train_utils.py",5569,0,"",python,selection_command
+1488,270116819,"utils/train_utils.py",5534,0,"",python,selection_command
+1489,270116854,"utils/train_utils.py",5529,0,"",python,selection_command
+1490,270116887,"utils/train_utils.py",5506,0,"",python,selection_command
+1491,270117020,"utils/train_utils.py",5487,0,"",python,selection_command
+1492,270117149,"utils/train_utils.py",5468,0,"",python,selection_command
+1493,270118107,"utils/train_utils.py",5472,0,"",python,selection_command
+1494,270118265,"utils/train_utils.py",5474,0,"",python,selection_command
+1495,270118467,"utils/train_utils.py",5479,0,"",python,selection_command
+1496,270118665,"utils/train_utils.py",5480,0,"",python,selection_command
+1497,270119212,"utils/train_utils.py",5090,0,"",python,selection_command
+1498,270121766,"utils/train_utils.py",5116,0,"",python,selection_command
+1499,270122016,"utils/train_utils.py",5147,0,"",python,selection_command
+1500,270122049,"utils/train_utils.py",5177,0,"",python,selection_command
+1501,270122081,"utils/train_utils.py",5206,0,"",python,selection_command
+1502,270122112,"utils/train_utils.py",5211,0,"",python,selection_command
+1503,270122147,"utils/train_utils.py",5289,0,"",python,selection_command
+1504,270122181,"utils/train_utils.py",5323,0,"",python,selection_command
+1505,270122214,"utils/train_utils.py",5357,0,"",python,selection_command
+1506,270122247,"utils/train_utils.py",5374,0,"",python,selection_command
+1507,270122282,"utils/train_utils.py",5379,0,"",python,selection_command
+1508,270122316,"utils/train_utils.py",5405,0,"",python,selection_command
+1509,270122348,"utils/train_utils.py",5410,0,"",python,selection_command
+1510,270122383,"utils/train_utils.py",5436,0,"",python,selection_command
+1511,270122415,"utils/train_utils.py",5468,0,"",python,selection_command
+1512,270122450,"utils/train_utils.py",5487,0,"",python,selection_command
+1513,270122764,"utils/train_utils.py",5468,0,"",python,selection_command
+1514,270123199,"utils/train_utils.py",5464,19,"",python,content
+1515,270123235,"utils/train_utils.py",5468,0,"",python,selection_command
+1516,270125716,"utils/train_utils.py",5571,0,"",python,selection_command
+1517,270126727,"utils/train_utils.py",5572,2,"",python,content
+1518,270127853,"utils/train_utils.py",5671,2,"",python,content
+1519,270133923,"utils/train_utils.py",5515,0,"",python,selection_keyboard
+1520,270134355,"utils/train_utils.py",5550,0,"",python,selection_command
+1521,270134602,"utils/train_utils.py",5642,0,"",python,selection_command
+1522,270134634,"utils/train_utils.py",5742,0,"",python,selection_command
+1523,270134668,"utils/train_utils.py",5747,0,"",python,selection_command
+1524,270134704,"utils/train_utils.py",5769,0,"",python,selection_command
+1525,270134735,"utils/train_utils.py",5806,0,"",python,selection_command
+1526,270135048,"utils/train_utils.py",5885,0,"",python,selection_command
+1527,270135705,"utils/train_utils.py",5912,2,"",python,content
+1528,270136657,"utils/train_utils.py",5949,0,"",python,selection_command
+1529,270136904,"utils/train_utils.py",6019,0,"",python,selection_command
+1530,270136934,"utils/train_utils.py",6024,0,"",python,selection_command
+1531,270136969,"utils/train_utils.py",6096,0,"",python,selection_command
+1532,270137001,"utils/train_utils.py",6177,0,"",python,selection_command
+1533,270137035,"utils/train_utils.py",6275,0,"",python,selection_command
+1534,270137400,"utils/train_utils.py",6177,0,"",python,selection_command
+1535,270137800,"utils/train_utils.py",6219,2,"",python,content
+1536,270137800,"utils/train_utils.py",6213,2,"",python,content
+1537,270138415,"utils/train_utils.py",6271,0,"",python,selection_command
+1538,270138669,"utils/train_utils.py",6343,0,"",python,selection_command
+1539,270138700,"utils/train_utils.py",6348,0,"",python,selection_command
+1540,270138734,"utils/train_utils.py",6380,0,"",python,selection_command
+1541,270138952,"utils/train_utils.py",6439,0,"",python,selection_command
+1542,270139178,"utils/train_utils.py",6380,0,"",python,selection_command
+1543,270139538,"utils/train_utils.py",6403,2,"",python,content
+1544,270140393,"utils/train_utils.py",7199,0,"",python,selection_command
+1545,270141255,"utils/train_utils.py",7108,0,"",python,selection_command
+1546,270141498,"utils/train_utils.py",7103,0,"",python,selection_command
+1547,270141531,"utils/train_utils.py",7026,0,"",python,selection_command
+1548,270141565,"utils/train_utils.py",6975,0,"",python,selection_command
+1549,270141597,"utils/train_utils.py",6894,0,"",python,selection_command
+1550,270142372,"utils/train_utils.py",6918,2,"",python,content
+1551,270142822,"utils/train_utils.py",7048,2,"",python,content
+1552,270143967,"utils/train_utils.py",6973,0,"",python,selection_command
+1553,270144215,"utils/train_utils.py",7024,0,"",python,selection_command
+1554,270144247,"utils/train_utils.py",7099,0,"",python,selection_command
+1555,270144282,"utils/train_utils.py",7104,0,"",python,selection_command
+1556,270144315,"utils/train_utils.py",7195,0,"",python,selection_command
+1557,270144348,"utils/train_utils.py",7227,0,"",python,selection_command
+1558,270145137,"utils/train_utils.py",7250,2,"",python,content
+1559,270148705,"utils/train_utils.py",7250,0,"_f",python,content
+1560,270148733,"utils/train_utils.py",7048,0,"_f",python,content
+1561,270148743,"utils/train_utils.py",6918,0,"_f",python,content
+1562,270148756,"utils/train_utils.py",6894,0,"",python,selection_command
+1563,270149947,"utils/train_utils.py",6918,2,"",python,content
+1564,270149966,"utils/train_utils.py",7048,2,"",python,content
+1565,270149974,"utils/train_utils.py",7250,2,"",python,content
+1566,270150928,"utils/train_utils.py",7954,0,"",python,selection_command
+1567,270151387,"utils/train_utils.py",8444,0,"",python,selection_command
+1568,270151728,"utils/train_utils.py",7471,0,"",python,selection_command
+1569,270151899,"utils/train_utils.py",6579,0,"",python,selection_command
+1570,270152354,"utils/train_utils.py",5642,0,"",python,selection_command
+1571,270153018,"utils/train_utils.py",5550,0,"",python,selection_command
+1572,270153264,"utils/train_utils.py",5515,0,"",python,selection_command
+1573,270153298,"utils/train_utils.py",5510,0,"",python,selection_command
+1574,270153331,"utils/train_utils.py",5487,0,"",python,selection_command
+1575,270153366,"utils/train_utils.py",5468,0,"",python,selection_command
+1576,270153406,"utils/train_utils.py",5436,0,"",python,selection_command
+1577,270153432,"utils/train_utils.py",5410,0,"",python,selection_command
+1578,270153464,"utils/train_utils.py",5405,0,"",python,selection_command
+1579,270153499,"utils/train_utils.py",5379,0,"",python,selection_command
+1580,270153532,"utils/train_utils.py",5374,0,"",python,selection_command
+1581,270153564,"utils/train_utils.py",5357,0,"",python,selection_command
+1582,270153598,"utils/train_utils.py",5323,0,"",python,selection_command
+1583,270153631,"utils/train_utils.py",5289,0,"",python,selection_command
+1584,270153665,"utils/train_utils.py",5211,0,"",python,selection_command
+1585,270153744,"utils/train_utils.py",5289,0,"",python,selection_command
+1586,270154362,"utils/train_utils.py",5211,0,"",python,selection_command
+1587,270154616,"utils/train_utils.py",5206,0,"",python,selection_command
+1588,270154642,"utils/train_utils.py",5177,0,"",python,selection_command
+1589,270154676,"utils/train_utils.py",5147,0,"",python,selection_command
+1590,270154709,"utils/train_utils.py",5116,0,"",python,selection_command
+1591,270154846,"utils/train_utils.py",5147,0,"",python,selection_command
+1592,270155095,"utils/train_utils.py",5177,0,"",python,selection_command
+1593,270155132,"utils/train_utils.py",5206,0,"",python,selection_command
+1594,270155159,"utils/train_utils.py",5211,0,"",python,selection_command
+1595,270155195,"utils/train_utils.py",5289,0,"",python,selection_command
+1596,270155225,"utils/train_utils.py",5323,0,"",python,selection_command
+1597,270155258,"utils/train_utils.py",5357,0,"",python,selection_command
+1598,270155293,"utils/train_utils.py",5374,0,"",python,selection_command
+1599,270155952,"utils/train_utils.py",5379,0,"",python,selection_command
+1600,270156195,"utils/train_utils.py",5405,0,"",python,selection_command
+1601,270156285,"utils/train_utils.py",5410,0,"",python,selection_command
+1602,270156468,"utils/train_utils.py",5436,0,"",python,selection_command
+1603,270156603,"utils/train_utils.py",5468,0,"",python,selection_command
+1604,270158026,"utils/train_utils.py",5464,19,"",python,content
+1605,270158065,"utils/train_utils.py",5468,0,"",python,selection_command
+1606,270160275,"utils/train_utils.py",5491,0,"",python,selection_command
+1607,270160526,"utils/train_utils.py",5496,0,"",python,selection_command
+1608,270160646,"utils/train_utils.py",5531,0,"",python,selection_command
+1609,270161296,"utils/train_utils.py",5557,2,"",python,content
+1610,270162153,"utils/train_utils.py",5644,2,"",python,content
+1611,270163451,"utils/train_utils.py",5621,0,"",python,selection_command
+1612,270163699,"utils/train_utils.py",5719,0,"",python,selection_command
+1613,270163727,"utils/train_utils.py",5724,0,"",python,selection_command
+1614,270163759,"utils/train_utils.py",5746,0,"",python,selection_command
+1615,270163796,"utils/train_utils.py",5783,0,"",python,selection_command
+1616,270163825,"utils/train_utils.py",5862,0,"",python,selection_command
+1617,270163861,"utils/train_utils.py",5926,0,"",python,selection_command
+1618,270163892,"utils/train_utils.py",5996,0,"",python,selection_command
+1619,270163926,"utils/train_utils.py",6001,0,"",python,selection_command
+1620,270164076,"utils/train_utils.py",5996,0,"",python,selection_command
+1621,270164239,"utils/train_utils.py",5926,0,"",python,selection_command
+1622,270164574,"utils/train_utils.py",5862,0,"",python,selection_command
+1623,270165066,"utils/train_utils.py",5901,2,"",python,content
+1624,270165066,"utils/train_utils.py",5895,2,"",python,content
+1625,270165437,"utils/train_utils.py",5922,0,"",python,selection_command
+1626,270165688,"utils/train_utils.py",5992,0,"",python,selection_command
+1627,270165723,"utils/train_utils.py",5997,0,"",python,selection_command
+1628,270165758,"utils/train_utils.py",6069,0,"",python,selection_command
+1629,270165792,"utils/train_utils.py",6150,0,"",python,selection_command
+1630,270166150,"utils/train_utils.py",6244,0,"",python,selection_command
+1631,270166599,"utils/train_utils.py",6150,0,"",python,selection_command
+1632,270166904,"utils/train_utils.py",6178,2,"",python,content
+1633,270167503,"utils/train_utils.py",6944,0,"",python,selection_command
+1634,270168252,"utils/train_utils.py",6437,0,"",python,selection_keyboard
+1635,270168640,"utils/train_utils.py",6417,0,"",python,selection_command
+1636,270168795,"utils/train_utils.py",6408,0,"",python,selection_command
+1637,270168992,"utils/train_utils.py",6355,0,"",python,selection_command
+1638,270169389,"utils/train_utils.py",6378,2,"",python,content
+1639,270169815,"utils/train_utils.py",7164,0,"",python,selection_command
+1640,270170433,"utils/train_utils.py",7235,0,"",python,selection_keyboard
+1641,270170749,"utils/train_utils.py",7192,0,"",python,selection_command
+1642,270171001,"utils/train_utils.py",7160,0,"",python,selection_command
+1643,270171033,"utils/train_utils.py",7069,0,"",python,selection_command
+1644,270171066,"utils/train_utils.py",7068,0,"",python,selection_command
+1645,270171098,"utils/train_utils.py",6989,0,"",python,selection_command
+1646,270171313,"utils/train_utils.py",6938,0,"",python,selection_command
+1647,270171585,"utils/train_utils.py",6859,0,"",python,selection_command
+1648,270172254,"utils/train_utils.py",6891,2,"",python,content
+1649,270172699,"utils/train_utils.py",7019,2,"",python,content
+1650,270173402,"utils/train_utils.py",6936,0,"",python,selection_command
+1651,270173650,"utils/train_utils.py",6987,0,"",python,selection_command
+1652,270173676,"utils/train_utils.py",7064,0,"",python,selection_command
+1653,270173711,"utils/train_utils.py",7065,0,"",python,selection_command
+1654,270173748,"utils/train_utils.py",7156,0,"",python,selection_command
+1655,270174051,"utils/train_utils.py",7188,0,"",python,selection_command
+1656,270174408,"utils/train_utils.py",7219,2,"",python,content
+1657,270175835,"utils/train_utils.py",8368,0,"",python,selection_command
+1658,270175980,"utils/train_utils.py",8407,0,"",python,selection_command
+1659,270176781,"utils/train_utils.py",7254,0,"",python,selection_command
+1660,270177113,"utils/train_utils.py",8407,0,"",python,selection_command
+1661,270178521,"utils/train_utils.py",7254,0,"",python,selection_command
+1662,270178795,"utils/train_utils.py",6455,0,"",python,selection_command
+1663,270180668,"utils/train_utils.py",4990,0,"",python,selection_command
+1664,270185126,"utils/train_utils.py",5035,0,"",python,selection_command
+1665,270185382,"utils/train_utils.py",5085,0,"",python,selection_command
+1666,270185410,"utils/train_utils.py",5090,0,"",python,selection_command
+1667,270185444,"utils/train_utils.py",5116,0,"",python,selection_command
+1668,270185475,"utils/train_utils.py",5147,0,"",python,selection_command
+1669,270185509,"utils/train_utils.py",5177,0,"",python,selection_command
+1670,270185542,"utils/train_utils.py",5206,0,"",python,selection_command
+1671,270185576,"utils/train_utils.py",5211,0,"",python,selection_command
+1672,270185610,"utils/train_utils.py",5289,0,"",python,selection_command
+1673,270185641,"utils/train_utils.py",5323,0,"",python,selection_command
+1674,270185676,"utils/train_utils.py",5357,0,"",python,selection_command
+1675,270185709,"utils/train_utils.py",5374,0,"",python,selection_command
+1676,270185743,"utils/train_utils.py",5379,0,"",python,selection_command
+1677,270185776,"utils/train_utils.py",5405,0,"",python,selection_command
+1678,270185952,"utils/train_utils.py",5410,0,"",python,selection_command
+1679,270186128,"utils/train_utils.py",5436,0,"",python,selection_command
+1680,270186864,"utils/train_utils.py",5438,0,"",python,selection_command
+1681,270187220,"utils/train_utils.py",5440,0,"",python,selection_command
+1682,270188177,"utils/train_utils.py",5440,1,"f",python,selection_command
+1683,270188235,"utils/train_utils.py",5440,5,"float",python,selection_command
+1684,270188418,"utils/train_utils.py",5440,6,"float(",python,selection_command
+1685,270192325,"utils/train_utils.py",5445,0,"",python,selection_command
+1686,270192702,"utils/train_utils.py",5446,0,"",python,selection_command
+1687,270196678,"utils/train_utils.py",5432,32,"",python,content
+1688,270196716,"utils/train_utils.py",5436,0,"",python,selection_command
+1689,270200783,"utils/train_utils.py",5516,0,"",python,selection_command
+1690,270201341,"utils/train_utils.py",5607,0,"",python,selection_command
+1691,270202175,"utils/train_utils.py",5516,0,"",python,selection_command
+1692,270202626,"utils/train_utils.py",5608,0,"args.batch_size",python,content
+1693,270202627,"utils/train_utils.py",5607,1,"",python,content
+1694,270202627,"utils/train_utils.py",5517,0,"args.batch_size",python,content
+1695,270202627,"utils/train_utils.py",5516,1,"",python,content
+1696,270203389,"utils/train_utils.py",5880,0,"",python,selection_command
+1697,270204154,"utils/train_utils.py",5881,0,"args.batch_size",python,content
+1698,270204154,"utils/train_utils.py",5880,1,"",python,content
+1699,270204731,"utils/train_utils.py",6183,0,"",python,selection_command
+1700,270205400,"utils/train_utils.py",6184,0,"args.batch_size",python,content
+1701,270205400,"utils/train_utils.py",6183,1,"",python,content
+1702,270205853,"utils/train_utils.py",6247,0,"",python,selection_command
+1703,270207033,"utils/train_utils.py",6393,0,"",python,selection_command
+1704,270207821,"utils/train_utils.py",6394,0,"args.batch_size",python,content
+1705,270207821,"utils/train_utils.py",6393,1,"",python,content
+1706,270208966,"utils/train_utils.py",7198,0,"",python,selection_command
+1707,270209657,"utils/train_utils.py",7248,0,"",python,selection_command
+1708,270210051,"utils/train_utils.py",7048,0,"",python,selection_command
+1709,270210197,"utils/train_utils.py",6920,0,"",python,selection_command
+1710,270210952,"utils/train_utils.py",6921,0,"args.batch_size",python,content
+1711,270210953,"utils/train_utils.py",6920,1,"",python,content
+1712,270211586,"utils/train_utils.py",7063,0,"args.batch_size",python,content
+1713,270211587,"utils/train_utils.py",7062,1,"",python,content
+1714,270211903,"utils/train_utils.py",7277,0,"args.batch_size",python,content
+1715,270211903,"utils/train_utils.py",7276,1,"",python,content
+1716,270212998,"utils/train_utils.py",8158,0,"",python,selection_command
+1717,270213962,"utils/train_utils.py",8487,0,"",python,selection_command
+1718,270215632,"utils/train_utils.py",7334,0,"",python,selection_command
+1719,270215879,"utils/train_utils.py",6493,0,"",python,selection_command
+1720,270216533,"utils/train_utils.py",5464,0,"",python,selection_keyboard
+1721,270216865,"utils/train_utils.py",5459,0,"",python,selection_command
+1722,270217130,"utils/train_utils.py",5436,0,"",python,selection_command
+1723,270217145,"utils/train_utils.py",5410,0,"",python,selection_command
+1724,270217179,"utils/train_utils.py",5405,0,"",python,selection_command
+1725,270217210,"utils/train_utils.py",5379,0,"",python,selection_command
+1726,270217244,"utils/train_utils.py",5374,0,"",python,selection_command
+1727,270217276,"utils/train_utils.py",5357,0,"",python,selection_command
+1728,270217310,"utils/train_utils.py",5323,0,"",python,selection_command
+1729,270217345,"utils/train_utils.py",5289,0,"",python,selection_command
+1730,270217499,"utils/train_utils.py",5211,0,"",python,selection_command
+1731,270217637,"utils/train_utils.py",5206,0,"",python,selection_command
+1732,270217783,"utils/train_utils.py",5177,0,"",python,selection_command
+1733,270218246,"utils/train_utils.py",5206,0,"",python,selection_command
+1734,270218495,"utils/train_utils.py",5211,0,"",python,selection_command
+1735,270218531,"utils/train_utils.py",5289,0,"",python,selection_command
+1736,270218564,"utils/train_utils.py",5323,0,"",python,selection_command
+1737,270218598,"utils/train_utils.py",5357,0,"",python,selection_command
+1738,270218632,"utils/train_utils.py",5374,0,"",python,selection_command
+1739,270218665,"utils/train_utils.py",5379,0,"",python,selection_command
+1740,270219216,"utils/train_utils.py",5405,0,"",python,selection_command
+1741,270219428,"utils/train_utils.py",5410,0,"",python,selection_command
+1742,270219962,"utils/train_utils.py",5405,0,"",python,selection_command
+1743,270220169,"utils/train_utils.py",5379,0,"",python,selection_command
+1744,270222363,"utils/train_utils.py",5405,0,"",python,selection_command
+1745,270222534,"utils/train_utils.py",5410,0,"",python,selection_command
+1746,270222856,"utils/train_utils.py",5405,0,"",python,selection_command
+1747,270223107,"utils/train_utils.py",5379,0,"",python,selection_command
+1748,270223140,"utils/train_utils.py",5374,0,"",python,selection_command
+1749,270223173,"utils/train_utils.py",5357,0,"",python,selection_command
+1750,270223206,"utils/train_utils.py",5323,0,"",python,selection_command
+1751,270223241,"utils/train_utils.py",5289,0,"",python,selection_command
+1752,270223371,"utils/train_utils.py",5211,0,"",python,selection_command
+1753,270223631,"utils/train_utils.py",5206,0,"",python,selection_command
+1754,270223659,"utils/train_utils.py",5177,0,"",python,selection_command
+1755,270223693,"utils/train_utils.py",5147,0,"",python,selection_command
+1756,270223725,"utils/train_utils.py",5116,0,"",python,selection_command
+1757,270223759,"utils/train_utils.py",5090,0,"",python,selection_command
+1758,270223902,"utils/train_utils.py",5085,0,"",python,selection_command
+1759,270224114,"utils/train_utils.py",5035,0,"",python,selection_command
+1760,270226519,"utils/train_utils.py",5085,0,"",python,selection_command
+1761,270226766,"utils/train_utils.py",5090,0,"",python,selection_command
+1762,270226794,"utils/train_utils.py",5116,0,"",python,selection_command
+1763,270226828,"utils/train_utils.py",5147,0,"",python,selection_command
+1764,270226861,"utils/train_utils.py",5177,0,"",python,selection_command
+1765,270226905,"utils/train_utils.py",5206,0,"",python,selection_command
+1766,270226929,"utils/train_utils.py",5211,0,"",python,selection_command
+1767,270226965,"utils/train_utils.py",5289,0,"",python,selection_command
+1768,270226998,"utils/train_utils.py",5323,0,"",python,selection_command
+1769,270227031,"utils/train_utils.py",5357,0,"",python,selection_command
+1770,270227064,"utils/train_utils.py",5374,0,"",python,selection_command
+1771,270227099,"utils/train_utils.py",5379,0,"",python,selection_command
+1772,270227132,"utils/train_utils.py",5405,0,"",python,selection_command
+1773,270227165,"utils/train_utils.py",5410,0,"",python,selection_command
+1774,270227198,"utils/train_utils.py",5436,0,"",python,selection_command
+1775,270227385,"utils/train_utils.py",5459,0,"",python,selection_command
+1776,270227666,"utils/train_utils.py",5436,0,"",python,selection_command
+1777,270227858,"utils/train_utils.py",5410,0,"",python,selection_command
+1778,270228183,"utils/train_utils.py",5436,0,"",python,selection_command
+1779,270228613,"utils/train_utils.py",5432,27,"",python,content
+1780,270228811,"utils/train_utils.py",5406,0,"",python,selection_command
+1781,270229153,"utils/train_utils.py",5406,26,"",python,content
+1782,270230250,"utils/train_utils.py",5406,1,"",python,content
+1783,270230281,"utils/train_utils.py",5410,0,"",python,selection_command
+1784,270231039,"utils/train_utils.py",5405,0,"",python,selection_command
+1785,270231203,"utils/train_utils.py",5379,0,"",python,selection_command
+1786,270232836,"utils/train_utils.py",5375,30,"",python,content
+1787,270233250,"utils/train_utils.py",5375,1,"",python,content
+1788,270233271,"utils/train_utils.py",5379,0,"",python,selection_command
+1789,270237407,"utils/train_utils.py",6351,0,"",python,selection_command
+1790,270239607,"utils/train_utils.py",6358,0,"dim",python,content
+1791,270239607,"utils/train_utils.py",6357,1,"",python,content
+1792,270239607,"utils/train_utils.py",6353,0,"args.",python,content
+1793,270239607,"utils/train_utils.py",6351,2,"",python,content
+1794,270240266,"utils/train_utils.py",7146,0,"",python,selection_command
+1795,270243135,"utils/train_utils.py",8368,0,"",python,selection_command
+1796,270243315,"utils/train_utils.py",8407,0,"",python,selection_command
+1797,270246768,"utils/train_utils.py",8402,0,"",python,selection_command
+1798,270246878,"utils/train_utils.py",8400,0,"",python,selection_command
+1799,270248206,"utils/train_utils.py",7319,0,"",python,selection_command
+1800,270248622,"utils/train_utils.py",6459,0,"",python,selection_command
+1801,270249081,"utils/train_utils.py",5379,0,"",python,selection_command
+1802,270276573,"utils/train_utils.py",5414,0,"",python,selection_command
+1803,270276735,"utils/train_utils.py",5429,0,"",python,selection_command
+1804,270276987,"utils/train_utils.py",5431,0,"",python,selection_command
+1805,270277018,"utils/train_utils.py",5435,0,"",python,selection_command
+1806,270277052,"utils/train_utils.py",5436,0,"",python,selection_command
+1807,270277087,"utils/train_utils.py",5447,0,"",python,selection_command
+1808,270277213,"utils/train_utils.py",5449,0,"",python,selection_command
+1809,270277520,"utils/train_utils.py",5451,0,"",python,selection_command
+1810,270277748,"utils/train_utils.py",5453,0,"",python,selection_command
+1811,270308321,"utils/train_utils.py",5451,0,"",python,selection_command
+1812,270308573,"utils/train_utils.py",5449,0,"",python,selection_command
+1813,270308603,"utils/train_utils.py",5447,0,"",python,selection_command
+1814,270308637,"utils/train_utils.py",5436,0,"",python,selection_command
+1815,270308769,"utils/train_utils.py",5435,0,"",python,selection_command
+1816,270308933,"utils/train_utils.py",5431,0,"",python,selection_command
+1817,270309054,"utils/train_utils.py",5429,0,"",python,selection_command
+1818,270309219,"utils/train_utils.py",5414,0,"",python,selection_command
+1819,270310929,"utils/train_utils.py",5414,14,"",python,content
+1820,270311111,"utils/train_utils.py",5414,0,"n",python,content
+1821,270311119,"utils/train_utils.py",5415,0,"",python,selection_keyboard
+1822,270311221,"utils/train_utils.py",5415,0,"u",python,content
+1823,270311228,"utils/train_utils.py",5416,0,"",python,selection_keyboard
+1824,270311370,"utils/train_utils.py",5416,0,"m",python,content
+1825,270311375,"utils/train_utils.py",5417,0,"",python,selection_keyboard
+1826,270311601,"utils/train_utils.py",5417,0,"_",python,content
+1827,270311608,"utils/train_utils.py",5418,0,"",python,selection_keyboard
+1828,270311851,"utils/train_utils.py",5418,0,"t",python,content
+1829,270311860,"utils/train_utils.py",5419,0,"",python,selection_keyboard
+1830,270311874,"utils/train_utils.py",5419,0,"o",python,content
+1831,270311884,"utils/train_utils.py",5420,0,"",python,selection_keyboard
+1832,270311965,"utils/train_utils.py",5420,0,"k",python,content
+1833,270311973,"utils/train_utils.py",5421,0,"",python,selection_keyboard
+1834,270312085,"utils/train_utils.py",5421,0,"e",python,content
+1835,270312092,"utils/train_utils.py",5422,0,"",python,selection_keyboard
+1836,270312187,"utils/train_utils.py",5422,0,"n",python,content
+1837,270312194,"utils/train_utils.py",5423,0,"",python,selection_keyboard
+1838,270312243,"utils/train_utils.py",5423,0,"s",python,content
+1839,270312251,"utils/train_utils.py",5424,0,"",python,selection_keyboard
+1840,270312508,"utils/train_utils.py",5423,0,"",python,selection_command
+1841,270314237,"utils/train_utils.py",5425,0,"",python,selection_command
+1842,270314486,"utils/train_utils.py",5427,0,"",python,selection_command
+1843,270314518,"utils/train_utils.py",5431,0,"",python,selection_command
+1844,270314706,"utils/train_utils.py",5432,0,"",python,selection_command
+1845,270314883,"utils/train_utils.py",5443,0,"",python,selection_command
+1846,270315351,"utils/train_utils.py",5445,0,"",python,selection_command
+1847,270317235,"utils/train_utils.py",5545,0,"",python,selection_command
+1848,270317639,"utils/train_utils.py",5510,116,"",python,content
+1849,270317802,"utils/train_utils.py",5410,0,"",python,selection_command
+1850,270317937,"utils/train_utils.py",5414,0,"",python,selection_command
+1851,270318096,"utils/train_utils.py",5425,0,"",python,selection_command
+1852,270318245,"utils/train_utils.py",5427,0,"",python,selection_command
+1853,270318395,"utils/train_utils.py",5431,0,"",python,selection_command
+1854,270318532,"utils/train_utils.py",5432,0,"",python,selection_command
+1855,270318700,"utils/train_utils.py",5443,0,"",python,selection_command
+1856,270318853,"utils/train_utils.py",5445,0,"",python,selection_command
+1857,270319192,"utils/train_utils.py",5446,0,"",python,selection_command
+1858,270319332,"utils/train_utils.py",5447,0,"",python,selection_command
+1859,270319489,"utils/train_utils.py",5448,0,"",python,selection_command
+1860,270319635,"utils/train_utils.py",5449,0,"",python,selection_command
+1861,270319819,"utils/train_utils.py",5450,0,"",python,selection_command
+1862,270320079,"utils/train_utils.py",5450,59,"",python,content
+1863,270320094,"utils/train_utils.py",5449,0,"",python,selection_command
+1864,270320856,"utils/train_utils.py",5451,0,"",python,selection_command
+1865,270321012,"utils/train_utils.py",5472,0,"",python,selection_command
+1866,270321149,"utils/train_utils.py",5509,0,"",python,selection_command
+1867,270321281,"utils/train_utils.py",5550,0,"",python,selection_command
+1868,270322008,"utils/train_utils.py",5547,8,"",python,content
+1869,270322008,"utils/train_utils.py",5541,0,"num_",python,content
+1870,270322747,"utils/train_utils.py",5691,8,"",python,content
+1871,270322747,"utils/train_utils.py",5685,0,"num_",python,content
+1872,270323677,"utils/train_utils.py",5626,0,"",python,selection_command
+1873,270323872,"utils/train_utils.py",5700,0,"",python,selection_command
+1874,270324017,"utils/train_utils.py",5730,0,"",python,selection_command
+1875,270324165,"utils/train_utils.py",5771,0,"",python,selection_command
+1876,270324299,"utils/train_utils.py",5843,0,"",python,selection_command
+1877,270324901,"utils/train_utils.py",5840,9,"",python,content
+1878,270324902,"utils/train_utils.py",5834,0,"num_",python,content
+1879,270325412,"utils/train_utils.py",5920,0,"",python,selection_command
+1880,270325708,"utils/train_utils.py",6026,0,"",python,selection_command
+1881,270325961,"utils/train_utils.py",6017,9,"",python,content
+1882,270325962,"utils/train_utils.py",6011,0,"num_",python,content
+1883,270326808,"utils/train_utils.py",6766,0,"",python,selection_command
+1884,270327351,"utils/train_utils.py",7960,0,"",python,selection_command
+1885,270327795,"utils/train_utils.py",8210,0,"",python,selection_command
+1886,270328317,"utils/train_utils.py",7237,0,"",python,selection_command
+1887,270328455,"utils/train_utils.py",6309,0,"",python,selection_command
+1888,270328921,"utils/train_utils.py",5414,0,"",python,selection_command
+1889,270332576,"utils/train_utils.py",5451,0,"",python,selection_command
+1890,270339644,"utils/train_utils.py",5456,0,"",python,selection_command
+1891,270341438,"utils/train_utils.py",4747,0,"",python,selection_command
+1892,270364392,"utils/train_utils.py",4748,0,"",python,selection_command
+1893,270364637,"utils/train_utils.py",4797,0,"",python,selection_command
+1894,270364668,"utils/train_utils.py",4844,0,"",python,selection_command
+1895,270364702,"utils/train_utils.py",4845,0,"",python,selection_command
+1896,270364733,"utils/train_utils.py",4978,0,"",python,selection_command
+1897,270364768,"utils/train_utils.py",4986,0,"",python,selection_command
+1898,270364800,"utils/train_utils.py",5031,0,"",python,selection_command
+1899,270364835,"utils/train_utils.py",5085,0,"",python,selection_command
+1900,270364868,"utils/train_utils.py",5086,0,"",python,selection_command
+1901,270364902,"utils/train_utils.py",5112,0,"",python,selection_command
+1902,270364935,"utils/train_utils.py",5143,0,"",python,selection_command
+1903,270364970,"utils/train_utils.py",5173,0,"",python,selection_command
+1904,270365002,"utils/train_utils.py",5206,0,"",python,selection_command
+1905,270365037,"utils/train_utils.py",5207,0,"",python,selection_command
+1906,270365070,"utils/train_utils.py",5285,0,"",python,selection_command
+1907,270365104,"utils/train_utils.py",5319,0,"",python,selection_command
+1908,270365137,"utils/train_utils.py",5353,0,"",python,selection_command
+1909,270365175,"utils/train_utils.py",5374,0,"",python,selection_command
+1910,270365208,"utils/train_utils.py",5375,0,"",python,selection_command
+1911,270365416,"utils/train_utils.py",5410,0,"",python,selection_command
+1912,270370176,"utils/train_utils.py",5375,0,"",python,selection_command
+1913,270370433,"utils/train_utils.py",5374,0,"",python,selection_command
+1914,270370456,"utils/train_utils.py",5353,0,"",python,selection_command
+1915,270370493,"utils/train_utils.py",5319,0,"",python,selection_command
+1916,270370526,"utils/train_utils.py",5285,0,"",python,selection_command
+1917,270370560,"utils/train_utils.py",5207,0,"",python,selection_command
+1918,270370593,"utils/train_utils.py",5206,0,"",python,selection_command
+1919,270370716,"utils/train_utils.py",5173,0,"",python,selection_command
+1920,270370966,"utils/train_utils.py",5177,0,"",python,selection_command
+1921,270373038,"utils/train_utils.py",5173,33,"",python,content
+1922,270374328,"utils/train_utils.py",5285,0,"_size",python,content
+1923,270374329,"utils/train_utils.py",5280,0,"args.",python,content
+1924,270374329,"utils/train_utils.py",5271,0,"_size",python,content
+1925,270374329,"utils/train_utils.py",5266,0,"args.",python,content
+1926,270374360,"utils/train_utils.py",5305,0,"",python,selection_command
+1927,270375110,"utils/train_utils.py",5339,0,"_size",python,content
+1928,270375110,"utils/train_utils.py",5334,0,"args.",python,content
+1929,270375110,"utils/train_utils.py",5325,0,"_size",python,content
+1930,270375110,"utils/train_utils.py",5320,0,"args.",python,content
+1931,270375141,"utils/train_utils.py",5359,0,"",python,selection_command
+1932,270376086,"utils/train_utils.py",5358,0,"",python,selection_command
+1933,270376558,"utils/train_utils.py",6223,0,"",python,selection_command
+1934,270377331,"utils/train_utils.py",7044,0,"",python,selection_command
+1935,270377914,"utils/train_utils.py",8212,0,"",python,selection_command
+1936,270378081,"utils/train_utils.py",8217,0,"",python,selection_command
+1937,270378624,"utils/train_utils.py",7244,0,"",python,selection_command
+1938,270379572,"utils/train_utils.py",7190,0,"",python,selection_command
+1939,270379819,"utils/train_utils.py",7125,0,"",python,selection_command
+1940,270379850,"utils/train_utils.py",7060,0,"",python,selection_command
+1941,270379885,"utils/train_utils.py",7040,0,"",python,selection_command
+1942,270379918,"utils/train_utils.py",7039,0,"",python,selection_command
+1943,270379953,"utils/train_utils.py",6984,0,"",python,selection_command
+1944,270379986,"utils/train_utils.py",6952,0,"",python,selection_command
+1945,270380021,"utils/train_utils.py",6861,0,"",python,selection_command
+1946,270380056,"utils/train_utils.py",6860,0,"",python,selection_command
+1947,270380088,"utils/train_utils.py",6769,0,"",python,selection_command
+1948,270380121,"utils/train_utils.py",6718,0,"",python,selection_command
+1949,270380156,"utils/train_utils.py",6627,0,"",python,selection_command
+1950,270380191,"utils/train_utils.py",6576,0,"",python,selection_command
+1951,270380227,"utils/train_utils.py",6575,0,"",python,selection_command
+1952,270380259,"utils/train_utils.py",6506,0,"",python,selection_command
+1953,270380509,"utils/train_utils.py",6433,0,"",python,selection_command
+1954,270381141,"utils/train_utils.py",6475,0,"_size",python,content
+1955,270381141,"utils/train_utils.py",6470,0,"args.",python,content
+1956,270381163,"utils/train_utils.py",6485,0,"",python,selection_command
+1957,270382153,"utils/train_utils.py",5671,0,"",python,selection_command
+1958,270382982,"utils/train_utils.py",4844,0,"",python,selection_command
+1959,270383503,"utils/train_utils.py",4845,0,"",python,selection_command
+1960,270383654,"utils/train_utils.py",4978,0,"",python,selection_command
+1961,270383798,"utils/train_utils.py",4986,0,"",python,selection_command
+1962,270437094,"utils/train_utils.py",5031,0,"",python,selection_command
+1963,270437351,"utils/train_utils.py",5085,0,"",python,selection_command
+1964,270437377,"utils/train_utils.py",5086,0,"",python,selection_command
+1965,270437522,"utils/train_utils.py",5112,0,"",python,selection_command
+1966,270437779,"utils/train_utils.py",5143,0,"",python,selection_command
+1967,270437810,"utils/train_utils.py",5173,0,"",python,selection_command
+1968,270437842,"utils/train_utils.py",5174,0,"",python,selection_command
+1969,270437985,"utils/train_utils.py",5252,0,"",python,selection_command
+1970,270438255,"utils/train_utils.py",5174,0,"",python,selection_command
+1971,270439002,"utils/train_utils.py",5252,0,"",python,selection_command
+1972,270464777,"utils/train_utils.py",5256,0,"",python,selection_command
+1973,270465492,"utils/train_utils.py",5372,0,"",python,selection_command
+1974,270465875,"utils/train_utils.py",5256,0,"",python,selection_command
+1975,270466259,"utils/train_utils.py",5372,0,"",python,selection_command
+1976,270466432,"utils/train_utils.py",5256,0,"",python,selection_command
+1977,270468303,"utils/train_utils.py",5372,0,"",python,selection_command
+1978,270468466,"utils/train_utils.py",5256,0,"",python,selection_command
+1979,270468601,"utils/train_utils.py",5372,0,"",python,selection_command
+1980,270468867,"utils/train_utils.py",5256,0,"",python,selection_command
+1981,270478571,"utils/train_utils.py",5310,0,"",python,selection_command
+1982,270478644,"utils/train_utils.py",5364,0,"",python,selection_command
+1983,270484067,"utils/train_utils.py",5310,0,"",python,selection_command
+1984,270484186,"utils/train_utils.py",5256,0,"",python,selection_command
+1985,270484333,"utils/train_utils.py",5178,0,"",python,selection_command
+1986,270484741,"utils/train_utils.py",5180,0,"",python,selection_command
+1987,270484991,"utils/train_utils.py",5187,0,"",python,selection_command
+1988,270485022,"utils/train_utils.py",5190,0,"",python,selection_command
+1989,270485189,"utils/train_utils.py",5198,0,"",python,selection_command
+1990,270485372,"utils/train_utils.py",5202,0,"",python,selection_command
+1991,270485556,"utils/train_utils.py",5208,0,"",python,selection_command
+1992,270486759,"utils/train_utils.py",5174,0,"",python,selection_command
+1993,270490103,"utils/train_utils.py",5178,0,"",python,selection_command
+1994,270490267,"utils/train_utils.py",5180,0,"",python,selection_command
+1995,270490434,"utils/train_utils.py",5187,0,"",python,selection_command
+1996,270490596,"utils/train_utils.py",5190,0,"",python,selection_command
+1997,270490776,"utils/train_utils.py",5198,0,"",python,selection_command
+1998,270491004,"utils/train_utils.py",5202,0,"",python,selection_command
+1999,270491184,"utils/train_utils.py",5208,0,"",python,selection_command
+2000,270491585,"utils/train_utils.py",5207,0,"",python,selection_command
+2001,270491735,"utils/train_utils.py",5207,44,"",python,content
+2002,270491757,"utils/train_utils.py",5206,0,"",python,selection_command
+2003,270492104,"utils/train_utils.py",5174,0,"",python,selection_command
+2004,270499752,"utils/train_utils.py",5208,0,"",python,selection_command
+2005,270500001,"utils/train_utils.py",5262,0,"",python,selection_command
+2006,270500030,"utils/train_utils.py",5316,0,"",python,selection_command
+2007,270500062,"utils/train_utils.py",5337,0,"",python,selection_command
+2008,270500213,"utils/train_utils.py",5338,0,"",python,selection_command
+2009,270500463,"utils/train_utils.py",5373,0,"",python,selection_command
+2010,270500617,"utils/train_utils.py",5414,0,"",python,selection_command
+2011,270500780,"utils/train_utils.py",5415,0,"",python,selection_command
+2012,270501854,"utils/train_utils.py",5437,0,"",python,selection_command
+2013,270502623,"utils/train_utils.py",5474,0,"",python,selection_command
+2014,270503402,"utils/train_utils.py",5437,0,"",python,selection_command
+2015,270609050,"utils/train_utils.py",5563,0,"",python,selection_mouse
+2016,270610731,"utils/train_utils.py",5693,0,"",python,selection_mouse
+2017,270620815,"utils/train_utils.py",5577,0,"",python,selection_mouse
+2018,270622017,"utils/train_utils.py",5592,0,"",python,selection_mouse
+2019,270622834,"utils/train_utils.py",5599,0,"",python,selection_mouse
+2020,270623464,"utils/train_utils.py",5604,0,"",python,selection_mouse
+2021,270624149,"utils/train_utils.py",5612,0,"",python,selection_mouse
+2022,270629723,"utils/train_utils.py",5853,0,"",python,selection_mouse
+2023,270631932,"utils/train_utils.py",5863,0,"",python,selection_mouse
+2024,270632850,"utils/train_utils.py",5979,0,"",python,selection_mouse
+2025,270634281,"utils/train_utils.py",5887,0,"",python,selection_mouse
+2026,270636024,"utils/train_utils.py",5897,0,"",python,selection_mouse
+2027,270638966,"utils/train_utils.py",6018,0,"",python,selection_mouse
+2028,270638979,"utils/train_utils.py",6017,0,"",python,selection_command
+2029,270641216,"utils/train_utils.py",5921,0,"",python,selection_mouse
+2030,270650315,"utils/train_utils.py",5967,0,"",python,selection_mouse
+2031,270651398,"utils/train_utils.py",5968,0,"",python,selection_mouse
+2032,270651915,"utils/train_utils.py",6019,0,"",python,selection_mouse
+2033,271526815,"utils/train_utils.py",6020,0,"",python,selection_command
+2034,272581985,"utils/train_utils.py",5116,0,"",python,selection_command
+2035,272582530,"utils/train_utils.py",5419,0,"",python,selection_keyboard
+2036,272582852,"utils/train_utils.py",5414,0,"",python,selection_command
+2037,272583102,"utils/train_utils.py",5377,0,"",python,selection_command
+2038,272583133,"utils/train_utils.py",5342,0,"",python,selection_command
+2039,272583165,"utils/train_utils.py",5337,0,"",python,selection_command
+2040,272583200,"utils/train_utils.py",5320,0,"",python,selection_command
+2041,272583233,"utils/train_utils.py",5266,0,"",python,selection_command
+2042,272583268,"utils/train_utils.py",5212,0,"",python,selection_command
+2043,272583303,"utils/train_utils.py",5178,0,"",python,selection_command
+2044,272583340,"utils/train_utils.py",5173,0,"",python,selection_command
+2045,272583372,"utils/train_utils.py",5147,0,"",python,selection_command
+2046,272583405,"utils/train_utils.py",5116,0,"",python,selection_command
+2047,272583438,"utils/train_utils.py",5090,0,"",python,selection_command
+2048,272583471,"utils/train_utils.py",5085,0,"",python,selection_command
+2049,272583628,"utils/train_utils.py",5035,0,"",python,selection_command
+2050,272583846,"utils/train_utils.py",5085,0,"",python,selection_command
+2051,272594383,"utils/train_utils.py",5035,0,"",python,selection_command
+2052,272594511,"utils/train_utils.py",4990,0,"",python,selection_command
+2053,272595212,"utils/train_utils.py",5002,0,"",python,selection_command
+2054,272595360,"utils/train_utils.py",5004,0,"",python,selection_command
+2055,272596302,"utils/train_utils.py",5004,1,"m",python,selection_command
+2056,272596352,"utils/train_utils.py",5004,3,"max",python,selection_command
+2057,272596519,"utils/train_utils.py",5004,4,"max(",python,selection_command
+2058,272597197,"utils/train_utils.py",5004,5,"max(1",python,selection_command
+2059,272597483,"utils/train_utils.py",5004,6,"max(1,",python,selection_command
+2060,272598168,"utils/train_utils.py",5004,7,"max(1, ",python,selection_command
+2061,272599952,"utils/train_utils.py",5004,7,"",python,content
+2062,272600749,"utils/train_utils.py",5023,0,"",python,selection_command
+2063,272600927,"utils/train_utils.py",5022,1,"",python,content
+2064,272601056,"utils/train_utils.py",5021,0,"",python,selection_command
+2065,272601339,"utils/train_utils.py",4986,0,"",python,selection_command
+2066,272603963,"utils/train_utils.py",5023,0,"",python,selection_command
+2067,272604214,"utils/train_utils.py",5077,0,"",python,selection_command
+2068,272604241,"utils/train_utils.py",5078,0,"",python,selection_command
+2069,272604273,"utils/train_utils.py",5104,0,"",python,selection_command
+2070,272604307,"utils/train_utils.py",5135,0,"",python,selection_command
+2071,272604343,"utils/train_utils.py",5165,0,"",python,selection_command
+2072,272604375,"utils/train_utils.py",5166,0,"",python,selection_command
+2073,272604411,"utils/train_utils.py",5200,0,"",python,selection_command
+2074,272604444,"utils/train_utils.py",5254,0,"",python,selection_command
+2075,272604477,"utils/train_utils.py",5308,0,"",python,selection_command
+2076,272604510,"utils/train_utils.py",5329,0,"",python,selection_command
+2077,272604545,"utils/train_utils.py",5330,0,"",python,selection_command
+2078,272604577,"utils/train_utils.py",5365,0,"",python,selection_command
+2079,272604611,"utils/train_utils.py",5406,0,"",python,selection_command
+2080,272604644,"utils/train_utils.py",5407,0,"",python,selection_command
+2081,272604677,"utils/train_utils.py",5429,0,"",python,selection_command
+2082,272604876,"utils/train_utils.py",5466,0,"",python,selection_command
+2083,272605030,"utils/train_utils.py",5541,0,"",python,selection_command
+2084,272605380,"utils/train_utils.py",5466,0,"",python,selection_command
+2085,272606471,"utils/train_utils.py",5429,0,"",python,selection_command
+2086,272606727,"utils/train_utils.py",5407,0,"",python,selection_command
+2087,272606755,"utils/train_utils.py",5406,0,"",python,selection_command
+2088,272606789,"utils/train_utils.py",5365,0,"",python,selection_command
+2089,272606822,"utils/train_utils.py",5330,0,"",python,selection_command
+2090,272606855,"utils/train_utils.py",5329,0,"",python,selection_command
+2091,272606888,"utils/train_utils.py",5308,0,"",python,selection_command
+2092,272606922,"utils/train_utils.py",5254,0,"",python,selection_command
+2093,272606955,"utils/train_utils.py",5200,0,"",python,selection_command
+2094,272606988,"utils/train_utils.py",5166,0,"",python,selection_command
+2095,272607022,"utils/train_utils.py",5165,0,"",python,selection_command
+2096,272607056,"utils/train_utils.py",5135,0,"",python,selection_command
+2097,272607089,"utils/train_utils.py",5104,0,"",python,selection_command
+2098,272607196,"utils/train_utils.py",5135,0,"",python,selection_command
+2099,272607448,"utils/train_utils.py",5165,0,"",python,selection_command
+2100,272607482,"utils/train_utils.py",5166,0,"",python,selection_command
+2101,272607513,"utils/train_utils.py",5200,0,"",python,selection_command
+2102,272607545,"utils/train_utils.py",5254,0,"",python,selection_command
+2103,272607580,"utils/train_utils.py",5308,0,"",python,selection_command
+2104,272607613,"utils/train_utils.py",5329,0,"",python,selection_command
+2105,272607650,"utils/train_utils.py",5330,0,"",python,selection_command
+2106,272607681,"utils/train_utils.py",5365,0,"",python,selection_command
+2107,272610998,"utils/train_utils.py",5406,0,"",python,selection_command
+2108,272611161,"utils/train_utils.py",5407,0,"",python,selection_command
+2109,272611310,"utils/train_utils.py",5429,0,"",python,selection_command
+2110,272611449,"utils/train_utils.py",5466,0,"",python,selection_command
+2111,272643746,"utils/nn.py",0,0,"import math\nfrom typing import Tuple, Callable, List\n\nfrom flax import nnx\nimport jax\nimport jax.numpy as jnp\nimport einops\n\n\nclass SpatioTemporalPositionalEncoding(nnx.Module):\n """"""\n Applies separate sinusoidal positional encodings to the temporal and spatial dimensions.\n """"""\n\n def __init__(self, d_model: int, max_len: int = 5000):\n self.d_model = d_model\n self.max_len = max_len\n\n pe = jnp.zeros((self.max_len, self.d_model))\n position = jnp.arange(0, self.max_len, dtype=jnp.float32)[:, None]\n div_term = jnp.exp(\n jnp.arange(0, self.d_model, 2) * (-math.log(10000.0) / self.d_model)\n )\n pe = pe.at[:, 0::2].set(jnp.sin(position * div_term))\n pe = pe.at[:, 1::2].set(jnp.cos(position * div_term))\n self.pe = nnx.Variable(pe)\n\n def __call__(self, x: jax.Array) -> jax.Array:\n """"""\n Args:\n x: The input tensor of shape (Batch, Time, Space, Dimension).\n\n Returns:\n The input tensor with positional encodings added.\n """"""\n assert x.ndim == 4, f""Input must be 4-dimensional, but got shape {x.shape}""\n\n num_timesteps = x.shape[1]\n num_spatial_patches = x.shape[2]\n\n # Temporal positional encoding: (1, T, 1, D)\n temporal_pe = self.pe.value[None, :num_timesteps, None, :]\n x = x + temporal_pe\n\n # Spatial positional encoding: (1, 1, S, D)\n spatial_pe = self.pe.value[None, None, :num_spatial_patches, :]\n x = x + spatial_pe\n\n return x\n\n\nclass STBlock(nnx.Module):\n def __init__(\n self,\n dim: int,\n ffn_dim: int,\n num_heads: int,\n dropout: float,\n param_dtype: jnp.dtype,\n dtype: jnp.dtype,\n use_flash_attention: bool,\n rngs: nnx.Rngs,\n sow_weights: bool,\n sow_activations: bool,\n ):\n self.dim = dim\n self.ffn_dim = ffn_dim\n self.num_heads = num_heads\n self.dropout = dropout\n self.param_dtype = param_dtype\n self.dtype = dtype\n self.use_flash_attention = use_flash_attention\n self.sow_weights = sow_weights\n self.sow_activations = sow_activations\n\n self.spatial_norm = 
nnx.LayerNorm(\n num_features=self.dim,\n param_dtype=self.param_dtype,\n dtype=self.param_dtype, # layer norm in full precision\n rngs=rngs,\n )\n self.spatial_attention = nnx.MultiHeadAttention(\n num_heads=self.num_heads,\n in_features=self.dim,\n qkv_features=self.dim,\n dropout_rate=self.dropout,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n attention_fn=_create_flash_attention_fn(\n self.use_flash_attention, is_causal=False\n ),\n rngs=rngs,\n decode=False,\n )\n\n self.temporal_norm = nnx.LayerNorm(\n num_features=self.dim,\n param_dtype=self.param_dtype,\n dtype=self.param_dtype, # layer norm in full precision\n rngs=rngs,\n )\n self.temporal_attention = nnx.MultiHeadAttention(\n num_heads=self.num_heads,\n in_features=self.dim,\n qkv_features=self.dim,\n dropout_rate=self.dropout,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n attention_fn=_create_flash_attention_fn(\n self.use_flash_attention, is_causal=True\n ),\n rngs=rngs,\n decode=False,\n )\n\n self.ffn_norm = nnx.LayerNorm(\n num_features=self.dim,\n param_dtype=self.param_dtype,\n dtype=self.param_dtype, # layer norm in full precision\n rngs=rngs,\n )\n self.ffn_dense1 = nnx.Linear(\n in_features=self.dim,\n out_features=self.ffn_dim,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n rngs=rngs,\n )\n self.ffn_dense2 = nnx.Linear(\n in_features=self.ffn_dim,\n out_features=self.dim,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n rngs=rngs,\n )\n\n @nnx.remat\n def __call__(self, x_BTNM: jax.Array) -> jax.Array:\n # --- Spatial attention ---\n z_BTNM = self.spatial_norm(x_BTNM)\n z_BTNM = self.spatial_attention(z_BTNM, sow_weights=self.sow_weights)\n x_BTNM = x_BTNM + z_BTNM\n\n # --- Temporal attention ---\n x_BNTM = x_BTNM.swapaxes(1, 2)\n z_BNTM = self.temporal_norm(x_BNTM)\n z_BNTM = self.temporal_attention(z_BNTM, sow_weights=self.sow_weights)\n x_BNTM = x_BNTM + z_BNTM\n x_BTNM = x_BNTM.swapaxes(1, 2)\n\n # --- Feedforward ---\n z_BTNM = self.ffn_norm(x_BTNM)\n z_BTND 
= self.ffn_dense1(z_BTNM)\n z_BTND = jax.nn.gelu(z_BTND)\n z_BTNM = self.ffn_dense2(z_BTND)\n x_BTNM = x_BTNM + z_BTNM\n if self.sow_activations:\n self.sow(nnx.Intermediate, ""activations"", x_BTNM)\n return x_BTNM\n\n\nclass STTransformer(nnx.Module):\n """"""\n Dimension keys:\n B: batch size\n T: number of frames\n N: number of patches per frame\n I: number of input features\n M: model dimension\n D: FFN dimension\n V: vocabulary size\n """"""\n\n def __init__(\n self,\n input_dim: int,\n model_dim: int,\n ffn_dim: int,\n out_dim: int,\n num_blocks: int,\n num_heads: int,\n dropout: float,\n param_dtype: jnp.dtype,\n dtype: jnp.dtype,\n use_flash_attention: bool,\n rngs: nnx.Rngs,\n sow_weights: bool = False,\n sow_activations: bool = False,\n sow_logits: bool = False,\n max_len: int = 5000,\n ):\n self.input_dim = input_dim\n self.model_dim = model_dim\n self.ffn_dim = ffn_dim\n self.out_dim = out_dim\n self.num_blocks = num_blocks\n self.num_heads = num_heads\n self.dropout = dropout\n self.param_dtype = param_dtype\n self.dtype = dtype\n self.use_flash_attention = use_flash_attention\n self.sow_logits = sow_logits\n self.sow_weights = sow_weights\n self.sow_activations = sow_activations\n\n self.input_norm1 = nnx.LayerNorm(\n num_features=self.input_dim,\n param_dtype=self.param_dtype,\n dtype=self.param_dtype, # layer norm in full precision\n rngs=rngs,\n )\n self.input_dense = nnx.Linear(\n in_features=self.input_dim,\n out_features=self.model_dim,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n rngs=rngs,\n )\n self.input_norm2 = nnx.LayerNorm(\n num_features=self.model_dim,\n param_dtype=self.param_dtype,\n dtype=self.param_dtype, # layer norm in full precision\n rngs=rngs,\n )\n\n self.pos_enc = SpatioTemporalPositionalEncoding(self.model_dim, max_len=max_len)\n\n self.blocks = []\n for _ in range(self.num_blocks):\n self.blocks.append(\n STBlock(\n dim=self.model_dim,\n ffn_dim=self.ffn_dim,\n num_heads=self.num_heads,\n dropout=self.dropout,\n 
param_dtype=self.param_dtype,\n dtype=self.dtype,\n use_flash_attention=self.use_flash_attention,\n rngs=rngs,\n sow_weights=self.sow_weights,\n sow_activations=self.sow_activations,\n )\n )\n\n self.output_dense = nnx.Linear(\n in_features=self.model_dim,\n out_features=self.out_dim,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n rngs=rngs,\n )\n\n def __call__(self, x_BTNI: jax.Array) -> jax.Array:\n x_BTNI = self.input_norm1(x_BTNI)\n x_BTNM = self.input_dense(x_BTNI)\n x_BTNM = self.input_norm2(x_BTNM)\n x_BTNM = self.pos_enc(x_BTNM)\n for block in self.blocks:\n x_BTNM = block(x_BTNM)\n\n x_BTNV = self.output_dense(x_BTNM)\n if self.sow_logits:\n self.sow(nnx.Intermediate, ""logits"", x_BTNV)\n return x_BTNV\n\n\nclass TransformerBlock(nnx.Module):\n def __init__(\n self,\n model_dim: int,\n ffn_dim: int,\n num_heads: int,\n dropout: float,\n param_dtype: jnp.dtype,\n dtype: jnp.dtype,\n use_flash_attention: bool,\n decode: bool,\n rngs: nnx.Rngs,\n sow_weights: bool,\n sow_activations: bool,\n ):\n self.model_dim = model_dim\n self.ffn_dim = ffn_dim\n self.num_heads = num_heads\n self.dropout = dropout\n self.param_dtype = param_dtype\n self.dtype = dtype\n self.use_flash_attention = use_flash_attention\n self.decode = decode\n self.sow_weights = sow_weights\n self.sow_activations = sow_activations\n\n self.temporal_norm = nnx.LayerNorm(\n num_features=self.model_dim,\n param_dtype=self.param_dtype,\n dtype=self.param_dtype, # layer norm in full precision\n rngs=rngs,\n )\n self.spatial_norm = nnx.LayerNorm(\n num_features=self.model_dim,\n param_dtype=self.param_dtype,\n dtype=self.param_dtype, # layer norm in full precision\n rngs=rngs,\n )\n self.ffn_norm = nnx.LayerNorm(\n num_features=self.model_dim,\n param_dtype=self.param_dtype,\n dtype=self.param_dtype, # layer norm in full precision\n rngs=rngs,\n )\n self.temporal_attention = nnx.MultiHeadAttention(\n num_heads=self.num_heads,\n in_features=self.model_dim,\n qkv_features=self.model_dim,\n 
dropout_rate=self.dropout,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n attention_fn=_create_flash_attention_fn(\n self.use_flash_attention, is_causal=True\n ),\n rngs=rngs,\n decode=self.decode,\n )\n self.spatial_attention = nnx.MultiHeadAttention(\n num_heads=self.num_heads,\n in_features=self.model_dim,\n qkv_features=self.model_dim,\n dropout_rate=self.dropout,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n attention_fn=_create_flash_attention_fn(\n self.use_flash_attention, is_causal=True\n ),\n rngs=rngs,\n decode=self.decode,\n )\n self.ffn_dense1 = nnx.Linear(\n in_features=self.model_dim,\n out_features=self.ffn_dim,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n rngs=rngs,\n )\n self.ffn_dense2 = nnx.Linear(\n in_features=self.ffn_dim,\n out_features=self.model_dim,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n rngs=rngs,\n )\n\n @nnx.remat\n def __call__(\n self, x_BTNM: jax.Array, pos_index: Tuple[jax.Array, jax.Array] | None = None\n ) -> jax.Array:\n # --- Spatial attention ---\n B, T, N, M = x_BTNM.shape\n z_FNM = einops.rearrange(x_BTNM, ""b t n m -> (b t) n m"")\n z_FNM = self.spatial_norm(z_FNM)\n z_FNM = self.spatial_attention(z_FNM, sow_weights=self.sow_weights)\n z_BTNM = einops.rearrange(z_FNM, ""(b t) n m -> b t n m"", t=T)\n x_BTNM = x_BTNM + z_BTNM\n # --- Temporal attention ---\n z_PTM = einops.rearrange(x_BTNM, ""b t n m -> (b n) t m"")\n z_PTM = self.temporal_norm(z_PTM)\n z_PTM = self.temporal_attention(z_PTM, sow_weights=self.sow_weights)\n z_BTNM = einops.rearrange(z_PTM, ""(b n) t m -> b t n m"", n=N)\n x_BTNM = x_BTNM + z_BTNM\n # --- Feedforward ---\n z_BTNM = self.ffn_norm(x_BTNM)\n z_BTND = self.ffn_dense1(z_BTNM)\n z_BTND = jax.nn.gelu(z_BTND)\n z_BTNM = self.ffn_dense2(z_BTND)\n x_BTNM = x_BTNM + z_BTNM\n if self.sow_activations:\n self.sow(nnx.Intermediate, ""activations"", x_BTNM)\n\n return x_BTNM\n\n\nclass Transformer(nnx.Module):\n """"""\n Dimension keys:\n B: batch size\n T: number of 
frames\n N: number of patches per frame\n I: number of input features\n M: model dimension\n D: FFN dimension\n V: vocabulary size\n F: number of frames in batch\n P: number of patch positions in batch\n """"""\n\n def __init__(\n self,\n input_dim: int,\n model_dim: int,\n ffn_dim: int,\n out_dim: int,\n num_blocks: int,\n num_heads: int,\n dropout: float,\n param_dtype: jnp.dtype,\n dtype: jnp.dtype,\n use_flash_attention: bool,\n decode: bool,\n rngs: nnx.Rngs,\n sow_logits: bool = False,\n sow_weights: bool = False,\n sow_activations: bool = False,\n max_len: int = 5000,\n ):\n self.input_dim = input_dim\n self.model_dim = model_dim\n self.ffn_dim = ffn_dim\n self.out_dim = out_dim\n self.num_blocks = num_blocks\n self.num_heads = num_heads\n self.dropout = dropout\n self.param_dtype = param_dtype\n self.dtype = dtype\n self.use_flash_attention = use_flash_attention\n self.sow_logits = sow_logits\n self.sow_weights = sow_weights\n self.sow_activations = sow_activations\n\n self.input_norm1 = nnx.LayerNorm(\n num_features=self.input_dim,\n param_dtype=self.param_dtype,\n dtype=self.param_dtype, # layer norm in full precision\n rngs=rngs,\n )\n self.input_dense = nnx.Linear(\n in_features=self.input_dim,\n out_features=self.model_dim,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n rngs=rngs,\n )\n self.input_norm2 = nnx.LayerNorm(\n num_features=self.model_dim,\n param_dtype=self.param_dtype,\n dtype=self.param_dtype, # layer norm in full precision\n rngs=rngs,\n )\n\n self.pos_enc = SpatioTemporalPositionalEncoding(self.model_dim, max_len=max_len)\n\n self.blocks: List[TransformerBlock] = []\n for _ in range(self.num_blocks):\n self.blocks.append(\n TransformerBlock(\n model_dim=self.model_dim,\n ffn_dim=self.ffn_dim,\n num_heads=self.num_heads,\n dropout=self.dropout,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n use_flash_attention=self.use_flash_attention,\n decode=decode,\n sow_weights=self.sow_weights,\n 
sow_activations=self.sow_activations,\n rngs=rngs,\n )\n )\n self.output_dense = nnx.Linear(\n in_features=self.model_dim,\n out_features=self.out_dim,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n rngs=rngs,\n )\n\n def __call__(\n self, x_BTNI: jax.Array, pos_index: Tuple[jax.Array, jax.Array] | None = None\n ) -> jax.Array:\n x_BTNI = self.input_norm1(x_BTNI)\n x_BTNM = self.input_dense(x_BTNI)\n x_BTNM = self.input_norm2(x_BTNM)\n x_BTNM = self.pos_enc(x_BTNM)\n for block in self.blocks:\n x_BTNM = block(x_BTNM, pos_index)\n\n x_BTNV = self.output_dense(x_BTNM)\n if self.sow_logits:\n self.sow(nnx.Intermediate, ""logits"", x_BTNV)\n return x_BTNV\n\n\ndef normalize(x: jax.Array) -> jax.Array:\n return x / (jnp.linalg.norm(x, ord=2, axis=-1, keepdims=True) + 1e-8)\n\n\nclass VectorQuantizer(nnx.Module):\n """"""\n Dimension keys:\n D: B * T * N\n K: number of latents\n L: latent dimension\n """"""\n\n def __init__(\n self,\n latent_dim: int,\n num_latents: int,\n dropout: float,\n dtype: jnp.dtype,\n rngs: nnx.Rngs,\n ):\n self.latent_dim = latent_dim\n self.num_latents = num_latents\n self.dropout = dropout\n self.dtype = dtype\n\n self.codebook = nnx.Param(\n normalize(\n nnx.initializers.lecun_uniform()(\n rngs.params(), (self.num_latents, self.latent_dim)\n )\n )\n )\n self.drop = nnx.Dropout(self.dropout, rngs=rngs)\n\n def __call__(\n self, x_DL: jax.Array, training: bool\n ) -> Tuple[jax.Array, jax.Array, jax.Array, jax.Array]:\n # --- Compute distances ---\n x_DL = x_DL.astype(self.dtype)\n codebook = self.codebook.value.astype(self.dtype)\n\n x_DL = normalize(x_DL)\n normalized_codebook_KL = normalize(codebook)\n distance_DK = -jnp.matmul(x_DL, normalized_codebook_KL.T)\n if training:\n distance_DK = self.drop(distance_DK)\n\n # --- Get indices and embeddings ---\n indices_D = jnp.argmin(distance_DK, axis=-1)\n z_DL = codebook[indices_D]\n\n # --- Straight through estimator ---\n z_q_DL = x_DL + jax.lax.stop_gradient(z_DL - x_DL)\n return 
z_q_DL, z_DL, x_DL, indices_D\n\n def get_codes(self, indices_E: jax.Array) -> jax.Array:\n return self.codebook[indices_E]\n\n\ndef _create_flash_attention_fn(use_flash_attention: bool, is_causal: bool) -> Callable:\n """"""\n Create an attention function that uses flash attention if enabled.\n\n flax.nnx.MultiHeadAttention provides tensors with shape (batch..., length, num_heads, head_dim),\n but jax.nn.dot_product_attention expects (batch, length, num_heads, head_dim). We reshape to\n ensure compatibility. cuDNN's flash attention additionally requires a sequence length that\n is a multiple of 4. We pad the sequence length to the nearest multiple of 4 and mask\n accordingly. Note that cuDNN requires the mask to be broadcast before calling the attention\n function due to strict shape checking.\n """"""\n\n def attention_fn(\n query_BTHD, key_BSHD, value_BSHD, bias=None, mask_B111=None, **kwargs\n ):\n implementation = ""cudnn"" if use_flash_attention else None\n\n def _merge_batch_dims(x):\n return einops.rearrange(x, ""... l h k -> (...) 
l h k"")\n\n def _pad(x, pad_size):\n return jnp.pad(x, ((0, 0), (0, pad_size), (0, 0), (0, 0)))\n\n original_shape = query_BTHD.shape\n T = query_BTHD.shape[-3]\n S = key_BSHD.shape[-3]\n\n # Pad to nearest multiple of 4\n Q = ((T + 3) // 4) * 4\n pad_size_Q = Q - T\n K = ((S + 3) // 4) * 4\n pad_size_K = K - S\n\n query_BQHD = _pad(_merge_batch_dims(query_BTHD), pad_size_Q)\n key_BKHD = _pad(_merge_batch_dims(key_BSHD), pad_size_K)\n value_BKHD = _pad(_merge_batch_dims(value_BSHD), pad_size_K)\n\n attention_mask = jnp.ones((Q, K), dtype=jnp.bool_)\n attention_mask = attention_mask.at[T:, :].set(False)\n attention_mask = attention_mask.at[:, S:].set(False)\n\n mask_11TS = attention_mask[jnp.newaxis, jnp.newaxis, :, :]\n\n bias_4d = (\n jnp.pad(\n _merge_batch_dims(bias),\n ((0, 0), (0, 0), (0, pad_size_Q), (0, pad_size_K)),\n )\n if bias is not None\n else None\n )\n\n # NOTE: jax.nn.dot_product_attention does not support dropout\n output_4d = jax.nn.dot_product_attention(\n query=query_BQHD,\n key=key_BKHD,\n value=value_BKHD,\n bias=bias_4d,\n mask=mask_11TS,\n implementation=implementation,\n is_causal=is_causal,\n )\n return output_4d[..., :T, :, :].reshape(original_shape)\n\n return attention_fn\n",python,tab
+2112,272645163,"utils/nn.py",8717,0,"",python,selection_keyboard
+2113,272645563,"utils/nn.py",8686,0,"",python,selection_command
+2114,272645814,"utils/nn.py",8659,0,"",python,selection_command
+2115,272645844,"utils/nn.py",8635,0,"",python,selection_command
+2116,272645878,"utils/nn.py",8613,0,"",python,selection_command
+2117,272645910,"utils/nn.py",8578,0,"",python,selection_command
+2118,272645944,"utils/nn.py",8552,0,"",python,selection_command
+2119,272645977,"utils/nn.py",8520,0,"",python,selection_command
+2120,272646179,"utils/nn.py",9228,0,"",python,selection_command
+2121,272646380,"utils/nn.py",10067,0,"",python,selection_command
+2122,272646789,"utils/nn.py",10851,0,"",python,selection_command
+2123,272647244,"utils/nn.py",11636,0,"",python,selection_command
+2124,272647996,"utils/nn.py",11566,0,"",python,selection_command
+2125,272648144,"utils/nn.py",11490,0,"",python,selection_command
+2126,272648311,"utils/nn.py",11449,0,"",python,selection_command
+2127,272648450,"utils/nn.py",11384,0,"",python,selection_command
+2128,272649170,"utils/nn.py",11449,0,"",python,selection_command
+2129,272649332,"utils/nn.py",11455,0,"",python,selection_command
+2130,272649511,"utils/nn.py",11457,0,"",python,selection_command
+2131,272649699,"utils/nn.py",11461,0,"",python,selection_command
+2132,272649897,"utils/nn.py",11462,0,"",python,selection_command
+2133,272650697,"utils/nn.py",11503,0,"",python,selection_command
+2134,272650998,"utils/nn.py",11462,0,"",python,selection_command
+2135,276005955,"utils/nn.py",11503,0,"",python,selection_command
+2136,276006126,"utils/nn.py",11482,75," z_FNM = self.spatial_attention(z_FNM, sow_weights=self.sow_weights)",python,selection_command
+2137,276006546,"utils/nn.py",11503,0,"",python,selection_command
+2138,276009954,"utils/train_utils.py",0,0,"",python,tab
+2139,276147007,"utils/train_utils.py",5466,74," qkv_spatial = 3.0 * 2.0 * num_tokens * args.model_dim * args.model_dim",python,selection_command
+2140,276199929,"utils/train_utils.py",5466,0,"",python,selection_command
+2141,276200242,"utils/train_utils.py",5466,74," qkv_spatial = 3.0 * 2.0 * num_tokens * args.model_dim * args.model_dim",python,selection_command
+2142,276211835,"utils/train_utils.py",5466,0,"",python,selection_command
+2143,276212040,"utils/train_utils.py",5541,0,"",python,selection_command
+2144,276212271,"utils/train_utils.py",5466,0,"",python,selection_command
+2145,276212523,"utils/train_utils.py",5466,74," qkv_spatial = 3.0 * 2.0 * num_tokens * args.model_dim * args.model_dim",python,selection_command
+2146,276217041,"utils/train_utils.py",5466,0,"",python,selection_command
+2147,276218831,"utils/train_utils.py",5466,74," qkv_spatial = 3.0 * 2.0 * num_tokens * args.model_dim * args.model_dim",python,selection_command
+2148,276628847,"utils/train_utils.py",5466,0,"",python,selection_command
+2149,276629039,"utils/train_utils.py",5541,0,"",python,selection_command
+2150,276629285,"utils/train_utils.py",5541,73," attn_spatial = 4.0 * (args.batch_size * T) * (N * N) * args.model_dim",python,selection_command
+2151,276636395,"utils/train_utils.py",5541,0,"",python,selection_command
+2152,276673213,"utils/train_utils.py",5615,0,"",python,selection_command
+2153,276673410,"utils/train_utils.py",5684,0,"\n ",python,content
+2154,276673599,"utils/train_utils.py",5689,0,"\n ",python,content
+2155,276673599,"utils/train_utils.py",5685,4,"",python,content
+2156,276673827,"utils/train_utils.py",5690,0,"#",python,content
+2157,276673835,"utils/train_utils.py",5691,0,"",python,selection_keyboard
+2158,276674502,"utils/train_utils.py",5691,0,"T",python,content
+2159,276674510,"utils/train_utils.py",5692,0,"",python,selection_keyboard
+2160,276674603,"utils/train_utils.py",5692,0,"O",python,content
+2161,276674606,"utils/train_utils.py",5693,0,"",python,selection_keyboard
+2162,276674916,"utils/train_utils.py",5692,1,"",python,content
+2163,276675036,"utils/train_utils.py",5691,1,"",python,content
+2164,276675099,"utils/train_utils.py",5691,0,"F",python,content
+2165,276675105,"utils/train_utils.py",5692,0,"",python,selection_keyboard
+2166,276675462,"utils/train_utils.py",5691,1,"",python,content
+2167,276675564,"utils/train_utils.py",5691,0," ",python,content
+2168,276675570,"utils/train_utils.py",5692,0,"",python,selection_keyboard
+2169,276675630,"utils/train_utils.py",5692,0,"F",python,content
+2170,276675634,"utils/train_utils.py",5693,0,"",python,selection_keyboard
+2171,276675747,"utils/train_utils.py",5693,0,"I",python,content
+2172,276675752,"utils/train_utils.py",5694,0,"",python,selection_keyboard
+2173,276675845,"utils/train_utils.py",5694,0,"X",python,content
+2174,276675853,"utils/train_utils.py",5695,0,"",python,selection_keyboard
+2175,276675986,"utils/train_utils.py",5695,0,"M",python,content
+2176,276675992,"utils/train_utils.py",5696,0,"",python,selection_keyboard
+2177,276676077,"utils/train_utils.py",5696,0,"E",python,content
+2178,276676084,"utils/train_utils.py",5697,0,"",python,selection_keyboard
+2179,276676161,"utils/train_utils.py",5697,0," ",python,content
+2180,276676167,"utils/train_utils.py",5698,0,"",python,selection_keyboard
+2181,276676262,"utils/train_utils.py",5698,0,"()",python,content
+2182,276676268,"utils/train_utils.py",5699,0,"",python,selection_keyboard
+2183,276676507,"utils/train_utils.py",5699,0,"f",python,content
+2184,276676514,"utils/train_utils.py",5700,0,"",python,selection_keyboard
+2185,276676638,"utils/train_utils.py",5700,0,".",python,content
+2186,276676647,"utils/train_utils.py",5701,0,"",python,selection_keyboard
+2187,276676688,"utils/train_utils.py",5701,0,"s",python,content
+2188,276676694,"utils/train_utils.py",5702,0,"",python,selection_keyboard
+2189,276676754,"utils/train_utils.py",5702,0,"r",python,content
+2190,276676761,"utils/train_utils.py",5703,0,"",python,selection_keyboard
+2191,276676837,"utils/train_utils.py",5703,0,"a",python,content
+2192,276676843,"utils/train_utils.py",5704,0,"",python,selection_keyboard
+2193,276676853,"utils/train_utils.py",5704,0,"m",python,content
+2194,276676859,"utils/train_utils.py",5705,0,"",python,selection_keyboard
+2195,276677053,"utils/train_utils.py",5705,0,"b",python,content
+2196,276677057,"utils/train_utils.py",5706,0,"",python,selection_keyboard
+2197,276677183,"utils/train_utils.py",5706,0,"i",python,content
+2198,276677189,"utils/train_utils.py",5707,0,"",python,selection_keyboard
+2199,276677244,"utils/train_utils.py",5707,0,"c",python,content
+2200,276677251,"utils/train_utils.py",5708,0,"",python,selection_keyboard
+2201,276677316,"utils/train_utils.py",5708,0,"a",python,content
+2202,276677321,"utils/train_utils.py",5709,0,"",python,selection_keyboard
+2203,276677413,"utils/train_utils.py",5709,0,"l",python,content
+2204,276677418,"utils/train_utils.py",5710,0,"",python,selection_keyboard
+2205,276677577,"utils/train_utils.py",5709,0,"",python,selection_command
+2206,276677823,"utils/train_utils.py",5711,0,"",python,selection_command
+2207,276677911,"utils/train_utils.py",5711,0,":",python,content
+2208,276677920,"utils/train_utils.py",5712,0,"",python,selection_keyboard
+2209,276678045,"utils/train_utils.py",5712,0," ",python,content
+2210,276678051,"utils/train_utils.py",5713,0,"",python,selection_keyboard
+2211,276679419,"utils/train_utils.py",5713,0,"w",python,content
+2212,276679431,"utils/train_utils.py",5714,0,"",python,selection_keyboard
+2213,276679504,"utils/train_utils.py",5714,0,"h",python,content
+2214,276679510,"utils/train_utils.py",5715,0,"",python,selection_keyboard
+2215,276679574,"utils/train_utils.py",5715,0,"e",python,content
+2216,276679582,"utils/train_utils.py",5716,0,"",python,selection_keyboard
+2217,276679648,"utils/train_utils.py",5716,0,"r",python,content
+2218,276679654,"utils/train_utils.py",5717,0,"",python,selection_keyboard
+2219,276679763,"utils/train_utils.py",5717,0,"e",python,content
+2220,276679768,"utils/train_utils.py",5718,0,"",python,selection_keyboard
+2221,276679866,"utils/train_utils.py",5718,0," ",python,content
+2222,276679871,"utils/train_utils.py",5719,0,"",python,selection_keyboard
+2223,276679988,"utils/train_utils.py",5719,0,"a",python,content
+2224,276679993,"utils/train_utils.py",5720,0,"",python,selection_keyboard
+2225,276680064,"utils/train_utils.py",5720,0,"r",python,content
+2226,276680073,"utils/train_utils.py",5721,0,"",python,selection_keyboard
+2227,276680166,"utils/train_utils.py",5721,0,"e",python,content
+2228,276680172,"utils/train_utils.py",5722,0,"",python,selection_keyboard
+2229,276680232,"utils/train_utils.py",5722,0," ",python,content
+2230,276680239,"utils/train_utils.py",5723,0,"",python,selection_keyboard
+2231,276680352,"utils/train_utils.py",5723,0,"t",python,content
+2232,276680358,"utils/train_utils.py",5724,0,"",python,selection_keyboard
+2233,276680446,"utils/train_utils.py",5724,0,"h",python,content
+2234,276680454,"utils/train_utils.py",5725,0,"",python,selection_keyboard
+2235,276680466,"utils/train_utils.py",5725,0,"e",python,content
+2236,276680476,"utils/train_utils.py",5726,0,"",python,selection_keyboard
+2237,276680583,"utils/train_utils.py",5726,0,"a",python,content
+2238,276680590,"utils/train_utils.py",5727,0,"",python,selection_keyboard
+2239,276680608,"utils/train_utils.py",5727,0," ",python,content
+2240,276680617,"utils/train_utils.py",5728,0,"",python,selection_keyboard
+2241,276681137,"utils/train_utils.py",5727,1,"",python,content
+2242,276681286,"utils/train_utils.py",5726,1,"",python,content
+2243,276681405,"utils/train_utils.py",5726,0," ",python,content
+2244,276681411,"utils/train_utils.py",5727,0,"",python,selection_keyboard
+2245,276681472,"utils/train_utils.py",5727,0,"h",python,content
+2246,276681479,"utils/train_utils.py",5728,0,"",python,selection_keyboard
+2247,276681528,"utils/train_utils.py",5728,0,"e",python,content
+2248,276681534,"utils/train_utils.py",5729,0,"",python,selection_keyboard
+2249,276681600,"utils/train_utils.py",5729,0,"a",python,content
+2250,276681606,"utils/train_utils.py",5730,0,"",python,selection_keyboard
+2251,276681721,"utils/train_utils.py",5730,0,"d",python,content
+2252,276681733,"utils/train_utils.py",5731,0,"",python,selection_keyboard
+2253,276681744,"utils/train_utils.py",5731,0,"s",python,content
+2254,276681751,"utils/train_utils.py",5732,0,"",python,selection_keyboard
+2255,276682098,"utils/train_utils.py",5732,0,"?",python,content
+2256,276682106,"utils/train_utils.py",5733,0,"",python,selection_keyboard
+2257,276682364,"utils/train_utils.py",5732,0,"",python,selection_command
+2258,276686925,"utils/train_utils.py",5727,0,"",python,selection_command
+2259,276691403,"train_tokenizer.py",0,0,"",python,tab
+2260,276692743,"train_tokenizer.py",1553,0,"",python,selection_command
+2261,276694297,"utils/train_utils.py",0,0,"",python,tab
+2262,276695567,"utils/train_utils.py",5685,0,"",python,selection_command
+2263,276695766,"utils/train_utils.py",5656,0,"",python,selection_command
+2264,276695915,"utils/train_utils.py",5582,0,"",python,selection_command
+2265,276696078,"utils/train_utils.py",5507,0,"",python,selection_command
+2266,276696362,"utils/train_utils.py",5427,0,"",python,selection_command
+2267,276701106,"utils/train_utils.py",5449,0,"",python,selection_command
+2268,276701285,"utils/train_utils.py",5486,0,"",python,selection_command
+2269,276701446,"utils/train_utils.py",5561,0,"",python,selection_command
+2270,276701579,"utils/train_utils.py",5635,0,"",python,selection_command
+2271,276702371,"utils/train_utils.py",5561,0,"",python,selection_command
diff --git a/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-25c99858-fb5d-49c2-83fa-7bee2d3aecc61762433266951-2025_11_06-13.47.50.980/source.csv b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-25c99858-fb5d-49c2-83fa-7bee2d3aecc61762433266951-2025_11_06-13.47.50.980/source.csv
new file mode 100644
index 0000000000000000000000000000000000000000..1aaff755ec4d7d7f32e69e8cc893af32c0212a8f
--- /dev/null
+++ b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-25c99858-fb5d-49c2-83fa-7bee2d3aecc61762433266951-2025_11_06-13.47.50.980/source.csv
@@ -0,0 +1,61 @@
+Sequence,Time,File,RangeOffset,RangeLength,Text,Language,Type
+1,12,"Untitled-2",0,0,"",plaintext,tab
+2,255,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"",Log,tab
+3,529,"TERMINAL",0,0,"Test",,terminal_focus
+4,569,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"",Log,tab
+5,801,"TERMINAL",0,0,"echo VSCode test",,terminal_command
+6,802,"TERMINAL",0,0,"]633;CVSCode test\r\n[1m[7m%[27m[1m[0m \r \r",,terminal_output
+7,1039,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"1:47:50 PM [info] Activating crowd-code\n1:47:50 PM [info] Recording started\n1:47:50 PM [info] Initializing git provider using file system watchers...\n1:47:50 PM [info] No workspace folder found\n",Log,content
+8,2236,"Untitled-2",0,0,"",plaintext,tab
+9,3653,"Untitled-2",0,0,"hello world\n",plaintext,content
+10,3707,"TERMINAL",0,0,"echo VSCode test",,terminal_command
+11,3714,"TERMINAL",0,0,"]633;CVSCode test\r\n[1m[7m%[27m[1m[0m \r \r",,terminal_output
+12,6865,"TERMINAL",0,0,"",,terminal_focus
+13,9284,"Untitled-2",0,0,"",plaintext,selection_command
+14,9289,"Untitled-2",0,0,"hello world\n",plaintext,content
+15,9300,"TERMINAL",0,0,"Test",,terminal_focus
+16,9322,"TERMINAL",0,0,"echo VSCode test",,terminal_command
+17,9323,"TERMINAL",0,0,"]633;CVSCode test\r\n[1m[7m%[27m[1m[0m \r \r",,terminal_output
+18,10544,"TERMINAL",0,0,"zsh",,terminal_focus
+19,12294,"Untitled-2",0,0,"",plaintext,selection_command
+20,12295,"Untitled-2",0,0,"hello world\n",plaintext,content
+21,12302,"TERMINAL",0,0,"Test",,terminal_focus
+22,12372,"TERMINAL",0,0,"echo VSCode test",,terminal_command
+23,12373,"TERMINAL",0,0,"]633;CVSCode test\r\n[1m[7m%[27m[1m[0m \r \r",,terminal_output
+24,38024,"Untitled-2",36,0,"",plaintext,selection_mouse
+25,38349,"Untitled-2",0,0,"",plaintext,selection_command
+26,38350,"Untitled-2",0,0,"hello world\n",plaintext,content
+27,38372,"TERMINAL",0,0,"echo VSCode test",,terminal_command
+28,38373,"TERMINAL",0,0,"]633;CVSCode test\r\n[1m[7m%[27m[1m[0m \r \r",,terminal_output
+29,41487,"TERMINAL",0,0,"zsh",,terminal_focus
+30,41898,"Untitled-2",48,0,"",plaintext,selection_mouse
+31,42128,"Untitled-2",0,0,"",plaintext,selection_command
+32,42129,"Untitled-2",0,0,"hello world\n",plaintext,content
+33,42161,"TERMINAL",0,0,"echo VSCode test",,terminal_command
+34,42161,"TERMINAL",0,0,"]633;CVSCode test\r\n[1m[7m%[27m[1m[0m \r \r",,terminal_output
+35,44371,"Untitled-2",0,0,"",plaintext,selection_command
+36,44375,"Untitled-2",0,0,"hello world\n",plaintext,content
+37,44400,"TERMINAL",0,0,"echo VSCode test",,terminal_command
+38,44401,"TERMINAL",0,0,"]633;CVSCode test\r\n[1m[7m%[27m[1m[0m \r \r",,terminal_output
+39,45706,"Untitled-2",72,0,"",plaintext,selection_mouse
+40,45963,"Untitled-2",0,0,"",plaintext,selection_command
+41,45964,"Untitled-2",0,0,"hello world\n",plaintext,content
+42,45986,"TERMINAL",0,0,"echo VSCode test",,terminal_command
+43,45987,"TERMINAL",0,0,"]633;CVSCode test\r\n[1m[7m%[27m[1m[0m \r \r",,terminal_output
+44,46465,"Untitled-2",0,0,"",plaintext,selection_command
+45,46466,"Untitled-2",0,0,"hello world\n",plaintext,content
+46,46491,"TERMINAL",0,0,"echo VSCode test",,terminal_command
+47,46492,"TERMINAL",0,0,"]633;CVSCode test\r\n[1m[7m%[27m[1m[0m \r \r",,terminal_output
+48,129708,"TERMINAL",0,0,"Test",,terminal_focus
+49,129720,"Untitled-2",0,0,"",ruby,selection_command
+50,129722,"Untitled-2",0,0,"hello world\n",ruby,content
+51,129991,"TERMINAL",0,0,"echo VSCode test",,terminal_command
+52,129992,"TERMINAL",0,0,"]633;CVSCode test\r\n[1m[7m%[27m[1m[0m \r \r",,terminal_output
+53,130304,"Untitled-2",0,0,"",ruby,selection_command
+54,130307,"Untitled-2",0,0,"hello world\n",ruby,content
+55,130338,"TERMINAL",0,0,"echo VSCode test",,terminal_command
+56,130339,"TERMINAL",0,0,"]633;CVSCode test\r\n[1m[7m%[27m[1m[0m \r \r",,terminal_output
+57,132020,"Untitled-2",0,0,"",ruby,selection_command
+58,132022,"Untitled-2",0,0,"hello world\n",ruby,content
+59,132048,"TERMINAL",0,0,"echo VSCode test",,terminal_command
+60,132049,"TERMINAL",0,0,"]633;CVSCode test\r\n[1m[7m%[27m[1m[0m \r \r",,terminal_output
diff --git a/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-2cea4f45-eb3d-4d39-8263-124201da2dc81763721698838-2025_11_21-11.41.41.686/source.csv b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-2cea4f45-eb3d-4d39-8263-124201da2dc81763721698838-2025_11_21-11.41.41.686/source.csv
new file mode 100644
index 0000000000000000000000000000000000000000..c6d5db7d05621e1002688a9bb2033dd60355418b
--- /dev/null
+++ b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-2cea4f45-eb3d-4d39-8263-124201da2dc81763721698838-2025_11_21-11.41.41.686/source.csv
@@ -0,0 +1,546 @@
+Sequence,Time,File,RangeOffset,RangeLength,Text,Language,Type
+1,3,"nemo/collections/llm/recipes/finetune_default.py",0,0,"# Copyright (c) 2025, NVIDIA CORPORATION. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the ""License"");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an ""AS IS"" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import TYPE_CHECKING, Any, Optional\n\nimport lightning.pytorch as pl\nimport nemo_run as run\nimport torch\n\nimport nemo.lightning as nl\nfrom nemo.collections import llm\nfrom nemo.collections.llm.gpt.data.packed_sequence import PackedSequenceSpecs\nfrom nemo.collections.llm.peft import DoRA, LoRA\nfrom nemo.collections.llm.recipes.log.default import tensorboard_logger\nfrom nemo.collections.llm.recipes.optim.adam import distributed_fused_adam_with_cosine_annealing\nfrom nemo.collections.llm.recipes.precision.mixed_precision import bf16_mixed\nfrom nemo.lightning.pytorch.callbacks import PEFT\nfrom nemo.utils.exp_manager import TimingCallback\n\nif TYPE_CHECKING:\n from lightning.pytorch.loggers import TensorBoardLogger, WandbLogger\n\nTokenizerType = Any\n\n\ndef default_finetune_recipe(\n model: run.Config[pl.LightningModule],\n resume_path: str,\n dir: Optional[str] = None,\n name: str = ""default"",\n num_nodes: int = 1,\n num_gpus_per_node: int = 8,\n packed_sequence: bool = False, # once packing recipe is well tested, change this default to true\n tokenizer: Optional[TokenizerType] = ""model"",\n) -> run.Partial:\n """"""\n Create a default fine-tuning recipe for any model.\n\n This function sets up a template for a complete configuration for fine-tuning, including\n 
model, trainer, data, logging, optimization, and resumption settings.\n\n Args:\n model (run.Config[pl.LightningModule]): Configuration for a NeMo model.\n resume_path (str): Path to the Huggingface model or pretrained distributed checkpoint for resume\n dir (Optional[str]): Directory for saving logs and checkpoints.\n name (str): Name of the fine-tuning run.\n num_nodes (int): Number of compute nodes to use.\n num_gpus_per_node (int): Number of GPUs per node.\n packed_sequence (bool): Whether to use packed sequence.\n tokenizer (Optional[TokenizerType]): Tokenizer setting to be applied. Can be 'data' or 'model'\n or an instance of TokenizerSpec.\n\n Returns:\n run.Partial: Partial configuration for fine-tuning.\n\n See usages of this recipe for further details.\n """"""\n if packed_sequence:\n datamodule = run.Config(\n llm.SquadDataModule,\n seq_length=2048,\n global_batch_size=8,\n micro_batch_size=1,\n packed_sequence_specs=PackedSequenceSpecs(packed_sequence_size=2048),\n )\n else:\n datamodule = run.Config(llm.SquadDataModule, seq_length=2048, global_batch_size=128, micro_batch_size=1)\n recipe = run.Partial(\n llm.finetune,\n model=model,\n trainer=default_finetune_trainer(\n num_nodes=num_nodes,\n num_gpus_per_node=num_gpus_per_node,\n ),\n data=datamodule,\n log=default_finetune_log(dir=dir, name=name, tensorboard_logger=tensorboard_logger(name=name)),\n optim=distributed_fused_adam_with_cosine_annealing(max_lr=1e-4, min_lr=0, warmup_steps=50, adam_beta2=0.98),\n resume=nemo_resume(resume_path),\n tokenizer=tokenizer,\n )\n\n return recipe\n\n\ndef default_finetune_trainer(\n tensor_parallelism=1,\n pipeline_parallelism=1,\n pipeline_parallelism_type=torch.bfloat16,\n virtual_pipeline_parallelism=None,\n context_parallelism=1,\n sequence_parallelism=False,\n num_nodes=1,\n num_gpus_per_node=8,\n max_steps=1000,\n limit_test_batches=None,\n limit_val_batches=None,\n val_check_interval=30,\n):\n """"""\n Create a default fine-tuning trainer for any 
model.\n\n This function sets up a template for strategy and trainer.\n\n Args:\n See docstrings of MegatronStrategy and Trainer.\n\n Returns:\n run.Config: Config for a finetuning trainer.\n\n See usages of this in recipes for further details.\n """"""\n strategy = run.Config(\n nl.MegatronStrategy,\n tensor_model_parallel_size=tensor_parallelism,\n pipeline_model_parallel_size=pipeline_parallelism,\n pipeline_dtype=pipeline_parallelism_type,\n virtual_pipeline_model_parallel_size=virtual_pipeline_parallelism,\n context_parallel_size=context_parallelism,\n sequence_parallel=sequence_parallelism,\n gradient_as_bucket_view=True,\n ckpt_load_strictness=""log_all"",\n )\n\n trainer = run.Config(\n nl.Trainer,\n accelerator=""gpu"",\n accumulate_grad_batches=1,\n devices=num_gpus_per_node,\n limit_test_batches=limit_test_batches,\n limit_val_batches=limit_val_batches,\n log_every_n_steps=1,\n max_steps=max_steps,\n num_nodes=num_nodes,\n plugins=bf16_mixed(),\n strategy=strategy,\n use_distributed_sampler=False,\n val_check_interval=val_check_interval,\n callbacks=[run.Config(TimingCallback)],\n )\n\n return trainer\n\n\ndef default_finetune_log(\n dir: Optional[str] = None,\n name: str = ""default"",\n tensorboard_logger: Optional[run.Config['TensorBoardLogger']] = None,\n wandb_logger: Optional[run.Config['WandbLogger']] = None,\n) -> run.Config[nl.NeMoLogger]:\n """"""\n Create a default fine-tuning logger for any model.\n\n This function sets up a template for ModelCheckpoint and NeMoLogger.\n\n Args:\n See docstrings of ModelCheckpoint and NeMoLogger.\n\n Returns:\n run.Config: Config for a finetuning NeMoLogger.\n\n See usages of this in recipes for further details.\n """"""\n\n ckpt = run.Config(\n nl.ModelCheckpoint,\n save_last=""link"",\n save_top_k=2,\n every_n_train_steps=50,\n filename=""{model_name}--{val_loss:.2f}-{step}-{consumed_samples}"",\n )\n\n return run.Config(\n nl.NeMoLogger,\n ckpt=ckpt,\n name=name,\n tensorboard=tensorboard_logger,\n 
wandb=wandb_logger,\n log_dir=dir,\n )\n\n\ndef nemo_resume(model_id: str) -> run.Config[nl.AutoResume]:\n """"""\n Configure automatic resumption from a NeMo checkpoint converted from Huggingface for\n https://huggingface.co/{model_id}.\n\n This NeMo checkpoint should be converted from Huggingface beforehand, using nemo.collections.llm.import_ckpt.\n When converting the checkpoint, the NeMo checkpoint will be saved in NEMO_HOME (set to ~/.cache/nemo by default).\n\n This function sets up the configuration to resume training from path nemo://{model_id}.\n This translates to the full path {NEMO_HOME}/models/{model_id}.\n\n Args:\n model_id (str): Path to the Huggingface model or pretrained distributed checkpoint for resume\n\n Returns:\n run.Config[nl.AutoResume]: Configuration for resuming from NeMo checkpoint.\n """"""\n return run.Config(\n nl.AutoResume,\n restore_config=run.Config(nl.RestoreConfig, path=f""nemo://{model_id}""),\n )\n\n\n@run.cli.factory(name='lora')\ndef lora() -> run.Config[PEFT]:\n """"""\n Factory function to create a LoRA configuration.\n\n Returns:\n run.Config[PEFT]: Configuration for the LoRA class.\n\n Examples:\n CLI usage:\n $ nemo llm finetune -f llama3_8b peft=lora\n\n Python API usage:\n >>> lora_config = lora()\n >>> print(lora_config)\n """"""\n return run.Config(LoRA)\n\n\n@run.cli.factory(name='dora')\ndef dora() -> run.Config[PEFT]:\n """"""\n Factory function to create a DoRA configuration.\n\n Returns:\n run.Config[PEFT]: Configuration for the DoRA class.\n\n Examples:\n CLI usage:\n $ nemo llm finetune -f llama3_8b peft=dora\n\n Python API usage:\n >>> dora_config = dora()\n >>> print(dora_config)\n """"""\n return run.Config(DoRA)\n",python,tab
+2,64,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"11:41:41 AM [info] Activating crowd-code\n11:41:41 AM [info] Recording started\n11:41:41 AM [info] Initializing git provider using file system watchers...\n11:41:41 AM [info] Git repository found\n11:41:41 AM [info] Git provider initialized successfully\n11:41:41 AM [info] Initial git state: [object Object]\n",Log,tab
+3,1168,"nemo/collections/llm/recipes/finetune_default.py",0,0,"",python,tab
+4,18225,"nemo/collections/llm/recipes/finetune_default.py",3766,0,"",python,selection_keyboard
+5,23245,"nemo/collections/llm/recipes/finetune_default.py",3709,0,"",python,selection_mouse
+6,42597,"nemo/collections/llm/recipes/finetune_default.py",5157,0,"",python,selection_keyboard
+7,43440,"nemo/collections/llm/recipes/finetune_default.py",6420,0,"",python,selection_keyboard
+8,45071,"nemo/collections/llm/recipes/finetune_default.py",7925,0,"",python,selection_keyboard
+9,46027,"nemo/collections/llm/recipes/finetune_default.py",8209,0,"",python,selection_keyboard
+10,46798,"nemo/collections/llm/recipes/finetune_default.py",7012,0,"",python,selection_keyboard
+11,46972,"nemo/collections/llm/recipes/finetune_default.py",5433,0,"",python,selection_keyboard
+12,48376,"nemo/collections/llm/recipes/finetune_default.py",3981,0,"",python,selection_keyboard
+13,51440,"nemo/collections/llm/recipes/finetune_default.py",3956,0,"",python,selection_command
+14,51606,"nemo/collections/llm/recipes/finetune_default.py",3939,0,"",python,selection_command
+15,51857,"nemo/collections/llm/recipes/finetune_default.py",3907,0,"",python,selection_command
+16,51889,"nemo/collections/llm/recipes/finetune_default.py",3880,0,"",python,selection_command
+17,51922,"nemo/collections/llm/recipes/finetune_default.py",3841,0,"",python,selection_command
+18,51958,"nemo/collections/llm/recipes/finetune_default.py",3795,0,"",python,selection_command
+19,51989,"nemo/collections/llm/recipes/finetune_default.py",3767,0,"",python,selection_command
+20,52696,"nemo/collections/llm/recipes/finetune_default.py",3741,0,"",python,selection_command
+21,52926,"nemo/collections/llm/recipes/finetune_default.py",3767,0,"",python,selection_command
+22,53176,"nemo/collections/llm/recipes/finetune_default.py",3795,0,"",python,selection_command
+23,53216,"nemo/collections/llm/recipes/finetune_default.py",3841,0,"",python,selection_command
+24,53241,"nemo/collections/llm/recipes/finetune_default.py",3880,0,"",python,selection_command
+25,53275,"nemo/collections/llm/recipes/finetune_default.py",3907,0,"",python,selection_command
+26,53310,"nemo/collections/llm/recipes/finetune_default.py",3939,0,"",python,selection_command
+27,53342,"nemo/collections/llm/recipes/finetune_default.py",3956,0,"",python,selection_command
+28,53376,"nemo/collections/llm/recipes/finetune_default.py",3981,0,"",python,selection_command
+29,53409,"nemo/collections/llm/recipes/finetune_default.py",4001,0,"",python,selection_command
+30,53449,"nemo/collections/llm/recipes/finetune_default.py",4030,0,"",python,selection_command
+31,53482,"nemo/collections/llm/recipes/finetune_default.py",4058,0,"",python,selection_command
+32,54038,"nemo/collections/llm/recipes/finetune_default.py",4085,0,"",python,selection_command
+33,54286,"nemo/collections/llm/recipes/finetune_default.py",4088,0,"",python,selection_command
+34,54323,"nemo/collections/llm/recipes/finetune_default.py",4096,0,"",python,selection_command
+35,54358,"nemo/collections/llm/recipes/finetune_default.py",4152,0,"",python,selection_command
+36,54388,"nemo/collections/llm/recipes/finetune_default.py",4153,0,"",python,selection_command
+37,54729,"nemo/collections/llm/recipes/finetune_default.py",4216,0,"",python,selection_command
+38,54876,"nemo/collections/llm/recipes/finetune_default.py",4217,0,"",python,selection_command
+39,54984,"nemo/collections/llm/recipes/finetune_default.py",4227,0,"",python,selection_command
+40,57993,"tests/lightning/_io/artifacts/model.yaml",0,0,"_target_: nemo.collections.llm.gpt.model.base.GPTModel\nconfig:\n _cpu_offloading_context: null\n _target_: nemo.collections.llm.gpt.model.base.GPTConfig\n activation_func:\n _call_: false\n _target_: torch._C._nn.gelu\n activation_func_fp8_input_store: false\n add_bias_linear: true\n add_qkv_bias: false\n apply_query_key_layer_scaling: false\n apply_residual_connection_post_layernorm: false\n apply_rope_fusion: false\n async_tensor_model_parallel_allreduce: false\n attention_dropout: 0.1\n attention_softmax_in_fp32: false\n autocast_dtype: null\n barrier_with_L1_time: true\n batch_p2p_comm: true\n batch_p2p_sync: true\n bf16: false\n bias_activation_fusion: false\n bias_dropout_fusion: false\n calculate_per_token_loss: false\n clone_scatter_output_in_embedding: true\n config_logger_dir: ''\n context_parallel_size: 1\n cpu_offloading: false\n cpu_offloading_activations: true\n cpu_offloading_num_layers: 0\n cpu_offloading_weights: true\n cross_entropy_loss_fusion: true\n data_step_fn:\n _call_: false\n _target_: nemo.collections.llm.gpt.model.base.gpt_data_step\n deallocate_pipeline_outputs: false\n defer_embedding_wgrad_compute: false\n deterministic_mode: false\n disable_parameter_transpose_cache: false\n distribute_saved_activations: null\n enable_autocast: false\n enable_cuda_graph: false\n expert_model_parallel_size: 1\n external_cuda_graph: false\n ffn_hidden_size: 4096\n finalize_model_grads_func: null\n first_pipeline_num_layers: null\n forward_step_fn:\n _call_: false\n _target_: nemo.collections.llm.gpt.model.base.gpt_forward_step\n fp16: false\n fp16_lm_cross_entropy: false\n fp32_residual_connection: false\n fp8: null\n fp8_amax_compute_algo: most_recent\n fp8_amax_history_len: 1\n fp8_dot_product_attention: false\n fp8_interval: 1\n fp8_margin: 0\n fp8_multi_head_attention: false\n fp8_wgrad: true\n gated_linear_unit: false\n grad_scale_func: null\n grad_sync_func: null\n 
gradient_accumulation_fusion: true\n hidden_dropout: 0.1\n hidden_size: 1024\n init_method: null\n init_method_std: 0.02\n kv_channels: null\n last_pipeline_num_layers: null\n layernorm_epsilon: 1.0e-05\n layernorm_zero_centered_gamma: false\n make_vocab_size_divisible_by: 128\n masked_softmax_fusion: true\n memory_efficient_layer_norm: false\n moe_aux_loss_coeff: 0\n moe_expert_capacity_factor: null\n moe_extended_tp: false\n moe_grouped_gemm: false\n moe_input_jitter_eps: null\n moe_layer_recompute: false\n moe_pad_expert_input_to_capacity: false\n moe_per_layer_logging: false\n moe_router_load_balancing_type: aux_loss\n moe_router_pre_softmax: false\n moe_router_topk: 2\n moe_shared_expert_intermediate_size: null\n moe_shared_expert_overlap: false\n moe_token_dispatcher_type: allgather\n moe_token_drop_policy: probs\n moe_token_dropping: false\n moe_z_loss_coeff: null\n no_sync_func: null\n normalization: LayerNorm\n num_attention_heads: 8\n num_layers: 2\n num_microbatches_with_partial_activation_checkpoints: null\n num_moe_experts: null\n num_query_groups: null\n output_layer_init_method: null\n overlap_p2p_comm: false\n parallel_output: true\n param_sync_func: null\n params_dtype:\n _call_: false\n _target_: torch.float32\n perform_initialization: true\n persist_layer_norm: false\n pipeline_dtype: null\n pipeline_model_parallel_size: 1\n pipeline_model_parallel_split_rank: null\n position_embedding_type: learned_absolute\n qk_layernorm: false\n recompute_granularity: null\n recompute_method: null\n recompute_num_layers: null\n rotary_base: 10000\n rotary_interleaved: false\n rotary_percent: 1.0\n seq_len_interpolation_factor: null\n seq_length: 1024\n sequence_parallel: false\n share_embeddings_and_output_weights: true\n tensor_model_parallel_size: 1\n test_mode: false\n timers: null\n tp_comm_atomic_ag: false\n tp_comm_atomic_rs: false\n tp_comm_bulk_dgrad: true\n tp_comm_bulk_wgrad: true\n tp_comm_overlap: false\n tp_comm_overlap_ag: true\n 
tp_comm_overlap_disable_fc1: false\n tp_comm_overlap_disable_qkv: false\n tp_comm_overlap_rs: true\n tp_comm_overlap_rs_dgrad: false\n tp_comm_split_ag: true\n tp_comm_split_rs: true\n tp_only_amax_red: false\n transformer_layer_spec:\n _call_: false\n _target_: nemo.collections.llm.gpt.model.base.default_layer_spec\n use_cpu_initialization: false\n use_ring_exchange_p2p: false\n use_te_rng_tracker: false\n variable_seq_lengths: false\n virtual_pipeline_model_parallel_size: null\n wgrad_deferral_limit: 0\n window_size: null\nmodel_transform: null\noptim:\n _target_: nemo.lightning.pytorch.optim.megatron.MegatronOptimizerModule\n config:\n _target_: megatron.core.optimizer.optimizer_config.OptimizerConfig\n adam_beta1: 0.9\n adam_beta2: 0.999\n adam_eps: 1.0e-08\n barrier_with_L1_time: false\n bf16: false\n clip_grad: 1.0\n config_logger_dir: ''\n decoupled_lr: null\n decoupled_min_lr: null\n fp16: false\n hysteresis: 2\n initial_loss_scale: 4294967296\n log_num_zeros_in_grad: false\n loss_scale: null\n loss_scale_window: 1000\n lr: 0.0001\n min_loss_scale: 1.0\n min_lr: null\n optimizer: adam\n overlap_param_gather_with_optimizer_step: false\n params_dtype:\n _call_: false\n _target_: torch.float32\n sgd_momentum: 0.9\n timers: null\n use_distributed_optimizer: true\n weight_decay: 0.01\n lr_mult: 1.0\n lr_scheduler: null\n no_weight_decay_cond: null\n scale_lr_cond: null\ntokenizer:\n _target_: nemo.collections.common.tokenizers.huggingface.auto_tokenizer.AutoTokenizer\n bos_token: null\n cls_token: null\n eos_token: null\n mask_token: null\n merges_file: megatron-gpt-345m_merges\n pad_token: null\n pretrained_model_name: gpt2\n sep_token: null\n trust_remote_code: false\n unk_token: null\n use_fast: false\n vocab_file: megatron-gpt-345m_vocab\n",yaml,tab
+41,59739,"tests/lightning/_io/artifacts/model.yaml",3367,0,"",yaml,selection_command
+42,62820,"tests/lightning/_io/artifacts/model.yaml",4805,0,"",yaml,selection_command
+43,63714,"tests/lightning/_io/artifacts/model.yaml",4832,0,"",yaml,selection_command
+44,64537,"tests/lightning/_io/artifacts/model.yaml",4805,0,"",yaml,selection_command
+45,65301,"tests/lightning/_io/artifacts/model.yaml",4832,0,"",yaml,selection_command
+46,65946,"tests/lightning/_io/artifacts/model.yaml",4805,0,"",yaml,selection_command
+47,66127,"tests/lightning/_io/artifacts/model.yaml",4832,0,"",yaml,selection_command
+48,67070,"tests/lightning/_io/artifacts/model.yaml",4996,0,"",yaml,selection_command
+49,69350,"nemo/collections/llm/recipes/finetune_default.py",0,0,"",python,tab
+50,71812,"nemo/collections/llm/recipes/finetune_default.py",1022,0,"",python,selection_command
+51,72748,"nemo/collections/llm/recipes/finetune_default.py",1948,0,"",python,selection_command
+52,73153,"nemo/collections/llm/recipes/finetune_default.py",3505,0,"",python,selection_command
+53,75039,"nemo/collections/llm/recipes/finetune_default.py",3510,0,"",python,selection_command
+54,75283,"nemo/collections/llm/recipes/finetune_default.py",3511,0,"",python,selection_command
+55,75327,"nemo/collections/llm/recipes/finetune_default.py",3555,0,"",python,selection_command
+56,75353,"nemo/collections/llm/recipes/finetune_default.py",3556,0,"",python,selection_command
+57,75379,"nemo/collections/llm/recipes/finetune_default.py",3562,0,"",python,selection_command
+58,75419,"nemo/collections/llm/recipes/finetune_default.py",3563,0,"",python,selection_command
+59,75448,"nemo/collections/llm/recipes/finetune_default.py",3565,0,"",python,selection_command
+60,75480,"nemo/collections/llm/recipes/finetune_default.py",3566,0,"",python,selection_command
+61,75912,"nemo/collections/llm/recipes/finetune_default.py",3565,0,"",python,selection_command
+62,76073,"nemo/collections/llm/recipes/finetune_default.py",3563,0,"",python,selection_command
+63,76875,"nemo/collections/llm/recipes/finetune_default.py",3562,0,"",python,selection_command
+64,77033,"nemo/collections/llm/recipes/finetune_default.py",3556,0,"",python,selection_command
+65,77273,"nemo/collections/llm/recipes/finetune_default.py",3562,0,"",python,selection_command
+66,77806,"nemo/collections/llm/recipes/finetune_default.py",3556,0,"",python,selection_command
+67,126985,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",0,0,"# Copyright (c) 2025, NVIDIA CORPORATION. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the ""License"");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an ""AS IS"" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import Optional\n\nimport lightning.pytorch as pl\nimport nemo_run as run\nimport torch\nfrom nemo.collections.common.tokenizers.huggingface.auto_tokenizer import AutoTokenizer\n\nfrom nemo.collections.llm.api import finetune, pretrain\nfrom nemo.collections.llm.gpt.data.mock import MockDataModule\nfrom nemo.collections.llm.peft import PEFT_STR2CLS\nfrom nemo.collections.llm.recipes.finetune_default import default_finetune_recipe\nfrom nemo.collections.llm.recipes.log.default import default_log, default_resume, tensorboard_logger\nfrom nemo.collections.llm.recipes.optim.adam import distributed_fused_adam_with_cosine_annealing\nfrom nemo.collections.llm.recipes.qwen3 import qwen3_model, qwen3_trainer\nfrom nemo.utils.exp_manager import TimingCallback\n\nNAME = ""qwen3_30b_a3b""\n\n\n@run.cli.factory(name=NAME)\ndef model() -> run.Config[pl.LightningModule]:\n """"""\n Factory function to create a Qwen3 30B-A3B model configuration.\n This is a MoE (Mixture of Experts) model with 128 experts.\n\n Returns:\n run.Config[pl.LightningModule]: Configuration for the Qwen3 30B-A3B model.\n\n Examples:\n CLI usage:\n $ nemo llm pretrain model=qwen3_30b_a3b ...\n\n Python API usage:\n >>> model_config = model()\n >>> print(model_config)\n """"""\n return 
qwen3_model(version=NAME)\n\n\n@run.cli.factory(target=pretrain, name=NAME)\ndef pretrain_recipe(\n # General\n dir: Optional[str] = None,\n name: str = ""default"",\n # Trainer\n tensor_parallelism: int = 4, # Default for 30B-A3B model\n pipeline_parallelism: int = 2,\n pipeline_parallelism_type: Optional[torch.dtype] = None,\n virtual_pipeline_parallelism: Optional[int] = None,\n context_parallelism: int = 1,\n expert_parallelism: Optional[int] = 4,\n sequence_parallelism: bool = True,\n num_nodes: int = 1,\n num_gpus_per_node: int = 8,\n max_steps: int = 300000,\n precision: str = ""bf16-mixed"",\n accumulate_grad_batches: int = 1,\n gradient_clip_val: float = 1.0,\n limit_test_batches: int = 32,\n limit_val_batches: int = 32,\n log_every_n_steps: int = 10,\n val_check_interval: int = 500,\n # Data\n global_batch_size=32,\n micro_batch_size=2,\n seq_length=4096,\n # Optimizer\n warmup_steps=500,\n constant_steps=0,\n min_lr=3e-5,\n max_lr=3e-4,\n # Training function\n fn=pretrain,\n) -> run.Partial:\n """"""\n Create a pre-training recipe for Qwen3 30B-A3B model.\n\n This function sets up a complete configuration for pre-training, including\n model, trainer, data, logging, optimization, and resumption settings.\n This model uses Mixture of Experts (MoE) architecture with 128 experts.\n\n Args:\n dir (Optional[str]): Directory for saving logs and checkpoints.\n name (str): Name of the pre-training run.\n tensor_parallelism (int): Degree of tensor model parallelism.\n pipeline_parallelism (int): Degree of pipeline model parallelism.\n pipeline_parallelism_type (Optional[torch.dtype]): Data type for pipeline parallelism.\n virtual_pipeline_parallelism (Optional[int]): Size of virtual pipeline parallelism.\n context_parallelism (int): Degree of context parallelism.\n sequence_parallelism (bool): Whether to use sequence parallelism.\n num_nodes (int): Number of compute nodes to use.\n num_gpus_per_node (int): Number of GPUs per node.\n max_steps (int): Maximum number 
of training steps.\n precision (str): Precision configuration, one of fp32, 16-mixed or bf16-mixed.\n accumulate_grad_batches (int): Number of steps per gradient accumulation.\n gradient_clip_val (float): Value for gradient clipping.\n limit_test_batches (int): Limit the number of test batches.\n limit_val_batches (int): Limit the number of validation batches.\n log_every_n_steps (int): Log every n steps.\n val_check_interval (int): Run validation every N steps.\n global_batch_size (int): Global batch size.\n micro_batch_size (int): Micro batch size.\n seq_length (int): Sequence length.\n warmup_steps (int): Number of warmup steps.\n constant_steps (int): Number of constant steps.\n min_lr (float): Minimum learning rate.\n max_lr (float): Maximum learning rate.\n fn (Callable): The pre-training function to use.\n\n Returns:\n run.Partial: Partial configuration for pre-training.\n\n Examples:\n CLI usage:\n $ nemo llm pretrain --factory qwen3_30b_a3b\n $ nemo llm pretrain --factory ""qwen3_30b_a3b(num_nodes=1, name='my_qwen3_pretrain')""\n\n Python API usage:\n >>> recipe = pretrain_recipe(name=""qwen3_pretrain"", num_nodes=1)\n >>> print(recipe)\n\n Note:\n This recipe uses a mock dataset, look for the finetune examples to see how to change the dataset.\n """"""\n recipe = run.Partial(\n fn,\n model=model(),\n trainer=qwen3_trainer(\n tensor_parallelism=tensor_parallelism,\n pipeline_parallelism=pipeline_parallelism,\n pipeline_parallelism_type=pipeline_parallelism_type,\n virtual_pipeline_parallelism=virtual_pipeline_parallelism,\n context_parallelism=context_parallelism,\n sequence_parallelism=sequence_parallelism,\n expert_parallelism=expert_parallelism,\n num_nodes=num_nodes,\n num_gpus_per_node=num_gpus_per_node,\n max_steps=max_steps,\n precision=precision,\n accumulate_grad_batches=accumulate_grad_batches,\n limit_test_batches=limit_test_batches,\n limit_val_batches=limit_val_batches,\n log_every_n_steps=log_every_n_steps,\n 
val_check_interval=val_check_interval,\n callbacks=[run.Config(TimingCallback)],\n ),\n data=run.Config(\n MockDataModule,\n seq_length=seq_length,\n global_batch_size=global_batch_size,\n micro_batch_size=micro_batch_size,\n tokenizer=run.Config(AutoTokenizer, ""Qwen/Qwen3-30B-A3B""),\n ),\n log=default_log(dir=dir, name=name, tensorboard_logger=tensorboard_logger(name=name)),\n optim=distributed_fused_adam_with_cosine_annealing(\n precision=precision,\n warmup_steps=warmup_steps,\n constant_steps=constant_steps,\n min_lr=min_lr,\n max_lr=max_lr,\n clip_grad=gradient_clip_val,\n ),\n resume=default_resume(),\n )\n recipe.model.config.recompute_granularity = ""full""\n recipe.model.config.recompute_method = ""uniform""\n recipe.model.config.recompute_num_layers = 1\n return recipe\n\n\n@run.cli.factory(target=finetune, name=NAME)\ndef finetune_recipe(\n dir: Optional[str] = None,\n name: str = ""default"",\n num_nodes: int = 1,\n num_gpus_per_node: int = 8,\n peft_scheme: Optional[str] = 'lora',\n packed_sequence: bool = False,\n) -> run.Partial:\n """"""\n Create a fine-tuning recipe for Qwen3 30B-A3B model.\n\n This function sets up a complete configuration for fine-tuning, including\n model, trainer, data, logging, optimization, and resumption settings.\n The recipe uses LoRA (Low-Rank Adaptation) for efficient fine-tuning, unless peft_scheme is set to None.\n This model uses Mixture of Experts (MoE) architecture with 128 experts.\n\n Args:\n dir (Optional[str]): Directory for saving logs and checkpoints.\n name (str): Name of the fine-tuning run.\n num_nodes (int): Number of compute nodes to use.\n num_gpus_per_node (int): Number of GPUs per node.\n peft_scheme (Optional[str]): Name of the peft scheme to use for fine-tuning.\n Allowed values: 'lora'/'dora'/'none'/None.\n packed_sequence (Optional[bool]): Packing multiple training sequences into one long sequence for training\n efficiency. 
Default sequence length is 2048.\n\n Returns:\n run.Partial: Partial configuration for fine-tuning.\n\n Examples:\n CLI usage:\n $ nemo llm finetune --factory qwen3_30b_a3b\n\n Python API usage:\n >>> recipe = finetune_recipe(name=""qwen3_30b_a3b_finetune"", num_nodes=2)\n >>> print(recipe)\n\n Note:\n This recipe uses the SQuAD dataset for fine-tuning.\n """"""\n recipe = default_finetune_recipe(\n model(), ""Qwen/Qwen3-30B-A3B"", dir, name, num_nodes, num_gpus_per_node, packed_sequence\n )\n if peft_scheme is None or peft_scheme.lower() == 'none':\n recipe.trainer.strategy.tensor_model_parallel_size = 4\n recipe.trainer.strategy.expert_model_parallel_size = 4\n recipe.trainer.strategy.expert_tensor_parallel_size = 1\n recipe.trainer.strategy.pipeline_model_parallel_size = 2\n recipe.trainer.strategy.sequence_parallel = True\n recipe.optim.config.lr = 5e-6\n elif peft_scheme.lower() in ['lora', 'dora']:\n recipe.trainer.strategy.tensor_model_parallel_size = 4\n recipe.trainer.strategy.expert_model_parallel_size = 4\n recipe.trainer.strategy.expert_tensor_parallel_size = 1\n recipe.trainer.strategy.sequence_parallel = True\n recipe.peft = run.Config(PEFT_STR2CLS[peft_scheme.lower()])\n recipe.peft.target_modules = ['linear_qkv', 'linear_proj']\n recipe.optim.config.lr = 1e-4\n else:\n raise ValueError(f""Unrecognized peft scheme: {peft_scheme}"")\n return recipe\n",python,tab
+68,128778,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",10041,0,"",python,selection_command
+69,129707,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",10023,0,"",python,selection_command
+70,129956,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9954,0,"",python,selection_command
+71,129988,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9944,0,"",python,selection_command
+72,130024,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9906,0,"",python,selection_command
+73,130056,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9839,0,"",python,selection_command
+74,130090,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9771,0,"",python,selection_command
+75,130124,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9714,0,"",python,selection_command
+76,130155,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9650,0,"",python,selection_command
+77,130192,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9587,0,"",python,selection_command
+78,130221,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9524,0,"",python,selection_command
+79,130254,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9474,0,"",python,selection_command
+80,131114,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9524,0,"",python,selection_command
+81,131364,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9587,0,"",python,selection_command
+82,131396,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9650,0,"",python,selection_command
+83,131429,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9714,0,"",python,selection_command
+84,131461,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9771,0,"",python,selection_command
+85,131637,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9839,0,"",python,selection_command
+86,131781,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9906,0,"",python,selection_command
+87,133094,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9914,0,"",python,selection_command
+88,138742,"nemo/collections/llm/recipes/finetune_default.py",0,0,"",python,tab
+89,142653,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",0,0,"",python,tab
+90,147612,"nemo/collections/llm/recipes/finetune_default.py",0,0,"",python,tab
+91,148892,"nemo/collections/llm/gpt/model/base.py",0,0,"# Copyright (c) 2025, NVIDIA CORPORATION. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the ""License"");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an ""AS IS"" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport contextlib\nimport inspect\nfrom dataclasses import dataclass\nfrom functools import partial\nfrom typing import TYPE_CHECKING, Any, Callable, Literal, Optional, Union\n\nimport lightning.pytorch as L\nimport torch\nimport torch.distributed\nfrom megatron.core.inference.model_inference_wrappers.gpt.gpt_inference_wrapper import GPTInferenceWrapper\nfrom megatron.core.inference.model_inference_wrappers.inference_wrapper_config import InferenceWrapperConfig\nfrom megatron.core.models.gpt.gpt_model import GPTModel as MCoreGPTModel\nfrom megatron.core.optimizer import OptimizerConfig\nfrom megatron.core.transformer.dot_product_attention import DotProductAttention as MCoreDotProductAttention\nfrom megatron.core.transformer.enums import AttnBackend\nfrom megatron.core.transformer.spec_utils import ModuleSpec\nfrom megatron.core.transformer.transformer_config import TransformerConfig\nfrom megatron.core.utils import get_batch_on_this_cp_rank\nfrom torch import nn\n\nfrom nemo.collections.llm import fn\nfrom nemo.lightning import get_vocab_size, io\nfrom nemo.lightning.megatron_parallel import MaskedTokenLossReduction\nfrom nemo.lightning.pytorch.optim import MegatronOptimizerModule, OptimizerModule\nfrom nemo.utils import logging\nfrom nemo.utils.import_utils import safe_import\n\n_, HAVE_TE = 
safe_import(""transformer_engine"")\n\n# Gradient accumulation fusion may be enabled if available, for more information see:\n# https://github.com/NVIDIA/Megatron-LM/blob/01945b98d1ea3a2acb5e8301e181a328104f4856/megatron/core/tensor_parallel/layers.py#L575\n# TODO: Clean this up with a getter and install instructions\n_grad_accum_fusion_available = True\ntry:\n import fused_weight_gradient_mlp_cuda # noqa: F401 # pylint: disable=unused-import\nexcept ImportError:\n _grad_accum_fusion_available = False\n\nif TYPE_CHECKING:\n from transformers import GenerationConfig\n\n from nemo.collections.common.tokenizers.tokenizer_spec import TokenizerSpec\n\n\ndef gpt_data_step(dataloader_iter, use_mtp=False) -> dict[str, torch.Tensor]:\n """"""Process a single batch of data from the dataloader iterator.\n\n This function handles the data loading step for GPT models, managing\n pipeline parallelism by distributing data appropriately across pipeline stages.\n\n Args:\n dataloader_iter: Iterator over the dataloader\n use_mtp: Whether the Multi-Token Prediction Module is used. 
Input needs to be passed\n into the last pipeline stage if mtp is used.\n\n Returns:\n dict[str, torch.Tensor]: Processed batch with required tensors moved to appropriate devices\n """"""\n from megatron.core import parallel_state\n\n # Based on: https://github.com/NVIDIA/Megatron-LM/blob/main/pretrain_gpt.py#L87\n # https://github.com/NVIDIA/NeMo/blob/main/nemo/collections/nlp/models/language_modeling/megatron_gpt_model.py#L828-L842\n\n batch = next(dataloader_iter)\n\n _batch: dict\n if isinstance(batch, tuple) and len(batch) == 3:\n _batch = batch[0]\n else:\n _batch = batch\n\n required_device_keys = set()\n required_host_keys = set()\n\n required_device_keys.add(""attention_mask"")\n if ""cu_seqlens"" in _batch:\n required_device_keys.add(""cu_seqlens"")\n required_host_keys.add(""cu_seqlens_argmin"")\n required_host_keys.add(""max_seqlen"")\n\n if parallel_state.is_pipeline_first_stage() or use_mtp:\n required_device_keys.update((""tokens"", ""position_ids""))\n if parallel_state.is_pipeline_last_stage():\n required_device_keys.update((""labels"", ""loss_mask""))\n\n _batch_required_keys = {}\n for key, val in _batch.items():\n if key in required_device_keys:\n _batch_required_keys[key] = val.cuda(non_blocking=True)\n elif key in required_host_keys:\n _batch_required_keys[key] = val.cpu()\n else:\n _batch_required_keys[key] = None\n\n # slice batch along sequence dimension for context parallelism\n output = get_batch_on_this_cp_rank(_batch_required_keys)\n\n return output\n\n\ndef gpt_forward_step(model, batch) -> torch.Tensor:\n """"""Execute a forward step for the GPT model.\n\n This function prepares the arguments needed for the model's forward pass\n and handles both normal and packed sequence processing.\n\n Args:\n model: The GPT model\n batch: The input batch containing tokens, positions, and other required inputs\n\n Returns:\n torch.Tensor: Output tensor from the model forward pass\n """"""\n forward_args = {\n ""input_ids"": batch[""tokens""],\n 
""position_ids"": batch[""position_ids""],\n ""labels"": batch[""labels""],\n }\n\n if ""attention_mask"" not in batch:\n assert (\n HAVE_TE\n ), ""The dataloader did not provide an attention mask, however Transformer Engine was not detected. \\n This requires Transformer Engine's implementation of fused or flash attention.""\n else:\n forward_args[""attention_mask""] = batch[""attention_mask""]\n\n if ""cu_seqlens"" in batch:\n forward_args[""packed_seq_params""] = get_packed_seq_params(batch)\n\n return model(**forward_args)\n\n\ndef transformer_engine_layer_spec(config: ""GPTConfig"") -> ModuleSpec:\n """"""Create a Transformer Engine layer specification based on the provided config.\n\n Args:\n config: GPT configuration object\n\n Returns:\n ModuleSpec: Module specification for Transformer Engine based layers\n """"""\n from megatron.core.models.gpt import gpt_layer_specs\n\n kwargs = {\n ""num_experts"": config.num_moe_experts,\n ""moe_grouped_gemm"": config.moe_grouped_gemm,\n ""qk_layernorm"": config.qk_layernorm,\n ""fp8"": bool(config.num_moe_experts and (config.fp8 is not None)),\n }\n if getattr(config, ""use_transformer_engine_op_fuser"", None) is not None:\n kwargs[""use_te_op_fuser""] = config.use_transformer_engine_op_fuser\n return gpt_layer_specs.get_gpt_layer_with_transformer_engine_spec(**kwargs)\n\n\ndef transformer_engine_full_layer_spec(config: ""GPTConfig"", vp_stage: Optional[int] = None) -> ModuleSpec:\n """"""Create a full Transformer Engine layer specification with autocast support.\n\n Args:\n config: GPT configuration object\n\n Returns:\n ModuleSpec: Module specification for full TE layers\n """"""\n from nemo.collections.nlp.models.language_modeling.megatron.gpt_full_te_layer_autocast_spec import (\n get_gpt_full_te_layer_autocast_spec,\n )\n\n return get_gpt_full_te_layer_autocast_spec(transformer_config=config, vp_stage=vp_stage)\n\n\ndef local_layer_spec(config: ""GPTConfig"") -> ModuleSpec:\n """"""Create a local layer 
specification without Transformer Engine.\n\n Args:\n config: GPT configuration object\n\n Returns:\n ModuleSpec: Module specification for local implementation layers\n """"""\n from megatron.core.models.gpt import gpt_layer_specs\n\n return gpt_layer_specs.get_gpt_layer_local_spec(\n num_experts=config.num_moe_experts,\n moe_grouped_gemm=config.moe_grouped_gemm,\n qk_layernorm=config.qk_layernorm,\n normalization=config.normalization,\n )\n\n\ndef default_layer_spec(config: ""GPTConfig"", vp_stage: Optional[int] = None) -> ModuleSpec:\n """"""Determine the most appropriate layer specification based on availability.\n\n Uses Transformer Engine specs if available, otherwise falls back to local implementation.\n\n Args:\n config: GPT configuration object\n\n Returns:\n ModuleSpec: The selected module specification\n """"""\n if HAVE_TE:\n if config.use_transformer_engine_full_layer_spec:\n return transformer_engine_full_layer_spec(config, vp_stage=vp_stage)\n else:\n return transformer_engine_layer_spec(config)\n else:\n return local_layer_spec(config)\n\n\ndef mtp_block_spec(config: ""GPTConfig"", vp_stage: Optional[int] = None) -> Optional[ModuleSpec]:\n """"""Pass in the MTP block spec if model has MTP layers.\n\n Args:\n config: GPT configuration object\n\n Returns:\n ModuleSpec: The MTP module specification\n """"""\n if getattr(config, ""mtp_num_layers"", None):\n from megatron.core.models.gpt.gpt_layer_specs import get_gpt_mtp_block_spec\n\n if isinstance(config.transformer_layer_spec, Callable):\n if 'vp_stage' in inspect.signature(config.transformer_layer_spec).parameters:\n spec = config.transformer_layer_spec(config, vp_stage=vp_stage)\n else:\n spec = config.transformer_layer_spec(config)\n else:\n spec = config.transformer_layer_spec\n return get_gpt_mtp_block_spec(config, spec, use_transformer_engine=HAVE_TE, vp_stage=vp_stage)\n else:\n return None\n\n\ndef torch_dtype_from_mcore_config(config: TransformerConfig) -> torch.dtype:\n """"""Extract the 
appropriate torch dtype from a Megatron Core configuration.\n\n Args:\n config: Megatron Core Transformer configuration\n\n Returns:\n torch.dtype: The appropriate torch dtype (float16, bfloat16, or float32)\n """"""\n if config.fp16:\n return torch.float16\n elif config.bf16:\n return torch.bfloat16\n else:\n return torch.float\n\n\ndef torch_dtype_from_dict_config(config: dict[str, Any]) -> torch.dtype:\n """"""Extract the appropriate torch dtype from a dictionary configuration.\n\n Args:\n config: Dictionary containing configuration parameters\n\n Returns:\n torch.dtype: The appropriate torch dtype (float16, bfloat16, or float32)\n """"""\n if config[""fp16""]:\n return torch.float16\n elif config[""bf16""]:\n return torch.bfloat16\n else:\n return torch.float\n\n\n@dataclass\nclass GPTConfig(TransformerConfig, io.IOMixin):\n """"""Configuration class for GPT models.\n\n Extends TransformerConfig with additional parameters specific to GPT models\n and provides utility methods for model configuration.\n """"""\n\n # From megatron.core.models.gpt.gpt_model.GPTModel\n fp16_lm_cross_entropy: bool = False\n parallel_output: bool = True\n share_embeddings_and_output_weights: bool = True\n make_vocab_size_divisible_by: int = 128\n position_embedding_type: Literal[""learned_absolute"", ""rope""] = ""learned_absolute""\n rotary_base: int = 10000\n rotary_percent: float = 1.0\n seq_len_interpolation_factor: Optional[float] = None\n seq_length: int = 1024\n attention_softmax_in_fp32: bool = False\n masked_softmax_fusion: bool = True\n cross_entropy_loss_fusion: bool = True\n gradient_accumulation_fusion: bool = _grad_accum_fusion_available\n deallocate_pipeline_outputs: bool = True\n scatter_embedding_sequence_parallel: bool = True\n tp_only_amax_red: bool = False\n\n use_transformer_engine_full_layer_spec: bool = False\n transformer_layer_spec: Union[ModuleSpec, Callable[[""GPTConfig""], ModuleSpec]] = default_layer_spec\n\n forward_step_fn: Callable = gpt_forward_step\n 
data_step_fn: Callable = gpt_data_step\n generation_config: Optional[""GenerationConfig""] = None\n\n vocab_size: Optional[int] = None\n tp_comm_overlap_cfg: Optional[Union[str, dict[str, Any]]] = None\n\n def configure_model(self, tokenizer, pre_process=None, post_process=None, vp_stage=None) -> ""MCoreGPTModel"":\n """"""Configure and instantiate a Megatron Core GPT model based on this configuration.\n\n Args:\n tokenizer: Tokenizer used with the model\n pre_process: Whether to include pre-processing in the model, defaults to first pipeline stage\n post_process: Whether to include post-processing in the model, defaults to last pipeline stage\n vp_stage: Virtual pipeline stage\n\n Returns:\n MCoreGPTModel: Configured Megatron Core GPT model instance\n """"""\n # Enable per-Transformer layer cuda graph.\n if self.enable_cuda_graph and self.cuda_graph_scope != ""full_iteration"":\n assert HAVE_TE, ""Transformer Engine is required for cudagraphs.""\n assert getattr(self, ""use_te_rng_tracker"", False), (\n ""Transformer engine's RNG tracker is required for cudagraphs, it can be ""\n ""enabled with use_te_rng_tracker=True'.""\n )\n\n vp_size = self.virtual_pipeline_model_parallel_size\n is_pipeline_asymmetric = getattr(self, ""account_for_embedding_in_pipeline_split"", False) or getattr(\n self, ""account_for_loss_in_pipeline_split"", False\n )\n is_pipeline_asymmetric |= (\n getattr(self, ""num_layers_in_first_pipeline_stage"", None)\n or getattr(self, ""num_layers_in_last_pipeline_stage"", None)\n ) is not None\n is_flexible_pp_layout = is_pipeline_asymmetric or (\n getattr(self, ""pipeline_model_parallel_layout"", None) is not None\n )\n if vp_size and not is_flexible_pp_layout:\n p_size = self.pipeline_model_parallel_size\n assert (\n self.num_layers // p_size\n ) % vp_size == 0, ""Make sure the number of model chunks is the same across all pipeline stages.""\n\n import inspect\n\n from megatron.core import parallel_state\n\n # During fake lightning 
initialization, pass 0 to bypass the assertion that vp_stage must be\n # non-None when using virtual pipeline model parallelism\n vp_stage = vp_stage or 0\n\n transformer_layer_spec = self.transformer_layer_spec\n if not isinstance(transformer_layer_spec, ModuleSpec):\n # Check if the transformer_layer_spec function accepts vp_stage parameter\n if 'vp_stage' in inspect.signature(transformer_layer_spec).parameters:\n transformer_layer_spec = transformer_layer_spec(self, vp_stage=vp_stage)\n else:\n transformer_layer_spec = transformer_layer_spec(self)\n\n if self.vocab_size is not None:\n vocab_size = self.vocab_size\n if tokenizer is not None:\n logging.info(\n f""Use preset vocab_size: {vocab_size}, original vocab_size: {tokenizer.vocab_size}, dummy tokens:""\n f"" {vocab_size - tokenizer.vocab_size}.""\n )\n else:\n vocab_size = get_vocab_size(self, tokenizer.vocab_size, self.make_vocab_size_divisible_by)\n # Initialize model as meta data instead of allocating data on a device\n model_init_device_context = contextlib.nullcontext\n if self.init_model_with_meta_device:\n model_init_device_context = partial(torch.device, device='meta')\n\n if 'mtp_block_spec' in inspect.signature(MCoreGPTModel.__init__).parameters:\n kwargs = {""mtp_block_spec"": mtp_block_spec(self, vp_stage=vp_stage)}\n else:\n kwargs = {}\n\n if self.attention_backend == AttnBackend.local:\n if hasattr(transformer_layer_spec, 'submodules'):\n transformer_layer_spec.submodules.self_attention.submodules.core_attention = MCoreDotProductAttention\n with model_init_device_context():\n model = MCoreGPTModel(\n self,\n transformer_layer_spec=transformer_layer_spec,\n vocab_size=vocab_size,\n max_sequence_length=self.seq_length,\n fp16_lm_cross_entropy=self.fp16_lm_cross_entropy,\n parallel_output=self.parallel_output,\n share_embeddings_and_output_weights=self.share_embeddings_and_output_weights,\n position_embedding_type=self.position_embedding_type,\n rotary_percent=self.rotary_percent,\n 
rotary_base=self.rotary_base,\n seq_len_interpolation_factor=self.seq_len_interpolation_factor,\n pre_process=pre_process\n or parallel_state.is_pipeline_first_stage(ignore_virtual=False, vp_stage=vp_stage),\n post_process=post_process\n or parallel_state.is_pipeline_last_stage(ignore_virtual=False, vp_stage=vp_stage),\n scatter_embedding_sequence_parallel=self.scatter_embedding_sequence_parallel,\n vp_stage=vp_stage,\n **kwargs,\n )\n\n # If using full TE layer, need to set TP, CP group since the module call\n # is not routed through megatron core, which normally handles passing the\n # TP, CP group to the TE modules.\n # Deep iterate but skip self to avoid infinite recursion.\n if HAVE_TE and self.use_transformer_engine_full_layer_spec:\n # Copied from:\n # https://github.com/NVIDIA/TransformerEngine/blob/main/transformer_engine/pytorch/transformer.py\n if parallel_state.get_tensor_model_parallel_world_size() > 1:\n for index, child in enumerate(model.modules()):\n if index == 0:\n continue\n if hasattr(child, ""set_tensor_parallel_group""):\n tp_group = parallel_state.get_tensor_model_parallel_group()\n child.set_tensor_parallel_group(tp_group)\n\n if parallel_state.get_context_parallel_world_size() > 1:\n cp_stream = torch.cuda.Stream()\n for module in self.get_model_module_list():\n for index, child in enumerate(module.modules()):\n if index == 0:\n continue\n if hasattr(child, ""set_context_parallel_group""):\n child.set_context_parallel_group(\n parallel_state.get_context_parallel_group(),\n parallel_state.get_context_parallel_global_ranks(),\n cp_stream,\n )\n\n return model\n\n\n@dataclass\nclass GPTConfig126M(GPTConfig):\n """"""Configuration for a 126M parameter GPT model.\n\n Predefined configuration for a small GPT model with 12 layers,\n 768 hidden size, and 12 attention heads.\n """"""\n\n seq_length: int = 2048\n num_layers: int = 12\n hidden_size: int = 768\n ffn_hidden_size: int = 3072\n num_attention_heads: int = 12\n bias_activation_fusion: bool 
= True\n bias_dropout_add_fusion: bool = True\n use_transformer_engine_full_layer_spec: bool = True\n\n\n@dataclass\nclass GPTConfig5B(GPTConfig):\n """"""Configuration for a 5B parameter GPT model.\n\n Predefined configuration for a medium-sized GPT model with 24 layers,\n 4096 hidden size, and 32 attention heads.\n """"""\n\n seq_length: int = 2048\n num_layers: int = 24\n hidden_size: int = 4096\n ffn_hidden_size: int = 16384\n num_attention_heads: int = 32\n bias_activation_fusion: bool = True\n bias_dropout_add_fusion: bool = True\n use_transformer_engine_full_layer_spec: bool = True\n\n\n@dataclass\nclass GPTConfig7B(GPTConfig):\n """"""Configuration for a 7B parameter GPT model.\n\n Predefined configuration for a medium-sized GPT model with 32 layers,\n 4096 hidden size, and 32 attention heads.\n """"""\n\n seq_length: int = 2048\n num_layers: int = 32\n hidden_size: int = 4096\n ffn_hidden_size: int = 10880\n num_attention_heads: int = 32\n bias_activation_fusion: bool = True\n bias_dropout_add_fusion: bool = True\n use_transformer_engine_full_layer_spec: bool = True\n\n\n@dataclass\nclass GPTConfig20B(GPTConfig):\n """"""Configuration for a 20B parameter GPT model.\n\n Predefined configuration for a large GPT model with 44 layers,\n 6144 hidden size, and 48 attention heads.\n """"""\n\n seq_length: int = 2048\n num_layers: int = 44\n hidden_size: int = 6144\n ffn_hidden_size: int = 24576\n num_attention_heads: int = 48\n bias_activation_fusion: bool = True\n bias_dropout_add_fusion: bool = True\n use_transformer_engine_full_layer_spec: bool = True\n\n\n@dataclass\nclass GPTConfig40B(GPTConfig):\n """"""Configuration for a 40B parameter GPT model.\n\n Predefined configuration for a large GPT model with 48 layers,\n 8192 hidden size, and 64 attention heads.\n """"""\n\n seq_length: int = 2048\n num_layers: int = 48\n hidden_size: int = 8192\n ffn_hidden_size: int = 32768\n num_attention_heads: int = 64\n bias_activation_fusion: bool = True\n 
bias_dropout_add_fusion: bool = True\n use_transformer_engine_full_layer_spec: bool = True\n\n\n@dataclass\nclass GPTConfig175B(GPTConfig):\n """"""Configuration for a 175B parameter GPT model.\n\n Predefined configuration for a massive GPT model with 96 layers,\n 12288 hidden size, and 96 attention heads.\n """"""\n\n seq_length: int = 2048\n num_layers: int = 96\n hidden_size: int = 12288\n ffn_hidden_size: int = 49152\n num_attention_heads: int = 96\n hidden_dropout: float = 0.0\n attention_dropout: float = 0.0\n bias_activation_fusion: bool = True\n bias_dropout_add_fusion: bool = True\n use_transformer_engine_full_layer_spec: bool = True\n layernorm_zero_centered_gamma: bool = True\n\n\nclass GPTModel(L.LightningModule, io.IOMixin, io.ConnectorMixin, fn.FNMixin):\n """"""GPT model implementation using Megatron Core and PyTorch Lightning.\n\n This class provides a high-level interface for training and using GPT models\n with proper integration with NeMo's infrastructure.\n """"""\n\n def __init__(\n self,\n config: GPTConfig,\n # TODO: Add transformer_layer_spec when we update mcore\n optim: Optional[OptimizerModule] = None,\n tokenizer: Optional[""TokenizerSpec""] = None,\n model_transform: Optional[Callable[[nn.Module], nn.Module]] = None,\n model_context_managers: Optional[list] = [],\n ):\n """"""Initialize the GPT model.\n\n Args:\n config: Configuration for the GPT model\n optim: Optional optimizer module\n tokenizer: Optional tokenizer specification\n model_transform: Optional function to transform the model after initialization\n model_context_managers: Optional list of context managers to apply when configuring and instantiating\n the model.\n """"""\n super().__init__()\n self.config = config\n self.tokenizer = tokenizer\n self.optim = optim or MegatronOptimizerModule(config=OptimizerConfig(lr=1e-4, use_distributed_optimizer=True))\n self.optim.connect(self) # This will bind the `configure_optimizers` method\n self.model_transform = model_transform\n 
self.model_context_managers = model_context_managers\n self._training_loss_reduction = None\n self._validation_loss_reduction = None\n\n def configure_model(self, vp_stage: Optional[int] = None) -> None:\n """"""Configure the underlying model if not already configured.\n\n This method ensures the model is instantiated from the configuration.\n """"""\n from nemo.collections.llm.modelopt.model_utils import restore_modelopt_state\n\n if not hasattr(self, ""module""):\n with contextlib.ExitStack() as stack:\n # Apply requested context managers for this block\n for cm in self.model_context_managers:\n stack.enter_context(cm)\n\n self.module = self.config.configure_model(self.tokenizer, vp_stage=vp_stage)\n\n # Restore ModelOpt state if it exists.\n # NOTE: Also called in MegatronStrategy.load_checkpoint but we do it for GPTModel here first,\n # for transformations which add new parameters to the model that need to be included in the optimizer.\n # TODO: Add to other models when needed.\n restore_modelopt_state(self.module, trainer=self._trainer) # `self.trainer` throws exception if not set\n\n def forward(\n self,\n input_ids: torch.Tensor,\n position_ids: torch.Tensor,\n attention_mask: Optional[torch.Tensor] = None,\n labels: Optional[torch.Tensor] = None,\n decoder_input: Optional[torch.Tensor] = None,\n inference_context=None,\n packed_seq_params=None,\n ) -> torch.Tensor:\n """"""Forward pass through the GPT model.\n\n Args:\n input_ids: Input token IDs\n position_ids: Position IDs for the input\n attention_mask: Optional attention mask\n labels: Optional labels for computing loss\n decoder_input: Optional decoder input\n inference_context: Optional parameters for inference\n packed_seq_params: Optional parameters for packed sequence processing\n\n Returns:\n torch.Tensor: Output tensor from the model\n """"""\n extra_kwargs = {""packed_seq_params"": packed_seq_params} if packed_seq_params is not None else {}\n output_tensor = self.module(\n input_ids,\n 
position_ids,\n attention_mask,\n decoder_input=decoder_input,\n labels=labels,\n inference_context=inference_context,\n **extra_kwargs,\n )\n\n return output_tensor\n\n def data_step(self, dataloader_iter) -> dict[str, torch.Tensor]:\n """"""Process a batch of data from the dataloader.\n\n Args:\n dataloader_iter: Iterator over the dataloader\n\n Returns:\n dict[str, torch.Tensor]: Processed batch\n """"""\n return self.config.data_step_fn(dataloader_iter)\n\n def forward_step(self, batch) -> torch.Tensor:\n """"""Execute a forward step using the provided batch.\n\n Args:\n batch: Input batch\n\n Returns:\n torch.Tensor: Output from the forward pass\n """"""\n return self.config.forward_step_fn(self, batch)\n\n def training_step(self, batch, batch_idx=None) -> torch.Tensor:\n """"""Execute a training step.\n\n Args:\n batch: Input batch\n batch_idx: Optional batch index\n\n Returns:\n torch.Tensor: Loss value\n """"""\n # In mcore the loss-function is part of the forward-pass (when labels are provided)\n return self.forward_step(batch)\n\n def validation_step(self, batch, batch_idx=None) -> torch.Tensor:\n """"""Execute a validation step.\n\n Args:\n batch: Input batch\n batch_idx: Optional batch index\n\n Returns:\n torch.Tensor: Loss value\n """"""\n # In mcore the loss-function is part of the forward-pass (when labels are provided)\n\n return self.forward_step(batch)\n\n def get_inference_wrapper(\n self,\n params_dtype: torch.dtype,\n inference_batch_times_seqlen_threshold: int,\n inference_max_seq_length: int = 2560,\n ) -> GPTInferenceWrapper:\n """"""Get an inference wrapper for the model.\n\n Creates and configures a GPTInferenceWrapper around the model for efficient inference.\n\n Args:\n params_dtype: Data type for parameters\n inference_batch_times_seqlen_threshold: Threshold for optimizing inference\n inference_max_seq_length: Maximum sequence length for inference (prefill and decode)\n\n Returns:\n GPTInferenceWrapper: Wrapped model for inference\n 
""""""\n # This is to get the MCore model required in GPTInferenceWrapper.\n mcore_model = self.module\n while mcore_model:\n if type(mcore_model) is MCoreGPTModel:\n break\n mcore_model = getattr(mcore_model, ""module"", None)\n if mcore_model is None or type(mcore_model) is not MCoreGPTModel:\n raise ValueError(""Exact McoreGPTModel instance not found in the model structure."")\n\n vocab_size = None\n if hasattr(self.config, ""vocab_size""):\n vocab_size = self.config.vocab_size\n elif self.tokenizer is not None:\n vocab_size = self.tokenizer.vocab_size\n else:\n raise ValueError(\n ""Unable to find vocab size.""\n "" Either pass in a tokenizer with vocab size, or set vocab size in the model config""\n )\n\n inference_wrapper_config = InferenceWrapperConfig(\n hidden_size=mcore_model.config.hidden_size,\n params_dtype=params_dtype,\n inference_batch_times_seqlen_threshold=inference_batch_times_seqlen_threshold,\n padded_vocab_size=vocab_size,\n inference_max_seq_length=inference_max_seq_length,\n )\n\n model_inference_wrapper = GPTInferenceWrapper(mcore_model, inference_wrapper_config)\n return model_inference_wrapper\n\n @property\n def training_loss_reduction(self) -> MaskedTokenLossReduction:\n """"""Get the loss reduction module for training.\n\n Returns:\n MaskedTokenLossReduction: Loss reduction module for training\n """"""\n if not self._training_loss_reduction:\n self._training_loss_reduction = MaskedTokenLossReduction()\n\n return self._training_loss_reduction\n\n @property\n def validation_loss_reduction(self) -> MaskedTokenLossReduction:\n """"""Get the loss reduction module for validation.\n\n Returns:\n MaskedTokenLossReduction: Loss reduction module for validation\n """"""\n if not self._validation_loss_reduction:\n self._validation_loss_reduction = MaskedTokenLossReduction(validation_step=True)\n\n return self._validation_loss_reduction\n\n\ndef get_packed_seq_params(batch):\n """"""Extract packed sequence parameters from the batch.\n\n Creates and 
returns a PackedSeqParams object with appropriate parameters\n for packed sequence processing.\n\n Args:\n batch: Input batch containing packed sequence information\n\n Returns:\n PackedSeqParams: Parameters for packed sequence processing\n """"""\n from megatron.core.packed_seq_params import PackedSeqParams\n\n cu_seqlens = batch[""cu_seqlens""].squeeze() # remove batch size dimension (mbs=1)\n # remove -1 ""paddings"" added in collate_fn\n if (cu_seqlens_argmin := batch.get(""cu_seqlens_argmin"", None)) is not None:\n # pre-compute cu_seqlens_argmin in dataset class for perf\n cu_seqlens = cu_seqlens[: cu_seqlens_argmin.item()]\n else:\n cu_seqlens = cu_seqlens[: torch.argmin(cu_seqlens)]\n\n # pre-compute max_seqlens in dataset class for perf\n max_seqlen = batch[""max_seqlen""].squeeze() if ""max_seqlen"" in batch else None\n\n # these args are passed eventually into TEDotProductAttention.forward()\n return PackedSeqParams(\n cu_seqlens_q=cu_seqlens,\n cu_seqlens_kv=cu_seqlens,\n max_seqlen_q=max_seqlen,\n max_seqlen_kv=max_seqlen,\n qkv_format=""thd"",\n )\n\n\n__all__ = [\n ""GPTModel"",\n ""GPTConfig"",\n ""gpt_data_step"",\n ""gpt_forward_step"",\n ""transformer_engine_layer_spec"",\n ""local_layer_spec"",\n]\n",python,tab
+92,150194,"nemo/collections/llm/recipes/finetune_default.py",0,0,"",python,tab
+93,151318,"tests/lightning/_io/artifacts/model.yaml",0,0,"",yaml,tab
+94,257319,"tests/lightning/_io/artifacts/model.yaml",5011,0,"",yaml,selection_command
+95,304285,"nemo/collections/llm/recipes/finetune_default.py",0,0,"",python,tab
+96,330219,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",0,0,"",python,tab
+97,332288,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9906,37," recipe.optim.config.lr = 1e-4",python,selection_command
+98,338831,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9914,0,"",python,selection_command
+99,339185,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9920,0,"",python,selection_command
+100,339335,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9921,0,"",python,selection_command
+101,339493,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9926,0,"",python,selection_command
+102,339653,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9927,0,"",python,selection_command
+103,339824,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9933,0,"",python,selection_command
+104,340000,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9934,0,"",python,selection_command
+105,573910,"nemo/collections/llm/recipes/finetune_default.py",0,0,"",python,tab
+106,577248,"tests/lightning/_io/artifacts/model.yaml",0,0,"",yaml,tab
+107,578545,"tests/lightning/_io/artifacts/model.yaml",586,0,"",yaml,selection_command
+108,579235,"tests/lightning/_io/artifacts/model.yaml",609,0,"",yaml,selection_command
+109,579372,"tests/lightning/_io/artifacts/model.yaml",2913,0,"",yaml,selection_command
+110,580286,"tests/lightning/_io/artifacts/model.yaml",586,0,"",yaml,selection_command
+111,581680,"tests/lightning/_io/artifacts/model.yaml",2913,0,"",yaml,selection_command
+112,582834,"tests/lightning/_io/artifacts/model.yaml",586,0,"",yaml,selection_command
+113,583271,"tests/lightning/_io/artifacts/model.yaml",609,0,"",yaml,selection_command
+114,610927,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",0,0,"",python,tab
+115,613125,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2771,0,"",python,selection_command
+116,614450,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",4563,0,"",python,selection_command
+117,615658,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2771,0,"",python,selection_command
+118,626529,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",4563,0,"",python,selection_command
+119,627164,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",4588,0,"",python,selection_command
+120,627532,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",6545,0,"",python,selection_command
+121,629265,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",6563,0,"",python,selection_command
+122,629405,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2771,0,"",python,selection_command
+123,631920,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",10041,0,"",python,selection_command
+124,6778183,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",10023,0,"",python,selection_command
+125,6778290,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9954,0,"",python,selection_command
+126,6778454,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9944,0,"",python,selection_command
+127,6778572,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9906,0,"",python,selection_command
+128,6784835,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",882,0,"",python,selection_command
+129,6785355,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",903,0,"",python,selection_command
+130,6785646,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2762,0,"",python,selection_command
+131,6786371,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",3157,0,"",python,selection_command
+132,6786945,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",3623,0,"",python,selection_command
+133,6800958,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",3549,0,"",python,selection_command
+134,6801206,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",3479,0,"",python,selection_command
+135,6801236,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",3418,0,"",python,selection_command
+136,6801271,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",3357,0,"",python,selection_command
+137,6801308,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",3296,0,"",python,selection_command
+138,6801336,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",3287,0,"",python,selection_command
+139,6801369,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",3270,0,"",python,selection_command
+140,6801404,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",3196,0,"",python,selection_command
+141,6801436,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",3117,0,"",python,selection_command
+142,6801469,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",3057,0,"",python,selection_command
+143,6801503,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",3055,0,"",python,selection_command
+144,6801536,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2997,0,"",python,selection_command
+145,6801569,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2989,0,"",python,selection_command
+146,6801603,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2971,0,"",python,selection_command
+147,6801636,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2954,0,"",python,selection_command
+148,6801669,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2930,0,"",python,selection_command
+149,6801703,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2913,0,"",python,selection_command
+150,6801740,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2896,0,"",python,selection_command
+151,6801772,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2874,0,"",python,selection_command
+152,6801804,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2852,0,"",python,selection_command
+153,6801841,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2836,0,"",python,selection_command
+154,6801869,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2815,0,"",python,selection_command
+155,6801902,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2791,0,"",python,selection_command
+156,6801936,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2765,0,"",python,selection_command
+157,6801969,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2754,0,"",python,selection_command
+158,6802293,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2765,0,"",python,selection_command
+159,6802472,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2791,0,"",python,selection_command
+160,6802949,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2767,0,"",python,selection_command
+161,6803237,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2771,0,"",python,selection_command
+162,6803687,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",4563,0,"",python,selection_command
+163,6804415,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",6545,0,"",python,selection_command
+164,6817637,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",6563,0,"",python,selection_command
+165,6817841,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2771,0,"",python,selection_command
+166,6818636,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",4563,0,"",python,selection_command
+167,6819065,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",6545,0,"",python,selection_command
+168,6819491,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",6563,0,"",python,selection_command
+169,6819699,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2771,0,"",python,selection_command
+170,6824996,"nemo/collections/llm/recipes/qwen3.py",0,0,"# Copyright (c) 2025, NVIDIA CORPORATION. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the ""License"");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an ""AS IS"" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import Optional\n\nimport lightning.pytorch as pl\nimport nemo_run as run\nimport torch\nfrom lightning.pytorch.callbacks.callback import Callback\nfrom megatron.core.distributed import DistributedDataParallelConfig\n\nfrom nemo import lightning as nl\nfrom nemo.collections.llm.gpt.model.qwen3 import (\n Qwen3Config1P7B,\n Qwen3Config4B,\n Qwen3Config8B,\n Qwen3Config14B,\n Qwen3Config30B_A3B,\n Qwen3Config32B,\n Qwen3Config235B_A22B,\n Qwen3Config600M,\n Qwen3Model,\n)\nfrom nemo.collections.llm.recipes.precision.mixed_precision import bf16_mixed, fp16_mixed\n\n\ndef qwen3_model(version: str) -> run.Config[pl.LightningModule]:\n """"""\n A function to create a qwen3 models.\n\n Args:\n version (str): The version of the qwen3 model to create. 
One of [""qwen3_600m"", ""qwen3_1p7b"",\n ""qwen3_4b"", ""qwen3_8b"", ""qwen3_14b"", ""qwen3_32b"", ""qwen3_30b_a3b"", ""qwen3_235b_a22b""].\n\n Returns:\n run.Config[pl.LightningModule]: Configuration for the qwen3 model.\n """"""\n config = None\n if version == ""qwen3_600m"":\n config = run.Config(Qwen3Config600M)\n elif version == ""qwen3_1p7b"":\n config = run.Config(Qwen3Config1P7B)\n elif version == ""qwen3_4b"":\n config = run.Config(Qwen3Config4B)\n elif version == ""qwen3_8b"":\n config = run.Config(Qwen3Config8B)\n elif version == ""qwen3_14b"":\n config = run.Config(Qwen3Config14B)\n elif version == ""qwen3_32b"":\n config = run.Config(Qwen3Config32B)\n elif version == ""qwen3_30b_a3b"":\n config = run.Config(Qwen3Config30B_A3B)\n elif version == ""qwen3_235b_a22b"":\n config = run.Config(Qwen3Config235B_A22B)\n\n assert config is not None, f""Invalid version: {version}""\n return run.Config(Qwen3Model, config=config)\n\n\ndef qwen3_trainer(\n tensor_parallelism: int = 1,\n pipeline_parallelism: int = 1,\n pipeline_parallelism_type: Optional[torch.dtype] = None,\n virtual_pipeline_parallelism: Optional[int] = None,\n context_parallelism: int = 1,\n sequence_parallelism: bool = False,\n expert_parallelism: int = 1,\n account_for_embedding_in_pipeline_split: bool = False,\n account_for_loss_in_pipeline_split: bool = False,\n num_nodes: int = 1,\n num_gpus_per_node: int = 8,\n max_steps: int = 1168251,\n precision: str = ""bf16-mixed"",\n accumulate_grad_batches: int = 1,\n limit_test_batches: int = 32,\n limit_val_batches: int = 32,\n log_every_n_steps: int = 10,\n val_check_interval: int = 2000,\n callbacks: Optional[list[run.Config[Callback]]] = None,\n) -> run.Config[nl.Trainer]:\n """"""\n Configure the NeMo Lightning Trainer for qwen3 models.\n\n This function sets up the distributed training strategy and other training parameters.\n\n Args:\n tensor_parallelism (int): Degree of tensor model parallelism.\n pipeline_parallelism (int): Degree of 
pipeline model parallelism.\n        pipeline_parallelism_type (Optional[torch.dtype]): Data type for pipeline parallelism.\n        virtual_pipeline_parallelism (Optional[int]): Size of virtual pipeline parallelism.\n        context_parallelism (int): Degree of context parallelism.\n        sequence_parallelism (bool): Whether to use sequence parallelism.\n        expert_parallelism (int): Degree of expert parallelism.\n        account_for_embedding_in_pipeline_split (bool): Whether to treat input embedding layer as a standard\n            transformer layer in the context of partition and placement for pipeline parallelism.\n        account_for_loss_in_pipeline_split (bool): Whether to treat loss layer as a standard transformer\n            layer in the context of partition and placement for pipeline parallelism.\n        num_nodes (int): Number of compute nodes to use.\n        num_gpus_per_node (int): Number of GPUs per node.\n        max_steps (int): Maximum number of training steps.\n        precision (str): Precision configuration, one of fp32, 16-mixed or bf16-mixed.\n        accumulate_grad_batches (int): Number of steps per gradient accumulation.\n        limit_test_batches (int): Limit the number of test batches.\n        limit_val_batches (int): Limit the number of validation batches.\n        log_every_n_steps (int): Log every n steps.\n        val_check_interval (int): Run validation every N steps.\n        callbacks (Optional[list[run.Config[Callback]]]): List of callback configurations.\n\n    Returns:\n        run.Config[nl.Trainer]: Configuration for the NeMo Lightning Trainer.\n    """"""\n    strategy = run.Config(\n        nl.MegatronStrategy,\n        tensor_model_parallel_size=tensor_parallelism,\n        pipeline_model_parallel_size=pipeline_parallelism,\n        pipeline_dtype=pipeline_parallelism_type,\n        virtual_pipeline_model_parallel_size=virtual_pipeline_parallelism,\n        context_parallel_size=context_parallelism,\n        sequence_parallel=sequence_parallelism,\n        expert_model_parallel_size=expert_parallelism,\n        expert_tensor_parallel_size=1,\n        
account_for_embedding_in_pipeline_split=account_for_embedding_in_pipeline_split,\n account_for_loss_in_pipeline_split=account_for_loss_in_pipeline_split,\n gradient_as_bucket_view=True,\n ckpt_include_optimizer=True,\n ckpt_async_save=True,\n ckpt_parallel_load=True,\n ddp=run.Config(\n DistributedDataParallelConfig,\n check_for_nan_in_grad=True,\n grad_reduce_in_fp32=True,\n overlap_grad_reduce=True,\n overlap_param_gather=True,\n average_in_collective=True, # Not supported for custom FSDP for now, need to be set to False if using FSDP\n data_parallel_sharding_strategy=""optim_grads_params"", # For custom FSDP only\n ),\n )\n\n precision_plugin = None\n if precision == ""16-mixed"":\n precision_plugin = fp16_mixed()\n elif precision == ""bf16-mixed"":\n precision_plugin = bf16_mixed()\n\n trainer = run.Config(\n nl.Trainer,\n accelerator=""gpu"",\n callbacks=callbacks,\n devices=num_gpus_per_node,\n accumulate_grad_batches=accumulate_grad_batches,\n limit_test_batches=limit_test_batches,\n limit_val_batches=limit_val_batches,\n log_every_n_steps=log_every_n_steps,\n max_steps=max_steps,\n num_nodes=num_nodes,\n plugins=precision_plugin,\n strategy=strategy,\n use_distributed_sampler=False,\n val_check_interval=val_check_interval,\n )\n\n return trainer\n",python,tab
+171,6825512,"nemo/collections/llm/recipes/qwen3.py",0,0,"",python,selection_command
+172,6828239,"nemo/collections/llm/gpt/model/qwen3.py",0,0,"# Copyright (c) 2025, NVIDIA CORPORATION. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the ""License"");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an ""AS IS"" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom dataclasses import dataclass\nfrom functools import cached_property, partial\nfrom pathlib import Path\nfrom typing import TYPE_CHECKING, Annotated, Callable, Optional\n\nimport torch\nfrom torch import nn\n\nfrom nemo.collections.llm.gpt.model.base import GPTModel, torch_dtype_from_mcore_config\nfrom nemo.collections.llm.gpt.model.qwen2 import Qwen2Config\nfrom nemo.collections.llm.utils import Config\nfrom nemo.lightning import OptimizerModule, io, teardown\nfrom nemo.lightning.io.state import TransformFns\nfrom nemo.lightning.pytorch.utils import dtype_from_hf\n\nif TYPE_CHECKING:\n from transformers import AutoModelForCausalLM\n from transformers import Qwen3Config as HFQwen3Config\n\n from nemo.collections.common.tokenizers.huggingface.auto_tokenizer import AutoTokenizer\n from nemo.collections.common.tokenizers.tokenizer_spec import TokenizerSpec\n\n\n@dataclass\nclass Qwen3Config(Qwen2Config):\n """"""\n Base config for Qwen 3 Models\n """"""\n\n add_qkv_bias: bool = False\n qk_layernorm: bool = True\n kv_channels: Optional[int] = 128\n num_query_groups: int = 8\n max_position_embeddings: int = 40960\n vocab_size: int = 151936\n\n\n@dataclass\nclass Qwen3MoEConfig(Qwen3Config):\n """"""\n Base config for Qwen 3 MoE Models\n """"""\n\n num_moe_experts: int = 128\n 
moe_router_load_balancing_type: str = ""aux_loss""\n moe_aux_loss_coeff: float = 1e-3\n moe_router_topk: int = 8\n moe_router_pre_softmax: bool = False\n moe_grouped_gemm: bool = True\n moe_token_dispatcher_type: str = ""alltoall""\n moe_permute_fusion: bool = True\n\n\n@dataclass\nclass Qwen3Config600M(Qwen3Config):\n """"""\n Config for Qwen 3 0.6B: https://huggingface.co/Qwen/Qwen3-0.6B\n """"""\n\n num_layers: int = 28\n hidden_size: int = 1024\n num_attention_heads: int = 16\n ffn_hidden_size: int = 3072\n share_embeddings_and_output_weights: bool = True\n\n\n@dataclass\nclass Qwen3Config1P7B(Qwen3Config):\n """"""\n Config for Qwen 3 1.7B: https://huggingface.co/Qwen/Qwen3-1.7B\n """"""\n\n num_layers: int = 28\n hidden_size: int = 2048\n num_attention_heads: int = 16\n ffn_hidden_size: int = 6144\n share_embeddings_and_output_weights: bool = True\n\n\n@dataclass\nclass Qwen3Config4B(Qwen3Config):\n """"""\n Config for Qwen 3 4B: https://huggingface.co/Qwen/Qwen3-4B\n """"""\n\n num_layers: int = 36\n hidden_size: int = 2560\n num_attention_heads: int = 32\n ffn_hidden_size: int = 9728\n share_embeddings_and_output_weights: bool = True\n\n\n@dataclass\nclass Qwen3Config8B(Qwen3Config):\n """"""\n Config for Qwen 3 8B: https://huggingface.co/Qwen/Qwen3-8B\n """"""\n\n num_layers: int = 36\n hidden_size: int = 4096\n num_attention_heads: int = 32\n ffn_hidden_size: int = 12288\n\n\n@dataclass\nclass Qwen3Config14B(Qwen3Config):\n """"""\n Config for Qwen 3 14B: https://huggingface.co/Qwen/Qwen3-14B\n """"""\n\n num_layers: int = 40\n hidden_size: int = 5120\n num_attention_heads: int = 40\n ffn_hidden_size: int = 17408\n\n\n@dataclass\nclass Qwen3Config32B(Qwen3Config):\n """"""\n Config for Qwen 3 32B: https://huggingface.co/Qwen/Qwen3-32B\n """"""\n\n num_layers: int = 64\n hidden_size: int = 5120\n num_attention_heads: int = 64\n ffn_hidden_size: int = 25600\n\n\n@dataclass\nclass Qwen3Config30B_A3B(Qwen3MoEConfig):\n """"""\n Config for Qwen 3 30B-A3B: 
https://huggingface.co/Qwen/Qwen3-30B-A3B\n """"""\n\n num_layers: int = 48\n hidden_size: int = 2048\n num_attention_heads: int = 32\n num_query_groups: int = 4\n ffn_hidden_size: int = 6144\n moe_ffn_hidden_size: int = 768\n\n\n@dataclass\nclass Qwen3Config235B_A22B(Qwen3MoEConfig):\n """"""\n Config for Qwen 3 235B-A22B: https://huggingface.co/Qwen/Qwen3-235B-A22B\n """"""\n\n num_layers: int = 94\n hidden_size: int = 4096\n num_attention_heads: int = 64\n num_query_groups: int = 4\n ffn_hidden_size: int = 12288\n moe_ffn_hidden_size: int = 1536\n\n\nclass Qwen3Model(GPTModel):\n """"""\n Base model for Qwen 3\n """"""\n\n def __init__(\n self,\n config: Annotated[Optional[Qwen3Config], Config[Qwen3Config]] = None,\n optim: Optional[OptimizerModule] = None,\n tokenizer: Optional[""TokenizerSpec""] = None,\n model_transform: Optional[Callable[[nn.Module], nn.Module]] = None,\n ):\n super().__init__(config or Qwen3Config(), optim=optim, tokenizer=tokenizer, model_transform=model_transform)\n\n\n@io.model_importer(Qwen3Model, ""hf"")\nclass HFQwen3Importer(io.ModelConnector[""AutoModelForCausalLM"", Qwen3Model]):\n # pylint: disable=C0115,C0116\n def init(self) -> Qwen3Model:\n return Qwen3Model(self.config, tokenizer=self.tokenizer)\n\n def apply(self, output_path: Path) -> Path:\n from transformers import AutoModelForCausalLM\n\n # logging.setLevel(logging.DEBUG)\n source = AutoModelForCausalLM.from_pretrained(str(self), torch_dtype='auto', trust_remote_code=True)\n target = self.init()\n trainer = self.nemo_setup(target)\n self.convert_state(source, target)\n self.nemo_save(output_path, trainer)\n\n print(f""Converted Qwen 3 model to Nemo, model saved to {output_path}"")\n\n teardown(trainer, target)\n del trainer, target\n\n return output_path\n\n def convert_state(self, source, target):\n mapping = {\n ""model.embed_tokens.weight"": ""embedding.word_embeddings.weight"",\n ""**.self_attn.o_proj.weight"": ""**.self_attention.linear_proj.weight"",\n 
""**.self_attn.q_norm.weight"": ""**.self_attention.q_layernorm.weight"",\n ""**.self_attn.k_norm.weight"": ""**.self_attention.k_layernorm.weight"",\n ""**.input_layernorm.weight"": ""**.self_attention.linear_qkv.layer_norm_weight"",\n ""model.norm.weight"": ""decoder.final_layernorm.weight"",\n ""lm_head.weight"": ""output_layer.weight"",\n }\n is_moe = self.config.num_moe_experts is not None\n if is_moe:\n mapping.update(\n {\n ""**.mlp.experts.*.down_proj.weight"": ""**.mlp.experts.linear_fc2.weight*"",\n ""**.mlp.gate.weight"": ""**.mlp.router.weight"",\n ""**.post_attention_layernorm.weight"": ""**.pre_mlp_layernorm.weight"",\n }\n )\n else:\n mapping.update(\n {\n ""**.mlp.down_proj.weight"": ""**.mlp.linear_fc2.weight"",\n ""**.post_attention_layernorm.weight"": ""**.mlp.linear_fc1.layer_norm_weight"",\n }\n )\n\n if getattr(source.config, ""tie_word_embeddings"", False):\n del mapping[""lm_head.weight""]\n\n transforms = [\n io.state_transform(\n source_key=(\n ""**.self_attn.q_proj.weight"",\n ""**.self_attn.k_proj.weight"",\n ""**.self_attn.v_proj.weight"",\n ),\n target_key=""**.self_attention.linear_qkv.weight"",\n fn=TransformFns.merge_qkv,\n ),\n (\n io.state_transform(\n source_key=(""**.mlp.gate_proj.weight"", ""**.mlp.up_proj.weight""),\n target_key=""**.mlp.linear_fc1.weight"",\n fn=TransformFns.merge_fc1,\n )\n if not is_moe\n else io.state_transform(\n source_key=(""**.mlp.experts.*.gate_proj.weight"", ""**.mlp.experts.*.up_proj.weight""),\n target_key=""**.mlp.experts.linear_fc1.weight*"",\n fn=TransformFns.merge_fc1,\n )\n ),\n ]\n return io.apply_transforms(source, target, mapping=mapping, transforms=transforms)\n\n @cached_property\n def tokenizer(self) -> ""AutoTokenizer"":\n from nemo.collections.common.tokenizers.huggingface.auto_tokenizer import AutoTokenizer\n\n return AutoTokenizer(self.save_hf_tokenizer_assets(str(self)), trust_remote_code=True)\n\n @cached_property\n def config(self) -> Qwen3Config:\n from transformers import 
AutoConfig as HFAutoConfig\n from transformers import GenerationConfig\n\n source = HFAutoConfig.from_pretrained(str(self), trust_remote_code=True)\n generation_config = GenerationConfig.from_pretrained(str(self))\n\n is_moe = getattr(source, ""num_experts"", None) is not None\n if is_moe:\n qwen3_config_cls = partial(Qwen3MoEConfig, moe_ffn_hidden_size=source.moe_intermediate_size)\n else:\n qwen3_config_cls = Qwen3Config\n output = qwen3_config_cls(\n num_layers=source.num_hidden_layers,\n hidden_size=source.hidden_size,\n ffn_hidden_size=source.intermediate_size,\n num_attention_heads=source.num_attention_heads,\n num_query_groups=source.num_key_value_heads,\n init_method_std=source.initializer_range,\n layernorm_epsilon=source.rms_norm_eps,\n vocab_size=source.vocab_size,\n make_vocab_size_divisible_by=1187,\n rotary_base=source.rope_theta,\n share_embeddings_and_output_weights=getattr(source, ""tie_word_embeddings"", False),\n fp16=(dtype_from_hf(source) == torch.float16),\n bf16=(dtype_from_hf(source) == torch.bfloat16),\n params_dtype=dtype_from_hf(source),\n generation_config=generation_config,\n )\n\n return output\n\n\n@io.model_exporter(Qwen3Model, ""hf"")\nclass HFQwen3Exporter(io.ModelConnector[Qwen3Model, ""AutoModelForCausalLM""]):\n # pylint: disable=C0115,C0116\n def init(self, dtype=torch.bfloat16) -> ""AutoModelForCausalLM"":\n from transformers import AutoModelForCausalLM\n from transformers.modeling_utils import no_init_weights\n\n with no_init_weights():\n return AutoModelForCausalLM.from_config(self.config, trust_remote_code=True, torch_dtype=dtype)\n\n def apply(self, output_path: Path) -> Path:\n source, _ = self.nemo_load(str(self))\n target = self.init(torch_dtype_from_mcore_config(source.config))\n target = self.convert_state(source, target)\n\n target = target.cpu()\n target.save_pretrained(output_path)\n self.tokenizer.save_pretrained(output_path)\n\n return output_path\n\n def convert_state(self, source, target):\n mapping = {\n 
""**.self_attention.linear_proj.weight"": ""**.self_attn.o_proj.weight"",\n ""**.self_attention.linear_qkv.layer_norm_weight"": ""**.input_layernorm.weight"",\n ""**.self_attention.q_layernorm.weight"": ""**.self_attn.q_norm.weight"",\n ""**.self_attention.k_layernorm.weight"": ""**.self_attn.k_norm.weight"",\n ""decoder.final_layernorm.weight"": ""model.norm.weight"",\n }\n is_moe = getattr(self.config, ""num_experts"", 0) > 0\n if is_moe:\n mapping.update(\n {\n ""**.mlp.experts.linear_fc2.weight*"": ""**.mlp.experts.*.down_proj.weight"",\n ""**.mlp.router.weight"": ""**.mlp.gate.weight"",\n ""**.pre_mlp_layernorm.weight"": ""**.post_attention_layernorm.weight"",\n }\n )\n else:\n mapping.update(\n {\n ""**.mlp.linear_fc2.weight"": ""**.mlp.down_proj.weight"",\n ""**.mlp.linear_fc1.layer_norm_weight"": ""**.post_attention_layernorm.weight"",\n }\n )\n transforms = [\n io.state_transform(\n source_key=""**.self_attention.linear_qkv.weight"",\n target_key=(\n ""**.self_attn.q_proj.weight"",\n ""**.self_attn.k_proj.weight"",\n ""**.self_attn.v_proj.weight"",\n ),\n fn=TransformFns.split_qkv,\n ),\n (\n io.state_transform(\n source_key=""**.mlp.linear_fc1.weight"",\n target_key=(""**.mlp.gate_proj.weight"", ""**.mlp.up_proj.weight""),\n fn=TransformFns.split_fc1,\n )\n if not is_moe\n else io.state_transform(\n source_key=""**.mlp.experts.linear_fc1.weight*"",\n target_key=(""**.mlp.experts.*.gate_proj.weight"", ""**.mlp.experts.*.up_proj.weight""),\n fn=TransformFns.split_fc1,\n )\n ),\n io.state_transform(\n source_key=""embedding.word_embeddings.weight"",\n target_key=""model.embed_tokens.weight"",\n fn=TransformFns.prune_padding,\n ),\n ]\n if not self.config.tie_word_embeddings:\n transforms.append(\n io.state_transform(\n source_key=""output_layer.weight"",\n target_key=""lm_head.weight"",\n fn=TransformFns.prune_padding,\n )\n )\n\n return io.apply_transforms(\n source,\n target,\n mapping=mapping,\n transforms=transforms,\n )\n\n @property\n def 
tokenizer(self):\n return io.load_context(str(self)).model.tokenizer.tokenizer\n\n @property\n def config(self) -> ""HFQwen3Config"":\n from transformers import Qwen3Config as HFQwen3Config\n from transformers import Qwen3MoeConfig as HFQwen3MoeConfig\n\n source: Qwen3Config = io.load_context(str(self), subpath=""model.config"")\n is_moe = source.num_moe_experts is not None\n hf_config_cls = (\n partial(\n HFQwen3MoeConfig,\n moe_intermediate_size=source.moe_ffn_hidden_size,\n num_experts=source.num_moe_experts,\n num_experts_per_tok=source.moe_router_topk,\n router_aux_loss_coef=source.moe_aux_loss_coeff,\n norm_topk_prob=True,\n )\n if is_moe\n else HFQwen3Config\n )\n\n return hf_config_cls(\n architectures=[""Qwen3ForCausalLM""],\n num_hidden_layers=source.num_layers,\n hidden_size=source.hidden_size,\n intermediate_size=source.ffn_hidden_size,\n num_attention_heads=source.num_attention_heads,\n head_dim=source.kv_channels,\n max_position_embeddings=source.max_position_embeddings,\n initializer_range=source.init_method_std,\n rms_norm_eps=source.layernorm_epsilon,\n num_key_value_heads=source.num_query_groups,\n rope_theta=source.rotary_base,\n vocab_size=getattr(source, 'vocab_size', self.tokenizer.vocab_size),\n sliding_window=None,\n tie_word_embeddings=source.share_embeddings_and_output_weights,\n max_window_layers=source.num_layers,\n bos_token_id=151643,\n eos_token_id=151645,\n )\n\n\n__all__ = [\n ""Qwen3Config"",\n ""Qwen3Config600M"",\n ""Qwen3Config1P7B"",\n ""Qwen3Config4B"",\n ""Qwen3Config8B"",\n ""Qwen3Config14B"",\n ""Qwen3Config32B"",\n ""Qwen3Config30B_A3B"",\n ""Qwen3Config235B_A22B"",\n ""Qwen3Model"",\n]\n",python,tab
+173,6831597,"nemo/collections/llm/recipes/finetune_default.py",0,0,"",python,tab
+174,6831762,"nemo/collections/llm/recipes/finetune_default.py",2896,0,"",python,selection_command
+175,6911772,"nemo/collections/llm/recipes/finetune_default.py",3121,0,"",python,selection_command
+176,6912642,"nemo/collections/llm/recipes/finetune_default.py",2896,0,"",python,selection_command
+177,6939585,"nemo/collections/llm/recipes/finetune_default.py",2772,0,"",python,selection_command
+178,6940304,"nemo/collections/llm/recipes/finetune_default.py",2427,0,"",python,selection_command
+179,6940806,"nemo/collections/llm/recipes/finetune_default.py",1590,0,"",python,selection_command
+180,6943746,"nemo/collections/llm/recipes/finetune_default.py",2427,0,"",python,selection_command
+181,6944203,"nemo/collections/llm/recipes/finetune_default.py",2772,0,"",python,selection_command
+182,6944563,"nemo/collections/llm/recipes/finetune_default.py",824,0,"",python,selection_command
+183,6946701,"nemo/collections/llm/recipes/finetune_default.py",1590,0,"",python,selection_command
+184,6946870,"nemo/collections/llm/recipes/finetune_default.py",2427,0,"",python,selection_command
+185,6948772,"nemo/collections/llm/recipes/finetune_default.py",2491,0,"",python,selection_command
+186,6949022,"nemo/collections/llm/recipes/finetune_default.py",2594,0,"",python,selection_command
+187,6949058,"nemo/collections/llm/recipes/finetune_default.py",2631,0,"",python,selection_command
+188,6949092,"nemo/collections/llm/recipes/finetune_default.py",2640,0,"",python,selection_command
+189,6949124,"nemo/collections/llm/recipes/finetune_default.py",2653,0,"",python,selection_command
+190,6949157,"nemo/collections/llm/recipes/finetune_default.py",2705,0,"",python,selection_command
+191,6949311,"nemo/collections/llm/recipes/finetune_default.py",2714,0,"",python,selection_command
+192,6949488,"nemo/collections/llm/recipes/finetune_default.py",2763,0,"",python,selection_command
+193,6949652,"nemo/collections/llm/recipes/finetune_default.py",2773,0,"",python,selection_command
+194,6978665,"nemo/collections/llm/recipes/finetune_default.py",2786,0,"",python,selection_command
+195,6979022,"nemo/collections/llm/recipes/finetune_default.py",2810,0,"",python,selection_command
+196,6979272,"nemo/collections/llm/recipes/finetune_default.py",2843,0,"",python,selection_command
+197,6979305,"nemo/collections/llm/recipes/finetune_default.py",2876,0,"",python,selection_command
+198,6979338,"nemo/collections/llm/recipes/finetune_default.py",2905,0,"",python,selection_command
+199,6979371,"nemo/collections/llm/recipes/finetune_default.py",2938,0,"",python,selection_command
+200,6979404,"nemo/collections/llm/recipes/finetune_default.py",2970,0,"",python,selection_command
+201,6979437,"nemo/collections/llm/recipes/finetune_default.py",3039,0,"",python,selection_command
+202,6979471,"nemo/collections/llm/recipes/finetune_default.py",3049,0,"",python,selection_command
+203,6979505,"nemo/collections/llm/recipes/finetune_default.py",3072,0,"",python,selection_command
+204,6979538,"nemo/collections/llm/recipes/finetune_default.py",3185,0,"",python,selection_command
+205,6979571,"nemo/collections/llm/recipes/finetune_default.py",3210,0,"",python,selection_command
+206,6979604,"nemo/collections/llm/recipes/finetune_default.py",3231,0,"",python,selection_command
+207,6979638,"nemo/collections/llm/recipes/finetune_default.py",3254,0,"",python,selection_command
+208,6979672,"nemo/collections/llm/recipes/finetune_default.py",3296,0,"",python,selection_command
+209,6979708,"nemo/collections/llm/recipes/finetune_default.py",3329,0,"",python,selection_command
+210,6979738,"nemo/collections/llm/recipes/finetune_default.py",3366,0,"",python,selection_command
+211,6979855,"nemo/collections/llm/recipes/finetune_default.py",3329,0,"",python,selection_command
+212,6980015,"nemo/collections/llm/recipes/finetune_default.py",3296,0,"",python,selection_command
+213,6980231,"nemo/collections/llm/recipes/finetune_default.py",3254,0,"",python,selection_command
+214,6980581,"nemo/collections/llm/recipes/finetune_default.py",3231,0,"",python,selection_command
+215,6980732,"nemo/collections/llm/recipes/finetune_default.py",3210,0,"",python,selection_command
+216,6980885,"nemo/collections/llm/recipes/finetune_default.py",3185,0,"",python,selection_command
+217,6981032,"nemo/collections/llm/recipes/finetune_default.py",3072,0,"",python,selection_command
+218,7435298,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",0,0,"",python,tab
+219,7435668,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",10041,0,"",python,selection_command
+220,7436138,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",10023,0,"",python,selection_command
+221,7436390,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9954,0,"",python,selection_command
+222,7436423,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9944,0,"",python,selection_command
+223,7436457,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9906,0,"",python,selection_command
+224,7436491,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9839,0,"",python,selection_command
+225,7436524,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9771,0,"",python,selection_command
+226,7436558,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9714,0,"",python,selection_command
+227,7436592,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9650,0,"",python,selection_command
+228,7436626,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9587,0,"",python,selection_command
+229,7436659,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9524,0,"",python,selection_command
+230,7436694,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9474,0,"",python,selection_command
+231,7436727,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9436,0,"",python,selection_command
+232,7436761,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9379,0,"",python,selection_command
+233,7436794,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9314,0,"",python,selection_command
+234,7436829,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9250,0,"",python,selection_command
+235,7436861,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9187,0,"",python,selection_command
+236,7437968,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9124,0,"",python,selection_command
+237,7439297,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9187,0,"",python,selection_command
+238,7439437,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9250,0,"",python,selection_command
+239,7439577,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9314,0,"",python,selection_command
+240,7511369,"nemo/collections/llm/recipes/finetune_default.py",0,0,"",python,tab
+241,8536523,"nemo/collections/llm/recipes/qwen3_600m.py",0,0,"# Copyright (c) 2025, NVIDIA CORPORATION. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the ""License"");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an ""AS IS"" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import Optional\n\nimport lightning.pytorch as pl\nimport nemo_run as run\nimport torch\n\nfrom nemo.collections.llm.api import finetune, pretrain\nfrom nemo.collections.llm.gpt.data.mock import MockDataModule\nfrom nemo.collections.llm.peft import PEFT_STR2CLS\nfrom nemo.collections.llm.recipes.finetune_default import default_finetune_recipe\nfrom nemo.collections.llm.recipes.log.default import default_log, default_resume, tensorboard_logger\nfrom nemo.collections.llm.recipes.optim.adam import distributed_fused_adam_with_cosine_annealing\nfrom nemo.collections.llm.recipes.qwen3 import qwen3_model, qwen3_trainer\nfrom nemo.utils.exp_manager import TimingCallback\n\nNAME = ""qwen3_600m""\n\n\n@run.cli.factory(name=NAME)\ndef model() -> run.Config[pl.LightningModule]:\n """"""\n Factory function to create a Qwen3 600M model configuration.\n\n Returns:\n run.Config[pl.LightningModule]: Configuration for the Qwen3 600M model.\n\n Examples:\n CLI usage:\n $ nemo llm pretrain model=qwen3_600m ...\n\n Python API usage:\n >>> model_config = model()\n >>> print(model_config)\n """"""\n return qwen3_model(version=NAME)\n\n\n@run.cli.factory(target=pretrain, name=NAME)\ndef pretrain_recipe(\n # General\n dir: Optional[str] = None,\n name: str = ""default"",\n # Trainer\n tensor_parallelism: 
int = 1,\n pipeline_parallelism: int = 1,\n pipeline_parallelism_type: Optional[torch.dtype] = None,\n virtual_pipeline_parallelism: Optional[int] = None,\n context_parallelism: int = 1,\n sequence_parallelism: bool = False,\n num_nodes: int = 1,\n num_gpus_per_node: int = 8,\n max_steps: int = 300000,\n precision: str = ""bf16-mixed"",\n accumulate_grad_batches: int = 1,\n gradient_clip_val: float = 1.0,\n limit_test_batches: int = 32,\n limit_val_batches: int = 32,\n log_every_n_steps: int = 10,\n val_check_interval: int = 500,\n # Data\n global_batch_size=32,\n micro_batch_size=2,\n seq_length=4096,\n # Optimizer\n warmup_steps=500,\n constant_steps=0,\n min_lr=3e-5,\n max_lr=3e-4,\n # Training function\n fn=pretrain,\n) -> run.Partial:\n """"""\n Create a pre-training recipe for Qwen3 600M model.\n\n This function sets up a complete configuration for pre-training, including\n model, trainer, data, logging, optimization, and resumption settings.\n\n Args:\n dir (Optional[str]): Directory for saving logs and checkpoints.\n name (str): Name of the pre-training run.\n tensor_parallelism (int): Degree of tensor model parallelism.\n pipeline_parallelism (int): Degree of pipeline model parallelism.\n pipeline_parallelism_type (Optional[torch.dtype]): Data type for pipeline parallelism.\n virtual_pipeline_parallelism (Optional[int]): Size of virtual pipeline parallelism.\n context_parallelism (int): Degree of context parallelism.\n sequence_parallelism (bool): Whether to use sequence parallelism.\n num_nodes (int): Number of compute nodes to use.\n num_gpus_per_node (int): Number of GPUs per node.\n max_steps (int): Maximum number of training steps.\n precision (str): Precision configuration, one of fp32, 16-mixed or bf16-mixed.\n accumulate_grad_batches (int): Number of steps per gradient accumulation.\n gradient_clip_val (float): Value for gradient clipping.\n limit_test_batches (int): Limit the number of test batches.\n limit_val_batches (int): Limit the number of 
validation batches.\n log_every_n_steps (int): Log every n steps.\n val_check_interval (int): Run validation every N steps.\n global_batch_size (int): Global batch size.\n micro_batch_size (int): Micro batch size.\n seq_length (int): Sequence length.\n warmup_steps (int): Number of warmup steps.\n constant_steps (int): Number of constant steps.\n min_lr (float): Minimum learning rate.\n max_lr (float): Maximum learning rate.\n fn (Callable): The pre-training function to use.\n\n Returns:\n run.Partial: Partial configuration for pre-training.\n\n Examples:\n CLI usage:\n $ nemo llm pretrain --factory qwen3_600m\n $ nemo llm pretrain --factory ""qwen3_600m(num_nodes=1, name='my_qwen3_pretrain')""\n\n Python API usage:\n >>> recipe = pretrain_recipe(name=""qwen3_pretrain"", num_nodes=1)\n >>> print(recipe)\n\n Note:\n This recipe uses a mock dataset, look for the finetune examples to see how to change the dataset.\n """"""\n return run.Partial(\n fn,\n model=model(),\n trainer=qwen3_trainer(\n tensor_parallelism=tensor_parallelism,\n pipeline_parallelism=pipeline_parallelism,\n pipeline_parallelism_type=pipeline_parallelism_type,\n virtual_pipeline_parallelism=virtual_pipeline_parallelism,\n context_parallelism=context_parallelism,\n sequence_parallelism=sequence_parallelism,\n num_nodes=num_nodes,\n num_gpus_per_node=num_gpus_per_node,\n max_steps=max_steps,\n precision=precision,\n accumulate_grad_batches=accumulate_grad_batches,\n limit_test_batches=limit_test_batches,\n limit_val_batches=limit_val_batches,\n log_every_n_steps=log_every_n_steps,\n val_check_interval=val_check_interval,\n callbacks=[run.Config(TimingCallback)],\n ),\n data=run.Config(\n MockDataModule,\n seq_length=seq_length,\n global_batch_size=global_batch_size,\n micro_batch_size=micro_batch_size,\n ),\n log=default_log(dir=dir, name=name, tensorboard_logger=tensorboard_logger(name=name)),\n optim=distributed_fused_adam_with_cosine_annealing(\n precision=precision,\n warmup_steps=warmup_steps,\n 
constant_steps=constant_steps,\n min_lr=min_lr,\n max_lr=max_lr,\n clip_grad=gradient_clip_val,\n ),\n resume=default_resume(),\n )\n\n\n@run.cli.factory(target=finetune, name=NAME)\ndef finetune_recipe(\n dir: Optional[str] = None,\n name: str = ""default"",\n num_nodes: int = 1,\n num_gpus_per_node: int = 8,\n peft_scheme: Optional[str] = 'lora',\n packed_sequence: bool = False,\n) -> run.Partial:\n """"""\n Create a fine-tuning recipe for Qwen3 600M model.\n\n This function sets up a complete configuration for fine-tuning, including\n model, trainer, data, logging, optimization, and resumption settings.\n The recipe uses LoRA (Low-Rank Adaptation) for efficient fine-tuning, unless peft_scheme is set to None.\n\n Args:\n dir (Optional[str]): Directory for saving logs and checkpoints.\n name (str): Name of the fine-tuning run.\n num_nodes (int): Number of compute nodes to use.\n num_gpus_per_node (int): Number of GPUs per node.\n peft_scheme (Optional[str]): Name of the peft scheme to use for fine-tuning.\n Allowed values: 'lora'/'dora'/'none'/None.\n packed_sequence (Optional[bool]): Packing multiple training sequences into one long sequence for training\n efficiency. 
Default sequence length is 2048.\n\n Returns:\n run.Partial: Partial configuration for fine-tuning.\n\n Examples:\n CLI usage:\n $ nemo llm finetune --factory qwen3_600m\n\n Python API usage:\n >>> recipe = finetune_recipe(name=""qwen3_600m_finetune"", num_nodes=2)\n >>> print(recipe)\n\n Note:\n This recipe uses the SQuAD dataset for fine-tuning.\n """"""\n recipe = default_finetune_recipe(\n model(), ""Qwen/Qwen3-0.6B"", dir, name, num_nodes, num_gpus_per_node, packed_sequence\n )\n\n if peft_scheme is None or peft_scheme.lower() == 'none':\n recipe.optim.config.lr = 5e-6\n elif peft_scheme.lower() in ['lora', 'dora']:\n recipe.peft = run.Config(PEFT_STR2CLS[peft_scheme.lower()])\n recipe.optim.config.lr = 1e-4\n else:\n raise ValueError(f""Unrecognized peft scheme: {peft_scheme}"")\n return recipe\n",python,tab
+242,8593512,"nemo/collections/llm/recipes/qwen3_600m.py",1780,0,"",python,selection_keyboard
+243,8594713,"nemo/collections/llm/recipes/qwen3_600m.py",3438,0,"",python,selection_keyboard
+244,8595636,"nemo/collections/llm/recipes/qwen3_600m.py",3346,0,"",python,selection_command
+245,8595889,"nemo/collections/llm/recipes/qwen3_600m.py",3251,0,"",python,selection_command
+246,8595928,"nemo/collections/llm/recipes/qwen3_600m.py",3177,0,"",python,selection_command
+247,8595960,"nemo/collections/llm/recipes/qwen3_600m.py",3107,0,"",python,selection_command
+248,8595985,"nemo/collections/llm/recipes/qwen3_600m.py",3057,0,"",python,selection_command
+249,8596018,"nemo/collections/llm/recipes/qwen3_600m.py",2985,0,"",python,selection_command
+250,8596052,"nemo/collections/llm/recipes/qwen3_600m.py",2975,0,"",python,selection_command
+251,8596085,"nemo/collections/llm/recipes/qwen3_600m.py",2974,0,"",python,selection_command
+252,8596119,"nemo/collections/llm/recipes/qwen3_600m.py",2900,0,"",python,selection_command
+253,8596152,"nemo/collections/llm/recipes/qwen3_600m.py",2821,0,"",python,selection_command
+254,8596186,"nemo/collections/llm/recipes/qwen3_600m.py",2820,0,"",python,selection_command
+255,8596220,"nemo/collections/llm/recipes/qwen3_600m.py",2765,0,"",python,selection_command
+256,8596257,"nemo/collections/llm/recipes/qwen3_600m.py",2757,0,"",python,selection_command
+257,8596289,"nemo/collections/llm/recipes/qwen3_600m.py",2739,0,"",python,selection_command
+258,8596322,"nemo/collections/llm/recipes/qwen3_600m.py",2722,0,"",python,selection_command
+259,8621282,"nemo/collections/llm/__init__.py",0,0,"# Copyright (c) 2025, NVIDIA CORPORATION. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the ""License"");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an ""AS IS"" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n# This is here to import it once, which improves the speed of launch when in debug-mode\nfrom nemo.utils.import_utils import safe_import\n\nsafe_import(""transformer_engine"")\n\nfrom nemo.collections.llm import peft\nfrom nemo.collections.llm.bert.data import BERTMockDataModule, BERTPreTrainingDataModule, SpecterDataModule\nfrom nemo.collections.llm.bert.model import (\n BertConfig,\n BertEmbeddingLargeConfig,\n BertEmbeddingMiniConfig,\n BertEmbeddingModel,\n BertModel,\n HuggingFaceBertBaseConfig,\n HuggingFaceBertConfig,\n HuggingFaceBertLargeConfig,\n HuggingFaceBertModel,\n MegatronBertBaseConfig,\n MegatronBertConfig,\n MegatronBertLargeConfig,\n)\nfrom nemo.collections.llm.gpt.data import ( # noqa: F401\n AlpacaDataModule,\n ChatDataModule,\n CustomReRankerDataModule,\n CustomRetrievalDataModule,\n DollyDataModule,\n FineTuningDataModule,\n HFDatasetDataModule,\n HFDatasetDataModulePacked,\n HFMockDataModule,\n MockDataModule,\n PreTrainingDataModule,\n SpecterReRankerDataModule,\n SquadDataModule,\n)\nfrom nemo.collections.llm.gpt.data.api import dolly, hf_dataset, mock, squad\nfrom nemo.collections.llm.gpt.model import ( # noqa: F401\n Baichuan2Config,\n Baichuan2Config7B,\n Baichuan2Model,\n BaseMambaConfig1_3B,\n BaseMambaConfig2_7B,\n BaseMambaConfig130M,\n BaseMambaConfig370M,\n 
BaseMambaConfig780M,\n ChatGLM2Config6B,\n ChatGLM3Config6B,\n ChatGLMConfig,\n ChatGLMModel,\n CodeGemmaConfig2B,\n CodeGemmaConfig7B,\n CodeLlamaConfig7B,\n CodeLlamaConfig13B,\n CodeLlamaConfig34B,\n CodeLlamaConfig70B,\n DeepSeekModel,\n DeepSeekV2Config,\n DeepSeekV2LiteConfig,\n DeepSeekV3Config,\n Gemma2Config,\n Gemma2Config2B,\n Gemma2Config9B,\n Gemma2Config27B,\n Gemma2Model,\n Gemma3Config1B,\n Gemma3Config4B,\n Gemma3Config12B,\n Gemma3Config27B,\n Gemma3Model,\n GemmaConfig,\n GemmaConfig2B,\n GemmaConfig7B,\n GemmaModel,\n GPTConfig,\n GPTConfig5B,\n GPTConfig7B,\n GPTConfig20B,\n GPTConfig40B,\n GPTConfig126M,\n GPTConfig175B,\n GPTModel,\n GPTOSSConfig,\n GPTOSSConfig20B,\n GPTOSSConfig120B,\n GPTOSSModel,\n Hyena1bConfig,\n Hyena7bARCLongContextConfig,\n Hyena7bConfig,\n Hyena40bARCLongContextConfig,\n Hyena40bConfig,\n HyenaConfig,\n HyenaModel,\n HyenaNV1bConfig,\n HyenaNV7bConfig,\n HyenaNV40bConfig,\n HyenaNVTestConfig,\n HyenaTestConfig,\n Llama2Config7B,\n Llama2Config13B,\n Llama2Config70B,\n Llama3Config8B,\n Llama3Config70B,\n Llama4Config,\n Llama4Experts16Config,\n Llama4Experts128Config,\n Llama31Config8B,\n Llama31Config70B,\n Llama31Config405B,\n Llama31Nemotron70BConfig,\n Llama31NemotronNano8BConfig,\n Llama31NemotronUltra253BConfig,\n Llama32Config1B,\n Llama32Config3B,\n Llama32EmbeddingConfig1B,\n Llama32EmbeddingConfig3B,\n Llama32Reranker1BConfig,\n Llama32Reranker500MConfig,\n Llama33NemotronSuper49BConfig,\n LlamaConfig,\n LlamaEmbeddingModel,\n LlamaModel,\n LlamaNemotronModel,\n MambaModel,\n MaskedTokenLossReduction,\n MistralConfig7B,\n MistralModel,\n MistralNeMoConfig12B,\n MistralSmall3Config24B,\n MixtralConfig,\n MixtralConfig8x3B,\n MixtralConfig8x7B,\n MixtralConfig8x22B,\n MixtralModel,\n Nemotron3Config4B,\n Nemotron3Config8B,\n Nemotron3Config22B,\n Nemotron4Config15B,\n Nemotron4Config340B,\n NemotronConfig,\n NemotronHConfig4B,\n NemotronHConfig8B,\n NemotronHConfig47B,\n NemotronHConfig56B,\n 
NemotronModel,\n NemotronNano9Bv2,\n NemotronNano12Bv2,\n NVIDIAMambaConfig8B,\n NVIDIAMambaHybridConfig8B,\n Phi3Config,\n Phi3ConfigMini,\n Phi3Model,\n Qwen2Config,\n Qwen2Config1P5B,\n Qwen2Config7B,\n Qwen2Config72B,\n Qwen2Config500M,\n Qwen2Model,\n Qwen3Config,\n Qwen3Config1P7B,\n Qwen3Config4B,\n Qwen3Config8B,\n Qwen3Config14B,\n Qwen3Config30B_A3B,\n Qwen3Config32B,\n Qwen3Config235B_A22B,\n Qwen3Config600M,\n Qwen3Model,\n Qwen25Config1P5B,\n Qwen25Config3B,\n Qwen25Config7B,\n Qwen25Config14B,\n Qwen25Config32B,\n Qwen25Config72B,\n Qwen25Config500M,\n ReRankerModel,\n SSMConfig,\n Starcoder2Config,\n Starcoder2Config3B,\n Starcoder2Config7B,\n Starcoder2Config15B,\n Starcoder2Model,\n StarcoderConfig,\n StarcoderConfig15B,\n StarcoderModel,\n gpt_data_step,\n gpt_forward_step,\n)\nfrom nemo.collections.llm.t5.data import FineTuningDataModule as T5FineTuningDataModule\nfrom nemo.collections.llm.t5.data import MockDataModule as T5MockDataModule\nfrom nemo.collections.llm.t5.data import PreTrainingDataModule as T5PreTrainingDataModule\nfrom nemo.collections.llm.t5.data import SquadDataModule as T5SquadDataModule\nfrom nemo.collections.llm.t5.model import (\n T5Config,\n T5Config3B,\n T5Config11B,\n T5Config220M,\n T5Model,\n t5_data_step,\n t5_forward_step,\n)\n\n__all__ = [\n ""MockDataModule"",\n ""T5MockDataModule"",\n ""CustomRetrievalDataModule"",\n ""CustomReRankerDataModule"",\n ""SpecterReRankerDataModule"",\n ""GPTModel"",\n ""GPTConfig"",\n ""HyenaTestConfig"",\n ""Hyena7bConfig"",\n ""Hyena40bConfig"",\n ""Hyena7bARCLongContextConfig"",\n ""Hyena40bARCLongContextConfig"",\n ""HyenaNVTestConfig"",\n ""HyenaNV40bConfig"",\n ""HyenaNV7bConfig"",\n ""HyenaConfig"",\n ""HyenaModel"",\n ""Hyena1bConfig"",\n ""HyenaNV1bConfig"",\n ""gpt_data_step"",\n ""gpt_forward_step"",\n ""T5Model"",\n ""T5Config"",\n ""T5Config220M"",\n ""T5Config3B"",\n ""T5Config11B"",\n ""BertConfig"",\n ""BertEmbeddingModel"",\n ""BertModel"",\n 
""BertEmbeddingLargeConfig"",\n ""BertEmbeddingMiniConfig"",\n ""t5_data_step"",\n ""t5_forward_step"",\n ""MaskedTokenLossReduction"",\n ""MistralConfig7B"",\n ""MistralNeMoConfig12B"",\n ""MistralSmall3Config24B"",\n ""MistralModel"",\n ""MixtralConfig"",\n ""MixtralConfig8x3B"",\n ""MixtralConfig8x7B"",\n ""MixtralConfig8x22B"",\n ""MixtralModel"",\n ""Starcoder2Config15B"",\n ""Starcoder2Config"",\n ""Starcoder2Model"",\n ""NemotronModel"",\n ""Nemotron3Config4B"",\n ""Nemotron3Config8B"",\n ""Nemotron3Config22B"",\n ""Nemotron4Config15B"",\n ""Nemotron4Config340B"",\n ""NemotronConfig"",\n ""LlamaEmbeddingModel"",\n ""Llama32EmbeddingConfig1B"",\n ""Llama32EmbeddingConfig3B"",\n ""Phi3Config"",\n ""Phi3ConfigMini"",\n ""Phi3Model"",\n ""SSMConfig"",\n ""BaseMambaConfig130M"",\n ""BaseMambaConfig370M"",\n ""BaseMambaConfig780M"",\n ""BaseMambaConfig1_3B"",\n ""BaseMambaConfig2_7B"",\n ""NVIDIAMambaConfig8B"",\n ""NVIDIAMambaHybridConfig8B"",\n ""NemotronHConfig4B"",\n ""NemotronHConfig8B"",\n ""NemotronHConfig47B"",\n ""NemotronHConfig56B"",\n ""NemotronNano9Bv2"",\n ""NemotronNano12Bv2"",\n ""MambaModel"",\n ""LlamaConfig"",\n ""Llama2Config7B"",\n ""Llama2Config13B"",\n ""Llama2Config70B"",\n ""Llama3Config8B"",\n ""Llama3Config70B"",\n ""Llama31Config8B"",\n ""Llama31Config70B"",\n ""Llama31Config405B"",\n ""Llama32Config1B"",\n ""Llama32Config3B"",\n ""Llama4Experts16Config"",\n ""Llama4Experts128Config"",\n ""Llama4Config"",\n ""Llama31NemotronNano8BConfig"",\n ""Llama31Nemotron70BConfig"",\n ""Llama33NemotronSuper49BConfig"",\n ""Llama31NemotronUltra253BConfig"",\n ""Llama32Reranker500MConfig"",\n ""Llama32Reranker1BConfig"",\n ""CodeLlamaConfig7B"",\n ""CodeLlamaConfig13B"",\n ""CodeLlamaConfig34B"",\n ""CodeLlamaConfig70B"",\n ""LlamaModel"",\n ""LlamaNemotronModel"",\n ""GPTOSSConfig"",\n ""GPTOSSConfig120B"",\n ""GPTOSSConfig20B"",\n ""GPTOSSModel"",\n ""GemmaConfig"",\n ""GemmaConfig2B"",\n ""GemmaConfig7B"",\n ""CodeGemmaConfig2B"",\n 
""CodeGemmaConfig7B"",\n ""GemmaModel"",\n ""Gemma2Model"",\n ""Gemma2Config9B"",\n ""Gemma2Config"",\n ""Gemma2Config27B"",\n ""Gemma2Config2B"",\n ""Gemma3Model"",\n ""Gemma3Config1B"",\n ""Gemma3Config4B"",\n ""Gemma3Config12B"",\n ""Gemma3Config27B"",\n ""Baichuan2Config"",\n ""Baichuan2Config7B"",\n ""Baichuan2Model"",\n ""ChatGLMConfig"",\n ""ChatGLM2Config6B"",\n ""ChatGLM3Config6B"",\n ""ChatGLMModel"",\n ""Qwen2Model"",\n ""Qwen2Config7B"",\n ""Qwen2Config"",\n ""Qwen2Config500M"",\n ""Qwen2Config1P5B"",\n ""Qwen25Config3B"",\n ""Qwen2Config72B"",\n ""Qwen25Config500M"",\n ""Qwen25Config1P5B"",\n ""Qwen25Config7B"",\n ""Qwen25Config14B"",\n ""Qwen25Config32B"",\n ""Qwen25Config72B"",\n ""Qwen3Config"",\n ""Qwen3Config600M"",\n ""Qwen3Config1P7B"",\n ""Qwen3Config4B"",\n ""Qwen3Config8B"",\n ""Qwen3Config14B"",\n ""Qwen3Config32B"",\n ""Qwen3Config30B_A3B"",\n ""Qwen3Config235B_A22B"",\n ""Qwen3Model"",\n ""PreTrainingDataModule"",\n ""FineTuningDataModule"",\n ""ChatDataModule"",\n ""SquadDataModule"",\n ""T5PreTrainingDataModule"",\n ""T5FineTuningDataModule"",\n ""T5SquadDataModule"",\n ""T5MockDataModule"",\n ""DeepSeekModel"",\n ""DeepSeekV2Config"",\n ""DeepSeekV2LiteConfig"",\n ""DeepSeekV3Config"",\n ""HuggingFaceBertBaseConfig"",\n ""HuggingFaceBertConfig"",\n ""HuggingFaceBertLargeConfig"",\n ""HuggingFaceBertModel"",\n ""MegatronBertBaseConfig"",\n ""MegatronBertConfig"",\n ""MegatronBertLargeConfig"",\n ""BERTMockDataModule"",\n ""BERTPreTrainingDataModule"",\n ""SpecterDataModule"",\n ""DollyDataModule"",\n ""tokenizer"",\n ""mock"",\n ""squad"",\n ""dolly"",\n ""peft"",\n ""hf_dataset"",\n ""HFMockDataModule"",\n]\n\n\nfrom nemo.utils import logging\n\ntry:\n import nemo_run as run # noqa: F401\n\n from nemo.collections.llm.api import ( # noqa: F401\n distill,\n export_ckpt,\n finetune,\n generate,\n import_ckpt,\n pretrain,\n prune,\n ptq,\n train,\n validate,\n )\n from nemo.collections.llm.recipes import * # noqa\n\n __all__.extend(\n [\n 
""train"",\n ""import_ckpt"",\n ""export_ckpt"",\n ""pretrain"",\n ""validate"",\n ""finetune"",\n ""generate"",\n ""prune"",\n ""ptq"",\n ""distill"",\n ]\n )\nexcept ImportError as error:\n logging.warning(f""Failed to import nemo.collections.llm.[api,recipes]: {error}"")\n\ntry:\n from nemo.collections.llm.api import deploy # noqa: F401\n\n __all__.append(""deploy"")\nexcept ImportError as error:\n logging.warning(f""The deploy module could not be imported: {error}"")\n\ntry:\n from nemo.collections.llm.api import evaluate # noqa: F401\n\n __all__.append(""evaluate"")\nexcept ImportError as error:\n logging.warning(f""The evaluate module could not be imported: {error}"")\n",python,tab
+260,8626049,"nemo/collections/llm/__init__.py",4446,0,"",python,selection_command
+261,8626789,"nemo/collections/llm/__init__.py",4463,0,"",python,selection_command
+262,8627040,"nemo/collections/llm/__init__.py",4484,0,"",python,selection_command
+263,8627079,"nemo/collections/llm/__init__.py",4503,0,"",python,selection_command
+264,8627103,"nemo/collections/llm/__init__.py",4522,0,"",python,selection_command
+265,8627137,"nemo/collections/llm/__init__.py",4542,0,"",python,selection_command
+266,8627487,"nemo/collections/llm/__init__.py",4522,0,"",python,selection_command
+267,8627638,"nemo/collections/llm/__init__.py",4503,0,"",python,selection_command
+268,8627780,"nemo/collections/llm/__init__.py",4484,0,"",python,selection_command
+269,8628089,"nemo/collections/llm/__init__.py",4463,0,"",python,selection_command
+270,8629239,"nemo/collections/llm/__init__.py",4484,0,"",python,selection_command
+271,8629489,"nemo/collections/llm/__init__.py",4503,0,"",python,selection_command
+272,8629524,"nemo/collections/llm/__init__.py",4522,0,"",python,selection_command
+273,8629555,"nemo/collections/llm/__init__.py",4542,0,"",python,selection_command
+274,8629588,"nemo/collections/llm/__init__.py",4566,0,"",python,selection_command
+275,8629694,"nemo/collections/llm/__init__.py",4586,0,"",python,selection_command
+276,8629854,"nemo/collections/llm/__init__.py",4612,0,"",python,selection_command
+277,8630212,"nemo/collections/llm/__init__.py",4612,1,"Q",python,selection_command
+278,8630338,"nemo/collections/llm/__init__.py",4612,15,"Qwen3Config600M",python,selection_command
+279,8630919,"nemo/collections/llm/__init__.py",4626,0,"",python,selection_command
+280,8941201,"nemo/collections/llm/recipes/qwen3_600m.py",0,0,"",python,tab
+281,184621360,"tests/lightning/_io/artifacts/model.yaml",0,0,"",yaml,tab
+282,271675338,"tests/lightning/_io/artifacts/model.yaml",994,0,"",yaml,selection_command
+283,271675455,"tests/lightning/_io/artifacts/model.yaml",1080,0,"",yaml,selection_command
+284,271675609,"tests/lightning/_io/artifacts/model.yaml",1484,0,"",yaml,selection_command
+285,271675787,"tests/lightning/_io/artifacts/model.yaml",1573,0,"",yaml,selection_command
+286,271676144,"tests/lightning/_io/artifacts/model.yaml",5108,0,"",yaml,selection_command
+287,271677173,"tests/lightning/_io/artifacts/model.yaml",994,0,"",yaml,selection_command
+288,271677338,"tests/lightning/_io/artifacts/model.yaml",1080,0,"",yaml,selection_command
+289,271690010,"nemo/collections/llm/recipes/qwen3_600m.py",0,0,"",python,tab
+290,271691756,"nemo/collections/llm/recipes/qwen3_600m.py",2698,0,"",python,selection_command
+291,271692007,"nemo/collections/llm/recipes/qwen3_600m.py",2681,0,"",python,selection_command
+292,271692039,"nemo/collections/llm/recipes/qwen3_600m.py",2664,0,"",python,selection_command
+293,271692074,"nemo/collections/llm/recipes/qwen3_600m.py",2642,0,"",python,selection_command
+294,271692105,"nemo/collections/llm/recipes/qwen3_600m.py",2620,0,"",python,selection_command
+295,271692138,"nemo/collections/llm/recipes/qwen3_600m.py",2604,0,"",python,selection_command
+296,271692171,"nemo/collections/llm/recipes/qwen3_600m.py",2583,0,"",python,selection_command
+297,271692205,"nemo/collections/llm/recipes/qwen3_600m.py",2559,0,"",python,selection_command
+298,271692238,"nemo/collections/llm/recipes/qwen3_600m.py",2533,0,"",python,selection_command
+299,271692277,"nemo/collections/llm/recipes/qwen3_600m.py",2522,0,"",python,selection_command
+300,271692305,"nemo/collections/llm/recipes/qwen3_600m.py",2487,0,"",python,selection_command
+301,271692338,"nemo/collections/llm/recipes/qwen3_600m.py",2454,0,"",python,selection_command
+302,271692371,"nemo/collections/llm/recipes/qwen3_600m.py",2421,0,"",python,selection_command
+303,271692405,"nemo/collections/llm/recipes/qwen3_600m.py",2387,0,"",python,selection_command
+304,271692438,"nemo/collections/llm/recipes/qwen3_600m.py",2351,0,"",python,selection_command
+305,271697105,"nemo/collections/llm/recipes/qwen3_600m.py",2387,0,"",python,selection_command
+306,271697354,"nemo/collections/llm/recipes/qwen3_600m.py",2421,0,"",python,selection_command
+307,271697388,"nemo/collections/llm/recipes/qwen3_600m.py",2454,0,"",python,selection_command
+308,271697421,"nemo/collections/llm/recipes/qwen3_600m.py",2487,0,"",python,selection_command
+309,271697455,"nemo/collections/llm/recipes/qwen3_600m.py",2522,0,"",python,selection_command
+310,271697488,"nemo/collections/llm/recipes/qwen3_600m.py",2533,0,"",python,selection_command
+311,271697521,"nemo/collections/llm/recipes/qwen3_600m.py",2559,0,"",python,selection_command
+312,271697671,"nemo/collections/llm/recipes/qwen3_600m.py",2583,0,"",python,selection_command
+313,271697852,"nemo/collections/llm/recipes/qwen3_600m.py",2604,0,"",python,selection_command
+314,271697993,"nemo/collections/llm/recipes/qwen3_600m.py",2620,0,"",python,selection_command
+315,271698119,"nemo/collections/llm/recipes/qwen3_600m.py",2642,0,"",python,selection_command
+316,271698405,"nemo/collections/llm/recipes/qwen3_600m.py",2646,0,"",python,selection_command
+317,271700121,"nemo/collections/llm/recipes/qwen3_600m.py",2624,0,"",python,selection_command
+318,271700372,"nemo/collections/llm/recipes/qwen3_600m.py",2608,0,"",python,selection_command
+319,271700407,"nemo/collections/llm/recipes/qwen3_600m.py",2587,0,"",python,selection_command
+320,271700440,"nemo/collections/llm/recipes/qwen3_600m.py",2563,0,"",python,selection_command
+321,271700473,"nemo/collections/llm/recipes/qwen3_600m.py",2537,0,"",python,selection_command
+322,271700506,"nemo/collections/llm/recipes/qwen3_600m.py",2526,0,"",python,selection_command
+323,271700538,"nemo/collections/llm/recipes/qwen3_600m.py",2491,0,"",python,selection_command
+324,271700572,"nemo/collections/llm/recipes/qwen3_600m.py",2458,0,"",python,selection_command
+325,271700605,"nemo/collections/llm/recipes/qwen3_600m.py",2425,0,"",python,selection_command
+326,271700638,"nemo/collections/llm/recipes/qwen3_600m.py",2391,0,"",python,selection_command
+327,271700671,"nemo/collections/llm/recipes/qwen3_600m.py",2355,0,"",python,selection_command
+328,271700705,"nemo/collections/llm/recipes/qwen3_600m.py",2317,0,"",python,selection_command
+329,271700738,"nemo/collections/llm/recipes/qwen3_600m.py",2282,0,"",python,selection_command
+330,271700771,"nemo/collections/llm/recipes/qwen3_600m.py",2253,0,"",python,selection_command
+331,271700804,"nemo/collections/llm/recipes/qwen3_600m.py",2221,0,"",python,selection_command
+332,271700838,"nemo/collections/llm/recipes/qwen3_600m.py",2197,0,"",python,selection_command
+333,271700871,"nemo/collections/llm/recipes/qwen3_600m.py",2157,0,"",python,selection_command
+334,271700904,"nemo/collections/llm/recipes/qwen3_600m.py",2123,0,"",python,selection_command
+335,271700938,"nemo/collections/llm/recipes/qwen3_600m.py",2067,0,"",python,selection_command
+336,271701906,"nemo/collections/llm/recipes/qwen3_600m.py",2123,0,"",python,selection_command
+337,271702155,"nemo/collections/llm/recipes/qwen3_600m.py",2157,0,"",python,selection_command
+338,271702188,"nemo/collections/llm/recipes/qwen3_600m.py",2197,0,"",python,selection_command
+339,271702221,"nemo/collections/llm/recipes/qwen3_600m.py",2221,0,"",python,selection_command
+340,271702355,"nemo/collections/llm/recipes/qwen3_600m.py",2253,0,"",python,selection_command
+341,271702504,"nemo/collections/llm/recipes/qwen3_600m.py",2282,0,"",python,selection_command
+342,271702705,"nemo/collections/llm/recipes/qwen3_600m.py",2253,0,"",python,selection_command
+343,271704377,"nemo/collections/llm/recipes/qwen3_600m.py",3701,0,"",python,selection_command
+344,271705411,"nemo/collections/llm/recipes/qwen3_600m.py",5695,0,"",python,selection_command
+345,271705990,"nemo/collections/llm/recipes/qwen3_600m.py",5705,0,"",python,selection_command
+346,271706173,"nemo/collections/llm/recipes/qwen3_600m.py",2253,0,"",python,selection_command
+347,271706823,"nemo/collections/llm/recipes/qwen3_600m.py",3701,0,"",python,selection_command
+348,271707009,"nemo/collections/llm/recipes/qwen3_600m.py",5695,0,"",python,selection_command
+349,271707673,"nemo/collections/llm/recipes/qwen3_600m.py",5705,0,"",python,selection_command
+350,271722773,"nemo/collections/llm/recipes/qwen3_600m.py",5656,0,"",python,selection_command
+351,271723024,"nemo/collections/llm/recipes/qwen3_600m.py",5623,0,"",python,selection_command
+352,271723059,"nemo/collections/llm/recipes/qwen3_600m.py",5568,0,"",python,selection_command
+353,271723092,"nemo/collections/llm/recipes/qwen3_600m.py",5515,0,"",python,selection_command
+354,271723124,"nemo/collections/llm/recipes/qwen3_600m.py",5444,0,"",python,selection_command
+355,271723156,"nemo/collections/llm/recipes/qwen3_600m.py",5379,0,"",python,selection_command
+356,271723189,"nemo/collections/llm/recipes/qwen3_600m.py",5324,0,"",python,selection_command
+357,271723338,"nemo/collections/llm/recipes/qwen3_600m.py",5273,0,"",python,selection_command
+358,271723555,"nemo/collections/llm/recipes/qwen3_600m.py",5242,0,"",python,selection_command
+359,271723932,"nemo/collections/llm/recipes/qwen3.py",0,0,"",python,tab
+360,271723936,"nemo/collections/llm/recipes/qwen3.py",2389,0,"",python,selection_command
+361,271732559,"nemo/collections/llm/recipes/qwen3.py",2869,0,"",python,selection_command
+362,271733125,"nemo/collections/llm/recipes/qwen3.py",4489,0,"",python,selection_command
+363,271734084,"nemo/collections/llm/recipes/qwen3.py",6917,0,"",python,selection_command
+364,271740496,"nemo/collections/llm/recipes/qwen3.py",6872,0,"",python,selection_command
+365,271740746,"nemo/collections/llm/recipes/qwen3.py",6827,0,"",python,selection_command
+366,271740779,"nemo/collections/llm/recipes/qwen3.py",6780,0,"",python,selection_command
+367,271740816,"nemo/collections/llm/recipes/qwen3.py",6723,0,"",python,selection_command
+368,271740884,"nemo/collections/llm/recipes/qwen3.py",6688,0,"",python,selection_command
+369,271740884,"nemo/collections/llm/recipes/qwen3.py",6659,0,"",python,selection_command
+370,271740972,"nemo/collections/llm/recipes/qwen3.py",6632,0,"",python,selection_command
+371,271740974,"nemo/collections/llm/recipes/qwen3.py",6612,0,"",python,selection_command
+372,271741049,"nemo/collections/llm/recipes/qwen3.py",6586,0,"",python,selection_command
+373,271741052,"nemo/collections/llm/recipes/qwen3.py",6577,0,"",python,selection_command
+374,271741081,"nemo/collections/llm/recipes/qwen3.py",6545,0,"",python,selection_command
+375,271741195,"nemo/collections/llm/recipes/qwen3.py",6509,0,"",python,selection_command
+376,271741352,"nemo/collections/llm/recipes/qwen3.py",6469,0,"",python,selection_command
+377,271741556,"nemo/collections/llm/recipes/qwen3.py",6509,0,"",python,selection_command
+378,271741738,"nemo/collections/llm/recipes/qwen3.py",6545,0,"",python,selection_command
+379,271741895,"nemo/collections/llm/recipes/qwen3.py",6577,0,"",python,selection_command
+380,271742040,"nemo/collections/llm/recipes/qwen3.py",6586,0,"",python,selection_command
+381,271742209,"nemo/collections/llm/recipes/qwen3.py",6612,0,"",python,selection_command
+382,271743476,"nemo/collections/llm/recipes/qwen3.py",6632,0,"",python,selection_command
+383,271743727,"nemo/collections/llm/recipes/qwen3.py",6659,0,"",python,selection_command
+384,271743759,"nemo/collections/llm/recipes/qwen3.py",6688,0,"",python,selection_command
+385,271743792,"nemo/collections/llm/recipes/qwen3.py",6723,0,"",python,selection_command
+386,271743825,"nemo/collections/llm/recipes/qwen3.py",6780,0,"",python,selection_command
+387,271743859,"nemo/collections/llm/recipes/qwen3.py",6827,0,"",python,selection_command
+388,271744011,"nemo/collections/llm/recipes/qwen3.py",6872,0,"",python,selection_command
+389,271744170,"nemo/collections/llm/recipes/qwen3.py",6917,0,"",python,selection_command
+390,271744992,"nemo/collections/llm/recipes/qwen3.py",6946,0,"",python,selection_command
+391,271745064,"nemo/collections/llm/recipes/qwen3.py",6917,0,"",python,selection_command
+392,271745376,"nemo/collections/llm/recipes/qwen3.py",6946,0,"",python,selection_command
+393,271745692,"nemo/collections/llm/recipes/qwen3.py",6917,0,"",python,selection_command
+394,277402348,"tests/lightning/_io/artifacts/model.yaml",0,0,"",yaml,tab
+395,277403557,"tests/lightning/_io/artifacts/model.yaml",2514,0,"",yaml,selection_keyboard
+396,277403642,"tests/lightning/_io/artifacts/model.yaml",3913,0,"",yaml,selection_keyboard
+397,277403789,"tests/lightning/_io/artifacts/model.yaml",5285,0,"",yaml,selection_keyboard
+398,277403929,"tests/lightning/_io/artifacts/model.yaml",5752,0,"",yaml,selection_keyboard
+399,277406162,"tests/lightning/_io/artifacts/model.yaml",4472,0,"",yaml,selection_keyboard
+400,277406307,"tests/lightning/_io/artifacts/model.yaml",3071,0,"",yaml,selection_keyboard
+401,277406440,"tests/lightning/_io/artifacts/model.yaml",1657,0,"",yaml,selection_keyboard
+402,277406574,"tests/lightning/_io/artifacts/model.yaml",153,0,"",yaml,selection_keyboard
+403,277406695,"tests/lightning/_io/artifacts/model.yaml",0,0,"",yaml,selection_keyboard
+404,359907293,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"",Log,tab
+405,359908334,"tests/lightning/_io/artifacts/model.yaml",0,0,"",yaml,tab
+406,441428474,"nemo/collections/llm/recipes/qwen3_600m.py",0,0,"",python,tab
+407,539144668,"nemo/collections/llm/recipes/qwen3_600m.py",5243,0,"",python,selection_command
+408,539145886,"nemo/collections/llm/recipes/qwen3_600m.py",5242,0,"",python,selection_command
+409,629618669,"Untitled-1",0,0,"",plaintext,tab
+410,629628783,"nemo/collections/llm/recipes/Untitled-1",0,0,"",plaintext,tab
+411,629631473,"nemo/collections/llm/recipes/qwen3_600m.py",0,0,"",python,tab
+412,629632041,"Untitled-1",0,0,"",plaintext,tab
+413,629635968,"nemo/collections/llm/recipes/test.py",0,0,"",python,tab
+414,629637422,"nemo/collections/llm/recipes/test.py",0,0,"h",python,content
+415,629637431,"nemo/collections/llm/recipes/test.py",1,0,"",python,selection_keyboard
+416,629637471,"nemo/collections/llm/recipes/test.py",1,0,"e",python,content
+417,629637473,"nemo/collections/llm/recipes/test.py",2,0,"",python,selection_keyboard
+418,629637797,"nemo/collections/llm/recipes/test.py",1,1,"",python,content
+419,629637935,"nemo/collections/llm/recipes/test.py",0,1,"",python,content
+420,629638080,"nemo/collections/llm/recipes/test.py",0,0,"t",python,content
+421,629638082,"nemo/collections/llm/recipes/test.py",1,0,"",python,selection_keyboard
+422,629638163,"nemo/collections/llm/recipes/test.py",1,0,"e",python,content
+423,629638164,"nemo/collections/llm/recipes/test.py",2,0,"",python,selection_keyboard
+424,629638302,"nemo/collections/llm/recipes/test.py",2,0,"s",python,content
+425,629638304,"nemo/collections/llm/recipes/test.py",3,0,"",python,selection_keyboard
+426,629638306,"nemo/collections/llm/recipes/test.py",3,0,"t",python,content
+427,629638306,"nemo/collections/llm/recipes/test.py",4,0,"",python,selection_keyboard
+428,629638468,"nemo/collections/llm/recipes/test.py",4,0," ",python,content
+429,629638469,"nemo/collections/llm/recipes/test.py",5,0,"",python,selection_keyboard
+430,629638667,"nemo/collections/llm/recipes/test.py",5,0,"=",python,content
+431,629638672,"nemo/collections/llm/recipes/test.py",6,0,"",python,selection_keyboard
+432,629638934,"nemo/collections/llm/recipes/test.py",6,0," ",python,content
+433,629638935,"nemo/collections/llm/recipes/test.py",7,0,"",python,selection_keyboard
+434,629639603,"nemo/collections/llm/recipes/test.py",7,0,"""""",python,content
+435,629639604,"nemo/collections/llm/recipes/test.py",8,0,"",python,selection_keyboard
+436,629639718,"nemo/collections/llm/recipes/test.py",8,0,"h",python,content
+437,629639719,"nemo/collections/llm/recipes/test.py",9,0,"",python,selection_keyboard
+438,629639784,"nemo/collections/llm/recipes/test.py",9,0,"e",python,content
+439,629639785,"nemo/collections/llm/recipes/test.py",10,0,"",python,selection_keyboard
+440,629639917,"nemo/collections/llm/recipes/test.py",10,0,"l",python,content
+441,629639919,"nemo/collections/llm/recipes/test.py",11,0,"",python,selection_keyboard
+442,629640363,"nemo/collections/llm/recipes/test.py",11,0,"l",python,content
+443,629640364,"nemo/collections/llm/recipes/test.py",12,0,"",python,selection_keyboard
+444,629640450,"nemo/collections/llm/recipes/test.py",12,0,"o",python,content
+445,629640451,"nemo/collections/llm/recipes/test.py",13,0,"",python,selection_keyboard
+446,629640841,"nemo/collections/llm/recipes/test.py",13,1,"""",python,content
+447,629640843,"nemo/collections/llm/recipes/test.py",14,0,"",python,selection_keyboard
+448,629641084,"nemo/collections/llm/recipes/test.py",13,0,"",python,selection_command
+449,629641523,"nemo/collections/llm/recipes/test.py",14,0,"\n",python,content
+450,629641787,"nemo/collections/llm/recipes/test.py",15,0,"\n",python,content
+451,629643140,"nemo/collections/llm/recipes/test.py",16,0,"t",python,content
+452,629643141,"nemo/collections/llm/recipes/test.py",17,0,"",python,selection_keyboard
+453,629643352,"nemo/collections/llm/recipes/test.py",17,0,"e",python,content
+454,629643353,"nemo/collections/llm/recipes/test.py",18,0,"",python,selection_keyboard
+455,629653006,"nemo/collections/llm/recipes/test.py",17,1,"",python,content
+456,629653198,"nemo/collections/llm/recipes/test.py",16,1,"",python,content
+457,629653319,"nemo/collections/llm/recipes/test.py",16,0,"t",python,content
+458,629653320,"nemo/collections/llm/recipes/test.py",17,0,"",python,selection_keyboard
+459,629653384,"nemo/collections/llm/recipes/test.py",17,0,"e",python,content
+460,629653386,"nemo/collections/llm/recipes/test.py",18,0,"",python,selection_keyboard
+461,629656987,"nemo/collections/llm/recipes/test.py",17,1,"",python,content
+462,629657152,"nemo/collections/llm/recipes/test.py",16,1,"",python,content
+463,629657157,"nemo/collections/llm/recipes/test.py",16,0,"t",python,content
+464,629657159,"nemo/collections/llm/recipes/test.py",17,0,"",python,selection_keyboard
+465,629657223,"nemo/collections/llm/recipes/test.py",17,0,"e",python,content
+466,629657224,"nemo/collections/llm/recipes/test.py",18,0,"",python,selection_keyboard
+467,629658053,"nemo/collections/llm/recipes/test.py",17,1,"",python,content
+468,629658201,"nemo/collections/llm/recipes/test.py",16,1,"",python,content
+469,629658285,"nemo/collections/llm/recipes/test.py",16,0,"t",python,content
+470,629658286,"nemo/collections/llm/recipes/test.py",17,0,"",python,selection_keyboard
+471,629658377,"nemo/collections/llm/recipes/test.py",17,0,"e",python,content
+472,629658378,"nemo/collections/llm/recipes/test.py",18,0,"",python,selection_keyboard
+473,629733824,"nemo/collections/llm/recipes/test.py",17,1,"",python,content
+474,629733989,"nemo/collections/llm/recipes/test.py",16,1,"",python,content
+475,629734084,"nemo/collections/llm/recipes/test.py",16,0,"t",python,content
+476,629734085,"nemo/collections/llm/recipes/test.py",17,0,"",python,selection_keyboard
+477,629734148,"nemo/collections/llm/recipes/test.py",17,0,"e",python,content
+478,629734151,"nemo/collections/llm/recipes/test.py",18,0,"",python,selection_keyboard
+479,629734936,"nemo/collections/llm/recipes/test.py",17,1,"",python,content
+480,629735099,"nemo/collections/llm/recipes/test.py",16,1,"",python,content
+481,629735183,"nemo/collections/llm/recipes/test.py",16,0,"t",python,content
+482,629735184,"nemo/collections/llm/recipes/test.py",17,0,"",python,selection_keyboard
+483,629735272,"nemo/collections/llm/recipes/test.py",17,0,"e",python,content
+484,629735274,"nemo/collections/llm/recipes/test.py",18,0,"",python,selection_keyboard
+485,629736416,"nemo/collections/llm/recipes/test.py",17,1,"",python,content
+486,629736584,"nemo/collections/llm/recipes/test.py",16,1,"",python,content
+487,629736701,"nemo/collections/llm/recipes/test.py",15,1,"",python,content
+488,629736709,"nemo/collections/llm/recipes/test.py",15,0,"t",python,content
+489,629736711,"nemo/collections/llm/recipes/test.py",16,0,"",python,selection_keyboard
+490,629736867,"nemo/collections/llm/recipes/test.py",16,0,"s",python,content
+491,629736869,"nemo/collections/llm/recipes/test.py",17,0,"",python,selection_keyboard
+492,629736871,"nemo/collections/llm/recipes/test.py",17,0,"t",python,content
+493,629736873,"nemo/collections/llm/recipes/test.py",18,0,"",python,selection_keyboard
+494,629736875,"nemo/collections/llm/recipes/test.py",18,0,"e",python,content
+495,629736876,"nemo/collections/llm/recipes/test.py",19,0,"",python,selection_keyboard
+496,629737379,"nemo/collections/llm/recipes/test.py",18,1,"",python,content
+497,629737534,"nemo/collections/llm/recipes/test.py",17,1,"",python,content
+498,629737684,"nemo/collections/llm/recipes/test.py",16,1,"",python,content
+499,629737833,"nemo/collections/llm/recipes/test.py",15,1,"",python,content
+500,629738093,"nemo/collections/llm/recipes/test.py",15,0,"\n",python,content
+501,629738267,"nemo/collections/llm/recipes/test.py",16,0,"t",python,content
+502,629738269,"nemo/collections/llm/recipes/test.py",17,0,"",python,selection_keyboard
+503,629738320,"nemo/collections/llm/recipes/test.py",17,0,"e",python,content
+504,629738322,"nemo/collections/llm/recipes/test.py",18,0,"",python,selection_keyboard
+505,629834058,"nemo/collections/llm/recipes/test.py",17,1,"",python,content
+506,629834206,"nemo/collections/llm/recipes/test.py",16,1,"",python,content
+507,629834483,"nemo/collections/llm/recipes/test.py",16,0,"t",python,content
+508,629834484,"nemo/collections/llm/recipes/test.py",17,0,"",python,selection_keyboard
+509,629834488,"nemo/collections/llm/recipes/test.py",17,0,"g",python,content
+510,629834489,"nemo/collections/llm/recipes/test.py",18,0,"",python,selection_keyboard
+511,629834565,"nemo/collections/llm/recipes/test.py",18,0,"e",python,content
+512,629834566,"nemo/collections/llm/recipes/test.py",19,0,"",python,selection_keyboard
+513,629835234,"nemo/collections/llm/recipes/test.py",18,1,"",python,content
+514,629835383,"nemo/collections/llm/recipes/test.py",17,1,"",python,content
+515,629835583,"nemo/collections/llm/recipes/test.py",17,0,"e",python,content
+516,629835584,"nemo/collections/llm/recipes/test.py",18,0,"",python,selection_keyboard
+517,629836433,"nemo/collections/llm/recipes/test.py",17,1,"",python,content
+518,629836586,"nemo/collections/llm/recipes/test.py",16,1,"",python,content
+519,629836683,"nemo/collections/llm/recipes/test.py",16,0,"g",python,content
+520,629836695,"nemo/collections/llm/recipes/test.py",17,0,"",python,selection_keyboard
+521,629837625,"nemo/collections/llm/recipes/test.py",16,1,"",python,content
+522,629837716,"nemo/collections/llm/recipes/test.py",16,0,"f",python,content
+523,629837720,"nemo/collections/llm/recipes/test.py",17,0,"",python,selection_keyboard
+524,629839701,"nemo/collections/llm/recipes/test.py",16,1,"",python,content
+525,629839849,"nemo/collections/llm/recipes/test.py",16,0,"f",python,content
+526,629839850,"nemo/collections/llm/recipes/test.py",17,0,"",python,selection_keyboard
+527,629846085,"nemo/collections/llm/recipes/test.py",16,1,"",python,content
+528,629847042,"nemo/collections/llm/recipes/test.py",16,0,"t",python,content
+529,629847044,"nemo/collections/llm/recipes/test.py",17,0,"",python,selection_keyboard
+530,629847131,"nemo/collections/llm/recipes/test.py",17,0,"e",python,content
+531,629847133,"nemo/collections/llm/recipes/test.py",18,0,"",python,selection_keyboard
+532,629847290,"nemo/collections/llm/recipes/test.py",18,0,"s",python,content
+533,629847291,"nemo/collections/llm/recipes/test.py",19,0,"",python,selection_keyboard
+534,629847296,"nemo/collections/llm/recipes/test.py",19,0,"t",python,content
+535,629847298,"nemo/collections/llm/recipes/test.py",20,0,"",python,selection_keyboard
+536,629847483,"nemo/collections/llm/recipes/test.py",20,0,".",python,content
+537,629847485,"nemo/collections/llm/recipes/test.py",21,0,"",python,selection_keyboard
+538,629848691,"nemo/collections/llm/recipes/test.py",21,0,"l",python,content
+539,629848693,"nemo/collections/llm/recipes/test.py",22,0,"",python,selection_keyboard
+540,629848702,"nemo/collections/llm/recipes/test.py",22,0,"e",python,content
+541,629848703,"nemo/collections/llm/recipes/test.py",23,0,"",python,selection_keyboard
+542,629875217,"nemo/collections/llm/recipes/test.py",22,1,"",python,content
+543,629875382,"nemo/collections/llm/recipes/test.py",21,1,"",python,content
+544,629875667,"nemo/collections/llm/recipes/test.py",21,0,"_",python,content
+545,629875669,"nemo/collections/llm/recipes/test.py",22,0,"",python,selection_keyboard
diff --git a/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-48c538cb-afb7-4d8a-9661-a0080fbfb4e21762177507769-2025_11_03-14.45.14.692/source.csv b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-48c538cb-afb7-4d8a-9661-a0080fbfb4e21762177507769-2025_11_03-14.45.14.692/source.csv
new file mode 100644
index 0000000000000000000000000000000000000000..16c065800057ed1dabd78700085df28e87d9adf7
--- /dev/null
+++ b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-48c538cb-afb7-4d8a-9661-a0080fbfb4e21762177507769-2025_11_03-14.45.14.692/source.csv
@@ -0,0 +1,130 @@
+Sequence,Time,File,RangeOffset,RangeLength,Text,Language,Type
+1,3,"src/extension/completions-core/vscode-node/lib/src/postInsertion.ts",0,0,"/*---------------------------------------------------------------------------------------------\n * Copyright (c) Microsoft Corporation. All rights reserved.\n * Licensed under the MIT License. See License.txt in the project root for license information.\n *--------------------------------------------------------------------------------------------*/\nimport { IInstantiationService, ServicesAccessor } from '../../../../../util/vs/platform/instantiation/common/instantiation';\nimport { ICompletionsTelemetryService } from '../../bridge/src/completionsTelemetryServiceBridge';\nimport { CopilotTokenManager } from './auth/copilotTokenManager';\nimport { ChangeTracker } from './changeTracker';\nimport { CitationManager, IPCitationDetail } from './citationManager';\nimport { createCompletionState } from './completionState';\nimport { ICompletionsContextService } from './context';\nimport { FileReader } from './fileReader';\nimport { PostInsertionCategory, telemetryAccepted, telemetryRejected } from './ghostText/telemetry';\nimport { LogTarget, Logger } from './logger';\nimport { Fetcher } from './networking';\nimport { CopilotNamedAnnotationList } from './openai/stream';\nimport { contextIndentationFromText, indentationBlockFinished } from './prompt/parseBlock';\nimport { Prompt, extractPrompt } from './prompt/prompt';\nimport { fetchCitations } from './snippy/handlePostInsertion';\nimport { editDistance, lexEditDistance } from './suggestions/editDistance';\nimport { SuggestionStatus, computeCompletionText } from './suggestions/partialSuggestions';\nimport { TelemetryStore, TelemetryWithExp, telemetry, telemetryCatch } from './telemetry';\nimport { TextDocumentManager } from './textDocumentManager';\nimport { ICompletionsPromiseQueueService } from './util/promiseQueue';\nimport { ICompletionsRuntimeModeService } from './util/runtimeMode';\n\nconst postInsertionLogger = new 
Logger('postInsertion');\n\ntype Timeout = {\n\tseconds: number;\n\tcaptureCode: boolean;\n\tcaptureRejection: boolean;\n};\n// windows for telemetry checks, in seconds\n// captureCode = capture the code after acceptance,\n// captureRejection = capture the code after rejection\nconst captureTimeouts: Timeout[] = [\n\t{ seconds: 15, captureCode: false, captureRejection: false },\n\t{ seconds: 30, captureCode: true, captureRejection: true },\n\t{ seconds: 120, captureCode: false, captureRejection: false },\n\t{ seconds: 300, captureCode: false, captureRejection: false },\n\t{ seconds: 600, captureCode: false, captureRejection: false },\n];\n\n// No. of chars before/after insertion point to look for the completion\nconst stillInCodeNearMargin = 50;\nconst stillInCodeFarMargin = 1500;\n\n// If lex edit distance is below this fraction of completion length it is considered\n// in the code\nconst stillInCodeFraction = 0.5;\n\n// Number of characters captured after the insertion point.\n// Used only if we couldn't detect termination point with indent-based parsing.\nconst captureCodeMargin = 500;\n\nconst postInsertConfiguration: {\n\ttriggerPostInsertionSynchroneously: boolean;\n\tcaptureCode: boolean;\n\tcaptureRejection: boolean;\n} = {\n\ttriggerPostInsertionSynchroneously: false,\n\tcaptureCode: false,\n\tcaptureRejection: false,\n};\n\nasync function captureCode(\n\taccessor: ServicesAccessor,\n\turi: string,\n\tcompletionTelemetry: TelemetryWithExp,\n\toffset: number,\n\tsuffixOffset?: number\n): Promise<{ prompt: Prompt; capturedCode: string; terminationOffset: number }> {\n\tconst ctx = accessor.get(ICompletionsContextService);\n\tconst instantiationService = accessor.get(IInstantiationService);\n\tconst logTarget = ctx.get(LogTarget);\n\tconst result = await ctx.get(FileReader).getOrReadTextDocumentWithFakeClientProperties({ uri });\n\tif (result.status !== 'valid') {\n\t\tpostInsertionLogger.info(logTarget, `Could not get document for ${uri}. 
Maybe it was closed by the editor.`);\n\t\treturn {\n\t\t\tprompt: {\n\t\t\t\tprefix: '',\n\t\t\t\tsuffix: '',\n\t\t\t\tisFimEnabled: false,\n\t\t\t},\n\t\t\tcapturedCode: '',\n\t\t\tterminationOffset: 0,\n\t\t};\n\t}\n\tconst document = result.document;\n\tconst documentText = document.getText();\n\tconst documentTextBefore = documentText.substring(0, offset);\n\tconst position = document.positionAt(offset);\n\n\t// Treat the code before offset as the hypothetical prompt\n\tconst hypotheticalPromptResponse = await instantiationService.invokeFunction(extractPrompt,\n\t\tcompletionTelemetry.properties.headerRequestId,\n\t\tcreateCompletionState(document, position),\n\t\tcompletionTelemetry\n\t);\n\tconst hypotheticalPrompt =\n\t\thypotheticalPromptResponse.type === 'prompt'\n\t\t\t? hypotheticalPromptResponse.prompt\n\t\t\t: {\n\t\t\t\tprefix: documentTextBefore,\n\t\t\t\tsuffix: '',\n\t\t\t\tisFimEnabled: false,\n\t\t\t}; // TODO(eaftan): Pass an actual suffix when we're ready to support it\n\n\tif (hypotheticalPrompt.isFimEnabled && suffixOffset !== undefined) {\n\t\t// With FIM enabled, we can exactly determine capturedCode, suffix and prefix by properly initialized trackers. 
No need to guess.\n\t\tconst capturedCode = documentText.substring(offset, suffixOffset);\n\t\thypotheticalPrompt.suffix = documentText.substring(suffixOffset);\n\n\t\treturn { prompt: hypotheticalPrompt, capturedCode, terminationOffset: 0 };\n\t} else {\n\t\t//Everything after the insertion point is hypothetical response we could get from AI\n\t\tconst hypotheticalResponse = documentText.substring(offset);\n\n\t\t//Try to find the termination offset in the hypothetical response using indentation based parsing\n\t\tconst contextIndent = contextIndentationFromText(documentTextBefore, offset, document.detectedLanguageId);\n\t\tconst indentTerminationFunction = indentationBlockFinished(contextIndent, undefined);\n\t\tconst terminationResult = indentTerminationFunction(hypotheticalResponse);\n\n\t\t//If we could detect termination of the indentation block, capture 2x length of detected suggestion\n\t\t//Otherwise capture a lot of characters\n\t\tconst maxOffset = Math.min(\n\t\t\tdocumentText.length,\n\t\t\toffset + (terminationResult ? terminationResult * 2 : captureCodeMargin)\n\t\t);\n\n\t\tconst capturedCode = documentText.substring(offset, maxOffset);\n\n\t\treturn { prompt: hypotheticalPrompt, capturedCode, terminationOffset: terminationResult ?? 
-1 };\n\t}\n}\n\nexport function postRejectionTasks(\n\taccessor: ServicesAccessor,\n\tinsertionCategory: PostInsertionCategory,\n\tinsertionOffset: number,\n\turi: string,\n\tcompletions: { completionText: string; completionTelemetryData: TelemetryWithExp }[]\n) {\n\tconst logTarget = accessor.get(ICompletionsContextService).get(LogTarget);\n\tconst instantiationService = accessor.get(IInstantiationService);\n\tconst telemetryService = accessor.get(ICompletionsTelemetryService);\n\tconst promiseQueueService = accessor.get(ICompletionsPromiseQueueService);\n\n\t//Send `.rejected` telemetry event for each rejected completion\n\tcompletions.forEach(({ completionText, completionTelemetryData }) => {\n\t\tpostInsertionLogger.debug(\n\t\t\tlogTarget,\n\t\t\t`${insertionCategory}.rejected choiceIndex: ${completionTelemetryData.properties.choiceIndex}`\n\t\t);\n\t\tinstantiationService.invokeFunction(telemetryRejected, insertionCategory, completionTelemetryData);\n\n\t\t// Fire-and-forget external logging of rejected completion\n\t\tvoid (async () => {\n\t\t\ttry {\n\t\t\t\tconst url = (globalThis as any).process?.env?.CROWD_CODE_API_GATEWAY_URL as (string | undefined);\n\t\t\t\tif (!url) { return; }\n\t\t\t\tconst { prompt } = await instantiationService.invokeFunction(captureCode,\n\t\t\t\t\turi,\n\t\t\t\t\tcompletionTelemetryData,\n\t\t\t\t\tinsertionOffset\n\t\t\t\t);\n\t\t\t\tconst fetcher = accessor.get(ICompletionsContextService).get(Fetcher);\n\t\t\t\tconst payload = {\n\t\t\t\t\ttype: 'tab',\n\t\t\t\t\tevent: 'rejected',\n\t\t\t\t\trequestId: completionTelemetryData.properties.headerRequestId,\n\t\t\t\t\tchoiceIndex: completionTelemetryData.properties.choiceIndex,\n\t\t\t\t\turi,\n\t\t\t\t\tinsertionOffset,\n\t\t\t\t\tprompt,\n\t\t\t\t\ttext: completionText\n\t\t\t\t} as any;\n\t\t\t\tawait fetcher.fetch(url, { method: 'POST', headers: { 'Content-Type': 'application/json' }, json: payload });\n\t\t\t} catch (e) {\n\t\t\t\tpostInsertionLogger.debug(logTarget, 
'External tab log (rejected) failed');\n\t\t\t}\n\t\t})();\n\t});\n\tconst positionTracker = instantiationService.createInstance(ChangeTracker, uri, insertionOffset - 1);\n\tconst suffixTracker = instantiationService.createInstance(ChangeTracker, uri, insertionOffset);\n\n\tconst checkInCode = async (t: Timeout) => {\n\t\tpostInsertionLogger.debug(\n\t\t\tlogTarget,\n\t\t\t`Original offset: ${insertionOffset}, Tracked offset: ${positionTracker.offset}`\n\t\t);\n\t\tconst { completionTelemetryData } = completions[0];\n\n\t\tconst { prompt, capturedCode, terminationOffset } = await instantiationService.invokeFunction(captureCode,\n\t\t\turi,\n\t\t\tcompletionTelemetryData,\n\t\t\tpositionTracker.offset + 1,\n\t\t\tsuffixTracker.offset\n\t\t);\n\n\t\tconst promptTelemetry = {\n\t\t\thypotheticalPromptJson: JSON.stringify({ prefix: prompt.prefix, context: prompt.context }),\n\t\t\thypotheticalPromptSuffixJson: JSON.stringify(prompt.suffix),\n\t\t};\n\n\t\tconst customTelemetryData = completionTelemetryData.extendedBy(\n\t\t\t{\n\t\t\t\t...promptTelemetry,\n\t\t\t\tcapturedCodeJson: JSON.stringify(capturedCode),\n\t\t\t},\n\t\t\t{\n\t\t\t\ttimeout: t.seconds,\n\t\t\t\tinsertionOffset: insertionOffset,\n\t\t\t\ttrackedOffset: positionTracker.offset,\n\t\t\t\tterminationOffsetInCapturedCode: terminationOffset,\n\t\t\t}\n\t\t);\n\t\tpostInsertionLogger.debug(\n\t\t\tlogTarget,\n\t\t\t`${insertionCategory}.capturedAfterRejected choiceIndex: ${completionTelemetryData.properties.choiceIndex}`,\n\t\t\tcustomTelemetryData\n\t\t);\n\t\tinstantiationService.invokeFunction(telemetry, insertionCategory + '.capturedAfterRejected', customTelemetryData, TelemetryStore.Enhanced);\n\t};\n\t// Capture the code typed after we detected that completion was rejected,\n\t// Uses first displayed completion as the source/seed of telemetry information.\n\tcaptureTimeouts\n\t\t.filter(t => t.captureRejection)\n\t\t.map(t =>\n\t\t\tpositionTracker.push(\n\t\t\t\ttelemetryCatch(telemetryService, 
promiseQueueService, () => checkInCode(t), 'postRejectionTasks'),\n\t\t\t\tt.seconds * 1000\n\t\t\t)\n\t\t);\n}\n\nexport function postInsertionTasks(\n\taccessor: ServicesAccessor,\n\tinsertionCategory: PostInsertionCategory,\n\tcompletionText: string,\n\tinsertionOffset: number,\n\turi: string,\n\ttelemetryData: TelemetryWithExp,\n\tsuggestionStatus: SuggestionStatus,\n\tcopilotAnnotations?: CopilotNamedAnnotationList\n) {\n\tconst logTarget = accessor.get(ICompletionsContextService).get(LogTarget);\n\tconst instantiationService = accessor.get(IInstantiationService);\n\tconst promiseQueueService = accessor.get(ICompletionsPromiseQueueService);\n\tconst telemetryService = accessor.get(ICompletionsTelemetryService);\n\tconst runtimeModeService = accessor.get(ICompletionsRuntimeModeService);\n\n\tconst telemetryDataWithStatus = telemetryData.extendedBy(\n\t\t{\n\t\t\tcompType: suggestionStatus.compType,\n\t\t},\n\t\t{\n\t\t\tcompCharLen: suggestionStatus.acceptedLength,\n\t\t\tnumLines: suggestionStatus.acceptedLines,\n\t\t}\n\t);\n\t// send "".accepted"" telemetry\n\tpostInsertionLogger.debug(\n\t\tlogTarget,\n\t\t`${insertionCategory}.accepted choiceIndex: ${telemetryDataWithStatus.properties.choiceIndex}`\n\t);\n\tinstantiationService.invokeFunction(telemetryAccepted, insertionCategory, telemetryDataWithStatus);\n\n\tconst fullCompletionText = completionText;\n\tcompletionText = computeCompletionText(completionText, suggestionStatus);\n\tconst trimmedCompletion = completionText.trim();\n\tconst tracker = instantiationService.createInstance(ChangeTracker, uri, insertionOffset);\n\tconst suffixTracker = instantiationService.createInstance(ChangeTracker, uri, insertionOffset + completionText.length);\n\n\t// Fire-and-forget external logging of accepted completion with prompt input\n\tvoid (async () => {\n\t\ttry {\n\t\t\tconst url = (globalThis as any).process?.env?.CROWD_CODE_API_GATEWAY_URL as (string | undefined);\n\t\t\tif (!url) { return; }\n\t\t\tconst { 
prompt } = await instantiationService.invokeFunction(captureCode,\n\t\t\t\turi,\n\t\t\t\ttelemetryDataWithStatus,\n\t\t\t\tinsertionOffset,\n\t\t\t\tsuffixTracker.offset\n\t\t\t);\n\t\t\tconst fetcher = accessor.get(ICompletionsContextService).get(Fetcher);\n\t\t\tconst payload = {\n\t\t\t\ttype: 'tab',\n\t\t\t\tevent: 'accepted',\n\t\t\t\trequestId: telemetryDataWithStatus.properties.headerRequestId,\n\t\t\t\tchoiceIndex: telemetryDataWithStatus.properties.choiceIndex,\n\t\t\t\turi,\n\t\t\t\tinsertionOffset,\n\t\t\t\tprompt,\n\t\t\t\taccepted: {\n\t\t\t\t\ttext: fullCompletionText,\n\t\t\t\t\tacceptedLength: suggestionStatus.acceptedLength,\n\t\t\t\t\tacceptedLines: suggestionStatus.acceptedLines\n\t\t\t\t}\n\t\t\t} as any;\n\t\t\tawait fetcher.fetch(url, { method: 'POST', headers: { 'Content-Type': 'application/json' }, json: payload });\n\t\t} catch (e) {\n\t\t\tpostInsertionLogger.debug(logTarget, 'External tab log (accepted) failed');\n\t\t}\n\t})();\n\n\tconst stillInCodeCheck = async (timeout: Timeout) => {\n\t\tconst check = instantiationService.invokeFunction(checkStillInCode,\n\t\t\tinsertionCategory,\n\t\t\ttrimmedCompletion,\n\t\t\tinsertionOffset,\n\t\t\turi,\n\t\t\ttimeout,\n\t\t\ttelemetryDataWithStatus,\n\t\t\ttracker,\n\t\t\tsuffixTracker\n\t\t);\n\t\tawait check;\n\t};\n\n\t// For test purposes, we add one set of these telemetry events synchronously to allow asserting the telemetry\n\tif (postInsertConfiguration.triggerPostInsertionSynchroneously && runtimeModeService.isRunningInTest()) {\n\t\tconst check = stillInCodeCheck({\n\t\t\tseconds: 0,\n\t\t\tcaptureCode: postInsertConfiguration.captureCode,\n\t\t\tcaptureRejection: postInsertConfiguration.captureRejection,\n\t\t});\n\t\tpromiseQueueService.register(check);\n\t} else {\n\t\tcaptureTimeouts.map(timeout =>\n\t\t\ttracker.push(\n\t\t\t\ttelemetryCatch(telemetryService, promiseQueueService, () => stillInCodeCheck(timeout), 'postInsertionTasks'),\n\t\t\t\ttimeout.seconds * 
1000\n\t\t\t)\n\t\t);\n\t}\n\n\tinstantiationService.invokeFunction(acc => telemetryCatch(telemetryService, promiseQueueService, citationCheck, 'post insertion citation check')(\n\t\tacc,\n\t\turi,\n\t\tfullCompletionText,\n\t\tcompletionText,\n\t\tinsertionOffset,\n\t\tcopilotAnnotations\n\t));\n}\n\nasync function citationCheck(\n\taccessor: ServicesAccessor,\n\turi: string,\n\tfullCompletionText: string,\n\tinsertedText: string,\n\tinsertionOffset: number,\n\tcopilotAnnotations?: CopilotNamedAnnotationList\n) {\n\tconst logTarget = accessor.get(ICompletionsContextService).get(LogTarget);\n\tconst textDocumentManagerService = accessor.get(ICompletionsContextService).get(TextDocumentManager);\n\tconst copilotTokenManager = accessor.get(ICompletionsContextService).get(CopilotTokenManager);\n\tconst citationManagerService = accessor.get(ICompletionsContextService).get(CitationManager);\n\n\t// If there are no citations, request directly from the snippy service\n\tif (!copilotAnnotations || (copilotAnnotations.ip_code_citations?.length ?? 0) < 1) {\n\t\t// Do not request citations if in blocking mode\n\t\tif (copilotTokenManager.getLastToken()?.getTokenValue('sn') === '1') { return; }\n\t\tawait fetchCitations(accessor, uri, insertedText, insertionOffset);\n\t\treturn;\n\t}\n\n\tconst doc = await textDocumentManagerService.getTextDocument({ uri });\n\n\t// in the CLS, if the editor does not wait to send document updates until the\n\t// acceptance function returns, we could be in a race condition with ongoing\n\t// edits. 
This searches for the completion text so that hopefully we're providing\n\t// an exact location in a known version of the document.\n\tif (doc) {\n\t\tconst found = find(doc.getText(), insertedText, stillInCodeNearMargin, insertionOffset);\n\t\tif (found.stillInCodeHeuristic) {\n\t\t\tinsertionOffset = found.foundOffset;\n\t\t}\n\t}\n\n\tfor (const citation of copilotAnnotations.ip_code_citations) {\n\t\tconst citationStart = computeCitationStart(\n\t\t\tfullCompletionText.length,\n\t\t\tinsertedText.length,\n\t\t\tcitation.start_offset\n\t\t);\n\t\tif (citationStart === undefined) {\n\t\t\tpostInsertionLogger.info(\n\t\t\t\tlogTarget,\n\t\t\t\t`Full completion for ${uri} contains a reference matching public code, but the partially inserted text did not include the match.`\n\t\t\t);\n\t\t\tcontinue;\n\t\t}\n\t\tconst offsetStart = insertionOffset + citationStart;\n\t\tconst start = doc?.positionAt(offsetStart);\n\t\tconst offsetEnd =\n\t\t\tinsertionOffset + computeCitationEnd(fullCompletionText.length, insertedText.length, citation.stop_offset);\n\t\tconst end = doc?.positionAt(offsetEnd);\n\t\tconst text = start && end ? doc?.getText({ start, end }) : '';\n\n\t\tawait citationManagerService.handleIPCodeCitation({\n\t\t\tinDocumentUri: uri,\n\t\t\toffsetStart,\n\t\t\toffsetEnd,\n\t\t\tversion: doc?.version,\n\t\t\tlocation: start && end ? 
{ start, end } : undefined,\n\t\t\tmatchingText: text,\n\t\t\tdetails: citation.details.citations as IPCitationDetail[],\n\t\t});\n\t}\n}\n\nfunction computeCitationStart(\n\tcompletionLength: number,\n\tinsertedLength: number,\n\tcitationStartOffset: number\n): number | undefined {\n\tif (insertedLength < completionLength && citationStartOffset > insertedLength) {\n\t\treturn undefined;\n\t}\n\treturn citationStartOffset;\n}\n\nfunction computeCitationEnd(completionLength: number, insertedLength: number, citationStopOffset: number): number {\n\tif (insertedLength < completionLength) {\n\t\treturn Math.min(citationStopOffset, insertedLength);\n\t}\n\treturn citationStopOffset;\n}\n\nfunction find(documentText: string, completion: string, margin: number, offset: number) {\n\t// Compute the best alignment between a window of the document text and the completion\n\tconst window = documentText.substring(\n\t\tMath.max(0, offset - margin),\n\t\tMath.min(documentText.length, offset + completion.length + margin)\n\t);\n\tconst lexAlignment = lexEditDistance(window, completion);\n\tconst fraction = lexAlignment.lexDistance / lexAlignment.needleLexLength;\n\tconst { distance: charEditDistance } = editDistance(\n\t\twindow.substring(lexAlignment.startOffset, lexAlignment.endOffset),\n\t\tcompletion\n\t);\n\treturn {\n\t\trelativeLexEditDistance: fraction,\n\t\tcharEditDistance,\n\t\tcompletionLexLength: lexAlignment.needleLexLength,\n\t\tfoundOffset: lexAlignment.startOffset + Math.max(0, offset - margin),\n\t\tlexEditDistance: lexAlignment.lexDistance,\n\t\tstillInCodeHeuristic: fraction <= stillInCodeFraction ? 
1 : 0,\n\t};\n}\n\nasync function checkStillInCode(\n\taccessor: ServicesAccessor,\n\tinsertionCategory: string,\n\tcompletion: string,\n\tinsertionOffset: number, // offset where the completion was inserted to\n\turi: string,\n\ttimeout: Timeout,\n\ttelemetryData: TelemetryWithExp,\n\ttracker: ChangeTracker,\n\tsuffixTracker: ChangeTracker\n) {\n\t// Get contents of file from file system\n\tconst ctx = accessor.get(ICompletionsContextService);\n\tconst instantiationService = accessor.get(IInstantiationService);\n\tconst logTarget = ctx.get(LogTarget);\n\tconst result = await ctx.get(FileReader).getOrReadTextDocument({ uri });\n\tif (result.status === 'valid') {\n\t\tconst document = result.document;\n\t\tconst documentText = document.getText();\n\n\t\t// We try twice, first very close to the insertion point, then a bit\n\t\t// further. This is to increase accuracy for short completions,\n\t\t// where the completion might appear elsewhere.\n\t\tlet finding = find(documentText, completion, stillInCodeNearMargin, tracker.offset);\n\t\tif (!finding.stillInCodeHeuristic) {\n\t\t\tfinding = find(documentText, completion, stillInCodeFarMargin, tracker.offset);\n\t\t}\n\t\t// Debug and log a binary decision\n\t\tpostInsertionLogger.debug(\n\t\t\tlogTarget,\n\t\t\t`stillInCode: ${finding.stillInCodeHeuristic ? 'Found' : 'Not found'}! Completion '${completion}' in file ${uri\n\t\t\t}. lexEditDistance fraction was ${finding.relativeLexEditDistance}. Char edit distance was ${finding.charEditDistance\n\t\t\t}. Inserted at ${insertionOffset}, tracked at ${tracker.offset}, found at ${finding.foundOffset\n\t\t\t}. 
choiceIndex: ${telemetryData.properties.choiceIndex}`\n\t\t);\n\t\t// Log all the details for analysis\n\t\tconst customTelemetryData = telemetryData\n\t\t\t.extendedBy({}, { timeout: timeout.seconds, insertionOffset: insertionOffset, trackedOffset: tracker.offset })\n\t\t\t.extendedBy({}, finding);\n\t\tinstantiationService.invokeFunction(telemetry, insertionCategory + '.stillInCode', customTelemetryData);\n\n\t\tif (timeout.captureCode) {\n\t\t\tconst { prompt, capturedCode, terminationOffset } = await instantiationService.invokeFunction(\n\t\t\t\tcaptureCode,\n\t\t\t\turi,\n\t\t\t\tcustomTelemetryData,\n\t\t\t\ttracker.offset,\n\t\t\t\tsuffixTracker.offset\n\t\t\t);\n\t\t\tconst promptTelemetry = {\n\t\t\t\thypotheticalPromptJson: JSON.stringify({ prefix: prompt.prefix, context: prompt.context }),\n\t\t\t\thypotheticalPromptSuffixJson: JSON.stringify(prompt.suffix),\n\t\t\t};\n\n\t\t\tconst afterAcceptedTelemetry = telemetryData.extendedBy(\n\t\t\t\t{\n\t\t\t\t\t...promptTelemetry,\n\t\t\t\t\tcapturedCodeJson: JSON.stringify(capturedCode),\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\ttimeout: timeout.seconds,\n\t\t\t\t\tinsertionOffset: insertionOffset,\n\t\t\t\t\ttrackedOffset: tracker.offset,\n\t\t\t\t\tterminationOffsetInCapturedCode: terminationOffset,\n\t\t\t\t}\n\t\t\t);\n\t\t\tpostInsertionLogger.debug(\n\t\t\t\tlogTarget,\n\t\t\t\t`${insertionCategory}.capturedAfterAccepted choiceIndex: ${telemetryData.properties.choiceIndex}`,\n\t\t\t\tcustomTelemetryData\n\t\t\t);\n\t\t\tinstantiationService.invokeFunction(\n\t\t\t\ttelemetry,\n\t\t\t\tinsertionCategory + '.capturedAfterAccepted',\n\t\t\t\tafterAcceptedTelemetry,\n\t\t\t\tTelemetryStore.Enhanced\n\t\t\t);\n\t\t}\n\t}\n}\n",typescript,tab
+2,453,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"2:45:14 PM [info] Activating crowd-code\n2:45:14 PM [info] Recording started\n2:45:14 PM [info] Initializing git provider using file system watchers...\n2:45:14 PM [info] Git repository found\n2:45:14 PM [info] Git provider initialized successfully\n2:45:14 PM [info] Initial git state: [object Object]\n",Log,tab
+3,1807,"src/extension/completions-core/vscode-node/lib/src/postInsertion.ts",0,0,"",typescript,tab
+4,5024,"src/extension/completions-core/vscode-node/lib/src/postInsertion.ts",1068,11397,"import { CopilotNamedAnnotationList } from './openai/stream';\nimport { contextIndentationFromText, indentationBlockFinished } from './prompt/parseBlock';\nimport { Prompt, extractPrompt } from './prompt/prompt';\nimport { fetchCitations } from './snippy/handlePostInsertion';\nimport { editDistance, lexEditDistance } from './suggestions/editDistance';\nimport { SuggestionStatus, computeCompletionText } from './suggestions/partialSuggestions';\nimport { TelemetryStore, TelemetryWithExp, telemetry, telemetryCatch } from './telemetry';\nimport { TextDocumentManager } from './textDocumentManager';\nimport { ICompletionsPromiseQueueService } from './util/promiseQueue';\nimport { ICompletionsRuntimeModeService } from './util/runtimeMode';\n\nconst postInsertionLogger = new Logger('postInsertion');\n\ntype Timeout = {\n\tseconds: number;\n\tcaptureCode: boolean;\n\tcaptureRejection: boolean;\n};\n// windows for telemetry checks, in seconds\n// captureCode = capture the code after acceptance,\n// captureRejection = capture the code after rejection\nconst captureTimeouts: Timeout[] = [\n\t{ seconds: 15, captureCode: false, captureRejection: false },\n\t{ seconds: 30, captureCode: true, captureRejection: true },\n\t{ seconds: 120, captureCode: false, captureRejection: false },\n\t{ seconds: 300, captureCode: false, captureRejection: false },\n\t{ seconds: 600, captureCode: false, captureRejection: false },\n];\n\n// No. 
of chars before/after insertion point to look for the completion\nconst stillInCodeNearMargin = 50;\nconst stillInCodeFarMargin = 1500;\n\n// If lex edit distance is below this fraction of completion length it is considered\n// in the code\nconst stillInCodeFraction = 0.5;\n\n// Number of characters captured after the insertion point.\n// Used only if we couldn't detect termination point with indent-based parsing.\nconst captureCodeMargin = 500;\n\nconst postInsertConfiguration: {\n\ttriggerPostInsertionSynchroneously: boolean;\n\tcaptureCode: boolean;\n\tcaptureRejection: boolean;\n} = {\n\ttriggerPostInsertionSynchroneously: false,\n\tcaptureCode: false,\n\tcaptureRejection: false,\n};\n\nasync function captureCode(\n\taccessor: ServicesAccessor,\n\turi: string,\n\tcompletionTelemetry: TelemetryWithExp,\n\toffset: number,\n\tsuffixOffset?: number\n): Promise<{ prompt: Prompt; capturedCode: string; terminationOffset: number }> {\n\tconst ctx = accessor.get(ICompletionsContextService);\n\tconst instantiationService = accessor.get(IInstantiationService);\n\tconst logTarget = ctx.get(LogTarget);\n\tconst result = await ctx.get(FileReader).getOrReadTextDocumentWithFakeClientProperties({ uri });\n\tif (result.status !== 'valid') {\n\t\tpostInsertionLogger.info(logTarget, `Could not get document for ${uri}. 
Maybe it was closed by the editor.`);\n\t\treturn {\n\t\t\tprompt: {\n\t\t\t\tprefix: '',\n\t\t\t\tsuffix: '',\n\t\t\t\tisFimEnabled: false,\n\t\t\t},\n\t\t\tcapturedCode: '',\n\t\t\tterminationOffset: 0,\n\t\t};\n\t}\n\tconst document = result.document;\n\tconst documentText = document.getText();\n\tconst documentTextBefore = documentText.substring(0, offset);\n\tconst position = document.positionAt(offset);\n\n\t// Treat the code before offset as the hypothetical prompt\n\tconst hypotheticalPromptResponse = await instantiationService.invokeFunction(extractPrompt,\n\t\tcompletionTelemetry.properties.headerRequestId,\n\t\tcreateCompletionState(document, position),\n\t\tcompletionTelemetry\n\t);\n\tconst hypotheticalPrompt =\n\t\thypotheticalPromptResponse.type === 'prompt'\n\t\t\t? hypotheticalPromptResponse.prompt\n\t\t\t: {\n\t\t\t\tprefix: documentTextBefore,\n\t\t\t\tsuffix: '',\n\t\t\t\tisFimEnabled: false,\n\t\t\t}; // TODO(eaftan): Pass an actual suffix when we're ready to support it\n\n\tif (hypotheticalPrompt.isFimEnabled && suffixOffset !== undefined) {\n\t\t// With FIM enabled, we can exactly determine capturedCode, suffix and prefix by properly initialized trackers. 
No need to guess.\n\t\tconst capturedCode = documentText.substring(offset, suffixOffset);\n\t\thypotheticalPrompt.suffix = documentText.substring(suffixOffset);\n\n\t\treturn { prompt: hypotheticalPrompt, capturedCode, terminationOffset: 0 };\n\t} else {\n\t\t//Everything after the insertion point is hypothetical response we could get from AI\n\t\tconst hypotheticalResponse = documentText.substring(offset);\n\n\t\t//Try to find the termination offset in the hypothetical response using indentation based parsing\n\t\tconst contextIndent = contextIndentationFromText(documentTextBefore, offset, document.detectedLanguageId);\n\t\tconst indentTerminationFunction = indentationBlockFinished(contextIndent, undefined);\n\t\tconst terminationResult = indentTerminationFunction(hypotheticalResponse);\n\n\t\t//If we could detect termination of the indentation block, capture 2x length of detected suggestion\n\t\t//Otherwise capture a lot of characters\n\t\tconst maxOffset = Math.min(\n\t\t\tdocumentText.length,\n\t\t\toffset + (terminationResult ? terminationResult * 2 : captureCodeMargin)\n\t\t);\n\n\t\tconst capturedCode = documentText.substring(offset, maxOffset);\n\n\t\treturn { prompt: hypotheticalPrompt, capturedCode, terminationOffset: terminationResult ?? 
-1 };\n\t}\n}\n\nexport function postRejectionTasks(\n\taccessor: ServicesAccessor,\n\tinsertionCategory: PostInsertionCategory,\n\tinsertionOffset: number,\n\turi: string,\n\tcompletions: { completionText: string; completionTelemetryData: TelemetryWithExp }[]\n) {\n\tconst logTarget = accessor.get(ICompletionsContextService).get(LogTarget);\n\tconst instantiationService = accessor.get(IInstantiationService);\n\tconst telemetryService = accessor.get(ICompletionsTelemetryService);\n\tconst promiseQueueService = accessor.get(ICompletionsPromiseQueueService);\n\n\t//Send `.rejected` telemetry event for each rejected completion\n\tcompletions.forEach(({ completionText, completionTelemetryData }) => {\n\t\tpostInsertionLogger.debug(\n\t\t\tlogTarget,\n\t\t\t`${insertionCategory}.rejected choiceIndex: ${completionTelemetryData.properties.choiceIndex}`\n\t\t);\n\t\tinstantiationService.invokeFunction(telemetryRejected, insertionCategory, completionTelemetryData);\n\t});\n\tconst positionTracker = instantiationService.createInstance(ChangeTracker, uri, insertionOffset - 1);\n\tconst suffixTracker = instantiationService.createInstance(ChangeTracker, uri, insertionOffset);\n\n\tconst checkInCode = async (t: Timeout) => {\n\t\tpostInsertionLogger.debug(\n\t\t\tlogTarget,\n\t\t\t`Original offset: ${insertionOffset}, Tracked offset: ${positionTracker.offset}`\n\t\t);\n\t\tconst { completionTelemetryData } = completions[0];\n\n\t\tconst { prompt, capturedCode, terminationOffset } = await instantiationService.invokeFunction(captureCode,\n\t\t\turi,\n\t\t\tcompletionTelemetryData,\n\t\t\tpositionTracker.offset + 1,\n\t\t\tsuffixTracker.offset\n\t\t);\n\n\t\tconst promptTelemetry = {\n\t\t\thypotheticalPromptJson: JSON.stringify({ prefix: prompt.prefix, context: prompt.context }),\n\t\t\thypotheticalPromptSuffixJson: JSON.stringify(prompt.suffix),\n\t\t};\n\n\t\tconst customTelemetryData = 
completionTelemetryData.extendedBy(\n\t\t\t{\n\t\t\t\t...promptTelemetry,\n\t\t\t\tcapturedCodeJson: JSON.stringify(capturedCode),\n\t\t\t},\n\t\t\t{\n\t\t\t\ttimeout: t.seconds,\n\t\t\t\tinsertionOffset: insertionOffset,\n\t\t\t\ttrackedOffset: positionTracker.offset,\n\t\t\t\tterminationOffsetInCapturedCode: terminationOffset,\n\t\t\t}\n\t\t);\n\t\tpostInsertionLogger.debug(\n\t\t\tlogTarget,\n\t\t\t`${insertionCategory}.capturedAfterRejected choiceIndex: ${completionTelemetryData.properties.choiceIndex}`,\n\t\t\tcustomTelemetryData\n\t\t);\n\t\tinstantiationService.invokeFunction(telemetry, insertionCategory + '.capturedAfterRejected', customTelemetryData, TelemetryStore.Enhanced);\n\t};\n\t// Capture the code typed after we detected that completion was rejected,\n\t// Uses first displayed completion as the source/seed of telemetry information.\n\tcaptureTimeouts\n\t\t.filter(t => t.captureRejection)\n\t\t.map(t =>\n\t\t\tpositionTracker.push(\n\t\t\t\ttelemetryCatch(telemetryService, promiseQueueService, () => checkInCode(t), 'postRejectionTasks'),\n\t\t\t\tt.seconds * 1000\n\t\t\t)\n\t\t);\n}\n\nexport function postInsertionTasks(\n\taccessor: ServicesAccessor,\n\tinsertionCategory: PostInsertionCategory,\n\tcompletionText: string,\n\tinsertionOffset: number,\n\turi: string,\n\ttelemetryData: TelemetryWithExp,\n\tsuggestionStatus: SuggestionStatus,\n\tcopilotAnnotations?: CopilotNamedAnnotationList\n) {\n\tconst logTarget = accessor.get(ICompletionsContextService).get(LogTarget);\n\tconst instantiationService = accessor.get(IInstantiationService);\n\tconst promiseQueueService = accessor.get(ICompletionsPromiseQueueService);\n\tconst telemetryService = accessor.get(ICompletionsTelemetryService);\n\tconst runtimeModeService = accessor.get(ICompletionsRuntimeModeService);\n\n\tconst telemetryDataWithStatus = telemetryData.extendedBy(\n\t\t{\n\t\t\tcompType: suggestionStatus.compType,\n\t\t},\n\t\t{\n\t\t\tcompCharLen: 
suggestionStatus.acceptedLength,\n\t\t\tnumLines: suggestionStatus.acceptedLines,\n\t\t}\n\t);\n\t// send "".accepted"" telemetry\n\tpostInsertionLogger.debug(\n\t\tlogTarget,\n\t\t`${insertionCategory}.accepted choiceIndex: ${telemetryDataWithStatus.properties.choiceIndex}`\n\t);\n\tinstantiationService.invokeFunction(telemetryAccepted, insertionCategory, telemetryDataWithStatus);\n\n\tconst fullCompletionText = completionText;\n\tcompletionText = computeCompletionText(completionText, suggestionStatus);\n\tconst trimmedCompletion = completionText.trim();\n\tconst tracker = instantiationService.createInstance(ChangeTracker, uri, insertionOffset);\n\tconst suffixTracker = instantiationService.createInstance(ChangeTracker, uri, insertionOffset + completionText.length);\n",typescript,content
+5,5117,"TERMINAL",0,0,"",,terminal_command
+6,5224,"src/extension/completions-core/vscode-node/lib/src/postInsertion.ts",0,0,"Switched from branch 'capture-tab-and-chat-interaction' to 'main'",typescript,git_branch_checkout
+7,12049,"TERMINAL",0,0,"",,terminal_command
+8,25220,"src/extension/completions-core/vscode-node/lib/src/postInsertion.ts",0,0,"Switched from branch 'main' to 'expose-oai-endpoint'",typescript,git_branch_checkout
+9,28949,"package.json",0,0,"{\n\t""name"": ""copilot-chat"",\n\t""displayName"": ""GitHub Copilot Chat"",\n\t""description"": ""AI chat features powered by Copilot"",\n\t""version"": ""0.33.0"",\n\t""build"": ""1"",\n\t""internalAIKey"": ""1058ec22-3c95-4951-8443-f26c1f325911"",\n\t""completionsCoreVersion"": ""1.378.1799"",\n\t""internalLargeStorageAriaKey"": ""ec712b3202c5462fb6877acae7f1f9d7-c19ad55e-3e3c-4f99-984b-827f6d95bd9e-6917"",\n\t""ariaKey"": ""0c6ae279ed8443289764825290e4f9e2-1a736e7c-1324-4338-be46-fc2a58ae4d14-7255"",\n\t""buildType"": ""dev"",\n\t""publisher"": ""GitHub"",\n\t""homepage"": ""https://github.com/features/copilot?editor=vscode"",\n\t""license"": ""SEE LICENSE IN LICENSE.txt"",\n\t""repository"": {\n\t\t""type"": ""git"",\n\t\t""url"": ""https://github.com/microsoft/vscode-copilot-chat""\n\t},\n\t""bugs"": {\n\t\t""url"": ""https://github.com/microsoft/vscode/issues""\n\t},\n\t""qna"": ""https://github.com/github-community/community/discussions/categories/copilot"",\n\t""icon"": ""assets/copilot.png"",\n\t""pricing"": ""Trial"",\n\t""engines"": {\n\t\t""vscode"": ""^1.106.0-20251030"",\n\t\t""npm"": "">=9.0.0"",\n\t\t""node"": "">=22.14.0""\n\t},\n\t""categories"": [\n\t\t""AI"",\n\t\t""Chat"",\n\t\t""Programming Languages"",\n\t\t""Machine Learning""\n\t],\n\t""keywords"": [\n\t\t""ai"",\n\t\t""openai"",\n\t\t""codex"",\n\t\t""pilot"",\n\t\t""snippets"",\n\t\t""documentation"",\n\t\t""autocomplete"",\n\t\t""intellisense"",\n\t\t""refactor"",\n\t\t""javascript"",\n\t\t""python"",\n\t\t""typescript"",\n\t\t""php"",\n\t\t""go"",\n\t\t""golang"",\n\t\t""ruby"",\n\t\t""c++"",\n\t\t""c#"",\n\t\t""java"",\n\t\t""kotlin"",\n\t\t""co-pilot""\n\t],\n\t""badges"": [\n\t\t{\n\t\t\t""url"": ""https://img.shields.io/badge/GitHub%20Copilot-Subscription%20Required-orange"",\n\t\t\t""href"": ""https://github.com/github-copilot/signup?editor=vscode"",\n\t\t\t""description"": ""%github.copilot.badge.signUp%""\n\t\t},\n\t\t{\n\t\t\t""url"": 
""https://img.shields.io/github/stars/github/copilot-docs?style=social"",\n\t\t\t""href"": ""https://github.com/github/copilot-docs"",\n\t\t\t""description"": ""%github.copilot.badge.star%""\n\t\t},\n\t\t{\n\t\t\t""url"": ""https://img.shields.io/youtube/channel/views/UC7c3Kb6jYCRj4JOHHZTxKsQ?style=social"",\n\t\t\t""href"": ""https://www.youtube.com/@GitHub/search?query=copilot"",\n\t\t\t""description"": ""%github.copilot.badge.youtube%""\n\t\t},\n\t\t{\n\t\t\t""url"": ""https://img.shields.io/twitter/follow/github?style=social"",\n\t\t\t""href"": ""https://twitter.com/github"",\n\t\t\t""description"": ""%github.copilot.badge.twitter%""\n\t\t}\n\t],\n\t""activationEvents"": [\n\t\t""onStartupFinished"",\n\t\t""onLanguageModelChat:copilot"",\n\t\t""onUri"",\n\t\t""onFileSystem:ccreq"",\n\t\t""onFileSystem:ccsettings""\n\t],\n\t""main"": ""./dist/extension"",\n\t""l10n"": ""./l10n"",\n\t""enabledApiProposals"": [\n\t\t""extensionsAny"",\n\t\t""newSymbolNamesProvider"",\n\t\t""interactive"",\n\t\t""codeActionAI"",\n\t\t""activeComment"",\n\t\t""commentReveal"",\n\t\t""contribCommentThreadAdditionalMenu"",\n\t\t""contribCommentsViewThreadMenus"",\n\t\t""documentFiltersExclusive"",\n\t\t""embeddings"",\n\t\t""findTextInFiles"",\n\t\t""findTextInFiles2"",\n\t\t""findFiles2@2"",\n\t\t""textSearchProvider"",\n\t\t""terminalDataWriteEvent"",\n\t\t""terminalExecuteCommandEvent"",\n\t\t""terminalSelection"",\n\t\t""terminalQuickFixProvider"",\n\t\t""mappedEditsProvider"",\n\t\t""aiRelatedInformation"",\n\t\t""aiSettingsSearch"",\n\t\t""chatParticipantAdditions"",\n\t\t""chatEditing"",\n\t\t""defaultChatParticipant@4"",\n\t\t""contribSourceControlInputBoxMenu"",\n\t\t""authLearnMore"",\n\t\t""testObserver"",\n\t\t""aiTextSearchProvider@2"",\n\t\t""chatParticipantPrivate@11"",\n\t\t""chatProvider@4"",\n\t\t""contribDebugCreateConfiguration"",\n\t\t""chatReferenceDiagnostic"",\n\t\t""textSearchProvider2"",\n\t\t""chatReferenceBinaryData"",\n\t\t""languageModelSystem"",\n\t\t""la
nguageModelCapabilities"",\n\t\t""inlineCompletionsAdditions"",\n\t\t""chatStatusItem"",\n\t\t""taskProblemMatcherStatus"",\n\t\t""contribLanguageModelToolSets"",\n\t\t""textDocumentChangeReason"",\n\t\t""resolvers"",\n\t\t""taskExecutionTerminal"",\n\t\t""dataChannels"",\n\t\t""languageModelThinkingPart"",\n\t\t""chatSessionsProvider@3"",\n\t\t""devDeviceId"",\n\t\t""contribEditorContentMenu""\n\t],\n\t""contributes"": {\n\t\t""languageModelTools"": [\n\t\t\t{\n\t\t\t\t""name"": ""copilot_searchCodebase"",\n\t\t\t\t""toolReferenceName"": ""codebase"",\n\t\t\t\t""displayName"": ""%copilot.tools.searchCodebase.name%"",\n\t\t\t\t""icon"": ""$(folder)"",\n\t\t\t\t""userDescription"": ""%copilot.codebase.tool.description%"",\n\t\t\t\t""modelDescription"": ""Run a natural language search for relevant code or documentation comments from the user's current workspace. Returns relevant code snippets from the user's current workspace if it is large, or the full contents of the workspace if it is small."",\n\t\t\t\t""tags"": [\n\t\t\t\t\t""codesearch"",\n\t\t\t\t\t""vscode_codesearch""\n\t\t\t\t],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""query"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The query to search the codebase for. Should contain all relevant context. Should ideally be text that might appear in the codebase, such as function names, variable names, or comments.""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""query""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""runSubagent"",\n\t\t\t\t""toolReferenceName"": ""runSubagent"",\n\t\t\t\t""displayName"": ""%copilot.tools.runSubagent.name%"",\n\t\t\t\t""canBeReferencedInPrompt"": true,\n\t\t\t\t""userDescription"": ""%copilot.tools.runSubagent.description%"",\n\t\t\t\t""modelDescription"": ""Launch a new agent to handle complex, multi-step tasks autonomously. 
This tool is good at researching complex questions, searching for code, and executing multi-step tasks. When you are searching for a keyword or file and are not confident that you will find the right match in the first few tries, use this agent to perform the search for you.\n\n- Agents do not run async or in the background, you will wait for the agent's result.\n- When the agent is done, it will return a single message back to you. The result returned by the agent is not visible to the user. To show the user the result, you should send a text message back to the user with a concise summary of the result.\n - Each agent invocation is stateless. You will not be able to send additional messages to the agent, nor will the agent be able to communicate with you outside of its final report. Therefore, your prompt should contain a highly detailed task description for the agent to perform autonomously and you should specify exactly what information the agent should return back to you in its final and only message to you.\n - The agent's outputs should generally be trusted\n - Clearly tell the agent whether you expect it to write code or just to do research (search, file reads, web fetches, etc.), since it is not aware of the user's intent"",\n\t\t\t\t""tags"": [],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""prompt"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""A detailed description of the task for the agent to perform""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""description"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""A short (3-5 word) description of the task""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""prompt"",\n\t\t\t\t\t\t""description""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_searchWorkspaceSymbols"",\n\t\t\t\t""toolReferenceName"": ""symbols"",\n\t\t\t\t""displayName"": 
""%copilot.tools.searchWorkspaceSymbols.name%"",\n\t\t\t\t""icon"": ""$(symbol)"",\n\t\t\t\t""userDescription"": ""%copilot.workspaceSymbols.tool.description%"",\n\t\t\t\t""modelDescription"": ""Search the user's workspace for code symbols using language services. Use this tool when the user is looking for a specific symbol in their workspace."",\n\t\t\t\t""tags"": [\n\t\t\t\t\t""vscode_codesearch""\n\t\t\t\t],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""symbolName"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The symbol to search for, such as a function name, class name, or variable name.""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""symbolName""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_listCodeUsages"",\n\t\t\t\t""toolReferenceName"": ""usages"",\n\t\t\t\t""displayName"": ""%copilot.tools.listCodeUsages.name%"",\n\t\t\t\t""icon"": ""$(references)"",\n\t\t\t\t""canBeReferencedInPrompt"": true,\n\t\t\t\t""userDescription"": ""%copilot.listCodeUsages.tool.description%"",\n\t\t\t\t""modelDescription"": ""Request to list all usages (references, definitions, implementations etc) of a function, class, method, variable etc. Use this tool when \n1. Looking for a sample implementation of an interface or class\n2. Checking how a function is used throughout the codebase.\n3. 
Including and updating all usages when changing a function, method, or constructor"",\n\t\t\t\t""tags"": [\n\t\t\t\t\t""vscode_codesearch""\n\t\t\t\t],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""symbolName"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The name of the symbol, such as a function name, class name, method name, variable name, etc.""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""filePaths"": {\n\t\t\t\t\t\t\t""type"": ""array"",\n\t\t\t\t\t\t\t""description"": ""One or more file paths which likely contain the definition of the symbol. For instance the file which declares a class or function. This is optional but will speed up the invocation of this tool and improve the quality of its output."",\n\t\t\t\t\t\t\t""items"": {\n\t\t\t\t\t\t\t\t""type"": ""string""\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""symbolName""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_getVSCodeAPI"",\n\t\t\t\t""toolReferenceName"": ""vscodeAPI"",\n\t\t\t\t""displayName"": ""%copilot.tools.getVSCodeAPI.name%"",\n\t\t\t\t""icon"": ""$(references)"",\n\t\t\t\t""userDescription"": ""%copilot.vscode.tool.description%"",\n\t\t\t\t""modelDescription"": ""Get comprehensive VS Code API documentation and references for extension development. This tool provides authoritative documentation for VS Code's extensive API surface, including proposed APIs, contribution points, and best practices. 
Use this tool for understanding complex VS Code API interactions.\n\nWhen to use this tool:\n- User asks about specific VS Code APIs, interfaces, or extension capabilities\n- Need documentation for VS Code extension contribution points (commands, views, settings, etc.)\n- Questions about proposed APIs and their usage patterns\n- Understanding VS Code extension lifecycle, activation events, and packaging\n- Best practices for VS Code extension development architecture\n- API examples and code patterns for extension features\n- Troubleshooting extension-specific issues or API limitations\n\nWhen NOT to use this tool:\n- Creating simple standalone files or scripts unrelated to VS Code extensions\n- General programming questions not specific to VS Code extension development\n- Questions about using VS Code as an editor (user-facing features)\n- Non-extension related development tasks\n- File creation or editing that doesn't involve VS Code extension APIs\n\nCRITICAL usage guidelines:\n1. Always include specific API names, interfaces, or concepts in your query\n2. Mention the extension feature you're trying to implement\n3. Include context about proposed vs stable APIs when relevant\n4. Reference specific contribution points when asking about extension manifest\n5. Be specific about the VS Code version or API version when known\n\nScope: This tool is for EXTENSION DEVELOPMENT ONLY - building tools that extend VS Code itself, not for general file creation or non-extension programming tasks."",\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""query"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The query to search vscode documentation for. 
Should contain all relevant context.""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""query""\n\t\t\t\t\t]\n\t\t\t\t},\n\t\t\t\t""tags"": [],\n\t\t\t\t""canBeReferencedInPrompt"": true\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_findFiles"",\n\t\t\t\t""toolReferenceName"": ""fileSearch"",\n\t\t\t\t""displayName"": ""%copilot.tools.findFiles.name%"",\n\t\t\t\t""modelDescription"": ""Search for files in the workspace by glob pattern. This only returns the paths of matching files. Use this tool when you know the exact filename pattern of the files you're searching for. Glob patterns match from the root of the workspace folder. Examples:\n- **/*.{js,ts} to match all js/ts files in the workspace.\n- src/** to match all files under the top-level src folder.\n- **/foo/**/*.js to match all js files under any foo folder in the workspace."",\n\t\t\t\t""tags"": [\n\t\t\t\t\t""vscode_codesearch""\n\t\t\t\t],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""query"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""Search for files with names or paths matching this glob pattern.""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""maxResults"": {\n\t\t\t\t\t\t\t""type"": ""number"",\n\t\t\t\t\t\t\t""description"": ""The maximum number of results to return. Do not use this unless necessary, it can slow things down. By default, only some matches are returned. If you use this and don't see what you're looking for, you can try again with a more specific query or a larger maxResults.""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""query""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_findTextInFiles"",\n\t\t\t\t""toolReferenceName"": ""textSearch"",\n\t\t\t\t""displayName"": ""%copilot.tools.findTextInFiles.name%"",\n\t\t\t\t""modelDescription"": ""Do a fast text search in the workspace. 
Use this tool when you want to search with an exact string or regex. If you are not sure what words will appear in the workspace, prefer using regex patterns with alternation (|) or character classes to search for multiple potential words at once instead of making separate searches. For example, use 'function|method|procedure' to look for all of those words at once. Use includePattern to search within files matching a specific pattern, or in a specific file, using a relative path. Use this tool when you want to see an overview of a particular file, instead of using read_file many times to look for code within a file."",\n\t\t\t\t""tags"": [\n\t\t\t\t\t""vscode_codesearch""\n\t\t\t\t],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""query"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The pattern to search for in files in the workspace. Use regex with alternation (e.g., 'word1|word2|word3') or character classes to find multiple potential words in a single search. Be sure to set the isRegexp property properly to declare whether it's a regex or plain text pattern. Is case-insensitive.""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""isRegexp"": {\n\t\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t\t""description"": ""Whether the pattern is a regex.""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""includePattern"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""Search files matching this glob pattern. Will be applied to the relative path of files within the workspace. To search recursively inside a folder, use a proper glob pattern like \""src/folder/**\"". Do not use | in includePattern.""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""maxResults"": {\n\t\t\t\t\t\t\t""type"": ""number"",\n\t\t\t\t\t\t\t""description"": ""The maximum number of results to return. Do not use this unless necessary, it can slow things down. By default, only some matches are returned. 
If you use this and don't see what you're looking for, you can try again with a more specific query or a larger maxResults.""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""query"",\n\t\t\t\t\t\t""isRegexp""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_applyPatch"",\n\t\t\t\t""displayName"": ""%copilot.tools.applyPatch.name%"",\n\t\t\t\t""toolReferenceName"": ""applyPatch"",\n\t\t\t\t""userDescription"": ""%copilot.tools.applyPatch.description%"",\n\t\t\t\t""modelDescription"": ""Edit text files. Do not use this tool to edit Jupyter notebooks. `apply_patch` allows you to execute a diff/patch against a text file, but the format of the diff specification is unique to this task, so pay careful attention to these instructions. To use the `apply_patch` command, you should pass a message of the following structure as \""input\"":\n\n*** Begin Patch\n[YOUR_PATCH]\n*** End Patch\n\nWhere [YOUR_PATCH] is the actual content of your patch, specified in the following V4A diff format.\n\n*** [ACTION] File: [/absolute/path/to/file] -> ACTION can be one of Add, Update, or Delete.\nAn example of a message that you might pass as \""input\"" to this function, in order to apply a patch, is shown below.\n\n*** Begin Patch\n*** Update File: /Users/someone/pygorithm/searching/binary_search.py\n@@class BaseClass\n@@ def search():\n- pass\n+ raise NotImplementedError()\n\n@@class Subclass\n@@ def search():\n- pass\n+ raise NotImplementedError()\n\n*** End Patch\nDo not use line numbers in this diff format."",\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""input"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The edit patch to apply.""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""explanation"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""A short description of what the tool call is aiming to 
achieve.""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""input"",\n\t\t\t\t\t\t""explanation""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_readFile"",\n\t\t\t\t""toolReferenceName"": ""readFile"",\n\t\t\t\t""displayName"": ""%copilot.tools.readFile.name%"",\n\t\t\t\t""modelDescription"": ""Read the contents of a file.\n\nYou must specify the line range you're interested in. Line numbers are 1-indexed. If the file contents returned are insufficient for your task, you may call this tool again to retrieve more content. Prefer reading larger ranges over doing many small reads."",\n\t\t\t\t""tags"": [\n\t\t\t\t\t""vscode_codesearch""\n\t\t\t\t],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""filePath"": {\n\t\t\t\t\t\t\t""description"": ""The absolute path of the file to read."",\n\t\t\t\t\t\t\t""type"": ""string""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""startLine"": {\n\t\t\t\t\t\t\t""type"": ""number"",\n\t\t\t\t\t\t\t""description"": ""The line number to start reading from, 1-based.""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""endLine"": {\n\t\t\t\t\t\t\t""type"": ""number"",\n\t\t\t\t\t\t\t""description"": ""The inclusive line number to end reading at, 1-based.""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""filePath"",\n\t\t\t\t\t\t""startLine"",\n\t\t\t\t\t\t""endLine""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_listDirectory"",\n\t\t\t\t""toolReferenceName"": ""listDirectory"",\n\t\t\t\t""displayName"": ""%copilot.tools.listDirectory.name%"",\n\t\t\t\t""modelDescription"": ""List the contents of a directory. Result will have the name of the child. 
If the name ends in /, it's a folder, otherwise a file"",\n\t\t\t\t""tags"": [\n\t\t\t\t\t""vscode_codesearch""\n\t\t\t\t],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""path"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The absolute path to the directory to list.""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""path""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_getErrors"",\n\t\t\t\t""displayName"": ""%copilot.tools.getErrors.name%"",\n\t\t\t\t""toolReferenceName"": ""problems"",\n\t\t\t\t""canBeReferencedInPrompt"": true,\n\t\t\t\t""icon"": ""$(error)"",\n\t\t\t\t""userDescription"": ""%copilot.tools.errors.description%"",\n\t\t\t\t""modelDescription"": ""Get any compile or lint errors in a specific file or across all files. If the user mentions errors or problems in a file, they may be referring to these. Use the tool to see the same errors that the user is seeing. If the user asks you to analyze all errors, or does not specify a file, use this tool to gather errors for all files. Also use this tool after editing a file to validate the change."",\n\t\t\t\t""tags"": [],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""filePaths"": {\n\t\t\t\t\t\t\t""description"": ""The absolute paths to the files or folders to check for errors. 
Omit 'filePaths' when retrieving all errors."",\n\t\t\t\t\t\t\t""type"": ""array"",\n\t\t\t\t\t\t\t""items"": {\n\t\t\t\t\t\t\t\t""type"": ""string""\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_readProjectStructure"",\n\t\t\t\t""displayName"": ""%copilot.tools.readProjectStructure.name%"",\n\t\t\t\t""modelDescription"": ""Get a file tree representation of the workspace."",\n\t\t\t\t""tags"": []\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_getChangedFiles"",\n\t\t\t\t""displayName"": ""%copilot.tools.getChangedFiles.name%"",\n\t\t\t\t""toolReferenceName"": ""changes"",\n\t\t\t\t""icon"": ""$(diff)"",\n\t\t\t\t""canBeReferencedInPrompt"": true,\n\t\t\t\t""userDescription"": ""%copilot.tools.changes.description%"",\n\t\t\t\t""modelDescription"": ""Get git diffs of current file changes in a git repository. Don't forget that you can use run_in_terminal to run git commands in a terminal as well."",\n\t\t\t\t""tags"": [\n\t\t\t\t\t""vscode_codesearch""\n\t\t\t\t],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""repositoryPath"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The absolute path to the git repository to look for changes in. If not provided, the active git repository will be used.""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""sourceControlState"": {\n\t\t\t\t\t\t\t""type"": ""array"",\n\t\t\t\t\t\t\t""items"": {\n\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t""enum"": [\n\t\t\t\t\t\t\t\t\t""staged"",\n\t\t\t\t\t\t\t\t\t""unstaged"",\n\t\t\t\t\t\t\t\t\t""merge-conflicts""\n\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t""description"": ""The kinds of git state to filter by. Allowed values are: 'staged', 'unstaged', and 'merge-conflicts'. 
If not provided, all states will be included.""\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_testFailure"",\n\t\t\t\t""toolReferenceName"": ""testFailure"",\n\t\t\t\t""displayName"": ""%copilot.tools.testFailure.name%"",\n\t\t\t\t""icon"": ""$(beaker)"",\n\t\t\t\t""userDescription"": ""%copilot.testFailure.tool.description%"",\n\t\t\t\t""modelDescription"": ""Includes test failure information in the prompt."",\n\t\t\t\t""inputSchema"": {},\n\t\t\t\t""tags"": [\n\t\t\t\t\t""vscode_editing_with_tests"",\n\t\t\t\t\t""enable_other_tool_copilot_readFile"",\n\t\t\t\t\t""enable_other_tool_copilot_listDirectory"",\n\t\t\t\t\t""enable_other_tool_copilot_findFiles"",\n\t\t\t\t\t""enable_other_tool_copilot_runTests""\n\t\t\t\t],\n\t\t\t\t""canBeReferencedInPrompt"": true\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_updateUserPreferences"",\n\t\t\t\t""toolReferenceName"": ""updateUserPreferences"",\n\t\t\t\t""displayName"": ""%copilot.tools.updateUserPreferences.name%"",\n\t\t\t\t""modelDescription"": ""Update the user's preferences file with new information about the user and their coding preferences, based on the current chat history."",\n\t\t\t\t""canBeReferencedInPrompt"": true,\n\t\t\t\t""tags"": [],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""facts"": {\n\t\t\t\t\t\t\t""type"": ""array"",\n\t\t\t\t\t\t\t""items"": {\n\t\t\t\t\t\t\t\t""type"": ""string""\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t""description"": ""An array of new user preferences to remember.""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""facts""\n\t\t\t\t\t]\n\t\t\t\t},\n\t\t\t\t""when"": ""config.github.copilot.chat.enableUserPreferences""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_createNewWorkspace"",\n\t\t\t\t""displayName"": ""%github.copilot.tools.createNewWorkspace.name%"",\n\t\t\t\t""toolReferenceName"": ""newWorkspace"",\n\t\t\t\t""icon"": 
""$(new-folder)"",\n\t\t\t\t""userDescription"": ""%github.copilot.tools.createNewWorkspace.userDescription%"",\n\t\t\t\t""when"": ""config.github.copilot.chat.newWorkspaceCreation.enabled"",\n\t\t\t\t""modelDescription"": ""Get comprehensive setup steps to help the user create complete project structures in a VS Code workspace. This tool is designed for full project initialization and scaffolding, not for creating individual files.\n\nWhen to use this tool:\n- User wants to create a new complete project from scratch\n- Setting up entire project frameworks (TypeScript projects, React apps, Node.js servers, etc.)\n- Initializing Model Context Protocol (MCP) servers with full structure\n- Creating VS Code extensions with proper scaffolding\n- Setting up Next.js, Vite, or other framework-based projects\n- User asks for \""new project\"", \""create a workspace\"", \""set up a [framework] project\""\n- Need to establish complete development environment with dependencies, config files, and folder structure\n\nWhen NOT to use this tool:\n- Creating single files or small code snippets\n- Adding individual files to existing projects\n- Making modifications to existing codebases\n- User asks to \""create a file\"" or \""add a component\""\n- Simple code examples or demonstrations\n- Debugging or fixing existing code\n\nThis tool provides complete project setup including:\n- Folder structure creation\n- Package.json and dependency management\n- Configuration files (tsconfig, eslint, etc.)\n- Initial boilerplate code\n- Development environment setup\n- Build and run instructions\n\nUse other file creation tools for individual files within existing projects."",\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""query"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The query to use to generate the new workspace. 
This should be a clear and concise description of the workspace the user wants to create.""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""query""\n\t\t\t\t\t]\n\t\t\t\t},\n\t\t\t\t""tags"": [\n\t\t\t\t\t""enable_other_tool_install_extension"",\n\t\t\t\t\t""enable_other_tool_get_project_setup_info""\n\t\t\t\t]\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_getProjectSetupInfo"",\n\t\t\t\t""displayName"": ""%github.copilot.tools.getProjectSetupInfo.name%"",\n\t\t\t\t""when"": ""config.github.copilot.chat.newWorkspaceCreation.enabled"",\n\t\t\t\t""toolReferenceName"": ""getProjectSetupInfo"",\n\t\t\t\t""modelDescription"": ""Do not call this tool without first calling the tool to create a workspace. This tool provides a project setup information for a Visual Studio Code workspace based on a project type and programming language."",\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""projectType"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The type of project to create. Supported values are: 'python-script', 'python-project', 'mcp-server', 'model-context-protocol-server', 'vscode-extension', 'next-js', 'vite' and 'other'""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""projectType""\n\t\t\t\t\t]\n\t\t\t\t},\n\t\t\t\t""tags"": []\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_installExtension"",\n\t\t\t\t""displayName"": ""Install Extension in VS Code"",\n\t\t\t\t""when"": ""config.github.copilot.chat.newWorkspaceCreation.enabled"",\n\t\t\t\t""toolReferenceName"": ""installExtension"",\n\t\t\t\t""modelDescription"": ""Install an extension in VS Code. 
Use this tool to install an extension in Visual Studio Code as part of a new workspace creation process only."",\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""id"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The ID of the extension to install. This should be in the format ..""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""name"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The name of the extension to install. This should be a clear and concise description of the extension.""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""id"",\n\t\t\t\t\t\t""name""\n\t\t\t\t\t]\n\t\t\t\t},\n\t\t\t\t""tags"": []\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_runVscodeCommand"",\n\t\t\t\t""displayName"": ""Run VS Code Command"",\n\t\t\t\t""when"": ""config.github.copilot.chat.newWorkspaceCreation.enabled"",\n\t\t\t\t""toolReferenceName"": ""runVscodeCommand"",\n\t\t\t\t""modelDescription"": ""Run a command in VS Code. Use this tool to run a command in Visual Studio Code as part of a new workspace creation process only."",\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""commandId"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The ID of the command to execute. This should be in the format .""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""name"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The name of the command to execute. This should be a clear and concise description of the command.""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""args"": {\n\t\t\t\t\t\t\t""type"": ""array"",\n\t\t\t\t\t\t\t""description"": ""The arguments to pass to the command. 
This should be an array of strings."",\n\t\t\t\t\t\t\t""items"": {\n\t\t\t\t\t\t\t\t""type"": ""string""\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""commandId"",\n\t\t\t\t\t\t""name""\n\t\t\t\t\t]\n\t\t\t\t},\n\t\t\t\t""tags"": []\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_createNewJupyterNotebook"",\n\t\t\t\t""displayName"": ""Create New Jupyter Notebook"",\n\t\t\t\t""icon"": ""$(notebook)"",\n\t\t\t\t""toolReferenceName"": ""newJupyterNotebook"",\n\t\t\t\t""modelDescription"": ""Generates a new Jupyter Notebook (.ipynb) in VS Code. Jupyter Notebooks are interactive documents commonly used for data exploration, analysis, visualization, and combining code with narrative text. Prefer creating plain Python files or similar unless a user explicitly requests creating a new Jupyter Notebook or already has a Jupyter Notebook opened or exists in the workspace."",\n\t\t\t\t""userDescription"": ""%copilot.tools.newJupyterNotebook.description%"",\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""query"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The query to use to generate the jupyter notebook. This should be a clear and concise description of the notebook the user wants to create.""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""query""\n\t\t\t\t\t]\n\t\t\t\t},\n\t\t\t\t""tags"": []\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_insertEdit"",\n\t\t\t\t""toolReferenceName"": ""insertEdit"",\n\t\t\t\t""displayName"": ""%copilot.tools.insertEdit.name%"",\n\t\t\t\t""modelDescription"": ""Insert new code into an existing file in the workspace. Use this tool once per file that needs to be modified, even if there are multiple changes for a file. 
Generate the \""explanation\"" property first.\nThe system is very smart and can understand how to apply your edits to the files, you just need to provide minimal hints.\nAvoid repeating existing code, instead use comments to represent regions of unchanged code. Be as concise as possible. For example:\n// ...existing code...\n{ changed code }\n// ...existing code...\n{ changed code }\n// ...existing code...\n\nHere is an example of how you should use format an edit to an existing Person class:\nclass Person {\n\t// ...existing code...\n\tage: number;\n\t// ...existing code...\n\tgetAge() {\n\treturn this.age;\n\t}\n}"",\n\t\t\t\t""tags"": [],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""explanation"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""A short explanation of the edit being made.""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""filePath"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""An absolute path to the file to edit.""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""code"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The code change to apply to the file.\nThe system is very smart and can understand how to apply your edits to the files, you just need to provide minimal hints.\nAvoid repeating existing code, instead use comments to represent regions of unchanged code. Be as concise as possible. 
For example:\n// ...existing code...\n{ changed code }\n// ...existing code...\n{ changed code }\n// ...existing code...\n\nHere is an example of how you should format an edit to an existing Person class:\nclass Person {\n\t// ...existing code...\n\tage: number;\n\t// ...existing code...\n\tgetAge() {\n\t\treturn this.age;\n\t}\n}""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""explanation"",\n\t\t\t\t\t\t""filePath"",\n\t\t\t\t\t\t""code""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_createFile"",\n\t\t\t\t""toolReferenceName"": ""createFile"",\n\t\t\t\t""displayName"": ""%copilot.tools.createFile.name%"",\n\t\t\t\t""userDescription"": ""%copilot.tools.createFile.description%"",\n\t\t\t\t""modelDescription"": ""This is a tool for creating a new file in the workspace. The file will be created with the specified content. The directory will be created if it does not already exist. Never use this tool to edit a file that already exists."",\n\t\t\t\t""tags"": [],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""filePath"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The absolute path to the file to create.""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""content"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The content to write to the file.""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""filePath"",\n\t\t\t\t\t\t""content""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_createDirectory"",\n\t\t\t\t""toolReferenceName"": ""createDirectory"",\n\t\t\t\t""displayName"": ""%copilot.tools.createDirectory.name%"",\n\t\t\t\t""userDescription"": ""%copilot.tools.createDirectory.description%"",\n\t\t\t\t""modelDescription"": ""Create a new directory structure in the workspace. Will recursively create all directories in the path, like mkdir -p. 
You do not need to use this tool before using create_file, that tool will automatically create the needed directories."",\n\t\t\t\t""tags"": [],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""dirPath"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The absolute path to the directory to create.""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""dirPath""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_openSimpleBrowser"",\n\t\t\t\t""displayName"": ""%copilot.tools.openSimpleBrowser.name%"",\n\t\t\t\t""modelDescription"": ""Preview a website or open a URL in the editor's Simple Browser. Useful for quickly viewing locally hosted websites, demos, or resources without leaving the coding environment."",\n\t\t\t\t""userDescription"": ""%copilot.tools.openSimpleBrowser.description%"",\n\t\t\t\t""toolReferenceName"": ""openSimpleBrowser"",\n\t\t\t\t""canBeReferencedInPrompt"": true,\n\t\t\t\t""tags"": [],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""url"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The website URL to preview or open in the Simple Browser inside the editor. Must be either an http or https URL""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""url""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_replaceString"",\n\t\t\t\t""toolReferenceName"": ""replaceString"",\n\t\t\t\t""displayName"": ""%copilot.tools.replaceString.name%"",\n\t\t\t\t""modelDescription"": ""This is a tool for making edits in an existing file in the workspace. For moving or renaming files, use run in terminal tool with the 'mv' command instead. For larger edits, split them into smaller edits and call the edit tool multiple times to ensure accuracy. 
Before editing, always ensure you have the context to understand the file's contents and context. To edit a file, provide: 1) filePath (absolute path), 2) oldString (MUST be the exact literal text to replace including all whitespace, indentation, newlines, and surrounding code etc), and 3) newString (MUST be the exact literal text to replace \\`oldString\\` with (also including all whitespace, indentation, newlines, and surrounding code etc.). Ensure the resulting code is correct and idiomatic.). Each use of this tool replaces exactly ONE occurrence of oldString.\n\nCRITICAL for \\`oldString\\`: Must uniquely identify the single instance to change. Include at least 3 lines of context BEFORE and AFTER the target text, matching whitespace and indentation precisely. If this string matches multiple locations, or does not match exactly, the tool will fail. Never use 'Lines 123-456 omitted' from summarized documents or ...existing code... comments in the oldString or newString."",\n\t\t\t\t""when"": ""!config.github.copilot.chat.disableReplaceTool"",\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""filePath"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""An absolute path to the file to edit.""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""oldString"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The exact literal text to replace, preferably unescaped. For single replacements (default), include at least 3 lines of context BEFORE and AFTER the target text, matching whitespace and indentation precisely. For multiple replacements, specify expected_replacements parameter. If this string is not the exact literal text (i.e. you escaped it) or does not match exactly, the tool will fail.""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""newString"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The exact literal text to replace `old_string` with, preferably unescaped. 
Provide the EXACT text. Ensure the resulting code is correct and idiomatic.""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""filePath"",\n\t\t\t\t\t\t""oldString"",\n\t\t\t\t\t\t""newString""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_multiReplaceString"",\n\t\t\t\t""toolReferenceName"": ""multiReplaceString"",\n\t\t\t\t""displayName"": ""%copilot.tools.multiReplaceString.name%"",\n\t\t\t\t""modelDescription"": ""This tool allows you to apply multiple replace_string_in_file operations in a single call, which is more efficient than calling replace_string_in_file multiple times. It takes an array of replacement operations and applies them sequentially. Each replacement operation has the same parameters as replace_string_in_file: filePath, oldString, newString, and explanation. This tool is ideal when you need to make multiple edits across different files or multiple edits in the same file. The tool will provide a summary of successful and failed operations."",\n\t\t\t\t""when"": ""!config.github.copilot.chat.disableReplaceTool"",\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""explanation"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""A brief explanation of what the multi-replace operation will accomplish.""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""replacements"": {\n\t\t\t\t\t\t\t""type"": ""array"",\n\t\t\t\t\t\t\t""description"": ""An array of replacement operations to apply sequentially."",\n\t\t\t\t\t\t\t""items"": {\n\t\t\t\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t\t\t\t""explanation"": {\n\t\t\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t\t\t""description"": ""A brief explanation of this specific replacement operation.""\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t""filePath"": {\n\t\t\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t\t\t""description"": ""An absolute path to the file 
to edit.""\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t""oldString"": {\n\t\t\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t\t\t""description"": ""The exact literal text to replace, preferably unescaped. Include at least 3 lines of context BEFORE and AFTER the target text, matching whitespace and indentation precisely. If this string is not the exact literal text or does not match exactly, this replacement will fail.""\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t""newString"": {\n\t\t\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t\t\t""description"": ""The exact literal text to replace `oldString` with, preferably unescaped. Provide the EXACT text. Ensure the resulting code is correct and idiomatic.""\n\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t""required"": [\n\t\t\t\t\t\t\t\t\t""explanation"",\n\t\t\t\t\t\t\t\t\t""filePath"",\n\t\t\t\t\t\t\t\t\t""oldString"",\n\t\t\t\t\t\t\t\t\t""newString""\n\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t""minItems"": 1\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""explanation"",\n\t\t\t\t\t\t""replacements""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_editNotebook"",\n\t\t\t\t""toolReferenceName"": ""editNotebook"",\n\t\t\t\t""displayName"": ""%copilot.tools.editNotebook.name%"",\n\t\t\t\t""modelDescription"": ""This is a tool for editing an existing Notebook file in the workspace. 
Generate the \""explanation\"" property first.\nThe system is very smart and can understand how to apply your edits to the notebooks.\nWhen updating the content of an existing cell, ensure newCode preserves whitespace and indentation exactly and does NOT include any code markers such as (...existing code...)."",\n\t\t\t\t""tags"": [\n\t\t\t\t\t""enable_other_tool_copilot_getNotebookSummary""\n\t\t\t\t],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""filePath"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""An absolute path to the notebook file to edit, or the URI of a untitled, not yet named, file, such as `untitled:Untitled-1.""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""cellId"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""Id of the cell that needs to be deleted or edited. Use the value `TOP`, `BOTTOM` when inserting a cell at the top or bottom of the notebook, else provide the id of the cell after which a new cell is to be inserted. Remember, if a cellId is provided and editType=insert, then a cell will be inserted after the cell with the provided cellId.""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""newCode"": {\n\t\t\t\t\t\t\t""anyOf"": [\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t\t""description"": ""The code for the new or existing cell to be edited. Code should not be wrapped within tags. Do NOT include code markers such as (...existing code...) to indicate existing code.""\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t""type"": ""array"",\n\t\t\t\t\t\t\t\t\t""items"": {\n\t\t\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t\t\t""description"": ""The code for the new or existing cell to be edited. 
Code should not be wrapped within tags""\n\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t]\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""language"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The language of the cell. `markdown`, `python`, `javascript`, `julia`, etc.""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""editType"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""enum"": [\n\t\t\t\t\t\t\t\t""insert"",\n\t\t\t\t\t\t\t\t""delete"",\n\t\t\t\t\t\t\t\t""edit""\n\t\t\t\t\t\t\t],\n\t\t\t\t\t\t\t""description"": ""The operation peformed on the cell, whether `insert`, `delete` or `edit`.\nUse the `editType` field to specify the operation: `insert` to add a new cell, `edit` to modify an existing cell's content, and `delete` to remove a cell.""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""filePath"",\n\t\t\t\t\t\t""editType"",\n\t\t\t\t\t\t""cellId""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_runNotebookCell"",\n\t\t\t\t""displayName"": ""%copilot.tools.runNotebookCell.name%"",\n\t\t\t\t""toolReferenceName"": ""runCell"",\n\t\t\t\t""icon"": ""$(play)"",\n\t\t\t\t""modelDescription"": ""This is a tool for running a code cell in a notebook file directly in the notebook editor. The output from the execution will be returned. Code cells should be run as they are added or edited when working through a problem to bring the kernel state up to date and ensure the code executes successfully. Code cells are ready to run and don't require any pre-processing. If asked to run the first cell in a notebook, you should run the first code cell since markdown cells cannot be executed. 
NOTE: Avoid executing Markdown cells or providing Markdown cell IDs, as Markdown cells cannot be executed."",\n\t\t\t\t""userDescription"": ""%copilot.tools.runNotebookCell.description%"",\n\t\t\t\t""tags"": [\n\t\t\t\t\t""enable_other_tool_copilot_getNotebookSummary""\n\t\t\t\t],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""filePath"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""An absolute path to the notebook file with the cell to run, or the URI of an untitled, not yet named, file, such as `untitled:Untitled-1.ipynb""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""reason"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""An optional explanation of why the cell is being run. This will be shown to the user before the tool is run and is not necessary if it's self-explanatory.""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""cellId"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The ID for the code cell to execute. Avoid providing markdown cell IDs as nothing will be executed.""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""continueOnError"": {\n\t\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t\t""description"": ""Whether or not execution should continue for remaining cells if an error is encountered. Default to false unless instructed otherwise.""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""filePath"",\n\t\t\t\t\t\t""cellId""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_getNotebookSummary"",\n\t\t\t\t""toolReferenceName"": ""getNotebookSummary"",\n\t\t\t\t""displayName"": ""Get the structure of a notebook"",\n\t\t\t\t""modelDescription"": ""This is a tool that returns the list of the Notebook cells along with the id, cell types, line ranges, language, execution information and output mime types for each cell. 
This is useful to get Cell Ids when executing a notebook or to determine what cells have been executed and in what order, or what cells have outputs. If required to read the contents of a cell, use this to determine the line range of the cell, and then use the read_file tool to read a specific line range. Requery this tool if the contents of the notebook change."",\n\t\t\t\t""tags"": [],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""filePath"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""An absolute path to the notebook file with the cell to run, or the URI of an untitled, not yet named, file, such as `untitled:Untitled-1.ipynb""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""filePath""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_readNotebookCellOutput"",\n\t\t\t\t""displayName"": ""%copilot.tools.getNotebookCellOutput.name%"",\n\t\t\t\t""toolReferenceName"": ""readNotebookCellOutput"",\n\t\t\t\t""icon"": ""$(notebook-render-output)"",\n\t\t\t\t""modelDescription"": ""This tool will retrieve the output for a notebook cell from its most recent execution or restored from disk. The cell may have output even when it has not been run in the current kernel session. 
This tool has a higher token limit for output length than the runNotebookCell tool."",\n\t\t\t\t""userDescription"": ""%copilot.tools.getNotebookCellOutput.description%"",\n\t\t\t\t""when"": ""userHasOpenedNotebook"",\n\t\t\t\t""tags"": [],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""filePath"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""An absolute path to the notebook file with the cell to run, or the URI of an untitled, not yet named, file, such as `untitled:Untitled-1.ipynb""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""cellId"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The ID of the cell for which output should be retrieved.""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""filePath"",\n\t\t\t\t\t\t""cellId""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_fetchWebPage"",\n\t\t\t\t""displayName"": ""%copilot.tools.fetchWebPage.name%"",\n\t\t\t\t""toolReferenceName"": ""fetch"",\n\t\t\t\t""when"": ""!isWeb"",\n\t\t\t\t""canBeReferencedInPrompt"": true,\n\t\t\t\t""icon"": ""$(globe)"",\n\t\t\t\t""userDescription"": ""%copilot.tools.fetchWebPage.description%"",\n\t\t\t\t""modelDescription"": ""Fetches the main content from a web page. This tool is useful for summarizing or analyzing the content of a webpage. You should use this tool when you think the user is looking for information from a specific webpage."",\n\t\t\t\t""tags"": [],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""urls"": {\n\t\t\t\t\t\t\t""type"": ""array"",\n\t\t\t\t\t\t\t""items"": {\n\t\t\t\t\t\t\t\t""type"": ""string""\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t""description"": ""An array of URLs to fetch content from.""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""query"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The query to search for in the web page's content. 
This should be a clear and concise description of the content you want to find.""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""urls"",\n\t\t\t\t\t\t""query""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_findTestFiles"",\n\t\t\t\t""displayName"": ""%copilot.tools.findTestFiles.name%"",\n\t\t\t\t""icon"": ""$(beaker)"",\n\t\t\t\t""canBeReferencedInPrompt"": false,\n\t\t\t\t""toolReferenceName"": ""findTestFiles"",\n\t\t\t\t""userDescription"": ""%copilot.tools.findTestFiles.description%"",\n\t\t\t\t""modelDescription"": ""For a source code file, find the file that contains the tests. For a test file, find the file that contains the code under test."",\n\t\t\t\t""tags"": [],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""filePaths"": {\n\t\t\t\t\t\t\t""type"": ""array"",\n\t\t\t\t\t\t\t""items"": {\n\t\t\t\t\t\t\t\t""type"": ""string""\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""filePaths""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_getDocInfo"",\n\t\t\t\t""displayName"": ""%copilot.tools.getDocInfo.name%"",\n\t\t\t\t""icon"": ""$(beaker)"",\n\t\t\t\t""canBeReferencedInPrompt"": false,\n\t\t\t\t""toolReferenceName"": ""docInfo"",\n\t\t\t\t""userDescription"": ""%copilot.tools.getDocInfo.description%"",\n\t\t\t\t""modelDescription"": ""Find information about how to document a symbol like a class or function. This tool is useful for generating documentation comments for code symbols. 
You should use this tool when you think the user is looking for information about how to document a specific code symbol."",\n\t\t\t\t""tags"": [],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""filePaths"": {\n\t\t\t\t\t\t\t""type"": ""array"",\n\t\t\t\t\t\t\t""items"": {\n\t\t\t\t\t\t\t\t""type"": ""string""\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t""description"": ""The file paths for which documentation information is needed.""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""filePaths""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_getSearchResults"",\n\t\t\t\t""toolReferenceName"": ""searchResults"",\n\t\t\t\t""displayName"": ""%github.copilot.tools.searchResults.name%"",\n\t\t\t\t""icon"": ""$(search)"",\n\t\t\t\t""userDescription"": ""%github.copilot.tools.searchResults.description%"",\n\t\t\t\t""modelDescription"": ""The results from the search view""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_githubRepo"",\n\t\t\t\t""toolReferenceName"": ""githubRepo"",\n\t\t\t\t""displayName"": ""%github.copilot.tools.githubRepo.name%"",\n\t\t\t\t""modelDescription"": ""Searches a GitHub repository for relevant source code snippets. Only use this tool if the user is very clearly asking for code snippets from a specific GitHub repository. Do not use this tool for Github repos that the user has open in their workspace."",\n\t\t\t\t""userDescription"": ""%github.copilot.tools.githubRepo.userDescription%"",\n\t\t\t\t""icon"": ""$(repo)"",\n\t\t\t\t""canBeReferencedInPrompt"": true,\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""repo"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The name of the Github repository to search for code in. 
Should must be formatted as '/'.""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""query"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The query to search for repo. Should contain all relevant context.""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""repo"",\n\t\t\t\t\t\t""query""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_toolReplay"",\n\t\t\t\t""modelDescription"": ""Replays a tool call from a previous chat session."",\n\t\t\t\t""displayName"": ""tool replay"",\n\t\t\t\t""when"": ""false"",\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""toolCallId"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""the id of the tool original tool call""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""toolName"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""the name of the tool being replayed""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""toolCallArgs"": {\n\t\t\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t\t\t""description"": ""the arguments of the tool call""\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_memory"",\n\t\t\t\t""toolReferenceName"": ""memory"",\n\t\t\t\t""displayName"": ""%copilot.tools.memory.name%"",\n\t\t\t\t""userDescription"": ""%copilot.tools.memory.description%"",\n\t\t\t\t""modelDescription"": ""Manage persistent memory across conversations. This tool allows you to create, view, update, and delete memory files that persist between chat sessions. Use this to remember important information about the user, their preferences, project context, or anything that should be recalled in future conversations. 
Available commands: view (list/read memories), create (new memory file), str_replace (edit content), insert (add content), delete (remove memory), rename (change filename)."",\n\t\t\t\t""icon"": ""$(database)"",\n\t\t\t\t""when"": ""config.github.copilot.chat.tools.memory.enabled"",\n\t\t\t\t""canBeReferencedInPrompt"": true,\n\t\t\t\t""tags"": [],\n\t\t\t\t""inputSchema"": {\n\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t""command"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""enum"": [\n\t\t\t\t\t\t\t\t""view"",\n\t\t\t\t\t\t\t\t""create"",\n\t\t\t\t\t\t\t\t""str_replace"",\n\t\t\t\t\t\t\t\t""insert"",\n\t\t\t\t\t\t\t\t""delete"",\n\t\t\t\t\t\t\t\t""rename""\n\t\t\t\t\t\t\t],\n\t\t\t\t\t\t\t""description"": ""The memory operation to perform: view (list/read), create (new file), str_replace (edit), insert (add content), delete (remove), rename (change filename)""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""path"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""The path to the memory file (must start with /memories, e.g., /memories/notes.md or /memories/project/info.txt)""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""view_range"": {\n\t\t\t\t\t\t\t""type"": ""array"",\n\t\t\t\t\t\t\t""items"": {\n\t\t\t\t\t\t\t\t""type"": ""number""\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t""description"": ""Optional: view specific line range [start, end] for view command""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""file_text"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""Content for create operation""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""old_str"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""String to replace in str_replace operation""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""new_str"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""Replacement string in str_replace operation""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""insert_line"": {\n\t\t\t\t\t\t\t""type"": ""number"",\n\t\t\t\t\t\t\t""description"": 
""Line number for insert operation""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""insert_text"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""Text to insert at specified line for insert operation""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""old_path"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""Source path for rename operation""\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""new_path"": {\n\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t""description"": ""Destination path for rename operation""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t""required"": [\n\t\t\t\t\t\t""command""\n\t\t\t\t\t]\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""copilot_editFiles"",\n\t\t\t\t""modelDescription"": ""This is a placeholder tool, do not use"",\n\t\t\t\t""userDescription"": ""Edit files"",\n\t\t\t\t""icon"": ""$(pencil)"",\n\t\t\t\t""displayName"": ""Edit Files"",\n\t\t\t\t""toolReferenceName"": ""editFiles""\n\t\t\t}\n\t\t],\n\t\t""languageModelToolSets"": [\n\t\t\t{\n\t\t\t\t""name"": ""edit"",\n\t\t\t\t""description"": ""%copilot.toolSet.editing.description%"",\n\t\t\t\t""icon"": ""$(pencil)"",\n\t\t\t\t""tools"": [\n\t\t\t\t\t""createFile"",\n\t\t\t\t\t""createDirectory"",\n\t\t\t\t\t""editNotebook"",\n\t\t\t\t\t""newJupyterNotebook"",\n\t\t\t\t\t""editFiles""\n\t\t\t\t]\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""runNotebooks"",\n\t\t\t\t""description"": ""%copilot.toolSet.runNotebook.description%"",\n\t\t\t\t""icon"": ""$(notebook)"",\n\t\t\t\t""tools"": [\n\t\t\t\t\t""runCell"",\n\t\t\t\t\t""getNotebookSummary"",\n\t\t\t\t\t""readNotebookCellOutput""\n\t\t\t\t]\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": ""search"",\n\t\t\t\t""description"": ""%copilot.toolSet.search.description%"",\n\t\t\t\t""icon"": ""$(search)"",\n\t\t\t\t""tools"": [\n\t\t\t\t\t""fileSearch"",\n\t\t\t\t\t""textSearch"",\n\t\t\t\t\t""listDirectory"",\n\t\t\t\t\t""readFile"",\n\t\t\t\t\t""codebase"",\n\t\t\t\t\t""searchResults""\n\t\t\t\t]\n\t\t\t},\n\t\t\t{\n\t\t\t\t""name"": 
""new"",\n\t\t\t\t""description"": ""%copilot.toolSet.new.description%"",\n\t\t\t\t""icon"": ""$(new-folder)"",\n\t\t\t\t""tools"": [\n\t\t\t\t\t""newWorkspace"",\n\t\t\t\t\t""runVscodeCommand"",\n\t\t\t\t\t""getProjectSetupInfo"",\n\t\t\t\t\t""installExtension""\n\t\t\t\t]\n\t\t\t}\n\t\t],\n\t\t""chatParticipants"": [\n\t\t\t{\n\t\t\t\t""id"": ""github.copilot.default"",\n\t\t\t\t""name"": ""GitHubCopilot"",\n\t\t\t\t""fullName"": ""GitHub Copilot"",\n\t\t\t\t""description"": ""%copilot.description%"",\n\t\t\t\t""isDefault"": true,\n\t\t\t\t""locations"": [\n\t\t\t\t\t""panel""\n\t\t\t\t],\n\t\t\t\t""modes"": [\n\t\t\t\t\t""ask""\n\t\t\t\t],\n\t\t\t\t""disambiguation"": [\n\t\t\t\t\t{\n\t\t\t\t\t\t""category"": ""generate_code_sample"",\n\t\t\t\t\t\t""description"": ""The user wants to generate code snippets without referencing the contents of the current workspace. This category does not include generating entire projects."",\n\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t""Write an example of computing a SHA256 hash.""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""category"": ""add_feature_to_file"",\n\t\t\t\t\t\t""description"": ""The user wants to change code in a file that is provided in their request, without referencing the contents of the current workspace. 
This category does not include generating entire projects."",\n\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t""Add a refresh button to the table widget.""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""category"": ""question_about_specific_files"",\n\t\t\t\t\t\t""description"": ""The user has a question about a specific file or code snippet that they have provided as part of their query, and the question does not require additional workspace context to answer."",\n\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t""What does this file do?""\n\t\t\t\t\t\t]\n\t\t\t\t\t}\n\t\t\t\t]\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""github.copilot.editingSession"",\n\t\t\t\t""name"": ""GitHubCopilot"",\n\t\t\t\t""fullName"": ""GitHub Copilot"",\n\t\t\t\t""description"": ""%copilot.edits.description%"",\n\t\t\t\t""isDefault"": true,\n\t\t\t\t""locations"": [\n\t\t\t\t\t""panel""\n\t\t\t\t],\n\t\t\t\t""modes"": [\n\t\t\t\t\t""edit""\n\t\t\t\t],\n\t\t\t\t""when"": ""!config.chat.edits2.enabled""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""github.copilot.editingSessionEditor"",\n\t\t\t\t""name"": ""GitHubCopilot"",\n\t\t\t\t""fullName"": ""GitHub Copilot"",\n\t\t\t\t""description"": ""%copilot.edits.description%"",\n\t\t\t\t""isDefault"": true,\n\t\t\t\t""when"": ""config.inlineChat.enableV2"",\n\t\t\t\t""locations"": [\n\t\t\t\t\t""editor""\n\t\t\t\t]\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""github.copilot.editingSession2"",\n\t\t\t\t""name"": ""GitHubCopilot"",\n\t\t\t\t""fullName"": ""GitHub Copilot"",\n\t\t\t\t""description"": ""%copilot.edits.description%"",\n\t\t\t\t""isDefault"": true,\n\t\t\t\t""locations"": [\n\t\t\t\t\t""panel""\n\t\t\t\t],\n\t\t\t\t""modes"": [\n\t\t\t\t\t""edit""\n\t\t\t\t],\n\t\t\t\t""when"": ""config.chat.edits2.enabled""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""github.copilot.editsAgent"",\n\t\t\t\t""name"": ""agent"",\n\t\t\t\t""fullName"": ""GitHub Copilot"",\n\t\t\t\t""description"": ""%copilot.agent.description%"",\n\t\t\t\t""locations"": 
[\n\t\t\t\t\t""panel""\n\t\t\t\t],\n\t\t\t\t""modes"": [\n\t\t\t\t\t""agent""\n\t\t\t\t],\n\t\t\t\t""isEngine"": true,\n\t\t\t\t""isDefault"": true,\n\t\t\t\t""isAgent"": true,\n\t\t\t\t""when"": ""config.chat.agent.enabled"",\n\t\t\t\t""commands"": [\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""list""\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""error"",\n\t\t\t\t\t\t""description"": ""Make a model request which will result in an error"",\n\t\t\t\t\t\t""when"": ""github.copilot.chat.debug""\n\t\t\t\t\t}\n\t\t\t\t]\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""github.copilot.editor"",\n\t\t\t\t""name"": ""Copilot"",\n\t\t\t\t""fullName"": ""GitHub Copilot"",\n\t\t\t\t""description"": ""%copilot.description%"",\n\t\t\t\t""isDefault"": true,\n\t\t\t\t""locations"": [\n\t\t\t\t\t""editor""\n\t\t\t\t],\n\t\t\t\t""when"": ""!config.inlineChat.enableV2"",\n\t\t\t\t""disambiguation"": [\n\t\t\t\t\t{\n\t\t\t\t\t\t""category"": ""unknown"",\n\t\t\t\t\t\t""description"": ""Intent of this command is unclear or is not related to information technologies"",\n\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t""Add a dog to this comment.""\n\t\t\t\t\t\t]\n\t\t\t\t\t}\n\t\t\t\t],\n\t\t\t\t""commands"": [\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""generate"",\n\t\t\t\t\t\t""description"": ""%copilot.workspace.generate.description%"",\n\t\t\t\t\t\t""disambiguation"": [\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t""category"": ""generate"",\n\t\t\t\t\t\t\t\t""description"": ""Generate new code"",\n\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t""Add a function that returns the sum of two numbers""\n\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""edit"",\n\t\t\t\t\t\t""description"": ""%copilot.workspace.edit.inline.description%"",\n\t\t\t\t\t\t""disambiguation"": [\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t""category"": ""edit"",\n\t\t\t\t\t\t\t\t""description"": ""Make changes to existing code"",\n\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t""Change this 
method to use async/await""\n\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""doc"",\n\t\t\t\t\t\t""description"": ""%copilot.workspace.doc.description%"",\n\t\t\t\t\t\t""disambiguation"": [\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t""category"": ""doc"",\n\t\t\t\t\t\t\t\t""description"": ""Add documentation comment for this symbol"",\n\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t""Add jsdoc to this method""\n\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""fix"",\n\t\t\t\t\t\t""description"": ""%copilot.workspace.fix.description%"",\n\t\t\t\t\t\t""disambiguation"": [\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t""category"": ""fix"",\n\t\t\t\t\t\t\t\t""description"": ""Propose a fix for the problems in the selected code"",\n\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t""There is a problem in this code. Rewrite the code to show it with the bug fixed.""\n\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""explain"",\n\t\t\t\t\t\t""description"": ""%copilot.workspace.explain.description%"",\n\t\t\t\t\t\t""disambiguation"": [\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t""category"": ""explain"",\n\t\t\t\t\t\t\t\t""description"": ""Explain how the code in your active editor works"",\n\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t""Write an explanation for the code above as paragraphs of text.""\n\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""review"",\n\t\t\t\t\t\t""description"": ""%copilot.workspace.review.description%"",\n\t\t\t\t\t\t""when"": ""github.copilot.advanced.review.intent""\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""tests"",\n\t\t\t\t\t\t""description"": ""%copilot.workspace.tests.description%"",\n\t\t\t\t\t\t""disambiguation"": [\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t""category"": ""tests"",\n\t\t\t\t\t\t\t\t""description"": ""Generate unit tests for the 
selected code. The user does not want to fix their existing tests."",\n\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t""Write a set of detailed unit test functions for the code above.""\n\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t]\n\t\t\t\t\t}\n\t\t\t\t]\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""github.copilot.notebook"",\n\t\t\t\t""name"": ""GitHubCopilot"",\n\t\t\t\t""fullName"": ""GitHub Copilot"",\n\t\t\t\t""description"": ""%copilot.description%"",\n\t\t\t\t""isDefault"": true,\n\t\t\t\t""locations"": [\n\t\t\t\t\t""notebook""\n\t\t\t\t],\n\t\t\t\t""when"": ""!config.inlineChat.notebookAgent"",\n\t\t\t\t""commands"": [\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""fix"",\n\t\t\t\t\t\t""description"": ""%copilot.workspace.fix.description%""\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""explain"",\n\t\t\t\t\t\t""description"": ""%copilot.workspace.explain.description%""\n\t\t\t\t\t}\n\t\t\t\t]\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""github.copilot.notebookEditorAgent"",\n\t\t\t\t""name"": ""GitHubCopilot"",\n\t\t\t\t""fullName"": ""GitHub Copilot"",\n\t\t\t\t""description"": ""%copilot.description%"",\n\t\t\t\t""isDefault"": true,\n\t\t\t\t""locations"": [\n\t\t\t\t\t""notebook""\n\t\t\t\t],\n\t\t\t\t""when"": ""config.inlineChat.notebookAgent"",\n\t\t\t\t""commands"": [\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""fix"",\n\t\t\t\t\t\t""description"": ""%copilot.workspace.fix.description%""\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""explain"",\n\t\t\t\t\t\t""description"": ""%copilot.workspace.explain.description%""\n\t\t\t\t\t}\n\t\t\t\t]\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""github.copilot.workspace"",\n\t\t\t\t""name"": ""workspace"",\n\t\t\t\t""fullName"": ""Workspace"",\n\t\t\t\t""description"": ""%copilot.workspace.description%"",\n\t\t\t\t""when"": ""!github.copilot.interactiveSession.disabled"",\n\t\t\t\t""sampleRequest"": ""%copilot.workspace.sampleRequest%"",\n\t\t\t\t""locations"": [\n\t\t\t\t\t""panel""\n\t\t\t\t],\n\t\t\t\t""disambiguation"": 
[\n\t\t\t\t\t{\n\t\t\t\t\t\t""category"": ""workspace_project_questions"",\n\t\t\t\t\t\t""description"": ""The user wants to learn about or update the code or files in their current workspace. Questions in this category may be about understanding what the whole workspace does or locating the implementation of some code. This does not include generating or updating tests."",\n\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t""What does this project do?""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""category"": ""find_code_in_workspace"",\n\t\t\t\t\t\t""description"": ""The user wants to locate the implementation of some functionality in their current workspace."",\n\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t""Where is the tree widget implemented?""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""category"": ""generate_with_workspace_context"",\n\t\t\t\t\t\t""description"": ""The user wants to generate code based on multiple files in the workspace and did not specify which files to reference."",\n\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t""Create a README for this project.""\n\t\t\t\t\t\t]\n\t\t\t\t\t}\n\t\t\t\t],\n\t\t\t\t""commands"": [\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""explain"",\n\t\t\t\t\t\t""description"": ""%copilot.workspace.explain.description%""\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""review"",\n\t\t\t\t\t\t""description"": ""%copilot.workspace.review.description%"",\n\t\t\t\t\t\t""when"": ""github.copilot.advanced.review.intent""\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""tests"",\n\t\t\t\t\t\t""description"": ""%copilot.workspace.tests.description%"",\n\t\t\t\t\t\t""disambiguation"": [\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t""category"": ""create_tests"",\n\t\t\t\t\t\t\t\t""description"": ""The user wants to generate unit tests."",\n\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t""Generate tests for my selection using 
pytest.""\n\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""fix"",\n\t\t\t\t\t\t""description"": ""%copilot.workspace.fix.description%"",\n\t\t\t\t\t\t""sampleRequest"": ""%copilot.workspace.fix.sampleRequest%""\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""new"",\n\t\t\t\t\t\t""description"": ""%copilot.workspace.new.description%"",\n\t\t\t\t\t\t""sampleRequest"": ""%copilot.workspace.new.sampleRequest%"",\n\t\t\t\t\t\t""isSticky"": true,\n\t\t\t\t\t\t""disambiguation"": [\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t""category"": ""create_new_workspace_or_extension"",\n\t\t\t\t\t\t\t\t""description"": ""The user wants to create a complete Visual Studio Code workspace from scratch, such as a new application or a Visual Studio Code extension. Use this category only if the question relates to generating or creating new workspaces in Visual Studio Code. Do not use this category for updating existing code or generating sample code snippets"",\n\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t""Scaffold a Node server."",\n\t\t\t\t\t\t\t\t\t""Create a sample project which uses the fileSystemProvider API."",\n\t\t\t\t\t\t\t\t\t""react application""\n\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""newNotebook"",\n\t\t\t\t\t\t""description"": ""%copilot.workspace.newNotebook.description%"",\n\t\t\t\t\t\t""sampleRequest"": ""%copilot.workspace.newNotebook.sampleRequest%"",\n\t\t\t\t\t\t""disambiguation"": [\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t""category"": ""create_jupyter_notebook"",\n\t\t\t\t\t\t\t\t""description"": ""The user wants to create a new Jupyter notebook in Visual Studio Code."",\n\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t""Create a notebook to analyze this CSV file.""\n\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""semanticSearch"",\n\t\t\t\t\t\t""description"": 
""%copilot.workspace.semanticSearch.description%"",\n\t\t\t\t\t\t""sampleRequest"": ""%copilot.workspace.semanticSearch.sampleRequest%"",\n\t\t\t\t\t\t""when"": ""config.github.copilot.semanticSearch.enabled""\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""setupTests"",\n\t\t\t\t\t\t""description"": ""%copilot.vscode.setupTests.description%"",\n\t\t\t\t\t\t""sampleRequest"": ""%copilot.vscode.setupTests.sampleRequest%"",\n\t\t\t\t\t\t""when"": ""config.github.copilot.chat.setupTests.enabled"",\n\t\t\t\t\t\t""disambiguation"": [\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t""category"": ""set_up_tests"",\n\t\t\t\t\t\t\t\t""description"": ""The user wants to configure project test setup, framework, or test runner. The user does not want to fix their existing tests."",\n\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t""Set up tests for this project.""\n\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t]\n\t\t\t\t\t}\n\t\t\t\t]\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""github.copilot.vscode"",\n\t\t\t\t""name"": ""vscode"",\n\t\t\t\t""fullName"": ""VS Code"",\n\t\t\t\t""description"": ""%copilot.vscode.description%"",\n\t\t\t\t""when"": ""!github.copilot.interactiveSession.disabled"",\n\t\t\t\t""sampleRequest"": ""%copilot.vscode.sampleRequest%"",\n\t\t\t\t""locations"": [\n\t\t\t\t\t""panel""\n\t\t\t\t],\n\t\t\t\t""disambiguation"": [\n\t\t\t\t\t{\n\t\t\t\t\t\t""category"": ""vscode_configuration_questions"",\n\t\t\t\t\t\t""description"": ""The user wants to learn about, use, or configure the Visual Studio Code. Use this category if the users question is specifically about commands, settings, keybindings, extensions and other features available in Visual Studio Code. 
Do not use this category to answer questions about generating code or creating new projects including Visual Studio Code extensions."",\n\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t""Switch to light mode."",\n\t\t\t\t\t\t\t""Keyboard shortcut to toggle terminal visibility."",\n\t\t\t\t\t\t\t""Settings to enable minimap."",\n\t\t\t\t\t\t\t""Whats new in the latest release?""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""category"": ""configure_python_environment"",\n\t\t\t\t\t\t""description"": ""The user wants to set up their Python environment."",\n\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t""Create a virtual environment for my project.""\n\t\t\t\t\t\t]\n\t\t\t\t\t}\n\t\t\t\t],\n\t\t\t\t""commands"": [\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""search"",\n\t\t\t\t\t\t""description"": ""%copilot.vscode.search.description%"",\n\t\t\t\t\t\t""sampleRequest"": ""%copilot.vscode.search.sampleRequest%""\n\t\t\t\t\t}\n\t\t\t\t]\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""github.copilot.terminal"",\n\t\t\t\t""name"": ""terminal"",\n\t\t\t\t""fullName"": ""Terminal"",\n\t\t\t\t""description"": ""%copilot.terminal.description%"",\n\t\t\t\t""when"": ""!github.copilot.interactiveSession.disabled"",\n\t\t\t\t""sampleRequest"": ""%copilot.terminal.sampleRequest%"",\n\t\t\t\t""isDefault"": true,\n\t\t\t\t""locations"": [\n\t\t\t\t\t""terminal""\n\t\t\t\t],\n\t\t\t\t""commands"": [\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""explain"",\n\t\t\t\t\t\t""description"": ""%copilot.terminal.explain.description%"",\n\t\t\t\t\t\t""sampleRequest"": ""%copilot.terminal.explain.sampleRequest%""\n\t\t\t\t\t}\n\t\t\t\t]\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""github.copilot.terminalPanel"",\n\t\t\t\t""name"": ""terminal"",\n\t\t\t\t""fullName"": ""Terminal"",\n\t\t\t\t""description"": ""%copilot.terminalPanel.description%"",\n\t\t\t\t""when"": ""!github.copilot.interactiveSession.disabled"",\n\t\t\t\t""sampleRequest"": ""%copilot.terminal.sampleRequest%"",\n\t\t\t\t""locations"": 
[\n\t\t\t\t\t""panel""\n\t\t\t\t],\n\t\t\t\t""commands"": [\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""explain"",\n\t\t\t\t\t\t""description"": ""%copilot.terminal.explain.description%"",\n\t\t\t\t\t\t""sampleRequest"": ""%copilot.terminal.explain.sampleRequest%"",\n\t\t\t\t\t\t""disambiguation"": [\n\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t""category"": ""terminal_state_questions"",\n\t\t\t\t\t\t\t\t""description"": ""The user wants to learn about specific state such as the selection, command, or failed command in the integrated terminal in Visual Studio Code."",\n\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t""Why did the latest terminal command fail?""\n\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t]\n\t\t\t\t\t}\n\t\t\t\t]\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""github.copilot.chatReplay"",\n\t\t\t\t""name"": ""chatReplay"",\n\t\t\t\t""fullName"": ""Chat Replay"",\n\t\t\t\t""when"": ""debugType == 'vscode-chat-replay'"",\n\t\t\t\t""locations"": [\n\t\t\t\t\t""panel""\n\t\t\t\t]\n\t\t\t}\n\t\t],\n\t\t""languageModelChatProviders"": [\n\t\t\t{\n\t\t\t\t""vendor"": ""copilot"",\n\t\t\t\t""displayName"": ""Copilot""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""vendor"": ""azure"",\n\t\t\t\t""displayName"": ""Azure"",\n\t\t\t\t""managementCommand"": ""github.copilot.chat.manageBYOK""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""vendor"": ""anthropic"",\n\t\t\t\t""displayName"": ""Anthropic"",\n\t\t\t\t""managementCommand"": ""github.copilot.chat.manageBYOK""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""vendor"": ""xai"",\n\t\t\t\t""displayName"": ""xAI"",\n\t\t\t\t""managementCommand"": ""github.copilot.chat.manageBYOK""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""vendor"": ""ollama"",\n\t\t\t\t""displayName"": ""Ollama""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""vendor"": ""openai"",\n\t\t\t\t""displayName"": ""OpenAI"",\n\t\t\t\t""managementCommand"": ""github.copilot.chat.manageBYOK""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""vendor"": ""gemini"",\n\t\t\t\t""displayName"": ""Google"",\n\t\t\t\t""managementCommand"": 
""github.copilot.chat.manageBYOK""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""vendor"": ""groq"",\n\t\t\t\t""displayName"": ""Groq"",\n\t\t\t\t""managementCommand"": ""github.copilot.chat.manageBYOK""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""vendor"": ""openrouter"",\n\t\t\t\t""displayName"": ""OpenRouter"",\n\t\t\t\t""managementCommand"": ""github.copilot.chat.manageBYOK""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""vendor"": ""customoai"",\n\t\t\t\t""when"": ""productQualityType != 'stable'"",\n\t\t\t\t""displayName"": ""OpenAI Compatible"",\n\t\t\t\t""managementCommand"": ""github.copilot.chat.manageBYOK""\n\t\t\t}\n\t\t],\n\t\t""interactiveSession"": [\n\t\t\t{\n\t\t\t\t""label"": ""GitHub Copilot"",\n\t\t\t\t""id"": ""copilot"",\n\t\t\t\t""icon"": """",\n\t\t\t\t""when"": ""!github.copilot.interactiveSession.disabled""\n\t\t\t}\n\t\t],\n\t\t""viewsWelcome"": [\n\t\t\t{\n\t\t\t\t""view"": ""debug"",\n\t\t\t\t""when"": ""github.copilot-chat.activated"",\n\t\t\t\t""contents"": ""%github.copilot.viewsWelcome.debug%""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""view"": ""codex-placeholder"",\n\t\t\t\t""when"": ""true"",\n\t\t\t\t""contents"": ""%github.copilot.viewsWelcome.codexPlaceholder%""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""view"": ""workbench.view.chat.sessions.openai-codex"",\n\t\t\t\t""contents"": ""%github.copilot.viewsWelcome.codexWelcomeView%""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""view"": ""copilot-agents-placeholder"",\n\t\t\t\t""when"": ""true"",\n\t\t\t\t""contents"": ""%github.copilot.viewsWelcome.agentsPlaceholder%""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""view"": ""workbench.view.chat.sessions.copilot-cloud-agent"",\n\t\t\t\t""when"": ""workspaceFolderCount == 0"",\n\t\t\t\t""contents"": ""%github.copilot.viewsWelcome.noFolder.contents%""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""view"": ""workbench.view.chat.sessions.copilot-cloud-agent"",\n\t\t\t\t""when"": ""git.state == initialized && gitOpenRepositoryCount == 0 && workspaceFolderCount > 0 && git.parentRepositoryCount == 0"",\n\t\t\t\t""contents"": 
""%github.copilot.viewsWelcome.noRepo.contents%""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""view"": ""workbench.view.chat.sessions.copilot-cloud-agent"",\n\t\t\t\t""when"": ""git.state == initialized && workspaceFolderCount > 0 && (git.parentRepositoryCount > 0 || gitOpenRepositoryCount > 0) && !github:hasGitHubRemotes"",\n\t\t\t\t""contents"": ""%github.copilot.viewsWelcome.noGitHub.contents%""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""view"": ""workbench.view.chat.sessions.copilot-cloud-agent"",\n\t\t\t\t""when"": ""github.copilot.chat.cloudSessionsEmpty"",\n\t\t\t\t""contents"": ""%github.copilot.viewsWelcome.cloudSessionsEmpty.contents%""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""view"": ""workbench.view.chat.sessions.copilotcli"",\n\t\t\t\t""when"": ""github.copilot.chat.cliSessionsEmpty"",\n\t\t\t\t""contents"": ""%github.copilot.viewsWelcome.cliSessionsEmpty.contents%""\n\t\t\t}\n\t\t],\n\t\t""chatViewsWelcome"": [\n\t\t\t{\n\t\t\t\t""icon"": ""$(copilot-large)"",\n\t\t\t\t""title"": ""Ask Copilot"",\n\t\t\t\t""content"": ""%github.copilot.viewsWelcome.signIn%"",\n\t\t\t\t""when"": ""!github.copilot-chat.activated && !github.copilot.offline && !github.copilot.interactiveSession.individual.expired && !github.copilot.interactiveSession.enterprise.disabled && !github.copilot.interactiveSession.contactSupport && !github.copilot.interactiveSession.chatDisabled && !github.copilot.interactiveSession.switchToReleaseChannel""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""icon"": ""$(copilot-large)"",\n\t\t\t\t""title"": ""Ask Copilot"",\n\t\t\t\t""content"": ""%github.copilot.viewsWelcome.individual.expired%"",\n\t\t\t\t""when"": ""github.copilot.interactiveSession.individual.expired""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""icon"": ""$(copilot-large)"",\n\t\t\t\t""title"": ""Ask Copilot"",\n\t\t\t\t""content"": ""%github.copilot.viewsWelcome.enterprise%"",\n\t\t\t\t""when"": ""github.copilot.interactiveSession.enterprise.disabled""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""icon"": ""$(copilot-large)"",\n\t\t\t\t""title"": ""Ask 
Copilot"",\n\t\t\t\t""content"": ""%github.copilot.viewsWelcome.offline%"",\n\t\t\t\t""when"": ""github.copilot.offline""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""icon"": ""$(copilot-large)"",\n\t\t\t\t""title"": ""Ask Copilot"",\n\t\t\t\t""content"": ""%github.copilot.viewsWelcome.contactSupport%"",\n\t\t\t\t""when"": ""github.copilot.interactiveSession.contactSupport""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""icon"": ""$(copilot-large)"",\n\t\t\t\t""title"": ""Ask Copilot"",\n\t\t\t\t""content"": ""%github.copilot.viewsWelcome.chatDisabled%"",\n\t\t\t\t""when"": ""github.copilot.interactiveSession.chatDisabled""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""icon"": ""$(copilot-large)"",\n\t\t\t\t""title"": ""Ask Copilot"",\n\t\t\t\t""content"": ""%github.copilot.viewsWelcome.switchToReleaseChannel%"",\n\t\t\t\t""when"": ""github.copilot.interactiveSession.switchToReleaseChannel""\n\t\t\t}\n\t\t],\n\t\t""commands"": [\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.claude.sessions.refresh"",\n\t\t\t\t""title"": ""%github.copilot.command.refreshClaudeCodeSessions%"",\n\t\t\t\t""icon"": ""$(refresh)"",\n\t\t\t\t""category"": ""Claude Code""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.cli.sessions.refresh"",\n\t\t\t\t""title"": ""%github.copilot.command.refreshAgentSessions%"",\n\t\t\t\t""icon"": ""$(refresh)"",\n\t\t\t\t""category"": ""Copilot CLI""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.cli.sessions.delete"",\n\t\t\t\t""title"": ""%github.copilot.command.deleteAgentSession%"",\n\t\t\t\t""icon"": ""$(close)"",\n\t\t\t\t""category"": ""Copilot CLI""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.cli.sessions.resumeInTerminal"",\n\t\t\t\t""title"": ""%github.copilot.command.cli.sessions.resumeInTerminal%"",\n\t\t\t\t""icon"": ""$(terminal)"",\n\t\t\t\t""category"": ""Copilot CLI""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.cli.sessions.newTerminalSession"",\n\t\t\t\t""title"": 
""%github.copilot.cli.sessions.newTerminalSession%"",\n\t\t\t\t""icon"": ""$(terminal)"",\n\t\t\t\t""category"": ""Copilot CLI""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.replay"",\n\t\t\t\t""title"": ""Start Chat Replay"",\n\t\t\t\t""icon"": ""$(debug-line-by-line)"",\n\t\t\t\t""enablement"": ""resourceLangId == chatReplay && !inDebugMode""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.replay.enableWorkspaceEditTracing"",\n\t\t\t\t""title"": ""%github.copilot.command.enableEditTracing%"",\n\t\t\t\t""category"": ""Developer"",\n\t\t\t\t""enablement"": ""!github.copilot.chat.replay.workspaceEditTracing""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.replay.disableWorkspaceEditTracing"",\n\t\t\t\t""title"": ""%github.copilot.command.disableEditTracing%"",\n\t\t\t\t""category"": ""Developer"",\n\t\t\t\t""enablement"": ""github.copilot.chat.replay.workspaceEditTracing""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.explain"",\n\t\t\t\t""title"": ""%github.copilot.command.explainThis%"",\n\t\t\t\t""enablement"": ""!github.copilot.interactiveSession.disabled"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.explain.palette"",\n\t\t\t\t""title"": ""%github.copilot.command.explainThis%"",\n\t\t\t\t""enablement"": ""!github.copilot.interactiveSession.disabled && !editorReadonly"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.review"",\n\t\t\t\t""title"": ""%github.copilot.command.reviewAndComment%"",\n\t\t\t\t""enablement"": ""config.github.copilot.chat.reviewSelection.enabled && !github.copilot.interactiveSession.disabled"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.review.apply"",\n\t\t\t\t""title"": ""%github.copilot.command.applyReviewSuggestion%"",\n\t\t\t\t""icon"": ""$(sparkle)"",\n\t\t\t\t""enablement"": ""commentThread =~ 
/hasSuggestion/"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.review.applyAndNext"",\n\t\t\t\t""title"": ""%github.copilot.command.applyReviewSuggestionAndNext%"",\n\t\t\t\t""icon"": ""$(sparkle)"",\n\t\t\t\t""enablement"": ""commentThread =~ /hasSuggestion/"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.review.discard"",\n\t\t\t\t""title"": ""%github.copilot.command.discardReviewSuggestion%"",\n\t\t\t\t""icon"": ""$(close)"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.review.discardAndNext"",\n\t\t\t\t""title"": ""%github.copilot.command.discardReviewSuggestionAndNext%"",\n\t\t\t\t""icon"": ""$(close)"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.review.discardAll"",\n\t\t\t\t""title"": ""%github.copilot.command.discardAllReviewSuggestion%"",\n\t\t\t\t""icon"": ""$(close-all)"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.review.stagedChanges"",\n\t\t\t\t""title"": ""%github.copilot.command.reviewStagedChanges%"",\n\t\t\t\t""icon"": ""$(code-review)"",\n\t\t\t\t""enablement"": ""github.copilot.chat.reviewDiff.enabled && !github.copilot.interactiveSession.disabled"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.review.unstagedChanges"",\n\t\t\t\t""title"": ""%github.copilot.command.reviewUnstagedChanges%"",\n\t\t\t\t""icon"": ""$(code-review)"",\n\t\t\t\t""enablement"": ""github.copilot.chat.reviewDiff.enabled && !github.copilot.interactiveSession.disabled"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.review.changes"",\n\t\t\t\t""title"": ""%github.copilot.command.reviewChanges%"",\n\t\t\t\t""icon"": ""$(code-review)"",\n\t\t\t\t""enablement"": ""github.copilot.chat.reviewDiff.enabled && 
!github.copilot.interactiveSession.disabled"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.review.stagedFileChange"",\n\t\t\t\t""title"": ""%github.copilot.command.reviewFileChange%"",\n\t\t\t\t""icon"": ""$(code-review)"",\n\t\t\t\t""enablement"": ""github.copilot.chat.reviewDiff.enabled && !github.copilot.interactiveSession.disabled"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.review.unstagedFileChange"",\n\t\t\t\t""title"": ""%github.copilot.command.reviewFileChange%"",\n\t\t\t\t""icon"": ""$(code-review)"",\n\t\t\t\t""enablement"": ""github.copilot.chat.reviewDiff.enabled && !github.copilot.interactiveSession.disabled"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.review.previous"",\n\t\t\t\t""title"": ""%github.copilot.command.gotoPreviousReviewSuggestion%"",\n\t\t\t\t""icon"": ""$(arrow-up)"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.review.next"",\n\t\t\t\t""title"": ""%github.copilot.command.gotoNextReviewSuggestion%"",\n\t\t\t\t""icon"": ""$(arrow-down)"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.review.continueInInlineChat"",\n\t\t\t\t""title"": ""%github.copilot.command.continueReviewInInlineChat%"",\n\t\t\t\t""icon"": ""$(comment-discussion)"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.review.continueInChat"",\n\t\t\t\t""title"": ""%github.copilot.command.continueReviewInChat%"",\n\t\t\t\t""icon"": ""$(comment-discussion)"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.review.markHelpful"",\n\t\t\t\t""title"": ""%github.copilot.command.helpfulReviewSuggestion%"",\n\t\t\t\t""icon"": ""$(thumbsup)"",\n\t\t\t\t""enablement"": ""!(commentThread =~ /markedAsHelpful/)"",\n\t\t\t\t""category"": 
""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.openUserPreferences"",\n\t\t\t\t""title"": ""%github.copilot.command.openUserPreferences%"",\n\t\t\t\t""category"": ""Chat"",\n\t\t\t\t""enablement"": ""config.github.copilot.chat.enableUserPreferences""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.review.markUnhelpful"",\n\t\t\t\t""title"": ""%github.copilot.command.unhelpfulReviewSuggestion%"",\n\t\t\t\t""icon"": ""$(thumbsdown)"",\n\t\t\t\t""enablement"": ""!(commentThread =~ /markedAsUnhelpful/)"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.generate"",\n\t\t\t\t""title"": ""%github.copilot.command.generateThis%"",\n\t\t\t\t""icon"": ""$(sparkle)"",\n\t\t\t\t""enablement"": ""!github.copilot.interactiveSession.disabled && !editorReadonly"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.generateDocs"",\n\t\t\t\t""title"": ""%github.copilot.command.generateDocs%"",\n\t\t\t\t""enablement"": ""!github.copilot.interactiveSession.disabled && !editorReadonly"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.generateTests"",\n\t\t\t\t""title"": ""%github.copilot.command.generateTests%"",\n\t\t\t\t""enablement"": ""!github.copilot.interactiveSession.disabled && !editorReadonly"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.fix"",\n\t\t\t\t""title"": ""%github.copilot.command.fixThis%"",\n\t\t\t\t""enablement"": ""!github.copilot.interactiveSession.disabled && !editorReadonly"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.interactiveSession.feedback"",\n\t\t\t\t""title"": ""%github.copilot.command.sendChatFeedback%"",\n\t\t\t\t""enablement"": ""github.copilot-chat.activated && !github.copilot.interactiveSession.disabled"",\n\t\t\t\t""icon"": ""$(feedback)"",\n\t\t\t\t""category"": 
""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.debug.workbenchState"",\n\t\t\t\t""title"": ""%github.copilot.command.logWorkbenchState%"",\n\t\t\t\t""category"": ""Developer""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.debug.showChatLogView"",\n\t\t\t\t""title"": ""%github.copilot.command.showChatLogView%"",\n\t\t\t\t""category"": ""Developer""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.debug.showOutputChannel"",\n\t\t\t\t""title"": ""%github.copilot.command.showOutputChannel%"",\n\t\t\t\t""category"": ""Developer""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.debug.showContextInspectorView"",\n\t\t\t\t""title"": ""%github.copilot.command.showContextInspectorView%"",\n\t\t\t\t""icon"": ""$(inspect)"",\n\t\t\t\t""category"": ""Developer""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.debug.resetVirtualToolGroups"",\n\t\t\t\t""title"": ""%github.copilot.command.resetVirtualToolGroups%"",\n\t\t\t\t""icon"": ""$(inspect)"",\n\t\t\t\t""category"": ""Developer""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.terminal.explainTerminalLastCommand"",\n\t\t\t\t""title"": ""%github.copilot.command.explainTerminalLastCommand%"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.git.generateCommitMessage"",\n\t\t\t\t""title"": ""%github.copilot.git.generateCommitMessage%"",\n\t\t\t\t""icon"": ""$(sparkle)"",\n\t\t\t\t""enablement"": ""!github.copilot.interactiveSession.disabled"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.git.resolveMergeConflicts"",\n\t\t\t\t""title"": ""%github.copilot.git.resolveMergeConflicts%"",\n\t\t\t\t""icon"": ""$(chat-sparkle)"",\n\t\t\t\t""enablement"": ""!github.copilot.interactiveSession.disabled"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.devcontainer.generateDevContainerConfig"",\n\t\t\t\t""title"": 
""%github.copilot.devcontainer.generateDevContainerConfig%"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.tests.fixTestFailure"",\n\t\t\t\t""icon"": ""$(sparkle)"",\n\t\t\t\t""title"": ""%github.copilot.command.fixTestFailure%"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.tests.fixTestFailure.fromInline"",\n\t\t\t\t""icon"": ""$(sparkle)"",\n\t\t\t\t""title"": ""%github.copilot.command.fixTestFailure%""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.attachFile"",\n\t\t\t\t""title"": ""%github.copilot.chat.attachFile%"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.attachSelection"",\n\t\t\t\t""title"": ""%github.copilot.chat.attachSelection%"",\n\t\t\t\t""icon"": ""$(comment-discussion)"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.debug.collectDiagnostics"",\n\t\t\t\t""title"": ""%github.copilot.command.collectDiagnostics%"",\n\t\t\t\t""category"": ""Developer""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.debug.inlineEdit.clearCache"",\n\t\t\t\t""title"": ""%github.copilot.command.inlineEdit.clearCache%"",\n\t\t\t\t""category"": ""Developer""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.debug.inlineEdit.reportNotebookNESIssue"",\n\t\t\t\t""title"": ""%github.copilot.command.inlineEdit.reportNotebookNESIssue%"",\n\t\t\t\t""enablement"": ""config.github.copilot.chat.advanced.notebook.alternativeNESFormat.enabled || github.copilot.chat.enableEnhancedNotebookNES"",\n\t\t\t\t""category"": ""Developer""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.debug.generateSTest"",\n\t\t\t\t""title"": ""%github.copilot.command.generateSTest%"",\n\t\t\t\t""enablement"": ""github.copilot.debugReportFeedback"",\n\t\t\t\t""category"": ""Developer""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.open.walkthrough"",\n\t\t\t\t""title"": 
""%github.copilot.command.openWalkthrough%"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.debug.generateInlineEditTests"",\n\t\t\t\t""title"": ""Generate Inline Edit Tests"",\n\t\t\t\t""category"": ""Chat"",\n\t\t\t\t""enablement"": ""resourceScheme == 'ccreq'""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.buildLocalWorkspaceIndex"",\n\t\t\t\t""title"": ""%github.copilot.command.buildLocalWorkspaceIndex%"",\n\t\t\t\t""category"": ""Chat"",\n\t\t\t\t""enablement"": ""github.copilot-chat.activated""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.buildRemoteWorkspaceIndex"",\n\t\t\t\t""title"": ""%github.copilot.command.buildRemoteWorkspaceIndex%"",\n\t\t\t\t""category"": ""Chat"",\n\t\t\t\t""enablement"": ""github.copilot-chat.activated""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.report"",\n\t\t\t\t""title"": ""Report Issue"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.rerunWithCopilotDebug"",\n\t\t\t\t""title"": ""%github.copilot.command.rerunWithCopilotDebug%"",\n\t\t\t\t""category"": ""Chat""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.startCopilotDebugCommand"",\n\t\t\t\t""title"": ""Start Copilot Debug""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.clearTemporalContext"",\n\t\t\t\t""title"": ""Clear Temporal Context"",\n\t\t\t\t""category"": ""Developer""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.search.markHelpful"",\n\t\t\t\t""title"": ""Helpful"",\n\t\t\t\t""icon"": ""$(thumbsup)"",\n\t\t\t\t""enablement"": ""!github.copilot.search.feedback.sent""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.search.markUnhelpful"",\n\t\t\t\t""title"": ""Unhelpful"",\n\t\t\t\t""icon"": ""$(thumbsdown)"",\n\t\t\t\t""enablement"": ""!github.copilot.search.feedback.sent""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.search.feedback"",\n\t\t\t\t""title"": 
""Feedback"",\n\t\t\t\t""icon"": ""$(feedback)"",\n\t\t\t\t""enablement"": ""!github.copilot.search.feedback.sent""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.debug.showElements"",\n\t\t\t\t""title"": ""Show Rendered Elements""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.debug.hideElements"",\n\t\t\t\t""title"": ""Hide Rendered Elements""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.debug.showTools"",\n\t\t\t\t""title"": ""Show Tools""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.debug.hideTools"",\n\t\t\t\t""title"": ""Hide Tools""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.debug.showNesRequests"",\n\t\t\t\t""title"": ""Show NES Requests""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.debug.hideNesRequests"",\n\t\t\t\t""title"": ""Hide NES Requests""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.debug.showRawRequestBody"",\n\t\t\t\t""title"": ""Show Raw Request Body""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.debug.exportLogItem"",\n\t\t\t\t""title"": ""Export as..."",\n\t\t\t\t""icon"": ""$(export)""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.debug.exportPromptArchive"",\n\t\t\t\t""title"": ""Export All as Archive..."",\n\t\t\t\t""icon"": ""$(archive)""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.debug.exportPromptLogsAsJson"",\n\t\t\t\t""title"": ""Export All as JSON..."",\n\t\t\t\t""icon"": ""$(export)""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.debug.exportAllPromptLogsAsJson"",\n\t\t\t\t""title"": ""Export All Prompt Logs as JSON..."",\n\t\t\t\t""icon"": ""$(export)""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.debug.collectWorkspaceIndexDiagnostics"",\n\t\t\t\t""title"": ""%github.copilot.command.collectWorkspaceIndexDiagnostics%"",\n\t\t\t\t""category"": ""Developer""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": 
""github.copilot.chat.mcp.setup.check"",\n\t\t\t\t""title"": ""MCP Check: is supported""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.mcp.setup.validatePackage"",\n\t\t\t\t""title"": ""MCP Check: validate package""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.mcp.setup.flow"",\n\t\t\t\t""title"": ""MCP Check: do prompts""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.generateAltText"",\n\t\t\t\t""title"": ""Generate/Refine Alt Text""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.notebook.enableFollowCellExecution"",\n\t\t\t\t""title"": ""Enable Follow Cell Execution from Chat"",\n\t\t\t\t""shortTitle"": ""Follow"",\n\t\t\t\t""icon"": ""$(pinned)""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.notebook.disableFollowCellExecution"",\n\t\t\t\t""title"": ""Disable Follow Cell Execution from Chat"",\n\t\t\t\t""shortTitle"": ""Unfollow"",\n\t\t\t\t""icon"": ""$(pinned-dirty)""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.manageBYOK"",\n\t\t\t\t""title"": ""Manage Bring Your Own Key Vendor"",\n\t\t\t\t""enablement"": ""false""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.manageBYOKAPIKey"",\n\t\t\t\t""title"": ""Manage Bring Your Own Key API Key"",\n\t\t\t\t""enablement"": ""false""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.cloud.sessions.refresh"",\n\t\t\t\t""title"": ""%github.copilot.command.refreshAgentSessions%"",\n\t\t\t\t""icon"": ""$(refresh)""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.cloud.sessions.openInBrowser"",\n\t\t\t\t""title"": ""%github.copilot.command.openCopilotAgentSessionsInBrowser%"",\n\t\t\t\t""icon"": ""$(link-external)""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.cloud.sessions.proxy.closeChatSessionPullRequest"",\n\t\t\t\t""title"": ""%github.copilot.command.closeChatSessionPullRequest.title%""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""command"": 
""github.copilot.chat.openSuggestionsPanel"",\n ""title"": ""Open Completions Panel"",\n ""enablement"": ""github.copilot.extensionUnification.activated && !isWeb"",\n\t\t\t\t""category"": ""GitHub Copilot""\n\t\t\t},\n\t\t\t{\n ""command"": ""github.copilot.chat.toggleStatusMenu"",\n ""title"": ""Open Status Menu"",\n\t\t\t\t""enablement"": ""github.copilot.extensionUnification.activated"",\n ""category"": ""GitHub Copilot""\n },\n\t\t\t{\n ""command"": ""github.copilot.chat.completions.disable"",\n ""title"": ""Disable Completions"",\n ""enablement"": ""github.copilot.extensionUnification.activated && github.copilot.activated && config.editor.inlineSuggest.enabled && github.copilot.completions.enabled"",\n ""category"": ""GitHub Copilot""\n },\n {\n ""command"": ""github.copilot.chat.completions.enable"",\n ""title"": ""Enable Completions"",\n ""enablement"": ""github.copilot.extensionUnification.activated && github.copilot.activated && !(config.editor.inlineSuggest.enabled && github.copilot.completions.enabled)"",\n ""category"": ""GitHub Copilot""\n },\n {\n ""command"": ""github.copilot.chat.completions.toggle"",\n ""title"": ""Toggle (Enable/Disable) Completions"",\n ""enablement"": ""github.copilot.extensionUnification.activated && github.copilot.activated"",\n ""category"": ""GitHub Copilot""\n }\n\t\t],\n\t\t""configuration"": [\n\t\t\t{\n\t\t\t\t""title"": ""GitHub Copilot Chat"",\n\t\t\t\t""id"": ""stable"",\n\t\t\t\t""properties"": {\n\t\t\t\t\t""github.copilot.chat.codeGeneration.useInstructionFiles"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": true,\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.codeGeneration.useInstructionFiles%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.editor.enableCodeActions"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": true,\n\t\t\t\t\t\t""description"": 
""%github.copilot.config.enableCodeActions%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.renameSuggestions.triggerAutomatically"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": true,\n\t\t\t\t\t\t""description"": ""%github.copilot.config.renameSuggestions.triggerAutomatically%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.localeOverride"": {\n\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t""enum"": [\n\t\t\t\t\t\t\t""auto"",\n\t\t\t\t\t\t\t""en"",\n\t\t\t\t\t\t\t""fr"",\n\t\t\t\t\t\t\t""it"",\n\t\t\t\t\t\t\t""de"",\n\t\t\t\t\t\t\t""es"",\n\t\t\t\t\t\t\t""ru"",\n\t\t\t\t\t\t\t""zh-CN"",\n\t\t\t\t\t\t\t""zh-TW"",\n\t\t\t\t\t\t\t""ja"",\n\t\t\t\t\t\t\t""ko"",\n\t\t\t\t\t\t\t""cs"",\n\t\t\t\t\t\t\t""pt-br"",\n\t\t\t\t\t\t\t""tr"",\n\t\t\t\t\t\t\t""pl""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""enumDescriptions"": [\n\t\t\t\t\t\t\t""Use VS Code's configured display language"",\n\t\t\t\t\t\t\t""English"",\n\t\t\t\t\t\t\t""français"",\n\t\t\t\t\t\t\t""italiano"",\n\t\t\t\t\t\t\t""Deutsch"",\n\t\t\t\t\t\t\t""español"",\n\t\t\t\t\t\t\t""русский"",\n\t\t\t\t\t\t\t""中文(简体)"",\n\t\t\t\t\t\t\t""中文(繁體)"",\n\t\t\t\t\t\t\t""日本語"",\n\t\t\t\t\t\t\t""한국어"",\n\t\t\t\t\t\t\t""čeština"",\n\t\t\t\t\t\t\t""português"",\n\t\t\t\t\t\t\t""Türkçe"",\n\t\t\t\t\t\t\t""polski""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""default"": ""auto"",\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.localeOverride%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.terminalChatLocation"": {\n\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t""default"": ""chatView"",\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.terminalChatLocation%"",\n\t\t\t\t\t\t""markdownEnumDescriptions"": [\n\t\t\t\t\t\t\t""%github.copilot.config.terminalChatLocation.chatView%"",\n\t\t\t\t\t\t\t""%github.copilot.config.terminalChatLocation.quickChat%"",\n\t\t\t\t\t\t\t""%github.copilot.config.terminalChatLocation.terminal%""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""enum"": 
[\n\t\t\t\t\t\t\t""chatView"",\n\t\t\t\t\t\t\t""quickChat"",\n\t\t\t\t\t\t\t""terminal""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.scopeSelection"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": false,\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.scopeSelection%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.useProjectTemplates"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": true,\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.useProjectTemplates%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.nextEditSuggestions.enabled"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": false,\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""nextEditSuggestions"",\n\t\t\t\t\t\t\t""onExp""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.nextEditSuggestions.enabled%"",\n\t\t\t\t\t\t""scope"": ""language-overridable""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.nextEditSuggestions.fixes"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": true,\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""nextEditSuggestions"",\n\t\t\t\t\t\t\t""onExp""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.nextEditSuggestions.fixes%"",\n\t\t\t\t\t\t""scope"": ""language-overridable""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.nextEditSuggestions.allowWhitespaceOnlyChanges"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": true,\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""nextEditSuggestions"",\n\t\t\t\t\t\t\t""onExp""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.nextEditSuggestions.allowWhitespaceOnlyChanges%"",\n\t\t\t\t\t\t""scope"": ""language-overridable""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.agent.autoFix"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": true,\n\t\t\t\t\t\t""description"": ""%github.copilot.config.autoFix%"",\n\t\t\t\t\t\t""tags"": 
[\n\t\t\t\t\t\t\t""onExp""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.customInstructionsInSystemMessage"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": true,\n\t\t\t\t\t\t""description"": ""%github.copilot.config.customInstructionsInSystemMessage%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.agent.currentEditorContext.enabled"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": true,\n\t\t\t\t\t\t""description"": ""%github.copilot.config.agent.currentEditorContext.enabled%""\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""preview"",\n\t\t\t\t""properties"": {\n\t\t\t\t\t""github.copilot.chat.reviewAgent.enabled"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": true,\n\t\t\t\t\t\t""description"": ""%github.copilot.config.reviewAgent.enabled%"",\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""preview""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.reviewSelection.enabled"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": true,\n\t\t\t\t\t\t""description"": ""%github.copilot.config.reviewSelection.enabled%"",\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""preview""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.reviewSelection.instructions"": {\n\t\t\t\t\t\t""type"": ""array"",\n\t\t\t\t\t\t""items"": {\n\t\t\t\t\t\t\t""oneOf"": [\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.reviewSelection.instruction.file%"",\n\t\t\t\t\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t\t\t\t\t""file"": {\n\t\t\t\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t\t\t\t"".copilot-review-instructions.md""\n\t\t\t\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t""language"": {\n\t\t\t\t\t\t\t\t\t\t\t""type"": ""string""\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t""examples"": 
[\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t""file"": "".copilot-review-instructions.md""\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t],\n\t\t\t\t\t\t\t\t\t""required"": [\n\t\t\t\t\t\t\t\t\t\t""file""\n\t\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.reviewSelection.instruction.text%"",\n\t\t\t\t\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t\t\t\t\t""text"": {\n\t\t\t\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t\t\t\t""Use underscore for field names.""\n\t\t\t\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t""language"": {\n\t\t\t\t\t\t\t\t\t\t\t""type"": ""string""\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t""required"": [\n\t\t\t\t\t\t\t\t\t\t""text""\n\t\t\t\t\t\t\t\t\t],\n\t\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t""text"": ""Use underscore for field names.""\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t""text"": ""Resolve all TODO tasks.""\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t]\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""default"": [],\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.reviewSelection.instructions%"",\n\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t[\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t""file"": "".copilot-review-instructions.md""\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t""text"": ""Resolve all TODO tasks.""\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t]\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""preview""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.copilotDebugCommand.enabled"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": true,\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""preview""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""description"": 
""%github.copilot.chat.copilotDebugCommand.enabled%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.codesearch.enabled"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": false,\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""preview""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.codesearch.enabled%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.byok.ollamaEndpoint"": {\n\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t""default"": ""http://localhost:11434"",\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""preview""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.byok.ollamaEndpoint%""\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""experimental"",\n\t\t\t\t""properties"": {\n\t\t\t\t\t""github.copilot.chat.imageUpload.enabled"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": true,\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental"",\n\t\t\t\t\t\t\t""onExp""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.imageUpload.enabled%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.edits.suggestRelatedFilesFromGitHistory"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": true,\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.edits.suggestRelatedFilesFromGitHistory%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.edits.suggestRelatedFilesForTests"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": true,\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.chat.edits.suggestRelatedFilesForTests%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.codeGeneration.instructions"": {\n\t\t\t\t\t\t""markdownDeprecationMessage"": ""%github.copilot.config.codeGeneration.instructions.deprecated%"",\n\t\t\t\t\t\t""type"": ""array"",\n\t\t\t\t\t\t""items"": 
{\n\t\t\t\t\t\t\t""oneOf"": [\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.codeGeneration.instruction.file%"",\n\t\t\t\t\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t\t\t\t\t""file"": {\n\t\t\t\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t\t\t\t"".copilot-codeGeneration-instructions.md""\n\t\t\t\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t""language"": {\n\t\t\t\t\t\t\t\t\t\t\t""type"": ""string""\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t""file"": "".copilot-codeGeneration-instructions.md""\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t],\n\t\t\t\t\t\t\t\t\t""required"": [\n\t\t\t\t\t\t\t\t\t\t""file""\n\t\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.codeGeneration.instruction.text%"",\n\t\t\t\t\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t\t\t\t\t""text"": {\n\t\t\t\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t\t\t\t""Use underscore for field names.""\n\t\t\t\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t""language"": {\n\t\t\t\t\t\t\t\t\t\t\t""type"": ""string""\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t""required"": [\n\t\t\t\t\t\t\t\t\t\t""text""\n\t\t\t\t\t\t\t\t\t],\n\t\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t""text"": ""Use underscore for field names.""\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t""text"": ""Always add a comment: 'Generated by Copilot'.""\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t]\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""default"": [],\n\t\t\t\t\t\t""markdownDescription"": 
""%github.copilot.config.codeGeneration.instructions%"",\n\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t[\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t""file"": "".copilot-codeGeneration-instructions.md""\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t""text"": ""Always add a comment: 'Generated by Copilot'.""\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t]\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.testGeneration.instructions"": {\n\t\t\t\t\t\t""markdownDeprecationMessage"": ""%github.copilot.config.testGeneration.instructions.deprecated%"",\n\t\t\t\t\t\t""type"": ""array"",\n\t\t\t\t\t\t""items"": {\n\t\t\t\t\t\t\t""oneOf"": [\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.experimental.testGeneration.instruction.file%"",\n\t\t\t\t\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t\t\t\t\t""file"": {\n\t\t\t\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t\t\t\t"".copilot-test-instructions.md""\n\t\t\t\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t""language"": {\n\t\t\t\t\t\t\t\t\t\t\t""type"": ""string""\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t""file"": "".copilot-test-instructions.md""\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t],\n\t\t\t\t\t\t\t\t\t""required"": [\n\t\t\t\t\t\t\t\t\t\t""file""\n\t\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.experimental.testGeneration.instruction.text%"",\n\t\t\t\t\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t\t\t\t\t""text"": {\n\t\t\t\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t\t\t\t""Use suite and test instead of describe and 
it.""\n\t\t\t\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t\t""language"": {\n\t\t\t\t\t\t\t\t\t\t\t""type"": ""string""\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t""required"": [\n\t\t\t\t\t\t\t\t\t\t""text""\n\t\t\t\t\t\t\t\t\t],\n\t\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t""text"": ""Always try uniting related tests in a suite.""\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t]\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""default"": [],\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.testGeneration.instructions%"",\n\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t[\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t""file"": "".copilot-test-instructions.md""\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t""text"": ""Always try uniting related tests in a suite.""\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t]\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.commitMessageGeneration.instructions"": {\n\t\t\t\t\t\t""type"": ""array"",\n\t\t\t\t\t\t""items"": {\n\t\t\t\t\t\t\t""oneOf"": [\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.commitMessageGeneration.instruction.file%"",\n\t\t\t\t\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t\t\t\t\t""file"": {\n\t\t\t\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t\t\t\t"".copilot-commit-message-instructions.md""\n\t\t\t\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t""file"": "".copilot-commit-message-instructions.md""\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t],\n\t\t\t\t\t\t\t\t\t""required"": [\n\t\t\t\t\t\t\t\t\t\t""file""\n\t\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t""type"": 
""object"",\n\t\t\t\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.commitMessageGeneration.instruction.text%"",\n\t\t\t\t\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t\t\t\t\t""text"": {\n\t\t\t\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t\t\t\t""Use conventional commit message format.""\n\t\t\t\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t""required"": [\n\t\t\t\t\t\t\t\t\t\t""text""\n\t\t\t\t\t\t\t\t\t],\n\t\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t""text"": ""Use conventional commit message format.""\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t]\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""default"": [],\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.commitMessageGeneration.instructions%"",\n\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t[\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t""file"": "".copilot-commit-message-instructions.md""\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t""text"": ""Use conventional commit message format.""\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t]\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.pullRequestDescriptionGeneration.instructions"": {\n\t\t\t\t\t\t""type"": ""array"",\n\t\t\t\t\t\t""items"": {\n\t\t\t\t\t\t\t""oneOf"": [\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.pullRequestDescriptionGeneration.instruction.file%"",\n\t\t\t\t\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t\t\t\t\t""file"": {\n\t\t\t\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t\t\t\t"".copilot-pull-request-description-instructions.md""\n\t\t\t\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t""examples"": 
[\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t""file"": "".copilot-pull-request-description-instructions.md""\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t],\n\t\t\t\t\t\t\t\t\t""required"": [\n\t\t\t\t\t\t\t\t\t\t""file""\n\t\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.pullRequestDescriptionGeneration.instruction.text%"",\n\t\t\t\t\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t\t\t\t\t""text"": {\n\t\t\t\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t\t\t\t""Include every commit message in the pull request description.""\n\t\t\t\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t\t""required"": [\n\t\t\t\t\t\t\t\t\t\t""text""\n\t\t\t\t\t\t\t\t\t],\n\t\t\t\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t\t\t""text"": ""Include every commit message in the pull request description.""\n\t\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t]\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""default"": [],\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.pullRequestDescriptionGeneration.instructions%"",\n\t\t\t\t\t\t""examples"": [\n\t\t\t\t\t\t\t[\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t""file"": "".copilot-pull-request-description-instructions.md""\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t{\n\t\t\t\t\t\t\t\t\t""text"": ""Use conventional commit message format.""\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t]\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.generateTests.codeLens"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": false,\n\t\t\t\t\t\t""description"": ""%github.copilot.config.generateTests.codeLens%"",\n\t\t\t\t\t\t""tags"": 
[\n\t\t\t\t\t\t\t""experimental""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.edits.temporalContext.enabled"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": false,\n\t\t\t\t\t\t""description"": ""%github.copilot.chat.edits.temporalContext.enabled%"",\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental"",\n\t\t\t\t\t\t\t""onExp""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.editor.temporalContext.enabled"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": false,\n\t\t\t\t\t\t""description"": ""%github.copilot.chat.editor.temporalContext.enabled%"",\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental"",\n\t\t\t\t\t\t\t""onExp""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.setupTests.enabled"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": true,\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.setupTests.enabled%"",\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.languageContext.typescript.enabled"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": false,\n\t\t\t\t\t\t""scope"": ""resource"",\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental"",\n\t\t\t\t\t\t\t""onExP""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.chat.languageContext.typescript.enabled%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.languageContext.typescript.items"": {\n\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t""enum"": [\n\t\t\t\t\t\t\t""minimal"",\n\t\t\t\t\t\t\t""double"",\n\t\t\t\t\t\t\t""fillHalf"",\n\t\t\t\t\t\t\t""fill""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""default"": ""minimal"",\n\t\t\t\t\t\t""scope"": ""resource"",\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental"",\n\t\t\t\t\t\t\t""onExP""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""markdownDescription"": 
""%github.copilot.chat.languageContext.typescript.items%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.languageContext.typescript.includeDocumentation"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": false,\n\t\t\t\t\t\t""scope"": ""resource"",\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental"",\n\t\t\t\t\t\t\t""onExP""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.chat.languageContext.typescript.includeDocumentation%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.languageContext.typescript.cacheTimeout"": {\n\t\t\t\t\t\t""type"": ""number"",\n\t\t\t\t\t\t""default"": 500,\n\t\t\t\t\t\t""scope"": ""resource"",\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental"",\n\t\t\t\t\t\t\t""onExP""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.chat.languageContext.typescript.cacheTimeout%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.languageContext.fix.typescript.enabled"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": false,\n\t\t\t\t\t\t""scope"": ""resource"",\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental"",\n\t\t\t\t\t\t\t""onExP""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.chat.languageContext.fix.typescript.enabled%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.languageContext.inline.typescript.enabled"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": false,\n\t\t\t\t\t\t""scope"": ""resource"",\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental"",\n\t\t\t\t\t\t\t""onExP""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.chat.languageContext.inline.typescript.enabled%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.newWorkspaceCreation.enabled"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": true,\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""description"": 
""%github.copilot.config.newWorkspaceCreation.enabled%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.newWorkspace.useContext7"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": false,\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.newWorkspace.useContext7%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.notebook.followCellExecution.enabled"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": false,\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""description"": ""%github.copilot.config.notebook.followCellExecution%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.notebook.enhancedNextEditSuggestions.enabled"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": false,\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental"",\n\t\t\t\t\t\t\t""onExp""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""description"": ""%github.copilot.config.notebook.enhancedNextEditSuggestions%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.summarizeAgentConversationHistory.enabled"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": true,\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""description"": ""%github.copilot.config.summarizeAgentConversationHistory.enabled%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.virtualTools.threshold"": {\n\t\t\t\t\t\t""type"": ""number"",\n\t\t\t\t\t\t""minimum"": 0,\n\t\t\t\t\t\t""maximum"": 128,\n\t\t\t\t\t\t""default"": 128,\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.virtualTools.threshold%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.azureModels"": {\n\t\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t\t""default"": {},\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""additionalProperties"": 
{\n\t\t\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t\t\t""name"": {\n\t\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t\t""description"": ""Display name of the Azure model""\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t""url"": {\n\t\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t\t""description"": ""URL endpoint for the Azure model""\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t""toolCalling"": {\n\t\t\t\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t\t\t\t""description"": ""Whether the model supports tool calling""\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t""vision"": {\n\t\t\t\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t\t\t\t""description"": ""Whether the model supports vision capabilities""\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t""maxInputTokens"": {\n\t\t\t\t\t\t\t\t\t""type"": ""number"",\n\t\t\t\t\t\t\t\t\t""description"": ""Maximum number of input tokens supported by the model""\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t""maxOutputTokens"": {\n\t\t\t\t\t\t\t\t\t""type"": ""number"",\n\t\t\t\t\t\t\t\t\t""description"": ""Maximum number of output tokens supported by the model""\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t""thinking"": {\n\t\t\t\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t\t\t\t""default"": false,\n\t\t\t\t\t\t\t\t\t""description"": ""Whether the model supports thinking capabilities""\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t""requestHeaders"": {\n\t\t\t\t\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t\t\t\t\t""description"": ""Additional HTTP headers to include with requests to this model. 
These reserved headers are not allowed and ignored if present: forbidden request headers (https://developer.mozilla.org/en-US/docs/Glossary/Forbidden_request_header), forwarding headers ('forwarded', 'x-forwarded-for', 'x-forwarded-host', 'x-forwarded-proto'), and others ('api-key', 'authorization', 'content-type', 'openai-intent', 'x-github-api-version', 'x-initiator', 'x-interaction-id', 'x-interaction-type', 'x-onbehalf-extension-id', 'x-request-id', 'x-vscode-user-agent-library-version'). Pattern-based forbidden headers ('proxy-*', 'sec-*', 'x-http-method*' with forbidden methods) are also blocked."",\n\t\t\t\t\t\t\t\t\t""additionalProperties"": {\n\t\t\t\t\t\t\t\t\t\t""type"": ""string""\n\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t""required"": [\n\t\t\t\t\t\t\t\t""name"",\n\t\t\t\t\t\t\t\t""url"",\n\t\t\t\t\t\t\t\t""toolCalling"",\n\t\t\t\t\t\t\t\t""vision"",\n\t\t\t\t\t\t\t\t""maxInputTokens"",\n\t\t\t\t\t\t\t\t""maxOutputTokens""\n\t\t\t\t\t\t\t],\n\t\t\t\t\t\t\t""additionalProperties"": false\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""markdownDescription"": ""Configure custom Azure OpenAI models. 
Each key should be a unique model ID, and the value should be an object with model configuration including name, url, toolCalling, vision, maxInputTokens, and maxOutputTokens properties.""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.customOAIModels"": {\n\t\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t\t""default"": {},\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""additionalProperties"": {\n\t\t\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t\t\t""name"": {\n\t\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t\t""description"": ""Display name of the custom OpenAI model""\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t""url"": {\n\t\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t\t""description"": ""URL endpoint for the custom OpenAI-compatible model""\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t""toolCalling"": {\n\t\t\t\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t\t\t\t""description"": ""Whether the model supports tool calling""\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t""vision"": {\n\t\t\t\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t\t\t\t""description"": ""Whether the model supports vision capabilities""\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t""maxInputTokens"": {\n\t\t\t\t\t\t\t\t\t""type"": ""number"",\n\t\t\t\t\t\t\t\t\t""description"": ""Maximum number of input tokens supported by the model""\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t""maxOutputTokens"": {\n\t\t\t\t\t\t\t\t\t""type"": ""number"",\n\t\t\t\t\t\t\t\t\t""description"": ""Maximum number of output tokens supported by the model""\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t""requiresAPIKey"": {\n\t\t\t\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t\t\t\t""description"": ""Whether the model requires an API key for authentication"",\n\t\t\t\t\t\t\t\t\t""default"": true\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t""editTools"": {\n\t\t\t\t\t\t\t\t\t""type"": ""array"",\n\t\t\t\t\t\t\t\t\t""description"": ""List of edit tools supported by the model. 
If this is not configured, the editor will try multiple edit tools and pick the best one.\n\n- 'find-replace': Find and replace text in a document.\n- 'multi-find-replace': Find and replace text in a document.\n- 'apply-patch': A file-oriented diff format used by some OpenAI models.\n- 'code-rewrite': A general but slower editing tool that allows the model to rewrite a code snippet and provide only the replacement to the editor."",\n\t\t\t\t\t\t\t\t\t""items"": {\n\t\t\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t\t\t""enum"": [\n\t\t\t\t\t\t\t\t\t\t\t""find-replace"",\n\t\t\t\t\t\t\t\t\t\t\t""multi-find-replace"",\n\t\t\t\t\t\t\t\t\t\t\t""apply-patch"",\n\t\t\t\t\t\t\t\t\t\t\t""code-rewrite""\n\t\t\t\t\t\t\t\t\t\t]\n\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t""thinking"": {\n\t\t\t\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t\t\t\t""default"": false,\n\t\t\t\t\t\t\t\t\t""description"": ""Whether the model supports thinking capabilities""\n\t\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t\t""requestHeaders"": {\n\t\t\t\t\t\t\t\t\t""type"": ""object"",\n\t\t\t\t\t\t\t\t\t""description"": ""Additional HTTP headers to include with requests to this model. These reserved headers are not allowed and ignored if present: forbidden request headers (https://developer.mozilla.org/en-US/docs/Glossary/Forbidden_request_header), forwarding headers ('forwarded', 'x-forwarded-for', 'x-forwarded-host', 'x-forwarded-proto'), and others ('api-key', 'authorization', 'content-type', 'openai-intent', 'x-github-api-version', 'x-initiator', 'x-interaction-id', 'x-interaction-type', 'x-onbehalf-extension-id', 'x-request-id', 'x-vscode-user-agent-library-version'). 
Pattern-based forbidden headers ('proxy-*', 'sec-*', 'x-http-method*' with forbidden methods) are also blocked."",\n\t\t\t\t\t\t\t\t\t""additionalProperties"": {\n\t\t\t\t\t\t\t\t\t\t""type"": ""string""\n\t\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t\t}\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t""required"": [\n\t\t\t\t\t\t\t\t""name"",\n\t\t\t\t\t\t\t\t""url"",\n\t\t\t\t\t\t\t\t""toolCalling"",\n\t\t\t\t\t\t\t\t""vision"",\n\t\t\t\t\t\t\t\t""maxInputTokens"",\n\t\t\t\t\t\t\t\t""maxOutputTokens"",\n\t\t\t\t\t\t\t\t""requiresAPIKey""\n\t\t\t\t\t\t\t],\n\t\t\t\t\t\t\t""additionalProperties"": false\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""markdownDescription"": ""Configure custom OpenAI-compatible models. Each key should be a unique model ID, and the value should be an object with model configuration including name, url, toolCalling, vision, maxInputTokens, and maxOutputTokens properties.""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.alternateGptPrompt.enabled"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": false,\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""description"": ""%github.copilot.config.alternateGptPrompt.enabled%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.gpt5AlternatePrompt"": {\n\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t""default"": ""default"",\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""description"": ""%github.copilot.config.gpt5AlternatePrompt%""\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.useResponsesApi"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": true,\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.useResponsesApi%"",\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental"",\n\t\t\t\t\t\t\t""onExp""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.responsesApiReasoningEffort"": {\n\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t""default"": ""default"",\n\t\t\t\t\t\t""markdownDescription"": 
""%github.copilot.config.responsesApiReasoningEffort%"",\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental"",\n\t\t\t\t\t\t\t""onExp""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""enum"": [\n\t\t\t\t\t\t\t""low"",\n\t\t\t\t\t\t\t""medium"",\n\t\t\t\t\t\t\t""high"",\n\t\t\t\t\t\t\t""default""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.responsesApiReasoningSummary"": {\n\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t""default"": ""detailed"",\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.responsesApiReasoningSummary%"",\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental"",\n\t\t\t\t\t\t\t""onExp""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""enum"": [\n\t\t\t\t\t\t\t""off"",\n\t\t\t\t\t\t\t""detailed""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.anthropic.thinking.enabled"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": false,\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.anthropic.thinking.enabled%"",\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental"",\n\t\t\t\t\t\t\t""onExp""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.anthropic.thinking.maxTokens"": {\n\t\t\t\t\t\t""type"": [\n\t\t\t\t\t\t\t""number"",\n\t\t\t\t\t\t\t""null""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""default"": null,\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.anthropic.thinking.maxTokens%"",\n\t\t\t\t\t\t""minimum"": 1024,\n\t\t\t\t\t\t""maximum"": 32000,\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental"",\n\t\t\t\t\t\t\t""onExp""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.anthropic.tools.websearch.enabled"": {\n\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t""default"": false,\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.anthropic.tools.websearch.enabled%"",\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental"",\n\t\t\t\t\t\t\t""onExp""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.tools.memory.enabled"": {\n\t\t\t\t\t\t""type"": 
""boolean"",\n\t\t\t\t\t\t""default"": false,\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.tools.memory.enabled%"",\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental"",\n\t\t\t\t\t\t\t""onExp""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.completionsFetcher"": {\n\t\t\t\t\t\t""type"": [\n\t\t\t\t\t\t\t""string"",\n\t\t\t\t\t\t\t""null""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.completionsFetcher%"",\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental"",\n\t\t\t\t\t\t\t""onExp""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""enum"": [\n\t\t\t\t\t\t\t""electron-fetch"",\n\t\t\t\t\t\t\t""node-fetch""\n\t\t\t\t\t\t]\n\t\t\t\t\t},\n\t\t\t\t\t""github.copilot.chat.nesFetcher"": {\n\t\t\t\t\t\t""type"": [\n\t\t\t\t\t\t\t""string"",\n\t\t\t\t\t\t\t""null""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""markdownDescription"": ""%github.copilot.config.nesFetcher%"",\n\t\t\t\t\t\t""tags"": [\n\t\t\t\t\t\t\t""experimental"",\n\t\t\t\t\t\t\t""onExp""\n\t\t\t\t\t\t],\n\t\t\t\t\t\t""enum"": [\n\t\t\t\t\t\t\t""electron-fetch"",\n\t\t\t\t\t\t\t""node-fetch""\n\t\t\t\t\t\t]\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t],\n\t\t""submenus"": [\n\t\t\t{\n\t\t\t\t""id"": ""copilot/reviewComment/additionalActions/applyAndNext"",\n\t\t\t\t""label"": ""%github.copilot.submenu.reviewComment.applyAndNext.label%""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""copilot/reviewComment/additionalActions/discardAndNext"",\n\t\t\t\t""label"": ""%github.copilot.submenu.reviewComment.discardAndNext.label%""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""copilot/reviewComment/additionalActions/discard"",\n\t\t\t\t""label"": ""%github.copilot.submenu.reviewComment.discard.label%""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""github.copilot.chat.debug.filter"",\n\t\t\t\t""label"": ""Filter"",\n\t\t\t\t""icon"": ""$(filter)""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""github.copilot.chat.debug.exportAllPromptLogsAsJson"",\n\t\t\t\t""label"": ""Export All Logs as JSON"",\n\t\t\t\t""icon"": 
""$(file-export)""\n\t\t\t}\n\t\t],\n\t\t""menus"": {\n\t\t\t""editor/title"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.debug.generateInlineEditTests"",\n\t\t\t\t\t""when"": ""resourceScheme == 'ccreq'""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.notebook.enableFollowCellExecution"",\n\t\t\t\t\t""when"": ""config.github.copilot.chat.notebook.followCellExecution.enabled && !github.copilot.notebookFollowInSessionEnabled && github.copilot.notebookAgentModeUsage && !config.notebook.globalToolbar"",\n\t\t\t\t\t""group"": ""navigation@10""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.notebook.disableFollowCellExecution"",\n\t\t\t\t\t""when"": ""config.github.copilot.chat.notebook.followCellExecution.enabled && github.copilot.notebookFollowInSessionEnabled && github.copilot.notebookAgentModeUsage && !config.notebook.globalToolbar"",\n\t\t\t\t\t""group"": ""navigation@10""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.replay"",\n\t\t\t\t\t""group"": ""navigation@9"",\n\t\t\t\t\t""when"": ""resourceLangId == chatReplay""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""editor/context"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.explain"",\n\t\t\t\t\t""when"": ""!github.copilot.interactiveSession.disabled"",\n\t\t\t\t\t""group"": ""1_chat@4""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""editor/context/chat"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.fix"",\n\t\t\t\t\t""when"": ""!github.copilot.interactiveSession.disabled && !editorReadonly"",\n\t\t\t\t\t""group"": ""copilotAction@1""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review"",\n\t\t\t\t\t""when"": ""config.github.copilot.chat.reviewSelection.enabled && !github.copilot.interactiveSession.disabled && resourceScheme != 'vscode-chat-code-block'"",\n\t\t\t\t\t""group"": ""copilotAction@2""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.generateDocs"",\n\t\t\t\t\t""when"": 
""!github.copilot.interactiveSession.disabled && !editorReadonly"",\n\t\t\t\t\t""group"": ""copilotGenerate@1""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.generateTests"",\n\t\t\t\t\t""when"": ""!github.copilot.interactiveSession.disabled && !editorReadonly"",\n\t\t\t\t\t""group"": ""copilotGenerate@2""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""testing/item/result"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.tests.fixTestFailure.fromInline"",\n\t\t\t\t\t""when"": ""testResultState == failed && !testResultOutdated"",\n\t\t\t\t\t""group"": ""inline@2""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""testing/item/context"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.tests.fixTestFailure.fromInline"",\n\t\t\t\t\t""when"": ""testResultState == failed && !testResultOutdated"",\n\t\t\t\t\t""group"": ""inline@2""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""commandPalette"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.interactiveSession.feedback"",\n\t\t\t\t\t""when"": ""github.copilot-chat.activated && !github.copilot.interactiveSession.disabled""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.debug.workbenchState"",\n\t\t\t\t\t""when"": ""true""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.rerunWithCopilotDebug"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.startCopilotDebugCommand"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.git.generateCommitMessage"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.git.resolveMergeConflicts"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.explain"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review"",\n\t\t\t\t\t""when"": ""!github.copilot.interactiveSession.disabled""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": 
""github.copilot.chat.review.apply"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.applyAndNext"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.discard"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.discardAndNext"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.discardAll"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.stagedChanges"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.unstagedChanges"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.changes"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.stagedFileChange"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.unstagedFileChange"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.previous"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.next"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.continueInInlineChat"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.continueInChat"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.markHelpful"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.markUnhelpful"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": 
""github.copilot.devcontainer.generateDevContainerConfig"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.tests.fixTestFailure"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.tests.fixTestFailure.fromInline"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.search.markHelpful"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.search.markUnhelpful"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.search.feedback"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.debug.showElements"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.debug.hideElements"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.debug.showTools"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.debug.hideTools"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.debug.showNesRequests"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.debug.hideNesRequests"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.debug.exportLogItem"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.debug.exportPromptArchive"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.debug.exportPromptLogsAsJson"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.debug.exportAllPromptLogsAsJson"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": 
""github.copilot.chat.mcp.setup.check"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.mcp.setup.validatePackage"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.mcp.setup.flow"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.debug.showRawRequestBody"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.debug.showOutputChannel"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.cli.sessions.delete"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.cli.sessions.refresh"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.cli.sessions.resumeInTerminal"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.cli.sessions.newTerminalSession"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.cloud.sessions.refresh"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.cloud.sessions.openInBrowser"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.cloud.sessions.proxy.closeChatSessionPullRequest"",\n\t\t\t\t\t""when"": ""false""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""view/title"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.claude.sessions.refresh"",\n\t\t\t\t\t""when"": ""view == workbench.view.chat.sessions.claude-code"",\n\t\t\t\t\t""group"": ""navigation@1""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.cli.sessions.refresh"",\n\t\t\t\t\t""when"": ""view == workbench.view.chat.sessions.copilotcli"",\n\t\t\t\t\t""group"": ""navigation@1""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""submenu"": 
""github.copilot.chat.debug.filter"",\n\t\t\t\t\t""when"": ""view == copilot-chat"",\n\t\t\t\t\t""group"": ""navigation""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.debug.exportAllPromptLogsAsJson"",\n\t\t\t\t\t""when"": ""view == copilot-chat"",\n\t\t\t\t\t""group"": ""export@1""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.debug.showOutputChannel"",\n\t\t\t\t\t""when"": ""view == copilot-chat"",\n\t\t\t\t\t""group"": ""3_show@1""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.debug.showChatLogView"",\n\t\t\t\t\t""when"": ""view == workbench.panel.chat.view.copilot"",\n\t\t\t\t\t""group"": ""3_show""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.cloud.sessions.refresh"",\n\t\t\t\t\t""when"": ""view == workbench.view.chat.sessions.copilot-cloud-agent"",\n\t\t\t\t\t""group"": ""navigation@1""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""view/item/context"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.debug.showRawRequestBody"",\n\t\t\t\t\t""when"": ""view == copilot-chat && viewItem == request"",\n\t\t\t\t\t""group"": ""export@0""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.debug.exportLogItem"",\n\t\t\t\t\t""when"": ""view == copilot-chat && (viewItem == toolcall || viewItem == request)"",\n\t\t\t\t\t""group"": ""export@1""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.debug.exportPromptArchive"",\n\t\t\t\t\t""when"": ""view == copilot-chat && viewItem == chatprompt"",\n\t\t\t\t\t""group"": ""export@2""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.debug.exportPromptLogsAsJson"",\n\t\t\t\t\t""when"": ""view == copilot-chat && viewItem == chatprompt"",\n\t\t\t\t\t""group"": ""export@3""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""searchPanel/aiResults/commands"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.search.markHelpful"",\n\t\t\t\t\t""group"": ""inline@0"",\n\t\t\t\t\t""when"": ""aiResultsTitle && 
aiResultsRequested""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.search.markUnhelpful"",\n\t\t\t\t\t""group"": ""inline@1"",\n\t\t\t\t\t""when"": ""aiResultsTitle && aiResultsRequested""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.search.feedback"",\n\t\t\t\t\t""group"": ""inline@2"",\n\t\t\t\t\t""when"": ""aiResultsTitle && aiResultsRequested && github.copilot.debugReportFeedback""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""comments/comment/title"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.markHelpful"",\n\t\t\t\t\t""group"": ""inline@0"",\n\t\t\t\t\t""when"": ""commentController == github-copilot-review""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.markUnhelpful"",\n\t\t\t\t\t""group"": ""inline@1"",\n\t\t\t\t\t""when"": ""commentController == github-copilot-review""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""commentsView/commentThread/context"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.apply"",\n\t\t\t\t\t""group"": ""context@1"",\n\t\t\t\t\t""when"": ""commentController == github-copilot-review""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.discard"",\n\t\t\t\t\t""group"": ""context@2"",\n\t\t\t\t\t""when"": ""commentController == github-copilot-review""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.discardAll"",\n\t\t\t\t\t""group"": ""context@3"",\n\t\t\t\t\t""when"": ""commentController == github-copilot-review""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""comments/commentThread/additionalActions"": [\n\t\t\t\t{\n\t\t\t\t\t""submenu"": ""copilot/reviewComment/additionalActions/applyAndNext"",\n\t\t\t\t\t""group"": ""inline@1"",\n\t\t\t\t\t""when"": ""commentController == github-copilot-review && github.copilot.chat.review.numberOfComments > 1""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.apply"",\n\t\t\t\t\t""group"": ""inline@1"",\n\t\t\t\t\t""when"": ""commentController == 
github-copilot-review && github.copilot.chat.review.numberOfComments == 1""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""submenu"": ""copilot/reviewComment/additionalActions/discardAndNext"",\n\t\t\t\t\t""group"": ""inline@2"",\n\t\t\t\t\t""when"": ""commentController == github-copilot-review && github.copilot.chat.review.numberOfComments > 1""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""submenu"": ""copilot/reviewComment/additionalActions/discard"",\n\t\t\t\t\t""group"": ""inline@2"",\n\t\t\t\t\t""when"": ""commentController == github-copilot-review && github.copilot.chat.review.numberOfComments == 1""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""copilot/reviewComment/additionalActions/applyAndNext"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.applyAndNext"",\n\t\t\t\t\t""group"": ""inline@1"",\n\t\t\t\t\t""when"": ""commentController == github-copilot-review""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.apply"",\n\t\t\t\t\t""group"": ""inline@2"",\n\t\t\t\t\t""when"": ""commentController == github-copilot-review""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""copilot/reviewComment/additionalActions/discardAndNext"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.discardAndNext"",\n\t\t\t\t\t""group"": ""inline@1"",\n\t\t\t\t\t""when"": ""commentController == github-copilot-review""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.discard"",\n\t\t\t\t\t""group"": ""inline@2"",\n\t\t\t\t\t""when"": ""commentController == github-copilot-review""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.continueInInlineChat"",\n\t\t\t\t\t""group"": ""inline@3"",\n\t\t\t\t\t""when"": ""commentController == github-copilot-review""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""copilot/reviewComment/additionalActions/discard"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.discard"",\n\t\t\t\t\t""group"": ""inline@2"",\n\t\t\t\t\t""when"": ""commentController == 
github-copilot-review""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.continueInInlineChat"",\n\t\t\t\t\t""group"": ""inline@3"",\n\t\t\t\t\t""when"": ""commentController == github-copilot-review""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""comments/commentThread/title"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.previous"",\n\t\t\t\t\t""group"": ""inline@1"",\n\t\t\t\t\t""when"": ""commentController == github-copilot-review""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.next"",\n\t\t\t\t\t""group"": ""inline@2"",\n\t\t\t\t\t""when"": ""commentController == github-copilot-review""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.continueInChat"",\n\t\t\t\t\t""group"": ""inline@3"",\n\t\t\t\t\t""when"": ""commentController == github-copilot-review""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.discardAll"",\n\t\t\t\t\t""group"": ""inline@4"",\n\t\t\t\t\t""when"": ""commentController == github-copilot-review""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""scm/title"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.changes"",\n\t\t\t\t\t""group"": ""navigation"",\n\t\t\t\t\t""when"": ""config.github.copilot.chat.reviewAgent.enabled && github.copilot.chat.reviewDiff.enabled && scmProvider == git && scmProviderRootUri in github.copilot.chat.reviewDiff.enabledRootUris""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""scm/resourceGroup/context"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.stagedChanges"",\n\t\t\t\t\t""when"": ""config.github.copilot.chat.reviewAgent.enabled && github.copilot.chat.reviewDiff.enabled && scmProvider == git && scmResourceGroup == index"",\n\t\t\t\t\t""group"": ""inline@-3""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.unstagedChanges"",\n\t\t\t\t\t""when"": ""config.github.copilot.chat.reviewAgent.enabled && github.copilot.chat.reviewDiff.enabled && scmProvider == git && 
scmResourceGroup == workingTree"",\n\t\t\t\t\t""group"": ""inline@-3""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""scm/resourceState/context"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.git.resolveMergeConflicts"",\n\t\t\t\t\t""when"": ""scmProvider == git && scmResourceGroup == merge && git.activeResourceHasMergeConflicts"",\n\t\t\t\t\t""group"": ""z_chat@1""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.stagedFileChange"",\n\t\t\t\t\t""group"": ""3_copilot"",\n\t\t\t\t\t""when"": ""config.github.copilot.chat.reviewAgent.enabled && github.copilot.chat.reviewDiff.enabled && scmProvider == git && scmResourceGroup == index""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.review.unstagedFileChange"",\n\t\t\t\t\t""group"": ""3_copilot"",\n\t\t\t\t\t""when"": ""config.github.copilot.chat.reviewAgent.enabled && github.copilot.chat.reviewDiff.enabled && scmProvider == git && scmResourceGroup == workingTree""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""scm/inputBox"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.git.generateCommitMessage"",\n\t\t\t\t\t""when"": ""scmProvider == git""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""testing/message/context"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.tests.fixTestFailure"",\n\t\t\t\t\t""when"": ""testing.testItemHasUri"",\n\t\t\t\t\t""group"": ""inline@1""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""issue/reporter"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.report""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""github.copilot.chat.debug.filter"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.debug.showElements"",\n\t\t\t\t\t""when"": ""github.copilot.chat.debug.elementsHidden"",\n\t\t\t\t\t""group"": ""commands@0""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.debug.hideElements"",\n\t\t\t\t\t""when"": ""!github.copilot.chat.debug.elementsHidden"",\n\t\t\t\t\t""group"": ""commands@0""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": 
""github.copilot.chat.debug.showTools"",\n\t\t\t\t\t""when"": ""github.copilot.chat.debug.toolsHidden"",\n\t\t\t\t\t""group"": ""commands@1""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.debug.hideTools"",\n\t\t\t\t\t""when"": ""!github.copilot.chat.debug.toolsHidden"",\n\t\t\t\t\t""group"": ""commands@1""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.debug.showNesRequests"",\n\t\t\t\t\t""when"": ""github.copilot.chat.debug.nesRequestsHidden"",\n\t\t\t\t\t""group"": ""commands@2""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.debug.hideNesRequests"",\n\t\t\t\t\t""when"": ""!github.copilot.chat.debug.nesRequestsHidden"",\n\t\t\t\t\t""group"": ""commands@2""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""notebook/toolbar"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.notebook.enableFollowCellExecution"",\n\t\t\t\t\t""when"": ""config.github.copilot.chat.notebook.followCellExecution.enabled && !github.copilot.notebookFollowInSessionEnabled && github.copilot.notebookAgentModeUsage && config.notebook.globalToolbar"",\n\t\t\t\t\t""group"": ""navigation/execute@15""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.chat.notebook.disableFollowCellExecution"",\n\t\t\t\t\t""when"": ""config.github.copilot.chat.notebook.followCellExecution.enabled && github.copilot.notebookFollowInSessionEnabled && github.copilot.notebookAgentModeUsage && config.notebook.globalToolbar"",\n\t\t\t\t\t""group"": ""navigation/execute@15""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""editor/content"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.git.resolveMergeConflicts"",\n\t\t\t\t\t""group"": ""z_chat@1"",\n\t\t\t\t\t""when"": ""config.git.enabled && !git.missing && !isInDiffEditor && !isMergeEditor && resource in git.mergeChanges && git.activeResourceHasMergeConflicts""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""chat/chatSessions"": [\n\t\t\t\t{\n\t\t\t\t\t""command"": 
""github.copilot.cli.sessions.resumeInTerminal"",\n\t\t\t\t\t""when"": ""chatSessionType == copilotcli"",\n\t\t\t\t\t""group"": ""inline@1""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.cli.sessions.delete"",\n\t\t\t\t\t""when"": ""chatSessionType == copilotcli"",\n\t\t\t\t\t""group"": ""inline@2""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.cli.sessions.newTerminalSession"",\n\t\t\t\t\t""when"": ""view == workbench.view.chat.sessions.copilotcli"",\n\t\t\t\t\t""group"": ""submenu""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.cli.sessions.refresh"",\n\t\t\t\t\t""when"": ""view == workbench.view.chat.sessions.copilotcli"",\n\t\t\t\t\t""group"": ""navigation@1""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.cloud.sessions.openInBrowser"",\n\t\t\t\t\t""when"": ""chatSessionType == copilot-cloud-agent"",\n\t\t\t\t\t""group"": ""context""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""command"": ""github.copilot.cloud.sessions.proxy.closeChatSessionPullRequest"",\n\t\t\t\t\t""when"": ""chatSessionType == copilot-cloud-agent"",\n\t\t\t\t\t""group"": ""context""\n\t\t\t\t}\n\t\t\t]\n\t\t},\n\t\t""icons"": {\n\t\t\t""copilot-logo"": {\n\t\t\t\t""description"": ""%github.copilot.icon%"",\n\t\t\t\t""default"": {\n\t\t\t\t\t""fontPath"": ""assets/copilot.woff"",\n\t\t\t\t\t""fontCharacter"": ""\\0041""\n\t\t\t\t}\n\t\t\t},\n\t\t\t""copilot-warning"": {\n\t\t\t\t""description"": ""%github.copilot.icon%"",\n\t\t\t\t""default"": {\n\t\t\t\t\t""fontPath"": ""assets/copilot.woff"",\n\t\t\t\t\t""fontCharacter"": ""\\0042""\n\t\t\t\t}\n\t\t\t},\n\t\t\t""copilot-notconnected"": {\n\t\t\t\t""description"": ""%github.copilot.icon%"",\n\t\t\t\t""default"": {\n\t\t\t\t\t""fontPath"": ""assets/copilot.woff"",\n\t\t\t\t\t""fontCharacter"": ""\\0043""\n\t\t\t\t}\n\t\t\t}\n\t\t},\n\t\t""iconFonts"": [\n\t\t\t{\n\t\t\t\t""id"": ""copilot-font"",\n\t\t\t\t""src"": [\n\t\t\t\t\t{\n\t\t\t\t\t\t""path"": 
""assets/copilot.woff"",\n\t\t\t\t\t\t""format"": ""woff""\n\t\t\t\t\t}\n\t\t\t\t]\n\t\t\t}\n\t\t],\n\t\t""terminalQuickFixes"": [\n\t\t\t{\n\t\t\t\t""id"": ""copilot-chat.fixWithCopilot"",\n\t\t\t\t""commandLineMatcher"": "".+"",\n\t\t\t\t""commandExitResult"": ""error"",\n\t\t\t\t""outputMatcher"": {\n\t\t\t\t\t""anchor"": ""bottom"",\n\t\t\t\t\t""length"": 1,\n\t\t\t\t\t""lineMatcher"": "".+"",\n\t\t\t\t\t""offset"": 0\n\t\t\t\t},\n\t\t\t\t""kind"": ""explain""\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""copilot-chat.generateCommitMessage"",\n\t\t\t\t""commandLineMatcher"": ""git add .+"",\n\t\t\t\t""commandExitResult"": ""success"",\n\t\t\t\t""kind"": ""explain"",\n\t\t\t\t""outputMatcher"": {\n\t\t\t\t\t""anchor"": ""bottom"",\n\t\t\t\t\t""length"": 1,\n\t\t\t\t\t""lineMatcher"": "".+"",\n\t\t\t\t\t""offset"": 0\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""copilot-chat.terminalToDebugging"",\n\t\t\t\t""commandLineMatcher"": "".+"",\n\t\t\t\t""kind"": ""explain"",\n\t\t\t\t""commandExitResult"": ""error"",\n\t\t\t\t""outputMatcher"": {\n\t\t\t\t\t""anchor"": ""bottom"",\n\t\t\t\t\t""length"": 1,\n\t\t\t\t\t""lineMatcher"": """",\n\t\t\t\t\t""offset"": 0\n\t\t\t\t}\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""copilot-chat.terminalToDebuggingSuccess"",\n\t\t\t\t""commandLineMatcher"": "".+"",\n\t\t\t\t""kind"": ""explain"",\n\t\t\t\t""commandExitResult"": ""success"",\n\t\t\t\t""outputMatcher"": {\n\t\t\t\t\t""anchor"": ""bottom"",\n\t\t\t\t\t""length"": 1,\n\t\t\t\t\t""lineMatcher"": """",\n\t\t\t\t\t""offset"": 0\n\t\t\t\t}\n\t\t\t}\n\t\t],\n\t\t""languages"": [\n\t\t\t{\n\t\t\t\t""id"": ""ignore"",\n\t\t\t\t""filenamePatterns"": [\n\t\t\t\t\t"".copilotignore""\n\t\t\t\t],\n\t\t\t\t""aliases"": []\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""markdown"",\n\t\t\t\t""extensions"": [\n\t\t\t\t\t"".copilotmd""\n\t\t\t\t]\n\t\t\t},\n\t\t\t{\n\t\t\t\t""id"": ""chatReplay"",\n\t\t\t\t""aliases"": [\n\t\t\t\t\t""chatReplay"",\n\t\t\t\t\t""Chat 
Replay""\n\t\t\t\t],\n\t\t\t\t""extensions"": [\n\t\t\t\t\t"".chatReplay.json"",\n\t\t\t\t\t"".chatreplay.json""\n\t\t\t\t]\n\t\t\t}\n\t\t],\n\t\t""views"": {\n\t\t\t""copilot-chat"": [\n\t\t\t\t{\n\t\t\t\t\t""id"": ""copilot-chat"",\n\t\t\t\t\t""name"": ""Chat Debug"",\n\t\t\t\t\t""icon"": ""assets/debug-icon.svg"",\n\t\t\t\t\t""when"": ""github.copilot.chat.showLogView""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""context-inspector"": [\n\t\t\t\t{\n\t\t\t\t\t""id"": ""context-inspector"",\n\t\t\t\t\t""name"": ""Language Context Inspector"",\n\t\t\t\t\t""icon"": ""$(inspect)"",\n\t\t\t\t\t""when"": ""github.copilot.chat.showContextInspectorView""\n\t\t\t\t}\n\t\t\t],\n\t\t\t""agentSessions"": [\n\t\t\t\t{\n\t\t\t\t\t""id"": ""codex-placeholder"",\n\t\t\t\t\t""name"": ""OpenAI Codex"",\n\t\t\t\t\t""when"": ""github.copilot.chat.codex.showPlaceholder && config.chat.experimental.codex.enabled"",\n\t\t\t\t\t""icon"": ""$(file)""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""id"": ""copilot-agents-placeholder"",\n\t\t\t\t\t""name"": ""GitHub Copilot Agents"",\n\t\t\t\t\t""when"": ""chatEntitlementSignedOut || !chatIsEnabled"",\n\t\t\t\t\t""icon"": ""$(copilot)""\n\t\t\t\t}\n\t\t\t]\n\t\t},\n\t\t""viewsContainers"": {\n\t\t\t""activitybar"": [\n\t\t\t\t{\n\t\t\t\t\t""id"": ""copilot-chat"",\n\t\t\t\t\t""title"": ""Chat Debug"",\n\t\t\t\t\t""icon"": ""assets/debug-icon.svg""\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\t""id"": ""context-inspector"",\n\t\t\t\t\t""title"": ""Language Context Inspector"",\n\t\t\t\t\t""icon"": ""$(inspect)""\n\t\t\t\t}\n\t\t\t]\n\t\t},\n\t\t""configurationDefaults"": {\n\t\t\t""workbench.editorAssociations"": {\n\t\t\t\t""*.copilotmd"": ""vscode.markdown.preview.editor""\n\t\t\t}\n\t\t},\n\t\t""keybindings"": [\n\t\t\t{\n\t\t\t\t""command"": ""github.copilot.chat.rerunWithCopilotDebug"",\n\t\t\t\t""key"": ""ctrl+alt+."",\n\t\t\t\t""mac"": ""cmd+alt+."",\n\t\t\t\t""when"": ""github.copilot-chat.activated && terminalShellIntegrationEnabled && terminalFocus && 
!terminalAltBufferActive""\n\t\t\t}\n\t\t],\n\t\t""walkthroughs"": [\n\t\t\t{\n\t\t\t\t""id"": ""copilotWelcome"",\n\t\t\t\t""title"": ""%github.copilot.walkthrough.title%"",\n\t\t\t\t""description"": ""%github.copilot.walkthrough.description%"",\n\t\t\t\t""when"": ""!isWeb"",\n\t\t\t\t""steps"": [\n\t\t\t\t\t{\n\t\t\t\t\t\t""id"": ""copilot.setup.signIn"",\n\t\t\t\t\t\t""title"": ""%github.copilot.walkthrough.setup.signIn.title%"",\n\t\t\t\t\t\t""description"": ""%github.copilot.walkthrough.setup.signIn.description%"",\n\t\t\t\t\t\t""when"": ""chatEntitlementSignedOut && !view.workbench.panel.chat.view.copilot.visible && !github.copilot-chat.activated && !github.copilot.offline && !github.copilot.interactiveSession.individual.disabled && !github.copilot.interactiveSession.individual.expired && !github.copilot.interactiveSession.enterprise.disabled && !github.copilot.interactiveSession.contactSupport"",\n\t\t\t\t\t\t""media"": {\n\t\t\t\t\t\t\t""video"": {\n\t\t\t\t\t\t\t\t""dark"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/workspace.mp4"",\n\t\t\t\t\t\t\t\t""light"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/workspace-light.mp4"",\n\t\t\t\t\t\t\t\t""hc"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/workspace-hc.mp4"",\n\t\t\t\t\t\t\t\t""hcLight"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/workspace-hclight.mp4""\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t""altText"": ""%github.copilot.walkthrough.panelChat.media.altText%""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""id"": ""copilot.setup.signInNoAction"",\n\t\t\t\t\t\t""title"": ""%github.copilot.walkthrough.setup.signIn.title%"",\n\t\t\t\t\t\t""description"": ""%github.copilot.walkthrough.setup.noAction.description%"",\n\t\t\t\t\t\t""when"": ""chatEntitlementSignedOut && view.workbench.panel.chat.view.copilot.visible && !github.copilot-chat.activated && !github.copilot.offline && !github.copilot.interactiveSession.individual.disabled && 
!github.copilot.interactiveSession.individual.expired && !github.copilot.interactiveSession.enterprise.disabled && !github.copilot.interactiveSession.contactSupport"",\n\t\t\t\t\t\t""media"": {\n\t\t\t\t\t\t\t""video"": {\n\t\t\t\t\t\t\t\t""dark"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/workspace.mp4"",\n\t\t\t\t\t\t\t\t""light"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/workspace-light.mp4"",\n\t\t\t\t\t\t\t\t""hc"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/workspace-hc.mp4"",\n\t\t\t\t\t\t\t\t""hcLight"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/workspace-hclight.mp4""\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t""altText"": ""%github.copilot.walkthrough.panelChat.media.altText%""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""id"": ""copilot.setup.signUp"",\n\t\t\t\t\t\t""title"": ""%github.copilot.walkthrough.setup.signUp.title%"",\n\t\t\t\t\t\t""description"": ""%github.copilot.walkthrough.setup.signUp.description%"",\n\t\t\t\t\t\t""when"": ""chatPlanCanSignUp && !view.workbench.panel.chat.view.copilot.visible && !github.copilot-chat.activated && !github.copilot.offline && (github.copilot.interactiveSession.individual.disabled || github.copilot.interactiveSession.individual.expired) && !github.copilot.interactiveSession.enterprise.disabled && !github.copilot.interactiveSession.contactSupport"",\n\t\t\t\t\t\t""media"": {\n\t\t\t\t\t\t\t""video"": {\n\t\t\t\t\t\t\t\t""dark"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/workspace.mp4"",\n\t\t\t\t\t\t\t\t""light"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/workspace-light.mp4"",\n\t\t\t\t\t\t\t\t""hc"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/workspace-hc.mp4"",\n\t\t\t\t\t\t\t\t""hcLight"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/workspace-hclight.mp4""\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t""altText"": 
""%github.copilot.walkthrough.panelChat.media.altText%""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""id"": ""copilot.setup.signUpNoAction"",\n\t\t\t\t\t\t""title"": ""%github.copilot.walkthrough.setup.signUp.title%"",\n\t\t\t\t\t\t""description"": ""%github.copilot.walkthrough.setup.noAction.description%"",\n\t\t\t\t\t\t""when"": ""chatPlanCanSignUp && view.workbench.panel.chat.view.copilot.visible && !github.copilot-chat.activated && !github.copilot.offline && (github.copilot.interactiveSession.individual.disabled || github.copilot.interactiveSession.individual.expired) && !github.copilot.interactiveSession.enterprise.disabled && !github.copilot.interactiveSession.contactSupport"",\n\t\t\t\t\t\t""media"": {\n\t\t\t\t\t\t\t""video"": {\n\t\t\t\t\t\t\t\t""dark"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/workspace.mp4"",\n\t\t\t\t\t\t\t\t""light"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/workspace-light.mp4"",\n\t\t\t\t\t\t\t\t""hc"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/workspace-hc.mp4"",\n\t\t\t\t\t\t\t\t""hcLight"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/workspace-hclight.mp4""\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t""altText"": ""%github.copilot.walkthrough.panelChat.media.altText%""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""id"": ""copilot.panelChat"",\n\t\t\t\t\t\t""title"": ""%github.copilot.walkthrough.panelChat.title%"",\n\t\t\t\t\t\t""description"": ""%github.copilot.walkthrough.panelChat.description%"",\n\t\t\t\t\t\t""when"": ""!chatEntitlementSignedOut || chatIsEnabled "",\n\t\t\t\t\t\t""media"": {\n\t\t\t\t\t\t\t""video"": {\n\t\t\t\t\t\t\t\t""dark"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/workspace.mp4"",\n\t\t\t\t\t\t\t\t""light"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/workspace-light.mp4"",\n\t\t\t\t\t\t\t\t""hc"": 
""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/workspace-hc.mp4"",\n\t\t\t\t\t\t\t\t""hcLight"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/workspace-hclight.mp4""\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t""altText"": ""%github.copilot.walkthrough.panelChat.media.altText%""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""id"": ""copilot.edits"",\n\t\t\t\t\t\t""title"": ""%github.copilot.walkthrough.edits.title%"",\n\t\t\t\t\t\t""description"": ""%github.copilot.walkthrough.edits.description%"",\n\t\t\t\t\t\t""when"": ""!chatEntitlementSignedOut || chatIsEnabled "",\n\t\t\t\t\t\t""media"": {\n\t\t\t\t\t\t\t""video"": {\n\t\t\t\t\t\t\t\t""dark"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/edits.mp4"",\n\t\t\t\t\t\t\t\t""light"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/edits-light.mp4"",\n\t\t\t\t\t\t\t\t""hc"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/edits-hc.mp4"",\n\t\t\t\t\t\t\t\t""hcLight"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/edits-hclight.mp4""\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t""altText"": ""%github.copilot.walkthrough.edits.media.altText%""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""id"": ""copilot.firstSuggest"",\n\t\t\t\t\t\t""title"": ""%github.copilot.walkthrough.firstSuggest.title%"",\n\t\t\t\t\t\t""description"": ""%github.copilot.walkthrough.firstSuggest.description%"",\n\t\t\t\t\t\t""when"": ""!chatEntitlementSignedOut || chatIsEnabled "",\n\t\t\t\t\t\t""media"": {\n\t\t\t\t\t\t\t""video"": {\n\t\t\t\t\t\t\t\t""dark"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/ghost-text.mp4"",\n\t\t\t\t\t\t\t\t""light"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/ghost-text-light.mp4"",\n\t\t\t\t\t\t\t\t""hc"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/ghost-text-hc.mp4"",\n\t\t\t\t\t\t\t\t""hcLight"": 
""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/ghost-text-hclight.mp4""\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t""altText"": ""%github.copilot.walkthrough.firstSuggest.media.altText%""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""id"": ""copilot.inlineChatNotMac"",\n\t\t\t\t\t\t""title"": ""%github.copilot.walkthrough.inlineChatNotMac.title%"",\n\t\t\t\t\t\t""description"": ""%github.copilot.walkthrough.inlineChatNotMac.description%"",\n\t\t\t\t\t\t""when"": ""!isMac && (!chatEntitlementSignedOut || chatIsEnabled )"",\n\t\t\t\t\t\t""media"": {\n\t\t\t\t\t\t\t""video"": {\n\t\t\t\t\t\t\t\t""dark"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/inline.mp4"",\n\t\t\t\t\t\t\t\t""light"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/inline-light.mp4"",\n\t\t\t\t\t\t\t\t""hc"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/inline-hc.mp4"",\n\t\t\t\t\t\t\t\t""hcLight"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/inline-hclight.mp4""\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t""altText"": ""%github.copilot.walkthrough.inlineChatNotMac.media.altText%""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""id"": ""copilot.inlineChatMac"",\n\t\t\t\t\t\t""title"": ""%github.copilot.walkthrough.inlineChatMac.title%"",\n\t\t\t\t\t\t""description"": ""%github.copilot.walkthrough.inlineChatMac.description%"",\n\t\t\t\t\t\t""when"": ""isMac && (!chatEntitlementSignedOut || chatIsEnabled )"",\n\t\t\t\t\t\t""media"": {\n\t\t\t\t\t\t\t""video"": {\n\t\t\t\t\t\t\t\t""dark"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/inline.mp4"",\n\t\t\t\t\t\t\t\t""light"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/inline-light.mp4"",\n\t\t\t\t\t\t\t\t""hc"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/inline-hc.mp4"",\n\t\t\t\t\t\t\t\t""hcLight"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/inline-hclight.mp4""\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t""altText"": 
""%github.copilot.walkthrough.inlineChatMac.media.altText%""\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""id"": ""copilot.sparkle"",\n\t\t\t\t\t\t""title"": ""%github.copilot.walkthrough.sparkle.title%"",\n\t\t\t\t\t\t""description"": ""%github.copilot.walkthrough.sparkle.description%"",\n\t\t\t\t\t\t""when"": ""!chatEntitlementSignedOut || chatIsEnabled"",\n\t\t\t\t\t\t""media"": {\n\t\t\t\t\t\t\t""video"": {\n\t\t\t\t\t\t\t\t""dark"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/git-commit.mp4"",\n\t\t\t\t\t\t\t\t""light"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/git-commit-light.mp4"",\n\t\t\t\t\t\t\t\t""hc"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/git-commit-hc.mp4"",\n\t\t\t\t\t\t\t\t""hcLight"": ""https://vscodewalkthroughs.z1.web.core.windows.net/v0.26/git-commit-hclight.mp4""\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t""altText"": ""%github.copilot.walkthrough.sparkle.media.altText%""\n\t\t\t\t\t\t}\n\t\t\t\t\t}\n\t\t\t\t]\n\t\t\t}\n\t\t],\n\t\t""jsonValidation"": [\n\t\t\t{\n\t\t\t\t""fileMatch"": ""settings.json"",\n\t\t\t\t""url"": ""ccsettings://root/schema.json""\n\t\t\t}\n\t\t],\n\t\t""typescriptServerPlugins"": [\n\t\t\t{\n\t\t\t\t""name"": ""@vscode/copilot-typescript-server-plugin"",\n\t\t\t\t""enableForWorkspaceTypeScriptVersions"": true\n\t\t\t}\n\t\t],\n\t\t""chatSessions"": [\n\t\t\t{\n\t\t\t\t""type"": ""claude-code"",\n\t\t\t\t""name"": ""claude"",\n\t\t\t\t""displayName"": ""Claude Code CLI Agent"",\n\t\t\t\t""icon"": ""$(sparkle)"",\n\t\t\t\t""welcomeTitle"": ""Claude Code Agent"",\n\t\t\t\t""welcomeMessage"": ""Run local background tasks"",\n\t\t\t\t""inputPlaceholder"": ""Describe your task, type `#` for adding context"",\n\t\t\t\t""order"": 3,\n\t\t\t\t""description"": ""The Claude Code Agent works on your local machine"",\n\t\t\t\t""when"": ""config.github.copilot.chat.advanced.claudeCode.enabled"",\n\t\t\t\t""capabilities"": {\n\t\t\t\t\t""supportsFileAttachments"": 
true\n\t\t\t\t},\n\t\t\t\t""commands"": [\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""init"",\n\t\t\t\t\t\t""description"": ""Initialize a new CLAUDE.md file with codebase documentation""\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""compact"",\n\t\t\t\t\t\t""description"": ""Clear conversation history but keep a summary in context. Optional: /compact [instructions for summarization]""\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""pr-comments"",\n\t\t\t\t\t\t""description"": ""Get comments from a GitHub pull request""\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""review"",\n\t\t\t\t\t\t""description"": ""Review a pull request""\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""security-review"",\n\t\t\t\t\t\t""description"": ""Complete a security review of the pending changes on the current branch""\n\t\t\t\t\t}\n\t\t\t\t]\n\t\t\t},\n\t\t\t{\n\t\t\t\t""type"": ""copilotcli"",\n\t\t\t\t""name"": ""cli"",\n\t\t\t\t""displayName"": ""GitHub Copilot CLI Agent"",\n\t\t\t\t""icon"": ""$(copilot)"",\n\t\t\t\t""welcomeTitle"": ""GitHub Copilot CLI Agent"",\n\t\t\t\t""welcomeMessage"": ""Run local background tasks"",\n\t\t\t\t""inputPlaceholder"": ""Describe your task, type `#` for adding context"",\n\t\t\t\t""order"": 2,\n\t\t\t\t""description"": ""The GitHub Copilot CLI Agent works on your local machine"",\n\t\t\t\t""when"": ""!chatEntitlementSignedOut && chatIsEnabled"",\n\t\t\t\t""capabilities"": {\n\t\t\t\t\t""supportsFileAttachments"": true,\n\t\t\t\t\t""supportsProblemAttachments"": true,\n\t\t\t\t\t""supportsToolAttachments"": false\n\t\t\t\t},\n\t\t\t\t""commands"": [\n\t\t\t\t\t{\n\t\t\t\t\t\t""name"": ""delegate"",\n\t\t\t\t\t\t""description"": ""Delegate chat session to cloud agent and create associated PR""\n\t\t\t\t\t}\n\t\t\t\t]\n\t\t\t},\n\t\t\t{\n\t\t\t\t""type"": ""copilot-cloud-agent"",\n\t\t\t\t""alternativeIds"": [\n\t\t\t\t\t""copilot-swe-agent""\n\t\t\t\t],\n\t\t\t\t""name"": ""cloud"",\n\t\t\t\t""displayName"": ""GitHub Copilot Cloud 
Agent"",\n\t\t\t\t""icon"": {\n\t\t\t\t\t""light"": ""assets/copilot-cloud.svg"",\n\t\t\t\t\t""dark"": ""assets/copilot-cloud-dark.svg""\n\t\t\t\t},\n\t\t\t\t""welcomeTitle"": ""GitHub Copilot Cloud Agent"",\n\t\t\t\t""welcomeMessage"": ""Delegate tasks to the cloud"",\n\t\t\t\t""inputPlaceholder"": ""Describe your task, type `#` for adding context"",\n\t\t\t\t""order"": 1,\n\t\t\t\t""description"": ""Delegate tasks to the GitHub Copilot Cloud Agent. The agent works asynchronously in the cloud to implement changes, iterates via chat, and can create or update pull requests as needed."",\n\t\t\t\t""when"": ""!chatEntitlementSignedOut && chatIsEnabled"",\n\t\t\t\t""capabilities"": {\n\t\t\t\t\t""supportsFileAttachments"": true\n\t\t\t\t}\n\t\t\t}\n\t\t],\n\t\t""debuggers"": [\n\t\t\t{\n\t\t\t\t""type"": ""vscode-chat-replay"",\n\t\t\t\t""label"": ""vscode-chat-replay"",\n\t\t\t\t""languages"": [\n\t\t\t\t\t""chatReplay""\n\t\t\t\t],\n\t\t\t\t""configurationAttributes"": {\n\t\t\t\t\t""launch"": {\n\t\t\t\t\t\t""properties"": {\n\t\t\t\t\t\t\t""program"": {\n\t\t\t\t\t\t\t\t""type"": ""string"",\n\t\t\t\t\t\t\t\t""description"": ""Chat replay file to debug (parse for headers)"",\n\t\t\t\t\t\t\t\t""default"": ""${file}""\n\t\t\t\t\t\t\t},\n\t\t\t\t\t\t\t""stopOnEntry"": {\n\t\t\t\t\t\t\t\t""type"": ""boolean"",\n\t\t\t\t\t\t\t\t""default"": true,\n\t\t\t\t\t\t\t\t""description"": ""Break immediately to step through manually.""\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t},\n\t\t\t\t\t\t""required"": [\n\t\t\t\t\t\t\t""program""\n\t\t\t\t\t\t]\n\t\t\t\t\t}\n\t\t\t\t}\n\t\t\t}\n\t\t],\n\t\t""chatAgents"": [\n\t\t\t{\n\t\t\t\t""name"": ""Plan"",\n\t\t\t\t""path"": ""./assets/agents/Plan.agent.md"",\n\t\t\t\t""description"": ""Researches a task to create multi-step plans""\n\t\t\t}\n\t\t],\n\t\t""prompts"": [\n\t\t\t{\n\t\t\t\t""name"": ""savePromptFile"",\n\t\t\t\t""path"": ""./assets/prompts/savePromptFile.prompt.md"",\n\t\t\t\t""description"": ""Generalize the current discussion into 
a reusable prompt and save it as a file""\n\t\t\t}\n\t\t]\n\t},\n\t""extensionPack"": [\n\t\t""GitHub.copilot""\n\t],\n\t""prettier"": {\n\t\t""useTabs"": true,\n\t\t""tabWidth"": 4,\n\t\t""singleQuote"": true\n\t},\n\t""scripts"": {\n\t\t""postinstall"": ""tsx ./script/postinstall.ts"",\n\t\t""prepare"": ""husky"",\n\t\t""vscode-dts:dev"": ""node node_modules/@vscode/dts/index.js dev && mv vscode.proposed.*.ts src/extension"",\n\t\t""vscode-dts:main"": ""node node_modules/@vscode/dts/index.js main && mv vscode.d.ts src/extension"",\n\t\t""build"": ""tsx .esbuild.ts"",\n\t\t""compile"": ""tsx .esbuild.ts --dev"",\n\t\t""watch"": ""npm-run-all -p watch:*"",\n\t\t""watch:esbuild"": ""tsx .esbuild.ts --watch --dev"",\n\t\t""watch:tsc-extension"": ""tsc --noEmit --watch --project tsconfig.json"",\n\t\t""watch:tsc-extension-web"": ""tsc --noEmit --watch --project tsconfig.worker.json"",\n\t\t""watch:tsc-simulation-workbench"": ""tsc --noEmit --watch --project test/simulation/workbench/tsconfig.json"",\n\t\t""typecheck"": ""tsc --noEmit --project tsconfig.json && tsc --noEmit --project test/simulation/workbench/tsconfig.json && tsc --noEmit --project tsconfig.worker.json"",\n\t\t""lint"": ""eslint . 
--max-warnings=0"",\n\t\t""lint-staged"": ""eslint --max-warnings=0"",\n\t\t""tsfmt"": ""npx tsfmt -r --verify"",\n\t\t""test"": ""npm-run-all test:*"",\n\t\t""test:extension"": ""vscode-test"",\n\t\t""test:sanity"": ""vscode-test --sanity"",\n\t\t""test:unit"": ""vitest --run --pool=forks"",\n\t\t""vitest"": ""vitest"",\n\t\t""bench"": ""vitest bench"",\n\t\t""get_env"": ""tsx script/setup/getEnv.mts"",\n\t\t""get_token"": ""tsx script/setup/getToken.mts"",\n\t\t""prettier"": ""prettier --list-different --write --cache ."",\n\t\t""simulate"": ""node dist/simulationMain.js"",\n\t\t""simulate-require-cache"": ""node dist/simulationMain.js --require-cache"",\n\t\t""simulate-ci"": ""node dist/simulationMain.js --ci --require-cache"",\n\t\t""simulate-update-baseline"": ""node dist/simulationMain.js --update-baseline"",\n\t\t""simulate-gc"": ""node dist/simulationMain.js --require-cache --gc"",\n\t\t""setup"": ""npm run get_env && npm run get_token"",\n\t\t""setup:dotnet"": ""run-script-os"",\n\t\t""setup:dotnet:darwin:linux"": ""curl -O https://raw.githubusercontent.com/dotnet/install-scripts/main/src/dotnet-install.sh && chmod u+x dotnet-install.sh && ./dotnet-install.sh --channel 10.0 && rm dotnet-install.sh"",\n\t\t""setup:dotnet:win32"": ""powershell.exe -NoProfile -ExecutionPolicy Bypass -Command \""Invoke-WebRequest -Uri https://raw.githubusercontent.com/dotnet/install-scripts/main/src/dotnet-install.ps1 -OutFile dotnet-install.ps1; ./dotnet-install.ps1 -channel 10.0; Remove-Item dotnet-install.ps1\"""",\n\t\t""analyze-edits"": ""tsx script/analyzeEdits.ts"",\n\t\t""extract-chat-lib"": ""tsx script/build/extractChatLib.ts"",\n\t\t""create_venv"": ""tsx script/setup/createVenv.mts"",\n\t\t""package"": ""vsce package"",\n\t\t""web"": ""vscode-test-web --headless --extensionDevelopmentPath=. 
."",\n\t\t""test:prompt"": ""mocha \""src/extension/completions-core/vscode-node/prompt/**/test/**/*.test.{ts,tsx}\"""",\n\t\t""test:completions-core"": ""tsx src/extension/completions-core/vscode-node/extension/test/runTest.ts""\n\t},\n\t""devDependencies"": {\n\t\t""@azure/identity"": ""4.9.1"",\n\t\t""@azure/keyvault-secrets"": ""^4.10.0"",\n\t\t""@azure/msal-node"": ""^3.6.3"",\n\t\t""@c4312/scip"": ""^0.1.0"",\n\t\t""@fluentui/react-components"": ""^9.66.6"",\n\t\t""@fluentui/react-icons"": ""^2.0.305"",\n\t\t""@hediet/node-reload"": ""^0.8.0"",\n\t\t""@keyv/sqlite"": ""^4.0.5"",\n\t\t""@nteract/messaging"": ""^7.0.20"",\n\t\t""@octokit/types"": ""^14.1.0"",\n\t\t""@parcel/watcher"": ""^2.5.1"",\n\t\t""@stylistic/eslint-plugin"": ""^3.0.1"",\n\t\t""@types/eslint"": ""^9.0.0"",\n\t\t""@types/google-protobuf"": ""^3.15.12"",\n\t\t""@types/js-yaml"": ""^4.0.9"",\n\t\t""@types/markdown-it"": ""^14.0.0"",\n\t\t""@types/minimist"": ""^1.2.5"",\n\t\t""@types/mocha"": ""^10.0.10"",\n\t\t""@types/node"": ""^22.16.3"",\n\t\t""@types/picomatch"": ""^4.0.0"",\n\t\t""@types/react"": ""17.0.44"",\n\t\t""@types/react-dom"": ""^18.2.17"",\n\t\t""@types/sinon"": ""^17.0.4"",\n\t\t""@types/source-map-support"": ""^0.5.10"",\n\t\t""@types/tar"": ""^6.1.13"",\n\t\t""@types/vinyl"": ""^2.0.12"",\n\t\t""@types/vscode"": ""^1.102.0"",\n\t\t""@types/yargs"": ""^17.0.24"",\n\t\t""@typescript-eslint/eslint-plugin"": ""^8.35.0"",\n\t\t""@typescript-eslint/parser"": ""^8.32.0"",\n\t\t""@typescript-eslint/typescript-estree"": ""^8.26.1"",\n\t\t""@vitest/coverage-v8"": ""^3.2.4"",\n\t\t""@vitest/snapshot"": ""^1.5.0"",\n\t\t""@vscode/debugadapter"": ""^1.68.0"",\n\t\t""@vscode/debugprotocol"": ""^1.68.0"",\n\t\t""@vscode/dts"": ""^0.4.1"",\n\t\t""@vscode/lsif-language-service"": ""^0.1.0-pre.4"",\n\t\t""@vscode/test-cli"": ""^0.0.11"",\n\t\t""@vscode/test-electron"": ""^2.5.2"",\n\t\t""@vscode/test-web"": ""^0.0.71"",\n\t\t""@vscode/vsce"": ""3.6.0"",\n\t\t""@vscode/zeromq"": 
""0.2.7"",\n\t\t""copyfiles"": ""^2.4.1"",\n\t\t""csv-parse"": ""^6.0.0"",\n\t\t""dotenv"": ""^17.2.0"",\n\t\t""electron"": ""^37.2.1"",\n\t\t""esbuild"": ""^0.25.6"",\n\t\t""eslint"": ""^9.30.0"",\n\t\t""eslint-import-resolver-typescript"": ""^4.4.4"",\n\t\t""eslint-plugin-header"": ""^3.1.1"",\n\t\t""eslint-plugin-import"": ""^2.32.0"",\n\t\t""eslint-plugin-jsdoc"": ""^51.3.4"",\n\t\t""eslint-plugin-no-only-tests"": ""^3.3.0"",\n\t\t""fastq"": ""^1.19.1"",\n\t\t""glob"": ""^11.0.3"",\n\t\t""husky"": ""^9.1.7"",\n\t\t""js-yaml"": ""^4.1.0"",\n\t\t""keyv"": ""^5.3.2"",\n\t\t""lint-staged"": ""15.2.9"",\n\t\t""minimist"": ""^1.2.8"",\n\t\t""mobx"": ""^6.13.7"",\n\t\t""mobx-react-lite"": ""^4.1.0"",\n\t\t""mocha"": ""^11.7.1"",\n\t\t""mocha-junit-reporter"": ""^2.2.1"",\n\t\t""mocha-multi-reporters"": ""^1.5.1"",\n\t\t""monaco-editor"": ""0.44.0"",\n\t\t""npm-run-all"": ""^4.1.5"",\n\t\t""open"": ""^10.1.2"",\n\t\t""openai"": ""^5.11.0"",\n\t\t""outdent"": ""^0.8.0"",\n\t\t""picomatch"": ""^4.0.2"",\n\t\t""playwright"": ""^1.56.1"",\n\t\t""prettier"": ""^3.6.2"",\n\t\t""react"": ""^17.0.2"",\n\t\t""react-dom"": ""17.0.2"",\n\t\t""rimraf"": ""^6.0.1"",\n\t\t""run-script-os"": ""^1.1.6"",\n\t\t""shiki"": ""~1.15.0"",\n\t\t""sinon"": ""^21.0.0"",\n\t\t""source-map-support"": ""^0.5.21"",\n\t\t""tar"": ""^7.4.3"",\n\t\t""ts-dedent"": ""^2.2.0"",\n\t\t""tsx"": ""^4.20.3"",\n\t\t""typescript"": ""^5.8.3"",\n\t\t""typescript-eslint"": ""^8.36.0"",\n\t\t""typescript-formatter"": ""github:jrieken/typescript-formatter#497efb26bc40b5fa59a350e6eab17bce650a7e4b"",\n\t\t""vite-plugin-top-level-await"": ""^1.5.0"",\n\t\t""vite-plugin-wasm"": ""^3.5.0"",\n\t\t""vitest"": ""^3.0.5"",\n\t\t""vscode-languageserver-protocol"": ""^3.17.5"",\n\t\t""vscode-languageserver-textdocument"": ""^1.0.12"",\n\t\t""vscode-languageserver-types"": ""^3.17.5"",\n\t\t""yaml"": ""^2.8.0"",\n\t\t""yargs"": ""^17.7.2""\n\t},\n\t""dependencies"": {\n\t\t""@anthropic-ai/claude-code"": 
""^1.0.120"",\n\t\t""@anthropic-ai/sdk"": ""^0.68.0"",\n\t\t""@github/copilot"": ""^0.0.343"",\n\t\t""@google/genai"": ""^1.22.0"",\n\t\t""@humanwhocodes/gitignore-to-minimatch"": ""1.0.2"",\n\t\t""@microsoft/tiktokenizer"": ""^1.0.10"",\n\t\t""@vscode/copilot-api"": ""^0.1.13"",\n\t\t""@vscode/extension-telemetry"": ""^1.0.0"",\n\t\t""@vscode/l10n"": ""^0.0.18"",\n\t\t""@vscode/prompt-tsx"": ""^0.4.0-alpha.5"",\n\t\t""@vscode/tree-sitter-wasm"": ""0.0.5-php.2"",\n\t\t""@xterm/headless"": ""^5.5.0"",\n\t\t""ajv"": ""^8.17.1"",\n\t\t""applicationinsights"": ""^2.9.7"",\n\t\t""diff"": ""^8.0.2"",\n\t\t""ignore"": ""^7.0.5"",\n\t\t""isbinaryfile"": ""^5.0.4"",\n\t\t""jsonc-parser"": ""^3.3.1"",\n\t\t""lru-cache"": ""^11.1.0"",\n\t\t""markdown-it"": ""^14.1.0"",\n\t\t""minimatch"": ""^10.0.3"",\n\t\t""undici"": ""^7.11.0"",\n\t\t""vscode-tas-client"": ""^0.1.84"",\n\t\t""web-tree-sitter"": ""^0.23.0""\n\t},\n\t""overrides"": {\n\t\t""@aminya/node-gyp-build"": ""npm:node-gyp-build@4.8.1"",\n\t\t""string_decoder"": ""npm:string_decoder@1.2.0"",\n\t\t""node-gyp"": ""npm:node-gyp@10.3.1""\n\t}\n}",json,tab
+10,43496,"src/extension/completions-core/vscode-node/lib/src/postInsertion.ts",0,0,"/*---------------------------------------------------------------------------------------------\n * Copyright (c) Microsoft Corporation. All rights reserved.\n * Licensed under the MIT License. See License.txt in the project root for license information.\n *--------------------------------------------------------------------------------------------*/\nimport { IInstantiationService, ServicesAccessor } from '../../../../../util/vs/platform/instantiation/common/instantiation';\nimport { ICompletionsTelemetryService } from '../../bridge/src/completionsTelemetryServiceBridge';\nimport { CopilotTokenManager } from './auth/copilotTokenManager';\nimport { ChangeTracker } from './changeTracker';\nimport { CitationManager, IPCitationDetail } from './citationManager';\nimport { createCompletionState } from './completionState';\nimport { ICompletionsContextService } from './context';\nimport { FileReader } from './fileReader';\nimport { PostInsertionCategory, telemetryAccepted, telemetryRejected } from './ghostText/telemetry';\nimport { LogTarget, Logger } from './logger';\nimport { CopilotNamedAnnotationList } from './openai/stream';\nimport { contextIndentationFromText, indentationBlockFinished } from './prompt/parseBlock';\nimport { Prompt, extractPrompt } from './prompt/prompt';\nimport { fetchCitations } from './snippy/handlePostInsertion';\nimport { editDistance, lexEditDistance } from './suggestions/editDistance';\nimport { SuggestionStatus, computeCompletionText } from './suggestions/partialSuggestions';\nimport { TelemetryStore, TelemetryWithExp, telemetry, telemetryCatch } from './telemetry';\nimport { TextDocumentManager } from './textDocumentManager';\nimport { ICompletionsPromiseQueueService } from './util/promiseQueue';\nimport { ICompletionsRuntimeModeService } from './util/runtimeMode';\n\nconst postInsertionLogger = new Logger('postInsertion');\n\ntype Timeout = {\n\tseconds: 
number;\n\tcaptureCode: boolean;\n\tcaptureRejection: boolean;\n};\n// windows for telemetry checks, in seconds\n// captureCode = capture the code after acceptance,\n// captureRejection = capture the code after rejection\nconst captureTimeouts: Timeout[] = [\n\t{ seconds: 15, captureCode: false, captureRejection: false },\n\t{ seconds: 30, captureCode: true, captureRejection: true },\n\t{ seconds: 120, captureCode: false, captureRejection: false },\n\t{ seconds: 300, captureCode: false, captureRejection: false },\n\t{ seconds: 600, captureCode: false, captureRejection: false },\n];\n\n// No. of chars before/after insertion point to look for the completion\nconst stillInCodeNearMargin = 50;\nconst stillInCodeFarMargin = 1500;\n\n// If lex edit distance is below this fraction of completion length it is considered\n// in the code\nconst stillInCodeFraction = 0.5;\n\n// Number of characters captured after the insertion point.\n// Used only if we couldn't detect termination point with indent-based parsing.\nconst captureCodeMargin = 500;\n\nconst postInsertConfiguration: {\n\ttriggerPostInsertionSynchroneously: boolean;\n\tcaptureCode: boolean;\n\tcaptureRejection: boolean;\n} = {\n\ttriggerPostInsertionSynchroneously: false,\n\tcaptureCode: false,\n\tcaptureRejection: false,\n};\n\nasync function captureCode(\n\taccessor: ServicesAccessor,\n\turi: string,\n\tcompletionTelemetry: TelemetryWithExp,\n\toffset: number,\n\tsuffixOffset?: number\n): Promise<{ prompt: Prompt; capturedCode: string; terminationOffset: number }> {\n\tconst ctx = accessor.get(ICompletionsContextService);\n\tconst instantiationService = accessor.get(IInstantiationService);\n\tconst logTarget = ctx.get(LogTarget);\n\tconst result = await ctx.get(FileReader).getOrReadTextDocumentWithFakeClientProperties({ uri });\n\tif (result.status !== 'valid') {\n\t\tpostInsertionLogger.info(logTarget, `Could not get document for ${uri}. 
Maybe it was closed by the editor.`);\n\t\treturn {\n\t\t\tprompt: {\n\t\t\t\tprefix: '',\n\t\t\t\tsuffix: '',\n\t\t\t\tisFimEnabled: false,\n\t\t\t},\n\t\t\tcapturedCode: '',\n\t\t\tterminationOffset: 0,\n\t\t};\n\t}\n\tconst document = result.document;\n\tconst documentText = document.getText();\n\tconst documentTextBefore = documentText.substring(0, offset);\n\tconst position = document.positionAt(offset);\n\n\t// Treat the code before offset as the hypothetical prompt\n\tconst hypotheticalPromptResponse = await instantiationService.invokeFunction(extractPrompt,\n\t\tcompletionTelemetry.properties.headerRequestId,\n\t\tcreateCompletionState(document, position),\n\t\tcompletionTelemetry\n\t);\n\tconst hypotheticalPrompt =\n\t\thypotheticalPromptResponse.type === 'prompt'\n\t\t\t? hypotheticalPromptResponse.prompt\n\t\t\t: {\n\t\t\t\tprefix: documentTextBefore,\n\t\t\t\tsuffix: '',\n\t\t\t\tisFimEnabled: false,\n\t\t\t}; // TODO(eaftan): Pass an actual suffix when we're ready to support it\n\n\tif (hypotheticalPrompt.isFimEnabled && suffixOffset !== undefined) {\n\t\t// With FIM enabled, we can exactly determine capturedCode, suffix and prefix by properly initialized trackers. 
No need to guess.\n\t\tconst capturedCode = documentText.substring(offset, suffixOffset);\n\t\thypotheticalPrompt.suffix = documentText.substring(suffixOffset);\n\n\t\treturn { prompt: hypotheticalPrompt, capturedCode, terminationOffset: 0 };\n\t} else {\n\t\t//Everything after the insertion point is hypothetical response we could get from AI\n\t\tconst hypotheticalResponse = documentText.substring(offset);\n\n\t\t//Try to find the termination offset in the hypothetical response using indentation based parsing\n\t\tconst contextIndent = contextIndentationFromText(documentTextBefore, offset, document.detectedLanguageId);\n\t\tconst indentTerminationFunction = indentationBlockFinished(contextIndent, undefined);\n\t\tconst terminationResult = indentTerminationFunction(hypotheticalResponse);\n\n\t\t//If we could detect termination of the indentation block, capture 2x length of detected suggestion\n\t\t//Otherwise capture a lot of characters\n\t\tconst maxOffset = Math.min(\n\t\t\tdocumentText.length,\n\t\t\toffset + (terminationResult ? terminationResult * 2 : captureCodeMargin)\n\t\t);\n\n\t\tconst capturedCode = documentText.substring(offset, maxOffset);\n\n\t\treturn { prompt: hypotheticalPrompt, capturedCode, terminationOffset: terminationResult ?? 
-1 };\n\t}\n}\n\nexport function postRejectionTasks(\n\taccessor: ServicesAccessor,\n\tinsertionCategory: PostInsertionCategory,\n\tinsertionOffset: number,\n\turi: string,\n\tcompletions: { completionText: string; completionTelemetryData: TelemetryWithExp }[]\n) {\n\tconst logTarget = accessor.get(ICompletionsContextService).get(LogTarget);\n\tconst instantiationService = accessor.get(IInstantiationService);\n\tconst telemetryService = accessor.get(ICompletionsTelemetryService);\n\tconst promiseQueueService = accessor.get(ICompletionsPromiseQueueService);\n\n\t//Send `.rejected` telemetry event for each rejected completion\n\tcompletions.forEach(({ completionText, completionTelemetryData }) => {\n\t\tpostInsertionLogger.debug(\n\t\t\tlogTarget,\n\t\t\t`${insertionCategory}.rejected choiceIndex: ${completionTelemetryData.properties.choiceIndex}`\n\t\t);\n\t\tinstantiationService.invokeFunction(telemetryRejected, insertionCategory, completionTelemetryData);\n\t});\n\tconst positionTracker = instantiationService.createInstance(ChangeTracker, uri, insertionOffset - 1);\n\tconst suffixTracker = instantiationService.createInstance(ChangeTracker, uri, insertionOffset);\n\n\tconst checkInCode = async (t: Timeout) => {\n\t\tpostInsertionLogger.debug(\n\t\t\tlogTarget,\n\t\t\t`Original offset: ${insertionOffset}, Tracked offset: ${positionTracker.offset}`\n\t\t);\n\t\tconst { completionTelemetryData } = completions[0];\n\n\t\tconst { prompt, capturedCode, terminationOffset } = await instantiationService.invokeFunction(captureCode,\n\t\t\turi,\n\t\t\tcompletionTelemetryData,\n\t\t\tpositionTracker.offset + 1,\n\t\t\tsuffixTracker.offset\n\t\t);\n\n\t\tconst promptTelemetry = {\n\t\t\thypotheticalPromptJson: JSON.stringify({ prefix: prompt.prefix, context: prompt.context }),\n\t\t\thypotheticalPromptSuffixJson: JSON.stringify(prompt.suffix),\n\t\t};\n\n\t\tconst customTelemetryData = 
completionTelemetryData.extendedBy(\n\t\t\t{\n\t\t\t\t...promptTelemetry,\n\t\t\t\tcapturedCodeJson: JSON.stringify(capturedCode),\n\t\t\t},\n\t\t\t{\n\t\t\t\ttimeout: t.seconds,\n\t\t\t\tinsertionOffset: insertionOffset,\n\t\t\t\ttrackedOffset: positionTracker.offset,\n\t\t\t\tterminationOffsetInCapturedCode: terminationOffset,\n\t\t\t}\n\t\t);\n\t\tpostInsertionLogger.debug(\n\t\t\tlogTarget,\n\t\t\t`${insertionCategory}.capturedAfterRejected choiceIndex: ${completionTelemetryData.properties.choiceIndex}`,\n\t\t\tcustomTelemetryData\n\t\t);\n\t\tinstantiationService.invokeFunction(telemetry, insertionCategory + '.capturedAfterRejected', customTelemetryData, TelemetryStore.Enhanced);\n\t};\n\t// Capture the code typed after we detected that completion was rejected,\n\t// Uses first displayed completion as the source/seed of telemetry information.\n\tcaptureTimeouts\n\t\t.filter(t => t.captureRejection)\n\t\t.map(t =>\n\t\t\tpositionTracker.push(\n\t\t\t\ttelemetryCatch(telemetryService, promiseQueueService, () => checkInCode(t), 'postRejectionTasks'),\n\t\t\t\tt.seconds * 1000\n\t\t\t)\n\t\t);\n}\n\nexport function postInsertionTasks(\n\taccessor: ServicesAccessor,\n\tinsertionCategory: PostInsertionCategory,\n\tcompletionText: string,\n\tinsertionOffset: number,\n\turi: string,\n\ttelemetryData: TelemetryWithExp,\n\tsuggestionStatus: SuggestionStatus,\n\tcopilotAnnotations?: CopilotNamedAnnotationList\n) {\n\tconst logTarget = accessor.get(ICompletionsContextService).get(LogTarget);\n\tconst instantiationService = accessor.get(IInstantiationService);\n\tconst promiseQueueService = accessor.get(ICompletionsPromiseQueueService);\n\tconst telemetryService = accessor.get(ICompletionsTelemetryService);\n\tconst runtimeModeService = accessor.get(ICompletionsRuntimeModeService);\n\n\tconst telemetryDataWithStatus = telemetryData.extendedBy(\n\t\t{\n\t\t\tcompType: suggestionStatus.compType,\n\t\t},\n\t\t{\n\t\t\tcompCharLen: 
suggestionStatus.acceptedLength,\n\t\t\tnumLines: suggestionStatus.acceptedLines,\n\t\t}\n\t);\n\t// send "".accepted"" telemetry\n\tpostInsertionLogger.debug(\n\t\tlogTarget,\n\t\t`${insertionCategory}.accepted choiceIndex: ${telemetryDataWithStatus.properties.choiceIndex}`\n\t);\n\tinstantiationService.invokeFunction(telemetryAccepted, insertionCategory, telemetryDataWithStatus);\n\n\tconst fullCompletionText = completionText;\n\tcompletionText = computeCompletionText(completionText, suggestionStatus);\n\tconst trimmedCompletion = completionText.trim();\n\tconst tracker = instantiationService.createInstance(ChangeTracker, uri, insertionOffset);\n\tconst suffixTracker = instantiationService.createInstance(ChangeTracker, uri, insertionOffset + completionText.length);\n\n\tconst stillInCodeCheck = async (timeout: Timeout) => {\n\t\tconst check = instantiationService.invokeFunction(checkStillInCode,\n\t\t\tinsertionCategory,\n\t\t\ttrimmedCompletion,\n\t\t\tinsertionOffset,\n\t\t\turi,\n\t\t\ttimeout,\n\t\t\ttelemetryDataWithStatus,\n\t\t\ttracker,\n\t\t\tsuffixTracker\n\t\t);\n\t\tawait check;\n\t};\n\n\t// For test purposes, we add one set of these telemetry events synchronously to allow asserting the telemetry\n\tif (postInsertConfiguration.triggerPostInsertionSynchroneously && runtimeModeService.isRunningInTest()) {\n\t\tconst check = stillInCodeCheck({\n\t\t\tseconds: 0,\n\t\t\tcaptureCode: postInsertConfiguration.captureCode,\n\t\t\tcaptureRejection: postInsertConfiguration.captureRejection,\n\t\t});\n\t\tpromiseQueueService.register(check);\n\t} else {\n\t\tcaptureTimeouts.map(timeout =>\n\t\t\ttracker.push(\n\t\t\t\ttelemetryCatch(telemetryService, promiseQueueService, () => stillInCodeCheck(timeout), 'postInsertionTasks'),\n\t\t\t\ttimeout.seconds * 1000\n\t\t\t)\n\t\t);\n\t}\n\n\tinstantiationService.invokeFunction(acc => telemetryCatch(telemetryService, promiseQueueService, citationCheck, 'post insertion citation 
check')(\n\t\tacc,\n\t\turi,\n\t\tfullCompletionText,\n\t\tcompletionText,\n\t\tinsertionOffset,\n\t\tcopilotAnnotations\n\t));\n}\n\nasync function citationCheck(\n\taccessor: ServicesAccessor,\n\turi: string,\n\tfullCompletionText: string,\n\tinsertedText: string,\n\tinsertionOffset: number,\n\tcopilotAnnotations?: CopilotNamedAnnotationList\n) {\n\tconst logTarget = accessor.get(ICompletionsContextService).get(LogTarget);\n\tconst textDocumentManagerService = accessor.get(ICompletionsContextService).get(TextDocumentManager);\n\tconst copilotTokenManager = accessor.get(ICompletionsContextService).get(CopilotTokenManager);\n\tconst citationManagerService = accessor.get(ICompletionsContextService).get(CitationManager);\n\n\t// If there are no citations, request directly from the snippy service\n\tif (!copilotAnnotations || (copilotAnnotations.ip_code_citations?.length ?? 0) < 1) {\n\t\t// Do not request citations if in blocking mode\n\t\tif (copilotTokenManager.getLastToken()?.getTokenValue('sn') === '1') { return; }\n\t\tawait fetchCitations(accessor, uri, insertedText, insertionOffset);\n\t\treturn;\n\t}\n\n\tconst doc = await textDocumentManagerService.getTextDocument({ uri });\n\n\t// in the CLS, if the editor does not wait to send document updates until the\n\t// acceptance function returns, we could be in a race condition with ongoing\n\t// edits. 
This searches for the completion text so that hopefully we're providing\n\t// an exact location in a known version of the document.\n\tif (doc) {\n\t\tconst found = find(doc.getText(), insertedText, stillInCodeNearMargin, insertionOffset);\n\t\tif (found.stillInCodeHeuristic) {\n\t\t\tinsertionOffset = found.foundOffset;\n\t\t}\n\t}\n\n\tfor (const citation of copilotAnnotations.ip_code_citations) {\n\t\tconst citationStart = computeCitationStart(\n\t\t\tfullCompletionText.length,\n\t\t\tinsertedText.length,\n\t\t\tcitation.start_offset\n\t\t);\n\t\tif (citationStart === undefined) {\n\t\t\tpostInsertionLogger.info(\n\t\t\t\tlogTarget,\n\t\t\t\t`Full completion for ${uri} contains a reference matching public code, but the partially inserted text did not include the match.`\n\t\t\t);\n\t\t\tcontinue;\n\t\t}\n\t\tconst offsetStart = insertionOffset + citationStart;\n\t\tconst start = doc?.positionAt(offsetStart);\n\t\tconst offsetEnd =\n\t\t\tinsertionOffset + computeCitationEnd(fullCompletionText.length, insertedText.length, citation.stop_offset);\n\t\tconst end = doc?.positionAt(offsetEnd);\n\t\tconst text = start && end ? doc?.getText({ start, end }) : '';\n\n\t\tawait citationManagerService.handleIPCodeCitation({\n\t\t\tinDocumentUri: uri,\n\t\t\toffsetStart,\n\t\t\toffsetEnd,\n\t\t\tversion: doc?.version,\n\t\t\tlocation: start && end ? 
{ start, end } : undefined,\n\t\t\tmatchingText: text,\n\t\t\tdetails: citation.details.citations as IPCitationDetail[],\n\t\t});\n\t}\n}\n\nfunction computeCitationStart(\n\tcompletionLength: number,\n\tinsertedLength: number,\n\tcitationStartOffset: number\n): number | undefined {\n\tif (insertedLength < completionLength && citationStartOffset > insertedLength) {\n\t\treturn undefined;\n\t}\n\treturn citationStartOffset;\n}\n\nfunction computeCitationEnd(completionLength: number, insertedLength: number, citationStopOffset: number): number {\n\tif (insertedLength < completionLength) {\n\t\treturn Math.min(citationStopOffset, insertedLength);\n\t}\n\treturn citationStopOffset;\n}\n\nfunction find(documentText: string, completion: string, margin: number, offset: number) {\n\t// Compute the best alignment between a window of the document text and the completion\n\tconst window = documentText.substring(\n\t\tMath.max(0, offset - margin),\n\t\tMath.min(documentText.length, offset + completion.length + margin)\n\t);\n\tconst lexAlignment = lexEditDistance(window, completion);\n\tconst fraction = lexAlignment.lexDistance / lexAlignment.needleLexLength;\n\tconst { distance: charEditDistance } = editDistance(\n\t\twindow.substring(lexAlignment.startOffset, lexAlignment.endOffset),\n\t\tcompletion\n\t);\n\treturn {\n\t\trelativeLexEditDistance: fraction,\n\t\tcharEditDistance,\n\t\tcompletionLexLength: lexAlignment.needleLexLength,\n\t\tfoundOffset: lexAlignment.startOffset + Math.max(0, offset - margin),\n\t\tlexEditDistance: lexAlignment.lexDistance,\n\t\tstillInCodeHeuristic: fraction <= stillInCodeFraction ? 
1 : 0,\n\t};\n}\n\nasync function checkStillInCode(\n\taccessor: ServicesAccessor,\n\tinsertionCategory: string,\n\tcompletion: string,\n\tinsertionOffset: number, // offset where the completion was inserted to\n\turi: string,\n\ttimeout: Timeout,\n\ttelemetryData: TelemetryWithExp,\n\ttracker: ChangeTracker,\n\tsuffixTracker: ChangeTracker\n) {\n\t// Get contents of file from file system\n\tconst ctx = accessor.get(ICompletionsContextService);\n\tconst instantiationService = accessor.get(IInstantiationService);\n\tconst logTarget = ctx.get(LogTarget);\n\tconst result = await ctx.get(FileReader).getOrReadTextDocument({ uri });\n\tif (result.status === 'valid') {\n\t\tconst document = result.document;\n\t\tconst documentText = document.getText();\n\n\t\t// We try twice, first very close to the insertion point, then a bit\n\t\t// further. This is to increase accuracy for short completions,\n\t\t// where the completion might appear elsewhere.\n\t\tlet finding = find(documentText, completion, stillInCodeNearMargin, tracker.offset);\n\t\tif (!finding.stillInCodeHeuristic) {\n\t\t\tfinding = find(documentText, completion, stillInCodeFarMargin, tracker.offset);\n\t\t}\n\t\t// Debug and log a binary decision\n\t\tpostInsertionLogger.debug(\n\t\t\tlogTarget,\n\t\t\t`stillInCode: ${finding.stillInCodeHeuristic ? 'Found' : 'Not found'}! Completion '${completion}' in file ${uri\n\t\t\t}. lexEditDistance fraction was ${finding.relativeLexEditDistance}. Char edit distance was ${finding.charEditDistance\n\t\t\t}. Inserted at ${insertionOffset}, tracked at ${tracker.offset}, found at ${finding.foundOffset\n\t\t\t}. 
choiceIndex: ${telemetryData.properties.choiceIndex}`\n\t\t);\n\t\t// Log all the details for analysis\n\t\tconst customTelemetryData = telemetryData\n\t\t\t.extendedBy({}, { timeout: timeout.seconds, insertionOffset: insertionOffset, trackedOffset: tracker.offset })\n\t\t\t.extendedBy({}, finding);\n\t\tinstantiationService.invokeFunction(telemetry, insertionCategory + '.stillInCode', customTelemetryData);\n\n\t\tif (timeout.captureCode) {\n\t\t\tconst { prompt, capturedCode, terminationOffset } = await instantiationService.invokeFunction(\n\t\t\t\tcaptureCode,\n\t\t\t\turi,\n\t\t\t\tcustomTelemetryData,\n\t\t\t\ttracker.offset,\n\t\t\t\tsuffixTracker.offset\n\t\t\t);\n\t\t\tconst promptTelemetry = {\n\t\t\t\thypotheticalPromptJson: JSON.stringify({ prefix: prompt.prefix, context: prompt.context }),\n\t\t\t\thypotheticalPromptSuffixJson: JSON.stringify(prompt.suffix),\n\t\t\t};\n\n\t\t\tconst afterAcceptedTelemetry = telemetryData.extendedBy(\n\t\t\t\t{\n\t\t\t\t\t...promptTelemetry,\n\t\t\t\t\tcapturedCodeJson: JSON.stringify(capturedCode),\n\t\t\t\t},\n\t\t\t\t{\n\t\t\t\t\ttimeout: timeout.seconds,\n\t\t\t\t\tinsertionOffset: insertionOffset,\n\t\t\t\t\ttrackedOffset: tracker.offset,\n\t\t\t\t\tterminationOffsetInCapturedCode: terminationOffset,\n\t\t\t\t}\n\t\t\t);\n\t\t\tpostInsertionLogger.debug(\n\t\t\t\tlogTarget,\n\t\t\t\t`${insertionCategory}.capturedAfterAccepted choiceIndex: ${telemetryData.properties.choiceIndex}`,\n\t\t\t\tcustomTelemetryData\n\t\t\t);\n\t\t\tinstantiationService.invokeFunction(\n\t\t\t\ttelemetry,\n\t\t\t\tinsertionCategory + '.capturedAfterAccepted',\n\t\t\t\tafterAcceptedTelemetry,\n\t\t\t\tTelemetryStore.Enhanced\n\t\t\t);\n\t\t}\n\t}\n}\n",typescript,tab
+11,44025,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"2:45:14 PM [info] Activating crowd-code\n2:45:14 PM [info] Recording started\n2:45:14 PM [info] Initializing git provider using file system watchers...\n2:45:14 PM [info] Git repository found\n2:45:14 PM [info] Git provider initialized successfully\n2:45:14 PM [info] Initial git state: [object Object]\n2:45:19 PM [info] Branch checkout detected: capture-tab-and-chat-interaction -> main\n2:45:19 PM [info] Recording git checkout: Switched from branch 'capture-tab-and-chat-interaction' to 'main'\n2:45:19 PM [info] Resetting file cache due to branch checkout\n2:45:39 PM [info] Branch checkout detected: main -> expose-oai-endpoint\n2:45:39 PM [info] Recording git checkout: Switched from branch 'main' to 'expose-oai-endpoint'\n2:45:39 PM [info] Resetting file cache due to branch checkout\n",Log,tab
+12,45667,"TERMINAL",0,0,"",,terminal_focus
+13,45668,"src/extension/completions-core/vscode-node/lib/src/postInsertion.ts",0,0,"",typescript,tab
+14,56465,"TERMINAL",0,0,"module load node",,terminal_command
+15,56504,"TERMINAL",0,0,"]633;C",,terminal_output
+16,58035,"TERMINAL",0,0,"[1;31mLmod has detected the following error: [0m The following module(s) are unknown: ""node""\r\n\r\nPlease check the spelling or version number. Also try ""module spider ...""\r\nIt is also possible your cache file is out-of-date; it may help to try:\r\n $ module --ignore_cache load ""node""\r\n\r\nAlso make sure that all modulefiles written in TCL start with the string #%Module\r\n\r\n\r\n\r\n]0;franz.srambical@hai-login1:~/vscode-crowd-pilot-chat",,terminal_output
+17,62622,"TERMINAL",0,0,"module spider node",,terminal_command
+18,62686,"TERMINAL",0,0,"]633;C",,terminal_output
+19,63233,"TERMINAL",0,0,"\r\n------------------------------------------------------------------------------------------------------------------------------------------------\r\n Tree::DAG_Node:\r\n------------------------------------------------------------------------------------------------------------------------------------------------\r\n Versions:\r\n [1;34mTree::DAG_Node/1.32[0m (E)\r\n\r\nNames marked by a trailing (E) are extensions provided by another module.\r\n\r\n\r\n------------------------------------------------------------------------------------------------------------------------------------------------\r\n For detailed information about a specific ""Tree::DAG_Node"" package (including how to load the modules) use the module's full name.\r\n Note that names that have a trailing (E) are extensions provided by other modules.\r\n For example:\r\n\r\n $ module spider Tree::DAG_Node/1.32\r\n------------------------------------------------------------------------------------------------------------------------------------------------\r\n\r\n------------------------------------------------------------------------------------------------------------------------------------------------\r\n hatch_nodejs_version:\r\n------------------------------------------------------------------------------------------------------------------------------------------------\r\n Versions:\r\n [1;34mhatch_nodejs_version/0.3.1[0m (E)\r\n\r\nNames marked by a trailing (E) are extensions provided by another module.\r\n\r\n\r\n------------------------------------------------------------------------------------------------------------------------------------------------\r\n For detailed information about a specific ""hatch_nodejs_version"" package (including how to load the modules) use the module's full name.\r\n Note that names that have a trailing (E) are extensions provided by other modules.\r\n For example:\r\n\r\n $ module spider 
hatch_nodejs_version/0.3.1\r\n------------------------------------------------------------------------------------------------------------------------------------------------\r\n[7m--More--[27m",,terminal_output
+20,68702,"TERMINAL",0,0,"\r\r[K\r\n------------------------------------------------------------------------------------------------------------------------------------------------\r\n nodejs:\r\n------------------------------------------------------------------------------------------------------------------------------------------------\r\n Description:\r\n Node.js is a platform built on Chrome's JavaScript runtime for easily building fast, scalable network applications. Node.js uses an\r\n event-driven, non-blocking I/O model that makes it lightweight and efficient, perfect for data-intensive real-time applications that run\r\n across distributed devices.\r\n\r\n Versions:\r\n nodejs/18.17.1-GCCcore-12.3.0\r\n nodejs/20.13.1-GCCcore-13.3.0\r\n\r\n------------------------------------------------------------------------------------------------------------------------------------------------\r\n For detailed information about a specific ""nodejs"" package (including how to load the modules) use the module's full name.\r\n Note that names that have a trailing (E) are extensions provided by other modules.\r\n For example:\r\n\r\n $ module spider nodejs/20.13.1-GCCcore-13.3.0\r\n------------------------------------------------------------------------------------------------------------------------------------------------\r\n\r\n \r\n\r\n]0;franz.srambical@hai-login1:~/vscode-crowd-pilot-chat",,terminal_output
+21,73093,"TERMINAL",0,0,"module spider nodejs",,terminal_command
+22,73164,"TERMINAL",0,0,"]633;C",,terminal_output
+23,73651,"TERMINAL",0,0,"\r\n------------------------------------------------------------------------------------------------------------------------------------------------\r\n nodejs:\r\n------------------------------------------------------------------------------------------------------------------------------------------------\r\n Description:\r\n Node.js is a platform built on Chrome's JavaScript runtime for easily building fast, scalable network applications. Node.js uses an\r\n event-driven, non-blocking I/O model that makes it lightweight and efficient, perfect for data-intensive real-time applications that run\r\n across distributed devices.\r\n\r\n Versions:\r\n nodejs/18.17.1-GCCcore-12.3.0\r\n nodejs/20.13.1-GCCcore-13.3.0\r\n\r\n------------------------------------------------------------------------------------------------------------------------------------------------\r\n For detailed information about a specific ""nodejs"" package (including how to load the modules) use the module's full name.\r\n Note that names that have a trailing (E) are extensions provided by other modules.\r\n For example:\r\n\r\n $ module spider nodejs/20.13.1-GCCcore-13.3.0\r\n------------------------------------------------------------------------------------------------------------------------------------------------\r\n\r\n \r\n\r\n]0;franz.srambical@hai-login1:~/vscode-crowd-pilot-chat",,terminal_output
+24,77423,"TERMINAL",0,0,"module load nodejs",,terminal_command
+25,77479,"TERMINAL",0,0,"]633;C",,terminal_output
+26,77502,"TERMINAL",0,0,"]0;franz.srambical@hai-login1:~/vscode-crowd-pilot-chat",,terminal_output
+27,83804,"TERMINAL",0,0,"npm install",,terminal_command
+28,83843,"TERMINAL",0,0,"]633;C",,terminal_output
+29,87169,"TERMINAL",0,0,"[37;40mnpm[0m [0m[31;40mERR![0m [0m[35mcode[0m EBADENGINE\r\n[0m[37;40mnpm[0m [0m[31;40mERR![0m [0m[35mengine[0m Unsupported engine\r\n[0m[37;40mnpm[0m [0m[31;40mERR![0m [0m[35mengine[0m Not compatible with your version of node/npm: copilot-chat@0.33.0\r\n[0m[37;40mnpm[0m [0m[31;40mERR![0m [0m[35mnotsup[0m Not compatible with your version of node/npm: copilot-chat@0.33.0\r\n[0m[37;40mnpm[0m [0m[31;40mERR![0m [0m[35mnotsup[0m Required: {""vscode"":""^1.106.0-20251030"",""npm"":"">=9.0.0"",""node"":"">=22.14.0""}\r\n[0m[37;40mnpm[0m [0m[31;40mERR![0m [0m[35mnotsup[0m Actual: {""npm"":""10.5.2"",""node"":""v20.13.1""}\r\n[0m\r\n[37;40mnpm[0m [0m[31;40mERR![0m[35m[0m A complete log of this run can be found in: /home/franz.srambical/.npm/_logs/2025-11-03T13_46_39_640Z-debug-0.log\r\n[0m]0;franz.srambical@hai-login1:~/vscode-crowd-pilot-chat",,terminal_output
+30,112309,"TERMINAL",0,0,"npm --version",,terminal_command
+31,112360,"TERMINAL",0,0,"]633;C",,terminal_output
+32,112751,"TERMINAL",0,0,"10.5.2\r\n]0;franz.srambical@hai-login1:~/vscode-crowd-pilot-chat",,terminal_output
+33,118827,"TERMINAL",0,0,"node --version",,terminal_command
+34,118879,"TERMINAL",0,0,"]633;C",,terminal_output
+35,118978,"TERMINAL",0,0,"v20.13.1\r\n]0;franz.srambical@hai-login1:~/vscode-crowd-pilot-chat",,terminal_output
+36,119789,"TERMINAL",0,0,"",,terminal_command
+37,1197730,"TERMINAL",0,0,"bash",,terminal_focus
+38,1199269,"TERMINAL",0,0,"ensure-deps",,terminal_focus
+39,1200217,"TERMINAL",0,0,"bash",,terminal_focus
+40,1206900,"TERMINAL",0,0,"squeue",,terminal_command
+41,1206922,"TERMINAL",0,0,"]633;C JOBID USER PARTITION NODES CPUS ST SUBMIT_TIME START_TIME TIME TIME_LIMIT NODELIST(REASON)\r\n 33317 xiao.liu interacti 1 128 R 2025-11-02T17:43:38 2025-11-02T17:43:38 21:21:43 23:59:00 hai006\r\n 33350 nishant.ku standard 1 8 R 2025-11-03T14:45:30 2025-11-03T14:45:37 19:44 1-00:00:00 hai001\r\n 33328 kalyan.nad standard 1 64 R 2025-11-03T11:56:23 2025-11-03T11:56:38 3:08:43 1-00:00:00 hai002\r\n 33318 xiao.liu standard 1 128 R 2025-11-02T19:29:40 2025-11-02T19:30:38 19:34:43 23:59:00 hai004\r\n]0;franz.srambical@hai-login1:~/vscode-crowd-pilot-chat",,terminal_output
+42,1211022,"TERMINAL",0,0,"bash",,terminal_focus
+43,1219029,"TERMINAL",0,0,"salloc --gpus=1 --ntasks-per-node=1 --cpus-per-task=10 --mem=100G",,terminal_command
+44,1219099,"TERMINAL",0,0,"]633;Csalloc: Granted job allocation 33358\r\n",,terminal_output
+45,1219215,"TERMINAL",0,0,"salloc: Nodes hai003 are ready for job\r\n",,terminal_output
+46,1219567,"TERMINAL",0,0,"Running inside SLURM, Job ID 33358.\r\n",,terminal_output
+47,1219682,"TERMINAL",0,0,"]0;franz.srambical@hai-login1:/fast/home/franz.srambical/vscode-crowd-pilot-chat[?2004h[franz.srambical@hai003.haicore.berlin:/fast/home/franz.srambical/vscode-crowd-pilot-chat] $ ",,terminal_output
+48,1224028,"TERMINAL",0,0,"c",,terminal_output
+49,1224289,"TERMINAL",0,0,"d",,terminal_output
+50,1224355,"TERMINAL",0,0,"\r\n[?2004l\r]0;franz.srambical@hai-login1:~[?2004h[franz.srambical@hai003.haicore.berlin:~] $ ",,terminal_output
+51,1224575,"TERMINAL",0,0,"c",,terminal_output
+52,1224867,"TERMINAL",0,0,"d",,terminal_output
+53,1224966,"TERMINAL",0,0," ",,terminal_output
+54,1225827,"TERMINAL",0,0,"c",,terminal_output
+55,1226200,"TERMINAL",0,0,"[K",,terminal_output
+56,1226269,"TERMINAL",0,0,"v",,terminal_output
+57,1226461,"TERMINAL",0,0,"s",,terminal_output
+58,1226606,"TERMINAL",0,0,"code-crowd-pilot-chat/",,terminal_output
+59,1227060,"TERMINAL",0,0,"\r\n[?2004l\r]0;franz.srambical@hai-login1:~/vscode-crowd-pilot-chat[?2004h[franz.srambical@hai003.haicore.berlin:~/vscode-crowd-pilot-chat] $ ",,terminal_output
+60,1229480,"TERMINAL",0,0,"\r(reverse-i-search)`': [K",,terminal_output
+61,1229693,"TERMINAL",0,0,"s': cd v[7ms[27mcode-crowd-pilot-chat/o': . ""/fast/home/franz.srambical/.cur[7mso[27mr-server/bin/3ccce8f55d8cca49f6d28b491a844c699b8719a0/out/vs/workbench/contrib/terminal/common/scripts/shellIntegration-bash.sh""[A[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C\r[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[Cu': [7msou[27mrce .venv/bin/activate[K\r\n\r[K[A[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C",,terminal_output
+62,1229821,"TERMINAL",0,0,"[1@r': [7msour[27m",,terminal_output
+63,1230609,"TERMINAL",0,0,"\r[42@[franz.srambical@hai003.haicore.berlin:~/vscode-crowd-pilot-chat] $ sour\r\n[?2004l\rbash: .venv/bin/activate: No such file or directory\r\n]0;franz.srambical@hai-login1:~/vscode-crowd-pilot-chat[?2004h[franz.srambical@hai003.haicore.berlin:~/vscode-crowd-pilot-chat] $ ",,terminal_output
+64,1232732,"TERMINAL",0,0,"c",,terminal_output
+65,1232970,"TERMINAL",0,0,"d",,terminal_output
+66,1233059,"TERMINAL",0,0," ",,terminal_output
+67,1233468,"TERMINAL",0,0,"\r\n[?2004l\r]0;franz.srambical@hai-login1:~[?2004h[franz.srambical@hai003.haicore.berlin:~] $ ",,terminal_output
+68,1234351,"TERMINAL",0,0,"c",,terminal_output
+69,1234661,"TERMINAL",0,0,"d",,terminal_output
+70,1234732,"TERMINAL",0,0," ",,terminal_output
+71,1235002,"TERMINAL",0,0,"c",,terminal_output
+72,1236659,"TERMINAL",0,0,"r",,terminal_output
+73,1236820,"TERMINAL",0,0,"ow",,terminal_output
+74,1237038,"TERMINAL",0,0,"d-",,terminal_output
+75,1237435,"TERMINAL",0,0,"\r\n[?2004l\rbash: cd: crowd-: No such file or directory\r\n]0;franz.srambical@hai-login1:~[?2004h[franz.srambical@hai003.haicore.berlin:~] $ ",,terminal_output
+76,1238253,"TERMINAL",0,0,"cd crowd-",,terminal_output
+77,1238716,"TERMINAL",0,0,"p",,terminal_output
+78,1238987,"TERMINAL",0,0,"ilot/",,terminal_output
+79,1239317,"TERMINAL",0,0,"\r\n[?2004l\r]0;franz.srambical@hai-login1:~/crowd-pilot[?2004h[franz.srambical@hai003.haicore.berlin:~/crowd-pilot] $ ",,terminal_output
+80,1240079,"TERMINAL",0,0,"[H[2J[franz.srambical@hai003.haicore.berlin:~/crowd-pilot] $ ",,terminal_output
+81,1240623,"TERMINAL",0,0,"\r(reverse-i-search)`': [K",,terminal_output
+82,1240845,"TERMINAL",0,0,"s': [7ms[27mource .venv/bin/activate\r[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[1@o': [7mso[27m",,terminal_output
+83,1240942,"TERMINAL",0,0,"[1@u': [7msou[27m",,terminal_output
+84,1240990,"TERMINAL",0,0,"[1@r': [7msour[27m",,terminal_output
+85,1241255,"TERMINAL",0,0,"\r[30@[franz.srambical@hai003.haicore.berlin:~/crowd-pilot] $ sour\r\n[?2004l\r]0;franz.srambical@hai-login1:~/crowd-pilot[?2004h(crowd-pilot) [franz.srambical@hai003.haicore.berlin:~/crowd-pilot] $ ",,terminal_output
+86,1241598,"TERMINAL",0,0,"\r(reverse-i-search)`': [K",,terminal_output
+87,1242293,"TERMINAL",0,0,"s': [7ms[27mource .venv/bin/activate\r[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[Cg': python3 -m [7msg[27mlang.launch_server --model-path qwen/qwen2.5-0.5b-instruct --host 0.0.0.0\r[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C",,terminal_output
+88,1243176,"TERMINAL",0,0,"\r[Ccrowd-pilot) [franz.srambical@hai003.haicore.berlin:~/crowd-pilot] $ python3 -m sglang.launch_server --model-path qwen/qwen2.5-0.5b-instruct --host 0.0.0.0[A[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C\r\n\r[C[C[C[C[C[C[C[C",,terminal_output
+89,1243821,"TERMINAL",0,0,"\r\n[?2004l\r",,terminal_output
+90,1256338,"TERMINAL",0,0,"2025-11-03 15:06:10.917404: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.\r\n",,terminal_output
+91,1256424,"TERMINAL",0,0,"2025-11-03 15:06:10.974069: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.\r\nTo enable the following instructions: AVX2 AVX512F AVX512_VNNI AVX512_BF16 AVX512_FP16 AVX_VNNI AMX_TILE AMX_INT8 AMX_BF16 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.\r\n",,terminal_output
+92,1260111,"TERMINAL",0,0,"2025-11-03 15:06:14.709115: I tensorflow/core/util/port.cc:153] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.\r\n",,terminal_output
+93,1268996,"TERMINAL",0,0,"[2025-11-03 15:06:23] WARNING server_args.py:1104: Attention backend not explicitly specified. Use fa3 backend by default.\r\n[2025-11-03 15:06:23] INFO trace.py:48: opentelemetry package is not installed, tracing disabled\r\n",,terminal_output
+94,1269546,"TERMINAL",0,0,"[2025-11-03 15:06:24] server_args=ServerArgs(model_path='qwen/qwen2.5-0.5b-instruct', tokenizer_path='qwen/qwen2.5-0.5b-instruct', tokenizer_mode='auto', tokenizer_worker_num=1, skip_tokenizer_init=False, load_format='auto', model_loader_extra_config='{}', trust_remote_code=False, context_length=None, is_embedding=False, enable_multimodal=None, revision=None, model_impl='auto', host='0.0.0.0', port=30000, grpc_mode=False, skip_server_warmup=False, warmups=None, nccl_port=None, checkpoint_engine_wait_weights_before_ready=False, dtype='auto', quantization=None, quantization_param_path=None, kv_cache_dtype='auto', enable_fp32_lm_head=False, modelopt_quant=None, modelopt_checkpoint_restore_path=None, modelopt_checkpoint_save_path=None, modelopt_export_path=None, quantize_and_serve=False, mem_fraction_static=0.835, max_running_requests=None, max_queued_requests=None, max_total_tokens=None, chunked_prefill_size=8192, max_prefill_tokens=16384, schedule_policy='fcfs', enable_priority_scheduling=False, abort_on_priority_when_disabled=False, schedule_low_priority_values_first=False, priority_scheduling_preemption_threshold=10, schedule_conservativeness=1.0, page_size=1, hybrid_kvcache_ratio=None, swa_full_tokens_ratio=0.8, disable_hybrid_swa_memory=False, radix_eviction_policy='lru', device='cuda', tp_size=1, pp_size=1, pp_max_micro_batch_size=None, stream_interval=1, stream_output=False, random_seed=916590368, constrained_json_whitespace_pattern=None, constrained_json_disable_any_whitespace=False, watchdog_timeout=300, dist_timeout=None, download_dir=None, base_gpu_id=0, gpu_id_step=1, sleep_on_idle=False, log_level='info', log_level_http=None, log_requests=False, log_requests_level=2, crash_dump_folder=None, show_time_cost=False, enable_metrics=False, enable_metrics_for_all_schedulers=False, tokenizer_metrics_custom_labels_header='x-custom-labels', tokenizer_metrics_allowed_custom_labels=None, bucket_time_to_first_token=None, 
bucket_inter_token_latency=None, bucket_e2e_request_latency=None, collect_tokens_histogram=False, prompt_tokens_buckets=None, generation_tokens_buckets=None, gc_warning_threshold_secs=0.0, decode_log_interval=40, enable_request_time_stats_logging=False, kv_events_config=None, enable_trace=False, oltp_traces_endpoint='localhost:4317', api_key=None, served_model_name='qwen/qwen2.5-0.5b-instruct', weight_version='default', chat_template=None, completion_template=None, file_storage_path='sglang_storage', enable_cache_report=False, reasoning_parser=None, tool_call_parser=None, tool_server=None, sampling_defaults='model', dp_size=1, load_balance_method='round_robin', load_watch_interval=0.1, prefill_round_robin_balance=False, dist_init_addr=None, nnodes=1, node_rank=0, json_model_override_args='{}', preferred_sampling_params=None, enable_lora=None, max_lora_rank=None, lora_target_modules=None, lora_paths=None, max_loaded_loras=None, max_loras_per_batch=8, lora_eviction_policy='lru', lora_backend='triton', max_lora_chunk_size=16, attention_backend='fa3', decode_attention_backend=None, prefill_attention_backend=None, sampling_backend='flashinfer', grammar_backend='xgrammar', mm_attention_backend=None, nsa_prefill_backend='flashmla_sparse', nsa_decode_backend='fa3', speculative_algorithm=None, speculative_draft_model_path=None, speculative_draft_model_revision=None, speculative_draft_load_format=None, speculative_num_steps=None, speculative_eagle_topk=None, speculative_num_draft_tokens=None, speculative_accept_threshold_single=1.0, speculative_accept_threshold_acc=1.0, speculative_token_map=None, speculative_attention_mode='prefill', speculative_ngram_min_match_window_size=1, speculative_ngram_max_match_window_size=12, speculative_ngram_min_bfs_breadth=1, speculative_ngram_max_bfs_breadth=10, speculative_ngram_match_type='BFS', speculative_ngram_branch_length=18, speculative_ngram_capacity=10000000, ep_size=1, moe_a2a_backend='none', moe_runner_backend='auto', 
flashinfer_mxfp4_moe_precision='default', enable_flashinfer_allreduce_fusion=False, deepep_mode='auto', ep_num_redundant_experts=0, ep_dispatch_algorithm='static', init_expert_location='trivial', enable_eplb=False, eplb_algorithm='auto', eplb_rebalance_num_iterations=1000, eplb_rebalance_layers_per_chunk=None, eplb_min_rebalancing_utilization_threshold=1.0, expert_distribution_recorder_mode=None, expert_distribution_recorder_buffer_size=1000, enable_expert_distribution_metrics=False, deepep_config=None, moe_dense_tp_size=None, elastic_ep_backend=None, mooncake_ib_device=None, max_mamba_cache_size=None, mamba_ssm_dtype='float32', mamba_full_memory_ratio=0.9, enable_hierarchical_cache=False, hicache_ratio=2.0, hicache_size=0, hicache_write_policy='write_through', hicache_io_backend='kernel', hicache_mem_layout='layer_first', hicache_storage_backend=None, hicache_storage_prefetch_policy='best_effort', hicache_storage_backend_extra_config=None, enable_lmcache=False, kt_amx_weight_path=None, kt_amx_method='AMXINT4', kt_cpuinfer=None, kt_threadpool_count=2, kt_num_gpu_experts=None, enable_double_sparsity=False, ds_channel_config_path=None, ds_heavy_channel_num=32, ds_heavy_token_num=256, ds_heavy_channel_type='qk', ds_sparse_decode_threshold=4096, cpu_offload_gb=0, offload_group_size=-1, offload_num_in_group=1, offload_prefetch_step=1, offload_mode='cpu', multi_item_scoring_delimiter=None, disable_radix_cache=False, cuda_graph_max_bs=256, cuda_graph_bs=[1, 2, 4, 8, 12, 16, 24, 32, 40, 48, 56, 64, 72, 80, 88, 96, 104, 112, 120, 128, 136, 144, 152, 160, 168, 176, 184, 192, 200, 208, 216, 224, 232, 240, 248, 256], disable_cuda_graph=False, disable_cuda_graph_padding=False, enable_profile_cuda_graph=False, enable_cudagraph_gc=False, enable_nccl_nvls=False, enable_symm_mem=False, disable_flashinfer_cutlass_moe_fp4_allgather=False, enable_tokenizer_batch_encode=False, disable_tokenizer_batch_decode=False, disable_outlines_disk_cache=False, disable_custom_all_reduce=False, 
enable_mscclpp=False, enable_torch_symm_mem=False, disable_overlap_schedule=False, enable_mixed_chunk=False, enable_dp_attention=False, enable_dp_lm_head=False, enable_two_batch_overlap=False, enable_single_batch_overlap=False, tbo_token_distribution_threshold=0.48, enable_torch_compile=False, enable_piecewise_cuda_graph=False, torch_compile_max_bs=32, piecewise_cuda_graph_max_tokens=4096, piecewise_cuda_graph_tokens=[4, 8, 12, 16, 20, 24, 28, 32, 48, 64, 80, 96, 112, 128, 144, 160, 176, 192, 208, 224, 240, 256, 288, 320, 352, 384, 416, 448, 480, 512, 640, 768, 896, 1024, 1152, 1280, 1408, 1536, 1664, 1792, 1920, 2048, 2176, 2304, 2432, 2560, 2688, 2816, 2944, 3072, 3200, 3328, 3456, 3584, 3712, 3840, 3968, 4096], piecewise_cuda_graph_compiler='eager', torchao_config='', enable_nan_detection=False, enable_p2p_check=False, triton_attention_reduce_in_fp32=False, triton_attention_num_kv_splits=8, triton_attention_split_tile_size=None, num_continuous_decode_steps=1, delete_ckpt_after_loading=False, enable_memory_saver=False, enable_weights_cpu_backup=False, allow_auto_truncate=False, enable_custom_logit_processor=False, flashinfer_mla_disable_ragged=False, disable_shared_experts_fusion=False, disable_chunked_prefix_cache=False, disable_fast_image_processor=False, keep_mm_feature_on_device=False, enable_return_hidden_states=False, scheduler_recv_interval=1, numa_node=None, enable_deterministic_inference=False, rl_on_policy_target=None, enable_dynamic_batch_tokenizer=False, dynamic_batch_tokenizer_batch_size=32, dynamic_batch_tokenizer_batch_timeout=0.002, debug_tensor_dump_output_folder=None, debug_tensor_dump_input_file=None, debug_tensor_dump_inject=False, disaggregation_mode='null', disaggregation_transfer_backend='mooncake', disaggregation_bootstrap_port=8998, disaggregation_decode_tp=None, disaggregation_decode_dp=None, disaggregation_prefill_pp=1, disaggregation_ib_device=None, disaggregation_decode_enable_offload_kvcache=False, num_reserved_decode_tokens=512, 
disaggregation_decode_polling_interval=1, custom_weight_loader=[], weight_loader_disable_mmap=False, remote_instance_weight_loader_seed_instance_ip=None, remote_instance_weight_loader_seed_instance_service_port=None, remote_instance_weight_loader_send_weights_group_ports=None, enable_pdmux=False, pdmux_config_path=None, sm_group_num=8)\r\n",,terminal_output
+95,1270581,"TERMINAL",0,0,"[2025-11-03 15:06:25] Using default HuggingFace chat template with detected content format: string\r\n",,terminal_output
+96,1276785,"TERMINAL",0,0,"bash",,terminal_focus
+97,1282178,"TERMINAL",0,0,"squeue",,terminal_command
+98,1282210,"TERMINAL",0,0,"]633;C JOBID USER PARTITION NODES CPUS ST SUBMIT_TIME START_TIME TIME TIME_LIMIT NODELIST(REASON)\r\n 33358 franz.sram interacti 1 20 R 2025-11-03T15:05:33 2025-11-03T15:05:33 1:03 1-00:00:00 hai003\r\n 33317 xiao.liu interacti 1 128 R 2025-11-02T17:43:38 2025-11-02T17:43:38 21:22:58 23:59:00 hai006\r\n 33350 nishant.ku standard 1 8 R 2025-11-03T14:45:30 2025-11-03T14:45:37 20:59 1-00:00:00 hai001\r\n 33328 kalyan.nad standard 1 64 R 2025-11-03T11:56:23 2025-11-03T11:56:38 3:09:58 1-00:00:00 hai002\r\n 33318 xiao.liu standard 1 128 R 2025-11-02T19:29:40 2025-11-02T19:30:38 19:35:58 23:59:00 hai004\r\n]0;franz.srambical@hai-login1:~/vscode-crowd-pilot-chat",,terminal_output
+99,1285870,"TERMINAL",0,0,"[2025-11-03 15:06:40] INFO trace.py:48: opentelemetry package is not installed, tracing disabled\r\n",,terminal_output
+100,1286728,"TERMINAL",0,0,"srun",,terminal_focus
+101,1288033,"TERMINAL",0,0,"[2025-11-03 15:06:42] Init torch distributed begin.\r\n",,terminal_output
+102,1288250,"TERMINAL",0,0,"[Gloo] Rank 0 is connected to 0 peer ranks. Expected number of connected peer ranks is : 0\r\n[Gloo] Rank 0 is connected to 0 peer ranks. Expected number of connected peer ranks is : 0\r\n[Gloo] Rank 0 is connected to 0 peer ranks. Expected number of connected peer ranks is : 0\r\n[Gloo] Rank 0 is connected to 0 peer ranks. Expected number of connected peer ranks is : 0\r\n[2025-11-03 15:06:42] Init torch distributed ends. mem usage=0.00 GB\r\n",,terminal_output
+103,1288299,"TERMINAL",0,0,"[2025-11-03 15:06:42] MOE_RUNNER_BACKEND is not initialized, the backend will be automatically selected\r\n",,terminal_output
+104,1289767,"TERMINAL",0,0,"[2025-11-03 15:06:44] Load weight begin. avail mem=78.68 GB\r\n",,terminal_output
+105,1289871,"TERMINAL",0,0,"[2025-11-03 15:06:44] TensorFlow version 2.20.0 available.\r\n",,terminal_output
+106,1291023,"TERMINAL",0,0,"[2025-11-03 15:06:45] Using model weights format ['*.safetensors']\r\n",,terminal_output
+107,1291548,"TERMINAL",0,0,"[2025-11-03 15:06:46] No model.safetensors.index.json found in remote.\r\n\rLoading safetensors checkpoint shards: 0% Completed | 0/1 [00:00, ?it/s]\r\n",,terminal_output
+108,1291674,"TERMINAL",0,0,"[2025-11-03 15:06:46] INFO trace.py:48: opentelemetry package is not installed, tracing disabled\r\n\rLoading safetensors checkpoint shards: 100% Completed | 1/1 [00:00<00:00, 6.44it/s]\r\n\rLoading safetensors checkpoint shards: 100% Completed | 1/1 [00:00<00:00, 6.43it/s]\r\n\r\n",,terminal_output
+109,1291741,"TERMINAL",0,0,"[2025-11-03 15:06:46] Load weight end. type=Qwen2ForCausalLM, dtype=torch.bfloat16, avail mem=77.61 GB, mem usage=1.07 GB.\r\n[2025-11-03 15:06:46] Using KV cache dtype: torch.bfloat16\r\n[2025-11-03 15:06:46] KV Cache is allocated. #tokens: 5647121, K size: 32.31 GB, V size: 32.31 GB\r\n[2025-11-03 15:06:46] Memory pool end. avail mem=12.31 GB\r\n",,terminal_output
+110,1291868,"TERMINAL",0,0,"[2025-11-03 15:06:46] Capture cuda graph begin. This can take up to several minutes. avail mem=12.21 GB\r\n[2025-11-03 15:06:46] Capture cuda graph bs [1, 2, 4, 8, 12, 16, 24, 32, 40, 48, 56, 64, 72, 80, 88, 96, 104, 112, 120, 128, 136, 144, 152, 160, 168, 176, 184, 192, 200, 208, 216, 224, 232, 240, 248, 256]\r\n",,terminal_output
+111,1292248,"TERMINAL",0,0,"\r 0%| | 0/36 [00:00, ?it/s]\rCapturing batches (bs=256 avail_mem=12.00 GB): 0%| | 0/36 [00:00, ?it/s]",,terminal_output
+112,1292459,"TERMINAL",0,0,"\rCapturing batches (bs=256 avail_mem=12.00 GB): 3%|██                                                                | 1/36 [00:00<00:06, 5.14it/s]\rCapturing batches (bs=248 avail_mem=11.84 GB): 3%|██                                                                | 1/36 [00:00<00:06, 5.14it/s]\rCapturing batches (bs=240 avail_mem=11.83 GB): 3%|██                                                                | 1/36 [00:00<00:06, 5.14it/s]",,terminal_output
+113,1292695,"TERMINAL",0,0,"\rCapturing batches (bs=232 avail_mem=11.83 GB): 3%|██                                                                | 1/36 [00:00<00:06, 5.14it/s]\rCapturing batches (bs=232 avail_mem=11.83 GB): 11%|████████                                                          | 4/36 [00:00<00:02, 14.68it/s]\rCapturing batches (bs=224 avail_mem=11.82 GB): 11%|████████                                                          | 4/36 [00:00<00:02, 14.68it/s]\rCapturing batches (bs=216 avail_mem=11.81 GB): 11%|████████                                                          | 4/36 [00:00<00:02, 14.68it/s]\rCapturing batches (bs=208 avail_mem=11.81 GB): 11%|████████                                                          | 4/36 [00:00<00:02, 14.68it/s]\rCapturing batches (bs=208 avail_mem=11.81 GB): 19%|█████████████                                                     | 7/36 [00:00<00:01, 19.01it/s]\rCapturing batches (bs=200 avail_mem=11.81 GB): 19%|█████████████                                                     | 7/36 [00:00<00:01, 19.01it/s]\rCapturing batches (bs=192 avail_mem=11.80 GB): 19%|█████████████                                                     | 7/36 [00:00<00:01, 19.01it/s]",,terminal_output
+114,1292808,"TERMINAL",0,0,"\rCapturing batches (bs=184 avail_mem=11.80 GB): 19%|█████████████                                                     | 7/36 [00:00<00:01, 19.01it/s]\rCapturing batches (bs=184 avail_mem=11.80 GB): 28%|██████████████████                                                | 10/36 [00:00<00:01, 22.01it/s]\rCapturing batches (bs=176 avail_mem=11.79 GB): 28%|██████████████████                                                | 10/36 [00:00<00:01, 22.01it/s]\rCapturing batches (bs=168 avail_mem=11.79 GB): 28%|██████████████████                                                | 10/36 [00:00<00:01, 22.01it/s]",,terminal_output
+115,1293016,"TERMINAL",0,0,"\rCapturing batches (bs=160 avail_mem=11.78 GB): 28%|██████████████████                                                | 10/36 [00:00<00:01, 22.01it/s]\rCapturing batches (bs=160 avail_mem=11.78 GB): 36%|███████████████████████                                           | 13/36 [00:00<00:01, 22.90it/s]\rCapturing batches (bs=152 avail_mem=11.78 GB): 36%|███████████████████████                                           | 13/36 [00:00<00:01, 22.90it/s]\rCapturing batches (bs=144 avail_mem=11.77 GB): 36%|███████████████████████                                           | 13/36 [00:00<00:01, 22.90it/s]\rCapturing batches (bs=136 avail_mem=11.77 GB): 36%|███████████████████████                                           | 13/36 [00:00<00:01, 22.90it/s]\rCapturing batches (bs=136 avail_mem=11.77 GB): 44%|█████████████████████████████                                     | 16/36 [00:00<00:00, 24.27it/s]\rCapturing batches (bs=128 avail_mem=11.76 GB): 44%|█████████████████████████████                                     | 16/36 [00:00<00:00, 24.27it/s]",,terminal_output
+116,1293092,"TERMINAL",0,0,"\rCapturing batches (bs=120 avail_mem=11.76 GB): 44%|█████████████████████████████                                     | 16/36 [00:00<00:00, 24.27it/s]\rCapturing batches (bs=112 avail_mem=11.75 GB): 44%|█████████████████████████████                                     | 16/36 [00:00<00:00, 24.27it/s]",,terminal_output
+117,1293299,"TERMINAL",0,0,"\rCapturing batches (bs=112 avail_mem=11.75 GB): 53%|██████████████████████████████████                                | 19/36 [00:00<00:00, 24.12it/s]\rCapturing batches (bs=104 avail_mem=11.75 GB): 53%|██████████████████████████████████                                | 19/36 [00:00<00:00, 24.12it/s]\rCapturing batches (bs=96 avail_mem=11.75 GB): 53%|███████████████████████████████████                               | 19/36 [00:00<00:00, 24.12it/s]\rCapturing batches (bs=88 avail_mem=11.74 GB): 53%|███████████████████████████████████                               | 19/36 [00:00<00:00, 24.12it/s]\rCapturing batches (bs=88 avail_mem=11.74 GB): 61%|████████████████████████████████████████                           | 22/36 [00:01<00:00, 24.06it/s]\rCapturing batches (bs=80 avail_mem=11.73 GB): 61%|████████████████████████████████████████                           | 22/36 [00:01<00:00, 24.06it/s]\rCapturing batches (bs=72 avail_mem=11.73 GB): 61%|████████████████████████████████████████                           | 22/36 [00:01<00:00, 24.06it/s]",,terminal_output
+118,1293422,"TERMINAL",0,0,"\rCapturing batches (bs=64 avail_mem=11.72 GB): 61%|████████████████████████████████████████                           | 22/36 [00:01<00:00, 24.06it/s]\rCapturing batches (bs=64 avail_mem=11.72 GB): 69%|██████████████████████████████████████████████                     | 25/36 [00:01<00:00, 24.40it/s]\rCapturing batches (bs=56 avail_mem=11.72 GB): 69%|██████████████████████████████████████████████                     | 25/36 [00:01<00:00, 24.40it/s]\rCapturing batches (bs=48 avail_mem=11.72 GB): 69%|██████████████████████████████████████████████                     | 25/36 [00:01<00:00, 24.40it/s]",,terminal_output
+119,1293491,"TERMINAL",0,0,"\rCapturing batches (bs=40 avail_mem=11.71 GB): 69%|██████████████████████████████████████████████                     | 25/36 [00:01<00:00, 24.40it/s]\rCapturing batches (bs=40 avail_mem=11.71 GB): 78%|███████████████████████████████████████████████████                | 28/36 [00:01<00:00, 24.81it/s]\rCapturing batches (bs=32 avail_mem=11.71 GB): 78%|███████████████████████████████████████████████████                | 28/36 [00:01<00:00, 24.81it/s]",,terminal_output
+120,1293581,"TERMINAL",0,0,"\rCapturing batches (bs=24 avail_mem=11.70 GB): 78%|███████████████████████████████████████████████████                | 28/36 [00:01<00:00, 24.81it/s]\rCapturing batches (bs=16 avail_mem=11.70 GB): 78%|███████████████████████████████████████████████████                | 28/36 [00:01<00:00, 24.81it/s]",,terminal_output
+121,1293783,"TERMINAL",0,0,"\rCapturing batches (bs=16 avail_mem=11.70 GB): 86%|████████████████████████████████████████████████████████           | 31/36 [00:01<00:00, 23.67it/s]\rCapturing batches (bs=12 avail_mem=11.69 GB): 86%|████████████████████████████████████████████████████████           | 31/36 [00:01<00:00, 23.67it/s]\rCapturing batches (bs=8 avail_mem=11.69 GB): 86%|█████████████████████████████████████████████████████████          | 31/36 [00:01<00:00, 23.67it/s]\rCapturing batches (bs=4 avail_mem=11.68 GB): 86%|█████████████████████████████████████████████████████████          | 31/36 [00:01<00:00, 23.67it/s]\rCapturing batches (bs=2 avail_mem=11.68 GB): 86%|█████████████████████████████████████████████████████████          | 31/36 [00:01<00:00, 23.67it/s]\rCapturing batches (bs=2 avail_mem=11.68 GB): 97%|█████████████████████████████████████████████████████████████████  | 35/36 [00:01<00:00, 26.64it/s]\rCapturing batches (bs=1 avail_mem=11.67 GB): 97%|█████████████████████████████████████████████████████████████████  | 35/36 [00:01<00:00, 26.64it/s]\rCapturing batches (bs=1 avail_mem=11.67 GB): 100%|██████████████████████████████████████████████████████████████████| 36/36 [00:01<00:00, 23.40it/s]\r\n",,terminal_output
+122,1294160,"TERMINAL",0,0,"[2025-11-03 15:06:48] Capture cuda graph end. Time elapsed: 2.31 s. mem usage=0.54 GB. avail mem=11.67 GB.\r\n",,terminal_output
+123,1294868,"TERMINAL",0,0,"[2025-11-03 15:06:49] max_total_num_tokens=5647121, chunked_prefill_size=8192, max_prefill_tokens=16384, max_running_requests=4096, context_len=32768, available_gpu_mem=11.67 GB\r\n",,terminal_output
+124,1295381,"TERMINAL",0,0,"[2025-11-03 15:06:49] [32mINFO[0m: Started server process [[36m1902320[0m]\r\n[2025-11-03 15:06:49] [32mINFO[0m: Waiting for application startup.\r\n[2025-11-03 15:06:49] Using default chat sampling params from model generation config: {'repetition_penalty': 1.1, 'temperature': 0.7, 'top_k': 20, 'top_p': 0.8}\r\n[2025-11-03 15:06:49] Using default chat sampling params from model generation config: {'repetition_penalty': 1.1, 'temperature': 0.7, 'top_k': 20, 'top_p': 0.8}\r\n[2025-11-03 15:06:49] [32mINFO[0m: Application startup complete.\r\n[2025-11-03 15:06:49] [32mINFO[0m: Uvicorn running on [1mhttp://0.0.0.0:30000[0m (Press CTRL+C to quit)\r\n",,terminal_output
+125,1296387,"TERMINAL",0,0,"[2025-11-03 15:06:50] [32mINFO[0m: 127.0.0.1:54704 - ""[1mGET /get_model_info HTTP/1.1[0m"" [32m200 OK[0m\r\n[2025-11-03 15:06:50] Prefill batch, #new-seq: 1, #new-token: 6, #cached-token: 0, token usage: 0.00, #running-req: 0, #queue-req: 0, \r\n",,terminal_output
+126,1297988,"TERMINAL",0,0,"[2025-11-03 15:06:52] [32mINFO[0m: 127.0.0.1:54718 - ""[1mPOST /generate HTTP/1.1[0m"" [32m200 OK[0m\r\n[2025-11-03 15:06:52] The server is fired up and ready to roll!\r\n",,terminal_output
+127,2687400,"TERMINAL",0,0,"[2025-11-03 15:30:01] SIGTERM received. signum=None frame=None. Draining requests and shutting down...\r\n",,terminal_output
+128,2691625,"TERMINAL",0,0,"[2025-11-03 15:30:05] Gracefully exiting... Remaining number of requests 0. Remaining requests remaining_rids=[].\r\n",,terminal_output
+129,2691919,"TERMINAL",0,0,"Killed\r\n]0;franz.srambical@hai-login1:~/crowd-pilot[?2004h(crowd-pilot) [franz.srambical@hai003.haicore.berlin:~/crowd-pilot] $ ",,terminal_output
diff --git a/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-4c510e1e-c7fc-4bd2-bf45-93d5487c2a0d1760868436827-2025_10_19-12.07.22.935/source.csv b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-4c510e1e-c7fc-4bd2-bf45-93d5487c2a0d1760868436827-2025_10_19-12.07.22.935/source.csv
new file mode 100644
index 0000000000000000000000000000000000000000..f4385c3ae23068b71945d40fd934a28f79ba6517
--- /dev/null
+++ b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-4c510e1e-c7fc-4bd2-bf45-93d5487c2a0d1760868436827-2025_10_19-12.07.22.935/source.csv
@@ -0,0 +1,89 @@
+Sequence,Time,File,RangeOffset,RangeLength,Text,Language,Type
+1,3,"slurm/dev/franz/berlin/coinrun/sample/maskgit/sample_mila_submission_case_study_vanilla.sh",0,0,"#!/usr/bin/env bash\n\n#SBATCH --nodes=1\n#SBATCH --ntasks-per-node=1\n#SBATCH --time=24:00:00\n#SBATCH --cpus-per-task=8\n#SBATCH --gres=gpu:1\n#SBATCH --output=/fast/project/HFMI_SynergyUnit/jafar_ws/logs/franz/coinrun/dynamics_sample/%x_%j.log\n#SBATCH --error=/fast/project/HFMI_SynergyUnit/jafar_ws/logs/franz/coinrun/dynamics_sample/%x_%j.log\n#SBATCH --job-name=coinrun_sample_maskgit_mila_submission_case_study_vanilla\n\n# Activate virtual environment\nsource .venv/bin/activate\n\narray_records_dir=""/fast/project/HFMI_SynergyUnit/jafar_ws/data/coinrun/array_records_10M_npy_arr_rec/array_record/test""\nCHECKPOINT_PATH=""/fast/project/HFMI_SynergyUnit/jafar_ws/checkpoints/coinrun/dynamics/dynamics_case_study_dataset_10M_30031""\n\ncurrent_branch=$(git rev-parse --abbrev-ref HEAD)\nif [ ""$current_branch"" != ""main"" ]; then\n echo ""This script must be run from the main branch. Current branch is $current_branch. Exiting.""\n exit 1\nfi\n\necho ""Sampling from checkpoint: $CHECKPOINT_PATH""\n\nsrun python jasmine/sample.py \\n --seed=1 \\n --maskgit_steps=1 \\n --tokenizer_ffn_dim=512 \\n --tokenizer_num_blocks=8 \\n --dyna_ffn_dim=512 \\n --dyna_num_blocks=12 \\n --output_dir=gifs/dynamics_case_study_dataset_10M_vanilla \\n --checkpoint $CHECKPOINT_PATH \\n --data_dir=$array_records_dir \\n --seq_len=16 \\n --batch_size=32 \\n --patch_size=4 \\n --start_frame=4 \\n --image_height=64 \\n --image_width=64 \\n --dyna_type=maskgit\n",shellscript,tab
+2,228,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"12:07:22 PM [info] Activating crowd-code\n12:07:22 PM [info] Recording started\n12:07:22 PM [info] Initializing git provider using file system watchers...\n12:07:23 PM [info] Git repository found\n12:07:23 PM [info] Git provider initialized successfully\n12:07:23 PM [info] Initial git state: [object Object]\n",Log,tab
+3,5272,"TERMINAL",0,0,"",,terminal_command
+4,8400,"slurm/dev/franz/berlin/coinrun/sample/maskgit/sample_mila_submission_case_study_vanilla.sh",0,0,"",shellscript,tab
+5,8988,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"",Log,tab
+6,10169,"slurm/dev/franz/berlin/coinrun/sample/maskgit/sample_mila_submission_case_study_vanilla.sh",0,0,"",shellscript,tab
+7,10171,"TERMINAL",0,0,"",,terminal_focus
+8,10727,"TERMINAL",0,0,"source /home/franz.srambical/jafar/.venv/bin/activate",,terminal_command
+9,10730,"TERMINAL",0,0,"]633;C]0;franz.srambical@hai-login1:~/jafar",,terminal_output
+10,12005,"TERMINAL",0,0,"",,terminal_command
+11,12327,"TERMINAL",0,0,"squeue",,terminal_command
+12,12349,"TERMINAL",0,0,"]633;C JOBID USER PARTITION NODES CPUS ST SUBMIT_TIME START_TIME TIME TIME_LIMIT NODELIST(REASON)\r\n 32259 xiao.liu interacti 1 128 R 2025-10-19T11:19:21 2025-10-19T11:19:21 48:14 23:59:00 hai005\r\n 32250 xiao.liu interacti 1 128 R 2025-10-19T01:29:31 2025-10-19T01:29:58 10:37:37 23:59:00 hai002\r\n 32248 xiao.liu interacti 1 128 R 2025-10-19T01:23:17 2025-10-19T01:23:28 10:44:07 23:59:00 hai007\r\n 32092 alfred.ngu standard 1 16 R 2025-10-19T11:53:27 2025-10-19T11:55:48 11:47 1-00:00:00 hai002\r\n 32216 alfred.ngu standard 1 16 R 2025-10-19T11:18:55 2025-10-19T11:21:10 46:25 1:00:00 hai004\r\n 32258 nishant.ku standard 3 96 R 2025-10-19T05:18:10 2025-10-19T05:18:40 6:48:55 1-00:00:00 hai[001,003,006]\r\n 32251 xiao.liu standard 1 128 R 2025-10-19T02:19:39 2025-10-19T02:19:56 9:47:39 23:59:00 hai008\r\n 32095 alfred.ngu standard 1 16 R 2025-10-18T12:11:12 2025-10-18T12:13:27 23:54:08 1-00:00:00 hai007\r\n]0;franz.srambical@hai-login1:~/jafar",,terminal_output
+13,120318,"TERMINAL",0,0,"",,terminal_command
+14,2210013,"slurm/dev/franz/berlin/coinrun/sample/maskgit/sample_mila_submission_case_study_vanilla.sh",340,0,"",shellscript,selection_mouse
+15,2210018,"slurm/dev/franz/berlin/coinrun/sample/maskgit/sample_mila_submission_case_study_vanilla.sh",339,0,"",shellscript,selection_command
+16,2548740,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",0,0,"#!/usr/bin/env bash\n\n#SBATCH --nodes=1\n#SBATCH --ntasks-per-node=1\n#SBATCH --gres=gpu:1\n#SBATCH --time=3-00:00:00\n#SBATCH --cpus-per-task=8\n#SBATCH --partition=gpu_p\n#SBATCH --reservation=haicu_stefan\n#SBATCH --qos=gpu_long\n#SBATCH --output=/lustre/groups/haicu/workspace/alfred.nguyen/jafar_worksapce/logs/coinrun/dynamics/%x_%j.log\n#SBATCH --error=/lustre/groups/haicu/workspace/alfred.nguyen/jafar_worksapce/logs/coinrun/dynamics/%x_%j.log\n#SBATCH --job-name=train_dynamics_coinrun_og_reproduction\n\n# Log the sbatch script\ncat $0\n\nsource .venv/bin/activate\n\njob_name=$SLURM_JOB_NAME\nslurm_job_id=$SLURM_JOB_ID\n\ntags=""coinrun_og dynanmics 10m_dataset helmholtz_reproduction dyn_repro""\n\nnpy_records_dir=""/lustre/groups/haicu/workspace/alfred.nguyen/jafar_worksapce/data/coinrun_episodes/coinrun_episodes""\n\ntokenizer_ckpt_dir=""/lustre/groups/haicu/workspace/alfred.nguyen/jafar_worksapce/checkpoints/coinrun/tokenizer/train_tokenizer_coinrun_og_reproduction_28246778/tokenizer_1756303195_110000""\nlam_ckpt_dir=""/lustre/groups/haicu/workspace/alfred.nguyen/jafar_worksapce/checkpoints/coinrun/lam/train_lam_coinrun_og_reproduction_28246647/lam_1756303037_200000""\nCHECKPOINT_DIR=""/lustre/groups/haicu/workspace/alfred.nguyen/jafar_worksapce/checkpoints/coinrun/dynamics/${job_name}/${slurm_job_id}""\nmkdir -p $CHECKPOINT_DIR\n\nenv | grep SLURM\n\nsrun python jasmine/train_dynamics.py \\n --ckpt_dir $CHECKPOINT_DIR \\n --tokenizer_checkpoint=""${tokenizer_ckpt_dir}"" \\n --lam_checkpoint=""${lam_ckpt_dir}"" \\n --log_image_interval=1000 \\n --log \\n --name=""${job_name}_${slurm_job_id}"" \\n --tags ${tags} \\n --entity instant-uv \\n --project jafar \\n --data_dir $npy_records_dir\n",shellscript,tab
+17,2549769,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",20,0,"",shellscript,selection_command
+18,2550018,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",21,0,"",shellscript,selection_command
+19,2550042,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",39,0,"",shellscript,selection_command
+20,2550079,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",67,0,"",shellscript,selection_command
+21,2550108,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",88,0,"",shellscript,selection_command
+22,2550145,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",114,0,"",shellscript,selection_command
+23,2550176,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",140,0,"",shellscript,selection_command
+24,2550208,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",166,0,"",shellscript,selection_command
+25,2550241,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",201,0,"",shellscript,selection_command
+26,2550279,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",224,0,"",shellscript,selection_command
+27,2556892,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",334,0,"",shellscript,selection_command
+28,2557154,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",443,0,"",shellscript,selection_command
+29,2557181,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",501,0,"",shellscript,selection_command
+30,2557205,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",502,0,"",shellscript,selection_command
+31,2557234,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",526,0,"",shellscript,selection_command
+32,2557272,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",533,0,"",shellscript,selection_command
+33,2557301,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",534,0,"",shellscript,selection_command
+34,2557422,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",560,0,"",shellscript,selection_command
+35,2557565,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",561,0,"",shellscript,selection_command
+36,2557712,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",586,0,"",shellscript,selection_command
+37,2557846,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",613,0,"",shellscript,selection_command
+38,2557995,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",614,0,"",shellscript,selection_command
+39,2558131,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",687,0,"",shellscript,selection_command
+40,2558315,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",688,0,"",shellscript,selection_command
+41,2558590,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",703,0,"",shellscript,selection_command
+42,2558743,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",706,0,"",shellscript,selection_command
+43,2559582,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,0,"",shellscript,selection_command
+44,2559702,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,1,"/",shellscript,selection_command
+45,2559890,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,7,"/lustre",shellscript,selection_command
+46,2560136,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,8,"/lustre/",shellscript,selection_command
+47,2560170,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,14,"/lustre/groups",shellscript,selection_command
+48,2560198,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,15,"/lustre/groups/",shellscript,selection_command
+49,2560233,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,20,"/lustre/groups/haicu",shellscript,selection_command
+50,2560268,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,21,"/lustre/groups/haicu/",shellscript,selection_command
+51,2560298,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,30,"/lustre/groups/haicu/workspace",shellscript,selection_command
+52,2560331,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,31,"/lustre/groups/haicu/workspace/",shellscript,selection_command
+53,2560365,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,37,"/lustre/groups/haicu/workspace/alfred",shellscript,selection_command
+54,2560397,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,38,"/lustre/groups/haicu/workspace/alfred.",shellscript,selection_command
+55,2560431,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,44,"/lustre/groups/haicu/workspace/alfred.nguyen",shellscript,selection_command
+56,2560464,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,45,"/lustre/groups/haicu/workspace/alfred.nguyen/",shellscript,selection_command
+57,2560497,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,60,"/lustre/groups/haicu/workspace/alfred.nguyen/jafar_worksapce",shellscript,selection_command
+58,2560531,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,61,"/lustre/groups/haicu/workspace/alfred.nguyen/jafar_worksapce/",shellscript,selection_command
+59,2560565,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,65,"/lustre/groups/haicu/workspace/alfred.nguyen/jafar_worksapce/data",shellscript,selection_command
+60,2560600,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,66,"/lustre/groups/haicu/workspace/alfred.nguyen/jafar_worksapce/data/",shellscript,selection_command
+61,2560632,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,82,"/lustre/groups/haicu/workspace/alfred.nguyen/jafar_worksapce/data/coinrun_episodes",shellscript,selection_command
+62,2560831,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,83,"/lustre/groups/haicu/workspace/alfred.nguyen/jafar_worksapce/data/coinrun_episodes/",shellscript,selection_command
+63,2561016,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,99,"/lustre/groups/haicu/workspace/alfred.nguyen/jafar_worksapce/data/coinrun_episodes/coinrun_episodes",shellscript,selection_command
+64,2568443,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",803,0,"",shellscript,selection_command
+65,2645468,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",803,1,"s",shellscript,selection_command
+66,2645656,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",788,16,"coinrun_episodes",shellscript,selection_command
+67,2645911,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",787,17,"/coinrun_episodes",shellscript,selection_command
+68,2645928,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",771,33,"coinrun_episodes/coinrun_episodes",shellscript,selection_command
+69,2645958,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",770,34,"/coinrun_episodes/coinrun_episodes",shellscript,selection_command
+70,2645991,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",766,38,"data/coinrun_episodes/coinrun_episodes",shellscript,selection_command
+71,2646025,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",765,39,"/data/coinrun_episodes/coinrun_episodes",shellscript,selection_command
+72,2646058,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",750,54,"jafar_worksapce/data/coinrun_episodes/coinrun_episodes",shellscript,selection_command
+73,2646091,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",749,55,"/jafar_worksapce/data/coinrun_episodes/coinrun_episodes",shellscript,selection_command
+74,2646124,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",743,61,"nguyen/jafar_worksapce/data/coinrun_episodes/coinrun_episodes",shellscript,selection_command
+75,2646159,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",742,62,".nguyen/jafar_worksapce/data/coinrun_episodes/coinrun_episodes",shellscript,selection_command
+76,2646318,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",736,68,"alfred.nguyen/jafar_worksapce/data/coinrun_episodes/coinrun_episodes",shellscript,selection_command
+77,2646518,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",735,69,"/alfred.nguyen/jafar_worksapce/data/coinrun_episodes/coinrun_episodes",shellscript,selection_command
+78,2646714,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",726,78,"workspace/alfred.nguyen/jafar_worksapce/data/coinrun_episodes/coinrun_episodes",shellscript,selection_command
+79,2646901,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",725,79,"/workspace/alfred.nguyen/jafar_worksapce/data/coinrun_episodes/coinrun_episodes",shellscript,selection_command
+80,2647061,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",720,84,"haicu/workspace/alfred.nguyen/jafar_worksapce/data/coinrun_episodes/coinrun_episodes",shellscript,selection_command
+81,2647266,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",719,85,"/haicu/workspace/alfred.nguyen/jafar_worksapce/data/coinrun_episodes/coinrun_episodes",shellscript,selection_command
+82,2647440,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",713,91,"groups/haicu/workspace/alfred.nguyen/jafar_worksapce/data/coinrun_episodes/coinrun_episodes",shellscript,selection_command
+83,2647605,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",712,92,"/groups/haicu/workspace/alfred.nguyen/jafar_worksapce/data/coinrun_episodes/coinrun_episodes",shellscript,selection_command
+84,2647792,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",706,98,"lustre/groups/haicu/workspace/alfred.nguyen/jafar_worksapce/data/coinrun_episodes/coinrun_episodes",shellscript,selection_command
+85,2648105,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",703,101,"=""/lustre/groups/haicu/workspace/alfred.nguyen/jafar_worksapce/data/coinrun_episodes/coinrun_episodes",shellscript,selection_command
+86,2648596,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",704,100,"""/lustre/groups/haicu/workspace/alfred.nguyen/jafar_worksapce/data/coinrun_episodes/coinrun_episodes",shellscript,selection_command
+87,2649824,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,99,"/lustre/groups/haicu/workspace/alfred.nguyen/jafar_worksapce/data/coinrun_episodes/coinrun_episodes",shellscript,selection_command
+88,2650235,"slurm/jobs/alfred/helmholtz_cluster/jafar_og_reproduction/og_coinrun_dynamics_reproduction.sbatch",705,0,"",shellscript,selection_command
diff --git a/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-7a355fca-ba74-43c3-b4d3-53b2faf770f31764425195709-2025_11_29-15.06.38.713/source.csv b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-7a355fca-ba74-43c3-b4d3-53b2faf770f31764425195709-2025_11_29-15.06.38.713/source.csv
new file mode 100644
index 0000000000000000000000000000000000000000..41fb78b426e06cd501333350907e4057cf0452ef
--- /dev/null
+++ b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-7a355fca-ba74-43c3-b4d3-53b2faf770f31764425195709-2025_11_29-15.06.38.713/source.csv
@@ -0,0 +1,6 @@
+Sequence,Time,File,RangeOffset,RangeLength,Text,Language,Type
+2,118,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"3:06:38 PM [info] Activating crowd-code\n3:06:38 PM [info] Recording started\n3:06:38 PM [info] Initializing git provider using file system watchers...\n3:06:38 PM [info] No workspace folder found\n",Log,tab
+3,1054,"Untitled-1",0,0,"",plaintext,tab
+4,2795,"TERMINAL",0,0,"Test",,terminal_focus
+5,2799,"Untitled-1",0,0,"// crowd-pilot mock insert\n",plaintext,content
+6,3356,"Untitled-1",0,0,"",plaintext,selection_command
diff --git a/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-9110265e-4c9d-44b0-960f-352ca9a53ddb1764446788041-2025_11_29-21.06.32.50/source.csv b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-9110265e-4c9d-44b0-960f-352ca9a53ddb1764446788041-2025_11_29-21.06.32.50/source.csv
new file mode 100644
index 0000000000000000000000000000000000000000..945296f5145e091da3755d4d1df6f4adf8d2af0c
--- /dev/null
+++ b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-9110265e-4c9d-44b0-960f-352ca9a53ddb1764446788041-2025_11_29-21.06.32.50/source.csv
@@ -0,0 +1,62 @@
+Sequence,Time,File,RangeOffset,RangeLength,Text,Language,Type
+1,10,"Untitled-1",0,0,"",plaintext,tab
+2,110,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"9:06:32 PM [info] Activating crowd-code\n9:06:32 PM [info] Recording started\n9:06:32 PM [info] Initializing git provider using file system watchers...\n9:06:32 PM [info] No workspace folder found\n",Log,tab
+3,665,"Untitled-1",0,0,"",plaintext,tab
+4,1768,"TERMINAL",0,0,"Test",,terminal_focus
+5,1774,"Untitled-1",0,0,"/* crowd-pilot: insert start */\nline A\nline B\n/* crowd-pilot: insert end */\n",plaintext,content
+6,2179,"Untitled-1",46,0,"",plaintext,selection_command
+7,2263,"Untitled-1",39,0,"",plaintext,selection_command
+8,2428,"Untitled-1",32,0,"",plaintext,selection_command
+9,2553,"Untitled-1",0,0,"",plaintext,selection_command
+10,3284,"Untitled-1",0,0,"\n",plaintext,content
+11,4720,"Untitled-1",0,0,"\n",plaintext,content
+12,5019,"Untitled-1",0,0,"",plaintext,selection_command
+13,6118,"Untitled-1",34,13,"/* crowd-pilot: replacement */\nREPLACED LINE 1\nREPLACED LINE 2",plaintext,content
+14,6881,"Untitled-1",97,30,"",plaintext,content
+15,7934,"Untitled-1",1,0,"",plaintext,selection_command
+16,8044,"Untitled-1",2,0,"",plaintext,selection_command
+17,15031,"Untitled-1",34,0,"",plaintext,selection_command
+18,15037,"Untitled-1",2,0,"",plaintext,selection_command
+19,17502,"TERMINAL",0,0,"echo ""Hello World""",,terminal_command
+20,17502,"TERMINAL",0,0,"]633;CHello World\r\n[1m[7m%[27m[1m[0m \r \r",,terminal_output
+21,20537,"Untitled-1",65,0,"/* crowd-pilot: insert start */\nline A\nline B\n/* crowd-pilot: insert end */\n",plaintext,content
+22,21357,"Untitled-1",97,13,"/* crowd-pilot: replacement */\nREPLACED LINE 1\nREPLACED LINE 2",plaintext,content
+23,21972,"Untitled-1",160,61,"",plaintext,content
+24,28552,"Untitled-1",34,0,"",plaintext,selection_command
+25,28775,"Untitled-1",2,0,"",plaintext,selection_command
+26,29737,"Untitled-1",1,0,"",plaintext,selection_command
+27,30568,"Untitled-1",2,0,"",plaintext,selection_command
+28,31936,"Untitled-1",1,0,"",plaintext,selection_command
+29,32267,"Untitled-1",2,0,"",plaintext,selection_command
+30,34125,"TERMINAL",0,0,"echo ""Hello World""",,terminal_command
+31,34125,"TERMINAL",0,0,"]633;CHello World\r\n[1m[7m%[27m[1m[0m \r \r",,terminal_output
+32,39397,"Untitled-1",65,0,"/* crowd-pilot: insert start */\nline A\nline B\n/* crowd-pilot: insert end */\n",plaintext,content
+33,40783,"Untitled-1",97,13,"/* crowd-pilot: replacement */\nREPLACED LINE 1\nREPLACED LINE 2",css,content
+34,43169,"Untitled-1",160,92,"",css,content
+35,47090,"TERMINAL",0,0,"echo ""Hello World""",,terminal_command
+36,47091,"TERMINAL",0,0,"]633;CHello World\r\n[1m[7m%[27m[1m[0m \r \r",,terminal_output
+37,155465,"Untitled-1",65,0,"/* crowd-pilot: insert start */\nline A\nline B\n/* crowd-pilot: insert end */\n",css,content
+38,155626,"Untitled-1",97,13,"/* crowd-pilot: replacement */\nREPLACED LINE 1\nREPLACED LINE 2",css,content
+39,155804,"Untitled-1",160,92,"",css,content
+40,157600,"TERMINAL",0,0,"echo ""Hello World""",,terminal_command
+41,157601,"TERMINAL",0,0,"]633;CHello World\r\n[1m[7m%[27m[1m[0m \r \r",,terminal_output
+42,159499,"Untitled-1",0,0,"",css,selection_command
+43,160158,"Untitled-1",2,0,"/* crowd-pilot: insert start */\nline A\nline B\n/* crowd-pilot: insert end */\n",css,content
+44,160419,"Untitled-1",34,13,"/* crowd-pilot: replacement */\nREPLACED LINE 1\nREPLACED LINE 2",css,content
+45,160575,"Untitled-1",97,92,"",css,content
+46,164386,"TERMINAL",0,0,"echo ""Hello World""",,terminal_command
+47,164387,"TERMINAL",0,0,"]633;CHello World\r\n[1m[7m%[27m[1m[0m \r \r",,terminal_output
+48,166482,"Untitled-1",1,0,"",css,selection_command
+49,166564,"Untitled-1",0,0,"",css,selection_command
+50,167139,"Untitled-1",2,0,"/* crowd-pilot: insert start */\nline A\nline B\n/* crowd-pilot: insert end */\n",css,content
+51,167690,"Untitled-1",34,13,"/* crowd-pilot: replacement */\nREPLACED LINE 1\nREPLACED LINE 2",css,content
+52,168283,"Untitled-1",97,92,"",css,content
+53,169360,"Untitled-1",1,0,"",css,selection_command
+54,169828,"Untitled-1",2,0,"",css,selection_command
+55,171882,"Untitled-1",1,0,"",css,selection_command
+56,172264,"Untitled-1",0,0,"",css,selection_command
+57,173949,"TERMINAL",0,0,"echo ""Hello World""",,terminal_command
+58,173950,"TERMINAL",0,0,"]633;CHello World\r\n[1m[7m%[27m[1m[0m \r \r",,terminal_output
+59,295971,"Untitled-1",2,0,"/* crowd-pilot: insert start */\nline A\nline B\n/* crowd-pilot: insert end */\n",css,content
+60,296414,"Untitled-1",34,13,"/* crowd-pilot: replacement */\nREPLACED LINE 1\nREPLACED LINE 2",css,content
+61,297016,"Untitled-1",97,92,"",css,content
diff --git a/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-b4b5b190-669e-492e-ba87-de2410c26e171757275535827-2025_09_07-22.05.37.632/source.csv b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-b4b5b190-669e-492e-ba87-de2410c26e171757275535827-2025_09_07-22.05.37.632/source.csv
new file mode 100644
index 0000000000000000000000000000000000000000..ba6aff4d51dde66439e232e4fdba4a33385253b6
--- /dev/null
+++ b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-b4b5b190-669e-492e-ba87-de2410c26e171757275535827-2025_09_07-22.05.37.632/source.csv
@@ -0,0 +1,727 @@
+Sequence,Time,File,RangeOffset,RangeLength,Text,Language,Type
+2,113,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"10:05:37 PM [info] Activating crowd-code\n10:05:37 PM [info] Recording started\n10:05:37 PM [info] Initializing git provider using file system watchers...\n10:05:37 PM [info] Git repository found\n10:05:37 PM [info] Git provider initialized successfully\n",Log,tab
+3,281,"extension-output-pdoom-org.crowd-code-#1-crowd-code",250,0,"10:05:37 PM [info] Initial git state: [object Object]\n",Log,content
+4,18764,"keyboards/annepro2/keymaps/default/keymap.c",0,0," /* Copyright 2021 OpenAnnePro community\n *\n * This program is free software: you can redistribute it and/or modify\n * it under the terms of the GNU General Public License as published by\n * the Free Software Foundation, either version 2 of the License, or\n * (at your option) any later version.\n *\n * This program is distributed in the hope that it will be useful,\n * but WITHOUT ANY WARRANTY; without even the implied warranty of\n * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n * GNU General Public License for more details.\n *\n * You should have received a copy of the GNU General Public License\n * along with this program. If not, see .\n */\n\n#include QMK_KEYBOARD_H\n\nenum anne_pro_layers {\n BASE,\n FN1,\n FN2,\n};\n\n// clang-format off\n// Key symbols are based on QMK. Use them to remap your keyboard\n/*\n* Layer BASE\n* ,-----------------------------------------------------------------------------------------.\n* | esc | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 0 | - | = | Bksp |\n* |-----------------------------------------------------------------------------------------+\n* | Tab | q | w | e | r | t | y | u | i | o | p | [ | ] | \ |\n* |-----------------------------------------------------------------------------------------+\n* | Caps | a | s | d | f | g | h | j | k | l | ; | ' | Enter |\n* |-----------------------------------------------------------------------------------------+\n* | Shift | z | x | c | v | b | n | m | , | . 
| / | Shift |\n* |-----------------------------------------------------------------------------------------+\n* | Ctrl | L1 | Alt | space | Alt | FN1 | FN2 | Ctrl |\n* \-----------------------------------------------------------------------------------------/\n* Layer TAP in BASE\n* ,-----------------------------------------------------------------------------------------.\n* | | | | | | | | | | | | | | |\n* |-----------------------------------------------------------------------------------------+\n* | | | | | | | | | | | | | | |\n* |-----------------------------------------------------------------------------------------+\n* | | | | | | | | | | | | | |\n* |-----------------------------------------------------------------------------------------+\n* | | | | | | | | | | | | UP |\n* |-----------------------------------------------------------------------------------------+\n* | | | | | | LEFT | DOWN | RIGHT |\n* \-----------------------------------------------------------------------------------------/\n*/\n const uint16_t keymaps[][MATRIX_ROWS][MATRIX_COLS] = {\n [BASE] = LAYOUT_60_ansi( /* Base */\n KC_ESC, KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, KC_7, KC_8, KC_9, KC_0, KC_MINS, KC_EQL, KC_BSPC,\n KC_TAB, KC_Q, KC_W, KC_E, KC_R, KC_T, KC_Y, KC_U, KC_I, KC_O, KC_P, KC_LBRC, KC_RBRC, KC_BSLS,\n LT(FN1, KC_CAPS), KC_A, KC_S, KC_D, KC_F, KC_G, KC_H, KC_J, KC_K, KC_L, KC_SCLN, KC_QUOT, KC_ENT,\n KC_LSFT, KC_Z, KC_X, KC_C, KC_V, KC_B, KC_N, KC_M, KC_COMM, KC_DOT, KC_SLSH, RSFT_T(KC_UP),\n KC_LCTL, KC_LGUI, KC_LALT, KC_SPC, KC_RALT, LT(FN1, KC_LEFT), LT(FN2, KC_DOWN), RCTL_T(KC_RGHT)\n),\n /*\n * Layer FN1\n * ,-----------------------------------------------------------------------------------------.\n * | ` | F1 | F2 | F3 | F4 | F5 | F6 | F7 | F8 | F9 | F10 | F11 | F12 | DELETE |\n * |-----------------------------------------------------------------------------------------+\n * | Tab | q | UP | e | r | t | y | u | i | o | PS | HOME | END | \ |\n * 
|-----------------------------------------------------------------------------------------+\n * | Esc |LEFT |DOWN |RIGHT| f | g | h | j | k | l | PGUP|PGDN | Enter |\n * |-----------------------------------------------------------------------------------------+\n * | Shift |V-UP |V-DWN|MUTE | v | b | n | m | , |INSRT| DEL | Shift |\n * |-----------------------------------------------------------------------------------------+\n * | Ctrl | L1 | Alt | space | Alt | FN1 | FN2 | Ctrl |\n * \-----------------------------------------------------------------------------------------/\n *\n */\n [FN1] = LAYOUT_60_ansi( /* FN1 */\n KC_GRV, KC_F1, KC_F2, KC_F3, KC_F4, KC_F5, KC_F6, KC_F7, KC_F8, KC_F9, KC_F10, KC_F11, KC_F12, KC_DEL,\n _______, _______, KC_UP, _______, _______, _______, _______, _______, _______, _______, KC_PSCR, KC_HOME, KC_END, _______,\n _______, KC_LEFT, KC_DOWN, KC_RGHT, _______, _______, _______, _______, _______, _______, KC_PGUP, KC_PGDN, _______,\n _______, KC_VOLU, KC_VOLD, KC_MUTE, _______, _______, _______, _______, _______, KC_INS, KC_DEL, _______,\n _______, _______, _______, _______, _______, _______, MO(FN2), _______\n),\n /*\n * Layer FN2\n * ,-----------------------------------------------------------------------------------------.\n * | ~ | BT1 | BT2 | BT3 | BT4 | F5 | F6 | F7 | F8 | MOD | TOG | BRI- | BRI+ | Bksp |\n * |-----------------------------------------------------------------------------------------+\n * | Tab | q | UP | e | r | t | y | u | i | o | PS | HOME | END | \ |\n * |-----------------------------------------------------------------------------------------+\n * | Esc |LEFT |DOWN |RIGHT| f | g | h | j | k | l | PGUP|PGDN | Enter |\n * |-----------------------------------------------------------------------------------------+\n * | Shift | z | x | c | v | b | n | m | , |INSRT| DEL | Shift |\n * |-----------------------------------------------------------------------------------------+\n * | Ctrl | L1 | Alt | space | Alt | 
FN1 | FN2 | Ctrl |\n * \-----------------------------------------------------------------------------------------/\n *\n */\n [FN2] = LAYOUT_60_ansi( /* FN2 */\n _______, KC_AP2_BT1, KC_AP2_BT2, KC_AP2_BT3, KC_AP2_BT4, _______, _______, _______, _______, KC_AP_RGB_MOD, KC_AP_RGB_TOG, KC_AP_RGB_VAD, KC_AP_RGB_VAI, _______,\n _______, _______, KC_UP, _______, _______, _______, _______, _______, _______, _______, KC_PSCR, KC_HOME, KC_END, _______,\n _______, KC_LEFT, KC_DOWN, KC_RGHT, _______, _______, _______, _______, _______, _______, KC_PGUP, KC_PGDN, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, KC_INS, KC_DEL, _______,\n _______, _______, _______, _______, _______, _______, _______, _______\n ),\n};\n// clang-format on\n",c,tab
+5,19235,"keyboards/annepro2/keymaps/default/keymap.c",787,0,"",c,selection_mouse
+6,19242,"keyboards/annepro2/keymaps/default/keymap.c",786,0,"",c,selection_command
+7,74109,"keyboards/annepro2/keymaps/default/keymap.c",787,0,"",c,selection_mouse
+8,74116,"keyboards/annepro2/keymaps/default/keymap.c",786,0,"",c,selection_command
+9,74698,"keyboards/annepro2/keymaps/default/keymap.c",0,7451," /* Copyright 2021 OpenAnnePro community\n *\n * This program is free software: you can redistribute it and/or modify\n * it under the terms of the GNU General Public License as published by\n * the Free Software Foundation, either version 2 of the License, or\n * (at your option) any later version.\n *\n * This program is distributed in the hope that it will be useful,\n * but WITHOUT ANY WARRANTY; without even the implied warranty of\n * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n * GNU General Public License for more details.\n *\n * You should have received a copy of the GNU General Public License\n * along with this program. If not, see .\n */\n\n#include QMK_KEYBOARD_H\n\nenum anne_pro_layers {\n BASE,\n FN1,\n FN2,\n};\n\n// clang-format off\n// Key symbols are based on QMK. Use them to remap your keyboard\n/*\n* Layer BASE\n* ,-----------------------------------------------------------------------------------------.\n* | esc | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 0 | - | = | Bksp |\n* |-----------------------------------------------------------------------------------------+\n* | Tab | q | w | e | r | t | y | u | i | o | p | [ | ] | \ |\n* |-----------------------------------------------------------------------------------------+\n* | Caps | a | s | d | f | g | h | j | k | l | ; | ' | Enter |\n* |-----------------------------------------------------------------------------------------+\n* | Shift | z | x | c | v | b | n | m | , | . 
| / | Shift |\n* |-----------------------------------------------------------------------------------------+\n* | Ctrl | L1 | Alt | space | Alt | FN1 | FN2 | Ctrl |\n* \-----------------------------------------------------------------------------------------/\n* Layer TAP in BASE\n* ,-----------------------------------------------------------------------------------------.\n* | | | | | | | | | | | | | | |\n* |-----------------------------------------------------------------------------------------+\n* | | | | | | | | | | | | | | |\n* |-----------------------------------------------------------------------------------------+\n* | | | | | | | | | | | | | |\n* |-----------------------------------------------------------------------------------------+\n* | | | | | | | | | | | | UP |\n* |-----------------------------------------------------------------------------------------+\n* | | | | | | LEFT | DOWN | RIGHT |\n* \-----------------------------------------------------------------------------------------/\n*/\n const uint16_t keymaps[][MATRIX_ROWS][MATRIX_COLS] = {\n [BASE] = LAYOUT_60_ansi( /* Base */\n KC_ESC, KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, KC_7, KC_8, KC_9, KC_0, KC_MINS, KC_EQL, KC_BSPC,\n KC_TAB, KC_Q, KC_W, KC_E, KC_R, KC_T, KC_Y, KC_U, KC_I, KC_O, KC_P, KC_LBRC, KC_RBRC, KC_BSLS,\n LT(FN1, KC_CAPS), KC_A, KC_S, KC_D, KC_F, KC_G, KC_H, KC_J, KC_K, KC_L, KC_SCLN, KC_QUOT, KC_ENT,\n KC_LSFT, KC_Z, KC_X, KC_C, KC_V, KC_B, KC_N, KC_M, KC_COMM, KC_DOT, KC_SLSH, RSFT_T(KC_UP),\n KC_LCTL, KC_LGUI, KC_LALT, KC_SPC, KC_RALT, LT(FN1, KC_LEFT), LT(FN2, KC_DOWN), RCTL_T(KC_RGHT)\n),\n /*\n * Layer FN1\n * ,-----------------------------------------------------------------------------------------.\n * | ` | F1 | F2 | F3 | F4 | F5 | F6 | F7 | F8 | F9 | F10 | F11 | F12 | DELETE |\n * |-----------------------------------------------------------------------------------------+\n * | Tab | q | UP | e | r | t | y | u | i | o | PS | HOME | END | \ |\n * 
|-----------------------------------------------------------------------------------------+\n * | Esc |LEFT |DOWN |RIGHT| f | g | h | j | k | l | PGUP|PGDN | Enter |\n * |-----------------------------------------------------------------------------------------+\n * | Shift |V-UP |V-DWN|MUTE | v | b | n | m | , |INSRT| DEL | Shift |\n * |-----------------------------------------------------------------------------------------+\n * | Ctrl | L1 | Alt | space | Alt | FN1 | FN2 | Ctrl |\n * \-----------------------------------------------------------------------------------------/\n *\n */\n [FN1] = LAYOUT_60_ansi( /* FN1 */\n KC_GRV, KC_F1, KC_F2, KC_F3, KC_F4, KC_F5, KC_F6, KC_F7, KC_F8, KC_F9, KC_F10, KC_F11, KC_F12, KC_DEL,\n _______, _______, KC_UP, _______, _______, _______, _______, _______, _______, _______, KC_PSCR, KC_HOME, KC_END, _______,\n _______, KC_LEFT, KC_DOWN, KC_RGHT, _______, _______, _______, _______, _______, _______, KC_PGUP, KC_PGDN, _______,\n _______, KC_VOLU, KC_VOLD, KC_MUTE, _______, _______, _______, _______, _______, KC_INS, KC_DEL, _______,\n _______, _______, _______, _______, _______, _______, MO(FN2), _______\n),\n /*\n * Layer FN2\n * ,-----------------------------------------------------------------------------------------.\n * | ~ | BT1 | BT2 | BT3 | BT4 | F5 | F6 | F7 | F8 | MOD | TOG | BRI- | BRI+ | Bksp |\n * |-----------------------------------------------------------------------------------------+\n * | Tab | q | UP | e | r | t | y | u | i | o | PS | HOME | END | \ |\n * |-----------------------------------------------------------------------------------------+\n * | Esc |LEFT |DOWN |RIGHT| f | g | h | j | k | l | PGUP|PGDN | Enter |\n * |-----------------------------------------------------------------------------------------+\n * | Shift | z | x | c | v | b | n | m | , |INSRT| DEL | Shift |\n * |-----------------------------------------------------------------------------------------+\n * | Ctrl | L1 | Alt | space | Alt | 
FN1 | FN2 | Ctrl |\n * \-----------------------------------------------------------------------------------------/\n *\n */\n [FN2] = LAYOUT_60_ansi( /* FN2 */\n _______, KC_AP2_BT1, KC_AP2_BT2, KC_AP2_BT3, KC_AP2_BT4, _______, _______, _______, _______, KC_AP_RGB_MOD, KC_AP_RGB_TOG, KC_AP_RGB_VAD, KC_AP_RGB_VAI, _______,\n _______, _______, KC_UP, _______, _______, _______, _______, _______, _______, _______, KC_PSCR, KC_HOME, KC_END, _______,\n _______, KC_LEFT, KC_DOWN, KC_RGHT, _______, _______, _______, _______, _______, _______, KC_PGUP, KC_PGDN, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, KC_INS, KC_DEL, _______,\n _______, _______, _______, _______, _______, _______, _______, _______\n ),\n};\n// clang-format on\n",c,selection_command
+10,74940,"keyboards/annepro2/keymaps/default/keymap.c",7451,0,"",c,selection_command
+11,76114,"keyboards/annepro2/keymaps/miryoku/keymap.c",0,0," /* Copyright 2021 OpenAnnePro community\n *\n * This program is free software: you can redistribute it and/or modify\n * it under the terms of the GNU General Public License as published by\n * the Free Software Foundation, either version 2 of the License, or\n * (at your option) any later version.\n *\n * This program is distributed in the hope that it will be useful,\n * but WITHOUT ANY WARRANTY; without even the implied warranty of\n * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n * GNU General Public License for more details.\n *\n * You should have received a copy of the GNU General Public License\n * along with this program. If not, see .\n */\n\n #include QMK_KEYBOARD_H\n\n enum layers {\n BASE,\n NAV,\n MOUSE,\n MEDIA,\n NUMBERS,\n SYMBOLS,\n FUN,\n };\n \n // Helper: use S(KC_...) for shifted symbols where needed.\n \n const uint16_t PROGMEM keymaps[][MATRIX_ROWS][MATRIX_COLS] = {\n \n /* BASE\n - home-row mod-taps from your ZMK config (mt left/right mods)\n - thumb cluster uses LT(...) 
to switch into NAV/MOUSE/MEDIA/NUMBERS/SYMBOLS/FUN\n */\n [BASE] = LAYOUT_60_ansi(\n /* Row 1 */\n KC_ESC, KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, KC_7, KC_8, KC_9, KC_0, KC_MINS, KC_EQL, KC_BSPC,\n /* Row 2 */\n KC_TAB, KC_Q, KC_W, KC_E, KC_R, KC_T, KC_Y, KC_U, KC_I, KC_O, KC_P, KC_LBRC, KC_RBRC, KC_BSLS,\n /* Row 3 (home row with mod-taps) */\n LCTL_T(KC_ESC), LGUI_T(KC_A), LALT_T(KC_S), LCTL_T(KC_D), LSFT_T(KC_F), KC_G, KC_H, RSFT_T(KC_J), RCTL_T(KC_K), RALT_T(KC_L), KC_SCLN, RGUI_T(KC_QUOT), KC_ENT,\n /* Row 4 */\n KC_LSFT, KC_Z, KC_X, KC_C, KC_V, KC_B, KC_N, KC_M, KC_COMM, KC_DOT, KC_SLSH, RSFT_T(KC_UP),\n /* Thumb cluster (8 keys) -- mapped from your < positions:\n LT(MEDIA, ESC) ; LT(NAV, SPACE) ; LT(MOUSE, TAB); transparent\n LT(SYMBOLS, ENTER) ; LT(NUMBERS, BSPC) ; transparent ; RCTL_T(RIGHT)\n */\n LT(MEDIA, KC_ESC), LT(NAV, KC_SPC), LT(MOUSE, KC_TAB), KC_TRNS, LT(SYMBOLS, KC_ENT), LT(NUMBERS, KC_BSPC), KC_TRNS, RCTL_T(KC_RGHT)\n ),\n \n /* NAV layer (ZMK ""Nav"")\n - clipboard shortcuts, arrows, caps, insert/page/home/end\n - bootloader mapped to RESET\n */\n [NAV] = LAYOUT_60_ansi(\n /* Row 1 */\n KC_TRNS, QK_BOOT, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, LCTL(KC_Y), LCTL(KC_V), LCTL(KC_C), LCTL(KC_X), LCTL(KC_Z), KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 2 */\n KC_TRNS, KC_LGUI, KC_LALT, KC_LCTL, KC_LSFT, KC_TRNS, KC_LEFT, KC_DOWN, KC_UP, KC_RIGHT, KC_CAPS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 3 */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_INS, KC_PGDN, KC_PGUP, KC_HOME, KC_END, KC_TRNS, KC_TRNS,\n /* Row 4 */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_ENT, KC_BSPC, KC_DEL, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Thumbs */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS\n ),\n \n /* MOUSE layer (ZMK ""Mouse"")\n - mouse movement and wheel, mouse buttons, and keep clipboard shortcuts on top row\n */\n [MOUSE] = LAYOUT_60_ansi(\n /* Row 1 */\n KC_TRNS, QK_BOOT, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, LCTL(KC_Y), LCTL(KC_V), 
LCTL(KC_C), LCTL(KC_X), LCTL(KC_Z), KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 2 */\n KC_TRNS, KC_LGUI, KC_LALT, KC_LCTL, KC_LSFT, KC_TRNS, KC_MS_LEFT, KC_MS_DOWN, KC_MS_UP, KC_MS_RIGHT, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 3 */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_WH_L, KC_WH_D, KC_WH_U, KC_WH_R, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 4 */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_BTN2, KC_BTN1, KC_BTN3, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Thumbs */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS\n ),\n \n /* MEDIA layer (ZMK ""Media"")\n - media controls, volume, playlist, Bluetooth select (kept Anne Pro special codes),\n - brightness/RGB mapped to AP-specific codes (as seen in stock AnnePro2 keymap)\n */\n [MEDIA] = LAYOUT_60_ansi(\n /* Row 1 */\n KC_GRV, QK_BOOT, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_AP_RGB_VAD, KC_AP_RGB_VAI, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 2 */\n KC_TRNS, KC_LGUI, KC_LALT, KC_LCTL, KC_LSFT, KC_TRNS, KC_MPRV, KC_VOLD, KC_VOLU, KC_MNXT, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 3 */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_AP2_BT1, KC_AP2_BT2, KC_AP2_BT3, KC_AP2_BT4, KC_TRNS, KC_TRNS,\n /* Row 4 */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_MUTE, KC_MPLY, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Thumbs */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS\n ),\n \n /* NUMBERS layer (ZMK ""Numbers"")\n - numbers/numpad-like layout on alpha rows per your ZMK mapping\n */\n [NUMBERS] = LAYOUT_60_ansi(\n /* Row 1 */\n KC_TRNS, KC_LBRC, KC_7, KC_8, KC_9, KC_RBRC, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 2 */\n KC_TRNS, KC_SCLN, KC_4, KC_5, KC_6, KC_EQL, KC_TRNS, KC_RSFT, KC_RCTL, KC_RALT, KC_RGUI, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 3 */\n KC_TRNS, KC_GRV, KC_1, KC_2, KC_3, KC_BSLS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 4 
*/\n KC_DOT, KC_0, KC_MINS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Thumbs */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS\n ),\n \n /* SYMBOLS layer (ZMK ""Symbols"")\n - uses S(KC_...) where sensible (shifted versions of existing keys)\n */\n [SYMBOLS] = LAYOUT_60_ansi(\n /* Row 1 */\n KC_TRNS, S(KC_LBRC), S(KC_7), S(KC_8), S(KC_9), S(KC_RBRC), KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 2 */\n KC_TRNS, KC_COLN, S(KC_4), S(KC_5), S(KC_6), S(KC_EQL), KC_TRNS, KC_RSFT, KC_RCTL, KC_RALT, KC_RGUI, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 3 */\n KC_TRNS, S(KC_GRV), KC_EXLM, KC_AT, KC_HASH, KC_PIPE, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 4 */\n S(KC_9), S(KC_0), S(KC_MINS), KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Thumbs */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS\n ),\n \n /* FUN layer (ZMK ""Fun"")\n - F1..F12 and application/menu key etc.\n */\n [FUN] = LAYOUT_60_ansi(\n /* Row 1 */\n KC_TRNS, KC_F12, KC_F7, KC_F8, KC_F9, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 2 */\n KC_TRNS, KC_F11, KC_F4, KC_F5, KC_F6, KC_TRNS, KC_TRNS, KC_RSFT, KC_RCTL, KC_RALT, KC_RGUI, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 3 */\n KC_TRNS, KC_F10, KC_F1, KC_F2, KC_F3, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 4 */\n KC_APP, KC_SPC, KC_TAB, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Thumbs */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS\n )\n }; // keymaps\n \n ",c,tab
+12,76473,"keyboards/annepro2/keymaps/miryoku/keymap.c",849,0,"",c,selection_mouse
+13,76480,"keyboards/annepro2/keymaps/miryoku/keymap.c",848,0,"",c,selection_command
+14,76694,"keyboards/annepro2/keymaps/miryoku/keymap.c",0,6939," /* Copyright 2021 OpenAnnePro community\n *\n * This program is free software: you can redistribute it and/or modify\n * it under the terms of the GNU General Public License as published by\n * the Free Software Foundation, either version 2 of the License, or\n * (at your option) any later version.\n *\n * This program is distributed in the hope that it will be useful,\n * but WITHOUT ANY WARRANTY; without even the implied warranty of\n * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n * GNU General Public License for more details.\n *\n * You should have received a copy of the GNU General Public License\n * along with this program. If not, see .\n */\n\n #include QMK_KEYBOARD_H\n\n enum layers {\n BASE,\n NAV,\n MOUSE,\n MEDIA,\n NUMBERS,\n SYMBOLS,\n FUN,\n };\n \n // Helper: use S(KC_...) for shifted symbols where needed.\n \n const uint16_t PROGMEM keymaps[][MATRIX_ROWS][MATRIX_COLS] = {\n \n /* BASE\n - home-row mod-taps from your ZMK config (mt left/right mods)\n - thumb cluster uses LT(...) 
to switch into NAV/MOUSE/MEDIA/NUMBERS/SYMBOLS/FUN\n */\n [BASE] = LAYOUT_60_ansi(\n /* Row 1 */\n KC_ESC, KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, KC_7, KC_8, KC_9, KC_0, KC_MINS, KC_EQL, KC_BSPC,\n /* Row 2 */\n KC_TAB, KC_Q, KC_W, KC_E, KC_R, KC_T, KC_Y, KC_U, KC_I, KC_O, KC_P, KC_LBRC, KC_RBRC, KC_BSLS,\n /* Row 3 (home row with mod-taps) */\n LCTL_T(KC_ESC), LGUI_T(KC_A), LALT_T(KC_S), LCTL_T(KC_D), LSFT_T(KC_F), KC_G, KC_H, RSFT_T(KC_J), RCTL_T(KC_K), RALT_T(KC_L), KC_SCLN, RGUI_T(KC_QUOT), KC_ENT,\n /* Row 4 */\n KC_LSFT, KC_Z, KC_X, KC_C, KC_V, KC_B, KC_N, KC_M, KC_COMM, KC_DOT, KC_SLSH, RSFT_T(KC_UP),\n /* Thumb cluster (8 keys) -- mapped from your < positions:\n LT(MEDIA, ESC) ; LT(NAV, SPACE) ; LT(MOUSE, TAB); transparent\n LT(SYMBOLS, ENTER) ; LT(NUMBERS, BSPC) ; transparent ; RCTL_T(RIGHT)\n */\n LT(MEDIA, KC_ESC), LT(NAV, KC_SPC), LT(MOUSE, KC_TAB), KC_TRNS, LT(SYMBOLS, KC_ENT), LT(NUMBERS, KC_BSPC), KC_TRNS, RCTL_T(KC_RGHT)\n ),\n \n /* NAV layer (ZMK ""Nav"")\n - clipboard shortcuts, arrows, caps, insert/page/home/end\n - bootloader mapped to RESET\n */\n [NAV] = LAYOUT_60_ansi(\n /* Row 1 */\n KC_TRNS, QK_BOOT, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, LCTL(KC_Y), LCTL(KC_V), LCTL(KC_C), LCTL(KC_X), LCTL(KC_Z), KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 2 */\n KC_TRNS, KC_LGUI, KC_LALT, KC_LCTL, KC_LSFT, KC_TRNS, KC_LEFT, KC_DOWN, KC_UP, KC_RIGHT, KC_CAPS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 3 */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_INS, KC_PGDN, KC_PGUP, KC_HOME, KC_END, KC_TRNS, KC_TRNS,\n /* Row 4 */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_ENT, KC_BSPC, KC_DEL, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Thumbs */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS\n ),\n \n /* MOUSE layer (ZMK ""Mouse"")\n - mouse movement and wheel, mouse buttons, and keep clipboard shortcuts on top row\n */\n [MOUSE] = LAYOUT_60_ansi(\n /* Row 1 */\n KC_TRNS, QK_BOOT, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, LCTL(KC_Y), LCTL(KC_V), 
LCTL(KC_C), LCTL(KC_X), LCTL(KC_Z), KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 2 */\n KC_TRNS, KC_LGUI, KC_LALT, KC_LCTL, KC_LSFT, KC_TRNS, KC_MS_LEFT, KC_MS_DOWN, KC_MS_UP, KC_MS_RIGHT, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 3 */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_WH_L, KC_WH_D, KC_WH_U, KC_WH_R, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 4 */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_BTN2, KC_BTN1, KC_BTN3, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Thumbs */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS\n ),\n \n /* MEDIA layer (ZMK ""Media"")\n - media controls, volume, playlist, Bluetooth select (kept Anne Pro special codes),\n - brightness/RGB mapped to AP-specific codes (as seen in stock AnnePro2 keymap)\n */\n [MEDIA] = LAYOUT_60_ansi(\n /* Row 1 */\n KC_GRV, QK_BOOT, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_AP_RGB_VAD, KC_AP_RGB_VAI, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 2 */\n KC_TRNS, KC_LGUI, KC_LALT, KC_LCTL, KC_LSFT, KC_TRNS, KC_MPRV, KC_VOLD, KC_VOLU, KC_MNXT, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 3 */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_AP2_BT1, KC_AP2_BT2, KC_AP2_BT3, KC_AP2_BT4, KC_TRNS, KC_TRNS,\n /* Row 4 */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_MUTE, KC_MPLY, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Thumbs */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS\n ),\n \n /* NUMBERS layer (ZMK ""Numbers"")\n - numbers/numpad-like layout on alpha rows per your ZMK mapping\n */\n [NUMBERS] = LAYOUT_60_ansi(\n /* Row 1 */\n KC_TRNS, KC_LBRC, KC_7, KC_8, KC_9, KC_RBRC, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 2 */\n KC_TRNS, KC_SCLN, KC_4, KC_5, KC_6, KC_EQL, KC_TRNS, KC_RSFT, KC_RCTL, KC_RALT, KC_RGUI, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 3 */\n KC_TRNS, KC_GRV, KC_1, KC_2, KC_3, KC_BSLS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 4 
*/\n KC_DOT, KC_0, KC_MINS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Thumbs */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS\n ),\n \n /* SYMBOLS layer (ZMK ""Symbols"")\n - uses S(KC_...) where sensible (shifted versions of existing keys)\n */\n [SYMBOLS] = LAYOUT_60_ansi(\n /* Row 1 */\n KC_TRNS, S(KC_LBRC), S(KC_7), S(KC_8), S(KC_9), S(KC_RBRC), KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 2 */\n KC_TRNS, KC_COLN, S(KC_4), S(KC_5), S(KC_6), S(KC_EQL), KC_TRNS, KC_RSFT, KC_RCTL, KC_RALT, KC_RGUI, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 3 */\n KC_TRNS, S(KC_GRV), KC_EXLM, KC_AT, KC_HASH, KC_PIPE, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 4 */\n S(KC_9), S(KC_0), S(KC_MINS), KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Thumbs */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS\n ),\n \n /* FUN layer (ZMK ""Fun"")\n - F1..F12 and application/menu key etc.\n */\n [FUN] = LAYOUT_60_ansi(\n /* Row 1 */\n KC_TRNS, KC_F12, KC_F7, KC_F8, KC_F9, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 2 */\n KC_TRNS, KC_F11, KC_F4, KC_F5, KC_F6, KC_TRNS, KC_TRNS, KC_RSFT, KC_RCTL, KC_RALT, KC_RGUI, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 3 */\n KC_TRNS, KC_F10, KC_F1, KC_F2, KC_F3, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Row 4 */\n KC_APP, KC_SPC, KC_TAB, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS,\n /* Thumbs */\n KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS, KC_TRNS\n )\n }; // keymaps\n \n ",c,selection_command
+15,76976,"keyboards/annepro2/keymaps/miryoku/keymap.c",0,6939," /* Copyright 2021 OpenAnnePro community\n *\n * This program is free software: you can redistribute it and/or modify\n * it under the terms of the GNU General Public License as published by\n * the Free Software Foundation, either version 2 of the License, or\n * (at your option) any later version.\n *\n * This program is distributed in the hope that it will be useful,\n * but WITHOUT ANY WARRANTY; without even the implied warranty of\n * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n * GNU General Public License for more details.\n *\n * You should have received a copy of the GNU General Public License\n * along with this program. If not, see .\n */\n\n#include QMK_KEYBOARD_H\n\nenum anne_pro_layers {\n BASE,\n FN1,\n FN2,\n};\n\n// clang-format off\n// Key symbols are based on QMK. Use them to remap your keyboard\n/*\n* Layer BASE\n* ,-----------------------------------------------------------------------------------------.\n* | esc | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 0 | - | = | Bksp |\n* |-----------------------------------------------------------------------------------------+\n* | Tab | q | w | e | r | t | y | u | i | o | p | [ | ] | \ |\n* |-----------------------------------------------------------------------------------------+\n* | Caps | a | s | d | f | g | h | j | k | l | ; | ' | Enter |\n* |-----------------------------------------------------------------------------------------+\n* | Shift | z | x | c | v | b | n | m | , | . 
| / | Shift |\n* |-----------------------------------------------------------------------------------------+\n* | Ctrl | L1 | Alt | space | Alt | FN1 | FN2 | Ctrl |\n* \-----------------------------------------------------------------------------------------/\n* Layer TAP in BASE\n* ,-----------------------------------------------------------------------------------------.\n* | | | | | | | | | | | | | | |\n* |-----------------------------------------------------------------------------------------+\n* | | | | | | | | | | | | | | |\n* |-----------------------------------------------------------------------------------------+\n* | | | | | | | | | | | | | |\n* |-----------------------------------------------------------------------------------------+\n* | | | | | | | | | | | | UP |\n* |-----------------------------------------------------------------------------------------+\n* | | | | | | LEFT | DOWN | RIGHT |\n* \-----------------------------------------------------------------------------------------/\n*/\n const uint16_t keymaps[][MATRIX_ROWS][MATRIX_COLS] = {\n [BASE] = LAYOUT_60_ansi( /* Base */\n KC_ESC, KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, KC_7, KC_8, KC_9, KC_0, KC_MINS, KC_EQL, KC_BSPC,\n KC_TAB, KC_Q, KC_W, KC_E, KC_R, KC_T, KC_Y, KC_U, KC_I, KC_O, KC_P, KC_LBRC, KC_RBRC, KC_BSLS,\n LT(FN1, KC_CAPS), KC_A, KC_S, KC_D, KC_F, KC_G, KC_H, KC_J, KC_K, KC_L, KC_SCLN, KC_QUOT, KC_ENT,\n KC_LSFT, KC_Z, KC_X, KC_C, KC_V, KC_B, KC_N, KC_M, KC_COMM, KC_DOT, KC_SLSH, RSFT_T(KC_UP),\n KC_LCTL, KC_LGUI, KC_LALT, KC_SPC, KC_RALT, LT(FN1, KC_LEFT), LT(FN2, KC_DOWN), RCTL_T(KC_RGHT)\n),\n /*\n * Layer FN1\n * ,-----------------------------------------------------------------------------------------.\n * | ` | F1 | F2 | F3 | F4 | F5 | F6 | F7 | F8 | F9 | F10 | F11 | F12 | DELETE |\n * |-----------------------------------------------------------------------------------------+\n * | Tab | q | UP | e | r | t | y | u | i | o | PS | HOME | END | \ |\n * 
|-----------------------------------------------------------------------------------------+\n * | Esc |LEFT |DOWN |RIGHT| f | g | h | j | k | l | PGUP|PGDN | Enter |\n * |-----------------------------------------------------------------------------------------+\n * | Shift |V-UP |V-DWN|MUTE | v | b | n | m | , |INSRT| DEL | Shift |\n * |-----------------------------------------------------------------------------------------+\n * | Ctrl | L1 | Alt | space | Alt | FN1 | FN2 | Ctrl |\n * \-----------------------------------------------------------------------------------------/\n *\n */\n [FN1] = LAYOUT_60_ansi( /* FN1 */\n KC_GRV, KC_F1, KC_F2, KC_F3, KC_F4, KC_F5, KC_F6, KC_F7, KC_F8, KC_F9, KC_F10, KC_F11, KC_F12, KC_DEL,\n _______, _______, KC_UP, _______, _______, _______, _______, _______, _______, _______, KC_PSCR, KC_HOME, KC_END, _______,\n _______, KC_LEFT, KC_DOWN, KC_RGHT, _______, _______, _______, _______, _______, _______, KC_PGUP, KC_PGDN, _______,\n _______, KC_VOLU, KC_VOLD, KC_MUTE, _______, _______, _______, _______, _______, KC_INS, KC_DEL, _______,\n _______, _______, _______, _______, _______, _______, MO(FN2), _______\n),\n /*\n * Layer FN2\n * ,-----------------------------------------------------------------------------------------.\n * | ~ | BT1 | BT2 | BT3 | BT4 | F5 | F6 | F7 | F8 | MOD | TOG | BRI- | BRI+ | Bksp |\n * |-----------------------------------------------------------------------------------------+\n * | Tab | q | UP | e | r | t | y | u | i | o | PS | HOME | END | \ |\n * |-----------------------------------------------------------------------------------------+\n * | Esc |LEFT |DOWN |RIGHT| f | g | h | j | k | l | PGUP|PGDN | Enter |\n * |-----------------------------------------------------------------------------------------+\n * | Shift | z | x | c | v | b | n | m | , |INSRT| DEL | Shift |\n * |-----------------------------------------------------------------------------------------+\n * | Ctrl | L1 | Alt | space | Alt | 
FN1 | FN2 | Ctrl |\n * \-----------------------------------------------------------------------------------------/\n *\n */\n [FN2] = LAYOUT_60_ansi( /* FN2 */\n _______, KC_AP2_BT1, KC_AP2_BT2, KC_AP2_BT3, KC_AP2_BT4, _______, _______, _______, _______, KC_AP_RGB_MOD, KC_AP_RGB_TOG, KC_AP_RGB_VAD, KC_AP_RGB_VAI, _______,\n _______, _______, KC_UP, _______, _______, _______, _______, _______, _______, _______, KC_PSCR, KC_HOME, KC_END, _______,\n _______, KC_LEFT, KC_DOWN, KC_RGHT, _______, _______, _______, _______, _______, _______, KC_PGUP, KC_PGDN, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, KC_INS, KC_DEL, _______,\n _______, _______, _______, _______, _______, _______, _______, _______\n ),\n};\n// clang-format on\n",c,content
+16,76994,"keyboards/annepro2/keymaps/miryoku/keymap.c",7451,0," ",c,content
+17,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",7432,0," ",c,content
+18,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",7429,0," ",c,content
+19,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",7425,1," ",c,content
+20,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",7275,4," ",c,content
+21,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",7124,4," ",c,content
+22,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",6973,4," ",c,content
+23,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",6807,4," ",c,content
+24,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",6641,4," ",c,content
+25,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",6606,1," ",c,content
+26,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",6601,2," ",c,content
+27,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",6597,2," ",c,content
+28,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",6501,2," ",c,content
+29,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",6405,2," ",c,content
+30,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",6309,2," ",c,content
+31,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",6213,2," ",c,content
+32,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",6117,2," ",c,content
+33,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",6021,2," ",c,content
+34,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",5925,2," ",c,content
+35,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",5829,2," ",c,content
+36,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",5733,2," ",c,content
+37,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",5637,2," ",c,content
+38,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",5541,2," ",c,content
+39,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",5527,2," ",c,content
+40,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",5522,2," ",c,content
+41,76995,"keyboards/annepro2/keymaps/miryoku/keymap.c",5519,0," ",c,content
+42,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",5399,4," ",c,content
+43,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",5278,4," ",c,content
+44,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",5157,4," ",c,content
+45,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",5027,4," ",c,content
+46,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",4898,4," ",c,content
+47,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",4863,1," ",c,content
+48,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",4858,2," ",c,content
+49,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",4854,2," ",c,content
+50,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",4758,2," ",c,content
+51,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",4662,2," ",c,content
+52,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",4566,2," ",c,content
+53,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",4470,2," ",c,content
+54,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",4374,2," ",c,content
+55,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",4278,2," ",c,content
+56,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",4182,2," ",c,content
+57,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",4086,2," ",c,content
+58,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",3990,2," ",c,content
+59,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",3894,2," ",c,content
+60,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",3798,2," ",c,content
+61,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",3784,2," ",c,content
+62,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",3779,2," ",c,content
+63,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",3776,0," ",c,content
+64,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",3637,4," ",c,content
+65,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",3499,4," ",c,content
+66,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",3368,4," ",c,content
+67,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",3221,4," ",c,content
+68,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",3074,4," ",c,content
+69,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",3037,1," ",c,content
+70,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",2981,1," ",c,content
+71,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",2978,0," ",c,content
+72,76996,"keyboards/annepro2/keymaps/miryoku/keymap.c",2884,0," ",c,content
+73,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",2790,0," ",c,content
+74,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",2696,0," ",c,content
+75,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",2602,0," ",c,content
+76,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",2508,0," ",c,content
+77,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",2414,0," ",c,content
+78,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",2320,0," ",c,content
+79,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",2226,0," ",c,content
+80,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",2132,0," ",c,content
+81,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",2038,0," ",c,content
+82,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",1944,0," ",c,content
+83,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",1924,0," ",c,content
+84,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",1830,0," ",c,content
+85,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",1736,0," ",c,content
+86,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",1642,0," ",c,content
+87,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",1548,0," ",c,content
+88,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",1454,0," ",c,content
+89,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",1360,0," ",c,content
+90,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",1266,0," ",c,content
+91,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",1172,0," ",c,content
+92,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",1078,0," ",c,content
+93,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",984,0," ",c,content
+94,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",890,0," ",c,content
+95,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",877,0," ",c,content
+96,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",874,0," ",c,content
+97,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",809,0," ",c,content
+98,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",789,0," ",c,content
+99,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",788,0," ",c,content
+100,76997,"keyboards/annepro2/keymaps/miryoku/keymap.c",785,0," ",c,content
+101,76998,"keyboards/annepro2/keymaps/miryoku/keymap.c",776,4," ",c,content
+102,76998,"keyboards/annepro2/keymaps/miryoku/keymap.c",767,4," ",c,content
+103,76998,"keyboards/annepro2/keymaps/miryoku/keymap.c",757,4," ",c,content
+104,76998,"keyboards/annepro2/keymaps/miryoku/keymap.c",734,0," ",c,content
+105,76998,"keyboards/annepro2/keymaps/miryoku/keymap.c",709,0," ",c,content
+106,78075,"keyboards/annepro2/keymaps/miryoku/keymap.c",7630,0,"",c,selection_command
+107,78249,"keyboards/annepro2/keymaps/miryoku/keymap.c",7631,0,"",c,selection_mouse
+108,78252,"keyboards/annepro2/keymaps/miryoku/keymap.c",7630,0,"",c,selection_command
+109,187159,"keyboards/annepro2/keymaps/miryoku/keymap.c",7631,0,"",c,selection_mouse
+110,187168,"keyboards/annepro2/keymaps/miryoku/keymap.c",7630,0,"",c,selection_command
+111,242975,"/Users/franzsrambical/zmk-config-calmar-one/config/calmar-one.conf",0,0,"CONFIG_ZMK_HID_REPORT_TYPE_NKRO=y\nCONFIG_BT=y\nCONFIG_BT_PERIPHERAL=y\nCONFIG_BT_DEVICE_NAME=""Calmar One""\nCONFIG_BOARD_NICE_NANO=y\nCONFIG_ZMK_SLEEP=y\nCONFIG_ZMK_IDLE_SLEEP_TIMEOUT=720000\nCONFIG_ZMK_KSCAN_DEBOUNCE_PRESS_MS=5\nCONFIG_ZMK_POINTING=y",properties,tab
+112,261141,"/Users/franzsrambical/zmk-config-calmar-one/config/calmar-one.conf",243,0,"",properties,selection_mouse
+113,261149,"/Users/franzsrambical/zmk-config-calmar-one/config/calmar-one.conf",242,0,"",properties,selection_command
+114,263178,"/Users/franzsrambical/zmk-config-calmar-one/config/calmar-one.conf",222,21,"CONFIG_ZMK_POINTING=y",properties,selection_command
+115,271535,"keyboards/annepro2/keymaps/default/keymap.c",0,0,"",c,tab
+116,272802,"keyboards/annepro2/keymaps/default/keymap.c",1,0,"",c,selection_command
+117,273540,"keyboards/annepro2/keymaps/default/keymap.c",0,40," /* Copyright 2021 OpenAnnePro community",c,selection_command
+118,290302,"/Users/franzsrambical/zmk-config-calmar-one/config/calmar-one.json",0,0,"{\n ""layouts"": {\n ""default_layout"": {\n ""name"": ""default_layout"",\n ""layout"": [\n { ""row"": 1, ""col"": 0, ""x"": 0, ""y"": 1 },\n { ""row"": 1, ""col"": 1, ""x"": 1, ""y"": 1 },\n { ""row"": 1, ""col"": 2, ""x"": 2, ""y"": 1 },\n { ""row"": 1, ""col"": 3, ""x"": 3, ""y"": 1 },\n { ""row"": 1, ""col"": 4, ""x"": 4, ""y"": 1 },\n { ""row"": 1, ""col"": 5, ""x"": 5, ""y"": 1 },\n { ""row"": 1, ""col"": 6, ""x"": 6, ""y"": 1 },\n { ""row"": 1, ""col"": 7, ""x"": 7, ""y"": 1 },\n { ""row"": 1, ""col"": 8, ""x"": 8, ""y"": 1 },\n { ""row"": 1, ""col"": 9, ""x"": 9, ""y"": 1 },\n { ""row"": 1, ""col"": 10, ""x"": 10, ""y"": 1 },\n { ""row"": 1, ""col"": 11, ""x"": 11, ""y"": 1 },\n { ""row"": 2, ""col"": 0, ""x"": 0, ""y"": 2 },\n { ""row"": 2, ""col"": 1, ""x"": 1, ""y"": 2 },\n { ""row"": 2, ""col"": 2, ""x"": 2, ""y"": 2 },\n { ""row"": 2, ""col"": 3, ""x"": 3, ""y"": 2 },\n { ""row"": 2, ""col"": 4, ""x"": 4, ""y"": 2 },\n { ""row"": 2, ""col"": 5, ""x"": 5, ""y"": 2 },\n { ""row"": 2, ""col"": 6, ""x"": 6, ""y"": 2 },\n { ""row"": 2, ""col"": 7, ""x"": 7, ""y"": 2 },\n { ""row"": 2, ""col"": 8, ""x"": 8, ""y"": 2 },\n { ""row"": 2, ""col"": 9, ""x"": 9, ""y"": 2 },\n { ""row"": 2, ""col"": 10, ""x"": 10, ""y"": 2 },\n { ""row"": 2, ""col"": 11, ""x"": 11, ""y"": 2 },\n { ""row"": 3, ""col"": 0, ""x"": 0, ""y"": 3 },\n { ""row"": 3, ""col"": 1, ""x"": 1, ""y"": 3 },\n { ""row"": 3, ""col"": 2, ""x"": 2, ""y"": 3 },\n { ""row"": 3, ""col"": 3, ""x"": 3, ""y"": 3 },\n { ""row"": 3, ""col"": 4, ""x"": 4, ""y"": 3 },\n { ""row"": 3, ""col"": 5, ""x"": 5, ""y"": 3 },\n { ""row"": 3, ""col"": 6, ""x"": 6, ""y"": 3 },\n { ""row"": 3, ""col"": 7, ""x"": 7, ""y"": 3 },\n { ""row"": 3, ""col"": 8, ""x"": 8, ""y"": 3 },\n { ""row"": 3, ""col"": 9, ""x"": 9, ""y"": 3 },\n { ""row"": 3, ""col"": 10, ""x"": 10, ""y"": 3 },\n { ""row"": 3, ""col"": 11, ""x"": 11, ""y"": 3 },\n { 
""row"": 4, ""col"": 0, ""x"": 0, ""y"": 4 },\n { ""row"": 4, ""col"": 1, ""x"": 1, ""y"": 4 },\n { ""row"": 4, ""col"": 2, ""x"": 2, ""y"": 4 },\n { ""row"": 4, ""col"": 3, ""x"": 3, ""y"": 4 },\n { ""row"": 4, ""col"": 4, ""x"": 4, ""y"": 4 },\n { ""row"": 4, ""col"": 5, ""x"": 5, ""y"": 4 },\n { ""row"": 4, ""col"": 6, ""x"": 6, ""y"": 4 }\n ]\n }\n },\n ""sensors"": []\n}",json,tab
+119,291458,"/Users/franzsrambical/zmk-config-calmar-one/config/calmar-one.json",0,1,"{",json,selection_command
+120,300296,"/Users/franzsrambical/zmk-config-calmar-one/config/calmar-one.keymap",0,0,"#define ZMK_POINTING_DEFAULT_MOVE_VAL 1500 // default: 600\n#define ZMK_POINTING_DEFAULT_SCRL_VAL 25 // default: 10\n\n#include \n#include \n#include \n#include \n\n#define CONFIG_WIRELESS 1\n\n/ {\n keymap {\n compatible = ""zmk,keymap"";\n\n Colemak {\n bindings = <\n< 5 DELETE &kp Q &kp W &kp E &kp R &kp T &kp Y &kp U &kp I &kp O &kp P &kp EQUAL\n&mt LEFT_CONTROL ESCAPE &kp A &kp S &kp D &kp F &kp G &kp H &kp J &kp K &kp L &mt RIGHT_CONTROL SEMICOLON &trans\n&mt LEFT_SHIFT F13 < 3 Z < 4 X &kp C &kp V &kp B &kp N &kp M &kp COMMA &kp DOT &mt RIGHT_SHIFT SLASH &trans\n&mt LEFT_ALT TAB &kp SPACE &kp LEFT_GUI &trans < 2 ENTER < 1 BACKSPACE &trans\n >;\n };\n\n Special-Chars {\n bindings = <\n&trans &trans &kp DQT &kp APOSTROPHE &trans &trans &trans &kp LEFT_BRACKET &kp RIGHT_BRACKET &kp BSLH &kp MINUS &trans\n&trans &kp EXCL &kp AT &kp POUND &kp DOLLAR &kp PERCENT &kp CARET &kp AMPERSAND &kp ASTERISK &kp LPAR &kp RPAR &trans\n&trans &kp NUMBER_1 &kp N2 &kp N3 &kp N4 &kp N5 &kp N6 &kp N7 &kp N8 &kp N9 &kp N0 &trans\n&trans &trans &trans &trans &trans &trans &trans\n >;\n };\n\n Arrows {\n bindings = <\n&trans &trans &trans &trans &trans &trans &trans &trans &trans &trans &trans &trans\n&trans &trans &kp TILDE &kp PIPE &kp GRAVE &trans &kp LEFT &kp DOWN_ARROW &kp UP_ARROW &kp RIGHT &trans &trans\n&trans &trans &trans &trans &trans &trans &trans &trans &trans &trans &trans &trans\n&trans &trans &trans &trans &trans &bootloader &bootloader\n >;\n };\n\n Hyper {\n bindings = <\n&trans &kp LS(LA(Q)) &kp LS(LA(W)) &kp LS(LA(F)) &kp LS(LA(P)) &kp LS(LA(B)) &mkp MCLK &kp TAB &kp LS(TAB) &kp LS(LA(Y)) &kp LS(LA(SLASH)) &trans\n&trans &kp LS(LA(A)) &kp LS(LA(R)) &kp TAB &kp LS(TAB) &kp LS(LA(G)) &mmv MOVE_LEFT &mmv MOVE_DOWN &mmv MOVE_UP &mmv MOVE_RIGHT &kp LS(LA(O)) &trans\n&trans &kp LS(LA(Z)) &kp LS(LA(X)) &kp LA(LS(C)) &kp LS(LA(D)) &kp LA(LS(V)) &msc SCRL_LEFT &msc SCRL_DOWN &msc 
SCRL_UP &msc SCRL_RIGHT &kp LS(LA(MINUS)) &trans\n&kp LS(LA(F7)) &kp LS(LA(F8)) &kp LCMD &kp LS(LA(F10)) &mkp LCLK &mkp RCLK &kp LS(LA(F13))\n >;\n };\n\n Numpad {\n bindings = <\n&trans &trans &trans &trans &trans &trans &trans &kp N7 &kp N8 &kp N9 &trans &trans\n&trans &trans &trans &kp FSLH &kp EQUAL &trans &trans &kp N4 &kp N5 &kp N6 &kp PLUS &trans\n&trans &trans &trans &trans &kp STAR &trans &kp NUMBER_0 &kp N1 &kp N2 &kp N3 &kp MINUS &trans\n&trans &trans &trans &trans &trans &trans &trans\n >;\n };\n\n FKeys-Media {\n bindings = <\n&trans &kp F1 &kp F2 &kp F3 &kp F4 &kp F5 &kp F6 &kp F7 &kp F8 &kp F9 &kp F10 &kp F11\n&trans &trans &trans &trans &trans &trans &trans &kp C_BRIGHTNESS_DEC &kp C_BRIGHTNESS_INC &trans &trans &bootloader\n&trans &trans &trans &trans &trans &trans &trans &kp C_MUTE &trans &trans &trans &trans\n&kp C_PREV &kp C_PLAY_PAUSE &kp C_NEXT &trans &kp C_VOLUME_DOWN &kp C_VOLUME_UP &trans\n >;\n };\n\n Numbers {\n bindings = <\n&trans &trans &trans &trans &trans &trans &trans &trans &trans &trans &trans &trans\n&trans &kp NUMBER_1 &kp NUMBER_2 &kp N3 &kp N4 &kp N5 &kp N6 &kp N7 &kp N8 &kp N9 &kp N0 &trans\n&trans &trans &trans &trans &trans &trans &trans &trans &trans &trans &trans &trans\n&trans &trans &trans &trans &trans &trans &trans\n >;\n };\n\n Bluetooth {\n bindings = <\n&trans &bt BT_CLR &bt BT_NXT &bt BT_PRV &bt BT_SEL 0 &trans &trans &trans &trans &trans &trans &trans\n&trans &trans &trans &trans &bt BT_SEL 1 &bt BT_SEL 2 &bt BT_SEL 3 &bt BT_SEL 4 &trans &trans &trans &trans\n&trans &trans &trans &trans &to 0 &trans &trans &trans &trans &trans &trans &trans\n&trans &trans &trans &trans &trans &trans &trans\n >;\n };\n };\n};\n",plaintext,tab
+121,301005,"/Users/franzsrambical/zmk-config-calmar-one/config/calmar-one.keymap",0,59,"#define ZMK_POINTING_DEFAULT_MOVE_VAL 1500 // default: 600",plaintext,selection_command
+122,307663,"/Users/franzsrambical/zmk-config-calmar-one/config/calmar-one.keymap",249,0,"",plaintext,selection_mouse
+123,307671,"/Users/franzsrambical/zmk-config-calmar-one/config/calmar-one.keymap",248,0,"",plaintext,selection_command
+124,309354,"/Users/franzsrambical/zmk-config-calmar-one/config/calmar-one.json",0,0,"",json,tab
+125,313515,"/Users/franzsrambical/zmk-config-calmar-one/config/calmar-one.keymap",0,0,"",plaintext,tab
+126,836461,"keyboards/annepro2/keymaps/calmar_one/keymap.c",0,0,"/* Calmar One keymap for Anne Pro 2\n *\n * Approximates the user's ZMK layout:\n * - Layers: BASE, SYM (Special-Chars), NAV (Arrows), HYPER (Mouse/Hyper combos), NUM (Numpad), FKEY (F-keys/Media), NUMBERS, BT (Bluetooth)\n */\n\n#include QMK_KEYBOARD_H\n\nenum anne_pro_layers {\n BASE,\n SYM, // Special-Chars\n NAV, // Arrows / system\n HYPER, // Mouse + Hyper combos\n NUM, // Numpad\n FKEY, // F-keys / Media / Brightness\n NUMBERS, // Number row on home row\n BT // Bluetooth controls\n};\n\n// Convenience alias for transparent\n#define _______ KC_TRNS\n\n// clang-format off\nconst uint16_t keymaps[][MATRIX_ROWS][MATRIX_COLS] = {\n /* BASE (QWERTY-ish, with mods and layer-taps approximating ZMK) */\n [BASE] = LAYOUT_60_ansi(\n LT(FKEY, KC_DEL), KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, KC_7, KC_8, KC_9, KC_0, KC_MINS, KC_EQL, KC_BSPC,\n MT(MOD_LALT, KC_TAB), KC_Q, KC_W, KC_E, KC_R, KC_T, KC_Y, KC_U, KC_I, KC_O, KC_P, KC_LBRC, KC_RBRC, KC_BSLS,\n MT(MOD_LCTL, KC_ESC), KC_A, KC_S, KC_D, KC_F, KC_G, KC_H, KC_J, KC_K, KC_L, RCTL_T(KC_SCLN), KC_QUOT, KC_ENT,\n LSFT_T(KC_F13), LT(HYPER, KC_Z), LT(NUM, KC_X), KC_C, KC_V, KC_B, KC_N, KC_M, KC_COMM, KC_DOT, RSFT_T(KC_SLSH), RSFT_T(KC_SLSH),\n KC_LCTL, KC_LGUI, KC_LALT, KC_SPC, KC_RALT, LT(NAV, KC_ENT), LT(SYM, KC_BSPC), RCTL_T(KC_RGHT)\n ),\n\n /* SYM (Special-Chars) */\n [SYM] = LAYOUT_60_ansi(\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, KC_EXLM, KC_AT, KC_HASH, KC_DLR, KC_PERC, KC_CIRC, KC_AMPR, KC_LPRN, KC_RPRN, KC_MINS, KC_LBRC, KC_RBRC, KC_BSLS,\n _______, KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, KC_7, KC_8, KC_9, KC_0, KC_EQL, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______\n ),\n\n /* NAV (Arrows / System) */\n [NAV] = 
LAYOUT_60_ansi(\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, KC_LEFT, KC_DOWN, KC_UP, KC_RGHT, _______, _______, _______,\n _______, KC_BRID, KC_BRIU, KC_MUTE, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______\n ),\n\n /* HYPER (Mouse / Hyper combos) */\n [HYPER] = LAYOUT_60_ansi(\n _______, S(A(KC_Q)), S(A(KC_W)), S(A(KC_F)), S(A(KC_P)), S(A(KC_B)), KC_BTN3, KC_TAB, S(KC_TAB), S(A(KC_Y)), S(A(KC_SLSH)), _______, _______,\n _______, S(A(KC_A)), S(A(KC_R)), KC_TAB, S(KC_TAB), S(A(KC_G)), KC_MS_L, KC_MS_D, KC_MS_U, KC_MS_R, S(A(KC_O)), _______, _______,\n _______, S(A(KC_Z)), S(A(KC_X)), A(S(KC_C)), S(A(KC_D)), A(S(KC_V)), KC_WH_L, KC_WH_D, KC_WH_U, KC_WH_R, S(A(KC_MINS)), _______,\n S(A(KC_F7)), S(A(KC_F8)), KC_LGUI, S(A(KC_F10)), KC_BTN1, KC_BTN2, S(A(KC_F13)), _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______\n ),\n\n /* NUM (Numpad on right side) */\n [NUM] = LAYOUT_60_ansi(\n _______, _______, _______, _______, _______, _______, _______, KC_P7, KC_P8, KC_P9, _______, _______, _______, _______,\n _______, _______, _______, KC_PSLS, KC_EQL, _______, _______, KC_P4, KC_P5, KC_P6, KC_PPLS, _______, _______, _______,\n _______, _______, _______, _______, KC_PAST, _______, KC_P0, KC_P1, KC_P2, KC_P3, KC_PMNS, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______\n ),\n\n /* FKEY (F-Keys / Media / Brightness) */\n [FKEY] = LAYOUT_60_ansi(\n _______, KC_F1, KC_F2, KC_F3, KC_F4, KC_F5, KC_F6, KC_F7, 
KC_F8, KC_F9, KC_F10, KC_F11, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, KC_BRID, KC_BRIU, _______, _______, QK_BOOT, _______,\n _______, _______, _______, _______, _______, _______, _______, KC_MUTE, _______, _______, _______, _______, _______,\n KC_MPRV, KC_MPLY, KC_MNXT, _______, KC_VOLD, KC_VOLU, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______\n ),\n\n /* NUMBERS (Numbers on home row) */\n [NUMBERS] = LAYOUT_60_ansi(\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, KC_7, KC_8, KC_9, KC_0, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______\n ),\n\n /* BT (Bluetooth controls for Anne Pro 2) */\n [BT] = LAYOUT_60_ansi(\n _______, KC_AP2_BT_UNPAIR, _______, _______, KC_AP2_USB, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, KC_AP2_BT1, KC_AP2_BT2, KC_AP2_BT3, KC_AP2_BT4, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______\n ),\n};\n// clang-format on\n\n\n",c,tab
+127,843893,"keyboards/annepro2/keymaps/calmar_one/rules.mk",0,0,"MOUSEKEY_ENABLE = yes\nEXTRAKEY_ENABLE = yes\nCONSOLE_ENABLE = no\nCOMMAND_ENABLE = no\n\n",makefile,tab
+128,849620,"keyboards/annepro2/keymaps/calmar_one/config.h",0,0,"#pragma once\n\n#define TAPPING_TERM 200\n#define PERMISSIVE_HOLD\n\n// Mouse keys: keep defaults; adjust if needed after testing\n\n",c,tab
+129,854047,"keyboards/annepro2/keymaps/calmar_one/config.h",126,0,"",c,selection_mouse
+130,902794,"keyboards/annepro2/keymaps/calmar_one/keymap.c",0,0,"",c,tab
+131,904411,"keyboards/annepro2/keymaps/calmar_one/keymap.c",523,0,"",c,selection_mouse
+132,1011899,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4315,120," _______, _______, _______, _______, _______, _______, _______, KC_BRID, KC_BRIU, _______, _______, QK_BOOT, _______, _______,",c,content
+133,1011899,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2781,439," _______, S(A(KC_Q)), S(A(KC_W)), S(A(KC_F)), S(A(KC_P)), S(A(KC_B)), KC_BTN3, KC_TAB, S(KC_TAB), S(A(KC_Y)), S(A(KC_SLSH)), _______, _______, _______,\n _______, S(A(KC_A)), S(A(KC_R)), KC_TAB, S(KC_TAB), S(A(KC_G)), KC_MS_L, KC_MS_D, KC_MS_U, KC_MS_R, S(A(KC_O)), _______, _______, _______,\n _______, S(A(KC_Z)), S(A(KC_X)), A(S(KC_C)), S(A(KC_D)), A(S(KC_V)), KC_WH_L, KC_WH_D, KC_WH_U, KC_WH_R, S(A(KC_MINS)), _______, _______,",c,content
+134,1027048,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4812,0,"",c,selection_mouse
+135,1056026,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4812,0,"franzsrambical@MBF6N9WFVKFV qmk_firmware % make annepro2/c18:calmar_one\nMaking annepro2/c18 with keymap calmar_one\n\narm-none-eabi-gcc (Arm GNU Toolchain 14.3.Rel1 (Build arm-14.174)) 14.3.1 20250623\nCopyright (C) 2024 Free Software Foundation, Inc.\nThis is free software; see the source for copying conditions. There is NO\nwarranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.\n\nCompiling: quantum/keymap_introspection.c In file included from ./keyboards/annepro2/keymaps/calmar_one/keymap.c:7,\n from quantum/keymap_introspection.c:9:\n./keyboards/annepro2/keymaps/calmar_one/keymap.c:54:74: error: 'KC_BTN3' undeclared here (not in a function); did you mean 'MS_BTN3'?\n 54 | _______, S(A(KC_Q)), S(A(KC_W)), S(A(KC_F)), S(A(KC_P)), S(A(KC_B)), KC_BTN3, KC_TAB, S(KC_TAB), S(A(KC_Y)), S(A(KC_SLSH)), _______, _______, _______,\n | ^~~~~~~\n./.build/obj_annepro2_c18_calmar_one/src/default_keyboard.h:35:37: note: in definition of macro 'LAYOUT_60_ansi'\n 35 | { k0A, k0B, k0C, k0D, k0E, k0F, k0G, k0H, k0I, k0J, k0K, k0L, k0M, k0N }, \\n | ^~~\n./keyboards/annepro2/keymaps/calmar_one/keymap.c:55:75: error: 'KC_MS_L' undeclared here (not in a function); did you mean 'KC_MSEL'?\n 55 | _______, S(A(KC_A)), S(A(KC_R)), KC_TAB, S(KC_TAB), S(A(KC_G)), KC_MS_L, KC_MS_D, KC_MS_U, KC_MS_R, S(A(KC_O)), _______, _______, _______,\n | ^~~~~~~\n./.build/obj_annepro2_c18_calmar_one/src/default_keyboard.h:36:37: note: in definition of macro 'LAYOUT_60_ansi'\n 36 | { k1A, k1B, k1C, k1D, k1E, k1F, k1G, k1H, k1I, k1J, k1K, k1L, k1M, k1N }, \\n | ^~~\n./keyboards/annepro2/keymaps/calmar_one/keymap.c:55:85: error: 'KC_MS_D' undeclared here (not in a function); did you mean 'KC_MRWD'?\n 55 | _______, S(A(KC_A)), S(A(KC_R)), KC_TAB, S(KC_TAB), S(A(KC_G)), KC_MS_L, KC_MS_D, KC_MS_U, KC_MS_R, S(A(KC_O)), _______, _______, _______,\n | 
^~~~~~~\n./.build/obj_annepro2_c18_calmar_one/src/default_keyboard.h:36:42: note: in definition of macro 'LAYOUT_60_ansi'\n 36 | { k1A, k1B, k1C, k1D, k1E, k1F, k1G, k1H, k1I, k1J, k1K, k1L, k1M, k1N }, \\n | ^~~\n./keyboards/annepro2/keymaps/calmar_one/keymap.c:55:95: error: 'KC_MS_U' undeclared here (not in a function); did you mean 'KC_MSEL'?\n 55 | _______, S(A(KC_A)), S(A(KC_R)), KC_TAB, S(KC_TAB), S(A(KC_G)), KC_MS_L, KC_MS_D, KC_MS_U, KC_MS_R, S(A(KC_O)), _______, _______, _______,\n | ^~~~~~~\n./.build/obj_annepro2_c18_calmar_one/src/default_keyboard.h:36:47: note: in definition of macro 'LAYOUT_60_ansi'\n 36 | { k1A, k1B, k1C, k1D, k1E, k1F, k1G, k1H, k1I, k1J, k1K, k1L, k1M, k1N }, \\n | ^~~\n./keyboards/annepro2/keymaps/calmar_one/keymap.c:55:106: error: 'KC_MS_R' undeclared here (not in a function); did you mean 'KC_MSEL'?\n 55 | _______, S(A(KC_A)), S(A(KC_R)), KC_TAB, S(KC_TAB), S(A(KC_G)), KC_MS_L, KC_MS_D, KC_MS_U, KC_MS_R, S(A(KC_O)), _______, _______, _______,\n | ^~~~~~~\n./.build/obj_annepro2_c18_calmar_one/src/default_keyboard.h:36:52: note: in definition of macro 'LAYOUT_60_ansi'\n 36 | { k1A, k1B, k1C, k1D, k1E, k1F, k1G, k1H, k1I, k1J, k1K, k1L, k1M, k1N }, \\n | ^~~\n./keyboards/annepro2/keymaps/calmar_one/keymap.c:56:75: error: 'KC_WH_L' undeclared here (not in a function); did you mean 'KC_WHOM'?\n 56 | _______, S(A(KC_Z)), S(A(KC_X)), A(S(KC_C)), S(A(KC_D)), A(S(KC_V)), KC_WH_L, KC_WH_D, KC_WH_U, KC_WH_R, S(A(KC_MINS)), _______, _______,\n | ^~~~~~~\n./.build/obj_annepro2_c18_calmar_one/src/default_keyboard.h:37:37: note: in definition of macro 'LAYOUT_60_ansi'\n 37 | { k2A, k2B, k2C, k2D, k2E, k2F, k2G, k2H, k2I, k2J, k2K, k2L, k2M, XXX }, \\n | ^~~\n./keyboards/annepro2/keymaps/calmar_one/keymap.c:56:85: error: 'KC_WH_D' undeclared here (not in a function); did you mean 'KC_WFWD'?\n 56 | _______, S(A(KC_Z)), S(A(KC_X)), A(S(KC_C)), S(A(KC_D)), A(S(KC_V)), KC_WH_L, KC_WH_D, KC_WH_U, KC_WH_R, S(A(KC_MINS)), _______, _______,\n | 
^~~~~~~\n./.build/obj_annepro2_c18_calmar_one/src/default_keyboard.h:37:42: note: in definition of macro 'LAYOUT_60_ansi'\n 37 | { k2A, k2B, k2C, k2D, k2E, k2F, k2G, k2H, k2I, k2J, k2K, k2L, k2M, XXX }, \\n | ^~~\n./keyboards/annepro2/keymaps/calmar_one/keymap.c:56:95: error: 'KC_WH_U' undeclared here (not in a function); did you mean 'KC_WHOM'?\n 56 | _______, S(A(KC_Z)), S(A(KC_X)), A(S(KC_C)), S(A(KC_D)), A(S(KC_V)), KC_WH_L, KC_WH_D, KC_WH_U, KC_WH_R, S(A(KC_MINS)), _______, _______,\n | ^~~~~~~\n./.build/obj_annepro2_c18_calmar_one/src/default_keyboard.h:37:47: note: in definition of macro 'LAYOUT_60_ansi'\n 37 | { k2A, k2B, k2C, k2D, k2E, k2F, k2G, k2H, k2I, k2J, k2K, k2L, k2M, XXX }, \\n | ^~~\n./keyboards/annepro2/keymaps/calmar_one/keymap.c:56:106: error: 'KC_WH_R' undeclared here (not in a function); did you mean 'KC_WHOM'?\n 56 | _______, S(A(KC_Z)), S(A(KC_X)), A(S(KC_C)), S(A(KC_D)), A(S(KC_V)), KC_WH_L, KC_WH_D, KC_WH_U, KC_WH_R, S(A(KC_MINS)), _______, _______,\n | ^~~~~~~\n./.build/obj_annepro2_c18_calmar_one/src/default_keyboard.h:37:52: note: in definition of macro 'LAYOUT_60_ansi'\n 37 | { k2A, k2B, k2C, k2D, k2E, k2F, k2G, k2H, k2I, k2J, k2K, k2L, k2M, XXX }, \\n | ^~~\n./keyboards/annepro2/keymaps/calmar_one/keymap.c:57:64: error: 'KC_BTN1' undeclared here (not in a function); did you mean 'MS_BTN1'?\n 57 | S(A(KC_F7)), S(A(KC_F8)), KC_LGUI, S(A(KC_F10)), KC_BTN1, KC_BTN2, S(A(KC_F13)), _______, _______, _______, _______, _______,\n | ^~~~~~~\n./.build/obj_annepro2_c18_calmar_one/src/default_keyboard.h:38:32: note: in definition of macro 'LAYOUT_60_ansi'\n 38 | { k3A, XXX, k3C, k3D, k3E, k3F, k3G, k3H, k3I, k3J, k3K, k3L, k3M, XXX }, \\n | ^~~\n./keyboards/annepro2/keymaps/calmar_one/keymap.c:57:75: error: 'KC_BTN2' undeclared here (not in a function); did you mean 'MS_BTN2'?\n 57 | S(A(KC_F7)), S(A(KC_F8)), KC_LGUI, S(A(KC_F10)), KC_BTN1, KC_BTN2, S(A(KC_F13)), _______, _______, _______, _______, _______,\n | 
^~~~~~~\n./.build/obj_annepro2_c18_calmar_one/src/default_keyboard.h:38:37: note: in definition of macro 'LAYOUT_60_ansi'\n 38 | { k3A, XXX, k3C, k3D, k3E, k3F, k3G, k3H, k3I, k3J, k3K, k3L, k3M, XXX }, \\n | ^~~\n [ERRORS]\n | \n | \n | \nmake[1]: *** [.build/obj_annepro2_c18_calmar_one/quantum/keymap_introspection.o] Error 1\nmake: Make finished with errors\n",c,content
+136,1056037,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12723,0," ",c,content
+137,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12691,0," ",c,content
+138,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12602,0," ",c,content
+139,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12598,1," ",c,content
+140,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12594,1," ",c,content
+141,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12590,1," ",c,content
+142,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12580,1," ",c,content
+143,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12532,6," ",c,content
+144,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12444,3," ",c,content
+145,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12331,0," ",c,content
+146,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12241,6," ",c,content
+147,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12090,3," ",c,content
+148,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",11956,0," ",c,content
+149,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",11913,6," ",c,content
+150,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",11825,3," ",c,content
+151,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",11712,0," ",c,content
+152,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",11633,6," ",c,content
+153,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",11482,3," ",c,content
+154,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",11348,0," ",c,content
+155,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",11285,6," ",c,content
+156,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",11197,3," ",c,content
+157,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",11084,0," ",c,content
+158,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",10963,6," ",c,content
+159,1056038,"keyboards/annepro2/keymaps/calmar_one/keymap.c",10805,3," ",c,content
+160,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",10670,0," ",c,content
+161,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",10612,6," ",c,content
+162,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",10524,3," ",c,content
+163,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",10411,0," ",c,content
+164,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",10301,6," ",c,content
+165,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",10143,3," ",c,content
+166,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",10009,0," ",c,content
+167,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",9956,6," ",c,content
+168,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",9868,3," ",c,content
+169,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",9755,0," ",c,content
+170,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",9655,6," ",c,content
+171,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",9497,3," ",c,content
+172,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",9363,0," ",c,content
+173,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",9315,6," ",c,content
+174,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",9227,3," ",c,content
+175,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",9114,0," ",c,content
+176,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",9024,6," ",c,content
+177,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",8866,3," ",c,content
+178,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",8732,0," ",c,content
+179,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",8669,6," ",c,content
+180,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",8581,3," ",c,content
+181,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",8468,0," ",c,content
+182,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",8347,6," ",c,content
+183,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",8180,3," ",c,content
+184,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",8045,0," ",c,content
+185,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7987,6," ",c,content
+186,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7899,3," ",c,content
+187,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7786,0," ",c,content
+188,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7676,6," ",c,content
+189,1056039,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7509,3," ",c,content
+190,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7375,0," ",c,content
+191,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7322,6," ",c,content
+192,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7234,3," ",c,content
+193,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7121,0," ",c,content
+194,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7021,6," ",c,content
+195,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6854,3," ",c,content
+196,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6720,0," ",c,content
+197,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6672,6," ",c,content
+198,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6584,3," ",c,content
+199,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6471,0," ",c,content
+200,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6381,6," ",c,content
+201,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6214,3," ",c,content
+202,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6080,0," ",c,content
+203,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6032,6," ",c,content
+204,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5944,3," ",c,content
+205,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5831,0," ",c,content
+206,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5742,6," ",c,content
+207,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5576,3," ",c,content
+208,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5442,0," ",c,content
+209,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5386,17," ",c,content
+210,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5213,0," ",c,content
+211,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5212,0," ",c,content
+212,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5136,0," ",c,content
+213,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5061,0," ",c,content
+214,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5011,0," ",c,content
+215,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4928,0," ",c,content
+216,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4927,0," ",c,content
+217,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4884,0," ",c,content
+218,1056040,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4812,0," ",c,content
+219,1057420,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12887,2,"",c,content
+220,1057422,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12853,2,"",c,content
+221,1057422,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12762,2,"",c,content
+222,1057422,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12756,3," ",c,content
+223,1057422,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12750,3," ",c,content
+224,1057422,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12744,3," ",c,content
+225,1057422,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12732,3," ",c,content
+226,1057422,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12682,8," ",c,content
+227,1057422,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12592,5," ",c,content
+228,1057422,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12477,2,"",c,content
+229,1057423,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12385,8," ",c,content
+230,1057423,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12232,5," ",c,content
+231,1057423,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12096,2,"",c,content
+232,1057423,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12051,8," ",c,content
+233,1057423,"keyboards/annepro2/keymaps/calmar_one/keymap.c",11961,5," ",c,content
+234,1057423,"keyboards/annepro2/keymaps/calmar_one/keymap.c",11846,2,"",c,content
+235,1057423,"keyboards/annepro2/keymaps/calmar_one/keymap.c",11765,8," ",c,content
+236,1057423,"keyboards/annepro2/keymaps/calmar_one/keymap.c",11612,5," ",c,content
+237,1057423,"keyboards/annepro2/keymaps/calmar_one/keymap.c",11476,2,"",c,content
+238,1057423,"keyboards/annepro2/keymaps/calmar_one/keymap.c",11411,8," ",c,content
+239,1057423,"keyboards/annepro2/keymaps/calmar_one/keymap.c",11321,5," ",c,content
+240,1057423,"keyboards/annepro2/keymaps/calmar_one/keymap.c",11206,2,"",c,content
+241,1057423,"keyboards/annepro2/keymaps/calmar_one/keymap.c",11083,8," ",c,content
+242,1057423,"keyboards/annepro2/keymaps/calmar_one/keymap.c",10923,5," ",c,content
+243,1057423,"keyboards/annepro2/keymaps/calmar_one/keymap.c",10786,2,"",c,content
+244,1057423,"keyboards/annepro2/keymaps/calmar_one/keymap.c",10726,8," ",c,content
+245,1057423,"keyboards/annepro2/keymaps/calmar_one/keymap.c",10636,5," ",c,content
+246,1057423,"keyboards/annepro2/keymaps/calmar_one/keymap.c",10521,2,"",c,content
+247,1057424,"keyboards/annepro2/keymaps/calmar_one/keymap.c",10409,8," ",c,content
+248,1057424,"keyboards/annepro2/keymaps/calmar_one/keymap.c",10249,5," ",c,content
+249,1057424,"keyboards/annepro2/keymaps/calmar_one/keymap.c",10113,2,"",c,content
+250,1057424,"keyboards/annepro2/keymaps/calmar_one/keymap.c",10058,8," ",c,content
+251,1057424,"keyboards/annepro2/keymaps/calmar_one/keymap.c",9968,5," ",c,content
+252,1057424,"keyboards/annepro2/keymaps/calmar_one/keymap.c",9853,2,"",c,content
+253,1057424,"keyboards/annepro2/keymaps/calmar_one/keymap.c",9751,8," ",c,content
+254,1057424,"keyboards/annepro2/keymaps/calmar_one/keymap.c",9591,5," ",c,content
+255,1057424,"keyboards/annepro2/keymaps/calmar_one/keymap.c",9455,2,"",c,content
+256,1057424,"keyboards/annepro2/keymaps/calmar_one/keymap.c",9405,8," ",c,content
+257,1057424,"keyboards/annepro2/keymaps/calmar_one/keymap.c",9315,5," ",c,content
+258,1057424,"keyboards/annepro2/keymaps/calmar_one/keymap.c",9200,2,"",c,content
+259,1057424,"keyboards/annepro2/keymaps/calmar_one/keymap.c",9108,8," ",c,content
+260,1057424,"keyboards/annepro2/keymaps/calmar_one/keymap.c",8948,5," ",c,content
+261,1057424,"keyboards/annepro2/keymaps/calmar_one/keymap.c",8812,2,"",c,content
+262,1057424,"keyboards/annepro2/keymaps/calmar_one/keymap.c",8747,8," ",c,content
+263,1057424,"keyboards/annepro2/keymaps/calmar_one/keymap.c",8657,5," ",c,content
+264,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",8542,2,"",c,content
+265,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",8419,8," ",c,content
+266,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",8250,5," ",c,content
+267,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",8113,2,"",c,content
+268,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",8053,8," ",c,content
+269,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7963,5," ",c,content
+270,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7848,2,"",c,content
+271,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7736,8," ",c,content
+272,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7567,5," ",c,content
+273,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7431,2,"",c,content
+274,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7376,8," ",c,content
+275,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7286,5," ",c,content
+276,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7171,2,"",c,content
+277,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7069,8," ",c,content
+278,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6900,5," ",c,content
+279,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6764,2,"",c,content
+280,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6714,8," ",c,content
+281,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6624,5," ",c,content
+282,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6509,2,"",c,content
+283,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6417,8," ",c,content
+284,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6248,5," ",c,content
+285,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6112,2,"",c,content
+286,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6062,8," ",c,content
+287,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5972,5," ",c,content
+288,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5857,2,"",c,content
+289,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5766,8," ",c,content
+290,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5598,5," ",c,content
+291,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5462,2,"",c,content
+292,1057425,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5404,19," ",c,content
+293,1057426,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5229,2,"",c,content
+294,1057426,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5226,2,"",c,content
+295,1057426,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5148,2,"",c,content
+296,1057426,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5071,2,"",c,content
+297,1057426,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5019,2,"",c,content
+298,1057426,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4934,2,"",c,content
+299,1057426,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4931,2,"",c,content
+300,1057426,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4886,2,"",c,content
+301,1057426,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4812,2,"",c,content
+302,1059341,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12593,0,"",c,selection_mouse
+303,1059356,"keyboards/annepro2/keymaps/calmar_one/keymap.c",12592,0,"",c,selection_command
+304,1060163,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4812,7911,"",c,content
+305,1064519,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4812,0,"franzsrambical@MBF6N9WFVKFV qmk_firmware % make annepro2/c18:calmar_one\nMaking annepro2/c18 with keymap calmar_one\n\narm-none-eabi-gcc (Arm GNU Toolchain 14.3.Rel1 (Build arm-14.174)) 14.3.1 20250623\nCopyright (C) 2024 Free Software Foundation, Inc.\nThis is free software; see the source for copying conditions. There is NO\nwarranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.\n\nCompiling: quantum/keymap_introspection.c In file included from ./keyboards/annepro2/keymaps/calmar_one/keymap.c:7,\n from quantum/keymap_introspection.c:9:\n./keyboards/annepro2/keymaps/calmar_one/keymap.c:54:74: error: 'KC_BTN3' undeclared here (not in a function); did you mean 'MS_BTN3'?\n 54 | _______, S(A(KC_Q)), S(A(KC_W)), S(A(KC_F)), S(A(KC_P)), S(A(KC_B)), KC_BTN3, KC_TAB, S(KC_TAB), S(A(KC_Y)), S(A(KC_SLSH)), _______, _______, _______,\n | ^~~~~~~\n./.build/obj_annepro2_c18_calmar_one/src/default_keyboard.h:35:37: note: in definition of macro 'LAYOUT_60_ansi'\n 35 | { k0A, k0B, k0C, k0D, k0E, k0F, k0G, k0H, k0I, k0J, k0K, k0L, k0M, k0N }, \\n | ^~~\n./keyboards/annepro2/keymaps/calmar_one/keymap.c:55:75: error: 'KC_MS_L' undeclared here (not in a function); did you mean 'KC_MSEL'?\n 55 | _______, S(A(KC_A)), S(A(KC_R)), KC_TAB, S(KC_TAB), S(A(KC_G)), KC_MS_L, KC_MS_D, KC_MS_U, KC_MS_R, S(A(KC_O)), _______, _______, _______,\n | ^~~~~~~\n./.build/obj_annepro2_c18_calmar_one/src/default_keyboard.h:36:37: note: in definition of macro 'LAYOUT_60_ansi'\n 36 | { k1A, k1B, k1C, k1D, k1E, k1F, k1G, k1H, k1I, k1J, k1K, k1L, k1M, k1N }, \\n | ^~~\n./keyboards/annepro2/keymaps/calmar_one/keymap.c:55:85: error: 'KC_MS_D' undeclared here (not in a function); did you mean 'KC_MRWD'?\n 55 | _______, S(A(KC_A)), S(A(KC_R)), KC_TAB, S(KC_TAB), S(A(KC_G)), KC_MS_L, KC_MS_D, KC_MS_U, KC_MS_R, S(A(KC_O)), _______, _______, _______,\n | 
^~~~~~~\n./.build/obj_annepro2_c18_calmar_one/src/default_keyboard.h:36:42: note: in definition of macro 'LAYOUT_60_ansi'\n 36 | { k1A, k1B, k1C, k1D, k1E, k1F, k1G, k1H, k1I, k1J, k1K, k1L, k1M, k1N }, \\n | ^~~\n./keyboards/annepro2/keymaps/calmar_one/keymap.c:55:95: error: 'KC_MS_U' undeclared here (not in a function); did you mean 'KC_MSEL'?\n 55 | _______, S(A(KC_A)), S(A(KC_R)), KC_TAB, S(KC_TAB), S(A(KC_G)), KC_MS_L, KC_MS_D, KC_MS_U, KC_MS_R, S(A(KC_O)), _______, _______, _______,\n | ^~~~~~~\n./.build/obj_annepro2_c18_calmar_one/src/default_keyboard.h:36:47: note: in definition of macro 'LAYOUT_60_ansi'\n 36 | { k1A, k1B, k1C, k1D, k1E, k1F, k1G, k1H, k1I, k1J, k1K, k1L, k1M, k1N }, \\n | ^~~\n./keyboards/annepro2/keymaps/calmar_one/keymap.c:55:106: error: 'KC_MS_R' undeclared here (not in a function); did you mean 'KC_MSEL'?\n 55 | _______, S(A(KC_A)), S(A(KC_R)), KC_TAB, S(KC_TAB), S(A(KC_G)), KC_MS_L, KC_MS_D, KC_MS_U, KC_MS_R, S(A(KC_O)), _______, _______, _______,\n | ^~~~~~~\n./.build/obj_annepro2_c18_calmar_one/src/default_keyboard.h:36:52: note: in definition of macro 'LAYOUT_60_ansi'\n 36 | { k1A, k1B, k1C, k1D, k1E, k1F, k1G, k1H, k1I, k1J, k1K, k1L, k1M, k1N }, \\n | ^~~\n./keyboards/annepro2/keymaps/calmar_one/keymap.c:56:75: error: 'KC_WH_L' undeclared here (not in a function); did you mean 'KC_WHOM'?\n 56 | _______, S(A(KC_Z)), S(A(KC_X)), A(S(KC_C)), S(A(KC_D)), A(S(KC_V)), KC_WH_L, KC_WH_D, KC_WH_U, KC_WH_R, S(A(KC_MINS)), _______, _______,\n | ^~~~~~~\n./.build/obj_annepro2_c18_calmar_one/src/default_keyboard.h:37:37: note: in definition of macro 'LAYOUT_60_ansi'\n 37 | { k2A, k2B, k2C, k2D, k2E, k2F, k2G, k2H, k2I, k2J, k2K, k2L, k2M, XXX }, \\n | ^~~\n./keyboards/annepro2/keymaps/calmar_one/keymap.c:56:85: error: 'KC_WH_D' undeclared here (not in a function); did you mean 'KC_WFWD'?\n 56 | _______, S(A(KC_Z)), S(A(KC_X)), A(S(KC_C)), S(A(KC_D)), A(S(KC_V)), KC_WH_L, KC_WH_D, KC_WH_U, KC_WH_R, S(A(KC_MINS)), _______, _______,\n | 
^~~~~~~\n./.build/obj_annepro2_c18_calmar_one/src/default_keyboard.h:37:42: note: in definition of macro 'LAYOUT_60_ansi'\n 37 | { k2A, k2B, k2C, k2D, k2E, k2F, k2G, k2H, k2I, k2J, k2K, k2L, k2M, XXX }, \\n | ^~~\n./keyboards/annepro2/keymaps/calmar_one/keymap.c:56:95: error: 'KC_WH_U' undeclared here (not in a function); did you mean 'KC_WHOM'?\n 56 | _______, S(A(KC_Z)), S(A(KC_X)), A(S(KC_C)), S(A(KC_D)), A(S(KC_V)), KC_WH_L, KC_WH_D, KC_WH_U, KC_WH_R, S(A(KC_MINS)), _______, _______,\n | ^~~~~~~\n./.build/obj_annepro2_c18_calmar_one/src/default_keyboard.h:37:47: note: in definition of macro 'LAYOUT_60_ansi'\n 37 | { k2A, k2B, k2C, k2D, k2E, k2F, k2G, k2H, k2I, k2J, k2K, k2L, k2M, XXX }, \\n | ^~~\n./keyboards/annepro2/keymaps/calmar_one/keymap.c:56:106: error: 'KC_WH_R' undeclared here (not in a function); did you mean 'KC_WHOM'?\n 56 | _______, S(A(KC_Z)), S(A(KC_X)), A(S(KC_C)), S(A(KC_D)), A(S(KC_V)), KC_WH_L, KC_WH_D, KC_WH_U, KC_WH_R, S(A(KC_MINS)), _______, _______,\n | ^~~~~~~\n./.build/obj_annepro2_c18_calmar_one/src/default_keyboard.h:37:52: note: in definition of macro 'LAYOUT_60_ansi'\n 37 | { k2A, k2B, k2C, k2D, k2E, k2F, k2G, k2H, k2I, k2J, k2K, k2L, k2M, XXX }, \\n | ^~~\n./keyboards/annepro2/keymaps/calmar_one/keymap.c:57:64: error: 'KC_BTN1' undeclared here (not in a function); did you mean 'MS_BTN1'?\n 57 | S(A(KC_F7)), S(A(KC_F8)), KC_LGUI, S(A(KC_F10)), KC_BTN1, KC_BTN2, S(A(KC_F13)), _______, _______, _______, _______, _______,\n | ^~~~~~~\n./.build/obj_annepro2_c18_calmar_one/src/default_keyboard.h:38:32: note: in definition of macro 'LAYOUT_60_ansi'\n 38 | { k3A, XXX, k3C, k3D, k3E, k3F, k3G, k3H, k3I, k3J, k3K, k3L, k3M, XXX }, \\n | ^~~\n./keyboards/annepro2/keymaps/calmar_one/keymap.c:57:75: error: 'KC_BTN2' undeclared here (not in a function); did you mean 'MS_BTN2'?\n 57 | S(A(KC_F7)), S(A(KC_F8)), KC_LGUI, S(A(KC_F10)), KC_BTN1, KC_BTN2, S(A(KC_F13)), _______, _______, _______, _______, _______,\n | 
^~~~~~~\n./.build/obj_annepro2_c18_calmar_one/src/default_keyboard.h:38:37: note: in definition of macro 'LAYOUT_60_ansi'\n 38 | { k3A, XXX, k3C, k3D, k3E, k3F, k3G, k3H, k3I, k3J, k3K, k3L, k3M, XXX }, \\n | ^~~\n [ERRORS]\n | \n | \n | \nmake[1]: *** [.build/obj_annepro2_c18_calmar_one/quantum/keymap_introspection.o] Error 1\nmake: Make finished with errors\n",c,content
+306,1065276,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4812,7911,"",c,content
+307,1135001,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2781,609," _______, S(A(KC_Q)), S(A(KC_W)), S(A(KC_F)), S(A(KC_P)), S(A(KC_B)), MS_BTN3, KC_TAB, S(KC_TAB), S(A(KC_Y)), S(A(KC_SLSH)), _______, _______, _______,\n _______, S(A(KC_A)), S(A(KC_R)), KC_TAB, S(KC_TAB), S(A(KC_G)), MS_LEFT, MS_DOWN, MS_UP, MS_RGHT, S(A(KC_O)), _______, _______, _______,\n _______, S(A(KC_Z)), S(A(KC_X)), A(S(KC_C)), S(A(KC_D)), A(S(KC_V)), MS_WHLL, MS_WHLD, MS_WHLU, MS_WHLR, S(A(KC_MINS)), _______, _______,\n S(A(KC_F7)), S(A(KC_F8)), KC_LGUI, S(A(KC_F10)), MS_BTN1, MS_BTN2, S(A(KC_F13)), _______, _______, _______, _______, _______,",c,content
+308,1274075,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"",Log,tab
+309,1274678,"keyboards/annepro2/keymaps/calmar_one/keymap.c",0,0,"",c,tab
+310,1275871,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4813,0,"",c,selection_command
+311,1276475,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4851,0,"",c,selection_command
+312,1276815,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4881,0,"",c,selection_command
+313,1278737,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5011,0,"",c,selection_command
+314,1278963,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5141,0,"",c,selection_command
+315,2197140,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5383,101," KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_BSPC, KC_ENT, KC_DEL, KC_RCTL",c,content
+316,2197140,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4813,327," /* FUN (Left-hand tertiary: function keys; duplicate thumbs) */\n [FUN] = LAYOUT_60_ansi(\n _______, KC_F1, KC_F2, KC_F3, KC_F4, KC_F5, KC_F6, KC_F7, KC_F8, KC_F9, KC_F10, KC_F11, KC_F12, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, QK_BOOT, _______,",c,content
+317,2197140,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4154,652," /* SYM (Left-hand secondary: shifted symbols; duplicate thumbs) */\n [SYM] = LAYOUT_60_ansi(\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, KC_EXLM, KC_AT, KC_HASH, KC_DLR, KC_PERC, KC_CIRC, KC_AMPR, KC_LPRN, KC_RPRN, KC_MINS, KC_LBRC, KC_RBRC, KC_BSLS,\n _______, KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, KC_7, KC_8, KC_9, KC_0, KC_EQL, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_BSPC, KC_ENT, KC_DEL, KC_RCTL",c,content
+318,2197140,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4046,101," KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_BSPC, KC_ENT, KC_DEL, KC_RCTL",c,content
+319,2197140,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3502,34," /* NUM (Left-hand primary: numerals/symbols; duplicate thumbs) */",c,content
+320,2197140,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2716,779," /* MEDIA (Right-hand: media; RGB and BT controls inline) */\n [MEDIA] = LAYOUT_60_ansi(\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, KC_MPRV, KC_VOLD, KC_VOLU, KC_MNXT, _______, _______, _______,\n _______, RGB_VAI, RGB_VAD, KC_MUTE, _______, _______, _______, _______, _______, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_MPLY, KC_MSTP, KC_MUTE, KC_RCTL",c,content
+321,2197140,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2367,342," _______, _______, _______, _______, _______, _______, MS_LEFT, MS_DOWN, MS_UP, MS_RGHT, _______, _______, _______,\n _______, MS_WHLU, MS_WHLD, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, MS_BTN1, MS_BTN2, MS_BTN3, KC_RCTL",c,content
+322,2197140,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2051,55," /* MOUSE (Right-hand: mouse; movement mirrors NAV; buttons on right thumbs) */\n [MOUSE] = LAYOUT_60_ansi(",c,content
+323,2197140,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1572,472," _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, KC_LEFT, KC_DOWN, KC_UP, KC_RGHT, _______, _______, _______,\n _______, KC_BRID, KC_BRIU, KC_MUTE, _______, _______, _______, _______, _______, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_BSPC, KC_ENT, KC_DEL, KC_RCTL",c,content
+324,2197140,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1388,53," /* NAV (Right-hand: arrows/edit; duplicate thumb taps) */\n [NAV] = LAYOUT_60_ansi(",c,content
+325,2197141,"keyboards/annepro2/keymaps/calmar_one/keymap.c",758,623," KC_ESC, KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, KC_7, KC_8, KC_9, KC_0, KC_MINS, KC_EQL, KC_BSPC,\n KC_TAB, KC_Q, KC_W, KC_E, KC_R, KC_T, KC_Y, KC_U, KC_I, KC_O, KC_P, KC_LBRC, KC_RBRC, KC_BSLS,\n LGUI_T(KC_A), LALT_T(KC_S), LCTL_T(KC_D), LSFT_T(KC_F), KC_G, KC_H, RSFT_T(KC_J), RCTL_T(KC_K), RALT_T(KC_L), RGUI_T(KC_SCLN), KC_QUOT, KC_ENT,\n KC_LSFT, KC_Z, KC_X, KC_C, KC_V, KC_B, KC_N, KC_M, KC_COMM, KC_DOT, KC_SLSH, RSFT_T(KC_SLSH),\n LT(MEDIA, KC_ESC), LT(MOUSE, KC_TAB), KC_LALT, LT(NAV, KC_SPC), LT(NUM, KC_BSPC), LT(SYM, KC_ENT), LT(FUN, KC_DEL), KC_RCTL",c,content
+326,2197141,"keyboards/annepro2/keymaps/calmar_one/keymap.c",661,69," /* BASE (QWERTY with Miryoku-style home row mods and thumb LTs) */",c,content
+327,2197141,"keyboards/annepro2/keymaps/calmar_one/keymap.c",280,204," NAV, // Right-hand: navigation/edit\n MOUSE, // Right-hand: mouse emulation\n MEDIA, // Right-hand: media / RGB / BT controls\n NUM, // Left-hand: numbers/symbols\n SYM, // Left-hand: shifted symbols\n FUN, // Left-hand: function/system",c,content
+328,2504403,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3319,120," _______, _______, _______, KC_MUTE, _______, _______, _______, _______, _______, _______, _______, _______,",c,content
+329,2504404,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1022,147," KC_CAPS, LGUI_T(KC_A), LALT_T(KC_S), LCTL_T(KC_D), LSFT_T(KC_F), KC_G, KC_H, RSFT_T(KC_J), RCTL_T(KC_K), RALT_T(KC_L), RGUI_T(KC_SCLN), KC_QUOT, KC_ENT,",c,content
+330,3221006,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2627,120," _______, _______, _______, _______, _______, _______, _______, MS_WHLD, MS_WHLU, _______, _______, _______,",c,content
+331,3586092,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2627,120," _______, MS_WHLU, MS_WHLD, _______, _______, _______, _______, _______, _______, _______, _______, _______,",c,content
+332,3586123,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5622,0," /* NUMBERS (Separate: numbers on home row; duplicate thumbs) */\n [NUMBERS] = LAYOUT_60_ansi(\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, KC_7, KC_8, KC_9, KC_0, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_BSPC, KC_ENT, KC_DEL, KC_RCTL\n ),\n\n",c,content
+333,3586123,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2627,120," _______, _______, _______, _______, _______, _______, _______, MS_WHLD, MS_WHLU, _______, _______, _______,",c,content
+334,3586123,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2376,129," _______, _______, _______, _______, _______, _______, LCTL(KC_Z), LCTL(KC_Y), LCTL(KC_X), LCTL(KC_C), LCTL(KC_V), _______, _______, _______,",c,content
+335,3586123,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2137,0," /* HYPER (Right-hand: Hyper combos + mouse movement/wheel; mouse buttons on thumbs) */\n [HYPER] = LAYOUT_60_ansi(\n _______, S(A(KC_Q)), S(A(KC_W)), S(A(KC_F)), S(A(KC_P)), S(A(KC_B)), _______, KC_TAB, S(KC_TAB), S(A(KC_Y)), S(A(KC_SLSH)), _______, _______, _______,\n _______, S(A(KC_A)), S(A(KC_R)), KC_TAB, S(KC_TAB), S(A(KC_G)), MS_LEFT, MS_DOWN, MS_UP, MS_RGHT, S(A(KC_O)), _______, _______, _______,\n _______, S(A(KC_Z)), S(A(KC_X)), A(S(KC_C)), S(A(KC_D)), A(S(KC_V)), MS_WHLL, MS_WHLD, MS_WHLU, MS_WHLR, S(A(KC_MINS)), _______, _______,\n S(A(KC_F7)), S(A(KC_F8)), KC_LGUI, S(A(KC_F10)), MS_BTN1, MS_BTN2, S(A(KC_F13)), _______, _______, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, MS_BTN1, MS_BTN2, MS_BTN3, KC_RCTL\n ),\n\n",c,content
+336,3586123,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1659,129," _______, _______, _______, _______, _______, _______, LCTL(KC_Z), LCTL(KC_Y), LCTL(KC_X), LCTL(KC_C), LCTL(KC_V), _______, _______, _______,",c,content
+337,3586123,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1179,257," KC_LSFT, LT(HYPER, KC_Z), LT(NUM, KC_X), KC_C, KC_V, KC_B, KC_N, KC_M, KC_COMM, KC_DOT, KC_SLSH, RSFT_T(KC_SLSH),\n LT(MEDIA, KC_ESC), LT(MOUSE, KC_TAB), KC_LGUI, LT(NAV, KC_SPC), LT(NUM, KC_BSPC), LT(SYM, KC_ENT), LT(FUN, KC_DEL), KC_RCTL",c,content
+338,3586123,"keyboards/annepro2/keymaps/calmar_one/keymap.c",551,0," NUMBERS, // Separate numbers on home row\n",c,content
+339,3586123,"keyboards/annepro2/keymaps/calmar_one/keymap.c",368,96," MEDIA, // Right-hand: media / BT controls\n NUM, // Left-hand: numpad",c,content
+340,3586123,"keyboards/annepro2/keymaps/calmar_one/keymap.c",324,0," HYPER, // Right-hand: Hyper combos + mouse movement/wheel\n",c,content
+341,4071678,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"",Log,tab
+342,4072538,"keyboards/annepro2/keymaps/calmar_one/keymap.c",0,0,"",c,tab
+343,4087938,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6012,0,"",c,selection_command
+344,4088846,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"",Log,tab
+345,4089129,"keyboards/annepro2/keymaps/calmar_one/keymap.c",0,0,"",c,tab
+346,4094844,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6007,0,"",c,selection_mouse
+347,4094855,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6006,0,"",c,selection_command
+348,4094861,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6006,1,"(",c,selection_mouse
+349,4094869,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6007,0,"",c,selection_command
+350,4096093,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6006,0,"",c,selection_command
+351,4097522,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7258,34,"",c,content
+352,4097540,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7258,0,"_______, _______, _______, _______",c,content
+353,4097542,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7240,7,"",c,content
+354,4097542,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7240,0,"_______",c,content
+355,4097543,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7197,23,"",c,content
+356,4097543,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7197,0,"_______, _______, _______",c,content
+357,4097545,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6638,18,"",c,content
+358,4097547,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6609,11,"",c,content
+359,4097548,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6609,0,"N",c,content
+360,4097550,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6555,33,"",c,content
+361,4097551,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6555,0,"_______, _______, _______, _______",c,content
+362,4097551,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6530,15,"",c,content
+363,4097552,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6530,0,"_______,",c,content
+364,4097552,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6493,24,"",c,content
+365,4097553,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6493,0,"_______, _______, _______, ",c,content
+366,4097554,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6399,34,"",c,content
+367,4097555,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6399,0,"KC_VOLD, KC_VOLU",c,content
+368,4097555,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6372,17,"",c,content
+369,4097560,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6372,0,"KC_MPRV, KC_MPLY, KC_MNXT,",c,content
+370,4097562,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6323,0,"KC_MUTE, ",c,content
+371,4097563,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6259,0,"\n ",c,content
+372,4097564,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6246,4,"",c,content
+373,4097565,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6184,25,"",c,content
+374,4097566,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6184,0,"KC_BRID, KC_BRIU",c,content
+375,4097567,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6100,6,"",c,content
+376,4097568,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6100,0,"_______",c,content
+377,4097568,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6048,0," ",c,content
+378,4097569,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6042,0," ",c,content
+379,4097569,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6034,0," ",c,content
+380,4097570,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6028,0," ",c,content
+381,4097571,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5986,2,"",c,content
+382,4097575,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5986,0,"KEY",c,content
+383,4097577,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5922,54,"",c,content
+384,4097579,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5922,0,"KEY (F-Keys / Media / Brightnes",c,content
+385,4097580,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5799,110,"",c,content
+386,4097581,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5588,177,"",c,content
+387,4097586,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5572,14,"",c,content
+388,4097589,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5559,9,"",c,content
+389,4097599,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5550,8,"",c,content
+390,4097602,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5541,8,"",c,content
+391,4097603,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5532,8,"",c,content
+392,4097605,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5526,4,"",c,content
+393,4097611,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5517,8,"",c,content
+394,4097612,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5508,8,"",c,content
+395,4097615,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5499,8,"",c,content
+396,4097617,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5490,8,"",c,content
+397,4097618,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5482,7,"",c,content
+398,4097619,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5473,8,"",c,content
+399,4097620,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5468,4,"",c,content
+400,4097621,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5461,6,"",c,content
+401,4097621,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5452,8,"",c,content
+402,4097626,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5443,8,"",c,content
+403,4097627,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5430,9,"",c,content
+404,4097629,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5421,8,"",c,content
+405,4097632,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5412,8,"",c,content
+406,4097633,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5403,8,"",c,content
+407,4097634,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5394,8,"",c,content
+408,4097634,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5385,8,"",c,content
+409,4097639,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5376,8,"",c,content
+410,4097639,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5367,8,"",c,content
+411,4097640,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5358,8,"",c,content
+412,4097640,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5349,8,"",c,content
+413,4097640,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5340,8,"",c,content
+414,4097641,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5331,8,"",c,content
+415,4097641,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5200,113,"",c,content
+416,4097642,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5195,3,"",c,content
+417,4097646,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5187,7,"",c,content
+418,4097662,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5178,8,"",c,content
+419,4097665,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5159,18,"",c,content
+420,4097667,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5131,27,"",c,content
+421,4097667,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5123,7,"",c,content
+422,4097668,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5115,7,"",c,content
+423,4097668,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5112,2,"",c,content
+424,4097669,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4554,40,"",c,content
+425,4097669,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4554,0,"ght side",c,content
+426,4097670,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3881,671,"",c,content
+427,4097671,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3875,5,"",c,content
+428,4097672,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3869,4,"",c,content
+429,4097676,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3842,25,"",c,content
+430,4097678,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3842,0,"Numpa",c,content
+431,4097678,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3836,4,"",c,content
+432,4097679,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3835,0,"NU",c,content
+433,4097679,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3589,234,"",c,content
+434,4097681,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3537,36,"",c,content
+435,4097681,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3510,8,"",c,content
+436,4097682,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3501,8,"",c,content
+437,4097682,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3492,8,"",c,content
+438,4097683,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3483,8,"",c,content
+439,4097684,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3470,9,"",c,content
+440,4097685,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3461,8,"",c,content
+441,4097685,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3383,68,"",c,content
+442,4097686,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3374,8,"",c,content
+443,4097686,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3365,8,"",c,content
+444,4097686,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3356,8,"",c,content
+445,4097687,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3347,8,"",c,content
+446,4097687,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3338,8,"",c,content
+447,4097688,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3325,9,"",c,content
+448,4097689,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3316,8,"",c,content
+449,4097693,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3307,8,"",c,content
+450,4097694,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3298,8,"",c,content
+451,4097695,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3289,8,"",c,content
+452,4097695,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3280,8,"",c,content
+453,4097697,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3271,8,"",c,content
+454,4097697,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3262,8,"",c,content
+455,4097698,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3253,8,"",c,content
+456,4097698,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3244,8,"",c,content
+457,4097699,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3235,8,"",c,content
+458,4097700,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3226,8,"",c,content
+459,4097701,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3081,127,"",c,content
+460,4097702,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3075,4,"",c,content
+461,4097702,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3066,8,"",c,content
+462,4097703,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3057,8,"",c,content
+463,4097703,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3038,18,"",c,content
+464,4097704,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3011,26,"",c,content
+465,4097704,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3002,8,"",c,content
+466,4097705,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2993,8,"",c,content
+467,4097705,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2990,2,"",c,content
+468,4097709,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2798,0," ",c,content
+469,4097710,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2638,0," ",c,content
+470,4097710,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2472,0," ",c,content
+471,4097711,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2456,8,"",c,content
+472,4097712,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2456,0,"MS_BTN3, ",c,content
+473,4097713,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2349,0,"o",c,content
+474,4097714,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2311,36,"",c,content
+475,4097714,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2311,0,"/ Hyper co",c,content
+476,4097715,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2278,28,"",c,content
+477,4097716,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2278,0,"M",c,content
+478,4097716,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2226,33,"",c,content
+479,4097717,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2226,0,"_______, _______, _______, _______",c,content
+480,4097718,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2208,8,"",c,content
+481,4097719,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2208,0,"_______,",c,content
+482,4097719,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2163,25,"",c,content
+483,4097719,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2163,0,"_______, _______, _______",c,content
+484,4097720,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1831,58,"",c,content
+485,4097721,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1831,0,"_______, _______, _______, _______, _______",c,content
+486,4097722,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1606,6,"",c,content
+487,4097723,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1601,4,"",c,content
+488,4097724,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1585,14,"",c,content
+489,4097725,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1585,0," / Sys",c,content
+490,4097725,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1567,13,"",c,content
+491,4097726,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1567,0,"A",c,content
+492,4097727,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1546,4,"",c,content
+493,4097729,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1546,0,"0, KC_EQL, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______",c,content
+494,4097731,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1543,0," ",c,content
+495,4097733,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1537,4,"",c,content
+496,4097734,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1537,0,"9",c,content
+497,4097735,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1534,0," ",c,content
+498,4097735,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1520,12,"",c,content
+499,4097736,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1520,0,"7, KC_8",c,content
+500,4097737,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1502,14,"",c,content
+501,4097737,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1502,0,"RBRC, KC_BSLS,\n _______, KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, ",c,content
+502,4097739,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1486,11,"",c,content
+503,4097740,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1486,0,"KC_CIRC, KC_AMPR, KC_LPRN, KC_RPRN, KC_MINS, KC_LBRC",c,content
+504,4097741,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1483,1,"",c,content
+505,4097742,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1482,0,"ER",c,content
+506,4097742,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1480,1,"",c,content
+507,4097743,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1467,8,"",c,content
+508,4097744,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1467,0,"_______, KC_EXLM, KC_AT, KC_HASH, KC_DLR",c,content
+509,4097745,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1463,0,"_______,\n",c,content
+510,4097746,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1462,0,"_______,",c,content
+511,4097747,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1461,0,"_______,",c,content
+512,4097748,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1460,0,"_______,",c,content
+513,4097749,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1459,0,"_______,",c,content
+514,4097750,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1458,0,"_______,",c,content
+515,4097752,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1457,0,"_______,",c,content
+516,4097755,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1456,0,"_______,",c,content
+517,4097756,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1455,0,"_______,",c,content
+518,4097757,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1454,0,"_______,",c,content
+519,4097757,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1453,0,"_______,",c,content
+520,4097758,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1452,0,"_______,",c,content
+521,4097758,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1451,0,"_______,",c,content
+522,4097760,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1445,4,"",c,content
+523,4097760,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1445,0,"RGHT)\n ),\n\n /* SYM (Special-Chars) */\n [SYM] = LAYOUT_60_ansi(\n _______",c,content
+524,4097761,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1442,0,"RCTL_T(",c,content
+525,4097762,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1439,0,"SPC",c,content
+526,4097762,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1436,2,"",c,content
+527,4097763,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1427,4,"",c,content
+528,4097764,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1426,0,"SY",c,content
+529,4097764,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1420,0,", KC_RALT, LT(NAV, KC_ENT",c,content
+530,4097765,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1419,0,"P",c,content
+531,4097769,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1417,1,"",c,content
+532,4097772,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1414,0," ",c,content
+533,4097773,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1404,8,"",c,content
+534,4097776,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1404,0,"KC_LCTL, KC_LGUI, KC_LALT",c,content
+535,4097776,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1381,0,")",c,content
+536,4097777,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1374,0,"RSFT_T(",c,content
+537,4097777,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1280,8,"",c,content
+538,4097778,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1280,0,"F13),",c,content
+539,4097779,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1277,0,"LSFT_T(",c,content
+540,4097779,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1240,3,"",c,content
+541,4097779,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1240,0,"CTL",c,content
+542,4097780,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1236,1,"",c,content
+543,4097780,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1225,7,"",c,content
+544,4097789,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1222,1,"",c,content
+545,4097790,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1211,7,"",c,content
+546,4097791,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1208,1,"",c,content
+547,4097792,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1197,7,"",c,content
+548,4097792,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1182,1,"",c,content
+549,4097793,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1171,7,"",c,content
+550,4097794,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1171,0," ",c,content
+551,4097795,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1168,1,"",c,content
+552,4097796,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1157,7,"",c,content
+553,4097797,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1157,0," ",c,content
+554,4097798,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1154,1,"",c,content
+555,4097798,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1143,7,"",c,content
+556,4097799,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1140,1,"",c,content
+557,4097799,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1129,7,"",c,content
+558,4097799,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1123,4,"",c,content
+559,4097801,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1123,0,"ESC)",c,content
+560,4097802,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1120,0,"MT(MOD_LCTL, ",c,content
+561,4097822,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1089,0," ",c,content
+562,4097825,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1047,0," ",c,content
+563,4097825,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1040,0," ",c,content
+564,4097826,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1027,1,"",c,content
+565,4097827,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1026,0,")",c,content
+566,4097827,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1020,0,"MT(MOD_LALT, ",c,content
+567,4097827,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1007,0," ",c,content
+568,4097828,"keyboards/annepro2/keymaps/calmar_one/keymap.c",990,0," ",c,content
+569,4097829,"keyboards/annepro2/keymaps/calmar_one/keymap.c",947,0," ",c,content
+570,4097830,"keyboards/annepro2/keymaps/calmar_one/keymap.c",942,0," ",c,content
+571,4097830,"keyboards/annepro2/keymaps/calmar_one/keymap.c",935,0," ",c,content
+572,4097831,"keyboards/annepro2/keymaps/calmar_one/keymap.c",924,5,"",c,content
+573,4097831,"keyboards/annepro2/keymaps/calmar_one/keymap.c",924,0,"DEL),",c,content
+574,4097832,"keyboards/annepro2/keymaps/calmar_one/keymap.c",921,0,"LT(FKEY, ",c,content
+575,4097832,"keyboards/annepro2/keymaps/calmar_one/keymap.c",876,9,"",c,content
+576,4097833,"keyboards/annepro2/keymaps/calmar_one/keymap.c",876,0,"layer-taps approximating ZMK",c,content
+577,4097833,"keyboards/annepro2/keymaps/calmar_one/keymap.c",838,28,"",c,content
+578,4097834,"keyboards/annepro2/keymaps/calmar_one/keymap.c",838,0,"-ish, with",c,content
+579,4097834,"keyboards/annepro2/keymaps/calmar_one/keymap.c",631,1,"",c,content
+580,4097842,"keyboards/annepro2/keymaps/calmar_one/keymap.c",631,0," row",c,content
+581,4097843,"keyboards/annepro2/keymaps/calmar_one/keymap.c",616,10,"",c,content
+582,4097843,"keyboards/annepro2/keymaps/calmar_one/keymap.c",616,0,"N",c,content
+583,4097844,"keyboards/annepro2/keymaps/calmar_one/keymap.c",530,69,"",c,content
+584,4097845,"keyboards/annepro2/keymaps/calmar_one/keymap.c",530,0,"F-keys / Media / Brightness",c,content
+585,4097845,"keyboards/annepro2/keymaps/calmar_one/keymap.c",516,5,"",c,content
+586,4097846,"keyboards/annepro2/keymaps/calmar_one/keymap.c",516,0,"FKEY,",c,content
+587,4097848,"keyboards/annepro2/keymaps/calmar_one/keymap.c",496,12,"",c,content
+588,4097850,"keyboards/annepro2/keymaps/calmar_one/keymap.c",496,0,"N",c,content
+589,4097854,"keyboards/annepro2/keymaps/calmar_one/keymap.c",422,56,"",c,content
+590,4097855,"keyboards/annepro2/keymaps/calmar_one/keymap.c",422,0,"+ Hyper combo",c,content
+591,4097855,"keyboards/annepro2/keymaps/calmar_one/keymap.c",404,13,"",c,content
+592,4097856,"keyboards/annepro2/keymaps/calmar_one/keymap.c",404,0,"M",c,content
+593,4097856,"keyboards/annepro2/keymaps/calmar_one/keymap.c",340,55,"",c,content
+594,4097870,"keyboards/annepro2/keymaps/calmar_one/keymap.c",340,0,"Arrows / system\n HYPER",c,content
+595,4097871,"keyboards/annepro2/keymaps/calmar_one/keymap.c",296,36,"",c,content
+596,4097873,"keyboards/annepro2/keymaps/calmar_one/keymap.c",296,0,"Special-Chars\n NAV, ",c,content
+597,4097874,"keyboards/annepro2/keymaps/calmar_one/keymap.c",282,3,"",c,content
+598,4097874,"keyboards/annepro2/keymaps/calmar_one/keymap.c",282,0,"SYM",c,content
+599,4097875,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3322,2,"",c,content
+600,4097875,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3322,0,"KC",c,content
+601,4097876,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3311,2,"",c,content
+602,4097876,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3311,0,"KC",c,content
+603,4097877,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3203,6,"",c,content
+604,4097877,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3203,0,"KC_WH_",c,content
+605,4097878,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3192,6,"",c,content
+606,4097878,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3192,0,"KC_WH_",c,content
+607,4097879,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3182,6,"",c,content
+608,4097879,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3182,0,"KC_WH_",c,content
+609,4097880,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3172,6,"",c,content
+610,4097887,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3172,0,"KC_WH_",c,content
+611,4097888,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3048,3,"",c,content
+612,4097888,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3044,0,"KC_",c,content
+613,4097889,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3039,2,"",c,content
+614,4097890,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3037,1,"",c,content
+615,4097890,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3033,0,"KC_",c,content
+616,4097891,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3027,3,"",c,content
+617,4097891,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3023,0,"KC_",c,content
+618,4097891,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3017,3,"",c,content
+619,4097892,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3013,0,"KC_",c,content
+620,4097892,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2854,2,"",c,content
+621,4097893,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2854,0,"KC",c,content
+622,4097894,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5729,0,"",c,selection_command
+623,4114203,"keyboards/annepro2/keymaps/calmar_one/keymap.c",280,5205," NAV,     // Right-hand: navigation/edit\n HYPER,   // Right-hand: Hyper combos + mouse movement/wheel\n MOUSE,   // Right-hand: mouse emulation\n MEDIA,   // Right-hand: media / BT controls\n NUM,     // Left-hand: numpad\n SYM,     // Left-hand: shifted symbols\n FUN,     // Left-hand: function/system\n NUMBERS, // Separate numbers on home row\n BT       // Bluetooth controls\n};\n\n// Convenience alias for transparent\n#define _______ KC_TRNS\n\n// clang-format off\nconst uint16_t keymaps[][MATRIX_ROWS][MATRIX_COLS] = {\n    /* BASE (QWERTY with Miryoku-style home row mods and thumb LTs) */\n    [BASE] = LAYOUT_60_ansi(\n        KC_ESC,  KC_1,    KC_2,    KC_3,    KC_4,    KC_5,    KC_6,    KC_7,    KC_8,    KC_9,    KC_0,    KC_MINS,  KC_EQL,   KC_BSPC,\n        KC_TAB,  KC_Q,    KC_W,    KC_E,    KC_R,    KC_T,    KC_Y,    KC_U,    KC_I,    KC_O,    KC_P,    KC_LBRC,  KC_RBRC,  KC_BSLS,\n        KC_CAPS, LGUI_T(KC_A), LALT_T(KC_S), LCTL_T(KC_D), LSFT_T(KC_F), KC_G, KC_H, RSFT_T(KC_J), RCTL_T(KC_K), RALT_T(KC_L), RGUI_T(KC_SCLN), KC_QUOT, KC_ENT,\n        KC_LSFT, LT(HYPER, KC_Z), LT(NUM, KC_X), KC_C, KC_V, KC_B, KC_N, KC_M, KC_COMM, KC_DOT, KC_SLSH, RSFT_T(KC_SLSH),\n        LT(MEDIA, KC_ESC), LT(MOUSE, KC_TAB), KC_LGUI, LT(NAV, KC_SPC), LT(NUM, KC_BSPC), LT(SYM, KC_ENT), LT(FUN, KC_DEL), KC_RCTL\n    ),\n\n    /* NAV (Right-hand: arrows/edit; duplicate thumb taps) */\n    [NAV] = LAYOUT_60_ansi(\n        _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,  _______,  _______,\n        _______, _______, _______, _______, _______, _______, LCTL(KC_Z), LCTL(KC_Y), LCTL(KC_X), LCTL(KC_C), LCTL(KC_V), _______,  _______,  _______,\n        _______, _______, _______, _______, _______, _______, KC_LEFT, KC_DOWN, KC_UP, KC_RGHT, _______, _______, _______,\n        _______, KC_BRID, KC_BRIU, KC_MUTE, _______, _______, _______, _______, _______, _______, _______, _______,\n        KC_ESC,  KC_TAB,  KC_LALT, KC_SPC,  KC_BSPC, KC_ENT,  KC_DEL,  KC_RCTL\n    ),\n\n    /* HYPER (Right-hand: Hyper combos + mouse movement/wheel; mouse buttons on thumbs) */\n    [HYPER] = LAYOUT_60_ansi(\n        _______, S(A(KC_Q)), S(A(KC_W)), S(A(KC_F)), S(A(KC_P)), S(A(KC_B)), _______, KC_TAB, S(KC_TAB), S(A(KC_Y)), S(A(KC_SLSH)), _______, _______, _______,\n        _______, S(A(KC_A)), S(A(KC_R)), KC_TAB, S(KC_TAB), S(A(KC_G)), MS_LEFT, MS_DOWN, MS_UP, MS_RGHT, S(A(KC_O)), _______, _______, _______,\n        _______, S(A(KC_Z)), S(A(KC_X)), A(S(KC_C)), S(A(KC_D)), A(S(KC_V)), MS_WHLL, MS_WHLD, MS_WHLU, MS_WHLR, S(A(KC_MINS)), _______, _______,\n        S(A(KC_F7)), S(A(KC_F8)), KC_LGUI, S(A(KC_F10)), MS_BTN1, MS_BTN2, S(A(KC_F13)), _______, _______, _______, _______, _______,\n        KC_ESC,  KC_TAB,  KC_LALT, KC_SPC,  MS_BTN1, MS_BTN2, MS_BTN3, KC_RCTL\n    ),\n\n    /* MOUSE (Right-hand: mouse; movement mirrors NAV; buttons on right thumbs) */\n    [MOUSE] = LAYOUT_60_ansi(\n        _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n        _______, _______, _______, _______, _______, _______, LCTL(KC_Z), LCTL(KC_Y), LCTL(KC_X), LCTL(KC_C), LCTL(KC_V), _______, _______, _______,\n        _______, _______, _______, _______, _______, _______, MS_LEFT, MS_DOWN, MS_UP, MS_RGHT, _______, _______, _______,\n        _______, _______, _______, _______, _______, _______, _______, MS_WHLD, MS_WHLU, _______, _______, _______,\n        KC_ESC,  KC_TAB,  KC_LALT, KC_SPC,  MS_BTN1, MS_BTN2, MS_BTN3, KC_RCTL\n    ),\n\n    /* MEDIA (Right-hand: media; RGB and BT controls inline) */\n    [MEDIA] = LAYOUT_60_ansi(\n        _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n        _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n        _______, _______, _______, _______, _______, _______, KC_MPRV, KC_VOLD, KC_VOLU, KC_MNXT, _______, _______, _______,\n        _______, _______, _______, KC_MUTE, _______, _______, _______, _______, _______, _______, _______, _______,\n        KC_ESC,  KC_TAB,  KC_LALT, KC_SPC,  KC_MPLY, KC_MSTP, KC_MUTE, 
KC_RCTL\n ),\n\n /* NUM (Left-hand primary: numerals/symbols; duplicate thumbs) */\n [NUM] = LAYOUT_60_ansi(\n _______, _______, _______, _______, _______, _______, _______, KC_P7, KC_P8, KC_P9, _______, _______, _______, _______,\n _______, _______, _______, KC_PSLS, KC_EQL, _______, _______, KC_P4, KC_P5, KC_P6, KC_PPLS, _______, _______, _______,\n _______, _______, _______, _______, KC_PAST, _______, KC_P0, KC_P1, KC_P2, KC_P3, KC_PMNS, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_BSPC, KC_ENT, KC_DEL, KC_RCTL\n ),\n\n /* SYM (Left-hand secondary: shifted symbols; duplicate thumbs) */\n [SYM] = LAYOUT_60_ansi(\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, KC_EXLM, KC_AT, KC_HASH, KC_DLR, KC_PERC, KC_CIRC, KC_AMPR, KC_LPRN, KC_RPRN, KC_MINS, KC_LBRC, KC_RBRC, KC_BSLS,\n _______, KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, KC_7, KC_8, KC_9, KC_0, KC_EQL, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_BSPC, KC_ENT, KC_DEL, KC_RCTL\n ),\n\n /* FUN (Left-hand tertiary: function keys; duplicate thumbs) */\n [FUN] = LAYOUT_60_ansi(\n _______, KC_F1, KC_F2, KC_F3, KC_F4, KC_F5, KC_F6, KC_F7, KC_F8, KC_F9, KC_F10, KC_F11, KC_F12, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, QK_BOOT, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_BSPC, KC_ENT, KC_DEL, KC_RCTL\n ),\n\n /* NUMBERS (Separate: numbers on home row; duplicate thumbs) */\n [NUMBERS] = LAYOUT_60_ansi(\n 
_______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, KC_7, KC_8, KC_9, KC_0, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_BSPC, KC_ENT, KC_DEL, KC_RCTL\n",c,content
+624,4114247,"/Users/franzsrambical/zmk-config-calmar-one/config/calmar-one.keymap",0,0,"",plaintext,tab
+625,4128392,"keyboards/annepro2/keymaps/calmar_one/keymap.c",0,0,"",c,tab
+626,4136950,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7395,0,"",c,selection_command
+627,4137019,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7369,0,"",c,selection_command
+628,4137226,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7323,0,"",c,selection_command
+629,4137665,"keyboards/annepro2/keymaps/calmar_one/keymap.c",7298,0,"",c,selection_command
+630,4265839,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1400,150," LT(MEDIA, KC_ESC), KC_TAB, LT(MOUSE, KC_LGUI), LT(NAV, KC_SPC), LT(NUM, KC_BSPC), LT(SYM, KC_ENT), LT(FUN, KC_DEL), KC_RCTL",c,content
+631,4470089,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6690,0,"",c,selection_mouse
+632,4470101,"keyboards/annepro2/keymaps/calmar_one/keymap.c",6689,0,"",c,selection_command
+633,4885142,"keyboards/annepro2/keymaps/calmar_one/keymap.c",5309,500," _______, _______, S(KC_QUOT), KC_QUOT, _______, _______, _______, _______, KC_LBRC, KC_RBRC, KC_BSLS, KC_MINS, _______, _______,\n _______, KC_EXLM, KC_AT, KC_HASH, KC_DLR, KC_PERC, KC_CIRC, KC_AMPR, KC_ASTR, KC_LPRN, KC_RPRN, _______, _______, _______,\n _______, KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, KC_7, KC_8, KC_9, KC_0, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,",c,content
+634,4885143,"keyboards/annepro2/keymaps/calmar_one/keymap.c",3920,604," _______, KC_F1, KC_F2, KC_F3, KC_F4, KC_F5, KC_F6, KC_F7, KC_F8, KC_F9, KC_F10, KC_F11, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, KC_BRID, KC_BRIU, _______, _______, QK_BOOT, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, KC_MUTE, _______, _______, _______, _______, _______,\n KC_MPRV, KC_MPLY, KC_MNXT, _______, KC_VOLD, KC_VOLU, _______, _______, _______, _______, _______, _______,\n KC_MPRV, KC_MPLY, KC_MNXT, _______, KC_VOLD, KC_VOLU, _______, KC_RCTL",c,content
+635,4885143,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2038,120," _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, QK_BOOT, QK_BOOT,",c,content
+636,4885143,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1773,144," _______, _______, KC_TILD, KC_PIPE, KC_GRV, _______, LCTL(KC_Z), LCTL(KC_Y), LCTL(KC_X), LCTL(KC_C), LCTL(KC_V), _______, _______, _______,",c,content
+637,5108867,"keyboards/annepro2/keymaps/calmar_one/keymap.c",4399,103," KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_MPLY, KC_MSTP, KC_MUTE, KC_RCTL",c,content
+638,5108867,"keyboards/annepro2/keymaps/calmar_one/keymap.c",2037,120," _______, _______, _______, _______, _______, _______, _______, KC_HOME, KC_PGDN, KC_PGUP, KC_END, _______,",c,content
+639,5108867,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1643,129," _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, QK_BOOT,",c,content
+640,5527192,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1400,150," MT(MOD_LALT, KC_TAB), KC_SPC, KC_LGUI, _______, LT(NAV, KC_ENT), LT(SYM, KC_BSPC), _______, _______",c,content
+641,5643122,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1400,126," LT(MEDIA, KC_ESC), KC_TAB, LT(MOUSE, KC_LGUI), LT(NAV, KC_SPC), LT(NUM, KC_BSPC), LT(SYM, KC_ENT), LT(FUN, KC_DEL), KC_RCTL",c,content
+642,5643145,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1400,150," MT(MOD_LALT, KC_TAB), KC_SPC, LT(MOUSE, KC_LGUI), KC_SPC, LT(NAV, KC_ENT), LT(SYM, KC_BSPC), _______, KC_RCTL",c,content
+643,9768256,"/Users/franzsrambical/Downloads/calmar_one(1).json",0,0,"{\n ""version"": 1,\n ""notes"": """",\n ""documentation"": ""\""This file is a QMK Configurator export. You can import this at . It can also be used directly with QMK's source code.\n\nTo setup your QMK environment check out the tutorial: \n\nYou can convert this file to a keymap.c using this command: `qmk json2c {keymap}`\n\nYou can compile this keymap using this command: `qmk compile {keymap}`\""\n"",\n ""keyboard"": ""annepro2/c18"",\n ""keymap"": ""calmar_one"",\n ""layout"": ""LAYOUT_60_ansi"",\n ""layers"": [\n [\n ""KC_ESC"",\n ""KC_1"",\n ""KC_2"",\n ""KC_3"",\n ""KC_4"",\n ""KC_5"",\n ""KC_6"",\n ""KC_7"",\n ""KC_8"",\n ""KC_9"",\n ""KC_0"",\n ""KC_MINS"",\n ""KC_EQL"",\n ""KC_BSPC"",\n ""KC_TAB"",\n ""KC_Q"",\n ""KC_W"",\n ""KC_E"",\n ""KC_R"",\n ""KC_T"",\n ""KC_Y"",\n ""KC_U"",\n ""KC_I"",\n ""KC_O"",\n ""KC_P"",\n ""KC_LBRC"",\n ""KC_RBRC"",\n ""KC_BSLS"",\n ""KC_RCTL"",\n ""LGUI(KC_A)"",\n ""LALT(KC_S)"",\n ""LCTL(KC_D)"",\n ""LSFT(KC_F)"",\n ""KC_G"",\n ""KC_H"",\n ""RSFT(KC_J)"",\n ""RCTL(KC_K)"",\n ""RALT(KC_K)"",\n ""RGUI(KC_SCLN)"",\n ""KC_QUOT"",\n ""KC_ENT"",\n ""KC_LSFT"",\n ""KC_Z"",\n ""LT(1,KC_X)"",\n ""LT(2,KC_C)"",\n ""LT(3,KC_V)"",\n ""KC_B"",\n ""KC_N"",\n ""LT(4,KC_M)"",\n ""LT(5,KC_COMM)"",\n ""KC_DOT"",\n ""KC_SLSH"",\n ""KC_UP"",\n ""KC_LCTL"",\n ""KC_LALT"",\n ""KC_LGUI"",\n ""KC_SPC"",\n ""KC_RALT"",\n ""KC_LEFT"",\n ""KC_DOWN"",\n ""KC_RGHT""\n ],\n [\n ""KC_NO"",\n ""ANY(KC_AP2_BT1)"",\n ""ANY(KC_AP2_BT2)"",\n ""ANY(KC_AP2_BT3)"",\n ""ANY(KC_AP2_BT4)"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_BRID"",\n ""KC_BRIU"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_MPRV"",\n ""KC_VOLD"",\n 
""KC_VOLU"",\n ""KC_MNXT"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_MUTE"",\n ""KC_MPLY"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO""\n ],\n [\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_LEFT"",\n ""KC_DOWN"",\n ""KC_UP"",\n ""KC_RGHT"",\n ""KC_CAPS"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_INS"",\n ""KC_PGDN"",\n ""KC_PGUP"",\n ""KC_HOME"",\n ""KC_END"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO""\n ],\n [\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""MS_BTN1"",\n ""MS_BTN2"",\n ""MS_BTN3"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""MS_LEFT"",\n ""MS_DOWN"",\n ""MS_UP"",\n ""MS_RGHT"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""MS_WHLL"",\n ""MS_WHLD"",\n ""MS_WHLU"",\n ""MS_WHLR"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO""\n ],\n [\n ""KC_NO"",\n 
""KC_NO"",\n ""KC_NO"",\n ""KC_LPRN"",\n ""KC_RPRN"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_LCBR"",\n ""KC_AMPR"",\n ""KC_ASTR"",\n ""KC_NO"",\n ""KC_RCBR"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_COLN"",\n ""KC_DLR"",\n ""KC_PERC"",\n ""KC_CIRC"",\n ""KC_EQL"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_TILD"",\n ""KC_EXLM"",\n ""KC_AT"",\n ""KC_HASH"",\n ""KC_PIPE"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO""\n ],\n [\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_LBRC"",\n ""KC_7"",\n ""KC_8"",\n ""KC_9"",\n ""KC_RBRC"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_SCLN"",\n ""KC_4"",\n ""KC_5"",\n ""KC_6"",\n ""KC_EQL"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_GRV"",\n ""KC_1"",\n ""KC_2"",\n ""KC_3"",\n ""KC_BSLS"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO"",\n ""KC_NO""\n ]\n ],\n ""author"": """"\n}",json,tab
+644,9769712,"/Users/franzsrambical/Downloads/calmar_one(1).json",569,0,"",json,selection_mouse
+645,9769723,"/Users/franzsrambical/Downloads/calmar_one(1).json",568,0,"",json,selection_command
+646,9807833,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",0,0,"/* Calmar One keymap for Anne Pro 2\n *\n * Approximates the user's ZMK layout:\n * - Layers: BASE, SYM (Special-Chars), NAV (Arrows), HYPER (Mouse/Hyper combos), NUM (Numpad), FKEY (F-keys/Media), NUMBERS, BT (Bluetooth)\n */\n\n#include QMK_KEYBOARD_H\n\nenum anne_pro_layers {\n BASE,\n NAV, // Right-hand: navigation/edit\n HYPER, // Right-hand: Hyper combos + mouse movement/wheel\n MOUSE, // Right-hand: mouse emulation\n MEDIA, // Right-hand: media / BT controls\n NUM, // Left-hand: numpad\n SYM, // Left-hand: shifted symbols\n FUN, // Left-hand: function/system\n NUMBERS, // Separate numbers on home row\n BT // Bluetooth controls\n};\n\n// Convenience alias for transparent\n#define _______ KC_TRNS\n\n// clang-format off\nconst uint16_t keymaps[][MATRIX_ROWS][MATRIX_COLS] = {\n /* BASE (QWERTY with Miryoku-style home row mods and thumb LTs) */\n [BASE] = LAYOUT_60_ansi(\n KC_ESC, KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, KC_7, KC_8, KC_9, KC_0, KC_MINS, KC_EQL, KC_BSPC,\n KC_TAB, KC_Q, KC_W, KC_E, KC_R, KC_T, KC_Y, KC_U, KC_I, KC_O, KC_P, KC_LBRC, KC_RBRC, KC_BSLS,\n KC_CAPS, LGUI_T(KC_A), LALT_T(KC_S), LCTL_T(KC_D), LSFT_T(KC_F), KC_G, KC_H, RSFT_T(KC_J), RCTL_T(KC_K), RALT_T(KC_L), RGUI_T(KC_SCLN), KC_QUOT, KC_ENT,\n KC_LSFT, LT(HYPER, KC_Z), LT(NUM, KC_X), KC_C, KC_V, KC_B, KC_N, KC_M, KC_COMM, KC_DOT, KC_SLSH, RSFT_T(KC_SLSH),\n MT(MOD_LALT, KC_TAB), KC_SPC, LT(MOUSE, KC_LGUI), KC_SPC, LT(NAV, KC_ENT), LT(SYM, KC_BSPC), _______, KC_RCTL\n ),\n\n /* NAV (Right-hand: arrows/edit; duplicate thumb taps) */\n [NAV] = LAYOUT_60_ansi(\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, QK_BOOT,\n _______, _______, KC_TILD, KC_PIPE, KC_GRV, _______, LCTL(KC_Z), LCTL(KC_Y), LCTL(KC_X), LCTL(KC_C), LCTL(KC_V), _______, _______, _______,\n _______, _______, _______, _______, _______, _______, KC_LEFT, KC_DOWN, KC_UP, KC_RGHT, _______, 
_______, _______,\n _______, _______, _______, _______, _______, _______, _______, KC_HOME, KC_PGDN, KC_PGUP, KC_END, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_BSPC, KC_ENT, KC_DEL, KC_RCTL\n ),\n\n /* HYPER (Right-hand: Hyper combos + mouse movement/wheel; mouse buttons on thumbs) */\n [HYPER] = LAYOUT_60_ansi(\n _______, S(A(KC_Q)), S(A(KC_W)), S(A(KC_F)), S(A(KC_P)), S(A(KC_B)), _______, KC_TAB, S(KC_TAB), S(A(KC_Y)), S(A(KC_SLSH)), _______, _______, _______,\n _______, S(A(KC_A)), S(A(KC_R)), KC_TAB, S(KC_TAB), S(A(KC_G)), MS_LEFT, MS_DOWN, MS_UP, MS_RGHT, S(A(KC_O)), _______, _______, _______,\n _______, S(A(KC_Z)), S(A(KC_X)), A(S(KC_C)), S(A(KC_D)), A(S(KC_V)), MS_WHLL, MS_WHLD, MS_WHLU, MS_WHLR, S(A(KC_MINS)), _______, _______,\n S(A(KC_F7)), S(A(KC_F8)), KC_LGUI, S(A(KC_F10)), MS_BTN1, MS_BTN2, S(A(KC_F13)), _______, _______, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, MS_BTN1, MS_BTN2, MS_BTN3, KC_RCTL\n ),\n\n /* MOUSE (Right-hand: mouse; movement mirrors NAV; buttons on right thumbs) */\n [MOUSE] = LAYOUT_60_ansi(\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, LCTL(KC_Z), LCTL(KC_Y), LCTL(KC_X), LCTL(KC_C), LCTL(KC_V), _______, _______, _______,\n _______, _______, _______, _______, _______, _______, MS_LEFT, MS_DOWN, MS_UP, MS_RGHT, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, MS_WHLD, MS_WHLU, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, MS_BTN1, MS_BTN2, MS_BTN3, KC_RCTL\n ),\n\n /* MEDIA (Right-hand: media; RGB and BT controls inline) */\n [MEDIA] = LAYOUT_60_ansi(\n _______, KC_F1, KC_F2, KC_F3, KC_F4, KC_F5, KC_F6, KC_F7, KC_F8, KC_F9, KC_F10, KC_F11, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, KC_BRID, KC_BRIU, _______, _______, QK_BOOT, _______, _______,\n _______, _______, _______, 
_______, _______, _______, _______, KC_MUTE, _______, _______, _______, _______, _______,\n KC_MPRV, KC_MPLY, KC_MNXT, _______, KC_VOLD, KC_VOLU, _______, _______, _______, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_MPLY, KC_MSTP, KC_MUTE, KC_RCTL\n ),\n\n /* NUM (Left-hand primary: numerals/symbols; duplicate thumbs) */\n [NUM] = LAYOUT_60_ansi(\n _______, _______, _______, _______, _______, _______, _______, KC_P7, KC_P8, KC_P9, _______, _______, _______, _______,\n _______, _______, _______, KC_PSLS, KC_EQL, _______, _______, KC_P4, KC_P5, KC_P6, KC_PPLS, _______, _______, _______,\n _______, _______, _______, _______, KC_PAST, _______, KC_P0, KC_P1, KC_P2, KC_P3, KC_PMNS, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_BSPC, KC_ENT, KC_DEL, KC_RCTL\n ),\n\n /* SYM (Left-hand secondary: shifted symbols; duplicate thumbs) */\n [SYM] = LAYOUT_60_ansi(\n _______, _______, S(KC_QUOT), KC_QUOT, _______, _______, _______, _______, KC_LBRC, KC_RBRC, KC_BSLS, KC_MINS, _______, _______,\n _______, KC_EXLM, KC_AT, KC_HASH, KC_DLR, KC_PERC, KC_CIRC, KC_AMPR, KC_ASTR, KC_LPRN, KC_RPRN, _______, _______, _______,\n _______, KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, KC_7, KC_8, KC_9, KC_0, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_BSPC, KC_ENT, KC_DEL, KC_RCTL\n ),\n\n /* FUN (Left-hand tertiary: function keys; duplicate thumbs) */\n [FUN] = LAYOUT_60_ansi(\n _______, KC_F1, KC_F2, KC_F3, KC_F4, KC_F5, KC_F6, KC_F7, KC_F8, KC_F9, KC_F10, KC_F11, KC_F12, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, QK_BOOT, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, 
_______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_BSPC, KC_ENT, KC_DEL, KC_RCTL\n ),\n\n /* NUMBERS (Separate: numbers on home row; duplicate thumbs) */\n [NUMBERS] = LAYOUT_60_ansi(\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, KC_7, KC_8, KC_9, KC_0, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_BSPC, KC_ENT, KC_DEL, KC_RCTL\n ),\n\n /* BT (Bluetooth controls for Anne Pro 2) */\n [BT] = LAYOUT_60_ansi(\n _______, KC_AP2_BT_UNPAIR, _______, _______, KC_AP2_USB, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, KC_AP2_BT1, KC_AP2_BT2, KC_AP2_BT3, KC_AP2_BT4, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______\n ),\n};\n// clang-format on\n\n\n",c,tab
+647,9809407,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",323,0,"",c,selection_mouse
+648,9809418,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",322,0,"",c,selection_command
+649,9824531,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",556,0,"",c,selection_mouse
+650,9824538,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",555,0,"",c,selection_command
+651,9825323,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",514,42," SYM, // Left-hand: shifted symbols",c,selection_command
+652,9843484,"/Users/franzsrambical/Downloads/calmar_one(1).json",0,0,"",json,tab
+653,9843970,"/Users/franzsrambical/Downloads/calmar_one(1).json",625,0,"",json,selection_mouse
+654,9843976,"/Users/franzsrambical/Downloads/calmar_one(1).json",624,0,"",json,selection_command
+655,9844463,"/Users/franzsrambical/Downloads/calmar_one(1).json",612,13," ""KC_4"",",json,selection_command
+656,9846642,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",0,0,"",c,tab
+657,9849986,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",248,0,"",c,selection_mouse
+658,10020771,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",249,0,"// clang-format off\nconst uint16_t keymaps[][MATRIX_ROWS][MATRIX_COLS] = {\n [0] = LAYOUT_60_ansi(\n KC_ESC, KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, KC_7, KC_8, KC_9, KC_0, KC_MINS, KC_EQL, KC_BSPC,\n KC_TAB, KC_Q, KC_W, KC_E, KC_R, KC_T, KC_Y, KC_U, KC_I, KC_O, KC_P, KC_LBRC, KC_RBRC, KC_BSLS,\n KC_RCTL, LGUI(KC_A), LALT(KC_S), LCTL(KC_D), LSFT(KC_F), KC_G, KC_H, RSFT(KC_J), RCTL(KC_K), RALT(KC_K), RGUI(KC_SCLN), KC_QUOT, KC_ENT,\n KC_LSFT, KC_Z, LT(1,KC_X), LT(2,KC_C), LT(3,KC_V), KC_B, KC_N, LT(4,KC_M), LT(5,KC_COMM), KC_DOT, KC_SLSH, KC_UP,\n KC_LCTL, KC_LALT, KC_LGUI, KC_SPC, KC_RALT, KC_LEFT, KC_DOWN, KC_RGHT\n ),\n\n [1] = LAYOUT_60_ansi(\n KC_NO, KC_AP2_BT1, KC_AP2_BT2, KC_AP2_BT3, KC_AP2_BT4, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_BRID, KC_BRIU, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_MPRV, KC_VOLD, KC_VOLU, KC_MNXT, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_MUTE, KC_MPLY, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO\n ),\n\n [2] = LAYOUT_60_ansi(\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_LEFT, KC_DOWN, KC_UP, KC_RGHT, KC_CAPS, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_INS, KC_PGDN, KC_PGUP, KC_HOME, KC_END, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO\n ),\n\n [3] = LAYOUT_60_ansi(\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, MS_BTN1, MS_BTN2, MS_BTN3, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, MS_LEFT, MS_DOWN, MS_UP, MS_RGHT, 
KC_NO, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, MS_WHLL, MS_WHLD, MS_WHLU, MS_WHLR, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO\n ),\n\n [4] = LAYOUT_60_ansi(\n KC_NO, KC_NO, KC_NO, KC_LPRN, KC_RPRN, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_LCBR, KC_AMPR, KC_ASTR, KC_NO, KC_RCBR, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_COLN, KC_DLR, KC_PERC, KC_CIRC, KC_EQL, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_TILD, KC_EXLM, KC_AT, KC_HASH, KC_PIPE, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO\n ),\n\n [5] = LAYOUT_60_ansi(\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_LBRC, KC_7, KC_8, KC_9, KC_RBRC, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_SCLN, KC_4, KC_5, KC_6, KC_EQL, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_GRV, KC_1, KC_2, KC_3, KC_BSLS, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO\n ),\n};\n// clang-format on\n\n\n",c,content
+659,10020771,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",0,222,"/* This keymap implements exactly the layout from the provided JSON export. */",c,content
+660,10020793,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",3243,7753,"",c,content
+661,10292183,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",3243,0,"\n",c,content
+662,10292183,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",2744,470," /* BT (Bluetooth controls for Anne Pro 2) */\n [BT] = LAYOUT_60_ansi(\n _______, KC_AP2_BT_UNPAIR, _______, _______, KC_AP2_USB, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, KC_AP2_BT1, KC_AP2_BT2, KC_AP2_BT3, KC_AP2_BT4, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______",c,content
+663,10292183,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",2240,497," /* NUMBERS (Separate: numbers on home row; duplicate thumbs) */\n [NUMBERS] = LAYOUT_60_ansi(\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, KC_7, KC_8, KC_9, KC_0, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_BSPC, KC_ENT, KC_DEL, KC_RCTL",c,content
+664,10292183,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",1744,489," /* MOUSE (Right-hand: mouse; movement mirrors NAV; buttons on right thumbs) */\n [MOUSE] = LAYOUT_60_ansi(\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, LCTL(KC_Z), LCTL(KC_Y), LCTL(KC_X), LCTL(KC_C), LCTL(KC_V), _______, _______, _______,\n _______, _______, _______, _______, _______, _______, MS_LEFT, MS_DOWN, MS_UP, MS_RGHT, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, MS_WHLD, MS_WHLU, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, MS_BTN1, MS_BTN2, MS_BTN3, KC_RCTL\n ),\n\n /* MEDIA (Right-hand: media; RGB and BT controls inline) */\n [MEDIA] = LAYOUT_60_ansi(\n _______, KC_F1, KC_F2, KC_F3, KC_F4, KC_F5, KC_F6, KC_F7, KC_F8, KC_F9, KC_F10, KC_F11, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, KC_BRID, KC_BRIU, _______, _______, QK_BOOT, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, KC_MUTE, _______, _______, _______, _______, _______,\n KC_MPRV, KC_MPLY, KC_MNXT, _______, KC_VOLD, KC_VOLU, _______, _______, _______, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_MPLY, KC_MSTP, KC_MUTE, KC_RCTL\n ),\n\n /* NUM (Left-hand primary: numerals/symbols; duplicate thumbs) */\n [NUM] = LAYOUT_60_ansi(\n _______, _______, _______, _______, _______, _______, _______, KC_P7, KC_P8, KC_P9, _______, _______, _______, _______,\n _______, _______, _______, KC_PSLS, KC_EQL, _______, _______, KC_P4, KC_P5, KC_P6, KC_PPLS, _______, _______, _______,\n _______, _______, _______, _______, KC_PAST, _______, KC_P0, KC_P1, KC_P2, KC_P3, KC_PMNS, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_BSPC, 
KC_ENT, KC_DEL, KC_RCTL\n ),\n\n /* SYM (Left-hand secondary: shifted symbols; duplicate thumbs) */\n [SYM] = LAYOUT_60_ansi(\n _______, _______, S(KC_QUOT), KC_QUOT, _______, _______, _______, _______, KC_LBRC, KC_RBRC, KC_BSLS, KC_MINS, _______, _______,\n _______, KC_EXLM, KC_AT, KC_HASH, KC_DLR, KC_PERC, KC_CIRC, KC_AMPR, KC_ASTR, KC_LPRN, KC_RPRN, _______, _______, _______,\n _______, KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, KC_7, KC_8, KC_9, KC_0, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_BSPC, KC_ENT, KC_DEL, KC_RCTL\n ),\n\n /* FUN (Left-hand tertiary: function keys; duplicate thumbs) */\n [FUN] = LAYOUT_60_ansi(\n _______, KC_F1, KC_F2, KC_F3, KC_F4, KC_F5, KC_F6, KC_F7, KC_F8, KC_F9, KC_F10, KC_F11, KC_F12, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, QK_BOOT, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_BSPC, KC_ENT, KC_DEL, KC_RCTL",c,content
+665,10292183,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",1252,485," /* HYPER (Right-hand: Hyper combos + mouse movement/wheel; mouse buttons on thumbs) */\n [HYPER] = LAYOUT_60_ansi(\n _______, S(A(KC_Q)), S(A(KC_W)), S(A(KC_F)), S(A(KC_P)), S(A(KC_B)), _______, KC_TAB, S(KC_TAB), S(A(KC_Y)), S(A(KC_SLSH)), _______, _______, _______,\n _______, S(A(KC_A)), S(A(KC_R)), KC_TAB, S(KC_TAB), S(A(KC_G)), MS_LEFT, MS_DOWN, MS_UP, MS_RGHT, S(A(KC_O)), _______, _______, _______,\n _______, S(A(KC_Z)), S(A(KC_X)), A(S(KC_C)), S(A(KC_D)), A(S(KC_V)), MS_WHLL, MS_WHLD, MS_WHLU, MS_WHLR, S(A(KC_MINS)), _______, _______,\n S(A(KC_F7)), S(A(KC_F8)), KC_LGUI, S(A(KC_F10)), MS_BTN1, MS_BTN2, S(A(KC_F13)), _______, _______, _______, _______, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, MS_BTN1, MS_BTN2, MS_BTN3, KC_RCTL",c,content
+666,10292183,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",740,505," /* NAV (Right-hand: arrows/edit; duplicate thumb taps) */\n [NAV] = LAYOUT_60_ansi(\n _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, _______, QK_BOOT,\n _______, _______, KC_TILD, KC_PIPE, KC_GRV, _______, LCTL(KC_Z), LCTL(KC_Y), LCTL(KC_X), LCTL(KC_C), LCTL(KC_V), _______, _______, _______,\n _______, _______, _______, _______, _______, _______, KC_LEFT, KC_DOWN, KC_UP, KC_RGHT, _______, _______, _______,\n _______, _______, _______, _______, _______, _______, _______, KC_HOME, KC_PGDN, KC_PGUP, KC_END, _______,\n KC_ESC, KC_TAB, KC_LALT, KC_SPC, KC_BSPC, KC_ENT, KC_DEL, KC_RCTL",c,content
+667,10292183,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",180,553," /* BASE (QWERTY with Miryoku-style home row mods and thumb LTs) */\n [BASE] = LAYOUT_60_ansi(\n KC_ESC, KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, KC_7, KC_8, KC_9, KC_0, KC_MINS, KC_EQL, KC_BSPC,\n KC_TAB, KC_Q, KC_W, KC_E, KC_R, KC_T, KC_Y, KC_U, KC_I, KC_O, KC_P, KC_LBRC, KC_RBRC, KC_BSLS,\n KC_CAPS, LGUI_T(KC_A), LALT_T(KC_S), LCTL_T(KC_D), LSFT_T(KC_F), KC_G, KC_H, RSFT_T(KC_J), RCTL_T(KC_K), RALT_T(KC_L), RGUI_T(KC_SCLN), KC_QUOT, KC_ENT,\n KC_LSFT, LT(HYPER, KC_Z), LT(NUM, KC_X), KC_C, KC_V, KC_B, KC_N, KC_M, KC_COMM, KC_DOT, KC_SLSH, RSFT_T(KC_SLSH),\n MT(MOD_LALT, KC_TAB), KC_SPC, LT(MOUSE, KC_LGUI), KC_SPC, LT(NAV, KC_ENT), LT(SYM, KC_BSPC), _______, KC_RCTL",c,content
+668,10292183,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",105,0,"enum anne_pro_layers {\n BASE,\n NAV, // Right-hand: navigation/edit\n HYPER, // Right-hand: Hyper combos + mouse movement/wheel\n MOUSE, // Right-hand: mouse emulation\n MEDIA, // Right-hand: media / BT controls\n NUM, // Left-hand: numpad\n SYM, // Left-hand: shifted symbols\n FUN, // Left-hand: function/system\n NUMBERS, // Separate numbers on home row\n BT // Bluetooth controls\n};\n\n// Convenience alias for transparent\n#define _______ KC_TRNS\n\n",c,content
+669,10292183,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",0,78,"/* Calmar One keymap for Anne Pro 2\n *\n * Approximates the user's ZMK layout:\n * - Layers: BASE, SYM (Special-Chars), NAV (Arrows), HYPER (Mouse/Hyper combos), NUM (Numpad), FKEY (F-keys/Media), NUMBERS, BT (Bluetooth)\n */",c,content
+670,10292281,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",249,0,"// clang-format off\nconst uint16_t keymaps[][MATRIX_ROWS][MATRIX_COLS] = {\n [0] = LAYOUT_60_ansi(\n KC_ESC, KC_1, KC_2, KC_3, KC_4, KC_5, KC_6, KC_7, KC_8, KC_9, KC_0, KC_MINS, KC_EQL, KC_BSPC,\n KC_TAB, KC_Q, KC_W, KC_E, KC_R, KC_T, KC_Y, KC_U, KC_I, KC_O, KC_P, KC_LBRC, KC_RBRC, KC_BSLS,\n KC_RCTL, LGUI_T(KC_A), LALT_T(KC_S), LCTL_T(KC_D), LSFT_T(KC_F), KC_G, KC_H, RSFT_T(KC_J), RCTL_T(KC_K), RALT_T(KC_L), RGUI_T(KC_SCLN), KC_QUOT, KC_ENT,\n KC_LSFT, KC_Z, LT(1,KC_X), LT(2,KC_C), LT(3,KC_V), KC_B, KC_N, LT(4,KC_M), LT(5,KC_COMM), KC_DOT, KC_SLSH, KC_UP,\n KC_LCTL, KC_LALT, KC_LGUI, KC_SPC, KC_RALT, KC_LEFT, KC_DOWN, KC_RGHT\n ),\n\n [1] = LAYOUT_60_ansi(\n KC_NO, KC_AP2_BT1, KC_AP2_BT2, KC_AP2_BT3, KC_AP2_BT4, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_BRID, KC_BRIU, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_MPRV, KC_VOLD, KC_VOLU, KC_MNXT, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_MUTE, KC_MPLY, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO\n ),\n\n [2] = LAYOUT_60_ansi(\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_LEFT, KC_DOWN, KC_UP, KC_RGHT, KC_CAPS, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_INS, KC_PGDN, KC_PGUP, KC_HOME, KC_END, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO\n ),\n\n [3] = LAYOUT_60_ansi(\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, MS_BTN1, MS_BTN2, MS_BTN3, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, MS_LEFT, MS_DOWN, 
MS_UP, MS_RGHT, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, MS_WHLL, MS_WHLD, MS_WHLU, MS_WHLR, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO\n ),\n\n [4] = LAYOUT_60_ansi(\n KC_NO, KC_NO, KC_NO, KC_LPRN, KC_RPRN, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_LCBR, KC_AMPR, KC_ASTR, KC_NO, KC_RCBR, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_COLN, KC_DLR, KC_PERC, KC_CIRC, KC_EQL, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_TILD, KC_EXLM, KC_AT, KC_HASH, KC_PIPE, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO\n ),\n\n [5] = LAYOUT_60_ansi(\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_LBRC, KC_7, KC_8, KC_9, KC_RBRC, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_SCLN, KC_4, KC_5, KC_6, KC_EQL, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_GRV, KC_1, KC_2, KC_3, KC_BSLS, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO,\n KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO, KC_NO\n ),\n};\n// clang-format on\n\n\n",c,content
+671,10292281,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",0,222,"/* This keymap implements exactly the layout from the provided JSON export. */",c,content
+672,10292299,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",3259,7753,"",c,content
+673,38385939,"keyboards/annepro2/keymaps/calmar_one/keymap.c",0,0,"",c,tab
+674,38385966,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1400,0,"",c,selection_command
+675,38400101,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1401,0,"",c,selection_command
+676,38400181,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1401,0,"a",c,content
+677,38400189,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1402,0,"",c,selection_keyboard
+678,38400427,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1402,0,"a",c,content
+679,38400432,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1403,0,"",c,selection_keyboard
+680,38400468,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1403,0,"a",c,content
+681,38400472,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1404,0,"",c,selection_keyboard
+682,38400497,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1404,0,"a",c,content
+683,38400500,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1405,0,"",c,selection_keyboard
+684,38400529,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1405,0,"a",c,content
+685,38400532,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1406,0,"",c,selection_keyboard
+686,38400564,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1406,0,"a",c,content
+687,38400565,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1407,0,"",c,selection_keyboard
+688,38400595,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1407,0,"a",c,content
+689,38400597,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1408,0,"",c,selection_keyboard
+690,38400628,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1408,0,"a",c,content
+691,38400631,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1409,0,"",c,selection_keyboard
+692,38400662,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1409,0,"a",c,content
+693,38400664,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1410,0,"",c,selection_keyboard
+694,38400695,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1410,0,"a",c,content
+695,38400700,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1411,0,"",c,selection_keyboard
+696,38400727,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1411,0,"a",c,content
+697,38400730,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1412,0,"",c,selection_keyboard
+698,38400762,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1412,0,"a",c,content
+699,38400764,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1413,0,"",c,selection_keyboard
+700,38400794,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1413,0,"a",c,content
+701,38400797,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1414,0,"",c,selection_keyboard
+702,38400828,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1414,0,"a",c,content
+703,38400830,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1415,0,"",c,selection_keyboard
+704,38401184,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1414,1,"",c,content
+705,38401428,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1413,1,"",c,content
+706,38401463,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1412,1,"",c,content
+707,38401498,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1411,1,"",c,content
+708,38401536,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1410,1,"",c,content
+709,38401564,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1409,1,"",c,content
+710,38401597,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1408,1,"",c,content
+711,38401630,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1407,1,"",c,content
+712,38401664,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1406,1,"",c,content
+713,38401697,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1405,1,"",c,content
+714,38401871,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1404,1,"",c,content
+715,38402048,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1403,1,"",c,content
+716,38402398,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1402,1,"",c,content
+717,38402724,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1401,1,"",c,content
+718,38403823,"keyboards/annepro2/keymaps/calmar_one/keymap.c",1400,0,"",c,selection_command
+719,38407208,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",0,0,"",c,tab
+720,38408093,"keyboards/annepro2/keymaps/calmar_one/keymap.c",0,0,"",c,tab
+721,38418784,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",0,0,"",c,tab
+722,38423292,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",1500,0,"",c,selection_command
+723,38423932,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",2788,0,"",c,selection_command
+724,38424046,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",3259,0,"",c,selection_command
+725,38424645,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",2252,0,"",c,selection_command
+726,38424796,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",758,0,"",c,selection_command
+727,38424974,"keyboards/annepro2/keymaps/calmar_one-backup/keymap.c",0,0,"",c,selection_command
diff --git a/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-b9cac4fb-bb28-43f8-ac71-4b73d9df27691762370355479-2025_11_05-20.19.17.356/source.csv b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-b9cac4fb-bb28-43f8-ac71-4b73d9df27691762370355479-2025_11_05-20.19.17.356/source.csv
new file mode 100644
index 0000000000000000000000000000000000000000..f4e4e1129433916cb660c7fe7c881559ad857270
--- /dev/null
+++ b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-b9cac4fb-bb28-43f8-ac71-4b73d9df27691762370355479-2025_11_05-20.19.17.356/source.csv
@@ -0,0 +1,9 @@
+Sequence,Time,File,RangeOffset,RangeLength,Text,Language,Type
+2,45,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"",Log,tab
+3,89,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"8:19:17 PM [info] Activating crowd-code\n8:19:17 PM [info] Recording started\n8:19:17 PM [info] Initializing git provider using file system watchers...\n8:19:17 PM [info] Git repository found\n8:19:17 PM [info] Git provider initialized successfully\n8:19:17 PM [info] Initial git state: [object Object]\n",Log,content
+4,2475,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"",Log,tab
+5,15080,"",0,0,"Switched from branch 'main' to 'crowd-pilot-extension-submodule'",,git_branch_checkout
+6,15253,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"8:19:17 PM [info] Activating crowd-code\n8:19:17 PM [info] Recording started\n8:19:17 PM [info] Initializing git provider using file system watchers...\n8:19:17 PM [info] Git repository found\n8:19:17 PM [info] Git provider initialized successfully\n8:19:17 PM [info] Initial git state: [object Object]\n8:19:32 PM [info] Branch checkout detected: main -> crowd-pilot-extension-submodule\n8:19:32 PM [info] Recording git checkout: Switched from branch 'main' to 'crowd-pilot-extension-submodule'\n8:19:32 PM [info] Resetting file cache due to branch checkout\n",Log,tab
+7,17666,"TERMINAL",0,0,"",,terminal_focus
+8,24113,".gitmodules",0,0,"[submodule ""maxtext""]\n\tpath = maxtext\n\turl = https://github.com/AI-Hypercomputer/maxtext\n[submodule ""crowd-pilot-extension""]\n\tpath = crowd-pilot-extension\n\turl = git@github.com:p-doom/crowd-pilot-extension.git\n",properties,tab
+9,24132,".gitmodules",89,0,"",properties,selection_command
diff --git a/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-c0cf5d8f-d96c-42ed-92fd-777800d8c7cd1764443845176-2025_11_29-20.17.32.905/source.csv b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-c0cf5d8f-d96c-42ed-92fd-777800d8c7cd1764443845176-2025_11_29-20.17.32.905/source.csv
new file mode 100644
index 0000000000000000000000000000000000000000..c9a8be42c68ff1fe6fdf5847c6b1b039713cdbea
--- /dev/null
+++ b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-c0cf5d8f-d96c-42ed-92fd-777800d8c7cd1764443845176-2025_11_29-20.17.32.905/source.csv
@@ -0,0 +1,57 @@
+Sequence,Time,File,RangeOffset,RangeLength,Text,Language,Type
+1,11,"Untitled-1",0,0,"",plaintext,tab
+2,119,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"8:17:32 PM [info] Activating crowd-code\n8:17:32 PM [info] Recording started\n8:17:32 PM [info] Initializing git provider using file system watchers...\n8:17:32 PM [info] No workspace folder found\n",Log,tab
+3,2028,"extension-output-pdoom-org.crowd-code-#1-crowd-code",194,0,"8:17:34 PM [info] Retrying git provider initialization...\n8:17:34 PM [info] No workspace folder found\n",Log,content
+4,7750,"Untitled-1",0,0,"",plaintext,tab
+5,11552,"TERMINAL",0,0,"Test",,terminal_focus
+6,11558,"Untitled-1",0,0,"/* crowd-pilot: insert start */\nline A\nline B\n/* crowd-pilot: insert end */\n",plaintext,content
+7,13974,"Untitled-1",46,0,"",plaintext,selection_command
+8,14130,"Untitled-1",39,0,"",plaintext,selection_command
+9,14299,"Untitled-1",32,0,"",plaintext,selection_command
+10,14462,"Untitled-1",0,0,"",plaintext,selection_command
+11,83623,"Untitled-1",0,0,"\n",plaintext,content
+12,93453,"Untitled-1",1,0,"",plaintext,selection_command
+13,93581,"Untitled-1",0,0,"",plaintext,selection_command
+14,97816,"Untitled-1",0,0,"\n",plaintext,content
+15,99560,"Untitled-1",1,0,"",plaintext,selection_command
+16,99705,"Untitled-1",0,0,"",plaintext,selection_command
+17,209732,"Untitled-1",34,13,"/* crowd-pilot: replacement */\nREPLACED LINE 1\nREPLACED LINE 2",plaintext,content
+18,216860,"Untitled-1",1,0,"",plaintext,selection_command
+19,216946,"Untitled-1",0,0,"",plaintext,selection_command
+20,217804,"Untitled-1",1,0,"",plaintext,selection_command
+21,217959,"Untitled-1",2,0,"",plaintext,selection_command
+22,218768,"Untitled-1",34,0,"",plaintext,selection_command
+23,218997,"Untitled-1",65,0,"",plaintext,selection_command
+24,219160,"Untitled-1",81,0,"",plaintext,selection_command
+25,219298,"Untitled-1",97,0,"",plaintext,selection_command
+26,219600,"Untitled-1",81,0,"",plaintext,selection_command
+27,219848,"Untitled-1",65,0,"",plaintext,selection_command
+28,219879,"Untitled-1",34,0,"",plaintext,selection_command
+29,219912,"Untitled-1",2,0,"",plaintext,selection_command
+30,219946,"Untitled-1",1,0,"",plaintext,selection_command
+31,219980,"Untitled-1",0,0,"",plaintext,selection_command
+32,220480,"Untitled-1",0,0,"\n",plaintext,content
+33,221895,"Untitled-1",1,0,"",plaintext,selection_command
+34,222018,"Untitled-1",0,0,"",plaintext,selection_command
+35,223393,"Untitled-1",0,0,"\n",plaintext,content
+36,225322,"Untitled-1",0,0,"\n",plaintext,content
+37,226665,"Untitled-1",1,0,"",plaintext,selection_command
+38,227007,"Untitled-1",0,0,"",plaintext,selection_command
+39,229296,"Untitled-1",1,0,"",plaintext,selection_command
+40,230274,"Untitled-1",0,0,"",plaintext,selection_command
+41,233230,"Untitled-1",37,62,"",plaintext,content
+42,236150,"Untitled-1",2,0,"/* crowd-pilot: insert start */\nline A\nline B\n/* crowd-pilot: insert end */\n",plaintext,content
+43,243613,"Untitled-1",34,13,"/* crowd-pilot: replacement */\nREPLACED LINE 1\nREPLACED LINE 2",plaintext,content
+44,246883,"Untitled-1",97,31,"",plaintext,content
+45,248948,"Untitled-1",2,0,"/* crowd-pilot: insert start */\nline A\nline B\n/* crowd-pilot: insert end */\n",plaintext,content
+46,249872,"Untitled-1",34,13,"/* crowd-pilot: replacement */\nREPLACED LINE 1\nREPLACED LINE 2",plaintext,content
+47,250330,"Untitled-1",97,92,"",plaintext,content
+48,250872,"Untitled-1",2,0,"/* crowd-pilot: insert start */\nline A\nline B\n/* crowd-pilot: insert end */\n",plaintext,content
+49,251166,"Untitled-1",34,13,"/* crowd-pilot: replacement */\nREPLACED LINE 1\nREPLACED LINE 2",plaintext,content
+50,251422,"Untitled-1",97,92,"",plaintext,content
+51,251686,"Untitled-1",2,0,"/* crowd-pilot: insert start */\nline A\nline B\n/* crowd-pilot: insert end */\n",plaintext,content
+52,252037,"Untitled-1",34,13,"/* crowd-pilot: replacement */\nREPLACED LINE 1\nREPLACED LINE 2",plaintext,content
+53,252224,"Untitled-1",97,92,"",plaintext,content
+54,252883,"Untitled-1",2,0,"/* crowd-pilot: insert start */\nline A\nline B\n/* crowd-pilot: insert end */\n",plaintext,content
+55,253347,"Untitled-1",34,13,"/* crowd-pilot: replacement */\nREPLACED LINE 1\nREPLACED LINE 2",plaintext,content
+56,254207,"Untitled-1",97,92,"",plaintext,content
diff --git a/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-c540a54c-40c8-4e46-b509-2316a8a981721765978775045-2025_12_17-14.39.41.212/source.csv b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-c540a54c-40c8-4e46-b509-2316a8a981721765978775045-2025_12_17-14.39.41.212/source.csv
new file mode 100644
index 0000000000000000000000000000000000000000..c9622e332ccf66763482f8ff05c265f9bf012e89
--- /dev/null
+++ b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-c540a54c-40c8-4e46-b509-2316a8a981721765978775045-2025_12_17-14.39.41.212/source.csv
@@ -0,0 +1,750 @@
+Sequence,Time,File,RangeOffset,RangeLength,Text,Language,Type
+1,3,"crates/core/src/pipeline.rs",0,0,"//! Pipeline for processing CSV sessions into conversations.\n\nuse std::path::Path;\nuse std::sync::atomic::{AtomicUsize, Ordering};\n\nuse rayon::prelude::*;\nuse serde::{Deserialize, Serialize};\nuse walkdir::WalkDir;\n\nuse crate::conversation::{ConversationStateManager, ConversationStateManagerConfig, FinalizedConversation};\nuse crate::Tokenizer;\n\n/// A row from the CSV file.\n#[derive(Debug, Deserialize)]\n#[serde(rename_all = ""PascalCase"")]\nstruct CsvRow {\n #[serde(rename = ""Sequence"")]\n _sequence: Option,\n #[serde(rename = ""Time"")]\n _time: Option,\n file: String,\n range_offset: Option,\n range_length: Option,\n text: Option,\n #[serde(rename = ""Language"")]\n _language: Option,\n #[serde(rename = ""Type"")]\n event_type: String,\n}\n\n/// Configuration for the pipeline.\n#[derive(Debug, Clone)]\npub struct PipelineConfig {\n pub max_tokens_per_conversation: usize,\n pub max_tokens_per_message: usize,\n pub min_conversation_messages: usize,\n pub viewport_radius: usize,\n pub coalesce_radius: usize,\n pub val_ratio: f64,\n}\n\nimpl Default for PipelineConfig {\n fn default() -> Self {\n Self {\n max_tokens_per_conversation: 8192,\n max_tokens_per_message: 2048,\n min_conversation_messages: 5,\n viewport_radius: 10,\n coalesce_radius: 5,\n val_ratio: 0.1,\n }\n }\n}\n\n/// Result of processing a single session.\n#[derive(Debug)]\npub struct SessionResult {\n pub conversations: Vec,\n pub source_path: String,\n}\n\n/// Result of processing all sessions.\n#[derive(Debug, Serialize)]\npub struct PipelineResult {\n pub total_sessions: usize,\n pub total_conversations: usize,\n pub train_conversations: usize,\n pub val_conversations: usize,\n pub total_messages: usize,\n pub total_tokens: usize,\n}\n\n/// NeMo conversation record format.\n#[derive(Debug, Serialize)]\npub struct NemoRecord {\n pub mask: String,\n pub system: String,\n pub conversations: Vec,\n}\n\n/// A message in NeMo format.\n#[derive(Debug, 
Serialize)]\npub struct NemoMessage {\n pub from: String,\n pub value: String,\n}\n\n/// Discover all CSV files in a directory.\npub fn discover_csv_files(root: &Path) -> Vec {\n let mut paths: Vec = WalkDir::new(root)\n .into_iter()\n .filter_map(|e| e.ok())\n .filter(|e| e.path().extension().map_or(false, |ext| ext == ""csv""))\n .map(|e| e.path().to_path_buf())\n .collect();\n paths.sort();\n paths\n}\n\n/// Process a single CSV session file.\npub fn process_session(\n csv_path: &Path,\n tokenizer: &T,\n config: &PipelineConfig,\n) -> Result, Box>\nwhere\n T: Tokenizer,\n{\n let manager_config = ConversationStateManagerConfig {\n viewport_radius: config.viewport_radius,\n coalesce_radius: config.coalesce_radius,\n max_tokens_per_message: config.max_tokens_per_message,\n max_tokens_per_terminal_output: 256,\n max_tokens_per_conversation: Some(config.max_tokens_per_conversation),\n min_conversation_messages: config.min_conversation_messages,\n };\n\n let mut manager = ConversationStateManager::new(tokenizer, manager_config);\n\n let mut reader = csv::Reader::from_path(csv_path)?;\n \n for result in reader.deserialize() {\n let row: CsvRow = result?;\n \n match row.event_type.as_str() {\n ""tab"" => {\n manager.handle_tab_event(&row.file, row.text.as_deref());\n }\n ""content"" => {\n let offset = row.range_offset.expect(""content event missing RangeOffset"") as usize;\n let length = row.range_length.expect(""content event missing RangeLength"") as usize;\n let text = row.text.as_deref().unwrap_or("""");\n manager.handle_content_event(&row.file, offset, length, text);\n }\n ""selection_command"" | ""selection_mouse"" | ""selection_keyboard"" => {\n let offset = row.range_offset.expect(""selection event missing RangeOffset"") as usize;\n manager.handle_selection_event(&row.file, offset);\n }\n ""terminal_command"" => {\n let command = row.text.as_deref().unwrap_or_else(|| {\n eprintln!(""Warning: terminal_command event missing Text in {:?}"", csv_path);\n """"\n 
});\n manager.handle_terminal_command_event(command);\n }\n ""terminal_output"" => {\n let output = row.text.as_deref().unwrap_or_else(|| {\n eprintln!(""Warning: terminal_output event missing Text in {:?}"", csv_path);\n """"\n });\n manager.handle_terminal_output_event(output);\n }\n ""terminal_focus"" => {\n manager.handle_terminal_focus_event();\n }\n ""git_branch_checkout"" => {\n let branch_info = row.text.as_deref().unwrap_or_else(|| {\n eprintln!(""Warning: git_branch_checkout event missing Text in {:?}"", csv_path);\n """"\n });\n manager.handle_git_branch_checkout_event(branch_info);\n }\n other => {\n eprintln!(""Warning: Unknown event type '{}' in {:?}"", other, csv_path);\n }\n }\n }\n\n Ok(manager.get_conversations())\n}\n\n/// Process all CSV sessions in a directory in parallel.\n///\n/// Uses rayon for parallel processing. The tokenizer must be `Sync + Send`\n/// to be shared across threads.\npub fn process_all_sessions(\n csv_root: &Path,\n tokenizer: &T,\n config: &PipelineConfig,\n) -> Result, Box>\nwhere\n T: Tokenizer + Sync + Send,\n{\n let csv_files = discover_csv_files(csv_root);\n\n if csv_files.is_empty() {\n return Err(format!(""No CSV files found under {:?}"", csv_root).into());\n }\n\n let total_files = csv_files.len();\n let processed_count = AtomicUsize::new(0);\n let error_count = AtomicUsize::new(0);\n\n let results: Vec = csv_files\n .into_iter()\n .filter_map(|csv_path| {\n let result = process_session(&csv_path, tokenizer, config);\n let count = processed_count.fetch_add(1, Ordering::Relaxed) + 1;\n\n match result {\n Ok(conversations) => {\n if count % 100 == 0 || count == total_files {\n eprintln!(""Processed {}/{} sessions..."", count, total_files);\n }\n Some(SessionResult {\n conversations,\n source_path: csv_path.to_string_lossy().to_string(),\n })\n }\n Err(e) => {\n error_count.fetch_add(1, Ordering::Relaxed);\n eprintln!(""Error processing {:?}: {}"", csv_path, e);\n None\n }\n }\n })\n .collect();\n\n let errors = 
error_count.load(Ordering::Relaxed);\n if errors > 0 {\n eprintln!(""Warning: {} sessions failed to process"", errors);\n }\n\n Ok(results)\n}\n\n/// Write conversations to JSONL files (training and validation).\npub fn write_jsonl_output(\n session_results: Vec,\n output_dir: &Path,\n val_ratio: f64,\n system_prompt: &str,\n) -> Result> {\n use std::fs::File;\n use std::io::{BufWriter, Write};\n\n std::fs::create_dir_all(output_dir)?;\n\n // Shuffle sessions for train/val split (using simple deterministic shuffle)\n let mut sessions: Vec<_> = session_results.into_iter().enumerate().collect();\n // Simple deterministic shuffle based on index\n sessions.sort_by(|(i, a), (j, b)| {\n let hash_a = (i * 2654435761) % 1000;\n let hash_b = (j * 2654435761) % 1000;\n hash_a.cmp(&hash_b).then_with(|| a.source_path.cmp(&b.source_path))\n });\n\n let total_sessions = sessions.len();\n let val_count = (total_sessions as f64 * val_ratio).round() as usize;\n let train_count = total_sessions - val_count;\n\n let train_path = output_dir.join(""training.jsonl"");\n let val_path = output_dir.join(""validation.jsonl"");\n\n let mut train_file = BufWriter::new(File::create(&train_path)?);\n let mut val_file = BufWriter::new(File::create(&val_path)?);\n\n let mut train_conversations = 0;\n let mut val_conversations = 0;\n let mut total_messages = 0;\n let mut total_tokens = 0;\n\n for (idx, (_, session)) in sessions.into_iter().enumerate() {\n let is_validation = idx >= train_count;\n \n for conv in session.conversations {\n let nemo_messages: Vec = conv\n .messages\n .iter()\n .map(|m| NemoMessage {\n from: m.from.clone(),\n value: m.value.clone(),\n })\n .collect();\n\n let record = NemoRecord {\n mask: ""User"".to_string(),\n system: system_prompt.to_string(),\n conversations: nemo_messages,\n };\n\n let json_line = serde_json::to_string(&record)?;\n \n if is_validation {\n writeln!(val_file, ""{}"", json_line)?;\n val_conversations += 1;\n } else {\n writeln!(train_file, ""{}"", 
json_line)?;\n train_conversations += 1;\n }\n\n total_messages += conv.messages.len();\n total_tokens += conv.token_count;\n }\n }\n\n train_file.flush()?;\n val_file.flush()?;\n\n Ok(PipelineResult {\n total_sessions,\n total_conversations: train_conversations + val_conversations,\n train_conversations,\n val_conversations,\n total_messages,\n total_tokens,\n })\n}\n\n#[cfg(test)]\nmod tests {\n use super::*;\n use std::io::Write;\n use tempfile::TempDir;\n\n /// Character-based approximate tokenizer for tests.\n struct CharApproxTokenizer;\n\n impl Tokenizer for CharApproxTokenizer {\n fn count_tokens(&self, text: &str) -> usize {\n text.len() / 4\n }\n\n fn truncate_to_max_tokens(&self, text: &str, max_tokens: usize) -> String {\n text.chars().take(max_tokens * 4).collect()\n }\n }\n\n #[test]\n fn test_discover_csv_files() {\n let temp = TempDir::new().unwrap();\n let csv1 = temp.path().join(""session1.csv"");\n let csv2 = temp.path().join(""subdir/session2.csv"");\n \n std::fs::create_dir_all(temp.path().join(""subdir"")).unwrap();\n std::fs::write(&csv1, ""header\n"").unwrap();\n std::fs::write(&csv2, ""header\n"").unwrap();\n\n let files = discover_csv_files(temp.path());\n assert_eq!(files.len(), 2);\n }\n\n #[test]\n fn test_process_session() {\n let temp = TempDir::new().unwrap();\n let csv_path = temp.path().join(""test.csv"");\n \n let mut file = std::fs::File::create(&csv_path).unwrap();\n writeln!(file, ""Sequence,Time,File,RangeOffset,RangeLength,Text,Language,Type"").unwrap();\n writeln!(file, ""1,2024-01-01,/test/file.rs,0,0,\""fn main() {{}}\"",rust,tab"").unwrap();\n writeln!(file, ""2,2024-01-01,/test/file.rs,0,0,echo hello,bash,terminal_command"").unwrap();\n writeln!(file, ""3,2024-01-01,/test/file.rs,0,0,hello world,bash,terminal_output"").unwrap();\n\n let config = PipelineConfig {\n min_conversation_messages: 2,\n ..Default::default()\n };\n\n let tokenizer = CharApproxTokenizer;\n let conversations = process_session(&csv_path, &tokenizer, 
&config).unwrap();\n \n // Should have at least one conversation with messages\n assert!(!conversations.is_empty() || conversations.iter().any(|c| !c.messages.is_empty()));\n }\n}\n\n",rust,tab
+2,139,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"2:39:41 PM [info] Activating crowd-code\n2:39:41 PM [info] Recording started\n2:39:41 PM [info] Initializing git provider using file system watchers...\n",Log,tab
+3,176,"TERMINAL",0,0,"",,terminal_focus
+4,220,"extension-output-pdoom-org.crowd-code-#1-crowd-code",150,0,"2:39:41 PM [info] Git repository found\n2:39:41 PM [info] Git provider initialized successfully\n2:39:41 PM [info] Initial git state: [object Object]\n",Log,content
+5,1831,"crates/core/src/pipeline.rs",0,0,"",rust,tab
+6,5754,"TERMINAL",0,0,"bash",,terminal_focus
+7,8543,"TERMINAL",0,0,"squeue",,terminal_command
+8,8557,"TERMINAL",0,0,"]633;C JOBID USER PARTITION NODES CPUS ST SUBMIT_TIME START_TIME TIME TIME_LIMIT NODELIST(REASON)\r\n 36723 franz.sram interacti 1 200 R 2025-12-17T13:01:48 2025-12-17T13:01:48 1:38:01 1-00:00:00 hai008\r\n 36722 mihir.maha interacti 1 2 R 2025-12-17T12:51:04 2025-12-17T12:51:04 1:48:45 2:00:00 hai008\r\n 36701 xiao.liu interacti 1 128 R 2025-12-17T04:00:33 2025-12-17T04:00:33 10:39:16 23:59:00 hai005\r\n 36700 xiao.liu interacti 1 128 R 2025-12-17T01:06:53 2025-12-17T01:06:53 13:32:56 23:59:00 hai006\r\n 36730 florian.mu standard 1 2 R 2025-12-17T14:29:13 2025-12-17T14:29:13 10:36 1-00:00:00 hai007\r\n 36729 florian.mu standard 1 24 R 2025-12-17T14:27:06 2025-12-17T14:27:07 12:42 10:00:00 hai007\r\n 36704 nishant.ku standard 3 624 R 2025-12-17T09:38:39 2025-12-17T09:38:39 5:01:10 1-00:00:00 hai[002-004]\r\n 36691 kalyan.nad standard 1 128 R 2025-12-16T17:27:47 2025-12-16T17:27:58 21:11:51 1-00:00:00 hai001\r\n]0;franz.srambical@hai-login2:~",,terminal_output
+9,10709,"TERMINAL",0,0,"salloc --gpus=1 --ntasks-per-node=1 --cpus-per-task=10 --mem=400G --qos=normal",,terminal_command
+10,10771,"TERMINAL",0,0,"]633;Csalloc: Granted job allocation 36731\r\n",,terminal_output
+11,10869,"TERMINAL",0,0,"salloc: Nodes hai006 are ready for job\r\n",,terminal_output
+12,11142,"TERMINAL",0,0,"Running inside SLURM, Job ID 36731.\r\n",,terminal_output
+13,11234,"TERMINAL",0,0,"]0;franz.srambical@hai-login2:~[?2004h[franz.srambical@hai006.haicore.berlin:~] $ ",,terminal_output
+14,12073,"TERMINAL",0,0,"bash",,terminal_focus
+15,26051,"TERMINAL",0,0,"srun",,terminal_focus
+16,26759,"TERMINAL",0,0,"e",,terminal_output
+17,27043,"TERMINAL",0,0,"x",,terminal_output
+18,27210,"TERMINAL",0,0,"i",,terminal_output
+19,27372,"TERMINAL",0,0,"t",,terminal_output
+20,27486,"TERMINAL",0,0,"\r\n[?2004l\rexit\r\nsalloc: Relinquishing job allocation 36731\r\nsalloc: Job allocation 36731 has been revoked.\r\n]0;franz.srambical@hai-login2:~",,terminal_output
+21,29136,"TERMINAL",0,0,"bash",,terminal_focus
+22,40768,"crates/core/src/pipeline.rs",0,0,"",rust,tab
+23,40769,"crates/core/src/pipeline.rs",6254,0,"",rust,selection_command
+24,42910,"crates/cli/src/main.rs",0,0,"//! CLI tool for serializing crowd-pilot IDE interaction data.\n//!\n//! This tool processes CSV session files and outputs JSONL format suitable for\n//! NeMo SFT training. It uses the HuggingFace tokenizers Rust library for\n//! accurate token counting.\n\nuse std::path::PathBuf;\n\nuse clap::Parser;\nuse tokenizers::Tokenizer as HfTokenizer;\n\nuse crowd_pilot_serializer_core::{\n pipeline::{PipelineConfig, PipelineResult},\n process_all_sessions, write_jsonl_output, Tokenizer,\n};\n\n/// Serialize crowd-pilot CSV sessions to NeMo JSONL format.\n#[derive(Parser, Debug)]\n#[command(name = ""crowd-pilot-serialize"")]\n#[command(author, version, about, long_about = None)]\nstruct Args {\n /// Root directory containing CSV session files\n #[arg(long)]\n csv_root: PathBuf,\n\n /// Output directory for JSONL files\n #[arg(long)]\n output_dir: PathBuf,\n\n /// HuggingFace tokenizer model name or path\n #[arg(long)]\n tokenizer: String,\n\n /// Maximum tokens per conversation chunk\n #[arg(long, default_value = ""8192"")]\n max_tokens_per_conversation: usize,\n\n /// Maximum tokens per message\n #[arg(long, default_value = ""2048"")]\n max_tokens_per_message: usize,\n\n /// Minimum messages required to keep a conversation\n #[arg(long, default_value = ""5"")]\n min_conversation_messages: usize,\n\n /// Viewport radius (lines above/below cursor)\n #[arg(long, default_value = ""10"")]\n viewport_radius: usize,\n\n /// Coalesce radius for grouping nearby edits\n #[arg(long, default_value = ""5"")]\n coalesce_radius: usize,\n\n /// Fraction of sessions for validation (0.0-1.0)\n #[arg(long, default_value = ""0.1"")]\n val_ratio: f64,\n\n /// Custom system prompt (optional)\n #[arg(long)]\n system_prompt: Option,\n}\n\nconst DEFAULT_SYSTEM_PROMPT: &str = r#""You are a helpful assistant that can interact multiple times with a computer shell to solve programming tasks.\nYour response must contain exactly ONE bash code block with ONE 
command (or commands connected with && or ||).\n\nFormat your response as shown in .\n\n\n```bash\nyour_command_here\n```\n\n\nFailure to follow these rules will cause your response to be rejected.""#;\n\n/// Wrapper around HuggingFace tokenizers for token counting and truncation.\n///\n/// This uses the Rust-native tokenizers library, which is `Send + Sync`\n/// and enables true parallel tokenization without the Python GIL.\nstruct RustTokenizer {\n inner: HfTokenizer,\n}\n\nimpl RustTokenizer {\n /// Load a HuggingFace tokenizer from a model name or path.\n fn load(model_name: &str) -> Result<Self, Box<dyn std::error::Error>> {\n let inner = HfTokenizer::from_pretrained(model_name, None)\n .map_err(|e| e as Box<dyn std::error::Error>)?;\n Ok(Self { inner })\n }\n}\n\nimpl Tokenizer for RustTokenizer {\n fn count_tokens(&self, text: &str) -> usize {\n self.inner\n .encode(text, false)\n .map(|enc| enc.get_ids().len())\n .unwrap_or(0)\n }\n\n fn truncate_to_max_tokens(&self, text: &str, max_tokens: usize) -> String {\n match self.inner.encode(text, false) {\n Ok(encoding) => {\n let ids = encoding.get_ids();\n if ids.len() <= max_tokens {\n return text.to_string();\n }\n let truncated_ids: Vec<u32> = ids[..max_tokens].to_vec();\n self.inner\n .decode(&truncated_ids, true)\n .unwrap_or_else(|_| text.chars().take(max_tokens * 4).collect())\n }\n Err(_) => text.chars().take(max_tokens * 4).collect(),\n }\n }\n}\n\nfn main() -> Result<(), Box<dyn std::error::Error>> {\n let args = Args::parse();\n\n println!(""Loading tokenizer from {}..."", args.tokenizer);\n let tokenizer = RustTokenizer::load(&args.tokenizer)?;\n\n let config = PipelineConfig {\n max_tokens_per_conversation: args.max_tokens_per_conversation,\n max_tokens_per_message: args.max_tokens_per_message,\n min_conversation_messages: args.min_conversation_messages,\n viewport_radius: args.viewport_radius,\n coalesce_radius: args.coalesce_radius,\n val_ratio: args.val_ratio,\n };\n\n println!(""Processing CSV files from {:?}..."", args.csv_root);\n let session_results = 
process_all_sessions(\n &args.csv_root,\n &tokenizer,\n &config,\n )?;\n\n let total_sessions = session_results.len();\n println!(""Processed {} sessions"", total_sessions);\n\n let system_prompt = args.system_prompt.as_deref().unwrap_or(DEFAULT_SYSTEM_PROMPT);\n\n println!(""Writing output to {:?}..."", args.output_dir);\n let result: PipelineResult = write_jsonl_output(\n session_results,\n &args.output_dir,\n args.val_ratio,\n system_prompt,\n )?;\n\n let metadata_path = args.output_dir.join(""metadata.json"");\n let metadata = serde_json::json!({\n ""config"": {\n ""csv_root"": args.csv_root.to_string_lossy(),\n ""output_dir"": args.output_dir.to_string_lossy(),\n ""tokenizer"": args.tokenizer,\n ""max_tokens_per_conversation"": args.max_tokens_per_conversation,\n ""max_tokens_per_message"": args.max_tokens_per_message,\n ""min_conversation_messages"": args.min_conversation_messages,\n ""viewport_radius"": args.viewport_radius,\n ""coalesce_radius"": args.coalesce_radius,\n ""val_ratio"": args.val_ratio,\n },\n ""counts"": {\n ""total_sessions"": result.total_sessions,\n ""total_conversations"": result.total_conversations,\n ""train_conversations"": result.train_conversations,\n ""val_conversations"": result.val_conversations,\n },\n ""stats"": {\n ""total_messages"": result.total_messages,\n ""total_tokens"": result.total_tokens,\n ""avg_messages_per_conversation"": if result.total_conversations > 0 {\n result.total_messages as f64 / result.total_conversations as f64\n } else {\n 0.0\n },\n ""avg_tokens_per_conversation"": if result.total_conversations > 0 {\n result.total_tokens as f64 / result.total_conversations as f64\n } else {\n 0.0\n },\n },\n ""files"": {\n ""train_path"": args.output_dir.join(""training.jsonl"").to_string_lossy(),\n ""val_path"": args.output_dir.join(""validation.jsonl"").to_string_lossy(),\n },\n });\n std::fs::write(&metadata_path, serde_json::to_string_pretty(&metadata)?)?;\n\n println!(""\n[summary]"");\n println!("" Total 
sessions processed: {}"", result.total_sessions);\n println!("" Train conversations: {}"", result.train_conversations);\n println!("" Val conversations: {}"", result.val_conversations);\n println!("" Total messages: {}"", result.total_messages);\n println!("" Total tokens: {}"", result.total_tokens);\n println!("" Output: {:?}/{{training,validation}}.jsonl"", args.output_dir);\n println!("" Metadata: {:?}"", metadata_path);\n\n Ok(())\n}\n",rust,tab
+25,42911,"crates/cli/src/main.rs",147,0,"",rust,selection_command
+26,80276,"crates/cli/src/main.rs",2739,0,"",rust,selection_mouse
+27,93687,"crates/cli/src/main.rs",2672,0,"",rust,selection_command
+28,94304,"crates/cli/src/main.rs",2718,0,"",rust,selection_command
+29,96203,"crates/cli/src/main.rs",2653,0,"",rust,selection_command
+30,96984,"crates/cli/src/main.rs",2720,0,"",rust,selection_command
+31,97167,"crates/cli/src/main.rs",2780,0,"",rust,selection_command
+32,98335,"crates/cli/src/main.rs",2807,0,"",rust,selection_command
+33,98501,"crates/cli/src/main.rs",2813,0,"",rust,selection_command
+34,98751,"crates/cli/src/main.rs",2815,0,"",rust,selection_command
+35,98783,"crates/cli/src/main.rs",2816,0,"",rust,selection_command
+36,99384,"crates/cli/src/main.rs",2815,0,"",rust,selection_command
+37,99633,"crates/cli/src/main.rs",2813,0,"",rust,selection_command
+38,99661,"crates/cli/src/main.rs",2807,0,"",rust,selection_command
+39,99711,"crates/cli/src/main.rs",2780,0,"",rust,selection_command
+40,99726,"crates/cli/src/main.rs",2720,0,"",rust,selection_command
+41,99758,"crates/cli/src/main.rs",2653,0,"",rust,selection_command
+42,99791,"crates/cli/src/main.rs",2577,0,"",rust,selection_command
+43,99852,"crates/cli/src/main.rs",2653,0,"",rust,selection_command
+44,100100,"crates/cli/src/main.rs",2720,0,"",rust,selection_command
+45,100124,"crates/cli/src/main.rs",2780,0,"",rust,selection_command
+46,100157,"crates/cli/src/main.rs",2807,0,"",rust,selection_command
+47,100200,"crates/cli/src/main.rs",2813,0,"",rust,selection_command
+48,100225,"crates/cli/src/main.rs",2815,0,"",rust,selection_command
+49,100257,"crates/cli/src/main.rs",2816,0,"",rust,selection_command
+50,100292,"crates/cli/src/main.rs",2851,0,"",rust,selection_command
+51,100327,"crates/cli/src/main.rs",2901,0,"",rust,selection_command
+52,102641,"crates/cli/src/main.rs",2920,0,"",rust,selection_command
+53,102845,"crates/cli/src/main.rs",2953,0,"",rust,selection_command
+54,103007,"crates/cli/src/main.rs",2997,0,"",rust,selection_command
+55,103185,"crates/cli/src/main.rs",3023,0,"",rust,selection_command
+56,103436,"crates/cli/src/main.rs",3029,0,"",rust,selection_command
+57,103467,"crates/cli/src/main.rs",3030,0,"",rust,selection_command
+58,103502,"crates/cli/src/main.rs",3110,0,"",rust,selection_command
+59,103536,"crates/cli/src/main.rs",3157,0,"",rust,selection_command
+60,103570,"crates/cli/src/main.rs",3187,0,"",rust,selection_command
+61,103615,"crates/cli/src/main.rs",3233,0,"",rust,selection_command
+62,103641,"crates/cli/src/main.rs",3278,0,"",rust,selection_command
+63,103673,"crates/cli/src/main.rs",3323,0,"",rust,selection_command
+64,103707,"crates/cli/src/main.rs",3341,0,"",rust,selection_command
+65,103738,"crates/cli/src/main.rs",3415,0,"",rust,selection_command
+66,103768,"crates/cli/src/main.rs",3442,0,"",rust,selection_command
+67,103803,"crates/cli/src/main.rs",3492,0,"",rust,selection_command
+68,103835,"crates/cli/src/main.rs",3577,0,"",rust,selection_command
+69,103882,"crates/cli/src/main.rs",3591,0,"",rust,selection_command
+70,103911,"crates/cli/src/main.rs",3658,0,"",rust,selection_command
+71,103939,"crates/cli/src/main.rs",3668,0,"",rust,selection_command
+72,103970,"crates/cli/src/main.rs",3674,0,"",rust,selection_command
+73,104004,"crates/cli/src/main.rs",3676,0,"",rust,selection_command
+74,104040,"crates/cli/src/main.rs",3677,0,"",rust,selection_command
+75,104074,"crates/cli/src/main.rs",3731,0,"",rust,selection_command
+76,104104,"crates/cli/src/main.rs",3761,0,"",rust,selection_command
+77,104134,"crates/cli/src/main.rs",3762,0,"",rust,selection_command
+78,104410,"crates/cli/src/main.rs",3761,0,"",rust,selection_command
+79,104660,"crates/cli/src/main.rs",3731,0,"",rust,selection_command
+80,104691,"crates/cli/src/main.rs",3677,0,"",rust,selection_command
+81,104735,"crates/cli/src/main.rs",3676,0,"",rust,selection_command
+82,104747,"crates/cli/src/main.rs",3674,0,"",rust,selection_command
+83,104782,"crates/cli/src/main.rs",3668,0,"",rust,selection_command
+84,104813,"crates/cli/src/main.rs",3658,0,"",rust,selection_command
+85,104999,"crates/cli/src/main.rs",3591,0,"",rust,selection_command
+86,105249,"crates/cli/src/main.rs",3577,0,"",rust,selection_command
+87,105278,"crates/cli/src/main.rs",3492,0,"",rust,selection_command
+88,105310,"crates/cli/src/main.rs",3442,0,"",rust,selection_command
+89,105345,"crates/cli/src/main.rs",3415,0,"",rust,selection_command
+90,105377,"crates/cli/src/main.rs",3341,0,"",rust,selection_command
+91,105411,"crates/cli/src/main.rs",3323,0,"",rust,selection_command
+92,105447,"crates/cli/src/main.rs",3278,0,"",rust,selection_command
+93,105476,"crates/cli/src/main.rs",3233,0,"",rust,selection_command
+94,105510,"crates/cli/src/main.rs",3187,0,"",rust,selection_command
+95,105548,"crates/cli/src/main.rs",3157,0,"",rust,selection_command
+96,105591,"crates/cli/src/main.rs",3110,0,"",rust,selection_command
+97,105617,"crates/cli/src/main.rs",3030,0,"",rust,selection_command
+98,105645,"crates/cli/src/main.rs",3029,0,"",rust,selection_command
+99,105677,"crates/cli/src/main.rs",3023,0,"",rust,selection_command
+100,105713,"crates/cli/src/main.rs",2997,0,"",rust,selection_command
+101,105752,"crates/cli/src/main.rs",2953,0,"",rust,selection_command
+102,105862,"crates/cli/src/main.rs",2997,0,"",rust,selection_command
+103,106036,"crates/cli/src/main.rs",3023,0,"",rust,selection_command
+104,106177,"crates/cli/src/main.rs",3029,0,"",rust,selection_command
+105,106283,"crates/cli/src/main.rs",3030,0,"",rust,selection_command
+106,106533,"crates/cli/src/main.rs",3110,0,"",rust,selection_command
+107,106571,"crates/cli/src/main.rs",3157,0,"",rust,selection_command
+108,106598,"crates/cli/src/main.rs",3187,0,"",rust,selection_command
+109,106802,"crates/cli/src/main.rs",3233,0,"",rust,selection_command
+110,106959,"crates/cli/src/main.rs",3278,0,"",rust,selection_command
+111,148530,"crates/cli/src/main.rs",3233,0,"",rust,selection_command
+112,148667,"crates/cli/src/main.rs",3187,0,"",rust,selection_command
+113,148879,"crates/cli/src/main.rs",3157,0,"",rust,selection_command
+114,149108,"crates/cli/src/main.rs",3110,0,"",rust,selection_command
+115,150336,"crates/cli/src/main.rs",3118,0,"",rust,selection_command
+116,150485,"crates/cli/src/main.rs",3124,0,"",rust,selection_command
+117,150618,"crates/cli/src/main.rs",3128,0,"",rust,selection_command
+118,150744,"crates/cli/src/main.rs",3129,0,"",rust,selection_command
+119,150963,"crates/cli/src/main.rs",3134,0,"",rust,selection_command
+120,151154,"crates/cli/src/main.rs",3135,0,"",rust,selection_command
+121,157311,"crates/cli/src/main.rs",3141,0,"",rust,selection_command
+122,157453,"crates/cli/src/main.rs",3142,0,"",rust,selection_command
+123,157696,"crates/cli/src/main.rs",3146,0,"",rust,selection_command
+124,157901,"crates/cli/src/main.rs",3148,0,"",rust,selection_command
+125,158229,"crates/cli/src/main.rs",3146,0,"",rust,selection_command
+126,158859,"crates/cli/src/main.rs",3145,0,"",rust,selection_command
+127,159300,"crates/cli/src/main.rs",3144,0,"",rust,selection_command
+128,159470,"crates/cli/src/main.rs",3143,0,"",rust,selection_command
+129,159605,"crates/cli/src/main.rs",3142,0,"",rust,selection_command
+130,160395,"crates/cli/src/main.rs",3146,0,"",rust,selection_command
+131,161299,"crates/cli/src/main.rs",3145,0,"",rust,selection_command
+132,161550,"crates/cli/src/main.rs",3144,0,"",rust,selection_command
+133,161731,"crates/cli/src/main.rs",3143,0,"",rust,selection_command
+134,162019,"crates/cli/src/main.rs",3144,0,"",rust,selection_command
+135,162199,"crates/cli/src/main.rs",3143,0,"",rust,selection_command
+136,162343,"crates/cli/src/main.rs",3142,0,"",rust,selection_command
+137,162612,"crates/cli/src/main.rs",3143,0,"",rust,selection_command
+138,182667,"crates/cli/src/main.rs",4849,0,"",rust,selection_keyboard
+139,185528,"crates/cli/src/main.rs",2396,0,"",rust,selection_command
+140,188809,"crates/core/src/pipeline.rs",0,0,"",rust,tab
+141,192001,"crates/core/src/pipeline.rs",6268,0,"p",rust,content
+142,192001,"crates/core/src/pipeline.rs",6269,0,"",rust,selection_keyboard
+143,192278,"crates/core/src/pipeline.rs",6269,0,"a",rust,content
+144,192278,"crates/core/src/pipeline.rs",6270,0,"",rust,selection_keyboard
+145,192283,"crates/core/src/pipeline.rs",6270,0,"r",rust,content
+146,192283,"crates/core/src/pipeline.rs",6271,0,"",rust,selection_keyboard
+147,193017,"crates/core/src/pipeline.rs",6271,0,"_",rust,content
+148,193017,"crates/core/src/pipeline.rs",6272,0,"",rust,selection_keyboard
+149,193306,"crates/core/src/pipeline.rs",6271,0,"",rust,selection_command
+150,198332,"crates/core/src/pipeline.rs",6296,0,"",rust,selection_command
+151,198585,"crates/core/src/pipeline.rs",6329,0,"",rust,selection_command
+152,198618,"crates/core/src/pipeline.rs",6401,0,"",rust,selection_command
+153,198652,"crates/core/src/pipeline.rs",6461,0,"",rust,selection_command
+154,198689,"crates/core/src/pipeline.rs",6479,0,"",rust,selection_command
+155,198723,"crates/core/src/pipeline.rs",6506,0,"",rust,selection_command
+156,198851,"crates/core/src/pipeline.rs",6545,0,"",rust,selection_command
+157,199310,"crates/core/src/pipeline.rs",6548,0,"",rust,selection_command
+158,201201,"crates/core/src/pipeline.rs",6614,0,"",rust,selection_command
+159,201339,"crates/core/src/pipeline.rs",6700,0,"",rust,selection_command
+160,201601,"crates/core/src/pipeline.rs",6614,0,"",rust,selection_command
+161,201843,"crates/core/src/pipeline.rs",6618,0,"",rust,selection_command
+162,214950,"crates/core/src/pipeline.rs",6552,0,"",rust,selection_command
+163,215683,"crates/core/src/pipeline.rs",6528,65," if count % 100 == 0 || count == total_files {",rust,selection_command
+164,215909,"crates/core/src/pipeline.rs",6528,151," if count % 100 == 0 || count == total_files {\n eprintln!(""Processed {}/{} sessions..."", count, total_files);",rust,selection_command
+165,216062,"crates/core/src/pipeline.rs",6528,173," if count % 100 == 0 || count == total_files {\n eprintln!(""Processed {}/{} sessions..."", count, total_files);\n }",rust,selection_command
+166,230009,"crates/core/src/pipeline.rs",6700,0,"",rust,selection_command
+167,235826,"crates/cli/src/main.rs",0,0,"",rust,tab
+168,240350,"crates/cli/src/main.rs",2463,0,"",rust,selection_command
+169,240560,"crates/cli/src/main.rs",2486,0,"",rust,selection_command
+170,240808,"crates/cli/src/main.rs",2489,0,"",rust,selection_command
+171,240840,"crates/cli/src/main.rs",2491,0,"",rust,selection_command
+172,240876,"crates/cli/src/main.rs",2511,0,"",rust,selection_command
+173,240916,"crates/cli/src/main.rs",2534,0,"",rust,selection_command
+174,240949,"crates/cli/src/main.rs",2598,0,"",rust,selection_command
+175,240975,"crates/cli/src/main.rs",2674,0,"",rust,selection_command
+176,241000,"crates/cli/src/main.rs",2741,0,"",rust,selection_command
+177,241183,"crates/cli/src/main.rs",2801,0,"",rust,selection_command
+178,241374,"crates/cli/src/main.rs",2811,0,"",rust,selection_command
+179,241673,"crates/cli/src/main.rs",2813,0,"",rust,selection_command
+180,241808,"crates/cli/src/main.rs",2815,0,"",rust,selection_command
+181,242006,"crates/cli/src/main.rs",2837,0,"",rust,selection_command
+182,293443,"crates/cli/src/main.rs",2872,0,"",rust,selection_command
+183,293608,"crates/cli/src/main.rs",2876,0,"",rust,selection_command
+184,293950,"crates/cli/src/main.rs",2878,0,"",rust,selection_command
+185,294434,"crates/cli/src/main.rs",2940,0,"",rust,selection_command
+186,295111,"crates/cli/src/main.rs",3067,0,"",rust,selection_command
+187,296113,"crates/cli/src/main.rs",2940,0,"",rust,selection_command
+188,327397,"crates/cli/src/main.rs",2973,0,"",rust,selection_command
+189,327930,"crates/cli/src/main.rs",2965,0,"",rust,selection_command
+190,363186,"crates/cli/src/main.rs",2966,0,"",rust,selection_command
+191,364275,"crates/cli/src/main.rs",2969,0,"",rust,selection_command
+192,364631,"crates/cli/src/main.rs",2971,0,"",rust,selection_command
+193,368307,"crates/cli/src/main.rs",2965,0,"",rust,selection_command
+194,370004,"crates/cli/src/main.rs",2966,0,"",rust,selection_command
+195,370244,"crates/cli/src/main.rs",2969,0,"",rust,selection_command
+196,370282,"crates/cli/src/main.rs",2971,0,"",rust,selection_command
+197,370315,"crates/cli/src/main.rs",2974,0,"",rust,selection_command
+198,370349,"crates/cli/src/main.rs",2976,0,"",rust,selection_command
+199,370382,"crates/cli/src/main.rs",2979,0,"",rust,selection_command
+200,370582,"crates/cli/src/main.rs",2980,0,"",rust,selection_command
+201,373588,"crates/cli/src/main.rs",2965,0,"",rust,selection_command
+202,380985,"crates/cli/src/main.rs",3009,0,"",rust,selection_command
+203,381132,"crates/cli/src/main.rs",3027,0,"",rust,selection_command
+204,381374,"crates/cli/src/main.rs",3029,0,"",rust,selection_command
+205,381507,"crates/cli/src/main.rs",3042,0,"",rust,selection_command
+206,381710,"crates/cli/src/main.rs",3122,0,"",rust,selection_command
+207,403472,"crates/core/src/pipeline.rs",0,0,"",rust,tab
+208,405453,"crates/cli/src/main.rs",0,0,"",rust,tab
+209,406841,"TERMINAL",0,0,"3dde1b0c-c963-467e-aa73-fb6c54df3ae41751963426964-2025_07_08-10.30.57.271/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""Warning: terminal_command event missing Text in 
""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-3dde1b0c-c963-467e-aa73-fb6c54df3ae41751963426964-2025_07_08-10.30.57.271/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""Warning: terminal_command event missing Text in 
""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""Processed 349/349 sessions...Processed 349 sessionsWriting output to ""test_output_dir""...[summary] Total sessions processed: 349 Train conversations: 4036 Val conversations: 403 Total messages: 129517 Total tokens: 33411160 Output: ""test_output_dir""/{training,validation}.jsonl Metadata: ""test_output_dir/metadata.json""real 1m28.496suser 73m28.352ssys 
1m56.252s[franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ [franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ cargo build --release -p crowd-pilot-serialize Compiling crowd-pilot-serializer-core v0.1.0 (/fast/home/franz.srambical/crowd-pilot-serializer/crates/core)warning: unused import: `rayon::prelude` --> crates/core/src/pipeline.rs:6:5 |6 | use rayon::prelude::*; | ^^^^^^^^^^^^^^ | = note: `#[warn(unused_imports)]` (part of `#[warn(unused)]`) on by defaultwarning: `crowd-pilot-serializer-core` (lib) generated 1 warning Compiling crowd-pilot-serialize v0.1.0 (/fast/home/franz.srambical/crowd-pilot-serializer/crates/cli) Finished `release` profile [optimized] target(s) in 5.92s[franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ time bash /home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/crowd_pilot_serializer/serialize.shLoading tokenizer from Qwen/Qwen3-8B...Processing CSV files from ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/""...Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""Warning: terminal_command event missing Text in 
""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""Warning: terminal_command event missing Text in 
""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-3dde1b0c-c963-467e-aa73-fb6c54df3ae41751963426964-2025_07_08-10.30.57.271/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-3dde1b0c-c963-467e-aa73-fb6c54df3ae41751963426964-2025_07_08-10.30.57.271/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-4457e5d2-f5e8-4b15-95aa-bafa247369991751528947759-2025_07_03-09.50.10.663/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""Warning: terminal_command event missing Text in 
""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""[further repeated ""Warning: terminal_command event missing Text in <source.csv>"" lines omitted]Processed 100/349 sessions...^Creal 2m26.399suser 2m21.383ssys 0m4.368s[franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ time bash /home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/crowd_pilot_serializer/serialize.shLoading tokenizer from Qwen/Qwen3-8B...Processing CSV files from ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/""...[repeated ""Warning: terminal_command event missing Text in <source.csv>"" lines omitted]Processed 100/349 sessions...Processed 200/349 sessions...Processed 300/349 sessions...Processed 349/349 sessions...Processed 349 sessionsWriting output to ""test_output_dir""...[summary] Total sessions processed: 349 Train conversations: 4036 Val conversations: 403 Total messages: 129517 Total tokens: 33411160 Output: ""test_output_dir""/{training,validation}.jsonl Metadata: ""test_output_dir/metadata.json""real 5m12.107suser 4m53.516ssys 0m8.006s[franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ exitexitsalloc: Relinquishing job allocation 36723",,terminal_command
+210,409413,"TERMINAL",0,0,"salloc --gpus=1 --ntasks-per-node=1 --cpus-per-task=100 --mem=400G --qos=normal",,terminal_command
+211,409463,"TERMINAL",0,0,"]633;Csalloc: Granted job allocation 36733\r\n",,terminal_output
+212,409560,"TERMINAL",0,0,"salloc: Nodes hai008 are ready for job\r\n",,terminal_output
+213,409821,"TERMINAL",0,0,"Running inside SLURM, Job ID 36733.\r\n",,terminal_output
+214,410499,"TERMINAL",0,0,"]0;franz.srambical@hai-login2:~/miles[?2004h[franz.srambical@hai008.haicore.berlin:~/miles] $ ",,terminal_output
+215,411112,"TERMINAL",0,0,"\r(reverse-i-search)`': [K",,terminal_output
+216,411456,"TERMINAL",0,0,"c': time bash /home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/[7mc[27mrowd_pilot_serializer/serialize.sh",,terminal_output
+217,411724,"TERMINAL",0,0,"\r[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[Ca': time bash /home/franz.srambi[7mca[27ml/slurm/dev/franz/berlin/crowd-pilot/crowd_pilot_serializer/serialize.sh\r[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C\r[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[55Pr': [7mcar[27mgo build --release -p crowd-pilot-serialize\r[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C",,terminal_output
+218,411846,"TERMINAL",0,0,"[1@g': [7mcarg[27m",,terminal_output
+219,411920,"TERMINAL",0,0,"[1@o': [7mcargo[27m",,terminal_output
+220,412868,"TERMINAL",0,0,"\r[23@[franz.srambical@hai008.haicore.berlin:~/miles] $ cargo\r\n[?2004l\r",,terminal_output
+221,412959,"TERMINAL",0,0,"[1m[91merror[0m: could not find `Cargo.toml` in `/fast/home/franz.srambical/miles` or any parent directory\r\n]0;franz.srambical@hai-login2:~/miles[?2004h[franz.srambical@hai008.haicore.berlin:~/miles] $ ",,terminal_output
+222,415267,"TERMINAL",0,0,"c",,terminal_output
+223,415524,"TERMINAL",0,0,"d",,terminal_output
+224,415616,"TERMINAL",0,0," ",,terminal_output
+225,415806,"TERMINAL",0,0,".",,terminal_output
+226,415939,"TERMINAL",0,0,".",,terminal_output
+227,416117,"TERMINAL",0,0,"/",,terminal_output
+228,417793,"TERMINAL",0,0,"c",,terminal_output
+229,417963,"TERMINAL",0,0,"r",,terminal_output
+230,418096,"TERMINAL",0,0,"o",,terminal_output
+231,418273,"TERMINAL",0,0,"[K",,terminal_output
+232,419314,"TERMINAL",0,0,"c",,terminal_output
+233,419548,"TERMINAL",0,0,"r",,terminal_output
+234,419877,"TERMINAL",0,0,"[K",,terminal_output
+235,420023,"TERMINAL",0,0,"[K",,terminal_output
+236,420152,"TERMINAL",0,0,".",,terminal_output
+237,420294,"TERMINAL",0,0,".",,terminal_output
+238,420545,"TERMINAL",0,0,"/",,terminal_output
+239,420629,"TERMINAL",0,0,"c",,terminal_output
+240,420804,"TERMINAL",0,0,"r",,terminal_output
+241,420958,"TERMINAL",0,0,"w",,terminal_output
+242,421040,"TERMINAL",0,0,"o",,terminal_output
+243,421260,"TERMINAL",0,0,"d",,terminal_output
+244,421357,"TERMINAL",0,0,"",,terminal_output
+245,421771,"TERMINAL",0,0,"[K",,terminal_output
+246,421918,"TERMINAL",0,0,"[K",,terminal_output
+247,422076,"TERMINAL",0,0,"[K",,terminal_output
+248,422222,"TERMINAL",0,0,"o",,terminal_output
+249,422322,"TERMINAL",0,0,"w",,terminal_output
+250,422422,"TERMINAL",0,0,"d",,terminal_output
+251,422484,"TERMINAL",0,0,"-",,terminal_output
+252,423028,"TERMINAL",0,0,"p",,terminal_output
+253,423162,"TERMINAL",0,0,"ilot",,terminal_output
+254,423544,"TERMINAL",0,0,"-",,terminal_output
+255,423812,"TERMINAL",0,0,"e",,terminal_output
+256,424025,"TERMINAL",0,0,"xtension/",,terminal_output
+257,424421,"TERMINAL",0,0,"[K",,terminal_output
+258,424746,"TERMINAL",0,0,"[K",,terminal_output
+259,425197,"TERMINAL",0,0,"[K[K[K[K[K[K[K",,terminal_output
+260,425314,"TERMINAL",0,0,"[K",,terminal_output
+261,425667,"TERMINAL",0,0,"ser",,terminal_output
+262,425831,"TERMINAL",0,0,"ializer/",,terminal_output
+263,426046,"TERMINAL",0,0,"\r\n[?2004l\r]0;franz.srambical@hai-login2:~/crowd-pilot-serializer[?2004h[franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ ",,terminal_output
+264,427497,"TERMINAL",0,0,"\r(reverse-i-search)`': [K",,terminal_output
+265,427650,"TERMINAL",0,0,"c': cd ../[7mc[27mrowd-pilot-serializer/",,terminal_output
+266,427856,"TERMINAL",0,0,"a': [7mca[27mrgo build --release -p crowd-pilot-serialize\r[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[1@r': [7mcar[27m",,terminal_output
+267,428005,"TERMINAL",0,0,"[1@g': [7mcarg[27m",,terminal_output
+268,428152,"TERMINAL",0,0,"[1@o': [7mcargo[27m",,terminal_output
+269,428507,"TERMINAL",0,0,"\r[40@[franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ cargo[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C",,terminal_output
+270,428729,"TERMINAL",0,0,"\r\n[?2004l\r",,terminal_output
+271,430113,"TERMINAL",0,0,"[1m[92m Compiling[0m crowd-pilot-serializer-core v0.1.0 (/fast/home/franz.srambical/crowd-pilot-serializer/crates/core)\r\n[1m[96m Building[0m [=======================> ] 195/197: crowd-pilot-serializer-core \r",,terminal_output
+272,431012,"TERMINAL",0,0,"[K[1m[92m Compiling[0m crowd-pilot-serialize v0.1.0 (/fast/home/franz.srambical/crowd-pilot-serializer/crates/cli)\r\n[1m[96m Building[0m [=======================> ] 196/197: crowd-pilot-serialize(bin) \r",,terminal_output
+273,434679,"TERMINAL",0,0,"[K[1m[92m Finished[0m ]8;;https://doc.rust-lang.org/cargo/reference/profiles.html#default-profiles\`release` profile [optimized]]8;;\ target(s) in 5.86s\r\n]0;franz.srambical@hai-login2:~/crowd-pilot-serializer[?2004h[franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ ",,terminal_output
+274,436357,"/home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/crowd_pilot_serializer/serialize.sh",0,0,"source /home/franz.srambical/crowd-pilot-serializer/.venv/bin/activate\nexport LD_LIBRARY_PATH=$(python -c ""import sysconfig; print(sysconfig.get_config_var('LIBDIR'))""):$LD_LIBRARY_PATH\n./target/release/crowd-pilot-serialize --csv-root=""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/"" --output-dir=test_output_dir --tokenizer=""Qwen/Qwen3-8B""",shellscript,tab
+275,438009,"crates/cli/src/main.rs",0,0,"",rust,tab
+276,441955,"/home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/crowd_pilot_serializer/serialize.sh",0,0,"",shellscript,tab
+277,442592,"/home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/crowd_pilot_serializer/serialize.sh",94,0,"",shellscript,selection_command
+278,442777,"/home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/crowd_pilot_serializer/serialize.sh",23,0,"",shellscript,selection_command
+279,443366,"/home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/crowd_pilot_serializer/serialize.sh",0,71,"",shellscript,content
+280,445381,"TERMINAL",0,0,"r",,terminal_output
+281,445461,"TERMINAL",0,0,"m",,terminal_output
+282,445670,"TERMINAL",0,0," ",,terminal_output
+283,445894,"TERMINAL",0,0,"-",,terminal_output
+284,446094,"TERMINAL",0,0,"r",,terminal_output
+285,446372,"TERMINAL",0,0," ",,terminal_output
+286,446552,"TERMINAL",0,0,".v",,terminal_output
+287,446626,"TERMINAL",0,0,"e",,terminal_output
+288,446917,"TERMINAL",0,0,"nv/",,terminal_output
+289,448027,"TERMINAL",0,0,"\r\n[?2004l\r",,terminal_output
+290,459484,"/home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/crowd_pilot_serializer/serialize.sh",0,115,"",shellscript,content
+291,463100,"TERMINAL",0,0,"]0;franz.srambical@hai-login2:~/crowd-pilot-serializer[?2004h[franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ ",,terminal_output
+292,465115,"TERMINAL",0,0,"b",,terminal_output
+293,465388,"TERMINAL",0,0,"as",,terminal_output
+294,465540,"TERMINAL",0,0,"h",,terminal_output
+295,465749,"TERMINAL",0,0," ",,terminal_output
+296,465997,"TERMINAL",0,0,"[7m/home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/crowd_pilot_serializer/serialize.sh[27m",,terminal_output
+297,466342,"TERMINAL",0,0,"\r[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C/home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/crowd_pilot_serializer/serialize.sh\r[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C",,terminal_output
+298,466573,"TERMINAL",0,0,"[1@ ",,terminal_output
+299,466935,"TERMINAL",0,0,"[1P",,terminal_output
+300,467104,"TERMINAL",0,0,"[1@t",,terminal_output
+301,467207,"TERMINAL",0,0,"[1@i",,terminal_output
+302,467278,"TERMINAL",0,0,"[1@m",,terminal_output
+303,467351,"TERMINAL",0,0,"[1@e",,terminal_output
+304,467494,"TERMINAL",0,0,"[1@ ",,terminal_output
+305,467999,"TERMINAL",0,0,"[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C",,terminal_output
+306,468229,"TERMINAL",0,0,"\r\n[?2004l\rLoading tokenizer from Qwen/Qwen3-8B...\r\n",,terminal_output
+307,468507,"TERMINAL",0,0,"Processing CSV files from ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/""...\r\n",,terminal_output
+308,468739,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-0f5513f7-8bc9-4c5d-856d-79d92f75113d1751284706913-2025_06_30-13.59.01.459/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-d80c259c-238d-4ab7-8a8a-2d0fc8e345961753345086680-2025_07_24-10.19.00.46/source.csv""\r\n",,terminal_output
+309,468868,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-4457e5d2-f5e8-4b15-95aa-bafa247369991751528947759-2025_07_03-09.50.10.663/source.csv""\r\n",,terminal_output
+310,469311,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+311,469626,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-2f5b13c7-61f7-4340-b581-9edac6a53f1f1753015255059-2025_07_20-14.41.09.478/source.csv""\r\n",,terminal_output
+312,469941,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-29e2cbae-7056-4585-b457-f48bd451c3fd1750644341589-2025_06_22-19.05.43.270/source.csv""\r\n",,terminal_output
+313,470531,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-8e0958c9-e396-41d9-b3d4-8a748cefa1701750701699946-2025_06_23-11.01.41.744/source.csv""\r\n",,terminal_output
+314,470902,"TERMINAL",0,0,"Processed 100/349 sessions...\r\n",,terminal_output
+315,471453,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+316,471745,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+317,471871,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+318,472296,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-81dc70dc-8e01-48a6-9a00-9349b9f9a4171751541780271-2025_07_03-13.23.33.804/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+319,473704,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-bec33357-731a-4f4a-afb4-3d538f18fd451751530719479-2025_07_03-10.19.48.863/source.csv""\r\n",,terminal_output
+320,473933,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/e8b08c312d88206805b92191af1ee2a660f8f0e59d3990233d6a3f81cdab43f4/crowd-code-b329b9f2-ee04-44c3-8e37-55401c64da7f1750160908319-2025_06_17-13.49.11.314/source.csv""\r\n",,terminal_output
+321,474070,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/e8b08c312d88206805b92191af1ee2a660f8f0e59d3990233d6a3f81cdab43f4/crowd-code-9c6bfb6d-25d7-401c-a2eb-18ae18179f4b1750244015795-2025_06_18-12.58.06.505/source.csv""\r\n",,terminal_output
+322,474265,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-0d5e2cbe-83a2-48a0-b2d9-a8e41912bfb61753114173405-2025_07_21-18.09.45.514/source.csv""\r\n",,terminal_output
+323,474626,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+324,475338,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-81dc70dc-8e01-48a6-9a00-9349b9f9a4171751541780271-2025_07_03-13.23.33.804/source.csv""\r\n",,terminal_output
+325,475653,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+326,475907,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-fe3e4a6d-15dd-460a-96b3-1f7a60db202a1753178306285-2025_07_22-12.23.53.192/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-fe3e4a6d-15dd-460a-96b3-1f7a60db202a1753178306285-2025_07_22-12.23.53.192/source.csv""\r\n",,terminal_output
+327,476473,"crates/cli/src/main.rs",0,0,"",rust,tab
+328,477026,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-bec33357-731a-4f4a-afb4-3d538f18fd451751530719479-2025_07_03-10.19.48.863/source.csv""\r\n",,terminal_output
+329,477679,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-3e3ce02e-664a-4f58-9d7f-0f56e32c7def1753363875204-2025_07_24-15.31.23.202/source.csv""\r\n",,terminal_output
+330,479824,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+331,479907,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+332,480495,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-2d2437e1-caa5-4315-a7d9-4d9478073a161750944609503-2025_06_26-15.30.55.51/source.csv""\r\n",,terminal_output
+333,480602,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+334,480678,"TERMINAL",0,0,"Processed 200/349 sessions...\r\n",,terminal_output
+335,481052,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+336,482111,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+337,482347,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+338,483785,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-92ab1593-f937-4cc4-a174-544581a6ac991751909174142-2025_07_07-19.26.40.736/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-92ab1593-f937-4cc4-a174-544581a6ac991751909174142-2025_07_07-19.26.40.736/source.csv""\r\n",,terminal_output
+339,485115,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+340,485287,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+341,486409,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+342,486485,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+343,487096,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+344,488695,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+345,489055,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+346,490484,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-6791460b-ec38-4da2-872f-193943c12d601753274780799-2025_07_23-14.47.19.396/source.csv""\r\n",,terminal_output
+347,490936,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-fbd09e27-2302-4b0c-83a4-a77b7bc2e3dc1751440721102-2025_07_02-09.19.16.832/source.csv""\r\n",,terminal_output
+348,494445,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-cdede756-87c5-47af-85f8-bc9bf1c41bac1750785125700-2025_06_24-22.05.07.988/source.csv""\r\n",,terminal_output
+349,496078,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-46a4cc9d-ac37-44ff-ae8d-547db76d96f31752072213286-2025_07_09-16.43.57.848/source.csv""\r\n",,terminal_output
+350,499097,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-cdede756-87c5-47af-85f8-bc9bf1c41bac1750785125700-2025_06_24-22.05.07.988/source.csv""\r\n",,terminal_output
+351,500104,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-cdede756-87c5-47af-85f8-bc9bf1c41bac1750785125700-2025_06_24-22.05.07.988/source.csv""\r\n",,terminal_output
+352,500619,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-cdede756-87c5-47af-85f8-bc9bf1c41bac1750785125700-2025_06_24-22.05.07.988/source.csv""\r\n",,terminal_output
+353,500934,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-cdede756-87c5-47af-85f8-bc9bf1c41bac1750785125700-2025_06_24-22.05.07.988/source.csv""\r\n",,terminal_output
+354,504096,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-9d03fc6f-7387-447c-9d61-dfb41a6cd5d41752168236691-2025_07_10-19.24.24.908/source.csv""\r\n",,terminal_output
+355,504701,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-3dde1b0c-c963-467e-aa73-fb6c54df3ae41751963426964-2025_07_08-10.30.57.271/source.csv""\r\n",,terminal_output
+356,505285,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-3dde1b0c-c963-467e-aa73-fb6c54df3ae41751963426964-2025_07_08-10.30.57.271/source.csv""\r\n",,terminal_output
+357,507700,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-ebfebb4b-ef7a-46a0-8725-dd76b91fd2891752048248358-2025_07_09-10.04.56.945/source.csv""\r\n",,terminal_output
+358,509162,"crates/cli/src/main.rs",3124,0,"",rust,selection_command
+359,509291,"crates/cli/src/main.rs",3128,0,"",rust,selection_command
+360,509480,"crates/cli/src/main.rs",3129,0,"",rust,selection_command
+361,509743,"crates/cli/src/main.rs",3134,0,"",rust,selection_command
+362,509876,"crates/cli/src/main.rs",3135,0,"",rust,selection_command
+363,511978,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-7879e034-f897-48e8-8481-1a87a73b0dc81752135543307-2025_07_10-10.19.09.565/source.csv""\r\n",,terminal_output
+364,513564,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-7879e034-f897-48e8-8481-1a87a73b0dc81752135543307-2025_07_10-10.19.09.565/source.csv""\r\n",,terminal_output
+365,513712,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\n",,terminal_output
+366,513937,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-8e67a739-7b65-4646-afc1-42e9766880571751607756007-2025_07_04-07.43.31.602/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-8e67a739-7b65-4646-afc1-42e9766880571751607756007-2025_07_04-07.43.31.602/source.csv""\r\n",,terminal_output
+367,514862,"crates/cli/src/main.rs",3182,0,"",rust,selection_command
+368,515313,"crates/cli/src/main.rs",3180,0,"",rust,selection_command
+369,515433,"crates/cli/src/main.rs",3172,0,"",rust,selection_command
+370,516751,"TERMINAL",0,0,"Processed 300/349 sessions...\r\n",,terminal_output
+371,517213,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\n",,terminal_output
+372,528917,"crates/cli/src/main.rs",3202,0,"",rust,selection_command
+373,529070,"crates/cli/src/main.rs",3248,0,"",rust,selection_command
+374,529734,"crates/cli/src/main.rs",3249,0,"",rust,selection_command
+375,534916,"crates/cli/src/main.rs",3294,0,"",rust,selection_command
+376,535160,"crates/cli/src/main.rs",3298,0,"",rust,selection_command
+377,535319,"crates/cli/src/main.rs",3305,0,"",rust,selection_command
+378,535484,"crates/cli/src/main.rs",3309,0,"",rust,selection_command
+379,535634,"crates/cli/src/main.rs",3310,0,"",rust,selection_command
+380,537413,"TERMINAL",0,0,"Processed 349/349 sessions...\r\nProcessed 349 sessions\r\nWriting output to ""test_output_dir""...\r\n",,terminal_output
+381,538034,"crates/cli/src/main.rs",3309,0,"",rust,selection_command
+382,538175,"crates/cli/src/main.rs",3305,0,"",rust,selection_command
+383,540620,"crates/cli/src/main.rs",3309,0,"",rust,selection_command
+384,540964,"crates/cli/src/main.rs",3278,44," return text.to_string();",rust,selection_command
+385,547676,"TERMINAL",0,0,"\r\n[summary]\r\n Total sessions processed: 349\r\n Train conversations: 4036\r\n Val conversations: 403\r\n Total messages: 129517\r\n Total tokens: 33411160\r\n Output: ""test_output_dir""/{training,validation}.jsonl\r\n Metadata: ""test_output_dir/metadata.json""\r\n",,terminal_output
+386,548429,"TERMINAL",0,0,"\r\nreal\t1m20.227s\r\nuser\t62m40.945s\r\nsys\t1m53.099s\r\n]0;franz.srambical@hai-login2:~/crowd-pilot-serializer[?2004h[franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ ",,terminal_output
+387,550310,"crates/cli/src/main.rs",3309,0,"",rust,selection_command
+388,580626,"crates/cli/src/main.rs",3339,0,"",rust,selection_command
+389,580773,"crates/cli/src/main.rs",3372,0,"",rust,selection_command
+390,581612,"crates/cli/src/main.rs",3361,0,"",rust,selection_command
+391,603268,"crates/cli/src/main.rs",3374,0,"",rust,selection_command
+392,603537,"crates/cli/src/main.rs",3376,0,"",rust,selection_command
+393,603560,"crates/cli/src/main.rs",3379,0,"",rust,selection_command
+394,603598,"crates/cli/src/main.rs",3380,0,"",rust,selection_command
+395,603625,"crates/cli/src/main.rs",3383,0,"",rust,selection_command
+396,603928,"crates/cli/src/main.rs",3385,0,"",rust,selection_command
+397,604237,"crates/cli/src/main.rs",3387,0,"",rust,selection_command
+398,607810,"crates/cli/src/main.rs",3390,0,"",rust,selection_command
+399,608007,"crates/cli/src/main.rs",3393,0,"",rust,selection_command
+400,608237,"crates/cli/src/main.rs",3403,0,"",rust,selection_command
+401,608645,"crates/cli/src/main.rs",3393,0,"",rust,selection_command
+402,610039,"crates/cli/src/main.rs",3403,0,"",rust,selection_command
+403,610240,"crates/cli/src/main.rs",3405,0,"",rust,selection_command
+404,610434,"crates/cli/src/main.rs",3411,0,"",rust,selection_command
+405,610767,"crates/cli/src/main.rs",3405,0,"",rust,selection_command
+406,614484,"crates/cli/src/main.rs",3403,0,"",rust,selection_command
+407,614625,"crates/cli/src/main.rs",3393,0,"",rust,selection_command
+408,614886,"crates/cli/src/main.rs",3390,0,"",rust,selection_command
+409,614916,"crates/cli/src/main.rs",3387,0,"",rust,selection_command
+410,615301,"crates/cli/src/main.rs",3390,0,"",rust,selection_command
+411,615722,"crates/cli/src/main.rs",3387,3,"ids",rust,selection_command
+412,616303,"crates/cli/src/main.rs",3389,0,"",rust,selection_command
+413,616433,"crates/cli/src/main.rs",3387,0,"",rust,selection_command
+414,620391,"crates/cli/src/main.rs",3357,0,"",rust,selection_command
+415,626998,"crates/cli/src/main.rs",3361,0,"",rust,selection_command
+416,683056,"crates/cli/src/main.rs",3435,0,"",rust,selection_command
+417,683478,"crates/cli/src/main.rs",3462,0,"",rust,selection_command
+418,684225,"crates/cli/src/main.rs",3463,0,"",rust,selection_command
+419,685999,"crates/cli/src/main.rs",3513,0,"",rust,selection_command
+420,686933,"crates/cli/src/main.rs",3527,0,"",rust,selection_command
+421,687174,"crates/cli/src/main.rs",3529,0,"",rust,selection_command
+422,716496,"crates/cli/src/main.rs",3492,84," .unwrap_or_else(|_| text.chars().take(max_tokens * 4).collect())",rust,selection_command
+423,755117,"crates/cli/src/main.rs",3529,0,"",rust,selection_command
+424,759499,"crates/cli/src/main.rs",3575,0,"",rust,selection_command
+425,759876,"crates/cli/src/main.rs",3573,0,"",rust,selection_command
+426,760005,"crates/cli/src/main.rs",3566,0,"",rust,selection_command
+427,762888,"crates/cli/src/main.rs",3564,0,"",rust,selection_command
+428,763140,"crates/cli/src/main.rs",3563,0,"",rust,selection_command
+429,763164,"crates/cli/src/main.rs",3561,0,"",rust,selection_command
+430,763209,"crates/cli/src/main.rs",3550,0,"",rust,selection_command
+431,763240,"crates/cli/src/main.rs",3549,0,"",rust,selection_command
+432,763307,"crates/cli/src/main.rs",3545,0,"",rust,selection_command
+433,763308,"crates/cli/src/main.rs",3542,0,"",rust,selection_command
+434,763330,"crates/cli/src/main.rs",3537,0,"",rust,selection_command
+435,763369,"crates/cli/src/main.rs",3536,0,"",rust,selection_command
+436,763410,"crates/cli/src/main.rs",3532,0,"",rust,selection_command
+437,763431,"crates/cli/src/main.rs",3530,0,"",rust,selection_command
+438,763624,"crates/cli/src/main.rs",3529,0,"",rust,selection_command
+439,763874,"crates/cli/src/main.rs",3527,0,"",rust,selection_command
+440,763905,"crates/cli/src/main.rs",3513,0,"",rust,selection_command
+441,764024,"crates/cli/src/main.rs",3512,0,"",rust,selection_command
+442,764372,"crates/cli/src/main.rs",3513,0,"",rust,selection_command
+443,764548,"crates/cli/src/main.rs",3527,0,"",rust,selection_command
+444,767713,"crates/cli/src/main.rs",3110,557," let encoding = self.inner\n .encode(text, false)\n .expect(""Failed to encode text with tokenizer"");\n \n let ids = encoding.get_ids();\n if ids.len() <= max_tokens {\n return text.to_string();\n }\n \n let truncated_ids: Vec = ids[..max_tokens].to_vec();\n self.inner\n .decode(&truncated_ids, true)\n .expect(""Failed to decode truncated tokens"")",rust,content
+445,773033,"crates/cli/src/main.rs",3503,0,"",rust,selection_command
+446,773249,"crates/cli/src/main.rs",3461,0,"",rust,selection_command
+447,773402,"crates/cli/src/main.rs",3434,0,"",rust,selection_command
+448,773703,"crates/cli/src/main.rs",3376,0,"",rust,selection_command
+449,773864,"crates/cli/src/main.rs",3367,0,"",rust,selection_command
+450,774119,"crates/cli/src/main.rs",3357,0,"",rust,selection_command
+451,774305,"crates/cli/src/main.rs",3320,0,"",rust,selection_command
+452,774557,"crates/cli/src/main.rs",3283,0,"",rust,selection_command
+453,774588,"crates/cli/src/main.rs",3245,0,"",rust,selection_command
+454,774758,"crates/cli/src/main.rs",3233,0,"",rust,selection_command
+455,774947,"crates/cli/src/main.rs",3175,0,"",rust,selection_command
+456,775266,"crates/cli/src/main.rs",3142,0,"",rust,selection_command
+457,775724,"crates/cli/src/main.rs",3175,0,"",rust,selection_command
+458,776097,"crates/cli/src/main.rs",3233,0,"",rust,selection_command
+459,777156,"crates/cli/src/main.rs",3177,0,"",rust,selection_command
+460,779998,"crates/cli/src/main.rs",3189,0,"",rust,selection_command
+461,781808,"crates/cli/src/main.rs",3110,451," match self.inner.encode(text, false) {\n Ok(encoding) => {\n let ids = encoding.get_ids();\n if ids.len() <= max_tokens {\n return text.to_string();\n }\n let truncated_ids: Vec = ids[..max_tokens].to_vec();\n self.inner\n .decode(&truncated_ids, true)\n .unwrap_or_else(|_| text.chars().take(max_tokens * 4).collect())\n }\n Err(_) => text.chars().take(max_tokens * 4).collect(),\n }",rust,content
+462,781859,"crates/cli/src/main.rs",3110,557," let encoding = self.inner\n .encode(text, false)\n .expect(""Failed to encode text with tokenizer"");\n \n let ids = encoding.get_ids();\n if ids.len() <= max_tokens {\n return text.to_string();\n }\n \n let truncated_ids: Vec = ids[..max_tokens].to_vec();\n self.inner\n .decode(&truncated_ids, true)\n .expect(""Failed to decode truncated tokens"")",rust,content
+463,781859,"crates/cli/src/main.rs",2953,69," .expect(""Failed to encode text with tokenizer"")\n .get_ids()\n .len()",rust,content
+464,793333,"crates/cli/src/main.rs",0,0,"",rust,tab
+465,794085,"crates/cli/src/main.rs",0,0,"",rust,selection_command
+466,795518,"crates/cli/src/main.rs",1439,0,"",rust,selection_keyboard
+467,795797,"crates/cli/src/main.rs",3036,0,"",rust,selection_keyboard
+468,806741,"crates/cli/src/main.rs",3013,0,"",rust,selection_command
+469,806865,"crates/cli/src/main.rs",2953,0,"",rust,selection_command
+470,810420,"crates/cli/src/main.rs",3142,0,"",rust,selection_command
+471,816160,"crates/cli/src/main.rs",3176,0,"",rust,selection_command
+472,816326,"crates/cli/src/main.rs",3209,0,"",rust,selection_command
+473,816469,"crates/cli/src/main.rs",3270,0,"",rust,selection_command
+474,816715,"crates/cli/src/main.rs",3279,0,"",rust,selection_command
+475,819091,"crates/cli/src/main.rs",3317,0,"",rust,selection_command
+476,819319,"crates/cli/src/main.rs",3354,0,"",rust,selection_command
+477,819351,"crates/cli/src/main.rs",3391,0,"",rust,selection_command
+478,819385,"crates/cli/src/main.rs",3401,0,"",rust,selection_command
+479,819417,"crates/cli/src/main.rs",3410,0,"",rust,selection_command
+480,819948,"crates/cli/src/main.rs",3401,0,"",rust,selection_command
+481,820114,"crates/cli/src/main.rs",3391,0,"",rust,selection_command
+482,820222,"crates/cli/src/main.rs",3401,0,"",rust,selection_command
+483,820401,"crates/cli/src/main.rs",3410,0,"",rust,selection_command
+484,822970,"crates/cli/src/main.rs",3476,0,"",rust,selection_command
+485,823138,"crates/cli/src/main.rs",3495,0,"",rust,selection_command
+486,824839,"crates/cli/src/main.rs",3476,0,"",rust,selection_command
+487,825009,"crates/cli/src/main.rs",3484,0,"",rust,selection_command
+488,825227,"crates/cli/src/main.rs",3488,0,"",rust,selection_command
+489,827418,"crates/cli/src/main.rs",3507,0,"",rust,selection_command
+490,827569,"crates/cli/src/main.rs",3549,0,"",rust,selection_command
+491,828153,"crates/cli/src/main.rs",3537,0,"",rust,selection_command
+492,833206,"TERMINAL",0,0,"time bash /home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/crowd_pilot_serializer/serialize.sh",,terminal_output
+493,833640,"TERMINAL",0,0,"\r[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[Crm -r .venv/[K",,terminal_output
+494,834281,"TERMINAL",0,0,"cargo build --release -p crowd-pilot-serialize",,terminal_output
+495,835373,"TERMINAL",0,0,"\r\n[?2004l\r",,terminal_output
+496,836628,"TERMINAL",0,0,"[1m[92m Compiling[0m crowd-pilot-serialize v0.1.0 (/fast/home/franz.srambical/crowd-pilot-serializer/crates/cli)\r\n[1m[96m Building[0m [=======================> ] 196/197: crowd-pilot-serialize(bin) \r",,terminal_output
+497,840327,"TERMINAL",0,0,"[K[1m[92m Finished[0m ]8;;https://doc.rust-lang.org/cargo/reference/profiles.html#default-profiles\`release` profile [optimized]]8;;\ target(s) in 4.84s\r\n]0;franz.srambical@hai-login2:~/crowd-pilot-serializer[?2004h[franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ ",,terminal_output
+498,844481,"TERMINAL",0,0,"cargo build --release -p crowd-pilot-serialize",,terminal_output
+499,844537,"TERMINAL",0,0,"time bash /home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/crowd_pilot_serializer/serialize.sh",,terminal_output
+500,845216,"TERMINAL",0,0,"\r\n[?2004l\rLoading tokenizer from Qwen/Qwen3-8B...\r\n",,terminal_output
+501,845491,"TERMINAL",0,0,"Processing CSV files from ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/""...\r\n",,terminal_output
+502,845897,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-4457e5d2-f5e8-4b15-95aa-bafa247369991751528947759-2025_07_03-09.50.10.663/source.csv""\r\n",,terminal_output
+503,846080,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-29e2cbae-7056-4585-b457-f48bd451c3fd1750644341589-2025_06_22-19.05.43.270/source.csv""\r\n",,terminal_output
+504,846280,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+505,846444,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-0f5513f7-8bc9-4c5d-856d-79d92f75113d1751284706913-2025_06_30-13.59.01.459/source.csv""\r\n",,terminal_output
+506,847663,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-8e0958c9-e396-41d9-b3d4-8a748cefa1701750701699946-2025_06_23-11.01.41.744/source.csv""\r\n",,terminal_output
+507,847753,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+508,847892,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+509,848550,"TERMINAL",0,0,"Processed 100/349 sessions...\r\n",,terminal_output
+510,848603,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/e8b08c312d88206805b92191af1ee2a660f8f0e59d3990233d6a3f81cdab43f4/crowd-code-b329b9f2-ee04-44c3-8e37-55401c64da7f1750160908319-2025_06_17-13.49.11.314/source.csv""\r\n",,terminal_output
+511,848950,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-0d5e2cbe-83a2-48a0-b2d9-a8e41912bfb61753114173405-2025_07_21-18.09.45.514/source.csv""\r\n",,terminal_output
+512,849027,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-fe3e4a6d-15dd-460a-96b3-1f7a60db202a1753178306285-2025_07_22-12.23.53.192/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-fe3e4a6d-15dd-460a-96b3-1f7a60db202a1753178306285-2025_07_22-12.23.53.192/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+513,849338,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-d80c259c-238d-4ab7-8a8a-2d0fc8e345961753345086680-2025_07_24-10.19.00.46/source.csv""\r\n",,terminal_output
+514,849795,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+515,850126,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-81dc70dc-8e01-48a6-9a00-9349b9f9a4171751541780271-2025_07_03-13.23.33.804/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+516,850348,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-bec33357-731a-4f4a-afb4-3d538f18fd451751530719479-2025_07_03-10.19.48.863/source.csv""\r\n",,terminal_output
+517,851121,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-2f5b13c7-61f7-4340-b581-9edac6a53f1f1753015255059-2025_07_20-14.41.09.478/source.csv""\r\n",,terminal_output
+518,851927,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-81dc70dc-8e01-48a6-9a00-9349b9f9a4171751541780271-2025_07_03-13.23.33.804/source.csv""\r\n",,terminal_output
+519,852529,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/e8b08c312d88206805b92191af1ee2a660f8f0e59d3990233d6a3f81cdab43f4/crowd-code-9c6bfb6d-25d7-401c-a2eb-18ae18179f4b1750244015795-2025_06_18-12.58.06.505/source.csv""\r\n",,terminal_output
+520,853595,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-bec33357-731a-4f4a-afb4-3d538f18fd451751530719479-2025_07_03-10.19.48.863/source.csv""\r\n",,terminal_output
+521,853697,"crates/cli/src/main.rs",0,0,"",rust,tab
+522,853698,"crates/cli/src/main.rs",147,0,"",rust,selection_command
+523,857776,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+524,857894,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+525,858022,"TERMINAL",0,0,"Processed 200/349 sessions...\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+526,858154,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+527,858700,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+528,858935,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-3e3ce02e-664a-4f58-9d7f-0f56e32c7def1753363875204-2025_07_24-15.31.23.202/source.csv""\r\n",,terminal_output
+529,859148,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+530,859637,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+531,859693,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-2d2437e1-caa5-4315-a7d9-4d9478073a161750944609503-2025_06_26-15.30.55.51/source.csv""\r\n",,terminal_output
+532,860131,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+533,860360,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+534,860811,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+535,861101,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-92ab1593-f937-4cc4-a174-544581a6ac991751909174142-2025_07_07-19.26.40.736/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-92ab1593-f937-4cc4-a174-544581a6ac991751909174142-2025_07_07-19.26.40.736/source.csv""\r\n",,terminal_output
+536,861367,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-6791460b-ec38-4da2-872f-193943c12d601753274780799-2025_07_23-14.47.19.396/source.csv""\r\n",,terminal_output
+537,866033,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-cdede756-87c5-47af-85f8-bc9bf1c41bac1750785125700-2025_06_24-22.05.07.988/source.csv""\r\n",,terminal_output
+538,866480,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-fbd09e27-2302-4b0c-83a4-a77b7bc2e3dc1751440721102-2025_07_02-09.19.16.832/source.csv""\r\n",,terminal_output
+539,869106,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-cdede756-87c5-47af-85f8-bc9bf1c41bac1750785125700-2025_06_24-22.05.07.988/source.csv""\r\n",,terminal_output
+540,869682,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-cdede756-87c5-47af-85f8-bc9bf1c41bac1750785125700-2025_06_24-22.05.07.988/source.csv""\r\n",,terminal_output
+541,869993,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-cdede756-87c5-47af-85f8-bc9bf1c41bac1750785125700-2025_06_24-22.05.07.988/source.csv""\r\n",,terminal_output
+542,870217,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-cdede756-87c5-47af-85f8-bc9bf1c41bac1750785125700-2025_06_24-22.05.07.988/source.csv""\r\n",,terminal_output
+543,872375,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-46a4cc9d-ac37-44ff-ae8d-547db76d96f31752072213286-2025_07_09-16.43.57.848/source.csv""\r\n",,terminal_output
+544,876690,"crates/cli/Cargo.toml",0,0,"[package]\nname = ""crowd-pilot-serialize""\nversion.workspace = true\nedition.workspace = true\nlicense.workspace = true\ndescription = ""CLI tool for serializing crowd-pilot's IDE interaction data""\n\n[[bin]]\nname = ""crowd-pilot-serialize""\npath = ""src/main.rs""\n\n[dependencies]\ncrowd-pilot-serializer-core = { path = ""../core"" }\nclap = { version = ""4.5"", features = [""derive""] }\ntokenizers = { version = ""0.21"", features = [""http""] }\nserde_json = { workspace = true }\n\n",plaintext,tab
+545,876691,"crates/cli/Cargo.toml",370,0,"",plaintext,selection_command
+546,880948,"crates/cli/Cargo.toml",0,0,"[package]\nname = ""crowd-pilot-serialize""\nversion.workspace = true\nedition.workspace = true\nlicense.workspace = true\ndescription = ""CLI tool for serializing crowd-pilot's IDE interaction data""\n\n[[bin]]\nname = ""crowd-pilot-serialize""\npath = ""src/main.rs""\n\n[dependencies]\ncrowd-pilot-serializer-core = { path = ""../core"" }\nclap = { version = ""4.5"", features = [""derive""] }\ntokenizers = { version = ""0.21"", features = [""http""] }\nserde_json = { workspace = true }\n\n",plaintext,tab
+547,880948,"crates/cli/Cargo.toml",336,0,"",plaintext,selection_mouse
+548,880963,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-3dde1b0c-c963-467e-aa73-fb6c54df3ae41751963426964-2025_07_08-10.30.57.271/source.csv""\r\n",,terminal_output
+549,881370,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-3dde1b0c-c963-467e-aa73-fb6c54df3ae41751963426964-2025_07_08-10.30.57.271/source.csv""\r\n",,terminal_output
+550,882133,"crates/cli/Cargo.toml",0,0,"",plaintext,tab
+551,882135,"crates/cli/src/main.rs",0,0,"",rust,tab
+552,883274,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-7879e034-f897-48e8-8481-1a87a73b0dc81752135543307-2025_07_10-10.19.09.565/source.csv""\r\n",,terminal_output
+553,883912,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-8e67a739-7b65-4646-afc1-42e9766880571751607756007-2025_07_04-07.43.31.602/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-8e67a739-7b65-4646-afc1-42e9766880571751607756007-2025_07_04-07.43.31.602/source.csv""\r\n",,terminal_output
+554,884181,"crates/cli/src/main.rs",0,0,"",rust,tab
+555,884791,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-9d03fc6f-7387-447c-9d61-dfb41a6cd5d41752168236691-2025_07_10-19.24.24.908/source.csv""\r\n",,terminal_output
+556,885331,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\n",,terminal_output
+557,886013,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-ebfebb4b-ef7a-46a0-8725-dd76b91fd2891752048248358-2025_07_09-10.04.56.945/source.csv""\r\n",,terminal_output
+558,886127,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-7879e034-f897-48e8-8481-1a87a73b0dc81752135543307-2025_07_10-10.19.09.565/source.csv""\r\n",,terminal_output
+559,892462,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\n",,terminal_output
+560,894040,"TERMINAL",0,0,"Processed 300/349 sessions...\r\n",,terminal_output
+561,913573,"TERMINAL",0,0,"Processed 349/349 sessions...\r\nProcessed 349 sessions\r\nWriting output to ""test_output_dir""...\r\n",,terminal_output
+562,923261,"TERMINAL",0,0,"\r\n[summary]\r\n Total sessions processed: 349\r\n Train conversations: 4036\r\n Val conversations: 403\r\n Total messages: 129517\r\n Total tokens: 33411160\r\n Output: ""test_output_dir""/{training,validation}.jsonl\r\n Metadata: ""test_output_dir/metadata.json""\r\n",,terminal_output
+563,924031,"TERMINAL",0,0,"\r\nreal\t1m18.850s\r\nuser\t62m27.949s\r\nsys\t1m48.527s\r\n]0;franz.srambical@hai-login2:~/crowd-pilot-serializer[?2004h[franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ ",,terminal_output
+564,3901206,"TERMINAL",0,0,"\r[K[franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ ",,terminal_output
+565,3911235,"TERMINAL",0,0,"c",,terminal_output
+566,3911489,"TERMINAL",0,0,"at",,terminal_output
+567,3911701,"TERMINAL",0,0," t",,terminal_output
+568,3911752,"TERMINAL",0,0,"e",,terminal_output
+569,3911922,"TERMINAL",0,0,"st",,terminal_output
+570,3912108,"TERMINAL",0,0,"_output_dir/",,terminal_output
+571,3913026,"TERMINAL",0,0,"m",,terminal_output
+572,3913114,"TERMINAL",0,0,"e",,terminal_output
+573,3913292,"TERMINAL",0,0,"tadata.json ",,terminal_output
+574,3913674,"TERMINAL",0,0,"\r\n[?2004l\r{\r\n ""config"": {\r\n ""coalesce_radius"": 5,\r\n ""csv_root"": ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/"",\r\n ""max_tokens_per_conversation"": 8192,\r\n ""max_tokens_per_message"": 2048,\r\n ""min_conversation_messages"": 5,\r\n ""output_dir"": ""test_output_dir"",\r\n ""tokenizer"": ""Qwen/Qwen3-8B"",\r\n ""val_ratio"": 0.1,\r\n ""viewport_radius"": 10\r\n },\r\n ""counts"": {\r\n ""total_conversations"": 4439,\r\n ""total_sessions"": 349,\r\n ""train_conversations"": 4036,\r\n ""val_conversations"": 403\r\n },\r\n ""files"": {\r\n ""train_path"": ""test_output_dir/training.jsonl"",\r\n ""val_path"": ""test_output_dir/validation.jsonl""\r\n },\r\n ""stats"": {\r\n ""avg_messages_per_conversation"": 29.177066906961027,\r\n ""avg_tokens_per_conversation"": 7526.731245776075,\r\n ""total_messages"": 129517,\r\n ""total_tokens"": 33411160\r\n }\r\n}]0;franz.srambical@hai-login2:~/crowd-pilot-serializer[?2004h[franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ ",,terminal_output
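The `stats` block in the metadata above is derivable from `counts`: average tokens (or messages) per conversation is just the total divided by `total_conversations`. A small sanity check, using the values printed by `cat test_output_dir/metadata.json` (the JSON literal below copies only the relevant fields):

```python
import json

# Relevant subset of the metadata.json shown above.
metadata = json.loads("""{
  "counts": {"total_conversations": 4439, "total_sessions": 349},
  "stats": {"avg_messages_per_conversation": 29.177066906961027,
            "avg_tokens_per_conversation": 7526.731245776075,
            "total_messages": 129517, "total_tokens": 33411160}
}""")

c, s = metadata["counts"], metadata["stats"]
avg_tokens = s["total_tokens"] / c["total_conversations"]    # 33411160 / 4439
avg_msgs = s["total_messages"] / c["total_conversations"]    # 129517 / 4439
print(round(avg_tokens, 2), round(avg_msgs, 2))
```

Note that the average conversation (about 7527 tokens) sits just under the `max_tokens_per_conversation` budget of 8192, which suggests the packer fills conversations close to the limit.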
+575,3935256,"TERMINAL",0,0,"m",,terminal_output
+576,3935387,"TERMINAL",0,0,"v",,terminal_output
+577,3935556,"TERMINAL",0,0," ",,terminal_output
+578,3935824,"TERMINAL",0,0,".",,terminal_output
+579,3936012,"TERMINAL",0,0,".",,terminal_output
+580,3936138,"TERMINAL",0,0,"/",,terminal_output
+581,3936284,"TERMINAL",0,0,"c",,terminal_output
+582,3936417,"TERMINAL",0,0,"r",,terminal_output
+583,3936544,"TERMINAL",0,0,"o",,terminal_output
+584,3936599,"TERMINAL",0,0,"w",,terminal_output
+585,3936743,"TERMINAL",0,0,"d",,terminal_output
+586,3936826,"TERMINAL",0,0,"-",,terminal_output
+587,3937156,"TERMINAL",0,0,"pi",,terminal_output
+588,3937387,"TERMINAL",0,0,"lot",,terminal_output
+589,3938144,"TERMINAL",0,0," ",,terminal_output
+590,3938318,"TERMINAL",0,0,".",,terminal_output
+591,3938496,"TERMINAL",0,0,".",,terminal_output
+592,3938688,"TERMINAL",0,0,"/",,terminal_output
+593,3938872,"TERMINAL",0,0,"c",,terminal_output
+594,3939098,"TERMINAL",0,0,"r",,terminal_output
+595,3939158,"TERMINAL",0,0,"o",,terminal_output
+596,3939248,"TERMINAL",0,0,"w",,terminal_output
+597,3939363,"TERMINAL",0,0,"d",,terminal_output
+598,3939504,"TERMINAL",0,0,"-",,terminal_output
+599,3939956,"TERMINAL",0,0,"p",,terminal_output
+600,3940015,"TERMINAL",0,0,"i",,terminal_output
+601,3940221,"TERMINAL",0,0,"lot",,terminal_output
+602,3941128,"TERMINAL",0,0,"-",,terminal_output
+603,3941513,"TERMINAL",0,0,"ser",,terminal_output
+604,3941660,"TERMINAL",0,0,"ializer/",,terminal_output
+605,3942531,"TERMINAL",0,0,"[K",,terminal_output
+606,3942799,"TERMINAL",0,0,"-",,terminal_output
+607,3943048,"TERMINAL",0,0,"le",,terminal_output
+608,3943131,"TERMINAL",0,0,"g",,terminal_output
+609,3943304,"TERMINAL",0,0,"a",,terminal_output
+610,3943415,"TERMINAL",0,0,"c",,terminal_output
+611,3943534,"TERMINAL",0,0,"y",,terminal_output
+612,3943755,"TERMINAL",0,0,"\r\n[?2004l\r]0;franz.srambical@hai-login2:~/crowd-pilot-serializer[?2004h[franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ ",,terminal_output
+613,3953356,"/home/franz.srambical/crowd-pilot-serializer-legacy/crowd_pilot_serializer/serialization_utils.py",0,0,"#!/usr/bin/env python3\n""""""\nCommon utilities for dataset serialization scripts.\n""""""\n\nfrom __future__ import annotations\n\nfrom dataclasses import dataclass, field\nfrom pathlib import Path\nfrom typing import List, Optional, Tuple, Dict, Any\n\nimport difflib\nimport re\nimport pandas as pd\nfrom datasets import Dataset, load_dataset\n\n\n_ANSI_CSI_RE = re.compile(r""\x1b\[[0-9;?]*[ -/]*[@-~]"")\n_ANSI_OSC_TERMINATED_RE = re.compile(r""\x1b\][\s\S]*?(?:\x07|\x1b\\)"")\n_ANSI_OSC_LINE_FALLBACK_RE = re.compile(r""\x1b\][^\n]*$"")\n_BRACKETED_PASTE_ENABLE = ""\x1b[?2004h""\n_BRACKETED_PASTE_DISABLE = ""\x1b[?2004l""\n_OSC_633 = ""\x1b]633;""\n_OSC_0 = ""\x1b]0;""\n\n\n\n@dataclass\nclass ConversationState:\n """"""\n Mutable state used while constructing conversations.\n """"""\n conversations: List[List[Dict[str, str]]]\n max_tokens_per_conversation: int\n max_tokens_per_message: int\n min_conversation_messages: int\n tokenizer: Any\n conversation_token_counts: List[int] = field(default_factory=list)\n current_conversation: List[Dict[str, str]] = field(default_factory=list)\n current_tokens: int = 0\n files_opened_in_conversation: set[str] = field(default_factory=set)\n\n def finalize_conversation(self) -> None:\n """"""\n Finalize the current conversation: check constraints and append if valid.\n Then reset state for the next conversation.\n """"""\n if self.current_conversation:\n is_long_enough = len(self.current_conversation) >= self.min_conversation_messages\n has_user = any(msg.get(""from"") == ""User"" for msg in self.current_conversation)\n has_assistant = any(msg.get(""from"") == ""Assistant"" for msg in self.current_conversation)\n\n if is_long_enough and has_user and has_assistant:\n self.conversations.append(self.current_conversation)\n self.conversation_token_counts.append(self.current_tokens)\n \n self.current_conversation = 
[]\n self.current_tokens = 0\n self.files_opened_in_conversation.clear()\n\n def append_message(self, message: Dict[str, str]) -> None:\n value = message[""value""]\n \n tokens = self.tokenizer.encode(value)\n num_tokens = len(tokens)\n\n if num_tokens > self.max_tokens_per_message:\n tokens = tokens[:self.max_tokens_per_message]\n value = self.tokenizer.decode(tokens)\n message[""value""] = value\n num_tokens = self.max_tokens_per_message\n\n if self.current_tokens + num_tokens > self.max_tokens_per_conversation:\n self.finalize_conversation()\n\n self.current_conversation.append(message)\n self.current_tokens += num_tokens\n\n def maybe_capture_file_contents(\n self,\n file_path: str,\n content: str,\n ) -> None:\n """"""\n Capture the contents of the given file in the current conversation if it hasn't been opened yet.\n """"""\n if file_path in self.files_opened_in_conversation:\n return\n cmd = f""cat -n {file_path}""\n self.append_message({\n ""from"": ""Assistant"",\n ""value"": _fenced_block(""bash"", _clean_text(cmd)),\n })\n output = _line_numbered_output(content)\n self.append_message({\n ""from"": ""User"",\n ""value"": f""\n{output}\n"",\n })\n self.files_opened_in_conversation.add(file_path)\n\n\ndef _clean_text(text: str) -> str:\n # Normalize line endings and strip trailing spaces; preserve tabs/newlines.\n return text.replace(""\r\n"", ""\n"").replace(""\r"", ""\n"").rstrip()\n\n\ndef _fenced_block(language: Optional[str], content: str) -> str:\n lang = (language or """").lower()\n return f""```{lang}\n{content}\n```\n""\n\n\ndef _apply_change(content: str, offset: int, length: int, new_text: str) -> str:\n # Mirrors crowd_code_player.replay_file.apply_change\n base = str(content)\n text = str(new_text) if pd.notna(new_text) else """"\n text = text.replace(""\\n"", ""\n"").replace(""\\r"", ""\r"")\n if offset > len(base):\n base = base + ("" "" * (offset - len(base)))\n return base[:offset] + text + base[offset + length:]\n\n\ndef 
_apply_backspaces(text: str) -> str:\n out: List[str] = []\n for ch in text:\n if ch == ""\b"": # \x08\n if out:\n out.pop()\n else:\n out.append(ch)\n return """".join(out)\n\n\ndef _normalize_terminal_output(raw: str) -> str:\n """"""\n Normalize PTY/terminal output for training:\n - Apply backspaces (\x08)\n - Strip OSC (window title/shell integration) first, keeping BEL/ST terminators intact\n - Resolve carriage returns (\r) by keeping the last rewrite per line\n - Strip CSI (coloring etc.)\n - Finally drop any remaining BEL (\x07)\n """"""\n if not raw:\n return raw\n s = _apply_backspaces(raw)\n # Remove OSC sequences that are properly terminated (BEL or ST)\n s = _ANSI_OSC_TERMINATED_RE.sub("""", s)\n # Fallback: drop any unterminated OSC up to end-of-line only\n s = ""\n"".join(_ANSI_OSC_LINE_FALLBACK_RE.sub("""", line) for line in s.split(""\n""))\n # Resolve carriage returns per line:\n # - If there are multiple rewrites, keep the last non-empty conversation\n # - If it's CRLF (ending with '\r' before '\n'), keep the content before '\r'\n resolved_lines: List[str] = []\n for seg in s.split(""\n""):\n parts = seg.split(""\r"")\n chosen = """"\n # pick last non-empty part if available; else last part\n for p in reversed(parts):\n if p != """":\n chosen = p\n break\n if chosen == """" and parts:\n chosen = parts[-1]\n resolved_lines.append(chosen)\n s = ""\n"".join(resolved_lines)\n # Strip ANSI escape sequences\n s = _ANSI_CSI_RE.sub("""", s)\n # Remove any remaining BEL beeps\n s = s.replace(""\x07"", """")\n return s\n\n\ndef _line_numbered_output(content: str, start_line: Optional[int] = None, end_line: Optional[int] = None) -> str:\n # FIXME (f.srambical): check whether this corresponds **exactly** to the output of cat -n {file_path} | sed -n '{vstart},{vend}p'\n lines = content.splitlines()\n total = len(lines)\n if total == 0:\n return """"\n s = 1 if start_line is None else max(1, min(start_line, total))\n e = total if end_line is None else max(1, 
min(end_line, total))\n assert e >= s, ""End line number cannot be less than start line number! Likely a bug in the line numbering computation.""\n buf: List[str] = []\n for idx in range(s, e + 1):\n buf.append(f""{idx:6}\t{lines[idx - 1]}"")\n return ""\n"".join(buf)\n\n\ndef _compute_viewport(total_lines: int, center_line: int, radius: int) -> Tuple[int, int]:\n if total_lines <= 0:\n return (1, 0)\n start = max(1, center_line - radius)\n end = min(total_lines, center_line + radius)\n assert end >= start, ""Viewport cannot have negative width! Likely a bug in the viewport computation.""\n return (start, end)\n\n\ndef _escape_single_quotes_for_sed(text: str) -> str:\n # Close quote, add an escaped single quote, reopen quote: '""'""'\n return text.replace(""'"", ""'\""'\""'"")\n\n\ndef _compute_changed_block_lines(\n before: str, after: str\n) -> Tuple[int, int, int, int, List[str]]:\n """"""\n Return 1-based start and end line numbers in 'before' that should be\n replaced, 1-based start and end line numbers in 'after' that contain\n the replacement, and the replacement lines from 'after'.\n\n For pure deletions, the replacement list may be empty.\n """"""\n before_lines = before.splitlines()\n after_lines = after.splitlines()\n sm = difflib.SequenceMatcher(a=before_lines, b=after_lines, autojunk=False)\n opcodes = [op for op in sm.get_opcodes() if op[0] != ""equal""]\n assert opcodes, ""Opcode list cannot be empty! 
Likely a bug in the diff computation.""\n\n first = opcodes[0]\n last = opcodes[-1]\n # i1/i2 refer to 'before' indices, j1/j2 to 'after'\n start_before = max(1, first[1] + 1)\n end_before = last[2] # no increment since we go from 'exclusive' to 'inclusive' indexing\n start_after = max(1, first[3] + 1)\n end_after = last[4]\n replacement_lines = after_lines[first[3] : last[4]]\n return (start_before, end_before, start_after, end_after, replacement_lines)\n\n\ndef session_to_nemo_conversations(\n df: pd.DataFrame,\n max_tokens_per_conversation: int,\n max_tokens_per_message: int,\n min_conversation_messages: int,\n tokenizer: Any,\n viewport_radius: int = 10,\n normalize_terminal_output: bool = True,\n coalesce_radius: int = 5,\n) -> Tuple[List[List[Dict[str, str]]], List[int]]:\n """"""\n Convert a session DataFrame to one or more NeMo conversations.\n\n - Conversations are created by approximately limiting the total tokens\n across all `value` fields to `max_tokens_per_conversation`.\n - When a new conversation starts (after the first), the first time a file is\n referenced in that conversation we re-log the full file contents with\n `cat -n ` and numbered output so that each conversation is self-contained.\n """"""\n file_states: Dict[str, str] = {}\n per_file_viewport: Dict[str, Optional[Tuple[int, int]]] = {}\n\n conversations: List[List[Dict[str, str]]] = []\n conversation_token_counts: List[int] = []\n conversation_state = ConversationState(\n conversations=conversations,\n conversation_token_counts=conversation_token_counts,\n max_tokens_per_conversation=max_tokens_per_conversation,\n max_tokens_per_message=max_tokens_per_message,\n min_conversation_messages=min_conversation_messages,\n tokenizer=tokenizer,\n )\n\n terminal_output_buffer: List[str] = []\n pending_edits_before: Dict[str, Optional[str]] = {}\n pending_edit_regions: Dict[str, Optional[Tuple[int, int]]] = {}\n\n def _flush_terminal_output_buffer() -> None:\n if not terminal_output_buffer:\n 
return\n aggregated = """".join(terminal_output_buffer)\n out = aggregated\n if normalize_terminal_output:\n out = _normalize_terminal_output(out)\n cleaned = _clean_text(out)\n if cleaned.strip():\n conversation_state.append_message({\n ""from"": ""User"",\n ""value"": f""\n{cleaned}\n"",\n })\n terminal_output_buffer.clear()\n\n def _flush_pending_edit_for_file(target_file: str) -> None:\n before_snapshot = pending_edits_before.get(target_file)\n if before_snapshot is None:\n return\n after_state = file_states.get(target_file, """")\n if before_snapshot.rstrip(""\n"") == after_state.rstrip(""\n""):\n pending_edits_before[target_file] = None\n pending_edit_regions[target_file] = None\n return\n (\n start_before,\n end_before,\n start_after,\n end_after,\n repl_lines,\n ) = _compute_changed_block_lines(before_snapshot, after_state)\n before_total_lines = len(before_snapshot.splitlines())\n if end_before < start_before:\n escaped_lines = [_escape_single_quotes_for_sed(line) for line in repl_lines]\n sed_payload = ""\n"".join(escaped_lines)\n if start_before <= max(1, before_total_lines):\n sed_cmd = f""sed -i '{start_before}i\\\n{sed_payload}' {target_file}""\n else:\n sed_cmd = f""sed -i '$a\\\n{sed_payload}' {target_file}""\n elif not repl_lines:\n sed_cmd = f""sed -i '{start_before},{end_before}d' {target_file}""\n else:\n escaped_lines = [_escape_single_quotes_for_sed(line) for line in repl_lines]\n sed_payload = ""\n"".join(escaped_lines)\n sed_cmd = f""sed -i '{start_before},{end_before}c\\\n{sed_payload}' {target_file}""\n total_lines = len(after_state.splitlines())\n center = (start_after + end_after) // 2\n vp = _compute_viewport(total_lines, center, viewport_radius)\n per_file_viewport[target_file] = vp\n vstart, vend = vp\n conversation_state.maybe_capture_file_contents(target_file, before_snapshot)\n chained_cmd = f""{sed_cmd} && cat -n {target_file} | sed -n '{vstart},{vend}p'""\n conversation_state.append_message({\n ""from"": ""Assistant"",\n 
""value"": _fenced_block(""bash"", _clean_text(chained_cmd)),\n })\n viewport_output = _line_numbered_output(after_state, vstart, vend)\n conversation_state.append_message({\n ""from"": ""User"",\n ""value"": f""\n{viewport_output}\n"",\n })\n pending_edits_before[target_file] = None\n pending_edit_regions[target_file] = None\n\n def _flush_all_pending_edits() -> None:\n for fname in list(pending_edits_before.keys()):\n _flush_pending_edit_for_file(fname)\n\n for i in range(len(df)):\n row = df.iloc[i]\n file_path: str = row[""File""]\n event_type = row[""Type""]\n\n match event_type:\n case ""tab"":\n _flush_all_pending_edits()\n _flush_terminal_output_buffer()\n text = row[""Text""]\n if pd.notna(text):\n content = str(text).replace(""\\n"", ""\n"").replace(""\\r"", ""\r"")\n file_states[file_path] = content\n cmd = f""cat -n {file_path}""\n conversation_state.append_message({\n ""from"": ""Assistant"",\n ""value"": _fenced_block(""bash"", _clean_text(cmd)),\n })\n output = _line_numbered_output(content)\n conversation_state.append_message({\n ""from"": ""User"",\n ""value"": f""\n{output}\n"",\n })\n conversation_state.files_opened_in_conversation.add(file_path)\n else:\n # File switch without content snapshot: show current viewport only\n content = file_states.get(file_path, """")\n total_lines = len(content.splitlines())\n vp = per_file_viewport.get(file_path)\n if not vp or vp[1] == 0:\n vp = _compute_viewport(total_lines, 1, viewport_radius)\n per_file_viewport[file_path] = vp\n if vp and vp[1] >= vp[0]:\n vstart, vend = vp\n conversation_state.maybe_capture_file_contents(file_path, content)\n cmd = f""cat -n {file_path} | sed -n '{vstart},{vend}p'""\n conversation_state.append_message({\n ""from"": ""Assistant"",\n ""value"": _fenced_block(""bash"", _clean_text(cmd)),\n })\n viewport_output = _line_numbered_output(content, vstart, vend)\n conversation_state.append_message({\n ""from"": ""User"",\n ""value"": f""\n{viewport_output}\n"",\n })\n\n case 
""content"":\n _flush_terminal_output_buffer()\n offset = int(row[""RangeOffset""])\n length = int(row[""RangeLength""])\n new_text = row[""Text""]\n before = file_states.get(file_path, """")\n # Approximate current edit region in line space\n new_text_str = str(new_text) if pd.notna(new_text) else """"\n start_line_current = before[:offset].count(""\n"") + 1\n deleted_conversation = before[offset:offset + length]\n lines_added = new_text_str.count(""\n"")\n lines_deleted = deleted_conversation.count(""\n"")\n region_start = start_line_current\n region_end = start_line_current + max(lines_added, lines_deleted, 0)\n # Flush pending edits if this edit is far from the pending region\n current_region = pending_edit_regions.get(file_path)\n if current_region is not None:\n rstart, rend = current_region\n if region_start < (rstart - coalesce_radius) or region_start > (rend + coalesce_radius):\n _flush_pending_edit_for_file(file_path)\n current_region = None\n after = _apply_change(before, offset, length, new_text)\n if pending_edits_before.get(file_path) is None:\n pending_edits_before[file_path] = before\n # Update/initialize region union\n if current_region is None:\n pending_edit_regions[file_path] = (region_start, max(region_start, region_end))\n else:\n rstart, rend = current_region\n pending_edit_regions[file_path] = (min(rstart, region_start), max(rend, region_end))\n file_states[file_path] = after\n\n case ""selection_command"" | ""selection_mouse"" | ""selection_keyboard"":\n # During an edit burst (pending edits), suppress flush and viewport emissions\n if pending_edits_before.get(file_path) is None:\n _flush_terminal_output_buffer()\n else:\n # Skip emitting viewport while edits are pending to avoid per-keystroke sed/cat spam\n continue\n offset = int(row[""RangeOffset""])\n content = file_states.get(file_path, """")\n total_lines = len(content.splitlines())\n target_line = content[:offset].count(""\n"") + 1\n vp = per_file_viewport.get(file_path)\n 
should_emit = False\n if not vp or vp[1] == 0:\n vp = _compute_viewport(total_lines, target_line, viewport_radius)\n per_file_viewport[file_path] = vp\n should_emit = True\n else:\n vstart, vend = vp\n if target_line < vstart or target_line > vend:\n vp = _compute_viewport(total_lines, target_line, viewport_radius)\n per_file_viewport[file_path] = vp\n should_emit = True\n if should_emit and vp and vp[1] >= vp[0]:\n vstart, vend = vp\n conversation_state.maybe_capture_file_contents(file_path, content)\n cmd = f""cat -n {file_path} | sed -n '{vstart},{vend}p'""\n conversation_state.append_message({\n ""from"": ""Assistant"",\n ""value"": _fenced_block(""bash"", _clean_text(cmd)),\n })\n viewport_output = _line_numbered_output(content, vstart, vend)\n conversation_state.append_message({\n ""from"": ""User"",\n ""value"": f""\n{viewport_output}\n"",\n })\n\n case ""terminal_command"":\n _flush_all_pending_edits()\n _flush_terminal_output_buffer()\n command = row[""Text""]\n command_str = str(command).replace(""\\n"", ""\n"").replace(""\\r"", ""\r"")\n conversation_state.append_message({\n ""from"": ""Assistant"",\n ""value"": _fenced_block(""bash"", _clean_text(command_str)),\n })\n\n case ""terminal_output"":\n output = row[""Text""]\n raw_output = str(output).replace(""\\n"", ""\n"").replace(""\\r"", ""\r"")\n terminal_output_buffer.append(raw_output)\n\n case ""terminal_focus"":\n _flush_all_pending_edits()\n _flush_terminal_output_buffer()\n # No-op for bash transcript; focus changes don't emit commands/output\n pass\n\n case ""git_branch_checkout"":\n _flush_all_pending_edits()\n _flush_terminal_output_buffer()\n branch_info = row[""Text""]\n branch_str = str(branch_info).replace(""\\n"", ""\n"").replace(""\\r"", ""\r"")\n cleaned = _clean_text(branch_str)\n m = re.search(r""to '([^']+)'"", cleaned)\n if not m:\n raise ValueError(f""Could not extract branch name from git checkout message: {cleaned}"")\n branch_name = m.group(1).strip()\n # Safe-quote branch if it 
contains special characters\n if re.search(r""[^A-Za-z0-9._/\\-]"", branch_name):\n branch_name = ""'"" + branch_name.replace(""'"", ""'\""'\""'"") + ""'""\n cmd = f""git checkout {branch_name}""\n conversation_state.append_message({\n ""from"": ""Assistant"",\n ""value"": _fenced_block(""bash"", _clean_text(cmd)),\n })\n\n case _:\n raise ValueError(f""Unknown event type: {event_type}"")\n\n _flush_all_pending_edits()\n _flush_terminal_output_buffer()\n conversation_state.finalize_conversation()\n return conversations, conversation_token_counts\n\n\n\ndef load_hf_csv(hf_path: str, split: str) -> Dataset:\n loaded = load_dataset(hf_path, split=split)\n\n assert isinstance(loaded, Dataset), ""Expected a Dataset from load_dataset""\n return loaded",python,tab
+614,3974333,"crates/cli/src/main.rs",0,0,"",rust,tab
+615,4012375,"crates/cli/src/main.rs",7050,0,"",rust,selection_command
+616,4013435,"crates/cli/src/main.rs",0,0,"",rust,selection_command
+617,4015269,"crates/cli/src/main.rs",1439,0,"",rust,selection_keyboard
+618,4053409,"/home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/nemo/dataset/hf_part/generate_jsonl_qwen_4k.sh",0,0,"#!/bin/bash\n\nset -uex\n\nOUTPUT_DIR=""/fast/project/HFMI_SynergyUnit/tab_model/data/nemo_hf_part_jsonl_4k_tokens/""\nCSV_ROOT=""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/""\n\nMAX_TOKENS_PER_CONVERSATION=4096\nTOKENIZER_MODEL=""Qwen/Qwen3-Coder-30B-A3B-Instruct""\n\nuv run crowd_pilot/serialize_dataset_nemo_json.py --csv_root=$CSV_ROOT --output_dir=$OUTPUT_DIR --max_tokens_per_conversation=$MAX_TOKENS_PER_CONVERSATION --tokenizer_model=$TOKENIZER_MODEL",shellscript,tab
+619,4058563,"crates/cli/src/main.rs",0,0,"",rust,tab
+620,4060334,"crates/cli/src/main.rs",1438,0,"",rust,selection_command
+621,4060583,"crates/cli/src/main.rs",1410,0,"",rust,selection_command
+622,4060615,"crates/cli/src/main.rs",1371,0,"",rust,selection_command
+623,4060647,"crates/cli/src/main.rs",1320,0,"",rust,selection_command
+624,4060682,"crates/cli/src/main.rs",1319,0,"",rust,selection_command
+625,4060713,"crates/cli/src/main.rs",1281,0,"",rust,selection_command
+626,4060746,"crates/cli/src/main.rs",1243,0,"",rust,selection_command
+627,4060781,"crates/cli/src/main.rs",1186,0,"",rust,selection_command
+628,4061030,"crates/cli/src/main.rs",1185,0,"",rust,selection_command
+629,4061183,"crates/cli/src/main.rs",1150,0,"",rust,selection_command
+630,4089610,"crates/cli/src/main.rs",1109,0,"",rust,selection_command
+631,4214179,"/home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/nemo/dataset/hf_part/generate_jsonl_qwen_4k.sh",0,0,"",shellscript,tab
+632,4216135,"crates/cli/src/main.rs",0,0,"",rust,tab
+633,4225278,"/home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/nemo/dataset/hf_part/generate_jsonl_qwen_4k.sh",0,0,"",shellscript,tab
+634,4226519,"/home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/nemo/dataset/hf_part/generate_jsonl_qwen_4k.sh",182,0,"",shellscript,selection_command
+635,4226760,"/home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/nemo/dataset/hf_part/generate_jsonl_qwen_4k.sh",179,0,"",shellscript,selection_command
+636,4227412,"/home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/nemo/dataset/hf_part/generate_jsonl_qwen_4k.sh",182,0,"",shellscript,selection_command
+637,4227514,"/home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/nemo/dataset/hf_part/generate_jsonl_qwen_4k.sh",214,0,"",shellscript,selection_command
+638,4228637,"crates/cli/src/main.rs",0,0,"",rust,tab
+639,4229190,"crates/cli/src/main.rs",1113,0,"",rust,selection_command
+640,4229342,"crates/cli/src/main.rs",1115,0,"",rust,selection_command
+641,4229603,"crates/cli/src/main.rs",1118,0,"",rust,selection_command
+642,4229638,"crates/cli/src/main.rs",1119,0,"",rust,selection_command
+643,4229700,"crates/cli/src/main.rs",1123,0,"",rust,selection_command
+644,4229702,"crates/cli/src/main.rs",1125,0,"",rust,selection_command
+645,4229737,"crates/cli/src/main.rs",1139,0,"",rust,selection_command
+646,4229956,"crates/cli/src/main.rs",1141,0,"",rust,selection_command
+647,4230174,"crates/cli/src/main.rs",1142,0,"",rust,selection_command
+648,4230747,"crates/cli/src/main.rs",1107,0,"",rust,selection_command
+649,4230817,"crates/cli/src/main.rs",1073,0,"",rust,selection_command
+650,4230935,"crates/cli/src/main.rs",1066,0,"",rust,selection_command
+651,4231095,"crates/cli/src/main.rs",1025,0,"",rust,selection_command
+652,4231606,"crates/cli/src/main.rs",1025,4,"",rust,content
+653,4231917,"crates/cli/src/main.rs",1025,0,"4",rust,content
+654,4231917,"crates/cli/src/main.rs",1026,0,"",rust,selection_keyboard
+655,4231992,"crates/cli/src/main.rs",1026,0,"0",rust,content
+656,4231993,"crates/cli/src/main.rs",1027,0,"",rust,selection_keyboard
+657,4232422,"crates/cli/src/main.rs",1027,0,"6",rust,content
+658,4232422,"crates/cli/src/main.rs",1028,0,"",rust,selection_keyboard
+659,4232832,"crates/cli/src/main.rs",1027,1,"",rust,content
+660,4233003,"crates/cli/src/main.rs",1027,0,"9",rust,content
+661,4233003,"crates/cli/src/main.rs",1028,0,"",rust,selection_keyboard
+662,4233116,"crates/cli/src/main.rs",1028,0,"6",rust,content
+663,4233116,"crates/cli/src/main.rs",1029,0,"",rust,selection_keyboard
+664,4233441,"crates/cli/src/main.rs",1028,0,"",rust,selection_command
+665,4234531,"TERMINAL",0,0,"\r[K[franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ ",,terminal_output
+666,4235310,"TERMINAL",0,0,"mv ../crowd-pilot ../crowd-pilot-serializer-legacy",,terminal_output
+667,4236374,"TERMINAL",0,0,"[K",,terminal_output
+668,4239278,"TERMINAL",0,0,"mv ../crowd-pilot ../crowd-pilot-serializer-legacy",,terminal_output
+669,4239434,"TERMINAL",0,0,"[16Pcat test_output_dir/metadata.json ",,terminal_output
+670,4239579,"TERMINAL",0,0,"time bash /home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/crowd_pilot_serializer/serialize.sh",,terminal_output
+671,4239733,"TERMINAL",0,0,"\r[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[56Pcargo build --release -p crowd-pilot-serialize",,terminal_output
+672,4240874,"TERMINAL",0,0,"\r\n[?2004l\r",,terminal_output
+673,4242192,"TERMINAL",0,0,"[1m[92m Compiling[0m crowd-pilot-serialize v0.1.0 (/fast/home/franz.srambical/crowd-pilot-serializer/crates/cli)\r\n[1m[96m Building[0m [=======================> ] 196/197: crowd-pilot-serialize(bin) \r",,terminal_output
+674,4245918,"TERMINAL",0,0,"[K[1m[92m Finished[0m ]8;;https://doc.rust-lang.org/cargo/reference/profiles.html#default-profiles\`release` profile [optimized]]8;;\ target(s) in 4.92s\r\n]0;franz.srambical@hai-login2:~/crowd-pilot-serializer[?2004h[franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ ",,terminal_output
+675,4278050,"TERMINAL",0,0,"\r(reverse-i-search)`': [K",,terminal_output
+676,4278343,"TERMINAL",0,0,"c': cargo build --release -p [7mc[27mrowd-pilot-serialize",,terminal_output
+677,4278581,"TERMINAL",0,0,"\r[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[Cr': cargo build --release -p [7mcr[27mowd-pilot-serialize",,terminal_output
+678,4278706,"TERMINAL",0,0,"\r[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[Co': cargo build --release -p [7mcro[27mwd-pilot-serialize\r[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[Cw': cargo build --release -p [7mcrow[27md-pilot-serialize",,terminal_output
+679,4278819,"TERMINAL",0,0,"\r[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[Cd': cargo build --release -p [7mcrowd[27m-pilot-serialize",,terminal_output
+680,4279155,"TERMINAL",0,0,"mv ../crowd-pilot ../[7mcrowd[27m-pilot-serializer-legacy",,terminal_output
+681,4279969,"TERMINAL",0,0,"[7mcrowd[27m-pilot ../crowd-pilot-serializer-legacy\r[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C",,terminal_output
+682,4280445,"TERMINAL",0,0,"time bash /home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/[7mcrowd[27m_pilot_serializer/serialize.sh",,terminal_output
+683,4281641,"TERMINAL",0,0,"\r[franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ time bash /home/franz.srambical/slurm/dev/franz/berlin/crowd-pilot/crowd_pilot_serializer/serialize.sh[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C[C",,terminal_output
+684,4282014,"TERMINAL",0,0,"\r\n[?2004l\rLoading tokenizer from Qwen/Qwen3-8B...\r\n",,terminal_output
+685,4282342,"TERMINAL",0,0,"Processing CSV files from ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/""...\r\n",,terminal_output
+686,4282454,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-4457e5d2-f5e8-4b15-95aa-bafa247369991751528947759-2025_07_03-09.50.10.663/source.csv""\r\n",,terminal_output
+687,4282552,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-d80c259c-238d-4ab7-8a8a-2d0fc8e345961753345086680-2025_07_24-10.19.00.46/source.csv""\r\n",,terminal_output
+688,4282780,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-8e0958c9-e396-41d9-b3d4-8a748cefa1701750701699946-2025_06_23-11.01.41.744/source.csv""\r\n",,terminal_output
+689,4283381,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+690,4283401,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+691,4283614,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-2f5b13c7-61f7-4340-b581-9edac6a53f1f1753015255059-2025_07_20-14.41.09.478/source.csv""\r\n",,terminal_output
+692,4285626,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+693,4286489,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+694,4286741,"TERMINAL",0,0,"Processed 100/349 sessions...\r\n",,terminal_output
+695,4287575,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-81dc70dc-8e01-48a6-9a00-9349b9f9a4171751541780271-2025_07_03-13.23.33.804/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/e8b08c312d88206805b92191af1ee2a660f8f0e59d3990233d6a3f81cdab43f4/crowd-code-b329b9f2-ee04-44c3-8e37-55401c64da7f1750160908319-2025_06_17-13.49.11.314/source.csv""\r\n",,terminal_output
+696,4287742,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-0d5e2cbe-83a2-48a0-b2d9-a8e41912bfb61753114173405-2025_07_21-18.09.45.514/source.csv""\r\n",,terminal_output
+697,4287984,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-81dc70dc-8e01-48a6-9a00-9349b9f9a4171751541780271-2025_07_03-13.23.33.804/source.csv""\r\n",,terminal_output
+698,4288061,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-0f5513f7-8bc9-4c5d-856d-79d92f75113d1751284706913-2025_06_30-13.59.01.459/source.csv""\r\n",,terminal_output
+699,4290319,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/e8b08c312d88206805b92191af1ee2a660f8f0e59d3990233d6a3f81cdab43f4/crowd-code-9c6bfb6d-25d7-401c-a2eb-18ae18179f4b1750244015795-2025_06_18-12.58.06.505/source.csv""\r\n",,terminal_output
+700,4290672,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-bec33357-731a-4f4a-afb4-3d538f18fd451751530719479-2025_07_03-10.19.48.863/source.csv""\r\n",,terminal_output
+701,4290833,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-29e2cbae-7056-4585-b457-f48bd451c3fd1750644341589-2025_06_22-19.05.43.270/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-fe3e4a6d-15dd-460a-96b3-1f7a60db202a1753178306285-2025_07_22-12.23.53.192/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-fe3e4a6d-15dd-460a-96b3-1f7a60db202a1753178306285-2025_07_22-12.23.53.192/source.csv""\r\n",,terminal_output
+702,4291085,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+703,4291159,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+704,4291825,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+705,4292680,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+706,4294734,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-bec33357-731a-4f4a-afb4-3d538f18fd451751530719479-2025_07_03-10.19.48.863/source.csv""\r\n",,terminal_output
+707,4297516,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-2d2437e1-caa5-4315-a7d9-4d9478073a161750944609503-2025_06_26-15.30.55.51/source.csv""\r\n",,terminal_output
+708,4299402,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+709,4299953,"TERMINAL",0,0,"Processed 200/349 sessions...\r\n",,terminal_output
+710,4300874,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-92ab1593-f937-4cc4-a174-544581a6ac991751909174142-2025_07_07-19.26.40.736/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-92ab1593-f937-4cc4-a174-544581a6ac991751909174142-2025_07_07-19.26.40.736/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+711,4301021,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+712,4301195,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+713,4301260,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+714,4301750,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+715,4302825,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+716,4303242,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+717,4305144,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-3e3ce02e-664a-4f58-9d7f-0f56e32c7def1753363875204-2025_07_24-15.31.23.202/source.csv""\r\n",,terminal_output
+718,4305452,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+719,4305812,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-6791460b-ec38-4da2-872f-193943c12d601753274780799-2025_07_23-14.47.19.396/source.csv""\r\n",,terminal_output
+720,4307587,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+721,4310134,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+722,4311317,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-172d798b-bee8-455a-b904-9dd3fe6387d51754411154298-2025_08_05-18.25.56.221/source.csv""\r\n",,terminal_output
+723,4313654,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-cdede756-87c5-47af-85f8-bc9bf1c41bac1750785125700-2025_06_24-22.05.07.988/source.csv""\r\n",,terminal_output
+724,4317545,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-cdede756-87c5-47af-85f8-bc9bf1c41bac1750785125700-2025_06_24-22.05.07.988/source.csv""\r\n",,terminal_output
+725,4318269,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-cdede756-87c5-47af-85f8-bc9bf1c41bac1750785125700-2025_06_24-22.05.07.988/source.csv""\r\n",,terminal_output
+726,4319340,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-cdede756-87c5-47af-85f8-bc9bf1c41bac1750785125700-2025_06_24-22.05.07.988/source.csv""\r\n",,terminal_output
+727,4319718,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-cdede756-87c5-47af-85f8-bc9bf1c41bac1750785125700-2025_06_24-22.05.07.988/source.csv""\r\n",,terminal_output
+728,4321367,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-fbd09e27-2302-4b0c-83a4-a77b7bc2e3dc1751440721102-2025_07_02-09.19.16.832/source.csv""\r\n",,terminal_output
+729,4329641,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-8e67a739-7b65-4646-afc1-42e9766880571751607756007-2025_07_04-07.43.31.602/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-8e67a739-7b65-4646-afc1-42e9766880571751607756007-2025_07_04-07.43.31.602/source.csv""\r\n",,terminal_output
+730,4331002,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-46a4cc9d-ac37-44ff-ae8d-547db76d96f31752072213286-2025_07_09-16.43.57.848/source.csv""\r\n",,terminal_output
+731,4331246,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-7879e034-f897-48e8-8481-1a87a73b0dc81752135543307-2025_07_10-10.19.09.565/source.csv""\r\n",,terminal_output
+732,4335040,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/927a8af5474e5654810c00ce2e09fd2de87d3e5722f33fa1090d867db114e403/crowd-code-9d03fc6f-7387-447c-9d61-dfb41a6cd5d41752168236691-2025_07_10-19.24.24.908/source.csv""\r\n",,terminal_output
+733,4335345,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-7879e034-f897-48e8-8481-1a87a73b0dc81752135543307-2025_07_10-10.19.09.565/source.csv""\r\n",,terminal_output
+734,4341669,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in 
""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\n",,terminal_output
+735,4341830,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in 
""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\nWarning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1f15334ab7e6820c9fda17c961659882ef9853cc80f7356b9a9b22f286fd7389/crowd-code-76073275-4388-463f-8e12-ce34ee46fad51752495312029-2025_07_14-14.15.14.704/source.csv""\r\n",,terminal_output
+736,4345700,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-ebfebb4b-ef7a-46a0-8725-dd76b91fd2891752048248358-2025_07_09-10.04.56.945/source.csv""\r\n",,terminal_output
+737,4350649,"TERMINAL",0,0,"Processed 300/349 sessions...\r\n",,terminal_output
+738,4364854,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-3dde1b0c-c963-467e-aa73-fb6c54df3ae41751963426964-2025_07_08-10.30.57.271/source.csv""\r\n",,terminal_output
+739,4365197,"TERMINAL",0,0,"Warning: terminal_command event missing Text in ""/fast/project/HFMI_SynergyUnit/tab_model/data/hf_part_csv/1de052c516cab686515c107385aaf7c3a7e3e5c23c9bc3c0be0cff3df28cd64d/crowd-code-3dde1b0c-c963-467e-aa73-fb6c54df3ae41751963426964-2025_07_08-10.30.57.271/source.csv""\r\n",,terminal_output
+740,4384097,"TERMINAL",0,0,"Processed 349/349 sessions...\r\nProcessed 349 sessions\r\nWriting output to ""test_output_dir""...\r\n",,terminal_output
+741,4404313,"TERMINAL",0,0,"\r\n[summary]\r\n Total sessions processed: 349\r\n Train conversations: 9564\r\n Val conversations: 945\r\n Total messages: 131234\r\n Total tokens: 36988408\r\n Output: ""test_output_dir""/{training,validation}.jsonl\r\n Metadata: ""test_output_dir/metadata.json""\r\n",,terminal_output
+742,4405067,"TERMINAL",0,0,"\r\nreal\t2m3.083s\r\nuser\t92m54.340s\r\nsys\t2m7.583s\r\n]0;franz.srambical@hai-login2:~/crowd-pilot-serializer[?2004h[franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ ",,terminal_output
+743,4451914,"crates/cli/src/main.rs",1025,4,"8192",rust,content
+744,4451918,"crates/cli/src/main.rs",1025,0,"",rust,selection_command
+745,5673880,"TERMINAL",0,0,"\r[K[franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ ",,terminal_output
+746,5677399,"TERMINAL",0,0,"\r[K[franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ ",,terminal_output
+747,7914396,"TERMINAL",0,0,"\r[K[franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ ",,terminal_output
+748,7918716,"TERMINAL",0,0,"\r[K[franz.srambical@hai008.haicore.berlin:~/crowd-pilot-serializer] $ ",,terminal_output
+749,8726050,"crates/cli/src/main.rs",992,0,"",rust,selection_command
diff --git a/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-c7b553b9-9f39-49d5-9921-ace297dd945c1764453653385-2025_11_29-23.00.57.494/source.csv b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-c7b553b9-9f39-49d5-9921-ace297dd945c1764453653385-2025_11_29-23.00.57.494/source.csv
new file mode 100644
index 0000000000000000000000000000000000000000..d36d57cf15403e105c8f2bdc3366682a3c080f10
--- /dev/null
+++ b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-c7b553b9-9f39-49d5-9921-ace297dd945c1764453653385-2025_11_29-23.00.57.494/source.csv
@@ -0,0 +1,65 @@
+Sequence,Time,File,RangeOffset,RangeLength,Text,Language,Type
+1,12,"Untitled-1",0,0,"",plaintext,tab
+2,106,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"11:00:57 PM [info] Activating crowd-code\n11:00:57 PM [info] Recording started\n11:00:57 PM [info] Initializing git provider using file system watchers...\n11:00:57 PM [info] No workspace folder found\n",Log,tab
+3,879,"Untitled-1",0,0,"",plaintext,tab
+4,7510,"Untitled-1",0,0,"\n",plaintext,content
+5,8198,"Untitled-1",0,0,"",plaintext,selection_command
+6,9546,"Untitled-1",1,0,"",plaintext,selection_command
+7,9662,"Untitled-1",0,0,"",plaintext,selection_command
+8,11565,"Untitled-1",1,0,"",plaintext,selection_command
+9,12165,"Untitled-1",0,0,"",plaintext,selection_command
+10,12576,"Untitled-1",1,0,"",plaintext,selection_command
+11,13855,"Untitled-1",0,1,"",plaintext,content
+12,20208,"Untitled-1",0,0,"\n",plaintext,content
+13,20705,"Untitled-1",0,1,"",plaintext,content
+14,25013,"Untitled-1",0,0,"\n",plaintext,content
+15,26025,"Untitled-1",0,0,"",plaintext,selection_command
+16,26644,"Untitled-1",1,0,"",plaintext,selection_command
+17,27236,"Untitled-1",0,0,"",plaintext,selection_command
+18,27970,"Untitled-1",1,0,"",plaintext,selection_command
+19,32009,"Untitled-1",0,0,"",plaintext,selection_command
+20,44217,"Untitled-1",1,0,"",plaintext,selection_command
+21,47229,"Untitled-1",0,0,"",plaintext,selection_command
+22,48041,"Untitled-1",1,0,"",plaintext,selection_command
+23,51061,"Untitled-1",0,0,"",plaintext,selection_command
+24,51215,"Untitled-1",1,0,"",plaintext,selection_command
+25,51533,"Untitled-1",0,0,"",plaintext,selection_command
+26,52546,"Untitled-1",1,0,"",plaintext,selection_command
+27,53160,"Untitled-1",0,0,"",plaintext,selection_command
+28,60839,"Untitled-1",1,0,"",plaintext,selection_command
+29,62433,"Untitled-1",0,0,"",plaintext,selection_command
+30,63823,"Untitled-1",1,0,"",plaintext,selection_command
+31,64201,"Untitled-1",0,0,"",plaintext,selection_command
+32,67687,"TERMINAL",0,0,"Test",,terminal_focus
+33,67690,"Untitled-1",1,0,"/* crowd-pilot: insert start */\nline A\nline B\n/* crowd-pilot: insert end */\n",plaintext,content
+34,70126,"Untitled-1",40,36,"/* crowd-pilot: replacement */\nREPLACED LINE 1\nREPLACED LINE 2",plaintext,content
+35,72865,"Untitled-1",0,0,"\n",plaintext,content
+36,73447,"Untitled-1",0,0,"",plaintext,selection_command
+37,74733,"Untitled-1",0,0,"\n",plaintext,content
+38,75175,"Untitled-1",0,0,"",plaintext,selection_command
+39,76502,"Untitled-1",0,0,"\n",plaintext,content
+40,76939,"Untitled-1",0,0,"",plaintext,selection_command
+41,78227,"Untitled-1",0,0,"\n",plaintext,content
+42,78613,"Untitled-1",0,0,"",plaintext,selection_command
+43,81267,"Untitled-1",37,54,"",plaintext,content
+44,83334,"Untitled-1",1,0,"",plaintext,selection_command
+45,85017,"Untitled-1",2,0,"",plaintext,selection_command
+46,85112,"Untitled-1",1,0,"",plaintext,selection_command
+47,86678,"Untitled-1",0,0,"",plaintext,selection_command
+48,86850,"Untitled-1",1,0,"",plaintext,selection_command
+49,88064,"Untitled-1",1,0," ",plaintext,content
+50,88066,"Untitled-1",2,0,"",plaintext,selection_keyboard
+51,88799,"Untitled-1",1,1,"",plaintext,content
+52,90274,"Untitled-1",0,0,"",plaintext,selection_command
+53,90308,"Untitled-1",1,0,"",plaintext,selection_command
+54,264031,"Untitled-1",2,0,"",plaintext,selection_command
+55,264119,"Untitled-1",1,0,"",plaintext,selection_command
+56,264575,"Untitled-1",2,0,"",plaintext,selection_command
+57,265033,"Untitled-1",3,0,"",plaintext,selection_command
+58,265462,"Untitled-1",4,0,"",plaintext,selection_command
+59,266025,"Untitled-1",5,0,"",plaintext,selection_command
+60,267178,"Untitled-1",37,0,"",plaintext,selection_command
+61,268523,"Untitled-1",5,0,"",plaintext,selection_command
+62,268710,"Untitled-1",4,0,"",plaintext,selection_command
+63,268859,"Untitled-1",3,0,"",plaintext,selection_command
+64,269024,"Untitled-1",2,0,"",plaintext,selection_command
diff --git a/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-c80bdedd-9d77-4f25-87da-5184ca5a8b3f1763540961734-2025_11_19-09.29.24.625/source.csv b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-c80bdedd-9d77-4f25-87da-5184ca5a8b3f1763540961734-2025_11_19-09.29.24.625/source.csv
new file mode 100644
index 0000000000000000000000000000000000000000..6b37cc16d713626a39c1503e41489a8fa11802b1
--- /dev/null
+++ b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-c80bdedd-9d77-4f25-87da-5184ca5a8b3f1763540961734-2025_11_19-09.29.24.625/source.csv
@@ -0,0 +1,18 @@
+Sequence,Time,File,RangeOffset,RangeLength,Text,Language,Type
+1,1,"nemo_run/config.py",0,0,"# SPDX-FileCopyrightText: Copyright (c) 2024 NVIDIA CORPORATION & AFFILIATES. All rights reserved.\n# SPDX-License-Identifier: Apache-2.0\n#\n# Licensed under the Apache License, Version 2.0 (the ""License"");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an ""AS IS"" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom __future__ import annotations\n\nimport copy\nimport dataclasses\nimport inspect\nimport os\nimport re\nimport sys\nimport typing\nfrom pathlib import Path\nfrom types import MappingProxyType\nfrom typing import Any, Callable, Generic, Iterable, Optional, Type, TypeVar, Union, get_args\n\nimport fiddle as fdl\nimport fiddle._src.experimental.dataclasses as fdl_dc\nimport graphviz\nfrom fiddle._src import config, daglish, daglish_extensions\nfrom fiddle._src.casting import register_supported_cast\nfrom fiddle._src.config import TypeOrCallableProducingT\nfrom fiddle.graphviz import render, render_diff\nfrom typing_extensions import Annotated, ParamSpec, Self\n\nimport nemo_run.exceptions as run_exceptions\n\nParams = ParamSpec(""Params"")\nReturnType = TypeVar(""ReturnType"")\n\n_T = TypeVar(""_T"")\n_BuildableT = TypeVar(""_BuildableT"", bound=fdl.Buildable)\n\nRECURSIVE_TYPES = (typing.Union, typing.Optional)\n_NEMORUN_HOME = os.environ.get(""NEMORUN_HOME"", os.path.expanduser(""~/.nemo_run""))\nRUNDIR_NAME = ""nemo_run""\nRUNDIR_SPECIAL_NAME = ""/$nemo_run""\nSCRIPTS_DIR = ""scripts""\n\n# Metadata keys\nUSE_WITH_RAY_CLUSTER_KEY = ""use_with_ray_cluster""\n\n\ndef get_nemorun_home() -> str:\n """"""\n Get the current NEMORUN_HOME directory 
path.\n\n Returns:\n The path to the NEMORUN_HOME directory.\n """"""\n return _NEMORUN_HOME\n\n\ndef set_nemorun_home(path: str) -> None:\n """"""\n Set the NEMORUN_HOME directory path.\n\n Args:\n path: The new path for NEMORUN_HOME.\n """"""\n global _NEMORUN_HOME\n _NEMORUN_HOME = os.path.expanduser(path)\n\n\ndef get_type_namespace(typ: Type | Callable) -> str:\n """"""\n Get the namespace of a type or callable.\n\n Args:\n typ: The type or callable to get the namespace for.\n\n Returns:\n A string representing the namespace of the type or callable.\n\n Examples:\n >>> class MyClass:\n ... pass\n >>> get_type_namespace(MyClass)\n 'your_module.MyClass'\n """"""\n module = typ.__module__\n if module == ""__main__"":\n # Get the filename without extension\n main_module = sys.modules[""__main__""]\n filename = os.path.basename(main_module.__file__)\n module = os.path.splitext(filename)[0]\n\n if isinstance(typ, fdl.Buildable):\n typ = typ.__fn_or_cls__\n\n _name = getattr(typ, ""__qualname__"", str(typ))\n if _name.startswith(""ForwardRef""):\n _name = _name.split(""."")[-1]\n return f""{module}.{_name}""\n\n\ndef get_underlying_types(type_hint: typing.Any) -> typing.Set[typing.Type]:\n if isinstance(type_hint, typing._GenericAlias): # type: ignore\n if str(type_hint).startswith(""typing.Annotated""):\n origin = type_hint.__origin__\n if hasattr(origin, ""__origin__""):\n origin = origin.__origin__\n else:\n origin = type_hint.__origin__\n if origin in RECURSIVE_TYPES:\n types = set()\n for arg in type_hint.__args__:\n types.update(get_underlying_types(arg))\n return types\n return {type_hint}\n\n\ndef from_dict(raw_data: dict | list | str | float | int | bool, cls: Type[_T]) -> _T:\n if isinstance(raw_data, dict):\n underlying_types = get_underlying_types(cls)\n underlying_types = [tp for tp in underlying_types if tp is not type(None)]\n assert len(underlying_types) == 1, (\n f""Unable to load {cls}. 
Nested union types are not currently supported.""\n )\n cls = underlying_types[0] # type: ignore\n\n if dataclasses.is_dataclass(cls):\n fields_dict = {\n f.name: from_dict(raw_data.get(f.name), f.type) # type: ignore\n for f in dataclasses.fields(cls)\n if f.init\n }\n return cls(**fields_dict) # type: ignore\n elif isinstance(raw_data, list):\n return [from_dict(item, cls.__args__[0]) for item in raw_data] # type: ignore\n else:\n return raw_data # type: ignore\n\n\ndef set_value(cfg: config.Buildable, key: str, value: Any) -> None:\n """"""Set an attribute's value.\n\n Args:\n cfg: A `fdl.Buildable` whose attribute is to be overridden.\n assignment: String representing attribute's override expression. Of the form\n `attribute=value`.\n """"""\n *parents, last = _parse_path(key)\n\n walk = typing.cast(Any, cfg)\n try:\n for parent in parents:\n walk = parent.follow(walk)\n except Exception as e:\n raise run_exceptions.SetValueError(f'Invalid path ""{key}"".') from e\n\n try:\n if isinstance(last, daglish.Attr):\n setattr(walk, last.name, value)\n elif isinstance(last, daglish.Key):\n walk[last.key] = value\n else:\n raise run_exceptions.SetValueError(f""Unexpected path element {last}."")\n except Exception as e:\n raise run_exceptions.SetValueError(f'Could not set ""{key}"" to ""{value}"".') from e\n\n\nclass _CloneAndFNMixin:\n def clone(self):\n """"""Returns a deep clone of the object.""""""\n return copy.deepcopy(self)\n\n def walk(self: _BuildableT, **kwargs) -> _BuildableT: # type: ignore\n """"""\n Recursively applies a transformation function to attributes within the configuration object\n and its children that match the keys provided in kwargs. 
Attributes not listed in kwargs\n are not modified.\n\n Args:\n **kwargs (dict): A dictionary where keys are attribute names and values are functions\n that take the current attribute value and return a new value.\n\n Returns\n -------\n Config: A new Config instance with selectively modified attributes.\n\n Examples\n --------\n >>> config = Config(model=ModelConfig(seq_length=128))\n >>> new_config = config.walk(seq_length=lambda cfg: cfg.seq_length * 2)\n >>> new_config.model.seq_length\n 256\n """"""\n return _try_set_all(self, _walk=True, **kwargs)\n\n def broadcast(self: _BuildableT, **kwargs) -> _BuildableT: # type: ignore\n """"""\n Sets new values to attributes within the configuration object and its children that match\n the keys provided in kwargs. Attributes not listed in kwargs are not modified.\n\n Args:\n **kwargs (dict): A dictionary where keys are attribute names and values are the new\n values to be set.\n\n Returns\n -------\n Config: A new Config instance with selectively updated attributes.\n\n Examples\n --------\n >>> config = Config(model=ModelConfig(tensor_model_parallel_size=1))\n >>> new_config = config.broadcast(tensor_model_parallel_size=2)\n >>> new_config.model.tensor_model_parallel_size\n 2\n """"""\n return _try_set_all(self, **kwargs)\n\n\nclass _VisualizeMixin:\n def visualize(self, **kwargs) -> graphviz.Graph:\n return render(self, **kwargs)\n\n def diff(self, old: Self, trim=True, **kwargs):\n return render_diff(old=old, new=self, trim=trim, **kwargs)\n\n def save_config_img(self, path_str: str) -> None:\n """"""\n Saves the configuration to a file.\n\n Args:\n path (str): The file path where the configuration should be saved.\n fdl_fn (Partial): The function descriptor library function to save.\n\n Example:\n >>> save_config_img(""path/to/dir"", some_fdl_fn)\n """"""\n path: Path = Path(path_str)\n\n if not path.suffix:\n path = path / ""config.png""\n elif path.suffix != "".png"":\n raise ValueError(""The file extension must 
be .png"")\n\n path.parent.mkdir(parents=True, exist_ok=True)\n\n with path.open(""wb"") as f:\n f.write(self.visualize().pipe(""png""))\n\n def _repr_svg_(self):\n """"""Special method used by Jupyter to represent an object as SVG.\n\n Returns\n -------\n str: SVG representation of the Config object if Graphviz can render it.\n If Graphviz rendering fails or is not available, it returns None.\n """"""\n try:\n # Attempt to render using Graphviz and return SVG representation\n return self.visualize().pipe(format=""svg"").decode(""utf-8"")\n except Exception as e:\n # If rendering fails, log the exception or handle it as needed\n print(f""Graphviz rendering failed: {e}"")\n return self.__repr__()\n\n\nclass Config(Generic[_T], fdl.Config[_T], _CloneAndFNMixin, _VisualizeMixin):\n """"""\n Wrapper around fdl.Config with nemo_run specific functionality.\n See `fdl.Config `_ for more.\n """"""\n\n def __init__(\n self,\n fn_or_cls: Union[fdl.Buildable[_T], TypeOrCallableProducingT[_T]],\n *args,\n bind_args: bool = True,\n **kwargs,\n ):\n # Handle dict types by converting to _kwargs_to_dict function\n if fn_or_cls == {} or (hasattr(fn_or_cls, ""__origin__"") and fn_or_cls.__origin__ is dict):\n fn_or_cls = dict # type: ignore\n bind_args = False\n\n new_kwargs = kwargs\n if bind_args and not isinstance(fn_or_cls, fdl.Buildable):\n try:\n new_kwargs = _bind_args(fn_or_cls, *args, **kwargs)\n except Exception:\n new_kwargs = kwargs\n\n super().__init__(fn_or_cls, *args, **new_kwargs)\n\n @classmethod\n def __unflatten__(\n cls,\n values: Iterable[Any],\n metadata: config.BuildableTraverserMetadata,\n ):\n # If this is a dictionary config, reconstruct it with the arguments\n if metadata.fn_or_cls is dict:\n return cls(**metadata.arguments(values))\n return super().__unflatten__(values, metadata)\n\n\nclass Partial(Generic[_T], fdl.Partial[_T], _CloneAndFNMixin, _VisualizeMixin):\n """"""\n Wrapper around fdl.Partial with nemo_run specific functionality.\n See 
`fdl.Partial `_ for more.\n """"""\n\n def __init__(\n self,\n fn_or_cls: Union[fdl.Buildable[_T], TypeOrCallableProducingT[_T]],\n *args,\n bind_args: bool = True,\n **kwargs,\n ):\n new_kwargs = kwargs\n if bind_args and not isinstance(fn_or_cls, fdl.Buildable):\n try:\n new_kwargs = _bind_args(fn_or_cls, **kwargs)\n except Exception:\n new_kwargs = kwargs\n\n super().__init__(fn_or_cls, *args, **new_kwargs)\n\n\nregister_supported_cast(fdl.Config, Config)\nregister_supported_cast(fdl.Partial, Partial)\nregister_supported_cast(Config, Config)\nregister_supported_cast(Partial, Partial)\n\n\nclass ConfigurableMixin(_VisualizeMixin):\n """"""\n A mixin class that provides configuration and visualization functionality.\n\n This mixin adds methods for converting objects to Config instances,\n visualizing configurations, and comparing configurations.\n\n For classes that are not dataclasses, the `to_config` method needs to be\n overridden to provide custom conversion logic to Config instances.\n """"""\n\n def diff(self, old: Self, trim=True, **kwargs):\n """"""\n Generate a visual difference between this configuration and an old one.\n\n Args:\n old (Self): The old configuration to compare against.\n trim (bool, optional): Whether to trim unchanged parts. 
Defaults to True.\n **kwargs: Additional arguments to pass to render_diff.\n\n Returns:\n graphviz.Digraph: A graph representing the differences between configurations.\n """"""\n return render_diff(old=old.to_config(), new=self.to_config(), trim=trim, **kwargs)\n\n def to_config(self) -> Config[Self]:\n """"""\n Convert the current object to a Config instance.\n\n This method automatically converts dataclasses to Config instances.\n For classes that are not dataclasses, this method needs to be overridden\n to provide custom conversion logic.\n\n Returns:\n Config: A Config representation of the current object.\n\n Raises:\n NotImplementedError: If the object type cannot be converted to Config\n or if the method is not overridden for non-dataclass types.\n\n Note:\n For classes that are not dataclasses, you must override this method\n to define how the object should be converted to a Config instance.\n """"""\n if dataclasses.is_dataclass(self):\n try:\n return fdl.cast(\n Config, fdl_dc.convert_dataclasses_to_configs(self, allow_post_init=True)\n )\n except Exception as e:\n raise NotImplementedError(\n f""Cannot convert type {type(self)} to Config"",\n f""Please implement a method `to_config` on {type(self)}."",\n ) from e\n elif isinstance(self, (list, tuple, dict)):\n return self # type: ignore\n else:\n raise NotImplementedError(\n f""Cannot convert type {type(self)} to Config. 
""\n f""Please override the `to_config` method for {type(self)}.""\n )\n\n def _repr_svg_(self):\n """"""\n Generate an SVG representation of the object for Jupyter notebooks.\n\n Returns:\n str: SVG representation of the object if it can be rendered,\n otherwise returns the string representation.\n """"""\n if isinstance(self, (list, tuple, dict)):\n try:\n return render(self).pipe(format=""svg"").decode(""utf-8"")\n except Exception as e:\n print(f""Graphviz rendering failed: {e}"")\n return self.__repr__()\n\n return self.to_config()._repr_svg_()\n\n\n@dataclasses.dataclass\nclass Script(ConfigurableMixin):\n """"""\n Dataclass to configure raw scripts.\n\n Examples:\n\n .. code-block:: python\n\n file_based_script = run.Script(""./scripts/echo.sh"")\n\n inline_script = run.Script(\n inline=\""\""\""\n env\n echo ""Hello 1""\n echo ""Hello 2""\n \""\""\""\n )\n\n """"""\n\n #: Path to your script\n path: str = """"\n #: Inline contents of the script. Either path or inline needs to be set.\n inline: str = """"\n #: Args to pass to your scripts, only applicable when path is set.\n args: list[str] = dataclasses.field(default_factory=list)\n #: Environment variables to set when running the script.\n env: dict[str, str] = dataclasses.field(default_factory=dict)\n #: Entrypoint to use, defaults to bash.\n entrypoint: str = ""bash""\n #: Whether to use ``python -m`` when executing via python.\n m: bool = False\n\n metadata: dict[str, Any] = dataclasses.field(default_factory=dict)\n\n def __post_init__(self):\n assert self.path or self.inline\n assert self.entrypoint, ""Need to provide an entrypoint for script.""\n if self.m:\n assert ""python"" in self.entrypoint, ""-m can only be used with python""\n\n def get_name(self):\n if self.inline:\n name = self.inline.strip()[:10]\n return re.sub(""[^0-9a-zA-Z]+"", ""_"", name)\n else:\n return os.path.basename(self.path)\n\n def to_command(\n self, with_entrypoint: bool = False, filename: Optional[str] = None, is_local: bool 
= False\n ) -> list[str]:\n if self.inline:\n if filename:\n os.makedirs(os.path.dirname(filename), exist_ok=True)\n with open(filename, ""w"") as f:\n f.write(""#!/usr/bin/bash\n"" + self.inline)\n\n if is_local:\n cmd = [filename]\n else:\n cmd = [os.path.join(f""/{RUNDIR_NAME}"", SCRIPTS_DIR, Path(filename).name)]\n\n if with_entrypoint:\n cmd = [self.entrypoint] + cmd\n\n return cmd\n\n inline = self.inline.replace('""', '\\""')\n cmd = [""-c"", f'""{inline}""']\n if with_entrypoint:\n cmd = [self.entrypoint] + cmd\n\n return cmd\n\n args = [self.path] + self.args\n if self.m:\n cmd = [""-m""] + args\n else:\n cmd = args\n\n if with_entrypoint:\n if self.entrypoint:\n cmd = [self.entrypoint] + cmd\n else:\n raise ValueError(""Cannot use with_entrypoint=True without specifying entrypoint"")\n\n return cmd\n\n\n# A type alias for an optional type that is annotated with a Config.\n# This is useful for when you want to specify a type is Optional but\n# always want to provide a default config.\nOptionalDefaultConfig = Annotated[Optional[_T], Config[_T]]\nOptionalDefaultPartial = Annotated[Optional[_T], Partial[_T]]\n\n\ndef _parse_path(path: str) -> daglish.Path:\n """"""Parses a path into a list of either attributes or index lookups.""""""\n if not path.startswith(""["") and not path.startswith("".""):\n path = f"".{path}"" # Add a leading `.` to make parsing work properly.\n\n return daglish_extensions.parse_path(path)\n\n\ndef _bind_args(\n fn_or_cls: TypeOrCallableProducingT,\n *fn_args: fdl.Config | str | Callable,\n **fn_kwargs: fdl.Config | str | Callable,\n) -> dict[str, fdl.Config | str | Callable]:\n sig = inspect.signature(fn_or_cls)\n params = sig.parameters\n\n if set(fn_kwargs) > set(params):\n raise TypeError(\n f""{set(fn_kwargs) - set(params)} does not exist as args in {fn_or_cls.__module__}:{fn_or_cls.__name__}. 
Please remove them.""\n )\n\n final_args = _construct_args(fn_or_cls, params, fn_kwargs)\n final_args = fn_kwargs | final_args\n sig.bind(*fn_args, **final_args)\n return final_args\n\n\ndef _construct_args(\n fn_or_cls: TypeOrCallableProducingT,\n params: MappingProxyType[str, inspect.Parameter],\n kwargs: dict[str, fdl.Config | str | Callable],\n):\n final_args = {}\n\n primitive = [str, float, int, bool, bytes]\n primitive.extend([Optional[t] for t in primitive])\n\n for name, parameter in params.items():\n arg = kwargs.get(name, None)\n\n if arg:\n if dataclasses.is_dataclass(arg):\n final_args[name] = fdl.cast(\n Config,\n fdl_dc.convert_dataclasses_to_configs(arg, allow_post_init=True),\n )\n else:\n final_args[name] = arg\n elif str(parameter.annotation).startswith(""typing.Annotated""):\n args = get_args(parameter.annotation)\n if str(args[0]).startswith(""typing.Optional"") and len(args) > 1:\n cfg_type = get_args(args[0])[0]\n buildable = args[1].__origin__\n if issubclass(buildable, fdl.Buildable):\n final_args[name] = buildable(cfg_type)\n\n return final_args\n\n\nConfigT = TypeVar(""ConfigT"", Config, Partial)\n\n\ndef _try_set_all(config: _BuildableT, _walk: bool = False, **kwargs) -> _BuildableT:\n for key, val in kwargs.items():\n if hasattr(config, key):\n _val = val(config) if _walk else val\n setattr(config, key, _val)\n\n for attr_name in dir(config):\n try:\n if hasattr(config, attr_name):\n attr = getattr(config, attr_name)\n if isinstance(attr, (fdl.Config, fdl.Partial)):\n _try_set_all(attr, _walk=_walk, **kwargs)\n except ValueError:\n pass\n\n return config\n",python,tab
+2,45,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"9:29:24 AM [info] Activating crowd-code\n9:29:24 AM [info] Recording started\n9:29:24 AM [info] Initializing git provider using file system watchers...\n9:29:24 AM [info] Git repository found\n9:29:24 AM [info] Git provider initialized successfully\n9:29:24 AM [info] Initial git state: [object Object]\n",Log,tab
+3,1027,"nemo_run/config.py",0,0,"",python,tab
+4,11239,"nemo_run/config.py",10462,13,"class Partial",python,selection_command
+5,13234,"nemo_run/config.py",10549,0,"",python,selection_mouse
+6,13255,"nemo_run/config.py",10548,0,"",python,selection_command
+7,15762,"nemo_run/config.py",10556,0,"",python,selection_command
+8,16108,"nemo_run/config.py",10562,0,"",python,selection_command
+9,16323,"nemo_run/config.py",10569,0,"",python,selection_command
+10,16505,"nemo_run/config.py",10572,0,"",python,selection_command
+11,17021,"nemo_run/config.py",10569,0,"",python,selection_command
+12,17673,"nemo_run/config.py",10489,0,"",python,selection_command
+13,19772,"nemo_run/config.py",0,0,"",python,selection_command
+14,20756,"nemo_run/config.py",979,0,"",python,selection_command
+15,22098,"nemo_run/config.py",0,0,"",python,selection_command
+16,22680,"nemo_run/config.py",10489,0,"",python,selection_command
+17,448036,"nemo_run/config.py",10460,0,"",python,selection_mouse
diff --git a/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-cd6287e2-4cd4-444c-aab8-dbf1b52d201d1763483346410-2025_11_18-17.29.09.568/source.csv b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-cd6287e2-4cd4-444c-aab8-dbf1b52d201d1763483346410-2025_11_18-17.29.09.568/source.csv
new file mode 100644
index 0000000000000000000000000000000000000000..a71df7b4bb72140bbb42cda99d8e949c2667ce5e
--- /dev/null
+++ b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-cd6287e2-4cd4-444c-aab8-dbf1b52d201d1763483346410-2025_11_18-17.29.09.568/source.csv
@@ -0,0 +1,256 @@
+Sequence,Time,File,RangeOffset,RangeLength,Text,Language,Type
+1,1,"nemo/collections/llm/recipes/finetune_default.py",0,0,"# Copyright (c) 2025, NVIDIA CORPORATION. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the ""License"");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an ""AS IS"" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import TYPE_CHECKING, Any, Optional\n\nimport lightning.pytorch as pl\nimport nemo_run as run\nimport torch\n\nimport nemo.lightning as nl\nfrom nemo.collections import llm\nfrom nemo.collections.llm.gpt.data.packed_sequence import PackedSequenceSpecs\nfrom nemo.collections.llm.peft import DoRA, LoRA\nfrom nemo.collections.llm.recipes.log.default import tensorboard_logger\nfrom nemo.collections.llm.recipes.optim.adam import distributed_fused_adam_with_cosine_annealing\nfrom nemo.collections.llm.recipes.precision.mixed_precision import bf16_mixed\nfrom nemo.lightning.pytorch.callbacks import PEFT\nfrom nemo.utils.exp_manager import TimingCallback\n\nif TYPE_CHECKING:\n from lightning.pytorch.loggers import TensorBoardLogger, WandbLogger\n\nTokenizerType = Any\n\n\ndef default_finetune_recipe(\n model: run.Config[pl.LightningModule],\n resume_path: str,\n dir: Optional[str] = None,\n name: str = ""default"",\n num_nodes: int = 1,\n num_gpus_per_node: int = 8,\n packed_sequence: bool = False, # once packing recipe is well tested, change this default to true\n tokenizer: Optional[TokenizerType] = ""model"",\n) -> run.Partial:\n """"""\n Create a default fine-tuning recipe for any model.\n\n This function sets up a template for a complete configuration for fine-tuning, including\n 
model, trainer, data, logging, optimization, and resumption settings.\n\n Args:\n model (run.Config[pl.LightningModule]): Configuration for a NeMo model.\n resume_path (str): Path to the Huggingface model or pretrained distributed checkpoint for resume\n dir (Optional[str]): Directory for saving logs and checkpoints.\n name (str): Name of the fine-tuning run.\n num_nodes (int): Number of compute nodes to use.\n num_gpus_per_node (int): Number of GPUs per node.\n packed_sequence (bool): Whether to use packed sequence.\n tokenizer (Optional[TokenizerType]): Tokenizer setting to be applied. Can be 'data' or 'model'\n or an instance of TokenizerSpec.\n\n Returns:\n run.Partial: Partial configuration for fine-tuning.\n\n See usages of this recipe for further details.\n """"""\n if packed_sequence:\n datamodule = run.Config(\n llm.SquadDataModule,\n seq_length=2048,\n global_batch_size=8,\n micro_batch_size=1,\n packed_sequence_specs=PackedSequenceSpecs(packed_sequence_size=2048),\n )\n else:\n datamodule = run.Config(llm.SquadDataModule, seq_length=2048, global_batch_size=128, micro_batch_size=1)\n recipe = run.Partial(\n llm.finetune,\n model=model,\n trainer=default_finetune_trainer(\n num_nodes=num_nodes,\n num_gpus_per_node=num_gpus_per_node,\n ),\n data=datamodule,\n log=default_finetune_log(dir=dir, name=name, tensorboard_logger=tensorboard_logger(name=name)),\n optim=distributed_fused_adam_with_cosine_annealing(max_lr=1e-4, min_lr=0, warmup_steps=50, adam_beta2=0.98),\n resume=nemo_resume(resume_path),\n tokenizer=tokenizer,\n )\n\n return recipe\n\n\ndef default_finetune_trainer(\n tensor_parallelism=1,\n pipeline_parallelism=1,\n pipeline_parallelism_type=torch.bfloat16,\n virtual_pipeline_parallelism=None,\n context_parallelism=1,\n sequence_parallelism=False,\n num_nodes=1,\n num_gpus_per_node=8,\n max_steps=1000,\n limit_test_batches=None,\n limit_val_batches=None,\n val_check_interval=30,\n):\n """"""\n Create a default fine-tuning trainer for any 
model.\n\n This function sets up a template for strategy and trainer.\n\n Args:\n See docstrings of MegatronStrategy and Trainer.\n\n Returns:\n run.Config: Config for a finetuning trainer.\n\n See usages of this in recipes for further details.\n """"""\n strategy = run.Config(\n nl.MegatronStrategy,\n tensor_model_parallel_size=tensor_parallelism,\n pipeline_model_parallel_size=pipeline_parallelism,\n pipeline_dtype=pipeline_parallelism_type,\n virtual_pipeline_model_parallel_size=virtual_pipeline_parallelism,\n context_parallel_size=context_parallelism,\n sequence_parallel=sequence_parallelism,\n gradient_as_bucket_view=True,\n ckpt_load_strictness=""log_all"",\n )\n\n trainer = run.Config(\n nl.Trainer,\n accelerator=""gpu"",\n accumulate_grad_batches=1,\n devices=num_gpus_per_node,\n limit_test_batches=limit_test_batches,\n limit_val_batches=limit_val_batches,\n log_every_n_steps=1,\n max_steps=max_steps,\n num_nodes=num_nodes,\n plugins=bf16_mixed(),\n strategy=strategy,\n use_distributed_sampler=False,\n val_check_interval=val_check_interval,\n callbacks=[run.Config(TimingCallback)],\n )\n\n return trainer\n\n\ndef default_finetune_log(\n dir: Optional[str] = None,\n name: str = ""default"",\n tensorboard_logger: Optional[run.Config['TensorBoardLogger']] = None,\n wandb_logger: Optional[run.Config['WandbLogger']] = None,\n) -> run.Config[nl.NeMoLogger]:\n """"""\n Create a default fine-tuning logger for any model.\n\n This function sets up a template for ModelCheckpoint and NeMoLogger.\n\n Args:\n See docstrings of ModelCheckpoint and NeMoLogger.\n\n Returns:\n run.Config: Config for a finetuning NeMoLogger.\n\n See usages of this in recipes for further details.\n """"""\n\n ckpt = run.Config(\n nl.ModelCheckpoint,\n save_last=""link"",\n save_top_k=2,\n every_n_train_steps=50,\n filename=""{model_name}--{val_loss:.2f}-{step}-{consumed_samples}"",\n )\n\n return run.Config(\n nl.NeMoLogger,\n ckpt=ckpt,\n name=name,\n tensorboard=tensorboard_logger,\n 
wandb=wandb_logger,\n log_dir=dir,\n )\n\n\ndef nemo_resume(model_id: str) -> run.Config[nl.AutoResume]:\n """"""\n Configure automatic resumption from a NeMo checkpoint converted from Huggingface for\n https://huggingface.co/{model_id}.\n\n This NeMo checkpoint should be converted from Huggingface beforehand, using nemo.collections.llm.import_ckpt.\n When converting the checkpoint, the NeMo checkpoint will be saved in NEMO_HOME (set to ~/.cache/nemo by default).\n\n This function sets up the configuration to resume training from path nemo://{model_id}.\n This translates to the full path {NEMO_HOME}/models/{model_id}.\n\n Args:\n model_id (str): Path to the Huggingface model or pretrained distributed checkpoint for resume\n\n Returns:\n run.Config[nl.AutoResume]: Configuration for resuming from NeMo checkpoint.\n """"""\n return run.Config(\n nl.AutoResume,\n restore_config=run.Config(nl.RestoreConfig, path=f""nemo://{model_id}""),\n )\n\n\n@run.cli.factory(name='lora')\ndef lora() -> run.Config[PEFT]:\n """"""\n Factory function to create a LoRA configuration.\n\n Returns:\n run.Config[PEFT]: Configuration for the LoRA class.\n\n Examples:\n CLI usage:\n $ nemo llm finetune -f llama3_8b peft=lora\n\n Python API usage:\n >>> lora_config = lora()\n >>> print(lora_config)\n """"""\n return run.Config(LoRA)\n\n\n@run.cli.factory(name='dora')\ndef dora() -> run.Config[PEFT]:\n """"""\n Factory function to create a DoRA configuration.\n\n Returns:\n run.Config[PEFT]: Configuration for the DoRA class.\n\n Examples:\n CLI usage:\n $ nemo llm finetune -f llama3_8b peft=dora\n\n Python API usage:\n >>> dora_config = dora()\n >>> print(dora_config)\n """"""\n return run.Config(DoRA)\n",python,tab
+2,56,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"5:29:09 PM [info] Activating crowd-code\n5:29:09 PM [info] Recording started\n5:29:09 PM [info] Initializing git provider using file system watchers...\n5:29:09 PM [info] Git repository found\n5:29:09 PM [info] Git provider initialized successfully\n",Log,tab
+3,126,"extension-output-pdoom-org.crowd-code-#1-crowd-code",245,0,"5:29:09 PM [info] Initial git state: [object Object]\n",Log,content
+4,913,"nemo/collections/llm/recipes/finetune_default.py",0,0,"",python,tab
+5,1882,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",0,0,"# Copyright (c) 2025, NVIDIA CORPORATION. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the ""License"");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an ""AS IS"" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import Optional\n\nimport lightning.pytorch as pl\nimport nemo_run as run\nimport torch\nfrom nemo.collections.common.tokenizers.huggingface.auto_tokenizer import AutoTokenizer\n\nfrom nemo.collections.llm.api import finetune, pretrain\nfrom nemo.collections.llm.gpt.data.mock import MockDataModule\nfrom nemo.collections.llm.peft import PEFT_STR2CLS\nfrom nemo.collections.llm.recipes.finetune_default import default_finetune_recipe\nfrom nemo.collections.llm.recipes.log.default import default_log, default_resume, tensorboard_logger\nfrom nemo.collections.llm.recipes.optim.adam import distributed_fused_adam_with_cosine_annealing\nfrom nemo.collections.llm.recipes.qwen3 import qwen3_model, qwen3_trainer\nfrom nemo.utils.exp_manager import TimingCallback\n\nNAME = ""qwen3_30b_a3b""\n\n\n@run.cli.factory(name=NAME)\ndef model() -> run.Config[pl.LightningModule]:\n """"""\n Factory function to create a Qwen3 30B-A3B model configuration.\n This is a MoE (Mixture of Experts) model with 128 experts.\n\n Returns:\n run.Config[pl.LightningModule]: Configuration for the Qwen3 30B-A3B model.\n\n Examples:\n CLI usage:\n $ nemo llm pretrain model=qwen3_30b_a3b ...\n\n Python API usage:\n >>> model_config = model()\n >>> print(model_config)\n """"""\n return 
qwen3_model(version=NAME)\n\n\n@run.cli.factory(target=pretrain, name=NAME)\ndef pretrain_recipe(\n # General\n dir: Optional[str] = None,\n name: str = ""default"",\n # Trainer\n tensor_parallelism: int = 4, # Default for 30B-A3B model\n pipeline_parallelism: int = 2,\n pipeline_parallelism_type: Optional[torch.dtype] = None,\n virtual_pipeline_parallelism: Optional[int] = None,\n context_parallelism: int = 1,\n expert_parallelism: Optional[int] = 4,\n sequence_parallelism: bool = True,\n num_nodes: int = 1,\n num_gpus_per_node: int = 8,\n max_steps: int = 300000,\n precision: str = ""bf16-mixed"",\n accumulate_grad_batches: int = 1,\n gradient_clip_val: float = 1.0,\n limit_test_batches: int = 32,\n limit_val_batches: int = 32,\n log_every_n_steps: int = 10,\n val_check_interval: int = 500,\n # Data\n global_batch_size=32,\n micro_batch_size=2,\n seq_length=4096,\n # Optimizer\n warmup_steps=500,\n constant_steps=0,\n min_lr=3e-5,\n max_lr=3e-4,\n # Training function\n fn=pretrain,\n) -> run.Partial:\n """"""\n Create a pre-training recipe for Qwen3 30B-A3B model.\n\n This function sets up a complete configuration for pre-training, including\n model, trainer, data, logging, optimization, and resumption settings.\n This model uses Mixture of Experts (MoE) architecture with 128 experts.\n\n Args:\n dir (Optional[str]): Directory for saving logs and checkpoints.\n name (str): Name of the pre-training run.\n tensor_parallelism (int): Degree of tensor model parallelism.\n pipeline_parallelism (int): Degree of pipeline model parallelism.\n pipeline_parallelism_type (Optional[torch.dtype]): Data type for pipeline parallelism.\n virtual_pipeline_parallelism (Optional[int]): Size of virtual pipeline parallelism.\n context_parallelism (int): Degree of context parallelism.\n sequence_parallelism (bool): Whether to use sequence parallelism.\n num_nodes (int): Number of compute nodes to use.\n num_gpus_per_node (int): Number of GPUs per node.\n max_steps (int): Maximum number 
of training steps.\n precision (str): Precision configuration, one of fp32, 16-mixed or bf16-mixed.\n accumulate_grad_batches (int): Number of steps per gradient accumulation.\n gradient_clip_val (float): Value for gradient clipping.\n limit_test_batches (int): Limit the number of test batches.\n limit_val_batches (int): Limit the number of validation batches.\n log_every_n_steps (int): Log every n steps.\n val_check_interval (int): Run validation every N steps.\n global_batch_size (int): Global batch size.\n micro_batch_size (int): Micro batch size.\n seq_length (int): Sequence length.\n warmup_steps (int): Number of warmup steps.\n constant_steps (int): Number of constant steps.\n min_lr (float): Minimum learning rate.\n max_lr (float): Maximum learning rate.\n fn (Callable): The pre-training function to use.\n\n Returns:\n run.Partial: Partial configuration for pre-training.\n\n Examples:\n CLI usage:\n $ nemo llm pretrain --factory qwen3_30b_a3b\n $ nemo llm pretrain --factory ""qwen3_30b_a3b(num_nodes=1, name='my_qwen3_pretrain')""\n\n Python API usage:\n >>> recipe = pretrain_recipe(name=""qwen3_pretrain"", num_nodes=1)\n >>> print(recipe)\n\n Note:\n This recipe uses a mock dataset, look for the finetune examples to see how to change the dataset.\n """"""\n recipe = run.Partial(\n fn,\n model=model(),\n trainer=qwen3_trainer(\n tensor_parallelism=tensor_parallelism,\n pipeline_parallelism=pipeline_parallelism,\n pipeline_parallelism_type=pipeline_parallelism_type,\n virtual_pipeline_parallelism=virtual_pipeline_parallelism,\n context_parallelism=context_parallelism,\n sequence_parallelism=sequence_parallelism,\n expert_parallelism=expert_parallelism,\n num_nodes=num_nodes,\n num_gpus_per_node=num_gpus_per_node,\n max_steps=max_steps,\n precision=precision,\n accumulate_grad_batches=accumulate_grad_batches,\n limit_test_batches=limit_test_batches,\n limit_val_batches=limit_val_batches,\n log_every_n_steps=log_every_n_steps,\n 
val_check_interval=val_check_interval,\n callbacks=[run.Config(TimingCallback)],\n ),\n data=run.Config(\n MockDataModule,\n seq_length=seq_length,\n global_batch_size=global_batch_size,\n micro_batch_size=micro_batch_size,\n tokenizer=run.Config(AutoTokenizer, ""Qwen/Qwen3-30B-A3B""),\n ),\n log=default_log(dir=dir, name=name, tensorboard_logger=tensorboard_logger(name=name)),\n optim=distributed_fused_adam_with_cosine_annealing(\n precision=precision,\n warmup_steps=warmup_steps,\n constant_steps=constant_steps,\n min_lr=min_lr,\n max_lr=max_lr,\n clip_grad=gradient_clip_val,\n ),\n resume=default_resume(),\n )\n recipe.model.config.recompute_granularity = ""full""\n recipe.model.config.recompute_method = ""uniform""\n recipe.model.config.recompute_num_layers = 1\n return recipe\n\n\n@run.cli.factory(target=finetune, name=NAME)\ndef finetune_recipe(\n dir: Optional[str] = None,\n name: str = ""default"",\n num_nodes: int = 1,\n num_gpus_per_node: int = 8,\n peft_scheme: Optional[str] = 'lora',\n packed_sequence: bool = False,\n) -> run.Partial:\n """"""\n Create a fine-tuning recipe for Qwen3 30B-A3B model.\n\n This function sets up a complete configuration for fine-tuning, including\n model, trainer, data, logging, optimization, and resumption settings.\n The recipe uses LoRA (Low-Rank Adaptation) for efficient fine-tuning, unless peft_scheme is set to None.\n This model uses Mixture of Experts (MoE) architecture with 128 experts.\n\n Args:\n dir (Optional[str]): Directory for saving logs and checkpoints.\n name (str): Name of the fine-tuning run.\n num_nodes (int): Number of compute nodes to use.\n num_gpus_per_node (int): Number of GPUs per node.\n peft_scheme (Optional[str]): Name of the peft scheme to use for fine-tuning.\n Allowed values: 'lora'/'dora'/'none'/None.\n packed_sequence (Optional[bool]): Packing multiple training sequences into one long sequence for training\n efficiency. 
Default sequence length is 2048.\n\n Returns:\n run.Partial: Partial configuration for fine-tuning.\n\n Examples:\n CLI usage:\n $ nemo llm finetune --factory qwen3_30b_a3b\n\n Python API usage:\n >>> recipe = finetune_recipe(name=""qwen3_30b_a3b_finetune"", num_nodes=2)\n >>> print(recipe)\n\n Note:\n This recipe uses the SQuAD dataset for fine-tuning.\n """"""\n recipe = default_finetune_recipe(\n model(), ""Qwen/Qwen3-30B-A3B"", dir, name, num_nodes, num_gpus_per_node, packed_sequence\n )\n if peft_scheme is None or peft_scheme.lower() == 'none':\n recipe.trainer.strategy.tensor_model_parallel_size = 4\n recipe.trainer.strategy.expert_model_parallel_size = 4\n recipe.trainer.strategy.expert_tensor_parallel_size = 1\n recipe.trainer.strategy.pipeline_model_parallel_size = 2\n recipe.trainer.strategy.sequence_parallel = True\n recipe.optim.config.lr = 5e-6\n elif peft_scheme.lower() in ['lora', 'dora']:\n recipe.trainer.strategy.tensor_model_parallel_size = 4\n recipe.trainer.strategy.expert_model_parallel_size = 4\n recipe.trainer.strategy.expert_tensor_parallel_size = 1\n recipe.trainer.strategy.sequence_parallel = True\n recipe.peft = run.Config(PEFT_STR2CLS[peft_scheme.lower()])\n recipe.peft.target_modules = ['linear_qkv', 'linear_proj']\n recipe.optim.config.lr = 1e-4\n else:\n raise ValueError(f""Unrecognized peft scheme: {peft_scheme}"")\n return recipe\n",python,tab
+6,3679,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2687,0,"",python,selection_keyboard
+7,3985,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",4959,0,"",python,selection_keyboard
+8,9297,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",10041,0,"",python,selection_command
+9,14593,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",10023,0,"",python,selection_command
+10,14839,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9954,0,"",python,selection_command
+11,14872,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9944,0,"",python,selection_command
+12,14905,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9906,0,"",python,selection_command
+13,14939,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9839,0,"",python,selection_command
+14,14971,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9771,0,"",python,selection_command
+15,15005,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9714,0,"",python,selection_command
+16,15039,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9650,0,"",python,selection_command
+17,15073,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9587,0,"",python,selection_command
+18,15107,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9524,0,"",python,selection_command
+19,15141,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9474,0,"",python,selection_command
+20,15404,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9436,0,"",python,selection_command
+21,1171917,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"",Log,tab
+22,1173550,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",0,0,"",python,tab
+23,1203601,"TERMINAL",0,0,"pkill tensorboard",,terminal_command
+24,1203625,"TERMINAL",0,0,"]633;C[1m[7m%[27m[1m[0m \r \r",,terminal_output
+25,3151652,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",0,0,"",python,selection_command
+26,3152504,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",882,0,"",python,selection_command
+27,3153203,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",903,0,"",python,selection_command
+28,3153907,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2762,0,"",python,selection_command
+29,3155002,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",3157,0,"",python,selection_command
+30,3156444,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",3623,0,"",python,selection_command
+31,3156777,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",5398,0,"",python,selection_command
+32,3157520,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",5463,0,"",python,selection_command
+33,3158018,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",6453,0,"",python,selection_command
+34,3158729,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",5463,0,"",python,selection_command
+35,3158811,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",5398,0,"",python,selection_command
+36,3164334,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",5463,0,"",python,selection_command
+37,3164480,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",6453,0,"",python,selection_command
+38,3164851,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",6486,0,"",python,selection_command
+39,3165163,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7741,0,"",python,selection_command
+40,3166858,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",8890,0,"",python,selection_command
+41,3169061,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",882,0,"",python,selection_command
+42,3169801,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",8890,0,"",python,selection_command
+43,3170889,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",8921,0,"",python,selection_command
+44,3171141,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",8958,0,"",python,selection_command
+45,3171175,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",8996,0,"",python,selection_command
+46,3171203,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9061,0,"",python,selection_command
+47,3171237,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9098,0,"",python,selection_command
+48,3171268,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9159,0,"",python,selection_command
+49,3171302,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9222,0,"",python,selection_command
+50,3171335,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9285,0,"",python,selection_command
+51,3171373,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9349,0,"",python,selection_command
+52,3171412,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9414,0,"",python,selection_command
+53,3171437,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9471,0,"",python,selection_command
+54,3171470,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9509,0,"",python,selection_command
+55,3171503,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9559,0,"",python,selection_command
+56,3171537,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9622,0,"",python,selection_command
+57,3171570,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9685,0,"",python,selection_command
+58,3171604,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9749,0,"",python,selection_command
+59,3172018,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9806,0,"",python,selection_command
+60,3173415,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9771,0,"",python,selection_command
+61,3178234,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9714,0,"",python,selection_command
+62,3178485,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9650,0,"",python,selection_command
+63,3178519,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9587,0,"",python,selection_command
+64,3178551,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9524,0,"",python,selection_command
+65,3178585,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9474,0,"",python,selection_command
+66,3178613,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9436,0,"",python,selection_command
+67,3178649,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9379,0,"",python,selection_command
+68,3178680,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9314,0,"",python,selection_command
+69,3178713,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9250,0,"",python,selection_command
+70,3178747,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9187,0,"",python,selection_command
+71,3178780,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9124,0,"",python,selection_command
+72,3178813,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9063,0,"",python,selection_command
+73,3179053,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9057,0,"",python,selection_command
+74,3179204,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",8961,0,"",python,selection_command
+75,3182460,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",8923,0,"",python,selection_command
+76,3182692,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",8927,0,"",python,selection_command
+77,3182845,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",8934,0,"",python,selection_command
+78,3182996,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",8936,0,"",python,selection_command
+79,3183446,"nemo/collections/llm/recipes/finetune_default.py",0,0,"",python,tab
+80,3183452,"nemo/collections/llm/recipes/finetune_default.py",1382,0,"",python,selection_command
+81,3185370,"nemo/collections/llm/recipes/finetune_default.py",1411,0,"",python,selection_command
+82,3185504,"nemo/collections/llm/recipes/finetune_default.py",1454,0,"",python,selection_command
+83,3185665,"nemo/collections/llm/recipes/finetune_default.py",1476,0,"",python,selection_command
+84,3185835,"nemo/collections/llm/recipes/finetune_default.py",1507,0,"",python,selection_command
+85,3186213,"nemo/collections/llm/recipes/finetune_default.py",1534,0,"",python,selection_command
+86,3186470,"nemo/collections/llm/recipes/finetune_default.py",1558,0,"",python,selection_command
+87,3186506,"nemo/collections/llm/recipes/finetune_default.py",1590,0,"",python,selection_command
+88,3186530,"nemo/collections/llm/recipes/finetune_default.py",1692,0,"",python,selection_command
+89,3186564,"nemo/collections/llm/recipes/finetune_default.py",1742,0,"",python,selection_command
+90,3187340,"nemo/collections/llm/recipes/finetune_default.py",3710,0,"",python,selection_keyboard
+91,3188060,"nemo/collections/llm/recipes/finetune_default.py",3709,0,"",python,selection_command
+92,3188310,"nemo/collections/llm/recipes/finetune_default.py",3691,0,"",python,selection_command
+93,3188340,"nemo/collections/llm/recipes/finetune_default.py",3690,0,"",python,selection_command
+94,3188372,"nemo/collections/llm/recipes/finetune_default.py",3684,0,"",python,selection_command
+95,3188405,"nemo/collections/llm/recipes/finetune_default.py",3655,0,"",python,selection_command
+96,3188438,"nemo/collections/llm/recipes/finetune_default.py",3614,0,"",python,selection_command
+97,3188471,"nemo/collections/llm/recipes/finetune_default.py",3497,0,"",python,selection_command
+98,3188505,"nemo/collections/llm/recipes/finetune_default.py",3393,0,"",python,selection_command
+99,3188538,"nemo/collections/llm/recipes/finetune_default.py",3368,0,"",python,selection_command
+100,3188571,"nemo/collections/llm/recipes/finetune_default.py",3357,0,"",python,selection_command
+101,3188604,"nemo/collections/llm/recipes/finetune_default.py",3308,0,"",python,selection_command
+102,3188637,"nemo/collections/llm/recipes/finetune_default.py",3275,0,"",python,selection_command
+103,3188671,"nemo/collections/llm/recipes/finetune_default.py",3233,0,"",python,selection_command
+104,3188703,"nemo/collections/llm/recipes/finetune_default.py",3212,0,"",python,selection_command
+105,3188737,"nemo/collections/llm/recipes/finetune_default.py",3190,0,"",python,selection_command
+106,3188770,"nemo/collections/llm/recipes/finetune_default.py",3164,0,"",python,selection_command
+107,3188803,"nemo/collections/llm/recipes/finetune_default.py",3051,0,"",python,selection_command
+108,3188837,"nemo/collections/llm/recipes/finetune_default.py",3041,0,"",python,selection_command
+109,3188870,"nemo/collections/llm/recipes/finetune_default.py",3031,0,"",python,selection_command
+110,3188904,"nemo/collections/llm/recipes/finetune_default.py",2949,0,"",python,selection_command
+111,3188937,"nemo/collections/llm/recipes/finetune_default.py",2917,0,"",python,selection_command
+112,3189079,"nemo/collections/llm/recipes/finetune_default.py",2884,0,"",python,selection_command
+113,3189248,"nemo/collections/llm/recipes/finetune_default.py",2855,0,"",python,selection_command
+114,3189429,"nemo/collections/llm/recipes/finetune_default.py",2822,0,"",python,selection_command
+115,3189570,"nemo/collections/llm/recipes/finetune_default.py",2789,0,"",python,selection_command
+116,3189907,"nemo/collections/llm/recipes/finetune_default.py",2765,0,"",python,selection_command
+117,3190312,"nemo/collections/llm/recipes/finetune_default.py",2789,0,"",python,selection_command
+118,3190660,"nemo/collections/llm/recipes/finetune_default.py",2797,0,"",python,selection_command
+119,3191864,"nemo/collections/llm/recipes/finetune_default.py",2830,0,"",python,selection_command
+120,3192120,"nemo/collections/llm/recipes/finetune_default.py",2834,0,"",python,selection_command
+121,3193182,"nemo/collections/llm/recipes/finetune_default.py",2801,0,"",python,selection_command
+122,3193476,"nemo/collections/llm/recipes/finetune_default.py",2777,0,"",python,selection_command
+123,3194453,"nemo/collections/llm/recipes/finetune_default.py",2801,0,"",python,selection_command
+124,3210780,"nemo/collections/llm/recipes/finetune_default.py",2789,32," datamodule = run.Config(",python,selection_command
+125,3436136,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",0,0,"",python,tab
+126,3436411,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",0,10041,"# Copyright (c) 2025, NVIDIA CORPORATION. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the ""License"");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an ""AS IS"" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import Optional\n\nimport lightning.pytorch as pl\nimport nemo_run as run\nimport torch\nfrom nemo.collections.common.tokenizers.huggingface.auto_tokenizer import AutoTokenizer\n\nfrom nemo.collections.llm.api import finetune, pretrain\nfrom nemo.collections.llm.gpt.data.mock import MockDataModule\nfrom nemo.collections.llm.peft import PEFT_STR2CLS\nfrom nemo.collections.llm.recipes.finetune_default import default_finetune_recipe\nfrom nemo.collections.llm.recipes.log.default import default_log, default_resume, tensorboard_logger\nfrom nemo.collections.llm.recipes.optim.adam import distributed_fused_adam_with_cosine_annealing\nfrom nemo.collections.llm.recipes.qwen3 import qwen3_model, qwen3_trainer\nfrom nemo.utils.exp_manager import TimingCallback\n\nNAME = ""qwen3_30b_a3b""\n\n\n@run.cli.factory(name=NAME)\ndef model() -> run.Config[pl.LightningModule]:\n    """"""\n    Factory function to create a Qwen3 30B-A3B model configuration.\n    This is a MoE (Mixture of Experts) model with 128 experts.\n\n    Returns:\n        run.Config[pl.LightningModule]: Configuration for the Qwen3 30B-A3B model.\n\n    Examples:\n        CLI usage:\n            $ nemo llm pretrain model=qwen3_30b_a3b ...\n\n        Python API usage:\n            >>> model_config = model()\n            >>> print(model_config)\n    """"""\n    return qwen3_model(version=NAME)\n\n\n@run.cli.factory(target=pretrain, name=NAME)\ndef pretrain_recipe(\n    # General\n    dir: Optional[str] = None,\n    name: str = ""default"",\n    # Trainer\n    tensor_parallelism: int = 4,  # Default for 30B-A3B model\n    pipeline_parallelism: int = 2,\n    pipeline_parallelism_type: Optional[torch.dtype] = None,\n    virtual_pipeline_parallelism: Optional[int] = None,\n    context_parallelism: int = 1,\n    expert_parallelism: Optional[int] = 4,\n    sequence_parallelism: bool = True,\n    num_nodes: int = 1,\n    num_gpus_per_node: int = 8,\n    max_steps: int = 300000,\n    precision: str = ""bf16-mixed"",\n    accumulate_grad_batches: int = 1,\n    gradient_clip_val: float = 1.0,\n    limit_test_batches: int = 32,\n    limit_val_batches: int = 32,\n    log_every_n_steps: int = 10,\n    val_check_interval: int = 500,\n    # Data\n    global_batch_size=32,\n    micro_batch_size=2,\n    seq_length=4096,\n    # Optimizer\n    warmup_steps=500,\n    constant_steps=0,\n    min_lr=3e-5,\n    max_lr=3e-4,\n    # Training function\n    fn=pretrain,\n) -> run.Partial:\n    """"""\n    Create a pre-training recipe for Qwen3 30B-A3B model.\n\n    This function sets up a complete configuration for pre-training, including\n    model, trainer, data, logging, optimization, and resumption settings.\n    This model uses Mixture of Experts (MoE) architecture with 128 experts.\n\n    Args:\n        dir (Optional[str]): Directory for saving logs and checkpoints.\n        name (str): Name of the pre-training run.\n        tensor_parallelism (int): Degree of tensor model parallelism.\n        pipeline_parallelism (int): Degree of pipeline model parallelism.\n        pipeline_parallelism_type (Optional[torch.dtype]): Data type for pipeline parallelism.\n        virtual_pipeline_parallelism (Optional[int]): Size of virtual pipeline parallelism.\n        context_parallelism (int): Degree of context parallelism.\n        sequence_parallelism (bool): Whether to use sequence parallelism.\n        num_nodes (int): Number of compute nodes to use.\n        num_gpus_per_node (int): Number of GPUs per node.\n        max_steps (int): Maximum number of training steps.\n        precision (str): Precision configuration, one of fp32, 16-mixed or bf16-mixed.\n        accumulate_grad_batches (int): Number of steps per gradient accumulation.\n        gradient_clip_val (float): Value for gradient clipping.\n        limit_test_batches (int): Limit the number of test batches.\n        limit_val_batches (int): Limit the number of validation batches.\n        log_every_n_steps (int): Log every n steps.\n        val_check_interval (int): Run validation every N steps.\n        global_batch_size (int): Global batch size.\n        micro_batch_size (int): Micro batch size.\n        seq_length (int): Sequence length.\n        warmup_steps (int): Number of warmup steps.\n        constant_steps (int): Number of constant steps.\n        min_lr (float): Minimum learning rate.\n        max_lr (float): Maximum learning rate.\n        fn (Callable): The pre-training function to use.\n\n    Returns:\n        run.Partial: Partial configuration for pre-training.\n\n    Examples:\n        CLI usage:\n            $ nemo llm pretrain --factory qwen3_30b_a3b\n            $ nemo llm pretrain --factory ""qwen3_30b_a3b(num_nodes=1, name='my_qwen3_pretrain')""\n\n        Python API usage:\n            >>> recipe = pretrain_recipe(name=""qwen3_pretrain"", num_nodes=1)\n            >>> print(recipe)\n\n    Note:\n        This recipe uses a mock dataset, look for the finetune examples to see how to change the dataset.\n    """"""\n    recipe = run.Partial(\n        fn,\n        model=model(),\n        trainer=qwen3_trainer(\n            tensor_parallelism=tensor_parallelism,\n            pipeline_parallelism=pipeline_parallelism,\n            pipeline_parallelism_type=pipeline_parallelism_type,\n            virtual_pipeline_parallelism=virtual_pipeline_parallelism,\n            context_parallelism=context_parallelism,\n            sequence_parallelism=sequence_parallelism,\n            expert_parallelism=expert_parallelism,\n            num_nodes=num_nodes,\n            num_gpus_per_node=num_gpus_per_node,\n            max_steps=max_steps,\n            precision=precision,\n            accumulate_grad_batches=accumulate_grad_batches,\n            limit_test_batches=limit_test_batches,\n            limit_val_batches=limit_val_batches,\n            log_every_n_steps=log_every_n_steps,\n            val_check_interval=val_check_interval,\n            callbacks=[run.Config(TimingCallback)],\n        ),\n        data=run.Config(\n            MockDataModule,\n            seq_length=seq_length,\n            global_batch_size=global_batch_size,\n            micro_batch_size=micro_batch_size,\n            tokenizer=run.Config(AutoTokenizer, ""Qwen/Qwen3-30B-A3B""),\n        ),\n        log=default_log(dir=dir, name=name, tensorboard_logger=tensorboard_logger(name=name)),\n        optim=distributed_fused_adam_with_cosine_annealing(\n            precision=precision,\n            warmup_steps=warmup_steps,\n            constant_steps=constant_steps,\n            min_lr=min_lr,\n            max_lr=max_lr,\n            clip_grad=gradient_clip_val,\n        ),\n        resume=default_resume(),\n    )\n    recipe.model.config.recompute_granularity = ""full""\n    recipe.model.config.recompute_method = ""uniform""\n    recipe.model.config.recompute_num_layers = 1\n    return recipe\n\n\n@run.cli.factory(target=finetune, name=NAME)\ndef finetune_recipe(\n    dir: Optional[str] = None,\n    name: str = ""default"",\n    num_nodes: int = 1,\n    num_gpus_per_node: int = 8,\n    peft_scheme: Optional[str] = 'lora',\n    packed_sequence: bool = False,\n) -> run.Partial:\n    """"""\n    Create a fine-tuning recipe for Qwen3 30B-A3B model.\n\n    This function sets up a complete configuration for fine-tuning, including\n    model, trainer, data, logging, optimization, and resumption settings.\n    The recipe uses LoRA (Low-Rank Adaptation) for efficient fine-tuning, unless peft_scheme is set to None.\n    This model uses Mixture of Experts (MoE) architecture with 128 experts.\n\n    Args:\n        dir (Optional[str]): Directory for saving logs and checkpoints.\n        name (str): Name of the fine-tuning run.\n        num_nodes (int): Number of compute nodes to use.\n        num_gpus_per_node (int): Number of GPUs per node.\n        peft_scheme (Optional[str]): Name of the peft scheme to use for fine-tuning.\n            Allowed values: 'lora'/'dora'/'none'/None.\n        packed_sequence (Optional[bool]): Packing multiple training sequences into one long sequence for training\n            efficiency. Default sequence length is 2048.\n\n    Returns:\n        run.Partial: Partial configuration for fine-tuning.\n\n    Examples:\n        CLI usage:\n            $ nemo llm finetune --factory qwen3_30b_a3b\n\n        Python API usage:\n            >>> recipe = finetune_recipe(name=""qwen3_30b_a3b_finetune"", num_nodes=2)\n            >>> print(recipe)\n\n    Note:\n        This recipe uses the SQuAD dataset for fine-tuning.\n    """"""\n    recipe = default_finetune_recipe(\n        model(), ""Qwen/Qwen3-30B-A3B"", dir, name, num_nodes, num_gpus_per_node, packed_sequence\n    )\n    if peft_scheme is None or peft_scheme.lower() == 'none':\n        recipe.trainer.strategy.tensor_model_parallel_size = 4\n        recipe.trainer.strategy.expert_model_parallel_size = 4\n        recipe.trainer.strategy.expert_tensor_parallel_size = 1\n        recipe.trainer.strategy.pipeline_model_parallel_size = 2\n        recipe.trainer.strategy.sequence_parallel = True\n        recipe.optim.config.lr = 5e-6\n    elif peft_scheme.lower() in ['lora', 'dora']:\n        recipe.trainer.strategy.tensor_model_parallel_size = 4\n        recipe.trainer.strategy.expert_model_parallel_size = 4\n        recipe.trainer.strategy.expert_tensor_parallel_size = 1\n        recipe.trainer.strategy.sequence_parallel = True\n        recipe.peft = run.Config(PEFT_STR2CLS[peft_scheme.lower()])\n        recipe.peft.target_modules = ['linear_qkv', 'linear_proj']\n        recipe.optim.config.lr = 1e-4\n    else:\n        raise ValueError(f""Unrecognized peft scheme: {peft_scheme}"")\n    return recipe\n",python,selection_command
+127,3436648,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",10041,0,"",python,selection_command
+128,3901672,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",0,0,"",python,selection_command
+129,3903628,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",818,0,"",python,selection_command
+130,3905550,"nemo/collections/llm/__init__.py",0,0,"# Copyright (c) 2025, NVIDIA CORPORATION. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the ""License"");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an ""AS IS"" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n# This is here to import it once, which improves the speed of launch when in debug-mode\nfrom nemo.utils.import_utils import safe_import\n\nsafe_import(""transformer_engine"")\n\nfrom nemo.collections.llm import peft\nfrom nemo.collections.llm.bert.data import BERTMockDataModule, BERTPreTrainingDataModule, SpecterDataModule\nfrom nemo.collections.llm.bert.model import (\n    BertConfig,\n    BertEmbeddingLargeConfig,\n    BertEmbeddingMiniConfig,\n    BertEmbeddingModel,\n    BertModel,\n    HuggingFaceBertBaseConfig,\n    HuggingFaceBertConfig,\n    HuggingFaceBertLargeConfig,\n    HuggingFaceBertModel,\n    MegatronBertBaseConfig,\n    MegatronBertConfig,\n    MegatronBertLargeConfig,\n)\nfrom nemo.collections.llm.gpt.data import (  # noqa: F401\n    AlpacaDataModule,\n    ChatDataModule,\n    CustomReRankerDataModule,\n    CustomRetrievalDataModule,\n    DollyDataModule,\n    FineTuningDataModule,\n    HFDatasetDataModule,\n    HFDatasetDataModulePacked,\n    HFMockDataModule,\n    MockDataModule,\n    PreTrainingDataModule,\n    SpecterReRankerDataModule,\n    SquadDataModule,\n)\nfrom nemo.collections.llm.gpt.data.api import dolly, hf_dataset, mock, squad\nfrom nemo.collections.llm.gpt.model import (  # noqa: F401\n    Baichuan2Config,\n    Baichuan2Config7B,\n    Baichuan2Model,\n    BaseMambaConfig1_3B,\n    BaseMambaConfig2_7B,\n    BaseMambaConfig130M,\n    BaseMambaConfig370M,\n    BaseMambaConfig780M,\n    ChatGLM2Config6B,\n    ChatGLM3Config6B,\n    ChatGLMConfig,\n    ChatGLMModel,\n    CodeGemmaConfig2B,\n    CodeGemmaConfig7B,\n    CodeLlamaConfig7B,\n    CodeLlamaConfig13B,\n    CodeLlamaConfig34B,\n    CodeLlamaConfig70B,\n    DeepSeekModel,\n    DeepSeekV2Config,\n    DeepSeekV2LiteConfig,\n    DeepSeekV3Config,\n    Gemma2Config,\n    Gemma2Config2B,\n    Gemma2Config9B,\n    Gemma2Config27B,\n    Gemma2Model,\n    Gemma3Config1B,\n    Gemma3Config4B,\n    Gemma3Config12B,\n    Gemma3Config27B,\n    Gemma3Model,\n    GemmaConfig,\n    GemmaConfig2B,\n    GemmaConfig7B,\n    GemmaModel,\n    GPTConfig,\n    GPTConfig5B,\n    GPTConfig7B,\n    GPTConfig20B,\n    GPTConfig40B,\n    GPTConfig126M,\n    GPTConfig175B,\n    GPTModel,\n    GPTOSSConfig,\n    GPTOSSConfig20B,\n    GPTOSSConfig120B,\n    GPTOSSModel,\n    HFAutoModelForCausalLM,\n    Hyena1bConfig,\n    Hyena7bARCLongContextConfig,\n    Hyena7bConfig,\n    Hyena40bARCLongContextConfig,\n    Hyena40bConfig,\n    HyenaConfig,\n    HyenaModel,\n    HyenaNV1bConfig,\n    HyenaNV7bConfig,\n    HyenaNV40bConfig,\n    HyenaNVTestConfig,\n    HyenaTestConfig,\n    Llama2Config7B,\n    Llama2Config13B,\n    Llama2Config70B,\n    Llama3Config8B,\n    Llama3Config70B,\n    Llama4Config,\n    Llama4Experts16Config,\n    Llama4Experts128Config,\n    Llama31Config8B,\n    Llama31Config70B,\n    Llama31Config405B,\n    Llama31Nemotron70BConfig,\n    Llama31NemotronNano8BConfig,\n    Llama31NemotronUltra253BConfig,\n    Llama32Config1B,\n    Llama32Config3B,\n    Llama32EmbeddingConfig1B,\n    Llama32EmbeddingConfig3B,\n    Llama32Reranker1BConfig,\n    Llama32Reranker500MConfig,\n    Llama33NemotronSuper49BConfig,\n    LlamaConfig,\n    LlamaEmbeddingModel,\n    LlamaModel,\n    LlamaNemotronModel,\n    MambaModel,\n    MaskedTokenLossReduction,\n    MistralConfig7B,\n    MistralModel,\n    MistralNeMoConfig12B,\n    MistralSmall3Config24B,\n    MixtralConfig,\n    MixtralConfig8x3B,\n    MixtralConfig8x7B,\n    MixtralConfig8x22B,\n    MixtralModel,\n    Nemotron3Config4B,\n    Nemotron3Config8B,\n    Nemotron3Config22B,\n    Nemotron4Config15B,\n    Nemotron4Config340B,\n    NemotronConfig,\n    NemotronHConfig4B,\n    NemotronHConfig8B,\n    NemotronHConfig47B,\n    NemotronHConfig56B,\n    NemotronModel,\n    NemotronNano9Bv2,\n    NemotronNano12Bv2,\n    NVIDIAMambaConfig8B,\n    NVIDIAMambaHybridConfig8B,\n    Phi3Config,\n    Phi3ConfigMini,\n    Phi3Model,\n    Qwen2Config,\n    Qwen2Config1P5B,\n    Qwen2Config7B,\n    Qwen2Config72B,\n    Qwen2Config500M,\n    Qwen2Model,\n    Qwen3Config,\n    Qwen3Config1P7B,\n    Qwen3Config4B,\n    Qwen3Config8B,\n    Qwen3Config14B,\n    Qwen3Config30B_A3B,\n    Qwen3Config32B,\n    Qwen3Config235B_A22B,\n    Qwen3Config600M,\n    Qwen3Model,\n    Qwen25Config1P5B,\n    Qwen25Config3B,\n    Qwen25Config7B,\n    Qwen25Config14B,\n    Qwen25Config32B,\n    Qwen25Config72B,\n    Qwen25Config500M,\n    ReRankerModel,\n    SSMConfig,\n    Starcoder2Config,\n    Starcoder2Config3B,\n    Starcoder2Config7B,\n    Starcoder2Config15B,\n    Starcoder2Model,\n    StarcoderConfig,\n    StarcoderConfig15B,\n    StarcoderModel,\n    gpt_data_step,\n    gpt_forward_step,\n)\nfrom nemo.collections.llm.t5.data import FineTuningDataModule as T5FineTuningDataModule\nfrom nemo.collections.llm.t5.data import MockDataModule as T5MockDataModule\nfrom nemo.collections.llm.t5.data import PreTrainingDataModule as T5PreTrainingDataModule\nfrom nemo.collections.llm.t5.data import SquadDataModule as T5SquadDataModule\nfrom nemo.collections.llm.t5.model import (\n    T5Config,\n    T5Config3B,\n    T5Config11B,\n    T5Config220M,\n    T5Model,\n    t5_data_step,\n    t5_forward_step,\n)\n\n__all__ = [\n    ""MockDataModule"",\n    ""T5MockDataModule"",\n    ""CustomRetrievalDataModule"",\n    ""CustomReRankerDataModule"",\n    ""SpecterReRankerDataModule"",\n    ""GPTModel"",\n    ""GPTConfig"",\n    ""HyenaTestConfig"",\n    ""Hyena7bConfig"",\n    ""Hyena40bConfig"",\n    ""Hyena7bARCLongContextConfig"",\n    ""Hyena40bARCLongContextConfig"",\n    ""HyenaNVTestConfig"",\n    ""HyenaNV40bConfig"",\n    ""HyenaNV7bConfig"",\n    ""HyenaConfig"",\n    ""HyenaModel"",\n    ""Hyena1bConfig"",\n    ""HyenaNV1bConfig"",\n    ""gpt_data_step"",\n    ""gpt_forward_step"",\n    ""T5Model"",\n    ""T5Config"",\n    ""T5Config220M"",\n    ""T5Config3B"",\n    ""T5Config11B"",\n    ""BertConfig"",\n    ""BertEmbeddingModel"",\n    ""BertModel"",\n    ""BertEmbeddingLargeConfig"",\n    ""BertEmbeddingMiniConfig"",\n    ""t5_data_step"",\n    ""t5_forward_step"",\n    ""MaskedTokenLossReduction"",\n    ""MistralConfig7B"",\n    ""MistralNeMoConfig12B"",\n    ""MistralSmall3Config24B"",\n    ""MistralModel"",\n    ""MixtralConfig"",\n    ""MixtralConfig8x3B"",\n    ""MixtralConfig8x7B"",\n    ""MixtralConfig8x22B"",\n    ""MixtralModel"",\n    ""Starcoder2Config15B"",\n    ""Starcoder2Config"",\n    ""Starcoder2Model"",\n    ""NemotronModel"",\n    ""Nemotron3Config4B"",\n    ""Nemotron3Config8B"",\n    ""Nemotron3Config22B"",\n    ""Nemotron4Config15B"",\n    ""Nemotron4Config340B"",\n    ""NemotronConfig"",\n    ""LlamaEmbeddingModel"",\n    ""Llama32EmbeddingConfig1B"",\n    ""Llama32EmbeddingConfig3B"",\n    ""Phi3Config"",\n    ""Phi3ConfigMini"",\n    ""Phi3Model"",\n    ""SSMConfig"",\n    ""BaseMambaConfig130M"",\n    ""BaseMambaConfig370M"",\n    ""BaseMambaConfig780M"",\n    ""BaseMambaConfig1_3B"",\n    ""BaseMambaConfig2_7B"",\n    ""NVIDIAMambaConfig8B"",\n    ""NVIDIAMambaHybridConfig8B"",\n    ""NemotronHConfig4B"",\n    ""NemotronHConfig8B"",\n    ""NemotronHConfig47B"",\n    ""NemotronHConfig56B"",\n    ""NemotronNano9Bv2"",\n    ""NemotronNano12Bv2"",\n    ""MambaModel"",\n    ""LlamaConfig"",\n    ""Llama2Config7B"",\n    ""Llama2Config13B"",\n    ""Llama2Config70B"",\n    ""Llama3Config8B"",\n    ""Llama3Config70B"",\n    ""Llama31Config8B"",\n    ""Llama31Config70B"",\n    ""Llama31Config405B"",\n    ""Llama32Config1B"",\n    ""Llama32Config3B"",\n    ""Llama4Experts16Config"",\n    ""Llama4Experts128Config"",\n    ""Llama4Config"",\n    ""Llama31NemotronNano8BConfig"",\n    ""Llama31Nemotron70BConfig"",\n    ""Llama33NemotronSuper49BConfig"",\n    ""Llama31NemotronUltra253BConfig"",\n    ""Llama32Reranker500MConfig"",\n    ""Llama32Reranker1BConfig"",\n    ""CodeLlamaConfig7B"",\n    ""CodeLlamaConfig13B"",\n    ""CodeLlamaConfig34B"",\n    ""CodeLlamaConfig70B"",\n    ""LlamaModel"",\n    ""LlamaNemotronModel"",\n    ""GPTOSSConfig"",\n    ""GPTOSSConfig120B"",\n    ""GPTOSSConfig20B"",\n    ""GPTOSSModel"",\n    ""GemmaConfig"",\n    ""GemmaConfig2B"",\n    ""GemmaConfig7B"",\n    ""CodeGemmaConfig2B"",\n    ""CodeGemmaConfig7B"",\n    ""GemmaModel"",\n    ""Gemma2Model"",\n    ""Gemma2Config9B"",\n    ""Gemma2Config"",\n    ""Gemma2Config27B"",\n    ""Gemma2Config2B"",\n    ""Gemma3Model"",\n    ""Gemma3Config1B"",\n    ""Gemma3Config4B"",\n    ""Gemma3Config12B"",\n    ""Gemma3Config27B"",\n    ""Baichuan2Config"",\n    ""Baichuan2Config7B"",\n    ""Baichuan2Model"",\n    ""ChatGLMConfig"",\n    ""ChatGLM2Config6B"",\n    ""ChatGLM3Config6B"",\n    ""ChatGLMModel"",\n    ""Qwen2Model"",\n    ""Qwen2Config7B"",\n    ""Qwen2Config"",\n    ""Qwen2Config500M"",\n    ""Qwen2Config1P5B"",\n    ""Qwen25Config3B"",\n    ""Qwen2Config72B"",\n    ""Qwen25Config500M"",\n    ""Qwen25Config1P5B"",\n    ""Qwen25Config7B"",\n    ""Qwen25Config14B"",\n    ""Qwen25Config32B"",\n    ""Qwen25Config72B"",\n    ""Qwen3Config"",\n    ""Qwen3Config600M"",\n    ""Qwen3Config1P7B"",\n    ""Qwen3Config4B"",\n    ""Qwen3Config8B"",\n    ""Qwen3Config14B"",\n    ""Qwen3Config32B"",\n    ""Qwen3Config30B_A3B"",\n    ""Qwen3Config235B_A22B"",\n    ""Qwen3Model"",\n    ""PreTrainingDataModule"",\n    ""FineTuningDataModule"",\n    ""ChatDataModule"",\n    ""SquadDataModule"",\n    ""T5PreTrainingDataModule"",\n    ""T5FineTuningDataModule"",\n    ""T5SquadDataModule"",\n    ""T5MockDataModule"",\n    ""DeepSeekModel"",\n    ""DeepSeekV2Config"",\n    ""DeepSeekV2LiteConfig"",\n    ""DeepSeekV3Config"",\n    ""HuggingFaceBertBaseConfig"",\n    ""HuggingFaceBertConfig"",\n    ""HuggingFaceBertLargeConfig"",\n    ""HuggingFaceBertModel"",\n    ""MegatronBertBaseConfig"",\n    ""MegatronBertConfig"",\n    ""MegatronBertLargeConfig"",\n    ""BERTMockDataModule"",\n    ""BERTPreTrainingDataModule"",\n    ""SpecterDataModule"",\n    ""DollyDataModule"",\n    ""tokenizer"",\n    ""mock"",\n    ""squad"",\n    ""dolly"",\n    ""peft"",\n    ""hf_dataset"",\n    ""HFAutoModelForCausalLM"",\n    ""HFMockDataModule"",\n]\n\n\nfrom nemo.utils import logging\n\ntry:\n    import nemo_run as run  # noqa: F401\n\n    from nemo.collections.llm.api import (  # noqa: F401\n        distill,\n        export_ckpt,\n        finetune,\n        generate,\n        import_ckpt,\n        pretrain,\n        prune,\n        ptq,\n        train,\n        validate,\n    )\n    from nemo.collections.llm.recipes import *  # noqa\n\n    __all__.extend(\n        [\n            ""train"",\n            ""import_ckpt"",\n            ""export_ckpt"",\n            ""pretrain"",\n            ""validate"",\n            ""finetune"",\n            ""generate"",\n            ""prune"",\n            ""ptq"",\n            ""distill"",\n        ]\n    )\nexcept ImportError as error:\n    logging.warning(f""Failed to import nemo.collections.llm.[api,recipes]: {error}"")\n\ntry:\n    from nemo.collections.llm.api import deploy  # noqa: F401\n\n    __all__.append(""deploy"")\nexcept ImportError as error:\n    logging.warning(f""The deploy module could not be imported: {error}"")\n\ntry:\n    from nemo.collections.llm.api import evaluate  # noqa: F401\n\n    __all__.append(""evaluate"")\nexcept ImportError as error:\n    logging.warning(f""The evaluate module could not be imported: {error}"")\n",python,tab
+131,3910853,"nemo/collections/llm/__init__.py",1372,0,"",python,selection_command
+132,3911438,"nemo/collections/llm/gpt/data/chat.py",0,0,"# Copyright (c) 2025, NVIDIA CORPORATION. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the ""License"");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an ""AS IS"" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom functools import lru_cache\nfrom pathlib import Path\nfrom typing import TYPE_CHECKING, Any, Dict, List, Optional, Union\n\nfrom nemo.collections.llm.gpt.data.core import create_sft_dataset\nfrom nemo.collections.llm.gpt.data.fine_tuning import FineTuningDataModule\n\nif TYPE_CHECKING:\n    from nemo.collections.common.tokenizers import TokenizerSpec\n    from nemo.collections.llm.gpt.data.packed_sequence import PackedSequenceSpecs\n\n\nclass ChatDataModule(FineTuningDataModule):\n    """"""\n    Base class for fine-tuning an LLM on chat datasets.\n    This class calls `GPTSFTChatDataset` for chat template processing\n\n    See base class `FineTuningDataModule` for more details.\n    """"""\n\n    def __init__(\n        self,\n        dataset_root: Union[str, Path],\n        seq_length: int = 2048,\n        tokenizer: Optional[""TokenizerSpec""] = None,\n        micro_batch_size: int = 4,\n        global_batch_size: int = 8,\n        rampup_batch_size: Optional[List[int]] = None,\n        seed: int = 1234,\n        memmap_workers: int = 1,\n        num_workers: int = 8,\n        pin_memory: bool = True,\n        persistent_workers: bool = False,\n        packed_sequence_specs: Optional[""PackedSequenceSpecs""] = None,\n        dataset_kwargs: Optional[Dict[str, Any]] = None,\n        use_hf_tokenizer_chat_template: bool = False,\n    ):\n        """"""Data module for finetuning on chat datasets.\n        See base class `FineTuningDataModule` for more details of the arguments.\n\n        Args:\n            use_hf_tokenizer_chat_template: Whether to use the chat template from the HuggingFace tokenizer. If True,\n                uses the tokenizer's built-in chat template. If False, uses default chat template from\n                GPTSFTChatDataset. Defaults to False.\n        """"""\n        super().__init__(\n            dataset_root,\n            seq_length,\n            tokenizer,\n            micro_batch_size,\n            global_batch_size,\n            rampup_batch_size,\n            seed,\n            memmap_workers,\n            num_workers,\n            pin_memory,\n            persistent_workers,\n            packed_sequence_specs,\n            dataset_kwargs,\n        )\n        self.use_hf_tokenizer_chat_template = use_hf_tokenizer_chat_template\n\n    @lru_cache\n    def _create_dataset(self, path, pack_metadata_path=None, is_test=False, **kwargs):\n        # pylint: disable=C0115,C0116\n        return create_sft_dataset(\n            path,\n            tokenizer=self.tokenizer,\n            seq_length=(self.seq_length if is_test or self.packed_sequence_size <= 0 else self.packed_sequence_size),\n            memmap_workers=self.memmap_workers,\n            seed=self.seed,\n            chat=True,\n            is_test=is_test,\n            pack_metadata_file_path=None,  # packing is not supported\n            pad_cu_seqlens=False,\n            use_hf_tokenizer_chat_template=self.use_hf_tokenizer_chat_template,\n            **kwargs,\n        )\n",python,tab
+133,3911440,"nemo/collections/llm/gpt/data/chat.py",1051,0,"",python,selection_command
+134,3922774,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",0,0,"",python,tab
+135,3925020,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2832,0,"",python,selection_command
+136,3925651,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2832,1,"4",python,selection_command
+137,3925809,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2832,4,"4096",python,selection_command
+138,3926234,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2835,0,"",python,selection_command
+139,3934172,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2811,0,"",python,selection_command
+140,3934321,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2785,0,"",python,selection_command
+141,3934605,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2788,0,"",python,selection_command
+142,3934779,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2789,0,"",python,selection_command
+143,3935100,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2789,1,"3",python,selection_command
+144,3935692,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2789,2,"32",python,selection_command
+145,3936365,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2790,0,"",python,selection_command
+146,57542980,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9147,0,"",python,selection_command
+147,57546405,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",0,0,"",python,selection_command
+148,57548252,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",64,0,"",python,selection_command
+149,57548499,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",66,0,"",python,selection_command
+150,57548532,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",132,0,"",python,selection_command
+151,57548565,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",199,0,"",python,selection_command
+152,57548599,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",241,0,"",python,selection_command
+153,57548634,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",243,0,"",python,selection_command
+154,57548671,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",292,0,"",python,selection_command
+155,57548706,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",294,0,"",python,selection_command
+156,57548741,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",364,0,"",python,selection_command
+157,57548771,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",432,0,"",python,selection_command
+158,57548801,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",507,0,"",python,selection_command
+159,57548835,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",577,0,"",python,selection_command
+160,57548868,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",610,0,"",python,selection_command
+161,57548906,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",611,0,"",python,selection_command
+162,57548941,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",639,0,"",python,selection_command
+163,57548975,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",640,0,"",python,selection_command
+164,57549009,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",671,0,"",python,selection_command
+165,57549042,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",694,0,"",python,selection_command
+166,57549077,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",707,0,"",python,selection_command
+167,57549111,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",795,0,"",python,selection_command
+168,57549145,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",796,0,"",python,selection_command
+169,57552710,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2311,0,"",python,selection_keyboard
+170,57553078,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",3948,0,"",python,selection_keyboard
+171,57553619,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2311,0,"",python,selection_keyboard
+172,57554018,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",796,0,"",python,selection_keyboard
+173,57556417,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",2311,0,"",python,selection_keyboard
+174,57556648,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",3948,0,"",python,selection_keyboard
+175,57557003,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",5867,0,"",python,selection_keyboard
+176,57557191,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7451,0,"",python,selection_keyboard
+177,57557353,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",9250,0,"",python,selection_keyboard
+178,57557544,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",10041,0,"",python,selection_keyboard
+179,57558605,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",10023,0,"",python,selection_command
+180,57558764,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",10027,0,"",python,selection_command
+181,57558924,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",10034,0,"",python,selection_command
+182,57559403,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",8074,0,"",python,selection_keyboard
+183,57559709,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",6481,0,"",python,selection_keyboard
+184,57560640,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",6509,0,"",python,selection_command
+185,57560885,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",6544,0,"",python,selection_command
+186,57560915,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",6593,0,"",python,selection_command
+187,57560953,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",6640,0,"",python,selection_command
+188,57560985,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",6709,0,"",python,selection_command
+189,57561017,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",6722,0,"",python,selection_command
+190,57561056,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",6817,0,"",python,selection_command
+191,57561090,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",6877,0,"",python,selection_command
+192,57561120,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",6910,0,"",python,selection_command
+193,57561150,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",6949,0,"",python,selection_command
+194,57561184,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",6992,0,"",python,selection_command
+195,57561217,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7019,0,"",python,selection_command
+196,57561250,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7046,0,"",python,selection_command
+197,57561283,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7085,0,"",python,selection_command
+198,57561317,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7098,0,"",python,selection_command
+199,57561350,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7124,0,"",python,selection_command
+200,57561384,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7137,0,"",python,selection_command
+201,57561426,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7192,0,"",python,selection_command
+202,57561457,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7245,0,"",python,selection_command
+203,57561488,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7294,0,"",python,selection_command
+204,57561625,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7301,0,"",python,selection_command
+205,57561880,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7302,0,"",python,selection_command
+206,57561913,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7314,0,"",python,selection_command
+207,57561964,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7359,0,"",python,selection_command
+208,57561989,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7380,0,"",python,selection_command
+209,57562023,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7411,0,"",python,selection_command
+210,57562057,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7438,0,"",python,selection_command
+211,57562090,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7462,0,"",python,selection_command
+212,57562109,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7494,0,"",python,selection_command
+213,57562220,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7535,0,"",python,selection_command
+214,57562402,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7570,0,"",python,selection_command
+215,57563109,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7568,0,"",python,selection_command
+216,57563263,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7567,0,"",python,selection_command
+217,57563792,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7564,0,"",python,selection_command
+218,57565772,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7529,0,"",python,selection_command
+219,57566019,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7488,0,"",python,selection_command
+220,57566064,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7456,0,"",python,selection_command
+221,57566096,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7432,0,"",python,selection_command
+222,57566126,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7405,0,"",python,selection_command
+223,57566159,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7374,0,"",python,selection_command
+224,57566191,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7353,0,"",python,selection_command
+225,57566676,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7308,0,"",python,selection_command
+226,57567039,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7311,0,"",python,selection_command
+227,57567170,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7312,0,"",python,selection_command
+228,57567330,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7319,0,"",python,selection_command
+229,57567490,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7320,0,"",python,selection_command
+230,57567661,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7326,0,"",python,selection_command
+231,57567829,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7327,0,"",python,selection_command
+232,57568643,"nemo/collections/llm/api.py",0,0,"# Copyright (c) 2025, NVIDIA CORPORATION. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the ""License"");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an ""AS IS"" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport json\nimport warnings\nfrom copy import deepcopy\nfrom pathlib import Path\nfrom typing import TYPE_CHECKING, Any, Callable, Optional, Union\n\nimport lightning.pytorch as pl\nimport nemo_run as run\nimport torch\nfrom megatron.core import parallel_state\nfrom rich.console import Console\nfrom torch.distributed import all_gather_object\nfrom typing_extensions import Annotated\n\nimport nemo.lightning as nl\nfrom nemo.collections.llm import GPTModel, HFAutoModelForCausalLM\nfrom nemo.collections.llm.gpt.data.fine_tuning import FineTuningDataModule\nfrom nemo.collections.llm.modelopt import (\n DistillationGPTModel,\n ExportConfig,\n PruningConfig,\n QuantizationConfig,\n Quantizer,\n prune_language_model,\n save_pruned_model,\n set_modelopt_spec_if_exists_in_ckpt,\n setup_trainer_and_restore_model_with_modelopt_spec,\n)\nfrom nemo.lightning import (\n AutoResume,\n NeMoLogger,\n OptimizerModule,\n Trainer,\n configure_no_restart_validation_training_loop,\n io,\n)\nfrom nemo.lightning.base import NEMO_MODELS_CACHE\nfrom nemo.lightning.callback_group import CallbackGroup\nfrom nemo.lightning.ckpt_utils import ckpt_to_context_subdir\nfrom nemo.lightning.pytorch.callbacks import PEFT, JitTransform, ModelTransform\nfrom nemo.utils import logging\nfrom nemo.utils.get_rank import is_global_rank_zero\n\nif 
TYPE_CHECKING:\n from megatron.core.inference.common_inference_params import CommonInferenceParams\n from megatron.core.inference.inference_request import InferenceRequest\n\n\nTokenizerType = Any\nAnyPath = Union[Path, str]\n\n\n@run.cli.entrypoint(namespace=""llm"")\ndef train(\n model: Union[pl.LightningModule, AnyPath],\n data: pl.LightningDataModule,\n trainer: Trainer,\n log: Annotated[Optional[NeMoLogger], run.Config[NeMoLogger]] = None,\n resume: Annotated[Optional[AutoResume], run.Config[AutoResume]] = None,\n optim: Optional[OptimizerModule] = None,\n tokenizer: Optional[TokenizerType] = None,\n model_transform: Optional[Union[PEFT, ModelTransform, Callable]] = None,\n # TODO: Fix export export: Optional[str] = None,\n) -> Path:\n """"""\n Trains a model using the specified data and trainer, with optional tokenizer, source, and export.\n\n Args:\n model (Union[pl.LightningModule, AnyPath]): The model to be trained or a path to the NeMo 2 checkpoint.\n data (pl.LightningDataModule): The data module containing training data.\n trainer (Trainer): The trainer instance configured with a MegatronStrategy.\n log (NeMoLogger): A nemologger instance.\n resume (Optional[Union[AutoResume, Resume]]): Resume training from a checkpoint.\n optim (Optional[OptimizerModule]): The optimizer module to be used. If not provided, the default optimizer\n from the model will be used.\n tokenizer (Optional[TokenizerType]): Tokenizer setting to be applied. 
Can be 'data' or 'model'\n or an instance of TokenizerSpec.\n export (Optional[str]): Filename to save the exported checkpoint after training.\n model_transform (Optional[Union[Callable[[nn.Module], nn.Module], PEFT]]): A model transform to be applied.\n\n Returns\n -------\n Path: The directory path where training artifacts are saved.\n\n Examples\n --------\n >>> from nemo.collections import llm\n >>> from nemo import lightning as nl\n >>> model = llm.MistralModel()\n >>> data = llm.SquadDataModule(seq_length=4096, global_batch_size=16, micro_batch_size=2)\n >>> precision = nl.MegatronMixedPrecision(precision=""bf16-mixed"")\n >>> trainer = nl.Trainer(strategy=nl.MegatronStrategy(tensor_model_parallel_size=2), plugins=precision)\n >>> llm.train(model, data, trainer, tokenizer=""data"")\n PosixPath('/path/to/log_dir')\n """"""\n model = _load_model_from_path(model)\n\n # [ModelOpt]: If modelopt_state exists, overwrite transformer_layer_spec to modelopt spec\n if resume:\n if resume.restore_config and resume.restore_config.path:\n set_modelopt_spec_if_exists_in_ckpt(model, resume.restore_config.path)\n elif resume.resume_from_path:\n set_modelopt_spec_if_exists_in_ckpt(model, resume.resume_from_path)\n\n app_state = _setup(\n model=model,\n data=data,\n trainer=trainer,\n log=log,\n resume=resume,\n optim=optim,\n tokenizer=tokenizer,\n model_transform=model_transform,\n )\n\n trainer.fit(model, data)\n\n return app_state.exp_dir\n\n\n@run.cli.entrypoint(namespace=""llm"")\ndef pretrain(\n model: Union[pl.LightningModule, AnyPath],\n data: pl.LightningDataModule,\n trainer: Trainer,\n log: Annotated[Optional[NeMoLogger], run.Config[NeMoLogger]] = None,\n resume: Annotated[Optional[AutoResume], run.Config[AutoResume]] = None,\n optim: Optional[OptimizerModule] = None,\n) -> Path:\n """"""\n Pretrains a model using the specified data and trainer, with optional logging, resuming, and optimization.\n\n This function is a wrapper around the `train` function, 
specifically configured for pretraining tasks.\n Note, by default it will use the tokenizer from the model.\n\n Args:\n model (Union[pl.LightningModule, AnyPath]): The model to be pretrained or a path to the NeMo 2 checkpoint.\n data (pl.LightningDataModule): The data module containing pretraining data.\n trainer (Trainer): The trainer instance configured with a MegatronStrategy.\n log (NeMoLogger): A nemologger instance.\n resume (Optional[AutoResume]): Resume training from a checkpoint.\n optim (Optional[OptimizerModule]): The optimizer module to be used. If not provided, the default\n optimizer from the model will be used.\n\n Returns:\n Path: The directory path where pretraining artifacts are saved.\n\n Examples:\n >>> from nemo.collections import llm\n >>> from nemo import lightning as nl\n >>> model = llm.MistralModel()\n >>> data = llm.PretrainingDataModule(paths=[...], seq_length=4096, global_batch_size=16, micro_batch_size=2)\n >>> precision = nl.MegatronMixedPrecision(precision=""bf16-mixed"")\n >>> trainer = nl.Trainer(strategy=nl.MegatronStrategy(tensor_model_parallel_size=2), plugins=precision)\n >>> llm.pretrain(model, data, trainer)\n PosixPath('/path/to/log_dir')\n """"""\n model = _load_model_from_path(model)\n _validate_config(model, data, trainer, log=log, resume=resume, optim=optim)\n\n return train(\n model=model,\n data=data,\n trainer=trainer,\n log=log,\n resume=resume,\n optim=optim,\n tokenizer=""data"",\n )\n\n\n@run.cli.entrypoint(namespace=""llm"")\ndef finetune(\n model: Union[pl.LightningModule, AnyPath],\n data: pl.LightningDataModule,\n trainer: Trainer,\n log: Annotated[Optional[NeMoLogger], run.Config[NeMoLogger]] = None,\n resume: Annotated[Optional[AutoResume], run.Config[AutoResume]] = None,\n optim: Optional[OptimizerModule] = None,\n peft: Optional[Union[PEFT, ModelTransform, Callable]] = None,\n tokenizer: Optional[TokenizerType] = ""model"",\n) -> Path:\n """"""\n Finetunes a model using the specified data and trainer, with 
optional logging, resuming, and PEFT.\n\n Note, by default it will use the tokenizer from the model.\n\n Args:\n model (Union[pl.LightningModule, AnyPath]): The model to be finetuned.\n data (pl.LightningDataModule): The data module containing finetuning data.\n trainer (Trainer): The trainer instance configured with a MegatronStrategy.\n log (NeMoLogger): A nemologger instance.\n resume (Optional[AutoResume]): Resume training from a checkpoint.\n optim (Optional[OptimizerModule]): The optimizer module to be used. If not provided, the default\n optimizer from the model will be used.\n peft (Optional[PEFT]): A PEFT (Parameter-Efficient Fine-Tuning) configuration to be applied.\n tokenizer (Optional[TokenizerType]): Tokenizer setting to be applied. Can be 'data' or 'model'\n or an instance of TokenizerSpec. If 'data' uses the data loader's tokenizer instead of the tokenizer\n from the model checkpoint, which is useful for expanding vocabulary or adding special tokens\n (such as chat template tokens).\n\n Returns:\n Path: The directory path where finetuning artifacts are saved.\n\n Examples:\n >>> from nemo.collections import llm\n >>> from nemo import lightning as nl\n >>> model = llm.MistralModel()\n >>> data = llm.SquadDataModule(seq_length=4096, global_batch_size=16, micro_batch_size=2)\n >>> precision = nl.MegatronMixedPrecision(precision=""bf16-mixed"")\n >>> trainer = nl.Trainer(strategy=nl.MegatronStrategy(tensor_model_parallel_size=2), plugins=precision)\n >>> llm.finetune(model, data, trainer, peft=llm.peft.LoRA())\n PosixPath('/path/to/log_dir')\n """"""\n model = _load_model_from_path(model)\n _validate_config(model, data, trainer, log=log, resume=resume, optim=optim, model_transform=peft)\n return train(\n model=model,\n data=data,\n trainer=trainer,\n log=log,\n resume=resume,\n optim=optim,\n tokenizer=tokenizer,\n model_transform=peft,\n )\n\n\n@run.cli.entrypoint(namespace=""llm"")\ndef validate(\n model: pl.LightningModule,\n data: 
pl.LightningDataModule,\n trainer: Trainer,\n log: Annotated[Optional[NeMoLogger], run.Config[NeMoLogger]] = None,\n resume: Annotated[Optional[AutoResume], run.Config[AutoResume]] = None,\n optim: Optional[OptimizerModule] = None,\n tokenizer: Optional[TokenizerType] = None,\n model_transform: Optional[Union[PEFT, ModelTransform, Callable]] = None,\n) -> Path:\n """"""\n Validates a model using the specified data and trainer, with optional logging, resuming, and model transformations.\n\n Args:\n model (pl.LightningModule): The model to be validated.\n data (pl.LightningDataModule): The data module containing validation data.\n trainer (Trainer): The trainer instance configured with a MegatronStrategy.\n log (NeMoLogger): A nemologger instance.\n resume (Optional[AutoResume]): Resume from a checkpoint for validation.\n optim (Optional[OptimizerModule]): The optimizer module to be used. If not provided, the default optimizer\n from the model will be used.\n tokenizer (Optional[TokenizerType]): Tokenizer setting to be applied. 
Can be 'data' or 'model'\n or an instance of TokenizerSpec.\n model_transform (Optional[Union[Callable[[nn.Module], nn.Module], PEFT]]): A model transform to be applied.\n\n Returns:\n Path: The directory path where validation artifacts are saved.\n\n Examples:\n >>> from nemo.collections import llm\n >>> from nemo import lightning as nl\n >>> model = llm.MistralModel()\n >>> data = llm.SquadDataModule(seq_length=4096, global_batch_size=16, micro_batch_size=2)\n >>> precision = nl.MegatronMixedPrecision(precision=""bf16-mixed"")\n >>> trainer = nl.Trainer(strategy=nl.MegatronStrategy(tensor_model_parallel_size=2), plugins=precision)\n >>> llm.validate(model, data, trainer, tokenizer=""data"")\n PosixPath('/path/to/log_dir')\n """"""\n app_state = _setup(\n model=model,\n data=data,\n trainer=trainer,\n log=log,\n resume=resume,\n optim=optim,\n tokenizer=tokenizer,\n model_transform=model_transform,\n )\n\n trainer.validate(model, data)\n\n return app_state.exp_dir\n\n\n@run.cli.entrypoint(name=""prune"", namespace=""llm"")\ndef prune(\n nemo_checkpoint: str,\n save_path: str,\n pruning_config: PruningConfig,\n devices: int = 1,\n num_nodes: int = 1,\n tp_size: int = 1,\n pp_size: int = 1,\n num_layers_in_first_pipeline_stage: int | None = None,\n num_layers_in_last_pipeline_stage: int | None = None,\n num_train_samples: int = 1024,\n data: pl.LightningDataModule | None = None,\n tokenizer_path: str | None = None,\n legacy_ckpt: bool = False,\n) -> str:\n """"""\n Prunes a model using the specified data and trainer. 
Currently only supports GPT models.\n\n Args:\n nemo_checkpoint (str): The path to the NeMo checkpoint to be pruned.\n save_path (str): The path to save the pruned NeMo checkpoint.\n pruning_config (PruningConfig): The pruning configuration.\n devices (int): The number of devices to use for pruning.\n num_nodes (int): The number of nodes to use for pruning.\n tp_size (int): The tensor parallel size.\n pp_size (int): The pipeline parallel size.\n num_train_samples (int): Number of training samples for importance estimation using forward pass.\n num_layers_in_first_pipeline_stage (int): The number of layers in the first pipeline stage.\n num_layers_in_last_pipeline_stage (int): The number of layers in the last pipeline stage.\n data (pl.LightningDataModule): The data module for forward pass.\n Required if not dropping layers.\n tokenizer_path (str): Path to the tokenizer if not using model's tokenizer.\n legacy_ckpt (bool): If True, allow loading ckpt saved with older version of TE.\n Use for cases like missing state dict keys ending with `_extra_state`.\n\n Returns:\n str: The path to the pruned NeMo checkpoint.\n\n Examples:\n >>> from nemo.collections import llm\n >>> from nemo.collections.llm.modelopt.prune import PruningConfig\n >>> data = llm.PretrainingDataModule(\n paths=[""1.0"", ""path/to/tokenized/data""],\n seq_length=256,\n global_batch_size=1,\n micro_batch_size=1,\n )\n >>> llm.prune(\n nemo_checkpoint=""path/to/llama3.1-8b"",\n save_path=""path/to/pruned_llama_model"",\n pruning_config=PruningConfig(target_ffn_hidden_size=9216, target_hidden_size=3072),\n data=data\n )\n """"""\n if data is not None:\n assert data.global_batch_size == data.micro_batch_size, ""Global batch size must be equal to micro batch size""\n steps = num_train_samples // data.global_batch_size\n else:\n steps = num_train_samples\n\n model, trainer = setup_trainer_and_restore_model_with_modelopt_spec(\n model_path=nemo_checkpoint,\n tensor_model_parallel_size=tp_size,\n 
pipeline_model_parallel_size=pp_size,\n num_layers_in_first_pipeline_stage=num_layers_in_first_pipeline_stage,\n num_layers_in_last_pipeline_stage=num_layers_in_last_pipeline_stage,\n devices=devices,\n num_nodes=num_nodes,\n inference_only=True,\n tokenizer_path=tokenizer_path,\n legacy_ckpt=legacy_ckpt,\n strategy_kwargs={""sequence_parallel"": False, ""replace_progress_bar"": False},\n trainer_kwargs={""max_steps"": steps, ""limit_val_batches"": steps, ""val_check_interval"": steps},\n model_config_overrides={""sequence_parallel"": False},\n )\n prune_language_model(model, pruning_config, data, trainer)\n save_pruned_model(trainer, save_path)\n\n console = Console()\n console.print(f""[green]✓ Pruning succeeded, pruned checkpoint saved to {save_path}[/green]"")\n\n return save_path\n\n\n@run.cli.entrypoint(name=""distill"", namespace=""llm"")\ndef distill(\n student_model_path: AnyPath,\n teacher_model_path: AnyPath,\n data: pl.LightningDataModule,\n trainer: Trainer,\n distillation_config_path: Optional[AnyPath] = None,\n log: Annotated[Optional[NeMoLogger], run.Config[NeMoLogger]] = None,\n resume: Annotated[Optional[AutoResume], run.Config[AutoResume]] = None,\n optim: Optional[OptimizerModule] = None,\n tokenizer: Optional[TokenizerType] = None,\n model_transform: Optional[Union[PEFT, ModelTransform, Callable]] = None,\n) -> Path:\n """"""\n Distills a teacher model into a student model using special Knowledge-Distillation losses.\n\n Note that this requires an existing NeMo 2.0 checkpoint of the student model as well, as\n the model class is not known beforehand.\n This script currently supports instances of ``nemo.collections.llm.GPTModel`` for now.\n\n Args:\n student_model_path (Path): Path to student model NeMo checkpoint to be trained.\n teacher_model_path (Path): Path to teacher model NeMo checkpoint to distill from.\n data (pl.LightningDataModule): The data module containing training data.\n trainer (Trainer): The trainer instance configured with a 
MegatronStrategy.\n distillation_config_path (Optional[Path]): Path to distillation config YAML file.\n If not provided, by default will perform logits-only distillation.\n log (NeMoLogger): A nemologger instance.\n resume (Optional[Union[AutoResume, Resume]]): Resume training from a checkpoint.\n optim (Optional[OptimizerModule]): The optimizer module to be used. If not provided, the default optimizer\n from the model will be used.\n tokenizer (Optional[TokenizerType]): Tokenizer setting to be applied. Can be 'data' or 'model'\n or an instance of TokenizerSpec.\n export (Optional[str]): Filename to save the exported checkpoint after training.\n model_transform (Optional[Union[Callable[[nn.Module], nn.Module], PEFT]]): A model transform to be applied.\n\n Returns\n -------\n Path: The directory path where training artifacts are saved.\n\n Examples\n --------\n >>> from nemo.collections import llm\n >>> from nemo import lightning as nl\n >>> student = ""/path/to/student/nemo/ckpt"" # <-- change me\n >>> teacher = ""/path/to/teacher/nemo/ckpt"" # <-- change me\n >>> data = llm.SquadDataModule(seq_length=4096, global_batch_size=16, micro_batch_size=2)\n >>> precision = nl.MegatronMixedPrecision(precision=""bf16-mixed"")\n >>> trainer = nl.Trainer(strategy=nl.MegatronStrategy(tensor_model_parallel_size=2), plugins=precision)\n >>> llm.distill(student, teacher, data, trainer, tokenizer=""model"")\n PosixPath('/path/to/log_dir')\n """"""\n _student_model = io.load_context(ckpt_to_context_subdir(student_model_path), subpath=""model"")\n _teacher_model = io.load_context(ckpt_to_context_subdir(teacher_model_path), subpath=""model"")\n assert isinstance(_student_model, GPTModel), ""Only models based on `llm.GPTModel` are supported currently.""\n assert isinstance(_teacher_model, GPTModel), ""Only models based on `llm.GPTModel` are supported currently.""\n\n if tokenizer is None:\n tokenizer = getattr(_student_model, ""tokenizer"", None) or getattr(_teacher_model, 
""tokenizer"", None)\n assert tokenizer is not None, ""Tokenizer neither provided nor found in models.""\n\n model = DistillationGPTModel(\n _student_model.config,\n _teacher_model.config,\n teacher_ckpt_path=teacher_model_path,\n distillation_config_path=distillation_config_path,\n )\n model.__io__ = _student_model.__io__\n\n if resume is None:\n resume = AutoResume()\n if resume.restore_config is None:\n resume.restore_config = nl.RestoreConfig(path=student_model_path)\n\n return train(\n model=model,\n data=data,\n optim=optim,\n tokenizer=tokenizer,\n trainer=trainer,\n log=log,\n resume=resume,\n model_transform=model_transform,\n )\n\n\n@run.cli.entrypoint(name=""ptq"", namespace=""llm"")\ndef ptq(\n model_path: str,\n export_config: ExportConfig,\n calibration_tp: int = 1,\n calibration_pp: int = 1,\n calibration_ep: int = 1,\n num_layers_in_first_pipeline_stage: int | None = None,\n num_layers_in_last_pipeline_stage: int | None = None,\n devices: int | None = None,\n num_nodes: int | None = None,\n quantization_config: Annotated[Optional[QuantizationConfig], run.Config[QuantizationConfig]] = None,\n forward_loop: Callable | None = None,\n tokenizer_path: str | None = None,\n legacy_ckpt: bool = False,\n trust_remote_code: bool = False,\n) -> Path:\n """"""\n Applies Post-Training Quantization (PTQ) for a model using the specified quantization and export configs. 
It runs\n calibration for a small dataset to collect scaling factors for low-precision GEMMs used by the desired quantization method.\n By default, this function produces TensorRT-LLM checkpoint ready for deployment using the Export-Deploy repository\n (https://github.com/NVIDIA-NeMo/Export-Deploy) or directly using TensorRT-LLM library.\n\n The function can be used through the NeMo CLI in the following way:\n ```bash\n # Run calibration using tensor parallel set to 8 and export quantized checkpoint with tensor parallel equal 2\n nemo llm ptq run.executor=torchrun run.executor.ntasks_per_node=8 \\n model_path=/models/Llama-3-70B \\n export_config.path=/models/Llama-3-70B-FP8 \\n calibration_tp=8 \\n export_config.inference_tp=2\n\n # Choose different quantization method, for example, INT8 SmoothQuant\n nemo llm ptq run.executor=torchrun run.executor.ntasks_per_node=1 \\n model_path=/models/Llama-3-8B \\n export_config.path=/models/Llama-3-8B-INT8_SQ \\n quantization_config.algorithm=int8_sq\n\n # Export as NeMo checkpoint instead\n nemo llm ptq run.executor=torchrun \\n model_path=/models/Llama-3-8B \\n export_config.path=/models/Llama-3-8B-INT8_SQ \\n quantization_config.algorithm=int8_sq \\n export_config.export_format=nemo\n\n # Quantize HF AutoModel checkpoint.\n nemo llm ptq run.executor=torchrun run.executor.ntasks_per_node=1 \\n model_path=/models/Llama-3-70B-HF \\n export_config.path=/models/Llama-3-70B-HF-FP8 \\n export_config.export_format=hf\n ```\n\n Args:\n model_path (str): The path to model to be quantized.\n calibration_tp (int): Calibration tensor parallelism.\n calibration_pp (int): Calibration pipeline parallelism.\n num_layers_in_first_pipeline_stage (int): Number of layers in the first pipeline stage.\n num_layers_in_last_pipeline_stage (int): Number of layers in the last pipeline stage.\n export_config (ExportConfig): Export configuration for output checkpoint.\n devices (int): Number of devices to use for calibration. 
Default: calibration_tp.\n num_nodes (int): Number of nodes to use for calibration. Default: calibration_pp.\n quantization_config (QuantizationConfig): Configuration for quantization algorithm.\n forward_loop (Callable): Forward loop to use for calibration.\n If not provided, a forward loop will be created using the calibration dataset.\n tokenizer_path (str): Path to the tokenizer if not using model's tokenizer.\n legacy_ckpt (bool): If True, allow loading ckpt saved with older version of TE.\n trust_remote_code (bool): Trust remote code when loading HuggingFace models.\n\n Returns:\n Path: The path where the quantized checkpoint has been saved after calibration.\n """"""\n if not quantization_config:\n quantization_config = QuantizationConfig()\n if devices is None:\n devices = calibration_tp\n if num_nodes is None:\n num_nodes = calibration_pp\n\n quantizer = Quantizer(quantization_config, export_config)\n assert Path(model_path).exists(), f""Path {model_path} does not exist""\n is_automodel = (Path(model_path) / 'config.json').exists()\n\n trainer = None\n if is_automodel:\n assert export_config.export_format != ""nemo"", ""Automodel PTQ does not support export format nemo""\n model = HFAutoModelForCausalLM(model_name=model_path, trust_remote_code=trust_remote_code, device_map=""auto"")\n model.configure_model()\n else:\n model, trainer = setup_trainer_and_restore_model_with_modelopt_spec(\n model_path=model_path,\n tensor_model_parallel_size=calibration_tp,\n pipeline_model_parallel_size=calibration_pp,\n num_layers_in_first_pipeline_stage=num_layers_in_first_pipeline_stage,\n num_layers_in_last_pipeline_stage=num_layers_in_last_pipeline_stage,\n expert_model_parallel_size=calibration_ep,\n devices=devices,\n num_nodes=num_nodes,\n inference_only=True,\n tokenizer_path=tokenizer_path,\n legacy_ckpt=legacy_ckpt,\n strategy_kwargs={""sequence_parallel"": False, ""lazy_init"": True},\n trainer_kwargs={},\n model_config_overrides={""sequence_parallel"": False},\n 
)\n\n model = quantizer.quantize(model, forward_loop)\n quantizer.export(model, model_path, trainer)\n\n if is_global_rank_zero():\n console = Console()\n console.print(f""[green]✓ PTQ succeeded, quantized checkpoint exported to {export_config.path}[/green]"")\n return export_config.path\n\n\n@run.cli.entrypoint(name=""import"", namespace=""llm"")\ndef import_ckpt(\n model: pl.LightningModule,\n source: str,\n output_path: Optional[AnyPath] = None,\n overwrite: bool = False,\n **kwargs,\n) -> Path:\n """"""\n Imports a checkpoint into a model using the model's associated importer, typically for\n the purpose of fine-tuning a community model trained in an external framework, such as\n Hugging Face.\n\n This function can be used both programmatically and through the NeMo CLI:\n\n CLI Usage:\n ```bash\n # Import Llama 3 8B from HuggingFace (saves to $NEMO_MODELS_CACHE)\n nemo llm import model=llama3_8b source=""hf://meta-llama/Llama-3.1-8B""\n\n # Import with custom output path\n nemo llm import model=llama3_8b source=""hf://meta-llama/Llama-3.1-8B"" output_path=""/path/to/save""\n\n # Force overwrite existing checkpoint\n nemo llm import model=llama3_8b source=""hf://meta-llama/Llama-3.1-8B"" overwrite=true\n ```\n\n Python Usage:\n ```python\n model = Mistral7BModel()\n imported_path = import_ckpt(model, ""hf://mistralai/Mistral-7B-v0.1"")\n ```\n\n The importer component of the model reads the checkpoint data from the specified source\n and transforms it into the right format. This is particularly useful for adapting\n models that have been pre-trained in different environments or frameworks to be fine-tuned\n or further developed within the current system.\n\n For instance, using `import_ckpt(Mistral7BModel(), ""hf"")` initiates the import process\n by searching for a registered model importer tagged with ""hf"". 
In NeMo, `HFMistral7BImporter`\n is registered under this tag via:\n `@io.model_importer(Mistral7BModel, ""hf"", default_path=""mistralai/Mistral-7B-v0.1"")`.\n This links `Mistral7BModel` to `HFMistral7BImporter`, designed for HuggingFace checkpoints.\n\n Args:\n model (pl.LightningModule): The model into which the checkpoint will be imported.\n This model must implement the ConnectorMixin.\n source (str): The source from which the checkpoint will be imported. This can be\n a file path, URL, or any other string identifier that the model's importer\n can recognize.\n output_path (Optional[Path]): The path where the imported checkpoint will be stored.\n If not specified, the checkpoint will be saved to $NEMO_MODELS_CACHE\n (defaults to ~/.cache/nemo/models/ if the environment variable is not set).\n overwrite (bool): If set to True, existing files at the output path will be overwritten.\n This is useful for model updates where retaining old checkpoint files is not required.\n\n Returns:\n Path: The path where the checkpoint has been saved after import.\n\n Raises:\n ValueError: If the model does not implement ConnectorMixin, indicating a lack of\n necessary importer functionality.\n FileExistsError: If the output path is provided (that is, when not using models cache)\n and it exists and overwrite is not set to True.\n """"""\n if output_path:\n output_path = Path(output_path)\n if output_path.exists() and not overwrite:\n raise FileExistsError(f""Output path {output_path} exists. 
Use overwrite=True to force overwrite."")\n\n output = io.import_ckpt(model=model, source=source, output_path=output_path, overwrite=overwrite, **kwargs)\n\n console = Console()\n if output_path:\n console.print(f""[green]โ Checkpoint imported to {output}[/green]"")\n else:\n console.print(f""[green] $NEMO_MODELS_CACHE={NEMO_MODELS_CACHE} [/green]"")\n\n # Display directory structure as a tree\n dir_tree = _build_directory_tree(output, root_name=""Imported Checkpoint"")\n console.print(dir_tree)\n\n return output\n\n\ndef load_connector_from_trainer_ckpt(path: AnyPath, target: str) -> io.ModelConnector:\n # pylint: disable=C0116\n if not isinstance(path, Path):\n path = Path(path)\n return io.load_context(path, subpath=""model"").exporter(target, path)\n\n\n@run.cli.entrypoint(name=""export"", namespace=""llm"")\ndef export_ckpt(\n path: AnyPath,\n target: str,\n output_path: Optional[AnyPath] = None,\n overwrite: bool = False,\n load_connector: Callable[[Path, str], io.ModelConnector] = load_connector_from_trainer_ckpt,\n modelopt_export_kwargs: dict[str, Any] = None,\n **kwargs,\n) -> Path:\n """"""\n Exports a checkpoint from a model using the model's associated exporter, typically for\n the purpose of sharing a model that has been fine-tuned or customized within NeMo.\n\n This function can be used both programmatically and through the NeMo CLI:\n\n CLI Usage:\n ```bash\n # Export model to HuggingFace format (saves to {checkpoint_path}/hf/)\n nemo llm export path=/path/to/model.nemo target=""hf""\n\n # Export with custom output path\n nemo llm export path=/path/to/model.nemo target=""hf"" output_path=""/path/to/save""\n\n # Force overwrite existing export\n nemo llm export path=/path/to/model.nemo target=""hf"" overwrite=true\n ```\n\n Python Usage:\n ```python\n nemo_ckpt_path = Path(""/path/to/model.nemo"")\n export_path = export_ckpt(nemo_ckpt_path, ""hf"")\n ```\n\n The exporter component of the model reads the model's state from the specified path and\n 
exports it into the format specified by the 'target' identifier. This is particularly\n useful for adapting models that have been developed or fine-tuned within NeMo to be\n compatible with other environments or frameworks.\n\n Args:\n path (Path): The path to the model's checkpoint file from which data will be exported.\n target (str): The identifier for the exporter that defines the format of the export\n (e.g., ""hf"" for HuggingFace format).\n output_path (Optional[Path]): The path where the exported checkpoint will be saved.\n If not specified, defaults to {checkpoint_path}/{target}/.\n overwrite (bool): If set to True, existing files at the output path will be overwritten.\n This is useful for model updates where retaining old checkpoint files is not required.\n load_connector (Callable[[Path, str], ModelConnector]): A function to load the appropriate\n exporter based on the model and target format. Defaults to `load_connector_from_trainer_ckpt`.\n modelopt_export_kwargs (Dict[str, Any]): Additional keyword arguments for ModelOpt export to HuggingFace.\n\n Returns:\n Path: The path where the checkpoint has been saved after export.\n\n Raises:\n ValueError: If the model does not implement ConnectorMixin, indicating a lack of\n necessary exporter functionality.\n FileExistsError: If the output path is provided (that is, when not using models cache)\n and it exists and overwrite is not set to True.\n """"""\n if not isinstance(path, Path):\n path = Path(path)\n if output_path and not isinstance(output_path, Path):\n output_path = Path(output_path)\n if output_path.exists() and not overwrite:\n raise FileExistsError(f""Output path {output_path} exists. 
Use overwrite=True to force overwrite."")\n\n output = io.export_ckpt(path, target, output_path, overwrite, load_connector, modelopt_export_kwargs, **kwargs)\n\n console = Console()\n console.print(f""[green]โ Checkpoint exported to {output}[/green]"")\n\n return output\n\n\n@run.cli.entrypoint(name=""generate"", namespace=""llm"")\ndef generate(\n path: AnyPath,\n trainer: nl.Trainer,\n prompts: Optional[list[str]] = None,\n encoder_prompts: Optional[list[str]] = None,\n input_dataset: Optional[Union[pl.LightningDataModule, str]] = None,\n params_dtype: torch.dtype = torch.bfloat16,\n add_BOS: bool = False,\n max_batch_size: int = 4,\n random_seed: Optional[int] = None,\n inference_batch_times_seqlen_threshold: int = 1000,\n inference_params: Optional[""CommonInferenceParams""] = None,\n text_only: bool = False,\n output_path: Optional[AnyPath] = None,\n enable_flash_decode: bool = True,\n **kwargs,\n) -> list[Union[""InferenceRequest"", str]]:\n """"""\n Generates text using a NeMo LLM model.\n\n This function takes a checkpoint path and a list of prompts,\n and generates text based on the loaded model and parameters.\n It returns a list of generated text, either as a string or as an InferenceRequest object.\n\n Python Usage:\n ```python\n strategy = nl.MegatronStrategy(\n tensor_model_parallel_size=2,\n pipeline_model_parallel_size=1,\n context_parallel_size=1,\n sequence_parallel=False,\n setup_optimizers=False,\n store_optimizer_states=False,\n )\n\n trainer = nl.Trainer(\n accelerator=""gpu"",\n devices=2,\n num_nodes=1,\n strategy=strategy,\n plugins=nl.MegatronMixedPrecision(\n precision=""bf16-mixed"",\n params_dtype=torch.bfloat16,\n pipeline_dtype=torch.bfloat16,\n autocast_enabled=False,\n grad_reduce_in_fp32=False,\n ),\n )\n prompts = [\n ""Hello, how are you?"",\n ""How many r's are in the word 'strawberry'?"",\n ""Which number is bigger? 
10.119 or 10.19?"",\n ]\n\n if __name__ == ""__main__"":\n results = api.generate(\n path=os.path.join(os.environ[""NEMO_HOME""], ""models"", ""meta-llama/Meta-Llama-3-8B""),\n prompts=prompts,\n trainer=trainer,\n inference_params=CommonInferenceParams(temperature=0.1, top_k=10, num_tokens_to_generate=512),\n text_only=True,\n )\n ```\n\n Args:\n path (Union[Path, str]): The path to the model checkpoint.\n prompts (list[str]): The list of prompts to generate text for.\n trainer (nl.Trainer): The trainer object.\n encoder_prompts (Optional[list[str]], optional): The list of encoder prompts. Defaults to None.\n input_dataset (Optional[Union[pl.LightningDataModule, str]], optional): The input data module or jsonl file.\n Test set will be used for generation for data modules. Defaults to None.\n params_dtype (torch.dtype, optional): The data type of the model parameters. Defaults to torch.bfloat16.\n add_BOS (bool, optional): Whether to add the beginning of sequence token. Defaults to False.\n max_batch_size (int, optional): The maximum batch size. Defaults to 4.\n random_seed (Optional[int], optional): The random seed. Defaults to None.\n inference_batch_times_seqlen_threshold (int, optional): If batch-size times sequence-length is smaller than\n this threshold then we will not use pipelining, otherwise we will. Defaults to 1000.\n inference_params (Optional[""CommonInferenceParams""], optional): The inference parameters defined in\n Mcore's CommonInferenceParams. Defaults to None.\n text_only (bool, optional): Whether to return only the generated text as a string. Defaults to False.\n output_path (Optional[Union[Path, str]], optional): The path to save the generated text or test dataset\n predictions. Defaults to None.\n enable_flash_decode (bool, optional): Whether to enable flash decode. 
Defaults to True.\n **kwargs: Additional keyword arguments passed to setup_model_and_tokenizer.\n\n Returns:\n list[Union[""InferenceRequest"", str]]: A list of generated text,\n either as a string or as an InferenceRequest object.\n """"""\n from nemo.collections.llm import inference\n\n if input_dataset is not None:\n input_path = input_dataset if isinstance(input_dataset, str) else input_dataset.test_path\n with open(input_path) as f:\n dataset = [json.loads(sample) for sample in f.readlines()]\n inputs = [sample[""input""] for sample in dataset]\n elif prompts is not None:\n inputs = prompts\n else:\n raise ValueError(""Either prompts or input_dataset must be provided."")\n\n inference_wrapped_model, mcore_tokenizer = inference.setup_model_and_tokenizer(\n path=path,\n trainer=trainer,\n params_dtype=params_dtype,\n inference_batch_times_seqlen_threshold=inference_batch_times_seqlen_threshold,\n enable_flash_decode=enable_flash_decode,\n **kwargs,\n )\n\n max_seq_length = inference_params.num_tokens_to_generate + max(len(mcore_tokenizer.tokenize(p)) for p in inputs)\n # set kv cache allocation to only num tokens in prompt + max tokens to generate\n inference_wrapped_model.inference_wrapper_config.inference_max_seq_length = max_seq_length\n inference_wrapped_model.inference_context.max_sequence_length = max_seq_length\n\n if trainer.strategy.expert_model_parallel_size > 1:\n inputs_on_this_dp_rank = inputs\n else:\n dp_size = trainer.strategy.distributed_sampler_kwargs['num_replicas']\n dp_rank = trainer.strategy.distributed_sampler_kwargs['rank']\n chunk_size = (len(inputs) + dp_size - 1) // dp_size\n start_idx = dp_rank * chunk_size\n end_idx = min(start_idx + chunk_size, len(inputs))\n inputs_on_this_dp_rank = inputs[start_idx:end_idx]\n\n results_on_this_dp_rank = inference.generate(\n model=inference_wrapped_model,\n tokenizer=mcore_tokenizer,\n prompts=inputs_on_this_dp_rank,\n encoder_prompts=encoder_prompts,\n add_BOS=add_BOS,\n 
max_batch_size=max_batch_size,\n random_seed=random_seed,\n inference_params=inference_params,\n )\n\n if trainer.strategy.expert_model_parallel_size > 1:\n gathered_results = [r.generated_text if text_only else r for r in results_on_this_dp_rank]\n else:\n gathered_results = [None] * dp_size\n\n all_gather_object(\n gathered_results,\n [r.generated_text if text_only else r for r in results_on_this_dp_rank],\n group=parallel_state.get_data_parallel_group(),\n )\n gathered_results = [result for sublist in gathered_results for result in sublist]\n\n assert len(gathered_results) == len(inputs)\n\n if output_path is not None and is_global_rank_zero():\n with open(output_path, ""w"") as f:\n for sample, pred in zip(dataset if input_dataset else inputs, gathered_results):\n if type(sample) == dict:\n sample[""label""] = sample.pop(""output"", None)\n sample[""prediction""] = pred if text_only else pred.generated_text\n elif type(sample) == str:\n sample = {""input"": sample, ""prediction"": pred if text_only else pred.generated_text}\n f.write(json.dumps(sample) + ""\n"")\n logging.info(f""Predictions written to {output_path}"")\n\n return gathered_results\n\n\ndef _use_tokenizer(model: pl.LightningModule, data: pl.LightningDataModule, tokenizer: TokenizerType) -> None:\n if tokenizer == ""data"":\n _set_with_io(model, ""tokenizer"", data.tokenizer)\n elif tokenizer == ""model"":\n _set_with_io(data, ""tokenizer"", model.tokenizer)\n else:\n try:\n from nemo.collections.common.tokenizers.tokenizer_spec import TokenizerSpec\n\n if isinstance(tokenizer, TokenizerSpec):\n _set_with_io(model, ""tokenizer"", tokenizer)\n _set_with_io(data, ""tokenizer"", tokenizer)\n else:\n raise ValueError(f""Expected TokenizerSpec or 'data' or 'model', got: {tokenizer}"")\n except ImportError:\n raise ValueError(""TokenizerSpec is not available"")\n\n\ndef _setup(\n model: pl.LightningModule,\n data: pl.LightningDataModule,\n trainer: Trainer,\n log: Optional[NeMoLogger],\n resume: 
Optional[AutoResume],\n optim: Optional[OptimizerModule],\n tokenizer: Optional[TokenizerType],\n model_transform: Optional[Union[PEFT, ModelTransform, Callable]],\n) -> Any: # Return type is Any because app_state's type is not specified\n configure_no_restart_validation_training_loop(trainer)\n _log = log or NeMoLogger()\n if resume and isinstance(model_transform, PEFT) and _log.ckpt:\n logging.info(""Disabling try_restore_best_ckpt restoration for adapters"")\n _log.ckpt.try_restore_best_ckpt = False\n\n app_state = _log.setup(\n trainer,\n resume_if_exists=getattr(resume, ""resume_if_exists"", False),\n task_config=getattr(train, ""__io__"", None),\n )\n\n # Configure telemetry via CallbackGroup\n CallbackGroup.get_instance().update_config(nemo_version='v2', trainer=trainer, data=data)\n\n if resume is not None:\n CallbackGroup.get_instance().on_load_checkpoint_start()\n resume.setup(trainer, model)\n CallbackGroup.get_instance().on_load_checkpoint_end()\n\n if optim:\n CallbackGroup.get_instance().on_optimizer_init_start()\n optim.connect(model)\n CallbackGroup.get_instance().on_optimizer_init_end()\n if tokenizer: # TODO: Improve this\n _use_tokenizer(model, data, tokenizer)\n\n if model_transform:\n _set_with_io(model, ""model_transform"", model_transform)\n\n # Add ModelTransform callback to Trainer if needed\n if getattr(model, ""model_transform"", None):\n if not any(isinstance(cb, ModelTransform) for cb in trainer.callbacks):\n if isinstance(model_transform, ModelTransform):\n trainer.callbacks.append(model_transform)\n else:\n trainer.callbacks.append(ModelTransform())\n # Move jit callback at the end ensure it's applied on top of any model transformations (peft)\n jit_cb = None\n for i, cb in enumerate(trainer.callbacks):\n if isinstance(cb, JitTransform):\n assert jit_cb is None\n jit_cb = trainer.callbacks.pop(i)\n if jit_cb is not None:\n trainer.callbacks.append(jit_cb)\n return app_state\n\n\ndef _set_with_io(obj, attr, value):\n setattr(obj, attr, 
value)\n if hasattr(obj, ""__io__"") and hasattr(value, ""__io__""):\n setattr(obj.__io__, attr, deepcopy(value.__io__))\n\n\ndef _validate_config(\n model: pl.LightningModule,\n data: pl.LightningDataModule,\n trainer: Trainer,\n log: Optional[NeMoLogger] = None,\n resume: Optional[AutoResume] = None,\n optim: Optional[OptimizerModule] = None,\n tokenizer: Optional[TokenizerType] = None,\n model_transform: Optional[Union[PEFT, ModelTransform, Callable]] = None,\n) -> None:\n\n # Model validation\n if hasattr(model, ""config""):\n assert getattr(model.config, ""seq_length"", 1) > 0\n assert getattr(model.config, ""max_position_embeddings"", 1) > 0\n assert model.config.num_layers > 0\n assert model.config.hidden_size > 0\n assert model.config.num_attention_heads > 0\n assert model.config.ffn_hidden_size > 0\n else:\n assert not isinstance(trainer.strategy, nl.MegatronStrategy), ""Expected model.config to exist""\n\n # Data validation\n assert data.micro_batch_size > 0\n if isinstance(trainer.strategy, nl.MegatronStrategy):\n assert data.global_batch_size > 0\n assert data.seq_length > 0\n\n assert (\n data.global_batch_size % data.micro_batch_size == 0\n ), ""Global batch size must be divisible by micro batch size in data module.""\n\n # Trainer validation\n\n # MegatronStrategy validation\n if isinstance(trainer.strategy, nl.MegatronStrategy):\n # Basic validation\n assert trainer.strategy.tensor_model_parallel_size > 0\n assert trainer.strategy.pipeline_model_parallel_size > 0\n assert trainer.strategy.context_parallel_size > 0\n\n # DP validation\n assert (trainer.num_devices * trainer.num_nodes) % (\n trainer.strategy.tensor_model_parallel_size\n * trainer.strategy.pipeline_model_parallel_size\n * trainer.strategy.context_parallel_size\n ) == 0, ""Number of GPUs must be divisible by the product of all parallelism sizes for data parallel.""\n\n assert (\n data.global_batch_size\n % (\n data.micro_batch_size\n * (\n (trainer.num_devices * trainer.num_nodes)\n / 
(\n trainer.strategy.tensor_model_parallel_size\n * trainer.strategy.pipeline_model_parallel_size\n * trainer.strategy.context_parallel_size\n )\n )\n )\n == 0\n ), ""Global batch size must be divisible by the product of micro batch size and data parallel size""\n\n # TP/SP validation\n if trainer.strategy.tensor_model_parallel_size == 1:\n if trainer.strategy.sequence_parallel == True:\n warnings.warn(""Disabling sequence parallelism because tensor model parallelism is disabled"")\n trainer.strategy.sequence_parallel = False\n\n # PP/VP validation\n if trainer.strategy.pipeline_model_parallel_size > 1:\n assert (\n trainer.strategy.pipeline_dtype is not None\n ), ""pipeline_dtype must be set if pipeline model parallelism is enabled""\n else:\n if trainer.strategy.virtual_pipeline_model_parallel_size is not None:\n warnings.warn(""Disabling virtual pipeline parallelism because pipeline model parallelism is disabled"")\n trainer.strategy.virtual_pipeline_model_parallel_size = None\n if trainer.strategy.pipeline_dtype is not None:\n warnings.warn(""Setting pipeline dtype to None because pipeline model parallelism is disabled"")\n trainer.strategy.pipeline_dtype = None\n\n # CP validation\n if trainer.strategy.context_parallel_size > 1:\n if hasattr(model, ""config""):\n if model.config.seq_length is not None:\n assert (\n model.config.seq_length % (trainer.strategy.context_parallel_size * 2) == 0\n ), 'Sequence length must be divisible by 2 * context parallel size if context parallel is used.'\n if isinstance(data, FineTuningDataModule):\n # check calculate_per_token_loss to be True\n # check average_in_collective to be False\n # for context parallel to solve the issue of nan loss on ranks with all tokens masked\n # (only happens in SFT)\n assert (\n model.config.calculate_per_token_loss\n ), ""When finetuning with CP>1, model.config.calculate_per_token_loss must be True""\n assert (\n not trainer.strategy.ddp_config.average_in_collective\n ), ""When finetuning with 
CP>1, average_in_collective must be False""\n\n # EP validation\n if trainer.strategy.expert_model_parallel_size > 1:\n if hasattr(model, ""config""):\n assert (\n model.config.num_moe_experts is not None\n ), ""num_moe_experts must not be None to use expert model parallelism""\n assert (\n model.config.num_moe_experts % trainer.strategy.expert_model_parallel_size == 0\n ), ""Number of experts should be a multiple of expert model parallel size.""\n\n\ndef _build_directory_tree(path, tree=None, root_name=None):\n """"""Build a Rich Tree representation of a directory structure.""""""\n from rich.tree import Tree\n\n path = Path(path)\n if tree is None:\n tree = Tree(f""[bold blue]{root_name or path.name}[/bold blue]"")\n\n # Sort to have directories first, then files\n items = sorted(path.iterdir(), key=lambda x: (not x.is_dir(), x.name))\n\n for item in items:\n if item.is_dir():\n branch = tree.add(f""[bold cyan]{item.name}/[/bold cyan]"")\n _build_directory_tree(item, branch)\n else:\n # Color differently based on file extension\n if item.suffix in ('.json', '.jsonl'):\n tree.add(f""[yellow]{item.name}[/yellow]"")\n elif item.suffix in ('.pt', '.bin', '.ckpt', '.nemo'):\n tree.add(f""[magenta]{item.name}[/magenta]"")\n elif item.suffix in ('.py', '.sh'):\n tree.add(f""[green]{item.name}[/green]"")\n else:\n tree.add(f""[white]{item.name}[/white]"")\n\n return tree\n\n\ndef _load_model_from_path(model: Union[pl.LightningModule, AnyPath]):\n if isinstance(model, AnyPath):\n model = io.load_context(ckpt_to_context_subdir(model), subpath=""model"")\n return model\n",python,tab
+233,57568647,"nemo/collections/llm/api.py",7280,0,"",python,selection_command
+234,57591453,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",0,0,"",python,tab
+235,57591454,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7327,0,"",python,selection_command
+236,57592778,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7367,0,"",python,selection_command
+237,57593022,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7393,0,"",python,selection_command
+238,57593054,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7424,0,"",python,selection_command
+239,57593089,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7449,0,"",python,selection_command
+240,57593122,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7475,0,"",python,selection_command
+241,57593155,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7507,0,"",python,selection_command
+242,57593188,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7548,0,"",python,selection_command
+243,57593223,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7575,0,"",python,selection_command
+244,57593271,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7583,0,"",python,selection_command
+245,57593707,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7575,0,"",python,selection_command
+246,57593942,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7568,0,"",python,selection_command
+247,57594111,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7567,0,"",python,selection_command
+248,57594451,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7564,0,"",python,selection_command
+249,57596202,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",7304,0,"",python,selection_command
+250,57596930,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",6651,0,"",python,selection_command
+251,57597772,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",0,0,"",python,selection_command
+252,57599095,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",683,0,"",python,selection_command
+253,57599623,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",687,0,"",python,selection_command
+254,57599896,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",690,0,"",python,selection_command
+255,57601704,"nemo/collections/llm/recipes/qwen3_30b_a3b.py",671,0,"",python,selection_command
diff --git a/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-d574b592-36c1-470c-87c1-c12b951e96361762425248759-2025_11_06-11.34.11.675/source.csv b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-d574b592-36c1-470c-87c1-c12b951e96361762425248759-2025_11_06-11.34.11.675/source.csv
new file mode 100644
index 0000000000000000000000000000000000000000..b773eb9674e0e2557d92d4a11dc177316ad8485f
--- /dev/null
+++ b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-d574b592-36c1-470c-87c1-c12b951e96361762425248759-2025_11_06-11.34.11.675/source.csv
@@ -0,0 +1,94 @@
+Sequence,Time,File,RangeOffset,RangeLength,Text,Language,Type
+1,4,"src/extension.ts",0,0,"// The module 'vscode' contains the VS Code extensibility API\n// Import the module and reference it with the alias vscode in your code below\nimport * as vscode from 'vscode';\n\n// This method is called when your extension is activated\n// Your extension is activated the very first time the command is executed\nexport function activate(context: vscode.ExtensionContext) {\n\n\tconsole.log('Crowd Pilot extension activated');\n\n\t// Configure terminal to allow tab keybinding to work\n\t// This makes the command skip the shell so VS Code can intercept tab in terminals\n\tconst config = vscode.workspace.getConfiguration('terminal.integrated');\n\tconst commandsToSkipShell = config.get('commandsToSkipShell', []);\n\tif (!commandsToSkipShell.includes('crowd-pilot.testRun')) {\n\t\tcommandsToSkipShell.push('crowd-pilot.testRun');\n\t\tconfig.update('commandsToSkipShell', commandsToSkipShell, vscode.ConfigurationTarget.Global);\n\t\tconsole.log('Added testRun to commandsToSkipShell');\n\t}\n\n\tconst testRun = vscode.commands.registerCommand('crowd-pilot.testRun', async () => {\n\t\tconst editor = vscode.window.activeTextEditor;\n\t\tconst doc = editor!.document;\n\t\tconst term = vscode.window.terminals[0] ?? vscode.window.createTerminal('Test');\n\t\tconst git = vscode.extensions.getExtension('vscode.git')?.exports?.getAPI(1);\n\t\tconst repo = git?.repositories?.[0];\n\t\n\t\t// Emit a few actions:\n\t\tawait vscode.window.showTextDocument(doc);\n\t\teditor!.selections = [new vscode.Selection(0, 0, 0, 0)];\n\t\tawait editor!.edit(e => e.insert(new vscode.Position(0, 0), 'hello world\n'));\n\t\tterm.show();\n\t\tterm.sendText('echo VSCode test');\n\t\t//await repo?.pull();\n\t\n\t\tvscode.window.showInformationMessage('All actions emitted');\n\t });\n\n\tcontext.subscriptions.push(testRun);\n}\n\n// This method is called when your extension is deactivated\nexport function deactivate() {}\n",typescript,tab
+2,160,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"11:34:11 AM [info] Activating crowd-code\n11:34:11 AM [info] Recording started\n11:34:11 AM [info] Initializing git provider using file system watchers...\n11:34:11 AM [info] Git repository found\n11:34:11 AM [info] Git provider initialized successfully\n11:34:11 AM [info] Initial git state: [object Object]\n",Log,tab
+3,1020,"src/extension.ts",0,0,"",typescript,tab
+4,2502,"TERMINAL",0,0,"",,terminal_focus
+5,30907,"src/extension.ts",635,0,"",typescript,selection_command
+6,31146,"src/extension.ts",713,0,"",typescript,selection_command
+7,31177,"src/extension.ts",774,0,"",typescript,selection_command
+8,31209,"src/extension.ts",825,0,"",typescript,selection_command
+9,31344,"src/extension.ts",921,0,"",typescript,selection_command
+10,102622,"src/extension.ts",773,201,"\t\tconst updated = [...commandsToSkipShell, 'crowd-pilot.testRun'];\n\t\tconfig.update('commandsToSkipShell', updated, vscode.ConfigurationTarget.Global, false)\n\t\t\t.then(() => {\n\t\t\t\tconsole.log('[Crowd Pilot] Successfully updated commandsToSkipShell');\n\t\t\t\t// Verify it was set\n\t\t\t\tconst verifyConfig = vscode.workspace.getConfiguration('terminal.integrated');\n\t\t\t\tconst verify = verifyConfig.get('commandsToSkipShell', []);\n\t\t\t\tconsole.log('[Crowd Pilot] Verified commandsToSkipShell:', JSON.stringify(verify));\n\t\t\t\tif (!verify.includes('crowd-pilot.testRun')) {\n\t\t\t\t\tconsole.error('[Crowd Pilot] ERROR: Configuration update did not persist!');\n\t\t\t\t}\n\t\t\t})\n\t\t\t.catch((err) => {\n\t\t\t\tconsole.error('[Crowd Pilot] ERROR updating commandsToSkipShell:', err);\n\t\t\t});\n\t} else {\n\t\tconsole.log('[Crowd Pilot] testRun already in commandsToSkipShell');",typescript,content
+11,102622,"src/extension.ts",712,0,"\tconsole.log('[Crowd Pilot] Current commandsToSkipShell:', JSON.stringify(commandsToSkipShell));\n\t\n",typescript,content
+12,102622,"src/extension.ts",371,48,"\tconsole.log('[Crowd Pilot] Extension activated');",typescript,content
+13,110500,"src/extension.ts",874,849,"\t\tcommandsToSkipShell.push('crowd-pilot.testRun');\n\t\tconfig.update('commandsToSkipShell', commandsToSkipShell, vscode.ConfigurationTarget.Global);\n\t\tconsole.log('Added testRun to commandsToSkipShell');",typescript,content
+14,110500,"src/extension.ts",714,99,"",typescript,content
+15,110500,"src/extension.ts",371,50,"\tconsole.log('Crowd Pilot extension activated');",typescript,content
+16,110559,"src/extension.ts",1747,0,"\tcontext.subscriptions.push(checkConfig);\n",typescript,content
+17,110559,"src/extension.ts",1709,0,"\t// Diagnostic command to check keybinding configuration\n\tconst checkConfig = vscode.commands.registerCommand('crowd-pilot.checkConfig', () => {\n\t\tconst config = vscode.workspace.getConfiguration('terminal.integrated');\n\t\tconst commandsToSkipShell = config.get('commandsToSkipShell', []);\n\t\tconst hasCommand = commandsToSkipShell.includes('crowd-pilot.testRun');\n\t\t\n\t\tconst message = `commandsToSkipShell: ${JSON.stringify(commandsToSkipShell)}\n` +\n\t\t\t`Contains 'crowd-pilot.testRun': ${hasCommand}`;\n\t\t\n\t\tconsole.log('[Crowd Pilot] Config check:', message);\n\t\tvscode.window.showInformationMessage(message, { modal: true });\n\t});\n\n",typescript,content
+18,110559,"src/extension.ts",1065,0,"\t\tconsole.log('[Crowd Pilot] testRun command executed');\n",typescript,content
+19,110559,"src/extension.ts",773,201,"\t\tconst updated = [...commandsToSkipShell, 'crowd-pilot.testRun'];\n\t\tconfig.update('commandsToSkipShell', updated, vscode.ConfigurationTarget.Global, false)\n\t\t\t.then(() => {\n\t\t\t\tconsole.log('[Crowd Pilot] Successfully updated commandsToSkipShell');\n\t\t\t\t// Verify it was set\n\t\t\t\tconst verifyConfig = vscode.workspace.getConfiguration('terminal.integrated');\n\t\t\t\tconst verify = verifyConfig.get('commandsToSkipShell', []);\n\t\t\t\tconsole.log('[Crowd Pilot] Verified commandsToSkipShell:', JSON.stringify(verify));\n\t\t\t\tif (!verify.includes('crowd-pilot.testRun')) {\n\t\t\t\t\tconsole.error('[Crowd Pilot] ERROR: Configuration update did not persist!');\n\t\t\t\t}\n\t\t\t})\n\t\t\t.catch((err) => {\n\t\t\t\tconsole.error('[Crowd Pilot] ERROR updating commandsToSkipShell:', err);\n\t\t\t});\n\t} else {\n\t\tconsole.log('[Crowd Pilot] testRun already in commandsToSkipShell');",typescript,content
+20,110559,"src/extension.ts",712,0,"\tconsole.log('[Crowd Pilot] Current commandsToSkipShell:', JSON.stringify(commandsToSkipShell));\n\t\n",typescript,content
+21,110559,"src/extension.ts",371,48,"\tconsole.log('[Crowd Pilot] Extension activated');",typescript,content
+22,128376,"src/extension.ts",3195,42,"",typescript,content
+23,128376,"src/extension.ts",2515,642,"",typescript,content
+24,128376,"src/extension.ts",1814,57,"",typescript,content
+25,128376,"src/extension.ts",874,849,"\t\tcommandsToSkipShell.push('crowd-pilot.testRun');\n\t\tconfig.update('commandsToSkipShell', commandsToSkipShell, vscode.ConfigurationTarget.Global);\n\t\tconsole.log('Added testRun to commandsToSkipShell');",typescript,content
+26,128376,"src/extension.ts",714,99,"",typescript,content
+27,128376,"src/extension.ts",371,50,"\tconsole.log('Crowd Pilot extension activated');",typescript,content
+28,128418,"src/extension.ts",1747,0,"\tcontext.subscriptions.push(checkConfig);\n",typescript,content
+29,128418,"src/extension.ts",1709,0,"\t// Diagnostic command to check keybinding configuration\n\tconst checkConfig = vscode.commands.registerCommand('crowd-pilot.checkConfig', () => {\n\t\tconst config = vscode.workspace.getConfiguration('terminal.integrated');\n\t\tconst commandsToSkipShell = config.get('commandsToSkipShell', []);\n\t\tconst hasCommand = commandsToSkipShell.includes('crowd-pilot.testRun');\n\t\t\n\t\tconst message = `commandsToSkipShell: ${JSON.stringify(commandsToSkipShell)}\n` +\n\t\t\t`Contains 'crowd-pilot.testRun': ${hasCommand}`;\n\t\t\n\t\tconsole.log('[Crowd Pilot] Config check:', message);\n\t\tvscode.window.showInformationMessage(message, { modal: true });\n\t});\n\n",typescript,content
+30,128418,"src/extension.ts",1065,0,"\t\tconsole.log('[Crowd Pilot] testRun command executed');\n",typescript,content
+31,128418,"src/extension.ts",773,201,"\t\tconst updated = [...commandsToSkipShell, 'crowd-pilot.testRun'];\n\t\tconfig.update('commandsToSkipShell', updated, vscode.ConfigurationTarget.Global, false)\n\t\t\t.then(() => {\n\t\t\t\tconsole.log('[Crowd Pilot] Successfully updated commandsToSkipShell');\n\t\t\t\t// Verify it was set\n\t\t\t\tconst verifyConfig = vscode.workspace.getConfiguration('terminal.integrated');\n\t\t\t\tconst verify = verifyConfig.get('commandsToSkipShell', []);\n\t\t\t\tconsole.log('[Crowd Pilot] Verified commandsToSkipShell:', JSON.stringify(verify));\n\t\t\t\tif (!verify.includes('crowd-pilot.testRun')) {\n\t\t\t\t\tconsole.error('[Crowd Pilot] ERROR: Configuration update did not persist!');\n\t\t\t\t}\n\t\t\t}, (err: unknown) => {\n\t\t\t\tconsole.error('[Crowd Pilot] ERROR updating commandsToSkipShell:', err);\n\t\t\t});\n\t} else {\n\t\tconsole.log('[Crowd Pilot] testRun already in commandsToSkipShell');",typescript,content
+32,128418,"src/extension.ts",712,0,"\tconsole.log('[Crowd Pilot] Current commandsToSkipShell:', JSON.stringify(commandsToSkipShell));\n\t\n",typescript,content
+33,128418,"src/extension.ts",371,48,"\tconsole.log('[Crowd Pilot] Extension activated');",typescript,content
+34,135218,"package.json",0,0,"{\n ""name"": ""crowd-pilot"",\n ""displayName"": ""crowd-pilot-extension"",\n ""description"": ""Teaching language models to code like humans."",\n ""version"": ""0.0.1"",\n ""engines"": {\n ""vscode"": ""^1.99.3""\n },\n ""categories"": [\n ""Other""\n ],\n ""activationEvents"": [],\n ""main"": ""./out/extension.js"",\n ""contributes"": {\n ""commands"": [\n {\n ""command"": ""crowd-pilot.testRun"",\n ""title"": ""Test Run""\n },\n {\n ""command"": ""crowd-pilot.checkConfig"",\n ""title"": ""Check Keybinding Config""\n }\n ],\n ""keybindings"": [\n {\n ""command"": ""crowd-pilot.testRun"",\n ""key"": ""tab"",\n ""mac"": ""tab"",\n ""when"": ""editorTextFocus || terminalFocus""\n }\n ]\n },\n ""scripts"": {\n ""vscode:prepublish"": ""npm run compile"",\n ""compile"": ""tsc -p ./"",\n ""watch"": ""tsc -watch -p ./"",\n ""pretest"": ""npm run compile && npm run lint"",\n ""lint"": ""eslint src"",\n ""test"": ""vscode-test""\n },\n ""devDependencies"": {\n ""@types/vscode"": ""^1.105.0"",\n ""@types/mocha"": ""^10.0.10"",\n ""@types/node"": ""22.x"",\n ""@typescript-eslint/eslint-plugin"": ""^8.45.0"",\n ""@typescript-eslint/parser"": ""^8.45.0"",\n ""eslint"": ""^9.36.0"",\n ""typescript"": ""^5.9.3"",\n ""@vscode/test-cli"": ""^0.0.11"",\n ""@vscode/test-electron"": ""^2.5.2""\n }\n}\n",json,tab
+35,137453,"src/extension.ts",0,0,"",typescript,tab
+36,177116,"src/extension.ts",1721,0,"",typescript,selection_command
+37,213229,"package.json",0,0,"",json,tab
+38,216261,"src/extension.ts",0,0,"",typescript,tab
+39,286905,"src/extension.ts",1734,0,"",typescript,selection_command
+40,287709,"src/extension.ts",1726,0,"",typescript,selection_command
+41,287778,"src/extension.ts",1724,0,"",typescript,selection_command
+42,288152,"src/extension.ts",1659,0,"",typescript,selection_command
+43,318683,"src/extension.ts",3194,42,"",typescript,content
+44,318683,"src/extension.ts",2514,642,"",typescript,content
+45,318683,"src/extension.ts",1813,57,"",typescript,content
+46,318683,"src/extension.ts",874,848,"\t\tcommandsToSkipShell.push('crowd-pilot.testRun');\n\t\tconfig.update('commandsToSkipShell', commandsToSkipShell, vscode.ConfigurationTarget.Global);\n\t\tconsole.log('Added testRun to commandsToSkipShell');",typescript,content
+47,318683,"src/extension.ts",714,99,"",typescript,content
+48,318683,"src/extension.ts",371,50,"\tconsole.log('Crowd Pilot extension activated');",typescript,content
+49,318735,"src/extension.ts",1747,0,"\tcontext.subscriptions.push(checkConfig);\n",typescript,content
+50,318735,"src/extension.ts",1709,0,"\t// Diagnostic command to check keybinding configuration\n\tconst checkConfig = vscode.commands.registerCommand('crowd-pilot.checkConfig', () => {\n\t\tconst config = vscode.workspace.getConfiguration('terminal.integrated');\n\t\tconst commandsToSkipShell = config.get('commandsToSkipShell', []);\n\t\tconst hasCommand = commandsToSkipShell.includes('crowd-pilot.testRun');\n\t\t\n\t\tconst message = `commandsToSkipShell: ${JSON.stringify(commandsToSkipShell)}\n` +\n\t\t\t`Contains 'crowd-pilot.testRun': ${hasCommand}`;\n\t\t\n\t\tconsole.log('[Crowd Pilot] Config check:', message);\n\t\tvscode.window.showInformationMessage(message, { modal: true });\n\t});\n\n",typescript,content
+51,318735,"src/extension.ts",1065,0,"\t\t// Log context information\n\t\tconst activeEditor = vscode.window.activeTextEditor;\n\t\tconst activeTerminal = vscode.window.activeTerminal;\n\t\tconst contextInfo = {\n\t\t\thasActiveEditor: !!activeEditor,\n\t\t\thasActiveTerminal: !!activeTerminal,\n\t\t\tactiveTerminalName: activeTerminal?.name,\n\t\t\tterminalCount: vscode.window.terminals.length,\n\t\t};\n\t\tconsole.log('[Crowd Pilot] testRun command executed', JSON.stringify(contextInfo));\n\t\t\n",typescript,content
+52,318735,"src/extension.ts",773,201,"\t\tconst updated = [...commandsToSkipShell, 'crowd-pilot.testRun'];\n\t\tconfig.update('commandsToSkipShell', updated, vscode.ConfigurationTarget.Global, false)\n\t\t\t.then(() => {\n\t\t\t\tconsole.log('[Crowd Pilot] Successfully updated commandsToSkipShell');\n\t\t\t\t// Verify it was set\n\t\t\t\tconst verifyConfig = vscode.workspace.getConfiguration('terminal.integrated');\n\t\t\t\tconst verify = verifyConfig.get('commandsToSkipShell', []);\n\t\t\t\tconsole.log('[Crowd Pilot] Verified commandsToSkipShell:', JSON.stringify(verify));\n\t\t\t\tif (!verify.includes('crowd-pilot.testRun')) {\n\t\t\t\t\tconsole.error('[Crowd Pilot] ERROR: Configuration update did not persist!');\n\t\t\t\t}\n\t\t\t}, (err: unknown) => {\n\t\t\t\tconsole.error('[Crowd Pilot] ERROR updating commandsToSkipShell:', err);\n\t\t\t});\n\t} else {\n\t\tconsole.log('[Crowd Pilot] testRun already in commandsToSkipShell');",typescript,content
+53,318735,"src/extension.ts",712,0,"\tconsole.log('[Crowd Pilot] Current commandsToSkipShell:', JSON.stringify(commandsToSkipShell));\n\t\n",typescript,content
+54,318735,"src/extension.ts",371,48,"\tconsole.log('[Crowd Pilot] Extension activated');",typescript,content
+55,323409,"src/extension.ts",3565,42,"",typescript,content
+56,323409,"src/extension.ts",2885,642,"",typescript,content
+57,323409,"src/extension.ts",1813,428,"",typescript,content
+58,323409,"src/extension.ts",874,848,"\t\tcommandsToSkipShell.push('crowd-pilot.testRun');\n\t\tconfig.update('commandsToSkipShell', commandsToSkipShell, vscode.ConfigurationTarget.Global);\n\t\tconsole.log('Added testRun to commandsToSkipShell');",typescript,content
+59,323409,"src/extension.ts",714,99,"",typescript,content
+60,323409,"src/extension.ts",371,50,"\tconsole.log('Crowd Pilot extension activated');",typescript,content
+61,323453,"src/extension.ts",1747,0,"\tcontext.subscriptions.push(checkConfig);\n",typescript,content
+62,323454,"src/extension.ts",1709,0,"\t// Diagnostic command to check keybinding configuration and context\n\tconst checkConfig = vscode.commands.registerCommand('crowd-pilot.checkConfig', () => {\n\t\tconst config = vscode.workspace.getConfiguration('terminal.integrated');\n\t\tconst commandsToSkipShell = config.get('commandsToSkipShell', []);\n\t\tconst hasCommand = commandsToSkipShell.includes('crowd-pilot.testRun');\n\t\t\n\t\tconst activeEditor = vscode.window.activeTextEditor;\n\t\tconst activeTerminal = vscode.window.activeTerminal;\n\t\t\n\t\tconst message = `commandsToSkipShell: ${JSON.stringify(commandsToSkipShell)}\n` +\n\t\t\t`Contains 'crowd-pilot.testRun': ${hasCommand}\n` +\n\t\t\t`Active Editor: ${activeEditor ? 'Yes' : 'No'}\n` +\n\t\t\t`Active Terminal: ${activeTerminal ? activeTerminal.name : 'No'}\n` +\n\t\t\t`Terminal Count: ${vscode.window.terminals.length}`;\n\t\t\n\t\tconsole.log('[Crowd Pilot] Config check:', message);\n\t\tvscode.window.showInformationMessage(message, { modal: true });\n\t});\n\n",typescript,content
+63,323454,"src/extension.ts",1065,0,"\t\t// Log context information\n\t\tconst activeEditor = vscode.window.activeTextEditor;\n\t\tconst activeTerminal = vscode.window.activeTerminal;\n\t\tconst contextInfo = {\n\t\t\thasActiveEditor: !!activeEditor,\n\t\t\thasActiveTerminal: !!activeTerminal,\n\t\t\tactiveTerminalName: activeTerminal?.name,\n\t\t\tterminalCount: vscode.window.terminals.length,\n\t\t};\n\t\tconsole.log('[Crowd Pilot] testRun command executed', JSON.stringify(contextInfo));\n\t\t\n",typescript,content
+64,323454,"src/extension.ts",773,201,"\t\tconst updated = [...commandsToSkipShell, 'crowd-pilot.testRun'];\n\t\tconfig.update('commandsToSkipShell', updated, vscode.ConfigurationTarget.Global, false)\n\t\t\t.then(() => {\n\t\t\t\tconsole.log('[Crowd Pilot] Successfully updated commandsToSkipShell');\n\t\t\t\t// Verify it was set\n\t\t\t\tconst verifyConfig = vscode.workspace.getConfiguration('terminal.integrated');\n\t\t\t\tconst verify = verifyConfig.get('commandsToSkipShell', []);\n\t\t\t\tconsole.log('[Crowd Pilot] Verified commandsToSkipShell:', JSON.stringify(verify));\n\t\t\t\tif (!verify.includes('crowd-pilot.testRun')) {\n\t\t\t\t\tconsole.error('[Crowd Pilot] ERROR: Configuration update did not persist!');\n\t\t\t\t}\n\t\t\t}, (err: unknown) => {\n\t\t\t\tconsole.error('[Crowd Pilot] ERROR updating commandsToSkipShell:', err);\n\t\t\t});\n\t} else {\n\t\tconsole.log('[Crowd Pilot] testRun already in commandsToSkipShell');",typescript,content
+65,323454,"src/extension.ts",712,0,"\tconsole.log('[Crowd Pilot] Current commandsToSkipShell:', JSON.stringify(commandsToSkipShell));\n\t\n",typescript,content
+66,323454,"src/extension.ts",371,48,"\tconsole.log('[Crowd Pilot] Extension activated');",typescript,content
+67,433961,"src/extension.ts",3877,42,"",typescript,content
+68,433961,"src/extension.ts",2885,954,"",typescript,content
+69,433961,"src/extension.ts",1813,428,"",typescript,content
+70,433962,"src/extension.ts",874,848,"\t\tcommandsToSkipShell.push('crowd-pilot.testRun');\n\t\tconfig.update('commandsToSkipShell', commandsToSkipShell, vscode.ConfigurationTarget.Global);\n\t\tconsole.log('Added testRun to commandsToSkipShell');",typescript,content
+71,433962,"src/extension.ts",714,99,"",typescript,content
+72,433962,"src/extension.ts",371,50,"\tconsole.log('Crowd Pilot extension activated');",typescript,content
+73,434008,"src/extension.ts",1747,0,"\tcontext.subscriptions.push(checkConfig);\n",typescript,content
+74,434008,"src/extension.ts",1709,0,"\t// Diagnostic command to check keybinding configuration and context\n\tconst checkConfig = vscode.commands.registerCommand('crowd-pilot.checkConfig', () => {\n\t\tconst config = vscode.workspace.getConfiguration('terminal.integrated');\n\t\tconst commandsToSkipShell = config.get('commandsToSkipShell', []);\n\t\tconst hasCommand = commandsToSkipShell.includes('crowd-pilot.testRun');\n\t\t\n\t\tconst activeEditor = vscode.window.activeTextEditor;\n\t\tconst activeTerminal = vscode.window.activeTerminal;\n\t\t\n\t\tconst message = `commandsToSkipShell: ${JSON.stringify(commandsToSkipShell)}\n` +\n\t\t\t`Contains 'crowd-pilot.testRun': ${hasCommand}\n` +\n\t\t\t`Active Editor: ${activeEditor ? 'Yes' : 'No'}\n` +\n\t\t\t`Active Terminal: ${activeTerminal ? activeTerminal.name : 'No'}\n` +\n\t\t\t`Terminal Count: ${vscode.window.terminals.length}`;\n\t\t\n\t\tconsole.log('[Crowd Pilot] Config check:', message);\n\t\tvscode.window.showInformationMessage(message, { modal: true });\n\t});\n\n",typescript,content
+75,434008,"src/extension.ts",1560,14,"\t\tterm.show(false); // ensure terminal gets focus (preserveFocus: false)\n\t\t// Small delay to ensure terminal has focus before sending text\n\t\tawait new Promise(resolve => setTimeout(resolve, 100));",typescript,content
+76,434008,"src/extension.ts",1375,44,"\t\tawait vscode.window.showTextDocument(doc, undefined, true); // preserveFocus: true",typescript,content
+77,434008,"src/extension.ts",1065,0,"\t\t// Log context information\n\t\tconst activeEditor = vscode.window.activeTextEditor;\n\t\tconst activeTerminal = vscode.window.activeTerminal;\n\t\tconst contextInfo = {\n\t\t\thasActiveEditor: !!activeEditor,\n\t\t\thasActiveTerminal: !!activeTerminal,\n\t\t\tactiveTerminalName: activeTerminal?.name,\n\t\t\tterminalCount: vscode.window.terminals.length,\n\t\t};\n\t\tconsole.log('[Crowd Pilot] testRun command executed', JSON.stringify(contextInfo));\n\t\t\n",typescript,content
+78,434008,"src/extension.ts",773,201,"\t\tconst updated = [...commandsToSkipShell, 'crowd-pilot.testRun'];\n\t\tconfig.update('commandsToSkipShell', updated, vscode.ConfigurationTarget.Global, false)\n\t\t\t.then(() => {\n\t\t\t\tconsole.log('[Crowd Pilot] Successfully updated commandsToSkipShell');\n\t\t\t\t// Verify it was set\n\t\t\t\tconst verifyConfig = vscode.workspace.getConfiguration('terminal.integrated');\n\t\t\t\tconst verify = verifyConfig.get('commandsToSkipShell', []);\n\t\t\t\tconsole.log('[Crowd Pilot] Verified commandsToSkipShell:', JSON.stringify(verify));\n\t\t\t\tif (!verify.includes('crowd-pilot.testRun')) {\n\t\t\t\t\tconsole.error('[Crowd Pilot] ERROR: Configuration update did not persist!');\n\t\t\t\t}\n\t\t\t}, (err: unknown) => {\n\t\t\t\tconsole.error('[Crowd Pilot] ERROR updating commandsToSkipShell:', err);\n\t\t\t});\n\t} else {\n\t\tconsole.log('[Crowd Pilot] testRun already in commandsToSkipShell');",typescript,content
+79,434008,"src/extension.ts",712,0,"\tconsole.log('[Crowd Pilot] Current commandsToSkipShell:', JSON.stringify(commandsToSkipShell));\n\t\n",typescript,content
+80,434008,"src/extension.ts",371,48,"\tconsole.log('[Crowd Pilot] Extension activated');",typescript,content
+81,442669,"src/extension.ts",635,0,"",typescript,selection_mouse
+82,442679,"src/extension.ts",634,0,"",typescript,selection_command
+83,465304,"src/extension.ts",0,4236,"// The module 'vscode' contains the VS Code extensibility API\n// Import the module and reference it with the alias vscode in your code below\nimport * as vscode from 'vscode';\n\n// This method is called when your extension is activated\n// Your extension is activated the very first time the command is executed\nexport function activate(context: vscode.ExtensionContext) {\n\n\tconsole.log('Crowd Pilot extension activated');\n\n\t// Configure terminal to allow tab keybinding to work\n\t// This makes the command skip the shell so VS Code can intercept tab in terminals\n\tconst config = vscode.workspace.getConfiguration('terminal.integrated');\n\tconst commandsToSkipShell = config.get('commandsToSkipShell', []);\n\tif (!commandsToSkipShell.includes('crowd-pilot.testRun')) {\n\t\tcommandsToSkipShell.push('crowd-pilot.testRun');\n\t\tconfig.update('commandsToSkipShell', commandsToSkipShell, vscode.ConfigurationTarget.Global);\n\t\tconsole.log('Added testRun to commandsToSkipShell');\n\t}\n\n\tconst testRun = vscode.commands.registerCommand('crowd-pilot.testRun', async () => {\n\t\tconst editor = vscode.window.activeTextEditor;\n\t\tconst doc = editor!.document;\n\t\tconst term = vscode.window.terminals[0] ?? vscode.window.createTerminal('Test');\n\t\tconst git = vscode.extensions.getExtension('vscode.git')?.exports?.getAPI(1);\n\t\tconst repo = git?.repositories?.[0];\n\t\n\t\t// Emit a few actions:\n\t\tawait vscode.window.showTextDocument(doc);\n\t\teditor!.selections = [new vscode.Selection(0, 0, 0, 0)];\n\t\tawait editor!.edit(e => e.insert(new vscode.Position(0, 0), 'hello world\n'));\n\t\tterm.show();\n\t\tterm.sendText('echo VSCode test');\n\t\t//await repo?.pull();\n\t\n\t\tvscode.window.showInformationMessage('All actions emitted');\n\t });\n\n\tcontext.subscriptions.push(testRun);\n}\n\n// This method is called when your extension is deactivated\nexport function deactivate() {}\n",typescript,content
+84,465319,"src/extension.ts",1747,0,"\tcontext.subscriptions.push(checkConfig);\n",typescript,content
+85,465319,"src/extension.ts",1709,0,"\t// Diagnostic command to check keybinding configuration and context\n\tconst checkConfig = vscode.commands.registerCommand('crowd-pilot.checkConfig', () => {\n\t\tconst config = vscode.workspace.getConfiguration('terminal.integrated');\n\t\tconst commandsToSkipShell = config.get('commandsToSkipShell', []);\n\t\tconst hasCommand = commandsToSkipShell.includes('crowd-pilot.testRun');\n\t\t\n\t\tconst activeEditor = vscode.window.activeTextEditor;\n\t\tconst activeTerminal = vscode.window.activeTerminal;\n\t\t\n\t\tconst message = `commandsToSkipShell: ${JSON.stringify(commandsToSkipShell)}\n` +\n\t\t\t`Contains 'crowd-pilot.testRun': ${hasCommand}\n` +\n\t\t\t`Active Editor: ${activeEditor ? 'Yes' : 'No'}\n` +\n\t\t\t`Active Terminal: ${activeTerminal ? activeTerminal.name : 'No'}\n` +\n\t\t\t`Terminal Count: ${vscode.window.terminals.length}`;\n\t\t\n\t\tconsole.log('[Crowd Pilot] Config check:', message);\n\t\tvscode.window.showInformationMessage(message, { modal: true });\n\t});\n\n",typescript,content
+86,465319,"src/extension.ts",1065,0,"\t\t// Log context information\n\t\tconst activeEditor = vscode.window.activeTextEditor;\n\t\tconst activeTerminal = vscode.window.activeTerminal;\n\t\tconst contextInfo = {\n\t\t\thasActiveEditor: !!activeEditor,\n\t\t\thasActiveTerminal: !!activeTerminal,\n\t\t\tactiveTerminalName: activeTerminal?.name,\n\t\t\tterminalCount: vscode.window.terminals.length,\n\t\t};\n\t\tconsole.log('[Crowd Pilot] testRun command executed', JSON.stringify(contextInfo));\n\t\t\n",typescript,content
+87,465319,"src/extension.ts",773,201,"\t\tconst updated = [...commandsToSkipShell, 'crowd-pilot.testRun'];\n\t\tconfig.update('commandsToSkipShell', updated, vscode.ConfigurationTarget.Global, false)\n\t\t\t.then(() => {\n\t\t\t\tconsole.log('[Crowd Pilot] Successfully updated commandsToSkipShell');\n\t\t\t\t// Verify it was set\n\t\t\t\tconst verifyConfig = vscode.workspace.getConfiguration('terminal.integrated');\n\t\t\t\tconst verify = verifyConfig.get('commandsToSkipShell', []);\n\t\t\t\tconsole.log('[Crowd Pilot] Verified commandsToSkipShell:', JSON.stringify(verify));\n\t\t\t\tif (!verify.includes('crowd-pilot.testRun')) {\n\t\t\t\t\tconsole.error('[Crowd Pilot] ERROR: Configuration update did not persist!');\n\t\t\t\t}\n\t\t\t}, (err: unknown) => {\n\t\t\t\tconsole.error('[Crowd Pilot] ERROR updating commandsToSkipShell:', err);\n\t\t\t});\n\t} else {\n\t\tconsole.log('[Crowd Pilot] testRun already in commandsToSkipShell');",typescript,content
+88,465319,"src/extension.ts",712,0,"\tconsole.log('[Crowd Pilot] Current commandsToSkipShell:', JSON.stringify(commandsToSkipShell));\n\t\n",typescript,content
+89,465319,"src/extension.ts",371,48,"\tconsole.log('[Crowd Pilot] Extension activated');",typescript,content
+90,466096,"src/extension.ts",610,0,"",typescript,selection_mouse
+91,6223328,".vscode/launch.json",0,0,"// A launch configuration that compiles the extension and then opens it inside a new window\n// Use IntelliSense to learn about possible attributes.\n// Hover to view descriptions of existing attributes.\n// For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387\n{\n\t""version"": ""0.2.0"",\n\t""configurations"": [\n\t\t{\n\t\t\t""name"": ""Run Extension"",\n\t\t\t""type"": ""extensionHost"",\n\t\t\t""request"": ""launch"",\n\t\t\t""args"": [\n\t\t\t\t""--extensionDevelopmentPath=${workspaceFolder}""\n\t\t\t],\n\t\t\t""timeout"": 20000, // Increase the timeout to 20 seconds\n\t\t\t""outFiles"": [\n\t\t\t\t""${workspaceFolder}/out/**/*.js""\n\t\t\t],\n\t\t\t""preLaunchTask"": ""${defaultBuildTask}""\n\t\t}\n\t]\n}\n",jsonc,tab
+92,6574889,"src/extension.ts",0,0,"",typescript,tab
+93,6574918,"src/extension.ts",423,0,"",typescript,selection_command
diff --git a/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-e1a5354c-3f18-4c4b-94a1-64c1669feac51757270084100-2025_09_07-20.34.50.955/source.csv b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-e1a5354c-3f18-4c4b-94a1-64c1669feac51757270084100-2025_09_07-20.34.50.955/source.csv
new file mode 100644
index 0000000000000000000000000000000000000000..458894696f3d9502139939a038a4e9eb48416a1d
--- /dev/null
+++ b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-e1a5354c-3f18-4c4b-94a1-64c1669feac51757270084100-2025_09_07-20.34.50.955/source.csv
@@ -0,0 +1,43 @@
+Sequence,Time,File,RangeOffset,RangeLength,Text,Language,Type
+1,3,"utils/nn.py",0,0,"import math\nfrom typing import Tuple, Callable, List\n\nfrom flax import nnx\nimport jax\nimport jax.numpy as jnp\nimport einops\n\n\nclass SpatioTemporalPositionalEncoding(nnx.Module):\n """"""\n Applies separate sinusoidal positional encodings to the temporal and spatial dimensions.\n """"""\n\n def __init__(self, d_model: int, max_len: int = 5000):\n self.d_model = d_model\n self.max_len = max_len\n\n pe = jnp.zeros((self.max_len, self.d_model))\n position = jnp.arange(0, self.max_len, dtype=jnp.float32)[:, None]\n div_term = jnp.exp(\n jnp.arange(0, self.d_model, 2) * (-math.log(10000.0) / self.d_model)\n )\n pe = pe.at[:, 0::2].set(jnp.sin(position * div_term))\n pe = pe.at[:, 1::2].set(jnp.cos(position * div_term))\n self.pe = nnx.Variable(pe)\n\n def __call__(self, x: jax.Array) -> jax.Array:\n """"""\n Args:\n x: The input tensor of shape (Batch, Time, Space, Dimension).\n\n Returns:\n The input tensor with positional encodings added.\n """"""\n assert x.ndim == 4, f""Input must be 4-dimensional, but got shape {x.shape}""\n\n num_timesteps = x.shape[1]\n num_spatial_patches = x.shape[2]\n\n # Temporal positional encoding: (1, T, 1, D)\n temporal_pe = self.pe.value[None, :num_timesteps, None, :]\n x = x + temporal_pe\n\n # Spatial positional encoding: (1, 1, S, D)\n spatial_pe = self.pe.value[None, None, :num_spatial_patches, :]\n x = x + spatial_pe\n\n return x\n\n\nclass STBlock(nnx.Module):\n def __init__(\n self,\n dim: int,\n ffn_dim: int,\n num_heads: int,\n dropout: float,\n param_dtype: jnp.dtype,\n dtype: jnp.dtype,\n use_flash_attention: bool,\n rngs: nnx.Rngs,\n sow_weights: bool,\n sow_activations: bool,\n ):\n self.dim = dim\n self.ffn_dim = ffn_dim\n self.num_heads = num_heads\n self.dropout = dropout\n self.param_dtype = param_dtype\n self.dtype = dtype\n self.use_flash_attention = use_flash_attention\n self.sow_weights = sow_weights\n self.sow_activations = sow_activations\n\n self.spatial_norm = nnx.LayerNorm(\n 
num_features=self.dim,\n param_dtype=self.param_dtype,\n dtype=self.param_dtype, # layer norm in full precision\n rngs=rngs,\n )\n self.spatial_attention = nnx.MultiHeadAttention(\n num_heads=self.num_heads,\n in_features=self.dim,\n qkv_features=self.dim,\n dropout_rate=self.dropout,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n attention_fn=_create_flash_attention_fn(\n self.use_flash_attention, is_causal=False\n ),\n rngs=rngs,\n decode=False,\n )\n\n self.temporal_norm = nnx.LayerNorm(\n num_features=self.dim,\n param_dtype=self.param_dtype,\n dtype=self.param_dtype, # layer norm in full precision\n rngs=rngs,\n )\n self.temporal_attention = nnx.MultiHeadAttention(\n num_heads=self.num_heads,\n in_features=self.dim,\n qkv_features=self.dim,\n dropout_rate=self.dropout,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n attention_fn=_create_flash_attention_fn(\n self.use_flash_attention, is_causal=True\n ),\n rngs=rngs,\n decode=False,\n )\n\n self.ffn_norm = nnx.LayerNorm(\n num_features=self.dim,\n param_dtype=self.param_dtype,\n dtype=self.param_dtype, # layer norm in full precision\n rngs=rngs,\n )\n self.ffn_dense1 = nnx.Linear(\n in_features=self.dim,\n out_features=self.ffn_dim,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n rngs=rngs,\n )\n self.ffn_dense2 = nnx.Linear(\n in_features=self.ffn_dim,\n out_features=self.dim,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n rngs=rngs,\n )\n\n @nnx.remat\n def __call__(self, x_BTNM: jax.Array) -> jax.Array:\n # --- Spatial attention ---\n z_BTNM = self.spatial_norm(x_BTNM)\n z_BTNM = self.spatial_attention(z_BTNM, sow_weights=self.sow_weights)\n x_BTNM = x_BTNM + z_BTNM\n\n # --- Temporal attention ---\n x_BNTM = x_BTNM.swapaxes(1, 2)\n z_BNTM = self.temporal_norm(x_BNTM)\n z_BNTM = self.temporal_attention(z_BNTM, sow_weights=self.sow_weights)\n x_BNTM = x_BNTM + z_BNTM\n x_BTNM = x_BNTM.swapaxes(1, 2)\n\n # --- Feedforward ---\n z_BTNM = self.ffn_norm(x_BTNM)\n z_BTND = 
self.ffn_dense1(z_BTNM)\n z_BTND = jax.nn.gelu(z_BTND)\n z_BTNM = self.ffn_dense2(z_BTND)\n x_BTNM = x_BTNM + z_BTNM\n if self.sow_activations:\n self.sow(nnx.Intermediate, ""activations"", x_BTNM)\n return x_BTNM\n\n\nclass STTransformer(nnx.Module):\n """"""\n Dimension keys:\n B: batch size\n T: number of frames\n N: number of patches per frame\n I: number of input features\n M: model dimension\n D: FFN dimension\n V: vocabulary size\n """"""\n\n def __init__(\n self,\n input_dim: int,\n model_dim: int,\n ffn_dim: int,\n out_dim: int,\n num_blocks: int,\n num_heads: int,\n dropout: float,\n param_dtype: jnp.dtype,\n dtype: jnp.dtype,\n use_flash_attention: bool,\n rngs: nnx.Rngs,\n sow_weights: bool = False,\n sow_activations: bool = False,\n sow_logits: bool = False,\n max_len: int = 5000,\n ):\n self.input_dim = input_dim\n self.model_dim = model_dim\n self.ffn_dim = ffn_dim\n self.out_dim = out_dim\n self.num_blocks = num_blocks\n self.num_heads = num_heads\n self.dropout = dropout\n self.param_dtype = param_dtype\n self.dtype = dtype\n self.use_flash_attention = use_flash_attention\n self.sow_logits = sow_logits\n self.sow_weights = sow_weights\n self.sow_activations = sow_activations\n\n self.input_norm1 = nnx.LayerNorm(\n num_features=self.input_dim,\n param_dtype=self.param_dtype,\n dtype=self.param_dtype, # layer norm in full precision\n rngs=rngs,\n )\n self.input_dense = nnx.Linear(\n in_features=self.input_dim,\n out_features=self.model_dim,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n rngs=rngs,\n )\n self.input_norm2 = nnx.LayerNorm(\n num_features=self.model_dim,\n param_dtype=self.param_dtype,\n dtype=self.param_dtype, # layer norm in full precision\n rngs=rngs,\n )\n\n self.pos_enc = SpatioTemporalPositionalEncoding(self.model_dim, max_len=max_len)\n\n self.blocks = []\n for _ in range(self.num_blocks):\n self.blocks.append(\n STBlock(\n dim=self.model_dim,\n ffn_dim=self.ffn_dim,\n num_heads=self.num_heads,\n dropout=self.dropout,\n 
param_dtype=self.param_dtype,\n dtype=self.dtype,\n use_flash_attention=self.use_flash_attention,\n rngs=rngs,\n sow_weights=self.sow_weights,\n sow_activations=self.sow_activations,\n )\n )\n\n self.output_dense = nnx.Linear(\n in_features=self.model_dim,\n out_features=self.out_dim,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n rngs=rngs,\n )\n\n def __call__(self, x_BTNI: jax.Array) -> jax.Array:\n x_BTNI = self.input_norm1(x_BTNI)\n x_BTNM = self.input_dense(x_BTNI)\n x_BTNM = self.input_norm2(x_BTNM)\n x_BTNM = self.pos_enc(x_BTNM)\n for block in self.blocks:\n x_BTNM = block(x_BTNM)\n\n x_BTNV = self.output_dense(x_BTNM)\n if self.sow_logits:\n self.sow(nnx.Intermediate, ""logits"", x_BTNV)\n return x_BTNV\n\n\nclass TransformerBlock(nnx.Module):\n def __init__(\n self,\n model_dim: int,\n ffn_dim: int,\n num_heads: int,\n dropout: float,\n param_dtype: jnp.dtype,\n dtype: jnp.dtype,\n use_flash_attention: bool,\n decode: bool,\n rngs: nnx.Rngs,\n sow_weights: bool,\n sow_activations: bool,\n ):\n self.model_dim = model_dim\n self.ffn_dim = ffn_dim\n self.num_heads = num_heads\n self.dropout = dropout\n self.param_dtype = param_dtype\n self.dtype = dtype\n self.use_flash_attention = use_flash_attention\n self.decode = decode\n self.sow_weights = sow_weights\n self.sow_activations = sow_activations\n\n self.temporal_norm = nnx.LayerNorm(\n num_features=self.model_dim,\n param_dtype=self.param_dtype,\n dtype=self.param_dtype, # layer norm in full precision\n rngs=rngs,\n )\n self.spatial_norm = nnx.LayerNorm(\n num_features=self.model_dim,\n param_dtype=self.param_dtype,\n dtype=self.param_dtype, # layer norm in full precision\n rngs=rngs,\n )\n self.ffn_norm = nnx.LayerNorm(\n num_features=self.model_dim,\n param_dtype=self.param_dtype,\n dtype=self.param_dtype, # layer norm in full precision\n rngs=rngs,\n )\n self.temporal_attention = nnx.MultiHeadAttention(\n num_heads=self.num_heads,\n in_features=self.model_dim,\n qkv_features=self.model_dim,\n 
dropout_rate=self.dropout,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n attention_fn=_create_flash_attention_fn(\n self.use_flash_attention, is_causal=True\n ),\n rngs=rngs,\n decode=self.decode,\n )\n self.spatial_attention = nnx.MultiHeadAttention(\n num_heads=self.num_heads,\n in_features=self.model_dim,\n qkv_features=self.model_dim,\n dropout_rate=self.dropout,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n attention_fn=_create_flash_attention_fn(\n self.use_flash_attention, is_causal=True\n ),\n rngs=rngs,\n decode=self.decode,\n )\n self.ffn_dense1 = nnx.Linear(\n in_features=self.model_dim,\n out_features=self.ffn_dim,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n rngs=rngs,\n )\n self.ffn_dense2 = nnx.Linear(\n in_features=self.ffn_dim,\n out_features=self.model_dim,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n rngs=rngs,\n )\n\n @nnx.remat\n def __call__(\n self, x_BTNM: jax.Array, pos_index: Tuple[jax.Array, jax.Array] | None = None\n ) -> jax.Array:\n # --- Spatial attention ---\n B, T, N, M = x_BTNM.shape\n z_FNM = einops.rearrange(x_BTNM, ""b t n m -> (b t) n m"")\n z_FNM = self.spatial_norm(z_FNM)\n z_FNM = self.spatial_attention(z_FNM, sow_weights=self.sow_weights)\n z_BTNM = einops.rearrange(z_FNM, ""(b t) n m -> b t n m"", t=T)\n x_BTNM = x_BTNM + z_BTNM\n # --- Temporal attention ---\n z_PTM = einops.rearrange(x_BTNM, ""b t n m -> (b n) t m"")\n z_PTM = self.temporal_norm(z_PTM)\n z_PTM = self.temporal_attention(z_PTM, sow_weights=self.sow_weights)\n z_BTNM = einops.rearrange(z_PTM, ""(b n) t m -> b t n m"", n=N)\n x_BTNM = x_BTNM + z_BTNM\n # --- Feedforward ---\n z_BTNM = self.ffn_norm(x_BTNM)\n z_BTND = self.ffn_dense1(z_BTNM)\n z_BTND = jax.nn.gelu(z_BTND)\n z_BTNM = self.ffn_dense2(z_BTND)\n x_BTNM = x_BTNM + z_BTNM\n if self.sow_activations:\n self.sow(nnx.Intermediate, ""activations"", x_BTNM)\n\n return x_BTNM\n\n\nclass Transformer(nnx.Module):\n """"""\n Dimension keys:\n B: batch size\n T: number of 
frames\n N: number of patches per frame\n I: number of input features\n M: model dimension\n D: FFN dimension\n V: vocabulary size\n F: number of frames in batch\n P: number of patch positions in batch\n """"""\n\n def __init__(\n self,\n input_dim: int,\n model_dim: int,\n ffn_dim: int,\n out_dim: int,\n num_blocks: int,\n num_heads: int,\n dropout: float,\n param_dtype: jnp.dtype,\n dtype: jnp.dtype,\n use_flash_attention: bool,\n decode: bool,\n rngs: nnx.Rngs,\n sow_logits: bool = False,\n sow_weights: bool = False,\n sow_activations: bool = False,\n max_len: int = 5000,\n ):\n self.input_dim = input_dim\n self.model_dim = model_dim\n self.ffn_dim = ffn_dim\n self.out_dim = out_dim\n self.num_blocks = num_blocks\n self.num_heads = num_heads\n self.dropout = dropout\n self.param_dtype = param_dtype\n self.dtype = dtype\n self.use_flash_attention = use_flash_attention\n self.sow_logits = sow_logits\n self.sow_weights = sow_weights\n self.sow_activations = sow_activations\n\n self.input_norm1 = nnx.LayerNorm(\n num_features=self.input_dim,\n param_dtype=self.param_dtype,\n dtype=self.param_dtype, # layer norm in full precision\n rngs=rngs,\n )\n self.input_dense = nnx.Linear(\n in_features=self.input_dim,\n out_features=self.model_dim,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n rngs=rngs,\n )\n self.input_norm2 = nnx.LayerNorm(\n num_features=self.model_dim,\n param_dtype=self.param_dtype,\n dtype=self.param_dtype, # layer norm in full precision\n rngs=rngs,\n )\n\n self.pos_enc = SpatioTemporalPositionalEncoding(self.model_dim, max_len=max_len)\n\n self.blocks: List[TransformerBlock] = []\n for _ in range(self.num_blocks):\n self.blocks.append(\n TransformerBlock(\n model_dim=self.model_dim,\n ffn_dim=self.ffn_dim,\n num_heads=self.num_heads,\n dropout=self.dropout,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n use_flash_attention=self.use_flash_attention,\n decode=decode,\n sow_weights=self.sow_weights,\n 
sow_activations=self.sow_activations,\n rngs=rngs,\n )\n )\n self.output_dense = nnx.Linear(\n in_features=self.model_dim,\n out_features=self.out_dim,\n param_dtype=self.param_dtype,\n dtype=self.dtype,\n rngs=rngs,\n )\n\n def __call__(\n self, x_BTNI: jax.Array, pos_index: Tuple[jax.Array, jax.Array] | None = None\n ) -> jax.Array:\n x_BTNI = self.input_norm1(x_BTNI)\n x_BTNM = self.input_dense(x_BTNI)\n x_BTNM = self.input_norm2(x_BTNM)\n x_BTNM = self.pos_enc(x_BTNM)\n for block in self.blocks:\n x_BTNM = block(x_BTNM, pos_index)\n\n x_BTNV = self.output_dense(x_BTNM)\n if self.sow_logits:\n self.sow(nnx.Intermediate, ""logits"", x_BTNV)\n return x_BTNV\n\n\ndef normalize(x: jax.Array) -> jax.Array:\n return x / (jnp.linalg.norm(x, ord=2, axis=-1, keepdims=True) + 1e-8)\n\n\nclass VectorQuantizer(nnx.Module):\n """"""\n Dimension keys:\n D: B * T * N\n K: number of latents\n L: latent dimension\n """"""\n\n def __init__(\n self,\n latent_dim: int,\n num_latents: int,\n dropout: float,\n dtype: jnp.dtype,\n rngs: nnx.Rngs,\n ):\n self.latent_dim = latent_dim\n self.num_latents = num_latents\n self.dropout = dropout\n self.dtype = dtype\n\n self.codebook = nnx.Param(\n normalize(\n nnx.initializers.lecun_uniform()(\n rngs.params(), (self.num_latents, self.latent_dim)\n )\n )\n )\n self.drop = nnx.Dropout(self.dropout, rngs=rngs)\n\n def __call__(\n self, x_DL: jax.Array, training: bool\n ) -> Tuple[jax.Array, jax.Array, jax.Array, jax.Array]:\n # --- Compute distances ---\n x_DL = x_DL.astype(self.dtype)\n codebook = self.codebook.value.astype(self.dtype)\n\n x_normalized_DL = normalize(x_DL)\n normalized_codebook_KL = normalize(codebook)\n distance_DK = -jnp.matmul(x_normalized_DL, normalized_codebook_KL.T)\n if training:\n distance_DK = self.drop(distance_DK)\n\n # --- Get indices and embeddings ---\n indices_D = jnp.argmin(distance_DK, axis=-1)\n z_DL = codebook[indices_D]\n\n # --- Straight through estimator ---\n z_q_DL = x_DL + jax.lax.stop_gradient(z_DL - 
x_DL)\n return z_q_DL, z_DL, x_DL, indices_D\n\n def get_codes(self, indices_E: jax.Array) -> jax.Array:\n return self.codebook[indices_E]\n\n\ndef _create_flash_attention_fn(use_flash_attention: bool, is_causal: bool) -> Callable:\n """"""\n Create an attention function that uses flash attention if enabled.\n\n flax.nnx.MultiHeadAttention provides tensors with shape (batch..., length, num_heads, head_dim),\n but jax.nn.dot_product_attention expects (batch, length, num_heads, head_dim). We reshape to\n ensure compatibility. cuDNN's flash attention additionally requires a sequence length that\n is a multiple of 4. We pad the sequence length to the nearest multiple of 4 and mask\n accordingly. Note that cuDNN requires the mask to be broadcast before calling the attention\n function due to strict shape checking.\n """"""\n\n def attention_fn(\n query_BTHD, key_BSHD, value_BSHD, bias=None, mask_B111=None, **kwargs\n ):\n implementation = ""cudnn"" if use_flash_attention else None\n\n def _merge_batch_dims(x):\n return einops.rearrange(x, ""... l h k -> (...) 
l h k"")\n\n def _pad(x, pad_size):\n return jnp.pad(x, ((0, 0), (0, pad_size), (0, 0), (0, 0)))\n\n original_shape = query_BTHD.shape\n T = query_BTHD.shape[-3]\n S = key_BSHD.shape[-3]\n\n # Pad to nearest multiple of 4\n Q = ((T + 3) // 4) * 4\n pad_size_Q = Q - T\n K = ((S + 3) // 4) * 4\n pad_size_K = K - S\n\n query_BQHD = _pad(_merge_batch_dims(query_BTHD), pad_size_Q)\n key_BKHD = _pad(_merge_batch_dims(key_BSHD), pad_size_K)\n value_BKHD = _pad(_merge_batch_dims(value_BSHD), pad_size_K)\n\n attention_mask = jnp.ones((Q, K), dtype=jnp.bool_)\n attention_mask = attention_mask.at[T:, :].set(False)\n attention_mask = attention_mask.at[:, S:].set(False)\n\n mask_11TS = attention_mask[jnp.newaxis, jnp.newaxis, :, :]\n\n bias_4d = (\n jnp.pad(\n _merge_batch_dims(bias),\n ((0, 0), (0, 0), (0, pad_size_Q), (0, pad_size_K)),\n )\n if bias is not None\n else None\n )\n\n # NOTE: jax.nn.dot_product_attention does not support dropout\n output_4d = jax.nn.dot_product_attention(\n query=query_BQHD,\n key=key_BKHD,\n value=value_BKHD,\n bias=bias_4d,\n mask=mask_11TS,\n implementation=implementation,\n is_causal=is_causal,\n )\n return output_4d[..., :T, :, :].reshape(original_shape)\n\n return attention_fn\n",python,tab
+2,149,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"8:34:50 PM [info] Activating crowd-code\n8:34:50 PM [info] Recording started\n8:34:50 PM [info] Initializing git provider using file system watchers...\n",Log,tab
+3,187,"extension-output-pdoom-org.crowd-code-#1-crowd-code",150,0,"8:34:51 PM [info] Git repository found\n8:34:51 PM [info] Git provider initialized successfully\n8:34:51 PM [info] Initial git state: [object Object]\n",Log,content
+4,3399,"utils/nn.py",0,0,"",python,tab
+5,5944,"utils/nn.py",16950,0,"",python,selection_command
+6,6119,"utils/nn.py",16918,0,"",python,selection_command
+7,6363,"utils/nn.py",16879,0,"",python,selection_command
+8,6384,"utils/nn.py",16843,0,"",python,selection_command
+9,6419,"utils/nn.py",16783,0,"",python,selection_command
+10,6454,"utils/nn.py",16737,0,"",python,selection_command
+11,6487,"utils/nn.py",16709,0,"",python,selection_command
+12,6520,"utils/nn.py",16692,0,"",python,selection_command
+13,6551,"utils/nn.py",16661,0,"",python,selection_command
+14,6861,"utils/nn.py",16692,0,"",python,selection_command
+15,7020,"utils/nn.py",16709,0,"",python,selection_command
+16,7314,"utils/nn.py",16692,0,"",python,selection_command
+17,21332,"utils/nn.py",16661,0,"",python,selection_command
+18,21589,"utils/nn.py",16692,0,"",python,selection_command
+19,23753,"utils/nn.py",16709,0,"",python,selection_command
+20,23936,"utils/nn.py",16737,0,"",python,selection_command
+21,24089,"utils/nn.py",16783,0,"",python,selection_command
+22,24270,"utils/nn.py",16843,0,"",python,selection_command
+23,24431,"utils/nn.py",16879,0,"",python,selection_command
+24,24599,"utils/nn.py",16918,0,"",python,selection_command
+25,24737,"utils/nn.py",16950,0,"",python,selection_command
+26,26706,"utils/nn.py",16918,0,"",python,selection_command
+27,26890,"utils/nn.py",16879,0,"",python,selection_command
+28,35158,"utils/nn.py",15971,0,"",python,selection_command
+29,38308,"models/tokenizer.py",0,0,"from typing import Dict, Tuple\n\nimport flax.nnx as nnx\nimport jax.numpy as jnp\nimport jax\n\nfrom utils.preprocess import patchify, unpatchify\nfrom utils.nn import STTransformer, VectorQuantizer\n\n\nclass TokenizerVQVAE(nnx.Module):\n """"""\n ST-ViVit VQ-VAE\n\n Dimension keys:\n B: batch size\n T: sequence length\n N: number of patches per frame\n L: latent dimension\n D: B * T * N\n H: height\n W: width\n C: number of channels\n P: patch token dimension (patch_size^2 * C)\n """"""\n\n def __init__(\n self,\n in_dim: int,\n model_dim: int,\n ffn_dim: int,\n latent_dim: int,\n num_latents: int,\n patch_size: int,\n num_blocks: int,\n num_heads: int,\n dropout: float,\n codebook_dropout: float,\n param_dtype: jnp.dtype,\n dtype: jnp.dtype,\n use_flash_attention: bool,\n rngs: nnx.Rngs,\n ):\n self.in_dim = in_dim\n self.model_dim = model_dim\n self.ffn_dim = ffn_dim\n self.latent_dim = latent_dim\n self.num_latents = num_latents\n self.patch_size = patch_size\n self.num_blocks = num_blocks\n self.num_heads = num_heads\n self.dropout = dropout\n self.codebook_dropout = codebook_dropout\n self.param_dtype = param_dtype\n self.dtype = dtype\n self.use_flash_attention = use_flash_attention\n\n self.encoder = STTransformer(\n self.in_dim * self.patch_size**2,\n self.model_dim,\n self.ffn_dim,\n self.latent_dim,\n self.num_blocks,\n self.num_heads,\n self.dropout,\n self.param_dtype,\n self.dtype,\n use_flash_attention=self.use_flash_attention,\n rngs=rngs,\n )\n self.vq = VectorQuantizer(\n self.latent_dim,\n self.num_latents,\n self.codebook_dropout,\n self.dtype,\n rngs=rngs,\n )\n self.out_dim = self.in_dim * self.patch_size**2\n self.decoder = STTransformer(\n self.latent_dim,\n self.model_dim,\n self.ffn_dim,\n self.out_dim,\n self.num_blocks,\n self.num_heads,\n self.dropout,\n self.param_dtype,\n self.dtype,\n use_flash_attention=self.use_flash_attention,\n rngs=rngs,\n )\n\n def __call__(\n self, batch: Dict[str, 
jax.Array], training: bool = True\n ) -> Dict[str, jax.Array]:\n H, W = batch[""videos""].shape[2:4]\n videos_BTHWC = batch[""videos""]\n outputs = self.vq_encode(videos_BTHWC, training)\n z_q_BTNL = outputs[""z_q""]\n recon_BTHWC = self.decoder(z_q_BTNL)\n recon_BTHWC = recon_BTHWC.astype(jnp.float32)\n recon_BTHWC = nnx.sigmoid(recon_BTHWC)\n recon_BTHWC = recon_BTHWC.astype(self.dtype)\n recon_BTHWC = unpatchify(recon_BTHWC, self.patch_size, H, W)\n outputs[""recon""] = recon_BTHWC\n return outputs\n\n def vq_encode(\n self, videos: jax.Array, training: bool = True\n ) -> Dict[str, jax.Array]:\n # --- Preprocess + encode ---\n B, T = videos.shape[:2]\n patch_BTNP = patchify(videos, self.patch_size)\n N = patch_BTNP.shape[2]\n x_BTNL = self.encoder(patch_BTNP)\n\n # --- Vector quantize ---\n x_DL = x_BTNL.reshape(B * T * N, self.latent_dim)\n z_q_DL, z_DL, emb_DL, indices_D = self.vq(x_DL, training)\n z_q_BTNL = z_q_DL.reshape(B, T, N, self.latent_dim)\n indices_BTN = indices_D.reshape(B, T, N)\n return dict(z_q=z_q_BTNL, z=z_DL, emb=emb_DL, indices=indices_BTN)\n\n def decode(self, indices_BTN: jax.Array, video_hw: Tuple[int, int]) -> jax.Array:\n z_BTNL = self.vq.codebook[indices_BTN]\n recon_BTNP = self.decoder(z_BTNL)\n recon_BTNP = recon_BTNP.astype(jnp.float32)\n recon_BTNP = nnx.sigmoid(recon_BTNP)\n recon_BTNP = recon_BTNP.astype(self.dtype)\n return unpatchify(recon_BTNP, self.patch_size, *video_hw)\n",python,tab
+30,38309,"models/tokenizer.py",1841,15,"VectorQuantizer",python,selection_command
+31,38548,"models/tokenizer.py",1855,0,"",python,selection_command
+32,38739,"models/tokenizer.py",1841,0,"",python,selection_command
+33,38880,"models/tokenizer.py",1839,0,"",python,selection_command
+34,39039,"models/tokenizer.py",1836,0,"",python,selection_command
+35,39473,"models/tokenizer.py",3502,0,"",python,selection_command
+36,40428,"models/tokenizer.py",3501,0,"",python,selection_command
+37,40583,"models/tokenizer.py",3497,0,"",python,selection_command
+38,40772,"models/tokenizer.py",3495,0,"",python,selection_command
+39,41280,"models/tokenizer.py",3437,0,"",python,selection_command
+40,41895,"models/tokenizer.py",3395,0,"",python,selection_command
+41,42041,"models/tokenizer.py",3362,0,"",python,selection_command
+42,42240,"models/tokenizer.py",3360,0,"",python,selection_command
diff --git a/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-f30029d7-35fb-408d-b8cf-377e2b67278b1762084766653-2025_11_02-12.59.29.208/source.csv b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-f30029d7-35fb-408d-b8cf-377e2b67278b1762084766653-2025_11_02-12.59.29.208/source.csv
new file mode 100644
index 0000000000000000000000000000000000000000..8ba4f7d9151ddc359fe832c52513daabe7296113
--- /dev/null
+++ b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-f30029d7-35fb-408d-b8cf-377e2b67278b1762084766653-2025_11_02-12.59.29.208/source.csv
@@ -0,0 +1,10591 @@
+Sequence,Time,File,RangeOffset,RangeLength,Text,Language,Type
+1,2,"examples/crowd_code.html",0,0,"\n\n\n\n \n \n \n \n\n\n\n \n \n \n \n \n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n Install once, and forget about it.\n
\n Neural networks are simulators. Anything you want your model to do, you have to teach it. Base models represent the mean of the data distribution of the internet.\n Post-training permits shifting that distribution towards desired behaviours. The higher the required skill, the scarcer the corresponding data on the internet.\n
\n
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n \n Figure 1: A preview of the crowd-code extension in action. Figure from Mattia Consiglio.\n
\n The extension periodically uploads the user's captured IDE actions to a server, where they are cleaned, filtered, thoroughly anonymized, and periodically released to the\n public under the most permissive Creative Commons license (CC0).\n An ongoing recording is transparently indicated in the IDE's status bar, and can be stopped at any time. If the user has inserted sensitive data, they can simply\n press the 'panic button' in the status bar to remove the last actions from the recording before they even leave the user's machine. Additionally, the user is asked for\n consent to participate in crowd-sourcing upon extension installation, and can opt-out at any time. We take user privacy very seriously and welcome any feedback on how to\n make data collection and release more transparent.\n
\n
\n Beyond behaviour-cloning on a dataset crowd-sourced using crowd-code, we eventually want to use crowd-code to annotate the entirety of IDE screencasts on the internet \n using an inverse dynamics model trained on screen recordings paired with crowd-code's IDE action annotations. This would unlock an entirely new trove of training data\n for software engineering agents.\n
\n
\n We are excited to see what the community builds with this dataset. We want to democratize AI research. We are greater than the sum of our parts. Together.\n
\n \n\n \n\n
Contributions
\n
FS worked on conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote the manuscript. We thank Gemini Code Assist and Cursor for their help in writing the extension.
\n \n \n \n \n\n \n\nWork l",html,content
+563,247349,"examples/crowd-clone.html",0,0,"",html,selection_command
+564,249341,"examples/crowd-clone.html",962,13,"",html,content
+565,249349,"examples/crowd-clone.html",965,1,"ing",html,content
+566,249354,"examples/crowd-clone.html",981,1,"L",html,content
+567,249355,"examples/crowd-clone.html",962,0,"",html,selection_command
+568,249581,"examples/crowd-clone.html",1037,2,"lon",html,content
+569,249587,"examples/crowd-clone.html",1036,0,"",html,selection_command
+570,249610,"examples/crowd-clone.html",1017,204,"",html,content
+571,249643,"examples/crowd-clone.html",1017,0,"We introduce crowd-clone, a continually growing dataset of human ",html,content
+572,249645,"examples/crowd-clone.html",1017,0,"",html,selection_command
+573,249677,"examples/crowd-clone.html",949,11,"",html,content
+574,249682,"examples/crowd-clone.html",949,0,"",html,selection_command
+575,250846,"examples/crowd-clone.html",6553,0,"",html,selection_command
+576,251986,"examples/crowd-clone.html",0,0,"",html,selection_command
+577,252159,"examples/crowd-clone.html",949,0,"crowd-clone",html,content
+578,252160,"examples/crowd-clone.html",949,0,"",html,selection_command
+579,252411,"examples/crowd-clone.html",1017,65,"",html,content
+580,252413,"examples/crowd-clone.html",1017,0,"",html,selection_command
+581,252441,"examples/crowd-clone.html",1017,0,"We introduce crowd-clone, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on. Install once, and forget about it.",html,content
+582,252443,"examples/crowd-clone.html",1017,0,"",html,selection_command
+583,252471,"examples/crowd-clone.html",1037,3,"od",html,content
+584,252472,"examples/crowd-clone.html",1036,0,"",html,selection_command
+585,252508,"examples/crowd-clone.html",981,1,"l",html,content
+586,252509,"examples/crowd-clone.html",965,3,"e",html,content
+587,252511,"examples/crowd-clone.html",962,0,"A Dataset to ",html,content
+588,252512,"examples/crowd-clone.html",962,0,"",html,selection_command
+589,252542,"examples/crowd-clone.html",0,6711,"",html,content
+590,252545,"examples/crowd-clone.html",2,0,"",html,selection_command
+591,253814,"examples/crowd-clone.html",0,0,"",html,selection_command
+592,255761,"examples/crowd-clone.html",0,2,"",html,content
+593,258915,"examples/crowd_code.html",0,0,"",html,tab
+594,259232,"examples/crowd_code.html",0,6713,"\n\n\n\n \n \n \n \n\n\n\n \n \n \n \n \n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n Install once, and forget about it.\n
\n Neural networks are simulators. Anything you want your model to do, you have to teach it. Base models represent the mean of the data distribution of the internet.\n Post-training permits shifting that distribution towards desired behaviours. The higher the required skill, the scarcer the corresponding data on the internet.\n
\n
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n \n Figure 1: A preview of the crowd-code extension in action. Figure from Mattia Consiglio.\n
\n The extension periodically uploads the user's captured IDE actions to a server, where they are cleaned, filtered, thoroughly anonymized, and periodically released to the\n public under the most permissive Creative Commons license (CC0).\n An ongoing recording is transparently indicated in the IDE's status bar, and can be stopped at any time. If the user has inserted sensitive data, they can simply\n press the 'panic button' in the status bar to remove the last actions from the recording before they even leave the user's machine. Additionally, the user is asked for\n consent to participate in crowd-sourcing upon extension installation, and can opt-out at any time. We take user privacy very seriously and welcome any feedback on how to\n make data collection and release more transparent.\n
\n
\n Beyond behaviour-cloning on a dataset crowd-sourced using crowd-code, we eventually want to use crowd-code to annotate the entirety of IDE screencasts on the internet \n using an inverse dynamics model trained on screen recordings paired with crowd-code's IDE action annotations. This would unlock an entirely new trove of training data\n for software engineering agents.\n
\n
\n We are excited to see what the community builds with this dataset. We want to democratize AI research. We are greater than the sum of our parts. Together.\n
\n \n\n \n\n
Contributions
\n
FS worked on conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote the manuscript. We thank Gemini Code Assist and Cursor for their help in writing the extension.
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n Install once, and forget about it.\n
\n Neural networks are simulators. Anything you want your model to do, you have to teach it. Base models represent the mean of the data distribution of the internet.\n Post-training permits shifting that distribution towards desired behaviours. The higher the required skill, the scarcer the corresponding data on the internet.\n
\n
\n The most straightforward way to teach a model to do something is to give it a dataset to behaviour-clone off of. No software engineer in the world writes code linearly,\n in a single branch, in a single commit, in a single PR, without jumping around the codebase, without making mistakes, without debugging. crowd-code is our first attempt\n at capturing a broad spectrum of human software engineering, including character-level text insertions, deletions, undo, redo, cursor movement, file switches, jumping\n to function definitions, git checkouts, terminal command execution, autocompletes, LLM changes, and more.\n
\n Every day, millions of developers are writing open-source code. We propose going beyond open-source and towards open-engineering, a paradigm where the value of the\n developer's time is not just captured by the code they produce, but also by the mere act of engineering.\n
\n
\n We introduce crowd-code, a VS Code/Cursor extension that allows anyone to participate in crowd-sourcing a software engineering dataset to eventually finetune models on.\n We want to make it as easy as possible for developers to participate in crowd-sourcing. All you need to do is \n install the crowd-code extension, and forget about it.\n
\n \n Figure 1: A preview of the crowd-code extension in action. Figure from Mattia Consiglio.\n
\n Beyond behaviour-cloning on a dataset crowd-sourced using crowd-code, we eventually want to use crowd-code to annotate the entirety of IDE screencasts on the internet ",html,selection_command
+1584,9039725,"examples/agi_cast.html",2008,358,"
\n Beyond behaviour-cloning on a dataset crowd-sourced using crowd-code, we eventually want to use crowd-code to annotate the entirety of IDE screencasts on the internet \n using an inverse dynamics model trained on screen recordings paired with crowd-code's IDE action annotations. This would unlock an entirely new trove of training data",html,selection_command
+1585,9039907,"examples/agi_cast.html",2008,399,"
\n Beyond behaviour-cloning on a dataset crowd-sourced using crowd-code, we eventually want to use crowd-code to annotate the entirety of IDE screencasts on the internet \n using an inverse dynamics model trained on screen recordings paired with crowd-code's IDE action annotations. This would unlock an entirely new trove of training data\n for software engineering agents.",html,selection_command
+1586,9040027,"examples/agi_cast.html",2008,408,"
\n Beyond behaviour-cloning on a dataset crowd-sourced using crowd-code, we eventually want to use crowd-code to annotate the entirety of IDE screencasts on the internet \n using an inverse dynamics model trained on screen recordings paired with crowd-code's IDE action annotations. This would unlock an entirely new trove of training data\n for software engineering agents.\n
\n
\n We are excited to see what the community builds with this dataset. We want to democratize AI research. We are greater than the sum of our parts. Together.",html,selection_command
+1589,9040475,"examples/agi_cast.html",2008,595,"
\n Beyond behaviour-cloning on a dataset crowd-sourced using crowd-code, we eventually want to use crowd-code to annotate the entirety of IDE screencasts on the internet \n using an inverse dynamics model trained on screen recordings paired with crowd-code's IDE action annotations. This would unlock an entirely new trove of training data\n for software engineering agents.\n
\n
\n We are excited to see what the community builds with this dataset. We want to democratize AI research. We are greater than the sum of our parts. Together.\n
\n We introduce Jasmine, a production-ready JAX-based codebase for world modeling from unlabeled videos.\n Scale from single hosts to hundreds of xPUs thanks to XLA.\n
\n We are at the cusp of an intelligence revolution. Neural networks are able to clone the behaviour of peak human intellectual performance\n given enough compute, data, and the right algorithms. While an increasing amount of capital expenditure is allocated to compute clusters, and a well-working\n recipe of equipping models with the required priors and capacity to reason is publicly available, the path to human-level intelligence with the ability to automate\n large fractions of the economy will increasingly be shaped by paradigms that are able to find and efficiently use untouched data troves.\n
\n
\n While product-feedback-loops constitute an adaptive data trove, many domains like robotics are not mature enough to yield a product with wide enough\n adoption to create a feedback-loop of sufficient magnitude, prompting the search for alternatives.\n One paradigm proposed by the research community to overcome the data scarcity in those domains is that of world models. While world models can help frontier model\n development in numerous ways, an ambitious goal of the community is to train a world model to act as a simulation of the world, in order to\n train an agent in that simulation, via an adaptive curriculum or otherwise.\n
\n While numerous previous works have investigated large-scale world modeling and its application to robotics, world modeling for agent training calls for a vastly different treatment.\n Such a regime requires the compounding error of world models to be orders of magnitude smaller than when solely used for short-term look-ahead. The feasibility of such a world model in its truest sense is entirely\n understudied, and Jasmine, a world modeling codebase, is our first milestone towards studying the setting using rigorous evaluations. Specifically, we want to develop Empirical Environment Complexity Scaling Trends, where we train world models to full convergence\n in environments of increasing complexity (Atari, RetroGym, Craftax, Minecraft)\n and under the synthetic infinite-data regime. Subsequently, we want to evaluate those models two-fold: i) via a taxonomy of granular benchmarks probing\n specific world modeling capabilities (reconstruction quality, environment dynamics at the body/tail of the data distribution, long-horizon consistency), and ii) by training reinforcement learning (RL) agents in both\n the world model and the corresponding ground-truth environment, and measuring the performance difference between those agents.\n
\n
\n Ultimately, such treatment permits us to derive empirical estimates of compute and data requirements to model environments of increasing complexity sufficiently well (as determined by our evaluation procedure). Only given such estimates can we try to draw conclusions\n about the feasibility of world modeling of environments as complex as the real world for agent training. If our empirical estimates show resource requirement trends that are feasible under the assumption of the continuation of Moore's Law and increased capital\n expenditure, that would manifest world modeling as a paradigm with high likelihood of success in overcoming the data-scarcity in domains as general as (humanoid) robotics. Otherwise, the world modeling research community must realign its direction with downstream goals\n that are feasible.\n
\n
A batteries-included foundation for world modeling research
\n
\n Jasmine, our first milestone towards deriving Empirical Environment Complexity Scaling Trends, is the result of weeks of infrastructure work to make large-scale world modeling research more accessible. What started off as a fork of\n Jafar grew into a full-fledged world\n modeling codebase amenable to large-scale training, implementing multiple dynamics model baselines, asynchronous checkpointing, process-parallel dataloading, checkpointing of model weights, optimizer and dataloader states, checkpointing policies, full reproducibility with identical\n training curves, mixed precision training, optimized FlashAttention (via cuDNN SDPA), activation checkpointing, DDP\n (with FSDP/HSDP requiring changing a single LoC), WSD schedule, index-shuffling during dataloading, and native Treescope support. Jasmine implements the new\n flax.nnx API and strictly adheres to Noam Shazeer's shape suffix convention, thereby providing\n a didactic implementation of world modeling architectures. Jasmine solely depends\n on battle-tested libraries from the Google ecosystem (Flax, Optax, Orbax, Grain,\n PIX, ArrayRecord).\n
\n
Releasing a dataset of fine-grained research engineering
\n
\n We captured every step of the research engineering process behind Jasmine using crowd-code,\n a VS Code/Cursor extension that captures fine-grained IDE interactions (character-level edits, navigation, debugging patterns, terminal usage) and allows researchers to contribute their\n engineering process to a crowd-sourced dataset. Today, we release crowd-code-0.1, our first dataset of dense IDE interactions, which encompasses the entire development of Jasmine.\n crowd-code-0.1 is unfiltered, uncleaned, and uncurated, but only contains IDE interactions of the Jasmine authors. We are actively working on cleaning and curating the full dataset,\n which will be released in the future.\n
\n \n\n \n\n
Contributions
\n
MM, AN and FS worked on research, ideation and implementation. FS wrote the manuscript. SB provided feedback and guidance.
\n \n \n \n \n\n \n\n\n",html,tab
+2259,9718816,"examples/jasmine.html",0,0,"",html,selection_command
+2260,9719983,"examples/jasmine.html",1827,0,"",html,selection_keyboard
+2261,9720407,"examples/jasmine.html",5300,0,"",html,selection_keyboard
+2262,9721189,"examples/jasmine.html",2912,0,"",html,selection_keyboard
+2263,9721857,"examples/jasmine.html",2856,0,"",html,selection_command
+2264,9722002,"examples/jasmine.html",2731,0,"",html,selection_command
+2265,9722815,"examples/jasmine.html",2681,0,"",html,selection_command
+2266,9723705,"examples/jasmine.html",2731,0,"",html,selection_command
+2267,9724141,"examples/jasmine.html",2727,124," ",html,selection_command
+2269,9724680,"examples/jasmine.html",2727,351," \n Figure 1: Jasmine in action.",html,selection_command
+2270,9728996,"examples/jasmine.html",2727,0,"",html,selection_command
+2271,9731169,"examples/agi_cast.html",0,0,"",html,tab
+2272,9734035,"examples/agi_cast.html",1905,0,"",html,selection_command
+2273,9734189,"examples/agi_cast.html",1861,0,"",html,selection_command
+2274,9734333,"examples/agi_cast.html",1787,0,"",html,selection_command
+2275,9734791,"examples/agi_cast.html",1781,0," \n Figure 1: Jasmine in action.\n",html,content
+2276,9734813,"examples/agi_cast.html",1785,0,"",html,selection_command
+2277,9737128,"examples/agi_cast.html",1786,0,"",html,selection_command
+2278,9737382,"examples/agi_cast.html",1793,0,"",html,selection_command
+2279,9737410,"examples/agi_cast.html",1798,0,"",html,selection_command
+2280,9737439,"examples/agi_cast.html",1800,0,"",html,selection_command
+2281,9737473,"examples/agi_cast.html",1804,0,"",html,selection_command
+2282,9737506,"examples/agi_cast.html",1805,0,"",html,selection_command
+2283,9737540,"examples/agi_cast.html",1811,0,"",html,selection_command
+2284,9737572,"examples/agi_cast.html",1813,0,"",html,selection_command
+2285,9737607,"examples/agi_cast.html",1817,0,"",html,selection_command
+2286,9737639,"examples/agi_cast.html",1819,0,"",html,selection_command
+2287,9737673,"examples/agi_cast.html",1825,0,"",html,selection_command
+2288,9737706,"examples/agi_cast.html",1827,0,"",html,selection_command
+2289,9737741,"examples/agi_cast.html",1832,0,"",html,selection_command
+2290,9737779,"examples/agi_cast.html",1833,0,"",html,selection_command
+2291,9737806,"examples/agi_cast.html",1835,0,"",html,selection_command
+2292,9737839,"examples/agi_cast.html",1842,0,"",html,selection_command
+2293,9737871,"examples/agi_cast.html",1844,0,"",html,selection_command
+2294,9737905,"examples/agi_cast.html",1848,0,"",html,selection_command
+2295,9737938,"examples/agi_cast.html",1850,0,"",html,selection_command
+2296,9737972,"examples/agi_cast.html",1857,0,"",html,selection_command
+2297,9738007,"examples/agi_cast.html",1858,0,"",html,selection_command
+2298,9738038,"examples/agi_cast.html",1865,0,"",html,selection_command
+2299,9738071,"examples/agi_cast.html",1867,0,"",html,selection_command
+2300,9738105,"examples/agi_cast.html",1873,0,"",html,selection_command
+2301,9738138,"examples/agi_cast.html",1876,0,"",html,selection_command
+2302,9738172,"examples/agi_cast.html",1880,0,"",html,selection_command
+2303,9738415,"examples/agi_cast.html",1883,0,"",html,selection_command
+2304,9738600,"examples/agi_cast.html",1885,0,"",html,selection_command
+2305,9743583,"examples/agi_cast.html",1883,0,"",html,selection_command
+2306,9743834,"examples/agi_cast.html",1880,0,"",html,selection_command
+2307,9743861,"examples/agi_cast.html",1876,0,"",html,selection_command
+2308,9743896,"examples/agi_cast.html",1873,0,"",html,selection_command
+2309,9743923,"examples/agi_cast.html",1867,0,"",html,selection_command
+2310,9746144,"examples/agi_cast.html",1873,0,"",html,selection_command
+2311,9746337,"examples/agi_cast.html",1876,0,"",html,selection_command
+2312,9747497,"examples/agi_cast.html",1781,124," ",html,selection_command
+2314,9758841,"examples/agi_cast.html",1960,0,"",html,selection_command
+2315,9788058,"examples/agi_cast.html",1781,180," ",html,content
+2316,10086721,"examples/agi_cast.html",1986,0,"",html,selection_command
+2317,10086958,"examples/agi_cast.html",1980,0,"",html,selection_command
+2318,10086990,"examples/agi_cast.html",1976,0,"",html,selection_command
+2319,10087025,"examples/agi_cast.html",1970,0,"",html,selection_command
+2320,10087058,"examples/agi_cast.html",1965,0,"",html,selection_command
+2321,10087092,"examples/agi_cast.html",1956,0,"",html,selection_command
+2322,10087124,"examples/agi_cast.html",1947,0,"",html,selection_command
+2323,10087158,"examples/agi_cast.html",1944,0,"",html,selection_command
+2324,10087191,"examples/agi_cast.html",1941,0,"",html,selection_command
+2325,10087224,"examples/agi_cast.html",1939,0,"",html,selection_command
+2326,10087258,"examples/agi_cast.html",1933,0,"",html,selection_command
+2327,10087500,"examples/agi_cast.html",1932,0,"",html,selection_command
+2328,10087758,"examples/agi_cast.html",1926,0,"",html,selection_command
+2329,10087787,"examples/agi_cast.html",1923,0,"",html,selection_command
+2330,10087820,"examples/agi_cast.html",1920,0,"",html,selection_command
+2331,10087854,"examples/agi_cast.html",1919,0,"",html,selection_command
+2332,10092990,"examples/agi_cast.html",1920,0,"",html,selection_command
+2333,10093236,"examples/agi_cast.html",1923,0,"",html,selection_command
+2334,10093265,"examples/agi_cast.html",1926,0,"",html,selection_command
+2335,10093298,"examples/agi_cast.html",1932,0,"",html,selection_command
+2336,10093330,"examples/agi_cast.html",1933,0,"",html,selection_command
+2337,10093364,"examples/agi_cast.html",1939,0,"",html,selection_command
+2338,10093398,"examples/agi_cast.html",1941,0,"",html,selection_command
+2339,10093431,"examples/agi_cast.html",1944,0,"",html,selection_command
+2340,10093464,"examples/agi_cast.html",1947,0,"",html,selection_command
+2341,10093497,"examples/agi_cast.html",1956,0,"",html,selection_command
+2342,10093531,"examples/agi_cast.html",1965,0,"",html,selection_command
+2343,10093564,"examples/agi_cast.html",1970,0,"",html,selection_command
+2344,10093598,"examples/agi_cast.html",1976,0,"",html,selection_command
+2345,10093630,"examples/agi_cast.html",1980,0,"",html,selection_command
+2346,10093664,"examples/agi_cast.html",1986,0,"",html,selection_command
+2347,10093697,"examples/agi_cast.html",1992,0,"",html,selection_command
+2348,10093731,"examples/agi_cast.html",1993,0,"",html,selection_command
+2349,10093764,"examples/agi_cast.html",2004,0,"",html,selection_command
+2350,10093797,"examples/agi_cast.html",2009,0,"",html,selection_command
+2351,10093831,"examples/agi_cast.html",2011,0,"",html,selection_command
+2352,10093864,"examples/agi_cast.html",2015,0,"",html,selection_command
+2353,10093897,"examples/agi_cast.html",2016,0,"",html,selection_command
+2354,10093931,"examples/agi_cast.html",2022,0,"",html,selection_command
+2355,10093964,"examples/agi_cast.html",2024,0,"",html,selection_command
+2356,10093998,"examples/agi_cast.html",2028,0,"",html,selection_command
+2357,10094031,"examples/agi_cast.html",2030,0,"",html,selection_command
+2358,10094065,"examples/agi_cast.html",2034,0,"",html,selection_command
+2359,10094256,"examples/agi_cast.html",2035,0,"",html,selection_command
+2360,10094511,"examples/agi_cast.html",2040,0,"",html,selection_command
+2361,10094541,"examples/agi_cast.html",2042,0,"",html,selection_command
+2362,10094573,"examples/agi_cast.html",2048,0,"",html,selection_command
+2363,10094605,"examples/agi_cast.html",2050,0,"",html,selection_command
+2364,10094640,"examples/agi_cast.html",2056,0,"",html,selection_command
+2365,10094672,"examples/agi_cast.html",2057,0,"",html,selection_command
+2366,10094706,"examples/agi_cast.html",2063,0,"",html,selection_command
+2367,10094739,"examples/agi_cast.html",2065,0,"",html,selection_command
+2368,10094773,"examples/agi_cast.html",2069,0,"",html,selection_command
+2369,10094806,"examples/agi_cast.html",2071,0,"",html,selection_command
+2370,10094839,"examples/agi_cast.html",2075,0,"",html,selection_command
+2371,10094872,"examples/agi_cast.html",2076,0,"",html,selection_command
+2372,10094907,"examples/agi_cast.html",2080,0,"",html,selection_command
+2373,10094939,"examples/agi_cast.html",2082,0,"",html,selection_command
+2374,10094972,"examples/agi_cast.html",2083,0,"",html,selection_command
+2375,10095004,"examples/agi_cast.html",2084,0,"",html,selection_command
+2376,10095042,"examples/agi_cast.html",2087,0,"",html,selection_command
+2377,10095072,"examples/agi_cast.html",2089,0,"",html,selection_command
+2378,10095249,"examples/agi_cast.html",2094,0,"",html,selection_command
+2379,10095500,"examples/agi_cast.html",2096,0,"",html,selection_command
+2380,10095530,"examples/agi_cast.html",2100,0,"",html,selection_command
+2381,10095564,"examples/agi_cast.html",2101,0,"",html,selection_command
+2382,10095597,"examples/agi_cast.html",2102,0,"",html,selection_command
+2383,10095631,"examples/agi_cast.html",2104,0,"",html,selection_command
+2384,10095664,"examples/agi_cast.html",2105,0,"",html,selection_command
+2385,10095698,"examples/agi_cast.html",2107,0,"",html,selection_command
+2386,10095731,"examples/agi_cast.html",2108,0,"",html,selection_command
+2387,10095765,"examples/agi_cast.html",2110,0,"",html,selection_command
+2388,10095931,"examples/agi_cast.html",2111,0,"",html,selection_command
+2389,10096119,"examples/agi_cast.html",2112,0,"",html,selection_command
+2390,10096286,"examples/agi_cast.html",2113,0,"",html,selection_command
+2391,10096460,"examples/agi_cast.html",2117,0,"",html,selection_command
+2392,10096688,"examples/agi_cast.html",2124,0,"",html,selection_command
+2393,10096838,"examples/agi_cast.html",2125,0,"",html,selection_command
+2394,10097002,"examples/agi_cast.html",2127,0,"",html,selection_command
+2395,10098307,"examples/agi_cast.html",2127,7,"",html,content
+2396,10098969,"examples/agi_cast.html",2127,0,"A",html,content
+2397,10098971,"examples/agi_cast.html",2128,0,"",html,selection_keyboard
+2398,10098981,"examples/agi_cast.html",2128,0,"G",html,content
+2399,10098983,"examples/agi_cast.html",2129,0,"",html,selection_keyboard
+2400,10099068,"examples/agi_cast.html",2129,0,"I",html,content
+2401,10099071,"examples/agi_cast.html",2130,0,"",html,selection_keyboard
+2402,10099801,"examples/agi_cast.html",2130,0,"-",html,content
+2403,10099808,"examples/agi_cast.html",2131,0,"",html,selection_keyboard
+2404,10100268,"examples/agi_cast.html",2131,0,"C",html,content
+2405,10100274,"examples/agi_cast.html",2132,0,"",html,selection_keyboard
+2406,10100483,"examples/agi_cast.html",2132,0,"A",html,content
+2407,10100486,"examples/agi_cast.html",2133,0,"",html,selection_keyboard
+2408,10100514,"examples/agi_cast.html",2133,0,"S",html,content
+2409,10100517,"examples/agi_cast.html",2134,0,"",html,selection_keyboard
+2410,10100567,"examples/agi_cast.html",2134,0,"T",html,content
+2411,10100569,"examples/agi_cast.html",2135,0,"",html,selection_keyboard
+2412,10100913,"examples/agi_cast.html",2134,0,"",html,selection_command
+2413,10104074,"examples/agi_cast.html",1986,0,"",html,selection_command
+2414,10105061,"examples/agi_cast.html",1901,0,"",html,selection_command
+2415,10106835,"examples/agi_cast.html",1986,0,"",html,selection_command
+2416,10107073,"examples/agi_cast.html",2134,0,"",html,selection_command
+2417,10107108,"examples/agi_cast.html",2232,0,"",html,selection_command
+2418,10107140,"examples/agi_cast.html",2276,0,"",html,selection_command
+2419,10107173,"examples/agi_cast.html",2284,0,"",html,selection_command
+2420,10107207,"examples/agi_cast.html",2384,0,"",html,selection_command
+2421,10107240,"examples/agi_cast.html",2393,0,"",html,selection_command
+2422,10107548,"examples/agi_cast.html",2384,0,"",html,selection_command
+2423,10368898,"examples/agi_cast.html",2385,0,"",html,selection_command
+2424,10369115,"examples/agi_cast.html",2385,0,"W",html,content
+2425,10369120,"examples/agi_cast.html",2386,0,"",html,selection_keyboard
+2426,10369363,"examples/agi_cast.html",2386,0,"h",html,content
+2427,10369365,"examples/agi_cast.html",2387,0,"",html,selection_keyboard
+2428,10369438,"examples/agi_cast.html",2387,0,"i",html,content
+2429,10369440,"examples/agi_cast.html",2388,0,"",html,selection_keyboard
+2430,10369648,"examples/agi_cast.html",2388,0,"l",html,content
+2431,10369649,"examples/agi_cast.html",2389,0,"",html,selection_keyboard
+2432,10369654,"examples/agi_cast.html",2389,0,"e",html,content
+2433,10369656,"examples/agi_cast.html",2390,0,"",html,selection_keyboard
+2434,10369798,"examples/agi_cast.html",2390,0," ",html,content
+2435,10369799,"examples/agi_cast.html",2391,0,"",html,selection_keyboard
+2436,10370217,"examples/agi_cast.html",2391,0,"b",html,content
+2437,10370218,"examples/agi_cast.html",2392,0,"",html,selection_keyboard
+2438,10370340,"examples/agi_cast.html",2392,0,"i",html,content
+2439,10370341,"examples/agi_cast.html",2393,0,"",html,selection_keyboard
+2440,10370522,"examples/agi_cast.html",2393,0,"l",html,content
+2441,10370524,"examples/agi_cast.html",2394,0,"",html,selection_keyboard
+2442,10370685,"examples/agi_cast.html",2394,0,"l",html,content
+2443,10370685,"examples/agi_cast.html",2395,0,"",html,selection_keyboard
+2444,10370775,"examples/agi_cast.html",2395,0,"i",html,content
+2445,10370776,"examples/agi_cast.html",2396,0,"",html,selection_keyboard
+2446,10370882,"examples/agi_cast.html",2396,0,"o",html,content
+2447,10370883,"examples/agi_cast.html",2397,0,"",html,selection_keyboard
+2448,10370993,"examples/agi_cast.html",2397,0,"i",html,content
+2449,10370995,"examples/agi_cast.html",2398,0,"",html,selection_keyboard
+2450,10371001,"examples/agi_cast.html",2398,0,"n",html,content
+2451,10371003,"examples/agi_cast.html",2399,0,"",html,selection_keyboard
+2452,10371182,"examples/agi_cast.html",2399,0,"s",html,content
+2453,10371184,"examples/agi_cast.html",2400,0,"",html,selection_keyboard
+2454,10371251,"examples/agi_cast.html",2400,0," ",html,content
+2455,10371253,"examples/agi_cast.html",2401,0,"",html,selection_keyboard
+2456,10371266,"examples/agi_cast.html",2401,0,"o",html,content
+2457,10371267,"examples/agi_cast.html",2402,0,"",html,selection_keyboard
+2458,10371566,"examples/agi_cast.html",2402,0,"f",html,content
+2459,10371567,"examples/agi_cast.html",2403,0,"",html,selection_keyboard
+2460,10371601,"examples/agi_cast.html",2403,0," ",html,content
+2461,10371602,"examples/agi_cast.html",2404,0,"",html,selection_keyboard
+2462,10371866,"examples/agi_cast.html",2401,3,"",html,content
+2463,10372016,"examples/agi_cast.html",2391,10,"",html,content
+2464,10372137,"examples/agi_cast.html",2385,6,"",html,content
+2465,10372857,"examples/agi_cast.html",2385,0,"W",html,content
+2466,10372859,"examples/agi_cast.html",2386,0,"",html,selection_keyboard
+2467,10373038,"examples/agi_cast.html",2386,0,"h",html,content
+2468,10373039,"examples/agi_cast.html",2387,0,"",html,selection_keyboard
+2469,10373066,"examples/agi_cast.html",2387,0,"i",html,content
+2470,10373067,"examples/agi_cast.html",2388,0,"",html,selection_keyboard
+2471,10373269,"examples/agi_cast.html",2388,0,"l",html,content
+2472,10373270,"examples/agi_cast.html",2389,0,"",html,selection_keyboard
+2473,10373282,"examples/agi_cast.html",2389,0,"e",html,content
+2474,10373283,"examples/agi_cast.html",2390,0,"",html,selection_keyboard
+2475,10373343,"examples/agi_cast.html",2390,0," ",html,content
+2476,10373344,"examples/agi_cast.html",2391,0,"",html,selection_keyboard
+2477,10373533,"examples/agi_cast.html",2391,0,"i",html,content
+2478,10373534,"examples/agi_cast.html",2392,0,"",html,selection_keyboard
+2479,10373974,"examples/agi_cast.html",2391,1,"",html,content
+2480,10373988,"examples/agi_cast.html",2391,0,"b",html,content
+2481,10373989,"examples/agi_cast.html",2392,0,"",html,selection_keyboard
+2482,10374119,"examples/agi_cast.html",2392,0,"i",html,content
+2483,10374120,"examples/agi_cast.html",2393,0,"",html,selection_keyboard
+2484,10374252,"examples/agi_cast.html",2393,0,"l",html,content
+2485,10374255,"examples/agi_cast.html",2394,0,"",html,selection_keyboard
+2486,10374418,"examples/agi_cast.html",2394,0,"l",html,content
+2487,10374419,"examples/agi_cast.html",2395,0,"",html,selection_keyboard
+2488,10374555,"examples/agi_cast.html",2395,0,"i",html,content
+2489,10374557,"examples/agi_cast.html",2396,0,"",html,selection_keyboard
+2490,10374598,"examples/agi_cast.html",2396,0,"o",html,content
+2491,10374599,"examples/agi_cast.html",2397,0,"",html,selection_keyboard
+2492,10374689,"examples/agi_cast.html",2397,0,"n",html,content
+2493,10374691,"examples/agi_cast.html",2398,0,"",html,selection_keyboard
+2494,10374899,"examples/agi_cast.html",2398,0,"s",html,content
+2495,10374900,"examples/agi_cast.html",2399,0,"",html,selection_keyboard
+2496,10375188,"examples/agi_cast.html",2391,8,"",html,content
+2497,10375333,"examples/agi_cast.html",2385,6,"",html,content
+2498,10375735,"examples/agi_cast.html",2385,0,"W",html,content
+2499,10375736,"examples/agi_cast.html",2386,0,"",html,selection_keyboard
+2500,10375890,"examples/agi_cast.html",2386,0,"h",html,content
+2501,10375892,"examples/agi_cast.html",2387,0,"",html,selection_keyboard
+2502,10375963,"examples/agi_cast.html",2387,0,"i",html,content
+2503,10375965,"examples/agi_cast.html",2388,0,"",html,selection_keyboard
+2504,10376181,"examples/agi_cast.html",2388,0,"l",html,content
+2505,10376182,"examples/agi_cast.html",2389,0,"",html,selection_keyboard
+2506,10376186,"examples/agi_cast.html",2389,0,"e",html,content
+2507,10376187,"examples/agi_cast.html",2390,0,"",html,selection_keyboard
+2508,10376290,"examples/agi_cast.html",2390,0," ",html,content
+2509,10376291,"examples/agi_cast.html",2391,0,"",html,selection_keyboard
+2510,10376866,"examples/agi_cast.html",2391,0,"f",html,content
+2511,10376867,"examples/agi_cast.html",2392,0,"",html,selection_keyboard
+2512,10376914,"examples/agi_cast.html",2392,0,"o",html,content
+2513,10376917,"examples/agi_cast.html",2393,0,"",html,selection_keyboard
+2514,10376967,"examples/agi_cast.html",2393,0,"r",html,content
+2515,10376970,"examples/agi_cast.html",2394,0,"",html,selection_keyboard
+2516,10377292,"examples/agi_cast.html",2393,1,"",html,content
+2517,10377416,"examples/agi_cast.html",2392,1,"",html,content
+2518,10377427,"examples/agi_cast.html",2392,0,"r",html,content
+2519,10377429,"examples/agi_cast.html",2393,0,"",html,selection_keyboard
+2520,10377541,"examples/agi_cast.html",2393,0,"o",html,content
+2521,10377542,"examples/agi_cast.html",2394,0,"",html,selection_keyboard
+2522,10377599,"examples/agi_cast.html",2394,0,"n",html,content
+2523,10377600,"examples/agi_cast.html",2395,0,"",html,selection_keyboard
+2524,10377749,"examples/agi_cast.html",2395,0,"t",html,content
+2525,10377751,"examples/agi_cast.html",2396,0,"",html,selection_keyboard
+2526,10378066,"examples/agi_cast.html",2396,0,"i",html,content
+2527,10378068,"examples/agi_cast.html",2397,0,"",html,selection_keyboard
+2528,10378169,"examples/agi_cast.html",2397,0,"e",html,content
+2529,10378171,"examples/agi_cast.html",2398,0,"",html,selection_keyboard
+2530,10378197,"examples/agi_cast.html",2398,0,"r",html,content
+2531,10378199,"examples/agi_cast.html",2399,0,"",html,selection_keyboard
+2532,10378424,"examples/agi_cast.html",2399,0," ",html,content
+2533,10378426,"examples/agi_cast.html",2400,0,"",html,selection_keyboard
+2534,10379491,"examples/agi_cast.html",2400,0,"m",html,content
+2535,10379493,"examples/agi_cast.html",2401,0,"",html,selection_keyboard
+2536,10379563,"examples/agi_cast.html",2401,0,"o",html,content
+2537,10379565,"examples/agi_cast.html",2402,0,"",html,selection_keyboard
+2538,10379745,"examples/agi_cast.html",2402,0,"d",html,content
+2539,10379747,"examples/agi_cast.html",2403,0,"",html,selection_keyboard
+2540,10380023,"examples/agi_cast.html",2403,0,"l",html,content
+2541,10380025,"examples/agi_cast.html",2404,0,"",html,selection_keyboard
+2542,10380028,"examples/agi_cast.html",2404,0,"e",html,content
+2543,10380029,"examples/agi_cast.html",2405,0,"",html,selection_keyboard
+2544,10380499,"examples/agi_cast.html",2404,1,"",html,content
+2545,10381001,"examples/agi_cast.html",2403,1,"",html,content
+2546,10381050,"examples/agi_cast.html",2403,0,"e",html,content
+2547,10381052,"examples/agi_cast.html",2404,0,"",html,selection_keyboard
+2548,10381259,"examples/agi_cast.html",2404,0,"l",html,content
+2549,10381261,"examples/agi_cast.html",2405,0,"",html,selection_keyboard
+2550,10383266,"examples/agi_cast.html",2404,0,"",html,selection_command
+2551,10384982,"examples/agi_cast.html",2405,0,"",html,selection_command
+2552,10385119,"examples/agi_cast.html",2405,0," ",html,content
+2553,10385120,"examples/agi_cast.html",2406,0,"",html,selection_keyboard
+2554,10385315,"examples/agi_cast.html",2406,0,"r",html,content
+2555,10385317,"examples/agi_cast.html",2407,0,"",html,selection_keyboard
+2556,10385398,"examples/agi_cast.html",2407,0,"e",html,content
+2557,10385399,"examples/agi_cast.html",2408,0,"",html,selection_keyboard
+2558,10385593,"examples/agi_cast.html",2408,0,"s",html,content
+2559,10385594,"examples/agi_cast.html",2409,0,"",html,selection_keyboard
+2560,10385639,"examples/agi_cast.html",2409,0,"e",html,content
+2561,10385641,"examples/agi_cast.html",2410,0,"",html,selection_keyboard
+2562,10386428,"examples/agi_cast.html",2410,0,"e",html,content
+2563,10386433,"examples/agi_cast.html",2411,0,"",html,selection_keyboard
+2564,10386633,"examples/agi_cast.html",2411,0,"a",html,content
+2565,10386635,"examples/agi_cast.html",2412,0,"",html,selection_keyboard
+2566,10387292,"examples/agi_cast.html",2406,6,"",html,content
+2567,10387408,"examples/agi_cast.html",2400,6,"",html,content
+2568,10388222,"examples/agi_cast.html",2400,0,"m",html,content
+2569,10388225,"examples/agi_cast.html",2401,0,"",html,selection_keyboard
+2570,10388294,"examples/agi_cast.html",2401,0,"o",html,content
+2571,10388297,"examples/agi_cast.html",2402,0,"",html,selection_keyboard
+2572,10388460,"examples/agi_cast.html",2402,0,"d",html,content
+2573,10388465,"examples/agi_cast.html",2403,0,"",html,selection_keyboard
+2574,10388519,"examples/agi_cast.html",2403,0,"e",html,content
+2575,10388524,"examples/agi_cast.html",2404,0,"",html,selection_keyboard
+2576,10388716,"examples/agi_cast.html",2404,0,"l",html,content
+2577,10388719,"examples/agi_cast.html",2405,0,"",html,selection_keyboard
+2578,10388865,"examples/agi_cast.html",2405,0,"s",html,content
+2579,10388867,"examples/agi_cast.html",2406,0,"",html,selection_keyboard
+2580,10388961,"examples/agi_cast.html",2406,0," ",html,content
+2581,10388963,"examples/agi_cast.html",2407,0,"",html,selection_keyboard
+2582,10390235,"examples/agi_cast.html",2407,0,"a",html,content
+2583,10390237,"examples/agi_cast.html",2408,0,"",html,selection_keyboard
+2584,10390446,"examples/agi_cast.html",2408,0,"r",html,content
+2585,10390449,"examples/agi_cast.html",2409,0,"",html,selection_keyboard
+2586,10390516,"examples/agi_cast.html",2409,0,"e",html,content
+2587,10390518,"examples/agi_cast.html",2410,0,"",html,selection_keyboard
+2588,10390756,"examples/agi_cast.html",2410,0," ",html,content
+2589,10390758,"examples/agi_cast.html",2411,0,"",html,selection_keyboard
+2590,10390934,"examples/agi_cast.html",2411,0,"a",html,content
+2591,10390937,"examples/agi_cast.html",2412,0,"",html,selection_keyboard
+2592,10390945,"examples/agi_cast.html",2412,0,"b",html,content
+2593,10390948,"examples/agi_cast.html",2413,0,"",html,selection_keyboard
+2594,10391302,"examples/agi_cast.html",2413,0,"l",html,content
+2595,10391306,"examples/agi_cast.html",2414,0,"",html,selection_keyboard
+2596,10391329,"examples/agi_cast.html",2414,0,"e",html,content
+2597,10391334,"examples/agi_cast.html",2415,0,"",html,selection_keyboard
+2598,10391475,"examples/agi_cast.html",2415,0," ",html,content
+2599,10391478,"examples/agi_cast.html",2416,0,"",html,selection_keyboard
+2600,10391485,"examples/agi_cast.html",2416,0,"t",html,content
+2601,10391487,"examples/agi_cast.html",2417,0,"",html,selection_keyboard
+2602,10391640,"examples/agi_cast.html",2417,0,"o",html,content
+2603,10391642,"examples/agi_cast.html",2418,0,"",html,selection_keyboard
+2604,10391818,"examples/agi_cast.html",2418,0," ",html,content
+2605,10391824,"examples/agi_cast.html",2419,0,"",html,selection_keyboard
+2606,10392059,"examples/agi_cast.html",2419,0,"p",html,content
+2607,10392062,"examples/agi_cast.html",2420,0,"",html,selection_keyboard
+2608,10392101,"examples/agi_cast.html",2420,0,"e",html,content
+2609,10392103,"examples/agi_cast.html",2421,0,"",html,selection_keyboard
+2610,10392149,"examples/agi_cast.html",2421,0,"r",html,content
+2611,10392151,"examples/agi_cast.html",2422,0,"",html,selection_keyboard
+2612,10392467,"examples/agi_cast.html",2422,0,"f",html,content
+2613,10392470,"examples/agi_cast.html",2423,0,"",html,selection_keyboard
+2614,10392492,"examples/agi_cast.html",2423,0,"o",html,content
+2615,10392495,"examples/agi_cast.html",2424,0,"",html,selection_keyboard
+2616,10392553,"examples/agi_cast.html",2424,0,"r",html,content
+2617,10392556,"examples/agi_cast.html",2425,0,"",html,selection_keyboard
+2618,10392734,"examples/agi_cast.html",2425,0,"m",html,content
+2619,10392737,"examples/agi_cast.html",2426,0,"",html,selection_keyboard
+2620,10392828,"examples/agi_cast.html",2426,0,"a",html,content
+2621,10392831,"examples/agi_cast.html",2427,0,"",html,selection_keyboard
+2622,10392917,"examples/agi_cast.html",2427,0,"n",html,content
+2623,10392919,"examples/agi_cast.html",2428,0,"",html,selection_keyboard
+2624,10393429,"examples/agi_cast.html",2419,9,"",html,content
+2625,10393856,"examples/agi_cast.html",2416,3,"",html,content
+2626,10394192,"examples/agi_cast.html",2411,5,"",html,content
+2627,10394366,"examples/agi_cast.html",2407,4,"",html,content
+2628,10396372,"examples/agi_cast.html",2406,0,"",html,selection_command
+2629,10403738,"examples/agi_cast.html",2407,0,"",html,selection_command
+2630,10403943,"examples/agi_cast.html",2407,0,"o",html,content
+2631,10403947,"examples/agi_cast.html",2408,0,"",html,selection_keyboard
+2632,10404016,"examples/agi_cast.html",2408,0,"u",html,content
+2633,10404018,"examples/agi_cast.html",2409,0,"",html,selection_keyboard
+2634,10404059,"examples/agi_cast.html",2409,0,"t",html,content
+2635,10404061,"examples/agi_cast.html",2410,0,"",html,selection_keyboard
+2636,10404224,"examples/agi_cast.html",2410,0,"p",html,content
+2637,10404226,"examples/agi_cast.html",2411,0,"",html,selection_keyboard
+2638,10404434,"examples/agi_cast.html",2411,0,"f",html,content
+2639,10404434,"examples/agi_cast.html",2412,0,"",html,selection_keyboard
+2640,10404442,"examples/agi_cast.html",2412,0,"e",html,content
+2641,10404443,"examples/agi_cast.html",2413,0,"",html,selection_keyboard
+2642,10404555,"examples/agi_cast.html",2413,0,"r",html,content
+2643,10404556,"examples/agi_cast.html",2414,0,"",html,selection_keyboard
+2644,10405170,"examples/agi_cast.html",2407,7,"",html,content
+2645,10405365,"examples/agi_cast.html",2407,0,"o",html,content
+2646,10405367,"examples/agi_cast.html",2408,0,"",html,selection_keyboard
+2647,10405408,"examples/agi_cast.html",2408,0,"u",html,content
+2648,10405409,"examples/agi_cast.html",2409,0,"",html,selection_keyboard
+2649,10405484,"examples/agi_cast.html",2409,0,"t",html,content
+2650,10405486,"examples/agi_cast.html",2410,0,"",html,selection_keyboard
+2651,10405800,"examples/agi_cast.html",2410,0,"p",html,content
+2652,10405801,"examples/agi_cast.html",2411,0,"",html,selection_keyboard
+2653,10405952,"examples/agi_cast.html",2411,0,"e",html,content
+2654,10405955,"examples/agi_cast.html",2412,0,"",html,selection_keyboard
+2655,10406043,"examples/agi_cast.html",2412,0,"r",html,content
+2656,10406046,"examples/agi_cast.html",2413,0,"",html,selection_keyboard
+2657,10406312,"examples/agi_cast.html",2413,0,"f",html,content
+2658,10406316,"examples/agi_cast.html",2414,0,"",html,selection_keyboard
+2659,10406425,"examples/agi_cast.html",2414,0,"o",html,content
+2660,10406428,"examples/agi_cast.html",2415,0,"",html,selection_keyboard
+2661,10406596,"examples/agi_cast.html",2415,0,"r",html,content
+2662,10406599,"examples/agi_cast.html",2416,0,"",html,selection_keyboard
+2663,10406705,"examples/agi_cast.html",2416,0,"m",html,content
+2664,10406708,"examples/agi_cast.html",2417,0,"",html,selection_keyboard
+2665,10406958,"examples/agi_cast.html",2417,0," ",html,content
+2666,10406960,"examples/agi_cast.html",2418,0,"",html,selection_keyboard
+2667,10406970,"examples/agi_cast.html",2418,0,"h",html,content
+2668,10406973,"examples/agi_cast.html",2419,0,"",html,selection_keyboard
+2669,10407107,"examples/agi_cast.html",2419,0,"u",html,content
+2670,10407110,"examples/agi_cast.html",2420,0,"",html,selection_keyboard
+2671,10407284,"examples/agi_cast.html",2420,0,"m",html,content
+2672,10407286,"examples/agi_cast.html",2421,0,"",html,selection_keyboard
+2673,10407486,"examples/agi_cast.html",2421,0,"n",html,content
+2674,10407489,"examples/agi_cast.html",2422,0,"",html,selection_keyboard
+2675,10407704,"examples/agi_cast.html",2422,0,"s",html,content
+2676,10407706,"examples/agi_cast.html",2423,0,"",html,selection_keyboard
+2677,10407768,"examples/agi_cast.html",2423,0," ",html,content
+2678,10407770,"examples/agi_cast.html",2424,0,"",html,selection_keyboard
+2679,10407785,"examples/agi_cast.html",2424,0,"i",html,content
+2680,10407786,"examples/agi_cast.html",2425,0,"",html,selection_keyboard
+2681,10407794,"examples/agi_cast.html",2425,0,"n",html,content
+2682,10407795,"examples/agi_cast.html",2426,0,"",html,selection_keyboard
+2683,10408015,"examples/agi_cast.html",2426,0," ",html,content
+2684,10408017,"examples/agi_cast.html",2427,0,"",html,selection_keyboard
+2685,10408025,"examples/agi_cast.html",2427,0,"o",html,content
+2686,10408027,"examples/agi_cast.html",2428,0,"",html,selection_keyboard
+2687,10408516,"examples/agi_cast.html",2427,1,"",html,content
+2688,10408689,"examples/agi_cast.html",2424,3,"",html,content
+2689,10409122,"examples/agi_cast.html",2418,6,"",html,content
+2690,10409284,"examples/agi_cast.html",2418,0,"h",html,content
+2691,10409285,"examples/agi_cast.html",2419,0,"",html,selection_keyboard
+2692,10409433,"examples/agi_cast.html",2419,0,"u",html,content
+2693,10409434,"examples/agi_cast.html",2420,0,"",html,selection_keyboard
+2694,10409641,"examples/agi_cast.html",2420,0,"m",html,content
+2695,10409642,"examples/agi_cast.html",2421,0,"",html,selection_keyboard
+2696,10409856,"examples/agi_cast.html",2421,0,"a",html,content
+2697,10409861,"examples/agi_cast.html",2422,0,"",html,selection_keyboard
+2698,10409872,"examples/agi_cast.html",2422,0,"n",html,content
+2699,10409876,"examples/agi_cast.html",2423,0,"",html,selection_keyboard
+2700,10410090,"examples/agi_cast.html",2423,0,"s",html,content
+2701,10410093,"examples/agi_cast.html",2424,0,"",html,selection_keyboard
+2702,10410136,"examples/agi_cast.html",2424,0," ",html,content
+2703,10410141,"examples/agi_cast.html",2425,0,"",html,selection_keyboard
+2704,10410209,"examples/agi_cast.html",2425,0,"i",html,content
+2705,10410212,"examples/agi_cast.html",2426,0,"",html,selection_keyboard
+2706,10410286,"examples/agi_cast.html",2426,0,"n",html,content
+2707,10410294,"examples/agi_cast.html",2427,0,"",html,selection_keyboard
+2708,10410467,"examples/agi_cast.html",2427,0," ",html,content
+2709,10410470,"examples/agi_cast.html",2428,0,"",html,selection_keyboard
+2710,10410739,"examples/agi_cast.html",2428,0,"d",html,content
+2711,10410741,"examples/agi_cast.html",2429,0,"",html,selection_keyboard
+2712,10410748,"examples/agi_cast.html",2429,0,"o",html,content
+2713,10410749,"examples/agi_cast.html",2430,0,"",html,selection_keyboard
+2714,10410757,"examples/agi_cast.html",2430,0,"m",html,content
+2715,10410758,"examples/agi_cast.html",2431,0,"",html,selection_keyboard
+2716,10410943,"examples/agi_cast.html",2431,0,"a",html,content
+2717,10410945,"examples/agi_cast.html",2432,0,"",html,selection_keyboard
+2718,10410960,"examples/agi_cast.html",2432,0,"i",html,content
+2719,10410962,"examples/agi_cast.html",2433,0,"",html,selection_keyboard
+2720,10411018,"examples/agi_cast.html",2433,0,"n",html,content
+2721,10411021,"examples/agi_cast.html",2434,0,"",html,selection_keyboard
+2722,10411157,"examples/agi_cast.html",2434,0,"s",html,content
+2723,10411158,"examples/agi_cast.html",2435,0,"",html,selection_keyboard
+2724,10411243,"examples/agi_cast.html",2435,0," ",html,content
+2725,10411245,"examples/agi_cast.html",2436,0,"",html,selection_keyboard
+2726,10411429,"examples/agi_cast.html",2436,0,"a",html,content
+2727,10411432,"examples/agi_cast.html",2437,0,"",html,selection_keyboard
+2728,10411501,"examples/agi_cast.html",2437,0,"s",html,content
+2729,10411504,"examples/agi_cast.html",2438,0,"",html,selection_keyboard
+2730,10411574,"examples/agi_cast.html",2438,0," ",html,content
+2731,10411575,"examples/agi_cast.html",2439,0,"",html,selection_keyboard
+2732,10414466,"examples/agi_cast.html",2436,3,"",html,content
+2733,10415024,"examples/agi_cast.html",2428,8,"",html,content
+2734,10415932,"examples/agi_cast.html",2427,0,"",html,selection_command
+2735,10416112,"examples/agi_cast.html",2425,0,"",html,selection_command
+2736,10416361,"examples/agi_cast.html",2418,0,"",html,selection_command
+2737,10416391,"examples/agi_cast.html",2407,0,"",html,selection_command
+2738,10416424,"examples/agi_cast.html",2400,0,"",html,selection_command
+2739,10416459,"examples/agi_cast.html",2391,0,"",html,selection_command
+2740,10416492,"examples/agi_cast.html",2385,0,"",html,selection_command
+2741,10416750,"examples/agi_cast.html",2391,0,"",html,selection_command
+2742,10417284,"examples/agi_cast.html",2428,0,"",html,selection_command
+2743,10418508,"examples/agi_cast.html",2425,3,"",html,content
+2744,10418682,"examples/agi_cast.html",2418,7,"",html,content
+2745,10418909,"examples/agi_cast.html",2418,0,"h",html,content
+2746,10418910,"examples/agi_cast.html",2419,0,"",html,selection_keyboard
+2747,10419048,"examples/agi_cast.html",2419,0,"u",html,content
+2748,10419050,"examples/agi_cast.html",2420,0,"",html,selection_keyboard
+2749,10419227,"examples/agi_cast.html",2420,0,"m",html,content
+2750,10419228,"examples/agi_cast.html",2421,0,"",html,selection_keyboard
+2751,10419408,"examples/agi_cast.html",2421,0,"a",html,content
+2752,10419412,"examples/agi_cast.html",2422,0,"",html,selection_keyboard
+2753,10419424,"examples/agi_cast.html",2422,0,"n",html,content
+2754,10419427,"examples/agi_cast.html",2423,0,"",html,selection_keyboard
+2755,10419649,"examples/agi_cast.html",2423,0,"s",html,content
+2756,10419653,"examples/agi_cast.html",2424,0,"",html,selection_keyboard
+2757,10419707,"examples/agi_cast.html",2424,0," ",html,content
+2758,10419710,"examples/agi_cast.html",2425,0,"",html,selection_keyboard
+2759,10419743,"examples/agi_cast.html",2425,0,"i",html,content
+2760,10419748,"examples/agi_cast.html",2426,0,"",html,selection_keyboard
+2761,10419758,"examples/agi_cast.html",2426,0,"n",html,content
+2762,10419761,"examples/agi_cast.html",2427,0,"",html,selection_keyboard
+2763,10420034,"examples/agi_cast.html",2427,0," ",html,content
+2764,10420037,"examples/agi_cast.html",2428,0,"",html,selection_keyboard
+2765,10420202,"examples/agi_cast.html",2428,0,"d",html,content
+2766,10420205,"examples/agi_cast.html",2429,0,"",html,selection_keyboard
+2767,10420216,"examples/agi_cast.html",2429,0,"o",html,content
+2768,10420217,"examples/agi_cast.html",2430,0,"",html,selection_keyboard
+2769,10420221,"examples/agi_cast.html",2430,0,"m",html,content
+2770,10420223,"examples/agi_cast.html",2431,0,"",html,selection_keyboard
+2771,10420398,"examples/agi_cast.html",2431,0,"a",html,content
+2772,10420399,"examples/agi_cast.html",2432,0,"",html,selection_keyboard
+2773,10420415,"examples/agi_cast.html",2432,0,"i",html,content
+2774,10420417,"examples/agi_cast.html",2433,0,"",html,selection_keyboard
+2775,10420483,"examples/agi_cast.html",2433,0,"n",html,content
+2776,10420485,"examples/agi_cast.html",2434,0,"",html,selection_keyboard
+2777,10420667,"examples/agi_cast.html",2434,0,"s",html,content
+2778,10420669,"examples/agi_cast.html",2435,0,"",html,selection_keyboard
+2779,10420724,"examples/agi_cast.html",2435,0," ",html,content
+2780,10420726,"examples/agi_cast.html",2436,0,"",html,selection_keyboard
+2781,10420874,"examples/agi_cast.html",2436,0,"l",html,content
+2782,10420877,"examples/agi_cast.html",2437,0,"",html,selection_keyboard
+2783,10420885,"examples/agi_cast.html",2437,0,"i",html,content
+2784,10420889,"examples/agi_cast.html",2438,0,"",html,selection_keyboard
+2785,10421084,"examples/agi_cast.html",2438,0,"k",html,content
+2786,10421087,"examples/agi_cast.html",2439,0,"",html,selection_keyboard
+2787,10421096,"examples/agi_cast.html",2439,0,"e",html,content
+2788,10421098,"examples/agi_cast.html",2440,0,"",html,selection_keyboard
+2789,10421234,"examples/agi_cast.html",2440,0," ",html,content
+2790,10421238,"examples/agi_cast.html",2441,0,"",html,selection_keyboard
+2791,10421751,"examples/agi_cast.html",2440,0,"",html,selection_command
+2792,10424553,"examples/agi_cast.html",2436,0,"",html,selection_command
+2793,10424802,"examples/agi_cast.html",2428,0,"",html,selection_command
+2794,10424833,"examples/agi_cast.html",2425,0,"",html,selection_command
+2795,10424868,"examples/agi_cast.html",2418,0,"",html,selection_command
+2796,10424901,"examples/agi_cast.html",2407,0,"",html,selection_command
+2797,10424935,"examples/agi_cast.html",2400,0,"",html,selection_command
+2798,10424966,"examples/agi_cast.html",2391,0,"",html,selection_command
+2799,10425002,"examples/agi_cast.html",2385,0,"",html,selection_command
+2800,10425034,"examples/agi_cast.html",2383,0,"",html,selection_command
+2801,10425469,"examples/agi_cast.html",2385,0,"",html,selection_command
+2802,10425693,"examples/agi_cast.html",2391,0,"",html,selection_command
+2803,10426940,"examples/agi_cast.html",2441,0,"",html,selection_command
+2804,10430825,"examples/agi_cast.html",2436,5,"",html,content
+2805,10431017,"examples/agi_cast.html",2428,8,"",html,content
+2806,10431190,"examples/agi_cast.html",2425,3,"",html,content
+2807,10431333,"examples/agi_cast.html",2418,7,"",html,content
+2808,10432466,"examples/agi_cast.html",2418,0,"t",html,content
+2809,10432469,"examples/agi_cast.html",2419,0,"",html,selection_keyboard
+2810,10432640,"examples/agi_cast.html",2419,0,"h",html,content
+2811,10432642,"examples/agi_cast.html",2420,0,"",html,selection_keyboard
+2812,10432726,"examples/agi_cast.html",2420,0,"e",html,content
+2813,10432727,"examples/agi_cast.html",2421,0,"",html,selection_keyboard
+2814,10432879,"examples/agi_cast.html",2421,0," ",html,content
+2815,10432883,"examples/agi_cast.html",2422,0,"",html,selection_keyboard
+2816,10433001,"examples/agi_cast.html",2422,0,"b",html,content
+2817,10433002,"examples/agi_cast.html",2423,0,"",html,selection_keyboard
+2818,10433236,"examples/agi_cast.html",2423,0,"e",html,content
+2819,10433237,"examples/agi_cast.html",2424,0,"",html,selection_keyboard
+2820,10433448,"examples/agi_cast.html",2424,0,"s",html,content
+2821,10433452,"examples/agi_cast.html",2425,0,"",html,selection_keyboard
+2822,10433462,"examples/agi_cast.html",2425,0,"t",html,content
+2823,10433468,"examples/agi_cast.html",2426,0,"",html,selection_keyboard
+2824,10434210,"examples/agi_cast.html",2426,0," ",html,content
+2825,10434214,"examples/agi_cast.html",2427,0,"",html,selection_keyboard
+2826,10434351,"examples/agi_cast.html",2427,0,"h",html,content
+2827,10434353,"examples/agi_cast.html",2428,0,"",html,selection_keyboard
+2828,10434498,"examples/agi_cast.html",2428,0,"u",html,content
+2829,10434500,"examples/agi_cast.html",2429,0,"",html,selection_keyboard
+2830,10434675,"examples/agi_cast.html",2429,0,"m",html,content
+2831,10434678,"examples/agi_cast.html",2430,0,"",html,selection_keyboard
+2832,10434891,"examples/agi_cast.html",2430,0,"a",html,content
+2833,10434896,"examples/agi_cast.html",2431,0,"",html,selection_keyboard
+2834,10434923,"examples/agi_cast.html",2431,0,"n",html,content
+2835,10434926,"examples/agi_cast.html",2432,0,"",html,selection_keyboard
+2836,10435129,"examples/agi_cast.html",2432,0,"s",html,content
+2837,10435132,"examples/agi_cast.html",2433,0,"",html,selection_keyboard
+2838,10435174,"examples/agi_cast.html",2433,0," ",html,content
+2839,10435177,"examples/agi_cast.html",2434,0,"",html,selection_keyboard
+2840,10435952,"examples/agi_cast.html",2433,1,"",html,content
+2841,10436588,"examples/agi_cast.html",2427,6,"",html,content
+2842,10436750,"examples/agi_cast.html",2422,5,"",html,content
+2843,10436924,"examples/agi_cast.html",2418,4,"",html,content
+2844,10437288,"examples/agi_cast.html",2407,11,"",html,content
+2845,10438060,"examples/agi_cast.html",2407,0,"a",html,content
+2846,10438065,"examples/agi_cast.html",2408,0,"",html,selection_keyboard
+2847,10438088,"examples/agi_cast.html",2408,0,"r",html,content
+2848,10438089,"examples/agi_cast.html",2409,0,"",html,selection_keyboard
+2849,10438265,"examples/agi_cast.html",2409,0,"e",html,content
+2850,10438268,"examples/agi_cast.html",2410,0,"",html,selection_keyboard
+2851,10438308,"examples/agi_cast.html",2410,0," ",html,content
+2852,10438311,"examples/agi_cast.html",2411,0,"",html,selection_keyboard
+2853,10438382,"examples/agi_cast.html",2411,0,"c",html,content
+2854,10438385,"examples/agi_cast.html",2412,0,"",html,selection_keyboard
+2855,10438501,"examples/agi_cast.html",2412,0,"o",html,content
+2856,10438524,"examples/agi_cast.html",2413,0,"",html,selection_keyboard
+2857,10438545,"examples/agi_cast.html",2413,0,"m",html,content
+2858,10438549,"examples/agi_cast.html",2414,0,"",html,selection_keyboard
+2859,10438598,"examples/agi_cast.html",2414,0,"p",html,content
+2860,10438600,"examples/agi_cast.html",2415,0,"",html,selection_keyboard
+2861,10438699,"examples/agi_cast.html",2415,0,"e",html,content
+2862,10438701,"examples/agi_cast.html",2416,0,"",html,selection_keyboard
+2863,10438792,"examples/agi_cast.html",2416,0,"t",html,content
+2864,10438794,"examples/agi_cast.html",2417,0,"",html,selection_keyboard
+2865,10438999,"examples/agi_cast.html",2417,0,"i",html,content
+2866,10439001,"examples/agi_cast.html",2418,0,"",html,selection_keyboard
+2867,10439118,"examples/agi_cast.html",2418,0,"t",html,content
+2868,10439120,"examples/agi_cast.html",2419,0,"",html,selection_keyboard
+2869,10439192,"examples/agi_cast.html",2419,0,"i",html,content
+2870,10439192,"examples/agi_cast.html",2420,0,"",html,selection_keyboard
+2871,10439341,"examples/agi_cast.html",2420,0,"v",html,content
+2872,10439342,"examples/agi_cast.html",2421,0,"",html,selection_keyboard
+2873,10439435,"examples/agi_cast.html",2421,0,"e",html,content
+2874,10439438,"examples/agi_cast.html",2422,0,"",html,selection_keyboard
+2875,10439641,"examples/agi_cast.html",2422,0," ",html,content
+2876,10439643,"examples/agi_cast.html",2423,0,"",html,selection_keyboard
+2877,10439701,"examples/agi_cast.html",2423,0,"w",html,content
+2878,10439702,"examples/agi_cast.html",2424,0,"",html,selection_keyboard
+2879,10439826,"examples/agi_cast.html",2424,0,"i",html,content
+2880,10439828,"examples/agi_cast.html",2425,0,"",html,selection_keyboard
+2881,10439914,"examples/agi_cast.html",2425,0,"t",html,content
+2882,10439915,"examples/agi_cast.html",2426,0,"",html,selection_keyboard
+2883,10440033,"examples/agi_cast.html",2426,0,"h",html,content
+2884,10440035,"examples/agi_cast.html",2427,0,"",html,selection_keyboard
+2885,10440226,"examples/agi_cast.html",2427,0," ",html,content
+2886,10440230,"examples/agi_cast.html",2428,0,"",html,selection_keyboard
+2887,10443981,"examples/agi_cast.html",2428,0,"e",html,content
+2888,10443983,"examples/agi_cast.html",2429,0,"",html,selection_keyboard
+2889,10444933,"examples/agi_cast.html",2429,0,"l",html,content
+2890,10444935,"examples/agi_cast.html",2430,0,"",html,selection_keyboard
+2891,10445013,"examples/agi_cast.html",2430,0,"i",html,content
+2892,10445015,"examples/agi_cast.html",2431,0,"",html,selection_keyboard
+2893,10445871,"examples/agi_cast.html",2430,1,"",html,content
+2894,10446038,"examples/agi_cast.html",2429,1,"",html,content
+2895,10446157,"examples/agi_cast.html",2428,1,"",html,content
+2896,10446753,"examples/agi_cast.html",2428,0,"e",html,content
+2897,10446754,"examples/agi_cast.html",2429,0,"",html,selection_keyboard
+2898,10447135,"examples/agi_cast.html",2429,0,"x",html,content
+2899,10447138,"examples/agi_cast.html",2430,0,"",html,selection_keyboard
+2900,10448013,"examples/agi_cast.html",2430,0,"p",html,content
+2901,10448016,"examples/agi_cast.html",2431,0,"",html,selection_keyboard
+2902,10448134,"examples/agi_cast.html",2431,0,"e",html,content
+2903,10448136,"examples/agi_cast.html",2432,0,"",html,selection_keyboard
+2904,10448181,"examples/agi_cast.html",2432,0,"r",html,content
+2905,10448185,"examples/agi_cast.html",2433,0,"",html,selection_keyboard
+2906,10448409,"examples/agi_cast.html",2433,0,"t",html,content
+2907,10448412,"examples/agi_cast.html",2434,0,"",html,selection_keyboard
+2908,10448960,"examples/agi_cast.html",2434,0," ",html,content
+2909,10448963,"examples/agi_cast.html",2435,0,"",html,selection_keyboard
+2910,10449234,"examples/agi_cast.html",2434,0,"",html,selection_command
+2911,10453295,"examples/agi_cast.html",2428,0,"",html,selection_command
+2912,10453550,"examples/agi_cast.html",2423,0,"",html,selection_command
+2913,10453580,"examples/agi_cast.html",2411,0,"",html,selection_command
+2914,10453615,"examples/agi_cast.html",2407,0,"",html,selection_command
+2915,10453649,"examples/agi_cast.html",2400,0,"",html,selection_command
+2916,10453682,"examples/agi_cast.html",2391,0,"",html,selection_command
+2917,10453717,"examples/agi_cast.html",2385,0,"",html,selection_command
+2918,10454184,"examples/agi_cast.html",2391,0,"",html,selection_command
+2919,10454434,"examples/agi_cast.html",2400,0,"",html,selection_command
+2920,10454465,"examples/agi_cast.html",2407,0,"",html,selection_command
+2921,10454769,"examples/agi_cast.html",2411,0,"",html,selection_command
+2922,10454982,"examples/agi_cast.html",2423,0,"",html,selection_command
+2923,10455448,"examples/agi_cast.html",2435,0,"",html,selection_command
+2924,10456743,"examples/agi_cast.html",2434,0,"",html,selection_command
+2925,10457257,"examples/agi_cast.html",2435,0,"",html,selection_command
+2926,10457769,"examples/agi_cast.html",2435,0,"p",html,content
+2927,10457770,"examples/agi_cast.html",2436,0,"",html,selection_keyboard
+2928,10457854,"examples/agi_cast.html",2436,0,"r",html,content
+2929,10457858,"examples/agi_cast.html",2437,0,"",html,selection_keyboard
+2930,10458319,"examples/agi_cast.html",2437,0,"o",html,content
+2931,10458322,"examples/agi_cast.html",2438,0,"",html,selection_keyboard
+2932,10458527,"examples/agi_cast.html",2438,0,"f",html,content
+2933,10458528,"examples/agi_cast.html",2439,0,"",html,selection_keyboard
+2934,10459362,"examples/agi_cast.html",2439,0,"e",html,content
+2935,10459365,"examples/agi_cast.html",2440,0,"",html,selection_keyboard
+2936,10459558,"examples/agi_cast.html",2440,0,"s",html,content
+2937,10459561,"examples/agi_cast.html",2441,0,"",html,selection_keyboard
+2938,10459782,"examples/agi_cast.html",2441,0,"s",html,content
+2939,10459786,"examples/agi_cast.html",2442,0,"",html,selection_keyboard
+2940,10459811,"examples/agi_cast.html",2442,0,"i",html,content
+2941,10459814,"examples/agi_cast.html",2443,0,"",html,selection_keyboard
+2942,10459867,"examples/agi_cast.html",2443,0,"o",html,content
+2943,10459869,"examples/agi_cast.html",2444,0,"",html,selection_keyboard
+2944,10459925,"examples/agi_cast.html",2444,0,"n",html,content
+2945,10459927,"examples/agi_cast.html",2445,0,"",html,selection_keyboard
+2946,10460128,"examples/agi_cast.html",2445,0,"a",html,content
+2947,10460131,"examples/agi_cast.html",2446,0,"",html,selection_keyboard
+2948,10460208,"examples/agi_cast.html",2446,0,"l",html,content
+2949,10460210,"examples/agi_cast.html",2447,0,"",html,selection_keyboard
+2950,10460342,"examples/agi_cast.html",2447,0,"s",html,content
+2951,10460345,"examples/agi_cast.html",2448,0,"",html,selection_keyboard
+2952,10460405,"examples/agi_cast.html",2448,0," ",html,content
+2953,10460407,"examples/agi_cast.html",2449,0,"",html,selection_keyboard
+2954,10460411,"examples/agi_cast.html",2449,0,"i",html,content
+2955,10460413,"examples/agi_cast.html",2450,0,"",html,selection_keyboard
+2956,10460496,"examples/agi_cast.html",2450,0,"n",html,content
+2957,10460501,"examples/agi_cast.html",2451,0,"",html,selection_keyboard
+2958,10460707,"examples/agi_cast.html",2451,0," ",html,content
+2959,10460710,"examples/agi_cast.html",2452,0,"",html,selection_keyboard
+2960,10460826,"examples/agi_cast.html",2452,0,"d",html,content
+2961,10460829,"examples/agi_cast.html",2453,0,"",html,selection_keyboard
+2962,10460838,"examples/agi_cast.html",2453,0,"o",html,content
+2963,10460841,"examples/agi_cast.html",2454,0,"",html,selection_keyboard
+2964,10460853,"examples/agi_cast.html",2454,0,"m",html,content
+2965,10460859,"examples/agi_cast.html",2455,0,"",html,selection_keyboard
+2966,10461050,"examples/agi_cast.html",2455,0,"a",html,content
+2967,10461052,"examples/agi_cast.html",2456,0,"",html,selection_keyboard
+2968,10461061,"examples/agi_cast.html",2456,0,"i",html,content
+2969,10461063,"examples/agi_cast.html",2457,0,"",html,selection_keyboard
+2970,10461108,"examples/agi_cast.html",2457,0,"n",html,content
+2971,10461109,"examples/agi_cast.html",2458,0,"",html,selection_keyboard
+2972,10461305,"examples/agi_cast.html",2458,0,"s",html,content
+2973,10461307,"examples/agi_cast.html",2459,0,"",html,selection_keyboard
+2974,10461356,"examples/agi_cast.html",2459,0," ",html,content
+2975,10461357,"examples/agi_cast.html",2460,0,"",html,selection_keyboard
+2976,10461517,"examples/agi_cast.html",2460,0,"l",html,content
+2977,10461520,"examples/agi_cast.html",2461,0,"",html,selection_keyboard
+2978,10461528,"examples/agi_cast.html",2461,0,"i",html,content
+2979,10461530,"examples/agi_cast.html",2462,0,"",html,selection_keyboard
+2980,10461758,"examples/agi_cast.html",2462,0,"k",html,content
+2981,10461759,"examples/agi_cast.html",2463,0,"",html,selection_keyboard
+2982,10461774,"examples/agi_cast.html",2463,0,"e",html,content
+2983,10461775,"examples/agi_cast.html",2464,0,"",html,selection_keyboard
+2984,10461890,"examples/agi_cast.html",2464,0," ",html,content
+2985,10461892,"examples/agi_cast.html",2465,0,"",html,selection_keyboard
+2986,10462693,"examples/agi_cast.html",2465,0,"s",html,content
+2987,10462697,"examples/agi_cast.html",2466,0,"",html,selection_keyboard
+2988,10462713,"examples/agi_cast.html",2466,0,"o",html,content
+2989,10462714,"examples/agi_cast.html",2467,0,"",html,selection_keyboard
+2990,10462852,"examples/agi_cast.html",2467,0,"f",html,content
+2991,10462857,"examples/agi_cast.html",2468,0,"",html,selection_keyboard
+2992,10462940,"examples/agi_cast.html",2468,0,"t",html,content
+2993,10462942,"examples/agi_cast.html",2469,0,"",html,selection_keyboard
+2994,10463032,"examples/agi_cast.html",2469,0,"w",html,content
+2995,10463034,"examples/agi_cast.html",2470,0,"",html,selection_keyboard
+2996,10463712,"examples/agi_cast.html",2470,0,"a",html,content
+2997,10463715,"examples/agi_cast.html",2471,0,"",html,selection_keyboard
+2998,10463727,"examples/agi_cast.html",2471,0,"r",html,content
+2999,10463732,"examples/agi_cast.html",2472,0,"",html,selection_keyboard
+3000,10464029,"examples/agi_cast.html",2472,0,"e",html,content
+3001,10464034,"examples/agi_cast.html",2473,0,"",html,selection_keyboard
+3002,10464436,"examples/agi_cast.html",2473,0," ",html,content
+3003,10464441,"examples/agi_cast.html",2474,0,"",html,selection_keyboard
+3004,10464485,"examples/agi_cast.html",2474,0,"e",html,content
+3005,10464487,"examples/agi_cast.html",2475,0,"",html,selection_keyboard
+3006,10464566,"examples/agi_cast.html",2475,0,"n",html,content
+3007,10464568,"examples/agi_cast.html",2476,0,"",html,selection_keyboard
+3008,10464619,"examples/agi_cast.html",2476,0,"g",html,content
+3009,10464621,"examples/agi_cast.html",2477,0,"",html,selection_keyboard
+3010,10464740,"examples/agi_cast.html",2477,0,"i",html,content
+3011,10464743,"examples/agi_cast.html",2478,0,"",html,selection_keyboard
+3012,10464800,"examples/agi_cast.html",2478,0,"n",html,content
+3013,10464803,"examples/agi_cast.html",2479,0,"",html,selection_keyboard
+3014,10464831,"examples/agi_cast.html",2479,0,"e",html,content
+3015,10464834,"examples/agi_cast.html",2480,0,"",html,selection_keyboard
+3016,10465025,"examples/agi_cast.html",2480,0,"e",html,content
+3017,10465028,"examples/agi_cast.html",2481,0,"",html,selection_keyboard
+3018,10465101,"examples/agi_cast.html",2481,0,"r",html,content
+3019,10465106,"examples/agi_cast.html",2482,0,"",html,selection_keyboard
+3020,10465206,"examples/agi_cast.html",2482,0,"i",html,content
+3021,10465209,"examples/agi_cast.html",2483,0,"",html,selection_keyboard
+3022,10465250,"examples/agi_cast.html",2483,0,"n",html,content
+3023,10465253,"examples/agi_cast.html",2484,0,"",html,selection_keyboard
+3024,10465325,"examples/agi_cast.html",2484,0,"g",html,content
+3025,10465328,"examples/agi_cast.html",2485,0,"",html,selection_keyboard
+3026,10465493,"examples/agi_cast.html",2485,0," ",html,content
+3027,10465495,"examples/agi_cast.html",2486,0,"",html,selection_keyboard
+3028,10465558,"examples/agi_cast.html",2486,0,"n",html,content
+3029,10465560,"examples/agi_cast.html",2487,0,"",html,selection_keyboard
+3030,10465985,"examples/agi_cast.html",2486,1,"",html,content
+3031,10466141,"examples/agi_cast.html",2486,0,"a",html,content
+3032,10466142,"examples/agi_cast.html",2487,0,"",html,selection_keyboard
+3033,10466145,"examples/agi_cast.html",2487,0,"n",html,content
+3034,10466146,"examples/agi_cast.html",2488,0,"",html,selection_keyboard
+3035,10466292,"examples/agi_cast.html",2488,0,"d",html,content
+3036,10466295,"examples/agi_cast.html",2489,0,"",html,selection_keyboard
+3037,10466393,"examples/agi_cast.html",2489,0," ",html,content
+3038,10466395,"examples/agi_cast.html",2490,0,"",html,selection_keyboard
+3039,10466481,"examples/agi_cast.html",2490,0,"m",html,content
+3040,10466483,"examples/agi_cast.html",2491,0,"",html,selection_keyboard
+3041,10466649,"examples/agi_cast.html",2491,0,"a",html,content
+3042,10466651,"examples/agi_cast.html",2492,0,"",html,selection_keyboard
+3043,10466660,"examples/agi_cast.html",2492,0,"t",html,content
+3044,10466661,"examples/agi_cast.html",2493,0,"",html,selection_keyboard
+3045,10466759,"examples/agi_cast.html",2493,0,"h",html,content
+3046,10466762,"examples/agi_cast.html",2494,0,"",html,selection_keyboard
+3047,10466799,"examples/agi_cast.html",2494,0,"e",html,content
+3048,10466801,"examples/agi_cast.html",2495,0,"",html,selection_keyboard
+3049,10466949,"examples/agi_cast.html",2495,0,"m",html,content
+3050,10466951,"examples/agi_cast.html",2496,0,"",html,selection_keyboard
+3051,10467473,"examples/agi_cast.html",2490,6,"",html,content
+3052,10467986,"examples/agi_cast.html",2489,0,"",html,selection_command
+3053,10468118,"examples/agi_cast.html",2486,0,"",html,selection_command
+3054,10468368,"examples/agi_cast.html",2474,0,"",html,selection_command
+3055,10468399,"examples/agi_cast.html",2465,0,"",html,selection_command
+3056,10468433,"examples/agi_cast.html",2460,0,"",html,selection_command
+3057,10468465,"examples/agi_cast.html",2452,0,"",html,selection_command
+3058,10468499,"examples/agi_cast.html",2449,0,"",html,selection_command
+3059,10468532,"examples/agi_cast.html",2435,0,"",html,selection_command
+3060,10468566,"examples/agi_cast.html",2428,0,"",html,selection_command
+3061,10468600,"examples/agi_cast.html",2423,0,"",html,selection_command
+3062,10468633,"examples/agi_cast.html",2411,0,"",html,selection_command
+3063,10468667,"examples/agi_cast.html",2407,0,"",html,selection_command
+3064,10468699,"examples/agi_cast.html",2400,0,"",html,selection_command
+3065,10472122,"examples/agi_cast.html",2391,0,"",html,selection_command
+3066,10472502,"examples/agi_cast.html",2400,0,"",html,selection_command
+3067,10472661,"examples/agi_cast.html",2407,0,"",html,selection_command
+3068,10472827,"examples/agi_cast.html",2411,0,"",html,selection_command
+3069,10473894,"examples/agi_cast.html",2407,0,"",html,selection_command
+3070,10474311,"examples/agi_cast.html",2407,83,"",html,content
+3071,10474644,"examples/agi_cast.html",2407,0,"o",html,content
+3072,10474645,"examples/agi_cast.html",2408,0,"",html,selection_keyboard
+3073,10474685,"examples/agi_cast.html",2408,0,"u",html,content
+3074,10474687,"examples/agi_cast.html",2409,0,"",html,selection_keyboard
+3075,10474824,"examples/agi_cast.html",2409,0,"t",html,content
+3076,10474827,"examples/agi_cast.html",2410,0,"",html,selection_keyboard
+3077,10474973,"examples/agi_cast.html",2410,0,"p",html,content
+3078,10474975,"examples/agi_cast.html",2411,0,"",html,selection_keyboard
+3079,10475047,"examples/agi_cast.html",2411,0,"e",html,content
+3080,10475048,"examples/agi_cast.html",2412,0,"",html,selection_keyboard
+3081,10475121,"examples/agi_cast.html",2412,0,"r",html,content
+3082,10475123,"examples/agi_cast.html",2413,0,"",html,selection_keyboard
+3083,10475424,"examples/agi_cast.html",2413,0,"f",html,content
+3084,10475426,"examples/agi_cast.html",2414,0,"",html,selection_keyboard
+3085,10475559,"examples/agi_cast.html",2414,0,"o",html,content
+3086,10475561,"examples/agi_cast.html",2415,0,"",html,selection_keyboard
+3087,10475632,"examples/agi_cast.html",2415,0,"r",html,content
+3088,10475634,"examples/agi_cast.html",2416,0,"",html,selection_keyboard
+3089,10475767,"examples/agi_cast.html",2416,0,"m",html,content
+3090,10475769,"examples/agi_cast.html",2417,0,"",html,selection_keyboard
+3091,10475984,"examples/agi_cast.html",2417,0," ",html,content
+3092,10475986,"examples/agi_cast.html",2418,0,"",html,selection_keyboard
+3093,10476264,"examples/agi_cast.html",2418,0,"h",html,content
+3094,10476266,"examples/agi_cast.html",2419,0,"",html,selection_keyboard
+3095,10476409,"examples/agi_cast.html",2419,0,"u",html,content
+3096,10476414,"examples/agi_cast.html",2420,0,"",html,selection_keyboard
+3097,10476577,"examples/agi_cast.html",2420,0,"m",html,content
+3098,10476582,"examples/agi_cast.html",2421,0,"",html,selection_keyboard
+3099,10476744,"examples/agi_cast.html",2421,0,"a",html,content
+3100,10476748,"examples/agi_cast.html",2422,0,"",html,selection_keyboard
+3101,10476761,"examples/agi_cast.html",2422,0,"n",html,content
+3102,10476764,"examples/agi_cast.html",2423,0,"",html,selection_keyboard
+3103,10476995,"examples/agi_cast.html",2423,0,"s",html,content
+3104,10476998,"examples/agi_cast.html",2424,0,"",html,selection_keyboard
+3105,10477075,"examples/agi_cast.html",2424,0," ",html,content
+3106,10477078,"examples/agi_cast.html",2425,0,"",html,selection_keyboard
+3107,10477090,"examples/agi_cast.html",2425,0,"i",html,content
+3108,10477092,"examples/agi_cast.html",2426,0,"",html,selection_keyboard
+3109,10477168,"examples/agi_cast.html",2426,0,"n",html,content
+3110,10477178,"examples/agi_cast.html",2427,0,"",html,selection_keyboard
+3111,10477376,"examples/agi_cast.html",2427,0," ",html,content
+3112,10477378,"examples/agi_cast.html",2428,0,"",html,selection_keyboard
+3113,10481652,"examples/agi_cast.html",2428,0,"c",html,content
+3114,10481654,"examples/agi_cast.html",2429,0,"",html,selection_keyboard
+3115,10481751,"examples/agi_cast.html",2429,0,"o",html,content
+3116,10481754,"examples/agi_cast.html",2430,0,"",html,selection_keyboard
+3117,10481811,"examples/agi_cast.html",2430,0,"m",html,content
+3118,10481815,"examples/agi_cast.html",2431,0,"",html,selection_keyboard
+3119,10481899,"examples/agi_cast.html",2431,0,"p",html,content
+3120,10481902,"examples/agi_cast.html",2432,0,"",html,selection_keyboard
+3121,10482022,"examples/agi_cast.html",2432,0,"e",html,content
+3122,10482023,"examples/agi_cast.html",2433,0,"",html,selection_keyboard
+3123,10482126,"examples/agi_cast.html",2433,0,"t",html,content
+3124,10482128,"examples/agi_cast.html",2434,0,"",html,selection_keyboard
+3125,10482268,"examples/agi_cast.html",2434,0,"i",html,content
+3126,10482270,"examples/agi_cast.html",2435,0,"",html,selection_keyboard
+3127,10482391,"examples/agi_cast.html",2435,0,"t",html,content
+3128,10482394,"examples/agi_cast.html",2436,0,"",html,selection_keyboard
+3129,10482500,"examples/agi_cast.html",2436,0,"i",html,content
+3130,10482503,"examples/agi_cast.html",2437,0,"",html,selection_keyboard
+3131,10482592,"examples/agi_cast.html",2437,0,"o",html,content
+3132,10482594,"examples/agi_cast.html",2438,0,"",html,selection_keyboard
+3133,10482617,"examples/agi_cast.html",2438,0,"n",html,content
+3134,10482619,"examples/agi_cast.html",2439,0,"",html,selection_keyboard
+3135,10482894,"examples/agi_cast.html",2439,0,"s",html,content
+3136,10482895,"examples/agi_cast.html",2440,0,"",html,selection_keyboard
+3137,10483950,"examples/agi_cast.html",2439,0,"",html,selection_command
+3138,10484145,"examples/agi_cast.html",2428,0,"",html,selection_command
+3139,10484334,"examples/agi_cast.html",2425,0,"",html,selection_command
+3140,10484651,"examples/agi_cast.html",2428,0,"",html,selection_command
+3141,10489404,"examples/agi_cast.html",2425,0,"",html,selection_command
+3142,10489600,"examples/agi_cast.html",2418,0,"",html,selection_command
+3143,10493550,"examples/agi_cast.html",2425,0,"",html,selection_command
+3144,10493775,"examples/agi_cast.html",2428,0,"",html,selection_command
+3145,10495829,"examples/agi_cast.html",2428,12,"",html,content
+3146,10497303,"examples/agi_cast.html",2427,0,"",html,selection_command
+3147,10497480,"examples/agi_cast.html",2426,0,"",html,selection_command
+3148,10560811,"examples/agi_cast.html",0,0,"",html,tab
+3149,10564220,"examples/agi_cast.html",1886,0,"",html,selection_mouse
+3150,10564665,"examples/agi_cast.html",1888,0,"",html,selection_mouse
+3151,10565413,"examples/agi_cast.html",1887,0,"",html,selection_command
+3152,10565777,"examples/agi_cast.html",1887,1,"y",html,selection_command
+3153,10565878,"examples/agi_cast.html",1887,11,"your_video.",html,selection_command
+3154,10566629,"examples/agi_cast.html",1887,10,"your_video",html,selection_command
+3155,10566787,"examples/agi_cast.html",1887,10,"",html,content
+3156,10567083,"examples/agi_cast.html",1887,0,"a",html,content
+3157,10567084,"examples/agi_cast.html",1888,0,"",html,selection_keyboard
+3158,10567095,"examples/agi_cast.html",1888,0,"g",html,content
+3159,10567096,"examples/agi_cast.html",1889,0,"",html,selection_keyboard
+3160,10567156,"examples/agi_cast.html",1889,0,"i",html,content
+3161,10567159,"examples/agi_cast.html",1890,0,"",html,selection_keyboard
+3162,10570156,"examples/agi_cast.html",1887,7,"agi_cast.mp4",html,content
+3163,10570749,"examples/agi_cast.html",1898,0,"",html,selection_command
+3164,10573188,"TERMINAL",0,0,"^C[1G[0Kโ [1G[0K[1m[7m%[27m[1m[0m \r \r]633;D;0]633;P;Cwd=/Users/franzsrambical/Documents/pdoom/pdoom.org\r[0m[27m[24m[J]633;Afranzsrambical@MBF6N9WFVKFV pdoom.org % ]633;B[K[?2004h",,terminal_output
+3165,10573458,"TERMINAL",0,0,"npm run dev",,terminal_output
+3166,10573529,"TERMINAL",0,0,"[11Dcp examples/* dist",,terminal_output
+3167,10573831,"TERMINAL",0,0,"cp examples/* dist",,terminal_command
+3168,10573831,"TERMINAL",0,0,"[?2004l\r\r\n]633;E;cp examples/* dist;280736e1-4344-414b-b289-7b3558692a37",,terminal_output
+3169,10573875,"TERMINAL",0,0,"]633;C[1m[7m%[27m[1m[0m \r \r",,terminal_output
+3170,10574405,"TERMINAL",0,0,"npm run dev",,terminal_command
+3171,10574456,"TERMINAL",0,0,"]633;C",,terminal_output
+3172,10574760,"TERMINAL",0,0,"\r\n> distill-template@2.8.0 dev\r\n> rollup -c rollup.config.dev.js -w\r\n\r\n[1G[0K",,terminal_output
+3173,10575001,"TERMINAL",0,0,"c[4mrollup v2.7.3[24m\r\n[36mbundles [1msrc/components.js[22m โ [1mdist/template.v2.js[22m...[39m\r\n",,terminal_output
+3174,10575460,"TERMINAL",0,0,"[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/dist\r\n[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/examples\r\n[1m[33m(!) Circular dependencies[39m[22m\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/array.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/object.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-selection/src/selection/index.js -> node_modules/d3-selection/src/selection/select.js -> node_modules/d3-selection/src/selection/index.js\r\n...and 7 more\r\n[32mcreated [1mdist/template.v2.js[22m in [1m455ms[22m[39m\r\n[36mbundles [1msrc/transforms.js[22m โ [1mdist/transforms.v2.js[22m...[39m\r\n",,terminal_output
+3175,10576035,"TERMINAL",0,0,"[32mcreated [1mdist/transforms.v2.js[22m in [1m575ms[22m[39m\r\n\r\n[2025-11-02 15:55:45] waiting for changes...\r\n",,terminal_output
+3176,10645634,"examples/agi_cast.html",1984,0,"",html,selection_command
+3177,10645882,"examples/agi_cast.html",2103,0,"",html,selection_command
+3178,10645913,"examples/agi_cast.html",2230,0,"",html,selection_command
+3179,10645943,"examples/agi_cast.html",2274,0,"",html,selection_command
+3180,10645974,"examples/agi_cast.html",2282,0,"",html,selection_command
+3181,10646365,"examples/agi_cast.html",2401,0,"",html,selection_command
+3182,10646689,"examples/agi_cast.html",2282,0,"",html,selection_command
+3183,10646865,"examples/agi_cast.html",2274,0,"",html,selection_command
+3184,10647012,"examples/agi_cast.html",2230,0,"",html,selection_command
+3185,10647326,"examples/agi_cast.html",2103,0,"",html,selection_command
+3186,10650517,"examples/agi_cast.html",1984,0,"",html,selection_command
+3187,10676010,"examples/agi_cast.html",2103,0,"",html,selection_command
+3188,10676143,"examples/agi_cast.html",2230,0,"",html,selection_command
+3189,10676292,"examples/agi_cast.html",2274,0,"",html,selection_command
+3190,10676426,"examples/agi_cast.html",2282,0,"",html,selection_command
+3191,10676833,"examples/agi_cast.html",2401,0,"",html,selection_command
+3192,10677638,"examples/agi_cast.html",2284,0,"",html,selection_command
+3193,10679134,"examples/agi_cast.html",2425,0,"",html,selection_command
+3194,10753705,"examples/agi_cast.html",2423,0,"",html,selection_command
+3195,10753939,"examples/agi_cast.html",2416,0,"",html,selection_command
+3196,10753972,"examples/agi_cast.html",2405,0,"",html,selection_command
+3197,10754000,"examples/agi_cast.html",2398,0,"",html,selection_command
+3198,10754034,"examples/agi_cast.html",2389,0,"",html,selection_command
+3199,10754066,"examples/agi_cast.html",2383,0,"",html,selection_command
+3200,10754100,"examples/agi_cast.html",2381,0,"",html,selection_command
+3201,10754267,"examples/agi_cast.html",2375,0,"",html,selection_command
+3202,10754508,"examples/agi_cast.html",2381,0,"",html,selection_command
+3203,10754679,"examples/agi_cast.html",2383,0,"",html,selection_command
+3204,10754896,"examples/agi_cast.html",2383,1,"W",html,selection_command
+3205,10755066,"examples/agi_cast.html",2383,5,"While",html,selection_command
+3206,10755317,"examples/agi_cast.html",2383,14,"While frontier",html,selection_command
+3207,10755350,"examples/agi_cast.html",2383,21,"While frontier models",html,selection_command
+3208,10755382,"examples/agi_cast.html",2383,32,"While frontier models outperform",html,selection_command
+3209,10755416,"examples/agi_cast.html",2383,39,"While frontier models outperform humans",html,selection_command
+3210,10755512,"examples/agi_cast.html",2383,42,"While frontier models outperform humans in",html,selection_command
+3211,10756109,"examples/agi_cast.html",2383,42,"Internet-scale pre-training, preference modeling, and reinforcement learning\nusing verification signals offer a compelling pathway for language models to attain human-level\nperformance",html,content
+3212,10756124,"examples/agi_cast.html",2556,0," ",html,content
+3213,10756124,"examples/agi_cast.html",2460,0," ",html,content
+3214,10756835,"examples/agi_cast.html",2424,0,"",html,selection_command
+3215,10757064,"examples/agi_cast.html",2423,0,"",html,selection_command
+3216,10757318,"examples/agi_cast.html",2412,0,"",html,selection_command
+3217,10757343,"examples/agi_cast.html",2410,0,"",html,selection_command
+3218,10757508,"examples/agi_cast.html",2402,0,"",html,selection_command
+3219,10757762,"examples/agi_cast.html",2401,0,"",html,selection_command
+3220,10757792,"examples/agi_cast.html",2398,0,"",html,selection_command
+3221,10757827,"examples/agi_cast.html",2392,0,"",html,selection_command
+3222,10757858,"examples/agi_cast.html",2391,0,"",html,selection_command
+3223,10758097,"examples/agi_cast.html",2392,0,"",html,selection_command
+3224,10758351,"examples/agi_cast.html",2398,0,"",html,selection_command
+3225,10758382,"examples/agi_cast.html",2401,0,"",html,selection_command
+3226,10758409,"examples/agi_cast.html",2402,0,"",html,selection_command
+3227,10758441,"examples/agi_cast.html",2410,0,"",html,selection_command
+3228,10758475,"examples/agi_cast.html",2412,0,"",html,selection_command
+3229,10758510,"examples/agi_cast.html",2423,0,"",html,selection_command
+3230,10758543,"examples/agi_cast.html",2431,0,"",html,selection_command
+3231,10758575,"examples/agi_cast.html",2433,0,"",html,selection_command
+3232,10758609,"examples/agi_cast.html",2437,0,"",html,selection_command
+3233,10758642,"examples/agi_cast.html",2451,0,"",html,selection_command
+3234,10758676,"examples/agi_cast.html",2466,0,"",html,selection_command
+3235,10758708,"examples/agi_cast.html",2472,0,"",html,selection_command
+3236,10758742,"examples/agi_cast.html",2485,0,"",html,selection_command
+3237,10758774,"examples/agi_cast.html",2493,0,"",html,selection_command
+3238,10759028,"examples/agi_cast.html",2485,0,"",html,selection_command
+3239,10759282,"examples/agi_cast.html",2472,0,"",html,selection_command
+3240,10759312,"examples/agi_cast.html",2466,0,"",html,selection_command
+3241,10759344,"examples/agi_cast.html",2451,0,"",html,selection_command
+3242,10759376,"examples/agi_cast.html",2437,0,"",html,selection_command
+3243,10759409,"examples/agi_cast.html",2433,0,"",html,selection_command
+3244,10759763,"examples/agi_cast.html",2431,0,"",html,selection_command
+3245,10760015,"examples/agi_cast.html",2423,0,"",html,selection_command
+3246,10760046,"examples/agi_cast.html",2412,0,"",html,selection_command
+3247,10760075,"examples/agi_cast.html",2410,0,"",html,selection_command
+3248,10760108,"examples/agi_cast.html",2402,0,"",html,selection_command
+3249,10760142,"examples/agi_cast.html",2401,0,"",html,selection_command
+3250,10760485,"examples/agi_cast.html",2398,0,"",html,selection_command
+3251,10760735,"examples/agi_cast.html",2392,0,"",html,selection_command
+3252,10761051,"examples/agi_cast.html",2284,0,"",html,selection_command
+3253,10761328,"examples/agi_cast.html",2290,0,"",html,selection_command
+3254,10761577,"examples/agi_cast.html",2292,0,"",html,selection_command
+3255,10761608,"examples/agi_cast.html",2305,0,"",html,selection_command
+3256,10761635,"examples/agi_cast.html",2310,0,"",html,selection_command
+3257,10761669,"examples/agi_cast.html",2313,0,"",html,selection_command
+3258,10761701,"examples/agi_cast.html",2317,0,"",html,selection_command
+3259,10761736,"examples/agi_cast.html",2326,0,"",html,selection_command
+3260,10761772,"examples/agi_cast.html",2329,0,"",html,selection_command
+3261,10761801,"examples/agi_cast.html",2340,0,"",html,selection_command
+3262,10761835,"examples/agi_cast.html",2344,0,"",html,selection_command
+3263,10761867,"examples/agi_cast.html",2352,0,"",html,selection_command
+3264,10761901,"examples/agi_cast.html",2355,0,"",html,selection_command
+3265,10761933,"examples/agi_cast.html",2366,0,"",html,selection_command
+3266,10761968,"examples/agi_cast.html",2375,0,"",html,selection_command
+3267,10762004,"examples/agi_cast.html",2381,0,"",html,selection_command
+3268,10762033,"examples/agi_cast.html",2383,0,"",html,selection_command
+3269,10762067,"examples/agi_cast.html",2391,0,"",html,selection_command
+3270,10762100,"examples/agi_cast.html",2392,0,"",html,selection_command
+3271,10762134,"examples/agi_cast.html",2398,0,"",html,selection_command
+3272,10762167,"examples/agi_cast.html",2401,0,"",html,selection_command
+3273,10762200,"examples/agi_cast.html",2402,0,"",html,selection_command
+3274,10762326,"examples/agi_cast.html",2410,0,"",html,selection_command
+3275,10762492,"examples/agi_cast.html",2412,0,"",html,selection_command
+3276,10762660,"examples/agi_cast.html",2423,0,"",html,selection_command
+3277,10762793,"examples/agi_cast.html",2431,0,"",html,selection_command
+3278,10762954,"examples/agi_cast.html",2433,0,"",html,selection_command
+3279,10763244,"examples/agi_cast.html",2432,0,"",html,selection_command
+3280,10764085,"examples/agi_cast.html",2433,0,"",html,selection_command
+3281,10764443,"examples/agi_cast.html",2432,1,"",html,content
+3282,10764717,"examples/agi_cast.html",2432,0,"\n ",html,content
+3283,10765208,"examples/agi_cast.html",2438,0,"",html,selection_command
+3284,10765557,"examples/agi_cast.html",2471,0,"",html,selection_command
+3285,10765930,"examples/agi_cast.html",2472,0,"",html,selection_command
+3286,10766101,"examples/agi_cast.html",2470,2,"",html,content
+3287,10766282,"examples/agi_cast.html",2468,2,"",html,content
+3288,10766701,"examples/agi_cast.html",2466,2,"",html,content
+3289,10767232,"examples/agi_cast.html",2465,1,"",html,content
+3290,10767726,"examples/agi_cast.html",2465,0," ",html,content
+3291,10767727,"examples/agi_cast.html",2466,0,"",html,selection_keyboard
+3292,10767964,"examples/agi_cast.html",2465,0,"",html,selection_command
+3293,10769052,"examples/agi_cast.html",2579,0,"",html,selection_command
+3294,10769533,"examples/agi_cast.html",2568,0,"",html,selection_command
+3295,10769768,"examples/agi_cast.html",2566,2,"",html,content
+3296,10769950,"examples/agi_cast.html",2564,2,"",html,content
+3297,10770350,"examples/agi_cast.html",2562,2,"",html,content
+3298,10770907,"examples/agi_cast.html",2561,1,"",html,content
+3299,10771551,"examples/agi_cast.html",2561,0," ",html,content
+3300,10771554,"examples/agi_cast.html",2562,0,"",html,selection_keyboard
+3301,10771749,"examples/agi_cast.html",2561,0,"",html,selection_command
+3302,10772032,"examples/agi_cast.html",2574,0,"",html,selection_command
+3303,10772203,"examples/agi_cast.html",2574,0,",",html,content
+3304,10772205,"examples/agi_cast.html",2575,0,"",html,selection_keyboard
+3305,10772692,"examples/agi_cast.html",2574,1,"",html,content
+3306,10772817,"examples/agi_cast.html",2573,1,"",html,content
+3307,10773068,"examples/agi_cast.html",2573,0,",",html,content
+3308,10773070,"examples/agi_cast.html",2574,0,"",html,selection_keyboard
+3309,10773251,"examples/agi_cast.html",2574,0," ",html,content
+3310,10773252,"examples/agi_cast.html",2575,0,"",html,selection_keyboard
+3311,10786472,"examples/agi_cast.html",2575,0,"yet data is increasingly bottlenecking\nprogress from spiky towards general intelligence",html,content
+3312,10786479,"examples/agi_cast.html",2614,0," ",html,content
+3313,10786939,"examples/agi_cast.html",2667,0,"",html,selection_command
+3314,10787096,"examples/agi_cast.html",2656,0,"",html,selection_command
+3315,10787347,"examples/agi_cast.html",2648,0,"",html,selection_command
+3316,10787377,"examples/agi_cast.html",2640,0,"",html,selection_command
+3317,10787412,"examples/agi_cast.html",2634,0,"",html,selection_command
+3318,10787441,"examples/agi_cast.html",2629,0,"",html,selection_command
+3319,10787475,"examples/agi_cast.html",2620,0,"",html,selection_command
+3320,10787507,"examples/agi_cast.html",2600,0,"",html,selection_command
+3321,10787541,"examples/agi_cast.html",2587,0,"",html,selection_command
+3322,10787574,"examples/agi_cast.html",2584,0,"",html,selection_command
+3323,10787614,"examples/agi_cast.html",2579,0,"",html,selection_command
+3324,10788575,"examples/agi_cast.html",2582,0,"",html,selection_command
+3325,10789225,"examples/agi_cast.html",2583,0,"",html,selection_command
+3326,10789419,"examples/agi_cast.html",2583,0,"\n ",html,content
+3327,10789718,"examples/agi_cast.html",2589,0,"",html,selection_command
+3328,10790094,"examples/agi_cast.html",2626,0,"",html,selection_command
+3329,10790576,"examples/agi_cast.html",2627,0,"",html,selection_command
+3330,10790798,"examples/agi_cast.html",2625,2,"",html,content
+3331,10790943,"examples/agi_cast.html",2623,2,"",html,content
+3332,10791365,"examples/agi_cast.html",2621,2,"",html,content
+3333,10791758,"examples/agi_cast.html",2620,1,"",html,content
+3334,10792245,"examples/agi_cast.html",2620,0," ",html,content
+3335,10792248,"examples/agi_cast.html",2621,0,"",html,selection_keyboard
+3336,10792426,"examples/agi_cast.html",2620,0,"",html,selection_command
+3337,10792934,"examples/agi_cast.html",2591,0,"",html,selection_command
+3338,10793233,"examples/agi_cast.html",2590,1,"",html,content
+3339,10793610,"examples/agi_cast.html",2588,2,"",html,content
+3340,10794788,"examples/agi_cast.html",2666,0,".",html,content
+3341,10794790,"examples/agi_cast.html",2667,0,"",html,selection_command
+3342,10795707,"examples/agi_cast.html",2666,0,"",html,selection_command
+3343,10797014,"examples/agi_cast.html",2666,1,"",html,content
+3344,10797024,"examples/agi_cast.html",2588,0," ",html,content
+3345,10797036,"examples/agi_cast.html",2591,0,"",html,selection_command
+3346,10798010,"examples/agi_cast.html",2590,1,"",html,content
+3347,10798366,"examples/agi_cast.html",2589,0,"",html,selection_command
+3348,10798951,"examples/agi_cast.html",2438,0,"",html,selection_command
+3349,10799107,"examples/agi_cast.html",2289,0,"",html,selection_command
+3350,10799473,"examples/agi_cast.html",2290,0,"",html,selection_command
+3351,10799858,"examples/agi_cast.html",2439,0,"",html,selection_command
+3352,10800025,"examples/agi_cast.html",2590,0,"",html,selection_command
+3353,10801824,"examples/agi_cast.html",2579,0,"",html,selection_command
+3354,10802080,"examples/agi_cast.html",2575,0,"",html,selection_command
+3355,10802107,"examples/agi_cast.html",2573,0,"",html,selection_command
+3356,10802139,"examples/agi_cast.html",2562,0,"",html,selection_command
+3357,10802174,"examples/agi_cast.html",2556,0,"",html,selection_command
+3358,10802207,"examples/agi_cast.html",2555,0,"",html,selection_command
+3359,10802242,"examples/agi_cast.html",2550,0,"",html,selection_command
+3360,10802275,"examples/agi_cast.html",2543,0,"",html,selection_command
+3361,10802309,"examples/agi_cast.html",2540,0,"",html,selection_command
+3362,10802341,"examples/agi_cast.html",2533,0,"",html,selection_command
+3363,10802376,"examples/agi_cast.html",2524,0,"",html,selection_command
+3364,10802408,"examples/agi_cast.html",2520,0,"",html,selection_command
+3365,10802441,"examples/agi_cast.html",2512,0,"",html,selection_command
+3366,10802474,"examples/agi_cast.html",2501,0,"",html,selection_command
+3367,10802508,"examples/agi_cast.html",2499,0,"",html,selection_command
+3368,10802540,"examples/agi_cast.html",2493,0,"",html,selection_command
+3369,10802919,"examples/agi_cast.html",2485,0,"",html,selection_command
+3370,10803170,"examples/agi_cast.html",2472,0,"",html,selection_command
+3371,10803203,"examples/agi_cast.html",2485,0,"",html,selection_command
+3372,10803454,"examples/agi_cast.html",2493,0,"",html,selection_command
+3373,10803487,"examples/agi_cast.html",2499,0,"",html,selection_command
+3374,10803519,"examples/agi_cast.html",2501,0,"",html,selection_command
+3375,10803551,"examples/agi_cast.html",2512,0,"",html,selection_command
+3376,10803585,"examples/agi_cast.html",2520,0,"",html,selection_command
+3377,10803617,"examples/agi_cast.html",2524,0,"",html,selection_command
+3378,10803655,"examples/agi_cast.html",2533,0,"",html,selection_command
+3379,10803695,"examples/agi_cast.html",2540,0,"",html,selection_command
+3380,10803718,"examples/agi_cast.html",2543,0,"",html,selection_command
+3381,10803755,"examples/agi_cast.html",2550,0,"",html,selection_command
+3382,10803785,"examples/agi_cast.html",2555,0,"",html,selection_command
+3383,10803817,"examples/agi_cast.html",2556,0,"",html,selection_command
+3384,10804031,"examples/agi_cast.html",2560,0,"",html,selection_command
+3385,10804213,"examples/agi_cast.html",2572,0,"",html,selection_command
+3386,10804393,"examples/agi_cast.html",2573,0,"",html,selection_command
+3387,10804744,"examples/agi_cast.html",2572,0,"",html,selection_command
+3388,10805029,"examples/agi_cast.html",2573,0,"",html,selection_command
+3389,10805110,"examples/agi_cast.html",2573,0," ",html,content
+3390,10805116,"examples/agi_cast.html",2574,0,"",html,selection_keyboard
+3391,10805459,"examples/agi_cast.html",2574,0,"()",html,content
+3392,10805461,"examples/agi_cast.html",2575,0,"",html,selection_keyboard
+3393,10805997,"examples/agi_cast.html",2574,2,"",html,content
+3394,10806685,"examples/agi_cast.html",2574,0,"<",html,content
+3395,10806688,"examples/agi_cast.html",2575,0,"",html,selection_keyboard
+3396,10807878,"examples/agi_cast.html",2575,0,"d-cite key=""openai2025imo,deepmind2025imo"">",html,content
+3397,10808109,"examples/agi_cast.html",2626,0,"",html,selection_command
+3398,10809089,"examples/agi_cast.html",2622,0,"",html,selection_command
+3399,10809342,"examples/agi_cast.html",2621,0,"",html,selection_command
+3400,10809369,"examples/agi_cast.html",2620,0,"",html,selection_command
+3401,10809398,"examples/agi_cast.html",2616,0,"",html,selection_command
+3402,10809433,"examples/agi_cast.html",2601,0,"",html,selection_command
+3403,10809465,"examples/agi_cast.html",2600,0,"",html,selection_command
+3404,10809625,"examples/agi_cast.html",2587,0,"",html,selection_command
+3405,10809786,"examples/agi_cast.html",2585,0,"",html,selection_command
+3406,10810081,"examples/agi_cast.html",2587,0,"",html,selection_command
+3407,10810227,"examples/agi_cast.html",2600,0,"",html,selection_command
+3408,10810858,"examples/agi_cast.html",2587,0,"",html,selection_command
+3409,10811743,"examples/agi_cast.html",2585,0,"",html,selection_command
+3410,10812011,"examples/agi_cast.html",2582,0,"",html,selection_command
+3411,10812209,"examples/agi_cast.html",2577,0,"",html,selection_command
+3412,10812368,"examples/agi_cast.html",2576,0,"",html,selection_command
+3413,10812730,"examples/agi_cast.html",2575,0,"",html,selection_command
+3414,10813154,"examples/agi_cast.html",2574,0,"",html,selection_command
+3415,10813826,"examples/agi_cast.html",2573,1,"",html,content
+3416,10814031,"examples/agi_cast.html",2573,0,"\n ",html,content
+3417,10814291,"examples/agi_cast.html",2579,0,"",html,selection_command
+3418,10815573,"examples/agi_cast.html",2649,0,"",html,selection_command
+3419,10815921,"examples/agi_cast.html",2650,0,"",html,selection_command
+3420,10816094,"examples/agi_cast.html",2648,2,"",html,content
+3421,10816242,"examples/agi_cast.html",2646,2,"",html,content
+3422,10816627,"examples/agi_cast.html",2644,2,"",html,content
+3423,10817050,"examples/agi_cast.html",2643,1,"",html,content
+3424,10817527,"examples/agi_cast.html",2643,0," ",html,content
+3425,10817530,"examples/agi_cast.html",2644,0,"",html,selection_keyboard
+3426,10817879,"examples/agi_cast.html",2643,0,"",html,selection_command
+3427,10818408,"examples/agi_cast.html",2502,0,"",html,selection_command
+3428,10818989,"examples/agi_cast.html",2643,0,"",html,selection_command
+3429,10819214,"examples/agi_cast.html",2639,0,"",html,selection_command
+3430,10819470,"examples/agi_cast.html",2635,0,"",html,selection_command
+3431,10819500,"examples/agi_cast.html",2632,0,"",html,selection_command
+3432,10819527,"examples/agi_cast.html",2628,0,"",html,selection_command
+3433,10819560,"examples/agi_cast.html",2627,0,"",html,selection_command
+3434,10819593,"examples/agi_cast.html",2626,0,"",html,selection_command
+3435,10819625,"examples/agi_cast.html",2622,0,"",html,selection_command
+3436,10819658,"examples/agi_cast.html",2607,0,"",html,selection_command
+3437,10819777,"examples/agi_cast.html",2606,0,"",html,selection_command
+3438,10819978,"examples/agi_cast.html",2593,0,"",html,selection_command
+3439,10825263,"examples/bibliography.bib",0,0,"@article{radford2018improving,\n title = {Improving language understanding by generative pre-training},\n author = {Radford, Alec and Narasimhan, Karthik and Salimans, Tim and\n Sutskever, Ilya and others},\n}\n\n@article{radford2019language,\n title = {Language models are unsupervised multitask learners},\n author = {Radford, Alec and Wu, Jeffrey and Child, Rewon and Luan, David and\n Amodei, Dario and Sutskever, Ilya and others},\n journal = {OpenAI blog},\n volume = {1},\n number = {8},\n pages = {9},\n year = {2019},\n}\n\n@article{brown2020language,\n title = {Language models are few-shot learners},\n author = {Brown, Tom and Mann, Benjamin and Ryder, Nick and Subbiah, Melanie\n and Kaplan, Jared D and Dhariwal, Prafulla and Neelakantan, Arvind\n and Shyam, Pranav and Sastry, Girish and Askell, Amanda and others},\n journal = {Advances in neural information processing systems},\n volume = {33},\n pages = {1877--1901},\n year = {2020},\n}\n\n@article{raffel2020exploring,\n title = {Exploring the limits of transfer learning with a unified text-to-text\n transformer},\n author = {Raffel, Colin and Shazeer, Noam and Roberts, Adam and Lee, Katherine\n and Narang, Sharan and Matena, Michael and Zhou, Yanqi and Li, Wei\n and Liu, Peter J},\n journal = {Journal of machine learning research},\n volume = {21},\n number = {140},\n pages = {1--67},\n year = {2020},\n}\n\n@article{touvron2023llama,\n title = {Llama 2: Open foundation and fine-tuned chat models},\n author = {Touvron, Hugo and Martin, Louis and Stone, Kevin and Albert, Peter\n and Almahairi, Amjad and Babaei, Yasmine and Bashlykov, Nikolay and\n Batra, Soumya and Bhargava, Prajjwal and Bhosale, Shruti and others},\n journal = {arXiv preprint arXiv:2307.09288},\n year = {2023},\n}\n\n@article{bai2023qwen,\n title = {Qwen technical report},\n author = {Bai, Jinze and Bai, Shuai and Chu, Yunfei and Cui, Zeyu and Dang,\n Kai and Deng, Xiaodong and Fan, Yang and Ge, Wenbin and Han, Yu and\n Huang, Fei and others},\n journal = {arXiv preprint arXiv:2309.16609},\n year = {2023},\n}\n\n@article{young2024yi,\n title = {Yi: Open foundation models by 01. ai},\n author = {Young, Alex and Chen, Bei and Li, Chao and Huang, Chengen and Zhang,\n Ge and Zhang, Guanwei and Li, Heng and Zhu, Jiangcheng and Chen,\n Jianqun and Chang, Jing and others},\n journal = {arXiv preprint arXiv:2403.04652},\n year = {2024},\n}\n\n@article{vaswani2017attention,\n title = {Attention is all you need},\n author = {Vaswani, Ashish and Shazeer, Noam and Parmar, Niki and Uszkoreit,\n Jakob and Jones, Llion and Gomez, Aidan N and Kaiser, {\L}ukasz and\n Polosukhin, Illia},\n journal = {Advances in neural information processing systems},\n volume = {30},\n year = {2017},\n}\n\n@article{raffel2020exploring,\n title = {Exploring the limits of transfer learning with a unified text-to-text\n transformer},\n author = {Raffel, Colin and Shazeer, Noam and Roberts, Adam and Lee, Katherine\n and Narang, Sharan and Matena, Michael and Zhou, Yanqi and Li, Wei\n and Liu, Peter J},\n journal = {Journal of machine learning research},\n volume = {21},\n number = {140},\n pages = {1--67},\n year = {2020},\n}\n\n@inproceedings{zhou2024what,\n title = {What Algorithms can Transformers Learn? A Study in Length\n Generalization},\n author = {Hattie Zhou and Arwen Bradley and Etai Littwin and Noam Razin and\n Omid Saremi and Joshua M. Susskind and Samy Bengio and Preetum\n Nakkiran},\n booktitle = {The Twelfth International Conference on Learning Representations},\n year = {2024},\n url = {https://openreview.net/forum?id=AssIuHnmHX},\n}\n\n@inproceedings{ding2024causallm,\n title = {Causal{LM} is not optimal for in-context learning},\n author = {Nan Ding and Tomer Levinboim and Jialin Wu and Sebastian Goodman and\n Radu Soricut},\n booktitle = {The Twelfth International Conference on Learning Representations},\n year = {2024},\n url = {https://openreview.net/forum?id=guRNebwZBb},\n}\n\n@article{williams1989learning,\n title = {A learning algorithm for continually running fully recurrent neural\n networks},\n author = {Williams, Ronald J and Zipser, David},\n journal = {Neural computation},\n volume = {1},\n number = {2},\n pages = {270--280},\n year = {1989},\n publisher = {MIT Press One Rogers Street, Cambridge, MA 02142-1209, USA\n journals-info~…},\n}\n\n@article{tay2022ul2,\n title = {Ul2: Unifying language learning paradigms},\n author = {Tay, Yi and Dehghani, Mostafa and Tran, Vinh Q and Garcia, Xavier\n and Wei, Jason and Wang, Xuezhi and Chung, Hyung Won and Shakeri,\n Siamak and Bahri, Dara and Schuster, Tal and others},\n journal = {arXiv preprint arXiv:2205.05131},\n year = {2022},\n}\n\n@misc{pfau2023last,\n title = {Last I checked, it was still not possible for a neural network alone\n (i.e. no MCTS) to beat the world's best Go players...},\n author = {Pfau, David},\n year = {2023},\n url = {https://twitter.com/pfau/status/1732785418565796167},\n note = {Accessed: 2023-12-07},\n}\n\n@article{deepmind2023alphacode,\n title = {AlphaCode 2 Technical Report},\n author = {Team, AlphaCode and Deepmind, Google},\n year = {2023},\n journal = {Google Deepmind},\n url = {\n https://storage.googleapis.com/deepmind-media/AlphaCode2/AlphaCode2_Tech_Report.pdf\n },\n}\n\n@article{reuters2023sam,\n author = {Tong, Anna and Dastin, Jeffrey and Hu, Krystal},\n title = {Sam Altman's ouster from OpenAI was precipitated by letter to board\n about AI breakthrough},\n journal = {Reuters},\n year = {2023},\n url = {\n https://www.reuters.com/technology/sam-altmans-ouster-openai-was-precipitated-by-letter-board-about-ai-breakthrough-2023-11-22/\n },\n note = {Accessed: 2023-12-07},\n}\n\n@misc{imbue2023podcast,\n title = {Noam Brown, FAIR: On achieving human-level performance in poker and\n Diplomacy, and the power of spending compute at inference time},\n author = {Noam Brown},\n howpublished = {\n https://imbue.com/podcast/2023-02-09-podcast-episode-27-noam-brown/\n },\n year = {2023},\n note = {Podcast episode 27, February 9, 2023},\n}\n\n@misc{karpathy2023youtube,\n author = {Karpathy, Andrej},\n title = {[1hr Talk] Intro to Large Language Models},\n howpublished = {YouTube},\n year = {2023},\n note = {Accessed: 2023-12-07},\n url = {https://www.youtube.com/watch?v=zjkBMFhNj_g&t=2100s},\n}\n\n@article{brown2019superhuman,\n title = {Superhuman AI for multiplayer poker},\n author = {Brown, Noam and Sandholm, Tuomas},\n journal = {Science},\n volume = {365},\n number = {6456},\n pages = {885--890},\n year = {2019},\n publisher = {American Association for the Advancement of Science},\n}\n\n@article{silver2016mastering,\n title = {Mastering the game of Go with deep neural networks and tree search},\n author = {Silver, David and Huang, Aja and Maddison, Chris J and Guez, Arthur\n and Sifre, Laurent and Van Den Driessche, George and Schrittwieser,\n Julian and Antonoglou, Ioannis and Panneershelvam, Veda and Lanctot,\n Marc and others},\n journal = {nature},\n volume = {529},\n number = {7587},\n pages = {484--489},\n year = {2016},\n publisher = {Nature Publishing Group},\n}\n\n@article{schrittwieser2020mastering,\n title = {Mastering atari, go, chess and shogi by planning with a learned model\n },\n author = {Schrittwieser, Julian and Antonoglou, Ioannis and Hubert, Thomas and\n Simonyan, Karen and Sifre, Laurent and Schmitt, Simon and Guez,\n Arthur and Lockhart, Edward and Hassabis, Demis and Graepel, Thore\n and others},\n journal = {Nature},\n volume = {588},\n number = {7839},\n pages = {604--609},\n year = {2020},\n publisher = {Nature Publishing Group UK London},\n}\n\n@article{wei2022chain,\n title = {Chain-of-thought prompting elicits reasoning in large language models\n },\n author = {Wei, Jason and Wang, Xuezhi and Schuurmans, Dale and Bosma, Maarten\n and Xia, Fei and Chi, Ed and Le, Quoc V and Zhou, Denny and others},\n journal = {Advances in neural information processing systems},\n volume = {35},\n pages = {24824--24837},\n year = {2022},\n}\n\n@article{yao2024tree,\n title = {Tree of thoughts: Deliberate problem solving with large language\n models},\n author = {Yao, Shunyu and Yu, Dian and Zhao, Jeffrey and Shafran, Izhak and\n Griffiths, Tom and Cao, Yuan and Narasimhan, Karthik},\n journal = {Advances in Neural Information Processing Systems},\n volume = {36},\n year = {2024},\n}\n\n@article{lecun2022path,\n title = {A path towards autonomous machine intelligence version 0.9. 2,\n 2022-06-27},\n author = {LeCun, Yann},\n journal = {Open Review},\n volume = {62},\n number = {1},\n year = {2022},\n}\n\n@article{hoffmann2022training,\n title = {Training compute-optimal large language models},\n author = {Hoffmann, Jordan and Borgeaud, Sebastian and Mensch, Arthur and\n Buchatskaya, Elena and Cai, Trevor and Rutherford, Eliza and Casas,\n Diego de Las and Hendricks, Lisa Anne and Welbl, Johannes and Clark,\n Aidan and others},\n journal = {arXiv preprint arXiv:2203.15556},\n year = {2022},\n}\n\n@article{meta2024introducing,\n title = {Introducing meta llama 3: The most capable openly available llm to\n date},\n author = {Meta, AI},\n journal = {Meta AI.},\n year = {2024},\n}\n\n@misc{riley2024it,\n title = {It's just not a very useful scaling law.},\n author = {@riley_stews},\n year = {2024},\n url = {https://x.com/riley_stews/status/1781019732122198288},\n note = {Accessed: 2023-04-20},\n}\n\n@article{shazeer2017outrageously,\n title = {Outrageously large neural networks: The sparsely-gated\n mixture-of-experts layer},\n author = {Shazeer, Noam and Mirhoseini, Azalia and Maziarz, Krzysztof and\n Davis, Andy and Le, Quoc and Hinton, Geoffrey and Dean, Jeff},\n journal = {arXiv preprint arXiv:1701.06538},\n year = {2017},\n}\n\n@article{fedus2022switch,\n title = {Switch transformers: Scaling to trillion parameter models with simple\n and efficient sparsity},\n author = {Fedus, William and Zoph, Barret and Shazeer, Noam},\n journal = {Journal of Machine Learning Research},\n volume = {23},\n number = {120},\n pages = {1--39},\n year = {2022},\n}\n\n@article{schulman2015high,\n title = {High-dimensional continuous control using generalized advantage\n estimation},\n author = {Schulman, John and Moritz, Philipp and Levine, Sergey and Jordan,\n Michael and Abbeel, Pieter},\n journal = {arXiv preprint arXiv:1506.02438},\n year = {2015},\n}\n\n@article{srambical2025ppo,\n author = {Srambical, Franz},\n title = {PPO Is Secretly Using Monte Carlo Advantage Estimation In LLM\n Post-Training},\n journal = {p(doom) blog},\n year = {2025},\n note = {https://pdoom.org/blog.html},\n}\n\n@article{williams1992simple,\n title = {Simple statistical gradient-following algorithms for connectionist\n reinforcement learning},\n author = {Williams, Ronald J},\n journal = {Machine learning},\n volume = {8},\n pages = {229--256},\n year = {1992},\n publisher = {Springer},\n}\n\n@software{deepmind2020jax,\n title = {The {D}eep{M}ind {JAX} {E}cosystem},\n author = {DeepMind and Babuschkin, Igor and Baumli, Kate and Bell, Alison and\n Bhupatiraju, Surya and Bruce, Jake and Buchlovsky, Peter and Budden,\n David and Cai, Trevor and Clark, Aidan and Danihelka, Ivo and Dedieu,\n Antoine and Fantacci, Claudio and Godwin, Jonathan and Jones, Chris\n and Hemsley, Ross and Hennigan, Tom and Hessel, Matteo and Hou,\n Shaobo and Kapturowski, Steven and Keck, Thomas and Kemaev, Iurii and\n King, Michael and Kunesch, Markus and Martens, Lena and Merzic, Hamza\n and Mikulik, Vladimir and Norman, Tamara and Papamakarios, George and\n Quan, John and Ring, Roman and Ruiz, Francisco and Sanchez, Alvaro\n and Sartran, Laurent and Schneider, Rosalia and Sezener, Eren and\n Spencer, Stephen and Srinivasan, Srivatsan and Stanojevi\'{c}, Milo\v\n {s} and Stokowiec, Wojciech and Wang, Luyu and Zhou, Guangyao and\n Viola, Fabio},\n url = {http://github.com/deepmind},\n year = {2020},\n}\n\n@misc{jax2025jit,\n title = {JAX: Just-in-time compilation},\n author = {James Bradbury and Roy Frostig and Peter Hawkins and Matthew James\n Johnson and Chris Leary and Dougal Maclaurin and George Necula and\n Adam Paszke and Jake Vander{P}las and Skye Wanderman-{M}ilne and Qiao\n Zhang},\n year = {2025},\n url = {https://docs.jax.dev/en/latest/jit-compilation.html},\n note = {Accessed: 2025-03-26},\n}\n\n@misc{jax2025callbacks,\n title = {JAX: External callbacks},\n author = {James Bradbury and Roy Frostig and Peter Hawkins and Matthew James\n Johnson and Chris Leary and Dougal Maclaurin and George Necula and\n Adam Paszke and Jake Vander{P}las and Skye Wanderman-{M}ilne and Qiao\n Zhang},\n year = {2025},\n url = {https://docs.jax.dev/en/latest/external-callbacks.html},\n note = {Accessed: 2025-03-26},\n}\n\n@misc{jax2025checkify,\n title = {JAX: The `checkify` transformation},\n author = {James Bradbury and Roy Frostig and Peter Hawkins and Matthew James\n Johnson and Chris Leary and Dougal Maclaurin and George Necula and\n Adam Paszke and Jake Vander{P}las and Skye Wanderman-{M}ilne and Qiao\n Zhang},\n year = {2025},\n url = {https://docs.jax.dev/en/latest/debugging/checkify_guide.html},\n note = {Accessed: 2025-03-26},\n}\n\n@misc{jax2025key,\n title = {JAX: Key concepts},\n author = {James Bradbury and Roy Frostig and Peter Hawkins and Matthew James\n Johnson and Chris Leary and Dougal Maclaurin and George Necula and\n Adam Paszke and Jake Vander{P}las and Skye Wanderman-{M}ilne and Qiao\n Zhang},\n year = {2025},\n url = {https://docs.jax.dev/en/latest/key-concepts.html},\n note = {Accessed: 2025-03-26},\n}\n\n@software{deepmind2020chex,\n title = {Chex},\n author = {James Bradbury and Roy Frostig and Peter Hawkins and Matthew James\n Johnson and Chris Leary and Dougal Maclaurin and George Necula and\n Adam Paszke and Jake Vander{P}las and Skye Wanderman-{M}ilne and Qiao\n Zhang},\n url = {http://github.com/google-deepmind/chex},\n year = {2020},\n}\n\n@misc{jax2025control,\n title = {JAX: Control flow and logical operators with JIT},\n author = {James Bradbury and Roy Frostig and Peter Hawkins and Matthew James\n Johnson and Chris Leary and Dougal Maclaurin and George Necula and\n Adam Paszke and Jake Vander{P}las and Skye Wanderman-{M}ilne and Qiao\n Zhang},\n year = {2025},\n url = {https://docs.jax.dev/en/latest/control-flow.html},\n note = {Accessed: 2025-03-26},\n}\n\n@misc{xla2025conditional,\n title = {XLA:Operation Semantics:Conditional},\n author = {James Bradbury and Roy Frostig and Peter Hawkins and Matthew James\n Johnson and Chris Leary and Dougal Maclaurin and George Necula and\n Adam Paszke and Jake Vander{P}las and Skye Wanderman-{M}ilne and Qiao\n Zhang},\n year = {2025},\n url = {https://openxla.org/xla/operation_semantics#conditional},\n note = {Accessed: 2025-03-26},\n}\n\n@misc{ayaka76822025error,\n author = {ayaka7682},\n title = {Message on public Discord server: Try this:\n \n import jax from jax._src.error_check import set_error_if, raise_if_error\n \n \n import jax.numpy as jnp\n \n \n @jax.jit\n \n \n def f(x, y):\n \n \n set_error_if(x != 0, 'x must be 0')\n \n \n return jnp.multiply(x, y)\n \n \n f(0, 0)\n \n \n raise_if_error()\n },\n year = {2025},\n url = {\n https://discord.com/channels/1107832795377713302/1107832795688083561/1354171414596419854\n },\n note = {Accessed: 2025-03-26},\n}\n\n@book{sutton1998reinforcement,\n title={Reinforcement learning: An introduction},\n author={Sutton, Richard S and Barto, Andrew G and others},\n volume={1},\n number={1},\n year={1998},\n publisher={MIT press Cambridge}\n}\n\n@article{sutton1999policy,\n title={Policy gradient methods for reinforcement learning with function approximation},\n author={Sutton, Richard S and McAllester, David and Singh, Satinder and Mansour, Yishay},\n journal={Advances in neural information processing systems},\n volume={12},\n year={1999}\n}\n\n@article{degris2012off,\n title={Off-policy actor-critic},\n author={Degris, Thomas and White, Martha and Sutton, Richard S},\n journal={arXiv preprint arXiv:1205.4839},\n year={2012}\n}\n\n@article{schulman2017proximal,\n title={Proximal policy optimization algorithms},\n author={Schulman, John and Wolski, Filip and Dhariwal, Prafulla and Radford, Alec and Klimov, Oleg},\n journal={arXiv preprint arXiv:1707.06347},\n year={2017}\n}\n\n@article{ouyang2022training,\n title={Training language models to follow instructions with human feedback},\n author={Ouyang, Long and Wu, Jeffrey and Jiang, Xu and Almeida, Diogo and Wainwright, Carroll and Mishkin, Pamela and Zhang, Chong and Agarwal, Sandhini and Slama, Katarina and Ray, Alex and others},\n journal={Advances in neural information processing systems},\n volume={35},\n pages={27730--27744},\n year={2022}\n}\n\n\n@misc{openai2025imo,\n title = {We achieved gold medal-level performance 🥇on the 2025 International Mathematical Olympiad with a general-purpose reasoning LLM!},\n author = {OpenAI},\n year = {2025},\n url = {https://x.com/OpenAI/status/1946594928945148246},\n note = {Accessed: 2025-08-05},\n}\n\n@misc{deepmind2025imo,\n title = {Advanced version of Gemini with Deep Think officially achieves gold-medal standard at the International Mathematical Olympiad},\n author = {Luong, Thang and Lockhart, Edward},\n year = {2025},\n url = {https://deepmind.google/discover/blog/advanced-version-of-gemini-with-deep-think-officially-achieves-gold-medal-standard-at-the-international-mathematical-olympiad/},\n note = {DeepMind Blog, July 21, 2025},\n}\n\n@misc{deepseekai2025r1,\n title={DeepSeek-R1: Incentivizing Reasoning Capability in LLMs via Reinforcement Learning}, \n author={DeepSeek-AI},\n year={2025},\n eprint={2501.12948},\n archivePrefix={arXiv},\n primaryClass={cs.CL},\n url={https://arxiv.org/abs/2501.12948}, \n}\n\n@misc{cursor2025tab,\n title = {A New Tab Model},\n author = {Cursor},\n year = {2025},\n url = {https://cursor.com/blog/tab-update},\n note = {Accessed: 2025-08-05},\n}\n\n@inproceedings{bruce2024genie,\n title={Genie: Generative Interactive Environments},\n author={Jake Bruce and Michael D Dennis and Ashley Edwards and Jack Parker-Holder and Yuge Shi and Edward Hughes and Matthew Lai and Aditi Mavalankar and Richie Steigerwald and Chris Apps and Yusuf Aytar and Sarah Maria Elisabeth Bechtle and Feryal Behbahani and Stephanie C.Y. Chan and Nicolas Heess and Lucy Gonzalez and Simon Osindero and Sherjil Ozair and Scott Reed and Jingwei Zhang and Konrad Zolna and Jeff Clune and Nando de Freitas and Satinder Singh and Tim Rockt{\""a}schel},\n booktitle={Forty-first International Conference on Machine Learning},\n year={2024},\n url={https://openreview.net/forum?id=bJbSbJskOS}\n}\n\n@article{parkerholder2024genie2,\n title = {Genie 2: A Large-Scale Foundation World Model},\n author = {Jack Parker-Holder and Philip Ball and Jake Bruce and Vibhavari Dasagi and Kristian Holsheimer and Christos Kaplanis and Alexandre Moufarek and Guy Scully and Jeremy Shar and Jimmy Shi and Stephen Spencer and Jessica Yung and Michael Dennis and Sultan Kenjeyev and Shangbang Long and Vlad Mnih and Harris Chan and Maxime Gazeau and Bonnie Li and Fabio Pardo and Luyu Wang and Lei Zhang and Frederic Besse and Tim Harley and Anna Mitenkova and Jane Wang and Jeff Clune and Demis Hassabis and Raia Hadsell and Adrian Bolton and Satinder Singh and Tim Rockt{\""a}schel},\n year = {2024},\n url = {https://deepmind.google/discover/blog/genie-2-a-large-scale-foundation-world-model/}\n}\n\n@article{deepmind2025genie3,\n title = {Genie 3: A New Frontier for World Models},\n author = {Philip J. Ball and Jakob Bauer and Frank Belletti and Bethanie Brownfield and Ariel Ephrat and Shlomi Fruchter and Agrim Gupta and Kristian Holsheimer and Aleksander Holynski and Jiri Hron and Christos Kaplanis and Marjorie Limont and Matt McGill and Yanko Oliveira and Jack Parker-Holder and Frank Perbet and Guy Scully and Jeremy Shar and Stephen Spencer and Omer Tov and Ruben Villegas and Emma Wang and Jessica Yung and Cip Baetu and Jordi Berbel and David Bridson and Jake Bruce and Gavin Buttimore and Sarah Chakera and Bilva Chandra and Paul Collins and Alex Cullum and Bogdan Damoc and Vibha Dasagi and Maxime Gazeau and Charles Gbadamosi and Woohyun Han and Ed Hirst and Ashyana Kachra and Lucie Kerley and Kristian Kjems and Eva Knoepfel and Vika Koriakin and Jessica Lo and Cong Lu and Zeb Mehring and Alex Moufarek and Henna Nandwani and Valeria Oliveira and Fabio Pardo and Jane Park and Andrew Pierson and Ben Poole and Helen Ran and Tim Salimans and Manuel Sanchez and Igor Saprykin and Amy Shen and Sailesh Sidhwani and Duncan Smith and Joe Stanton and Hamish Tomlinson and Dimple Vijaykumar and Luyu Wang and Piers Wingfield and Nat Wong and Keyang Xu and Christopher Yew and Nick Young and Vadim Zubov and Douglas Eck and Dumitru Erhan and Koray Kavukcuoglu and Demis Hassabis and Zoubin Gharamani and Raia Hadsell and A{\""a}ron van den Oord and Inbar Mosseri and Adrian Bolton and Satinder Singh and Tim Rockt{\""a}schel},\n year = {2025},\n url = {}\n}\n\n@InProceedings{parkerholder2022evolving,\n title = \t {Evolving Curricula with Regret-Based Environment Design},\n author = {Parker-Holder, Jack and Jiang, Minqi and Dennis, Michael and Samvelyan, Mikayel and Foerster, Jakob and Grefenstette, Edward and Rockt{\""a}schel, Tim},\n booktitle = \t {Proceedings of the 39th International Conference on Machine Learning},\n pages = \t {17473--17498},\n year = \t {2022},\n editor = \t {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},\n volume = \t {162},\n series = \t {Proceedings of Machine Learning Research},\n month = \t {17--23 Jul},\n publisher = {PMLR},\n pdf = \t {https://proceedings.mlr.press/v162/parker-holder22a/parker-holder22a.pdf},\n url = \t {https://proceedings.mlr.press/v162/parker-holder22a.html},\n abstract = \t {Training generally-capable agents with reinforcement learning (RL) remains a significant challenge. A promising avenue for improving the robustness of RL agents is through the use of curricula. One such class of methods frames environment design as a game between a student and a teacher, using regret-based objectives to produce environment instantiations (or levels) at the frontier of the student agent’s capabilities. These methods benefit from theoretical robustness guarantees at equilibrium, yet they often struggle to find effective levels in challenging design spaces in practice. By contrast, evolutionary approaches incrementally alter environment complexity, resulting in potentially open-ended learning, but often rely on domain-specific heuristics and vast amounts of computational resources. This work proposes harnessing the power of evolution in a principled, regret-based curriculum. Our approach, which we call Adversarially Compounding Complexity by Editing Levels (ACCEL), seeks to constantly produce levels at the frontier of an agent’s capabilities, resulting in curricula that start simple but become increasingly complex. ACCEL maintains the theoretical benefits of prior regret-based methods, while providing significant empirical gains in a diverse set of environments. An interactive version of this paper is available at https://accelagent.github.io.}\n}\n\n@article{agarwal2025cosmos,\n title = {Cosmos World Foundation Model Platform for Physical AI},\n author = {Agarwal, Niket and others},\n journal = {arXiv preprint arXiv:2501.03575},\n year = {2025}\n}\n\n@article{bellemare2013arcade,\n title = {The arcade learning environment: An evaluation platform for general agents},\n author = {Bellemare, Marc G and others},\n journal = {Journal of artificial intelligence research},\n volume = {47},\n pages = {253--279},\n year = {2013}\n}\n\n@article{nichol2018retro,\n title={Gotta Learn Fast: A New Benchmark for Generalization in RL},\n author={Nichol, Alex and Pfau, Vicki and Hesse, Christopher and Klimov, Oleg and Schulman, John},\n journal={arXiv preprint arXiv:1804.03720},\n year={2018}\n}\n\n@inproceedings{matthews2024craftax,\n author={Michael Matthews and Michael Beukman and Benjamin Ellis and Mikayel Samvelyan and Matthew Jackson and Samuel Coward and Jakob Foerster},\n title = {Craftax: A Lightning-Fast Benchmark for Open-Ended Reinforcement Learning},\n booktitle = {International Conference on Machine Learning ({ICML})},\n year = {2024}\n}\n\n@inproceedings{NEURIPS2022_9c7008af,\n author = {Baker, Bowen and Akkaya, Ilge and Zhokov, Peter and Huizinga, Joost and Tang, Jie and Ecoffet, Adrien and Houghton, Brandon and Sampedro, Raul and Clune, Jeff},\n booktitle = {Advances in Neural Information Processing Systems},\n editor = {S. Koyejo and S. Mohamed and A. Agarwal and D. Belgrave and K. Cho and A. Oh},\n pages = {24639--24654},\n publisher = {Curran Associates, Inc.},\n title = {Video PreTraining (VPT): Learning to Act by Watching Unlabeled Online Videos},\n url = {https://proceedings.neurips.cc/paper_files/paper/2022/file/9c7008aff45b5d8f0973b23e1a22ada0-Paper-Conference.pdf},\n volume = {35},\n year = {2022}\n}\n\n@inproceedings{osband2020bsuite,\n title={Behaviour Suite for Reinforcement Learning},\n author={Osband, Ian and\n Doron, Yotam and\n Hessel, Matteo and\n Aslanides, John and\n Sezener, Eren and\n Saraiva, Andre and\n McKinney, Katrina and\n Lattimore, Tor and\n {Sz}epesv{\'a}ri, Csaba and\n Singh, Satinder and\n Van Roy, Benjamin and\n Sutton, Richard and\n Silver, David and\n van Hasselt, Hado},\n booktitle={International Conference on Learning Representations},\n year={2020},\n url={https://openreview.net/forum?id=rygf-kSYwH}\n}\n\n@article{nguyen2025crowd-sourcing,\n author = {Nguyen, Alfred and Mahajan, Mihir and Srambical, Franz},\n title = {Crowd-Sourcing A Dataset To Make Agents Code Like Humans},\n journal = {p(doom) blog},\n year = {2025},\n note = {https://pdoom.org/blog.html}\n}",bibtex,tab
+3440,10826367,"examples/bibliography.bib",990,0,"",bibtex,selection_mouse
+3441,10827314,"examples/bibliography.bib",27113,0,"",bibtex,selection_command
+3442,10861963,"examples/bibliography.bib",27114,0,"\n",bibtex,content
+3443,10862180,"examples/bibliography.bib",27115,0,"\n",bibtex,content
+3444,10862477,"examples/bibliography.bib",27116,0,"@misc{LuongLockhart2025GeminiIMO,\n author = {Thang Luong and Edward Lockhart},\n title = {Advanced version of Gemini with Deep Think officially achieves gold-medal standard at the International Mathematical Olympiad},\n url = {https://deepmind.google/discover/blog/advanced-version-of-gemini-with-deep-think-officially-achieves-gold-medal-standard-at-the-international-mathematical-olympiad/},\n year = {2025},\n}\n\n@misc{LinCheng2025GeminiICPC,\n author = {Hanzhao (Maggie) Lin and Heng-Tze Cheng},\n title = {Gemini achieves gold-level performance at the International Collegiate Programming Contest World Finals},\n url = {https://deepmind.google/discover/blog/gemini-achieves-gold-level-performance-at-the-international-collegiate-programming-contest-world-finals/},\n year = {2025},\n}\n",bibtex,content
+3445,10863258,"examples/bibliography.bib",27945,0,"",bibtex,selection_command
+3446,10863507,"examples/bibliography.bib",27920,0,"",bibtex,selection_command
+3447,10863542,"examples/bibliography.bib",27766,0,"",bibtex,selection_command
+3448,10863569,"examples/bibliography.bib",27642,0,"",bibtex,selection_command
+3449,10863601,"examples/bibliography.bib",27582,0,"",bibtex,selection_command
+3450,10863634,"examples/bibliography.bib",27552,0,"",bibtex,selection_command
+3451,10863668,"examples/bibliography.bib",27551,0,"",bibtex,selection_command
+3452,10863701,"examples/bibliography.bib",27549,0,"",bibtex,selection_command
+3453,10863821,"examples/bibliography.bib",27524,0,"",bibtex,selection_command
+3454,10863968,"examples/bibliography.bib",27348,0,"",bibtex,selection_command
+3455,10864108,"examples/bibliography.bib",27202,0,"",bibtex,selection_command
+3456,10864260,"examples/bibliography.bib",27150,0,"",bibtex,selection_command
+3457,10864676,"examples/bibliography.bib",27202,0,"",bibtex,selection_command
+3458,10864860,"examples/bibliography.bib",27150,0,"",bibtex,selection_command
+3459,10865029,"examples/bibliography.bib",27116,0,"",bibtex,selection_command
+3460,10866649,"examples/agi_cast.html",0,0,"",html,tab
+3461,10868496,"examples/agi_cast.html",2622,0,",LuongLockhart2025GeminiIMO,LinCheng2025GeminiICPC",html,content
+3462,10868499,"examples/agi_cast.html",2672,0,"",html,selection_command
+3463,10868908,"examples/agi_cast.html",2650,0,"",html,selection_command
+3464,10869166,"examples/agi_cast.html",2649,0,"",html,selection_command
+3465,10869197,"examples/agi_cast.html",2623,0,"",html,selection_command
+3466,10869224,"examples/agi_cast.html",2622,0,"",html,selection_command
+3467,10869258,"examples/agi_cast.html",2607,0,"",html,selection_command
+3468,10869291,"examples/agi_cast.html",2606,0,"",html,selection_command
+3469,10869626,"examples/agi_cast.html",2607,0,"",html,selection_command
+3470,10869827,"examples/agi_cast.html",2622,0,"",html,selection_command
+3471,10872936,"examples/bibliography.bib",0,0,"",bibtex,tab
+3472,10875035,"examples/bibliography.bib",27365,0,"",bibtex,selection_command
+3473,10875255,"examples/bibliography.bib",0,0,"",bibtex,selection_command
+3474,10875436,"examples/bibliography.bib",5222,0,"",bibtex,selection_command
+3475,10876818,"examples/bibliography.bib",5319,0,"",bibtex,selection_command
+3476,10877283,"examples/bibliography.bib",5375,0,"",bibtex,selection_command
+3477,10877684,"examples/bibliography.bib",5436,0,"",bibtex,selection_command
+3478,10878042,"examples/bibliography.bib",11345,0,"",bibtex,selection_command
+3479,10879208,"examples/bibliography.bib",11422,0,"",bibtex,selection_command
+3480,10879520,"examples/bibliography.bib",12426,0,"",bibtex,selection_command
+3481,10881319,"examples/bibliography.bib",17301,0,"",bibtex,selection_command
+3482,10882220,"examples/bibliography.bib",17822,0,"",bibtex,selection_command
+3483,10884290,"examples/bibliography.bib",17812,0,"",bibtex,selection_command
+3484,10884536,"examples/bibliography.bib",17812,1,"o",bibtex,selection_command
+3485,10884727,"examples/bibliography.bib",17812,14,"openai2025imo,",bibtex,selection_command
+3486,10885494,"examples/bibliography.bib",17812,13,"openai2025imo",bibtex,selection_command
+3487,10887541,"examples/bibliography.bib",17824,0,"",bibtex,selection_command
+3488,10887836,"examples/bibliography.bib",17845,0,"",bibtex,selection_command
+3489,10887942,"examples/bibliography.bib",17987,0,"",bibtex,selection_command
+3490,10888081,"examples/bibliography.bib",18005,0,"",bibtex,selection_command
+3491,10888227,"examples/bibliography.bib",18025,0,"",bibtex,selection_command
+3492,10888380,"examples/bibliography.bib",18084,0,"",bibtex,selection_command
+3493,10891775,"examples/bibliography.bib",18025,0,"",bibtex,selection_command
+3494,10892023,"examples/bibliography.bib",18005,0,"",bibtex,selection_command
+3495,10892051,"examples/bibliography.bib",17987,0,"",bibtex,selection_command
+3496,10892207,"examples/bibliography.bib",17845,0,"",bibtex,selection_command
+3497,10892356,"examples/bibliography.bib",17824,0,"",bibtex,selection_command
+3498,10893260,"examples/bibliography.bib",27947,0,"",bibtex,selection_command
+3499,10896374,"examples/bibliography.bib",17824,0,"",bibtex,selection_command
+3500,10897041,"examples/bibliography.bib",17812,0,"",bibtex,selection_command
+3501,10897362,"examples/bibliography.bib",17812,1,"o",bibtex,selection_command
+3502,10897586,"examples/bibliography.bib",17812,13,"openai2025imo",bibtex,selection_command
+3503,10898404,"examples/bibliography.bib",17824,0,"",bibtex,selection_command
+3504,10898875,"examples/bibliography.bib",27947,0,"",bibtex,selection_command
+3505,10900473,"examples/agi_cast.html",0,0,"",html,tab
+3506,10901002,"examples/agi_cast.html",2607,0,"",html,selection_command
+3507,10901162,"examples/agi_cast.html",2606,0,"",html,selection_command
+3508,10902568,"examples/agi_cast.html",2622,0,"r1",html,content
+3509,10902568,"examples/agi_cast.html",2619,3,"",html,content
+3510,10902568,"examples/agi_cast.html",2613,2,"",html,content
+3511,10902568,"examples/agi_cast.html",2612,0,"seeka",html,content
+3512,10902568,"examples/agi_cast.html",2611,1,"",html,content
+3513,10903913,"examples/agi_cast.html",2606,1,",",html,selection_command
+3514,10904110,"examples/agi_cast.html",2606,2,",d",html,selection_command
+3515,10904411,"examples/agi_cast.html",2606,18,",deepseekai2025r1,",html,selection_command
+3516,10904769,"examples/agi_cast.html",2606,17,",deepseekai2025r1",html,selection_command
+3517,10904945,"examples/agi_cast.html",2606,17,"",html,content
+3518,10908416,"examples/agi_cast.html",2607,0,"",html,selection_command
+3519,10908666,"examples/agi_cast.html",2633,0,"",html,selection_command
+3520,10908693,"examples/agi_cast.html",2634,0,"",html,selection_command
+3521,10908726,"examples/agi_cast.html",2656,0,"",html,selection_command
+3522,10908759,"examples/agi_cast.html",2660,0,"",html,selection_command
+3523,10908793,"examples/agi_cast.html",2661,0,"",html,selection_command
+3524,10908825,"examples/agi_cast.html",2662,0,"",html,selection_command
+3525,10908865,"examples/agi_cast.html",2666,0,"",html,selection_command
+3526,10908902,"examples/agi_cast.html",2669,0,"",html,selection_command
+3527,10908927,"examples/agi_cast.html",2673,0,"",html,selection_command
+3528,10908958,"examples/agi_cast.html",2678,0,"",html,selection_command
+3529,10908991,"examples/agi_cast.html",2681,0,"",html,selection_command
+3530,10909025,"examples/agi_cast.html",2694,0,"",html,selection_command
+3531,10909058,"examples/agi_cast.html",2708,0,"",html,selection_command
+3532,10909091,"examples/agi_cast.html",2717,0,"",html,selection_command
+3533,10909358,"examples/agi_cast.html",2720,0,"",html,selection_command
+3534,10909776,"examples/agi_cast.html",2761,0,"",html,selection_command
+3535,10910694,"examples/agi_cast.html",2744,0,"",html,selection_command
+3536,10911142,"examples/agi_cast.html",2736,0,"",html,selection_command
+3537,10911410,"examples/agi_cast.html",2728,0,"",html,selection_command
+3538,10911557,"examples/agi_cast.html",2722,0,"",html,selection_command
+3539,10911713,"examples/agi_cast.html",2717,0,"",html,selection_command
+3540,10911983,"examples/agi_cast.html",2720,0,"",html,selection_command
+3541,10912177,"examples/agi_cast.html",2721,0,"",html,selection_command
+3542,10912311,"examples/agi_cast.html",2721,0,"\n ",html,content
+3543,10912577,"examples/agi_cast.html",2727,0,"",html,selection_command
+3544,10913001,"examples/agi_cast.html",2728,0,"",html,selection_command
+3545,10913087,"examples/agi_cast.html",2728,1,"",html,content
+3546,10914172,"examples/agi_cast.html",2722,0,"",html,selection_command
+3547,10915103,"examples/agi_cast.html",2574,0,"",html,selection_command
+3548,10915302,"examples/agi_cast.html",2433,0,"",html,selection_command
+3549,10915911,"examples/agi_cast.html",2284,0,"",html,selection_command
+3550,10918206,"TERMINAL",0,0,"^C",,terminal_output
+3551,10918328,"TERMINAL",0,0,"[1G[0Kโ [1G[0K[1m[7m%[27m[1m[0m \r \r]633;D;0]633;P;Cwd=/Users/franzsrambical/Documents/pdoom/pdoom.org\r[0m[27m[24m[J]633;Afranzsrambical@MBF6N9WFVKFV pdoom.org % ]633;B[K[?2004h",,terminal_output
+3552,10918508,"TERMINAL",0,0,"npm run dev",,terminal_output
+3553,10918574,"TERMINAL",0,0,"[11Dcp examples/* dist",,terminal_output
+3554,10918854,"TERMINAL",0,0,"cp examples/* dist",,terminal_command
+3555,10918855,"TERMINAL",0,0,"[?2004l\r\r\n]633;E;cp examples/* dist;280736e1-4344-414b-b289-7b3558692a37",,terminal_output
+3556,10918900,"TERMINAL",0,0,"]633;C[1m[7m%[27m[1m[0m \r \r",,terminal_output
+3557,10919448,"TERMINAL",0,0,"npm run dev",,terminal_command
+3558,10919500,"TERMINAL",0,0,"]633;C",,terminal_output
+3559,10919660,"TERMINAL",0,0,"\r\n> distill-template@2.8.0 dev\r\n> rollup -c rollup.config.dev.js -w\r\n\r\n[1G[0K",,terminal_output
+3560,10920017,"TERMINAL",0,0,"c[4mrollup v2.7.3[24m\r\n[36mbundles [1msrc/components.js[22m โ [1mdist/template.v2.js[22m...[39m\r\n",,terminal_output
+3561,10920493,"TERMINAL",0,0,"[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/dist\r\n[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/examples\r\n[1m[33m(!) Circular dependencies[39m[22m\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/array.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/object.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-selection/src/selection/index.js -> node_modules/d3-selection/src/selection/select.js -> node_modules/d3-selection/src/selection/index.js\r\n...and 7 more\r\n[32mcreated [1mdist/template.v2.js[22m in [1m479ms[22m[39m\r\n[36mbundles [1msrc/transforms.js[22m โ [1mdist/transforms.v2.js[22m...[39m\r\n",,terminal_output
+3562,10920925,"TERMINAL",0,0,"[32mcreated [1mdist/transforms.v2.js[22m in [1m429ms[22m[39m\r\n\r\n[2025-11-02 16:01:30] waiting for changes...\r\n",,terminal_output
+3563,10954832,"examples/agi_cast.html",2433,0,"",html,selection_command
+3564,10955141,"examples/agi_cast.html",2284,0,"",html,selection_command
+3565,10955286,"examples/agi_cast.html",2433,0,"",html,selection_command
+3566,10955475,"examples/agi_cast.html",2574,0,"",html,selection_command
+3567,10955583,"examples/agi_cast.html",2722,0,"",html,selection_command
+3568,10956172,"examples/agi_cast.html",2762,0,"",html,selection_command
+3569,10956372,"examples/agi_cast.html",2762,0,".",html,content
+3570,10956380,"examples/agi_cast.html",2763,0,"",html,selection_keyboard
+3571,10956502,"examples/agi_cast.html",2762,0,"",html,selection_command
+3572,11037755,"examples/agi_cast.html",2614,0,"",html,selection_command
+3573,11038002,"examples/agi_cast.html",2473,0,"",html,selection_command
+3574,11038034,"examples/agi_cast.html",2324,0,"",html,selection_command
+3575,11038065,"examples/agi_cast.html",2282,0,"",html,selection_command
+3576,11038100,"examples/agi_cast.html",2272,0,"",html,selection_command
+3577,11038135,"examples/agi_cast.html",2198,0,"",html,selection_command
+3578,11038173,"examples/agi_cast.html",2026,0,"",html,selection_command
+3579,11038200,"examples/agi_cast.html",1941,0,"",html,selection_command
+3580,11038244,"examples/agi_cast.html",1821,0,"",html,selection_command
+3581,11038267,"examples/agi_cast.html",1779,0,"",html,selection_command
+3582,11038300,"examples/agi_cast.html",1765,0,"",html,selection_command
+3583,11038334,"examples/agi_cast.html",1741,0,"",html,selection_command
+3584,11038366,"examples/agi_cast.html",1728,0,"",html,selection_command
+3585,11038498,"examples/agi_cast.html",1637,0,"",html,selection_command
+3586,11039199,"examples/agi_cast.html",1597,123," We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of AGI research.",html,selection_command
+3587,11039995,"examples/agi_cast.html",1637,0,"",html,selection_command
+3588,11624307,"examples/agi_cast.html",1728,0,"",html,selection_command
+3589,11624546,"examples/agi_cast.html",1741,0,"",html,selection_command
+3590,11624579,"examples/agi_cast.html",1765,0,"",html,selection_command
+3591,11624610,"examples/agi_cast.html",1779,0,"",html,selection_command
+3592,11624643,"examples/agi_cast.html",1821,0,"",html,selection_command
+3593,11624743,"examples/agi_cast.html",1941,0,"",html,selection_command
+3594,11624911,"examples/agi_cast.html",2026,0,"",html,selection_command
+3595,11625070,"examples/agi_cast.html",2198,0,"",html,selection_command
+3596,11625200,"examples/agi_cast.html",2272,0,"",html,selection_command
+3597,11625360,"examples/agi_cast.html",2282,0,"",html,selection_command
+3598,11625526,"examples/agi_cast.html",2324,0,"",html,selection_command
+3599,11626172,"examples/agi_cast.html",2284,149,"",html,content
+3600,11626208,"examples/agi_cast.html",2290,0,"",html,selection_command
+3601,11626579,"examples/agi_cast.html",2431,0,"",html,selection_command
+3602,11626728,"examples/agi_cast.html",2579,0,"",html,selection_command
+3603,11627353,"examples/agi_cast.html",2614,0,"\n ",html,content
+3604,11627867,"examples/agi_cast.html",2621,0,"\n ",html,content
+3605,11627867,"examples/agi_cast.html",2615,6,"",html,content
+3606,11628078,"examples/agi_cast.html",2616,6,"",html,content
+3607,11628182,"examples/agi_cast.html",2616,0,"\n A longstanding goal of AGI research is automating the process of conducting research itself. Internet-scale pre-training, preference modeling,",html,content
+3608,11628187,"examples/agi_cast.html",2623,0,"",html,selection_command
+3609,11628762,"examples/agi_cast.html",2616,0,"",html,selection_command
+3610,11628888,"examples/agi_cast.html",2615,0,"",html,selection_command
+3611,11629298,"examples/agi_cast.html",2579,0,"",html,selection_command
+3612,11669244,"examples/agi_cast.html",2615,0,"",html,selection_command
+3613,11669306,"examples/agi_cast.html",2616,0,"",html,selection_command
+3614,11669628,"examples/agi_cast.html",2623,0,"",html,selection_command
+3615,11670023,"examples/agi_cast.html",2625,0,"",html,selection_command
+3616,11670275,"examples/agi_cast.html",2638,0,"",html,selection_command
+3617,11670305,"examples/agi_cast.html",2643,0,"",html,selection_command
+3618,11670335,"examples/agi_cast.html",2646,0,"",html,selection_command
+3619,11670369,"examples/agi_cast.html",2650,0,"",html,selection_command
+3620,11670401,"examples/agi_cast.html",2659,0,"",html,selection_command
+3621,11670436,"examples/agi_cast.html",2662,0,"",html,selection_command
+3622,11670468,"examples/agi_cast.html",2673,0,"",html,selection_command
+3623,11670505,"examples/agi_cast.html",2677,0,"",html,selection_command
+3624,11670539,"examples/agi_cast.html",2685,0,"",html,selection_command
+3625,11670579,"examples/agi_cast.html",2688,0,"",html,selection_command
+3626,11670605,"examples/agi_cast.html",2699,0,"",html,selection_command
+3627,11670639,"examples/agi_cast.html",2708,0,"",html,selection_command
+3628,11670675,"examples/agi_cast.html",2714,0,"",html,selection_command
+3629,11670708,"examples/agi_cast.html",2716,0,"",html,selection_command
+3630,11670740,"examples/agi_cast.html",2724,0,"",html,selection_command
+3631,11671040,"examples/agi_cast.html",2723,0,"",html,selection_command
+3632,11671177,"examples/agi_cast.html",2722,0,"",html,selection_command
+3633,11671318,"examples/agi_cast.html",2721,0,"",html,selection_command
+3634,11671435,"examples/agi_cast.html",2720,0,"",html,selection_command
+3635,11671570,"examples/agi_cast.html",2719,0,"",html,selection_command
+3636,11671705,"examples/agi_cast.html",2718,0,"",html,selection_command
+3637,11671839,"examples/agi_cast.html",2717,0,"",html,selection_command
+3638,11672004,"examples/agi_cast.html",2716,0,"",html,selection_command
+3639,11672500,"examples/agi_cast.html",2716,49,"",html,content
+3640,11672507,"examples/agi_cast.html",2715,0,"",html,selection_command
+3641,11672829,"examples/agi_cast.html",2616,0,"",html,selection_command
+3642,11672860,"examples/agi_cast.html",2615,0,"",html,selection_command
+3643,11673008,"examples/agi_cast.html",2613,0,"",html,selection_command
+3644,11673142,"examples/agi_cast.html",2524,0,"",html,selection_command
+3645,11673401,"examples/agi_cast.html",2383,0,"",html,selection_command
+3646,11673939,"examples/agi_cast.html",2284,0,"",html,selection_command
+3647,11675399,"examples/agi_cast.html",2290,0,"",html,selection_command
+3648,11675930,"examples/agi_cast.html",2289,0,"",html,selection_command
+3649,11676525,"examples/agi_cast.html",2290,0,"",html,selection_command
+3650,11676719,"examples/agi_cast.html",2290,0,"Internet-scale pre-training, preference modeling,",html,content
+3651,11676731,"examples/agi_cast.html",2338,0,"",html,selection_command
+3652,11677731,"examples/agi_cast.html",2339,0,"",html,selection_command
+3653,11677806,"examples/agi_cast.html",2339,0," ",html,content
+3654,11677808,"examples/agi_cast.html",2340,0,"",html,selection_keyboard
+3655,11678217,"examples/agi_cast.html",2339,0,"",html,selection_command
+3656,11709714,"examples/agi_cast.html",2530,0,"",html,selection_command
+3657,11709778,"examples/agi_cast.html",2663,0,"",html,selection_command
+3658,11711725,"examples/agi_cast.html",2664,0,"",html,selection_command
+3659,11712273,"examples/agi_cast.html",2663,0,"",html,selection_command
+3660,11712917,"examples/agi_cast.html",2664,0,"",html,selection_command
+3661,11713153,"examples/agi_cast.html",2664,0," ",html,content
+3662,11713158,"examples/agi_cast.html",2665,0,"",html,selection_keyboard
+3663,11713443,"examples/agi_cast.html",2665,0,"J",html,content
+3664,11713444,"examples/agi_cast.html",2666,0,"",html,selection_keyboard
+3665,11713568,"examples/agi_cast.html",2666,0,"u",html,content
+3666,11713570,"examples/agi_cast.html",2667,0,"",html,selection_keyboard
+3667,11713857,"examples/agi_cast.html",2667,0,"s",html,content
+3668,11713860,"examples/agi_cast.html",2668,0,"",html,selection_keyboard
+3669,11713875,"examples/agi_cast.html",2668,0,"t",html,content
+3670,11713877,"examples/agi_cast.html",2669,0,"",html,selection_keyboard
+3671,11713945,"examples/agi_cast.html",2669,0," ",html,content
+3672,11713947,"examples/agi_cast.html",2670,0,"",html,selection_keyboard
+3673,11714111,"examples/agi_cast.html",2670,0,"l",html,content
+3674,11714113,"examples/agi_cast.html",2671,0,"",html,selection_keyboard
+3675,11714121,"examples/agi_cast.html",2671,0,"i",html,content
+3676,11714122,"examples/agi_cast.html",2672,0,"",html,selection_keyboard
+3677,11714312,"examples/agi_cast.html",2672,0,"k",html,content
+3678,11714314,"examples/agi_cast.html",2673,0,"",html,selection_keyboard
+3679,11714322,"examples/agi_cast.html",2673,0,"e",html,content
+3680,11714324,"examples/agi_cast.html",2674,0,"",html,selection_keyboard
+3681,11714470,"examples/agi_cast.html",2674,0," ",html,content
+3682,11714471,"examples/agi_cast.html",2675,0,"",html,selection_keyboard
+3683,11714953,"examples/agi_cast.html",2674,0,"",html,selection_command
+3684,11732253,"examples/agi_cast.html",2670,0,"",html,selection_command
+3685,11732439,"examples/agi_cast.html",2665,0,"",html,selection_command
+3686,11733779,"examples/agi_cast.html",2665,10,"",html,content
+3687,11734028,"examples/agi_cast.html",2664,0,"",html,selection_command
+3688,11748368,"examples/agi_cast.html",2665,0,"",html,selection_command
+3689,11749317,"examples/agi_cast.html",2665,0,"A",html,content
+3690,11749321,"examples/agi_cast.html",2666,0,"",html,selection_keyboard
+3691,11750215,"examples/agi_cast.html",2665,1,"",html,content
+3692,11751332,"examples/agi_cast.html",2665,0,"G",html,content
+3693,11751335,"examples/agi_cast.html",2666,0,"",html,selection_keyboard
+3694,11751584,"examples/agi_cast.html",2666,0,"e",html,content
+3695,11751585,"examples/agi_cast.html",2667,0,"",html,selection_keyboard
+3696,11751659,"examples/agi_cast.html",2667,0,"t",html,content
+3697,11751661,"examples/agi_cast.html",2668,0,"",html,selection_keyboard
+3698,11751794,"examples/agi_cast.html",2668,0,"t",html,content
+3699,11751796,"examples/agi_cast.html",2669,0,"",html,selection_keyboard
+3700,11752728,"examples/agi_cast.html",2669,0,"i",html,content
+3701,11752733,"examples/agi_cast.html",2670,0,"",html,selection_keyboard
+3702,11752785,"examples/agi_cast.html",2670,0,"n",html,content
+3703,11752788,"examples/agi_cast.html",2671,0,"",html,selection_keyboard
+3704,11752846,"examples/agi_cast.html",2671,0,"g",html,content
+3705,11752849,"examples/agi_cast.html",2672,0,"",html,selection_keyboard
+3706,11753044,"examples/agi_cast.html",2672,0," ",html,content
+3707,11753046,"examples/agi_cast.html",2673,0,"",html,selection_keyboard
+3708,11753463,"examples/agi_cast.html",2673,0,"m",html,content
+3709,11753465,"examples/agi_cast.html",2674,0,"",html,selection_keyboard
+3710,11753533,"examples/agi_cast.html",2674,0,"o",html,content
+3711,11753536,"examples/agi_cast.html",2675,0,"",html,selection_keyboard
+3712,11753670,"examples/agi_cast.html",2675,0,"d",html,content
+3713,11753672,"examples/agi_cast.html",2676,0,"",html,selection_keyboard
+3714,11753735,"examples/agi_cast.html",2676,0,"e",html,content
+3715,11753738,"examples/agi_cast.html",2677,0,"",html,selection_keyboard
+3716,11753977,"examples/agi_cast.html",2677,0,"l",html,content
+3717,11753978,"examples/agi_cast.html",2678,0,"",html,selection_keyboard
+3718,11753996,"examples/agi_cast.html",2678,0,"s",html,content
+3719,11753997,"examples/agi_cast.html",2679,0,"",html,selection_keyboard
+3720,11754072,"examples/agi_cast.html",2679,0," ",html,content
+3721,11754074,"examples/agi_cast.html",2680,0,"",html,selection_keyboard
+3722,11754085,"examples/agi_cast.html",2680,0,"t",html,content
+3723,11754086,"examples/agi_cast.html",2681,0,"",html,selection_keyboard
+3724,11754223,"examples/agi_cast.html",2681,0,"o",html,content
+3725,11754225,"examples/agi_cast.html",2682,0,"",html,selection_keyboard
+3726,11754376,"examples/agi_cast.html",2682,0," ",html,content
+3727,11754379,"examples/agi_cast.html",2683,0,"",html,selection_keyboard
+3728,11754639,"examples/agi_cast.html",2682,0,"",html,selection_command
+3729,11754863,"examples/agi_cast.html",2680,0,"",html,selection_command
+3730,11755029,"examples/agi_cast.html",2673,0,"",html,selection_command
+3731,11755172,"examples/agi_cast.html",2665,0,"",html,selection_command
+3732,11755615,"examples/agi_cast.html",2665,18,"",html,content
+3733,11755813,"examples/agi_cast.html",2664,0,"",html,selection_command
+3734,11756608,"examples/agi_cast.html",2665,0,"",html,selection_command
+3735,11757342,"examples/agi_cast.html",2664,0,"",html,selection_command
+3736,11762830,"examples/agi_cast.html",2665,0,"",html,selection_command
+3737,11763680,"examples/agi_cast.html",2665,0,"r",html,content
+3738,11763682,"examples/agi_cast.html",2666,0,"",html,selection_keyboard
+3739,11763762,"examples/agi_cast.html",2666,0,"q",html,content
+3740,11763766,"examples/agi_cast.html",2667,0,"",html,selection_keyboard
+3741,11764338,"examples/agi_cast.html",2666,1,"",html,content
+3742,11764356,"examples/agi_cast.html",2666,0,"e",html,content
+3743,11764357,"examples/agi_cast.html",2667,0,"",html,selection_keyboard
+3744,11764438,"examples/agi_cast.html",2667,0,"q",html,content
+3745,11764440,"examples/agi_cast.html",2668,0,"",html,selection_keyboard
+3746,11764638,"examples/agi_cast.html",2668,0,"u",html,content
+3747,11764642,"examples/agi_cast.html",2669,0,"",html,selection_keyboard
+3748,11764724,"examples/agi_cast.html",2669,0,"i",html,content
+3749,11764726,"examples/agi_cast.html",2670,0,"",html,selection_keyboard
+3750,11764818,"examples/agi_cast.html",2670,0,"r",html,content
+3751,11764822,"examples/agi_cast.html",2671,0,"",html,selection_keyboard
+3752,11764894,"examples/agi_cast.html",2671,0,"e",html,content
+3753,11764896,"examples/agi_cast.html",2672,0,"",html,selection_keyboard
+3754,11765129,"examples/agi_cast.html",2672,0,"s",html,content
+3755,11765132,"examples/agi_cast.html",2673,0,"",html,selection_keyboard
+3756,11765221,"examples/agi_cast.html",2673,0," ",html,content
+3757,11765224,"examples/agi_cast.html",2674,0,"",html,selection_keyboard
+3758,11765345,"examples/agi_cast.html",2674,0,"m",html,content
+3759,11765346,"examples/agi_cast.html",2675,0,"",html,selection_keyboard
+3760,11765412,"examples/agi_cast.html",2675,0,"o",html,content
+3761,11765415,"examples/agi_cast.html",2676,0,"",html,selection_keyboard
+3762,11765536,"examples/agi_cast.html",2676,0,"v",html,content
+3763,11765537,"examples/agi_cast.html",2677,0,"",html,selection_keyboard
+3764,11765685,"examples/agi_cast.html",2677,0,"i",html,content
+3765,11765686,"examples/agi_cast.html",2678,0,"",html,selection_keyboard
+3766,11765773,"examples/agi_cast.html",2678,0,"n",html,content
+3767,11765776,"examples/agi_cast.html",2679,0,"",html,selection_keyboard
+3768,11765834,"examples/agi_cast.html",2679,0,"g",html,content
+3769,11765835,"examples/agi_cast.html",2680,0,"",html,selection_keyboard
+3770,11766014,"examples/agi_cast.html",2680,0," ",html,content
+3771,11766017,"examples/agi_cast.html",2681,0,"",html,selection_keyboard
+3772,11766213,"examples/agi_cast.html",2681,0,"f",html,content
+3773,11766216,"examples/agi_cast.html",2682,0,"",html,selection_keyboard
+3774,11766284,"examples/agi_cast.html",2682,0,"r",html,content
+3775,11766286,"examples/agi_cast.html",2683,0,"",html,selection_keyboard
+3776,11766361,"examples/agi_cast.html",2683,0,"o",html,content
+3777,11766364,"examples/agi_cast.html",2684,0,"",html,selection_keyboard
+3778,11766418,"examples/agi_cast.html",2684,0,"m",html,content
+3779,11766420,"examples/agi_cast.html",2685,0,"",html,selection_keyboard
+3780,11766679,"examples/agi_cast.html",2684,0,"",html,selection_command
+3781,11767709,"examples/agi_cast.html",2685,0,"",html,selection_command
+3782,11768257,"examples/agi_cast.html",2685,0," ",html,content
+3783,11768261,"examples/agi_cast.html",2686,0,"",html,selection_keyboard
+3784,11768483,"examples/agi_cast.html",2686,0,"t",html,content
+3785,11768486,"examples/agi_cast.html",2687,0,"",html,selection_keyboard
+3786,11768562,"examples/agi_cast.html",2687,0,"e",html,content
+3787,11768565,"examples/agi_cast.html",2688,0,"",html,selection_keyboard
+3788,11768927,"examples/agi_cast.html",2688,0,"x",html,content
+3789,11768929,"examples/agi_cast.html",2689,0,"",html,selection_keyboard
+3790,11769286,"examples/agi_cast.html",2689,0,"t",html,content
+3791,11769289,"examples/agi_cast.html",2690,0,"",html,selection_keyboard
+3792,11769698,"examples/agi_cast.html",2689,0,"",html,selection_command
+3793,11770501,"examples/agi_cast.html",2686,0,"",html,selection_command
+3794,11780545,"examples/agi_cast.html",2686,0,"p",html,content
+3795,11780547,"examples/agi_cast.html",2687,0,"",html,selection_keyboard
+3796,11780611,"examples/agi_cast.html",2687,0,"r",html,content
+3797,11780613,"examples/agi_cast.html",2688,0,"",html,selection_keyboard
+3798,11780653,"examples/agi_cast.html",2688,0,"e",html,content
+3799,11780655,"examples/agi_cast.html",2689,0,"",html,selection_keyboard
+3800,11780945,"examples/agi_cast.html",2689,0,"d",html,content
+3801,11780947,"examples/agi_cast.html",2690,0,"",html,selection_keyboard
+3802,11780956,"examples/agi_cast.html",2690,0,"o",html,content
+3803,11780958,"examples/agi_cast.html",2691,0,"",html,selection_keyboard
+3804,11780985,"examples/agi_cast.html",2691,0,"m",html,content
+3805,11780986,"examples/agi_cast.html",2692,0,"",html,selection_keyboard
+3806,11781153,"examples/agi_cast.html",2692,0,"i",html,content
+3807,11781154,"examples/agi_cast.html",2693,0,"",html,selection_keyboard
+3808,11781229,"examples/agi_cast.html",2693,0,"n",html,content
+3809,11781230,"examples/agi_cast.html",2694,0,"",html,selection_keyboard
+3810,11781467,"examples/agi_cast.html",2694,0,"a",html,content
+3811,11781469,"examples/agi_cast.html",2695,0,"",html,selection_keyboard
+3812,11781478,"examples/agi_cast.html",2695,0,"n",html,content
+3813,11781480,"examples/agi_cast.html",2696,0,"",html,selection_keyboard
+3814,11781586,"examples/agi_cast.html",2696,0,"t",html,content
+3815,11781587,"examples/agi_cast.html",2697,0,"",html,selection_keyboard
+3816,11781804,"examples/agi_cast.html",2697,0,"l",html,content
+3817,11781806,"examples/agi_cast.html",2698,0,"",html,selection_keyboard
+3818,11781814,"examples/agi_cast.html",2698,0,"y",html,content
+3819,11781815,"examples/agi_cast.html",2699,0,"",html,selection_keyboard
+3820,11781978,"examples/agi_cast.html",2699,0," ",html,content
+3821,11781980,"examples/agi_cast.html",2700,0,"",html,selection_keyboard
+3822,11782902,"examples/agi_cast.html",2699,0,"",html,selection_command
+3823,11783190,"examples/agi_cast.html",2704,0,"",html,selection_command
+3824,11783537,"examples/agi_cast.html",2704,0,"_",html,content
+3825,11783539,"examples/agi_cast.html",2705,0,"",html,selection_keyboard
+3826,11783783,"examples/agi_cast.html",2705,0,"b",html,content
+3827,11783785,"examples/agi_cast.html",2706,0,"",html,selection_keyboard
+3828,11784128,"examples/agi_cast.html",2705,1,"",html,content
+3829,11784292,"examples/agi_cast.html",2704,1,"",html,content
+3830,11784620,"examples/agi_cast.html",2704,0,"-",html,content
+3831,11784621,"examples/agi_cast.html",2705,0,"",html,selection_keyboard
+3832,11784924,"examples/agi_cast.html",2705,0,"b",html,content
+3833,11784928,"examples/agi_cast.html",2706,0,"",html,selection_keyboard
+3834,11785297,"examples/agi_cast.html",2706,0,"a",html,content
+3835,11785300,"examples/agi_cast.html",2707,0,"",html,selection_keyboard
+3836,11785352,"examples/agi_cast.html",2707,0,"s",html,content
+3837,11785354,"examples/agi_cast.html",2708,0,"",html,selection_keyboard
+3838,11785562,"examples/agi_cast.html",2708,0,"e",html,content
+3839,11785565,"examples/agi_cast.html",2709,0,"",html,selection_keyboard
+3840,11785854,"examples/agi_cast.html",2709,0,"d",html,content
+3841,11785858,"examples/agi_cast.html",2710,0,"",html,selection_keyboard
+3842,11785888,"examples/agi_cast.html",2710,0," ",html,content
+3843,11785891,"examples/agi_cast.html",2711,0,"",html,selection_keyboard
+3844,11785978,"examples/agi_cast.html",2711,0,"b",html,content
+3845,11785980,"examples/agi_cast.html",2712,0,"",html,selection_keyboard
+3846,11786114,"examples/agi_cast.html",2712,0,"e",html,content
+3847,11786116,"examples/agi_cast.html",2713,0,"",html,selection_keyboard
+3848,11786207,"examples/agi_cast.html",2713,0,"h",html,content
+3849,11786208,"examples/agi_cast.html",2714,0,"",html,selection_keyboard
+3850,11786406,"examples/agi_cast.html",2714,0,"a",html,content
+3851,11786409,"examples/agi_cast.html",2715,0,"",html,selection_keyboard
+3852,11786422,"examples/agi_cast.html",2715,0,"b",html,content
+3853,11786424,"examples/agi_cast.html",2716,0,"",html,selection_keyboard
+3854,11786450,"examples/agi_cast.html",2716,0,"i",html,content
+3855,11786451,"examples/agi_cast.html",2717,0,"",html,selection_keyboard
+3856,11786919,"examples/agi_cast.html",2716,1,"",html,content
+3857,11787068,"examples/agi_cast.html",2715,1,"",html,content
+3858,11787080,"examples/agi_cast.html",2715,0,"v",html,content
+3859,11787081,"examples/agi_cast.html",2716,0,"",html,selection_keyboard
+3860,11787152,"examples/agi_cast.html",2716,0,"i",html,content
+3861,11787153,"examples/agi_cast.html",2717,0,"",html,selection_keyboard
+3862,11787243,"examples/agi_cast.html",2717,0,"o",html,content
+3863,11787245,"examples/agi_cast.html",2718,0,"",html,selection_keyboard
+3864,11787302,"examples/agi_cast.html",2718,0,"u",html,content
+3865,11787304,"examples/agi_cast.html",2719,0,"",html,selection_keyboard
+3866,11787404,"examples/agi_cast.html",2719,0,"r",html,content
+3867,11787406,"examples/agi_cast.html",2720,0,"",html,selection_keyboard
+3868,11787770,"examples/agi_cast.html",2720,0,"-",html,content
+3869,11787774,"examples/agi_cast.html",2721,0,"",html,selection_keyboard
+3870,11787937,"examples/agi_cast.html",2721,0,"c",html,content
+3871,11787941,"examples/agi_cast.html",2722,0,"",html,selection_keyboard
+3872,11788122,"examples/agi_cast.html",2722,0,"l",html,content
+3873,11788127,"examples/agi_cast.html",2723,0,"",html,selection_keyboard
+3874,11788219,"examples/agi_cast.html",2723,0,"o",html,content
+3875,11788222,"examples/agi_cast.html",2724,0,"",html,selection_keyboard
+3876,11788291,"examples/agi_cast.html",2724,0,"n",html,content
+3877,11788294,"examples/agi_cast.html",2725,0,"",html,selection_keyboard
+3878,11788436,"examples/agi_cast.html",2725,0,"i",html,content
+3879,11788439,"examples/agi_cast.html",2726,0,"",html,selection_keyboard
+3880,11788503,"examples/agi_cast.html",2726,0,"n",html,content
+3881,11788506,"examples/agi_cast.html",2727,0,"",html,selection_keyboard
+3882,11788560,"examples/agi_cast.html",2727,0,"g",html,content
+3883,11788563,"examples/agi_cast.html",2728,0,"",html,selection_keyboard
+3884,11788805,"examples/agi_cast.html",2728,0," ",html,content
+3885,11788809,"examples/agi_cast.html",2729,0,"",html,selection_keyboard
+3886,11789124,"examples/agi_cast.html",2729,0,"t",html,content
+3887,11789128,"examples/agi_cast.html",2730,0,"",html,selection_keyboard
+3888,11789255,"examples/agi_cast.html",2730,0,"o",html,content
+3889,11789257,"examples/agi_cast.html",2731,0,"",html,selection_keyboard
+3890,11789377,"examples/agi_cast.html",2731,0,"w",html,content
+3891,11789379,"examples/agi_cast.html",2732,0,"",html,selection_keyboard
+3892,11789492,"examples/agi_cast.html",2732,0,"r",html,content
+3893,11789494,"examples/agi_cast.html",2733,0,"",html,selection_keyboard
+3894,11789863,"examples/agi_cast.html",2733,0,"a",html,content
+3895,11789865,"examples/agi_cast.html",2734,0,"",html,selection_keyboard
+3896,11790069,"examples/agi_cast.html",2733,1,"",html,content
+3897,11790217,"examples/agi_cast.html",2732,1,"",html,content
+3898,11790331,"examples/agi_cast.html",2732,0,"a",html,content
+3899,11790333,"examples/agi_cast.html",2733,0,"",html,selection_keyboard
+3900,11790344,"examples/agi_cast.html",2733,0,"r",html,content
+3901,11790346,"examples/agi_cast.html",2734,0,"",html,selection_keyboard
+3902,11790620,"examples/agi_cast.html",2734,0,"d",html,content
+3903,11790625,"examples/agi_cast.html",2735,0,"",html,selection_keyboard
+3904,11790648,"examples/agi_cast.html",2735,0,"s",html,content
+3905,11790651,"examples/agi_cast.html",2736,0,"",html,selection_keyboard
+3906,11790751,"examples/agi_cast.html",2736,0," ",html,content
+3907,11790754,"examples/agi_cast.html",2737,0,"",html,selection_keyboard
+3908,11791803,"examples/agi_cast.html",2736,0,"",html,selection_command
+3909,11792204,"examples/agi_cast.html",2737,0,"",html,selection_command
+3910,11792634,"examples/agi_cast.html",2737,0,"v",html,content
+3911,11792636,"examples/agi_cast.html",2738,0,"",html,selection_keyboard
+3912,11792746,"examples/agi_cast.html",2738,0,"i",html,content
+3913,11792749,"examples/agi_cast.html",2739,0,"",html,selection_keyboard
+3914,11792952,"examples/agi_cast.html",2739,0,"d",html,content
+3915,11792954,"examples/agi_cast.html",2740,0,"",html,selection_keyboard
+3916,11793029,"examples/agi_cast.html",2740,0,"e",html,content
+3917,11793031,"examples/agi_cast.html",2741,0,"",html,selection_keyboard
+3918,11793154,"examples/agi_cast.html",2741,0,"o",html,content
+3919,11793156,"examples/agi_cast.html",2742,0,"",html,selection_keyboard
+3920,11794227,"examples/agi_cast.html",2742,0,"_",html,content
+3921,11794230,"examples/agi_cast.html",2743,0,"",html,selection_keyboard
+3922,11794745,"examples/agi_cast.html",2742,1,"",html,content
+3923,11794970,"examples/agi_cast.html",2742,0,"-",html,content
+3924,11794971,"examples/agi_cast.html",2743,0,"",html,selection_keyboard
+3925,11795163,"examples/agi_cast.html",2743,0,"b",html,content
+3926,11795166,"examples/agi_cast.html",2744,0,"",html,selection_keyboard
+3927,11795454,"examples/agi_cast.html",2744,0,"a",html,content
+3928,11795458,"examples/agi_cast.html",2745,0,"",html,selection_keyboard
+3929,11795614,"examples/agi_cast.html",2745,0,"s",html,content
+3930,11795618,"examples/agi_cast.html",2746,0,"",html,selection_keyboard
+3931,11795631,"examples/agi_cast.html",2746,0,"e",html,content
+3932,11795635,"examples/agi_cast.html",2747,0,"",html,selection_keyboard
+3933,11796316,"examples/agi_cast.html",2747,0,"d behaviour-cloning.",html,content
+3934,11796493,"examples/agi_cast.html",2766,0,"",html,selection_command
+3935,11796647,"examples/agi_cast.html",2759,0,"",html,selection_command
+3936,11796887,"examples/agi_cast.html",2758,0,"",html,selection_command
+3937,11796917,"examples/agi_cast.html",2749,0,"",html,selection_command
+3938,11796949,"examples/agi_cast.html",2743,0,"",html,selection_command
+3939,11796984,"examples/agi_cast.html",2742,0,"",html,selection_command
+3940,11797017,"examples/agi_cast.html",2737,0,"",html,selection_command
+3941,11797051,"examples/agi_cast.html",2729,0,"",html,selection_command
+3942,11797083,"examples/agi_cast.html",2721,0,"",html,selection_command
+3943,11797235,"examples/agi_cast.html",2720,0,"",html,selection_command
+3944,11797537,"examples/agi_cast.html",2711,0,"",html,selection_command
+3945,11797871,"examples/agi_cast.html",2711,1,"b",html,selection_command
+3946,11797999,"examples/agi_cast.html",2711,10,"behaviour-",html,selection_command
+3947,11798214,"examples/agi_cast.html",2711,11,"behaviour-c",html,selection_command
+3948,11798638,"examples/agi_cast.html",2711,19,"behaviour-cloning t",html,selection_command
+3949,11799058,"examples/agi_cast.html",2711,18,"behaviour-cloning ",html,selection_command
+3950,11800177,"examples/agi_cast.html",2711,18,"",html,content
+3951,11807516,"examples/agi_cast.html",2705,0,"",html,selection_command
+3952,11807764,"examples/agi_cast.html",2704,0,"",html,selection_command
+3953,11807793,"examples/agi_cast.html",2700,0,"",html,selection_command
+3954,11807995,"examples/agi_cast.html",2686,0,"",html,selection_command
+3955,11808169,"examples/agi_cast.html",2681,0,"",html,selection_command
+3956,11808346,"examples/agi_cast.html",2674,0,"",html,selection_command
+3957,11808532,"examples/agi_cast.html",2665,0,"",html,selection_command
+3958,11808743,"examples/agi_cast.html",2663,0,"",html,selection_command
+3959,11809095,"examples/agi_cast.html",2665,0,"",html,selection_command
+3960,11815723,"examples/agi_cast.html",2665,0,"A",html,content
+3961,11815728,"examples/agi_cast.html",2666,0,"",html,selection_keyboard
+3962,11815960,"examples/agi_cast.html",2666,0," ",html,content
+3963,11815962,"examples/agi_cast.html",2667,0,"",html,selection_keyboard
+3964,11816017,"examples/agi_cast.html",2667,0,"n",html,content
+3965,11816020,"examples/agi_cast.html",2668,0,"",html,selection_keyboard
+3966,11816224,"examples/agi_cast.html",2668,0,"a",html,content
+3967,11816226,"examples/agi_cast.html",2669,0,"",html,selection_keyboard
+3968,11816235,"examples/agi_cast.html",2669,0,"t",html,content
+3969,11816238,"examples/agi_cast.html",2670,0,"",html,selection_keyboard
+3970,11816300,"examples/agi_cast.html",2670,0,"u",html,content
+3971,11816302,"examples/agi_cast.html",2671,0,"",html,selection_keyboard
+3972,11816404,"examples/agi_cast.html",2671,0,"r",html,content
+3973,11816405,"examples/agi_cast.html",2672,0,"",html,selection_keyboard
+3974,11816537,"examples/agi_cast.html",2672,0,"a",html,content
+3975,11816539,"examples/agi_cast.html",2673,0,"",html,selection_keyboard
+3976,11816630,"examples/agi_cast.html",2673,0,"l",html,content
+3977,11816631,"examples/agi_cast.html",2674,0,"",html,selection_keyboard
+3978,11816804,"examples/agi_cast.html",2674,0," ",html,content
+3979,11816807,"examples/agi_cast.html",2675,0,"",html,selection_keyboard
+3980,11816984,"examples/agi_cast.html",2674,0,"",html,selection_command
+3981,11850429,"examples/agi_cast.html",2674,0," ",html,content
+3982,11850433,"examples/agi_cast.html",2675,0,"",html,selection_keyboard
+3983,11850745,"examples/agi_cast.html",2675,0,"d",html,content
+3984,11850747,"examples/agi_cast.html",2676,0,"",html,selection_keyboard
+3985,11850845,"examples/agi_cast.html",2676,0,"a",html,content
+3986,11850846,"examples/agi_cast.html",2677,0,"",html,selection_keyboard
+3987,11850861,"examples/agi_cast.html",2677,0,"t",html,content
+3988,11850862,"examples/agi_cast.html",2678,0,"",html,selection_keyboard
+3989,11851118,"examples/agi_cast.html",2678,0,"a",html,content
+3990,11851120,"examples/agi_cast.html",2679,0,"",html,selection_keyboard
+3991,11851711,"examples/agi_cast.html",2679,0," ",html,content
+3992,11851713,"examples/agi_cast.html",2680,0,"",html,selection_keyboard
+3993,11852212,"examples/agi_cast.html",2680,0,"s",html,content
+3994,11852216,"examples/agi_cast.html",2681,0,"",html,selection_keyboard
+3995,11852337,"examples/agi_cast.html",2681,0,"o",html,content
+3996,11852340,"examples/agi_cast.html",2682,0,"",html,selection_keyboard
+3997,11852377,"examples/agi_cast.html",2682,0,"u",html,content
+3998,11852378,"examples/agi_cast.html",2683,0,"",html,selection_keyboard
+3999,11852483,"examples/agi_cast.html",2683,0,"r",html,content
+4000,11852484,"examples/agi_cast.html",2684,0,"",html,selection_keyboard
+4001,11852728,"examples/agi_cast.html",2684,0,"c",html,content
+4002,11852729,"examples/agi_cast.html",2685,0,"",html,selection_keyboard
+4003,11852889,"examples/agi_cast.html",2685,0,"e",html,content
+4004,11852893,"examples/agi_cast.html",2686,0,"",html,selection_keyboard
+4005,11853087,"examples/agi_cast.html",2686,0," ",html,content
+4006,11853091,"examples/agi_cast.html",2687,0,"",html,selection_keyboard
+4007,11853172,"examples/agi_cast.html",2687,0,"f",html,content
+4008,11853173,"examples/agi_cast.html",2688,0,"",html,selection_keyboard
+4009,11853234,"examples/agi_cast.html",2688,0,"o",html,content
+4010,11853236,"examples/agi_cast.html",2689,0,"",html,selection_keyboard
+4011,11853294,"examples/agi_cast.html",2689,0,"r",html,content
+4012,11853295,"examples/agi_cast.html",2690,0,"",html,selection_keyboard
+4013,11853459,"examples/agi_cast.html",2690,0," ",html,content
+4014,11853461,"examples/agi_cast.html",2691,0,"",html,selection_keyboard
+4015,11853661,"examples/agi_cast.html",2690,0,"",html,selection_command
+4016,11870986,"examples/agi_cast.html",2542,0,"",html,selection_command
+4017,11871100,"examples/agi_cast.html",2351,0,"",html,selection_command
+4018,11873970,"examples/agi_cast.html",2441,0,"",html,selection_command
+4019,11875186,"examples/agi_cast.html",2440,1,"",html,content
+4020,11875356,"examples/agi_cast.html",2440,0,"\n ",html,content
+4021,11875569,"examples/agi_cast.html",2446,0,"",html,selection_command
+4022,11876088,"examples/agi_cast.html",2486,0,"",html,selection_command
+4023,11876487,"examples/agi_cast.html",2634,0,"",html,selection_command
+4024,11876848,"examples/agi_cast.html",2486,0,"",html,selection_command
+4025,11877056,"examples/agi_cast.html",2634,0,"",html,selection_command
+4026,11877738,"examples/agi_cast.html",2635,0,"",html,selection_command
+4027,11877987,"examples/agi_cast.html",2641,0,"",html,selection_command
+4028,11878015,"examples/agi_cast.html",2649,0,"",html,selection_command
+4029,11878046,"examples/agi_cast.html",2657,0,"",html,selection_command
+4030,11878079,"examples/agi_cast.html",2669,0,"",html,selection_command
+4031,11878112,"examples/agi_cast.html",2671,0,"",html,selection_command
+4032,11878146,"examples/agi_cast.html",2673,0,"",html,selection_command
+4033,11878181,"examples/agi_cast.html",2681,0,"",html,selection_command
+4034,11878213,"examples/agi_cast.html",2686,0,"",html,selection_command
+4035,11878247,"examples/agi_cast.html",2693,0,"",html,selection_command
+4036,11878374,"examples/agi_cast.html",2698,0,"",html,selection_command
+4037,11879138,"examples/agi_cast.html",2697,0,"",html,selection_command
+4038,11879362,"examples/agi_cast.html",2696,0,"",html,selection_command
+4039,11911703,"examples/agi_cast.html",2696,1," ",html,selection_command
+4040,11911881,"examples/agi_cast.html",2693,4,"for ",html,selection_command
+4041,11912012,"examples/agi_cast.html",2686,11,"source for ",html,selection_command
+4042,11912209,"examples/agi_cast.html",2681,16,"data source for ",html,selection_command
+4043,11912466,"examples/agi_cast.html",2681,16,"",html,content
+4044,11913016,"examples/agi_cast.html",2681,0,"w",html,content
+4045,11913018,"examples/agi_cast.html",2682,0,"",html,selection_keyboard
+4046,11913178,"examples/agi_cast.html",2682,0,"a",html,content
+4047,11913180,"examples/agi_cast.html",2683,0,"",html,selection_keyboard
+4048,11913207,"examples/agi_cast.html",2683,0,"y",html,content
+4049,11913209,"examples/agi_cast.html",2684,0,"",html,selection_keyboard
+4050,11913486,"examples/agi_cast.html",2684,0," ",html,content
+4051,11913488,"examples/agi_cast.html",2685,0,"",html,selection_keyboard
+4052,11913496,"examples/agi_cast.html",2685,0,"t",html,content
+4053,11913498,"examples/agi_cast.html",2686,0,"",html,selection_keyboard
+4054,11913585,"examples/agi_cast.html",2686,0,"o",html,content
+4055,11913586,"examples/agi_cast.html",2687,0,"",html,selection_keyboard
+4056,11913762,"examples/agi_cast.html",2687,0," ",html,content
+4057,11913763,"examples/agi_cast.html",2688,0,"",html,selection_keyboard
+4058,11913804,"examples/agi_cast.html",2688,0,"e",html,content
+4059,11913805,"examples/agi_cast.html",2689,0,"",html,selection_keyboard
+4060,11914091,"examples/agi_cast.html",2689,0,"x",html,content
+4061,11914093,"examples/agi_cast.html",2690,0,"",html,selection_keyboard
+4062,11914154,"examples/agi_cast.html",2690,0,"t",html,content
+4063,11914156,"examples/agi_cast.html",2691,0,"",html,selection_keyboard
+4064,11914228,"examples/agi_cast.html",2691,0,"e",html,content
+4065,11914230,"examples/agi_cast.html",2692,0,"",html,selection_keyboard
+4066,11914347,"examples/agi_cast.html",2692,0,"n",html,content
+4067,11914349,"examples/agi_cast.html",2693,0,"",html,selection_keyboard
+4068,11914495,"examples/agi_cast.html",2693,0,"d",html,content
+4069,11914496,"examples/agi_cast.html",2694,0,"",html,selection_keyboard
+4070,11914638,"examples/agi_cast.html",2694,0," ",html,content
+4071,11914639,"examples/agi_cast.html",2695,0,"",html,selection_keyboard
+4072,11914652,"examples/agi_cast.html",2695,0,"t",html,content
+4073,11914653,"examples/agi_cast.html",2696,0,"",html,selection_keyboard
+4074,11914656,"examples/agi_cast.html",2696,0,"r",html,content
+4075,11914657,"examples/agi_cast.html",2697,0,"",html,selection_keyboard
+4076,11914757,"examples/agi_cast.html",2697,0,"h",html,content
+4077,11914759,"examples/agi_cast.html",2698,0,"",html,selection_keyboard
+4078,11914829,"examples/agi_cast.html",2698,0,"e",html,content
+4079,11914830,"examples/agi_cast.html",2699,0,"",html,selection_keyboard
+4080,11914993,"examples/agi_cast.html",2699,0," ",html,content
+4081,11914994,"examples/agi_cast.html",2700,0,"",html,selection_keyboard
+4082,11915309,"examples/agi_cast.html",2699,1,"",html,content
+4083,11915434,"examples/agi_cast.html",2698,1,"",html,content
+4084,11915578,"examples/agi_cast.html",2697,1,"",html,content
+4085,11915668,"examples/agi_cast.html",2696,1,"",html,content
+4086,11915745,"examples/agi_cast.html",2696,0,"h",html,content
+4087,11915747,"examples/agi_cast.html",2697,0,"",html,selection_keyboard
+4088,11915832,"examples/agi_cast.html",2697,0,"e",html,content
+4089,11915834,"examples/agi_cast.html",2698,0,"",html,selection_keyboard
+4090,11915969,"examples/agi_cast.html",2698,0," ",html,content
+4091,11915971,"examples/agi_cast.html",2699,0,"",html,selection_keyboard
+4092,11916011,"examples/agi_cast.html",2699,0,"c",html,content
+4093,11916012,"examples/agi_cast.html",2700,0,"",html,selection_keyboard
+4094,11916105,"examples/agi_cast.html",2700,0,"u",html,content
+4095,11916106,"examples/agi_cast.html",2701,0,"",html,selection_keyboard
+4096,11916220,"examples/agi_cast.html",2701,0,"r",html,content
+4097,11916221,"examples/agi_cast.html",2702,0,"",html,selection_keyboard
+4098,11916358,"examples/agi_cast.html",2702,0,"r",html,content
+4099,11916360,"examples/agi_cast.html",2703,0,"",html,selection_keyboard
+4100,11916387,"examples/agi_cast.html",2703,0,"e",html,content
+4101,11916388,"examples/agi_cast.html",2704,0,"",html,selection_keyboard
+4102,11916570,"examples/agi_cast.html",2704,0,"n",html,content
+4103,11916572,"examples/agi_cast.html",2705,0,"",html,selection_keyboard
+4104,11916587,"examples/agi_cast.html",2705,0,"t",html,content
+4105,11916588,"examples/agi_cast.html",2706,0,"",html,selection_keyboard
+4106,11916800,"examples/agi_cast.html",2706,0," ",html,content
+4107,11916804,"examples/agi_cast.html",2707,0,"",html,selection_keyboard
+4108,11916813,"examples/agi_cast.html",2707,0,"p",html,content
+4109,11916814,"examples/agi_cast.html",2708,0,"",html,selection_keyboard
+4110,11916960,"examples/agi_cast.html",2708,0,"a",html,content
+4111,11916961,"examples/agi_cast.html",2709,0,"",html,selection_keyboard
+4112,11916964,"examples/agi_cast.html",2709,0,"r",html,content
+4113,11916966,"examples/agi_cast.html",2710,0,"",html,selection_keyboard
+4114,11917156,"examples/agi_cast.html",2710,0,"a",html,content
+4115,11917157,"examples/agi_cast.html",2711,0,"",html,selection_keyboard
+4116,11917321,"examples/agi_cast.html",2711,0,"d",html,content
+4117,11917323,"examples/agi_cast.html",2712,0,"",html,selection_keyboard
+4118,11917624,"examples/agi_cast.html",2712,0,"i",html,content
+4119,11917629,"examples/agi_cast.html",2713,0,"",html,selection_keyboard
+4120,11917722,"examples/agi_cast.html",2713,0,"g",html,content
+4121,11917726,"examples/agi_cast.html",2714,0,"",html,selection_keyboard
+4122,11917814,"examples/agi_cast.html",2714,0,"m",html,content
+4123,11917818,"examples/agi_cast.html",2715,0,"",html,selection_keyboard
+4124,11918046,"examples/agi_cast.html",2715,0," ",html,content
+4125,11918049,"examples/agi_cast.html",2716,0,"",html,selection_keyboard
+4126,11918991,"examples/agi_cast.html",2716,0,"t",html,content
+4127,11918996,"examples/agi_cast.html",2717,0,"",html,selection_keyboard
+4128,11919072,"examples/agi_cast.html",2717,0,"o",html,content
+4129,11919075,"examples/agi_cast.html",2718,0,"",html,selection_keyboard
+4130,11919284,"examples/agi_cast.html",2718,0," ",html,content
+4131,11919286,"examples/agi_cast.html",2719,0,"",html,selection_keyboard
+4132,11919494,"examples/agi_cast.html",2718,0,"",html,selection_command
+4133,11926691,"examples/agi_cast.html",2719,0,"",html,selection_command
+4134,11927543,"examples/agi_cast.html",2719,0,"a",html,content
+4135,11927547,"examples/agi_cast.html",2720,0,"",html,selection_keyboard
+4136,11927569,"examples/agi_cast.html",2720,0,"u",html,content
+4137,11927574,"examples/agi_cast.html",2721,0,"",html,selection_keyboard
+4138,11927652,"examples/agi_cast.html",2721,0,"t",html,content
+4139,11927654,"examples/agi_cast.html",2722,0,"",html,selection_keyboard
+4140,11927759,"examples/agi_cast.html",2722,0,"o",html,content
+4141,11927760,"examples/agi_cast.html",2723,0,"",html,selection_keyboard
+4142,11927846,"examples/agi_cast.html",2723,0,"m",html,content
+4143,11927848,"examples/agi_cast.html",2724,0,"",html,selection_keyboard
+4144,11928047,"examples/agi_cast.html",2724,0,"a",html,content
+4145,11928049,"examples/agi_cast.html",2725,0,"",html,selection_keyboard
+4146,11928061,"examples/agi_cast.html",2725,0,"t",html,content
+4147,11928063,"examples/agi_cast.html",2726,0,"",html,selection_keyboard
+4148,11928125,"examples/agi_cast.html",2726,0,"i",html,content
+4149,11928127,"examples/agi_cast.html",2727,0,"",html,selection_keyboard
+4150,11928632,"examples/agi_cast.html",2727,0,"o",html,content
+4151,11928635,"examples/agi_cast.html",2728,0,"",html,selection_keyboard
+4152,11929198,"examples/agi_cast.html",2719,9,"",html,content
+4153,11929552,"examples/agi_cast.html",2719,0,"a",html,content
+4154,11929554,"examples/agi_cast.html",2720,0,"",html,selection_keyboard
+4155,11929586,"examples/agi_cast.html",2720,0,"u",html,content
+4156,11929588,"examples/agi_cast.html",2721,0,"",html,selection_keyboard
+4157,11929678,"examples/agi_cast.html",2721,0,"t",html,content
+4158,11929679,"examples/agi_cast.html",2722,0,"",html,selection_keyboard
+4159,11929877,"examples/agi_cast.html",2722,0,"o",html,content
+4160,11929878,"examples/agi_cast.html",2723,0,"",html,selection_keyboard
+4161,11930212,"examples/agi_cast.html",2723,0,"m",html,content
+4162,11930218,"examples/agi_cast.html",2724,0,"",html,selection_keyboard
+4163,11930465,"examples/agi_cast.html",2724,0,"a",html,content
+4164,11930471,"examples/agi_cast.html",2725,0,"",html,selection_keyboard
+4165,11930485,"examples/agi_cast.html",2725,0,"t",html,content
+4166,11930489,"examples/agi_cast.html",2726,0,"",html,selection_keyboard
+4167,11930531,"examples/agi_cast.html",2726,0,"i",html,content
+4168,11930532,"examples/agi_cast.html",2727,0,"",html,selection_keyboard
+4169,11930810,"examples/agi_cast.html",2727,0,"o",html,content
+4170,11930812,"examples/agi_cast.html",2728,0,"",html,selection_keyboard
+4171,11930924,"examples/agi_cast.html",2728,0,"n",html,content
+4172,11930926,"examples/agi_cast.html",2729,0,"",html,selection_keyboard
+4173,11931135,"examples/agi_cast.html",2729,0," ",html,content
+4174,11931137,"examples/agi_cast.html",2730,0,"",html,selection_keyboard
+4175,11931152,"examples/agi_cast.html",2730,0,"o",html,content
+4176,11931154,"examples/agi_cast.html",2731,0,"",html,selection_keyboard
+4177,11931424,"examples/agi_cast.html",2731,0,"f",html,content
+4178,11931425,"examples/agi_cast.html",2732,0,"",html,selection_keyboard
+4179,11931480,"examples/agi_cast.html",2732,0," ",html,content
+4180,11931481,"examples/agi_cast.html",2733,0,"",html,selection_keyboard
+4181,11931766,"examples/agi_cast.html",2733,0,"a",html,content
+4182,11931768,"examples/agi_cast.html",2734,0,"",html,selection_keyboard
+4183,11931776,"examples/agi_cast.html",2734,0,"r",html,content
+4184,11931777,"examples/agi_cast.html",2735,0,"",html,selection_keyboard
+4185,11931972,"examples/agi_cast.html",2735,0,"b",html,content
+4186,11931973,"examples/agi_cast.html",2736,0,"",html,selection_keyboard
+4187,11932127,"examples/agi_cast.html",2736,0,"i",html,content
+4188,11932128,"examples/agi_cast.html",2737,0,"",html,selection_keyboard
+4189,11932303,"examples/agi_cast.html",2737,0,"t",html,content
+4190,11932305,"examples/agi_cast.html",2738,0,"",html,selection_keyboard
+4191,11932788,"examples/agi_cast.html",2738,0,"r",html,content
+4192,11932791,"examples/agi_cast.html",2739,0,"",html,selection_keyboard
+4193,11932986,"examples/agi_cast.html",2739,0,"a",html,content
+4194,11932989,"examples/agi_cast.html",2740,0,"",html,selection_keyboard
+4195,11932996,"examples/agi_cast.html",2740,0,"r",html,content
+4196,11932998,"examples/agi_cast.html",2741,0,"",html,selection_keyboard
+4197,11933127,"examples/agi_cast.html",2741,0,"y",html,content
+4198,11933129,"examples/agi_cast.html",2742,0,"",html,selection_keyboard
+4199,11933438,"examples/agi_cast.html",2742,0," ",html,content
+4200,11933441,"examples/agi_cast.html",2743,0,"",html,selection_keyboard
+4201,11934004,"examples/agi_cast.html",2743,0,"k",html,content
+4202,11934007,"examples/agi_cast.html",2744,0,"",html,selection_keyboard
+4203,11934018,"examples/agi_cast.html",2744,0,"n",html,content
+4204,11934019,"examples/agi_cast.html",2745,0,"",html,selection_keyboard
+4205,11934025,"examples/agi_cast.html",2745,0,"o",html,content
+4206,11934026,"examples/agi_cast.html",2746,0,"",html,selection_keyboard
+4207,11934546,"examples/agi_cast.html",2746,0,"w",html,content
+4208,11934549,"examples/agi_cast.html",2747,0,"",html,selection_keyboard
+4209,11934858,"examples/agi_cast.html",2747,0,"l",html,content
+4210,11934862,"examples/agi_cast.html",2748,0,"",html,selection_keyboard
+4211,11934994,"examples/agi_cast.html",2748,0,"e",html,content
+4212,11934996,"examples/agi_cast.html",2749,0,"",html,selection_keyboard
+4213,11935265,"examples/agi_cast.html",2749,0,"d",html,content
+4214,11935268,"examples/agi_cast.html",2750,0,"",html,selection_keyboard
+4215,11935319,"examples/agi_cast.html",2750,0,"g",html,content
+4216,11935321,"examples/agi_cast.html",2751,0,"",html,selection_keyboard
+4217,11935378,"examples/agi_cast.html",2751,0,"e",html,content
+4218,11935380,"examples/agi_cast.html",2752,0,"",html,selection_keyboard
+4219,11935856,"examples/agi_cast.html",2752,0," ",html,content
+4220,11935860,"examples/agi_cast.html",2753,0,"",html,selection_keyboard
+4221,11935953,"examples/agi_cast.html",2753,0,"w",html,content
+4222,11935954,"examples/agi_cast.html",2754,0,"",html,selection_keyboard
+4223,11936113,"examples/agi_cast.html",2754,0,"o",html,content
+4224,11936115,"examples/agi_cast.html",2755,0,"",html,selection_keyboard
+4225,11936188,"examples/agi_cast.html",2755,0,"r",html,content
+4226,11936190,"examples/agi_cast.html",2756,0,"",html,selection_keyboard
+4227,11936451,"examples/agi_cast.html",2756,0,"k",html,content
+4228,11936453,"examples/agi_cast.html",2757,0,"",html,selection_keyboard
+4229,11937456,"examples/agi_cast.html",2756,0,"",html,selection_command
+4230,11938484,"examples/agi_cast.html",2757,0,"",html,selection_command
+4231,11938571,"examples/agi_cast.html",2757,0," ",html,content
+4232,11938572,"examples/agi_cast.html",2758,0,"",html,selection_keyboard
+4233,11938726,"examples/agi_cast.html",2758,0,"i",html,content
+4234,11938727,"examples/agi_cast.html",2759,0,"",html,selection_keyboard
+4235,11938756,"examples/agi_cast.html",2759,0,"t",html,content
+4236,11938758,"examples/agi_cast.html",2760,0,"",html,selection_keyboard
+4237,11939370,"examples/agi_cast.html",2759,1,"",html,content
+4238,11939477,"examples/agi_cast.html",2759,0,"s",html,content
+4239,11939478,"examples/agi_cast.html",2760,0,"",html,selection_keyboard
+4240,11939536,"examples/agi_cast.html",2760,0," ",html,content
+4241,11939539,"examples/agi_cast.html",2761,0,"",html,selection_keyboard
+4242,11939597,"examples/agi_cast.html",2761,0,"t",html,content
+4243,11939599,"examples/agi_cast.html",2762,0,"",html,selection_keyboard
+4244,11939698,"examples/agi_cast.html",2762,0,"o",html,content
+4245,11939700,"examples/agi_cast.html",2763,0,"",html,selection_keyboard
+4246,11939865,"examples/agi_cast.html",2763,0," ",html,content
+4247,11939867,"examples/agi_cast.html",2764,0,"",html,selection_keyboard
+4248,11940012,"examples/agi_cast.html",2764,0,"h",html,content
+4249,11940013,"examples/agi_cast.html",2765,0,"",html,selection_keyboard
+4250,11940902,"examples/agi_cast.html",2764,1,"",html,content
+4251,11940929,"examples/agi_cast.html",2764,0,"b",html,content
+4252,11940930,"examples/agi_cast.html",2765,0,"",html,selection_keyboard
+4253,11940987,"examples/agi_cast.html",2765,0,"e",html,content
+4254,11940989,"examples/agi_cast.html",2766,0,"",html,selection_keyboard
+4255,11941202,"examples/agi_cast.html",2766,0,"h",html,content
+4256,11941203,"examples/agi_cast.html",2767,0,"",html,selection_keyboard
+4257,11941362,"examples/agi_cast.html",2767,0,"a",html,content
+4258,11941364,"examples/agi_cast.html",2768,0,"",html,selection_keyboard
+4259,11941373,"examples/agi_cast.html",2768,0,"v",html,content
+4260,11941375,"examples/agi_cast.html",2769,0,"",html,selection_keyboard
+4261,11941499,"examples/agi_cast.html",2769,0,"i",html,content
+4262,11941501,"examples/agi_cast.html",2770,0,"",html,selection_keyboard
+4263,11941697,"examples/agi_cast.html",2770,0,"i",html,content
+4264,11941701,"examples/agi_cast.html",2771,0,"",html,selection_keyboard
+4265,11942162,"examples/agi_cast.html",2770,1,"",html,content
+4266,11942252,"examples/agi_cast.html",2770,0,"o",html,content
+4267,11942254,"examples/agi_cast.html",2771,0,"",html,selection_keyboard
+4268,11942371,"examples/agi_cast.html",2771,0,"u",html,content
+4269,11942373,"examples/agi_cast.html",2772,0,"",html,selection_keyboard
+4270,11942460,"examples/agi_cast.html",2772,0,"r",html,content
+4271,11942462,"examples/agi_cast.html",2773,0,"",html,selection_keyboard
+4272,11943200,"examples/agi_cast.html",2773,0,"-",html,content
+4273,11943206,"examples/agi_cast.html",2774,0,"",html,selection_keyboard
+4274,11943604,"examples/agi_cast.html",2774,0,"c",html,content
+4275,11943609,"examples/agi_cast.html",2775,0,"",html,selection_keyboard
+4276,11943819,"examples/agi_cast.html",2775,0,"l",html,content
+4277,11943823,"examples/agi_cast.html",2776,0,"",html,selection_keyboard
+4278,11943955,"examples/agi_cast.html",2776,0,"o",html,content
+4279,11943959,"examples/agi_cast.html",2777,0,"",html,selection_keyboard
+4280,11943986,"examples/agi_cast.html",2777,0,"n",html,content
+4281,11943989,"examples/agi_cast.html",2778,0,"",html,selection_keyboard
+4282,11944095,"examples/agi_cast.html",2778,0,"e",html,content
+4283,11944099,"examples/agi_cast.html",2779,0,"",html,selection_keyboard
+4284,11944233,"examples/agi_cast.html",2779,0," ",html,content
+4285,11944235,"examples/agi_cast.html",2780,0,"",html,selection_keyboard
+4286,11944469,"examples/agi_cast.html",2779,0,"",html,selection_command
+4287,11946310,"examples/agi_cast.html",2779,0,"\n ",html,content
+4288,11946484,"examples/agi_cast.html",2786,2,"\n ",html,content
+4289,11946688,"examples/agi_cast.html",2792,0,"",html,selection_command
+4290,11946974,"examples/agi_cast.html",2785,0,"",html,selection_command
+4291,11947016,"examples/agi_cast.html",2634,0,"",html,selection_command
+4292,11947499,"examples/agi_cast.html",2779,0,"",html,selection_command
+4293,11948533,"examples/agi_cast.html",2779,0,"\n ",html,content
+4294,11956302,"examples/agi_cast.html",2786,0,"f",html,content
+4295,11956306,"examples/agi_cast.html",2787,0,"",html,selection_keyboard
+4296,11956402,"examples/agi_cast.html",2787,0,"r",html,content
+4297,11956403,"examples/agi_cast.html",2788,0,"",html,selection_keyboard
+4298,11956507,"examples/agi_cast.html",2788,0,"o",html,content
+4299,11956509,"examples/agi_cast.html",2789,0,"",html,selection_keyboard
+4300,11956594,"examples/agi_cast.html",2789,0,"m",html,content
+4301,11956595,"examples/agi_cast.html",2790,0,"",html,selection_keyboard
+4302,11956899,"examples/agi_cast.html",2790,0," ",html,content
+4303,11956902,"examples/agi_cast.html",2791,0,"",html,selection_keyboard
+4304,11957426,"examples/agi_cast.html",2791,0,"s",html,content
+4305,11957428,"examples/agi_cast.html",2792,0,"",html,selection_keyboard
+4306,11957526,"examples/agi_cast.html",2792,0,"c",html,content
+4307,11957527,"examples/agi_cast.html",2793,0,"",html,selection_keyboard
+4308,11957784,"examples/agi_cast.html",2793,0,"r",html,content
+4309,11957787,"examples/agi_cast.html",2794,0,"",html,selection_keyboard
+4310,11957902,"examples/agi_cast.html",2794,0,"e",html,content
+4311,11957904,"examples/agi_cast.html",2795,0,"",html,selection_keyboard
+4312,11958085,"examples/agi_cast.html",2795,0,"e",html,content
+4313,11958088,"examples/agi_cast.html",2796,0,"",html,selection_keyboard
+4314,11958388,"examples/agi_cast.html",2796,0,"n",html,content
+4315,11958393,"examples/agi_cast.html",2797,0,"",html,selection_keyboard
+4316,11959822,"examples/agi_cast.html",2797,0," recordings of human researchers.",html,content
+4317,11960255,"examples/agi_cast.html",2829,0,"",html,selection_command
+4318,11960696,"examples/agi_cast.html",2818,0,"",html,selection_command
+4319,11961289,"examples/agi_cast.html",2818,12,"",html,content
+4320,11962361,"examples/agi_cast.html",2818,0,"w",html,content
+4321,11962364,"examples/agi_cast.html",2819,0,"",html,selection_keyboard
+4322,11962443,"examples/agi_cast.html",2819,0,"o",html,content
+4323,11962445,"examples/agi_cast.html",2820,0,"",html,selection_keyboard
+4324,11962943,"examples/agi_cast.html",2820,0,"r",html,content
+4325,11962947,"examples/agi_cast.html",2821,0,"",html,selection_keyboard
+4326,11963383,"examples/agi_cast.html",2821,0,"k",html,content
+4327,11963388,"examples/agi_cast.html",2822,0,"",html,selection_keyboard
+4328,11963401,"examples/agi_cast.html",2822,0,"e",html,content
+4329,11963403,"examples/agi_cast.html",2823,0,"",html,selection_keyboard
+4330,11964671,"examples/agi_cast.html",2823,0,"r",html,content
+4331,11964675,"examples/agi_cast.html",2824,0,"",html,selection_keyboard
+4332,11964911,"examples/agi_cast.html",2824,0,"s",html,content
+4333,11964915,"examples/agi_cast.html",2825,0,"",html,selection_keyboard
+4334,11965042,"examples/agi_cast.html",2825,0,".",html,content
+4335,11965045,"examples/agi_cast.html",2826,0,"",html,selection_keyboard
+4336,11965166,"examples/agi_cast.html",2825,0,"",html,selection_command
+4337,11965671,"examples/agi_cast.html",2818,0,"",html,selection_command
+4338,11965926,"examples/agi_cast.html",2812,0,"",html,selection_command
+4339,11965954,"examples/agi_cast.html",2809,0,"",html,selection_command
+4340,11965982,"examples/agi_cast.html",2798,0,"",html,selection_command
+4341,11966016,"examples/agi_cast.html",2791,0,"",html,selection_command
+4342,11966048,"examples/agi_cast.html",2786,0,"",html,selection_command
+4343,11966082,"examples/agi_cast.html",2774,0,"",html,selection_command
+4344,11966114,"examples/agi_cast.html",2773,0,"",html,selection_command
+4345,11966149,"examples/agi_cast.html",2764,0,"",html,selection_command
+4346,11966183,"examples/agi_cast.html",2761,0,"",html,selection_command
+4347,11966375,"examples/agi_cast.html",2758,0,"",html,selection_command
+4348,11966630,"examples/agi_cast.html",2753,0,"",html,selection_command
+4349,11966664,"examples/agi_cast.html",2743,0,"",html,selection_command
+4350,11966691,"examples/agi_cast.html",2733,0,"",html,selection_command
+4351,11966724,"examples/agi_cast.html",2730,0,"",html,selection_command
+4352,11966758,"examples/agi_cast.html",2719,0,"",html,selection_command
+4353,11966790,"examples/agi_cast.html",2716,0,"",html,selection_command
+4354,11966824,"examples/agi_cast.html",2707,0,"",html,selection_command
+4355,11966857,"examples/agi_cast.html",2699,0,"",html,selection_command
+4356,11966891,"examples/agi_cast.html",2695,0,"",html,selection_command
+4357,11966924,"examples/agi_cast.html",2688,0,"",html,selection_command
+4358,11966958,"examples/agi_cast.html",2685,0,"",html,selection_command
+4359,11966990,"examples/agi_cast.html",2681,0,"",html,selection_command
+4360,11967024,"examples/agi_cast.html",2673,0,"",html,selection_command
+4361,11967476,"examples/agi_cast.html",2681,0,"",html,selection_command
+4362,11967730,"examples/agi_cast.html",2685,0,"",html,selection_command
+4363,11967760,"examples/agi_cast.html",2688,0,"",html,selection_command
+4364,11967788,"examples/agi_cast.html",2695,0,"",html,selection_command
+4365,11967821,"examples/agi_cast.html",2699,0,"",html,selection_command
+4366,11967855,"examples/agi_cast.html",2707,0,"",html,selection_command
+4367,11967889,"examples/agi_cast.html",2716,0,"",html,selection_command
+4368,11967921,"examples/agi_cast.html",2719,0,"",html,selection_command
+4369,11967956,"examples/agi_cast.html",2730,0,"",html,selection_command
+4370,11968166,"examples/agi_cast.html",2733,0,"",html,selection_command
+4371,11968423,"examples/agi_cast.html",2743,0,"",html,selection_command
+4372,11968450,"examples/agi_cast.html",2753,0,"",html,selection_command
+4373,11968477,"examples/agi_cast.html",2758,0,"",html,selection_command
+4374,11968515,"examples/agi_cast.html",2761,0,"",html,selection_command
+4375,11968541,"examples/agi_cast.html",2764,0,"",html,selection_command
+4376,11968573,"examples/agi_cast.html",2773,0,"",html,selection_command
+4377,11968963,"examples/agi_cast.html",2774,0,"",html,selection_command
+4378,11969234,"examples/agi_cast.html",2786,0,"",html,selection_command
+4379,11970186,"examples/agi_cast.html",2791,0,"",html,selection_command
+4380,11970388,"examples/agi_cast.html",2798,0,"",html,selection_command
+4381,11970998,"examples/agi_cast.html",2809,0,"",html,selection_command
+4382,11971192,"examples/agi_cast.html",2812,0,"",html,selection_command
+4383,11972052,"examples/agi_cast.html",2818,0,"",html,selection_command
+4384,11972306,"examples/agi_cast.html",2812,0,"",html,selection_command
+4385,11972872,"examples/agi_cast.html",2812,14,"",html,content
+4386,11973789,"examples/agi_cast.html",2812,0,"h",html,content
+4387,11973792,"examples/agi_cast.html",2813,0,"",html,selection_keyboard
+4388,11973955,"examples/agi_cast.html",2813,0,"u",html,content
+4389,11973957,"examples/agi_cast.html",2814,0,"",html,selection_keyboard
+4390,11974148,"examples/agi_cast.html",2814,0,"m",html,content
+4391,11974149,"examples/agi_cast.html",2815,0,"",html,selection_keyboard
+4392,11974356,"examples/agi_cast.html",2815,0,"a",html,content
+4393,11974360,"examples/agi_cast.html",2816,0,"",html,selection_keyboard
+4394,11974446,"examples/agi_cast.html",2816,0,"n",html,content
+4395,11974449,"examples/agi_cast.html",2817,0,"",html,selection_keyboard
+4396,11974699,"examples/agi_cast.html",2817,0," ",html,content
+4397,11974701,"examples/agi_cast.html",2818,0,"",html,selection_keyboard
+4398,11977389,"examples/agi_cast.html",2818,0,"w",html,content
+4399,11977392,"examples/agi_cast.html",2819,0,"",html,selection_keyboard
+4400,11977454,"examples/agi_cast.html",2819,0,"o",html,content
+4401,11977456,"examples/agi_cast.html",2820,0,"",html,selection_keyboard
+4402,11977577,"examples/agi_cast.html",2820,0,"r",html,content
+4403,11977579,"examples/agi_cast.html",2821,0,"",html,selection_keyboard
+4404,11977809,"examples/agi_cast.html",2821,0,"k",html,content
+4405,11977813,"examples/agi_cast.html",2822,0,"",html,selection_keyboard
+4406,11977831,"examples/agi_cast.html",2822,0,"e",html,content
+4407,11977832,"examples/agi_cast.html",2823,0,"",html,selection_keyboard
+4408,11977853,"examples/agi_cast.html",2823,0,"r",html,content
+4409,11977855,"examples/agi_cast.html",2824,0,"",html,selection_keyboard
+4410,11978080,"examples/agi_cast.html",2824,0,"s",html,content
+4411,11978082,"examples/agi_cast.html",2825,0,"",html,selection_keyboard
+4412,11978148,"examples/agi_cast.html",2825,0,".",html,content
+4413,11978154,"examples/agi_cast.html",2826,0,"",html,selection_keyboard
+4414,11978295,"examples/agi_cast.html",2825,0,"",html,selection_command
+4415,11979546,"examples/agi_cast.html",2818,0,"",html,selection_command
+4416,11991086,"examples/agi_cast.html",2812,0,"",html,selection_command
+4417,11991337,"examples/agi_cast.html",2809,0,"",html,selection_command
+4418,11991366,"examples/agi_cast.html",2798,0,"",html,selection_command
+4419,11991394,"examples/agi_cast.html",2791,0,"",html,selection_command
+4420,11991429,"examples/agi_cast.html",2786,0,"",html,selection_command
+4421,11991461,"examples/agi_cast.html",2774,0,"",html,selection_command
+4422,11991495,"examples/agi_cast.html",2773,0,"",html,selection_command
+4423,11991528,"examples/agi_cast.html",2764,0,"",html,selection_command
+4424,11991561,"examples/agi_cast.html",2761,0,"",html,selection_command
+4425,11991594,"examples/agi_cast.html",2758,0,"",html,selection_command
+4426,11991873,"examples/agi_cast.html",2753,0,"",html,selection_command
+4427,11992128,"examples/agi_cast.html",2743,0,"",html,selection_command
+4428,11992156,"examples/agi_cast.html",2733,0,"",html,selection_command
+4429,11992188,"examples/agi_cast.html",2730,0,"",html,selection_command
+4430,11992222,"examples/agi_cast.html",2719,0,"",html,selection_command
+4431,11992254,"examples/agi_cast.html",2716,0,"",html,selection_command
+4432,11992289,"examples/agi_cast.html",2707,0,"",html,selection_command
+4433,11992321,"examples/agi_cast.html",2699,0,"",html,selection_command
+4434,11992355,"examples/agi_cast.html",2695,0,"",html,selection_command
+4435,11992389,"examples/agi_cast.html",2688,0,"",html,selection_command
+4436,11993857,"examples/agi_cast.html",2685,0,"",html,selection_command
+4437,11994032,"examples/agi_cast.html",2681,0,"",html,selection_command
+4438,11994199,"examples/agi_cast.html",2673,0,"",html,selection_command
+4439,11994354,"examples/agi_cast.html",2671,0,"",html,selection_command
+4440,12008417,"examples/agi_cast.html",2779,0,"",html,selection_command
+4441,12008888,"examples/agi_cast.html",2778,0,"",html,selection_command
+4442,12009581,"examples/agi_cast.html",2825,0,"",html,selection_command
+4443,12011427,"examples/agi_cast.html",2818,0,"",html,selection_command
+4444,12014998,"examples/agi_cast.html",2818,7,"",html,content
+4445,12015526,"examples/agi_cast.html",2817,0,"",html,selection_command
+4446,12016348,"examples/agi_cast.html",2825,0,"",html,selection_command
+4447,12017580,"examples/agi_cast.html",2818,0,"",html,selection_command
+4448,12037869,"examples/agi_cast.html",2818,0,"w",html,content
+4449,12037872,"examples/agi_cast.html",2819,0,"",html,selection_keyboard
+4450,12038003,"examples/agi_cast.html",2819,0,"o",html,content
+4451,12038005,"examples/agi_cast.html",2820,0,"",html,selection_keyboard
+4452,12038077,"examples/agi_cast.html",2820,0,"r",html,content
+4453,12038079,"examples/agi_cast.html",2821,0,"",html,selection_keyboard
+4454,12038296,"examples/agi_cast.html",2821,0,"k",html,content
+4455,12038299,"examples/agi_cast.html",2822,0,"",html,selection_keyboard
+4456,12038311,"examples/agi_cast.html",2822,0,"e",html,content
+4457,12038312,"examples/agi_cast.html",2823,0,"",html,selection_keyboard
+4458,12038323,"examples/agi_cast.html",2823,0,"r",html,content
+4459,12038325,"examples/agi_cast.html",2824,0,"",html,selection_keyboard
+4460,12038512,"examples/agi_cast.html",2824,0,"s",html,content
+4461,12038517,"examples/agi_cast.html",2825,0,"",html,selection_keyboard
+4462,12038866,"examples/agi_cast.html",2824,0,"",html,selection_command
+4463,12039188,"examples/agi_cast.html",2780,46," from screen recordings of human workers.",html,selection_command
+4464,12057288,"examples/agi_cast.html",2824,0,"",html,selection_command
+4465,12057678,"examples/agi_cast.html",2832,0,"",html,selection_command
+4466,12059006,"examples/agi_cast.html",2878,0,"",html,selection_command
+4467,12070983,"examples/agi_cast.html",2780,46," from screen recordings of human practitioners.",html,content
+4468,12074901,"examples/agi_cast.html",2838,0,"",html,selection_command
+4469,12074996,"examples/agi_cast.html",2824,0,"",html,selection_command
+4470,12108431,"examples/agi_cast.html",2832,0,"",html,selection_command
+4471,12108956,"examples/agi_cast.html",2831,0,"",html,selection_command
+4472,12110662,"examples/agi_cast.html",2831,0," ",html,content
+4473,12110666,"examples/agi_cast.html",2832,0,"",html,selection_keyboard
+4474,12110952,"examples/agi_cast.html",2832,0,"<",html,content
+4475,12110953,"examples/agi_cast.html",2833,0,"",html,selection_keyboard
+4476,12111012,"examples/agi_cast.html",2833,0,">",html,content
+4477,12111016,"examples/agi_cast.html",2834,0,"",html,selection_keyboard
+4478,12112563,"examples/agi_cast.html",2833,1,"",html,content
+4479,12112686,"examples/agi_cast.html",2832,1,"",html,content
+4480,12112860,"examples/agi_cast.html",2831,1,"",html,content
+4481,12113333,"examples/agi_cast.html",2831,0,"<",html,content
+4482,12113335,"examples/agi_cast.html",2832,0,"",html,selection_keyboard
+4483,12113874,"examples/agi_cast.html",2832,0,"a",html,content
+4484,12113878,"examples/agi_cast.html",2833,0,"",html,selection_keyboard
+4485,12113937,"examples/agi_cast.html",2833,0,"s",html,content
+4486,12113940,"examples/agi_cast.html",2834,0,"",html,selection_keyboard
+4487,12113954,"examples/agi_cast.html",2834,0,"i",html,content
+4488,12113955,"examples/agi_cast.html",2835,0,"",html,selection_keyboard
+4489,12114156,"examples/agi_cast.html",2835,0,"d",html,content
+4490,12114163,"examples/agi_cast.html",2836,0,"",html,selection_keyboard
+4491,12114239,"examples/agi_cast.html",2836,0,"e",html,content
+4492,12114241,"examples/agi_cast.html",2837,0,"",html,selection_keyboard
+4493,12115344,"examples/agi_cast.html",2837,0," class=""note"">This is analogous to how we teach language models to code by behaviour-cloning from screen recordings of human developers.",html,content
+4494,12115656,"examples/agi_cast.html",2980,0,"",html,selection_command
+4495,12115830,"examples/agi_cast.html",2975,0,"",html,selection_command
+4496,12116104,"examples/agi_cast.html",2972,0,"",html,selection_command
+4497,12116354,"examples/agi_cast.html",2962,0,"",html,selection_command
+4498,12116384,"examples/agi_cast.html",2956,0,"",html,selection_command
+4499,12116417,"examples/agi_cast.html",2953,0,"",html,selection_command
+4500,12116450,"examples/agi_cast.html",2942,0,"",html,selection_command
+4501,12116483,"examples/agi_cast.html",2935,0,"",html,selection_command
+4502,12116988,"examples/agi_cast.html",2851,122,"",html,content
+4503,12117716,"examples/agi_cast.html",2850,0,"",html,selection_command
+4504,12118855,"examples/agi_cast.html",2851,0,"",html,selection_command
+4505,12119127,"examples/agi_cast.html",2851,0,"C",html,content
+4506,12119129,"examples/agi_cast.html",2852,0,"",html,selection_keyboard
+4507,12119312,"examples/agi_cast.html",2852,0,"o",html,content
+4508,12119313,"examples/agi_cast.html",2853,0,"",html,selection_keyboard
+4509,12119385,"examples/agi_cast.html",2853,0,"m",html,content
+4510,12119386,"examples/agi_cast.html",2854,0,"",html,selection_keyboard
+4511,12119521,"examples/agi_cast.html",2854,0,"m",html,content
+4512,12119522,"examples/agi_cast.html",2855,0,"",html,selection_keyboard
+4513,12119595,"examples/agi_cast.html",2855,0,"o",html,content
+4514,12119596,"examples/agi_cast.html",2856,0,"",html,selection_keyboard
+4515,12119701,"examples/agi_cast.html",2856,0,"n",html,content
+4516,12119702,"examples/agi_cast.html",2857,0,"",html,selection_keyboard
+4517,12119924,"examples/agi_cast.html",2857,0,"l",html,content
+4518,12119927,"examples/agi_cast.html",2858,0,"",html,selection_keyboard
+4519,12119983,"examples/agi_cast.html",2858,0,"y",html,content
+4520,12119986,"examples/agi_cast.html",2859,0,"",html,selection_keyboard
+4521,12120203,"examples/agi_cast.html",2859,0," ",html,content
+4522,12120207,"examples/agi_cast.html",2860,0,"",html,selection_keyboard
+4523,12120998,"examples/agi_cast.html",2860,0,"r",html,content
+4524,12121003,"examples/agi_cast.html",2861,0,"",html,selection_keyboard
+4525,12121114,"examples/agi_cast.html",2861,0,"e",html,content
+4526,12121117,"examples/agi_cast.html",2862,0,"",html,selection_keyboard
+4527,12121332,"examples/agi_cast.html",2862,0,"f",html,content
+4528,12121336,"examples/agi_cast.html",2863,0,"",html,selection_keyboard
+4529,12121347,"examples/agi_cast.html",2863,0,"e",html,content
+4530,12121350,"examples/agi_cast.html",2864,0,"",html,selection_keyboard
+4531,12121427,"examples/agi_cast.html",2864,0,"r",html,content
+4532,12121430,"examples/agi_cast.html",2865,0,"",html,selection_keyboard
+4533,12121579,"examples/agi_cast.html",2865,0,"r",html,content
+4534,12121583,"examples/agi_cast.html",2866,0,"",html,selection_keyboard
+4535,12121635,"examples/agi_cast.html",2866,0,"e",html,content
+4536,12121638,"examples/agi_cast.html",2867,0,"",html,selection_keyboard
+4537,12121893,"examples/agi_cast.html",2867,0,"d",html,content
+4538,12121896,"examples/agi_cast.html",2868,0,"",html,selection_keyboard
+4539,12121964,"examples/agi_cast.html",2868,0," ",html,content
+4540,12121967,"examples/agi_cast.html",2869,0,"",html,selection_keyboard
+4541,12122039,"examples/agi_cast.html",2869,0,"t",html,content
+4542,12122042,"examples/agi_cast.html",2870,0,"",html,selection_keyboard
+4543,12122177,"examples/agi_cast.html",2870,0,"o",html,content
+4544,12122180,"examples/agi_cast.html",2871,0,"",html,selection_keyboard
+4545,12122338,"examples/agi_cast.html",2871,0," ",html,content
+4546,12122342,"examples/agi_cast.html",2872,0,"",html,selection_keyboard
+4547,12122586,"examples/agi_cast.html",2872,0,"a",html,content
+4548,12122591,"examples/agi_cast.html",2873,0,"",html,selection_keyboard
+4549,12122621,"examples/agi_cast.html",2873,0,"s",html,content
+4550,12122624,"examples/agi_cast.html",2874,0,"",html,selection_keyboard
+4551,12122714,"examples/agi_cast.html",2874,0," ",html,content
+4552,12122718,"examples/agi_cast.html",2875,0,"",html,selection_keyboard
+4553,12123378,"examples/agi_cast.html",2875,0,"'",html,content
+4554,12123381,"examples/agi_cast.html",2876,0,"",html,selection_keyboard
+4555,12123552,"examples/agi_cast.html",2876,0,"c",html,content
+4556,12123553,"examples/agi_cast.html",2877,0,"",html,selection_keyboard
+4557,12123689,"examples/agi_cast.html",2877,0,"o",html,content
+4558,12123691,"examples/agi_cast.html",2878,0,"",html,selection_keyboard
+4559,12123937,"examples/agi_cast.html",2878,0,"l",html,content
+4560,12123941,"examples/agi_cast.html",2879,0,"",html,selection_keyboard
+4561,12124002,"examples/agi_cast.html",2879,0,"d",html,content
+4562,12124004,"examples/agi_cast.html",2880,0,"",html,selection_keyboard
+4563,12124734,"examples/agi_cast.html",2880,0,"-",html,content
+4564,12124738,"examples/agi_cast.html",2881,0,"",html,selection_keyboard
+4565,12125291,"examples/agi_cast.html",2880,0,"",html,selection_command
+4566,12127088,"examples/agi_cast.html",2881,0,"",html,selection_command
+4567,12127373,"examples/agi_cast.html",2881,0,"s",html,content
+4568,12127375,"examples/agi_cast.html",2882,0,"",html,selection_keyboard
+4569,12127504,"examples/agi_cast.html",2882,0,"t",html,content
+4570,12127506,"examples/agi_cast.html",2883,0,"",html,selection_keyboard
+4571,12127877,"examples/agi_cast.html",2883,0,"a",html,content
+4572,12127879,"examples/agi_cast.html",2884,0,"",html,selection_keyboard
+4573,12127920,"examples/agi_cast.html",2884,0,"r",html,content
+4574,12127922,"examples/agi_cast.html",2885,0,"",html,selection_keyboard
+4575,12128148,"examples/agi_cast.html",2885,0,"t",html,content
+4576,12128150,"examples/agi_cast.html",2886,0,"",html,selection_keyboard
+4577,12128463,"examples/agi_cast.html",2886,0,"' data",html,content
+4578,12128692,"examples/agi_cast.html",2891,0,"",html,selection_command
+4579,12129581,"examples/agi_cast.html",2780,0,"",html,selection_command
+4580,12157751,"examples/agi_cast.html",2786,0,"",html,selection_command
+4581,12157996,"examples/agi_cast.html",2791,0,"",html,selection_command
+4582,12158028,"examples/agi_cast.html",2798,0,"",html,selection_command
+4583,12158061,"examples/agi_cast.html",2809,0,"",html,selection_command
+4584,12158093,"examples/agi_cast.html",2812,0,"",html,selection_command
+4585,12158127,"examples/agi_cast.html",2818,0,"",html,selection_command
+4586,12158160,"examples/agi_cast.html",2831,0,"",html,selection_command
+4587,12158193,"examples/agi_cast.html",2832,0,"",html,selection_command
+4588,12158227,"examples/agi_cast.html",2838,0,"",html,selection_command
+4589,12158260,"examples/agi_cast.html",2843,0,"",html,selection_command
+4590,12158293,"examples/agi_cast.html",2845,0,"",html,selection_command
+4591,12158327,"examples/agi_cast.html",2849,0,"",html,selection_command
+4592,12158359,"examples/agi_cast.html",2851,0,"",html,selection_command
+4593,12158394,"examples/agi_cast.html",2860,0,"",html,selection_command
+4594,12158426,"examples/agi_cast.html",2869,0,"",html,selection_command
+4595,12158460,"examples/agi_cast.html",2872,0,"",html,selection_command
+4596,12158492,"examples/agi_cast.html",2875,0,"",html,selection_command
+4597,12159012,"examples/agi_cast.html",2872,0,"",html,selection_command
+4598,12159279,"examples/agi_cast.html",2875,0,"",html,selection_command
+4599,12159804,"examples/agi_cast.html",2872,0,"",html,selection_command
+4600,12159962,"examples/agi_cast.html",2869,0,"",html,selection_command
+4601,12160136,"examples/agi_cast.html",2860,0,"",html,selection_command
+4602,12160315,"examples/agi_cast.html",2851,0,"",html,selection_command
+4603,12160495,"examples/agi_cast.html",2849,0,"",html,selection_command
+4604,12160781,"examples/agi_cast.html",2851,0,"",html,selection_command
+4605,12174841,"examples/agi_cast.html",2851,1,"C",html,selection_command
+4606,12174970,"examples/agi_cast.html",2851,10,"Commonly r",html,selection_command
+4607,12175137,"examples/agi_cast.html",2851,19,"Commonly referred t",html,selection_command
+4608,12175468,"examples/agi_cast.html",2851,22,"Commonly referred to a",html,selection_command
+4609,12175872,"examples/agi_cast.html",2851,25,"Commonly referred to as '",html,selection_command
+4610,12179287,"examples/agi_cast.html",2851,24,"Commonly referred to as ",html,selection_command
+4611,12179421,"examples/agi_cast.html",2851,23,"Commonly referred to as",html,selection_command
+4612,12179684,"examples/agi_cast.html",2851,23,"",html,content
+4613,12180101,"examples/agi_cast.html",2851,0,"A",html,content
+4614,12180102,"examples/agi_cast.html",2852,0,"",html,selection_keyboard
+4615,12180336,"examples/agi_cast.html",2852,0,"n",html,content
+4616,12180338,"examples/agi_cast.html",2853,0,"",html,selection_keyboard
+4617,12180538,"examples/agi_cast.html",2853,0,"a",html,content
+4618,12180540,"examples/agi_cast.html",2854,0,"",html,selection_keyboard
+4619,12180742,"examples/agi_cast.html",2854,0,"l",html,content
+4620,12180744,"examples/agi_cast.html",2855,0,"",html,selection_keyboard
+4621,12180846,"examples/agi_cast.html",2855,0,"o",html,content
+4622,12180847,"examples/agi_cast.html",2856,0,"",html,selection_keyboard
+4623,12181027,"examples/agi_cast.html",2856,0,"g",html,content
+4624,12181030,"examples/agi_cast.html",2857,0,"",html,selection_keyboard
+4625,12181146,"examples/agi_cast.html",2857,0,"o",html,content
+4626,12181148,"examples/agi_cast.html",2858,0,"",html,selection_keyboard
+4627,12181269,"examples/agi_cast.html",2858,0,"u",html,content
+4628,12181272,"examples/agi_cast.html",2859,0,"",html,selection_keyboard
+4629,12181514,"examples/agi_cast.html",2859,0,"s",html,content
+4630,12181518,"examples/agi_cast.html",2860,0,"",html,selection_keyboard
+4631,12181737,"examples/agi_cast.html",2860,0," ",html,content
+4632,12181740,"examples/agi_cast.html",2861,0,"",html,selection_keyboard
+4633,12181839,"examples/agi_cast.html",2861,0,"t",html,content
+4634,12181842,"examples/agi_cast.html",2862,0,"",html,selection_keyboard
+4635,12181927,"examples/agi_cast.html",2862,0,"o",html,content
+4636,12181929,"examples/agi_cast.html",2863,0,"",html,selection_keyboard
+4637,12182155,"examples/agi_cast.html",2862,0,"",html,selection_command
+4638,12183705,"examples/agi_cast.html",2864,0,"",html,selection_command
+4639,12183869,"examples/agi_cast.html",2865,0,"",html,selection_command
+4640,12184048,"examples/agi_cast.html",2869,0,"",html,selection_command
+4641,12184227,"examples/agi_cast.html",2870,0,"",html,selection_command
+4642,12184544,"examples/agi_cast.html",2875,0,"",html,selection_command
+4643,12185037,"examples/agi_cast.html",2875,1,"",html,content
+4644,12185247,"examples/agi_cast.html",2876,0,"",html,selection_command
+4645,12185727,"examples/agi_cast.html",2879,0,"",html,selection_command
+4646,12185963,"examples/agi_cast.html",2880,0,"'",html,content
+4647,12185967,"examples/agi_cast.html",2880,0,"",html,selection_command
+4648,12190436,"examples/agi_cast.html",2881,0,"",html,selection_command
+4649,12190544,"examples/agi_cast.html",2881,0," ",html,content
+4650,12190547,"examples/agi_cast.html",2882,0,"",html,selection_keyboard
+4651,12190911,"examples/agi_cast.html",2882,0,"i",html,content
+4652,12190912,"examples/agi_cast.html",2883,0,"",html,selection_keyboard
+4653,12190991,"examples/agi_cast.html",2883,0,"n",html,content
+4654,12190993,"examples/agi_cast.html",2884,0,"",html,selection_keyboard
+4655,12191180,"examples/agi_cast.html",2884,0," ",html,content
+4656,12191182,"examples/agi_cast.html",2885,0,"",html,selection_keyboard
+4657,12191795,"examples/agi_cast.html",2885,0,"t",html,content
+4658,12191797,"examples/agi_cast.html",2886,0,"",html,selection_keyboard
+4659,12191903,"examples/agi_cast.html",2886,0,"h",html,content
+4660,12191904,"examples/agi_cast.html",2887,0,"",html,selection_keyboard
+4661,12191995,"examples/agi_cast.html",2887,0,"e",html,content
+4662,12191996,"examples/agi_cast.html",2888,0,"",html,selection_keyboard
+4663,12192161,"examples/agi_cast.html",2888,0," ",html,content
+4664,12192162,"examples/agi_cast.html",2889,0,"",html,selection_keyboard
+4665,12192252,"examples/agi_cast.html",2889,0,"l",html,content
+4666,12192253,"examples/agi_cast.html",2890,0,"",html,selection_keyboard
+4667,12192337,"examples/agi_cast.html",2890,0,"a",html,content
+4668,12192339,"examples/agi_cast.html",2891,0,"",html,selection_keyboard
+4669,12192347,"examples/agi_cast.html",2891,0,"n",html,content
+4670,12192349,"examples/agi_cast.html",2892,0,"",html,selection_keyboard
+4671,12192469,"examples/agi_cast.html",2892,0,"g",html,content
+4672,12192470,"examples/agi_cast.html",2893,0,"",html,selection_keyboard
+4673,12192562,"examples/agi_cast.html",2893,0,"u",html,content
+4674,12192563,"examples/agi_cast.html",2894,0,"",html,selection_keyboard
+4675,12192761,"examples/agi_cast.html",2894,0,"a",html,content
+4676,12192766,"examples/agi_cast.html",2895,0,"",html,selection_keyboard
+4677,12192783,"examples/agi_cast.html",2895,0,"g",html,content
+4678,12192785,"examples/agi_cast.html",2896,0,"",html,selection_keyboard
+4679,12192833,"examples/agi_cast.html",2896,0,"e",html,content
+4680,12192834,"examples/agi_cast.html",2897,0,"",html,selection_keyboard
+4681,12193012,"examples/agi_cast.html",2897,0," ",html,content
+4682,12193014,"examples/agi_cast.html",2898,0,"",html,selection_keyboard
+4683,12194018,"examples/agi_cast.html",2898,0,"m",html,content
+4684,12194022,"examples/agi_cast.html",2899,0,"",html,selection_keyboard
+4685,12194061,"examples/agi_cast.html",2899,0,"o",html,content
+4686,12194064,"examples/agi_cast.html",2900,0,"",html,selection_keyboard
+4687,12194218,"examples/agi_cast.html",2900,0,"d",html,content
+4688,12194220,"examples/agi_cast.html",2901,0,"",html,selection_keyboard
+4689,12194304,"examples/agi_cast.html",2901,0,"e",html,content
+4690,12194308,"examples/agi_cast.html",2902,0,"",html,selection_keyboard
+4691,12194540,"examples/agi_cast.html",2902,0,"l",html,content
+4692,12194544,"examples/agi_cast.html",2903,0,"",html,selection_keyboard
+4693,12194560,"examples/agi_cast.html",2903,0,"i",html,content
+4694,12194563,"examples/agi_cast.html",2904,0,"",html,selection_keyboard
+4695,12194574,"examples/agi_cast.html",2904,0,"n",html,content
+4696,12194577,"examples/agi_cast.html",2905,0,"",html,selection_keyboard
+4697,12194588,"examples/agi_cast.html",2905,0,"g",html,content
+4698,12194592,"examples/agi_cast.html",2906,0,"",html,selection_keyboard
+4699,12194739,"examples/agi_cast.html",2906,0," ",html,content
+4700,12194742,"examples/agi_cast.html",2907,0,"",html,selection_keyboard
+4701,12195179,"examples/agi_cast.html",2907,0,"p",html,content
+4702,12195181,"examples/agi_cast.html",2908,0,"",html,selection_keyboard
+4703,12195263,"examples/agi_cast.html",2908,0,"i",html,content
+4704,12195266,"examples/agi_cast.html",2909,0,"",html,selection_keyboard
+4705,12195336,"examples/agi_cast.html",2909,0,"p",html,content
+4706,12195337,"examples/agi_cast.html",2910,0,"",html,selection_keyboard
+4707,12195911,"examples/agi_cast.html",2910,0,"e",html,content
+4708,12195913,"examples/agi_cast.html",2911,0,"",html,selection_keyboard
+4709,12196312,"examples/agi_cast.html",2911,0,"l",html,content
+4710,12196313,"examples/agi_cast.html",2912,0,"",html,selection_keyboard
+4711,12196323,"examples/agi_cast.html",2912,0,"i",html,content
+4712,12196324,"examples/agi_cast.html",2913,0,"",html,selection_keyboard
+4713,12196333,"examples/agi_cast.html",2913,0,"n",html,content
+4714,12196335,"examples/agi_cast.html",2914,0,"",html,selection_keyboard
+4715,12196363,"examples/agi_cast.html",2914,0,"e",html,content
+4716,12196367,"examples/agi_cast.html",2915,0,"",html,selection_keyboard
+4717,12196599,"examples/agi_cast.html",2914,0,"",html,selection_command
+4718,12224515,"examples/agi_cast.html",2930,0,"",html,selection_command
+4719,12224591,"examples/agi_cast.html",3021,0,"",html,selection_command
+4720,12224737,"examples/agi_cast.html",3023,0,"",html,selection_command
+4721,12224969,"examples/agi_cast.html",3021,0,"",html,selection_command
+4722,12225667,"examples/agi_cast.html",2938,0,"",html,selection_command
+4723,12226098,"examples/agi_cast.html",2936,2,"",html,content
+4724,12226558,"examples/agi_cast.html",2934,2,"",html,content
+4725,12226770,"examples/agi_cast.html",2932,2,"",html,content
+4726,12226979,"examples/agi_cast.html",2931,1,"",html,content
+4727,12227278,"examples/agi_cast.html",2929,2,"",html,content
+4728,12227578,"examples/agi_cast.html",2927,2,"",html,content
+4729,12227753,"examples/agi_cast.html",2925,2,"",html,content
+4730,12228179,"examples/agi_cast.html",2924,1,"",html,content
+4731,12228688,"examples/agi_cast.html",2924,0," ",html,content
+4732,12228691,"examples/agi_cast.html",2925,0,"",html,selection_keyboard
+4733,12228996,"examples/agi_cast.html",2925,0,"T",html,content
+4734,12228998,"examples/agi_cast.html",2926,0,"",html,selection_keyboard
+4735,12229180,"examples/agi_cast.html",2926,0,"h",html,content
+4736,12229183,"examples/agi_cast.html",2927,0,"",html,selection_keyboard
+4737,12229239,"examples/agi_cast.html",2927,0,"i",html,content
+4738,12229242,"examples/agi_cast.html",2928,0,"",html,selection_keyboard
+4739,12229392,"examples/agi_cast.html",2928,0,"s",html,content
+4740,12229395,"examples/agi_cast.html",2929,0,"",html,selection_keyboard
+4741,12229865,"examples/agi_cast.html",2929,0,"\n ",html,content
+4742,12230503,"examples/agi_cast.html",2935,0,"",html,selection_command
+4743,12232596,"examples/agi_cast.html",3020,0,"",html,selection_command
+4744,12233329,"examples/agi_cast.html",3020,0,"i",html,content
+4745,12233331,"examples/agi_cast.html",3021,0,"",html,selection_keyboard
+4746,12233505,"examples/agi_cast.html",3021,0,",",html,content
+4747,12233507,"examples/agi_cast.html",3022,0,"",html,selection_keyboard
+4748,12233709,"examples/agi_cast.html",3022,0," ",html,content
+4749,12233711,"examples/agi_cast.html",3023,0,"",html,selection_keyboard
+4750,12234209,"examples/agi_cast.html",3022,1,"",html,content
+4751,12234358,"examples/agi_cast.html",3021,1,"",html,content
+4752,12234503,"examples/agi_cast.html",3020,1,"",html,content
+4753,12234977,"examples/agi_cast.html",3019,0,"",html,selection_command
+4754,12235181,"examples/agi_cast.html",3019,0,",",html,content
+4755,12235184,"examples/agi_cast.html",3020,0,"",html,selection_keyboard
+4756,12235377,"examples/agi_cast.html",3020,0," ",html,content
+4757,12235378,"examples/agi_cast.html",3021,0,"",html,selection_keyboard
+4758,12237339,"examples/agi_cast.html",3021,0,"a",html,content
+4759,12237342,"examples/agi_cast.html",3022,0,"",html,selection_keyboard
+4760,12237401,"examples/agi_cast.html",3022,0," ",html,content
+4761,12237404,"examples/agi_cast.html",3023,0,"",html,selection_keyboard
+4762,12239589,"examples/agi_cast.html",3022,1,"",html,content
+4763,12239746,"examples/agi_cast.html",3021,1,"",html,content
+4764,12239894,"examples/agi_cast.html",3020,1,"",html,content
+4765,12240062,"examples/agi_cast.html",3019,1,"",html,content
+4766,12240240,"examples/agi_cast.html",3018,0,"",html,selection_command
+4767,12240425,"examples/agi_cast.html",3012,0,"",html,selection_command
+4768,12240597,"examples/agi_cast.html",3011,0,"",html,selection_command
+4769,12240747,"examples/agi_cast.html",3002,0,"",html,selection_command
+4770,12240938,"examples/agi_cast.html",2996,0,"",html,selection_command
+4771,12241080,"examples/agi_cast.html",2995,0,"",html,selection_command
+4772,12241297,"examples/agi_cast.html",2990,0,"",html,selection_command
+4773,12242481,"examples/agi_cast.html",2995,0,"",html,selection_command
+4774,12242663,"examples/agi_cast.html",2996,0,"",html,selection_command
+4775,12242812,"examples/agi_cast.html",3002,0,"",html,selection_command
+4776,12242979,"examples/agi_cast.html",3011,0,"",html,selection_command
+4777,12243138,"examples/agi_cast.html",3012,0,"",html,selection_command
+4778,12243473,"examples/agi_cast.html",3019,0,"",html,selection_command
+4779,12244895,"examples/agi_cast.html",3019,0,",",html,content
+4780,12244898,"examples/agi_cast.html",3020,0,"",html,selection_keyboard
+4781,12245103,"examples/agi_cast.html",3020,0," ",html,content
+4782,12245105,"examples/agi_cast.html",3021,0,"",html,selection_keyboard
+4783,12245187,"examples/agi_cast.html",3021,0,"a",html,content
+4784,12245188,"examples/agi_cast.html",3022,0,"",html,selection_keyboard
+4785,12245809,"examples/agi_cast.html",3022,0," ",html,content
+4786,12245811,"examples/agi_cast.html",3023,0,"",html,selection_keyboard
+4787,12247674,"examples/agi_cast.html",3023,0,"n",html,content
+4788,12247676,"examples/agi_cast.html",3024,0,"",html,selection_keyboard
+4789,12248360,"examples/agi_cast.html",3024,0,"a",html,content
+4790,12248364,"examples/agi_cast.html",3025,0,"",html,selection_keyboard
+4791,12248529,"examples/agi_cast.html",3025,0,"s",html,content
+4792,12248531,"examples/agi_cast.html",3026,0,"",html,selection_keyboard
+4793,12248579,"examples/agi_cast.html",3026,0,"c",html,content
+4794,12248581,"examples/agi_cast.html",3027,0,"",html,selection_keyboard
+4795,12249078,"examples/agi_cast.html",3027,0,"e",html,content
+4796,12249082,"examples/agi_cast.html",3028,0,"",html,selection_keyboard
+4797,12249228,"examples/agi_cast.html",3028,0,"n",html,content
+4798,12249230,"examples/agi_cast.html",3029,0,"",html,selection_keyboard
+4799,12249329,"examples/agi_cast.html",3029,0,"t",html,content
+4800,12249331,"examples/agi_cast.html",3030,0,"",html,selection_keyboard
+4801,12249537,"examples/agi_cast.html",3030,0," ",html,content
+4802,12249539,"examples/agi_cast.html",3031,0,"",html,selection_keyboard
+4803,12250220,"examples/agi_cast.html",3031,0,"f",html,content
+4804,12250223,"examples/agi_cast.html",3032,0,"",html,selection_keyboard
+4805,12250232,"examples/agi_cast.html",3032,0,"i",html,content
+4806,12250235,"examples/agi_cast.html",3033,0,"",html,selection_keyboard
+4807,12250713,"examples/agi_cast.html",3033,0,"e",html,content
+4808,12250716,"examples/agi_cast.html",3034,0,"",html,selection_keyboard
+4809,12250910,"examples/agi_cast.html",3034,0,"l",html,content
+4810,12250912,"examples/agi_cast.html",3035,0,"",html,selection_keyboard
+4811,12250997,"examples/agi_cast.html",3035,0,"d",html,content
+4812,12250999,"examples/agi_cast.html",3036,0,"",html,selection_keyboard
+4813,12251782,"examples/agi_cast.html",3035,0,"",html,selection_command
+4814,12253888,"examples/agi_cast.html",3031,0,"",html,selection_command
+4815,12254060,"examples/agi_cast.html",3023,0,"",html,selection_command
+4816,12254378,"examples/agi_cast.html",3029,0,"",html,selection_command
+4817,12255018,"examples/agi_cast.html",3030,0,"",html,selection_command
+4818,12255046,"examples/agi_cast.html",3030,0,",",html,content
+4819,12255049,"examples/agi_cast.html",3031,0,"",html,selection_keyboard
+4820,12255255,"examples/agi_cast.html",3031,0," ",html,content
+4821,12255256,"examples/agi_cast.html",3032,0,"",html,selection_keyboard
+4822,12255265,"examples/agi_cast.html",3032,0,"b",html,content
+4823,12255267,"examples/agi_cast.html",3033,0,"",html,selection_keyboard
+4824,12255371,"examples/agi_cast.html",3033,0,"u",html,content
+4825,12255372,"examples/agi_cast.html",3034,0,"",html,selection_keyboard
+4826,12255495,"examples/agi_cast.html",3034,0,"t",html,content
+4827,12255497,"examples/agi_cast.html",3035,0,"",html,selection_keyboard
+4828,12255631,"examples/agi_cast.html",3035,0," ",html,content
+4829,12255634,"examples/agi_cast.html",3036,0,"",html,selection_keyboard
+4830,12255647,"examples/agi_cast.html",3036,0,"h",html,content
+4831,12255650,"examples/agi_cast.html",3037,0,"",html,selection_keyboard
+4832,12255704,"examples/agi_cast.html",3037,0,"i",html,content
+4833,12255708,"examples/agi_cast.html",3038,0,"",html,selection_keyboard
+4834,12255826,"examples/agi_cast.html",3038,0,"g",html,content
+4835,12255827,"examples/agi_cast.html",3039,0,"",html,selection_keyboard
+4836,12255897,"examples/agi_cast.html",3039,0,"h",html,content
+4837,12255898,"examples/agi_cast.html",3040,0,"",html,selection_keyboard
+4838,12256431,"examples/agi_cast.html",3040,0,"l",html,content
+4839,12256436,"examples/agi_cast.html",3041,0,"",html,selection_keyboard
+4840,12256446,"examples/agi_cast.html",3041,0,"y",html,content
+4841,12256449,"examples/agi_cast.html",3042,0,"",html,selection_keyboard
+4842,12256666,"examples/agi_cast.html",3042,0," ",html,content
+4843,12256669,"examples/agi_cast.html",3043,0,"",html,selection_keyboard
+4844,12257444,"examples/agi_cast.html",3043,0,"promising",html,content
+4845,12257643,"examples/agi_cast.html",3051,0,"",html,selection_command
+4846,12260794,"examples/agi_cast.html",3060,0,"",html,selection_command
+4847,12260960,"examples/agi_cast.html",3061,0,"",html,selection_command
+4848,12261806,"examples/agi_cast.html",3060,0,"",html,selection_command
+4849,12261952,"examples/agi_cast.html",3051,0,"",html,selection_command
+4850,12262137,"examples/agi_cast.html",2901,0,"",html,selection_command
+4851,12262428,"examples/agi_cast.html",3051,0,"",html,selection_command
+4852,12262576,"examples/agi_cast.html",3060,0,"",html,selection_command
+4853,12262882,"examples/agi_cast.html",3051,0,"",html,selection_command
+4854,12264021,"examples/agi_cast.html",3043,0,"",html,selection_command
+4855,12266681,"examples/agi_cast.html",2930,129," ",html,content
+4856,12266760,"examples/agi_cast.html",2936,0,"w",html,content
+4857,12266761,"examples/agi_cast.html",2937,0,"",html,selection_keyboard
+4858,12267365,"examples/agi_cast.html",2936,0,"",html,selection_command
+4859,12267513,"examples/agi_cast.html",2937,0,"ards video-based behaviour-cloning, a nascent, but highly promising field.",html,content
+4860,12267521,"examples/agi_cast.html",2936,0,"requires moving from predominantly text-based to",html,content
+4861,12267539,"examples/agi_cast.html",3043,0,"",html,selection_command
+4862,12268038,"examples/agi_cast.html",3043,9,"",html,content
+4863,12268947,"examples/agi_cast.html",3042,1,"",html,content
+4864,12269085,"examples/agi_cast.html",3041,1,"",html,content
+4865,12269252,"examples/agi_cast.html",3040,1,"",html,content
+4866,12273853,"examples/agi_cast.html",3039,0,"",html,selection_command
+4867,12273966,"examples/agi_cast.html",3036,0,"",html,selection_command
+4868,12274132,"examples/agi_cast.html",3032,0,"",html,selection_command
+4869,12274296,"examples/agi_cast.html",3030,0,"",html,selection_command
+4870,12274862,"examples/agi_cast.html",3030,1,",",html,selection_command
+4871,12274937,"examples/agi_cast.html",3030,3,", b",html,selection_command
+4872,12275388,"examples/agi_cast.html",3030,7,", but h",html,selection_command
+4873,12275526,"examples/agi_cast.html",3030,12,", but high f",html,selection_command
+4874,12275856,"examples/agi_cast.html",3030,12,"",html,content
+4875,12276620,"examples/agi_cast.html",3030,0," ",html,content
+4876,12276623,"examples/agi_cast.html",3031,0,"",html,selection_keyboard
+4877,12276805,"examples/agi_cast.html",3031,0,"f",html,content
+4878,12276808,"examples/agi_cast.html",3032,0,"",html,selection_keyboard
+4879,12277049,"examples/agi_cast.html",3031,0,"",html,selection_command
+4880,12278870,"examples/agi_cast.html",3038,0,"",html,selection_command
+4881,12279018,"examples/agi_cast.html",3039,0,"",html,selection_command
+4882,12279170,"examples/agi_cast.html",3138,0,"",html,selection_command
+4883,12279410,"examples/agi_cast.html",3039,0,"",html,selection_command
+4884,12279538,"examples/agi_cast.html",3038,0,"",html,selection_command
+4885,12279722,"examples/agi_cast.html",3031,0,"",html,selection_command
+4886,12279878,"examples/agi_cast.html",3038,0,"",html,selection_command
+4887,12280038,"examples/agi_cast.html",3039,0,"",html,selection_command
+4888,12280187,"examples/agi_cast.html",3138,0,"",html,selection_command
+4889,12280383,"examples/agi_cast.html",3039,0,"",html,selection_command
+4890,12281169,"examples/agi_cast.html",3138,0,"",html,selection_command
+4891,12281296,"examples/agi_cast.html",3147,0,"",html,selection_command
+4892,12281620,"examples/agi_cast.html",3138,0,"",html,selection_command
+4893,12282221,"examples/agi_cast.html",3040,100,"",html,content
+4894,12282240,"examples/agi_cast.html",3044,0,"",html,selection_command
+4895,12282540,"examples/agi_cast.html",3053,0,"",html,selection_command
+4896,12282602,"examples/agi_cast.html",3061,0,"",html,selection_command
+4897,12282997,"examples/agi_cast.html",3062,0,"",html,selection_command
+4898,12283214,"examples/agi_cast.html",3054,0,"",html,selection_command
+4899,12283520,"examples/agi_cast.html",3056,0,"\n A longstanding goal of AGI research is automating the process of conducting research itself. ",html,content
+4900,12283523,"examples/agi_cast.html",3063,0,"",html,selection_command
+4901,12284103,"examples/agi_cast.html",3055,0,"",html,selection_command
+4902,12284195,"examples/agi_cast.html",3046,0,"",html,selection_command
+4903,12284334,"examples/agi_cast.html",3039,0,"",html,selection_command
+4904,12284899,"examples/agi_cast.html",3039,1,"",html,content
+4905,12284910,"examples/agi_cast.html",3043,0,"",html,selection_command
+4906,12285220,"examples/agi_cast.html",3038,0,"",html,selection_command
+4907,12285754,"examples/agi_cast.html",3038,1,"",html,content
+4908,12285764,"examples/agi_cast.html",3042,0,"",html,selection_command
+4909,12287347,"TERMINAL",0,0,"^C[1G[0Kโ [1G[0K[1m[7m%[27m[1m[0m \r \r]633;D;0]633;P;Cwd=/Users/franzsrambical/Documents/pdoom/pdoom.org\r[0m[27m[24m[J]633;Afranzsrambical@MBF6N9WFVKFV pdoom.org % ]633;B[K[?2004h",,terminal_output
+4910,12287675,"TERMINAL",0,0,"npm run dev",,terminal_output
+4911,12287759,"TERMINAL",0,0,"[11Dcp examples/* dist",,terminal_output
+4912,12287995,"TERMINAL",0,0,"cp examples/* dist",,terminal_command
+4913,12287996,"TERMINAL",0,0,"[?2004l\r\r\n]633;E;cp examples/* dist;280736e1-4344-414b-b289-7b3558692a37",,terminal_output
+4914,12288045,"TERMINAL",0,0,"]633;C[1m[7m%[27m[1m[0m \r \r",,terminal_output
+4915,12288531,"TERMINAL",0,0,"npm run dev",,terminal_command
+4916,12288582,"TERMINAL",0,0,"]633;C",,terminal_output
+4917,12288751,"TERMINAL",0,0,"\r\n> distill-template@2.8.0 dev\r\n> rollup -c rollup.config.dev.js -w\r\n\r\n[1G[0K",,terminal_output
+4918,12289196,"TERMINAL",0,0,"c[4mrollup v2.7.3[24m\r\n[36mbundles [1msrc/components.js[22m โ [1mdist/template.v2.js[22m...[39m\r\n",,terminal_output
+4919,12289715,"TERMINAL",0,0,"[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/dist\r\n[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/examples\r\n[1m[33m(!) Circular dependencies[39m[22m\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/array.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/object.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-selection/src/selection/index.js -> node_modules/d3-selection/src/selection/select.js -> node_modules/d3-selection/src/selection/index.js\r\n...and 7 more\r\n[32mcreated [1mdist/template.v2.js[22m in [1m523ms[22m[39m\r\n[36mbundles [1msrc/transforms.js[22m โ [1mdist/transforms.v2.js[22m...[39m\r\n",,terminal_output
+4920,12290113,"TERMINAL",0,0,"[32mcreated [1mdist/transforms.v2.js[22m in [1m398ms[22m[39m\r\n\r\n[2025-11-02 16:24:19] waiting for changes...\r\n",,terminal_output
+4921,12300319,"examples/agi_cast.html",2934,0,"",html,selection_command
+4922,12306153,"examples/agi_cast.html",2925,0,"",html,selection_command
+4923,12306411,"examples/agi_cast.html",2922,0,"",html,selection_command
+4924,12306442,"examples/agi_cast.html",2917,0,"",html,selection_command
+4925,12306469,"examples/agi_cast.html",2915,0,"",html,selection_command
+4926,12306504,"examples/agi_cast.html",2907,0,"",html,selection_command
+4927,12306555,"examples/agi_cast.html",2898,0,"",html,selection_command
+4928,12307403,"examples/agi_cast.html",2907,0,"",html,selection_command
+4929,12307653,"examples/agi_cast.html",2915,0,"",html,selection_command
+4930,12307684,"examples/agi_cast.html",2917,0,"",html,selection_command
+4931,12307716,"examples/agi_cast.html",2922,0,"",html,selection_command
+4932,12307750,"examples/agi_cast.html",2925,0,"",html,selection_command
+4933,12308113,"examples/agi_cast.html",2924,0,"",html,selection_command
+4934,12308331,"examples/agi_cast.html",2923,0,"",html,selection_command
+4935,12308605,"examples/agi_cast.html",2923,1,"",html,content
+4936,12308675,"examples/agi_cast.html",2922,0,"",html,selection_command
+4937,12308927,"examples/agi_cast.html",2917,0,"",html,selection_command
+4938,12308960,"examples/agi_cast.html",2915,0,"",html,selection_command
+4939,12308991,"examples/agi_cast.html",2907,0,"",html,selection_command
+4940,12309025,"examples/agi_cast.html",2898,0,"",html,selection_command
+4941,12309058,"examples/agi_cast.html",2889,0,"",html,selection_command
+4942,12309092,"examples/agi_cast.html",2885,0,"",html,selection_command
+4943,12309125,"examples/agi_cast.html",2882,0,"",html,selection_command
+4944,12309159,"examples/agi_cast.html",2880,0,"",html,selection_command
+4945,12309193,"examples/agi_cast.html",2876,0,"",html,selection_command
+4946,12309227,"examples/agi_cast.html",2870,0,"",html,selection_command
+4947,12309263,"examples/agi_cast.html",2869,0,"",html,selection_command
+4948,12309295,"examples/agi_cast.html",2865,0,"",html,selection_command
+4949,12309327,"examples/agi_cast.html",2864,0,"",html,selection_command
+4950,12309359,"examples/agi_cast.html",2861,0,"",html,selection_command
+4951,12309393,"examples/agi_cast.html",2851,0,"",html,selection_command
+4952,12309511,"examples/agi_cast.html",2849,0,"",html,selection_command
+4953,12309771,"examples/agi_cast.html",2845,0,"",html,selection_command
+4954,12309798,"examples/agi_cast.html",2843,0,"",html,selection_command
+4955,12309828,"examples/agi_cast.html",2838,0,"",html,selection_command
+4956,12309862,"examples/agi_cast.html",2832,0,"",html,selection_command
+4957,12309895,"examples/agi_cast.html",2831,0,"",html,selection_command
+4958,12309928,"examples/agi_cast.html",2818,0,"",html,selection_command
+4959,12310055,"examples/agi_cast.html",2812,0,"",html,selection_command
+4960,12310364,"examples/agi_cast.html",2818,0,"",html,selection_command
+4961,12310532,"examples/agi_cast.html",2831,0,"",html,selection_command
+4962,12310922,"examples/agi_cast.html",2830,0,"",html,selection_command
+4963,12311073,"examples/agi_cast.html",2831,0,".",html,content
+4964,12311087,"examples/agi_cast.html",2831,0,"",html,selection_command
+4965,12316292,"TERMINAL",0,0,"^C[1G[0Kโ [1G[0K[1m[7m%[27m[1m[0m \r \r]633;D;0]633;P;Cwd=/Users/franzsrambical/Documents/pdoom/pdoom.org\r[0m[27m[24m[J]633;Afranzsrambical@MBF6N9WFVKFV pdoom.org % ]633;B[K[?2004h",,terminal_output
+4966,12316619,"TERMINAL",0,0,"npm run dev[11Dcp examples/* dist",,terminal_output
+4967,12316865,"TERMINAL",0,0,"cp examples/* dist",,terminal_command
+4968,12316865,"TERMINAL",0,0,"[?2004l\r\r\n]633;E;cp examples/* dist;280736e1-4344-414b-b289-7b3558692a37",,terminal_output
+4969,12316891,"TERMINAL",0,0,"]633;C[1m[7m%[27m[1m[0m \r \r",,terminal_output
+4970,12317441,"TERMINAL",0,0,"npm run dev",,terminal_command
+4971,12317491,"TERMINAL",0,0,"]633;C",,terminal_output
+4972,12317624,"TERMINAL",0,0,"\r\n> distill-template@2.8.0 dev\r\n> rollup -c rollup.config.dev.js -w\r\n\r\n[1G[0K",,terminal_output
+4973,12317792,"TERMINAL",0,0,"c[4mrollup v2.7.3[24m\r\n[36mbundles [1msrc/components.js[22m โ [1mdist/template.v2.js[22m...[39m\r\n",,terminal_output
+4974,12318176,"TERMINAL",0,0,"[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/dist\r\n[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/examples\r\n[1m[33m(!) Circular dependencies[39m[22m\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/array.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/object.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-selection/src/selection/index.js -> node_modules/d3-selection/src/selection/select.js -> node_modules/d3-selection/src/selection/index.js\r\n...and 7 more\r\n[32mcreated [1mdist/template.v2.js[22m in [1m385ms[22m[39m\r\n[36mbundles [1msrc/transforms.js[22m โ [1mdist/transforms.v2.js[22m...[39m\r\n",,terminal_output
+4975,12318701,"TERMINAL",0,0,"[32mcreated [1mdist/transforms.v2.js[22m in [1m524ms[22m[39m\r\n\r\n[2025-11-02 16:24:47] waiting for changes...\r\n",,terminal_output
+4976,12356131,"examples/agi_cast.html",2832,0,"",html,selection_command
+4977,12357836,"examples/agi_cast.html",2832,1,"<",html,selection_command
+4978,12358002,"examples/agi_cast.html",2832,2,"A",html,selection_command
+4984,12358423,"examples/agi_cast.html",2832,31,"Analogous to 'cold-start data' in the language modeling pipeline",html,selection_command
+4999,12360386,"examples/agi_cast.html",2832,92,"",html,content
+5000,12360605,"examples/agi_cast.html",2890,0,"",html,selection_command
+5001,12361294,"examples/agi_cast.html",2945,0,"",html,selection_command
+5002,12361848,"examples/agi_cast.html",2945,0," ",html,content
+5003,12361851,"examples/agi_cast.html",2946,0,"",html,selection_keyboard
+5004,12362236,"examples/agi_cast.html",2945,0,"",html,selection_command
+5005,12362319,"examples/agi_cast.html",2946,0,"",html,content
+5006,12362321,"examples/agi_cast.html",3037,0,"",html,selection_command
+5007,12363245,"examples/agi_cast.html",2838,0,"",html,selection_command
+5008,12363917,"examples/agi_cast.html",2844,0,"",html,selection_command
+5009,12364167,"examples/agi_cast.html",2853,0,"",html,selection_command
+5010,12364197,"examples/agi_cast.html",2860,0,"",html,selection_command
+5011,12364227,"examples/agi_cast.html",2865,0,"",html,selection_command
+5012,12364261,"examples/agi_cast.html",2879,0,"",html,selection_command
+5013,12364293,"examples/agi_cast.html",2883,0,"",html,selection_command
+5014,12364326,"examples/agi_cast.html",2884,0,"",html,selection_command
+5015,12364360,"examples/agi_cast.html",2890,0,"",html,selection_command
+5016,12364393,"examples/agi_cast.html",2898,0,"",html,selection_command
+5017,12364426,"examples/agi_cast.html",2903,0,"",html,selection_command
+5018,12364460,"examples/agi_cast.html",2904,0,"",html,selection_command
+5019,12364493,"examples/agi_cast.html",2910,0,"",html,selection_command
+5020,12364527,"examples/agi_cast.html",2919,0,"",html,selection_command
+5021,12364559,"examples/agi_cast.html",2920,0,"",html,selection_command
+5022,12364593,"examples/agi_cast.html",2927,0,"",html,selection_command
+5023,12364626,"examples/agi_cast.html",2929,0,"",html,selection_command
+5024,12364660,"examples/agi_cast.html",2931,0,"",html,selection_command
+5025,12364692,"examples/agi_cast.html",2939,0,"",html,selection_command
+5026,12364726,"examples/agi_cast.html",2944,0,"",html,selection_command
+5027,12364759,"examples/agi_cast.html",2946,0,"",html,selection_command
+5028,12365110,"examples/agi_cast.html",2945,0,"",html,selection_command
+5029,12365287,"examples/agi_cast.html",2945,1,"",html,content
+5030,12365677,"examples/agi_cast.html",2836,0,"",html,selection_command
+5031,12366088,"examples/agi_cast.html",2945,0,"",html,selection_command
+5032,12366375,"examples/agi_cast.html",2836,0,"",html,selection_command
+5033,12370363,"TERMINAL",0,0,"^C[1G[0Kโ [1G[0K[1m[7m%[27m[1m[0m \r \r]633;D;0]633;P;Cwd=/Users/franzsrambical/Documents/pdoom/pdoom.org\r[0m[27m[24m[J]633;Afranzsrambical@MBF6N9WFVKFV pdoom.org % ]633;B[K[?2004h",,terminal_output
+5034,12370675,"TERMINAL",0,0,"npm run dev[11Dcp examples/* dist",,terminal_output
+5035,12370910,"TERMINAL",0,0,"cp examples/* dist",,terminal_command
+5036,12370910,"TERMINAL",0,0,"[?2004l\r\r\n]633;E;cp examples/* dist;280736e1-4344-414b-b289-7b3558692a37",,terminal_output
+5037,12370946,"TERMINAL",0,0,"]633;C[1m[7m%[27m[1m[0m \r \r",,terminal_output
+5038,12371486,"TERMINAL",0,0,"npm run dev",,terminal_command
+5039,12371538,"TERMINAL",0,0,"]633;C",,terminal_output
+5040,12371678,"TERMINAL",0,0,"\r\n> distill-template@2.8.0 dev\r\n> rollup -c rollup.config.dev.js -w\r\n\r\n[1G[0K",,terminal_output
+5041,12371877,"TERMINAL",0,0,"c[4mrollup v2.7.3[24m\r\n[36mbundles [1msrc/components.js[22m โ [1mdist/template.v2.js[22m...[39m\r\n",,terminal_output
+5042,12372274,"TERMINAL",0,0,"[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/dist\r\n[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/examples\r\n[1m[33m(!) Circular dependencies[39m[22m\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/array.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/object.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-selection/src/selection/index.js -> node_modules/d3-selection/src/selection/select.js -> node_modules/d3-selection/src/selection/index.js\r\n...and 7 more\r\n[32mcreated [1mdist/template.v2.js[22m in [1m396ms[22m[39m\r\n[36mbundles [1msrc/transforms.js[22m โ [1mdist/transforms.v2.js[22m...[39m\r\n",,terminal_output
+5043,12372914,"TERMINAL",0,0,"[32mcreated [1mdist/transforms.v2.js[22m in [1m637ms[22m[39m\r\n\r\n[2025-11-02 16:25:42] waiting for changes...\r\n",,terminal_output
+5044,12404534,"examples/agi_cast.html",3021,0,"",html,selection_command
+5045,12405150,"examples/agi_cast.html",3021,8,"",html,content
+5046,12405394,"examples/agi_cast.html",3021,0,"r",html,content
+5047,12405395,"examples/agi_cast.html",3022,0,"",html,selection_keyboard
+5048,12405449,"examples/agi_cast.html",3022,0,"e",html,content
+5049,12405451,"examples/agi_cast.html",3023,0,"",html,selection_keyboard
+5050,12405568,"examples/agi_cast.html",3023,0,"g",html,content
+5051,12405570,"examples/agi_cast.html",3024,0,"",html,selection_keyboard
+5052,12405670,"examples/agi_cast.html",3024,0,"i",html,content
+5053,12405672,"examples/agi_cast.html",3025,0,"",html,selection_keyboard
+5054,12405685,"examples/agi_cast.html",3025,0,"m",html,content
+5055,12405687,"examples/agi_cast.html",3026,0,"",html,selection_keyboard
+5056,12405776,"examples/agi_cast.html",3026,0,"e",html,content
+5057,12405777,"examples/agi_cast.html",3027,0,"",html,selection_keyboard
+5058,12406002,"examples/agi_cast.html",3026,0,"",html,selection_command
+5059,12408571,"TERMINAL",0,0,"",,terminal_command
+5060,12408572,"TERMINAL",0,0,"^C[1G[0Kโ [1G[0K[1m[7m%[27m[1m[0m \r \r]633;D;0]633;P;Cwd=/Users/franzsrambical/Documents/pdoom/pdoom.org\r[0m[27m[24m[J]633;Afranzsrambical@MBF6N9WFVKFV pdoom.org % ]633;B[K[?2004h[?2004l\r\r\n[1m[7m%[27m[1m[0m \r \r]633;E;;280736e1-4344-414b-b289-7b3558692a37",,terminal_output
+5061,12408572,"TERMINAL",0,0,"]633;C",,terminal_output
+5062,12409323,"TERMINAL",0,0,"cp examples/* dist",,terminal_command
+5063,12409360,"TERMINAL",0,0,"]633;C[1m[7m%[27m[1m[0m \r \r",,terminal_output
+5064,12410210,"TERMINAL",0,0,"npm run dev",,terminal_command
+5065,12410261,"TERMINAL",0,0,"]633;C",,terminal_output
+5066,12410398,"TERMINAL",0,0,"\r\n> distill-template@2.8.0 dev\r\n> rollup -c rollup.config.dev.js -w\r\n\r\n[1G[0K",,terminal_output
+5067,12410586,"TERMINAL",0,0,"c[4mrollup v2.7.3[24m\r\n[36mbundles [1msrc/components.js[22m โ [1mdist/template.v2.js[22m...[39m\r\n",,terminal_output
+5068,12411083,"TERMINAL",0,0,"[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/dist\r\n[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/examples\r\n[1m[33m(!) Circular dependencies[39m[22m\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/array.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/object.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-selection/src/selection/index.js -> node_modules/d3-selection/src/selection/select.js -> node_modules/d3-selection/src/selection/index.js\r\n...and 7 more\r\n[32mcreated [1mdist/template.v2.js[22m in [1m496ms[22m[39m\r\n[36mbundles [1msrc/transforms.js[22m โ [1mdist/transforms.v2.js[22m...[39m\r\n",,terminal_output
+5069,12411513,"TERMINAL",0,0,"[32mcreated [1mdist/transforms.v2.js[22m in [1m430ms[22m[39m\r\n\r\n[2025-11-02 16:26:20] waiting for changes...\r\n",,terminal_output
+5070,12538463,"examples/agi_cast.html",2838,0,"",html,selection_command
+5071,12539456,"examples/agi_cast.html",2838,197," requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field.",html,selection_command
+5072,12540792,"examples/agi_cast.html",2780,255," from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field.",html,selection_command
+5073,12541045,"examples/agi_cast.html",2629,406," spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field.",html,selection_command
+5074,12541075,"examples/agi_cast.html",2481,554," , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field.",html,selection_command
+5075,12541103,"examples/agi_cast.html",2441,594," to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field.",html,selection_command
+5076,12541136,"examples/agi_cast.html",2284,751," Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field.",html,selection_command
+5077,12541169,"examples/agi_cast.html",2276,759,"
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field.",html,selection_command
+5078,12541202,"examples/agi_cast.html",2232,803,"
Behaviour-Cloning AGI Research
\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field.",html,selection_command
+5079,12541236,"examples/agi_cast.html",2158,877," 1\n
Behaviour-Cloning AGI Research
\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field.",html,selection_command
+5080,12541270,"examples/agi_cast.html",1986,1049," Figure 1: AGI-CAST in action.\n 1\n
Behaviour-Cloning AGI Research
\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field.",html,selection_command
+5081,12541305,"examples/agi_cast.html",1901,1134," style=""width:100%; border-radius: 8px;"" controls autoplay loop muted />\n Figure 1: AGI-CAST in action.\n 1\n
Behaviour-Cloning AGI Research
\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field.",html,selection_command
+5082,12541339,"examples/agi_cast.html",1781,1254," \n Figure 1: AGI-CAST in action.\n 1\n
Behaviour-Cloning AGI Research
\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field.",html,selection_command
+5083,12541373,"examples/agi_cast.html",1767,1268," \n \n Figure 1: AGI-CAST in action.\n 1\n
Behaviour-Cloning AGI Research
\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field.",html,selection_command
+5084,12541408,"examples/agi_cast.html",1743,1292," \n \n \n Figure 1: AGI-CAST in action.\n 1\n
Behaviour-Cloning AGI Research
\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field.",html,selection_command
+5085,12541441,"examples/agi_cast.html",1730,1305," \n \n \n \n Figure 1: AGI-CAST in action.\n 1\n
Behaviour-Cloning AGI Research
\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field.",html,selection_command
+5086,12541476,"examples/agi_cast.html",1721,1314,"
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field.",html,selection_command
+5087,12541595,"examples/agi_cast.html",1597,1438," We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of AGI research.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field.",html,selection_command
+5088,12541744,"examples/agi_cast.html",1589,1446,"
\n We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of AGI research.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field.",html,selection_command
+5089,12541876,"examples/agi_cast.html",1577,1458," \n
\n We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of AGI research.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field.",html,selection_command
+5090,12542384,"examples/agi_cast.html",1589,1446,"
\n We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of AGI research.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field.",html,selection_command
+5091,12542570,"examples/agi_cast.html",1577,1458," \n
\n We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of AGI research.\n
\n The ability to automate arbitrary k",html,content
+6240,13393368,"examples/agi_cast.html",2388,0,"o",html,content
+6241,13393373,"examples/agi_cast.html",2389,0,"",html,selection_keyboard
+6242,13393524,"examples/agi_cast.html",2389,0,"z",html,content
+6243,13393526,"examples/agi_cast.html",2390,0,"",html,selection_keyboard
+6244,13478041,"examples/agi_cast.html",2389,0,"",html,selection_command
+6245,13478420,"examples/agi_cast.html",2290,115,"",html,content
+6246,13478446,"examples/agi_cast.html",2264,0,"",html,selection_command
+6247,13478976,"examples/agi_cast.html",3340,6,"",html,content
+6248,13478978,"examples/agi_cast.html",3341,0,"",html,selection_command
+6249,13480869,"examples/agi_cast.html",3340,0,"\n ",html,content
+6250,13480881,"examples/agi_cast.html",3339,0,"",html,selection_command
+6251,13482032,"examples/agi_cast.html",2290,0,"The ability to automate arbitrary knowledge work is a long-standing goal of AGI research.\n
oz\n
\n ",html,content
+6252,13482052,"examples/agi_cast.html",2264,0,"",html,selection_command
+6253,13483470,"examples/agi_cast.html",2290,115,"",html,content
+6254,13485104,"examples/agi_cast.html",3340,6,"",html,content
+6255,13485120,"examples/agi_cast.html",3341,0,"",html,selection_command
+6256,13485601,"examples/agi_cast.html",3340,0,"\n ",html,content
+6257,13485615,"examples/agi_cast.html",3339,0,"",html,selection_command
+6258,13488950,"examples/agi_cast.html",3340,6,"",html,content
+6259,13488959,"examples/agi_cast.html",3341,0,"",html,selection_command
+6260,13491418,"examples/agi_cast.html",3340,0,"\n ",html,content
+6261,13491437,"examples/agi_cast.html",3339,0,"",html,selection_command
+6262,13491807,"examples/agi_cast.html",2290,0,"The ability to automate arbitrary knowledge work is a long-standing goal of AGI research.\n
\n We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of long-horizon AGI research.",html,selection_command
+9672,15479890,"examples/agi_cast.html",1589,225,"
\n We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of long-horizon AGI research.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models",html,selection_command
+9683,15480255,"examples/agi_cast.html",1589,978,"
\n We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of long-horizon AGI research.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance",html,selection_command
+9684,15480288,"examples/agi_cast.html",1589,1126,"
\n We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of long-horizon AGI research.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from",html,selection_command
+9685,15480324,"examples/agi_cast.html",1589,1277,"
\n We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of long-horizon AGI research.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone",html,selection_command
+9686,15480355,"examples/agi_cast.html",1589,1335,"
\n We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of long-horizon AGI research.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This",html,selection_command
+9687,15480388,"examples/agi_cast.html",1589,1571,"
\n We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of long-horizon AGI research.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field .",html,selection_command
+9688,15480422,"examples/agi_cast.html",1589,1580,"
\n We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of long-horizon AGI research.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field .\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field .\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field .\n
\n
\n A longstanding goal of AGI research is automating the process of conducting research itself.",html,selection_command
+9691,15480522,"examples/agi_cast.html",1589,1838,"
\n We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of long-horizon AGI research.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field .\n
\n
\n A longstanding goal of AGI research is automating the process of conducting research itself.\n While a long line of work tried to tackle necessary capabilities for automating research individually (coding, ideation, exploration, planning),",html,selection_command
+9692,15480555,"examples/agi_cast.html",1589,2036,"
\n We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of long-horizon AGI research.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field .\n
\n
\n A longstanding goal of AGI research is automating the process of conducting research itself.\n While a long line of work tried to tackle necessary capabilities for automating research individually (coding, ideation, exploration, planning),\n research automation does not warrant special treatment compared to other types of knowledge work, and behaviour-cloning from screen recordings is a natural way to bootstrap models in general.",html,selection_command
+9693,15480588,"examples/agi_cast.html",1589,2045,"
\n We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of long-horizon AGI research.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field .\n
\n
\n A longstanding goal of AGI research is automating the process of conducting research itself.\n While a long line of work tried to tackle necessary capabilities for automating research individually (coding, ideation, exploration, planning),\n research automation does not warrant special treatment compared to other types of knowledge work, and behaviour-cloning from screen recordings is a natural way to bootstrap models in general.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field .\n
\n
\n A longstanding goal of AGI research is automating the process of conducting research itself.\n While a long line of work tried to tackle necessary capabilities for automating research individually (coding, ideation, exploration, planning),\n research automation does not warrant special treatment compared to other types of knowledge work, and behaviour-cloning from screen recordings is a natural way to bootstrap models in general.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field .\n
\n
\n A longstanding goal of AGI research is automating the process of conducting research itself.\n While a long line of work tried to tackle necessary capabilities for automating research individually (coding, ideation, exploration, planning),\n research automation does not warrant special treatment compared to other types of knowledge work, and behaviour-cloning from screen recordings is a natural way to bootstrap models in general.\n
\n
\n We introduce AGI-CAST, a dataset of unlabeled screen recordings of AGI research, intended to facilitate research on behaviour-cloning from screen recordings.",html,selection_command
+9696,15480688,"examples/agi_cast.html",1589,2446,"
\n We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of long-horizon AGI research.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field .\n
\n
\n A longstanding goal of AGI research is automating the process of conducting research itself.\n While a long line of work tried to tackle necessary capabilities for automating research individually (coding, ideation, exploration, planning),\n research automation does not warrant special treatment compared to other types of knowledge work, and behaviour-cloning from screen recordings is a natural way to bootstrap models in general.\n
\n
\n We introduce AGI-CAST, a dataset of unlabeled screen recordings of AGI research, intended to facilitate research on behaviour-cloning from screen recordings.\n We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also browser-use, notes, and paper exploration.",html,selection_command
+9697,15480723,"examples/agi_cast.html",1589,2666,"
\n We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of long-horizon AGI research.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field .\n
\n
\n A longstanding goal of AGI research is automating the process of conducting research itself.\n While a long line of work tried to tackle necessary capabilities for automating research individually (coding, ideation, exploration, planning),\n research automation does not warrant special treatment compared to other types of knowledge work, and behaviour-cloning from screen recordings is a natural way to bootstrap models in general.\n
\n
\n We introduce AGI-CAST, a dataset of unlabeled screen recordings of AGI research, intended to facilitate research on behaviour-cloning from screen recordings.\n We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also browser-use, notes, and paper exploration.\n While crowd-code is intended as a low-threshold crowd-sourcing effort, AGI-CAST captures the entire day of researchers at p(doom), with all its idiosyncrasies and nuances. While we only started recording recently,",html,selection_command
+9698,15480754,"examples/agi_cast.html",1589,2861,"
\n We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of long-horizon AGI research.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field .\n
\n
\n A longstanding goal of AGI research is automating the process of conducting research itself.\n While a long line of work tried to tackle necessary capabilities for automating research individually (coding, ideation, exploration, planning),\n research automation does not warrant special treatment compared to other types of knowledge work, and behaviour-cloning from screen recordings is a natural way to bootstrap models in general.\n
\n
\n We introduce AGI-CAST, a dataset of unlabeled screen recordings of AGI research, intended to facilitate research on behaviour-cloning from screen recordings.\n We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also browser-use, notes, and paper exploration.\n While crowd-code is intended as a low-threshold crowd-sourcing effort, AGI-CAST captures the entire day of researchers at p(doom), with all its idiosyncrasies and nuances. While we only started recording recently,\n the entire up-to-date dataset is available as a playlist on YouTube and will be updated continuously.",html,selection_command
+9699,15480788,"examples/agi_cast.html",1589,2870,"
\n We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of long-horizon AGI research.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field .\n
\n
\n A longstanding goal of AGI research is automating the process of conducting research itself.\n While a long line of work tried to tackle necessary capabilities for automating research individually (coding, ideation, exploration, planning),\n research automation does not warrant special treatment compared to other types of knowledge work, and behaviour-cloning from screen recordings is a natural way to bootstrap models in general.\n
\n
\n We introduce AGI-CAST, a dataset of unlabeled screen recordings of AGI research, intended to facilitate research on behaviour-cloning from screen recordings.\n We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also browser-use, notes, and paper exploration.\n While crowd-code is intended as a low-threshold crowd-sourcing effort, AGI-CAST captures the entire day of researchers at p(doom), with all its idiosyncrasies and nuances. While we only started recording recently,\n the entire up-to-date dataset is available as a playlist on YouTube and will be updated continuously.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field .\n
\n
\n A longstanding goal of AGI research is automating the process of conducting research itself.\n While a long line of work tried to tackle necessary capabilities for automating research individually (coding, ideation, exploration, planning),\n research automation does not warrant special treatment compared to other types of knowledge work, and behaviour-cloning from screen recordings is a natural way to bootstrap models in general.\n
\n
\n We introduce AGI-CAST, a dataset of unlabeled screen recordings of AGI research, intended to facilitate research on behaviour-cloning from screen recordings.\n We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also browser-use, notes, and paper exploration.\n While crowd-code is intended as a low-threshold crowd-sourcing effort, AGI-CAST captures the entire day of researchers at p(doom), with all its idiosyncrasies and nuances. While we only started recording recently,\n the entire up-to-date dataset is available as a playlist on YouTube and will be updated continuously.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field .\n
\n
\n A longstanding goal of AGI research is automating the process of conducting research itself.\n While a long line of work tried to tackle necessary capabilities for automating research individually (coding, ideation, exploration, planning),\n research automation does not warrant special treatment compared to other types of knowledge work, and behaviour-cloning from screen recordings is a natural way to bootstrap models in general.\n
\n
\n We introduce AGI-CAST, a dataset of unlabeled screen recordings of AGI research, intended to facilitate research on behaviour-cloning from screen recordings.\n We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also browser-use, notes, and paper exploration.\n While crowd-code is intended as a low-threshold crowd-sourcing effort, AGI-CAST captures the entire day of researchers at p(doom), with all its idiosyncrasies and nuances. While we only started recording recently,\n the entire up-to-date dataset is available as a playlist on YouTube and will be updated continuously.\n
\n
\n All uncaptured data is lost data. AGI-CAST represents the first step towards capturing and openly releasing the longest-horizon data imaginable. We invite the community to follow our lead and openly release screen recordings of their own research.",html,selection_command
+9702,15481148,"examples/agi_cast.html",1589,3141,"
\n We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of long-horizon AGI research.\n
\n Internet-scale pre-training, preference modeling, and reinforcement learning using verification signals offer a compelling pathway for language models\n to attain human-level performance\n , yet data is increasingly bottlenecking progress from\n spiky towards general intelligence. A natural way to extend the current paradigm to automation of arbitrary knowledge work is to behaviour-clone\n from screen recordings of human practitioners. This\n requires moving from predominantly text-based towards video-based behaviour-cloning, a nascent field .\n
\n
\n A longstanding goal of AGI research is automating the process of conducting research itself.\n While a long line of work tried to tackle necessary capabilities for automating research individually (coding, ideation, exploration, planning),\n research automation does not warrant special treatment compared to other types of knowledge work, and behaviour-cloning from screen recordings is a natural way to bootstrap models in general.\n
\n
\n We introduce AGI-CAST, a dataset of unlabeled screen recordings of AGI research, intended to facilitate research on behaviour-cloning from screen recordings.\n We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also browser-use, notes, and paper exploration.\n While crowd-code is intended as a low-threshold crowd-sourcing effort, AGI-CAST captures the entire day of researchers at p(doom), with all its idiosyncrasies and nuances. While we only started recording recently,\n the entire up-to-date dataset is available as a playlist on YouTube and will be updated continuously.\n
\n
\n All uncaptured data is lost data. AGI-CAST represents the first step towards capturing and openly releasing the longest-horizon data imaginable. We invite the community to follow our lead and openly release screen recordings of their own research.\n
",html,selection_command
+9703,15481700,"examples/agi_cast.html",4728,0,"",html,selection_command
+9704,15688128,"examples/agi_cast.html",1012,0,"",html,selection_command
+9705,15688740,"examples/agi_cast.html",1605,0,"",html,selection_command
+9706,15689968,"examples/agi_cast.html",3649,0,"",html,selection_command
+9707,15779182,"examples/agi_cast.html",3813,0,"",html,selection_command
+9708,15779430,"examples/agi_cast.html",4042,0,"",html,selection_command
+9709,15779460,"examples/agi_cast.html",4262,0,"",html,selection_command
+9710,15779492,"examples/agi_cast.html",4457,0,"",html,selection_command
+9711,15779526,"examples/agi_cast.html",4466,0,"",html,selection_command
+9712,15779717,"examples/agi_cast.html",4474,0,"",html,selection_command
+9713,15779967,"examples/agi_cast.html",4728,0,"",html,selection_command
+9714,15779997,"examples/agi_cast.html",4737,0,"",html,selection_command
+9715,15780031,"examples/agi_cast.html",4748,0,"",html,selection_command
+9716,15780064,"examples/agi_cast.html",4755,0,"",html,selection_command
+9717,15780242,"examples/agi_cast.html",4764,0,"",html,selection_command
+9718,15780418,"examples/agi_cast.html",4771,0,"",html,selection_command
+9719,15780575,"examples/agi_cast.html",4798,0,"",html,selection_command
+9720,15781834,"examples/agi_cast.html",4799,0,"",html,selection_command
+9721,15782221,"examples/agi_cast.html",4802,0,"",html,selection_command
+9722,15782434,"examples/agi_cast.html",4809,0,"",html,selection_command
+9723,15782643,"examples/agi_cast.html",4812,0,"",html,selection_command
+9724,15783047,"examples/agi_cast.html",4830,0,"",html,selection_command
+9725,15783840,"examples/agi_cast.html",4812,0,"",html,selection_command
+9726,15783997,"examples/agi_cast.html",4809,0,"",html,selection_command
+9727,15784421,"examples/agi_cast.html",4812,0,"",html,selection_command
+9728,15785471,"examples/agi_cast.html",4812,1,"c",html,selection_command
+9729,15785579,"examples/agi_cast.html",4812,19,"conceptualization o",html,selection_command
+9730,15785835,"examples/agi_cast.html",4812,22,"conceptualization of t",html,selection_command
+9731,15785865,"examples/agi_cast.html",4812,26,"conceptualization of the p",html,selection_command
+9732,15785899,"examples/agi_cast.html",4812,34,"conceptualization of the project a",html,selection_command
+9733,15785931,"examples/agi_cast.html",4812,38,"conceptualization of the project and i",html,selection_command
+9734,15785962,"examples/agi_cast.html",4812,50,"conceptualization of the project and implemented t",html,selection_command
+9735,15785996,"examples/agi_cast.html",4812,54,"conceptualization of the project and implemented the I",html,selection_command
+9736,15786370,"examples/agi_cast.html",4812,58,"conceptualization of the project and implemented the IDE e",html,selection_command
+9737,15786619,"examples/agi_cast.html",4812,67,"conceptualization of the project and implemented the IDE extension.",html,selection_command
+9738,15786646,"examples/agi_cast.html",4812,69,"conceptualization of the project and implemented the IDE extension. M",html,selection_command
+9739,15786674,"examples/agi_cast.html",4812,72,"conceptualization of the project and implemented the IDE extension. MM i",html,selection_command
+9740,15786709,"examples/agi_cast.html",4812,84,"conceptualization of the project and implemented the IDE extension. MM implemented t",html,selection_command
+9741,15786742,"examples/agi_cast.html",4812,88,"conceptualization of the project and implemented the IDE extension. MM implemented the s",html,selection_command
+9742,15786775,"examples/agi_cast.html",4812,94,"conceptualization of the project and implemented the IDE extension. MM implemented the server-",html,selection_command
+9743,15786810,"examples/agi_cast.html",4812,95,"conceptualization of the project and implemented the IDE extension. MM implemented the server-s",html,selection_command
+9744,15786843,"examples/agi_cast.html",4812,100,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side d",html,selection_command
+9745,15786876,"examples/agi_cast.html",4812,105,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data c",html,selection_command
+9746,15786909,"examples/agi_cast.html",4812,116,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection p",html,selection_command
+9747,15786942,"examples/agi_cast.html",4812,124,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline.",html,selection_command
+9748,15786976,"examples/agi_cast.html",4812,126,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. A",html,selection_command
+9749,15787008,"examples/agi_cast.html",4812,130,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All a",html,selection_command
+9750,15787043,"examples/agi_cast.html",4812,138,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors c",html,selection_command
+9751,15787075,"examples/agi_cast.html",4812,150,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed t",html,selection_command
+9752,15787110,"examples/agi_cast.html",4812,161,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically.",html,selection_command
+9753,15787142,"examples/agi_cast.html",4812,163,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. F",html,selection_command
+9754,15787175,"examples/agi_cast.html",4812,166,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS w",html,selection_command
+9755,15787208,"examples/agi_cast.html",4812,172,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote t",html,selection_command
+9756,15787242,"examples/agi_cast.html",4812,176,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote the m",html,selection_command
+9757,15788042,"examples/agi_cast.html",4812,172,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote t",html,selection_command
+9758,15788284,"examples/agi_cast.html",4812,166,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS w",html,selection_command
+9759,15788310,"examples/agi_cast.html",4812,163,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. F",html,selection_command
+9760,15788342,"examples/agi_cast.html",4812,161,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically.",html,selection_command
+9761,15788374,"examples/agi_cast.html",4812,150,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed t",html,selection_command
+9762,15788618,"examples/agi_cast.html",4812,161,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically.",html,selection_command
+9763,15788878,"examples/agi_cast.html",4812,163,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. F",html,selection_command
+9764,15788902,"examples/agi_cast.html",4812,166,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS w",html,selection_command
+9765,15788926,"examples/agi_cast.html",4812,172,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote t",html,selection_command
+9766,15788958,"examples/agi_cast.html",4812,176,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote the m",html,selection_command
+9767,15788992,"examples/agi_cast.html",4812,186,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote the manuscript.",html,selection_command
+9768,15789025,"examples/agi_cast.html",4812,188,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote the manuscript. W",html,selection_command
+9769,15789404,"examples/agi_cast.html",4812,191,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote the manuscript. We t",html,selection_command
+9770,15789660,"examples/agi_cast.html",4812,197,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote the manuscript. We thank G",html,selection_command
+9771,15789687,"examples/agi_cast.html",4812,204,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote the manuscript. We thank Gemini C",html,selection_command
+9772,15789724,"examples/agi_cast.html",4812,209,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote the manuscript. We thank Gemini Code A",html,selection_command
+9773,15789756,"examples/agi_cast.html",4812,216,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote the manuscript. We thank Gemini Code Assist a",html,selection_command
+9774,15789791,"examples/agi_cast.html",4812,220,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote the manuscript. We thank Gemini Code Assist and C",html,selection_command
+9775,15789824,"examples/agi_cast.html",4812,227,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote the manuscript. We thank Gemini Code Assist and Cursor f",html,selection_command
+9776,15789857,"examples/agi_cast.html",4812,231,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote the manuscript. We thank Gemini Code Assist and Cursor for t",html,selection_command
+9777,15789891,"examples/agi_cast.html",4812,237,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote the manuscript. We thank Gemini Code Assist and Cursor for their h",html,selection_command
+9778,15789923,"examples/agi_cast.html",4812,242,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote the manuscript. We thank Gemini Code Assist and Cursor for their help i",html,selection_command
+9779,15790192,"examples/agi_cast.html",4812,245,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote the manuscript. We thank Gemini Code Assist and Cursor for their help in w",html,selection_command
+9780,15790601,"examples/agi_cast.html",4812,253,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote the manuscript. We thank Gemini Code Assist and Cursor for their help in writing t",html,selection_command
+9781,15790785,"examples/agi_cast.html",4812,257,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote the manuscript. We thank Gemini Code Assist and Cursor for their help in writing the e",html,selection_command
+9782,15790956,"examples/agi_cast.html",4812,266,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote the manuscript. We thank Gemini Code Assist and Cursor for their help in writing the extension.",html,selection_command
+9783,15791134,"examples/agi_cast.html",4812,269,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote the manuscript. We thank Gemini Code Assist and Cursor for their help in writing the extension.
",html,selection_command
+9785,15791601,"examples/agi_cast.html",4812,269,"conceptualization of the project and implemented the IDE extension. MM implemented the server-side data collection pipeline. All authors contributed technically. FS wrote the manuscript. We thank Gemini Code Assist and Cursor for their help in writing the extension.
\n\n\n\n \n \n \n \n\n\n\n \n \n \n \n \n
\n Traditional value assertions in jitted JAX lead to performance degradation. A new (not yet public) JAX API fixes this.\n
\n Jitted JAX does not support traditional Python asserts that access JAX arrays.\n Chex and jax.experimental.checkify.check provide ways of wrapping a jitted function\n with decorators to enable value assertions, but they lead to performance degradation, making them unusable in practical settings.\n
\n
\n For a degradation-free way of using value assertions in jitted JAX, we can use a new (as of today, still private) JAX API: error_check:\n
\n \n import jax\n from jax._src.error_check import set_error_if, raise_if_error\n import jax.numpy as jnp\n\n @jax.jit\n def f(x, y):\n set_error_if(x != 0, 'x must be 0')\n return jnp.multiply(x, y)\n\n f(1, 0)\n\n raise_if_error()\n \n\n \n Traceback (most recent call last):\n File ""/home/ubuntu/code/temp.py"", line 12, in \n raise_if_error()\n File ""/home/ubuntu/code/.venv/lib/python3.10/site-packages/jax/_src/error_check.py"", line 93, in raise_if_error\n raise exc.with_traceback(filtered_traceback)\n File ""/home/ubuntu/code/temp.py"", line 10, in \n f(1, 0)\n File ""/home/ubuntu/code/temp.py"", line 7, in f\n set_error_if(x != 0, 'x must be 0')\n jax._src.error_check.JaxValueError: x must be 0\n \n
\n This pattern exploits the fact that it suffices to raise an assertion error post hoc, in this case after the computation of the jitted function.\n Thus, the implementation merely conditionally stores the error in a JAX-managed context. While purely functional conditional computation is fully supported by JAX and XLA, and thus fully compatible with XLA graph compilation,\n the error is only raised outside of the jitted function, avoiding the typical performance overhead of value assertions.\n
\n \n \n\n \n\n
Contributions
\n
FS worked on all aspects of this post, including research, analysis and writing.
",html,content
+10013,16918118,"examples/blog.html",25153,0,"",html,selection_command
+10014,16921268,"examples/blog.html",25270,0,"2",html,content
+10015,16921268,"examples/blog.html",25269,1,"",html,content
+10016,16921268,"examples/blog.html",25268,0,"November",html,content
+10017,16921268,"examples/blog.html",25262,6,"",html,content
+10018,16924232,"examples/blog.html",25985,0,"n",html,content
+10019,16924233,"examples/blog.html",25981,4,"",html,content
+10020,16924233,"examples/blog.html",25980,0,"i",html,content
+10021,16924233,"examples/blog.html",25979,0,"paper explora",html,content
+10022,16924233,"examples/blog.html",25978,0,"d",html,content
+10023,16924233,"examples/blog.html",25976,2,"",html,content
+10024,16924233,"examples/blog.html",25972,2,"",html,content
+10025,16924233,"examples/blog.html",25971,0,",",html,content
+10026,16924233,"examples/blog.html",25970,0,"te",html,content
+10027,16924233,"examples/blog.html",25965,5,"",html,content
+10028,16924233,"examples/blog.html",25964,0,"n",html,content
+10029,16924233,"examples/blog.html",25963,0,"e,",html,content
+10030,16924233,"examples/blog.html",25962,0,"r-u",html,content
+10031,16924233,"examples/blog.html",25961,1,"",html,content
+10032,16924233,"examples/blog.html",25960,0,"ows",html,content
+10033,16924233,"examples/blog.html",25959,0,"t also b",html,content
+10034,16924233,"examples/blog.html",25957,2,"",html,content
+10035,16924233,"examples/blog.html",25956,0,"e IDE, b",html,content
+10036,16924233,"examples/blog.html",25955,0,"t",html,content
+10037,16924233,"examples/blog.html",25954,0,"nly",html,content
+10038,16924233,"examples/blog.html",25952,1,"",html,content
+10039,16924233,"examples/blog.html",25950,1,"",html,content
+10040,16924233,"examples/blog.html",25948,1,"",html,content
+10041,16924233,"examples/blog.html",25947,0,"n",html,content
+10042,16924233,"examples/blog.html",25946,1,"",html,content
+10043,16924233,"examples/blog.html",25943,2,"",html,content
+10044,16924233,"examples/blog.html",25936,4,"",html,content
+10045,16924233,"examples/blog.html",25935,0,"ptu",html,content
+10046,16924233,"examples/blog.html",25931,4,"",html,content
+10047,16924233,"examples/blog.html",25929,0,"by ",html,content
+10048,16924233,"examples/blog.html",25928,1,"",html,content
+10049,16924233,"examples/blog.html",25927,0,",",html,content
+10050,16924233,"examples/blog.html",25926,1,"",html,content
+10051,16924233,"examples/blog.html",25925,0,"n",html,content
+10052,16924233,"examples/blog.html",25924,0,"racti",html,content
+10053,16924233,"examples/blog.html",25923,0,"nt",html,content
+10054,16924233,"examples/blog.html",25922,1,"",html,content
+10055,16924234,"examples/blog.html",25921,0,"of IDE ",html,content
+10056,16924234,"examples/blog.html",25920,1,"",html,content
+10057,16924234,"examples/blog.html",25919,0,"t",html,content
+10058,16924234,"examples/blog.html",25916,3,"",html,content
+10059,16924234,"examples/blog.html",25915,0,"s",html,content
+10060,16924234,"examples/blog.html",25914,1,"",html,content
+10061,16924234,"examples/blog.html",25913,0,"g a dat",html,content
+10062,16924234,"examples/blog.html",25912,1,"",html,content
+10063,16924234,"examples/blog.html",25911,0,"rci",html,content
+10064,16924234,"examples/blog.html",25910,0,"wd-so",html,content
+10065,16924234,"examples/blog.html",25908,2,"",html,content
+10066,16924234,"examples/blog.html",25906,0,"c",html,content
+10067,16924234,"examples/blog.html",25905,1,"",html,content
+10068,16924234,"examples/blog.html",25903,1,"",html,content
+10069,16924234,"examples/blog.html",25898,4,"",html,content
+10070,16924234,"examples/blog.html",25896,1,"",html,content
+10071,16924234,"examples/blog.html",25895,0,"k",html,content
+10072,16924234,"examples/blog.html",25893,2,"",html,content
+10073,16924234,"examples/blog.html",25890,0,"previous ",html,content
+10074,16924234,"examples/blog.html",25888,0,"u",html,content
+10075,16924234,"examples/blog.html",25886,1,"",html,content
+10076,16924234,"examples/blog.html",25885,0,",",html,content
+10077,16924234,"examples/blog.html",25881,4,"",html,content
+10078,16924234,"examples/blog.html",25878,0,"rowd-c",html,content
+10079,16924234,"examples/blog.html",25875,0,"yon",html,content
+10080,16924234,"examples/blog.html",25872,2,"",html,content
+10081,16924234,"examples/blog.html",25871,0,"We go ",html,content
+10082,16924234,"examples/blog.html",25867,4,"",html,content
+10083,16924234,"examples/blog.html",25866,0,"rch.",html,content
+10084,16924234,"examples/blog.html",25864,2,"",html,content
+10085,16924234,"examples/blog.html",25862,0,"izon AGI res",html,content
+10086,16924234,"examples/blog.html",25861,0,"ho",html,content
+10087,16924234,"examples/blog.html",25860,0,"g",html,content
+10088,16924234,"examples/blog.html",25858,0,"ngs of l",html,content
+10089,16924234,"examples/blog.html",25854,3,"",html,content
+10090,16924234,"examples/blog.html",25853,0,"r",html,content
+10091,16924234,"examples/blog.html",25852,0,"ec",html,content
+10092,16924234,"examples/blog.html",25851,0,"screen ",html,content
+10093,16924234,"examples/blog.html",25850,1,"",html,content
+10094,16924234,"examples/blog.html",25849,0,"beled",html,content
+10095,16924234,"examples/blog.html",25848,0,"unl",html,content
+10096,16924234,"examples/blog.html",25847,0,"t of",html,content
+10097,16924234,"examples/blog.html",25846,1,"",html,content
+10098,16924234,"examples/blog.html",25842,3,"",html,content
+10099,16924234,"examples/blog.html",25840,0,"AGI-CAST, a dat",html,content
+10100,16924235,"examples/blog.html",25839,1,"",html,content
+10101,16924235,"examples/blog.html",25749,31,"",html,content
+10102,16924235,"examples/blog.html",25664,64,"",html,content
+10103,16924235,"examples/blog.html",25617,0,"arch",html,content
+10104,16924235,"examples/blog.html",25613,2,"",html,content
+10105,16924235,"examples/blog.html",25612,0,"n AGI R",html,content
+10106,16924235,"examples/blog.html",25611,1,"",html,content
+10107,16924235,"examples/blog.html",25610,0,"of Long-Horiz",html,content
+10108,16924235,"examples/blog.html",25609,1,"",html,content
+10109,16924235,"examples/blog.html",25608,0,"s",html,content
+10110,16924235,"examples/blog.html",25603,2,"",html,content
+10111,16924235,"examples/blog.html",25602,0,"r",html,content
+10112,16924235,"examples/blog.html",25601,0,"Rec",html,content
+10113,16924235,"examples/blog.html",25600,1,"",html,content
+10114,16924235,"examples/blog.html",25599,0,"een",html,content
+10115,16924235,"examples/blog.html",25597,2,"",html,content
+10116,16924235,"examples/blog.html",25596,0,"Sc",html,content
+10117,16924235,"examples/blog.html",25594,2,"",html,content
+10118,16924235,"examples/blog.html",25583,9,"",html,content
+10119,16924235,"examples/blog.html",25581,0,"e",html,content
+10120,16924235,"examples/blog.html",25573,5,"",html,content
+10121,16924235,"examples/blog.html",25572,0,"U",html,content
+10122,16924235,"examples/blog.html",25571,1,"",html,content
+10123,16924235,"examples/blog.html",25570,0,"f",html,content
+10124,16924235,"examples/blog.html",25565,5,"",html,content
+10125,16924235,"examples/blog.html",25560,4,"",html,content
+10126,16924235,"examples/blog.html",25559,0,"t",html,content
+10127,16924235,"examples/blog.html",25558,1,"",html,content
+10128,16924235,"examples/blog.html",25557,0,"Datas",html,content
+10129,16924235,"examples/blog.html",25552,5,"",html,content
+10130,16924235,"examples/blog.html",25548,0,"AGI-CAST",html,content
+10131,16924235,"examples/blog.html",25541,7,"",html,content
+10132,16924235,"examples/blog.html",25414,0,"4",html,content
+10133,16924235,"examples/blog.html",25404,10,"",html,content
+10134,16924235,"examples/blog.html",25403,0,"cast.m",html,content
+10135,16924235,"examples/blog.html",25400,2,"",html,content
+10136,16924235,"examples/blog.html",25399,0,"g",html,content
+10137,16924235,"examples/blog.html",25397,2,"",html,content
+10138,16924235,"examples/blog.html",25395,1,"",html,content
+10139,16924235,"examples/blog.html",25336,0,"t",html,content
+10140,16924235,"examples/blog.html",25332,4,"",html,content
+10141,16924235,"examples/blog.html",25330,0,"agi_c",html,content
+10142,16924235,"examples/blog.html",25329,1,"",html,content
+10143,16926857,"examples/blog.html",25459,0,"",html,selection_command
+10144,16927012,"examples/blog.html",25503,0,"",html,selection_command
+10145,16927529,"examples/blog.html",25511,0,"",html,selection_command
+10146,16927771,"examples/blog.html",25512,0,"",html,selection_command
+10147,16927803,"examples/blog.html",25515,0,"",html,selection_command
+10148,16927837,"examples/blog.html",25520,0,"",html,selection_command
+10149,16927871,"examples/blog.html",25522,0,"",html,selection_command
+10150,16927904,"examples/blog.html",25527,0,"",html,selection_command
+10151,16928046,"examples/blog.html",25529,0,"",html,selection_command
+10152,16928266,"examples/blog.html",25531,0,"",html,selection_command
+10153,16928499,"examples/blog.html",25529,0,"",html,selection_command
+10154,16929037,"examples/blog.html",25531,0,"",html,selection_command
+10155,16929423,"examples/blog.html",25533,0,"",html,selection_command
+10156,16929731,"examples/blog.html",25535,0,"",html,selection_command
+10157,16930093,"examples/blog.html",25534,0,"",html,selection_command
+10158,16930252,"examples/blog.html",25533,0,"",html,selection_command
+10159,16930568,"examples/blog.html",25533,1,"๏ธ",html,selection_command
+10160,16930752,"examples/blog.html",25533,2,"๏ธ ",html,selection_command
+10161,16930901,"examples/blog.html",25533,3,"๏ธ A",html,selection_command
+10162,16931259,"examples/blog.html",25533,2,"๏ธ ",html,selection_command
+10163,16931694,"examples/blog.html",25533,2,"",html,content
+10164,16932273,"examples/blog.html",25532,0,"",html,selection_command
+10165,16932566,"examples/blog.html",25532,1,"",html,content
+10166,16933134,"examples/blog.html",25531,0,"",html,selection_command
+10167,16933331,"examples/blog.html",25531,1,"",html,content
+10168,16933808,"examples/blog.html",25529,0,"",html,selection_command
+10169,16934020,"examples/blog.html",25529,2,"",html,content
+10170,16935237,"examples/blog.html",25654,0,"",html,selection_command
+10171,16935372,"examples/blog.html",25529,0,"",html,selection_command
+10172,16936235,"examples/blog.html",25528,1,"",html,content
+10173,16936747,"examples/blog.html",25528,0,">",html,content
+10174,16936750,"examples/blog.html",25529,0,"",html,selection_keyboard
+10175,16936854,"examples/blog.html",25529,0,"",html,content
+10176,16937072,"examples/blog.html",25528,0,"",html,selection_command
+10177,16938745,"examples/blog.html",25529,0,"",html,selection_command
+10178,16938801,"examples/blog.html",25529,1,"<",html,selection_command
+10179,16939004,"examples/blog.html",25529,2,"",html,selection_command
+10180,16939147,"examples/blog.html",25529,3,"",html,selection_command
+10183,16939778,"examples/blog.html",25529,5,"",html,content
+10184,16943504,"examples/agi_cast.html",0,0,"",html,tab
+10185,16944123,"examples/agi_cast.html",0,0,"",html,selection_command
+10186,16944639,"examples/agi_cast.html",5,0,"",html,selection_command
+10187,16944889,"examples/agi_cast.html",30,0,"",html,selection_command
+10188,16944921,"examples/agi_cast.html",31,0,"",html,selection_command
+10189,16944956,"examples/agi_cast.html",97,0,"",html,selection_command
+10190,16944989,"examples/agi_cast.html",164,0,"",html,selection_command
+10191,16945021,"examples/agi_cast.html",206,0,"",html,selection_command
+10192,16945055,"examples/agi_cast.html",207,0,"",html,selection_command
+10193,16945089,"examples/agi_cast.html",257,0,"",html,selection_command
+10194,16945122,"examples/agi_cast.html",258,0,"",html,selection_command
+10195,16945156,"examples/agi_cast.html",328,0,"",html,selection_command
+10196,16945189,"examples/agi_cast.html",396,0,"",html,selection_command
+10197,16945221,"examples/agi_cast.html",471,0,"",html,selection_command
+10198,16945254,"examples/agi_cast.html",541,0,"",html,selection_command
+10199,16945287,"examples/agi_cast.html",574,0,"",html,selection_command
+10200,16945321,"examples/agi_cast.html",578,0,"",html,selection_command
+10201,16945355,"examples/agi_cast.html",594,0,"",html,selection_command
+10202,16945388,"examples/agi_cast.html",595,0,"",html,selection_command
+10203,16945422,"examples/agi_cast.html",602,0,"",html,selection_command
+10204,16945454,"examples/agi_cast.html",643,0,"",html,selection_command
+10205,16945488,"examples/agi_cast.html",714,0,"",html,selection_command
+10206,16945522,"examples/agi_cast.html",738,0,"",html,selection_command
+10207,16945554,"examples/agi_cast.html",794,0,"",html,selection_command
+10208,16945587,"examples/agi_cast.html",802,0,"",html,selection_command
+10209,16945622,"examples/agi_cast.html",803,0,"",html,selection_command
+10210,16945654,"examples/agi_cast.html",810,0,"",html,selection_command
+10211,16945689,"examples/agi_cast.html",817,0,"",html,selection_command
+10212,16945722,"examples/agi_cast.html",853,0,"",html,selection_command
+10213,16945754,"examples/agi_cast.html",859,0,"",html,selection_command
+10214,16946036,"examples/agi_cast.html",878,0,"",html,selection_command
+10215,16946196,"examples/agi_cast.html",935,0,"",html,selection_command
+10216,16947887,"examples/agi_cast.html",992,0,"",html,selection_command
+10217,16948131,"examples/agi_cast.html",1115,0,"",html,selection_command
+10218,16948161,"examples/agi_cast.html",1153,0,"",html,selection_command
+10219,16948193,"examples/agi_cast.html",1199,0,"",html,selection_command
+10220,16948240,"examples/agi_cast.html",1216,0,"",html,selection_command
+10221,16948258,"examples/agi_cast.html",1224,0,"",html,selection_command
+10222,16948292,"examples/agi_cast.html",1260,0,"",html,selection_command
+10223,16948329,"examples/agi_cast.html",1305,0,"",html,selection_command
+10224,16948360,"examples/agi_cast.html",1380,0,"",html,selection_command
+10225,16948394,"examples/agi_cast.html",1422,0,"",html,selection_command
+10226,16948430,"examples/agi_cast.html",1430,0,"",html,selection_command
+10227,16948461,"examples/agi_cast.html",1437,0,"",html,selection_command
+10228,16948497,"examples/agi_cast.html",1452,0,"",html,selection_command
+10229,16948529,"examples/agi_cast.html",1474,0,"",html,selection_command
+10230,16948564,"examples/agi_cast.html",1530,0,"",html,selection_command
+10231,16948596,"examples/agi_cast.html",1538,0,"",html,selection_command
+10232,16948629,"examples/agi_cast.html",1544,0,"",html,selection_command
+10233,16948662,"examples/agi_cast.html",1557,0,"",html,selection_command
+10234,16948695,"examples/agi_cast.html",1577,0,"",html,selection_command
+10235,16948729,"examples/agi_cast.html",1589,0,"",html,selection_command
+10236,16948764,"examples/agi_cast.html",1597,0,"",html,selection_command
+10237,16948796,"examples/agi_cast.html",1806,0,"",html,selection_command
+10238,16948829,"examples/agi_cast.html",1815,0,"",html,selection_command
+10239,16948862,"examples/agi_cast.html",1828,0,"",html,selection_command
+10240,16948897,"examples/agi_cast.html",1852,0,"",html,selection_command
+10241,16948929,"examples/agi_cast.html",1866,0,"",html,selection_command
+10242,16948962,"examples/agi_cast.html",1986,0,"",html,selection_command
+10243,16948995,"examples/agi_cast.html",2071,0,"",html,selection_command
+10244,16949061,"examples/agi_cast.html",1986,0,"",html,selection_command
+10245,16949320,"examples/agi_cast.html",1866,0,"",html,selection_command
+10246,16949346,"examples/agi_cast.html",1852,0,"",html,selection_command
+10247,16949376,"examples/agi_cast.html",1828,0,"",html,selection_command
+10248,16949419,"examples/agi_cast.html",1815,0,"",html,selection_command
+10249,16949443,"examples/agi_cast.html",1806,0,"",html,selection_command
+10250,16949476,"examples/agi_cast.html",1597,0,"",html,selection_command
+10251,16949512,"examples/agi_cast.html",1589,0,"",html,selection_command
+10252,16949545,"examples/agi_cast.html",1577,0,"",html,selection_command
+10253,16950187,"examples/blog.html",0,0,"",html,tab
+10254,16950914,"examples/blog.html",25532,0,"",html,selection_command
+10255,16951073,"examples/blog.html",25533,0,"",html,selection_command
+10256,16951225,"examples/blog.html",25537,0,"",html,selection_command
+10257,16951554,"examples/blog.html",25539,0,"",html,selection_command
+10258,16952406,"examples/blog.html",25539,1,"A",html,selection_command
+10259,16952527,"examples/blog.html",25539,9,"A Dataset",html,selection_command
+10260,16952781,"examples/blog.html",25539,12,"A Dataset of",html,selection_command
+10261,16952812,"examples/blog.html",25539,22,"A Dataset of Unlabeled",html,selection_command
+10262,16952844,"examples/blog.html",25539,29,"A Dataset of Unlabeled Screen",html,selection_command
+10263,16952877,"examples/blog.html",25539,40,"A Dataset of Unlabeled Screen Recordings",html,selection_command
+10264,16952912,"examples/blog.html",25539,43,"A Dataset of Unlabeled Screen Recordings of",html,selection_command
+10265,16952946,"examples/blog.html",25539,48,"A Dataset of Unlabeled Screen Recordings of Long",html,selection_command
+10266,16953177,"examples/blog.html",25539,49,"A Dataset of Unlabeled Screen Recordings of Long-",html,selection_command
+10267,16953342,"examples/blog.html",25539,50,"A Dataset of Unlabeled Screen Recordings of Long-H",html,selection_command
+10268,16953540,"examples/blog.html",25539,58,"A Dataset of Unlabeled Screen Recordings of Long-Horizon A",html,selection_command
+10269,16953719,"examples/blog.html",25539,62,"A Dataset of Unlabeled Screen Recordings of Long-Horizon AGI R",html,selection_command
+10270,16954044,"examples/blog.html",25539,70,"A Dataset of Unlabeled Screen Recordings of Long-Horizon AGI Research<",html,selection_command
+10271,16954453,"examples/blog.html",25539,69,"A Dataset of Unlabeled Screen Recordings of Long-Horizon AGI Research",html,selection_command
+10272,16954590,"examples/blog.html",25539,69,"",html,content
+10273,16955058,"examples/blog.html",25539,0,"M",html,content
+10274,16955062,"examples/blog.html",25540,0,"",html,selection_keyboard
+10275,16955542,"examples/blog.html",25540,0,"aking Agents Work Like Humans",html,content
+10276,16955761,"examples/blog.html",25568,0,"",html,selection_command
+10277,16956508,"examples/blog.html",25639,0,"",html,selection_command
+10278,16956646,"examples/blog.html",25720,0,"",html,selection_command
+10279,16959650,"examples/agi_cast.html",0,0,"",html,tab
+10280,16960331,"examples/agi_cast.html",1557,0,"",html,selection_command
+10281,16960582,"examples/agi_cast.html",1544,0,"",html,selection_command
+10282,16960612,"examples/agi_cast.html",1538,0,"",html,selection_command
+10283,16960644,"examples/agi_cast.html",1530,0,"",html,selection_command
+10284,16960678,"examples/agi_cast.html",1474,0,"",html,selection_command
+10285,16960714,"examples/agi_cast.html",1452,0,"",html,selection_command
+10286,16960748,"examples/agi_cast.html",1437,0,"",html,selection_command
+10287,16960781,"examples/agi_cast.html",1430,0,"",html,selection_command
+10288,16960812,"examples/agi_cast.html",1422,0,"",html,selection_command
+10289,16960848,"examples/agi_cast.html",1380,0,"",html,selection_command
+10290,16960881,"examples/agi_cast.html",1305,0,"",html,selection_command
+10291,16960912,"examples/agi_cast.html",1260,0,"",html,selection_command
+10292,16960945,"examples/agi_cast.html",1224,0,"",html,selection_command
+10293,16960979,"examples/agi_cast.html",1216,0,"",html,selection_command
+10294,16961013,"examples/agi_cast.html",1199,0,"",html,selection_command
+10295,16961047,"examples/agi_cast.html",1153,0,"",html,selection_command
+10296,16961086,"examples/agi_cast.html",1115,0,"",html,selection_command
+10297,16961300,"examples/agi_cast.html",992,0,"",html,selection_command
+10298,16961681,"examples/agi_cast.html",996,0,"",html,selection_command
+10299,16961930,"examples/agi_cast.html",997,0,"",html,selection_command
+10300,16961962,"examples/agi_cast.html",1008,0,"",html,selection_command
+10301,16961994,"examples/agi_cast.html",1011,0,"",html,selection_command
+10302,16962027,"examples/agi_cast.html",1012,0,"",html,selection_command
+10303,16962060,"examples/agi_cast.html",1015,0,"",html,selection_command
+10304,16962402,"examples/agi_cast.html",1012,0,"",html,selection_command
+10305,16962871,"examples/agi_cast.html",1012,1,"W",html,selection_command
+10306,16962928,"examples/agi_cast.html",1012,4,"We i",html,selection_command
+10307,16963180,"examples/agi_cast.html",1012,14,"We introduce A",html,selection_command
+10308,16963210,"examples/agi_cast.html",1012,17,"We introduce AGI-",html,selection_command
+10309,16963244,"examples/agi_cast.html",1012,18,"We introduce AGI-C",html,selection_command
+10310,16963273,"examples/agi_cast.html",1012,22,"We introduce AGI-CAST,",html,selection_command
+10311,16963306,"examples/agi_cast.html",1012,24,"We introduce AGI-CAST, a",html,selection_command
+10312,16963339,"examples/agi_cast.html",1012,26,"We introduce AGI-CAST, a c",html,selection_command
+10313,16963373,"examples/agi_cast.html",1012,38,"We introduce AGI-CAST, a continually g",html,selection_command
+10314,16963405,"examples/agi_cast.html",1012,46,"We introduce AGI-CAST, a continually growing d",html,selection_command
+10315,16963439,"examples/agi_cast.html",1012,54,"We introduce AGI-CAST, a continually growing dataset o",html,selection_command
+10316,16963472,"examples/agi_cast.html",1012,57,"We introduce AGI-CAST, a continually growing dataset of u",html,selection_command
+10317,16963505,"examples/agi_cast.html",1012,67,"We introduce AGI-CAST, a continually growing dataset of unlabeled s",html,selection_command
+10318,16963539,"examples/agi_cast.html",1012,74,"We introduce AGI-CAST, a continually growing dataset of unlabeled screen r",html,selection_command
+10319,16963572,"examples/agi_cast.html",1012,85,"We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings o",html,selection_command
+10320,16963605,"examples/agi_cast.html",1012,88,"We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of A",html,selection_command
+10321,16963638,"examples/agi_cast.html",1012,92,"We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of AGI r",html,selection_command
+10322,16963671,"examples/agi_cast.html",1012,100,"We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of AGI research.",html,selection_command
+10323,16964206,"examples/agi_cast.html",1111,0,"",html,selection_command
+10324,16964499,"examples/agi_cast.html",1151,0,"",html,selection_command
+10325,16964752,"examples/agi_cast.html",1197,0,"",html,selection_command
+10326,16964781,"examples/agi_cast.html",1214,0,"",html,selection_command
+10327,16964812,"examples/agi_cast.html",1222,0,"",html,selection_command
+10328,16964847,"examples/agi_cast.html",1258,0,"",html,selection_command
+10329,16964879,"examples/agi_cast.html",1303,0,"",html,selection_command
+10330,16964912,"examples/agi_cast.html",1378,0,"",html,selection_command
+10331,16964947,"examples/agi_cast.html",1420,0,"",html,selection_command
+10332,16964979,"examples/agi_cast.html",1428,0,"",html,selection_command
+10333,16965011,"examples/agi_cast.html",1435,0,"",html,selection_command
+10334,16965048,"examples/agi_cast.html",1450,0,"",html,selection_command
+10335,16965198,"examples/agi_cast.html",1472,0,"",html,selection_command
+10336,16965452,"examples/agi_cast.html",1528,0,"",html,selection_command
+10337,16965487,"examples/agi_cast.html",1536,0,"",html,selection_command
+10338,16965519,"examples/agi_cast.html",1542,0,"",html,selection_command
+10339,16965550,"examples/agi_cast.html",1555,0,"",html,selection_command
+10340,16965584,"examples/agi_cast.html",1575,0,"",html,selection_command
+10341,16965619,"examples/agi_cast.html",1587,0,"",html,selection_command
+10342,16965654,"examples/agi_cast.html",1595,0,"",html,selection_command
+10343,16965819,"examples/agi_cast.html",1716,0,"",html,selection_command
+10344,16966471,"examples/agi_cast.html",1597,208," We introduce AGI-CAST, a continually growing dataset of unlabeled screen recordings of long-horizon AGI research.",html,selection_command
+10345,16966941,"examples/agi_cast.html",1716,0,"",html,selection_command
+10346,16967758,"examples/blog.html",0,0,"",html,tab
+10347,16968384,"examples/blog.html",25719,0,"",html,selection_command
+10348,16968474,"examples/blog.html",25716,0,"",html,selection_command
+10349,16968728,"examples/blog.html",25708,0,"",html,selection_command
+10350,16968766,"examples/blog.html",25706,0,"",html,selection_command
+10351,16968790,"examples/blog.html",25704,0,"",html,selection_command
+10352,16968824,"examples/blog.html",25700,0,"",html,selection_command
+10353,16968857,"examples/blog.html",25699,0,"",html,selection_command
+10354,16968951,"examples/blog.html",25696,0,"",html,selection_command
+10355,16969105,"examples/blog.html",25686,0,"",html,selection_command
+10356,16969238,"examples/blog.html",25683,0,"",html,selection_command
+10357,16969409,"examples/blog.html",25681,0,"",html,selection_command
+10358,16969645,"examples/blog.html",25683,0,"",html,selection_command
+10359,16970052,"examples/blog.html",25683,1,"W",html,selection_command
+10360,16970122,"examples/blog.html",25683,4,"We i",html,selection_command
+10361,16970383,"examples/blog.html",25683,14,"We introduce A",html,selection_command
+10362,16970408,"examples/blog.html",25683,17,"We introduce AGI-",html,selection_command
+10363,16970440,"examples/blog.html",25683,18,"We introduce AGI-C",html,selection_command
+10364,16970472,"examples/blog.html",25683,22,"We introduce AGI-CAST,",html,selection_command
+10365,16970511,"examples/blog.html",25683,24,"We introduce AGI-CAST, a",html,selection_command
+10366,16970539,"examples/blog.html",25683,26,"We introduce AGI-CAST, a d",html,selection_command
+10367,16970574,"examples/blog.html",25683,34,"We introduce AGI-CAST, a dataset o",html,selection_command
+10368,16970612,"examples/blog.html",25683,37,"We introduce AGI-CAST, a dataset of u",html,selection_command
+10369,16970645,"examples/blog.html",25683,47,"We introduce AGI-CAST, a dataset of unlabeled s",html,selection_command
+10370,16970681,"examples/blog.html",25683,54,"We introduce AGI-CAST, a dataset of unlabeled screen r",html,selection_command
+10371,16970708,"examples/blog.html",25683,65,"We introduce AGI-CAST, a dataset of unlabeled screen recordings o",html,selection_command
+10372,16970739,"examples/blog.html",25683,68,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of l",html,selection_command
+10373,16970772,"examples/blog.html",25683,72,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-",html,selection_command
+10374,16970805,"examples/blog.html",25683,73,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-h",html,selection_command
+10375,16970839,"examples/blog.html",25683,81,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon A",html,selection_command
+10376,16970873,"examples/blog.html",25683,85,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI r",html,selection_command
+10377,16970909,"examples/blog.html",25683,93,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research.",html,selection_command
+10378,16970940,"examples/blog.html",25683,95,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. W",html,selection_command
+10379,16970975,"examples/blog.html",25683,98,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We g",html,selection_command
+10380,16971010,"examples/blog.html",25683,101,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go b",html,selection_command
+10381,16971045,"examples/blog.html",25683,108,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond c",html,selection_command
+10382,16971078,"examples/blog.html",25683,113,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-",html,selection_command
+10383,16971113,"examples/blog.html",25683,114,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-c",html,selection_command
+10384,16971144,"examples/blog.html",25683,118,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code,",html,selection_command
+10385,16971178,"examples/blog.html",25683,120,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, o",html,selection_command
+10386,16971212,"examples/blog.html",25683,124,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our p",html,selection_command
+10387,16971479,"examples/blog.html",25683,133,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous w",html,selection_command
+10388,16971728,"examples/blog.html",25683,138,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work o",html,selection_command
+10389,16971759,"examples/blog.html",25683,141,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on c",html,selection_command
+10390,16971788,"examples/blog.html",25683,146,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-",html,selection_command
+10391,16971821,"examples/blog.html",25683,147,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-s",html,selection_command
+10392,16971855,"examples/blog.html",25683,156,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a",html,selection_command
+10393,16971890,"examples/blog.html",25683,158,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a d",html,selection_command
+10394,16972094,"examples/blog.html",25683,166,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset o",html,selection_command
+10395,16972348,"examples/blog.html",25683,169,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of I",html,selection_command
+10396,16972378,"examples/blog.html",25683,173,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE i",html,selection_command
+10397,16972409,"examples/blog.html",25683,185,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions,",html,selection_command
+10398,16972440,"examples/blog.html",25683,187,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, b",html,selection_command
+10399,16972471,"examples/blog.html",25683,190,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by c",html,selection_command
+10400,16972505,"examples/blog.html",25683,200,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing n",html,selection_command
+10401,16972538,"examples/blog.html",25683,204,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not o",html,selection_command
+10402,16972571,"examples/blog.html",25683,209,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only t",html,selection_command
+10403,16972604,"examples/blog.html",25683,213,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the I",html,selection_command
+10404,16972639,"examples/blog.html",25683,216,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE,",html,selection_command
+10405,16972671,"examples/blog.html",25683,218,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, b",html,selection_command
+10406,16972704,"examples/blog.html",25683,222,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but a",html,selection_command
+10407,16972737,"examples/blog.html",25683,227,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also b",html,selection_command
+10408,16972771,"examples/blog.html",25683,234,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also browser-",html,selection_command
+10409,16972804,"examples/blog.html",25683,235,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also browser-u",html,selection_command
+10410,16972836,"examples/blog.html",25683,238,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also browser-use,",html,selection_command
+10411,16972871,"examples/blog.html",25683,240,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also browser-use, n",html,selection_command
+10412,16972904,"examples/blog.html",25683,245,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also browser-use, notes,",html,selection_command
+10413,16973100,"examples/blog.html",25683,247,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also browser-use, notes, a",html,selection_command
+10414,16973349,"examples/blog.html",25683,251,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also browser-use, notes, and p",html,selection_command
+10415,16973379,"examples/blog.html",25683,257,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also browser-use, notes, and paper e",html,selection_command
+10416,16973414,"examples/blog.html",25683,268,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also browser-use, notes, and paper exploration.",html,selection_command
+10417,16973448,"examples/blog.html",25683,271,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also browser-use, notes, and paper exploration.",html,selection_command
+10419,16973514,"examples/blog.html",25683,292,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also browser-use, notes, and paper exploration.\n <",html,selection_command
+10420,16973546,"examples/blog.html",25683,294,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also browser-use, notes, and paper exploration.\n \n",html,selection_command
+10422,16973848,"examples/blog.html",25683,294,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also browser-use, notes, and paper exploration.\n \n <",html,selection_command
+10424,16974125,"examples/blog.html",25683,272,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also browser-use, notes, and paper exploration.",html,selection_command
+10425,16974156,"examples/blog.html",25683,271,"We introduce AGI-CAST, a dataset of unlabeled screen recordings of long-horizon AGI research. We go beyond crowd-code, our previous work on crowd-sourcing a dataset of IDE interactions, by capturing not only the IDE, but also browser-use, notes, and paper exploration.AGI-CAST, a continually growing dataset of unlabeled screen recordings of long-horizon AGI research.",html,content
+10430,16975947,"examples/blog.html",25891,0,"",html,selection_keyboard
+10431,16976999,"examples/blog.html",25894,0,"",html,selection_command
+10432,16977405,"examples/blog.html",25641,0,"",html,selection_command
+10433,16978264,"examples/blog.html",25663,0,"",html,selection_command
+10434,16978506,"examples/blog.html",25664,0,"",html,selection_command
+10435,16978536,"examples/blog.html",25666,0,"",html,selection_command
+10436,16978569,"examples/blog.html",25671,0,"",html,selection_command
+10437,16978602,"examples/blog.html",25673,0,"",html,selection_command
+10438,16978764,"examples/blog.html",25681,0,"",html,selection_command
+10439,16978975,"examples/blog.html",25691,0,"",html,selection_command
+10440,16979446,"examples/blog.html",25690,0,"",html,selection_command
+10441,16979840,"examples/blog.html",25690,1," ",html,selection_command
+10442,16980340,"examples/blog.html",25681,10,"""> ",html,selection_command
+10443,16980857,"examples/blog.html",25682,9,"> ",html,selection_command
+10444,16981023,"examples/blog.html",25683,8," ",html,selection_command
+10445,16981311,"examples/blog.html",25683,8,"",html,content
+10446,16984411,"TERMINAL",0,0,"",,terminal_command
+10447,16984412,"TERMINAL",0,0,"]633;C",,terminal_output
+10448,16988236,"TERMINAL",0,0,"cp examples/* dist",,terminal_command
+10449,16988287,"TERMINAL",0,0,"]633;C",,terminal_output
+10450,16988301,"TERMINAL",0,0,"[1m[7m%[27m[1m[0m \r \r",,terminal_output
+10451,16990087,"TERMINAL",0,0,"npm run dev",,terminal_command
+10452,16990138,"TERMINAL",0,0,"]633;C",,terminal_output
+10453,16990390,"TERMINAL",0,0,"\r\n> distill-template@2.8.0 dev\r\n> rollup -c rollup.config.dev.js -w\r\n\r\n[1G[0K",,terminal_output
+10454,16990572,"TERMINAL",0,0,"c[4mrollup v2.7.3[24m\r\n[36mbundles [1msrc/components.js[22m โ [1mdist/template.v2.js[22m...[39m\r\n",,terminal_output
+10455,16990983,"TERMINAL",0,0,"[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/dist\r\n[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/examples\r\n[1m[33m(!) Circular dependencies[39m[22m\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/array.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/object.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-selection/src/selection/index.js -> node_modules/d3-selection/src/selection/select.js -> node_modules/d3-selection/src/selection/index.js\r\n...and 7 more\r\n[32mcreated [1mdist/template.v2.js[22m in [1m406ms[22m[39m\r\n[36mbundles [1msrc/transforms.js[22m โ [1mdist/transforms.v2.js[22m...[39m\r\n",,terminal_output
+10456,16991913,"TERMINAL",0,0,"[32mcreated [1mdist/transforms.v2.js[22m in [1m933ms[22m[39m\r\n\r\n[2025-11-02 17:42:41] waiting for changes...\r\n",,terminal_output
+10457,17013261,"examples/blog.html",25686,0,"",html,selection_command
+10458,17013467,"examples/blog.html",25696,0,"",html,selection_command
+10459,17013663,"examples/blog.html",25697,0,"",html,selection_command
+10460,17014004,"examples/blog.html",25696,0,"",html,selection_command
+10461,17014443,"examples/blog.html",25696,1,"<",html,selection_command
+10462,17014496,"examples/blog.html",25696,2,"A",html,selection_command
+10480,17015857,"examples/blog.html",25696,83,"",html,selection_command
+10481,17016098,"examples/blog.html",25696,82,"",html,selection_command
+10483,17016768,"examples/blog.html",25696,83,"",html,content
+10484,17016896,"examples/blog.html",25699,0,"",html,selection_command
+10485,17017104,"examples/blog.html",25700,0,"",html,selection_command
+10486,17017332,"examples/blog.html",25704,0,"",html,selection_command
+10487,17017782,"examples/blog.html",25704,1,"<",html,selection_command
+10488,17017939,"examples/blog.html",25704,2,"",html,selection_command
+10489,17018123,"examples/blog.html",25704,3,"",html,selection_command
+10491,17018706,"examples/blog.html",25704,4,"",html,content
+10492,17020220,"TERMINAL",0,0,"^C[1G[0Kโ [1G[0K[1m[7m%[27m[1m[0m \r \r]633;D;0]633;P;Cwd=/Users/franzsrambical/Documents/pdoom/pdoom.org\r[0m[27m[24m[J]633;Afranzsrambical@MBF6N9WFVKFV pdoom.org % ]633;B[K[?2004h",,terminal_output
+10493,17020489,"TERMINAL",0,0,"npm run dev",,terminal_output
+10494,17020542,"TERMINAL",0,0,"[11Dcp examples/* dist",,terminal_output
+10495,17020959,"TERMINAL",0,0,"cp examples/* dist",,terminal_command
+10496,17020959,"TERMINAL",0,0,"[?2004l\r\r\n]633;E;cp examples/* dist;280736e1-4344-414b-b289-7b3558692a37",,terminal_output
+10497,17021009,"TERMINAL",0,0,"]633;C",,terminal_output
+10498,17021021,"TERMINAL",0,0,"[1m[7m%[27m[1m[0m \r \r",,terminal_output
+10499,17021584,"TERMINAL",0,0,"npm run dev",,terminal_command
+10500,17021636,"TERMINAL",0,0,"]633;C",,terminal_output
+10501,17021781,"TERMINAL",0,0,"\r\n> distill-template@2.8.0 dev\r\n> rollup -c rollup.config.dev.js -w\r\n\r\n[1G[0K",,terminal_output
+10502,17021977,"TERMINAL",0,0,"c[4mrollup v2.7.3[24m\r\n[36mbundles [1msrc/components.js[22m โ [1mdist/template.v2.js[22m...[39m\r\n",,terminal_output
+10503,17022398,"TERMINAL",0,0,"[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/dist\r\n[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/examples\r\n[1m[33m(!) Circular dependencies[39m[22m\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/array.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/object.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-selection/src/selection/index.js -> node_modules/d3-selection/src/selection/select.js -> node_modules/d3-selection/src/selection/index.js\r\n...and 7 more\r\n[32mcreated [1mdist/template.v2.js[22m in [1m401ms[22m[39m\r\n[36mbundles [1msrc/transforms.js[22m โ [1mdist/transforms.v2.js[22m...[39m\r\n",,terminal_output
+10504,17022965,"TERMINAL",0,0,"[32mcreated [1mdist/transforms.v2.js[22m in [1m586ms[22m[39m\r\n\r\n[2025-11-02 17:43:12] waiting for changes...\r\n",,terminal_output
+10505,17026296,"examples/blog.html",25638,0,"",html,selection_command
+10506,17026421,"examples/blog.html",25552,0,"",html,selection_command
+10507,17026586,"examples/blog.html",25487,0,"",html,selection_command
+10508,17027011,"examples/blog.html",25408,0,"",html,selection_command
+10509,17027539,"examples/blog.html",25405,0,"",html,selection_command
+10510,17027765,"examples/blog.html",25404,0,"",html,selection_command
+10511,17027801,"examples/blog.html",25396,0,"",html,selection_command
+10512,17027829,"examples/blog.html",25394,0,"",html,selection_command
+10513,17027862,"examples/blog.html",25391,0,"",html,selection_command
+10514,17028787,"examples/blog.html",25437,0," controls autoplay loop muted /",html,content
+10515,17028787,"examples/blog.html",25390,0,"deo",html,content
+10516,17028787,"examples/blog.html",25388,2,"",html,content
+10517,17028787,"examples/blog.html",25387,0,"v",html,content
+10518,17031323,"TERMINAL",0,0,"^C[1G[0Kโ [1G[0K[1m[7m%[27m[1m[0m \r \r]633;D;0]633;P;Cwd=/Users/franzsrambical/Documents/pdoom/pdoom.org\r[0m[27m[24m[J]633;Afranzsrambical@MBF6N9WFVKFV pdoom.org % ]633;B[K[?2004h",,terminal_output
+10519,17031676,"TERMINAL",0,0,"npm run dev[11Dcp examples/* dist",,terminal_output
+10520,17032015,"TERMINAL",0,0,"cp examples/* dist",,terminal_command
+10521,17032015,"TERMINAL",0,0,"[?2004l\r\r\n]633;E;cp examples/* dist;280736e1-4344-414b-b289-7b3558692a37",,terminal_output
+10522,17032057,"TERMINAL",0,0,"]633;C[1m[7m%[27m[1m[0m \r \r",,terminal_output
+10523,17032603,"TERMINAL",0,0,"npm run dev",,terminal_command
+10524,17032654,"TERMINAL",0,0,"]633;C",,terminal_output
+10525,17032777,"TERMINAL",0,0,"\r\n> distill-template@2.8.0 dev\r\n> rollup -c rollup.config.dev.js -w\r\n\r\n[1G[0K",,terminal_output
+10526,17032932,"TERMINAL",0,0,"c[4mrollup v2.7.3[24m\r\n[36mbundles [1msrc/components.js[22m โ [1mdist/template.v2.js[22m...[39m\r\n",,terminal_output
+10527,17033331,"TERMINAL",0,0,"[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/dist\r\n[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/examples\r\n[1m[33m(!) Circular dependencies[39m[22m\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/array.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/object.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-selection/src/selection/index.js -> node_modules/d3-selection/src/selection/select.js -> node_modules/d3-selection/src/selection/index.js\r\n...and 7 more\r\n[32mcreated [1mdist/template.v2.js[22m in [1m395ms[22m[39m\r\n[36mbundles [1msrc/transforms.js[22m โ [1mdist/transforms.v2.js[22m...[39m\r\n",,terminal_output
+10528,17034004,"TERMINAL",0,0,"[32mcreated [1mdist/transforms.v2.js[22m in [1m675ms[22m[39m\r\n\r\n[2025-11-02 17:43:23] waiting for changes...\r\n",,terminal_output
+10529,17050472,"examples/blog.html",25396,0,"",html,selection_command
+10530,17050687,"examples/blog.html",25398,0,"",html,selection_command
+10531,17050719,"examples/blog.html",25406,0,"",html,selection_command
+10532,17050753,"examples/blog.html",25407,0,"",html,selection_command
+10533,17050786,"examples/blog.html",25410,0,"",html,selection_command
+10534,17050818,"examples/blog.html",25412,0,"",html,selection_command
+10535,17050855,"examples/blog.html",25417,0,"",html,selection_command
+10536,17051025,"examples/blog.html",25419,0,"",html,selection_command
+10537,17051275,"examples/blog.html",25425,0,"",html,selection_command
+10538,17051306,"examples/blog.html",25426,0,"",html,selection_command
+10539,17051338,"examples/blog.html",25432,0,"",html,selection_command
+10540,17052138,"examples/blog.html",25434,0,"",html,selection_command
+10541,17052387,"examples/blog.html",25437,0,"",html,selection_command
+10542,17053553,"examples/blog.html",25345,132," ",html,selection_command
+10543,17076286,"examples/blog.html",25437,0,"",html,selection_command
+10544,17082347,"examples/blog.html",25345,132," ",html,content
+10545,17091488,"examples/blog.html",25503,0,"",html,selection_command
+10546,17091735,"examples/blog.html",25500,0,"",html,selection_command
+10547,17091760,"examples/blog.html",25496,0,"",html,selection_command
+10548,17091797,"examples/blog.html",25490,0,"",html,selection_command
+10549,17091827,"examples/blog.html",25485,0,"",html,selection_command
+10550,17091860,"examples/blog.html",25476,0,"",html,selection_command
+10551,17091894,"examples/blog.html",25467,0,"",html,selection_command
+10552,17091928,"examples/blog.html",25464,0,"",html,selection_command
+10553,17091960,"examples/blog.html",25461,0,"",html,selection_command
+10554,17091994,"examples/blog.html",25459,0,"",html,selection_command
+10555,17092027,"examples/blog.html",25453,0,"",html,selection_command
+10556,17093784,"TERMINAL",0,0,"^C[1G[0Kโ [1G[0K[1m[7m%[27m[1m[0m \r \r]633;D;0]633;P;Cwd=/Users/franzsrambical/Documents/pdoom/pdoom.org\r[0m[27m[24m[J]633;Afranzsrambical@MBF6N9WFVKFV pdoom.org % ]633;B[K[?2004h",,terminal_output
+10557,17094040,"TERMINAL",0,0,"npm run dev",,terminal_output
+10558,17094119,"TERMINAL",0,0,"[11Dcp examples/* dist",,terminal_output
+10559,17094472,"TERMINAL",0,0,"cp examples/* dist",,terminal_command
+10560,17094473,"TERMINAL",0,0,"[?2004l\r\r\n]633;E;cp examples/* dist;280736e1-4344-414b-b289-7b3558692a37",,terminal_output
+10561,17094523,"TERMINAL",0,0,"]633;C",,terminal_output
+10562,17094535,"TERMINAL",0,0,"[1m[7m%[27m[1m[0m \r \r",,terminal_output
+10563,17095176,"TERMINAL",0,0,"npm run dev",,terminal_command
+10564,17095227,"TERMINAL",0,0,"]633;C",,terminal_output
+10565,17095381,"TERMINAL",0,0,"\r\n> distill-template@2.8.0 dev\r\n> rollup -c rollup.config.dev.js -w\r\n\r\n[1G[0K",,terminal_output
+10566,17095560,"TERMINAL",0,0,"c[4mrollup v2.7.3[24m\r\n[36mbundles [1msrc/components.js[22m โ [1mdist/template.v2.js[22m...[39m\r\n",,terminal_output
+10567,17095972,"TERMINAL",0,0,"[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/dist\r\n[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/examples\r\n[1m[33m(!) Circular dependencies[39m[22m\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/array.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/object.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-selection/src/selection/index.js -> node_modules/d3-selection/src/selection/select.js -> node_modules/d3-selection/src/selection/index.js\r\n...and 7 more\r\n[32mcreated [1mdist/template.v2.js[22m in [1m416ms[22m[39m\r\n[36mbundles [1msrc/transforms.js[22m โ [1mdist/transforms.v2.js[22m...[39m\r\n",,terminal_output
+10568,17096627,"TERMINAL",0,0,"[32mcreated [1mdist/transforms.v2.js[22m in [1m652ms[22m[39m\r\n\r\n[2025-11-02 17:44:25] waiting for changes...\r\n",,terminal_output
+10569,17130344,"examples/blog.html",25345,0,"",html,selection_command
+10570,17132592,"TERMINAL",0,0,"^C[1G[0Kโ [1G[0K[1m[7m%[27m[1m[0m \r \r]633;D;0]633;P;Cwd=/Users/franzsrambical/Documents/pdoom/pdoom.org\r[0m[27m[24m[J]633;Afranzsrambical@MBF6N9WFVKFV pdoom.org % ]633;B[K[?2004h",,terminal_output
+10571,17132981,"TERMINAL",0,0,"npm run dev",,terminal_output
+10572,17133064,"TERMINAL",0,0,"[11Dcp examples/* dist",,terminal_output
+10573,17133521,"TERMINAL",0,0,"cp examples/* dist",,terminal_command
+10574,17133522,"TERMINAL",0,0,"[?2004l\r\r\n]633;E;cp examples/* dist;280736e1-4344-414b-b289-7b3558692a37",,terminal_output
+10575,17133569,"TERMINAL",0,0,"]633;C[1m[7m%[27m[1m[0m \r \r",,terminal_output
+10576,17157248,"TERMINAL",0,0,"git subtree push --prefix dist origin gh-pages",,terminal_command
+10577,17157299,"TERMINAL",0,0,"]633;C",,terminal_output
+10578,17158970,"TERMINAL",0,0,"git push using: origin gh-pages\r\n1/ 452 (0) [0]\r2/ 452 (1) [0]\r3/ 452 (2) [0]\r4/ 452 (3) [0]\r5/ 452 (4) [0]\r6/ 452 (5) [0]\r7/ 452 (6) [0]\r8/ 452 (7) [0]\r9/ 452 (8) [0]\r10/ 452 (9) [0]\r11/ 452 (10) [0]\r12/ 452 (11) [0]\r13/ 452 (12) [0]\r14/ 452 (13) [0]\r15/ 452 (14) [0]\r16/ 452 (15) [0]\r17/ 452 (16) [0]\r18/ 452 (17) [0]\r19/ 452 (18) [0]\r20/ 452 (19) [0]\r21/ 452 (20) [0]\r22/ 452 (21) [0]\r23/ 452 (22) [0]\r24/ 452 (23) [0]\r25/ 452 (24) [0]\r26/ 452 (25) [0]\r27/ 452 (26) [0]\r28/ 452 (27) [0]\r29/ 452 (28) [0]\r30/ 452 (29) [0]\r31/ 452 (30) [0]\r32/ 452 (31) [0]\r33/ 452 (32) [0]\r34/ 452 (33) [0]\r35/ 452 (34) [0]\r36/ 452 (35) [0]\r37/ 452 (36) [0]\r38/ 452 (37) [0]\r39/ 452 (38) [0]\r40/ 452 (39) [0]\r41/ 452 (40) [0]\r42/ 452 (41) [0]\r43/ 452 (42) [0]\r44/ 452 (43) [0]\r45/ 452 (44) [0]\r46/ 452 (45) [0]\r47/ 452 (46) [0]\r48/ 452 (47) [0]\r49/ 452 (48) [0]\r50/ 452 (49) [0]\r51/ 452 (50) [0]\r52/ 452 (51) [0]\r53/ 452 (52) [0]\r54/ 452 (53) [0]\r55/ 452 (54) [0]\r56/ 452 (55) [0]\r57/ 452 (56) [0]\r58/ 452 (57) [0]\r59/ 452 (58) [0]\r60/ 452 (59) [0]\r61/ 452 (60) [0]\r62/ 452 (61) [0]\r63/ 452 (62) [0]\r64/ 452 (63) [0]\r65/ 452 (64) [0]\r66/ 452 (65) [0]\r67/ 452 (66) [0]\r68/ 452 (67) [0]\r69/ 452 (68) [0]\r70/ 452 (69) [0]\r71/ 452 (70) [0]\r72/ 452 (71) [0]\r73/ 452 (72) [0]\r74/ 452 (73) [0]\r75/ 452 (74) [0]\r76/ 452 (75) [0]\r77/ 452 (76) [0]\r78/ 452 (77) [0]\r79/ 452 (78) [0]\r80/ 452 (79) [0]\r81/ 452 (80) [0]\r82/ 452 (81) [0]\r83/ 452 (82) [0]\r84/ 452 (83) [0]\r85/ 452 (84) [0]\r86/ 452 (85) [0]\r87/ 452 (86) [0]\r88/ 452 (87) [0]\r89/ 452 (88) [0]\r90/ 452 (89) [0]\r91/ 452 (90) [0]\r92/ 452 (91) [0]\r93/ 452 (92) [0]\r94/ 452 (93) [0]\r95/ 452 (94) [0]\r96/ 452 (95) [0]\r97/ 452 (96) [0]\r98/ 452 (97) [0]\r99/ 452 (98) [0]\r100/ 452 (99) [0]\r101/ 452 (100) [0]\r",,terminal_output
+10579,17162534,"TERMINAL",0,0,"102/ 452 (101) [0]\r103/ 452 (102) [0]\r104/ 452 (103) [0]\r105/ 452 (104) [0]\r106/ 452 (105) [0]\r107/ 452 (106) [0]\r108/ 452 (107) [0]\r109/ 452 (108) [0]\r110/ 452 (109) [0]\r111/ 452 (110) [0]\r112/ 452 (111) [0]\r113/ 452 (112) [0]\r114/ 452 (113) [0]\r115/ 452 (114) [0]\r116/ 452 (115) [0]\r117/ 452 (116) [0]\r118/ 452 (117) [0]\r119/ 452 (118) [0]\r120/ 452 (119) [0]\r121/ 452 (120) [0]\r122/ 452 (121) [0]\r123/ 452 (122) [0]\r124/ 452 (123) [0]\r125/ 452 (124) [0]\r126/ 452 (125) [0]\r127/ 452 (126) [0]\r128/ 452 (127) [0]\r129/ 452 (128) [0]\r130/ 452 (129) [0]\r131/ 452 (130) [0]\r132/ 452 (131) [0]\r133/ 452 (132) [0]\r134/ 452 (133) [0]\r135/ 452 (134) [0]\r136/ 452 (135) [0]\r137/ 452 (136) [0]\r138/ 452 (137) [0]\r139/ 452 (138) [0]\r140/ 452 (139) [0]\r141/ 452 (140) [0]\r142/ 452 (141) [0]\r143/ 452 (142) [0]\r144/ 452 (143) [0]\r145/ 452 (144) [0]\r146/ 452 (145) [0]\r147/ 452 (146) [0]\r148/ 452 (147) [0]\r149/ 452 (148) [0]\r150/ 452 (149) [0]\r151/ 452 (150) [0]\r152/ 452 (151) [0]\r153/ 452 (152) [0]\r154/ 452 (153) [0]\r155/ 452 (154) [0]\r156/ 452 (155) [0]\r157/ 452 (156) [0]\r158/ 452 (157) [0]\r159/ 452 (158) [0]\r160/ 452 (159) [0]\r161/ 452 (160) [0]\r162/ 452 (161) [0]\r163/ 452 (162) [0]\r164/ 452 (163) [0]\r165/ 452 (164) [0]\r166/ 452 (165) [0]\r167/ 452 (166) [0]\r168/ 452 (167) [0]\r169/ 452 (168) [0]\r170/ 452 (169) [0]\r171/ 452 (170) [0]\r172/ 452 (171) [0]\r173/ 452 (172) [0]\r174/ 452 (173) [0]\r175/ 452 (174) [0]\r176/ 452 (175) [0]\r177/ 452 (176) [0]\r178/ 452 (177) [0]\r179/ 452 (178) [0]\r180/ 452 (179) [0]\r181/ 452 (180) [0]\r182/ 452 (181) [0]\r183/ 452 (182) [0]\r184/ 452 (183) [0]\r185/ 452 (184) [0]\r186/ 452 (185) [0]\r187/ 452 (186) [0]\r188/ 452 (187) [0]\r189/ 452 (188) [0]\r190/ 452 (189) [0]\r191/ 452 (190) [0]\r192/ 452 (191) [0]\r193/ 452 (192) [0]\r194/ 452 (193) [0]\r195/ 452 (194) [0]\r196/ 452 (195) [0]\r197/ 452 (196) [0]\r198/ 452 (197) [0]\r199/ 452 (198) [0]\r200/ 
452 (199) [0]\r201/ 452 (200) [0]\r202/ 452 (201) [0]\r203/ 452 (202) [0]\r204/ 452 (203) [0]\r205/ 452 (204) [0]\r206/ 452 (205) [0]\r207/ 452 (206) [0]\r208/ 452 (207) [0]\r209/ 452 (208) [0]\r210/ 452 (209) [0]\r211/ 452 (210) [0]\r212/ 452 (211) [0]\r213/ 452 (212) [0]\r214/ 452 (213) [0]\r215/ 452 (214) [0]\r216/ 452 (215) [0]\r217/ 452 (216) [0]\r218/ 452 (217) [0]\r219/ 452 (218) [0]\r220/ 452 (219) [0]\r221/ 452 (220) [0]\r222/ 452 (221) [0]\r223/ 452 (222) [0]\r224/ 452 (223) [0]\r225/ 452 (224) [0]\r226/ 452 (225) [0]\r227/ 452 (226) [0]\r228/ 452 (227) [0]\r229/ 452 (228) [0]\r230/ 452 (229) [0]\r231/ 452 (230) [0]\r232/ 452 (231) [0]\r233/ 452 (232) [0]\r234/ 452 (233) [0]\r235/ 452 (234) [0]\r236/ 452 (235) [0]\r237/ 452 (236) [0]\r238/ 452 (237) [0]\r239/ 452 (238) [0]\r240/ 452 (239) [0]\r241/ 452 (240) [0]\r242/ 452 (241) [0]\r243/ 452 (242) [0]\r244/ 452 (243) [0]\r245/ 452 (244) [0]\r246/ 452 (245) [0]\r247/ 452 (246) [0]\r248/ 452 (247) [0]\r249/ 452 (248) [0]\r250/ 452 (249) [0]\r251/ 452 (250) [0]\r252/ 452 (251) [0]\r253/ 452 (252) [0]\r254/ 452 (253) [0]\r255/ 452 (254) [0]\r256/ 452 (255) [0]\r257/ 452 (256) [0]\r258/ 452 (257) [0]\r259/ 452 (258) [0]\r260/ 452 (259) [0]\r261/ 452 (260) [0]\r262/ 452 (261) [0]\r263/ 452 (262) [0]\r264/ 452 (263) [0]\r265/ 452 (264) [0]\r266/ 452 (265) [0]\r267/ 452 (266) [0]\r268/ 452 (267) [0]\r269/ 452 (268) [0]\r270/ 452 (269) [0]\r271/ 452 (270) [0]\r272/ 452 (271) [0]\r273/ 452 (272) [0]\r274/ 452 (273) [0]\r275/ 452 (274) [0]\r276/ 452 (275) [0]\r277/ 452 (276) [0]\r278/ 452 (277) [0]\r279/ 452 (278) [0]\r280/ 452 (279) [0]\r281/ 452 (280) [0]\r282/ 452 (281) [0]\r283/ 452 (282) [0]\r284/ 452 (283) [0]\r285/ 452 (284) [0]\r286/ 452 (285) [0]\r287/ 452 (286) [0]\r288/ 452 (287) [0]\r289/ 452 (288) [0]\r290/ 452 (289) [0]\r291/ 452 (290) [0]\r292/ 452 (291) [0]\r293/ 452 (292) [0]\r294/ 452 (293) [0]\r295/ 452 (294) [0]\r296/ 452 (295) [0]\r297/ 452 (296) [0]\r298/ 452 (297) [0]\r299/ 452 (298) [0]\r300/ 
452 (299) [0]\r301/ 452 (300) [0]\r302/ 452 (301) [0]\r303/ 452 (302) [0]\r304/ 452 (303) [0]\r305/ 452 (304) [0]\r306/ 452 (305) [0]\r307/ 452 (306) [0]\r308/ 452 (307) [0]\r309/ 452 (308) [0]\r310/ 452 (309) [0]\r311/ 452 (310) [0]\r312/ 452 (311) [0]\r313/ 452 (312) [0]\r314/ 452 (313) [0]\r315/ 452 (314) [0]\r316/ 452 (315) [0]\r317/ 452 (316) [0]\r318/ 452 (317) [0]\r319/ 452 (318) [0]\r320/ 452 (319) [0]\r321/ 452 (320) [0]\r322/ 452 (321) [0]\r323/ 452 (322) [0]\r324/ 452 (323) [0]\r325/ 452 (324) [0]\r326/ 452 (325) [0]\r327/ 452 (326) [0]\r328/ 452 (327) [0]\r329/ 452 (328) [0]\r330/ 452 (329) [0]\r331/ 452 (330) [0]\r332/ 452 (331) [0]\r333/ 452 (332) [0]\r334/ 452 (333) [0]\r335/ 452 (334) [0]\r336/ 452 (335) [0]\r337/ 452 (336) [0]\r338/ 452 (337) [0]\r339/ 452 (338) [0]\r340/ 452 (339) [0]\r341/ 452 (340) [0]\r342/ 452 (341) [0]\r343/ 452 (342) [0]\r344/ 452 (343) [0]\r345/ 452 (344) [0]\r346/ 452 (345) [0]\r347/ 452 (346) [0]\r348/ 452 (347) [0]\r349/ 452 (348) [0]\r350/ 452 (349) [0]\r351/ 452 (350) [0]\r352/ 452 (351) [0]\r353/ 452 (352) [0]\r354/ 452 (353) [0]\r355/ 452 (354) [0]\r356/ 452 (355) [0]\r357/ 452 (356) [0]\r358/ 452 (357) [0]\r359/ 452 (358) [0]\r360/ 452 (359) [0]\r361/ 452 (360) [0]\r362/ 452 (361) [0]\r363/ 452 (362) [0]\r364/ 452 (363) [0]\r365/ 452 (364) [0]\r366/ 452 (365) [0]\r367/ 452 (366) [0]\r368/ 452 (367) [0]\r369/ 452 (368) [0]\r370/ 452 (369) [0]\r371/ 452 (370) [0]\r372/ 452 (371) [0]\r373/ 452 (372) [0]\r374/ 452 (373) [0]\r375/ 452 (374) [0]\r376/ 452 (375) [0]\r377/ 452 (376) [0]\r378/ 452 (377) [0]\r379/ 452 (378) [0]\r380/ 452 (379) [0]\r381/ 452 (380) [0]\r382/ 452 (381) [0]\r383/ 452 (382) [0]\r384/ 452 (383) [0]\r385/ 452 (384) [0]\r386/ 452 (385) [0]\r387/ 452 (386) [0]\r388/ 452 (387) [0]\r389/ 452 (388) [0]\r390/ 452 (389) [0]\r391/ 452 (390) [0]\r392/ 452 (391) [0]\r393/ 452 (392) [0]\r394/ 452 (393) [0]\r395/ 452 (394) [0]\r396/ 452 (395) [0]\r397/ 452 (396) [0]\r398/ 452 (397) [0]\r399/ 452 (398) [0]\r400/ 
452 (399) [0]\r401/ 452 (400) [0]\r402/ 452 (401) [0]\r403/ 452 (402) [0]\r404/ 452 (403) [0]\r405/ 452 (404) [0]\r406/ 452 (405) [0]\r407/ 452 (406) [0]\r408/ 452 (407) [0]\r409/ 452 (408) [0]\r410/ 452 (409) [0]\r411/ 452 (410) [0]\r412/ 452 (411) [0]\r413/ 452 (412) [0]\r414/ 452 (413) [0]\r415/ 452 (414) [0]\r416/ 452 (415) [0]\r417/ 452 (416) [0]\r418/ 452 (417) [0]\r419/ 452 (418) [0]\r420/ 452 (419) [0]\r421/ 452 (420) [0]\r422/ 452 (421) [0]\r423/ 452 (422) [0]\r424/ 452 (423) [0]\r425/ 452 (424) [0]\r426/ 452 (425) [0]\r427/ 452 (426) [0]\r428/ 452 (427) [0]\r429/ 452 (428) [0]\r430/ 452 (429) [0]\r431/ 452 (430) [0]\r432/ 452 (431) [0]\r433/ 452 (432) [0]\r434/ 452 (433) [0]\r435/ 452 (434) [0]\r",,terminal_output
+10580,17162594,"TERMINAL",0,0,"436/ 452 (435) [0]\r",,terminal_output
+10581,17162821,"TERMINAL",0,0,"437/ 452 (436) [0]\r438/ 452 (437) [0]\r439/ 452 (438) [0]\r440/ 452 (439) [0]\r441/ 452 (440) [0]\r",,terminal_output
+10582,17163163,"TERMINAL",0,0,"442/ 452 (441) [0]\r443/ 452 (442) [0]\r444/ 452 (443) [0]\r445/ 452 (444) [0]\r446/ 452 (445) [0]\r447/ 452 (446) [0]\r448/ 452 (447) [0]\r449/ 452 (448) [0]\r450/ 452 (449) [0]\r451/ 452 (450) [0]\r452/ 452 (451) [0]\r",,terminal_output
+10583,17163617,"TERMINAL",0,0,"Enumerating objects: 5, done.\r\nCounting objects: 20% (1/5)\rCounting objects: 40% (2/5)\rCounting objects: 60% (3/5)\rCounting objects: 80% (4/5)\rCounting objects: 100% (5/5)\rCounting objects: 100% (5/5), done.\r\nDelta compression using up to 8 threads\r\nCompressing objects: 33% (1/3)\rCompressing objects: 66% (2/3)\rCompressing objects: 100% (3/3)\rCompressing objects: 100% (3/3), done.\r\nWriting objects: 33% (1/3)\rWriting objects: 66% (2/3)\rWriting objects: 100% (3/3)\rWriting objects: 100% (3/3), 563 bytes | 563.00 KiB/s, done.\r\nTotal 3 (delta 2), reused 0 (delta 0), pack-reused 0\r\n",,terminal_output
+10584,17163941,"TERMINAL",0,0,"remote: Resolving deltas: 0% (0/2)[K\rremote: Resolving deltas: 50% (1/2)[K\rremote: Resolving deltas: 100% (2/2)[K\rremote: Resolving deltas: 100% (2/2), completed with 2 local objects.[K\r\nTo https://github.com/emergenz/pdoom.org\r\n 0249c08..6339769 6339769e26394009774af7c0937a48af22b8f038 -> gh-pages\r\n[1m[7m%[27m[1m[0m \r \r",,terminal_output
+10585,17224985,"TERMINAL",0,0,"npm run dev",,terminal_command
+10586,17225036,"TERMINAL",0,0,"]633;C",,terminal_output
+10587,17225261,"TERMINAL",0,0,"\r\n> distill-template@2.8.0 dev\r\n> rollup -c rollup.config.dev.js -w\r\n\r\n[1G[0K",,terminal_output
+10588,17225436,"TERMINAL",0,0,"c[4mrollup v2.7.3[24m\r\n[36mbundles [1msrc/components.js[22m โ [1mdist/template.v2.js[22m...[39m\r\n",,terminal_output
+10589,17225939,"TERMINAL",0,0,"[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/dist\r\n[1m[32mhttp://localhost:8088[39m[22m -> /Users/franzsrambical/Documents/pdoom/pdoom.org/examples\r\n[1m[33m(!) Circular dependencies[39m[22m\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/array.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-interpolate/src/value.js -> node_modules/d3-interpolate/src/object.js -> node_modules/d3-interpolate/src/value.js\r\nnode_modules/d3-selection/src/selection/index.js -> node_modules/d3-selection/src/selection/select.js -> node_modules/d3-selection/src/selection/index.js\r\n...and 7 more\r\n[32mcreated [1mdist/template.v2.js[22m in [1m504ms[22m[39m\r\n[36mbundles [1msrc/transforms.js[22m โ [1mdist/transforms.v2.js[22m...[39m\r\n",,terminal_output
+10590,17226325,"TERMINAL",0,0,"[32mcreated [1mdist/transforms.v2.js[22m in [1m385ms[22m[39m\r\n\r\n[2025-11-02 17:46:35] waiting for changes...\r\n",,terminal_output
diff --git a/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-f818bac9-3228-48bb-85cd-ad930fdb35d91752220838711-2025_07_11-10.00.40.248/source.csv b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-f818bac9-3228-48bb-85cd-ad930fdb35d91752220838711-2025_07_11-10.00.40.248/source.csv
new file mode 100644
index 0000000000000000000000000000000000000000..292d56f660b7f2d2ccb818e10c3d5bf162ecab80
--- /dev/null
+++ b/4de8d861ed2563988d5f1871647ebc5fe70861b32d24a4b32f9363518653a328/crowd-code-f818bac9-3228-48bb-85cd-ad930fdb35d91752220838711-2025_07_11-10.00.40.248/source.csv
@@ -0,0 +1,2497 @@
+Sequence,Time,File,RangeOffset,RangeLength,Text,Language,Type
+1,1,"utils/dataloader.py",0,0,"import jax\nimport numpy as np\nimport grain\nfrom typing import Any\nimport pickle\n\n\nclass EpisodeLengthFilter(grain.transforms.Filter):\n """"""\n A Grain Filter that keeps only episodes with sufficient length.\n """"""\n \n def __init__(self, seq_len: int, image_h: int, image_w: int, image_c: int):\n """"""Initializes the filter with sequence length requirements.""""""\n self.seq_len = seq_len\n self.image_h = image_h\n self.image_w = image_w\n self.image_c = image_c\n \n def filter(self, element: Any) -> bool:\n """"""\n Filters episodes based on length.\n \n Args:\n element: A dictionary representing one record from the DataSource.\n Expected to contain 'raw_video' (bytes) and 'sequence_length' (int)\n \n Returns:\n True if the episode has sufficient length, False otherwise.\n """"""\n assert isinstance(element, bytes)\n element = pickle.loads(element)\n \n current_episode_len = element[""sequence_length""]\n if current_episode_len < self.seq_len:\n print(f""Filtering out episode with length {current_episode_len}, which is ""\n f""shorter than the requested sequence length {self.seq_len}."")\n return False\n \n return True\n\n\nclass ProcessEpisodeAndSlice(grain.transforms.RandomMap):\n """"""\n A Grain Transformation that combines parsing, slicing, and normalizing.\n """"""\n\n def __init__(self, seq_len: int, image_h: int, image_w: int, image_c: int):\n """"""Initializes the transformation with processing parameters.""""""\n self.seq_len = seq_len\n self.image_h = image_h\n self.image_w = image_w\n self.image_c = image_c\n\n def random_map(self, element: dict, rng: np.random.Generator) -> Any:\n """"""\n Processes a single raw episode from the data source.\n\n Args:\n element: A dictionary representing one record from the DataSource.\n Expected to contain 'raw_video' (bytes) and 'sequence_length' (int)\n rng: A per-record random number generator provided by the Grain sampler.\n\n Returns:\n A processed video sequence as a 
NumPy array with shape\n (seq_len, height, width, channels) and dtype float32.\n """"""\n assert isinstance(element, bytes)\n element = pickle.loads(element)\n \n video_shape = (\n element[""sequence_length""],\n self.image_h,\n self.image_w,\n self.image_c,\n )\n episode_tensor = np.frombuffer(element[""raw_video""], dtype=np.uint8)\n episode_tensor = episode_tensor.reshape(video_shape)\n\n current_episode_len = episode_tensor.shape[0]\n if current_episode_len < self.seq_len:\n raise ValueError(f""Episode length {current_episode_len} is shorter than ""\n f""requested sequence length {self.seq_len}. This should ""\n f""have been filtered out."")\n \n max_start_idx = current_episode_len - self.seq_len\n \n start_idx = rng.integers(0, max_start_idx + 1)\n\n seq = episode_tensor[start_idx : start_idx + self.seq_len]\n\n processed_sequence = seq.astype(np.float32) / 255.0\n\n return processed_sequence\n\n\ndef get_dataloader(\n array_record_paths: list[str],\n seq_len: int,\n global_batch_size: int,\n image_h: int,\n image_w: int,\n image_c: int,\n num_workers: int = 1,\n prefetch_buffer_size: int = 1,\n seed: int = 42,\n):\n """"""\n Creates a data loading pipeline using Grain.\n """"""\n if not array_record_paths:\n raise ValueError(""array_record_paths list cannot be empty."")\n\n num_processes = jax.process_count()\n\n if global_batch_size % num_processes != 0:\n raise ValueError(\n f""Global batch size {global_batch_size} must be divisible by ""\n f""the number of JAX processes {num_processes} for proper sharding.""\n )\n per_process_batch_size = global_batch_size // num_processes\n\n source = grain.sources.ArrayRecordDataSource(array_record_paths)\n \n sampler = grain.samplers.IndexSampler(\n num_records=len(source),\n shard_options=grain.sharding.ShardByJaxProcess(drop_remainder=True),\n shuffle=True,\n num_epochs=None,\n seed=seed,\n )\n\n operations = [\n EpisodeLengthFilter(\n seq_len=seq_len, image_h=image_h, image_w=image_w, image_c=image_c\n ),\n 
ProcessEpisodeAndSlice(\n seq_len=seq_len, image_h=image_h, image_w=image_w, image_c=image_c\n ),\n grain.transforms.Batch(batch_size=per_process_batch_size, drop_remainder=True),\n ]\n\n read_options = grain.ReadOptions(\n prefetch_buffer_size=prefetch_buffer_size,\n num_threads=1,\n )\n dataloader = grain.DataLoader(\n data_source=source,\n sampler=sampler,\n operations=operations,\n worker_count=num_workers,\n worker_buffer_size=1,\n read_options=read_options,\n )\n\n return dataloader\n\n",python,tab
+2,66,"extension-output-pdoom-org.crowd-code-#1-crowd-code",0,0,"10:00:40 AM [info] Activating crowd-code\n10:00:40 AM [info] Recording started\n10:00:40 AM [info] Initializing git provider using file system watchers...\n10:00:40 AM [info] Git repository found\n10:00:40 AM [info] Git provider initialized successfully\n",Log,tab
+3,89,"extension-output-pdoom-org.crowd-code-#1-crowd-code",250,0,"10:00:40 AM [info] Initial git state: [object Object]\n",Log,content
+4,836,"utils/dataloader.py",0,0,"",python,tab
+5,7012,"train_dynamics.py",0,0,"from dataclasses import dataclass, field\nimport os\n\nimport einops\nfrom flax.training.train_state import TrainState\nfrom jax.sharding import Mesh, PartitionSpec, NamedSharding\nfrom jax.experimental.mesh_utils import create_device_mesh\nimport optax\nimport orbax.checkpoint as ocp\nimport numpy as np\nimport jax\nimport jax.numpy as jnp\nimport tyro\nimport wandb\nimport grain\n\nfrom genie import Genie, restore_genie_components\nfrom utils.dataloader import get_dataloader\nfrom utils.parameter_utils import count_parameters_by_component\n\n\n@dataclass\nclass Args:\n # Experiment\n num_steps: int = 200_000\n seed: int = 0\n seq_len: int = 16\n image_channels: int = 3\n image_height: int = 90\n image_width: int = 160\n data_dir: str = """"\n save_ckpt: bool = False\n restore_ckpt: bool = False\n # Optimization\n batch_size: int = 36\n min_lr: float = 0.0\n max_lr: float = 3e-5\n warmup_steps: int = 5000\n # Tokenizer\n tokenizer_dim: int = 512\n latent_patch_dim: int = 32\n num_patch_latents: int = 1024\n patch_size: int = 4\n tokenizer_num_blocks: int = 8\n tokenizer_num_heads: int = 8\n tokenizer_checkpoint: str = """"\n # LAM\n lam_dim: int = 512\n latent_action_dim: int = 32\n num_latent_actions: int = 6\n lam_patch_size: int = 16\n lam_num_blocks: int = 8\n lam_num_heads: int = 8\n lam_checkpoint: str = """"\n # Dynamics\n dyna_dim: int = 512\n dyna_num_blocks: int = 12\n dyna_num_heads: int = 8\n dropout: float = 0.0\n mask_limit: float = 0.5\n # Logging\n log: bool = False\n entity: str = """"\n project: str = """"\n name: str = ""train_dynamics""\n tags: list[str] = field(default_factory=lambda: [""dynamics""])\n log_interval: int = 5\n log_image_interval: int = 250\n ckpt_dir: str = """"\n log_checkpoint_interval: int = 25000\n log_checkpoint_keep_period: int = 20000\n log_gradients: bool = False\n\n\nargs = tyro.cli(Args)\n\n\ndef dynamics_loss_fn(params, state, inputs):\n """"""Compute masked dynamics loss""""""\n 
outputs = state.apply_fn(\n params,\n inputs,\n training=True,\n rngs={""params"": inputs[""rng""], ""dropout"": inputs[""dropout_rng""]},\n )\n mask = outputs[""mask""]\n ce_loss = optax.softmax_cross_entropy_with_integer_labels(\n outputs[""token_logits""], outputs[""video_tokens""]\n )\n ce_loss = (mask * ce_loss).sum() / mask.sum()\n acc = outputs[""token_logits""].argmax(-1) == outputs[""video_tokens""]\n acc = (mask * acc).sum() / mask.sum()\n select_probs = jax.nn.softmax(outputs[""token_logits""])\n metrics = dict(\n cross_entropy_loss=ce_loss,\n masked_token_accuracy=acc,\n select_logit=outputs[""token_logits""].max(-1).mean(),\n select_p=select_probs.max(-1).mean(),\n entropy=jax.scipy.special.entr(select_probs).sum(-1).mean(),\n )\n return ce_loss, (outputs[""recon""], metrics)\n\n\n@jax.jit\ndef train_step(state, inputs):\n """"""Update state and compute metrics""""""\n grad_fn = jax.value_and_grad(dynamics_loss_fn, has_aux=True, allow_int=True)\n (loss, (recon, metrics)), grads = grad_fn(state.params, state, inputs)\n state = state.apply_gradients(grads=grads)\n if args.log_gradients:\n metrics[""gradients_std/""] = jax.tree.map(\n lambda x: x.std(), grads[""params""][""dynamics""]\n )\n return state, loss, recon, metrics\n\n\nif __name__ == ""__main__"":\n jax.distributed.initialize()\n num_devices = jax.device_count()\n if num_devices == 0:\n raise ValueError(""No JAX devices found."")\n print(f""Running on {num_devices} devices."")\n\n if args.batch_size % num_devices != 0:\n raise ValueError(\n f""Global batch size {args.batch_size} must be divisible by ""\n f""number of devices {num_devices}.""\n )\n\n per_device_batch_size_for_init = args.batch_size // num_devices\n\n rng = jax.random.PRNGKey(args.seed)\n\n # --- Initialize model ---\n genie = Genie(\n # Tokenizer\n in_dim=args.image_channels,\n tokenizer_dim=args.tokenizer_dim,\n latent_patch_dim=args.latent_patch_dim,\n num_patch_latents=args.num_patch_latents,\n patch_size=args.patch_size,\n 
tokenizer_num_blocks=args.tokenizer_num_blocks,\n tokenizer_num_heads=args.tokenizer_num_heads,\n # LAM\n lam_dim=args.lam_dim,\n latent_action_dim=args.latent_action_dim,\n num_latent_actions=args.num_latent_actions,\n lam_patch_size=args.lam_patch_size,\n lam_num_blocks=args.lam_num_blocks,\n lam_num_heads=args.lam_num_heads,\n lam_co_train=not args.lam_checkpoint,\n # Dynamics\n dyna_dim=args.dyna_dim,\n dyna_num_blocks=args.dyna_num_blocks,\n dyna_num_heads=args.dyna_num_heads,\n dropout=args.dropout,\n mask_limit=args.mask_limit,\n )\n rng, _rng = jax.random.split(rng)\n image_shape = (args.image_height, args.image_width, args.image_channels)\n dummy_inputs = dict(\n videos=jnp.zeros(\n (per_device_batch_size_for_init, args.seq_len, *image_shape),\n dtype=jnp.float32,\n ),\n action=jnp.zeros(\n (per_device_batch_size_for_init, args.seq_len), dtype=jnp.float32\n ),\n mask_rng=_rng,\n )\n rng, _rng = jax.random.split(rng)\n init_params = genie.init(_rng, dummy_inputs)\n\n param_counts = count_parameters_by_component(init_params)\n\n if args.log and jax.process_index() == 0:\n wandb.init(\n entity=args.entity,\n project=args.project,\n name=args.name,\n tags=args.tags,\n group=""debug"",\n config=args,\n )\n wandb.config.update({""model_param_count"": param_counts})\n\n print(""Parameter counts:"")\n print(param_counts)\n\n # --- Initialize optimizer ---\n lr_schedule = optax.warmup_cosine_decay_schedule(\n args.min_lr, args.max_lr, args.warmup_steps, args.num_steps\n )\n tx = optax.adamw(learning_rate=lr_schedule, b1=0.9, b2=0.9, weight_decay=1e-4)\n train_state = TrainState.create(apply_fn=genie.apply, params=init_params, tx=tx)\n\n device_mesh_arr = create_device_mesh((num_devices,))\n mesh = Mesh(devices=device_mesh_arr, axis_names=(""data"",))\n\n replicated_sharding = NamedSharding(mesh, PartitionSpec())\n videos_sharding = NamedSharding(\n mesh, PartitionSpec(""data"", None, None, None, None)\n )\n train_state = jax.device_put(train_state, 
replicated_sharding)\n\n # --- Initialize checkpoint manager ---\n step = 0\n handler_registry = ocp.handlers.DefaultCheckpointHandlerRegistry()\n handler_registry.add('model_state', ocp.args.StandardSave, ocp.handlers.StandardCheckpointHandler)\n handler_registry.add('model_state', ocp.args.StandardRestore, ocp.handlers.StandardCheckpointHandler)\n handler_registry.add('dataloader_state', grain.checkpoint.CheckpointSave, grain.checkpoint.CheckpointHandler) # type: ignore\n handler_registry.add('dataloader_state', grain.checkpoint.CheckpointRestore, grain.checkpoint.CheckpointHandler) # type: ignore\n \n checkpoint_options = ocp.CheckpointManagerOptions(\n save_interval_steps=args.log_checkpoint_interval,\n max_to_keep=3,\n keep_period=args.log_checkpoint_keep_period,\n step_format_fixed_length=6,\n cleanup_tmp_directories=True,\n )\n \n checkpoint_manager = ocp.CheckpointManager(\n args.ckpt_dir,\n options=checkpoint_options,\n handler_registry=handler_registry,\n )\n\n # --- Create DataLoaderIterator from dataloader ---\n array_record_files = [\n os.path.join(args.data_dir, x)\n for x in os.listdir(args.data_dir)\n if x.endswith("".array_record"")\n ]\n grain_dataloader = get_dataloader(\n array_record_files,\n args.seq_len,\n # NOTE: We deliberately pass the global batch size\n # The dataloader shards the dataset across all processes\n args.batch_size,\n *image_shape,\n num_workers=8,\n prefetch_buffer_size=1,\n seed=args.seed,\n )\n initial_state = grain_dataloader._create_initial_state()\n grain_iterator = grain.DataLoaderIterator(grain_dataloader, initial_state)\n \n # --- Restore checkpoint ---\n if args.restore_ckpt:\n # Restore full dynamics model\n abstract_train_state = jax.tree_util.tree_map(ocp.utils.to_shape_dtype_struct, train_state)\n restored = checkpoint_manager.restore(\n checkpoint_manager.latest_step(),\n args=ocp.args.Composite(\n model_state=ocp.args.StandardRestore(abstract_train_state),\n 
dataloader_state=grain.checkpoint.CheckpointRestore(grain_iterator),\n )\n )\n train_state = restored[""model_state""]\n grain_iterator = restored[""dataloader_state""]\n step = checkpoint_manager.latest_step() or 0\n print(f""Restored dataloader and model state from step {step}"")\n else:\n # Restore from pre-trained tokenizer (and LAM)\n train_state = restore_genie_components(\n train_state, replicated_sharding, grain_iterator, dummy_inputs, rng, args\n )\n\n # --- TRAIN LOOP ---\n dataloader = (jax.make_array_from_process_local_data(videos_sharding, elem) for elem in grain_iterator) # type: ignore\n while step < args.num_steps:\n for videos in dataloader:\n # --- Train step ---\n rng, _rng, _rng_dropout, _rng_mask = jax.random.split(rng, 4)\n\n inputs = dict(\n videos=videos,\n rng=_rng,\n dropout_rng=_rng_dropout,\n mask_rng=_rng_mask,\n )\n train_state, loss, recon, metrics = train_step(train_state, inputs)\n print(f""Step {step}, loss: {loss}"")\n step += 1\n\n # --- Logging ---\n if args.log:\n if step % args.log_interval == 0 and jax.process_index() == 0:\n wandb.log(\n {\n ""loss"": loss,\n ""step"": step,\n **metrics,\n }\n )\n if step % args.log_image_interval == 0:\n gt_seq = inputs[""videos""][0]\n recon_seq = recon[0].clip(0, 1)\n comparison_seq = jnp.concatenate((gt_seq, recon_seq), axis=1)\n comparison_seq = einops.rearrange(\n comparison_seq * 255, ""t h w c -> h (t w) c""\n )\n if jax.process_index() == 0:\n log_images = dict(\n image=wandb.Image(np.asarray(gt_seq[args.seq_len - 1])),\n recon=wandb.Image(np.asarray(recon_seq[args.seq_len - 1])),\n true_vs_recon=wandb.Image(\n np.asarray(comparison_seq.astype(np.uint8))\n ),\n )\n wandb.log(log_images)\n # --- Checkpointing ---\n if args.save_ckpt and step % args.log_checkpoint_interval == 0:\n checkpoint_manager.save(\n step,\n args=ocp.args.Composite(\n model_state=ocp.args.StandardSave(train_state),\n dataloader_state=grain.checkpoint.CheckpointSave(grain_iterator),\n )\n )\n print(f""Saved 
checkpoint at step {step}"")\n if step >= args.num_steps:\n break\n\n checkpoint_manager.close()",python,tab
+6,7288,"train_dynamics.py",0,11594,"from dataclasses import dataclass, field\nimport os\n\nimport einops\nfrom flax.training.train_state import TrainState\nfrom jax.sharding import Mesh, PartitionSpec, NamedSharding\nfrom jax.experimental.mesh_utils import create_device_mesh\nimport optax\nimport orbax.checkpoint as ocp\nimport numpy as np\nimport jax\nimport jax.numpy as jnp\nimport tyro\nimport wandb\nimport grain\n\nfrom genie import Genie, restore_genie_components\nfrom utils.dataloader import get_dataloader\nfrom utils.parameter_utils import count_parameters_by_component\n\n\n@dataclass\nclass Args:\n # Experiment\n num_steps: int = 200_000\n seed: int = 0\n seq_len: int = 16\n image_channels: int = 3\n image_height: int = 90\n image_width: int = 160\n data_dir: str = """"\n save_ckpt: bool = False\n restore_ckpt: bool = False\n # Optimization\n batch_size: int = 36\n min_lr: float = 0.0\n max_lr: float = 3e-5\n warmup_steps: int = 5000\n # Tokenizer\n tokenizer_dim: int = 512\n latent_patch_dim: int = 32\n num_patch_latents: int = 1024\n patch_size: int = 4\n tokenizer_num_blocks: int = 8\n tokenizer_num_heads: int = 8\n tokenizer_checkpoint: str = """"\n # LAM\n lam_dim: int = 512\n latent_action_dim: int = 32\n num_latent_actions: int = 6\n lam_patch_size: int = 16\n lam_num_blocks: int = 8\n lam_num_heads: int = 8\n lam_checkpoint: str = """"\n # Dynamics\n dyna_dim: int = 512\n dyna_num_blocks: int = 12\n dyna_num_heads: int = 8\n dropout: float = 0.0\n mask_limit: float = 0.5\n # Logging\n log: bool = False\n entity: str = """"\n project: str = """"\n name: str = ""train_dynamics""\n tags: list[str] = field(default_factory=lambda: [""dynamics""])\n log_interval: int = 5\n log_image_interval: int = 250\n ckpt_dir: str = """"\n log_checkpoint_interval: int = 25000\n log_checkpoint_keep_period: int = 20000\n log_gradients: bool = False\n\n\nargs = tyro.cli(Args)\n\n\ndef dynamics_loss_fn(params, state, inputs):\n """"""Compute masked dynamics loss""""""\n 
outputs = state.apply_fn(\n params,\n inputs,\n training=True,\n rngs={""params"": inputs[""rng""], ""dropout"": inputs[""dropout_rng""]},\n )\n mask = outputs[""mask""]\n ce_loss = optax.softmax_cross_entropy_with_integer_labels(\n outputs[""token_logits""], outputs[""video_tokens""]\n )\n ce_loss = (mask * ce_loss).sum() / mask.sum()\n acc = outputs[""token_logits""].argmax(-1) == outputs[""video_tokens""]\n acc = (mask * acc).sum() / mask.sum()\n select_probs = jax.nn.softmax(outputs[""token_logits""])\n metrics = dict(\n cross_entropy_loss=ce_loss,\n masked_token_accuracy=acc,\n select_logit=outputs[""token_logits""].max(-1).mean(),\n select_p=select_probs.max(-1).mean(),\n entropy=jax.scipy.special.entr(select_probs).sum(-1).mean(),\n )\n return ce_loss, (outputs[""recon""], metrics)\n\n\n@jax.jit\ndef train_step(state, inputs):\n """"""Update state and compute metrics""""""\n grad_fn = jax.value_and_grad(dynamics_loss_fn, has_aux=True, allow_int=True)\n (loss, (recon, metrics)), grads = grad_fn(state.params, state, inputs)\n state = state.apply_gradients(grads=grads)\n if args.log_gradients:\n metrics[""gradients_std/""] = jax.tree.map(\n lambda x: x.std(), grads[""params""][""dynamics""]\n )\n return state, loss, recon, metrics\n\n\nif __name__ == ""__main__"":\n jax.distributed.initialize()\n num_devices = jax.device_count()\n if num_devices == 0:\n raise ValueError(""No JAX devices found."")\n print(f""Running on {num_devices} devices."")\n\n if args.batch_size % num_devices != 0:\n raise ValueError(\n f""Global batch size {args.batch_size} must be divisible by ""\n f""number of devices {num_devices}.""\n )\n\n per_device_batch_size_for_init = args.batch_size // num_devices\n\n rng = jax.random.PRNGKey(args.seed)\n\n # --- Initialize model ---\n genie = Genie(\n # Tokenizer\n in_dim=args.image_channels,\n tokenizer_dim=args.tokenizer_dim,\n latent_patch_dim=args.latent_patch_dim,\n num_patch_latents=args.num_patch_latents,\n patch_size=args.patch_size,\n 
tokenizer_num_blocks=args.tokenizer_num_blocks,\n tokenizer_num_heads=args.tokenizer_num_heads,\n # LAM\n lam_dim=args.lam_dim,\n latent_action_dim=args.latent_action_dim,\n num_latent_actions=args.num_latent_actions,\n lam_patch_size=args.lam_patch_size,\n lam_num_blocks=args.lam_num_blocks,\n lam_num_heads=args.lam_num_heads,\n lam_co_train=not args.lam_checkpoint,\n # Dynamics\n dyna_dim=args.dyna_dim,\n dyna_num_blocks=args.dyna_num_blocks,\n dyna_num_heads=args.dyna_num_heads,\n dropout=args.dropout,\n mask_limit=args.mask_limit,\n )\n rng, _rng = jax.random.split(rng)\n image_shape = (args.image_height, args.image_width, args.image_channels)\n dummy_inputs = dict(\n videos=jnp.zeros(\n (per_device_batch_size_for_init, args.seq_len, *image_shape),\n dtype=jnp.float32,\n ),\n action=jnp.zeros(\n (per_device_batch_size_for_init, args.seq_len), dtype=jnp.float32\n ),\n mask_rng=_rng,\n )\n rng, _rng = jax.random.split(rng)\n init_params = genie.init(_rng, dummy_inputs)\n\n param_counts = count_parameters_by_component(init_params)\n\n if args.log and jax.process_index() == 0:\n wandb.init(\n entity=args.entity,\n project=args.project,\n name=args.name,\n tags=args.tags,\n group=""debug"",\n config=args,\n )\n wandb.config.update({""model_param_count"": param_counts})\n\n print(""Parameter counts:"")\n print(param_counts)\n\n # --- Initialize optimizer ---\n lr_schedule = optax.warmup_cosine_decay_schedule(\n args.min_lr, args.max_lr, args.warmup_steps, args.num_steps\n )\n tx = optax.adamw(learning_rate=lr_schedule, b1=0.9, b2=0.9, weight_decay=1e-4)\n train_state = TrainState.create(apply_fn=genie.apply, params=init_params, tx=tx)\n\n device_mesh_arr = create_device_mesh((num_devices,))\n mesh = Mesh(devices=device_mesh_arr, axis_names=(""data"",))\n\n replicated_sharding = NamedSharding(mesh, PartitionSpec())\n videos_sharding = NamedSharding(\n mesh, PartitionSpec(""data"", None, None, None, None)\n )\n train_state = jax.device_put(train_state, 
replicated_sharding)\n\n # --- Initialize checkpoint manager ---\n step = 0\n handler_registry = ocp.handlers.DefaultCheckpointHandlerRegistry()\n handler_registry.add('model_state', ocp.args.StandardSave, ocp.handlers.StandardCheckpointHandler)\n handler_registry.add('model_state', ocp.args.StandardRestore, ocp.handlers.StandardCheckpointHandler)\n handler_registry.add('dataloader_state', grain.checkpoint.CheckpointSave, grain.checkpoint.CheckpointHandler) # type: ignore\n handler_registry.add('dataloader_state', grain.checkpoint.CheckpointRestore, grain.checkpoint.CheckpointHandler) # type: ignore\n \n checkpoint_options = ocp.CheckpointManagerOptions(\n save_interval_steps=args.log_checkpoint_interval,\n max_to_keep=3,\n keep_period=args.log_checkpoint_keep_period,\n step_format_fixed_length=6,\n cleanup_tmp_directories=True,\n )\n \n checkpoint_manager = ocp.CheckpointManager(\n args.ckpt_dir,\n options=checkpoint_options,\n handler_registry=handler_registry,\n )\n\n # --- Create DataLoaderIterator from dataloader ---\n array_record_files = [\n os.path.join(args.data_dir, x)\n for x in os.listdir(args.data_dir)\n if x.endswith("".array_record"")\n ]\n grain_dataloader = get_dataloader(\n array_record_files,\n args.seq_len,\n # NOTE: We deliberately pass the global batch size\n # The dataloader shards the dataset across all processes\n args.batch_size,\n *image_shape,\n num_workers=8,\n prefetch_buffer_size=1,\n seed=args.seed,\n )\n initial_state = grain_dataloader._create_initial_state()\n grain_iterator = grain.DataLoaderIterator(grain_dataloader, initial_state)\n \n # --- Restore checkpoint ---\n if args.restore_ckpt:\n # Restore full dynamics model\n abstract_train_state = jax.tree_util.tree_map(ocp.utils.to_shape_dtype_struct, train_state)\n restored = checkpoint_manager.restore(\n checkpoint_manager.latest_step(),\n args=ocp.args.Composite(\n model_state=ocp.args.StandardRestore(abstract_train_state),\n 
dataloader_state=grain.checkpoint.CheckpointRestore(grain_iterator),\n )\n )\n train_state = restored[""model_state""]\n grain_iterator = restored[""dataloader_state""]\n step = checkpoint_manager.latest_step() or 0\n print(f""Restored dataloader and model state from step {step}"")\n else:\n # Restore from pre-trained tokenizer (and LAM)\n train_state = restore_genie_components(\n train_state, replicated_sharding, grain_iterator, dummy_inputs, rng, args\n )\n\n # --- TRAIN LOOP ---\n dataloader = (jax.make_array_from_process_local_data(videos_sharding, elem) for elem in grain_iterator) # type: ignore\n while step < args.num_steps:\n for videos in dataloader:\n # --- Train step ---\n rng, _rng, _rng_dropout, _rng_mask = jax.random.split(rng, 4)\n\n inputs = dict(\n videos=videos,\n rng=_rng,\n dropout_rng=_rng_dropout,\n mask_rng=_rng_mask,\n )\n train_state, loss, recon, metrics = train_step(train_state, inputs)\n print(f""Step {step}, loss: {loss}"")\n step += 1\n\n # --- Logging ---\n if args.log:\n if step % args.log_interval == 0 and jax.process_index() == 0:\n wandb.log(\n {\n ""loss"": loss,\n ""step"": step,\n **metrics,\n }\n )\n if step % args.log_image_interval == 0:\n gt_seq = inputs[""videos""][0]\n recon_seq = recon[0].clip(0, 1)\n comparison_seq = jnp.concatenate((gt_seq, recon_seq), axis=1)\n comparison_seq = einops.rearrange(\n comparison_seq * 255, ""t h w c -> h (t w) c""\n )\n if jax.process_index() == 0:\n log_images = dict(\n image=wandb.Image(np.asarray(gt_seq[args.seq_len - 1])),\n recon=wandb.Image(np.asarray(recon_seq[args.seq_len - 1])),\n true_vs_recon=wandb.Image(\n np.asarray(comparison_seq.astype(np.uint8))\n ),\n )\n wandb.log(log_images)\n # --- Checkpointing ---\n if args.save_ckpt and step % args.log_checkpoint_interval == 0:\n checkpoint_manager.save(\n step,\n args=ocp.args.Composite(\n model_state=ocp.args.StandardSave(train_state),\n dataloader_state=grain.checkpoint.CheckpointSave(grain_iterator),\n )\n )\n print(f""Saved 
checkpoint at step {step}"")\n if step >= args.num_steps:\n break\n\n checkpoint_manager.close()",python,selection_command
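The file above builds its learning-rate schedule with `optax.warmup_cosine_decay_schedule(args.min_lr, args.max_lr, args.warmup_steps, args.num_steps)`. A minimal pure-Python sketch of that schedule's shape (my approximation of the linear-warmup-then-cosine-decay curve, not optax's implementation):

```python
import math

def warmup_cosine_decay(step: int, init_value: float, peak_value: float,
                        warmup_steps: int, decay_steps: int,
                        end_value: float = 0.0) -> float:
    """Linear warmup from init_value to peak_value, then cosine decay to end_value."""
    if step < warmup_steps:
        return init_value + (peak_value - init_value) * step / warmup_steps
    frac = min((step - warmup_steps) / max(decay_steps - warmup_steps, 1), 1.0)
    return end_value + 0.5 * (peak_value - end_value) * (1.0 + math.cos(math.pi * frac))
```

With the script's defaults (`min_lr=0.0`, `max_lr=3e-5`, `warmup_steps=5000`, `num_steps=200_000`), the rate ramps to its peak at step 5000 and decays back toward zero by the final step.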
+7,7469,"train_dynamics.py",11575,0,"",python,selection_command
+8,848004,"train_dynamics.py",0,0,"",python,selection_command
+9,849119,"train_dynamics.py",41,0,"",python,selection_command
+10,849369,"train_dynamics.py",51,0,"",python,selection_command
+11,849399,"train_dynamics.py",52,0,"",python,selection_command
+12,849429,"train_dynamics.py",66,0,"",python,selection_command
+13,849462,"train_dynamics.py",115,0,"",python,selection_command
+14,849493,"train_dynamics.py",175,0,"",python,selection_command
+15,860985,"train_dynamics.py",234,0,"",python,selection_command
+16,861236,"train_dynamics.py",247,0,"",python,selection_command
+17,861265,"train_dynamics.py",278,0,"",python,selection_command
+18,861297,"train_dynamics.py",297,0,"",python,selection_command
+19,861331,"train_dynamics.py",308,0,"",python,selection_command
+20,861364,"train_dynamics.py",332,0,"",python,selection_command
+21,861398,"train_dynamics.py",344,0,"",python,selection_command
+22,861431,"train_dynamics.py",357,0,"",python,selection_command
+23,861667,"train_dynamics.py",370,0,"",python,selection_command
+24,892902,"train_dynamics.py",5933,0,"",python,selection_command
+25,893668,"train_dynamics.py",5938,0,"",python,selection_command
+26,893916,"train_dynamics.py",5939,0,"",python,selection_command
+27,893943,"train_dynamics.py",5952,0,"",python,selection_command
+28,893980,"train_dynamics.py",5953,0,"",python,selection_command
+29,894012,"train_dynamics.py",5964,0,"",python,selection_command
+30,894044,"train_dynamics.py",5966,0,"",python,selection_command
+31,894249,"train_dynamics.py",5968,0,"",python,selection_command
+32,894500,"train_dynamics.py",5969,0,"",python,selection_command
+33,894532,"train_dynamics.py",5970,0,"",python,selection_command
+34,894564,"train_dynamics.py",5971,0,"",python,selection_command
+35,894597,"train_dynamics.py",5972,0,"",python,selection_command
+36,894902,"train_dynamics.py",6000,0,"",python,selection_command
+37,895184,"train_dynamics.py",5999,0,"",python,selection_command
+38,895876,"train_dynamics.py",5999,0,",",python,content
+39,895884,"train_dynamics.py",6000,0,"",python,selection_keyboard
+40,895920,"train_dynamics.py",6000,0," ",python,content
+41,895922,"train_dynamics.py",6001,0,"",python,selection_keyboard
+42,897300,"train_dynamics.py",6001,0,"m",python,content
+43,897302,"train_dynamics.py",6002,0,"",python,selection_keyboard
+44,897448,"train_dynamics.py",6002,0,"u",python,content
+45,897451,"train_dynamics.py",6003,0,"",python,selection_keyboard
+46,897686,"train_dynamics.py",6003,0,"_",python,content
+47,897688,"train_dynamics.py",6004,0,"",python,selection_keyboard
+48,897864,"train_dynamics.py",6004,0,"d",python,content
+49,897865,"train_dynamics.py",6005,0,"",python,selection_keyboard
+50,898022,"train_dynamics.py",6005,0,"t",python,content
+51,898025,"train_dynamics.py",6006,0,"",python,selection_keyboard
+52,898159,"train_dynamics.py",6006,0,"y",python,content
+53,898161,"train_dynamics.py",6007,0,"",python,selection_keyboard
+54,898244,"train_dynamics.py",6007,0,"p",python,content
+55,898246,"train_dynamics.py",6008,0,"",python,selection_keyboard
+56,898313,"train_dynamics.py",6008,0,"e",python,content
+57,898315,"train_dynamics.py",6009,0,"",python,selection_keyboard
+58,898576,"train_dynamics.py",6009,0,"=",python,content
+59,898581,"train_dynamics.py",6010,0,"",python,selection_keyboard
+60,900283,"train_dynamics.py",6010,0,"j",python,content
+61,900285,"train_dynamics.py",6011,0,"",python,selection_keyboard
+62,900434,"train_dynamics.py",6011,0,"n",python,content
+63,900436,"train_dynamics.py",6012,0,"",python,selection_keyboard
+64,900498,"train_dynamics.py",6012,0,"p",python,content
+65,900500,"train_dynamics.py",6013,0,"",python,selection_keyboard
+66,900711,"train_dynamics.py",6013,0,".",python,content
+67,900712,"train_dynamics.py",6014,0,"",python,selection_keyboard
+68,900999,"train_dynamics.py",6014,0,"b",python,content
+69,901000,"train_dynamics.py",6015,0,"",python,selection_keyboard
+70,901326,"train_dynamics.py",6015,0,"f",python,content
+71,901329,"train_dynamics.py",6016,0,"",python,selection_keyboard
+72,901432,"train_dynamics.py",6016,0,"l",python,content
+73,901434,"train_dynamics.py",6017,0,"",python,selection_keyboard
+74,901605,"train_dynamics.py",6017,0,"o",python,content
+75,901608,"train_dynamics.py",6018,0,"",python,selection_keyboard
+76,901659,"train_dynamics.py",6018,0,"a",python,content
+77,901661,"train_dynamics.py",6019,0,"",python,selection_keyboard
+78,901799,"train_dynamics.py",6019,0,"t",python,content
+79,901801,"train_dynamics.py",6020,0,"",python,selection_keyboard
+80,902171,"train_dynamics.py",6020,0,"1",python,content
+81,902172,"train_dynamics.py",6021,0,"",python,selection_keyboard
+82,902344,"train_dynamics.py",6021,0,"6",python,content
+83,902346,"train_dynamics.py",6022,0,"",python,selection_keyboard
+84,902460,"train_dynamics.py",6021,0,"",python,selection_command
+85,902768,"train_dynamics.py",5922,0,"",python,selection_command
+86,902949,"train_dynamics.py",5925,0,"",python,selection_command
+87,904365,"train_dynamics.py",5927,0,"",python,selection_command
+88,904531,"train_dynamics.py",5932,0,"",python,selection_command
+89,905418,"train_dynamics.py",5933,0,"",python,selection_command
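Records like `+38,895876,"train_dynamics.py",5999,0,","` describe splice edits: a character offset, a deleted length, and inserted text. Replaying the keystroke records above shows their net effect on the `optax.adamw(...)` line of the snapshot — a minimal sketch, assuming that record layout:

```python
def apply_edit(text: str, offset: int, deleted: int, inserted: str) -> str:
    """Splice one editor record into the buffer: delete `deleted` chars at `offset`, then insert."""
    return text[:offset] + inserted + text[offset + deleted:]

# The characters typed in the records above, in order; each keystroke
# lands just before the closing ')' of the adamw call.
line = "tx = optax.adamw(learning_rate=lr_schedule, b1=0.9, b2=0.9, weight_decay=1e-4)"
for ch in ", mu_dtype=jnp.bfloat16":
    line = apply_edit(line, len(line) - 1, 0, ch)
print(line)  # the call now passes mu_dtype=jnp.bfloat16
```

So the session's edit keeps the first-moment (mu) accumulator in bfloat16, a memory-saving option `optax.adamw` exposes via `mu_dtype`.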
+90,910566,".venv/lib/python3.10/site-packages/optax/_src/alias.py",0,0,"# Copyright 2019 DeepMind Technologies Limited. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the ""License"");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an ""AS IS"" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n""""""Aliases for popular optimizers.""""""\n\nfrom collections.abc import Callable\nimport functools\nfrom typing import Any, Optional, Union\nimport warnings\n\nimport jax\nimport jax.numpy as jnp\nfrom optax._src import base\nfrom optax._src import clipping\nfrom optax._src import combine\nfrom optax._src import factorized\nfrom optax._src import linesearch as _linesearch\nfrom optax._src import transform\nfrom optax._src import utils\nfrom optax._src import wrappers\n\n\nMaskOrFn = Optional[Union[Any, Callable[[base.Params], Any]]]\n\n\ndef adabelief(\n learning_rate: base.ScalarOrSchedule,\n b1: float = 0.9,\n b2: float = 0.999,\n eps: float = 1e-16,\n eps_root: float = 1e-16,\n *,\n nesterov: bool = False,\n) -> base.GradientTransformationExtraArgs:\n r""""""The AdaBelief optimizer.\n\n AdaBelief is an adaptive learning rate optimizer that focuses on fast\n convergence, generalization, and stability. 
It adapts the step size depending\n on its ""belief"" in the gradient direction — the optimizer adaptively scales\n the step size by the difference between the predicted and observed gradients.\n AdaBelief is a modified version of :func:`optax.adam` and contains the same\n number of parameters.\n\n Let :math:`\alpha_t` represent the learning rate and :math:`\beta_1, \beta_2`,\n :math:`\varepsilon`, :math:`\bar{\varepsilon}` represent the arguments\n ``b1``, ``b2``, ``eps`` and ``eps_root`` respectively. The learning rate is\n indexed by :math:`t` since the learning rate may also be provided by a\n schedule function.\n\n The ``init`` function of this optimizer initializes an internal state\n :math:`S_0 := (m_0, s_0) = (0, 0)`, representing initial estimates for the\n first and second moments. In practice these values are stored as pytrees\n containing all zeros, with the same shape as the model updates.\n At step :math:`t`, the ``update`` function of this optimizer takes as\n arguments the incoming gradients :math:`g_t` and optimizer state :math:`S_t`\n and computes updates :math:`u_t` and new state :math:`S_{t+1}`. Thus, for\n :math:`t > 0`, we have,\n\n .. math::\n\n \begin{align*}\n m_t &\leftarrow \beta_1 \cdot m_{t-1} + (1-\beta_1) \cdot g_t \\\n s_t &\leftarrow \beta_2 \cdot s_{t-1} + (1-\beta_2) \cdot (g_t - m_t)^2\n + \bar{\varepsilon} \\\n \hat{m}_t &\leftarrow m_t / {(1-\beta_1^t)} \\\n \hat{s}_t &\leftarrow s_t / {(1-\beta_2^t)} \\\n u_t &\leftarrow -\alpha_t \cdot \hat{m}_t / \left(\sqrt{\hat{s}_t}\n + \varepsilon \right) \\\n S_t &\leftarrow (m_t, s_t).\n \end{align*}\n\n With the keyword argument `nesterov=True`, the optimizer uses Nesterov\n momentum, replacing the above :math:`\hat{m}_t` with\n\n .. 
math::\n \hat{m}_t \leftarrow\n \beta_1 m_t / {(1-\beta_1^{t+1})} + (1 - \beta_1) g_t / {(1-\beta_1^t)}.\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n b1: Exponential decay rate to track the first moment of past gradients.\n b2: Exponential decay rate to track the second moment of past gradients.\n eps: Term added to the denominator to improve numerical stability.\n eps_root: Term added to the second moment of the prediction error to\n improve numerical stability. If backpropagating gradients through the\n gradient transformation (e.g. for meta-learning), this must be non-zero.\n nesterov: Whether to use Nesterov momentum.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.adabelief(learning_rate=0.003)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.40E+01\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n Objective function: 1.38E+01\n Objective function: 1.38E+01\n\n References:\n Zhuang, `AdaBelief Optimizer: Adapting Stepsizes by the Belief in Observed\n Gradients `_, 2020\n\n .. 
note::\n The default epsilon values in the paper are ``eps=1e-8``, ``eps_root=0.``.\n """"""\n return combine.chain(\n transform.scale_by_belief(\n b1=b1,\n b2=b2,\n eps=eps,\n eps_root=eps_root,\n nesterov=nesterov,\n ),\n transform.scale_by_learning_rate(learning_rate),\n )\n\n\ndef adadelta(\n learning_rate: Optional[base.ScalarOrSchedule] = None,\n rho: float = 0.9,\n eps: float = 1e-6,\n weight_decay: float = 0.0,\n weight_decay_mask: MaskOrFn = None,\n) -> base.GradientTransformationExtraArgs:\n r""""""The Adadelta optimizer.\n\n Adadelta is a stochastic gradient descent method that adapts learning rates\n based on a moving window of gradient updates. Adadelta is a modification of\n Adagrad.\n It addresses the diminishing learning rates problem in Adagrad by maintaining running averages of squared\n gradients.\n\n The weight update :math:`\Delta w_t` for this optimizer is given as follows:\n\n .. math::\n \begin{align*}\n\n &E[g^2]_t = \rho \cdot E[g^2]_{t-1} + (1-\rho) \cdot g_t^2 \\\n &\Delta w_t = -\frac{\sqrt{E[\Delta w^2]_{t-1} + \epsilon}}{\sqrt{E[g^2]_t + \epsilon}} \cdot g_t\n\n \end{align*}\n\n\n\n where:\n - :math:`g_t` is the gradient at time step :math:`t`,\n - :math:`E[g^2]_t` is the running average of squared gradients,\n - :math:`E[\Delta w^2]_t` is the running average of squared parameter updates,\n - :math:`\rho` is the decay rate (typically 0.9),\n - :math:`\epsilon` is a small constant for numerical stability.\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n rho: A coefficient used for computing a running average of squared\n gradients.\n eps: Term added to the denominator to improve numerical stability.\n weight_decay: Optional rate at which to decay weights.\n weight_decay_mask: A tree with same structure as (or a prefix of) the params\n PyTree, or a Callable that returns such a pytree given the params/updates.\n The leaves should 
be booleans, `True` for leaves/subtrees you want to\n apply the transformation to, and `False` for those you want to skip.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> f = lambda x: jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.adadelta(learning_rate=10.)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.36E+01\n Objective function: 1.32E+01\n Objective function: 1.29E+01\n Objective function: 1.25E+01\n Objective function: 1.21E+01\n\n References:\n Zeiler, `Adadelta: An Adaptive Learning Rate Optimizer\n `_, 2012\n """"""\n return combine.chain(\n transform.add_decayed_weights(weight_decay, mask=weight_decay_mask),\n transform.scale_by_adadelta(rho=rho, eps=eps),\n transform.scale_by_learning_rate(learning_rate),\n )\n\n\ndef adafactor(\n learning_rate: Optional[base.ScalarOrSchedule] = None,\n min_dim_size_to_factor: int = 128,\n decay_rate: float = 0.8,\n decay_offset: int = 0,\n multiply_by_parameter_scale: float = True,\n clipping_threshold: Optional[float] = 1.0,\n momentum: Optional[float] = None,\n dtype_momentum: Any = jnp.float32,\n weight_decay_rate: Optional[float] = None,\n eps: float = 1e-30,\n factored: bool = True,\n weight_decay_mask: MaskOrFn = None,\n) -> base.GradientTransformationExtraArgs:\n """"""The Adafactor optimizer.\n\n Adafactor is an adaptive learning rate optimizer that focuses on fast\n training of large scale neural networks. 
It saves memory by using a factored\n estimate of the second order moments used to scale gradients.\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n Note that the natural scale for Adafactor's LR is markedly different\n from Adam, one doesn't use the 1/sqrt(hidden) correction for this optim\n with attention-based models.\n min_dim_size_to_factor: Only factor the statistics if two array dimensions\n have at least this size.\n decay_rate: Controls second-moment exponential decay schedule.\n decay_offset: For fine-tuning, one may set this to the starting step\n number of the fine-tuning phase.\n multiply_by_parameter_scale: If True, then scale learning_rate by\n parameter norm. If False, provided learning_rate is absolute step size.\n clipping_threshold: Optional clipping threshold. Must be >= 1. If None,\n clipping is disabled.\n momentum: Optional value between 0 and 1, enables momentum and uses extra\n memory if non-None! None by default.\n dtype_momentum: Data type of momentum buffers.\n weight_decay_rate: Optional rate at which to decay weights.\n eps: Regularization constant for root mean squared gradient.\n factored: Whether to use factored second-moment estimates.\n weight_decay_mask: A tree with same structure as (or a prefix of) the\n params PyTree, or a Callable that returns such a pytree given the\n params/updates. 
The leaves should be booleans, `True` for\n leaves/subtrees you want to apply the transformation to, and `False` for\n those you want to skip.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.adafactor(learning_rate=0.003)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.39E+01\n Objective function: 1.38E+01\n Objective function: 1.38E+01\n Objective function: 1.37E+01\n Objective function: 1.36E+01\n\n References:\n Shazeer et al, `Adafactor: Adaptive Learning Rates with Sublinear Memory\n Cost `_, 2018\n """"""\n # The core of the algorithm is a procedure for rescaling gradients\n # by a factored estimate of the root mean squared gradients.\n # This reduces memory compared to algorithms such as Adam or RmsProp,\n # by not having to hold a separate estimate for each weight.\n tx = [\n factorized.scale_by_factored_rms(\n factored, decay_rate, decay_offset, min_dim_size_to_factor, eps\n )\n ]\n # This basic rescaling is typically combined with one or more of the following\n # transformation (all can be disabled via adafactor's constructor args).\n if clipping_threshold is not None:\n tx.append(clipping.clip_by_block_rms(clipping_threshold))\n if learning_rate is not None:\n tx.append(transform.scale_by_learning_rate(learning_rate, flip_sign=False))\n if multiply_by_parameter_scale:\n tx.append(transform.scale_by_param_block_rms())\n if momentum is not None:\n tx.append(\n transform.ema(momentum, debias=False, 
accumulator_dtype=dtype_momentum)\n )\n if weight_decay_rate is not None:\n tx.append(\n transform.add_decayed_weights(weight_decay_rate, mask=weight_decay_mask)\n )\n # In gradient ""descent"" we follow the negative gradient.\n tx.append(transform.scale(-1))\n return combine.chain(*tx)\n\n\ndef adagrad(\n learning_rate: base.ScalarOrSchedule,\n initial_accumulator_value: float = 0.1,\n eps: float = 1e-7,\n) -> base.GradientTransformationExtraArgs:\n r""""""The Adagrad optimizer.\n\n AdaGrad is a sub-gradient algorithm for stochastic optimization that adapts\n the learning rate individually for each feature based on its gradient history.\n\n The updated parameters adopt the form:\n\n .. math::\n\n w_{t+1}^{(i)} = w_{t}^{(i)} - \eta \frac{g_{t}^{(i)}}\n {\sqrt{\sum_{\tau=1}^{t} (g_{\tau}^{(i)})^2 + \epsilon}}\n\n where:\n - :math:`w_t^{(i)}` is the parameter :math:`i` at time step :math:`t`,\n - :math:`\eta` is the learning rate,\n - :math:`g_t^{(i)}` is the gradient of parameter :math:`i` at time step\n :math:`t`,\n - :math:`\epsilon` is a small constant to ensure numerical stability.\n\n Defining :math:`G = \sum_{t=1}^\tau g_t g_t^\top`, the update can be\n written as\n\n .. 
math::\n\n w_{t+1} = w_{t} - \eta \cdot \text{diag}(G + \epsilon I)^{-1/2}\n \cdot g_t\n\n where :math:`\text{diag} (G) = (G_{ii})_{i=1}^p` is the vector of diagonal\n entries of :math:`G \in \mathbb{R}^p` and :math:`I` is the identity matrix\n in :math:`\mathbb{R}^p`.\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n initial_accumulator_value: Initial value for the accumulator.\n eps: A small constant applied to denominator inside of the square root (as\n in RMSProp) to avoid dividing by zero when rescaling.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.adagrad(learning_rate=1.0)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 5.01E+00\n Objective function: 2.40E+00\n Objective function: 1.25E+00\n Objective function: 6.86E-01\n Objective function: 3.85E-01\n\n References:\n Duchi et al, `Adaptive Subgradient Methods for Online Learning and\n Stochastic Optimization `_,\n 2011\n\n .. 
warning::\n Adagrad's main limit is the monotonic accumulation of squared\n gradients in the denominator: since all terms are >0, the sum keeps growing\n during training and the learning rate eventually becomes vanishingly small.\n """"""\n return combine.chain(\n transform.scale_by_rss(\n initial_accumulator_value=initial_accumulator_value, eps=eps\n ),\n transform.scale_by_learning_rate(learning_rate),\n )\n\n\ndef adam(\n learning_rate: base.ScalarOrSchedule,\n b1: float = 0.9,\n b2: float = 0.999,\n eps: float = 1e-8,\n eps_root: float = 0.0,\n mu_dtype: Optional[Any] = None,\n *,\n nesterov: bool = False,\n) -> base.GradientTransformationExtraArgs:\n r""""""The Adam optimizer.\n\n Adam is an SGD variant with gradient scaling adaptation. The scaling\n used for each parameter is computed from estimates of first and second-order\n moments of the gradients (using suitable exponential moving averages).\n\n Let :math:`\alpha_t` represent the learning rate and :math:`\beta_1, \beta_2`,\n :math:`\varepsilon`, :math:`\bar{\varepsilon}` represent the arguments\n ``b1``, ``b2``, ``eps`` and ``eps_root`` respectively. The learning rate is\n indexed by :math:`t` since the learning rate may also be provided by a\n schedule function.\n\n The ``init`` function of this optimizer initializes an internal state\n :math:`S_0 := (m_0, v_0) = (0, 0)`, representing initial estimates for the\n first and second moments. In practice these values are stored as pytrees\n containing all zeros, with the same shape as the model updates.\n At step :math:`t`, the ``update`` function of this optimizer takes as\n arguments the incoming gradients :math:`g_t` and optimizer state :math:`S_t`\n and computes updates :math:`u_t` and new state :math:`S_{t+1}`. Thus, for\n :math:`t > 0`, we have,\n\n .. 
math::\n\n \begin{align*}\n m_t &\leftarrow \beta_1 \cdot m_{t-1} + (1-\beta_1) \cdot g_t \\\n v_t &\leftarrow \beta_2 \cdot v_{t-1} + (1-\beta_2) \cdot {g_t}^2 \\\n \hat{m}_t &\leftarrow m_t / {(1-\beta_1^t)} \\\n \hat{v}_t &\leftarrow v_t / {(1-\beta_2^t)} \\\n u_t &\leftarrow -\alpha_t \cdot \hat{m}_t / \left({\sqrt{\hat{v}_t +\n \bar{\varepsilon}} + \varepsilon} \right)\\\n S_t &\leftarrow (m_t, v_t).\n \end{align*}\n\n With the keyword argument `nesterov=True`, the optimizer uses Nesterov\n momentum, replacing the above :math:`\hat{m}_t` with\n\n .. math::\n \hat{m}_t \leftarrow\n \beta_1 m_t / {(1-\beta_1^{t+1})} + (1 - \beta_1) g_t / {(1-\beta_1^t)}.\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n b1: Exponential decay rate to track the first moment of past gradients.\n b2: Exponential decay rate to track the second moment of past gradients.\n eps: A small constant applied to denominator outside of the square root\n (as in the Adam paper) to avoid dividing by zero when rescaling.\n eps_root: A small constant applied to denominator inside the square root (as\n in RMSProp), to avoid dividing by zero when rescaling. This is needed for\n example when computing (meta-)gradients through Adam.\n mu_dtype: Optional `dtype` to be used for the first order accumulator; if\n `None` then the `dtype` is inferred from `params` and `updates`.\n nesterov: Whether to use Nesterov momentum. 
The solver with\n nesterov=True is equivalent to the :func:`optax.nadam` optimizer, and\n described in [Dozat 2016].\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.adam(learning_rate=0.003)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.40E+01\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n Objective function: 1.38E+01\n\n References:\n Kingma et al, `Adam: A Method for Stochastic Optimization\n `_, 2014\n\n Dozat, `Incorporating Nesterov Momentum into Adam\n `_, 2016\n\n .. warning::\n PyTorch and optax's implementation follow Algorithm 1 of [Kingma et al.\n 2014]. Note that TensorFlow used instead the formulation just before Section\n 2.1 of the paper. See https://github.com/deepmind/optax/issues/571 for more\n detail.\n\n .. seealso:: :func:`optax.nadam`, :func:`optax.adamw`.\n """"""\n return combine.chain(\n transform.scale_by_adam(\n b1=b1,\n b2=b2,\n eps=eps,\n eps_root=eps_root,\n mu_dtype=mu_dtype,\n nesterov=nesterov,\n ),\n transform.scale_by_learning_rate(learning_rate),\n )\n\n\nnadam = functools.partial(adam, nesterov=True)\nnadam.__doc__ = r""""""The NAdam optimizer.\n\n Nadam is a variant of :func:`optax.adam` with Nesterov's momentum. The update\n rule of this solver is as follows:\n\n .. 
math::\n\n \begin{align*}\n m_t &\leftarrow \beta_1 \cdot m_{t-1} + (1-\beta_1) \cdot g_t \\\n v_t &\leftarrow \beta_2 \cdot v_{t-1} + (1-\beta_2) \cdot {g_t}^2 \\\n \hat{m}_t &\leftarrow\n \beta_1 m_t / {(1-\beta_1^{t+1})} + (1 - \beta_1) g_t / {(1-\beta_1^t)}\\\n \hat{v}_t &\leftarrow v_t / {(1-\beta_2^t)} \\\n u_t &\leftarrow -\alpha_t \cdot \hat{m}_t / \left({\sqrt{\hat{v}_t +\n \bar{\varepsilon}} + \varepsilon} \right)\\\n S_t &\leftarrow (m_t, v_t).\n \end{align*}\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n b1: Exponential decay rate to track the first moment of past gradients.\n b2: Exponential decay rate to track the second moment of past gradients.\n eps: A small constant applied to denominator outside of the square root\n (as in the Adam paper) to avoid dividing by zero when rescaling.\n eps_root: A small constant applied to denominator inside the square root (as\n in RMSProp), to avoid dividing by zero when rescaling. This is needed for\n example when computing (meta-)gradients through Adam.\n mu_dtype: Optional `dtype` to be used for the first order accumulator; if\n `None` then the `dtype` is inferred from `params` and `updates`.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.nadam(learning_rate=0.003)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... 
print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n Objective function: 1.38E+01\n Objective function: 1.38E+01\n\n References:\n Dozat, `Incorporating Nesterov Momentum into Adam\n `_, 2016\n\n .. seealso:: :func:`optax.adam`, :func:`optax.nadamw`.\n\n .. versionadded:: 0.1.9\n""""""\n\n\ndef adamw(\n learning_rate: base.ScalarOrSchedule,\n b1: float = 0.9,\n b2: float = 0.999,\n eps: float = 1e-8,\n eps_root: float = 0.0,\n mu_dtype: Optional[Any] = None,\n weight_decay: float = 1e-4,\n mask: Optional[Union[Any, Callable[[base.Params], Any]]] = None,\n *,\n nesterov: bool = False,\n) -> base.GradientTransformationExtraArgs:\n r""""""Adam with weight decay regularization.\n\n AdamW uses weight decay to regularize learning towards small weights, as\n this leads to better generalization. In SGD you can also use L2 regularization\n to implement this as an additive loss term, however L2 regularization\n does not behave as intended for adaptive gradient algorithms such as Adam,\n see [Loshchilov et al, 2019].\n\n Let :math:`\alpha_t` represent the learning rate and :math:`\beta_1, \beta_2`,\n :math:`\varepsilon`, :math:`\bar{\varepsilon}` represent the arguments\n ``b1``, ``b2``, ``eps`` and ``eps_root`` respectively. The learning rate is\n indexed by :math:`t` since the learning rate may also be provided by a\n schedule function. Let :math:`\lambda` be the weight decay and\n :math:`\theta_t` the parameter vector at time :math:`t`.\n\n The ``init`` function of this optimizer initializes an internal state\n :math:`S_0 := (m_0, v_0) = (0, 0)`, representing initial estimates for the\n first and second moments. 
In practice these values are stored as pytrees\n containing all zeros, with the same shape as the model updates.\n At step :math:`t`, the ``update`` function of this optimizer takes as\n arguments the incoming gradients :math:`g_t`, the optimizer state :math:`S_t`\n and the parameters :math:`\theta_t` and computes updates :math:`u_t` and\n new state :math:`S_{t+1}`. Thus, for :math:`t > 0`, we have,\n\n .. math::\n\n \begin{align*}\n m_t &\leftarrow \beta_1 \cdot m_{t-1} + (1-\beta_1) \cdot g_t \\\n v_t &\leftarrow \beta_2 \cdot v_{t-1} + (1-\beta_2) \cdot {g_t}^2 \\\n \hat{m}_t &\leftarrow m_t / {(1-\beta_1^t)} \\\n \hat{v}_t &\leftarrow v_t / {(1-\beta_2^t)} \\\n u_t &\leftarrow -\alpha_t \cdot \left( \hat{m}_t / \left({\sqrt{\hat{v}_t\n + \bar{\varepsilon}} + \varepsilon} \right) + \lambda \theta_{t} \right)\\\n S_t &\leftarrow (m_t, v_t).\n \end{align*}\n\n This implementation can incorporate Nesterov-style momentum, introduced by\n [Dozat 2016]. The resulting optimizer is then often referred to as NAdamW.\n With the keyword argument `nesterov=True`, the optimizer uses Nesterov\n momentum, replacing the above :math:`\hat{m}_t` with\n\n .. math::\n \hat{m}_t \leftarrow\n \beta_1 m_t / {(1-\beta_1^{t+1})} + (1 - \beta_1) g_t / {(1-\beta_1^t)}.\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n b1: Exponential decay rate to track the first moment of past gradients.\n b2: Exponential decay rate to track the second moment of past gradients.\n eps: A small constant applied to denominator outside of the square root\n (as in the Adam paper) to avoid dividing by zero when rescaling.\n eps_root: A small constant applied to denominator inside the square root (as\n in RMSProp), to avoid dividing by zero when rescaling. 
This is needed for\n instance when computing (meta-)gradients through Adam.\n mu_dtype: Optional `dtype` to be used for the first order accumulator; if\n `None` then the `dtype` is inferred from `params` and `updates`.\n weight_decay: Strength of the weight decay regularization. Note that this\n weight decay is multiplied with the learning rate. This is consistent\n with other frameworks such as PyTorch, but different from\n (Loshchilov et al, 2019) where the weight decay is only multiplied with\n the ""schedule multiplier"", but not the base learning rate.\n mask: A tree with same structure as (or a prefix of) the params PyTree,\n or a Callable that returns such a pytree given the params/updates.\n The leaves should be booleans, `True` for leaves/subtrees you want to\n apply the weight decay to, and `False` for those you want to skip. Note\n that the Adam gradient transformations are applied to all parameters.\n nesterov: Whether to use Nesterov momentum. The solver with\n nesterov=True is equivalent to the :func:`optax.nadamw` optimizer. This\n modification is described in [Dozat 2016].\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.adamw(learning_rate=0.003)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... 
print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.40E+01\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n Objective function: 1.38E+01\n\n References:\n Loshchilov et al, `Decoupled Weight Decay\n Regularization `_, 2019\n\n Dozat, `Incorporating Nesterov Momentum into Adam\n `_, 2016\n\n .. seealso::\n See the related functions :func:`optax.adam`, :func:`optax.nadamw`, as well\n as the example :doc:`../_collections/examples/nanolm` for a use case.\n """"""\n return combine.chain(\n transform.scale_by_adam(\n b1=b1,\n b2=b2,\n eps=eps,\n eps_root=eps_root,\n mu_dtype=mu_dtype,\n nesterov=nesterov,\n ),\n transform.add_decayed_weights(weight_decay, mask),\n transform.scale_by_learning_rate(learning_rate),\n )\n\n\nnadamw = functools.partial(adamw, nesterov=True)\nnadamw.__doc__ = (\n r""""""NAdamW optimizer, implemented as part of the AdamW optimizer.\n\n NAdamW is a variant of :func:`optax.adamw` with Nesterov's momentum. Compared\n to AdamW, this optimizer replaces the assignment\n\n .. math::\n\n \hat{m}_t \leftarrow m_t / {(1-\beta_1^t)}\n\n with\n\n .. math::\n\n \hat{m}_t \leftarrow\n \beta_1 m_t / {(1-\beta_1^{t+1})} + (1 - \beta_1) g_t / {(1-\beta_1^t)}.\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n b1: Exponential decay rate to track the first moment of past gradients.\n b2: Exponential decay rate to track the second moment of past gradients.\n eps: A small constant applied to denominator outside of the square root\n (as in the Adam paper) to avoid dividing by zero when rescaling.\n eps_root: A small constant applied to denominator inside the square root (as\n in RMSProp), to avoid dividing by zero when rescaling. 
This is needed for\n instance when computing (meta-)gradients through Adam.\n mu_dtype: Optional `dtype` to be used for the first order accumulator; if\n `None` then the `dtype` is inferred from `params` and `updates`.\n weight_decay: Strength of the weight decay regularization. Note that this\n weight decay is multiplied with the learning rate. This is consistent\n with other frameworks such as PyTorch, but different from\n (Loshchilov et al, 2019) where the weight decay is only multiplied with\n the ""schedule multiplier"", but not the base learning rate.\n mask: A tree with same structure as (or a prefix of) the params PyTree,\n or a Callable that returns such a pytree given the params/updates.\n The leaves should be booleans, `True` for leaves/subtrees you want to\n apply the weight decay to, and `False` for those you want to skip. Note\n that the Adam gradient transformations are applied to all parameters.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.nadamw(learning_rate=0.003)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n Objective function: 1.38E+01\n Objective function: 1.38E+01\n\n References:\n Loshchilov et al, `Decoupled Weight Decay\n Regularization `_, 2019\n\n Dozat, `Incorporating Nesterov Momentum into Adam\n `_, 2016\n\n .. seealso:: :func:`optax.adam`, :func:`optax.adamw`.\n\n .. 
versionadded:: 0.1.9\n""""""\n)\n\n\ndef adan(\n learning_rate: base.ScalarOrSchedule,\n b1: float = 0.98,\n b2: float = 0.92,\n b3: float = 0.99,\n eps: float = 1e-8,\n eps_root: float = 1e-8,\n weight_decay: float = 0.0,\n mask: Optional[Union[Any, Callable[[base.Params], Any]]] = None,\n) -> base.GradientTransformationExtraArgs:\n r""""""The ADAptive Nesterov momentum algorithm (Adan).\n\n Adan first reformulates the vanilla Nesterov acceleration to develop a new\n Nesterov momentum estimation (NME) method, which avoids the extra overhead of\n computing gradient at the extrapolation point. Then Adan adopts NME to\n estimate the gradient's first- and second-order moments in adaptive gradient\n algorithms for convergence acceleration.\n\n The algorithm is as follows. First, we define the following parameters:\n\n - :math:`\eta > 0`: the step size.\n - :math:`\beta_1 \in [0, 1]`: the decay rate for the exponentially weighted\n average of gradients.\n - :math:`\beta_2 \in [0, 1]`: the decay rate for the exponentially weighted\n average of differences of gradients.\n - :math:`\beta_3 \in [0, 1]`: the decay rate for the exponentially weighted\n average of the squared term.\n - :math:`\varepsilon > 0`: a small constant for numerical stability.\n - :math:`\lambda > 0`: a weight decay.\n\n Second, we define the following variables:\n\n - :math:`\theta_t`: the parameters.\n - :math:`g_t`: the incoming stochastic gradient.\n - :math:`m_t`: the exponentially weighted average of gradients.\n - :math:`v_t`: the exponentially weighted average of differences of gradients.\n - :math:`n_t`: the exponentially weighted average of the squared term.\n - :math:`u_t`: the outgoing update vector.\n - :math:`S_t`: the saved state of the optimizer.\n\n Third, we initialize these variables as follows:\n\n - :math:`m_0 = g_0`\n - :math:`v_0 = 0`\n - :math:`v_1 = g_1 - g_0`\n - :math:`n_0 = g_0^2`\n\n Finally, on each iteration, we update the variables as follows:\n\n .. 
math::\n\n \begin{align*}\n m_t &\gets (1 - \beta_1) m_{t-1} + \beta_1 g_t \\\n v_t &\gets (1 - \beta_2) v_{t-1} + \beta_2 (g_t - g_{t-1}) \\\n n_t &\gets (1 - \beta_3) n_{t-1} + \beta_3 (g_t + (1 - \beta_2)\n (g_t - g_{t-1}))^2 \\\n \eta_t &\gets \eta / ({\sqrt{n_t + \bar{\varepsilon}} + \varepsilon}) \\\n u_t &\gets (\theta_t - \eta_t \circ (m_t + (1 - \beta_2) v_t))\n / (1 + \lambda \eta) \\\n S_t &\leftarrow (m_t, v_t, n_t).\n \end{align*}\n\n Args:\n learning_rate: A fixed global scaling factor.\n b1: Decay rate for the EWMA of gradients.\n b2: Decay rate for the EWMA of differences of gradients.\n b3: Decay rate for the EWMA of the algorithm's squared term.\n eps: Term added to the denominator to improve numerical stability.\n eps_root: Term added to the denominator inside the square-root to improve\n numerical stability when backpropagating gradients through the rescaling.\n weight_decay: Strength of the weight decay regularization.\n mask: A tree with same structure as (or a prefix of) the params PyTree,\n or a Callable that returns such a pytree given the params/updates.\n The leaves should be booleans, `True` for leaves/subtrees you want to\n apply the weight decay to, and `False` for those you want to skip.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> f = lambda x: x @ x # simple quadratic function\n >>> solver = optax.adan(learning_rate=1e-1)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... 
print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.28E+01\n Objective function: 1.17E+01\n Objective function: 1.07E+01\n Objective function: 9.68E+00\n Objective function: 8.76E+00\n\n References:\n Xie et al, `Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing\n Deep Models\n `_, 2022\n """"""\n return combine.chain(\n transform.scale_by_adan(\n b1=b1,\n b2=b2,\n b3=b3,\n eps=eps,\n eps_root=eps_root,\n ),\n transform.add_decayed_weights(weight_decay, mask),\n transform.scale_by_learning_rate(learning_rate),\n )\n\n\ndef lion(\n learning_rate: base.ScalarOrSchedule,\n b1: float = 0.9,\n b2: float = 0.99,\n mu_dtype: Optional[Any] = None,\n weight_decay: float = 1e-3,\n mask: Optional[Union[Any, Callable[[base.Params], Any]]] = None,\n) -> base.GradientTransformationExtraArgs:\n r""""""The Lion optimizer.\n\n Lion was discovered by symbolic program search. Unlike most adaptive\n optimizers such as AdamW, Lion only tracks momentum, making it more\n memory-efficient. The update of Lion is produced through the sign operation,\n resulting in a larger norm compared to updates produced by other optimizers\n such as SGD and AdamW. A suitable learning rate for Lion is typically 3-10x\n smaller than that for AdamW; the weight decay for Lion should in turn be\n 3-10x larger than that for AdamW to maintain a similar strength (lr * wd).\n\n Let :math:`\alpha_t` represent the learning rate and :math:`\beta_1, \beta_2`\n represent the arguments ``b1`` and ``b2`` respectively. The learning rate is\n indexed by :math:`t` since the learning rate may also be provided by a\n schedule function. Let :math:`\lambda` be the weight decay and\n :math:`\theta_t` the parameter vector at time :math:`t`.\n\n The ``init`` function of this optimizer initializes an internal state\n :math:`S_0 := (m_0) = (0)`, representing the initial estimate for the\n first moment. 
In practice these values are stored as pytrees\n containing all zeros, with the same shape as the model updates.\n At step :math:`t`, the ``update`` function of this optimizer takes as\n arguments the incoming gradients :math:`g_t`, the optimizer state :math:`S_t`\n and the parameters :math:`\theta_t` and computes updates :math:`u_t` and\n new state :math:`S_{t+1}`. Thus, for :math:`t > 0`, we have,\n\n .. math::\n\n \begin{align*}\n c_t &\leftarrow \beta_1 \cdot m_{t-1} + (1-\beta_1) \cdot g_t \\\n u_t &\leftarrow -\alpha_t \cdot \left( sign \left( c_t \right) +\n \lambda \theta_{t} \right)\\\n m_t &\leftarrow \beta_2 \cdot m_{t-1} + (1-\beta_2) \cdot g_t \\\n S_t &\leftarrow (m_t).\n \end{align*}\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n b1: Rate to combine the momentum and the current gradient.\n b2: Exponential decay rate to track the momentum of past gradients.\n mu_dtype: Optional `dtype` to be used for the first order accumulator; if\n `None` then the `dtype` is inferred from `params` and `updates`.\n weight_decay: Strength of the weight decay regularization. Note that this\n weight decay is multiplied with the learning rate. This is consistent with\n other frameworks such as PyTorch, but different from (Loshchilov et al,\n 2019) where the weight decay is only multiplied with the ""schedule\n multiplier"", but not the base learning rate.\n mask: A tree with same structure as (or a prefix of) the params PyTree, or a\n Callable that returns such a pytree given the params/updates. The leaves\n should be booleans, `True` for leaves/subtrees you want to apply the\n weight decay to, and `False` for those you want to skip. 
Note that the\n Adam gradient transformations are applied to all parameters.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.lion(learning_rate=0.003)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.40E+01\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n Objective function: 1.38E+01\n\n References:\n Chen et al, `Symbolic Discovery of Optimization Algorithms\n `_, 2023\n """"""\n return combine.chain(\n transform.scale_by_lion(b1=b1, b2=b2, mu_dtype=mu_dtype),\n transform.add_decayed_weights(weight_decay, mask),\n transform.scale_by_learning_rate(learning_rate),\n )\n\n\ndef amsgrad(\n learning_rate: base.ScalarOrSchedule,\n b1: float = 0.9,\n b2: float = 0.999,\n eps: float = 1e-8,\n eps_root: float = 0.0,\n mu_dtype: Optional[Any] = None,\n) -> base.GradientTransformationExtraArgs:\n """"""The AMSGrad optimizer.\n\n The original Adam can fail to converge to the optimal solution in some cases.\n AMSGrad guarantees convergence by using a long-term memory of past gradients.\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n b1: Exponential decay rate to track the first moment of past gradients.\n b2: Exponential decay rate to track the second moment of past gradients.\n eps: A small constant applied to denominator outside of the square root (as\n in the Adam 
paper) to avoid dividing by zero when rescaling.\n eps_root: A small constant applied to denominator inside the square root (as\n in RMSProp), to avoid dividing by zero when rescaling. This is needed for\n instance when computing (meta-)gradients through Adam.\n mu_dtype: Optional `dtype` to be used for the first order accumulator; if\n `None` then the `dtype` is inferred from `params` and `updates`.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.amsgrad(learning_rate=0.003)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.40E+01\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n Objective function: 1.38E+01\n\n References:\n Reddi et al, `On the Convergence of Adam and Beyond\n `_, 2018\n """"""\n return combine.chain(\n transform.scale_by_amsgrad(\n b1=b1, b2=b2, eps=eps, eps_root=eps_root, mu_dtype=mu_dtype\n ),\n transform.scale_by_learning_rate(learning_rate),\n )\n\n\ndef fromage(\n learning_rate: float, min_norm: float = 1e-6\n) -> base.GradientTransformationExtraArgs:\n """"""The Frobenius matched gradient descent (Fromage) optimizer.\n\n Fromage is a learning algorithm that does not require learning rate tuning.\n The optimizer is based on modeling neural network gradients via deep relative\n trust (a distance function on deep neural networks). 
Fromage is similar to the\n LARS optimizer and can work on a range of standard neural network benchmarks,\n such as natural language Transformers and generative adversarial networks.\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n min_norm: A minimum value that the norm of the gradient updates and the norm\n of the layer parameters can be clipped to, in order to avoid dividing by\n zero when computing the trust ratio (as in the LARS paper).\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.fromage(learning_rate=0.003)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... 
print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.39E+01\n Objective function: 1.38E+01\n Objective function: 1.37E+01\n Objective function: 1.37E+01\n Objective function: 1.36E+01\n\n References:\n Bernstein et al, `On the distance between two neural networks and the\n stability of learning `_, 2020\n """"""\n mult = 1 / jnp.sqrt(1 + learning_rate**2)\n return combine.chain(\n transform.scale_by_trust_ratio(min_norm),\n transform.scale_by_learning_rate(learning_rate * mult),\n transform.add_decayed_weights((mult - 1)),\n )\n\n\ndef lars(\n learning_rate: base.ScalarOrSchedule,\n weight_decay: float = 0.0,\n weight_decay_mask: MaskOrFn = True,\n trust_coefficient: float = 0.001,\n eps: float = 0.0,\n trust_ratio_mask: MaskOrFn = True,\n momentum: float = 0.9,\n nesterov: bool = False,\n) -> base.GradientTransformationExtraArgs:\n """"""The LARS optimizer.\n\n LARS is a layer-wise adaptive optimizer introduced to help scale SGD to\n larger batch sizes. LARS later inspired the LAMB optimizer.\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n weight_decay: Strength of the weight decay regularization.\n weight_decay_mask: A tree with same structure as (or a prefix of) the params\n PyTree, or a Callable that returns such a pytree given the params/updates.\n The leaves should be booleans, `True` for leaves/subtrees you want to\n apply the transformation to, and `False` for those you want to skip.\n trust_coefficient: A multiplier for the trust ratio.\n eps: Optional additive constant in the trust ratio denominator.\n trust_ratio_mask: A tree with same structure as (or a prefix of) the params\n PyTree, or a Callable that returns such a pytree given the params/updates.\n The leaves should be booleans, `True` for leaves/subtrees you want to\n apply the transformation to, and `False` for those you want to skip.\n momentum: Decay rate for 
momentum.\n nesterov: Whether to use Nesterov momentum.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.lars(learning_rate=0.003)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.40E+01\n Objective function: 1.40E+01\n Objective function: 1.40E+01\n Objective function: 1.40E+01\n Objective function: 1.40E+01\n\n References:\n You et al, `Large Batch Training of Convolutional Networks\n `_, 2017\n """"""\n return combine.chain(\n transform.add_decayed_weights(weight_decay, mask=weight_decay_mask),\n wrappers.masked(\n inner=transform.scale_by_trust_ratio(\n trust_coefficient=trust_coefficient, eps=eps\n ),\n mask=trust_ratio_mask,\n ),\n transform.scale_by_learning_rate(learning_rate),\n transform.trace(decay=momentum, nesterov=nesterov),\n )\n\n\ndef lamb(\n learning_rate: base.ScalarOrSchedule,\n b1: float = 0.9,\n b2: float = 0.999,\n eps: float = 1e-6,\n eps_root: float = 0.0,\n weight_decay: float = 0.0,\n mask: MaskOrFn = None,\n) -> base.GradientTransformationExtraArgs:\n """"""The LAMB optimizer.\n\n LAMB is a general purpose layer-wise adaptive large batch optimizer designed\n to provide consistent training performance across a wide range of tasks,\n including those that use attention-based models (such as Transformers) and\n ResNet-50. 
The optimizer is able to work with small and large batch sizes.\n LAMB was inspired by the LARS learning algorithm.\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n b1: Exponential decay rate to track the first moment of past gradients.\n b2: Exponential decay rate to track the second moment of past gradients.\n eps: A small constant applied to denominator outside of the square root (as\n in the Adam paper) to avoid dividing by zero when rescaling.\n eps_root: A small constant applied to denominator inside the square root (as\n in RMSProp), to avoid dividing by zero when rescaling. This is needed for\n instance when computing (meta-)gradients through Adam.\n weight_decay: Strength of the weight decay regularization.\n mask: A tree with same structure as (or a prefix of) the params PyTree, or a\n Callable that returns such a pytree given the params/updates. The leaves\n should be booleans, `True` for leaves/subtrees you want to apply the\n transformation to, and `False` for those you want to skip.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.lamb(learning_rate=0.003)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... 
print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.39E+01\n Objective function: 1.38E+01\n Objective function: 1.38E+01\n Objective function: 1.37E+01\n Objective function: 1.36E+01\n\n References:\n You et al, `Large Batch Optimization for Deep Learning: Training BERT in 76\n minutes `_, 2020\n """"""\n return combine.chain(\n transform.scale_by_adam(b1=b1, b2=b2, eps=eps, eps_root=eps_root),\n transform.add_decayed_weights(weight_decay=weight_decay, mask=mask),\n transform.scale_by_trust_ratio(),\n transform.scale_by_learning_rate(learning_rate),\n )\n\n\ndef noisy_sgd(\n learning_rate: base.ScalarOrSchedule,\n eta: float = 0.01,\n gamma: float = 0.55,\n key: jax.Array | int | None = None,\n *,\n seed: int | None = None, # deprecated\n) -> base.GradientTransformationExtraArgs:\n r""""""A variant of SGD with added noise.\n\n Noisy SGD is a variant of :func:`optax.sgd` that incorporates Gaussian noise\n into the updates. It has been found that adding noise to the gradients can\n improve both the training error and the generalization error in very deep\n networks.\n\n The update :math:`u_t` is modified to include this noise as follows:\n\n .. math::\n u_t \leftarrow -\alpha_t (g_t + N(0, \sigma_t^2)),\n\n where :math:`N(0, \sigma_t^2)` represents Gaussian noise with zero mean and a\n variance of :math:`\sigma_t^2`.\n\n The variance of this noise decays over time according to the formula\n\n .. 
math::\n \sigma_t^2 = \frac{\eta}{(1+t)^\gamma},\n\n where :math:`\gamma` is the decay rate parameter ``gamma`` and :math:`\eta`\n represents the initial variance ``eta``.\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n eta: Initial variance for the Gaussian noise added to gradients.\n gamma: A parameter controlling the annealing of noise over time ``t``, the\n variance decays according to ``(1+t)**(-gamma)``.\n key: random generator key for noise generation.\n seed: deprecated, use key instead.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.noisy_sgd(learning_rate=0.003, key=0)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... 
print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.38E+01\n Objective function: 1.37E+01\n Objective function: 1.35E+01\n Objective function: 1.33E+01\n Objective function: 1.32E+01\n\n References:\n Neelakantan et al, `Adding Gradient Noise Improves Learning for Very Deep\n Networks `_, 2015\n """\n if seed is not None:\n warnings.warn(\n '"seed" is deprecated and will be removed in optax 0.3.0, use "key".',\n DeprecationWarning,\n )\n if key is not None:\n raise ValueError('Only one of seed or key can be specified.')\n key = jax.random.key(seed)\n if key is None:\n warnings.warn('Specifying a key will be required in optax 0.3.0.')\n key = jax.random.key(0)\n key = utils.canonicalize_key(key)\n\n return combine.chain(\n transform.add_noise(eta, gamma, key),\n transform.scale_by_learning_rate(learning_rate),\n )\n\n\ndef sign_sgd(\n learning_rate: base.ScalarOrSchedule,\n) -> base.GradientTransformationExtraArgs:\n r"""A variant of SGD using only the signs of the gradient components.\n\n SignSGD is a variant of SGD that uses the signs of the gradient components in\n the update, not their actual values. The update :math:`u_t` is modified as\n follows:\n\n .. math::\n u_t \leftarrow -\alpha_t\, \text{sign}\,(g_t),\n\n for :math:`\alpha_t` a given learning rate at iteration :math:`t`, and\n :math:`\text{sign}\,(g_t)` the sign of each component of the gradient\n :math:`g_t`.\n\n SGD variants that use only the signs of the gradient update have historically\n been used since RProp, with modern forms including RMSProp, Adam, and Lion.\n 
SignSGD enables\n significant gradient compression, substantially reducing the bottleneck\n imposed by communicating gradients when distributing learning across multiple\n workers.\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.sign_sgd(learning_rate=0.003)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.40E+01\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n Objective function: 1.38E+01\n\n References:\n Bernstein et al., `signSGD: Compressed optimization for Non-Convex Problems\n `_, 2018\n\n Balles et al., `The Geometry of Sign Gradient Descent\n `_, 2020\n """"""\n return combine.chain(\n transform.scale_by_sign(),\n transform.scale_by_learning_rate(learning_rate),\n )\n\n\ndef novograd(\n learning_rate: base.ScalarOrSchedule,\n b1: float = 0.9,\n b2: float = 0.25,\n eps: float = 1e-6,\n eps_root: float = 0.0,\n weight_decay: float = 0.0,\n) -> base.GradientTransformationExtraArgs:\n """"""NovoGrad optimizer.\n\n NovoGrad is more robust to the initial learning rate and\n weight initialization than other methods. For example,\n NovoGrad works well without LR warm-up, while other methods require it.\n NovoGrad performs exceptionally well for large batch training, e.g. 
it\n outperforms other methods for ResNet-50 for all batches up to 32K.\n In addition, NovoGrad requires half the memory compared to Adam.\n It was introduced together with Jasper ASR model.\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n b1: An exponential decay rate to track the first moment of past gradients.\n b2: An exponential decay rate to track the second moment of past gradients.\n eps: A small constant applied to denominator outside of the square root (as\n in the Adam paper) to avoid dividing by zero when rescaling.\n eps_root: A small constant applied to denominator inside the square root (as\n in RMSProp), to avoid dividing by zero when rescaling. This is needed for\n instance when computing (meta-)gradients through Adam.\n weight_decay: Strength of the weight decay regularization.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.novograd(learning_rate=0.003)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... 
print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.40E+01\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n Objective function: 1.38E+01\n Objective function: 1.37E+01\n\n References:\n Ginsburg et al, `Stochastic Gradient Methods with Layer-wise Adaptive\n Moments for Training of Deep Networks `_,\n 2019\n\n Li et al, `Jasper: An End-to-End Convolutional Neural Acoustic Model\n `_, 2019\n """"""\n return combine.chain(\n transform.scale_by_novograd(\n b1=b1, b2=b2, eps=eps, eps_root=eps_root, weight_decay=weight_decay\n ),\n transform.scale_by_learning_rate(learning_rate),\n )\n\n\ndef optimistic_gradient_descent(\n learning_rate: base.ScalarOrSchedule,\n alpha: base.ScalarOrSchedule = 1.0,\n beta: base.ScalarOrSchedule = 1.0,\n) -> base.GradientTransformationExtraArgs:\n """"""An Optimistic Gradient Descent optimizer.\n\n Optimistic gradient descent is an approximation of extra-gradient methods\n which require multiple gradient calls to compute the next update. It has\n strong formal guarantees for last-iterate convergence in min-max games, for\n which standard gradient descent can oscillate or even diverge.\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n alpha: Coefficient for generalized OGD.\n beta: Coefficient for generalized OGD negative momentum.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.optimistic_gradient_descent(learning_rate=0.003)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... 
updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.38E+01\n Objective function: 1.37E+01\n Objective function: 1.35E+01\n Objective function: 1.33E+01\n Objective function: 1.32E+01\n\n References:\n Mokhtari et al, `A Unified Analysis of Extra-gradient and\n Optimistic Gradient Methods for Saddle Point Problems: Proximal\n Point Approach `_, 2019\n\n .. seealso::\n :doc:`../_collections/examples/ogda_example`\n """"""\n return combine.chain(\n transform.scale_by_optimistic_gradient(alpha=alpha, beta=beta),\n transform.scale_by_learning_rate(learning_rate),\n )\n\n\ndef optimistic_adam(\n learning_rate: float,\n optimism: Optional[float] = None,\n b1: float = 0.9,\n b2: float = 0.999,\n eps: float = 1e-08,\n eps_root: float = 0.0,\n mu_dtype: Optional[Any] = None,\n *,\n nesterov: bool = True,\n) -> base.GradientTransformationExtraArgs:\n r""""""The Optimistic Adam optimizer.\n\n This is an optimistic version of the Adam optimizer. It addresses the issue\n of limit cycling behavior in training Generative Adversarial Networks and\n other saddle-point min-max problems.\n\n The algorithm is as follows. 
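An aside on motivation: the limit cycling that optimism addresses shows up already on the toy bilinear game :math:`f(x, y) = x y`. The sketch below (plain Python, illustrative only; it is not how optax implements the transformation) compares naive gradient descent-ascent, which spirals away from the equilibrium at the origin, with the optimistic update :math:`-\eta (2 g_t - g_{t-1})`:

```python
# Illustrative sketch (not optax internals): on f(x, y) = x * y, plain
# gradient descent-ascent cycles outward, while the optimistic update
# -lr * (2 * g_t - g_{t-1}) contracts toward the (0, 0) equilibrium.
lr = 0.1

# Plain GDA: x minimizes f, y maximizes f.
x, y = 1.0, 1.0
for _ in range(200):
    x, y = x - lr * y, y + lr * x
dist_gda = (x * x + y * y) ** 0.5  # grows without bound

# Optimistic GDA: extrapolate using the previous gradient.
x, y = 1.0, 1.0
gx_prev, gy_prev = y, x  # gradients at the previous iterate
for _ in range(200):
    gx, gy = y, x  # df/dx = y, df/dy = x
    x, y = x - lr * (2 * gx - gx_prev), y + lr * (2 * gy - gy_prev)
    gx_prev, gy_prev = gx, gy
dist_ogda = (x * x + y * y) ** 0.5  # shrinks toward zero
```

Each plain-GDA step multiplies the distance to the origin by :math:`\sqrt{1 + \eta^2}`, so it diverges for any positive step size, while the optimistic iterate contracts.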
First, we define the following parameters:\n\n - :math:`\alpha`: the learning rate.\n - :math:`o` the optimism rate.\n - :math:`\beta_1` the exponential decay rate for the first moment estimate.\n - :math:`\beta_2` the exponential decay rate for the second moment estimate.\n\n Second, we define the following variables:\n\n - :math:`g_t`: the incoming gradient.\n - :math:`m_t`: the biased first moment estimate.\n - :math:`v_t`: the biased second raw moment estimate.\n - :math:`\hat{m}_t`: the bias-corrected first moment estimate.\n - :math:`\hat{v}_t`: the bias-corrected second raw moment estimate.\n - :math:`r_t`: the signal-to-noise ratio (SNR) vector.\n - :math:`u_t`: the outgoing update vector.\n - :math:`S_t`: the state of the optimizer.\n\n Finally, on each iteration, the variables are updated as follows:\n\n .. math::\n\n \begin{align*}\n m_t &\leftarrow \beta_1 \cdot m_{t - 1} + (1 - \beta_1) \cdot g_t \\\n v_t &\leftarrow \beta_2 \cdot v_{t - 1} + (1 - \beta_2) \cdot g_t^2 \\\n \hat{m}_t &\leftarrow m_t / {(1 - \beta_1^t)} \\\n \hat{v}_t &\leftarrow v_t / {(1 - \beta_2^t)} \\\n r_t &\leftarrow \hat{m}_t / \left({\sqrt{\hat{v}_t +\n \bar{\varepsilon}} + \varepsilon} \right) \\\n u_t &\leftarrow -\alpha r_t - o (r_t - r_{t - 1}) \\\n S_t &\leftarrow (m_t, v_t, r_t).\n \end{align*}\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n optimism: The amount of optimism to be applied. If None, defaults to\n learning_rate, as in the paper.\n b1: Exponential decay rate to track the first moment of past gradients.\n b2: Exponential decay rate to track the second moment of past gradients.\n eps: Term added to the denominator to improve numerical stability.\n eps_root: Term added to the second moment of the prediction error to\n improve numerical stability. If backpropagating gradients through the\n gradient transformation (e.g. 
for meta-learning), this must be non-zero.\n mu_dtype: Optional `dtype` to be used for the first order accumulator; if\n `None` then the `dtype` is inferred from `params` and `updates`.\n nesterov: Whether to use Nesterov momentum.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> from jax import numpy as jnp, lax\n >>> def f(x, y):\n ... return x * y # simple bilinear function\n >>> opt = optax.optimistic_adam(1e-2, 1.0)\n >>> def step(state, _):\n ... params, opt_state = state\n ... distance = jnp.hypot(*params)\n ... grads = jax.grad(f, argnums=(0, 1))(*params)\n ... grads = grads[0], -grads[1]\n ... updates, opt_state = opt.update(grads, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... return (params, opt_state), distance\n >>> params = 1.0, 2.0\n >>> opt_state = opt.init(params)\n >>> _, distances = lax.scan(step, (params, opt_state), length=1025)\n >>> for i in range(6):\n ... print(f""{distances[4**i]:.3f}"")\n 2.243\n 2.195\n 2.161\n 2.055\n 0.796\n 0.001\n\n References:\n Daskalakis et al, `Training GANs with Optimism\n `_, 2017\n\n .. 
seealso::\n :doc:`../_collections/examples/ogda_example`\n """\n warnings.warn('`optimistic_adam` is deprecated, please use'\n ' `optimistic_adam_v2` instead.', category=DeprecationWarning)\n if callable(learning_rate):\n raise ValueError('This version of `optimistic_adam` does not support'\n ' learning rate schedules but `optimistic_adam_v2` does.')\n if optimism is None:\n optimism = learning_rate\n return combine.chain(\n transform.scale_by_adam(\n b1=b1,\n b2=b2,\n eps=eps,\n eps_root=eps_root,\n mu_dtype=mu_dtype,\n nesterov=nesterov,\n ),\n transform.scale_by_optimistic_gradient(alpha=learning_rate,\n beta=optimism),\n transform.scale_by_learning_rate(1.0), # flips the sign\n )\n\n\ndef optimistic_adam_v2(\n learning_rate: base.ScalarOrSchedule,\n *,\n alpha: float = 1.0,\n beta: float = 1.0,\n b1: float = 0.9,\n b2: float = 0.999,\n eps: float = 1e-08,\n eps_root: float = 0.0,\n mu_dtype: Optional[Any] = None,\n nesterov: bool = True,\n) -> base.GradientTransformationExtraArgs:\n r"""The Optimistic Adam optimizer.\n\n This is an optimistic version of the Adam optimizer. It addresses the issue\n of limit cycling behavior in training Generative Adversarial Networks and\n other saddle-point min-max problems.\n\n The algorithm is as follows. 
First, we define the following parameters:\n\n - :math:`learning_rate`: the learning rate.\n - :math:`\alpha`: the alpha rate in optimistic gradient descent.\n - :math:`\beta`: the beta rate in optimistic gradient descent.\n - :math:`\beta_1` the exponential decay rate for the first moment estimate.\n - :math:`\beta_2` the exponential decay rate for the second moment estimate.\n\n Second, we define the following variables:\n\n - :math:`g_t`: the incoming gradient.\n - :math:`m_t`: the biased first moment estimate.\n - :math:`v_t`: the biased second raw moment estimate.\n - :math:`\hat{m}_t`: the bias-corrected first moment estimate.\n - :math:`\hat{v}_t`: the bias-corrected second raw moment estimate.\n - :math:`r_t`: the signal-to-noise ratio (SNR) vector.\n - :math:`u_t`: the outgoing update vector.\n - :math:`S_t`: the state of the optimizer.\n\n Finally, on each iteration, the variables are updated as follows:\n\n .. math::\n\n \begin{align*}\n m_t &\leftarrow \beta_1 \cdot m_{t - 1} + (1 - \beta_1) \cdot g_t \\\n v_t &\leftarrow \beta_2 \cdot v_{t - 1} + (1 - \beta_2) \cdot g_t^2 \\\n \hat{m}_t &\leftarrow m_t / {(1 - \beta_1^t)} \\\n \hat{v}_t &\leftarrow v_t / {(1 - \beta_2^t)} \\\n r_t &\leftarrow \hat{m}_t / \left({\sqrt{\hat{v}_t +\n \bar{\varepsilon}} + \varepsilon} \right) \\\n u_t &\leftarrow -\alpha_t r_t - o_t (r_t - r_{t - 1}) \\\n S_t &\leftarrow (m_t, v_t, r_t).\n \end{align*}\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n alpha: One of two scalar optimism parameters in optimistic gradient descent.\n beta: One of two scalar optimism parameters in optimistic gradient descent.\n b1: Exponential decay rate to track the first moment of past gradients.\n b2: Exponential decay rate to track the second moment of past gradients.\n eps: Term added to the denominator to improve numerical stability.\n eps_root: Term added to the second moment of the 
prediction error to\n improve numerical stability. If backpropagating gradients through the\n gradient transformation (e.g. for meta-learning), this must be non-zero.\n mu_dtype: Optional `dtype` to be used for the first order accumulator; if\n `None` then the `dtype` is inferred from `params` and `updates`.\n nesterov: Whether to use Nesterov momentum.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> from jax import numpy as jnp, lax\n >>> def f(x, y):\n ... return x * y # simple bilinear function\n >>> opt = optax.optimistic_adam_v2(1e-2)\n >>> def step(state, _):\n ... params, opt_state = state\n ... distance = jnp.hypot(*params)\n ... grads = jax.grad(f, argnums=(0, 1))(*params)\n ... grads = grads[0], -grads[1]\n ... updates, opt_state = opt.update(grads, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... return (params, opt_state), distance\n >>> params = 1.0, 2.0\n >>> opt_state = opt.init(params)\n >>> _, distances = lax.scan(step, (params, opt_state), length=1025)\n >>> for i in range(6):\n ... print(f"{distances[4**i]:.3f}")\n 2.243\n 2.195\n 2.161\n 2.055\n 0.796\n 0.001\n\n References:\n Daskalakis et al, `Training GANs with Optimism\n `_, 2017\n\n .. 
seealso::\n :doc:`../_collections/examples/ogda_example`\n """"""\n return combine.chain(\n transform.scale_by_adam(\n b1=b1,\n b2=b2,\n eps=eps,\n eps_root=eps_root,\n mu_dtype=mu_dtype,\n nesterov=nesterov,\n ),\n transform.scale_by_optimistic_gradient(alpha=alpha, beta=beta),\n transform.scale_by_learning_rate(learning_rate),\n )\n\n\ndef radam(\n learning_rate: base.ScalarOrSchedule,\n b1: float = 0.9,\n b2: float = 0.999,\n eps: float = 1e-8,\n eps_root: float = 0.0,\n threshold: float = 5.0,\n *,\n nesterov: bool = False,\n) -> base.GradientTransformationExtraArgs:\n """"""The Rectified Adam optimizer.\n\n The adaptive learning rate in Adam has undesirably large variance in early\n stages of training, due to the limited number of training samples used to\n estimate the optimizer's statistics. Rectified Adam addresses this issue\n by analytically reducing the large variance.\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n b1: Exponential decay rate to track the first moment of past gradients.\n b2: Exponential decay rate to track the second moment of past gradients.\n eps: A small constant applied to denominator outside of the square root (as\n in the Adam paper) to avoid dividing by zero when rescaling.\n eps_root: A small constant applied to denominator inside the square root (as\n in RMSProp), to avoid dividing by zero when rescaling. 
This is needed for\n instance when computing (meta-)gradients through Adam.\n threshold: Threshold for variance tractability.\n nesterov: Whether to use Nesterov momentum.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.radam(learning_rate=0.003)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.38E+01\n Objective function: 1.37E+01\n Objective function: 1.35E+01\n Objective function: 1.33E+01\n Objective function: 1.32E+01\n\n References:\n Liu et al, 2020: `On the Variance of the Adaptive Learning Rate and Beyond\n `_, 2020\n """"""\n return combine.chain(\n transform.scale_by_radam(\n b1=b1,\n b2=b2,\n eps=eps,\n eps_root=eps_root,\n threshold=threshold,\n nesterov=nesterov,\n ),\n transform.scale_by_learning_rate(learning_rate),\n )\n\n\ndef rmsprop(\n learning_rate: base.ScalarOrSchedule,\n decay: float = 0.9,\n eps: float = 1e-8,\n initial_scale: float = 0.0,\n eps_in_sqrt: bool = True,\n centered: bool = False,\n momentum: Optional[float] = None,\n nesterov: bool = False,\n bias_correction: bool = False,\n) -> base.GradientTransformationExtraArgs:\n r""""""A flexible RMSProp optimizer.\n\n RMSProp is an SGD variant with learning rate adaptation. The `learning_rate`\n used for each weight is scaled by a suitable estimate of the magnitude of the\n gradients on previous steps. Several variants of RMSProp can be found\n in the literature. 
This alias provides an easy to configure RMSProp\n optimizer that can be used to switch between several of these variants.\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n decay: Decay used to track the magnitude of previous gradients.\n eps: A small numerical constant to avoid dividing by zero when rescaling.\n initial_scale: Initial value of accumulators tracking the magnitude of\n previous updates. PyTorch uses `0`, TF1 uses `1`. When reproducing results\n from a paper, verify the value used by the authors.\n eps_in_sqrt: Whether to add ``eps`` in the square root of the denominator or\n outside the square root.\n centered: Whether the second moment or the variance of the past gradients is\n used to rescale the latest gradients.\n momentum: Decay rate used by the momentum term, when it is set to `None`,\n then momentum is not used at all.\n nesterov: Whether Nesterov momentum is used.\n bias_correction: Whether to apply bias correction to the estimates of the\n second moments (and first moment if ``centered=True``).\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.rmsprop(learning_rate=0.003)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... 
print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.39E+01\n Objective function: 1.38E+01\n Objective function: 1.37E+01\n Objective function: 1.37E+01\n Objective function: 1.36E+01\n\n References:\n Hinton, `Overview of mini-batch gradient descent\n `_, 2012\n\n Graves, `Generating Sequences With Recurrent Neural Networks\n `_, 2014\n\n Ziyin, `LaProp: Separating Momentum and Adaptivity in Adam\n `_, 2021\n\n .. warning::\n Default behavior of optax's RMSprop (``eps_in_sqrt=True``) differs from\n PyTorch's implementation and could impact performance.\n If ``eps_in_sqrt=True``, optax uses\n :math:`\sqrt{v + \epsilon}` in the denominator whereas PyTorch uses\n :math:`\sqrt{v} + \epsilon`.\n Using ``eps_in_sqrt=False`` in optax will match PyTorch's behavior.\n See\n https://github.com/google-deepmind/optax/issues/532 for more detail.\n """\n if centered:\n return combine.chain(\n transform.scale_by_stddev(\n decay=decay,\n eps=eps,\n initial_scale=initial_scale,\n eps_in_sqrt=eps_in_sqrt,\n bias_correction=bias_correction,\n ),\n transform.scale_by_learning_rate(learning_rate),\n (\n transform.trace(decay=momentum, nesterov=nesterov)\n if momentum is not None\n else base.identity()\n ),\n )\n return combine.chain(\n transform.scale_by_rms(\n decay=decay,\n eps=eps,\n initial_scale=initial_scale,\n eps_in_sqrt=eps_in_sqrt,\n bias_correction=bias_correction,\n ),\n transform.scale_by_learning_rate(learning_rate),\n (\n transform.trace(decay=momentum, nesterov=nesterov)\n if momentum is not None\n else base.identity()\n ),\n )\n\n\ndef sgd(\n learning_rate: base.ScalarOrSchedule,\n momentum: Optional[float] = None,\n nesterov: bool = False,\n accumulator_dtype: Optional[Any] = None,\n) -> base.GradientTransformationExtraArgs:\n r"""A canonical Stochastic Gradient Descent optimizer.\n\n This implements stochastic gradient descent. 
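As a plain-Python sketch (illustrative only, not the optax implementation, which composes :func:`optax.trace` with :func:`optax.scale_by_learning_rate`), the vanilla update and the two momentum variants spelled out in this docstring act on a scalar gradient stream like this:

```python
# Scalar sketch of the SGD update rules: u_t = -lr * g_t, heavy-ball
# momentum m_t = g_t + mu * m_{t-1}, and the Nesterov variant.
def sgd_updates(grads, lr=0.1, mu=None, nesterov=False):
    m, out = 0.0, []
    for g in grads:
        if mu is None:
            out.append(-lr * g)  # u_t = -lr * g_t
        else:
            m = g + mu * m       # m_t = g_t + mu * m_{t-1}
            out.append(-lr * ((g + mu * m) if nesterov else m))
    return out

plain = sgd_updates([1.0, 1.0, 1.0])                        # -0.1 each step
heavy = sgd_updates([1.0, 1.0, 1.0], mu=0.9)                # -0.1, -0.19, -0.271
nest = sgd_updates([1.0, 1.0, 1.0], mu=0.9, nesterov=True)  # -0.19, -0.271, -0.3439
```

With ``mu=None`` this reduces to the canonical :math:`u_t = -\alpha_t g_t` update computed by this alias; with momentum, a constant gradient makes the update magnitude grow toward :math:`\alpha_t / (1 - \mu)`.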
It also includes support for\n momentum, and Nesterov acceleration, as these are standard practice when\n using stochastic gradient descent to train deep neural networks.\n\n\n The canonical stochastic gradient descent returns an update\n :math:`u_t` of the form\n\n .. math::\n u_t \leftarrow -\alpha_t g_t,\n\n where :math:`g_t` is the gradient of the objective (potentially preprocessed\n by other transformations) and :math:`\alpha_t` is the ``learning_rate`` at\n time :math:`t` (constant or selected by an :class:`optax.Schedule`).\n\n Stochastic gradient descent with momentum takes two possible forms.\n\n .. math::\n\n \begin{align*}\n m_t &\leftarrow g_t + \mu m_{t-1} \\\n u_t &\leftarrow \begin{cases}\n -\alpha_t m_t & \text{ if } \texttt{nesterov = False} \\\n -\alpha_t (g_t + \mu m_t) & \text{ if } \texttt{nesterov = True}\n \end{cases} \\\n S_t &\leftarrow m_t,\n \end{align*}\n\n where :math:`\mu` is the ``momentum`` parameter and :math:`S_t` is the state\n of the optimizer.\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n momentum: Decay rate used by the momentum term, when it is set to ``None``,\n then momentum is not used at all.\n nesterov: Whether Nesterov momentum is used.\n accumulator_dtype: Optional ``dtype`` to be used for the accumulator; if\n ``None`` then the ``dtype`` is inferred from ``params`` and ``updates``.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.sgd(learning_rate=0.003)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... 
updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.38E+01\n Objective function: 1.37E+01\n Objective function: 1.35E+01\n Objective function: 1.33E+01\n Objective function: 1.32E+01\n\n References:\n Sutskever et al, `On the importance of initialization and momentum in deep\n learning `_, 2013\n """"""\n if momentum is not None:\n opt = transform.trace(\n decay=momentum,\n nesterov=nesterov,\n accumulator_dtype=accumulator_dtype,\n )\n else:\n opt = base.identity()\n return combine.chain(\n opt,\n transform.scale_by_learning_rate(learning_rate),\n )\n\n\ndef sm3(\n learning_rate: float, momentum: float = 0.9\n) -> base.GradientTransformationExtraArgs:\n r""""""The SM3 optimizer.\n\n SM3 (Square-root of Minima of Sums of Maxima of Squared-gradients Method) is a\n memory-efficient adaptive optimizer designed to decrease memory overhead when\n training very large models, such as the Transformer for machine translation,\n BERT for language modeling, and AmoebaNet-D for image classification. SM3: 1)\n applies to tensors of arbitrary dimensions and any predefined cover of the\n parameters; 2) adapts the learning rates in an adaptive and data-driven manner\n (like Adagrad and unlike Adafactor); and 3) comes with rigorous convergence\n guarantees in stochastic convex optimization settings.\n\n The init function of this optimizer initializes an internal state\n :math:`S_0 := \{\mu_0, w_1\} = \{0, 0\}`, representing initial estimates for\n the cumulative squared gradients and the weights. These values are stored as\n pytrees containing all zeros, with the same shape as the model updates. At\n step :math:`t`, the update function of this optimizer takes as arguments\n the incoming gradients :math:`g_t` and optimizer state :math:`S_t` and\n computes updates :math:`u_t` and new state :math:`S_{t+1}`. 
Thus, for\n :math:`t > 0`, we have:\n\n SM3-I Algorithm\n\n .. math::\n\n \begin{array}{l}\n \text{parameters: learning rate } \eta \\\n \text{initialize } w_1 = 0; \forall r \in [k]: \mu_0(r) = 0 \\\n \text{for } t = 1, \ldots, T \text{ do} \\\n \quad \text{receive gradient } g_t = \nabla \ell_t(w_t) \\\n \quad \text{for } r = 1, \ldots, k \text{ do} \\\n \quad \quad \mu_t(r) \leftarrow \mu_{t-1}(r) +\n \max_{j \in S_r} g_t^2(j) \\\n \quad \text{for } i = 1, \ldots, d \text{ do} \\\n \quad \quad \nu_t(i) \leftarrow \min_{r:S_r \ni i} \mu_t(r) \\\n \quad \quad w_{t+1}(i) \leftarrow w_t(i) -\n \eta \frac{g_t(i)}{\sqrt{\nu_t(i)}} \\\n \quad \quad \text{with the convention that } 0/0 = 0\n \end{array}\n\n SM3-II Algorithm\n\n The SM3-II optimizer initializes with parameters like the learning rate\n :math:`\eta` and weight :math:`w_1`. It updates weights iteratively using\n gradients :math:`g_t`, adjusting each component with minimum accumulated\n values :math:`\nu'_t(i)` and maintaining cumulative maximums :math:`\mu'_t(r)`\n for subsets :math:`S_r`. SM3-II starts with an initial state\n :math:`S_0 := (m_0, s_0)` set to zero, storing estimates for first and second\n moments as pytrees matching the model updates' shape.\n\n .. 
math::\n\n \begin{array}{l}\n \text{parameters: learning rate } \eta \\\n \text{initialize } w_1 = 0; \forall r \in [k]: \mu'_0(r) = 0 \\\n \text{for } t = 1, \ldots, T \text{ do} \\\n \quad \text{receive gradient } g_t = \nabla \ell_t(w_t) \\\n \quad \text{initialize } \mu'_t(r) = 0 \text{ for all } r \in [k] \\\n \quad \text{for } i = 1, \ldots, d \text{ do} \\\n \quad \quad \nu'_t(i) \leftarrow \min_{r:S_r \ni i}\n \mu'_{t-1}(r) + g_t^2(i) \\\n \quad \quad w_{t+1}(i) \leftarrow w_t(i) -\n \eta \frac{g_t(i)}{\sqrt{\nu'_t(i)}} \\\n \quad \quad \text{with the convention that } 0/0 = 0 \\\n \quad \text{for all } r : S_r \ni i \text{ do} \\\n \quad \quad \mu'_t(r) \leftarrow \max\{\mu'_t(r), \nu'_t(i)\}\n \end{array}\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n momentum: Decay rate used by the momentum term.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.sm3(learning_rate=0.003)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... 
print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.40E+01\n Objective function: 1.40E+01\n Objective function: 1.40E+01\n Objective function: 1.40E+01\n Objective function: 1.40E+01\n\n References:\n Anil et al, `Memory-Efficient Adaptive Optimization\n `_, 2019\n """"""\n return combine.chain(\n transform.scale_by_sm3(momentum),\n transform.scale(-learning_rate),\n )\n\n\ndef yogi(\n learning_rate: base.ScalarOrSchedule,\n b1: float = 0.9,\n b2: float = 0.999,\n eps: float = 1e-3,\n) -> base.GradientTransformationExtraArgs:\n # pylint: disable=line-too-long\n """"""The Yogi optimizer.\n\n Yogi is an adaptive optimizer, which provides control in tuning the effective\n learning rate to prevent it from increasing. By doing so, it focuses on\n addressing the issues of convergence and generalization in exponential moving\n average-based adaptive methods (such as Adam and RMSprop). Yogi is a\n modification of Adam and uses the same parameters.\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n b1: Exponential decay rate to track the first moment of past gradients.\n b2: Exponential decay rate to track the second moment of past gradients.\n eps: A small constant applied to denominator outside of the square root (as\n in the Adam paper) to avoid dividing by zero when rescaling.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.yogi(learning_rate=0.002)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... 
params = optax.apply_updates(params, updates)\n ... print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.40E+01\n Objective function: 1.40E+01\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n\n References:\n Zaheer et al, `Adaptive Methods for Nonconvex Optimization\n `_,\n 2018\n """"""\n # pylint: enable=line-too-long\n return combine.chain(\n transform.scale_by_yogi(b1=b1, b2=b2, eps=eps),\n transform.scale_by_learning_rate(learning_rate),\n )\n\n\ndef adamax(\n learning_rate: base.ScalarOrSchedule,\n b1: float = 0.9,\n b2: float = 0.999,\n eps: float = 1e-8,\n) -> base.GradientTransformationExtraArgs:\n r""""""A variant of the Adam optimizer that uses the infinity norm.\n\n AdaMax is a variant of the :func:`optax.adam` optimizer. By generalizing\n Adam's :math:`L^2` norm to an :math:`L^p` norm and taking the limit as\n :math:`p \rightarrow \infty`, we obtain a simple and stable update rule.\n\n Let :math:`\alpha_t` represent the learning rate and :math:`\beta_1, \beta_2`,\n :math:`\varepsilon` represent the arguments\n ``b1``, ``b2`` and ``eps`` respectively. The learning rate is\n indexed by :math:`t` since the learning rate may also be provided by a\n schedule function.\n\n The ``init`` function of this optimizer initializes an internal state\n :math:`S_0 := (m_0, v_0) = (0, 0)`, representing initial estimates for the\n first and second moments. In practice these values are stored as pytrees\n containing all zeros, with the same shape as the model updates.\n At step :math:`t`, the ``update`` function of this optimizer takes as\n arguments the incoming gradients :math:`g_t` and optimizer state :math:`S_t`\n and computes updates :math:`u_t` and new state :math:`S_{t+1}`. Thus, for\n :math:`t > 0`, we have,\n\n .. 
math::\n\n \begin{align*}\n m_t &\leftarrow \beta_1 \cdot m_{t-1} + (1-\beta_1) \cdot g_t \\\n v_t &\leftarrow \max(\left| g_t \right| + \varepsilon, \beta_2 \cdot\n v_{t-1}) \\\n \hat{m}_t &\leftarrow m_t / (1-\beta_1^t) \\\n u_t &\leftarrow -\alpha_t \cdot \hat{m}_t / v_t \\\n S_t &\leftarrow (m_t, v_t).\n \end{align*}\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n b1: Exponential decay rate to track the first moment of past gradients.\n b2: Exponential decay rate to track the maximum of past gradients.\n eps: A small constant applied to denominator to avoid dividing by zero when\n rescaling.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.adamax(learning_rate=0.003)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.40E+01\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n Objective function: 1.38E+01\n\n References:\n Kingma et al, 2014: https://arxiv.org/abs/1412.6980\n\n .. 
seealso:: :func:`optax.adam`, :func:`optax.adamaxw`.\n """"""\n return combine.chain(\n transform.scale_by_adamax(\n b1=b1,\n b2=b2,\n eps=eps,\n ),\n transform.scale_by_learning_rate(learning_rate),\n )\n\n\ndef adamaxw(\n learning_rate: base.ScalarOrSchedule,\n b1: float = 0.9,\n b2: float = 0.999,\n eps: float = 1e-8,\n weight_decay: float = 1e-4,\n mask: Optional[Union[Any, Callable[[base.Params], Any]]] = None,\n) -> base.GradientTransformationExtraArgs:\n """"""Adamax with weight decay regularization.\n\n AdamaxW uses weight decay to regularize learning towards small weights, as\n this leads to better generalization. In SGD you can also use L2 regularization\n to implement this as an additive loss term, however L2 regularization\n does not behave as intended for adaptive gradient algorithms such as Adam.\n\n Args:\n learning_rate: A global scaling factor, either fixed or evolving along\n iterations with a scheduler, see :func:`optax.scale_by_learning_rate`.\n b1: Exponential decay rate to track the first moment of past gradients.\n b2: Exponential decay rate to track the maximum of past gradients.\n eps: A small constant applied to denominator to avoid dividing by zero when\n rescaling.\n weight_decay: Strength of the weight decay regularization. Note that this\n weight decay is multiplied with the learning rate. This is consistent with\n other frameworks such as PyTorch, but different from (Loshchilov et al,\n 2019) where the weight decay is only multiplied with the ""schedule\n multiplier"", but not the base learning rate.\n mask: A tree with same structure as (or a prefix of) the params PyTree, or a\n Callable that returns such a pytree given the params/updates. The leaves\n should be booleans, `True` for leaves/subtrees you want to apply the\n weight decay to, and `False` for those you want to skip. 
Note that the\n Adamax gradient transformations are applied to all parameters.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.adamaxw(learning_rate=0.003)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.40E+01\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n Objective function: 1.38E+01\n\n References:\n Loshchilov et al, 2019: https://arxiv.org/abs/1711.05101\n\n .. warning::\n Sometimes you may want to skip weight decay for BatchNorm scale\n or for the bias parameters. You can use `optax.masked` to make your own\n AdamaxW variant where `additive_weight_decay` is applied only to a subset of\n `params`.\n\n .. seealso:: :func:`optax.adam`, :func:`optax.adamax`.\n """"""\n return combine.chain(\n transform.scale_by_adamax(b1=b1, b2=b2, eps=eps),\n transform.add_decayed_weights(weight_decay, mask),\n transform.scale_by_learning_rate(learning_rate),\n )\n\n\ndef rprop(\n learning_rate: float,\n eta_minus: float = 0.5,\n eta_plus: float = 1.2,\n min_step_size: float = 1e-6,\n max_step_size: float = 50.0,\n) -> base.GradientTransformationExtraArgs:\n """"""The Rprop optimizer.\n\n Rprop, short for resilient backpropagation, is a first order variant of\n gradient descent. 
It responds only to the sign of the gradient by increasing\n or decreasing the step size selected per parameter exponentially to speed up\n convergence and avoid oscillations.\n\n Args:\n learning_rate: The initial step size.\n eta_minus: Multiplicative factor for decreasing step size. This is applied\n when the gradient changes sign from one step to the next.\n eta_plus: Multiplicative factor for increasing step size. This is applied\n when the gradient has the same sign from one step to the next.\n min_step_size: Minimum allowed step size. Smaller steps will be clipped to\n this value.\n max_step_size: Maximum allowed step size. Larger steps will be clipped to\n this value.\n\n Returns:\n The corresponding :class:`optax.GradientTransformationExtraArgs`.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.rprop(learning_rate=0.003)\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... grad = jax.grad(f)(params)\n ... updates, opt_state = solver.update(grad, opt_state, params)\n ... params = optax.apply_updates(params, updates)\n ... print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 1.40E+01\n Objective function: 1.40E+01\n Objective function: 1.39E+01\n Objective function: 1.39E+01\n Objective function: 1.38E+01\n\n References:\n Riedmiller et al. `A direct adaptive method for faster backpropagation\n learning: the RPROP algorithm\n `_, 1993\n\n Igel et al. 
`Empirical evaluation of the improved Rprop learning\n algorithms\n `_,\n 2003\n """"""\n return combine.chain(\n transform.scale_by_rprop(\n learning_rate=learning_rate,\n eta_minus=eta_minus,\n eta_plus=eta_plus,\n min_step_size=min_step_size,\n max_step_size=max_step_size,\n ),\n transform.scale(-1.0),\n )\n\n\ndef polyak_sgd(\n max_learning_rate: float = 1.0,\n scaling: base.ScalarOrSchedule = 1.0,\n f_min: float = 0.0,\n eps: float = 0.0,\n variant: str = 'sps',\n) -> base.GradientTransformationExtraArgs:\n r""""""SGD with Polyak step-size.\n\n This solver implements the SGD with Polyak step size of (Loizou et al. 2021).\n It sets the step-size as\n\n .. math::\n s \min\left\{\frac{f(x) - f^\star}{\|\nabla f(x)\|^2 + \epsilon},\n \gamma_{\max}\right\}\,,\n\n where :math:`f` is the function from which a gradient is computed,\n :math:`\gamma_{\max}` is a maximal acceptable learning rate set by\n ``max_learning_rate``, :math:`\epsilon` is a constant preventing division by\n zero set with ``eps``, :math:`s` scales the formula by ``scaling``, and\n :math:`f^\star` is a guess of the minimum value of the function set with\n ``f_min``.\n\n Setting ``variant=""sps+""`` (Garrigos et al. 2023) uses only the non-negative\n part of the suboptimality gap. That is, it replaces :math:`f(x) - f^\star`\n with :math:`(f(x) - f^\star)_+`, where :math:`a_+ = \max\{a, 0\}`.\n\n Args:\n max_learning_rate: a maximum step size to use (defaults to 1).\n scaling: A global scaling factor, either fixed or evolving along iterations\n with a scheduler (defaults to 1).\n f_min: a lower bound on the objective function (defaults to 0). 
Corresponds\n to :math:`f^\star` in the formula above.\n eps: a value to add in the denominator of the update (defaults to 0).\n variant: either ``'sps'`` or ``'sps+'`` (defaults to ``'sps'``).\n\n Returns:\n A :class:`optax.GradientTransformationExtraArgs`, where the ``update``\n function takes an additional keyword argument ``value`` containing the\n current value of the objective function.\n\n Examples:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2) # simple quadratic function\n >>> solver = optax.polyak_sgd()\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> for _ in range(5):\n ... value, grad = jax.value_and_grad(f)(params)\n ... params, opt_state = solver.update(grad, opt_state, params, value=value)\n ... print('Objective function: ', f(params))\n Objective function: 3.5\n Objective function: 0.875\n Objective function: 0.21875\n Objective function: 0.0546875\n Objective function: 0.013671875\n\n References:\n Loizou et al. `Stochastic polyak step-size for SGD: An adaptive learning\n rate for fast convergence `_, 2021\n\n Berrada et al., `Training neural networks for and by interpolation\n `_, 2020\n\n Garrigos et al., `Function value learning: Adaptive learning rates based on\n the Polyak stepsize and function splitting in ERM\n `_, 2023\n\n .. 
warning::\n This method requires knowledge of an approximate value of the\n objective function minimum, passed through the ``f_min`` argument.\n For models that interpolate the data, this can be set to 0 (default\n value).\n Failing to set an appropriate value for ``f_min`` can lead to\n divergence or convergence to a suboptimal solution.\n """"""\n return combine.chain(\n sgd(learning_rate=scaling),\n transform.scale_by_polyak(\n max_learning_rate=max_learning_rate,\n f_min=f_min,\n eps=eps,\n variant=variant,\n ),\n )\n\n\ndef lbfgs(\n learning_rate: Optional[base.ScalarOrSchedule] = None,\n memory_size: int = 10,\n scale_init_precond: bool = True,\n linesearch: Optional[\n Union[base.GradientTransformationExtraArgs, base.GradientTransformation]\n ] = _linesearch.scale_by_zoom_linesearch(\n max_linesearch_steps=20, initial_guess_strategy='one'\n ),\n) -> base.GradientTransformationExtraArgs:\n r""""""L-BFGS optimizer.\n\n L-BFGS is a quasi-Newton method that multiplies the update (gradient)\n with an approximation of the inverse Hessian. This algorithm does not need\n access to the Hessian, as this approximation is constructed from the gradient\n evaluations seen during optimization. L-BFGS is a limited-memory variant of\n the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm. The BFGS algorithm\n requires storing a matrix of size :math:`p \times p` with :math:`p` the\n dimension of the parameters.\n The limited variant circumvents this issue by computing the approximation of\n the inverse using only :math:`m` (``memory_size``) past differences of\n parameters/gradients. Namely, the approximation of the Hessian inverse is\n denoted :math:`P_k = P_{k, k}`, where\n\n .. 
math::\n\n \begin{align*}\n P_{k, j+1} & = V_j^\top P_{k, j} V_j + \rho_j \delta w_j \delta w_j^\top\n \quad \text{for} \ j \in \{k-m, \ldots, k-1\}\\\n P_{k, k-m} & = \gamma_k I \\\n V_k & = I - \rho_k \delta u_k \delta w_k^\top \\\n \rho_k & = 1/(\delta u_k^\top \delta w_k) \\\n \delta w_k & = w_{k+1} - w_k \\\n \delta u_k & = u_{k+1} - u_k \\\n \gamma_k & =\n \begin{cases}\n (\delta w_{k-1}^\top \delta u_{k-1}) /\n (\delta u_{k-1}^\top \delta u_{k-1})\n & \text{if} \ \texttt{scale\_init\_precond} \\\n 1 & \text{otherwise}\n \end{cases},\n \end{align*}\n\n for\n :math:`u_k` the gradients/updates at iteration :math:`k`,\n :math:`w_k` the parameters at iteration :math:`k`.\n\n The formula for updating :math:`P_k` is obtained by computing the optimal\n preconditioning matrix subject to some secant condition, see references\n for more details. Computing :math:`P_k u_k` can be done by a sequence of\n vector operations using past differences of parameters and gradients stored in\n a memory buffer.\n\n The present function just outputs the LBFGS direction :math:`P_k u_k`.\n It can be chained with a linesearch ensuring sufficient decrease and low\n curvature, such as a zoom linesearch. The linesearch computes a stepsize\n :math:`\eta_k`, such that the updated parameters\n (using :func:`optax.apply_updates`) take the form\n :math:`w_{k+1} = w_k - \eta_k P_k u_k`.\n\n Args:\n learning_rate: optional global scaling factor, either fixed or evolving\n along iterations with a scheduler, see\n :func:`optax.scale_by_learning_rate`. By default the learning rate is\n handled by a linesearch.\n memory_size: number of past updates to keep in memory to approximate the\n Hessian inverse.\n scale_init_precond: whether to use a scaled identity as the initial\n preconditioner, see formula of :math:`\gamma_k` above.\n linesearch: an instance of :class:`optax.GradientTransformationExtraArgs`\n such as :func:`optax.scale_by_zoom_linesearch` that computes a\n learning rate, a.k.a. 
stepsize, to satisfy some criterion such as a\n sufficient decrease of the objective by additional calls to the objective.\n\n Returns:\n A :class:`optax.GradientTransformationExtraArgs` object.\n\n Example:\n >>> import optax\n >>> import jax\n >>> import jax.numpy as jnp\n >>> def f(x): return jnp.sum(x ** 2)\n >>> solver = optax.lbfgs()\n >>> params = jnp.array([1., 2., 3.])\n >>> print('Objective function: ', f(params))\n Objective function: 14.0\n >>> opt_state = solver.init(params)\n >>> value_and_grad = optax.value_and_grad_from_state(f)\n >>> for _ in range(2):\n ... value, grad = value_and_grad(params, state=opt_state)\n ... updates, opt_state = solver.update(\n ... grad, opt_state, params, value=value, grad=grad, value_fn=f\n ... )\n ... params = optax.apply_updates(params, updates)\n ... print('Objective function: {:.2E}'.format(f(params)))\n Objective function: 7.52E+00\n Objective function: 7.46E-14\n\n References:\n Algorithms 7.4, 7.5 (page 199) of Nocedal et al, `Numerical Optimization\n `_\n , 1999\n\n Liu et al., `On the limited memory BFGS method for large scale optimization\n `_\n , 1989.\n\n .. warning::\n This optimizer is memory intensive and best used for small to medium\n scale problems.\n\n .. warning::\n This optimizer works best with a linesearch (current default is a\n zoom linesearch). See example above for best use in a non-stochastic\n setting, where we can recycle gradients computed by the linesearch using\n :func:`optax.value_and_grad_from_state`.\n\n .. note::\n We initialize the scaling of the identity as a capped reciprocal of the\n gradient norm. This avoids wasting linesearch iterations for the first step\n by taking into account the magnitude of the gradients. In other words, we\n constrain the trust-region of the first step to a Euclidean ball of radius\n 1 at the first iteration. The choice of :math:`\gamma_0` is not detailed in\n the references above, so this is a heuristic choice.\n\n .. 
note:: The algorithm can support complex inputs.\n """"""\n if learning_rate is None:\n base_scaling = transform.scale(-1.0)\n else:\n base_scaling = transform.scale_by_learning_rate(learning_rate)\n if linesearch is None:\n linesearch = base.identity()\n return combine.chain(\n transform.scale_by_lbfgs(\n memory_size=memory_size, scale_init_precond=scale_init_precond\n ),\n base_scaling,\n linesearch,\n )\n",python,tab
+91,913422,".venv/lib/python3.10/site-packages/optax/_src/alias.py",23933,0,"",python,selection_command
+92,913923,".venv/lib/python3.10/site-packages/optax/_src/alias.py",27123,0,"",python,selection_command
+93,925965,".venv/lib/python3.10/site-packages/optax/_src/alias.py",24042,0,"",python,selection_command
+94,928149,".venv/lib/python3.10/site-packages/optax/_src/alias.py",27123,0,"",python,selection_command
+95,928941,".venv/lib/python3.10/site-packages/optax/_src/alias.py",23933,0,"",python,selection_command
+96,929614,".venv/lib/python3.10/site-packages/optax/_src/alias.py",23786,0,"",python,selection_command
+97,929968,"train_dynamics.py",0,0,"",python,tab
+98,931487,".venv/lib/python3.10/site-packages/optax/_src/alias.py",0,0,"",python,tab
+99,932648,".venv/lib/python3.10/site-packages/optax/_src/alias.py",24042,0,"",python,selection_command
+100,932935,".venv/lib/python3.10/site-packages/optax/_src/alias.py",23786,0,"",python,selection_command
+101,933386,"train_dynamics.py",0,0,"",python,tab
+102,936492,".venv/lib/python3.10/site-packages/optax/_src/alias.py",0,0,"",python,tab
+103,936631,".venv/lib/python3.10/site-packages/optax/_src/alias.py",24042,0,"",python,selection_command
+104,939059,".venv/lib/python3.10/site-packages/optax/_src/alias.py",25239,0,"",python,selection_command
+105,941698,".venv/lib/python3.10/site-packages/optax/_src/alias.py",27126,0,"",python,selection_command
+106,956869,"train_dynamics.py",0,0,"",python,tab
+107,957615,"train_dynamics.py",6023,0,"",python,selection_command
+108,958538,"train_dynamics.py",6022,0,"",python,selection_command
+109,959683,"train_dynamics.py",6014,0,"",python,selection_command
+110,959932,"train_dynamics.py",6013,0,"",python,selection_command
+111,959961,"train_dynamics.py",6010,0,"",python,selection_command
+112,959993,"train_dynamics.py",6009,0,"",python,selection_command
+113,960028,"train_dynamics.py",6001,0,"",python,selection_command
+114,960061,"train_dynamics.py",5999,0,"",python,selection_command
+115,960097,"train_dynamics.py",5998,0,"",python,selection_command
+116,960130,"train_dynamics.py",5997,0,"",python,selection_command
+117,960163,"train_dynamics.py",5995,0,"",python,selection_command
+118,960197,"train_dynamics.py",5994,0,"",python,selection_command
+119,960230,"train_dynamics.py",5982,0,"",python,selection_command
+120,960264,"train_dynamics.py",5980,0,"",python,selection_command
+121,960298,"train_dynamics.py",5979,0,"",python,selection_command
+122,960331,"train_dynamics.py",5978,0,"",python,selection_command
+123,960364,"train_dynamics.py",5977,0,"",python,selection_command
+124,960397,"train_dynamics.py",5976,0,"",python,selection_command
+125,960431,"train_dynamics.py",5974,0,"",python,selection_command
+126,960464,"train_dynamics.py",5972,0,"",python,selection_command
+127,960748,"train_dynamics.py",5971,0,"",python,selection_command
+128,961033,"train_dynamics.py",5970,0,"",python,selection_command
+129,961279,"train_dynamics.py",5969,0,"",python,selection_command
+130,961533,"train_dynamics.py",5968,0,"",python,selection_command
+131,961779,"train_dynamics.py",5966,0,"",python,selection_command
+132,962222,"train_dynamics.py",5965,0,"",python,selection_command
+133,962560,"train_dynamics.py",5966,0,"",python,selection_command
+134,962772,"train_dynamics.py",5967,0,"",python,selection_command
+135,962919,"train_dynamics.py",5966,1,"",python,content
+136,963597,"train_dynamics.py",5965,0,"",python,selection_command
+137,963749,"train_dynamics.py",5966,0,"b",python,content
+138,963751,"train_dynamics.py",5967,0,"",python,selection_command
+139,964518,"train_dynamics.py",5966,0,"",python,selection_command
+140,964813,"train_dynamics.py",5965,1,"",python,content
+141,965129,"train_dynamics.py",5965,0,"\n ",python,content
+142,965660,"train_dynamics.py",5969,0,"",python,selection_command
+143,966183,"train_dynamics.py",5921,0,"",python,selection_command
+144,966269,"train_dynamics.py",5922,0,"",python,selection_command
+145,966283,"train_dynamics.py",5925,0,"",python,selection_command
+146,966448,"train_dynamics.py",5927,0,"",python,selection_command
+147,966567,"train_dynamics.py",5932,0,"",python,selection_command
+148,966749,"train_dynamics.py",5933,0,"",python,selection_command
+149,966911,"train_dynamics.py",5938,0,"",python,selection_command
+150,967065,"train_dynamics.py",5939,0,"",python,selection_command
+151,967431,"train_dynamics.py",5938,0,"",python,selection_command
+152,967633,"train_dynamics.py",5939,0,"",python,selection_command
+153,967785,"train_dynamics.py",5939,0,"\n ",python,content
+154,968215,"train_dynamics.py",5947,0,"",python,selection_command
+155,968655,"train_dynamics.py",5982,0,"",python,selection_command
+156,969467,"train_dynamics.py",5979,0,"",python,selection_command
+157,969816,"train_dynamics.py",5975,4,"",python,content
+158,969985,"train_dynamics.py",5974,1,"",python,content
+159,970376,"train_dynamics.py",5974,0,"\n ",python,content
+160,971349,"train_dynamics.py",5982,0,"",python,selection_command
+161,971951,"train_dynamics.py",5989,0,"",python,selection_command
+162,972567,"train_dynamics.py",5997,0,"",python,selection_command
+163,973019,"train_dynamics.py",5998,0,"",python,selection_command
+164,973102,"train_dynamics.py",5998,0,"\n ",python,content
+165,973562,"train_dynamics.py",6006,0,"",python,selection_command
+166,974012,"train_dynamics.py",6007,0,"",python,selection_command
+167,974065,"train_dynamics.py",6007,1,"",python,content
+168,974903,"train_dynamics.py",6024,0,"",python,selection_command
+169,975735,"train_dynamics.py",6025,0,"",python,selection_command
+170,975899,"train_dynamics.py",6026,0,"",python,selection_command
+171,976081,"train_dynamics.py",6025,1,"",python,content
+172,976230,"train_dynamics.py",6025,0,"\n ",python,content
+173,976779,"train_dynamics.py",6033,0,"",python,selection_command
+174,977582,"train_dynamics.py",6056,0,"",python,selection_command
+175,977852,"train_dynamics.py",6055,0,"",python,selection_command
+176,978015,"train_dynamics.py",6054,0,"",python,selection_command
+177,978500,"train_dynamics.py",6055,0,"",python,selection_command
+178,979121,"train_dynamics.py",6055,0,"\n ",python,content
+179,980415,"train_dynamics.py",6063,0,"",python,selection_command
+180,980549,"train_dynamics.py",6033,0,"",python,selection_command
+181,980678,"train_dynamics.py",6055,0,"",python,selection_command
+182,980821,"train_dynamics.py",6055,0,",",python,content
+183,980822,"train_dynamics.py",6056,0,"",python,selection_keyboard
+184,980980,"train_dynamics.py",6055,0,"",python,selection_command
+185,981211,"train_dynamics.py",6065,0,"",python,selection_command
+186,982148,"train_dynamics.py",6064,0,"",python,selection_command
+187,982276,"train_dynamics.py",6033,0,"",python,selection_command
+188,982532,"train_dynamics.py",6056,0,"\n ",python,content
+189,983465,"train_dynamics.py",6057,8,"",python,content
+190,983549,"train_dynamics.py",6026,0,"",python,selection_command
+191,983583,"train_dynamics.py",6027,0,"",python,selection_command
+192,983752,"train_dynamics.py",6034,0,"",python,selection_command
+193,984315,"train_dynamics.py",6057,0,"",python,selection_command
+194,985803,"train_dynamics.py",6056,1,"",python,content
+195,986061,"train_dynamics.py",6055,1,"",python,content
+196,986773,"train_dynamics.py",6055,0,",",python,content
+197,986775,"train_dynamics.py",6056,0,"",python,selection_keyboard
+198,987031,"train_dynamics.py",6056,0,"\n ",python,content
+199,988988,"train_dynamics.py",6065,0,"w",python,content
+200,988990,"train_dynamics.py",6066,0,"",python,selection_keyboard
+201,989047,"train_dynamics.py",6066,0,"e",python,content
+202,989050,"train_dynamics.py",6067,0,"",python,selection_keyboard
+203,989118,"train_dynamics.py",6067,0,"i",python,content
+204,989120,"train_dynamics.py",6068,0,"",python,selection_keyboard
+205,989196,"train_dynamics.py",6068,0,"g",python,content
+206,989197,"train_dynamics.py",6069,0,"",python,selection_keyboard
+207,989397,"train_dynamics.py",6069,0,"h",python,content
+208,989399,"train_dynamics.py",6070,0,"",python,selection_keyboard
+209,989509,"train_dynamics.py",6070,0,"t",python,content
+210,989510,"train_dynamics.py",6071,0,"",python,selection_keyboard
+211,989786,"train_dynamics.py",6071,0,"_",python,content
+212,989787,"train_dynamics.py",6072,0,"",python,selection_keyboard
+213,989934,"train_dynamics.py",6072,0,"d",python,content
+214,989935,"train_dynamics.py",6073,0,"",python,selection_keyboard
+215,990111,"train_dynamics.py",6073,0,"e",python,content
+216,990112,"train_dynamics.py",6074,0,"",python,selection_keyboard
+217,990180,"train_dynamics.py",6074,0,"c",python,content
+218,990181,"train_dynamics.py",6075,0,"",python,selection_keyboard
+219,990328,"train_dynamics.py",6075,0,"a",python,content
+220,990330,"train_dynamics.py",6076,0,"",python,selection_keyboard
+221,990431,"train_dynamics.py",6076,0,"y",python,content
+222,990433,"train_dynamics.py",6077,0,"",python,selection_keyboard
+223,993693,"train_dynamics.py",6076,0,"",python,selection_command
+224,993998,"train_dynamics.py",6045,0,"",python,selection_command
+225,994247,"train_dynamics.py",6018,0,"",python,selection_command
+226,994278,"train_dynamics.py",5994,0,"",python,selection_command
+227,994314,"train_dynamics.py",5959,0,"",python,selection_command
+228,994347,"train_dynamics.py",5937,0,"",python,selection_command
+229,994376,"train_dynamics.py",5916,0,"",python,selection_command
+230,994535,"train_dynamics.py",5863,0,"",python,selection_command
+231,994694,"train_dynamics.py",5809,0,"",python,selection_command
+232,995348,"train_dynamics.py",5813,0,"",python,selection_command
+233,995547,"train_dynamics.py",5814,0,"",python,selection_command
+234,995714,"train_dynamics.py",5842,0,"",python,selection_command
+235,996049,"train_dynamics.py",5896,0,"",python,selection_command
+236,996299,"train_dynamics.py",5916,0,"",python,selection_command
+237,996329,"train_dynamics.py",5938,0,"",python,selection_command
+238,996452,"train_dynamics.py",5973,0,"",python,selection_command
+239,996704,"train_dynamics.py",5997,0,"",python,selection_command
+240,996736,"train_dynamics.py",6024,0,"",python,selection_command
+241,996781,"train_dynamics.py",6055,0,"",python,selection_command
+242,996803,"train_dynamics.py",6076,0,"",python,selection_command
+243,997181,"train_dynamics.py",6055,0,"",python,selection_command
+244,998255,"train_dynamics.py",6076,0,"",python,selection_command
+245,998434,"train_dynamics.py",6086,0,"",python,selection_command
+246,998453,"train_dynamics.py",6076,0,"",python,selection_command
+247,998581,"train_dynamics.py",6055,0,"",python,selection_command
+248,998919,"train_dynamics.py",6024,0,"",python,selection_command
+249,999098,"train_dynamics.py",5997,0,"",python,selection_command
+250,999244,"train_dynamics.py",5973,0,"",python,selection_command
+251,999385,"train_dynamics.py",5938,0,"",python,selection_command
+252,999633,"train_dynamics.py",5933,0,"",python,selection_command
+253,999830,".venv/lib/python3.10/site-packages/optax/_src/alias.py",0,0,"",python,tab
+254,1002610,"train_dynamics.py",0,0,"",python,tab
+255,1003129,"train_dynamics.py",5955,0,"",python,selection_command
+256,1003384,"train_dynamics.py",5990,0,"",python,selection_command
+257,1003417,"train_dynamics.py",6014,0,"",python,selection_command
+258,1003447,"train_dynamics.py",6041,0,"",python,selection_command
+259,1003596,"train_dynamics.py",6072,0,"",python,selection_command
+260,1003886,"train_dynamics.py",6077,0,"",python,selection_command
+261,1004052,"train_dynamics.py",6077,0,"=",python,content
+262,1004054,"train_dynamics.py",6078,0,"",python,selection_keyboard
+263,1004902,"train_dynamics.py",6077,0,"",python,selection_command
+264,1005301,"train_dynamics.py",6057,22,"",python,content
+265,1005312,"train_dynamics.py",6065,0,"",python,selection_command
+266,1006113,"train_dynamics.py",6061,4,"",python,content
+267,1006298,"train_dynamics.py",6057,4,"",python,content
+268,1006580,"train_dynamics.py",6056,1,"",python,content
+269,1007331,"train_dynamics.py",6055,1,"",python,content
+270,1007446,"train_dynamics.py",6054,0,"",python,selection_command
+271,1008483,"train_dynamics.py",6024,0,"",python,selection_command
+272,1015404,"train_dynamics.py",5997,0,"",python,selection_command
+273,1015560,"train_dynamics.py",5965,0,"",python,selection_command
+274,1015726,"train_dynamics.py",5938,0,"",python,selection_command
+275,1015935,"train_dynamics.py",5965,0,"",python,selection_command
+276,1016151,"train_dynamics.py",5948,0,"",python,selection_command
+277,1016382,"train_dynamics.py",5944,4,"",python,content
+278,1016592,"train_dynamics.py",5940,4,"",python,content
+279,1016875,"train_dynamics.py",5939,1,"",python,content
+280,1017331,"train_dynamics.py",5938,0,"",python,selection_command
+281,1017526,"train_dynamics.py",5939,0,"\n ",python,content
+282,1017536,"train_dynamics.py",5948,0,"",python,selection_command
+283,1017774,"train_dynamics.py",6055,0,",\n ",python,content
+284,1017776,"train_dynamics.py",6065,0,"",python,selection_command
+285,1017808,"train_dynamics.py",6065,0,"weight_decay=\n ",python,content
+286,1017811,"train_dynamics.py",6077,0,"",python,selection_command
+287,1017841,"train_dynamics.py",6077,1,"",python,content
+288,1017844,"train_dynamics.py",6076,0,"",python,selection_command
+289,1017874,"train_dynamics.py",6057,20,"",python,content
+290,1017908,"train_dynamics.py",6057,1,"",python,content
+291,1017910,"train_dynamics.py",6033,0,"",python,selection_command
+292,1018094,"train_dynamics.py",6055,1,"",python,content
+293,1018098,"train_dynamics.py",6054,0,"",python,selection_command
+294,1018346,"train_dynamics.py",6055,9,"",python,content
+295,1018349,"train_dynamics.py",6055,0,"",python,selection_command
+296,1018383,"train_dynamics.py",6025,8,"",python,content
+297,1018386,"train_dynamics.py",6026,0,"",python,selection_command
+298,1018417,"train_dynamics.py",6007,0," ",python,content
+299,1018421,"train_dynamics.py",6007,0,"",python,selection_command
+300,1018449,"train_dynamics.py",5998,9,"",python,content
+301,1018619,"train_dynamics.py",5979,4,"",python,content
+302,1018621,"train_dynamics.py",5979,0,"",python,selection_command
+303,1018875,"train_dynamics.py",5939,9,"",python,content
+304,1018886,"train_dynamics.py",5939,0,"",python,selection_command
+305,1018913,"train_dynamics.py",5965,4,"",python,content
+306,1018916,"train_dynamics.py",5966,0,"",python,selection_command
+307,1019091,"train_dynamics.py",5999,23,"",python,content
+308,1019093,"train_dynamics.py",5999,0,"",python,selection_command
+309,1020417,"train_dynamics.py",5999,0,", mu_dtype=jnp.bfloat16",python,content
+310,1020425,"train_dynamics.py",5999,0,"",python,selection_command
+311,1021866,"train_dynamics.py",5918,0,"",python,selection_command
+312,1056712,"train_tokenizer.py",0,0,"from dataclasses import dataclass, field\nimport os\n\nimport einops\nfrom flax.training import orbax_utils\nfrom flax.training.train_state import TrainState\nfrom jax.sharding import Mesh, PartitionSpec, NamedSharding\nfrom jax.experimental.mesh_utils import create_device_mesh\nimport optax\nimport orbax.checkpoint as ocp\nimport numpy as np\nimport dm_pix as pix\nimport jax\nimport jax.numpy as jnp\nimport tyro\nimport wandb\nimport grain\n\nfrom models.tokenizer import TokenizerVQVAE\nfrom utils.dataloader import get_dataloader\nfrom utils.parameter_utils import count_parameters_by_component\n\n\n@dataclass\nclass Args:\n # Experiment\n num_steps: int = 300_000\n seed: int = 0\n seq_len: int = 16\n image_channels: int = 3\n image_height: int = 90\n image_width: int = 160\n data_dir: str = """"\n save_ckpt: bool = False\n restore_ckpt: bool = False\n # Optimization\n vq_beta: float = 0.25\n batch_size: int = 48\n min_lr: float = 0.0\n max_lr: float = 3e-4\n warmup_steps: int = 10000\n # Tokenizer\n model_dim: int = 512\n latent_dim: int = 32\n num_latents: int = 1024\n patch_size: int = 4\n num_blocks: int = 8\n num_heads: int = 8\n dropout: float = 0.0\n codebook_dropout: float = 0.01\n # Logging\n log: bool = False\n entity: str = """"\n project: str = """"\n name: str = ""train_tokenizer""\n tags: list[str] = field(default_factory=lambda: [""tokenizer""])\n log_interval: int = 5\n log_image_interval: int = 250\n ckpt_dir: str = """"\n log_checkpoint_interval: int = 10000\n log_checkpoint_keep_period: int = 20000\n log_gradients: bool = False\n\n\nargs = tyro.cli(Args)\n\n\ndef tokenizer_loss_fn(params, state, inputs):\n # --- Compute loss ---\n outputs = state.apply_fn(\n params,\n inputs,\n training=True,\n rngs={""params"": inputs[""rng""], ""dropout"": inputs[""dropout_rng""]},\n )\n mse = jnp.square(inputs[""videos""] - outputs[""recon""]).mean()\n q_loss = jnp.square(jax.lax.stop_gradient(outputs[""emb""]) - 
outputs[""z""]).mean()\n commitment_loss = jnp.square(\n outputs[""emb""] - jax.lax.stop_gradient(outputs[""z""])\n ).mean()\n loss = mse + q_loss + args.vq_beta * commitment_loss\n\n # --- Compute validation metrics ---\n gt = inputs[""videos""].clip(0, 1).reshape(-1, *inputs[""videos""].shape[2:])\n recon = outputs[""recon""].clip(0, 1).reshape(-1, *outputs[""recon""].shape[2:])\n psnr = pix.psnr(gt, recon).mean() # type: ignore\n ssim = pix.ssim(gt, recon).mean() # type: ignore\n _, index_counts = jnp.unique_counts(\n jnp.ravel(outputs[""indices""]), size=args.num_latents, fill_value=0\n )\n codebook_usage = (index_counts != 0).mean()\n metrics = dict(\n loss=loss,\n mse=mse,\n q_loss=q_loss,\n commitment_loss=commitment_loss,\n psnr=psnr,\n ssim=ssim,\n codebook_usage=codebook_usage,\n )\n return loss, (outputs[""recon""], metrics)\n\n\n@jax.jit\ndef train_step(state, inputs):\n grad_fn = jax.value_and_grad(tokenizer_loss_fn, has_aux=True, allow_int=True)\n (loss, (recon, metrics)), grads = grad_fn(state.params, state, inputs)\n state = state.apply_gradients(grads=grads)\n if args.log_gradients:\n metrics[""encoder_gradients_std/""] = jax.tree.map(\n lambda x: x.std(), grads[""params""][""encoder""]\n )\n metrics[""vq_gradients_std/""] = jax.tree.map(\n lambda x: x.std(), grads[""params""][""vq""]\n )\n metrics[""decoder_gradients_std/""] = jax.tree.map(\n lambda x: x.std(), grads[""params""][""decoder""]\n )\n return state, loss, recon, metrics\n\n\nif __name__ == ""__main__"":\n jax.distributed.initialize()\n num_devices = jax.device_count()\n if num_devices == 0:\n raise ValueError(""No JAX devices found."")\n print(f""Running on {num_devices} devices."")\n\n if args.batch_size % num_devices != 0:\n raise ValueError(\n f""Global batch size {args.batch_size} must be divisible by ""\n f""number of devices {num_devices}.""\n )\n\n per_device_batch_size_for_init = args.batch_size // num_devices\n\n rng = jax.random.PRNGKey(args.seed)\n\n # --- Initialize model 
---\n tokenizer = TokenizerVQVAE(\n in_dim=args.image_channels,\n model_dim=args.model_dim,\n latent_dim=args.latent_dim,\n num_latents=args.num_latents,\n patch_size=args.patch_size,\n num_blocks=args.num_blocks,\n num_heads=args.num_heads,\n dropout=args.dropout,\n codebook_dropout=args.codebook_dropout,\n )\n rng, _rng = jax.random.split(rng)\n image_shape = (args.image_height, args.image_width, args.image_channels)\n inputs = dict(\n videos=jnp.zeros(\n (per_device_batch_size_for_init, args.seq_len, *image_shape),\n dtype=jnp.float32,\n ),\n )\n init_params = tokenizer.init(_rng, inputs)\n\n param_counts = count_parameters_by_component(init_params)\n\n if args.log and jax.process_index() == 0:\n wandb.init(\n entity=args.entity,\n project=args.project,\n name=args.name,\n tags=args.tags,\n group=""debug"",\n config=args,\n )\n wandb.config.update({""model_param_count"": param_counts})\n\n print(""Parameter counts:"")\n print(param_counts)\n\n # --- Initialize optimizer ---\n lr_schedule = optax.warmup_cosine_decay_schedule(\n args.min_lr, args.max_lr, args.warmup_steps, args.num_steps\n )\n tx = optax.adamw(learning_rate=lr_schedule, b1=0.9, b2=0.9, weight_decay=1e-4)\n train_state = TrainState.create(apply_fn=tokenizer.apply, params=init_params, tx=tx)\n\n # FIXME: switch to create_hybrid_device_mesh for runs spanning multiple nodes\n device_mesh_arr = create_device_mesh((num_devices,))\n mesh = Mesh(devices=device_mesh_arr, axis_names=(""data"",))\n\n replicated_sharding = NamedSharding(mesh, PartitionSpec())\n videos_sharding = NamedSharding(\n mesh, PartitionSpec(""data"", None, None, None, None)\n )\n train_state = jax.device_put(train_state, replicated_sharding)\n\n # --- Initialize checkpoint manager ---\n step = 0\n handler_registry = ocp.handlers.DefaultCheckpointHandlerRegistry()\n handler_registry.add('model_state', ocp.args.StandardSave, ocp.handlers.StandardCheckpointHandler)\n handler_registry.add('model_state', ocp.args.StandardRestore, 
ocp.handlers.StandardCheckpointHandler)\n handler_registry.add('dataloader_state', grain.checkpoint.CheckpointSave, grain.checkpoint.CheckpointHandler) # type: ignore\n handler_registry.add('dataloader_state', grain.checkpoint.CheckpointRestore, grain.checkpoint.CheckpointHandler) # type: ignore\n \n checkpoint_options = ocp.CheckpointManagerOptions(\n save_interval_steps=args.log_checkpoint_interval,\n max_to_keep=3,\n keep_period=args.log_checkpoint_keep_period,\n step_format_fixed_length=6,\n cleanup_tmp_directories=True,\n )\n \n checkpoint_manager = ocp.CheckpointManager(\n args.ckpt_dir,\n options=checkpoint_options,\n handler_registry=handler_registry,\n )\n\n # --- Create DataLoaderIterator from dataloader ---\n array_record_files = [\n os.path.join(args.data_dir, x)\n for x in os.listdir(args.data_dir)\n if x.endswith("".array_record"")\n ]\n grain_dataloader = get_dataloader(\n array_record_files,\n args.seq_len,\n # NOTE: We deliberately pass the global batch size\n # The dataloader shards the dataset across all processes\n args.batch_size,\n *image_shape,\n num_workers=8,\n prefetch_buffer_size=1,\n seed=args.seed,\n )\n initial_state = grain_dataloader._create_initial_state()\n grain_iterator = grain.DataLoaderIterator(grain_dataloader, initial_state)\n \n # --- Restore checkpoint ---\n if args.restore_ckpt:\n abstract_train_state = jax.tree_util.tree_map(ocp.utils.to_shape_dtype_struct, train_state)\n restored = checkpoint_manager.restore(\n checkpoint_manager.latest_step(),\n args=ocp.args.Composite(\n model_state=ocp.args.StandardRestore(abstract_train_state),\n dataloader_state=grain.checkpoint.CheckpointRestore(grain_iterator),\n )\n )\n train_state = restored[""model_state""]\n grain_iterator = restored[""dataloader_state""]\n step = checkpoint_manager.latest_step() or 0\n print(f""Restored dataloader and model state from step {step}"")\n\n # --- TRAIN LOOP ---\n dataloader = (jax.make_array_from_process_local_data(videos_sharding, elem) for elem 
in grain_iterator) # type: ignore\n print(f""Starting training from step {step}..."")\n while step < args.num_steps:\n for videos in dataloader:\n # --- Train step ---\n rng, _rng, _rng_dropout = jax.random.split(rng, 3)\n\n inputs = dict(videos=videos, rng=_rng, dropout_rng=_rng_dropout)\n train_state, loss, recon, metrics = train_step(train_state, inputs)\n print(f""Step {step}, loss: {loss}"")\n step += 1\n\n # --- Logging ---\n if args.log:\n if step % args.log_interval == 0 and jax.process_index() == 0:\n wandb.log(\n {\n ""loss"": loss,\n ""step"": step,\n **metrics,\n }\n )\n if step % args.log_image_interval == 0:\n gt_seq = inputs[""videos""][0]\n recon_seq = recon[0].clip(0, 1)\n comparison_seq = jnp.concatenate((gt_seq, recon_seq), axis=1)\n comparison_seq = einops.rearrange(\n comparison_seq * 255, ""t h w c -> h (t w) c""\n )\n # NOTE: Process-dependent control flow deliberately happens\n # after indexing operation since it must not contain code\n # sections that lead to cross-accelerator communication.\n if jax.process_index() == 0:\n log_images = dict(\n image=wandb.Image(np.asarray(gt_seq[0])),\n recon=wandb.Image(np.asarray(recon_seq[0])),\n true_vs_recon=wandb.Image(\n np.asarray(comparison_seq.astype(np.uint8))\n ),\n )\n wandb.log(log_images)\n # --- Checkpointing ---\n if args.save_ckpt and step % args.log_checkpoint_interval == 0:\n checkpoint_manager.save(\n step,\n args=ocp.args.Composite(\n model_state=ocp.args.StandardSave(train_state),\n dataloader_state=grain.checkpoint.CheckpointSave(grain_iterator),\n )\n )\n print(f""Saved checkpoint at step {step}"")\n if step >= args.num_steps:\n break\n\n checkpoint_manager.close()",python,tab
+313,1058612,"train_tokenizer.py",5518,0,"",python,selection_command
+314,1071148,"train_lam.py",0,0,"from dataclasses import dataclass, field\nimport os\n\nimport einops\nfrom flax.training import orbax_utils\nfrom flax.training.train_state import TrainState\nfrom jax.sharding import Mesh, PartitionSpec, NamedSharding\nfrom jax.experimental.mesh_utils import create_device_mesh\nimport optax\nimport orbax.checkpoint as ocp\nimport numpy as np\nimport dm_pix as pix\nimport jax\nimport jax.numpy as jnp\nimport tyro\nimport wandb\nimport grain\n\nfrom models.lam import LatentActionModel\nfrom utils.dataloader import get_dataloader\nfrom utils.parameter_utils import count_parameters_by_component\n\n\n@dataclass\nclass Args:\n # Experiment\n num_steps: int = 200_000\n seed: int = 0\n seq_len: int = 16\n image_channels: int = 3\n image_height: int = 90\n image_width: int = 160\n data_dir: str = """"\n save_ckpt: bool = False\n restore_ckpt: bool = False\n # Optimization\n batch_size: int = 36\n vq_beta: float = 0.25\n min_lr: float = 0.0\n max_lr: float = 3e-5\n warmup_steps: int = 5000\n vq_reset_thresh: int = 50\n # LAM\n model_dim: int = 512\n latent_dim: int = 32\n num_latents: int = 6\n patch_size: int = 16\n num_blocks: int = 8\n num_heads: int = 8\n dropout: float = 0.0\n codebook_dropout: float = 0.0\n # Logging\n log: bool = False\n entity: str = """"\n project: str = """"\n name: str = ""train_lam""\n tags: list[str] = field(default_factory=lambda: [""lam""])\n log_interval: int = 5\n log_image_interval: int = 250\n ckpt_dir: str = """"\n log_checkpoint_interval: int = 10000\n log_checkpoint_keep_period: int = 20000\n\n\nargs = tyro.cli(Args)\n\n\ndef lam_loss_fn(params, state, inputs):\n # --- Compute loss ---\n outputs = state.apply_fn(\n params, inputs, training=True, rngs={""dropout"": inputs[""rng""]}\n )\n gt_future_frames = inputs[""videos""][:, 1:]\n mse = jnp.square(gt_future_frames - outputs[""recon""]).mean()\n q_loss = jnp.square(jax.lax.stop_gradient(outputs[""emb""]) - outputs[""z""]).mean()\n commitment_loss = 
jnp.square(\n outputs[""emb""] - jax.lax.stop_gradient(outputs[""z""])\n ).mean()\n loss = mse + q_loss + args.vq_beta * commitment_loss\n\n # --- Compute validation metrics ---\n gt = gt_future_frames.clip(0, 1).reshape(-1, *gt_future_frames.shape[2:])\n recon = outputs[""recon""].clip(0, 1).reshape(-1, *outputs[""recon""].shape[2:])\n psnr = pix.psnr(gt, recon).mean() # type: ignore\n ssim = pix.ssim(gt, recon).mean() # type: ignore\n count_fn = jax.vmap(lambda i: (outputs[""indices""] == i).sum())\n index_counts = count_fn(jnp.arange(args.num_latents))\n metrics = dict(\n loss=loss,\n mse=mse,\n q_loss=q_loss,\n commitment_loss=commitment_loss,\n psnr=psnr,\n ssim=ssim,\n codebook_usage=(index_counts != 0).mean(),\n )\n return loss, (outputs[""recon""], index_counts, metrics)\n\n\n@jax.jit\ndef train_step(state, inputs, action_last_active):\n # --- Update model ---\n rng, inputs[""rng""] = jax.random.split(inputs[""rng""])\n grad_fn = jax.value_and_grad(lam_loss_fn, has_aux=True, allow_int=True)\n (loss, (recon, idx_counts, metrics)), grads = grad_fn(state.params, state, inputs)\n state = state.apply_gradients(grads=grads)\n\n # --- Reset inactive latent actions ---\n codebook = state.params[""params""][""vq""][""codebook""]\n num_codes = len(codebook)\n active_codes = idx_counts != 0.0\n action_last_active = jnp.where(active_codes, 0, action_last_active + 1)\n p_code = active_codes / active_codes.sum()\n reset_idxs = jax.random.choice(rng, num_codes, shape=(num_codes,), p=p_code)\n do_reset = action_last_active >= args.vq_reset_thresh\n new_codebook = jnp.where(\n jnp.expand_dims(do_reset, -1), codebook[reset_idxs], codebook\n )\n state.params[""params""][""vq""][""codebook""] = new_codebook\n action_last_active = jnp.where(do_reset, 0, action_last_active)\n return state, loss, recon, action_last_active, metrics\n\n\nif __name__ == ""__main__"":\n jax.distributed.initialize()\n num_devices = jax.device_count()\n if num_devices == 0:\n raise ValueError(""No JAX 
devices found."")\n print(f""Running on {num_devices} devices."")\n\n if args.batch_size % num_devices != 0:\n raise ValueError(\n f""Global batch size {args.batch_size} must be divisible by ""\n f""number of devices {num_devices}.""\n )\n\n per_device_batch_size_for_init = args.batch_size // num_devices\n\n rng = jax.random.PRNGKey(args.seed)\n\n # --- Initialize model ---\n lam = LatentActionModel(\n in_dim=args.image_channels,\n model_dim=args.model_dim,\n latent_dim=args.latent_dim,\n num_latents=args.num_latents,\n patch_size=args.patch_size,\n num_blocks=args.num_blocks,\n num_heads=args.num_heads,\n dropout=args.dropout,\n codebook_dropout=args.codebook_dropout,\n )\n # Track when each action was last sampled\n action_last_active = jnp.zeros(args.num_latents)\n image_shape = (args.image_height, args.image_width, args.image_channels)\n rng, _rng = jax.random.split(rng)\n inputs = dict(\n videos=jnp.zeros(\n (per_device_batch_size_for_init, args.seq_len, *image_shape),\n dtype=jnp.float32,\n ),\n rng=_rng,\n )\n rng, _rng = jax.random.split(rng)\n init_params = lam.init(_rng, inputs)\n\n param_counts = count_parameters_by_component(init_params)\n\n if args.log and jax.process_index() == 0:\n wandb.init(\n entity=args.entity,\n project=args.project,\n name=args.name,\n tags=args.tags,\n group=""debug"",\n config=args,\n )\n wandb.config.update({""model_param_count"": param_counts})\n\n print(""Parameter counts:"")\n print(param_counts)\n\n # --- Initialize optimizer ---\n lr_schedule = optax.warmup_cosine_decay_schedule(\n args.min_lr, args.max_lr, args.warmup_steps, args.num_steps\n )\n tx = optax.adamw(learning_rate=lr_schedule, b1=0.9, b2=0.9, weight_decay=1e-4)\n train_state = TrainState.create(apply_fn=lam.apply, params=init_params, tx=tx)\n\n # FIXME: switch to create_hybrid_device_mesh for runs spanning multiple nodes\n device_mesh_arr = create_device_mesh((num_devices,))\n mesh = Mesh(devices=device_mesh_arr, axis_names=(""data"",))\n\n 
replicated_sharding = NamedSharding(mesh, PartitionSpec())\n videos_sharding = NamedSharding(\n mesh, PartitionSpec(""data"", None, None, None, None)\n )\n train_state = jax.device_put(train_state, replicated_sharding)\n action_last_active = jax.device_put(action_last_active, replicated_sharding)\n\n # --- Initialize checkpoint manager ---\n step = 0\n handler_registry = ocp.handlers.DefaultCheckpointHandlerRegistry()\n handler_registry.add('model_state', ocp.args.StandardSave, ocp.handlers.StandardCheckpointHandler)\n handler_registry.add('model_state', ocp.args.StandardRestore, ocp.handlers.StandardCheckpointHandler)\n handler_registry.add('dataloader_state', grain.checkpoint.CheckpointSave, grain.checkpoint.CheckpointHandler) # type: ignore\n handler_registry.add('dataloader_state', grain.checkpoint.CheckpointRestore, grain.checkpoint.CheckpointHandler) # type: ignore\n \n checkpoint_options = ocp.CheckpointManagerOptions(\n save_interval_steps=args.log_checkpoint_interval,\n max_to_keep=3,\n keep_period=args.log_checkpoint_keep_period,\n step_format_fixed_length=6,\n cleanup_tmp_directories=True,\n )\n \n checkpoint_manager = ocp.CheckpointManager(\n args.ckpt_dir,\n options=checkpoint_options,\n handler_registry=handler_registry,\n )\n\n # --- Create DataLoaderIterator from dataloader ---\n array_record_files = [\n os.path.join(args.data_dir, x)\n for x in os.listdir(args.data_dir)\n if x.endswith("".array_record"")\n ]\n grain_dataloader = get_dataloader(\n array_record_files,\n args.seq_len,\n # NOTE: We deliberately pass the global batch size\n # The dataloader shards the dataset across all processes\n args.batch_size,\n *image_shape,\n num_workers=8,\n prefetch_buffer_size=1,\n seed=args.seed,\n )\n initial_state = grain_dataloader._create_initial_state()\n grain_iterator = grain.DataLoaderIterator(grain_dataloader, initial_state)\n \n # --- Restore checkpoint ---\n if args.restore_ckpt:\n abstract_train_state = 
jax.tree_util.tree_map(ocp.utils.to_shape_dtype_struct, train_state)\n restored = checkpoint_manager.restore(\n checkpoint_manager.latest_step(),\n args=ocp.args.Composite(\n model_state=ocp.args.StandardRestore(abstract_train_state),\n dataloader_state=grain.checkpoint.CheckpointRestore(grain_iterator),\n )\n )\n train_state = restored[""model_state""]\n grain_iterator = restored[""dataloader_state""]\n step = checkpoint_manager.latest_step() or 0\n print(f""Restored dataloader and model state from step {step}"")\n\n # --- TRAIN LOOP ---\n dataloader = (jax.make_array_from_process_local_data(videos_sharding, elem) for elem in grain_iterator) # type: ignore\n print(f""Starting training from step {step}..."")\n while step < args.num_steps:\n for videos in dataloader:\n # --- Train step ---\n rng, _rng = jax.random.split(rng)\n\n inputs = dict(videos=videos, rng=_rng)\n train_state, loss, recon, action_last_active, metrics = train_step(\n train_state, inputs, action_last_active\n )\n print(f""Step {step}, loss: {loss}"")\n step += 1\n\n # --- Logging ---\n if args.log:\n if step % args.log_interval == 0 and jax.process_index() == 0:\n wandb.log(\n {\n ""loss"": loss,\n ""step"": step,\n **metrics,\n }\n )\n if step % args.log_image_interval == 0:\n gt_seq = inputs[""videos""][0][1:]\n recon_seq = recon[0].clip(0, 1)\n comparison_seq = jnp.concatenate((gt_seq, recon_seq), axis=1)\n comparison_seq = einops.rearrange(\n comparison_seq * 255, ""t h w c -> h (t w) c""\n )\n if jax.process_index() == 0:\n log_images = dict(\n image=wandb.Image(np.asarray(gt_seq[0])),\n recon=wandb.Image(np.asarray(recon_seq[0])),\n true_vs_recon=wandb.Image(\n np.asarray(comparison_seq.astype(np.uint8))\n ),\n )\n wandb.log(log_images)\n # --- Checkpointing ---\n if args.save_ckpt and step % args.log_checkpoint_interval == 0:\n checkpoint_manager.save(\n step,\n args=ocp.args.Composite(\n model_state=ocp.args.StandardSave(train_state),\n 
dataloader_state=grain.checkpoint.CheckpointSave(grain_iterator),\n )\n )\n print(f""Saved checkpoint at step {step}"")\n if step >= args.num_steps:\n break\n\n checkpoint_manager.close()\n",python,tab
+315,1073692,"train_lam.py",6000,0,"",python,selection_command
+316,1075357,"train_dynamics.py",0,0,"",python,tab
+317,1077170,"train_dynamics.py",6022,0,"",python,selection_command
+318,1077716,"train_dynamics.py",6014,0,"",python,selection_command
+319,1077959,"train_dynamics.py",6013,0,"",python,selection_command
+320,1077991,"train_dynamics.py",6010,0,"",python,selection_command
+321,1078024,"train_dynamics.py",6009,0,"",python,selection_command
+322,1078288,"train_dynamics.py",6001,0,"",python,selection_command
+323,1078435,"train_dynamics.py",5999,0,"",python,selection_command
+324,1078635,"train_dynamics.py",6001,0,"",python,selection_command
+325,1079018,"train_dynamics.py",5918,0,"",python,selection_command
+326,1081266,"train_tokenizer.py",0,0,"",python,tab
+327,1081999,"train_tokenizer.py",5536,0,"",python,selection_command
+328,1082226,"train_tokenizer.py",5535,0,"",python,selection_command
+329,1082484,"train_tokenizer.py",5535,0,",",python,content
+330,1082487,"train_tokenizer.py",5536,0,"",python,selection_keyboard
+331,1082781,"train_tokenizer.py",5536,0," ",python,content
+332,1082782,"train_tokenizer.py",5537,0,"",python,selection_keyboard
+333,1082959,"train_tokenizer.py",5537,0,"m",python,content
+334,1082960,"train_tokenizer.py",5538,0,"",python,selection_keyboard
+335,1083177,"train_tokenizer.py",5538,0,"y",python,content
+336,1083178,"train_tokenizer.py",5539,0,"",python,selection_keyboard
+337,1083515,"train_tokenizer.py",5538,1,"",python,content
+338,1083714,"train_tokenizer.py",5538,0,"u",python,content
+339,1083716,"train_tokenizer.py",5539,0,"",python,selection_keyboard
+340,1083920,"train_tokenizer.py",5539,0,"_",python,content
+341,1083921,"train_tokenizer.py",5540,0,"",python,selection_keyboard
+342,1084142,"train_tokenizer.py",5540,0,"d",python,content
+343,1084145,"train_tokenizer.py",5541,0,"",python,selection_keyboard
+344,1084294,"train_tokenizer.py",5541,0,"t",python,content
+345,1084297,"train_tokenizer.py",5542,0,"",python,selection_keyboard
+346,1084376,"train_tokenizer.py",5542,0,"y",python,content
+347,1084379,"train_tokenizer.py",5543,0,"",python,selection_keyboard
+348,1084397,"train_tokenizer.py",5543,0,"p",python,content
+349,1084399,"train_tokenizer.py",5544,0,"",python,selection_keyboard
+350,1084694,"train_tokenizer.py",5544,0,"e",python,content
+351,1084696,"train_tokenizer.py",5545,0,"",python,selection_keyboard
+352,1084866,"train_tokenizer.py",5545,0,"=",python,content
+353,1084869,"train_tokenizer.py",5546,0,"",python,selection_keyboard
+354,1086529,"train_tokenizer.py",5546,0,"j",python,content
+355,1086531,"train_tokenizer.py",5547,0,"",python,selection_keyboard
+356,1086681,"train_tokenizer.py",5547,0,"n",python,content
+357,1086683,"train_tokenizer.py",5548,0,"",python,selection_keyboard
+358,1087062,"train_tokenizer.py",5548,0,"p",python,content
+359,1087065,"train_tokenizer.py",5549,0,"",python,selection_keyboard
+360,1088271,"train_tokenizer.py",5549,0,".",python,content
+361,1088279,"train_tokenizer.py",5550,0,"",python,selection_keyboard
+362,1088565,"train_tokenizer.py",5550,0,"f",python,content
+363,1088567,"train_tokenizer.py",5551,0,"",python,selection_keyboard
+364,1088636,"train_tokenizer.py",5551,0,"b",python,content
+365,1088638,"train_tokenizer.py",5552,0,"",python,selection_keyboard
+366,1088791,"train_tokenizer.py",5552,0,"l",python,content
+367,1088792,"train_tokenizer.py",5553,0,"",python,selection_keyboard
+368,1088944,"train_tokenizer.py",5553,0,"o",python,content
+369,1088946,"train_tokenizer.py",5554,0,"",python,selection_keyboard
+370,1089513,"train_tokenizer.py",5553,1,"",python,content
+371,1089657,"train_tokenizer.py",5552,1,"",python,content
+372,1089801,"train_tokenizer.py",5551,1,"",python,content
+373,1089960,"train_tokenizer.py",5550,1,"",python,content
+374,1090622,"train_tokenizer.py",5550,0,"b",python,content
+375,1090630,"train_tokenizer.py",5551,0,"",python,selection_keyboard
+376,1090664,"train_tokenizer.py",5551,0,"f",python,content
+377,1090666,"train_tokenizer.py",5552,0,"",python,selection_keyboard
+378,1090783,"train_tokenizer.py",5552,0,"l",python,content
+379,1090785,"train_tokenizer.py",5553,0,"",python,selection_keyboard
+380,1090936,"train_tokenizer.py",5553,0,"o",python,content
+381,1090938,"train_tokenizer.py",5554,0,"",python,selection_keyboard
+382,1090989,"train_tokenizer.py",5554,0,"a",python,content
+383,1090990,"train_tokenizer.py",5555,0,"",python,selection_keyboard
+384,1091108,"train_tokenizer.py",5555,0,"t",python,content
+385,1091109,"train_tokenizer.py",5556,0,"",python,selection_keyboard
+386,1091567,"train_tokenizer.py",5556,0,"1",python,content
+387,1091569,"train_tokenizer.py",5557,0,"",python,selection_keyboard
+388,1091665,"train_tokenizer.py",5557,0,"6",python,content
+389,1091667,"train_tokenizer.py",5558,0,"",python,selection_keyboard
+390,1091824,"train_tokenizer.py",5557,0,"",python,selection_command
+391,1092578,"train_tokenizer.py",5550,0,"",python,selection_command
+392,1092787,".venv/lib/python3.10/site-packages/jax/_src/numpy/scalar_types.py",0,0,"# Copyright 2025 The JAX Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the ""License"");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an ""AS IS"" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\n\n# At present JAX doesn't have a reason to distinguish between scalars and arrays\n# in its object system. Further, we want JAX scalars to have the same type\n# promotion behaviors as JAX arrays. Rather than introducing a new type of JAX\n# scalar object with JAX promotion behaviors, instead we make the JAX scalar\n# types return JAX arrays when instantiated.\n\nfrom typing import Any\n\nimport jax\nfrom jax._src.typing import Array\nfrom jax._src import core\nfrom jax._src import dtypes\nimport numpy as np\n\n\n# Some objects below rewrite their __module__ attribute to this name.\n_PUBLIC_MODULE_NAME = ""jax.numpy""\n\n\nclass _ScalarMeta(type):\n dtype: np.dtype\n\n def __hash__(self) -> int:\n return hash(self.dtype.type)\n\n def __eq__(self, other: Any) -> bool:\n return self is other or self.dtype.type == other\n\n def __ne__(self, other: Any) -> bool:\n return not (self == other)\n\n def __call__(self, x: Any) -> Array:\n return jax.numpy.asarray(x, dtype=self.dtype)\n\n def __instancecheck__(self, instance: Any) -> bool:\n return isinstance(instance, self.dtype.type)\n\ndef _abstractify_scalar_meta(x):\n raise TypeError(f""JAX scalar type {x} cannot be interpreted as a JAX array."")\ncore.pytype_aval_mappings[_ScalarMeta] = _abstractify_scalar_meta\n\ndef 
_make_scalar_type(np_scalar_type: type) -> _ScalarMeta:\n meta = _ScalarMeta(np_scalar_type.__name__, (object,),\n {""dtype"": np.dtype(np_scalar_type)})\n meta.__module__ = _PUBLIC_MODULE_NAME\n meta.__doc__ =\\n f""""""A JAX scalar constructor of type {np_scalar_type.__name__}.\n\n While NumPy defines scalar types for each data type, JAX represents\n scalars as zero-dimensional arrays.\n """"""\n return meta\n\nbool_ = _make_scalar_type(np.bool_)\nuint2 = _make_scalar_type(dtypes.uint2)\nuint4 = _make_scalar_type(dtypes.uint4)\nuint8 = _make_scalar_type(np.uint8)\nuint16 = _make_scalar_type(np.uint16)\nuint32 = _make_scalar_type(np.uint32)\nuint64 = _make_scalar_type(np.uint64)\nint2 = _make_scalar_type(dtypes.int2)\nint4 = _make_scalar_type(dtypes.int4)\nint8 = _make_scalar_type(np.int8)\nint16 = _make_scalar_type(np.int16)\nint32 = _make_scalar_type(np.int32)\nint64 = _make_scalar_type(np.int64)\nfloat4_e2m1fn = _make_scalar_type(dtypes.float4_e2m1fn)\nfloat8_e3m4 = _make_scalar_type(dtypes.float8_e3m4)\nfloat8_e4m3 = _make_scalar_type(dtypes.float8_e4m3)\nfloat8_e8m0fnu = _make_scalar_type(dtypes.float8_e8m0fnu)\nfloat8_e4m3fn = _make_scalar_type(dtypes.float8_e4m3fn)\nfloat8_e4m3fnuz = _make_scalar_type(dtypes.float8_e4m3fnuz)\nfloat8_e5m2 = _make_scalar_type(dtypes.float8_e5m2)\nfloat8_e5m2fnuz = _make_scalar_type(dtypes.float8_e5m2fnuz)\nfloat8_e4m3b11fnuz = _make_scalar_type(dtypes.float8_e4m3b11fnuz)\nbfloat16 = _make_scalar_type(dtypes.bfloat16)\nfloat16 = _make_scalar_type(np.float16)\nfloat32 = single = _make_scalar_type(np.float32)\nfloat64 = double = _make_scalar_type(np.float64)\ncomplex64 = csingle = _make_scalar_type(np.complex64)\ncomplex128 = cdouble = _make_scalar_type(np.complex128)\n\nint_ = int32 if dtypes.int_ == np.int32 else int64\nuint = uint32 if dtypes.uint == np.uint32 else uint64\nfloat_: Any = float32 if dtypes.float_ == np.float32 else float64\ncomplex_ = complex64 if dtypes.complex_ == np.complex64 else complex128\n",python,tab
+393,1094345,"train_tokenizer.py",0,0,"",python,tab
+394,1097651,"train_lam.py",0,0,"",python,tab
+395,1099020,"train_lam.py",6018,0,"",python,selection_command
+396,1099244,"train_lam.py",6017,0,"",python,selection_command
+397,1099526,"train_lam.py",6017,0,",",python,content
+398,1099529,"train_lam.py",6018,0,"",python,selection_keyboard
+399,1099625,"train_lam.py",6018,0," ",python,content
+400,1099627,"train_lam.py",6019,0,"",python,selection_keyboard
+401,1099686,"train_lam.py",6019,0,"w",python,content
+402,1099687,"train_lam.py",6020,0,"",python,selection_keyboard
+403,1099763,"train_lam.py",6020,0,"e",python,content
+404,1099764,"train_lam.py",6021,0,"",python,selection_keyboard
+405,1100101,"train_lam.py",6020,1,"",python,content
+406,1100245,"train_lam.py",6019,1,"",python,content
+407,1100886,"train_lam.py",6019,0,"m",python,content
+408,1100891,"train_lam.py",6020,0,"",python,selection_keyboard
+409,1101010,"train_lam.py",6020,0,"u",python,content
+410,1101014,"train_lam.py",6021,0,"",python,selection_keyboard
+411,1101201,"train_lam.py",6021,0,"_",python,content
+412,1101203,"train_lam.py",6022,0,"",python,selection_keyboard
+413,1101396,"train_lam.py",6022,0,"d",python,content
+414,1101397,"train_lam.py",6023,0,"",python,selection_keyboard
+415,1101547,"train_lam.py",6023,0,"t",python,content
+416,1101550,"train_lam.py",6024,0,"",python,selection_keyboard
+417,1101602,"train_lam.py",6024,0,"y",python,content
+418,1101604,"train_lam.py",6025,0,"",python,selection_keyboard
+419,1101693,"train_lam.py",6025,0,"p",python,content
+420,1101694,"train_lam.py",6026,0,"",python,selection_keyboard
+421,1101734,"train_lam.py",6026,0,"e",python,content
+422,1101735,"train_lam.py",6027,0,"",python,selection_keyboard
+423,1101920,"train_lam.py",6027,0,"=",python,content
+424,1101921,"train_lam.py",6028,0,"",python,selection_keyboard
+425,1102734,"train_lam.py",6028,0,"j",python,content
+426,1102739,"train_lam.py",6029,0,"",python,selection_keyboard
+427,1102883,"train_lam.py",6029,0,"n",python,content
+428,1102884,"train_lam.py",6030,0,"",python,selection_keyboard
+429,1102951,"train_lam.py",6030,0,"p",python,content
+430,1102953,"train_lam.py",6031,0,"",python,selection_keyboard
+431,1103403,"train_lam.py",6031,0,"b",python,content
+432,1103404,"train_lam.py",6032,0,"",python,selection_keyboard
+433,1103707,"train_lam.py",6032,0,"f",python,content
+434,1103708,"train_lam.py",6033,0,"",python,selection_keyboard
+435,1103996,"train_lam.py",6032,1,"",python,content
+436,1104131,"train_lam.py",6031,1,"",python,content
+437,1104325,"train_lam.py",6031,0,".",python,content
+438,1104326,"train_lam.py",6032,0,"",python,selection_keyboard
+439,1104981,"train_lam.py",6032,0,"b",python,content
+440,1104983,"train_lam.py",6033,0,"",python,selection_keyboard
+441,1105118,"train_lam.py",6033,0,"f",python,content
+442,1105119,"train_lam.py",6034,0,"",python,selection_keyboard
+443,1105192,"train_lam.py",6034,0,"l",python,content
+444,1105195,"train_lam.py",6035,0,"",python,selection_keyboard
+445,1105377,"train_lam.py",6035,0,"o",python,content
+446,1105380,"train_lam.py",6036,0,"",python,selection_keyboard
+447,1105449,"train_lam.py",6036,0,"a",python,content
+448,1105450,"train_lam.py",6037,0,"",python,selection_keyboard
+449,1105545,"train_lam.py",6037,0,"t",python,content
+450,1105551,"train_lam.py",6038,0,"",python,selection_keyboard
+451,1106009,"train_lam.py",6038,0,"1",python,content
+452,1106012,"train_lam.py",6039,0,"",python,selection_keyboard
+453,1106097,"train_lam.py",6039,0,"6",python,content
+454,1106099,"train_lam.py",6040,0,"",python,selection_keyboard
+455,1106212,"train_lam.py",6039,0,"",python,selection_command
+456,1142505,"train_tokenizer.py",0,0,"",python,tab
+457,1142851,"train_tokenizer.py",0,0,"",python,selection_command
+458,1143552,"train_tokenizer.py",583,0,"",python,selection_command
+459,1144011,"train_tokenizer.py",1076,0,"",python,selection_command
+460,1145019,"train_tokenizer.py",1051,0,"",python,selection_command
+461,1145266,"train_tokenizer.py",1026,0,"",python,selection_command
+462,1145298,"train_tokenizer.py",1010,0,"",python,selection_command
+463,1145331,"train_tokenizer.py",980,0,"",python,selection_command
+464,1145364,"train_tokenizer.py",955,0,"",python,selection_command
+465,1145397,"train_tokenizer.py",931,0,"",python,selection_command
+466,1145430,"train_tokenizer.py",906,0,"",python,selection_command
+467,1145466,"train_tokenizer.py",880,0,"",python,selection_command
+468,1145500,"train_tokenizer.py",861,0,"",python,selection_command
+469,1145531,"train_tokenizer.py",830,0,"",python,selection_command
+470,1145564,"train_tokenizer.py",802,0,"",python,selection_command
+471,1145597,"train_tokenizer.py",779,0,"",python,selection_command
+472,1145632,"train_tokenizer.py",752,0,"",python,selection_command
+473,1145665,"train_tokenizer.py",725,0,"",python,selection_command
+474,1145699,"train_tokenizer.py",697,0,"",python,selection_command
+475,1145733,"train_tokenizer.py",675,0,"",python,selection_command
+476,1146748,"train_tokenizer.py",697,0,"",python,selection_command
+477,1147004,"train_tokenizer.py",725,0,"",python,selection_command
+478,1147033,"train_tokenizer.py",752,0,"",python,selection_command
+479,1147065,"train_tokenizer.py",779,0,"",python,selection_command
+480,1147097,"train_tokenizer.py",802,0,"",python,selection_command
+481,1147131,"train_tokenizer.py",830,0,"",python,selection_command
+482,1147166,"train_tokenizer.py",861,0,"",python,selection_command
+483,1147197,"train_tokenizer.py",880,0,"",python,selection_command
+484,1147330,"train_tokenizer.py",861,0,"",python,selection_command
+485,1147577,"train_tokenizer.py",830,0,"",python,selection_command
+486,1147607,"train_tokenizer.py",802,0,"",python,selection_command
+487,1147640,"train_tokenizer.py",779,0,"",python,selection_command
+488,1147679,"train_tokenizer.py",752,0,"",python,selection_command
+489,1147713,"train_tokenizer.py",725,0,"",python,selection_command
+490,1147747,"train_tokenizer.py",697,0,"",python,selection_command
+491,1148008,"train_tokenizer.py",675,0,"",python,selection_command
+492,1148243,"train_tokenizer.py",697,0,"",python,selection_command
+493,1148491,"train_tokenizer.py",725,0,"",python,selection_command
+494,1148526,"train_tokenizer.py",752,0,"",python,selection_command
+495,1148559,"train_tokenizer.py",779,0,"",python,selection_command
+496,1148593,"train_tokenizer.py",802,0,"",python,selection_command
+497,1148626,"train_tokenizer.py",830,0,"",python,selection_command
+498,1148658,"train_tokenizer.py",861,0,"",python,selection_command
+499,1148698,"train_tokenizer.py",880,0,"",python,selection_command
+500,1148728,"train_tokenizer.py",906,0,"",python,selection_command
+501,1148848,"train_tokenizer.py",931,0,"",python,selection_command
+502,1149099,"train_tokenizer.py",955,0,"",python,selection_command
+503,1149131,"train_tokenizer.py",980,0,"",python,selection_command
+504,1149164,"train_tokenizer.py",1010,0,"",python,selection_command
+505,1149197,"train_tokenizer.py",1026,0,"",python,selection_command
+506,1149231,"train_tokenizer.py",1051,0,"",python,selection_command
+507,1149264,"train_tokenizer.py",1076,0,"",python,selection_command
+508,1149299,"train_tokenizer.py",1104,0,"",python,selection_command
+509,1149428,"train_tokenizer.py",1128,0,"",python,selection_command
+510,1149603,"train_tokenizer.py",1152,0,"",python,selection_command
+511,1149851,"train_tokenizer.py",1175,0,"",python,selection_command
+512,1150077,"train_tokenizer.py",1200,0,"",python,selection_command
+513,1150431,"train_tokenizer.py",1230,0,"\n ",python,content
+514,1153735,"train_tokenizer.py",1235,0,"p",python,content
+515,1153738,"train_tokenizer.py",1236,0,"",python,selection_keyboard
+516,1153830,"train_tokenizer.py",1236,0,"a",python,content
+517,1153832,"train_tokenizer.py",1237,0,"",python,selection_keyboard
+518,1153997,"train_tokenizer.py",1237,0,"r",python,content
+519,1153998,"train_tokenizer.py",1238,0,"",python,selection_keyboard
+520,1154099,"train_tokenizer.py",1238,0,"a",python,content
+521,1154101,"train_tokenizer.py",1239,0,"",python,selection_keyboard
+522,1154161,"train_tokenizer.py",1239,0,"m",python,content
+523,1154163,"train_tokenizer.py",1240,0,"",python,selection_keyboard
+524,1154526,"train_tokenizer.py",1240,0,"_",python,content
+525,1154528,"train_tokenizer.py",1241,0,"",python,selection_keyboard
+526,1154764,"train_tokenizer.py",1241,0,"d",python,content
+527,1154765,"train_tokenizer.py",1242,0,"",python,selection_keyboard
+528,1154891,"train_tokenizer.py",1242,0,"t",python,content
+529,1154892,"train_tokenizer.py",1243,0,"",python,selection_keyboard
+530,1154967,"train_tokenizer.py",1243,0,"y",python,content
+531,1154969,"train_tokenizer.py",1244,0,"",python,selection_keyboard
+532,1155100,"train_tokenizer.py",1244,0,"e",python,content
+533,1155101,"train_tokenizer.py",1245,0,"",python,selection_keyboard
+534,1155513,"train_tokenizer.py",1244,1,"",python,content
+535,1155696,"train_tokenizer.py",1244,0,"p",python,content
+536,1155699,"train_tokenizer.py",1245,0,"",python,selection_keyboard
+537,1155768,"train_tokenizer.py",1245,0,"e",python,content
+538,1155769,"train_tokenizer.py",1246,0,"",python,selection_keyboard
+539,1158062,"train_tokenizer.py",1246,0,":",python,content
+540,1158064,"train_tokenizer.py",1247,0,"",python,selection_keyboard
+541,1158365,"train_tokenizer.py",1247,0," ",python,content
+542,1158366,"train_tokenizer.py",1248,0,"",python,selection_keyboard
+543,1158493,"train_tokenizer.py",1248,0,"j",python,content
+544,1158495,"train_tokenizer.py",1249,0,"",python,selection_keyboard
+545,1158644,"train_tokenizer.py",1249,0,"n",python,content
+546,1158645,"train_tokenizer.py",1250,0,"",python,selection_keyboard
+547,1158722,"train_tokenizer.py",1250,0,"p",python,content
+548,1158724,"train_tokenizer.py",1251,0,"",python,selection_keyboard
+549,1158948,"train_tokenizer.py",1251,0,".",python,content
+550,1158949,"train_tokenizer.py",1252,0,"",python,selection_keyboard
+551,1159067,"train_tokenizer.py",1252,0,"d",python,content
+552,1159068,"train_tokenizer.py",1253,0,"",python,selection_keyboard
+553,1159197,"train_tokenizer.py",1253,0,"t",python,content
+554,1159198,"train_tokenizer.py",1254,0,"",python,selection_keyboard
+555,1159282,"train_tokenizer.py",1254,0,"y",python,content
+556,1159285,"train_tokenizer.py",1255,0,"",python,selection_keyboard
+557,1159349,"train_tokenizer.py",1255,0,"p",python,content
+558,1159351,"train_tokenizer.py",1256,0,"",python,selection_keyboard
+559,1159423,"train_tokenizer.py",1256,0,"e",python,content
+560,1159424,"train_tokenizer.py",1257,0,"",python,selection_keyboard
+561,1159493,"train_tokenizer.py",1257,0," ",python,content
+562,1159495,"train_tokenizer.py",1258,0,"",python,selection_keyboard
+563,1159641,"train_tokenizer.py",1258,0,"=",python,content
+564,1159642,"train_tokenizer.py",1259,0,"",python,selection_keyboard
+565,1159709,"train_tokenizer.py",1259,0," ",python,content
+566,1159711,"train_tokenizer.py",1260,0,"",python,selection_keyboard
+567,1164895,"train_tokenizer.py",1260,0,"j",python,content
+568,1164897,"train_tokenizer.py",1261,0,"",python,selection_keyboard
+569,1165082,"train_tokenizer.py",1261,0,"n",python,content
+570,1165084,"train_tokenizer.py",1262,0,"",python,selection_keyboard
+571,1165141,"train_tokenizer.py",1262,0,"p",python,content
+572,1165148,"train_tokenizer.py",1263,0,"",python,selection_keyboard
+573,1165362,"train_tokenizer.py",1263,0,".",python,content
+574,1165364,"train_tokenizer.py",1264,0,"",python,selection_keyboard
+575,1165484,"train_tokenizer.py",1264,0,"f",python,content
+576,1165486,"train_tokenizer.py",1265,0,"",python,selection_keyboard
+577,1165636,"train_tokenizer.py",1265,0,"l",python,content
+578,1165637,"train_tokenizer.py",1266,0,"",python,selection_keyboard
+579,1165980,"train_tokenizer.py",1266,0,"o",python,content
+580,1165984,"train_tokenizer.py",1267,0,"",python,selection_keyboard
+581,1166073,"train_tokenizer.py",1267,0,"a",python,content
+582,1166077,"train_tokenizer.py",1268,0,"",python,selection_keyboard
+583,1166158,"train_tokenizer.py",1268,0,"t",python,content
+584,1166162,"train_tokenizer.py",1269,0,"",python,selection_keyboard
+585,1166426,"train_tokenizer.py",1269,0,"3",python,content
+586,1166429,"train_tokenizer.py",1270,0,"",python,selection_keyboard
+587,1166479,"train_tokenizer.py",1270,0,"2",python,content
+588,1166481,"train_tokenizer.py",1271,0,"",python,selection_keyboard
+589,1166698,"train_tokenizer.py",1270,0,"",python,selection_command
+590,1167365,"train_tokenizer.py",1271,0,"",python,selection_command
+591,1167487,"train_tokenizer.py",1271,0,",",python,content
+592,1167488,"train_tokenizer.py",1272,0,"",python,selection_keyboard
+593,1167715,"train_tokenizer.py",1272,0,"\n ",python,content
+594,1168096,"train_tokenizer.py",1273,4,"",python,content
+595,1168234,"train_tokenizer.py",1231,0,"",python,selection_command
+596,1168937,"train_tokenizer.py",1273,0,"",python,selection_command
+597,1169607,"train_tokenizer.py",1273,0,"k",python,content
+598,1169609,"train_tokenizer.py",1274,0,"",python,selection_keyboard
+599,1169797,"train_tokenizer.py",1273,0,"",python,selection_command
+600,1170097,"train_tokenizer.py",1271,0,"",python,selection_command
+601,1170648,"train_tokenizer.py",1272,0,"",python,selection_command
+602,1170791,"train_tokenizer.py",1271,1,"",python,content
+603,1170966,"train_tokenizer.py",1270,0,"",python,selection_command
+604,1171063,"train_tokenizer.py",1272,0,"",python,selection_command
+605,1171310,"train_tokenizer.py",1273,0,"",python,selection_command
+606,1171525,"train_tokenizer.py",1272,1,"",python,content
+607,1172068,"train_tokenizer.py",1271,1,"",python,content
+608,1172179,"train_tokenizer.py",1271,0,"\n ",python,content
+609,1173029,"train_tokenizer.py",1276,0,"d",python,content
+610,1173031,"train_tokenizer.py",1277,0,"",python,selection_keyboard
+611,1173140,"train_tokenizer.py",1277,0,"t",python,content
+612,1173141,"train_tokenizer.py",1278,0,"",python,selection_keyboard
+613,1173245,"train_tokenizer.py",1278,0,"y",python,content
+614,1173247,"train_tokenizer.py",1279,0,"",python,selection_keyboard
+615,1173296,"train_tokenizer.py",1279,0,"p",python,content
+616,1173298,"train_tokenizer.py",1280,0,"",python,selection_keyboard
+617,1173374,"train_tokenizer.py",1280,0,"e",python,content
+618,1173375,"train_tokenizer.py",1281,0,"",python,selection_keyboard
+619,1173597,"train_tokenizer.py",1281,0,":",python,content
+620,1173599,"train_tokenizer.py",1282,0,"",python,selection_keyboard
+621,1173710,"train_tokenizer.py",1282,0," ",python,content
+622,1173711,"train_tokenizer.py",1283,0,"",python,selection_keyboard
+623,1174041,"train_tokenizer.py",1283,0,"j",python,content
+624,1174042,"train_tokenizer.py",1284,0,"",python,selection_keyboard
+625,1174224,"train_tokenizer.py",1284,0,"n",python,content
+626,1174225,"train_tokenizer.py",1285,0,"",python,selection_keyboard
+627,1174295,"train_tokenizer.py",1285,0,"p",python,content
+628,1174296,"train_tokenizer.py",1286,0,"",python,selection_keyboard
+629,1174530,"train_tokenizer.py",1286,0,".",python,content
+630,1174531,"train_tokenizer.py",1287,0,"",python,selection_keyboard
+631,1174645,"train_tokenizer.py",1287,0,"d",python,content
+632,1174646,"train_tokenizer.py",1288,0,"",python,selection_keyboard
+633,1174835,"train_tokenizer.py",1288,0,"t",python,content
+634,1174836,"train_tokenizer.py",1289,0,"",python,selection_keyboard
+635,1174933,"train_tokenizer.py",1289,0,"y",python,content
+636,1174935,"train_tokenizer.py",1290,0,"",python,selection_keyboard
+637,1175093,"train_tokenizer.py",1290,0,"e",python,content
+638,1175095,"train_tokenizer.py",1291,0,"",python,selection_keyboard
+639,1175464,"train_tokenizer.py",1290,1,"",python,content
+640,1175663,"train_tokenizer.py",1290,0,"p",python,content
+641,1175665,"train_tokenizer.py",1291,0,"",python,selection_keyboard
+642,1175740,"train_tokenizer.py",1291,0,"e",python,content
+643,1175743,"train_tokenizer.py",1292,0,"",python,selection_keyboard
+644,1175812,"train_tokenizer.py",1292,0," ",python,content
+645,1175815,"train_tokenizer.py",1293,0,"",python,selection_keyboard
+646,1175949,"train_tokenizer.py",1293,0,"=",python,content
+647,1175950,"train_tokenizer.py",1294,0,"",python,selection_keyboard
+648,1176041,"train_tokenizer.py",1294,0," ",python,content
+649,1176042,"train_tokenizer.py",1295,0,"",python,selection_keyboard
+650,1176529,"train_tokenizer.py",1295,0,"j",python,content
+651,1176531,"train_tokenizer.py",1296,0,"",python,selection_keyboard
+652,1176655,"train_tokenizer.py",1296,0,"n",python,content
+653,1176656,"train_tokenizer.py",1297,0,"",python,selection_keyboard
+654,1176730,"train_tokenizer.py",1297,0,"p",python,content
+655,1176731,"train_tokenizer.py",1298,0,"",python,selection_keyboard
+656,1176963,"train_tokenizer.py",1298,0,".",python,content
+657,1176966,"train_tokenizer.py",1299,0,"",python,selection_keyboard
+658,1177043,"train_tokenizer.py",1299,0,"f",python,content
+659,1177044,"train_tokenizer.py",1300,0,"",python,selection_keyboard
+660,1177651,"train_tokenizer.py",1299,1,"",python,content
+661,1177849,"train_tokenizer.py",1299,0,"b",python,content
+662,1177850,"train_tokenizer.py",1300,0,"",python,selection_keyboard
+663,1177931,"train_tokenizer.py",1300,0,"f",python,content
+664,1177933,"train_tokenizer.py",1301,0,"",python,selection_keyboard
+665,1178010,"train_tokenizer.py",1301,0,"l",python,content
+666,1178012,"train_tokenizer.py",1302,0,"",python,selection_keyboard
+667,1178162,"train_tokenizer.py",1302,0,"o",python,content
+668,1178164,"train_tokenizer.py",1303,0,"",python,selection_keyboard
+669,1178209,"train_tokenizer.py",1303,0,"a",python,content
+670,1178211,"train_tokenizer.py",1304,0,"",python,selection_keyboard
+671,1178294,"train_tokenizer.py",1304,0,"t",python,content
+672,1178296,"train_tokenizer.py",1305,0,"",python,selection_keyboard
+673,1178458,"train_tokenizer.py",1305,0,"1",python,content
+674,1178460,"train_tokenizer.py",1306,0,"",python,selection_keyboard
+675,1178608,"train_tokenizer.py",1306,0,"6",python,content
+676,1178610,"train_tokenizer.py",1307,0,"",python,selection_keyboard
+677,1178825,"train_tokenizer.py",1306,0,"",python,selection_command
+678,1182646,"train_tokenizer.py",1272,35," dtype: jnp.dtype = jnp.bfloat16",python,selection_command
+679,1182769,"train_tokenizer.py",1231,76," param_dtype: jnp.dtype = jnp.float32\n dtype: jnp.dtype = jnp.bfloat16",python,selection_command
+680,1182966,"train_tokenizer.py",1231,0,"",python,selection_command
+681,1184719,"train_lam.py",0,0,"",python,tab
+682,1185651,"train_lam.py",0,0,"",python,selection_command
+683,1186546,"train_lam.py",442,0,"",python,selection_command
+684,1186836,"train_lam.py",1038,0,"",python,selection_command
+685,1188150,"train_lam.py",1048,0,"",python,selection_command
+686,1188399,"train_lam.py",1073,0,"",python,selection_command
+687,1188431,"train_lam.py",1098,0,"",python,selection_command
+688,1188464,"train_lam.py",1123,0,"",python,selection_command
+689,1188497,"train_lam.py",1148,0,"",python,selection_command
+690,1188530,"train_lam.py",1172,0,"",python,selection_command
+691,1188563,"train_lam.py",1195,0,"",python,selection_command
+692,1188727,"train_lam.py",1220,0,"",python,selection_command
+693,1189118,"train_lam.py",1247,0,"\n param_dtype: jnp.dtype = jnp.float32\n dtype: jnp.dtype = jnp.bfloat16",python,content
+694,1189128,"train_lam.py",1252,0,"",python,selection_command
+695,1193380,"train_dynamics.py",0,0,"",python,tab
+696,1194047,"train_dynamics.py",0,0,"",python,selection_command
+697,1196082,"train_dynamics.py",542,0,"",python,selection_command
+698,1196661,"train_dynamics.py",554,0,"",python,selection_command
+699,1196912,"train_dynamics.py",571,0,"",python,selection_command
+700,1197115,"train_dynamics.py",554,0,"",python,selection_command
+701,1197515,"train_dynamics.py",571,0,"",python,selection_command
+702,1197735,"train_dynamics.py",600,0,"",python,selection_command
+703,1197986,"train_dynamics.py",618,0,"",python,selection_command
+704,1198017,"train_dynamics.py",640,0,"",python,selection_command
+705,1198051,"train_dynamics.py",668,0,"",python,selection_command
+706,1198084,"train_dynamics.py",695,0,"",python,selection_command
+707,1198118,"train_dynamics.py",722,0,"",python,selection_command
+708,1198152,"train_dynamics.py",745,0,"",python,selection_command
+709,1198189,"train_dynamics.py",773,0,"",python,selection_command
+710,1198225,"train_dynamics.py",804,0,"",python,selection_command
+711,1198257,"train_dynamics.py",823,0,"",python,selection_command
+712,1198289,"train_dynamics.py",848,0,"",python,selection_command
+713,1198323,"train_dynamics.py",872,0,"",python,selection_command
+714,1198358,"train_dynamics.py",897,0,"",python,selection_command
+715,1198391,"train_dynamics.py",926,0,"",python,selection_command
+716,1198426,"train_dynamics.py",942,0,"",python,selection_command
+717,1198458,"train_dynamics.py",971,0,"",python,selection_command
+718,1198497,"train_dynamics.py",1002,0,"",python,selection_command
+719,1198530,"train_dynamics.py",1036,0,"",python,selection_command
+720,1198558,"train_dynamics.py",1060,0,"",python,selection_command
+721,1198591,"train_dynamics.py",1094,0,"",python,selection_command
+722,1198624,"train_dynamics.py",1127,0,"",python,selection_command
+723,1198657,"train_dynamics.py",1162,0,"",python,selection_command
+724,1198691,"train_dynamics.py",1172,0,"",python,selection_command
+725,1198725,"train_dynamics.py",1195,0,"",python,selection_command
+726,1198758,"train_dynamics.py",1227,0,"",python,selection_command
+727,1198796,"train_dynamics.py",1259,0,"",python,selection_command
+728,1198826,"train_dynamics.py",1288,0,"",python,selection_command
+729,1198858,"train_dynamics.py",1316,0,"",python,selection_command
+730,1198891,"train_dynamics.py",1343,0,"",python,selection_command
+731,1198924,"train_dynamics.py",1372,0,"",python,selection_command
+732,1198959,"train_dynamics.py",1387,0,"",python,selection_command
+733,1198991,"train_dynamics.py",1411,0,"",python,selection_command
+734,1199024,"train_dynamics.py",1441,0,"",python,selection_command
+735,1199057,"train_dynamics.py",1469,0,"",python,selection_command
+736,1199200,"train_dynamics.py",1494,0,"",python,selection_command
+737,1199502,"train_dynamics.py",1521,0,"\n param_dtype: jnp.dtype = jnp.float32\n dtype: jnp.dtype = jnp.bfloat16",python,content
+738,1199507,"train_dynamics.py",1526,0,"",python,selection_command
+739,1210952,"train_tokenizer.py",0,0,"",python,tab
+740,1218921,"train_tokenizer.py",1230,0,"\n ",python,content
+741,1219114,"train_tokenizer.py",1235,0,"#",python,content
+742,1219116,"train_tokenizer.py",1236,0,"",python,selection_keyboard
+743,1219192,"train_tokenizer.py",1236,0," ",python,content
+744,1219195,"train_tokenizer.py",1237,0,"",python,selection_keyboard
+745,1219414,"train_tokenizer.py",1237,0,"T",python,content
+746,1219416,"train_tokenizer.py",1238,0,"",python,selection_keyboard
+747,1219764,"train_tokenizer.py",1237,1,"",python,content
+748,1220463,"train_tokenizer.py",1237,0,"T",python,content
+749,1220464,"train_tokenizer.py",1238,0,"",python,selection_keyboard
+750,1220756,"train_tokenizer.py",1237,1,"",python,content
+751,1220812,"train_tokenizer.py",1237,0,"F",python,content
+752,1220814,"train_tokenizer.py",1238,0,"",python,selection_keyboard
+753,1220930,"train_tokenizer.py",1238,0,"I",python,content
+754,1220931,"train_tokenizer.py",1239,0,"",python,selection_keyboard
+755,1221007,"train_tokenizer.py",1239,0,"X",python,content
+756,1221008,"train_tokenizer.py",1240,0,"",python,selection_keyboard
+757,1221143,"train_tokenizer.py",1240,0,"M",python,content
+758,1221143,"train_tokenizer.py",1241,0,"",python,selection_keyboard
+759,1221241,"train_tokenizer.py",1241,0,"E",python,content
+760,1221242,"train_tokenizer.py",1242,0,"",python,selection_keyboard
+761,1221361,"train_tokenizer.py",1242,0,":",python,content
+762,1221362,"train_tokenizer.py",1243,0,"",python,selection_keyboard
+763,1221480,"train_tokenizer.py",1243,0," ",python,content
+764,1221481,"train_tokenizer.py",1244,0,"",python,selection_keyboard
+765,1221610,"train_tokenizer.py",1244,0,"()",python,content
+766,1221612,"train_tokenizer.py",1245,0,"",python,selection_keyboard
+767,1221952,"train_tokenizer.py",1244,2,"",python,content
+768,1222109,"train_tokenizer.py",1243,1,"",python,content
+769,1222262,"train_tokenizer.py",1242,1,"",python,content
+770,1222348,"train_tokenizer.py",1242,0," ",python,content
+771,1222350,"train_tokenizer.py",1243,0,"",python,selection_keyboard
+772,1222492,"train_tokenizer.py",1243,0,"()",python,content
+773,1222493,"train_tokenizer.py",1244,0,"",python,selection_keyboard
+774,1222592,"train_tokenizer.py",1244,1,")",python,content
+775,1222593,"train_tokenizer.py",1245,0,"",python,selection_keyboard
+776,1222932,"train_tokenizer.py",1244,0,"",python,selection_keyboard
+777,1223078,"train_tokenizer.py",1244,0,"f",python,content
+778,1223079,"train_tokenizer.py",1245,0,"",python,selection_keyboard
+779,1223176,"train_tokenizer.py",1245,0,".",python,content
+780,1223177,"train_tokenizer.py",1246,0,"",python,selection_keyboard
+781,1223252,"train_tokenizer.py",1246,0,"s",python,content
+782,1223254,"train_tokenizer.py",1247,0,"",python,selection_keyboard
+783,1223329,"train_tokenizer.py",1247,0,"r",python,content
+784,1223330,"train_tokenizer.py",1248,0,"",python,selection_keyboard
+785,1223425,"train_tokenizer.py",1248,0,"a",python,content
+786,1223428,"train_tokenizer.py",1249,0,"",python,selection_keyboard
+787,1223443,"train_tokenizer.py",1249,0,"m",python,content
+788,1223446,"train_tokenizer.py",1250,0,"",python,selection_keyboard
+789,1223661,"train_tokenizer.py",1250,0,"b",python,content
+790,1223663,"train_tokenizer.py",1251,0,"",python,selection_keyboard
+791,1223702,"train_tokenizer.py",1251,0,"i",python,content
+792,1223703,"train_tokenizer.py",1252,0,"",python,selection_keyboard
+793,1223750,"train_tokenizer.py",1252,0,"c",python,content
+794,1223752,"train_tokenizer.py",1253,0,"",python,selection_keyboard
+795,1223801,"train_tokenizer.py",1253,0,"a",python,content
+796,1223802,"train_tokenizer.py",1254,0,"",python,selection_keyboard
+797,1223911,"train_tokenizer.py",1254,0,"l",python,content
+798,1223912,"train_tokenizer.py",1255,0,"",python,selection_keyboard
+799,1224035,"train_tokenizer.py",1254,0,"",python,selection_command
+800,1224269,"train_tokenizer.py",1256,0,"",python,selection_command
+801,1224397,"train_tokenizer.py",1256,0,":",python,content
+802,1224398,"train_tokenizer.py",1257,0,"",python,selection_keyboard
+803,1224497,"train_tokenizer.py",1257,0," ",python,content
+804,1224500,"train_tokenizer.py",1258,0,"",python,selection_keyboard
+805,1226800,"train_tokenizer.py",1258,0,"s",python,content
+806,1226802,"train_tokenizer.py",1259,0,"",python,selection_keyboard
+807,1226845,"train_tokenizer.py",1259,0,"t",python,content
+808,1226848,"train_tokenizer.py",1260,0,"",python,selection_keyboard
+809,1226900,"train_tokenizer.py",1260,0,"i",python,content
+810,1226902,"train_tokenizer.py",1261,0,"",python,selection_keyboard
+811,1226958,"train_tokenizer.py",1261,0,"l",python,content
+812,1226963,"train_tokenizer.py",1262,0,"",python,selection_keyboard
+813,1227051,"train_tokenizer.py",1262,0,"l",python,content
+814,1227052,"train_tokenizer.py",1263,0,"",python,selection_keyboard
+815,1227134,"train_tokenizer.py",1263,0," ",python,content
+816,1227135,"train_tokenizer.py",1264,0,"",python,selection_keyboard
+817,1227297,"train_tokenizer.py",1264,0,"n",python,content
+818,1227299,"train_tokenizer.py",1265,0,"",python,selection_keyboard
+819,1227354,"train_tokenizer.py",1265,0,"e",python,content
+820,1227355,"train_tokenizer.py",1266,0,"",python,selection_keyboard
+821,1227491,"train_tokenizer.py",1266,0,"e",python,content
+822,1227495,"train_tokenizer.py",1267,0,"",python,selection_keyboard
+823,1227547,"train_tokenizer.py",1267,0,"d",python,content
+824,1227548,"train_tokenizer.py",1268,0,"",python,selection_keyboard
+825,1227590,"train_tokenizer.py",1268,0," ",python,content
+826,1227591,"train_tokenizer.py",1269,0,"",python,selection_keyboard
+827,1227744,"train_tokenizer.py",1269,0,"t",python,content
+828,1227745,"train_tokenizer.py",1270,0,"",python,selection_keyboard
+829,1227824,"train_tokenizer.py",1270,0,"o",python,content
+830,1227825,"train_tokenizer.py",1271,0,"",python,selection_keyboard
+831,1227864,"train_tokenizer.py",1271,0," ",python,content
+832,1227866,"train_tokenizer.py",1272,0,"",python,selection_keyboard
+833,1228010,"train_tokenizer.py",1272,0,"i",python,content
+834,1228012,"train_tokenizer.py",1273,0,"",python,selection_keyboard
+835,1228077,"train_tokenizer.py",1273,0,"m",python,content
+836,1228078,"train_tokenizer.py",1274,0,"",python,selection_keyboard
+837,1228177,"train_tokenizer.py",1274,0,"p",python,content
+838,1228178,"train_tokenizer.py",1275,0,"",python,selection_keyboard
+839,1228361,"train_tokenizer.py",1275,0,"l",python,content
+840,1228362,"train_tokenizer.py",1276,0,"",python,selection_keyboard
+841,1228428,"train_tokenizer.py",1276,0,"e",python,content
+842,1228429,"train_tokenizer.py",1277,0,"",python,selection_keyboard
+843,1228544,"train_tokenizer.py",1277,0,"m",python,content
+844,1228546,"train_tokenizer.py",1278,0,"",python,selection_keyboard
+845,1228592,"train_tokenizer.py",1278,0,"e",python,content
+846,1228593,"train_tokenizer.py",1279,0,"",python,selection_keyboard
+847,1228744,"train_tokenizer.py",1279,0,"n",python,content
+848,1228746,"train_tokenizer.py",1280,0,"",python,selection_keyboard
+849,1228765,"train_tokenizer.py",1280,0,"t",python,content
+850,1228766,"train_tokenizer.py",1281,0,"",python,selection_keyboard
+851,1228860,"train_tokenizer.py",1281,0," ",python,content
+852,1228862,"train_tokenizer.py",1282,0,"",python,selection_keyboard
+853,1230030,"train_tokenizer.py",1282,0,"m",python,content
+854,1230033,"train_tokenizer.py",1283,0,"",python,selection_keyboard
+855,1230095,"train_tokenizer.py",1283,0,"i",python,content
+856,1230097,"train_tokenizer.py",1284,0,"",python,selection_keyboard
+857,1230147,"train_tokenizer.py",1284,0,"x",python,content
+858,1230148,"train_tokenizer.py",1285,0,"",python,selection_keyboard
+859,1230283,"train_tokenizer.py",1285,0,"e",python,content
+860,1230285,"train_tokenizer.py",1286,0,"",python,selection_keyboard
+861,1230428,"train_tokenizer.py",1286,0,"d",python,content
+862,1230429,"train_tokenizer.py",1287,0,"",python,selection_keyboard
+863,1230493,"train_tokenizer.py",1287,0," ",python,content
+864,1230496,"train_tokenizer.py",1288,0,"",python,selection_keyboard
+865,1230628,"train_tokenizer.py",1288,0,"p",python,content
+866,1230630,"train_tokenizer.py",1289,0,"",python,selection_keyboard
+867,1230714,"train_tokenizer.py",1289,0,"r",python,content
+868,1230715,"train_tokenizer.py",1290,0,"",python,selection_keyboard
+869,1230765,"train_tokenizer.py",1290,0,"e",python,content
+870,1230767,"train_tokenizer.py",1291,0,"",python,selection_keyboard
+871,1230913,"train_tokenizer.py",1291,0,"c",python,content
+872,1230914,"train_tokenizer.py",1292,0,"",python,selection_keyboard
+873,1230997,"train_tokenizer.py",1292,0,"i",python,content
+874,1230999,"train_tokenizer.py",1293,0,"",python,selection_keyboard
+875,1231059,"train_tokenizer.py",1293,0,"s",python,content
+876,1231060,"train_tokenizer.py",1294,0,"",python,selection_keyboard
+877,1231161,"train_tokenizer.py",1294,0,"i",python,content
+878,1231162,"train_tokenizer.py",1295,0,"",python,selection_keyboard
+879,1231244,"train_tokenizer.py",1295,0,"o",python,content
+880,1231245,"train_tokenizer.py",1296,0,"",python,selection_keyboard
+881,1231281,"train_tokenizer.py",1296,0,"n",python,content
+882,1231282,"train_tokenizer.py",1297,0,"",python,selection_keyboard
+883,1231424,"train_tokenizer.py",1296,0,"",python,selection_command
+884,1233567,"train_lam.py",0,0,"",python,tab
+885,1234482,"train_lam.py",1248,0," # FIXME (f.srambical): still need to implement mixed precision\n",python,content
+886,1234489,"train_lam.py",1252,0,"",python,selection_command
+887,1236717,"train_dynamics.py",0,0,"",python,tab
+888,1237751,"train_dynamics.py",1522,0," # FIXME (f.srambical): still need to implement mixed precision\n",python,content
+889,1237758,"train_dynamics.py",1526,0,"",python,selection_command
+890,1238280,"train_dynamics.py",1528,0,"",python,selection_command
+891,1281554,"train_tokenizer.py",0,0,"",python,tab
+892,1283277,"train_tokenizer.py",1477,0,"",python,selection_command
+893,1283477,"train_tokenizer.py",1542,0,"",python,selection_command
+894,1283804,"train_tokenizer.py",1785,0,"",python,selection_command
+895,1284239,"train_tokenizer.py",3134,0,"",python,selection_command
+896,1284966,"train_tokenizer.py",1785,0,"",python,selection_command
+897,1285800,"train_tokenizer.py",3134,0,"",python,selection_command
+898,1286414,"train_tokenizer.py",0,0,"",python,selection_command
+899,1286694,"train_tokenizer.py",583,0,"",python,selection_command
+900,1286899,"train_tokenizer.py",1076,0,"",python,selection_command
+901,1287080,"train_tokenizer.py",1756,0,"",python,selection_command
+902,1288247,"train_tokenizer.py",3262,0,"",python,selection_command
+903,1288467,"train_tokenizer.py",4712,0,"",python,selection_command
+904,1290018,"train_tokenizer.py",4970,0,"",python,selection_command
+905,1291291,"train_tokenizer.py",4950,0,"",python,selection_command
+906,1291629,"train_tokenizer.py",4970,0,"",python,selection_command
+907,1292127,"train_tokenizer.py",4308,0,"",python,selection_command
+908,1294061,"train_tokenizer.py",4340,0,"",python,selection_command
+909,1294313,"train_tokenizer.py",4376,0,"",python,selection_command
+910,1294343,"train_tokenizer.py",4410,0,"",python,selection_command
+911,1294376,"train_tokenizer.py",4446,0,"",python,selection_command
+912,1294409,"train_tokenizer.py",4484,0,"",python,selection_command
+913,1294442,"train_tokenizer.py",4520,0,"",python,selection_command
+914,1294475,"train_tokenizer.py",4556,0,"",python,selection_command
+915,1294508,"train_tokenizer.py",4590,0,"",python,selection_command
+916,1294649,"train_tokenizer.py",4620,0,"",python,selection_command
+917,1295061,"train_tokenizer.py",4663,0,"",python,selection_command
+918,1295201,"train_tokenizer.py",4663,0,"\n ",python,content
+919,1296713,"train_tokenizer.py",4672,0,"p",python,content
+920,1296715,"train_tokenizer.py",4673,0,"",python,selection_keyboard
+921,1297044,"train_tokenizer.py",4673,0,"a",python,content
+922,1297045,"train_tokenizer.py",4674,0,"",python,selection_keyboard
+923,1297092,"train_tokenizer.py",4674,0,"r",python,content
+924,1297093,"train_tokenizer.py",4675,0,"",python,selection_keyboard
+925,1297197,"train_tokenizer.py",4675,0,"a",python,content
+926,1297198,"train_tokenizer.py",4676,0,"",python,selection_keyboard
+927,1297308,"train_tokenizer.py",4676,0,"m",python,content
+928,1297312,"train_tokenizer.py",4677,0,"",python,selection_keyboard
+929,1297526,"train_tokenizer.py",4677,0,"_",python,content
+930,1297528,"train_tokenizer.py",4678,0,"",python,selection_keyboard
+931,1297703,"train_tokenizer.py",4678,0,"d",python,content
+932,1297705,"train_tokenizer.py",4679,0,"",python,selection_keyboard
+933,1297845,"train_tokenizer.py",4679,0,"t",python,content
+934,1297846,"train_tokenizer.py",4680,0,"",python,selection_keyboard
+935,1297912,"train_tokenizer.py",4680,0,"y",python,content
+936,1297915,"train_tokenizer.py",4681,0,"",python,selection_keyboard
+937,1297968,"train_tokenizer.py",4681,0,"p",python,content
+938,1297969,"train_tokenizer.py",4682,0,"",python,selection_keyboard
+939,1298048,"train_tokenizer.py",4682,0,"e",python,content
+940,1298050,"train_tokenizer.py",4683,0,"",python,selection_keyboard
+941,1298274,"train_tokenizer.py",4683,0,"=",python,content
+942,1298275,"train_tokenizer.py",4684,0,"",python,selection_keyboard
+943,1302232,"train_tokenizer.py",4684,0,"p",python,content
+944,1302234,"train_tokenizer.py",4685,0,"",python,selection_keyboard
+945,1302323,"train_tokenizer.py",4685,0,"a",python,content
+946,1302325,"train_tokenizer.py",4686,0,"",python,selection_keyboard
+947,1302377,"train_tokenizer.py",4686,0,"r",python,content
+948,1302380,"train_tokenizer.py",4687,0,"",python,selection_keyboard
+949,1302461,"train_tokenizer.py",4687,0,"a",python,content
+950,1302463,"train_tokenizer.py",4688,0,"",python,selection_keyboard
+951,1302525,"train_tokenizer.py",4688,0,"m",python,content
+952,1302532,"train_tokenizer.py",4689,0,"",python,selection_keyboard
+953,1302780,"train_tokenizer.py",4689,0,"_",python,content
+954,1302782,"train_tokenizer.py",4690,0,"",python,selection_keyboard
+955,1302916,"train_tokenizer.py",4690,0,"d",python,content
+956,1302917,"train_tokenizer.py",4691,0,"",python,selection_keyboard
+957,1303077,"train_tokenizer.py",4691,0,"t",python,content
+958,1303079,"train_tokenizer.py",4692,0,"",python,selection_keyboard
+959,1303198,"train_tokenizer.py",4692,0,"e",python,content
+960,1303199,"train_tokenizer.py",4693,0,"",python,selection_keyboard
+961,1303250,"train_tokenizer.py",4693,0,"y",python,content
+962,1303251,"train_tokenizer.py",4694,0,"",python,selection_keyboard
+963,1303264,"train_tokenizer.py",4694,0,"p",python,content
+964,1303265,"train_tokenizer.py",4695,0,"",python,selection_keyboard
+965,1303699,"train_tokenizer.py",4694,1,"",python,content
+966,1303842,"train_tokenizer.py",4693,1,"",python,content
+967,1303951,"train_tokenizer.py",4692,1,"",python,content
+968,1304179,"train_tokenizer.py",4692,0,"y",python,content
+969,1304180,"train_tokenizer.py",4693,0,"",python,selection_keyboard
+970,1304245,"train_tokenizer.py",4693,0,"p",python,content
+971,1304246,"train_tokenizer.py",4694,0,"",python,selection_keyboard
+972,1304358,"train_tokenizer.py",4694,0,"e",python,content
+973,1304363,"train_tokenizer.py",4695,0,"",python,selection_keyboard
+974,1304609,"train_tokenizer.py",4695,0,",",python,content
+975,1304611,"train_tokenizer.py",4696,0,"",python,selection_keyboard
+976,1304932,"train_tokenizer.py",4696,0,"\n ",python,content
+977,1305975,"train_tokenizer.py",4705,0,"d",python,content
+978,1305976,"train_tokenizer.py",4706,0,"",python,selection_keyboard
+979,1306182,"train_tokenizer.py",4706,0,"t",python,content
+980,1306183,"train_tokenizer.py",4707,0,"",python,selection_keyboard
+981,1306291,"train_tokenizer.py",4707,0,"y",python,content
+982,1306293,"train_tokenizer.py",4708,0,"",python,selection_keyboard
+983,1306333,"train_tokenizer.py",4708,0,"p",python,content
+984,1306334,"train_tokenizer.py",4709,0,"",python,selection_keyboard
+985,1306416,"train_tokenizer.py",4709,0,"e",python,content
+986,1306417,"train_tokenizer.py",4710,0,"",python,selection_keyboard
+987,1306593,"train_tokenizer.py",4710,0,"=",python,content
+988,1306595,"train_tokenizer.py",4711,0,"",python,selection_keyboard
+989,1307467,"train_tokenizer.py",4711,0,"d",python,content
+990,1307471,"train_tokenizer.py",4712,0,"",python,selection_keyboard
+991,1307557,"train_tokenizer.py",4712,0,"t",python,content
+992,1307559,"train_tokenizer.py",4713,0,"",python,selection_keyboard
+993,1307639,"train_tokenizer.py",4713,0,"y",python,content
+994,1307641,"train_tokenizer.py",4714,0,"",python,selection_keyboard
+995,1307698,"train_tokenizer.py",4714,0,"p",python,content
+996,1307699,"train_tokenizer.py",4715,0,"",python,selection_keyboard
+997,1307810,"train_tokenizer.py",4715,0,"e",python,content
+998,1307813,"train_tokenizer.py",4716,0,"",python,selection_keyboard
+999,1308080,"train_tokenizer.py",4715,0,"",python,selection_command
+1000,1309663,"train_tokenizer.py",4716,0,"",python,selection_command
+1001,1309830,"train_tokenizer.py",4716,0,",",python,content
+1002,1309832,"train_tokenizer.py",4717,0,"",python,selection_keyboard
+1003,1309916,"train_tokenizer.py",4716,0,"",python,selection_command
+1004,1310786,"train_tokenizer.py",4716,1,"",python,content
+1005,1310789,"train_tokenizer.py",4715,0,"",python,selection_command
+1006,1317409,"train_tokenizer.py",4682,0,"",python,selection_command
+1007,1317660,"train_tokenizer.py",4634,0,"",python,selection_command
+1008,1317692,"train_tokenizer.py",4604,0,"",python,selection_command
+1009,1317725,"train_tokenizer.py",4570,0,"",python,selection_command
+1010,1317759,"train_tokenizer.py",4534,0,"",python,selection_command
+1011,1317793,"train_tokenizer.py",4498,0,"",python,selection_command
+1012,1317825,"train_tokenizer.py",4460,0,"",python,selection_command
+1013,1317859,"train_tokenizer.py",4424,0,"",python,selection_command
+1014,1317892,"train_tokenizer.py",4390,0,"",python,selection_command
+1015,1318032,"train_tokenizer.py",4354,0,"",python,selection_command
+1016,1318198,"train_tokenizer.py",4322,0,"",python,selection_command
+1017,1318469,"models/tokenizer.py",0,0,"from typing import Dict, Any, Tuple\n\nimport flax.linen as nn\n\nfrom utils.preprocess import patchify, unpatchify\nfrom utils.nn import STTransformer, VectorQuantizer\n\n\nclass TokenizerVQVAE(nn.Module):\n """"""ST-ViVit VQ-VAE""""""\n\n in_dim: int\n model_dim: int\n latent_dim: int\n num_latents: int\n patch_size: int\n num_blocks: int\n num_heads: int\n dropout: float\n codebook_dropout: float\n\n def setup(self):\n self.encoder = STTransformer(\n self.model_dim,\n self.latent_dim,\n self.num_blocks,\n self.num_heads,\n self.dropout,\n )\n self.vq = VectorQuantizer(\n self.latent_dim,\n self.num_latents,\n self.codebook_dropout,\n )\n self.out_dim = self.in_dim * self.patch_size**2\n self.decoder = STTransformer(\n self.model_dim,\n self.out_dim,\n self.num_blocks,\n self.num_heads,\n self.dropout,\n )\n\n def __call__(self, batch: Dict[str, Any], training: bool = True) -> Dict[str, Any]:\n H, W = batch[""videos""].shape[2:4]\n outputs = self.vq_encode(batch[""videos""], training)\n recon = self.decoder(outputs[""z_q""]) # (B, T, H_down * W_down, C)\n recon = nn.sigmoid(recon)\n outputs[""recon""] = unpatchify(recon, self.patch_size, H, W)\n return outputs\n\n def vq_encode(self, videos: Any, training: bool = True) -> Dict[str, Any]:\n # --- Preprocess + encode ---\n B, T = videos.shape[:2]\n x = patchify(videos, self.patch_size)\n N = x.shape[2]\n x = self.encoder(x) # (B, T, N, E)\n\n # --- Vector quantize ---\n x = x.reshape(B * T * N, self.latent_dim)\n z_q, z, emb, indices = self.vq(x, training)\n z_q = z_q.reshape(B, T, N, self.latent_dim)\n indices = indices.reshape(B, T, N)\n return dict(z_q=z_q, z=z, emb=emb, indices=indices)\n\n def decode(self, indices: Any, video_hw: Tuple[int, int]):\n z = self.vq.codebook[indices]\n recon = self.decoder(z)\n recon = nn.sigmoid(recon)\n return unpatchify(recon, self.patch_size, *video_hw)\n",python,tab
+1018,1318935,"models/tokenizer.py",205,0,"",python,selection_command
+1019,1319184,"models/tokenizer.py",225,0,"",python,selection_command
+1020,1319214,"models/tokenizer.py",232,0,"",python,selection_command
+1021,1319246,"models/tokenizer.py",248,0,"",python,selection_command
+1022,1319280,"models/tokenizer.py",267,0,"",python,selection_command
+1023,1319313,"models/tokenizer.py",287,0,"",python,selection_command
+1024,1319347,"models/tokenizer.py",308,0,"",python,selection_command
+1025,1319381,"models/tokenizer.py",328,0,"",python,selection_command
+1026,1319413,"models/tokenizer.py",348,0,"",python,selection_command
+1027,1319446,"models/tokenizer.py",367,0,"",python,selection_command
+1028,1319479,"models/tokenizer.py",386,0,"",python,selection_command
+1029,1319946,"models/tokenizer.py",407,0,"",python,selection_command
+1030,1320036,"models/tokenizer.py",407,0,"\n ",python,content
+1031,1321685,"models/tokenizer.py",412,0,"p",python,content
+1032,1321686,"models/tokenizer.py",413,0,"",python,selection_keyboard
+1033,1321776,"models/tokenizer.py",413,0,"a",python,content
+1034,1321777,"models/tokenizer.py",414,0,"",python,selection_keyboard
+1035,1321815,"models/tokenizer.py",414,0,"r",python,content
+1036,1321817,"models/tokenizer.py",415,0,"",python,selection_keyboard
+1037,1321899,"models/tokenizer.py",415,0,"a",python,content
+1038,1321901,"models/tokenizer.py",416,0,"",python,selection_keyboard
+1039,1321983,"models/tokenizer.py",416,0,"m",python,content
+1040,1321985,"models/tokenizer.py",417,0,"",python,selection_keyboard
+1041,1322229,"models/tokenizer.py",417,0,"_",python,content
+1042,1322230,"models/tokenizer.py",418,0,"",python,selection_keyboard
+1043,1322429,"models/tokenizer.py",418,0,"d",python,content
+1044,1322431,"models/tokenizer.py",419,0,"",python,selection_keyboard
+1045,1322565,"models/tokenizer.py",419,0,"t",python,content
+1046,1322567,"models/tokenizer.py",420,0,"",python,selection_keyboard
+1047,1322659,"models/tokenizer.py",420,0,"y",python,content
+1048,1322661,"models/tokenizer.py",421,0,"",python,selection_keyboard
+1049,1322706,"models/tokenizer.py",421,0,"p",python,content
+1050,1322707,"models/tokenizer.py",422,0,"",python,selection_keyboard
+1051,1322813,"models/tokenizer.py",422,0,"e",python,content
+1052,1322815,"models/tokenizer.py",423,0,"",python,selection_keyboard
+1053,1323501,"models/tokenizer.py",423,0,":",python,content
+1054,1323503,"models/tokenizer.py",424,0,"",python,selection_keyboard
+1055,1323579,"models/tokenizer.py",424,0," ",python,content
+1056,1323580,"models/tokenizer.py",425,0,"",python,selection_keyboard
+1057,1324945,"models/tokenizer.py",425,0,"j",python,content
+1058,1324947,"models/tokenizer.py",426,0,"",python,selection_keyboard
+1059,1325050,"models/tokenizer.py",426,0,"n",python,content
+1060,1325053,"models/tokenizer.py",427,0,"",python,selection_keyboard
+1061,1325085,"models/tokenizer.py",427,0,"p",python,content
+1062,1325088,"models/tokenizer.py",428,0,"",python,selection_keyboard
+1063,1325308,"models/tokenizer.py",428,0,".",python,content
+1064,1325311,"models/tokenizer.py",429,0,"",python,selection_keyboard
+1065,1325413,"models/tokenizer.py",429,0,"f",python,content
+1066,1325414,"models/tokenizer.py",430,0,"",python,selection_keyboard
+1067,1325896,"models/tokenizer.py",429,1,"",python,content
+1068,1327099,"models/tokenizer.py",429,0,"d",python,content
+1069,1327101,"models/tokenizer.py",430,0,"",python,selection_keyboard
+1070,1327180,"models/tokenizer.py",430,0,"t",python,content
+1071,1327184,"models/tokenizer.py",431,0,"",python,selection_keyboard
+1072,1327290,"models/tokenizer.py",431,0,"y",python,content
+1073,1327294,"models/tokenizer.py",432,0,"",python,selection_keyboard
+1074,1327332,"models/tokenizer.py",432,0,"p",python,content
+1075,1327336,"models/tokenizer.py",433,0,"",python,selection_keyboard
+1076,1327415,"models/tokenizer.py",433,0,"e",python,content
+1077,1327418,"models/tokenizer.py",434,0,"",python,selection_keyboard
+1078,1327632,"models/tokenizer.py",433,0,"",python,selection_command
+1079,1329083,"models/tokenizer.py",429,0,"",python,selection_command
+1080,1329230,"models/tokenizer.py",428,0,"",python,selection_command
+1081,1329361,"models/tokenizer.py",425,0,"",python,selection_command
+1082,1330424,"models/tokenizer.py",0,0,"",python,selection_command
+1083,1330732,"models/tokenizer.py",36,0,"",python,selection_command
+1084,1331042,"models/tokenizer.py",37,0,"",python,selection_command
+1085,1335230,"models/tokenizer.py",60,0,"\n",python,content
+1086,1335376,"models/tokenizer.py",61,0,"i",python,content
+1087,1335378,"models/tokenizer.py",62,0,"",python,selection_keyboard
+1088,1335458,"models/tokenizer.py",62,0,"m",python,content
+1089,1335459,"models/tokenizer.py",63,0,"",python,selection_keyboard
+1090,1335506,"models/tokenizer.py",63,0,"p",python,content
+1091,1335508,"models/tokenizer.py",64,0,"",python,selection_keyboard
+1092,1335566,"models/tokenizer.py",64,0,"o",python,content
+1093,1335567,"models/tokenizer.py",65,0,"",python,selection_keyboard
+1094,1335656,"models/tokenizer.py",65,0,"r",python,content
+1095,1335657,"models/tokenizer.py",66,0,"",python,selection_keyboard
+1096,1335730,"models/tokenizer.py",66,0,"t",python,content
+1097,1335731,"models/tokenizer.py",67,0,"",python,selection_keyboard
+1098,1335772,"models/tokenizer.py",67,0," ",python,content
+1099,1335773,"models/tokenizer.py",68,0,"",python,selection_keyboard
+1100,1338070,"models/tokenizer.py",68,0,"j",python,content
+1101,1338072,"models/tokenizer.py",69,0,"",python,selection_keyboard
+1102,1338171,"models/tokenizer.py",69,0,"a",python,content
+1103,1338173,"models/tokenizer.py",70,0,"",python,selection_keyboard
+1104,1338491,"models/tokenizer.py",70,0,"x",python,content
+1105,1338494,"models/tokenizer.py",71,0,"",python,selection_keyboard
+1106,1338607,"models/tokenizer.py",71,0,".",python,content
+1107,1338610,"models/tokenizer.py",72,0,"",python,selection_keyboard
+1108,1338772,"models/tokenizer.py",72,0,"n",python,content
+1109,1338773,"models/tokenizer.py",73,0,"",python,selection_keyboard
+1110,1338926,"models/tokenizer.py",73,0,"p",python,content
+1111,1338927,"models/tokenizer.py",74,0,"",python,selection_keyboard
+1112,1339041,"models/tokenizer.py",74,0," ",python,content
+1113,1339042,"models/tokenizer.py",75,0,"",python,selection_keyboard
+1114,1339213,"models/tokenizer.py",75,0,"a",python,content
+1115,1339214,"models/tokenizer.py",76,0,"",python,selection_keyboard
+1116,1339295,"models/tokenizer.py",76,0,"s",python,content
+1117,1339297,"models/tokenizer.py",77,0,"",python,selection_keyboard
+1118,1339381,"models/tokenizer.py",77,0," ",python,content
+1119,1339382,"models/tokenizer.py",78,0,"",python,selection_keyboard
+1120,1339550,"models/tokenizer.py",78,0,"n",python,content
+1121,1339551,"models/tokenizer.py",79,0,"",python,selection_keyboard
+1122,1339873,"models/tokenizer.py",78,1,"",python,content
+1123,1340062,"models/tokenizer.py",78,0,"j",python,content
+1124,1340063,"models/tokenizer.py",79,0,"",python,selection_keyboard
+1125,1340263,"models/tokenizer.py",79,0,"n",python,content
+1126,1340265,"models/tokenizer.py",80,0,"",python,selection_keyboard
+1127,1340311,"models/tokenizer.py",80,0,"p",python,content
+1128,1340312,"models/tokenizer.py",81,0,"",python,selection_keyboard
+1129,1340547,"models/tokenizer.py",80,0,"",python,selection_command
+1130,1340864,"models/tokenizer.py",78,0,"",python,selection_command
+1131,1341000,"models/tokenizer.py",75,0,"",python,selection_command
+1132,1341129,"models/tokenizer.py",72,0,"",python,selection_command
+1133,1341330,"models/tokenizer.py",71,0,"",python,selection_command
+1134,1341629,"models/tokenizer.py",72,0,"",python,selection_command
+1135,1341864,"models/tokenizer.py",72,2,"",python,content
+1136,1342007,"models/tokenizer.py",72,0,"n",python,content
+1137,1342008,"models/tokenizer.py",73,0,"",python,selection_keyboard
+1138,1342202,"models/tokenizer.py",73,0,"u",python,content
+1139,1342202,"models/tokenizer.py",74,0,"",python,selection_keyboard
+1140,1342765,"models/tokenizer.py",74,0,"m",python,content
+1141,1342767,"models/tokenizer.py",75,0,"",python,selection_keyboard
+1142,1342866,"models/tokenizer.py",75,0,"p",python,content
+1143,1342868,"models/tokenizer.py",76,0,"",python,selection_keyboard
+1144,1342985,"models/tokenizer.py",76,0,"y",python,content
+1145,1342987,"models/tokenizer.py",77,0,"",python,selection_keyboard
+1146,1343085,"models/tokenizer.py",76,0,"",python,selection_command
+1147,1343801,"models/tokenizer.py",78,0,"",python,selection_command
+1148,1343959,"models/tokenizer.py",81,0,"",python,selection_command
+1149,1344431,"models/tokenizer.py",449,0,"",python,selection_command
+1150,1349796,"models/tokenizer.py",458,0,"\n ",python,content
+1151,1349944,"models/tokenizer.py",463,0,"d",python,content
+1152,1349945,"models/tokenizer.py",464,0,"",python,selection_keyboard
+1153,1350078,"models/tokenizer.py",464,0,"t",python,content
+1154,1350079,"models/tokenizer.py",465,0,"",python,selection_keyboard
+1155,1350159,"models/tokenizer.py",465,0,"y",python,content
+1156,1350162,"models/tokenizer.py",466,0,"",python,selection_keyboard
+1157,1350195,"models/tokenizer.py",466,0,"p",python,content
+1158,1350197,"models/tokenizer.py",467,0,"",python,selection_keyboard
+1159,1350273,"models/tokenizer.py",467,0,"e",python,content
+1160,1350275,"models/tokenizer.py",468,0,"",python,selection_keyboard
+1161,1350491,"models/tokenizer.py",468,0,":",python,content
+1162,1350492,"models/tokenizer.py",469,0,"",python,selection_keyboard
+1163,1350614,"models/tokenizer.py",469,0," ",python,content
+1164,1350615,"models/tokenizer.py",470,0,"",python,selection_keyboard
+1165,1350812,"models/tokenizer.py",470,0,"j",python,content
+1166,1350814,"models/tokenizer.py",471,0,"",python,selection_keyboard
+1167,1350947,"models/tokenizer.py",471,0,"n",python,content
+1168,1350948,"models/tokenizer.py",472,0,"",python,selection_keyboard
+1169,1351006,"models/tokenizer.py",472,0,"p",python,content
+1170,1351007,"models/tokenizer.py",473,0,"",python,selection_keyboard
+1171,1351212,"models/tokenizer.py",473,0,".",python,content
+1172,1351215,"models/tokenizer.py",474,0,"",python,selection_keyboard
+1173,1351316,"models/tokenizer.py",474,0,"d",python,content
+1174,1351319,"models/tokenizer.py",475,0,"",python,selection_keyboard
+1175,1351435,"models/tokenizer.py",475,0,"t",python,content
+1176,1351437,"models/tokenizer.py",476,0,"",python,selection_keyboard
+1177,1351524,"models/tokenizer.py",476,0,"y",python,content
+1178,1351526,"models/tokenizer.py",477,0,"",python,selection_keyboard
+1179,1351547,"models/tokenizer.py",477,0,"p",python,content
+1180,1351549,"models/tokenizer.py",478,0,"",python,selection_keyboard
+1181,1351637,"models/tokenizer.py",478,0,"e",python,content
+1182,1351637,"models/tokenizer.py",479,0,"",python,selection_keyboard
+1183,1351830,"models/tokenizer.py",478,0,"",python,selection_command
+1184,1353039,"train_tokenizer.py",0,0,"",python,tab
+1185,1355034,"train_tokenizer.py",4354,0,"",python,selection_command
+1186,1355283,"train_tokenizer.py",4390,0,"",python,selection_command
+1187,1355314,"train_tokenizer.py",4424,0,"",python,selection_command
+1188,1355347,"train_tokenizer.py",4460,0,"",python,selection_command
+1189,1355380,"train_tokenizer.py",4498,0,"",python,selection_command
+1190,1355433,"train_tokenizer.py",4534,0,"",python,selection_command
+1191,1355448,"train_tokenizer.py",4570,0,"",python,selection_command
+1192,1355480,"train_tokenizer.py",4604,0,"",python,selection_command
+1193,1355513,"train_tokenizer.py",4634,0,"",python,selection_command
+1194,1355547,"train_tokenizer.py",4682,0,"",python,selection_command
+1195,1355782,"train_tokenizer.py",4683,0,"",python,selection_command
+1196,1355866,"train_tokenizer.py",4684,0,"",python,selection_command
+1197,1356081,"train_tokenizer.py",4695,0,"",python,selection_command
+1198,1356483,"train_tokenizer.py",4684,0,"",python,selection_command
+1199,1356815,"train_tokenizer.py",4684,0,"a",python,content
+1200,1356818,"train_tokenizer.py",4685,0,"",python,selection_keyboard
+1201,1356909,"train_tokenizer.py",4685,0,"r",python,content
+1202,1356913,"train_tokenizer.py",4686,0,"",python,selection_keyboard
+1203,1357018,"train_tokenizer.py",4686,0,"g",python,content
+1204,1357020,"train_tokenizer.py",4687,0,"",python,selection_keyboard
+1205,1357081,"train_tokenizer.py",4687,0,"s",python,content
+1206,1357085,"train_tokenizer.py",4688,0,"",python,selection_keyboard
+1207,1357173,"train_tokenizer.py",4688,0,".",python,content
+1208,1357175,"train_tokenizer.py",4689,0,"",python,selection_keyboard
+1209,1357673,"train_tokenizer.py",4688,0,"",python,selection_command
+1210,1357899,"train_tokenizer.py",4720,0,"",python,selection_command
+1211,1358090,"train_tokenizer.py",4716,0,"",python,selection_command
+1212,1358596,"train_tokenizer.py",4716,0,"a",python,content
+1213,1358597,"train_tokenizer.py",4717,0,"",python,selection_keyboard
+1214,1358673,"train_tokenizer.py",4717,0,"r",python,content
+1215,1358676,"train_tokenizer.py",4718,0,"",python,selection_keyboard
+1216,1358817,"train_tokenizer.py",4718,0,"g",python,content
+1217,1358819,"train_tokenizer.py",4719,0,"",python,selection_keyboard
+1218,1358858,"train_tokenizer.py",4719,0,"s",python,content
+1219,1358859,"train_tokenizer.py",4720,0,"",python,selection_keyboard
+1220,1358973,"train_tokenizer.py",4720,0,".",python,content
+1221,1358974,"train_tokenizer.py",4721,0,"",python,selection_keyboard
+1222,1359108,"train_tokenizer.py",4720,0,"",python,selection_command
+1223,1359946,"train_tokenizer.py",4682,0,"",python,selection_command
+1224,1360023,"train_tokenizer.py",4683,0,"",python,selection_command
+1225,1360145,"train_tokenizer.py",4684,0,"",python,selection_command
+1226,1360347,"train_tokenizer.py",4688,0,"",python,selection_command
+1227,1360498,"train_tokenizer.py",4689,0,"",python,selection_command
+1228,1360682,"train_tokenizer.py",4664,0,"",python,selection_command
+1229,1366112,"models/tokenizer.py",0,0,"",python,tab
+1230,1366197,"models/tokenizer.py",480,0,"",python,selection_command
+1231,1366781,"models/tokenizer.py",481,0,"",python,selection_command
+1232,1366913,"models/tokenizer.py",480,0,"",python,selection_command
+1233,1367064,"models/tokenizer.py",459,0,"",python,selection_command
+1234,1367207,"models/tokenizer.py",432,0,"",python,selection_command
+1235,1367426,"models/tokenizer.py",431,0,"\n ",python,content
+1236,1367528,"models/tokenizer.py",436,0,"#",python,content
+1237,1367529,"models/tokenizer.py",437,0,"",python,selection_keyboard
+1238,1367734,"models/tokenizer.py",437,0," ",python,content
+1239,1367735,"models/tokenizer.py",438,0,"",python,selection_keyboard
+1240,1367928,"models/tokenizer.py",438,0,"F",python,content
+1241,1367929,"models/tokenizer.py",439,0,"",python,selection_keyboard
+1242,1368006,"models/tokenizer.py",439,0,"I",python,content
+1243,1368007,"models/tokenizer.py",440,0,"",python,selection_keyboard
+1244,1368047,"models/tokenizer.py",440,0,"X",python,content
+1245,1368048,"models/tokenizer.py",441,0,"",python,selection_keyboard
+1246,1368162,"models/tokenizer.py",441,0,"M",python,content
+1247,1368162,"models/tokenizer.py",442,0,"",python,selection_keyboard
+1248,1368266,"models/tokenizer.py",442,0,"E",python,content
+1249,1368268,"models/tokenizer.py",443,0,"",python,selection_keyboard
+1250,1368379,"models/tokenizer.py",443,0," ",python,content
+1251,1368380,"models/tokenizer.py",444,0,"",python,selection_keyboard
+1252,1368499,"models/tokenizer.py",444,0,"()",python,content
+1253,1368500,"models/tokenizer.py",445,0,"",python,selection_keyboard
+1254,1368812,"models/tokenizer.py",445,0,"f",python,content
+1255,1368813,"models/tokenizer.py",446,0,"",python,selection_keyboard
+1256,1368957,"models/tokenizer.py",446,0,".",python,content
+1257,1368958,"models/tokenizer.py",447,0,"",python,selection_keyboard
+1258,1369048,"models/tokenizer.py",447,0,"s",python,content
+1259,1369052,"models/tokenizer.py",448,0,"",python,selection_keyboard
+1260,1369128,"models/tokenizer.py",448,0,"r",python,content
+1261,1369129,"models/tokenizer.py",449,0,"",python,selection_keyboard
+1262,1369197,"models/tokenizer.py",449,0,"a",python,content
+1263,1369199,"models/tokenizer.py",450,0,"",python,selection_keyboard
+1264,1369246,"models/tokenizer.py",450,0,"m",python,content
+1265,1369247,"models/tokenizer.py",451,0,"",python,selection_keyboard
+1266,1369404,"models/tokenizer.py",451,0,"b",python,content
+1267,1369405,"models/tokenizer.py",452,0,"",python,selection_keyboard
+1268,1369534,"models/tokenizer.py",452,0,"c",python,content
+1269,1369535,"models/tokenizer.py",453,0,"",python,selection_keyboard
+1270,1369880,"models/tokenizer.py",452,1,"",python,content
+1271,1370079,"models/tokenizer.py",452,0,"i",python,content
+1272,1370080,"models/tokenizer.py",453,0,"",python,selection_keyboard
+1273,1370133,"models/tokenizer.py",453,0,"c",python,content
+1274,1370134,"models/tokenizer.py",454,0,"",python,selection_keyboard
+1275,1370179,"models/tokenizer.py",454,0,"a",python,content
+1276,1370180,"models/tokenizer.py",455,0,"",python,selection_keyboard
+1277,1370252,"models/tokenizer.py",455,0,"l",python,content
+1278,1370253,"models/tokenizer.py",456,0,"",python,selection_keyboard
+1279,1370465,"models/tokenizer.py",456,1,")",python,content
+1280,1370466,"models/tokenizer.py",457,0,"",python,selection_keyboard
+1281,1371065,"models/tokenizer.py",457,0,":",python,content
+1282,1371066,"models/tokenizer.py",458,0,"",python,selection_keyboard
+1283,1371182,"models/tokenizer.py",458,0," ",python,content
+1284,1371183,"models/tokenizer.py",459,0,"",python,selection_keyboard
+1285,1371410,"models/tokenizer.py",459,0,"c",python,content
+1286,1371413,"models/tokenizer.py",460,0,"",python,selection_keyboard
+1287,1371500,"models/tokenizer.py",460,0,"h",python,content
+1288,1371501,"models/tokenizer.py",461,0,"",python,selection_keyboard
+1289,1371548,"models/tokenizer.py",461,0,"e",python,content
+1290,1371548,"models/tokenizer.py",462,0,"",python,selection_keyboard
+1291,1371615,"models/tokenizer.py",462,0,"c",python,content
+1292,1371616,"models/tokenizer.py",463,0,"",python,selection_keyboard
+1293,1371700,"models/tokenizer.py",463,0,"k",python,content
+1294,1371700,"models/tokenizer.py",464,0,"",python,selection_keyboard
+1295,1371799,"models/tokenizer.py",464,0," ",python,content
+1296,1371800,"models/tokenizer.py",465,0,"",python,selection_keyboard
+1297,1372199,"models/tokenizer.py",465,0,"w",python,content
+1298,1372202,"models/tokenizer.py",466,0,"",python,selection_keyboard
+1299,1372283,"models/tokenizer.py",466,0,"h",python,content
+1300,1372284,"models/tokenizer.py",467,0,"",python,selection_keyboard
+1301,1372375,"models/tokenizer.py",467,0,"e",python,content
+1302,1372376,"models/tokenizer.py",468,0,"",python,selection_keyboard
+1303,1372413,"models/tokenizer.py",468,0,"t",python,content
+1304,1372414,"models/tokenizer.py",469,0,"",python,selection_keyboard
+1305,1372564,"models/tokenizer.py",469,0,"h",python,content
+1306,1372565,"models/tokenizer.py",470,0,"",python,selection_keyboard
+1307,1372608,"models/tokenizer.py",470,0,"e",python,content
+1308,1372609,"models/tokenizer.py",471,0,"",python,selection_keyboard
+1309,1372674,"models/tokenizer.py",471,0,"r",python,content
+1310,1372675,"models/tokenizer.py",472,0,"",python,selection_keyboard
+1311,1372727,"models/tokenizer.py",472,0," ",python,content
+1312,1372728,"models/tokenizer.py",473,0,"",python,selection_keyboard
+1313,1378947,"models/tokenizer.py",473,0,"f",python,content
+1314,1378948,"models/tokenizer.py",474,0,"",python,selection_keyboard
+1315,1379064,"models/tokenizer.py",474,0,"a",python,content
+1316,1379065,"models/tokenizer.py",475,0,"",python,selection_keyboard
+1317,1379072,"models/tokenizer.py",475,0,"l",python,content
+1318,1379073,"models/tokenizer.py",476,0,"",python,selection_keyboard
+1319,1379226,"models/tokenizer.py",476,0,"x",python,content
+1320,1379227,"models/tokenizer.py",477,0,"",python,selection_keyboard
+1321,1379282,"models/tokenizer.py",477,0," ",python,content
+1322,1379283,"models/tokenizer.py",478,0,"",python,selection_keyboard
+1323,1379423,"models/tokenizer.py",478,0,"w",python,content
+1324,1379426,"models/tokenizer.py",479,0,"",python,selection_keyboard
+1325,1379519,"models/tokenizer.py",479,0,"i",python,content
+1326,1379521,"models/tokenizer.py",480,0,"",python,selection_keyboard
+1327,1379775,"models/tokenizer.py",478,2,"",python,content
+1328,1379893,"models/tokenizer.py",473,5,"",python,content
+1329,1380029,"models/tokenizer.py",473,0,"f",python,content
+1330,1380030,"models/tokenizer.py",474,0,"",python,selection_keyboard
+1331,1380101,"models/tokenizer.py",474,0,"l",python,content
+1332,1380102,"models/tokenizer.py",475,0,"",python,selection_keyboard
+1333,1380157,"models/tokenizer.py",475,0,"a",python,content
+1334,1380158,"models/tokenizer.py",476,0,"",python,selection_keyboard
+1335,1380314,"models/tokenizer.py",476,0,"x",python,content
+1336,1380315,"models/tokenizer.py",477,0,"",python,selection_keyboard
+1337,1380341,"models/tokenizer.py",477,0," ",python,content
+1338,1380343,"models/tokenizer.py",478,0,"",python,selection_keyboard
+1339,1380497,"models/tokenizer.py",478,0,"a",python,content
+1340,1380498,"models/tokenizer.py",479,0,"",python,selection_keyboard
+1341,1380589,"models/tokenizer.py",479,0,"u",python,content
+1342,1380590,"models/tokenizer.py",480,0,"",python,selection_keyboard
+1343,1380679,"models/tokenizer.py",480,0,"t",python,content
+1344,1380679,"models/tokenizer.py",481,0,"",python,selection_keyboard
+1345,1380754,"models/tokenizer.py",481,0,"o",python,content
+1346,1380755,"models/tokenizer.py",482,0,"",python,selection_keyboard
+1347,1380803,"models/tokenizer.py",482,0,"m",python,content
+1348,1380804,"models/tokenizer.py",483,0,"",python,selection_keyboard
+1349,1380877,"models/tokenizer.py",483,0,"a",python,content
+1350,1380878,"models/tokenizer.py",484,0,"",python,selection_keyboard
+1351,1380946,"models/tokenizer.py",484,0,"t",python,content
+1352,1380947,"models/tokenizer.py",485,0,"",python,selection_keyboard
+1353,1381025,"models/tokenizer.py",485,0,"i",python,content
+1354,1381025,"models/tokenizer.py",486,0,"",python,selection_keyboard
+1355,1381109,"models/tokenizer.py",486,0,"c",python,content
+1356,1381110,"models/tokenizer.py",487,0,"",python,selection_keyboard
+1357,1381173,"models/tokenizer.py",487,0,"a",python,content
+1358,1381174,"models/tokenizer.py",488,0,"",python,selection_keyboard
+1359,1381224,"models/tokenizer.py",488,0,"l",python,content
+1360,1381224,"models/tokenizer.py",489,0,"",python,selection_keyboard
+1361,1381346,"models/tokenizer.py",489,0,"l",python,content
+1362,1381347,"models/tokenizer.py",490,0,"",python,selection_keyboard
+1363,1381464,"models/tokenizer.py",490,0,"y",python,content
+1364,1381464,"models/tokenizer.py",491,0,"",python,selection_keyboard
+1365,1381523,"models/tokenizer.py",491,0," ",python,content
+1366,1381524,"models/tokenizer.py",492,0,"",python,selection_keyboard
+1367,1383251,"models/tokenizer.py",492,0,"c",python,content
+1368,1383252,"models/tokenizer.py",493,0,"",python,selection_keyboard
+1369,1383324,"models/tokenizer.py",493,0,"a",python,content
+1370,1383329,"models/tokenizer.py",494,0,"",python,selection_keyboard
+1371,1383417,"models/tokenizer.py",494,0,"s",python,content
+1372,1383419,"models/tokenizer.py",495,0,"",python,selection_keyboard
+1373,1383631,"models/tokenizer.py",495,0,"t",python,content
+1374,1383632,"models/tokenizer.py",496,0,"",python,selection_keyboard
+1375,1383820,"models/tokenizer.py",496,0," ",python,content
+1376,1383823,"models/tokenizer.py",497,0,"",python,selection_keyboard
+1377,1384289,"models/tokenizer.py",496,1,"",python,content
+1378,1384312,"models/tokenizer.py",496,0,"s",python,content
+1379,1384314,"models/tokenizer.py",497,0,"",python,selection_keyboard
+1380,1384389,"models/tokenizer.py",497,0," ",python,content
+1381,1384390,"models/tokenizer.py",498,0,"",python,selection_keyboard
+1382,1386680,"models/tokenizer.py",498,0,"i",python,content
+1383,1386681,"models/tokenizer.py",499,0,"",python,selection_keyboard
+1384,1386728,"models/tokenizer.py",499,0,"n",python,content
+1385,1386730,"models/tokenizer.py",500,0,"",python,selection_keyboard
+1386,1386879,"models/tokenizer.py",500,0,"p",python,content
+1387,1386880,"models/tokenizer.py",501,0,"",python,selection_keyboard
+1388,1386933,"models/tokenizer.py",501,0,"u",python,content
+1389,1386934,"models/tokenizer.py",502,0,"",python,selection_keyboard
+1390,1386980,"models/tokenizer.py",502,0,"t",python,content
+1391,1386981,"models/tokenizer.py",503,0,"",python,selection_keyboard
+1392,1387064,"models/tokenizer.py",503,0,"s",python,content
+1393,1387066,"models/tokenizer.py",504,0,"",python,selection_keyboard
+1394,1387113,"models/tokenizer.py",504,0," ",python,content
+1395,1387114,"models/tokenizer.py",505,0,"",python,selection_keyboard
+1396,1387313,"models/tokenizer.py",505,0,"a",python,content
+1397,1387315,"models/tokenizer.py",506,0,"",python,selection_keyboard
+1398,1387408,"models/tokenizer.py",506,0,"n",python,content
+1399,1387409,"models/tokenizer.py",507,0,"",python,selection_keyboard
+1400,1387465,"models/tokenizer.py",507,0,"d",python,content
+1401,1387466,"models/tokenizer.py",508,0,"",python,selection_keyboard
+1402,1387530,"models/tokenizer.py",508,0," ",python,content
+1403,1387531,"models/tokenizer.py",509,0,"",python,selection_keyboard
+1404,1387658,"models/tokenizer.py",509,0,"h",python,content
+1405,1387661,"models/tokenizer.py",510,0,"",python,selection_keyboard
+1406,1387761,"models/tokenizer.py",510,0,"a",python,content
+1407,1387762,"models/tokenizer.py",511,0,"",python,selection_keyboard
+1408,1387830,"models/tokenizer.py",511,0,"n",python,content
+1409,1387831,"models/tokenizer.py",512,0,"",python,selection_keyboard
+1410,1387916,"models/tokenizer.py",512,0,"d",python,content
+1411,1387916,"models/tokenizer.py",513,0,"",python,selection_keyboard
+1412,1388018,"models/tokenizer.py",513,0,"l",python,content
+1413,1388019,"models/tokenizer.py",514,0,"",python,selection_keyboard
+1414,1388100,"models/tokenizer.py",514,0,"e",python,content
+1415,1388103,"models/tokenizer.py",515,0,"",python,selection_keyboard
+1416,1388155,"models/tokenizer.py",515,0," ",python,content
+1417,1388156,"models/tokenizer.py",516,0,"",python,selection_keyboard
+1418,1388295,"models/tokenizer.py",516,0,"c",python,content
+1419,1388296,"models/tokenizer.py",517,0,"",python,selection_keyboard
+1420,1388315,"models/tokenizer.py",517,0,"o",python,content
+1421,1388316,"models/tokenizer.py",518,0,"",python,selection_keyboard
+1422,1388346,"models/tokenizer.py",518,0,"m",python,content
+1423,1388347,"models/tokenizer.py",519,0,"",python,selection_keyboard
+1424,1388466,"models/tokenizer.py",519,0,"p",python,content
+1425,1388466,"models/tokenizer.py",520,0,"",python,selection_keyboard
+1426,1388559,"models/tokenizer.py",520,0,"u",python,content
+1427,1388561,"models/tokenizer.py",521,0,"",python,selection_keyboard
+1428,1388597,"models/tokenizer.py",521,0,"t",python,content
+1429,1388598,"models/tokenizer.py",522,0,"",python,selection_keyboard
+1430,1388642,"models/tokenizer.py",522,0,"e",python,content
+1431,1388642,"models/tokenizer.py",523,0,"",python,selection_keyboard
+1432,1388705,"models/tokenizer.py",523,0," ",python,content
+1433,1388706,"models/tokenizer.py",524,0,"",python,selection_keyboard
+1434,1390233,"models/tokenizer.py",524,0,"i",python,content
+1435,1390234,"models/tokenizer.py",525,0,"",python,selection_keyboard
+1436,1390310,"models/tokenizer.py",525,0,"n",python,content
+1437,1390312,"models/tokenizer.py",526,0,"",python,selection_keyboard
+1438,1390619,"models/tokenizer.py",526,0,"b",python,content
+1439,1390620,"models/tokenizer.py",527,0,"",python,selection_keyboard
+1440,1390729,"models/tokenizer.py",527,0,"f",python,content
+1441,1390730,"models/tokenizer.py",528,0,"",python,selection_keyboard
+1442,1391025,"models/tokenizer.py",527,1,"",python,content
+1443,1391131,"models/tokenizer.py",526,1,"",python,content
+1444,1391201,"models/tokenizer.py",526,0," ",python,content
+1445,1391202,"models/tokenizer.py",527,0,"",python,selection_keyboard
+1446,1391334,"models/tokenizer.py",527,0,"b",python,content
+1447,1391335,"models/tokenizer.py",528,0,"",python,selection_keyboard
+1448,1391476,"models/tokenizer.py",528,0,"f",python,content
+1449,1391477,"models/tokenizer.py",529,0,"",python,selection_keyboard
+1450,1391525,"models/tokenizer.py",529,0,"l",python,content
+1451,1391528,"models/tokenizer.py",530,0,"",python,selection_keyboard
+1452,1391666,"models/tokenizer.py",530,0,"o",python,content
+1453,1391667,"models/tokenizer.py",531,0,"",python,selection_keyboard
+1454,1391708,"models/tokenizer.py",531,0,"a",python,content
+1455,1391709,"models/tokenizer.py",532,0,"",python,selection_keyboard
+1456,1391814,"models/tokenizer.py",532,0,"t",python,content
+1457,1391817,"models/tokenizer.py",533,0,"",python,selection_keyboard
+1458,1392128,"models/tokenizer.py",533,0,"1",python,content
+1459,1392129,"models/tokenizer.py",534,0,"",python,selection_keyboard
+1460,1392262,"models/tokenizer.py",534,0,"6",python,content
+1461,1392263,"models/tokenizer.py",535,0,"",python,selection_keyboard
+1462,1392383,"models/tokenizer.py",534,0,"",python,selection_command
+1463,1393867,"models/tokenizer.py",535,0,"",python,selection_command
+1464,1396060,"models/tokenizer.py",534,0,"",python,selection_command
+1465,1396262,"models/tokenizer.py",527,0,"",python,selection_command
+1466,1396514,"models/tokenizer.py",524,0,"",python,selection_command
+1467,1396545,"models/tokenizer.py",516,0,"",python,selection_command
+1468,1396579,"models/tokenizer.py",509,0,"",python,selection_command
+1469,1396613,"models/tokenizer.py",505,0,"",python,selection_command
+1470,1396646,"models/tokenizer.py",498,0,"",python,selection_command
+1471,1396679,"models/tokenizer.py",492,0,"",python,selection_command
+1472,1396715,"models/tokenizer.py",478,0,"",python,selection_command
+1473,1396748,"models/tokenizer.py",473,0,"",python,selection_command
+1474,1396780,"models/tokenizer.py",465,0,"",python,selection_command
+1475,1396947,"models/tokenizer.py",473,0,"",python,selection_command
+1476,1397129,"models/tokenizer.py",478,0,"",python,selection_command
+1477,1397279,"models/tokenizer.py",492,0,"",python,selection_command
+1478,1397435,"models/tokenizer.py",498,0,"",python,selection_command
+1479,1397719,"models/tokenizer.py",492,0,"",python,selection_command
+1480,1397897,"models/tokenizer.py",492,5,"",python,content
+1481,1398949,"models/tokenizer.py",491,0,"",python,selection_command
+1482,1399036,"models/tokenizer.py",478,0,"",python,selection_command
+1483,1399497,"models/tokenizer.py",478,0,"w",python,content
+1484,1399498,"models/tokenizer.py",479,0,"",python,selection_keyboard
+1485,1399628,"models/tokenizer.py",479,0,"i",python,content
+1486,1399629,"models/tokenizer.py",480,0,"",python,selection_keyboard
+1487,1399694,"models/tokenizer.py",480,0,"l",python,content
+1488,1399696,"models/tokenizer.py",481,0,"",python,selection_keyboard
+1489,1399861,"models/tokenizer.py",481,0,"l",python,content
+1490,1399862,"models/tokenizer.py",482,0,"",python,selection_keyboard
+1491,1399911,"models/tokenizer.py",482,0," ",python,content
+1492,1399913,"models/tokenizer.py",483,0,"",python,selection_keyboard
+1493,1400067,"models/tokenizer.py",482,0,"",python,selection_command
+1494,1400279,"models/tokenizer.py",495,0,"",python,selection_command
+1495,1400636,"models/tokenizer.py",496,0,"",python,selection_command
+1496,1400723,"models/tokenizer.py",497,0,"",python,selection_command
+1497,1400999,"models/tokenizer.py",497,0,"c",python,content
+1498,1401000,"models/tokenizer.py",498,0,"",python,selection_keyboard
+1499,1401062,"models/tokenizer.py",498,0,"a",python,content
+1500,1401063,"models/tokenizer.py",499,0,"",python,selection_keyboard
+1501,1401162,"models/tokenizer.py",499,0,"s",python,content
+1502,1401163,"models/tokenizer.py",500,0,"",python,selection_keyboard
+1503,1401240,"models/tokenizer.py",500,0,"t",python,content
+1504,1401242,"models/tokenizer.py",501,0,"",python,selection_keyboard
+1505,1401472,"models/tokenizer.py",500,0,"",python,selection_command
+1506,1417654,"train_tokenizer.py",0,0,"",python,tab
+1507,1419879,"train_tokenizer.py",4664,37," param_dtype=args.param_dtype,",python,selection_command
+1508,1420110,"train_tokenizer.py",4664,62," param_dtype=args.param_dtype,\n dtype=args.dtype",python,selection_command
+1509,1420298,"train_tokenizer.py",4664,0,"",python,selection_command
+1510,1421996,"train_lam.py",0,0,"",python,tab
+1511,1422681,"train_lam.py",1859,0,"",python,selection_command
+1512,1422823,"train_lam.py",2850,0,"",python,selection_command
+1513,1422999,"train_lam.py",3728,0,"",python,selection_command
+1514,1423498,"train_lam.py",2850,0,"",python,selection_command
+1515,1424373,"train_lam.py",3728,0,"",python,selection_command
+1516,1424529,"train_lam.py",4503,0,"",python,selection_command
+1517,1424707,"train_lam.py",5274,0,"",python,selection_command
+1518,1425367,"train_lam.py",4503,0,"",python,selection_command
+1519,1425614,"train_lam.py",5274,0,"",python,selection_command
+1520,1426023,"train_lam.py",4927,0,"",python,selection_keyboard
+1521,1426263,"train_lam.py",4893,0,"",python,selection_command
+1522,1426445,"train_lam.py",4927,0,"",python,selection_command
+1523,1426598,"train_lam.py",4957,0,"",python,selection_command
+1524,1427046,"train_lam.py",4996,0,"\n param_dtype=args.param_dtype,\n dtype=args.dtype",python,content
+1525,1427058,"train_lam.py",5005,0,"",python,selection_command
+1526,1427932,"train_lam.py",5043,0,"",python,selection_command
+1527,1428133,"train_lam.py",5059,0,"",python,selection_command
+1528,1428229,"train_lam.py",5059,0,",",python,content
+1529,1428230,"train_lam.py",5060,0,"",python,selection_keyboard
+1530,1428366,"train_lam.py",5059,0,"",python,selection_command
+1531,1429473,"train_tokenizer.py",0,0,"",python,tab
+1532,1430012,"train_tokenizer.py",4702,0,"",python,selection_command
+1533,1430180,"train_tokenizer.py",4726,0,"",python,selection_command
+1534,1430357,"train_tokenizer.py",4726,0,",",python,content
+1535,1430360,"train_tokenizer.py",4727,0,"",python,selection_keyboard
+1536,1430501,"train_tokenizer.py",4726,0,"",python,selection_command
+1537,1431389,"train_lam.py",0,0,"",python,tab
+1538,1431981,"train_lam.py",5021,0,"",python,selection_command
+1539,1433233,"train_lam.py",4973,0,"",python,selection_command
+1540,1433483,"train_lam.py",4943,0,"",python,selection_command
+1541,1433514,"train_lam.py",4909,0,"",python,selection_command
+1542,1433546,"train_lam.py",4873,0,"",python,selection_command
+1543,1433579,"train_lam.py",4837,0,"",python,selection_command
+1544,1433614,"train_lam.py",4799,0,"",python,selection_command
+1545,1433647,"train_lam.py",4763,0,"",python,selection_command
+1546,1433679,"train_lam.py",4729,0,"",python,selection_command
+1547,1433713,"train_lam.py",4693,0,"",python,selection_command
+1548,1433840,"train_lam.py",4664,0,"",python,selection_command
+1549,1434214,"train_lam.py",4650,0,"",python,selection_command
+1550,1435004,"models/lam.py",0,0,"from typing import Dict, Any\n\nimport jax.numpy as jnp\nimport flax.linen as nn\n\nfrom utils.preprocess import patchify, unpatchify\nfrom utils.nn import STTransformer, VectorQuantizer\n\n\nclass LatentActionModel(nn.Module):\n """"""Latent Action ST-ViVit VQ-VAE""""""\n\n in_dim: int\n model_dim: int\n latent_dim: int\n num_latents: int\n patch_size: int\n num_blocks: int\n num_heads: int\n dropout: float\n codebook_dropout: float\n\n def setup(self):\n self.patch_token_dim = self.in_dim * self.patch_size**2\n self.encoder = STTransformer(\n self.model_dim,\n self.latent_dim,\n self.num_blocks,\n self.num_heads,\n self.dropout,\n )\n self.action_in = self.param(\n ""action_in"",\n nn.initializers.lecun_uniform(),\n (1, 1, 1, self.patch_token_dim),\n )\n self.vq = VectorQuantizer(\n self.latent_dim,\n self.num_latents,\n self.codebook_dropout,\n )\n self.patch_up = nn.Dense(self.model_dim)\n self.action_up = nn.Dense(self.model_dim)\n self.decoder = STTransformer(\n self.model_dim,\n self.patch_token_dim,\n self.num_blocks,\n self.num_heads,\n self.dropout,\n )\n\n def __call__(self, batch: Dict[str, Any], training: bool = True) -> Dict[str, Any]:\n # --- Encode + VQ ---\n H, W = batch[""videos""].shape[2:4]\n outputs = self.vq_encode(batch[""videos""], training)\n video_action_patches = self.action_up(outputs[""z_q""]) + self.patch_up(\n outputs[""patches""][:, :-1]\n )\n del outputs[""patches""]\n\n # --- Decode ---\n video_recon = self.decoder(video_action_patches)\n video_recon = nn.sigmoid(video_recon)\n outputs[""recon""] = unpatchify(video_recon, self.patch_size, H, W)\n return outputs\n\n def vq_encode(self, videos: Any, training: bool = True) -> Dict[str, Any]:\n # --- Preprocess videos ---\n B, T = videos.shape[:2]\n patches = patchify(videos, self.patch_size)\n action_pad = jnp.broadcast_to(self.action_in, (B, T, 1, self.patch_token_dim))\n padded_patches = jnp.concatenate((action_pad, patches), axis=2)\n\n # --- Encode ---\n z = self.encoder(padded_patches) # (B, T, N, E)\n # Get latent action for all future frames\n z = z[:, 1:, 0] # (B, T-1, E)\n\n # --- Vector quantize ---\n z = z.reshape(B * (T - 1), self.latent_dim)\n z_q, z, emb, indices = self.vq(z, training)\n z_q = z_q.reshape(B, T - 1, 1, self.latent_dim)\n return dict(patches=patches, z_q=z_q, z=z, emb=emb, indices=indices)\n",python,tab
+1551,1436254,"models/tokenizer.py",0,0,"",python,tab
+1552,1437377,"models/tokenizer.py",432,107," # FIXME (f.srambical): check whether flax will automatically cast inputs and handle compute in bfloat16",python,selection_command
+1553,1437477,"models/tokenizer.py",432,134," # FIXME (f.srambical): check whether flax will automatically cast inputs and handle compute in bfloat16\n param_dtype: jnp.dtype",python,selection_command
+1554,1437596,"models/tokenizer.py",432,155," # FIXME (f.srambical): check whether flax will automatically cast inputs and handle compute in bfloat16\n param_dtype: jnp.dtype\n dtype: jnp.dtype",python,selection_command
+1555,1437793,"models/tokenizer.py",432,0,"",python,selection_command
+1556,1438235,"models/lam.py",0,0,"",python,tab
+1557,1438679,"models/lam.py",225,0,"",python,selection_command
+1558,1438937,"models/lam.py",259,0,"",python,selection_command
+1559,1438962,"models/lam.py",266,0,"",python,selection_command
+1560,1438993,"models/lam.py",282,0,"",python,selection_command
+1561,1439025,"models/lam.py",301,0,"",python,selection_command
+1562,1439062,"models/lam.py",321,0,"",python,selection_command
+1563,1439095,"models/lam.py",342,0,"",python,selection_command
+1564,1439129,"models/lam.py",362,0,"",python,selection_command
+1565,1439162,"models/lam.py",382,0,"",python,selection_command
+1566,1439195,"models/lam.py",401,0,"",python,selection_command
+1567,1439381,"models/lam.py",420,0,"",python,selection_command
+1568,1439549,"models/lam.py",442,0,"",python,selection_command
+1569,1439766,"models/lam.py",420,0,"",python,selection_command
+1570,1439898,"models/lam.py",441,0,"\n # FIXME (f.srambical): check whether flax will automatically cast inputs and handle compute in bfloat16\n param_dtype: jnp.dtype\n dtype: jnp.dtype",python,content
+1571,1439907,"models/lam.py",446,0,"",python,selection_command
+1572,1441003,"models/lam.py",554,0,"",python,selection_command
+1573,1445695,"train_dynamics.py",0,0,"",python,tab
+1574,1447307,"train_dynamics.py",4058,0,"",python,selection_command
+1575,1448443,"train_dynamics.py",4916,0,"",python,selection_command
+1576,1450975,"train_lam.py",0,0,"",python,tab
+1577,1451465,"train_lam.py",4679,0,"",python,selection_command
+1578,1451715,"train_lam.py",4715,0,"",python,selection_command
+1579,1451747,"train_lam.py",4749,0,"",python,selection_command
+1580,1451779,"train_lam.py",4785,0,"",python,selection_command
+1581,1451813,"train_lam.py",4823,0,"",python,selection_command
+1582,1451847,"train_lam.py",4859,0,"",python,selection_command
+1583,1451881,"train_lam.py",4895,0,"",python,selection_command
+1584,1451913,"train_lam.py",4929,0,"",python,selection_command
+1585,1451946,"train_lam.py",4959,0,"",python,selection_command
+1586,1451979,"train_lam.py",5007,0,"",python,selection_command
+1587,1452331,"train_lam.py",4997,37," param_dtype=args.param_dtype,",python,selection_command
+1588,1452446,"train_lam.py",4997,63," param_dtype=args.param_dtype,\n dtype=args.dtype,",python,selection_command
+1589,1452617,"train_lam.py",4997,0,"",python,selection_command
+1590,1453084,"train_dynamics.py",0,0,"",python,tab
+1591,1453834,"train_dynamics.py",4943,0,"\n param_dtype=args.param_dtype,\n dtype=args.dtype,",python,content
+1592,1453846,"train_dynamics.py",4952,0,"",python,selection_command
+1593,1455863,"train_dynamics.py",4916,0,"",python,selection_command
+1594,1456115,"train_dynamics.py",4886,0,"",python,selection_command
+1595,1456146,"train_dynamics.py",4842,0,"",python,selection_command
+1596,1456179,"train_dynamics.py",4796,0,"",python,selection_command
+1597,1456212,"train_dynamics.py",4764,0,"",python,selection_command
+1598,1456246,"train_dynamics.py",4745,0,"",python,selection_command
+1599,1456279,"train_dynamics.py",4699,0,"",python,selection_command
+1600,1456312,"train_dynamics.py",4657,0,"",python,selection_command
+1601,1456346,"train_dynamics.py",4613,0,"",python,selection_command
+1602,1456379,"train_dynamics.py",4569,0,"",python,selection_command
+1603,1456413,"train_dynamics.py",4517,0,"",python,selection_command
+1604,1456446,"train_dynamics.py",4467,0,"",python,selection_command
+1605,1456479,"train_dynamics.py",4437,0,"",python,selection_command
+1606,1456512,"train_dynamics.py",4423,0,"",python,selection_command
+1607,1456545,"train_dynamics.py",4369,0,"",python,selection_command
+1608,1456579,"train_dynamics.py",4313,0,"",python,selection_command
+1609,1456612,"train_dynamics.py",4277,0,"",python,selection_command
+1610,1456645,"train_dynamics.py",4227,0,"",python,selection_command
+1611,1456678,"train_dynamics.py",4179,0,"",python,selection_command
+1612,1456712,"train_dynamics.py",4137,0,"",python,selection_command
+1613,1456745,"train_dynamics.py",4101,0,"",python,selection_command
+1614,1456881,"train_dynamics.py",4081,0,"",python,selection_command
+1615,1458189,"train_dynamics.py",4066,0,"",python,selection_command
+1616,1458859,"genie.py",0,0,"from typing import Dict, Any\n\nimport optax\nimport jax\nimport jax.numpy as jnp\nimport flax.linen as nn\nfrom flax.training.train_state import TrainState\nimport orbax.checkpoint as ocp\n\nfrom models.dynamics import DynamicsMaskGIT\nfrom models.lam import LatentActionModel\nfrom models.tokenizer import TokenizerVQVAE\n\nimport os\nimport grain\n\n\nclass Genie(nn.Module):\n """"""Genie model""""""\n\n # --- Tokenizer ---\n in_dim: int\n tokenizer_dim: int\n latent_patch_dim: int\n num_patch_latents: int\n patch_size: int\n tokenizer_num_blocks: int\n tokenizer_num_heads: int\n # --- LAM ---\n lam_dim: int\n latent_action_dim: int\n num_latent_actions: int\n lam_patch_size: int\n lam_num_blocks: int\n lam_num_heads: int\n lam_co_train: bool\n # --- Dynamics ---\n dyna_dim: int\n dyna_num_blocks: int\n dyna_num_heads: int\n dropout: float = 0.0\n mask_limit: float = 0.0\n\n def setup(self):\n self.tokenizer = TokenizerVQVAE(\n in_dim=self.in_dim,\n model_dim=self.tokenizer_dim,\n latent_dim=self.latent_patch_dim,\n num_latents=self.num_patch_latents,\n patch_size=self.patch_size,\n num_blocks=self.tokenizer_num_blocks,\n num_heads=self.tokenizer_num_heads,\n dropout=0.0,\n codebook_dropout=0.0,\n )\n self.lam = LatentActionModel(\n in_dim=self.in_dim,\n model_dim=self.lam_dim,\n latent_dim=self.latent_patch_dim,\n num_latents=self.num_latent_actions,\n patch_size=self.lam_patch_size,\n num_blocks=self.lam_num_blocks,\n num_heads=self.lam_num_heads,\n dropout=0.0,\n codebook_dropout=0.0,\n )\n self.dynamics = DynamicsMaskGIT(\n model_dim=self.dyna_dim,\n num_latents=self.num_patch_latents,\n num_blocks=self.dyna_num_blocks,\n num_heads=self.dyna_num_heads,\n dropout=self.dropout,\n mask_limit=self.mask_limit,\n )\n\n def __call__(self, batch: Dict[str, Any], training: bool = True) -> Dict[str, Any]:\n tokenizer_outputs = self.tokenizer.vq_encode(batch[""videos""], training=False)\n lam_outputs = self.lam.vq_encode(batch[""videos""], training=False)\n latent_actions = jax.lax.cond(\n self.lam_co_train,\n lambda: lam_outputs[""z_q""],\n lambda: jax.lax.stop_gradient(lam_outputs[""z_q""])\n )\n outputs = dict(\n video_tokens=jax.lax.stop_gradient(tokenizer_outputs[""indices""]),\n latent_actions=latent_actions,\n )\n outputs[""mask_rng""] = batch[""mask_rng""]\n dyna_outputs = self.dynamics(outputs, training)\n outputs.update(dyna_outputs)\n mle_indices = jnp.argmax(outputs[""token_logits""], axis=-1)\n outputs[""recon""] = self.tokenizer.decode(\n mle_indices, batch[""videos""].shape[2:4]\n )\n return outputs\n\n @nn.compact\n def sample(\n self,\n batch: Dict[str, Any],\n seq_len: int,\n steps: int = 25,\n temperature: float = 1,\n sample_argmax: bool = False,\n ) -> Any:\n """"""\n Autoregressively samples up to `seq_len` future frames, following Figure 8 of the paper.\n\n - Input frames are tokenized once.\n - Future frames are generated autoregressively in token space.\n - All frames are detokenized in a single pass.\n\n Note:\n - For interactive or step-wise sampling, detokenization should occur after each action.\n - To maintain consistent tensor shapes across timesteps, all current and future frames are decoded at every step.\n - Temporal causal structure is preserved by \n a) reapplying the mask before each decoding step.\n b) a temporal causal mask is applied within each ST-transformer block.\n\n Dimension keys:\n B: batch size \n T: number of input (conditioning) frames \n N: patches per frame \n S: sequence length \n A: action space \n D: model latent dimension\n """"""\n # --- Encode videos and actions ---\n tokenizer_out = self.tokenizer.vq_encode(batch[""videos""], training=False)\n token_idxs = tokenizer_out[""indices""] # (B, T, N)\n B, T, N = token_idxs.shape\n pad_shape = (B, seq_len - T, N)\n pad = jnp.zeros(pad_shape, dtype=token_idxs.dtype)\n token_idxs = jnp.concatenate([token_idxs, pad], axis=1) # (B, S, N)\n action_tokens = self.lam.vq.get_codes(batch[""latent_actions""])\n\n MaskGITLoop = nn.scan(\n MaskGITStep,\n variable_broadcast=""params"",\n split_rngs={""params"": False},\n in_axes=0,\n out_axes=0,\n length=steps,\n )\n \n loop_fn = MaskGITLoop(\n dynamics=self.dynamics,\n tokenizer=self.tokenizer,\n temperature=temperature,\n sample_argmax=sample_argmax,\n steps=steps,\n )\n\n def generation_step_fn(carry, step_t):\n rng, current_token_idxs = carry\n rng, step_rng = jax.random.split(rng)\n\n # Mask current and future frames (i.e., t >= step_t)\n mask = jnp.arange(seq_len) >= step_t # (S,)\n mask = jnp.broadcast_to(mask[None, :, None], (B, seq_len, N)) # (B, S, N)\n mask = mask.astype(bool)\n masked_token_idxs = current_token_idxs * ~mask\n\n # --- Initialize and run MaskGIT loop ---\n init_carry_maskgit = (\n step_rng,\n masked_token_idxs,\n mask,\n action_tokens,\n )\n final_carry_maskgit, _ = loop_fn(init_carry_maskgit, jnp.arange(steps))\n updated_token_idxs = final_carry_maskgit[1]\n new_carry = (rng, updated_token_idxs)\n return new_carry, None\n\n # --- Run the autoregressive generation using scan ---\n initial_carry = (batch[""rng""], token_idxs)\n timesteps_to_scan = jnp.arange(T, seq_len)\n final_carry, _ = jax.lax.scan(\n generation_step_fn,\n initial_carry,\n timesteps_to_scan\n )\n final_token_idxs = final_carry[1]\n\n # --- Decode all tokens at once at the end ---\n final_frames = self.tokenizer.decode(\n final_token_idxs,\n video_hw=batch[""videos""].shape[2:4],\n )\n return final_frames\n\n def vq_encode(self, batch, training) -> Dict[str, Any]:\n # --- Preprocess videos ---\n lam_output = self.lam.vq_encode(batch[""videos""], training=training)\n return lam_output[""indices""]\n\n\nclass MaskGITStep(nn.Module):\n dynamics: nn.Module\n tokenizer: nn.Module\n temperature: float\n sample_argmax: bool\n steps: int\n\n @nn.compact\n def __call__(self, carry, x):\n rng, token_idxs, mask, action_tokens = carry\n step = x\n N = token_idxs.shape[2]\n\n # --- Construct + encode video ---\n vid_embed = self.dynamics.patch_embed(token_idxs) # (B, S, N, D)\n mask_token = self.dynamics.mask_token # (1, 1, 1, D,)\n mask_expanded = mask[..., None] # (B, S, N, 1) \n vid_embed = jnp.where(mask_expanded, mask_token, vid_embed)\n\n # --- Predict transition ---\n act_embed = self.dynamics.action_up(action_tokens)\n vid_embed += jnp.pad(act_embed, ((0, 0), (1, 0), (0, 0), (0, 0)))\n unmasked_ratio = jnp.cos(jnp.pi * (step + 1) / (self.steps * 2))\n step_temp = self.temperature * (1.0 - unmasked_ratio)\n final_logits = self.dynamics.dynamics(vid_embed) / step_temp\n\n # --- Sample new tokens for final frame ---\n if self.sample_argmax:\n sampled_token_idxs = jnp.argmax(final_logits, axis=-1)\n else:\n rng, _rng = jax.random.split(rng)\n sampled_token_idxs = jax.random.categorical(_rng, final_logits)\n gather_fn = jax.vmap(jax.vmap(jax.vmap(lambda x, y: x[y])))\n final_token_probs = gather_fn(jax.nn.softmax(final_logits), sampled_token_idxs)\n final_token_probs += ~mask\n # Update masked tokens only\n token_idxs = jnp.where(mask, sampled_token_idxs, token_idxs)\n\n # --- Update mask ---\n num_unmasked_tokens = jnp.round(N * (1.0 - unmasked_ratio)).astype(int)\n idx_mask = jnp.arange(final_token_probs.shape[-1]) > num_unmasked_tokens\n sorted_idxs = jnp.argsort(final_token_probs, axis=-1, descending=True)\n mask_update_fn = jax.vmap(lambda msk, ids: msk.at[ids].set(idx_mask))\n new_mask = mask_update_fn(mask, sorted_idxs)\n\n new_carry = (rng, token_idxs, new_mask, action_tokens)\n return new_carry, None\n\ndef restore_genie_components(\n train_state: TrainState,\n sharding: jax.sharding.NamedSharding,\n grain_iterator: grain.DataLoaderIterator,\n inputs: Dict[str, jax.Array],\n rng: jax.Array,\n args,\n):\n """"""Restore pre-trained Genie components""""""\n rng, _rng = jax.random.split(rng)\n\n # dummy values since we only use tx to initialize the dummy train states\n dummy_tx = optax.adamw(\n learning_rate=optax.constant_schedule(args.max_lr),\n b1=0.9,\n b2=0.9,\n weight_decay=1e-4,\n )\n handler_registry = ocp.handlers.DefaultCheckpointHandlerRegistry()\n handler_registry.add('model_state', ocp.args.StandardRestore, ocp.handlers.StandardCheckpointHandler)\n handler_registry.add('dataloader_state', grain.checkpoint.CheckpointRestore, grain.checkpoint.CheckpointHandler)\n \n\n checkpoint_options = ocp.CheckpointManagerOptions(\n step_format_fixed_length=6,\n )\n tokenizer_checkpoint_manager = ocp.CheckpointManager(\n directory=args.tokenizer_checkpoint,\n options=checkpoint_options,\n handler_registry=handler_registry,\n )\n dummy_tokenizer = TokenizerVQVAE(\n in_dim=args.image_channels,\n model_dim=args.tokenizer_dim,\n latent_dim=args.latent_patch_dim,\n num_latents=args.num_patch_latents,\n patch_size=args.patch_size,\n num_blocks=args.tokenizer_num_blocks,\n num_heads=args.tokenizer_num_heads,\n dropout=args.dropout,\n codebook_dropout=args.dropout,\n )\n tokenizer_init_params = dummy_tokenizer.init(_rng, inputs)\n dummy_tokenizer_train_state = TrainState.create(\n apply_fn=dummy_tokenizer.apply, params=tokenizer_init_params, tx=dummy_tx\n )\n abstract_sharded_tokenizer_state = _create_abstract_sharded_pytree(\n dummy_tokenizer_train_state, sharding\n )\n restored_tokenizer = tokenizer_checkpoint_manager.restore(\n step=tokenizer_checkpoint_manager.latest_step(),\n args=ocp.args.Composite(\n model_state=ocp.args.StandardRestore(abstract_sharded_tokenizer_state),\n dataloader_state=grain.checkpoint.CheckpointRestore(grain_iterator),\n ),\n )[""model_state""]\n restored_tokenizer_params = restored_tokenizer.params[""params""]\n train_state.params[""params""][""tokenizer""].update(restored_tokenizer_params)\n tokenizer_checkpoint_manager.close()\n\n if args.lam_checkpoint:\n lam_checkpoint_manager = ocp.CheckpointManager(\n directory=args.lam_checkpoint,\n options=checkpoint_options,\n handler_registry=handler_registry,\n )\n dummy_lam = LatentActionModel(\n in_dim=args.image_channels,\n model_dim=args.lam_dim,\n latent_dim=args.latent_patch_dim,\n num_latents=args.num_latent_actions,\n patch_size=args.lam_patch_size,\n num_blocks=args.lam_num_blocks,\n num_heads=args.lam_num_heads,\n dropout=args.dropout,\n codebook_dropout=args.dropout,\n )\n lam_init_params = dummy_lam.init(_rng, inputs)\n dummy_lam_train_state = TrainState.create(\n apply_fn=dummy_lam.apply, params=lam_init_params, tx=dummy_tx\n )\n abstract_sharded_lam_state = _create_abstract_sharded_pytree(\n dummy_lam_train_state, sharding\n )\n restored_lam = lam_checkpoint_manager.restore(\n step=lam_checkpoint_manager.latest_step(),\n args=ocp.args.Composite(\n model_state=ocp.args.StandardRestore(abstract_sharded_lam_state),\n dataloader_state=grain.checkpoint.CheckpointRestore(grain_iterator),\n ),\n )[""model_state""]\n restored_lam_params = restored_lam.params[""params""]\n # Genie does not initialize all LAM modules, thus we omit those extra modules during restoration\n # (f.srambical) FIXME: Currently, this is a small HBM memory crunch since the LAM's decoder is loaded into HBM and immediately dicarded.\n # A workaround would be to restore to host memory first, and only move the weights to HBM after pruning the decoder\n restored_lam_params = {\n k: v\n for k, v in restored_lam_params.items()\n if k in train_state.params[""params""][""lam""]\n }\n train_state.params[""params""][""lam""].update(restored_lam_params)\n lam_checkpoint_manager.close()\n\n return train_state\n\ndef _create_abstract_sharded_pytree(pytree_template, sharding_spec):\n """"""Replaces arrays in a pytree with ShapeDtypeStructs having the given sharding.""""""\n\n def map_fn(leaf_template):\n if hasattr(leaf_template, ""shape"") and hasattr(leaf_template, ""dtype""):\n return jax.ShapeDtypeStruct(\n leaf_template.shape, leaf_template.dtype, sharding=sharding_spec\n )\n return leaf_template\n\n return jax.tree_util.tree_map(map_fn, pytree_template)",python,tab
+1617,1460102,"genie.py",368,0,"",python,selection_command
+1618,1460347,"genie.py",384,0,"",python,selection_command
+1619,1460379,"genie.py",391,0,"",python,selection_command
+1620,1460409,"genie.py",415,0,"",python,selection_command
+1621,1460446,"genie.py",431,0,"",python,selection_command
+1622,1460474,"genie.py",454,0,"",python,selection_command
+1623,1460509,"genie.py",480,0,"",python,selection_command
+1624,1460541,"genie.py",507,0,"",python,selection_command
+1625,1460574,"genie.py",527,0,"",python,selection_command
+1626,1460607,"genie.py",557,0,"",python,selection_command
+1627,1460641,"genie.py",586,0,"",python,selection_command
+1628,1460674,"genie.py",604,0,"",python,selection_command
+1629,1460708,"genie.py",621,0,"",python,selection_command
+1630,1460748,"genie.py",648,0,"",python,selection_command
+1631,1460774,"genie.py",676,0,"",python,selection_command
+1632,1460808,"genie.py",700,0,"",python,selection_command
+1633,1460840,"genie.py",724,0,"",python,selection_command
+1634,1460873,"genie.py",747,0,"",python,selection_command
+1635,1460907,"genie.py",770,0,"",python,selection_command
+1636,1460941,"genie.py",793,0,"",python,selection_command
+1637,1460974,"genie.py",811,0,"",python,selection_command
+1638,1461159,"genie.py",836,0,"",python,selection_command
+1639,1461317,"genie.py",860,0,"",python,selection_command
+1640,1461463,"genie.py",885,0,"",python,selection_command
+1641,1464662,"models/lam.py",0,0,"",python,tab
+1642,1465431,"models/lam.py",550,26," param_dtype: jnp.dtype",python,selection_command
+1643,1465560,"models/lam.py",550,47," param_dtype: jnp.dtype\n dtype: jnp.dtype",python,selection_command
+1644,1466092,"models/lam.py",581,0,"",python,selection_command
+1645,1466531,"models/lam.py",577,20," dtype: jnp.dtype",python,selection_command
+1646,1466698,"models/lam.py",550,47," param_dtype: jnp.dtype\n dtype: jnp.dtype",python,selection_command
+1647,1466842,"models/lam.py",442,155," # FIXME (f.srambical): check whether flax will automatically cast inputs and handle compute in bfloat16\n param_dtype: jnp.dtype\n dtype: jnp.dtype",python,selection_command
+1648,1467027,"models/lam.py",442,0,"",python,selection_command
+1649,1467566,"genie.py",0,0,"",python,tab
+1650,1467930,"genie.py",907,0,"",python,selection_command
+1651,1468265,"genie.py",885,0,"",python,selection_command
+1652,1468495,"genie.py",906,0,"\n # FIXME (f.srambical): check whether flax will automatically cast inputs and handle compute in bfloat16\n param_dtype: jnp.dtype\n dtype: jnp.dtype",python,content
+1653,1468500,"genie.py",911,0,"",python,selection_command
+1654,1469835,"genie.py",1019,0,"",python,selection_command
+1655,1481777,"genie.py",1030,0,"",python,selection_command
+1656,1481998,"genie.py",1032,0,"",python,selection_command
+1657,1483464,"genie.py",1041,0,"",python,selection_command
+1658,1483484,"genie.py",1041,0," ",python,content
+1659,1483487,"genie.py",1042,0,"",python,selection_keyboard
+1660,1483625,"genie.py",1042,0,"=",python,content
+1661,1483627,"genie.py",1043,0,"",python,selection_keyboard
+1662,1483703,"genie.py",1043,0," ",python,content
+1663,1483704,"genie.py",1044,0,"",python,selection_keyboard
+1664,1485294,"genie.py",1043,1,"",python,content
+1665,1485461,"genie.py",1042,1,"",python,content
+1666,1485609,"genie.py",1041,1,"",python,content
+1667,1485806,"genie.py",1040,0,"",python,selection_command
+1668,1486315,"genie.py",1036,0,"",python,selection_command
+1669,1486460,"genie.py",1035,0,"",python,selection_command
+1670,1486594,"genie.py",1032,0,"",python,selection_command
+1671,1486743,"genie.py",1030,0,"",python,selection_command
+1672,1487208,"genie.py",1019,0,"",python,selection_command
+1673,1497084,"genie.py",1015,26," param_dtype: jnp.dtype",python,selection_command
+1674,1497681,"genie.py",1019,0,"",python,selection_command
+1675,1498025,"genie.py",1046,0,"",python,selection_command
+1676,1498451,"genie.py",1042,20," dtype: jnp.dtype",python,selection_command
+1677,1498593,"genie.py",1015,47," param_dtype: jnp.dtype\n dtype: jnp.dtype",python,selection_command
+1678,1498733,"genie.py",907,155," # FIXME (f.srambical): check whether flax will automatically cast inputs and handle compute in bfloat16\n param_dtype: jnp.dtype\n dtype: jnp.dtype",python,selection_command
+1679,1498884,"genie.py",907,156,"",python,content
+1680,1499180,"genie.py",879,0,"",python,selection_command
+1681,1499329,"genie.py",854,0,"",python,selection_command
+1682,1499967,"genie.py",830,0,"",python,selection_command
+1683,1500745,"genie.py",805,0,"",python,selection_command
+1684,1500997,"genie.py",787,0,"",python,selection_command
+1685,1501386,"genie.py",805,0,"",python,selection_command
+1686,1501524,"genie.py",830,0,"",python,selection_command
+1687,1501926,"genie.py",853,0,"\n ",python,content
+1688,1502015,"genie.py",854,4,"",python,content
+1689,1502132,"genie.py",854,0,"\n # FIXME (f.srambical): check whether flax will automatically cast inputs and handle compute in bfloat16\n param_dtype: jnp.dtype\n dtype: jnp.dtype",python,content
+1690,1502137,"genie.py",859,0,"",python,selection_command
+1691,1502458,"genie.py",854,0,"",python,selection_command
+1692,1502698,"genie.py",854,1,"",python,content
+1693,1502701,"genie.py",858,0,"",python,selection_command
+1694,1504033,"genie.py",966,0,"",python,selection_command
+1695,1504162,"genie.py",993,0,"",python,selection_command
+1696,1504793,"genie.py",966,0,"",python,selection_command
+1697,1504946,"genie.py",858,0,"",python,selection_command
+1698,1506149,"genie.py",966,0,"",python,selection_command
+1699,1506397,"genie.py",993,0,"",python,selection_command
+1700,1506429,"genie.py",1014,0,"",python,selection_command
+1701,1506464,"genie.py",1039,0,"",python,selection_command
+1702,1506495,"genie.py",1063,0,"",python,selection_command
+1703,1506529,"genie.py",1068,0,"",python,selection_command
+1704,1506562,"genie.py",1089,0,"",python,selection_command
+1705,1506595,"genie.py",1130,0,"",python,selection_command
+1706,1506629,"genie.py",1162,0,"",python,selection_command
+1707,1506662,"genie.py",1204,0,"",python,selection_command
+1708,1506696,"genie.py",1250,0,"",python,selection_command
+1709,1506729,"genie.py",1298,0,"",python,selection_command
+1710,1506763,"genie.py",1338,0,"",python,selection_command
+1711,1506795,"genie.py",1388,0,"",python,selection_command
+1712,1506829,"genie.py",1436,0,"",python,selection_command
+1713,1506864,"genie.py",1461,0,"",python,selection_command
+1714,1506896,"genie.py",1495,0,"",python,selection_command
+1715,1506929,"genie.py",1505,0,"",python,selection_command
+1716,1506962,"genie.py",1543,0,"",python,selection_command
+1717,1506995,"genie.py",1575,0,"",python,selection_command
+1718,1507029,"genie.py",1611,0,"",python,selection_command
+1719,1507071,"genie.py",1575,0,"",python,selection_command
+1720,1507328,"genie.py",1543,0,"",python,selection_command
+1721,1507357,"genie.py",1505,0,"",python,selection_command
+1722,1507393,"genie.py",1495,0,"",python,selection_command
+1723,1507426,"genie.py",1461,0,"",python,selection_command
+1724,1507458,"genie.py",1436,0,"",python,selection_command
+1725,1507491,"genie.py",1388,0,"",python,selection_command
+1726,1507526,"genie.py",1338,0,"",python,selection_command
+1727,1507563,"genie.py",1298,0,"",python,selection_command
+1728,1507599,"genie.py",1250,0,"",python,selection_command
+1729,1507634,"genie.py",1204,0,"",python,selection_command
+1730,1507658,"genie.py",1162,0,"",python,selection_command
+1731,1507690,"genie.py",1130,0,"",python,selection_command
+1732,1507722,"genie.py",1089,0,"",python,selection_command
+1733,1507982,"genie.py",1130,0,"",python,selection_command
+1734,1508233,"genie.py",1162,0,"",python,selection_command
+1735,1508263,"genie.py",1204,0,"",python,selection_command
+1736,1508298,"genie.py",1250,0,"",python,selection_command
+1737,1508329,"genie.py",1298,0,"",python,selection_command
+1738,1508362,"genie.py",1338,0,"",python,selection_command
+1739,1508395,"genie.py",1388,0,"",python,selection_command
+1740,1508429,"genie.py",1436,0,"",python,selection_command
+1741,1508462,"genie.py",1461,0,"",python,selection_command
+1742,1508496,"genie.py",1495,0,"",python,selection_command
+1743,1508781,"genie.py",1461,0,"",python,selection_command
+1744,1510663,"genie.py",1490,0,"",python,selection_command
+1745,1510831,"genie.py",1490,0,"\n ",python,content
+1746,1514558,"genie.py",1491,12,"",python,content
+1747,1516529,"train_tokenizer.py",0,0,"",python,tab
+1748,1517653,"train_tokenizer.py",4702,25," dtype=args.dtype,",python,selection_command
+1749,1517802,"train_tokenizer.py",4664,63," param_dtype=args.param_dtype,\n dtype=args.dtype,",python,selection_command
+1750,1518264,"train_tokenizer.py",4664,0,"",python,selection_command
+1751,1518780,"genie.py",0,0,"",python,tab
+1752,1519141,"genie.py",1491,0,"\n param_dtype=args.param_dtype,\n dtype=args.dtype,",python,content
+1753,1519151,"genie.py",1500,0,"",python,selection_command
+1754,1519530,"genie.py",1491,0,"",python,selection_command
+1755,1519817,"genie.py",1491,1,"",python,content
+1756,1519819,"genie.py",1499,0,"",python,selection_command
+1757,1520096,"genie.py",1499,1,"p",python,selection_command
+1758,1520235,"genie.py",1499,39,"param_dtype=args.param_dtype,\n d",python,selection_command
+1759,1520523,"genie.py",1529,8," ",python,content
+1760,1520524,"genie.py",1491,8," ",python,content
+1761,1520534,"genie.py",1503,0,"",python,selection_command
+1762,1521365,"genie.py",1514,0,"",python,selection_command
+1763,1521513,"genie.py",1515,0,"",python,selection_command
+1764,1521952,"genie.py",1515,1,"a",python,selection_command
+1765,1522009,"genie.py",1515,2,"ar",python,selection_command
+1766,1522150,"genie.py",1515,3,"arg",python,selection_command
+1767,1522299,"genie.py",1515,4,"args",python,selection_command
+1768,1522509,"genie.py",1515,5,"args.",python,selection_command
+1769,1522712,"genie.py",1515,5,"",python,content
+1770,1522866,"genie.py",1552,0,"",python,selection_command
+1771,1523212,"genie.py",1551,0,"",python,selection_command
+1772,1523402,"genie.py",1550,0,"",python,selection_command
+1773,1523532,"genie.py",1550,1,".",python,selection_command
+1774,1523646,"genie.py",1546,5,"args.",python,selection_command
+1775,1523950,"genie.py",1546,5,"",python,content
+1776,1524278,"genie.py",1509,0,"",python,selection_command
+1777,1524380,"genie.py",1510,0,"",python,selection_command
+1778,1524398,"genie.py",1514,0,"",python,selection_command
+1779,1524585,"genie.py",1515,0,"",python,selection_command
+1780,1525184,"genie.py",1515,0,"s",python,content
+1781,1525185,"genie.py",1516,0,"",python,selection_keyboard
+1782,1525290,"genie.py",1516,0,"e",python,content
+1783,1525292,"genie.py",1517,0,"",python,selection_keyboard
+1784,1525394,"genie.py",1517,0,"l",python,content
+1785,1525398,"genie.py",1518,0,"",python,selection_keyboard
+1786,1525540,"genie.py",1518,0,"f",python,content
+1787,1525541,"genie.py",1519,0,"",python,selection_keyboard
+1788,1525810,"genie.py",1519,0,".",python,content
+1789,1525811,"genie.py",1520,0,"",python,selection_keyboard
+1790,1525935,"genie.py",1519,0,"",python,selection_command
+1791,1526064,"genie.py",1556,0,"",python,selection_command
+1792,1526246,"genie.py",1551,0,"",python,selection_command
+1793,1526647,"genie.py",1551,0,"s",python,content
+1794,1526649,"genie.py",1552,0,"",python,selection_keyboard
+1795,1526712,"genie.py",1552,0,"e",python,content
+1796,1526714,"genie.py",1553,0,"",python,selection_keyboard
+1797,1526827,"genie.py",1553,0,"f",python,content
+1798,1526829,"genie.py",1554,0,"",python,selection_keyboard
+1799,1526922,"genie.py",1554,0,"l",python,content
+1800,1526924,"genie.py",1555,0,"",python,selection_keyboard
+1801,1527253,"genie.py",1555,0,".",python,content
+1802,1527255,"genie.py",1556,0,"",python,selection_keyboard
+1803,1527379,"genie.py",1555,0,"",python,selection_command
+1804,1528079,"genie.py",1551,0,"",python,selection_command
+1805,1528228,"genie.py",1551,4,"",python,content
+1806,1528416,"genie.py",1551,0,"s",python,content
+1807,1528417,"genie.py",1552,0,"",python,selection_keyboard
+1808,1528480,"genie.py",1552,0,"e",python,content
+1809,1528482,"genie.py",1553,0,"",python,selection_keyboard
+1810,1528581,"genie.py",1553,0,"l",python,content
+1811,1528582,"genie.py",1554,0,"",python,selection_keyboard
+1812,1528672,"genie.py",1554,0,"f",python,content
+1813,1528673,"genie.py",1555,0,"",python,selection_keyboard
+1814,1528876,"genie.py",1554,0,"",python,selection_command
+1815,1532281,"genie.py",1571,0,"",python,selection_command
+1816,1532450,"genie.py",1594,0,"",python,selection_command
+1817,1532700,"genie.py",1571,0,"",python,selection_command
+1818,1534680,"genie.py",1581,0,"",python,selection_command
+1819,1534936,"genie.py",1619,0,"",python,selection_command
+1820,1534959,"genie.py",1651,0,"",python,selection_command
+1821,1534994,"genie.py",1687,0,"",python,selection_command
+1822,1535024,"genie.py",1733,0,"",python,selection_command
+1823,1535058,"genie.py",1782,0,"",python,selection_command
+1824,1535165,"genie.py",1826,0,"",python,selection_command
+1825,1535346,"genie.py",1870,0,"",python,selection_command
+1826,1535490,"genie.py",1912,0,"",python,selection_command
+1827,1535646,"genie.py",1937,0,"",python,selection_command
+1828,1535775,"genie.py",1912,0,"",python,selection_command
+1829,1536031,"genie.py",1870,0,"",python,selection_command
+1830,1536062,"genie.py",1826,0,"",python,selection_command
+1831,1536092,"genie.py",1782,0,"",python,selection_command
+1832,1536124,"genie.py",1733,0,"",python,selection_command
+1833,1536158,"genie.py",1687,0,"",python,selection_command
+1834,1536190,"genie.py",1651,0,"",python,selection_command
+1835,1536224,"genie.py",1619,0,"",python,selection_command
+1836,1536257,"genie.py",1581,0,"",python,selection_command
+1837,1536291,"genie.py",1571,0,"",python,selection_command
+1838,1536326,"genie.py",1541,0,"",python,selection_command
+1839,1536754,"genie.py",1533,29," dtype=self.dtype,",python,selection_command
+1840,1536847,"genie.py",1491,71," param_dtype=self.param_dtype,\n dtype=self.dtype,",python,selection_command
+1841,1537111,"genie.py",1491,0,"",python,selection_command
+1842,1537296,"genie.py",1533,0,"",python,selection_command
+1843,1537544,"genie.py",1563,0,"",python,selection_command
+1844,1537575,"genie.py",1573,0,"",python,selection_command
+1845,1537610,"genie.py",1611,0,"",python,selection_command
+1846,1537645,"genie.py",1643,0,"",python,selection_command
+1847,1537678,"genie.py",1679,0,"",python,selection_command
+1848,1537711,"genie.py",1725,0,"",python,selection_command
+1849,1537745,"genie.py",1774,0,"",python,selection_command
+1850,1537781,"genie.py",1818,0,"",python,selection_command
+1851,1537813,"genie.py",1862,0,"",python,selection_command
+1852,1537846,"genie.py",1904,0,"",python,selection_command
+1853,1537976,"genie.py",1929,0,"",python,selection_command
+1854,1538321,"genie.py",1962,0,"\n param_dtype=self.param_dtype,\n dtype=self.dtype,",python,content
+1855,1538334,"genie.py",1975,0,"",python,selection_command
+1856,1540362,"genie.py",1941,0,"",python,selection_command
+1857,1540614,"genie.py",1916,0,"",python,selection_command
+1858,1540644,"genie.py",1874,0,"",python,selection_command
+1859,1540678,"genie.py",1830,0,"",python,selection_command
+1860,1540714,"genie.py",1786,0,"",python,selection_command
+1861,1540746,"genie.py",1737,0,"",python,selection_command
+1862,1540779,"genie.py",1691,0,"",python,selection_command
+1863,1540811,"genie.py",1655,0,"",python,selection_command
+1864,1540845,"genie.py",1623,0,"",python,selection_command
+1865,1540879,"genie.py",1585,0,"",python,selection_command
+1866,1540912,"genie.py",1571,0,"",python,selection_command
+1867,1540946,"genie.py",1545,0,"",python,selection_command
+1868,1546333,"genie.py",1571,0,"",python,selection_command
+1869,1546583,"genie.py",1585,0,"",python,selection_command
+1870,1546613,"genie.py",1623,0,"",python,selection_command
+1871,1546646,"genie.py",1655,0,"",python,selection_command
+1872,1546679,"genie.py",1691,0,"",python,selection_command
+1873,1546713,"genie.py",1737,0,"",python,selection_command
+1874,1546746,"genie.py",1786,0,"",python,selection_command
+1875,1546779,"genie.py",1830,0,"",python,selection_command
+1876,1546812,"genie.py",1874,0,"",python,selection_command
+1877,1546845,"genie.py",1916,0,"",python,selection_command
+1878,1546879,"genie.py",1941,0,"",python,selection_command
+1879,1546912,"genie.py",1975,0,"",python,selection_command
+1880,1546946,"genie.py",2017,0,"",python,selection_command
+1881,1546979,"genie.py",2043,0,"",python,selection_command
+1882,1547012,"genie.py",2057,0,"",python,selection_command
+1883,1547048,"genie.py",2098,0,"",python,selection_command
+1884,1547095,"genie.py",2057,0,"",python,selection_command
+1885,1547351,"genie.py",2043,0,"",python,selection_command
+1886,1547382,"genie.py",2017,0,"",python,selection_command
+1887,1547412,"genie.py",1975,0,"",python,selection_command
+1888,1547445,"genie.py",1941,0,"",python,selection_command
+1889,1547479,"genie.py",1916,0,"",python,selection_command
+1890,1547512,"genie.py",1874,0,"",python,selection_command
+1891,1547545,"genie.py",1830,0,"",python,selection_command
+1892,1547579,"genie.py",1786,0,"",python,selection_command
+1893,1547612,"genie.py",1737,0,"",python,selection_command
+1894,1547646,"genie.py",1691,0,"",python,selection_command
+1895,1547680,"genie.py",1655,0,"",python,selection_command
+1896,1547712,"genie.py",1623,0,"",python,selection_command
+1897,1547745,"genie.py",1585,0,"",python,selection_command
+1898,1547779,"genie.py",1571,0,"",python,selection_command
+1899,1547812,"genie.py",1545,0,"",python,selection_command
+1900,1547846,"genie.py",1503,0,"",python,selection_command
+1901,1547881,"genie.py",1469,0,"",python,selection_command
+1902,1547912,"genie.py",1444,0,"",python,selection_command
+1903,1547945,"genie.py",1396,0,"",python,selection_command
+1904,1547979,"genie.py",1346,0,"",python,selection_command
+1905,1548012,"genie.py",1306,0,"",python,selection_command
+1906,1548045,"genie.py",1258,0,"",python,selection_command
+1907,1548078,"genie.py",1212,0,"",python,selection_command
+1908,1548111,"genie.py",1170,0,"",python,selection_command
+1909,1548145,"genie.py",1138,0,"",python,selection_command
+1910,1548178,"genie.py",1097,0,"",python,selection_command
+1911,1548212,"genie.py",1076,0,"",python,selection_command
+1912,1548298,"genie.py",1097,0,"",python,selection_command
+1913,1548553,"genie.py",1138,0,"",python,selection_command
+1914,1548587,"genie.py",1170,0,"",python,selection_command
+1915,1548623,"genie.py",1212,0,"",python,selection_command
+1916,1548649,"genie.py",1258,0,"",python,selection_command
+1917,1548683,"genie.py",1306,0,"",python,selection_command
+1918,1548717,"genie.py",1346,0,"",python,selection_command
+1919,1548750,"genie.py",1396,0,"",python,selection_command
+1920,1548785,"genie.py",1444,0,"",python,selection_command
+1921,1548818,"genie.py",1469,0,"",python,selection_command
+1922,1548855,"genie.py",1503,0,"",python,selection_command
+1923,1548890,"genie.py",1545,0,"",python,selection_command
+1924,1548928,"genie.py",1571,0,"",python,selection_command
+1925,1548955,"genie.py",1585,0,"",python,selection_command
+1926,1548991,"genie.py",1623,0,"",python,selection_command
+1927,1549023,"genie.py",1655,0,"",python,selection_command
+1928,1549057,"genie.py",1691,0,"",python,selection_command
+1929,1549090,"genie.py",1737,0,"",python,selection_command
+1930,1549123,"genie.py",1786,0,"",python,selection_command
+1931,1549157,"genie.py",1830,0,"",python,selection_command
+1932,1549195,"genie.py",1874,0,"",python,selection_command
+1933,1549223,"genie.py",1916,0,"",python,selection_command
+1934,1549257,"genie.py",1941,0,"",python,selection_command
+1935,1549289,"genie.py",1975,0,"",python,selection_command
+1936,1549323,"genie.py",2017,0,"",python,selection_command
+1937,1549356,"genie.py",2043,0,"",python,selection_command
+1938,1549389,"genie.py",2057,0,"",python,selection_command
+1939,1549423,"genie.py",2098,0,"",python,selection_command
+1940,1549457,"genie.py",2135,0,"",python,selection_command
+1941,1549491,"genie.py",2183,0,"",python,selection_command
+1942,1549525,"genie.py",2228,0,"",python,selection_command
+1943,1549556,"genie.py",2271,0,"",python,selection_command
+1944,1549590,"genie.py",2305,0,"",python,selection_command
+1945,1550019,"genie.py",2332,0,"\n param_dtype=self.param_dtype,\n dtype=self.dtype,",python,content
+1946,1550027,"genie.py",2345,0,"",python,selection_command
+1947,1551799,"genie.py",2305,0,"",python,selection_command
+1948,1552049,"genie.py",2271,0,"",python,selection_command
+1949,1552082,"genie.py",2228,0,"",python,selection_command
+1950,1552116,"genie.py",2183,0,"",python,selection_command
+1951,1552149,"genie.py",2135,0,"",python,selection_command
+1952,1552299,"genie.py",2098,0,"",python,selection_command
+1953,1552468,"genie.py",2057,0,"",python,selection_command
+1954,1552647,"genie.py",2058,0,"",python,selection_command
+1955,1552914,"genie.py",2067,0,"",python,selection_command
+1956,1553064,"genie.py",2069,0,"",python,selection_command
+1957,1557690,"models/dynamics.py",0,0,"from typing import Dict, Any\n\nimport jax\nimport jax.numpy as jnp\nimport flax.linen as nn\n\nfrom utils.nn import STTransformer\n\n\nclass DynamicsMaskGIT(nn.Module):\n """"""MaskGIT dynamics model""""""\n\n model_dim: int\n num_latents: int\n num_blocks: int\n num_heads: int\n dropout: float\n mask_limit: float\n\n def setup(self):\n self.dynamics = STTransformer(\n self.model_dim,\n self.num_latents,\n self.num_blocks,\n self.num_heads,\n self.dropout,\n )\n self.patch_embed = nn.Embed(self.num_latents, self.model_dim)\n self.mask_token = self.param(\n ""mask_token"",\n nn.initializers.lecun_uniform(),\n (1, 1, 1, self.model_dim),\n )\n self.action_up = nn.Dense(self.model_dim)\n\n def __call__(self, batch: Dict[str, Any], training: bool = True) -> Dict[str, Any]:\n # --- Mask videos ---\n vid_embed = self.patch_embed(batch[""video_tokens""])\n if training:\n rng1, rng2 = jax.random.split(batch[""mask_rng""])\n mask_prob = jax.random.uniform(rng1, minval=self.mask_limit)\n mask = jax.random.bernoulli(rng2, mask_prob, vid_embed.shape[:-1])\n mask = mask.at[:, 0].set(False)\n vid_embed = jnp.where(jnp.expand_dims(mask, -1), self.mask_token, vid_embed)\n else:\n mask = None\n\n # --- Predict transition ---\n act_embed = self.action_up(batch[""latent_actions""])\n vid_embed += jnp.pad(act_embed, ((0, 0), (1, 0), (0, 0), (0, 0)))\n logits = self.dynamics(vid_embed)\n return dict(token_logits=logits, mask=mask)\n",python,tab
+1958,1558413,"models/dynamics.py",167,0,"",python,selection_command
+1959,1558662,"models/dynamics.py",194,0,"",python,selection_command
+1960,1558695,"models/dynamics.py",201,0,"",python,selection_command
+1961,1558728,"models/dynamics.py",220,0,"",python,selection_command
+1962,1558761,"models/dynamics.py",241,0,"",python,selection_command
+1963,1558795,"models/dynamics.py",261,0,"",python,selection_command
+1964,1558828,"models/dynamics.py",280,0,"",python,selection_command
+1965,1559158,"models/dynamics.py",299,0,"",python,selection_command
+1966,1559638,"genie.py",0,0,"",python,tab
+1967,1559917,"genie.py",2043,0,"",python,selection_command
+1968,1560166,"genie.py",2029,0,"",python,selection_command
+1969,1560198,"genie.py",1987,0,"",python,selection_command
+1970,1560232,"genie.py",1953,0,"",python,selection_command
+1971,1560266,"genie.py",1927,0,"",python,selection_command
+1972,1560300,"genie.py",1886,0,"",python,selection_command
+1973,1560334,"genie.py",1842,0,"",python,selection_command
+1974,1560368,"genie.py",1798,0,"",python,selection_command
+1975,1560406,"genie.py",1749,0,"",python,selection_command
+1976,1560439,"genie.py",1703,0,"",python,selection_command
+1977,1560471,"genie.py",1667,0,"",python,selection_command
+1978,1560508,"genie.py",1635,0,"",python,selection_command
+1979,1560540,"genie.py",1597,0,"",python,selection_command
+1980,1560573,"genie.py",1571,0,"",python,selection_command
+1981,1560791,"genie.py",1557,0,"",python,selection_command
+1982,1561045,"genie.py",1515,0,"",python,selection_command
+1983,1561076,"genie.py",1481,0,"",python,selection_command
+1984,1561108,"genie.py",1455,0,"",python,selection_command
+1985,1561141,"genie.py",1408,0,"",python,selection_command
+1986,1561174,"genie.py",1358,0,"",python,selection_command
+1987,1561207,"genie.py",1318,0,"",python,selection_command
+1988,1561241,"genie.py",1270,0,"",python,selection_command
+1989,1561278,"genie.py",1224,0,"",python,selection_command
+1990,1561307,"genie.py",1182,0,"",python,selection_command
+1991,1561341,"genie.py",1150,0,"",python,selection_command
+1992,1561626,"genie.py",1109,0,"",python,selection_command
+1993,1561763,"genie.py",1110,0,"",python,selection_command
+1994,1561985,"models/tokenizer.py",0,0,"",python,tab
+1995,1562376,"models/tokenizer.py",229,0,"",python,selection_command
+1996,1562630,"models/tokenizer.py",249,0,"",python,selection_command
+1997,1562659,"models/tokenizer.py",256,0,"",python,selection_command
+1998,1562693,"models/tokenizer.py",272,0,"",python,selection_command
+1999,1562727,"models/tokenizer.py",291,0,"",python,selection_command
+2000,1562762,"models/tokenizer.py",311,0,"",python,selection_command
+2001,1562794,"models/tokenizer.py",332,0,"",python,selection_command
+2002,1562829,"models/tokenizer.py",352,0,"",python,selection_command
+2003,1562861,"models/tokenizer.py",372,0,"",python,selection_command
+2004,1562895,"models/tokenizer.py",391,0,"",python,selection_command
+2005,1562929,"models/tokenizer.py",410,0,"",python,selection_command
+2006,1562962,"models/tokenizer.py",438,0,"",python,selection_command
+2007,1562995,"models/tokenizer.py",546,0,"",python,selection_command
+2008,1563312,"models/tokenizer.py",438,0,"",python,selection_command
+2009,1563629,"models/tokenizer.py",432,107," # FIXME (f.srambical): check whether flax will automatically cast inputs and handle compute in bfloat16",python,selection_command
+2010,1563733,"models/tokenizer.py",432,134," # FIXME (f.srambical): check whether flax will automatically cast inputs and handle compute in bfloat16\n param_dtype: jnp.dtype",python,selection_command
+2011,1563866,"models/tokenizer.py",432,155," # FIXME (f.srambical): check whether flax will automatically cast inputs and handle compute in bfloat16\n param_dtype: jnp.dtype\n dtype: jnp.dtype",python,selection_command
+2012,1564057,"models/tokenizer.py",432,0,"",python,selection_command
+2013,1565416,"models/dynamics.py",0,0,"",python,tab
+2014,1565973,"models/dynamics.py",314,0,"\n # FIXME (f.srambical): check whether flax will automatically cast inputs and handle compute in bfloat16\n param_dtype: jnp.dtype\n dtype: jnp.dtype",python,content
+2015,1565994,"models/dynamics.py",319,0,"",python,selection_command
+2016,1576153,"models/tokenizer.py",0,0,"",python,tab
+2017,1577956,"models/dynamics.py",0,0,"",python,tab
+2018,1581415,"genie.py",0,0,"",python,tab
+2019,1582331,"genie.py",505,0,"",python,selection_command
+2020,1582937,"genie.py",43,0,"",python,selection_command
+2021,1583948,"genie.py",505,0,"",python,selection_command
+2022,1584348,"genie.py",525,0,"",python,selection_command
+2023,1584595,"genie.py",555,0,"",python,selection_command
+2024,1584628,"genie.py",584,0,"",python,selection_command
+2025,1584659,"genie.py",602,0,"",python,selection_command
+2026,1584692,"genie.py",619,0,"",python,selection_command
+2027,1584728,"genie.py",646,0,"",python,selection_command
+2028,1584762,"genie.py",674,0,"",python,selection_command
+2029,1584795,"genie.py",698,0,"",python,selection_command
+2030,1584828,"genie.py",722,0,"",python,selection_command
+2031,1584862,"genie.py",745,0,"",python,selection_command
+2032,1584898,"genie.py",768,0,"",python,selection_command
+2033,1584929,"genie.py",791,0,"",python,selection_command
+2034,1584962,"genie.py",809,0,"",python,selection_command
+2035,1584995,"genie.py",834,0,"",python,selection_command
+2036,1585029,"genie.py",858,0,"",python,selection_command
+2037,1585062,"genie.py",966,0,"",python,selection_command
+2038,1585243,"genie.py",993,0,"",python,selection_command
+2039,1585415,"genie.py",966,0,"",python,selection_command
+2040,1585569,"genie.py",858,0,"",python,selection_command
+2041,1586617,"models/dynamics.py",0,0,"",python,tab
+2042,1587333,"models/dynamics.py",315,107," # FIXME (f.srambical): check whether flax will automatically cast inputs and handle compute in bfloat16",python,selection_command
+2043,1587543,"models/dynamics.py",315,134," # FIXME (f.srambical): check whether flax will automatically cast inputs and handle compute in bfloat16\n param_dtype: jnp.dtype",python,selection_command
+2044,1587663,"models/dynamics.py",315,155," # FIXME (f.srambical): check whether flax will automatically cast inputs and handle compute in bfloat16\n param_dtype: jnp.dtype\n dtype: jnp.dtype",python,selection_command
+2045,1588759,"models/dynamics.py",454,0,"",python,selection_command
+2046,1589046,"models/dynamics.py",427,0,"",python,selection_command
+2047,1589200,"models/dynamics.py",319,0,"",python,selection_command
+2048,1590057,"genie.py",0,0,"",python,tab
+2049,1596369,"genie.py",966,0,"",python,selection_command
+2050,1596616,"genie.py",993,0,"",python,selection_command
+2051,1596648,"genie.py",1014,0,"",python,selection_command
+2052,1596682,"genie.py",1039,0,"",python,selection_command
+2053,1596715,"genie.py",1063,0,"",python,selection_command
+2054,1596749,"genie.py",1068,0,"",python,selection_command
+2055,1596782,"genie.py",1089,0,"",python,selection_command
+2056,1596817,"genie.py",1130,0,"",python,selection_command
+2057,1596850,"genie.py",1162,0,"",python,selection_command
+2058,1596883,"genie.py",1204,0,"",python,selection_command
+2059,1596918,"genie.py",1250,0,"",python,selection_command
+2060,1596954,"genie.py",1298,0,"",python,selection_command
+2061,1596989,"genie.py",1338,0,"",python,selection_command
+2062,1597021,"genie.py",1388,0,"",python,selection_command
+2063,1597058,"genie.py",1436,0,"",python,selection_command
+2064,1598533,"genie.py",1461,0,"",python,selection_command
+2065,1598783,"genie.py",1495,0,"",python,selection_command
+2066,1598820,"genie.py",1537,0,"",python,selection_command
+2067,1598850,"genie.py",1567,0,"",python,selection_command
+2068,1598881,"genie.py",1577,0,"",python,selection_command
+2069,1598914,"genie.py",1615,0,"",python,selection_command
+2070,1598948,"genie.py",1647,0,"",python,selection_command
+2071,1598981,"genie.py",1683,0,"",python,selection_command
+2072,1599015,"genie.py",1729,0,"",python,selection_command
+2073,1599050,"genie.py",1778,0,"",python,selection_command
+2074,1599084,"genie.py",1822,0,"",python,selection_command
+2075,1599117,"genie.py",1866,0,"",python,selection_command
+2076,1599151,"genie.py",1908,0,"",python,selection_command
+2077,1599190,"genie.py",1933,0,"",python,selection_command
+2078,1599223,"genie.py",1967,0,"",python,selection_command
+2079,1599254,"genie.py",2009,0,"",python,selection_command
+2080,1599288,"genie.py",2039,0,"",python,selection_command
+2081,1599324,"genie.py",2049,0,"",python,selection_command
+2082,1599356,"genie.py",2090,0,"",python,selection_command
+2083,1599389,"genie.py",2127,0,"",python,selection_command
+2084,1600491,"models/dynamics.py",0,0,"",python,tab
+2085,1600965,"models/dynamics.py",427,0,"",python,selection_command
+2086,1601215,"models/dynamics.py",454,0,"",python,selection_command
+2087,1601246,"models/dynamics.py",471,0,"",python,selection_command
+2088,1601279,"models/dynamics.py",476,0,"",python,selection_command
+2089,1601629,"models/dynamics.py",497,0,"",python,selection_command
+2090,1601735,"models/dynamics.py",501,0,"",python,selection_command
+2091,1601943,"models/dynamics.py",505,0,"",python,selection_command
+2092,1602125,"models/dynamics.py",506,0,"",python,selection_command
+2093,1602381,"models/dynamics.py",515,0,"",python,selection_command
+2094,1602579,"models/dynamics.py",517,0,"",python,selection_command
+2095,1603229,"models/dynamics.py",493,0,"",python,selection_command
+2096,1610761,"models/dynamics.py",532,0,"",python,selection_command
+2097,1611093,"models/dynamics.py",493,0,"",python,selection_command
+2098,1613415,"models/dynamics.py",492,0,"\n ",python,content
+2099,1613525,"models/dynamics.py",501,0,"#",python,content
+2100,1613526,"models/dynamics.py",502,0,"",python,selection_keyboard
+2101,1613564,"models/dynamics.py",502,0," ",python,content
+2102,1613565,"models/dynamics.py",503,0,"",python,selection_keyboard
+2103,1613761,"models/dynamics.py",503,0,"F",python,content
+2104,1613762,"models/dynamics.py",504,0,"",python,selection_keyboard
+2105,1613801,"models/dynamics.py",504,0,"I",python,content
+2106,1613806,"models/dynamics.py",505,0,"",python,selection_keyboard
+2107,1613881,"models/dynamics.py",505,0,"X",python,content
+2108,1613881,"models/dynamics.py",506,0,"",python,selection_keyboard
+2109,1614010,"models/dynamics.py",506,0,"M",python,content
+2110,1614013,"models/dynamics.py",507,0,"",python,selection_keyboard
+2111,1614101,"models/dynamics.py",507,0,"E",python,content
+2112,1614102,"models/dynamics.py",508,0,"",python,selection_keyboard
+2113,1614209,"models/dynamics.py",508,0,":",python,content
+2114,1614210,"models/dynamics.py",509,0,"",python,selection_keyboard
+2115,1614576,"models/dynamics.py",508,1,"",python,content
+2116,1614689,"models/dynamics.py",508,0," ",python,content
+2117,1614690,"models/dynamics.py",509,0,"",python,selection_keyboard
+2118,1614764,"models/dynamics.py",509,0,"()",python,content
+2119,1614765,"models/dynamics.py",510,0,"",python,selection_keyboard
+2120,1615084,"models/dynamics.py",510,0,"s",python,content
+2121,1615085,"models/dynamics.py",511,0,"",python,selection_keyboard
+2122,1615194,"models/dynamics.py",511,0,".",python,content
+2123,1615195,"models/dynamics.py",512,0,"",python,selection_keyboard
+2124,1615563,"models/dynamics.py",511,1,"",python,content
+2125,1615739,"models/dynamics.py",511,0,"f",python,content
+2126,1615740,"models/dynamics.py",512,0,"",python,selection_keyboard
+2127,1616042,"models/dynamics.py",511,1,"",python,content
+2128,1616177,"models/dynamics.py",510,1,"",python,content
+2129,1616232,"models/dynamics.py",510,0,"f",python,content
+2130,1616233,"models/dynamics.py",511,0,"",python,selection_keyboard
+2131,1616356,"models/dynamics.py",511,0,".",python,content
+2132,1616357,"models/dynamics.py",512,0,"",python,selection_keyboard
+2133,1616524,"models/dynamics.py",512,0,"s",python,content
+2134,1616525,"models/dynamics.py",513,0,"",python,selection_keyboard
+2135,1616817,"models/dynamics.py",513,0,"r",python,content
+2136,1616818,"models/dynamics.py",514,0,"",python,selection_keyboard
+2137,1616866,"models/dynamics.py",514,0,"a",python,content
+2138,1616868,"models/dynamics.py",515,0,"",python,selection_keyboard
+2139,1616995,"models/dynamics.py",515,0,"m",python,content
+2140,1616996,"models/dynamics.py",516,0,"",python,selection_keyboard
+2141,1617213,"models/dynamics.py",516,0,"b",python,content
+2142,1617214,"models/dynamics.py",517,0,"",python,selection_keyboard
+2143,1617360,"models/dynamics.py",517,0,"i",python,content
+2144,1617361,"models/dynamics.py",518,0,"",python,selection_keyboard
+2145,1617444,"models/dynamics.py",518,0,"c",python,content
+2146,1617445,"models/dynamics.py",519,0,"",python,selection_keyboard
+2147,1617491,"models/dynamics.py",519,0,"a",python,content
+2148,1617492,"models/dynamics.py",520,0,"",python,selection_keyboard
+2149,1617577,"models/dynamics.py",520,0,"l",python,content
+2150,1617579,"models/dynamics.py",521,0,"",python,selection_keyboard
+2151,1617884,"models/dynamics.py",520,0,"",python,selection_command
+2152,1618195,"models/dynamics.py",522,0,"",python,selection_command
+2153,1618279,"models/dynamics.py",522,0,":",python,content
+2154,1618280,"models/dynamics.py",523,0,"",python,selection_keyboard
+2155,1618544,"models/dynamics.py",523,0," ",python,content
+2156,1618546,"models/dynamics.py",524,0,"",python,selection_keyboard
+2157,1618857,"models/dynamics.py",524,0,"d",python,content
+2158,1618858,"models/dynamics.py",525,0,"",python,selection_keyboard
+2159,1618935,"models/dynamics.py",525,0,"o",python,content
+2160,1618936,"models/dynamics.py",526,0,"",python,selection_keyboard
+2161,1618957,"models/dynamics.py",526,0," ",python,content
+2162,1618959,"models/dynamics.py",527,0,"",python,selection_keyboard
+2163,1619196,"models/dynamics.py",527,0,"w",python,content
+2164,1619198,"models/dynamics.py",528,0,"",python,selection_keyboard
+2165,1619248,"models/dynamics.py",528,0,"e",python,content
+2166,1619249,"models/dynamics.py",529,0,"",python,selection_keyboard
+2167,1619308,"models/dynamics.py",529,0," ",python,content
+2168,1619309,"models/dynamics.py",530,0,"",python,selection_keyboard
+2169,1619444,"models/dynamics.py",530,0,"n",python,content
+2170,1619445,"models/dynamics.py",531,0,"",python,selection_keyboard
+2171,1619491,"models/dynamics.py",531,0,"e",python,content
+2172,1619492,"models/dynamics.py",532,0,"",python,selection_keyboard
+2173,1619628,"models/dynamics.py",532,0,"e",python,content
+2174,1619630,"models/dynamics.py",533,0,"",python,selection_keyboard
+2175,1619661,"models/dynamics.py",533,0,"d",python,content
+2176,1619662,"models/dynamics.py",534,0,"",python,selection_keyboard
+2177,1619706,"models/dynamics.py",534,0," ",python,content
+2178,1619707,"models/dynamics.py",535,0,"",python,selection_keyboard
+2179,1619894,"models/dynamics.py",535,0,"t",python,content
+2180,1619895,"models/dynamics.py",536,0,"",python,selection_keyboard
+2181,1619978,"models/dynamics.py",536,0,"o",python,content
+2182,1619979,"models/dynamics.py",537,0,"",python,selection_keyboard
+2183,1620034,"models/dynamics.py",537,0," ",python,content
+2184,1620036,"models/dynamics.py",538,0,"",python,selection_keyboard
+2185,1620215,"models/dynamics.py",538,0,"a",python,content
+2186,1620216,"models/dynamics.py",539,0,"",python,selection_keyboard
+2187,1620446,"models/dynamics.py",539,0,"d",python,content
+2188,1620447,"models/dynamics.py",540,0,"",python,selection_keyboard
+2189,1620562,"models/dynamics.py",540,0,"d",python,content
+2190,1620563,"models/dynamics.py",541,0,"",python,selection_keyboard
+2191,1620650,"models/dynamics.py",541,0," ",python,content
+2192,1620652,"models/dynamics.py",542,0,"",python,selection_keyboard
+2193,1622479,"models/dynamics.py",542,0,"p",python,content
+2194,1622480,"models/dynamics.py",543,0,"",python,selection_keyboard
+2195,1622629,"models/dynamics.py",543,0,"a",python,content
+2196,1622631,"models/dynamics.py",544,0,"",python,selection_keyboard
+2197,1622700,"models/dynamics.py",544,0,"r",python,content
+2198,1622701,"models/dynamics.py",545,0,"",python,selection_keyboard
+2199,1622795,"models/dynamics.py",545,0,"a",python,content
+2200,1622795,"models/dynamics.py",546,0,"",python,selection_keyboard
+2201,1622881,"models/dynamics.py",546,0,"m",python,content
+2202,1622882,"models/dynamics.py",547,0,"",python,selection_keyboard
+2203,1623111,"models/dynamics.py",547,0,"_",python,content
+2204,1623112,"models/dynamics.py",548,0,"",python,selection_keyboard
+2205,1623332,"models/dynamics.py",548,0,"d",python,content
+2206,1623333,"models/dynamics.py",549,0,"",python,selection_keyboard
+2207,1623468,"models/dynamics.py",549,0,"t",python,content
+2208,1623469,"models/dynamics.py",550,0,"",python,selection_keyboard
+2209,1623554,"models/dynamics.py",550,0,"y",python,content
+2210,1623556,"models/dynamics.py",551,0,"",python,selection_keyboard
+2211,1623599,"models/dynamics.py",551,0,"p",python,content
+2212,1623600,"models/dynamics.py",552,0,"",python,selection_keyboard
+2213,1623673,"models/dynamics.py",552,0,"e",python,content
+2214,1623673,"models/dynamics.py",553,0,"",python,selection_keyboard
+2215,1623739,"models/dynamics.py",553,0," ",python,content
+2216,1623740,"models/dynamics.py",554,0,"",python,selection_keyboard
+2217,1623837,"models/dynamics.py",554,0,"a",python,content
+2218,1623838,"models/dynamics.py",555,0,"",python,selection_keyboard
+2219,1623907,"models/dynamics.py",555,0,"n",python,content
+2220,1623909,"models/dynamics.py",556,0,"",python,selection_keyboard
+2221,1623996,"models/dynamics.py",556,0,"d",python,content
+2222,1623997,"models/dynamics.py",557,0,"",python,selection_keyboard
+2223,1624044,"models/dynamics.py",557,0," ",python,content
+2224,1624045,"models/dynamics.py",558,0,"",python,selection_keyboard
+2225,1624376,"models/dynamics.py",558,0,"d",python,content
+2226,1624377,"models/dynamics.py",559,0,"",python,selection_keyboard
+2227,1624583,"models/dynamics.py",559,0,"t",python,content
+2228,1624584,"models/dynamics.py",560,0,"",python,selection_keyboard
+2229,1624664,"models/dynamics.py",560,0,"y",python,content
+2230,1624665,"models/dynamics.py",561,0,"",python,selection_keyboard
+2231,1624712,"models/dynamics.py",561,0,"p",python,content
+2232,1624713,"models/dynamics.py",562,0,"",python,selection_keyboard
+2233,1624774,"models/dynamics.py",562,0,"e",python,content
+2234,1624776,"models/dynamics.py",563,0,"",python,selection_keyboard
+2235,1624827,"models/dynamics.py",563,0," ",python,content
+2236,1624829,"models/dynamics.py",564,0,"",python,selection_keyboard
+2237,1625111,"models/dynamics.py",564,0,"t",python,content
+2238,1625112,"models/dynamics.py",565,0,"",python,selection_keyboard
+2239,1625158,"models/dynamics.py",565,0,"o",python,content
+2240,1625159,"models/dynamics.py",566,0,"",python,selection_keyboard
+2241,1625183,"models/dynamics.py",566,0," ",python,content
+2242,1625184,"models/dynamics.py",567,0,"",python,selection_keyboard
+2243,1625897,"models/dynamics.py",566,1,"",python,content
+2244,1626027,"models/dynamics.py",566,0," ",python,content
+2245,1626028,"models/dynamics.py",567,0,"",python,selection_keyboard
+2246,1626845,"models/dynamics.py",567,0,"e",python,content
+2247,1626846,"models/dynamics.py",568,0,"",python,selection_keyboard
+2248,1626914,"models/dynamics.py",568,0,".",python,content
+2249,1626915,"models/dynamics.py",569,0,"",python,selection_keyboard
+2250,1627030,"models/dynamics.py",569,0,"g",python,content
+2251,1627031,"models/dynamics.py",570,0,"",python,selection_keyboard
+2252,1627083,"models/dynamics.py",570,0,".",python,content
+2253,1627084,"models/dynamics.py",571,0,"",python,selection_keyboard
+2254,1627315,"models/dynamics.py",571,0," ",python,content
+2255,1627316,"models/dynamics.py",572,0,"",python,selection_keyboard
+2256,1628100,"models/dynamics.py",572,0,"S",python,content
+2257,1628101,"models/dynamics.py",573,0,"",python,selection_keyboard
+2258,1628251,"models/dynamics.py",573,0,"T",python,content
+2259,1628252,"models/dynamics.py",574,0,"",python,selection_keyboard
+2260,1628684,"models/dynamics.py",574,0,"T",python,content
+2261,1628685,"models/dynamics.py",575,0,"",python,selection_keyboard
+2262,1628911,"models/dynamics.py",575,0,"r",python,content
+2263,1628912,"models/dynamics.py",576,0,"",python,selection_keyboard
+2264,1629001,"models/dynamics.py",576,0,"a",python,content
+2265,1629003,"models/dynamics.py",577,0,"",python,selection_keyboard
+2266,1629134,"models/dynamics.py",577,0,"n",python,content
+2267,1629136,"models/dynamics.py",578,0,"",python,selection_keyboard
+2268,1629186,"models/dynamics.py",578,0,"s",python,content
+2269,1629187,"models/dynamics.py",579,0,"",python,selection_keyboard
+2270,1629211,"models/dynamics.py",579,0,"f",python,content
+2271,1629212,"models/dynamics.py",580,0,"",python,selection_keyboard
+2272,1630180,"models/dynamics.py",580,0,"r",python,content
+2273,1630181,"models/dynamics.py",581,0,"",python,selection_keyboard
+2274,1630212,"models/dynamics.py",581,0,"o",python,content
+2275,1630214,"models/dynamics.py",582,0,"",python,selection_keyboard
+2276,1630343,"models/dynamics.py",582,0,"m",python,content
+2277,1630345,"models/dynamics.py",583,0,"",python,selection_keyboard
+2278,1630383,"models/dynamics.py",583,0,"e",python,content
+2279,1630384,"models/dynamics.py",584,0,"",python,selection_keyboard
+2280,1630468,"models/dynamics.py",584,0,"r",python,content
+2281,1630470,"models/dynamics.py",585,0,"",python,selection_keyboard
+2282,1630496,"models/dynamics.py",585,0," ",python,content
+2283,1630500,"models/dynamics.py",586,0,"",python,selection_keyboard
+2284,1630736,"models/dynamics.py",586,0,"a",python,content
+2285,1630737,"models/dynamics.py",587,0,"",python,selection_keyboard
+2286,1631042,"models/dynamics.py",586,1,"",python,content
+2287,1631171,"models/dynamics.py",572,14,"",python,content
+2288,1631640,"models/dynamics.py",572,0,"S",python,content
+2289,1631642,"models/dynamics.py",573,0,"",python,selection_keyboard
+2290,1631824,"models/dynamics.py",573,0,"T",python,content
+2291,1631825,"models/dynamics.py",574,0,"",python,selection_keyboard
+2292,1632072,"models/dynamics.py",574,0,"T",python,content
+2293,1632073,"models/dynamics.py",575,0,"",python,selection_keyboard
+2294,1632679,"models/dynamics.py",575,0,"r",python,content
+2295,1632682,"models/dynamics.py",576,0,"",python,selection_keyboard
+2296,1632780,"models/dynamics.py",576,0,"a",python,content
+2297,1632781,"models/dynamics.py",577,0,"",python,selection_keyboard
+2298,1632927,"models/dynamics.py",577,0,"n",python,content
+2299,1632928,"models/dynamics.py",578,0,"",python,selection_keyboard
+2300,1633033,"models/dynamics.py",578,0,"s",python,content
+2301,1633034,"models/dynamics.py",579,0,"",python,selection_keyboard
+2302,1633107,"models/dynamics.py",579,0,"f",python,content
+2303,1633108,"models/dynamics.py",580,0,"",python,selection_keyboard
+2304,1633311,"models/dynamics.py",580,0,"o",python,content
+2305,1633312,"models/dynamics.py",581,0,"",python,selection_keyboard
+2306,1633459,"models/dynamics.py",581,0,"r",python,content
+2307,1633460,"models/dynamics.py",582,0,"",python,selection_keyboard
+2308,1633563,"models/dynamics.py",582,0,"m",python,content
+2309,1633564,"models/dynamics.py",583,0,"",python,selection_keyboard
+2310,1633678,"models/dynamics.py",583,0,"e",python,content
+2311,1633678,"models/dynamics.py",584,0,"",python,selection_keyboard
+2312,1633712,"models/dynamics.py",584,0,"r",python,content
+2313,1633714,"models/dynamics.py",585,0,"",python,selection_keyboard
+2314,1633764,"models/dynamics.py",585,0," ",python,content
+2315,1633765,"models/dynamics.py",586,0,"",python,selection_keyboard
+2316,1633905,"models/dynamics.py",586,0,"a",python,content
+2317,1633906,"models/dynamics.py",587,0,"",python,selection_keyboard
+2318,1633989,"models/dynamics.py",587,0,"s",python,content
+2319,1633990,"models/dynamics.py",588,0,"",python,selection_keyboard
+2320,1634062,"models/dynamics.py",588,0," ",python,content
+2321,1634063,"models/dynamics.py",589,0,"",python,selection_keyboard
+2322,1634191,"models/dynamics.py",589,0,"w",python,content
+2323,1634192,"models/dynamics.py",590,0,"",python,selection_keyboard
+2324,1634245,"models/dynamics.py",590,0,"e",python,content
+2325,1634246,"models/dynamics.py",591,0,"",python,selection_keyboard
+2326,1634327,"models/dynamics.py",591,0,"l",python,content
+2327,1634328,"models/dynamics.py",592,0,"",python,selection_keyboard
+2328,1634476,"models/dynamics.py",592,0,"l",python,content
+2329,1634478,"models/dynamics.py",593,0,"",python,selection_keyboard
+2330,1634813,"models/dynamics.py",593,0,"?",python,content
+2331,1634814,"models/dynamics.py",594,0,"",python,selection_keyboard
+2332,1634974,"models/dynamics.py",593,0,"",python,selection_command
+2333,1636448,"models/dynamics.py",594,0,"",python,selection_command
+2334,1636589,"models/dynamics.py",593,1,"",python,content
+2335,1636775,"models/dynamics.py",593,0,",",python,content
+2336,1636777,"models/dynamics.py",594,0,"",python,selection_keyboard
+2337,1636844,"models/dynamics.py",594,0," ",python,content
+2338,1636846,"models/dynamics.py",595,0,"",python,selection_keyboard
+2339,1636959,"models/dynamics.py",595,0,"o",python,content
+2340,1636960,"models/dynamics.py",596,0,"",python,selection_keyboard
+2341,1637029,"models/dynamics.py",596,0,"r",python,content
+2342,1637030,"models/dynamics.py",597,0,"",python,selection_keyboard
+2343,1637089,"models/dynamics.py",597,0," ",python,content
+2344,1637093,"models/dynamics.py",598,0,"",python,selection_keyboard
+2345,1637157,"models/dynamics.py",598,0,"w",python,content
+2346,1637158,"models/dynamics.py",599,0,"",python,selection_keyboard
+2347,1637249,"models/dynamics.py",599,0,"i",python,content
+2348,1637250,"models/dynamics.py",600,0,"",python,selection_keyboard
+2349,1637290,"models/dynamics.py",600,0,"l",python,content
+2350,1637290,"models/dynamics.py",601,0,"",python,selection_keyboard
+2351,1637450,"models/dynamics.py",601,0,"l",python,content
+2352,1637451,"models/dynamics.py",602,0,"",python,selection_keyboard
+2353,1637494,"models/dynamics.py",602,0," ",python,content
+2354,1637495,"models/dynamics.py",603,0,"",python,selection_keyboard
+2355,1637616,"models/dynamics.py",603,0,"i",python,content
+2356,1637617,"models/dynamics.py",604,0,"",python,selection_keyboard
+2357,1637739,"models/dynamics.py",604,0,"t",python,content
+2358,1637740,"models/dynamics.py",605,0,"",python,selection_keyboard
+2359,1637758,"models/dynamics.py",605,0," ",python,content
+2360,1637760,"models/dynamics.py",606,0,"",python,selection_keyboard
+2361,1637938,"models/dynamics.py",606,0,"e",python,content
+2362,1637939,"models/dynamics.py",607,0,"",python,selection_keyboard
+2363,1638556,"models/dynamics.py",606,1,"",python,content
+2364,1638723,"models/dynamics.py",606,0,"b",python,content
+2365,1638724,"models/dynamics.py",607,0,"",python,selection_keyboard
+2366,1638786,"models/dynamics.py",607,0,"e",python,content
+2367,1638787,"models/dynamics.py",608,0,"",python,selection_keyboard
+2368,1638817,"models/dynamics.py",608,0," ",python,content
+2369,1638820,"models/dynamics.py",609,0,"",python,selection_keyboard
+2370,1638944,"models/dynamics.py",609,0,"i",python,content
+2371,1638945,"models/dynamics.py",610,0,"",python,selection_keyboard
+2372,1639020,"models/dynamics.py",610,0,"n",python,content
+2373,1639021,"models/dynamics.py",611,0,"",python,selection_keyboard
+2374,1639090,"models/dynamics.py",611,0,"f",python,content
+2375,1639092,"models/dynamics.py",612,0,"",python,selection_keyboard
+2376,1639230,"models/dynamics.py",612,0,"e",python,content
+2377,1639231,"models/dynamics.py",613,0,"",python,selection_keyboard
+2378,1639294,"models/dynamics.py",613,0,"r",python,content
+2379,1639296,"models/dynamics.py",614,0,"",python,selection_keyboard
+2380,1639397,"models/dynamics.py",614,0,"r",python,content
+2381,1639397,"models/dynamics.py",615,0,"",python,selection_keyboard
+2382,1639692,"models/dynamics.py",615,0,"e",python,content
+2383,1639694,"models/dynamics.py",616,0,"",python,selection_keyboard
+2384,1639879,"models/dynamics.py",616,0,"d",python,content
+2385,1639880,"models/dynamics.py",617,0,"",python,selection_keyboard
+2386,1639963,"models/dynamics.py",617,0," ",python,content
+2387,1639964,"models/dynamics.py",618,0,"",python,selection_keyboard
+2388,1640067,"models/dynamics.py",618,0,"f",python,content
+2389,1640068,"models/dynamics.py",619,0,"",python,selection_keyboard
+2390,1640246,"models/dynamics.py",619,0,"r",python,content
+2391,1640246,"models/dynamics.py",620,0,"",python,selection_keyboard
+2392,1640334,"models/dynamics.py",620,0,"o",python,content
+2393,1640335,"models/dynamics.py",621,0,"",python,selection_keyboard
+2394,1640382,"models/dynamics.py",621,0,"m",python,content
+2395,1640383,"models/dynamics.py",622,0,"",python,selection_keyboard
+2396,1640457,"models/dynamics.py",622,0," ",python,content
+2397,1640461,"models/dynamics.py",623,0,"",python,selection_keyboard
+2398,1640528,"models/dynamics.py",623,0,"t",python,content
+2399,1640529,"models/dynamics.py",624,0,"",python,selection_keyboard
+2400,1640629,"models/dynamics.py",624,0,"h",python,content
+2401,1640630,"models/dynamics.py",625,0,"",python,selection_keyboard
+2402,1640687,"models/dynamics.py",625,0,"e",python,content
+2403,1640688,"models/dynamics.py",626,0,"",python,selection_keyboard
+2404,1640745,"models/dynamics.py",626,0," ",python,content
+2405,1640746,"models/dynamics.py",627,0,"",python,selection_keyboard
+2406,1640893,"models/dynamics.py",627,0,"'",python,content
+2407,1640894,"models/dynamics.py",628,0,"",python,selection_keyboard
+2408,1642128,"models/dynamics.py",628,0,"p",python,content
+2409,1642130,"models/dynamics.py",629,0,"",python,selection_keyboard
+2410,1642197,"models/dynamics.py",629,0,"a",python,content
+2411,1642198,"models/dynamics.py",630,0,"",python,selection_keyboard
+2412,1642257,"models/dynamics.py",630,0,"r",python,content
+2413,1642259,"models/dynamics.py",631,0,"",python,selection_keyboard
+2414,1642400,"models/dynamics.py",631,0,"e",python,content
+2415,1642401,"models/dynamics.py",632,0,"",python,selection_keyboard
+2416,1642508,"models/dynamics.py",632,0,"n",python,content
+2417,1642510,"models/dynamics.py",633,0,"",python,selection_keyboard
+2418,1642592,"models/dynamics.py",633,0,"n",python,content
+2419,1642593,"models/dynamics.py",634,0,"",python,selection_keyboard
+2420,1642657,"models/dynamics.py",634,0," ",python,content
+2421,1642660,"models/dynamics.py",635,0,"",python,selection_keyboard
+2422,1643056,"models/dynamics.py",634,1,"",python,content
+2423,1643110,"models/dynamics.py",634,0,"t",python,content
+2424,1643112,"models/dynamics.py",635,0,"",python,selection_keyboard
+2425,1643181,"models/dynamics.py",635,0," ",python,content
+2426,1643182,"models/dynamics.py",636,0,"",python,selection_keyboard
+2427,1643542,"models/dynamics.py",628,8,"",python,content
+2428,1643745,"models/dynamics.py",628,0,"p",python,content
+2429,1643747,"models/dynamics.py",629,0,"",python,selection_keyboard
+2430,1643782,"models/dynamics.py",629,0,"a",python,content
+2431,1643784,"models/dynamics.py",630,0,"",python,selection_keyboard
+2432,1643865,"models/dynamics.py",630,0,"r",python,content
+2433,1643865,"models/dynamics.py",631,0,"",python,selection_keyboard
+2434,1643961,"models/dynamics.py",631,0,"e",python,content
+2435,1643962,"models/dynamics.py",632,0,"",python,selection_keyboard
+2436,1644064,"models/dynamics.py",632,0,"n",python,content
+2437,1644065,"models/dynamics.py",633,0,"",python,selection_keyboard
+2438,1644169,"models/dynamics.py",633,0,"t",python,content
+2439,1644170,"models/dynamics.py",634,0,"",python,selection_keyboard
+2440,1644210,"models/dynamics.py",634,0," ",python,content
+2441,1644212,"models/dynamics.py",635,0,"",python,selection_keyboard
+2442,1645730,"models/dynamics.py",635,0,"m",python,content
+2443,1645731,"models/dynamics.py",636,0,"",python,selection_keyboard
+2444,1645961,"models/dynamics.py",635,1,"",python,content
+2445,1646225,"models/dynamics.py",635,0,"M",python,content
+2446,1646227,"models/dynamics.py",636,0,"",python,selection_keyboard
+2447,1646400,"models/dynamics.py",636,0,"o",python,content
+2448,1646401,"models/dynamics.py",637,0,"",python,selection_keyboard
+2449,1646827,"models/dynamics.py",636,1,"",python,content
+2450,1646960,"models/dynamics.py",635,1,"",python,content
+2451,1647114,"models/dynamics.py",635,0,"m",python,content
+2452,1647115,"models/dynamics.py",636,0,"",python,selection_keyboard
+2453,1647150,"models/dynamics.py",636,0,"o",python,content
+2454,1647153,"models/dynamics.py",637,0,"",python,selection_keyboard
+2455,1647213,"models/dynamics.py",637,0,"d",python,content
+2456,1647214,"models/dynamics.py",638,0,"",python,selection_keyboard
+2457,1647332,"models/dynamics.py",638,0,"u",python,content
+2458,1647333,"models/dynamics.py",639,0,"",python,selection_keyboard
+2459,1647475,"models/dynamics.py",639,0,"l",python,content
+2460,1647476,"models/dynamics.py",640,0,"",python,selection_keyboard
+2461,1647513,"models/dynamics.py",640,0,"e",python,content
+2462,1647514,"models/dynamics.py",641,0,"",python,selection_keyboard
+2463,1648028,"models/dynamics.py",641,0,"'",python,content
+2464,1648030,"models/dynamics.py",642,0,"",python,selection_keyboard
+2465,1648223,"models/dynamics.py",642,0,"?",python,content
+2466,1648224,"models/dynamics.py",643,0,"",python,selection_keyboard
+2467,1648572,"models/dynamics.py",642,0,"",python,selection_command
+2468,1649213,"models/dynamics.py",491,0,"",python,selection_command
+2469,1649465,"models/dynamics.py",471,0,"",python,selection_command
+2470,1649497,"models/dynamics.py",469,0,"",python,selection_command
+2471,1649530,"models/dynamics.py",448,0,"",python,selection_command
+2472,1649564,"models/dynamics.py",421,0,"",python,selection_command
+2473,1649598,"models/dynamics.py",313,0,"",python,selection_command
+2474,1649632,"models/dynamics.py",291,0,"",python,selection_command
+2475,1649666,"models/dynamics.py",272,0,"",python,selection_command
+2476,1649700,"models/dynamics.py",253,0,"",python,selection_command
+2477,1649733,"models/dynamics.py",233,0,"",python,selection_command
+2478,1649828,"models/dynamics.py",212,0,"",python,selection_command
+2479,1649999,"models/dynamics.py",194,0,"",python,selection_command
+2480,1650145,"models/dynamics.py",192,0,"",python,selection_command
+2481,1650311,"models/dynamics.py",159,0,"",python,selection_command
+2482,1650565,"models/dynamics.py",158,0,"",python,selection_command
+2483,1650766,"models/dynamics.py",152,0,"",python,selection_command
+2484,1651078,".venv/lib/python3.10/site-packages/flax/linen/module.py",0,0,"# Copyright 2024 The Flax Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the ""License"");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an ""AS IS"" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n""""""Flax Module.""""""\n\nimport contextlib\nimport dataclasses\nimport enum\nimport functools\nimport inspect\nimport sys\nimport threading\nimport typing\nimport weakref\nfrom types import MappingProxyType\nfrom typing import (\n Any,\n Literal,\n Optional,\n TypeVar,\n Union,\n overload,\n)\nfrom collections.abc import Callable, Iterable, Iterator, Mapping\n\nimport jax\nimport jax.numpy as jnp\nimport typing_extensions as tpe\n\nimport flax\nimport flax.linen as nn\nfrom flax import (\n config,\n core,\n errors,\n serialization,\n traceback_util,\n traverse_util,\n)\nfrom flax.core import Scope, meta, partial_eval\nfrom flax.core.frozen_dict import FrozenDict\nfrom flax.core.scope import (\n CollectionFilter,\n DenyList,\n Variable,\n union_filters,\n)\nfrom flax.ids import FlaxId, uuid\nfrom flax.linen import kw_only_dataclasses\nfrom flax.typing import (\n RNGSequences,\n PRNGKey,\n FrozenVariableDict,\n VariableDict,\n)\n\ntraceback_util.register_exclusion(__file__)\n\n\nT = TypeVar('T')\nK = TypeVar('K')\nM = TypeVar('M', bound='Module')\n_CallableT = TypeVar('_CallableT', bound=Callable)\n\n\n# Used for abstractly testing module behavior.\nTestScope = type(\n 'TestScope',\n (Scope,),\n {'make_rng': lambda self, name: jax.random.key(0)},\n)\n\n\n# pylint: 
disable=protected-access,attribute-defined-outside-init\ndef _get_fn_name(fn):\n if isinstance(fn, functools.partial):\n return _get_fn_name(fn.func)\n return getattr(fn, '__name__', 'unnamed_function')\n\n\ndef _indent(x: str, num_spaces: int):\n indent_str = ' ' * num_spaces\n lines = x.split('\n')\n # skip last line because it is always empty and should not be indented.\n assert not lines[-1]\n return '\n'.join(indent_str + line for line in lines[:-1]) + '\n'\n\n\ndef _attr_repr(value: Any):\n if callable(value) and (\n (isinstance(value, nn.Module) and value.__dict__.get('__name__', None))\n or (not isinstance(value, nn.Module) and getattr(value, '__name__', None))\n ):\n value_rep = value.__name__\n else:\n value_rep = repr(value)\n return value_rep\n\n\ndef _module_repr(module: 'Module', num_spaces: int = 4):\n """"""Returns a pretty printed representation of the module.""""""\n cls = type(module)\n try:\n fields = dataclasses.fields(cls)\n except TypeError:\n # Edge case with no fields e.g. 
module = nn.Module() causes error later.\n return object.__repr__(module)\n cls_name = cls.__name__\n rep = ''\n\n attributes = {\n f.name: f.type\n for f in fields\n if f.name not in ('parent', 'name') and f.repr\n }\n child_modules = {\n k: v\n for k, v in module._state.children.items() # pytype: disable=attribute-error\n if isinstance(v, Module)\n }\n if attributes:\n rep += '# attributes\n'\n for attr in attributes.keys():\n # TODO(jheek): can we get a nice string representation of attribute types?\n value = module.__dict__.get(attr, None)\n value_rep = _attr_repr(value)\n rep += f'{attr} = {value_rep}\n'\n if child_modules:\n rep += '# children\n'\n for name, child in child_modules.items():\n child_rep = _module_repr(child, num_spaces)\n rep += f'{name} = {child_rep}\n'\n if rep:\n return f'{cls_name}(\n{_indent(rep, num_spaces)})'\n else:\n return f'{cls_name}()'\n\n\n# Tabulation utilities.\n# -----------------------------------------------------------------------------\n@dataclasses.dataclass\nclass _CallInfo:\n index: int\n path: tuple[str, ...]\n module: 'Module'\n rngs: dict[str, core.scope.PRNGKey | core.scope.LazyRng] | None\n mutable: bool\n method: str\n args: tuple[Any, ...]\n kwargs: dict[str, Any]\n outputs: Any\n\n\n@dataclasses.dataclass\nclass _CallInfoContext(threading.local):\n index: int\n calls: list[_CallInfo]\n\n def get_call_index(self) -> int:\n index = self.index\n self.index += 1\n return index\n\n\n@contextlib.contextmanager\ndef _tabulate_context():\n _context.call_info_stack.append(_CallInfoContext(0, []))\n try:\n yield\n finally:\n _context.call_info_stack.pop()\n\n\n# Track parent relationship across Modules.\n# -----------------------------------------------------------------------------\nclass _DynamicContext(threading.local):\n """"""Dynamic context.""""""\n\n # TODO(marcvanzee): switch to using contextvars once minimum python version is\n # 3.7\n\n def __init__(self):\n self.module_stack: list['Module' | None] = [\n None,\n 
]\n self.capture_stack = []\n self.call_info_stack: list[_CallInfoContext] = []\n\n\n# The global context\n_context = _DynamicContext()\n\n\nclass _Sentinel:\n def __copy__(self):\n return self # Do not copy singleton sentinel.\n\n def __deepcopy__(self, memo):\n del memo\n return self # Do not copy singleton sentinel.\n\n def __reduce__(self):\n return _get_unspecified_parent, ()\n\n\ndef _get_unspecified_parent():\n return _unspecified_parent\n\n\n_unspecified_parent = _Sentinel()\n\n\n# Enable automatic named_call wrapping for labelling profile traces.\n# -----------------------------------------------------------------------------\n_use_named_call = config.flax_profile\n\n\ndef _derive_profiling_name(module, fn):\n fn_name = _get_fn_name(fn)\n method_suffix = f'.{fn_name}' if fn_name != '__call__' else ''\n module_name = module.name or module.__class__.__name__\n return f'{module_name}{method_suffix}'\n\n\ndef enable_named_call():\n """"""Enables named call wrapping for labelling profile traces.\n\n When named call wrapping is enabled all JAX ops executed in a Module\n will be run under ``jax.named_scope``. 
The ``Module`` class name will\n show up around the operations belonging to that Module in the\n Tensorboard profiling UI, simplifying the profiling process.\n\n Note that ``jax.named_scope`` only works for\n compiled functions (e.g.: using jax.jit or jax.pmap).\n """"""\n global _use_named_call\n _use_named_call = True\n\n\ndef disable_named_call():\n """"""Disables named call wrapping.\n\n See ``enable_named_call``\n """"""\n global _use_named_call\n _use_named_call = False\n\n\n@contextlib.contextmanager\ndef override_named_call(enable: bool = True):\n # pylint: disable=g-doc-return-or-yield\n """"""Returns a context manager that enables/disables named call wrapping.\n\n Args:\n enable: If true, enables named call wrapping for labelling profile traces.\n (see ``enabled_named_call``).\n """"""\n # pylint: enable=g-doc-return-or-yield\n global _use_named_call\n use_named_call_prev = _use_named_call\n _use_named_call = enable\n try:\n yield\n finally:\n _use_named_call = use_named_call_prev\n\n\n# Intercept module methods.\n# -----------------------------------------------------------------------------\n@dataclasses.dataclass(frozen=True)\nclass InterceptorContext:\n """"""Read only state showing the calling context for method interceptors.\n\n Attributes:\n module: The Module instance whose method is being called.\n method_name: The name of the method being called on the module.\n orig_method: The original method defined on the module. 
Calling it will\n short circuit all other interceptors.\n """"""\n\n module: 'Module'\n method_name: str\n orig_method: Callable[..., Any]\n\n\nclass ThreadLocalStack(threading.local):\n """"""Thread-local stack.""""""\n\n def __init__(self):\n self._storage = []\n\n def push(self, elem: Any) -> None:\n self._storage.append(elem)\n\n def pop(self) -> Any:\n return self._storage.pop()\n\n def __iter__(self) -> Iterator[Any]:\n return iter(reversed(self._storage))\n\n def __len__(self) -> int:\n return len(self._storage)\n\n def __repr__(self) -> str:\n return f'{self.__class__.__name__}({self._storage})'\n\n\nArgs = tuple[Any]\nKwargs = dict[str, Any]\nNextGetter = Callable[..., Any]\nInterceptor = Callable[[NextGetter, Args, Kwargs, InterceptorContext], Any]\n_global_interceptor_stack = ThreadLocalStack()\n\n\n@contextlib.contextmanager\ndef intercept_methods(interceptor: Interceptor):\n # pylint: disable=g-doc-return-or-yield\n r""""""Registers a new method interceptor.\n\n Method interceptors allow you to (at a distance) intercept method calls to\n modules. It works similarly to decorators. You could modify args/kwargs before\n calling the underlying method and/or modify the result returning from calling\n the underlying method. Or you could completely skip calling the underlying\n method and decide to do something differently. For example::\n\n >>> import flax.linen as nn\n >>> import jax.numpy as jnp\n ...\n >>> class Foo(nn.Module):\n ... def __call__(self, x):\n ... return x\n ...\n >>> def my_interceptor1(next_fun, args, kwargs, context):\n ... print('calling my_interceptor1')\n ... return next_fun(*args, **kwargs)\n ...\n >>> foo = Foo()\n >>> with nn.intercept_methods(my_interceptor1):\n ... _ = foo(jnp.ones([1]))\n calling my_interceptor1\n\n You could also register multiple interceptors on the same method. Interceptors\n will run in order. For example::\n\n >>> def my_interceptor2(next_fun, args, kwargs, context):\n ... 
print('calling my_interceptor2')\n ... return next_fun(*args, **kwargs)\n ...\n >>> with nn.intercept_methods(my_interceptor1), \\n ... nn.intercept_methods(my_interceptor2):\n ... _ = foo(jnp.ones([1]))\n calling my_interceptor1\n calling my_interceptor2\n\n You could skip other interceptors by directly calling the\n ``context.orig_method``. For example::\n\n >>> def my_interceptor3(next_fun, args, kwargs, context):\n ... print('calling my_interceptor3')\n ... return context.orig_method(*args, **kwargs)\n >>> with nn.intercept_methods(my_interceptor3), \\n ... nn.intercept_methods(my_interceptor1), \\n ... nn.intercept_methods(my_interceptor2):\n ... _ = foo(jnp.ones([1]))\n calling my_interceptor3\n\n The following methods couldn't be intercepted:\n\n 1. Methods decoratored with ``nn.nowrap``.\n 2. Dunder methods including ``__eq__``, ``__repr__``, ``__init__``, ``__hash__``, and ``__post_init__``.\n 3. Module dataclass fields.\n 4. Module descriptors.\n\n Args:\n interceptor: A method interceptor.\n """"""\n _global_interceptor_stack.push(interceptor)\n try:\n yield\n finally:\n assert _global_interceptor_stack.pop() is interceptor\n\n\ndef run_interceptors(\n orig_method: Callable[..., Any],\n module: 'Module',\n *args,\n **kwargs,\n) -> Any:\n """"""Runs method interceptors.""""""\n method_name = _get_fn_name(orig_method)\n fun = functools.partial(orig_method, module)\n context = InterceptorContext(module, method_name, fun)\n\n def wrap_interceptor(interceptor, fun):\n """"""Wraps `fun` with `interceptor`.""""""\n\n @functools.wraps(fun)\n def wrapped(*args, **kwargs):\n return interceptor(fun, args, kwargs, context)\n\n return wrapped\n\n # Wraps interceptors around the original method. 
The innermost interceptor is\n # the last one added and directly wrapped around the original bound method.\n for interceptor in _global_interceptor_stack:\n fun = wrap_interceptor(interceptor, fun)\n return fun(*args, **kwargs)\n\n\n# Utilities for pytrees of Modules defined inside setup()\n# -----------------------------------------------------------------------------\n\n\ndef _sorted_items(x):\n """"""Returns items of a dict ordered by keys.""""""\n return sorted(x.items(), key=lambda x: x[0])\n\n\ndef _get_suffix_value_pairs(\n tree_or_leaf: Any,\n) -> list[tuple[str, type['Module']]]:\n """"""Helper for naming pytrees of submodules.""""""\n dict_or_leaf = serialization.to_state_dict(tree_or_leaf)\n if not isinstance(dict_or_leaf, dict) or not dict_or_leaf:\n return [('', tree_or_leaf)]\n else:\n flat_dict = traverse_util.flatten_dict(dict_or_leaf)\n return [('_' + '_'.join(k), v) for k, v in _sorted_items(flat_dict)]\n\n\ndef _map_over_modules_in_tree(fn, tree_or_leaf):\n """"""Helper for mapping function over submodules.""""""\n dict_or_leaf = serialization.to_state_dict(tree_or_leaf)\n if not isinstance(dict_or_leaf, dict) or not dict_or_leaf:\n return fn('', tree_or_leaf)\n else:\n flat_dict = traverse_util.flatten_dict(dict_or_leaf, keep_empty_nodes=True)\n mapped_flat_dict = {\n k: fn('_' + '_'.join(k), v) for k, v in _sorted_items(flat_dict)\n }\n return serialization.from_state_dict(\n tree_or_leaf, traverse_util.unflatten_dict(mapped_flat_dict)\n )\n\n\ndef _freeze_attr(val: Any) -> Any:\n """"""Recursively wrap the given attribute `var` in ``FrozenDict``.""""""\n if isinstance(val, (dict, FrozenDict)):\n return FrozenDict({k: _freeze_attr(v) for k, v in val.items()})\n elif isinstance(val, tuple):\n # Special case namedtuples and special JAX tuple structures otherwise they\n # would be downgraded to normal tuples.\n if hasattr(val, '_fields') or type(val).__name__ == 'PartitionSpec':\n return type(val)(*[_freeze_attr(v) for v in val])\n else:\n return 
tuple(_freeze_attr(v) for v in val)\n elif isinstance(val, list):\n return tuple(_freeze_attr(v) for v in val)\n else:\n return val\n\n\n# Method wrapping of ""compact methods"" and setup()\n# -----------------------------------------------------------------------------\ndef compact(fun: _CallableT) -> _CallableT:\n """"""Marks the given module method allowing inlined submodules.\n\n Methods wrapped in @compact can define submodules directly within the method.\n\n For instance::\n\n >>> import flax.linen as nn\n\n >>> class Foo(nn.Module):\n ... @nn.compact\n ... def __call__(self, x, features):\n ... x = nn.Dense(features)(x)\n ... ...\n ... return x\n\n At most one method in each Module may be wrapped with @compact.\n\n Args:\n fun: The Module method to mark as compact.\n\n Returns:\n The given function ``fun`` marked as compact.\n """"""\n fun.compact = True # type: ignore[attr-defined]\n return fun\n\n\ndef nowrap(fun: _CallableT) -> _CallableT:\n """"""Marks the given module method as a helper method that needn't be wrapped.\n\n Methods wrapped in ``@nowrap`` are private helper methods that needn't be wrapped\n with the state handler or a separate named_call transform.\n\n This is needed in several concrete instances:\n - if you're subclassing a method like Module.param and don't want this\n overriden core function decorated with the state management wrapper.\n - If you want a method to be callable from an unbound Module (e.g.: a\n function of construction of arguments that doesn't depend on params/RNGs).\n If you want to learn more about how Flax Modules manage their state read the\n [The Flax Module lifecycle](https://flax.readthedocs.io/en/latest/developer_notes/module_lifecycle.html)\n guide.\n\n For instance::\n\n >>> import flax.linen as nn\n >>> import jax, jax.numpy as jnp\n\n >>> class Foo(nn.Module):\n ... num_features: int\n\n ... @nn.nowrap\n ... def _make_dense(self, num_features):\n ... return nn.Dense(num_features)\n\n ... @nn.compact\n ... 
def __call__(self, x):\n ... # now safe to use constructor helper even if using named_call\n ... dense = self._make_dense(self.num_features)\n ... return dense(x)\n\n Args:\n fun: The Module method to mark as nowrap.\n\n Returns:\n The given function ``fun`` marked as nowrap.\n """"""\n fun.nowrap = True # type: ignore[attr-defined]\n return fun\n\n\ndef compact_name_scope(fun: _CallableT) -> _CallableT:\n """"""Creates compact submodules from a method.\n\n This is a decorator that allows you to define compact submodules from a\n method. It's intention is to make it easier to port code Haiku code to Flax\n by providing the same functionality.\n\n Example::\n\n >>> import flax.linen as nn\n >>> import jax\n >>> import jax.numpy as jnp\n >>> from flax.core import pretty_repr\n ...\n >>> class Foo(nn.Module):\n ... @nn.compact_name_scope\n ... def up(self, x):\n ... return nn.Dense(3)(x)\n ...\n ... @nn.compact_name_scope\n ... def down(self, x):\n ... return nn.Dense(3)(x)\n ...\n ... def __call__(self, x):\n ... return self.up(x) + self.down(x)\n ...\n >>> module = Foo()\n >>> variables = module.init(jax.random.PRNGKey(0), jnp.ones((1, 2)))\n >>> params = variables['params']\n >>> print(pretty_repr(jax.tree_util.tree_map(jnp.shape, params)))\n {\n down: {\n Dense_0: {\n bias: (3,),\n kernel: (2, 3),\n },\n },\n up: {\n Dense_0: {\n bias: (3,),\n kernel: (2, 3),\n },\n },\n }\n\n You can also use ``compact_name_scope`` inside ``@compact`` methods or even\n other\n ``compact_name_scope`` methods. 
Methods that are decorated with\n ``compact_name_scope``\n can also be called directly from ``init`` or ``apply`` via the ``method``\n argument::\n\n >>> y_down = module.apply({'params': params}, jnp.ones((1, 2)), method='down')\n >>> y_down.shape\n (1, 3)\n\n Args:\n fun: The Module method to mark as compact_name_scope.\n\n Returns:\n The given function ``fun`` marked as compact_name_scope.\n """"""\n\n @functools.wraps(fun)\n def compact_name_scope_wrapper(self: nn.Module, *args, **kwargs):\n name = fun.__name__\n if not hasattr(self, '_compact_name_scope_modules'):\n raise ValueError(\n f'Cannot call compact_name_scope method {name!r} on a Module that has not been '\n f'setup. This is likely because you are calling {name!r} '\n 'from outside of init or apply.'\n )\n module = self._compact_name_scope_modules[name]\n return module(*args, **kwargs)\n\n compact_name_scope_wrapper.compact_name_scope = True # type: ignore[attr-defined]\n compact_name_scope_wrapper.inner_fun = fun # type: ignore[attr-defined]\n compact_name_scope_wrapper.nowrap = True # type: ignore[attr-defined]\n return compact_name_scope_wrapper # type: ignore[return-value]\n\n\ndef _get_local_method_names(\n cls: Any, exclude: Iterable[str] = ()\n) -> tuple[str, ...]:\n """"""Gets method names of a class, excluding class and static methods.\n\n Args:\n cls: The class to get method names for.\n exclude: Names to exclude from output.\n\n Returns:\n A list of method names.\n """"""\n true_methods = set()\n for m in cls.__dict__:\n if callable(cls.__dict__[m]) and not inspect.isclass(\n cls.__dict__[m]\n ): # pytype: disable=not-supported-yet\n mtype = type(cls.__dict__[m])\n if mtype != staticmethod and mtype != classmethod:\n true_methods.add(m)\n return tuple(true_methods.difference(set(exclude)))\n\n\ndef _get_local_descriptor_names(\n cls: Any, exclude: Iterable[str] = ()\n) -> tuple[str, ...]:\n """"""Gets descriptor names of a class.\n\n Args:\n cls: The class to get property names for.\n 
exclude: Names to exclude from output.\n\n Returns:\n A list of property names.\n """"""\n true_properties = set()\n for m, attr in cls.__dict__.items():\n if not callable(attr) and (\n hasattr(attr, '__get__')\n or hasattr(attr, '__set__')\n or hasattr(attr, '__delete__')\n ):\n mtype = type(attr)\n if mtype != staticmethod and mtype != classmethod:\n true_properties.add(m)\n return tuple(true_properties.difference(set(exclude)))\n\n\ndef wrap_method_once(fun: Callable[..., Any]) -> Callable[..., Any]:\n """"""Manages Module state for a given user-defined method.\n\n Args:\n fun: User-defined Module method to manage state for.\n\n Returns:\n Wrapped method.\n """"""\n # Don't rewrap methods that have already had the state management wrapper\n # applied in the decorator stack. This wrapper should always be applied\n # before transformation wrappers.\n if hasattr(fun, 'method_handler_wrapped'):\n return fun\n\n @functools.wraps(fun)\n def wrapped_module_method(*args, **kwargs):\n # We might have incorrectly wrappped a callable\n # that is not a method. 
Check whether the first arg is self,\n # otherwise call the wrapped function as is.\n if args and isinstance(args[0], Module):\n self, args = args[0], args[1:]\n return self._call_wrapped_method(fun, args, kwargs)\n else:\n return fun(*args, **kwargs)\n\n wrapped_module_method.method_handler_wrapped = True # type: ignore[attr-defined]\n return wrapped_module_method\n\n\ndef wrap_descriptor_once(descriptor) -> 'DescriptorWrapper':\n """"""Wraps a descriptor to give better error messages.\n\n Args:\n descriptor: User-defined Module attribute descriptor.\n\n Returns:\n Wrapped descriptor.\n """"""\n # Don't rewrap descriptors.\n if isinstance(descriptor, DescriptorWrapper):\n return descriptor\n\n return create_descriptor_wrapper(descriptor)\n\n\ndef _wrap_hash(hash_fn: Callable[..., Any]) -> Callable[..., Any]:\n """"""Wraps a hash function with some check for Flax Modules.""""""\n\n @functools.wraps(hash_fn)\n def wrapped(self):\n if self.scope is not None:\n raise TypeError(""Can't call __hash__ on modules that hold variables."")\n try:\n hash_value = hash_fn(self)\n except TypeError as exc:\n raise TypeError(\n 'Failed to hash Flax Module. '\n 'The module probably contains unhashable attributes. 
'\n f'Module={self}'\n ) from exc\n return hash_value\n\n return wrapped\n\n\ndef _get_unbound_fn(method_or_fn: Callable[..., Any]) -> Callable[..., Any]:\n """"""Returns an unbound function from a method that is possibly bound.\n\n This means that if the passed function belongs of an instance of a class, then\n the returned function does no longer depend on the instance, which is passed\n as the first argument to the function.\n\n Args:\n method_or_fn: A class method or function.\n\n Returns:\n An unbound version of input function.\n """"""\n if inspect.ismethod(method_or_fn) and isinstance(\n method_or_fn.__self__, Module\n ): # pytype: disable=attribute-error\n method_or_fn = method_or_fn.__func__ # pytype: disable=attribute-error\n\n # The method should be callable, and it should have at least one argument\n # representing the class that is passed in.\n if (\n not callable(method_or_fn)\n or len(inspect.signature(method_or_fn).parameters) < 1\n ):\n raise errors.ApplyModuleInvalidMethodError(method_or_fn)\n\n return method_or_fn\n\n\ndef _map_submodules(fn: Callable[['Module'], Any], tree):\n """"""Map a function over all submodules in a tree.""""""\n g = lambda _, x: fn(x) if isinstance(x, Module) else x\n return _freeze_attr(_map_over_modules_in_tree(g, tree))\n\n\nclass SetupState(enum.IntEnum):\n # setup() has not been called.\n NEW = 0\n # setup() has been called outside a transform boundary.\n TRANSFORMED = 1\n # setup() has been called.\n DONE = 2\n\n\n@dataclasses.dataclass\nclass _ModuleInternalState:\n """"""Ephemeral Module Evaluation State.\n\n For clarity, we collect all of the temporary flags and ephemeral state used by\n Modules for autonaming and error messages here, alongside the rules used\n to pass this ephemeral state across transform boundaries.\n """"""\n\n in_compact_method: bool = False\n in_setup: bool = False\n setup_called: SetupState = SetupState.NEW\n is_initialized: bool = False\n autoname_cursor: dict[str, int] = 
dataclasses.field(default_factory=dict)\n children: dict[str, Union[str, 'Module']] = dataclasses.field(\n default_factory=dict\n )\n\n def reset(self) -> None:\n """"""Resets transient state.\n\n This function is called after each module method, so only attributes that\n are method-dependent are reset.\n """"""\n self.in_compact_method = False\n self.in_setup = False\n self.autoname_cursor = dict()\n\n def export(self) -> '_ModuleInternalState':\n """"""Exports transform-preserved state across transform boundary.""""""\n setup_state = (\n SetupState.TRANSFORMED if self.setup_called else SetupState.NEW\n )\n cloned = _ModuleInternalState(\n in_compact_method=self.in_compact_method,\n in_setup=self.in_setup,\n setup_called=setup_state,\n is_initialized=self.is_initialized,\n autoname_cursor=dict(self.autoname_cursor),\n )\n return cloned\n\n def reimport(self, other: '_ModuleInternalState') -> None:\n """"""Re-imports transform-preserved state from across transform boundary.""""""\n self.in_compact_method = other.in_compact_method\n self.in_setup = other.in_setup\n self.is_initialized = other.is_initialized\n self.autoname_cursor = dict(other.autoname_cursor)\n\n\n_uninitialized_module_internal_state = _ModuleInternalState()\n\n\n_UNDEFINED_COPY_PICKLE_METHODS = (\n '__getstate__',\n '__setstate__',\n '__getnewargs_ex__',\n '__reduce__',\n '__reduce_ex__',\n '__copy__',\n '__deepcopy__',\n)\n\n\n_caches: 'weakref.WeakKeyDictionary[Scope, weakref.WeakValueDictionary[FlaxId, Module]]' = weakref.WeakKeyDictionary()\n\n\ntuple_reduce = lambda xs, x: xs + (x,)\ntuple_init = lambda: ()\n\n\ncapture_call_intermediates = lambda _, method_name: method_name == '__call__'\n\n\nclass ParentDescriptor:\n """"""Wraps parent module references in weak refs.\n\n This prevents reference cycles from forming via parent links which can lead\n to accidental OOMs in eager mode due to slow garbage collection as well as\n spurious tracer leaks during jit compilation.\n\n Note: 
""descriptors"" are the underlying python mechanism for implementing\n dynamic @property decorators. We need to use a raw descriptor instead of the\n more common decorator in order to force that the appropriate getter/setter\n logic applies in subclasses even after various dataclass transforms.\n """"""\n\n def __get__(self, obj, objtype=None):\n # check if obj is None, happens during %autoreload\n if obj is None:\n return None\n parent = object.__getattribute__(obj, '_parent_ref')\n return parent() if isinstance(parent, weakref.ReferenceType) else parent\n\n def __set__(self, obj, value):\n maybe_weak = weakref.ref(value) if isinstance(value, Module) else value\n object.__setattr__(obj, '_parent_ref', maybe_weak)\n\n\nclass Descriptor(tpe.Protocol):\n __isabstractmethod__: bool\n\n def __get__(self, obj, objtype=None) -> Any:\n ...\n\n def __set__(self, obj, value) -> None:\n ...\n\n def __delete__(self, obj) -> None:\n ...\n\n def __set_name__(self, owner, name) -> None:\n ...\n\n\nclass DescriptorWrapper:\n pass\n\n\ndef create_descriptor_wrapper(descriptor: Descriptor):\n """"""Creates a descriptor wrapper that calls a get_fn on the descriptor.""""""\n\n class _DescriptorWrapper(DescriptorWrapper):\n """"""A descriptor that can wrap any descriptor.""""""\n\n if hasattr(descriptor, '__isabstractmethod__'):\n __isabstractmethod__ = descriptor.__isabstractmethod__\n\n def __init__(self, wrapped: Descriptor):\n self.wrapped = wrapped\n\n # conditionally define descriptor methods\n if hasattr(descriptor, '__get__'):\n\n def __get__(self, *args, **kwargs):\n # here we will catch internal AttributeError and re-raise it as a\n # more informative and correct error message.\n try:\n return self.wrapped.__get__(*args, **kwargs)\n except AttributeError as e:\n raise errors.DescriptorAttributeError() from e\n\n if hasattr(descriptor, '__set__'):\n\n def __set__(self, *args, **kwargs):\n return self.wrapped.__set__(*args, **kwargs)\n\n if hasattr(descriptor, 
'__delete__'):\n\n def __delete__(self, *args, **kwargs):\n return self.wrapped.__delete__(*args, **kwargs)\n\n if hasattr(descriptor, '__set_name__'):\n\n def __set_name__(self, *args, **kwargs):\n self.wrapped.__set_name__(*args, **kwargs)\n\n def __getattr__(self, name):\n if 'wrapped' not in vars(self):\n raise AttributeError()\n return getattr(self.wrapped, name)\n\n return _DescriptorWrapper(descriptor)\n\n\n# Base Module definition.\n# -----------------------------------------------------------------------------\n\n\ndef module_field(*, kw_only: bool = False, default: Any | None = ...) -> Any:\n ...\n\n\n# The ModuleBase class is created only to make static analyzers happy\n# mainly pytype and pyright. Some notes:\n# * pyright (correctly) complains that Module itself is not a dataclass, even\n# though all its subclasses and intances ARE dataclasses. Because there is no\n# way to annotate this in a way that pyright understands, we create a\n# ModuleBase class decorated with `dataclass_transform` such that pyright\n# thinks Module is a dataclass (in reality only subclasses are instantiated\n# so this is fine).\n# * The `__dataclass_fields__` attribute is needed because pytype seems to\n# not understand the `dataclass_transform` decorator, therefore we need\n# to add the attribute manually.\n# * Other attributes are annotated for completeness. 
Because we are using\n# the `if typing.TYPE_CHECKING` pattern, these annotations are not present\n# at runtime so they don't affect the dataclass behavior.\n@tpe.dataclass_transform(field_specifiers=(module_field,)) # type: ignore[literal-required]\nclass ModuleBase:\n if typing.TYPE_CHECKING:\n scope: Scope | None\n _state: _ModuleInternalState\n _parent_ref: Union['Module', weakref.ReferenceType['Module'], None]\n __dataclass_fields__: dict[str, dataclasses.Field]\n\n\nclass Module(ModuleBase):\n """"""Base class for all neural network modules.\n\n Layers and models should subclass this class.\n\n All Flax Modules are Python 3.7\n `dataclasses `_. Since\n dataclasses take over ``__init__``, you should instead override :meth:`setup`,\n which is automatically called to initialize the module.\n\n Modules can contain submodules, and in this way can be nested in a tree\n structure. Submodels can be assigned as regular attributes inside the\n :meth:`setup` method.\n\n You can define arbitrary ""forward pass"" methods on your Module subclass.\n While no methods are special-cased, ``__call__`` is a popular choice because\n it allows you to use module instances as if they are functions::\n\n >>> from flax import linen as nn\n >>> from typing import Tuple\n\n >>> class Module(nn.Module):\n ... features: Tuple[int, ...] = (16, 4)\n\n ... def setup(self):\n ... self.dense1 = nn.Dense(self.features[0])\n ... self.dense2 = nn.Dense(self.features[1])\n\n ... def __call__(self, x):\n ... 
return self.dense2(nn.relu(self.dense1(x)))\n\n Optionally, for more concise module implementations where submodules\n definitions are co-located with their usage, you can use the\n :meth:`compact` wrapper.\n """"""\n\n if typing.TYPE_CHECKING:\n name: str | None = module_field(kw_only=True, default=None)\n parent: Union['Module', _Sentinel, None] = module_field(\n kw_only=True, default=None\n )\n\n def __init__(self, *args, **kwargs):\n # this stub makes sure pytype accepts constructor arguments.\n pass\n\n def __call__(self, *args, **kwargs) -> Any:\n # this stub allows pytype to accept Modules as Callables.\n pass\n\n @classmethod\n def __init_subclass__(cls, kw_only: bool = False, **kwargs: Any) -> None:\n """"""Automatically initializes all subclasses as custom dataclasses.""""""\n super().__init_subclass__(**kwargs)\n # All Flax Modules are dataclasses. We force this convention since\n # it encourages the stateless behavior needed to clone module instances for\n # functional transformation. Instead of using a python metaclass, we\n # automatically transform Modules into dataclasses at subclass creation\n # time, and we set the last dataclass arguments to `parent` and `name`.\n cls._customized_dataclass_transform(kw_only)\n # We wrap user-defined methods including setup and __call__ to enforce\n # a number of different checks and to provide clear error messages.\n cls._find_compact_name_scope_methods()\n cls._wrap_module_attributes()\n # Set empty class defaults.\n cls._state = _uninitialized_module_internal_state # type: ignore[attr-defined]\n cls.scope: Scope | None = None # type: ignore\n # Handles weak referencing of parent Modules to prevent reference cycles.\n cls._parent_ref = None # type: ignore[attr-defined]\n cls.parent = ParentDescriptor() # type: ignore[assignment]\n\n @classmethod\n def _customized_dataclass_transform(cls, kw_only: bool):\n """"""Transforms `cls` into a dataclass, with custom additional behavior.\n\n 1. 
Inject `parent` and `name` fields. (If they are already present,\n then check that they have the expected types.)\n 2. Set compare, hash, and repr to False for non-init fields.\n 3. Generate a hash function (if not provided by cls).\n """"""\n # Check reserved attributes have expected type annotations.\n annotations = dict(cls.__dict__.get('__annotations__', {}))\n if annotations.get('parent', _ParentType) != _ParentType:\n raise errors.ReservedModuleAttributeError(annotations)\n if annotations.get('name', str) not in ('str', str, Optional[str]):\n raise errors.ReservedModuleAttributeError(annotations)\n\n # any non-init field will only be set in setup\n # During __hash__ and __eq__ the field is not set yet\n # so it should not be used in compare, hash or repr.\n for field in annotations:\n field_meta = getattr(cls, field, None)\n if isinstance(field_meta, dataclasses.Field) and not field_meta.init:\n field_meta.compare = False\n field_meta.hash = False\n field_meta.repr = False\n\n extra_fields = [\n (\n 'parent',\n _ParentType,\n kw_only_dataclasses.field(\n repr=False, default=_unspecified_parent, kw_only=True\n ),\n ),\n (\n 'name',\n Optional[str],\n kw_only_dataclasses.field(default=None, kw_only=True),\n ),\n ]\n\n if kw_only:\n if tuple(sys.version_info)[:3] >= (3, 10, 0):\n for (\n name,\n annotation, # pytype: disable=invalid-annotation\n default,\n ) in extra_fields:\n setattr(cls, name, default)\n cls.__annotations__[name] = annotation\n dataclasses.dataclass( # type: ignore[call-overload]\n unsafe_hash='__hash__' not in cls.__dict__,\n repr=False,\n kw_only=True,\n )(cls)\n else:\n raise TypeError('`kw_only` is not available before Py 3.10.')\n else:\n # Now apply dataclass transform (which operates in-place).\n # Do generate a hash function only if not provided by the class.\n kw_only_dataclasses.dataclass(\n cls,\n unsafe_hash='__hash__' not in cls.__dict__,\n repr=False,\n extra_fields=extra_fields,\n ) # pytype: disable=wrong-keyword-args\n\n 
cls.__hash__ = _wrap_hash(cls.__hash__) # type: ignore[method-assign]\n\n @classmethod\n def _find_compact_name_scope_methods(cls):\n """"""Finds all compact_name_scope methods in the class.""""""\n methods = [m[0] for m in inspect.getmembers(cls, predicate=callable)]\n compact_name_scope_fns = tuple(\n method_name\n for method_name in methods\n if hasattr(getattr(cls, method_name), 'compact_name_scope')\n )\n cls._compact_name_scope_methods = compact_name_scope_fns\n\n @classmethod\n def _wrap_module_attributes(cls):\n """"""Wraps user-defined non-inherited methods and descriptors with state\n\n management functions.\n """"""\n # wrap methods\n method_exclusions = [f.name for f in dataclasses.fields(cls)] + [\n '__eq__',\n '__repr__',\n '__init__',\n '__hash__',\n '__post_init__',\n ]\n for key in _get_local_method_names(cls, exclude=method_exclusions):\n method = getattr(cls, key)\n if hasattr(method, 'nowrap'):\n continue\n setattr(cls, key, wrap_method_once(method))\n\n # wrap descriptors\n descriptor_exclusions = [f.name for f in dataclasses.fields(cls)] + [\n 'parent',\n '__dict__',\n ]\n for key in _get_local_descriptor_names(cls, descriptor_exclusions):\n # don't use getattr here, since it will call the descriptor\n descriptor = cls.__dict__[key]\n if hasattr(descriptor, 'nowrap'):\n continue\n setattr(cls, key, wrap_descriptor_once(descriptor))\n return cls\n\n def _call_wrapped_method(self, fun, args, kwargs):\n """"""Calls a wrapped method.\n\n This function is responsible for setting up the thread local state\n correctly before calling the method and cleaning up afterwards.\n This includes storing intermediates, setup of the compact scope,\n and making sure setup is called before any other method.\n\n Args:\n fun: The wrapped method.\n args: Named arguments passed to ``fun``.\n kwargs: Keyword arguments passed to ``fun``.\n\n Returns:\n The results of calling ``fun``.\n """"""\n is_compact_method = hasattr(fun, 'compact')\n fun_name = 
_get_fn_name(fun)\n is_setup_method = fun_name == 'setup'\n add_call_info = not is_setup_method and len(_context.call_info_stack) > 0\n # We lazily call setup() only when needed.\n if is_setup_method:\n if self.scope is None:\n raise errors.CallSetupUnboundModuleError()\n is_recurrent = self._state.in_setup\n self._state.in_setup = True\n else:\n self._try_setup()\n\n if is_compact_method:\n if self.scope is None:\n raise errors.CallCompactUnboundModuleError()\n is_recurrent = self._state.in_compact_method\n self._state.in_compact_method = True\n _context.module_stack.append(self)\n try:\n # get call info\n if add_call_info:\n assert self.scope is not None\n call_index = _context.call_info_stack[-1].get_call_index()\n\n if _global_interceptor_stack:\n run_fun = functools.partial(run_interceptors, fun)\n else:\n run_fun = fun\n\n # call method\n if _use_named_call:\n with jax.named_scope(_derive_profiling_name(self, fun)):\n y = run_fun(self, *args, **kwargs)\n else:\n y = run_fun(self, *args, **kwargs)\n\n if _context.capture_stack:\n filter_fn = _context.capture_stack[-1]\n if filter_fn and filter_fn(self, fun_name):\n self.sow('intermediates', fun_name, y)\n if add_call_info:\n _args, _kwargs, _y = flax.linen.summary._represent_tree(\n (args, kwargs, y)\n )\n _context.call_info_stack[-1].calls.append(\n _CallInfo(\n call_index,\n self.path,\n self.clone(),\n self.scope.rngs,\n self.scope.mutable,\n fun.__name__,\n _args,\n _kwargs,\n _y,\n )\n )\n return y\n finally:\n _context.module_stack.pop()\n if is_compact_method:\n object.__setattr__(self, 'scope', self.scope.rewound())\n # setup or compact calls can be recurrent for example due to super calls\n # resetting the state would cause is compact/setup method\n # to be set to False prematurely.\n if (is_compact_method or is_setup_method) and not is_recurrent:\n self._state.reset()\n\n def __setattr__(self, name: str, val: Any):\n """"""Sets an attribute on this Module.\n\n We overload setattr solely to support 
pythonic naming via assignment of\n submodules in the special :meth:`setup` function::\n\n self.submodule_name = MyModule(...)\n\n We also support lists and other general pytrees, e.g.::\n\n self.submodules = [MyModule0(..), MyModule1(..), ...]\n\n Args:\n name: Attribute to set.\n val: Value of the attribute.\n """"""\n fields = self.__dataclass_fields__ # pytype: disable=attribute-error\n is_dataclass_attr = name in fields and fields[name].init\n\n if not self._state.in_setup:\n if not self._state.is_initialized:\n # Setting attributes before end of Module.__post_init__()\n object.__setattr__(self, name, val)\n return\n else:\n # If the attribute is a python special method, we allow setting it (this\n # is useful e.g. for IPython auto-reload).\n if name.startswith('__'):\n object.__setattr__(self, name, val)\n return\n # We're past all initialization and setup logic:\n # Raises a TypeError just like frozen python dataclasses.\n raise errors.SetAttributeFrozenModuleError(\n self.__class__.__name__, name, val\n )\n\n # We're inside the setup() method:\n if is_dataclass_attr:\n # These names are specified as dataclass fields. 
They should not be\n # initialized within the setup() method, but can be modified freely\n # before it.\n raise errors.SetAttributeInModuleSetupError()\n\n # Values (that may be variables or submodules) are being defined and\n # attached in setup(), we run some extra logic in that case.\n self._register_submodules(name, val)\n\n def __getattr__(self, name: str) -> Any:\n """"""Call setup() before getting any setup-defined attributes.""""""\n # We don't want to return anything for python copy / pickle methods.\n if name in _UNDEFINED_COPY_PICKLE_METHODS:\n raise AttributeError()\n self._try_setup()\n if name in self.__dict__:\n return self.__dict__[name]\n else:\n msg = f'""{self.__class__.__name__}"" object has no attribute ""{name}"".'\n if self.scope is None:\n msg += (\n f' If ""{name}"" is defined in \'.setup()\', remember these fields '\n ""are only accessible from inside 'init' or 'apply'.""\n )\n raise AttributeError(msg)\n\n def __dir__(self) -> list[str]:\n """"""Call setup() before listing attributes.""""""\n self._try_setup()\n return object.__dir__(self) # type: ignore\n\n def __post_init__(self) -> None:\n # DO NOT REMOVE - Marker for internal logging.\n # In dataclasses, __init__ is overridden to process dataclass arguments,\n # and __post_init__ is called immediately afterwards. Here, depending on the\n # type of `parent` passed to initialize the Module, we either defer\n # initialization, attach this Module as a submodule of a parent, or bind\n # this Module at the top-level to variables and rngs.\n\n object.__setattr__(self, '_id', uuid())\n object.__setattr__(self, '_state', _ModuleInternalState())\n\n # Typically we set the parent based on the dynamic module context.\n if self.parent is _unspecified_parent: # pytype: disable=attribute-error\n object.__setattr__(self, 'parent', _context.module_stack[-1])\n\n # Initialization is deferred for top level Modules or any other ""orphan""\n # Modules until attachment by __setattr__ i.e. 
MyModule(..., parent=None)\n if self.parent is None:\n return\n\n # Register submodule on parent Module.\n if isinstance(self.parent, Module):\n # When initializing an unnamed Module inside setup()\n # initialization is deferred until attachment by __setattr__\n # i.e. self.mymodule = MyModule(...)\n self.name: str | None\n if (\n self.parent._state.in_setup and self.name is None\n ): # pytype: disable=attribute-error\n return\n if not self.parent._initialization_allowed:\n raise errors.AssignSubModuleError(self.__class__.__name__)\n # Autonaming of submodules.\n if self.name is None: # pytype: disable=attribute-error\n prefix = f'{self.__class__.__name__}'\n cursor = self.parent._state.autoname_cursor.get(prefix, 0)\n self.name = f'{prefix}_{cursor}'\n self.parent._state.autoname_cursor[prefix] = cursor + 1\n # Allow scope aliasing under transforms for submodules defined in setup.\n reuse_scopes = (\n self.parent._state.in_setup\n and self.parent._state.setup_called == SetupState.TRANSFORMED\n )\n # Perform name-collision check.\n if self.parent._name_taken(self.name, reuse_scopes=reuse_scopes):\n parent_class = self.parent.__class__.__name__\n raise errors.NameInUseError('submodule', self.name, parent_class)\n # Finalize attachment to parent and scope initialization.\n self.parent._state.children[self.name] = self\n assert self.parent.scope is not None\n object.__setattr__(\n self, 'scope', self.parent.scope.push(self.name, reuse=reuse_scopes)\n )\n\n # Top-level invocation with a functional Scope.\n elif isinstance(self.parent, Scope):\n object.__setattr__(self, 'scope', self.parent)\n else:\n raise ValueError('parent must be None, Module or Scope')\n\n # eagerly bind submodules if scope is available\n if self.scope is not None:\n for field in dataclasses.fields(self):\n if field.name not in ('parent', 'name') and field.init:\n self._register_submodules(field.name, getattr(self, field.name))\n\n self._state.is_initialized = True\n\n def __repr__(self) -> str:\n 
return _module_repr(self)\n\n def setup(self) -> None:\n """"""Initializes a Module lazily (similar to a lazy ``__init__``).\n\n ``setup`` is called once lazily on a module instance when a module\n is bound, immediately before any other methods like ``__call__`` are\n invoked, or before a ``setup``-defined attribute on ``self`` is accessed.\n\n This can happen in three cases:\n\n 1. Immediately when invoking :meth:`apply`, :meth:`init` or\n :meth:`init_with_output`.\n\n 2. Once the module is given a name by being assigned to an attribute of\n another module inside the other module's ``setup`` method\n (see :meth:`__setattr__`)::\n\n >>> class MyModule(nn.Module):\n ... def setup(self):\n ... submodule = nn.Conv(...)\n\n ... # Accessing `submodule` attributes does not yet work here.\n\n ... # The following line invokes `self.__setattr__`, which gives\n ... # `submodule` the name ""conv1"".\n ... self.conv1 = submodule\n\n ... # Accessing `submodule` attributes or methods is now safe and\n ... # causes setup() to be called once.\n\n 3. 
Once a module is constructed inside a method wrapped with\n :meth:`compact`, immediately before another method is called or\n ``setup`` defined attribute is accessed.\n """"""\n pass\n\n def _register_submodules(self, name, val):\n """"""Registers a submodule.""""""\n assert self.scope, 'Trying to register submodules on unbound scope.'\n root = self.scope.root\n cache = _caches.get(root, weakref.WeakValueDictionary())\n _caches[root] = cache\n queue = []\n preserve_adopted_names = config.flax_preserve_adopted_names\n if hasattr(type(self), 'preserve_adopted_names'):\n preserve_adopted_names = type(self).preserve_adopted_names\n\n def adopt_attr_modules(cache, queue, suffix, subvalue):\n if isinstance(subvalue, Module):\n current_name = subvalue.name\n adopted_name = None\n if subvalue.parent is None:\n # Preserve sharing-by-reference relationships during adoption\n # via cache keyed on unique instance ids.\n key = subvalue._id\n # Module was passed from outside. It needs to be cloned.\n # Outside modules are named by attachment, not an outer name,\n # UNLESS we're using new adopted name policy, in which case an existing\n # name will be used, as is often supplied by config systems.\n if preserve_adopted_names:\n adopted_name = object.__getattribute__(subvalue, 'name')\n if key in cache:\n subvalue = cache[key]\n else:\n subvalue = subvalue.clone(name=None)\n cache[key] = subvalue\n if subvalue.name is None:\n object.__setattr__(subvalue, 'parent', self)\n if adopted_name is None:\n adopted_name = (\n f'{name}{suffix}'\n if not isinstance(subvalue, CompactNameScope)\n else current_name\n )\n object.__setattr__(subvalue, 'name', adopted_name)\n queue.append(subvalue)\n return subvalue\n\n val = _freeze_attr(\n _map_over_modules_in_tree(\n functools.partial(adopt_attr_modules, cache, queue), val\n )\n )\n object.__setattr__(self, name, val)\n for x in queue:\n x.__post_init__()\n\n def _try_setup(self, shallow: bool = False) -> None:\n """"""Tries to setup module if 
scope is available and setup has not been called yet.""""""\n if (\n self.scope\n and not self._state.in_setup\n and self._state.setup_called != SetupState.DONE\n ):\n try:\n self._state.in_setup = True\n # A shallow setup will only register attribute submodules but it does\n # not call the user's setup. This avoids running before a\n # transformation.\n for field in dataclasses.fields(self):\n if field.name not in ('parent', 'name') and field.init:\n self._register_submodules(field.name, getattr(self, field.name))\n if not shallow:\n self.setup()\n # create NonTransparent Modules\n self._compact_name_scope_modules = {\n name: CompactNameScope(\n getattr(type(self), name).inner_fun, lambda: self, name=name\n )\n for name in self._compact_name_scope_methods\n }\n\n # We run static checks abstractly once for setup before any transforms\n # to detect name collisions and other python errors.\n elif self._state.setup_called == SetupState.NEW:\n self._validate_setup()\n finally:\n self._state.in_setup = False\n if not shallow:\n self._state.setup_called = SetupState.DONE\n\n def _validate_setup(self) -> None:\n """"""Abstractly evaluates setup only to run static checks.""""""\n\n def run_setup_only(x):\n wrapped_id = wrap_method_once(lambda m, x: x)\n with TestScope({}, rngs={}, mutable=True).temporary() as root:\n return wrapped_id(self.clone(parent=root), x)\n\n _ = jax.eval_shape(run_setup_only, 0)\n\n def _name_taken(\n self,\n name: str,\n reuse_scopes: bool = False,\n collection: str | None = None,\n ) -> bool:\n assert self.scope is not None\n if reuse_scopes:\n return False\n return self.scope.name_reserved(name, collection)\n\n @property\n def _initialization_allowed(self):\n return (\n not self._state.is_initialized # allow eager attachment in post-init\n or self._state.in_setup\n or self._state.in_compact_method\n )\n\n @property\n def path(self):\n """"""Get the path of this Module. 
Top-level root modules have an empty path ``()``.\n Note that this method can only be used on bound modules that have a valid scope.\n\n Example usage::\n\n >>> import flax.linen as nn\n >>> import jax, jax.numpy as jnp\n\n >>> class SubModel(nn.Module):\n ... @nn.compact\n ... def __call__(self, x):\n ... print(f'SubModel path: {self.path}')\n ... return x\n\n >>> class Model(nn.Module):\n ... @nn.compact\n ... def __call__(self, x):\n ... print(f'Model path: {self.path}')\n ... return SubModel()(x)\n\n >>> model = Model()\n >>> variables = model.init(jax.random.key(0), jnp.ones((1, 2)))\n Model path: ()\n SubModel path: ('SubModel_0',)\n """"""\n\n if self.scope is None:\n raise ValueError(""Can't access module paths on unbound modules."")\n\n return self.scope.path\n\n def clone(\n self: M,\n *,\n parent: Union[Scope, 'Module', _Sentinel] | None = None,\n _deep_clone: bool | weakref.WeakValueDictionary = False,\n _reset_names: bool = False,\n **updates,\n ) -> M:\n """"""Creates a clone of this Module, with optionally updated arguments.\n\n NOTE: end users are encouraged to use the ``copy`` method. ``clone`` is used\n primarily for internal routines, and ``copy`` offers simpler arguments and\n better defaults.\n\n Args:\n parent: The parent of the clone. The clone will have no parent if no\n explicit parent is specified.\n _deep_clone: A boolean or a weak value dictionary to control deep cloning\n of submodules. If True, submodules will be cloned recursively. If a weak\n value dictionary is passed, it will be used to cache cloned submodules.\n This flag is used by init/apply/bind to avoid scope leakage.\n _reset_names: If True, ``name=None`` is also passed to submodules when\n cloning. 
Resetting names in submodules is necessary when calling ``.unbind``.\n **updates: Attribute updates.\n\n Returns:\n A clone of this Module with the updated attributes and parent.\n """"""\n attrs = {\n f.name: getattr(self, f.name) for f in dataclasses.fields(self) if f.init\n }\n\n attrs.update(parent=parent, **updates)\n\n # Here we implement deep cloning of submodules; this is necessary to avoid scope leakage\n # from external submodules into init/apply/bind while preserving sharing-by-reference\n # relationships between submodules.\n if _deep_clone != False:\n # We use a weak value dictionary to cache cloned submodules. When a shared\n # submodule is cloned, it is cloned only once; otherwise it is fetched from the cache.\n cache = (\n weakref.WeakValueDictionary()\n if isinstance(_deep_clone, bool)\n else _deep_clone\n )\n\n def clone_fn(m: Module) -> Module:\n if hasattr(m, '_id'):\n key = m._id\n if key in cache:\n return cache[key]\n else:\n if _reset_names:\n clone = m.clone(\n _deep_clone=cache, _reset_names=_reset_names, name=None\n )\n else:\n clone = m.clone(_deep_clone=cache)\n cache[key] = clone\n return clone\n else:\n # If the module doesn't have an _id attribute it could be a mock object\n # so we return it as is.\n return m\n\n # _map_submodules will map over all submodules inside attrs\n # value here can be any pytree, non-module values are ignored\n for field_name, value in attrs.items():\n if field_name == 'parent':\n continue\n attrs[field_name] = _map_submodules(clone_fn, value)\n\n module = self.__class__(**attrs)\n\n return module\n\n def copy(\n self: M,\n *,\n parent: Union[Scope, 'Module', _Sentinel] | None = _unspecified_parent,\n name: str | None = None,\n **updates,\n ) -> M:\n """"""Creates a copy of this Module, with optionally updated arguments.\n\n Args:\n parent: The parent of the copy. 
By default the current module is taken\n as parent if not explicitly specified.\n name: A new name for the copied Module, by default a new automatic name\n will be given.\n **updates: Attribute updates.\n\n Returns:\n A copy of this Module with the updated name, parent, and attributes.\n """"""\n return self.clone(\n parent=parent, name=name, _deep_clone=True, _reset_names=False, **updates\n )\n\n @overload\n def variable(\n self,\n col: str,\n name: str,\n init_fn: Callable[..., T] | None = None,\n *init_args,\n ) -> Variable[T]:\n ...\n\n @overload\n def variable(\n self,\n col: str,\n name: str,\n init_fn: Callable[..., T] | None = None,\n *init_args,\n unbox: Literal[True],\n **init_kwargs,\n ) -> Variable[T]:\n ...\n\n @overload\n def variable(\n self,\n col: str,\n name: str,\n init_fn: Callable[..., T] | None = None,\n *init_args,\n unbox: Literal[False],\n **init_kwargs,\n ) -> Variable[meta.AxisMetadata[T]]:\n ...\n\n @overload\n def variable(\n self,\n col: str,\n name: str,\n init_fn: Callable[..., T] | None = None,\n *init_args,\n unbox: bool = True,\n **init_kwargs,\n ) -> Variable[T] | Variable[meta.AxisMetadata[T]]:\n ...\n\n def variable(\n self,\n col: str,\n name: str,\n init_fn: Callable[..., T] | None = None,\n *init_args,\n unbox: bool = True,\n **init_kwargs,\n ) -> Variable[T] | Variable[meta.AxisMetadata[T]]:\n """"""Declares and returns a variable in this Module.\n\n See :mod:`flax.core.variables` for more information. See also :meth:`param`\n for a shorthand way to define read-only variables in the ""params""\n collection.\n\n Contrary to :meth:`param`, all arguments that ``init_fn`` takes must be\n passed explicitly::\n\n >>> class Foo(nn.Module):\n ... @nn.compact\n ... def __call__(self, x):\n ... x = nn.Dense(4)(x)\n ... key = self.make_rng('stats')\n ... mean = self.variable('stats', 'mean', nn.initializers.lecun_normal(), key, x.shape)\n ... ...\n ... 
return x * mean.value\n >>> variables = Foo().init({'params': jax.random.key(0), 'stats': jax.random.key(1)}, jnp.ones((2, 3)))\n >>> jax.tree_util.tree_map(jnp.shape, variables)\n {'params': {'Dense_0': {'bias': (4,), 'kernel': (3, 4)}}, 'stats': {'mean': (2, 4)}}\n\n In the example above, the function ``lecun_normal`` expects two arguments:\n ``key`` and ``shape``, and both have to be passed on. The PRNG for ``stats``\n has to be provided explicitly when calling :meth:`init` and :meth:`apply`.\n\n Args:\n col: The variable collection name.\n name: The variable name.\n init_fn: The function that will be called to compute the initial value of\n this variable. This function will only be called the first time this\n variable is used in this module. If None, the variable must already be\n initialized, otherwise an error is raised.\n *init_args: The positional arguments to pass to init_fn.\n unbox: If True, ``AxisMetadata`` instances are replaced by their unboxed\n value, see ``flax.nn.meta.unbox`` (default: True).\n **init_kwargs: The keyword arguments to pass to init_fn.\n\n Returns:\n A :class:`flax.core.variables.Variable` that can be read or set via\n "".value"" attribute. 
Throws an error if the variable exists already.\n """"""\n if not self._initialization_allowed:\n raise ValueError(\n 'Variables must be initialized in `setup()` or in a method '\n 'wrapped in `@compact`'\n )\n if self._name_taken(name, collection=col):\n raise errors.NameInUseError('variable', name, self.__class__.__name__)\n assert self.scope is not None\n v = self.scope.variable(\n col, name, init_fn, *init_args, unbox=unbox, **init_kwargs\n )\n self._state.children[name] = col\n return v\n\n @overload\n def param(\n self, name: str, init_fn: Callable[..., T], *init_args,\n ) -> T:\n ...\n\n @overload\n def param(\n self,\n name: str,\n init_fn: Callable[..., T],\n *init_args,\n unbox: Literal[True],\n **init_kwargs,\n ) -> T:\n ...\n\n @overload\n def param(\n self,\n name: str,\n init_fn: Callable[..., T],\n *init_args,\n unbox: Literal[False],\n **init_kwargs,\n ) -> meta.AxisMetadata[T]:\n ...\n\n @overload\n def param(\n self,\n name: str,\n init_fn: Callable[..., T],\n *init_args,\n unbox: bool,\n **init_kwargs,\n ) -> T | meta.AxisMetadata[T]:\n ...\n\n def param(\n self,\n name: str,\n init_fn: Callable[..., T],\n *init_args,\n unbox: bool = True,\n **init_kwargs,\n ) -> T | meta.AxisMetadata[T]:\n """"""Declares and returns a parameter in this Module.\n\n Parameters are read-only variables in the collection named ""params"". See\n :mod:`flax.core.variables` for more details on variables.\n\n The first argument of ``init_fn`` is assumed to be a PRNG key, which is\n provided automatically and does not have to be passed using ``init_args``\n or ``init_kwargs``::\n\n >>> class Foo(nn.Module):\n ... @nn.compact\n ... def __call__(self, x):\n ... x = nn.Dense(4)(x)\n ... mean = self.param('mean', nn.initializers.lecun_normal(), x.shape)\n ... ...\n ... 
return x * mean\n >>> variables = Foo().init({'params': jax.random.key(0), 'stats': jax.random.key(1)}, jnp.ones((2, 3)))\n >>> jax.tree_util.tree_map(jnp.shape, variables)\n {'params': {'Dense_0': {'bias': (4,), 'kernel': (3, 4)}, 'mean': (2, 4)}}\n\n In the example above, the function ``lecun_normal`` expects two arguments:\n ``key`` and ``shape``, but only ``shape`` has to be provided explicitly;\n ``key`` is set automatically using the PRNG for ``params`` that is passed\n when initializing the module using :meth:`init`.\n\n Args:\n name: The parameter name.\n init_fn: The function that will be called to compute the initial value of\n this variable. This function will only be called the first time this\n parameter is used in this module.\n *init_args: The positional arguments to pass to init_fn.\n unbox: If True, ``AxisMetadata`` instances are replaced by their unboxed\n value, see ``flax.nn.meta.unbox`` (default: True).\n **init_kwargs: The keyword arguments to pass to init_fn.\n\n Returns:\n The value of the initialized parameter. 
Throws an error if the parameter\n exists already.\n """"""\n if not self._initialization_allowed:\n raise ValueError(\n 'Parameters must be initialized in `setup()` or in a method '\n 'wrapped in `@compact`'\n )\n if self._name_taken(name, collection='params'):\n raise errors.NameInUseError('param', name, self.__class__.__name__)\n assert self.scope is not None\n v = self.scope.param(name, init_fn, *init_args, unbox=unbox, **init_kwargs)\n self._state.children[name] = 'params'\n return v\n\n def has_variable(self, col: str, name: str) -> bool:\n """"""Checks if a variable of given collection and name exists in this Module.\n\n See :mod:`flax.core.variables` for more explanation on variables and\n collections.\n\n Args:\n col: The variable collection name.\n name: The name of the variable.\n\n Returns:\n True if the variable exists.\n """"""\n if self.scope is None:\n raise ValueError(""Can't access variables on unbound modules"")\n return self.scope.has_variable(col, name)\n\n def is_mutable_collection(self, col: str) -> bool:\n """"""Returns true if the collection ``col`` is mutable.""""""\n if self.scope is None:\n raise ValueError(""Can't check mutability on unbound modules"")\n return self.scope.is_mutable_collection(col)\n\n def has_rng(self, name: str) -> bool:\n """"""Returns true if a PRNGSequence with name ``name`` exists.""""""\n if self.scope is None:\n raise ValueError(""Can't query for RNGs on unbound modules"")\n return self.scope.has_rng(name)\n\n def make_rng(self, name: str = 'params') -> PRNGKey:\n """"""Returns a new RNG key from a given RNG sequence for this Module.\n\n The new RNG key is split from the previous one. Thus, every call to\n ``make_rng`` returns a new RNG key, while still guaranteeing full\n reproducibility.\n\n .. note::\n If an invalid name is passed (i.e. 
no RNG key was passed by\n the user in ``.init`` or ``.apply`` for this name), then ``name``\n will default to ``'params'``.\n\n Example::\n\n >>> import jax\n >>> import flax.linen as nn\n\n >>> class ParamsModule(nn.Module):\n ... def __call__(self):\n ... return self.make_rng('params')\n >>> class OtherModule(nn.Module):\n ... def __call__(self):\n ... return self.make_rng('other')\n\n >>> key = jax.random.key(0)\n >>> params_out, _ = ParamsModule().init_with_output({'params': key})\n >>> # self.make_rng('other') will default to using the 'params' RNG stream\n >>> other_out, _ = OtherModule().init_with_output({'params': key})\n >>> assert params_out == other_out\n\n Learn more about RNGs by reading the Flax RNG guide:\n https://flax.readthedocs.io/en/latest/guides/flax_fundamentals/rng_guide.html\n\n Args:\n name: The RNG sequence name.\n\n Returns:\n The newly generated RNG key.\n """"""\n if self.scope is None:\n raise ValueError(""Can't use RNGs on unbound modules"")\n return self.scope.make_rng(name)\n\n def is_initializing(self) -> bool:\n """"""Returns True if running under self.init(...) or nn.init(...)().\n\n This is a helper method to handle the common case of simple initialization\n where we wish to have setup logic occur only when called under\n ``module.init`` or ``nn.init``. 
For more complicated multi-phase\n initialization scenarios it is better to test for the mutability of\n particular variable collections or for the presence of particular\n variables that potentially need to be initialized.\n """"""\n if self.scope is None:\n raise ValueError(""Can't check if running under init() on unbound modules"")\n return self.scope.get_flag('initializing', False)\n\n def _module_checks(self):\n """"""Run standard runtime checks.""""""\n\n if not isinstance(self, Module):\n raise errors.InvalidInstanceModuleError()\n\n overridden_post_init = self.__post_init__ != Module.__post_init__\n if overridden_post_init and not hasattr(self, '_id'):\n raise errors.IncorrectPostInitOverrideError()\n\n @traceback_util.api_boundary\n def bind(\n self: M,\n variables: VariableDict,\n *args,\n rngs: RNGSequences | None = None,\n mutable: CollectionFilter = False,\n ) -> M:\n """"""Creates an interactive Module instance by binding variables and RNGs.\n\n ``bind`` provides an ""interactive"" instance of a Module directly without\n transforming a function with ``apply``. This is particularly useful for\n debugging and interactive use cases like notebooks where a function would\n limit the ability to split up code into different cells.\n\n Once the variables (and optionally RNGs) are bound to a ``Module`` it\n becomes a stateful object. Note that idiomatic JAX is functional and\n therefore an interactive instance does not mix well with vanilla JAX APIs.\n ``bind()`` should only be used for interactive experimentation, and in all\n other cases we strongly encourage users to use ``apply()`` instead.\n\n Example::\n\n >>> import jax\n >>> import jax.numpy as jnp\n >>> import flax.linen as nn\n\n >>> class AutoEncoder(nn.Module):\n ... def setup(self):\n ... self.encoder = nn.Dense(3)\n ... self.decoder = nn.Dense(5)\n ...\n ... def __call__(self, x):\n ... 
return self.decoder(self.encoder(x))\n\n >>> x = jnp.ones((16, 9))\n >>> ae = AutoEncoder()\n >>> variables = ae.init(jax.random.key(0), x)\n >>> model = ae.bind(variables)\n >>> z = model.encoder(x)\n >>> x_reconstructed = model.decoder(z)\n\n Args:\n variables: A dictionary containing variables keyed by variable\n collections. See :mod:`flax.core.variables` for more details about\n variables.\n *args: Positional arguments (not used).\n rngs: A dict of PRNGKeys to initialize the PRNG sequences.\n mutable: Can be bool, str, or list. Specifies which collections should be\n treated as mutable: ``bool``: all/no collections are mutable. ``str``:\n The name of a single mutable collection. ``list``: A list of names of\n mutable collections.\n\n Returns:\n A copy of this instance with bound variables and RNGs.\n """"""\n Module._module_checks(self)\n\n del args\n scope = core.bind(variables, rngs=rngs, mutable=mutable)\n return self.clone(parent=scope, _deep_clone=True)\n\n def unbind(self: M) -> tuple[M, VariableDict]:\n """"""Returns an unbound copy of a Module and its variables.\n\n ``unbind`` helps create a stateless version of a bound Module.\n\n An example of a common use case: to extract a sub-Module defined inside\n ``setup()`` and its corresponding variables: 1) temporarily ``bind`` the\n parent Module; and then 2) ``unbind`` the desired sub-Module. (Recall that\n ``setup()`` is only called when the Module is bound.)::\n\n >>> class Encoder(nn.Module):\n ... @nn.compact\n ... def __call__(self, x):\n ... ...\n ... return nn.Dense(256)(x)\n\n >>> class Decoder(nn.Module):\n ... @nn.compact\n ... def __call__(self, x):\n ... ...\n ... return nn.Dense(784)(x)\n\n >>> class AutoEncoder(nn.Module):\n ... def setup(self):\n ... self.encoder = Encoder()\n ... self.decoder = Decoder()\n ...\n ... def __call__(self, x):\n ... 
return self.decoder(self.encoder(x))\n\n >>> module = AutoEncoder()\n >>> variables = module.init(jax.random.key(0), jnp.ones((1, 784)))\n\n >>> # Extract the Encoder sub-Module and its variables\n >>> encoder, encoder_vars = module.bind(variables).encoder.unbind()\n\n Returns:\n A tuple with an unbound copy of this Module and its variables.\n """"""\n Module._module_checks(self)\n\n if self.scope is None:\n raise errors.CallUnbindOnUnboundModuleError()\n\n variables = self.variables\n module = self.clone(_deep_clone=True, _reset_names=True, name=None)\n return module, variables\n\n @traceback_util.api_boundary\n def apply(\n self,\n variables: VariableDict,\n *args,\n rngs: PRNGKey | RNGSequences | None = None,\n method: Callable[..., Any] | str | None = None,\n mutable: CollectionFilter = False,\n capture_intermediates: bool | Callable[['Module', str], bool] = False,\n **kwargs,\n ) -> Any | tuple[Any, FrozenVariableDict | dict[str, Any]]:\n """"""Applies a module method to variables and returns output and modified variables.\n\n Note that ``method`` should be set if one would like to call ``apply`` on a\n different class method than ``__call__``. For instance, suppose a\n Transformer module has a method called ``encode``, then the following calls\n ``apply`` on that method::\n\n >>> import flax.linen as nn\n >>> import jax, jax.numpy as jnp\n >>> import numpy as np\n\n >>> class Transformer(nn.Module):\n ... def encode(self, x):\n ... ...\n\n >>> x = jnp.ones((16, 9))\n >>> model = Transformer()\n >>> variables = model.init(jax.random.key(0), x, method=Transformer.encode)\n\n >>> encoded = model.apply(variables, x, method=Transformer.encode)\n\n If a function instance is provided, the unbound function is used. For\n instance, the example below is equivalent to the one above::\n\n >>> encoded = model.apply(variables, x, method=model.encode)\n\n You can also pass the name of a callable attribute of the module as a string. 
For\n example, the previous can be written as::\n\n >>> encoded = model.apply(variables, x, method='encode')\n\n Note ``method`` can also be a function that is not defined in\n ``Transformer``. In that case, the function should have at least one\n argument representing an instance of the Module class::\n\n >>> def other_fn(instance, x):\n ... # instance.some_module_attr(...)\n ... instance.encode\n ... ...\n\n >>> model.apply(variables, x, method=other_fn)\n\n If you pass a single ``PRNGKey``, Flax will use it to feed the ``'params'``\n RNG stream. If you want to use a different RNG stream or need to use\n multiple streams, you can pass a dictionary mapping each RNG stream name\n to its corresponding ``PRNGKey`` to ``apply``. If ``self.make_rng(name)``\n is called on an RNG stream name that isn't passed by the user, it will\n default to using the ``'params'`` RNG stream.\n\n Example::\n\n >>> class Foo(nn.Module):\n ... @nn.compact\n ... def __call__(self, x, add_noise=False):\n ... x = nn.Dense(16)(x)\n ... x = nn.relu(x)\n ...\n ... if add_noise:\n ... # Add gaussian noise\n ... noise_key = self.make_rng('noise')\n ... x = x + jax.random.normal(noise_key, x.shape)\n ...\n ... 
return nn.Dense(1)(x)\n\n >>> x = jnp.empty((1, 7))\n >>> module = Foo()\n >>> rngs = {'params': jax.random.key(0), 'noise': jax.random.key(1)}\n >>> variables = module.init(rngs, x)\n >>> out0 = module.apply(variables, x, add_noise=True, rngs=rngs)\n\n >>> rngs['noise'] = jax.random.key(0)\n >>> out1 = module.apply(variables, x, add_noise=True, rngs=rngs)\n >>> # different output (key(1) vs key(0))\n >>> np.testing.assert_raises(AssertionError, np.testing.assert_allclose, out0, out1)\n\n >>> del rngs['noise']\n >>> # self.make_rng('noise') will default to using the 'params' RNG stream\n >>> out2 = module.apply(variables, x, add_noise=True, rngs=rngs)\n >>> # same output (key(0))\n >>> np.testing.assert_allclose(out1, out2)\n\n >>> # passing in a single key is equivalent to passing in {'params': key}\n >>> out3 = module.apply(variables, x, add_noise=True, rngs=jax.random.key(0))\n >>> # same output (key(0))\n >>> np.testing.assert_allclose(out2, out3)\n\n Args:\n variables: A dictionary containing variables keyed by variable\n collections. See :mod:`flax.core.variables` for more details about\n variables.\n *args: Positional arguments passed to the specified apply method.\n rngs: A dict of PRNGKeys to initialize the PRNG sequences. The ""params""\n PRNG sequence is used to initialize parameters.\n method: A function to call apply on. This is generally a function in the\n module. If provided, applies this method. If not provided, applies the\n ``__call__`` method of the module. A string can also be provided to\n specify a method by name.\n mutable: Can be bool, str, or list. Specifies which collections should be\n treated as mutable: ``bool``: all/no collections are mutable. ``str``:\n The name of a single mutable collection. ``list``: A list of names of\n mutable collections.\n capture_intermediates: If ``True``, captures intermediate return values of\n all Modules inside the ""intermediates"" collection. 
By default, only the\n return values of all ``__call__`` methods are stored. A function can be\n passed to change the filter behavior. The filter function takes the\n Module instance and method name and returns a bool indicating whether\n the output of that method invocation should be stored.\n **kwargs: Keyword arguments passed to the specified apply method.\n\n Returns:\n If ``mutable`` is False, returns output. If any collections are\n mutable, returns ``(output, vars)``, where ``vars`` is a dict\n of the modified collections.\n """"""\n Module._module_checks(self)\n\n if rngs is not None and not isinstance(rngs, dict):\n if not core.scope._is_valid_rng(rngs):\n raise errors.InvalidRngError(\n 'RNGs should be of shape (2,) or PRNGKey in Module '\n f'{self.__class__.__name__}, but rngs are: {rngs}'\n )\n rngs = {'params': rngs}\n\n if isinstance(method, str):\n attribute_name = method\n method = getattr(self, attribute_name)\n if not callable(method):\n class_name = type(self).__name__\n raise TypeError(\n f""'{class_name}.{attribute_name}' must be a callable, got""\n f' {type(method)}.'\n )\n # if the `method` string is a submodule, we create a lambda function\n # that calls the submodule, forwarding all arguments.\n if isinstance(method, Module):\n method = lambda self, *args, **kwargs: getattr(self, attribute_name)(\n *args, **kwargs\n )\n elif method is None:\n method = self.__call__\n method = _get_unbound_fn(method)\n return apply(\n method,\n self,\n mutable=mutable,\n capture_intermediates=capture_intermediates,\n )(variables, *args, **kwargs, rngs=rngs)\n\n @traceback_util.api_boundary\n def init_with_output(\n self,\n rngs: PRNGKey | RNGSequences,\n *args,\n method: Callable[..., Any] | str | None = None,\n mutable: CollectionFilter = DenyList('intermediates'),\n capture_intermediates: bool | Callable[['Module', str], bool] = False,\n **kwargs,\n ) -> tuple[Any, FrozenVariableDict | dict[str, Any]]:\n """"""Initializes a module method with variables 
and returns output and modified variables.\n\n Args:\n rngs: The rngs for the variable collections.\n *args: Positional arguments passed to the init function.\n method: An optional method. If provided, applies this method. If not\n provided, applies the ``__call__`` method. A string can also be\n provided to specify a method by name.\n mutable: Can be bool, str, or list. Specifies which collections should be\n treated as mutable: ``bool``: all/no collections are mutable. ``str``:\n The name of a single mutable collection. ``list``: A list of names of\n mutable collections. By default, all collections except ""intermediates""\n are mutable.\n capture_intermediates: If ``True``, captures intermediate return values of\n all Modules inside the ""intermediates"" collection. By default only the\n return values of all ``__call__`` methods are stored. A function can be\n passed to change the filter behavior. The filter function takes the\n Module instance and method name and returns a bool indicating whether\n the output of that method invocation should be stored.\n **kwargs: Keyword arguments passed to the init function.\n\n Returns:\n ``(output, vars)``, where ``vars`` is a dict of the modified\n collections.\n """"""\n Module._module_checks(self)\n\n if not isinstance(rngs, dict):\n if not core.scope._is_valid_rng(rngs):\n raise errors.InvalidRngError(\n 'RNGs should be of shape (2,) or PRNGKey in Module '\n f'{self.__class__.__name__}, but rngs are: {rngs}'\n )\n rngs = {'params': rngs}\n\n if isinstance(method, str):\n attribute_name = method\n method = getattr(self, attribute_name)\n if not callable(method):\n class_name = type(self).__name__\n raise TypeError(\n f""'{class_name}.{attribute_name}' must be a callable, got""\n f' {type(method)}.'\n )\n elif method is None:\n method = self.__call__\n method = _get_unbound_fn(method)\n return init_with_output(\n method,\n self,\n mutable=mutable,\n capture_intermediates=capture_intermediates,\n )(rngs, *args, 
**kwargs)\n\n @traceback_util.api_boundary\n def init(\n self,\n rngs: PRNGKey | RNGSequences,\n *args,\n method: Callable[..., Any] | str | None = None,\n mutable: CollectionFilter = DenyList('intermediates'),\n capture_intermediates: bool | Callable[['Module', str], bool] = False,\n **kwargs,\n ) -> FrozenVariableDict | dict[str, Any]:\n """"""Initializes a module method with variables and returns modified variables.\n\n ``init`` takes as first argument either a single ``PRNGKey``, or a\n dictionary mapping variable collections names to their ``PRNGKeys``, and\n will call ``method`` (which is the module's ``__call__`` function by\n default) passing ``*args`` and ``**kwargs``, and returns\n a dictionary of initialized variables.\n\n Example::\n\n >>> import flax.linen as nn\n >>> import jax, jax.numpy as jnp\n >>> import numpy as np\n\n >>> class Foo(nn.Module):\n ... @nn.compact\n ... def __call__(self, x, train):\n ... x = nn.Dense(16)(x)\n ... x = nn.BatchNorm(use_running_average=not train)(x)\n ... x = nn.relu(x)\n ... return nn.Dense(1)(x)\n\n >>> x = jnp.empty((1, 7))\n >>> module = Foo()\n >>> key = jax.random.key(0)\n >>> variables = module.init(key, x, train=False)\n\n If you pass a single ``PRNGKey``, Flax will use it to feed the ``'params'``\n RNG stream. If you want to use a different RNG stream or need to use\n multiple streams, you can pass a dictionary mapping each RNG stream name\n to its corresponding ``PRNGKey`` to ``init``. If ``self.make_rng(name)``\n is called on an RNG stream name that isn't passed by the user, it will\n default to using the ``'params'`` RNG stream.\n\n Example::\n\n >>> class Foo(nn.Module):\n ... @nn.compact\n ... def __call__(self, x):\n ... x = nn.Dense(16)(x)\n ... x = nn.relu(x)\n ...\n ... other_variable = self.variable(\n ... 'other_collection',\n ... 'other_variable',\n ... lambda x: jax.random.normal(self.make_rng('other_rng'), x.shape),\n ... x,\n ... )\n ... x = x + other_variable.value\n ...\n ... 
return nn.Dense(1)(x)\n\n >>> module = Foo()\n >>> rngs = {'params': jax.random.key(0), 'other_rng': jax.random.key(1)}\n >>> variables0 = module.init(rngs, x)\n\n >>> rngs['other_rng'] = jax.random.key(0)\n >>> variables1 = module.init(rngs, x)\n >>> # equivalent params (key(0))\n >>> _ = jax.tree_util.tree_map(\n ... np.testing.assert_allclose, variables0['params'], variables1['params']\n ... )\n >>> # different other_variable (key(1) vs key(0))\n >>> np.testing.assert_raises(\n ... AssertionError,\n ... np.testing.assert_allclose,\n ... variables0['other_collection']['other_variable'],\n ... variables1['other_collection']['other_variable'],\n ... )\n\n >>> del rngs['other_rng']\n >>> # self.make_rng('other_rng') will default to using the 'params' RNG stream\n >>> variables2 = module.init(rngs, x)\n >>> # equivalent params (key(0))\n >>> _ = jax.tree_util.tree_map(\n ... np.testing.assert_allclose, variables1['params'], variables2['params']\n ... )\n >>> # equivalent other_variable (key(0))\n >>> np.testing.assert_allclose(\n ... variables1['other_collection']['other_variable'],\n ... variables2['other_collection']['other_variable'],\n ... )\n\n >>> # passing in a single key is equivalent to passing in {'params': key}\n >>> variables3 = module.init(jax.random.key(0), x)\n >>> # equivalent params (key(0))\n >>> _ = jax.tree_util.tree_map(\n ... np.testing.assert_allclose, variables2['params'], variables3['params']\n ... )\n >>> # equivalent other_variable (key(0))\n >>> np.testing.assert_allclose(\n ... variables2['other_collection']['other_variable'],\n ... variables3['other_collection']['other_variable'],\n ... )\n\n Jitting ``init`` initializes a model lazily using only the shapes of the\n provided arguments, and avoids computing the forward pass with actual\n values. 
Example::\n\n >>> module = nn.Dense(1)\n >>> init_jit = jax.jit(module.init)\n >>> variables = init_jit(jax.random.key(0), x)\n\n ``init`` is a light wrapper over ``apply``, so other ``apply`` arguments\n like ``method``, ``mutable``, and ``capture_intermediates`` are also\n available.\n\n Args:\n rngs: The rngs for the variable collections.\n *args: Named arguments passed to the init function.\n method: An optional method. If provided, applies this method. If not\n provided, applies the ``__call__`` method. A string can also be provided\n to specify a method by name.\n mutable: Can be bool, str, or list. Specifies which collections should be\n treated as mutable: ``bool``: all/no collections are mutable. ``str``:\n The name of a single mutable collection. ``list``: A list of names of\n mutable collections. By default all collections except ""intermediates""\n are mutable.\n capture_intermediates: If ``True``, captures intermediate return values of\n all Modules inside the ""intermediates"" collection. By default only the\n return values of all ``__call__`` methods are stored. A function can be\n passed to change the filter behavior. 
The filter function takes the\n Module instance and method name and returns a bool indicating whether\n the output of that method invocation should be stored.\n **kwargs: Keyword arguments passed to the init function.\n\n Returns:\n The initialized variable dict.\n """"""\n Module._module_checks(self)\n\n _, v_out = self.init_with_output(\n rngs,\n *args,\n method=method,\n mutable=mutable,\n capture_intermediates=capture_intermediates,\n **kwargs,\n )\n return v_out\n\n @traceback_util.api_boundary\n def lazy_init(\n self,\n rngs: PRNGKey | RNGSequences,\n *args,\n method: Callable[..., Any] | None = None,\n mutable: CollectionFilter = DenyList('intermediates'),\n **kwargs,\n ) -> FrozenVariableDict:\n """"""Initializes a module without computing on an actual input.\n\n lazy_init will initialize the variables without doing unnecessary compute.\n The input data should be passed as a ``jax.ShapeDtypeStruct`` which\n specifies the shape and dtype of the input but no concrete data.\n\n Example::\n\n >>> model = nn.Dense(features=256)\n >>> variables = model.lazy_init(\n ... jax.random.key(0), jax.ShapeDtypeStruct((1, 128), jnp.float32))\n\n The args and kwargs passed to ``lazy_init`` can be a mix of\n concrete (jax arrays, scalars, bools) and abstract (ShapeDtypeStruct)\n values. Concrete values are only necessary for arguments that affect\n the initialization of variables. For example, the model might expect\n a keyword arg that enables/disables a subpart of the model.\n In this case, an explicit value (True/False) should be passed, otherwise\n ``lazy_init`` cannot infer which variables should be initialized.\n\n Args:\n rngs: The rngs for the variable collections.\n *args: arguments passed to the init function.\n method: An optional method. If provided, applies this method. If not\n provided, applies the ``__call__`` method.\n mutable: Can be bool, str, or list. Specifies which collections should be\n treated as mutable: ``bool``: all/no collections are mutable. 
``str``:\n The name of a single mutable collection. ``list``: A list of names of\n mutable collections. By default all collections except ""intermediates""\n are mutable.\n **kwargs: Keyword arguments passed to the init function.\n\n Returns:\n The initialized variable dict.\n """"""\n Module._module_checks(self)\n\n def lazy_wrapper(rngs, *args, **kwargs):\n return self.init(rngs, *args, method=method, mutable=mutable, **kwargs)\n\n return partial_eval.lazy_init(lazy_wrapper)(rngs, *args, **kwargs)\n\n @property\n def variables(self) -> VariableDict:\n """"""Returns the variables in this module.""""""\n if self.scope is None:\n raise ValueError(""Can't access variables on unbound modules"")\n return self.scope.variables()\n\n def get_variable(self, col: str, name: str, default: T | None = None) -> T:\n """"""Retrieves the value of a Variable.\n\n Args:\n col: the variable collection.\n name: the name of the variable.\n default: the default value to return if the variable does not exist in\n this scope.\n\n Returns:\n The value of the input variable, of the default value if the variable\n doesn't exist in this scope.\n """"""\n if self.scope is None:\n raise ValueError(""Can't access variables on unbound modules"")\n return self.scope.get_variable(col, name, default)\n\n def put_variable(self, col: str, name: str, value: Any):\n """"""Updates the value of the given variable if it is mutable, or an error otherwise.\n\n Args:\n col: the variable collection.\n name: the name of the variable.\n value: the new value of the variable.\n """"""\n if self.scope is None:\n raise ValueError(""Can't access variables on unbound modules"")\n self.scope.put_variable(col, name, value)\n\n @overload\n def sow(self, col: str, name: str, value: Any) -> bool:\n ...\n\n @overload\n def sow(\n self,\n col: str,\n name: str,\n value: T,\n reduce_fn: Callable[[K, T], K] = tuple_reduce,\n init_fn: Callable[[], K] = tuple_init, # type: ignore\n ) -> bool:\n ...\n\n def sow(\n self,\n col: 
str,\n name: str,\n value: T,\n reduce_fn: Callable[[K, T], K] = tuple_reduce,\n init_fn: Callable[[], K] = tuple_init, # type: ignore\n ) -> bool:\n """"""Stores a value in a collection.\n\n Collections can be used to collect intermediate values without\n the overhead of explicitly passing a container through each Module call.\n\n If the target collection is not mutable ``sow`` behaves like a no-op\n and returns ``False``.\n\n Example::\n\n >>> import jax\n >>> import jax.numpy as jnp\n >>> import flax.linen as nn\n\n >>> class Foo(nn.Module):\n ... @nn.compact\n ... def __call__(self, x):\n ... h = nn.Dense(4)(x)\n ... self.sow('intermediates', 'h', h)\n ... return nn.Dense(2)(h)\n\n >>> x = jnp.ones((16, 9))\n >>> model = Foo()\n >>> variables = model.init(jax.random.key(0), x)\n >>> y, state = model.apply(variables, x, mutable=['intermediates'])\n >>> jax.tree.map(jnp.shape, state['intermediates'])\n {'h': ((16, 4),)}\n\n By default the values are stored in a tuple and each stored value\n is appended at the end. This way all intermediates can be tracked when\n the same module is called multiple times. Alternatively, a custom\n init/reduce function can be passed::\n\n >>> class Foo2(nn.Module):\n ... @nn.compact\n ... def __call__(self, x):\n ... init_fn = lambda: 0\n ... reduce_fn = lambda a, b: a + b\n ... self.sow('intermediates', 'h', x,\n ... init_fn=init_fn, reduce_fn=reduce_fn)\n ... self.sow('intermediates', 'h', x * 2,\n ... init_fn=init_fn, reduce_fn=reduce_fn)\n ... return x\n\n >>> x = jnp.ones((1, 1))\n >>> model = Foo2()\n >>> variables = model.init(jax.random.key(0), x)\n >>> y, state = model.apply(\n ... variables, x, mutable=['intermediates'])\n >>> print(state['intermediates'])\n {'h': Array([[3.]], dtype=float32)}\n\n Args:\n col: The name of the variable collection.\n name: The name of the variable.\n value: The value of the variable.\n reduce_fn: The function used to combine the existing value with the new\n value. 
The default is to append the value to a tuple.\n init_fn: For the first value stored, ``reduce_fn`` will be passed the result\n of ``init_fn`` together with the value to be stored. The default is an\n empty tuple.\n\n Returns:\n ``True`` if the value has been stored successfully, ``False`` otherwise.\n """"""\n if self.scope is None:\n raise ValueError(""Can't store variables on unbound modules"")\n if not self.scope.is_mutable_collection(col):\n return False\n if self.scope.has_variable(col, name):\n xs = self.scope.get_variable(col, name)\n else:\n self.scope.reserve(name, col)\n self._state.children[name] = col\n xs = init_fn()\n xs = reduce_fn(xs, value)\n self.scope.put_variable(col, name, xs)\n return True\n\n def perturb(\n self, name: str, value: T, collection: str = 'perturbations'\n ) -> T:\n """"""Add a zero-value variable ('perturbation') to the intermediate value.\n\n The gradient of ``value`` would be the same as the gradient of this\n perturbation variable. Therefore, if you define your loss function with\n both params and perturbations as standalone arguments, you can get the\n intermediate gradients of ``value`` by running ``jax.grad`` on the perturbation\n argument.\n\n .. note::\n This is an experimental API and may be tweaked later for better\n performance and usability.\n At its current stage, it creates extra dummy variables that occupy extra\n memory space. Use it only to debug gradients in training.\n\n Example::\n\n >>> class Foo(nn.Module):\n ... @nn.compact\n ... def __call__(self, x):\n ... x = nn.Dense(3)(x)\n ... x = self.perturb('dense3', x)\n ... return nn.Dense(2)(x)\n\n >>> def loss(variables, inputs, targets):\n ... preds = model.apply(variables, inputs)\n ... 
return jnp.square(preds - targets).mean()\n\n >>> x = jnp.ones((2, 9))\n >>> y = jnp.ones((2, 2))\n >>> model = Foo()\n >>> variables = model.init(jax.random.key(0), x)\n >>> intm_grads = jax.grad(loss, argnums=0)(variables, x, y)\n >>> print(intm_grads['perturbations']['dense3'])\n [[-0.04684732 0.06573904 -0.3194327 ]\n [-0.04684732 0.06573904 -0.3194327 ]]\n\n If perturbations are not passed to ``apply``, ``perturb`` behaves like a no-op\n so you can easily disable the behavior when not needed::\n\n >>> model.apply(variables, x) # works as expected\n Array([[-0.04579116, 0.50412744],\n [-0.04579116, 0.50412744]], dtype=float32)\n >>> model.apply({'params': variables['params']}, x) # behaves like a no-op\n Array([[-0.04579116, 0.50412744],\n [-0.04579116, 0.50412744]], dtype=float32)\n >>> intm_grads = jax.grad(loss, argnums=0)({'params': variables['params']}, x, y)\n >>> 'perturbations' not in intm_grads\n True\n """"""\n if self.scope is None:\n raise ValueError(""Can't store variables on unbound modules"")\n\n if self.is_mutable_collection(collection):\n if not self.scope.has_variable(collection, name):\n self.scope.reserve(name, collection)\n self._state.children[name] = collection\n zeros = jax.tree.map(jnp.zeros_like, value)\n self.scope.put_variable(collection, name, zeros) # type: ignore\n\n if collection in self.scope.root._variables:\n if self.scope.has_variable(collection, name):\n old_value = self.scope.get_variable(collection, name)\n value = jax.tree.map(jnp.add, value, old_value) # type: ignore\n else:\n raise ValueError(f""Perturbation collection {collection} present, but ""\n f""missing perturbation variable {name}"")\n\n return value\n\n def tabulate(\n self,\n rngs: PRNGKey | RNGSequences,\n *args,\n depth: int | None = None,\n show_repeated: bool = False,\n mutable: CollectionFilter = DenyList('intermediates'),\n console_kwargs: Mapping[str, Any] | None = None,\n table_kwargs: Mapping[str, Any] = MappingProxyType({}),\n column_kwargs: 
Mapping[str, Any] = MappingProxyType({}),\n compute_flops: bool = False,\n compute_vjp_flops: bool = False,\n **kwargs,\n ) -> str:\n """"""Creates a summary of the Module represented as a table.\n\n This method has the same signature and internally calls ``Module.init``,\n but instead of returning the variables, it returns the string summarizing\n the Module in a table. ``tabulate`` uses ``jax.eval_shape`` to run the forward\n computation without consuming any FLOPs or allocating memory.\n\n Additional arguments can be passed into the ``console_kwargs`` argument, for\n example, ``{'width': 120}``. For a full list of ``console_kwargs`` arguments,\n see:\n https://rich.readthedocs.io/en/stable/reference/console.html#rich.console.Console\n\n Example::\n\n >>> import flax.linen as nn\n >>> import jax, jax.numpy as jnp\n\n >>> class Foo(nn.Module):\n ... @nn.compact\n ... def __call__(self, x):\n ... h = nn.Dense(4)(x)\n ... return nn.Dense(2)(h)\n\n >>> x = jnp.ones((16, 9))\n\n >>> # print(Foo().tabulate(\n >>> # jax.random.key(0), x, compute_flops=True, compute_vjp_flops=True))\n\n This gives the following output::\n\n Foo Summary\n โโโโโโโโโโโณโโโโโโโโโณโโโโโโโโโโโโโโโโณโโโโโโโโโโโโโโโโณโโโโโโโโณโโโโโโโโโโโโณโโโโโโโโโโโโโโโโโโ\n โ path โ module โ inputs โ outputs โ flops โ vjp_flops โ params โ\n โกโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโฉ\n โ โ Foo โ float32[16,9] โ float32[16,2] โ 1504 โ 4460 โ โ\n โโโโโโโโโโโผโโโโโโโโโผโโโโโโโโโโโโโโโโผโโโโโโโโโโโโโโโโผโโโโโโโโผโโโโโโโโโโโโผโโโโโโโโโโโโโโโโโโค\n โ Dense_0 โ Dense โ float32[16,9] โ float32[16,4] โ 1216 โ 3620 โ bias: โ\n โ โ โ โ โ โ โ float32[4] โ\n โ โ โ โ โ โ โ kernel: โ\n โ โ โ โ โ โ โ float32[9,4] โ\n โ โ โ โ โ โ โ โ\n โ โ โ โ โ โ โ 40 (160 B) โ\n โโโโโโโโโโโผโโโโโโโโโผโโโโโโโโโโโโโโโโผโโโโโโโโโโโโโโโโผโโโโโโโโผโโโโโโโโโโโโผโโโโโโโโโโโโโโโโโโค\n โ Dense_1 โ Dense โ float32[16,4] โ float32[16,2] โ 288 โ 840 โ bias: โ\n โ โ โ โ โ โ โ float32[2] โ\n โ โ โ โ โ โ โ 
kernel: โ\n โ โ โ โ โ โ โ float32[4,2] โ\n โ โ โ โ โ โ โ โ\n โ โ โ โ โ โ โ 10 (40 B) โ\n โโโโโโโโโโโผโโโโโโโโโผโโโโโโโโโโโโโโโโผโโโโโโโโโโโโโโโโผโโโโโโโโผโโโโโโโโโโโโผโโโโโโโโโโโโโโโโโโค\n โ โ โ โ โ โ Total โ 50 (200 B) โ\n โโโโโโโโโโโดโโโโโโโโโดโโโโโโโโโโโโโโโโดโโโโโโโโโโโโโโโโดโโโโโโโโดโโโโโโโโโโโโดโโโโโโโโโโโโโโโโโโ\n\n Total Parameters: 50 (200 B)\n\n **Note**: rows order in the table does not represent execution order,\n instead it aligns with the order of keys in ``variables`` which are sorted\n alphabetically.\n\n **Note**: ``vjp_flops`` returns ``0`` if the module is not differentiable.\n\n Args:\n rngs: The rngs for the variable collections as passed to ``Module.init``.\n *args: The arguments to the forward computation.\n depth: controls how many submodule deep the summary can go. By default,\n its ``None`` which means no limit. If a submodule is not shown because of\n the depth limit, its parameter count and bytes will be added to the row\n of its first shown ancestor such that the sum of all rows always adds\n up to the total number of parameters of the Module.\n show_repeated: If ``True``, repeated calls to the same module will be shown\n in the table, otherwise only the first call will be shown. Default is\n ``False``.\n mutable: Can be bool, str, or list. Specifies which collections should be\n treated as mutable: ``bool``: all/no collections are mutable. ``str``:\n The name of a single mutable collection. ``list``: A list of names of\n mutable collections. 
By default, all collections except 'intermediates'\n are mutable.\n console_kwargs: An optional dictionary with additional keyword arguments\n that are passed to ``rich.console.Console`` when rendering the table.\n Default arguments are ``{'force_terminal': True, 'force_jupyter':\n False}``.\n table_kwargs: An optional dictionary with additional keyword arguments\n that are passed to ``rich.table.Table`` constructor.\n column_kwargs: An optional dictionary with additional keyword arguments\n that are passed to ``rich.table.Table.add_column`` when adding columns to\n the table.\n compute_flops: whether to include a ``flops`` column in the table listing\n the estimated FLOPs cost of each module forward pass. Does incur actual\n on-device computation / compilation / memory allocation, but still\n introduces overhead for large modules (e.g. extra 20 seconds for a\n Stable Diffusion's UNet, whereas otherwise tabulation would finish in 5\n seconds).\n compute_vjp_flops: whether to include a ``vjp_flops`` column in the table\n listing the estimated FLOPs cost of each module backward pass.\n Introduces a compute overhead of about 2-3X of ``compute_flops``.\n **kwargs: keyword arguments to pass to the forward computation.\n\n Returns:\n A string summarizing the Module.\n """"""\n from flax.linen import summary\n\n tabulate_fn = summary.tabulate(\n self,\n rngs,\n depth=depth,\n show_repeated=show_repeated,\n mutable=mutable,\n console_kwargs=console_kwargs,\n table_kwargs=table_kwargs,\n column_kwargs=column_kwargs,\n compute_flops=compute_flops,\n compute_vjp_flops=compute_vjp_flops,\n )\n return tabulate_fn(*args, **kwargs)\n\n def module_paths(\n self,\n rngs: PRNGKey | RNGSequences,\n *args,\n show_repeated: bool = False,\n mutable: CollectionFilter = DenyList('intermediates'),\n **kwargs,\n ) -> dict[str, 'Module']:\n """"""Returns a dictionary mapping module paths to module instances.\n\n This method has the same signature and internally calls ``Module.init``,\n but 
instead of returning the variables, it returns a dictionary mapping\n module paths to unbounded copies of module instances that were used\n at runtime. ``module_paths`` uses ``jax.eval_shape`` to run the forward\n computation without consuming any FLOPs or allocating memory.\n\n Example::\n\n >>> import flax.linen as nn\n >>> import jax, jax.numpy as jnp\n\n >>> class Foo(nn.Module):\n ... @nn.compact\n ... def __call__(self, x):\n ... h = nn.Dense(4)(x)\n ... return nn.Dense(2)(h)\n\n >>> x = jnp.ones((16, 9))\n >>> modules = Foo().module_paths(jax.random.key(0), x)\n >>> print({\n ... p: type(m).__name__ for p, m in modules.items()\n ... })\n {'': 'Foo', 'Dense_0': 'Dense', 'Dense_1': 'Dense'}\n\n Args:\n rngs: The rngs for the variable collections as passed to ``Module.init``.\n *args: The arguments to the forward computation.\n show_repeated: If ``True``, repeated calls to the same module will be\n shown in the table, otherwise only the first call will be shown.\n Default is ``False``.\n mutable: Can be bool, str, or list. Specifies which collections should\n be treated as mutable: ``bool``: all/no collections are mutable.\n ``str``: The name of a single mutable collection. ``list``: A list of\n names of mutable collections. 
By default, all collections except\n 'intermediates' are mutable.\n **kwargs: keyword arguments to pass to the forward computation.\n\n Returns:\n A dictionary mapping module paths to module instances.\n """"""\n from flax.linen import summary\n\n table = summary._get_module_table(\n module=self,\n depth=None,\n show_repeated=show_repeated,\n compute_flops=False,\n compute_vjp_flops=False,\n )(rngs, *args, **kwargs, mutable=mutable)\n\n return {'/'.join(row.path): row.module_copy for row in table}\n\n\n_ParentType = Union[Module, Scope, _Sentinel, None]\n\n\ndef merge_param(name: str, a: T | None, b: T | None) -> T:\n """"""Merges construction- and call-time argument.\n\n This is a utility for supporting a pattern where a Module hyperparameter\n can be passed either to ``__init__`` or ``__call__``, and the value that is\n not ``None`` will be used.\n\n Example::\n\n >>> import flax.linen as nn\n >>> from typing import Optional\n\n >>> class Foo(nn.Module):\n ... train: Optional[bool] = None\n\n ... def __call__(self, train: Optional[bool] = None):\n ... train = nn.merge_param('train', self.train, train)\n\n An error is thrown when both arguments are ``None`` or both values are not\n ``None``.\n\n Args:\n name: the name of the parameter. 
Used for error messages.\n a: option a\n b: option b\n\n Returns:\n a or b whichever is not ``None``.\n """"""\n if a is None and b is None:\n raise ValueError(\n f'Parameter ""{name}"" must be passed to the constructor or at call time.'\n )\n if a is not None and b is not None:\n raise ValueError(\n f'Parameter ""{name}"" was passed to the constructor and at call time.'\n ' Should be passed just once.'\n )\n if a is None:\n assert b is not None\n return b\n return a\n\n\n@traceback_util.api_boundary\ndef apply(\n fn: Callable[..., Any],\n module: Module,\n mutable: CollectionFilter = False,\n capture_intermediates: bool | Callable[[Module, str], bool] = False,\n) -> Callable[..., Any]:\n """"""Creates an apply function to call ``fn`` with a bound module.\n\n Unlike ``Module.apply`` this function returns a new function with the\n signature ``(variables, *args, rngs=None, **kwargs) -> T`` where ``T`` is the\n return type of ``fn``. If ``mutable`` is not ``False`` the return type is a\n tuple where the second item is a ``FrozenDict`` with the mutated variables.\n\n The apply function that is returned can be directly composed with\n JAX transformations like ``jax.jit``::\n\n >>> class Foo(nn.Module):\n ... def encode(self, x):\n ... ...\n ... def decode(self, x):\n ... ...\n\n >>> def f(foo, x):\n ... z = foo.encode(x)\n ... y = foo.decode(z)\n ... # ...\n ... return y\n\n >>> variables = {}\n >>> foo = Foo()\n >>> f_jitted = jax.jit(nn.apply(f, foo))\n >>> f_jitted(variables, jnp.ones((1, 3)))\n\n Args:\n fn: The function that should be applied. The first argument passed will be\n a module instance of the ``module`` with variables and RNGs bound to it.\n module: The ``Module`` that will be used to bind variables and RNGs to. The\n ``Module`` passed as the first argument to ``fn`` will be a clone of\n module.\n mutable: Can be bool, str, or list. Specifies which collections should be\n treated as mutable: ``bool``: all/no collections are mutable. 
``str``: The\n name of a single mutable collection. ``list``: A list of names of mutable\n collections.\n capture_intermediates: If ``True``, captures intermediate return values of all\n Modules inside the ""intermediates"" collection. By default, only the return\n values of all `__call__` methods are stored. A function can be passed to\n change the filter behavior. The filter function takes the Module instance\n and method name and returns a bool indicating whether the output of that\n method invocation should be stored.\n\n Returns:\n The apply function wrapping ``fn``.\n """"""\n\n @functools.wraps(fn)\n def scope_fn(scope, *args, **kwargs):\n _context.capture_stack.append(capture_intermediates)\n try:\n return fn(module.clone(parent=scope, _deep_clone=True), *args, **kwargs)\n finally:\n _context.capture_stack.pop()\n\n if capture_intermediates is True: # pylint: disable=g-bool-id-comparison\n capture_intermediates = capture_call_intermediates\n if capture_intermediates:\n mutable = union_filters(mutable, 'intermediates')\n return core.apply(scope_fn, mutable=mutable)\n\n\n@traceback_util.api_boundary\ndef init_with_output(\n fn: Callable[..., Any],\n module: Module,\n mutable: CollectionFilter = DenyList('intermediates'),\n capture_intermediates: bool | Callable[[Module, str], bool] = False,\n) -> Callable[..., tuple[Any, FrozenVariableDict | dict[str, Any]]]:\n """"""Creates an init function to call ``fn`` with a bound module that also returns the function outputs.\n\n Unlike ``Module.init_with_output`` this function returns a new function with\n the signature ``(rngs, *args, **kwargs) -> (T, variables)`` where ``T`` is the\n return type of ``fn``. The rngs can be a dict of PRNGKeys or a single\n ```PRNGKey`` which is equivalent to passing a dict with one PRNGKey with the\n name ""params"".\n\n The init function that is returned can be directly composed with\n JAX transformations like ``jax.jit``::\n\n >>> class Foo(nn.Module):\n ... 
def encode(self, x):\n ... ...\n ... def decode(self, x):\n ... ...\n\n >>> def f(foo, x):\n ... z = foo.encode(x)\n ... y = foo.decode(z)\n ... # ...\n ... return y\n\n >>> foo = Foo()\n >>> f_jitted = jax.jit(nn.init_with_output(f, foo))\n >>> y, variables = f_jitted(jax.random.key(0), jnp.ones((1, 3)))\n\n Args:\n fn: The function that should be applied. The first argument passed will be\n a module instance of the ``module`` with variables and RNGs bound to it.\n module: The ``Module`` that will be used to bind variables and RNGs to. The\n ``Module`` passed as the first argument to ``fn`` will be a clone of\n module.\n mutable: Can be bool, str, or list. Specifies which collections should be\n treated as mutable: ``bool``: all/no collections are mutable. ``str``: The\n name of a single mutable collection. ``list``: A list of names of mutable\n collections. By default, all collections except ""intermediates"" are\n mutable.\n capture_intermediates: If ``True``, captures intermediate return values of all\n Modules inside the ""intermediates"" collection. By default, only the return\n values of all `__call__` methods are stored. A function can be passed to\n change the filter behavior. 
The filter function takes the Module instance\n and method name and returns a bool indicating whether the output of that\n method invocation should be stored.\n\n Returns:\n The init function wrapping ``fn``.\n """"""\n\n @functools.wraps(fn)\n def scope_fn(scope, *args, **kwargs):\n _context.capture_stack.append(capture_intermediates)\n try:\n return fn(module.clone(parent=scope, _deep_clone=True), *args, **kwargs)\n finally:\n _context.capture_stack.pop()\n\n if capture_intermediates is True: # pylint: disable=g-bool-id-comparison\n capture_intermediates = capture_call_intermediates\n if capture_intermediates:\n mutable = union_filters(mutable, 'intermediates')\n return core.init(scope_fn, mutable=mutable)\n\n\n@traceback_util.api_boundary\ndef init(\n fn: Callable[..., Any],\n module: Module,\n mutable: CollectionFilter = DenyList('intermediates'),\n capture_intermediates: bool | Callable[[Module, str], bool] = False,\n) -> Callable[..., FrozenVariableDict | dict[str, Any]]:\n """"""Creates an init function to call ``fn`` with a bound module.\n\n Unlike ``Module.init`` this function returns a new function with the signature\n ``(rngs, *args, **kwargs) -> variables``.\n The rngs can be a dict of PRNGKeys or a single ```PRNGKey`` which is\n equivalent to passing a dict with one PRNGKey with the name ""params"".\n\n The init function that is returned can be directly composed with\n JAX transformations like ``jax.jit``::\n\n >>> class Foo(nn.Module):\n ... def encode(self, x):\n ... ...\n ... def decode(self, x):\n ... ...\n\n >>> def f(foo, x):\n ... z = foo.encode(x)\n ... y = foo.decode(z)\n ... # ...\n ... return y\n\n >>> foo = Foo()\n >>> f_jitted = jax.jit(nn.init(f, foo))\n >>> variables = f_jitted(jax.random.key(0), jnp.ones((1, 3)))\n\n Args:\n fn: The function that should be applied. 
The first argument passed will be\n a module instance of the ``module`` with variables and RNGs bound to it.\n module: The ``Module`` that will be used to bind variables and RNGs to. The\n ``Module`` passed as the first argument to ``fn`` will be a clone of\n module.\n mutable: Can be bool, str, or list. Specifies which collections should be\n treated as mutable: ``bool``: all/no collections are mutable. ``str``: The\n name of a single mutable collection. ``list``: A list of names of mutable\n collections. By default, all collections except ""intermediates"" are\n mutable.\n capture_intermediates: If `True`, captures intermediate return values of all\n Modules inside the ""intermediates"" collection. By default, only the return\n values of all `__call__` methods are stored. A function can be passed to\n change the filter behavior. The filter function takes the Module instance\n and method name and returns a bool indicating whether the output of that\n method invocation should be stored.\n\n Returns:\n The init function wrapping ``fn``.\n """"""\n init_fn = init_with_output(fn, module, mutable, capture_intermediates)\n\n @functools.wraps(init_fn)\n def init_wrapper(*args, **kwargs):\n return init_fn(*args, **kwargs)[1]\n\n return init_wrapper\n\n\n# TODO(cgarciae): we are defining CompactNameScope just to\n# avoid a pytype bug with the Flax overlay. We should aim to\n# remove it at some point as it's not ergonomic.\nif not typing.TYPE_CHECKING:\n\n class CompactNameScope(Module):\n fn: Callable\n module_fn: Callable[[], Module]\n\n @compact\n def __call__(self, *args, **kwargs) -> Any:\n return self.fn(self.module_fn(), *args, **kwargs)\nelse:\n\n @dataclasses.dataclass\n class CompactNameScope:\n fn: Callable\n module_fn: Callable\n name: str\n\n def __call__(self, *args, **kwargs) -> Any:\n ...\n\n\ndef share_scope(module: Module, other: Module, /):\n """"""Modifies one of the Modules such that they share the same scope. 
This is useful\n when you want to wrap a Module and extend its functionality without changing the\n parameter structure.\n\n ``share_scope`` takes two Modules, ``module`` and ``other``. ``module`` will use\n ``other``'s scope if ``other`` has a scope and its not a descendant of``module``'s\n scope::\n\n >>> import flax.linen as nn\n >>> import jax\n >>> from jax import numpy as jnp, random\n ...\n >>> class DenseLoRA(nn.Module):\n ... base: nn.Dense\n ... rank: int\n ...\n ... def setup(self):\n ... nn.share_scope(self, self.base)\n ...\n ... @nn.compact\n ... def __call__(self, x: jax.Array):\n ... din, dout = x.shape[-1], self.base.features\n ... A = self.param('A', nn.zeros_init(), (din, self.rank))\n ... B = self.param('B', nn.zeros_init(), (self.rank, dout))\n ... return self.base(x) + x @ A @ B\n ...\n >>> class Model(nn.Module):\n ... @nn.compact\n ... def __call__(self, x: jax.Array):\n ... dense = nn.Dense(10) # base scope\n ... return DenseLoRA(dense, rank=2)(x) # reuse the base scope\n ...\n >>> model = Model()\n ...\n >>> params = model.init(random.key(0), jnp.ones((1, 5)))['params']\n >>> list(params['Dense_0'].keys())\n ['A', 'B', 'kernel', 'bias']\n\n When ``other``'s scope is a descendant of ``module``'s scope then ``other``\n will use ``module``'s scope instead::\n\n >>> class DenseLoRA(nn.Module):\n ... features: int\n ... rank: int\n ...\n ... def setup(self):\n ... self.child = nn.Dense(self.features)\n ... nn.share_scope(self, self.child)\n ...\n ... @nn.compact\n ... def __call__(self, x: jax.Array):\n ... din, dout = x.shape[-1], self.features\n ... A = self.param('A', nn.zeros_init(), (din, self.rank))\n ... B = self.param('B', nn.zeros_init(), (self.rank, dout))\n ... return self.child(x) + x @ A @ B\n ...\n >>> class Model(nn.Module):\n ... @nn.compact\n ... def __call__(self, x: jax.Array):\n ... 
return DenseLoRA(10, rank=2)(x)\n ...\n >>> model = Model()\n ...\n >>> params = model.init(random.key(0), jnp.ones((1, 5)))['params']\n >>> list(params['DenseLoRA_0'].keys())\n ['A', 'B', 'kernel', 'bias']\n """"""\n if module.scope is None or other.scope is None:\n raise errors.CallShareScopeOnUnboundModuleError()\n\n def _is_child_scope(scope: Scope, other: Scope) -> bool:\n target: Scope | None = other\n\n while target is not None:\n if target is scope:\n return True\n target = target.parent\n return False\n\n if _is_child_scope(module.scope, other.scope):\n # Child is a true child, overwrite its scope\n module_to_update = other\n new_scope = module.scope\n else:\n # Child has its own independent scope, overwrite\n # parent scope, so that we preserve the sharing\n module_to_update = module\n new_scope = other.scope\n\n old_scope = module_to_update.scope\n object.__setattr__(module_to_update, 'scope', new_scope)\n\n # Reattach all the children to the new scope as well.\n for m in module_to_update._state.children.values():\n if not isinstance(m, Module):\n continue\n # Should we go recursively to check if any of the ancestors point to the old\n # scope?\n if m.scope and m.scope.parent == old_scope:\n # Reserve the scope, so that if there is a conflict we can raise an error.\n if isinstance(m.scope.name, str):\n new_scope.reserve(m.scope.name)\n m.scope.parent = new_scope\n",python,tab
+2485,1704220,".venv/lib/python3.10/site-packages/flax/linen/module.py",30849,0,"",python,selection_command
+2486,1710030,".venv/lib/python3.10/site-packages/flax/linen/module.py",33050,0,"",python,selection_command
+2487,1710914,".venv/lib/python3.10/site-packages/flax/linen/module.py",34466,0,"",python,selection_command
+2488,1712830,".venv/lib/python3.10/site-packages/flax/linen/module.py",35816,0,"",python,selection_command
+2489,1715752,".venv/lib/python3.10/site-packages/flax/linen/module.py",84195,0,"",python,selection_command
+2490,1716866,".venv/lib/python3.10/site-packages/flax/linen/module.py",84243,0,"",python,selection_command
+2491,1720059,".venv/lib/python3.10/site-packages/flax/linen/module.py",84422,0,"",python,selection_command
+2492,1720792,".venv/lib/python3.10/site-packages/flax/linen/module.py",84589,0,"",python,selection_command
+2493,1721314,".venv/lib/python3.10/site-packages/flax/linen/module.py",89411,0,"",python,selection_command
+2494,1722080,".venv/lib/python3.10/site-packages/flax/linen/module.py",92202,0,"",python,selection_command
+2495,1726048,".venv/lib/python3.10/site-packages/flax/linen/module.py",92381,0,"",python,selection_command
+2496,1726241,".venv/lib/python3.10/site-packages/flax/linen/module.py",84195,0,"",python,selection_command