Parsed
Collection: parsed websites by me • 2 items
| repo_full_name | repo_url | repo_api_url | owner | repo_name | description | stars | forks | watchers | license | default_branch | repo_created_at | repo_updated_at | repo_topics | repo_languages | is_fork | open_issues | file_path | file_name | file_extension | file_size_bytes | file_url | file_raw_url | file_sha | language | parsed_at | text |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Liuhong99/Sophia | https://github.com/Liuhong99/Sophia | https://api.github.com/repos/Liuhong99/Sophia | Liuhong99 | Sophia | The official implementation of “Sophia: A Scalable Stochastic Second-order Optimizer for Language Model Pre-training” | 999 | 58 | 999 | mit | main | 2023-05-23T10:00:14 | 2026-05-03T07:47:47 | [
"deep-learning",
"large-language-models",
"optimizer"
] | {
"Python": 71048
} | false | 8 | config/train_gpt2_large_adam.py | train_gpt2_large_adam.py | .py | 967 | https://github.com/Liuhong99/Sophia/blob/main/config/train_gpt2_large_adam.py | https://raw.githubusercontent.com/Liuhong99/Sophia/main/config/train_gpt2_large_adam.py | 5462780781bab669b824230ad458c426550634fe | Python | 2026-05-04T01:12:36.262667 | wandb_log = True
wandb_project = 'sophia'
wandb_run_name='gpt2-large-adam-100k'
# these make the total batch size be ~0.5M
# 6 batch size * 1024 block size * 10 gradaccum * 8 GPUs = 491,520
batch_size = 4
block_size = 1024
gradient_accumulation_steps = 12
n_layer = 36
n_head = 20
n_embd = 1280
dropout = 0.0 # for pre... |
Liuhong99/Sophia | https://github.com/Liuhong99/Sophia | https://api.github.com/repos/Liuhong99/Sophia | Liuhong99 | Sophia | The official implementation of “Sophia: A Scalable Stochastic Second-order Optimizer for Language Model Pre-training” | 999 | 58 | 999 | mit | main | 2023-05-23T10:00:14 | 2026-05-03T07:47:47 | [
"deep-learning",
"large-language-models",
"optimizer"
] | {
"Python": 71048
} | false | 8 | config/train_gpt2_large_sophiag.py | train_gpt2_large_sophiag.py | .py | 1,002 | https://github.com/Liuhong99/Sophia/blob/main/config/train_gpt2_large_sophiag.py | https://raw.githubusercontent.com/Liuhong99/Sophia/main/config/train_gpt2_large_sophiag.py | 88c16e02728f21aaf5b1cf4e33897ed29118a3d8 | Python | 2026-05-04T01:12:37.575308 | wandb_log = True
wandb_project = 'sophia'
wandb_run_name='gpt2-large-sophiag-100k'
# these make the total batch size be ~0.5M
# 6 batch size * 1024 block size * 10 gradaccum * 8 GPUs = 491,520
batch_size = 4
block_size = 1024
gradient_accumulation_steps = 12
n_layer = 36
n_head = 20
n_embd = 1280
dropout = 0.0 # for ... |
Liuhong99/Sophia | https://github.com/Liuhong99/Sophia | https://api.github.com/repos/Liuhong99/Sophia | Liuhong99 | Sophia | The official implementation of “Sophia: A Scalable Stochastic Second-order Optimizer for Language Model Pre-training” | 999 | 58 | 999 | mit | main | 2023-05-23T10:00:14 | 2026-05-03T07:47:47 | [
"deep-learning",
"large-language-models",
"optimizer"
] | {
"Python": 71048
} | false | 8 | config/train_gpt2_medium_adam.py | train_gpt2_medium_adam.py | .py | 968 | https://github.com/Liuhong99/Sophia/blob/main/config/train_gpt2_medium_adam.py | https://raw.githubusercontent.com/Liuhong99/Sophia/main/config/train_gpt2_medium_adam.py | b244ff31834f724d22096526296c12a6381ebd0c | Python | 2026-05-04T01:12:38.906567 | wandb_log = True
wandb_project = 'sophia'
wandb_run_name='gpt2-medium-adam-100k'
# these make the total batch size be ~0.5M
# 6 batch size * 1024 block size * 10 gradaccum * 8 GPUs = 491,520
batch_size = 6
block_size = 1024
gradient_accumulation_steps = 8
n_layer = 24
n_head = 16
n_embd = 1024
dropout = 0.0 # for pre... |
Liuhong99/Sophia | https://github.com/Liuhong99/Sophia | https://api.github.com/repos/Liuhong99/Sophia | Liuhong99 | Sophia | The official implementation of “Sophia: A Scalable Stochastic Second-order Optimizer for Language Model Pre-training” | 999 | 58 | 999 | mit | main | 2023-05-23T10:00:14 | 2026-05-03T07:47:47 | [
"deep-learning",
"large-language-models",
"optimizer"
] | {
"Python": 71048
} | false | 8 | config/train_gpt2_medium_sophiag.py | train_gpt2_medium_sophiag.py | .py | 1,004 | https://github.com/Liuhong99/Sophia/blob/main/config/train_gpt2_medium_sophiag.py | https://raw.githubusercontent.com/Liuhong99/Sophia/main/config/train_gpt2_medium_sophiag.py | adaa60d27ec7a6482b0834b7d05d599bfa311385 | Python | 2026-05-04T01:12:40.436903 | wandb_log = True
wandb_project = 'sophia'
wandb_run_name='gpt2-medium-sophiag-100k'
# these make the total batch size be ~0.5M
# 6 batch size * 1024 block size * 10 gradaccum * 8 GPUs = 491,520
batch_size = 10
block_size = 1024
gradient_accumulation_steps = 6
n_layer = 24
n_head = 16
n_embd = 1024
dropout = 0.0 # for... |
Liuhong99/Sophia | https://github.com/Liuhong99/Sophia | https://api.github.com/repos/Liuhong99/Sophia | Liuhong99 | Sophia | The official implementation of “Sophia: A Scalable Stochastic Second-order Optimizer for Language Model Pre-training” | 999 | 58 | 999 | mit | main | 2023-05-23T10:00:14 | 2026-05-03T07:47:47 | [
"deep-learning",
"large-language-models",
"optimizer"
] | {
"Python": 71048
} | false | 8 | config/train_gpt2_small_adam.py | train_gpt2_small_adam.py | .py | 926 | https://github.com/Liuhong99/Sophia/blob/main/config/train_gpt2_small_adam.py | https://raw.githubusercontent.com/Liuhong99/Sophia/main/config/train_gpt2_small_adam.py | 967f480e2c1de2ae07da5442c229cc93351032ae | Python | 2026-05-04T01:12:41.774198 | wandb_log = True
wandb_project = 'sophia'
wandb_run_name='gpt2-small-adam-100k'
# these make the total batch size be ~0.5M
# 8 batch size * 1024 block size * 6 gradaccum * 10 GPUs = 491,520
batch_size = 8
block_size = 1024
gradient_accumulation_steps = 6
n_layer = 12
n_head = 12
n_embd = 768
dropout = 0.0 # for pretr... |
Liuhong99/Sophia | https://github.com/Liuhong99/Sophia | https://api.github.com/repos/Liuhong99/Sophia | Liuhong99 | Sophia | The official implementation of “Sophia: A Scalable Stochastic Second-order Optimizer for Language Model Pre-training” | 999 | 58 | 999 | mit | main | 2023-05-23T10:00:14 | 2026-05-03T07:47:47 | [
"deep-learning",
"large-language-models",
"optimizer"
] | {
"Python": 71048
} | false | 8 | config/train_gpt2_small_sophiag.py | train_gpt2_small_sophiag.py | .py | 978 | https://github.com/Liuhong99/Sophia/blob/main/config/train_gpt2_small_sophiag.py | https://raw.githubusercontent.com/Liuhong99/Sophia/main/config/train_gpt2_small_sophiag.py | a6bdc517f00e11c662257ebc6175b465cfe018bd | Python | 2026-05-04T01:12:43.002923 | wandb_log = True
wandb_project = 'sophia'
wandb_run_name='gpt2-small-sophiag-100k'
# these make the total batch size be ~0.5M
# 8 batch size * 1024 block size * 6 gradaccum * 10 GPUs = 491,520
batch_size = 8
block_size = 1024
gradient_accumulation_steps = 6
total_bs = 480
n_layer = 12
n_head = 12
n_embd = 768
dropout... |
Liuhong99/Sophia | https://github.com/Liuhong99/Sophia | https://api.github.com/repos/Liuhong99/Sophia | Liuhong99 | Sophia | The official implementation of “Sophia: A Scalable Stochastic Second-order Optimizer for Language Model Pre-training” | 999 | 58 | 999 | mit | main | 2023-05-23T10:00:14 | 2026-05-03T07:47:47 | [
"deep-learning",
"large-language-models",
"optimizer"
] | {
"Python": 71048
} | false | 8 | configurator.py | configurator.py | .py | 1,758 | https://github.com/Liuhong99/Sophia/blob/main/configurator.py | https://raw.githubusercontent.com/Liuhong99/Sophia/main/configurator.py | a8bba9599537cbb34b378399a077b525a87d497f | Python | 2026-05-04T01:12:44.342508 | """
Poor Man's Configurator. Probably a terrible idea. Example usage:
$ python train.py config/override_file.py --batch_size=32
this will first run config/override_file.py, then override batch_size to 32
The code in this file will be run as follows from e.g. train.py:
>>> exec(open('configurator.py').read())
So it's ... |
Liuhong99/Sophia | https://github.com/Liuhong99/Sophia | https://api.github.com/repos/Liuhong99/Sophia | Liuhong99 | Sophia | The official implementation of “Sophia: A Scalable Stochastic Second-order Optimizer for Language Model Pre-training” | 999 | 58 | 999 | mit | main | 2023-05-23T10:00:14 | 2026-05-03T07:47:47 | [
"deep-learning",
"large-language-models",
"optimizer"
] | {
"Python": 71048
} | false | 8 | data/openwebtext/prepare.py | prepare.py | .py | 2,493 | https://github.com/Liuhong99/Sophia/blob/main/data/openwebtext/prepare.py | https://raw.githubusercontent.com/Liuhong99/Sophia/main/data/openwebtext/prepare.py | 42c13ecdf362475268d5daa141d344c01dfc13b9 | Python | 2026-05-04T01:12:45.584517 | # saves the openwebtext dataset to a binary file for training. following was helpful:
# https://github.com/HazyResearch/flash-attention/blob/main/training/src/datamodules/language_modeling_hf.py
import os
from tqdm import tqdm
import numpy as np
import tiktoken
from datasets import load_dataset # huggingface datasets
... |
Liuhong99/Sophia | https://github.com/Liuhong99/Sophia | https://api.github.com/repos/Liuhong99/Sophia | Liuhong99 | Sophia | The official implementation of “Sophia: A Scalable Stochastic Second-order Optimizer for Language Model Pre-training” | 999 | 58 | 999 | mit | main | 2023-05-23T10:00:14 | 2026-05-03T07:47:47 | [
"deep-learning",
"large-language-models",
"optimizer"
] | {
"Python": 71048
} | false | 8 | model.py | model.py | .py | 18,489 | https://github.com/Liuhong99/Sophia/blob/main/model.py | https://raw.githubusercontent.com/Liuhong99/Sophia/main/model.py | 5f190edab0ea972734880b8473c64f5d8bbfc4ad | Python | 2026-05-04T01:12:47.102558 | import math
import inspect
from dataclasses import dataclass
from sophia import SophiaG
import torch
import torch.nn as nn
from torch.nn import functional as F
optimizer_dict = {'adamw': torch.optim.AdamW,
'sophiag': SophiaG
}
# @torch.jit.script # good to enable when not using tor... |
Liuhong99/Sophia | https://github.com/Liuhong99/Sophia | https://api.github.com/repos/Liuhong99/Sophia | Liuhong99 | Sophia | The official implementation of “Sophia: A Scalable Stochastic Second-order Optimizer for Language Model Pre-training” | 999 | 58 | 999 | mit | main | 2023-05-23T10:00:14 | 2026-05-03T07:47:47 | [
"deep-learning",
"large-language-models",
"optimizer"
] | {
"Python": 71048
} | false | 8 | sophia.py | sophia.py | .py | 7,381 | https://github.com/Liuhong99/Sophia/blob/main/sophia.py | https://raw.githubusercontent.com/Liuhong99/Sophia/main/sophia.py | 1e552809e5e8710532ead90dfc706048cc274123 | Python | 2026-05-04T01:12:48.448294 | import math
import torch
from torch import Tensor
from torch.optim.optimizer import Optimizer
from typing import List, Optional
class SophiaG(Optimizer):
def __init__(self, params, lr=1e-4, betas=(0.965, 0.99), rho = 0.04,
weight_decay=1e-1, *, maximize: bool = False,
capturable: bool = False):
... |
Liuhong99/Sophia | https://github.com/Liuhong99/Sophia | https://api.github.com/repos/Liuhong99/Sophia | Liuhong99 | Sophia | The official implementation of “Sophia: A Scalable Stochastic Second-order Optimizer for Language Model Pre-training” | 999 | 58 | 999 | mit | main | 2023-05-23T10:00:14 | 2026-05-03T07:47:47 | [
"deep-learning",
"large-language-models",
"optimizer"
] | {
"Python": 71048
} | false | 8 | train_adam.py | train_adam.py | .py | 14,774 | https://github.com/Liuhong99/Sophia/blob/main/train_adam.py | https://raw.githubusercontent.com/Liuhong99/Sophia/main/train_adam.py | 7f89547c9810aee227c8d5415dc4888293b2806e | Python | 2026-05-04T01:12:49.591563 | import os
import time
import math
import pickle
from contextlib import nullcontext
import numpy as np
import torch
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.distributed import init_process_group, destroy_process_group
from model import GPTConfig, GPT
# ----------------------------------... |
Liuhong99/Sophia | https://github.com/Liuhong99/Sophia | https://api.github.com/repos/Liuhong99/Sophia | Liuhong99 | Sophia | The official implementation of “Sophia: A Scalable Stochastic Second-order Optimizer for Language Model Pre-training” | 999 | 58 | 999 | mit | main | 2023-05-23T10:00:14 | 2026-05-03T07:47:47 | [
"deep-learning",
"large-language-models",
"optimizer"
] | {
"Python": 71048
} | false | 8 | train_sophiag.py | train_sophiag.py | .py | 20,308 | https://github.com/Liuhong99/Sophia/blob/main/train_sophiag.py | https://raw.githubusercontent.com/Liuhong99/Sophia/main/train_sophiag.py | 6895f1d29b13733a489b9d3164edd214b591cb47 | Python | 2026-05-04T01:12:50.857597 | import os
import time
import math
import pickle
from contextlib import nullcontext
import numpy as np
import torch
import torch.nn.functional as F
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.distributed import init_process_group, destroy_process_group
from model import GPTConfig, GPT
import... |
maximeraafat/BlenderNeRF | https://github.com/maximeraafat/BlenderNeRF | https://api.github.com/repos/maximeraafat/BlenderNeRF | maximeraafat | BlenderNeRF | Easy NeRF synthetic dataset creation within Blender | 998 | 74 | 998 | mit | main | 2022-07-11T21:09:11 | 2026-04-28T07:30:22 | [
"addons",
"ai",
"blender",
"computer-graphics",
"computer-vision",
"gaussian-splatting",
"instant-ngp",
"nerf",
"neural-rendering",
"python"
] | {
"Python": 42297
} | false | 11 | __init__.py | __init__.py | .py | 8,045 | https://github.com/maximeraafat/BlenderNeRF/blob/main/__init__.py | https://raw.githubusercontent.com/maximeraafat/BlenderNeRF/main/__init__.py | e349533be429142d87747090e7abd466491dda4b | Python | 2026-05-04T01:12:54.769099 | import bpy
from . import helper, blender_nerf_ui, sof_ui, ttc_ui, cos_ui, sof_operator, ttc_operator, cos_operator
# blender info
bl_info = {
'name': 'BlenderNeRF',
'description': 'Easy NeRF synthetic dataset creation within Blender',
'author': 'Maxime Raafat',
'version': (6, 0, 0),
'blender': (4,... |
maximeraafat/BlenderNeRF | https://github.com/maximeraafat/BlenderNeRF | https://api.github.com/repos/maximeraafat/BlenderNeRF | maximeraafat | BlenderNeRF | Easy NeRF synthetic dataset creation within Blender | 998 | 74 | 998 | mit | main | 2022-07-11T21:09:11 | 2026-04-28T07:30:22 | [
"addons",
"ai",
"blender",
"computer-graphics",
"computer-vision",
"gaussian-splatting",
"instant-ngp",
"nerf",
"neural-rendering",
"python"
] | {
"Python": 42297
} | false | 11 | blender_nerf_operator.py | blender_nerf_operator.py | .py | 10,965 | https://github.com/maximeraafat/BlenderNeRF/blob/main/blender_nerf_operator.py | https://raw.githubusercontent.com/maximeraafat/BlenderNeRF/main/blender_nerf_operator.py | ff32447face58bc67d8bcae8a273d569ee84ecdd | Python | 2026-05-04T01:12:56.315299 | import os
import math
import json
import datetime
import bpy
# global addon script variables
OUTPUT_TRAIN = 'train'
OUTPUT_TEST = 'test'
CAMERA_NAME = 'BlenderNeRF Camera'
TMP_VERTEX_COLORS = 'blendernerf_vertex_colors_tmp'
# blender nerf operator parent class
class BlenderNeRF_Operator(bpy.types.Operator):
# ... |
maximeraafat/BlenderNeRF | https://github.com/maximeraafat/BlenderNeRF | https://api.github.com/repos/maximeraafat/BlenderNeRF | maximeraafat | BlenderNeRF | Easy NeRF synthetic dataset creation within Blender | 998 | 74 | 998 | mit | main | 2022-07-11T21:09:11 | 2026-04-28T07:30:22 | [
"addons",
"ai",
"blender",
"computer-graphics",
"computer-vision",
"gaussian-splatting",
"instant-ngp",
"nerf",
"neural-rendering",
"python"
] | {
"Python": 42297
} | false | 11 | blender_nerf_ui.py | blender_nerf_ui.py | .py | 1,681 | https://github.com/maximeraafat/BlenderNeRF/blob/main/blender_nerf_ui.py | https://raw.githubusercontent.com/maximeraafat/BlenderNeRF/main/blender_nerf_ui.py | f17cc90124a0c5ce2524bb36b76db21b4f2bb0c2 | Python | 2026-05-04T01:12:57.486230 | import bpy
# blender nerf shared ui properties class
class BlenderNeRF_UI(bpy.types.Panel):
'''BlenderNeRF UI'''
bl_idname = 'VIEW3D_PT_blender_nerf_ui'
bl_label = 'BlenderNeRF shared UI'
bl_space_type = 'VIEW_3D'
bl_region_type = 'UI'
bl_category = 'BlenderNeRF'
def draw(self, context):
... |
maximeraafat/BlenderNeRF | https://github.com/maximeraafat/BlenderNeRF | https://api.github.com/repos/maximeraafat/BlenderNeRF | maximeraafat | BlenderNeRF | Easy NeRF synthetic dataset creation within Blender | 998 | 74 | 998 | mit | main | 2022-07-11T21:09:11 | 2026-04-28T07:30:22 | [
"addons",
"ai",
"blender",
"computer-graphics",
"computer-vision",
"gaussian-splatting",
"instant-ngp",
"nerf",
"neural-rendering",
"python"
] | {
"Python": 42297
} | false | 11 | cos_operator.py | cos_operator.py | .py | 3,872 | https://github.com/maximeraafat/BlenderNeRF/blob/main/cos_operator.py | https://raw.githubusercontent.com/maximeraafat/BlenderNeRF/main/cos_operator.py | 29454ef21339bbfac14946f8e3879e87bed09a9f | Python | 2026-05-04T01:12:59.693998 | import os
import shutil
import bpy
from . import helper, blender_nerf_operator
# global addon script variables
EMPTY_NAME = 'BlenderNeRF Sphere'
CAMERA_NAME = 'BlenderNeRF Camera'
# camera on sphere operator class
class CameraOnSphere(blender_nerf_operator.BlenderNeRF_Operator):
'''Camera on Sphere Operator'''
... |
maximeraafat/BlenderNeRF | https://github.com/maximeraafat/BlenderNeRF | https://api.github.com/repos/maximeraafat/BlenderNeRF | maximeraafat | BlenderNeRF | Easy NeRF synthetic dataset creation within Blender | 998 | 74 | 998 | mit | main | 2022-07-11T21:09:11 | 2026-04-28T07:30:22 | [
"addons",
"ai",
"blender",
"computer-graphics",
"computer-vision",
"gaussian-splatting",
"instant-ngp",
"nerf",
"neural-rendering",
"python"
] | {
"Python": 42297
} | false | 11 | cos_ui.py | cos_ui.py | .py | 1,366 | https://github.com/maximeraafat/BlenderNeRF/blob/main/cos_ui.py | https://raw.githubusercontent.com/maximeraafat/BlenderNeRF/main/cos_ui.py | 3dbaa1eb597ba12996c1848e93817c60de7b586c | Python | 2026-05-04T01:13:01.138308 | import bpy
# camera on sphere ui class
class COS_UI(bpy.types.Panel):
'''Camera on Sphere UI'''
bl_idname = 'VIEW3D_PT_cos_ui'
bl_label = 'Camera on Sphere COS'
bl_space_type = 'VIEW_3D'
bl_region_type = 'UI'
bl_category = 'BlenderNeRF'
bl_options = {'DEFAULT_CLOSED'}
def draw(self, c... |
maximeraafat/BlenderNeRF | https://github.com/maximeraafat/BlenderNeRF | https://api.github.com/repos/maximeraafat/BlenderNeRF | maximeraafat | BlenderNeRF | Easy NeRF synthetic dataset creation within Blender | 998 | 74 | 998 | mit | main | 2022-07-11T21:09:11 | 2026-04-28T07:30:22 | [
"addons",
"ai",
"blender",
"computer-graphics",
"computer-vision",
"gaussian-splatting",
"instant-ngp",
"nerf",
"neural-rendering",
"python"
] | {
"Python": 42297
} | false | 11 | helper.py | helper.py | .py | 8,840 | https://github.com/maximeraafat/BlenderNeRF/blob/main/helper.py | https://raw.githubusercontent.com/maximeraafat/BlenderNeRF/main/helper.py | fa054dc8593a370f0e4d48bdebd7db4f56a6f017 | Python | 2026-05-04T01:13:02.638908 | import os
import shutil
import random
import math
import mathutils
import bpy
from bpy.app.handlers import persistent
# global addon script variables
EMPTY_NAME = 'BlenderNeRF Sphere'
CAMERA_NAME = 'BlenderNeRF Camera'
## property poll and update functions
# camera pointer property poll function
def poll_is_camera(... |
maximeraafat/BlenderNeRF | https://github.com/maximeraafat/BlenderNeRF | https://api.github.com/repos/maximeraafat/BlenderNeRF | maximeraafat | BlenderNeRF | Easy NeRF synthetic dataset creation within Blender | 998 | 74 | 998 | mit | main | 2022-07-11T21:09:11 | 2026-04-28T07:30:22 | [
"addons",
"ai",
"blender",
"computer-graphics",
"computer-vision",
"gaussian-splatting",
"instant-ngp",
"nerf",
"neural-rendering",
"python"
] | {
"Python": 42297
} | false | 11 | sof_operator.py | sof_operator.py | .py | 2,844 | https://github.com/maximeraafat/BlenderNeRF/blob/main/sof_operator.py | https://raw.githubusercontent.com/maximeraafat/BlenderNeRF/main/sof_operator.py | 7137c96c05e8140cc6bc3a81aa21dafc3947eecc | Python | 2026-05-04T01:13:03.975876 | import os
import shutil
import bpy
from . import blender_nerf_operator
# subset of frames operator class
class SubsetOfFrames(blender_nerf_operator.BlenderNeRF_Operator):
'''Subset of Frames Operator'''
bl_idname = 'object.subset_of_frames'
bl_label = 'Subset of Frames SOF'
def execute(self, context)... |
maximeraafat/BlenderNeRF | https://github.com/maximeraafat/BlenderNeRF | https://api.github.com/repos/maximeraafat/BlenderNeRF | maximeraafat | BlenderNeRF | Easy NeRF synthetic dataset creation within Blender | 998 | 74 | 998 | mit | main | 2022-07-11T21:09:11 | 2026-04-28T07:30:22 | [
"addons",
"ai",
"blender",
"computer-graphics",
"computer-vision",
"gaussian-splatting",
"instant-ngp",
"nerf",
"neural-rendering",
"python"
] | {
"Python": 42297
} | false | 11 | sof_ui.py | sof_ui.py | .py | 723 | https://github.com/maximeraafat/BlenderNeRF/blob/main/sof_ui.py | https://raw.githubusercontent.com/maximeraafat/BlenderNeRF/main/sof_ui.py | bdd0462cdc244405415132d5a70a5b2876fe624a | Python | 2026-05-04T01:13:05.243093 | import bpy
# subset of frames ui class
class SOF_UI(bpy.types.Panel):
'''Subset of Frames UI'''
bl_idname = 'VIEW3D_PT_sof_ui'
bl_label = 'Subset of Frames SOF'
bl_space_type = 'VIEW_3D'
bl_region_type = 'UI'
bl_category = 'BlenderNeRF'
bl_options = {'DEFAULT_CLOSED'}
def draw(self, c... |
maximeraafat/BlenderNeRF | https://github.com/maximeraafat/BlenderNeRF | https://api.github.com/repos/maximeraafat/BlenderNeRF | maximeraafat | BlenderNeRF | Easy NeRF synthetic dataset creation within Blender | 998 | 74 | 998 | mit | main | 2022-07-11T21:09:11 | 2026-04-28T07:30:22 | [
"addons",
"ai",
"blender",
"computer-graphics",
"computer-vision",
"gaussian-splatting",
"instant-ngp",
"nerf",
"neural-rendering",
"python"
] | {
"Python": 42297
} | false | 11 | ttc_operator.py | ttc_operator.py | .py | 3,111 | https://github.com/maximeraafat/BlenderNeRF/blob/main/ttc_operator.py | https://raw.githubusercontent.com/maximeraafat/BlenderNeRF/main/ttc_operator.py | fc04c97c55cedefb17362abdec231cd2abd8e4aa | Python | 2026-05-04T01:13:06.452737 | import os
import shutil
import bpy
from . import blender_nerf_operator
# train and test cameras operator class
class TrainTestCameras(blender_nerf_operator.BlenderNeRF_Operator):
'''Train and Test Cameras Operator'''
bl_idname = 'object.train_test_cameras'
bl_label = 'Train and Test Cameras TTC'
def ... |
maximeraafat/BlenderNeRF | https://github.com/maximeraafat/BlenderNeRF | https://api.github.com/repos/maximeraafat/BlenderNeRF | maximeraafat | BlenderNeRF | Easy NeRF synthetic dataset creation within Blender | 998 | 74 | 998 | mit | main | 2022-07-11T21:09:11 | 2026-04-28T07:30:22 | [
"addons",
"ai",
"blender",
"computer-graphics",
"computer-vision",
"gaussian-splatting",
"instant-ngp",
"nerf",
"neural-rendering",
"python"
] | {
"Python": 42297
} | false | 11 | ttc_ui.py | ttc_ui.py | .py | 850 | https://github.com/maximeraafat/BlenderNeRF/blob/main/ttc_ui.py | https://raw.githubusercontent.com/maximeraafat/BlenderNeRF/main/ttc_ui.py | bde47190082c4ae9d47eee36ebc3ae30ff404bf4 | Python | 2026-05-04T01:13:07.783245 | import bpy
# train and test cameras ui class
class TTC_UI(bpy.types.Panel):
'''Train and Test Cameras UI'''
bl_idname = 'VIEW3D_PT_ttc_ui'
bl_label = 'Train and Test Cameras TTC'
bl_space_type = 'VIEW_3D'
bl_region_type = 'UI'
bl_category = 'BlenderNeRF'
bl_options = {'DEFAULT_CLOSED'}
... |
0xZDH/o365spray | https://github.com/0xZDH/o365spray | https://api.github.com/repos/0xZDH/o365spray | 0xZDH | o365spray | Username enumeration and password spraying tool aimed at Microsoft O365. | 997 | 119 | 997 | mit | master | 2019-08-07T14:47:45 | 2026-04-28T07:50:39 | [
"enumeration",
"password-spray",
"pentest",
"pentesting-tools",
"python",
"python3",
"security",
"security-tools"
] | {
"Python": 123401
} | false | 7 | o365spray.py | o365spray.py | .py | 128 | https://github.com/0xZDH/o365spray/blob/master/o365spray.py | https://raw.githubusercontent.com/0xZDH/o365spray/master/o365spray.py | 9d680e59bc75f1280b2e481b5f53d8f99a48d07b | Python | 2026-05-04T01:13:11.879963 | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
if __name__ == "__main__":
from o365spray.__main__ import main
main()
|
0xZDH/o365spray | https://github.com/0xZDH/o365spray | https://api.github.com/repos/0xZDH/o365spray | 0xZDH | o365spray | Username enumeration and password spraying tool aimed at Microsoft O365. | 997 | 119 | 997 | mit | master | 2019-08-07T14:47:45 | 2026-04-28T07:50:39 | [
"enumeration",
"password-spray",
"pentest",
"pentesting-tools",
"python",
"python3",
"security",
"security-tools"
] | {
"Python": 123401
} | false | 7 | o365spray/__init__.py | __init__.py | .py | 101 | https://github.com/0xZDH/o365spray/blob/master/o365spray/__init__.py | https://raw.githubusercontent.com/0xZDH/o365spray/master/o365spray/__init__.py | e150733b03373dbb766dc64d142dd3139a25098f | Python | 2026-05-04T01:13:13.217975 | #!/usr/bin/env python3
_V_MAJ = 3
_V_MIN = 0
_V_MNT = 4
__version__ = f"{_V_MAJ}.{_V_MIN}.{_V_MNT}"
|
0xZDH/o365spray | https://github.com/0xZDH/o365spray | https://api.github.com/repos/0xZDH/o365spray | 0xZDH | o365spray | Username enumeration and password spraying tool aimed at Microsoft O365. | 997 | 119 | 997 | mit | master | 2019-08-07T14:47:45 | 2026-04-28T07:50:39 | [
"enumeration",
"password-spray",
"pentest",
"pentesting-tools",
"python",
"python3",
"security",
"security-tools"
] | {
"Python": 123401
} | false | 7 | o365spray/__main__.py | __main__.py | .py | 9,788 | https://github.com/0xZDH/o365spray/blob/master/o365spray/__main__.py | https://raw.githubusercontent.com/0xZDH/o365spray/master/o365spray/__main__.py | 0c70664191f93a54b58a4c7ede7545a2bd11ab60 | Python | 2026-05-04T01:13:14.644653 | #!/usr/bin/env python3
import argparse
import logging
import os
import sys
import time
from pathlib import Path
from random import randint
from o365spray import __version__
from o365spray.core.handlers.enumerator import enumerate
from o365spray.core.handlers.sprayer import spray
from o365spray.core.handlers.validator... |
0xZDH/o365spray | https://github.com/0xZDH/o365spray | https://api.github.com/repos/0xZDH/o365spray | 0xZDH | o365spray | Username enumeration and password spraying tool aimed at Microsoft O365. | 997 | 119 | 997 | mit | master | 2019-08-07T14:47:45 | 2026-04-28T07:50:39 | [
"enumeration",
"password-spray",
"pentest",
"pentesting-tools",
"python",
"python3",
"security",
"security-tools"
] | {
"Python": 123401
} | false | 7 | o365spray/core/handlers/base.py | base.py | .py | 2,996 | https://github.com/0xZDH/o365spray/blob/master/o365spray/core/handlers/base.py | https://raw.githubusercontent.com/0xZDH/o365spray/master/o365spray/core/handlers/base.py | 12b9ef9832825349c48db3dfa661c25107d2a40e | Python | 2026-05-04T01:13:15.984800 | #!/usr/bin/env python3
import logging
import time
import requests # type: ignore
import urllib3 # type: ignore
from random import randint
from typing import (
Any,
Dict,
List,
Union,
)
from o365spray.core.utils import (
Defaults,
Helper,
)
urllib3.disable_warnings(urllib3.exceptions.Insecu... |
0xZDH/o365spray | https://github.com/0xZDH/o365spray | https://api.github.com/repos/0xZDH/o365spray | 0xZDH | o365spray | Username enumeration and password spraying tool aimed at Microsoft O365. | 997 | 119 | 997 | mit | master | 2019-08-07T14:47:45 | 2026-04-28T07:50:39 | [
"enumeration",
"password-spray",
"pentest",
"pentesting-tools",
"python",
"python3",
"security",
"security-tools"
] | {
"Python": 123401
} | false | 7 | o365spray/core/handlers/enumerator/enumerate.py | enumerate.py | .py | 2,981 | https://github.com/0xZDH/o365spray/blob/master/o365spray/core/handlers/enumerator/enumerate.py | https://raw.githubusercontent.com/0xZDH/o365spray/master/o365spray/core/handlers/enumerator/enumerate.py | 03e02f26ba7bc1682af71df4bfcdb79fdc2a9963 | Python | 2026-05-04T01:13:17.187388 | #!/usr/bin/env python3
import argparse
import asyncio
import importlib
import logging
import signal
import sys
from pathlib import Path
from o365spray.core.utils import (
Defaults,
Helper,
)
def enumerate(args: argparse.Namespace, output_dir: str) -> object:
"""Run user enumeration against a given domai... |
0xZDH/o365spray | https://github.com/0xZDH/o365spray | https://api.github.com/repos/0xZDH/o365spray | 0xZDH | o365spray | Username enumeration and password spraying tool aimed at Microsoft O365. | 997 | 119 | 997 | mit | master | 2019-08-07T14:47:45 | 2026-04-28T07:50:39 | [
"enumeration",
"password-spray",
"pentest",
"pentesting-tools",
"python",
"python3",
"security",
"security-tools"
] | {
"Python": 123401
} | false | 7 | o365spray/core/handlers/enumerator/modules/autodiscover.py | autodiscover.py | .py | 3,157 | https://github.com/0xZDH/o365spray/blob/master/o365spray/core/handlers/enumerator/modules/autodiscover.py | https://raw.githubusercontent.com/0xZDH/o365spray/master/o365spray/core/handlers/enumerator/modules/autodiscover.py | e48b237c3b3813db5d98c07afe594c0ff5b07e4f | Python | 2026-05-04T01:13:18.336990 | #!/usr/bin/env python3
import logging
import time
from o365spray.core.handlers.enumerator.modules.base import EnumeratorBase
from o365spray.core.utils import (
Defaults,
Helper,
text_colors,
)
class EnumerateModule_autodiscover(EnumeratorBase):
"""Autodiscover Enumeration module class"""
def __... |
0xZDH/o365spray | https://github.com/0xZDH/o365spray | https://api.github.com/repos/0xZDH/o365spray | 0xZDH | o365spray | Username enumeration and password spraying tool aimed at Microsoft O365. | 997 | 119 | 997 | mit | master | 2019-08-07T14:47:45 | 2026-04-28T07:50:39 | [
"enumeration",
"password-spray",
"pentest",
"pentesting-tools",
"python",
"python3",
"security",
"security-tools"
] | {
"Python": 123401
} | false | 7 | o365spray/core/handlers/enumerator/modules/autologon.py | autologon.py | .py | 5,492 | https://github.com/0xZDH/o365spray/blob/master/o365spray/core/handlers/enumerator/modules/autologon.py | https://raw.githubusercontent.com/0xZDH/o365spray/master/o365spray/core/handlers/enumerator/modules/autologon.py | 7feb01dd22f17c055d6ae74248f15e7dd44b1596 | Python | 2026-05-04T01:13:20.175236 | #!/usr/bin/env python3
import logging
import time
from datetime import (
datetime,
timedelta,
)
from uuid import uuid4
from o365spray.core.handlers.enumerator.modules.base import EnumeratorBase
from o365spray.core.utils import (
Defaults,
Helper,
text_colors,
)
class EnumerateModule_autologon(En... |
0xZDH/o365spray | https://github.com/0xZDH/o365spray | https://api.github.com/repos/0xZDH/o365spray | 0xZDH | o365spray | Username enumeration and password spraying tool aimed at Microsoft O365. | 997 | 119 | 997 | mit | master | 2019-08-07T14:47:45 | 2026-04-28T07:50:39 | [
"enumeration",
"password-spray",
"pentest",
"pentesting-tools",
"python",
"python3",
"security",
"security-tools"
] | {
"Python": 123401
} | false | 7 | o365spray/core/handlers/enumerator/modules/base.py | base.py | .py | 6,723 | https://github.com/0xZDH/o365spray/blob/master/o365spray/core/handlers/enumerator/modules/base.py | https://raw.githubusercontent.com/0xZDH/o365spray/master/o365spray/core/handlers/enumerator/modules/base.py | 9d0cce8e73c128746de71ec925e8dd01fc8c0424 | Python | 2026-05-04T01:13:21.609335 | #!/usr/bin/env python3
import asyncio
import concurrent.futures
import concurrent.futures.thread
import logging
import urllib3 # type: ignore
from functools import partial
from typing import (
Dict,
List,
Union,
)
from o365spray.core.handlers.base import BaseHandler
from o365spray.core.utils import (
... |
0xZDH/o365spray | https://github.com/0xZDH/o365spray | https://api.github.com/repos/0xZDH/o365spray | 0xZDH | o365spray | Username enumeration and password spraying tool aimed at Microsoft O365. | 997 | 119 | 997 | mit | master | 2019-08-07T14:47:45 | 2026-04-28T07:50:39 | [
"enumeration",
"password-spray",
"pentest",
"pentesting-tools",
"python",
"python3",
"security",
"security-tools"
] | {
"Python": 123401
} | false | 7 | o365spray/core/handlers/enumerator/modules/oauth2.py | oauth2.py | .py | 4,184 | https://github.com/0xZDH/o365spray/blob/master/o365spray/core/handlers/enumerator/modules/oauth2.py | https://raw.githubusercontent.com/0xZDH/o365spray/master/o365spray/core/handlers/enumerator/modules/oauth2.py | 915c5f04fe7eb03fff1fdc6f881a2647070f5d5e | Python | 2026-05-04T01:13:22.841774 | #!/usr/bin/env python3
import logging
import time
from o365spray.core.handlers.enumerator.modules.base import EnumeratorBase
from o365spray.core.utils import (
Defaults,
Helper,
text_colors,
)
class EnumerateModule_oauth2(EnumeratorBase):
"""oAuth2 Enumeration module class"""
def __init__(self,... |
0xZDH/o365spray | https://github.com/0xZDH/o365spray | https://api.github.com/repos/0xZDH/o365spray | 0xZDH | o365spray | Username enumeration and password spraying tool aimed at Microsoft O365. | 997 | 119 | 997 | mit | master | 2019-08-07T14:47:45 | 2026-04-28T07:50:39 | [
"enumeration",
"password-spray",
"pentest",
"pentesting-tools",
"python",
"python3",
"security",
"security-tools"
] | {
"Python": 123401
} | false | 7 | o365spray/core/handlers/enumerator/modules/office.py | office.py | .py | 5,438 | https://github.com/0xZDH/o365spray/blob/master/o365spray/core/handlers/enumerator/modules/office.py | https://raw.githubusercontent.com/0xZDH/o365spray/master/o365spray/core/handlers/enumerator/modules/office.py | cb771fe61cedfe297cfd21676cf6e7c6f98dc680 | Python | 2026-05-04T01:13:24.270454 | #!/usr/bin/env python3
import logging
import time
from o365spray.core.handlers.enumerator.modules.base import EnumeratorBase
from o365spray.core.utils import (
Defaults,
Helper,
text_colors,
)
class EnumerateModule_office(EnumeratorBase):
"""Office Enumeration module class"""
def __init__(self,... |
A dataset consisting of source code from GitHub repositories.

Each record in the dataset contains the following fields:

- repo_full_name: The full name of the repository (owner/name).
- repo_url: Direct link to the GitHub repository.
- stars: Number of stars at the time of parsing.
- license: Repository license (mit or apache-2.0).
- file_path: Path to the file within the repository.
- language: The programming language of the file.
- parsed_at: Timestamp when the file was processed.
- text: The raw source code content.
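Records with this schema can be filtered with plain Python once loaded (on the Hub you would typically fetch them via `datasets.load_dataset` with the dataset's repo id, which is not given here). A small self-contained sketch over one abridged record:

```python
# A single record shaped like the schema above; values are abridged from the
# preview rows, and the "text" field holds the raw source with newlines intact.
record = {
    "repo_full_name": "Liuhong99/Sophia",
    "stars": 999,
    "license": "mit",
    "file_path": "config/train_gpt2_small_adam.py",
    "language": "Python",
    "text": "batch_size = 8\nblock_size = 1024\n",
}

def is_python_config(rec):
    """True for Python files that live under a config/ directory."""
    return rec["language"] == "Python" and rec["file_path"].startswith("config/")

assert is_python_config(record)
print(record["repo_full_name"], record["file_path"])
```

The same predicate works unchanged as the callable passed to `Dataset.filter` when the records are loaded through the `datasets` library.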