AI & ML interests: None yet
Organizations: None yet
Image-Text-to-Text models (0.1B parameters each):
- yardeny/vlm_vity_tiny_SmolLM2_135M
- yardeny/vlm_vity_tiny_SmolLM2_135M_epoch4
- yardeny/vlm_vity_tiny_SmolLM2_135M_projector
- yardeny/vlm_vity_tiny_SmolLM2_135M_projector_epoch4
- yardeny/vlm_vity_tiny_SmolLM2_135M_epoch3
- yardeny/vlm_vity_tiny_SmolLM2_135M_projector_epoch3
- yardeny/vlm_vity_tiny_SmolLM2_135M_projector_epoch2
- yardeny/vlm_vity_tiny_SmolLM2_135M_epoch2
- yardeny/vlm_vity_tiny_SmolLM2_135M_projector_epoch1
- yardeny/vlm_vity_tiny_SmolLM2_135M_projector_epoch0
- yardeny/vlm_vity_tiny_SmolLM2_135M_epoch1
- yardeny/vlm_vity_tiny_SmolLM2_135M_epoch0
Other models:
- yardeny/pretrain_t5_small_context_64_lr_5e5_ga_1_1gpu
- yardeny/pretrain_t5_small_context_512_64_from_10K_lr_5e5_ga_1_1gpu
- yardeny/t5_small_context_256_64_batch8_from_7p5K_lr_5e5_ga_1_1gpu
- yardeny/t5_small_context_512_batch8_lr_5e5_ga_1_1gpu
- yardeny/pretrain_t5_small_context_256_lr_1e5_ga_1_1gpu
- yardeny/pretrain_t5_small_context_256_to_64_lr_5e5_ga_1_1gpu
- yardeny/pretrain_t5_small_context_256_lr_5e5_ga_1_1gpu
- yardeny/pretrain_t5_small_context_64_lr_5e6_ga_1_1gpu
- yardeny/pretrain_t5_small_context_256
- yardeny/pretrain_t5_small_context_64_lr_5e5
- yardeny/pretrain_t5_small_context_64_tokenizer_base
- yardeny/pretrain_t5_context_512_lr_1e5
- yardeny/pretrain_t5_context_64_lr_1e5
- yardeny/pretrain_t5_context_64_lr_5e5
- yardeny/pretrain_gpt2_LR1e3_context_64_batch_8GA4
- yardeny/pretrain_gpt2_LR5e7_context_64
- yardeny/pretrain_gpt2_LR5e7_2gpu