| title stringlengths 1 300 | score int64 0 8.54k | selftext stringlengths 0 41.5k | created timestamp[ns] 2023-04-01 04:30:41 → 2026-03-04 02:14:14 ⌀ | url stringlengths 0 878 | author stringlengths 3 20 | domain stringlengths 0 82 | edited timestamp[ns] 1970-01-01 00:00:00 → 2026-02-19 14:51:53 | gilded int64 0 2 | gildings stringclasses 7 values | id stringlengths 7 7 | locked bool 2 classes | media stringlengths 646 1.8k ⌀ | name stringlengths 10 10 | permalink stringlengths 33 82 | spoiler bool 2 classes | stickied bool 2 classes | thumbnail stringlengths 4 213 ⌀ | ups int64 0 8.54k | preview stringlengths 301 5.01k ⌀ |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Deepseek R1 lied about its codeforces rating to be 2029? | 1 | [removed] | 2025-06-24T19:08:20 | https://www.reddit.com/r/LocalLLaMA/comments/1ljjzfv/deepseek_r1_lied_about_its_codeforces_rating_to/ | ThemeResponsible2116 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljjzfv | false | null | t3_1ljjzfv | /r/LocalLLaMA/comments/1ljjzfv/deepseek_r1_lied_about_its_codeforces_rating_to/ | false | false | 1 | null | |
Angry creator seeks free AI to rewrite the fire OpenAI tried to put out | 1 | [removed] | 2025-06-24T19:06:17 | https://www.reddit.com/r/LocalLLaMA/comments/1ljjxgx/angry_creator_seeks_free_ai_to_rewrite_the_fire/ | A_R_N-c_a | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljjxgx | false | null | t3_1ljjxgx | /r/LocalLLaMA/comments/1ljjxgx/angry_creator_seeks_free_ai_to_rewrite_the_fire/ | false | false | self | 1 | null |
Tiny Tavern - IA character mobile app via Ollama | 1 | [removed] | 2025-06-24T18:57:04 | https://www.reddit.com/r/LocalLLaMA/comments/1ljjojq/tiny_tavern_ia_character_mobile_app_via_ollama/ | Ill_Marketing_5245 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljjojq | false | null | t3_1ljjojq | /r/LocalLLaMA/comments/1ljjojq/tiny_tavern_ia_character_mobile_app_via_ollama/ | false | false | 1 | null | |
The LLM's RL Revelation We Didn't See Coming | 1 | [removed] | 2025-06-24T18:50:26 | https://youtu.be/z3awgfU4yno | FeathersOfTheArrow | youtu.be | 1970-01-01T00:00:00 | 0 | {} | 1ljji8f | false | {'oembed': {'author_name': 'bycloud', 'author_url': 'https://www.youtube.com/@bycloudAI', 'height': 200, 'html': '<iframe width="356" height="200" src="https://www.youtube.com/embed/z3awgfU4yno?feature=oembed&enablejsapi=1" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; pic... | t3_1ljji8f | /r/LocalLLaMA/comments/1ljji8f/the_llms_rl_revelation_we_didnt_see_coming/ | false | false | default | 1 | null |
Anthropic wins a major fair use victory for AI (Purchased copies of books is fair use for training) | 1 | 2025-06-24T18:34:53 | https://www.theverge.com/news/692015/anthropic-wins-a-major-fair-use-victory-for-ai-but-its-still-in-trouble-for-stealing-books | theZeitt | theverge.com | 1970-01-01T00:00:00 | 0 | {} | 1ljj3ey | false | null | t3_1ljj3ey | /r/LocalLLaMA/comments/1ljj3ey/anthropic_wins_a_major_fair_use_victory_for_ai/ | false | false | default | 1 | null | |
So, are we back? | 1 | [removed] | 2025-06-24T18:20:32 | https://www.reddit.com/r/LocalLLaMA/comments/1ljipwv/so_are_we_back/ | Herr_Drosselmeyer | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljipwv | false | null | t3_1ljipwv | /r/LocalLLaMA/comments/1ljipwv/so_are_we_back/ | false | false | self | 1 | null |
Running on TPU ?!! | 1 | [removed] | 2025-06-24T17:47:24 | https://www.reddit.com/r/LocalLLaMA/comments/1ljhttx/running_on_tpu/ | Symbiote_in_me | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljhttx | false | null | t3_1ljhttx | /r/LocalLLaMA/comments/1ljhttx/running_on_tpu/ | false | false | self | 1 | null |
I wanna create a startup using LLaMa smthg, idk what? any ideas geeks? | 1 | [removed] | 2025-06-24T17:44:57 | https://www.reddit.com/r/LocalLLaMA/comments/1ljhrh3/i_wanna_create_a_startup_using_llama_smthg_idk/ | Expert-Address-2918 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljhrh3 | false | null | t3_1ljhrh3 | /r/LocalLLaMA/comments/1ljhrh3/i_wanna_create_a_startup_using_llama_smthg_idk/ | false | false | self | 1 | null |
Federal Judge: Training On Copyrighted Works Is Fair Use | 1 | [removed] | 2025-06-24T17:44:04 | https://www.reddit.com/r/LocalLLaMA/comments/1ljhql9/federal_judge_training_on_copyrighted_works_is/ | MrPecunius | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljhql9 | false | null | t3_1ljhql9 | /r/LocalLLaMA/comments/1ljhql9/federal_judge_training_on_copyrighted_works_is/ | false | false | self | 1 | null |
I'm sure most people have read about the Claud Spiritual Bliss Attractor and I wanted to reproduce it locally, so I made Resonant Chat Arena, a simple python script to put two LLMs in conversation with each other. | 8 | 2025-06-24T17:33:15 | https://github.com/jkingsman/resonant-chat-arena | CharlesStross | github.com | 1970-01-01T00:00:00 | 0 | {} | 1ljhg1i | false | null | t3_1ljhg1i | /r/LocalLLaMA/comments/1ljhg1i/im_sure_most_people_have_read_about_the_claud/ | false | false | 8 | {'enabled': False, 'images': [{'id': 'rna5zREg5_FzFmMGv-Mzfn4pHDOOgy6GUqSdq0vIQVE', 'resolutions': [{'height': 54, 'url': 'https://external-preview.redd.it/rna5zREg5_FzFmMGv-Mzfn4pHDOOgy6GUqSdq0vIQVE.png?width=108&crop=smart&auto=webp&s=be8cbbf89194fd69bf6142f0e8f0f036ca1df411', 'width': 108}, {'height': 108, 'url': 'h... | ||
Are there leaderboards that ranking LLM for tasks? | 1 | [removed] | 2025-06-24T17:12:14 | https://www.reddit.com/r/LocalLLaMA/comments/1ljgvod/are_there_leaderboards_that_ranking_llm_for_tasks/ | GTHell | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljgvod | false | null | t3_1ljgvod | /r/LocalLLaMA/comments/1ljgvod/are_there_leaderboards_that_ranking_llm_for_tasks/ | false | false | self | 1 | null |
Tokilake: Unlock Your Idle GPUs as a Unified LLM API – Behind NAT, Fully Private | 1 | [removed] | 2025-06-24T16:42:15 | https://www.reddit.com/r/LocalLLaMA/comments/1ljg35h/tokilake_unlock_your_idle_gpus_as_a_unified_llm/ | Square-Air6513 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljg35h | false | null | t3_1ljg35h | /r/LocalLLaMA/comments/1ljg35h/tokilake_unlock_your_idle_gpus_as_a_unified_llm/ | false | false | self | 1 | null |
WebBench: A real-world benchmark for Browser Agents | 1 | [removed] | 2025-06-24T16:40:55 | Impressive_Half_2819 | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ljg1ux | false | null | t3_1ljg1ux | /r/LocalLLaMA/comments/1ljg1ux/webbench_a_realworld_benchmark_for_browser_agents/ | false | false | default | 1 | {'enabled': True, 'images': [{'id': 'qkwd19xhlw8f1', 'resolutions': [{'height': 61, 'url': 'https://preview.redd.it/qkwd19xhlw8f1.jpeg?width=108&crop=smart&auto=webp&s=ca4b9df8943992c44e9dd02311d75e3f4ed297db', 'width': 108}, {'height': 123, 'url': 'https://preview.redd.it/qkwd19xhlw8f1.jpeg?width=216&crop=smart&auto=w... | |
What's the best Vision Model for local OCR ( scanned invoices etc.) on a rtx 5080 in june 2025? | 1 | [removed] | 2025-06-24T16:18:58 | https://www.reddit.com/r/LocalLLaMA/comments/1ljfgsd/whats_the_best_vision_model_for_local_ocr_scanned/ | Key_Rush_3180 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljfgsd | false | null | t3_1ljfgsd | /r/LocalLLaMA/comments/1ljfgsd/whats_the_best_vision_model_for_local_ocr_scanned/ | false | false | self | 1 | null |
Day 2 of 50 Days of Building a Small Language Model from Scratch — Tokenizers: The Unsung Heroes of Language Models | 1 | [removed] | 2025-06-24T16:10:27 | https://www.reddit.com/r/LocalLLaMA/comments/1ljf8ls/day_2_of_50_days_of_building_a_small_language/ | Prashant-Lakhera | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljf8ls | false | null | t3_1ljf8ls | /r/LocalLLaMA/comments/1ljf8ls/day_2_of_50_days_of_building_a_small_language/ | false | false | self | 1 | null |
What's a good model for generating a schedule for multiple employees? | 1 | [removed] | 2025-06-24T16:08:54 | https://www.reddit.com/r/LocalLLaMA/comments/1ljf76m/whats_a_good_model_for_generating_a_schedule_for/ | ThatGreenM-M | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljf76m | false | null | t3_1ljf76m | /r/LocalLLaMA/comments/1ljf76m/whats_a_good_model_for_generating_a_schedule_for/ | false | false | self | 1 | null |
I built a tool to calculate exactly how many GPUs you need—based on your chosen model, quantization, context length, concurrency level, and target throughput. | 1 | [removed] | 2025-06-24T16:03:31 | https://www.reddit.com/r/LocalLLaMA/comments/1ljf1z4/i_built_a_tool_to_calculate_exactly_how_many_gpus/ | RubJunior488 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljf1z4 | false | null | t3_1ljf1z4 | /r/LocalLLaMA/comments/1ljf1z4/i_built_a_tool_to_calculate_exactly_how_many_gpus/ | false | false | 1 | null | |
Applying COCONUT continuous reasoning into a learnt linear layer that produces sampling parameters (temp, top-k, top-p, etc.) for the current token | 1 | [removed] | 2025-06-24T15:51:26 | ryunuck | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ljeqko | false | null | t3_1ljeqko | /r/LocalLLaMA/comments/1ljeqko/applying_coconut_continuous_reasoning_into_a/ | false | false | 1 | {'enabled': True, 'images': [{'id': 'JRIP_xkg4BuH_ZZnihw-prRSBimm16YO4u5T6SZALhc', 'resolutions': [{'height': 144, 'url': 'https://preview.redd.it/k1malgvjcw8f1.png?width=108&crop=smart&auto=webp&s=ca66e2e50a9b024b413fe22b1397932563367ac5', 'width': 108}, {'height': 288, 'url': 'https://preview.redd.it/k1malgvjcw8f1.pn... | ||
Local equivalent to Gemini 2.0 flash | 1 | [removed] | 2025-06-24T15:50:26 | https://www.reddit.com/r/LocalLLaMA/comments/1ljepmx/local_equivalent_to_gemini_20_flash/ | Russ_Dill | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljepmx | false | null | t3_1ljepmx | /r/LocalLLaMA/comments/1ljepmx/local_equivalent_to_gemini_20_flash/ | false | false | self | 1 | null |
I built a tool to calculate exactly how many GPUs you need—based on your chosen model, quantization, context length, concurrency level, and target throughput. | 1 | [removed] | 2025-06-24T15:50:02 | https://www.reddit.com/r/LocalLLaMA/comments/1ljep8m/i_built_a_tool_to_calculate_exactly_how_many_gpus/ | RubJunior488 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljep8m | false | null | t3_1ljep8m | /r/LocalLLaMA/comments/1ljep8m/i_built_a_tool_to_calculate_exactly_how_many_gpus/ | false | false | self | 1 | null |
Applying COCONUT continuous reasoning into a learnt linear layer that produces sampling parameters (temp, top-k, top-p, etc.) for the current token | 1 | [removed] | 2025-06-24T15:47:14 | https://www.reddit.com/r/LocalLLaMA/comments/1ljemhd/applying_coconut_continuous_reasoning_into_a/ | ryunuck | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljemhd | false | null | t3_1ljemhd | /r/LocalLLaMA/comments/1ljemhd/applying_coconut_continuous_reasoning_into_a/ | false | false | 1 | null | |
Help with vLLM Speculative Decoding using Medusa | 1 | [removed] | 2025-06-24T15:43:29 | https://www.reddit.com/r/LocalLLaMA/comments/1ljeizj/help_with_vllm_speculative_decoding_using_medusa/ | Equivalent_Pair_4146 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljeizj | false | null | t3_1ljeizj | /r/LocalLLaMA/comments/1ljeizj/help_with_vllm_speculative_decoding_using_medusa/ | false | false | self | 1 | null |
Applying COCONUT continuous reasoning into a learnt linear layer that produces sampling parameters (temp, top-k, top-p, etc.) for the current token | 1 | [removed] | 2025-06-24T15:42:42 | psychonucks | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ljei7l | false | null | t3_1ljei7l | /r/LocalLLaMA/comments/1ljei7l/applying_coconut_continuous_reasoning_into_a/ | false | false | default | 1 | {'enabled': True, 'images': [{'id': 'ttjtuk92bw8f1', 'resolutions': [{'height': 144, 'url': 'https://preview.redd.it/ttjtuk92bw8f1.png?width=108&crop=smart&auto=webp&s=43304c1d2b04e081425d806184ca26b436c44435', 'width': 108}, {'height': 288, 'url': 'https://preview.redd.it/ttjtuk92bw8f1.png?width=216&crop=smart&auto=we... | |
Applying COCONUT continuous reasoning into a learnt linear layer that produces sampling parameters (temp, top-k, top-p, etc.) for the current token | 1 | [removed] | 2025-06-24T15:40:18 | ryunuck | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ljefwo | false | null | t3_1ljefwo | /r/LocalLLaMA/comments/1ljefwo/applying_coconut_continuous_reasoning_into_a/ | false | false | 1 | {'enabled': True, 'images': [{'id': 'JRIP_xkg4BuH_ZZnihw-prRSBimm16YO4u5T6SZALhc', 'resolutions': [{'height': 144, 'url': 'https://preview.redd.it/o7mv6dugaw8f1.png?width=108&crop=smart&auto=webp&s=843fb5f5735191c6789bdde40227822e168f0360', 'width': 108}, {'height': 288, 'url': 'https://preview.redd.it/o7mv6dugaw8f1.pn... | ||
Applying COCONUT continuous reasoning into a learnt linear layer that produces sampling parameters (temp, top-k, top-p, etc.) for the current token | 1 | [removed] | 2025-06-24T15:37:21 | ryunuck | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ljed12 | false | null | t3_1ljed12 | /r/LocalLLaMA/comments/1ljed12/applying_coconut_continuous_reasoning_into_a/ | false | false | default | 1 | {'enabled': True, 'images': [{'id': '8r7jlwzw9w8f1', 'resolutions': [{'height': 144, 'url': 'https://preview.redd.it/8r7jlwzw9w8f1.png?width=108&crop=smart&auto=webp&s=f80f572936e71a88cd9a898040b4e846b610d95c', 'width': 108}, {'height': 288, 'url': 'https://preview.redd.it/8r7jlwzw9w8f1.png?width=216&crop=smart&auto=we... | |
Issues with Qwen | 1 | [removed] | 2025-06-24T15:34:38 | https://www.reddit.com/r/LocalLLaMA/comments/1ljeagz/issues_with_qwen/ | LazyChampionship5819 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljeagz | false | null | t3_1ljeagz | /r/LocalLLaMA/comments/1ljeagz/issues_with_qwen/ | false | false | 1 | null | |
Why aren’t there any new posts? | 1 | [removed] | 2025-06-24T15:11:55 | https://www.reddit.com/r/LocalLLaMA/comments/1ljdoxb/why_arent_there_any_new_posts/ | tubi_el_tababa | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljdoxb | false | null | t3_1ljdoxb | /r/LocalLLaMA/comments/1ljdoxb/why_arent_there_any_new_posts/ | false | false | self | 1 | null |
Seeking recommendations for an advanced, company-funded AI/LLM course | 1 | [removed] | 2025-06-24T14:51:05 | https://www.reddit.com/r/LocalLLaMA/comments/1ljd5co/seeking_recommendations_for_an_advanced/ | amunocis | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljd5co | false | null | t3_1ljd5co | /r/LocalLLaMA/comments/1ljd5co/seeking_recommendations_for_an_advanced/ | false | false | self | 1 | {'enabled': False, 'images': [{'id': 'SKRWjohBObgx3yHzNjtPrtwjGEYT8p8D_A_p6zNsKM8', 'resolutions': [{'height': 60, 'url': 'https://external-preview.redd.it/SKRWjohBObgx3yHzNjtPrtwjGEYT8p8D_A_p6zNsKM8.png?width=108&crop=smart&auto=webp&s=dd798cbbdbb71fca5a22358785768f607fd15254', 'width': 108}, {'height': 121, 'url': 'h... |
Speed comparison for Gemma 3 27B | 1 | [removed] | 2025-06-24T14:49:25 | https://www.reddit.com/r/LocalLLaMA/comments/1ljd3uz/speed_comparison_for_gemma_3_27b/ | phazze777 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljd3uz | false | null | t3_1ljd3uz | /r/LocalLLaMA/comments/1ljd3uz/speed_comparison_for_gemma_3_27b/ | false | false | self | 1 | null |
Automating Form Mapping with AI | 1 | [removed] | 2025-06-24T14:48:37 | https://www.reddit.com/r/LocalLLaMA/comments/1ljd355/automating_form_mapping_with_ai/ | carrick1363 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljd355 | false | null | t3_1ljd355 | /r/LocalLLaMA/comments/1ljd355/automating_form_mapping_with_ai/ | false | false | self | 1 | null |
GGUF Vision Models - Does it make a difference if I pick f16 or bf16 for the mmproj file? | 1 | [removed] | 2025-06-24T14:47:13 | https://www.reddit.com/r/LocalLLaMA/comments/1ljd1t1/gguf_vision_models_does_it_make_a_difference_if_i/ | nmkd | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljd1t1 | false | null | t3_1ljd1t1 | /r/LocalLLaMA/comments/1ljd1t1/gguf_vision_models_does_it_make_a_difference_if_i/ | false | false | self | 1 | null |
Confused about serving STT and TTS concurrently through API. Any help would be appreciated! | 1 | [removed] | 2025-06-24T14:28:27 | https://www.reddit.com/r/LocalLLaMA/comments/1ljckln/confused_about_serving_stt_and_tts_concurrently/ | learninggamdev | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljckln | false | null | t3_1ljckln | /r/LocalLLaMA/comments/1ljckln/confused_about_serving_stt_and_tts_concurrently/ | false | false | self | 1 | null |
LM Studio to read documents? | 1 | [removed] | 2025-06-24T14:26:56 | https://www.reddit.com/r/LocalLLaMA/comments/1ljcj7n/lm_studio_to_read_documents/ | rocky_balboa202 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljcj7n | false | null | t3_1ljcj7n | /r/LocalLLaMA/comments/1ljcj7n/lm_studio_to_read_documents/ | false | false | self | 1 | null |
Any open source text to speech that gives you more expressive control? | 1 | [removed] | 2025-06-24T14:07:44 | https://www.reddit.com/r/LocalLLaMA/comments/1ljc24w/any_open_source_text_to_speech_that_gives_you/ | Brad12d3 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljc24w | false | null | t3_1ljc24w | /r/LocalLLaMA/comments/1ljc24w/any_open_source_text_to_speech_that_gives_you/ | false | false | self | 1 | null |
How many of you are using MCP? | 1 | [removed] | 2025-06-24T13:51:21 | https://www.reddit.com/r/LocalLLaMA/comments/1ljbnpk/how_many_of_you_are_using_mcp/ | Yapper_Zipper | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljbnpk | false | null | t3_1ljbnpk | /r/LocalLLaMA/comments/1ljbnpk/how_many_of_you_are_using_mcp/ | false | false | self | 1 | null |
I'm releasing my app ! | 1 | [removed] | 2025-06-24T13:51:01 | Kindly-Treacle-6378 | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ljbng3 | false | null | t3_1ljbng3 | /r/LocalLLaMA/comments/1ljbng3/im_releasing_my_app/ | false | false | default | 1 | {'enabled': True, 'images': [{'id': 'alg4dvl6rv8f1', 'resolutions': [{'height': 216, 'url': 'https://preview.redd.it/alg4dvl6rv8f1.png?width=108&crop=smart&auto=webp&s=725af794422dc20e1312d1a71bc585bf08bd6462', 'width': 108}, {'height': 432, 'url': 'https://preview.redd.it/alg4dvl6rv8f1.png?width=216&crop=smart&auto=we... | |
What t f happened !!!!???? | 1 | [removed] | 2025-06-24T13:46:21 | https://www.reddit.com/r/LocalLLaMA/comments/1ljbjej/what_t_f_happened/ | Zealousideal-Cut590 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljbjej | false | null | t3_1ljbjej | /r/LocalLLaMA/comments/1ljbjej/what_t_f_happened/ | false | false | self | 1 | null |
What are best LLMs in french ? | 1 | [removed] | 2025-06-24T13:23:04 | https://www.reddit.com/r/LocalLLaMA/comments/1ljb00d/what_are_best_llms_in_french/ | Head_Mushroom_3748 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljb00d | false | null | t3_1ljb00d | /r/LocalLLaMA/comments/1ljb00d/what_are_best_llms_in_french/ | false | false | self | 1 | null |
Introducing Atria, a self-hosted AI building platform (beta launch) | 1 | [removed] | 2025-06-24T13:22:41 | https://www.reddit.com/r/LocalLLaMA/comments/1ljazob/introducing_atria_a_selfhosted_ai_building/ | StygianBlue2 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ljazob | false | null | t3_1ljazob | /r/LocalLLaMA/comments/1ljazob/introducing_atria_a_selfhosted_ai_building/ | false | false | self | 1 | null |
I Discovered the road to ASI 🧐 | 1 | [removed] | 2025-06-24T12:46:22 | https://www.reddit.com/r/LocalLLaMA/comments/1lja6ji/i_discovered_the_road_to_asi/ | 1EvilSexyGenius | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lja6ji | false | null | t3_1lja6ji | /r/LocalLLaMA/comments/1lja6ji/i_discovered_the_road_to_asi/ | false | false | self | 1 | null |
Where to start learning? | 1 | [removed] | 2025-06-24T12:35:19 | https://www.reddit.com/r/LocalLLaMA/comments/1lj9xyu/where_to_start_learning/ | Ok_Caterpillar_9945 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lj9xyu | false | null | t3_1lj9xyu | /r/LocalLLaMA/comments/1lj9xyu/where_to_start_learning/ | false | false | self | 1 | null |
(meta question) Many comments on this sub are not visible for me | 1 | [removed] | 2025-06-24T12:33:00 | https://www.reddit.com/r/LocalLLaMA/comments/1lj9w5d/meta_question_many_comments_on_this_sub_are_not/ | ilhud9s | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lj9w5d | false | null | t3_1lj9w5d | /r/LocalLLaMA/comments/1lj9w5d/meta_question_many_comments_on_this_sub_are_not/ | false | false | self | 1 | null |
How to train AI agent with my local repository and use it in house? | 1 | [removed] | 2025-06-24T12:14:13 | https://www.reddit.com/r/LocalLLaMA/comments/1lj9hue/how_to_train_ai_agent_with_my_local_repository/ | 01101110111motiv | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lj9hue | false | null | t3_1lj9hue | /r/LocalLLaMA/comments/1lj9hue/how_to_train_ai_agent_with_my_local_repository/ | false | false | self | 1 | null |
Speech to text with emotions | 1 | [removed] | 2025-06-24T12:10:33 | https://www.reddit.com/r/LocalLLaMA/comments/1lj9f72/speech_to_text_with_emotions/ | iboima | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lj9f72 | false | null | t3_1lj9f72 | /r/LocalLLaMA/comments/1lj9f72/speech_to_text_with_emotions/ | false | false | self | 1 | null |
Jailbreaking Gemini | 1 | [removed] | 2025-06-24T11:48:29 | https://www.reddit.com/r/LocalLLaMA/comments/1lj8zgx/jailbreaking_gemini/ | Sufficient_Abies2479 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lj8zgx | false | null | t3_1lj8zgx | /r/LocalLLaMA/comments/1lj8zgx/jailbreaking_gemini/ | false | false | self | 1 | null |
Is it possible to run 2x Radeon RX 7600 XT 16 GB for local AI? | 1 | [removed] | 2025-06-24T11:41:27 | https://www.reddit.com/r/LocalLLaMA/comments/1lj8ukb/is_it_possible_to_run_2x_radeon_rx_7600_xt_16_gb/ | whatever462672 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lj8ukb | false | null | t3_1lj8ukb | /r/LocalLLaMA/comments/1lj8ukb/is_it_possible_to_run_2x_radeon_rx_7600_xt_16_gb/ | false | false | self | 1 | null |
What is happening with r/LocalLLama? | 1 | [removed] | 2025-06-24T10:56:32 | https://www.reddit.com/r/LocalLLaMA/comments/1lj80j6/what_is_happening_with_rlocalllama/ | benja0x40 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lj80j6 | false | null | t3_1lj80j6 | /r/LocalLLaMA/comments/1lj80j6/what_is_happening_with_rlocalllama/ | false | false | self | 1 | null |
I was done scrolling, so i built a Alt Tab like UI to navigate questions in ChatGPT | 1 | [removed] | 2025-06-24T10:21:47 | https://v.redd.it/6133mqg5pu8f1 | CategoryFew5869 | v.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1lj7fs5 | false | {'reddit_video': {'bitrate_kbps': 5000, 'dash_url': 'https://v.redd.it/6133mqg5pu8f1/DASHPlaylist.mpd?a=1753352519%2CMjdlM2ZhNmYzZTM2OGU4ZjU0MzFiOWEwYzBkNThlYzliNDEzODYyNzBlNTFlMzYyM2IzY2JhOTIzZThiNDk1Ng%3D%3D&v=1&f=sd', 'duration': 21, 'fallback_url': 'https://v.redd.it/6133mqg5pu8f1/DASH_1080.mp4?source=fallback', 'h... | t3_1lj7fs5 | /r/LocalLLaMA/comments/1lj7fs5/i_was_done_scrolling_so_i_built_a_alt_tab_like_ui/ | false | false | 1 | {'enabled': False, 'images': [{'id': 'MGc0cnpxZzVwdThmMYC__n0Nn3l7RMQZdme2zTaTfGW2W0uGQqrbgd6LLmdx', 'resolutions': [{'height': 60, 'url': 'https://external-preview.redd.it/MGc0cnpxZzVwdThmMYC__n0Nn3l7RMQZdme2zTaTfGW2W0uGQqrbgd6LLmdx.png?width=108&crop=smart&format=pjpg&auto=webp&s=2f29e832cd817751021df2ac42a473297955b... | |
why do we have to tokenize our input in huggingface but not in ollama? | 1 | [removed] | 2025-06-24T10:02:11 | https://www.reddit.com/r/LocalLLaMA/comments/1lj748z/why_do_we_have_to_tokenize_our_input_in/ | Beyond_Birthday_13 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lj748z | false | null | t3_1lj748z | /r/LocalLLaMA/comments/1lj748z/why_do_we_have_to_tokenize_our_input_in/ | false | false | self | 1 | null |
Mistral-small 3.2 24B tool call parser | 1 | [removed] | 2025-06-24T09:59:54 | https://www.reddit.com/r/LocalLLaMA/comments/1lj72pt/mistralsmall_32_24b_tool_call_parser/ | Remarkable-Law9287 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lj72pt | false | null | t3_1lj72pt | /r/LocalLLaMA/comments/1lj72pt/mistralsmall_32_24b_tool_call_parser/ | false | false | self | 1 | null |
It's art | 1 | 2025-06-24T09:29:59 | kernel348 | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1lj6lnm | false | null | t3_1lj6lnm | /r/LocalLLaMA/comments/1lj6lnm/its_art/ | false | false | default | 1 | {'enabled': True, 'images': [{'id': 'fsp50n0mgu8f1', 'resolutions': [{'height': 108, 'url': 'https://preview.redd.it/fsp50n0mgu8f1.jpeg?width=108&crop=smart&auto=webp&s=d895b4afdbf46520df71664da11864ebe5305274', 'width': 108}, {'height': 216, 'url': 'https://preview.redd.it/fsp50n0mgu8f1.jpeg?width=216&crop=smart&auto=... | ||
s | 1 | [removed] | 2025-06-24T09:29:44 | https://www.reddit.com/r/LocalLLaMA/comments/1lj6liy/s/ | Worthstream | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lj6liy | false | null | t3_1lj6liy | /r/LocalLLaMA/comments/1lj6liy/s/ | false | false | self | 1 | null |
what happened to this subreddit? | 1 | [removed] | 2025-06-24T09:21:35 | https://www.reddit.com/r/LocalLLaMA/comments/1lj6h1o/what_happened_to_this_subreddit/ | anonymous_anki | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lj6h1o | false | null | t3_1lj6h1o | /r/LocalLLaMA/comments/1lj6h1o/what_happened_to_this_subreddit/ | false | false | self | 1 | null |
Mistral Small 3.1 Instruct suddenly a gated model | 1 | [removed] | 2025-06-24T09:18:54 | https://www.reddit.com/r/LocalLLaMA/comments/1lj6fiy/mistral_small_31_instruct_suddenly_a_gated_model/ | gustavhertz | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lj6fiy | false | null | t3_1lj6fiy | /r/LocalLLaMA/comments/1lj6fiy/mistral_small_31_instruct_suddenly_a_gated_model/ | false | false | 1 | null | |
Qwen2.5 14b on an M1 Pro with 16gb RAM | 1 | [removed] | 2025-06-24T09:08:13 | https://www.reddit.com/r/LocalLLaMA/comments/1lj69o2/qwen25_14b_on_an_m1_pro_with_16gb_ram/ | gamerboy12555 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lj69o2 | false | null | t3_1lj69o2 | /r/LocalLLaMA/comments/1lj69o2/qwen25_14b_on_an_m1_pro_with_16gb_ram/ | false | false | self | 1 | null |
How to read a text in an image which is rotated and extract the hand-marked checkbox values through prompt? | 1 | [removed] | 2025-06-24T08:44:17 | https://www.reddit.com/r/LocalLLaMA/comments/1lj5wlm/how_to_read_a_text_in_an_image_which_is_rotated/ | TariqSm | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lj5wlm | false | null | t3_1lj5wlm | /r/LocalLLaMA/comments/1lj5wlm/how_to_read_a_text_in_an_image_which_is_rotated/ | false | false | self | 1 | null |
What are the best AI tools that can build a web app from just a prompt? | 1 | [removed] | 2025-06-24T08:30:38 | https://www.reddit.com/r/LocalLLaMA/comments/1lj5pbj/what_are_the_best_ai_tools_that_can_build_a_web/ | netixc1 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lj5pbj | false | null | t3_1lj5pbj | /r/LocalLLaMA/comments/1lj5pbj/what_are_the_best_ai_tools_that_can_build_a_web/ | false | false | self | 1 | null |
Best truly uncensored open-source LLM | 1 | [removed] | 2025-06-24T08:10:49 | https://www.reddit.com/r/LocalLLaMA/comments/1lj5eoq/best_truly_uncensored_opensource_llm/ | Over_Friendship3455 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lj5eoq | false | null | t3_1lj5eoq | /r/LocalLLaMA/comments/1lj5eoq/best_truly_uncensored_opensource_llm/ | false | false | self | 1 | null |
Doing a half-assed RAG | 1 | [removed] | 2025-06-24T07:58:22 | https://www.reddit.com/r/LocalLLaMA/comments/1lj5829/doing_a_halfassed_rag/ | HistorianPotential48 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lj5829 | false | null | t3_1lj5829 | /r/LocalLLaMA/comments/1lj5829/doing_a_halfassed_rag/ | false | false | self | 1 | null |
😂🏎️🏎️🏎️ | 1 | 2025-06-24T07:49:40 | codegolf-guru | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1lj53gv | false | null | t3_1lj53gv | /r/LocalLLaMA/comments/1lj53gv/_/ | false | false | default | 1 | {'enabled': True, 'images': [{'id': '973m4slnxt8f1', 'resolutions': [{'height': 139, 'url': 'https://preview.redd.it/973m4slnxt8f1.png?width=108&crop=smart&auto=webp&s=a47777d96a155d910c844b2bf720cb77c4b72dee', 'width': 108}, {'height': 278, 'url': 'https://preview.redd.it/973m4slnxt8f1.png?width=216&crop=smart&auto=we... | ||
Oh shit, the user is right | 1 | [removed] | 2025-06-24T07:09:54 | cool_xixi | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1lj4iae | false | null | t3_1lj4iae | /r/LocalLLaMA/comments/1lj4iae/oh_shit_the_user_is_right/ | false | false | 1 | {'enabled': True, 'images': [{'id': 'u7uQsxFoHlA_4q5qqI6XkGin9_uKjSXrScHrF9aKtRs', 'resolutions': [{'height': 37, 'url': 'https://preview.redd.it/yumgniygrt8f1.png?width=108&crop=smart&auto=webp&s=0a0abad07258f203568de17764acf86e8eda2e43', 'width': 108}, {'height': 74, 'url': 'https://preview.redd.it/yumgniygrt8f1.png?... | ||
Qualcomm adds Whisper large and medium to AI Hub Models collection | 1 | [removed] | 2025-06-24T07:05:54 | https://github.com/quic/ai-hub-models/releases/tag/v0.30.5 | Balance- | github.com | 1970-01-01T00:00:00 | 0 | {} | 1lj4g6f | false | null | t3_1lj4g6f | /r/LocalLLaMA/comments/1lj4g6f/qualcomm_adds_whisper_large_and_medium_to_ai_hub/ | false | false | 1 | {'enabled': False, 'images': [{'id': 'uYzwDTVcqVWRdorcuWeLEdil7LEtkxRhQIcQiWCygLI', 'resolutions': [{'height': 54, 'url': 'https://external-preview.redd.it/uYzwDTVcqVWRdorcuWeLEdil7LEtkxRhQIcQiWCygLI.png?width=108&crop=smart&auto=webp&s=8d38cad4b21aea56b376c5e9111ae872ebdeef0c', 'width': 108}, {'height': 108, 'url': 'h... | |
Bug Bounty Researcher | 1 | [removed] | 2025-06-24T06:23:33 | https://www.reddit.com/r/LocalLLaMA/comments/1lj3sil/bug_bounty_researcher/ | No_Paraphernalia | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lj3sil | false | null | t3_1lj3sil | /r/LocalLLaMA/comments/1lj3sil/bug_bounty_researcher/ | false | false | self | 1 | null |
Just open-sourced Eion - a shared memory system for AI agents | 1 | [removed] | 2025-06-24T06:14:35 | https://www.reddit.com/r/LocalLLaMA/comments/1lj3nlt/just_opensourced_eion_a_shared_memory_system_for/ | 7wdb417 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lj3nlt | false | null | t3_1lj3nlt | /r/LocalLLaMA/comments/1lj3nlt/just_opensourced_eion_a_shared_memory_system_for/ | false | false | self | 1 | null |
Phi-mini-MoE & Phi-tiny-MoE | 1 | 2025-06-24T05:48:20 | https://huggingface.co/microsoft/Phi-mini-MoE-instruct | kristaller486 | huggingface.co | 1970-01-01T00:00:00 | 0 | {} | 1lj38kr | false | null | t3_1lj38kr | /r/LocalLLaMA/comments/1lj38kr/phiminimoe_phitinymoe/ | false | false | default | 1 | null | |
Phi-tiny-MoE & Phi-mini-MoE released | 1 | [removed] | 2025-06-24T05:47:13 | https://huggingface.co/microsoft/Phi-mini-MoE-instruct | kristaller486 | huggingface.co | 1970-01-01T00:00:00 | 0 | {} | 1lj37y8 | false | null | t3_1lj37y8 | /r/LocalLLaMA/comments/1lj37y8/phitinymoe_phiminimoe_released/ | false | false | default | 1 | null |
I built a LOCAL OS that makes LLMs into REAL autonomous agents (no more prompt-chaining BS | 1 | [removed] | 2025-06-24T04:49:51 | https://github.com/iluxu/llmbasedos | iluxu | github.com | 1970-01-01T00:00:00 | 0 | {} | 1lj29ii | false | null | t3_1lj29ii | /r/LocalLLaMA/comments/1lj29ii/i_built_a_local_os_that_makes_llms_into_real/ | false | false | 1 | {'enabled': False, 'images': [{'id': 'jgEuQnheBQAiYNAguw1V1SKJQxy22rvASFf_3EqKWfI', 'resolutions': [{'height': 54, 'url': 'https://external-preview.redd.it/jgEuQnheBQAiYNAguw1V1SKJQxy22rvASFf_3EqKWfI.png?width=108&crop=smart&auto=webp&s=a039e85e21244846e59cda2e2fa470b632a56517', 'width': 108}, {'height': 108, 'url': 'h... | |
A test method to assess whether LLMs actually "think" | 1 | [removed] | 2025-06-24T04:08:31 | https://www.reddit.com/r/LocalLLaMA/comments/1lj1j27/a_test_method_to_assess_whether_llms_actually/ | Upper-Pressure-1954 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lj1j27 | false | null | t3_1lj1j27 | /r/LocalLLaMA/comments/1lj1j27/a_test_method_to_assess_whether_llms_actually/ | false | false | self | 1 | null |
Maybe stupid question but, Why can't Field Specific models be ridden of unnecessary information? Wouldn't that make them better? | 1 | [removed] | 2025-06-24T04:04:59 | https://www.reddit.com/gallery/1lj1gro | HareMayor | reddit.com | 1970-01-01T00:00:00 | 0 | {} | 1lj1gro | false | null | t3_1lj1gro | /r/LocalLLaMA/comments/1lj1gro/maybe_stupid_question_but_why_cant_field_specific/ | false | false | 1 | {'enabled': True, 'images': [{'id': 'G7k-TQabDNi3vQVw1KUm6Tp7frp2cVPQpNw402uPmIo', 'resolutions': [{'height': 39, 'url': 'https://external-preview.redd.it/G7k-TQabDNi3vQVw1KUm6Tp7frp2cVPQpNw402uPmIo.png?width=108&crop=smart&auto=webp&s=58fb216f58d0193ed0f7e3062368f3ef886776ee', 'width': 108}, {'height': 79, 'url': 'htt... | |
Mistral-Small got an update | 1 | [removed] | 2025-06-24T03:20:31 | https://www.reddit.com/r/LocalLLaMA/comments/1lj0mpl/mistralsmall_got_an_update/ | ttkciar | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lj0mpl | false | null | t3_1lj0mpl | /r/LocalLLaMA/comments/1lj0mpl/mistralsmall_got_an_update/ | false | false | self | 1 | {'enabled': False, 'images': [{'id': '3DBqKqgOLKDMFbcrOD5Qa-3M1IIegLfhMX6TTbsgXeU', 'resolutions': [{'height': 58, 'url': 'https://external-preview.redd.it/3DBqKqgOLKDMFbcrOD5Qa-3M1IIegLfhMX6TTbsgXeU.png?width=108&crop=smart&auto=webp&s=bcb646eb0d29b10fc855c3faa4ec547bea3a2720', 'width': 108}, {'height': 116, 'url': 'h... |
bought a tower with an rtx 4090 and want to install another rtx 4090 externally | 1 | [removed] | 2025-06-24T02:51:19 | https://www.reddit.com/r/LocalLLaMA/comments/1lj029h/bought_a_tower_with_an_rtx_4090_and_want_to/ | vegatx40 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lj029h | false | null | t3_1lj029h | /r/LocalLLaMA/comments/1lj029h/bought_a_tower_with_an_rtx_4090_and_want_to/ | false | false | self | 1 | null |
Is it possible to de-align models? | 1 | [removed] | 2025-06-24T02:44:31 | https://www.reddit.com/r/LocalLLaMA/comments/1lizxhs/is_it_possible_to_dealign_models/ | Remarkable_Story_310 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lizxhs | false | null | t3_1lizxhs | /r/LocalLLaMA/comments/1lizxhs/is_it_possible_to_dealign_models/ | false | false | self | 1 | null |
GPU Upgrade on an Outdated Build or Start Fresh? | 1 | [removed] | 2025-06-24T02:41:12 | https://www.reddit.com/r/LocalLLaMA/comments/1lizv4y/gpu_upgrade_on_an_outdated_build_or_start_fresh/ | swanlake523 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lizv4y | false | null | t3_1lizv4y | /r/LocalLLaMA/comments/1lizv4y/gpu_upgrade_on_an_outdated_build_or_start_fresh/ | false | false | self | 1 | null |
Unlimited llm usage !!! How much are we willing to pay? | 1 | [removed]
[View Poll](https://www.reddit.com/poll/1lizq99) | 2025-06-24T02:34:15 | https://www.reddit.com/r/LocalLLaMA/comments/1lizq99/unlimited_llm_usage_how_much_are_we_willing_to_pay/ | Inevitable-Orange-43 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lizq99 | false | null | t3_1lizq99 | /r/LocalLLaMA/comments/1lizq99/unlimited_llm_usage_how_much_are_we_willing_to_pay/ | false | false | self | 1 | null |
What's the best nsfw llm that I can run on my 5090? | 1 | [removed] | 2025-06-24T01:38:57 | https://www.reddit.com/r/LocalLLaMA/comments/1liylsy/whats_the_best_nsfw_llm_that_i_can_run_on_my_5090/ | jwheeler2210 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1liylsy | false | null | t3_1liylsy | /r/LocalLLaMA/comments/1liylsy/whats_the_best_nsfw_llm_that_i_can_run_on_my_5090/ | false | false | nsfw | 1 | null |
Uncensored model with persistent memory that works as an assistent? | 1 | [removed] | 2025-06-24T00:02:09 | https://www.reddit.com/r/LocalLLaMA/comments/1liwmip/uncensored_model_with_persistent_memory_that/ | Born_Ground_8919 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1liwmip | false | null | t3_1liwmip | /r/LocalLLaMA/comments/1liwmip/uncensored_model_with_persistent_memory_that/ | false | false | self | 1 | null |
Can I usw my old PC as server? | 1 | [removed] | 2025-06-23T23:43:24 | https://www.reddit.com/r/LocalLLaMA/comments/1liw7y7/can_i_usw_my_old_pc_as_server/ | Odd-Name-1556 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1liw7y7 | false | null | t3_1liw7y7 | /r/LocalLLaMA/comments/1liw7y7/can_i_usw_my_old_pc_as_server/ | false | false | self | 1 | null |
The Local LLM Research Challenge: Can Your Model Match GPT-4's ~95% Accuracy? | 1 | [removed] | 2025-06-23T23:14:34 | https://www.reddit.com/r/LocalLLaMA/comments/1livkug/the_local_llm_research_challenge_can_your_model/ | ComplexIt | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1livkug | false | null | t3_1livkug | /r/LocalLLaMA/comments/1livkug/the_local_llm_research_challenge_can_your_model/ | false | false | self | 1 | null |
Local LLM-Based AI Agent for Automated System Performance Debugging | 1 | [removed] | 2025-06-23T22:47:40 | https://www.reddit.com/r/LocalLLaMA/comments/1liuydy/local_llmbased_ai_agent_for_automated_system/ | Prashant-Lakhera | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1liuydy | false | null | t3_1liuydy | /r/LocalLLaMA/comments/1liuydy/local_llmbased_ai_agent_for_automated_system/ | false | false | 1 | null | |
Introducing the First AI Agent for System Performance Debugging | 1 | [removed] | 2025-06-23T22:33:28 | https://www.reddit.com/r/LocalLLaMA/comments/1liumf6/introducing_the_first_ai_agent_for_system/ | Prashant-Lakhera | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1liumf6 | false | null | t3_1liumf6 | /r/LocalLLaMA/comments/1liumf6/introducing_the_first_ai_agent_for_system/ | false | false | 1 | null | |
How much power would one need to run their own Deepseek? | 1 | [removed] | 2025-06-23T21:44:11 | https://www.reddit.com/r/LocalLLaMA/comments/1litewp/how_much_power_would_one_need_to_run_their_own/ | IZA_does_the_art | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1litewp | false | null | t3_1litewp | /r/LocalLLaMA/comments/1litewp/how_much_power_would_one_need_to_run_their_own/ | false | false | self | 1 | null |
Prompt Playground - app for comparing and fine-tuning LLM prompts | 1 | [removed] | 2025-06-23T21:39:18 | https://www.reddit.com/r/LocalLLaMA/comments/1litaod/prompt_playground_app_for_comparing_and/ | shaypser1 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1litaod | false | null | t3_1litaod | /r/LocalLLaMA/comments/1litaod/prompt_playground_app_for_comparing_and/ | false | false | self | 1 | null |
🧠💬 Introducing AI Dialogue Duo – A Two-AI Conversational Roleplay System (Open Source) | 1 | [removed] | 2025-06-23T21:33:49 | https://www.reddit.com/r/LocalLLaMA/comments/1lit5t3/introducing_ai_dialogue_duo_a_twoai/ | Reasonable_Brief578 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lit5t3 | false | null | t3_1lit5t3 | /r/LocalLLaMA/comments/1lit5t3/introducing_ai_dialogue_duo_a_twoai/ | false | false | self | 1 | null |
Strange Results Running dots.llm1 instruct IQ4_XS? | 1 | [removed] | 2025-06-23T21:30:58 | https://www.reddit.com/r/LocalLLaMA/comments/1lit36k/strange_results_running_dotsllm1_instruct_iq4_xs/ | random-tomato | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lit36k | false | null | t3_1lit36k | /r/LocalLLaMA/comments/1lit36k/strange_results_running_dotsllm1_instruct_iq4_xs/ | false | false | self | 1 | null |
Best model to code a game with unity | 1 | [removed] | 2025-06-23T21:26:17 | https://www.reddit.com/r/LocalLLaMA/comments/1lisz0r/best_model_to_code_a_game_with_unity/ | Momkiller781 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lisz0r | false | null | t3_1lisz0r | /r/LocalLLaMA/comments/1lisz0r/best_model_to_code_a_game_with_unity/ | false | false | self | 1 | null |
New Gemini Model Released on google ai studio!!!! | 1 | [removed] | 2025-06-23T21:18:11 | https://www.reddit.com/r/LocalLLaMA/comments/1lisrxl/new_gemini_model_released_on_google_ai_studio/ | Minute_Window_9258 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lisrxl | false | null | t3_1lisrxl | /r/LocalLLaMA/comments/1lisrxl/new_gemini_model_released_on_google_ai_studio/ | false | false | 1 | null | |
I’ve been building an AI platform called “I AM”, like ChatGPT but more personal | 1 | [removed] | 2025-06-23T20:52:37 | https://v.redd.it/w9wlrpoepq8f1 | axeltdesign23 | v.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1lis40a | false | {'reddit_video': {'bitrate_kbps': 5000, 'dash_url': 'https://v.redd.it/w9wlrpoepq8f1/DASHPlaylist.mpd?a=1753303973%2CZDViZDAxZDY5OGZjNTAxNGJjNWNiMjU2NWEzMzVlNDI3YzkxN2Q0NDRlM2QwMmE2NWQ2Mzg2YmIxMDFlMjYwNw%3D%3D&v=1&f=sd', 'duration': 19, 'fallback_url': 'https://v.redd.it/w9wlrpoepq8f1/DASH_1080.mp4?source=fallback', 'h... | t3_1lis40a | /r/LocalLLaMA/comments/1lis40a/ive_been_building_an_ai_platform_called_i_am_like/ | false | false | 1 | {'enabled': False, 'images': [{'id': 'dmJzczRxb2VwcThmMZ6P8Scnx6jx3p1Z6WlHM7krHJZkPpKvoWvFYtblBHeU', 'resolutions': [{'height': 58, 'url': 'https://external-preview.redd.it/dmJzczRxb2VwcThmMZ6P8Scnx6jx3p1Z6WlHM7krHJZkPpKvoWvFYtblBHeU.png?width=108&crop=smart&format=pjpg&auto=webp&s=4f7e8b6efae95e58ee08ccc012532cd3f1fc2... | |
Trying to Learn AI on My Own – Need Help Creating a Roadmap | 1 | [removed] | 2025-06-23T20:33:14 | https://www.reddit.com/r/LocalLLaMA/comments/1lirlx8/trying_to_learn_ai_on_my_own_need_help_creating_a/ | Specialist_Cry2443 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lirlx8 | false | null | t3_1lirlx8 | /r/LocalLLaMA/comments/1lirlx8/trying_to_learn_ai_on_my_own_need_help_creating_a/ | false | false | self | 1 | null |
[Off Topic] What happened to this post? | 1 | [removed] | 2025-06-23T20:33:03 | https://www.reddit.com/r/LocalLLaMA/comments/1lirlqg/off_topic_what_happened_to_this_post/ | IngenuityNo1411 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lirlqg | false | null | t3_1lirlqg | /r/LocalLLaMA/comments/1lirlqg/off_topic_what_happened_to_this_post/ | false | false | 1 | null | |
I'm building a 100% private, local AI assistant, but hit a wall with internet access. So I built this privacy-first solution. What do you think? | 1 | [removed] | 2025-06-23T20:27:50 | SmarterWaysProd | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1lirgtu | false | null | t3_1lirgtu | /r/LocalLLaMA/comments/1lirgtu/im_building_a_100_private_local_ai_assistant_but/ | false | false | 1 | {'enabled': True, 'images': [{'id': 'C7L95vQWMxYrKdqI9qKgLiJGtPYHTdyvtW1JZDzRcus', 'resolutions': [{'height': 83, 'url': 'https://preview.redd.it/quxy8ypzjq8f1.png?width=108&crop=smart&auto=webp&s=080a1232fc40a5a71a2eb3afc05df7a15c1af220', 'width': 108}, {'height': 166, 'url': 'https://preview.redd.it/quxy8ypzjq8f1.png... | ||
I figured out how to build AGI and here is my research work | 1 | [removed] | 2025-06-23T20:25:09 | https://playwithagi.com/ | Altruistic-Tea-5612 | playwithagi.com | 1970-01-01T00:00:00 | 0 | {} | 1lireai | false | null | t3_1lireai | /r/LocalLLaMA/comments/1lireai/i_figured_out_how_to_build_agi_and_here_is_my/ | false | false | default | 1 | null |
Translation benchmark? | 1 | [removed] | 2025-06-23T20:21:40 | https://www.reddit.com/r/LocalLLaMA/comments/1lirb0s/translation_benchmark/ | Educational_Grab_473 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1lirb0s | false | null | t3_1lirb0s | /r/LocalLLaMA/comments/1lirb0s/translation_benchmark/ | false | false | self | 1 | null |
Made An LLM Client for the PS Vita | 1 | [removed] | 2025-06-23T20:04:54 | https://v.redd.it/26zwv16h6q8f1 | ajunior7 | v.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1liqvfb | false | {'reddit_video': {'bitrate_kbps': 5000, 'dash_url': 'https://v.redd.it/26zwv16h6q8f1/DASHPlaylist.mpd?a=1753302367%2CYjQwZTIyMmY1MWJiYzRlOGQ5ZDNhNzQ3NzQzNTRmMzNlZTcyODc2N2E2YmMxM2M4OWFkZGJmODM4NjIzZDQwMw%3D%3D&v=1&f=sd', 'duration': 117, 'fallback_url': 'https://v.redd.it/26zwv16h6q8f1/DASH_1080.mp4?source=fallback', '... | t3_1liqvfb | /r/LocalLLaMA/comments/1liqvfb/made_an_llm_client_for_the_ps_vita/ | false | false | 1 | {'enabled': False, 'images': [{'id': 'bGp2bzYyNmg2cThmMfIP8BrPficmhyY5KB42Ptrwyms9E-ke6lpIPgzOipjX', 'resolutions': [{'height': 60, 'url': 'https://external-preview.redd.it/bGp2bzYyNmg2cThmMfIP8BrPficmhyY5KB42Ptrwyms9E-ke6lpIPgzOipjX.png?width=108&crop=smart&format=pjpg&auto=webp&s=e3ae1b8a163e8e7b2db34c6fc178650e02ce2... | |
Anyone tried to repurpose crypto mining rigs and use them for GenAI? | 1 | [removed] | 2025-06-23T19:48:57 | https://www.reddit.com/r/LocalLLaMA/comments/1liqgsz/anyone_tried_to_repurpose_crypto_mining_rigs_and/ | Illustrious_Swim9349 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1liqgsz | false | null | t3_1liqgsz | /r/LocalLLaMA/comments/1liqgsz/anyone_tried_to_repurpose_crypto_mining_rigs_and/ | false | false | self | 1 | null |
AMD Instinct MI60 (32gb VRAM) "llama bench" results for 10 models - Qwen3 30B A3B Q4_0 resulted in: pp512 - 1,165 t/s | tg128 68 t/s - Overall very pleased and resulted in a better outcome for my use case than I even expected | 1 | [removed] | 2025-06-23T19:45:07 | https://www.reddit.com/r/LocalLLaMA/comments/1liqd7c/amd_instinct_mi60_32gb_vram_llama_bench_results/ | FantasyMaster85 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1liqd7c | false | null | t3_1liqd7c | /r/LocalLLaMA/comments/1liqd7c/amd_instinct_mi60_32gb_vram_llama_bench_results/ | false | false | 1 | null | |
Gave full control to AI for one feature and instantly regretted it | 1 | [removed] | 2025-06-23T19:38:26 | https://www.reddit.com/r/LocalLLaMA/comments/1liq706/gave_full_control_to_ai_for_one_feature_and/ | eastwindtoday | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1liq706 | false | null | t3_1liq706 | /r/LocalLLaMA/comments/1liq706/gave_full_control_to_ai_for_one_feature_and/ | false | false | self | 1 | null |
What is LlamaBarn (llama.cpp) | 1 | [removed] | 2025-06-23T19:31:26 | https://www.reddit.com/r/LocalLLaMA/comments/1liq0fa/what_is_llamabarn_llamacpp/ | Broke_DBA | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1liq0fa | false | null | t3_1liq0fa | /r/LocalLLaMA/comments/1liq0fa/what_is_llamabarn_llamacpp/ | false | false | 1 | null | |
Don’t Just Throw AI at Problems – How to Design Great Use Cases | 1 | 2025-06-23T19:23:46 | https://upwarddynamism.com/ai-use-cases-prompts/design-thinking-gen-ai-use-cases/ | DarknStormyKnight | upwarddynamism.com | 1970-01-01T00:00:00 | 0 | {} | 1liptb1 | false | null | t3_1liptb1 | /r/LocalLLaMA/comments/1liptb1/dont_just_throw_ai_at_problems_how_to_design/ | false | false | default | 1 | null | |
What are the best self-hosted AI code assistants for local development without relying on cloud APIs? | 1 | [removed] | 2025-06-23T19:15:26 | https://www.reddit.com/r/LocalLLaMA/comments/1liplc0/what_are_the_best_selfhosted_ai_code_assistants/ | Sorry-Dragonfruit738 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1liplc0 | false | null | t3_1liplc0 | /r/LocalLLaMA/comments/1liplc0/what_are_the_best_selfhosted_ai_code_assistants/ | false | false | self | 1 | null |
Facefusion launches HyperSwap 256 model seems to outperform INSwapper 128 | 1 | [removed] | 2025-06-23T19:11:11 | https://www.reddit.com/r/LocalLLaMA/comments/1liphbc/facefusion_launches_hyperswap_256_model_seems_to/ | khubebk | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1liphbc | false | null | t3_1liphbc | /r/LocalLLaMA/comments/1liphbc/facefusion_launches_hyperswap_256_model_seems_to/ | false | false | 1 | null |