    AI News

    Brave Browser Gets Accelerated Local LLMs with Leo AI and Ollama

By AI Ninja · October 7 · 2 Mins Read

Many of us have used the top LLM tools to generate content and get answers to our questions. AI models like OpenAI’s ChatGPT and Google’s Bard are well known for their conversational abilities, but these solutions are entirely cloud-based. Alexa and Siri can answer basic questions, yet they are limited in many ways and also rely on the cloud. Running a local AI model on weak hardware is not ideal either, and local setups can be time-consuming to configure. As NVIDIA reports, Leo AI and Ollama can bring accelerated local LLMs to the Brave browser.

This technology relies on NVIDIA GPUs and their Tensor Cores to handle massive numbers of calculations in parallel. Under the hood, Leo AI uses llama.cpp, an open-source inference library and framework, to pull it off.
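For context, here is a minimal sketch of what GPU-accelerated llama.cpp inference looks like outside the browser, using the llama-cpp-python bindings. This is not Brave’s own integration code; it assumes a CUDA-enabled build of the bindings, and the model file name is a hypothetical placeholder.

```python
# Minimal sketch of GPU-accelerated llama.cpp inference via llama-cpp-python.
# Assumes a CUDA-enabled build and a local GGUF model file (placeholder name).
from llama_cpp import Llama

llm = Llama(
    model_path="./llama-3-8b-instruct.Q4_K_M.gguf",  # hypothetical local file
    n_gpu_layers=-1,   # offload all layers to the RTX GPU
    n_ctx=4096,        # context window size
)

out = llm.create_completion(
    prompt="Summarize this web page in two sentences: ...",
    max_tokens=128,
)
print(out["choices"][0]["text"])
```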

Brave’s Leo AI can run in the cloud or locally on a PC through Ollama. When you use a local model, your prompts never leave your machine, so you don’t have to worry about your privacy being compromised. With an RTX graphics card, inference is fast: using the Llama 3 8B model, you can expect responses at up to 110 words per second. Once you download Ollama, you can run a variety of models locally from the command line.
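As a rough illustration of what a local Ollama setup exposes, the sketch below sends a prompt to Ollama’s default local REST endpoint after a model has been pulled (for example with `ollama pull llama3` from the command line). The prompt text is made up for illustration and this is not Leo AI’s own code.

```python
# Rough sketch: query a locally running Ollama server (default port 11434)
# after pulling a model, e.g. `ollama pull llama3` from the command line.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",
        "prompt": "Explain in one sentence why local LLMs help privacy.",
        "stream": False,  # return one JSON object instead of a token stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])  # the model's generated text
```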

    [more info]
