• 3 Posts
  • 471 Comments
Joined 2 years ago
Cake day: April 27th, 2024

  • It’s even weirder for me: I’ve got the voice in my head, but also a whole suite of 3D visualisations of tens of thousands of my memories, complete with the 3D environment, sound, the clothing I wore, the temperature, humidity, air pressure, the exact location, etc. I can even visualise entire buildings I’ve only been in a few times, like friends’ houses, or the primary, secondary and high school campuses I attended. Literally anything. I’ve also got my entire music playlist of probably 4000+ songs completely backed up in my noggin, and a whole heap more.

    And then my brain just decides “let’s sort through the trauma first, shall we?” Like, bro, you have virtually infinite other things to think about, and you pick the worst?





  • Actually, I agree. And so far, small local models are really solid, and can punch above their weight even when compared to frontier models.

    What I meant when I said I doubted it is that these AI corpos give no indication that local is an option, so most people assume they can only access an LLM through the web. That bolsters the SaaS ecosystem’s dominance over local AI, although local will keep growing as the more favourable option.

    And although I do agree that the industry will shift from server-based to PC-based inference as well, I don’t see that shift being large enough to make these companies change their training paradigms to include telemetry from local AI, though I’m sure some will.



  • I doubt it, but honestly, many systems can do inference pretty well. For example, I ran the MLX version of Qwen 3 4B with a DuckDuckGo search RAG to ask quick questions and verify simple things, on a MacBook Air (M2, 16 GB), and it barely made a dent in RAM utilisation or the SoC. The same goes for my much less powerful machines: even a Galaxy A20, with 3 GB of memory and a low-spec octa-core Exynos, can run small models really well, although the quantisation needs to be a bit strict.
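
    The "strict quantisation" point works out roughly like this. Here is a minimal back-of-envelope sketch (the function is mine, not from any library) that assumes weight memory dominates and ignores KV cache and runtime overhead, so real usage is somewhat higher:

    ```python
    # Rough estimate of LLM weight memory at different quantisation levels.
    # Assumption: memory ≈ parameter count × bits per weight / 8.

    def weight_memory_gb(params_billions: float, bits_per_weight: float) -> float:
        """Approximate weight memory in GB for a model of the given size."""
        bytes_total = params_billions * 1e9 * bits_per_weight / 8
        return bytes_total / 1e9

    # A 4B-parameter model like Qwen 3 4B:
    print(weight_memory_gb(4, 16))  # fp16: ~8 GB, too big for a 3 GB phone
    print(weight_memory_gb(4, 4))   # 4-bit quant: ~2 GB, borderline
    print(weight_memory_gb(4, 3))   # 3-bit quant: ~1.5 GB, fits with room to spare
    ```

    Which is why a 16 GB MacBook runs a 4-bit 4B model comfortably, while a 3 GB phone only gets there with aggressive quantisation.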