Ask HN: Are we going to back less powerful local LLMs?

9 points | by omertt27 1 day ago

4 comments

  • frangonf 1 day ago

    I think the trend is that top models are meant for companies. We small devs did our job of hyping and training, and now we can either pay way more, pay more and use non-SOTA models, give our data away to train Chinese models in the hope that they stay 6 months behind in the cold war and still need some of our input, or invest around $5-10K in a powerful local personal AI [0].

    On the other hand, I think AI can really raise the bar for "average tech", and we devs are wired to think that better tech == more value... but this might not be the case in the many, many cases where existing average tech and velocity are already good enough, and the real moat is the handshake, trust, marketing, etc.

    [0] https://github.com/antirez/ds4

    • aurareturn 1 day ago

      Funny I just came across a comment that accurately answers your question: https://news.ycombinator.com/item?id=48047722

      • al_borland 1 day ago

        This is one take. Not every project is made better by expanding the scope by 100x. This often makes things worse.

        • aurareturn 1 day ago

          One of the hardest things to do in software is to make things simple and easy to use. That could mean increasing the scope to do so. Scope doesn't automatically mean more bloat.

        • omertt27 1 day ago

          Thanks!

        • sdevonoes 1 day ago

          The same way most of us use Linux (not Windows) and Postgres/MySQL (not Oracle), many of us are using open-source models and not proprietary ones.

        • lalsanhim 19 hours ago

          I need a professional hiking system