I have a very long, possibly book-length take on LLMs that has been brewing since May 2015, but basically: wow, humans love to take something that works extremely well in a certain narrow domain and then bend over backwards to insist it will solve every ...
-
replied to Random Geek last edited by
@randomgeek added to the list!
-
replied to blaine last edited by
@blaine @darius so, I don't think technical people realize how empowering LLMs are.
Normally, people who have ideas about how to make technology work have to either learn how to code or pay a rare, expensive, and recalcitrant programmer to make the technology for them.
Some systems have simple macro languages, but most don't. So learning to code means years of study from app basics on up.
-
replied to Evan Prodromou last edited by
@evan @blaine the use case you are describing is inside the very narrow domain that I feel pretty positive about
Although I don't think knowing how to code is any more empowering than knowing how to, say, verbally persuade people. The fact that it feels like magic and that verbal persuasion (as one example) doesn't is... a problem imo
-
replied to Darius Kazemi last edited by
@darius is it a natural human tendency or an effect of capitalism?
Put differently: if we weren't all focused on squeezing more profit than the last quarter out of everything, would this phenomenon exist or not?
-
replied to Chris Mabry last edited by
@chrismabry I think pre-capitalism we also saw the tendency toward wanting easy, totalizing solutions. It feels lazy to even point this out, but look at religion, politics, and so on, which all predate capitalism by millennia
-
replied to Darius Kazemi last edited by
"LLMs are like magic"
I agree on the grounds that basically everything is magical if you think about it hard enough.
I'm probably more literal and serious about this statement than the reader imagines
-
replied to Evan Prodromou last edited by
@evan @darius my reference point for this is every non-profit that's interacted with a tech person who's built a tool that "helps" the non-profit. The first pass is easy, trivial even. LLMs are great at technology at that level. It's the long-term social stuff that's hard; "The Team is the Unit of Delivery" and all that.
"sudo make me a magical 500,000 cell excel spreadsheet but no way to manage the complexity"
-
replied to Darius Kazemi last edited by
@darius "Any sufficiently advanced technology is indistinguishable from magic." - https://en.wikipedia.org/wiki/Clarke%27s_three_laws
It really should say "any technology not currently within your understanding"... just 'cause YOU can't explain or understand it doesn't mean that it isn't understandable.
-
replied to Kathe Todd-Brown last edited by
@ktoddbrown I'm actually being more literal than that!
-
replied to Darius Kazemi last edited by
@darius not to mention folks ready to profit off another 'cure-all'. I don't know enough about AI, but that's the general feeling I get...
-
replied to Darius Kazemi last edited by
@darius [approaching a server farm] Wait… do you feel that? The higher elements are active here
-
replied to blaine last edited by
@blaine @evan @darius the fallacy at the core of a lot of this stuff is the idea that the hard part of making software is writing the first draft of it. Which... it's not that programming isn't difficult or that making it more accessible isn't good, but once you become passably OK at it, you just start finding lots of other problems you previously weren't aware of
-
replied to Darius Kazemi last edited by
@darius everything is magic if you don't understand it