Barth's Distinction: There are two types of people: those who divide people into two types, and those who don't.
—fortune(6)
Eagleson's Law: Any code of your own that you haven't looked at for six or more months might as well have been written by someone else. (Eagleson is an optimist; the real number is more like three weeks.)
—fortune(6)
@inthehands One could imagine using a fixed set of model weights and not retraining, using a fixed random seed, and keeping the model temperature relatively low. I'm imagining on some level basically the programming-language-generating version of Nvidia's DLSS tech here. But that's not what people are doing, and I'm not convinced that if we did it would be useful.
@inthehands Another bull-case argument about LLMs is that they are a form of autonomation (autonomy + automation), in the sense that the Toyota Production System uses it, the classic example being the automated loom which has a tension sensor and will stop if one of the warp yarns breaks. But we already have many such systems in software, made out of normal non-LLM parts, and also that's ... not really what's going on here, at least the way they're currently being used.
@inthehands Yes! Yes. This is it exactly.
One can imagine a version of these systems where all the "source code" is English-language text describing a software system, and the Makefile first runs that through an LLM to generate C or Python or whatever before handing it off to a regular compiler, which would in some sense be more abstraction, but this is like keeping the .o files around and making the programmers debug the assembly with a hex editor.
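The Makefile-as-LLM-frontend idea could be sketched something like this — purely hypothetically; `llm2c` and its flags are invented for illustration, with the seed pinned and temperature held low for reproducibility:

```make
# Hypothetical pipeline: English-language "source" (.spec.txt) is run
# through an LLM transpiler (llm2c, an invented tool) to produce C,
# which a regular compiler then builds.
%.c: %.spec.txt
	llm2c --seed 42 --temperature 0.1 $< > $@

app: app.c
	cc -O2 -o app app.c
```

Even in this deterministic setup, the generated C plays the role of the .o file: an intermediate artifact nobody should have to debug by hand.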
@inthehands The bet that a lot of these CXOs are making implicitly is that this will be like the transition from assembly to higher-level languages like C (I think most of them are too young and/or too disconnected to make it explicitly). And I'm not 100% sold on it but my 60% hunch is that it's not.