Still thinking about this post from @tess. (See the threaded follow-up for explanation!)
-
Non-programmers tend to think that •syntax• is the hard part of programming, but it’s not. The hard part is dealing with unambiguous communication, watching a machine do •exactly what you told it to do• — no matter how wrong you were, or how little sense it makes, or how small the mistake. Nothing makes one feel as stupid as writing code! It’s sorcerer’s apprentice all day every day: fallibility come to life.
Computers make our imaginary ideas talk back to us, and our ideas surprise us.
4/
-
@inthehands and data != context or understanding … something people seem to misunderstand these days.
-
We programmers, I think, tend to misunderstand the legal system, imagining that laws are like code — or that they •should• be like code, and the problem with them is that they’re insufficiently precise.
That’s wildly incorrect.
The whole •point• of the law is that it’s ambiguous and interpreted by humans. Justice Oliver Wendell Holmes: “The life of the law has not been logic: it has been experience.”
5/
-
Paul Cantrell replied to Paul Cantrell
As much as laws have unintended consequences — sometimes consequences contrary to intent, even consequences that •nobody• wanted! — we do have room for humans to wiggle within and around the law, to argue its meaning, to refine and sharpen its interpretation. That means courts reading the law, yes, and civil disobedience — and it also includes the creative workaround and/or malicious compliance of these city workers repaving a sidewalk.
Interpretive latitude in the law is a feature, not a bug.
6/
-
@inthehands Absolutely! Without the incredibly flexible, deliberately-misunderstandable nature of English, our potential for puns would be greatly reduced, which would be a national tragedy.
-
Paul Cantrell replied to Paul Cantrell
(If you are still on a thing about how the law really should be like code, please, please imagine: would you trust the people who serve in Congress to write actual code and put it in production without being able to test it? Really?? “Live debugging” of the law is the only thing that keeps society based in the rule of law from collapsing!)
7/
-
@inthehands I remember Amelia Bedelia being surprised that some people would “dust” their furniture - she would un-dust her furniture!
-
@inthehands This reminds me of the (fortunately) brief time during the cryptocurrency bubble when a few techbros thought that “code as law” would be a workable concept. Even without the instances of “code as law” losing badly to “existing law as law”, there were way too many cases of the code doing exactly what code will do, and screwing its own designers over in ways that were completely predictable to non-techbros.
-
Paul Cantrell replied to Paul Cantrell
Part of the appeal of LLMs, I think, is that they make a tantalizing false promise of allowing humans to give instructions to computers expecting the same kind of common-sense interpretation we expect of humans.
Bad news, kids: it’s still a machine. •Somewhere• in the chain, it becomes unambiguous. And that moment where we cross the threshold from ambiguous, contextualized human interpretation to unambiguous machine instructions — •that• is the moment of programming. There live the demons.
8/
-
@inthehands
Computer: Your code is unambiguous
Also computer: NullPointerException
-
In code, it’s the programmer who crosses that threshold. We straddle two worlds: the messy world of human communication and human context, and the alien world of machine logic.
If you turn code into LLM prompting, who crosses that threshold? It’s not the prompt-writer, not really; they only shape it, like a middle manager talking to a programmer. It’s not the author of the LLM’s code either.
It’s a random number generator.
9/
-
Sheldon Chang 🇺🇸 replied to Paul Cantrell
@inthehands every time I get asked if I'm nervous about some random technology replacing my skills as a developer, I laugh. The tech isn't the roadblock. The humans are. Until humans are better at humaning, I'll always have a job understanding what the humans really mean when they say they want this vs that.
I got into tech work in an unusual way. I was a physical therapist before, and I spent my days hearing people's theories on why their knee hurt. It was the weather. They sat in the wrong chair 10 years ago. They took the wrong supplement. It's because they wore shorts on a cold day... and then after all that I'd conclude they actually had a hip or a back problem, or needed new shoes.
Figuring out what humans want with technology has actually been a lot easier than understanding humans well enough to help them understand how their own bodies work. I had no certifications then and still have none, but I leaned hard into my interviewing skills to gain an edge as a late starter.
-
An LLM will •randomly• turn a prompt into some unambiguous machine behavior that has some probability of fitting the prompt. And — worse! — it reapplies that randomness afresh every time.
That’s not necessarily a bad thing. Randomness can be useful. Generating random possibilities can be useful.
But I don’t think that’s what people have in mind when they imagine replacing programming with generative AI.
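That re-rolled randomness can be sketched in a few lines of Python. This is a toy illustration, not a real LLM: the "vocabulary" of completions and their weights are invented here, and real models sample token by token rather than whole answers at once.

```python
import random

# Toy stand-in for an LLM's sampling step. The completions and their
# probabilities below are invented purely for illustration.
COMPLETIONS = ["sort ascending", "sort descending", "sort by length"]
WEIGHTS = [0.7, 0.2, 0.1]

def generate(prompt: str, rng: random.Random) -> str:
    # The prompt never changes, but the output is a weighted random
    # draw, re-rolled afresh on every call.
    return rng.choices(COMPLETIONS, weights=WEIGHTS, k=1)[0]

rng = random.Random()  # unseeded: fresh randomness every run
outputs = {generate("sort this list", rng) for _ in range(100)}
# Across 100 calls with an identical prompt, several different
# "interpretations" show up in `outputs`.
```

Seeding the generator would make runs repeatable, which is loosely what fixed seeds or greedy decoding aim for in real deployments; by default, though, each call is a fresh roll of the dice.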
10/
-
Say what you will about software, but at least it’s consistent: right or wrong, at least it will follow the same instructions in the same unambiguous way.
Say what you will about the law, but at least it’s human: we constantly renegotiate it, reshape it as we live it, hot patch it as we follow it.
LLMs are neither of these things. I don’t think we have good intuitions about what they are. A lot of sloppy thinking is filling in the gaps, and investors are capitalizing on that.
/end
-
The “Can’t you just…” mantra is NOT limited to SWE; it applies to all technical jobs.
-
@emenel
Indeed. Related: https://hachyderm.io/@inthehands/113227415258787221
-
@ShadSterling
Loved those books.
-
@RommelRico
You jest, but NullPointerException was a major safety advancement over dereferencing null and getting whatever’s at memory address zero! Why “safety?” Because it made the machine’s rule (“you just can’t dereference null, period”) something that humans could mentally model and predict.
-
Paul Cantrell replied to Sheldon Chang 🇺🇸
@partnumber2 @tess
Yes, and not just technical either. ALL jobs!
Quoted post: Paul Cantrell (@[email protected]) on Hachyderm.io (hachyderm.io): “Can’t you just…” is an idiom that means “I don’t understand the work you do at all”