I really like Drew's framework here dividing current AI use-cases into Gods (human replacement, which I think of as still mostly science fiction), Interns (assistants you delegate closely-reviewed tasks to, which is most of how I use LLMs today) and Cogs (smaller tools that can more reliably serve a single purpose, like Whisper for transcription) - more of my own notes on this here: https://simonwillison.net/2024/Oct/20/gods-interns-and-cogs/
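As a concrete illustration of the "Cogs" idea, Whisper does exactly one job - speech to text - reliably enough to slot into a pipeline. A minimal sketch using the open-source openai-whisper package; the model size and audio filename are placeholders, not anything from the thread:

```python
# pip install openai-whisper (also needs ffmpeg on the PATH)
import whisper

model = whisper.load_model("base")        # smallest practical model; larger ones are more accurate
result = model.transcribe("meeting.mp3")  # hypothetical audio file
print(result["text"])
```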
https://note.computer/@dbreunig/113330394050829486
-
@simon For the "gods" category, also check out @forrestbrazeal's excellent song "AGI (Artificial God Incarnate)": https://www.youtube.com/watch?v=1ZhhO7MGknQ
-
Jeff "weBOOOOlogy" Triplettreplied to Simon Willison last edited by
@simon I have been struggling with terminology, so this is useful. That said, I'm not a fan of "interns" used like this. The context you used it in felt more appropriate than a whole class of AI terminology that literally means replacing a useful class of workers and their learning.
I have personally struggled with the term Agents for lack of a framework or way to use them outside of running a Python script.
-
@matt @forrestbrazeal OK that is a banger! Blogged about it here: https://simonwillison.net/2024/Oct/20/knowledge-worker/
-
Simon Willison replied to Jeff "weBOOOOlogy" Triplett
@webology my usual version is "weird intern", to help emphasize that it's really not a replacement for an actual human intern - but I do find the term useful in that it signals the need for close supervision and some "coaching" to prod it in the right direction
I hate the term "agent": everyone seems to have a completely different idea of what it means, and people often don't realize there's no single universal definition, so nobody else understands what they mean
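For what it's worth, the definition that seems most common - and it is only one of many - is "an LLM calling tools in a loop until it decides it's done". A toy sketch of that shape; llm_call() and the tool table are hypothetical stand-ins, not any real vendor API:

```python
# One common definition of an "agent": an LLM calling tools in a loop
# until it decides it is done. Everything below is a hypothetical stub.

TOOLS = {
    "search": lambda query: f"(stub) results for {query!r}",
}

def llm_call(messages):
    # Hypothetical model call: on the first turn it picks a tool; once a
    # tool result is in the transcript it produces a final answer.
    if any(m["role"] == "tool" for m in messages):
        return {"answer": "Based on the tool output: " + messages[-1]["content"]}
    return {"tool": "search", "args": {"query": messages[0]["content"]}}

def run_agent(task, max_steps=5):
    messages = [{"role": "user", "content": task}]
    for _ in range(max_steps):
        decision = llm_call(messages)
        if "answer" in decision:              # the model chose to stop
            return decision["answer"]
        result = TOOLS[decision["tool"]](**decision["args"])
        messages.append({"role": "tool", "content": result})
    return "ran out of steps"

print(run_agent("weather in Brighton"))
```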
-
Simon Willison replied to Simon Willison
@matt @forrestbrazeal oh hot damn I've been working on a blog entry which tries to say what's in this song but in 10x more words and 1/10th as good https://www.youtube.com/watch?v=hrfEUZ0UvRo
-
Jeff "weBOOOOlogy" Triplettreplied to Simon Willison last edited by
@simon Knowing how tech works, I hate that the term "intern", when used for AI, is going to kill most internships, or at least the paid ones.
"Supervised {AI Term}" would be better.
Same goes for when someone brands a medical AI a "Nurse", which also feels like the wrong term.
I understand the "weird intern" terminology, but that reads more like you're asking weird questions and expecting weird results, rather than the answers one gets being weird by default.
-
Simon Willison replied to Jeff "weBOOOOlogy" Triplett
@webology I think the reason I don't find the term as upsetting is that I'm not sold on the "AI means no more junior/intern roles" thing yet
I think it means no more not-AI-enhanced juniors, but I'm very excited to see what AI-enhanced juniors end up looking like
Maybe I'm completely out of touch and the intern/junior role has been extinguished already though
-
Jeff "weBOOOOlogy" Triplettreplied to Simon Willison last edited by
@simon Overall, the numbers seem to have fallen off quite a bit this year. I lazily searched and linked the first article I found: https://www.bloomberg.com/news/articles/2024-06-13/looking-for-an-internship-fewer-spots-for-more-applicants-in-cutthroat-year
but I have heard a half dozen Bloomberg segments (my preferred morning finance/business news, via Echo) reporting on it for a while.
I would be shocked if 2025 isn't much worse.
-
Simon Willison replied to Jeff "weBOOOOlogy" Triplett
@webology right, but is that because of AI, or is it because the tech companies all over-hired during the pandemic, had massive layoffs and as a result the market is flooded with experienced talent which makes the market for juniors really awful?
-
Matt Campbell replied to Simon Willison
@simon You sure you linked to the right song? Have you gone through the kind of existential crisis portrayed in that song? I always saw you as being cautiously optimistic about AI, that it can be a useful tool if used well, but not making us redundant.
-
Simon Willison replied to Matt Campbell
@matt @forrestbrazeal oh I absolutely went through these 5 stages of grief, but I've been firmly in the 5th step for over a year at this point
-
Matt Campbell replied to Simon Willison
@simon @forrestbrazeal I don't think I ever actually felt the anger. And I feel kind of guilty about that, because of course some people have decided to stop there and fight against AI, and they have such moral certainty about it.
-
@simon @matt @forrestbrazeal I don’t feel like I am stuck in step 4, but I can’t get to step 5 because I do very much believe that (at least in its current incarnation) AI is going away. I am worried about the collateral damage that this particular dead tree is going to cause when it smashes into the rest of the ecosystem. E.g.: Nvidia currently has a market cap of 11% of GDP, which I do not think is sustainable. But who on earth is going to get a positive ROI on ChatGPT at $20/mo?
-
Simon Willison replied to Matt Campbell
@matt @forrestbrazeal I've not felt the anger personally, because I've been releasing open source code for 20+ years so I already default to "I want people to be able to reuse my work as much as possible" - but I absolutely understand the people who ARE angry
If I was an artist and someone trained Stable Diffusion on my work without my permission and then started competing with me for commissions that used my own personal style I think I'd feel very differently about this all
-
@glyph @matt @forrestbrazeal I expect there's going to be a substantial AI crash, but I don't think (most of) the tools I'm using right now will become unavailable to me - especially since I can run Llama 70B on my own laptop now
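Running "Llama 70B on my own laptop" typically means a quantized build via llama.cpp. A minimal sketch using the llama-cpp-python bindings, assuming a separately downloaded GGUF file (the path is a placeholder):

```python
# pip install llama-cpp-python
from llama_cpp import Llama

# Hypothetical path: a 4-bit quantized 70B GGUF file, downloaded separately.
llm = Llama(
    model_path="./llama-3-70b-instruct.Q4_K_M.gguf",
    n_ctx=4096,       # context window size
    n_gpu_layers=-1,  # offload all layers to GPU/Metal if available
)

out = llm("Q: Why might someone run an LLM locally?\nA:", max_tokens=128)
print(out["choices"][0]["text"])
```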
-
@matt @simon while I agree that the models you are using locally will not go away, I am also skeptical that they will receive much in the way of maintenance once the commercial bubble has burst. Nobody is getting paid when you run a model on your laptop locally, and training them still requires a zillion hours of GPU time and a corpus in the petabytes
-
Simon Willison replied to Matt Campbell
@matt @glyph the models I can run on my laptop today are leagues ahead of the models I ran on the exact same hardware a year ago - improvements in that space have been significant
This new trick from Microsoft looks like it could be a huge leap forward too - I've not dug into it properly yet though https://github.com/microsoft/BitNet
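The headline idea in BitNet b1.58, as the paper describes it, is quantizing weights to the ternary values {-1, 0, +1} with a single per-tensor scale, so matrix multiplies reduce to additions and subtractions. A rough numpy illustration of that "absmean" quantization step - a sketch of the idea, not the repo's optimized kernels:

```python
import numpy as np

def absmean_ternary(W, eps=1e-8):
    """Quantize weights to {-1, 0, +1} with one per-tensor scale,
    per the absmean scheme in the BitNet b1.58 paper."""
    gamma = np.abs(W).mean()                         # per-tensor scale
    Wq = np.clip(np.round(W / (gamma + eps)), -1, 1) # round, then clamp to ternary
    return Wq.astype(np.int8), gamma

W = np.random.randn(4, 4).astype(np.float32)
Wq, gamma = absmean_ternary(W)
print(Wq)                             # ternary weight matrix
print(np.abs(W - Wq * gamma).mean())  # rough reconstruction error
```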
-