LLMs and the fall of the technical founder
There was a moment in recent history when I was optimistic that businesses and start-ups were going to focus more on why and what they were doing, rather than chasing the next bubble. That was when you couldn't get away from Simon Sinek and his talk about finding your why.
Then the NFT bubble arrived, and I began to lose that optimism.
Today, we have AI. It's not AI, and it is only called that because of clever marketing. Everything on the market today, and everything in R&D, is just spicy auto-complete: statistical modelling run amok. But because today's companies are not run by people with strong scientific and engineering foundations, the largest of them have essentially "bet the farm" on AI.
If there had been more technical folk in the room when these systems were pitched, there would have been MUCH more skepticism about the claim that LLMs are just the starting point and general AI is right around the corner.
That isn't the path. LLMs and today's AI are on the wrong branch of the evolutionary tree and will eventually die off. There is no technological path that I can see from statistical modelling to artificial intelligence, and to think there is one is to misunderstand intelligence itself. Intelligence is NOT just pattern recognition; there is reasoning too, and no statistical model can reason.

We have gotten to the point where companies are injecting this slop into software and tools in ways that make those tools worse. When M$ added a way to run Python code inside Excel, that was an amazing way to make the spreadsheet better.
Now you can "use" Copilot in spreadsheets, but you can't use it to run calculations, so it only makes things worse. No matter how many times M$ warns users NOT to use it that way, people are going to do it anyway, and the mighty spreadsheet goes from being a reliable tool to a Tesla with a vibe-coded autopilot.
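To make the contrast concrete, here is a minimal, self-contained sketch of the kind of deterministic calculation that running real Python alongside a spreadsheet enables. It is plain pandas, with made-up column names and numbers, not Microsoft's actual Excel integration: the point is simply that the same input produces the same answer every time, which is exactly what you cannot count on when you ask a chat assistant to "do the math."

```python
# A deterministic spreadsheet-style calculation: same data in, same totals out,
# every single run. The table and column names below are invented for illustration.
import pandas as pd

sales = pd.DataFrame({
    "month": ["Jan", "Jan", "Feb", "Feb"],
    "revenue": [1200.0, 800.0, 950.0, 1100.0],
})

# Sum revenue per month -- a reproducible result, not a plausible-sounding guess.
monthly_total = sales.groupby("month", sort=False)["revenue"].sum()
print(monthly_total)
```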
No Conclusion
I have nothing really to add to this conversation. I am strongly anti-LLM for all things. There isn't a use case out there where I think LLMs do well, or do better than a minimally educated human. We would be better off, as a society, hiring people (and paying them a livable wage) to mechanical turk everything for us, to attend all our meetings, etc.
With a system like that, maybe more than 100 people would control 50% of all global wealth.