Kelsey Hightower was a standout for me during the whole crypto/NFT insanity. His commentary cut through the noise, and I really respected that.
I’m currently on an experimental team at work diving deep into a bunch of these tools. My evaluation of most of them as an IC tends to land somewhere between “this is complete garbage” and “okay, this has some real utility, but it’s not living up to the hype.” There’s just so much noise, especially from certain kinds of product people who seem convinced this tech is going to replace engineers entirely.
So if you’ve found people or places that offer actual signal in all of this, I’d love to hear about them.
- Simon Wardley (of Wardley maps fame) on Software Engineering vs Vibe Coding: https://www.linkedin.com/posts/simonwardley_x-why-are-you-so...
- Gergely Orosz from https://www.pragmaticengineer.com/. I find myself agreeing with takes like this one: https://twitter.com/GergelyOrosz/status/1912135400480526366
- Maybe a bit too optimistic, but I agree with the overall picture presented in https://sourcegraph.com/blog/revenge-of-the-junior-developer
My take is that it won't replace us, but our roles will evolve to leverage these tools more and more in service of human ends. There are so many considerations to account for: production stance, fallback, data integrity, mitigation, privacy, security. I think we are a long way from that. When AI can start resolving production issues from a post-mortem, or implementing features straight from requirements, then I'd say we've hit a milestone. That said, new engineers still need to understand the fundamentals of CS, of systems, and of how to use a language properly.
My main gripe is the push to put AI in everything without fully understanding the value proposition or whether it would actually be useful in a given case. I mostly ascribe this to leadership who don't really understand the technology and assume it will simply solve all of their problems and vastly increase productivity. What's worse is when it gets passed down the chain from the C-suite and is a hot mess by the time a project reaches you, because there's no real direction on how it applies to your product. It takes good leadership to identify where it's genuinely useful for a given product.
AI is just the current tip of the spear. It appeals most to people looking for ever more convenient social exchanges, what you might call social gravities. This is somewhat apparent in everyday use, but it is very deliberately the focus of the marketing around AI products and services.
There appears to be a clear and growing trajectory towards a "tell me what I want to hear" mentality combined with "I'm not qualified to participate, but with a bit of help I can." This thinking is driven by personality more than by education or intelligence, though those appear to be strong indicators as well. The personality types in question tend towards high agreeableness and high neuroticism, which results in a preference for social gravities over individualism and originality. Historically this mode of thinking was considered more feminine, but over the last few decades there has been a gradual erosion of masculine social identity in the West, resulting in males adopting this more traditionally feminine social conduct.
The hype cycle around AI feels unlike anything I've ever experienced. Like yes, obviously it's useful tech, but it won't become the literal god-like figure some of our industry's leadership is convinced it will.
I'm not even convinced it'll become reliable enough to replace humans for most tasks. I'd love to find some fellow skeptics that still appreciate and report about the actual advantages of AI as a tool, not as a religious movement.
I think he's doing an AI YouTube channel now. He was quite rational during all the NoSQL hype back in the day, and he seems to have stayed that way.