The promise that wasn’t kept

Literally yes, but figuratively no. Well, maybe not; it depends. If you are shipping insecure software that you don’t know how to debug or scale, then you are not a software engineer, at least not yet. CI/CD principles, Lean, XP, agile delivery, test-driven development, and teaming (mob/group/social programming) are all tools that address this. If you don’t have the discipline and the processes in place, it makes little difference whether you have used AI or not.

Code is only as good as the delivery mechanism used to give it to the user.

What’s more, there are the environmental impacts of AI, which are only just beginning to emerge. An MIT article titled “Explained: Generative AI’s environmental impact” outlines how the rapid expansion of generative AI presents significant sustainability challenges, including excessive electricity and water consumption, hardware-related emissions, and increasing pressure on power grids.

The world is already cooked. And yet, we’re cooking it more, literally and figuratively, by shipping insecure software into the void that we have no idea how to debug, scale, or extend. In the not-so-distant future, LLMs will be trained purely on LLM-generated software, and the world will eat itself.

I challenge you to find the value in that.

Salma Alam-Naylor
