Software engineers have joined the ranks of copy editors, translators and others who fear that they are about to be replaced by generative AI. But it might be surprising to learn that coders have been under threat before. New technologies have long promised to “disrupt” engineering, and these innovations have always failed to eliminate the need for human software developers. If anything, they often made those workers that much more indispensable.
To understand where handwringing about the end of programmers comes from—and why it's overblown—we need to look back at the evolution of coding and computing. Software was an afterthought for many early computing pioneers, who considered hardware and systems architecture the true intellectual pursuits within the field. To the computer scientist John Backus, for instance, calling coders "programmers" or "engineers" was akin to re-labeling janitors "custodians," an attempt at pretending that their menial work was more important than it was. What's more, many early programmers were women, and sexist colleagues often saw their work as secretarial. However, while programmers might have held a lowly position in the eyes of somebody like Backus, they were also indispensable—they saved people like him from having to bother with the routine business of programming, debugging and testing.
Even though they performed a vital, if underappreciated, role, software engineers often fit poorly into company hierarchies. In the early days of computing, they were frequently self-taught and worked on programs they alone had devised, which meant that they had no clear place within preexisting departments and that managing them could be complicated. As a result, many modern features of software development were introduced to simplify, and even eliminate, interactions with coders. FORTRAN was supposed to allow scientists and others to write programs without any support from a programmer. COBOL's English-like syntax was intended to be so simple that managers could bypass developers entirely. Waterfall-based development was invented to make building new software standardized and routine. Object-oriented programming was supposed to be so simple that eventually all computer users could do their own software engineering.