Opinion | It’s the End of Computer Programming as We Know It. (And I Feel Fine.)
“Programming will be obsolete,” Matt Welsh, a former engineer at Google and Apple, predicted recently. Welsh now runs an A.I. start-up, but his prediction, while perhaps self-serving, doesn’t sound implausible:
I believe the conventional idea of “writing a program” is headed for extinction, and indeed, for all but very specialized applications, most software, as we know it, will be replaced by A.I. systems that are trained rather than programmed. In situations where one needs a “simple” program … those programs will, themselves, be generated by an A.I. rather than coded by hand.
Welsh’s argument, which ran earlier this year in the house organ of the Association for Computing Machinery, carried the headline “The End of Programming,” but there’s also a way in which A.I. could mark the beginning of a new kind of programming — one that doesn’t require us to learn code but instead transforms human-language instructions into software. An A.I. “doesn’t care how you program it — it will try to understand what you mean,” Jensen Huang, the chief executive of the chip-making company Nvidia, said in a speech this week at the Computex conference in Taiwan. He added: “We have closed the digital divide. Everyone is a programmer now — you just have to say something to the computer.”
Wait a second, though — wasn’t coding supposed to be one of the can’t-miss careers of the digital age? In the decades since I puttered around with my Spectrum, computer programming grew from a nerdy hobby into a vocational near-imperative, the one skill to acquire to survive technological dislocation, no matter how absurd or callous-sounding the advice. Joe Biden to coal miners: Learn to code! Twitter trolls to laid-off journalists: Learn to code! Tim Cook to French kids: Apprenez à programmer!
Programming might still be a worthwhile skill to learn, if only as an intellectual exercise, but it would have been silly to think of it as an endeavor insulated from the very automation it was enabling. Over much of the history of computing, coding has been on a path toward increasing simplicity. Once, only the small priesthood of scientists who understood binary bits of 1s and 0s could manipulate computers. Over time, from the development of assembly language through more human-readable languages like C and Python and Java, programming has climbed what computer scientists call increasing levels of abstraction — at each step growing more removed from the electronic guts of computing and more approachable to the people who use it.
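To make that abstraction ladder concrete, here is a small sketch of my own (an illustration, not taken from the article): the same task, adding two numbers, written once in the bit-by-bit style that mirrors what the machine actually does, and once as the single readable line a modern high-level language allows.

```python
def add_bitwise(a: int, b: int) -> int:
    """Machine-level flavor: addition assembled from binary logic,
    using XOR for the sum bits and AND-plus-shift for the carries,
    much as the hardware itself does."""
    mask = 0xFFFFFFFF  # emulate fixed-width 32-bit registers
    while b != 0:
        carry = (a & b) & mask  # positions where both bits are 1 produce a carry
        a = (a ^ b) & mask      # XOR adds the bits without the carry
        b = (carry << 1) & mask # shift the carry into the next column
    return a

def add_highlevel(a: int, b: int) -> int:
    """High-level-language flavor: one line, readable by anyone."""
    return a + b

print(add_bitwise(19, 23))    # 42
print(add_highlevel(19, 23))  # 42
```

Both functions compute the same thing; the difference is how far the notation sits from the circuitry. A natural-language prompt to an A.I. system ("add these two numbers") would be one more rung up the same ladder.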
A.I. might now be enabling the final layer of abstraction: the level on which you can tell a computer to do something the same way you’d tell another human.
So far, programmers seem to be on board with how A.I. is changing their jobs. GitHub, the coder’s repository owned by Microsoft, surveyed 2,000 programmers last year about how they’re using GitHub’s A.I. coding assistant, Copilot. A majority said Copilot helped them feel less frustrated and more fulfilled in their jobs; 88 percent said it improved their productivity. Researchers at Google found that among the company’s programmers, A.I. reduced “coding iteration time” by 6 percent.