Since reading Bryan Appleyard’s book “The Brain is Wider Than the Sky: Why Simple Solutions Don’t Work in a Complex World” I’ve had time to reflect on his discussion of what is referred to as The Singularity (otherwise known as “The Technological Singularity”). I believe John von Neumann was the first to use the term in this context, in a conversation from the 1950s later recalled by Stanislaw Ulam, when he spoke of the…
…ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue
Many others have supported the concept of The Singularity, including Alan Turing, Stanislaw Ulam, and Ray Kurzweil. It was the observation that “human affairs […] could not continue” that struck me.
If (when) we do create an intelligence superior to the combined minds of the entire planet, will it be a good point in mankind’s evolution?
Machines are already very good at writing code based on observing comments that I write. GitHub’s Copilot is spookily good at writing large sections of code, even in my own style. It really is as though someone grew up with me and is looking over my shoulder. I often find myself simply tabbing through code construction as it predicts almost exactly what I’m about to write; the sketch below shows the kind of workflow I mean.

As a programmer, am I worried? A little. After more than four decades creating software I don’t worry too much about my own career. But I do worry much more for those starting out on a career now, whether it’s in programming or any other industry. The rapid advancement of machine learning will, I think, affect millions of us, if not all of us, in all walks of life. The singularity might even trigger an extinction event. Some (Kurzweil among them) predict the singularity will occur within about 20 years, by 2045; others say human-level machine intelligence will be with us by 2050.
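For readers who haven’t tried it, here is a contrived sketch of that comment-driven workflow. The task, the function name, and the body are my own invention rather than output captured from the tool, but the body is representative of the sort of completion Copilot offers from a comment and a signature alone, accepted a line or a block at a time with Tab:

```python
from collections import Counter

# return the n most common words in a text file, ignoring case
def most_common_words(path: str, n: int = 10) -> list[tuple[str, int]]:
    # A completion much like this typically appears after typing
    # only the comment and the def line above.
    with open(path, encoding="utf-8") as f:
        words = f.read().lower().split()
    return Counter(words).most_common(n)
```

The code itself is trivial; the unnerving part is that the comment alone is usually prompt enough, and the suggestion tends to match what I was about to type anyway.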
Whether it’s in 20 years or 100, the singularity will come.
