The Future of Computing

ZURICH – Ever since the American computer scientist John McCarthy coined the term “Artificial Intelligence” in 1955, the public has imagined a future of sentient computers and robots that think and act like humans. But while such a future may indeed arrive, it remains, for the moment, a distant prospect.

And yet the foreseeable frontier of computing is no less exciting. We have entered what we at IBM call the Cognitive Era. Breakthroughs in computing are enhancing our ability to make sense of large bodies of data, providing guidance in some of the world’s most important decisions, and potentially revolutionizing entire industries.

The term “cognitive computing” refers to systems that, rather than being explicitly programmed, are built to learn from experience. By extracting useful information from unstructured data, these systems help their users with a broad range of tasks, from identifying unique market opportunities to discovering new treatments for diseases to crafting creative solutions for cities, companies, and communities.

