The Future of Computing

ZURICH – Ever since the American computer scientist John McCarthy coined the term “Artificial Intelligence” in 1955, the public has imagined a future of sentient computers and robots that think and act like humans. But while such a future may indeed arrive, it remains, for the moment, a distant prospect.

And yet the foreseeable frontier of computing is no less exciting. We have entered what we at IBM call the Cognitive Era. Breakthroughs in computing are enhancing our ability to make sense of large bodies of data, providing guidance in some of the world’s most important decisions, and potentially revolutionizing entire industries.

The term “cognitive computing” refers to systems that, rather than being explicitly programmed, are built to learn from their experiences. By extracting useful information from unstructured data, these systems accelerate the information age, helping their users with a broad range of tasks, from identifying unique market opportunities to discovering new treatments for diseases to crafting creative solutions for cities, companies, and communities.
