Scientists at IBM are developing a new software ecosystem to support cognitive computing systems that interact more naturally with humans.
This sort of technology allows for the creation of “applications that mimic the brain’s abilities for perception, action, and cognition.” That means computers would deal with data and “think” more the way humans do.
IBM explains that the programmable computing systems we use today were designed decades ago as efficient “number crunchers.” But in a world producing massive volumes of real-time big data globally, that aging technology just doesn’t cut it anymore.
“Architectures and programs are closely intertwined and a new architecture necessitates a new programming paradigm,” Dharmendra S. Modha, IBM Research principal investigator and senior manager, said in a news release.
That’s why IBM is developing this “new cognitive ecosystem,” which includes a software simulator with “a network of neurosynaptic cores”; a neuron model that can perform “brain-like computation”; a programming model built from “composable, reusable building blocks” called “corelets”; and a program library to store those corelets. This architecture would support next-generation systems that behave more like biological beings.
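To make the “corelet” idea concrete, here is a minimal sketch of what composable, reusable building blocks backed by a program library might look like. All class and function names below are illustrative assumptions for explanation only, not IBM’s actual corelet API.

```python
# Hypothetical sketch of the "corelet" programming model described above:
# each corelet wraps a network of neurosynaptic cores and exposes only
# its interface; corelets compose into larger corelets and are stored
# in a program library for reuse. Names are assumptions, not IBM's API.

class Corelet:
    """A reusable building block that hides its internal cores."""

    def __init__(self, name, num_cores):
        self.name = name
        self.num_cores = num_cores

    def compose(self, other):
        # Composing two corelets yields a larger corelet; the internal
        # wiring stays hidden and only the combined interface is exposed.
        return Corelet(f"{self.name}+{other.name}",
                       self.num_cores + other.num_cores)

# A simple program library that stores corelets by name for reuse.
library = {}

def register(corelet):
    library[corelet.name] = corelet

edge_detector = Corelet("edge_detector", 4)
motion_tracker = Corelet("motion_tracker", 8)
register(edge_detector)
register(motion_tracker)

# Build a larger application block out of smaller, reusable ones.
vision_pipeline = edge_detector.compose(motion_tracker)
print(vision_pipeline.name)       # edge_detector+motion_tracker
print(vision_pipeline.num_cores)  # 12
```

The point of the sketch is the design principle IBM describes: programs are assembled hierarchically from library blocks rather than written against individual cores.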
IBM is presenting these developments at the International Joint Conference on Neural Networks in Dallas this week.
IBM says that, in the long term, it hopes to build “a chip system with ten billion neurons and hundred trillion synapses” that consumes little power and occupies little volume. In practical terms, that goal could mean, for example, special eyeglasses for the visually impaired equipped with “multiple video and auditory sensors” to process optical data.
Given that our eyes take in about a terabyte of data per day, according to IBM, these beefy sensors could help the visually impaired navigate the world more easily.
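The scale figures quoted above can be sanity-checked with simple arithmetic. The inputs (ten billion neurons, a hundred trillion synapses, one terabyte per day) come from the article; the conversions below are my own back-of-envelope additions.

```python
# Back-of-envelope arithmetic for the scale figures quoted above.

neurons = 10**10            # ten billion neurons
synapses = 10**14           # hundred trillion synapses
synapses_per_neuron = synapses // neurons
print(synapses_per_neuron)  # 10000 connections per neuron, on average

bytes_per_day = 10**12      # one terabyte of visual data per day
seconds_per_day = 24 * 60 * 60
rate_mb_s = bytes_per_day / seconds_per_day / 10**6
print(f"{rate_mb_s:.1f} MB/s")  # ~11.6 MB/s sustained
```

So the target chip would average about ten thousand synapses per neuron, and the eye’s daily terabyte works out to a sustained stream of roughly a dozen megabytes per second that such sensors would need to process.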