IBM scientists are developing a new software ecosystem to support cognitive computing systems that interact more naturally with humans.

Cognitive computing systems can be trained with artificial intelligence and machine-learning algorithms. IBM Research says this sort of technology allows for the creation of "applications that mimic the brain’s abilities for perception, action, and cognition." That means computers would deal with data and "think" more the way humans do.

If this all sounds complex, well, that's because it is. IBM explains that programmable computing systems we use today were designed decades ago and are efficient "number crunchers." But in the world we live in today with real-time big data being produced in massive quantities globally, this aging technology just doesn't cut it anymore.

That's why modeling computing systems after the brain might work better. IBM Watson is the most well-known cognitive computer, and it famously competed on the game show Jeopardy! in 2011, beating two human champions.

"Architectures and programs are closely intertwined and a new architecture necessitates a new programming paradigm," Dharmendra S. Modha, IBM Research principal investigator and senior manager, said in a news release.

That's why IBM is developing this "new cognitive ecosystem," which includes a software simulator that has "a network of neurosynaptic cores," a neuron model that can process "brain-like computation," a programming model based on "composable, reusable building blocks" called "corelets," and a program library to store corelets. This architecture would support next-generation systems that behave more like biological beings.
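To make the "composable, reusable building blocks" idea concrete, here is a minimal sketch of corelet-style composition in Python. This is purely illustrative: the class and function names, the toy spike-filtering stages, and the chaining scheme are all assumptions for the sake of the example, not IBM's actual corelet language or API. The point it shows is the pattern IBM describes, in which each block hides its internals and exposes only inputs and outputs, and blocks can be wired together into larger blocks.

```python
# Hypothetical sketch of corelet-style composition. All names here are
# invented for illustration; this is not IBM's actual Corelet language.

class Corelet:
    """An illustrative building block: a named unit that transforms an
    input signal and can be wired to other corelets. Internals are
    hidden; only inputs and outputs are exposed."""

    def __init__(self, name, process):
        self.name = name
        self.process = process  # function: input signal -> output signal

    def __call__(self, signal):
        return self.process(signal)


def compose(*corelets):
    """Chain corelets so each block's output feeds the next block's input,
    producing a new, reusable composite corelet."""
    def pipeline(signal):
        for c in corelets:
            signal = c(signal)
        return signal
    return Corelet("+".join(c.name for c in corelets), pipeline)


# Toy processing stages standing in for neurosynaptic computation:
edge_detector = Corelet("edges", lambda s: [x for x in s if x > 0.5])
classifier = Corelet(
    "classify", lambda s: ["object" if x > 0.8 else "noise" for x in s]
)

# Compose the two stages into one reusable "vision" block.
vision = compose(edge_detector, classifier)
print(vision([0.2, 0.6, 0.9]))  # -> ['noise', 'object']
```

The design choice being illustrated is that a composite corelet is itself a corelet, so libraries of small blocks can be assembled into progressively larger systems without exposing any block's internals.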

IBM is presenting these developments at the International Joint Conference on Neural Networks in Dallas this week.

But what is all this good for? IBM says that in the long term, the company hopes to build "a chip system with ten billion neurons and hundred trillion synapses" that consumes little power and occupies little volume. In practical terms, that could mean, for example, special eyeglasses for the visually impaired equipped with "multiple video and auditory sensors" to process optical data.

Given that the human eye takes in about a terabyte of data per day, according to IBM, these beefy sensors could help the visually impaired navigate the world more easily.

Lead image: Ben Hider/Getty Images. Secondary images: Flickr, IBM Research.