Computing innovations to imitate, not replace, human brain

Parallel processing and the integration of processing and memory units are among the innovations aiming to imitate brain functions, but it will be a “long, long time” before artificial intelligence replaces the brain entirely, expert says.

By Ellyne Phneah

Innovations around computing are increasingly designed to imitate the human brain, but the day artificial intelligence becomes “smarter” than human intelligence is a “long time” away, said experts.

Leong Tze Yun, associate professor at the National University of Singapore’s (NUS) School of Computing, said alternative computational models have been “actively pursued” in recent years. Emulating the human brain, which she described as the “ultimate and best computer”, is one of the main directions researchers have been moving toward.

“Some of the main functions that ‘brain chips’ and other brain-like computing models aim to emulate include parallel processing, integration of processing and memory units and adaptive configuration of emerging requirements and functions,” the academic elaborated in her e-mail.

This is because current computational models, with their separate memory and processor units for crunching information, have limitations in computing capacity, efficiency and the range of tasks they can solve, Leong added.

Her comments come after an August report by CBS News revealed that Big Blue has built two prototype chips that process data similarly to how the human brain digests information, in that they are able to adapt to information they were not programmed for.

David Bader, professor and executive director of high performance computing at the Georgia Institute of Technology’s College of Computing in the United States, chimed in, saying that by using multicore Intel processors and massively parallel graphics processors from Nvidia, a “thinking” computer becomes “closer to reality”. This way of harnessing the power of multiple processors has helped solve some of the most challenging problems in science and engineering, he pointed out in an e-mail.

“The ability to compute thousands of trillions of operations per second or sift through billions of pieces of information per second allows them to solve problems once thought intractable,” said Bader, pointing to how IBM’s Watson supercomputer defeated human contestants on the U.S. game show Jeopardy! earlier this year.
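Bader’s point about sifting data in parallel lends itself to a brief illustration. The Python sketch below is not drawn from the article; the dataset, chunk size and predicate are invented for the example. It splits a large scan into chunks, counts matches on each CPU core separately, and combines the partial results:

from multiprocessing import Pool

def count_matches(chunk):
    # Count the items in one chunk that satisfy a simple stand-in predicate.
    return sum(1 for x in chunk if x % 97 == 0)

if __name__ == "__main__":
    n = 50_000_000          # stand-in for "billions of pieces of information"
    chunk_size = 5_000_000
    chunks = [range(start, min(start + chunk_size, n))
              for start in range(0, n, chunk_size)]

    # Each chunk is scanned on a separate core; partial counts are summed at the end.
    with Pool() as pool:
        total = sum(pool.map(count_matches, chunks))
    print(f"matches found: {total}")

The same divide-scan-combine pattern, scaled up across thousands of processors, is what lets the systems Bader describes tackle problems once thought intractable.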

Correctly harnessing compute power

IBM Singapore’s CTO Foong Sew Bun clarified that the goal of its Watson project was not to build a human brain, as the brain is “much too complex and marvelous to ever replicate” in silicon form.

“We are simply drawing inspiration from the brain’s ability for massively parallel processing to build a more efficient computer,” he stated.

Leong, too, agreed that it is unlikely these supercomputers will ever replace the human brain, though she was slightly more hopeful, saying that it would be a “long, long time” before a computer behaves like those seen in the movie “Terminator”.

She elaborated that computers will not be “smarter” than the human brain if the criterion is defined in the broad, human sense of being able to adapt, learn and improve themselves and the people and things around them, for the “goodness of mankind”.

That said, if it is defined in “the narrow sense of task efficiency and effectiveness”, there are already tasks that computers or other technologies can perform better than human beings, Leong qualified.

The NUS professor also predicted that there will come a time when computers are “assimilated into our daily lives intelligently and seamlessly”. In such a reality, man and machine will co-exist, interact, complement, support and improve each other in various physical, social and economic activities, she pointed out.

In terms of optimizing the use of computing technologies to better our lives, Leong said companies should set the right objectives and values in the development and use of these innovations.

She also stressed that people should not “stop thinking” when working with smart computers. “As developers and users, we decide the types of ‘smart computers’ we want to build and cannot always rely entirely on the results and suggestions produced by computers,” the professor cautioned.

https://www.zdnet.com/article/computing-innovations-to-imitate-not-replace-human-brain/

David A. Bader
Distinguished Professor and Director of the Institute for Data Science

David A. Bader is a Distinguished Professor in the Department of Computer Science at New Jersey Institute of Technology.