Neuroscience is a funny discipline, one that can demand a level of interdisciplinary knowledge that is hard to achieve. At its heart, neuroscience is concerned with understanding the organ responsible for generating our behaviour, and thus it is a branch of physiology and psychology. At the same time, most neuroscientists will have heard, or even used, the words “calculate”, “algorithm”, and “computation” many times in their professional lives. I think there’s a good reason for this: the brain is a computer, and in my opinion neuroscience is also a branch of computer science. However, many neuroscientists do not see it that way.
In online discussions I have often read the phrase “The Brain as a Computer Metaphor”. The implication of this phrase is clear: the brain is not a computer, and at best, we can use computers as a metaphor to understand the brain. (At worst, the “metaphor” does not hold and should be abandoned.) Similarly, I have heard neuroscientists say things like, “neural circuits don’t truly run algorithms”. Again, the conclusion is clear: the brain doesn’t run any algorithms in reality, so our constant use of the words “algorithm” and “computer” when talking about the brain is misguided.
Unfortunately, what these discussions demonstrate is that many researchers do not, as a rule, actually understand the formal definitions of “computer” or “algorithm” as provided by computer science. (Or alternatively, if they do understand them, they don’t accept them for some reason.) If you understand the formal definitions of computer and algorithm as given by computer science, then you know that the brain is very clearly a computer running algorithms, almost trivially so. Read More
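The formal definition the post appeals to can be made concrete. In computability theory, an “algorithm” is any procedure a Turing machine can carry out, and a “computer” is anything that realizes such procedures. As an illustrative sketch (this toy machine is my own example, not from the post), here is a minimal Turing machine that increments a binary number:

```python
# Minimal Turing machine: the formal object behind the word "algorithm".
# This example machine increments a binary number written on the tape
# (least-significant bit on the right). All names here are illustrative.

def run_turing_machine(tape, rules, state="start", head=0, max_steps=1000):
    """Execute (state, symbol) -> (new_state, write, move) rules until 'halt'."""
    tape = dict(enumerate(tape))          # sparse tape: index -> symbol
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, "_")      # "_" is the blank symbol
        state, write, move = rules[(state, symbol)]
        tape[head] = write
        head += {"L": -1, "R": 1}[move]
    lo, hi = min(tape), max(tape)
    return "".join(tape.get(i, "_") for i in range(lo, hi + 1)).strip("_")

# Increment rules: scan to the rightmost bit, then carry leftward.
rules = {
    ("start", "0"): ("start", "0", "R"),
    ("start", "1"): ("start", "1", "R"),
    ("start", "_"): ("carry", "_", "L"),
    ("carry", "1"): ("carry", "0", "L"),   # 1 + carry -> 0, keep carrying
    ("carry", "0"): ("halt",  "1", "L"),   # 0 + carry -> 1, done
    ("carry", "_"): ("halt",  "1", "L"),   # overflow: prepend a 1
}

print(run_turing_machine("1011", rules))  # 1011 (11) + 1 -> 1100 (12)
```

Anything that can implement a transition table like this, whether silicon or a network of neurons, satisfies the formal definition; that is the sense in which the claim is almost trivial.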
Restoring Vision With Bionic Eyes: No Longer Science Fiction
Bionic vision might sound like science fiction, but Dr. Michael Beyeler is working on just that.
Originally from Switzerland, Dr. Beyeler is wrapping up his postdoctoral fellowship at the University of Washington before moving to the University of California Santa Barbara this fall to head up the newly formed Bionic Vision Lab in the Departments of Computer Science and Psychological & Brain Sciences.
We spoke with him about his “deep fascination with the brain” and how he hopes his work will eventually be able to restore vision to the blind. Read More
Augmenting Human Intelligence
Context is critical. As what was once mere data evolves into actionable intelligence, the context that binds that data becomes ever more essential.
Consider the word “java.” With no context around those four letters, you might not understand the reference or make any sort of connection. But if you add just one word to “java,” such as “development,” “island,” or “coffee,” the reference changes completely—and that’s with just a single word of context.
This is the type of active context and connection that the Brainspace engine provides. Read More
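The “java” example above can be sketched as a lookup from (term, context word) to meaning. The sense inventory below is invented for illustration; it is not the Brainspace engine, which learns these connections from data rather than from a hand-written table:

```python
# Toy illustration of one word of context resolving an ambiguous term.
# The sense inventory is made up for this example.

SENSES = {
    "java": {
        "development": "Java, the programming language",
        "island": "Java, the Indonesian island",
        "coffee": "java, a cup of coffee",
    }
}

def disambiguate(term, context_word):
    """Return the sense of `term` selected by a single word of context."""
    return SENSES.get(term, {}).get(context_word, f"{term} (ambiguous)")

print(disambiguate("java", "island"))   # Java, the Indonesian island
print(disambiguate("java", ""))         # java (ambiguous)
```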
Augmented Intelligence: A Collaboration of Humans and Machines
Has humanity reached "peak" intelligence?
You may not have noticed, but we are living in an intellectual golden age.
Since the intelligence test was invented more than 100 years ago, our IQ scores have been steadily increasing. Even the average person today would have been considered a genius compared to someone born in 1919 – a phenomenon known as the Flynn effect.
We may have to enjoy it while we can. The most recent evidence suggests that this trend may now be slowing. It may even be reversing, meaning that we have already passed the summit of human intellectual potential. Read More
Superhuman AI for multiplayer poker
In recent years there have been great strides in artificial intelligence (AI), with games often serving as challenge problems, benchmarks, and milestones for progress. Poker has served for decades as such a challenge problem. Past successes in such benchmarks, including poker, have been limited to two-player games. However, poker in particular is traditionally played with more than two players. Multiplayer games present fundamental additional issues beyond those in two-player games, and multiplayer poker is a recognized AI milestone. In this paper we present Pluribus, an AI that we show is stronger than top human professionals in six-player no-limit Texas hold’em poker, the most popular form of poker played by humans. Read More
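Pluribus builds on counterfactual regret minimization, whose core update rule is regret matching: play each action in proportion to how much you regret not having played it. As a hedged illustration of that rule only (run here on rock-paper-scissors self-play, not poker; this toy is not Pluribus itself):

```python
import random

# Regret-matching self-play on rock-paper-scissors. The average strategy
# converges toward the game's equilibrium, uniform (1/3, 1/3, 1/3).

ACTIONS = ["rock", "paper", "scissors"]
BEATS = {"rock": "scissors", "paper": "rock", "scissors": "paper"}

def payoff(a, b):
    return 0 if a == b else (1 if BEATS[a] == b else -1)

def strategy_from(regrets):
    """Regret matching: weight actions by their positive regret."""
    positive = [max(r, 0.0) for r in regrets]
    total = sum(positive)
    n = len(regrets)
    return [p / total for p in positive] if total > 0 else [1.0 / n] * n

def train(iterations=20000, seed=0):
    rng = random.Random(seed)
    regrets = [[0.0] * 3, [0.0] * 3]
    strategy_sums = [[0.0] * 3, [0.0] * 3]
    for _ in range(iterations):
        strats = [strategy_from(r) for r in regrets]
        moves = [rng.choices(range(3), weights=s)[0] for s in strats]
        for p in range(2):
            opp = moves[1 - p]
            got = payoff(ACTIONS[moves[p]], ACTIONS[opp])
            for a in range(3):
                # Regret: what action a would have earned, minus what we got.
                regrets[p][a] += payoff(ACTIONS[a], ACTIONS[opp]) - got
                strategy_sums[p][a] += strats[p][a]
    total = sum(strategy_sums[0])
    return [s / total for s in strategy_sums[0]]  # average strategy

avg = train()
```

Poker adds hidden information and, in the six-player case, the multiplayer complications the paper addresses, but this self-improvement loop is the family of techniques involved.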
Teaching AI the Concept of ‘Similar, but Different’
As a human you instinctively know that a leopard is closer to a cat than a motorbike, but the way we train most AI makes them oblivious to these kinds of relations. Building the concept of similarity into our algorithms could make them far more capable, writes the author of a new paper in Science Robotics.
Convolutional neural networks have revolutionized the field of computer vision to the point that machines are now outperforming humans on some of the most challenging visual tasks. But the way we train them to analyze images is very different from the way humans learn, says Atsuto Maki, an associate professor at KTH Royal Institute of Technology.
“Imagine that you are two years old and being quizzed on what you see in a photo of a leopard,” he writes. “You might answer ‘a cat’ and your parents might say, ‘yeah, not quite but similar’.” Read More
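One way to read “building the concept of similarity into our algorithms” is to soften the usual one-hot training target toward related classes, so that mistaking a leopard for a cat costs less than mistaking it for a motorbike. A minimal sketch, with similarity scores invented for illustration (this is not the paper's specific method):

```python
import math

# One-hot training treats every wrong label as equally wrong. A
# similarity-aware target spreads probability toward related classes.

CLASSES = ["leopard", "cat", "motorbike"]
SIMILARITY = {  # made-up similarity of each class to the true class
    "leopard": {"leopard": 1.0, "cat": 0.6, "motorbike": 0.0},
}

def soft_target(true_class):
    """Turn similarity scores into a probability target via softmax."""
    exps = [math.exp(SIMILARITY[true_class][c]) for c in CLASSES]
    z = sum(exps)
    return [e / z for e in exps]

def cross_entropy(target, predicted):
    return -sum(t * math.log(p) for t, p in zip(target, predicted))

pred_cat = [0.05, 0.90, 0.05]    # a model that answers "cat"
pred_bike = [0.05, 0.05, 0.90]   # a model that answers "motorbike"

t = soft_target("leopard")
# Under the soft target, "cat" is a milder error than "motorbike".
assert cross_entropy(t, pred_cat) < cross_entropy(t, pred_bike)
```

This mirrors the parent's “not quite, but similar”: the wrong-but-close answer receives a gentler correction.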
AI+EI – A recipe for success or disaster?
If one thing is for sure, it is that businesses are reaping the benefits of AI’s ability to free us from the more repetitive tasks in the workplace. AI is changing the nature of work. It’s helping to remove the mundane, enabling us to make more informed decisions with its analytical capabilities and its ability to sift through large amounts of data using machine learning.
Yet, according to a report from Gartner, emotional intelligence (EI) accounts for more than 90% of a person’s performance and success in technical and leadership roles. With this in mind, it would be unlikely for AI to completely replace human beings in the workplace at this stage, given its lack of emotional intelligence (among other things). Emotional intelligence, deep domain expertise and a set of “soft skills” cannot yet be automated by current AI technologies. Read More
Data can now be stored inside the molecules that power our metabolism
DNA isn’t the only molecule we could use for digital storage. It turns out that solutions containing sugars, amino acids and other small molecules could replace hard drives too.
Jacob Rosenstein and his colleagues at Brown University, Rhode Island, stored and retrieved pictures of an Egyptian cat, an ibex and an anchor using an array of these small molecules. They say the approach could yield storage that is less vulnerable to hacking and able to function in more extreme environmental conditions.
Inspired by recent research showing that it is possible to store data on DNA, Rosenstein’s team wanted to see if smaller and simpler molecules could also encode abstract information. Read More
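The basic encoding idea can be sketched abstractly: each spot on an array holds a mixture of small molecules, and the presence or absence of each molecule in that spot stores one bit. The molecule names below are invented for illustration, and real readout uses mass spectrometry rather than dictionaries:

```python
# Toy sketch of bit storage in molecular mixtures: with 8 candidate
# molecules per spot, each spot encodes one byte.

MOLECULES = ["glycine", "alanine", "serine", "glucose",
             "fructose", "citrate", "valine", "leucine"]

def encode(data: bytes):
    """Map each byte to the set of molecules present at one spot."""
    return [{m for i, m in enumerate(MOLECULES) if byte >> i & 1}
            for byte in data]

def decode(spots):
    """Recover bytes from which molecules are detected in each spot."""
    return bytes(sum(1 << i for i, m in enumerate(MOLECULES) if m in spot)
                 for spot in spots)

spots = encode(b"cat")
assert decode(spots) == b"cat"
```

An image like the team's Egyptian cat is, in this view, just a longer byte string spread across more spots.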