When the Music Stops: AI and Deflation

Artificial intelligence, the use of computational processes to infer and act upon information about the world that is not explicitly given to them, has been a hallmark of much of this decade. From word processors that have grown from simple spell checkers into office suites with a significant hand in the production process, from cruise control to self-driving vehicles, from halting speech recognition software to fully integrated video and audio concept recognition, AI and its related technologies have quietly, but perhaps irrevocably, changed our relationship with computers far more than most people realize.

Yet as the information revolution continues, its impact on our economy has reached the point where most of the models economists have formulated about how that economy works are being thrown out. We’re in terra incognita at this stage, and this is forcing politicians, policy makers, economists, business leaders and everyday people to rethink many of the fundamental assumptions on which we base our notions of work, value and utility. Read More

#augmented-intelligence, #economics

Should you build or buy AI?

Read More

#ai-first, #strategy

Yes, the brain is a computer…

Neuroscience is a funny discipline, one that can demand a level of interdisciplinary knowledge that is hard to achieve. At its heart, neuroscience is concerned with understanding the organ responsible for generating our behaviour, and thus it is a branch of physiology and psychology. At the same time, most neuroscientists will have heard, or even used, the words “calculate”, “algorithm”, and “computation” many times in their professional lives. I think there’s a good reason for this: the brain is a computer, and, in my opinion, neuroscience is also a branch of computer science. However, many neuroscientists do not see it that way.

In online discussions I have often read the phrase “The Brain as a Computer Metaphor”. The implication of this phrase is clear: the brain is not a computer, and at best, we can use computers as a metaphor to understand the brain. (At worst, the “metaphor” does not hold and should be abandoned.) Similarly, I have heard neuroscientists say things like, “neural circuits don’t truly run algorithms”. Again, the conclusion is clear: the brain doesn’t run any algorithms in reality, so our constant use of the words “algorithm” and “computer” when talking about the brain is misguided.

Unfortunately, what these discussions demonstrate is that many researchers do not actually understand the formal definitions of “computer” and “algorithm” as provided by computer science. (Or, if they do understand them, they do not accept them for some reason.) If you understand the formal definitions that computer science gives, then you know that the brain is very clearly a computer running algorithms, almost trivially so. Read More
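
To make those formal definitions concrete, here is a minimal Turing machine simulator in Python. The Turing machine is computer science’s canonical formalization of “computer”, and a finite transition table like the one below is an “algorithm” in the formal sense; the specific machine and tape are my own illustration, not taken from the essay.

```python
# A minimal Turing machine simulator. The Turing machine is computer
# science's formal model of a "computer", and a finite transition table
# such as `flip_bits` below is an "algorithm" in the formal sense.
# (Illustrative example; for simplicity the tape only grows rightward.)

def run_turing_machine(program, tape, state="start", blank="_"):
    """program maps (state, symbol) -> (next_state, symbol_to_write, move)."""
    tape, head = list(tape), 0
    while state != "halt":
        symbol = tape[head] if head < len(tape) else blank
        state, write, move = program[(state, symbol)]
        if head < len(tape):
            tape[head] = write
        else:
            tape.append(write)
        head += 1 if move == "R" else -1
    return "".join(tape)

# An algorithm that flips every bit of a binary string, then halts.
flip_bits = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}

print(run_turing_machine(flip_bits, "10110_"))  # prints "01001_"
```

On this definition, any physical system whose state changes implement such a transition table is computing, which is the sense in which the author calls the brain a computer.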

#human

Machine Learning in one map!

Read More

#machine-learning

A.I. is translating messages of long-lost languages

Researchers from MIT and Google Brain have discovered how to use deep learning to decipher ancient languages.

The technique can be used to read languages that died long ago.

The method builds on machines’ ability to complete monotonous tasks quickly. Read More
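
The article does not spell out the model, so here is a toy sketch of the underlying idea only: decipherment as an exhaustive, monotonous matching task, scoring every word of the lost language against a known related language. The vocabularies below are illustrative, and the researchers’ actual system is far more sophisticated than an edit-distance lookup.

```python
# Toy illustration only (not the MIT/Google Brain model): treat decipherment
# as a monotonous matching task, scoring every unknown word against every
# word of a known related language and keeping the best match.
from difflib import SequenceMatcher

known_vocabulary = ["melek", "bayit", "yam"]   # known related language (illustrative)
unknown_words = ["mlk", "bt", "ym"]            # lost-language forms (illustrative)

def similarity(a, b):
    """Character-level similarity in [0, 1]."""
    return SequenceMatcher(None, a, b).ratio()

# The monotonous part a machine does tirelessly: score every candidate pair.
for word in unknown_words:
    best = max(known_vocabulary, key=lambda k: similarity(word, k))
    print(f"{word!r} -> {best!r} (score {similarity(word, best):.2f})")
```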

#nlp

Faster Neural Network Training with Data Echoing

In the twilight of Moore’s law, GPUs and other specialized hardware accelerators have dramatically sped up neural network training. However, earlier stages of the training pipeline, such as disk I/O and data preprocessing, do not run on accelerators. As accelerators continue to improve, these earlier stages will increasingly become the bottleneck. In this paper, we introduce “data echoing,” which reduces the total computation used by earlier pipeline stages and speeds up training whenever computation upstream from accelerators dominates the training time. Data echoing reuses (or “echoes”) intermediate outputs from earlier pipeline stages in order to reclaim idle capacity. We investigate the behavior of different data echoing algorithms on various workloads, for various amounts of echoing, and for various batch sizes. We find that in all settings, at least one data echoing algorithm can match the baseline’s predictive performance using less upstream computation. In some cases, data echoing can even compensate for a 4x slower input pipeline. Read More
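
As a concrete illustration of the idea, here is a minimal sketch of example-level data echoing, assuming a plain Python generator pipeline; `slow_pipeline`, the echo factor, and the buffer size are placeholders, and this simplification is not the authors’ implementation.

```python
import random

def data_echoing(upstream, echo_factor=2, buffer_size=256):
    """Yield each upstream item `echo_factor` times, shuffling a small
    buffer so repeated items don't reach the accelerator back to back."""
    buffer = []
    for item in upstream:
        buffer.extend([item] * echo_factor)  # the "echo": reuse upstream work
        if len(buffer) >= buffer_size:
            random.shuffle(buffer)
            yield from buffer
            buffer.clear()
    random.shuffle(buffer)                   # flush the remainder
    yield from buffer

# Hypothetical usage: slow_pipeline() stands in for the disk I/O and
# preprocessing stages the paper identifies as the emerging bottleneck.
def slow_pipeline():
    for i in range(1000):
        yield i                              # an expensively produced example

for example in data_echoing(slow_pipeline(), echo_factor=2):
    pass                                     # train_step(example) would run here
```

The abstract notes that the authors compare several echoing algorithms; shuffling the echo buffer is one simple way to keep repeated examples from arriving consecutively.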

#neural-networks, #training

Deep Learning State of the Art (2019) – MIT

Read More

#deep-learning, #videos

China Internet Report 2019

China has emerged on the world stage with a host of global tech companies that are innovative and competitive. And increasingly, their successes are being studied and replicated in other markets. This report, informed by on-the-ground reporting by the South China Morning Post and Abacus, offers insights into China’s tech trailblazers and the big trends shaping the world’s biggest internet community. Read More

#china, #china-vs-us

AI Can Now Create Artificial People – What Does That Mean For Humans?

When DataGrid, Inc. announced it had successfully developed an AI system capable of generating high-quality photorealistic Japanese faces, it was impressive. But now the company has gone even further: its AI system can create not only faces and hair from a variety of ethnicities, but also bodies that can move and wear any outfit. While these images are fictitious, they are incredibly photorealistic. Read More

#fake, #gans

Computer vision harvesting. Four algorithms run simultaneously:

– License plate number recognition
– Brand and model type recognition
– Logo detection
– Car color recognition

Read More
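
The clip does not show how the four recognizers are wired together, so here is a minimal sketch of the per-frame fan-out pattern; every function name and return value below is a placeholder, not a model from the video.

```python
# Sketch of fanning several recognizers out over each video frame.
# All analyzers and their outputs are stand-ins for the models in the clip.

def recognize_plate(frame):      return "ABC-1234"             # placeholder
def recognize_make_model(frame): return ("Toyota", "Corolla")  # placeholder
def detect_logo(frame):          return "Toyota"               # placeholder
def recognize_color(frame):      return "silver"               # placeholder

ANALYZERS = {
    "plate": recognize_plate,
    "make_model": recognize_make_model,
    "logo": detect_logo,
    "color": recognize_color,
}

def analyze_frame(frame):
    """Run every analyzer on the same frame and merge the results."""
    return {name: fn(frame) for name, fn in ANALYZERS.items()}

print(analyze_frame(frame=None))  # one structured record per sighting
```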

#image-recognition, #videos