This article is a response to an article arguing that an AI Winter may be inevitable. However, I believe there are fundamental differences between what happened in the 1970s (the first AI winter) and the late 1980s (the second AI winter, with the fall of Expert Systems) and today. The arrival and growth of the internet, smartphones, and social media mean that the volume and velocity of data being generated are constantly increasing, requiring Machine Learning and Deep Learning to make sense of the Big Data we generate.
The rapid growth in Big Data has driven much of the growth in AI, alongside the reduced cost of data storage (cloud servers) and Graphics Processing Units (GPUs) making Deep Learning more scalable. Data will continue to drive much of the future growth of AI; however, the nature of that data and where it interacts with AI will change. This article will set out how the future of AI will increasingly sit alongside data generated at the edge of the network (on device), closer to the user. The advantages are lower latency, and 5G networks will enable a dramatic increase in device connectivity, with far greater capacity to connect IoT devices than 4G networks.
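To make the on-device idea concrete, here is a minimal sketch of edge inference using TensorFlow Lite, one common runtime for this pattern. The model path and input shape are hypothetical placeholders; any on-device runtime would follow the same load-then-invoke flow.

```python
import numpy as np
import tensorflow as tf

# Load a compressed model that ships with the app, so inference
# runs on the device itself rather than in the cloud.
# "model.tflite" is a placeholder path for illustration.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Sensor or camera data generated at the edge; the shape here
# (one 224x224 RGB frame) is an assumption for the example.
sample = np.random.rand(1, 224, 224, 3).astype(np.float32)

# Run inference locally: no round trip to a server, so latency is
# bounded by the device, not the network.
interpreter.set_tensor(input_details[0]["index"], sample)
interpreter.invoke()
prediction = interpreter.get_tensor(output_details[0]["index"])
print(prediction)
```

The design point is that only the (small) prediction ever needs to cross the network, which is what keeps latency low and makes massive IoT deployments feasible.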
Your Brain Doesn’t Work the Way You Think It Does
A conversation with neuroscientist Lisa Feldman Barrett on the counterintuitive ways your mind processes reality—and why understanding that might help you feel a little less anxious.
At the very beginning of her new book Seven and a Half Lessons About the Brain, psychology professor Lisa Feldman Barrett writes that each chapter presents "a few compelling scientific nuggets about your brain" and considers what they might reveal about human nature. Though it's an accurate description of what follows, it dramatically undersells the degree to which each lesson will enlighten and unsettle you. It's like lifting up the hood of a car to see an engine, except that the car is you, and you find an engine that doesn't work at all like you thought it did.
For instance, consider the fourth lesson, Your Brain Predicts (Almost) Everything You Do. "Neuroscientists like to say that your day-to-day experience is a carefully controlled hallucination, constrained by the world and your body but ultimately constructed by your brain," writes Dr. Barrett, who is a University Distinguished Professor at Northeastern and who has research appointments at Harvard Medical School and Massachusetts General Hospital. "It's an everyday kind of hallucination that creates all of your experiences and guides all your actions. It's the normal way that your brain gives meaning to the sensory inputs from your body and from the world (called 'sense data'), and you're almost always unaware that it's happening."
Contrastive Learning of Medical Visual Representations from Paired Images and Text
Learning visual representations of medical images is core to medical image understanding, but progress has been held back by the small size of hand-labeled datasets. Existing work commonly relies on transferring weights from ImageNet pretraining, which is suboptimal because the image characteristics differ drastically, or on rule-based label extraction from the textual reports paired with medical images, which is inaccurate and hard to generalize. We propose an alternative unsupervised strategy to learn medical visual representations directly from the naturally occurring pairing of images and textual data. Our method pretrains medical image encoders with the paired text data via a bidirectional contrastive objective between the two modalities; it is domain-agnostic and requires no additional expert input. We test our method by transferring our pretrained weights to 4 medical image classification tasks and 2 zero-shot retrieval tasks, and show that it yields image representations that considerably outperform strong baselines in most settings. Notably, in all 4 classification tasks, our method requires only 10% as much labeled training data as an ImageNet-initialized counterpart to achieve better or comparable performance, demonstrating superior data efficiency.
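As a rough illustration of the bidirectional contrastive objective the abstract describes (a sketch, not the authors' exact implementation), the loss below pulls each paired image/text embedding together while pushing apart mismatched in-batch pairs, symmetrically in both directions. The embedding dimension, batch size, and temperature value are assumptions.

```python
import torch
import torch.nn.functional as F

def bidirectional_contrastive_loss(image_emb, text_emb, temperature=0.1):
    """InfoNCE-style loss over a batch of paired image/text embeddings.

    image_emb, text_emb: (batch, dim) projections from the two encoders.
    The i-th image and i-th text form the only positive pair; every other
    in-batch combination serves as a negative.
    """
    image_emb = F.normalize(image_emb, dim=-1)
    text_emb = F.normalize(text_emb, dim=-1)

    # Pairwise cosine similarities, scaled by temperature.
    logits = image_emb @ text_emb.t() / temperature
    targets = torch.arange(logits.size(0), device=logits.device)

    # Image-to-text and text-to-image cross-entropy, averaged:
    # this symmetry is the "bidirectional" part of the objective.
    loss_i2t = F.cross_entropy(logits, targets)
    loss_t2i = F.cross_entropy(logits.t(), targets)
    return (loss_i2t + loss_t2i) / 2

# Example usage, with random tensors standing in for encoder outputs.
img = torch.randn(8, 512)
txt = torch.randn(8, 512)
print(bidirectional_contrastive_loss(img, txt).item())
```

Because the supervision comes entirely from which image and report co-occur, no expert labels are needed, which is what makes the pretraining strategy domain-agnostic.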