Researchers Have Found a Way to Listen In On Your Conversations Using Light Bulb Vibrations

As if we didn’t have enough to be stressed about in 2020, researchers from the Cyber Security Labs at Ben Gurion University and the Weizmann Institute of Science have come up with a way to listen in on a room, even at long distances, using less than $1,000 worth of equipment. The setup measures the subtle changes in a room’s light caused by sound waves vibrating a light bulb. Read More

#cyber, #nlp, #privacy

Enigma: Decentralized Computation Platform with Guaranteed Privacy

A peer-to-peer network enabling different parties to jointly store and run computations on data while keeping the data completely private. Enigma’s computational model is based on a highly optimized version of secure multi-party computation, guaranteed by a verifiable secret-sharing scheme. For storage, we use a modified distributed hash table for holding secret-shared data. An external blockchain serves as the controller of the network: it manages access control and identities and acts as a tamper-proof log of events. Security deposits and fees incentivize operation, correctness and fairness of the system. Similar to Bitcoin, Enigma removes the need for a trusted third party, enabling autonomous control of personal data. For the first time, users are able to share their data with cryptographic guarantees regarding their privacy. Read More
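
To give a feel for the secret-sharing building block, here is a toy sketch (not Enigma’s actual protocol, which uses a verifiable scheme): additive secret sharing over a prime field, where no single share reveals anything and parties can even add secrets share-by-share.

```python
import secrets

PRIME = 2**61 - 1  # toy prime field; Enigma's real parameters differ

def share(secret: int, n_parties: int) -> list[int]:
    """Split `secret` into n additive shares; any n-1 shares reveal nothing."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    last = (secret - sum(shares)) % PRIME
    return shares + [last]

def reconstruct(shares: list[int]) -> int:
    """Recover the secret by summing all shares modulo the prime."""
    return sum(shares) % PRIME

# Parties can add two secret values without ever seeing either one:
a_shares = share(42, 3)
b_shares = share(100, 3)
sum_shares = [(a + b) % PRIME for a, b in zip(a_shares, b_shares)]
assert reconstruct(sum_shares) == 142
```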

#privacy

Amazon explores a way to preserve privacy in natural language processing

Can privacy and security be preserved in the course of large-scale textual data analysis? As it turns out, yes. A team of Amazon researchers in a recently published study proposed a way to anonymize customer-supplied data. They claim that their approach, which works by rephrasing samples and basing the analysis on the new phrasing, results in at least 20-fold greater guarantees on expected privacy. Read More

#privacy

The Secretive Company That Might End Privacy as We Know It

The New York Times has a long story about a little-known start-up, Clearview AI, that helps law enforcement match photos of unknown people to their online images — and “might lead to a dystopian future or something,” a backer says. Read More

#image-recognition, #privacy

How federated learning could shape the future of AI in a privacy-obsessed world

You may not have noticed, but two of the world’s most popular machine learning frameworks — TensorFlow and PyTorch — have taken steps in recent months toward privacy with solutions that incorporate federated learning.

Instead of gathering user data in the cloud to train models, federated learning trains AI models locally on mobile devices in large batches, then transfers only those learned updates back to a global model, so the raw data never has to leave the device. Read More
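
A rough sketch of the federated-averaging idea, using plain NumPy and a linear model rather than the TensorFlow Federated or PyTorch APIs (illustrative only):

```python
import numpy as np

def local_update(global_weights, X, y, lr=0.1, epochs=5):
    """Train a linear model on one device's private data; only weights leave."""
    w = global_weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # least-squares gradient
        w -= lr * grad
    return w

def federated_round(global_weights, devices):
    """Average the locally trained weights (FedAvg) into a new global model."""
    updates = [local_update(global_weights, X, y) for X, y in devices]
    return np.mean(updates, axis=0)

# Toy data: three "devices", each holding private samples of y = 2*x
rng = np.random.default_rng(0)
devices = []
for _ in range(3):
    X = rng.normal(size=(20, 1))
    devices.append((X, 2.0 * X[:, 0]))

w = np.zeros(1)
for _ in range(50):
    w = federated_round(w, devices)
print(w)  # approaches [2.0] without any raw data being pooled
```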

#federated-learning, #privacy

Twelve Million Phones, One Dataset, Zero Privacy

Every minute of every day, everywhere on the planet, dozens of companies — largely unregulated, little scrutinized — are logging the movements of tens of millions of people with mobile phones and storing the information in gigantic data files. The Times Privacy Project obtained one such file, by far the largest and most sensitive ever to be reviewed by journalists. It holds more than 50 billion location pings from the phones of more than 12 million Americans as they moved through several major cities, including Washington, New York, San Francisco and Los Angeles.

Each piece of information in this file represents the precise location of a single smartphone over a period of several months in 2016 and 2017. Read More

#cyber, #privacy, #surveillance, #wifi

Building a World Where Data Privacy Exists Online

Data is valuable — something that companies like Facebook, Google and Amazon realized far earlier than most consumers did. But computer scientists have been working on alternative models, even as the public has grown weary of having their data used and abused.

Dawn Song, a professor at the University of California, Berkeley, and one of the world’s foremost experts in computer security and trustworthy artificial intelligence, envisions a new paradigm in which people control their data and are compensated for its use by corporations. Read More

#adversarial, #assurance, #privacy

Google is open-sourcing a tool for data scientists to help protect private information

Google today announced that it is open-sourcing its so-called differential privacy library, an internal tool the company uses to securely draw insights from datasets that contain the private and sensitive personal information of its users.

Differential privacy is a statistical approach to data analysis that allows someone relying on software-aided analysis to draw insights from massive datasets while protecting user privacy. It does so by mixing user data with artificial “white noise,” as explained by Wired’s Andy Greenberg. That way, the results of any analysis cannot be used to unmask individuals or allow a malicious third party to trace any one data point back to an identifiable source. Read More
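
Google’s library itself is C++ with language wrappers, but the core trick is easy to sketch: add Laplace noise calibrated to the query’s sensitivity and a privacy budget epsilon. The toy counting query below is illustrative only, not the library’s API:

```python
import numpy as np

def dp_count(values, predicate, epsilon=1.0):
    """Differentially private count: true count plus Laplace noise.

    A count query has sensitivity 1 (adding or removing one person changes
    it by at most 1), so noise with scale 1/epsilon suffices.
    """
    true_count = sum(1 for v in values if predicate(v))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

ages = [23, 35, 41, 29, 62, 55, 38]
print(dp_count(ages, lambda a: a > 40, epsilon=0.5))
```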

#homomorphic-encryption, #privacy

Emotionless: Privacy-Preserving Speech Analysis for Voice Assistants

Voice-enabled interactions provide more human-like experiences in many popular IoT systems. Cloud-based speech analysis services extract useful information from voice input using speech recognition techniques. The voice signal is a rich resource that discloses several possible states of a speaker, such as emotional state, confidence and stress levels, physical condition, age, gender, and personal traits. Service providers can build a very accurate profile of a user’s demographic category and personal preferences, and may thereby compromise privacy. To address this problem, a privacy-preserving intermediate layer between users and cloud services is proposed to sanitize the voice input, aiming to maintain utility while preserving user privacy. It achieves this by collecting real-time speech data and analyzing the signal to ensure privacy protection prior to sharing the data with service providers. Specifically, the sensitive representations are extracted from the raw signal using transformation functions and then wrapped via voice-conversion technology. Experimental evaluation based on emotion recognition shows that identification of the speaker’s sensitive emotional state is reduced by ~96%. Read More
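
As a purely illustrative stand-in for the paper’s voice-conversion step, the sketch below resamples the waveform to crudely shift its pitch before anything reaches a hypothetical cloud speech service; the real pipeline is considerably more sophisticated, and the function names here are made up for the example.

```python
import numpy as np

def crude_pitch_shift(signal: np.ndarray, factor: float = 1.2) -> np.ndarray:
    """Resample the waveform; played back at the original rate, the perceived
    pitch rises by `factor` (a crude stand-in for voice conversion)."""
    idx = np.arange(0, len(signal) - 1, factor)
    return np.interp(idx, np.arange(len(signal)), signal)

def sanitize_then_send(signal: np.ndarray, cloud_service) -> str:
    """Hypothetical intermediate layer: transform the voice signal before any
    of it is shared with the cloud speech service."""
    return cloud_service(crude_pitch_shift(signal))

# Toy usage: a 440 Hz tone and a dummy "cloud" service
sr = 16000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440 * t)
print(sanitize_then_send(tone, lambda s: f"received {len(s)} samples"))
```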

#nlp, #privacy, #voice

China's hackers are ransacking databases for your health data

In May 2017, the WannaCry ransomware spread around the globe. As the worm locked Windows PCs, the UK’s National Health Service quickly ground to a halt. Some 19,000 appointments were cancelled, doctors couldn’t access patient files and email accounts were taken offline.

But the North Korean hackers behind WannaCry didn’t touch one thing: patient data. No personal information was stolen, the NHS has concluded. The cyberattack was purely an attempt to cause disruption and earn the hermit state some much-needed cash.

The same can’t be said for China. New analysis indicates that state-sponsored hackers from the country are targeting medical data from the healthcare industry. Research from security firm FireEye has identified multiple groups with links to China attacking medical systems and databases around the world. These attacks include incidents in 2019, but also date back as far as 2013. Read More

#china, #cyber, #privacy