The partnership aims to improve the performance and accuracy of FHE, making it practical for businesses and governments to better protect confidential data in the cloud.
Intel has partnered with Microsoft as part of a US Defense Advanced Research Projects Agency (DARPA) program that aims to develop hardware and software to drastically improve the performance of fully homomorphic encryption (FHE) computation. As part of the program, Intel will develop a hardware accelerator that could make machine learning practical with always-encrypted and privacy-preserving data. Read More
Why Intel believes confidential computing will boost AI and machine learning
Companies are collecting increasing amounts of data, a trend that is driving the development of better analytical tools and tougher security. Analysis and security are now converging as confidential computing prepares to deliver a critical boost to artificial intelligence.
Intel has been investing heavily in confidential computing as a way to expand the amount and types of data companies will manage through cloud services. Read More
Computer Scientists Achieve ‘Crown Jewel’ of Cryptography
A cryptographic master tool called indistinguishability obfuscation has for years seemed too good to be true. Three researchers have figured out that it can work.
… Indistinguishability obfuscation, if it could be built, would be able to hide not just collections of data but the inner workings of a computer program itself, creating a sort of cryptographic master tool from which nearly every other cryptographic protocol could be built. Read More
IBM completes successful field trials on Fully Homomorphic Encryption
FHE allows computation on still-encrypted data, without sharing the secrets.
Yesterday, Ars spoke with IBM Senior Research Scientist Flavio Bergamaschi about the company’s recent successful field trials of Fully Homomorphic Encryption. We suspect many of you will have the same questions that we did—beginning with “what is Fully Homomorphic Encryption?”
FHE is a type of encryption that allows mathematical operations to be performed directly on encrypted data. Upon decryption, the results are the same as if the operations had been run on the plaintext.
…You don’t ever have to share a key with the third party doing the computation; the data remains encrypted with a key the third party never received. Read More
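To make the homomorphic property concrete, here is a minimal sketch using the Paillier cryptosystem, which is only additively homomorphic (full FHE, as trialed by IBM, also supports multiplication on ciphertexts). The tiny hardcoded primes are an illustrative assumption and offer no real security:

```python
# Toy Paillier cryptosystem: multiplying two ciphertexts yields a
# ciphertext of the SUM of the plaintexts, so a third party can add
# numbers it cannot read. Tiny primes for illustration only -- insecure.
import random
from math import gcd

p, q = 293, 433            # toy primes; real deployments use ~1024-bit primes
n = p * q                  # public modulus
n2 = n * n
g = n + 1                  # standard generator choice
lam = (p - 1) * (q - 1)    # phi(n); lcm(p - 1, q - 1) is the usual choice

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)   # decryption constant (Python 3.8+)

def encrypt(m):
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2   # c = g^m * r^n mod n^2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

a, b = 12345, 6789
c_sum = (encrypt(a) * encrypt(b)) % n2   # addition performed on ciphertexts
assert decrypt(c_sum) == a + b           # 19134, never computed in the clear
```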
Learning to Protect Communications with Adversarial Neural Cryptography
We ask whether neural networks can learn to use secret keys to protect information from other neural networks. Specifically, we focus on ensuring confidentiality properties in a multiagent system, and we specify those properties in terms of an adversary. Thus, a system may consist of neural networks named Alice and Bob, and we aim to limit what a third neural network named Eve learns from eavesdropping on the communication between Alice and Bob. We do not prescribe specific cryptographic algorithms to these neural networks; instead, we train end-to-end, adversarially. We demonstrate that the neural networks can learn how to perform forms of encryption and decryption, and also how to apply these operations selectively in order to meet confidentiality goals. Read More
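The adversarial setup is straightforward to sketch. The following is a hedged reconstruction in PyTorch rather than the paper's exact architecture (the original uses mixed fully connected and convolutional layers); the network shapes, the ±1 bit encoding, and the exact loss shaping are illustrative assumptions:

```python
# Minimal sketch of adversarial neural cryptography: Alice maps
# (plaintext, key) -> ciphertext, Bob maps (ciphertext, key) -> plaintext,
# Eve tries to recover the plaintext from the ciphertext alone.
import torch
import torch.nn as nn

N = 16  # bits per plaintext / key / ciphertext, encoded as floats in {-1, +1}

def mlp(in_dim, out_dim):
    return nn.Sequential(nn.Linear(in_dim, 2 * in_dim), nn.ReLU(),
                         nn.Linear(2 * in_dim, out_dim), nn.Tanh())

alice, bob, eve = mlp(2 * N, N), mlp(2 * N, N), mlp(N, N)
opt_ab = torch.optim.Adam([*alice.parameters(), *bob.parameters()], lr=1e-3)
opt_e = torch.optim.Adam(eve.parameters(), lr=1e-3)
l1 = nn.L1Loss()

def batch(size=256):
    # fresh random plaintexts and keys as +/-1 bit vectors
    return (torch.randint(0, 2, (size, N)).float() * 2 - 1,
            torch.randint(0, 2, (size, N)).float() * 2 - 1)

for step in range(3000):
    # --- train Alice & Bob: Bob should decrypt, Eve should stay at chance ---
    p, k = batch()
    c = alice(torch.cat([p, k], dim=1))
    bob_err = l1(bob(torch.cat([c, k], dim=1)), p)
    eve_err = l1(eve(c), p)
    # per-bit L1 error of 1.0 equals random guessing for +/-1 bits
    loss_ab = bob_err + (1.0 - eve_err) ** 2
    opt_ab.zero_grad(); loss_ab.backward(); opt_ab.step()

    # --- train Eve alone on fresh data, with Alice's output detached ---
    p, k = batch()
    c = alice(torch.cat([p, k], dim=1)).detach()
    loss_e = l1(eve(c), p)
    opt_e.zero_grad(); loss_e.backward(); opt_e.step()
```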
AI has a privacy problem, but these techniques could fix it
Artificial intelligence promises to transform — and indeed, has already transformed — entire industries, from civic planning and health care to cybersecurity. But privacy remains an unsolved challenge in the industry, particularly where compliance and regulation are concerned.
Recent controversies put the problem into sharp relief. The Royal Free London NHS Foundation Trust, a division of the U.K.’s National Health Service based in London, provided Alphabet’s DeepMind with data on 1.6 million patients without their consent. Google — whose health data-sharing partnership with Ascension became the subject of scrutiny in November — abandoned plans to publish scans of chest X-rays over concerns that they contained personally identifiable information. This past summer, Microsoft quietly removed a data set (MS Celeb) with more than 10 million images of people after it was revealed that some weren’t aware they had been included. Read More
CryptoNN: Training Neural Networks over Encrypted Data
Emerging neural-network-based machine learning techniques such as deep learning and its variants have shown tremendous potential in many application domains. However, they raise serious privacy concerns due to the risk of leakage of highly privacy-sensitive data when data collected from users is used to train neural network models to support predictive tasks. To tackle such serious privacy concerns, several privacy-preserving approaches have been proposed in the literature that use either secure multi-party computation (SMC) or homomorphic encryption (HE) as the underlying mechanisms. However, neither of these cryptographic approaches provides an efficient solution for constructing a privacy-preserving machine learning model that supports both the training and inference phases. To tackle the above issue, we propose CryptoNN, a framework that supports training a neural network model over encrypted data by using the emerging functional encryption scheme instead of SMC or HE. We also construct a functional encryption scheme for basic arithmetic computation to support the requirements of the proposed CryptoNN framework. We present a performance evaluation and security analysis of the underlying crypto scheme and show through our experiments that CryptoNN achieves accuracy similar to that of the baseline neural network models on the MNIST dataset. Read More
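Functional encryption is the distinctive ingredient here: a derived key reveals one chosen function of the plaintext (inner products, in CryptoNN's case) and nothing else. Below is a toy, insecure sketch of a DDH-style inner-product scheme in the spirit of the construction the paper builds on; the group size, vectors, and brute-force discrete log recovery are illustrative assumptions:

```python
# Toy inner-product functional encryption: the key holder learns only
# <x, y> for an authorized vector y, never x itself. Insecure toy group.
import random

p = 2_147_483_647          # prime modulus (2^31 - 1); toy-sized only
g = 5                      # fixed base in Z_p^*
n = 4                      # vector length

# Setup: master secret s, public key h_i = g^{s_i}
s = [random.randrange(1, p - 1) for _ in range(n)]
mpk = [pow(g, si, p) for si in s]

def encrypt(x):
    r = random.randrange(1, p - 1)
    ct0 = pow(g, r, p)
    cts = [(pow(h, r, p) * pow(g, xi, p)) % p for h, xi in zip(mpk, x)]
    return ct0, cts

def keygen(y):
    # master-key holder derives a functional key that reveals only <x, y>
    return sum(si * yi for si, yi in zip(s, y))

def decrypt(ct, y, sk_y, max_ip=10_000):
    ct0, cts = ct
    num = 1
    for c, yi in zip(cts, y):
        num = (num * pow(c, yi, p)) % p
    gz = (num * pow(ct0, -sk_y, p)) % p   # equals g^{<x, y>}
    for z in range(max_ip + 1):           # brute-force small discrete log
        if pow(g, z, p) == gz:
            return z
    raise ValueError("inner product out of range")

x = [3, 1, 4, 1]           # private data vector
y = [2, 7, 1, 8]           # weights the key holder is authorized for
print(decrypt(encrypt(x), y, keygen(y)))   # 25 = 2*3 + 7*1 + 1*4 + 8*1
```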
Google is open-sourcing a tool for data scientists to help protect private information
Google today announced that it is open-sourcing its so-called differential privacy library, an internal tool the company uses to securely draw insights from datasets that contain the private and sensitive personal information of its users.
Differential privacy is a statistical approach to data analysis that allows someone relying on software-aided analysis to draw insights from massive datasets while protecting user privacy. It does so by mixing real user data with artificial “white noise,” as explained by Wired’s Andy Greenberg. That way, the results of any analysis cannot be used to unmask individuals or allow a malicious third party to trace any one data point back to an identifiable source. Read More
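Google's library is implemented in C++, so the snippet below is not its API; it is a minimal sketch of the Laplace mechanism, the canonical way such noise is calibrated, with the dataset and epsilon invented for illustration:

```python
# Laplace mechanism: add noise scaled to the query's sensitivity so that
# no single record measurably changes the released result.
import numpy as np

def dp_count(values, predicate, epsilon=0.5):
    """Differentially private count: true count + Laplace(sensitivity/epsilon).
    A count query has sensitivity 1 (one person changes it by at most 1)."""
    true_count = sum(1 for v in values if predicate(v))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

ages = np.random.randint(18, 90, size=10_000)
# Released statistic: how many users are over 65, privatized
print(dp_count(ages, lambda a: a > 65, epsilon=0.5))
```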
Google Turns to Retro Cryptography to Keep Datasets Private
Certain studies require sensitive datasets: the relationship between nutritious school lunch and student health, the effectiveness of salary equity initiatives, and so on. Valuable insights require navigating a minefield of private, personal information. Now, after years of work, cryptographers and data scientists at Google have come up with a technique to enable this “multi-party computation” without exposing information to anyone who didn’t already have it.
Today Google will release an open source cryptographic tool known as Private Join and Compute. It facilitates the process of joining numeric columns from different datasets to calculate a sum, count, or average on data that is encrypted and unreadable during its entire mathematical journey. Only the results of the computation can be decrypted and viewed by all parties, meaning that you only get the results, not the data you didn’t already own. Read More
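The matching step rests on commutative encryption: blinding an identifier under two secret exponents gives the same result in either order, so two parties can detect shared records without revealing anything else. The sketch below shows only that join step, with toy parameters and invented identifiers (the full protocol also attaches additively homomorphic ciphertexts to compute the sum over the intersection):

```python
# Diffie-Hellman-style commutative blinding: H(id)^a^b == H(id)^b^a mod p,
# so doubly blinded identifiers collide exactly on shared records.
# Toy parameters -- illustrative only, never secure.
import hashlib
import random

p = 2**127 - 1             # Mersenne prime modulus, toy-sized

def H(identifier):
    # hash an identifier into Z_p^*
    h = int.from_bytes(hashlib.sha256(identifier.encode()).digest(), "big")
    return h % (p - 1) + 1

a = random.randrange(2, p - 1)   # party A's secret exponent
b = random.randrange(2, p - 1)   # party B's secret exponent

ids_a = {"alice@example.com", "bob@example.com"}
ids_b = {"bob@example.com", "carol@example.com"}

# Each party blinds its own identifiers, then the other party blinds them
# again; both steps are computed here in one place for demonstration.
double_a = {pow(pow(H(i), a, p), b, p) for i in ids_a}
double_b = {pow(pow(H(i), b, p), a, p) for i in ids_b}

print(len(double_a & double_b))   # 1 shared record, raw identifiers unseen
```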
Decentralizing Privacy: Using Blockchain to Protect Personal Data
The recent increase in reported incidents of surveillance and security breaches compromising users’ privacy calls into question the current model, in which third parties collect and control massive amounts of personal data. Bitcoin has demonstrated in the financial space that trusted, auditable computing is possible using a decentralized network of peers accompanied by a public ledger. In this paper, we describe a decentralized personal data management system that ensures users own and control their data. We implement a protocol that turns a blockchain into an automated access-control manager that does not require trust in a third party. Unlike Bitcoin, transactions in our system are not strictly financial – they are used to carry instructions, such as storing, querying and sharing data. Finally, we discuss possible future extensions to blockchains that could harness them into a well-rounded solution for trusted computing problems in society. Read More
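To give a rough sense of transactions that carry instructions rather than payments, here is a hypothetical, heavily simplified ledger in the spirit of the paper's access-control transactions; it omits networking, consensus, and signatures entirely, and every name in it is invented for illustration:

```python
# Toy append-only ledger where transactions grant or revoke a service's
# access to fields of a user's data; the chain serves as the audit trail.
import hashlib
import json
import time

class AccessLedger:
    def __init__(self):
        self.chain = []          # append-only list of hash-linked blocks
        self.policy = {}         # (user, service) -> set of permitted fields

    def _append(self, tx):
        prev = self.chain[-1]["hash"] if self.chain else "0" * 64
        block = {"tx": tx, "prev": prev, "time": time.time()}
        block["hash"] = hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()).hexdigest()
        self.chain.append(block)

    def grant(self, user, service, fields):
        # access transaction: user grants a service access to named fields
        self.policy[(user, service)] = set(fields)
        self._append({"op": "grant", "user": user,
                      "service": service, "fields": sorted(fields)})

    def revoke(self, user, service):
        self.policy.pop((user, service), None)
        self._append({"op": "revoke", "user": user, "service": service})

    def check(self, user, service, field):
        # in the paper this check is enforced by the network, not one node
        return field in self.policy.get((user, service), set())

ledger = AccessLedger()
ledger.grant("alice", "fitness-app", {"steps", "heart_rate"})
print(ledger.check("alice", "fitness-app", "steps"))      # True
ledger.revoke("alice", "fitness-app")
print(ledger.check("alice", "fitness-app", "steps"))      # False
```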