In Homomorphic Encryption We (Must) Trust


(Ruslan-Lytvyn / Shutterstock)

Just as we were on the verge of gaining godlike powers from advanced analytics and machine learning on large pools of data, privacy laws like GDPR and CCPA arrived to end the craze. But thanks to the advent of homomorphic encryption, the big data party can go on – provided enough people trust it.

Homomorphic encryption is a relatively new technology that you will likely hear more about in the future, because it does what at first glance seems impossible: it enables research groups to apply advanced analytics and machine learning techniques to encrypted data.
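To make the idea concrete, here is a minimal sketch using the textbook Paillier scheme, which is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of their plaintexts, so a third party can compute on data it can never read. The fixed toy primes and helper names below are illustrative only; production systems use vetted libraries and far larger keys.

```python
# Minimal textbook Paillier sketch (illustration only; requires Python 3.9+).
import math
import random

def generate_keypair(p=999_983, q=1_000_003):
    # Fixed toy-sized primes for demonstration; real keys are 2048+ bits.
    n = p * q
    n_sq = n * n
    g = n + 1                          # standard Paillier generator
    lam = math.lcm(p - 1, q - 1)       # Carmichael's function of n
    # mu = (L(g^lam mod n^2))^-1 mod n, where L(x) = (x - 1) // n
    mu = pow((pow(g, lam, n_sq) - 1) // n, -1, n)
    return (n, g), (n, lam, mu)

def encrypt(pub, m):
    n, g = pub
    n_sq = n * n
    r = random.randrange(1, n)         # fresh randomness per ciphertext
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(priv, c):
    n, lam, mu = priv
    n_sq = n * n
    return ((pow(c, lam, n_sq) - 1) // n) * mu % n

def add_encrypted(pub, c1, c2):
    # Multiplying two ciphertexts adds the underlying plaintexts.
    n, _ = pub
    return (c1 * c2) % (n * n)

pub, priv = generate_keypair()
c1, c2 = encrypt(pub, 42), encrypt(pub, 58)
print(decrypt(priv, add_encrypted(pub, c1, c2)))   # 100, computed blind
```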

The math behind homomorphic encryption was proven over a decade ago, but computers were too slow to make the technology practical for everyday use. That has changed in the past few years, and homomorphic encryption is now being adopted by large companies, says Alon Kaufman, CEO and co-founder of Duality Technologies, which sells homomorphic encryption solutions.

“There are many reasons why companies want to share and work together,” says Kaufman, one of the world’s leading experts on homomorphic encryption. “At the same time, for data protection reasons and for their own business reasons, they are not ready to do so. That is what Duality makes possible.”

Individuals and groups have one good reason not to share their data today: they can lose their privacy. This is no trivial concern, especially given the big data abuses of the tech giants – not to mention the potential for hefty fines for violating GDPR, CCPA, HIPAA, and a host of other data protection laws that have emerged in recent years.

Data sharing

While privacy is valuable, data sharing is valuable at both the population level and the individual level, says Michael Jordan, a computer science professor at UC Berkeley.

“A disease like cancer is a kind of shared disease that we can all have, in different ways but in related ways,” says Jordan, who is an adviser to Duality. “Collaborative data analysis will be the solution.”

Hospitals today are effectively barred from pooling their respective cancer data and letting researchers mine it for patterns and potential treatments, because of the risk of invading patients’ privacy. HIPAA imposes strict patient-protection requirements, and the penalties for violating them can be severe.

“I am a hospital. I have all of this great cancer data. I would like to share it with other hospitals so that we can all have a bigger vision of cancer as a phenomenon,” says Jordan, speaking in the voice of a hypothetical hospital. “I can’t, because the lawyers tell me I can’t, and the lawyers haven’t really thought it through. They just said, ‘Ah, you can’t risk that.’”

Alon Kaufman is the co-founder and CEO of Duality Technologies

“Well, if you could go to a lawyer and say, ‘We have this technique that actually guarantees something here; this fear that you have is not real,’ we can get rid of that fear,” he continues. “Then the legal profession will go back to the issue and say, ‘Oh, I see that certain aspects of the data can be shared without putting people at risk, so we can change the law.’”

In health care, the laws have not yet been changed. But banks, insurance companies, and other financial services companies are advancing with homomorphic encryption to fight fraud and meet anti-money laundering (AML) requirements, Kaufman says.

“Regulators have been telling companies for many years that they need to do AML and [fight] fraud in a collaborative way – but hey, wait a second, you are not allowed to share this information for privacy reasons,” says Kaufman. “They were telling this kind of contradictory story. And they have adopted this technology [homomorphic encryption], and they are now promoting these kinds of things to enable sharing under these restrictions.”

Duality is currently working with Oracle to implement homomorphic encryption in its financial crime-fighting software. According to Kaufman, Oracle implements controls over what the collaborative analysis is allowed to do and what cannot be shared.

“We now have partnerships with reputable companies like Oracle and IBM, and that definitely helps build credibility,” says Kaufman. “The other aspect is making sure that regulators support this. That is happening in finance, but not yet in other areas.”

Valuing data

While the underlying core technology of homomorphic encryption is ready, we still need to build the systems and infrastructure on top of it that will truly enable data sharing as Jordan envisions it. It will likely take decades to build the trust and the economic structures that will fully unleash this in the world, he says.

“I think the first wave right now is more of an economic model, a recognition that we have networks of decision-makers and that they have things to share,” he says. “They compete with each other, and they have to have some kind of rules of the game. They have to know how to value things.”

For example, how much is your data worth? People will only be willing to enter into broader collaborations if there is a way to value data, he says.

Data protection is an obstacle to wider data sharing (ZinetroN / Shutterstock)

“Data is difficult to value,” says Jordan. “You have to probe it a bit. If you have the same data as me, it is somehow not so valuable for us to get together. If we have complementary data, it is much more valuable. So can you judge the complementarity? Can you assess relevance? Can you judge these things? And can you assess that before you hand over the entire data set?”
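As a deliberately simplified, hypothetical sketch of that probing step: two hospitals could exchange salted hashes of their patient identifiers and estimate overlap before any real data changes hands. Real systems would use private set intersection or homomorphic techniques; the salt, IDs, and names below are invented for illustration.

```python
# Hypothetical sketch: estimate data overlap from salted ID hashes only.
import hashlib

def hashed_ids(ids, salt):
    # Each party shares hashes, not raw identifiers.
    return {hashlib.sha256((salt + i).encode()).hexdigest() for i in ids}

SALT = "jointly-agreed-salt"           # illustrative shared value
hospital_a = hashed_ids(["p001", "p002", "p003", "p004"], SALT)
hospital_b = hashed_ids(["p003", "p004", "p005", "p006"], SALT)

jaccard = len(hospital_a & hospital_b) / len(hospital_a | hospital_b)
print(f"Overlap: {jaccard:.2f}")       # low overlap suggests complementary data
```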

“So you need tools, and homomorphic encryption is certainly one of them – and Duality is a world leader in putting it to use,” he continues. “Once you have that, you can build on it. It will be a solid base from which to do other things.”

According to Jordan, building people’s trust in homomorphic encryption, and in the data sharing solutions based on it, will be a gradual process. People will be inclined to share a little data in exchange for some benefit. If it works as advertised, they will share a little more, and so on.

“Many of them will involve trade-offs. I’m willing to give up a little privacy when I get real value,” he says. “I want to make a trade-off that is beneficial to me in the context of a particular problem that I am trying to solve. Well, trade-offs put us in the world of economics… And so encryption techniques begin to give you a solid foundation from which to ask things like: How much? How relevant? How useful? What are the parameters of the cooperation?”

Economic trade-offs

The world so far has been black and white when it comes to sharing data, Kaufman says.

“Either you share your data and have no privacy, or you do nothing and gain no value,” he says. “This [homomorphic encryption] opens up a whole new spectrum of, OK, it’s not a black and white world; there are things you can share. There are different types of guarantees that you can get. And it’s a much richer and much more valuable discussion, I would say.”

We’re not far from having the tools and technology to enable data scientists and researchers to work on large, shared data sets. Jordan is helping bring some of these technologies into the real world through UC Berkeley’s RISELab, which stands for Real-time Intelligent Secure Execution.

UC Berkeley computer science professor Michael Jordan is a consultant to Duality Technologies

“You’re going to see a little mini-revolution once that kind of thing really gets into Spark and Ray and all these other kinds of stacks. There will be a little flurry of activity,” says Jordan. “The agenda will be to make it as easy to use as anything else. From day one, instead of saying ‘take the difference of A and B,’ you say ‘HME [homomorphically encrypt] the difference of A and B.’ And now that you’ve created this code, you have a guarantee of something, and that can be treated as an invariant in your code and can be made really, really straightforward and easy.”
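As a rough sketch of what “HME the difference of A and B” could look like today, here is encrypted subtraction assuming the open-source python-paillier (phe) package is installed. Paillier supports only additive operations, so this stands in for the richer schemes and Spark/Ray integrations Jordan describes; the values are illustrative.

```python
# Sketch of computing a difference on ciphertexts, assuming `pip install phe`.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Two parties encrypt their values; only ciphertexts leave their machines.
enc_a = public_key.encrypt(170)
enc_b = public_key.encrypt(155)

# The subtraction runs directly on encrypted values -- no decryption here.
enc_diff = enc_a - enc_b

# Only the holder of the private key can read the result.
print(private_key.decrypt(enc_diff))   # 15
```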

You will still need some level of expertise to adopt HME, says Jordan, but it will follow the general path in computing where complex things get easier over time.

“Getting these things really solid will take a layer of expertise,” he says. “And nobody will understand the whole thing. Layers of trust are built up.”

The last point – about trust – is crucial if the world as a whole is going to be comfortable opening up and sharing its data.

“A hundred years ago, people first brought electricity into their homes. They brought in a lamp or a heater or whatever, and they said, ‘What do you have to do to convince me this won’t burn my house down?’ ‘Well, trust me, I’m a good technologist,’” says Jordan.

“No. They had to develop this thing called the Underwriters Laboratories label that was on every lamp. It said that someone else, a third party, had tested this technology and could guarantee not only that the technology in general works well, but ‘I looked at this particular lamp.’”

Jordan envisions the same thing happening with data: trustworthy third parties that assure us our privacy will be preserved and that, at the same time, we can benefit from sharing our data.

In the long term, a system will emerge that offers us the advantages of data sharing while minimizing the risk – or at least lets us carry out risk-benefit analyses and make well-founded decisions about sharing data. That last piece is critical to achieving the kind of data economy Jordan envisions.

“You give bids, your bids are accepted, and you do the work. That’s kind of how economics works. And somehow machine learning has grown up in the world without economics – without bids and process and feedback loops and collection of data where there are self-interested parties and all that,” says Jordan. “The next 20 years, it’s got to be embracing all these things.”


