Generative AI has cybersecurity teams thrilled and sweating bullets. The technology churns out tricks much like a slot machine on a hot streak, yet significant risks to proprietary data lurk in the background. There's no telling how exposed that data is: once it's fed into these models, it's out there in the wild. Experts are toying with the idea that homomorphic encryption might just be the secret weapon against this menace, and a few bold projects may already be testing the waters.

"It is possible that people have already done this on a small scale, i.e., encrypting data that goes into the Gen AI model homomorphically, which prevents people from reading it directly," said Mark Horvath, VP analyst at Gartner.

There is, Horvath pointed out, mounting enthusiasm around homomorphic encryption use cases for AI workloads. For instance, the annual privacy and security workshop run by iDASH (integrating Data for Analysis, Anonymization, and SHaring) has worked on a model that considered pooling high-net-worth individuals' data from different organizations (funds and banks) under homomorphic encryption to enable comprehensive cybersecurity analysis through machine learning (ML) models. Horvath called this "secure multi-party computation." "Entities who might be competitors in some realms come together to share information, over homomorphic cipher, for a common purpose, maybe a search engine or a statistical analysis," he explained.

Initially developed as a defense against the looming threat of quantum computers, which are expected to break traditional encryption methods such as RSA, DSA, and ECC with ease, homomorphic encryption has gained traction as a powerful solution for safeguarding data in high-risk environments.

"Homomorphic encryption enables the user to access and perform operations on encrypted data without access to the secret key," said Frank Dickson, group vice president of Security & Trust at IDC.
"The result is that the operations and the data remain encrypted. It is an evolution of zero trust."

The technology allows, for instance, a customer's email address to be updated by a support service agent without the agent having to decrypt the entire database (possibly holding a million records) just to make a small change to one entry's email field, Dickson added, explaining the 'datum-level' operability of homomorphic encryption.

Homomorphic encryption comes in three types: partial, somewhat, and fully homomorphic encryption (FHE). Each allows a different level of encrypted computation: partial supports a single operation, somewhat allows a limited number of operations, and FHE enables arbitrary computations on encrypted data.

At least "in theory," homomorphic encryption would be a wonderful technology for Gen AI, Dickson agreed, with an evident sigh over the computing costs the technology demands.
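The "partial" end of that spectrum is easy to see in miniature. The Python sketch below implements the Paillier cryptosystem, a classic partially (additively) homomorphic scheme: a party holding only ciphertexts can add the underlying values together, yet only the secret-key holder can decrypt the result. The tiny fixed primes are for illustration only and offer no real security.

```python
# Toy Paillier cryptosystem: additively homomorphic encryption.
# Illustrative only -- the small fixed primes provide zero real-world security.
import math
import random

p, q = 2357, 2551              # demo primes (real deployments use ~1024-bit primes)
n = p * q
n2 = n * n
g = n + 1                      # standard generator choice for Paillier
lam = math.lcm(p - 1, q - 1)   # Carmichael's function of n

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # precomputed decryption constant

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# A party with no secret key multiplies ciphertexts to add plaintexts:
c1, c2 = encrypt(123), encrypt(456)
c_sum = (c1 * c2) % n2
assert decrypt(c_sum) == 579
```

A "somewhat" homomorphic scheme would additionally support a bounded number of multiplications, while an FHE scheme supports unlimited additions and multiplications, at far greater computational cost.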
Quantum hack for Gen AI risks
When asked about the current state of homomorphic encryption in addressing Gen AI risks, FortiGuard Labs, the threat intelligence arm of Fortinet, emphasized its promise.

"Homomorphic encryption may have significant implications for Gen AI, especially when it comes to addressing privacy challenges that arise from using and training large AI models," said Aamir Lakhani, global security strategist at FortiGuard Labs. "Gen AI models rely on vast amounts of data, often consisting of sensitive or proprietary information, and processing this data poses risks of exposure and unauthorized access. One application of homomorphic encryption I definitely see is protecting the source data from theft and poisoning attacks."

This idea isn't just theoretical. Companies like Enveil are already showcasing how homomorphic encryption can be harnessed in practical AI applications. Enveil's flagship ZeroReveal solutions enable encrypted searches and secure AI model interactions. The technology allows users to query AI models while keeping both the queries and the responses encrypted, ensuring that no sensitive information is exposed during processing. "Imagine you wanted to ask Google some questions and you didn't want anyone to know what you asked or what the results were, because that'll help them figure out what you asked," Horvath said. "Enveil allows you to do that."

Similarly, Zama has tailored its fully homomorphic encryption (FHE) solutions for the Gen AI landscape. By enabling sensitive data to remain encrypted even while being processed by large language models (LLMs), Zama ensures that AI workloads maintain data privacy. This is particularly valuable for federated learning and secure multi-party computation, where datasets from different organizations are aggregated without compromising individual privacy.
Meanwhile, Microsoft is experimenting with homomorphic encryption through its Azure Confidential Computing platform, focusing on encrypted AI workloads in sectors like healthcare and finance. The company aims to bridge the gap between security and utility, offering enterprises the ability to run AI models on encrypted data without exposing sensitive details.

IBM has also been at the forefront, leveraging its HElib library to perform secure computations on encrypted data. With a focus on practical applications, IBM has partnered with the healthcare and financial industries, enabling encrypted data analysis that safeguards privacy while yielding valuable insights.

By keeping data encrypted throughout processing, homomorphic encryption aims to mitigate the risks of data exposure and unauthorized access. However, the technology still has a long way to go. As Lakhani points out, the practical implementation of homomorphic encryption in Gen AI is still in its infancy. Companies like Google and Microsoft are exploring complementary techniques such as AI watermarking to enhance security further. These methods aim to identify and trace proprietary data used within Gen AI models, adding another layer of protection.

Why, though, has homomorphic encryption seen such stunted growth despite Craig Gentry's first practical construction dating back to 2009? Its monstrous computational cost is certainly a slowing factor.
Why is homomorphic encryption a tough nut to crack?
Homomorphic encryption remains computationally expensive due to the sheer volume of operations required to manipulate encrypted data securely. Performing basic arithmetic on encrypted data is thousands to millions of times slower than the equivalent plaintext operations. In specific benchmarks, addition of ciphertexts was reported to be up to 246,897 times slower than plaintext addition, with similarly high overhead for multiplication. These delays arise from the complex mathematical transformations needed to keep the encryption unbreakable during computation.

"At the moment, homomorphic encryption is only applicable to a small number of niche applications," Dickson added. "It is very difficult to overcome the compute tax."

The computational load increases substantially in Gen AI use cases, where models must process massive datasets. For instance, training even a simple machine learning model on homomorphically encrypted data can take several orders of magnitude longer than standard training. The challenge is exacerbated in encryption schemes such as Cheon-Kim-Kim-Song (CKKS) and Brakerski-Fan-Vercauteren (BFV), which must manage issues such as noise accumulation and bootstrapping, particularly in deep computations like neural network inference.

Efforts to mitigate these computational demands include optimized libraries such as IBM's HElib, Microsoft SEAL, and PALISADE, which aim to reduce processing times by refining encryption algorithms and bootstrapping techniques. Research also focuses on hybrid approaches, such as combining homomorphic encryption with lightweight symmetric encryption like AES to balance security and speed. These innovations aim to make homomorphic encryption more practical for real-world applications, but scaling it to high-demand Gen AI scenarios remains a significant hurdle. If these efforts prove effective, they could drive Gen AI adoption to unprecedented levels.
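The noise-accumulation problem behind bootstrapping can be seen in a toy example. The sketch below is a deliberately simplified symmetric LWE-style bit encryption (the same lattice family underlying CKKS and BFV, but nowhere near their real construction or security): every ciphertext carries a small error term, each homomorphic addition stacks the errors, and once the accumulated noise exceeds q/4 decryption returns the wrong bit. Bootstrapping exists precisely to reset that noise. All parameters here are invented for illustration.

```python
# Toy symmetric LWE-style encryption of single bits, illustrating why
# homomorphic schemes accumulate noise. Demo parameters only -- not secure.
import random

q, dim, noise_bound = 2**15, 16, 50
s = [random.randrange(q) for _ in range(dim)]       # secret key vector

def encrypt(bit):
    a = [random.randrange(q) for _ in range(dim)]
    e = random.randrange(-noise_bound, noise_bound + 1)  # small error term
    b = (sum(ai * si for ai, si in zip(a, s)) + e + bit * (q // 2)) % q
    return a, b

def add(ct1, ct2):                                   # homomorphic XOR of bits
    a1, b1 = ct1
    a2, b2 = ct2
    return [(x + y) % q for x, y in zip(a1, a2)], (b1 + b2) % q

def decrypt(ct):
    a, b = ct
    d = (b - sum(ai * si for ai, si in zip(a, s))) % q
    return 1 if q // 4 < d < 3 * q // 4 else 0

# A modest chain of homomorphic additions still decrypts correctly...
ct = encrypt(1)
for _ in range(10):
    ct = add(ct, encrypt(0))
assert decrypt(ct) == 1
# ...but each add() sums the hidden error terms; once the accumulated noise
# passes q/4, decrypt() flips the bit -- the failure that bootstrapping
# (homomorphically re-encrypting to a fresh, low-noise ciphertext) prevents.
```

Deep computations such as neural network inference chain thousands of such operations, which is why production schemes must either budget noise carefully or pay the heavy cost of bootstrapping along the way.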
First seen on csoonline.com
Jump to article: www.csoonline.com/article/3627116/this-new-cipher-tech-could-break-you-out-of-your-gen-ai-woes.html