Intel Demos Chip to Compute with Encrypted Data (spectrum.ieee.org)
178 points by sohkamyung 6 hours ago
freedomben 5 hours ago
Perhaps it's a cynical way to look at it, but in these days of the war on general-purpose computing and locked-down devices, I have to consider news like this in terms of how it could be used against users and device owners. I don't know enough to provide useful analysis, so I won't try, but will instead pose questions to the much smarter people who might have some interesting thoughts to share.
There are two non-exclusive paths I'm thinking of at the moment:
1. DRM: Might this enable a next level of DRM?
2. Hardware attestation: Might this enable a deeper level of hardware attestation?
gpapilion 3 hours ago
Just to level set here: I think it's important to realize this is really focused on allowing things like search to operate on encrypted data. This technique allows you to perform an operation on the data without decrypting it. Think of a row in a database with email, first name, last name, and mailing address. You want to search by email to retrieve the other fields, but don't want that data unencrypted, since it is PII.
In general, this solution would be expensive and targeted at data lakes, or areas where you want to run computation but not necessarily expose the data.
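To make that concrete, here's a minimal sketch in Python of the property being exploited, using textbook Paillier encryption, which is homomorphic for addition only (full FHE schemes like the ones Intel is accelerating also support multiplication, which is what makes arbitrary computation possible). The parameters are toy-sized and utterly insecure; this is just to show the shape of "compute on ciphertexts":

    import math, random

    # Textbook Paillier with toy primes -- NOT secure, illustration only.
    p, q = 1000003, 1000033
    n = p * q
    n2 = n * n
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)          # valid shortcut because g = n + 1

    def encrypt(m):
        while True:               # r must be coprime to n
            r = random.randrange(2, n)
            if math.gcd(r, n) == 1:
                break
        return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

    def decrypt(c):
        return ((pow(c, lam, n2) - 1) // n) * mu % n

    # Whoever holds these ciphertexts can ADD the plaintexts by
    # MULTIPLYING the ciphertexts -- without ever seeing 20000 or 22000.
    total = (encrypt(20000) * encrypt(22000)) % n2
    assert decrypt(total) == 42000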
With regard to DRM, one key thing to remember is that it has to be cheap and widely deployable. Part of the reason DVDs were easily broken is that the algorithm chosen was computationally inexpensive, so it could be installed on as many clients as possible.
amelius 19 minutes ago
I'm also thinking of what happens when quantum computing becomes available.
But when homomorphic encryption becomes efficient, perhaps governments can force companies to apply it (though they would lose their opportunity for backdooring; then again, E2EE is already a thing, so I wouldn't worry too much).
egorfine 5 hours ago
> how it could be used against the users and device owners
Same here.
Can't wait to KYC myself in order to use a CPU.
observationist an hour ago
KYC = Kill Your Conscience
It's truly amazing how modern people just blithely sacrifice their privacy and integrity for no good reason. Just to let big tech corporations more efficiently siphon money out of the market. And then they fight you passionately when you call out those companies for being unnecessarily invasive and intrusive.
The four horsemen of the infocalypse are such profoundly reliable boogeymen, we really need a huge psychological study across all modern cultures to see why they're so effective at dismantling rational thought in the general public, and how we can inoculate society against it without damaging other important social behaviors.
Frieren 5 hours ago
> how it could be used against the users
We are no longer their customers; we are just another product to sell. So they do not design chips for us, but for the benefit of other corporations.
3. Unskippable ads with data gathering at the CPU level.
dimitrios1 4 hours ago
I distinctly remember, in one of my more senior classes at university, designing logic gates, chaining together ANDs, NANDs, ORs, NORs, and XORs, and then working our way up to numerical processors, ALUs, and eventually latches, RAM, and CPUs. The capstone was creating an assembly language to control it all.
I remember thinking how fun it was! I could see unfolding before me endless ways to configure, reconfigure, optimize, etc.
I know there are a few open source chip efforts, but I wonder if maybe now is the time to pull the community together and organize more intentionally around that. Maybe open source chipsets won't be as fast as their corporate counterparts, but I think we are definitely at an inflection point in society where we need this to maintain freedom.
If anyone is working in that area, I am very interested. I am very green, but I still have the old textbooks I could dust off (I just don't have the old college-provided Mentor Graphics -- or I guess Siemens now -- design tools anymore).
youknownothing 5 hours ago
I don't think it's applicable to DRM, because you eventually need the decrypted content: DRM is typically used for books, music, video, etc., and you can't enjoy an encrypted video.
I think eGovernment is the main use case: not super high traffic (we're not voting every day), but very high privacy expectations.
freedomben 4 hours ago
Yes, it must be decrypted eventually, but I've read about systems (I think HDMI does this) where the keys are stored in the end device (like the TV or monitor) where the user can't access them. Given that we already have that, I think I agree that this news doesn't change anything, but I wonder if there are clever uses I haven't thought of.
benlivengood an hour ago
1. The private key is required to see anything computed under FHE, so DRM is pretty unlikely.
2. No, anyone can run the FHE computations anywhere on any hardware if they have the evaluation key (which would also have to be present in any FHE hardware).
ddtaylor 27 minutes ago
HDCP does some of that already in many of your devices.
gruez 5 hours ago
See: https://news.ycombinator.com/item?id=47323743
It's not related to DRM or trusted computing.
inetknght 5 hours ago
Not yet.
mathgradthrow an hour ago
No, because of the fundamental limitation of DRM. Content must be delivered as plaintext.
observationist an hour ago
Regarding DRM: you could use stream ciphers and other well-understood cryptographic schemes with an FHE chip like this to create an effectively tamper-proof and interception-proof OS, with the FHE chip supplementing normal processors. You'd basically be setting up E2EE between the streaming server and the display, audio output, or other stream target, and there'd be no way to intercept or inspect unencrypted data without breaking the device. Add modern tamper detection and you get a very secure setup with modern performance, with the FHE chip basically just handling keys and encapsulation operations, which have fairly low compute and bandwidth needs. DRM and attestation both, as well as fairly dystopian manufacturer and corporate control over devices users should own.
evolve2k 4 hours ago
My thought is half cynical: as LLM crawlers seek to mop up absolutely everything, companies themselves start to worry more about keeping their own data secret. Maybe that is a reason for shifts like this, as encrypted and other privacy-preserving products become more in demand across the board.
KoolKat23 2 hours ago
This is quite the opposite: better than what we have now.
It raises the hurdle for those looking to surveil.
If a tree falls in the forest and no one is around to hear it, does it make a sound?
This is primarily for cloud compute, I'd imagine, AI specifically, since it's generally not feasible or possible to run state-of-the-art models locally. Think GDPR and data sovereignty concerns; many customers demand privacy and can't use services without it.
vasco 3 hours ago
Regarding DRM I don't see how it'll survive "Camera in front of the screen" + "AI video upscaling" once the second part is good enough. Can't DRM between the screen and your eyes. Until they put DRM in Neuralink.
RiverCrochet 2 hours ago
> Can't DRM between the screen and your eyes.
No, but media can be watermarked in imperceptible ways, and then if all players are required to check and act on such watermarks, the gap becomes narrow enough to probably be effective.
See Cinavia.
zvqcMMV6Zcr 5 hours ago
> Heracles, which sped up FHE computing tasks as much as 5,000-fold compared to a top-of-the-line Intel server CPU.
That is a nice speed-up compared to generic hardware, but everyone probably wants to know how much slower it is than performing the same operations on plaintext data. I am sure a 50% penalty is acceptable; 95% probably is not.
corysama 5 hours ago
There are applications that are currently doing this without hardware support and accepting much worse than 95% performance loss to do so.
This hardware won’t make the technique attractive for ALL computation. But, it could dramatically increase the range of applications.
bobbiechen 4 hours ago
Agreed. When I was working on TEEs/confidential computing, just about everyone agreed that FHE was conceptually attractive (trust the math instead of trusting a hardware vendor) but the overhead of FHE was so insanely high. Think 1000x slowdowns turning your hour-long batch job into something that takes over a month to run instead.
patchnull 4 hours ago
Current FHE on general CPUs is typically 10,000x to 100,000x slower than plaintext, depending on the scheme and operation. So even with a 5,000x ASIC speedup you are still looking at roughly 20-100x overhead vs unencrypted compute.
That rules out anything latency-sensitive, but for batch workloads like aggregating encrypted medical records or running simple ML inference on private data it starts to become practical. The real unlock is not raw speed parity but getting FHE fast enough that you can justify the privacy tradeoff for specific regulated workloads.
tromp 3 hours ago
10,000x to 100,000x / 5,000x = 2 to 20x, not 20 to 100x.
Foobar8568 4 hours ago
Now we know why Intel more or less abandoned SEAL and rejected GPU requests.
mmaunder 5 hours ago
Someone explain how you'd create a vector embedding from homomorphically encrypted data without decrypting it. It seems like a catch-22: you don't get to know the semantic meaning, but you need the semantic meaning to position it in high-dimensional space. I guess the point I'm making is that, sure, you can sell compute for FHE, but you quickly run up against a hard limit on any value-added SaaS you can provide the customer. This feels like a solution that's being shoehorned in because cloud providers really, really, really want customers to use their data centers, when in truth the best solution would be a secure facility for the customer, so that applications can actually understand the data they're working with.
bob1029 4 hours ago
Most of modern machine learning is effectively linear algebra. We can achieve semantic search over encrypted vectors if the encryption relies on similar principles.
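As a hedged sketch of that idea, reusing the toy Paillier shape from upthread (real deployments would use a lattice scheme such as CKKS over real-valued vectors, not this toy with small non-negative integers): ciphertext exponentiation scales the hidden plaintext, so a server holding plaintext document vectors can compute a similarity score against an encrypted query without ever seeing the query.

    import math, random

    # Toy Paillier again (insecure demo parameters, as upthread).
    p, q = 1000003, 1000033
    n, n2 = p * q, (p * q) ** 2
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)

    def encrypt(m):
        while True:
            r = random.randrange(2, n)
            if math.gcd(r, n) == 1:
                break
        return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

    def decrypt(c):
        return ((pow(c, lam, n2) - 1) // n) * mu % n

    def server_dot(enc_query, plain_doc):
        # Enc(m)^k = Enc(k*m) and Enc(a)*Enc(b) = Enc(a+b), so this
        # folds up Enc(sum(q_i * d_i)) -- an encrypted dot product.
        acc = encrypt(0)
        for c, d in zip(enc_query, plain_doc):
            acc = (acc * pow(c, d, n2)) % n2
        return acc

    query = [encrypt(x) for x in (2, 0, 5)]   # client's private vector
    score = server_dot(query, (3, 1, 4))      # server never sees (2, 0, 5)
    assert decrypt(score) == 2*3 + 0*1 + 5*4  # 26

Note this only yields an encrypted score; ranking many scores on the server needs encrypted comparisons, which is where full FHE (and hardware to make it bearable) comes in. mmaunder's harder question, computing the embedding itself from encrypted text, would mean running the whole embedding model under FHE too: possible in principle, brutally expensive today.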
Chance-Device 5 hours ago
FHE is the future of AI. I predict local models with encrypted weights will become the norm. Both privacy preserving (insofar as anything on our devices can be) and locked down to prevent misuse. It may not be pretty but I think this is where we will end up.
boramalper 4 hours ago
If you're interested in "private AI", see Confer [0] by Moxie Marlinspike, the founder of the Signal private messaging app. They go into more detail on their blog. [1]
[0] https://confer.to/
[1] https://confer.to/blog/2025/12/confessions-to-a-data-lake/
CamperBob2 3 hours ago
I don't get how this can work, and Moxie (or rather his LLM) never bothers to explain. How can an LLM possibly exchange encrypted text with the user without decrypting it?
The correct solution isn't yet another cloud service, but rather local models.
Reptur 3 hours ago
If encrypted outputs can be viewed or used, they can be reverse-engineered through that same interface. FHE shifts the attack surface; it does not eliminate it.
Chance-Device 2 hours ago
If you know how to reverse engineer weights or even hidden states through simple text output without logprobs I’d be interested in hearing about it. I imagine a lot of other people would be too.
Foobar8568 3 hours ago
FHE is impractical by any measure. Either it's trivially broken and insecure, or the space requirements go beyond anything usable.
There is basically no business demand aside from sellers and scholars.
bilekas 3 hours ago
This is incredible work, and it makes the technology absolutely viable.
However... in a world where privacy is constantly being eroded intentionally by governments and private companies, I think this will NEVER, ever reach any consumer-grade hardware. My inner cynic could envision a worldwide export ban on the technology, in the vein of RSA [0].
Why would any company offer customers real out-of-the-box E2E encryption built into their devices?
DRM was mentioned by another user. This will not be used to enable privacy for the masses.
[0] https://en.wikipedia.org/wiki/Export_of_cryptography_from_th...
autoexec 2 hours ago
> In a world where privacy is constantly being eroded intentionally by governments and private companies, I think this will NEVER, ever reach any consumer grade hardware.
Why not, when the government can just force companies to backdoor their hardware for them? That way users are secure most of the time, except from the government (until the backdoor in Intel's chips gets discovered, anyway), and users have a false sense of security/privacy, so people are more likely to share their secrets with corporations, and the government gets to spy on people communicating more openly with each other.
FrasiertheLion 3 hours ago
Arguably this is less useful for consumer hardware in the first place. This is mostly useful when I don’t trust the service provider with my data but still need to use their services (casting my vote, encrypted inference, and so forth).
bilekas 3 hours ago
True. In the case of casting a vote, for example, I could see it being used within the voting machine itself before the vote is sent off to be counted. Good application.
But making this available to customers, say as a PCIe card or something that automatically encrypts everything you run over an encrypted connection, would be a dream.
jpauline 3 hours ago
This is a huge win for cybersecurity and data privacy.
gigatexal 3 hours ago
If they can get this shrunk down and efficient enough, I could see Apple moving back to Intel for it in a future scenario, given Apple's stance on encryption and it being a pillar of their image.
Joel_Mckay 12 minutes ago
Not going to happen anytime soon, as the modern M4/ARM unified memory with on-chip GPU is years ahead of Intel. The software ecosystem is slowly growing to leverage this chip architecture, and due to the annoying PC RAM, SSD, and RTX GPU shenanigans, it is no longer the lower-value option.
The PC market was made shitty enough this year that mid/high-end Mac Pros/laptops are actually often the better value deal now (if and only if your use case is covered software-wise).
Intel does plan on an RTX + amd64 SoC soon, but still pooched the memory interface with a 30-year-old mailbox kludge. Intel probably won't survive this choice without bailouts. =3
JanoMartinez 5 hours ago
One thing I'm curious about is whether this could change how cloud providers handle sensitive workloads.
If computation can happen directly on encrypted data, does that reduce the need for trusted environments like SGX/TEE, or does it mostly complement them?
esseph 5 hours ago
Everything about this in my head screams "bad idea".
If you need to trust the encryption and trust the hardware itself, it may not be suitable for your environment/ threat model.
numpad0 2 hours ago
It is a bad idea, but not in the way you think. FHE hardware doesn't decrypt data on-chip. It's like using Diffie-Hellman key exchange for general computation: the data and operations stay encrypted at every moment they are outside your client device.
The textbook example application of FHE is phone book search. The server "multiplies" the whole phonebook database file with your encrypted query and sends back the whole database file to you every time, regardless of the query. When you decrypt the file with the key used to encrypt the query, the database is all corrupt and garbled except for the rows matching the query, so the search has effectively taken place. The only information that exists in the clear is that a query happened and the size of the entire database. (There's a toy sketch of the pattern below.)
Sounds fantastically energy-efficient, no? That's the problem with FHE, not risks of backdooring.
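A compressed variant of that phonebook example, sketched in Python with the toy Paillier from upthread (additive-only and insecure; selecting by row index rather than by email, and returning one combined ciphertext instead of the whole garbled file). The point to notice is that the server must touch every row on every query, which is exactly the efficiency problem:

    import math, random

    # Toy Paillier (insecure parameters, as upthread).
    p, q = 1000003, 1000033
    n, n2 = p * q, (p * q) ** 2
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)

    def encrypt(m):
        while True:
            r = random.randrange(2, n)
            if math.gcd(r, n) == 1:
                break
        return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

    def decrypt(c):
        return ((pow(c, lam, n2) - 1) // n) * mu % n

    phonebook = [5551234, 5559876, 5550000]   # rows as plain integers

    # Client: encrypted one-hot selector. Encryption is randomized, so
    # every slot looks alike; the server can't tell which row is wanted.
    want = 1
    selector = [encrypt(1 if i == want else 0) for i in range(len(phonebook))]

    # Server: folds the selector against EVERY row, computing
    # Enc(sum(sel_i * row_i)) without learning the query or the answer.
    acc = encrypt(0)
    for c, row in zip(selector, phonebook):
        acc = (acc * pow(c, row, n2)) % n2

    assert decrypt(acc) == 5559876            # only the client sees this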
u1hcw9nx 5 hours ago
In FHE the hardware running it doesn't know the secrets. That's the point.
First you encrypt the data. Then you send it to the hardware to compute on, get the result back, and decrypt it.
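A minimal illustration of that split, using the well-known (but insecure as-is) fact that textbook RSA is multiplicatively homomorphic. The "server" function below stands in for the untrusted hardware and never touches the private key:

    # Textbook RSA: Enc(a) * Enc(b) mod n = Enc(a * b). Classic toy
    # parameters; textbook RSA is not semantically secure -- demo only.
    p, q, e = 61, 53, 17
    n = p * q                             # public modulus (3233)
    d = pow(e, -1, (p - 1) * (q - 1))     # private key, stays with client

    def untrusted_server(c1, c2):
        # Sees only ciphertexts and the public (n, e); no secrets here.
        return (c1 * c2) % n

    a, b = 7, 12
    c = untrusted_server(pow(a, e, n), pow(b, e, n))  # client encrypts, sends
    assert pow(c, d, n) == a * b                      # client decrypts: 84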
Foobar8568 3 hours ago
But you leak all kinds of information, and retrieval either leaks even more data, or you end up transferring god knows how much data, or your encryption is trivially broken, or you spend days/months/years to decrypt.
gruez 5 hours ago
>If you need to trust the encryption and trust the hardware itself, it may not be suitable for your environment/ threat model.
Are we reading the same article? It's talking about homomorphic encryption, i.e. doing mathematical operations on already-encrypted data without being aware of its cleartext contents. It's not related to SGX or other trusted computing technologies.
cwmma 5 hours ago
In theory you only need to trust the hardware to be correct: since it doesn't have the decryption key, the worst it can do is give you a wrong answer. In theory.
esseph 4 hours ago
But can you trust the hardware encryption to not be backdoored, by design?
That's my point: this sounds like a way to create a backdoor for at-rest data.