Privacy & security

TEE

Trusted Execution Environment. A hardware-secured region of a CPU or GPU where code runs in isolation, so even the machine's operator can't read what's happening inside. TEEs underpin the privacy guarantees of decentralised AI inference.

Also known as: trusted execution environment, secure enclave

TEEs solve a specific problem in decentralised computing: how to run sensitive code on someone else’s hardware without trusting them. Normally, when you send a prompt to a Venice inference server or a Phala confidential compute node, the machine’s operator could in principle read your data, log your queries, or modify the code you thought was running. TEEs close that gap by creating an isolated region of the processor where code and data are encrypted in memory, access is gated by hardware, and the operator of the machine literally cannot see what’s inside.

The main TEE implementations you’ll see in DeAI articles are Intel SGX (older, smaller enclave sizes, several historical vulnerabilities), Intel TDX (newer, virtual machine scale, used by NEAR AI Cloud and Phala), and NVIDIA Confidential Computing (H100 and H200 GPU enclaves, important because AI inference happens on GPUs). AMD SEV-SNP exists too but is less common in DeAI contexts. Each has slightly different trust assumptions, performance characteristics, and attestation protocols.

The attestation mechanism is what makes TEEs more than a privacy feature. A TEE can cryptographically prove to a remote party that it’s running a specific piece of code on a specific hardware version, in a specific unmodified configuration. This means a decentralised inference service can prove to users that the exact published model and code are running, not a tampered version that logs prompts to a hidden database. Without attestation, a TEE is just a promise; with attestation, it’s a verifiable commitment.
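The attestation flow above can be sketched in miniature. This is a simplified illustration, not a real attestation protocol: the HMAC key stands in for the asymmetric signing key rooted in the CPU (real deployments involve a vendor-operated quoting service and a certificate chain), and all names here are hypothetical. The shape of the exchange is the real one, though: the verifier sends a fresh nonce, the enclave returns a signed measurement of its loaded code, and the verifier checks both the signature and that the measurement matches the published binary.

```python
import hashlib
import hmac
import os

# Hypothetical stand-in for the signing key fused into the CPU.
# Real TEEs use asymmetric keys plus a vendor certificate chain;
# HMAC is used here only to keep the sketch self-contained.
HARDWARE_KEY = b"simulated-cpu-root-key"

def produce_quote(enclave_code: bytes, nonce: bytes) -> dict:
    """Enclave side: measure the loaded code, then sign the
    measurement together with the verifier's fresh nonce."""
    measurement = hashlib.sha256(enclave_code).hexdigest()
    payload = measurement.encode() + nonce
    signature = hmac.new(HARDWARE_KEY, payload, hashlib.sha256).hexdigest()
    return {"measurement": measurement, "nonce": nonce, "signature": signature}

def verify_quote(quote: dict, expected_measurement: str, nonce: bytes) -> bool:
    """Remote user side: check the signature, the nonce (freshness),
    and that the measurement matches the published code."""
    payload = quote["measurement"].encode() + quote["nonce"]
    sig_ok = hmac.compare_digest(
        quote["signature"],
        hmac.new(HARDWARE_KEY, payload, hashlib.sha256).hexdigest(),
    )
    return sig_ok and quote["nonce"] == nonce and quote["measurement"] == expected_measurement

# The verifier challenges the enclave with a fresh nonce.
code = b"published-model-server-v1"
nonce = os.urandom(16)
quote = produce_quote(code, nonce)
assert verify_quote(quote, hashlib.sha256(code).hexdigest(), nonce)

# A tampered binary yields a different measurement, so verification fails
# even though the quote itself is validly signed.
bad_quote = produce_quote(b"tampered-server", nonce)
assert not verify_quote(bad_quote, hashlib.sha256(code).hexdigest(), nonce)
```

The nonce is what prevents replay: without it, an operator could record one honest quote and present it forever while running different code.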

The honest limit of TEEs is that they depend on the hardware vendor’s security model. If Intel has a flaw in SGX (and it has: Foreshadow, Plundervolt, SGAxe, CacheOut), every TEE deployment inherits the flaw until the hardware is patched or replaced. This is why “privacy via TEE” is not cryptographically equivalent to “privacy via zero-knowledge proof” or “privacy via fully homomorphic encryption”: TEEs rely on trust in a specific hardware manufacturer. The OYM Data Sovereignty dimension scores projects partly on whether they use TEEs in combination with other privacy layers or as a single point of trust.

Related terms