AI Needs Privacy. iExec Is Building It with Confidential Computing + DePIN
Talk Recap
May 26, 2025
In Brief
Martin Leclercq of iExec took the stage at DePIN Day Dubai 2025 to introduce Confidential AI — the fusion of decentralized infrastructure and trusted execution environments (TEEs) to enable private, verifiable, and secure artificial intelligence. At a time when AI models and agents demand deeper access to personal data, iExec is building the Trust Layer for AI and DePIN, ensuring privacy is preserved by default.
iExec: A Legacy in Decentralized Compute
Founded in 2017, iExec is one of the earliest Web3 protocols focused on decentralized cloud computing. Over the years, it evolved from a marketplace for compute power and data (2017), to combining blockchain with confidential computing (2020), to offering developer SDKs and an incubator (2023), and today, to building the privacy-first trust layer for AI and DePIN.
The Problem: Privacy in AI is Broken
While privacy is a fundamental right, it is often neglected in today’s AI-powered world. Most traditional systems encrypt data at rest and in transit, but not while it is being processed. That gap is the critical vulnerability: sensitive user data is exposed during computation, especially in AI workloads that require continuous access to input data.
Traditional AI also suffers from:
Black-box decision-making
Centralized control risks
Lack of verifiability
Privacy breaches (40% of organizations have experienced one)

The Solution: Confidential AI
iExec is building Confidential AI, an approach that runs encrypted models and data inside secure enclaves provided by Trusted Execution Environments (TEEs). These enclaves, created with Intel SGX and the newer Intel TDX, allow privacy-preserving computation with verifiable trust.
In iExec’s architecture, two core components run in the enclave:
A database of secrets, accessible only by authorized apps
The iApp (iExec Application) — user-deployed AI/logic that runs entirely in the enclave
iExec now supports both SGX (lightweight, high-assurance modules) and TDX (scalable for large AI models and legacy apps).
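To make the enclave model concrete, here is a minimal sketch of what an iApp’s logic could look like. The directory layout, environment variables, and file names are illustrative assumptions for this post, not iExec’s actual runtime interface: the point is that decryption and computation happen only inside the TEE, and only the derived result comes out.

```typescript
// Conceptual iApp entrypoint: everything here runs inside the TEE enclave.
// ENCLAVE_INPUT_DIR / ENCLAVE_OUTPUT_DIR are illustrative placeholders.
import { readFile, writeFile } from "node:fs/promises";
import path from "node:path";

async function main(): Promise<void> {
  const inputDir = process.env.ENCLAVE_INPUT_DIR ?? "/enclave/in";
  const outputDir = process.env.ENCLAVE_OUTPUT_DIR ?? "/enclave/out";

  // The protected data is decrypted by the enclave runtime and is visible
  // only to this authorized iApp; it never leaves the TEE in clear text.
  const raw = await readFile(path.join(inputDir, "protected-data.json"), "utf8");
  const records = JSON.parse(raw) as { value: number }[];

  // Run the private computation (a trivial aggregate stands in for an AI model).
  const average = records.reduce((sum, r) => sum + r.value, 0) / records.length;

  // Only the derived result is written out of the enclave, never the raw input.
  await writeFile(path.join(outputDir, "result.json"), JSON.stringify({ average }));
}

main().catch((err) => {
  console.error("iApp failed:", err);
  process.exit(1);
});
```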
Decentralized Confidential Computing: Privacy + Web3 Logic
What makes iExec unique is not just running TEEs, but doing so in a decentralized way. The platform adds a Web3 layer for governance, allowing users to define:
Who can access their data
How it’s used
What rules apply to monetization or sharing
This unlocks a new model of computing where privacy, trust, and incentives are programmable.
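As a rough illustration of what “programmable privacy” means, the sketch below models an access grant as data: who may use a protected dataset, through which iApp, at what price, and how many times. The AccessGrant type and grantAccess function are hypothetical stand-ins, not iExec’s SDK; the real protocol enforces equivalent rules on-chain.

```typescript
// Illustrative sketch of programmable data governance (hypothetical types).
interface AccessGrant {
  protectedData: string;   // on-chain address of the encrypted dataset
  authorizedApp: string;   // only this iApp may decrypt it inside a TEE
  authorizedUser: string;  // only this wallet may trigger executions
  pricePerAccess: bigint;  // monetization rule, in the protocol's token unit
  numberOfAccess: number;  // usage cap enforced by the protocol
}

// The grant is an on-chain authorization: the enclave refuses to decrypt
// the data unless the requesting app, user, and payment match it.
async function grantAccess(grant: AccessGrant): Promise<string> {
  // ...sign and publish the authorization order on-chain (omitted)...
  return "0xorderHash";
}

const grant: AccessGrant = {
  protectedData: "0xProtectedEmailData",
  authorizedApp: "0xSummarizerIApp",
  authorizedUser: "0xAliceWallet",
  pricePerAccess: 1_000_000_000n, // e.g. 1 RLC expressed in its smallest unit
  numberOfAccess: 10,
};
```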

Enter AI Agents: The Need for Secure Autonomy
As we enter the age of AI Agents — autonomous, decision-making systems that interact with your tools and data — privacy risks grow exponentially. These agents often require:
Access to your inbox, calendar, files
Personalized insights trained on private inputs
iExec’s Confidential AI allows these agents to operate without ever exposing the raw data. This ensures that users remain in control — and that the models themselves can be monetized securely.
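From the agent’s side, the interaction could look like the sketch below: the agent requests a confidential run over data the user has already protected and authorized, and receives only the derived output. runConfidentialTask and the addresses are assumed names standing in for a call to a decentralized confidential-computing protocol such as iExec’s, not a real API.

```typescript
// Hypothetical agent-side flow: the agent never downloads the raw calendar.
interface ConfidentialTaskRequest {
  iApp: string;           // TEE application to execute
  protectedData: string;  // encrypted dataset the user has granted access to
  args: string;           // public parameters only, no secrets here
}

// Assumed helper that submits the task and returns where to fetch the result.
declare function runConfidentialTask(
  req: ConfidentialTaskRequest
): Promise<{ resultUrl: string }>;

async function planMyWeek(): Promise<string> {
  // The agent only ever sees the derived output of the enclave run.
  const { resultUrl } = await runConfidentialTask({
    iApp: "0xCalendarSummarizerIApp",
    protectedData: "0xMyProtectedCalendar",
    args: "summarize-next-7-days",
  });
  const response = await fetch(resultUrl);
  return response.text(); // e.g. "3 meetings, 2 deadlines, Friday is free"
}
```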

What You Can Build with Confidential AI
With Confidential AI, developers can now:
Monetize data used for training models — with privacy guarantees
Run AI models inside enclaves that never expose input data
Deploy autonomous AI agents that respect user-defined rules
Create end-to-end encrypted workflows, even on public infrastructure

Building the Trust Layer for DePIN and AI
As AI agents become ubiquitous, the need for confidential, verifiable infrastructure becomes critical, and iExec is laying the trust rails for it. Through Confidential AI, iExec empowers users to control, protect, and profit from their data, while enabling developers to build ethical AI with ease.