What Is DeAI? What Counts as Decentralised AI?

What is DeAI? Every crypto project claims to be an AI project now. Three tests for separating genuine decentralised AI from narrative-surfing.

The problem with the AI label

There are 911 tokens in CoinMarketCap’s AI & Big Data category. That number was under 100 two years ago. So what happened? Did 800 teams suddenly crack decentralised AI, or did 800 teams update their website copy?

Some of these projects were building AI infrastructure before the narrative existed. Some have made genuine technical pivots. And a significant number have done nothing more than add “AI” to the tagline and watch the token pump.

CoinMarketCap’s AI category includes Bittensor (a network designed from the ground up to incentivise machine learning), NEAR Protocol (a sharded L1 now marketing itself as “The Blockchain for AI”, though the founders did start in ML before pivoting to blockchain in 2018), and Injective (a DeFi chain with no AI product at all). Messari’s classification system includes “AI Meme” as a legitimate sub-sector alongside AI Infrastructure. That tells you everything.

If you are trying to evaluate projects, allocate capital, or just understand what decentralised AI actually is, the existing labels are useless. So here is my attempt at drawing a line.

Three tests for genuine DeAI

I have spent the last several months reviewing projects across this space: staking tokens, testing open models locally and on a VPS with AI agent frameworks like Agent Zero and OpenClaw, reading whitepapers, checking GitHub repos, verifying on-chain data. Twenty years of managing delivery risk on major infrastructure projects makes you sceptical of promises without evidence. Out of that work, three tests have emerged that I apply to every project I evaluate.

The provenance test

Did the AI capability come from the project’s founding thesis, or was it bolted on in response to a market narrative?

This sounds simple. It is not. Some projects have legitimate technical credentials that predate the AI hype cycle. Bittensor was designed as a decentralised machine learning network from day one. Numerai has been running a crowdsourced ML hedge fund since 2015. Venice built private inference as its core product.

Others arrived at AI through genuine technical evolution. Render started as a distributed GPU network for 3D rendering. GPUs that render frames can also run inference. The underlying infrastructure (distributed GPU scheduling and payment rails) translates directly. That is not AI-washing. That is a legitimate expansion of an existing capability.

Then there are the projects where the timing tells the story. PathDAO was a gaming guild whose token fell 99%. It pivoted to social apps, then NFT fashion, then music. Nothing worked. In December 2023, it relaunched as Virtuals Protocol, an AI agent creation platform. The AI agent category then grew 322% in Q4 2024 and Virtuals rode the wave. I am not saying Virtuals has no product. It does. But ask yourself: was the founding question “what technical problem needs solving” or “what narrative will save this token”?

APENFT rebranding to AINFT in October 2025, a full year after the AI agent peak, with zero pre-existing AI work? That is not a pivot. That is a costume change. Justin Sun’s TRON ecosystem adding “AI infrastructure” to what was an NFT marketplace is pure narrative arbitrage.

The removal test

If you took the AI component out of the project, would it still make sense?

Bittensor without AI is nothing. The entire consensus mechanism, Yuma Consensus, is designed to reward machine learning contributions. Remove AI and there is no network. Same for Gensyn. Same for Venice’s private inference product.

Now consider Chainlink. It appears in several AI aggregator categories because it provides data infrastructure. But Chainlink without AI is still Chainlink. It provides oracle services to DeFi, gaming, insurance and dozens of other sectors. AI workloads might use Chainlink data feeds, but that does not make Chainlink an AI project any more than AWS is an AI company because OpenAI runs on it. EigenLayer is the same story. It is general-purpose restaking infrastructure on Ethereum. Maybe 10-15% of its AVSs are AI-related, and projects like Ritual and Hyperbolic build on top of it, but EigenLayer without AI is still EigenLayer: a restaking protocol. I hold staked EIGEN from the airdrop. That does not make it a DeAI position.

Filecoin is in the same position. CoinMarketCap lists it as the fourth largest AI & Big Data token by market cap. Protocol Labs has not rebranded Filecoin as an AI project. Their positioning is clear: AI workloads need storage, and Filecoin provides storage. Arweave is identical: permanent storage that AI workloads can use for model weights and training data, but Arweave without AI is still Arweave. That is an honest framing. Being useful to AI is not the same as being AI.

The removal test exposes the difference between AI as the foundation and AI as a customer.

If the project works fine without AI, the AI label is describing a customer, not the product.

The mechanism test

This one comes from Vitalik Buterin’s taxonomy of crypto-AI intersections. The question is: what role does AI play in the protocol’s mechanism?

Arbitrage bots on DEXs are AI using crypto infrastructure. The blockchain mechanism stays the same. The AI is just a more sophisticated player. Legitimate, but it does not make the DEX an “AI project.”

More interesting: AI as the objective of the game. The blockchain exists specifically to build, train, coordinate or distribute AI models. Bittensor’s subnets are this. Gensyn’s verified compute for ML training is this. The token incentives exist to produce intelligence. Remove the blockchain and you lose the coordination mechanism that makes the AI work.

Then there is the problematic case: AI as marketing for the game. The blockchain has its own purpose (DeFi, NFTs, gaming, storage) and AI has been added to the narrative without changing the underlying mechanism. The protocol works identically with or without the AI label.

The spectrum

These three tests produce a spectrum, not a binary. Here is how I categorise it.

Running from genuine to narrative:

Core DeAI. AI is the founding thesis. Remove it and the project has no reason to exist. (Bittensor, Gensyn, Venice, Morpheus, io.net)

DeAI-adjacent. Real tech that serves AI workloads, but the protocol is general-purpose. (Akash, Render, Phala, Nillion)

AI-washed. Existing project with “AI” added to the marketing after the narrative shifted. (AINFT, SuperVerse, Secret Network)

Core DeAI

Projects like Bittensor, Gensyn, Venice, io.net, Morpheus, Grass, Ocean Protocol and Vana range in quality and maturity. Some are live with real usage. Some are still building. But AI is what they are, not a feature they added.

DeAI-adjacent

Projects like Akash (general-purpose compute), Render (GPU rendering expanded to inference), Phala (confidential computing pivoting to AI agents), Nillion (blind computation with AI inference as one application), and GEODNET (decentralised geospatial data from ground stations, useful to autonomous vehicles and spatial AI, but the core product is positioning data) have real technology serving AI workloads, but the protocol is general-purpose.

I reviewed four privacy/confidential computing projects recently. All four, Nillion, Phala, Secret Network and Oasis, are general-purpose privacy infrastructure that has pivoted toward AI as a use case. Phala is the furthest along, with GPU TEE inference live on H100/H200/B200 hardware, 758,000 daily AI agent contract executions, and an integration with the Eliza V2 agent framework. But the underlying tech is horizontal. It runs trading bots, healthcare analytics and LLM inference on the same infrastructure.

DeAI-adjacent projects can be excellent investments and excellent infrastructure. They deserve coverage. But calling them DeAI projects without noting the general-purpose origin would be dishonest.

AI-washed

Projects where “AI” was added to the marketing after the narrative shifted. The AI component is either non-existent, embryonic, or incidental to the actual product. AINFT (formerly APENFT), SuperVerse, Secret Network (its DeFi platform collapsed with TVL down 94%; AI is the replacement narrative), and the hundreds of “GPT” tokens with no underlying technology all fall here.

I should be upfront here: I was an early adopter on Secret Network. Privacy is a first principle for me and Secret’s confidential computing thesis was exactly what I wanted to support. I staked SCRT, used the network, and genuinely wanted it to succeed. That makes this harder to write, but it does not change the analysis.

Secret Network is worth looking at closely because it has real technology. Its TEE-based confidential computing genuinely can run private AI inference. The SecretAI SDK exists. But the context matters: the DeFi ecosystem that was the original product failed. TVL went from $118 million to $7 million. The “confidential AI” positioning filled a vacuum left by a collapsing core business. The SDK has 9 GitHub stars. The team is “actively seeking customers and partners,” which is their own language for having infrastructure without adoption.

Having real tech does not get you off the hook if the pivot is narrative-driven rather than product-driven.

Why this matters

The AI crypto sector peaked at roughly $70 billion in December 2024, then fell 75%, erasing about $53 billion in market value. AI tokens posted a negative 50% year-to-date return in 2025 despite AI being the second most popular narrative in crypto.

That is what happens when genuine infrastructure gets lumped in with narrative tokens. Capital flows in on hype, discovers that most of the “AI projects” do not actually have AI products, and flows out again. The projects doing real work get caught in the drawdown alongside the ones that were always empty. I lost funds in FTX. I know what it feels like when a category collapses because nobody bothered to check what was real.

An arXiv paper from 2025 reviewed the sector and found that most AI token projects “present a decentralized architecture in theory, while retaining centralized control over core operations in practice.” A separate industry survey of 1,200 startups (not just crypto), widely cited but with methodology not independently verified, found that 40% of companies branding themselves “AI-first” had zero machine learning code in production. Twenty-five percent were simply wrapping OpenAI’s API with a new interface.

These numbers should make anyone cautious about taking AI claims at face value. In crypto, where token price directly correlates with narrative, the proportion is likely worse.

How I use this framework

Every project review on this site goes through these tests. When I write about Bittensor, the AI credentials are obvious. When I write about Akash, I am honest that it is a general-purpose compute marketplace where AI happens to be the highest-demand workload. When something does not pass the bar, it does not get a review.

This is not about purity. DeAI-adjacent projects can be more practically useful than Core DeAI ones. Akash’s compute marketplace works today and is cheaper than AWS. That matters more than whether it was founded as an “AI project.” But the distinction matters for understanding what you are buying, what the risks are, and whether the AI narrative is driving fundamentals or just price.

The three tests are simple. Did the AI come before the narrative or after it? Would the project survive without AI? Does AI change the mechanism or just the marketing? If a project fails all three, it is probably not what it claims to be.
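The three tests compose into a rough decision procedure, which can be sketched in a few lines of Python. This is a toy encoding, not the site's actual methodology: the function name, the boolean inputs, and the extra "serves real AI workloads" flag (which separates DeAI-adjacent from AI-washed) are my own; real evaluation is a judgment call, not a truth table.

```python
def classify_project(ai_founding_thesis: bool,
                     survives_without_ai: bool,
                     ai_changes_mechanism: bool,
                     serves_real_ai_workloads: bool) -> str:
    """Toy mapping of the three tests (plus one honesty check)
    onto the Core DeAI / DeAI-adjacent / AI-washed spectrum."""
    # Provenance, removal, and mechanism tests all pass -> Core DeAI
    if ai_founding_thesis and not survives_without_ai and ai_changes_mechanism:
        return "Core DeAI"
    # Fails the tests but has real tech serving AI -> adjacent
    if serves_real_ai_workloads:
        return "DeAI-adjacent"
    # Neither AI-native nor genuinely useful to AI -> narrative
    return "AI-washed"

# Bittensor-style profile: AI-first, dies without AI, AI in consensus
print(classify_project(True, False, True, True))    # Core DeAI
# Akash-style profile: general-purpose compute, AI is the big customer
print(classify_project(False, True, False, True))   # DeAI-adjacent
# AINFT-style profile: rebrand with no AI product
print(classify_project(False, True, False, False))  # AI-washed
```

The ordering matters: a project only reaches the adjacency check after failing the Core DeAI gate, which mirrors how the article applies the tests in sequence.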

The line we draw

Decentralised AI means infrastructure and protocols that exist specifically to build, train, distribute or run AI in a way that no single entity controls, and nobody owns your mind. The blockchain is not a payment layer bolted onto a centralised AI service. The AI is not a marketing label bolted onto an existing blockchain. The two need each other.

Everything else is either adjacent infrastructure that serves AI (which can be excellent) or narrative-surfing (which is not). Knowing which is which is half the work of investing in this space.

Frequently asked questions

What is DeAI (decentralised AI)? DeAI is artificial intelligence infrastructure built on decentralised networks: distributed compute, open-weight models, and token-coordinated protocols where no single entity controls access, censors outputs, or owns your data. The blockchain and the AI need each other; if you could remove one without affecting the other, it is not genuine DeAI.

What is the difference between DeAI and regular AI? Regular AI runs on centralised infrastructure controlled by a single company (OpenAI, Google, Anthropic). They decide which models you access, what content policies apply, and how your data is used. DeAI distributes that control across independent participants. You choose the model, the compute provider, and the privacy level. The trade-off is that decentralised infrastructure is less polished but gives you sovereignty over your AI stack.

How do I know if a crypto project is really an AI project? Apply three tests. First, did AI come before the token? Projects founded around AI workloads are more credible than existing blockchains that added “AI” to their marketing. Second, would the project survive if the AI narrative disappeared? If yes, AI is a use case, not the foundation. Third, does AI change the protocol’s mechanism or just its marketing? If the blockchain functions identically without AI, it is narrative-surfing.

What are examples of genuine DeAI projects? Core DeAI projects where AI is the founding thesis include Bittensor (decentralised AI network), Venice (private inference), Gensyn (distributed training), and Morpheus (AI agent network). DeAI-adjacent projects with genuine AI utility include Akash (general-purpose compute with AI as the primary workload) and Phala (confidential computing for AI). See our project directory for all 34 reviewed projects.

Is DeAI a good investment? It depends on the project. We score every project on two dimensions: Freedom (how decentralised it actually is) and Returns (how well the token captures value). Some projects score high on freedom but have weak token economics. Others have strong revenue models but centralised control. Our quadrant model maps this trade-off. The honest answer is that most DeAI tokens are early-stage bets on infrastructure that may take years to reach scale.
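The quadrant model in that answer can be sketched as a simple mapping. Everything concrete here is an assumption: the 0-10 scale, the 5.0 midpoint, and the function name are illustrative only; the article says nothing beyond the fact that projects are scored on Freedom and Returns.

```python
def quadrant(freedom: float, returns: float, midpoint: float = 5.0) -> str:
    """Place a project in a hypothetical Freedom/Returns quadrant.

    Assumes scores on a 0-10 scale split at 5.0; both are
    illustrative, not the site's actual scoring methodology."""
    high_freedom = freedom >= midpoint
    high_returns = returns >= midpoint
    if high_freedom and high_returns:
        return "high freedom, high returns"
    if high_freedom:
        return "high freedom, weak token economics"
    if high_returns:
        return "strong revenue, centralised control"
    return "weak on both dimensions"

print(quadrant(8.0, 3.0))  # high freedom, weak token economics
print(quadrant(2.0, 8.0))  # strong revenue, centralised control
```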
