The World's First Publicly Funded AI

WWWIII

One token. One mission.
Fund the first publicly developed large language model.
Open weights. Open training. Built by everyone.
TOKEN: $WWWIII · NETWORK: Ethereum ERC-20 · SUPPLY: 1,000,000,000 · MISSION: First Publicly Built LLM · DEV FUND: 40% · COMMUNITY: 30% · STATUS: Building · LICENSE: Apache 2.0
Back the build.
Every contribution goes directly to compute, researchers, and infrastructure.
Funding Goal
$1,000,000,000
To train the first publicly built LLM
$0 raised of $1,000,000,000
Supporter
0.05 ETH
50,000 $WWWIII
  • Early access to model checkpoints
  • DAO governance voting rights
  • Supporter badge on contributor wall
Architect
1.0 ETH
1,500,000 $WWWIII
  • Everything in Supporter
  • Vote on architecture decisions
  • Dedicated compute credit
  • Advisory council seat
— OR SEND ETH DIRECTLY —
Presale Contract (Sepolia)
0xCD26a62fc178129F4b24759c329e0c1867d4e613
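The two tiers above imply different effective rates per ETH. A quick sketch in Python, using only the tier numbers listed on this page, makes the math explicit:

```python
# Presale tiers as listed above: ETH contributed -> $WWWIII received.
TIERS = {
    "Supporter": {"eth": 0.05, "tokens": 50_000},
    "Architect": {"eth": 1.0, "tokens": 1_500_000},
}

for name, t in TIERS.items():
    rate = t["tokens"] / t["eth"]  # $WWWIII per ETH
    print(f"{name}: {rate:,.0f} $WWWIII per ETH")
```

Note that Architect works out to 1,500,000 $WWWIII per ETH versus 1,000,000 for Supporter, i.e. a 50% bonus rate at the higher tier.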
The AI arms race has a problem.
Every frontier model is built behind closed doors — by corporations, for corporations. The most powerful technology in human history is being developed without you.
🔒

Closed Development

GPT, Gemini, Claude — built by billion-dollar labs. Training data hidden. Architecture proprietary. You use the product but never own it.

💰

The Compute Problem

Training a frontier LLM costs $100M+ in compute alone. No individual can afford it. No open-source project can match it. Until now.

🌍

The WWWIII Solution

Pool resources through a token. Fund compute, researchers, and infrastructure. Every holder is a stakeholder in the model. Open weights. Open code. Open everything.

$WWWIII
1 billion tokens. Every one tied to building the first publicly created LLM.
Total Supply
1,000,000,000
$WWWIII
40%Development Fund
Compute, training runs, infrastructure, researcher grants
30%Community
Airdrops, contributor rewards, governance participation
15%Team & Advisors
2-year vesting, 6-month cliff — aligned with long-term mission
10%Liquidity
DEX pools, market making, exchange listings
5%Reserve
Emergency fund, partnerships, unforeseen opportunities
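The split above can be verified in a few lines of Python; the percentages and total supply are taken directly from this page:

```python
# Token allocation check: percentages must cover the full fixed supply.
TOTAL_SUPPLY = 1_000_000_000

ALLOCATIONS = {
    "Development Fund": 40,
    "Community": 30,
    "Team & Advisors": 15,
    "Liquidity": 10,
    "Reserve": 5,
}

amounts = {name: TOTAL_SUPPLY * pct // 100 for name, pct in ALLOCATIONS.items()}

assert sum(ALLOCATIONS.values()) == 100          # percentages sum to 100
assert sum(amounts.values()) == TOTAL_SUPPLY     # no tokens lost to rounding

print(amounts["Development Fund"])  # 400000000
```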
From token to transformer.
A phased plan to build the world's first publicly funded large language model.
Phase 1 — Foundation
Token Launch & Community
  • Deploy $WWWIII ERC-20 contract on Ethereum mainnet
  • Initial DEX offering on Uniswap
  • Establish DAO governance framework
  • Publish full technical whitepaper
  • Open-source all project infrastructure
Phase 2 — Architecture
Model Design & Data Pipeline
  • Hire core research team (publicly elected advisors)
  • Design transformer architecture — all decisions voted on by holders
  • Build open data pipeline (CommonCrawl, Wikipedia, ArXiv, etc.)
  • Secure GPU compute partnerships (decentralized + cloud hybrid)
  • Launch contributor reward program
Phase 3 — Training
First Training Run
  • Begin pre-training on assembled compute cluster
  • Live dashboard: loss curves, token throughput, cost — all public
  • Community-driven RLHF alignment
  • Intermediate checkpoints released as open weights
  • Bug bounties for safety and alignment issues
Phase 4 — Release
The People's Model
  • Full model release — Apache 2.0 license
  • Open API for token holders (free tier for all)
  • Inference cost subsidized by treasury
  • Begin fine-tuning ecosystem grants
  • Plan next-generation model based on learnings
Why this matters.
The case for publicly funded artificial intelligence.

The Concentration Problem

As of 2026, fewer than ten organizations on Earth have trained a frontier large language model. The cost of compute, the scarcity of talent, and the secrecy of training data have created an oligopoly over the most transformative technology since the printing press.

OpenAI raised $6.6 billion. Anthropic raised $7.3 billion. Google, Meta, and xAI have spent tens of billions more. The result: a handful of companies own the future of intelligence — and none of them answer to you.

The Open-Source Gap

Open-source models like LLaMA, Mistral, and DeepSeek have proven that open weights can rival closed systems. But they still depend on corporate benefactors. Meta open-sources LLaMA — but Meta decides the architecture, the training data, and the release schedule. Open weights are not the same as open development.

WWWIII proposes something different: a model that is publicly funded, publicly designed, and publicly governed from day one. Not a corporate gift. A collective creation.

Token Utility

The $WWWIII token is not a meme. It is a coordination mechanism. Holders govern decisions about architecture, training data, compute allocation, and release strategy through on-chain voting. The token directly funds:

GPU compute — the single largest cost in AI development. Tokens from the Development Fund are converted to pay for cloud compute and decentralized GPU networks.

Researchers — grants for ML engineers, alignment researchers, and data scientists who contribute to the project.

Infrastructure — training pipelines, data processing, evaluation benchmarks, and the open API.
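On-chain governance of this kind is typically token-weighted: each holder's vote counts in proportion to their balance. A minimal off-chain sketch of a tally (the holders, balances, and proposal here are hypothetical, not part of any deployed contract):

```python
# Token-weighted vote tally: one token, one vote (hypothetical data).
holders = {"0xAlice": 500_000, "0xBob": 200_000, "0xCarol": 300_000}
votes = {"0xAlice": "yes", "0xBob": "no", "0xCarol": "yes"}

tally = {}
for addr, choice in votes.items():
    tally[choice] = tally.get(choice, 0) + holders[addr]

total = sum(holders.values())
print(tally)                                  # {'yes': 800000, 'no': 200000}
print(f"yes share: {tally['yes'] / total:.0%}")  # yes share: 80%
```

In production this logic lives on-chain (e.g. in a Governor-style contract) with balance snapshots taken at proposal creation, so buying tokens after a vote opens cannot change its outcome.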

Why Crypto?

Because no existing institution can do this. Governments move too slowly. Corporations have misaligned incentives. Crowdfunding platforms take a cut and offer no governance. A token on Ethereum provides: global permissionless funding, transparent treasury, programmable governance, and aligned incentives — token holders benefit when the model succeeds.

What We're Building

A transformer-based language model, targeting 70B+ parameters on initial release, trained on a curated open dataset. All training code, data processing pipelines, and model weights released under Apache 2.0. Every training run logged publicly. Every architectural decision voted on. Every dollar accounted for on-chain.

This is not a startup. It's a movement. The world's first publicly developed artificial intelligence.

The code.
ERC-20 token contract — auditable, verifiable, immutable.
WWWIII.sol Solidity ^0.8.20
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;

import "@openzeppelin/contracts/token/ERC20/ERC20.sol";
import "@openzeppelin/contracts/access/Ownable.sol";

/**
 * @title WWWIII — The People's AI Token
 * @notice Funds the first publicly developed large language model
 * @dev 1B fixed supply, no mint function, optional burn (deflationary)
 */
contract WWWIII is ERC20, Ownable {

    uint256 public constant TOTAL_SUPPLY = 1_000_000_000 * 10**18;

    // Allocation addresses
    address public devFund;      // 40% — compute, research, infra
    address public community;    // 30% — airdrops, rewards, governance
    address public team;         // 15% — 2yr vest, 6mo cliff
    address public liquidity;    // 10% — DEX pools
    address public reserve;      //  5% — emergency + partnerships

    constructor(
        address _devFund,
        address _community,
        address _team,
        address _liquidity,
        address _reserve
    ) ERC20("WWWIII", "WWWIII") Ownable(msg.sender) {
        devFund   = _devFund;
        community = _community;
        team      = _team;
        liquidity = _liquidity;
        reserve   = _reserve;

        _mint(devFund,   TOTAL_SUPPLY * 40 / 100);
        _mint(community, TOTAL_SUPPLY * 30 / 100);
        _mint(team,      TOTAL_SUPPLY * 15 / 100);
        _mint(liquidity, TOTAL_SUPPLY * 10 / 100);
        _mint(reserve,   TOTAL_SUPPLY *  5 / 100);
    }

    /// @notice Optional burn — deflationary mechanism
    function burn(uint256 amount) external {
        _burn(msg.sender, amount);
    }
}
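One caveat worth noting: the contract mints the team allocation directly to the team address, so the 2-year vest with a 6-month cliff described in the tokenomics would have to be enforced by a separate vesting contract (OpenZeppelin's VestingWallet is a common choice). A minimal sketch of the schedule's math in Python, using the cliff and duration from this page:

```python
def vested(total: int, months_elapsed: int,
           cliff_months: int = 6, vest_months: int = 24) -> int:
    """Linear vesting with a cliff: nothing is claimable before the
    cliff, then tokens release linearly until fully vested."""
    if months_elapsed < cliff_months:
        return 0
    if months_elapsed >= vest_months:
        return total
    return total * months_elapsed // vest_months

TEAM_ALLOCATION = 150_000_000  # 15% of the 1B supply

print(vested(TEAM_ALLOCATION, 3))   # 0 (before cliff)
print(vested(TEAM_ALLOCATION, 6))   # 37500000 (25% unlocks at the cliff)
print(vested(TEAM_ALLOCATION, 24))  # 150000000 (fully vested)
```

Whether the cliff releases the accrued linear amount (as here) or resets to zero is a design choice the actual vesting contract would pin down.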
Join the build.
This model will be built in public, by the public.
⛓️

On-Chain Governance

Every major decision — architecture, dataset, compute provider — voted on by token holders.

🔬

Open Research

All training logs, loss curves, and checkpoints published in real time. Total transparency.

🛠️

Contributor Rewards

Write code, label data, review architecture — earn $WWWIII for every contribution.

🌐

Global & Permissionless

No KYC to participate. No borders. If you can hold a token, you can shape the future of AI.