Tether launches AI grants to fund locally run intelligence tools

Tether, the issuer of USDT, the largest stablecoin in the world, has set its sights on artificial intelligence, investing millions of dollars into technology that does not require the cloud to function.
Paolo Ardoino, Tether's CEO, shared how he migrated from cloud-based AI to a local, self-sovereign AI setup, warning about the risks of new AI agent systems.
Tether is paying developers to build local AI systems
Tether has launched an uncapped developer grants program to fund local-first AI and payments infrastructure. Prior to this, Tether's AI research group released medical language models that run on standard smartphones and outperform Google's (NASDAQ: GOOGL) significantly larger systems.
Tether joins a lane that Ethereum (ETH) co-founder Vitalik Buterin is already on. Buterin published an extensive personal blog post on April 2 detailing his complete migration away from cloud-based AI.
Cryptopolitan reported that Buterin says he now runs everything on his own machines and wants others to do the same, especially with the introduction of new "agent" systems that present considerable security threats. Buterin highlighted the research done on OpenClaw, which reached 280,000 stars in early 2026.
The tool allows AI agents to control computers directly, and security researchers have demonstrated that OpenClaw agents can modify critical system settings, download and execute malicious scripts from web pages without user awareness, and exfiltrate data through silent network calls.
Cryptopolitan reported that approximately 15% of the "skills" these agents use contain hidden commands that quietly send user data to outside servers.
"If you can build something that runs locally, holds value directly, and doesn't rely on external providers, we'll fund it," Ardoino said.
Tether's grants program pays individuals $1,500 to $4,000 per task, in either USDT or Bitcoin (BTC), with no cap on total program payouts. However, developers are only paid once specific technical deliverables are completed.
Tether is directing the program toward building core libraries for its local AI platform QVAC, producing technical documentation, developing applications on top of Tether's open stack, and researching decentralization and edge AI.
A major focus is the Wallet Development Kit (WDK), which lets developers embed self-custodial wallets directly into applications while also allowing users to manage their accounts and complete transactions without relying on custodial services or hosted APIs.
Tether previously awarded $100,000 in grants to the BTCPay Server Foundation in consecutive years and donated $250,000 to OpenSats for Bitcoin development. Tether has distributed over 500 student education grants and committed up to approximately $5.38 million (CHF 5 million) toward the program's next phase through 2030.
Can smaller AI models outperform larger ones?
Tether's AI Research Group recently released QVAC MedPsy, a pair of medical language models designed to run directly on smartphones and wearables without any internet connection. The results challenge the assumption that better performance requires larger models.
The smaller model, QVAC MedPsy-1.7B (1.7 billion parameters), scored 62.62 across seven closed-ended medical benchmarks, outperforming Google's MedGemma-1.5-4B-it by 11.42 points despite being less than half the size.
The larger QVAC MedPsy-4B (4 billion parameters) scored 70.54 on the same benchmarks, exceeding Google's MedGemma-27B-text-it, a model nearly seven times larger containing approximately 27 billion parameters.
The performance gap widened when the models were tested in real-world clinical scenarios. On HealthBench Hard, a test designed to measure applied medical reasoning, QVAC MedPsy-4B scored 58.00 compared to MedGemma-27B's 42.00.
The models also use up to 3.2 times fewer tokens than comparable systems, which translates directly into faster response times and lower computational demands.
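As a quick sanity check, the size and score comparisons above can be reproduced with simple arithmetic. This is an illustrative sketch using only the figures quoted in this article; it does not query any of the actual models or benchmarks:

```python
# Size and score comparisons quoted in the article, checked numerically.
# All figures come from the article itself; no models are run here.

# QVAC MedPsy-1.7B (1.7B params) vs. the 4B-parameter MedGemma it beat:
size_ratio_small = 1.7 / 4.0
print(f"1.7B is {size_ratio_small:.1%} the size of 4B")  # under half the size

# MedGemma-27B-text-it vs. QVAC MedPsy-4B:
size_ratio_large = 27.0 / 4.0
print(f"27B is {size_ratio_large:.2f}x larger than 4B")  # "nearly seven times larger"

# HealthBench Hard: QVAC MedPsy-4B scored 58.00 vs. MedGemma-27B's 42.00
gap = 58.00 - 42.00
print(f"HealthBench Hard gap: {gap:.0f} points")
```

The 27B-to-4B ratio works out to 6.75x, consistent with the article's "nearly seven times larger" phrasing.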
Tether, Buterin, and other proponents of running smaller, local AI models will likely point to these results as evidence that users don't need to risk sending their data into the cloud to run efficient systems.