
The Research Tax


Not long ago, universities and their researchers dominated artificial intelligence research. In the early 2010s, academic-only teams produced around 65% of the most compute-intensive AI models. By the early 2020s, that share had fallen to roughly 10%.

This is perhaps not surprising. Commercial labs have the money, the infrastructure, and the ability to attract the talent. Once the post-2022 consumer AI boom happened, and the revenue and investment came pouring in (if not the profits), it was always destined to be this way.

A lot of this is inevitable, but some of it isn’t. Much of it comes down to what you could call a “research tax”: a series of obstacles that make it harder for academics in the public realm to pursue AI research. This added cost constrains teams and universities on multiple fronts (delay, abandonment, and talent attrition most obviously), effectively guaranteeing a less productive research environment before they’ve even begun. In aggregate, that means less innovation and fewer cutting-edge research breakthroughs.

It’s also hidden (or at least not immediately visible). Projects abandoned because of cost constraints don’t show up on any ledger, but they are a real cost to the universities themselves and, I’d argue, to us as a society. (Although that argument is for another blog.)

The Resource Gap

So let’s start with access to resources, which in the AI world means “compute”: the hardware that allows AI models to generate and process information. An overwhelming majority (85%) of academics surveyed had zero budget for cloud compute, and 66% rated their satisfaction with institutional clusters at 3/5 or below. In the same survey, researchers also reported GPU wait times of up to 2–3 days (longer near deadlines), plus limited multi-node capability.

And it’s not just about waiting. 41% of researchers surveyed had no multi-node capability at all (meaning they can’t link machines together to run bigger jobs). Others cited weak interconnects (basically, the pipes between machines are too slow) that made anything beyond single-node training effectively impossible. You can’t just “work around” that when the infrastructure caps what’s even attemptable.

Then there are hypotheses and models that are never brought to completion. In the same survey, only 17% reported pre-training models, because the economics make building full models either financially unattractive or outright infeasible.

This is undoubtedly contributing to an attrition of talent across the academic sector. By 2020, nearly 70% of new AI PhDs pursued careers in the private sector. (Of course, a lot of this is explained by the fact that private-sector salaries can well outpace those in academia, but there is a human element to this too. Anyone who has pursued a PhD, or knows someone who has, knows that current or future earnings are often a less important consideration. These researchers value freedom of inquiry and the ability to pursue their academic interests more than most. Universities simply can’t compete on compute to let them do that.)

Not All Universities are Equal

In a December 2024 paper by Stanford University titled Expanding Academia’s Role in Public Sector AI, the authors noted how “The same week that Princeton announced it would purchase 300 H100s, Meta announced it would buy 350,000. Microsoft plans to have 1.8 million H100s by the end of this year.” The academy is simply outgunned to an astonishing degree. It’s frankly not even a competition.

And let’s not forget, this is Princeton! One of the richest universities on the planet, with the reputational clout to attract the brightest minds and billions in endowments. (In fact, Princeton is in some sense the wealthiest major university on the planet: its massive $36.4 billion endowment is concentrated among a very small student body, giving it the highest wealth-per-student ratio of any world-class institution.) Depressing, really, because if Princeton can’t compete, who can?

These massive disparities are structural, not incidental, and well documented. Way back in the halcyon days of 2020 (before the consumer AI boom of ChatGPT and the like), Nur Ahmed and Muntasir Wahed wrote about the “de-democratization of AI,” as the ability to conduct large-scale, compute-intensive research was concentrating in a handful of top universities and commercial institutions.

As we’ve just established, that research is increasingly dominated by a few big corporate behemoths, not the private sector at large. Referring to “commercial AI labs” essentially abstracts away the problem: we’re really talking about a handful of companies in the West, plus a few more when you include China (which has its own, mostly self-contained AI infrastructure and ecosystem). But I digress…

Taken together, this “research tax” is the cumulative overhead facing academic AI research: constrained compute access, limited funding, and infrastructure bottlenecks that slow experimentation and lead to projects being abandoned. The result is a shrinking share of AI research happening in the public realm, with more of it shifting toward for-profit companies. That isn’t inherently negative, but it can influence the kind of research that gets done.

Why Theta Cares

We keep harping on about this topic at Theta because we believe it justifies our approach of targeting universities as one of our core customer groups.

We have written previously about how bringing more “idle” compute into the supply, alongside the partnerships we have with the big clouds, can help alleviate this problem, something we called “supply-side compute.” This approach is not a panacea, but it is genuinely changing what’s feasible for universities pursuing AI research.

At Yonsei University, Prof. Dongha Lee told us that in-house GPUs had become “costly to grow and scale”, and that he found EdgeCloud “scalable and flexible… at less than 50% of the cost”. Music to our ears. At Peking University, Prof. Zhen Xiao goes further, calling it “one of the most complex hybrid GPU infrastructure systems” he’s seen, with plans to replace “all of our existing lab and cloud based GPU resources”.

At Theta, we’re proud of how the positive experience has travelled by word of mouth. At Ajou University in Korea, Prof. Young-June Choi said he’d heard “nothing but great things” about performance and cost before his team signed on with us.

Again, our decentralized, distributed infrastructure isn’t going to change the world overnight. It doesn’t get rid of the “tax” either, but it does give academics and researchers a shot at pursuing their scholarly interests. It’s a step in the right direction, and for these individual teams the change is measurable and significant. Three cheers for that, at least.

If you’re a university, research institution, student, or faculty member interested in finding out more, please email partners@thetalabs.org.

This article was written by Josh Adams, Head of Marketing & PR at Theta Labs.

This article is part of Theta’s Thought Leadership Series. To read more articles like this, follow our Medium page.


The Research Tax was originally published in Theta Network on Medium, where people are continuing the conversation by highlighting and responding to this story.
