
Global Warming Is Accelerating and Tech Is Sitting on the Best Data Nobody Uses

climate · data · tech-industry · infrastructure

New research published this week shows global warming is accelerating faster than the IPCC's worst-case models from 2023. The 1.5°C target is gone. We blew past it. The conversation now is about whether we can hold 2°C, and the data is not encouraging.

I'm an engineer, not a climate scientist. But I know something about data infrastructure. And what frustrates me about tech's response to climate change is how little of our actual capability we apply to the problem.

The data problem nobody talks about

Climate science runs on data. Temperature readings, ocean sensors, satellite imagery, atmospheric measurements. The volume is enormous. The infrastructure to process it is not.

Most climate research labs run on hardware and budgets that would make a startup founder cry. I've talked to researchers who wait 3 weeks for model runs that would take hours on modern cloud infrastructure. Their data pipelines look like what we built at startups in 2014. Not because they're bad engineers. Because they're funded like it's 1994.

Meanwhile, the tech industry has built the most sophisticated data processing infrastructure in human history. We process petabytes daily. We run ML models at scales climate researchers can only dream of. We built this entire capability to sell ads and recommend videos.

What would actually help

I've been thinking about this for a while. Here's where tech infrastructure could make a real difference:

Open compute grants for climate modeling. Not the PR-friendly "we donated $5M to a sustainability initiative" kind. Actual compute time. Let climate researchers run their models on the same GPU clusters that train language models. The marginal cost to cloud providers is near zero during off-peak hours.

Better data pipelines. Climate data comes from hundreds of sources in dozens of formats. Building the ETL infrastructure to unify this data is exactly the kind of problem tech companies solve every day. It's boring work. It's also the bottleneck that slows climate research by months.
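To make "unify the formats" concrete, here's a minimal sketch of what that adapter layer looks like. The source names, field names, and schema are hypothetical stand-ins, not any real climate data feed:

```python
import csv
import io
import json

# Target schema every source gets mapped into (hypothetical):
#   station_id, timestamp_utc, temperature_c

def from_csv_feed(raw: str) -> list[dict]:
    """One station network publishes CSV with Fahrenheit readings."""
    rows = []
    for rec in csv.DictReader(io.StringIO(raw)):
        rows.append({
            "station_id": rec["id"],
            "timestamp_utc": rec["time"],
            # Convert Fahrenheit to Celsius for the shared schema.
            "temperature_c": round((float(rec["temp_f"]) - 32) * 5 / 9, 2),
        })
    return rows

def from_json_feed(raw: str) -> list[dict]:
    """Another publishes JSON in Celsius, under different key names."""
    return [
        {
            "station_id": obj["station"],
            "timestamp_utc": obj["observed_at"],
            "temperature_c": obj["celsius"],
        }
        for obj in json.loads(raw)
    ]

# Each new source is just one more adapter into the shared schema.
ADAPTERS = {"csv_feed": from_csv_feed, "json_feed": from_json_feed}

def unify(sources: dict[str, str]) -> list[dict]:
    """Merge raw payloads from all sources into one time-ordered table."""
    records = []
    for name, raw in sources.items():
        records.extend(ADAPTERS[name](raw))
    return sorted(records, key=lambda r: r["timestamp_utc"])
```

Boring, like I said. But multiply it by hundreds of sources, add unit conversions, quality flags, and gap handling, and you have exactly the kind of grind that tech teams ship routinely and research labs can't staff.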

Prediction markets for climate outcomes. This one's controversial, but I think it'd work. Let people bet on specific climate outcomes with real money. Prediction markets have repeatedly outperformed expert panels at forecasting, because they aggregate information from anyone willing to stake money on being right. That information signal would be valuable to researchers and policymakers alike.

Why I'm writing this on a tech blog

Because the people reading this are the ones with the skills to help. If you're an engineer at a cloud provider, push for climate compute programs internally. If you're building data tools, consider what your tech could do for research institutions. If you run infrastructure, think about donating off-peak capacity.

I started donating 5% of my unused compute to a climate modeling project last year. It costs me about $340 a month. The research team told me it cut one of their model runs from 18 days to 4. That's not a rounding error. That's months of research acceleration over a year.

The uncomfortable math

The tech industry's total spend on AI infrastructure in 2025 was roughly $150 billion. Global funding for climate modeling and prediction was about $2.4 billion. That's a ratio of roughly 62:1.

I'm not arguing we should stop building AI. I'm arguing that directing even 1% of our infrastructure capacity toward climate research would represent a massive increase in what's available. One percent. That's all.
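The back-of-envelope version, using the rough figures above (both are approximations, not audited totals):

```python
ai_infra_spend = 150e9    # tech AI infrastructure spend, 2025 (rough figure)
climate_funding = 2.4e9   # global climate modeling/prediction funding (rough figure)

# The headline ratio between the two budgets.
ratio = ai_infra_spend / climate_funding

# What redirecting just 1% of AI infrastructure would mean
# relative to everything climate research currently gets.
one_percent = 0.01 * ai_infra_spend
relative_boost = one_percent / climate_funding

print(f"spend ratio: {ratio:.1f}:1")
print(f"1% of AI infra = ${one_percent / 1e9:.1f}B, "
      f"a {relative_boost:.0%} increase over current climate funding")
```

One percent of the AI budget is $1.5 billion, which would grow the entire global climate modeling budget by more than half. That's the whole argument in two lines of arithmetic.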

The warming data that came out this week should scare everyone. But fear without action is just anxiety. The tech industry has the tools to help. The question is whether we'll bother.