Scale Matters: The Environmental Impact of AI
Should I feel guilty about using ChatGPT to write this email? It's a question I've been wrestling with, and I suspect you have too. The real story about AI and environmental impact is more nuanced than either the doomsayers or the techno-optimists would have us believe.
At Sitara, we've built our practice around two commitments: creating world-class experiential design and environmental stewardship. So when reports started surfacing about AI's massive carbon footprint—data centers consuming as much power as entire countries, tech companies firing up coal plants to meet demand—we had questions.
Were we undermining our environmental values every time we used these tools? Could we reconcile using AI to enhance our creative process with our commitment to planetary health? The headlines suggested we couldn't. But headlines rarely tell the whole story.
Over the past few months, we've been digging into the research, running the numbers, and trying to understand what AI's environmental impact actually means for design studios. What we found challenged our assumptions. Yes, there's a real environmental cost to AI—but not where we expected to find it. And the path forward isn't about individual guilt or abstinence, but about understanding where our choices actually matter.
This post shares what we've learned. It's for studios trying to do exceptional work while staying true to their values. For teams wondering if they need to choose between innovation and responsibility. The answer, as always, is more nuanced than the extremes suggest.
The Research Landscape: Two Stories, Both True
The research on AI's environmental impact tells two seemingly contradictory stories. MIT Technology Review projects that by 2028, AI will consume electricity equivalent to that of 22% of US households. A study published in Nature, meanwhile, finds that AI emits roughly one-thousandth the carbon a human does for equivalent tasks.
Both are right. Understanding why is the key to this whole conversation.
MIT Technology Review’s projection reflects AI's explosive growth. ChatGPT hit 100 million users faster than any consumer app in history. Every major platform is embedding AI features. But here's what the headlines miss: when data centers need power fast, they reach for whatever's available. In Virginia, home to the world's largest concentration of data centers, utilities have kept coal plants running despite long-standing plans to retire them. If these facilities keep running on the current grid mix, we're looking at emissions equivalent to adding 15-20 million cars to the road just to power these data centers.1 If this trend continues, the renewable transition stalls.
The Nature study reveals something most of us never consider: humans are incredibly energy-inefficient thinking machines. When you factor in our heating, cooling, lighting, computers, calories, and a number of other factors, we burn through massive amounts of energy to accomplish cognitive tasks. One ChatGPT query produces 0.135-2.2 g of CO2. A human writing the same page of text? About 1,400 g of CO2—roughly 1,000 times more. Even when you amortize the staggering cost of training these models (GPT-3's training consumed the equivalent of three full days of San Francisco's entire energy usage!), AI still comes out dramatically more efficient per task.
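The per-task comparison is easy to reproduce as back-of-envelope arithmetic. A quick sketch, using the rough study-level estimates cited above (the exact ratio depends heavily on which per-query figure you pick):

```python
# Back-of-envelope comparison of per-task CO2 emissions, in grams.
# All figures are rough estimates cited in the studies discussed above.
AI_QUERY_G = (0.135, 2.2)   # g CO2 per ChatGPT query (low and high estimates)
HUMAN_PAGE_G = 1400         # g CO2 attributed to a human writing one page of text

# How many times more carbon the human emits, per task
low_ratio = HUMAN_PAGE_G / AI_QUERY_G[1]    # vs. a complex (high-cost) query
high_ratio = HUMAN_PAGE_G / AI_QUERY_G[0]   # vs. a simple (low-cost) query
print(f"AI is roughly {low_ratio:.0f}x to {high_ratio:.0f}x more carbon-efficient per task")
```

Even at the unfavorable end of the range, the per-task gap is in the hundreds; at the favorable end, it exceeds 10,000x.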
What makes this conversation particularly challenging is that AI companies aren't exactly forthcoming with their data. OpenAI stopped publishing detailed environmental reports, Google buries AI's impact in overall data center statistics, and the rest of us are left piecing together the story from academic estimates and the occasional leak. It's hard to make thoughtful decisions when the companies building these tools won't tell us what they actually cost.
Let's Talk Real Numbers for Your Studio
Here's what we discovered when we actually did the math: Your AI usage, even if you’re a power user, likely has a smaller carbon footprint than your morning cup of coffee. That might sound counterintuitive, but the numbers tell a clear story.
Let's start with the constraints. Even if you tried to max out your AI usage, most platforms put strict caps on queries. ChatGPT Plus caps at roughly 150 messages daily and Claude Pro limits you to around 80; even at the higher cap, that works out to one query every three minutes or so across an entire 8-hour workday. Most users don't come close to those caps; a more realistic estimate even for power users is 40-50 queries daily.
The carbon math is straightforward. Each AI query produces between 0.135 and 2.2 grams of CO2, depending on the complexity of the request. Even assuming complex requests, 50 queries generate 110 g of CO2—almost exactly what it takes to boil water for a French press. That daily coffee ritual you don't think twice about? It has a larger carbon footprint than a full day of AI assistance.2
Scale this to a 50-person studio, with everyone using AI substantially for text-based tasks—50 queries per person, every day, all year. That's over 900,000 queries producing about 2,000 kg of CO2 annually. For context, one round-trip flight from NYC to London produces between 2,200 and 3,200 kg of CO2 per passenger. An entire studio's annual AI use has less impact than a single international business trip for one person.
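The studio-scale arithmetic is worth checking for yourself. A short sketch, deliberately using the worst-case per-query figure and heavy daily usage:

```python
# Annual AI footprint for a hypothetical 50-person studio,
# assuming heavy use and the high-end per-query CO2 estimate.
PEOPLE = 50
QUERIES_PER_PERSON_PER_DAY = 50   # upper end of the power-user range
G_CO2_PER_QUERY = 2.2             # worst-case estimate, grams

annual_queries = PEOPLE * QUERIES_PER_PERSON_PER_DAY * 365
annual_kg = annual_queries * G_CO2_PER_QUERY / 1000  # grams -> kilograms
print(f"{annual_queries:,} queries -> about {annual_kg:,.0f} kg CO2 per year")
# For comparison: one NYC-London round trip is roughly 2,200-3,200 kg per passenger.
```

Swap in your own headcount and usage numbers; even doubling both keeps the total in the range of a handful of transatlantic flights.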
This isn't an argument for mindless AI use, but it does suggest that personal guilt about using ChatGPT to draft emails or explore concepts is misplaced. The real environmental questions about AI exist at an entirely different scale.
The Scale Trap: Where Environmental Impact Actually Happens
The environmental impact of AI transforms completely when you shift from individual to systemic scale.
Consider Google's AI Overviews—those AI-generated summaries now appearing in search results. If they show up on just 15% of Google's 14 billion daily searches, that's 2.1 billion AI responses every day. The annual impact: 1,686,000 kg of CO2. To put that in perspective, that's equivalent to 530 round-trip flights from NYC to London, generated by a single feature at one company.
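Taking the post's figures at face value, the scale math goes like this (the per-flight number is an assumption near the top of the 2,200-3,200 kg NYC-London range cited earlier):

```python
# Platform-scale arithmetic for AI Overviews, using the figures stated above.
DAILY_SEARCHES = 14e9          # Google searches per day (rough estimate)
AI_OVERVIEW_SHARE = 0.15       # fraction of searches showing an AI summary
ANNUAL_KG = 1_686_000          # stated annual CO2 total for the feature, kg
KG_PER_FLIGHT = 3_180          # assumed NYC-London round trip, kg per passenger

daily_ai_responses = DAILY_SEARCHES * AI_OVERVIEW_SHARE
flights_equivalent = ANNUAL_KG / KG_PER_FLIGHT
print(f"{daily_ai_responses:,.0f} AI responses per day")
print(f"Annual total equivalent to ~{flights_equivalent:.0f} round-trip flights")
```

The point of the exercise is the ratio of scales: a single feature at one company generates more query volume in a day than a 50-person studio would in centuries.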
Google leads the industry in sustainable infrastructure—they've been carbon neutral since 2007 and run their data centers at twice the efficiency of typical facilities.3 Yet even this commitment can't solve a fundamental disconnect: infrastructure decisions and product decisions happen at completely different speeds and scales. Product teams shipping AI features think in sprints—user engagement, query relevance, competitive advantage. Infrastructure teams think in years—power purchase agreements, renewable capacity, cooling systems. These parallel tracks rarely intersect at the feature level.
The timing mismatch compounds the problem. Product teams can ship a new AI feature in weeks. Infrastructure teams need years to bring renewable capacity online. One product launch can add millions of queries overnight; one solar farm takes 3-5 years from planning to production. Even the most sustainability-focused companies are constantly playing catch-up, deploying features faster than they can green the infrastructure to support them.
For experiential designers, this creates a different calculus. A major museum implementing AI throughout its exhibits might generate 10,000 kg of CO2 annually—substantial, but orders of magnitude smaller than systemic deployments. Even large-scale venues like international airports might reach 500,000 kg from AI features in wayfinding or mobile apps.4 That's real impact worth considering, but it's fundamentally different from the infrastructure challenges making headlines.
The power grid strain you're reading about comes from this convergence of forces: platform-scale deployment, competitive pressure, infrastructure lag, and yes, our collective appetite for AI-enhanced everything. It's not about villains; it's about systems where individual good intentions hit the wall of collective impact.
This doesn't mean your project decisions don't matter. It means understanding where you fit in the larger picture—and focusing your energy where you can actually move the needle.
So What Can You Do?
So what does thoughtful AI use actually look like for experience designers? Here's what we've learned, organized by actual impact:
For the Industry: Support Transparency and Better Infrastructure
This is where real change happens. The environmental decisions that matter occur at the infrastructure level—which data centers get built, what powers them, how efficiency is prioritized. Support regulations requiring energy disclosure. Advocate for renewable-powered data centers. Push clients toward cloud providers with genuine clean energy commitments. Join industry initiatives for transparency. These systemic changes affect millions of users, not just your projects.
For Your Projects: Think Carefully About Scale
Every AI feature you embed multiplies its impact. A visitor-facing chatbot scales to thousands of users. An AI-powered analytics system runs 24/7/365. Before adding AI, ask: Does this meaningfully improve the experience? Could a simpler solution work? Passive AI systems (analytics, personalization, recommendations) often have larger footprints than interactive features because they never stop running.
For Your Practice: Optimize If You Want To
If you enjoy optimizing, model selection offers real gains. Smaller specialized models often match larger ones—DeepSeek-Coder 7B outperforms CodeLlama 34B for many tasks at one-fifth the computational power, for example. Running models locally gives you control over renewable power sources and operating hours. But honestly? For most studios, using whatever tools make you most productive is the right choice. The efficiency gains offset the environmental cost.
The Bottom Line
Here's what I've come to believe: the environmental cost of AI isn't really about your personal use. It's about the millions of AI features being embedded everywhere, running constantly, often powered by whatever energy source is cheapest.
Your job isn't to feel guilty about using AI to work better. It's to think carefully before adding AI features that will scale to thousands or millions of users. There's a difference between using AI and deploying it.
Epilogue: What We Haven't Talked About
This post focused specifically on AI's environmental impact—a question we felt we could meaningfully research and quantify. But we're acutely aware that carbon footprints are just one dimension of AI's true cost.
As Karen Hao documents in her new book Empire of AI, there are profound social impacts we haven't addressed here: the human labor behind data labeling, the extraction of resources and exploitation of workers in the Global South, questions of bias and representation, the concentration of power in a handful of tech companies. These issues deserve careful attention by anyone looking to benefit from modern AI tools.
We started with carbon because it felt measurable, comparable, and actionable. But responsible AI use requires grappling with all of these dimensions. This is the beginning of our investigation, not the end.
If you're interested in exploring these broader impacts with us, we'd love to hear from you. The conversation about ethical AI use is one that everyone needs to be having.
1. These projected emissions would be like adding 4-5 times the current number of EVs back onto the road as gas-powered vehicles, essentially canceling out years of EV adoption progress.
2. And we haven’t even considered the carbon footprint of sourcing and roasting your coffee beans.
3. While this infrastructure leadership doesn't eliminate the impact of billions of queries, it does mean each query produces significantly less CO2 than it would at most other tech companies.
4. For comparison, that's significantly less than what the building's HVAC system likely produces in a month.


