opinion
The Decoder

Greg Brockman predicts AI will let small teams match the output of large ones if they can afford the compute

What Happened

In the future, working with AI won't mean adapting to the computer—the computer will adapt to you, says OpenAI President Greg Brockman. "This is disruptive. Institutions will change."

Our Take

Greg Brockman says AI will let small teams match the output of large ones, conditional on compute access. He frames this as institutional disruption that organizations will have to adapt to, not a change they get to opt out of.

The compute caveat does the heavy lifting. Running GPT-4-class agents on multi-step workflows can cost hundreds of dollars per developer per month at any meaningful production scale. Predicting team-size parity without modeling inference cost trajectories is the kind of optimism that gets startups to freeze headcount before their unit economics work.

What To Do

Model inference costs at 10x current usage before freezing headcount, because GPT-4-class agent workflows run hundreds of dollars monthly per developer in production.
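The stress test above can be sketched as a few lines of arithmetic. This is a minimal illustration, not a pricing quote: the task volume, tokens per task, and per-token price below are all assumed placeholder values; substitute your own usage data and your provider's current rates.

```python
# Hedged sketch: stress-test per-developer inference spend at 10x usage.
# All numeric inputs are illustrative assumptions, not real prices.

def monthly_inference_cost(tasks_per_day: float,
                           tokens_per_task: float,
                           price_per_1k_tokens: float,
                           workdays: int = 22) -> float:
    """Estimated monthly inference spend for one developer."""
    return tasks_per_day * tokens_per_task / 1000 * price_per_1k_tokens * workdays

# Assumed baseline: 40 agent tasks/day, ~8k tokens each, $0.03 per 1k tokens.
baseline = monthly_inference_cost(40, 8_000, 0.03)
# The recommended stress case: same workload at 10x task volume.
stressed = monthly_inference_cost(400, 8_000, 0.03)

print(f"baseline:  ${baseline:,.0f}/dev/month")
print(f"10x usage: ${stressed:,.0f}/dev/month")
```

Under these assumptions the baseline lands near $200/dev/month and the 10x case near $2,000/dev/month, which is the comparison you want to see before trading a hire for a compute budget.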

Builder's Brief

Who

founders and CTOs planning headcount with AI productivity assumptions

What changes

unit economics models for compute versus headcount need a cost-curve stress test

When

months

Watch for

inference cost per active user crossing $10/month in production AI products

What Skeptics Say

Brockman's framing erases coordination and trust costs that scale with organizational complexity — AI doesn't handle the 3am incident call or the customer escalation that requires institutional memory and accountability.
