What’s in this article
- Why every major tech company eventually builds its own chips — Apple, Google, Amazon, and now Anthropic.
- What “owning the silicon” means for a company’s product — speed, cost, capability ceiling.
- Why this matters even if you don’t care about hardware — it changes what your AI tool can do.
- What designers should expect from Claude over the next 18 months.
- How I’d actually use this as a planning signal.
I’m Mike Kwal. I’m a designer, not a chip engineer. But a chip move at this scale tells you which AI companies are serious about being around for the next decade. That matters to any designer planning a long retainer built on AI tooling.
What just happened
Reports surfaced that Anthropic is exploring designing its own custom AI chips — silicon built specifically to run Claude. Not buying from Nvidia. Not renting GPUs forever. Building hardware optimized for their own model.
This follows a clear pattern. Apple designed the M-series. Google designed TPUs. Amazon designed Trainium and Graviton. Every company that takes its own software seriously eventually takes the hardware that runs it seriously.
Anthropic is the latest. It’s a quiet announcement that signals a loud strategy.
Why this matters for designers
Three things change when a company owns its hardware.
Cost goes down. Building your own chips saves the margin you’d otherwise pay Nvidia. Long-term that translates into either lower prices for end users (Claude Pro stays $20/month) or better capabilities at the same price (Claude Pro gets the smarter model included).
Speed goes up. Chips designed around one specific model run that model faster than general-purpose chips. The next generation of Claude Code I run on my Mac will likely respond noticeably faster, not because of software optimization on my end, but because the silicon serving the model was built for it.
The capability ceiling rises. This is the one designers most often miss. GPUs were designed for graphics and later adapted to AI; chips built for AI from the start can run bigger models, longer contexts, and faster reasoning. The next two years of Claude improvements will come partly from hardware most users will never see.
For me as a designer, that means: the tool I’m betting my retainer on is going to keep getting better, cheaper, and faster without me doing anything. That’s the win.
My $0.02 — How I’d actually use this
This is a “long-game” signal, not a “ship it Friday” tactic. Here’s how I’d factor it into how I run my studio.
I’d commit harder to one AI stack instead of hedging across three. When the company you depend on is willing to spend billions on its own silicon, that’s a multi-decade bet on themselves. I can match that with a multi-year bet on them. So I don’t keep paying for three competing AI subscriptions “just in case.” I pick Claude as my primary, ChatGPT as my one backup, and skip the rest. The savings — both money and mental load — are real.
I’d plan my service tiers around capability that doesn’t exist yet. When I quote a year-long retainer with “AI-assisted updates” baked in, I’m betting that the AI will be substantially better in 12 months than it is today. Hardware moves like this make that bet safer. I can sell forward-looking value because the tool’s trajectory is forward-looking.
I’d write content about the trajectory, not just the moment. This blog post is an example. When I publish about AI for designers, I include some “where this is going” — not just “what’s live today.” Hardware moves let me do that with confidence. Designers who only write about today’s features will look stale six months from now.
The simpler frame: when the tool I run my business on is being treated as a multi-decade infrastructure project by the company that makes it, my business plan should match. Long retainer, single primary tool, content that looks ahead.
Want the full playbook?
For the full case for committing to one AI stack — and how to pick the right one — see my Talk-to-Build Stack.
FAQ
When will Anthropic’s chips ship?
Probably 2027 at the earliest. Custom silicon takes years. The announcement matters more than the date.
Do I need to do anything different now?
No. Keep using Claude the way you already do. The benefit shows up gradually as new models ship.
Will Anthropic still use Nvidia GPUs?
Yes, for years. In the short term, custom chips supplement generic GPUs rather than replace them. Long-term, the share shifts.
What if Apple or Google makes a competing AI play?
They already are. Gemini and Apple Intelligence both run on custom silicon. The question for designers is which AI fits your workflow best, not which company has the best chips.
Is this a sign Anthropic might IPO?
Hardware investments like this often precede an IPO or a strategic stake (like Amazon’s). Either way, it’s a signal of company maturity, not a wind-down.
Want help applying this?
Four ways to go deeper:
- Build with Builders. Join the Talk-to-Build community to learn how to earn money with AI, download our AI skills, advance your business, and build real assets for website design and Shopify stores (Gen-AI images, cinematic AI videos, conversational AI office secretaries) that you can sell to SMBs who want the outcomes but don’t have time to learn the skills.
- Done-for-you. MK-Way builds AEO-ready websites and apps for design agencies and founders who want it shipped fast.
- Quick question. DM me on Instagram. I read every message.
- B2B / strategy. Connect on LinkedIn for deeper conversations about AI in design and agency work.
Last updated: May 7, 2026.