tech

April 19, 2026

Google in talks with Marvell Technology to build new AI inference chips alongside Broadcom TPU programme

Summary: Google is in talks with Marvell Technology to develop two new AI chips – a memory processing unit and an inference-optimised TPU – adding a third design partner alongside Broadcom and MediaTek in its custom silicon supply chain. The discussions, which have not yet produced a signed contract, came days after Broadcom locked in a TPU agreement running through 2031, and reflect Google’s shift toward inference as the dominant compute cost; the custom ASIC market is projected to grow 45% in 2026 and reach $118 billion by 2033.

TL;DR

  • Google is in talks with Marvell Technology to design two new AI chips: a memory processing unit and an inference-optimized TPU.
  • Marvell would act as a design-services partner, similar to MediaTek's role, complementing Broadcom's involvement.
  • This strategy diversifies Google's custom silicon supply chain rather than replacing existing partners like Broadcom.
  • The discussions highlight the growing importance of AI inference, which drives continuous compute demand and cost, in contrast to one-off large-scale training runs.
  • Google's seventh-generation TPU, Ironwood, is designed for inference and will be produced in large quantities.
  • Marvell has a strong custom silicon business with significant revenue and design wins with major cloud providers.
  • Nvidia has invested in Marvell, positioning the company at the intersection of GPU and ASIC ecosystems.
  • Broadcom maintains a dominant position in custom AI accelerators, with substantial revenue and market share projections.
  • The broader ASIC market is growing faster than the GPU market, with custom chip sales projected to increase significantly.
  • Google's multi-partner chip strategy aims to mitigate pricing, supply, and strategic risks associated with relying on a single supplier.
