March 19, 2026

Multiverse Computing pushes its compressed AI models into the mainstream

After compressing models from major AI labs including OpenAI, Meta, DeepSeek and Mistral AI, Multiverse Computing has launched both an app that showcases the capabilities of its compressed models and an API that makes them more widely available.
TL;DR

  • Private company defaults are at a high of 9.2%, prompting VC firms like Lux Capital to advise companies to get compute capacity commitments in writing.
  • Multiverse Computing is developing smaller AI models that can run locally on devices, reducing reliance on external compute infrastructure and mitigating counterparty risk.
  • The company has compressed models from major AI labs and launched the CompactifAI app and an API portal for wider access.
  • The CompactifAI app runs models locally and offline, provided the device has sufficient RAM and storage; when it does not, the app falls back to cloud-based models.
  • The API portal provides developers and enterprises direct access to compressed models, offering transparency, control, real-time usage monitoring, and potentially lower compute costs.
  • Multiverse's latest compressed model, HyperNova 60B 2602, built on an OpenAI model, reportedly delivers faster responses at lower costs.
  • Smaller models offer advantages for business use cases, such as embedding AI in drones or satellites where connectivity is not guaranteed.
  • Multiverse Computing serves over 100 global customers and is rumored to be raising a significant new funding round.