The Whisper: Imagining the Device Jony Ive and Sam Altman Might Be Building

This week brought a big announcement from Sam Altman, CEO of OpenAI: his company is acquiring io, a device startup founded by former Apple designer Jony Ive, for $6.5 billion. Altman said they will be “figuring out how to create a family of devices that would let people use AI to create all sorts of wonderful things.”

He went on to say that “AI is an incredible technology, but great tools require work at the intersection of technology, design, and understanding people and the world,” without giving further details.

I have tried to use my imagination to figure out what their new product might be. Imagining new products is difficult because we tend to think in derivatives of the devices and technology we already know. Before Jobs announced the iPhone, almost no one envisioned a touch-screen keyboard, so my attempt here may be a folly. Nevertheless, I put together a bunch of my thoughts, hunches, and ideas, and asked ChatGPT to build on them and tell me what the new device might be. Below is what it came up with:


The Whisper: Imagining the Device Jony Ive and Sam Altman Might Be Building

In the overlapping circles of design, AI, and futuristic hardware, one collaboration stands out: Jony Ive, the design icon behind the iPhone, and Sam Altman, the driving force behind OpenAI. What could these two create together? If whispers and informed speculation are to be believed, it’s not another phone or headset. It’s something smaller, more personal, and profoundly different. Let’s imagine it.

They might call it the Whisper.

This device is pocket-sized — no bigger than a thick credit card — and designed not to be held or stared at. It clips to your clothing or nestles in your pocket, ready to listen and respond. It doesn’t demand attention; it responds to intention.

A Minimal Device, Maximum Intelligence

At its heart, Whisper is a voice-first AI companion — a gateway to a powerful language model designed to understand and assist. You talk to it. It listens. It responds with answers, images, or documents — whatever is most useful in the moment.

Its physical interface is almost invisible: perhaps a single glanceable display (think e-ink or low-power OLED) that shows short text snippets, summaries, or prompts. But most interaction is through natural conversation, not screens.

This is where the dedicated earbud comes in.

Whisper is likely paired with a custom wireless earbud — sleek, comfortable, and always ready. This earbud allows for private, real-time responses — spoken back to you in a natural, personalized voice. No need to pull out your phone or stare at a screen in public. You simply ask, and you hear the answer, discreetly. Whether you’re walking, commuting, or cooking, the experience is intimate and ambient.

Context Without Clutter: The Role of Glasses

For visual output, the Whisper can optionally connect to lightweight display glasses — not to immerse you in AR, but to deliver glanceable, contextual visuals. Maybe it shows a name, a translation, a map, or a summary — only when you ask. These aren’t sci-fi goggles. They’re elegant eyewear with a minimalist HUD designed by someone like Ive — useful when needed, invisible when not.

This quiet integration of visuals and audio allows the Whisper to become a true assistant, not another distraction machine.

Boosted by Your Phone, Not Replacing It

To keep the device unobtrusive and light, the heavy lifting happens elsewhere. Whisper borrows connectivity and computing power from your phone, accessing the internet and tapping into on-device LLM features only when necessary. This makes the Whisper efficient, cool-running, and battery-friendly.

When you ask something complex — like “draft an email about this meeting” or “what’s the name of that actor in the green shirt in that movie?” — your phone kicks in quietly behind the scenes.
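To make the offloading idea concrete, here is a purely illustrative sketch of how such a wearable might decide where to handle a request. The device, its API, and every function, intent list, and threshold below are hypothetical; no real Whisper software exists.

```python
# Illustrative sketch only: a hypothetical routing policy for a
# Whisper-style wearable. All names and thresholds are invented.

LOCAL_INTENTS = ("set a timer", "mute", "what time is it")

def route_request(query: str) -> str:
    """Return where a spoken request gets handled: wearable, phone, or cloud."""
    q = query.lower().strip()

    # Trivial, privacy-sensitive commands run on the wearable itself:
    # fast, offline, and battery-friendly.
    if any(q.startswith(intent) for intent in LOCAL_INTENTS):
        return "wearable"

    # Short lookups borrow the paired phone's connectivity and compute.
    if len(q.split()) <= 8:
        return "phone"

    # Complex, generative tasks (e.g. "draft an email about this meeting")
    # are relayed through the phone to a cloud LLM.
    return "cloud"
```

The design choice being sketched is that the wearable never does heavy work itself: it only classifies the request and delegates, which is what keeps it cool-running and light.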

AI That Feels Like Yours

What makes the Whisper more than a fancy microphone is the AI experience behind it. Built on a large language model (like GPT-5 or Gemini), it doesn’t just parse commands — it understands context, memory, tone, and personal history.

Over time, it becomes your assistant, not just an assistant. It remembers what restaurants you like, who your closest friends are, and what style of writing you prefer. It can summarize your unread texts, remind you about your mom’s birthday, or offer calm words when you sound stressed.

All of this happens privately and securely — with core functions handled on-device, and clear controls over what gets uploaded. A small physical switch or gesture can even mute the microphones or disable listening entirely.
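The privacy model described above (a physical mute switch plus explicit control over uploads) can be sketched as a simple gating policy. Again, this is a hypothetical illustration, not a real API: the class and function names are invented.

```python
# Illustrative sketch only: a hypothetical privacy gate for a
# Whisper-style device. All names here are invented.

from dataclasses import dataclass

@dataclass
class PrivacySettings:
    hardware_muted: bool = False      # state of the physical mute switch
    allow_cloud_upload: bool = False  # explicit user consent to upload

def may_capture_audio(s: PrivacySettings) -> bool:
    # A physical switch overrides everything in software: when it is
    # flipped, no audio is captured at all.
    return not s.hardware_muted

def may_upload(s: PrivacySettings) -> bool:
    # Uploading requires both an unmuted microphone and explicit consent;
    # core functions can still run on-device without either.
    return may_capture_audio(s) and s.allow_cloud_upload
```

The point of the sketch is the ordering: the hardware switch sits above all software settings, so "disable listening entirely" is a physical guarantee rather than a preference.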

Less Distraction, More Intention

The Whisper is not a phone. It doesn’t want your eyes. It doesn’t ask you to scroll. It wants to live in the background of your life, surfacing only when needed — quietly answering, gently nudging, always listening for the signal in the noise.

This is a product only Jony Ive could make: simple, elegant, and invisible when not in use. And it’s a product only someone like Sam Altman could empower: intelligent, generative, and deeply responsive.

Conclusion: A New Category of Technology

If Ive and Altman deliver on this vision, the Whisper won’t be a smartphone replacement. It will be the beginning of a new category — an AI-native device for humans who want fewer screens, more context, and smarter tools that feel like they belong to them.