Key Takeaways
OpenAI and T-Mobile have reached a multi-year agreement to develop “AI-enabled services and tools,” drawing on T-Mobile’s data assets and OpenAI’s AI expertise.
The partnership has already borne fruit, with T-Mobile actively testing a new customer experience (CX) solution, IntentCX. But could it also be laying the groundwork for an OpenAI smartphone?
Customer experience agents have been one of the most widely deployed business applications for chatbots, but despite the technology’s potential, they can still be frustrating to use.
According to OpenAI and T-Mobile, the problem is that existing solutions aren’t sufficiently customized for the task. To solve this, IntentCX has “access to billions of data points from actual customer interactions” to “deeply understand” each customer’s data footprint.
In other words, the new chatbot integrates and is perhaps fine-tuned using data from existing T-Mobile customer interactions.
As T-Mobile CEO Mike Sievert put it, “Our customers leave millions of clues about how they want to be treated through their real experiences and interactions, and now we’ll use that deep data to supercharge our [CX] team as they work to perfect customer journeys.”
Ever since reports surfaced that Jony Ive and Sam Altman were teaming up to develop the “iPhone for AI,” there has been speculation that OpenAI could enter the AI device market.
While such a move would be an odd pivot for the company, OpenAI’s latest partnership with T-Mobile builds on an existing relationship with Apple, suggesting it wants to carve out some sort of role for itself in the sector.
Apple’s latest iOS upgrade grants iPhone users access to OpenAI models. A similar deal with T-Mobile could see ChatGPT integrated with the carrier’s in-house phone brand, Revvl.
Launched at an Apple event on Monday, Sept. 9, the iPhone 16 range appears to follow the design principles that have characterized all recent generations of the device.
However, this year, the iPhone’s biggest upgrades can be found inside the device.
Reflecting the trend for more conversational voice assistants powered by language models, Siri has received an AI makeover and can now tap both Apple’s proprietary language models and OpenAI’s GPT-4 to help answer questions.
On the computer vision front, Visual Intelligence is Apple’s answer to Google Lens, letting users search for information by taking a photo.
The upcoming iOS 18 upgrade will bring new capabilities to older iPhones, too. Meanwhile, thanks to a new, more powerful Arm chip, the iPhone 16 can handle more advanced AI tasks on-device; certain functions, including Visual Intelligence, will run only on the latest iPhone hardware.
While Apple Intelligence represents Apple’s most concerted effort to integrate AI into its flagship smartphones, the firm doesn’t appear to be in a rush to ship new features.
The new Siri is certainly more capable, but Apple has made sure not to drastically alter the user experience. Although the firm teased a host of new computer vision capabilities on Monday, Visual Intelligence will be rolled out incrementally in the coming months.
With Google and Samsung seemingly trying to pack as much AI as possible into their respective devices, Apple’s approach is more restrained. That could be a good idea, considering the backlash Google faced over its latest smart assistant upgrade. Rather than rush to release new features, Apple appears to be focused on nailing them one at a time.
While Apple’s partnership with OpenAI gives iPhone users cloud access to GPT and DALL-E models, Apple Intelligence generally favors proprietary models and on-device AI when possible.
The strategy is classic Apple. After all, getting iPhone users hooked on ChatGPT risks creating a long-term dependency on OpenAI. Instead, the iPhone 16 is configured to use Apple’s in-house AI by default and only delegate to other models when requested by users.
With OpenAI models effectively relegated to the sidelines of iOS, their role in the wider smartphone ecosystem is cast into doubt.
While there are already ChatGPT apps for iPhone and Android, OpenAI’s partnership with Apple integrates the platform at a much deeper level. However, studies have repeatedly shown that smartphone users rarely change their default settings, which means Apple’s models are unlikely to be eclipsed by OpenAI’s.
What’s more, whereas GPT-4 once had a clear edge over rival language models, these days, competition is fierce. With leading models from OpenAI, Google, Anthropic, Meta, and others separated by razor-thin performance margins, individual preference and ease of use will likely play a much more important role going forward.
To be clear, the partnership with Apple is still beneficial to OpenAI. But it hardly guarantees ChatGPT’s success in smartphone AI.
In the long term, OpenAI CEO Sam Altman seems to think smartphones won’t be the devices that deliver AI services in the future. His project with Jony Ive aims to create a “more natural and intuitive user experience” for interacting with artificial intelligence. However, other startups that have attempted to answer the question of what comes after smartphones have generally failed.
Devices like Humane’s AI Pin and the Rabbit R1 face a fundamental challenge that is currently difficult to work around: people aren’t ready to give up their smartphones. And if the latest iPhone is specially engineered to run AI applications, why does anyone need a separate device?