In recent weeks, Apple has opened discussions with major news outlets, signaling a significant move into generative artificial intelligence. According to a Friday report from The New York Times, the California tech giant is seeking permission to use news content in its developing AI systems.
Multi-Million Dollar Agreements on the Table
Apple isn’t holding back: it is floating multiyear agreements worth at least $50 million each to license publishers’ extensive news-article archives, according to people familiar with the negotiations cited in the New York Times report.
Reaching Out to Publishing Titans
The outreach extends to publishing giants like Condé Nast, the force behind Vogue and the New Yorker, and media stalwarts such as NBC News and IAC. The latter owns People, the Daily Beast, and Better Homes and Gardens, all of which have found a place in Apple’s ambitious plans, according to the New York Times’ sources.
Mixed Reactions from Publishers
While Apple is pushing ahead, some of the publishers it has approached have responded with cautious interest. The negotiations remain ongoing as the tech giant looks to enrich its AI efforts with the depth of journalistic content.
Apple’s Internal ChatGPT-Like Service
Not content with external collaborations alone, Apple has reportedly been working on an in-house service reminiscent of ChatGPT. This internal tool is designed to aid employees in testing new features, condensing text, and responding to queries using accumulated knowledge. It’s a peek into Apple’s commitment to fostering innovation within its own walls.
Ajax Framework and the Apple GPT Buzz
Back in July, Bloomberg’s Mark Gurman reported on Apple’s push into artificial intelligence, particularly its development of the Ajax framework. The framework supports a range of capabilities, and an internal application resembling ChatGPT, unofficially dubbed “Apple GPT,” has caused quite a stir. More recently, an Apple research paper suggests that Large Language Models (LLMs) may soon run directly on Apple devices, including iPhones and iPads.
Unpacking the Research Paper
VentureBeat surfaced a research paper that may underpin Apple’s on-device AI ambitions. Titled “LLM in a flash: Efficient Large Language Model Inference with Limited Memory,” the paper tackles the challenge of running Large Language Models on devices with restricted DRAM capacity by storing model parameters in flash storage and loading them on demand.
Apple’s Approach Unveiled
The primary author of the paper, Keivan Alizadeh, a Machine Learning Engineer at Apple, sheds light on their strategy. “Our approach involves developing an inference cost model tailored to the characteristics of flash memory. This guides us to optimize in two critical areas: minimizing data transfer from flash and reading data in more substantial, cohesive segments.”
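To make the second optimization concrete, here is a minimal, hypothetical Python sketch (not Apple’s actual code) of the underlying idea: when only a sparse subset of weight rows is needed, adjacent row indices can be coalesced so the data is fetched from storage in a few larger, contiguous reads rather than many small scattered ones. The file layout, function names, and row indices below are all illustrative assumptions.

```python
import os
import tempfile

import numpy as np


def coalesce(rows):
    """Group sorted row indices into contiguous (start, stop) runs."""
    runs, start = [], rows[0]
    for prev, cur in zip(rows, rows[1:]):
        if cur != prev + 1:          # gap found: close the current run
            runs.append((start, prev + 1))
            start = cur
    runs.append((start, rows[-1] + 1))
    return runs


def load_rows(mmapped, rows):
    """Fetch only the requested rows, issuing one read per contiguous run."""
    runs = coalesce(sorted(rows))
    return np.concatenate([mmapped[a:b] for a, b in runs])


# Demo: a 1000x64 "weight matrix" stored on disk (standing in for flash),
# accessed through a memory map so only the touched rows reach DRAM.
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "weights.npy")
    w = np.arange(1000 * 64, dtype=np.float32).reshape(1000, 64)
    np.save(path, w)

    m = np.load(path, mmap_mode="r")
    needed = [3, 4, 5, 42, 43, 900]   # rows a sparse layer might touch
    sub = load_rows(m, needed)

    assert sub.shape == (6, 64)
    assert np.array_equal(sub, w[needed])
    print(coalesce(needed))           # → [(3, 6), (42, 44), (900, 901)]
```

In this toy example, six requested rows collapse into three contiguous reads; on real flash hardware, fewer and larger sequential reads generally deliver much higher throughput than many small random ones, which is the intuition the paper builds on.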
In essence, Apple’s foray into AI is more than a headline: licensing negotiations, internal tools, and new research are all converging to shape the future of artificial intelligence on Apple devices. Keep an eye on this space for updates as the tech giant presses deeper into the AI race.