Apple is investing heavily in artificial intelligence and on-device technologies, according to recently published research papers highlighting the company’s work. The studies show Apple is developing on-device AI technology, including a groundbreaking way to create animated avatars and a new method to run large language models on iPhones and iPads.
Aptly named “LLM in a flash,” Apple's research enables complex AI applications to run efficiently on devices with limited memory, such as iPhones and iPads. That could pave the way for an on-device, generative AI-powered Siri that assists with a range of tasks, generates text, and processes natural language without relying on the cloud.
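The core idea behind running a large model in limited memory is to keep the weights in flash storage and page in only the parameters needed at each step, rather than holding everything in DRAM. The snippet below is a minimal, simplified sketch of that idea using a memory-mapped file; the file name and sizes are illustrative, not anything from Apple's paper.

```python
import numpy as np

def save_weights(path, rows=1024, cols=64, seed=0):
    # Write a toy weight matrix to disk, standing in for flash storage.
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((rows, cols)).astype(np.float32)
    w.tofile(path)
    return w.shape

def load_rows_on_demand(path, shape, needed_rows):
    # np.memmap maps the file without reading it all into memory;
    # indexing with `needed_rows` pulls in only those rows.
    weights = np.memmap(path, dtype=np.float32, mode="r", shape=shape)
    return np.asarray(weights[needed_rows])  # copy just the active rows

shape = save_weights("weights.bin")
active = load_rows_on_demand("weights.bin", shape, [3, 7, 42])
print(active.shape)  # (3, 64)
```

Only the three requested rows ever occupy working memory; the rest of the matrix stays on disk until asked for, which is the memory-for-bandwidth trade that makes on-device inference with limited DRAM plausible.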
HUGS, short for Human Gaussian Splats, is a neural rendering framework that generates fully animatable avatars from just a few seconds of monocular iPhone video, with training completed in about 30 minutes.
These advancements could significantly improve Siri, enable more accessible AI tools, boost mobile technology, and enhance performance across applications on everyday devices.
Arguably the biggest breakthrough, HUGS can create animatable digital avatars from just 50 to 100 frames of monocular video. Because the framework uses a disentangled representation of humans and scenes, these avatars can be animated independently and placed into different environments.
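The "disentangled representation" means the human and the scene are modeled as separate sets of primitives, so the avatar can be moved or re-posed without touching the background. The toy sketch below reduces HUGS' 3D Gaussians to bare center points to illustrate that separation; the data and function names are purely illustrative assumptions.

```python
import numpy as np

# Toy stand-in for a disentangled human/scene representation: each
# "Gaussian" is reduced to a 3D center point. The human set can be
# transformed (animated) independently of the static scene set.
scene_gaussians = np.array([[0.0, 0.0, 0.0],
                            [5.0, 0.0, 0.0]])  # static background
human_gaussians = np.array([[1.0, 0.0, 0.0],
                            [1.0, 1.8, 0.0]])  # avatar "body"

def animate_human(points, translation):
    # Move only the human's points; the scene is untouched.
    return points + np.asarray(translation)

def compose(scene, human):
    # A real renderer would splat both sets together; here we just stack.
    return np.vstack([scene, human])

frame = compose(scene_gaussians, animate_human(human_gaussians, [0.5, 0.0, 0.0]))
print(frame.shape)  # (4, 3)
```

Because the two sets never mix, the same avatar can be composed into any scene, which is what lets HUGS drop a captured person into new environments.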
According to Apple, HUGS outperforms competitors in animating human avatars with rendering speeds 100 times faster and training times of just 30 minutes.
Leveraging the iPhone camera and processing power to create avatars could deliver a new level of personalization and realism for iPhone users on social media, in gaming, for education, and in augmented reality applications.
HUGS could drastically improve the Digital Persona Apple showcased at its last Worldwide Developers Conference for the Vision Pro headset. Vision Pro users could use HUGS to generate highly realistic avatars that move fluidly at 60 frames per second.
The speed of HUGS could also enable real-time rendering, which is critical for smooth AR experiences and could enhance social, gaming and professional applications with realistic, user-controlled avatars.
While Apple avoids using buzzwords like "AI" and prefers "machine learning," these research papers suggest the company is deeply involved in developing new AI technologies. However, Apple has not officially announced products using technologies like generative AI or the rumored Apple GPT.