Apple is stepping up its efforts in artificial intelligence (AI) as it works to catch up with its Silicon Valley rivals. In a research paper titled “LLM in a Flash,” the tech giant presents a solution to the memory bottleneck that currently prevents efficient inference of large language models on devices with limited DRAM.
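The core problem the paper targets is that a large model’s weights can exceed the memory available on a phone; the proposed remedy is to keep the weights in flash storage and transfer only the parameters each inference step actually needs into DRAM. The Python sketch below illustrates that on-demand loading pattern in toy form. The file name, dimensions, and the assumption that a predictor flags roughly 5% of neurons as active are all illustrative, and the paper’s actual techniques (such as windowing and row–column bundling) are considerably more involved.

```python
import numpy as np

# Toy dimensions; real model layers are far larger.
HIDDEN = 1024

# Stand-in for a weight matrix kept on flash: write it to disk once.
# (On a real device this would be part of the model checkpoint in storage.)
path = "ffn_layer0.bin"
np.random.randn(HIDDEN, HIDDEN).astype(np.float16).tofile(path)

def load_active_rows(path: str, active: np.ndarray) -> np.ndarray:
    """Memory-map the on-flash matrix and copy only the active rows into DRAM."""
    weights = np.memmap(path, dtype=np.float16, mode="r",
                        shape=(HIDDEN, HIDDEN))
    return np.asarray(weights[active])  # fancy indexing copies just these rows

# Hypothetical sparsity predictor: ~5% of neurons fire for this token.
active = np.sort(np.random.choice(HIDDEN, size=HIDDEN // 20, replace=False))

w_active = load_active_rows(path, active)   # small DRAM footprint
x = np.random.randn(HIDDEN).astype(np.float16)
y_partial = w_active @ x                    # compute only the active outputs
print(w_active.shape, y_partial.shape)      # (51, 1024) (51,)
```

The design point is simply that flash is plentiful but slow, while DRAM is fast but scarce, so reading a small, targeted slice of weights per step trades a little I/O for a much smaller memory footprint.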
The focus of Apple’s research is to bring AI capabilities directly to iPhones, rather than relying on cloud services from rivals such as Microsoft and Google. By doing so, the company aims to breathe new life into the smartphone market, which has seen shipments decline this year.
Qualcomm CEO Cristiano Amon predicts that integrating AI into smartphones will not only create a new and improved user experience but also reverse the trend of declining mobile sales. Running large AI models on personal devices still poses technical challenges, but solving them could enable faster, offline AI assistants that also preserve user privacy.
Apple’s research provides a rare glimpse into the company’s secretive research labs and technological breakthroughs. The researchers argue that optimizing language models for battery-powered hardware is crucial to unlocking their potential across a wide range of devices and applications.
Apple has not commented officially on the research findings, but tech enthusiasts and industry analysts eagerly await further developments from the company.