
News

This is how Apple would incorporate AI into its iPhones by the end of 2024

The next iPhone will have artificial intelligence inside. But in what way and for what purpose?

Chema Carvajal Sarabia


We know that Apple is working on its own AI, a field it cannot afford to ignore if it wants to stay at the forefront of the technology sector. The reality is that, right now, both Google and Microsoft are several steps ahead.


Apple has remained silent about its plans for generative AI, but with the release of new AI models today, it seems that the company’s immediate ambitions are firmly focused on making AI work locally on Apple devices.

On Wednesday, researchers at Apple published OpenELM, a family of four very small language models, released on the Hugging Face model hub.

How AI would work on the upcoming iPhones

Apple claims on its website that OpenELM, which stands for “Open-source Efficient Language Models”, works effectively in text-related tasks, such as email composition.

The models are open source and available to developers, and they come in four sizes: 270 million, 450 million, 1.1 billion, and 3 billion parameters. The parameter count refers to how many learned variables a model uses when making decisions based on its training data.
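To make the idea of a parameter count concrete, here is a minimal, purely illustrative Python sketch (not Apple's code) that counts the parameters of a toy fully connected network. Every weight and bias is one parameter; real language models simply have vastly more of them.

```python
# Illustrative only: each learned weight or bias in a network is one "parameter".

def dense_layer_params(n_in: int, n_out: int) -> int:
    """A fully connected layer has n_in * n_out weights plus one bias per output."""
    return n_in * n_out + n_out

# A toy two-layer network: 512 inputs -> 1024 hidden units -> 256 outputs.
total = dense_layer_params(512, 1024) + dense_layer_params(1024, 256)
print(total)  # 787712 parameters; OpenELM's smallest model has roughly 270 million
```

The same bookkeeping, applied across hundreds of much wider layers, is what yields the 270-million to 3-billion parameter totals Apple cites.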

For example, Microsoft’s recently released Phi-3 model reaches 3.8 billion parameters, while Google’s Gemma offers a version with 2 billion parameters. Smaller models are cheaper to run and are optimized to work on devices such as phones and laptops.

Apple CEO Tim Cook already announced in February that generative AI would come to the company’s devices and stated that Apple is investing “a huge amount of time and effort” in this field. However, Apple has not yet provided details on how it will use AI.

The company has already launched other AI models, although it has not yet released a foundation model for commercial use the way its competitors have.

In December, Apple released MLX, a machine learning framework designed to make AI models run more efficiently on Apple Silicon. It also released an image editing model called MGIE, which lets people edit photos using natural-language instructions.


Another model, Ferret-UI, could be used for navigating smartphone interfaces. Apple is also rumored to be working on a code completion tool similar to GitHub Copilot.

However, even with all the models it has released, Apple is reported to have reached out to Google and OpenAI about bringing their models to Apple products.

Chema Carvajal Sarabia

Journalist specialized in technology, entertainment and video games. Writing about what I'm passionate about (gadgets, games and movies) allows me to stay sane and wake up with a smile on my face when the alarm clock goes off. PS: this is not true 100% of the time.
