App Store introduces AI-generated tags for better app discovery

Developers will be able to control which of these AI-assigned tags are associated with their apps
An undated image of App Store logo. — Apple

Apple has officially announced its plan to enhance App Store discoverability using AI tagging techniques, which are now available in the developer beta build of iOS 26.

However, the tags are not yet visible on the public App Store, nor do they currently inform the public store's search algorithm.

With the upcoming App Store update, there is speculation about how the changes will affect apps' search rankings.

A new analysis by Appfigures suggests that metadata extracted from an app's screenshots is already influencing its ranking.

The firm theorised that Apple was extracting text from screenshot captions. Previously, only an app's name, subtitle, and keyword list counted towards its search ranking, it said.

The claim that screenshots inform app discoverability is accurate, based on what Apple announced at its Worldwide Developers Conference (WWDC 25). However, Appfigures indicated that Apple extracts that data using AI rather than OCR techniques.

At its annual developer conference, Apple explained that screenshots and other metadata would help enhance an app’s discoverability.

The company stated that it is using AI techniques to extract information that would otherwise be buried in an app's description, category information, screenshots, or other metadata.

Notably, this allows Apple to assign tags that better categorise each app. Developers will be able to control which of these AI-assigned tags are associated with their apps, the company said.

Apple also assured developers that humans would review the tags before they go live.

Going forward, it will be important for developers to understand these tags and learn which ones help their apps get discovered.