Apple will relocate its Visual Intelligence functionality into the iPhone Camera app as a distinct Siri mode in the forthcoming iOS 27 update, according to a report. The new mode will appear as a selectable toggle next to Photo, Video, Portrait and other camera options, bringing the company's AI-driven visual tools directly into the primary imaging interface.
Currently, Visual Intelligence is accessed via the Camera Control button. Under the planned change, that capability will be available directly inside the Camera app, and the Camera Control shortcut (introduced on the right side of the iPhone 16 in 2024) will instead launch the Siri mode within the Camera app rather than opening a separate Visual Intelligence interface.
The integrated Siri mode will let users point the camera at an object or scene and invoke external services such as ChatGPT to pose questions about what the camera sees. The feature will also provide the option to perform a Google reverse image search for additional context and information.
Apple's existing Visual Intelligence tools already perform several practical tasks: extracting details from real-world items, converting information from a concert poster into a calendar event, identifying plants and animals, and fetching business data such as phone numbers or menu items. The move into the Camera app consolidates these capabilities in one place and pairs them with a refreshed capture control.
Part of the redesign will replace the current white capture button used in Visual Intelligence with a new shutter control styled after the Apple Intelligence logo. The change is intended to align the visual language of the camera interface with the company's broader AI branding.
Beyond identification and search, Apple is expanding the functionality of Visual Intelligence. Planned additions include the ability to scan nutrition labels on food packaging to log dietary information, and to extract contact details directly from items captured by the camera so users can add them to their address book.
Apple intends to preview iOS 27 and related operating-system upgrades at the Worldwide Developers Conference in June, with a consumer release slated for the fall. The announced timeline places the features in a developer preview cycle ahead of general availability later in the year.
Key points
- Visual Intelligence will become a built-in Camera app mode labeled Siri, added alongside Photo, Video and Portrait.
- The Camera Control button shortcut on the iPhone 16 will launch the new Siri mode in the Camera app rather than a standalone Visual Intelligence interface.
- Functionality expands to include third-party query access such as ChatGPT and Google reverse image search, plus nutrition-label scanning and contact capture.
Risks and uncertainties
- Timing - The features are slated for a WWDC announcement in June and a consumer release this fall, so the exact rollout schedule remains subject to change.
- Integration dependencies - The Siri mode's use of external services like ChatGPT and Google reverse image search means the experience depends on third-party integrations outside Apple's direct control.
- User experience transition - Moving Visual Intelligence into the Camera app and changing the Camera Control shortcut behavior could affect how existing users discover and adopt the features.