Apple has unveiled a new feature called Visual Intelligence, designed to help users learn about and act on their surroundings through the camera. The feature is part of the iPhone 16, and it appears to be Apple’s answer to Google Lens.
Visual Intelligence is activated by Camera Control, a new touch-sensitive button on the right side of the device. When the button is pressed, the feature can identify objects, surface information, and suggest actions based on what the camera is pointed at. For example, aiming the camera at a restaurant brings up its menu, hours, and ratings, while pointing it at an event flyer lets users add the event directly to their calendar.
The feature can also identify things like dog breeds and look up products, including where to buy them online. Later this year, Camera Control will also integrate with third-party tools that offer domain-specific expertise: users will be able to run a Google search for a product, for instance, or tap into ChatGPT for help solving a problem.
According to Apple, Visual Intelligence is designed with user privacy in mind. The company says it does not have access to what users are identifying or searching for: requests are processed on the device itself, so Apple does not collect information about what users tap on. This approach is meant to keep the experience secure and private.