Apple invents its own version of Google Lens called Visual Intelligence
Apple has introduced a new feature called Visual Intelligence with the iPhone 16, which appears to be the company's answer to Google Lens. Unveiled during its September 2024 event, Visual Intelligence aims to help users interact with the world around them in smarter ways.
The feature is activated by a new touch-sensitive button on the right side of the device called Camera Control. With a click, Visual Intelligence can identify objects, provide information, and offer actions based on what you point it at. For instance, aiming it at a restaurant will pull up menus, hours, or ratings, while snapping an event flyer can add the event directly to your calendar. Point it at a dog to quickly identify the breed, or click on a product to search for where you can buy it online.
Later this year, Camera Control will also serve as a gateway into third-party tools with specific domain expertise, according to Apple's press release. For instance, users will be able to turn to Google for product searches or tap into ChatGPT for problem-solving, all while maintaining control over when and how these tools are accessed and what information is shared.
Apple emphasized that Visual Intelligence is designed with privacy in mind: because data is processed on the device itself, the company says it has no access to what users identify or search for.