All four new Apple smartphone models presented yesterday received a dedicated touch-sensitive button on the side of the body that controls the main camera. After iOS 18 is released, it will also let users quickly look up information about objects caught in the camera lens, a function integrated with Apple's own artificial intelligence system.

Image source: Apple, The Verge

According to Bloomberg, the new touch button on the side of the iPhone 16 family recognizes several types of input, including single and double presses, a press-and-hold, and sliding a finger up or down. Each gesture triggers a corresponding function: swiping up or down along the button, for example, zooms the image.
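
To make the gesture-to-function mapping concrete, here is a minimal Swift sketch of how such input could be dispatched to camera actions. The CameraControlGesture enum, the handler, and the single- and double-press actions are illustrative assumptions; only the slide-to-zoom and hold-for-search behaviors come from the reports above, and none of this reflects Apple's actual API.

    import Foundation

    // Hypothetical gesture set for the side camera button, mirroring the inputs
    // described by Bloomberg. The names are illustrative, not Apple's API.
    enum CameraControlGesture {
        case singlePress
        case doublePress
        case pressAndHold
        case slide(delta: Double)   // positive = slide up, negative = slide down
    }

    // Illustrative controller mapping each gesture to a camera action.
    final class CameraController {
        private(set) var zoomFactor = 1.0

        func handle(_ gesture: CameraControlGesture) {
            switch gesture {
            case .singlePress:
                print("Capture photo")          // placeholder: action not specified in the article
            case .doublePress:
                print("Switch shooting mode")   // placeholder: action not specified in the article
            case .pressAndHold:
                print("Start visual search")    // the iOS 18 lookup described below
            case .slide(let delta):
                // Sliding along the button adjusts zoom, clamped to a sensible range.
                zoomFactor = min(max(zoomFactor + delta, 1.0), 10.0)
                print("Zoom set to \(zoomFactor)x")
            }
        }
    }

    // Usage: slide to zoom in, then press and hold to trigger the lookup.
    let camera = CameraController()
    camera.handle(.slide(delta: 0.5))
    camera.handle(.pressAndHold)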

As The Verge adds, pressing the button once and then holding it down calls up a search based on the image of whatever is in the main camera's field of view. The system looks for contextual information associated with the objects the camera is pointed at: point it at a café, for example, and it will find the café's opening hours and menu. The feature is also useful for lifting information from paper notices and flyers to quickly create a reminder for a particular event. In the future, Apple plans to integrate the Visual Intelligence feature with third-party information services such as Google. Company representatives emphasize that the images themselves are not stored on Apple's servers if the user employs them only to search for related data.
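
The lookup flow described above can be summarized in a short sketch: a captured frame is analyzed, contextual results such as opening hours or a menu link come back, and the frame itself is not retained. Everything below, including the VisualSearchResult type and the visualSearch function, is a hypothetical illustration under those assumptions, not Apple's Visual Intelligence API.

    import Foundation

    // Hypothetical shape of a visual lookup result, using the café example from the article.
    struct VisualSearchResult {
        let subject: String          // what the camera is pointed at, e.g. a café
        let openingHours: String?    // contextual details, when available
        let menuURL: URL?
    }

    // Illustrative lookup: the frame is used only for this query and is not persisted,
    // matching Apple's statement that images are not stored on its servers.
    func visualSearch(frame: Data) -> VisualSearchResult {
        // A real implementation would analyze the frame on-device or hand the query
        // off to a partner service such as Google; here we return a stub result.
        return VisualSearchResult(
            subject: "Sample Café",
            openingHours: "08:00-20:00",
            menuURL: URL(string: "https://example.com/menu")
        )
    }

    let result = visualSearch(frame: Data())
    print("\(result.subject): open \(result.openingHours ?? "unknown")")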
