Sure enough, when I checked my iPhone 15 Pro this morning, the switch was on. You can find it yourself by going to Settings > Photos (or System Settings > Photos on a Mac). Enhanced Visual Search lets you identify landmarks you've photographed and search for those photos using the landmarks' names.
To see what it allows in the Photos app, swipe up on a photo you've taken of a building and select “Find Landmark,” and a card will appear that ideally identifies it. Here are a couple of examples from my phone:
At first glance, it's a convenient expansion of the visual search feature Apple introduced to Photos in iOS 15, which lets you identify plants or, say, find out what the symbols on a laundry label mean. But Visual Look Up doesn't need special permission to share data with Apple, and this one does.
A description below the switch says you're giving Apple permission to “privately match places in your photos to a global index maintained by Apple.” As for how, there are details in an Apple machine learning research blog post about Enhanced Visual Search that Johnson links to:
The process begins with an on-device machine learning model that analyzes a given photo to determine if there is a “region of interest” (ROI) that may contain a landmark. If the model detects an ROI in the “landmark” domain, a vector embedding is calculated for that region of the image.
According to the blog, that vector embedding is encrypted and sent to Apple for comparison with its database. The company offers a very technical explanation of vector embeddings in a research paper, but IBM puts it more simply, writing that embeddings transform “a data point, such as a word, sentence, or image, into an n-dimensional array of numbers representing the characteristics of that data point.”
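To make that idea concrete: an embedding is just a fixed-length list of numbers, and matching one against an index can be as simple as finding the stored vector it sits closest to. The Python sketch below is purely illustrative; the landmark names, the 128-number vectors, and the random values standing in for a trained model's output are all invented here and are not Apple's actual model, index, or code.

```python
# Toy illustration of "vector embedding + index lookup." A real system would
# get its embeddings from a trained image model; random unit vectors stand in
# for them here.
import numpy as np

rng = np.random.default_rng(seed=42)

def unit(v: np.ndarray) -> np.ndarray:
    """Scale a vector to length 1 so a plain dot product acts as cosine similarity."""
    return v / np.linalg.norm(v)

# Pretend "global index": landmark name -> 128-dimensional embedding.
index = {
    "Eiffel Tower": unit(rng.standard_normal(128)),
    "Golden Gate Bridge": unit(rng.standard_normal(128)),
    "Space Needle": unit(rng.standard_normal(128)),
}

# Pretend embedding computed on-device from a photo's region of interest:
# here, just a slightly noisy copy of one index entry.
query = unit(index["Golden Gate Bridge"] + 0.05 * rng.standard_normal(128))

# Nearest-neighbor lookup: the index entry with the highest cosine similarity.
best = max(index, key=lambda name: float(query @ index[name]))
print(best)  # Golden Gate Bridge
```

The dot-product lookup is the only part the sketch shows; in the system the blog post describes, the query embedding is encrypted before it ever leaves the device for that comparison.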
Like Johnson, I don't fully understand Apple's research blogs, and Apple didn't immediately respond to our request for comment on Johnson's concerns. It appears the company went to great lengths to keep the data private, in part by condensing the image data into a format a machine learning model can read.
Still, making the toggle opt-in, like those for sharing analytics data or Siri recordings and interactions, rather than something users have to discover on their own, seems like it would have been a better option.