
I tried the iPhone 16’s new visual intelligence and it feels like the future


When I walked past a Japanese tea shop in New York’s Bowery Market, all I had to do was point my iPhone at the storefront and press and hold a button on the side of my phone to view the shop’s hours, customer photos, and options to explore the store, call it, or place an order.

Apple’s new Visual Intelligence tool, which will be available for the iPhone 16 series, aims to eliminate those in-between steps: unlocking your phone, opening Google or ChatGPT, and typing a question or uploading a photo to get an answer. An early version of the feature is available as part of Apple’s iOS 18.2 developer beta, which launched on Wednesday for those in the program.

Although the version I tried is an early preview aimed at developers and not general users, it gave me an idea of how Visual Intelligence works and what it could add to the iPhone experience. During my short time testing this very early version, I found that it works best for quickly retrieving information about places of interest. While it could be useful, I also imagine it will take consumers time to embrace the feature once it launches, as it represents a new way of thinking about how we operate our phones.

Still, it hints at a future where we don’t have to open as many apps to get things done on our mobile devices, and that’s promising.

But I’ll have more to say about it once I’ve spent more time with it and after the final version is launched.

Read more: ‘A Cambrian Explosion:’ AI’s Radical Reform of Your Phone, Coming Soon

How visual intelligence works

Visual Intelligence relies on the new Camera Control button on the iPhone 16, 16 Plus, 16 Pro and 16 Pro Max. Press and hold the button and you’ll see a prompt explaining what Visual Intelligence is and informing you that images won’t be saved to your iPhone or shared with Apple.


When the Visual Intelligence interface is open, simply tap the camera shutter button to take a photo. From there, you can tap a button on the screen to ask ChatGPT about the image, or you can press the search button to start a Google search. You can choose to use ChatGPT with or without an account; if you don’t log in, requests remain anonymous and aren’t used to train the ChatGPT model.

A screenshot of the Visual Intelligence feature of Apple Intelligence on an iPhone with a Game Boy Color

I took a photo of a retro gaming console and asked when it came out. Visual Intelligence, which uses ChatGPT, had the correct answer.

Lisa Eadicicco/CNET

In the current version of Visual Intelligence, there is also an option to report a problem by pressing the icon that looks like three dots. If you want to discard the image and take another one instead, you can tap the X icon where the shutter button usually sits on the screen.

In addition to using Google or ChatGPT, the iPhone will also display certain options based on where you point the camera, such as opening hours if you point it at a store or restaurant.

What it’s like to use it

During the short time I’ve spent with Visual Intelligence so far, I’ve used it to learn about restaurants and stores, ask questions about video games, and more.

While it’s a quick and easy way to access ChatGPT or Google, what interests me most is the way it can identify restaurants and stores. So far, this has worked best when I pointed the camera at a storefront rather than at a sign or banner.

For example, when I scanned the exterior of Kettl, the Japanese tea shop I mentioned earlier, Visual Intelligence automatically pulled up useful information, such as photos of its different drinks. It responded the same way when I took a photo of a vintage video game store near my office. After I pressed the shutter button, Apple showed the name of the store, along with photos of the inside, a link to visit the website, and the option to call the store.


The shop’s menu didn’t have pictures of the drinks, but thanks to Visual Intelligence, my phone did.

Lisa Eadicicco/CNET

Once inside, I used Visual Intelligence to ask ChatGPT for game recommendations based on titles in the store and to learn more about the consoles and games on its shelves. The answers were largely on point, although it’s always worth remembering that chatbots like ChatGPT aren’t always accurate.

When I asked ChatGPT about games similar to the Persona Dancing series after taking a photo of the games on a shelf, it suggested other titles that are also music- and story-driven. That seems like a sensible answer, since the Persona Dancing games are rhythm-based spinoffs of the popular Japanese Persona role-playing games. To find out that the Game Boy Color launched in 1998, all I had to do was take a quick photo and ask when it was released. (For what it’s worth, I got similar results when asking the same questions in the ChatGPT app.)

Apple Intelligence on an iPhone shows results from ChatGPT.

This answer from ChatGPT and Visual Intelligence about games I might like was pretty on point.

Lisa Eadicicco/CNET

While I’ve enjoyed experimenting with Visual Intelligence so far, I think it would be much more useful when traveling. Being able to simply point my iPhone at a landmark, store or restaurant to learn more about it would have come in handy during my trips to France and Scotland earlier this year. In a city I already know, I rarely need to quickly pull up information about nearby places.

Read more: What I learned after trading in my Apple Watch for Samsung’s Galaxy Ring

Visual intelligence and the future of phones

It seems impossible not to compare Visual Intelligence to Google Lens, which also lets you learn more about the world around you by using your phone’s camera instead of typing in a search term. In its current form (which, again, is an early preview intended for developers), Visual Intelligence feels almost like a dedicated Google Lens/ChatGPT button.

That can give the feeling that it’s not new or different, since Google Lens has been around for years. But the fact that this kind of functionality is so important that it gets its own button on the latest iPhone is telling. It indicates that Apple believes there might be a better way to search and get things done on our phones.

Apple is far from alone in this belief; Google, OpenAI, Qualcomm and startups like Rabbit all believe AI can put the camera on our mobile devices to new uses, turning it into more of a discovery tool than just a means of taking photos. At the annual Snapdragon Summit this week, Qualcomm showed off a virtual assistant concept that uses the camera to do things like split a restaurant bill three ways based on a photo of the receipt.

The trick is to get ordinary people to adopt it. Even if it’s faster and more efficient, I’m willing to bet that muscle memory could keep many from abandoning the old ways of tapping and swiping in favor of taking photos.

Learning new habits takes time. But Visual Intelligence is only in early preview stages, so there’s a lot more to come.
