
Apple roasted over new iPhone 16 feature that Google released 7 years ago

Hasn’t this been done before?


Marlon Ettinger


Apple’s new iPhone 16 boasts a built-in “personal intelligence” system that, according to the company, lets users “quickly access visual intelligence to learn about objects or places around them faster than ever before.”


Just push a special button, take a picture, and get information about whatever you’re pointing your camera at, the company promises.

And posters are asking … doesn’t that sound like something that already exists? 

https://twitter.com/dr_cintas/status/1833205331092246844

“Google Lens has been around for 7 years. This is in no way groundbreaking. Very underwhelming announcements today,” posted @Charlie_Hiller on X in response to a video advertisement showcasing the new feature. Google Lens first launched in October 2017.

In the video, a man takes a photo of a restaurant to see its opening hours and other information, a flier advertising a concert to learn more about tickets, and a dog to figure out what breed it is.

But not everybody was convinced that the features were anything new.

“It’s crazy how Apple has gaslit their users,” added @Farah_ai_. “Been doing this for ages on Google.”


Apple is pitching the new technology on privacy grounds: some queries are processed on the device itself, while others run on “dedicated Apple silicon servers,” which the company says ensures privacy for users.

“Apple Intelligence maintains the privacy and security of user data with Private Cloud Compute,” reads the phone’s marketing materials.

Some posters are picking up on that selling point.


“It works on-device for everything that is « common knowledge » as well as OCR and feature extraction (adding an event to the calendar for example) so that’s still cool,” wrote @Painguette.

But not everybody’s buying the marketing hype.

“it literally searched for the restaurant on the internet. ‘on device’ my ass,” posted @deludedrunefan.

“So google lens but worse?” added @RF10k.


Google Lens initially launched as a stand-alone app before being integrated into Google Camera, Google Photos, and Google Assistant, though it has since returned as an easy-to-access stand-alone app.

When Google CEO Sundar Pichai announced the feature in 2017, he also touted the phone’s ability to quickly grab information from a restaurant as you walk past it.

“If you’re walking in a street downtown and you see a set of restaurants across you, you can point your phone and we can give you the right information in a meaningful way,” Pichai said at the Google I/O 2017 conference.

For some people, the nothing-new aspect of the technology was a sign of a lack of inspiration in tech.


“this is what gets tech people excited?” posted @FoParty. “we used to have actual ideas. we think so small now all we innovate are fads.”

