Google can't read minds (yet), but it can read outfits... so we tested it out

This new feature takes saying “Google it” to the next level. Here's what happened when we tried and tested the new Google Style Match feature. AKA Shazam, but for clothes...

Technology definitely seems to be reaching whole new levels of fashion-friendliness of late, with Snapchat, Instagram and Facebook all having developed shopping features. However, Google seems to be ever-so-slightly ahead of the game with the release of its newest baby, Style Match, within Google Lens.

Google Style Match allows you to put yourself in the shoes of a well-dressed stranger… quite literally. Through its use of image recognition, you can now take photos of clothes, accessories and homeware and the program will find the specific item online. And if it can’t find it, Google Style Match will give you the ultimate in dupes. Imagine a world where your internet search history doesn’t look like an almost romantic, yet obsessive short novel written about a polka-dot midi skirt. We’re quaking.

Although similar functionality is cropping up across a number of tech companies, Google’s entry brings it into the mainstream, pushing visual ID for clothing towards becoming a widespread reality. So we at Grazia did what any intrepid journos would do. We took it upon ourselves to test the feature so you won’t have to. We know, you’re welcome.

First things first, Google Lens is definitely proficient at recognising restaurants (phew), landmarks, and even landline telephones down to the model. With the added dimension of Style Match, when you aim your camera at people (in photos or in person alike), coloured dots pop up over each piece of identified clothing.

Tapping each dot will bring up a list of satisfyingly similar clothing items as the feature successfully picks up the main characteristics of the garments, from embellishment and print, to specific furry fabrics. Even our plant friends are not discriminated against.

The feature even kind of works from a distance (granted, the clothing isn’t obstructed and you stay very, very still). We learnt this the hard way through countless foiled attempts at trying to take surreptitious shots of our colleagues from across the room. We say “trying” because, let’s be honest, pointing your camera at someone for the umpteenth time, hoping this time the app actually picks that jacket up, is not exactly discreet.

Although this is all great news, the feature didn’t provide an exact match for many clothing items, so talking to strangers about their above-par apparel picks might still be inevitable. To its credit though, it did successfully identify Dior’s ‘5 Couleurs’ eyeshadow palette just from the pinkie-nail-sized shades of brown, sitting in the little black palette.

One Style Match quality of note: unlike most other e-commerce platforms, there’s a pleasant lack of separation by price brackets. Basing searches solely on visual likeness, the feature displays high street and high end side-by-side, no discrimination - something our debit cards are endlessly thankful for. Although eBay and Stella McCartney is hardly the most compelling comparison. We’ll call that one a fluke.

Now, actually shopping via the feature is a whole other world... which brought up a bit of a conundrum: tapping any product led all the way back to – wait for it – a Google search page. Many a searched item was lost on the way. This could just be a wrinkle for Google to iron out, although Style Match is, technically, only a recognition tool.

Can Google Style Match change the way we shop? It’s possible. It probably won't cut down those 8-hour mall trawls, but it does make it that little bit easier to know what you’re looking for and where from, which is a total gift when you’re staring into an abyss of 1,200+ shops.

Photos: Jason Lloyd-Evans and supplied