
Monday, June 12, 2017

Exclusive: Hands-on with Google’s Visual Search for Android

With so much information out there in the world, Internet search engines are crucial for acquiring knowledge. Whether through the web browser on your desktop or laptop PC, a dedicated app on your smartphone or tablet, or your voice on a smartwatch or home assistant device, you have many options for finding the information you're looking for. The vast majority of web searches are still performed via text queries, though the advent of Google Goggles was a promising start at bringing visual search to the masses. Sadly, Goggles has fallen by the wayside due to a lack of updates, but the recent announcement of Google Lens at I/O seems to be reviving the concept. We were able to go hands-on with the Google App's new Visual Search feature, though we can't confirm whether it is the same feature as Google Lens.


Google’s Visual Search

From what little we know of how Google Lens actually works, it appears to be Google Goggles on steroids. Google's image recognition is already incredibly powerful, and thanks to Google's growing AI prowess, Google Lens will not only be able to identify all kinds of objects but also provide contextual results. Lens will be able to interface with other Google services to offer more personalized feedback.

In theory, anyway. Beyond a quick demonstration at Google I/O showcasing connecting to a Wi-Fi network, there are a lot of specifics about Google Lens that we don't yet know. But at the very least, we can take a look at what its interface may look like within the Google App.

Please note that the following screenshots may not be representative of the final Visual Search product. The feature is clearly marked as "BETA," so the issues we detail below may be resolved by the final release, and the interface itself may change.

The interface consists of a large camera viewfinder in the top half of the screen, with a list of categories along the bottom. The image recognition categories are:

  • All
  • Clothing
  • Shoes
  • Handbags
  • Sunglasses
  • Barcodes
  • Products
  • Places
  • Cats
  • Dogs
  • Flowers

We suspect that selecting a category narrows the database Visual Search queries so it can complete the search much more quickly. In any case, once you've chosen a category (or stuck with "All"), you start a search by simply tapping anywhere in the camera viewfinder. If the surrounding area is too dark, you can enable the camera flash by tapping the flash icon in the top left-hand corner.
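To illustrate the kind of narrowing we're speculating about, here is a minimal sketch, assuming a hypothetical recognizer that keeps one database per category. None of these names come from the Google App itself; this is just our guess at why a narrowed query could be faster.

import java.util.EnumSet;

// Purely illustrative: the categories mirror the list above, but the
// filtering logic is an assumption, not something pulled from the app.
enum Category {
    ALL, CLOTHING, SHOES, HANDBAGS, SUNGLASSES,
    BARCODES, PRODUCTS, PLACES, CATS, DOGS, FLOWERS
}

final class CategoryFilter {
    // Which per-category databases should be consulted for a given pick?
    static EnumSet<Category> candidates(Category picked) {
        return picked == Category.ALL
                ? EnumSet.complementOf(EnumSet.of(Category.ALL)) // consult everything
                : EnumSet.of(picked); // consult one narrow database
    }
}

Consulting a single narrow database rather than all of them would be the simplest explanation for the speedup we suspect.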

Performing a search brings up a card-style list of results, which you can swipe left and right to browse. Tapping on a result opens a Google search page related to that object or product, and you can easily jump to the search page for another result by scrolling through the mini card view at the top of the page. Opening the three-dot menu on any visual search result also lets you directly access the image source, much like Google's desktop image search.

We tested Visual Search on both static images sourced from the web and live pictures. As you would expect, Visual Search performs admirably on static images, but we were unable to get this beta version working on live pictures. I tried to get it to recognize a DualShock 4 wireless controller for the PlayStation 4 as well as my own Google Home, but in neither case was it able to recognize the object. Given that this is a beta version we got our hands on, we can't say this is a fault of the product. I would honestly be shocked if the live version of Visual Search were unable to recognize something like a Google Home (though a part of me hopes it would hilariously mistake it for an air freshener).


Is Visual Search the same as Google Lens?

None of the searches we performed showed any signs of the intelligence that Google Lens demonstrated on stage at Google I/O. However, several strings within the latest Google App APK file do show a connection between the two. Within the search_widget.xml layout file is a line suggesting that the Google search widget may gain a button to launch Visual Search. Most interesting of all, the drawable that will visually identify this feature to the user is named "google_lens."

<ImageButton
    android:orientation="horizontal"
    android:id="@id/search_widget_visual_search"
    android:background="@drawable/search_box_click_on_transparent"
    android:paddingLeft="8.0dip"
    android:paddingTop="8.0dip"
    android:paddingBottom="8.0dip"
    android:visibility="gone"
    android:layout_width="56.0dip"
    android:layout_height="fill_parent"
    android:src="@drawable/google_lens"
    android:scaleType="centerInside"
    android:contentDescription="@string/accessibility_visual_search_button"
    android:layoutDirection="locale" />
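Note the android:visibility="gone" attribute: the button ships hidden and takes up no space until something enables it at runtime. How that happens is unknown to us; the sketch below assumes a simple feature flag, and everything except the button's ID is our own invention.

import android.view.View;
import android.widget.ImageButton;

// Hypothetical sketch: reveal the hidden widget button once some (assumed)
// flag is flipped. Only the ID (search_widget_visual_search) comes from the
// decompiled layout; the class, method, and flag names are made up.
final class VisualSearchButton {
    static void maybeShow(View widgetRoot, int buttonId, boolean visualSearchEnabled) {
        ImageButton button = (ImageButton) widgetRoot.findViewById(buttonId);
        if (button != null && visualSearchEnabled) {
            button.setVisibility(View.VISIBLE); // undoes the "gone" default
        }
    }
}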

That google_lens icon is the exact same logo that was shown during the Google Lens announcement at Google I/O.

In addition, the same icon shows up in another file, navigation_menu.xml, which defines what elements appear in the sidebar menu of the Google App. The feature is named Visual Search, but its icon comes from Lens.

<com.google.android.apps.gsa.shared.ui.drawer.DrawerEntry
    android:id="@id/visual_search"
    android:tag="ve=37333;track:click"
    android:visibility="gone"
    android:layout_width="fill_parent"
    android:layout_height="wrap_content"
    thegoogle:imageSrc="@drawable/quantum_ic_google_lens_grey600_24"
    thegoogle:text="@string/visual_search" />

Thus, it’s not too hard to put two-and-two together. It’s very likely that Google’s Visual Search is in fact Google Lens, though what makes me a bit hesitant is the fact that this is accessed from the Google App rather than within Google Assistant which is what was promised at Google I/O. Plus, since I was unable to get any of the smart features working (and the interface didn’t resemble the one shown at Google I/O), I can’t definitively claim that what I played around with is Google Lens.

Still, we’re excited to see how well Google Lens works in practice, especially given the rather disappointing effort on the part of Samsung’s Bixby Vision. Google’s I/O demonstration was neat, but I would personally hold off until we can try it out for ourselves.


What do you think of Google Lens? Let us know your thoughts in the comments below!



from xda-developers http://ift.tt/2tcOnZv
via IFTTT
