eBay has over a billion listings across a wide variety of categories that millions of buyers search every day. It is a significant challenge to connect the right buyer to the right listing and, in the process, to present similar items suited to the buyer's inputs.
Each of those listings has multiple images in various forms and styles. The challenge is how to use computer vision to sift through the billions of listings on eBay and show the buyer visually similar items.
The word “similar” can have different implications when we consider that the shopping context varies widely from user to user. To illustrate, consider these two scenarios:
- Let’s say a buyer knows exactly what she wants and searches for it on eBay. From the extensive list of search results, she manages to find something that suits her requirements. But before she buys it, or searches elsewhere for a better deal, she may wonder whether she has found the best possible item available on eBay. Similar listings in this scenario could be a set of listings covering all the buying options and price points, enabling her to make a more informed decision.
- Now, let’s say we have another buyer who has some idea of what he wants. He comes to eBay and explores the variety of listings that could serve his needs. He browses, looking for that special treasure. We all know eBay has a wide variety of similar items that can serve a specific need, but surfacing them in order of relevance is the challenge. "Similar" here could mean listings from different manufacturers, of different quality, or with different aspects.
eBay has solved some of these key issues by introducing multiple Image Search features. However, there are some limitations, such as the quality of the picture, access to the image, and the clarity of the item we want in the image.
How can we address all these points while keeping the user at the center? We present to you our new visual shopping Drag and Drop Image Search experience on the native iOS and Android eBay apps.
The idea is to make use of the rich content that our sellers put together and trigger an image-based search while keeping all the aspects in the search context, thereby delivering results that clearly match the user's interests. eBay is the first ecommerce platform to engage the user with this kind of image-based search, which also adds a fun factor to the search process.
How do you use Drag and Drop? Just open the eBay app and search for an item that piques your interest. On the search results page, tap and hold on a listing image. The rest of the flow is intuitive. You see a drop zone where you can drag the listing image and drop it to trigger a visually similar search that is more contextual than ever. (You can also use Drag and Drop on the app's Home page.) The simplicity of the process and the richness of the relevant results reflect the engineering behind it: structured data combined with emerging computer vision technologies.
How did we do it?
The engineering teams at eBay are dedicated to delivering the most intuitive experiences that help delight users on our platform. Touch is the fundamental input method on our mobile platforms, and with touch comes a host of gestures that are that much more effective when used in the right context.
iOS native platform challenges
The native apps are powered by a host of services, including Experience standard services and the Domain services. Both of these services are governed by certain common guidelines which we call Commerce Operating Standards. Depending on the service layer, we have different frontend components that power the overall experience. We needed a solution that works across all of them.
We made use of the Drag interaction APIs introduced in iOS 11, which have no dependency on the service layer. However, plumbing the required image data and relevant information called for generic Drag and Drop handlers that could be incorporated into both service standards with little change to the View layers.
With Drag and Drop APIs, views with content can be dragged from within and outside the host app into specific areas of the host app. But incorporating this into the eBay iOS app came with its own challenges. The Search Results Page is especially tricky, as we have both service versions catering to our user base.
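At its core, the iOS 11+ Drag API asks a delegate to vend the draggable content. A minimal sketch of enabling drag on a listing image view might look like the following; the type names here are illustrative, not eBay's actual classes:

```swift
import UIKit

// Illustrative drag handler: vends the listing image as the drag payload.
final class ListingDragHandler: NSObject, UIDragInteractionDelegate {
    func dragInteraction(_ interaction: UIDragInteraction,
                         itemsForBeginning session: UIDragSession) -> [UIDragItem] {
        guard let imageView = interaction.view as? UIImageView,
              let image = imageView.image else { return [] }
        let item = UIDragItem(itemProvider: NSItemProvider(object: image))
        // Carry listing context along with the image via localObject,
        // so the drop can trigger a contextual visual search.
        item.localObject = imageView.accessibilityIdentifier
        return [item]
    }
}

// Attaching the interaction to any image view:
// imageView.isUserInteractionEnabled = true
// imageView.addInteraction(UIDragInteraction(delegate: dragHandler))
```

Because the delegate only needs a view and an image, the same handler can be reused wherever a listing image appears, regardless of which service powers the screen.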
All the experiences that we built were following our in-house built Model View Content Controller architecture. The idea behind this architecture is to transform the data models generated from JSON responses to view models. These view models, when passed to a content controller, are translated to views.
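The model-to-view pipeline described above can be sketched roughly as follows; these types are simplified stand-ins for the real architecture, not eBay's actual classes:

```swift
import UIKit

// Data model decoded from a JSON service response.
struct ListingModel: Decodable {
    let title: String
    let imageURL: URL
}

// View model derived from the data model, shaped for display.
struct ListingViewModel {
    let displayTitle: String
    let imageURL: URL
    init(model: ListingModel) {
        self.displayTitle = model.title
        self.imageURL = model.imageURL
    }
}

// Content controller: translates view models into views.
final class ListingContentController {
    func makeView(for viewModel: ListingViewModel) -> UIView {
        let label = UILabel()
        label.text = viewModel.displayTitle
        return label
    }
}
```

The benefit for Drag and Drop is that anything plumbed through the view model, such as a drag handler, reaches the view without the view layer knowing which service produced the data.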
Adding the Drag and Drop API to the image view inside the listing view is not straightforward, because there is no single wrapper for all listing-related subviews. On top of this, the ListingSummaryView lives in the ComponentUI target, while all the drag delegate handlers are implemented in the eBay app target, which led to further challenges.
To overcome these challenges, we masked the drag handler object as NSObjectProtocol and plumbed it into the ListingSummaryCellModel. This, in turn, is applied by the cell controller to the imageView. For optimization, we reuse this view as the user scrolls down. We add the UIDragInteraction with the delegateHandler each time the view is prepared for use, and remove the same when the view is prepared for reuse.
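The reuse pattern described above can be sketched like this, with illustrative names; storing the handler behind NSObjectProtocol means the cell model layer needs no knowledge of the concrete drag-delegate type:

```swift
import UIKit

final class ListingSummaryCell: UICollectionViewCell {
    let listingImageView = UIImageView()
    private var dragInteraction: UIDragInteraction?

    // Called when the cell is configured for display (prepared for use):
    // the masked handler is unwrapped and attached as a fresh interaction.
    func prepareForUse(dragHandler: NSObjectProtocol) {
        guard let delegate = dragHandler as? UIDragInteractionDelegate else { return }
        let interaction = UIDragInteraction(delegate: delegate)
        listingImageView.isUserInteractionEnabled = true
        listingImageView.addInteraction(interaction)
        dragInteraction = interaction
    }

    // Tear the interaction down before the cell is recycled, so a
    // reused cell never carries a stale drag delegate.
    override func prepareForReuse() {
        super.prepareForReuse()
        if let interaction = dragInteraction {
            listingImageView.removeInteraction(interaction)
            dragInteraction = nil
        }
    }
}
```

Removing the interaction on reuse keeps the scrolling-optimized cell pool clean: each recycled cell picks up the handler appropriate to the listing it is about to display.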
iOS is a major ecommerce platform capable of delivering delightful front-end experiences. Visual Shopping is an existing feature on the platform, but bringing it out of the search ecosystem and stitching it into every possible listing was a challenge we had to overcome.
Initial designs introduced a button icon in one corner of each listing image that could launch a visual search with that image. However, the challenge was distinguishing user intent: a buyer who meant to go to the View Item page might accidentally tap the icon, or vice versa.
To solve this, we came up with a drag gesture-based trigger that clearly separates a tap from a tap and hold. The next challenge was to present a valid drop zone: some place to drop the dragged image that the user would intuitively understand initiates a search. At the same time, that drop zone had to be the obvious place to drop the image. We came up with a design that masks all the available area and presents a big, search box-like drop zone at the top of the screen, making it unmistakably the only place to drop the image.
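On the receiving side, a drop zone like the one described could be backed by a UIDropInteraction delegate along these lines; the names (and the search entry point) are hypothetical:

```swift
import UIKit

// Illustrative drop-zone controller: accepts a dragged listing image
// and hands it off to a visual search.
final class DropZoneController: NSObject, UIDropInteractionDelegate {
    // Only accept drag sessions that carry an image.
    func dropInteraction(_ interaction: UIDropInteraction,
                         canHandle session: UIDropSession) -> Bool {
        return session.canLoadObjects(ofClass: UIImage.self)
    }

    func dropInteraction(_ interaction: UIDropInteraction,
                         sessionDidUpdate session: UIDropSession) -> UIDropProposal {
        return UIDropProposal(operation: .copy)
    }

    // Load the dropped image and trigger the search.
    func dropInteraction(_ interaction: UIDropInteraction,
                         performDrop session: UIDropSession) {
        session.loadObjects(ofClass: UIImage.self) { images in
            guard let image = images.first as? UIImage else { return }
            self.startVisualSearch(with: image)  // hypothetical search entry point
        }
    }

    private func startVisualSearch(with image: UIImage) {
        // Hand the image to the visual search pipeline.
    }
}

// Attaching to the drop-zone view:
// dropZoneView.addInteraction(UIDropInteraction(delegate: dropZoneController))
```

Returning `.copy` from `sessionDidUpdate` gives the system cue (the green plus badge) that dropping here performs an action rather than moving content.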
We will continue to improve Visual Shopping, adding more options and expanding the feature to more platforms.
eBay continues to spearhead ecommerce with the latest intuitive technological advancements, and this feature moves us one step further in that direction. Enhancing user experiences and catering to users' needs in more than one way has always been at the core of what we are and what we do.
Many people were involved in the design, build, and testing of Drag and Drop search, without whom this would not have been possible: Jimmy Lui from the Verticals Android Team, the Verticals Product and Design teams, and the Computer Vision team.