- Welcome to the Made by Google Podcast, where we meet the people who work on the Google products you love. Here's your host, Rachid Finge. - Circle to Search is getting a huge upgrade, making it even more powerful at finding out what you're looking at on your Pixel 10 or Galaxy S26. And if you're into fashion, it has a mind-blowing trick up its sleeve that will help you find a beautiful new look. Let's learn more today from Director of Product Management, Harsh Kharbanda. This is the Made by Google Podcast. - Harsh.
Welcome to the Made by Google Podcast. We're talking Circle to Search today. Do you remember the very first time you came across Circle to Search? - So, I've worked on Lens since the beginning of Google Lens, since 2018. As Lens was growing up, our main focus was the camera: how users can take a picture of things in front of them and get their questions answered. And at some point we realized a lot of the value is in people being able to search things they see on their computer and phone screens.
Because what we started to notice was that users take a lot of screenshots and then just upload them to Google Lens. So part of our team started working with Android, trying to figure out how to make that whole flow a lot more seamless, and that's where the circle gesture came around. The first time I used it, it just felt so intuitive that I thought, oh yeah, I'm going to use this ten times a day, because there are so many different questions on my phone screen that I come across every single day. From then on I was hooked. As the Lens
and Circle to Search products merged over time, I ended up owning more and more of the Circle to Search product as well. And here we are. - I guess in order to understand where we are today, it might be useful to back up a little. Of course, the search engine started with typing queries into the search box. Then at some point we got voice search, and then we got something you mentioned already called Google Lens. So why did we come up with Google Lens a few years ago? - Great question. The concept of visual search has been around for a very long time. If you remember, I think Google Goggles was something that we - Oh, right.
Google worked on. - Goggles, yes, which we worked on in 2012 or 2014, I forget exactly when. Google invested in this concept of being able to search for things visually very early on, but the technology was just not there. Early on, Google Lens could only do QR codes, maybe some objects, and some translation. The key thing with Lens was that we knew there was demand, because we saw a lot of questions that are very hard to articulate verbally, for users to actually express via text or voice.
Like, how do you describe a very unique dress that you come across on social media? There are so many different words, and even if you use them, you're still not capturing the essence of that dress. Or plants, which people use Lens a lot for: how do you describe a plant in your house that somebody gifted to you, whose name you don't know, and it's dying in a very specific way with these spots on it? It's just really, really hard. You can get close, but it'll take you a long time and you lose a lot of fidelity in doing that. So the impetus, that users have so many questions that are hard to describe, was something
that Google saw very early on. - So we had Lens, visual search as you call it. And then at some point came Circle to Search. For people who have never used Circle to Search, how is that different from just using the camera to search something? - When we started Lens, a lot of the focus was on things you see in front of you. But what we quickly realized was that the stimulus in your daily life is not that much. You're seeing the same plant, the same dog, the same office location and work desk, so you don't have that many opportunities to ask new questions. Meanwhile, phones have gotten better and better.
Pixel is an amazing product, and people spend many hours on social media and on different parts of their phone. As they scroll and browse, they come across the stimulus where they have these questions. So the goal was to bridge the gap: okay, we have visual search, it's really focused on things in front of you, and it's not something people do all the time. How do we bring it closer to the device? How do we bring it to where the user's questions are? And that's what led to Circle to Search.
- And for those who never had the chance to use Circle to Search, can you explain how I use it as a user, and what is maybe your favorite example of a situation where it's super useful? - Sure. It's very simple. On your new Pixel device, you just long press the nav handle, on any app you're in. - That's that bar at the bottom, right? - Yes, the bar at the bottom. That freezes your screen, and then you can tap anywhere. Or, the best part, you can draw a circle around exactly
what you want, and then we'll show you, most likely, an AI answer about what you're looking at. And then you can ask very specific questions about the things you're looking at as well. One of my favorite examples: I get a lot of spam from my mom on WhatsApp. She sends me a ton of messages that are definitely fake, at least in my opinion, or videos where I'm like, this cannot be true. And I use Circle to Search a lot to verify that information. All I do is long press the nav handle, and then I circle,
and it just tells me, hey, this thing is not true, and here are the reasons why. It actually searches the web and figures out what's true, what's not, and where the nuance is. Often I'll screenshot that and send it back to her, although that has gotten me in trouble many times. But that's one of my favorite personal use cases for Circle to Search. - And I guess what I use it a lot for: if there's something on my screen in a language that I cannot read, I can also circle text and get a translation. - Exactly. That happens to me too,
like in family chats. Sometimes somebody forwards a message in a different Indian language or dialect that I don't speak, or that I might even speak but find hard to read. I just circle it, and it can read it out to me and translate it for me. So it's a pretty nifty feature that a lot of our users use a lot. - Harsh, I'm curious, because we're going to talk about the new update for Circle to Search that recently launched: at some point you probably had the insight, and I'm not sure if you expected it,
that people are using Circle to Search in a fashion sense as well. - Yeah, actually, this was very early on in Circle to Search; we realized this would be a thing. One thing we learned from Google Lens was that, especially during COVID, users spent a lot of time on their phones and on social media. They would come across videos or social media posts of influencers and think, oh, I love that jacket, but obviously I'm not going to buy a $5,000 jacket. So they would take screenshots
and upload them to Google Lens to find something similar in their price range. With Circle to Search, we knew very early on that if we reduced the friction, so the user doesn't have to take a screenshot, close the app, open Google Lens, and upload the screenshot, but can instead, without leaving Instagram or TikTok or whatever they're watching, ask the question right then and there, it would actually lead to growth. And we did see this very early on: when we launched Circle to Search, people used it very naturally for shopping. In fact, younger users use Circle to Search
for shopping a lot more, because they're obviously budget conscious, and they have a sense of style and aesthetic that they want to recreate. So younger users, and female users, are a population that ends up doing a lot of visual shopping with Circle to Search. And what we started to see is that a lot of the time they're not actually circling or wanting a single jacket or a single product that they're looking at. They actually want the whole vibe, the whole feel of the outfit.
An example: I saw Taylor Swift in this awesome look that's going viral, she posted it on her Instagram, and now I want to find every part of that look and recreate it, but within my budget. Circle to Search didn't really work well for that. It worked really well for single objects, and you could circle them one by one: the jacket, the top, the jeans, the shoes, the bag. And some of these influencers have sunglasses too, so that's five - A lot to circle, right? - Right. So now, with this update, what we enabled was for the user to just circle the whole thing and then find the look.
With Gemini 3 and all the latest model updates we've made, we're able to really look at the image as a whole, break it down, and think through which parts of the image are interesting, and lay those parts out for the user. And then the best part: we do the same visual search for each part of the image. We find the sunglasses, the jacket, the jeans, et cetera, using visual search, and then we find you the exact product, if possible, and many similar products as well, for you to browse and find something in your price range
and really recreate the whole look. And then the cherry on top: once you find the product you like, you can tap it, and you'll see an option to try that thing on. When you tap the try-on option, you'll see the jacket you found, the one that's maybe in your budget, on you, and what it looks like on you. When we tested that whole end-to-end journey with users, they really loved it. Like, oh my God, Taylor Swift was wearing that, and I found something for less than a hundred bucks, and it looks great.
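The flow described here — deconstruct the circled image into its parts, run a visual search per part, then surface an exact match plus similar in-stock alternatives — could be sketched roughly as follows. This is a hypothetical illustration, not Google's actual code: the function names, data types, and the toy product catalog are all made up, and the detection and retrieval steps are stubbed out where real models would sit.

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class ProductMatch:
    title: str
    price: float
    in_stock: bool

@dataclass
class LookItem:
    label: str                      # e.g. "jacket", "sunglasses"
    exact: ProductMatch | None      # exact product, if one was found
    similar: list[ProductMatch] = field(default_factory=list)

def detect_parts(image) -> list[str]:
    # Stand-in for the model that deconstructs the look into parts.
    # A real system would return regions/boxes; here, labels only.
    return ["cap", "shirt", "shorts", "shoes"]

def visual_search(image, label: str) -> tuple[ProductMatch | None, list[ProductMatch]]:
    # Stand-in for per-part visual search: the exact product when it
    # can be identified, plus visually similar cheaper alternatives.
    catalog = {
        "shirt": (ProductMatch("Striped polo shirt", 89.0, True),
                  [ProductMatch("Similar striped polo", 35.0, True),
                   ProductMatch("Budget polo", 19.0, False)]),
    }
    return catalog.get(label, (None, [ProductMatch(f"Similar {label}", 25.0, True)]))

def find_the_look(image) -> list[LookItem]:
    items = []
    for label in detect_parts(image):
        exact, similar = visual_search(image, label)
        # Only surface alternatives the user can actually buy.
        items.append(LookItem(label, exact, [p for p in similar if p.in_stock]))
    return items

for item in find_the_look(image=None):
    name = item.exact.title if item.exact else "no exact match"
    print(f"{item.label}: {name}, {len(item.similar)} in-stock alternatives")
```

The in-stock filter at the end gestures at the ranking and quality layer discussed later in the conversation: retrieval alone is not enough if the result can't be purchased.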
I'm going to buy that, right? It really delighted users. - That's amazing. Now, to back up a little bit: you mentioned that Circle to Search can now detect multiple objects. We've been speaking about fashion, but what other scenarios are you thinking about where it might be useful to detect multiple things in one image and get a result from Search? - There are many. One obvious one, related to products, came from one of the PMs on the team who works on this.
She's really into skincare, and she came across this skincare routine on Instagram which literally listed images of 14 different products. So she just circled it and asked, hey, can you find me all the products, give me reviews for everything, and rank them by price? And it's able to search for every single thing and do that for you. Another one I came across: I was on social media and saw a setup of plants, and I thought, these plants look great, but I don't know their names, or whether
I can grow them in my house. So I just circled it and asked, hey, can you name all the plants and their growing conditions, and will they thrive in these conditions, et cetera? It was able to search for all of those as well. Another one: I saw this post where a group of five actors were holding their awards, at the Golden Globes or one of those award ceremonies, I think. I knew a couple of them, but I didn't know the others, or what they had won the awards for. So I just circled all of them
and asked, hey, can you tell me in a nice table who they are, what they won these awards for, and which of their movies I should watch? And it constructed a great result for me there as well. So there are lots of different use cases where the user is really asking about the whole thing, not just a part of it, and now we're able to do that for them. Finding the look is one of the most canonical journeys, the one that makes people go, aha, I get it, but then they end up using it for many different things. - You know, Harsh, what's better than talking about the
try-on tool? Maybe showing it to us. How about that? - Yeah, let me just quickly record my screen so I can show you. Let's take this example I came across on social media. It's a golf look, a look that I would never conceive of getting, but there are parts of it that are very interesting to me. So all I need to do is invoke Circle to Search, which I just did. Then I circle the whole look, and when I do, I get this AI Overview response
that allows me to find the look. When I tap on find the look, as you can see, it's actually able to deconstruct the entire look: the shirt, the shorts, and so on. So now I know, okay, this is the cap, this is the shirt, and so forth. And it found the exact shirt, so I can open it in the viewer and see it closely. Then I'm able to try it on as well. Let me take an example here. Now I see a try it on button, and when I tap it, it's able to instantly put that shirt on me.
Probably not the thing I want to wear, but at least that tells me it's not something for me. So that's the whole journey: hey, I found this look online, it looks interesting, let me search for it. You search for it in two taps, we find you something very similar, and now you try it on and decide, probably not. And then you can move on, or buy
it. - You know, I'm so curious what it was like to build this feature and then see it progress, because of course you need to do a lot of testing. Can you tell me a little about how that works? I presume you've done dozens and dozens of tryouts while building the product, but probably you need more than just the people on the team testing it. - Yeah. To do find the look well, there are a couple of things. First, we need a very diverse set of images that covers all the different types of looks people might want to search for: female influencers, male influencers, younger and older influencers, wearing different types of outfits, jackets, skirts, dresses, and so on. So first we need to get a very broad set
of all the different images. Then we have to build a model and make sure that all the different objects, the parts of the look, are being deconstructed properly. Early on we saw that the model just wouldn't do a good job of actually searching shoes, because shoes are generally a very small part of the whole look. - I see. - So we had to really tune it and say, hey, the shoes are there too, you have to look at the shoes. Or handbags:
it wouldn't pick up handbags or sunglasses. It would pick up whatever is very prominent, but it wouldn't pick up these other things. So that was the second part: can we make sure it's actually deconstructing the look reliably? And then the hardest part is finding results via visual search that are very, very similar, because what users expect is the exact product, or a very similar product. Doing that is very hard. Consider a look where you have a shirt, a jacket, sunglasses, a skirt, and so forth,
and the shirt is partially occluded by the jacket; we can barely see it. So we get very few visual cues to retrieve similar-looking shirts, and we really have to do a good job of deconstructing the look, finding the shirt, and then retrieving other visually similar shirts. And even when we do retrieve visually similar products, users want them to be in their locale, in stock, quick to browse prices for, and from a retailer or merchant they trust. So there's a bunch of ranking and quality problems on top of that, to make sure all the pages we are showing
to users are high quality. That was the last piece of the puzzle. It took us really breaking the problem down into all these different pieces and nailing each and every one of them to get to the point where the product feels good. - By the way, if someone listening isn't as much into fashion but is into interior design, I noticed that the updated Circle to Search is also really useful for that. - For sure. The exact same logic applies to interior design. You come across a living room setup
that you really love: you love the couch, the coffee table, the accent chair, and so forth. Now you just circle it and say, find the look. Again, we break down each of those different parts of the living room and find you something very visually similar, but, again, something that's in stock near you, from a retailer that you trust, so you can actually go look at it or consider buying it. - Harsh,
what would be your top tip for anyone interested in trying Circle to Search now? What's one thing they definitely should do with the updated Circle to Search? - Push the product on the kinds of things you want to use it for. When we started, it was really, oh, you can search parts of the image, parts of something you're looking at. But more and more, where we're going is that it's not about a single thing; it's about the full context of your journey. With this update we're starting to let you ask a question about the full image, but soon enough we're also working on things where we allow users
to ask a question about the full content they're looking at. So, with your permission, you can have us look at the full PDF, if you're on a PDF, or the full webpage you're on, or the full video you're watching, and we can really search through that entire thing. So push the product, give us feedback, and we'll continue to strive to make sure it works best for all your needs. - Harsh, thank you so much for telling us about the updated Circle to Search, now available on select devices like our Pixel 10 series.
Go and try it out, enjoy it, and please come back next time. - Great, thank you, Rachid. - Thank you for listening to the Made by Google Podcast. Don't miss out on new episodes: subscribe now wherever you get your podcasts to be the first to listen.