
Google recently launched Search Live, a new feature for its mobile app on both iOS and Android.
Part of Google's ongoing AI Mode experiment in Google Labs, Search Live is currently available only to users in the United States, but it could significantly change how people engage with Google Search.
Search Live lets users hold voice conversations with Google Search, offering a more natural, back-and-forth way to search the web.
Because it supports follow-up questions and requires no typing, it is well suited to multitasking and searching on the go.
Users start a voice conversation by tapping the new Live icon in the Google app; the AI assistant replies aloud and surfaces links to relevant websites.
One of Search Live's main advantages is that conversations continue even as users switch between apps, which makes multitasking easier.
The Transcript feature lets users switch between voice and text mid-conversation, and the AI Mode history lets them return to earlier queries and pick up where they left off.
Together, these features are meant to make searching smoother and more efficient.
The launch has, however, prompted questions about why Google is offering a separate service whose features closely resemble Gemini Live.
Google says Search Live uses a customised version of Gemini with "advanced voice capabilities."
Even so, the core features of Search Live and Gemini Live overlap considerably, leaving it unclear why two seemingly redundant services are needed.
In the coming months, Google also plans to add camera support to Search Live so users can show the AI what they are looking at, a capability already available in Gemini Live.