Everything Google Announced Today: Android, AI, Holograms

Of course, Google isn’t the only company that wants to manage your passwords for you. We have a list of excellent options in our password manager guide—including some advice about why in-browser options like Google’s are more limited.

New Tools for Remote Work

If you’ve been lucky enough to have a job that’s allowed you to work from home for the past 14 months, you’re probably used to living your work life in the cloud. Google’s new remote working tools aim to make that a little easier. Smart Canvas is a project management tool that lets multiple users work together across different document types. They can keep track of progress with checklist items tagged to specific dates and people, and brainstorm ideas live in one place.

Google Meet, the video chat platform, will soon be integrated directly into Google Docs, Sheets, and Slides. You’ll be able to click the little Meet button in the top corner, and collaborators can pop up on video in a column alongside the doc to argue about what gets edited. A new Companion Mode in Meet is meant to display members of a team in more equally placed tiles, along with better noise cancellation and automatic visual tweaks to zoom and lighting to make all participant videos more visually consistent. For anyone watching who needs captions, those can be turned on using live transcription, or even translated into one of Google’s supported languages.

Improved Natural Language Skills

Google showed off some new AI-powered conversational capabilities that will eventually turn up in products that use Google Assistant. First, it’s developed a new conversational model called LaMDA that can hold a conversation with you, either typed or spoken, about any topic you’re curious about. The AI will look up information about the topic while you’re talking, and then enhance the conversation in a natural way by weaving facts and contextual info into its answers. What we saw on Tuesday was just a controlled demo, but the LaMDA model really does look like it could make conversations with a computer feel even more human.

There’s another natural-language processing model headed to Google’s Search tools. Dubbed the Multitask Unified Model, or MUM, the feature is intended, Google says, to make sense of longer, multi-pronged questions submitted by users. In theory, you could ask it to compare different vacation locations, or tell you what kind of gear you’ll need to bring on a hike. It can trawl websites in other languages and translate them to yours, so the most pertinent info isn’t locked behind a language barrier.

These enhancements are part of Google’s larger effort to understand the meaning and context of questions in the way a human might. That said, Google says the features are still in the experimental phase, so it’ll be a while before the Assistant starts making decisions about any pod bay doors.

More Detailed Maps

Google is tweaking bits of its Maps app in an effort to offer users more real-time information. When you’re asking for directions, Google will present an option for “eco-friendly routes” that factor in distance and road or traffic conditions to find a more fuel-efficient way to get where you’re going. A “safer routing” feature in Maps can analyze road lanes and traffic patterns to help you avoid what it calls “hard braking moments,” when traffic slows down unexpectedly.

If you’re walking around, there are also improvements to Google’s AR mode, Live View, that help contextualize where you are by analyzing street signs and providing information like “busyness” levels for whole neighborhoods instead of just specific restaurants and shops. Live View also now works indoors, so you can see that contextual info inside a train station or a mall. The main Maps tool will also tailor what it shows you to the time of day and your location. Open Maps in the morning and you’ll see pins for breakfast options. Open Maps in a city you’ve never visited and you’ll see tourist spots and popular attractions.

Don’t Forget About Shopping

In an effort to make you even more likely to buy stuff on the internet, Google has tweaked some of its shopping tools. You can now use Google Lens to search images in screenshots taken on your phone and link third-party memberships directly to your Google account. Also, the days when you could idly add a 5-pound bag of gummy bears to your shopping cart and then forget about it are gone. Now, whenever you open up a new tab in Chrome, Google will show you all of the pending purchases you have sitting in shopping carts around the web.

Google also announced a Shopify integration feature, which will let sellers who use Shopify make their products appear across search, Maps, images, Google Lens, and YouTube.


