Google Adds Generative AI to Search

An AI Snapshot: Google's Search Generative Experience at I/O

Google is moving quickly to add ChatGPT-like features to search, but whether users will find them useful remains to be seen. Product searches, for instance, synthesized material from different reviews, but it was not immediately obvious how the brief summaries might improve the search experience.

The update to Search is called AI snapshot. If you opt in to the Search Generative Experience, you'll see AI-generated answers at the very top of your search results for some queries, which can provide more context for your search. You can ask follow-up questions to narrow down the information in the snapshot.

“The technology is very early on, it has its own challenges, and we will make some mistakes; it’s something we will have to contend with for the foreseeable future,” says Liz Reid, vice president of search at Google, who gave WIRED a preview of the new features.

ChatGPT is powered by a machine learning model trained to predict the words likely to follow a string of text by digesting huge amounts of text, including vast numbers of web pages. Additional training, provided by humans rating the quality of the bot’s responses, made ChatGPT more adept at answering questions and holding a conversation.
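To make the next-word-prediction idea concrete, here is a minimal sketch using the open GPT-2 model through Hugging Face’s transformers library. GPT-2 stands in purely for illustration; ChatGPT’s own model is far larger and further tuned with human feedback.

```python
# A minimal sketch of next-token prediction, the core mechanism behind
# chatbots like ChatGPT. GPT-2 is used here only as a small, open stand-in.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "Google announced new AI features for Search at its I/O"
# The model repeatedly predicts the most likely next tokens given the prompt.
result = generator(prompt, max_new_tokens=20, num_return_sequences=1)
print(result[0]["generated_text"])
```

The extra training step the article mentions, humans rating responses, is what turns this kind of raw text predictor into a model that answers questions and holds a conversation.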

During the I/O keynote, Google also announced a new name for the suite of AI tools it’s bringing to Docs, Sheets, Slides, Meet, and Gmail: Duet AI. While these Workspace features let you do things like compose an email or generate images from text in Slides, they’re currently only available to those who sign up for the waitlist.

With the new editing feature, you can make significant changes to a photo, like enhancing the sky, moving a person or object, and removing people from the background. It will be available to select users later in the year.

Generative AI Everywhere: Pixel, Android, and More from the I/O Developer Conference

Google must’ve figured out that a huge number of users are appending “Reddit” to their searches because it’s rolling out a new Perspectives feature that sources answers from Reddit, Stack Overflow, YouTube, personal blogs, and other sites.

Google made a lot of announcements of new devices and features coming to existing software tools during the keynote address of today's I/O developer conference. The company leaned hard into generative AI, loudly characterizing itself as a decades-long leader in the field. It also gleefully put AI at the forefront of nearly every service and device it operates, including the new Pixel phones and tablet it unveiled today.

The Pixel 7A is the newest phone in the A-series lineup, and it features a 6.1-inch display with a 90Hz refresh rate. The base version of the phone costs $499, but some carriers will offer a pricier $549 option that supports millimeter wave (mmWave).

Even if you weren’t on the waitlist, you’ll now be able to use Bard. The company is adding support for Japanese and Korean as well as the ability to export text to other online destinations, such as Google Docs and Gmail.

There are new dark mode and visual search features as well, and Google plans on adding even more functionality in the future. That includes AI image generation powered by Adobe’s Firefly, as well as integrations with third-party services like OpenTable and Instacart.

AI isn’t just coming for Google Search. Google has announced that it’s also bringing new AI-powered features to Android 14. One of these features, called Magic Compose, will live within Android’s Messages app and give you the ability to reply to texts using responses suggested by AI.

Next month, Google will launch a feature that will let you change the wallpaper on your device with the help of artificial intelligence. Instead of picking from a set of premade options, you’ll be able to describe an image, which your device will then generate using Google’s text-to-image diffusion model.

Wear OS 4, a Redesigned Google Home App, the Pixel Tablet, and a New A-Series Pixel 7A

In addition, Wear OS 4 has integrations that will give you better control over your home’s lighting, media, and camera notifications, all from your watch. Wear OS 4 is only available in a developer preview and emulator for now, with full availability coming later this year.

The redesigned Google Home app is no longer in an invite-only public preview and will become available to everyone starting Thursday, May 11th. The overhauled app comes with some major improvements, including a better camera interface, a new Favorites tab, and support for a ton of new device types.

When you want to use the Pixel Tablet, just pop it off its dock and it becomes a normal Android tablet, except a bit better, because more than 50 apps have been optimized for the larger screen. It’s powered by the Tensor G2 chipset and has many of the same software features as other Pixel devices. Sadly, there are no first-party accessories beyond the dock: no stylus and no keyboard. You can use it with third-party accessories, but it’s clear this tablet is going to be a homebody.

Every year, Google announces an A-series version of the flagship Pixel that came before it. This year’s Pixel 7A is a little pricier ($499) than last year’s model, but you get a few more high-end perks, like a 90Hz screen refresh rate and wireless charging support. The cameras are also completely new, with a 64-megapixel sensor leading the pack. You can read more about it in our review. It’s available to order today, and if you buy now you’ll also get a free case and a $100 accessory.

Android Auto: Google’s Answer to CarPlay, Shared in I/O Side Briefings

Google is also bringing the generative features of its Bard chatbot directly into Android messaging, with settings that let you ask questions right in the chat box and adjust the style of your messages to match different tones.

Magic Eraser will be getting an update later this year. The tool will now be known as Magic Editor, and it works like a mobile version of Adobe’s Photoshop. Users can change nearly every element of a photo, including adjusting lighting, removing unwanted foreground elements like backpack straps, and even moving the subject of the photo to other parts of the frame.

The Matter smart home standard didn’t get the most attention during the I/O keynote, but we were told in briefings that you will be able to control Matter devices in the Google Home app from your mobile device in just a few weeks, and any member of the family will be able to access those controls. As they say, if you can’t beat ’em, join ’em … in putting a Matter sticker on every home appliance.

In something of a plodding reply to Apple’s startling plans for CarPlay 2, announced in June of last year, Google’s Android Auto team finally has news to share. It wasn’t during the I/O keynote, but it came in side briefings before the show.
