Five iOS 16 Features Android Phones Can Already Do

Every year when Google and Apple release their latest OS updates, there are always features inspired by the other side. Whether it’s a new customization, design, or accessibility feature, someone usually did it first. Here are the five big iOS 16 features Google did first and Android phones can already do.

Smarter lock screen

I’ll be the first to admit that the new lock screen in iOS 16 looks great. The way Apple uses the depth effect to add depth and realism to images and have them layer in front of the clock is genius. Having Apple Watch-style complications for things like health, activity, battery, and weather is handy, and you can change the font and the color of the clock and complications to match the wallpaper or a color of your choice. It’s all up to you.

Google did a lot of this first, though. Google has the At a Glance widget, which intelligently surfaces similar information by predicting what you will need. It always shows the weather and date, but it also smartly adds other information like upcoming events, severe storm warnings, or your boarding pass before a flight. This is arguably more powerful than what Apple offers, though you can’t always manually choose what it shows. The color of the clock can also change: it is pulled from the Material You color palette that matches your wallpaper, with four palette options on Android 12 and up to 12 options in Android 13.

A much smaller feature Apple added is Live Activities, which lets apps place a widget at the bottom of the lock screen with live information like sports scores or how far away your Uber is. This is broadly similar to Android’s ongoing notifications, which app developers have been able to use for years.
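For a sense of what that looks like on the Android side, here is a minimal sketch of an ongoing notification that keeps a live score pinned in the shade. The channel id, team names, and score values are made-up examples, and this is just one illustrative approach rather than a full equivalent of Live Activities.

```kotlin
import android.app.NotificationChannel
import android.app.NotificationManager
import android.content.Context
import androidx.core.app.NotificationCompat
import androidx.core.app.NotificationManagerCompat

// Posts a persistent "live score" notification. The channel id, teams, and
// scores are placeholder values; on Android 13+ the app also needs the
// POST_NOTIFICATIONS runtime permission before anything will show.
fun showLiveScore(context: Context, homeScore: Int, awayScore: Int) {
    val channelId = "live_scores"

    // Notification channels are required on Android 8.0 and newer.
    val channel = NotificationChannel(
        channelId, "Live scores", NotificationManager.IMPORTANCE_LOW
    )
    context.getSystemService(NotificationManager::class.java)
        .createNotificationChannel(channel)

    val notification = NotificationCompat.Builder(context, channelId)
        .setSmallIcon(android.R.drawable.ic_menu_info_details)
        .setContentTitle("Warriors vs. Celtics")
        .setContentText("$homeScore - $awayScore, 4th quarter")
        .setOngoing(true)        // keeps it pinned while the game is live
        .setOnlyAlertOnce(true)  // later score updates won't buzz again
        .build()

    NotificationManagerCompat.from(context).notify(1, notification)
}
```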


The new iOS 16 lock screen is great for iOS users. It looks good and works well, but it’s also something Android users have had in some form for years. iOS users are lucky to have it now, though it’s safe to say Apple was heavily inspired by Google.

Automatic sharing of photos

iCloud Shared Photo Library in iOS 16

In iOS 16, the Photos app can now automatically share your family photos to a shared library that you can all access. You can choose to include all photos taken after a certain date or everything in your library, and there is also a button in the Camera app that places new shots straight into the shared library. Everyone gets equal access: anyone in the group can add, edit, and delete photos, and everything is shared with everyone in the library.

Google Photos has been doing this for at least two years. Partner sharing, Google Photos’ equivalent, automatically shares photos that include a specific person. It has all the same features as Apple’s version, except that it is not limited to Apple products. And since Google Photos is web-based, you can upload photos from your DSLR on any computer and share those as well.

Setting up partner sharing in Google Photos

On top of that, Google also has automatic albums that you can share. These automatically add every photo you take of a specific person or pet to an album that can be shared with a link or directly through the app. You can also enable collaboration so that others can add their photos too; a whole group of friends can set it up so that everyone’s photos of each other land in the album and everyone can access them.


The Google feature has been around quite a bit longer and is still a bit more powerful than Apple’s. Fortunately for iOS users, you can simply download the Google Photos app on your iPhone to get these features today instead of waiting for iOS 16 to be released.

Smarter dictation with punctuation and user interaction

Dictation in iOS 16 now lets you edit and interact with text as you dictate. You can tap to place the cursor, remove words, and simply tell the phone what you want it to do. It also adds punctuation automatically.

These dictation features are almost a direct clone of Assistant Voice Typing from the Pixel 6 and 6 Pro, which has the same kind of tools for interacting with text as you dictate, controlling what you’ve already typed by voice, and adding proper punctuation.

Having used both iOS 16 and Assistant Voice Typing, Google still has a clear lead with this feature. iOS 16 likes to put punctuation where it shouldn’t and still struggles to understand me correctly. This is only the first beta of iOS 16, though, so it will likely improve.
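There is no public API for the Pixel’s Assistant Voice Typing engine itself, but for a rough idea of how live dictation with partial results works at the platform level, here is a minimal sketch using Android’s generic SpeechRecognizer API. It only covers the real-time transcription piece, not the voice editing or automatic punctuation described above.

```kotlin
import android.content.Context
import android.content.Intent
import android.os.Bundle
import android.speech.RecognitionListener
import android.speech.RecognizerIntent
import android.speech.SpeechRecognizer

// Streams partial transcription results while the user is still speaking.
// Requires the RECORD_AUDIO permission; this is the generic platform API,
// not the Pixel-specific Assistant Voice Typing engine.
fun startLiveDictation(context: Context, onText: (String) -> Unit): SpeechRecognizer {
    val recognizer = SpeechRecognizer.createSpeechRecognizer(context)

    recognizer.setRecognitionListener(object : RecognitionListener {
        override fun onPartialResults(partialResults: Bundle?) {
            // Update the text field live as words are recognized.
            partialResults?.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                ?.firstOrNull()?.let(onText)
        }

        override fun onResults(results: Bundle?) {
            // Final transcript once the user stops talking.
            results?.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                ?.firstOrNull()?.let(onText)
        }

        override fun onError(error: Int) { /* handle errors, e.g. no match */ }

        // Remaining callbacks are not needed for this sketch.
        override fun onReadyForSpeech(params: Bundle?) {}
        override fun onBeginningOfSpeech() {}
        override fun onRmsChanged(rmsdB: Float) {}
        override fun onBufferReceived(buffer: ByteArray?) {}
        override fun onEndOfSpeech() {}
        override fun onEvent(eventType: Int, params: Bundle?) {}
    })

    val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
        putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                 RecognizerIntent.LANGUAGE_MODEL_FREE_FORM)
        putExtra(RecognizerIntent.EXTRA_PARTIAL_RESULTS, true)
    }
    recognizer.startListening(intent)
    return recognizer
}
```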

Multiple stops in maps

Apple Maps now supports adding up to 15 stops along a route. This seemingly simple feature has been in Google Maps for years at this point. The only real difference is that Apple Maps supports up to 15 stops while Google Maps tops out at 10. If you want multiple stops on iOS today, you can always download the Google Maps app on your iPhone.
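For anyone curious how a multi-stop route can be handed to Google Maps programmatically, here is a small sketch that builds a directions link with the documented Maps URLs parameters (origin, destination, waypoints) and opens it in the Android app. The place names are arbitrary examples.

```kotlin
import android.content.Context
import android.content.Intent
import android.net.Uri

// Opens Google Maps with a driving route that passes through two extra stops.
// The addresses are placeholder examples; waypoints are pipe-separated.
fun openMultiStopRoute(context: Context) {
    val uri = Uri.parse("https://www.google.com/maps/dir/?api=1").buildUpon()
        .appendQueryParameter("origin", "San Francisco, CA")
        .appendQueryParameter("destination", "Santa Rosa, CA")
        .appendQueryParameter("waypoints", "Golden Gate Bridge|Muir Woods")
        .appendQueryParameter("travelmode", "driving")
        .build()

    val intent = Intent(Intent.ACTION_VIEW, uri).apply {
        // Prefer the Google Maps app when it is installed; without this
        // package hint the same link would open in a browser instead.
        setPackage("com.google.android.apps.maps")
    }
    context.startActivity(intent)
}
```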


Live Captions

Live Caption was introduced at Google I/O 2019 and uses Google’s speech recognition to add captions to content on your phone that doesn’t already have closed captions. It works in real time and generates captions for any audio, with the exception of phone calls. In March of this year, Google announced support for phone calls as well.

iOS 16 brings essentially the same feature. It captions audio in real time from any app, including phone calls and FaceTime, and the user interface looks nearly identical. After a quick test, it seems to be a little slower than Google’s version and not quite as accurate.
