iOS 16 vs. Android 13: Which smartphone OS is the best?

Here is how the latest mobile operating systems compare
Google launched Android 13 in August, but it isn’t the only big mobile OS update in 2022. Apple’s iOS 16 made its debut in early September. While Google packed the biggest changes in years into Android 12 and left only smaller refinements for Android 13, Apple has a few more interesting changes in store for iOS 16: a brand-new lock screen design, auto-updating notifications that keep you in the loop about deliveries, smart drag and drop, and more. Even though Android 13 focuses on iterative improvements, it still brings quite a few thoughtful enhancements, so Google doesn’t have to shy away from a comparison with iOS 16. There are per-app languages, thoughtful Material You theming improvements, better privacy protection, and much more.
With all that in mind, let’s dive into the key features of Android 13 and iOS 16.
Android 12 and, to a lesser extent, Android 13 stepped up the interface design game with Material You, the wallpaper-based theming engine that applies a fitting color palette to supported apps. While Apple hasn’t made any fundamental design changes to its platform since iOS 7 in 2013 and has focused on iterative improvements ever since, the company has some neat and unexpected upgrades in store for iOS 16. Among them is a brand-new lock screen that comes with widgets, quick-switch options, and Focus mode compatibility.
Apple’s iOS 16 lock screen is leaps and bounds better than Android 13’s at this point. While Android supported lock screen widgets a long time ago (removed with Android 5 Lollipop in 2014, to be exact), Google has lost interest in the concept, likely because widgets were increasingly treated as an afterthought. Apple changed how developers think about widgets when it introduced them to iOS in 2020, and two years later, the company is adding a selection of them to the lock screen. To do this in an elegant way, Apple leverages the work developers have already put into Apple Watch complications. That’s right: Apple’s lock screen widgets aren’t so much widgets as they are watch complications. This approach allows you to fit a lot of information into a relatively small space underneath the clock, and I’m all for it.
Apple hasn’t only introduced widgets to the lock screen; the company has also made it possible for the clock to be concealed by subjects or objects in the wallpaper, which makes for a more immersive, three-dimensional effect. Combine that with the option to change your preferred font and the wallpaper, and lock screens have never been as personal as they are on iOS 16. You can even have your iPhone automatically shuffle through different lock screens and wallpapers using Apple’s Focus mode. You can set this up so you only get notifications from work apps while you’re at work and only from personal apps when you’re home or out and about.
All these flashy new additions can’t conceal the fact that some people are not fans of Apple’s notification options. Persistent notifications aren’t turned on by default, and making iOS notifications work exactly to your liking can be a bit of a challenge.
Apple also can’t compete with Android 12’s and Android 13’s Material You theming. While theming your lock screen with custom fonts and widgets is fun, it can get tedious for those who just want to set it and forget it. Google’s theming engine is great for people like this. Material You takes the heavy theming work off your hands by automatically applying colors based on your wallpaper to app interfaces, making for a highly personalized experience in many parts of the phone, including the lock screen.
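For app developers, opting into this wallpaper-based recoloring is nearly a one-liner thanks to Google’s Material Components library. The following is a minimal sketch, assuming Material Components 1.5.0 or newer and a Material 3 theme; the Application subclass name is hypothetical:

    import android.app.Application
    import com.google.android.material.color.DynamicColors

    // Hypothetical Application subclass that opts the whole app into
    // Material You dynamic color. On Android 12+ the library pulls the
    // wallpaper-derived palette and recolors themed components; on older
    // versions the call is a no-op and the static theme colors remain.
    class MyApp : Application() {
        override fun onCreate() {
            super.onCreate()
            DynamicColors.applyToActivitiesIfAvailable(this)
        }
    }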
Android 13 is the first release on Google’s side to introduce official support for per-app language options. This makes it possible to use any app on your phone in the language you prefer, independent of your system language or the one set for other apps. The option is an absolute boon for those who speak more than one language, which is estimated to be around half of the world’s population.
Google gives you two intuitive ways to switch an app to another language. On Pixel phones and many others, you can long-press the app in question on your home screen, then tap the app info button (often displayed as just a small i in a circle) at the top of the popup. In the screen that opens, scroll down a little until you see a Language menu option. Tap it, and you can select whatever language you prefer. This is the fastest route when you want to adjust languages on the fly.
If you would rather go through a list once and set all your apps to your preferred languages, you can do that too. Go to your system settings, open System, then Languages & input, and tap the App languages entry. There, you will see a list of all supported apps and the languages you’ve assigned to them, and you can tap any of them to change your preferences.
As nicely as Google thought out this system, the company has still thrown in a huge roadblock. Even if an app already supports multiple languages, developers have to opt into per-app languages manually with a few lines of code. Right now, you might only have a small selection of supported apps for this feature on your phone running Android 13.
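To put that roadblock into perspective, the opt-in itself is small: an app lists its supported locales in an XML resource (commonly res/xml/locales_config.xml) and points to it from the manifest via android:localeConfig, and it can optionally offer an in-app language picker through AndroidX. Below is a minimal sketch of the runtime side, assuming AndroidX AppCompat 1.6.0 or newer; the helper function names are mine:

    import androidx.appcompat.app.AppCompatDelegate
    import androidx.core.os.LocaleListCompat

    // Hypothetical helper: switch this app to German at runtime. On
    // Android 13, the same choice also surfaces under Settings > System >
    // Languages & input > App languages once the app has opted in.
    fun switchAppToGerman() {
        AppCompatDelegate.setApplicationLocales(
            LocaleListCompat.forLanguageTags("de")
        )
    }

    // Hypothetical helper: read back the currently applied per-app locales.
    fun currentAppLocales(): LocaleListCompat =
        AppCompatDelegate.getApplicationLocales()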
You can work around this with an ADB command discovered by Mishaal Rahman, which drops the opt-in requirement and makes individual language switching possible for all apps. I haven’t run into any issues with this setup on my Pixel 6 and currently enjoy many of my apps in either English or German, but your mileage may vary, as the ADB command unlocks an experimental option. You might even experience app crashes.
Meanwhile, per-app language options aren’t new to iOS 16. Apple has supported the feature since 2019’s iOS 13, and it has also made the process much easier for developers than Google has. If an app is already localized into more than one language, the developer doesn’t need to add any extra code to their project for the per-app language switching option to show up in system settings.
On iPhones, changing the language is slightly more complicated than on Android 13, and you need to know where to look for the option. You can’t simply long-press the home screen icon and head to the language settings from there. Instead, you have to go to system settings and look for the app in question in the list toward the end. Tap it, and only then will you find the option to switch the language. There also isn’t a master list showing you which languages you’ve assigned to which apps like there is on Android 13, and this is something Apple hasn’t changed for iOS 16, either.
As you can see, neither of these solutions is perfect, though I would say that Apple’s approach is arguably still better, as developers don’t have to manually make any changes to support per-app languages.
Before we dive into the keyboard comparison, it deserves mentioning that pitting Apple and Google against each other here isn’t entirely fair. The iOS keyboard is part of the system in the Apple ecosystem, while Google’s Gboard is an app that can be updated via the Play Store at any time. This allows Google to iterate on features much faster than Apple, as it doesn’t have to push out a system update. Gboard is also available on virtually every Android device, not just Pixel phones.
With that out of the way, we can still see some big changes coming to both Google’s and Apple’s keyboards as part of system update announcements. For example, Gboard received powerful on-device dictation options as part of the Pixel 6 series launch and Android 12, and Apple has only now tried to follow suit with iOS 16. The way dictation works on the two platforms is thus now pretty similar, but there are still some key differences. Punctuation is added automatically on both, and it’s possible to seamlessly switch between voice input and typing when you need to make corrections or simply want to tweak what you said. You can even dictate a selection of common emoji on both platforms.
Apple’s iOS 16 dictation is a great step forward for the company and much more reliable than before, but the Cupertino company still can’t catch up with Google in that regard. With Gboard on the Pixel 6 and 6 Pro, it’s possible to use dictation to send and clear messages and undo your last bit of dictation, all with voice commands like “send,” “clear,” “clear all,” or “stop.” If you want to, you can even keep the microphone active indefinitely, allowing you to keep talking as you chat with someone without ever touching your phone.
Another topic tangential to the keyboard is the clipboard, and here, Google is also clearly winning with Android 13. Gboard has long been able to store the last few items you’ve copied for later use, including screenshots, but now, Android 13 also has a clipboard editor of its own built into the system. Whenever you copy text or an image, a small popup appears in the bottom-left corner. You can either share the clipboard contents with other apps right from there or edit them, which is helpful when you only want to share specific parts of something you copied, like the street in an address. That said, Universal Clipboard on iOS makes it easy to copy and paste between your iPhone, iPad, and Mac.
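For what it’s worth, the new overlay isn’t tied to Gboard; it appears whenever an app copies content through Android’s standard ClipboardManager. A minimal sketch of that ordinary copy call, with a hypothetical helper name, looks like this:

    import android.content.ClipData
    import android.content.ClipboardManager
    import android.content.Context

    // Hypothetical helper: copy plain text via the system clipboard. On
    // Android 13, calling setPrimaryClip() is what triggers the small
    // preview/editor popup in the bottom-left corner of the screen.
    fun copyText(context: Context, text: String) {
        val clipboard =
            context.getSystemService(Context.CLIPBOARD_SERVICE) as ClipboardManager
        clipboard.setPrimaryClip(ClipData.newPlainText("copied text", text))
    }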
Last but not least, Apple has finally added something to its iOS keyboard that we’ve had on Android since what feels like forever. If you want to, you can enable haptic feedback for every key press.
The media player hasn’t seen huge changes on either platform with their respective upgrades, but there are some things worth highlighting.
The Android 13 media player is a lot more visually appealing than its predecessor. Instead of simply pulling colors from your wallpaper like the rest of the interface does, the player now comes with the album art of the song you’re playing as a background. The play/pause button and other interface elements on the player pull their colors from the album cover, too, rather than the system color theme. On top of that, the progress bar now squiggles when you play music, making it visible at a glance that audio is currently playing. While you may find the design clashing with your general system theme at times, it’s a great way to showcase music and other media.
Apple’s redesign is much smaller, but the company has shuffled things around a bit, too. Instead of appearing at the top of the lock screen when music is playing, the media notification now shows up at the bottom — just like any other notification does on iOS 16. You can then tap the album art to see a bigger version of it right in the middle of your lock screen, complete with a background that pulls its colors from the album cover. Just like Android 13, iOS 16 also provides a little visualization when music is playing.
This new visual detail really only shines once you look at the new iPhone 14 Pro and Pro Max. When music is playing in the background, you can see the visualization and parts of the control interface at the top of the screen, rounding out the Dynamic Island interface that Apple has built around its new pill-shaped front camera cutout.
As good as Google is at image recognition thanks to all of the pioneering work in Google Photos, Apple is doing more than just playing catch-up in iOS 16. Apple’s new smart drag-and-drop Visual Look Up tool truly seems like it’s out of this world, and it is something we wished we had on Android the moment Apple introduced it. The feature allows you to pick a subject or object in any photo on your screen, tap and hold it, and then drag it into another app as a standalone cutout.
In our testing, the feature works surprisingly well. Apple’s algorithms are good at recognizing where an object ends and the background begins, and the cut-out areas are on point most of the time. While manual work in Photoshop will obviously still get you much better results, the process on iOS 16 is as seamless as it can get. It’s basically a shortcut for sending friends and family personalized stickers, which you would usually have to create in a tedious process on WhatsApp or Telegram.
Apple has also introduced the option to copy live text in videos. This allows you to pause a video, hold and select the text visible in the frame, and then copy or share it.
While Google hasn’t introduced any new features like these in Android 13, the company already offers similar capabilities, at least when it comes to text. When you enter the Recents overview, you can either tap and hold the text you would like to extract or tap the Select option in the bottom-right corner. Google was first here, but its feature isn’t as reliable as Apple’s. It will often treat Instagram photos with text in them as images you can only share as cropped screenshots rather than extract text from, and it doesn’t let you extract text from YouTube videos. DRM-protected apps like Netflix, Disney+, and Amazon Prime Video are left out of the equation entirely, as you can’t properly see their videos’ contents from the Recents screen (or rather, you can’t take a screenshot of the content, which is what Google’s text recognition relies on).
Apple first brought Focus Mode to iOS 15, and with iOS 16, the company is stepping things up considerably. The feature makes it possible to stop certain apps and people from reaching you and your notification shade at certain times of the day, so you can stay focused at work and relax at night, only hearing from friends and family. However, the boundaries aren’t so clear for all apps. You might use the same calendar and mail apps for both personal and business purposes, and some of the browsing on your phone might be work-related while the rest isn’t. That’s where iOS 16’s Focus filters come in.
Focus filters allow you to set boundaries within individual apps like the aforementioned calendar, mail, or browser apps. This makes it possible to use Focus Mode to hide certain parts of the apps from your eyes, like your work tab group when you’re out and about or your business appointments when you’re winding down at night. Third-party apps are also able to tap into the filters with a new API. Apple has additionally revamped the setup process for Focus Mode with the new iOS release, promising to make things simpler.
All that said, Apple’s solution feels incredibly complex and convoluted when you first set it up. You need to think hard about which apps you use for work and which you use in your free time, and which people are allowed to contact you under which circumstances. As powerful as Focus filters can be once you’ve set them up, this is a feature that feels so geared toward power users that you would rather expect it on Android than on iOS.
On Android 13, Google hasn’t changed anything about its own focus and wind-down feature set, all collected under the Digital Wellbeing umbrella. Google doesn’t bother with different focus modes for different occasions in the first place; it takes an approach that’s much simpler to grasp. The company only offers a single Focus mode, accessible under Digital Wellbeing in system settings, which lets you select distracting apps and turn them off on a schedule or manually. During that time, you can’t access blocked apps at all (unless you allow yourself a five-minute break). For everything else, you are encouraged to use Do Not Disturb, which silences all notifications and calls other than the exceptions you’ve set up. This solution may not be as granular as Apple’s, but it has the advantage of being less overwhelming and complicated.
These two Android features are relatively unchanged since 2020, so if they aren’t news to you, that’s the reason why.
Android and iOS used to be vastly different beasts, but the more both platforms mature, the more similar they get. In the end, both have their advantages and disadvantages, and over the years, they’ve looked at each other and copied what they liked. Android 13 and iOS 16 make this mature state extra obvious. Both new releases bring only select improvements to the table, quality-of-life touches that aren’t exactly core to the mobile experience anymore. There are only a few things one can really wish for on either platform, like better notifications on iOS or prettier apps built with more attention to detail on Android.
Manuel Vonau joined Android Police as a freelancer in 2019 and has worked his way up to become the publication’s Google Editor. He focuses on Android, Chrome, and other Google software products, the core of Android Police’s coverage. He is based in Berlin, Germany. Before joining Android Police, Manuel studied Media and Culture studies in Düsseldorf, finishing his university “career” with a master’s degree. This background gives him a unique perspective on the ever-evolving world of technology and its implications for society. He isn’t shy about digging into technical backgrounds and the nitty-gritty developer details, either. His first steps into the Android world were plagued by issues. After running into connectivity problems with the HTC One S, he quickly switched to a Nexus 4, which he considers his true first Android phone. Since then, he has mostly been faithful to the Google phone lineup, though these days, he also carries an iPhone in addition to his Pixel 6. This helps him gain perspective on the mobile industry at large and gives him multiple points of reference in his coverage. Outside of work, Manuel enjoys a good film or TV show, loves to travel, and can occasionally be found roaming one of Berlin’s many museums, cafés, cinemas, and restaurants.
