Hands-on with iOS 17, iPadOS 17, and macOS Sonoma

Bigger doesn’t always mean better

Eshu Marneedi
68 min read · Aug 7, 2023
Image: Apple

As much as the new Apple Vision Pro headset took the limelight at WWDC23 in June, Apple also announced a bunch of new, really good software during the opening keynote. Apple’s OS releases look great this year, and I’ve had plenty of time to dig into the betas and discover some cool features. This isn’t an event recap — it’s a deep dive into the developer previews and the new features coming to the masses later this year. Here are my yearly thoughts on the future of Apple software.

Multi-platform enhancements

For a couple of years now (at least since the Apple Silicon transition in 2020), Apple has focused on bringing most of its new features to both iOS and macOS, which is nice. I’m happy to report the same is true of WWDC 2023. Here are most of the new OS features announced this year.

I’ll begin with the highlight feature of the OS platforms this year: the new-and-improved keyboard, with better autocorrect and a revised design. In iOS 17, iPadOS 17, and macOS 14 Sonoma, the caret — that is, the text cursor — now takes on the current app’s accent color. For example, it’s yellow in Notes. It also has a smoother animation to make the whole thing feel… more futuristic or something.

Feature-wise, Apple’s SVP of software engineering, Craig Federighi, touted autocorrect as being powered by a built-in, custom transformer model tailored to each user, which syncs across devices via iCloud to remember your vocabulary. Hopefully this alleviates those weird autocorrect instances that occur because the system doesn’t recognize a name. In case of a bad auto-correction, reverting to the original word is now easier: tap the word autocorrect changed, then tap the old word in the little bubble. This feature did exist previously, but it was hard to get to and finicky even when it worked. The system will also prompt you with other words that share the same root in case you’d like to change the word. Again, this has been around for ages, but it’s now easier to get to.

The most notable addition, however, is inline predictions. Once again powered by “on-device machine learning,” inline predictions light up the text field with what the system thinks you’d like to say as you type, à la Gmail. On macOS, you press the Tab key to finish the sentence (or word, depending on how confident the system is), and on iOS, you press the space bar. This works really well in practice for common phrases like “I’ll see you later” — as soon as you start typing “I’ll see you…”, it prompts you with a suggestion to finish the sentence. It doesn’t always work, though, and if you accidentally follow through with a prediction, you have to manually delete the text it entered for you. This is beta 1, so this could change, and I hope it does.

Autocorrect, meanwhile, has improved drastically in my experience with the betas. Previously, iPhone autocorrect was notorious (to the point where it became a meme) for misspelling and mis-correcting words. Now, it’s much more aware of the sentence you’re typing, the way that you type, and commonly agreed-upon grammatical rules. I’m a big fan and think this is one of the most underrated features of iOS 17 and macOS Sonoma.
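For developers curious about the plumbing: iOS 17 adds a per-field text-input trait for inline predictions, so apps can opt individual fields in or out. A minimal sketch, assuming a plain UIKit view controller (the controller and field names here are hypothetical):

```swift
import UIKit

// A minimal sketch of the iOS 17 trait controlling inline predictions.
final class ComposeViewController: UIViewController {
    private let bodyView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        bodyView.frame = view.bounds

        // Explicitly opt this field into inline predictions; `.default`
        // lets the system decide, and `.no` opts out (e.g., code fields).
        bodyView.inlinePredictionType = .yes

        view.addSubview(bodyView)
    }
}
```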

Some other small, but notable improvements to the keyboard:

  • The text selector is now easier to grab, appears thicker, and is nicer to use on iOS.
  • The magnifying glass that made a return in iOS 15 is now more accurate for text selection.
  • When a word is auto-corrected, the system underlines it for a couple of seconds (again, in the app’s accent color) to let you know it’s been corrected. Finally, that “sorry, autocorrect” excuse is dead. Proofread your texts, people!
  • On macOS Sonoma and iPadOS 17, a Caps Lock indicator appears next to the caret when Caps Lock is turned on. It’s a bit annoying, and there’s no way to turn it off. I know Caps Lock is on; all of my words are capitalized. Not everything is a password field.
Really, I don’t need to know that caps lock is on. // Screenshot: Eshu Marneedi

I give the keyboard a huge thumbs-up this year. These are much-needed, tangible improvements.

The new keyboard features. // Screenshots: Eshu Marneedi

Interactive widgets took a backseat during the keynote, but I think they’re pretty important. I’ve wanted (and expected) this feature ever since the new SwiftUI-based widget system was born in iOS 14 — it’s finally here, and I’m psyched. Using the AppIntents framework, developers can now make their widgets interactive across platforms. When you tap a part of a widget (or the whole widget, depending on its size), the app can perform an action — it no longer has to just open the app to a particular screen. This is similar to what the Shortcuts widget did in previous versions of iOS, but much, much better, since third-party developers can now take advantage of APIs that were previously private. In fact, Apple is putting these new APIs to work itself — the most notable addition is a HomeKit widget, which I’ve been begging for since iOS 14. You just tap the accessory you want to control, and it performs the action. It’s… brilliant.
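To give a sense of how little glue this takes, here’s a sketch of the pattern: a widget button wired to an AppIntent. The intent, view, and `LampStore` model are all hypothetical stand-ins, but `Button(intent:)` is the real iOS 17 hook that makes widget taps run code instead of launching the app.

```swift
import AppIntents
import SwiftUI
import WidgetKit

// Stand-in for the app's real model layer.
final class LampStore {
    static let shared = LampStore()
    private(set) var isOn = false
    func toggle() { isOn.toggle() }
}

// Hypothetical intent: tapping the widget button runs perform()
// in the background, without opening the app.
struct ToggleLampIntent: AppIntent {
    static var title: LocalizedStringResource = "Toggle Lamp"

    func perform() async throws -> some IntentResult {
        LampStore.shared.toggle()
        return .result() // WidgetKit refreshes the widget afterward
    }
}

struct LampWidgetView: View {
    let isOn: Bool

    var body: some View {
        // Button(intent:) is what makes the widget interactive.
        Button(intent: ToggleLampIntent()) {
            Label(isOn ? "On" : "Off",
                  systemImage: isOn ? "lightbulb.fill" : "lightbulb")
        }
    }
}
```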

Third-party developers have gotten to work, too. The developer of Timery for Toggl, Joe Hribar, is already working on interactive widgets, and I can’t wait for more apps to adopt them — pause and play podcasts, scroll through the weather, and so on. One important thing to note is that interactive widgets aren’t “mini-apps” or Live Activities. The idea is that you can control specific actions within an app from a portal outside of it. Interactive widgets won’t change depending on what you tap — there can’t be a whole navigation structure within a widget; you’d have to jump to the real app for that. They’re meant to supplement an app (play recently played music), not replace it (search for a new song). It’s also worth noting that interactive widgets aren’t available on the Lock Screen, which is a shame. I would have loved HomeKit widgets on the Lock Screen, or better yet, Shortcuts widgets. That seems like a logical place to put them, but I digress. This is a game-changing feature that Android beat Apple to — it levels up the experience of using widgets on your devices by an infinite amount, and I’m excited to see what new kinds of apps will be made with this. I know the Widgetsmith developer, David Smith, must be busy. Until the new betas become public, however, we won’t be able to see most of this work. I’m still happy with my HomeKit widgets, though. That alone makes iOS 17 worth it.

The new interactive widgets, a new Clock widget, and a new Safari Reading List widget. // Screenshots: Eshu Marneedi

Safari was, unsurprisingly, a vital part of the keynote. And that’s a good thing — Safari is the best browser on Apple platforms, full stop. This year, Safari gets some small, life-changing enhancements I’ve enjoyed over the past two weeks. One of my biggest pet peeves with Safari, specifically on iOS, was that suggestions appeared truncated in the autocomplete view. It was such a dumb bug, and it has finally been fixed. Suggestions in Safari 17 get a visual update, are much faster, and are more relevant to your search, and thank goodness for that. Also, when typing a URL, autocompletion feels faster and more accurate in my testing. The headlining new cross-platform feature, however, is Safari profiles: you can have multiple profiles in Safari (e.g., work, home, school), each maintaining separate history, extensions, cookies, and other browsing data. I can see this being super handy for people who don’t want their employer to know what they’re searching in their free time — it’s like having multiple copies of Safari on the same device. It’s a really, really cool feature, and though I won’t use it personally, I think it’ll be useful for many. Also, Private Browsing (incognito, for Chrome users) is now locked behind Face ID/Touch ID/Optic ID to prevent people from snooping on your private tabs. This should’ve been done years ago. These are some much-needed quality-of-life enhancements to Safari, and they don’t go unappreciated.

The new Safari auto-completion interface, new private browsing locking, and Safari Profiles. // Screenshots: Eshu Marneedi

The Authentication Experience team at Apple also put in the work this year. For one, OTP codes from Apple Mail now autofill in Safari. This feature works quite well in my experience (even with Gmail, which doesn’t support push), but I do wish there were an API for third-party email apps to hook into. Perhaps that’s coming in iOS 18 and macOS Rancho Cucamonga. And a new “auto cleanup” feature automatically discards emails and text messages with OTP codes. I love the people who made this. Additionally, in a continued effort to Sherlock the Electron-based 1Password, you can now share passwords using Family Sharing. Finally! But that isn’t even the best of this year’s enhancements — if you have info in Safari’s AutoFill, you can fill that data into PDFs across the system (in Notes, Preview, etc.). This is super, super handy — while I haven’t had any PDFs to fill out over the past couple of weeks, I’ll be using this all the time. I’m sure it works well.

Apple’s continued focus on its suite of productivity apps is also appreciated. Notes, Reminders, and Freeform all get small, but nice enhancements:

  • Notes now supports deep links to other notes. This is a feature I, and many others, have wanted for a while, and it’s finally here. You can link directly to another note from within a note, say, to add additional information (I’m doing exactly that right now to write this article). PDFs also gain proper support in Notes — Apple has essentially brought desktop Preview into the Notes app, and it works amazingly. They spent quite a long while discussing it during the keynote, and for good reason.
  • Reminders now automatically sorts grocery lists by category. For example, milk and cheese go in the dairy section, apples and tomatoes go in the produce section, etc. Once again, Reminders is the best grocery list app, and this makes it even better. It does mess up sometimes, but for the most part, it worked well in my testing.
Reminders auto-sorting for grocery lists. // Screenshots: Eshu Marneedi
  • Freeform, through PencilKit, gets a couple of new pen tools, as well as the same PDF support Notes got (a quick sketch of the new inks follows this list).
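Those new pen tools appear to come straight from PencilKit, so any PKCanvasView-based app should pick them up. A quick sketch, assuming the new InkType cases the framework gained this year:

```swift
import PencilKit

// Watercolor and fountain pen join the existing pen/pencil/marker inks
// (crayon and monoline are the other new cases, as I understand it).
let canvas = PKCanvasView(frame: .zero)
canvas.tool = PKInkingTool(.watercolor, color: .systemTeal, width: 12)

// Switching tools is just swapping the ink type:
canvas.tool = PKInkingTool(.fountainPen, color: .black, width: 4)
```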

The most prominent productivity announcement, however, is the addition of a new, heavily-rumored app: Journal. Journal is a new journaling app coming “later this year” (I reckon the launch will go the way of Freeform) made by Apple, essentially Sherlocking Day One. The app seems pretty basic and self-explanatory — you add new entries every day, and it’s all “private and secure” because “privacy is a fundamental human right,” yada-yada-yada. It supports adding images, links, drawings, etc., like you’d expect from an Apple app. The highlight of the app, which was leaked by Mark Gurman a couple of weeks ago, is that Journal can look into your other Apple apps and suggest moments for you to journal about. Some examples Apple highlighted during the keynote: if you took a bunch of pictures at a certain place and with certain people, the Journal app will ask you if you’d like to reference those shots in your journal entry. If you exercise with your Watch on while listening to a song, that can also count as a “moment.” You can add an entry for each moment, or reference moments in an entry. I think this is pretty slick and will be the “selling point” (as far as we know, this will be a system app with no paywall) of this app. No other app could do this because there’s no API for apps to see what’s happening in other apps, and even if there were, it wouldn’t be as good as Apple’s implementation, because, again, privacy. I don’t journal, but if I did, this would be my app of choice.

What didn’t come to fruition, however, was a rumored feature where the phone would’ve been able to detect you were around other people and remind you about that later in the day. This was a bad idea and was most likely some dumb prototype that was toyed around with during development. I’m glad it was scrapped — it seems like a privacy and logistical nightmare. That’s probably the reason why it was canned! We only got a sneak peek of this app during the keynote, so we have no idea how powerful it might be — nor how capable of a Day One replacement it might be — but it seems well-designed and well-built. People who journal care a lot about their journaling apps. Once again, I’m not able to try this app out right from the get-go — it seems like Apple only built a prototype of it to show off on stage, which is disappointing.

The unreleased Journal app. // Screenshot: Apple

iMessage took up a considerable part of the keynote, and rightfully so. Apple announced a crap-ton of new features and visual enhancements to what is probably most people’s most-used app on their phones. But I’m careful and intentional with my word choice here — these are new iMessage features, not new Messages features. Apple, once again, has not adopted RCS this year. Why it won’t is simply bewildering to me. We’re not asking Apple to kill iMessage for RCS — we’re asking Apple to move beyond the SMS standard. Apple has made improvements to the way SMS messages are handled, though — reactions now appear properly, and group chats aren’t a mess anymore. But this doesn’t solve the underlying, massive problem: texting iPhone users as an Android user is a pain. Attachments don’t send correctly, messages aren’t reliable, and there’s a character limit. Fixing these problems would make the experience better for iPhone users, too. Think all you want, hate all you want — Apple adopting RCS would make people’s lives safer and better. I’m not in total support of Google owning the pipeline here — I’m not too fond of its cringey ad campaigns, nor do I like Hiroshi Lockheimer’s constant ragging on Apple. But Apple’s insistence on locking down the iMessage ecosystem is anti-consumer. With that, let’s get to the features.

iMessage apps get a facelift on iOS (they look the same on macOS) for the first time since they were introduced in iOS 10. I don’t think I know anyone who seriously uses iMessage apps the way Apple thinks people do, and Apple has now hidden them behind a weird-looking menu. The menu smells like… Windows design; it doesn’t seem like an iOS menu. I don’t like it, and I hope (and think) Apple will change it in future betas. The system apps also have new icons that harken back to the old iOS days — they’re 3D-ish and look like they’re made out of real materials. Perhaps this is just the future of iOS design? I’ll get to that in a bit.

Regardless, there are also a couple of new system iMessage apps, the first of which is called Check In — Apple touted it a ton during the keynote. The idea is that you can start a session to notify a family member or friend when you arrive home safely. If you’re not home by the ETA Apple Maps estimates, your friend is notified that you haven’t arrived yet. This is a very clever feature, and I’m delighted that Apple has implemented it as a system feature. What’s even better is that this shows up as a Live Activity for your friend, so they can check up on you as you make your way home. If you’re lost, a map shows up for them so that they can find you. It’s immaculately and thoughtfully designed, and while I myself have no use for this feature, I’m positive it will be a life-changer for many people. Kudos to everyone who worked on it. It’s worth noting that both parties need to be on iOS 17, though your friend can also use the feature on macOS Sonoma or iPadOS 17. Apple OS adoption is so good that there’s no doubt this feature will be covered ad nauseam on the news when it ships in the fall — and I’m here for it.

Speaking of location-aware features, there’s also a new Location iMessage app that lets you easily share your location with friends and view your friends’ locations in-line. No more hunting for them in the Find My app — you can share, view, and stop sharing locations directly within iMessage. A nifty feature, for sure. Both parties don’t have to be on iOS 17 for this one, so I’ve tested it. Works as expected! [Images omitted for privacy reasons, of course.]

The last new iMessage app is a sticker drawer where all your stickers live across apps. Your recently-used ones also sync between devices, which is cool, I guess. The nifty new feature is yet again a result of Sherlocking — Apple has taken the idea from the Sticker Drop app: you can take images of your friends and turn them into stickers using the background detection and removal feature introduced in iOS 16. It works exactly as well as Sticker Drop ever did (because duh, it’s the same tech), but more seamlessly, because it’s integrated with the system. And yes, this feature is also available on macOS Sonoma — Apple has put care into ensuring all of these features come to all platforms, and I’m enjoying it. Mac users shouldn’t feel like second-class citizens, and I’ll dig into this more during the macOS section of this hands-on. (Spoiler alert: Apple has done a great job with macOS this year, and I have renewed hope regarding the platform.) One thing Quinn Nelson of Snazzy Labs notes: friends who aren’t on iOS 17 won’t be able to see these iMessage stickers, even though they’re just… normal stickers. It’s a good thing I don’t have friends. Apple also says that you’ll be able to use stickers in more places, like FaceTime, Mail, and Freeform. There’s also an API for developers to bring in stickers users create in iMessage and other apps, since this sticker drawer is available system-wide via the keyboard, replacing the Memoji section of yore. Apple highlighted Snapchat as a future adopter.

Some other minor, but nice-to-have iMessage trinkets:

  • Audio messages are now transcribed in-line, similar to voicemails. I have absolutely no idea why this took so long, but boy, am I glad it exists. It’s pretty accurate in my testing, since it doesn’t have to deal with phone-call compression — good enough in a pinch.
  • A new grabber will show up when you’re behind in a chat, so you can jump to the last message you’ve read.
  • You can now use filters when searching in iMessage similar to how you can in Mail (for example, filter by person, chat, etc.). The search UI also makes more sense — messages are broken down by chat thread, and there are separate sections for photos and other attachments. Overdue, and handy for people like me who never delete their iMessage chats.

Once again, iMessage is a very crucial part of the iPhone and Apple experience. People buy iPhones for iMessage. They use Macs instead of Windows machines for iMessage. That doesn’t have to change — what does have to change is Apple’s insistence on making the messaging experience worse for its own users. I don’t own an Android phone; I own an iPhone. And every time I text an Android phone, it’s a bad experience for me, an Apple customer. These new features are pretty awesome (especially Check In), but Apple needs to get better at adopting universal standards. Its “Apple Knows Best” mentality isn’t going to work here for much longer.

The new Messages features. // Screenshots: Eshu Marneedi

Speaking of things that just don’t work: Siri and Spotlight. Mainly Siri, but Spotlight too, in recent history. Siri doesn’t gain any new “intelligence” features this year, which is a shame in the age of LLM-powered generative AI chatbots “taking over the world” (pfft). Instead, Apple has removed the “Hey” part of “Hey Siri.” This is a really annoying “feature,” and I’ve changed it back to listening only for “Hey Siri” for now — I simply rag on Siri way too much in normal day-to-day speech for it to be of use to me. It’s flawless in execution, though: every time I say “Siri,” my phone lights up. It’s so good that Apple itself had to watch how often it said “Siri” during the WWDC opening keynote to prevent HomePods from going off worldwide. One other item of interest — the $550 flagship AirPods Max don’t support this feature at launch. I truly have no idea why this (seemingly arbitrary?) limitation exists on Apple’s “most advanced” pair of headphones. Even updating the AirPods firmware to a beta yielded the same annoying error notification. Truly bizarre.

Siri is just… Siri now. // Screenshot: Eshu Marneedi

Continuing on the ‘Siri getting better at activating’ theme, Siri can now respond to requests back-to-back. I will admit, shamefully, that I had no idea what Apple was talking about when it advertised this during the keynote. It makes sense now: you can speak while Siri is speaking, and it will respond to your new query without you having to say “Hey Siri” again. This is, once again, dumb in my opinion. It works… too well! I say things after Siri says something; I’m human. The feature works as intended — I’m just not a robot. In the current betas, it doesn’t give you any indication that it’s listening unless you start talking, in which case the little orb lights up. I find this weird — I have no idea when it stops listening. I think it’s after the orb dismisses (after 15–30 seconds), but I’m unsure how that’ll work on HomePod. Also, this doesn’t work on macOS — only iOS/iPadOS and HomePods (and AirPods, by extension). We’ll have to see — I’m not ballsy enough to put the beta on my HomePods.

Let’s get to Spotlight, one of Apple’s best features — and no, this isn’t sarcasm! I genuinely think Spotlight is a really good search engine for Apple platforms (Sherlock…), but I have my fair share of quibbles. Spotlight gets a new look and feel this year in iOS 17 and macOS Sonoma. Opening apps on iOS has a new animation, and results are, in Apple’s words, “richer.” People have gotten fired up about this in the last week — and deservedly so — because app ads have now made their way to Spotlight. When you search for an app you don’t already have installed, the Top Hit (which is almost always an ad) is accentuated in blue and appears larger. Photos and links also carry a rounded appearance and have some depth to them, with colors and bold text galore. This is what I meant earlier by “the future of iOS design.” Spotlight (especially on iOS and iPadOS, though some of these design cues transfer over to macOS, which is why I’m putting this in the multi-platform section of these hands-on impressions) doesn’t look the way Apple software typically does anymore. It feels rounded and transparent, and results take up way too much space in the Spotlight menu. It reminds me of what Big Sur did three years ago — things feel way too spaced out. Animations take way too long, making the experience feel slower. I’m unsure this is the way Apple should be headed — either embrace real materials and go back to the way iOS 6 looked, or go fully neumorphic on us. Pick one; commit! Maybe this is an unpopular opinion.

Spotlight doesn’t feel any faster or smarter in my testing this year, which is a crying shame. Spotlight, for years, has been one of the best parts of iOS and macOS. It got really, really good in iOS 15, but has sat untouched (functionality-wise) in iOS 17. I was really hoping for some speed improvements at the bare minimum. Spotlight in iOS 16 crashed frequently, was laggy, didn’t yield good recommendations, and was super slow at returning results for even just basic app searches. Those issues haven’t been fixed in iOS 17, and Spotlight still pales in comparison to Alfred, which remains my launcher of choice on macOS. I’m disappointed by Apple’s approach to “on-device machine learning,” and have been for years — this is now just a chance for me to write about it. Apple touted Spotlight as being faster, richer, and better this year — that isn’t what I’ve observed personally in betas 1 and 2, which really is a bummer.

The new Spotlight results. // Screenshots: Eshu Marneedi

I’m not asking Apple to make SiriGPT powered by an LLM here. That’s narrow-minded, low-level investor ‘boomer’ tech bro nonsense. We don’t need an LLM built into Apple software — we just need good artificial intelligence features built into these phones that are computationally more than capable of running them (and we got some — more on this later in the iOS section). Spotlight and Siri have been a disappointment this year, and that’s a shame. I’m really starting to lose faith in Apple’s machine learning ever getting better.

FaceTime got some small feature additions this year, as it usually does every year. For one, when you call someone via FaceTime, you can now leave them a video message. This works across platforms, and recipients need to be on iOS 17 or macOS Sonoma to receive them. I will genuinely block anyone who leaves me a video message. I didn’t answer your FaceTime call for a reason. I don’t want a video of you.

Secondly, third-party apps, most notably Zoom, now have access to the same video effects FaceTime did in iOS 16. You can now turn on ‘Studio Light’ or portrait mode in any app that uses the front-facing camera, and use a new feature called “Reactions.” Reactions allow you to gesture with your hands to “fill the camera frame with fun 3D augmented reality effects.” For example, you can do a thumbs-up to get some confetti on the screen. I’m sure the youth will enjoy this feature. Not for me, though.
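For developers, the system composites these effects onto the camera feed itself. As best I can tell from the iOS 17 SDK, apps can check for and trigger reactions through AVCaptureDevice; a hedged sketch of my understanding:

```swift
import AVFoundation

// The user owns the global switch (Control Center → Video Effects);
// apps can only observe availability and request an effect.
func sendThumbsUp(on camera: AVCaptureDevice) {
    guard AVCaptureDevice.reactionEffectsEnabled, // global user setting
          camera.canPerformReactionEffects        // this device/session
    else { return }

    // Programmatically trigger the same overlay a hand gesture would.
    if camera.availableReactionTypes.contains(.thumbsUp) {
        camera.performEffect(for: .thumbsUp)
    }
}
```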

I don’t understand why Apple doesn’t just make FaceTime into a Zoom replacement. Everyone knows about FaceTime — embrace it and make it better! Perhaps those days are over, though; Zoom has won. But imagine an Apple-made Zoom! That’d be a hit.

Weather, much to my surprise, gains some notable improvements this year. The thing is, I don’t really have a problem with how much data the Apple Weather app exposes or how well it presents that data — my problem is the accuracy of the data. Since the introduction of Apple Weather Service in 2022, Apple Weather has been horribly inaccurate. I can’t comprehend what went wrong in the development of Apple Weather — Apple has incredibly accurate Dark Sky data and seems to be using it. The thing Dark Sky was best at was predicting exactly when rain would start, and Apple Weather is incredibly good at that, too. But the broader forecasts are just completely off! It’ll say “rain for the hour” with brazen confidence when it’s sunny outside with no clouds in sight. These days, I find it predicting rain all day with a 30% chance for every hour, in hopes of being right at least one of those hours. That’s not an accurate prediction; that’s just throwing things at the wall and seeing what sticks. I hoped Apple would listen and take the hundreds of pieces of user feedback into consideration to make Apple Weather Service better this year, but it didn’t. In my time with the betas, Apple Weather has been as inaccurate as before, which is a complete disappointment. No, these aren’t new problems, and I don’t expect them to be fixed overnight. Weather forecasting is challenging, I acknowledge that. But this is the stock weather app, and jamming in new features every year isn’t very helpful when people just want forecasts they can rely on. Changing the forecast from “90% chance of thunderstorms” to “no rain for the next week” back to “60% chance of light rain” is utterly worthless. /endrant

The main new feature — which Apple didn’t even discuss during the keynote, which I find odd because it looks like a lot of thought went into it — is a new section called “Averages.” Say what you want — the Earth is… kind of dying right now. I’m no climate expert, but it doesn’t take much education to know that the average temperature of this planet we call home is going up every year. As such, it’s nice to see exactly how much hotter this year is versus last year, and the Averages section does an excellent job of visualizing these changes. It collects historical temperature data for your area on this specific day going back to 1933 (at least in the US) and puts it all into a line graph that shows the temperatures as a range (colored with a gradient) and plots the normal range. It then tells you, via a summary, whether today’s high is within the normal range, and if it isn’t, how much colder — or, realistically, hotter — it is. There’s also a monthly averages section that plots historical temperature data (also since 1933) for each month in a bar graph. It’s hard to put into words how well this feature was implemented — I’m thoroughly impressed. The little tile that sits next to the day’s data points (like humidity, sunrise/sunset, etc.) is also nifty — it tells you, at a glance, how much cooler or hotter today is compared to the last 90 years. But that’s not all — there’s precipitation data for the last 90 years, too, and that’s represented even better, in my opinion. This is an awesome feature, and if you somehow can’t tell, I’m a huge fan.

The new Weather Averages feature. // Screenshots: Eshu Marneedi

There’s also a new moon section in the Weather app, with a nice calendar that visualizes the different phases of the moon throughout the month (and year, which is nice). There are even sections that let you know when the next full and new moons are. At the top of the sheet, there’s a model of the moon showing today’s phase, along with a very premium-feeling slider that you can move forward or backward to get an hour-by-hour look at the moon phases. Finally, there are data points for illumination percentage, moonrise, moonset, and the geocentric distance between the core of the moon and Earth. I’m sure these features are useful for many people, and I kind of want them to come to CARROT, which remains my weather app of choice (for reasons that I hope I’ve made obvious). Especially that slider — it’s just a delightful UI element.

Moon information, now in the Weather app. // Screenshots: Eshu Marneedi

Some other minor, but quality features:

  • Building on the Averages feature, some sections (like humidity, precipitation, etc.) have a “Daily Comparison” section which compares today’s values to yesterday’s. It usually consists of a bar graph and summary, and I find it handy. What’s more interesting is that this feature — at least as of beta 2 — has been omitted from the macOS build. It only appears on iOS 17 and iPadOS 17, which is a shame. I spent 5 minutes trying to figure out where it was, only to find out that it… doesn’t exist. Hopefully, this is fixed before these OS releases become public.
  • Some tiles in the main view of the Weather app get larger to display more information. Namely, the moon and wind tiles. These new tiles now carry the same look as the Air Quality one did previously, and take up more space (double what they previously did). I wish this were customizable like CARROT allows for.
  • This is more of a small visual change, but it’s notable: the card for your current location now carries the heading “My Location” instead of the name of your city, so your list of locations no longer shows two entries with the same name. (Trust me, it’s easier to see in the screenshots than to explain.)
The slightly tweaked Weather app. // Screenshots: Eshu Marneedi

While the Weather app looks good on paper (and in screenshots), it’s just not super functional. There’s an amazing article by Srini Kadamati for Nightingale — which I encourage you to take a couple of minutes to read — discussing why the Dark Sky app was so awesome. The stock Weather app, though pretty, is dysfunctional, cluttered, spaced-out, and most of all, inaccurate. I can’t use it myself, even though I love the new features and visualizations this update brings.

That does it for the big, headlining improvements that come to all the OS releases this year, but there are some minor changes I find worthy enough for their own section:

  • You can now share AirTags with up to five family members, putting an end to those relentless “AirTag found moving with you” alerts triggered by a family member’s AirTag. From the bottom of my heart, thank you, Apple engineers who worked on this.
  • Apple News+ now has crossword puzzles. I have no idea where they’re sourced from, but hey, that makes Apple News+ officially worth more than $-1.
  • Photos now adds pet face detection, which I think is hilariously inventive. While I don’t have pets, I’m sure people who do are psyched about this.
  • Safari and Clock get new widgets. The former gains a Reading List Home Screen widget, and the latter gains some new analog clock styles.
  • Personal Voice is a new accessibility feature, demoed weeks before WWDC, which lets you generate a digitally-synthesized version of your voice in case you lose the ability to speak later on. A companion feature called Live Speech then lets you type into a text box and have those words spoken in your voice. Both features are legitimately life-changing for people with ALS, and both work well in the betas of iOS 17 and macOS Sonoma — I’m extremely glad Apple has taken the time and care to get them right. (Developers can tap into personal voices too; a sketch follows the screenshots below.)
  • Apple IDs now include support for passkeys. I’ve tried this feature out, and I’m not super impressed right from the get-go. This is probably an early-beta thing (Ricky Mondello, who works on the Authentication Experience team at Apple that I raved about earlier, says this isn’t the final implementation), but you’re not actually able to use the “Sign In with iPhone” button on the website to sign in… with your iPhone yet. It just brings up a QR code that you have to scan with another device. Again, Ricky Mondello, a literal Apple employee, said this is not how it’s going to ship, but it’s worthy of note.
More widgets, pet lookup, News+ puzzles, and Passkeys for iCloud accounts. // Screenshots: Eshu Marneedi
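On the Personal Voice point above: iOS 17 also exposes these voices to third-party apps behind an authorization prompt. A minimal sketch of how I understand the AVFoundation additions:

```swift
import AVFoundation

// Apps must be granted access first; after that, the user's voice shows
// up as an ordinary AVSpeechSynthesisVoice flagged with a dedicated trait.
let synthesizer = AVSpeechSynthesizer()

AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
    guard status == .authorized else { return }

    let personalVoice = AVSpeechSynthesisVoice.speechVoices()
        .first { $0.voiceTraits.contains(.isPersonalVoice) }

    let utterance = AVSpeechUtterance(string: "Typed with Live Speech.")
    utterance.voice = personalVoice // falls back to the default if nil
    synthesizer.speak(utterance)
}
```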

iOS 17

iOS is Apple’s flagship operating system, and this year’s version… well, it’s a bit weird. As I briefly touched on in the Spotlight section of these first impressions, iOS 17 isn’t bug-free in these early betas. Very early in the beta cycle, I said I thought iOS 17 was the worst beta 1 since iOS 13, but the more I used the beta day-to-day, the more I realized it’s actually a lot better than the supposedly “stable” iOS 16. Sure, there are tons of visual glitches and buggy animations throughout the OS, and I’ve experienced many random re-springs over the past couple of weeks, but the OS feels… much smoother. It’s no iOS 12 or OS X Snow Leopard, but wow, it’s an improvement over the buggy iOS 16. Yes, battery life takes a hit this year, and sometimes the phone gets hot when you use it for more than five minutes, and SwiftUI keeps breaking (a whole other rant I could write 5,000 words about; that’s not happening today), but it’s a beta, and I have confidence that Apple will iron out at least some of these obvious bugs before release [editor’s note: fix the feedback system, Apple…]. This year’s iPhone OS update (they should really just call it that, by the way) can be described as “nice to have” — none of the new features are particularly revolutionary, but they’re all, well, nice to have. While the multi-platform section covers most of these niceties, some are iPhone-exclusive. Here are my favorites that I’ve tried out.

StandBy (that’s seriously how it’s capitalized, which looks weird and is a marketing mistake in my opinion) is the highlight feature of iOS 17, and it’s awesome! When your phone is charging (MagSafe isn’t required) and in landscape orientation, the whole screen turns into a smart home hub of sorts, with widgets of your choosing taking up the entire display. It’s essentially a Google Nest Hub made by Apple, but smaller, because this feature — for some bizarre reason — only works on the iPhone. Why Apple didn’t just port it to the iPad without modification baffles me; it seems like it was made for the iPad in the first place. It’s not because StandBy requires an always-on display — it works on every iPhone that runs iOS 17, including the LCD ones. Yes, it works better with an always-on display, since your information is always visible, but regardless, this should’ve come to the iPad. Perhaps it’s an iPadOS 18 feature they’re saving for next year for some reason, which is a real shame.

StandBy: Night Mode edition. // Screenshot: Eshu Marneedi
StandBy. // Screenshot: Eshu Marneedi

StandBy itself isn’t a shame at all. Developers can adapt their widgets to a new, larger, square-shaped Lock Screen widget size, and users can customize how many widgets they’d like and in what configuration. You can also add multiple widgets to a smart stack, similar to Home Screen widgets, and StandBy will rotate through them for you depending on the time of day and frequently used apps. StandBy is smart, too: when the phone detects you’re not looking at it (when you’re asleep, for example), the always-on display on iPhone 14 Pro kicks in, lowering the number of screen refreshes per second; on other iPhone models, the screen simply goes to sleep. When it does detect you’re looking at it, all data updates in real-time, with smooth 60 Hz animations. And at nighttime, the phone automatically goes into night mode, which switches your widgets to a red tint and dims the screen so you can sleep better. When it detects motion using the Face ID camera array (it’s not using your actual camera, just the depth sensor, putting privacy concerns to rest), it wakes up, exits always-on mode, and presents you with live, real-time data. It’s super neat, practical, and well-implemented. I’ve been using this every night since the first beta, and it’s been nice having larger widgets I don’t have to wake the phone up to look at. I’m excited to see what third-party developers are cooking up for StandBy, too — many of my apps with Lock Screen widgets don’t even show up in StandBy’s widget list because they haven’t been optimized yet.
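That optimization work is pretty light, from what I can tell: StandBy reuses small system widgets and strips their background, which the view can detect and react to. A sketch using the iOS 17 container-background API (the lamp widget is a made-up example):

```swift
import SwiftUI
import WidgetKit

struct LampWidgetEntryView: View {
    // False when the system removes the background, as StandBy does.
    @Environment(\.showsWidgetContainerBackground) private var showsBackground

    var body: some View {
        VStack {
            Image(systemName: "lightbulb.fill")
                // Scale the glyph up when the background is stripped so
                // the widget stays legible at StandBy's larger size.
                .font(showsBackground ? .title : .largeTitle)
            Text("Desk Lamp")
        }
        // The iOS 17 background API; StandBy removes this automatically.
        .containerBackground(for: .widget) {
            Color.indigo
        }
    }
}
```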

The widget picker for StandBy. // Screenshot: Eshu Marneedi
Smart Stacks are also available in StandBy. // Screenshot: Eshu Marneedi
You can cycle between widgets manually in StandBy. // Screenshot: Eshu Marneedi
StandBy settings. // Screenshots: Eshu Marneedi

Widgets aren’t all, though. StandBy wasn’t just made to be helpful at night, like you might assume (when else are you putting your phone on a charging stand and just… leaving it there?). StandBy also supports photo shuffles, full-screen clocks, fun animations, and more — as highlighted by Apple during the keynote — to be helpful or amusing throughout the day; you just swipe horizontally to move between the views. The more interesting, and perhaps more useful, feature is full-screen Live Activities. I think this is a truly amazing way to enjoy Live Activities; it makes them so much more useful and nicer to use! Live Activities from apps that haven’t been optimized for StandBy yet — which is most apps this early in the beta cycle — are just enlarged, but soon, developers will be able to take advantage of the larger area. Apple has already done this with Siri, which now takes up the whole screen, akin to an Echo Show or Nest Hub. I’m a huge fan of this for obvious reasons — StandBy transforms the iPhone into a ‘bulletin board’ of sorts. You can throw all kinds of helpful information on it whenever you’d like, and it’s super well-implemented. StandBy is the best new iOS 17 feature, and I’m really happy with how it turned out. iPadOS next year, please?

The media Live Activity in StandBy. // Screenshot: Eshu Marneedi

AirDrop, much to my surprise (and delight), took up some valuable time during the keynote, and for good reason. This year’s WWDC marks a decade of AirDrop on the iPhone, and it remains one of Apple’s best software features to date. However, I’ve had issues with AirDrop’s reliability, especially recently: it fails sometimes, it’s slow, progress bars don’t really work, and so on. Moreover, since the introduction of the U1 chip in the iPhone 11 lineup back in 2019, AirDrop hasn’t gained any major features that take advantage of the ultra-wideband spatial awareness of modern iPhones. These two pain points are finally addressed this year with two quality-of-life enhancements that are pure awesomeness. Now, if you have a friend with an ultra-wideband iPhone (iPhone 11 or later), you can just tap your phone to the top of theirs to initiate an AirDrop ‘session’ — no more toying around with the share menu. And if you step away, you’ll be able to finish the AirDrop transfer over the internet. This also helps if you have a poor connection, and it should improve AirDrop reliability substantially — I’m here for it. I transfer tons of large video files over AirDrop from my iPhone to my Mac pretty frequently, and this is a welcome addition. While I’ve tried the phone-to-phone U1 transfer, since it’s available in the beta, I haven’t been able to try the over-the-internet AirDrop yet, presumably because the backend infrastructure isn’t ready. Apple says it’ll ship “later this year,” as they do — I predict that date will miraculously change to “coming this spring” at the end of this year.

The new AirDrop interface. // Screenshot: Eshu Marneedi

The shinier AirDrop feature this year has its own name — NameDrop. You know how, back in the good old days, people used to exchange business cards to share contact information? Then smartphones took over the world and brought us to the hell we live in now, where we text each other contact cards. Now there’s a better way to share contacts: NameDrop. This is a genius feature — quirky and showy, but genius. Again, if both parties have an ultra-wideband iPhone or an Apple Watch, they’ll be able to put the tops of their devices together and exchange contact cards seamlessly with just one button tap. You can choose the specific info you’d like to share, too — no need to hand out your personal number if you’re using the feature in a business setting. It’s literally business cards, but better — and better than all of those dumb “share your digital business card” apps from 2012 that never really took off. This will be a super popular feature among the masses… because it’s awesome. Business cards can finally die. I haven’t tried it, as I don’t have a friend on the beta, but I’ve yet to see a video of it failing, so I trust it’ll work with the same accuracy as AirDrop (which is to say, pretty well). Also, the animation that plays when you transfer your contact card is beautiful. It’s whimsical and so Apple, in the best way, and I love it.

But of course, as John Siracusa would say, nothing is so perfect that it can’t be improved upon. I have some privacy concerns regarding how NameDrop will be implemented when it leaves beta in the fall. Last year, Apple restricted AirDrop in China (and later, worldwide) so that it’s only available to everyone around you for 10 minutes at a time, while remaining unrestricted for people in your address book. Obviously, this approach won’t work for NameDrop, because the people you’re NameDropping (verbifying Apple trademarks is my favorite pastime, as you can tell) presumably aren’t in your address book already. But I also don’t want to share my contact card willy-nilly with strangers by accident. Perhaps there’s a buffer time limit Apple didn’t demo in the keynote or turn on during the beta cycle? Do the phones have to be super close to each other? I don’t think so, but I don’t want news articles written about people holding their phones against others’ phones to advertise scams on the NYC subway or whatever. I’m excited about this feature, but wary of the potential for scams. People have notoriously found creative ways to use technology for nefarious purposes.

Sharing options for AirDrop. // Screenshot: Eshu Marneedi

The Phone app, interestingly, opened the iOS portion of the keynote, for some peculiar reason. Contact cards get notable changes in iOS 17, and I can’t quite figure out what’s better about them. In fact, I think the changes make contact card customization harder. The highlight is the addition of “Contact Posters,” which replace the “old-fashioned” profile pictures of yore. I put “old-fashioned” in quotes because this isn’t something particularly in need of reinventing. Even though Apple spent a bunch of time demoing this feature during the keynote, it isn’t advertised much on the iOS 17 Preview website… because it’s not very well implemented. It overcomplicates the system we already have, makes un-customized contact cards look worse, and seems weirdly buggy in the beta. Here’s how it works: instead of just having a photo for your contact card, you create a header (think of a social profile, where you have a profile picture and header) and add “fun” effects to it, like color filters, Memoji, stickers, and more. It’s everything you can do to a Lock Screen in iOS 16, but for your contact picture (minus the widgets, obviously). Once you’re done messing around with how you look on your contact card (who has that much time?), you can choose to share it with your iMessage contacts, like you previously would your profile photo and name. Then, when you ring them up, your custom Contact Poster shows up for them instead of the profile picture they set for you (they have to approve the change, but it shows up as an annoying banner for them, which is almost even worse).

The new “Contact Posters.” // Screenshots: Eshu Marneedi

To add insult to injury, this feature isn’t available on macOS at all — no custom posters for Mac users, which indicates to me that Apple mainly whipped this feature up for on-stage demo reasons. Barely anyone I know uses the “custom photo sharing” thing introduced in iOS 13, because it’s annoying — people set photos and names for their contacts depending on their relationship with them. This is unnecessary engineering that complicates the system we already have. “Why is he so mad about this,” you may ponder — rightfully so; I mean, it’s optional. Except… it isn’t. If I want to simply change a profile photo, I can’t do that anymore — I have to change the contact card’s “Poster,” because hitting save on an image change opens a new UI, akin to the Lock Screen picker, to modify the Poster. And if I don’t add Posters to my contacts, all of my contact cards that previously looked great now look gray and bland. Previously, if you set a profile picture for someone, a blurred, enlarged version of it would fill your iPhone screen when they called; it was easy to recognize who was calling based on the colors that took up the screen. Now, the entire screen is a yucky gray with just a tiny, barely-visible profile picture at the top-left corner. Why must they overcomplicate everything? Why did they have to make already-good contact cards look like total crap? What was wrong with the old implementation? I despise this feature, and I’m very mad. F for fail, Apple; go back to the drawing board.

Contact Posters… don’t look very good. // Screenshots: Eshu Marneedi

In addition to the useless Contact Posters feature I’m still angry about, Apple decided to take us back to the good ol’ days of landline telephones with a feature it calls Live Voicemail. If you were born before 2015 — and presumably you were, if you’re reading this — you probably remember that “home phones” played voicemail messages on a loudspeaker as they came in. Well, this is essentially that feature, but better; modern. As a caller leaves a voicemail, you’ll see a live transcription of it right on your Lock Screen (the transcription happens in real-time, and there’s no need to listen to their voice). Neat, I guess — but also, why wouldn’t you just text the person instead of leaving a voicemail? It accomplishes the same task, after all. Maybe that’s my Gen Z-ness coming out. Perhaps we should just ban phone calls altogether. This is neat, probably, and a step in the right direction for Apple. It’s no Google Duplex, but it’s nice to have; it fits the theme of this update. Now Apple just needs to make Siri deal with those useless phone trees and enter numbers for you, as Google Assistant can. Make the robots talk amongst themselves — it’s 2023.

Visual Look Up, the feature introduced in iOS 15 as a direct (albeit crappy) competitor to Google Lens, gets some, dare I say, pretty welcome enhancements this year. Many, many paragraphs ago, I expressed discontent with Apple’s machine learning. I’m not retracting those statements here, nor do I intend to contradict them, but I need to applaud Apple for what it’s done here. Credit where credit is due: this is some cool stuff. Now, when you point your phone at laundry symbols or car warning lights, it’ll tell you exactly how to do your laundry or what’s wrong with your car. This is genius. More of this, please, Apple! Gone is the need for third-party apps to do this stuff — the phone just does it automatically. It can even identify multiple symbols and point to Safari URLs that explain exactly what each symbol means. It’s brilliantly designed, helpful, and awesome. I’m thoroughly impressed. No, this isn’t new technology, and I won’t go as far as Federico Viticci in saying that Apple is “doing machine learning right.” But this is a step in the right direction — Apple is embracing machine learning and artificial intelligence and baking features into the system, and I’m happy about it.

New Visual Look Up features. // Screenshots: Federico Viticci

In addition to these new capabilities, the Visual Look Up interface also gets some minor improvements. Building on last year’s Live Text in videos, you can now use Visual Look Up in videos and photos, in-line. This is pretty nice — if there’s something you don’t recognize in a video playing through Apple’s default video player (which all good video apps use; I’m looking at you, YouTube…), you can just long-press it to get Visual Look Up results. And if you find a cool dish in a picture online or wherever, you can use Visual Look Up to find recipes that match it. While not game-changing, it’s intriguing. As I said previously, I want Apple to bake more ML tools into its OS platforms. These new features work pretty well in my testing, and I’m excited about what Apple has for us next.
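Third-party apps can offer the same long-press look-up through VisionKit, the API that’s been public since the iOS 16 Live Text work. A minimal sketch (the function and its parameters are hypothetical; the VisionKit calls are real):

```swift
import UIKit
import VisionKit

// Analyze an image, then attach the analysis to an interaction so the
// image view gets system text selection and Visual Look Up badges.
@MainActor
func enableLookUp(on imageView: UIImageView, image: UIImage) async throws {
    let interaction = ImageAnalysisInteraction()
    imageView.addInteraction(interaction)

    let analyzer = ImageAnalyzer()
    let config = ImageAnalyzer.Configuration([.text, .visualLookUp])

    interaction.analysis = try await analyzer.analyze(image, configuration: config)
    interaction.preferredInteractionTypes = [.automatic]
}
```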

And of course, the little life-changers:

  • Maps finally gains offline map support. Finally! What in God’s name took them this long? Google Maps has had this for years. If you’ve somehow been living under a rock for the past n years: you can now download a map of an area to your phone to get turn-by-turn directions even without service. This is nice for the rural stretches of road trips, or if you live somewhere hurricanes are common and you’re scared of losing service because the world is falling apart. Apple went a step further and even lets you download data for place cards — like hours and addresses — which is nice.
  • There’s now a new “fast” speed option in Haptic Touch settings, and wow, is it life-changing. I kid you not, this has made my experience of using an iPhone infinitely better, and everyone should enable it. I lamented the removal of 3D Touch from iPhones for one main reason — it was significantly faster than Haptic Touch because there was a pressure-sensitive layer baked into the screen that knew if you were 3D Touching or regular touching; it didn’t have to play guessing games. In iOS 17, it still obviously has to play guessing games if you own a phone younger than the iPhone XS, but the animation that plays when you Haptic Touch something is significantly snappier. It feels exactly like 3D Touch. What a huge QOL (quality-of-life) enhancer. Please, turn this feature on when iOS 17 hits in the fall. This truly is the biggest thing Apple has ever done.
  • Screen Distance is a new feature that (ironically, since the announcement of Apple Vision Pro — a display that sits 1 inch away from your face — closely followed the announcement of this feature) notifies you that your iPhone (or iPad) is too close to your face and might cause you vision issues in the future. I find this feature… annoying, because I’m stupid and have my phone too close to my face 100% of the time. Hey, maybe that’s why I’m short-sighted. But it works like a charm — a non-dismissible full-screen modal shows up whenever the “TrueDepth Camera System” (god, I hate that name and have for 6 years) detects you’ve been sitting too close to the screen for too long and urges you to move your phone away from your face. It then magically disappears after you do what it says. Pretty neat, but once again, ironic and annoying if you’re stupid like me (it’s okay to admit you are, this is a safe space).
  • The Dynamic Island has some new, bubbly animations. They feel a bit slower than the older ones, but also slightly more polished; I like them. Dynamic Island doesn’t gain any new features this year, though, which was not what I was expecting. It would seem to me like this should be the year Apple makes Live Activities and Dynamic Island support a zillion times better because this is the first OS version designed specifically for iPhone 14 Pro. All we get this year is just a Shazam Live Activity, which, if I’m being honest, I’m eternally grateful for. The interface no longer opens up in a crappy App Clip (that I actually thought was good back in 2021, for some reason) or in the bad Shazam app — there’s now a direct link to open in Apple Music from the Dynamic Island. I love it. I still would’ve appreciated new Dynamic Island actions and features, though.
  • Apple Music gains crossfade support from the Android app! Finally, Apple is borrowing features from its own Android app! If you don’t know what this is: crossfade blends the end of the current song into the beginning of the next so there’s no abrupt cut between songs. It’s nice, and I’ve left it on. In beta 1, it was super crashy and never worked right, but in beta 2, it’s pretty good. You can even adjust the crossfade duration. It only works in playlists (as it should — don’t ruin the work of the artists), and it’s about time it arrived on iOS. I still find it hilarious that this made it to Android first — perhaps a bit too cross-platform, Apple?
  • You’ll be reminded periodically about apps that have access to Photos, iCloud Calendars, and more. In-app photo pickers also show a prompt that allows you to easily disallow photo sharing, which is nice (a sketch of the permission-free picker pattern follows the screenshots below). I always forget to check which apps have permissions, and these additions alleviate that pain point.
  • If someone sends you an explicit photo via iMessage, you now have the option to have it blurred and put behind a content warning automatically. This is nice; I didn’t try this feature for what I hope are obvious reasons.
  • Devices in the AirPlay list in Control Center are now “shown by relevance” powered by “on-device machine learning.” This took way too long, and it works brilliantly.
  • FaceTime now works on Apple TV. The tech used is the same as Continuity Camera from macOS Ventura — you use your iPhone as the camera, and the FaceTime app lights up on Apple TV. This is really neat, and while I’ll never use it (for reasons I’ve outlined earlier in this article), I’m sure tons of people will, especially for watch parties and stuff.
  • SharePlay now works in CarPlay. What does this mean? You can now get everyone to contribute to the music in the car. I assume this’ll solve everyone’s problems.
Image
Offline maps, faster Haptic Touch, Screen Distance, and new privacy prompts for photo pickers. // Screenshots: Eshu Marneedi
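
A quick developer-side aside, because it explains why these photo prompts exist at all: the out-of-process system photo picker has been the sanctioned way to avoid library-wide access for a while now. Here’s a minimal sketch (the view controller and its names are mine, not Apple’s) — the picker runs in a separate process, and the app only ever receives the photos the user explicitly picks, so no library permission prompt is needed:

```swift
import UIKit
import PhotosUI

final class AvatarPickerViewController: UIViewController, PHPickerViewControllerDelegate {
    // Present the out-of-process system photo picker. Because it runs outside
    // the app's process, no Photos library permission is required.
    func pickPhoto() {
        var config = PHPickerConfiguration()
        config.filter = .images      // photos only, no videos
        config.selectionLimit = 1
        let picker = PHPickerViewController(configuration: config)
        picker.delegate = self
        present(picker, animated: true)
    }

    // The app receives only what the user explicitly selected — nothing else.
    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        picker.dismiss(animated: true)
        guard let provider = results.first?.itemProvider,
              provider.canLoadObject(ofClass: UIImage.self) else { return }
        provider.loadObject(ofClass: UIImage.self) { object, _ in
            guard let image = object as? UIImage else { return }
            // Hand `image` off to the UI on the main thread.
            _ = image
        }
    }
}
```

Apps that adopt this picker never trigger the permission reminders at all, which is presumably the behavior Apple is nudging developers toward.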

I give iOS 17 a 6/10, where a 5 is average, a 10 is remarkable, and a 1 is terrible. It’s good by ‘batting average’ — there’s a lot more Apple could do, but Apple doesn’t focus on bug fixes or big feature releases anymore. Sure, Spotlight and Siri suck, and Apple’s machine learning needs work, especially compared to the ongoing LLM revolution, but this release feels more stable than iOS 16 ever did. That point can’t be overstated.

iPadOS 17

iPadOS 17 is a particularly minor update. Aside from the aforementioned multi-platform features, the iPad doesn’t gain very many new tricks this year. That’s starting to become a trend, and it worries me. Apple was quite focused on improving iPad software from the debut of the iPad Pro in 2015 up until the introduction of iPadOS in 2019. Even in early 2020, Apple made notable strides to make the iPad more desktop-like with the introduction of the Magic Keyboard and proper cursor support. But since then, the iPad has taken a backseat once again. iPadOS 14 brought no new iPad-specific multitasking features and omitted the new widgets that came to the iPhone, iPadOS 15 was essentially just iOS 14 for the iPad, and iPadOS 16 brought a buggy, chaotic feature called Stage Manager that wasn’t ready — and that was shamed for being “flawed” — while omitting the new Lock Screen features of iOS 16. iPadOS 17 follows the same trend, once again “adding” features that the iPad lost out on last year.

Last year, Apple put the M2 chip into the iPad — the same chip found in Apple Vision Pro, which I marveled at as “the most technologically-advanced piece of technology you can buy for $3,500.” Yet Apple refuses to do anything of note with it on the iPad. A couple of months ago, Apple brought Final Cut Pro and Logic Pro to the iPad, and I was excited. So excited that I planned on writing a whole review about how Final Cut Pro was going to change the iPad and how the iPad was becoming a real computer. Well, that review didn’t happen, obviously. It never will, because Final Cut Pro for iPad is just a worse version of Final Cut Pro for Mac. It’s not real Final Cut Pro; there’s nothing inherently “Pro” about it in any way. In fact, it feels like reskinned iMovie — iMovie+ or something. It’s not good for color correction, multi-cam, footage ingest and management, or anything real professionals have to do. All of your Mac Final Cut Pro plugins? They don’t work either. And you can’t work with your Mac Final Cut Pro libraries on the iPad like you’d think you could, because they’re just “not compatible” without fancy hacks. I’m disappointed by Apple’s approach to the iPad, once again. They’re showing sheer incompetence, laziness, and an unwillingness to harness the potential of iPad hardware here. That theme carries over to iPadOS 17, which only gains 3 major platform-specific enhancements: Stage Manager flexibility, new Lock Screen customization carried over from iOS 16, and the Health app. None of these are groundbreaking; heck, two of these “new” features are directly copied over from the iPhone, and the third is simply an enhancement to a buggy, haphazardly-designed feature from last year’s iPadOS release. To say I’m discouraged, disappointed, and dejected about this year’s iPadOS release would be an understatement, quite frankly.

Let’s start with Lock Screen customization, because that’s the overdue, highlight feature of iPadOS 17. Lock Screen widgets for the iPad come in a new size — extra-large — which is also the size that iPhone Lock Screen widgets optimized for StandBy use. When optimizing for the new OS versions, developers can share code between both widgets since they’re the same size. When you hold your iPad in landscape, all widgets sit at the left of the Lock Screen, stacked vertically, as opposed to horizontally below the time like in iOS. That means you can add many more widgets than you can in iOS, taking up the entire left side of your iPad’s screen. When your iPad is held vertically, it works the same as it does in iOS — widgets sit below the time, and you can add up to 4 small widgets or 2 large widgets, minus the extra-large ones. Because you can’t add extra-large widgets to the Lock Screen in portrait orientation, you’ll have to set different widgets for each orientation. I don’t like this approach and wish Apple did it more like Home Screen widgets, which have worked across orientations on iPadOS since iPadOS 15. I get not wanting widgets on the left in portrait orientation — horizontal screen space is limited — but extra-large widgets should still be available in portrait. Also, why is there an arbitrary limit of 2 large widgets? In portrait, vertical screen real estate is practically free, so why does Apple think extra-large widgets don’t belong there? This whole design decision confuses me, and I hope Apple sorts it out in iPadOS 18 (which is a year away, because once again, Apple doesn’t care about the iPad). What would’ve been even cooler is if Apple had added Home Screen widgets to the Lock Screen in iPadOS 17 — it seems like it would work from a design standpoint because there’s enough room on the iPad screen. Why am I able to cook up better ideas than Apple’s own engineers?
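
For what it’s worth, here’s roughly what that code sharing looks like on the developer side — a minimal WidgetKit sketch (all names are hypothetical, mine, not Apple’s) where one widget declares several families and a single SwiftUI view adapts to whichever placement renders it:

```swift
import WidgetKit
import SwiftUI

struct AgendaEntry: TimelineEntry {
    let date: Date
    let title: String
}

struct AgendaProvider: TimelineProvider {
    func placeholder(in context: Context) -> AgendaEntry {
        AgendaEntry(date: .now, title: "Agenda")
    }
    func getSnapshot(in context: Context, completion: @escaping (AgendaEntry) -> Void) {
        completion(placeholder(in: context))
    }
    func getTimeline(in context: Context, completion: @escaping (Timeline<AgendaEntry>) -> Void) {
        completion(Timeline(entries: [placeholder(in: context)], policy: .never))
    }
}

// One view serves every placement; it checks which family it's rendered in.
struct AgendaWidgetView: View {
    @Environment(\.widgetFamily) private var family
    let entry: AgendaEntry

    var body: some View {
        switch family {
        case .systemExtraLarge:
            // The roomy layout for extra-large placements.
            HStack { Text(entry.title).font(.largeTitle); Spacer() }
        default:
            Text(entry.title).font(.headline)
        }
    }
}

struct AgendaWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "AgendaWidget", provider: AgendaProvider()) { entry in
            AgendaWidgetView(entry: entry)
        }
        // One widget, several placements — the view above adapts per family.
        .supportedFamilies([.systemSmall, .systemMedium, .systemLarge, .systemExtraLarge])
    }
}
```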

Image
Picking widgets in iPadOS 17. // Screenshots: Eshu Marneedi
Image
Extra-large widgets are not available in vertical orientation. // Screenshot: Eshu Marneedi

You can also adjust the font, font color, and font weight of the clock at the top, as well as the color of your widgets, just as you could in iOS 16 — nothing new here, except for some new fonts and a new slider that lets you precisely control how thick or thin you’d like the font. Both of these additions come to both iPhone and iPad. On phones with an always-on display, setting the font weight to something super thin will result in the font rendering slightly thicker when always-on mode is enabled, to accommodate the always-on display’s slight thinning of digits — pretty neat. But the change that makes me happiest also comes to the iPhone: the re-introduction of automatic light/dark mode wallpapers. I have no idea why this feature disappeared in iOS 16 (and only iOS 16, as the iPad had dark-mode iOS 16 wallpapers). Your guess is as good as mine, but I think it’s because Apple changed how it rendered wallpapers in iOS 16 to accommodate the depth effect that wasn’t on the iPad last year (it’s on the iPad now, thankfully). Regardless, it’s back, and thank goodness for that. Now, when the device (iPhone or iPad, and Mac for that matter) switches from dark to light mode or vice versa, the wallpaper’s appearance switches as well. I do wish we got the ability to add our own automatic appearance-changing wallpapers, like we can on macOS, but maybe that’s me asking for too much. Apple also went out of its way to give the system wallpapers a beautiful animation when you swipe up from the Lock Screen to the Home Screen on iPadOS, and I’ve been enjoying it. Again, it’s whimsical and delightful — elements that Apple software has lacked since the flat-ification (that’s not a word, let’s just roll with it because I think it sounds brilliant) of Apple’s OS platforms a decade ago.

Image
Font & Color, and auto light/dark mode wallpapers. // Screenshots: Eshu Marneedi

All of the other stuff that came bundled with iOS 16 carries over to iPadOS 17, from what I can tell from my experience using the betas. You can create multiple Lock Screens, cycle through them by tapping and holding the Lock Screen to show a carousel of your options, and link them to Focus modes and Home Screens, and Apple has even added some new space and Kaleidoscope wallpapers that are all optimized for the iPad (and again, have whimsical animations when swiping up — and in the case of the Kaleidoscope ones, animations when you rotate the iPad, which I find amusing). Live Activities also finally make an appearance on the iPad, just with no Dynamic Island (obviously). They take up the same amount of space as their iPhone counterparts (because they’re the same Live Activity) and work just fine. Just… thank goodness that Apple finally caught the iPad up here. It should really stop treating the iPad like a baby product and do what it’s been doing with macOS by bringing all the features over each year. Be innovative if you want, Apple, but do it right, and on time. This is overdue; massively overdue.
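
Developers don’t have to lift a finger for this, as far as I can tell — the same ActivityKit request that starts a Live Activity on the iPhone now surfaces on the iPad too. A rough sketch, with made-up delivery-tracking names for illustration:

```swift
import ActivityKit

struct DeliveryAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var stopsAway: Int
    }
    var orderNumber: String
}

// Requires the NSSupportsLiveActivities Info.plist key. The resulting activity
// renders on the iPhone Lock Screen and Dynamic Island — and now, unchanged,
// on the iPad Lock Screen in iPadOS 17.
func startDeliveryActivity() throws {
    let attributes = DeliveryAttributes(orderNumber: "A1042")
    let initialState = DeliveryAttributes.ContentState(stopsAway: 5)
    _ = try Activity.request(
        attributes: attributes,
        content: .init(state: initialState, staleDate: nil)
    )
}
```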

Image
New wallpaper options in iPadOS. // Screenshot: Eshu Marneedi

The Health app finally comes to the iPad. While I go out of my way to uninstall the Health app on every new Apple device I buy, many have expressed enthusiasm for this addition, especially those who use Apple Fitness+. I’m happy for them. The app is fully optimized for the iPad, taking up the larger screen with a sidebar and colorful headers. Nice work, Apple! The headlining new feature of the Health app is the new mental health and mood tracker. It starts with a “survey” where you answer a couple of questions (they’re pretty basic, like “Have you experienced feelings of being down, depressed, or hopeless in the last 2 weeks?”), and the system tells you, via a scale, if it thinks you’re experiencing symptoms of depression and/or anxiety. Apple stresses that this is not meant to be taken as a diagnosis, just a recommendation, which is good. But this isn’t new; there are hundreds if not thousands of depression and anxiety surveys out there, many of them recognized by governmental organizations around the world. The real new feature in the Health app is a full-blown mood tracker. Every day (you can choose to receive a notification reminder for this), the app asks you to use a slider to log how good you feel. It then puts all of the data in a calendar and a chart that tracks your mood over time. There’s also a “momentary emotions” field where you can write stuff down. I’m sure a bunch of apps just got Sherlocked by this feature. All of these new mental health features live inside a new section of the Health app called “Mental Wellbeing,” alongside sleep data and mindful minutes carried over from previous iOS versions. None of these features are iPad-specific (they work on the iPhone too), but they come to the iPad for the first time ever. I have no idea why this took so long.

Image
The “Mental Wellbeing” survey. // Screenshots: Eshu Marneedi

Stage Manager, and I can’t believe I’m saying this, finally gets good in iPadOS 17. Federico Viticci describes these changes much better than I could:

However, compared to iPadOS 16, it feels to me as if the process of resizing a window is smoother and more lenient than before. You still see a window “blink” as it gets resized, but I’m under the impression that there are more “intermediate steps” when it comes to the sizes you can choose from.

Apple seems to have added a bunch of — as Viticci puts them — “size classes” to window sizes. On my 11” iPad Pro, there seem to be 7 sizes horizontally and 4 sizes vertically, making for 28 individual sizes. This is much better than the limited set of size classes in iPadOS 16 and allows you to create windows in more sensible sizes. For example, my app, Citations, can live in a small window because the UI just consists of a form, but an app like Craft demands more screen real estate. Now I can do that — I’m not constrained as much by iPadOS. Yes, this is still more limited than macOS, obviously; that’s just the way iPadOS apps are designed. But I’ve found myself much less frustrated using Stage Manager on iPadOS 17 than on iPadOS 16 because now I can size my windows however I please. Apple also seems to have changed the animation that plays when you resize windows, making for an even better experience. Credit where credit is due — these changes are appreciated.

Image
New Stage Manager size classes. // Screenshot: Eshu Marneedi

But the real life-changer is that Stage Manager windows can now be placed anywhere on the screen. This might be the single best update to iPadOS this year. In iPadOS 16, windows would “snap” (albeit with a graceful, flowy animation) to one of 9 snap points (in my observation, a 3×3 grid). In iPadOS 17, windows no longer snap to snap points; they move as you please, like macOS windows do. If you move a window to the edge of the screen (say, the bottom or the left), the Dock (and tiles? I forgot what the little previews on the left are called) is automatically hidden behind a swipe. The whole system feels much more intuitive and works a lot better, and I’m a fan. These changes make Stage Manager feel like a real multitasking experience — it wasn’t even nearly this good last year, and I’m delighted to see that Apple took some of our complaints to heart. Windows finally feel like real windows now, not just free-floating, limited Split View tiles. Imposing arbitrary constraints on the iPad’s small screen (compared to the Mac) and touch-first user interface created a UX that felt awkward and irritating on a day-to-day basis; true freeform windows, coupled with the new window size classes, alleviate these bottlenecks and make for a much less frustrating iPad experience. Doing real work with Stage Manager in the past was hard because I couldn’t get things the way I wanted within the iPad’s limited screen real estate — windows would overlap each other, elements on the screen would feel crowded and out of place, and I couldn’t understand what was going on. Add to that the numerous bugs, glitches, crashes, and weird app behavior (like with on-screen keyboards), and Stage Manager was a dumpster fire of an experience that neither made sense nor felt premium. iPadOS 17 makes Stage Manager a respectable feature. I no longer feel “lost” in the UI — I know where to tap for the system to do what I want, and I know how the system will react to a tap. Nothing feels cluttered, disorienting, or cramped — windows move where I want them to move and are resizable in various configurations; configurations that would be deemed awkward on a real desktop computer, but that are necessary for multitasking to make sense on the iPad. Animations are smoother, and crashes and bugs have been fixed; Stage Manager no longer feels like a half-assed feature that causes frustration and pain.

Image
This is true iPad multitasking. // Screenshot: Eshu Marneedi

None of this is to say that Stage Manager doesn’t still have its flaws — you know from reading this article that I am in no way a fan of the direction Apple is taking the iPad in. The biggest, most annoying aspect of Stage Manager shows up when you open a new app. If you already have an app open and launch a new one, your current app(s) get kicked to the side and a new “space” is created with the app or window you just launched. You then have to tap back into the old space with all of the apps you (presumably) set up the way you want, then drag the new app out of the automatically-created space into the old one and position it how you want. Why can’t apps just open in the space you’re in? It’s so inconvenient, and it makes me go “the iPad isn’t a real computer” every time I see it play out. That’s not how computers work; why Apple designed Stage Manager like this is simply mind-boggling to me. Yes, I agree, it should be easy to section out an app into its own space, but that should be hidden behind a long-press of the app icon or behind the “three-dots” menu. Or, Apple should make a toggle for people who want this behavior (I don’t know why they’d think that way, but hey, I’m not judging). This is nonsensical UX and is incredibly frustrating; it kills the vibe of Stage Manager altogether. Also, how is it that we’re living in 2023 and Apple hasn’t figured out keyboard shortcuts to tile windows in Stage Manager, or even on macOS? Why should I have to rely on Magnet for this? Apple should just go buy Magnet, Sherlock it entirely, and stuff it into the iPad. Having to use my fingers or a mouse to move windows around is sub-optimal and annoying, especially because of the “inertia” of the iPad’s cursor. And once I’ve worked hard to get my windows the way I want them, there should be an option for iPadOS to memorize spaces, like Samsung’s Galaxy Z Fold phones can. This doesn’t even consider external displays, either — why do we have to rely on ugly workarounds to get clamshell mode working; why isn’t that a standard feature? Apple is over here making ads that brazenly ask “What’s a computer?” while neglecting things Mac laptops have been doing for over a decade.

Image
This is still incredibly annoying. // Screenshot: Eshu Marneedi

You answered your own question there, Apple. Why don’t people treat the iPad as a computer; what’s stopping people from using the iPad as their main computing device? It’s the lack of real, desktop-class features and customizability; this insistence on making the iPad experience sub-par compared to the Mac; this delicate balance of designing iPadOS to be touch-first while also maintaining good cursor support. I can’t run apps like Magnet and Bartender on iPad, I can’t use Mac Final Cut Pro plugins on iPad, I can’t use iZotope or Audio Hijack on iPad… the list goes on. There are tons of things you can’t do on an iPad that you can do on a Mac; pro things that should be possible on a $1,500 iPad Pro that uses a desktop chip but simply aren’t. Adding a dumbed-down version of Final Cut Pro to the iPad and calling it quits is insulting — it’s not going to do anything if the OS isn’t equipped to handle a Pro app. Heck, you can’t even exit the Final Cut Pro app on iPad Pro (the $1,500 one with 16 GB of memory and a whopping M2) while it’s exporting, purely because of iPadOS’s stringent memory management. iPadOS is not built to handle any semblance of a “professional” workload.

Back to Stage Manager to round out this rant: it’s still far from perfect, and as I write this, we’re still pretty early in the beta cycle, so things are susceptible to change. But no matter how hard Apple tries, I don’t think it’s going to save the iPad. It’s merely an add-on feature that is now starting to enhance the functionality of the iPad Pro.

That’s the theme of this year’s iPadOS release — it’s not good. Credit where credit is due, of course: Apple caught the iPad up to its other OS platforms, and that’s… good, I guess? Not really commendable knowing that Apple’s tablet OS was literally a year behind its mobile one, but I digress. I’m not optimistic about the iPad’s future at all, and Apple is going to have to do a lot to make me feel optimistic again. I give iPadOS 17 a 4/10 for the Stage Manager improvements alone. This has been a disappointing release for the iPad.

macOS Sonoma

macOS 14 Sonoma is, in my opinion, the biggest, best platform release this year. Not because there are any particularly groundbreaking feature additions — there aren’t, in fact — but because of how much of a quality-of-life improvement Sonoma is after the dumpster fire that was Ventura. I have been a vocal critic of macOS 13 Ventura since its release; I’d say it’s the worst macOS release since Catalina (which I still won’t forgive Apple for shipping). macOS Sonoma is a breath of fresh air after the constant bugs, hangs, and nag-filled hell of Ventura. Here’s a list of some minor things Apple fixed or changed between the stable version of macOS Ventura and the beta of macOS Sonoma that make Sonoma feel better:

  • In Ventura, if you had third-party widgets from an iPhone/iPad-optimized app (as in, not optimized for Mac) running on an Apple silicon Mac, they wouldn’t stay placed in Notification Center if you restarted your computer or put it to sleep for a while. That bug is now fixed in Sonoma, so widgets from iPhone and iPad apps are now usable on macOS.
  • Editing widgets now works. Previously, editing a setting in a widget from the settings menu wouldn’t actually result in the changes being saved. That’s been fixed now — changes save properly, as they should.
  • Widgets update faster and more reliably than in Ventura.
  • The Control Center and Notification Center animations now play properly. They don’t feel buggy anymore.
  • The media player now always shows the Live Text button; in Ventura, it sometimes wouldn’t.
  • Xcode phantom errors, broken Previews, and crashes have been fixed — finally!
  • Software Update fetches new updates (on the beta track) swiftly and installs them without manual intervention.
  • The bug where loading a website from the Safari start page would be slow is now fixed.
  • A nice animation now plays when you log in after putting your Mac to sleep. If you have windows open, they animate in delightfully. If you don’t, the menu bar and Dock slide in from the top and bottom, respectively. This is a welcome contrast to the cold, animation-less login sequence in Ventura. It just feels premium.
  • The Lock Screen now actually functions as a Lock Screen, with a large date and time. As The Verge puts it, macOS feels more iOS-like than ever.
  • Clicking on the Desktop when other windows are in focus will push all of your open windows away until one is clicked again. This produces the same effect as if you set up “Desktop” as a hot corner in previous macOS versions, and I love it. Super useful to hide your work if you have something sensitive up on your screen.
  • Messages notifications now are on-time, not 10 minutes late. I cannot believe it took a whole year for them to fix this.
  • Apps that have a “Time Sensitive” notification entitlement from Apple are now able to send notifications in the background (while the app is not running) properly (this should’ve worked before; it didn’t).
  • Hundreds of small bugs, UI glitches, or hangs from Ventura have now been addressed.
Image
The new Lock Screen in Sonoma, as well as showing the Desktop with just a click. // Screenshots: Eshu Marneedi

Sonoma has been the most stable version of macOS in a long while… and it’s still in beta. I’m very, very impressed. In addition to the multi-platform feature additions described much earlier in these hands-on impressions, Apple has focused on widgets, video conferencing features, and to my surprise, a new set of tools called the Game Porting Toolkit for this year’s macOS release. As I said previously, none of these additions dramatically enhance the macOS experience, but they’re pretty nice.

Widgets receive massive improvements in macOS Sonoma and are the highlight of this year’s macOS release. Since Big Sur, widgets have lived in Notification Center. Now they live in a new place alongside Notification Center: the Desktop. Dashboard is back, everyone — you can now freely place widgets wherever you’d like on the Desktop, in all four sizes (small, medium, large, and extra-large from the iPad). There’s no grid of pre-defined points to pin widgets to; it’s truly reminiscent of Dashboard from the bad old days, when you could put widgets wherever you wanted. You simply drag them out of the newly redesigned widget library, place them where you’d like on the Desktop, and they stay there. Once placed, it’s easy to move them, too — just drag them wherever you’d like, or right-click to change the size or remove the widget entirely. If you already have widgets on the Desktop and drag another one out from the library, a handy alignment guide shows up to help you maintain spacing between widgets. It doesn’t force you into placing the widget next to your other ones, but it does make it a lot easier to do so if you want to. When another app is in the foreground, widgets become translucent, losing all of their color, similar to Lock Screen widgets on iPhone and iPad. It’s a cool effect, in my opinion, and makes widgets feel more at home on the Mac. They shouldn’t be the center of attention when a user is focusing on another app; rather, they should seamlessly fade into the background — and that’s exactly what they do in Sonoma. As soon as the Desktop is in focus (you’ll know because “Finder” is shown in the menu bar), the widgets spring back to life. Clicking a widget opens the corresponding app, as it would in Notification Center or on the iPhone and iPad — as it should — and widgets on macOS can even be interactive, as they are on iPhone and iPad (if optimized). This implementation is super neat, and while I don’t see myself as a Mac widgets user, I do like how this feels like a better implementation of Dashboard. What’s old is new; I’m a fan.
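
Interactivity is the one piece that does need developer adoption: with the App Intents plumbing new in iOS 17 and macOS Sonoma, a widget button can run code without launching the app. A minimal sketch — the timer names are hypothetical:

```swift
import AppIntents
import SwiftUI

// Hypothetical app-side state the widget toggles.
final class TimerStore {
    static let shared = TimerStore()
    private(set) var isRunning = false
    func toggle() { isRunning.toggle() }
}

// The action a widget button triggers — it runs without opening the app,
// and the system re-renders the widget afterward.
struct ToggleTimerIntent: AppIntent {
    static var title: LocalizedStringResource = "Toggle Timer"

    func perform() async throws -> some IntentResult {
        TimerStore.shared.toggle()
        return .result()
    }
}

// Dropping Button(intent:) into a widget view is all it takes to opt in.
struct TimerWidgetView: View {
    var body: some View {
        Button(intent: ToggleTimerIntent()) {
            Label("Start/Stop", systemImage: "timer")
        }
    }
}
```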

Image
Widgets in macOS Sonoma. // Screenshots: Eshu Marneedi

That isn’t even all of it, though. What surprised me the most is a feature that doesn’t even have a real trademarked name, which is a shame because I think it deserves a clever one. iPhone widgets can now live on your Mac, as long as your iPhone and Mac are signed into the same iCloud account. Let me explain: if you have an app on your iPhone that has widgets, your Mac will automatically show those widgets in its widget library and fetch the data from your iPhone periodically whenever the two are on the same Wi-Fi network or nearby (which, if we’re being honest, is 100% of the time). You don’t need to have the apps installed on your Mac at all — all of the data comes from your iPhone. This is so, so cool, and works so well too! Developers have no way to disable this “widget sharing” feature, either, which means it works flawlessly for all apps, including apps that perform frequent, data-heavy network calls like Twitch. I don’t know why you’d have a Twitch widget on your Mac Desktop running 24/7, but if you do, it updates the live previews quickly and flawlessly. It works like I assume it would if Twitch made its own Mac app — seriously, it’s awesome. Widget editing is also fully functional, with support for all settings; it just pulls the data and settings from your iPhone and passes them on to your Mac. Brilliantly executed; seamless. I bet that when the masses get this feature when Sonoma ships later this year, they’ll have no idea these widgets are from their iPhone and will just be psyched that they’ve got a bunch of new widgets on their Mac for “free.” Apps that have both an iPhone and a Mac variant show a toggle for whether you’d like to use the iPhone or the Mac widget (I don’t know why you’d use the iPhone one; I’d assume the Mac-native one is a tad more reliable and faster to update), and widgets from your iPhone are marked with a “From iPhone” badge in the widget library. I’m a huge fan of this feature and am very impressed with how Apple pulled it off while being efficient with both iPhone and Mac batteries. Kudos.

Image
Widgets from your iPhone, now on the Mac. // Screenshot: Eshu Marneedi

Let’s get into video conferencing, which, to my surprise, took up a decent chunk of the keynote address. First, some screen sharing improvements: apps that use the new APIs — including Zoom, which delightfully surprised me — now appear in the system window picker. What does this mean? When you click and hold the green dot in the top-left corner of any given window, you’ll be presented with a fourth option: share that window in your video conferencing app of choice (once again, if it uses the new APIs). No more fiddling with the clunky custom Zoom screen sharing picker — macOS handles all screen sharing permissions from now on. And when you’re sharing, a handy privacy indicator appears in the menu bar, letting you easily stop sharing or limit sharing to just one window. You can also now overlay your camera feed atop your shared window, or place the shared window in the background of your space. Zoom and Teams have tried to make this happen for years, but it never really worked right because there was no system integration. Now there is, and it’s supposed to work in Zoom right out of the gate. (Footnote: it doesn’t actually work in Zoom yet, because Zoom hasn’t been updated to use the new screen sharing APIs.)
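
The API behind all of this is ScreenCaptureKit’s new system-level picker. Here’s a rough sketch of how I understand adoption works in macOS Sonoma — treat the specifics as an approximation, not gospel:

```swift
import ScreenCaptureKit

final class SharingCoordinator: NSObject, SCContentSharingPickerObserver {
    private let picker = SCContentSharingPicker.shared

    func startPicking() {
        picker.add(self)        // observe the user's choice
        picker.isActive = true  // opt this app into the system-level picker
        picker.present()        // macOS shows its own window/display picker
    }

    // The system hands back a content filter describing what the user chose;
    // the app never enumerates windows or handles permissions itself.
    func contentSharingPicker(_ picker: SCContentSharingPicker,
                              didUpdateWith filter: SCContentFilter,
                              for stream: SCStream?) {
        // Start or reconfigure an SCStream with `filter` here.
    }

    func contentSharingPicker(_ picker: SCContentSharingPicker,
                              didCancelFor stream: SCStream?) {
        // The user dismissed the picker without choosing anything.
    }

    func contentSharingPickerStartDidFailWithError(_ error: Error) {
        // The picker could not be presented.
    }
}
```

The design choice worth noting: the app only ever receives a filter for the content the user picked, which is why the whole experience can be both less clunky and more privacy-conscious than the old roll-your-own pickers.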

Image
Sharing windows just got a lot easier. // Screenshot: Apple
Image
And now, you can overlay yourself atop your window! // Screenshot: Apple

New camera video effects have also been added, and they work in all apps. The first lets “fun” (the boomer in me despises this) 3D augmented-reality effects appear behind you on your camera feed when you gesture at the camera (thumbs-up, hand-heart, etc.); the second, called Recenter, has macOS automatically zoom and pan your video feed to keep you centered within the frame using Center Stage. The latter isn’t particularly new, but what is new is a menu bar item that appears while you’re on a call and exposes controls the system previously handled by itself.

I… don’t reckon I’ll use the reaction camera effects, but I’m pretty excited about improvements to screen sharing. The various fragmented custom implementations across apps have been painful to use, quite frankly. They always make me feel like my computer has been “taken hostage” and the annoying custom toolbars with oodles of options don’t help. I’m glad Apple is offering new APIs that clean this experience up and make it more privacy-conscious and enjoyable to use.

Macs can’t game — everyone knows that. And Apple… hasn’t cared about that, pretty much ever. Apple deliberately avoids the gaming market and makes it hostile for developers to build games for the Mac. With all of this said, I wasn’t expecting Apple to enhance gaming on the Mac at WWDC this year. Shockingly, they did — and the new tools they introduced change the landscape of Mac gaming on Apple silicon. Macs… actually might be able to game in 2023.

The new tools that make this kind of statement not totally outlandish are dubbed the “Game Porting Toolkit,” a new set of tools that lets developers port and run their DirectX 12 games (important vocabulary: DirectX 12 is the latest iteration of DirectX, the Windows framework used to develop games) on Apple silicon Macs via an Apple-designed code patch to Wine. The graphics side is powered by Apple’s new translation layer, D3DMetal, which maps DirectX 12 calls onto Metal, Apple’s native graphics framework. Christina Warren, writing for Inverse, explains it well:

Apple added DirectX 12 support via something it is calling the Game Porting Toolkit, a tool Apple is offering to developers to see how their existing x86 DirectX 12 games work on Macs powered by Apple silicon. That toolkit largely takes place as a 20,000 line of code patch to Wine, a compatibility layer designed to bring support for Windows games to platforms such as Linux, BSD, and macOS.

Wine doesn’t emulate; it translates Windows API calls into ones macOS can understand, and this is important because it provides a significantly better experience than emulation. The game is, well, running on macOS by calling macOS APIs (notably Metal), not running inside a virtual copy of Windows the way emulation works. Because Wine does the hard work of mapping Windows API calls to macOS ones, developers don’t need to change their code at all — the game just thinks it’s running on Windows because it’s making DirectX 12 API calls; the changes only come in translation. The x86 instructions themselves are still handled by Rosetta 2, Apple’s x86 → ARM translation layer introduced in November 2020 alongside the first M1 Macs (and only ever intended as a stop-gap so users could run their old x86 apps on Apple silicon Macs when they first launched) — recompiling these games for ARM isn’t realistic, since nearly all Windows games are built for x86 (ARM on Windows is an abysmal failure). What vanilla Wine never had, though, was a good answer for modern graphics: there was no mature path from DirectX 12 to Metal, so recent titles would either refuse to launch or crawl. That’s the gap Apple’s “20,000 line of code patch to Wine” fills — D3DMetal translates DirectX 12 calls into Metal ones. The result is impressive, near-native performance. Users have found computationally demanding titles like Cyberpunk 2077 on Ultra settings running okay on Sonoma on low-end, older Mac laptops, and popular AAA titles like Diablo IV running impressively on high-end M2 Max machines. This is a huge step forward for Mac gaming, and one that’s sorely needed, too.

Many enthusiasts have already found ways to get popular titles running on Sonoma by way of tools like Whisky, a native wrapper for Wine and Game Porting Toolkit that lets you run DirectX 12 games on Sonoma, and it works well. But Apple… has other plans for Game Porting Toolkit, as indicated by the name “Game Porting Toolkit.” Apple wants developers to use it as a starting point to bring converted games to the Mac App Store — in other words, to make their games fully native and easy to download, with Apple taking a 30% commission on in-game purchases. I… don’t see this happening. Apple screwed the pooch on Mac gaming many years ago, and there’s practically no turning back now; the damage to the Mac’s reputation has already been done. Game developers — especially indie game developers — don’t have the time or patience to play ball with Apple to get their games on macOS, where the market is minuscule. Gamers use Windows, not macOS, and it’s going to take a lot of convincing for them to jump ship. As discussed on a recent episode of the Accidental Tech Podcast, Apple doesn’t have a competent, powerful, top-of-the-line Mac in its lineup, and it has no plans to make one. The M2 Ultra falls laughably short when pitted against Nvidia’s finest, the RTX 4090. Where efficiency doesn’t matter, Apple silicon falls short. Unless Apple focuses heavily on prepping its hardware to support AAA titles, it’ll never match the performance of Windows desktops. If Apple wants to truly enter the gaming market, it’s going to have to try harder.

None of this is to “rain on Apple’s parade” or otherwise diminish the value of Game Porting Toolkit. As Christina puts it, this is “the best thing to happen to Mac gaming in 30 years,” no doubt. After years of stagnation, Apple is finally putting effort into the gaming story on the Mac. It’s not just “Apple Arcade” anymore — these are real games that you can play on real Macs. But will it work? Certainly not the way Apple wants it to. Steam exists and it’s very popular; people don’t buy games from the Mac App Store, and they will not start now. Apple could try harder. Apple may try harder in the future. Who’s to say? They could go figure out a deal with Steam and all the big game studios to get people to buy games on the Mac. They could figure out a way to let people play games they already bought on Windows. They could actually try competing in the professional desktop market with a Mac Pro that rivals the RTX 4090 and that’s actually worth $7,000. Right now, Apple is doing none of that — their most expensive computer is a Mac Studio with a backpack. That’s not going to fly in 2023. I’m optimistic, I want to remain optimistic, I want Apple to take gaming seriously, and I want game studios to take the Mac seriously. Enough wisecracks about the Mac not being able to game — it should be able to game, and it may with the help of Game Porting Toolkit. The future is bright, but our work here is far from done.

If you can’t tell by now, I’m very pleased with macOS 14 Sonoma. It’s a major quality-of-life update, and it comes back around to the subtitle of this article: “Bigger doesn’t always mean better.” For years, Apple has focused on weird, unnecessary feature additions like Memoji and Shared With You while neglecting the parts of the OS that people use often. macOS Sonoma disrupts that pattern, and while it didn’t bring any improvements to System Settings, which is still a total abomination of an app, or make SwiftUI forms look nicer, or fix Stage Manager weirdness, it’s a step in the right direction, and I’ve been enjoying using it. It’s smoother, it’s faster, it’s less glitchy — it feels more polished. Everything just works, the animations are smooth and delightful, and the whole system feels more whimsical and Apple-like — a stark contrast to the crappified Big Sur. I give macOS Sonoma an 8/10. It’s an excellent release of macOS, balancing features and stability, and I’m excited for users to receive it this fall.

AirPods

This year, AirPods gain much-needed improvements to Automatic Switching — so notable that they warranted an entire mini-section within the keynote. Automatic Switching, first announced at WWDC20, has been a dumpster fire of a feature, quite frankly. In its current iteration, in macOS Ventura and iOS 16, Automatic Switching rarely works, and when it does, it’s inconsistent. If you’re playing something on your iPhone and decide to play something from your web browser on your Mac, the audio on your iPhone will not stop playing, and the audio from your Mac will blare loudly. This is PTSD-inducing — since this feature launched, I’ve kept my Mac muted 24/7 and always check the menu bar to make sure AirPods are connected. Sometimes, it does work, like when you play music through the Music app — but hitting play on your iPhone again triggers the same problem: the audio doesn’t auto-switch back. It’s such a frustrating user experience, one that feels archaic and, dare I say, worse than the good old days before iCloud-synced AirPods, when you’d have to connect your Bluetooth headphones to each device separately.

All of these issues have been fixed in macOS Sonoma and iOS 17. If you play audio from a web browser on macOS Sonoma, audio switches seamlessly, both on my AirPods Max (which surprised me) and my AirPods Pro (2nd generation). And when I want to go back to the other device, that works too. I’ve even stress-tested this, switching quickly between devices and apps — it worked flawlessly. The way this works is interesting to observe, too. Previously, AirPods would only connect to your other devices when audio started playing; otherwise, they’d remain in standby, and the default audio source for the device not playing audio would be that device’s internal speakers. Now, AirPods remain connected to all of the devices on your iCloud account, constantly. You can see this when you have audio playing from your iPhone and look at Control Center on your Mac — the AirPods appear connected as if you connected them manually, even though they’re playing audio from another device. As soon as you start playing something on your Mac, the iPhone’s audio stops. This is how Automatic Switching should’ve worked from day one. The new system is much more reliable, and I haven’t had it fail once in 3 months. In fact, I sometimes find it to be too good, like in this rare but comical example: if I pause audio coming from my iPhone from the AirPods themselves (by pressing the on-device button or stem) while I’m using my Mac (a common scenario), then hit play on the AirPods again after a while, the Music app on my Mac will auto-launch instead of resuming the audio from my iPhone. I assume this happens because the AirPods remain paired to both devices and lose track of which one was last playing, so they just grab audio from the device I’m actively using — which is obviously my Mac. It’s a bit inconvenient, but I can always hit resume on my iPhone again to get the AirPods to switch back. I’m very happy that Apple has invested time into making Automatic Switching good again; I’ve been complaining about this for years.

A much smaller new feature, one that only works on AirPods Pro 2, is called “Adaptive Audio.” Apple describes it as “[a blend of] Transparency and Active Noise Cancellation to tailor the noise control experience as you move between changing environments and interactions.” All of that seems too… buzzword-y for my liking. Full disclosure: I haven’t actually tried this feature, because doing so would require installing AirPods beta software, and after my AirPods Max woes of late 2021, I’m never doing that. But, buzzwords aside, this seems like it could actually be quite useful. There are many times I’ve wanted to control how much noise cancellation AirPods Pro pipe into my ears, but unlike with Sony’s WF-1000XM5, I can’t do that. Now there’s a feature just for that, which keeps ambient noise silenced but prioritizes voices. What I’m less enthused about is that this feature is only available on AirPods Pro 2, and not on Apple’s highest-end (only in price, apparently…) AirPods Max, which cost an eye-watering $550. I assume this has something to do with the real-time processing speeds of the H2 chip found exclusively in AirPods Pro 2, but the cynic in me wants to believe Apple is gatekeeping features for its latest-generation product. Regardless, I’m excited about this feature.

I give this year’s AirPods updates a 9/10. I do want more features, like a real AirPods app for true customization, but older products getting newer features and enhancements is always a good thing.

Well, that was a lot. I didn’t foresee this article being over 17,000 words when I first started writing it 2 months ago (this is not an exaggeration, check the screenshots), but here we are. All of this software should be coming out in September and October.
