Hands on with iOS 16, iPadOS 16, macOS 13, and Xcode 14

Eshu Marneedi
35 min read · Jun 11, 2022


Look, I know what you’re thinking. Event recaps are boring. But an event recap this is not — Apple just wrapped up their… let’s see here… 34th Worldwide Developers Conference, and it was pretty interesting. I’m not sure we’ve had this good of a WWDC since… 2020. This WWDC brought updates to many parts of Apple’s software, new APIs for developers to take advantage of and use in their apps, and a whole flurry of bugs. I have the developer previews of all the new major software releases that came out of WWDC, and I’m going to talk about what’s new in the updates, how I feel about them, how developers can use them, and why they’re important.

Let’s talk iOS, starting with the name. A couple of weeks ago, Apple discontinued the last remaining non-iPhone device that ran iOS, the iPod Touch. I was wholeheartedly expecting Apple to finally retire the iOS name and reel it back to iPhoneOS — the name the iPhone originally shipped with — but they didn’t, complicating their naming scheme even more. What makes this more confusing/stupid is the fact that the iPod Touch isn’t even supported in iOS 16. And it’s not only the iPod Touch that lost support: the iPhone 6S has finally been retired, and the iPhone 7 and 7 Plus are also no longer going to receive software support. I’m not bothered by these omissions — the A10 chip starts to chug when doing even basic tasks. However, you could make a valid argument that Apple could have just stripped some features from the 7, as they did from the 8 and X, to make it work for another year.

So, with so many devices losing support, this should be a big update, right? Eh, depends on your definition of big. The Lock Screen undoubtedly gets the biggest enhancements of any part of iOS 16, finally trying to compete with our Android brethren. First, you can now customize the font and color of the time on the Lock Screen. If you had asked me two years ago, at the launch of iOS 14, whether Apple would ever let users pick between fonts on the Lock Screen, I would have bet actual money against it. Well, I was wrong, and I’m all here for it. You still can’t customize the accent color throughout the system like Material You and macOS let you, but maybe they’re saving that for iOS 17. Who knows. And that’s not even the beginning of it all — the Lock Screen takes a page out of the Apple Watch’s book and lets users create multiple Lock Screen designs. What makes these designs different from each other, you might ask? Well, alongside the watch-style picker, Apple brought complications to the iPhone Lock Screen. They’re functionally identical, but Apple calls the iPhone ones widgets instead of complications, which, honestly, makes sense. Developers can use WidgetKit to make these widgets — _davidsmith, the developer of the much-loved Widgetsmith application, has already demoed some.

This all sounds great so far, right? But there’s a problem I have with all of this. Yes, you can now press and hold on the Lock Screen, like on the Apple Watch, to bring up all of your saved Lock Screen designs and create new ones — but that’s not how most people are going to do it. iOS has always had a clear and distinct wallpaper view in Settings, and iOS 16 makes it more complicated than ever before. Now, when you click “Add a new wallpaper,” you get a brand-new view where you can pick between people, photos, emoji, and collections. Then, when you’re done making your selections, BOTH the Lock Screen and the Home Screen update. There is no way to specify that the changes should only apply to one screen like in previous iOS versions. Annoying. You might think you could click the little previews in the wallpaper view to change each screen’s wallpaper individually — and you’d be partially right. When clicking the Lock Screen, you can only add or remove widgets and change the way the time is displayed. There is no option to change the wallpaper. The Home Screen is a slightly different story — it shows you a new view with a picture button and a couple of color gradients. That’s it. You don’t get all of the customization options.
Let me walk you through how unintuitive this is: say a user wants a photo as their Lock Screen wallpaper and a system wallpaper on their Home Screen. In iOS 15, the user would go to the wallpaper settings, click choose wallpaper, pick the picture, click set as Lock Screen, click choose wallpaper again, pick the system wallpaper, and set it as the Home Screen. Done. In iOS 16, the user has to make a new wallpaper, choose the picture, set it, click Home Screen… and they won’t be able to pick the system wallpaper at all. This is the most unproductive and unintuitive user design I have ever seen. Apple should fix this in future betas.

The new Lock Screen and wallpaper customization in iOS 16.
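A quick developer aside on those Lock Screen widgets: they use the same WidgetKit machinery as Home Screen widgets, just with the new “accessory” families. A minimal sketch (all the names here are my own placeholders, not Apple sample code):

```swift
import WidgetKit
import SwiftUI

struct SimpleEntry: TimelineEntry {
    let date: Date
    let text: String
}

struct Provider: TimelineProvider {
    func placeholder(in context: Context) -> SimpleEntry {
        SimpleEntry(date: .now, text: "—")
    }
    func getSnapshot(in context: Context, completion: @escaping (SimpleEntry) -> Void) {
        completion(SimpleEntry(date: .now, text: "Hello"))
    }
    func getTimeline(in context: Context, completion: @escaping (Timeline<SimpleEntry>) -> Void) {
        // A real widget would schedule future entries; one static entry is enough here.
        completion(Timeline(entries: [SimpleEntry(date: .now, text: "Hello")], policy: .never))
    }
}

@main
struct LockScreenWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "com.example.hello", provider: Provider()) { entry in
            Text(entry.text)
        }
        // The new accessory families are what put a widget on the Lock Screen.
        .supportedFamilies([.accessoryCircular, .accessoryRectangular, .accessoryInline])
    }
}
```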

There’s still more Lock Screen stuff I want to talk about, starting with notifications. Notifications in iOS have always been a mess. In iOS 16, Apple has tried to rectify that with a new feature that I’ve wanted for a long time called Live Activities. The idea is pretty simple — say you ordered food delivery on iOS 15. You get a notification for every update your driver posts. In iOS 16, developers will be able to use new APIs to make Live Activity tiles on the Lock Screen that stay there until your order is complete. No more notification spam — just one interactive mini applet with all the information. SUPER, SUPER HANDY. I cannot wait for this. Rain storm? Live Activity. Delivery? Live Activity. I want Live Activities for everything. Hopefully, we see Live Activities in apps soon, preferably in the fall, since Apple has said the Live Activities API is coming to Xcode 14 for all developers, for free. Apple already has two Live Activities in the betas right now — the timer and the media player. The timer looks great; no more fiddling around with that stupid small text that takes up the date area on the Lock Screen. It now sits at the bottom like every other notification (yes, notifications now roll in from the bottom, thank goodness) and has options to pause and delete. Nice! Enough of the Lock Screen — there’s still tons more in iOS 16, and these features carry over to every platform in a world where Apple is unifying its codebases across hardware.

I want to first talk about the all-new Home app. Yes, you heard that right — there’s a new Home app. Unfortunately, the new Home app doesn’t bring the backend improvements smart home enthusiasts like myself have been hoping for. So no 24/7 HomeKit Secure Video, no bug fixes, no error descriptions, no Siri improvements, etc. In fact, I’ve found that it’s buggier than the last Home app — here’s hoping that this is just a beta thing and that this stuff is fixed before the final release. But let’s talk about the UI: first, Apple finally killed the individual room pages — the scrollable tiles from the old app are gone, and instead, the tiles are organized by room in the main view. Still, that’s an inconvenience for people with many devices and many rooms, like myself. For that, there are new category buttons at the top. Before, the top of the Home app was filled with quick stats, like the temperature, lights, fans, and more. Now, you can click those buttons to view every accessory that fits in that category. For example, “Climate” is an option, and the subtext is the temperature in the house pulled from temperature sensors and thermostats. Clicking on it shows you all of the fans, coolers, heaters, and thermostats in your house. Same with lights, security, speakers, and water. I’ve been using this view since I downloaded the beta, and it’s fantastic. If I want to turn my lights off, I click Lights and toggle the ones I want off. Super useful when you have lots of accessories. This new Home app prioritizes categorization — no more scrolling through the 50 or so accessories in a list or swiping through the individual screens. It’s all sorted by category, room, and house. But that’s not all: all of your cameras now show up at the same time (even though they aren’t live, which is annoying as sin), and there’s a new favorites view at the top of the screen that you can customize. FINALLY! Now I don’t have to go scrolling just to turn off the living room lights. So fantastic. Apple aced the Home app. Too bad they didn’t fix HomeKit.

The new Home app in iOS 16.

Family Sharing also gets an essential feature that should have been here since the very beginning — or would have been, as John Siracusa would say, if Apple knew how families work. There’s now a shared photo library in Photos. Family members can move their family shots into a shared library that everyone can access and edit. They can also turn on a toggle in Photos to have the photos they take on their phone automatically upload to the shared library. Finally, gone are the days of maintaining one library that only a single person has access to. One more thing — the Photos app can automatically identify that your family members are in a photo and then automatically throw it into the family photo library. That way, when your family’s on vacation, all of your vacation shots go right into the correct library. Super neat! Keeping with the Photos theme, you can now cut subjects out of your photos with the power of ML by just tapping and holding. It uses the same tech the Apple Watch (and now the iPhone) uses to cut people out of photos and apply a depth effect to them. What’s neat is that these don’t need to be Portrait mode photos. They don’t even need to be from an iPhone. The phone will just identify the subject and make it a PNG. I’m going to be using this all the time to make quick PNGs because it does a fair enough job. Quinn Nelson from Snazzy Labs demoed this feature, and it’s insane. I saw a couple of nicks here and there where it wasn’t able to cut the subject out from the background entirely (and sometimes, the phone wasn’t even able to identify that there was a subject in the photo and didn’t give me the option at all), but it’s also good to keep in mind that it took four seconds. You could always take the finished result into Photoshop, do some light editing, and get a near-perfect result. I’ve tried it with a couple of graphics and it works surprisingly well. I’m a huge fan.

The Photos app in iOS 16 can now cut subjects out from a photo.

You know what else I’m a huge fan of? Live Text in video. Gone are the days of typing out a line of code that you found in a video or screenshotting a graph to use Live Text. Live Text is now built into the native video player, and it’s super handy. I’ve already used this feature multiple times while going through WWDC sessions, and as a student, I can already tell it’s going to be a godsend. Unlike most OCR programs, Live Text doesn’t completely wig out with code or unusual fonts. It’s almost like it was designed by programmers… neat, huh? One small complaint — and this could just be a beta thing — is that every time text shows up in a video in Apple’s new video playback interface (which I’ll talk about a little later), the Live Text button appears smack dab in the middle of the screen. So annoying. I hope this button is moved to the bottom or onto a toolbar somewhere, because that’s kind of bad.

Apple didn’t stop there with ML, though — they’ve also copied Google’s dictation features, and honestly, I can’t complain. This was much needed. Dictation now automatically inserts commas, periods, question marks, and more. And you can use the keyboard while dictation is on, so when dictation inevitably makes a mistake, you can correct it without turning dictation off. You can even dictate while you’re typing, if that’s how you roll. In my experience, this doesn’t mean dictation has improved in accuracy — it still has some major shortcomings. Part of this is English’s fault (the aforementioned Snazzy Labs made a video about this a couple of years ago), but Google does a much better job of dictation as of right now. Shouldn’t have let your ML director go over your senseless WFH policies, Apple!

Enough ML — Apple is turning into a bank. Well, kind of. Apple introduced some cool new Apple Pay features, one of which is something they dubbed “Apple Pay Later.” It lets you split the cost of an Apple Pay purchase into four equal payments spread over six weeks, paid to Apple. I’ll be honest, I don’t like the direction this is headed. With Apple essentially becoming a bank for all of your purchases, it creates a completely new level of ecosystem lock-in. I mean, imagine having to keep using an iPhone because if you don’t, you won’t be able to pay off your debt. This business is profitable but sketchy. Do you really trust a tech company with this sort of stuff? I wouldn’t, and I feel like Apple has to work on that a little bit. It’s the same reason I haven’t applied for the Apple Card (other than the fact that the Apple Card is a bad card). If you don’t care for this stuff, like me, there’s something good for all of us Apple Pay users — Apple is building a delivery tracker into the Wallet app. As crazy as that might sound, it’s actually pretty useful. The downside is that websites and retailers have to opt in, and they probably won’t. But if they do, and you purchase a product on their website using Apple Pay, that order shows up in the new Tracking section. I hope retailers implement this, because I hate tracking stuff. Why isn’t there an easy way to track everything I order? I digress.

Last thing before I move into rapid fire — Focus gets some new enhancements and a new API. I’ve never been super into Focus, mainly using it for silencing notifications during meetings and when I’m sleeping.
That might change with iOS 16, though — because apps can now filter content based on Focus modes with a new feature Apple dubs “Focus Filters.” A Focus Filter can, for example, switch between work and personal email accounts, filter calendar events, show work contacts instead of personal ones, and even show different Lock Screens with different widgets. Developers can implement the Focus Filter API in their apps, letting users choose what they want displayed in each Focus. I haven’t been able to test this since I don’t use Apple Mail or Apple Calendar, but it’s a welcome improvement to a feature that has sort of died down since the launch of iOS 15.
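For the developers in the room, here’s roughly what adopting a Focus Filter looks like. This is a minimal sketch of a SetFocusFilterIntent with a hypothetical work/personal account toggle; FilterStore is my own placeholder, not an Apple API:

```swift
import AppIntents

// Hypothetical app-side storage for the user's choice.
final class FilterStore {
    static let shared = FilterStore()
    var showWorkAccount = true
}

// The system runs this intent when a Focus mode with this
// filter configured becomes active.
struct AccountFocusFilter: SetFocusFilterIntent {
    static var title: LocalizedStringResource = "Choose account"

    // What the user toggles in the Focus settings UI.
    @Parameter(title: "Show work account", default: true)
    var showWorkAccount: Bool

    // How the current configuration is summarized in Settings.
    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "Account",
                              subtitle: showWorkAccount ? "Work" : "Personal")
    }

    func perform() async throws -> some IntentResult {
        // Persist the choice so the app can filter its content.
        FilterStore.shared.showWorkAccount = showWorkAccount
        return .result()
    }
}
```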

There are a lot more changes Apple’s made across tons of apps that I’m not necessarily interested in, so it’s rapid-fire time. First up, Messages. You can now edit and delete sent messages in case you make a mistake. Edits can only be made within 15 minutes of the message being sent, and both features are iMessage-only. Unfortunately, there’s no RCS support in iOS 16, a feature many of us were hoping for. In addition, you can now mark chats as unread/read so you no longer forget to reply to that message you’ve been holding off on. Neat. I couldn’t test the cross-device behavior because I don’t have two devices that run iOS 16 — so beware: if you’re messaging someone without an iOS 16 device, the message won’t show up as edited or deleted on their end. Luckily, iMessage tells you this, but it’s something to be aware of.

The Messages app now lets you mark messages as unread and edit messages.

The Mail app becomes Gmail with send later, undo send, and follow-up reminders. Yes, Apple finally remembered they make a Mail app. No other revolutionary features here.

The Mail app finally lets you set reminders and send emails later.

Safari now gets Shared Tab Groups, so if you’re planning an event or something, you can share a group of tabs and everyone in the group can add to it. Neat, but very niche. Maps gets multi-stop routing like Google Maps, and public transit ticket prices now show up when you’re planning a route. I’m an Apple Maps user, so this is nice. Welcome to the club of features Google has had for years now, Apple.

The Maps app now has support for adding stops to a route.

The Health app now has medication reminders, which is a godsend for people who take medicine regularly. It uses VisionKit, so all you have to do is scan the medicine bottle and it will schedule the reminders for you automatically. This doesn’t work as well as it did in the keynote video, but that footage was most certainly edited to look cool.

The Health app now gets medication reminders.

Okay, last thing — Spotlight search gets some enhancements that’ll finally make it compete with Alfred, including something Apple calls “intelligent results.” I don’t think that’s an accurate name, though, because when I used Spotlight to search for the restaurant “Chili’s,” Spotlight recommended some rock band as a Siri Suggestion that happened to have the word “Chili” in its name. Yikes. Jokes aside, Spotlight has been bad for decades, and even though it’s better than Windows search, it pales in comparison to Google and Alfred. And with prettier, better-equipped options like Raycast coming to this market, Apple can and should continue to improve how Spotlight displays content. I mean, how cool would it be if you no longer had to use any in-app search views at all — just Spotlight? Want to Google something? Spotlight. Look up a photo? Spotlight. A setting? Spotlight. You get the idea. For now, Spotlight just isn’t that advanced, but it could be. Phew. That’s a ton of stuff. Some of it useful, some of it playing catch-up with Google, and some of it niche.

Spotlight now makes some interesting suggestions.

I want to talk about iPadOS next, because Apple introduced some fundamental iPad-specific features that I got to try out — and that are important to understand to get at what Apple is trying to do with this platform. The main highlight of iPadOS is something called Stage Manager, and it’s a really weird feature. If you don’t want to witness the rant that’s about to commence, I suggest you skip this part and watch Luke Miani’s excellent video about the topic. Still here? Fine. I’m mad and happy at the same time. On one hand, this update finally brings resizable windows to the iPad, which is something iPad people (and myself, who wants to be an iPad person) have wanted for years. On the other hand, it deliberately puts arbitrary limitations on what the iPad can really do and lets Apple advertise a feature that doesn’t really work.

So what is Stage Manager? In essence, it’s a mode, triggered by a toggle in Control Center, that lets you add multiple apps to a “stage,” which is sort of like a dock but with app previews. I’m sure that confused you even more, so let me describe it — all of the apps that you add to Stage Manager show up on the left of the screen. Then, when you’re ready to use an app, you pull it to the main canvas by clicking on it. These apps are all freeform — meaning you can click on each one and resize them like you would on a Mac. You can always add more apps to the stage by opening them from the Dock, which is visible at all times in Stage Manager, or from the Home Screen. How do you get to the Home Screen, you might ask? Well, this is where the “it’s like an app” part comes in, because you simply swipe up from the bottom of the display, like you would in an app, to get back to the Home Screen. You then select an app from the Home Screen like normal.

So, you might be thinking, “well, this sounds great! It’s essentially freeform windows with the extra step of enabling Stage Manager in Control Center, right?” Unfortunately, Apple has added absolutely unnecessary restrictions to this feature which lessen the appeal of an otherwise great experience. First restriction: you can only use this feature on M1 iPads. This is such a shame because no justification can be made here. When Digital Trends requested an explanation for this lock-out, Apple gave them the following statement: “Stage Manager is limited to M1 chips mainly due to iPadOS 16’s new fast memory swap feature, which Stage Manager uses extensively. This lets apps convert storage into RAM (effectively), and each app can ask for up to 16GB of memory.” However, this is bogus. According to Apple’s OWN documentation, memory swap is available on the iPad Air 4th Generation with the A14 chip. Okay, well, that’s just one iPad, you might say. So, let’s look at the rest of the list… M1 iPad Pro 12.9”, M1 iPad Pro 11”. What’s wrong with this list? The M1 iPad Air (which supports Stage Manager, by the way, since it has an M1) DOESN’T support memory swap — the very thing Apple says an iPad needs for Stage Manager. I’m sorry, Apple, but anyone can see that you’re trying to shepherd users into buying the new iPad. It’s absolute BS, and Apple should be ashamed.

But that’s not all — Apple limits users to only eight apps at a time in Stage Manager, even on the highest-spec iPad Pros. I have an 11” iPad Pro, and I can only use four apps at a time. Meaning that if I want to use Safari, Craft, Messages, Fantastical, Spark, Music, Reminders, Home, and Files — a completely normal setup for most Mac users — that’s not allowed! How is this acceptable when the 2017 MacBook with a Core m3 and 8 gigs of RAM can run Stage Manager on macOS (yes, macOS gets Stage Manager too — I’ll talk about that in a bit) with no limit on how many apps the user can run at once, but the $2,400 12.9” iPad Pro, with the SAME M1 chip as the M1 MacBook Air and 16 gigs of RAM, has this absolutely asinine, arbitrary, useless limit? This has clearly been made to make the iPad and its users suffer, and I seriously don’t understand how Apple can neglect their massively overpriced 12.9” Kindle. If Apple wants to pitch the iPad — a magical, near-bezel-less, ultra-powerful, compact tablet — as a genuine laptop replacement, they need to sit down and actually develop their software for this device. It’s not that Apple is making all iPads equal in capability, as many speculated last year when the iPad got the M1 — Stage Manager LITERALLY splits the lineup into M1 and non-M1. It’s that Apple doesn’t care about the iPad and its unique capabilities and form factor, and that is downright outrageous as an iPad customer. Apple can and should do better, and for now, they should sit in shame.

I’m not done, though — as if Apple hadn’t done enough damage by upselling the M1 chip and placing ridiculous limitations on what users can do with it, they’ve also limited the core reason Stage Manager exists on the iPad at all: the “resizable windows” part. On macOS, app developers can specify minWidth and minHeight as well as the max variants of those modifiers, and the OS never limits the user’s ability to resize the window beyond that (other than making sure the app fits on the screen and respects the safe area). This doesn’t apply to iPadOS. iPadOS only lets users pick between a select set of sizes (square, wide rectangle, full-screen, etc.). It acts like you can resize app windows freely, but as soon as you let go of the grab bar at the bottom of the window, it snaps to one of the predefined sizes. Apple took a feature that could have been great — revolutionary, even, for the iPad — and purposefully made it dumb. In short: yes, we have windows on the iPad now. No, we don’t have FREEform windows on the iPad. Apple stripped all the choice and customization out of a feature that is supposed to enhance user freedom. I’m disappointed. Til next year.
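To make the contrast concrete, here’s the Mac’s behavior in SwiftUI terms. It’s a minimal sketch with arbitrary bounds: the developer declares limits, and the system lets the user resize freely anywhere between them.

```swift
import SwiftUI

@main
struct DemoApp: App {
    var body: some Scene {
        WindowGroup {
            Text("Resize me anywhere between the declared bounds")
                .padding()
                // The developer declares the limits; macOS never snaps
                // the window to preset sizes within them.
                .frame(minWidth: 400, maxWidth: 1200,
                       minHeight: 300, maxHeight: 900)
        }
    }
}
```

That’s the whole negotiation on the Mac. On iPadOS 16, there’s no equivalent; the system’s preset sizes win no matter what the developer or the user wants.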

The new Stage Manager feature in iPadOS 16.

The iPad isn’t all gloomy, though. It’s just mostly gloomy. The iPad gets a couple of other enhancements that make it sort of more Mac-like — well, it’s kind of like putting lipstick on a pig at this point, but I guess this is all we have. Other than the app updates I talked about earlier, there are a couple of iPad-only features (besides, of course, Stage Manager) and APIs for developers. First, the new Reference Mode for the 12.9” iPad Pro. I’m a video maker, so I know how frustrating it is to edit without a color-accurate panel. iPadOS will be able to use the different color formats/reference modes used in video studios. Want a taste of this right now? Go to the display preferences on a MacBook Pro with M1 Pro/Max, a Pro Display XDR, or a Studio Display, and you’ll see a load of color profiles and reference modes. That’s what’s coming to iPad. I’m a bit confused as to why Apple opted to only let the 12.9” iPad Pro use this feature. They could have brought it to the iPad Air and 11” iPad Pro, since the Apple Studio Display uses an LCD and has the same reference modes (albeit without HDR support). What concerns me more is that there’s no real way to take advantage of these reference modes on the iPad. There’s really only one “professional grade” video editor available for iPad right now, and that’s LumaFusion. Heck, there’s not even full Photoshop for photographers who could maybe take advantage of this. This is proof that Apple needs to bring even half-baked pro apps to the iPad. If Final Cut were on the iPad, there would absolutely be a use for this. But the number of people actually using an iPad for workflows where accurate color reproduction is a must is extremely small. But hey, I guess that’s the entire story of the iPad.

Reference Modes in macOS.

Speaking of the iPad being terrible — you know how the iPad hasn’t had a Weather app or Calculator, to the point that it’s become a meme? Well, the iPad is finally getting a Weather app. How exciting! Apple, you’re like 10 years late. Welcome to the late people club. For real though, I saw an Apple engineer on the Weather team literally brag about how hard they worked on the iPad Weather app, and I just had to laugh. The Weather app is basically the iOS app, but larger. Exactly what everyone was asking for. The tiles are now clickable, though, and they show more information about the selected weather condition. Looks like Apple is (two years later) finally putting its Dark Sky acquisition to work (we’ll get to this a little later), and I’m here for it. Nothing will beat Dark Sky, though. Beware, Apple. Other than the obvious sidebar and navigation changes, the app works as you’d expect. Honestly, it’s fine. I have no idea why Apple hyped this up during MKBHD’s WWDC 2020 chat with Craig Federighi, but hey, it exists.

The Weather app on iPadOS.

Something Apple made a huge deal about during the keynote was that they worked hard on bringing “desktop-class apps” to the iPad. I’ll talk more about all of the APIs later, but here’s the rundown — developers will now be able to let users customize iPad toolbars instead of presenting menus, similar to macOS. Honestly, this is a well-needed feature. I’m so sick of hunting for tools that would be right in front of me on the Mac, and I’m happy to see this come to third-party developers (I’ll drop a quick code sketch at the end of this section). Realistically, we probably won’t see this everywhere for a couple of years, since developers have to rework their navigation code to adopt it. But here’s hoping it comes soon. Searching, find and replace, and context menus are also here… do you see a theme? Every single feature beats around the bush of making apps that feel like Mac apps. They just barely fall short and miss important functionality. If Apple really wanted good, desktop-class apps on the iPad, they should let developers build Mac apps and distribute those on the iPad with no changes. I don’t care what it takes, because it’s about time. Ditch the stupid crap limitations.

I want to touch on one last thing before I get started on the craphole that is macOS, and it’s that the Files app can finally show folder sizes. I don’t know how this took so long. If Apple would prioritize its tablet UI, it could do some amazing things. It’s beyond me that these features — which should be considered basic — are just now showing up on iPad. The reason the iPad is not a computer, and is not even close to a computer, is Apple’s reluctance to give people what they want on the iPad. It feels like a blown-up phone. Heck, it’s still even called iOS internally. Except for a couple of UI tweaks, iPadOS just isn’t there. If you want an example of this — try hooking up a mouse to your iPad. Notice how scrolling has the inertia of a touch swipe. Notice how all the buttons are way too big for the cursor. Notice how everything you do sort of imitates a swipe instead of a click. None of this has been addressed, let alone fixed, in iPadOS 16. It’s the little things like this that make the iPad feel not ready, or half-baked. The truth is, iPadOS is a phone OS. Apple needs to put on its big boy shoes next year and truly develop iPadOS and make clear distinctions between it and its sibling.
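As promised, here’s a flavor of the desktop-class toolbar API. This is my own sketch based on the sessions, not Apple’s sample code, and the identifiers are hypothetical:

```swift
import UIKit

class EditorViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Opt into the denser, editor-style navigation bar (new in iOS 16).
        navigationItem.style = .editor
        // Give the bar an identity so the user's customizations persist.
        navigationItem.customizationIdentifier = "com.example.editorBar"
        // Items in the center area can be rearranged or removed by the user.
        let boldItem = UIBarButtonItem(title: "Bold",
                                       image: UIImage(systemName: "bold"),
                                       primaryAction: nil,
                                       menu: nil)
        navigationItem.centerItemGroups = [
            boldItem.creatingMovableGroup(customizationIdentifier: "bold")
        ]
    }
}
```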

Ugh, we’re getting really out into the weeds here. Let’s reel it back in and talk about macOS, arguably my favorite Apple platform. This year’s name is a weird one — Ventura. It’s a nice name, but I don’t think it fits the theme. Then again, the rumored “Mammoth” was hilarious. Slight tangent: is it Ven-too-ra or Ven-choo-ra? Neither, if we’re being specific — it’s the Spanish pronunciation, Ven-too-rrrra (with the rolled r). I digress. Onto the features. As I said, macOS Ventura’s highlight feature is Stage Manager, and honestly, I hate it. People on the Mac don’t want the limitations of iPadOS, and this takes everything bad about iPadOS’s weird window management and brings it to the Mac. It’s basically the same feature. To my surprise, it has a lot of support, and I don’t get it. I’d rather use the Dock to get to my apps. But to each their own, I guess.

Stage Manager on macOS.

macOS Ventura keeps up the theme of bringing all of the app updates from iOS to macOS (with the addition of Weather and Clock!), but there’s one app that finally gets a real update — System Preferences. Since the Kodiak beta in 2000, the Mac’s settings app has been named System Preferences, while the iPhone team, back in iPhone OS 1.0, named its app simply “Settings.” That stuck until this year, when the Mac app’s name changed… to System Settings. Congratulations, Apple — you’ve made something that already made no sense make even less sense. What an accomplishment! I’m hoping this was someone’s rush job on the night of June 5th and that the name changes to “Settings” to avoid confusion.

Other than the name, the app has been completely redone inside and out. It’s entirely built with SwiftUI, which is a blessing and a curse. The blessing is that it’s super responsive (kind of); the curse is that it’s full of bugs and is basically an infant — SwiftUI is super new. Unlike AppKit, which the old System Preferences was built with, SwiftUI brings the modern look and feel of Apple’s recent apps and platforms while being easy to write. I understand that Apple wants to push SwiftUI, but an app like Settings shouldn’t have bugs; I just want to get in, change what I need, and get out. SwiftUI just isn’t there yet. The app lacks animations, it frequently crashes, it doesn’t feel fluid, it doesn’t feel fast, there are frame drops — the list goes on. It’s not a great experience.

Framework aside, the app is basically the iOS Settings app with more preference panes. Pretty much everything has been renamed, remapped, and replaced with its iOS counterpart. All the Mac-specific stuff is still there, but it’s been simplified to the point where there are barely any differences between the iOS and macOS apps. Apple went a little too far in some places, like the trackpad view, where the gesture videos have been replaced with plain toggles. Still, I’m not super upset with this new app. Yes, it’s been oversimplified. Yes, it’s full of bugs thanks to SwiftUI. Yes, it’s not as capable as System Preferences. But you know what? I kind of like it — everything matches the Settings app, and all the clutter and baggage have been removed. The keyboard preference pane used to be cluttered, messy, and horrible; now it’s just a list of toggles. All of the misplaced items have been put in their proper locations, all of the checkboxes have been replaced with toggles, and the stupid, ugly, horrific icons from Big Sur have been replaced with SF Symbols. I hope Apple irons out the SwiftUI bugs and brings back some of the charm of System Preferences, but hey, this is only beta 1.

System Settings in macOS Ventura.

Now, onto the second biggest feature of them all, and my favorite announcement of WWDC this year by far: the new Continuity Camera. Continuity Camera is, first of all, a feature of the current times. With Apple remote for much of the past two years and millions more staying home, users have come to rely on the webcams in Apple products. A couple of months ago, Apple shipped one of the worst cameras on a modern Apple device in a product geared toward the work-from-home crowd — including the Apple employees who presumably built this feature. This seems like an apology for that. Continuity Camera connects your iPhone to your Mac to use it as a webcam, and it produces one of the best webcam pictures I’ve ever seen — and I frequently use a mirrorless camera as my webcam. This feature is seriously flawless — I haven’t found a single bug with it yet. The phone shows up as a camera in every single app; the developer doesn’t have to do anything. Once the user selects the iPhone as a camera, the iPhone makes an audible ding (which is quite pleasant, TBH) and the screen turns black. The camera feed and the phone’s notifications are then passed to the Mac. Is this a perfect solution for bad webcams? Absolutely not. When I’m in meetings, I always have my phone next to me to take notes and read messages. That’s not possible when it’s mounted to the top of your display, which is annoying. But the feature just works. The apps that Apple effectively Sherlocked here did this so poorly — you’d have to download an extension on the Mac, pay for software on the phone, connect them with a cable, fiddle around with the settings, and hope to god it worked. That’s not the case here. You just turn it on, and you’re set. It’s dead easy, just as advertised. The phone also provides video effects like Center Stage, Portrait Mode, and Studio Light, all of which make the image look… artificial. But hey, at least they’re there.

Continuity Camera on macOS. The black screen is the iPhone camera view.

If you thought Continuity Camera was awesome, wait until you hear about Desk View, a feature that goes along with it. Desk View, as the name suggests, gives you a view of your desk using your iPhone. How does it work? It takes the ultra-wide camera’s image and does some ML magic to create a top-down view of your desk. This sounded way too good to be true during the keynote, but in reality, it works! As Dave2D noted in his video, the camera can’t capture depth, so it really only works if you’re writing something down. And yes, the image is extremely distorted and fuzzy, because the camera is taking a feed that’s not top-down at all and warping it to look as if it were. I’m not going to claim I’ll be a frequent user of this feature, but it’s as impressive as it looked in the keynote. In the picture, you’ll notice that everything looks a bit stretched and crooked — that’s not how it looks in real life. Evidently, the ML needs some tweaking. But it’s a passable picture on a Zoom call, with all the compression and cropping. I’m seriously impressed with Continuity Camera — it takes a very niche feature that previously required expensive, janky software and does it the Apple way. Wow.

Desk View in macOS Ventura.

Damn Daniel, that’s a ton of new updates! But Apple wasn’t done — there are tons of improvements to Xcode and SwiftUI, and all-new APIs. I want to dive deep into some of this because half of it is pretty cool. If you’re not a developer, feel free to stick around — there’s still some fun stuff. But there’s also a lot of nerdery here, so hang tight. I’m going to start with Xcode, then move to APIs, then SwiftUI, because that’s the order Apple presented the sessions in.

Xcode 14 is a lighter update with fewer framework changes than usual, but it’s still feature-heavy. The canvas on the right gets a fresh coat of paint and performance improvements, and the live preview is now on by default. I like this change — you no longer have to build the app every time you just want to check whether your Spacer works correctly. It’s all built in. There’s one more update to the canvas, and it’s crucial for multiplatform apps: you can now select between multiple devices inside the canvas to see how your app renders on every OS, without running the app each time with a different target selected. It’s all right there, in multiple tabs. The same goes for Dynamic Type sizes, appearance, accessibility features, and more. You can create as many instances of your app in the canvas as you want and view them all at the same time — which I know a lot of developers are super excited about. It has worked great every single time and has made app development so much easier. Seriously. Apple says the Xcode canvas loads 25% faster, and I can attest to this: in complex projects, Xcode used to take about 1–2 minutes to fire up the canvas after the preview had been paused, but now it usually takes 30–45 seconds. It’s fast and feature-rich. I’m a huge fan.
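All of this is driven by your existing previews, so you can also pin down specific variants in code. A minimal sketch, where ContentView is a stand-in for your own view:

```swift
import SwiftUI

struct ContentView: View {
    var body: some View {
        Text("Hello, WWDC22")
            .padding()
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        // Each of these renders as its own live instance in the canvas.
        ContentView()
            .previewDisplayName("Default")
        ContentView()
            .preferredColorScheme(.dark)
            .previewDisplayName("Dark Mode")
        ContentView()
            .environment(\.dynamicTypeSize, .accessibility3)
            .previewDisplayName("Large Text")
    }
}
```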

Xcode Canvas in macOS 13 and Xcode 14.

However, the biggest feature by far is the new targets workflow. You can now add and modify targets after creating a project — so if you realize down the line that you want to make a macOS app too, you can add the macOS target and remove the “Designed for iPad” target. Xcode will then tell you which APIs aren’t available on macOS (or whatever new target you’ve added). So how do you fix those errors? With the #if os() … #endif compiler directive, you simply wrap the code that doesn’t exist on the OS you’re building for. Say you use the modifier .navigationBarHidden(true) in your code. That modifier doesn’t exist on macOS. Wrap it in #if os(iOS) and that piece of code will only be compiled for iOS. Super handy. Apple designed Xcode 14 with multiplatform apps in mind, and I love it. Hopefully, this incentivizes more developers to design their apps for multiple platforms.
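Here’s a minimal sketch of what that looks like in practice. The view is hypothetical, and note that wrapping a modifier chain in #if like this requires Swift 5.5 or later:

```swift
import SwiftUI

struct SettingsList: View {
    var body: some View {
        List {
            Text("General")
            Text("Advanced")
        }
        // This modifier only exists on iOS-family platforms,
        // so it's compiled out of the macOS build entirely.
        #if os(iOS)
        .navigationBarHidden(true)
        #endif
    }
}
```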

The new Targets menu in Xcode 14.

With every new OS release comes a boatload of new APIs that developers can build into their apps, and this year, there’s no shortage. I briefly touched on the Live Activities and Focus Filter APIs, and a full treatment is out of the scope of this analysis, but here’s a quick rundown. For Focus Filters, FocusFilterAppContext and SetFocusFilterIntent are the protocols you use (I showed a sketch of the latter earlier): FocusFilterAppContext represents the Focus-dependent state your app cares about (i.e., which calendar set or email account to filter to), and SetFocusFilterIntent is what the system invokes when a Focus turns on, telling your app to act on that context. Again, that’s a rough, code-level explanation, but that’s the shape of how you build Focus into your app. The Live Activities API isn’t available for use yet — but basically, a Live Activity is a mini applet that you create within your app and that shows up like a notification for your user. Details on exactly which protocols to use are unknown.

Apple didn’t stop with what was in the keynote, though. They introduced a new media player, which, honestly, is horrendous. There’s no way to grab the progress bar or the volume slider — they sort of just sit there. During the session about the new player, an Apple engineer said it was meant to be “intuitive.” I’m a user, and I can say this new media player is the most unintuitive thing I’ve ever seen. It makes no sense. If you’re going to implement it in your app, there’s a whole session for it and I won’t even try to explain it, but here’s what’s new: developers can now add playback speed controls, and they come by default with the new viewer. I’m eager for playback speed to be all over the OS this year, since it’s been a highly requested feature. Playback controls are accessible across the entire screen, so the user no longer has to grab the tiny bar at the bottom of their iPhone to skip forward. The scrubber also has inertia, which the Apple engineer compared to “a toy car.” Interesting, but why do we need this? The previous view was so much more intuitive — I guarantee users will spend more time figuring out the new gestures than they would have if Apple had kept the iOS 15 player and just added the new features. I’m not super jazzed.

Look, all of this stuff is useful, but there’s one API I’m the most excited about, and that’s WeatherKit. Since Apple’s acquisition of Dark Sky in March 2020 and the subsequent announcement that the Dark Sky API would be shutting down, I’ve been begging Apple to create a replacement API. And a replacement API they have created, and they call it WeatherKit. The great thing about WeatherKit is that it’s not limited to Apple platforms — you get 500K calls per month for free, and after that, it’s fairly inexpensive — about half of what the Dark Sky API used to cost. The Swift API is for Apple platforms and is pulled in with a simple import WeatherKit at the top of your code, while the REST API can be used everywhere else, effectively replacing the remains of the Dark Sky API. Apple calls the new backing service the well-named Apple Weather Service, and no longer uses The Weather Channel for its own Weather app (finally!).
To use WeatherKit, you (of course) import WeatherKit and then request weather through WeatherService. What comes back is a Weather value built from types like CurrentWeather, WeatherMetadata, WeatherSeverity, and WeatherQuery — all pretty self-explanatory. From there, you can read individual attributes (Wind, WeatherCondition, etc.) in your views. WeatherKit also supports alerts and forecasts through types like WeatherAlert and Forecast. The list goes on and on — there’s so much surface area here, and thank goodness for that. WeatherKit is easy to use, free to start, and open to other platforms. I’m a huge fan of where this is headed. Stay on the lookout for more WeatherKit stuff, because I’m pumped.
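Here’s roughly what a request looks like. This is a minimal sketch; the coordinates are arbitrary, and property names may shift across betas:

```swift
import WeatherKit
import CoreLocation

func printCurrentWeather() async {
    // Arbitrary example coordinates (Apple Park).
    let location = CLLocation(latitude: 37.3349, longitude: -122.0090)
    do {
        let weather = try await WeatherService.shared.weather(for: location)
        print(weather.currentWeather.temperature)  // a Measurement<UnitTemperature>
        print(weather.currentWeather.condition)    // a WeatherCondition case
        print(weather.currentWeather.wind.speed)
    } catch {
        print("Weather request failed: \(error)")
    }
}
```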

WeatherKit, Focus Filters, the media player, Live Activities, etc. are cool and all, but they’re nothing without SwiftUI (or, I guess, UIKit and AppKit, but Apple wants us to forget about those). SwiftUI gets so many enhancements this year — to navigation, UI, state, and more. I’m not going to cover it all, but I encourage you to check out Paul Hudson (the author of Hacking with Swift) and his amazing video detailing all the new features. I’m mainly going to focus on what I’ve played with: NavigationSplitView, NavigationStack, and Swift Charts.

First, navigation. NavigationView, the API used for all navigation since the beginning of SwiftUI, is deprecated as of iOS 16, replaced by NavigationStack and NavigationSplitView. NavigationStack is the easiest to understand and makes complete sense: when you have a list of things that push other views, you create a NavigationStack and put your list inside it. It completely replaces NavigationView. You then add a title with .navigationTitle("") and you’re set — you have a view. But if you ever wanted a multi-column NavigationView, you’d add a NavigationView, then a List, style that list as a sidebar, then create NavigationLinks inside the List, and you’d be done. This changes in iOS 16 — now, you create two (or three) column views and link them together in a NavigationSplitView. Seems easy enough, right? WRONG. You now have to create State and bind that State to the List’s selection instead of just linking like before. There’s no clear instruction on this, and honestly, I’m unable to fully figure it out myself. It’s by far the most confusing thing I’ve ever encountered in SwiftUI. Honestly, if you’re not already planning on targeting iOS 16 exclusively (like most people), I wouldn’t bother with NavigationSplitView. NavigationView works perfectly fine, and you can create some awesome views with it. There’s no apparent performance benefit to NavigationSplitView either, so do what’s best for you and your app.

Enough ranting about navigation, though. Let’s talk charts! Swift Charts is a new way to display information in SwiftUI apps. That makes sense, right? You create a Chart view and fill it with marks (bars, lines, points) through ChartContentBuilder and the ChartContent protocol. You can change all kinds of attributes with structs and protocols, like usual, and bind State variables to the charts to display user-generated/selected information. Super neat! I’ve been able to play around with some of these charts, and they’re fast and easy to build while displaying information in a way most apps don’t. Forget boring columns — I want to see this stuff in apps! It’s beautiful!
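To make the navigation complaint concrete, here’s a minimal two-column sketch with hypothetical data. Note how the selection lives in State and is bound to the List, instead of each row simply being a NavigationLink to a destination:

```swift
import SwiftUI

struct FruitBrowser: View {
    let fruits = ["Apple", "Banana", "Cherry"]
    // The selection is bound State, not a simple link.
    @State private var selection: String?

    var body: some View {
        NavigationSplitView {
            List(fruits, id: \.self, selection: $selection) { fruit in
                Text(fruit)
            }
            .navigationTitle("Fruits")
        } detail: {
            Text(selection ?? "Pick a fruit")
        }
    }
}
```

And here’s a Swift Charts sketch along the same lines: hypothetical step-count data rendered as a bar chart.

```swift
import SwiftUI
import Charts

struct StepCount: Identifiable {
    let id = UUID()
    let day: String
    let steps: Int
}

struct StepsChart: View {
    // Made-up sample data for illustration.
    let data = [
        StepCount(day: "Mon", steps: 3200),
        StepCount(day: "Tue", steps: 5400),
        StepCount(day: "Wed", steps: 4100),
    ]

    var body: some View {
        Chart(data) { item in
            BarMark(
                x: .value("Day", item.day),
                y: .value("Steps", item.steps)
            )
        }
    }
}
```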

Before I wrap up this mega in-depth hands-on analysis, I want to talk about another tool that’s made me super excited about the future of security — passkeys. Passkeys aim to replace passwords, and I believe they can. The way a passkey works is simple: when the user creates an account, they pick a username like usual and then tap to create the account. When they do, the device generates a unique key pair — the public key goes to the website or app’s server, while the private key is stored in iCloud Keychain and never leaves the user’s devices, so it can’t be guessed, stolen, phished, or leaked in a server breach. Neat, huh? Passkeys never have to be changed, remembered, or otherwise kept safe — everything is on the device and protected with biometrics. When the user wants to log in, they simply use their face or fingerprint. On a new device? The user scans a QR code that holds an encrypted login request, then authenticates with their face on a known device. Want to share one? AirDrop. What about cross-platform? The FIDO Alliance — which counts Apple, Google, and Microsoft among its members — is working to make passkeys work across devices. With all that out of the way, I couldn’t be more excited for this password-less future. Implementing passkeys is easy (the framework already exists for developers to use right now), and I can’t wait for these OSes to go public so we can finally see this future in action. My one concern is what happens if the user doesn’t have any of their devices available — presumably some backup code will be provided. There’s a lot to work out still, and it’s the reason you don’t see passkeys everywhere today. But I have confidence that in as soon as a couple of months, we’ll see these keys in action. I’m super excited.
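Since the framework is already there, here’s roughly what registration looks like with Authentication Services. It’s a minimal sketch where the relying party, username, and server-provided values are all placeholders:

```swift
import AuthenticationServices

// Registers a new passkey. The challenge and user ID come from your
// server; "example.com" and "jappleseed" are placeholders.
func registerPasskey(challenge: Data, userID: Data,
                     delegate: ASAuthorizationControllerDelegate) {
    let provider = ASAuthorizationPlatformPublicKeyCredentialProvider(
        relyingPartyIdentifier: "example.com")
    let request = provider.createCredentialRegistrationRequest(
        challenge: challenge,   // single-use, server-generated
        name: "jappleseed",     // the account name shown to the user
        userID: userID)         // stable server-side account identifier
    let controller = ASAuthorizationController(authorizationRequests: [request])
    controller.delegate = delegate  // receives the new public key credential
    controller.performRequests()
}
```

Sign-in is the mirror image: the same provider vends createCredentialAssertionRequest(challenge:), which Face ID or Touch ID approves.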

Holy cannoli. That was a lot about WWDC22. But hey, that’s what happens when you try to cram days of experience with new software into one blog post. WWDC22 was an unforgettable conference with tons of information, and there are a billion more things I decided not to cover — the new CarPlay interface, the Freeform app that’s supposed to come later, the new M2 chip and Apple’s complicated Mac lineup, watchOS, tvOS, Fitness improvements, Quick Note on iPhone, Game Center improvements, and so, so, so much more. There were a ton of announcements this year, and while I only got to play with some of them, check back soon for my thoughts on the M2 chip, the CarPlay stuff, and hands-on time with future betas. I hope you’ve enjoyed this super long tangent about Apple’s platforms, my experience with them, my hopes for the future, and my analysis. You won’t find content this in-depth anywhere else, so if you liked it, please let me know. Send me a frog emoji if you’ve made it this far and I’ll give you a special internet cookie. Thanks for reading, and stay safe out there.
