Everything Announced at Apple’s Infomercial for Developers WWDC 2021

Apple announced new software for their devices yesterday during an infomercial targeted at developers and the public, kicking off the second online-only year of Apple’s Worldwide Developers Conference. Apple is one of the world’s richest companies, but some third-party developers are rightly upset about Apple’s attitude towards them, and Apple has a long track record of abusive labor practices. Nevertheless, I’m going to summarize the events from yesterday as best I can. All of the operating systems have developer betas available now, with public betas in July. “This Fall” was the general release date for all of the software, according to Tim Cook.


iOS 15

Apple split iPadOS off from iOS a few years ago, which has left the two operating systems on separate feature paths, though they are usually updated at about the same time.

FaceTime in iOS 15

Apple’s video chat service, FaceTime, gets a few updates that Craig Federighi says help us stay connected.

The first feature Federighi mentioned was Spatial Audio, which positions voices so that each caller sounds like they are speaking from wherever their tile appears on the user’s screen during a call.
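As a rough illustration of the idea, assuming a simple model where a tile’s horizontal position maps linearly to a stereo pan (Apple’s actual spatial renderer is far more sophisticated and not public):

```python
def pan_for_tile(tile_center_x: float, screen_width: float) -> float:
    """Map a caller's horizontal tile position to a stereo pan in [-1.0, 1.0]
    so their voice seems to come from where they appear on screen.
    A toy model of the idea, not Apple's spatial-audio renderer."""
    return 2.0 * (tile_center_x / screen_width) - 1.0
```

A tile on the left edge pans fully left, a centered tile stays centered, and a tile on the right edge pans fully right.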

Federighi said that Voice Isolation was another feature coming to iOS 15. This feature is supposed to isolate a user’s voice from anything going on around them so that the people on the other end of the call don’t hear unwanted noises. This was demonstrated with a pre-recorded video of a FaceTime call with a leaf blower slowly moving into the scene behind a caller. Federighi said a “wide spectrum” mode would instead capture “…your voice and everything around you.”

Like most other group video chat software, FaceTime will be getting a grid view and a blurred-background view called Portrait mode according to Federighi.

FaceTime Links are another new feature in iOS 15 that Federighi says will let people plan calls ahead of time. Federighi specifically said that people with Android and Windows devices will be able to join calls using FaceTime Links through a web browser.

Federighi said that SharePlay would let people listen to music and watch shows together, as well as enabling screen sharing. The music-listening experience in iOS 15 is supposed to let anyone on a call start playing music and add songs to a shared queue, and anyone on the call can pause or skip songs. Federighi also said that callers could open a streaming app and start watching a TV show or movie that would be shared with the other callers. Both music and video are supposed to be synced to play at the same time, so presumably every caller would need to subscribe to each service the others are using in order to participate, though those details could be up to the services that support the functionality; Federighi did not elaborate on the business specifics. People can also use the Messages app during calls if they want to stay quiet during music or a video.
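One common way to keep playback in sync across devices, sketched here with hypothetical names since Apple has not published SharePlay’s internals, is to broadcast a shared anchor of wall-clock time, media position, and playback rate, and let each device derive its own position from that anchor:

```python
import time
from dataclasses import dataclass

@dataclass
class PlaybackState:
    """Shared state broadcast to every caller. This models the general
    idea only; Apple's actual SharePlay protocol is not public."""
    anchor_wall_time: float  # wall-clock time when the position was captured
    anchor_position: float   # media position (seconds) at that moment
    rate: float              # 1.0 while playing, 0.0 while paused

def current_position(state: PlaybackState, now=None) -> float:
    """Every device derives the same playback position from the shared anchor."""
    if now is None:
        now = time.time()
    return state.anchor_position + (now - state.anchor_wall_time) * state.rate

# A pause is just a new anchor with rate 0.0; a skip is a new anchor position.
```

Because every client does the same arithmetic from the same anchor, pauses and skips by any caller stay consistent everywhere without continuous chatter.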

Federighi said that it would be possible to push the video to an AirPlay receiver like the Apple TV during a call, so callers could keep watching the program while doing what looked like awkwardly holding an iPhone up for an hour-long show so they can still see and hear friends on the call.

An API for SharePlay would let other developers integrate with this service. Disney+, Hulu, HBO Max, NBA, Twitch, TikTok, MasterClass, ESPN+, Paramount+ and PlutoTV were announced as initial partners who would support the SharePlay functionality.

Federighi said that Screen Sharing with FaceTime would let people browse listings on Zillow, share gameplay, and “…help someone out and answer questions right in the moment,” which is also known as tech support. Honestly, this is a much-needed feature for anyone who wants to help someone else understand how to use their device or fix issues, but Federighi did not call it tech support.

Messages in iOS 15

Messages is getting updated with new features as well. Apple’s Engineering Program Manager for Communication Apps, Mindy Borovsky, introduced the updates. Borovsky demonstrated that photos in a Messages thread would get stacks and collages to better display multiple photos in-line in the conversation. Currently, if you post photos to an iMessage thread, each one takes up as much space as the last. Stacks and collages were shown to be browsable, so you can still see each photo without them taking up as much space in the conversation.

During this Messages section, Borovsky demonstrated that the News app would receive an update collecting stories shared with you by friends in Messages so that you can read them later, a little bit like Instapaper, in a new section called Shared with You. Each shared news item gets a little note showing who shared it, and you can tap through that note to get back to the message thread with the person who shared it.

Shared with You is also in Apple Music and the iOS Photos app. Borovsky demonstrated a playlist shared from Messages showing up inside of Apple Music, and pictures shared in Messages showing up inside of the Photos app. Borovsky said that Photos would automatically import shared pictures it decides she would care about; however, Borovsky also said “screenshots or memes” won’t be imported automatically. I suspect this could lead to bad things for people who share risqué photos, and I’m sure some people only share memes and screenshots and want those imported, but OK, Apple.

Craig Federighi said that the Safari web browser, Apple Podcasts podcast app, and the Apple TV app would also get Shared with You sections. Federighi also said that you could pin items in Messages to have them “elevated” in the various Shared with You sections and when you search for things.

Notifications in iOS 15

Federighi said that notifications are getting revamped with a new style that would make it easier to identify what a notification is about and who or what sent it. There will also be a new Notification Summary that collects notifications and makes them easier to digest. Federighi also said you could choose when to have that summary delivered. Federighi reinforced that this processing of notifications would happen on the device, which is Apple-speak for “…not in the cloud where we can see them.” Notifications from people are said to skip the summary; only notifications that iOS deems less important will be dumped into this digest.
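A minimal sketch of how a summary like this might split notifications, assuming a simple priority model (iOS’s on-device ranking is not public):

```python
from dataclasses import dataclass

@dataclass
class Notification:
    app: str
    from_person: bool  # direct messages from people bypass the summary
    priority: int      # higher = more important

def route(notifications, deliver_now_threshold=8):
    """Split incoming notifications into immediate alerts and a scheduled
    digest. Illustrative only; the threshold and priority scores are
    hypothetical stand-ins for iOS's on-device ranking."""
    immediate, summary = [], []
    for n in notifications:
        if n.from_person or n.priority >= deliver_now_threshold:
            immediate.append(n)
        else:
            summary.append(n)
    # The digest surfaces the most important deferred items first.
    summary.sort(key=lambda n: n.priority, reverse=True)
    return immediate, summary
```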

Focus in iOS 15

If you have Do Not Disturb enabled, then an away message of sorts will be displayed to people who are about to send, or are sending, you messages, according to Federighi, though there is also a way to get through if the situation is urgent. The new Focus system is an additional choice beyond Do Not Disturb. Focus was said to let users set up the apps and people they want to allow through, and in some situations users can give the system hints about how long they might want a Focus mode to last. For example, Federighi suggested the Personal mode for dinner with family; Work and Sleep modes were also suggested by the UI; Federighi later said that you can make custom modes; and a location-based Fitness Focus was demonstrated for when you arrive at the gym. Federighi again reiterated the “on-device intelligence” that would suggest the apps and people you might want to let break through a Focus mode and still alert you with notifications.

Federighi also went over how the iOS 15 Home Screen could let you dedicate a page to “…match your Focus” and only show the apps and widgets that won’t distract from that mode.

Federighi said that the different Focus modes would automatically synchronize between devices.

Live Text & Spotlight Searches on iOS 15 (as well as macOS and iPadOS)

Federighi demonstrated text recognition from the Camera app. The app displayed a button that let Federighi scan a whiteboard in the viewfinder and then copy and paste the formatted text into an e-mail in a list style similar to what was on the whiteboard. Federighi also demonstrated the Photos app letting him highlight stylized, cursive logo text from a cafe; once the text was highlighted, he could use a Look Up action from a pop-up menu to find the name of the restaurant and its location. There are a million ways this could be useful, but it could also be very creepy, and Federighi demonstrated one of those ways by selecting a phone number in another photo to call a restaurant. Who hasn’t taken a photo that accidentally had a post-it note or password in it that will now be easier to access for anyone who can view that photo on an iPhone?
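The “select a phone number and call it” behavior is essentially a data detector running over OCR output. A toy sketch of that idea, not Apple’s API:

```python
import re

# One North American phone-number shape; real data detectors handle
# many more formats and locales.
PHONE_RE = re.compile(r"\(?\d{3}\)?[-.\s]\d{3}[-.\s]\d{4}")

def actions_for(recognized_text: str):
    """Scan recognized (OCR) text for actionable data; each phone number
    becomes a 'call' action the UI could offer. Illustrative only."""
    return [("call", match.group(0)) for match in PHONE_RE.finditer(recognized_text)]
```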

Federighi also showed that this Live Text functionality works in screenshots, in a new Quick Look feature presumably from the Files app, and in photos in the Safari web browser. Federighi listed the seven languages that the functionality supports: English, Chinese, French, Italian, German, Spanish, and Portuguese.

Federighi said that you can look up information about “…recognized objects and scenes…”; a slide showed the Photos app recognizing a particular dog breed and letting you look up information about that breed from Wikipedia. Federighi said the system would identify art, books, nature, pets, and landmarks.

Federighi also said this indexed information would be added to Spotlight, which is Apple’s name for system-wide search on all of their devices, so a user could search for photos by people, places, objects, or the text in a photo.

Spotlight would also get updated search results for contacts with a new contact card appearing in the search results letting you more quickly contact them. It does something similar right now in iOS 14 but it looks a little more fancy in iOS 15.

Musicians, actors, TV shows, and movies are also supposed to get updated results in Spotlight.

Photos Memories

Chelsea Burnette, Apple’s Senior Manager of Photos Engineering, demonstrated new highlighted Memories in the Photos app. Burnette demonstrated that the new system automatically adds music from Apple Music instead of the few canned songs used today, and that you can dynamically interact with a memory by pausing on a photo or swiping back to view one that passed by, with the music continuing in the background and the memory swapping photos to the beat of the song.

Burnette also demonstrated editing the tone through different styles: a black-and-white look (Noir Film, specifically) was applied to the photos for the memory, and a slower song was played that also slowed down the playback of the photos. The same thing was demonstrated with a faster tempo and another color change. Burnette said that the color changes aren’t filters, but are instead more precise edits to the photos in the memory. Finally, Burnette demonstrated browsing through different options from Apple Music for background music.

Apple Wallet Keys and Identification Cards

Jennifer Bailey, Apple’s VP of Wallet and Apple Pay, reiterated past additions like transit passes, financial tools like credit and debit cards, and other passes. Bailey said that car keys added to the Wallet would get enhanced support for Ultra Wideband that lets users leave their iPhone in a bag or a pocket to open a car door and drive. BMW is the first maker set to release cars with Ultra Wideband support “…later this year.” Bailey said that digital keys could also be stored for homes, workplaces, and hotels, in the form of keys and badges. Hyatt is supposed to add the first hotel keys “Starting this Fall…”

A mess of logos were displayed from companies that are supposed to support these new features: Hyatt, Lenel S2, Proxy, touchnet, Kastle Systems, atrium, Evergreen, dormakaba, Allegion, Aqara, Latch, Transact, Salto, cboard, Schlage, Legic (I’m not sure if that is an e or what), Walt Disney World, Assa Abloy, and HID. I recognized three of the logos: Schlage, Disney World, and Hyatt.

Bailey also showed slides demonstrating a driver’s license being scanned and stored in the Wallet app, but said it would only work in certain states. Bailey said that the Transportation Security Administration, which performs security theatre 24/7 across the United States, would support the digital identification. Yeah, I don’t want to hand my iPhone to a cop, but Bailey showed that unlocking the identification card for presentation to an authority figure would require biometric or other authentication, and that it would show exactly what details were being presented before they’re unlocked. The TSA has also been mishandling devices as people return to the United States from international travel. Apple employees travel a lot, so I don’t doubt that Apple will have done their best to handle this as carefully as possible, but I have a lot of questions about the implementation, and I imagine more diligent security researchers have far more.

Weather

Apple acquired the Dark Sky app and services last year and is shutting down the Android version of Dark Sky along with third-party access to the Dark Sky API, which is killing a number of third-party weather apps; the Dark Sky apps and website are scheduled to shut down at the end of 2022. Federighi said the iOS 15 Weather app is getting a new design and new infographics. The Weather app is also getting new maps that look very similar in presentation to the Dark Sky maps, even including the Dark Sky timeline that lets you adjust the time of day to see recent and predicted changes.

Apple Maps

Maps is one of the apps with the most history on iOS. It originally featured maps provided by Google, but Google infamously wanted more and more data about the users and other carveouts in their contract with Apple, so supposedly that is what caused the split and Apple’s home-grown Maps app has been a little bit behind ever since.

Meg Frost, Apple’s Director of Product Design for Apple Maps, said that a new Apple Maps experience had been launched for the US, UK, Ireland, and Canada already and that Spain and Portugal would get those new maps today and Italy and Australia would get them later this year.

Frost explained that a new globe would be added to Apple Maps that lets people “tap and explore,” but went on to talk about how the entire Apple Maps system would be overhauled in cities. The demonstration showed something a little bit in-between the current 3D view option and the traditional maps style of flat-shaded spaces. So you have a flat-shaded world and an isometric view, but with differences in elevation and more landmark features like the Golden Gate Bridge with a similar kind of flat-shaded-maps-style in 3D. It blurs the line between a cartoon 3D world and a traditional map. The new map was shown to include cross-walks, bike lanes, highway lanes, overpasses, and more in the same simplified 3D style. This updated 3D experience is also supposed to come to CarPlay later this year.

Frost said that improvements would also happen for public transit users, and let users pin their favorite bus and train lines and find stations more easily. Apple Maps is also said to let transit riders know when their stop is about to come up with a notification. An augmented reality scanning mode was demonstrated that let a user scan their surroundings to identify the path forward with giant floating arrows and street names in the AR view. Frost related this to the experience of exiting a subway station unsure of which direction to head in, which is understandable but it still seems incredibly dorky to hold up a smart phone for any kind of AR experience in public.

The enhanced features are supposed to come to London, Los Angeles, New York City, Philadelphia, San Diego, San Francisco and the Bay Area, and Washington, DC by the end of the year, with more cities next year.

AirPods in iOS 15

Gagan Gupta, Apple’s Senior Engineering Program Manager for Siri, explained that AirPods would better support people with “mild hearing challenges” in iOS 15 using a feature called Conversation Boost that can optionally help you focus on the people you are speaking with by cutting out other noises. There’s a saying that you either need accessibility features now, or you will eventually. I’ve always had a difficult time discerning words from some people in noisy environments, but unfortunately this feature is only supported on the more expensive set of headphones, the $250 AirPods Pro. I know it still looks odd to keep headphones in while you’re talking with someone in person. Apple is rumored to be removing the stems from their headphones, so that may help with the social aspect.

Gupta went on to explain that while currently Siri can optionally read incoming messages, the assistant will get an upgrade in iOS 15 to be able to read notifications and reminders like a shopping list when you get to a supermarket.

The previously mentioned Focus feature was said to help control which incoming notifications get announced over AirPods.

The AirPods Pro and AirPods Max are also supposed to be added to the Find My network that lets other Apple devices help find items when they’re lost. Unfortunately, the AirPods cases don’t have any kind of speaker, so the cases have never been locatable unless the headphones are in them. Supposedly iOS 15 will let the AirPods Pro chirp while they’re inside the case. With iOS 15, users can also get an alert if they leave certain AirPods behind when they depart an area.

Another feature that was previously only available with AirPods connected to iPhones and iPads is Spatial Audio; Gupta said this feature would come to tvOS for the Apple TV device “later this Fall.” Macs with Apple’s M1 system-on-a-chip are supposed to get Spatial Audio support as well. Spatial Audio and Dolby Atmos support are available today for some albums in Apple Music.

iPadOS 15 and Multi-tasking

One of the most interesting things about the new iPadOS is that you can choose to stay on iPadOS 14 and still receive security updates. Federighi said that iPadOS 15 would finally get support for Widgets. This is a year after the iPhone, but Federighi said that iPads would get access to a larger class of widget to suit its larger screen. Federighi demonstrated this larger-sized widget with Apple TV, Game Center, Files, and Photos widgets.

Federighi said that another year-old straggler, the App Library, would come to iPadOS 15. The iPadOS twist on the App Library is that it would always be available on the iPadOS dock.

Shubham Kedia, a Human Interface Designer at Apple, talked about improved multitasking functionality coming to iPadOS 15. Kedia demonstrated a new control at the top of any app that would let iPadOS 15 users access the multi-tasking controls that are currently hidden behind various swiping gestures, which I find confusing and nearly indecipherable. Try to split-view another app on iPadOS 14 today and then get rid of it. It’s about as easy as chewing broken glass for me, as my swipes do everything but what I want. I hope this improvement really works out.

Kedia went on to discuss another new feature coming to iPadOS 15, the Shelf. This new UI element shows the different open “windows” for apps that support multiple windows, like Safari and Mail, at the bottom of the screen when any of those windows are open. I find it very weird that Apple calls these windows, because they are not windows in the traditional sense. They cannot overlap with each other or be moved and resized like traditional windows.

Kedia also showed that the iPadOS 15 app-switcher would let you make multitasking duos directly inside of the app-switcher. 

Federighi said that keyboard shortcuts would let iPadOS 15 users access all of the same multi-tasking functionality.

Notes in iPadOS 15 and Quick Notes on iPads and Macs

Apple’s Notes app is getting @ mentions, a revision history that Federighi calls an “Activity View,” and tags by using #hashtags anywhere inside of a note.

Will Thimbleby, an Engineer working on Apple’s Pencil Software, demonstrated a new iPadOS 15 feature called Quick Note that is supposed to let you quickly write a note by swiping from a corner of the iPad screen to bring up the Quick Note interface. The Quick Note interface is supposed to be context-aware and Thimbleby demonstrated this by making a note about a website and then showing that visiting a website he had already made a note about brought up a small preview of that note that could be interacted with to continue working on it while browsing that site. It looks useful but I bet there are about ten or twenty note-taking app developers that wish they could integrate with the operating system like that.

Federighi had an incredibly quick jumble of words about what devices could make Quick Notes (iPads and Macs) and that they could only be accessed and edited (but not created) on iPhones.

Apple’s Translate app

Federighi said that Apple’s Translate app is also finally coming to iPadOS 15. Like the Calculator app, it has been strangely absent from the iPad, and I am a very heavy translation user for language learning and conversing with foreign-language speakers. Frankly, the Translate app on iOS is unreliable and extremely limited, but it is a first release. I often type out something to be translated and then Translate loses the text when I ask it to translate, which is absolutely infuriating, and I have zero interest in doing free quality-assurance work for one of the world’s largest companies by writing up a bug report for them.

The iPadOS version of Apple’s Translate app is supposed to have its own unique twist: handwriting recognition and support for multi-tasking.

Federighi went on to say that the Translate app would support a new feature called Auto Translate in conversation mode to automatically detect which language needs to be translated during a conversation and when so that users don’t have to interact with the UI and start and stop speaking during conversations unnaturally.

Another new feature for macOS, iOS 15, and iPadOS 15, is system-wide translation. Any text you can select on the iPad can be translated, and Federighi reminded us that Live Text now includes text in photos and elsewhere.

Federighi reminded us that languages can be downloaded to the device so that the translation processing happens locally.

Swift Playgrounds: Building SwiftUI Apps on the iPad

I’ve been grumbling about the lack of Xcode on iPad for years, but Federighi said that Swift Playgrounds (a coding tutorial system introduced a while back on iPads and more recently on Macs) would now let users create actual applications on iPadOS 15 using the open-source Swift programming language and Apple’s SwiftUI toolkit. The new version of Swift Playgrounds is supposed to have much-improved code completion, access to more documentation, and features like building, running, and, most importantly, submitting the app to Apple’s App Store. Federighi said that these apps can be built in Swift Playgrounds for both iPads and iPhones. I’m very curious what the testing situation will be like, in addition to all of the other peripheral things that need to happen in order to actually ship a program.

Federighi said that the new Swift Playground app projects would work on Xcode for the Mac.

Mail Privacy Protection in Apple’s Mail Apps and a new App Privacy Report

Katie Skinner, Apple’s User Privacy Software Manager, talked about how the Mail app is getting updated to prevent marketers and others from tracking users with invisible tracking pixels, a common technique also used across the web to identify people everywhere. Skinner said that the Mail Privacy Protection feature would block marketers from seeing users’ IP addresses, locations, and whether email is opened. This is good, but I can also instantly think of ways that Apple might be implementing this and ways that marketers could work around it.
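One plausible way to blunt tracking pixels, shown here as a sketch with a hypothetical relay URL rather than Apple’s unpublished implementation, is to rewrite remote image URLs so the mail client loads them through a shared relay instead of contacting the tracker directly:

```python
import re

# Matches remote image URLs in an HTML email body.
IMG_SRC_RE = re.compile(r'src="(https?://[^"]+)"')

def proxy_remote_images(html: str,
                        proxy: str = "https://mail-relay.example/fetch?url=") -> str:
    """Rewrite remote image URLs so the client fetches them through a relay,
    hiding the reader's IP address and when (or whether) they opened the
    message. A sketch of the general technique; the relay domain is made up
    and Apple has not published its implementation."""
    return IMG_SRC_RE.sub(lambda m: 'src="' + proxy + m.group(1) + '"', html)
```

The marketer still serves the pixel, but every fetch arrives from the relay’s address on the relay’s schedule, so the per-recipient ID in the URL no longer reveals who opened the message, from where, or when.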

Erik Neuenschwander, Apple’s User Privacy Director talked about how Safari prevents tracking already, and then said that this IP-address hiding technology would work in Safari as well.

Both Skinner and Neuenschwander talked about how there would be a new App Privacy Report in Settings to see when apps are accessing functionality and data like a user’s location or microphone and photos or contacts. This privacy report also includes what third-party domains apps are contacting so that you can double-check if you’re OK with what they do. There is supposedly a huge business in inserting garbage tracking software into other innocuous apps like weather apps, so it makes a lot of sense for Apple to do this.

Siri On-Device Speech Recognition

Skinner and Neuenschwander also talked about a much-needed upgrade to Siri: on-device speech recognition. Like so much of modern technology, if it’s difficult to do, then there are humans involved somewhere. There was a scandal a few years ago when it was revealed that most voice assistants have human reviewers double-checking that the interpretations are accurate. Apple eventually added system-level options to allow or deny human review of the assistant’s voice recognition using your data. If you’ve ever had a voice assistant start listening at a time when it wasn’t summoned, you know why this is wrong.

It has always seemed especially inappropriate to me to allow children, and anyone else who doesn’t understand what is going on, to be in the same room as one of these assistants without disclosing that they are there, but Apple has made no attempt at alerting people that they are in range of one of their devices while it is listening, and has no manual on/off controls for listening.

Some cheap Google smart speakers have the microphone muted via a physical button on the hardware and it persists across resets. It would be very helpful if Apple had a physical button on any devices that are designed to recognize speech that could disable the microphone.

Neuenschwander said that on-device processing would allow Siri to do many things without making any network requests. Neuenschwander also demonstrated how the offline Siri functionality is faster.

Apple ID Account Recovery

Mike Abbott, Apple’s VP of Cloud Services talked about how users can let people they trust help them recover an Apple ID if they’re ever locked out. Abbott said that these trusted contacts could help recover account access by providing a recovery code over the phone. I can see a number of potential pitfalls but OK.

Digital Legacy

One of the most morbid things I had to work on at a large social network a decade ago was what to do with user accounts when a user passes away. Abbott said that users will be able to choose people who can recover their iCloud account data (Photos, Notes, Mail, etc.) in the event that they pass away, provided the chosen person can provide a death certificate.

iCloud+ Private Relay

Abbott also introduced a new name for paid iCloud features: iCloud+. Everything is getting a + to indicate that it costs money these days; they should just use a dollar sign and be honest. Private Relay is a sort-of-VPN service that Abbott said would encrypt users’ web traffic inside of Safari and send it across two different relay servers to hide users’ physical locations and IP addresses. Abbott said that not even Apple can see what sites users are visiting when they use the service. Abbott also said that this happens without compromising performance, so web browsing should still be fast, somehow.
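The privacy property of a two-relay design can be shown with a toy model (illustrative only, not Apple’s actual protocol): each relay learns only half of the picture, so neither one can pair a user with a destination on its own.

```python
def two_hop_request(user_ip: str, destination: str,
                    ingress_log: list, egress_log: list) -> str:
    """Model the split of knowledge in a dual-relay design: the ingress
    relay sees the user's IP but an encrypted destination; the egress
    relay sees the destination but only the ingress relay's address.
    A sketch of the general idea; names and logs here are hypothetical."""
    ingress_log.append({"client": user_ip, "destination": "<encrypted>"})
    egress_log.append({"client": "ingress-relay", "destination": destination})
    return destination
```

If the two relays are run by different parties, correlating a user with a site requires both logs, which is the core claim behind “not even Apple can see what sites users are visiting.”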

iCloud+ Hide My Email

Abbott wasn’t done talking about new features. Hide My Email is Apple’s service for disposable email addresses that forward to your real inbox for as long as you want. Users can leave themselves notes about why they made each address, and Apple also includes little hints, like whether it was made in Safari or Mail.
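A disposable-address service like this can be sketched as a random local part plus stored metadata; the domain and field names below are hypothetical, since Apple’s scheme isn’t public:

```python
import secrets

def make_alias(note: str, created_in: str, domain: str = "relay.example") -> dict:
    """Create a disposable address that forwards to the real inbox, stored
    alongside the user's note about why it exists and where it was made.
    Illustrative sketch only; 'relay.example' is a placeholder domain."""
    return {
        "address": secrets.token_hex(4) + "@" + domain,
        "note": note,
        "created_in": created_in,   # e.g. "Safari" or "Mail"
        "forwarding": True,         # flip off to stop mail without exposing the real address
    }
```

The useful property is that the real address never leaves the relay, so turning one alias off kills spam from one sender without affecting anything else.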

iCloud+ HomeKit Secure Video

Abbott said that iCloud+ subscriptions enable access for an unlimited number of video cameras and an unlimited amount of storage for home video cameras that support Apple’s HomeKit home automation network. I guess at least they haven’t announced working with the cops like Amazon does using every Ring camera.

iCloud+ Storage Prices Aren’t Changing

That’s good if you pay for storage, and terrible if you are (or know) someone who will never pay for online storage. Apple’s iCloud pricing is competitive, but it is also absolutely infuriating that people lose important backups, photos, and other data because they will never pay for cloud storage and don’t have reliable places to back up their data locally. Apple is one of the richest companies in the world and can afford to give people more storage when they buy Apple devices.

Google just stopped offering free photo storage and there has never been a free photo storage option that has lasted. However, the richest companies in the world could afford to protect these important memories if they cared to, but they all want to have Services revenue to demonstrate to investors that they’re Good Companies who can nickel and dime their customers to death. It is insulting and wrong every time anyone loses their photos or other important data because these expensive devices break, or are stolen, or are lost.

Health Stuff: Walking Steadiness, Lab Results, Health Sharing

Sumbul Ahmad Desai, MD, Apple’s VP of Health, talked a bunch about how Apple is awesome and introduced an ad about how cardiologists were able to build an app that supposedly helped their patients avoid being re-admitted to a hospital.

Adeeti Ullal talked about mobility assessments, which are commonly done for anyone at risk of falling. The Health app is now supposed to monitor users while they are walking and alert them to deteriorating walking steadiness, according to Ullal. The Health app also has mobility tutorials: videos users can watch to help improve their mobility.

Desai talked about how lab results in Apple’s Health app would be easier to understand because they will include ranges for results and other information about what the results are measuring.

Desai also talked about trends in Health data that will be surfaced in the app and in notifications.

Another new feature Desai mentioned is the ability to share specific types of data with a doctor in some secure manner that would feed into the doctor’s records systems. Desai said that various electronic health record companies would support the new system for exchanging data, including Cerner, Allscripts, Athena Health, CPSI, DrChrono, and Meditech Expanse.

Desai said that families would be able to share health data with each other, which looks like it could be very creepy and incredibly useful in the case of taking care of an elderly relative.

WatchOS 8: Breathe and Reflect in the Mindfulness App

Kevin Lynch, Apple’s VP of Technology, also talked a bunch about how Apple is awesome, or more specifically, about how the Apple Watch is awesome. Sadly, no third-party watch face support was announced. Lynch did announce an update to the Breathe app in WatchOS: it has a new animation to help users calm down. For fans of scripted text statements like Live, Laugh, Love, there’s also Reflect, which presents a simple statement for you to think about. The example given was “Think about something you love to do and why it brings you joy.” Both Reflect and Breathe sessions live in a new Mindfulness app.

WatchOS 8: Respiratory Rate tracking while sleeping

WatchOS 8 will monitor respiratory rates while sleeping and alert users to changes in trends over time with the Health app, Lynch said.

Fitness in WatchOS 8 & Fitness+ Improvements

Julz Arney, Apple’s Director of Fitness Technologies, said that there would be new Tai Chi and Pilates workout modes in WatchOS 8.

Arney also said that Fitness+ would get new workouts from Jeanette Jenkins. According to Arney, there would also be new artist-spotlight workouts that focus on music from different musicians. Neither addition is restricted to WatchOS 8; both will be available later this month with the subscription service.

WatchOS 8: Portraits Watch Face

Lynch returned to introduce a new watch face that takes Apple’s Portrait mode photos from a user’s photo collection and layers the subject of each photo in front of the time on the watch face, with a small amount of overlap.

I haven’t been a fan of Apple’s faux-depth Portrait mode in years, but I’m a few generations behind on the iPhone. They look good when they work right, but so often I see small details that the iPhone’s depth-sensing misunderstood. Like all of the imagery during the WWDC 2021 infomercial, the example Portrait mode photos are all professionally done with incredible photography that most people won’t be able to reproduce. The watch face is adjustable so that users can control where the time appears.

Shandra Rica, Apple’s Senior Manager of watchOS Software Programs, demonstrated the WatchOS 8 Portraits watch face feature and showed that scrolling the crown on the Apple Watch adjusts the view for fun.

WatchOS 8: Photos App “Completely Redesigned” and new Messaging options

According to Rica, the new WatchOS 8 Photos app is completely redesigned. Rica demonstrated that it now has access to the Memories that the iPhone generates and that it’s possible to share photos directly from the watch using different methods of communication, from voice-to-text to emojis. The communication methods now also include an “App Shelf” that lets the user access things like #images to send gifs. This is similar to what you get when you tap on the little gray App Store icon in iOS Messages today if you’re on iOS 14, but not the blue one! That blue one sends you to a mini Messages App Store (this sure isn’t confusing at all). The other options accessible from the Messages app shelf on WatchOS 8 appeared to include Apple’s “Memoji,” voice memos, Apple Pay for transferring money, and the old weird heartbeat thing that lets users share their heartbeat with someone via the Watch.

WatchOS 8: Find My and Multiple Timers

It’s 2021 and the iPhone still can’t set multiple timers in Apple’s Clock app, but at least WatchOS 8 will be able to do so, according to Kevin Lynch. It’ll also be able to find items using the Find My network, Lynch said.

Home Stuff: Virtual Keys, Siri Features, For All of You, HomePod Mini as Apple TV 4K Speakers, HomePod Mini in More Countries

Yah Carson, Apple’s Senior Engineer for HomeKit Software, introduced a demonstration of a person doing stuff in a home set that Apple made, such as unlocking their front door with a virtual key from the Apple Wallet and then sitting down on the couch and asking Siri on a HomePod Mini to play a new episode of a TV show. The old HomePod isn’t being sold anymore, so that’s a Mini version of nothing. Another resident in the fake house was watching TV with their friends while awkwardly holding their iPhone at a good camera angle for their friends to see their reaction. Who doesn’t love their arm falling asleep while they watch a show together?

Another demonstration introduced a new Apple TV feature called For All of You that made viewing recommendations for anyone in the house.

Carson also said that this Fall people would be able to set HomePod Mini devices as speakers for the Apple TV 4K box.

More countries would get access to the HomePod Mini: Austria, Ireland, and New Zealand in June, and Italy by the end of the year. According to Carson, the HomePod Mini gets access to the personalized voice recognition service in every country where the HomePod Mini is available, though Carson specifically said this is beginning this year, so who knows how long that process will take.

Home Stuff: Siri on Third-Party Devices & Third-Party Interoperability Standards Matter

Carson said that HomeKit devices would now be able to integrate with Apple’s Siri assistant so that third-party devices can offer Siri interactions. A demonstration showed a person speaking to an Ecobee thermostat; Carson said the third-party devices would send the actual request to a HomePod somewhere in the house.

Earlier this year an alliance of tech companies renamed their connected home standards to “Matter” and Carson reiterated Apple’s support for that new standards body. Carson said that iOS 15 will enable access for Apple’s Home app to configure and control Matter devices, which means that devices no longer need to be targeted towards HomeKit or other company-specific standards as long as they work with the alliance’s standards.

WatchOS 8 Home App Improvements & Package Detection

Carson went over the improvements coming to the WatchOS 8 Home App. According to Carson the intercom functionality would be available (I’m not actually sure if this is new?) to broadcast an audio message to other people in the house. Alerts and other interactions would work better with HomeKit-enabled products like surveillance doorbells.

Another new feature Carson spoke about, and a resident of the fake home demonstrated, was package detection: the surveillance doorbell camera detected a package being delivered and sent an alert to the actor’s Apple Watch.

Apple TV Creeper Cam Upgrades

If you’re a creep with a lot of networked surveillance cameras and other devices that all work with Apple’s systems, you’ll be able to watch and adjust them through an Apple TV device and monitor grids of cameras to really get your creep show on.

macOS 12: Monterey

Federighi went briefly over some of the previously mentioned new features that are also coming to the Mac operating system update, including FaceTime upgrades that look better using Portrait mode, and SharePlay to watch movies and shows together or to share a screen in a meeting where everyone can see a shared presentation. The Shared With You business is in the updated macOS 12 Photos app and elsewhere. Focus modes are also part of the Control Center functionality in Monterey. Federighi went briefly over all of these features, including Quick Notes and Notes updates.

macOS 12: Universal Control

Federighi talked about a new feature called Universal Control, a software-based input sharing scheme that lets a Mac share keyboard and mouse (or trackpad) input with multiple Apple computers and iPads running Apple operating systems. I currently use open-source software called Barrier to do this between multiple devices, including a Windows desktop computer, but restrictions in Apple’s operating system prevent similar functionality from working with an iPad.

Federighi demonstrated this functionality including dragging and dropping a file from an iPad to a MacBook Pro, and added an iMac into the mix to drag a file all the way from the iPad, through the MacBook Pro, and ending up in Final Cut on the iMac.

macOS 12: AirPlay to Mac

Federighi showed AirPlay being used on an iMac to receive a video feed from an iPhone, a screen shared from an iPad, and audio from Apple Music AirPlay’d to the Mac from an iPhone.

macOS 12: Shortcuts & Automator

It’s been speculated for years that Shortcuts would come to the Mac from iOS, and Federighi demonstrated it with single-click shortcuts opening multiple apps, arranging them, and then playing music in the background. Federighi said this is the start of a “multi-year transition” for Shortcuts support on macOS and that Automator scripting would still be supported; in fact, one of the key features Federighi mentioned for Shortcuts on macOS is importing Automator scripts into Shortcuts.

Safari Updates across macOS, iPadOS, and iOS

Federighi talked us through a very large update to how Safari handles tabs, which resulted in a redesign for Safari that shrinks tabs down and puts them next to the URL bar. When a tab is clicked on, the URL bar changes to take on the colors of the website you’re viewing and the old tab shrinks down to a smaller favicon and site name, with the URL bar expanding onto the new tab. It’s a slick design that I think makes sense for Apple’s paradigm of browsing with site-specific settings and options. Extensions and other parts are stuffed inside of three disclosure dots in the URL bar. The sidebar has new groups of tabs, called tab groups, that can be saved for different activities or however you want to organize them, so that the browser isn’t overloaded with tabs.

Beth Dakin, Apple’s Senior Manager of Safari Software Engineering, demonstrated the new version of Safari and how it’s more efficient with the space available to it. Dakin demonstrated that tab groups and open tabs are automatically synchronized across Apple devices, including which tab is currently active.

Dakin then demonstrated how a tab group can be dragged into an email to turn it into an unordered list in the email.

Finally, Dakin demonstrated how you can switch the active tab group with a new button next to the URL bar.

Federighi then showed us that Safari on iOS is getting the URL bar moved to a floating position at the bottom of the screen which makes a lot of sense for modern devices, and has that quick tab switcher as well. A new Safari start screen is said to also be synchronized across devices.

Safari & WebExtensions, even on iOS

The systems for adding software to a web browser have changed every few years. Plug-Ins were the original method, and for a few years there has been a sort-of cross-platform API standard called WebExtensions.

Federighi talked about how Safari would gain additional support for WebExtensions on iOS. This is the first time iOS Safari has had access to extension technologies beyond the built-in interface and a specific set of adblock-inspired APIs that were just for content blocking. That content blocking system was great because the third-party app didn’t have access to the content of the sites or any user data; the content blocker could typically only provide a list of content for Safari to block, which gave Apple a way to supply its users with ad blocking technology in a fairly secure manner.
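For context on why that older system is so privacy-friendly: a content blocker hands Safari a static JSON list of rules up front, and Safari applies the rules itself, so the extension never sees what you browse. A minimal sketch of one rule (the domain here is a made-up example, not from the presentation):

```json
[
  {
    "trigger": {
      "url-filter": "ads\\.example\\.com",
      "resource-type": ["script", "image"]
    },
    "action": { "type": "block" }
  }
]
```

Each rule pairs a trigger (a regular expression matched against resource URLs, optionally narrowed by resource type) with an action such as block or css-display-none; WebExtensions, by contrast, can run code against page content, which is why they require the per-site permission prompts described below.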

Apple announced support for WebExtensions in general at last year’s WWDC 2020. Each browser still has a little bit of work to do for extensions to be supported in each one, but it should make it easier for extension authors in general to support multiple browsers now that all major browsers have similar support features.

Federighi went over the extensions support, continually reminding the audience that WebExtensions only work on whatever websites the user chooses.

Developer Technologies

Susan Prescott, Apple’s vice president of World-Wide Developer Relations, finally got to speak to developers directly. This was over an hour and a half into the program, but developer-specific technologies are specifically relegated to a Developer State of the Union address after this infomercial.

Prescott went over a few APIs and technologies Apple was making available including Object Capture, a technology to capture real objects and turn them into 3D virtual objects. Mercifully, we never heard a single word about NFTs or other bullshit.

Ted Kremenek and Prescott both boasted about the Swift programming language for a while. Kremenek went into a small amount of detail about new concurrency features in Swift using async/await and actors. They said this would make code more efficient.
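For a flavor of what those two features look like, here is a minimal sketch; the names `Counter` and `countTwice` are my own for illustration, not anything from the presentation. An actor serializes access to its own mutable state so concurrent callers can’t race on it, and async/await turns what would otherwise be nested completion handlers into straight-line code.

```swift
// Sketch of Swift 5.5 concurrency: an actor plus an async function.
actor Counter {
    private var value = 0

    // Calls into an actor from outside are implicitly serialized.
    func increment() -> Int {
        value += 1
        return value
    }
}

// async/await flattens callback-based code; each call into the
// actor is awaited rather than handled in a completion closure.
func countTwice(using counter: Counter) async -> Int {
    _ = await counter.increment()
    return await counter.increment() // returns 2 on a fresh counter
}
```

The compiler enforces the `await` at every actor call site, which is the "more efficient" (and safer) part: data races on `value` become compile-time errors rather than runtime bugs.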

App Store Rent-Seeking, Product Pages, and In-App Events

Ann Thai, Apple’s Director of App Store Product Marketing, talked about Apple’s App Store and reiterated Apple’s argument that it is focused on safety and growth, but did not mention the current legal battle soufflé going on with another giant business, Epic Games, to get Apple to open up their platforms. Epic Games’ executives desperately want to be that rent-seeking business instead of Apple. Thai said that Apple has paid out 230 billion US dollars to third-party developers, which also means that Apple has stolen about 98 billion dollars in rent from these developers for being distributed through Apple’s App Stores.

Thai went over how an update to the App Stores would let developers offer one app with multiple custom product pages to showcase “different features of your app for different users.” This is a pretty typical feature that lets people try different marketing to see what works best in getting people to download the app. Many fitness apps ship multiple versions with slight changes to work for different kinds of workouts, like running or cycling, but this did not appear to address that situation exactly.

Thai also went over a new way for developers to advertise in-app events: limited-time events that happen in their apps, advertised in Apple’s App Store.

Remarkably, no updates were made to automatically allow developers to have a smaller cut of their revenue stolen by Apple. Currently this is a process that developers have to apply for so that Apple steals less money each year from their revenue.

Xcode Cloud

Server-based continuous integration has been available on other platforms for years; Apple now has a version of this called Xcode Cloud, according to Susan Prescott. Continuous integration builds code binaries and runs automated tests automatically when new code changes are committed. The past year has been wild for continuous integration products because grifters have been hijacking them to mine cryptocurrency (sound familiar?). Most continuous integration services have had to shut off or severely restrict their free and paid services in order to prevent this hijacking. Prescott said that no code is stored in Xcode Cloud, except for build artifacts (the compiled binaries), and that the service would make it easier to distribute pre-release software to testers.

Xcode Cloud is a 2022 service, and Prescott said that more information on pricing for Xcode Cloud would come later, which presumably means it won’t be free.

TestFlight for Mac

Apple bought the TestFlight service years ago when it was a third-party mobile tool that let people ship test builds of apps to iOS users. There will be a macOS version of TestFlight now according to Prescott. A beta of TestFlight is available today with wider availability later this year.

Overall

The software updates are largely good. I am still a little disappointed that software development on an iPad is relegated to a Playground of sorts, but I don’t want that to sound like playgrounds in the real world are bad; they are where the most creativity happens. If Swift Playgrounds eventually becomes a more full-featured development environment that isn’t limited to libraries like SwiftUI, that could be a huge improvement for developers, who would no longer need a Mac to develop.

The health section is a perfect example of how the inaccessibility of technology due to the profit motive is harmful to the people who can’t access it: things like keys and locks that only work with iPhones and Apple Watches. I didn’t really relay information about the studies that Sumbul Ahmad Desai spoke about, because they were part of the perfunctory self-congratulatory garbage Apple always has to say about their Health programs, but of course the health studies Apple runs necessarily exclude anyone who is too poor to have an Apple Watch and iPhone. Desai boasted that more people are in Apple’s health studies (in partnership with actual medical institutions) than in many typical studies, so this is the warfare of capitalism against the poor on an enormous scale. You don’t have to be rich to own an iPhone or Apple Watch, but if you’re an Android user or too poor to own them, you are excluded from these studies and your health results don’t matter to Apple or the medical institutions that are working with Apple. I’m not a doctor or an ethicist; these are just the thoughts off the top of my head when I see this, so it’s entirely possible there is something I’m missing in the way these studies are run. Perhaps they supplied some people with Apple Watches and iPhones who would otherwise not be able to participate.

The WatchOS 8 section of the program was similar. Lynch talked about studies that he said showed how features like the Breathe app can help improve a person’s overall psychological wellbeing. I cannot imagine a day or time when a retail or warehouse employer allows a worker to take five minutes to think for themselves to improve their mindfulness. People who are stuck in low-paying jobs might be able to get an Apple Watch, but they aren’t allowed to even use these features. I’ve never met better people than I did when I worked retail, and while all of us who aren’t independently wealthy are notionally on the same team, there is a clear divide: only people with money and time can access the tools Apple offers to make lives better, and healthier, if Apple’s claims are true.

This goes for everything in the United States, of course. If you want a functional car that isn’t going to break down on the drive home from buying it (this happened to my family with a cheap used car), you buy a new vehicle with a strong warranty and have a backup (or two!) at home. If you’re poor you ride public transportation and it’s costly in both time and your health, especially during a pandemic when riding it exposes you to more people who do not have the option of accessing health care or the option of working from home or taking a day off of work when they feel ill.

Public transportation is also a fantastic example of the kind of thing that, when made a real public good that even the rich want to use, becomes amazingly useful. Some (very few, but some) countries don’t even charge for public transport at the time of use and instead actually tax their wealthiest citizens to fund public services. Those wealthy people also have other transportation options, but the public transportation systems that everyone uses are clean, well-maintained, and run well and often enough that users don’t have to worry about being late or missing a bus or train. This is what I think the ideal Apple would be: a public company that is actually public, where the wealthiest people prefer to use its products and services because they’re the best option for technology products, and without a profit motive that makes the technology exclusive to those who can afford it. If we, instead of funding the military to the tune of multiple tens of billions of dollars a year, funded publicly accessible technology, everyone would benefit.

This is not to say that technological access is the biggest thing that needs funding: public schools, health care, housing, transportation, all of these are far more important on a daily basis, but reprioritizing funds away from military contractors and toward every public good could make this technology accessible in addition to everything else people actually need to live and thrive. The drive to teach children to code is easy to understand as employers wanting more potential employees to exploit so they can pay all of them less; it is not a genuine drive to make anyone’s life better except the executives’.

If Tim Cook and the other executives behind Apple truly cared about anyone’s health, the technology they create would be accessible to more people. They wouldn’t even have to nationalize to take steps towards that goal, but they definitely can’t do it as a “publicly” traded company without their internal workers getting representation at the same level of power as Cook.