Apple's WWDC returns as a mostly online-only event from June 6th to 10th. At the pre-recorded, developer-focused infomercial on the 6th, we should learn about new operating system features coming in the fall, and potentially new hardware as well.
Notably, the logo for this year's event almost looks like an application icon for Apple's Swift programming language. It could hint at improvements to developing in Swift on the iPad, or it could be something else entirely.
Either way, I plan to write about the infomercial on the 6th. You can find out more about the developer-focused activities for students and others at WWDC 2022 here.
Deprecations and Removed APIs
Periodically, Apple adds deprecation macros to APIs to indicate that those APIs should no longer be used in active development. When a deprecation occurs, it's not an immediate end of life for the specified API. Instead, it is the beginning of a grace period for transitioning from that API to newer and more modern replacements. Deprecated APIs typically remain present and usable in the system for a reasonable time past the release in which they were deprecated. However, active development on them ceases, and the APIs receive only minor changes to accommodate security patches or to fix other critical bugs. Deprecated APIs may be removed entirely in a future version of the operating system.
As a developer, stop using deprecated APIs in your code as soon as possible. At a minimum, new code you write should never use deprecated APIs, and if your existing code uses them, update that code as soon as possible.
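Developers can use the same mechanism for their own APIs. Here's a minimal Swift sketch, with hypothetical function names, showing how an @available annotation marks an old entry point as deprecated so every remaining call site gets a compiler warning pointing at the replacement while still building during the grace period:

```swift
import Foundation

// Marking our own old API as deprecated so every remaining call site gets a
// compiler warning pointing at the replacement. The function names here are
// hypothetical examples, not real Apple APIs.
@available(*, deprecated, renamed: "fetchRecords(matching:)",
           message: "Use fetchRecords(matching:) instead.")
func legacyFetchRecords(query: String) -> [String] {
    return fetchRecords(matching: query)
}

func fetchRecords(matching query: String) -> [String] {
    // The modern implementation lives here.
    return []
}

// This call still compiles and runs during the grace period,
// but emits a deprecation warning at build time.
let results = legacyFetchRecords(query: "invoices")
print(results)
```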
Deprecation of OpenGL and OpenCL
Apps built using OpenGL and OpenCL will continue to run in macOS 10.14, but these legacy technologies are deprecated in macOS 10.14. Games and graphics-intensive apps that use OpenGL should now adopt Metal. Similarly, apps that use OpenCL for computational tasks should now adopt Metal and Metal Performance Shaders.
Metal is designed from the ground up to provide the best access to the modern GPUs on iOS, macOS, and tvOS devices. Metal avoids the overhead inherent in legacy technologies and exposes the latest graphics processing functionality. Unified support for graphics and compute in Metal lets your apps efficiently utilize the latest rendering techniques. For information about developing apps and games using Metal, see the developer documentation for Metal, Metal Performance Shaders, and MetalKit. For information about migrating OpenGL code to Metal, see Mixing Metal and OpenGL Rendering in a View.
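To make the migration concrete, here's a rough sketch of what a small Metal compute dispatch looks like in Swift; the doubleValues kernel and the buffer contents are made up for illustration, and a real app would typically compile its shaders ahead of time rather than from a source string:

```swift
import Metal

// An illustrative Metal compute dispatch: double every value in a buffer on
// the GPU. This is the kind of work an app might previously have done with
// OpenCL; the kernel source and data here are made up for the example.
let kernelSource = """
#include <metal_stdlib>
using namespace metal;

kernel void doubleValues(device float *data [[buffer(0)]],
                         uint id [[thread_position_in_grid]]) {
    data[id] = data[id] * 2.0;
}
"""

guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue() else {
    fatalError("Metal is not supported on this machine")
}

// Compile the kernel and build a compute pipeline for it.
let library = try! device.makeLibrary(source: kernelSource, options: nil)
let pipeline = try! device.makeComputePipelineState(
    function: library.makeFunction(name: "doubleValues")!)

// Copy some input data into a buffer the GPU can see.
var input: [Float] = [1, 2, 3, 4]
let buffer = device.makeBuffer(bytes: &input,
                               length: input.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// Encode one threadgroup with one thread per element and run it.
let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)
encoder.dispatchThreadgroups(MTLSize(width: 1, height: 1, depth: 1),
                             threadsPerThreadgroup: MTLSize(width: input.count, height: 1, depth: 1))
encoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()

// Read the results back.
let output = buffer.contents().bindMemory(to: Float.self, capacity: input.count)
print((0..<input.count).map { output[$0] }) // [2.0, 4.0, 6.0, 8.0]
```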
I was brought in to talk about the needs of games in general, but I made it my mission to get Apple to adopt OpenGL as their 3D graphics API. I had a lot of arguments with Steve.
Part of his method, at least with me, was to deride contemporary options and dare me to tell him differently. They might be pragmatic, but couldn’t actually be good. “I have Pixar. We will make something [an API] that is actually good.”
It was often frustrating, because he could talk, with complete confidence, about things he was just plain wrong about, like the price of memory for video cards and the amount of system bandwidth exploitable by the AltiVec extensions.
But when I knew what I was talking about, I would stand my ground against anyone.
When Steve did make up his mind, he was decisive about it. Dictates were made, companies were acquired, keynotes were scheduled, and the reality distortion field kicked in, making everything else that was previously considered into obviously terrible ideas.
I consider this one of the biggest indirect impacts on the industry that I have had. OpenGL never seriously threatened D3D on PC, but it was critical at Apple, and that meant that it remained enough of a going concern to be the clear choice when mobile devices started getting GPUs. While long in the tooth now, it was so much better than what we would have gotten if half a dozen SoC vendors rolled their own API back at the dawn of the mobile age.
While OpenGL isn't going away immediately in macOS Mojave, when it is finally gone there will be far fewer games on macOS. For decades it has been the only portable graphics API developers could use to bring their games to Linux and macOS, as well as other platforms.
Without OpenGL on macOS the Mac and Linux will both suffer, as will new platforms. They’ll have a harder time getting games and other software when bigger platforms are locked to vendor-specific APIs like Metal instead of cross-platform ones like Vulkan and OpenGL.
If I had to guess, I'd hope that Valve will ship an intermediary layer that translates OpenGL calls for games on Steam, and that they'll make this software available to everyone else. There are already some other projects that translate OpenGL to platform-specific calls, but it won't be easy for games to support them; it would be better if these projects had something to handle the translation on the fly. It's also entirely possible that Valve will just give up on older games supporting modern versions of macOS after Apple fully removes OpenGL.
I don’t envy anyone trying to support old software and write good OpenGL drivers like Apple has (even when they don’t update their OpenGL support for years), but the deprecation of OpenGL is a real “Fuck You” to game developers and players unlike any other. Games getting updated from 32-bit to 64-bit, as well as going through the process of having any kind of graphics portability layer added on top, seems unlikely. Thousands of games are going to be lost to time when OpenGL dies off. Competition with popular hardware and software platforms will be even more difficult. I understand the desire to get rid of technical debt, but this is bad.
iOS 12
Craig Federighi boasted about iOS 11 being available on iOS devices going back to 2013, and that the majority of their users are on iOS 11 versus the 6% they claim are on the latest Android update. According to Google, the percentage of Android users running the latest operating system (Android 8.1) is 0.8%, ouch. I would guess that Apple based their estimate on the percentage of devices capable of running 8.0 or 8.1, which is 4.9% and 0.8% respectively. The largest share of Android users (25.5%) appear to still be in the stone age on Android 6.0.
According to Federighi, the primary focus of iOS 12 will be performance; specifically, he said that older devices would perform better. Using an iPhone 6 Plus as an example, Federighi claimed that apps would launch 40% faster, the keyboard would display 50% faster, and the camera would open 70% faster.
Adobe's CTO Abhay Parasnis announced that their products would support Apple's new USDZ format and that Apple's developer State of the Union would have some kind of demo of this integration.
Measure
Federighi announced a new app for augmented reality, Measure, which will presumably replace everyone's first ARKit app by giving you measurements for real-world objects. The app can detect some shapes and automatically provide measurements for those, but you'll need to tap at the ends or corners of most real-world objects to get the app to measure them.
Apple's USDZ Demos
Federighi showed off a USDZ 3D scene of a fish embedded in the Apple News app, and then customized a guitar on Fender's web store and displayed it on the stage through an iPad's AR viewfinder.
ARKit 2
Federighi said that iOS 12 would include a new version of ARKit, Apple's API for developers to create augmented reality experiences. The flagship feature for ARKit 2, shared experiences, allows multiple devices to view and interact with the same scene. A slide showed a pre-recorded video of a multiplayer AR game where both participants and an observer could see and interact with the same virtual objects.
Martin Sanders, the Director of Innovation at Lego, demonstrated the new object detection features of ARKit along with shared experiences by scanning a physical Lego set and then awkwardly holding two iPads around it with someone else from the Lego company. The iPads showed a city being built around the town square Lego set with some limited interactivity to place vehicles and minifigures. It looked like it could be fun, but this kind of AR scanning will probably be limited to sets that Lego produces, not the modifications people make to them. Sanders said that the Lego AR experience app would ship later this year.
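For developers, the shared-experience feature is built around serializing and sharing an ARWorldMap. The following is a rough sketch of that flow under the assumption that you already have a networking layer (sendToPeers is a placeholder); it is not Apple's sample code:

```swift
import ARKit

// A rough sketch of ARKit 2's shared experiences: one device serializes its
// ARWorldMap and sends it to peers, and each peer relocalizes into the same
// coordinate space. sendToPeers(_:) is a placeholder for your own networking
// (MultipeerConnectivity, Bonjour, etc.).
func shareCurrentWorldMap(from session: ARSession) {
    session.getCurrentWorldMap { worldMap, error in
        guard let worldMap = worldMap else {
            print("World map not available yet: \(String(describing: error))")
            return
        }
        if let data = try? NSKeyedArchiver.archivedData(withRootObject: worldMap,
                                                        requiringSecureCoding: true) {
            sendToPeers(data)
        }
    }
}

func joinSharedExperience(on session: ARSession, with receivedData: Data) {
    guard let worldMap = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                                 from: receivedData) else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = worldMap // relocalize into the shared scene
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}

func sendToPeers(_ data: Data) {
    // Placeholder: send the serialized world map to other devices here.
}
```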
Photos Search & For You
Federighi said that Photos would improve in iOS 12. He highlighted the search functionality that would offer suggestions and other improvements to help you find photos you're looking for. The app will also get a new tab, For You, that offers suggestions of photos you might like to see or ways you could change them to make them more appealing. It seemed like the editing suggestions were limited to ones that involve Apple's features like looping a live photo or making some specific change to a portrait photo.
The For You tab will also suggest sharing photos with people who appear in them. He said that they would be shared at full resolution, which is an improvement because photos shared today aren’t at the original resolution. Federighi also said that the recipient of shared photos would be prompted to share photos from the same event so that everyone ends up with the entire set.
Siri Shortcuts
Federighi mentioned a new feature called Shortcuts that lets Siri easily trigger functions of other apps. The first example was "I lost my keys," which would have Siri open part of the Tile app's functionality to look for them. Apps in iOS 12 will suggest shortcuts to add to Siri. The iOS 12 lock screen and search will also suggest things to do, like re-ordering coffee or turning on Do Not Disturb when you're at a movie theater.
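For developers, the simplest way to surface one of these shortcuts appears to be donating an NSUserActivity with the new iOS 12 prediction flags. A hedged sketch, with a hypothetical activity type and coffee-ordering example:

```swift
import Intents
import UIKit

// A sketch of "donating" a shortcut in iOS 12 so it can show up in Siri
// suggestions and be recorded as a voice shortcut. The activity type,
// titles, and phrase are hypothetical; a real app declares its activity
// types in Info.plist under NSUserActivityTypes.
func donateReorderCoffeeShortcut(from viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.coffee.reorder")
    activity.title = "Order my usual coffee"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true            // new in iOS 12
    activity.suggestedInvocationPhrase = "Coffee time" // phrase Siri can offer to record
    activity.persistentIdentifier = "reorder-coffee"   // lets the app delete the donation later

    // Attaching the activity to a visible view controller donates it to the system.
    viewController.userActivity = activity
    activity.becomeCurrent()
}
```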
Federighi said that you’d be able to make your own shortcuts in a new “Shortcuts Editor” app.
Apple's Kim Beverett demonstrated the shortcuts functionality with a few examples, one of which was adjusting a shortcut she made in the Shortcuts Editor. Beverett's Heading Home shortcut compresses a bunch of steps into one action: it lets her roommate know how long it'll take her to get home, sets her thermostat to 70 degrees, turns on a fan, and opens Maps with the route home preconfigured. Beverett quickly used the editor to add a radio program she likes to listen to on the way home, so it plays whenever she uses the shortcut. This Shortcuts Editor is clearly the Workflow app that Apple acquired last year.
Other App Updates
Apple's Susan Prescott demonstrated a few other iOS 12 app updates. Apple News is going from a 5-tab interface to 3 tabs, with a new tab called Browse to handle the functionality from the current Search and Following tabs.
The Stocks app is completely rebuilt, and has business news built-in and new charts. The app will also be on the iPad.
Voice Memos is "even easier to use" and is also coming to the iPad; it'll use iCloud for file synchronization.
iBooks is getting refurbished with a new name and interface; it's just Apple Books on iOS 12.
CarPlay is going to support 3rd party navigation apps.
Do Not Disturb
Federighi returned to show us how iOS' Do Not Disturb will be updated with a new feature called Do Not Disturb During Bedtime. It will hide notifications so you won't be "barraged" with them if you wake up in the middle of the night and just want to see what time it is. You can also granularly enable Do Not Disturb until you leave a location or for a certain amount of time.
Notification Updates & The Screen Time App
You'll be able to change the way notifications are delivered, or turn them off entirely, directly from the lock screen. You'll also get suggestions to disable them for specific apps when you rarely interact with them.
Notifications will also be grouped together by what app sent them.
The Screen Time app will give you a weekly report of how you’re wasting your life on your devices. It’ll also let you set time limits for each app. You’ll get reminders about that limit when you’re about to reach it, or you can set limits for the apps your kids use by category or by picking each specific app.
Animoji & MeMoji
Animoji are getting updated to let you stick out your tongue with your 3D avatar, and they're getting four new characters: the ghost, koala, tiger, and T. rex. You can also make a character based on your own face; Apple calls that a "MeMoji."
Kelsey Peterson demonstrated these effects on stage, and the selfie camera can now display the characters over your real face.
FaceTime
FaceTime can now have up to 32 people on a group video chat. It can be launched directly from a group text message chat in Messages. The different camera effects, like Animoji, are also available in FaceTime.
WatchOS 5
Apple's Kevin Lynch talked about new improvements for fitness and communications.
WatchOS 5 is getting features that let you compete with your friends in exercise. The Workouts app is getting yoga and hiking workout types, and running gets a rolling pace measurement, pace alerts, and cadence measurements. WatchOS 5 will also automatically detect when you've started some types of workouts, and it'll retroactively credit you with the workout if you tell the Apple Watch to start it late.
There’s a Walkie-Talkie app for WatchOS 5 that goes over cellular or wifi connections.
The Siri watch face is getting information about sports, maps, and your heart rate. Siri Shortcuts and third-party apps will also be available on that watch face.
Raising your wrist will let you talk to Siri right away; you won't need to use the "Hey Siri" hotword.
Notifications can also be more interactive, like on the phone. One example was getting a notification from Yelp about a dinner reservation.
Apple’s WebKit embedded browser will be available for some web links on the Apple Watch.
The Podcasts app is going to work on the watch; it'll sync full episodes from your phone.
Apple’s Julz demonstrated these new features while riding an exercise bike, which was pretty impressive.
Kevin Lynch returned to tell us about student ID cards coming to the watch. It'll work with just six universities to start, but that includes Temple University in Philadelphia.
Lynch also introduced a new rainbow watch band and watch face for Pride month. The watch face is supposed to be available today.
Apple TV 4K
Apple's Jen Folse talked about how last year's Apple TV 4K is getting Dolby Atmos in the next version of tvOS, and how movies will get free upgrades to support the new audio format.
Folse reiterated Apple’s support for live TV and sports.
Folse also said that various cable providers are letting their users replace their cable boxes with the Apple TV, and use a new “Zero Sign-On” instead of the single sign-on system from last year.
The Aerial screensaver on the Apple TV is finally going to tell you what you're looking at. It'll also get a new view of Earth from the International Space Station.
macOS Mojave
Craig Federighi returned to talk about the new features in macOS Mojave.
As Apple leaked, the next version of macOS is getting a dark mode. Previously you could only darken the menu bar and Dock in settings, but this is a more complete, system-wide feature.
Desktop Stacks is another new feature that organizes the crap that gets scattered all over your desktop into stacks of documents grouped by type, date, or tags. It'll automatically keep those stacks organized as new files are added to the desktop.
Finder gets a new Gallery View that appears to replace the Cover Flow view with thumbnails at the bottom of the screen and a quick look type of view at the top.
The Finder is also getting a detail sidebar with shortcuts to different actions you might like to take on a file. Federighi specifically mentioned adding custom Automator actions to the sidebar for certain types of files, like watermarking documents with an action that was put together earlier.
The Quick Look utility is enhanced to let you do more from it; Federighi demonstrated trimming a video and signing a PDF document without opening a separate editor.
Screenshots are enhanced to be similar to the iOS feature, but with the more powerful options you'd expect on a computer. You can also capture video directly from that utility now.
Federighi demonstrated the Continuity Camera feature by taking a photo with his phone and popping it directly into a Keynote slide. It can also scan documents from your phone directly into a Mac app.
A few iOS apps are hitting the Mac: Apple News, Stocks, Voice Memos, and Home for controlling HomeKit devices.
Privacy
Mojave will have additional privacy protections to prevent other apps from getting access to your personal data unless you want them to have it.
Safari on Mojave and iOS 12 will prevent advertising publishers from tracking you based on sharing functions and comment fields. Mojave and iOS 12 will also hide more of your information from being fingerprinted by websites and advertising publishers.
Mac App Store
The Mac App Store is finally getting updated; it hasn't had many changes since 2011. Apple's Ann Thai showed off a redesigned UI that takes a lot from the iOS App Store. It looks much better.
Metal & Core ML on iOS and macOS
Federighi returned to talk about Apple's Metal graphics and computation API and external GPUs. He boasted about the speedups that Macs can get from up to four GPUs. It's kind of crazy that the only GPUs Apple supports externally are from AMD.
Apple's machine learning API, Core ML, is getting a second version. It's supposed to be 30% faster, with a 75% reduction in model size.
UIKit on the Mac
As has been rumored, Apple is making it easier to bring iOS apps to macOS. No specific name was given for this technology, but Federighi says they're using it internally with apps like Apple News and promises that it'll be available for 3rd-party developers next year.
Release Dates for iOS 12 and macOS Mojave
“This fall.”
Overall I'm very happy that macOS is getting a new Mac App Store; it's been way too long, and developers have been abandoning it for independent distribution systems. I like having apps bundled into one store, but it's also good that Apple finally wants to compete with independent distribution.
It's a little disappointing that the iPad Pro didn't get updated at this event, but perhaps they're moving the majority of hardware updates to the Fall event.
I’m very happy to see that Workflow is still getting updated, and it looks like it is even more useful after Apple has updated it under the Siri banner. I wish that it were on the Mac as an alternative to Automator.
Here’s what Apple announced at their WWDC 2016 Keynote, or you can just watch the video.
WatchOS 3
Kevin Lynch spoke about the changes coming in WatchOS 3.
Apps that you pick update in the background and launch faster as a result.
The long button below the crown will no longer launch your VIP contact list; it'll now launch the dock app switcher. The dock displays a live view of the applications so that you can see updated information without even entering one.
Glances are gone, replaced by Control Center when you swipe up on the watch face.
Messages immediately gives you options to reply, instead of making you choose that you want to reply first.
When a message comes in you can scroll down to get a list of suggested replies.
WatchOS 3 will have a new input interface called Scribble. You draw out letters one at a time with the whole word appearing above the input interface. This supports both English and Chinese characters.
In addition to the Mickey Mouse watch face, you can now choose Minnie Mouse with different outfit color options.
There is a new watch face called Activity that more prominently features the activity rings, which monitor your steps and other exercise throughout your day. The rings will be larger and displayed behind analog watch hands if you choose that style, or in chronograph and digital variations.
Kevin Lynch isn't done with new watch faces yet; Numerals is another. This one displays only analog watch hands and a prominent digit or digits, in many different fonts, for the current hour.
Another change to watch faces: you can now swipe to switch between them and choose which ones you'd like available.
More complications are available on more watch faces.
Stacy Lysik gave a demo of WatchOS 3, showing the audience how Apple's watch apps have been updated for quicker interactions. The Timer app gives you a few preset timing options to launch, for example. This should let my son more quickly set timers that confuse me when they go off, because he loves to grab my watch and mess with it while I'm holding him.
Kevin Lynch returned to tell us about the SOS feature that can call your regional version of 911 and alert your emergency contacts with your location after the call. It'll be activated in WatchOS 3 by holding down the side button, and it'll display a countdown before calling emergency services. Interestingly, Lynch mentioned that it'll work either via Bluetooth to your phone or over Wi-Fi if your Apple Watch is on a known network.
I wonder about the usefulness of an emergency call where your watch is broadcasting that call publicly on a speaker instead of through your phone’s earpiece.
Jay Blahnik appeared on stage to introduce new fitness-related improvements, starting with activity sharing as a competitive feature. With it, you'll be able to see your friends' and family members' activity rings and data like steps and calories burned. If your friends use third-party apps and devices, it sounds like those will work with this feature as well, as long as they use Apple's HealthKit as a data intermediary.
When you view a friend’s activity information you’ll be able to send them suggested encouraging or competitive messages about their progress.
Jay Blahnik continued on to introduce new technology to recognize activity information for wheelchair users. He talked about how they have adjusted different notifications and other information if the wheelchair setting is enabled. For example, the Apple Watch can optionally remind you to stand for a minute once an hour. For wheelchair users the watch will remind you to take a break and push around a little. Some of the workouts will be specifically for wheelchair users.
Finally, Blahnik introduced an app called Breathe to remind people to take stress reducing breaks. It’ll have different kinds of optional reminders and will be able to guide you through a session with visuals or haptic feedback.
Lynch returned to the stage to discuss how WatchOS 3 will be improved for developers to enable integration with those new features and improvements.
Apps on the watch will be able to use Apple Pay. Fitness apps will be able to run in the background during workouts and have access to more data that will enable new kinds of workout apps on the watch according to Lynch.
There are many more improvements and new APIs in the watch SDK, including SpriteKit. Games should be much better, though it'll still be a tough sell to keep your wrist raised for very long; I'd still expect the watch to be best for quick interactions.
Eddy Cue came on stage to talk about tvOS for the 4th generation Apple TV. He spoke about new apps like Sling, Fox Sports Go, the French TV service Molotov, and a few games like NBA 2K, Minecraft: Story Mode, and Sketch Party.
Cue then introduced improvements to the Remote app for iPhone which has all of the features of the physical 4th generation Apple TV Remote.
Siri for the Apple TV will be able to find shows and movies by topic. Cue searched for high school comedy movies from the '80s and got Ferris Bueller's Day Off and other options.
Siri will also be able to jump straight into live channels. MLB was notably absent.
Logins with cable providers will be reduced to a single sign-on. The App Store will let you know what apps/channels you have access to once you sign in.
The Apple TV will have a dark mode, and will automatically download available apps if you download them to another device.
Cue finished the tvOS talk by briefly mentioning some of the improvements for the developer kit and mentioned that the new version will be released publicly this Fall.
Craig Federighi was introduced to discuss the improvements to OS X, now renamed to macOS. The new version will be macOS Sierra.
Sierra will introduce new features cribbed from third parties such as unlocking your Mac when you have your Apple Watch close to it.
Another new feature borrowed from a third party, Tapbots' Pastebot in this case, is Universal Clipboard. You'll be able to copy and paste text, images, video, and more between Macs and iOS devices.
iCloud Drive will now include your files in the desktop folder and sync those between Macs as well as making them available via the iCloud Drive app on iOS.
If you are running out of space on your Mac’s hard drive, Sierra can attempt to offload older files to iCloud and remove other kinds of files that users don’t typically need like old application caches. There will be a new GUI specifically for doing all of these functions.
Apple Pay will now work in macOS Sierra through websites that support it. You'll authenticate on your iPhone with the Touch ID fingerprint reader or on your Apple Watch.
Craig announced another Sierra improvement: tabbed application windows, like web browser tabs. Developers won't need to do anything to support this functionality.
Videos will be able to go into a picture-in-picture mode from websites. You can drag to move or resize this always-on-top window which also persists across full screen application virtual desktops.
Finally, Siri will be available in macOS Sierra. She made a few jokes with Craig during a demo. Siri will be able to understand new questions that are more appropriate for a Mac. For example, Craig asked about files from a specific time period and location, and then did a follow up question to further refine the search. It is unusual for Siri to understand context.
Results from Siri can be pinned to the macOS Notification Center. Image results can be dragged directly into applications from Siri, or copied and pasted from an iOS device.
Sierra will be supported on these models of Mac:
Late 2009 & later: MacBook, iMac
2010 & later: MacBook Air, MacBook Pro, Mac mini, Mac Pro
Craig moved on to iOS 10 and discussed 10 new features.
Right off the bat: user experience improvements.
The lock screen has been redesigned. You’ll be able to raise your phone and it’ll wake up, bypassing the issue with newer Touch ID sensors that are so fast you never see the lock screen if you press a finger to the home button.
Notifications on the lock screen are more interactive and designed for 3D Touch to display more contextual information. Craig demoed 3D touching a calendar invite and seeing more details about it, before accepting it.
The next example that Craig demonstrated was a more interactive iMessage conversation where he could see more of the context of the conversation and also get images in it without leaving the lock screen notification.
If you use the illegal taxi service, Uber, their updated notifications will show you the location of your incoming car, with an unvetted driver who is probably making less than minimum wage after all is said and done.
Notification Center now lets you clear all of your notifications with a 3D Touch gesture that reveals a clear all button.
Control Center is simplified in its initial display, but it also has another page you can swipe to with bigger buttons for music control and an album art display.
It'll be easier to get to the camera from the lock screen; you can now slide from right to left to open the camera application.
Sliding from left to right displays a new widget view. The demo broke when Craig tapped on his calendar widget to “show more” of his day. An engineer was immediately brought on stage to answer for this failure.
There are more 3D Touch improvements for apps on the home screen. The email application can now display a widgetized list of contacts and a count of unread messages from them. The activity application on the iPhone can also display your activity rings in a widget that appears when you 3D Touch its icon on the home screen.
These widgets can also display live video through third party applications like ESPN.
Craig moved on to iOS 10’s improvements in Siri.
Developers will have access to Siri in iOS 10, though it appears to be limited to certain categories of apps, like messaging apps. Craig mentioned Slack, WhatsApp, and WeChat.
Siri will hail illegal taxi cabs through third-party apps like Uber, Lyft, and Didi; search photos in Pinterest, IM, and Shutterfly; start and stop workouts with MapMyRun, Runtastic, and Runkeeper; send payments with Number 26, Square Cash, and Alipay; and set up VoIP calls with Cisco Spark, Vonage, and Skype.
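These integrations go through SiriKit's domain-specific intents, handled in an Intents app extension. As a rough illustration (not Apple's sample code), a messaging app's handler for the send-message intent might look like this, with the resolution and confirmation steps omitted:

```swift
import Intents

// A sketch of a SiriKit handler for the messaging domain. A real Intents
// extension also implements the resolve/confirm steps for recipients and
// content; only the final handle step is shown here.
class SendMessageHandler: NSObject, INSendMessageIntentHandling {

    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // Hand the recipients and content off to the app's own messaging code here.
        let activity = NSUserActivity(activityType: NSStringFromClass(INSendMessageIntent.self))
        completion(INSendMessageIntentResponse(code: .success, userActivity: activity))
    }
}
```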
CarPlay will also work with third-party apps for messaging and VoIP.
The keyboard suggestions, QuickType, will be improved to understand the context of the conversation. Craig's example is that it'll understand the difference between playing in the park and the Orioles playing in the playoffs. If you're asked "where are you?" in iMessage, QuickType will give you a big suggestion to send your current location on a map. If you're asked for contact information, QuickType will suggest sending the contact. The calendar event suggestions that you'd previously see when tapping on text like "Sunday at 2PM" will now be more context-aware, understanding that you've been talking about a certain type of food and a street address and including those in the suggested calendar event.
The keyboard will now support multilingual typing; I won't have to switch between German and English anymore!
Photos will now display your photos pinned on a map, and will run facial recognition to understand who is in which pictures. I hope it works better than iPhoto did at that. Photos will also detect objects and scenes in pictures so that you can search for them more easily. Craig displayed an example picture of someone riding a horse by a lake with a mountain, and said it would understand the mountain, the horse, and the scenery if you search for those. I wonder if we'll see progress bars scanning all of our photos the first time we open the new Photos app in iOS 10 as this occurs. I suspect that this will be done on-device due to privacy concerns.
Photos will also group together different trips, people, groups, and topics like “on the water” or “at the beach” into a new interface. Craig demonstrated the new “Memories” tab in Photos on iOS 10 that shows the product of this computer vision work. It’ll even create a short video of different events with Ken Burns style zooms of different photos and videos as well as music layered on top. Craig assures us this will pick the right music, but he demonstrates overriding the music choice and how Photos will re-edit the “memory movie” to match the music change. These changes will also come to macOS Sierra.
Craig takes a break as Eddy Cue returns to show us updates to Maps. iOS 10 Maps will offer you some suggestions based on your calendar and where you commonly go at different times of day. If you search for restaurants, it'll display a horizontal list of cuisines and restaurant styles you can pick from, and adjust its suggestions based on which you select. Navigation is also improved. Eddy shows us how the view is more dynamic, zooming in and out intelligently based on where your next turn is and other information. Maps will give you suggestions for things along the route, and tell you how long it'll take to reach your destination if you stop at these suggestions. There's also a Maps extensions API; Eddy's example is booking a table at a restaurant with OpenTable, hailing an illegal taxi cab with Uber, and paying for it all inside Maps. Cool.
Apple Music is also getting updated with an "all-new redesign." I don't see the Connect tab anymore. Bozoma Saint John went on stage to give a demo. The Music app looks to be streamlined without that Connect tab, and pretty different. Lyrics are displayed right below the controls for the music if you scroll down. Very nice. Bozoma tried to get the audience to rap along with Rapper's Delight, and the audience cam showed us various Apple folks attempting to do so; it was extremely embarrassing. The new Apple Music interface looks great, though.
Eddy Cue comes back to discuss Apple News improvements. You can already read this site on there, so they've done a bit of extra cleanup to improve the rest of the app. For example, they've added subscriptions and breaking news notifications.
Craig is back to discuss HomeKit improvements. New categories of supported IoT devices have been added. A dedicated Home app is going to be available to control all of your homes. I'm sure monocles are popping out everywhere. Preset scenes will be available to adjust several devices at once, and Siri can control these scenes and devices. If you swipe over on Control Center you'll be able to control devices from there. Notifications will be able to interact with HomeKit devices and display live video from a doorbell device, for example. The Apple TV will act as the hub for your HomeKit devices so that you can access them remotely when you're away from home.
The Phone app will be updated to transcribe voicemails, though this functionality is in beta. An extension API will let other applications interact with the Phone app, in an example slide an extension from Tencent identifies an incoming caller that isn’t in your contacts as a possible spam caller.
When VoIP calls ring through, they'll integrate into the lock screen and the Phone app just like regular calls. Contacts will be updated to display the options you have to call someone through multiple services. They've worked with Cisco to let you get your work calls through your iPhone.
The Messages app has been updated to provide previews of links inline. Videos and photos will display right in Messages. The camera will display a minified camera app inside of Messages when you tap that button. Emoji are now three times larger. If you tap the emoji button after typing up a message, but before sending it, it will highlight words that can be replaced with emoji.
Message bubbles can now have effects like shrinking or growing to emphasize the emotion behind words. Messages can be hidden for a recipient to reveal with a swipe over the text or an image. They've also added a short series of icons to send a thumbs up, or a laugh, to someone. Apparently we don't have enough time to select the appropriate emoji now. This demo slide broke and displayed the thumbs-up above the message it was replying to.
You can also send "handwritten messages," though it isn't clear if these are keyboard input being turned into a generated font with some ink physics or brushed on with a touch. You can also send "digital touch" messages like drawing a smiley face on the Apple Watch, and you can draw on video or photos with digital touch. There are also fullscreen effects that appear behind the messaging transcript; a slide shows us some fireworks behind the chat bubbles after you send "Happy New Year!"
Bethany and Emron are introduced for a demo; they're engineering and human interface design team leads. Bethany demonstrated sending and receiving some different message types, and Emron received a link to some music that he could play right inside Messages.
iMessage will now support developer-created apps that are launched from an "app drawer" for things like stickers. Craig tells us that some types of apps won't even require code: artists can make a sticker app without coding, but you could also integrate other iOS functionality like the camera. Square Cash will work inside of Messages as well.
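For the apps that do require code, developers build on the Messages framework. Here's a minimal, illustrative sketch of an iMessage extension view controller that inserts a message into the active conversation; the button action and image asset are hypothetical:

```swift
import Messages
import UIKit

// A minimal, illustrative iMessage app extension view controller. A real
// extension provides its own UI for picking content; here a single
// hypothetical button action composes a message and inserts it into the
// active conversation.
class OrderComposerViewController: MSMessagesAppViewController {

    @IBAction func sendLunchInvite(_ sender: UIButton) {
        guard let conversation = activeConversation else { return }

        let layout = MSMessageTemplateLayout()
        layout.caption = "Lunch order?"
        layout.image = UIImage(named: "menu-preview") // hypothetical asset

        let message = MSMessage()
        message.layout = layout

        conversation.insert(message) { error in
            if let error = error {
                print("Failed to insert message: \(error)")
            }
        }
    }
}
```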
Craig started a demo, sending a sticker and then applying some animated stickers to a photo message. The next demonstration was a group ordering food together through DoorDash, right inside of Messages. Each person selects the food they want from the restaurant (or food truck, in this demonstration).
The Mac and Apple Watch will be able to receive these conversations, but it sounds like they won't be able to use the messaging apps to create messages, which makes sense.
Here’s a video they showed to feature the new features in Messages:
Craig finishes up by blitzing through some other improvements to iOS 10. Live editing collaboration in the Notes app. Conversation view in Mail displays messages in a better threaded format. Live photos can be edited and they have stabilization now. Safari on iPad finally lets you use split view to display two websites at the same time. Previously you had to use third-party apps to do that. He also reminds us that many of these new features work for third-party developers.
Finally, he re-emphasizes that everything involving machine learning, the Memories functionality in Photos for example, will run on the device. The information that is sent to Apple's servers will also be kept private, with Apple not building profiles of users. Obviously this is a dig at Google, which mines all of your data with no regard for your privacy if it'll enable their advertisers to have better targeting.
Tim Cook comes back on stage to close things out and emphasize Apple's commitment to developers. He discusses how great Swift is, and how it's been the #1 language project on GitHub since the source was released. Cook also reiterates how important Swift is as a first language by announcing a new app called Swift Playgrounds for iPad.
Tim invites Cheryl Thomas on stage to demonstrate Swift Playgrounds. The app's home screen has lessons and challenges. QuickType-style code suggestions are offered inside lessons. There's a simple turtle-style demo of moving a character on the screen with code like moveForward() and collectGem(). Cheryl demonstrates wrapping a for loop around some pre-existing code. This looks like it could give Codecademy a run for its money, and though it's obviously limited to Swift, many of the lessons could apply to other languages. We need Xcode for iPad.
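The lesson code from the demo is simple enough to reconstruct. A rough approximation, with stub functions standing in for the commands the playground provides:

```swift
// Stubs standing in for the commands the Swift Playgrounds lesson provides;
// in the real lesson, moveForward() and collectGem() drive the on-screen character.
func moveForward() { print("step") }
func collectGem()  { print("gem collected") }

// Wrapping the pre-existing commands in a for loop, as in the demo:
for _ in 1...3 {
    moveForward()
    moveForward()
    collectGem()
}
```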
Cheryl opens a more advanced playground with a new coding keyboard to add new code to the playground.
Tim returns again to announce that Swift Playgrounds will be free, of course. Another video:
That could have been incredibly hokey, but I think they did a great job.
Tim Cook goes back over everything we’ve seen, and closes out the keynote.