OpenTTD, the open-source game of business transport simulation based on Transport Tycoon Deluxe, is now available for free on Steam for Windows, macOS, and Linux. The developers recommend that new players check out OpenTTD’s manual, a 26-part tutorial series on YouTube, and a short 14-minute video on signaling. This seems like it’s in the Dwarf Fortress realm of difficulty, but those guides should help.
Connectivity Issues with AirPods and the Mac
But… connecting and disconnecting AirPods on the Mac is so much more frustrating than on iOS. While iOS 14 brought more intelligent connection and disconnection of AirPods, Big Sur can’t get with the program. It can take a long time to connect the AirPods, and they seem to disconnect at the drop of a hat.
Typically I use a pair of Sony MDR-7506 headphones with my Mac, connected through a USB audio mixer; both of those devices are a little more than a decade old. But when I need to make a video call on my Mac I use my one working Powerbeats Pro earbud (the left one; the right one turns off after about 10 minutes of use), and trying to determine that it is connected, that it is the default microphone, and which microphone the app I’m using actually selects is a nightmare.
I don’t want to go back to switching which device (my Mac or my Windows computer) has a physical microphone attached, and my experiences otherwise match Snell’s in this article. My wired Sony headphones are more than a decade old and work great, even if they have some signs of wear at this point (I’ve replaced the ear pads four times now); nothing beats the reliability and consistency of actual wired headphones. In that same time I’ve gone through several pairs of AirPods with warranty servicing, and now this pair of Powerbeats Pro that don’t have an extended warranty and that I wasn’t able to get serviced during their first year. You can’t beat the convenience of AirPods and their cousins from Apple’s Beats brand, but they do not last at all and are not convenient to use with a Mac.
Apple also recently released a pair of over-ear headphones called AirPods Max, and they are completely ridiculous. $550 just for the headphones, and the included “Smart Case” doesn’t cover the entire headphones. At least they have replaceable ear pads… oh wait, those are $70! The most expensive ear pads I’ve ever gotten for my Sony MDR-7506 headphones are $20. The AirPods Max ear pads look like they will be much easier to replace, but they also look more wasteful: there is a hard plastic part in the replacement, not just the foam mesh ear pad I replace on the Sony headphones.
The AirPods Max headphones also do not include any kind of wire for connecting directly to a device, just a Lightning-to-USB-C cable for charging and no charging brick. Without a direct wired connection there will be audio latency that makes the AirPods Max unsuitable for editing video or audio, or doing any other kind of low-latency work like playing video games. Apple does sell a cable that will directly connect the AirPods Max for $35, but you can’t charge while you’re using that adapter. There are plenty of other Lightning-to-3.5mm cables, but they apparently won’t work.
Borrowed from the Apple Watch series of devices, the AirPods Max have a Digital Crown to change the volume, access Siri, and so on.
Topping off the design of the AirPods Max is the weight: 385 grams. That’s heavy. My Sony MDR-7506 headphones are 229.63 grams, which is completely comfortable. Even Gruber noted the weight in his review, titled “Heavy Is the Head That Wears the AirPods Max”:
The AirPods Max headband does seem to distribute the weight as comfortably as it can, but the weight is all in the ear cups, and heavy ear cups are, well, heavy. When you remain motionless, you can forget they’re there. But when you move around, the AirPods Max have inertia. They move a bit when you shake your head side-to-side, and they move a lot when you nod your head up and down. Look down at your feet and look back up and you’re instantly reminded, Oh yeah, I’ve got heavy cans on my ears. You feel a bit bobble-headed with them on. The heaviness of the AirPods Max doesn’t make them uncomfortable, per se, but it definitely feels like they’re intended for stationary use. Their lack of water resistance aside, the weight keeps them from being the sort of headphones you’d want to use while exercising any more vigorously than a brisk walk.
There are plenty of headphones that cost $550 or more, but after my experiences with the regular AirPods and the Powerbeats Pro I would definitely not recommend anyone spend this much on these. When my Sony MDR-7506s eventually become irreparably broken, I will get another pair of them. A decade is plenty of life for headphones that cost less than $100. For my iPhone, iPad, and video call use I’m going to get the cheapest pair of regular AirPods I can. The other features of the AirPods Max sound great (20 hours of battery life, active noise cancellation, and a transparency mode that lets you hear what is going on around you), but the inconsistent experience of using AirPods with a Mac, the ridiculous Smart Case, and the high price of both the AirPods Max and their replacement parts make them both out of reach for me now and completely unserviceable over time. Replacing the ear pads on my Sony headphones has cost about $60 over ten years for four replacements. Replacing the AirPods Max ear pads four times over a decade would cost $276. Hopefully those Apple ear pads are more durable and last longer.
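For the record, the arithmetic behind that comparison is simple. A quick sketch (the per-pair prices are approximate: my Sony pads averaged about $15 a pair, and the $276 total implies the Apple pads actually run $69 a pair):

```python
# Rough ear pad replacement cost over a decade, four replacements each.
# Prices are approximate: Sony pads averaged ~$15 a pair (topping out
# at $20); the $276 AirPods Max total implies $69 a pair.
replacements = 4

sony_total = replacements * 15    # about $60 over ten years
apple_total = replacements * 69   # $276 for the same four swaps

print(f"Sony MDR-7506 pads: ${sony_total}")
print(f"AirPods Max pads:   ${apple_total}")
print(f"Difference:         ${apple_total - sony_total}")
```

That’s a $216 gap just on consumable parts, before you even get to the $450 difference in the price of the headphones themselves.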
As a complete counter to their most expensive headphones, Apple’s Beats brand now has a $50 pair of headphones called Beats Flex: Bluetooth earbuds connected to each other by a wire that is meant to go around the back of your neck when worn, and the earbuds magnetically connect when you’re done with them. They don’t have a case, and at 12 hours on a single charge they last longer than typical AirPods, which get 5 hours until you put them back in their charging case. The Flex are also available in a variety of colors (black, yellow, blue, gray) compared to the white AirPods and AirPods Pro. But the $50 price tag only gets you the first generation of AirPods chip, the W1, instead of the newer H1 in the 2nd generation of AirPods and AirPods Pro. (The aforementioned AirPods Max have an H1 for each ear.) The older W1 chip has more latency between the device making noise and the headphones receiving it, and lacks hands-free Siri access (which is almost entirely terrible if you call anyone in your life “Sweetie” preceded by “Hey”). The W1 is also incredibly slow to pair to a device. The Flex do have a few things AirPods don’t: you won’t be as likely to drop one into water, thanks to the cable connecting the two earbuds together, and they charge via USB-C. On the other hand, the Flex won’t sense they’re out of your ears and won’t pause podcasts or music until you magnetically link the earbuds. Apple also has an Android app for updating the Beats Flex firmware, something they don’t make available for their AirPods line of products.
The Verge’s Chris Welch liked the Flex for what they are. I don’t think I’d recommend Beats Flex for anyone who wants to use them with a Mac, but it is incredible that Apple makes competent Bluetooth headphones that cost less than replacement ear pads for the AirPods Max.
Supposedly new AirPods and AirPods Pro designs are coming this year with shorter stems and new charging cases. I hope this doesn’t mean all of the new designs will be in-ear. One of the reasons why I’d like to go back to the regular AirPods is that they are more comfortable for my ears.
Apple held their annual World-Wide Developer Conference event in San Jose, California today. Here’s what they announced:
Tim Cook’s Services Showcase
Tim Cook hosted and reiterated how great he thinks Apple is, and briefly recapped their services event. As if to fix their perplexing lack of trailers or clips of their original TV programming (Apple TV+) from the services event, Cook introduced a clip for For All Mankind. It’s the Ron Moore-led show about the Russians winning the space race, and it was extremely incongruous with the rest of the event.
Cook introduced a redesigned home-screen for tvOS with full-screen background previews of shows.
Up next, multi-user support for tvOS so you only see your shows. A blessing for parents who never wanted Game of Thrones mixed with Sesame Street. Unless it’s in a Sesame Street parody of Game of Thrones, which I am ready for.
As an aside, Cook showed how you could swap users in tvOS using a new control-center menu that looks like it is available via a swipe in from the right.
The components in the new tvOS control center showed the time, date, profile pictures for the different tvOS users, a quick option to put the Apple TV to sleep, music controls, the audio output dingus which will be handy for AirPod users, and a search icon.
Cook showed that users would have personalized music options; again, my recommendations will no longer be destroyed by the music I play for my family. Fantastic. This multi-user stuff needs to be on every Apple device.
The updated Apple Music app for tvOS would display lyrics while you’re listening.
Apple Arcade was demonstrated with Oceanhorn 2.
Cook promised that the Apple TV would now support the Xbox One S and the PlayStation DualShock 4 controllers, which both have Bluetooth functionality. This is fantastic because the only MFi controller that has clickable thumbsticks is wired. I believe this functionality is coming to iOS as well.
Screensavers are getting updated with underwater vistas in collaboration with the BBC’s Natural History team.
Cook boasted about the current functionality of the watches Apple makes before introducing Kevin Lynch to talk about new features.
Lynch promised “more watch faces this year since the very first Apple Watch.” I think he meant that there are more new faces this year than there have ever been before.
Lynch demonstrated five new faces. A gradient face that moves an angled gradient with the minute hand. A “large numerals face” that showed the current hour in multiple languages. A modern digital face with huge chunky numbers. A hideous analog California dial with Roman numerals (X, XI, I, II) above the horizontal midpoint of the watch and traditional Arabic numbers (8, 7, 5, 4) below. Finally, a new solar face with multiple circles to represent the movement of the sun during the day.
According to Lynch, the new faces would have an optional hourly chime that they call Taptic Chimes. Each hour you’d either feel a nudge on your wrist from the “taptic engine” of the watch or, if sound is on, you’d hear something like a bird singing with the new modern face. This is extremely obnoxious. I can’t imagine anyone using it.
There are also new apps for the Apple Watch, an audiobook app for books purchased through Apple’s book store and a Voice Memos app to match the phone’s. Another new app is an Apple-provided calculator.
Finally, watchOS gets apps that run only on the Watch. Previously, every Watch app had to have a companion app for the iPhone, even when that made little sense.
In the first actual nod to developers for the event, Lynch promised new APIs. An Extended Runtime API lets apps run for longer. Lynch specifically called out that this would be for apps using sensor data, like meditation, exercise, or tooth brushing.
A new-to-the-watch Streaming Audio API would let developers stream audio over the network. Three examples were given: podcasts (the Outcast app), music (Pandora), and the MLB’s watch app with the Phillies versus the Brewers.
Lynch also promised a new UI framework this year.
Independent apps also mean that the watch gets its own App Store app to browse for & install new apps.
Dr. Sumbul Desai, Apple’s Vice President of Health, was introduced to talk about new health and fitness features for the Watch.
Desai promised an updated Activity app with more long-term information through trending data. The updated app is supposed to provide new data points that weren’t previously exposed.
One of the new features of the Watch is the Noise app, which monitors your auditory environment and alerts you when it’s so noisy that you could damage your hearing. There’s also a complication for the watch face to display the current noise level.
Desai promised that the Noise app wouldn’t record or save audio, and would instead just sample the volume every so often.
Another new feature for the Watch is the Cycle Tracking app for period tracking. It’ll also be available in the iOS Health app.
Lynch returned to discuss the newly redesigned iOS Health app. A Summary view provides health-related notifications and other summarized information. He also promised that all of your health information would be private and securely stored on your iPhone, or encrypted in Apple’s data centers.
watchOS program lead Haley Allen demonstrated the new watchOS functionality. Allen started with a new watch face, the Modular Compact face, that had four items on the screen: a large analog dial in the upper right, two new complications (Wind and Rain) on the left side, and a large complication space on the bottom with an upcoming calendar item. Allen replaced the large calendar complication with a Noise complication to measure the environmental volume and had the audience cheer to raise the volume.
Allen moved on to the Infograph Modular face with a pretty cool two-tone color scheme of white numerals with red highlights. Allen demonstrated the audiobook complication that let her tap to get back into the audiobook watch app.
The new app store was demoed, and it looks fine, although it isn’t clear how easy it will be to navigate from these short demos.
Finally, Allen demonstrated the MLB app streaming a Red Sox versus Orioles game.
Lynch returned with the traditional “didn’t have time” feature jumble and mentioned that the watch would finally update itself without requiring the user to do so on their iPhone.
Lynch said there would be new watch bands and colors, and specifically called out a new Pride band.
Apple shortchanges all of their retail workers equally when they make people line up for mandatory off-the-clock searches before being allowed to leave at the end of their shift.
Cook returned to move things back to iOS and introduced Craig Federighi who joked about the tremendous creativity behind the iOS 13 name, and then moved on to talking about how much performance has increased in current devices with the new operating system.
Federighi gave some examples and said that Face ID would be 30% faster to unlock devices.
Apps would be packaged differently for iOS 13: 50% smaller to download, and updates would somehow be 60% smaller. Who knows how much work is involved on the part of developers to attain those numbers, and what exactly is going on. Compression, selective assets: it could be any number of things, and those could be effective or not across a wide variety of apps and games.
Apps, Federighi said, would launch twice as quickly on iOS 13.
Moving on to new features, Federighi introduced a video advertising the new iOS dark mode.
Dark Mode is a feature that debuted on macOS last year. Federighi demonstrated it in use on an iPhone, starting with the lock screen switching from a light version of a wallpaper and theme to a darker version. He moved on to News, which will no longer blind you in the middle of the night, then swapped to the Calendar, Notes, and Messages. It’s a feature that should have been on iOS as soon as OLED screens became an option.
While in Messages, Federighi demonstrated that the keyboard, also updated for dark mode, has swiping functionality. He later called it the “QuickPath keyboard.”
Federighi showed that the new share sheet in photos also adjusted to dark mode, and it had sharing suggestions in addition to a redesigned interface for sharing to apps and actions.
The darkness embraced the Music app, which also has time-synced lyrics, just like the tvOS Music app.
Federighi exited the darkness and talked about other app updates.
Safari is promised to gain quick functionality for scaling the text of the site you’re viewing, plus per-site preferences.
Mail is to get “…desktop-class text formatting, including support for rich fonts.”
Notes has a new gallery overview, shared folders, and “much more.”
Reminders has been “completely reinvented” and “completely rewritten from the ground-up.” It’s supposed to give you suggestions of when or where you might want to be reminded, and a quick-type bar lets you quickly add locations or photos to your reminders.
To-do lists can be embedded inside a top-level reminder for more organizational functionality. People can be tagged in a reminder so you’re reminded to talk to them about something you wanted to remember.
The Maps app is supposedly updated with new information from a fleet of Apple’s mapping vehicles. The entire US is to have the new data by the end of the year.
Meg Frost was introduced for a demonstration of the new maps app. She’s wheelchair bound and has what I can honestly say is the most awesome looking wheelchair I’ve ever seen.
The redesigned Maps app has a horizontally-scrolling list of categorized favorites under the search bar, collections of places in a vertical list, and then your recently viewed places.
Finally, there’s a street view mode with a binocular icon that launches users into a “look-around” view that appears comparable to Google’s app.
Federighi returned to quickly list off a few more changes to the Maps app: a junction view for China that lets people see how traffic will flow through an upcoming intersection, and ETA sharing via quick on-screen shortcuts.
Federighi reiterated his company’s commitment to privacy with another word jumble and then talked about new privacy controls. Location could be shared with an app just once, and then the app would have to ask for approval the next time. You’ll also get regular alerts about apps using your location in the background, and Federighi said Apple wouldn’t allow apps to scan Wi-Fi and Bluetooth signals to track your location.
Federighi promised that Apple would create a new single-sign-on functionality for streamlining logins to apps without third-party services or new accounts. It’ll be up to the developer to support this method of sign-on. Federighi promised that this would create a new account for the app without revealing “… any new personal information.” Signing in would use Face ID, or presumably Touch ID. Developers can optionally request your name and e-mail address. Users can optionally share a one-time e-mail address that Apple creates on the spot to forward mail from that app. Federighi said that you could delete the new address at any time to stop receiving mail from that app.
It isn’t clear to me how this will work in situations where, for example, you want to log into the web version of an app, but Federighi said that it’ll work on websites. It won’t work in Firefox, Chrome, or Edge unless Apple has built something for those browsers.
Next, Federighi talked about a new feature of HomeKit, Secure Video, for home security cameras to analyze video locally (on an iPad, HomePod, or Apple TV box) instead of on remote servers. When the video is uploaded, it’ll go to iCloud but be encrypted and Federighi promised that Apple can’t even see the video.
Federighi said that storage for 10 days of video recordings would be included in existing iCloud accounts, and that this video storage wouldn’t count against iCloud storage limits.
The first companies to support this new Secure Video functionality are Netatmo, Logitech, and Eufy. At least two of those sound like Pokemon.
Federighi promised that there would be HomeKit enabled routers which just makes it even more infuriating that they stopped developing their AirPort line of network routers. HomeKit-enabled routers would segregate the network connections of internet-of-things devices. Federighi said that Linksys, Amazon-owned Eero, and the Spectrum ISP would make these routers at first.
Moving back to Messages, Federighi immediately sherlocked Casey Liss’s Vignette by saying that you could optionally share your name and photo with people you message so they wouldn’t just get a random phone number.
Federighi introduced popular YouTube spec workers’ “Memoji” avatars to talk about new features for those avatars. There are a ton of new features, here’s the video:
After the video, Federighi spoke about Memoji stickers. These could replace regular emoji with automatically generated Memoji versions of the regular emojis. It’s interesting that these are also included in the system keyboard, and Federighi said that the customized stickers would work in mail or other apps.
Finally, Federighi said that users without Face ID, but who have at least an A9 SoC, could create, edit, and share Memoji, although they wouldn’t be animated without the iPhone and iPad Pro sensor suite.
The Camera and Photo apps have also been updated.
New Portrait Mode features include “high key mono,” and you’ll be able to move the virtual portrait lighting lights further or closer to your subject.
Edits are promised to be easier to make, have new features, and be available for video as well. You can finally rotate a video while editing without using another app.
Browsing photos is to be easier, separating out screenshots and photos of things like receipts, and organizing photos to only show a single photo of a particular scene instead of every duplicate.
Justin Titi demonstrated the new Photos app, and how it organizes photos into easier collections of days, months, years, or still allows users to dig into every photo.
Federighi returned to introduce Stacy Leizeig (not sure on the spelling of that name) to talk about AirPods, HomePods, CarPlay, and Siri updates for iOS 13.
AirPods can optionally announce incoming messages as soon as they come in, so you can reply immediately. Leizeig said this would work with any messaging application that uses Apple’s SiriKit API.
Audio sharing will be available in iOS 13 to share audio between multiple devices. There is a user consent dialog before this happens, and audio is controlled independently.
Leizeig said that people could use the HandOff API to share music or a podcast to a HomePod by holding their device close to the HomePod. Recovering the listening session happens by the same gesture.
Siri is promised to have live radio access with over a hundred-thousand live stations.
The HomePod will finally recognize multiple speakers and personalize answers and data for Messages, Music, Reminders, Notes, and so on.
CarPlay updates include a new dashboard view with more information. Siri is promised to stay “out of the way” and not obscure the entire display. We need that functionality on regular iOS.
Siri Shortcuts is built into iOS 13 and “more powerful than ever.” The Shortcuts app will make more suggestions about multi-step shortcuts that might be useful to you.
The Siri voice in iOS 13 is promised to be entirely software generated for the first time, as opposed to generated using clips of different speech sounds to create each word. I hope this fixes Siri’s pronunciation of Hawaiian names, because it’s pretty awful. Leizeig compared iOS 12’s Siri to iOS 13’s Siri pronouncing a complex sentence scientifically describing absolute zero. The iOS 13 Siri sounded much more natural and human.
Federighi returned to close out the iOS 13 discussion and talk about the split to iPadOS. He introduced a video that has not been excerpted onto Apple’s YouTube channel yet; it showed new functionality that would be exclusive to the iPad.
Federighi demonstrated the newly defined iPadOS. He showed an iPad Pro (2018) running iPadOS with more icons on the home screen, and, by swiping over from the left, the widgets previously on their own home screen are now visible on the regular home screen.
Multitasking is improved by giving slide-over windows their own grab handle at the bottom of the slide-over window that lets you quickly choose a recent window.
Split-view was demonstrated in Notes by grabbing a note from the list and splitting it off into a different split-view window. Finally. The iPad has been available for almost 10 years and multitasking needs these improvements.
Federighi demonstrated dragging and dropping individual notes next to other “spaces” with other apps open in other windows. App Expose would allow you to see any fullscreen space with a Note open, in Federighi’s example.
Federighi also demonstrated two Microsoft Word documents side-by-side in split-view.
Federighi then demonstrated splitting e-mail composition into a new window while continuing to browse through other emails before dragging and dropping an image from another e-mail he had received to the e-mail he was composing. He also showed previewing a link in another e-mail, and then dragging that link and dropping the link into the e-mail he was composing.
An e-mail in the background joked about screensavers for iPadOS, including Flying Toasters. I’ll be holding out for the Dancing Disco Pig.
Federighi also talked about improvements to the Files app: new icon and list views, a column view with previews, and quick actions.
iCloud Drive will finally get folder sharing.
The Files app in iPadOS is to get Samba sharing; it sounded like this is just for accessing network shares, not for sharing from the iPad.
Finally, the Files app will get external storage support for thumb-drives, SD cards, and other connected drives. Holy shit this took forever.
Finally, you can import directly from a camera into an app (Lightroom was the example) instead of going through iCloud Photos.
Safari is also getting updated on the iPad with “Desktop-class browsing” so that iPad users won’t get served the mobile version of websites.
Federighi specifically said that browser apps like Google Docs, Squarespace, and WordPress would work better now.
As rumored, Safari is getting a download manager. There will also be 30 new keyboard shortcuts for Safari.
Also rumored, fonts are easier to manage in iOS 13. They’ll be downloadable from the App Store. Fonts have been an important focus of Apple’s software for a very long time; this is a big Finally for iOS. It will be interesting to see how fonts are managed, and if they will be freely loadable without going through the App Store.
iPadOS multitouch is supposed to be easier. Federighi showed a giant hand grabbing the page scroll indicator directly on a long document, and grabbing the cursor while editing a note and moving it more easily, while also adjusting that movement into a selection. A new gesture enables copy and paste on the iPad. Once text is selected, it is possible to copy by pinching with three fingers and then paste by moving the cursor and spreading three fingers out.
A three-finger swipe left is the new iPad undo gesture. Federighi joked about how people won’t have to shake-to-undo the iPad anymore.
The Apple Pencil latency is supposed to be reduced from 20 milliseconds to 9 milliseconds. The Pencil drawing tools are improved and available to third-party developers via a new API called PencilKit.
iPadOS 13 will allow you to mark-up “…anything on any app” by dragging up with the pencil from the corner of the screen to get a screenshot of the screen and mark it up. This is said to also work with an entire document, instead of just the currently visible screen area.
Toby Patterson was introduced to show off iPadOS’ new editing functionality, including bringing up a small iPhone-sized keyboard for typing with one hand. He accessed that new small keyboard by pinching the large keyboard with two fingers. Patterson was visibly flustered when trying to demonstrate some of the text selection gestures, which weren’t cooperating with him.
Patterson said you could still shake to undo, but he also showed off the new three-finger swipe. He said these gestures would work in any application that supports cut, copy, and paste with undo and redo, not just text-based document applications.
Moving on to the Pencil’s new tool palette, Patterson showed it being dragged and dropped around the screen so it could get out of the way of his work. It can be pinned to the edge, or minimized out of the way.
Patterson also showed the markup mode, and how it works with screenshots and can be toggled in apps that support it to the document mark-up mode.
Mac Pro 2019
Cook was brought back out to move things to the Mac Pro. John Siracusa’s dreams have come true. After a decade he can finally get a new 2019 Advanced-Cheese Grater Mac Pro.
John Ternus was introduced to talk about the long overdue replacement for the Trashcan Mac Pro. The Intel Xeon-based computer supports up to 28 cores.
The RAM is 2933MHz ECC, with 6 memory channels and 12 DIMM slots. You could have up to 1.5 terabytes of RAM.
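That 1.5 terabyte ceiling checks out if you assume 128 GB modules in all 12 slots (the module size is my assumption; it wasn’t spelled out on stage):

```python
# 12 DIMM slots, presumably populated with 128 GB modules each.
dimm_slots = 12
gb_per_dimm = 128  # assumption on my part; not stated in the keynote

total_gb = dimm_slots * gb_per_dimm
print(f"{total_gb} GB = {total_gb / 1024} TB")  # 1536 GB = 1.5 TB
```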
Ternus said the new Mac Pro has 8 internal PCI Express slots: four double-wide, three single-wide, and one additional half-wide slot that holds an I/O card. That card has two Thunderbolt 3 ports, two USB-A ports, and a regular 3.5mm audio minijack. There are two more Thunderbolt 3 ports on the top of the computer.
There are two 10Gb ethernet ports near the power connector at the bottom of the back of the 2019 Mac Pro.
Ternus said there is a new Mac Pro Expansion (MPX) module specifically for containing video cards that has its own 500 watts of power, a regular x16 PCI Express edge connector, and then an additional PCI Express/DisplayPort/power thing on the end. The starting graphics card would be a Radeon Pro 580X. I’m a little confused by that part, because it sounds like the 2017 Radeon Pro 580. Another option for graphics is the Radeon Pro Vega II, or two of those Vega II GPUs. I really wish the Nvidia and Apple feud would end so some other competing options could be available.
Ternus called the dual Vega II configuration “the world’s most powerful graphics card” before he introduced another marketing term, the Infinity Fabric Link, which supposedly lets data move between the GPUs five times faster than the PCI Express bus.
Ternus said you could also configure the new Mac Pro with two dual Vega II modules.
Ternus talked about an add-on card for editing ProRes and ProRes RAW video. This card has a custom ASIC that allows the Mac Pro to process huge video data streams. Ternus said the Mac Pro could play three 8K ProRes RAW streams.
The new Mac Pro has a 1.4 kilowatt power supply. Ternus said that “under typical load conditions” the new three-fans-and-one-blower thermal cooling setup in the new Mac Pro is as quiet as an iMac Pro.
There are optional wheels for the Mac Pro.
Ternus boasted about the third-parties that are working with Apple for the new Mac Pro and specifically boasted about performance beating an Nvidia+Windows configuration.
David Earl was introduced to talk about Logic and Final Cut, and showed adding hundreds of instruments to a soundtrack for a documentary. He added about a thousand tracks and then worked with 8K video.
The starting price for this Mac Pro is $6000. Availability was given as “this Fall.”
There will also be a special version of the 2019 Mac Pro for rack-mount configurations.
Apple Pro Display XDR
Colleen Novielli was brought out to talk about the new display for the new Mac Pro. She compared it to a pro display that costs $43,000. The APDXDR is 32 inches and has a resolution of 6016 x 3384. Novielli called it a 6K Retina Display. The new display was said to support P3 wide color gamut, 10-bit color, and has Reference modes built-in. Novielli said the display has contrast that is 25x better than a typical LCD.
Novielli also said there was a superior matte option available for the display that doesn’t degrade image quality compared to older technologies.
This display is supposed to maintain higher brightness levels: 1000 nits in perpetuity, and 1600 nits at peak. Novielli said the display would have a 1,000,000:1 contrast ratio.
Novielli said the 15-inch MacBook Pro can drive two of these displays, while the new Mac Pro can drive six.
The arm for the display is custom, and supposedly makes the monitor feel weightless while it is being adjusted. There is also an optional VESA mounting adapter.
The Apple Pro Display XDR costs $5,000. There is also a “Nano-Texture” version of the display that costs $6,000. The VESA adapter is $200. The “Pro Stand” is a laughable $1,000. If you don’t buy either I guess you’ve gotta get some plywood and screws out, or keep it on your lap?
macOS 10.15 Catalina’s iTunes Breakdown
Cook returned to summon Craig Federighi to talk about the new version of macOS.
Federighi walked us through some of the features added to iTunes over the past 18 years, paused to joke about adding Mail, Safari, and a Dock to iTunes, before introducing a split of iTunes into three apps: Apple Music, Apple Podcasts, and Apple TV. It wasn’t immediately clear if the new Music app would still support local libraries of music and features like iTunes Match.
Federighi talked about how the Apple Music experience would be better in the new app before introducing the idea that sync for iOS devices would be moved to the Finder app.
Apple Podcasts gets its own app, as well. Federighi said that the new Podcasts app has machine learning’d it up to scan the spoken-word content of podcasts, so you can search for the actual content instead of just the show name or episode title.
Apple TV also gets its own app. Federighi boasted that the new app would support 4K HDR playback with the HDR10, Dolby Atmos, and Dolby Vision standards.
The rumored iPad-as-external-display feature is real, and it is called Sidecar. Goodbye to the podcast ads? This functionality supports the Apple Pencil, and works wired or wirelessly.
Federighi introduced an accessibility ad showing new voice features for controlling Macs and iOS devices, as well as rich text composition by voice:
As rumored, Find My iPhone and Find My Friends have been combined for both iOS and macOS into one app just called Find My. Apparently devices will now emit Bluetooth signals so you can track them down even when they are offline, via a mesh network of Apple devices. Federighi promises that this is encrypted and anonymous.
Macs with the T2 security chip (an ARM SoC from Apple) get the ability to be locked down via activation lock.
macOS 10.15 Catalina, Again
Federighi talked about how the new updates to Photos would make it to this version of macOS, along with an updated start page in Safari, and the new gallery view in Notes. Reminders is also coming to macOS along with Screen Time.
Project Catalyst (Marzipan?)
Federighi’s got a marketing name for the stuff that brought shit iPad versions of News, Stocks, Voice Memos, and Home to macOS Mojave. Federighi says that this new cross-platform support is much improved, but the slides are not very convincing.
In terms of third-party support, Federighi boasted about the racing game Asphalt and about Twitter’s iPad client, which is not something I want to hear about when Twitter treats third-party developers with a hostility that is profoundly disturbing for the long-term success of its own platform. Finally, Federighi introduced somebody from Atlassian to talk about Jira for the Mac. Zzz. It’ll be in the Mac App Store this Fall.
Catalyst is available to developers in Xcode preview today. No multiple window support was demonstrated. I am not hopeful that this feature is at all improved over the awful News, Stocks, Voice Memos, and Home apps on macOS today.
Federighi talked about RealityKit, a new AR API for developers who don’t have experience with existing game engines. Reality Composer is a new app, available as part of Xcode or standalone for iOS and iPad. It has a bunch of pre-existing models to stuff into your app. Very strange. ARKit 3 was demonstrated with a new people-occlusion feature that allows people to be better placed in between AR models. Motion capture was also demonstrated.
Minecraft Earth AR Demo
It wouldn’t be a modern Apple event without an awkward demo featuring VR or AR. VR is a little out of fashion, so it’s an AR demo from Lydia Winters and Saxs Persson of Mojang. I’m not sure how many people will want to hold an iPhone or iPad for hours to view an AR version of Minecraft called Minecraft Earth. All things considered this demo could have been much worse.
Federighi returned to talk about SwiftUI, a new framework for the Swift programming language. Federighi gave an example of a huge list that required a huge pile of code, which shrank dramatically when it was converted to SwiftUI.
Federighi said that new features would be brought to apps with SwiftUI automatically, like Dark Mode. He also said that there’s a new interactive, upgraded preview type of thing in Xcode for SwiftUI. Federighi compared it to Swift Playgrounds but said it was much more powerful.
Josh Schafer was brought out to demo SwiftUI’s new editor in Xcode. He made a joke app that makes up phony versions of macOS, and immediately showed his changes both in the Xcode preview and on a live iPhone.
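To give a sense of the declarative style shown on stage, here’s a minimal SwiftUI sketch of a data-driven list. The `Release` model and its sample names are my own hypothetical stand-ins for the joke app, not code from the demo:

```swift
import SwiftUI

// Hypothetical model: phony macOS release names, like the demo app invented.
struct Release: Identifiable {
    let id = UUID()
    let name: String
}

// A declarative SwiftUI view: you describe the list in terms of the data,
// and the framework rebuilds the UI whenever that data changes. Features
// like Dark Mode support come along for free.
struct ReleaseList: View {
    let releases = [
        Release(name: "macOS Bodega Bay"),
        Release(name: "macOS Weed"), // yes, a real California place name
    ]

    var body: some View {
        List(releases) { release in
            Text(release.name)
        }
    }
}
```

The `Identifiable` conformance is what lets `List` diff the rows by `id`, which is the mechanism behind the live-updating previews Schafer showed.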
Federighi returned to say that the SwiftUI framework was also available for watchOS, and all Apple platforms.
Cook’s Close-out Deals
Cook returned to tell us that developer betas were available for all of the new operating systems and software today, with public betas in July. The usual promised release window for the new operating systems was “the Fall.”
The world’s most performant Mac Pro is great if you’re working on movies or high-end audio production, but Apple still can’t ship a reliable laptop keyboard or a tower for normal people who aren’t at Pixar. It’s also not clear at all what storage upgrades for the Mac Pro will look like. I’m not sure the word “T2” was mentioned during the keynote, but the product page for the new Mac Pro mentions the dedicated ARM SoC Apple calls T2, so it seems like you might be limited to add-on PCI Express modules or external peripheral storage. This is especially galling on the base-model Mac Pro, which only includes 256GB of storage.
The display that costs extra if you want to put it somewhere besides flat on your floor is just the ridiculous cherry-on-top of unapproachable high-end hardware.
The OS update for the Mac was couched in the end of iTunes, which will hopefully still be available if some functionality is missing from the new Music, Podcasts, and TV apps. Windows users are, of course, still stuck with the worst version of iTunes.
iCloud is still capped at an increasingly paltry 5GB of free storage. I am very tired of reading help posts from iPhone users online who have lost their entire photo and video library when their phone broke and they couldn’t afford enough iCloud storage. Somehow Apple has deemed the subset of its user base that has home video security cameras more important than the millions of iPhone users who can’t or won’t pay for secure backups.
One way that Apple could work around this is with their own home network router that had a disk inside for backups of all your devices, but the AirPort line of routers is dead and I am sad about that.
I’m not sure how much I buy the split of iOS and iPadOS; it didn’t seem like this event was organized to emphasize that split, but I am glad that they’re working to make the iPad a better competitor to a traditional laptop computer. Still, there are so many limitations of an iPad compared to a modern Mac. For example, there is still no generally available multi-user mode for the iPad. The HomePod is getting that feature first. I’m glad the HomePod will be more useful, but it’s ridiculous, and looks more like a cash grab, that everyone in a house needs their own iPad if they want it tailored to their uses.
I am a little concerned that the only way improvements can come to these platforms is bundled under a cute marketing name. The iPad needs more improvements yet to be competitive as a laptop replacement; maybe we don’t need our home screen to be just a list of apps and a select few widgets. Maybe a home screen could include files and be arranged however someone wants it, just like a real computer. Apple is trickling features out to the iPad so slowly, and I’m looking forward to more. Perhaps they’ll come faster with the rebranding to iPadOS.
The most important pro app for making the iPad a full computer, one that can ship its own apps, is Xcode, and we still don’t have it. Until an iPad can ship iPad software without any other device, I’m not sure Apple is really on board with the iPad as a real computer.
The Steam Link app is finally available on iOS (and tvOS) after being available on Android, and then the Raspberry Pi, for a year. The Steam Link app acts like the now-discontinued Steam Link box and streams your Steam library (and more) to your phone, tablet, or TV. The iOS and tvOS apps allow for Bluetooth controllers like the Steam Controller, as well as the Apple-approved controllers that are already available for iOS and tvOS.
Apple initially approved and then blocked the Steam Link app for iOS last year. Presumably that was because Valve’s Steam store was accessible to users, which was not great on Apple’s part and makes about as much sense (none) as Apple demanding a cut of Amazon’s ebooks. The new version of the Steam Link iOS app doesn’t let people access Valve’s store while streaming; it only allows people to play their game library.
The newest version of the Android app also allows people to stream games when they’re away from home; the iOS app doesn’t have that feature yet, so you’ll be stuck playing on your home network.
As I’ve said before, I don’t think it is any good that the streaming of games you own locally is controlled by any store, platform, or driver company (like Nvidia’s Shield game streaming service). There could be a third-party, entirely open-source effort to stream your desktop with performance in mind, but there isn’t. The closest thing is the Moonlight project, but it is only available to people with Nvidia graphics cards.
All that said, I played a game of Into the Breach streamed to my iPhone from a Windows host using the new app and while that was a confusing setup process (disabling the virtual mouse, enabling the virtual gamepad) it was ultimately rewarding.
I did have one crash when I switched apps and the network connection had been dropped, but I just resumed the game once I re-launched the Link app.
Streaming from a macOS host is a giant pain in the ass, involving the installation of multiple kernel extensions, reboots, and then installing more kernel extensions and more reboots. I can’t imagine this will get any easier with macOS 10.15, if it’s possible at all. Apple delivers a warning to let you know that something Valve is doing won’t work with “…a future version” of macOS:
That is an ominous warning for a person to read who just wants to play a fucking game. I’m sure they’ll rush out to install the next big macOS upgrade.
This wasn’t in the keynote, but Apple had some bad news buried in the “What’s New in macOS” section of their developer site for anyone who makes games or other software with the OpenGL graphics API:
Deprecations and Removed APIs
Periodically, Apple adds deprecation macros to APIs to indicate that those APIs should no longer be used in active development. When a deprecation occurs, it’s not an immediate end of life for the specified API. Instead, it is the beginning of a grace period for transitioning from that API and to newer and more modern replacements. Deprecated APIs typically remain present and usable in the system for a reasonable time past the release in which they were deprecated. However, active development on them ceases, and the APIs receive only minor changes to accommodate security patches or to fix other critical bugs. Deprecated APIs may be removed entirely from a future version of the operating system.
As a developer, avoid using deprecated APIs in your code as soon as possible. At a minimum, new code you write should never use deprecated APIs. And if your existing code uses deprecated APIs, update that code as soon as possible.
Deprecation of OpenGL and OpenCL
Apps built using OpenGL and OpenCL will continue to run in macOS 10.14, but these legacy technologies are deprecated in macOS 10.14. Games and graphics-intensive apps that use OpenGL should now adopt Metal. Similarly, apps that use OpenCL for computational tasks should now adopt Metal and Metal Performance Shaders.
Metal is designed from the ground up to provide the best access to the modern GPUs on iOS, macOS, and tvOS devices. Metal avoids the overhead inherent in legacy technologies and exposes the latest graphics processing functionality. Unified support for graphics and compute in Metal lets your apps efficiently utilize the latest rendering techniques. For information about developing apps and games using Metal, see the developer documentation for Metal, Metal Performance Shaders, and MetalKit. For information about migrating OpenGL code to Metal, see Mixing Metal and OpenGL Rendering in a View.
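The migration Apple describes starts with acquiring a Metal device handle, the root object all Metal rendering and compute hangs off of. A minimal Swift sketch (assuming only a Mac with a Metal-capable GPU, which is the very thing an OpenGL app would need a fallback for):

```swift
import Metal

// First step of any Metal adoption: ask the system for a GPU.
// On machines without Metal support this returns nil — the case
// an existing OpenGL app has never had to handle.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal-capable GPU available.")
}
print("Rendering with: \(device.name)")

// A command queue is the next building block; all GPU work,
// render or compute, is submitted through one.
let queue = device.makeCommandQueue()
```

This is trivial on its own; the real cost of the migration is rewriting every shader and draw call, which is exactly why so many older games will never make the jump.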
Apple is already requiring that apps get updated to be 64-bit, or they’ll stop working in a future update.
As much as I loathe John Carmack today (and it certainly didn’t help that he decided to write this on Facebook), he recently wrote about how he persuaded Steve Jobs to support OpenGL on the Mac:
I was brought in to talk about the needs of games in general, but I made it my mission to get Apple to adopt OpenGL as their 3D graphics API. I had a lot of arguments with Steve.
Part of his method, at least with me, was to deride contemporary options and dare me to tell him differently. They might be pragmatic, but couldn’t actually be good. “I have Pixar. We will make something [an API] that is actually good.”
It was often frustrating, because he could talk, with complete confidence, about things he was just plain wrong about, like the price of memory for video cards and the amount of system bandwidth exploitable by the AltiVec extensions.
But when I knew what I was talking about, I would stand my ground against anyone.
When Steve did make up his mind, he was decisive about it. Dictates were made, companies were acquired, keynotes were scheduled, and the reality distortion field kicked in, making everything else that was previously considered into obviously terrible ideas.
I consider this one of the biggest indirect impacts on the industry that I have had. OpenGL never seriously threatened D3D on PC, but it was critical at Apple, and that meant that it remained enough of a going concern to be the clear choice when mobile devices started getting GPUs. While long in the tooth now, it was so much better than what we would have gotten if half a dozen SoC vendors rolled their own API back at the dawn of the mobile age.
While OpenGL isn’t going away immediately in macOS Mojave, when it is finally gone there will be many fewer games on macOS. It has been the only portable graphics API available for developers to bring their games to Linux and macOS, as well as other platforms, for decades.
Without OpenGL on macOS the Mac and Linux will both suffer, as will new platforms. They’ll have a harder time getting games and other software when bigger platforms are locked to vendor-specific APIs like Metal instead of cross-platform ones like Vulkan and OpenGL.
If I had to guess, I would hope that Valve will ship an intermediary layer to translate OpenGL calls for games on Steam, and hopefully make that software available to everyone else. There are already some other projects that translate OpenGL to platform-specific calls, but it won’t be easy for games to support them; it would be better if these projects handled the translation on the fly. It’s also entirely possible that Valve will just give up on older games supporting modern versions of macOS after Apple fully removes OpenGL.
I don’t envy anyone trying to support old software and write good OpenGL drivers like Apple has (even when they don’t update their OpenGL support for years), but the deprecation of OpenGL is a real “Fuck You” to game developers and players unlike any other. It seems unlikely that old games will be updated from 32-bit to 64-bit, let alone go through the process of having some kind of graphics portability layer added on top. Thousands of games are going to be lost to time when OpenGL dies off. Competition with popular hardware and software platforms will be even more difficult. I understand the desire to get rid of technical debt, but this is bad.