WWDC Predictions 2023
April 17, 2023
It's that magic time between when WWDC is announced and when the conference runs.
No, I didn't get a ticket to the community day. Yes, I'm going to travel out there anyway and catch up with people I haven't seen in way too long.
In the meantime, here are my pre-WW thoughts - at least for now.
I've heard the same rumors you've heard about the goggles and AR stuff. I'm not that excited about the device but I am interested to see if some of the technology that was developed specifically for the device makes its way into other platforms or pushes SwiftUI in any interesting ways.
There's also that moment after something new ships where you look back and say "oh that's what that was for."
Like when the iPhone shipped with animation as such an integral part of the experience and we looked back at the animations that had been added to the Mac.
Like when we were told to start thinking in terms of points, not pixels, for some seemingly made-up reason, and then the double-resolution phones shipped and we said, "oh, that's why."
Like when Auto Layout shipped and we were told it was so that layouts could adjust for localization into languages such as German, and then it turned out to be necessary to support the phones that shipped that fall with different aspect ratios.
So perhaps the widgets that have been pushed for iOS will be some essential part of the goggle world.
That said, I'm hoping it's not a WW overwhelmed by goggles and AR/VR content.
The other thing I've been hanging back on has been the AI/ML stuff. Perhaps this is the year that I take the time to dig in and get involved.
So here are the things I care the most about.
(1) Observation (see Evolution proposal 395)
This proposal provides a way of replacing ObservableObject and @Published (and therefore Combine) with AsyncSequence-based observation.
If I were invited to do a presentation post-WW on something new, I think this would be my topic.
Although there are plenty of things in the proposal about connecting to and consuming these things, the bigger news is something we won't see until the conference itself: SwiftUI's end of things.
There will be a nice way of having a SwiftUI view use this new construct to provide the data used by a view.
Since SwiftUI isn't developed in the open, I don't know what this will be. I also don't know if it will be ready for beta 1. Remember that @Published appeared and evolved during the betas.
I would argue that this is an important step for Apple, as it allows them to remove the dependency that SwiftUI has had on Combine. I was a big fan of Combine, but you could feel the lack of support and commitment from Apple, so better that we move on sooner rather than later.
I think this will be a big deal.
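For context, here's the pattern the proposal would let us leave behind: a reference-type model publishing changes through Combine, consumed by a view. This part is today's real API; what replaces it on the SwiftUI side is exactly the open question above.

```swift
import SwiftUI
import Combine

// Today's Combine-backed pattern: a reference type whose
// @Published properties drive SwiftUI updates.
final class CounterModel: ObservableObject {
    @Published var count = 0
}

struct CounterView: View {
    @ObservedObject var model: CounterModel

    var body: some View {
        Button("Count: \(model.count)") {
            model.count += 1  // @Published emits; SwiftUI re-renders
        }
    }
}
```

Whatever ships, presumably it will collapse this into something driven by the Observable protocol from the proposal rather than by Combine publishers.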
(2) Macros
I have to say that in general macros are a blind spot for me. But then again, so are regular expressions, AI/ML, and most of science fiction.
I've come to believe that the shortcoming is mine and not any of these topics.
Again, if I were invited to do a workshop post-WW and they didn't want my current Async workshop, I think macros would be my topic.
I don't have my head around the what or the why yet, so I really look forward to the videos from Doug and others this WW. Macros are something I will invest time in learning. I liked result builders at first, and macros seem to be another way of building DSLs. That said, I really don't know what I'm talking about. If I can figure it out, maybe I'll write a book where the hero exclaims "Holy Mac-a-ro!" when they create something cool.
Then again, I still haven't quite gotten my head around the generics talks from Holly and Slava from last year so maybe I'll just work more on that.
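For what it's worth, the shape of the feature is already visible in the accepted expression-macros proposal (SE-0382). Its canonical example declares a macro like a function whose body is generated at compile time by a separate plugin; the module and type names below are the proposal's illustrative ones, not a shipping library.

```swift
// From the expression-macros proposal: a macro declaration.
// The expansion logic lives in a separate compiler-plugin module.
@freestanding(expression)
macro stringify<T>(_ value: T) -> (T, String) =
    #externalMacro(module: "ExampleMacros", type: "StringifyMacro")

// At a call site, this expands to (x + y, "x + y"):
// let (result, code) = #stringify(x + y)
```

The result-builder comparison feels apt: both move work to compile time, but macros operate on syntax trees rather than on a restricted function-body transform.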
(3) Swift Data
I know, I've been predicting/calling for this for years. But the time feels right.
SwiftUI let us replace visual composition of our GUI in tools such as Storyboards with declarative, composable, lightweight, code-based views made up of value types that conform to the View protocol.
The visual editor for Core Data seems to have disappeared recently, and we now have some of the pieces that would make it natural to build persistence the same way: declarative, composable, lightweight, code-based, and made up of value types.
Stir in some actors for the system managing our instances. Add in noncopyable types to make sure we aren't using and modifying these value types in two different places.
Again, I'm just making stuff up - but the time seems right for Swift Data.
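Continuing to make stuff up: none of the names below are real API, just a sketch of what "value types for the data, an actor for the system managing the instances" might look like.

```swift
import Foundation

// Hypothetical: a plain value type as the model,
// the way SwiftUI views are plain value types.
struct Recipe: Identifiable {
    var id = UUID()
    var name: String
    var servings: Int
}

// Hypothetical: an actor guarding the store, so access to the
// managed instances is serialized without locks in our code.
actor RecipeStore {
    private var recipes: [UUID: Recipe] = [:]

    func save(_ recipe: Recipe) {
        recipes[recipe.id] = recipe
    }

    func recipe(withID id: UUID) -> Recipe? {
        recipes[id]
    }
}
```

Reads hand back copies of the value type, so mutating one copy can't silently corrupt another; that's the property noncopyable types could tighten further.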
Macros and Observation are in Swift Evolution, so they are pretty sure bets for this year. Swift Data is me playing the same numbers in the nightly lottery for years and I don't want to not play them this time 'cause I'll feel bad if they come up the one year I don't play.
(4) Distributed computing
I have no idea if Apple will dig in here, but look at the devices in the Apple ecosystem. Start with the iPhone, iPad, and various Macs. Throw in the Apple Watch. While we're at it, add in Apple TV. Then there are the trojan-horse devices, where cars running CarPlay are partially part of the ecosystem. We don't think of the headphones, HomePods, and other HomeKit devices, but throw them in as well.
Many of you may be too young to remember the iPod and Apple's Digital Hub strategy where everything connected and synced through your Mac. It's why iTunes (now Apple Music) did way too much. In the early days of iPhone, we rearranged our home screens, installed apps, and backed up our phones using iTunes.
Anyway, now we control one device from another device. Sometimes I forget which device is actually running the app I'm interacting with.
It feels like it's past time for these devices to help each other out.
If we get some sort of goggle thing, then I'd love to see more distributed computing available for us to easily let the more powerful devices in our lives help out some of the more constrained ones.
I'd also like us to be able to work with (privacy concerns addressed) other devices both nearby and not.
I've loved the promise of CORBA, JINI, JXTA, Linda, and tuple spaces and would love to see what Distributed Actors can do.
For goodness' sake, we have Bonjour on all of our devices; we're built for this.
The promise of Hewitt's Actor Model seems within our reach and looks to open so many doors.
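The building blocks already exist in the language: Swift's Distributed module ships today, including a local testing actor system. A minimal sketch, assuming nothing beyond the standard module:

```swift
import Distributed

// A distributed actor whose methods can, in principle, be called
// across process or device boundaries. Swapping the actor system
// is what would move this from in-process testing to the network.
distributed actor Greeter {
    typealias ActorSystem = LocalTestingDistributedActorSystem

    distributed func greet(_ name: String) -> String {
        "Hello, \(name)!"
    }
}

// Usage, from an async context:
// let system = LocalTestingDistributedActorSystem()
// let greeter = Greeter(actorSystem: system)
// let message = try await greeter.greet("WWDC")
```

The appeal is that the call site looks like any other actor call; location transparency is the actor system's problem, not the app's.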
Oh, and while I'm at it, what happened to the Tensor abilities we were going to get in our devices based on differentiability in Swift?
The long and the short of it is, I don't think any of the things I'm looking forward to are at the top of Apple's list, so I expect this to be a great WWDC.