Cheered on by benevolent but slightly stressed guides, walking briskly through the luminous corridors of Apple Park, until now closed to hordes of journalists, we did not even graze the secrets hidden inside Apple's mothership.
A few minutes before the opening of WWDC, looking through the large windows that overlook a kind of larger-than-life meadow outside, we caught a glimpse of some open-air workshops for developers. None seemed to be about virtual reality, and augmented reality was barely on the list. That is when we became almost certain that our slightly wild hopes of seeing Apple's mixed reality (MR) headset announced were just that: slightly wild hopes…
The latest rumors were true: the future would come later, at the end of the year at the earliest, and too bad for the amazed kid who is never far behind the controls of the tech journalist we are lucky enough to be.
We therefore followed the keynote, resigned, until the presentation of iPadOS 16. Watching the demonstrations of Stage Manager, we could not suppress a smile. Rightly or wrongly, it is hard not to see an echo of the innovative interfaces of the late 1990s (KDE and GNOME come to mind). And hard, above all, not to make out the foundations of an interface tailored for a virtual reality headset. Stage Manager addresses the need to isolate yourself from your environment in order to focus, while giving space to applications and keeping other content close at hand.
The two-dimensional layout with an added depth effect, the neat, clear organization of windows that can be moved easily: all this has the air of a 3D virtual desktop… for today, but also for tomorrow.
A preview of the VR interface of the future?
Stage Manager is, for us, one of the small pebbles that Apple drops along its path, a path that seems to lead straight to its VR or mixed reality headset. A set of scattered bricks that could well find their place in this future product. We have listed a few of them, pell-mell, hardware and software alike, as much for the pleasure of building castles in the air as to recall how patiently Apple builds things, at its own pace, on its own terms.
Its mastery of integration, its taste for technological trickle-down, its art of gradually introducing elements into devices already on the market to test them before slipping them into a brand-new product: all this lends itself to a game of hide-and-seek, even if it means going too far, even if it means being a little, or a lot, wrong. Because the temptation to try to glimpse the future is too strong.
And then, after all, macOS became iOS, which evolved into iPadOS, then watchOS, each time adapting to new uses and a new product. Each stage of this progression sketched out the future before it really took shape.
If we keep following this imaginary software path, we naturally find many chances to glimpse a future being prepared in silence: a software offering that expands, articulates and integrates to form a whole that seems to tend toward VR/MR headsets.
We will not insult you by saying that the App Store is one of them; it is obvious. How could Apple give up a model available on each of its platforms since its inauguration on the iPhone in 2008?
Small steps for a big leap
But there are other new software features and functions that smack of virtual reality. Memojis are their obvious, playful incarnation, an already common way of representing yourself as a virtual avatar. Moreover, during the WWDC 2021 keynote, held remotely because of Covid, did not Tim Cook take the stage in a Steve Jobs Theater populated by jubilant Memojis? The future was already here, right?
A feature like SharePlay will also be of interest in the context of a virtual reality headset. What better way to share content, especially if it is spectacular and produced by Jon Favreau…, than in virtual reality? A giant screen in front of your eyes, friends present in the form of avatars, open and casual discussions thanks to FaceTime.
This year, FaceTime is enriched and integrated even further into Continuity. It adopts the Handoff function, which lets you start a video call on your smartphone and, when you get home, finish it (today) on your Mac or iPad, so as to always have the best screen in front of your eyes… And of course, tomorrow, the most intimate, most personal and largest screen may be that of Apple's virtual reality headset.
Here are some software features, developed over time, that will have their place in realityOS, if that is indeed its name. But there are dozens more: in Messages, for example, or in some of the new visuals of Mail or Spotlight.
Software and hardware, in the same boat
In a mixed world, where reality is intertwined with digital information, we can also imagine that certain technologies, such as the Cinematic mode of the iPhone 13 Pro, will have their place in this headset.
Being able, on the fly and according to your needs and what you are doing, to vary the area of sharpness, blur a background to highlight a person, or display additional data gives a somewhat gimmicky feature a new interest and a reinforced relevance, beyond the simple video aspect.
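To illustrate the idea, here is a minimal, entirely hypothetical sketch in Python (the function name, the linear ramp and all the constants are ours, not Apple's) of how a Cinematic-style effect can decide, pixel by pixel, how much to blur, starting from a depth map and a user-chosen focus plane:

```python
# Hypothetical sketch: depth-driven blur decision for a Cinematic-style
# effect. Depths are in meters; the blur radius grows with distance from
# the chosen focus plane. None of this is Apple's actual implementation.

def blur_radius(depth, focus_distance, depth_of_field, max_radius=12.0):
    """Blur radius in pixels for one pixel of the frame."""
    offset = abs(depth - focus_distance)
    if offset <= depth_of_field / 2:
        return 0.0  # inside the zone kept sharp
    # Linear ramp beyond the sharp zone, capped at max_radius.
    return min(max_radius, (offset - depth_of_field / 2) * 4.0)

# One row of per-pixel depths: a face at ~1.5 m, background far behind.
depth_row = [1.2, 1.5, 1.6, 4.0, 8.0]
print([round(blur_radius(d, 1.5, 0.4), 1) for d in depth_row])
# -> [0.4, 0.0, 0.0, 9.2, 12.0]
```

Changing `focus_distance` after the fact, as Cinematic mode allows, simply re-runs this decision over the stored depth map: the "rack focus" is a parameter change, not a re-shoot.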
It is obvious that all the work done on the photographic side of the iPhone will be reused, whether in software control or in component control.
On the software side, the rise of computational photography in recent years, driven by the neural networks integrated into the Axx and Mx SoCs, is an indisputable clue to what Apple is considering for virtual, augmented or mixed reality.
Visual search, on-the-fly subject clipping, recognition of faces and of text in an image, and now even in a video: all this makes a lot of sense today, but will make even more tomorrow, in an interface where the screen is not only everything but is superimposed on reality.
On the hardware side, the lidar, which appeared in the iPad Pro and then the iPhone Pro in order to strengthen (among other things) augmented reality experiences, the precision of measurements, and the speed and quality of photos in low light, would also be present in the headset, according to recurring rumors.
Yet another example of this technological trickle-down, this art of integrating early into a marketed product an element still at the heart of R&D. Imagine what such experimentation brings when it runs on millions of devices used every day.
Still on the hardware side, how can we imagine that the work done on the AirPods Pro and Max, in the fields of noise cancellation, spatial audio and so on, will not be used in the mixed reality headset or mask?
The same goes for seemingly more insignificant details, which nevertheless prove that Apple reuses the concepts and ideas it has developed and perfected. Thus the Digital Crown, which has been one of the Watch's finest features since its debut, has found an essential place on the AirPods Max. And, according to some rumors, it will also be part of the upcoming mixed reality headset. Why go without, if the idea is good, the ergonomics proven and users accustomed to it? The future also grows out of habits formed in the present, and in the past.
From the key ring to truly augmented reality
Because, like all giants who see far ahead, or like Go players who plan dozens of moves in advance, Apple places its pawns without our immediately understanding their importance.
The use of ultra-wideband (UWB) technology, introduced with the iPhone 11 and the AirTags, took on a whole new dimension during this WWDC 2022.
Already in 2020, with iOS 14, Apple introduced its framework called Nearby Interaction, which promised interactions between iPhones, augmented reality, and virtual as well as real objects. Interactions, too, between different devices designed by Apple, such as the iPhone and the Watch, or possibly manufactured by third parties.
This year, with iOS 16, Apple goes a step further and outlines scenarios where visitors can be guided through a real environment thanks to the technology that today lets them find an AirTag attached to a lost set of keys. Arrows will appear on a screen, pointing in the right direction.
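To give an idea of what such guidance involves, here is a minimal, hypothetical sketch in Python (the names and the camera-style coordinate convention are ours, not those of the Nearby Interaction API) of how a device-relative direction vector, of the kind a UWB framework can report, becomes the rotation of an on-screen arrow:

```python
import math

# Hypothetical sketch: converting a device-relative direction vector,
# as a UWB framework like Nearby Interaction might report one, into the
# rotation of an on-screen guidance arrow. Names are illustrative only.

def arrow_rotation(x, z):
    """Degrees to rotate an up-pointing arrow toward the tracked object.

    x: lateral component (+x is to the right of the device)
    z: forward component (-z is straight ahead, camera-style coordinates)
    0 degrees = dead ahead, positive = to the right.
    """
    return math.degrees(math.atan2(x, -z))

print(arrow_rotation(0.0, -1.0))  # object dead ahead -> 0.0
print(arrow_rotation(1.0, 0.0))   # object directly to the right -> 90.0
```

The real framework reports distance as well as direction, so the same reading can drive both the arrow and a "3 m away" label; the headset scenario simply swaps the phone screen for an overlay in front of the eyes.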
Everything will go through the screen of a smartphone at first, but you do not have to be a sweet dreamer to imagine that sooner or later all of this will end up in a mixed reality headset or glasses. The pieces are falling into place, little by little.
The heart of the future
Finally, and obviously, the last piece of the puzzle may have been put right under our noses without our thinking twice about it. The M2 inaugurates the second generation of Apple Silicon SoCs. It brings the new 13-inch MacBook Air and MacBook Pro to life. And it makes great promises: more power and even lower consumption.
Two points which, according to Mark Gurman, whose sources seem to know what they are talking about, caused problems in the prototypes of Apple's virtual reality headset, which apparently carried the equivalent of an M1 Pro…
With a new generation of chip, benefiting from an improved 5 nm process that heats up less while delivering as much power, or even more, we tell ourselves that the virtual reality headset, though not mentioned once during the conference, was nevertheless there, everywhere, all the time.
By wanting too much to see it, we may have missed some obvious clues (or may now be interpreting erroneous ones; we are ready to admit it).
One certainty remains. In the history of Silicon Valley, as in the history of innovation, advances have never arrived as blocks sprung out of nowhere. The future is built in small steps; the future is in tune with its times. Apple's headset too.