Design in Apple’s vision of the future

Introducing Apple Vision Pro

Apple made waves earlier this month when it finally revealed its long-awaited foray into the world of mixed or extended reality. That the company has set its sights on this market is certainly no secret. Indeed, the delayed announcement (at least by market standards) had raised some questions as to whether it was just wishful thinking. At WWDC 2023, Apple showed the world that it means serious business, maybe even too serious. The Apple Vision Pro headset itself is already a technological marvel, but in typical Apple fashion, the company didn’t dwell too much on the specs that would make many pundits drool. Instead, Apple focused on how the sleek headset almost literally opens up a whole new world and breaks down the barriers that used to limit virtual and augmented reality. Beyond the expensive hardware, Apple is selling an even more ambitious new computing experience, one that revolves around the concept of spatial computing. But what is Spatial Computing, and does it mean anything beyond viewing photos, browsing the web, and walking around a virtual environment? As it turns out, it could be a world-changing experience, virtually and actually.

Designer: Apple

Making Space: What is Spatial Computing?

Anyone who has kept an eye on trends in the modern world has probably already heard of virtual reality, augmented reality, or even extended reality. While they may sound new to our ears, their origins actually go way, way back, long before Hollywood even caught wind of them. At the same time, we’ve heard so much about these technologies, especially from a few social media companies, that you can’t help but roll your eyes at yet another one coming along. Given all that hype, it’s certainly understandable to be wary of the promises Apple has made, but that would be selling short what Spatial Computing really is: THE next wave of computing.

It’s impossible to discuss spatial computing without touching on VR and AR, the grandparents of what is now collectively called eXtended Reality or XR. Virtual reality (VR) is the better known of the two, mostly because it’s easier to implement. Remember that cardboard box with a smartphone inside that you strap to your head? That’s pretty much the most basic example of VR, trapping you in a world full of pixels and intangible objects. Augmented reality (AR) frees you from that made-up world and instead overlays digital artifacts onto real-world objects, just like those Instagram filters that everyone seems to love or love to hate. The problem is that these are still intangible virtual objects, and nothing you do in the real world really changes them. Mixed Reality (MR) solves this problem and connects the two, so that a physical knob can actually change a virtual setting, or a virtual switch can turn on a light somewhere in the room.

In this sense, Spatial Computing is the culmination of all these technologies, but with a very specific focus that you can glean from the name. Simply put, it turns the whole world into your computer, turning any available space into an invisible wall that you can hang your app windows on. Yes, there will still be windows (with a lowercase w) because of how our software is currently designed, but you can arrange as many of them as you like in whatever space you have. Or you can just have one gigantic video player filling your entire view. The idea also takes advantage of our brain’s innate ability to associate things with spaces (the theory behind the Memory Palace technique) to let us organize a computer desktop the size of a room. In a sense, it makes your computer virtually invisible, allowing you to interact directly with applications as if they physically exist in front of you, because they practically do.
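To make the idea a little more concrete, here is a minimal, hypothetical sketch of what such an app could look like using the SwiftUI scene types Apple previewed for visionOS at WWDC 2023. The app name, the “Room” identifier, and the floating sphere are illustrative assumptions, not Apple’s own sample code.

```swift
import SwiftUI
import RealityKit

@main
struct SpatialSketchApp: App {  // hypothetical app, for illustration only
    var body: some Scene {
        // An ordinary 2D window the wearer can place anywhere in the room,
        // like hanging a frame on an invisible wall.
        WindowGroup {
            ContentView()
        }

        // A fully spatial scene where 3D content shares the room with you.
        ImmersiveSpace(id: "Room") {
            RealityView { content in
                // A simple sphere floating in the user's space.
                let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1))
                sphere.position = [0, 1.5, -1]  // roughly eye level, a meter away
                content.add(sphere)
            }
        }
    }
}

struct ContentView: View {
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("Fill the room") {
            Task { _ = await openImmersiveSpace(id: "Room") }
        }
    }
}
```

The notable design choice here is that the “desktop the size of a room” is expressed with the same window and scene abstractions developers already know, which is exactly why existing apps can carry over without being rebuilt from scratch.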

Apple’s take on reality

Sure, you could say that Microsoft’s HoloLens has already done all of this. What makes Spatial Computing and Apple’s implementation different is how the virtual and the real influence each other, just like in mixed reality. There is, for example, the direct way we can control floating applications using nothing but our body, either with hand gestures or even just the movement of our eyes. It’s the fulfillment of all those Minority Report fantasies, except you don’t even need to wear gloves. Your facial expressions can also be mirrored by your FaceTime doppelganger, a very useful trick since you won’t have a FaceTime camera pointed at your face while wearing the Apple Vision Pro.
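For developers, that gaze-and-pinch input is surfaced through familiar gesture APIs rather than raw eye-tracking data. The sketch below is a hypothetical example, assuming the gesture-targeting pattern Apple described for visionOS at WWDC 2023: an entity is marked as an input target, and a pinch while looking at it arrives as an ordinary spatial tap.

```swift
import SwiftUI
import RealityKit

// Hypothetical view: the system resolves which entity the user is looking at,
// and a pinch gesture is delivered as a tap targeted at that entity.
struct PinchDemoView: View {
    var body: some View {
        RealityView { content in
            let box = ModelEntity(mesh: .generateBox(size: 0.2))
            box.position = [0, 1.2, -0.8]
            // Collision shapes and an input target component are needed
            // before the entity can receive gestures.
            box.generateCollisionShapes(recursive: false)
            box.components.set(InputTargetComponent())
            content.add(box)
        }
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // Nudge the pinched entity upward as simple feedback.
                    value.entity.position.y += 0.05
                }
        )
    }
}
```

The point is that the app never sees where your eyes are pointing; it only receives the result, which keeps the interaction private while still feeling like telekinesis.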

Apple’s visionOS Spatial Computing, however, is also indirectly affected by your physical environment, and this is where it gets a little magical and literally spatial. According to Apple’s marketing, your virtual windows will cast shadows on floors and walls and will also be affected by ambient light. Of course, you’ll be the only one who sees those effects, but they make windows and other virtual objects look more real to you. Vision Pro will also dim your view of your surroundings to mimic the effect of turning down the lights when you want to watch a movie in the dark. It can even analyze surrounding objects and their textures to mix audio so that it sounds like it’s really coming from all directions and bouncing off those objects.

The number of technologies needed to make this seamless experience possible is quite staggering, which is why Apple didn’t focus too much on the optics, often the headline selling point of XR headsets. From sensors to processors to the AI interpreting all that data, it’s no longer surprising that Apple took so long to announce the Vision Pro and its Spatial Computing. It is, however, also the company’s biggest gamble, and it could cost Apple dearly if it crashes and burns.

Designing for reality

Spatial computing is going to change the rules of the game, but it’s not a change that will happen overnight, no matter how much Apple wants it to. This is where computing is heading, whether we like it or not, but it’s also going to take a long time to get there. And while it may have computing in its name, its ramifications will reach nearly every industry, not just entertainment and, well, computers. When spatial computing takes off, the way we design and create things will change as well.

Many designers already use advanced computer tools such as 3D modeling software, 3D printers, and even artificial intelligence to assist their creative process. Spatial Computing takes that a quantum leap further by giving designers a more hands-on approach to creation. Together with digital twins and other existing tools, it will allow designers and makers to iterate on designs much faster, letting them measure a hundred times and print only once, saving time, resources, and money in the long run.

Spatial computing also has the potential to change the very design of products themselves, but not in the outlandish way the Metaverse has been trying to. In fact, Spatial Computing flips the narrative and places more importance on physical reality rather than on an expensive, one-of-a-kind NFT sneaker you can’t wear in real life. Spatial computing highlights the direct interaction between physical and virtual objects, and this could open up a new world of physical products designed to interact with apps or, at the very least, influence them with their presence and composition. For now, that might be limited to what we would consider computing, but in the future, computing will pretty much be how everyone interacts with the world around them, just as smartphones are today.

Human nature

As great as the Apple Vision Pro may be, it will face many challenges before its Spatial Computing can be considered a success, not the least of which is the price of the headset itself. We have already highlighted five reasons why the Apple Vision Pro may fail, and the biggest one is the human factor.

Humans are creatures of habit as well as creatures of touch. It took people years, maybe even decades, to get used to keyboards and mice, and some still struggle with touch screens today. While Apple’s Spatial Computing promises the familiar controls of existing applications, the way we interact with them will be completely gesture-based and, therefore, entirely new. Add to that the fact that even touch screens give our fingers something to feel, and you can already imagine how alien those airy hand gestures might feel for the first few years.

Apple has certainly done its due diligence in ergonomics and health studies, but it’s not hard to see how this won’t become the most common way people do their computing any time soon, even if the Vision Pro were cheap. Sure, today’s computers and mobile devices are hardly paragons of ergonomics, but plenty of solutions have been developed for them over the years. Spatial computing is still uncharted territory, even after VR and AR have long blazed a trail. It will certainly take time for our bodies to get used to it before Spatial Computing becomes second nature, and Apple will need to stay the course until then.

Final thoughts

As expected, Apple wasn’t content to announce just another AR headset to join an uncertain market. The bigger surprise was its version of Spatial Computing, delivered through the new visionOS. Much of what we’ve seen so far is largely marketing and promises, but it could very well become reality, although it will take some time to fully materialize.

Unlike entertainment-focused VR or the almost ridiculous Metaverse, Spatial Computing looks like the next evolution in computing, one that will come sooner rather than later. It is still in its infancy, even though the seeds were planted nearly two decades ago, but it clearly shows the potential to be more widely accepted thanks to its more common and general applications. It also has the potential to change our lives in less direct ways, like changing how we learn or even how we design products. It’s not yet clear how long that will take, but it’s not hard to see how Apple’s vision of the future could very well become ours.

Image Source: www.yankodesign.com
