
A closer look at Apple’s Vision Pro keyboard and other controls

An Apple developer session provided an in-depth look at the many ways users will (eventually) control the new Vision Pro headset, including a virtual keyboard you can type on in mid-air. That comes courtesy of the Spatial Input Design session, in which two members of Apple’s design team walk prospective developers through best practices for designing apps for the new platform.

Apple seems keen for users to interact with the headset primarily by looking at UI elements and making small hand gestures with their arms resting in their lap. But in the developer session, Apple designer Israel Pastrana Vicente admits that “some tasks are better suited for direct interaction,” which can include reaching out and touching user interface elements (a feature Apple refers to as “direct touch”). There is also support for physical keyboards and game controllers.
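For developers, the look-and-pinch model largely comes for free. The snippet below is a minimal sketch of the idea rather than code from Apple’s session: on visionOS, a standard SwiftUI control picks up gaze highlighting via hoverEffect(), and an indirect pinch made with hands resting in the lap arrives as an ordinary tap.

```swift
import SwiftUI

// A minimal sketch of look-and-pinch interaction, not Apple's session code.
// The system highlights the button while the user's eyes rest on it, and an
// indirect pinch gesture is delivered as a normal button action.
struct LookAndPinchCounter: View {
    @State private var count = 0

    var body: some View {
        Button("Pinched \(count) times") {
            count += 1  // fires when the user pinches while looking at the button
        }
        .hoverEffect()  // system gaze highlight (the default for standard controls)
    }
}
```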

Some tasks are better suited for direct interaction.

So let’s talk about the Vision Pro’s virtual keyboard. Apple designer Eugene Krivoruchko explains that it’s important to provide plenty of visual and audio feedback while it’s in use, to compensate for the “missing tactile information” of typing on a surface you can’t feel. “While the finger is above the keyboard, the buttons display a hover state and a highlight that gets brighter the closer you get to the button surface,” Krivoruchko explains. “It provides a proximity signal and helps guide the finger to the target. At the moment of contact, the state change is fast and responsive, and is accompanied by a matching spatial sound effect.”
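Krivoruchko is describing a feedback pattern rather than a public API, but its shape is easy to sketch in SwiftUI. The snippet below is a hedged illustration, not Apple’s keyboard code: fingerDistance is a hypothetical, normalized proximity value that a real app would derive from hand tracking, and key_click.wav is an assumed sound asset.

```swift
import SwiftUI
import AVFoundation

// Illustrative only: a single "air key" whose highlight brightens as a finger
// approaches and which clicks on contact. `fingerDistance` is a hypothetical
// input (0 = touching the key surface, 1 = far away) that a real app would
// compute from hand tracking; Apple's system keyboard handles this internally.
struct AirKey: View {
    let label: String
    var fingerDistance: Double
    @State private var player: AVAudioPlayer?

    // The hover highlight gets brighter the closer the finger gets.
    private var hoverBrightness: Double {
        max(0, 0.6 * (1 - fingerDistance))
    }

    var body: some View {
        Text(label)
            .font(.title2)
            .padding()
            .background(.white.opacity(hoverBrightness),
                        in: RoundedRectangle(cornerRadius: 10))
            .onTapGesture {
                // Moment of contact: immediate state change plus a sound cue.
                playClick()
            }
    }

    private func playClick() {
        // key_click.wav is an assumed asset; a shipping app would spatialize it.
        guard let url = Bundle.main.url(forResource: "key_click",
                                        withExtension: "wav") else { return }
        player = try? AVAudioPlayer(contentsOf: url)
        player?.play()
    }
}
```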


There’s also support for voice input, with the same developer session noting that focusing your eyes on the microphone icon in a search field will trigger Speak to Search. This presumably draws on the six microphones built into the Vision Pro.
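Nothing in the session suggests developers wire this up by hand; the behavior appears to ride on the standard system search field. As a rough sketch under that assumption, declaring an ordinary searchable list should be all an app needs to do for the microphone affordance to appear:

```swift
import SwiftUI

// A minimal sketch, assuming Speak to Search rides on the standard system
// search field: the app declares a searchable list, and visionOS supplies
// the microphone affordance the session describes.
struct AppSearchView: View {
    @State private var query = ""
    private let items = ["Photos", "Music", "Safari", "Freeform"]

    var body: some View {
        NavigationStack {
            List(items.filter {
                query.isEmpty || $0.localizedCaseInsensitiveContains(query)
            }, id: \.self) { item in
                Text(item)
            }
            .searchable(text: $query)  // system search field with mic icon
        }
    }
}
```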

Direct touch can also be used to interact with other system elements. There’s the option to tap and swipe as if you were using a touchscreen, and Apple’s demo shows a wearer making mid-air writing motions to spell out a word and draw a heart shape in Markup. While the primary interaction here comes from the user’s hand, Krivoruchko shows how eye tracking can augment the gesture. “You can control the brush cursor with your hand, similar to a mouse cursor, but then if you look to the other side of the canvas and click, the cursor jumps there to land right where you are looking. This creates a sense of precision and helps cover the large canvas quickly,” says the designer.
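On visionOS, an indirect gesture is targeted at whatever the eyes are fixed on when the pinch begins, so the cursor-jumps-to-your-gaze behavior largely falls out of the standard gesture system. The sketch below illustrates the idea under that assumption; it is not Markup’s actual code.

```swift
import SwiftUI

// A minimal sketch of the hybrid eye/hand drawing model described above.
// On visionOS an indirect drag begins wherever the user is looking, so each
// new DragGesture starts at the gaze point and then follows the hand's
// relative movement, like a mouse cursor that teleports on pinch.
struct SketchCanvas: View {
    @State private var strokes: [[CGPoint]] = []
    @State private var current: [CGPoint] = []

    var body: some View {
        Canvas { context, _ in
            for stroke in strokes + [current] where stroke.count > 1 {
                var path = Path()
                path.addLines(stroke)
                context.stroke(path, with: .color(.primary), lineWidth: 3)
            }
        }
        .gesture(
            DragGesture(minimumDistance: 0)
                .onChanged { value in
                    // value.location starts at the gaze target, then tracks the hand.
                    current.append(value.location)
                }
                .onEnded { _ in
                    strokes.append(current)
                    current = []
                }
        )
    }
}
```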

We still have plenty of questions about how Apple’s expensive new Vision Pro headset will work in practice (especially whether it can pair with motion controllers from other manufacturers), but between our hands-on time and developer sessions like this one, the experience is starting to come into focus.