this post was submitted on 23 Jan 2024
96 points (100.0% liked)
Technology
To me it's like the XReal Pro 2 with a bigger screen but bloated into 10x its price, with basically the same gestures that were garbage on the Microsoft HoloLens. Tbf, the HoloLens was astonishingly bad at gesture recognition.
And imagine having to tap a software keyboard floating in the air... Yup, that's how it worked with Windows on the HoloLens. Jesus, I had to input my 30-character workplace account password on a keyboard with petite keys floating mid-air, away from me, switching between the alphabet and symbol modes every few air taps.
I could almost never log in, because it was impossible to tap the correct key 30 times straight. Make one mistake and you need Backspace, but the Backspace key was also small and I could rarely hit it either. So you try to remove a character, insert another wrong one instead, and repeat until you miraculously manage to backspace exactly the right number of times.
Agreed, I worry about this too. The Quest uses a similar gesture with hand tracking (finger pinching to click) and it feels really frustrating compared to the much more direct feel you get with the included controllers.
With the Apple device you don't even have controllers available if you want them, so gesture tracking must work perfectly. Apple does have a lot of experience in getting stuff like that just right, but I really wonder whether eye tracking + pinching is comfortable for hours.
Supposedly the gestures are one thing they did a really solid job of based on the demo recaps I've watched. And the eye tracking supposedly works quite well for focus state switching. The main complaint I've heard is that the virtual keyboard sucks.
I'll be really interested to see more in depth reviews when they start coming out.
Yeah that I can imagine. I think it would be really annoying and exhausting having to type by looking at the letters. This is how you control the mouse pointer, right?
But I really hope I can see it for real some day.
Here's what that Mark Gurman dude (Apple/Tech journalist for Bloomberg) tweeted about it:
So it sounds like it's either poke or look + pinch, and both options suck for a keyboard. I just think a virtual keyboard is a very difficult problem to solve for several reasons, which is why every attempt at one thus far has been shit.
And that's kinda the whole problem with VR/MR. It's some of the absolute hardest computing and optical and battery hardware and UI challenges we can find, all bundled into one product. It's just an incredibly steep task and a lot of the solves aren't even really a matter of "oh this is expensive" as much as it is "we're not sure if this is even possible right now."
I really hope we eventually get a fully mature device. I quite like VR and see so much potential in it.
Ok yes, with the Oculus it's similar actually. You can poke at the letters, but the exact depth detection is not great (mainly because your finger is pointing directly away from the tracking cams), so it's a bit hit-and-miss.
And moving the "virtual mouse pointer" and then pinching is also a pain. My Oculus doesn't have eye tracking, but you can move your hand to move the "pointer".
Both methods are a PITA. Using the controllers to point and then click the trigger is better but it's still slow going of course that way. It's like typing on a keyboard hanging in front of you by pressing the keys with a stick. Considering that's the most comfortable option (which the Vision Pro doesn't have for lack of controllers), it's pretty sad.
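For what it's worth, the hit-and-miss poke detection described above is commonly tamed with a hysteresis band: the finger has to cross one depth to press, but retreat past a second, farther depth to release, so jittery depth estimates don't fire several presses. This is just an illustrative sketch with made-up thresholds, not any headset's actual logic:

```python
# Illustrative sketch (not any real headset's code) of debouncing a
# noisy fingertip-depth estimate with hysteresis. Depths are metres
# in front of the virtual key plane; thresholds are invented.
PRESS_DEPTH = 0.00    # fingertip at/behind the key plane -> press
RELEASE_DEPTH = 0.02  # must pull back 2 cm before a new press can fire

class PokeDetector:
    def __init__(self):
        self.pressed = False

    def update(self, depth):
        """Feed one depth sample; returns True exactly once per press,
        even if the estimate jitters around the key plane."""
        if not self.pressed and depth <= PRESS_DEPTH:
            self.pressed = True
            return True
        if self.pressed and depth >= RELEASE_DEPTH:
            self.pressed = False
        return False

# Noisy samples wobbling around the plane register one press, not several:
d = PokeDetector()
events = [d.update(z) for z in [0.05, 0.01, -0.01, 0.005, -0.005, 0.03]]
print(events.count(True))  # -> 1
```

With a single threshold instead, the -0.01 / 0.005 / -0.005 wobble above would have registered two presses.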
But yeah I see the potential too.. I hope it will come to pass.
I can imagine a return to some sort of T9-style typing, where you could wear a thin sensor on your fingertips and then tap certain fingers a certain number of times to enter specific characters. People who were used to typing with T9 could do it very quickly and without looking.
True, but it's still about adapting the user to the tech instead of the other way around. I don't think Apple will go for that.
I would personally think more in the direction of a separate sensor you can place in the house; from a third-person point of view the finger tracking will be much easier to do, because you're not moving straight away from the camera.
Oh yeah, I meant eventually, not with this device. I doubt this will take off honestly; the tech is still too new and bulky and expensive. If virtual environments ever do become prolific, though, I doubt we'll still use a visual representation of a keyboard at all. What would be the point?
What do you envision we'll use then? Dictation perhaps?
I don't really use that much because it's not really up to scratch yet IMO. But of course that may come.
Dictation in some cases, sure, but it's not really secure if you're around people, and it could also get weird talking to the air all the time. I think if AR/wearable screens really want to take off, we're going to need an entirely new input method. Typing on a virtual keyboard is just so impractical, especially if you're, say, on a train. I think it'll be something like what I described: a lightweight wearable glove or fingertip sensor, where you input based on fingertip taps. You can keep your hands down by your sides while typing, and you don't have to flail about in the air just to quickly google something or answer the text that popped up on your glasses. Or a physical little keypad that can slip into your pocket, but with few enough buttons that you can type without having to look at it, like T9 texting.
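The fingertip-tap idea could also be chorded rather than multi-tap: which fingers tap *together* selects the character, so one tap per character and hands stay at your sides. A hypothetical sketch (the finger-to-letter assignment is invented; four fingers only give 15 chords, so a real scheme would need the second hand or a shift chord for the rest of the alphabet):

```python
# Hypothetical chord table for a fingertip-tap wearable: each
# combination of fingers tapped simultaneously maps to one letter.
from itertools import combinations
import string

FINGERS = ("index", "middle", "ring", "pinky")

def build_chord_table():
    """Assign letters to finger combinations, smallest chords first.
    2^4 - 1 = 15 chords, so only 'a'..'o' fit on one hand."""
    chords = []
    for size in range(1, len(FINGERS) + 1):
        chords.extend(combinations(FINGERS, size))
    return {chord: ch for chord, ch in zip(chords, string.ascii_lowercase)}

TABLE = build_chord_table()

def decode(taps):
    """taps: list of finger groups tapped simultaneously."""
    return "".join(
        TABLE[tuple(sorted(t, key=FINGERS.index))] for t in taps
    )

print(decode([["middle", "ring"], ["index"]]))  # -> ha
```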
Hmm interesting yeah..
I was thinking of swipe typing. I notice I can comfortably type on tiny keyboards this way (like the one on my Unihertz Jelly, which has a 2.5" screen). Perhaps that would work better in VR, especially because it doesn't rely so much on forwards/backwards movements but only sideways/up-down ones, which are much easier to track from the point of view of the head.
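A toy version of the swipe-matching idea, assuming the tracker can already reduce a swipe to the sequence of keys it passed over (the matching rule here is my own simplification, not any real engine's: a word matches if its letters appear in the key sequence in order, anchored at both ends):

```python
# Simplified sketch of swipe-typing word matching: the swipe is given
# as the string of keys crossed, and a dictionary word matches if its
# letters occur in that string in order, starting and ending on the
# swipe's first and last keys.
def matches(path, word):
    if not word or path[0] != word[0] or path[-1] != word[-1]:
        return False
    i = 0
    for ch in path:
        if i < len(word) and ch == word[i]:
            i += 1
    return i == len(word)

def candidates(path, dictionary):
    """All dictionary words consistent with the swiped key sequence."""
    return [w for w in dictionary if matches(path, w)]

# A swipe passing over h-e-l-p-l-o could mean "hello":
print(candidates("helplo", ["hello", "help", "halo"]))  # -> ['hello']
```

A real engine would also score candidates by how close the path stays to each letter, but the in-order matching above is the core of it.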
Virtual "floating mid air" keyboards are never going to be good. Even "projected over your fingers" keyboards are going to tank.
What AR should allow though, is using either a normal keyboard, or using a physical surface as a keyboard, with tactile feedback and no confusion about whether you've hit a key or not.