
    But it does put an enormous amount of pressure 
    on the eye tracking. As far as I can tell so 
    far, the role of precise 2D control has been 
    shifted to the eyes.
I've got one good eye and one bad eye. The bad eye is legally blind, has an off-center iris and is kind of lazy w.r.t. tracking.

I'm extremely curious to know how Vision Pro deals with this. One certainly hopes there's some kind of "single eye" mode; it seems possible with relatively little effort, and the percentage of the population who'd benefit seems fairly significant.

Eye tracking most certainly sounds like the way to go, relative to hand-waving.

The Minority Report movie probably set the industry back by a decade or two. Waving your hands around to control stuff seems logical but is quickly exhausting.



From Apple's visionOS page, re: Accessibility (https://developer.apple.com/visionos/)

"And for people who prefer a different way to navigate content, Pointer Control lets them select their index finger, wrist, or head as an alternative pointer."

This reads to me like they've got you covered (at least in principle).


Hopefully you can just set one eye as dominant. I likewise have only one good eye, so just using that one for tracking should be fine.


They have a training process at the beginning that learns your eyes. It asks you to ‘point’ to various points in space. I’d imagine it would be able to detect a dominant eye with that, and prefer it over a slower or non-tracking eye.
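
Purely as a speculative sketch (nothing here is Apple's actual pipeline; the sample type, the threshold, and the per-eye error idea are all invented for illustration), detecting a dominant eye from that kind of calibration data could look something like:

    // Speculative sketch only: infer a dominant eye from calibration residuals.
    // The types, threshold, and overall approach are assumptions for illustration,
    // not Apple's actual calibration pipeline.
    import simd

    struct CalibrationSample {
        let target: SIMD3<Float>     // the point the user was asked to look at
        let leftGaze: SIMD3<Float>   // where the left eye's gaze actually landed
        let rightGaze: SIMD3<Float>  // where the right eye's gaze actually landed
    }

    enum DominantEye { case left, right, both }

    func detectDominantEye(from samples: [CalibrationSample],
                           ratioThreshold: Float = 2.0) -> DominantEye {
        let leftError  = samples.map { simd_distance($0.leftGaze,  $0.target) }.reduce(0, +)
        let rightError = samples.map { simd_distance($0.rightGaze, $0.target) }.reduce(0, +)
        // If one eye tracks the targets far more accurately, prefer it outright.
        if leftError  > rightError * ratioThreshold { return .right }
        if rightError > leftError  * ratioThreshold { return .left }
        return .both
    }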


From a Q&A session, a trackpad will also work.


Covered in a basic sense, but that's an extremely diminished experience which basically removes half the reason to use it


Let’s see how it works before we start making all these assumptions. Apple has the best track record in the industry when it comes to accessibility, hands down.


I agree, but if that's their entire solution, it isn't good


It would have been much less exciting to see Tom Cruise sitting on a couch, hands in his lap, gently flicking his fingers to scroll through crime scene footage. IIRC he talked about how tired his arms got during filming.

EDIT: found it — he didn't talk about it, but it was reported that he had to frequently rest his arms: https://medium.com/@LeapMotion/taking-motion-control-ergonom...


Oh, 100%. The interfaces in Minority Report were spectacular in terms of cinematic movie appeal.

Artistically it was a great choice, and they certainly weren't intending to inspire a bunch of wrong-thinking UI choices in the real world.


I distinctly remember they said they had hired scientists to discuss what the future would look like, and I always thought, "yeah, right, so in the future we've got those holo interfaces and ultra-connected computers, but people in the same office still copy files from one computer to the other using some kind of pluggable storage".

But now that I am also in the future, I still use a USB drive to move files at the office. Guess they were on to something after all :].

    The fallacies of distributed computing:

        The network is reliable;
        Latency is zero;
        Bandwidth is infinite;
        The network is secure;
        Topology doesn't change;
        There is one administrator;
        Transport cost is zero;
        The network is homogeneous.


I rarely use USB media, instead just copying things around using the network. But last week I needed to copy 200GB to my son's laptop, and ended up putting a 256GB SD card in the laptop and copying to it overnight (it started at 9pm anyway).

Never underestimate the bandwidth of a station wagon full of mag tape.
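
Back of the envelope, with my rough numbers (assuming roughly a 10-hour overnight window and typical link speeds):

    \frac{200\,\mathrm{GB}}{10\,\mathrm{h}} \approx 5.5\,\mathrm{MB/s}
    \qquad
    \frac{200{,}000\,\mathrm{MB}}{110\,\mathrm{MB/s}} \approx 30\,\mathrm{min}\ \text{(gigabit Ethernet)}
    \qquad
    \frac{200{,}000\,\mathrm{MB}}{11\,\mathrm{MB/s}} \approx 5\,\mathrm{h}\ \text{(100 Mbit/s)}

i.e. sneakernet only wins here if the available network path is slower than roughly that effective rate.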


The dystopian future has a bustling physiotherapy and occupational therapy industry.


I can't speak for working in VR, but I think we desk jockeys tend to underestimate what other people are doing for work. The lack of movement is what does thought workers in.


There are of course many jobs that are much more taxing on the body overall than waving one's hands at a Minority Report style computer interface. (I used to work at a restaurant! I've done actual work, I swear!)

However, waving your hands in the air for 8-10 hours a day feels a bit unnatural relative to "actual" physical work. I don't have a real scientific basis for this statement.

But there are some physics at play IMO. In boxing it's generally accepted that a missed punch consumes about twice as much energy as one that connects -- your body must do the work to accelerate your arm, and then it must do a roughly equal amount of work to decelerate it. I think there are some parallels to waving one's hands in the air for 40-50 hours per week relative to "actual" physical labor.
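
Very roughly, treating the arm as a point mass m swung at speed v (a gross simplification, and ignoring that eccentric muscle work costs somewhat less metabolically than concentric work):

    E_{\mathrm{hit}} \approx \tfrac{1}{2} m v^{2}
    \qquad
    E_{\mathrm{miss}} \approx \underbrace{\tfrac{1}{2} m v^{2}}_{\text{accelerate}} + \underbrace{\tfrac{1}{2} m v^{2}}_{\text{decelerate}} = m v^{2}

On a hit, the target absorbs most of that kinetic energy; on a miss, your own muscles have to soak it back up.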

More to the point: MR-style interfaces (at least as typically implemented/depicted) don't really offer tangible advantages over a good touchpad. They trade tiny finger movements on a trackpad for huge sweeping arm movements that accomplish the same things.

I think if MR-style interfaces had somehow been invented before mice and trackpads, we'd be celebrating mice and trackpads as absolute miracles of efficiency.


I actually prefer the keyboard alone for its precise control; I find mice pretty clumsy. But there are things a mouse can do that I can't achieve on a keyboard, and the same goes for VR; perhaps we'll end up with blended workstations.

On the note of tiring out, I have been strength training for 12ish years now and I still get tired quickly when holding my arms above my head to work on my car. I think because they are a smaller muscle group, they can be saturated easily. I don't have the same issue with repetitive motions, it's just holding the arm in a similar position that does it.

If AR gestures take into account full motions, rather than requiring the arm to hold one position for too long, they might not be so tiring.

I have VR already, and I will say it's an exhausting experience in general: I can flat-screen game for hours, but in VR I want out in less than an hour. I think it's the full focus it forces on you, and the split-world spatial reasoning going on.


I’ve been on both sides of the coin in a role where I was on my feet 12 hours a day.

The aches and pains were there, but just in different spots. Yes my core was stronger, but holy hell were my feet exhausted.


You're talking like I didn't need to go to physiotherapy and occupational therapy to deal with my wrist issues and the lack of muscle tone in my glutes.


Same with all the scientists in movies writing on glass boards.


Yeah, Tom Cruise doesn't seem like the type of person to admit he would need to rest his arms.


What gave it away, the iron cross at the beginning of MI:2?


Not because he's allegedly fueled by the souls of aliens?


Considering there is a "calibration" step, I'm going to guess that as long as the "bad" eye behaves relatively predictably, it should be able to ignore the bad input and put appropriate weight on the "good" eye.
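
A purely speculative sketch of what "appropriate weight" could mean at runtime (the per-eye confidence score and everything else here is invented for illustration, not anything Apple has described):

    // Speculative sketch: blend the two eyes' gaze directions by a per-eye
    // confidence score, so a poorly tracking eye contributes little or nothing.
    // All names and the scoring idea are assumptions for illustration.
    import simd

    struct EyeSample {
        let direction: SIMD3<Float>  // unit gaze direction for this eye
        let confidence: Float        // 0 = no usable signal, 1 = fully trusted
    }

    func fusedGaze(left: EyeSample, right: EyeSample) -> SIMD3<Float>? {
        let total = left.confidence + right.confidence
        // Neither eye usable: caller falls back to head/wrist/finger pointer.
        guard total > 0 else { return nil }
        let blended = left.direction * left.confidence + right.direction * right.confidence
        return simd_normalize(blended)
    }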

It will also be interesting to see how the outward display renders "bad" eye input, including missing eyes, lazy eyes, nystagmus, etc.


> The Minority Report movie probably set the industry back by a decade or two. Waving your hands around to control stuff seems logical but is quickly exhausting.

There was an operations center I worked at that had a large touch screen (think CNN wall). It always sat in the corner collecting dust, but whenever some higher-up would visit it would get pulled out and used just to provide a wow factor. There was always some unlucky person who ended up having to use it during these visits. The fatigue was horrible.


Track-record-wise, Apple is one of the best in terms of serving accessibility. So I'd bet greater than 50% odds that they're thinking about lazy eyes, single eyes, or derivatives thereof.


[flagged]


Well, they have pretty consistently invested in and showcased accessibility, famously ROI-be-damned (to paraphrase Tim Cook). So, it’s not a stretch to expect (but not assume) they will have the perspicacity to do the same here.


I agree not to drink the Apple copium, but they do excel at accessibility by a wide margin.

Honestly, I think a large part of the industry taking accessibility more seriously comes from the competitive advantage Apple has created.

Doesn't mean accessibility hasn't been done in the past. By no means. But it's a market segment they have on lockdown.


FWIW, I emailed a fairly senior accessibility person at Apple yesterday with a question about the headset and got a response back today, even in the midst of WWDC. And I'm a relative nobody — just a bootstrapped solo founder who works in the accessibility field.


> Eye tracking most certainly sounds like the way to go, relative to hand-waving.

I suppose Apple could change my mind but I've never been a fan of eyes as input. Touch typing or any equivalent where you can look at one thing and do another seems impossible now.


I'd wait for the product to actually launch before making such a judgement


Apple don't need you to defend them here. They presented what they presented, and whether we're interested is based on that. It's a legitimate concern and a valid point of discussion.


The interaction paradigm you describe (where it's literally impossible to look at one window while actively using another) makes so little sense that we can almost certainly rule it out. Certainly no press members who demo'd the unit are describing the interface that way.

That's not "defending Apple", that's just... gosh, I can't think of any word besides common sense.

From all of the presentations and press accounts, it seems clear that eye tracking is something of a direct replacement for the mouse/trackpad cursor -- in fact it seems you can use a trackpad or presumably a mouse as well. Looking at an app or "window" seems equivalent to mousing over it, not equivalent to clicking to activate that window.
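
A minimal sketch of that model in SwiftUI terms (the view and its action are hypothetical; this just assumes the gaze-highlights/pinch-activates pattern described in the sessions):

    // Minimal sketch: the gaze only highlights (hover), the pinch activates (tap).
    // "DocumentTile" and its action are made up for illustration.
    import SwiftUI

    struct DocumentTile: View {
        let title: String

        var body: some View {
            Button {
                // Runs on the indirect pinch (or a trackpad/mouse click), i.e. the "click".
                print("Opened \(title)")
            } label: {
                Text(title)
                    .padding()
            }
            .hoverEffect() // highlighted while the eyes rest on it, like a mouse-over
        }
    }

Resting your eyes on the button behaves like a mouse-over; nothing activates until the pinch (or, it seems, a click from a connected trackpad).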


> The interaction paradigm you describe (where it's literally impossible to look at one window while actively using another) makes so little sense that we can almost certainly rule it out.

I think the implication is that the window you are looking at is focused automatically, which wouldn't work for, say, typing a document in a text editor while reading from a reference doc. And that's just the beginning; I use a focus-follows-mouse window manager and that sort of thing happens all the time.


I don’t know how it will actually work, but the Mac has always been click-to-focus, which would avoid that issue.


That contradicts the many accounts of "look at a text box, start talking, and it types".


I don't disagree with your other points AT ALL, but given how repeatedly it's been shown that common sense is entirely a fiction, if any argument falls back on it you have to be immediately suspicious of it.


I hear that now and then, but looking at the wiki on it, the “criticism” paragraph is only like 3 lines, with no real refutation, out of a hefty article.

What’s your motivation for saying that? Personally, I am suspicious of people who say there isn’t common sense.

Here is the definition at the top: Common sense (or simply sense) is sound, practical judgment concerning everyday matters, or a basic ability to perceive, understand, and judge in a manner that is shared by (i.e., "common to") nearly all people.

One can certainly say there is common sense. For instance, no one likes pain; people like pleasure. People care about entertainment, not boredom. People tend to learn, on the whole, as they grow up. I also think people on the whole have a perception of what is wrong and right, even if there is variance between cultures. The golden rule of reciprocity seems to be near universal (like: don’t kill, don’t steal, and so on).

Lots of things we share as a species.


Your basis for discussion is that you read the wiki article on common sense? Then you cite a dictionary definition? Suffice to say, that is pseudointellectual and inappropriate.

And then you obviate the concept by rendering your own gross ignorance: people don’t like pain? Today you learn about masochism. People need entertainment? Monasticism, asceticism, Buddhism, nihilism, et al. People learn as they age? Quite unfortunately many studies show many people lose a great deal after their teens. And your invocation of the idea that cultures share morality such as a prohibition on murder (this has nothing to do with “reciprocity” by the by) is truly divorced from basic observable reality.


Well, let's start with some basics.

100% of a farming town/community are going to hold "don't grab an unknown fence, as it might be electric" to be common sense. Yet there's going to be a sizable portion of the planet to whom that WON'T be obvious, and they'll try it.

100% of New Yorkers might know not to park your car overnight in a particular area, yet people new to the city will happily do so not knowing they may be putting themselves or their property in danger.

If you have to put bounds on "common sense", or contextualise it at all, then it isn't common, it's just knowledge shared by a group of people of unknown size.

You say it in your final sentence - "near universal". If it's not universal then what's the point? Where is the magical cutoff point of the population where "common knowledge" is no longer common? 95%? 50%? If so, why?

Such speculation is ludicrous because it's going to change drastically from area to area, and change even more depending on what you're talking about. And if such a definition is so fluid, when you're positing that it is in fact universal, then it's pretty clearly false.

"Common sense" and "common knowledge" are just shortcuts we use so we don't have to explain the entire contents of history in one go to explain one other concept. This is very useful. But relying on it existing _ALL_ the time is the work of a dangerous lunatic that has clearly not encountered a broad enough segment of humanity.


Thank you for your thoughtful reply. I think I’m going to read up a bit more on the topic; it’s quite interesting.

I think my personal stance is more that there is a lot we share as a species (from words, to symbols, to human basics) and that could be grouped as common sense. But perhaps my grasp of English fails me here.

Looking at the etymology, it also seems to stem from “community”: what is common among a community. I would wager things were very much common for his peers when Thomas Paine wrote his book “Common Sense”, in a world that to him was largely the same in style and substance.

Perhaps that concept simply can no longer apply in a pluralistic, global world, or in a modern sense, like you point out with your examples.


I've had pendular nystagmus since birth. My eyes saccade and shake very rapidly, even when I'm focused on something. Folks who talk with me certainly notice.

I really hope I won't be locked out of using Vision Pro just because its input modality isn't compatible with my body...


You might be SOL on version 1 or 2 but they'll figure it out eventually.


That was my thinking. They don't always nail down accessibility issues right away, but I think they're fairly highly regarded in terms of getting to things eventually.

(I say "I think" because I have not actually used their accessibility features)

With all of the new paradigms in visionOS it seems perhaps unavoidable that accessibility features might take some time to catch up. Engineers and designers have to wrap their minds around the new stuff.


Maybe. The eyes are sort of the canary in the coal mine for lots of neurological issues. Sometimes people don’t notice problems because the brain adapts well, and accommodating them may be more challenging than one might think.

I have a loved one with issues due to a brain cancer. Optical seizures were the first warning sign and went unnoticed for some time. It would be amazing if visionOS could potentially detect some conditions or help folks with epilepsy, just as the watch has for some cardiac events.


Their accessibility features are quite fascinating.

Sound recognition, for example, or AssistiveTouch on the watch. Not needing those, I have no idea whether or not they’re actually useful, but it sure is impressive and seems well thought out.


They describe alternate pointer modes in this presentation (18:30) https://developer.apple.com/wwdc23/10034


Index finger, wrist, and head are supported as alternate pointers.


One could hope they have an accessibility mode where the eye targets are larger. Or they offer a hardware controller pointer.


Curious. How do you go with 'normal' VR? Are there any problems playing games?


> The Minority Report movie probably set the industry back by a decade or two. Waving your hands around to control stuff seems logical but is quickly exhausting.

The bad idea had waves even before that: Remember the gorilla arm.

http://www.catb.org/jargon/html/G/gorilla-arm.html


I remember the gorilla arm but I found out that occasionally raising a hand from the keyboard and tapping the screen is not a noticeable effort and it's faster than aiming at the touchpad or (of course) reaching for the mouse. I discovered it mostly by using a Samsung tablet with both a Dex interface over Android and with an Ubuntu Unity one (the defunct Dex for Linux.)

That's maybe because the screen is small and low. Raising an arm at or above the head is a different matter. If you're cycling with a heart rate monitor you discover that you add some beats when you raise your hands (say, go from 150 to 155) and gain some if you lower your chest close to the handlebar. This doesn't take into account increased or reduced drag because I measured it at home on trainers.


I've noticed the same thing when I ride in my basement. If I come up off of the hoods, grab a water bottle and take a drink my heart rate bumps up ~5bpm. It'll climb a bit more if I eat something like a granola bar or a gel.

What's interesting is that, unlike going from steady state to a hard sprint, the heart rate rise is almost immediate. If I change it up and do a sprint interval, there's some delay - 20 to 45 seconds - before my heart rate reflects that effort.


Strabismus and amblyopia are common enough (4% of the population) that one would hope these types of systems would make sensible provisions to accommodate people with these conditions.

https://www.childrenshospital.org/conditions/strabismus-and-...


Lazy eyes, I think, are probably common among software developers (I've got one, and most devs I know have eye issues). I think that's why Apple offers lenses too. I think (hope) they've got us covered, since we'll be the first to use, develop on, and afford these devices.


I’m with you on this one, but I also have nystagmus in both eyes. I’ve been curious but also somewhat apprehensive about stuff starting to rely on eye tracking and whether these edge cases will get handled.

Probably best for me not to be an early-adopter in this case :)

Edit to add: Can confirm that Apple stuff is amazing with accessibility, so I’m not surprised there are alternative input methods. My main thinking towards the future is around how nystagmus and foveated rendering will interact. That’s a giant and fascinating research topic I imagine.


I tried the HoloLens ages ago, and honestly the only things needing an upgrade were the display and how reliably the inputs worked. Microsoft was demoing the HoloLens at some hackathon and had a program where you could flip pages in a virtual book. On top of the book looking super low quality / illegible, I could never get the controls to work. The only thing that will make the Vision Pro seem like a worthwhile purchase is if the controls work flawlessly. I already have faith in the displays and picture.


I have a divergent squint in both eyes when tired/exhausted. I've yet to try VR with eye tracking. Furthermore, I think the future will be some kind of brain-reading implant/scanner.


They have accessibility/alternative input features to help with that


The arm waving is some form of exercise, which is a plus for me.


"Gorilla arm" is a term engineers coined about 30 years ago to describe what happens when people try to use these interfaces for an extended period of time. It's the touchscreen equivalent of carpal-tunnel syndrome. According to the New Hacker's Dictionary, "the arm begins to feel sore, cramped and oversized -- the operator looks like a gorilla while using the touchscreen and feels like one afterwards."

https://www.wired.com/2010/10/gorilla-arm-multitouch/


I guess none of the engineers lifted weights. That community has a bunch of terms for this.



