Hacker News

> Potentially vital R/D happening in such a way as to have almost zero chance of actually becoming a product.

My assumption is that this early work is meant to scoop up important patents.

Related to important patents, varifocal is almost certainly the future of HMDs.

Having fixed focus, as all HMDs do now, is fatiguing and strange. A really interesting example: in VR, hold something close to your eyes. You'll see that it gets blurry. Now close one eye, and you'll see that it's clear. It's not that the image is actually blurry; it's that your eyes are refusing to put up with the physical nonsense of a fixed-focus 3D world.



> to scoop up important patents

Or perhaps, to refine that: to pre-empt important patents.

Apple is aggressively patenting obvious things that have been basic public knowledge / in use for years, to try to lock up the field (e.g. [0]). You can try to get your own patents, but if you aren't strategically dependent on the IP for your own needs, it's much easier and quicker to nuke the field by putting as many things in public as possible, so it's much harder for Apple (or others) to retro-patent broad classes of obvious features.

[0] https://image-ppubs.uspto.gov/dirsearch-public/print/downloa...


Wait, am I reading this right? Is this saying that two days ago, Apple was granted a patent on all of AR? Like, what Pokemon Go has been doing since 2016?


It is talking about showing non-visible features as visible.

Like this Quest app for visualizing WiFi signal strength in mixed reality:

https://www.oculus.com/experiences/quest/4793438284066128/


Sure, but the Vision Pro uses eye tracking and foveated rendering to simulate planes of view.


Looking at a far object relaxes the focusing muscles, while looking close tenses them. Any deviation from this is somewhat uncomfortable. Fixed-focus displays mean that when you look far, your eyes relax, expecting the physics to be correct, but then have to hunt to focus at 3 m (or whatever the fixed distance is). Same with near objects: the eyes tense, things go wrong, then they drift back to 3 m.

Blurring can be applied, but that doesn't change the physics your eyes expect, which causes the disconnect and discomfort that varifocal displays solve.
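To put rough numbers on that mismatch (a sketch, not from the thread): accommodation demand is usually expressed in diopters (1/distance in meters), and the conflict is the difference between where the content appears to be and where the display actually focuses. The 3 m figure matches the fixed distance mentioned above; the object distances are just illustrative examples.

```python
# Sketch: accommodation conflict of a fixed-focus HMD, in diopters.
# Distances other than the 3 m fixed focus are hypothetical examples.

def diopters(distance_m: float) -> float:
    """Accommodation demand in diopters (1/m) for a given viewing distance."""
    return 1.0 / distance_m

FIXED_FOCUS_M = 3.0  # the display's fixed focal distance

for virtual_distance_m in (0.3, 0.5, 1.0, 3.0, 10.0):
    # Conflict = |demand of the virtual object - demand of the real screen|
    conflict = abs(diopters(virtual_distance_m) - diopters(FIXED_FOCUS_M))
    print(f"object at {virtual_distance_m:>4} m -> conflict {conflict:.2f} D")
```

Note how the conflict blows up at arm's length (an object at 0.3 m is more than 3 D away from a 3 m focal plane) but stays small for anything beyond a few meters, which is consistent with the near-field blur experiment described above.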


It doesn't solve the problems.

Apple has actually issued developer guidelines for the AVP saying that, for this reason, apps should only render UI / static content at a fixed position in space from the headset, corresponding to the fixed focal distance.


…because the headset does the simulated focal plane rendering.


Can you cite an Apple source that says that? Nothing I read matched what you're saying. As another comment said, it uses foveated rendering and eye tracking to help with simulating depth.



