
I don’t want to be nosy, but given that it sounds like you feel it’s a paradigm-shifting device, and you’ve actually worked on and with the device, I’d be interested to read about your motivations for moving on from Apple/the project when you did, if you ever feel like writing about it.

So many questions I’d love to ask you about this, but again, none of my business. Just wanted to let you know I’d upvote that blog post to the moon should you ever decide to write it.



Thanks! My motivations were mostly comp-based (I'm in HFT now); I loved the product and the team I was on!

Feel free to ask any questions here and I'll answer anything I can (e.g. no unreleased info, but maybe go into more detail about the kind of stuff that was shown to devs at WWDC). No blog right now but maybe in the future :)


A major limitation for using VR devices as "virtual monitors" has been the screen resolution. Vision Pro appears to have significantly higher resolution than anything available on the market right now; in your experience, was the resolution high enough for the pixels to disappear and not be a distraction, especially when e.g. reading text? Have you used other VR headsets to compare it to?


I've used other VR headsets and it's significantly better than everything I've tried. You'll be able to put up virtual monitors at the same density as real life and read stuff fine, I couldn't really discern pixels at all. Reading physical monitors through passthrough is a bit harder (I increased text size a few points when I needed to do this) but isn't really something you'd do for actual work.


Lol, this comment made a fun image pop into my head: being in a work meeting with my Vision Pro on, looking at the presentation through passthrough while browsing HN on my virtual screen. :)

The idea of looking at real screens in passthrough didn't even occur to me.


Lol, I expect people will figure out tells for whether you're slacking off IRL in a headset, just like being distracted in a Zoom meeting. Reprojected eyes jumping around too much? Holding your hands below the table to pinch out of sight?


I'll be plunking down $3,500 if it lets me sleep through meetings.


I don’t know how interesting your meetings are, but I don’t need VR glasses for that.


Found the killer app.


Yep, I'll be sound asleep and the visor will show my eyes focused intently. It'll blink at an appropriate rate and maybe raise eyebrows during interesting conversation.


I am very excited for the front display to be jailbroken (assuming that’s needed) to display other, more creative imagery, à la Snapchat.


So in terms of information density and readability, how does it compare to a 5K retina display on an iMac or Studio Display?


Just purely from a PPD (pixels per degree) perspective a 5K Retina display will beat the Vision Pro, but from a readability standpoint I think they're equal. Like at "normal" text editor scales you can read everything just fine, but a 5K display might be crisper.
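
To put rough numbers on what PPD means here (these are my own back-of-the-envelope assumptions, not official specs), here's a quick Swift sketch of the math for a desk monitor, with the headset's density only guessed at:

    import Foundation

    // Rough pixels-per-degree (PPD) estimate for a flat monitor at a given
    // viewing distance. All numbers here are illustrative assumptions.
    func monitorPPD(horizontalPixels: Double, diagonalInches: Double,
                    aspectWidth: Double, aspectHeight: Double,
                    viewingDistanceInches: Double) -> Double {
        // Physical panel width from its diagonal and aspect ratio.
        let widthInches = diagonalInches * aspectWidth /
            (aspectWidth * aspectWidth + aspectHeight * aspectHeight).squareRoot()
        let ppi = horizontalPixels / widthInches
        // Inches subtended by one degree of visual angle at that distance.
        let inchesPerDegree = 2 * viewingDistanceInches * tan(0.5 * Double.pi / 180)
        return ppi * inchesPerDegree
    }

    // A 27" 5K panel viewed at ~28 inches works out to roughly 105 PPD.
    let fiveK = monitorPPD(horizontalPixels: 5120, diagonalInches: 27,
                           aspectWidth: 16, aspectHeight: 9,
                           viewingDistanceInches: 28)
    print(String(format: "5K display at desk distance: ~%.0f PPD", fiveK))
    // Headset guesses elsewhere in this thread land somewhere in the 30s-40s
    // PPD, which is why a 5K panel still wins on raw crispness even though
    // text in the headset reads fine.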


What’s colour accuracy like? I wonder if the controlled environment with no external light etc could be useful for colour critical work.


Sorry, I haven't done any color critical work and am unsure of the specs.

If you're wondering about passthrough it's pretty good for most things but is definitely missing the dynamic range of the human eye, which no video camera or display can really match yet. Like a super bright light might just show up as white and you can look directly at it no problem. Basically the same as what you get when you record a video and watch it back on your phone.


Is it comfortable for all-day work like that? Virtual monitors seem like the critical app for me.


It's more comfortable than other VR headsets I've tried, but still something strapped to your head that's significantly heavier than a pair of glasses. I think I could spend all day in it with no issues but I'm not sure how sick of it I'd get doing that 5x a week, every week.

I'm sure as soon as it's released people will be doing that and reporting back though!


Have you tried the XReal glasses? Virtual monitors are my biggest use case too and I'm close to taking the plunge on XReal, but I'd love to hear from people who have tried them.


Resolution aside, most headsets kinda feel like wearing a scuba mask because of the narrow field of view. How was the Vision Pro? I assume this should be public info since it was shown to the press and devs.


To be honest I don't know the actual FOV number. It feels better than some VR headsets I've tried and on par with others. The lenses are definitely a more exotic shape than the ones on my Vive so they're able to get closer to your eyes and have better quality in all areas of the FOV. I feel like for work stuff and entertainment it's definitely good enough, though you might struggle living in it full time haha.


1. How about drinking stuff with it? Is it possible without taking it off?

2. What about sleeping with it? VRC people do that, so they might be curious.

3. Is it usable outside?

4. What was the longest time you had it continuously on?

5. Have you ever dropped it? :D


1. You can drink stuff but have to be careful. Hand-eye coordination gets a bit wonky the closer things get to your face. I've done it and it works though!

2. Never tried sleeping with it... I don't see why it would be any worse than other headsets though.

3. I've never used it outside, but that was for secrecy and not technical reasons.

4. Honestly not sure, maybe an hour without taking it off at all but I've definitely been in it for the majority of a few hour spans many times. At the time the main blocker was the beta OS and not comfort or battery (I would keep the battery pack plugged into the charger most of the time).

5. Nope! We were all super careful with them because prototypes are expensive, much more so than the consumer product. It's not something you could just casually drop while using like your phone though.


Thank you very much for your replies. :)

4. So the battery can be charged while plugged into the headset. What happens when you pull the battery from the headset? I am guessing insta-black. Does it have some power-saving mode where only the R1 feeds images from the cameras to the displays, without any computing capability?


> (I would keep the battery pack plugged into the charger most of the time).

This sounds like the battery can be charging while using the headset, right? Which imo makes the 2 hour battery life much more understandable – if you're stationary in the device most of the time then you only have to rely on the battery when you move. If you can plug in to charge when you're back at your desk/couch, it's not really a limiting factor (for the use cases Apple is pursuing).


Gruber claims that it has the exact same FOV as actual human vision, and no discernible pixels.

https://daringfireball.net/2023/06/first_impressions_of_visi...


That can't be right. The human FoV is huge, more than 180 degrees. To cover that range without visible pixels requires much more than "4k" type resolution.

One of those statements can be right, but not both at the same time.

It's time someone with real measuring equipment looks at one of these and gives a more technical review than just "wow Apple magic".


> That can't be right. The human FoV is huge, more than 180 degrees. To cover that range without visible pixels requires much more than "4k" type resolution.

I'm not so sure. Human vision is only sharp in a coin-sized area at any time. If you fix your eyes on a single word in your comment you can't actually read the entire comment, for example.

In other words, you don't need 4K across your entire FOV; you just need to ensure that most of the pixels are spent in the middle of the viewing area.

I believe that it would be possible to have the screens in the headset generate a very distorted image, where the edges are compressed to a small area of the actual screen and therefore low-res, while the lenses stretch this image to fill the viewing FOV. Kind of like anamorphic movie lenses, or even how wideangle lenses distort the edges more than the center.

I have no idea if Vision Pro does this but it seems theoretically possible at least.


That's called foveated rendering, and it's exactly what the Apple headset does, they mention it here: https://developer.apple.com/visionos/
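
For anyone who hasn't seen the term before, here's a toy Swift sketch of the basic idea behind eye-tracked foveated rendering: keep full detail only near the gaze point and shade the periphery more coarsely. The tile model and thresholds are made up for illustration; this is not how visionOS actually schedules its GPU work.

    import simd

    // Toy foveated-rendering helper: pick a coarser shading rate the farther
    // a screen tile is from the current gaze point. Thresholds are invented.
    enum ShadingRate {
        case full     // one shade per pixel, foveal region
        case half     // one shade per 2x2 block, near periphery
        case quarter  // one shade per 4x4 block, far periphery
    }

    func shadingRate(forTileCenter tile: SIMD2<Float>,
                     gazePoint: SIMD2<Float>,
                     degreesPerUnit: Float) -> ShadingRate {
        // Angular distance (eccentricity) between the tile and the gaze point.
        let eccentricity = simd_distance(tile, gazePoint) * degreesPerUnit
        switch eccentricity {
        case ..<5:  return .full     // fovea: keep every pixel sharp
        case ..<20: return .half     // near periphery: half the detail is plenty
        default:    return .quarter  // far periphery: acuity is low anyway
        }
    }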


For foveated rendering you still need good pixels everywhere in case you're looking to the sides.

I think what the OP means is that you'd actually have less physical resolution at the sides. Most headsets do this somewhat, but not to a huge degree.


Each eye is not getting "4K resolution"; each is claimed to be capable of rendering a 4K screen at natural resolution.

Meaning a couple of 4K screens can be rendered in the visible area without a noticeable difference in quality compared to real screens at a similar "apparent" distance.


Not sure why people are still repeating this erroneous line of thinking. Humans can’t look in two different places at once like a chameleon. The eyes focus on the same point and the stereo images overlap almost entirely; the difference in point of view between your two eyes is tiny, they are right next to each other! (Try alternately closing one eye.) If you’re emulating a pixel from a screen in VR you’re going to have to draw that same pixel in both eyes. You do not get 2x the pixels.

The exception to this is at the edges of your vision, where each eye does see a unique portion of the field, but by definition that’s not where you’re looking.


> Meaning a couple of 4K screens can be rendered in the visible area without a noticeable difference in quality compared to real screens at a similar "apparent" distance.

This is impossible at the resolution stated. The headset would have to be 8K or more per eye in order to achieve this, which it most definitely is not.


With Apple's marketing power, I really would expect them to claim a higher resolution number if it were that high. After all, why not? If it were actually 5K they would shout it from the rooftops. All they've said is "more pixels than a 4K screen", and we also know this type of lens distortion causes some waste.

We also know it's 23 million pixels for the whole system, so 11.5 million for one display. A 4K display is 8.2 million. So it's not a whole lot more than 4K, and lower than their own 5K display (which has roughly 14.7 million), which matches what they claim ("more than a 4K display"). That's not enough to cover the full field of human vision and still have pixels so small they can't be seen.

With corner waste it's just barely enough for one 4K display this way (and stretched to the full limits of vision it would be pretty unwatchable that close). You can't display two as you mention, because AR/VR projection works by using the two screens to display the same content, just from a slightly different position (parallax), which creates the 3D effect. The more overlap between the eyes, the better and more comfortable the 3D effect (some headsets try to get an ultrawide FoV this way but shoot themselves in the foot with low overlap).

If it has a really wide FoV its sharpness will be pretty much on par with a Quest 3/Pico 4; if it's got the same FoV it will be a lot sharper. I expect the truth to be somewhere in between: wider than a Quest and also sharper, but with pixels visible if you look closely, and not quite the full human FoV.

What I expect Apple will have done is sacrifice a bit of vertical FoV for horizontal FoV. Vertical FoV is important in VR (especially 'roomscale') because of orientation issues, and not quite as important in AR. Also, most of their marketing material promotes a seated position, so moving around is not an issue. Most VR headsets have an almost-square resolution per eye, but I expect this one to be closer to 16:10 (maybe not quite that wide). I think it will come out at around 4300x2600 pixels per eye, which is slightly over 11 million pixels.
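
Spelling that arithmetic out (Swift, used purely as a calculator; only the ~23 million total is Apple's number, everything else below is my guess):

    // Only the ~23M total pixel count comes from Apple; the per-eye split,
    // panel shape and FOV below are guesses.
    let totalPixels = 23_000_000.0
    let perEye = totalPixels / 2              // ~11.5M per eye
    let uhd4K = 3840.0 * 2160.0               // one 4K UHD frame
    let retina5K = 5120.0 * 2880.0            // Apple's 5K panel

    print(perEye / uhd4K)                     // ~1.39x a 4K frame per eye
    print(perEye / retina5K)                  // ~0.78x a 5K frame per eye

    // Spread a guessed 4300-pixel-wide panel over a guessed ~100 degree
    // horizontal FOV and you get roughly 43 pixels per degree, before any
    // lens-distortion waste; short of "pixels fully disappear" territory.
    let guessedPPD = 4300.0 / 100.0
    print(guessedPPD)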

Still very impressive (which it must be at that price point obviously!). But real-world and not magic. Makes sense because Apple's engineers, as good as they are, are bound by the same laws of physics we all are. Their marketeers would love us to believe otherwise though.

It's really time for some real-world specs on this thing instead of marketing blah. But I think Apple specifically doesn't want this; it's no wonder they let only the most loyal media outlets (like Gruber) get as much as a short hands-on with this thing. And even those experiences are white-gloved in every detail (they even prepared custom adaptive lenses). They just want to keep the marketing buzz going as long as possible.


Norm from Tested, a highly credible source for VR, actually mentions the FOV is akin to Valve Index's: https://youtu.be/f0HBzePUmZ0?t=825

This is actually a bit disappointing since I would rather not have to move my entire head to look at a virtual side monitor. It seems like the technology is there now and many companies are looking into it; hopefully other headsets will be released with higher FOVs: https://youtu.be/y054OEP3qck?t=283

Another noteworthy point from that video: Apple has bought Limbak, an optics specialist formerly tied to the Lynx R1's optics. This means Lynx can no longer use Limbak's future optics, which, admittedly, fall short in scaling to higher resolutions. Now Lynx has shifted its gaze to Hypervision optics, intent on preventing a similar acquisition by another tech behemoth like Apple.


I just read Gruber's review. When he says "field of view" he's not talking about total degrees of arc, he's talking about what one might think of as "zoom," in this instance. He says things don't appear larger or smaller when you remove the headset.


I think he’s mostly saying that whatever is in the visible field of view matches our eyes. Doesn’t mean the actual FOV is 180 :)


The actual quote from Gruber was "There is no border in the field of vision — your field of view through Vision Pro exactly matches what you see through your eyes without it."

What does "border in the FoV" means?


I assume that you’d be able to look beyond the edges of the screen.


I’m still curious how things will look in low light. All the demos were in an optimally lit room, but what about when you dim the lights and the camera has trouble picking up the room?


There are infra-red illuminators.


Oh, that’ll be fun when people start to realize that some polyester fabrics are transparent in IR!


> Vision Pro and VisionOS feel like they’ve been pulled forward in time from the future. I haven’t had that feeling about a new product since the original iPhone in 2007.

tf? Did he miss everything about LLMs?


LLMs are not so impressive once you understand approximately how they work. This new Apple thingy is very impressive even though it's much easier to understand. Apparently even if you worked on it from the beginning.


Most people are quoting a FOV of between 90 and 100 degrees, which is about average for the VR market.


What’s the best way to develop for Vision Pro until the headset is available to buy? Use ARKit/RealityKit/Unity on an iPhone to develop some of the basics and hope the code translates well enough?


Yep, the simulator (will come with the SDK when it's released) plus an iPhone/iPad is the best you're gonna get. Most code will definitely translate! Would recommend ARKit+RealityKit over Unity for the best OS integration unless you've already started with Unity or don't need OS integration (i.e. a videogame).
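
As a rough illustration of the "most code will translate" point, here's the kind of minimal RealityKit scene you could prototype on an iPhone today (a sketch only; the plane is a stand-in for a floating panel, and a real app would still need the usual camera-permission plumbing):

    import SwiftUI
    import RealityKit

    // Minimal RealityKit prototype that runs on an iPhone. The entity setup
    // (meshes, materials, anchors) carries over to visionOS largely unchanged;
    // only the SwiftUI container differs there.
    struct ARViewContainer: UIViewRepresentable {
        func makeUIView(context: Context) -> ARView {
            let arView = ARView(frame: .zero)

            // A half-metre-wide floating panel, standing in for a virtual window.
            let panel = ModelEntity(
                mesh: .generatePlane(width: 0.5, height: 0.3),
                materials: [SimpleMaterial(color: .white, isMetallic: false)]
            )

            // Anchor it one metre in front of the session's starting position.
            let anchor = AnchorEntity(world: [0, 0, -1])
            anchor.addChild(panel)
            arView.scene.addAnchor(anchor)
            return arView
        }

        func updateUIView(_ uiView: ARView, context: Context) {}
    }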


Do you feel there are any large leaps left for this type of device or will it be incremental improvements to price, speed, battery life, etc from here on out?


Honestly not sure at all; I wouldn't have been able to predict this large of a leap before finding out about it myself.

I do think a bunch of incremental improvements will eventually allow different use cases though. E.g. you'd never take it on a run or a bike ride in its current state, but with enough weight and battery improvements you could have a HUD for exercise stats + navigation when biking, with stuff like a videogame-esque ghost of your PR to race against. Just a random idea; there's a lot of stuff you can dream up.


I do think we'll see a fitness related product in the future but holy crap the early adopters trying it out should be super careful! It might be easy to forget that your eyes are completely covered with a screen.

Imagine riding a bike at 30kmph and accidentally unplugging the headset. Boom. Black. Instantly. That's a scary place to find yourself.

I'm curious if Apple will utilize the gyroscope to try and detect movement faster than a certain speed and show a warning. Take the risk if you want, but at least people should be aware of the consequences!


Hah yeah that'd be less than ideal.

> I'm curious if Apple will utilize the gyroscope to try and detect movement faster than a certain speed and show a warning.

The Vision Pro is doing SLAM at all times so it actually always knows exactly where you are (relative to its origin) and your speed!
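
So a speed-based warning like the one suggested above would be cheap to compute from consecutive head poses. A hedged Swift sketch (the sample type and threshold are made up for illustration, not a real visionOS API):

    import Foundation
    import simd

    // Illustrative speed check from consecutive tracked head positions.
    // The sample type is hypothetical; real tracking data would come from
    // whatever pose information the platform exposes.
    struct PoseSample {
        let position: SIMD3<Float>   // metres, relative to the tracking origin
        let timestamp: TimeInterval  // seconds
    }

    func shouldWarn(previous: PoseSample, current: PoseSample,
                    speedLimit: Float = 3.0) -> Bool {
        let dt = Float(current.timestamp - previous.timestamp)
        guard dt > 0 else { return false }
        let speed = simd_distance(current.position, previous.position) / dt
        // ~3 m/s is a brisk jog; sustained speeds above that probably mean
        // the wearer is moving through the world, not just turning their head.
        return speed > speedLimit
    }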


How does it work on busses/planes/boats? Are the screens locked to the cabin? Or the horizon? Or do they just fly all over the place like Quest?


Locked to the cabin but I've never tried it personally. It has a lot more advanced tracking than the Quest!


If you wear this thing in traffic you have a death wish, and if you hit someone else while wearing it in traffic I hope you'll be sued into poverty to the point where you'll never be able to buy another one.


If you wear Vision Pro to watch movies, can the field of view achieve a cinema-like effect? Or can we have a movie theater viewing experience on Vision Pro?


I don't know the exact FOV specs but it's definitely wide enough to achieve the average movie theater experience, even an IMAX-like experience.


Thanks. If we could have an IMAX-like experience, that would be fantastic.


Mind me asking how you got into HFT? Curious about this.


It's a pretty simple story: a headhunter reached out through a friend, I applied to a few companies, got a few offers, and joined one of them.

I was already into performance programming beforehand but had no finance knowledge whatsoever.


From the blog this leaped out at me:

> That shift is fundamental. The interface for Vision Pro felt like it was reading my thoughts rather than responding to my inputs. Its infinite, pixel perfect canvas also felt inherently different. I wasn’t constrained by my physical setup, instead my setup was whatever I thought would be most productive for me.

That DOES seem like a paradigm shift in the offing to me. Sure it might be iPhone 1 expensive and uncertain, but it's easy to imagine how incredible a lighter, more affordable version will be to MANY people within 5 years or so.



