
lacking support for canted screen HMDs


twistking

Recommended Posts

So SteamVR was just updated for the Index:

 

https://uploadvr.com/steamvr-index-motion-smoothing-amd/

 

I can't follow all that stuff about DLLs and APIs up there - is it likely that this SteamVR update contains anything of use to people interested in adding canted screen compatibility?

Hmm, I could not find anything of great interest. However, I think there is no need to be concerned that the Index won't run with DCS at all.

Either Valve will brute-force fix it to work with some performance degradation, or they have some magic recipe (which I doubt), or ED will offer proper support for the screen geometry at some point, which I am sure will happen; the question is when...

 

Good evening,

We were not so far off about how it works, I think.

But there's a difference. The viewports, or let's describe them as the virtual cameras in virtual reality, are determined by the screens and are looking towards the screens (see picture 4).

*edit* Let's get the terminology right first. I propose: in the creation pipeline you have the "eyes" or "cameras" that look into the world, where they intersect the "viewport" or "target". In the physical world you still have the "eyes", but the viewport should now be the screen, from where the world gets projected back to the eye. To be more precise, in VR the viewport is not directly analogous to the screen, but to how the screen appears through the lenses. While in 2D the viewport "is" the monitor (or directly analogous to it), in VR the viewport is analogous to screen plus lens. But I think we don't have to think that deeply for "our" problem. Or do we?

 

Good evening!

 

Well, I don't know if it's wrong, but you are very keen on doing the projection to the viewport in a way that the viewport is not only canted (or tilted), but also shifted outwards.

Please imagine a model where both "eyes" project straight forward but intersect the viewport not at 90 degrees, but at an angle.

Compare it to a plate camera, where you tilt the film plane but have the camera pointed straight ahead.

You are always keen on additionally shifting the viewports outwards. So then you are not describing a problem with canted (tilted) screens, but with shifted screens (shifted outwards and tilted).

In the reviews one always reads about "canted" screens, but maybe it would be more fitting to call them tilt-shifted screens, since if you are correct, the outward shift would be a far more dominant problem than just having to account for a tilted screen.
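
To pin down what I mean by those two cases, here is a minimal sketch in C++ with the GLM maths library - all numbers are made up and this is only meant to illustrate the geometry, not how Pimax or DCS actually build their matrices:

#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

// Hypothetical numbers, for illustration only.
const float halfIpd = 0.032f;               // 32 mm from head centre to the left eye
const float cant    = glm::radians(10.0f);  // outward cant of the left screen

// "Canted only": the eye sits at its usual IPD offset and its view direction
// is rotated outwards; the projection frustum itself stays symmetric.
glm::mat4 leftViewCanted =
    glm::rotate(glm::mat4(1.0f), cant, glm::vec3(0.0f, 1.0f, 0.0f)) *
    glm::translate(glm::mat4(1.0f), glm::vec3(halfIpd, 0.0f, 0.0f));
glm::mat4 symmetricProj = glm::frustum(-0.1f, 0.1f, -0.1f, 0.1f, 0.1f, 10000.0f);

// "Tilt-shifted": the view direction is the same, but the screen is additionally
// pushed outwards relative to the eye, which in rendering terms means an
// off-axis (asymmetric) frustum - exactly the shift-lens analogy.
glm::mat4 shiftedProj = glm::frustum(-0.06f, 0.14f, -0.1f, 0.1f, 0.1f, 10000.0f);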

 

The perspective corrections of the pictures of the buildings are hard to compare with what we have in VR. But in the compositor there is also a correction applied to the frames the engine creates, to make the frames match the lenses in the HMD, i.e. to correct the distortion resulting from the shape of the lenses (which is comparable to the building in the picture before and after correction).

The effect that can be seen in the pictures of the A-10 is exactly what could be corrected for the Pimax large FOV by editing the stereo.lua, as described before.

This is incorrect (I have a background in professional photography and cinematography, so I'm quite confident about this). The correction is not a lens distortion correction, but a perspective correction (by definition, I think), which you could also do with a camera (or lens) that allows shifting the target/sensor plane.

 

It's the same correction that was applied to my example with the A-10, only that on the A-10 I did it in the correct dimension (horizontal vs. vertical).

I don't know about the stereo.lua, but I also would have thought that this could be easily corrected - be it in post-processing, or by modifying the viewport in the render pipeline (the latter is - as far as we currently know - not supported by the DCS engine).

 

To be super correct though, the example with the photograph and the example with the A-10 are classic examples of shifting the target (or viewport) and not so much of tilting the target.

So maybe it is more of a shift problem and less of a tilt (canting) problem?

I thought a lot about it, and while in photography the tilted target is not used for perspective control (the shifted target is), a tilted target would still have a similar effect, only less pronounced.

In practice you would tilt the target to tilt the focus plane (focus as in circle of confusion, not focus as in where you are looking), but at the moment we don't have to worry about this in VR, since we don't yet have varifocal VR. Imagine this discussion with varifocal HMDs...

So in short: shifting the target will give you the effect from the example (both church and A-10); tilting (canting) the target will give you a very similar result, only less pronounced.

 

So, right now, I start to believe that the issues with the Pimax have to do with both screen shift and tilt, and therefore have to be corrected with viewport shift and tilt to compensate.

Furthermore, if you correct solely with post-processing, the shift is more demanding to correct than the tilt.

However, I think that sensor shift could perhaps be compensated by IPD, since if the viewports stay fixed relative to each other but you change the distance between the eyes, you effectively shift the camera relative to the viewport.

I would assume that the DCS engine already accounts for IPD, so maybe this could be hacked to compensate for target shift, while target tilt (canting) could easily be fixed in post-processing (by easy I mean without a huge performance impact).
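
Just to compare the two operations, here is a quick pinhole sketch (plain C++, made-up numbers, nothing DCS- or Pimax-specific) of what a viewport shift does to the image versus what an eye/IPD shift does:

#include <cstdio>

// Pinhole model: eye at (ex, 0, 0), image plane at distance f in front of it,
// plane centre shifted sideways by sx. Returns the horizontal image coordinate
// of a point at (X, 0, Z), measured from the plane centre.
float imageX(float X, float Z, float ex, float sx, float f) {
    return f * (X - ex) / Z - sx;
}

int main() {
    const float f = 1.0f;
    const float depths[] = {2.0f, 10.0f, 100.0f};
    for (float Z : depths) {
        // (a) shift the viewport/screen 5 mm outwards, eye stays put:
        float a = imageX(0.5f, Z, 0.0f, 0.005f, f);
        // (b) shift the eye 5 mm the other way (what an IPD tweak would do):
        float b = imageX(0.5f, Z, -0.005f, 0.0f, f);
        std::printf("Z=%6.1f  viewport shift: %.4f   eye shift: %.4f\n", Z, a, b);
    }
    // The viewport shift moves every object by the same amount, while the eye
    // shift is depth-dependent, so (if I have this right) an IPD hack could only
    // match a real screen shift for objects around one chosen distance.
    return 0;
}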

 

"Parallel Projection" is very interesting, as we already talked about VorpX before.

The Pimax 5K+ is now my fourth headset, and in the early days with my first one, the HTC Vive, I played around a bit with VorpX. It depended on the game whether VorpX worked more like "fake" VR or "real" VR. Actually, the lack of information about where objects sit in the depth of the 3D space (which the z-buffer provides) forced VorpX into the "fake" 3D mode; if the application was programmed with a z-buffer, VorpX worked like "real" VR.

I think the "Parallel Projection" option in PiTool does exactly the same as VorpX and makes the Pimax run with games or applications which are not programmed with a z-buffer.

DCS has z-buffer programming, so there is no need for parallel projection.

I disagree. For true stereoscopic 3D you don't need the z-buffer. Z-buffer 3D somehow makes a 3D image from a single viewport. It is 3D at first impression, but when you have objects close to your eyes you notice that they are not truly 3D. They are just put in depth by applying different amounts of parallax. I have not tested it myself, but the effect should be comparable to shooting stereoscopic 3D with long lenses but a common IPD: you get a depth impression, but the objects themselves appear flat, since they look nearly identical in both eyes and the eyes only differ in parallax per object.

So no, I am pretty sure that Pimax does not use the depth buffer to fake 3D from one viewport only, as this would give you a huge performance boost compared to true stereoscopic 3D. It would also look less three-dimensional (you would still have depth perception, though).

 

But what is interesting is why VRzoom seems to work in DCS with Parallel Projection on the Pimax.

My conclusion is simple: parallel projection ignores the position of objects in the depth of the 3D space, and it looks like it also ignores the focus line of the viewport, which could also be seen as a scale for the depth of the picture. There is actually no zooming into the depth of the picture along the focus line.

What can be observed when VRzooming with parallel projection is that things don't appear closer, but smaller than without zoom.

Really hard to say how it works. I have two possible explanations:

Either it is just post-process based, by cropping (and thereby shifting (!!!)) and skewing (tilting) the rendered image.

Or it is indeed done by manipulating the IPD to achieve a target shift and then still applying post-processing to finish it off.
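
To make the first (post-process) idea a bit more concrete, here is a small C++ sketch of what a crop-based zoom on an already rendered eye image could look like. This is purely illustrative - I have no idea what DCS or PiTool actually do, and the 0.62 below is a made-up value:

// Post-process "zoom": sample a smaller rectangle of the rendered texture
// around a chosen centre. cx, cy is the point to zoom towards, in normalized
// [0..1] coordinates. For a canted or shifted eye this centre is NOT (0.5, 0.5),
// which is exactly where a naive centre crop would go wrong.
struct UvRect { float u0, v0, u1, v1; };

UvRect zoomCrop(float zoom, float cx, float cy) {
    float halfW = 0.5f / zoom;   // zoom = 2 -> sample half the width
    float halfH = 0.5f / zoom;
    return { cx - halfW, cy - halfH, cx + halfW, cy + halfH };
}

// 2x zoom around the image centre vs. around a hypothetical off-centre
// projection centre of a canted eye:
UvRect naiveCrop  = zoomCrop(2.0f, 0.5f,  0.5f);
UvRect cantedCrop = zoomCrop(2.0f, 0.62f, 0.5f);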

 

This effect can also be seen in 3D cinema, when you watch a 3D movie with shutter glasses. In the cinema the pictures of the movie also lack depth information, and in scenes - let's say a landscape with a view into the far distance - objects appear tiny as long as the eye has a reference point from these objects to objects in the "depth" of the scene. Compared to VR, 3D cinema is like "fake" 3D.

It's a little bit more complicated. I think it is still true 3D by definition, as it is stereoscopic, but the effect you describe comes from filming with long lenses without adjusting the distance between the cameras.

Alternatively, filmmakers can increase the distance between the cameras to get better three-dimensionality on vistas that would not be as three-dimensional in real life, but thereby change the sense of scale.

 

Parallel projection should never be activated in DCS; it destroys the quality of the depth information of objects in DCS and costs much more performance. Parallel projection also doesn't work right with VRzoom.

Does it, or are you just coming to this conclusion because you think it's not stereoscopic? Because I am sure it is!

I was under the impression that it is not needed for normal flying, but improves VRzoom. Haven't tested it myself, of course.

Just from our discussion I would conclude that it does improve the correctness of DCS imagery, but at a high performance cost.


Edited by twistking

Hi twistking,

 

It's really interesting, and there are things to learn from your perspective as a professional photographer/cinematographer. But I also think there are lots of differences to virtual reality/VR HMDs. Most interesting is that with both, similar effects can be found, but for different reasons.

My goal is to learn exactly how it works, to be able to find the right switches to improve things, and it's quite thrilling with VR, as the technology we have is pretty new to everybody.

Anyhow, I'd like to suggest taking a break in this discussion, as I don't like to always test, conclude and make findings which might or might not point in the right direction.

I just preordered the F-16 and still need to learn some systems on the F-14 and improve my flight skills by practice, or in short: enjoy DCS.

On the viewport and canted displays there is still a lot to say and some pictures to draw from my side, but let me keep it in mind and come back to that later, if you agree and are still interested.

Maybe ED will have fixed the VRzoom in the meantime and we won't have to discuss this problem anymore... hahaha.

It would also have been an advantage if you had a VR headset, ideally a canted one like the Pimax, to verify the findings I observed, and for me to confirm the conclusions you could draw by using the headset... just so we are not purely theoretical all the way.



 

Hey Voight,

Yes, I feel the same. I think we more or less agree on the main problem anyway, and the little aspects we maybe don't agree on get quickly lost in translation (or terminology). Also, on those finer details it quickly becomes speculation, as long as we have no definite explanation of how exactly the screen layout in the Pimax is (is it only canted, or is it also shifted?).

The stupid thing is that none of this is a corporate secret and it should be documented somewhere, since every game engine programmer would need to know all the stuff we are only making educated guesses about.

I did some googling, but I could not find any article or documentation on how any of those things actually work.

If I find a good article, I'll post it here and/or send you a PM.

 

Until then, have fun actually using your HMD (I will definitely get a VR HMD when I get my new PC, which should be later this year)!

 

The only thing I want to add to the discussion is that you should not write off the "parallel projection compatibility" - not because of my derivations here, but because of what I've read about it on the Pimax forums and other places:

It should be more correct than without, by applying additional post-process correction, and it is not fake 3D or unholy trickery.

However, the question is of course whether it is subjectively needed and whether it is worth the performance impact. There is of course also always the possibility that it just doesn't play well with DCS for whatever weird reason...

Also, if you decide to give it a try at some point, maybe disable the IPD override in DCS, just in case parallel projection compensation does indeed use IPD trickery to compensate for the screen shift (if there is any).

 

Keep me posted if you find out anything new, or if you just come across a nice article that explains the inner workings of all this... :smartass:

 

Have a good flight!


It's true that there's no commonly available explanation of how a particular HMD works, but the concept is quite clear for every HMD.

It's not only guessing we're talking about. It's observation, techniques, facts and conclusions.

The conclusions themselves are the subject of the discussion.

 

Apart from the, let's say, marketing-competition-oriented discussion around VR headsets, this discussion here should stay open and not be moved to PM, so that everybody has access to follow, add or contradict without being afraid of being ridiculed or anything. The intention should be to optimize the VR experience for all or particular HMDs in DCS.

So there is also no need for me to keep you posted; you can simply add your conclusions and findings to the discussion.

 

My conclusions on the issue with VRzoom in DCS with the Pimax are right.

The problem can also be visualized in the drawings.

Your drawing, in which you added the viewports, shows how it should be to make VRzoom work.

My drawing, picture_4, shows how it is and why VRzoom doesn't work.

 

But I think I've maybe found a possible solution, which is the following call that would need to be implemented in the VR render pipeline:

 

""

IVRSystem::GetEyeToHeadTransform


HmdMatrix34_t GetEyeToHeadTransform( Hmd_Eye eEye )

 

Returns the transform between the view space and eye space. Eye space is the per-eye flavor of view space that provides stereo disparity. Instead of Model * View * Projection the model is Model * View * Eye * Projection. Normally View and Eye will be multiplied together and treated as View in your application.

 

This matrix incorporates the user's interpupillary distance (IPD).

 

eEye - Eye_Left or Eye_Right. Determines which eye the function should return the eye matrix for.

""

 

If I find more time, I will try to look at and/or add it into the render pipeline; maybe it could be added to the stereo.lua (I don't think it would be that easy), or maybe by hex-editing the OpenVR API in DCS. I think the best possibility would be if PiTool became open source and easier to edit, to implement the eye-to-head transformation.

Besides this single call, it will be necessary to reference the IPD settings to make it work.

The IPD setting within the DCS system options is a different thing than the physical IPD setting of the HMD. But the reference needs to be made to the IPD setting of the left and right viewport.
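
For reference, this is roughly how an application consumes that call according to the documentation quoted above (Model * View * Eye * Projection). It is only a sketch, assuming a current OpenVR SDK and the GLM maths library for convenience - nothing that exists in DCS or PiTool, and whether the stereo.lua can be made to feed matrices like these is exactly the open question:

#include <openvr.h>
#include <glm/glm.hpp>

// Convert OpenVR's row-major matrices into glm (column-major) matrices.
glm::mat4 toGlm(const vr::HmdMatrix34_t& m) {
    return glm::mat4(
        m.m[0][0], m.m[1][0], m.m[2][0], 0.0f,
        m.m[0][1], m.m[1][1], m.m[2][1], 0.0f,
        m.m[0][2], m.m[1][2], m.m[2][2], 0.0f,
        m.m[0][3], m.m[1][3], m.m[2][3], 1.0f);
}
glm::mat4 toGlm(const vr::HmdMatrix44_t& m) {
    return glm::mat4(
        m.m[0][0], m.m[1][0], m.m[2][0], m.m[3][0],
        m.m[0][1], m.m[1][1], m.m[2][1], m.m[3][1],
        m.m[0][2], m.m[1][2], m.m[2][2], m.m[3][2],
        m.m[0][3], m.m[1][3], m.m[2][3], m.m[3][3]);
}

// Per-eye matrices for one frame; vrSystem comes from vr::VR_Init, view is the
// usual world-to-head matrix of the game camera.
void buildEyeMatrices(vr::IVRSystem* vrSystem, const glm::mat4& view,
                      glm::mat4& outLeft, glm::mat4& outRight) {
    // As far as I understand, the eye-to-head transform carries the IPD and,
    // on canted headsets, the cant rotation, so folding its inverse into the
    // view matrix is what makes each eye render correctly.
    glm::mat4 eyeL  = glm::inverse(toGlm(vrSystem->GetEyeToHeadTransform(vr::Eye_Left)));
    glm::mat4 eyeR  = glm::inverse(toGlm(vrSystem->GetEyeToHeadTransform(vr::Eye_Right)));
    glm::mat4 projL = toGlm(vrSystem->GetProjectionMatrix(vr::Eye_Left,  0.1f, 10000.0f));
    glm::mat4 projR = toGlm(vrSystem->GetProjectionMatrix(vr::Eye_Right, 0.1f, 10000.0f));

    outLeft  = projL * eyeL * view;   // Projection * Eye * View, applied to each model
    outRight = projR * eyeR * view;
}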

 

Parallel Projection in PiTool surely needs more investigation to know exactly how it works and what it does. You're right that my conclusion that it ignores the depth information in the image is more of a guess. The fact is that it doesn't work right with VRzoom.

Another fact is that the Pimax works differently from the HMDs which natively work with parallel projection. The Pimax normally works more like 3D shutter glasses: the image is projected to one eye while the other doesn't get an image at the same clock of the panel frequency, and vice versa. That would explain why parallel projection consumes more performance than running without it.

 

Actually, I think the engineers of the XTAL achieved a perfect match between their compositor, the shape of the lenses and the headset design, and that's the reason why the image quality and performance are so much better than with the Pimax - but that, honestly, is just a guess on the side.

 

Knowing more about how the VR pipeline could be adjusted for DCS will also be an advantage for other new and upcoming HMDs.

The HP Reverb's panels have an aspect ratio of 1:1; they are square.

If the rendered VR image is based on the usual rectangular aspect ratio, more will always be rendered than can be projected onto the visible field of view, respectively the displays.

The presumption is that these differences will also be accompanied by a loss of image quality, besides the waste of performance.
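
As far as I know, the runtime already tells the application which per-eye render target size fits the panels and lenses, so the engine at least knows how much it has to render. A small OpenVR sketch, assuming an already initialised vr::IVRSystem - for a headset with square panels like the Reverb this should come back roughly square rather than 16:9:

#include <openvr.h>
#include <cstdio>

// Ask the runtime what per-eye render target it wants for this headset.
void printRecommendedSize(vr::IVRSystem* vrSystem) {
    uint32_t width = 0, height = 0;
    vrSystem->GetRecommendedRenderTargetSize(&width, &height);
    std::printf("recommended per-eye render target: %u x %u\n", width, height);
}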

 

Time will tell how things work out, and I think we're now in a bit of a transitional period with more different HMDs becoming available for DCS, which also brings the need to adjust the settings in the VR pipeline according to the different specifications of the HMDs, to get the best out of the headsets and DCS.


Edited by - Voight -


Have we seen any confirmation from ED that canted screens ARE or AREN'T supported in DCS?

 

I recall when the Pimax 8K and 5K began showing up there were some reports of distortion in the screens, but I thought for the most part those had been resolved by now. I am clearly not fully up to speed on the current status though, hence my asking.

I know that in IL2 there were reports of scaling issues on some headsets, but by tweaking a third-party VR/shader mod I believe most of that was resolved. Hopefully, if this turns out to be a problem with the Index, it won't be a long-term one.


Edited by Madmatt_BFC


I wouldn't ask this way about support for canted VR headsets, as it implies that canted VR headsets might not work with DCS at all if not explicitly supported. Canted-screen or non-planar VR headsets do work well with DCS!

 

But there are minor issues with the Pimax, like the VRzoom. So far the Pimax is the only non-planar headset available, and issues might also depend on the software design of Pimax's engineers.

I'm pretty sure that the ED developers are just as enthusiastic about VR as we are, and that the new designs of VR headsets are just as new to the ED developers as they are to us.

Maybe Wags could tell us a bit more about which VR headsets are tested during the development of DCS.

I think he will be blown away by seeing DCS through the XTAL HMD :)

What I like about Oculus and the Rift S is that it gets a lot of people into VR who never tried it before or hesitated. The Rift S can convince, as it is easy to handle and more affordable than the other VR HMDs.

Where Microsoft failed with the regular Windows Mixed Reality headsets of the first generation, Oculus reached more people through its popularity.

So VR becomes more important for DCS as more people use it.

I think the distortion reports you mentioned are the ones from an early development stage of the Pimax, which are now solved. The main problem with the Pimax and DCS is still that it needs a highly performant PC to run acceptably... but these issues will surely be balanced out by further development of graphics hardware, software APIs and the DCS engine... with regard to the last one, maybe tomorrow we get a nice surprise :) :) :)


I wouldn't ask this way about support for canted VR headsets, as it implies that canted VR headsets might not work with DCS at all if not explicitly supported. Canted-screen or non-planar VR headsets do work well with DCS!

 

It sounds like you are saying that "support" for canted displays is an irrelevant term, and that they should be plug-and-play like all the others. Is that more or less right?


Have we seen any confirmation from ED that canted screens ARE or AREN'T supported in DCS?

 

...

The fact that the Pimax software workaround is required for VR zoom is confirmation that they're not supported.

  • 3 weeks later...
The main downside of canting is that both the existing software content library and the field of GPU rendering hardware are all typically optimized for parallel eyes. Fortunately, this may be readily compensated for in software using the re-projection techniques we already depend on for maintaining a constant frame rate. We just need to do a tiny bit every frame.... This way, apps past, present, and future may continue rendering in parallel as they always have, and they will "just work" for HMDs with mild amounts of cant angles.

from https://www.valvesoftware.com/en/index/deep-dive/fov
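
My own reading of that passage (definitely not Valve's code, just a sketch in C++ with GLM and a made-up cant angle): the "tiny bit every frame" is a rotation-only warp. You render with parallel eyes and then, for every pixel of the canted display, look up where it falls in the parallel image. A pure rotation needs no depth information, which is why it is cheap compared to rendering the canted views natively:

#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>

// For a pixel of the canted display (UV in [0..1]), find the UV in the image
// that was rendered with a parallel, non-canted eye. Assumes a symmetric FOV
// for both images to keep the sketch short.
glm::vec2 reprojectToParallel(glm::vec2 cantedUv, float cantRadians, float tanHalfFov) {
    // Pixel -> view-space ray of the canted eye (looking down -z).
    glm::vec3 ray((cantedUv.x * 2.0f - 1.0f) * tanHalfFov,
                  (cantedUv.y * 2.0f - 1.0f) * tanHalfFov,
                  -1.0f);

    // Undo the cant: rotate the ray into the parallel eye's frame.
    glm::mat3 unCant = glm::mat3(
        glm::rotate(glm::mat4(1.0f), -cantRadians, glm::vec3(0.0f, 1.0f, 0.0f)));
    ray = unCant * ray;

    // Project back onto the parallel image plane and return to [0..1] UVs.
    glm::vec2 ndc(ray.x / -ray.z / tanHalfFov, ray.y / -ray.z / tanHalfFov);
    return ndc * 0.5f + glm::vec2(0.5f);
}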


Actually, I don't see so much of a problem with the canted displays of the Pimax in DCS.

What twistking and I discussed earlier related more to the VRzoom function in DCS, which is in fact broken when using the Pimax with its canted displays. I wonder if or how VRzoom works with the Index or XTAL.

Maybe Valve has fixed this by software optimization for the Index, but another problem in this regard with the Pimax might be that the Pimax by default does not work like HMDs with parallel projection... as everyone knows, parallel projection can be activated for the Pimax, but it has a massive impact on performance, which I wouldn't trade off just to have VRzoom.

 

So I have given up on VRzoom in the meantime... it was a useful and comfortable option, but I can see all gauges etc. clearly with the 5K+. However, MFDs or MFCDs in planes are still a problem to read in detail.

I lean a bit forward to read precisely, but when you get used to your plane's MFDs, you can work them from memory and don't need to read the function of a button before you press it.

It also helps a bit to adjust your position in the cockpit - by leaning back and recentering the VR position, you end up a bit closer to the front panels in the cockpit.

It would be nice to have VRzoom back for the Pimax, but besides this, I don't see an absolute need for more or special support for canted displays.

