Simulated Reality SDK 7500c78d v1.30.2.51085 2024-04-26T11:23:03Z
Stable
MyFinger and MyEyes are the listener classes that accept hand pose and eye pair coordinates from the HandTracker and EyeTracker classes, respectively. The tracked eye positions are used by the CalculateModel() and CalculateProjection() functions. Rendering is done with the OpenGL function glDrawArrays().
The left and right images are placed side by side using the glViewport() function and rendered with the glDrawArrays() function. This side-by-side image is then passed to the weaving part of the SR SDK. During weaving, pixels corresponding to the left and right images are distributed over the pixels of the display according to the display specifications and its calibration.

Note: Loading weaving parameters and setting up a weaving shader have been encapsulated in the GLWeaver class for this example. This class provides a framebuffer that we can render our existing output to. Doing so allows us to get weaving set up quickly, without having to compromise our existing application.
To convert the existing application, which renders side-by-side output, into one that uses software to weave an image for our SR devices, we need to make three small modifications.
The weave function generates our final output by combining pixels from the side-by-side left and right images in a very specific way. These combined colors are used to display an image on the LCD panel under the SR lens; your eyes will only see the colors meant for the left or right view after the light passes through the physical lens.