LeiaSR SDK 720218b2 v1.32.7.6322 2025-02-13T14:55:38Z
Stable
Getting Started - C++/OpenGL Weaving

Build and run C++/OpenGL example

  • Navigate to the install directory, then execute the following commands as administrator in Windows PowerShell:
    cd examples/opengl_weaving
    mkdir build
    cd build
    cmake -G "Visual Studio 15 2017 Win64" ..
    Note:
    • To run these commands as administrator, open the Start menu, scroll to "Windows PowerShell", and select "Run as administrator" from the right-click menu.
    • Pass the generator name that matches the Visual Studio version installed on your system to the -G parameter.
  • Open Visual Studio as Administrator. To do so, open the Start menu, and scroll to Visual Studio, then from the right-click or context menu of Visual Studio, select More > Run as administrator. When Visual Studio starts, (Administrator) appears after the product name in the title bar.
  • Then, open example_opengl_weaving.sln, located in the examples/opengl_weaving/build directory, in Visual Studio. Build the example_opengl_weaving project and run it.
  • If the Eyetracker application is running, a pentahedron object is expected to be displayed in 3D. If the Eyetracker application is not running, default eye positions are used; these are identical for the left and right perspectives, so a stereo image with zero disparity is presented. You can only look around the pentahedron in SR when the Eyetracker application is running in the background.

C++/OpenGL example

  • The expected behavior of the example is to display a pentahedron object in 3D at the tip of the index finger. This requires the eyetracker and handtracker to run simultaneously.
  • Classes MyFinger and MyEyes are the listener classes that accept hand pose and eye pair coordinates from the HandTracker and EyeTracker classes, respectively.
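    The listener pattern used by these classes can be sketched in plain C++. The types below (EyePair, EyePairListener) are hypothetical stand-ins for illustration only; the real SR SDK interfaces differ in their exact names and signatures:

    ```cpp
    #include <array>

    // Hypothetical stand-in for the SDK's eye-pair frame type.
    struct EyePair {
        std::array<float, 3> left;   // left-eye position, millimeters
        std::array<float, 3> right;  // right-eye position, millimeters
    };

    // A listener receives every new coordinate frame through a callback,
    // mirroring how MyEyes accepts eye-pair updates from the EyeTracker.
    struct EyePairListener {
        virtual ~EyePairListener() = default;
        virtual void accept(const EyePair& pair) = 0;
    };

    // Stores the most recent frame so the render loop can read it.
    struct MyEyes : EyePairListener {
        EyePair latest{};
        void accept(const EyePair& pair) override { latest = pair; }
    };
    ```

    The render loop never blocks on the tracker; it simply reads the latest coordinates the listener has stored.
    
    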
  • To start streaming eye positions and hand pose coordinates SRContext has to be initialized:
    initializeSrObjects();
    MyFinger finger(SR::HandTracker::create(*context));
    SR::Screen* screen = SR::Screen::create(*context);
  • The OpenGL library is used to draw objects. For an extensive tutorial on how to visualize objects in OpenGL, follow the link. Briefly: an OpenGL object is represented by a buffer of vertices (the surface of an object can be decomposed into triangles, each defined by three vertices in 3D space) and by a buffer of vertex colors. In this example a pentahedron is drawn with four triangular sides and one square base; the square is split into two triangles. The vertex buffer therefore consists of 6 (4 + 2 triangles) * 3 (three vertices each) * 3 (3 coordinates each) = 54 elements:
    static const GLfloat g_vertex_buffer_data[] = {
        +1.0, -1.0, -1.0, 0.0, +1.0, 0.0, -1.0, -1.0, -1.0,
        0.0, +1.0, 0.0, -1.0, -1.0, +1.0, -1.0, -1.0, -1.0,
        +1.0, -1.0, +1.0, 0.0, +1.0, 0.0, +1.0, -1.0, -1.0,
        +1.0, -1.0, +1.0, +1.0, -1.0, -1.0, -1.0, -1.0, -1.0,
        +1.0, -1.0, +1.0, -1.0, -1.0, -1.0, -1.0, -1.0, +1.0,
        +1.0, -1.0, +1.0, -1.0, -1.0, +1.0, 0.0, +1.0, 0.0,
    };
    Similarly, the buffer of vertex colors also consists of 54 elements:
    // One color for each vertex. XYZ maps to RGB
    static const GLfloat g_color_buffer_data[] = {
        0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, //back
        1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, //left
        0.0, 1.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 1.0, //right
        0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, //up
        0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, //up
        1.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 1.0, 0.0, //front
    };
    It is important to point out that the winding order of a triangle's vertices in the buffer determines which side is treated as the front face: by OpenGL's default convention, counterclockwise order (as seen by the viewer) marks the outer surface of the object and clockwise order the inner surface.
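    The hard-coded buffer above can also be understood constructively. The sketch below (function and variable names are our own, not part of the example) builds the same kind of triangle-list buffer for a pyramid with its apex at (0, +1, 0) and a square base at y = -1, and ends up with exactly 54 floats:

    ```cpp
    #include <array>
    #include <vector>

    // Builds a triangle-list vertex buffer for a pentahedron:
    // 4 side triangles + a square base split into 2 triangles
    // = 6 triangles * 3 vertices * 3 coordinates = 54 floats.
    std::vector<float> makePyramidVertexBuffer() {
        using V = std::array<float, 3>;
        const V apex{0.f, 1.f, 0.f};
        const V a{-1.f, -1.f, -1.f}, b{1.f, -1.f, -1.f},
                c{1.f, -1.f, 1.f},  d{-1.f, -1.f, 1.f};
        // Four sides, then the base split along one diagonal.
        const std::array<std::array<V, 3>, 6> tris{{
            {a, b, apex}, {b, c, apex}, {c, d, apex}, {d, a, apex},
            {a, c, b}, {a, d, c},
        }};
        std::vector<float> buf;
        for (const auto& tri : tris)
            for (const auto& v : tri)
                buf.insert(buf.end(), v.begin(), v.end());
        return buf;
    }
    ```

    Generating the buffer this way makes the 6 * 3 * 3 = 54 arithmetic explicit instead of leaving it implicit in a literal array.
    
    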
  • Once the object is defined in the buffers it can be further visualized in a desired location and for a desired viewpoint. Those transformations are done using CalculateModel() and CalculateProjection() functions. Rendering is done using OpenGL function glDrawArrays().
  • To visualize a 3D object, images for both the left and the right eye have to be rendered. Therefore the projection matrix is computed once for the left eye and once for the right eye:
    // Update Model to put object on the finger
    glm::mat4 View = glm::mat4();
    glm::mat4 Model = CalculateModel();

    // Set projection for left rendering
    glm::mat4 Projection = CalculateProjection(leftEye);
    glm::mat4 MVP = Projection * View * Model;
    glUniformMatrix4fv(MatrixID, 1, GL_FALSE, &MVP[0][0]);

    // Set viewport to the left half
    glViewport(0, 0, renderWidth, renderHeight);

    // Render the scene for the left eye
    pyramid.draw();

    // Set projection for right rendering
    Projection = CalculateProjection(rightEye);
    MVP = Projection * View * Model;
    glUniformMatrix4fv(MatrixID, 1, GL_FALSE, &MVP[0][0]);

    // Set viewport to the right half
    glViewport(renderWidth, 0, renderWidth, renderHeight);

    // Render the scene for the right eye
    pyramid.draw();
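    The per-eye projection is what distinguishes SR rendering from ordinary perspective rendering: each eye needs an asymmetric (off-axis) frustum whose apex sits at the tracked eye position. The sketch below shows the core of such a computation under simplifying assumptions (screen centered at the origin in the z = 0 plane, eye at positive z); it is an illustration, not the SDK's actual CalculateProjection() implementation:

    ```cpp
    // Near-plane extents of an off-axis frustum, as passed to
    // glFrustum or an equivalent matrix constructor.
    struct Frustum { float left, right, bottom, top; };

    // w, h: physical screen size; (ex, ey, ez): eye position relative
    // to the screen center, ez > 0; zNear: near-plane distance.
    Frustum offAxisFrustum(float ex, float ey, float ez,
                           float w, float h, float zNear) {
        // Project the screen edges onto the near plane as seen from the eye.
        const float s = zNear / ez;
        return {(-w / 2 - ex) * s, (w / 2 - ex) * s,
                (-h / 2 - ey) * s, (h / 2 - ey) * s};
    }
    ```

    A centered eye yields a symmetric frustum; as the eye moves sideways, the frustum skews in the opposite direction, which is what produces the look-around effect when the Eyetracker is running.
    
    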
  • Left and right images are stitched together side by side by sequentially applying glViewport() and rendering with glDrawArrays(). This side-by-side image is then passed to the weaving part of the SR SDK. During weaving, pixels of the left and right images are distributed over the pixels of the display according to the display specifications and its calibration.

C++/OpenGL example weaving

Note:

  • Weaving, in our context, is the process that selects, for every subpixel (red, green, blue), which view (left or right) should be shown. Every subpixel can only be seen from certain angles, so for each subpixel it is determined whether the user's left eye or right eye is closest to the direction from which that subpixel is visible.
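The selection rule in the note can be illustrated with a toy model. The function below is only a conceptual sketch: the real weaver derives the visibility direction of each subpixel from the display's specifications and calibration, not from a single angle parameter:

```cpp
#include <cmath>

// Which half of the side-by-side image a subpixel should sample from.
enum class View { Left, Right };

// Picks the view whose eye direction lies closest to the direction
// from which this subpixel is visible (all angles in radians).
View selectView(float subpixelAngle, float leftEyeAngle, float rightEyeAngle) {
    return std::fabs(subpixelAngle - leftEyeAngle) <=
           std::fabs(subpixelAngle - rightEyeAngle)
               ? View::Left : View::Right;
}
```

Applied per subpixel across the panel, a rule of this kind is what turns the side-by-side render into a single woven image.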

Loading weaving parameters and setting up a weaving shader has been encapsulated in the GLWeaver class for this example. This class provides a framebuffer that we can render our existing output to. Doing so will allow us to get weaving set up quickly and without having to compromise our existing application.

To convert the existing application which renders side-by-side output to one that uses software to weave an image for our SR devices, we need to make three small modifications.

  • We need to construct a GLWeaver instance. We only have to provide the size at which we want to render our scene side-by-side:
    weaver = new SR::PredictingGLWeaver(*context, renderWidth * 2, renderHeight, glfwGetWin32Window(window));

    context->initialize();
  • Next our render loop needs to start by binding the framebuffer provided by the GLWeaver:
    // Draw to weaver with application specific shaders
    if (contextValid && weaver != nullptr) {
        glBindFramebuffer(GL_FRAMEBUFFER, weaver->getFrameBuffer());
    }
    else {
        glBindFramebuffer(GL_FRAMEBUFFER, 0); // Start rendering to the display
    }
    The render loop can continue as normal; objects like the pyramid in this example can be rendered to the bound framebuffer. The application can define its own shaders to do so.
  • Finally we should bind the default framebuffer again and use the weave function to generate our final output.
    glBindFramebuffer(GL_FRAMEBUFFER, 0); // Start rendering to the display
    // glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT) does not need to be called because weaving overwrites each pixel on the backbuffer
    glViewport(0, 0, windowWidth, windowHeight);

    // For weaving, the center-point between both eyes is used.
    // It should be converted from millimeters to centimeters
    if (contextValid && weaver != nullptr) {
        weaver->weave((unsigned int)windowWidth, (unsigned int)windowHeight, 0, 0);
    }
    The weave function combines pixels from the side-by-side left and right images into a single woven image. These combined colors drive the LCD panel under the SR lens; after the light passes through the physical lens, each eye sees only the colors meant for it.