Simulated Reality SDK 7500c78d v1.30.2.51085 2024-04-26T11:23:03Z
Stable
Getting Started - C++/OpenGL Weaving

Build and run C++/OpenGL example

  • Navigate to the install directory, then execute the following commands as administrator in "Windows PowerShell":
    cd examples/opengl_weaving
    mkdir build
    cd build
    cmake -G "Visual Studio 15 2017 Win64" ..
    Note:
    • To run these commands as administrator, open the Start menu, scroll to "Windows PowerShell", and select "Run as administrator" from the right-click menu.
    • Select the -G generator argument that matches the Visual Studio version installed on your system; for example, Visual Studio 2019 uses "Visual Studio 16 2019" together with -A x64.
  • Open Visual Studio as administrator: open the Start menu, scroll to Visual Studio, and select More > Run as administrator from the right-click or context menu. When Visual Studio starts, (Administrator) appears after the product name in the title bar.
  • Then open example_opengl_weaving.sln, located in examples/opengl_weaving/build, in Visual Studio. Build the example_opengl_weaving project and run it.
  • If the Eyetracker application is running, a pentahedron is expected to be displayed in 3D. If the Eyetracker application is not running, default eye positions are used; these are identical for the left and right perspective, so a stereo image with zero disparity is presented. You will only be able to look around the pentahedron in SR if the Eyetracker application is running in the background.

C++/OpenGL example

  • The expected behavior of the example is to display a pentahedron in 3D at the tip of the index finger. This requires the eyetracker and the handtracker to run simultaneously.
  • The MyFinger and MyEyes classes are listeners that receive hand pose and eye pair coordinates from the HandTracker and EyeTracker classes, respectively. (A hedged sketch of such a listener follows this list.)
  • To start streaming eye positions and hand pose coordinates, the SRContext has to be initialized:
    initializeSrObjects();
    MyFinger finger(SR::HandTracker::create(*context));
    SR::Screen* screen = SR::Screen::create(*context);
    Here SR::HandTracker::create(SRContext&) creates a functional HandTracker instance, and SR::Screen::create(SRContext&) creates an instance of the Screen class (see screen.h), a WorldObject representing the screen in real space.
  • The OpenGL library is used to draw the objects. For an extensive tutorial on how to visualize objects in OpenGL, follow the link. In this guide we only briefly point out that an OpenGL object is represented by a buffer of vertices (the surface of each object can be decomposed into triangles, and each triangle is defined by three vertices in 3D space) and by a buffer of the vertices' colors. In this example the pentahedron is drawn using four triangular sides and one square side, and the square can be represented by two triangles. The vertex buffer therefore consists of 6 (4 + 2 triangles) * 3 (three vertices each) * 3 (3 coordinates each) = 54 elements (a sketch of how these buffers could be uploaded and drawn follows this list):
    static const GLfloat g_vertex_buffer_data[] = {
        +1.0, -1.0, -1.0, 0.0, +1.0, 0.0, -1.0, -1.0, -1.0,
        0.0, +1.0, 0.0, -1.0, -1.0, +1.0, -1.0, -1.0, -1.0,
        +1.0, -1.0, +1.0, 0.0, +1.0, 0.0, +1.0, -1.0, -1.0,
        +1.0, -1.0, +1.0, +1.0, -1.0, -1.0, -1.0, -1.0, -1.0,
        +1.0, -1.0, +1.0, -1.0, -1.0, -1.0, -1.0, -1.0, +1.0,
        +1.0, -1.0, +1.0, -1.0, -1.0, +1.0, 0.0, +1.0, 0.0,
    };
    Similarly, the buffer of vertex colors also consists of 54 elements:
    // One color for each vertex. XYZ maps to RGB
    static const GLfloat g_color_buffer_data[] = {
        0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, // back
        1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, // left
        0.0, 1.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 1.0, // right
        0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, // up
        0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 1.0, // up
        1.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 1.0, 0.0, // front
    };
    It is important to point out that the winding order of the vertices in the buffer determines which side of a face is the outer surface: by OpenGL's default convention (see glFrontFace()), counterclockwise-wound faces are front-facing (outer) and clockwise-wound faces are back-facing (inner).
  • Once the object is defined in the buffers, it can be visualized at a desired location and from a desired viewpoint. These transformations are performed by the CalculateModel() and CalculateProjection() functions (a generic off-axis projection sketch follows this list). Rendering is done using the OpenGL function glDrawArrays().
  • To visualize a 3D object, images have to be rendered for both the left and the right eye. Therefore the projection matrix is computed once for the left eye and once for the right eye:
    // Update Model to put object on the finger
    glm::mat4 View = glm::mat4();
    glm::mat4 Model = CalculateModel();

    // Set projection for left rendering
    glm::mat4 Projection = CalculateProjection(leftEye);
    glm::mat4 MVP = Projection * View * Model;
    glUniformMatrix4fv(MatrixID, 1, GL_FALSE, &MVP[0][0]);

    // Set viewport to the left half
    glViewport(0, 0, renderWidth, renderHeight);

    // Render the scene for the left eye
    pyramid.draw();

    // Set projection for right rendering
    Projection = CalculateProjection(rightEye);
    MVP = Projection * View * Model;
    glUniformMatrix4fv(MatrixID, 1, GL_FALSE, &MVP[0][0]);

    // Set viewport to the right half
    glViewport(renderWidth, 0, renderWidth, renderHeight);

    // Render the scene for the right eye
    pyramid.draw();
  • The left and right images are stitched together side by side by applying glViewport() sequentially, and each half is rendered with glDrawArrays(). This side-by-side image is then passed to the weaving part of the SR SDK. During weaving, the pixels of the left and right images are distributed over the pixels of the display according to the display specifications and its calibration.
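
The listener sketch referenced above: a minimal eye-pair listener in the spirit of MyEyes, caching the latest eye positions for use in the render loop. The names SR::EyePairListener, SR::InputStream<SR::EyePairStream>, openEyePairStream(), and the SR_eyePair field layout are assumptions based on typical SR SDK headers; verify them against the headers shipped with your SDK version.

  // Minimal sketch of an eye-pair listener (names assumed, see above).
  class MyEyesSketch : public SR::EyePairListener {
      SR::InputStream<SR::EyePairStream> stream; // keeps the stream alive
  public:
      glm::vec3 left, right;
      // Default eye positions (millimeters), used until the first callback.
      explicit MyEyesSketch(SR::EyeTracker* tracker)
          : left(-30.0f, 0.0f, 600.0f), right(30.0f, 0.0f, 600.0f) {
          stream.set(tracker->openEyePairStream(this)); // start listening
      }
      // Called by the tracker whenever a new eye pair is available.
      void accept(const SR_eyePair& eyePair) override {
          left  = glm::vec3(eyePair.eyes[0].x, eyePair.eyes[0].y, eyePair.eyes[0].z);
          right = glm::vec3(eyePair.eyes[1].x, eyePair.eyes[1].y, eyePair.eyes[1].z);
      }
  };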
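
The buffer sketch referenced above: a minimal illustration of how the two 54-element arrays could be uploaded and drawn with glDrawArrays(). This is standard OpenGL rather than code from the example; the attribute locations 0 (position) and 1 (color) are assumptions about the example's vertex shader.

  // Upload positions and colors into two VBOs and draw 18 vertices
  // (54 floats / 3 coordinates per vertex = 18 vertices, 6 triangles).
  GLuint vbo[2];
  glGenBuffers(2, vbo);

  glBindBuffer(GL_ARRAY_BUFFER, vbo[0]);
  glBufferData(GL_ARRAY_BUFFER, sizeof(g_vertex_buffer_data),
               g_vertex_buffer_data, GL_STATIC_DRAW);
  glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, nullptr); // position
  glEnableVertexAttribArray(0);

  glBindBuffer(GL_ARRAY_BUFFER, vbo[1]);
  glBufferData(GL_ARRAY_BUFFER, sizeof(g_color_buffer_data),
               g_color_buffer_data, GL_STATIC_DRAW);
  glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, 0, nullptr); // color
  glEnableVertexAttribArray(1);

  glDrawArrays(GL_TRIANGLES, 0, 18);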
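
The off-axis projection sketch referenced above. This is not the SDK's CalculateProjection() implementation, only a generic asymmetric-frustum construction. It assumes the tracked eye position is expressed in the same units and coordinate system as a screen centered at the origin in the XY plane; halfWidth, halfHeight, zNear, and zFar are hypothetical parameters.

  #include <glm/glm.hpp>
  #include <glm/gtc/matrix_transform.hpp>

  // Generic off-axis projection for one eye (a sketch, not the SDK's code).
  // The eye sits at 'eye' in front of the screen plane (eye.z > 0).
  glm::mat4 OffAxisProjection(const glm::vec3& eye,
                              float halfWidth, float halfHeight,
                              float zNear, float zFar)
  {
      // Project the physical screen edges onto the near plane as seen from the eye.
      float left   = (-halfWidth  - eye.x) * zNear / eye.z;
      float right  = ( halfWidth  - eye.x) * zNear / eye.z;
      float bottom = (-halfHeight - eye.y) * zNear / eye.z;
      float top    = ( halfHeight - eye.y) * zNear / eye.z;

      // Asymmetric frustum, then translate the world so the eye is at the origin.
      return glm::frustum(left, right, bottom, top, zNear, zFar)
           * glm::translate(glm::mat4(1.0f), -eye);
  }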

C++/OpenGL example weaving

Note:

  • Weaving, in our context, is the process that selects for every subpixel (red, green, blue) which view (left or right) should be shown. Every subpixel can only be seen from certain angles, so for every subpixel it is determined whether the user's left or right eye is closest to the direction in which that subpixel can be seen.
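
As a purely conceptual illustration of the note above (the real weaver runs as a shader and uses the display's calibration data), the per-subpixel selection can be pictured as follows. Everything here is hypothetical: subpixelViewAngle(), leftEyeAngle, rightEyeAngle, and the image accessors do not exist in the SDK.

  // Conceptual sketch only: for every subpixel, show the view whose eye
  // is closest to the direction from which that subpixel is visible.
  for (int y = 0; y < height; ++y)
      for (int x = 0; x < width; ++x)
          for (int c = 0; c < 3; ++c) {               // R, G, B subpixels
              double a = subpixelViewAngle(x, y, c);  // hypothetical: from calibration
              bool leftIsCloser =
                  std::abs(a - leftEyeAngle) < std::abs(a - rightEyeAngle);
              output(x, y, c) = leftIsCloser ? leftImage(x, y, c)
                                             : rightImage(x, y, c);
          }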

Loading weaving parameters and setting up a weaving shader have been encapsulated in the GLWeaver class for this example. This class provides a framebuffer that we can render our existing output to, which lets us set up weaving quickly without having to restructure the existing application.

To convert the existing application, which renders side-by-side output, into one that uses software weaving to produce an image for our SR devices, we need to make three small modifications; a condensed sketch combining all three closes this section.

  • We need to construct a GLWeaver instance; this example uses the predicting variant, SR::PredictingGLWeaver (defined in glweaver.h). We only have to provide the size at which we want to render our scene side by side:
    weaver = new SR::PredictingGLWeaver(*context, renderWidth * 2, renderHeight, glfwGetWin32Window(window));
    // set latency for the weaver prediction
    weaver->setLatencyInFrames(2);

    context->initialize();
  • Next, our render loop needs to start by binding the framebuffer provided by the GLWeaver:
    // Draw to weaver with application specific shaders
    if (contextValid && weaver != nullptr) {
        glBindFramebuffer(GL_FRAMEBUFFER, weaver->getFrameBuffer());
    }
    else {
        glBindFramebuffer(GL_FRAMEBUFFER, 0); // Start rendering to the display
    }
    The render loop can then continue as normal; objects like the pyramid in this example are rendered to the bound framebuffer, and the application can define its own shaders to do so.
  • Finally, we should bind the default framebuffer again and use the weave function to generate the final output.
    glBindFramebuffer(GL_FRAMEBUFFER, 0); // Start rendering to the display
    // glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT) does not need to be called because weaving overwrites each pixel on the backbuffer
    glViewport(0, 0, windowWidth, windowHeight);

    // For weaving, the center-point between both eyes is used.
    // It should be converted from millimeters to centimeters
    if (contextValid && weaver != nullptr) {
        weaver->weave((unsigned int)windowWidth, (unsigned int)windowHeight, 0, 0);
    }
    The weave function combines pixels from the side-by-side left and right images in a very specific way. The combined colors are used to display an image on the LCD panel under the SR lens; your eyes will only see the colors meant for the left or right view after the light passes through the physical lens.
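
Putting the three modifications together, a condensed render loop could look as follows. This is a sketch, not code from the example: renderSideBySideScene() is a hypothetical stand-in for the application's existing drawing code (the stereo rendering shown earlier), while the weaver calls are exactly the ones shown above.

  // Condensed sketch of the three modifications in one render loop.
  // Requires GLFW_EXPOSE_NATIVE_WIN32 and <GLFW/glfw3native.h> for
  // glfwGetWin32Window(), as in the example itself.
  weaver = new SR::PredictingGLWeaver(*context, renderWidth * 2, renderHeight,
                                      glfwGetWin32Window(window));
  weaver->setLatencyInFrames(2); // latency for the weaver prediction
  context->initialize();

  while (!glfwWindowShouldClose(window)) {
      // 1. Render the side-by-side scene into the weaver's framebuffer.
      glBindFramebuffer(GL_FRAMEBUFFER, weaver->getFrameBuffer());
      renderSideBySideScene(); // hypothetical: the application's own drawing

      // 2. Switch back to the default framebuffer covering the window.
      glBindFramebuffer(GL_FRAMEBUFFER, 0);
      glViewport(0, 0, windowWidth, windowHeight);

      // 3. Weave the side-by-side image into the final output.
      weaver->weave((unsigned int)windowWidth, (unsigned int)windowHeight, 0, 0);

      glfwSwapBuffers(window);
      glfwPollEvents();
  }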