LeiaSR SDK 720218b2 v1.32.7.6322 2025-02-13T14:55:38Z
Stable
Getting Started - C++/DirectX Weaving

Build and run C++/DirectX example

  • Navigate to the install directory and execute the following commands as administrator:
    cd examples/external/example_directX_weaving
    mkdir build
    cd build
    cmake -G "Visual Studio 15 2017 Win64" ..
    Note:
    • To run these commands as administrator, open the Start menu, scroll to "Windows PowerShell", and select "Run as administrator" from the right-click menu.
    • Select the correct generator for the -G parameter based on the Visual Studio version installed on your system (for example, "Visual Studio 17 2022" together with -A x64 for Visual Studio 2022).
  • Open Visual Studio as Administrator. To do so, open the Start menu, and scroll to Visual Studio, then from the right-click or context menu of Visual Studio, select More > Run as administrator. When Visual Studio starts, (Administrator) appears after the product name in the title bar.
  • Then open example_directX_weaving.sln from the build directory in Visual Studio, build the example_directX_weaving project, and run it.
  • If the Eyetracker application is running, a pentahedron object is expected to be displayed in 3D. If the Eyetracker application is not running, the default eye positions are used; these are identical for the left and right perspective, so a stereo image with zero disparity is presented. You will only be able to look around the pentahedron in SR if the Eyetracker application is running in the background.

C++/DirectX example

Note:

  • Weaving, in our context, is the process that selects for every subpixel (red, green, blue) which view (left or right) should be shown. Every subpixel is only visible from certain angles, so for every subpixel it is determined whether the user's left eye or right eye is closer to the direction in which that subpixel can be seen. A conceptual sketch of this selection is given below.
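
To make the idea concrete, the sketch below mimics this per-subpixel selection in plain C++. It is a conceptual illustration only, not the SDK's weaving implementation: on real hardware the direction in which a subpixel is visible follows from the display's lens geometry and calibration, and here it is replaced by a toy stand-in.

    // Conceptual weaving sketch: for every subpixel, pick the view (left or right)
    // whose eye lies closest to the direction from which that subpixel is visible.
    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };

    static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
    static Vec3 normalize(Vec3 v) {
        float len = std::sqrt(dot(v, v));
        return { v.x / len, v.y / len, v.z / len };
    }

    // Toy stand-in for the display geometry: alternating subpixels are visible slightly
    // to the left or right of the screen normal. On real hardware this direction is
    // determined by the lenticular optics and the display calibration.
    static Vec3 toyEmissionDirection(int x, int channel) {
        float angle = (((x * 3 + channel) % 2) == 0) ? -0.1f : 0.1f;  // radians
        return normalize({ std::sin(angle), 0.0f, std::cos(angle) });
    }

    // Returns 0 if this subpixel should show the left view, 1 for the right view.
    // toLeftEye/toRightEye approximate the directions from the screen to the eyes.
    static int selectView(int x, int channel, Vec3 toLeftEye, Vec3 toRightEye) {
        Vec3 d = toyEmissionDirection(x, channel);
        // The eye whose direction has the larger dot product with d is angularly closest.
        return dot(d, normalize(toLeftEye)) >= dot(d, normalize(toRightEye)) ? 0 : 1;
    }

    int main() {
        // Illustrative eye positions relative to the screen centre (metres).
        Vec3 leftEye  { -0.03f, 0.0f, 0.6f };
        Vec3 rightEye { +0.03f, 0.0f, 0.6f };
        for (int x = 0; x < 4; ++x)
            for (int c = 0; c < 3; ++c)
                std::printf("subpixel (%d, channel %d) -> %s view\n", x, c,
                            selectView(x, c, leftEye, rightEye) == 0 ? "left" : "right");
        return 0;
    }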

The expected behavior of the example is to display a pentahedron object in 3D at the tip of the index finger. This requires the EyeTracker class.

  • Class MyEyes is the listener that accepts eye-pair coordinates from the EyeTracker class (a sketch of such a listener is given at the end of this section).
  • To start streaming eye positions, the SRContext has to be initialized:
    initializeSrObjects();
    SR::Screen* screen = SR::Screen::create(*context);
    Here SR::Screen::create(SRContext&) creates an instance of the Screen class, a WorldObject representing the screen in real space (defined in screen.h).
  • The DirectX library is used to draw objects. For an extensive tutorial on how to visualize objects in DirectX, refer to the DirectX documentation. In this guide we briefly point out that DirectX objects are represented by a buffer of vertices (the surface of each object can be represented by a set of triangles, each defined by three vertices in 3D space) and by a buffer of the vertices' colors. In this example a pentahedron is drawn using four triangular sides and one square side; the square can be represented by two triangles. The buffer of vertices therefore consists of 6 triangles (4 + 2) * 3 vertices each = 18 XMFLOAT3 entries, i.e. 6 * 3 * 3 = 54 coordinate values:
    static const DirectX::XMFLOAT3 g_vertex_buffer_data[] = {
        {+1.0, -1.0, -1.0}, {0.0, +1.0, 0.0}, {-1.0, -1.0, -1.0},
        {0.0, +1.0, 0.0}, {-1.0, -1.0, +1.0}, {-1.0, -1.0, -1.0},
        {+1.0, -1.0, +1.0}, {0.0, +1.0, 0.0}, {+1.0, -1.0, -1.0},
        {+1.0, -1.0, +1.0}, {+1.0, -1.0, -1.0}, {-1.0, -1.0, -1.0},
        {+1.0, -1.0, +1.0}, {-1.0, -1.0, -1.0}, {-1.0, -1.0, +1.0},
        {+1.0, -1.0, +1.0}, {-1.0, -1.0, +1.0}, {0.0, +1.0, 0.0}
    };
    Similarly, the buffer of vertex colors consists of 18 XMFLOAT3 entries (54 values), one color per vertex:
    // One color for each vertex. XYZ maps to RGB
    static const DirectX::XMFLOAT3 g_color_buffer_data[] = {
        {0.0, 1.0, 0.0}, {0.0, 1.0, 0.0}, {0.0, 1.0, 0.0}, //back
        {1.0, 0.0, 0.0}, {1.0, 0.0, 0.0}, {1.0, 0.0, 0.0}, //left
        {0.0, 1.0, 1.0}, {0.0, 1.0, 1.0}, {0.0, 1.0, 1.0}, //right
        {0.0, 0.0, 1.0}, {0.0, 0.0, 1.0}, {0.0, 0.0, 1.0}, //up
        {0.0, 0.0, 1.0}, {0.0, 0.0, 1.0}, {0.0, 0.0, 1.0}, //up
        {1.0, 1.0, 0.0}, {1.0, 1.0, 0.0}, {1.0, 1.0, 0.0}  //front
    };
    It is important to point out that the clockwise or counterclockwise (winding) order of the vertices in the buffer determines which side of each triangle is treated as the inner or outer surface of the object, and hence which faces the rasterizer culls.
  • Once the object is defined in the buffers, it can be visualized at a desired location and for a desired viewpoint. These transformations are done by the CalculateModel() and CalculateProjection() functions, and rendering is done by RenderScene() (a sketch of the off-axis projection typically computed from an eye position is given at the end of this section).
  • In order to visualize the 3D object, both the left-eye and the right-eye image have to be rendered. Therefore the projection matrix is computed once for the left eye and once for the right eye, and the scene is drawn into the left and right halves of the render target:
    // Update Model to put object on the finger
    DirectX::XMMATRIX View = DirectX::XMMatrixIdentity();
    DirectX::XMMATRIX Model = CalculateModel();

    // Set projection for left rendering
    DirectX::XMMATRIX Projection = CalculateProjection(DirectX::XMLoadFloat3(&leftEye));
    DirectX::XMMATRIX MVP = Model * View * Projection;
    TransformationData.Transformation = MVP;
    renderer.GetContext()->UpdateSubresource(TransformationBuffer, 0, nullptr, &TransformationData, 0, 0);
    renderer.GetContext()->VSSetConstantBuffers(0, 1, &TransformationBuffer);

    // Set viewport to the left half
    D3D11_VIEWPORT Viewport;
    Viewport.TopLeftX = 0.0f;
    Viewport.TopLeftY = 0.0f;
    Viewport.Width = renderWidth;
    Viewport.Height = renderHeight;
    Viewport.MinDepth = 0.0f;
    Viewport.MaxDepth = 1.0f;
    renderer.GetContext()->RSSetViewports(1, &Viewport);

    // Render the scene for the left eye
    pyramid.draw();

    // Set projection for right rendering
    Projection = CalculateProjection(DirectX::XMLoadFloat3(&rightEye));
    MVP = Model * View * Projection;
    renderer.GetContext()->UpdateSubresource(TransformationBuffer, 0, nullptr, &TransformationData, 0, 0);
    renderer.GetContext()->VSSetConstantBuffers(0, 1, &TransformationBuffer);

    // Set viewport to the right half
    Viewport.TopLeftX = renderWidth;
    renderer.GetContext()->RSSetViewports(1, &Viewport);

    // Render the scene for the right eye
    pyramid.draw();
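
As referenced above, a minimal sketch of what a MyEyes listener can look like is given here. The guide itself only names MyEyes, EyeTracker and SRContext; the SR::EyesListener base class, the accept() callback signature, the openEyesStream() call, the initialize() call and the include paths below are assumptions based on the common SR SDK pattern, so check the example source and the SDK headers for the exact names and signatures.

    // Sketch of an eye-pair listener in the spirit of MyEyes (names other than
    // MyEyes, EyeTracker and SRContext are assumptions, not taken from this example).
    #include <mutex>
    #include "sr/management/srcontext.h"
    #include "sr/sense/eyetracker/eyetracker.h"

    class MyEyes : public SR::EyesListener {
    public:
        explicit MyEyes(SR::EyeTracker* tracker)
            : stream(tracker->openEyesStream(this)) {}   // register this listener

        // Called by the SDK whenever new eye positions are available.
        void accept(const SR_eyePair& eyePair) override {
            std::lock_guard<std::mutex> lock(mutex);
            latest = eyePair;                            // keep the newest pair for the render loop
        }

        SR_eyePair latestPair() {
            std::lock_guard<std::mutex> lock(mutex);
            return latest;
        }

    private:
        SR::InputStream<SR::EyesStream> stream;          // keeps the eye stream open
        std::mutex mutex;
        SR_eyePair latest{};
    };

    // Assumed wiring, of the kind performed by initializeSrObjects():
    //   SR::SRContext* context = new SR::SRContext;
    //   MyEyes* eyes = new MyEyes(SR::EyeTracker::create(*context));
    //   context->initialize();   // starts streaming eye positions to the listener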
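
CalculateProjection() above turns a tracked eye position into a projection matrix. A common way to build such a matrix is an off-axis (asymmetric-frustum) projection anchored to the physical screen; the sketch below shows that technique with DirectXMath. The coordinate frame (screen centre at the origin, x right, y up, viewer at positive z looking toward the screen), the screen size and the clip-plane values are illustrative assumptions, and the example's actual CalculateProjection() may be implemented differently.

    // Off-axis projection from a tracked eye position (assumptions: screen centre at the
    // origin, x right, y up, eye at z > 0; screen size and clip planes are illustrative).
    #include <DirectXMath.h>

    DirectX::XMMATRIX ProjectionFromEye(DirectX::XMFLOAT3 eye,
                                        float screenHalfWidth,   // half the physical screen width
                                        float screenHalfHeight,  // half the physical screen height
                                        float nearZ = 0.1f,
                                        float farZ  = 100.0f)
    {
        using namespace DirectX;

        // Frustum edges: project the screen borders onto the near plane as seen from the eye.
        float scale  = nearZ / eye.z;                    // eye.z = distance from eye to screen plane
        float left   = (-screenHalfWidth  - eye.x) * scale;
        float right  = ( screenHalfWidth  - eye.x) * scale;
        float bottom = (-screenHalfHeight - eye.y) * scale;
        float top    = ( screenHalfHeight - eye.y) * scale;

        // Move the world so the eye sits at the origin, then apply the asymmetric frustum.
        XMMATRIX toEye   = XMMatrixTranslation(-eye.x, -eye.y, -eye.z);
        XMMATRIX frustum = XMMatrixPerspectiveOffCenterRH(left, right, bottom, top, nearZ, farZ);
        return toEye * frustum;   // row-vector convention, as in MVP = Model * View * Projection
    }

As the eye moves, the frustum edges shift, which is what makes the rendered pentahedron appear fixed in space relative to the physical screen when head tracking is active.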