Simulated Reality SDK 7500c78d v1.30.2.51085 2024-04-26T11:23:03Z
Stable
Getting Started - C++/DirectX Weaving

Build and run C++/DirectX example

  • Navigate to the install directory, then go to the examples directory and execute the following commands as administrator:
    cd examples/external/example_directX_weaving
    mkdir build
    cd build
    cmake -G "Visual Studio 15 2017 Win64" ..
    Note:
    • To run these commands as administrator, open the Start menu, scroll to "Windows PowerShell", and select "Run as administrator" from the right-click menu.
    • Select the correct generator for the -G parameter based on the version of Visual Studio installed on your system.
  • Open Visual Studio as Administrator. To do so, open the Start menu, and scroll to Visual Studio, then from the right-click or context menu of Visual Studio, select More > Run as administrator. When Visual Studio starts, (Administrator) appears after the product name in the title bar.
  • Open example_directX_weaving.sln in Visual Studio, then build the example_directX_weaving project and run it.
  • If the Eyetracker application is running, a pentahedron object is expected to be displayed in 3D. If the Eyetracker application is not running, the default eye positions will be used; these are identical for the left and right perspective, so a stereo image with zero disparity will be presented. You will only be able to look around the pentahedron in SR if the Eyetracker application is running in the background.

C++/DirectX example

Note:

  • Weaving, in our context, is the process that selects, for every subpixel (red, green, blue), which view (left or right) should be shown. Every subpixel can only be seen from certain angles, so for each subpixel it is determined whether the user's left eye or right eye is closest to the direction from which that subpixel can be seen.

The expected behavior of the example is to display a pentahedron object in 3D at the tip of the index finger. This requires the eyetracker class.

  • Class MyEyes is the listener that accepts eye-pair coordinates from the EyeTracker class.
  • To start streaming eye positions, the SRContext has to be initialized:
    initializeSrObjects();
    SR::Screen* screen = SR::Screen::create(*context);
  • The DirectX library is used to draw objects. For an extensive tutorial on how to visualize objects in DirectX, follow the link. In this guide we briefly point out that DirectX objects are represented by a buffer of vertices (the surface of each object can be represented by a set of triangles, each of which is defined by three vertices in 3D space) and by a buffer of the vertices' colors. In this example a pentahedron is drawn using four triangular sides and one square side; the square can be represented by two triangles. Therefore the buffer of vertices consists of 6 triangles (4 + 2) * 3 vertices each * 3 coordinates each = 54 elements:
    static const DirectX::XMFLOAT3 g_vertex_buffer_data[] = {
        {+1.0, -1.0, -1.0}, {0.0, +1.0, 0.0}, {-1.0, -1.0, -1.0},
        {0.0, +1.0, 0.0}, {-1.0, -1.0, +1.0}, {-1.0, -1.0, -1.0},
        {+1.0, -1.0, +1.0}, {0.0, +1.0, 0.0}, {+1.0, -1.0, -1.0},
        {+1.0, -1.0, +1.0}, {+1.0, -1.0, -1.0}, {-1.0, -1.0, -1.0},
        {+1.0, -1.0, +1.0}, {-1.0, -1.0, -1.0}, {-1.0, -1.0, +1.0},
        {+1.0, -1.0, +1.0}, {-1.0, -1.0, +1.0}, {0.0, +1.0, 0.0}
    };
    Similarly, the buffer of vertex colors also consists of 54 elements:
    // One color for each vertex. XYZ maps to RGB
    static const DirectX::XMFLOAT3 g_color_buffer_data[] = {
        {0.0, 1.0, 0.0}, {0.0, 1.0, 0.0}, {0.0, 1.0, 0.0}, // back
        {1.0, 0.0, 0.0}, {1.0, 0.0, 0.0}, {1.0, 0.0, 0.0}, // left
        {0.0, 1.0, 1.0}, {0.0, 1.0, 1.0}, {0.0, 1.0, 1.0}, // right
        {0.0, 0.0, 1.0}, {0.0, 0.0, 1.0}, {0.0, 0.0, 1.0}, // up
        {0.0, 0.0, 1.0}, {0.0, 0.0, 1.0}, {0.0, 0.0, 1.0}, // up
        {1.0, 1.0, 0.0}, {1.0, 1.0, 0.0}, {1.0, 1.0, 0.0} // front
    };
    It is important to point out that the winding order of the vertices in the buffer (clockwise or counterclockwise) determines which side of each triangle is treated as its outer, front-facing surface; with DirectX's default rasterizer state, triangles wound clockwise as seen from the camera are front-facing, and counterclockwise ones are culled as back faces.
  • Once the object is defined in the buffers, it can be visualized at a desired location and for a desired viewpoint. Those transformations are done by the CalculateModel() and CalculateProjection() functions; rendering is done by the RenderScene() function.
  • In order to visualize a 3D object, both the left-eye and right-eye images have to be rendered. Therefore the projection matrix is computed once for the left eye and once for the right eye:
    // Update Model to put object on the finger
    DirectX::XMMATRIX View = DirectX::XMMatrixIdentity();
    DirectX::XMMATRIX Model = CalculateModel();

    // Set projection for left rendering
    DirectX::XMMATRIX Projection = CalculateProjection(DirectX::XMLoadFloat3(&leftEye));
    DirectX::XMMATRIX MVP = Model * View * Projection;
    TransformationData.Transformation = MVP;
    renderer.GetContext()->UpdateSubresource(TransformationBuffer, 0, nullptr, &TransformationData, 0, 0);
    renderer.GetContext()->VSSetConstantBuffers(0, 1, &TransformationBuffer);

    // Set viewport to the left half
    D3D11_VIEWPORT Viewport;
    Viewport.TopLeftX = 0.0f;
    Viewport.TopLeftY = 0.0f;
    Viewport.Width = renderWidth;
    Viewport.Height = renderHeight;
    Viewport.MinDepth = 0.0f;
    Viewport.MaxDepth = 1.0f;
    renderer.GetContext()->RSSetViewports(1, &Viewport);

    // Render the scene for the left eye
    pyramid.draw();

    // Set projection for right rendering
    Projection = CalculateProjection(DirectX::XMLoadFloat3(&rightEye));
    MVP = Model * View * Projection;
    TransformationData.Transformation = MVP;
    renderer.GetContext()->UpdateSubresource(TransformationBuffer, 0, nullptr, &TransformationData, 0, 0);
    renderer.GetContext()->VSSetConstantBuffers(0, 1, &TransformationBuffer);

    // Set viewport to the right half
    Viewport.TopLeftX = renderWidth;
    renderer.GetContext()->RSSetViewports(1, &Viewport);

    // Render the scene for the right eye
    pyramid.draw();