Simulated Reality SDK 7500c78d v1.30.2.51085 2024-04-26T11:23:03Z
Stable
SR Application guidelines

Rules and Recommendations for making SR applications

Introduction

This document describes Rules and Recommendations for making SR applications.

The Rules must be followed for any SR application. Otherwise, your application or other SR applications on the user’s system may not work.

The Recommendations are optional: you may choose to ignore them.

Terms and Conditions

An application developer is expected to understand and comply with the Terms & Conditions that accompany the SR SDK. Furthermore, to publish an application on the SR App Store, a developer should register as such; a separate set of Terms & Conditions applies to that registration.

SR overview

Simulated Reality is the experience of virtual objects as if they exist in the real world, without any wearables.

SR is an immersive experience, yet it has several particularities that make it different from VR or AR. An SR device works like a virtual window. What makes it interesting is that an object can move between being inside or outside the window. The main difference with other technologies is that with SR there is no need to wear a headset to see depth. Additionally, it is possible to interact with virtual objects without wearables.

The field of view is limited to the size of the display, unlike VR or AR, which can allow a full 360-degree view if the user moves around. Also, SR deals with virtual worlds and does not mix reality with virtual creations. Note that these differences can be overcome by new ways of interacting with the simulated world. We always encourage developers to explore new ways of applying simulated reality to improve the user’s experience.

The display has a lens that directs each pixel’s light in a certain direction. This means that one set of pixels can be seen from one angle, and another set from another angle. This allows the SR system to display a different image for the left eye and the right eye. The lens can be switched on or off.

Another important part is the eye tracker. The eye tracker produces a stream of eye pairs: x, y, z coordinates of the user’s eyes. These eye pairs should be used correctly in the SR application to produce the best result.
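
The sketch below shows one way an application might keep track of the most recent eye pair from this stream. The EyePair struct and the listener interface are hypothetical stand-ins used for illustration; the actual types and callback mechanism are defined in the SR SDK headers.

    #include <mutex>

    // Hypothetical eye-pair type (stand-in for the SDK's type): positions of
    // both eyes, for example in millimetres relative to the display.
    struct EyePair {
        float leftX = -32.5f, leftY = 0.0f, leftZ = 600.0f;   // assumed default viewing position
        float rightX = 32.5f, rightY = 0.0f, rightZ = 600.0f;
    };

    // Hypothetical listener interface; the SR SDK defines its own callback
    // mechanism for receiving eye-pair updates from the eye tracker.
    class EyePairListener {
    public:
        virtual ~EyePairListener() = default;
        virtual void accept(const EyePair& eyes) = 0;
    };

    // Stores the most recent eye pair so the render thread can read it just
    // before computing the per-eye projections (see the Rendering section).
    class LatestEyePair : public EyePairListener {
    public:
        void accept(const EyePair& eyes) override {
            std::lock_guard<std::mutex> lock(mutex_);
            latest_ = eyes;
        }
        EyePair get() const {
            std::lock_guard<std::mutex> lock(mutex_);
            return latest_;
        }
    private:
        mutable std::mutex mutex_;
        EyePair latest_;
    };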

Displaying images in SR consists of two parts: look-around and weaving.

Look-around means the user can “look around” the image by changing their position. The perspective in the image should change to match their viewing position. It results in a “left image” with the perspective as seen from the left eye, and a “right image” with the perspective as seen from the right eye.

Weaving means combining the left and right images into one image displayed on screen. The pixels which can be seen from the left eye will show the left image, and the pixels which can be seen from the right eye will show the right image.
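
Put together, a frame in an SR application roughly follows the structure sketched below, reusing the LatestEyePair helper from the previous sketch. Renderer, Weaver, Texture and the projection helper are illustrative application-side stand-ins, not SR SDK names; the weaving step itself is provided by the SR weaving plugin, and the projection calculation should follow the examples in the SR SDK.

    // Illustrative stand-ins for the application's own render abstractions.
    struct Texture;                                         // one rendered eye image
    struct Renderer {
        Texture* renderScene(const float projection[16]);   // render with a given projection
        void present();                                     // display the result (v-sync on)
    };
    struct Weaver {
        void weave(Texture* left, Texture* right);          // weaving step from the SR weaving plugin (name assumed)
    };

    // Per-eye off-axis projection, computed as in the SR SDK examples.
    void computeEyeProjection(float x, float y, float z, float out[16]);

    void renderFrame(Renderer& renderer, Weaver& weaver, const LatestEyePair& tracker) {
        // 1. Look-around: sample the latest eye positions as late as possible.
        const EyePair eyes = tracker.get();
        float leftProjection[16], rightProjection[16];
        computeEyeProjection(eyes.leftX, eyes.leftY, eyes.leftZ, leftProjection);
        computeEyeProjection(eyes.rightX, eyes.rightY, eyes.rightZ, rightProjection);

        // 2. Render the scene twice, once per eye, with matching projections.
        Texture* left  = renderer.renderScene(leftProjection);
        Texture* right = renderer.renderScene(rightProjection);

        // 3. Weaving: combine both images into the single on-screen image.
        weaver.weave(left, right);

        // 4. Present the woven image unmodified.
        renderer.present();
    }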

General

  • Rule: Fill in the Product Name in the executable properties with a user-friendly name for your application, if possible. Explanation: The user can see which SR applications are currently running. This is the name they will see. Note that for Unreal Engine applications, this may not be possible.
  • Rule: Do not shut down or restart any SR services. Explanation: These can be in use by other SR applications.
  • Rule: Do not change global settings which impact other SR applications. Explanation: Changing these settings can negatively affect other SR applications. In particular, do not change power modes or global settings of the video driver.
  • Rule: Use only one SR context per application. Explanation: The SR context is designed to be global to an application. Specifically, do not create separate SR contexts per tab of your application (see the sketch after this list).
  • Recommendation: Destroy the SR context when running in the background (for example minimized to tray) for extended periods of time. Explanation: The user can see which SR applications are currently running. Your application would appear in this list even if the user thinks it has closed.
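
A common way to satisfy the single-context rule, while still allowing the context to be released when the application runs in the background, is to share one instance across the whole application. The SrContext class below is a hypothetical stand-in; the real context type and its lifecycle calls are defined by the SR SDK.

    #include <memory>
    #include <mutex>

    class SrContext { /* hypothetical stand-in: connects to the SR services */ };

    // One SR context for the whole application. Every window, tab or subsystem
    // asks for this shared instance instead of constructing its own, and the
    // application can release it when it stays in the background for a long time.
    class SrContextHolder {
    public:
        static std::shared_ptr<SrContext> acquire() {
            std::lock_guard<std::mutex> lock(mutex());
            if (!instance())
                instance() = std::make_shared<SrContext>();
            return instance();
        }
        static void release() {               // e.g. when minimized to tray for a while
            std::lock_guard<std::mutex> lock(mutex());
            instance().reset();
        }
    private:
        static std::shared_ptr<SrContext>& instance() {
            static std::shared_ptr<SrContext> context;
            return context;
        }
        static std::mutex& mutex() {
            static std::mutex m;
            return m;
        }
    };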

Lens switching

The lens can be switched on or off. When switched on, the light of each pixel is directed in a certain direction. This is also called 3D mode. When switched off, the light passes through without changing direction, as if there is no lens on the display. This is also called 2D mode.

The application can request the lens to be on by setting the lens hint. It should unset the lens hint when it is no longer displaying 3D content. The application is responsible for doing this correctly. If no application has their lens hint set, the lens switches off.

The lens state can change outside of the control of the application. The application is responsible for responding to the lens state by displaying appropriate content (2D when off, 3D when on).

Switching the lens on takes less than a second, switching the lens off can take a few seconds.
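
The sketch below shows one way to keep the lens hint in sync with what the application is showing, following the rules listed after it. The LensHint type and its set/unset calls are hypothetical stand-ins for the SDK’s lens hint interface.

    // Hypothetical stand-in for the SDK's lens hint interface.
    class LensHint {
    public:
        void set()   { /* forward to the SDK call that sets the lens hint */ }
        void unset() { /* forward to the SDK call that unsets the lens hint */ }
    };

    // Keeps the lens hint in sync with the application's state.
    class LensHintController {
    public:
        explicit LensHintController(LensHint& hint) : hint_(hint) {
            hint_.unset();   // the lens is on by default; unset until 3D content is shown
        }
        // Call whenever the displayed content or the window state changes.
        void update(bool showing3dContent, bool hasFocus, bool minimized) {
            const bool want3d = showing3dContent && hasFocus && !minimized;
            if (want3d == requested_)
                return;                       // no change; avoids needless switching
            requested_ = want3d;
            if (want3d)
                hint_.set();                  // request 3D mode before showing 3D content
            else
                hint_.unset();                // allow the lens to switch off for 2D content
        }
    private:
        LensHint& hint_;
        bool requested_ = false;
    };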

  • Rule: Do not show exclusively 2D content when the lens is on. Explanation: When the lens is on, 2D content is distorted.
  • Rule: Unset the lens hint when showing exclusively 2D content. Explanation: When the lens is on, 2D content is distorted. By unsetting the lens hint, the lens should turn off.
  • Rule: Do not show 3D content when the lens is off. Explanation: When the lens is off, the left and right images of 3D content will appear half-transparent on top of each other.
  • Rule: Set the lens hint before showing 3D content. Explanation: When the lens is off, the left and right images of 3D content will appear half-transparent on top of each other. By setting the lens hint, the lens should turn on.
  • Rule: When an SR Context object is created, show 3D content or unset the lens hint. Explanation: The lens is on by default.
  • Rule: Unset the lens hint when minimizing the application. Explanation: When the application is minimized, it can no longer display 3D content.
  • Rule: Unset the lens hint and show 2D content when the application loses focus. Explanation: There could be a non-SR application window displayed in front of the application. The lens must be off to display that window correctly.
  • Recommendation: Avoid switching the lens on and off quickly. Explanation: The content is distorted during lens switching, which can take a few seconds.
  • Recommendation: Treat lens switching as instantaneous. Explanation: Lens switching can take a few seconds, but the application does not know when it is finished. It is easier to treat lens switching as instantaneous.

System events

The System Sense generates events about the global state of the SR system.
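
A sketch of reacting to these events is shown below; the event identifiers mirror the events named in the rules that follow, while the handler class and callbacks are hypothetical application-side code.

    #include <functional>
    #include <utility>

    // Event identifiers mirroring the system events described below.
    enum class SystemEvent { SRUnavailable, SRRestored, ContextInvalid };

    // Example reaction to system events; the callbacks are application-side.
    class SystemEventHandler {
    public:
        SystemEventHandler(std::function<void(bool)> show3d,      // true = show 3D, false = show 2D
                           std::function<void()> rebuildContext)
            : show3d_(std::move(show3d)), rebuildContext_(std::move(rebuildContext)) {}

        void onEvent(SystemEvent event) {
            switch (event) {
            case SystemEvent::SRUnavailable:
                show3d_(false);        // fall back to 2D until SRRestored arrives
                break;
            case SystemEvent::SRRestored:
                show3d_(true);         // 3D may be shown again (lens hint permitting)
                break;
            case SystemEvent::ContextInvalid:
                rebuildContext_();     // stop using the old connection and rebuild it
                break;
            }
        }

    private:
        std::function<void(bool)> show3d_;
        std::function<void()> rebuildContext_;
    };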

  • Rule: Show 2D content when receiving the SRUnavailable event (until the SRRestored event). Explanation: The SR system can be unavailable outside of the application’s control. The lens will be switched off. This takes precedence over the lens hint.
  • Rule: Handle the ContextInvalid event. Explanation: The SR connection encountered a problem. Do not continue to use this connection.
  • Recommendation: Rebuild the SR connection when receiving the ContextInvalid event. Explanation: The SR connection encountered a problem. Rebuilding the connection solves the issue.

Rendering

Look-around and weaving depend on the user’s viewing position. It is important to render using the correct projection and to keep the latency as low as possible.

Look-around is created when the left and right images are rendered by the SR application. The SR application developers are responsible for implementing the correct projection calculation, similar to the examples provided in the SR SDK.

Weaving combines these left and right images into one. This is done by the weaving plugin. The SR application developers are responsible for adding this step to their render pipeline.
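
The sketch below shows where the weaving step sits relative to post-processing, following the rules listed after it. The names are illustrative stand-ins; the weaving call itself is provided by the SR weaving plugin.

    struct EyeImage;                                  // one rendered eye image
    void postProcess(EyeImage* image);                // e.g. anti-aliasing, tone mapping
    void weave(EyeImage* left, EyeImage* right);      // weaving step from the SR weaving plugin (name assumed)
    void present();                                   // display the woven result

    void finishFrame(EyeImage* left, EyeImage* right) {
        postProcess(left);       // per-eye effects are applied before weaving
        postProcess(right);
        weave(left, right);      // combine left and right into the on-screen image
        present();               // present the woven image exactly as calculated, untouched
    }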

  • Rule: Always weave content in real-time. Never pre-weave content. Explanation: The woven image depends on the user’s viewing position. When the woven image is rendered for a different viewing position, the left and right images appear half-transparent on top of each other.
  • Rule: When rendering content in real-time, use the user’s viewing position for projection calculation. Explanation: The left and right image projection should depend on the user’s viewing position to create look-around.
  • Rule: Do not apply post-processing effects on the woven image. No anti-aliasing. Explanation: Weaving depends on the physical properties of the lens on the device. The woven image must be displayed exactly as calculated by the SR weaver. However, it is safe to apply post-processing effects on the left and right images separately before weaving.
  • Rule: Do not display content woven on one SR device on another SR device. Explanation: Weaving depends on the physical properties of the lens on the device. Content that was calculated for one device, will not be displayed correctly on another device.
  • Rule: Use borderless full-screen mode. Explanation: The lens can only be switched on or off for the entire screen. Windowed SR applications are currently not supported. Exclusive full-screen mode can cause issues when the application loses focus.
  • Rule: Render time should be below one frame (16.6ms at 60 Hz). Explanation: Rendered frames should not be buffered to keep latency low.
  • Rule: Enable v-sync. Explanation: The content can show tearing or other artifacts when v-sync is disabled.
  • Recommendation: Avoid pre-rendering the left and right images before weaving. Explanation: The left and right image projection should depend on the user’s viewing position to create look-around. With pre-rendered left and right images, the content appears to change position to always face the user and look-around does not work.
  • Recommendation: Perform projection calculation using the user’s viewing position as late as possible. Explanation: The user’s viewing position constantly changes. By performing this stage as late as possible, the calculation uses the latest viewing position which best matches the user.

Latency compensation

The SR tracking systems depend on data from sensors in the device. This data is recorded, processed, and used to enable features such as look-around and weaving. A non-zero amount of time passes between the moment these sensors record their data and the moment the resulting images are presented to the user on their SR display.

To improve responsiveness, predictive filtering can be applied to the tracking data. If the developer does not explicitly define other behavior, this filtering is applied system-wide. However, to accommodate a wide range of applications, interfaces are provided that allow the developer to take more control of this process.
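
A minimal sketch of how an application might use such an interface is shown below, reusing the EyePair stand-in from the eye tracker sketch. The predictEyePair call is hypothetical and assumed only for illustration; the safety factor and the 200 ms cap follow the rules listed after this paragraph.

    #include <algorithm>
    #include <chrono>

    // Hypothetical prediction call: ask the tracking system for the expected eye
    // positions a given number of milliseconds from now. The actual SDK interface
    // differs; this only illustrates how a latency estimate would be used.
    EyePair predictEyePair(double millisecondsAhead);

    // Called once per v-synced frame, so predictions are made at a consistent pace.
    EyePair eyesForThisFrame(std::chrono::duration<double, std::milli> measuredAppLatency) {
        const double safetyFactor = 0.9;   // assumed margin: slightly underestimate the latency
        const double horizonMs =
            std::min(measuredAppLatency.count() * safetyFactor, 200.0);  // never predict more than 200 ms ahead
        return predictEyePair(horizonMs);
    }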

  • Recommendation: Before turning to latency compensation, prioritize reducing the latency of your application. Explanation: Predictive filtering can help compensate for existing latency, but predicting the future is difficult and there are limits to the effectiveness of predictive filtering.
  • Rule: Do not attempt to apply prediction more than 200 milliseconds into the future. Explanation: User movement is too unpredictable to produce effective predictions past this point based purely on the movements of the user before the prediction is made.
  • Recommendation: For the most reliable results, apply prediction in sync with the display refresh rate (v-sync). Explanation: Images are presented to the user at a fixed pace. If predictions are not made in a similarly consistent way, the real latency from the point of the predict call will vary and the experience will be inconsistent.
  • Recommendation: Consistently underestimate the latency of your application. Explanation: Making predictions for more than the latency of the application will introduce unnecessary instability in the look-around and weaving.

Unreal Engine

Enable the SR rendering pipeline by enabling “Simulated Reality Enabled” in the SRUnreal settings. Specific Rules and Recommendations for Unreal Engine are distributed with the SRUnreal plugin. Some important points are below. For the full list, see the SRUnreal plugin.

  • Rule: Disable all VR plugins. Explanation: VR plugins interfere with the SR plugin.
  • Rule: Enable forward rendering. Explanation: This allows lower latency.
  • Recommendation: For complex scenes, enable VR instanced stereo rendering. Explanation: This allows faster rendering.

Unity

Add the package to your project and use the SimulatedRealityCamera prefab to enable SR. Specific Rules and Recommendations for Unity are distributed with the SRUnity plugin. Some important points are below. For the full list, see the SRUnity plugin.

  • Rule: Disable all VR plugins. Explanation: VR plugins interfere with the SR plugin.
  • Rule: Use the SimulatedRealityCamera prefab instead of the default camera. Explanation: The default camera does not work with SR.
  • Rule: Use the SimulatedRealityUI prefab canvas instead of the default UI canvas. Explanation: The default UI does not work with SR. Note that not all UI elements may be supported on this canvas.
  • Recommendation: Decide on the scaling mode (uniform scaling / non-uniform scaling). Explanation: SR devices come in different sizes. Consider using uniform scaling for objects with a real-world size, for example faces or household objects.

Content

What content works best on SR is subjective, so this section has only guidelines.

  • Recommendation: Use the largest resolution that the display can resolve. Explanation: At higher resolutions, the brain can better process depth information.
  • Recommendation: Avoid placing objects in areas where the eyes have trouble focusing on the object, such as very close to the user. Explanation: Having to focus on such an object can cause discomfort.
  • Recommendation: Avoid placing objects near stereo infinity. Explanation: Objects that are very far away, like stars, require considerable effort from the user due to free viewing. On SR displays it is better to prevent this situation or to make a smooth transition between objects at display depth and objects far away.
  • Recommendation: Do not create large depth jumps between objects. Explanation: Such jumps are difficult for the eyes to fuse. The near and far objects appear disconnected instead of forming one three-dimensional scene, causing discomfort.
  • Recommendation: Create content for a specific screen size and specific viewing distance. Explanation: A different screen size or viewing distance will have a different field of view. This can lead to objects at the edges being out of view, or other issues. It is easier to develop for one specific screen size and viewing distance.
  • Recommendation: Avoid border violations (objects sticking out through the edges of the display). Explanation: The part of the object that sticks out cannot be rendered on the screen for one of the eyes. This causes the object to appear “cut off”, breaking immersion.
  • Recommendation: Avoid a 3D effect that is too strong. Explanation: Let the user adjust to some depth first and then increase the effect slowly.
  • Recommendation: Avoid pushing the background too far away. Explanation: With far away backgrounds, small changes in the user’s viewing position lead to a large change in the position of the background. The scene can appear to be shaking.
  • Recommendation: Avoid high contrast in objects far away or close by. Explanation: A small percentage of each of the left and right images is seen by the other eye. This becomes noticeable when the contrast is high, producing crosstalk (“ghost image”).
  • Recommendation: Avoid out of focus areas. Explanation: Users can scan the scene and expect perfect focus for the area they look at.
  • Recommendation: Avoid high-frequency content. Explanation: Small content, like small-font text, can give issues if the lens array cannot resolve it. Increasing the font size, decreasing the resolution of the texture, and using an anti-aliasing filter are some ways to fix possible issues.
  • Recommendation: Specifically test the appearance of objects near the edges of the display. Explanation: Even without border violations, the edges may show some crosstalk (“ghost image”) under large angles.