Simulated Reality SDK 7500c78d v1.30.2.51085 2024-04-26T11:23:03Z
Stable
EyeTracker API

Classes

The user only has to interact with a fixed set of interfaces, each offering different features. The EyeTracker interface is one of the most essential SR components. Only the user-facing classes are described below.

  • SRContext is an essential element of any SR application: it maintains a list of all SR components that are in use and cleans them up when the application ends. It also allows different components of the application to share the same Sense implementations.
  • Sense is an interface used by the SRContext to keep track of any input or output devices.
  • EyeTracker is an interface providing access to eyetracker data.
  • EyePairStream connects EyeTracker implementations with user defined EyePairListener implementations.
  • EyePairListener is an interface to be implemented by any components of the user's application that want to receive eyetracker data.
  • SR_eyePair defines the actual eyetracker data.

EyeTracker Data

The SR_eyePair struct is at the center of the EyeTracker interface; this is the data that applications can subscribe to.

// C-compatible struct containing the position of two eyes (defined in eyepair.h).
// SR_point3d is a C-compatible 3d double vector representation (defined in types.h).
typedef struct {
    uint64_t frameId;
    uint64_t time;
    SR_point3d left;   // Position of the left eye
    SR_point3d right;  // Position of the right eye
} SR_eyePair;
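As a minimal sketch of how these fields might be consumed (assuming SR_point3d exposes x, y and z members of type double, which this excerpt does not show), an application could compute the point halfway between both eyes:

#include <cstdio>

// Hypothetical helper, not part of the SDK; the x/y/z members of SR_point3d are assumed.
void printEyeMidpoint(const SR_eyePair& pair) {
    const double midX = (pair.left.x + pair.right.x) / 2.0;
    const double midY = (pair.left.y + pair.right.y) / 2.0;
    const double midZ = (pair.left.z + pair.right.z) / 2.0;
    std::printf("frame %llu, time %llu: eye midpoint (%f, %f, %f)\n",
                (unsigned long long)pair.frameId, (unsigned long long)pair.time,
                midX, midY, midZ);
}

A helper like this could be called from the accept function of the listener shown in the next section.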

EyeTracker usage

Application developers should define how to handle the EyeTracker data by implementing an EyePairListener. It is advisable to include an InputStream<SR::EyePairStream> member; this will come into play when the actual eye tracker is constructed.

class EyePairListenerImplementation : public SR::EyePairListener
{
public:
    // Wraps the data stream to this listener; ensures the EyePairStream is
    // cleaned up when the listener object goes out of scope
    SR::InputStream<SR::EyePairStream> stream;

    // The accept function can process the eye position data as soon as it becomes available
    virtual void accept(const SR_eyePair& frame) override {
        // Use EyeTracker data
    }
};
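Since accept is likely invoked from the SDK's own tracking thread rather than from the application's main loop (an assumption, not stated in this excerpt), a common pattern is to buffer the most recent frame under a lock and read it back wherever it is needed. The LatestEyePairListener below is an illustrative sketch, not an SDK class:

#include <mutex>

class LatestEyePairListener : public SR::EyePairListener
{
public:
    // Ensures the EyePairStream is cleaned up together with the listener
    SR::InputStream<SR::EyePairStream> stream;

    virtual void accept(const SR_eyePair& frame) override {
        std::lock_guard<std::mutex> lock(mutex_);
        latest_ = frame;
        hasFrame_ = true;
    }

    // Copies the most recently received frame into out; returns false if none has arrived yet
    bool latest(SR_eyePair& out) const {
        std::lock_guard<std::mutex> lock(mutex_);
        if (!hasFrame_) return false;
        out = latest_;
        return true;
    }

private:
    mutable std::mutex mutex_;
    SR_eyePair latest_{};
    bool hasFrame_ = false;
};

Such a listener is wired up in exactly the same way as EyePairListenerImplementation: it is passed to openEyePairStream and the returned stream is stored in its stream member, as shown below.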

To get access to eyetracker data, users will have to construct an SRContext and call the static factory function EyeTracker::create as follows:

int main() {
    // Create EyeTracker
    SR::SRContext context;
    SR::EyeTracker* eyeTracker = SR::EyeTracker::create(context);

Then we construct a listener that processes the data in the desired way. By calling set on the InputStream<SR::EyePairStream> member we ensure that the stream is destructed correctly when the EyePairListenerImplementation goes out of scope. Once everything is set up, context.initialize() activates any newly constructed streams so that data starts flowing to their listeners.

    // Construct listener
    EyePairListenerImplementation listener;
    listener.stream.set(eyeTracker->openEyePairStream(&listener));
    // Start tracking
    context.initialize();

If the main function of our application returns, all destructors will be called and we will no longer receive data, so we should ensure that the application remains open for as long as we want to receive it.

    // Ensure that the program does not exit immediately
    char inchar;
    std::cin >> inchar; // Wait for a key press
}

Prediction

To make your application as responsive as possible, a specific EyeTracker implementation can be constructed. The following is a short description of the relevant classes.

  • PredictingEyeTracker is an implementation of the EyeTracker interface that gives the application developer more control over the filtering of the output values.
  • EyePairPredictor is an implementation of the PredictiveFilter<SR_eyePair, uint64_t> interface that defines how SR_eyePair data is filtered to provide accurate predictions.

PredictingEyeTracker usage

The PredictingEyeTracker::create function can be used in exactly the same way as the EyeTracker::create function, but the PredictingEyeTracker will only provide output to connected EyePairListener instances when one of the predict functions is called.

An application might define the same type of EyePairListener.

class EyePairListenerImplementation : public SR::EyePairListener
{
public:
    // Wraps the data stream to this listener; ensures the EyePairStream is
    // cleaned up when the listener object goes out of scope
    SR::InputStream<SR::EyePairStream> stream;

    // The accept function can process the eye position data as soon as it becomes available
    virtual void accept(const SR_eyePair& frame) override {
        // Use EyeTracker data
    }
};
int main() {
    // Create EyeTracker
    SR::SRContext context;

The EyeTracker::create call is replaced with PredictingEyeTracker::create.

    // Create PredictingEyeTracker
    SR::PredictingEyeTracker* eyeTracker = SR::PredictingEyeTracker::create(context);
    // Construct listener
    EyePairListenerImplementation listener;
    listener.stream.set(eyeTracker->openEyePairStream(&listener));
    // Start tracking
    context.initialize();

To trigger the listener to receive new data, the predict function can be called as follows.

    // Program loop
    while (true) {
        eyeTracker->predict(80);
        // Sleep in between predictions to simulate actual work being done in between frames
        using namespace std::chrono_literals;
        std::this_thread::sleep_for(16ms);
    }
}
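To consume the asynchronously delivered predictions inside such a loop, the buffered-frame pattern from the LatestEyePairListener sketch above can be combined with the calls shown in this section. The following is a sketch under those assumptions (SDK include paths are omitted because they are not shown in this excerpt):

#include <chrono>
#include <thread>

int main() {
    SR::SRContext context;
    SR::PredictingEyeTracker* eyeTracker = SR::PredictingEyeTracker::create(context);

    // LatestEyePairListener is the illustrative class sketched earlier, not an SDK type
    LatestEyePairListener listener;
    listener.stream.set(eyeTracker->openEyePairStream(&listener));
    context.initialize();

    while (true) {
        // Ask the tracker to predict; connected listeners receive the result asynchronously
        eyeTracker->predict(80);

        SR_eyePair predicted;
        if (listener.latest(predicted)) {
            // Use the predicted eye positions here, e.g. to position a render camera
        }

        using namespace std::chrono_literals;
        std::this_thread::sleep_for(16ms);
    }
}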

Synchronous use of PredictingEyeTracker

In the above example, the work associated with filtering will be executed asynchronously. This can be useful to increase parallelization but might be difficult to control.

The prediction can instead be retrieved directly by calling the overload of predict that takes an output parameter. Since the PredictingEyeTracker is also an EyePairListener implementation, this still maintains a similar structure in your application; the developer does not have to define their own listener implementation in this case.

int main() {
    // Create EyeTracker
    SR::SRContext context;
    SR::PredictingEyeTracker* eyeTracker = SR::PredictingEyeTracker::create(context);
    // Start tracking
    context.initialize();
    // Program loop
    while (true) {
        SR_eyePair eyePair;
        // Predict for a certain latency and receive the result directly in eyePair
        eyeTracker->predict(80, eyePair);
        // Use EyeTracker data
        // Sleep in between predictions to simulate actual work being done in between frames
        using namespace std::chrono_literals;
        std::this_thread::sleep_for(16ms);
    }
}