Simulated Reality SDK 7500c78d v1.30.2.51085 2024-04-26T11:23:03Z
Stable
HeadPoseTracker API

The HeadPoseTracker API works similarly to the EyeTracker API, but exposes the user's head position and orientation rather than the user's eye positions.

Classes

The HeadPoseTracker API consists of the following classes; a condensed sketch of how they fit together follows the list.

  • SRContext is an essential element of any SR application; it maintains a list of all SR components that are in use and cleans them up when the application ends. It also allows different components of the application to share the same Sense implementations.
  • Sense is an interface used by the SRContext to keep track of any input or output devices.
  • HeadPoseTracker is an interface providing access to head pose data.
  • HeadPoseStream connects HeadPoseTracker implementations with user-defined HeadPoseListener implementations.
  • HeadPoseListener is an interface to be implemented by any components of the user's application that want to receive head pose data.
  • SR_headPose contains the output data.
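Taken together, a minimal application wires these classes up roughly as in the sketch below; the sections that follow build this up step by step and explain each call. The include file names are taken from the definitions referenced on this page, but the exact include paths depend on your SDK installation and should be treated as assumptions.

#include <iostream>
// Adjust these paths to your SDK layout; only the file names are known from this page.
#include "srcontext.h"
#include "headposetracker.h"
#include "headposelistener.h"
#include "inputstream.h"

// Receives SR_headPose frames as soon as they become available.
class MinimalHeadPoseListener : public SR::HeadPoseListener
{
public:
    SR::InputStream<SR::HeadPoseStream> stream; // ties the stream's lifetime to the listener

    virtual void accept(const SR_headPose& frame) override {
        // Use frame.position and frame.orientation here
    }
};

int main() {
    SR::SRContext context; // owns and cleans up all senses
    SR::HeadPoseTracker* headPoseTracker = SR::HeadPoseTracker::create(context);

    MinimalHeadPoseListener listener;
    listener.stream.set(headPoseTracker->openHeadPoseStream(&listener));

    context.initialize(); // initialize all senses; data starts flowing to listeners

    char inchar;
    std::cin >> inchar; // keep the application alive until a key press
}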

Head Pose Data

The SR_headPose struct is at the center of the HeadPoseTracker interface; it is what applications subscribe to.

typedef struct {
    uint64_t frameId;
    uint64_t time;
    SR_point3d position;
    SR_point3d orientation;
} SR_headPose; // defined in head.h

The SR_headPose struct contains the head position and orientation, each stored as an SR_point3d (a C-compatible 3d double vector defined in types.h). For more insight into what these vectors mean, take a look at the coordinate system documentation.
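As a rough illustration of how these fields can be consumed, the sketch below prints each incoming pose from inside a listener. It is a sketch only: it assumes SR_point3d exposes x, y and z members (it is documented here only as a 3d double vector) and makes no claims about units or the coordinate frame; the coordinate system documentation is the authority on those. SDK headers are omitted; see the usage section below.

#include <iostream>

// Sketch only: assumes SR_point3d has double x, y, z members.
class PrintingHeadPoseListener : public SR::HeadPoseListener
{
public:
    SR::InputStream<SR::HeadPoseStream> stream; // keeps the stream alive with the listener

    virtual void accept(const SR_headPose& frame) override {
        std::cout << "frame " << frame.frameId << " at time " << frame.time
                  << ": position (" << frame.position.x << ", " << frame.position.y << ", " << frame.position.z << ")"
                  << ", orientation (" << frame.orientation.x << ", " << frame.orientation.y << ", " << frame.orientation.z << ")\n";
    }
};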

HeadPoseTracker usage

Application developers should define how to handle the head pose data by implementing a HeadPoseListener. It is advisable to include an InputStream<SR::HeadPoseStream> member; this will come into play when the actual head pose tracker is constructed.

class HeadPoseListenerImplementation : public SR::HeadPoseListener
{
public:
    // Ensures the HeadPoseStream is cleaned up when the listener object goes out of scope
    SR::InputStream<SR::HeadPoseStream> stream;

    // The accept function can process the head pose data as soon as it becomes available
    virtual void accept(const SR_headPose& frame) override {
        // Use head pose data
    }
};

To get access to head pose data, users will have to construct an SRContext and call the static factory function HeadPoseTracker::create as follows:

int main() {
    // Create HeadPoseTracker
    SR::SRContext context;
    SR::HeadPoseTracker* headPoseTracker = SR::HeadPoseTracker::create(context);

Then we need to construct a listener that processes the data in the desired way. By calling set on the InputStream<SR::HeadPoseStream> field we ensure that the stream is destructed correctly when the HeadPoseListenerImplementation goes out of scope. When everything is set up, context.initialize() initializes all senses, and any newly constructed streams start delivering data to their listeners.

    // Construct listener
    HeadPoseListenerImplementation listener;
    listener.stream.set(headPoseTracker->openHeadPoseStream(&listener));

    // Start tracking
    context.initialize();

If the main function of our application returns, all destructors will be called and we will no longer receive data, so we should keep the application running for as long as we want to receive head pose updates.

    // Ensure that the program does not exit immediately.
    char inchar;
    std::cin >> inchar; // wait for a key press
}