0 votes
by (190 points)
  1. Is there any way to dump the intermediate eye tracking data (images) with the SDK?
    We need the data as well as the corresponding gaze vectors for future game tuning and development.

  2. Can the data from all cameras be dumped?

  3. Does it support connecting an external USB camera (UVCCamera) with a USB cable?

1 Answer

+1 vote
by (55.5k points)

Dear developers,
1. Our SDK provides interfaces to obtain the eye tracking data of the Neo 3 Pro Eye, depending on the SDK version you are using.

For example: Unity XR SDK v2.0.3.
2. Do you mean the left and right cameras of the Neo 3 Pro Eye, or which cameras are you referring to?
3. For USB cameras, we do not have any relevant restrictions or support.
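Since the original question is about dumping eye tracking data for offline game tuning, here is a minimal logging sketch. It is pure Python with the SDK polling stubbed out; the function name and CSV layout are illustrative, not part of the PICO SDK, and it only assumes the SDK's eye tracking interface yields a per-sample timestamp plus a gaze direction vector.

```python
import csv
import io

def dump_gaze_samples(samples, out):
    """Write (timestamp_ms, (x, y, z)) gaze-direction samples to CSV
    so they can be replayed for offline tuning."""
    writer = csv.writer(out)
    writer.writerow(["timestamp_ms", "gaze_x", "gaze_y", "gaze_z"])
    for t, (x, y, z) in samples:
        writer.writerow([t, f"{x:.6f}", f"{y:.6f}", f"{z:.6f}"])

# Stand-in data for samples polled from the SDK's eye tracking interface.
samples = [(0, (0.0, 0.0, 1.0)), (11, (0.02, -0.01, 0.99))]
buf = io.StringIO()
dump_gaze_samples(samples, buf)
print(buf.getvalue().splitlines()[0])  # prints the CSV header row
```

In a real app the sample list would be filled each frame from whatever gaze-query call your SDK version exposes, and `out` would be a file on device storage instead of an in-memory buffer.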

by (190 points)

Thanks a lot for your detailed reply!
For question 1:
To be more specific, could we get the images of the left and right eyes?
As for the SDK version, we can use the one you recommended.
If we can get the left and right eye images, what FPS and resolution would they be?

For question 2:
Partially, yes!
First, as described in question 1, could we get the left and right eye images, and at what FPS and resolution?
Second, besides the left and right eye cameras, the PICO Neo 3 Pro Eye edition also has four other cameras (for example, the two cameras facing down toward the lower body, which might be used for hand gesture capture). Could we access the images from these cameras?

For question 3:
I actually want to check whether we can use an OTG cable to connect a UVC camera to this HMD with libusb and libuvc. Normal Android devices that support OTG can open UVC cameras under Android SDK level 27.
This only matters if we cannot access the images from the other four cameras mentioned in question 2; if we can get the images from your cameras, we will not try USB cameras.
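One quick way to see whether the HMD would even enumerate an OTG-attached UVC camera is to check the device's interface descriptors (e.g. as read via libusb) for a Video-class interface. The sketch below is pure Python with the descriptor data hard-coded for illustration; only the USB class/subclass constants come from the USB Video Class specification, and `looks_like_uvc` is a hypothetical helper name.

```python
# From the USB Video Class spec: UVC devices advertise interface class 0x0E
# (Video); a UVC camera exposes a VideoControl interface (subclass 0x01)
# and one or more VideoStreaming interfaces (subclass 0x02).
USB_CLASS_VIDEO = 0x0E
UVC_SC_VIDEOCONTROL = 0x01

def looks_like_uvc(interfaces):
    """interfaces: list of (bInterfaceClass, bInterfaceSubClass) tuples,
    as read from the attached device's interface descriptors."""
    return any(cls == USB_CLASS_VIDEO and sub == UVC_SC_VIDEOCONTROL
               for cls, sub in interfaces)

# Example descriptor sets: a typical UVC webcam vs. a USB audio headset.
webcam = [(0x0E, 0x01), (0x0E, 0x02)]
audio_headset = [(0x01, 0x01), (0x01, 0x02)]
print(looks_like_uvc(webcam))         # True
print(looks_like_uvc(audio_headset))  # False
```

If the descriptors never show up at all when the camera is plugged in over OTG, the limitation is at the USB host/permission level rather than in libuvc.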

Thanks in advance!

by (55.5k points)

Dear developers,
For question 1:
Do you mean the left and right images of the real human eyes in front of the device, or the virtual scene images seen in the device?
Eye tracking is achieved by sensors in front of the lenses, not by cameras, so images of the human eyes in front of the device cannot be obtained.
For question 2:
If you mean images of the human eyes, they cannot be obtained. For the four cameras you mentioned, you can obtain image information; the camera images are available through the passthrough function.
For PassThrough details, you can refer to the link:
For question 3:
You can get images from the cameras located at the front of the device using the PassThrough function.
For USB cameras, we do not have any relevant restrictions or support.

by (190 points)

Thanks a lot for so many details!
Question 1:
Yes, I meant the images of the real human eyes. Now I understand that Pico is not using cameras to capture real human eye patches to infer the gaze direction; it uses other sensors (accelerometers, etc.) to achieve that. What we can get from passthrough is the virtual scene images for the left and right eyes. Thanks!

Questions 2 & 3:
I understand from your explanation above that we can get images of the real world in front of the device; much appreciated!
One more question: could we get four different/independent streams from these four cameras via passthrough?

by (55.5k points)

Dear developer,

Images can be obtained from the left and right cameras located on the lower part of the device.
The two cameras on top of the device are used for other purposes and do not provide this feature.

by (190 points)

Much appreciated! I totally understand all the points, thanks!

by (190 points)

Just to double-confirm: is the passthrough (which could be used for dumping data) only available in the Unity SDK / Unreal SDK, and not in the Android Native SDK? Thanks in advance!

by (55.5k points)

Dear developers,
You can use the Pxr_GetSeeThroughData interface in the Android Native XR SDK.
Check the following link to see whether these methods are what you need.
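The exact signature and frame format of Pxr_GetSeeThroughData are in the Native XR SDK reference linked above. Whatever that format turns out to be, dumping each polled frame to disk for later analysis can be sketched as below; this is pure Python with the native call stubbed out, and the file naming and header layout are illustrative choices, not anything the SDK prescribes.

```python
import os
import struct
import tempfile

def dump_frame(directory, index, width, height, data):
    """Write one raw camera frame with a small width/height header
    so the dump can be reloaded and decoded offline."""
    path = os.path.join(directory, f"seethrough_{index:06d}.raw")
    with open(path, "wb") as f:
        f.write(struct.pack("<II", width, height))  # 8-byte little-endian header
        f.write(data)
    return path

# Stand-in for a frame buffer returned by the native seethrough interface:
# here, one 64x48 8-bit grayscale frame of zeros.
frame = bytes(64 * 48)
with tempfile.TemporaryDirectory() as d:
    p = dump_frame(d, 0, 64, 48, frame)
    print(os.path.getsize(p))  # prints 3080: 8-byte header + 3072-byte payload
```

In practice you would call this once per frame from the loop that polls the native interface, and record the timestamp alongside the index so frames can be matched with the gaze-vector log.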