Hi all, I'm from Belgium. I'm not a native English speaker, so I hope I can explain my problem clearly.
In a project for my work, I need to access eye tracking data while streaming a game (with the Streaming Assistant) on the Pico Neo 2 (and, in the future, on the Pico Neo 3).
Since it is a streamed game, I assume it runs on the PC, so I can't use the SDKs offered on the website. I therefore tried the XR options integrated into Unity, but I failed. So here is my first question:
Is it possible to access the eye tracking data while streaming a game?
In order not to stay stuck, I tried to imagine alternative solutions.
1) I imagined creating a Unity application with the Unity XR Platform SDK that could run in the background (while the stream runs). This application would access the eye tracking data and send it over the network (OSC, UDP, or similar) to the PC that streams the game, so the game could receive the data and use it. After some research and some trials, it seems that Android does not allow such a background app, in order to preserve battery life. Here is my second question:
Is it possible to run two apps simultaneously on the Pico headset, for example the Streaming Assistant in the foreground and a Unity app in the background?
2) I imagined creating an application in Android Studio with the Android Native SDK that runs a foreground service. This foreground service would access the eye tracking data and send it over the network (OSC, UDP, or similar) to the PC that streams the game. I'm not familiar with Android Studio or Java at all; I tried to learn through tutorials and examples, but I'm still having a lot of difficulty building this kind of app. Here is my third question:
Is it possible to create such a service on the Pico Neo 2 (and later on the Pico Neo 3)?
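To make the network part of both alternatives concrete, here is a minimal sketch in plain Java of what the sending side could look like: packing a gaze direction into a UDP datagram and sending it to the PC. The class name, the 3-float packet layout, and the host/port are my own assumptions for illustration, not part of any Pico SDK; on the headset this code would live inside the foreground service, with the gaze values coming from the eye tracking API.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;

// Hypothetical sketch: send a gaze direction (x, y, z) over UDP to the PC
// that runs the streamed game. Packet layout: 12 bytes, three big-endian
// floats. Everything here (names, layout, port) is an assumption.
public class GazeUdpSender {
    private final DatagramSocket socket;
    private final InetAddress target;
    private final int port;

    public GazeUdpSender(String host, int port) throws Exception {
        this.socket = new DatagramSocket();          // ephemeral local port
        this.target = InetAddress.getByName(host);   // PC's address
        this.port = port;
    }

    // Encode three floats as 12 big-endian bytes (ByteBuffer's default order).
    public static byte[] encodeGaze(float x, float y, float z) {
        ByteBuffer buf = ByteBuffer.allocate(12);
        buf.putFloat(x).putFloat(y).putFloat(z);
        return buf.array();
    }

    public void send(float x, float y, float z) throws Exception {
        byte[] payload = encodeGaze(x, y, z);
        socket.send(new DatagramPacket(payload, payload.length, target, port));
    }

    public void close() {
        socket.close();
    }
}
```

On the PC side, the game would open a `DatagramSocket` on the same port and decode the three floats with a `ByteBuffer` in the same order. UDP is a reasonable fit here because gaze samples arrive at a high rate and a lost packet is quickly superseded by the next one.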
I hope someone can help me. If you have any other kind of solution to suggest, you are welcome to do so.
Have a nice day.