I'm trying to use a real-time video livestream feed from Open Broadcaster Software (OBS) in Meta Spark AR Studio. This real-time video should be manipulable and displayable like any other video or image texture/material you import into Spark AR Studio. The real-time video should NOT replace the Camera/CameraTexture, which is used to capture the person. In short: instead of a regular video playing above the person's head, a real-time video (OBS livestream or virtual camera feed) should play there.
I checked the Meta Spark Studio documentation, and there seems to be no native way to import real-time video data (apart from the Camera/CameraTexture provided by the phone's camera). However, the JavaScript scripting capabilities sound promising, so I'd like to ask if and how I could implement real-time livestream video in Meta Spark AR Studio (via a WebSocket, perhaps?).
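To make the idea concrete, here is rough pseudocode of the pipeline I have in mind. Every name in it (the WebSocket client, `findTexture`, `setPixels`, etc.) is hypothetical — I don't know whether Spark AR's scripting environment exposes anything like this, which is exactly my question:

```
// Hypothetical pseudocode — none of these APIs are confirmed to exist in Spark AR
socket  = connectWebSocket("ws://localhost:4455/stream")  // OBS pushing encoded frames
texture = findTexture("livestreamTexture")                // texture slot created in the Studio UI

on socket.message(frameBytes):
    image = decodeFrame(frameBytes)   // decode one video frame (e.g. JPEG)
    texture.setPixels(image)          // write the frame into the material's texture
```

If the scripting API only allows outgoing network requests but offers no way to write pixel data into a texture at runtime, then presumably this approach is impossible and I'd need some other workaround.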