8/30/2023

Launch Compositor: SteamVR

The LIV SDK provides a spectator view of your application. It contextualizes what the user feels & experiences by capturing their body directly inside your world! Thanks to our software, creators can film inside your app and have full control over the camera. With the power of out-of-engine compositing, a creator can express themselves freely, without limits, as a real person or an avatar!

How It Works

The LIV SDK spawns a camera inside your app which is controlled by LIV. This camera then renders your app into a background and a foreground, to allow the user's body to be composited in between. The background & foreground are separated by clipping geometry, based on the user's location within the scene. These textures are then submitted for composition! (A rough sketch of this split appears below.)

The compositor takes in multiple timestamped sources, performs latency compensation, and composites them together (see the synchronisation sketch below). This output can then be recorded or streamed using software like OBS or Discord.

Doing this work out-of-engine comes with some significant benefits:

- Optimised resource use - we only do the bare minimum work required in the SDK, allowing it to stay lightweight and easy to maintain.
- Accurate latency compensation works without any additional effort from you, the developer.
- Additional camera types, effects, and layers can be added in future updates to the LIV App, making your SDK integration last longer.

For this to work well, we have developed a minimum-latency, high-performance transport layer that also handles resource management. LIV's footprint is almost entirely dependent on how well-optimised your application is!

The LIV Unity SDK is interested in only two GameObjects: your HMD camera & "Stage" (see the setup sketch below).

HMD Camera

This is the camera responsible for rendering the user's HMD. The LIV SDK, by default, clones this object to match your application's rendering setup. You can use your own camera prefab should you want to!

Stage

This is the topmost transform of your VR rig. When implementing VR locomotion (teleporting, joystick, etc.), this is the GameObject that you should move around your scene (see the teleport sketch below). It represents the centre of the user's playspace. The "Stage" object is where LIV's camera will be inserted in your hierarchy. It must be set to the GameObject that contains the player's hands.

Stage Transform

This transform is an additional wrapper around the user's playspace, meant to allow for user-controlled transformations for special camera effects & transitions. If a creator is using a static camera, this transformation can give the illusion of camera movement.

When avatars are being used, call the camera-pose function with (position, rotation, vfov) to control LIV's camera! Simply stop calling it to return control to the user's own camera. This video, for example, is filmed without any extra cameraman! (Both ideas are shown in the final sketch below.)
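To make the background/foreground split from "How It Works" concrete, here is a minimal Unity sketch of the idea: render the spectator view twice, clipped at the player's depth, producing two textures a compositor can layer the user's body between. This is only an illustration of the technique, not LIV's actual implementation (LIV uses its own clipping geometry and composites out-of-engine); all names here are my own.

```csharp
using UnityEngine;

// Illustration only: split a spectator camera's view into background
// and foreground textures, clipped at the player's depth, so a
// compositor can layer the user's body between them.
public class ForegroundBackgroundSplit : MonoBehaviour
{
    public Camera spectatorCamera;   // the third-person camera
    public Transform player;         // centre of the user's playspace
    public RenderTexture background; // everything behind the player
    public RenderTexture foreground; // everything in front of the player

    void LateUpdate()
    {
        // Player's distance from the camera along the view axis.
        float playerDepth = Vector3.Dot(
            player.position - spectatorCamera.transform.position,
            spectatorCamera.transform.forward);

        float near = spectatorCamera.nearClipPlane;
        float far  = spectatorCamera.farClipPlane;

        // Background pass: clip away everything nearer than the player.
        spectatorCamera.targetTexture = background;
        spectatorCamera.nearClipPlane = Mathf.Max(near, playerDepth);
        spectatorCamera.Render();

        // Foreground pass: clip away everything beyond the player.
        // (A real integration would render this with alpha so the
        // compositor knows where the foreground is transparent.)
        spectatorCamera.targetTexture = foreground;
        spectatorCamera.nearClipPlane = near;
        spectatorCamera.farClipPlane  = Mathf.Max(near + 0.01f, playerDepth);
        spectatorCamera.Render();

        // Restore the camera for normal use.
        spectatorCamera.farClipPlane  = far;
        spectatorCamera.targetTexture = null;
    }
}
```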
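The latency-compensation step in the compositor can be sketched as a timestamp-matching problem: for each frame from a real-world source, pick the buffered game frame whose timestamp is closest. The sketch below is my own illustration of that concept under those assumptions, not LIV's code.

```csharp
using System.Collections.Generic;

// Illustration only: match each incoming camera frame against the
// buffered game frame with the nearest timestamp before compositing.
public class FrameSynchroniser<TFrame>
{
    private readonly Queue<(double timestamp, TFrame frame)> _gameFrames =
        new Queue<(double, TFrame)>();

    // Buffer a freshly rendered game frame with its capture time.
    public void Push(double timestamp, TFrame frame)
    {
        _gameFrames.Enqueue((timestamp, frame));
        if (_gameFrames.Count > 8) _gameFrames.Dequeue(); // bounded buffer
    }

    // Return the buffered game frame closest in time to the camera
    // frame, compensating for the delay between the two sources.
    public TFrame Match(double cameraTimestamp)
    {
        TFrame best = default;
        double bestDelta = double.MaxValue;
        foreach (var (t, f) in _gameFrames)
        {
            double delta = System.Math.Abs(t - cameraTimestamp);
            if (delta < bestDelta) { bestDelta = delta; best = f; }
        }
        return best;
    }
}
```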
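For the two GameObjects the Unity SDK cares about, setup amounts to pointing the LIV component at your HMD camera and your rig root. The namespace and field names below (`LIV.SDK.Unity`, `HMDCamera`, `stage`) are assumptions based on the public SDK rather than something stated in this post; verify them against the SDK version you ship.

```csharp
using UnityEngine;
using LIV.SDK.Unity; // LIV Unity SDK namespace (assumed; check your SDK version)

// Sketch: wire the LIV component up to the two GameObjects it needs.
// Property names are assumptions based on the public SDK; verify locally.
public class LivSetup : MonoBehaviour
{
    public Camera hmdCamera;  // the camera rendering the user's HMD
    public Transform rigRoot; // topmost transform of the VR rig ("Stage")

    void Awake()
    {
        var liv = gameObject.AddComponent<LIV.SDK.Unity.LIV>();
        liv.HMDCamera = hmdCamera; // cloned by LIV to match your rendering setup
        liv.stage = rigRoot;       // where LIV's camera is inserted
    }
}
```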
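Moving the Stage for locomotion needs no LIV-specific API at all: you move the rig root, and anything LIV has inserted under it follows. A minimal teleport sketch, with all names my own:

```csharp
using UnityEngine;

// Teleport by moving the Stage (rig root), never the HMD camera itself.
// The HMD's tracked offset inside the playspace is preserved, and any
// camera LIV has inserted under the Stage moves with it automatically.
public class StageTeleporter : MonoBehaviour
{
    public Transform stage; // topmost transform of the VR rig
    public Transform hmd;   // the user's head, tracked inside the stage

    public void TeleportTo(Vector3 destination)
    {
        // Shift the stage so the HMD (not the stage origin) ends up
        // over the destination, keeping the user's in-playspace offset.
        Vector3 headOffset = hmd.position - stage.position;
        headOffset.y = 0f; // keep the floor height from the stage
        stage.position = destination - headOffset;
    }
}
```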
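Finally, the two camera tricks above can be sketched together: animating the playspace wrapper gives a static LIV camera the illusion of movement, and in avatar mode the camera can be driven by submitting a (position, rotation, vfov) pose every frame, then simply not submitting to hand control back. The original post does not preserve the real method name for the pose call, so `SubmitCameraPose` below is a hypothetical stand-in.

```csharp
using UnityEngine;

// Sketch: rotate a wrapper transform around the playspace so a static
// spectator camera appears to orbit the player. "SubmitCameraPose" is
// a hypothetical stand-in for the SDK's pose call, whose real name is
// not preserved in this post - check the LIV SDK docs for the API.
public class CameraEffects : MonoBehaviour
{
    public Transform stageWrapper; // extra parent around the user's playspace
    public float orbitSpeed = 10f; // degrees per second

    void Update()
    {
        // Rotating the wrapper moves the whole playspace, so a fixed
        // camera seems to circle the player.
        stageWrapper.Rotate(Vector3.up, orbitSpeed * Time.deltaTime);

        // Avatar mode: drive LIV's camera explicitly each frame.
        // Stop calling this to return control to the user's own camera.
        SubmitCameraPose(
            position: new Vector3(0f, 1.6f, -2f),
            rotation: Quaternion.LookRotation(Vector3.forward),
            verticalFovDegrees: 60f);
    }

    // Hypothetical placeholder for the SDK's pose-submission call.
    void SubmitCameraPose(Vector3 position, Quaternion rotation, float verticalFovDegrees)
    {
        // In a real integration this would forward to the LIV SDK.
    }
}
```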