Elevate your Corporate Metaverse experience

SynergyXR 2.4 provides useful tools for managing the experiences you create, and helps users access the most relevant experiences in the Corporate Metaverse.

Download now

SynergyXR 2.4 Release Highlights

Here’s what we’ve packed inside the latest version of SynergyXR 

  • Saving and controlling the detailed state of Spaces 
  • Guide your users to explore the right Corporate Metaverse Spaces 
  • Making it easier to manage and interact with content   

With the wider adoption of the Corporate Metaverse, a lot of new users are trying XR for the very first time. In the latest update, SynergyXR 2.4, we provide new and improved tools to make that first journey as exciting and user-friendly as possible. Authors are given more control over how a virtual Space should look and feel when a new user enters. Admins can feature Spaces or even connect users to an automated flow, ensuring individual users are presented with the most relevant technical training experience.

There are also several improvements to the latest addition to the SynergyXR family: our iOS app. It has a completely revamped interaction scheme, making the platform even easier to use for explorers of the Corporate Metaverse, new and experienced alike.

We’re excited to offer this and much more in SynergyXR 2.4. 

Saving and controlling the detailed state of Spaces

SynergyXR 2.4 offers more detailed control over assets saved in a Space. Authors can now define exactly what users should expect when entering a saved Space within SynergyXR. Which objects are locked? The progress of a video or 3D animation? The page of a PDF? All this can be controlled, ensuring a “what-you-see-is-what-you-get” experience when saving and loading Spaces. 

We’ve also expanded the capabilities of some of our content, including volume control for videos and new animation modes for 3D models. 3D models can even be fixed to the environment, allowing users to import 3D scans of an area and have other virtual objects interact naturally with them. Users now get the experience of the scan being an integrated and static part of the virtual environment.

Guide your users to explore the right Corporate Metaverse Spaces

Often, SynergyXR users have created a lot of saved Spaces, and it can be challenging to find the appropriate Space an individual user should enter. We have now introduced “Featured Spaces”. A Workspace admin can easily mark a saved Space as “featured”, ensuring it will show up on top of the list of saved Spaces for all users of that Workspace, making it so much easier to find. 

Admins can also go one step further using the new Space Suggestion feature. On the user web portal, an admin can select a specific user and specify (or “suggest”) which Space they should enter when using SynergyXR. The user will be nudged to the specified Space when logging into SynergyXR, helping the user to enter the appropriate part of the Corporate Metaverse defined by the admin. This is especially useful to ensure employees enter a relevant technical training experience.   

Try the new technical training demo

Want to experience virtual technical training for yourself? Jump into this free demo experience and learn how to assemble steel beams for a new construction project. Through visual, written, and voice step-by-step guidance, you’ll understand the processes and tools involved in drilling, bolting, and assembling steel beams.

Making it easier to manage and interact with content

SynergyXR 2.4 also vastly improves the way users can interact with content on iOS and PC. The new gizmo-based interaction allows users to easily move and rotate content, including along the depth axis, which is often difficult on 2D screens. This new way to interact with content works equally well in virtual and augmented mode.

In addition to this, authors can now see a list of all the content currently in the Space. Items are highlighted within the environment when hovering over the list view, and a traffic light indicator (green, yellow, red) gives an overview of memory usage. Users can also delete objects or make them part of the environment.

SynergyXR 2.4 – technical release notes

Book a 30-minute meeting and get a free trial 

  • Users without previous SynergyXR experience can now sign up for a free trial 
  • A completely new Organization and Workspace is automatically created, making the user the first admin, ready to invite colleagues 
  • The free trial is limited to 6 users for 7 days 

Save object state in Space

  • When saving a Space, all individual states of the objects are also saved, including:
    • Object locked/unlocked state 
    • 3D object animation state (play/pause/progress/loop/animation clip selected, etc.) 
    • Model layer state (layers enabled/disabled) 
    • 3D objects made static in the environment (see below) 
    • Object information panel state (shown/hidden) 
    • Video state (play/pause/progress/loop/mute/volume, etc.) 
    • PDF state (page) 
    • IoT dashboard state (horizontal/vertical) 
  • In short, users should expect to load a Space in exactly the same state it was saved in (a minimal data-model sketch follows below) 
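
To make the scope of this concrete, here is a minimal sketch of what a saved object-state record could look like, based purely on the list above. The field names and types are illustrative assumptions, not SynergyXR’s actual data format.

```typescript
// Hypothetical sketch of a saved object-state record, based on the list above.
// Field names and structure are illustrative assumptions, not SynergyXR's actual format.
interface SavedObjectState {
  objectId: string;
  locked: boolean;
  madeStatic: boolean;                      // 3D objects fixed to the environment
  infoPanelVisible: boolean;
  animation?: {                             // only for animated 3D models
    playing: boolean;
    progress: number;                       // 0..1
    loop: boolean;
    clipIndex: number;                      // selected clip in the GLB file
  };
  layers?: Record<string, boolean>;         // model layers enabled/disabled
  video?: {
    playing: boolean;
    progress: number;                       // seconds
    loop: boolean;
    muted: boolean;
    volume: number;                         // 0..1
  };
  pdfPage?: number;
  iotDashboardOrientation?: "horizontal" | "vertical";
}

interface SavedSpace {
  spaceId: string;
  featured: boolean;
  objects: SavedObjectState[];
}
```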

New controls for content (video + animation) 

  • For video content, users can now control:
    • Volume – automatically muted when lowered to zero 
    • Loop or play once
  • For animated 3D models, users can now control:
    • The play mode: play-once, loop or ping-pong (great for exploded views)
    • If multiple animation clips are embedded in the GLB file, the user can choose which one to play (a small sketch of the play modes follows below)
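
As a rough illustration of the three play modes, here is a small sketch of how a normalized animation time could be advanced each frame. The function and its parameters are assumptions made for the example, not SynergyXR’s implementation.

```typescript
type PlayMode = "play-once" | "loop" | "ping-pong";

interface AnimState { time: number; direction: 1 | -1; playing: boolean }

// Hypothetical sketch: advance a normalized animation time (0..1) by one frame.
// "ping-pong" reverses direction at each end, which suits exploded views.
function advance(state: AnimState, delta: number, mode: PlayMode): AnimState {
  const t = state.time + state.direction * delta;
  if (mode === "play-once") {
    return t >= 1
      ? { time: 1, direction: state.direction, playing: false }  // stop at the end
      : { time: t, direction: state.direction, playing: true };
  }
  if (mode === "loop") {
    return { time: t % 1, direction: 1, playing: true };         // wrap around to the start
  }
  // ping-pong: bounce between the two ends
  if (t >= 1) return { time: 1, direction: -1, playing: true };
  if (t <= 0) return { time: 0, direction: 1, playing: true };
  return { time: t, direction: state.direction, playing: true };
}
```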

Content list

  • Authors can bring up a list of all content currently in the Space
  • When hovering over a piece of content in this list, it is highlighted in the 3D Space to make it easily identifiable
  • Authors can delete content directly from this list
  • Each piece of content has a small memory indicator (green, yellow or red) to give an overview of which content is taking up the most memory on the device (illustrated in the sketch below)
  • 3D objects can also be made static directly from this list – read more below
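
For illustration, the traffic-light indicator can be thought of as a simple threshold check on a content item’s memory footprint. The thresholds below are invented for the example; the actual values used by SynergyXR are not documented here.

```typescript
// Hypothetical sketch of the traffic-light memory indicator in the content list.
// The thresholds are illustrative assumptions, not SynergyXR's actual values.
type MemoryIndicator = "green" | "yellow" | "red";

function memoryIndicator(megabytes: number): MemoryIndicator {
  if (megabytes < 50) return "green";    // assumed: lightweight content
  if (megabytes < 200) return "yellow";  // assumed: noticeable memory use
  return "red";                          // assumed: heavy content worth optimizing
}
```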

Static 3D objects – make objects part of the environment

  • For 3D models, the user can choose to make the object “static”
    • Other objects can easily be placed on top of these environment models through physics-based collisions – e.g., an object placed on top of a 3D scan as mentioned above
    • This will effectively remove all interaction options with the object. E.g., the scan will be a static part of the environment, and no longer get a bounding box when users are trying to interact with it
  • Objects can be made interactable again through this list, enabling the user to move and rotate the 3D model again
  • In the case of animated models, the following rules apply: 
    • If the animation was paused when made static, the model will stay that way and not move any further
    • If the animation was playing when made static, the model will keep looping this animation. The UI controls are hidden to increase immersion, so users can only control the animation by making the model interactable again (see the sketch below)
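
The rules above can be summarized in a small sketch. The state fields and function names are hypothetical and only restate the behaviour described in this list.

```typescript
// Hypothetical sketch of the static/interactable rules described above.
// Names and structure are illustrative assumptions, not SynergyXR's implementation.
interface ModelState {
  isStatic: boolean;         // part of the environment, no bounding box or gizmo
  animationPlaying: boolean;
  animationLooping: boolean;
  controlsVisible: boolean;
}

function makeStatic(model: ModelState): ModelState {
  return {
    ...model,
    isStatic: true,
    controlsVisible: false,  // UI hidden to increase immersion
    // paused animations stay paused; playing animations keep looping
    animationLooping: model.animationPlaying ? true : model.animationLooping,
  };
}

function makeInteractable(model: ModelState): ModelState {
  // making the model interactable again restores movement, rotation and controls
  return { ...model, isStatic: false, controlsVisible: true };
}
```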

Space suggestion – select a Space to always suggest to individual users

  • An admin can select a specific Space that a given user should be suggested to enter
  • The admin can select this in the web Portal https://portal.synergyxr.com/ 
  • Select a user from the list, and press “Edit”
  • Select from the drop-down list which Space should be suggested to the user when logging in
  • A user with this feature enabled will automatically be offered to enter the suggested Space when logging in to SynergyXR
  • It is still possible to circumvent this by pressing the cancel “X” button when prompted
  • An admin can disable this feature in the Portal by selecting “inactive” from the drop-down list

Featured Spaces

  • Some organizations have many saved Spaces to select from. To help new users find the most relevant Spaces, admins can choose to feature individual saved Spaces
  • By selecting a saved Space, and clicking the star-outline, the Space is featured for all users of the particular Workspace
  • Featured Spaces are automatically placed in a separate category on top of the list of saved Spaces
  • An admin can remove the feature-star by clicking the star icon again

Content interaction on iOS and PC

  • When selecting an object in the environment, a gizmo is shown indicating the interaction possibilities
  • By touching and dragging on the sides or the inner part of the gizmo, the user can move the object around in the environment 
  • By touching and dragging around the corners of the gizmo, the user can rotate the object around its vertical axis
    • Content can only be rotated around the vertical axis on iOS devices – users can still rotate around all axes on PC
  • By touching on the object and dragging vertically, the user can move the object up and down. A dotted line is shown to clearly visualize how far the object has been moved 
  • By pinching on the object, the user can scale the object up and down (see the interaction sketch below)
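
For readers who want a more concrete picture, here is a hypothetical sketch of how these gestures could map onto an object’s transform. The gesture names and the transform model are assumptions; the sketch follows the iOS behaviour, where rotation is limited to the vertical axis.

```typescript
// Hypothetical sketch of how the touch gestures described above could map onto an
// object's transform. Gesture names and the transform model are illustrative assumptions.
type Gesture =
  | { kind: "drag-gizmo-side"; dx: number; dz: number }   // move in the horizontal plane
  | { kind: "drag-gizmo-corner"; yawDegrees: number }     // rotate around the vertical axis
  | { kind: "drag-object-vertical"; dy: number }          // move up/down along the dotted line
  | { kind: "pinch"; factor: number };                    // scale up/down

interface Transform { x: number; y: number; z: number; yawDegrees: number; scale: number }

function applyGesture(t: Transform, g: Gesture): Transform {
  switch (g.kind) {
    case "drag-gizmo-side":
      return { ...t, x: t.x + g.dx, z: t.z + g.dz };
    case "drag-gizmo-corner":
      return { ...t, yawDegrees: t.yawDegrees + g.yawDegrees };
    case "drag-object-vertical":
      return { ...t, y: t.y + g.dy };
    case "pinch":
      return { ...t, scale: t.scale * g.factor };
  }
}
```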

iOS to iOS video streaming

  • When an iOS user is in “virtual mode”, they can request to start a video stream from an iOS user in “augmented mode”
  • This will result in a live stream of the iOS camera feed of the user on-site to the user working remotely
  • PC users can of course also still request camera streaming from iOS users in “augmented mode”

Procedures improvements

  • A new “LookAt” action is now available where the user is asked to look at a specific object for a certain amount of time – e.g., a warning light. A visual indicator gives feedback on the duration
  • It is now possible to have several languages in a single Procedure. Through a “Language selection menu” the user can choose between available languages before starting the Procedure
  • We now support much more natural interactions when moving and rotating objects during a Procedure. E.g., when rotating a handle of a valve, the user must grab and rotate the handle, instead of only clicking it and watching the resulting animation
  • The highlight system indicating the next step in the Procedure has been reworked. Instead of the arrows previously used, a white ring now indicates the next object to interact with. The further away from the object the user is looking, the larger the ring scales in order to grab the user’s attention (see the sketch below)
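
To illustrate the two behaviours above, here is a small sketch of the “LookAt” progress and the distance-dependent ring scaling. The curves and limits are assumptions chosen for the example, not the actual values used by SynergyXR.

```typescript
// Hypothetical sketch of the reworked highlight ring: the further the user's gaze
// is from the target object, the larger the ring is scaled to attract attention.
// The scaling curve and limits are illustrative assumptions.
function ringScale(gazeAngleDegrees: number, minScale = 1.0, maxScale = 3.0): number {
  const normalized = Math.min(gazeAngleDegrees / 90, 1); // 0 when looking at the object
  return minScale + (maxScale - minScale) * normalized;
}

// Hypothetical sketch of the "LookAt" action: completed once the user has looked
// at the target (e.g., a warning light) for a required duration.
function lookAtProgress(secondsLookedAt: number, requiredSeconds: number): number {
  return Math.min(secondsLookedAt / requiredSeconds, 1);  // drives the visual feedback indicator
}
```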

HoloLens – end-of-life

  • We have decided to officially remove the HoloLens 2 from our list of supported devices
  • The HoloLens 2 has been an important part of SynergyXR, but it is a hardware platform used in less than 0.1% of all SynergyXR sessions. Hence, we cannot justify continuing to invest engineering resources in its support and development
  • With the latest updates to SynergyXR for iOS, we believe we have a more than capable alternative to the HoloLens, providing much more detailed 3D scans with better textures, all delivered on a platform with a user-friendly touch interface