Right from the start of the project, it became clear that the ideas for the use cases in the remote support topic area overlapped significantly. Therefore, a general use case for this topic was developed, and the associated efforts were consolidated. The following list shows the participating companies in the Remote Support topic stream and the specifications they developed:
Special Features and Challenges

Geodata: Snapshots for documentation, offline feature in case of connection failure, use under adverse conditions (tunnel construction), connection to a knowledge base

IFE Doors: Roles with variable feature sets, barcode scanning on the door system, step-by-step support for maintenance work, possibility of one-to-many connections

Kremsmüller: Sharing expert knowledge, working through checklists, robustness on construction sites, use under adverse conditions (helmet requirement, gloves, etc.)

Test-Fuchs: Support can consult other experts (one-to-many), identification of components on the device, display of current measured values, shape-based tracking is desirable
The objective was to implement mixed reality in the area of remote support.
Based on the specifications, the following points were implemented:
The open standard WebRTC is used for audio/video communication between two (or more) participants. To support mixed reality functionality on as many devices as possible, AR Foundation was integrated into the current version. AR Foundation provides a unified interface to the underlying platform AR frameworks, including feature-point detection, and, in combination with Unity3D, enables development for both Android and iOS devices. General features that are also used in other topic streams have been incorporated into the core.
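WebRTC standardizes the media transport, but the peers still need a signaling channel to exchange session descriptions (offers/answers) and ICE candidates before a direct connection can be established. As a minimal sketch of that relay role, the message schema below (type/sender/target/payload) is purely illustrative and not the project's actual wire format:

```python
import json
from dataclasses import dataclass, field

@dataclass
class SignalingRelay:
    """Relays SDP offers/answers and ICE candidates between peers.

    The message schema is an assumption for illustration only; the
    project's actual signaling protocol is not specified here.
    """
    # Maps peer id -> list of messages waiting for that peer.
    mailboxes: dict = field(default_factory=dict)

    def send(self, raw: str) -> None:
        msg = json.loads(raw)
        # Only forward the message kinds WebRTC signaling needs.
        if msg["type"] not in ("offer", "answer", "ice-candidate"):
            raise ValueError(f"unknown message type: {msg['type']}")
        self.mailboxes.setdefault(msg["target"], []).append(msg)

    def receive(self, peer_id: str) -> list:
        return self.mailboxes.pop(peer_id, [])
```

Once both peers have exchanged an offer, an answer, and their ICE candidates through such a relay, the media streams themselves flow peer-to-peer where the network allows it.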
This prototype allows the user to establish an A/V connection with a conversation partner and place anchored digital annotations (lines and symbols) in real space. A stable data connection is required: if connection quality is poor, the frame rate and video quality can be reduced, or the live stream can be paused entirely. Even then, individual photos can still be sent to the conversation partner, who can annotate them and send them back, providing at least basic support over a weak data connection. External factors present another challenge: the use cases described above involve adverse conditions (weather, dirt, noise, mandatory hard hats, gloves, etc.) that significantly complicate the operation of current devices.
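The degradation strategy just described can be sketched as a simple decision function. The thresholds and mode names below are illustrative assumptions, not values from the project; a real client would derive them from WebRTC congestion-control feedback:

```python
from enum import Enum

class StreamMode(Enum):
    FULL_VIDEO = "full video and audio"
    REDUCED_VIDEO = "reduced frame rate and quality"
    PHOTO_ONLY = "stream paused, annotated photos only"

def choose_mode(bandwidth_kbps: float, packet_loss: float) -> StreamMode:
    """Pick a degradation level based on measured link quality.

    Thresholds are illustrative only; in practice they would be
    tuned to the codec and the WebRTC bandwidth estimator.
    """
    if bandwidth_kbps >= 1000 and packet_loss < 0.02:
        return StreamMode.FULL_VIDEO
    if bandwidth_kbps >= 250 and packet_loss < 0.10:
        return StreamMode.REDUCED_VIDEO
    # Connection too weak for live video: fall back to sending
    # individual photos that the expert annotates and returns.
    return StreamMode.PHOTO_ONLY
```

The key design point is that the photo fallback keeps the annotation workflow alive even when live video is impossible.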
Through multiple tests of different prototype versions by the partner companies, industrial suitability was evaluated and potential improvements were identified. In several feedback loops with the partner companies, the functionality and UI/UX were improved accordingly. A dedicated WebRTC server was set up at the University of Applied Sciences Upper Austria (FH OÖ) to manage communication between the devices. The source code was documented and cleaned up for further development within the companies and for the open-source community. Additionally, a comprehensive written manual outlining the functionality and codebase was created to facilitate further development. A video documentation further explains the most important findings.
Supported Hardware:
In addition to smartphones and tablets, dedicated versions of the application were developed for smart glasses:
MS HoloLens: The application was likewise built as a Unity project and uses the same technical basis (WebRTC) for connection and transmission. This makes it possible for a HoloLens user to establish a connection with an expert on a desktop client. The camera image from the HoloLens is transmitted to the expert, and an audio connection exists in both directions. Both participants can draw annotations, and the drawn points/lines are visible to both: the HoloLens user draws in space using hand gestures, while the desktop user can pause the video to draw.
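The shared-annotation behavior can be sketched as a session object that both clients append strokes to, so that each side renders all strokes regardless of author. The data layout (3D point tuples per stroke, an author tag) is an assumption for illustration, not the project's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class Stroke:
    author: str      # e.g. "hololens" or "desktop" (illustrative labels)
    points: list     # sequence of (x, y, z) tuples in world space

@dataclass
class AnnotationSession:
    """Holds every participant's strokes so each client can render all."""
    strokes: list = field(default_factory=list)

    def add_stroke(self, author: str, points: list) -> Stroke:
        stroke = Stroke(author=author, points=list(points))
        self.strokes.append(stroke)
        return stroke

    def visible_to(self, participant: str) -> list:
        # Annotations are shared: every participant sees every stroke,
        # whether it was drawn in space or on the paused video frame.
        return self.strokes
```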
RealWear HMT-1: The existing remote support project was adapted for the HMT-1. The application is controlled via voice commands, and because of the limited input options, connections are only possible with predefined contacts from a previously defined contact list. Since the smart glasses do not support the AR APIs (ARCore / AR Foundation) used on Android, the application's feature set is reduced: the expert can take and annotate screenshots on the desktop client and send the annotated images back to the HMT-1 user.
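The contact-list restriction can be illustrated with a small matcher that maps a recognized voice command to a predefined contact and rejects everything else. The command grammar ("call <name>") and the contact names are invented for this example:

```python
def resolve_call_command(command: str, contacts: list) -> "str | None":
    """Map a recognized voice command like 'call <name>' to a contact.

    Only contacts from the predefined list can be dialed; any other
    requested name is rejected. Grammar and names are illustrative.
    """
    command = command.strip().lower()
    if not command.startswith("call "):
        return None  # not a call command at all
    requested = command[len("call "):]
    for contact in contacts:
        if contact.lower() == requested:
            return contact
    return None  # name not in the contact list -> connection refused
```

Restricting dialing to a fixed list keeps the voice interface robust in noisy environments: the recognizer only has to distinguish a handful of known phrases.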
Here you can find a video about the project implementation.
The overall project "Mixed Reality Based Collaboration 4 Industry" was a collaboration between 22 companies and five scientific institutions from Lower Austria, Vienna, and Upper Austria (St. Pölten University of Applied Sciences, Upper Austria University of Applied Sciences Steyr Campus, FOTEC, IMC University of Applied Sciences Krems, and Vienna University of Technology).