1 Introduction

Augmented Reality (AR) is the computer-based extension of reality using smart glasses or other hardware devices. AR technology nowadays has a wide range of application areas [2, 15, 16]. AR reduces the effort of error search and error correction. AR technologies have great potential in commissioning [21] and maintenance [17], as well as for assistance in training [10], assembly, repair, and automation [9]. AR applications (AR apps) serve as a useful tool in various technical and production fields (e.g., mechanical engineering and manufacturing automation).

In the past years, AR technologies have been increasingly exploited in technical assistance systems, particularly visual ones [22]. AR is used for the visualization of assembly tasks, machine operation, and repair processes [1, 6]. With the support of cameras (e.g., integrated into a tablet or smartphone), the system automatically recognizes a malfunction or broken components, retrieves the relevant information, and displays the exact assembly instructions directly in the real image. Thus, the automatic fade-in of a circuit diagram while working on an electrical system component is conceivable. For a technician, this simplifies surveying a very complex system and analyzing the relevant measured values. The assistance via “instructions and feedback” [4] for technicians and other workers can even originate from remote experts and be executed in real time [8].

Considering the current development stage of hardware and software in the field of AR technology support, writing user applications has become relatively simple and less time-consuming. Developers provide a vast amount of free software, which can be used for complex applications like error correction in mechanical problems [3].

2 State-Of-The-Art Applications and Motivation for the Study

As mentioned in the introduction, the implementation of AR in the fields of mechanical engineering and manufacturing promises to reduce the complexity and increase the interactivity of human-machine communication.

Despite numerous research and development projects in the field of AR technologies [5], AR has not yet been widely used in mechanical engineering. The main reasons are the high price and heavy weight of the hardware. From the first experiments in 2003 to the growing interest since 2008, the hardware available for AR apps has shared these characteristics [4].

The recognition of real objects, which is based on artificial intelligence (AI), is essential for the subsequent positioning of virtual objects in AR. However, the detection of 3D objects (e.g., machine parts) usually requires industrial hardware and expensive software. For example, the company DIOTA [7] offers common hardware and software alternatives. Industrial tablets such as GETAC devices, head-mounted displays from HTC, or the ISAR Projection System are used. On the software side, products from DIOTA (DIOTA Player and DIOTA Connect) and future AR software from Siemens are in operation [7]. Most of the current solutions are mainly of interest to large industrial customers, but not to small and medium-sized enterprises (SMEs).

Thus, the main obstacle to the wide use of AR can be formulated as follows: developers traditionally rely on rather expensive development tools and software products.

A project at the Brandenburg University of Technology Cottbus—Senftenberg (BTU), at the Chair of Automation Technology [3], was initiated to solve this problem. The aim of the development was to build an AR application for education and error correction. The software tools Unity3D and Vuforia were used during the development process because they were available to the project team. Several new problems regarding AR application development were identified and solved.

2.1 The Problem of Data Transfer

The data transfer between industrial systems (for example, programmable logic controllers, PLCs) and Android applications is a rather complicated task when using simple development tools. The basic approaches are primarily focused on transferring data from one single PLC to one or more Android applications; for example, Siemens PLCs support data transfer via the TCP/IP protocol [18]. However, this task becomes even more complex if the solution has to be scalable, i.e., effective not only for one unit but also for several [12, 19, 20]. Overall, the transfer of data from industrial controllers (PLCs) to AR hardware is currently not well developed. Thus, a new data transfer method had to be established.

2.2 The Problem of User Activity Indication

The data transfer between the programmable logic controller (PLC) of the plant or laboratory unit and the user is part of the AR ergonomics problem, which is currently under development [14]. The user’s Android application should not only receive data from the PLC but also transfer data back. The reason for introducing bidirectional information transfer is that the majority of the user’s interactions with the mechanical environment during error correction or learning cannot be detected by the environment’s sensors. At the same time, comfortable and easy communication between the user and the AR app is essential, as explored in particular in [11]. Therefore, the user must be enabled to indicate his or her actions with other tools, e.g., a virtual keyboard, a joystick, or application interface buttons (if a hand-held device is used as AR hardware).

3 Method of Development

The following steps for the development of the project have been taken:

  • Analysis and arrangement of the elements of an AR system;

  • Creation of the “development logic”—instructions for the users’ actions;

  • Identification and selection of the open-source or low-cost AR software solutions;

  • Combining the AR system elements and solutions into a functioning system;

  • Development of an AR application with the “development logic” and testing.

This method of working on AR projects at BTU was formulated in previous related work [3] and in projects with AR HMI systems for small laboratory plants. As for the application of the method, two important points required special attention. Firstly, the laboratory units needed a modular design to simplify possible modification.

Secondly, supporting elements of the system were required. The complete scheme of the project realized according to the method is shown in Fig. 1.

Fig. 1. Elements of the project

The developed system simulated the operation of an automated garage doors facility and related maintenance processes.

3.1 Hardware Components

The laboratory unit was equipped with a Siemens PLC. AR information could be displayed on a tablet, smartphone, or HMD (head-mounted display). The data transfer from the door control hardware (PLC) to the user hardware (smartphone and cardboard) occurred via OPC UA, a universal interface to the plant network (the control components of industrial plants). The transmitted information constituted a list of variables and their states. In this way, information about the system components and the operating errors was transferred to the AR application.

Reasons for the selection of OPC UA as the (data transfer) media were the following:

  • Support of C# and Java as implementation programming languages;

  • Scalability from embedded control software to operational or management information systems;

  • Custom security implementation based on the latest standards;

  • Configurability of timeouts for each service call.
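The transmitted information (a list of variables and their states) can be illustrated with a minimal sketch. The variable names, the JSON encoding, and the function names below are assumptions for illustration only; the actual project exposes the PLC variables through OPC UA rather than through these helpers.

```python
import json

# Hypothetical snapshot of PLC variables as they might be exposed via OPC UA;
# the variable names are illustrative, not taken from the actual project.
plc_state = {
    "door_position": "closed",        # current door state
    "optical_sensor_blocked": False,  # object in the door's path?
    "emergency_stop": False,          # E-button pressed?
    "error_code": 0,                  # 0 = no error
}

def encode_state(state: dict) -> str:
    """Serialize the variable list for transfer toward the AR app."""
    return json.dumps(state)

def decode_state(payload: str) -> dict:
    """Restore the variable list on the AR side."""
    return json.loads(payload)
```

Any state change on the PLC side would be re-encoded and pushed to the AR application, which maps variables such as `error_code` to the displayed instructions.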

In the first development phase of the AR application, mobile hardware (an Android tablet) was used. In the final testing phase of the latest version of the AR app, a “wearable” device, namely a smartphone in a cardboard holder, was used. A solution with real AR glasses, such as Microsoft HoloLens, would be quite expensive for SMEs; therefore, this configuration of an AR device was introduced to resemble realistic future operating conditions. By using common and affordable devices such as tablets and smartphones, a distinctive balance of cost and capabilities can be achieved.

3.2 Software Components

The software system used to develop the AR app consisted of the Vuforia Augmented Reality SDK in conjunction with the Unity3D game engine.

The reason for selecting Vuforia as the software solution was its broad functional capabilities. With Unity3D, complex dynamic models can be realized. The models can be controlled, and communication with other programs is possible. This development environment can be used for complex tasks and is freely available for research and non-commercial projects. An alternative software solution for creating an AR application would be another engine such as the Unreal Engine (UE). However, UE requires substantial experience in C++ programming and is not free of charge. Nevertheless, this software is becoming more and more popular in large projects, as it offers significantly more options for the future.

Therefore, the main system components were a Siemens PLC, an OPC UA interface and server, the AR hardware (Android smartphone and/or tablet, VR cardboard), and Unity3D with Vuforia as the development environment.

3.3 Project Development Process

At the start of the project, a miniature laboratory facility (garage doors) was created and used. The experimentation unit contained a movable door, two user interfaces, and two tracker images. Separate tracker pictures were used for positioning the AR elements and the AR keyboard. The first prototype of the laboratory facility is shown in Fig. 2.

Fig. 2. Laboratory unit (small door, first prototype)

The proposed system enabled the user to carry out advanced operations usually performed by experts. The set of implemented standard errors and the way they were solved demonstrated the AR app’s potential for training and non-expert error correction.

Three errors could be simulated and indicated with this laboratory unit:

  • “Short” door blocking. When the optical sensor detected an object while the door was moving down, the mechanism was stopped. The user was guided to remove the object.

  • “Long” door blocking. If the optical sensor detected the object for longer than 5 s, the facility’s door motor was stopped. A manual restart by a technical expert through a specific interface (see Fig. 2) was then required.

  • Using the emergency button. The facility was stopped, and the technical expert had to perform the manual restart in the same way.
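The three error cases above can be summarized as a small decision function. This is a hedged sketch: the function name and the sensor interface are assumptions, and only the 5 s blocking threshold comes from the description of the laboratory unit.

```python
def classify_error(sensor_blocked_s: float, e_button: bool) -> str:
    """Map the unit's sensor readings to one of the three simulated errors.

    sensor_blocked_s: how long the optical sensor has detected an object (s);
    e_button: whether the emergency stop button is pressed.
    The 5 s threshold follows the laboratory unit's description.
    """
    if e_button:
        return "emergency_stop"   # restart by technical expert required
    if sensor_blocked_s > 5.0:
        return "long_blocking"    # motor stopped, expert restart required
    if sensor_blocked_s > 0.0:
        return "short_blocking"   # user is guided to remove the object
    return "no_error"
```

In the real system, the resulting error class would select which set of AR instructions the application displays.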

For the first prototype of the laboratory unit, a virtual keyboard (SVC) with two buttons was developed (Fig. 3). This solution was created to solve the problem of user activity indication (see Sect. 2.2). The user was required to switch between different instructions when fixing an error. The “NEXT” button could be activated to go to the next step of the error correction. The “BACK” button enabled returning to the previous step if re-checking a correction action was needed.

Fig. 3. Using virtual buttons (“NEXT” or “BACK”) for step switching in the first prototype of the project
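The step-switching behavior behind the “NEXT” and “BACK” buttons can be sketched as a minimal navigator. The class and the placeholder instruction strings are illustrative assumptions, not the project’s actual implementation.

```python
class StepNavigator:
    """Minimal sketch of NEXT/BACK step switching for error-correction
    instructions; clamps at the first and last step."""

    def __init__(self, steps):
        self.steps = list(steps)  # ordered instruction texts
        self.index = 0            # currently displayed step

    def current(self) -> str:
        return self.steps[self.index]

    def next(self) -> str:
        """'NEXT' button: advance to the following instruction."""
        if self.index < len(self.steps) - 1:
            self.index += 1
        return self.current()

    def back(self) -> str:
        """'BACK' button: return to the previous instruction."""
        if self.index > 0:
            self.index -= 1
        return self.current()
```

The AR app would render `current()` as the overlay text and call `next()`/`back()` from the virtual button handlers.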

The system under consideration was presented at the Hannover Expo in 2017 and 2018 as part of the exhibition stand of the IHK (Industrie- und Handelskammer) Berlin-Brandenburg. About 100 attendees of the exposition tested the prototype. Each tester was informed about the presence of simulated malfunctions in the laboratory unit, but was not explicitly informed about the possible course of action. It was the job of the AR app to provide the users with step-by-step instructions for problem resolution. Mobile hardware, namely a tablet with an Android system, was used. These practical use cases showed that the SVC increased the user’s error handling capabilities by freeing the hands from a joystick or any other remote control device. However, the SVC required an extra tracker picture, which additionally reduced the user’s mobility when subsequent movement around the unit was required.

The analysis of the application of the first prototype allowed us to formulate a solution to the two-way communication problem.

3.4 The Solution to the User Activity Indication Problem

A universal solution to the problem of user activity monitoring and two-way communication was not possible by software methods alone. Essentially, it required the tracking and identification of the possible actions. The latter was only feasible if the laboratory plant was equipped with additional sensors capturing each action of the user. The transfer of the resulting data was also required.

Thus, the second prototype of the system was developed. The laboratory unit was modified by introducing a new console for technicians. The SVC tracker was no longer needed. The new sensors made it possible to track the user’s actions and the steps of the laboratory tasks.

The operator of the system could use wearable AR hardware (a smartphone with the AR application and cardboard). The operator received the related status information on the door, as well as guided instructions. The outline of the user interface and the prototype overview are presented in Fig. 4.

Fig. 4. Laboratory unit and all AR elements (small garage doors, second prototype)

There were three control buttons on the new user console box, which allowed for extended interaction patterns:

“Start”. If this button was pressed, the garage door opened. If nothing passed through the door, it waited for 12 s and then closed.

“Stop”. If the Stop button was pressed, the door stopped moving.

“E-Button” (emergency stop button). If this button was pressed, the door was stopped, the movements of the system components were blocked, and a restart with the technical console box was required.
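The console behavior above can be sketched as a small state machine. This is an illustrative model only: the class name and the tick-based timing are assumptions, while the 12 s auto-close and the restart requirement after an emergency stop follow the description of the console.

```python
class DoorController:
    """Sketch of the console logic: Start opens the door, an undisturbed
    open door closes after 12 s, Stop halts movement, and the E-button
    blocks the system until a restart from the technical console."""

    AUTO_CLOSE_S = 12  # seconds before an open door closes automatically

    def __init__(self):
        self.state = "closed"
        self.blocked = False   # True after an emergency stop
        self.open_time = 0     # seconds the door has been open

    def press_start(self):
        if not self.blocked:
            self.state = "open"
            self.open_time = 0

    def press_stop(self):
        if not self.blocked:
            self.state = "stopped"

    def press_e_button(self):
        self.state = "stopped"
        self.blocked = True    # movements blocked until restart

    def restart(self):
        """Restart from the technical console box."""
        self.blocked = False
        self.state = "closed"

    def tick(self, seconds=1):
        """Advance time; an open, passage-free door closes after 12 s."""
        if self.state == "open" and not self.blocked:
            self.open_time += seconds
            if self.open_time >= self.AUTO_CLOSE_S:
                self.state = "closed"
```

In the second prototype, the equivalent logic runs on the PLC; the sketch only clarifies the interaction pattern the three buttons allow.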

3.5 The Solution to the Data Transfer Problem

The primary idea for the development was a direct data transfer system. As previously described (see Sect. 2.1), flexible and scalable data transfer between industrial solutions (e.g., OPC UA) and Android applications is a major challenge. Direct data transfer from the server to the Android system is also possible, but requires extensive knowledge of software development. This would contradict the philosophy of the project by increasing the complexity for both the end-user and the developer.

Therefore, a support web server was introduced. It received data from the OPC UA server. The Unity3D Web Socket client (integrated into the AR app) collected the data from the web server (Fig. 5).

Fig. 5. The structure of the data transmission between the machine and the AR application

Such a communication structure had a feedback time of approx. 0.3 s, but it was simple to develop, reliable, and easy to maintain and scale (numerous AR hardware units could retrieve information from the support web server).

Furthermore, our solution appeared to be more universal, as it made OPC UA available for every platform that supports Web Sockets. We thus enabled OPC UA for platforms where no ready-made OPC UA support library existed. The programming language used in the AR app did not play a significant role (C#, JavaScript, Python, etc.). It should be mentioned that we used JSON-RPC (JavaScript Object Notation Remote Procedure Call) via Web Socket, so that the calls were executed on the web server. In other words, our solution brought OPC UA into the browser.
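The JSON-RPC message flow between the AR app and the support web server can be sketched as follows. The method name `read_variable` and the handler interface are illustrative assumptions; only the use of JSON-RPC over a Web Socket is taken from the text, and the transport itself is omitted here.

```python
import json
from itertools import count

_ids = count(1)  # monotonically increasing JSON-RPC request ids

def make_request(method: str, params: dict) -> str:
    """Build a JSON-RPC 2.0 request as the AR app might send it over the
    Web Socket; method and parameter names are illustrative only."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": next(_ids),
        "method": method,
        "params": params,
    })

def handle_request(payload: str, handlers: dict) -> str:
    """Web-server side: dispatch the call (e.g., to an OPC UA read) and
    answer with a JSON-RPC result object carrying the same id."""
    req = json.loads(payload)
    result = handlers[req["method"]](**req["params"])
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})
```

On the server, `handlers` would map each method name to a function that reads or writes the corresponding OPC UA node, so that any Web Socket client can trigger those calls without its own OPC UA library.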

3.6 Resulting System Testing

To test the second prototype, we enlisted the help of laboratory visitors and university staff members. In total, about 30 people participated in the experiment in 2019 and 2020. The same testing methodology as with the first system prototype was used.

The experimental results showed that 100% of the testers, following the AR app guidance, were able to fulfil the service requirements of the laboratory unit at the first attempt in an error situation. However, more than 60% of the testers reported the loss of position of virtual objects during sharp head movements. The issue was eliminated by modifying the Vuforia tracking algorithms and using extended tracking. With this modification, the system could work even when there were no trackers in the camera’s field of view. Nevertheless, at the start of the program, the tracker image was still needed to initialize the object positions.

4 Technical Maintenance Simulation (Error Correction)

To solve the introduced error of “short” door blocking, the user received a message to clear the door’s path (Fig. 6a).

Fig. 6. a The first error (“short” door blocking). b Switching to manual mode. c Manual control of the door movement. d Release of the E-button

In the case of “long” door blocking, the user first had to clear the door’s path. Then the user needed to follow the AR instructions to activate the manual control mode (Fig. 6b) and close the door manually with the “Down” button (Fig. 6c).

As previously described (see Sect. 3.6), the system could detect errors and monitor every relevant step of the user and of the error correction. This enabled automatic switching between the AR instructions for the individual steps.

If the door was blocked with the E-Button, the user was informed about the further steps. He had to release the emergency stop button (Fig. 6d) and then, as explained before, switch the system to manual operation before closing the door and restarting (as in Fig. 6a and b).

5 Results

The AR application developed in this project can be used as an assistance system to solve problems and errors with a laboratory unit. Such an interactive facility replaces the traditional study of maintenance manuals. The visual feedback informs the user about errors in the system and whether they have been rectified.

Based on the recognized tracker pictures, AR objects can be displayed at the required positions. After the first recognition of the picture, the system can work even when no trackers remain in the camera’s field of view. These AR objects reveal the error’s exact location and give instructions that guide the user to resume the system’s proper operation. The user is enabled to quickly and successfully follow the necessary steps of error correction. The use of the system for training and education purposes is foreseen, and the system has a positive effect on reducing downtime.

6 Conclusion and Future Work

This work presents a practical application of Augmented Reality for maintenance in a machine environment. It is shown that communication between the current hardware and software components (OPC UA and Android applications), and therefore the realization of a technical assistance system, is possible without additional expense. Moreover, the project demonstrates the possibility of developing and using an AR application with freely available software (the Vuforia Augmented Reality SDK in conjunction with the game engine Unity3D).

In the project presented in this article, a new scalable method of data transfer has been developed and tested. The achieved solution is portable and available for every platform that supports Web Sockets. The problem of user activity indication is solved by implementing new hardware elements in the mechanical environment.

The authors would like to emphasize the importance of equipping modern generations of systems and plants with an extensive number of sensors [13] to precisely detect each user action and the order of these actions, as a prerequisite for the further successful implementation of AR assistance systems. The next step may be to support more machine operations. Further development in this direction might include the binding of additional AR visual effects libraries, as well as the introduction of new user-machine interactions. It might enable the use of a single OPC UA solution for collecting data from even more units and plants in the laboratory. The collected data could finally be used for more AR hardware and operators.