TrainAR: An Authoring Tool for Augmented Reality Trainings
TrainAR is a holistic, threefold combination of an interaction concept, a didactic framework and an authoring tool for Augmented Reality (AR) trainings on handheld Android and iOS devices (Blattgerste et al. 2021). It is completely open source and free, and it enables non-programmers as well as programmers without AR-specific expertise to create interactive, engaging, procedural Augmented Reality trainings. This repository contains the technical components of the interaction concept and authoring tool of TrainAR in the form of a custom Unity 2022.1 Editor Extension. It can be used with the Unity Editor on Windows, macOS (Apple Silicon), macOS (Intel), or Linux and deploys to Android and iOS devices. The authoring tool offers TrainAR's onboarding animations, tracking solutions, assembly placement, evaluated interaction concepts, layered feedback modalities and training assessments out of the box. This allows authors of AR trainings to focus on the content of the training instead of technical challenges. Authors simply import 3D models into the tool, convert them to TrainAR objects and reference them in a visual-scripting stateflow (inspired by work-process analyses) to create a procedural flow of instructions, user actions and feedback.
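To illustrate the authoring idea (import objects, then wire them into a stateflow of instructions, expected user actions and feedback), here is a minimal conceptual sketch in Python. This is emphatically not TrainAR's actual API, which is a Unity/C# Editor Extension with a visual-scripting graph; all names below are hypothetical and only model the concept:

```python
# Hypothetical sketch: a procedural stateflow in the spirit of TrainAR's
# visual-scripting graph. Each step pairs an instruction with one expected
# user action and with positive/negative feedback. NOT the TrainAR API.

class Step:
    def __init__(self, instruction, expected_action, feedback_ok, feedback_err):
        self.instruction = instruction          # text shown to the learner
        self.expected_action = expected_action  # e.g. ("combine", "syringe", "vial")
        self.feedback_ok = feedback_ok
        self.feedback_err = feedback_err

class Stateflow:
    def __init__(self, steps):
        self.steps = steps
        self.index = 0   # current position in the procedural flow
        self.errors = 0  # counted for the end-of-training assessment

    @property
    def finished(self):
        return self.index >= len(self.steps)

    def current_instruction(self):
        return self.steps[self.index].instruction

    def handle_action(self, action):
        """Advance on the expected action; count an error otherwise."""
        step = self.steps[self.index]
        if action == step.expected_action:
            self.index += 1
            return step.feedback_ok
        self.errors += 1
        return step.feedback_err

# A two-step toy scenario (content is illustrative only):
flow = Stateflow([
    Step("Disinfect your hands.", ("interact", "disinfectant"),
         "Correct!", "Try the disinfectant first."),
    Step("Draw up the medication.", ("combine", "syringe", "vial"),
         "Well done!", "Combine the syringe with the vial."),
])

print(flow.current_instruction())                         # "Disinfect your hands."
print(flow.handle_action(("interact", "disinfectant")))   # "Correct!"
print(flow.handle_action(("grab", "syringe")))            # wrong action -> hint
print(flow.errors, flow.finished)
```

In the real tool, authors build this flow graphically by connecting nodes rather than writing code, which is what makes TrainAR accessible to non-programmers.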
The idea behind TrainAR is simple: realistic deployments of head-mounted AR devices remain a challenge today because of high costs, a lack of relevant training content, and the novelty of interactions that require in-depth onboarding. In contrast, smartphone-based AR would be realistically scalable today while still retaining many of the learning benefits. At least in theory. While possible, most mobile AR learning applications today focus on visualization instead of interaction, severely limiting their application scope. In line with recent findings that, in terms of training outcome, tangible interactions do not significantly increase retention or transfer of knowledge compared to purely virtual interaction approaches (Knierim et al. 2020), TrainAR aims to be a holistic and scalable solution for procedural task training using Augmented Reality on handheld devices. The idea is not to replace practical trainings but to use TrainAR scenarios for concept and procedure understanding in preparation for, or as retention training after, practical training sessions. In line with Gagné (1984), it is envisioned as a novel type of multimedia source that trains intellectual skills and cognitive strategies, but not the associated motor skills.
Publications on TrainAR
- J. Blattgerste, “The Design Space of Augmented Reality Authoring Tools and its Exploration for the Procedural Training Context,” PhD Thesis, 2024. doi:10.4119/unibi/2987473
- J. Blattgerste, J. Behrends, and T. Pfeiffer, “TrainAR: An Open-Source Visual Scripting-Based Authoring Tool for Procedural Mobile Augmented Reality Trainings,” Information, vol. 14, iss. 4, 2023. doi:10.3390/info14040219
- J. Blattgerste and T. Pfeiffer, “TrainAR: Ein Augmented Reality Training Autorensystem,” in Wettbewerbsband AVRiL 2022, Bonn, 2022, pp. 40-45. doi:10.18420/avril2022_06
- J. Blattgerste, K. Luksch, C. Lewa, and T. Pfeiffer, “TrainAR: A Scalable Interaction Concept and Didactic Framework for Procedural Trainings Using Handheld Augmented Reality,” Multimodal Technologies and Interaction, vol. 5, iss. 7, 2021. doi:10.3390/mti5070030
Publications on TrainAR Trainings
- R. Dörner, A. Tesch, J. Rüggeberg, A. Fuchs, P. Grimm, J. Hillig, T. Pfeiffer, J. Blattgerste, A. Bernloehr, K. Vogel, N. H. Bauer, A. Pestov, U. Spierling, C. Geiger, A. Hildebrand, T. Tropper, W. Wilke, C. Winkler, E. Makled, E. Schott, W. Broll, F. Weidner, B. Fröhlich, G. Göbel, M. Steinhauser, T. Schwandt, G. Kumari, G. Stolz, S. Werner, Y. Uzun, and L. Oppermann, “Fallbeispiele für VR/AR,” in Virtual und Augmented Reality (VR/AR): Grundlagen und Methoden von Extended Realities (XR), R. Dörner, W. Broll, P. Grimm, and B. Jung, Eds., Berlin, Heidelberg: Springer Berlin Heidelberg, 2026, p. 579–628. doi:10.1007/978-3-662-72309-8_11
- K. Vogel, A. Bernloehr, T. Willmeroth, J. Blattgerste, C. Hellmers, and N. H. Bauer, “Augmented reality simulation-based training for midwifery students and its impact on perceived knowledge, confidence and skills for managing critical incidents,” Midwifery, vol. 136, p. 104064, 2024. doi:10.1016/j.midw.2024.104064
- J. L. Domínguez Alfaro, S. Gantois, J. Blattgerste, R. De Croon, K. Verbert, T. Pfeiffer, and P. Van Puyvelde, “Mobile Augmented Reality Laboratory for Learning Acid–Base Titration,” Journal of Chemical Education, 2022. doi:10.1021/acs.jchemed.1c00894
- J. Blattgerste, J. Franssen, M. Arztmann, and T. Pfeiffer, “Motivational benefits and usability of a handheld Augmented Reality game for anatomy learning,” in 2022 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), 2022.
- M. Arztmann, J. L. Domínguez Alfaro, J. Blattgerste, J. Jeuring, and P. Van Puyvelde, “Marie’s ChemLab: a Mobile Augmented Reality Game to Teach Basic Chemistry to Children,” in European Conference on Games Based Learning, 2022.
- J. Blattgerste, C. Lewa, K. Vogel, T. Willmeroth, S. Janßen, J. Franssen, J. Behrends, M. Joswig, T. Schäfer, N. H. Bauer, A. Bernloehr, and T. Pfeiffer, “Die Heb@AR App – Eine Android & iOS App mit Augmented Reality Trainings für selbstbestimmtes und curriculares Lernen in der hochschulischen Hebammenausbildung,” in Wettbewerbsband AVRiL 2022, Bonn, 2022, pp. 4-9. doi:10.18420/avril2022_01
- J. Blattgerste, K. Vogel, C. Lewa, T. Willmeroth, M. Joswig, T. Schäfer, N. H. Bauer, A. Bernloehr, and T. Pfeiffer, “The Heb@AR App – Five Augmented Reality Trainings for Self-Directed Learning in Academic Midwifery Education,” in DELFI 2022 – Die 20. Fachtagung Bildungstechnologien der Gesellschaft für Informatik e.V., 2022.
- J. Blattgerste, K. Luksch, C. Lewa, M. Kunzendorf, N. H. Bauer, A. Bernloehr, M. Joswig, T. Schäfer, and T. Pfeiffer, “Project Heb@AR: Exploring handheld Augmented Reality training to supplement academic midwifery education,” in DELFI 2020 – Die 18. Fachtagung Bildungstechnologien der Gesellschaft für Informatik e.V., Bonn, 2020.
Examples of TrainAR Trainings
TrainAR has already been used to create and evaluate procedural AR trainings across a wide range of domains. The examples shown here span academic midwifery and anatomy learning, chemistry and laboratory work, classification tasks, household and workplace procedures, as well as manual assembly and tool-based activities. This breadth illustrates the central idea of TrainAR: the same reusable interaction, feedback, and assessment principles can be transferred to very different procedural contexts, while the concrete learning content remains domain-specific.
Across these scenarios, TrainAR supports learners who need to understand, rehearse, and retain procedural knowledge before or after hands-on practice. The project originated in AR-based training research for academic midwifery and has since been extended to additional educational and procedural contexts through collaborations and follow-up work.
Example Onboarding Sequence & Interaction of a TrainAR Training (from the Heb@AR training "Preparation of an emergency tocolysis")
The onboarding sequence is designed to introduce both the scenario and the interaction metaphor before the actual AR task begins. It first frames the training context and optional guidance, then teaches the core interaction metaphors, and finally supports learners in scanning their environment, placing the virtual training assembly, accessing help menus, recovering from tracking issues, and reviewing their performance at the end. (Described in more detail here)
a) Scenario introduction and task context.
b) Optional technical onboarding and expert insights.
c) Tutorial for grabbing virtual objects.
d) Tutorial for interacting with objects.
e) Tutorial for combining objects.
f) Scanning the room to detect a suitable area.
g) Positioning and confirming training placement.
h) In-training menu for replaying tutorials, replacing the assembly, or leaving the scenario.
i) Warning overlay for AR tracking problems such as low light or insufficient feature points.
j) End screen with contextualized performance feedback and training assessment.
The interaction sequence shows how TrainAR supports procedural actions and layered feedback during a training. It covers object selection, action availability, grabbing and combining, scenario-specific custom actions, immediate positive and negative feedback, escalated error overlays, and quiz-based checks for knowledge that is better assessed through UI elements than through direct object manipulation alone. (Described in more detail here)
a) Selecting an object, here combined with a context-triggered insight.
b) No object selected and no action currently available.
c) A grabbed object held in front of the learner for manipulation.
d) A combining state in which one object is aligned with another and combining them becomes possible.
e) A scenario-specific custom action, here drawing liquid into the syringe.
f) Positive feedback for a correct interaction.
g) Positive feedback with additional contextual guidance.
h) Negative feedback after an incorrect action.
i) A stronger overlay for severe or repeated errors.
j) A custom action in the form of a quiz (knowledge check).
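The layered feedback shown in panels (f) through (i) can be understood as an escalation policy: correct actions receive positive feedback, single errors receive a brief negative hint, and repeated errors on the same step escalate to a stronger overlay. The Python sketch below models this policy conceptually; it is not TrainAR's actual implementation, and the escalation threshold is an assumed value:

```python
# Hypothetical sketch of layered error feedback, as in panels (f)-(i).
# Correct actions get positive feedback, single errors a short hint,
# and a streak of errors escalates to a blocking overlay.
# Conceptual model only; NOT TrainAR's actual code or thresholds.

ESCALATION_THRESHOLD = 3  # assumed value, not taken from TrainAR

def feedback(correct: bool, consecutive_errors: int) -> str:
    """Pick a feedback layer from correctness and the current error streak."""
    if correct:
        return "positive"          # panels f/g: confirmation, optional guidance
    if consecutive_errors < ESCALATION_THRESHOLD:
        return "negative-hint"     # panel h: brief negative feedback
    return "error-overlay"         # panel i: stronger overlay for repeated errors

# A short interaction trace: one success, three errors in a row, one success.
streak = 0
trace = []
for correct in [True, False, False, False, True]:
    streak = 0 if correct else streak + 1
    trace.append(feedback(correct, streak))
print(trace)  # ['positive', 'negative-hint', 'negative-hint', 'error-overlay', 'positive']
```

Resetting the streak on every correct action keeps the overlay reserved for genuinely stuck learners, which matches the "severe or repeated errors" behavior described for panel (i).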
Open Source Project
The source code is available at https://github.com/jblattgerste/TrainAR/ with a full documentation at https://jblattgerste.github.io/TrainAR/
If you find problems or bugs, or want to provide feedback or suggestions, contact us via email or open an issue on the GitHub project.
Feel free to contribute to the project through suggestions, feedback, or pull requests to the open-source GitHub repository.
Acknowledgement
Blattgerste, J.; Behrends, J.; Pfeiffer, T. (2023) TrainAR: An Open-Source Visual Scripting-Based Authoring Tool for Procedural Mobile Augmented Reality Trainings. Information, 14 (4), 219. doi:10.3390/info14040219
@article{Blattgerste2023TrainAR,
AUTHOR = {Blattgerste, Jonas and Behrends, Jan and Pfeiffer, Thies},
TITLE = {TrainAR: An Open-Source Visual Scripting-Based Authoring Tool for Procedural Mobile Augmented Reality Trainings},
JOURNAL = {Information},
VOLUME = {14},
YEAR = {2023},
NUMBER = {4},
ARTICLE-NUMBER = {219},
URL = {https://www.mdpi.com/2078-2489/14/4/219},
ISSN = {2078-2489},
DOI = {10.3390/info14040219}
}
Responsible Investigators
Prof. Dr. Thies Pfeiffer
Email: thies.pfeiffer@hs-emden-leer.de
Dr. Jonas Blattgerste
Email: jonas.blattgerste@hs-emden-leer.de
Software development: Jonas Blattgerste, Sven Janßen, Jan Behrends