Jonas Blattgerste

Research Associate

University of Applied Sciences Emden/Leer

Address

University of Applied Sciences Emden/Leer, Constantiaplatz 4, 26723 Emden, Room S110

Contact Information

Phone: 04921 8071877

Email: jonas.blattgerste@hs-emden-leer.de

Jonas Blattgerste is a Human–Computer Interaction researcher and Augmented & Mixed Reality developer. His current research interests are head-mounted, handheld and projection-based Augmented Reality approaches for training and assistance in work environments and educational settings, and for the support of cognitively impaired people. In his PhD, he is especially interested in the development and design space of Augmented Reality authoring tools.

Publications

  • J. Blattgerste, K. Luksch, C. Lewa, and T. Pfeiffer, “TrainAR: A Scalable Interaction Concept and Didactic Framework for Procedural Trainings Using Handheld Augmented Reality,” Multimodal Technologies and Interaction, vol. 5, iss. 7, 2021. doi:10.3390/mti5070030
    The potential of Augmented Reality (AR) for educational and training purposes is well known. While large-scale deployments of head-mounted AR headsets remain challenging due to technical limitations and cost factors, advances in mobile devices and tracking solutions introduce handheld AR devices as a powerful, broadly available alternative, yet with some restrictions. One of the current limitations of AR training applications on handheld AR devices is that most offer rather static experiences, only providing descriptive knowledge with little interactivity. Holistic concepts for the coverage of procedural knowledge are largely missing. The contribution of this paper is twofold. We propose a scalable interaction concept for handheld AR devices with an accompanying didactic framework for procedural training tasks called TrainAR. Then, we implement TrainAR for a training scenario in academic midwifery education and explain the educational theories behind our framework and how to apply it for procedural training tasks. We evaluate and subsequently improve the concept based on three formative usability studies (n = 24), where explicitness, redundant feedback mechanisms and onboarding were identified as major success factors. Finally, we conclude by discussing derived implications for improvements and ongoing and future work.
    @Article{mti5070030,
    AUTHOR = {Blattgerste, Jonas and Luksch, Kristina and Lewa, Carmen and Pfeiffer, Thies},
    TITLE = {{TrainAR}: A Scalable Interaction Concept and Didactic Framework for Procedural Trainings Using Handheld Augmented Reality},
    JOURNAL = {Multimodal Technologies and Interaction},
    VOLUME = {5},
    YEAR = {2021},
    NUMBER = {7},
    ARTICLE-NUMBER = {30},
    URL = {https://www.mdpi.com/2414-4088/5/7/30},
    ISSN = {2414-4088},
    ABSTRACT = {The potential of Augmented Reality (AR) for educational and training purposes is well known. While large-scale deployments of head-mounted AR headsets remain challenging due to technical limitations and cost factors, advances in mobile devices and tracking solutions introduce handheld AR devices as a powerful, broadly available alternative, yet with some restrictions. One of the current limitations of AR training applications on handheld AR devices is that most offer rather static experiences, only providing descriptive knowledge with little interactivity. Holistic concepts for the coverage of procedural knowledge are largely missing. The contribution of this paper is twofold. We propose a scalable interaction concept for handheld AR devices with an accompanying didactic framework for procedural training tasks called TrainAR. Then, we implement TrainAR for a training scenario in academic midwifery education and explain the educational theories behind our framework and how to apply it for procedural training tasks. We evaluate and subsequently improve the concept based on three formative usability studies (n = 24), where explicitness, redundant feedback mechanisms and onboarding were identified as major success factors. Finally, we conclude by discussing derived implications for improvements and ongoing and future work.},
    DOI = {10.3390/mti5070030}
    }
  • J. Blattgerste, P. Renner, and T. Pfeiffer, “Authorable Augmented Reality Instructions for Assistance and Training in Work Environments,” in Proceedings of the 18th International Conference on Mobile and Ubiquitous Multimedia, New York, NY, USA, 2019. doi:10.1145/3365610.3365646
    Augmented Reality (AR) is a promising technology for assistance and training in work environments, as it can provide contextualised instructions and feedback. Impaired workers especially can benefit from this technology. While previous work has mostly focused on using AR to assist or train specific predefined tasks, “general purpose” AR applications that can be used to intuitively author new tasks at run-time are largely missing. The contribution of this work is twofold: First, we develop an AR authoring tool on the Microsoft HoloLens in combination with a smartphone as an additional controller, following considerations based on related work, guidelines and focus group interviews. Then, we evaluate the usability of the authoring tool itself and the produced AR instructions on a qualitative level in realistic scenarios and gather feedback. As the results reveal a positive reception, we discuss authorable AR as a viable form of AR assistance or training in work environments.
    @inproceedings{blattgerste2019authorable,
    author = {Blattgerste, Jonas and Renner, Patrick and Pfeiffer, Thies},
    title = {Authorable Augmented Reality Instructions for Assistance and Training in Work Environments},
    year = {2019},
    isbn = {9781450376242},
    publisher = {Association for Computing Machinery},
    address = {New York, NY, USA},
    doi = {10.1145/3365610.3365646},
    booktitle = {Proceedings of the 18th International Conference on Mobile and Ubiquitous Multimedia},
    articleno = {34},
    numpages = {11},
    keywords = {training, cognitive impairments, augmented reality, annotation, mixed reality, authoring, assistance},
    location = {Pisa, Italy},
    series = {MUM '19},
    url = {https://mixality.de/wp-content/uploads/2020/07/blattgerste2019authorable.pdf},
    abstract = {Augmented Reality (AR) is a promising technology for assistance and training in work environments, as it can provide contextualised instructions and feedback. Impaired workers especially can benefit from this technology. While previous work has mostly focused on using AR to assist or train specific predefined tasks, "general purpose" AR applications that can be used to intuitively author new tasks at run-time are largely missing. The contribution of this work is twofold: First, we develop an AR authoring tool on the Microsoft HoloLens in combination with a smartphone as an additional controller, following considerations based on related work, guidelines and focus group interviews. Then, we evaluate the usability of the authoring tool itself and the produced AR instructions on a qualitative level in realistic scenarios and gather feedback. As the results reveal a positive reception, we discuss authorable AR as a viable form of AR assistance or training in work environments.}
    }
  • J. Blattgerste and T. Pfeiffer, “Promptly Authored Augmented Reality Instructions Can Be Sufficient to Enable Cognitively Impaired Workers,” in GI VR / AR Workshop 2020, 2020.
    The benefits of contextualising information through Augmented Reality (AR) instructions to assist cognitively impaired workers are well known, but most findings are based on AR instructions carefully designed for predefined standard tasks. Previous findings indicate that the modality and quality of the provided AR instructions have a significant impact on the provided benefits. The emergence of commercial products providing tools for instructors to promptly author their own AR instructions elicits the question whether instructions created with them are sufficient to support cognitively impaired workers. This paper explores this question through a qualitative study using an AR authoring tool to create AR instructions for a task that none of the 10 participants had been able to complete previously. Using promptly authored instructions, however, most were able to complete the task. Additionally, they reported good usability and gave qualitative feedback indicating that they would like to use comparable AR instructions more often.
    @inproceedings{blattgerste2020prompty,
    title={Promptly Authored Augmented Reality Instructions Can Be Sufficient to Enable Cognitively Impaired Workers},
    author={Blattgerste, Jonas and Pfeiffer, Thies},
    booktitle={{GI VR / AR Workshop 2020}},
    year={2020},
    url = {https://mixality.de/wp-content/uploads/2020/07/blattgerste2020prompty.pdf},
    abstract = {The benefits of contextualising information through Augmented Reality (AR) instructions to assist cognitively impaired workers are well known, but most findings are based on AR instructions carefully designed for predefined standard tasks. Previous findings indicate that the modality and quality of the provided AR instructions have a significant impact on the provided benefits. The emergence of commercial products providing tools for instructors to promptly author their own AR instructions elicits the question whether instructions created with them are sufficient to support cognitively impaired workers. This paper explores this question through a qualitative study using an AR authoring tool to create AR instructions for a task that none of the 10 participants had been able to complete previously. Using promptly authored instructions, however, most were able to complete the task. Additionally, they reported good usability and gave qualitative feedback indicating that they would like to use comparable AR instructions more often.}
    }
  • J. Blattgerste, K. Luksch, C. Lewa, M. Kunzendorf, N. H. Bauer, A. Bernloehr, M. Joswig, T. Schäfer, and T. Pfeiffer, “Project Heb@AR: Exploring handheld Augmented Reality training to supplement academic midwifery education,” in DELFI 2020 – Die 18. Fachtagung Bildungstechnologien der Gesellschaft für Informatik e.V., Bonn, 2020, pp. 103-108.
    Augmented Reality (AR) promises great potential for training applications, as it allows providing the trainee with contextualized instructions and feedback. In recent years, AR has reached a state of technical feasibility that allows not only for larger, long-term evaluations, but also for explorations of its application to specific training use cases. The BMBF-funded project Heb@AR explores the utilization of handheld AR as a supplementary tool for practical training in academic midwifery education: specifically, how and where AR can be used most effectively in this context, how acceptability and accessibility can be ensured for tutors and trainees, and how well emergency situations can be simulated using the technology. In this paper, an overview of the Heb@AR project is provided, the goals of the project are stated, and the project’s research questions are discussed from a technical perspective. Furthermore, insights are provided into the current state and the development process of the first AR training prototype: the preparation of a tocolytic injection.
    @inproceedings{blattgerste2020hebar,
    author = {Blattgerste, Jonas and Luksch, Kristina and Lewa, Carmen and Kunzendorf, Martina and Bauer, Nicola H. and Bernloehr, Annette and Joswig, Matthias and Schäfer, Thorsten and Pfeiffer, Thies},
    title = {Project Heb@AR: Exploring handheld Augmented Reality training to supplement academic midwifery education},
    booktitle = {DELFI 2020 – Die 18. Fachtagung Bildungstechnologien der Gesellschaft für Informatik e.V.},
    year = {2020},
    editor = {Zender, Raphael and Ifenthaler, Dirk and Leonhardt, Thiemo and Schumacher, Clara},
    pages = {103--108},
    publisher = {Gesellschaft für Informatik e.V.},
    address = {Bonn},
    url = {https://dl.gi.de/bitstream/handle/20.500.12116/34147/103%20DELFI2020_paper_79.pdf?sequence=1&isAllowed=y},
    abstract = {Augmented Reality (AR) promises great potential for training applications, as it allows providing the trainee with contextualized instructions and feedback. In recent years, AR has reached a state of technical feasibility that allows not only for larger, long-term evaluations, but also for explorations of its application to specific training use cases. The BMBF-funded project Heb@AR explores the utilization of handheld AR as a supplementary tool for practical training in academic midwifery education: specifically, how and where AR can be used most effectively in this context, how acceptability and accessibility can be ensured for tutors and trainees, and how well emergency situations can be simulated using the technology. In this paper, an overview of the Heb@AR project is provided, the goals of the project are stated, and the project’s research questions are discussed from a technical perspective. Furthermore, insights are provided into the current state and the development process of the first AR training prototype: the preparation of a tocolytic injection.}
    }
  • J. Blattgerste, P. Renner, and T. Pfeiffer, “Augmented Reality Action Assistance and Learning for Cognitively Impaired People. A Systematic Literature Review,” in The 12th PErvasive Technologies Related to Assistive Environments Conference (PETRA ’19), 2019. doi:10.1145/3316782.3316789
    Augmented reality (AR) is a promising tool for many situations in which assistance is needed, as it allows for instructions and feedback to be contextualized. While research and development in this area have been primarily driven by industry, AR could also have a huge impact on those who need assistance the most: cognitively impaired people of all ages. In recent years some primary research on applying AR for action assistance and learning in the context of this target group has been conducted. However, the research field is sparsely covered and contributions are hard to categorize. An overview of the current state of research is missing. We contribute to filling this gap by providing a systematic literature review covering 52 publications. We describe the often rather technical publications on an abstract level and quantitatively assess their usage purpose, the targeted age group and the type of AR device used. Additionally, we provide insights on the current challenges and chances of AR learning and action assistance for people with cognitive impairments. We discuss trends in the research field, including potential future work for researchers to focus on.
    @inproceedings{2934446,
    abstract = {Augmented reality (AR) is a promising tool for many situations in which assistance is needed, as it allows for instructions and feedback to be contextualized. While research and development in this area have been primarily driven by industry, AR could also have a huge impact on those who need assistance the most: cognitively impaired people of all ages. In recent years some primary research on applying AR for action assistance and learning in the context of this target group has been conducted. However, the research field is sparsely covered and contributions are hard to categorize. An overview of the current state of research is missing. We contribute to filling this gap by providing a systematic literature review covering 52 publications. We describe the often rather technical publications on an abstract level and quantitatively assess their usage purpose, the targeted age group and the type of AR device used. Additionally, we provide insights on the current challenges and chances of AR learning and action assistance for people with cognitive impairments. We discuss trends in the research field, including potential future work for researchers to focus on.},
    author = {Blattgerste, Jonas and Renner, Patrick and Pfeiffer, Thies},
    booktitle = {The 12th PErvasive Technologies Related to Assistive Environments Conference (PETRA ’19)},
    location = {Rhodes, Greece},
    publisher = {ACM},
    title = {{Augmented Reality Action Assistance and Learning for Cognitively Impaired People. A Systematic Literature Review}},
    url = {https://nbn-resolving.org/urn:nbn:de:0070-pub-29344462, https://pub.uni-bielefeld.de/record/2934446},
    doi = {10.1145/3316782.3316789},
    year = {2019},
    }
  • J. Blattgerste, P. Renner, and T. Pfeiffer, “Advantages of Eye-Gaze over Head-Gaze-Based Selection in Virtual and Augmented Reality under Varying Field of Views,” in COGAIN ’18. Proceedings of the Symposium on Communication by Gaze Interaction, 2018. doi:10.1145/3206343.3206349
    The current best practice for hands-free selection using Virtual and Augmented Reality (VR/AR) head-mounted displays is to use head-gaze for aiming and dwell-time or clicking for triggering the selection. There is an observable trend for new VR and AR devices to come with integrated eye-tracking units to improve rendering, to provide means for attention analysis or for social interactions. Eye-gaze has been successfully used for human-computer interaction in other domains, primarily on desktop computers. In VR/AR systems, aiming via eye-gaze could be significantly faster and less exhausting than via head-gaze. To evaluate benefits of eye-gaze-based interaction methods in VR and AR, we compared aiming via head-gaze and aiming via eye-gaze. We show that eye-gaze outperforms head-gaze in terms of speed, task load, required head movement and user preference. We furthermore show that the advantages of eye-gaze further increase with larger FOV sizes.
    @inproceedings{2919602,
    abstract = {The current best practice for hands-free selection using Virtual and Augmented Reality (VR/AR) head-mounted displays is to use head-gaze for aiming and dwell-time or clicking for triggering the selection. There is an observable trend for new VR and AR devices to come with integrated eye-tracking units to improve rendering, to provide means for attention analysis or for social interactions. Eye-gaze has been successfully used for human-computer interaction in other domains, primarily on desktop computers. In VR/AR systems, aiming via eye-gaze could be significantly faster and less exhausting than via head-gaze. To evaluate benefits of eye-gaze-based interaction methods in VR and AR, we compared aiming via head-gaze and aiming via eye-gaze. We show that eye-gaze outperforms head-gaze in terms of speed, task load, required head movement and user preference. We furthermore show that the advantages of eye-gaze further increase with larger FOV sizes.},
    author = {Blattgerste, Jonas and Renner, Patrick and Pfeiffer, Thies},
    booktitle = {COGAIN '18. Proceedings of the Symposium on Communication by Gaze Interaction},
    isbn = {978-1-4503-5790-6},
    keywords = {Augmented Reality, Virtual Reality, Assistance Systems, Head-Mounted Displays, Eye-Tracking, Field of View, Human Computer Interaction},
    location = {Warsaw, Poland},
    publisher = {ACM},
    title = {{Advantages of Eye-Gaze over Head-Gaze-Based Selection in Virtual and Augmented Reality under Varying Field of Views}},
    url = {https://nbn-resolving.org/urn:nbn:de:0070-pub-29196024, https://pub.uni-bielefeld.de/record/2919602},
    doi = {10.1145/3206343.3206349},
    year = {2018},
    }
  • P. Renner, J. Blattgerste, and T. Pfeiffer, “A Path-based Attention Guiding Technique for Assembly Environments with Target Occlusions,” in IEEE Virtual Reality 2018, 2018.
    @inproceedings{2917385,
    author = {Renner, Patrick and Blattgerste, Jonas and Pfeiffer, Thies},
    booktitle = {IEEE Virtual Reality 2018},
    location = {Reutlingen},
    publisher = {IEEE},
    title = {{A Path-based Attention Guiding Technique for Assembly Environments with Target Occlusions}},
    url = {https://nbn-resolving.org/urn:nbn:de:0070-pub-29173854, https://pub.uni-bielefeld.de/record/2917385},
    year = {2018},
    }
  • J. Blattgerste, P. Renner, B. Strenge, and T. Pfeiffer, “In-Situ Instructions Exceed Side-by-Side Instructions in Augmented Reality Assisted Assembly,” in Proceedings of the 11th ACM International Conference on PErvasive Technologies Related to Assistive Environments (PETRA’18), 2018, pp. 133–140. doi:10.1145/3197768.3197778
    Driven by endeavors towards Industry 4.0, there is increasing interest in augmented reality (AR) as an approach for assistance in areas like picking, assembly and maintenance. In this work our focus is on AR-based assistance in manual assembly. The design space for AR instructions in this context includes, e.g., side-by-side, 3D or projected 2D presentations. In previous research, the low quality of the AR devices available at the respective time had a significant impact on performance evaluations. Today, a proper and up-to-date comparison of different presentation approaches is missing. This paper presents an improved 3D in-situ instruction and compares it to previously presented techniques. All instructions are implemented on up-to-date AR hardware, namely the Microsoft HoloLens. To support reproducible research, the comparison is made using a standardized benchmark scenario. The results show, contrary to previous research, that in-situ instructions on state-of-the-art AR glasses outperform side-by-side instructions in terms of errors made, task completion time, and perceived task load.
    @inproceedings{2919601,
    abstract = {Driven by endeavors towards Industry 4.0, there is increasing interest in augmented reality (AR) as an approach for assistance in areas like picking, assembly and maintenance. In this work our focus is on AR-based assistance in manual assembly. The design space for AR instructions in this context includes, e.g., side-by-side, 3D or projected 2D presentations. In previous research, the low quality of the AR devices available at the respective time had a significant impact on performance evaluations. Today, a proper and up-to-date comparison of different presentation approaches is missing. This paper presents an improved 3D in-situ instruction and compares it to previously presented techniques. All instructions are implemented on up-to-date AR hardware, namely the Microsoft HoloLens. To support reproducible research, the comparison is made using a standardized benchmark scenario. The results show, contrary to previous research, that in-situ instructions on state-of-the-art AR glasses outperform side-by-side instructions in terms of errors made, task completion time, and perceived task load.},
    author = {Blattgerste, Jonas and Renner, Patrick and Strenge, Benjamin and Pfeiffer, Thies},
    booktitle = {Proceedings of the 11th ACM International Conference on PErvasive Technologies Related to Assistive Environments (PETRA'18)},
    isbn = {978-1-4503-6390-7},
    keywords = {Augmented Reality, Assistance Systems, Head-Mounted Displays, Smart Glasses, Benchmarking},
    location = {Corfu, Greece},
    pages = {133--140},
    publisher = {ACM},
    title = {{In-Situ Instructions Exceed Side-by-Side Instructions in Augmented Reality Assisted Assembly}},
    url = {https://nbn-resolving.org/urn:nbn:de:0070-pub-29196019, https://pub.uni-bielefeld.de/record/2919601},
    doi = {10.1145/3197768.3197778},
    year = {2018},
    }
  • J. Blattgerste, B. Strenge, P. Renner, T. Pfeiffer, and K. Essig, “Comparing Conventional and Augmented Reality Instructions for Manual Assembly Tasks,” in Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments, 2017, pp. 75–82. doi:10.1145/3056540.3056547
    Augmented Reality (AR) is gaining increased attention as a means to provide assistance for different human activities. The suitability of AR, however, depends not only on the respective task, but also, to a high degree, on the respective device. In a standardized assembly task, we tested AR-based in-situ assistance against conventional pictorial instructions using a smartphone, Microsoft HoloLens and Epson Moverio BT-200 smart glasses, as well as paper-based instructions. Participants solved the task fastest using the paper instructions, but made fewer errors with AR assistance on the Microsoft HoloLens smart glasses than with any other system. Methodologically, we propose operational definitions of time segments and other optimizations for standardized benchmarking of AR assembly instructions.
    @inproceedings{2909322,
    abstract = {Augmented Reality (AR) is gaining increased attention as a means to provide assistance for different human activities. The suitability of AR, however, depends not only on the respective task, but also, to a high degree, on the respective device. In a standardized assembly task, we tested AR-based in-situ assistance against conventional pictorial instructions using a smartphone, Microsoft HoloLens and Epson Moverio BT-200 smart glasses, as well as paper-based instructions. Participants solved the task fastest using the paper instructions, but made fewer errors with AR assistance on the Microsoft HoloLens smart glasses than with any other system. Methodologically, we propose operational definitions of time segments and other optimizations for standardized benchmarking of AR assembly instructions.},
    author = {Blattgerste, Jonas and Strenge, Benjamin and Renner, Patrick and Pfeiffer, Thies and Essig, Kai},
    booktitle = {Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments},
    isbn = {978-1-4503-5227-7},
    keywords = {Assistance Systems, Head-Mounted Displays, Smartglasses, Benchmarking},
    location = {Island of Rhodes, Greece},
    pages = {75--82},
    publisher = {ACM},
    title = {{Comparing Conventional and Augmented Reality Instructions for Manual Assembly Tasks}},
    url = {https://nbn-resolving.org/urn:nbn:de:0070-pub-29093227, https://pub.uni-bielefeld.de/record/2909322},
    doi = {10.1145/3056540.3056547},
    year = {2017},
    }

Education

  1. 2018 - 2019

    Intelligent Systems (Master of Science)

    Bielefeld University
  2. 2013 - 2018

    Cognitive Informatics (Bachelor of Science)

    Bielefeld University

Experience

  1. since 2019
    Research Associate
    University of Applied Sciences Emden/Leer
  2. since 2019
    Scientific Consultant
    Raumtänzer GmbH
  3. 2019
    Software Developer
    Raumtänzer GmbH
  4. 2017 - 2019
    Research Assistant
    Bielefeld University