Address

University of Applied Sciences Emden/Leer, Constantiaplatz 4, 26723 Emden, Room S109

Contact Information

Phone: +49 4921 8071877

Email: jonas.blattgerste@hs-emden-leer.de

Jonas Blattgerste, M.Sc.

Research Associate

University of Applied Sciences Emden/Leer

Jonas Blattgerste is a Human–Computer Interaction researcher and an Augmented & Mixed Reality developer. His current research interests include head-mounted, handheld, and projection-based Augmented Reality for training and assistance purposes.

Publications

  • J. Blattgerste, J. Franssen, M. Arztmann, and T. Pfeiffer, “Motivational benefits and usability of a handheld Augmented Reality game for anatomy learning,” in 2022 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR), 2022.
    [BibTeX] [Download PDF]
    @inproceedings{Blattgerste2022Motivational,
    title={Motivational benefits and usability of a handheld Augmented Reality game for anatomy learning},
    author={Blattgerste, Jonas and Franssen, Jannik and Arztmann, Michaela and Pfeiffer, Thies},
    booktitle={2022 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR)},
    year={2022},
    organization={IEEE},
    url = {https://mixality.de/wp-content/uploads/2022/12/Blattgerste2022Motivational.pdf}
    }
  • J. Blattgerste and T. Pfeiffer, “TrainAR: Ein Augmented Reality Training Autorensystem,” in Wettbewerbsband AVRiL 2022, Bonn, 2022, pp. 40-45. doi:10.18420/avril2022_06
    [BibTeX] [Download PDF]
    @inproceedings{Blattgerste2022TrainAR,
    author = {Blattgerste, Jonas and Pfeiffer, Thies},
    title = {TrainAR: Ein Augmented Reality Training Autorensystem},
    booktitle = {Wettbewerbsband AVRiL 2022},
    year = {2022},
    editor = {Söbke, Heinrich and Zender, Raphael},
    pages = {40-45},
    doi = {10.18420/avril2022_06},
    publisher = {Gesellschaft für Informatik e.V.},
    address = {Bonn},
    url = {https://mixality.de/wp-content/uploads/2022/12/Blattgerste2022TrainAR.pdf}
    }
  • J. Blattgerste, C. Lewa, K. Vogel, T. Willmeroth, S. Janßen, J. Franssen, J. Behrends, M. Joswig, T. Schäfer, N. H. Bauer, A. Bernloehr, and T. Pfeiffer, “Die Heb@AR App – Eine Android & iOS App mit Augmented Reality Trainings für selbstbestimmtes und curriculares Lernen in der hochschulischen Hebammenausbildung,” in Wettbewerbsband AVRiL 2022, Bonn, 2022, pp. 4-9. doi:10.18420/avril2022_01
    [BibTeX] [Download PDF]
    @inproceedings{Blattgerste2022HebARAVRiL,
    author = {Blattgerste, Jonas and Lewa, Carmen and Vogel, Kristina and Willmeroth, Tabea and Janßen, Sven and Franssen, Jannik and Behrends, Jan and Joswig, Matthias and Schäfer, Thorsten and Bauer, Nicola H. and Bernloehr, Annette and Pfeiffer, Thies},
    title = {Die Heb@AR App -- Eine Android \& iOS App mit Augmented Reality Trainings für selbstbestimmtes und curriculares Lernen in der hochschulischen Hebammenausbildung},
    booktitle = {Wettbewerbsband AVRiL 2022},
    year = {2022},
    editor = {Söbke, Heinrich and Zender, Raphael},
    pages = {4-9},
    doi = {10.18420/avril2022_01},
    publisher = {Gesellschaft für Informatik e.V.},
    address = {Bonn},
    url = {https://mixality.de/wp-content/uploads/2022/12/Blattgerste2022HebARAVRiL.pdf}
    }
  • J. Blattgerste, J. Behrends, and T. Pfeiffer, “A Web-Based Analysis Toolkit for the System Usability Scale,” in Proceedings of the 15th ACM International Conference on PErvasive Technologies Related to Assistive Environments, 2022.
    [BibTeX] [Download PDF]
    @inproceedings{Blattgerste2022SUS,
    author = {Blattgerste, Jonas and Behrends, Jan and Pfeiffer, Thies},
    booktitle = {Proceedings of the 15th ACM International Conference on PErvasive Technologies Related to Assistive Environments},
    location = {Corfu, Greece},
    title = {A Web-Based Analysis Toolkit for the System Usability Scale},
    url = {https://mixality.de/wp-content/uploads/2022/07/Blattgerste2022SUS.pdf},
    year = {2022}
    }
  • J. Blattgerste, K. Vogel, C. Lewa, T. Willmeroth, M. Joswig, T. Schäfer, N. H. Bauer, A. Bernloehr, and T. Pfeiffer, “The Heb@AR App – Five Augmented Reality Trainings for Self-Directed Learning in Academic Midwifery Education,” in DELFI 2022 – Die 20. Fachtagung Bildungstechnologien der Gesellschaft für Informatik e.V., 2022.
    [BibTeX] [Download PDF]
    @inproceedings{Blattgerste2022HebARApp,
    author = {Blattgerste, Jonas and Vogel, Kristina and Lewa, Carmen and Willmeroth, Tabea and Joswig, Matthias and Schäfer, Thorsten and Bauer, Nicola H. and Bernloehr, Annette and Pfeiffer, Thies},
    booktitle = {DELFI 2022 – Die 20. Fachtagung Bildungstechnologien der Gesellschaft für Informatik e.V.},
    location = {Karlsruhe, Germany},
    title = {The Heb@AR App – Five Augmented Reality Trainings for Self-Directed Learning in Academic Midwifery Education},
    url = {https://mixality.de/wp-content/uploads/2022/07/Blattgerste2022DELFI.pdf},
    year = {2022}
    }
  • M. Arztmann, J. L. Domínguez Alfaro, J. Blattgerste, J. Jeuring, and P. Van Puyvelde, “Marie’s ChemLab: a Mobile Augmented Reality Game to Teach Basic Chemistry to Children,” in European Conference on Games Based Learning, 2022.
    [BibTeX] [Download PDF]
    @inproceedings{Arztmann2022Marie,
    author = {Arztmann, Michaela and Domínguez Alfaro, Jessica Lizeth and Blattgerste, Jonas and Jeuring, Johan and Van Puyvelde, Peter},
    booktitle={European Conference on Games Based Learning},
    location = {Lisbon, Portugal},
    title = {Marie’s ChemLab: a Mobile Augmented Reality Game to Teach Basic Chemistry to Children},
    url = {https://mixality.de/wp-content/uploads/2022/07/Arztmann2022MariesChemLab.pdf},
    year = {2022}
    }
  • J. L. Domínguez Alfaro, S. Gantois, J. Blattgerste, R. De Croon, K. Verbert, T. Pfeiffer, and P. Van Puyvelde, “Mobile Augmented Reality Laboratory for Learning Acid–Base Titration,” Journal of Chemical Education, 2022. doi:10.1021/acs.jchemed.1c00894
    [BibTeX] [Abstract] [Download PDF]
    Traditionally, laboratory practice aims to establish schemas learned by students in theoretical courses through concrete experiences. However, access to laboratories might not always be available to students. Therefore, it is advantageous to diversify the tools that students could use to train practical skills. This technology report describes the design, development, and first testing of a mobile augmented reality application that enables a hands-on learning experience of a titration experiment. Additionally, it presents the extension of the TrainAR framework for chemical education through the implementation of specific domain features, i.e., logbook, graph, and practical oriented hints. To test the application, 15 participants were recruited from five different high schools and two universities in Belgium. The findings reflect that the MAR Lab app was well-received by the users. In addition, they valued the design elements (e.g., logbook and multiple-choice questions), and the system has “good” usability (SUS score 72.8, SD = 14.0). Nevertheless, the usability and learners’ experience can be improved by tackling technical problems, providing more explicit instructions for subtasks, and modifying certain features. Therefore, future development will concentrate on improving upon these shortcomings, adding additional levels to target a larger audience, and evaluating the improvements’ effects with more participants.
    @article{doi:10.1021/acs.jchemed.1c00894,
    author = {Domínguez Alfaro, Jessica Lizeth and Gantois, Stefanie and Blattgerste, Jonas and De Croon, Robin and Verbert, Katrien and Pfeiffer, Thies and Van Puyvelde, Peter},
    title = {Mobile Augmented Reality Laboratory for Learning Acid–Base Titration},
    journal = {Journal of Chemical Education},
    year = {2022},
    doi = {10.1021/acs.jchemed.1c00894},
    abstract={Traditionally, laboratory practice aims to establish schemas learned by students in theoretical courses through concrete experiences. However, access to laboratories might not always be available to students. Therefore, it is advantageous to diversify the tools that students could use to train practical skills. This technology report describes the design, development, and first testing of a mobile augmented reality application that enables a hands-on learning experience of a titration experiment. Additionally, it presents the extension of the TrainAR framework for chemical education through the implementation of specific domain features, i.e., logbook, graph, and practical oriented hints. To test the application, 15 participants were recruited from five different high schools and two universities in Belgium. The findings reflect that the MAR Lab app was well-received by the users. In addition, they valued the design elements (e.g., logbook and multiple-choice questions), and the system has “good” usability (SUS score 72.8, SD = 14.0). Nevertheless, the usability and learners’ experience can be improved by tackling technical problems, providing more explicit instructions for subtasks, and modifying certain features. Therefore, future development will concentrate on improving upon these shortcomings, adding additional levels to target a larger audience, and evaluating the improvements’ effects with more participants.},
    URL = {https://doi.org/10.1021/acs.jchemed.1c00894},
    }
  • J. Blattgerste, K. Luksch, C. Lewa, and T. Pfeiffer, “TrainAR: A Scalable Interaction Concept and Didactic Framework for Procedural Trainings Using Handheld Augmented Reality,” Multimodal Technologies and Interaction, vol. 5, iss. 7, 2021. doi:10.3390/mti5070030
    [BibTeX] [Abstract] [Download PDF]
    The potential of Augmented Reality (AR) for educational and training purposes is well known. While large-scale deployments of head-mounted AR headsets remain challenging due to technical limitations and cost factors, advances in mobile devices and tracking solutions introduce handheld AR devices as a powerful, broadly available alternative, yet with some restrictions. One of the current limitations of AR training applications on handheld AR devices is that most offer rather static experiences, only providing descriptive knowledge with little interactivity. Holistic concepts for the coverage of procedural knowledge are largely missing. The contribution of this paper is twofold. We propose a scalable interaction concept for handheld AR devices with an accompanied didactic framework for procedural training tasks called TrainAR. Then, we implement TrainAR for a training scenario in academics for the context of midwifery and explain the educational theories behind our framework and how to apply it for procedural training tasks. We evaluate and subsequently improve the concept based on three formative usability studies (n = 24), where explicitness, redundant feedback mechanisms and onboarding were identified as major success factors. Finally, we conclude by discussing derived implications for improvements and ongoing and future work.
    @Article{mti5070030,
    AUTHOR = {Blattgerste, Jonas and Luksch, Kristina and Lewa, Carmen and Pfeiffer, Thies},
    TITLE = {{TrainAR}: A Scalable Interaction Concept and Didactic Framework for Procedural Trainings Using Handheld Augmented Reality},
    JOURNAL = {Multimodal Technologies and Interaction},
    VOLUME = {5},
    YEAR = {2021},
    NUMBER = {7},
    ARTICLE-NUMBER = {30},
    URL = {https://www.mdpi.com/2414-4088/5/7/30},
    ISSN = {2414-4088},
    ABSTRACT = {The potential of Augmented Reality (AR) for educational and training purposes is well known. While large-scale deployments of head-mounted AR headsets remain challenging due to technical limitations and cost factors, advances in mobile devices and tracking solutions introduce handheld AR devices as a powerful, broadly available alternative, yet with some restrictions. One of the current limitations of AR training applications on handheld AR devices is that most offer rather static experiences, only providing descriptive knowledge with little interactivity. Holistic concepts for the coverage of procedural knowledge are largely missing. The contribution of this paper is twofold. We propose a scalable interaction concept for handheld AR devices with an accompanied didactic framework for procedural training tasks called TrainAR. Then, we implement TrainAR for a training scenario in academics for the context of midwifery and explain the educational theories behind our framework and how to apply it for procedural training tasks. We evaluate and subsequently improve the concept based on three formative usability studies (n = 24), where explicitness, redundant feedback mechanisms and onboarding were identified as major success factors. Finally, we conclude by discussing derived implications for improvements and ongoing and future work.},
    DOI = {10.3390/mti5070030}
    }
  • J. Blattgerste, P. Renner, and T. Pfeiffer, “Authorable Augmented Reality Instructions for Assistance and Training in Work Environments,” in Proceedings of the 18th International Conference on Mobile and Ubiquitous Multimedia, New York, NY, USA, 2019. doi:10.1145/3365610.3365646
    [BibTeX] [Abstract] [Download PDF]
    Augmented Reality (AR) is a promising technology for assistance and training in work environments, as it can provide instructions and feedback contextualised. Not only, but especially impaired workers can benefit from this technology. While previous work mostly focused on using AR to assist or train specific predefined tasks, “general purpose” AR applications, that can be used to intuitively author new tasks at run-time, are widely missing. The contribution of this work is twofold: First we develop an AR authoring tool on the Microsoft HoloLens in combination with a Smartphone as an additional controller following considerations based on related work, guidelines and focus group interviews. Then, we evaluate the usability of the authoring tool itself and the produced AR instructions on a qualitative level in realistic scenarios and gather feedback. As the results reveal a positive reception, we discuss authorable AR as a viable form of AR assistance or training in work environments.
    @inproceedings{blattgerste2019authorable,
    author = {Blattgerste, Jonas and Renner, Patrick and Pfeiffer, Thies},
    title = {Authorable Augmented Reality Instructions for Assistance and Training in Work Environments},
    year = {2019},
    isbn = {9781450376242},
    publisher = {Association for Computing Machinery},
    address = {New York, NY, USA},
    doi = {10.1145/3365610.3365646},
    booktitle = {Proceedings of the 18th International Conference on Mobile and Ubiquitous Multimedia},
    articleno = {34},
    numpages = {11},
    keywords = {training, cognitive impairments, augmented reality, annotation, mixed reality, authoring, assistance},
    location = {Pisa, Italy},
    series = {MUM '19},
    url = {https://mixality.de/wp-content/uploads/2020/07/blattgerste2019authorable.pdf},
    abstract = {Augmented Reality (AR) is a promising technology for assistance and training in work environments, as it can provide instructions and feedback contextualised. Not only, but especially impaired workers can benefit from this technology. While previous work mostly focused on using AR to assist or train specific predefined tasks, "general purpose" AR applications, that can be used to intuitively author new tasks at run-time, are widely missing. The contribution of this work is twofold: First we develop an AR authoring tool on the Microsoft HoloLens in combination with a Smartphone as an additional controller following considerations based on related work, guidelines and focus group interviews. Then, we evaluate the usability of the authoring tool itself and the produced AR instructions on a qualitative level in realistic scenarios and gather feedback. As the results reveal a positive reception, we discuss authorable AR as a viable form of AR assistance or training in work environments.}
    }
  • J. Blattgerste and T. Pfeiffer, “Promptly Authored Augmented Reality Instructions Can Be Sufficient to Enable Cognitively Impaired Workers,” in GI VR / AR Workshop 2020, 2020.
    [BibTeX] [Abstract] [Download PDF]
    The benefits of contextualising information through Augmented Reality (AR) instructions to assist cognitively impaired workers are well known, but most findings are based on AR instructions carefully designed for predefined standard tasks. Previous findings indicate that the modality and quality of provided AR instructions have a significant impact on the provided benefits. The emergence of commercial products providing tools for instructors to promptly author their own AR instructions elicits the question, whether instructions created through those are sufficient to support cognitively impaired workers. This paper explores this question through a qualitative study using an AR authoring tool to create AR instructions for a task that none out of 10 participants was able to complete previously. Using promptly authored instructions, however, most were able to complete the task. Additionally, they reported good usability and gave qualitative feedback indicating they would like to use comparable AR instructions more often.
    @inproceedings{blattgerste2020prompty,
    title={Promptly Authored Augmented Reality Instructions Can Be Sufficient to Enable Cognitively Impaired Workers},
    author={Blattgerste, Jonas and Pfeiffer, Thies},
    booktitle={{GI VR / AR Workshop 2020}},
    year={2020},
    url = {https://mixality.de/wp-content/uploads/2020/07/blattgerste2020prompty.pdf},
    abstract = {The benefits of contextualising information through Augmented Reality (AR) instructions to assist cognitively impaired workers are well known, but most findings are based on AR instructions carefully designed for predefined standard tasks. Previous findings indicate that the modality and quality of provided AR instructions have a significant impact on the provided benefits. The emergence of commercial products providing tools for instructors to promptly author their own AR instructions elicits the question, whether instructions created through those are sufficient to support cognitively impaired workers. This paper explores this question through a qualitative study using an AR authoring tool to create AR instructions for a task that none out of 10 participants was able to complete previously. Using promptly authored instructions, however, most were able to complete the task. Additionally, they reported good usability and gave qualitative feedback indicating they would like to use comparable AR instructions more often.}
    }
  • J. Blattgerste, K. Luksch, C. Lewa, M. Kunzendorf, N. H. Bauer, A. Bernloehr, M. Joswig, T. Schäfer, and T. Pfeiffer, “Project Heb@AR: Exploring handheld Augmented Reality training to supplement academic midwifery education,” in DELFI 2020 – Die 18. Fachtagung Bildungstechnologien der Gesellschaft für Informatik e.V., Bonn, 2020, pp. 103-108.
    [BibTeX] [Abstract] [Download PDF]
    Augmented Reality (AR) promises great potential for training applications as it allows to provide the trainee with instructions and feedback that is contextualized. In recent years, AR reached a state of technical feasibility that not only allows for larger, long term evaluations, but also for explorations of its application to specific training use cases. In the BMBF funded project Heb@AR, the utilization of handheld AR as a supplementary tool for the practical training in academic midwifery education is explored. Specifically, how and where AR can be used most effectively in this context, how acceptability and accessibility for tutors and trainees can be ensured and how well emergency situations can be simulated using the technology. In this paper an overview of the Heb@AR project is provided, the goals of the project are stated and the project’s research questions are discussed from a technical perspective. Furthermore, insights into the current state and the development process of the first AR training prototype are provided: The preparation of a tocolytic injection.
    @inproceedings{blattgerste2020hebar,
    author = {Blattgerste, Jonas and Luksch, Kristina and Lewa, Carmen and Kunzendorf, Martina and Bauer, Nicola H. and Bernloehr, Annette and Joswig, Matthias and Schäfer, Thorsten and Pfeiffer, Thies},
    title = {Project Heb@AR: Exploring handheld Augmented Reality training to supplement academic midwifery education},
    booktitle = {DELFI 2020 – Die 18. Fachtagung Bildungstechnologien der Gesellschaft für Informatik e.V.},
    year = {2020},
    editor = {Zender, Raphael and Ifenthaler, Dirk and Leonhardt, Thiemo and Schumacher, Clara},
    pages = {103-108},
    publisher = {Gesellschaft für Informatik e.V.},
    address = {Bonn},
    url = {https://dl.gi.de/bitstream/handle/20.500.12116/34147/103%20DELFI2020_paper_79.pdf?sequence=1&isAllowed=y},
    abstract = {Augmented Reality (AR) promises great potential for training applications as it allows to provide the trainee with instructions and feedback that is contextualized. In recent years, AR reached a state of technical feasibility that not only allows for larger, long term evaluations, but also for explorations of its application to specific training use cases. In the BMBF funded project Heb@AR, the utilization of handheld AR as a supplementary tool for the practical training in academic midwifery education is explored. Specifically, how and where AR can be used most effectively in this context, how acceptability and accessibility for tutors and trainees can be ensured and how well emergency situations can be simulated using the technology. In this paper an overview of the Heb@AR project is provided, the goals of the project are stated and the project’s research questions are discussed from a technical perspective. Furthermore, insights into the current state and the development process of the first AR training prototype are provided: The preparation of a tocolytic injection.}
    }
  • J. Blattgerste, P. Renner, and T. Pfeiffer, “Augmented Reality Action Assistance and Learning for Cognitively Impaired People. A Systematic Literature Review,” in The 12th PErvasive Technologies Related to Assistive Environments Conference (PETRA ’19), 2019. doi:10.1145/3316782.3316789
    [BibTeX] [Abstract] [Download PDF]
    Augmented reality (AR) is a promising tool for many situations in which assistance is needed, as it allows for instructions and feedback to be contextualized. While research and development in this area have been primarily driven by industry, AR could also have a huge impact on those who need assistance the most: cognitively impaired people of all ages. In recent years some primary research on applying AR for action assistance and learning in the context of this target group has been conducted. However, the research field is sparsely covered and contributions are hard to categorize. An overview of the current state of research is missing. We contribute to filling this gap by providing a systematic literature review covering 52 publications. We describe the often rather technical publications on an abstract level and quantitatively assess their usage purpose, the targeted age group and the type of AR device used. Additionally, we provide insights on the current challenges and chances of AR learning and action assistance for people with cognitive impairments. We discuss trends in the research field, including potential future work for researchers to focus on.
    @inproceedings{2934446,
    abstract = {Augmented reality (AR) is a promising tool for many situations in which assistance is needed, as it allows for instructions and feedback to be contextualized. While research and development in this area have been primarily driven by industry, AR could also have a huge impact on those who need assistance the most: cognitively impaired people of all ages. In recent years some primary research on applying AR for action assistance and learning in the context of this target group has been conducted. However, the research field is sparsely covered and contributions are hard to categorize. An overview of the current state of research is missing. We contribute to filling this gap by providing a systematic literature review covering 52 publications. We describe the often rather technical publications on an abstract level and quantitatively assess their usage purpose, the targeted age group and the type of AR device used. Additionally, we provide insights on the current challenges and chances of AR learning and action assistance for people with cognitive impairments. We discuss trends in the research field, including potential future work for researchers to focus on.},
    author = {Blattgerste, Jonas and Renner, Patrick and Pfeiffer, Thies},
    booktitle = {The 12th PErvasive Technologies Related to Assistive Environments Conference (PETRA ’19)},
    location = {Rhodes, Greece},
    publisher = {ACM},
    title = {{Augmented Reality Action Assistance and Learning for Cognitively Impaired People. A Systematic Literature Review}},
    url = {https://nbn-resolving.org/urn:nbn:de:0070-pub-29344462},
    doi = {10.1145/3316782.3316789},
    year = {2019},
    }
  • J. Blattgerste, P. Renner, and T. Pfeiffer, “Advantages of Eye-Gaze over Head-Gaze-Based Selection in Virtual and Augmented Reality under Varying Field of Views,” in COGAIN ’18. Proceedings of the Symposium on Communication by Gaze Interaction, 2018. doi:10.1145/3206343.3206349
    [BibTeX] [Abstract] [Download PDF]
    The current best practice for hands-free selection using Virtual and Augmented Reality (VR/AR) head-mounted displays is to use head-gaze for aiming and dwell-time or clicking for triggering the selection. There is an observable trend for new VR and AR devices to come with integrated eye-tracking units to improve rendering, to provide means for attention analysis or for social interactions. Eye-gaze has been successfully used for human-computer interaction in other domains, primarily on desktop computers. In VR/AR systems, aiming via eye-gaze could be significantly faster and less exhausting than via head-gaze. To evaluate benefits of eye-gaze-based interaction methods in VR and AR, we compared aiming via head-gaze and aiming via eye-gaze. We show that eye-gaze outperforms head-gaze in terms of speed, task load, required head movement and user preference. We furthermore show that the advantages of eye-gaze further increase with larger FOV sizes.
    @inproceedings{2919602,
    abstract = {The current best practice for hands-free selection using Virtual and Augmented Reality (VR/AR) head-mounted displays is to use head-gaze for aiming and dwell-time or clicking for triggering the selection. There is an observable trend for new VR and AR devices to come with integrated eye-tracking units to improve rendering, to provide means for attention analysis or for social interactions. Eye-gaze has been successfully used for human-computer interaction in other domains, primarily on desktop computers. In VR/AR systems, aiming via eye-gaze could be significantly faster and less exhausting than via head-gaze. To evaluate benefits of eye-gaze-based interaction methods in VR and AR, we compared aiming via head-gaze and aiming via eye-gaze. We show that eye-gaze outperforms head-gaze in terms of speed, task load, required head movement and user preference. We furthermore show that the advantages of eye-gaze further increase with larger FOV sizes.},
    author = {Blattgerste, Jonas and Renner, Patrick and Pfeiffer, Thies},
    booktitle = {COGAIN '18. Proceedings of the Symposium on Communication by Gaze Interaction},
    isbn = {978-1-4503-5790-6},
    keywords = {Augmented Reality, Virtual Reality, Assistance Systems, Head-Mounted Displays, Eye-Tracking, Field of View, Human Computer Interaction},
    location = {Warsaw, Poland},
    publisher = {ACM},
    title = {{Advantages of Eye-Gaze over Head-Gaze-Based Selection in Virtual and Augmented Reality under Varying Field of Views}},
    url = {https://nbn-resolving.org/urn:nbn:de:0070-pub-29196024},
    doi = {10.1145/3206343.3206349},
    year = {2018},
    }
  • P. Renner, J. Blattgerste, and T. Pfeiffer, “A Path-based Attention Guiding Technique for Assembly Environments with Target Occlusions,” in IEEE Virtual Reality 2018, 2018.
    [BibTeX] [Download PDF]
    @inproceedings{2917385,
    author = {Renner, Patrick and Blattgerste, Jonas and Pfeiffer, Thies},
    booktitle = {IEEE Virtual Reality 2018},
    location = {Reutlingen},
    publisher = {IEEE},
    title = {{A Path-based Attention Guiding Technique for Assembly Environments with Target Occlusions}},
    url = {https://nbn-resolving.org/urn:nbn:de:0070-pub-29173854},
    year = {2018},
    }
  • J. Blattgerste, P. Renner, B. Strenge, and T. Pfeiffer, “In-Situ Instructions Exceed Side-by-Side Instructions in Augmented Reality Assisted Assembly,” in Proceedings of the 11th ACM International Conference on PErvasive Technologies Related to Assistive Environments (PETRA’18), 2018, pp. 133-140. doi:10.1145/3197768.3197778
    [BibTeX] [Abstract] [Download PDF]
    Driven by endeavors towards Industry 4.0, there is increasing interest in augmented reality (AR) as an approach for assistance in areas like picking, assembly and maintenance. In this work our focus is on AR-based assistance in manual assembly. The design space for AR instructions in this context includes, e.g., side-by-side, 3D or projected 2D presentations. In previous research, the low quality of the AR devices available at the respective time had a significant impact on performance evaluations. Today, a proper and up-to-date comparison of different presentation approaches is missing. This paper presents an improved 3D in-situ instruction and compares it to previously presented techniques. All instructions are implemented on up-to-date AR hardware, namely the Microsoft HoloLens. To support reproducible research, the comparison is made using a standardized benchmark scenario. The results show, contrary to previous research, that in-situ instructions on state-of-the-art AR glasses outperform side-by-side instructions in terms of errors made, task completion time, and perceived task load.
    @inproceedings{2919601,
    abstract = {Driven by endeavors towards Industry 4.0, there is increasing interest in augmented reality (AR) as an approach for assistance in areas like picking, assembly and maintenance. In this work our focus is on AR-based assistance in manual assembly. The design space for AR instructions in this context includes, e.g., side-by-side, 3D or projected 2D presentations. In previous research, the low quality of the AR devices available at the respective time had a significant impact on performance evaluations. Today, a proper and up-to-date comparison of different presentation approaches is missing. This paper presents an improved 3D in-situ instruction and compares it to previously presented techniques. All instructions are implemented on up-to-date AR hardware, namely the Microsoft HoloLens. To support reproducible research, the comparison is made using a standardized benchmark scenario. The results show, contrary to previous research, that in-situ instructions on state-of-the-art AR glasses outperform side-by-side instructions in terms of errors made, task completion time, and perceived task load.},
    author = {Blattgerste, Jonas and Renner, Patrick and Strenge, Benjamin and Pfeiffer, Thies},
    booktitle = {Proceedings of the 11th ACM International Conference on PErvasive Technologies Related to Assistive Environments (PETRA'18)},
    isbn = {978-1-4503-6390-7},
    keywords = {Augmented Reality, Assistance Systems, Head-Mounted Displays, Smart Glasses, Benchmarking},
    location = {Corfu, Greece},
    pages = {133--140},
    publisher = {ACM},
    title = {{In-Situ Instructions Exceed Side-by-Side Instructions in Augmented Reality Assisted Assembly}},
    url = {https://nbn-resolving.org/urn:nbn:de:0070-pub-29196019},
    doi = {10.1145/3197768.3197778},
    year = {2018},
    }
  • J. Blattgerste, B. Strenge, P. Renner, T. Pfeiffer, and K. Essig, “Comparing Conventional and Augmented Reality Instructions for Manual Assembly Tasks,” in Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments, 2017, pp. 75-82. doi:10.1145/3056540.3056547
    [BibTeX] [Abstract] [Download PDF]
    Augmented Reality (AR) gains increased attention as a means to provide assistance for different human activities. Hereby the suitability of AR does not only depend on the respective task, but also to a high degree on the respective device. In a standardized assembly task, we tested AR-based in-situ assistance against conventional pictorial instructions using a smartphone, Microsoft HoloLens and Epson Moverio BT-200 smart glasses as well as paper-based instructions. Participants solved the task fastest using the paper instructions, but made less errors with AR assistance on the Microsoft HoloLens smart glasses than with any other system. Methodically we propose operational definitions of time segments and other optimizations for standardized benchmarking of AR assembly instructions.
    @inproceedings{2909322,
    abstract = {Augmented Reality (AR) gains increased attention as a means to provide assistance for different human activities. Hereby the suitability of AR does not only depend on the respective task, but also to a high degree on the respective device. In a standardized assembly task, we tested AR-based in-situ assistance against conventional pictorial instructions using a smartphone, Microsoft HoloLens and Epson Moverio BT-200 smart glasses as well as paper-based instructions. Participants solved the task fastest using the paper instructions, but made less errors with AR assistance on the Microsoft HoloLens smart glasses than with any other system. Methodically we propose operational definitions of time segments and other optimizations for standardized benchmarking of AR assembly instructions.},
    author = {Blattgerste, Jonas and Strenge, Benjamin and Renner, Patrick and Pfeiffer, Thies and Essig, Kai},
    booktitle = {Proceedings of the 10th International Conference on PErvasive Technologies Related to Assistive Environments},
    isbn = {978-1-4503-5227-7},
    keywords = {Assistance Systems, Head-Mounted Displays, Smartglasses, Benchmarking, CLF_RESEARCH_HIGHLIGHT},
    location = {Island of Rhodes, Greece},
    pages = {75--82},
    publisher = {ACM},
    title = {{Comparing Conventional and Augmented Reality Instructions for Manual Assembly Tasks}},
    url = {https://nbn-resolving.org/urn:nbn:de:0070-pub-29093227},
    doi = {10.1145/3056540.3056547},
    year = {2017},
    }

Education

  1. 2018 - 2019

    Intelligent Systems (Master of Science)

    Bielefeld University
  2. 2013 - 2018

    Cognitive Informatics (Bachelor of Science)

    Bielefeld University

Experience

  1. since 2019
    Research Associate
    University of Applied Sciences Emden/Leer
  2. since 2019
    Scientific Consultant
    Raumtänzer GmbH
  3. 2019
    Software Developer
    Raumtänzer GmbH
  4. 2017-2019
    Research Assistant
    Bielefeld University