Current Research Projects

 

Social Drones

Autonomous drones are becoming ubiquitous. Already, drones are in use for research and applications in construction and maintenance, military and emergency operations, space exploration, logistics, accessibility, smart homes, sports and exercise, and human-computer interfaces.

In the majority of current drone applications, a pilot uses a drone to remotely inspect or manipulate subjects in an environment that is not easily accessible for humans. In contrast, an emerging class of applications involves fully autonomous drones operating in spaces populated by human users or bystanders. We have coined the term “social drones” to describe these. Our current work aims to create knowledge that informs the design of social drones and related systems via constructive design research.

This project has received funding from the Marianne and Marcus Wallenberg Foundation’s initiative for research on “Consequences of AI and Autonomous Systems,” as part of the WASP-HS research program.

Selected Publications

Sara Ljungblad, Yemao Man, Mehmet Aydın Baytaş, Mafalda Gamboa, Mohammad Obaid, and Morten Fjeld. 2021. What Matters in Professional Drone Pilots’ Practice? An Interview Study to Understand the Complexity of Their Work and Inform Human-Drone Interaction Research. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery, New York, NY, USA, Article 159, 1–16. DOI: https://doi.org/10.1145/3411764.3445737

Joseph La Delfa, Mehmet Aydın Baytaş, Emma Luke, Ben Koder, and Florian 'Floyd' Mueller. 2020. Designing Drone Chi: Unpacking the Thinking and Making of Somaesthetic Human-Drone Interaction. In Proceedings of the 2020 ACM Designing Interactive Systems Conference (DIS '20). Association for Computing Machinery, New York, NY, USA, 575–586. DOI: https://doi.org/10.1145/3357236.3395589

Joseph La Delfa, Mehmet Aydın Baytaş, Rakesh Patibanda, Hazel Ngari, Rohit Ashok Khot, and Florian “Floyd” Mueller. 2020. Drone Chi: Somaesthetic Human-Drone Interaction. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI ’20) [Honorable Mention Award]. DOI: https://doi.org/10.1145/3313831.3376786

Mehmet Aydın Baytaş, Markus Funk, Sara Ljungblad, Jérémie Garcia, Joseph La Delfa, and Florian “Floyd” Mueller. 2020. IHDI 2020: Interdisciplinary Workshop on Human-Drone Interaction. In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems (CHI EA ’20). DOI: https://doi.org/10.1145/3334480.3375149

Mehmet Aydın Baytaş, Sara Ljungblad, Joseph La Delfa, and Morten Fjeld. 2020. Reconsidering “Social Drones” via Agent Archetypes: Social Robots or Objects with Intent? In Proceedings of the Interdisciplinary Workshop on Human-Drone Interaction (iHDI 2020). URL: http://ceur-ws.org/Vol-2617/paper4.pdf

Mehmet Aydın Baytaş, Markus Funk, Sara Ljungblad, Jérémie Garcia, Joseph La Delfa, and Florian “Floyd” Mueller (Eds.). 2020. Proceedings of the Interdisciplinary Workshop on Human-Drone Interaction. URL: http://ceur-ws.org/Vol-2617/

Mehmet Aydın Baytaş, Damla Çay, Yuchong Zhang, Mohammad Obaid, Asım Evren Yantaç, and Morten Fjeld. 2019. The Design of Social Drones: A Review of Studies on Autonomous Flyers in Inhabited Environments. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19). DOI: https://doi.org/10.1145/3290605.3300480

Kari Daniel Karjalainen, Anna Elisabeth Sofia Romell, Photchara Ratsamee, Asim Evren Yantac, Morten Fjeld, and Mohammad Obaid. 2017. Social Drone Companion for the Home Environment: a User-Centric Exploration. In Proceedings of the 5th International Conference on Human Agent Interaction (HAI '17). DOI: https://doi.org/10.1145/3125739.3125774

Alexander Yeh, Photchara Ratsamee, Kiyoshi Kiyokawa, Yuki Uranishi, Tomohiro Mashita, Haruo Takemura, Morten Fjeld, and Mohammad Obaid. 2017. Exploring Proxemics for Human-Drone Interaction. In Proceedings of the 5th International Conference on Human Agent Interaction (HAI '17). DOI: https://doi.org/10.1145/3125739.3125773

Mohammad Obaid, Felix Kistler, Gabrielė Kasparavičiūtė, Asim Evren Yantaç, and Morten Fjeld. 2016. How would you gesture navigate a drone?: a user-centered approach to control a drone. In Proceedings of the 20th International Academic Mindtrek Conference (AcademicMindtrek '16). DOI: https://doi.org/10.1145/2994310.2994348

Mohammad Obaid, Omar Mubin, Christina Anne Basedow, A. Ayça Ünlüer, Matz Johansson Bergström, and Morten Fjeld. 2015. A Drone Agent to Support a Clean Environment. In Proceedings of the 3rd International Conference on Human-Agent Interaction (HAI '15). DOI: https://doi.org/10.1145/2814940.2814947

Links

Project Page on WASP-HS Project Website

 

Design Challenge: Privacy Controls for IoT Systems

The Internet of Things (IoT), built on single-purpose internet-connected devices, is becoming increasingly pervasive. It is being adopted by private households and the hospitality industry, providing services such as security, monitoring, and voice assistance via the “cloud”. To develop effective privacy controls for this technology, users must be kept in the loop when managing the flows of data they generate. Users’ privacy preferences, and the contextual factors that shape them, therefore need to be understood and incorporated into the design of such controls; this is the topic of this work.

IoT users still largely expect vendors, regulators, and policymakers to protect them from the privacy and security threats that in-home IoT devices entail. Beyond such protections, usable control interfaces are needed to empower IoT users to make informed decisions about the privacy of their household-generated data.
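
To make the idea of contextual privacy controls concrete, below is a minimal, purely hypothetical sketch in Python (not code or a design from this project; the rule format and all names are invented for illustration) of how a decision about a household data flow could depend on the data type, the recipient, and the current context:

    # Purely hypothetical sketch: a contextual privacy rule for a smart-home
    # device. It illustrates only the idea that a data-flow decision can depend
    # on data type, recipient, and context; it is not this project's design.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class FlowRequest:
        data_type: str   # e.g., "audio", "motion", "energy_use"
        recipient: str   # e.g., "vendor_cloud", "local_hub", "third_party"
        context: str     # e.g., "guests_present", "home_alone", "away"

    def allow_flow(request: FlowRequest, preferences: dict) -> bool:
        """Permit a data flow only if the user's stated preferences allow it.

        `preferences` maps (data_type, recipient, context) triples to booleans;
        flows with no explicit preference are denied by default.
        """
        key = (request.data_type, request.recipient, request.context)
        return preferences.get(key, False)

    # Example: audio may go to the vendor cloud when the user is home alone,
    # but not while guests are present.
    prefs = {
        ("audio", "vendor_cloud", "home_alone"): True,
        ("audio", "vendor_cloud", "guests_present"): False,
    }
    print(allow_flow(FlowRequest("audio", "vendor_cloud", "guests_present"), prefs))  # False

A deny-by-default rule of this kind reflects the goal of keeping users in the loop: any flow the user has not explicitly decided on surfaces as a decision to be made, rather than happening silently.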

This project has received funding from the Knut and Alice Wallenberg Foundation’s Wallenberg Autonomous Systems and Software Program (WASP).

 

Human-Computer Interaction for Industrial Tomography

Process tomography images and data arrive at high sampling rates and encode complex information; as a result, they are often difficult for humans to apprehend directly. At the same time, the human senses and mind will not be banished from production: operators’ expert knowledge remains integrated in the process to guarantee safety, knowledge-based control and reasoning, and intervention in critical states. Integrating this non-formal knowledge and presenting complex image and tomography data to human perception are open challenges. This project deals with the question of how to present tomographic process data in time and space to human operators, and how to optimize the human-machine interface for such data.
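
As a purely illustrative sketch of the presentation problem (not code from this project; it assumes NumPy and Matplotlib, and the synthetic frame merely stands in for real tomography data), the same reconstructed frame can read very differently depending on the colormap chosen for the operator's task:

    # Illustrative sketch only: renders one synthetic "tomography frame" with
    # two colormaps suited to different operator tasks. All data here is fake.
    import numpy as np
    import matplotlib.pyplot as plt

    # Synthetic stand-in for a reconstructed frame (e.g., normalized moisture
    # or permittivity values on a 64x64 grid).
    rng = np.random.default_rng(seed=0)
    frame = rng.normal(loc=0.5, scale=0.15, size=(64, 64)).clip(0.0, 1.0)

    # A perceptually uniform map (viridis) suits reading absolute values;
    # a diverging map (coolwarm) suits spotting deviations from a midpoint.
    fig, axes = plt.subplots(1, 2, figsize=(8, 4))
    for ax, cmap, task in zip(axes, ["viridis", "coolwarm"],
                              ["value reading", "deviation spotting"]):
        im = ax.imshow(frame, cmap=cmap, vmin=0.0, vmax=1.0)
        ax.set_title(task)
        fig.colorbar(im, ax=ax, shrink=0.8)
    plt.show()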

This project has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie grant agreement No 764902 (project TOMOCON).

Selected Publications

Yuchong Zhang, Yong Ma, Adel Omrani, Rahul Yadav, Morten Fjeld, and Marco Fratarcangeli. 2019. Automatic Image Segmentation for Microwave Tomography (MWT): From Implementation to Comparative Evaluation. In Proceedings of the 12th International Symposium on Visual Information Communication and Interaction (VINCI’2019). Association for Computing Machinery, New York, NY, USA, Article 26, 1–2. DOI: https://doi.org/10.1145/3356422.3356437

Yuchong Zhang and Morten Fjeld. 2020. Condition Monitoring for Confined Industrial Process Based on Infrared Images by Using Deep Neural Network and Variants. In Proceedings of the 2020 2nd International Conference on Image, Video and Signal Processing (IVSP ’20), 99–106. DOI: https://doi.org/10.1145/3388818.3388823

Yuchong Zhang, Morten Fjeld, Alan Said, and Marco Fratarcangeli. 2020. Task-based Colormap Design Supporting Visual Comprehension in Process Tomography. In Proceedings of EuroVis 2020. [link]

Yuchong Zhang, Yong Ma, Adel Omrani, Rahul Yadav, Morten Fjeld, and Marco Fratarcangeli. 2020. Automated Microwave Tomography (MWT) Image Segmentation: State-of-the-Art Implementation and Evaluation. In Proceedings of International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision 2020 (WSCG'2020). [link]

Yuchong Zhang and Morten Fjeld. 2021. “I Am Told to Be Happy”: An Exploration of Deep Learning in Affective Colormaps in Industrial Tomography. In Proceedings of the 2021 2nd International Conference on Artificial Intelligence and Information Systems (ICAIIS ’21). DOI: https://doi.org/10.1145/3469213.3469220

Yuchong Zhang, Morten Fjeld, Marco Fratarcangeli, Alan Said, and Shengdong Zhao. 2021. Affective Colormap Design for Accurate Visual Comprehension in Industrial Tomography. Sensors 21, 14: 4766. DOI: https://doi.org/10.3390/s21144766

Yuchong Zhang, Rahul Yadav, Adel Omrani, and Morten Fjeld. 2021. A Novel Augmented Reality System to Support Volumetric Visualization in Industrial Process Tomography. In Proceedings of the International Conference on Intelligent Human Computer Interaction (IHCI 2021) [Best Paper Award]. URL: https://www.ihci-conf.org/wp-content/uploads/2021/07/01_202105L001_Zhang.pdf

Yuchong Zhang, Adam Nowak, Andrzej Romanowski, and Morten Fjeld. 2021. Augmented Reality with Industrial Process Tomography: To Support Complex Data Analysis in 3D Space. In Adjunct Proceedings of the 2021 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2021 ACM International Symposium on Wearable Computers. DOI: https://doi.org/10.1145/3460418.3479288

Yuchong Zhang, Adel Omrani, Rahul Yadav, and Morten Fjeld. 2021. Supporting Visualization Analysis in Industrial Process Tomography by Using Augmented Reality—A Case Study of an Industrial Microwave Drying System. Sensors 21, 19: 6515. DOI: https://doi.org/10.3390/s21196515

Yuchong Zhang, Adam Nowak, Guruprasad Rao, Andrzej Romanowski, and Morten Fjeld. 2021. Is Industrial Tomography Ready for Augmented Reality? A Need-finding Study of How Augmented Reality Can Be Adopted by Industrial Tomography Experts. In Proceedings of the 2021 11th IEEE International Conference on Virtual Reality and Visualization (ICVRV). In press.

Links

TOMOCON Project Website

 

Augmented Reality Applications for Marine Operations

For maritime safety, it is crucial to have high-quality collaboration between crew members, as well as with crews onshore or on other units (e.g., other vessels, ports, and wind farms). One challenge to collaboration is that each crew member has a different perspective and different information available (e.g., what they observe directly, observe through instrumentation, or draw from memory or mental models). A relatively new technology, augmented reality, may have the potential to improve maritime collaboration. This project aims to identify how augmented reality can facilitate collaboration in marine operations. One literature review and two experimental studies will investigate how different augmented reality applications can support situation awareness, decision-making, and communication among crew members. Findings will inform the development and research of augmented reality applications for collaboration in safety-critical operations and help improve the integration and usability of user interfaces.

This project is funded by the University of Bergen.


Selected Past Projects

 

Supporting Remote Collaboration with Low-Cost Mobile Device Augmentations

Remote collaboration has been increasingly adopted as a common work practice in an era of distributed workforces. It allows a person to join a meeting with their team without being physically in the same place. An expert can help a novice fix a technical issue without having to travel to where the issue is occurring. A student can attend a class from home when necessary. Remote collaboration gives workers more flexibility in how and where they perform their jobs, and it can reduce unnecessary travel costs. Furthermore, in situations where physical social interaction must be restricted, remote collaboration becomes critical to maintaining normal business activities. To enable effective remote collaboration, supporting systems must richly transfer communication cues so that collaborators remain adequately aware of what is going on with their peers and with the collaborative tasks.

While several projects in this space rely on cutting-edge immersive technologies such as AR/VR or high-end motion-capture systems, most people still work with commodity devices such as laptops and tablets. The form factors of these devices impose limitations that reduce the effectiveness of communication in remote collaboration. To mitigate this, our lab carried out a series of research projects on designing interfaces that effectively support remote work on such devices. Our designs either rely solely on off-the-shelf gadgets or require minimal hardware instrumentation at very low cost, helping collaborators convey richer communication cues to each other.


Selected Publications

Khanh-Duy Le, Ignacio Avellino, Cédric Fleury, Morten Fjeld, and Andreas Kunz. 2019. GazeLens: Guiding Attention to Improve Gaze Interpretation in Hub-Satellite Collaboration. In Human-Computer Interaction – INTERACT 2019, Lecture Notes in Computer Science, vol. 11747. Springer, Cham. DOI: https://doi.org/10.1007/978-3-030-29384-0_18

Khanh-Duy Le, Paweł W. Woźniak, Ali Alavi, Morten Fjeld, and Andreas Kunz. 2019. DigiMetaplan: supporting facilitated brainstorming for distributed business teams. In Proceedings of the 18th International Conference on Mobile and Ubiquitous Multimedia (MUM ’19). Association for Computing Machinery, New York, NY, USA, Article 36, 1–12. DOI: https://doi.org/10.1145/3365610.3365637

Khanh-Duy Le, Morten Fjeld, Ali Alavi, and Andreas Kunz. 2017. Immersive environment for distributed creative collaboration. In Proceedings of the 23rd ACM Symposium on Virtual Reality Software and Technology (VRST ’17). Association for Computing Machinery, New York, NY, USA, Article 16, 1–4. DOI: https://doi.org/10.1145/3139131.3139163

Khanh-Duy Le, Kening Zhu, and Morten Fjeld. 2017. Mirrortablet: exploring a low-cost mobile system for capturing unmediated hand gestures in remote collaboration. In Proceedings of the 16th International Conference on Mobile and Ubiquitous Multimedia (MUM ’17). Association for Computing Machinery, New York, NY, USA, 79–89. DOI: https://doi.org/10.1145/3152832.3152838

Links

Portfolio of Khanh-Duy Le