DoggyVision is a mobile application that applies real-time image processing to a video feed to show users an approximation of how dogs (and cats) perceive color. We released the beta version of DoggyVision last year, and we have been redesigning the app so it can stream video from an external camera placed on a dog collar and offer color-perception filters for additional animals.
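The core of such a filter is a per-pixel color transform. The sketch below is illustrative only: the 3x3 matrix is a commonly cited approximation of dichromatic (red-green limited) vision, similar to a dog's two-cone color perception, and the function name `simulate_dichromat` is hypothetical; DoggyVision's actual coefficients and implementation are not published here.

```python
# Illustrative dichromat color filter of the kind DoggyVision applies per frame.
# The matrix values are a common approximation of two-cone (dichromatic) vision,
# not DoggyVision's actual published coefficients.

DICHROMAT_MATRIX = [
    (0.625, 0.375, 0.0),    # output red:   mix of input red and green
    (0.700, 0.300, 0.0),    # output green: mix of input red and green
    (0.000, 0.300, 0.700),  # output blue:  mix of input green and blue
]

def simulate_dichromat(pixel):
    """Map one (r, g, b) pixel into an approximate dog-vision palette."""
    r, g, b = pixel
    out = []
    for row in DICHROMAT_MATRIX:
        value = row[0] * r + row[1] * g + row[2] * b
        out.append(min(255, max(0, int(value))))
    return tuple(out)
```

In a real-time mobile app this transform would run on the GPU over every pixel of each video frame rather than in a per-pixel loop; because each matrix row sums to 1, grays pass through unchanged while reds and greens collapse toward a shared yellowish hue.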
We conducted a user study to investigate how people use DoggyVision, both in terms of its impact on children's science education and the experience of using the interface itself. In this study, families participated in a scavenger hunt in which they collected photographs using DoggyVision and wrote notes describing each one. We then led a group discussion based on the photographs and post-it note commentary that participants generated.
Through the user study we identified key areas for improvement, including bug fixes to address lag and crashes, usability improvements to the placement of UI elements, and feature requests such as streaming from a remote camera and providing color filters beyond dogs and cats.
I am currently leading the redesign of the DoggyVision UI, combining user-flow mockups with a proof-of-concept prototype of an iOS app.
Kelly, A., Chang, C., Hill, C., West, M., Yoder, M., Polman, J. L., Kane, S. K., Eisenberg, M., & Shapiro, R. B. (in press). “Our dog probably thinks Christmas is really boring” — Re-mediating science education for feminist-oriented inquiry. To appear in Proceedings of ICLS 2020. Nashville, TN.