I am currently leading the design and research of AnimalViz, a mobile application that simulates animals' vision in real time on the phone's video feed.
This project is supported by a research grant from the National Science Foundation (award #1736051).
Many animals' senses are difficult for humans to comprehend and empathize with. Science education lacks tools that enable students to adopt the perspective of another species, leading many students to form misconceptions about animal biology.
Goal: Support students' ability to engage in "perspective-taking" of other animals' visual senses, giving them a more authentic and empathetic understanding of animal biology.
Stakeholder: The National Science Foundation funds a larger project I am part of, whose goal is to develop learning experiences that connect people and animals to enhance K-12 STEM education.
Users: AnimalViz is for K-12 students in both informal and formal education settings, as well as educators and guardians to use with youth.
Our 3-person research team led a focus group to investigate what curiosities families have about their pets' senses and experiences.
My role:
We found that families were particularly interested in how their pets see and hear the world, and participants expressed a desire to use technology to view the world from their pet's perspective in order to understand how their pet experiences it.
The goal was to engage students in perspective-taking of other species' senses. Given the challenges of replicating senses like smell, touch, and hearing, I chose to focus on vision, since it could be simulated without the need for expensive hardware. I designed two mobile augmented reality applications, DoggyVision and KittyVision, which apply filters to the phone's camera feed in real time.
In the US, 38.4% of households own a dog, and 25.4% own a cat. Our goal was to integrate science education with empathy, so I focused on animals that people already have loving relationships with, and whose visual senses could be simulated most easily.
Many people believe that dogs and cats only see in black and white. This is false: both species perceive some color, though less than humans do.
At no cost, I developed DoggyVision and KittyVision Snapchat lenses to prototype these experiences.
These prototypes simulate three aspects of dog and cat vision: color perception, brightness discrimination, and visual acuity.
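The three filters can be sketched as straightforward image operations. The snippet below is a minimal illustrative Python/NumPy version, not the actual Snapchat lens code: the color matrix is a rough approximation of dichromatic (blue-yellow) vision, the brightness gain is a simple stand-in for better low-light sensitivity, and a box blur stands in for reduced acuity. The function name and all constants are my own illustrative choices.

```python
import numpy as np

# Rough dichromacy approximation: collapse the red/green distinction
# while preserving blues. Values are illustrative, not colorimetric.
DICHROMAT_MATRIX = np.array([
    [0.625, 0.375, 0.0],
    [0.700, 0.300, 0.0],
    [0.000, 0.300, 0.7],
])

def box_blur(img, radius):
    """Separable box blur on an HxWx3 image (stand-in for lower acuity)."""
    k = 2 * radius + 1
    kernel = np.ones(k) / k
    # Horizontal pass, then vertical pass, with edge padding.
    padded = np.pad(img, ((0, 0), (radius, radius), (0, 0)), mode="edge")
    img = np.apply_along_axis(
        lambda m: np.convolve(m, kernel, mode="valid"), 1, padded)
    padded = np.pad(img, ((radius, radius), (0, 0), (0, 0)), mode="edge")
    return np.apply_along_axis(
        lambda m: np.convolve(m, kernel, mode="valid"), 0, padded)

def simulate_dog_vision(frame, brightness_gain=1.2, blur_radius=2):
    """frame: HxWx3 float array in [0, 1]. Returns the filtered frame."""
    out = frame @ DICHROMAT_MATRIX.T          # 1) dichromatic color
    out = np.clip(out * brightness_gain, 0, 1)  # 2) brightness boost
    return box_blur(out, blur_radius)           # 3) reduced acuity
```

In a real-time pipeline the same per-frame transform would run on the GPU (as a lens shader) rather than in NumPy, but the math is the same per pixel.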
Along with my collaborator Christine Chang, I designed a study to explore how DoggyVision and KittyVision mediated families' scientific inquiry and discovery around animal senses.
My role:
We found DoggyVision to be an effective means of engaging families in thinking scientifically about the differences between their own senses and their pets' senses (learn more in my published paper).
The families were curious about other species of pets (and non-pets) such as fish, cats, birds, reptiles, and more. However, families also said they wished they could see the world more truly from their pet's point of view, including their daily routines. In addition, families wanted to collect more data about their pets so they could understand how multiple environmental stimuli play a role in their pets' behaviors.
Given participants' interest in many animals, I am currently working on designs for AnimalViz that offer more species' senses to experience.
AnimalViz will also have the option to connect to a wearable collar camera over Bluetooth, which will stream video from the pet's perspective (a feature several participants said they felt would improve DoggyVision and KittyVision).
“Our Dog Probably Thinks Christmas is Really Boring”: Re-mediating Science Education for Feminist-Inspired Inquiry
Annie Kelly, Christine Chang, Christian Hill, Mary West, Mary Yoder, Joseph Polman, Shaun Kane, Michael Eisenberg, & R. Benjamin Shapiro
In Proceedings of the International Conference of the Learning Sciences (ICLS) 2020
A blog post I wrote on how to install the Snapchat DoggyVision and KittyVision lenses.
Description of the grant funded project on the ATLAS Institute website.