I am currently leading the design and research on a grant called Cyberpets (funded by National Science Foundation award #1736051), which has the goal of enhancing youth engagement with STEM through technologies and curricula that encourage scientific investigations with their pets.

PetSense is a product and learning experience aimed at supporting this goal: a mobile application that simulates animals' vision and analyzes audio and other sensor data in real time.

Various frames of the PetSense iOS app

Many animals' senses are difficult for humans to comprehend and empathize with. Science education lacks tools that enable students to adopt the perspective of another species, leading many students to form misconceptions about animal biology.


Support students' ability to do "perspective-taking" of other animals' visual senses, in order to give them a more authentic and empathetic understanding of animal biology.

Who is PetSense for?

Stakeholder: The National Science Foundation funds the larger project I am a part of, whose goal is to develop learning experiences that connect people and animals to enhance K-12 STEM education.

Users: PetSense is for K-12 students in both informal and formal education settings, as well as educators and guardians to use with youth.

User Research

Our 3-person research team led a focus group to investigate what curiosities families have about their pets' senses and experiences.

My role:

  • Helped moderate a group discussion using a mix of pre-scripted prompts and improvisation
  • Facilitated a post-it activity for families to write down responses and questions to prompts I wrote on the board
  • Managed the collection of audio and video recording data
  • Took detailed field notes

We found that families were particularly interested in how their pets see and hear the world. Participants expressed a desire to use technology to view the world from their pet's perspective in order to understand how their pets see and experience it.

A moment during the focus group discussion.

The goal was to engage students in perspective-taking of other species' senses. Given the challenges of replicating senses like smell, touch, and hearing, I chose to focus on vision, since it could be simulated without expensive hardware. I designed two mobile augmented reality applications, DoggyVision and KittyVision, which apply filters to the phone's camera feed in real time.

In the US, 38.4% of households own a dog, and 25.4% own a cat. Our goal was to integrate science education with empathy, so I focused on animals that people already have loving relationships with, and whose visual senses could be simulated most easily.

Many people believe that dogs and cats only see in black and white. This is false.

A puppy and his toys with average human color perception.
Dog color perception.

At no cost, I developed DoggyVision and KittyVision Snapchat lenses to prototype these experiences.

These prototypes simulate three aspects of dog and cat vision: color perception, brightness discrimination, and visual acuity.
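The color-perception part of a filter like this can be sketched in a few lines. Below is a rough, illustrative version in Python/NumPy, not the actual lens code: the matrix values are placeholder assumptions approximating dichromatic (blue-yellow) color vision, and the `brightness` parameter is a stand-in for brightness discrimination (reduced acuity would be a separate blur pass over the frame).

```python
import numpy as np

# Illustrative linear transform approximating dichromatic (blue-yellow)
# color perception. These coefficients are placeholder assumptions,
# not the values used in the actual DoggyVision/KittyVision lenses.
DOG_COLOR_MATRIX = np.array([
    [0.625, 0.375, 0.0],    # red output: reds/greens collapse toward yellow
    [0.700, 0.300, 0.0],    # green output: likewise
    [0.0,   0.300, 0.700],  # blue output: blues largely preserved
])

def simulate_dog_vision(frame, brightness=1.0):
    """Apply the dichromatic color transform and a brightness scale to an
    RGB frame given as a float array of shape (H, W, 3) with values in [0, 1].
    A real-time app would run this (plus a blur for reduced acuity) on each
    camera frame."""
    out = frame @ DOG_COLOR_MATRIX.T   # per-pixel 3x3 color transform
    return np.clip(out * brightness, 0.0, 1.0)
```

For example, a pure-red pixel `[1, 0, 0]` maps to `[0.625, 0.7, 0.0]`, a desaturated yellowish tone rather than red, which matches the intuition that dogs do see color, just a narrower range than humans.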

If you have Snapchat, you can follow the instructions here to get DoggyVision and KittyVision on your phone!

Along with my collaborator Christine Chang, I designed a study to explore how DoggyVision and KittyVision mediated families' scientific inquiry and discovery around animal senses.

My role:

  • I designed and facilitated a 3-hour workshop with 5 families (8 adults, 9 kids)
  • I led a group discussion about what questions families had about their dogs' visual senses
  • I designed a pet vision scavenger hunt where families took photographs of different scenes indoors and outdoors filtered and unfiltered
  • Afterwards, I facilitated a group sticky-note discussion about what they observed and other questions they had about their pets (or other animals)
Families' post-its from our group discussion.

We found DoggyVision to be an effective means of engaging families in thinking scientifically about the differences between their own senses and their pets' (learn more in my published paper).

The families were curious about other species of pets (and non-pets) such as fish, cats, birds, reptiles, and more. However, families also said they wished they could see the world more authentically from their pet's perspective (their literal point of view and their daily routines). In addition, families wanted to collect more data about their pets so they could understand how multiple environmental stimuli play a role in their pets' behaviors.

A mother and daughter exploring the outdoors with DoggyVision while taking notes.
Design revisions and next steps

Given participants' interest in many animals, I am currently working on designs for PetSense to offer more species' senses to experience.

PetSense will also have the option to connect to a wearable collar camera over Bluetooth, which will stream video from the pet's perspective (a feature several participants said they felt would improve DoggyVision and KittyVision).

In addition, PetSense will simplify the process of saving images and videos. We also received feedback from teachers that a data collection and visualization tool in PetSense would be valuable for classroom work, and I am incorporating this into the designs as well.

More PetSense mockups and component level wireflows.

“Our Dog Probably Thinks Christmas is Really Boring”: Re-mediating Science Education for Feminist-Inspired Inquiry
Annie Kelly, Christine Chang, Christian Hill, Mary West, Mary Yoder, Joseph Polman, Shaun Kane, Michael Eisenberg, & R. Benjamin Shapiro
In Proceedings of the 2020 International Conference of the Learning Sciences (ICLS)

Other Media

A blog post I wrote on how to install the DoggyVision and KittyVision Snapchat lenses.

Description of the grant funded project on the ATLAS Institute website.

Daily Camera article about the project.