Hello there, I'm a PhD student in the ATLAS Institute at the University of Colorado Boulder,
where I also earned my B.S. in Computer Science and a certificate in Technology, Arts, and Media.
I'm a member of the
Laboratory for Playful Computation
where I'm known as the resident "code witch." My advisor is Dr. Ben Shapiro.
My research focuses on ways to empower adult women artists to learn to engineer their own technologies for live performance. I use design research and feminist methodologies to achieve this: by looking at how artists learn to work with interactive technologies, focusing on what artistic and computational practices they employ, and understanding the typical challenges they encounter in order to design better programmable technologies and curricula. I do this by teaching artist workshops, hosting hackathon-style events, and collaborating with artists to help them learn to engineer their own technologies to achieve their creative visions. This is all part of the Weird Code Club.

I am a computer scientist by training and a musician at heart; in both domains women are commonly underrepresented as technicians and engineers. In the performing arts in particular, artists often have to contract or outsource much of the technical work on their projects to other people. In some situations this can be ideal, but in others it forces artists to relinquish a great deal of their creative control. This is an issue I am passionate about addressing and am currently exploring in my work.
In addition to my research, I am usually working on several artistic endeavors at a time, including interactive and immersive installations and playing in loud, obnoxious rock bands.
The Talking Trees Jam was a weekend workshop where participants collaborated on interactive art installations that embody Colorado environmental data. Participants worked with real scientific data collected in Colorado forests and physical computing tools to bring their installations to life.
A 2017 interdisciplinary design event for artists and engineers where participants worked in groups to build an interactive technology for creative expression
For more examples of projects that are not shown here visit my personal portfolio website.
The Cell is an immersive, interactive, room-sized installation of an E. coli cell housed in the University of Colorado Boulder’s ATLAS Institute.
The inspiration for this project was to create a fun, visually engaging, and explorative educational experience. Upon entering the room, visitors step inside an E. coli cell and can touch, play with, and interact with the different elements of the cell.
The Cell contains large strands of interactive DNA that you can “unzip” by separating nucleotide pairs.
As visitors separate the nucleotide pairs, the bases of the pair – A, G, C, and T – are revealed and denoted by color.
Visitors can also pick up pieces of RNA off the floor and attach them to the strands of DNA to create different
RNA sequences. Eventually, visitors will be able to use these RNA sequences to activate proteins and then place the proteins in the cell’s receptors to
change the behavior and appearance of the cell. The construction of this project involved carpentry, sculpture, laser cutting, programming
physical computing technologies to add interactivity, sound design, and more.
“The Show” was a 2018 dance performance produced and choreographed by CU alumna Emily Daub that explored relationships between people and between different genres of dance. I designed and developed the software for the interactive light-up costumes. Each costume had one to five proximity sensors embedded in the clothing to sense the distances between dancers at any given time. Each costume had a default color (e.g., one dancer was red, another blue), and each dancer's color would affect the colors of the LEDs on the dancers around them. For example, if the blue dancer and red dancer were close together, both of their costumes' LEDs would blend and light up purple; as they moved apart, the lights would fade back toward their default colors. The LEDs on some of the costumes were also programmed to display different animations based on the rotational speed of the dancer. I also implemented a remote control that I could use during the performance to switch the costumes between different settings.
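The proximity-driven blending described above can be sketched in a few lines. This is a minimal illustration, not the actual costume firmware: the function names, the sensing range, and the linear blend are all assumptions.

```python
# Hypothetical sketch of the proximity-based color blending described above.
# Names, thresholds, and the linear blend are illustrative assumptions.

def blend(default_rgb, other_rgb, distance_cm, max_range_cm=200):
    """Blend a dancer's default color toward a nearby dancer's color.

    The closer the two dancers are, the stronger the blend; beyond
    max_range_cm the costume shows its default color unchanged.
    """
    closeness = max(0.0, 1.0 - distance_cm / max_range_cm)  # 1.0 = touching
    t = 0.5 * closeness  # blend at most halfway so each color stays readable
    return tuple(
        round(d + (o - d) * t) for d, o in zip(default_rgb, other_rgb)
    )

red, blue = (255, 0, 0), (0, 0, 255)
near = blend(red, blue, distance_cm=0)    # close: both shift toward purple
far = blend(red, blue, distance_cm=200)   # far apart: default color unchanged
```

Fading back to the default color falls out of the same function: as the measured distance grows, the blend factor shrinks toward zero.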
DoggyVision is an augmented reality mobile application I developed that lets people see the world in a dog’s color spectrum. It is a common misconception that dogs see the world in monochromatic black and white; in reality, dogs' vision is dichromatic. DoggyVision was developed in Unity for iOS and Android and can be used in handheld mode or with a Google Cardboard.
BlockyTalky is a programming environment designed to make it easy for novice programmers to
make interactive, networked physical computing devices. Kids as young as ten have used it
to rapidly build a wide range of projects, from networked cat feeders to computer music systems.
Part of my work on this project was to adapt BlockyTalky so that it could be used by adults to
interface with professional level artistic software like Max/MSP, Pd, Unity, Processing, and more.
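Tools like Pd and Max commonly listen for small network messages, which is one simple way a physical computing device can talk to them. The sketch below is purely illustrative; the port, message format, and function name are assumptions, not BlockyTalky's actual protocol.

```python
# Hypothetical sketch of a device sending a text datagram that audio software
# (e.g. Pd via a network-receive object) could pick up. Port and message
# format are illustrative assumptions, not BlockyTalky's actual protocol.

import socket

def send_note(host, port, note, velocity):
    """Send a small semicolon-terminated text message over UDP."""
    msg = f"note {note} {velocity};\n".encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(msg, (host, port))
    return msg

# e.g. a sensor-triggered device announcing middle C at velocity 100
payload = send_note("127.0.0.1", 3000, 60, 100)
```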
ARcadia is a toolkit for bringing paper prototypes to life through the use of marker-based Augmented Reality.
It enables people to create tangible interfaces for real-time creative expression without the use of
embedded sensors or electronics.
Users can construct interfaces using cardboard, paper, and other craft materials and stick fiducial markers on it to enable tracking via a webcam.
Users can then program different mappings between markers and events to add interactivity to their projects. After the crafting and programming phase, a user can play with their interface live – meaning their interactions with the interface will cause events to occur in real-time.
Examples of possible projects include drum machines, pianos, and games built using only cardboard, fiducial markers, and a camera.
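The marker-to-event mappings described above can be pictured as a small lookup from marker IDs to actions. This is only a sketch of the idea; ARcadia's actual programming model differs, and every identifier here is hypothetical.

```python
# Illustrative sketch of marker-to-event mappings like those described above.
# ARcadia's actual programming model differs; all names are hypothetical.

events = []

# Map fiducial marker IDs to actions, e.g. pads of a cardboard drum machine.
mappings = {
    0: lambda: events.append("kick"),
    1: lambda: events.append("snare"),
    2: lambda: events.append("hi-hat"),
}

def on_marker_covered(marker_id):
    """Fire the mapped action when the user covers a marker with a hand."""
    action = mappings.get(marker_id)
    if action:
        action()

# Simulate the webcam reporting two covered markers during live play.
on_marker_covered(0)
on_marker_covered(2)
```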
Universal Mind Control is an interactive installation for planetariums where people can use their brain to control space and time in
a planetarium system called Uniview. A user can train the software to activate different controls based on different brainwave patterns, e.g.
"when I'm concentrating, speed up Earth's orbit," or "when I'm reading, fly to the Kuiper belt." This is made possible through the use of several
configurable machine learning algorithms that allow users to tailor the experience to their own brain activity and to
set up the controls that they want to play with. Uniview is a popular
option for planetariums around the world, so we designed our application to interface with it, both to provide a high-quality visual
experience and to make our application portable enough to be used in most institutions.
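One way to picture the trained mapping from brain states to controls is a nearest-centroid classifier over per-user feature vectors. This is a simplified sketch under that assumption; the feature values, classifier, and command names are all illustrative, not the project's actual algorithms.

```python
# Hypothetical sketch of mapping recognized brain states to planetarium
# commands. The classifier, features, and command names are assumptions.

import math

# Per-user training: mean feature vectors recorded for each mental state.
centroids = {
    "concentrating": [0.8, 0.1],
    "reading": [0.2, 0.9],
}

# User-configured controls: which command each state should trigger.
controls = {
    "concentrating": "speed_up_orbit",
    "reading": "fly_to_kuiper_belt",
}

def classify(features):
    """Return the trained state whose centroid is nearest to the features."""
    return min(centroids, key=lambda s: math.dist(features, centroids[s]))

# A live reading near the "concentrating" centroid selects its command.
command = controls[classify([0.75, 0.2])]
```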
Audiovisual Playground is a 3D, virtual, music-making interface developed in Unity for the HTC
VIVE virtual reality headset. We took the music sequencer, a popular musical composition tool, and transformed it
from a 2D linear interface into a 3D circular one that
users can interact with and walk around.
Boulder Museum of Contemporary Art's Cultural Cul-de-sac was a series of events designed to bring members of the Boulder community together
and engage in discussions about different cultures. Several of my animations were displayed during
This is Gonna Take Awhile,
a BBQ- and picnic-themed event at the museum.
Working with the University of Colorado Museum of Natural History, we designed and
implemented Metamorphosis, a system for an engaging, educational butterfly exhibit called Becoming Butterflies.
The exhibit employs augmented reality and full-body interaction to guide users through
critical phases of a butterfly’s metamorphosis.
Metamorphosis lets children experience this life cycle with their own bodies by interacting with an
animated projected character. Children begin the interaction by standing in front of the exhibit
where they can be tracked by the Kinect. Once the Kinect begins tracking, the child is prompted (via text projected on the wall
and audio played via overhead speakers) to crouch to become an egg. Once the child crouches, the projector displays
an egg. The user wiggles back and forth to break free of the egg and become a caterpillar. In caterpillar form,
the user moves around eating leaves by making a “chomping” gesture with their arms until the caterpillar is ready to
become a chrysalis. At this point the user reaches a hand up to touch the branch overhead,
forming a silk thread and becoming a chrysalis. The child holds still until the exhibit prompts them to break
free and become a butterfly by reaching their arms out. Finally, the child is prompted to move their arms up
and down to fly away and lay another egg.
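The interaction flow above is essentially a gesture-driven state machine over the butterfly's life cycle. The sketch below follows the stages and gestures described in the text; all identifiers are illustrative, not the exhibit's actual code.

```python
# Hypothetical state-machine sketch of the exhibit's interaction flow.
# Stage ordering and gestures follow the description above; all names
# are illustrative, not the exhibit's actual code.

STAGES = ["egg", "caterpillar", "chrysalis", "butterfly"]
GESTURES = {
    "egg": "wiggle",            # wiggle back and forth to break out
    "caterpillar": "reach_up",  # touch the branch to form a chrysalis
    "chrysalis": "arms_out",    # reach arms out to emerge as a butterfly
    "butterfly": "flap",        # flap to fly away and lay another egg
}

def advance(stage, gesture):
    """Move to the next life-cycle stage if the right gesture is seen."""
    if GESTURES[stage] == gesture:
        return STAGES[(STAGES.index(stage) + 1) % len(STAGES)]
    return stage  # wrong gesture: stay in the current stage

stage = "egg"
stage = advance(stage, "wiggle")    # egg -> caterpillar
stage = advance(stage, "reach_up")  # caterpillar -> chrysalis
```

The modular index makes "butterfly" wrap back to "egg", mirroring how the final flapping gesture lays a new egg and restarts the cycle.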
The Laboratory for Playful Computation, directed by Dr. Ben Shapiro, designs new playful and programmable technologies for learning. As a research assistant I help facilitate research on how people learn and play with cutting-edge programmable digital media. In addition, I currently lead my own research projects on building DIY, programmable tools for live artistic performance.
While interning at Seagate Technology, I developed an interface to help factory workers run flaw-scan tests on hard drives.