Annie Kelly

PhD Student · Computer Scientist · Musician

Hello there, I'm a PhD student in the ATLAS Institute at the University of Colorado Boulder, where I also earned my B.S. in Computer Science and a certificate in Technology, Arts, and Media. I'm a member of the Laboratory for Playful Computation where I'm known as the resident "code witch." My advisor is Dr. Ben Shapiro.

My research focuses on ways that we can empower adult women artists to learn to engineer their own technologies for live performance. I use design research and feminist methodologies to achieve this – by looking at how artists learn to work with interactive technologies, focusing on what artistic and computational practices they employ, and understanding what typical challenges they encounter, in order to design better programmable technologies and curricula. I do this by teaching artist workshops, hosting hackathon-style events, and collaborating with artists to help them learn to engineer their own technologies to achieve their creative visions. This is all part of the Weird Code Club. I am a computer scientist by training and a musician at heart – in both domains, women are commonly underrepresented as technicians and engineers. In the performing arts in particular, artists often have to contract or outsource much of the technical work on their projects to other people. In some situations this can be ideal, but in others it forces artists to relinquish a lot of their own creative control. This is an issue I am passionate about addressing and am currently exploring in my work.

In addition to my research, I am usually working on several artistic endeavors at a time, including interactive and immersive installations and playing in loud, obnoxious rock bands.


University of Colorado Boulder

PhD student · ATLAS Institute
2016 - Present

University of Colorado Boulder

Bachelor of Science · Computer Science
Certificate · Technology, Arts, and Media
Class of 2016

Publications & Presentations

  • Tangible and Playful Connected Learning
    Sherry Hsi, Annie Kelly, Lila Finch, R. Benjamin Shapiro, Colin Dixon, Mike Petrich, & Karen Wilkinson
    A workshop facilitated at the 2018 Connected Learning Summit

  • ARcadia: A Rapid Prototyping Platform for Real-time Tangible Interfaces
    Annie Kelly, R. Benjamin Shapiro, Peli de Halleux, & Thomas Ball
    In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems

  • BlockyTalky: New programmable tools to enable students’ learning networks
    Annie Kelly, Lila Finch, Monica Bolles, & R. Benjamin Shapiro
    In the International Journal of Child-Computer Interaction, 2018

  • Universal Mind Control
    Annie Kelly & Monica Bolles
    Presented at the 2018 IMERSA Summit in Columbus, OH

  • BlockyTalky: A Prototyping Toolkit for Digital Musical Interfaces
    Annie Kelly, Monica Bolles, & R. Benjamin Shapiro
    A workshop facilitated at the 2017 International Conference on New Interfaces for Musical Expression

  • Becoming Butterflies: Interactive Embodiment of the Butterfly Lifecycle
    Annie Kelly, Matthew Whitlock, Stephen Voida, et al.
    Poster presented at the 2017 ACM International Joint Conference on Pervasive and Ubiquitous Computing

  • BlockyTalky: Tangible Distributed Computer Music for Youth
    R. Benjamin Shapiro, Annie Kelly, Matthew Ahrens, et al.
    Published in Computer Music Journal, 2017

  • Audiovisual Playground: A Music Sequencing Tool for 3D Virtual Worlds
    Annie Kelly & Kristofer Klipfel
    In Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems

  • BlockyTalky: A Physical and Distributed Computer Music Toolkit for Kids
    R. Benjamin Shapiro, Annie Kelly, Matthew Ahrens, & Rebecca Fiebrink
    In Proceedings of the 2016 International Conference on New Interfaces for Musical Expression

Weird Code Club

Part of what I'm passionate about is empowering adult artists to engineer their own technologies for live applications – like performing arts, interactive museum exhibits, and more. Typically, when artists want to incorporate custom interactive technology into their performances, they have to outsource the work to engineers, losing creative agency in the process. The Weird Code Club is a series of educational workshops and computational tools designed to empower artists to have more control over the designs of technologies used in their projects, and to help them become more self-confident creative technologists.

Some of the performance technologies the Weird Code Club seeks to make more accessible include toolkits that simplify the construction of real-time tangible interfaces, design tools targeting specific creation goals like stage lighting and proximity detection, VJ and visual performance software, and audio processing tools.

  • Code Against the Machine

    Tools & educational workshops for artists to build interactive technologies that enhance their live performances

    We recently worked with students enrolled in a graduate CU dance course called Advanced Composition. We led a few days of instruction on how to use beginner-friendly physical computing tools to create interactive systems, then let the students choose a final project to work on. Students were taught how to use the micro:bit, how to create their own physical inputs using electronics and craft materials, and how to map input from real-time sensors to output such as sound, lighting, and motion. Students met with me one-on-one for technical advice but were instructed to build their systems themselves, including both the programming and the construction. Students then performed using the technologies they built.

  • We have also started running a series of interactive stage-lighting courses for artists and musicians. Using physical computing tools and browser-based programming editors we teach musicians how to build physical interfaces that can be used in real-time to control professional stage lighting equipment. Participating artists are taught how to control DMX512 stage lighting equipment using custom-built physical inputs and sensors.

  • Talking Trees Jam

    A weekend jam for creating interactive art installations that embody environmental change

    The Talking Trees Jam was a weekend workshop where participants collaborated on interactive art installations that embody Colorado environmental data. Participants worked with real scientific data collected in Colorado forests and physical computing tools to bring their installations to life.

  • Creative++

    A Creative Arts and Technology Jam

    A 2017 interdisciplinary design event for artists and engineers where participants worked in groups to build an interactive technology for creative expression
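As a sketch of the kind of sensor-to-light mapping taught in the stage-lighting courses above: DMX512 sends one universe of up to 512 channels, each an 8-bit value. The fixture layout and scaling below are hypothetical, not the actual workshop code – a single normalized sensor reading fades one RGB fixture from blue to red.

```python
def sensor_to_dmx(reading, lo=0.0, hi=1.0):
    """Map a normalized sensor reading to three DMX channel values (0-255).

    DMX512 channels carry 8-bit values; here the reading fades a fixture
    from blue (low) to red (high). Hypothetical mapping for illustration.
    """
    # Clamp the raw reading into range, then normalize to 0.0-1.0.
    t = max(lo, min(hi, reading))
    t = (t - lo) / (hi - lo)
    red = round(255 * t)
    blue = round(255 * (1 - t))
    return [red, 0, blue]  # channels 1-3 of a simple RGB fixture

frame = [0] * 512              # one full DMX universe
frame[0:3] = sensor_to_dmx(0.75)  # patch the fixture at channel 1
```

In the actual workshops the frame would be pushed to a DMX interface every cycle; here it is just a list so the mapping itself is visible.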

Art & technology

For more examples of projects that are not shown here visit my personal portfolio website.

"The Cell"

An immersive and interactive scientific installation of an E. coli cell

The Cell is an immersive, interactive, room-sized installation of an E. coli cell housed in the University of Colorado Boulder’s ATLAS Institute. The inspiration for this project was to make a fun, visually engaging, and explorative educational experience. Upon entering the room, visitors step inside an E. coli cell and can touch, play, and interact with the different elements of the cell. The Cell contains large strands of interactive DNA that you can “unzip” by separating nucleotide pairs. As visitors separate the nucleotide pairs, the bases of the pair – A, G, C, and T – are revealed and denoted by color. Visitors can also pick up pieces of RNA off the floor and attach them to the strands of DNA to create different RNA sequences. Eventually, visitors will be able to use these RNA sequences to activate proteins and then place the proteins in the cell’s receptors to change the behavior and appearance of the cell. The construction of this project involved carpentry, sculpture, laser cutting, programming physical computing technologies to add interactivity, sound design, and more.
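The base-pairing rule behind the "unzipping" interaction (A pairs with T, G with C) fits in a few lines; the sketch below is only the biology rule, not the installation's actual software:

```python
# Watson-Crick complements revealed when a visitor separates a DNA strand.
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def unzip(strand):
    """Return the complementary strand exposed by 'unzipping' the DNA."""
    return "".join(COMPLEMENT[base] for base in strand)

# Separating a "GATC" strand exposes its complement, "CTAG".
```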

"The Show"

A dance performance featuring interactive light-up costumes

“The Show” was a 2018 dance performance produced and choreographed by CU alumna Emily Daub. The performance explored relationships between people and different genres of dance. I designed and developed the software for the interactive light-up costumes. Each costume had between one and five proximity sensors embedded in the clothing to sense the distances between the dancers at any given time. Each costume had a default color (e.g. one dancer was red, one was blue, etc.), and the colors of each dancer would affect the colors of the LEDs of the dancers around them. For example, if the blue dancer and the red dancer were close together, both of their costumes' LEDs would blend and light up purple; when they moved away from each other, the lights would fade back toward their default colors. The LEDs on some of the costumes were also programmed to display different animations based on the rotational speed of the dancer. I also implemented a remote-control device that I could use during the performance to switch the costumes between different settings.
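The proximity-based blending can be sketched as a linear interpolation between a dancer's default color and a neighbor's color, weighted by closeness. The distance thresholds and RGB values here are hypothetical, not the show's actual parameters:

```python
def blend_costume(default_rgb, neighbor_rgb, distance, near=0.5, far=3.0):
    """Blend toward a nearby dancer's color; fade back when they move away.

    distance is in meters (illustrative units). At `near` or closer the two
    colors mix 50/50; at `far` or beyond the costume shows its default.
    """
    # closeness is 0.0 when far apart, 1.0 at the near threshold.
    closeness = max(0.0, min(1.0, (far - distance) / (far - near)))
    w = 0.5 * closeness  # cap the neighbor's influence at 50%
    return tuple(round((1 - w) * d + w * n)
                 for d, n in zip(default_rgb, neighbor_rgb))

RED, BLUE = (255, 0, 0), (0, 0, 255)
# Close together, the red dancer's LEDs shift toward purple;
# at full distance they return to pure red.
close_color = blend_costume(RED, BLUE, distance=0.5)
far_color = blend_costume(RED, BLUE, distance=3.0)
```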


DoggyVision

See the world through a dog's eyes

DoggyVision is an augmented reality mobile application I developed that lets people see the world in a dog’s color spectrum. It is a common misconception that dogs see the world in black and white; in actuality, dogs' vision is dichromatic. DoggyVision was developed in Unity for iOS and Android and can be used in handheld mode or with a Google Cardboard.
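A very rough sketch of the kind of per-pixel transform such an app applies, assuming a crude dichromacy approximation in which the red and green channels collapse together. Real dichromat simulations use calibrated color-space matrices; this is illustrative only, not DoggyVision's actual shader math:

```python
def dog_vision(rgb):
    """Crude dichromatic approximation: merge red and green into one
    shared channel, keep blue. Illustrative only."""
    r, g, b = rgb
    merged = (r + g) // 2  # red and green become indistinguishable
    return (merged, merged, b)

# A pure red pixel and a pure green pixel end up identical,
# while blue is preserved.
assert dog_vision((255, 0, 0)) == dog_vision((0, 255, 0))
```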


BlockyTalky

A programmable toolkit for building collaborative musical controllers

BlockyTalky is a programming environment designed to make it easy for novice programmers to make interactive, networked physical computing devices. Kids as young as ten have used it to rapidly build a wide range of projects, from networked cat feeders to computer music systems. Part of my work on this project was to adapt BlockyTalky so that it could be used by adults to interface with professional level artistic software like Max/MSP, Pd, Unity, Processing, and more.


ARcadia

An augmented reality toolkit for prototyping live artistic controllers

ARcadia is a toolkit for bringing paper prototypes to life through the use of marker-based Augmented Reality. It enables people to create tangible interfaces for real-time creative expression without the use of embedded sensors or electronics. Users construct interfaces out of cardboard, paper, and other craft materials and attach fiducial markers to enable tracking via a webcam. Users can then program different mappings between markers and events to add interactivity to their projects. After the crafting and programming phase, a user can play with their interface live – meaning their interactions with the interface cause events to occur in real-time. Possible projects include drum machines, pianos, games, and more, using only cardboard, fiducial markers, and a camera.
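The marker-to-event mapping at the heart of this workflow can be sketched as a lookup from marker visibility changes to callbacks. The marker IDs, event names, and class below are hypothetical, not ARcadia's actual API – they just show the idea of a cardboard "drum pad" that triggers a sound when a hand covers its marker:

```python
class MarkerEvents:
    """Hypothetical sketch of mapping fiducial-marker visibility to events."""

    def __init__(self):
        self.handlers = {}    # (marker_id, event) -> callback
        self.visible = set()  # markers seen in the previous frame

    def on(self, marker_id, event, callback):
        self.handlers[(marker_id, event)] = callback

    def update(self, seen):
        """Call once per camera frame with the set of visible marker IDs."""
        seen = set(seen)
        for m in seen - self.visible:   # marker just came into view
            self._fire(m, "shown")
        for m in self.visible - seen:   # marker just covered, e.g. by a hand
            self._fire(m, "hidden")
        self.visible = seen

    def _fire(self, marker_id, event):
        callback = self.handlers.get((marker_id, event))
        if callback:
            callback()

events = MarkerEvents()
events.on(3, "hidden", lambda: print("play kick drum"))
events.update([3])  # marker 3 enters the webcam's view
events.update([])   # covering marker 3 triggers the kick drum
```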

Universal Mind Control

Controlling the universe with the human mind

Universal Mind Control is an interactive installation for planetariums where people can use their brains to control space and time in a planetarium system called Uniview. A user can train the software to activate different controls based on different brainwave patterns, e.g. "when I'm concentrating, speed up Earth's orbit," or "when I'm reading, fly to the Kuiper belt." This is made possible through the use of several customizable machine learning algorithms that let users tailor the experience to their own brain activity and set up the controls they want to play with. Uniview is a popular option for planetariums across the world, so we designed our application to interface with it – both to provide a high-quality visual experience and to make our application usable in most institutions.
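One customizable approach of the kind described – train per-user patterns, then match live readings against them – is a nearest-centroid classifier over brainwave feature vectors. The feature values and labels below are made up for illustration, and this is not the installation's actual algorithm:

```python
import math

class BrainwaveClassifier:
    """Nearest-centroid sketch: average each user's training samples per
    label, then map a live reading to the closest label's control."""

    def __init__(self):
        self.centroids = {}  # label -> centroid feature vector

    def train(self, label, samples):
        # Average the training samples dimension-by-dimension.
        n = len(samples)
        self.centroids[label] = [sum(dim) / n for dim in zip(*samples)]

    def classify(self, reading):
        # Pick the label whose centroid is nearest the live reading.
        return min(self.centroids,
                   key=lambda lbl: math.dist(reading, self.centroids[lbl]))

clf = BrainwaveClassifier()
# Made-up 2-D features, e.g. (alpha power, beta power) per sample.
clf.train("concentrating", [(0.2, 0.9), (0.3, 0.8)])
clf.train("relaxed",       [(0.9, 0.1), (0.8, 0.2)])
command = {"concentrating": "speed up Earth's orbit",
           "relaxed": "pause"}[clf.classify((0.25, 0.85))]
```

In the installation the chosen label would be translated into a Uniview command; here it just selects a string.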

AudioVisual Playground

A virtual reality playground for interacting with music sequencers

Audiovisual Playground is a 3D, virtual, music-making interface developed in Unity for the HTC VIVE virtual reality headset. We took the music sequencer – a popular musical composition tool – and transformed it from a 2D linear interface into a 3D circular one that users can interact with and walk around.
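The linear-to-circular transformation can be sketched as placing each sequencer step at an angle around a ring, so a playhead sweeping the circle corresponds to time moving left-to-right in the flat version. The step count and radius here are arbitrary, not the project's actual values:

```python
import math

def circular_step_positions(n_steps, radius=2.0):
    """Place sequencer steps evenly on a circle in the XZ ground plane.

    Step i of a linear sequencer goes to angle 2*pi*i/n_steps, so walking
    around the ring traverses the musical loop in order.
    """
    positions = []
    for i in range(n_steps):
        angle = 2 * math.pi * i / n_steps
        positions.append((radius * math.cos(angle),  # x
                          0.0,                       # y (floor height)
                          radius * math.sin(angle))) # z
    return positions

steps = circular_step_positions(16)  # a 16-step loop around the player
```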

Boulder Museum of Contemporary Art Cultural Cul-de-sac event

Public display of animated artwork

Boulder Museum of Contemporary Art's Cultural Cul-de-sac was a series of events designed to bring members of the Boulder community together to engage in discussions about different cultures. Several of my animations were displayed during This is Gonna Take Awhile, a BBQ- and picnic-themed cul-de-sac event at the museum.


Becoming Butterflies

A temporary museum installation about the lifecycle of butterflies

Working with the University of Colorado Museum of Natural History, we designed and implemented Metamorphosis, a system for an engaging, educational butterfly exhibit called Becoming Butterflies. The exhibit employs augmented reality and full-body interaction to guide users through critical phases of a butterfly's metamorphosis. Metamorphosis lets children experience this life cycle with their own bodies by interacting with an animated projected character. Children begin the interaction by standing in front of the exhibit, where they can be tracked by a Kinect. Once the Kinect begins tracking, the child is prompted (via text projected on the wall and audio played through overhead speakers) to crouch to become an egg. Once the child crouches, the projector displays an egg. The child wiggles back and forth to break free of the egg and become a caterpillar. In caterpillar form, the child moves around eating leaves by making a "chomping" gesture with their arms until the caterpillar is ready to become a chrysalis. At this point the child reaches a hand up to touch the branch overhead, forming a silk thread and becoming a chrysalis. The child holds still until the exhibit prompts them to break free and become a butterfly by reaching their arms out. Finally, the child is prompted to move their arms up and down to fly away and lay another egg.
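The guided sequence above is essentially a gesture-driven state machine. A minimal sketch, with gesture names simplified from the exhibit's actual Kinect tracking logic:

```python
# Each lifecycle stage names the one gesture that advances the child
# to the next stage; any other movement (like the caterpillar's
# leaf-eating "chomp") leaves the stage unchanged.
TRANSITIONS = {
    "standing":    ("crouch",   "egg"),
    "egg":         ("wiggle",   "caterpillar"),
    "caterpillar": ("reach_up", "chrysalis"),
    "chrysalis":   ("arms_out", "butterfly"),
    "butterfly":   ("flap",     "standing"),  # fly away and lay a new egg
}

def advance(stage, gesture):
    """Return the next lifecycle stage, or stay put on any other gesture."""
    expected, next_stage = TRANSITIONS[stage]
    return next_stage if gesture == expected else stage

stage = "standing"
for gesture in ["crouch", "wiggle", "chomp", "reach_up", "arms_out", "flap"]:
    stage = advance(stage, gesture)
# "chomp" kept the child in caterpillar form; "flap" restarts the cycle.
```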


Laboratory for Playful Computation

Research Assistant

The Laboratory for Playful Computation, directed by Dr. Ben Shapiro, designs new playful and programmable technologies for learning. As a research assistant I help facilitate research on how people learn and play with cutting-edge advances in programmable digital media. In addition, I currently lead my own research projects around building DIY and programmable tools for live artistic performance.

October 2015 - Present

Microsoft Research

PhD Research Intern

While I was a PhD intern at Microsoft Research I worked with the Microsoft MakeCode team. I designed and developed a pxt target: a blocks-based and JavaScript code editor for building augmented reality and music experiences. You can try out the editor here (currently supported in the Chrome browser on computers and Android phones).

June 2017 - August 2017

Seagate Technology

Engineering Intern

While I was an intern at Seagate Technology I developed an interface to help factory workers run flaw-scan tests on hard drives.

May 2015 - October 2015