I led the design and research of ARcadia at the Laboratory for Playful Computation in Boulder, CO, and at Microsoft Research in Redmond, WA.

ARcadia brings paper prototypes to life through the use of marker-based augmented reality (AR). With ARcadia, people can turn real-world tangible objects into controllers for music, IoT devices, lights, and more. ARcadia empowers novice technology designers to rapidly construct and iterate on these interactive prototypes.

Photograph of an ARcadia controller for an RGB light
How ARcadia works

Track objects
Print or draw AR markers and place them on tangible objects.

Illustration of placing AR markers on physical objects

Write code
Program how you want your interface to behave.

Illustration of programming using the ARcadia web editor

Start interacting!
Interact with the AR objects to trigger your code events!

Illustration of a person interacting with their ARcadia prototype

Designing custom tangible interfaces requires complex knowledge of programming and electronics.
It is also expensive, and iterating on designs is challenging, even for experts.

Many computer science and physical computing education tools are still too complex for students to use to create meaningful projects.


Empower novice programmers and designers to learn computer science and prototyping by creating interactive, expressive projects with ease and at little to no cost.

Who ARcadia is for

ARcadia was designed for use in K-12 formal and informal computer science education settings, and for people of any age who want to turn objects into interactive controllers without prior programming experience.


I reviewed existing research from the field of computer science education, as well as data collected by the Microsoft MakeCode education team. Research shows that text-based programming languages create cognitive overload for beginner programmers. These languages are complex and cause beginners to worry about the syntax of their code (e.g., misplaced semicolons or keywords) rather than its semantics (the actual behavior of their program).

In addition, I reviewed existing educational toolkits for building interactive, physical projects. Research shows that physical computing toolkits such as the Arduino introduce too much complexity in the wiring and programming phases of building.

I also interviewed several experts from the fields of New Interfaces for Musical Expression (NIME), HCI, and Computer Science Education.


Paper-based prototyping techniques are commonly used by designers to express ideas before investing time and money into development. However, paper-based prototypes lack interactivity.

I wanted to leverage the benefits of paper prototyping and add interactivity to bring paper prototypes to life.

I identified image tracking, AI, and augmented reality as technologies to accomplish this goal at little to no cost.

A few ARcadia markers
Users print out markers and attach them to objects to make them interactable.
ARcadia setup with printed markers, a webcam, and browser
ARcadia setup with printed markers, a webcam, and browser.

ARcadia enables users to make any physical object interactive by placing a unique, trackable marker on it. A user can then point a webcam or phone camera at these objects, which the system detects. In the ARcadia programming editor, users snap together code blocks to develop event-driven programs that define how their interfaces behave. For example, they can write a program that says, "when this object is moved to the left, play a snare drum."
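The event-driven model above can be sketched in code. This is a minimal, hypothetical illustration in TypeScript (the language MakeCode blocks compile to); the names `MarkerRuntime`, `on`, and `emit` are my own and not ARcadia's actual API:

```typescript
// Hypothetical sketch of ARcadia's event-driven model: handlers registered
// against marker events fire when the tracking system reports a state change.

type MarkerEvent = "movedLeft" | "movedRight" | "touching";

class MarkerRuntime {
  private handlers = new Map<string, (() => void)[]>();

  // Register a handler, e.g. "when marker 3 is moved to the left, play a snare drum"
  on(markerId: number, event: MarkerEvent, handler: () => void): void {
    const key = `${markerId}:${event}`;
    const list = this.handlers.get(key) ?? [];
    list.push(handler);
    this.handlers.set(key, list);
  }

  // Called by the tracking loop whenever the camera detects a marker state change
  emit(markerId: number, event: MarkerEvent): void {
    for (const h of this.handlers.get(`${markerId}:${event}`) ?? []) h();
  }
}

const runtime = new MarkerRuntime();
const played: string[] = [];

// The user's block program: "when marker 3 moves left, play a snare drum"
runtime.on(3, "movedLeft", () => played.push("snare"));

// Simulate the vision system reporting that marker 3 moved left
runtime.emit(3, "movedLeft");
console.log(played); // → [ 'snare' ]
```

The key design point is that the camera loop and the user's program are decoupled: users only declare what should happen when a marker event occurs, and the runtime handles detection and dispatch.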

I integrated block-based programming into ARcadia's editor so that students can learn computational thinking by focusing on the designs and behaviors of their projects, instead of getting caught up in syntax problems that would prevent them from building meaningful and expressive projects.

Because ARcadia lets users program with high-level code abstractions and requires only a computer and basic low-cost materials, users can rapidly prototype and design novel, tangible interfaces for creative expression.

Low-fidelity wireframe
A low-fidelity wireframe sketch of the ARcadia interface.
High-fidelity mockup made in Sketch.
Production version in a web browser
Production version in a web browser.
A programming block.
An ARcadia program that plays a C note and a kick drum sample when two markers are touching.
A video demonstrating how to use ARcadia.


Throughout the design iterations, I frequently tested prototypes with users on site one-on-one.

I facilitated an augmented reality music-making workshop with 120 high school girls using a production version of ARcadia.

My methods:

  • I guided workshop participants through a few introductory exercises, after which they worked on projects of their choosing for one hour,
  • I collected their project code, photographs and videos of their projects through various levels of completion, and
  • I collected participants' responses for things they "liked about ARcadia," and things they "wish ARcadia could do."

Based on feedback I collected from participants, I identified a weakness in the ARcadia toolkit. By "weakness" I do not mean performance issues or bugs, but rather the way ARcadia was designed to be used. Participants wanted to use it for projects beyond the simple music features that were available in the browser. They wanted to create custom controls for games, to control lighting in their houses, and more. In other words, ARcadia needed to be flexible enough to connect to many different applications through the internet of things (IoT).

Post-it wall that participants wrote their feedback on.
Post-it wall that participants wrote their feedback on.
A study participant holding up an ARcadia marker to a webcam.
A study participant holding up an ARcadia marker to a webcam.
A study participant's ARcadia piano project.
A study participant's ARcadia piano project.
Design Revisions

To address this feedback, I designed beginner-friendly programming constructs that would enable users to connect their interfaces to other applications. To perform contextual inquiry, I interviewed designers and performing artists, using semi-structured interviews and questionnaires, to understand what interesting use cases might be. I found that in the performing arts and IoT communities, it is common for creators and hackers to route input and output between multiple applications using WebRTC and OSC (two technologies that enable networked communication), so I added blocks that communicate over WebRTC and OSC. Below are three examples: ARcadia controlling lighting, ARcadia acting as an effects pedal for a keyboard, and ARcadia scrubbing a YouTube video to alter audio and visual effects.

Photograph of an ARcadia controller for an RGB light
An ARcadia interface for adjusting the color of an RGB light.
Photograph of an ARcadia controller for toggling piano effects
An ARcadia button to toggle effects on and off on a MIDI controller.
Photograph of an ARcadia controller for controlling music video playback
An ARcadia slider to adjust the playback speed of a music video.
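To make the OSC side of these networking blocks concrete, here is a minimal sketch of encoding an OSC message such as a marker position being sent to a lighting or music app. The encoding follows the OSC 1.0 wire format; the address `/marker/3/x` is illustrative and not ARcadia's actual message schema:

```typescript
// Minimal OSC message encoder (OSC 1.0): address string, type tag string,
// then big-endian arguments, each section padded to 4-byte boundaries.

// Pad an ASCII string to a null-terminated, 4-byte-aligned OSC string
function oscString(s: string): Uint8Array {
  const len = Math.ceil((s.length + 1) / 4) * 4;
  const out = new Uint8Array(len); // zero-filled, so padding nulls come free
  for (let i = 0; i < s.length; i++) out[i] = s.charCodeAt(i);
  return out;
}

// Encode an OSC message carrying a single float32 argument
function oscMessage(address: string, value: number): Uint8Array {
  const addr = oscString(address);
  const tags = oscString(",f"); // type tag string: one float argument
  const arg = new Uint8Array(4);
  new DataView(arg.buffer).setFloat32(0, value, false); // big-endian per the spec
  const out = new Uint8Array(addr.length + tags.length + arg.length);
  out.set(addr, 0);
  out.set(tags, addr.length);
  out.set(arg, addr.length + tags.length);
  return out;
}

// e.g. report marker 3's normalized x-position (hypothetical address)
const msg = oscMessage("/marker/3/x", 0.5);
// The resulting packet could then be sent over UDP, or bridged to a peer over WebRTC.
```

Because OSC is just a byte layout, any OSC-aware application (lighting software, DAWs, VJ tools) can consume these packets, which is what made it a good fit for connecting ARcadia to other creative tools.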
Further Evaluation

To test ARcadia's new networking features, I conducted usability testing with users in situ as well as at pop-up demo booths in the field. In addition, to push ARcadia's capabilities to the max, other performers and I tested it as a musical controller, a video game controller, a visual jockeying controller, a home automation controller, and more.

A video of ARcadia being used to control the playback rate and position of a music video.
Further Revisions

In my evaluation I found that some of the labels on the networking blocks were confusing, and that too many steps were required to initiate a connection between devices. To address this, I redesigned the blocks so that devices could connect by name instead of by IP address or long strings of numbers and letters.
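The "connect by name" revision can be sketched as a small directory lookup: a name chosen by the user resolves to the peer's connection details, so the address never appears in the user's program. The `Directory` class and its methods here are hypothetical, purely to illustrate the design choice:

```typescript
// Hypothetical name directory: users register a device under a memorable name,
// and other devices resolve that name instead of typing an IP address.

interface PeerInfo {
  host: string;
  port: number;
}

class Directory {
  private rooms = new Map<string, PeerInfo>();

  register(name: string, info: PeerInfo): void {
    this.rooms.set(name.toLowerCase(), info); // treat names as case-insensitive
  }

  resolve(name: string): PeerInfo {
    const info = this.rooms.get(name.toLowerCase());
    if (!info) throw new Error(`no device registered as "${name}"`);
    return info;
  }
}

const directory = new Directory();
directory.register("stage-lights", { host: "192.168.1.42", port: 9000 });

// The user-facing connect block only needs the name
const peer = directory.resolve("Stage-Lights");
```

Hiding the address behind a name trades a small amount of infrastructure (somewhere to keep the registry) for a much shorter connection flow, which is the trade the redesign made.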


Overall, ARcadia makes it much easier for non-programmers to design multimedia, interactive, networked systems.

ARcadia has been used as a computer science education tool in international classrooms and learning environments. I published my ARcadia design work at the 2018 CHI Conference on Human Factors in Computing Systems, and ARcadia has been referenced by other UI/UX designers and researchers.


ARcadia: A Rapid Prototyping Platform for Real-time Tangible Interfaces
Annie Kelly, R. Benjamin Shapiro, Peli de Halleux, & Thomas Ball
In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems

Other media

On list of Microsoft MakeCode custom editors
Project description on Laboratory for Playful Computation website
Mention of research publication on Microsoft Research's website
Mention of project and my overarching thesis research on the ATLAS Institute's website