Designing custom tangible interfaces requires in-depth knowledge of programming and electronics. It is also expensive, and iterating on designs is challenging even for experts.
Many computer science and physical computing education tools are still too complex for students to use to create meaningful projects.
The goal: empower novice programmers and designers to learn computer science and prototyping by creating interactive, expressive projects with ease and at little to no cost.
ARcadia was designed for use in K-12 informal and formal computer science education settings, and for anyone who wants to turn objects into interactive controllers without prior programming experience.
I reviewed existing research from the field of computer science education, as well as data collected by the Microsoft MakeCode education team. Research shows that text-based programming languages create cognitive overload for beginner programmers: their complexity causes beginners to worry about the syntax of their code (e.g., misplaced semicolons and keywords) rather than its semantics (the actual behavior of their program).
In addition, I reviewed existing educational toolkits for building interactive, physical projects. Research shows that computing toolkits such as the Arduino have too much complexity in the wiring and programming phases of building.
I also interviewed several experts from the fields of New Interfaces for Musical Expression (NIME), HCI, and Computer Science Education.
Paper-based prototyping techniques are commonly used by designers to express ideas before investing time and money into development. However, paper-based prototypes lack interactivity.
I wanted to leverage the benefits of paper prototyping while adding interactivity to bring paper prototypes to life.
I identified image tracking, AI, and augmented reality as technologies to accomplish this goal at little to no cost.
ARcadia enables users to make any physical object interactive by placing a unique, trackable barcode on it. A user can then point a webcam or phone camera at these tagged objects, which are detected by the system. In the ARcadia programming editor, users snap together code blocks to develop event-driven programs that define how their interfaces behave. For example, they can write a program that says, "when this object is moved to the left, play a snare drum."
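The event-driven model described above can be sketched as follows. This is a hypothetical JavaScript sketch, not ARcadia's actual implementation; the marker name, event name, and API are illustrative only.

```javascript
// Hypothetical sketch of an event-driven marker system like ARcadia's
// (the names and API here are illustrative, not the toolkit's real code).
class MarkerEvents {
  constructor() {
    this.handlers = {}; // maps "marker:event" -> list of callbacks
  }
  // Register a behavior, e.g. "when marker1 is moved left, do X"
  on(marker, event, fn) {
    const key = `${marker}:${event}`;
    if (!this.handlers[key]) this.handlers[key] = [];
    this.handlers[key].push(fn);
  }
  // Called by the vision system when it detects a marker event
  emit(marker, event) {
    for (const fn of this.handlers[`${marker}:${event}`] || []) fn();
  }
}

const events = new MarkerEvents();
const played = [];
// "when this object is moved to the left, play a snare drum"
events.on("marker1", "movedLeft", () => played.push("snare"));
events.emit("marker1", "movedLeft"); // the camera/tracker fires the event
```

The key idea is that users only ever express the "when X happens, do Y" mapping; the camera tracking and event dispatch stay hidden behind the blocks.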
I integrated block-based programming designs into ARcadia's programming interface so students can learn computational thinking by focusing on the designs and behaviors of their projects, instead of getting caught up in programming syntax problems that would prevent them from building meaningful and expressive projects.
Because ARcadia lets users develop programs using high-level code abstractions and requires only a computer and basic low-cost materials, users can rapidly prototype and design novel tangible interfaces for creative expression.
Throughout the design iterations, I frequently tested prototypes with users on-site, one-on-one.
I facilitated an augmented reality music-making workshop with 120 high school girls using a production version of ARcadia.
Based on feedback I collected from participants, I identified a weakness in the ARcadia toolkit. When I say "weakness" I am not referring to performance issues or bugs, but rather the way ARcadia was designed to be used. Participants wanted to use it for projects beyond the simple music features that were available in the browser. They wanted to use it to create custom controls for games, to control lighting in their houses, and more. In other words, ARcadia needed to be flexible enough to connect to many different applications through the internet of things (IoT).
To address this feedback, I designed ARcadia to contain beginner-friendly programming constructs that would enable users to connect their interfaces to other applications. To perform contextual inquiry, I interviewed designers and performing artists using semi-structured interviews and questionnaires to understand what interesting use cases might be. I found that in the performing arts and IoT communities, it is common for creators and hackers to route input and output between multiple applications using WebRTC and OSC (two technologies that enable networked communication), so I added blocks that communicate over WebRTC and OSC. Below are three examples: ARcadia used to control lighting, ARcadia used as an effects pedal for a keyboard, and ARcadia used to scrub a YouTube video to alter audio and visual effects.
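To give a sense of what those OSC blocks ultimately send over the wire, here is a minimal Node.js sketch of encoding an OSC message following the OSC 1.0 binary format. The address "/light/dim" and its arguments are made up for illustration; ARcadia's blocks hide this encoding from the user entirely.

```javascript
// Minimal sketch of OSC 1.0 message encoding (illustrative only).
// OSC strings are null-terminated and padded to a 4-byte boundary.
function pad4(buf) {
  const len = Math.ceil((buf.length + 1) / 4) * 4;
  return Buffer.concat([buf, Buffer.alloc(len - buf.length)]);
}

// Build an OSC message with integer arguments:
// address string, then a type-tag string (",i" per int), then
// each argument as a big-endian 32-bit integer.
function oscMessage(address, ...intArgs) {
  const addr = pad4(Buffer.from(address, "ascii"));
  const tags = pad4(Buffer.from("," + "i".repeat(intArgs.length), "ascii"));
  const args = intArgs.map((n) => {
    const b = Buffer.alloc(4);
    b.writeInt32BE(n);
    return b;
  });
  return Buffer.concat([addr, tags, ...args]);
}

// e.g. tell a hypothetical lighting app to set channel 3 to full brightness
const msg = oscMessage("/light/dim", 3, 255);
```

A lighting controller or music app listening on a UDP socket could decode such a message; in ARcadia, users just wire a marker event block to an OSC send block.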
To test ARcadia's new networking features, I worked with users in situ and conducted usability testing in the field at pop-up demo booths. In addition, to push ARcadia's capabilities to their limits, other performers and I tested it as a musical controller, video game controller, visual-jockeying controller, home automation controller, and more.
In my evaluation I found that some of the labels on the networking blocks were confusing and that too many steps were required to initiate a connection between devices. To address this, I redesigned the blocks so that devices could connect by name instead of by IP address or long strings of numbers and letters.
Overall, ARcadia makes it much easier for non-programmers to design multimedia, interactive, networked systems.
ARcadia has been deployed as a computer science education tool in international classrooms and learning environments. I published my ARcadia design work at the 2018 CHI Conference on Human Factors in Computing Systems, and ARcadia has also been referenced by other UI/UX designers and researchers.
ARcadia: A Rapid Prototyping Platform for Real-time Tangible Interfaces
Annie Kelly, R. Benjamin Shapiro, Peli de Halleux, & Thomas Ball
In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems
On list of Microsoft MakeCode custom editors
Project description on Laboratory for Playful Computation website
Mention of research publication on Microsoft Research's website
Mention of project and my overarching thesis research on the ATLAS Institute's website