Creative Technology Projects & Experiments

These projects range from physical computing public art installations to games and AR applications. 

(in)visible narrativity
is a story-sharing initiative expressed as an interactive art installation. When observers encounter the piece, their very motion trips the "BOX" into revealing a hidden narrative. The stories shared allow for the public revelation of whatever content the writer or participant chooses.

This project sits at the intersection of sociology and psychology to examine how the perception of social identities can be positively influenced through the sharing of stories within a public space, in this case Washington Square Park, New York, NY. 

Technically, this project uses standard LEDs, ultraviolet LEDs, a motion sensor, and a microcontroller, all encased within a "BOX".

Automated Gender Bias: For this project, we used machine learning to analyze scene settings and content in the 1978 film “Halloween.” We used TensorFlow's im2txt model to generate a description for each frame, then ran a Python script to pull frames whose descriptions matched particular words. From the matched frames, we computed frame counts and percentages, giving quantitative data on female stereotypes.

Done in collaboration with Lauren Malkani and Aarati Akkapeddi.
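The frame-selection step might be sketched in Python roughly as follows; the caption data, keyword list, and file names here are illustrative stand-ins, not the project's actual data:

```python
# Sketch of the frame-selection step, assuming the im2txt captions have
# already been exported to a {frame_filename: caption} mapping.
# Keywords and file names below are illustrative only.

def frames_matching(captions, keywords):
    """Return frames whose generated caption contains any target word."""
    keywords = {k.lower() for k in keywords}
    return [frame for frame, caption in captions.items()
            if keywords & set(caption.lower().split())]

def percentage(matched, total):
    """Share of the film's frames that match, as a percentage."""
    return 100.0 * len(matched) / total

captions = {
    "frame_0001.jpg": "a woman standing in a kitchen",
    "frame_0002.jpg": "a man holding a knife",
    "frame_0003.jpg": "a woman talking on a phone",
}
matched = frames_matching(captions, ["kitchen", "phone"])
print(matched, percentage(matched, len(captions)))
```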

Apologia: An American Apology: For this project, we used a Python script to export each frame of each video as an individual image, then used a facial recognition model to trace the facial landmarks in each frame, creating a “computer vision” recreation of the face. We stitched the frames back together into a new sketch of the apology. We then analyzed each spoken sentence with IBM Watson’s Tone Analyzer to obtain per-sentence and overall sentiment, and overlaid this information onto each final video piece.

Done in collaboration with Lauren Malkani and Aarati Akkapeddi.
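The per-sentence and overall sentiment step can be modeled in miniature; the tone names and scores below are made-up placeholders, not real Tone Analyzer output, which returns a richer JSON structure:

```python
# A simplified model of the sentiment step: the Tone Analyzer response is
# reduced here to {sentence: {tone: score}} dictionaries. The sentences,
# tone names, and scores are illustrative, not actual API output.

def dominant_tone(scores):
    """Tone with the highest score for one sentence."""
    return max(scores, key=scores.get)

def overall_tones(per_sentence):
    """Average each tone's score across all sentences."""
    totals = {}
    for scores in per_sentence.values():
        for tone, s in scores.items():
            totals[tone] = totals.get(tone, 0.0) + s
    n = len(per_sentence)
    return {tone: total / n for tone, total in totals.items()}

per_sentence = {
    "I take full responsibility.": {"sadness": 0.61, "confident": 0.82},
    "We will do better.":          {"sadness": 0.20, "confident": 0.91},
}
print(dominant_tone(per_sentence["I take full responsibility."]))
print(overall_tones(per_sentence))
```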

Parallel Americana and the Power of Perception is a video-based installation that applies theories of quantum physics to human perception in order to examine our ability to create and manipulate reality. The concept aims to counter the narrow individual viewpoints that often produce intolerance for opposing ideas. Once you enter the space, facial recognition registers your presence, and your mere observation generates a reality.

Created using openFrameworks and Adobe After Effects.

Github: Source Code

TranSense: Environmental Interconnectedness explores how technology can influence not only how we interact with fellow humans across the world, but also how we interact with our environment.

This wearable haptic undergarment lets the user receive up-to-date weather conditions via the Internet of Things (IoT). The garment is embedded with 10 vibration motors, each connected to a Wi-Fi-enabled microcontroller located on the back. When connected to an accessible Wi-Fi network, the microcontroller pulls weather data for a user-specified location from the OpenWeatherMap API, parses out the information needed, in this case the wind data, and maps it to the vibration motors. Two pieces of information are mapped: the wind speed, which controls how much power is given to each motor, and the wind direction, which determines which motors are activated.
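The wind-to-motor mapping could be sketched like this in Python; the ring arrangement of the motors, the PWM range, and the speed cap are assumptions, though the `speed`/`deg` fields match the wind object returned by OpenWeatherMap's current-weather endpoint:

```python
# Sketch of the wind-to-motor mapping described above, assuming the ten
# motors are arranged evenly in a ring around the torso. The PWM range
# and the speed cap are illustrative, not the project's actual values.

NUM_MOTORS = 10
MAX_WIND_MS = 20.0  # wind speeds at or above this drive motors at full power

def motor_for_direction(degrees):
    """Map a wind direction (0-360 deg) to one of the ring's motor indices."""
    return int(degrees % 360 // (360 / NUM_MOTORS))

def pwm_for_speed(speed_ms):
    """Map wind speed (m/s) to an 8-bit PWM duty value."""
    clamped = max(0.0, min(speed_ms, MAX_WIND_MS))
    return round(clamped / MAX_WIND_MS * 255)

# e.g. OpenWeatherMap reports wind as {"speed": 5.1, "deg": 240}
wind = {"speed": 5.1, "deg": 240}
print(motor_for_direction(wind["deg"]), pwm_for_speed(wind["speed"]))
```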

How can we extract environmental data to create a sense of calm in our lives? And how many of us communicate with the world through cellular devices? When too much time is spent communicating this way, a disconnect forms between us and our physical surroundings. The aim of this project is to better integrate us with the world through a different form, one that doesn’t rely on screens.

TranSense: Environmental Interconnectedness: Take a journey as a cellular device in a dystopian future where humankind has been eradicated and the only remnants of human civilization are the technologies humanity once depended upon.

This phase-one prototype AR Android application is the beginning of a series of AR instructional materials demonstrating how simple physics and mechanical engineering can be used to make creative and unusual instruments.

A Virtual Rube Goldberg Machine was created using Unity3D and Vuforia.

A simulation of the effect of affordable housing on people without homes in a municipality.


Goal: The goal of this project was to create an interactive book that plays into both the physical and digital spaces. I chose to use light and sound for the digital element, built in Processing. Conceptually, this book is a prototype children’s book to help very young ESL students learn the English onomatopoeic sounds of various animals. When children press the pages of the book, they hear the sound of the animal displayed.

  • Jumper wires (7)
  • 1 MΩ resistors (3)
  • Conductive (copper) tape/ink
  • Arduino Uno
  • Alligator clips (3)

How it works: There are 3 pages. Under each page is another sheet partially covered with copper tape; the tape is placed wherever the printed graphics on the top sheet indicate the user should press. The other part of this process was connecting the jumper wires/alligator clips to the breadboard. Once the wires are connected to both the press pads and the breadboard, they are connected via pins to the Arduino; I used pins 2, 6, and 8. Lastly, I added high-value resistors to each of the sensor pins (to affect drain time) and connected all of them to pin 4. Pin 4 is how the Arduino sends a signal out to each of the sensors, while the three other pins receive the data. Luckily, there is a CapacitiveSensor library that does the math of listening for changes in the signal sent out.
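The sketch's decision logic can be modeled in Python; the threshold value and the pin-to-animal assignments below are illustrative, since the CapacitiveSensor library reports readings in arbitrary units that vary with the build:

```python
# A Python model of the Arduino sketch's decision logic. The threshold,
# pin numbers, and animal assignments here are illustrative only; the
# CapacitiveSensor library returns arbitrary-unit readings per sensor.

THRESHOLD = 1000  # a reading above this counts as a touch
PAGES = {2: "cow", 6: "duck", 8: "dog"}  # sense pin -> animal sound

def touched_animals(readings):
    """Given {pin: reading}, return the animals whose pads are pressed."""
    return [PAGES[pin] for pin, value in sorted(readings.items())
            if value > THRESHOLD]

print(touched_animals({2: 40, 6: 2300, 8: 15}))  # only the duck pad is pressed
```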

I also tested a few different materials: conductive copper tape and conductive copper paint. Both worked well. The copper tape was fastest to prototype, but the paint allowed for more creativity.



Goal: To create a wireless notification device. The mailbox in my apartment building is on the first floor. However, I live on the fourth floor. It takes extra effort to go downstairs to check the mailbox only to discover that nothing has arrived. So, this project is an Arduino Bluetooth-enabled device that sends a notification to my desktop and my cell phone that the mailbox has been opened, thus alerting me to the presence of mail.

  • Jumper wires
  • Arduino Nano
  • Bluetooth module (HC-06)
  • 1 kΩ, 2 kΩ, and 330 Ω resistors
  • LED
  • PIR sensor
  • 9V battery
  • Programming: Arduino IDE and Python 3

How it works: The Arduino Nano is connected to the HC-06 through the RX-TX and TX-RX pins. I used a voltage divider because the Arduino operates at 5V, but the RX pin on the HC-06 can only accept 3.3V; the 1 kΩ and 2 kΩ resistors divide the Arduino's 5V TX signal down to about 3.3V (5V × 2k/(1k + 2k) ≈ 3.3V). Then I connected the PIR sensor: one side to GND, the other to PWR, and its output to pin 2. Finally, I connected the LED to pin 7 for testing. The code is fairly simple. The PIR is a digital connection, so it only reads HIGH or LOW. If the PIR is HIGH, the sketch turns the LED on and prints 1 to the serial monitor; if the PIR is LOW, it turns the LED off and prints 0. It only runs these statements after the PIR has first detected movement. Lastly, in order for my serial port to communicate with my desktop and cellphone, I used a push notification API over my computer’s Wi-Fi connection. To do this, I wrote a Python 3 script. The script waits for the HC-06 to print to the serial monitor; once it does, it sends a push message through the API to my Pushover service account, which sends a push notification to any device on my network (in this case, my phone and desktop).
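A minimal sketch of such a notifier script, assuming the pyserial and requests packages; the port name, baud rate, and Pushover token/user keys are placeholders, not the project's actual values:

```python
# Sketch of the notification script. The serial port, baud rate, and the
# Pushover credentials below are placeholder assumptions.

def should_notify(line, was_open):
    """Notify only on a 0 -> 1 transition, so one opening = one push."""
    is_open = line.strip() == "1"
    return (is_open and not was_open), is_open

def run(port="/dev/ttyUSB0", token="APP_TOKEN", user="USER_KEY"):
    import requests
    import serial  # pyserial

    conn = serial.Serial(port, 9600)
    was_open = False
    while True:
        line = conn.readline().decode("ascii", errors="ignore")
        notify, was_open = should_notify(line, was_open)
        if notify:
            # Pushover's messages endpoint takes the app token, user key,
            # and message text as form fields.
            requests.post("https://api.pushover.net/1/messages.json",
                          data={"token": token, "user": user,
                                "message": "The mailbox was opened!"})
```

Keeping the transition check in its own small function means the Arduino can print 1 on every loop while the PIR is HIGH without flooding the phone with duplicate notifications.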