How it Works

The first version of the project, started in late 2009, uses a webcam to capture video, a laptop to process it, and video glasses to replace the user's vision.

The program that processes the video uses color averaging and blob detection to determine which colors lie outside an environmental average. This can be thought of as reverse color tracking: instead of tracking a chosen color, the environment itself is modeled, and the colors that don't belong there are tracked. Each area found to lie outside a given threshold is blocked out with the average color of its own area. In this process, the visual complexity of advertising is abstracted into blocks of its own color.
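The averaging-and-threshold step described above might be sketched as follows in plain Java (the host language of Processing). This is an illustrative reconstruction, not the project's actual code: the class name, the grayscale simplification, and the fixed block grid are all assumptions made for brevity.

```java
// Illustrative sketch of "reverse color tracking": blocks whose average
// deviates too far from the scene-wide average are assumed to not belong
// (e.g. an advertisement) and are flattened to their own average color.
// Grayscale values stand in for full RGB to keep the example short.
public class ReverseColorTrack {
    static int[][] censor(int[][] img, int block, int threshold) {
        int h = img.length, w = img[0].length;

        // 1. Compute the scene-wide (environmental) average.
        long sum = 0;
        for (int[] row : img)
            for (int v : row)
                sum += v;
        int sceneAvg = (int) (sum / (long) (h * w));

        int[][] out = new int[h][w];
        for (int y = 0; y < h; y += block) {
            for (int x = 0; x < w; x += block) {
                // 2. Compute this block's own average.
                long bsum = 0;
                int n = 0;
                for (int by = y; by < Math.min(y + block, h); by++)
                    for (int bx = x; bx < Math.min(x + block, w); bx++) {
                        bsum += img[by][bx];
                        n++;
                    }
                int blockAvg = (int) (bsum / n);

                // 3. If the block lies outside the threshold, replace its
                //    pixels with the block average; otherwise keep them.
                boolean outlier = Math.abs(blockAvg - sceneAvg) > threshold;
                for (int by = y; by < Math.min(y + block, h); by++)
                    for (int bx = x; bx < Math.min(x + block, w); bx++)
                        out[by][bx] = outlier ? blockAvg : img[by][bx];
            }
        }
        return out;
    }
}
```

Run on a mostly uniform frame, blocks matching the background pass through unchanged, while a block of atypical colors is collapsed to a single flat color, which is the "blocking" effect visible in the project's output images.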

The code is written in Processing and relies on the JMyron library.

Original image: Shinagawa Station, Tokyo