Water Vapor Display

In early 2013 I discovered pico-projectors. I picked up a relatively inexpensive one (a Microvision SHOWWX+ with HDMI input), which sadly can no longer be purchased! It was an incredible piece of technology, well ahead of its time. The first application I came up with for it was a water vapor display: generate water vapor (fog), blow it into a curtain with laminar flow, and project an image onto it. I created a manifold of about 20 channels, with two plastic-bag air reservoirs (to 'still' the air) pressurized by two CPU fans. The manifold had a gap at the base through which water vapor could be sucked up and lifted into a curtain. After building the whole thing, well, it didn't really work. I couldn't get enough fog up into a curtain.

My Unsuccessful Vapor Display

Successful Vapor Displays

ELEX 4315 Term Project

The end of term is around the corner for 2014 and the ELEX 4315 students have done an amazing job on their term project. ELEX 4315 is an Embedded Systems course in C++, which I’ve updated to cover real-time systems and machine vision. This year the course labs built up to an automated Blueberry Sorting Machine. The system the students created used machine vision to identify ‘blue’ berries (ping-pong balls), sorting these into an accept bin while diverting all other-colored balls to a reject bin. Students developed the machine vision algorithm, microcontroller firmware, C++ control, and electro-mechanical sorting system including:

  • An Arduino microcontroller system with a custom communications protocol to accept inputs (switches / pushbuttons) and control actuators (servos) and indicators (LEDs)
  • C++ class to manage serial communication from microcontroller to PC
  • C++ class to capture webcam images and perform the decision making on accept or reject of the berries
  • C++ TCP/IP server to allow remote control of system (via another similar custom communication protocol)
  • C++ client application for remote control – modified to run as CGI
  • Web interface to C++ CGI client to control system
  • All objects multi-threaded for optimized real-time performance

Two of the fastest systems are shown below. These systems were able to sort balls at a rate of about two per second, with only one error in 45 balls in the best run.


The Golden Age of Hardware

“We Have Entered The Golden Age Of Hardware Hacking”

I couldn’t agree with the author more. Like the author, I picked up a degree in electrical engineering in the late 90’s, only to watch almost every one of my school colleagues move into the software realm. All that work on electrical theory for nothing! But understandably so: hardware was expensive to produce, while software was far less capital intensive.

I stuck with it, though. After many years in graduate school I founded Gazepoint, which produces eye-tracking systems that include hardware (electrical and mechanical) and a whole lot of software. These systems would not have been possible were it not for the advances in low-cost electronics manufacturing AND 3D printing. I happen to love 3D printing; I’ve printed everything from replacement laundry dryer knobs and sink drain plugs to commercial products such as housings and head-mounted eye-trackers.

Last year I sponsored a UBC student team that developed an incredible video game called Focalpoint. The game was amazing and they did a fantastic job, but it was all software. I’m impressed that electrical engineering produces graduates well-rounded enough to do full-bore software projects like this, but I also think we’re just entering this Golden Age of Hardware and believe students might find some real opportunities innovating in hardware.

So this year at BCIT I have two student teams working on hardware-based projects. Of course both include significant software components, but each is based around a hardware device. As the teams progress I’ll post more!



I will be attending the 4th International Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction in Seattle on September 13th, 2014. We tried very hard to get a paper done in time for the conference, but we were just too swamped with everything else on the go. So instead I might bring down one of the custom head-mounted eye-gaze tracking systems we’ve been working on to demo.

Let me know if you are coming too; I’m looking forward to seeing everyone there!



EECE496 Term Project

The UBC EECE 496 capstone engineering team did an amazing job developing Focalpoint, an innovative new video game that uses the player’s eye-gaze information as a core game dynamic. Focalpoint is one of the first games, if not the first, to use eye-gaze as a native game dynamic (and not just a mouse-cursor or touch replacement). It’s a simple game, but once you start playing, you forget about the eye-tracking and it feels like the computer just knows what you want it to do, without you having to click anywhere.

HCI Group back in action


The HCI Group has been quiet the last few years as I have been busy focusing on entrepreneurship, while my academic activities as Adjunct Professor at the University of British Columbia took a back seat. I am very happy to announce that I am finally able to restart the HCI Group and all related activities, now based at the British Columbia Institute of Technology.

As always, the focus of the HCI Group is on human-computer interaction, with an emphasis on eye-gaze tracking. In addition, the HCI Group has partnered with the research group of Dr. Julie Robillard, assistant professor at the University of British Columbia, and will be assisting on the technical side with research in neuroscience and scientific communication.

Stay tuned for more!

Craig Hennessey