The base glass-like material is plexiglass, carved with a grid (matrix) pattern of lines to give it a distinctive look.

A silver conductive wire is glued into each carved line on the underside of the plexiglass.

This entire array of wires is connected together and acts as a single conductive surface, which is attached to a capacitive sensor that is in turn connected to our Raspberry Pi.
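The touch-detection side of this can be sketched as a small debouncing filter over raw capacitance readings. This is only an illustration, not our exact sensor code: the baseline, threshold, and debounce values are placeholder assumptions, and actual sensor access is stubbed out.

```python
# Hypothetical touch detector: compares raw capacitance samples
# against a calibrated no-touch baseline and requires a few
# consecutive high samples before reporting a touch (debouncing).

class TouchDetector:
    def __init__(self, baseline, threshold=50, debounce=3):
        self.baseline = baseline    # raw reading with no finger present
        self.threshold = threshold  # counts above baseline that count as a touch
        self.debounce = debounce    # consecutive samples required
        self._hits = 0

    def update(self, raw):
        """Feed one raw sensor sample; returns True while a touch is held."""
        if raw - self.baseline > self.threshold:
            self._hits = min(self._hits + 1, self.debounce)
        else:
            self._hits = 0
        return self._hits >= self.debounce
```

In a real loop, `update()` would be called with each sample read from the capacitive sensor over I2C or GPIO.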

This way, we can detect the touch of a finger on the glass surface.

A simple setup with just touch sensing and RGB LEDs would work, but to spice things up, we also want to find the exact coordinates of the touch. These coordinates can drive complex lighting effects using individually addressable RGB LEDs.
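Driving individually addressable LEDs from a touch coordinate needs a mapping from grid position to strip index. A minimal sketch, assuming a hypothetical row-by-row LED layout (the serpentine wiring is a common way strips like WS2812B are laid out, not necessarily ours):

```python
def xy_to_led_index(x, y, cols=16, serpentine=True):
    """Map a grid coordinate to an index on an addressable LED strip
    laid out row by row. In a serpentine layout, odd rows run
    right-to-left because the strip snakes back and forth."""
    if serpentine and y % 2 == 1:
        return y * cols + (cols - 1 - x)
    return y * cols + x
```

With this mapping, lighting the LED under the finger is just `strip[xy_to_led_index(x, y)] = color`.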

To achieve this, we place a camera below the glass surface to track the finger's position and movements. A normal Pi camera's field of view is so narrow that the box would have to be quite tall to capture the entire glass surface, so we used a wide-angle fisheye camera instead, which keeps the box compact. We also tilted the glass slightly to increase the effective field of view further.
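The trade-off between field of view and box height is simple trigonometry: the minimum camera distance is half the surface width divided by the tangent of half the horizontal FOV. A quick sketch (the 40 cm surface width is an illustrative value, not our actual dimensions):

```python
import math

def min_camera_distance(surface_width, fov_degrees):
    """Minimum distance below the glass at which a camera with the
    given horizontal field of view sees the full surface width."""
    half_fov = math.radians(fov_degrees) / 2
    return (surface_width / 2) / math.tan(half_fov)
```

For example, a 40 cm wide surface needs roughly 33 cm of depth with a ~62 degree FOV, but only a few centimetres with a ~160 degree fisheye, which is why the fisheye lets the box shrink so much.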

The camera is connected to the Pi, which processes the live feed and identifies the finger's position and movements using image processing (Python + OpenCV). Reliably detecting the finger's edges was challenging, since the wires running in a matrix structure produce lots of edges of their own.
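One standard way to suppress the static wire grid is frame differencing against a background frame: the grid appears in both frames and cancels out, leaving the moving finger as the dominant blob. A minimal NumPy sketch of that idea (our real pipeline uses OpenCV; the threshold here is an assumed value):

```python
import numpy as np

def finger_centroid(background, frame, threshold=40):
    """Locate the finger by differencing the live frame against a
    no-finger background frame. The static wire grid cancels out in
    the difference, leaving the finger blob. Returns (x, y) or None."""
    diff = np.abs(frame.astype(int) - background.astype(int))
    mask = diff > threshold
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())
```

In OpenCV the same idea is usually expressed with `cv2.absdiff`, a threshold, and contour extraction on the resulting mask.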

After identifying the finger position, we use it to light the LEDs accordingly, with different effects for touch, drag, etc., such as LEDs that follow the finger.
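An effect like "LEDs following the finger" boils down to assigning each LED a brightness based on its distance from the touch point. A hedged sketch of one such falloff (the linear fade and radius are arbitrary choices for illustration):

```python
def trail_brightness(led_positions, finger_pos, radius=3.0):
    """Brightness (0..1) for each LED: full at the finger position,
    fading linearly to zero at `radius` grid units away, so the lit
    region tracks the finger as it drags across the glass."""
    fx, fy = finger_pos
    out = []
    for x, y in led_positions:
        d = ((x - fx) ** 2 + (y - fy) ** 2) ** 0.5
        out.append(max(0.0, 1.0 - d / radius))
    return out
```

Recomputing this every frame with the latest finger position produces a glow that smoothly follows a drag.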

The finger position and movements are also streamed to a computer over Wi-Fi. Our software on the computer receives these coordinates and moves the pointer accordingly.
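Streaming coordinates like this is a natural fit for small UDP datagrams: latency stays low, and a dropped packet just means skipping one position update. A minimal sketch, assuming a JSON message format and port of our own choosing (not a documented protocol):

```python
import json
import socket

def encode_touch(x, y, fingers=1):
    """Serialize one touch event; the desktop listener decodes it and
    moves the pointer (e.g. with a library such as pyautogui)."""
    return json.dumps({"x": x, "y": y, "fingers": fingers}).encode()

def send_touch(sock, addr, x, y, fingers=1):
    """Fire-and-forget one touch event as a UDP datagram."""
    sock.sendto(encode_touch(x, y, fingers), addr)
```

The receiving side is a loop around `recvfrom()` that decodes the JSON and calls the pointer-movement API.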

Multi-finger gestures are also recognized, to capture scrolls and similar actions. A three-finger gesture can also be used to change the LED colors.
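A very simple gesture classifier over two consecutive frames of touch points might look like the sketch below. The exact rules (three fingers for color change, two fingers moving together for scroll) follow the description above, but the thresholds and return labels are illustrative assumptions:

```python
def classify_gesture(prev_points, curr_points, move_eps=1.0):
    """Classify a gesture from touch points in two consecutive frames:
    three or more fingers => color change; two fingers moving together
    vertically => scroll; a single finger => pointer move/drag."""
    n = len(curr_points)
    if n >= 3:
        return "color_change"
    if n == 2 and len(prev_points) == 2:
        # average vertical movement of both fingers between frames
        dy = sum(c[1] - p[1] for p, c in zip(prev_points, curr_points)) / 2
        if abs(dy) > move_eps:
            return "scroll_down" if dy > 0 else "scroll_up"
    if n == 1:
        return "move"
    return "none"
```

A production version would also need to match fingers across frames (nearest-neighbour association) before comparing positions.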