In 2015, we began developing neuroscientific research focused on art in collaboration with the University of Houston, Tecnológico de Monterrey, and the Contemporary Art Museum of Monterrey (MARCO), resulting in scientific articles published in journals such as Frontiers in Neuroscience (2017; Zurich, Switzerland).

Currently, we work not only on scientific development in this area, but also on integrating it with interactive design.

Together with L14, a group specialized in developing audio technology and interaction design, we focus on applying state-of-the-art human-machine interfaces in other areas of interest.

Our goal is to use new creative platforms to increase the reach and impact of our efforts, and of the technology we work with.

Concept: Build an innovative interface for humans to talk to machines or machines to talk to humans.

With the NeuroLS System, we aim to provide a plug & play, interactive, and generative hardware + software interface that turns brain signals into light and sound, in a way that allows hobbyists and artists around the world to create their own audiovisual pieces through brain signals, in real time.

For this first version, we focus on building the system to control DMX universes: the user can select a range of colors for each Muse headband sensor, to be controlled by each of the brain's absolute band powers (Alpha, Beta & Gamma). Thus the DMX universe can be mapped out and controlled by brain signals.

The challenge

State-of-the-art neural signal interpretation is a thorough and time-consuming process: it typically starts with the apparently simple collection of data, continues with a preprocessing stage that includes signal filtering, and then requires complex processing techniques that, in science and research, are no trivial task at all.
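To give a sense of what even the preprocessing stage involves, here is a minimal, illustrative sketch of band-pass filtering one raw EEG channel in Python. The sampling rate, band edges, and the fake signal are assumptions made only for the example; our actual pipeline runs through the Muse tools and Pure Data.

```python
# Illustrative only: band-pass filtering one raw EEG channel.
# The 256 Hz sampling rate and the 8-12 Hz alpha band are assumptions for this example.
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(raw, low_hz, high_hz, fs=256.0, order=4):
    """Return the raw signal filtered to the [low_hz, high_hz] band."""
    nyquist = fs / 2.0
    b, a = butter(order, [low_hz / nyquist, high_hz / nyquist], btype="band")
    return filtfilt(b, a, raw)  # zero-phase filtering, so the waveform is not shifted in time

# One second of fake raw EEG (noise) standing in for a Muse channel.
raw_eeg = np.random.randn(256)
alpha_component = bandpass(raw_eeg, 8.0, 12.0)
```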

Integrating the hardware and software needed to control these systems with brain signals in real time was therefore a complex challenge to solve.

Approach to solving the problem

In scientific projects, such manual and involved techniques are appropriate and even necessary. For interactive design applications, however, it is possible to develop, apply, and take advantage of faster, more automated interpretation techniques (even if they are not the most scientifically rigorous) to produce audiovisual works that result from real-time communication between the brain and the computer.

How it works

A Muse headband collects raw EEG data from the user and sends it via Bluetooth to a computer. Software from Muse Tools processes the brain signals into absolute band powers (Alpha, Beta & Gamma) for each sensor. The NeuroLS software (a Pure Data patch) writes this data into arrays to determine the predominant brainwave for each sensor and allows the user to customize how the signal is output to a DMX-MIDI system.
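The actual patch is written in Pure Data, but the data flow can be sketched in Python. The OSC addresses and port below are assumptions for illustration only (tools that stream Muse data typically expose per-band "absolute" messages); the point is simply to receive one absolute band power per band and sensor, then pick the largest one.

```python
# Sketch of the NeuroLS data flow, not the actual Pure Data patch.
# OSC addresses, port, and message layout are assumptions for illustration.
from pythonosc import dispatcher, osc_server

SENSORS = ["TP9", "AF7", "AF8", "TP10"]          # the four Muse electrodes
BANDS = ["alpha", "beta", "gamma"]
powers = {s: {b: 0.0 for b in BANDS} for s in SENSORS}

def on_band(address, *values):
    """Store the latest absolute band power for each sensor."""
    band = address.split("/")[-1].replace("_absolute", "")
    for sensor, value in zip(SENSORS, values):
        powers[sensor][band] = value

def predominant(sensor):
    """Return the band with the highest absolute power for a sensor."""
    return max(powers[sensor], key=powers[sensor].get)

disp = dispatcher.Dispatcher()
for band in BANDS:
    disp.map(f"/muse/elements/{band}_absolute", on_band)

server = osc_server.ThreadingOSCUDPServer(("0.0.0.0", 5000), disp)
server.serve_forever()
```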

The user can also select a range of colors for each sensor of the Muse headband, each driven by one of the absolute band powers (Alpha, Beta & Gamma), so the DMX universe can be mapped out and controlled by the brain signals.
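As a sketch of that color mapping: the user's chosen color range can be interpolated by the normalized band power and written to the sensor's DMX channels. The send_dmx placeholder and the 0-to-1 normalization below are hypothetical, since the real output goes through a DMX-MIDI converter.

```python
# Hypothetical sketch: interpolate a user-chosen color range by band power
# and write it to three consecutive DMX channels (R, G, B) for one sensor.
def lerp_color(low_rgb, high_rgb, t):
    """Blend between two RGB colors; t is the normalized band power in [0, 1]."""
    t = max(0.0, min(1.0, t))
    return tuple(round(lo + (hi - lo) * t) for lo, hi in zip(low_rgb, high_rgb))

def send_dmx(channel, value):
    # Placeholder for the real DMX-MIDI output stage.
    print(f"DMX ch {channel:3d} -> {value}")

# Example: one sensor mapped to DMX channels 1-3, fading blue -> red with alpha power.
color = lerp_color((0, 0, 255), (255, 0, 0), t=0.7)   # 0.7 = normalized alpha power
for offset, value in enumerate(color):
    send_dmx(1 + offset, value)
```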

Our extra mile: Workshops -- More Developers + More Projects

The goal of our studio is to use the scientific tools that we develop to make awesome things, and we know that sharing them with more people will result in even more awesomeness. This is why, at the beginning of last year, we started offering workshops for both adults and young students, through which we share our technical and scientific knowledge with the local and national community in a hands-on learning process and give them tools to make their own creative projects.

We started with the goal of changing the world, we realized the tools were in science, and now we aim to create the extraordinary through interactive design.

Changing the world is not an easy task; to achieve it, it must be done in community.