
Light Painting with VR

This project combines traditional long exposure light painting with LEDs controlled from inside a virtual reality painting application


Light painting is done with long exposure photography of moving light sources. With enough skill and meticulous planning, the photographer can weave the resulting light trails into a stunning picture. 

Two of the most challenging aspects of this technique are:

  • the almost completely dark environment during the exposure
  • the absence of light trails in our vision

These challenges make it very hard to create accurate light trails by hand, as the visual feedback is simply not there: you cannot see the light strokes that you have just traced. Light painting artists circumvent this lack of control by using other methods to create the light trails, such as generative light sources (e.g. LEDs attached to a rotating bicycle wheel or an LED strip ...), static light sources, and predefined paths with visual/tactile aids to guide them in the darkness. Some excellent results are also achieved with free-form hand gestures that do not require accurate control.

The project presented here provides a tool to overcome these challenges: it uses virtual reality to let the artist see, and control with a high degree of accuracy, what is being painted into the long exposure photograph.

An overview of the steps required to reproduce the project is given below. The steps assume that you are comfortable setting up and using an HTC Vive and a Raspberry Pi, as well as setting up and doing basic long exposure photography. Being able to adapt some code in the source files will also come in handy. There is room to make the guide and source code more user friendly, which is something I intend to do as time allows.

  • 1 × Hardware: HTC Vive VR headset + 2 controllers + 2 tracking base stations
  • 1 × Hardware: PC (Nvidia GTX 1080, i7 6700K, 32 GB RAM)
  • 1 × Optional Hardware: Pimoroni Unicorn HAT HD (LED array) Offers more LEDs (16x16), which gives a finer resolution for the light strokes. The color tends to be more washed out, though (https://shop.pimoroni.com/products/unicorn-hat-hd)
  • 1 × Optional Hardware: Pibow case for the Pi (https://shop.pimoroni.com/products/pibow-zero-w)
  • 1 × Optional Hardware: Tripod This makes it easier to steady the camera for the shot

(Only a selection of the 13 project components is listed here.)

  • First project log entry

    fhernand • 02/20/2018 at 00:53

    I finally got around to writing a short guide, both to answer some questions and to make it possible for others to reproduce this project. The current state of the project is barely stable, and many adjustments and options require changing the source code directly. But it is usable and can produce very beautiful results. I hope you enjoy it as much as I have, and I would love to see what you can come up with.

    As time allows I will try to make things more user friendly, and others are welcome to contribute as well. 


  • 1
    Preparing the server software
    1. Install node.js
    2. Clone the A-Painter socket server repo into a directory (e.g. C:\a-painter-socket-server)
    3. In the new directory, run
      npm install

      to install the missing dependencies. 

    4. Clone A-Painter into a directory inside the directory created above (e.g. C:\a-painter-socket-server\a-painter)

    You can now run the A-Painter application by entering the following URL in a WebVR-enabled browser (such as those listed here: https://webvr.rocks/):

    http://localhost:3002/?room=theRoom

    (Note that currently both port and room name are hardcoded)
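
    The room-based relay logic itself lives in the a-painter-socket-server repository. Purely as an illustration of the idea, the sketch below shows how such a relay could look with Express and socket.io. Only the port (3002) and the default room name ('theRoom') are taken from the URL above; the 'stroke' event name and its payload are hypothetical placeholders, not the repo's actual identifiers.

      // Minimal sketch of a room-based relay (NOT the actual
      // a-painter-socket-server code). 'stroke' is a placeholder event name.
      const express = require('express');
      const http = require('http');
      const { Server } = require('socket.io');

      const app = express();
      app.use(express.static('a-painter'));   // serve the cloned A-Painter build

      const server = http.createServer(app);
      const io = new Server(server);

      io.on('connection', (socket) => {
        // Each client (browser, Raspberry Pi) joins the room from its query string
        const room = socket.handshake.query.room || 'theRoom';
        socket.join(room);

        // Forward brush data to every other client in the same room
        socket.on('stroke', (data) => {
          socket.to(room).emit('stroke', data);
        });
      });

      server.listen(3002, () => console.log('Listening on http://localhost:3002'));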

  • 2
    Preparing the client software

    You can connect your Pi to the A-Painter application as follows:

    1. If you need to set up a new distro for your Pi, you can follow a guide such as this one: https://www.w3schools.com/nodejs/nodejs_raspberrypi.asp
    2. Clone the VR LED painter socket client repo into a directory
    3. In the new directory, run
      npm install

      to install the missing dependencies. 

    4. Once the server is started as described above, start the following Node application (for the Unicorn pHAT):

      sudo node app.js

      (Use apphd.js for the Unicorn HAT HD. Note that currently the server host URL is hardcoded in the .js file.)

    The LED array should now light up according to the brush information being used in A-Painter.
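
    For orientation, here is a rough sketch of what such a client could look like: a socket.io client that receives brush data and maps it onto the LED grid. The 'stroke' event name, its payload shape, the server address, and the setPixel/show helpers are hypothetical placeholders; the real app.js and apphd.js use the repo's own message format and talk to the Unicorn pHAT / HAT HD driver directly.

      // Sketch of the Pi-side client idea (NOT the actual app.js).
      // Event name, payload shape and LED helpers are placeholders.
      const io = require('socket.io-client');

      // Server address and room are hardcoded, as noted above (placeholder values here)
      const socket = io('http://localhost:3002', { query: { room: 'theRoom' } });

      const WIDTH = 16;    // 16x16 for the Unicorn HAT HD, 8x4 for the Unicorn pHAT
      const HEIGHT = 16;

      socket.on('stroke', ({ x, y, r, g, b }) => {
        // Map a normalised brush position (0..1) onto the LED grid
        const px = Math.min(WIDTH - 1, Math.floor(x * WIDTH));
        const py = Math.min(HEIGHT - 1, Math.floor(y * HEIGHT));
        setPixel(px, py, r, g, b);
        show();
      });

      // Stub LED helpers: the real code drives the Unicorn pHAT / HAT HD hardware
      function setPixel(x, y, r, g, b) { /* write the color into a pixel buffer */ }
      function show() { /* flush the buffer to the LEDs */ }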

  • 3
    Making the long exposure shot

    Now that you can render light strokes, you can also make the long exposure shot. If you have no experience with this, you can follow a guide such as this one: https://www.digitalrev.com/article/how-to-setup-your-camera-for-long-exposure-photography

    Some other tips:

    • Keep in mind that using too many strokes during your shot can affect the performance of the A-Painter application
    • You can drag and drop .obj models and pictures onto the browser window where A-Painter is running and use them as guides during your painting session.
    • I recommend using OpenVR Advanced Settings (https://github.com/matzman666/OpenVR-AdvancedSettings) to adjust your painting environment, especially when using pictures and models as guides, as it lets you place them where you need them

