Metaverse Lab

Exploring processes and infrastructure for building physically persistent mixed reality spaces.

I created this project to explore and share relevant information for building the Metaverse: a collective shared space, created by the convergence of virtually enhanced physical reality and physically persistent virtual space, including the sum of all virtual worlds, augmented reality, and the internet.
Just open http://alusion.net/ in JanusVR to see what I mean.

JanusVR: Software that brings all of the web into Virtual Reality

Dr. James McCrae, inspired by Neal Stephenson's novel Snow Crash, which detailed the Metaverse, built an engine that allows a spatial walk through the internet. The analogy is that webpages are rooms, and links connect rooms via portals (doorways which seamlessly connect rooms).

To be precise about the meaning of the name Janus: it refers to the portals used to interconnect rooms. Like the Roman god Janus, a portal is a single object with two faces that can peer into two separate spaces.

By embedding some XML within an existing HTML file, Janus can read the content and arrange it in particular patterns on pre-defined geometry. Here's an example of a simple app for implementing an ADF (Area Definition File) and 360 photosphere:

<html>
<head>
<title>Example Room</title>
</head>
<body>
<FireBoxRoom>
<Assets>
<AssetObject id="home" src="5.obj.gz" mtl="5.mtl" />
<AssetObject id="cam" src="PHOTOSPHEREinFacing.obj" tex0="PANO.jpg" />
</Assets>
<Room use_local_asset="room_plane" visible="false" pos="-9.905001 0.034 7.167" xdir="-0.906309 0 -0.422616" ydir="0 1 0" zdir="0.422616 0 -0.906309" run_speed="5" default_sounds="false">
<Object id="home" pos="-4.7 0 -0.9" lighting="false" blend_src="one_minus_constant_color" blend_dest="src_color" />
</Room>
</FireBoxRoom>
</body>
</html>

All it takes to start creating a VR site is placing the FireBoxRoom tag within the body tag of an existing HTML file. The tags can also be encapsulated within standard HTML comments: Janus will still detect the FireBoxRoom, while other browsers will ignore the content and prevent text from leaking out. By opening this page with Janus, you will see the ADF (Tango scan) and can spawn the photosphere by toggling edit mode, right-clicking anywhere, and clicking to set it. The built-in editor makes it easy to tweak the values within the DOM, and real-time collaboration can be accomplished.
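
To illustrate the comment trick, here's a minimal sketch of a comment-wrapped room; the asset names below are placeholders rather than files from this project:

<html>
<body>
<!--
<FireBoxRoom>
<Assets>
<AssetObject id="scan" src="scan.obj.gz" mtl="scan.mtl" />
</Assets>
<Room>
<Object id="scan" pos="0 0 -5" lighting="false" />
</Room>
</FireBoxRoom>
-->
<p>Ordinary browsers render only this paragraph; Janus reads the FireBoxRoom inside the comment.</p>
</body>
</html>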

AVALON: Anonymous Virtual/Augmented Local Networks

Inspired by projects like Freifunk and PirateBox, AVALON is like a wireless dead drop with the capability to pair with other nodes to form a peer-to-peer mixed reality mesh net. The vision I see in AVALON and the reason I am exploring cheap hardware for Virtual Reality is to build scalable, secure, decentralized infrastructure for a multidimensional internet.

Multiple users wirelessly connected to an AVALON node are present and are able to interact in real time within the shared virtual space.

Freifunk: German; a word-for-word translation is "free radio", though "free wireless networking" is more appropriate. Freifunk is part of the international movement for open wireless radio networks.

AVALON is designed to be private and secure. No logins are required and no user data is logged. Users remain anonymous and the system is purposely not connected to the internet in order to subvert tracking and preserve user privacy. The concept is similar to USB dead drops: an anonymous, offline, peer-to-peer file-sharing network in a public space. Essentially the node is a wireless version of a USB dead drop (filesystem + web server) that allows a user to connect to the WiFi hotspot and chat anonymously, upload and download content, post on an imageboard, and stream media.

AVALON allows for immersive and multiuser spatial walkthroughs of peer-to-peer networks in a 3D graphics environment.

The future is already here, it's just not very evenly distributed.
http://piratebox.cc/
http://janusvr.com/
https://deaddrops.com/


AVALON [Anonymous Virtual Augmented Local Network] and other works here by alusion is licensed under a Creative Commons Attribution 4.0...


  • 1 × Raspberry Pi $35 credit card sized GNU/Linux computer.
  • 1 × Class 10 SD card At least 4 GB, but I recommend getting a 16 GB card. Make sure it's Class 10 SDHC!
  • 1 × USB WiFi adapter I used the TL-WN722N, list of supported dongles here: http://elinux.org/RPi_USB_Wi-Fi_Adapters
  • 1 × Ethernet cable Crimp some short cables yourself.
  • 1 × 5 volt power supply To power the Pi and Router.
  • 1 × USB Flash Drive Formatted FAT32 with a single partition. Recommend at least 16 GB.
  • 1 × OpenWRT Router TP-Link wr703n is cheap and travel size. List of supported devices: http://piratebox.cc/openwrt:hardware
  • 1 × Oculus Rift It is a VR/AR project after all
  • 1 × Micro USB to USB Male Power the OpenWRT router from the Raspberry Pi.
  • 1 × Computer with Ethernet port


  • A Year of Metaverse Lab

    alusion · 04/02/2016 at 03:21 · 0 comments

    Metaverse Engineering

    It's the dawn of the virtual age and the first high-end virtual reality headsets have begun arriving, delivering a sense of presence for the masses. I have sought to utilize distributed infrastructure and democratize the best in 3D technology with the maker spirit to help foster a global renaissance moment. It has officially been a year since I began Metaverse Lab and the vision is at last coming to life.

    As project logs started to expand beyond the average web user's attention span, I recently decided to rebuild alusion.net into a hangout place with a panoramic vision wall. All you need to do is open http://alusion.net/ using JanusVR.

    I'm figuring out how to do a mailing system for version controlling room edits. One idea might be to record a ghost of the room edits done inside Janus and then to paste the script somewhere.

    Another idea I had is to save the room edits with ctrl-s and then upload files to https://teknik.io/ and just send me the link.

    I decorated the walls with images from Metaverse Lab, arranged in a timeline. You can do this as easily as Ctrl + dragging and dropping from websurfaces. I'm working on various tracking methods for creating a Guake-style UI for use within Janus. Here's a first early test I made:

    The next test will most likely involve leap motion tracking, although I'm in no rush unless suddenly Orion gets Linux support.

    https://www.reddit.com/r/janusVR/comments/47mjg4/release_4943_leap_motion_transform_bugfix/

    // Each frame, attach one cube to each tracked hand by copying the hand's
    // position and orientation basis vectors (xdir/ydir/zdir) onto the cube.
    room.update = function(delta_time) {
    
    if (player.hand0_active) { 
        room.objects["cube"].pos.x = player.hand0_pos.x; 
        room.objects["cube"].pos.y = player.hand0_pos.y; 
        room.objects["cube"].pos.z = player.hand0_pos.z;
    
        room.objects["cube"].xdir.x = player.hand0_xdir.x;
        room.objects["cube"].xdir.y = player.hand0_xdir.y;
        room.objects["cube"].xdir.z = player.hand0_xdir.z;
    
        room.objects["cube"].ydir.x = player.hand0_ydir.x;
        room.objects["cube"].ydir.y = player.hand0_ydir.y;
        room.objects["cube"].ydir.z = player.hand0_ydir.z;
    
        room.objects["cube"].zdir.x = player.hand0_zdir.x;
        room.objects["cube"].zdir.y = player.hand0_zdir.y;
        room.objects["cube"].zdir.z = player.hand0_zdir.z;
    }
    
    if (player.hand1_active) {
        room.objects["cube2"].pos.x = player.hand1_pos.x;
        room.objects["cube2"].pos.y = player.hand1_pos.y;
        room.objects["cube2"].pos.z = player.hand1_pos.z;
    
        room.objects["cube2"].xdir.x = player.hand1_xdir.x;
        room.objects["cube2"].xdir.y = player.hand1_xdir.y;
        room.objects["cube2"].xdir.z = player.hand1_xdir.z;
    
        room.objects["cube2"].ydir.x = player.hand1_ydir.x;
        room.objects["cube2"].ydir.y = player.hand1_ydir.y;
        room.objects["cube2"].ydir.z = player.hand1_ydir.z;
    
        room.objects["cube2"].zdir.x = player.hand1_zdir.x;
        room.objects["cube2"].zdir.y = player.hand1_zdir.y;
        room.objects["cube2"].zdir.z = player.hand1_zdir.z;
    }
    
    }
    Further JanusVR JavaScript documentation can be found here: http://www.janusvr.com/js.html


    JanusWeb

    It has also been nearly a year since I was inspired to start project AVALON [Anonymous Virtual Augmented LOcal Networks]. Looking back and reading https://hackaday.io/project/5077-metaverse-lab/log/16699-digital-immortality from April 2015, one is able to see the ideas coming together. This excerpt alludes to the vision that has stayed with me over the year:

    Everything in this project must be set to scale with the physical world to be effective at delivering its vision. Access into Janus from a browser client is under development. - April 2015

    At the time I was referring to SceneVR, a project by Ben Nolan that is now built atop Mozilla's A-Frame. The fact is, asking someone to download a program such as Janus might be too much, given the average attention span of a web user.

    A webVR developer...


  • Metaverse Party

    alusion · 03/04/2016 at 22:06 · 1 comment

    Abstract Ideas

    I came to Hollywood with a plan and as doors opened that vision has steadily matured. It was late September 2015 when I was introduced to the space. I snapped this 360 with my phone:

    This is the story of a work in progress; a peek into a grand design for the ultimate experiment of connecting the digital and physical worlds together and starting a new art movement. I explained that this could be done by using a combination of live 360 and 3D scanning technology to transform the physical space into an online mixed reality world. I had the right tools to produce this vision, and so when my Tango came in I got to work scanning the space:

    Normally the Tango is meant to scan small spaces or furniture; I really tested the limits by scanning a 10,000+ sq ft space with this device. With a little practice and 15+ scans later, I had pieces that were starting to look good and was ready to texture bake in Blender:

    At this point, I felt victorious. With the space captured as a scale 3D model, everything that follows gets easier, and the astounding possibilities of an engine like Janus will be useful for tasks such as event planning. This can make a 3D modeller's job easier as well. Link to Album
    Next I wanted to experiment with the interesting neural network activity I was doing at the time:

    I learned that the paint wears thin without substantial GPU processing power available. However, from a distance the results did look good. What if the gallery itself is part of the art show, and with AVALON people can take it home with them or make it into a small-sized hologram? I had long since stopped believing in mistakes; they are opportunities to obtain a deeper understanding of the system. Link to Video

    One method I discovered that works really well for augmenting physical spaces is stylizing the texture maps. Standard Wavefront object files usually come with an .mtl file that points to a texture:

    newmtl Textured
    Ka 1.000 1.000 1.000
    Kd 1.000 1.000 1.000
    Ks 0.000 0.000 0.000
    d 1.0
    illum 2
    map_Ka lenna.tga           # the ambient texture map
    map_Kd lenna.tga           # the diffuse texture map (most of the time, it will be the same as the ambient texture map)
    map_d lenna_alpha.tga      # the alpha texture map
    
    map_Ks lenna.tga           # specular color texture map
    map_Ns lenna_spec.tga      # specular highlight component
    map_bump lenna_bump.tga    # some implementations use 'map_bump' instead of 'bump' below
    bump lenna_bump.tga        # bump map (which by default uses luminance channel of the image)
    disp lenna_disp.tga        # displacement map
    decal lenna_stencil.tga    # stencil decal texture (defaults to 'matte' channel of the image)

    You can change those map_* lines to point at whatever texture you'd like, including GIFs. Very simple stuff for people with any level of 3D modeling experience.
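
    You can also override a model's texture from the Janus markup itself, without touching the .mtl, using the tex0 attribute the same way the photosphere example near the top of this page does. A hypothetical fragment with placeholder file names:

    <Assets>
    <AssetObject id="gallery" src="gallery.obj.gz" tex0="stylized.gif" />
    </Assets>
    <Room>
    <Object id="gallery" pos="0 0 -5" lighting="false" />
    </Room>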

    Digital graffiti. This planted a seed in my mind that we'll return to later.

    I played with generative algorithms for creating textures and experimented with alpha channels. What I came up with one night is similar to shadow puppet art, but instead of a shadow it is an object that lights the stage from behind or underneath:

    Link to textures

    I began to think about the possibility of using image processing scripts combined with my WiFi data visualization art in order to create puddles where the texture of the place transforms into a painting. Imagine a droplet of information that stains the fabric of the space you're in. Now it is around Christmas 2015, and as an homage to the artist who opened the door for me, I captured and optimized a photogrammetry scan (Link to album: http://imgur.com/a/mOi7R) so he can overlook this work in progress:


    A New Year

    I took a trip to the annual Chaos Communication Congress in Hamburg to create a virtual art installation.

    You can check out pictures of it at 32c3.neocities.org or visit the link in Janus to see it in VR.

    The year is 2016 and the vision is starting to come to life. During the time I was scanning...

  • Generative Networks Pt. 2

    alusion · 02/26/2016 at 17:43 · 0 comments

    Link to Part 1

    Location Based Mixed Reality WiFi Glitch Art

    I've been really busy lately. Earlier this month, a hackerspace in Warsaw, Poland published a guide to hacking an awesome device called the Zsun, a WiFi SD card reader that's been hacked to run OpenWrt. It costs $11, has 64 MB RAM, 16 MB SPI flash, and a USB SD card reader, and is perfect for converting into an AVALON node. http://imgur.com/a/XP6oX

    When finished, it will receive power through USB and will seed its own access point or connect with other nodes in a mesh net.

    The vision I have for this art project is that the Zsun listens to ambient WiFi signals around it and extracts image data to retexture itself. We will need a script to process the images; we will use ImageMagick to create the textures for the leaves.

    Basic stuff covered in https://hackaday.io/project/4423-stereo-vision-experiments

    # Convert input image into a binary black and white stencil
    convert leaf.png matte:- | convert - stencil.png
    # Make black transparent (can also add fuzz)
    convert -transparent Black stencil.png mask.png

    # Composite the mask over the background image
    convert mask.png -channel Black -separate +channel girl.png +swap -composite out.png
    Creating the Initial Scene in Blender

    Creating a patch of land; the area could symbolize the signal range.

    Mapping the textures onto a Japanese Maple 3D model.

    Creating a simple Janus app
    <html>
    <head>
    <title>Japanese Maple</title>
    </head>
    <body>
    <FireBoxRoom>
    <Assets>
    <AssetObject id="room_dae" src="room.dae" />
    <AssetObject id="jap_maple_dae" src="jap_maple.dae.gz" />
    <AssetImage id="black" src="black.gif" tex_clamp="true" />
    </Assets>
    <Room pos="0 0 1" xdir="1 0 0" ydir="0 1 0" zdir="0 0 1" skybox_down_id="black" skybox_front_id="black" skybox_left_id="black" skybox_back_id="black" skybox_right_id="black" skybox_up_id="black">
    <Object id="room_dae" js_id="0" pos="0 -1.6 -5" scale="3.6 3.6 3.6" lighting="false" cull_face="none" collision_id="room_dae" blend_src="src_color" />
    <Object id="jap_maple_dae" js_id="1" pos="0 -1.6 -5" scale="3.6 3.6 3.6" lighting="false" collision_id="maple_COLL" blend_src="one" blend_dest="zero" />
    </Room>
    </FireBoxRoom>
    </body>
    </html>

    Link to default tree

    This is already looking better than the basic Multiplayer VR sample in the Project Tango store:

    https://play.google.com/store/apps/details?id=net.johnnylee.multiplayervr&hl=en

    No offense to Johnny Lee, he is awesome and I've always been a great fan of his work! Let's build the Metaverse together :)

    Sourcing Textures

    Web scraping and Virtual Reality are a powerful combination to visualize data and use computers to create art. A good primer and my personal recommendation for those wishing to get into scraping is Web Scraping with Python. Not only is it a lot of fun, you'll also obtain a deeper understanding of the internet and build up some very useful skills along the way.

    # Pythonpy will evaluate any python expression from the command line. Install via:
    sudo pip install pythonpy
    # Make sure you also have BeautifulSoup installed
    sudo pip install beautifulsoup4
    # scrape from imgur and then name images via sequence
    curl http://imgur.com/ | py 'map(lambda _: _.attrs["src"].partition("//")[-1], bs4.BeautifulSoup(sys.stdin).findAll("img"))' | xargs -n1 wget >/dev/null 2>&1 && ls | cat -n | while read n f; do mv "$f" "$n.jpg"; done
    The next step is to incorporate these images into our DOM.
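
    A rough sketch of how that might look in FireBoxRoom markup, using the sequentially numbered files produced by the scraper above (the ids and positions are placeholders):

    <Assets>
    <AssetImage id="scrape1" src="1.jpg" />
    <AssetImage id="scrape2" src="2.jpg" />
    </Assets>
    <Room>
    <Image id="scrape1" pos="-2 2 -5" />
    <Image id="scrape2" pos="2 2 -5" />
    </Room>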

    To be continued..



  • 1

    AVALON Raspberry Pi Arch build (RASPBERRY PI 1 ONLY):

    Using a BitTorrent client, download the PirateBox image from here.

    Extract the ArchLinuxARM-2014.10-PirateBox*.zip file and follow the Raspberry Pi SD Card Setup instructions (OS X instructions) (Windows instructions) (Linux instructions) to install the image to your SD card.

    Insert the SD card and connect the USB WiFi adapter and the FAT32-formatted USB drive to the Pi. Connect the Pi via Ethernet to your home router, then apply power and wait about 2 minutes for it to boot up.

  • 2

    Once the box is fully booted, open a terminal window (for OS X, go to Applications > Utilities > Terminal; for Windows, install and open PuTTY) and SSH into your PirateBox.

    ssh root@alarmpi
    
    The password is: root. Once you've logged in, change your password (to something you'll remember!) by using the passwd command:
    passwd
  • 3

    Now it's time to update and upgrade your packages. Arch uses pacman for its package manager. The next steps will detail upgrading your packages, upgrading the pacman database, installing dependencies for the Janus server, creating a new user, and logging in to install the Janus multi-user server.

    pacman -Syu
    pacman-db-upgrade
    pacman -Sy git tmux nodejs sshfs
    useradd -m user
    passwd user
    login user
    git clone https://github.com/lisa-lionheart/janus-server.git
    cd janus-server/
    npm install
    ./generate_key
    Note: If you receive an error like "Read-only file system", just edit /boot/cmdline.txt and put rw before rootwait

