• The Avatar Program

    WalkerDev · 04/23/2019 at 01:31

    Thanks to modern technology, humans have long been able to act somewhere far away while staying right where they are. If that is possible, why can't there be one user in two locations at once? This would require a user, a willing catalyst, and a connection between the two.

    The Avatar system is a program dedicated to creating said catalyst while maintaining a virtual version of the user. It is a simple yet advanced process that may become available to everyone in the near, yet far, future.

    The system is composed of 5 main components:

    - A Neural Interface that scans and analyzes the brain

    - A virtual core program that holds an AI heavily based on the user

    - The Virtual Spindles that allow said Avatar to exist in / interfere with things in the real and virtual world

    - The structural system that allows for the Avatar's body and movement functions

    and most importantly

    - The synthesis program that generates AI through learning and development over years in the virtual world (via an increased rate of time; see the sketch below) rather than from input/output data.
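
    As a rough illustration of the "rate of time increase" idea, here is a minimal Python sketch of a virtual clock that runs faster than the real one. The TIME_SCALE value and the loop are purely illustrative assumptions on my part, not part of the actual synthesis program.

```python
import time

TIME_SCALE = 1000.0          # hypothetical: 1 real second = 1000 simulated seconds
start = time.monotonic()

for _ in range(5):
    time.sleep(0.1)                          # real-world wait
    elapsed = time.monotonic() - start       # real time passed
    sim_time = elapsed * TIME_SCALE          # accelerated "virtual" clock
    print(f"real {elapsed:5.2f}s -> virtual {sim_time:8.1f}s")
```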

    It'd be good to look at each individual component and study it further. Let's start with the Neural Interface.

    - A Neural Interface that scans and analyzes the brain

          A Neural Interface (or BCI, a brain-computer interface) is a system that connects the brain to an external device, whether it be a computer, another device, or a set of servos, for example. As of right now, the most powerful BCIs are capable of enabling prosthetic use, the control of objects, and the control of some machinery.

          Most BCIs use EEG, EKG, or some other sort of biosignal sensor whose output takes the form of waves. The Vector Gear, on the other hand, uses a 3D brain-scan ring that creates a real-time brain image and plugs it into a virtual brain, which can then be assessed by entities that exist in the Horizon. These include the AIs Alpha, Beta, Iota, and Sigma, which can form much more advanced AI if needed, but that is a lecture for a different day.
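
          For context on the wave-based approach, below is a minimal sketch (not part of the Vector Gear) of how a conventional EEG-style BCI might turn a raw signal into a usable feature. The synthetic signal, sampling rate, and band limits are assumptions chosen only for illustration.

```python
import numpy as np

fs = 250                                   # assumed sampling rate in Hz
t = np.arange(0, 2, 1 / fs)
# Synthetic "EEG": a 10 Hz alpha rhythm buried in noise.
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(eeg)) ** 2   # power spectrum of the signal
freqs = np.fft.rfftfreq(eeg.size, 1 / fs)
alpha = spectrum[(freqs >= 8) & (freqs <= 12)].sum()
beta = spectrum[(freqs >= 13) & (freqs <= 30)].sum()
print("alpha/beta power ratio:", alpha / beta)   # a crude feature a wave-based BCI might use
```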

    The AIs in the Horizon are capable of registering parts of the brain, aligning them with an AI, and in some cases serving as a health indicator. Speaking of AI,

    - A virtual core program that holds an AI heavily based on the user

         Thanks to the wonder of game engines, Visual Studio, and modern research, it is possible for an item to exist in both the real world and the virtual world. For example, you could connect an Arduino to Unity, have the electricity flowing through it be represented by a color change, and only turn the color on for wires that actually have electricity running through them. This is the same approach used for the Virtual Core.
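
         The Unity side of such a setup would normally be written in C#; below is a rough Python analogue of the same idea, assuming a hypothetical Arduino sketch that prints comma-separated digital pin states over serial. The port name and message format are assumptions, not the project's actual protocol.

```python
import serial  # pyserial

# Assumed: the Arduino prints lines like "1,0,1" (one digital pin state per wire).
PORT = "/dev/ttyACM0"                       # hypothetical serial port
ON, OFF = (255, 200, 0), (40, 40, 40)       # "electricity" color vs. idle color

with serial.Serial(PORT, 9600, timeout=1) as ser:
    while True:
        line = ser.readline().decode(errors="ignore").strip()
        if not line:
            continue
        states = [s == "1" for s in line.split(",")]
        # In Unity this is where each virtual wire's material color would be updated;
        # here we simply print the color each wire should take.
        colors = [ON if powered else OFF for powered in states]
        print(colors)
```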

          The Virtual Core is a system that uses the BCI and virtual engines to create a simulated brain. At the time of this writing, there has yet to be a fully finished brain, but brain scans, Nengo AI, and a virtual brain are being implemented in Unity in an attempt to create an AI based on the user. While creating AI this way is easy, the hard part is the collection of memories and matching brain activity to be like the user's, while still having the avatar be a separate entity. This becomes especially noticeable with the next component, the Virtual Spindles.
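
          Since Nengo was mentioned, here is a minimal Nengo sketch of a simulated neural population representing a simple signal. It is only a toy stand-in for the Virtual Core; the input, neuron count, and probe settings are arbitrary choices for illustration.

```python
import numpy as np
import nengo

# A tiny spiking-neuron model: one population that represents a 1-D input signal.
with nengo.Network(label="toy virtual core") as model:
    stim = nengo.Node(lambda t: np.sin(2 * np.pi * t))   # stand-in "brain input"
    ens = nengo.Ensemble(n_neurons=100, dimensions=1)    # simulated neural population
    nengo.Connection(stim, ens)
    probe = nengo.Probe(ens, synapse=0.01)               # filtered decoded output

with nengo.Simulator(model) as sim:
    sim.run(1.0)                                         # simulate one second
    print(sim.data[probe][-5:])                          # last few decoded values
```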

    - The Virtual Spindles that allow said Avatar to exist in / interfere with things in the real and virtual world

         As the real world is overlaid with the virtual world and constantly watched by your Avatar, the AI can interact with an item as long as the user does and/or the item exists in the virtual world. This allows the AI to learn more and create more detailed bridges between neurons. This ability is most important in the first stages, where the AI is given a catalyst brain and shapes it to have more advanced bridges between neurons. In fact, this could theoretically be key to creating conscious AI.

          The connections between neurons also allow for more capabilities while subjecting the AI to several limiters...
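
          One way to read the "bridges between neurons" and "limiters" ideas above is as Hebbian-style strengthening of connections between units that are active together, capped by a hard limit. The toy sketch below is my own interpretation for illustration, not the project's actual mechanism.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
weights = np.zeros((n, n))                # "bridges" between a toy set of neurons
eta, w_max = 0.1, 1.0                     # learning rate and a hard limiter

for step in range(100):
    activity = (rng.random(n) < 0.3).astype(float)    # which neurons fired together
    weights += eta * np.outer(activity, activity)     # Hebbian: co-active pairs strengthen
    np.fill_diagonal(weights, 0.0)                    # no self-connections
    weights = np.clip(weights, 0.0, w_max)            # the "limiter" keeps growth bounded

print(weights.round(2))
```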

    Read more »