
Project Log 6: These AIs are out of control.

A project log for DIY Full-body VR Rigid Haptic Suit

Cheap piezoelectric crystals could be used both as movement sensors and vibrational haptic simulators.

FulanoDetail • 12/17/2022 at 20:13 • 0 Comments
Today is Saturday, December 17th, 16:57, Brasília time.

Well, sorry for the silence, I'm trying my best.

So, well, there is an AI called "ChatGPT", a chatbot that can code for you.

I can literally ask it for equations about temperature or electricity generation, and this thing always has an answer. It is madness!

Anyway, I started asking this AI for suggestions on how to program the Piezoelectric Full-body VR Suit, and it is giving me interesting answers.


First interaction:

"Yes, it is possible to use piezoelectric buzzers as a touch or pressure sensor and use the resulting electric pulses to control a 3D model in a game like VRChat. Here is a high-level overview of the steps you would need to follow to achieve this:
  1. Set up the piezoelectric buzzer: First, you would need to set up the piezoelectric buzzer and connect it to a microcontroller or computer that can read and process the electric pulses generated by the buzzer. Depending on the specific hardware you are using, you may need to write code to read and interpret the electric pulses from the buzzer.
  2. Map the electric pulses to body movements: Next, you would need to map the electric pulses generated by the piezoelectric buzzer to specific body movements in the 3D model. This could be done by creating a lookup table or using machine learning algorithms to train a model to associate specific patterns of electric pulses with specific body movements.
  3. Integrate the piezoelectric buzzer with VRChat: Finally, you would need to integrate the piezoelectric buzzer and the code you have written to interpret the electric pulses with VRChat. This could be done using VRChat's API or by writing a plugin for VRChat that connects to the piezoelectric buzzer and uses the electric pulses to control the 3D model.

I hope this helps! Let me know if you have any questions or if you would like more specific guidance on any of these steps."
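
Just to get an idea of what step 1 would look like on the microcontroller side, I sketched this little test myself (assuming a MicroPython board like the Raspberry Pi Pico; the pin number and the threshold are placeholders I made up, not something the AI gave me):

from machine import ADC
import time

# Piezo disc wired to an ADC pin (GP26 here is a placeholder; use whatever pin the suit actually has)
piezo = ADC(26)
THRESHOLD = 20000  # raw 16-bit reading that counts as a "press"; would have to be tuned by hand

while True:
    reading = piezo.read_u16()  # value between 0 and 65535
    if reading > THRESHOLD:
        # one line per event, so the computer can read it with ser.readline()
        print("tap:" + str(reading))
    time.sleep_ms(10)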


Second interaction:

"Yes, it is possible to use piezoelectric sensors to detect touch or pressure and translate that into body movement in a 3D model inside a game like VRChat. Here is an example of how you could approach this problem:
Set up the piezoelectric sensor to detect touch or pressure and output an electric pulse. You will need to follow the manufacturer's instructions for connecting and configuring the sensor.
Use a microcontroller, such as an Arduino, to read the electric pulses from the piezoelectric sensor and convert them into digital signals.
Connect the microcontroller to a computer via a USB connection.
Write a program that runs on the computer and communicates with the microcontroller over the USB connection. This program should receive the digital signals from the microcontroller and interpret them as body movement.
Use the VRChat API to control the 3D model in the game based on the body movement data received from the microcontroller.
It's important to note that this is just one possible approach to implementing this functionality, and you may need to adjust the details depending on your specific requirements and hardware setup. Let me know if you have any further questions or need more specific guidance.
-----
import serial
import VRChatAPI

# Set up a connection to the microcontroller over USB
ser = serial.Serial('/dev/ttyUSB0', 9600)

# Create a VRChatAPI instance
api = VRChatAPI.VRChatAPI()

# Set up the 3D model's initial positions and orientations for each limb
head_position = [0, 0, 0]
head_orientation = [0, 0, 0]
neck_position = [0, 0, 0]
neck_orientation = [0, 0, 0]
thigh_position = [0, 0, 0]
thigh_orientation = [0, 0, 0]
leg_position = [0, 0, 0]
leg_orientation = [0, 0, 0]
arm_position = [0, 0, 0]
arm_orientation = [0, 0, 0]
forearm_position = [0, 0, 0]
forearm_orientation = [0, 0, 0]
thorax_position = [0, 0, 0]
thorax_orientation = [0, 0, 0]
abdomen_position = [0, 0, 0]
abdomen_orientation = [0, 0, 0]

# Main loop to continuously read data from the microcontroller and update the model's positions and orientations
while True:
    # Read a line of data from the microcontroller (decode the raw bytes into text)
    data = ser.readline().decode().strip()

    # Parse the data to extract the body movement information for each limb
    # For example, if the data is formatted as "head:10,20,30 neck:40,50,60" you could use the following code:
    parts = data.split()
    for part in parts:
        name, coords = part.split(':')
        x, y, z = map(int, coords.split(','))
        if name == 'head':
            head_position = [x, y, z]
        # ...the other limbs (neck, thigh, leg, arm, forearm, thorax, abdomen) would follow the same pattern
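
Since I can't test the serial connection yet, the parsing part can at least be tried on its own without the suit plugged in. This is a little self-contained test I made, assuming the "head:10,20,30 neck:40,50,60" format the bot suggested:

def parse_line(line):
    # Turn "head:10,20,30 neck:40,50,60" into {"head": [10, 20, 30], "neck": [40, 50, 60]}
    pose = {}
    for part in line.split():
        name, coords = part.split(':')
        pose[name] = [int(value) for value in coords.split(',')]
    return pose

# Quick test with a fake serial line:
print(parse_line("head:10,20,30 neck:40,50,60"))
# prints: {'head': [10, 20, 30], 'neck': [40, 50, 60]}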

Third interaction:

Me:

"Could you change the code in such a way that there are "boxes" where inside it would have the electric signals that the piezoelectric sensors would emit, and the initial position of each limb was already stablished to be in a specific set of initial coordinates?
These coordinates would be "z", "x" and "y", and would be in "position", "orientation" and "scale".
Each limb would be listed, but also have the easy option of increasing the amount of boxes that can be filled accordingly to the user.

The limbs would count as "points of articulation"; on the skeleton of every rigged 3D model there is a point where the respective limb is moved.

For example, the point of articulation of an arm is on the shoulder.

So, the points of articulation are, for now, the following: 2 eyes, 1 head, 1 upper lip, 1 lower lip, 1 chin, 1 upper neck, 1 lower neck, 2 shoulders, 2 elbows, 2 wrists, 2 thumbs, 2 index fingers, 2 middle fingers, 2 ring fingers, 2 little fingers, 1 thorax, 1 abdomen, 2 thighs, 2 knees, 2 legs and 2 feet.

There would be more than 10 piezoelectric sensors around the body that need to be taken into consideration when moving the skeleton of the humanoid 3D model; however, it would also be necessary to have an easier way of listing more of these sensors in the code if needed."


The chatbot's answer:


import serial
import VRChatAPI
# Set up a connection to the microcontroller over USB
ser = serial.Serial('/dev/ttyUSB0', 9600)
# Create a VRChatAPI instance
api = VRChatAPI.VRChatAPI()
# Set up a dictionary to store the initial position and orientation of each limb
initial_positions = {
    "eyes": [[0, 0, 0], [0, 0, 0]],
    "head": [[0, 0, 0], [0, 0, 0]],
    "upper_lip": [[0, 0, 0], [0, 0, 0]],
    "lower_lip": [[0, 0, 0], [0, 0, 0]],
    "chin": [[0, 0, 0], [0, 0, 0]],
    "upper_neck": [[0, 0, 0], [0, 0, 0]],
    "lower_neck": [[0, 0, 0], [0, 0, 0]],
    "shoulders": [[0, 0, 0], [0, 0, 0]],
    "elbows": [[0, 0, 0], [0, 0, 0]],
    "wrists": [[0, 0, 0], [0, 0, 0]],
    "thumbs": [[0, 0, 0], [0, 0, 0]],
    "index_fingers": [[0, 0, 0], [0, 0, 0]],
    "middle_fingers": [[0, 0, 0], [0, 0, 0]],
    "ring_fingers": [[0, 0, 0], [0, 0, 0]],
    "little_fingers": [[0, 0, 0], [0, 0, 0]],
    "thorax": [[0, 0, 0], [0, 0, 0]],
    "abdomen": [[0, 0, 0], [0, 0, 0]],
    "thighs": [[0, 0, 0], [0, 0, 0]],
    "knees": [[0, 0, 0], [0, 0, 0]],
    "legs": [[0, 0, 0], [0, 0, 0]],
    "feet": [[0, 0, 0], [0, 0, 0]],
}
# Main loop to continuously read data from the microcontroller and update the model's position and orientation
while True:
    # Read a line of data from the microcontroller (decode the raw bytes into text)
    data = ser.readline().decode().strip()
    # Parse the data to extract the body movement information
    # For example, if the data is formatted as "limb:eyes x:10 y:20 z:30" you could use the following code:
    limb, x, y, z = data.split()
    limb = limb.split(":")[1]  # extract the limb name from the data
    x = int(x.split(":")[1])   # extract the x coordinate from the data
    y = int(y.split(":")[1])   # extract the y coordinate from the data
    z = int(z.split(":")[1])   # extract the z coordinate from the data
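
By the way, what I was actually imagining with the "boxes" looks more like this sketch I wrote myself (so this is my own assumed layout, not ChatGPT's): every point of articulation gets position, orientation and scale, and the sensors sit in a plain list, so adding one more is just appending another entry.

ARTICULATION_POINTS = [
    "left_eye", "right_eye", "head", "upper_lip", "lower_lip", "chin",
    "upper_neck", "lower_neck", "left_shoulder", "right_shoulder",
    # ...and so on for the elbows, wrists, fingers, thorax, abdomen,
    # thighs, knees, legs and feet
]

def make_point():
    # every point starts at the origin; the real initial pose would be
    # copied from the rigged 3D model's skeleton
    return {
        "position":    {"x": 0.0, "y": 0.0, "z": 0.0},
        "orientation": {"x": 0.0, "y": 0.0, "z": 0.0},
        "scale":       {"x": 1.0, "y": 1.0, "z": 1.0},
    }

skeleton = {name: make_point() for name in ARTICULATION_POINTS}

# Each "box" holds one sensor's latest electric signal plus the
# articulation point it drives; adding a sensor is just appending a dict.
sensor_boxes = [
    {"id": 0, "limb": "left_shoulder", "signal": 0},
    {"id": 1, "limb": "right_shoulder", "signal": 0},
    # add as many as the suit needs here...
]

That way, listing more sensors is just adding one more line to the list, which is what I asked for.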

Well, I would need to establish the source of '/dev/ttyUSB0' and the "VRChatAPI instance" (whatever the heck that means), but I think it is a good start (I hope).

I don't know where to go from here, but I will need to ask for more modifications to the code, especially since I didn't specify which 3D program this would be for.

But I could just say "hey chatbot, convert this code to X program's code" and things would be easier.

However, that's the thing: if you ask "hey, how much energy would I need to throw a rock from a catapult", the chatbot will gladly tell you "X amount of energy"; but if you ask a human, you would get "well, this is a complicated one, I would suggest you read these books, but it depends on the distance X from the axis to the basket where the rock sits, the elasticity Y of the wood, the torque Z, the speed W, etc., etc., etc.".

So, the bot will only answer within the parameters you give it, and you will only know the right parameters to give it if you take the time to study and understand the subject.
