I've been making robots, computer vision systems, electronics, face recognition & AI, AR, 3D rendering, GPU and Assembly code for around 20 years!
I built a cheap 30cm-high 2-legged walking robot with an onboard Jetson TK1 computer performing GPU-accelerated face detection and video stabilization, so it could find the nearest person and walk towards them.
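A minimal sketch of the "walk towards the nearest person" idea (my assumed logic, not the robot's actual code): treat the largest detected face bounding box as the closest person, then steer to center it in the frame. The function names and thresholds here are hypothetical.

```python
# Sketch (assumptions, not the original robot code): pick the "nearest" person
# from face-detector output, assuming the largest bounding box is the closest.

def nearest_face(faces):
    """faces: list of (x, y, w, h) boxes from a face detector (e.g. OpenCV).
    Returns the box with the largest area, or None if no faces were found."""
    if not faces:
        return None
    return max(faces, key=lambda box: box[2] * box[3])

def turn_direction(face, frame_width):
    """Decide which way to walk so the chosen face ends up centered."""
    x, y, w, h = face
    face_center = x + w / 2
    offset = face_center - frame_width / 2
    if abs(offset) < frame_width * 0.1:   # close enough to center already
        return "forward"
    return "right" if offset > 0 else "left"

faces = [(40, 60, 50, 50), (200, 80, 120, 120)]   # two hypothetical detections
target = nearest_face(faces)                      # picks the 120x120 box
print(turn_direction(target, frame_width=320))    # prints "right"
```

On the real robot the boxes would come from a GPU-accelerated detector each frame; the steering decision itself stays this simple.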
For an undergraduate project, I created my own self-contained robot that could drive around a room while avoiding obstacles, using its infrared sensors, ultrasonic sonar, human speech module, and an early Atmel microcontroller.
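The obstacle-avoidance loop on a robot like this can be sketched as a simple priority scheme over the sensors (this is an assumed illustration, not the original firmware; the threshold and command names are hypothetical):

```python
# Sketch (assumed logic): combine two infrared proximity sensors with an
# ultrasonic forward-range reading to pick the next drive command.

SONAR_STOP_CM = 20  # assumed threshold: anything closer means steer away

def avoid(ir_left, ir_right, sonar_cm):
    """ir_left / ir_right: True when that infrared sensor sees an obstacle.
    sonar_cm: forward distance from the ultrasonic sonar, in centimetres."""
    if ir_left and ir_right:
        return "reverse"        # blocked on both sides, back out
    if ir_left:
        return "turn_right"     # obstacle on the left, steer away from it
    if ir_right:
        return "turn_left"
    if sonar_cm < SONAR_STOP_CM:
        return "turn_right"     # wall ahead; default turn direction is arbitrary
    return "forward"

print(avoid(False, False, 150))   # clear path: prints "forward"
print(avoid(True, False, 150))    # obstacle on the left: prints "turn_right"
```

On a microcontroller this would run as the body of the main loop, reading the sensors each iteration and driving the motors from the returned command.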
We used a reinforcement-learning neural network so a robot could learn about its surroundings instead of being pre-programmed: for example, whether an object makes noise, vibrates, can be moved, or is too heavy.
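A toy sketch of the learning idea (my own tabular Q-learning illustration, not the network we actually used): the robot tries an action on an object and the reward it gets gradually teaches it that object's properties. State names, actions, and constants here are all hypothetical.

```python
# Toy tabular Q-learning sketch (assumed illustration): the robot learns from
# reward whether an action on an object (e.g. pushing it) is worthwhile.

ALPHA, GAMMA = 0.5, 0.9   # assumed learning rate and discount factor

def q_update(q, state, action, reward, next_state):
    """One standard Q-learning step:
    Q(s,a) += alpha * (reward + gamma * max_a' Q(s',a') - Q(s,a))"""
    best_next = max(q[next_state].values()) if next_state in q else 0.0
    q[state][action] += ALPHA * (reward + GAMMA * best_next - q[state][action])

q = {"at_object": {"push": 0.0, "listen": 0.0}}
# Pushing the object moved it: a positive reward teaches "this can be moved".
q_update(q, "at_object", "push", reward=1.0, next_state="object_moved")
print(q["at_object"]["push"])   # prints 0.5
```

Repeating such updates over many interactions is what lets the robot build up knowledge of its surroundings rather than having it hard-coded.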
For the social English & Arabic speaking robot we built, I created the realtime Face Detection and Face Recognition system, which could be trained and/or tested using the Facebook social network.
I built one of the first robots that could recharge itself. I also created an interactive website that let anyone in the world control it from their web browser even with a dial-up modem connection.
A human-sized robot that could balance on 2 wheels, climb up steps, survive a 3ft fall and continue driving, and transform itself into a 4-wheeled mode (like a car), since it was designed for rugged military operations.
A humanoid robot that could speak & listen in English & Arabic, learn & recognize faces, look towards interesting items such as faces or moving regions, perform realtime video stabilization, and perform basic human motions with its face, head & arms.