
Auto tracking camera

A camera that tracks a person & counts reps using *AI*.

The source code: https://github.com/heroineworshiper/countreps

Rapidly becoming the next big thing: 1st with subject tracking on quad copters, then subject tracking on digital assistants. It's long been a dream to have an autonomous camera operator that tracks a subject. The Facebook Portal was the 1st sign lions saw that the problem was finally cracked. The problem is all existing tracking cameras are operated by services which collect the video & either sell it or report it to government agencies.

Compiling & fixing a machine vision library to run as fast as possible on a certain computer is such a monumental task, it's important to reuse it as much as possible. To simplify the task of a tracking camera, the same code is used to count reps & track a subject. The countreps program was a lot more complicated & consumed most of…

The lion kingdom started getting ideas to make a tracking camera in July 2014.  Quad copter startups were booming & tracking subjects by GPS suddenly caught on, even though it was just a rehash of the worthless results hobbyists were getting in 2008.  The lion kingdom figured it could improve on it with machine vision tracking fiducial markers.

It was terrible.  You can't make a video wearing all those markers & the picture quality wasn't good enough to reliably detect the markers.  To this day, hobbyist tracking cams are all still using chroma keying & LED's.  The lion kingdom would do better.

The next step occurred in Aug 2016 with LIDAR.



That had problems with reflections in windows & couldn't detect tilt.  It could only estimate tilt by the distance of the subject from the camera.

2018 saw an explosion in CNN's for subject tracking.  The key revelation was openpose.  That theoretically allowed a camera to track a whole body or focus in on a head, but it didn't allow differentiating bodies.  The combination of openpose & a 360 camera finally allowed a subject to be tracked in 2 dimensions, in 2019.


The problem was a 360 camera with live output was expensive & cumbersome to get working.  The live video from the cheap camera had a long lag.  The tracking camera was offset from the recording camera, creating parallax errors.

Tracking would have to come from the same camera that recorded the video. That would require a wide angle lens, very fast autofocus, & very high sensitivity.  It took another year for cameras to do the job for a reasonable price.





The EOS RP allowed wide angle lenses & had much faster autofocus than previous DSLRs.  Together with a faster laptop, the tracking system was manely doing the job.  Openpose couldn't detect the boundaries of the head, only the eye positions.  That made it point low.

The next step would be tracking a single face in a crowd of other faces.

openpose.mac.tar.xz: bits for openpose & caffe that were changed for mac (x-xz, 11.55 kB, 01/04/2019)

countreps.mac.tar.xz: the simplest demo for mac (x-xz, 1.71 kB, 01/04/2019)

countreps.c: simplest Linux demo (x-csrc, 5.34 kB, 01/02/2019)

Makefile: simplest Linux makefile (673 bytes, 01/02/2019)

  • Portrait mode with the flash & different lenses

    lion mclionhead · 06/04/2020 at 04:21

    This arrangement was the fastest to set up.

    28mm

    17mm.  Then, there was a more spaced arrangement which took longer to set up.

    There were more shadows.  For a single flash, it's better to have it closer to the camera.  The only lens to be used in practice is the 17mm at its optimum distance from the camera, but the lion kingdom put some effort into making it work with longer lenses & less optimum distances.  In testing, the most useful results still came from the 17mm.

    There were 2 different camera elevations.

    The desired material looks better at waist height, but the flash is farther from the ceiling.  Many algorithms were tried to improve the tilt tracking.  Estimating the head size was required.  The head estimate leads to a different tilt when the subject looks at the camera than when looking sideways.

    Other problems are camera motion while shooting & seeing a preview after shooting.  The tracker starts tracking the preview.  A beefed up remote control could allow the lion to freeze the tracker when showing the preview, but the same amount of manual intervention can also clear the preview before the tracker goes too far off.  In practice, the camera usually isn't moving during a photo so the preview doesn't make it move.

    The 17mm has proven to be 1 stop darker than the 28mm & 50mm.  That's why it was only $600.  Forget about discovering that from the adsense funded internet.  F stop doesn't account for light transmission efficiency, so lenses with the same f stop can have wide variations in brightness.

    Then, there was a boring test of video.

  • Replacing the bulb in the 580 EX II

    lion mclionhead · 06/02/2020 at 06:42

    The lion kingdom's 580 EX II died after 12 years.  Lions took many indoor photos with it.  

    Then, this arrived.  It behooves humans to get a bulb assembly rather than a bulb.  

    https://www.walmart.com/ip/Canon-Speedlight-580EX-II-flash-reflector-flash-tube-assembly-CY2-4229/525541142

    The bulb is very hard to replace on its own.  There was a starting guide on 

    https://joelgoodman.net/2012/07/19/flash-bulb-repair-canon-580ex-ii/

    It's essential to discharge the capacitor.  It still had 200V after 2 weeks with no batteries.

    There is a discharging hole with electrical contact inside, exposing the capacitor's + terminal.  This must be grounded through a 10k resistor to the flash ground, without touching the resistor or ground while touching the + lead.  The trick is to keep 1 paw behind your back while holding the + lead with your other paw.

    A few screws revealed the electronicals.

    The bulb assembly is on a corkscrew drive.  The corkscrew drive moves it to adjust the spread of the beam.

    The 12 year old bulb was cactus.

    4 cables connected to the assembly.

    The old bulb & silicone were liberated, after discovering the bulb was as fragile as paper.

    Then 1 end of the new bulb was soldered in before inserting it back into the enclosure.  

    This was the wrong way to insert the silicone. 

    The lion kingdom did what it could with the silicone on 1st.  The soldered end went back into the assembly.  The unsoldered end received its silicone 1st, then wire, & finally heat shrink.  The heat shrink was too long, but if the sharper turns break the wire, there's more wire from an old LCD backlight in the apartment.

    Based on the challenge of getting the silicone on, all 3 wires clearly need to be desoldered from the PCB 1st.  The wires should be soldered to the bulb without the silicone.  Then, the heat shrink should be put on.  Then, the silicone needs to be fed around the wires before soldering the wires back on the PCB.  The assembly probably doesn't need to be taken off the corkscrew drive if you have the right tweezers.

    The lenses only go on 1 way.

    Reassembling the 4 wires showed how the 580 EX II wasn't designed at all for manufacturability.  They wanted the best possible flash, no matter how expensive it was.  


    Then, the deed was done, showing what a luxurious flash it was compared to a cheap flash from 40 years ago.  

  • Tracking 2 people

    lion mclionhead · 05/28/2020 at 07:11



    This was some rare footage of 2 people doing random, intimate movements from a stationary camera.  Showing it on a large monitor & having the camera track the monitor might be good enough to test 2 people.


    Automated tracking cam with 2 subjects was a disaster. Most often, 1 animal covered part of the other animal, causing it to put the head in the top of the frame. When they moved fast, they caused a sympathetic oscillation. Setting up the test was quite involved.

    Eventually, the corner cases were whittled down.

    Openpose lost tracking when they were horizontal, but it didn't know to look to the right for the missing head either.

    When both were visible, it tended to look down.  This may be from the lack of tracking the head outline.

    When both were standing up but too different in height to fit in the frame, it tracked 1 or the other.

    A tough composition with partially visible bodies & 1 head much closer than the other made it place the smaller head where the bigger head was supposed to appear & place the bigger head at the bottom.

    Another tough one with 2 partially visible bodies.

    When both were standing & close in height, it tracked well.  Since it tracks heads, the key is to keep the heads close in height.  The saving grace is that no matter how bad the tracking got, it never permanently lost the subjects.

  • Hacking a flash battery pack to use a lipo

    lion mclionhead · 05/25/2020 at 22:52

    The flash needs to be externally mounted to keep the tracking mount as light as possible.  Also, in a high pressure photo shoot, the flash should be powered externally.  After Ken Rockwell extolled his frustrations with battery powered flashes https://www.kenrockwell.com/tech/strobes.htm, it behooved the lion kingdom to give the 580 EX II an extra boost.

    The lion kingdom invested in a cheap JJC battery pack, for just the cost of a boost converter & a cable, only to find the JJC's are sold with 1 connector type for 1 camera.  The goog reported the cables can't be bought separately.  

    https://www.amazon.com/JJC-BP-CA1-External-600EX-RT-YN600EX-RT/dp/B01GUNLQLW

    So in its haste, the lion kingdom spent $18 on a used cable from fleebay which ended up broken.

    In the meantime, the goog updated its search results 5 days later to yield a brand new $12 cable.

    https://www.amazon.com/Connecting-Replacement-JJC-Recycling-YN600EX-RT/dp/B01G8PMZ12

    The total cost of the 1 hung lo JJC ended up more than a high end battery pack, not unlike how bang good soldering irons end up costing as much as a JBC with all the accessories.  It wasn't the 1st time lions were ripped off by the fact that goog can take 5 days to perform a novel search.

    Since 2011, a drawing has circulated around the internet with the 580 EX II pinout, but nothing useful on the voltages.  Fortunately, there actually is a servicing manual for the flash.

    https://www.manualslib.com/download/379083/Canon-580exii.html

    The control signal is 0-5V with 5V being the on signal.

    The external battery pack directly feeds the mane capacitor through some diodes.  The mane capacitor normally charges to 330V (page 24) but the status LED turns green when it's 213V & red when it's 268V (page 25).  The flash MCU resets if the voltage goes above 350V.  The external battery pack boosts what the internal charger already does, but doesn't have to completely charge the capacitor.


    Interior after modifications to use a LIPO.  Manely, the 3 battery terminals have been replaced.

    Turning to the JJC battery pack, the Nikon cable has a 100k resistor from K1' to K & a 100k resistor from GND to K2.  PH+ is the high voltage.  The resistors are probably selecting the voltage.  CNT is the control signal from the flash. 

    The Canon cable has no resistors.  All the K pins are floating.  Only PH+, GND, & CNT are connected.

    A quick test with the Nikon resistors showed it makes an unstable 290-320 volts.



    All the external battery packs use 2 boost converters in parallel.  Each boost converter runs on 6V.  If they have 8 AA's, they run both boost converters.  If they have 4 AA's, they run 1 boost converter.

    The battery pack has 3 taps: 12V at B+, 6V at BM, GND at G.  It can run on 6V from B+ to BM, or 6V from BM to G.  To run it on a 12V lipo from B+ to G, a regulator has to supply the 6V tap to get both boost converters to fire.  The lion kingdom whacked on an LM7806.





    The JJC has 2 5A current limiting fuses going to the batteries.  It's essential to leave these in place with their heatsinks.

    The JJC was modified with the 6V regulator & current limiting fuses in the battery compartment.  In testing, the JJC drew 5A to charge the flash.  Combined with the fresh batteries in the flash, it supplied more power than the flash could use without destroying itself.  Running on a lipo, the 580 EX II is basically a very expensive manes powered strobe.


    The 6V regulator got momentarily hot, but the flash couldn't draw enough power to keep it hot.  It was time to see if it did anything useful.

    BM with the LM7806 when recharging

    BM without the LM7806 when recharging.  

    BM without the LM7806 when idle.

    Its peak current is 2A while the boost converters are running 5A in series.  It's obviously making a prettier waveform when idle...


  • Portrait mode & HDMI tapping

    lion mclionhead · 05/17/2020 at 19:10

    The tracker needs to support still photos in portrait mode.  Portrait mode is never used for video, at least by these animals.  A few days of struggle yielded this arrangement for portrait mode. The servo mount may actually be designed to go into this configuration. 

    It mechanically jams before smashing the camera. Still photos would be the 1st 30 minutes & video the 2nd 30 minutes of a model shoot, since it takes a long time to set up portrait mode.

    News flash: openpose can't handle rotations.  It can't detect a lion standing on its head.  It starts falling over with just 90 degree rotations. The video has to be rotated before being fed to openpose.
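
    A minimal sketch of that rotation step, assuming the frames are already OpenCV Mats (openpose depends on OpenCV anyway); the helper name is made up & the real tracker may do it differently.

    // Hypothetical helper: rotate a landscape frame 90 degrees so openpose
    // sees an upright body when the camera is in portrait mode.
    #include <opencv2/core.hpp>

    cv::Mat rotate_for_portrait(const cv::Mat &frame, bool clockwise)
    {
        cv::Mat rotated;
        cv::rotate(frame, rotated, clockwise ? cv::ROTATE_90_CLOCKWISE
                                             : cv::ROTATE_90_COUNTERCLOCKWISE);
        return rotated;
    }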

    https://www.amazon.com/gp/product/B0876VWFH7

    Also installed were the HDMI video feed & the remote shutter control.  These 2 bits achieved their final form.  Absolutely none of the EOS RP's computer connections ended up useful.  Note the gears gained a barrier from the cables.

    News flash: the EOS RP can't record video when it's outputting a clean HDMI signal.  The reason is clean HDMI causes it to show a parallel view on the LCD while sending a clean signal to HDMI.  It uses a 2nd channel that would normally be used for recording video & there's no way to turn it off. 

    Your only recording option when outputting clean HDMI is on the laptop.  Helas the $50 HDMI board outputs real crummy JPEG's at 29.97fps or 8 bit YUYV at 5fps.  It's limited by USB 2.0 bandwidth, no good for pirating video or any recording.

    The mighty Tamron 17-35mm was the next piece.  It was the lion kingdom's 1st lens in 12 years.  The lion kingdom relied on a 15mm fisheye for its entire wide angle career.  It was over $500 when it was new.  It was discontinued in 2011 & used ones are now being sold for $350.  Its purchase was inspired by a demo photo with the 15mm on an EOS 1DS.

    Defishing the 15mm in software gave decent results for still photos, but less so for video.  There will never be a hack to defish the 15mm in the camera.

    With the HDMI tap, it could finally take pictures & record video through the camera.  The tracker did its best to make some portraits.  The tracking movement caused motion blur.  Large deadband is key for freezing the camera movement.  Portrait mode still needs faster horizontal movement, because it has less horizontal room.

    Openpose lacks a way to detect the outline of an animal.  It only detects eyes, so the size of the head has to be estimated by the shoulder position.  It gets inaccurate if the animal bends over.  Openpose has proven about as good at detecting a head as a dedicated face tracker.
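
    A rough sketch of that estimate, assuming the eye & shoulder keypoints are already in pixel coordinates with Y growing downward; the scale factor & names are placeholders, not the values the tracker actually uses.

    // Guess the top of the head from the eyes & shoulders, since openpose
    // gives no head outline.  HEAD_SCALE is a made-up constant.
    struct Point2 { float x, y; };

    float estimate_head_top(Point2 eye, Point2 shoulder)
    {
        const float HEAD_SCALE = 0.5f;               // fraction of eye-shoulder distance
        float eye_to_shoulder = shoulder.y - eye.y;  // Y grows downward
        return eye.y - HEAD_SCALE * eye_to_shoulder; // head top is above the eyes
    }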

    The tracker has options for different lenses.  Longer lenses make it hunt. 50mm has been the limit for these servos.   Adding deadbands reduces the hunting but makes it less accurate.  It's definitely going to require a large padding from the frame edges.  For talking heads, the subject definitely needs to be standing up for the tracker to estimate the head size.
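
    The deadband might look something like the sketch below; the threshold, gain, & clamp are placeholders rather than the tracker's real calibration.

    #include <math.h>

    // Only move the servo when the error leaves a dead zone around the
    // target, to stop the hunting with long lenses.  Values are placeholders.
    float pan_step(float error_x, float deadband, float gain, float max_step)
    {
        if (fabsf(error_x) < deadband) return 0.0f;  // inside the dead zone: hold still
        float step = gain * error_x;
        if (step >  max_step) step =  max_step;      // clamp for the servo
        if (step < -max_step) step = -max_step;
        return step;
    }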


    A corner case is if the entire body is in frame, but the lower section is obstructed.  The tracker could assume the lower section is out of frame & tilt down until the head is on top.  In practice, openpose seems to create a placeholder when the legs are obstructed.

  • Tilt tracking

    lion mclionhead · 05/14/2020 at 19:36

    Tilt tracking was a long, hard process but after 4 years, it finally surrendered.  The best solution ended up dividing the 25 body parts from openpose into 4 vertical zones.  Depending on which zones are visible, it tilts to track the head zone, all the zones, or just tilts up hoping to find the head.  The trick is dividing the body into more than 2 zones.  That allows a key state where the head is visible but only some zones below the head are visible.
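
    A sketch of the zone test, assuming the 25 BODY_25 confidences are already available; which part maps to which zone & the confidence cutoff are guesses, not the real calibration.

    // Mark which vertical zones have at least 1 confidently detected part.
    enum { ZONE_HEAD, ZONE_CHEST, ZONE_HIPS, ZONE_LEGS, TOTAL_ZONES };

    void get_zones(const float confidence[25], const int part_to_zone[25],
                   int zone_visible[TOTAL_ZONES])
    {
        for (int z = 0; z < TOTAL_ZONES; z++) zone_visible[z] = 0;
        for (int i = 0; i < 25; i++)
            if (confidence[i] > 0.05f) zone_visible[part_to_zone[i]] = 1;
    }

    // Tilt policy, roughly as described above: all zones visible -> frame the
    // whole body, head + some lower zones -> center the head zone,
    // no head zone -> tilt up hoping to find it.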

    The composition deteriorates as the subject gets closer, but it does a good job tracking the subject even with only the viewfinder.  It tries to adjust the head position based on the head size, but head size can only be an estimation.

    It supports 3 lenses, but each lens requires different calibration factors, especially the relation between head size & head position.  The narrower the lens, the fewer body parts it sees & the more it just tracks the head.  The narrower the lens, the slower it needs to track, since the servos overshoot.  Openpose becomes less effective as fewer body parts are visible.  Since only the widest lens will ever be used in practice, only the widest lens is dialed in.

    The servos & autofocus are real noisy.  It still can't record anything.

    All this is conjecture, since the mane application is with 2 humans & there's no easy way to test it with 2 humans.  With 2 humans, it's supposed to use the tallest human for tilt & the average of all the humans for pan.

    Pan continues to simply center on the average X position of all the detected body parts.   
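
    A sketch of that 2 human policy, assuming each detected body has already been reduced to its highest point & the sum of its part X positions; the struct & names are made up.

    // Tallest body (smallest Y for its highest part) drives tilt; the average
    // X of every detected part across all bodies drives pan.
    struct Body { float min_y; float sum_x; int parts; };

    void choose_targets(const Body *bodies, int count,
                        float *pan_target_x, int *tilt_body)
    {
        float total_x = 0;
        int total_parts = 0;
        *tilt_body = 0;
        for (int i = 0; i < count; i++) {
            total_x += bodies[i].sum_x;
            total_parts += bodies[i].parts;
            if (bodies[i].min_y < bodies[*tilt_body].min_y) *tilt_body = i;
        }
        *pan_target_x = total_parts ? total_x / total_parts : 0.0f;
    }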

    There is a case for supporting a narrow lens for portraits & talking heads.  It would need a deadband to reduce the oscillation & the deadband would require the head to be closer to the center.  A face tracker rather than openpose would be required for a talking head.

  • The EOS RP arrives

    lion mclionhead · 05/10/2020 at 20:33

    The next camera entry was the mighty EOS RP.  Its advantages were better light sensitivity, a wider field of view through a full size sensor, faster autofocus, & preview video over USB.  Combined with the junk laptop's higher framerate, it managed real good tracking without the janky Escam.

    The preview video can be previewed by running

    gphoto2 --capture-movie --stdout | mplayer -

    The preview video in video mode is 1024x576 JPEG photos, coming out at the camera's framerate.  If it's set for 23.97, they come out at 23.97.  If it's 59.94, they come out at 59.94.  Regardless of the shutter speed, the JPEG frames are duplicated to always come out at over 23.97.   They're all encoded differently by a constant bitrate algorithm, so there's no way to dedupe by comparing files.
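
    1 way to consume that stream is to pipe gphoto2 through popen & split on the JPEG end-of-image marker; a sketch under that assumption, not necessarily how the tracker actually reads it, & error handling is omitted.

    // Read the MJPEG preview from gphoto2 & decode each frame with OpenCV.
    #include <cstdio>
    #include <vector>
    #include <opencv2/imgcodecs.hpp>

    int main()
    {
        FILE *fd = popen("gphoto2 --capture-movie --stdout", "r");
        std::vector<unsigned char> buf;
        int c, prev = 0;
        while ((c = fgetc(fd)) != EOF)
        {
            buf.push_back((unsigned char)c);
            if (prev == 0xff && c == 0xd9)            // JPEG end of image
            {
                cv::Mat frame = cv::imdecode(buf, cv::IMREAD_COLOR);
                // ... hand the frame to openpose here ...
                buf.clear();
            }
            prev = c;
        }
        pclose(fd);
        return 0;
    }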

    Once streaming begins, most of the camera interface is broken until the USB cable is unplugged or the tracker program is killed.  There is  some limited exposure control in video mode.  In still photo mode, the preview video is 960x640 & the camera interface is completely disabled. 

    Whenever USB is connected, there's no way to record video.  There is a way to take still photos by killing gphoto2, running it again as 

    gphoto2 --set-config capturetarget=1 --capture-image

    & resuming the video preview.  It obviously has only a single video encoder & it has to share it between preview mode, video compression & still photos.

    In video mode, the camera has to be power cycled to get gphoto2 to reconnect to it after killing the program.  In still photo mode, it seems to reconnect after taking a still photo.  The lion kingdom will probably use 30 minutes in video mode & 30 minutes in still photo mode in practical use, so a method would be required to remotely trigger gphoto2 if USB previewing was the goal.

    Helas, the lion kingdom decided not to use USB previewing at all.  The next step is to try an HDMI capture board.

    Added more safety features to keep the mount from destroying the camera: manely, it goes into a manual alignment mode when it starts, only starts tracking after the user enables it, & has limits on the step size.  Unfortunately, initialization still sometimes glitches to a random angle.

    Tracking tilt continues to be a big problem.  It can be used with some kind of manual tilt control, but there should be a way to do it automatically.

  • Dedicated camera ideas

    lion mclionhead · 01/31/2020 at 07:33

    1 of the many deal breakers with this project was the autofocus on the DSLR completely failing to track the subject.  The mane requirement was working in the lowest light, & an ancient EOS T4i with prime lenses was the only option.

    It finally occurred to the lion kingdom that a tracking mount probably needs a dedicated camera specially equipped for a tracking mount, rather than a DSLR.  With much videography being airshows & rockets, there was also a growing need for something with stabilization, faster autofocus, & sharper focus.   Lions were once again getting nowhere near the quality of modern videos.

    The Jennamarbles dog videos are always razor sharp & perfectly lit.  It's astounding how sharp the focus remanes while tracking a dog.  These videos look a lot more professional than all the other meaningless gootube videos, just because of the focus & the even lighting.  As much as lions like blown out colors, what gives Jennamarbles a professional touch seems to be the washed out but even colors.

    The internet doesn't really know whether she uses a Canon PowerShot G7 X Mark II or an EOS 80D.  The internet definitely doesn't know what lens is on the EOS 80D.

    This dog video appears to have a reflection of a small Powershot, unless her preference in men is larger than the usual oversized men.  All the good shots seem to be coming from the Powershot.

    This dog shot stood out for its even lighting & sharp focus, despite the motion.

  • Commercial solutions appear

    lion mclionhead · 07/05/2019 at 20:03

    After months of spending 4 hours/day commuting instead of working on the tracking camera, a commercial solution from Hong Kong hit stores for $719.

    https://www.amazon.com/dp/B07RY5KDX2/

    The example videos show it doing a good job.  Instead of a spherical camera or wide angle lens, it manages to track only by what's in its narrow field of view.  This requires it to move very fast, resulting in jerky panning.  

    It isolates the subject from a background of other humans, recognizes paw gestures, & smartly tracks whatever part of the body is in view without getting thrown off.   In the demos, it locks onto the subject with only a single frame of video rather than a thorough training set.  It recognizes as little as an arm showing from behind an obstacle.  Based on the multicolored clothing, they're running several simultaneous algorithms: a face tracker, a color tracker, & a pose tracker.  The junk laptop would have a hard time just doing pose tracking.

    The image sensor is an awful Chinese one.  It would never do in a dim hotel room.  Chinese manufacturers are not allowed to use any imported parts.  The neural network processor is not an NVidia but an indigenously produced HiSilicon Hi3559A.  China's government is focused on having no debt, but how's that working in a world where credit is viewed as an investment in the future?  They can't borrow money to import a decent Sony sensor, so the world has to wait for China's own sensor to match Sony.

    It's strange that tracking cameras have been on quad copters for years, now are slowly emerging on ground cameras, but have never been used in any kind of production & never replicated by any open source efforts.  There has also never been any tracking for higher end DSLR cameras.  It's only been offered on consumer platforms.

  • Junk laptop arrives

    lion mclionhead · 06/08/2019 at 21:18

    After living with a macbook that only did 3fps & a desktop which ran the neural network over a network, the lion kingdom obtained a gaming laptop with a GT970 that fell off a truck.  The GT970 was much more powerful than the macbook's GT750 & the desktop's GT1050, while the rest of the laptop was far behind.  Of course, the rest was still a quad 2.6GHz i7 with 12GB RAM.  To an old timer, it's an astounding amount of power.  Just not comparable to the last 5 years.

     Most surprising was how the GT970 had 50% more memory & 2x more cores than the GT1050 but lower clockspeeds.  They traded clockspeed for parallelism to make it portable, implying clockspeed used more power than transistor count.

    Pose tracking on a tablet using cloud computing was a failure.  The network was never reliable enough.  It desperately needed a laptop with more horsepower than the macbook.

    The instructions on https://hackaday.io/project/162944-auto-tracking-camera/log/157763-making-it-portable were still relevant.  After 2 days of compiling openpose from scratch with the latest libraries & porting countreps to a standalone Linux version, the junk laptop ran it at 6.5fps with the full neuron count & a 1280x720 webcam, with no more dropped frames.  It was a lot better at counting than the tablet.


    The mane problem was it quickly overheated.  After some experimentation, 2 small igloo bars directly under the fans were enough to keep it cool.  Even better would be a sheet of paper outlining where to put the igloo bars & laptop.  Igloo bars may actually be a viable way to use the power of a refrigerator to cool CPUs.
