
Dexter

A low-cost, high-precision haptic robotic arm with 50-micron repeatability and modular end effectors

Dexter is a trainable, 3D printed, 5+ axis haptic robotic arm controlled by an FPGA supercomputer. It uses optical encoders to achieve high precision, with a repeatability of 50 microns. Dexter can carry end effectors such as a gripper or a laser, extending its functionality and potentially adding more axes.

Dexter is an open source, 5+ axis robotic arm. It is built from 3D printed parts and held together with carbon fiber strakes for reinforcement. Dexter uses five NEMA-17 stepper motors, three of them through harmonic drives, in conjunction with optical encoders to achieve extreme precision: a 2.5 micron stepping distance and 50 micron repeatability.
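
As a rough back-of-the-envelope sketch of where numbers like these come from (the microstepping factor and lever arm below are assumptions, not published Dexter specs), a 1.8° stepper behind the 52:1 harmonic drive yields a joint step of well under a thousandth of a degree, which maps to micron-scale motion at the tool tip:

    // Back-of-the-envelope joint/tip resolution estimate (illustrative only).
    // Assumptions, not Dexter specs: 64x microstepping and a 0.3 m lever arm.
    const fullStepDeg   = 1.8;        // typical NEMA 17 full step
    const microstepping = 64;         // assumed driver setting
    const gearRatio     = 52;         // HanZhen harmonic drive, per the parts list

    const jointStepDeg = fullStepDeg / (microstepping * gearRatio);
    const jointStepRad = jointStepDeg * Math.PI / 180;

    const leverArmM = 0.3;            // assumed joint-to-tip distance
    const tipStepM  = jointStepRad * leverArmM;

    console.log(`joint step: ${jointStepDeg.toFixed(6)} deg`);
    console.log(`tip step:   ${(tipStepM * 1e6).toFixed(1)} microns`);

With these assumed values the tip step lands in the same few-micron ballpark as the quoted 2.5 micron stepping distance; the real figure depends on the actual drive settings and arm geometry.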

Dexter is also trainable: you can manually guide it and then play back the exact path it took. This makes it easy to program by demonstration for real applications.

Dexter can also be controlled with our software, the Dexter Development Environment (DDE). DDE uses a modified version of JavaScript that allows a more traditional approach to programming while remaining friendly to those who are newer to code. It also includes a simulator pane that plays back Dexter's movements inside the development environment.
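
For a feel of the scripting model, here is a plain, Node-runnable JavaScript sketch of the queue-moves-then-play-them-back pattern. The function names are hypothetical stand-ins, not the actual DDE API; consult the DDE documentation for the real calls.

    // Illustrative sketch only: queue joint moves into a do-list and walk it,
    // the way a DDE job drives either the simulator pane or the real arm.
    // "moveAllJoints" and "runJob" are made-up names, not DDE calls.
    function moveAllJoints(angles) {
      return { kind: "move_all_joints", angles };     // one queued instruction
    }

    function runJob(doList, onStep) {
      doList.forEach((instr, i) => onStep(i, instr)); // simulator or robot would consume these
    }

    const doList = [
      moveAllJoints([0, 0, 0, 0, 0]),                 // home
      moveAllJoints([30, 45, -20, 0, 10]),            // reach toward the part
      moveAllJoints([30, 60, -35, 0, 10]),            // lower to grasp height
    ];

    runJob(doList, (i, instr) =>
      console.log(`step ${i}: ${instr.kind} -> [${instr.angles.join(", ")}]`));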

Dexter even has a Unity library that a member of the Dexter community put together, complete with haptic feedback. This could potentially be used for video games or even real-world applications where a Dexter must be remotely controlled.

  • 5 × Wantai NEMA 17 stepper motors, used for Dexter's movement
  • 3 × HanZhen harmonic drives, 52:1 gear ratio, used on three of the axes
  • 1 × Avnet MicroZed FPGA board, used by Dexter to process information from the optical encoders

  • Dexter HD

    Haddington Dynamics, 06/01/2018 at 21:16

    2018 continues to be good for Haddington Dynamics and we picked up seed investors.

    We will be exhibiting the remote-control scaling system at the automatica show in Munich, along with the latest version of Dexter, which we call Dexter HD.

    Very busy times.

  • Kickstarter and more

    Haddington Dynamics, 06/01/2018 at 21:14

    On February 14 we open sourced everything (CAD, code, boards) on GitHub under the GPLv3 license. The next day we launched our Kickstarter campaign.

    We set the goal at $100k and hit it with 112 backers.

    In the campaign we offered a "Makecation".  Come to Las Vegas and build your robot with us.  This allowed us to get to know our backers and these Makecations have led to great business opportunities.

    By open sourcing we reduced the friction of working with universities, and the New York Institute of Technology (where we had met Matt Cornelius back in 2015 at the New York Maker Faire) was keen to work with us under the leadership of Christian Pongratz.

    They are now building Dexters on campus, and our technology will be part of the curriculum this coming fall semester. NYIT is focusing on Makerism and sees Dexter as a tool to work across disciplines.

    The 2017 Bay Area Maker Faire allowed us to meet NASA. Their project manager came to do a Makecation that fall and purchased five Dexters to prototype a drone inspection cell in a project called Fit2Fly.

    We also formed a JV with Axiom Electronics, called e1ectr0n, to build robots for the EMS industry.

  • Out of the Garage

    Haddington Dynamics, 06/01/2018 at 21:05

    March 28, 2015 brought in investors I knew from the neighborhood. The neighborhood is called Haddington and is located in Henderson, NV.

    The goal was to move out of the garage and raise money.

    We moved to our first location in July 2015 and spoke to many investors; all of them wanted to keep the project closed and patent everything.

    We thought the best approach was to open source the platform, allowing more friction-free adoption.

  • Garage Time

    Haddington Dynamics, 06/01/2018 at 21:01

    Dexter (named after human dexterity) began in earnest in 2012. The project started in the garage. I had sold a supercomputer company and used the proceeds to fund my research into taking supercomputer technology and putting it into a robot. I built a language for programming FPGA chips and began using it to create a real-time physics engine. I used a laser cutter to cut out pieces for the first robot and also used it to cut the first optical encoder disks out of paper.

    This allowed me to test all the signal processing I was doing in the FPGA. There is video on YouTube of the first release of this system.

    I started to realize this could be a much more touch sensitive and precise robot control system.

    I spent the next three years in the garage working on the design and testing the control system.


  • 1. Dexter Assembly Manual

    The link to the Dexter Assembly Manual is here



Discussions

James Newton wrote 07/13/2018 at 17:37

LOL. Ok, ok... The responses before ARE written by the guy at HD who does marketing. It's funny to me that he is providing the very best answers he can provide, given data from the techs and engineers in the company, and it still doesn't make anyone happy because of the language. Give the poor guy a break? 

So I'm an engineer at HD (firmware, testing, working on Arduino compatible smarts in the new end effector). And I'm a long time OSH / FOSS supporter. And I completely freaking fail at marketing (check my side business. LOL). 

Maybe a conversation with me is better? I'll post some responses below this.


James Newton wrote 07/13/2018 at 17:38

1. ITAR, CCL, whatever stupid government regulation it actually is we don't care about. We aren't allowed to put an encoder on the 'bot that exceeds a certain angular resolution and then sell the bot outside the USA. It's freaking stupid and very annoying, but we want to stay out of jail so we do as we are told. 

From a technical perspective, the encoder is AMAZING! I have a Dexter, I know what it can do. There ARE problems with it, but they are mostly related to the processing of the data and not related to the encoder itself. e.g. We currently are not exposing the actual joint angle from the FPGA to the onboard firmware, but that's on the "backlog" list for the next iteration. As a result, we currently can't read back the actual joint angle in the firmware; we only send a desired joint angle and the system "makes it so". We can get back the error, and other status info, but the actual joint angle is important and we will add it. This presentation may do a better job of explaining how the encoder works and why it's able to do more:

https://docs.google.com/presentation/d/1YGBnNvxhoP9J7alk5sASB5aUM_wg30fUuiZzM94uL2w/edit?usp=sharing

Let me know if you have more questions about the encoder. It's a technology I think more people should understand, so I'm happy to spend more time explaining it.

Instead of using the slots in the encoder disk as "on", "off" / "black", "white" / binary values, we use the optical sensors in /analog/ mode. With a high resolution ADC, we get thousands of positions per slot. And yes, those are not perfectly linear, and yes, there is noise and all that poop. But the end result is MUCH higher resolution than a standard encoder. Fantastic repeatability (NOT accuracy). And it's a pretty cool idea, huh?

But even in home applications, why NOT print a low-res encoder disk, use photodiodes and A/D converters, and get higher resolution positioning? It's a freaking cool idea! And we open sourced it.
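
For readers who want to see the analog trick described above in concrete form, here is a minimal sketch (an editorial illustration, not the project's FPGA code): treat two photosensors a quarter-slot apart as sine and cosine and recover the fractional position within a slot with atan2. Dexter's pipeline additionally normalizes against a recorded table and filters the samples, which this omits.

    // Minimal quadrature-analog interpolation sketch (not Dexter's FPGA code).
    // Two photosensors, offset a quarter slot apart, give roughly sine/cosine
    // signals as the disk turns; atan2 of the pair locates the position within
    // one slot far more finely than the slot count alone.
    function interpolateAngle(sinNorm, cosNorm, slotIndex, slotsPerRev) {
      // sinNorm/cosNorm: ADC readings already offset/gain normalized to ~[-1, 1]
      const frac  = Math.atan2(sinNorm, cosNorm) / (2 * Math.PI);  // -0.5 .. 0.5
      const slots = slotIndex + (frac < 0 ? frac + 1 : frac);      // fractional slot
      return (slots / slotsPerRev) * 360;                          // joint angle, deg
    }

    // Example: a 180-slot printed disk and one pair of normalized readings.
    console.log(interpolateAngle(0.42, 0.91, 37, 180).toFixed(4), "deg");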


James Newton wrote 07/13/2018 at 17:39

2. Massively parallel, super whatever... who cares? It's an FPGA doing stuff faster than the firmware can do it. That's the point. That's cool. Really, really cool, actually. It gives us a stunningly fast feedback time so we can do things like adjust the "springiness" of the response. Or switch into a compliant "follow" mode instead of position "keep" mode. In follow, you can grab the robot, move it around, and record waypoints or the entire path. Then play back, insert into a program, whatever. It makes it really easy to use.

One issue we have right now is that you have to return to "home" to switch from follow to keep modes, but we are working on that as well. It's a tricky problem that we didn't see coming, and honestly, it's over my head why it doesn't work. 
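 
To make the record-and-replay idea above concrete, here is a hedged host-side sketch (all names hypothetical; the real Dexter firmware interface differs): while the arm is compliant, sample the joint angles on a timer, then stream the samples back as position commands.

    // Hypothetical record/playback sketch, not the real Dexter interface.
    function recordPath(readJoints, sampleCount, periodMs) {
      const path = [];
      return new Promise(resolve => {
        const timer = setInterval(() => {
          path.push(readJoints());                       // snapshot all joint angles
          if (path.length >= sampleCount) { clearInterval(timer); resolve(path); }
        }, periodMs);
      });
    }

    async function playback(path, sendJoints, periodMs) {
      for (const angles of path) {
        sendJoints(angles);                              // command the recorded pose
        await new Promise(r => setTimeout(r, periodMs));
      }
    }

    // Demo with a fake arm so the sketch runs standalone.
    let fake = [0, 0, 0, 0, 0];
    const readFake = () => (fake = fake.map(a => a + 0.5));  // pretend someone is guiding it
    recordPath(readFake, 10, 50)
      .then(path => playback(path, angles => console.log("send", angles.join(", ")), 50));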


James Newton wrote 07/13/2018 at 17:42

3. Open source: Dexter is as open source as we can legally make it. There is ONE area of Dexter that isn't /really/ fully open source and that is, as modzer0 points out, the HDL for the FPGA. The reason is that our primary engineer developed the HDL himself. It's called "Viva". He sold it to another company and that company is sitting on it. It's a decision he may well regret, but feeding the family is sometimes a good idea, no? He has a resale license, and can "sell" a copy here and there for $1, but there are limits on his ability to do that.

The files that come out of the HDL are available online (in the github) and they are /technically/ editable, and the source file is also there, but yeah... we get it. It's not really fully open source on that point because the editor isn't available to the general public. 

So why not use a "standard" HDL? I'm so glad you asked. Because, and this is really important, so please read to the end and think about it, all the existing HDLs for FPGA design force you to THINK in sequence as you write the code. Our HDL, "Viva", is graphical. It shows you things on the screen in a format that more closely represents what is actually happening in the FPGA. If you write in a standard HDL, you have to think about a process that is parallel, then transcribe it into an inherently serial form (the HDL), and then it gets processed back into parallel operation. With Viva, you think, and draw, in parallel. It's just better, and it's what the big mfgrs /should/ be providing and are not.

More than that, it's the format our principal developer thinks in and works in, so we aren't going to change it. Sorry. End of story. Not going to burn more cycles defending that.

If you want to reject our FPGA claims, that's your choice and there isn't much we can do about it. I'll try to post screenshots if that helps? I think I can do that...

If you want to reject "massively parallel" and "FPGA super computer" then... LOL. Yeah, I kind of agree with you. Those are marketing words. But the FPGA IS doing cool things for us, it DOES solve a lot of problems, and you CAN see that in our demos and in person if you are near someone who has a robot. 

If anyone in the SoCal area wants a demo, and can make it to Escondido, I'll happily set up a time and show you what the robot can, and can not, do. We go to Maker Faires all the time; stop by and see for yourself. All the other parts of Dexter are FULLY open source: the firmware, the hardware (schematics, PCBs, etc.), the physical objects, etc. I've printed new end effectors from the files on OnShape. I'm using CircuitMaker.com for the power supplies (and soon the new interface board) for the new end effector.

If there is something (other than the FPGA HDL) that we've missed sharing, please point it out. 


James Newton wrote 07/13/2018 at 17:46

4. The "end all scarcity" / robots in Africa thing. That's the goal. SpaceX wants to make humans interplanetary. All sorts of AI companies have crazy end goals. We set our sights high. At this point, other people have used Dexter to cut things with a laser, make art, tend a 3D printer (remove the finished part and dump it in a box), run a camera (a YouTuber), stuff like that. We are short on practical applications, and we need YOU involved to find and solve problems with our "solution" (with Dexter).

Our best application at this point is teaching people how to make a robot and NYIT is using it for exactly that at this time. 

If you wanna learn how to make a really cool robot, dig into the github. Post issues if you find a problem / don't understand something / can't find something. We REALLY want people to get involved in the project:
https://github.com/HaddingtonDynamics/Dexter
(p.s. please check out the wiki... I've worked really hard on that and I'm proud of it, but I want feedback on how to improve it.)


Daniel wrote 7 days ago

Understand that Hackaday is a haven and focal point for a large number of hackers, hardware hackers, reverse engineers, hobbyists, makers, and professionals in related fields. A large percentage of them love seeing tech scams dismantled and destroyed. Some systematically discredit marketing claims for recreation. The number one response from those scammers is marketing speak, along with the tactic of attacking the credibility of those posting hard technical questions or even basic science showing why their product will not work as advertised. The people behind those scams have large teams of marketers and few if any engineers.

There are people who will fall for those tactics, and there are those who see right through them and smell bullshit. Hackaday has a large percentage of the latter.

A bit of advice for this site is to take any interaction with it away from the marketing guy. Remove all the claims in the description and stick to technical details. You don't have to be a good communicator; in fact, if you're too good it will just create skepticism. Post about the process: the problems and how you worked through them. It should be about how things work and how you made them work. That's the purpose of the site, a place to show off projects and build logs.

One thing never to do is try to undermine someone's credibility rather than addressing the technical questions. That's a fatal mistake, as evidenced by Modzer0's surprisingly polite checkmate followed by the forceful creation of a new orifice for your marketing guy. Don't give in to the marketing impulse to delete the jab. If he just apologizes, everyone will forget it. Trying to cover it up will just make screenshots surface and you'll look even worse. Integrity is increasingly rare these days; displaying some will help fix that damage. Don't act like the tech scammers who are all marketing.

If you discuss the problems you have with designs you'll find people will help you with solutions.


James Newton wrote 6 days ago

Yeah, Daniel, I hear you and you are right. But the engineers at the company are heads down working, and we tend to forget about reaching out. Our bad, I know. I mean, we have a marketing guy to do the "reach out" stuff, so he is just trying to do his job. Honestly, it's my fault because I'm pretty sure he said something about posting on Hackaday and I should have twigged that it wasn't the right fit. I'm the only engineer (I think) at the place who had been on Hackaday before going to work there, so I do understand, and I should have caught it quicker.

I do hope my responses and the details I've provided have answered your and others' concerns? If you have any other questions, please don't hesitate to ask and I'll do my best to answer them. It would be a crying shame if I fail to convey how truly frick'n cool this robot and this company really are. Seriously, I have this job today because I spent a good long time contributing via open source just because I could see how amazing it is. I only got hired because I was whining about my regular day job being cut back and they wanted me to spend more time with them! LOL.


Modzer0 wrote 07/12/2018 at 00:15

Classy, labeling me as 'angry'. Such things are typically a defensive statement when one feels their ability to respond is in a weakened position.

I also never asked for your 'ITAR' data because you have none. ITAR deals with items, and companies dealing with such items, as defined in the USML. What you linked is the CCL, the Commerce Control List. Any microcontroller with a processor speed over 40 MHz is technically under that as well. I've actually had to sign export control documents to buy ESP32 modules that were imported from China to begin with.

As for the USML, the use of a high resolution encoder in a robot arm isn't defined as a munition, as it is:

 (1) not 'specially designed' to be a weapon system

(2) not used in angular control of devices designed for infrared optical tracking for intelligence gathering purposes.

(3) not a device used to track angular rate or position for missile technology (MT), which doesn't apply to rotary encoders and wasn't funded by the DoD

(4) is not space qualified and is not being used in an instrumentation radar

Lots of people misunderstand the things ITAR applies to.

FPGAs can do tasks in parallel. It all depends on the contents of the bitstream given to it. If it's given a single CPU IP, say something like a small pure stack-based CPU, then it is not in fact 'completely parallel'. It depends on how you use it. The way you describe it is how a marketing person would word it to impress potential customers. FPGAs are just configurable hardware, though very useful configurable hardware that can be used as application accelerators, processing data clock for clock more efficiently than a general purpose CPU. That's why there's quite a number of motion control IP available.

I love FPGAs; I don't love the development environments or having to keep multiple versions of very large dev tools installed just because I'm working on an older FPGA. The licensing is something that gives finance bad dreams. Also, a 'typical' 30-person FPGA team? That's a very large team.

As for the 'Duopoly' you almost make that sound like a conspiracy. You sure it wasn't them just supporting languages that were IEEE standardized and didn't come with very high license fees?


James Newton wrote 07/13/2018 at 18:00

Can I be honest? We really aren't government employees and we really don't understand all that regulatory crap. Or at least I don't. We were told to limit the resolution and we didn't want to get into any trouble with the government so we did it.

Can we get back to the technology? Because that's what's really cool! Have you seen how the encoder actually works? This presentation might help explain it:
https://docs.google.com/presentation/d/1YGBnNvxhoP9J7alk5sASB5aUM_wg30fUuiZzM94uL2w/edit?usp=sharing

Instead of using the slots in the encoder disk as "on", "off" / "black", "white" / binary values, we use the optical sensors in /analog/ mode. With effective 13-bit resolution (12-bit but dithered) we get 8192 positions per slot. And yes, those are not perfectly linear, and yes, there is noise and all that poop. But the end result is MUCH higher resolution than a standard encoder. Fantastic repeatability (NOT accuracy). And it's a pretty cool idea, huh? Watch the video above... I've seen that demo in person. Swing by my office in Escondido, CA and I'll happily show you.

It's really wild when you use 2 robots slaved together because the human operator takes out any accuracy issues. The 2nd can send back position errors to the first, which can then use that to increase the position resistance which provides the operator with "feeling" from remote touches. It's amazing if you experience it yourself. Sadly, I only have one Dexter at my office, so I've only seen that at the main office in Las Vegas and at the Maker Faires. I think we will be at New York Maker Faire, September 22,23 and then San Diego Maker Faire October 6,7.  Come down and see it! 

But even in home applications, why NOT print a low-res encoder disk, use photodiodes and A/D converters, and get higher resolution positioning? And then if you know FPGAs you should be able to code up a feedback loop (PID or whatever) and use that to regulate the joint.
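
Taking up the "code up a feedback loop" suggestion, here is a minimal PID sketch (an illustration under assumed gains and a toy plant, not Dexter's FPGA implementation) sitting between the interpolated encoder angle and the joint's rate command:

    // Minimal PID joint-position loop (illustrative, not Dexter's FPGA code).
    function makePid(kp, ki, kd, dt) {
      let integral = 0;
      let prevErr  = 0;
      return (target, measured) => {
        const err   = target - measured;
        integral   += err * dt;
        const deriv = (err - prevErr) / dt;
        prevErr     = err;
        return kp * err + ki * integral + kd * deriv;    // joint rate command
      };
    }

    // Toy closed loop: the "plant" simply integrates the commanded rate.
    const dt  = 0.001;                       // 1 kHz loop for the sketch
    const pid = makePid(40, 5, 0.02, dt);    // placeholder gains
    let angle = 0;                           // stand-in for the interpolated encoder angle
    for (let i = 0; i < 1000; i++) {
      const rate = pid(10.0, angle);         // drive toward a 10 degree target
      angle += rate * dt;
    }
    console.log("final angle:", angle.toFixed(3), "deg");  // ends near the target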


Modzer0 wrote 07/02/2018 at 04:32

I remember this from the Kickstarter and the ridiculous claims of supercomputer FPGAs and how they would somehow solve a bunch of problems. All the claims of open source, yet the provided HDL wasn't even complete. Nothing about it was that exceptional considering the amount of off-the-shelf IP available for motion control. The Kickstarter video, with all of its helping-the-impoverished and buzzword marketing tactics, says a lot about the people behind it.

Then there's this 'project' that's purely marketing itself.


Haddington Dynamics wrote 07/06/2018 at 21:54

Thank you for your input. We are sorry that you are dissatisfied with our project. We use an FPGA board with in-house code written by Kent Gilson, a pioneer of hypercomputing with FPGAs, using his own language. With a pinch of research, you would have been able to see that we do not use Verilog or VHDL. We can get a 33 million gate program onto an off-the-shelf Zed board. Here is an early article from 2003 in Forbes regarding the founder's work: https://www.forbes.com/2003/03/25/cz_dl_0325star2.html#2794832ef226 And here is an article that explains how we are doing what we are doing and how different it is from current control systems: https://www.roboticstomorrow.com/article/2018/06/using-fpga-supercomputers-to-build-a-high-precision-robot/12145-

There was a big launch at automatica in Munich for an 8 kHz control system. We are currently running at 100 kHz on each joint. We restricted our encoders to 19.5 bits so we don't hit ITAR restrictions, and therefore can not open source what we have done.

Lastly, we are not sure how to take your final comment regarding the marketing buzzwords. If you are saying we are not sincere in our conviction to use technology to help others, with a goal of ending scarcity, then I would direct you to the website https://www.whycantwe.org/, which was written by Fry (co-founder) and his fellow MIT colleague Henry Lieberman. One final link to help you and others understand our influences: I direct you to Jeremy Rifkin and The Third Industrial Revolution, https://www.youtube.com/watch?v=QX3M8Ka9vUA. Hopefully these links help you and others understand what we are doing.


Anoir Ben Tanfous wrote 07/07/2018 at 14:48

I appreciate your answer


Modzer0 wrote 07/07/2018 at 19:16

An answer filled with more marketing. You keep repeating your founder's name like he invented the wheel or something. The hypercomputing you referred to is best described as 'hype' computing, as it was mostly hype with little substance other than taking people's money. It's much like someone proudly stating that they invented Steorn's Orbo. Then you're using some magic FPGA language that's so great that no one has really heard of it in professional circles. So color me very skeptical of any claims without visible, testable, and verifiable results. You restricted the encoders? Great, remove those restrictions on the encoders and show us the results, associated part numbers, and code that people can actually synthesize/compile and use for independent verification. It's also fairly easy to fine tune motors for a single repeated movement.

How about repeating the movement accuracy test, say, 10 times, then picking up a 100 gram weight and repeating it, incrementing the weight up to the practical limit of the arm? Continuous video with no cuts.

As for ITAR, please provide specifics on the exact item you're in danger of exceeding. I have quite a bit of familiarity with it as much of my work is for the DoD and US Government.

As for my reference to marketing terms I refer to the continuous use of terms like 'supercomputer' to describe the FPGA based motor control system. That is a textbook example of a marketing buzzword.

I'm a scientist at heart so my default skepticism isn't fixed. The burden of proof is on you. I'll sing your praise if proof is independently verifiable.

Prove you're not like Waterseer, whose feel-good help-the-people marketing video your own (for a robotic arm, of all things) sounds very similar to. Engineers and scientists told them their design wouldn't work and they kept claiming it did. Now they've switched to what's basically an overpriced dehumidifier, proving correct everyone who told them their ground- and wind-based passive system would never really work, much less come near their claimed values. Then there's Solar Roadways, with all of its hype; after taking in millions they still haven't paved a road or made a test installation that's net positive on power generation.

Then there's the simple reality of the modern tech economy. If it was really that good the company would have already been bought. Instead you're marketing hobby robotic arms on a site primarily for personal projects.


Haddington Dynamics wrote 07/11/2018 at 21:51

Skeptics are always welcome; however, you come across more angry than skeptical. Not sure we can resolve that other than to provide more information, which you will promptly call marketing. Either we stepped on your toes or this is just you.

We use the term hyper or super to provide context to massively parallel computational horsepower. 
A typical FPGA team is around 30 folks in 6 divisions (integration, testing, timing, etc.). We do not have these constraints. Regarding why this language is not adopted or widespread, you need not look further than the duopoly that is Xilinx and Altera. What hardware company is going to support software that is agnostic to its manufacturer? They make their money on the software; a challenge to that will not be supported. Also, if FPGAs were not a serious contender for where hardware is going, would Intel spend north of $13B to buy Altera?
Regarding ITAR restrictions: an encoder that is 20 bits can steer lasers accurately. Beam steering is not open source material. You are an anonymous individual, so we have no idea of your credentials, but based on your statement above that you 'work for the DoD and US Government', it should not have taken a request to us regarding ITAR restrictions on optical encoders. But again, we don't know who you are. https://www.bis.doc.gov/index.php/documents/regulations-docs/federal-register-notices/federal-register-2014/990-ccl3-2/file page 14.
You want a part number?  The layout to our Opto board is here https://github.com/HaddingtonDynamics/Dexter.  ‘It’s just a simple optical encoder system where we turn a 3D printed code disk into an analog sine cosine wave to get a circle and do interpolation from it. We record a table and then compare against that table to normalize, 2 million samples per second. Which spread over 10 converters--2 at a time--equals 200,000 samples per second per LED/phototransistor which gives us 100,000 (100kHz) frequency bandwidth per axis. From that we get loads of data. We can then use signal processing to get process gain. And all of this is done in real time.  FPGAs are completely parallel. It’s completely different from a processor where you have one thread running or multiple threads running. Everything is happening all the time simultaneously; you have to be explicit about things that are synchronized because it’s all happening at once.’ 
This can’t be done with serial processors that are “so abundant” on the internet today.  If you don’t want precision and haptics at a low cost – there are plenty of alternatives.
Regarding outside testing, you are absolutely correct. Let others challenge our statements. We applaud testing. It is kind of funny that you are arguing for us to prove ourselves, and yet at the same time our proving ourselves means nothing if it is not tested by outsiders. So far we have received positive feedback from Google X, Toyota Research Institute, New York Institute of Technology, NASA, Axiom Electronics, and others that we can't talk about because of NDAs.
When you have disruptive technology, you want to make sure it is available to the bright minds.  Many of those bright minds are in the maker community and can expand the capabilities of our robot beyond what we could do as a small company.  And getting to the people has worked.  Here is a link to one of our clients working with Dexter on a path we would not have taken as a company.  Robots/AI/painting https://www.facebook.com/ZSparkyThoughts/videos/214948589158880/
Presenting on this site was not a marketing ploy.  We were asked at the Bay Area Maker Faire to please submit our information – they liked what we were doing.  They liked that we were also working with a university that is moving to a maker focus.  In fact, 6 courses this fall at NYIT will have our technology as part of their curriculum.  Giving makers new tools is both economically viable and creatively exuberant.  
We knew when we open sourced that we took a large chunk of potential buyers and investors off the table.  This was a calculated decision, the only way to get disruptive tech into the market was to make sure everyone had access to it.  We had plenty of investors on the upfront with silly valuations – they all wanted to spend that money on IP attorneys.  In our opinion that is not how you get a new platform to the market.  These decisions could end up biting us.  But at least the IP is not tied up and others can use it.


Daniel wrote 07/12/2018 at 15:26

He's got a point. The engineers who get it done have a spidey sense earned with experience. If things look like they're written by marketing, what shows up rarely meets expectations.

You've got to be a marketing or PR person. If not, you should be. The responses read like prepared press releases. The kinda things that irritate engineers. I bet if you just posted the technical details he wouldn't have any objections. Hell, there's way too many marketing keywords and phrases for my tastes.

I understand the objection to the Kickstarter video as well. Ever been to the poorer regions of Africa? They don't have any use for robot arms. They'd be happier with security, safe water, reliable food supplies, communications, power, lights, education, and a few other things before robot arms appear on the list. Even then I can say those robot arms will break fairly quickly in those conditions. There are people susceptible to marketing everywhere. There are also people who are irritated by it and will avoid the product just out of spite.

It might also annoy the open source crowd when you use a proprietary, dead language for a key part of how it works.


James Newton wrote 07/13/2018 at 18:04

Daniel, Modzer0, see my post above. Yeah, the guy who wrote this is a marketing person. He is doing the best he can with data from our engineers (I'm one of them). We've been too busy working to post here. And yeah, the FPGA source isn't /really/ open source, because the program to open the file isn't commercially available. Answers and reasons are in my post and the replies to it above. It IS still a really cool project, it's as open source as we can make it, and people who contribute and have FPGA ability are welcome; maybe we can work something out to get you access to the HDL software we use. Read my post and the replies for details.

