So far you can use a BLDC gear motor, a small worm drive gear motor, or a Nema 23 stepper to drive this.
Torque requirement is 15-30 kg/cm and speed requirement is 22 to 30 rpm.
We need to deliver a faster inhale stroke than previously specified, so torque requirement is now 30 kg/cm and speed requirement is 48 rpm. This may put it beyond what our (cheap) Nema 23 can do and probably puts my first pick for the BLDC out of the running as well. The previous number for inhale time was 1.5s. I picked the parts based on 1s. The actual time from the UK document is now 0.3s. Ooof. Everything gets harder.
Software is in progress. With the sensor stack in place, it will be capable of assist ventilation or static-rate ventilation.
Pressure and volume delivered will be directly adjustable. Maximum pressure will be configurable. Maximum pressure should also likely be backed up by a physical pop-off valve. We have a reliable 3D printed valve already developed for us.
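To show what the configurable maximum-pressure idea looks like in control logic, here is a minimal sketch. The names, units, and threshold are my assumptions for illustration, not the project's actual firmware; the hardware pop-off valve remains the real backstop.

```python
# Hedged sketch: a software max-pressure limit gating the inhale stroke.
# MAX_PRESSURE_CMH2O and the motor interface are assumed, not from the project.

MAX_PRESSURE_CMH2O = 40.0  # configurable ceiling (assumed value and units)

class StubMotor:
    """Stand-in for the real motor driver, just for illustration."""
    def __init__(self):
        self.stopped = False
        self.steps = 0
    def stop(self):
        self.stopped = True
    def step_forward(self):
        self.steps += 1

def inhale_step(pressure_reading, motor):
    """Advance the inhale stroke one step, aborting if pressure exceeds the limit."""
    if pressure_reading >= MAX_PRESSURE_CMH2O:
        motor.stop()  # software limit; the physical pop-off valve is the backup
        return "overpressure"
    motor.step_forward()
    return "ok"
```

The point is only that the software limit trips first and the mechanical valve covers the case where it doesn't.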
@Steven.Carr has been happily milling out PC boards and laser cut experiments on his very nice equipment. I was growing envious as I sat here airwiring stuff.
Four years later, I finally got around to building the PCB milling attachment for #Arcus-3D-M2 - Mixed material filament printer . I always had a couple extra AL end effectors and 12 extra mag balls for exactly this purpose. Necessity is your mother.
Took the HF Moto tool I had and machined a full-depth collet for it, specifically to fit the PCB milling bits I have been carrying around in my toolbox for the last 4 years. The normal collets for this tool are relatively soft brass and only hold the first 3mm of the bit, so under any side loading they wear out in about 5 minutes.
I also drilled massive holes in the housing to try to cool the motor a little better so it doesn't melt its mounts when running continuously.
This little motor does actually have *one* nice bearing in it, and carbon block brushes. It may actually do okay for a while now.
Anyway... The results.
This part is the PCB that holds the two optical encoders we are using which allow us to turn DC motors (a Ford F150 wiper motor) into proper servos for about $4 in parts.
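The DC-motor-into-servo idea boils down to closing a position loop around encoder counts. Here is a minimal sketch of that loop; the gains and the PWM-style output are my assumptions for illustration, not the project's actual control code.

```python
# Hedged sketch: a PID position loop turning a cheap DC motor plus optical
# encoders into a servo. Gains and interface are assumed, not the project's code.

class PositionServo:
    def __init__(self, kp=0.8, ki=0.0, kd=0.05):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target_counts, encoder_counts, dt):
        """Return a motor command in [-1, 1] driving toward target_counts."""
        error = target_counts - encoder_counts
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        # clamp to a normalized PWM duty cycle
        return max(-1.0, min(1.0, out))
```

The encoder board in the photo supplies the `encoder_counts` feedback; everything else is a few lines of math on a microcontroller, which is how a $4 parts bill gets you a servo.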
I don't know if it is intended to be offset like that or not. I assume so because I actually ran it twice as I got the depth wrong the first time.
I just took the gcode that @Steven.Carr ran on his Tormach (envy) and stripped out all those unnecessary things like tool changer commands and actual spindle speed control, and reduced the depth of the drill cycle to just give me the centers by relocating Z=0 with a G92 before it.
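That cleanup is easy to automate. Below is a hedged sketch of the same idea as a filter script: drop tool-change and spindle lines, prepend a G92 to shift Z. The exact commands Tormach emits and the Z offset value are assumptions; adjust to taste.

```python
# Hedged sketch: strip tool changer and spindle commands from gcode and
# prepend a G92 Z shift so drill cycles only mark hole centers.
# The offset value and the command prefixes are assumptions for illustration.

def strip_for_pcb(lines, z_shift="G92 Z-1.0"):
    keep = []
    for line in lines:
        code = line.strip().upper()
        # drop tool changes (T..., M6) and spindle control (S..., M3/M4/M5)
        if code.startswith(("T", "M6", "M06", "S", "M3", "M03",
                            "M4", "M04", "M5", "M05")):
            continue
        keep.append(line)
    # relocate Z=0 so existing drill depths barely touch the surface
    return [z_shift] + keep
```

One pass through the file and the Tormach-specific bits are gone.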
Then again using a Tormach to mill PCB's is akin to using a pulse jet engine to spray sanitizer. Yes, I am referencing you ACME Creation Labs.
And.. a video.
I was pretty conservative with my feedrates as I had no idea how this would do.
Steven ran this at F3.0. I turned that down to F1.0, but ended up running the trace etching at F3.0 via feed rate override, but then dropping it to F0.6 for the final pass cutting full depth on the cutout.
I also discovered my 3D printer UI does not properly handle running in G20. Imperial coordinate systems FTW. I am choosing not to fix that.
This is not a judgement on anyone involved in said 'Drama'. Mistakes were made, but I don't believe anyone actually intended for the 'Drama' happening right now to happen.
I chose not to believe that.
In any case, I basically burned myself out and was physically ill for a while, and so I personally have done very little in the way of actual work on this project. Sleeping was far more attractive and probably much better for me.
In the meantime the laser cutter I was using to prototype my parts here 'went away', the city is on lockdown, the AC is not working, and then I enticed a few very time consuming things to bite me. It has not been a good week.
We will prevail.
I do have *one* working prototype though, and through the significant effort of @Steven.Carr it probably has multiple modes, servo control of the motor subsystem, failure detection of individual components, etc. Working is a relative term here, as I still have yet to fix what may just be a simple serial issue to even find out. Basically he wrote, in what seemed like a day, a pure C version of what I did in Python/Kivy for the UI and the serial protocol to implement it. Bastard. :)
At this point though I can probably get to that... um... Sunday.
First off, using individual jumpers sucks. This choice alone has probably cost me 20 hours of debugging time. Some pin header ceases to fit perfectly and you partially lose ground, which makes things fail only when the majority of signals are above the halfway point between ground and Vcc, as the low-going pulses also provide a partial ground. Argh.
Narrowing this down to the jumpers happened when everything finally got hand soldered directly, and all the strange issues happening during testing just went away.
Hand soldering wires still also sucks though. So I have now specified pre-made IDC connectors and matching keyed sockets for all the internal connections.
One leg of each of the four twisted pairs will run SDA1, SCL1, SDA2, SCL2, and the other half of those four pairs will run our Vcc/ground. This seems to be a reliable way to get what we need for running dual I2C channels out of an existing and robust connection like RJ45. Much more so than trying to run the SDA/SCL on a single twisted pair anyway, as they essentially work against each other then; they are not intended to be halfsies of a differential signal. :)
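Written out, the pair assignment looks something like the table below. Pin numbering follows the standard T568B pairing of an RJ45 jack (pairs on 1-2, 3-6, 4-5, 7-8); which I2C line lands on which pair is my assumption for illustration, and I'm using the conventional SCL name for the clock line.

```python
# Hedged sketch: one possible RJ45 pin assignment for dual I2C plus power.
# Each twisted pair carries one bus line and one power/ground return.
RJ45_PAIRS = {
    (1, 2): ("SDA1", "GND"),
    (3, 6): ("SCL1", "VCC"),
    (4, 5): ("SDA2", "GND"),
    (7, 8): ("SCL2", "VCC"),
}
```

The property worth preserving is that every bus line is twisted with a DC return rather than with another signal, which is what keeps the two I2C channels from fighting each other.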
This also probably means generating two tiny daughterboards just to make our IDC to RJ45 jack connection, and to plug into our new optical interrupters, but so be it. Reliability is kinda key here. Which leads me into the next design choice.
Using unshielded hall effect sensors anywhere near a DC or stepper motor, unless they were designed for that particular motor, also sucks.
They will read perfectly, until you start your motor (if you are lucky). If you are not lucky, they may still read perfectly.. for a while. Eventually though it seems they reach some saturation or auto-leveling point, and cease to work reliably until you remove the field generated by the motor/stepper.
Our hall effect sensor for detecting 'home' got replaced by optical gates and a laser cut encoder. I also attempted to future proof this bit by adding a second encoder we could use to read the rotational rate. That led to four more design variations on how to space the encoders/encoder slots. I have arrived at the final revision for this.
I've built out a single glue board for hooking all of this up. Assembly and testing will happen tonight, if I don't run out of steam.
Assembly using the 'all at once' mantra was very annoying.
The front and rear are now split into panels held together by screws. This means you can assemble all the mechanical bits, and then assemble all the electrical bits. The latter I'm shooting for four screw terminals and two ribbon cables. Time will tell.
Mounting the Pi touchscreen needed a couple layers...
I finished up the latest version of the acrylic tab/slot version and promptly set about manually wiring up my control panel for it. Looks good right?
Several hours later I realized how much this sucks. There is absolutely zero chance anyone not in a third world country who desperately needs this, and has no other choice, would do this for 1000 units. This issue could be fixed with a suitably large PCB I could through-hole, but that also sucks.
I promptly put that idea on the back burner and decided to stick our other display (Pi touchscreen) directly on the unit.
It still uses the serial protocol @Steven.Carr came up with to interface between the critical code running on the 'core' and the Pi. This means we could still split the two and use some wireless protocol instead... or run it on a tablet, or have it go away altogether and the core will just keep truckin..
This also led to the discovery that wiring this thing up and *then* assembling it also sucks hard core.
I am in the process of redesigning the front/back panels so they can now be removed without having to split the case. This will translate to the ability to assemble the entire unit, plug stuff in, and then put the front/back on as the last step. That is going to waste some material, but it will save a whole buttload of time. Everything goes together pretty quickly, right up until you start getting wires caught in the tabs and such.
The time lost is far more valuable than the material in this case. Doing it.
And it has been terrifying. A whole lot in the physical design has changed, very quickly.
In fact, the overall goal of the project has changed.
We started out using the American version of the design requirements document, which specified a 1.5s breath interval.
That varied greatly from the later released UK ventilator design document, which actually seemed a lot more sane and specifies a 0.3s inhale timing. Switching documents resulted in a fivefold increase in our power requirement.
That fivefold increase now puts our drive system requirement between 30W and 60W.
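As a back-of-envelope sanity check on those numbers (my arithmetic, not from the design documents), converting the 30 kg-cm at 48 rpm requirement into SI units and allowing for drivetrain and electrical losses lands in the same range:

```python
# Rough check: 30 kg-cm at 48 rpm, converted to watts at the shaft,
# then scaled by an assumed 25-50% overall efficiency.
import math

torque_nm = 30 * 0.01 * 9.81         # 30 kg-cm -> N*m (~2.94 N*m)
omega = 48 * 2 * math.pi / 60        # 48 rpm -> rad/s (~5.03 rad/s)
shaft_watts = torque_nm * omega      # ~14.8 W of mechanical power

# Assumed efficiency range (gearbox + motor + driver losses)
electrical_low = shaft_watts / 0.5   # ~30 W
electrical_high = shaft_watts / 0.25 # ~59 W
```

So the 30-60W figure is consistent with the torque/speed numbers once you assume realistic losses in a cheap geared drive.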
Finding a capable motor that can do that for 400k cycles without breaking a sweat increased the weight of the drive section beyond where it was feasible to place this unit on the patient.
Since we are now moving beyond our strict weight limit, I also decided to add a battery backup.
That led to designing a better body to house all of our new toys.
I decided to go all-in on an interlocking pin/slot panel design. The new body and battery meant we now had a transportation ventilator on our hands here, so it got a handle. This meant we needed to move all of our controls and cable connections to where they are less likely to get knocked around when this gets tossed into an ambulance.
Next step of adding an extension hose to the ventilator bag, and then kinking it off with the pressure release disabled just to see what happened, resulted in us shattering the cam. We have power to spare here.
The cam got redesigned as a solid disc, and the cam arms now clear the disc on either side.
If we were to suddenly lose our active feedback for volume/pressure, we decided it would be best to alarm, but still keep doing what we were doing. To do that it is probably necessary to add an encoder to our dumb motor, so we actually know how hard/fast we are pushing. We added an encoder and a spot to put our reader for it.
To keep it simple, our encoder disc is missing one pulse at each of our two home positions. We can find home by using missing-pulse detection and a single photo interrupter. We had a hall sensor before, but it only told us when we were at 'home'. This is much more useful and uses the same number of pins.
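The missing-pulse trick is simple enough to sketch: if the gap between two encoder edges is roughly double the nominal interval, the missing slot (home) just went past. The threshold below is an assumption; real firmware would also scale the nominal interval with motor speed.

```python
# Hedged sketch of missing-pulse home detection from a single photo
# interrupter. The 1.5x threshold is an assumed midpoint between a normal
# gap (1x) and a missing-slot gap (~2x).

def find_home(edge_times, nominal_interval):
    """Return indices into edge_times where a missing pulse (home) was crossed."""
    homes = []
    for i in range(1, len(edge_times)):
        gap = edge_times[i] - edge_times[i - 1]
        if gap > 1.5 * nominal_interval:  # one slot skipped -> roughly 2x gap
            homes.append(i)
    return homes
```

One interrupter, one input pin, and you get both rotational rate (from the pulse interval) and absolute home (from the missing pulse).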
The interrupter slots I picked here are cut at 4x the kerf width of our laser cutter.
That allows for 200 slots and still having more blocked space than cut.
Two hundred probably sounds really familiar, and it should. That is also the step count for the vast majority of NEMA 34/NEMA 23 stepper motors. I think you see where this is going....
As for the kerf to do this, don't get nervous here. Totally dialed in, the best our laser can do right now is 0.15mm. That's about 50% over spec. You'll be fine.
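Working the slot geometry through (my arithmetic, using the 0.15mm kerf figure above as the working number): a 4x-kerf slot is 0.6mm wide, and keeping more blocked space than cut with 200 slots puts a floor on the disc circumference.

```python
# Rough geometry check for the 200-slot encoder disc. Kerf value is taken
# from the text; the minimum-diameter result is my derivation, not a spec.
import math

kerf_mm = 0.15
slot_mm = 4 * kerf_mm                       # 0.6 mm slot width
slots = 200
# "more blocked than cut" means the pitch must exceed twice the slot width
min_circumference = slots * 2 * slot_mm     # 240 mm
min_diameter = min_circumference / math.pi  # ~76.4 mm
```

Any disc bigger than roughly 76mm across satisfies the blocked-versus-cut constraint, so there is comfortable margin even if the kerf drifts.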
The complexity of all these new parts led me to model the outline of every single part. The clearance to some of them is really close, and I have been bitten here.
For example... the first version of this just eyeballing the motor position ended up with the threaded rod the stack was built from going ever so slightly through where the motor case was. Yep.
The changes I made to the laser cut version have been applied to the FFF version. The arm bearing and lower mounts have been altered to use threaded rod.
The primary difference for FFF is that it uses single bearings whereas I have opted for two bearings stacked on the laser version. This lets me use the same model for both and just have the laser cut parts be thicker to get the added strength I need.
I also did not put the cutouts back into the FFF files. Printing them with low infill is probably stronger than adding the cutouts. They will definitely print slower this way though.