Low Cost, Time-of-Flight Gravimeter Arrays

Gravimeter array imaging requires building low cost, high sensitivity, time-of-flight (aka high sampling rate) sensors.

For this project I wanted to design a low cost instrument that can be built by someone from high school onward, that is sensitive enough to track the sun and moon second by second using their vector tidal acceleration signal. The high sampling rate allows low resolution methods to use statistical averaging to improve resolution. There are MEMS gravimeters, atom gravimeter chips, "molecular electronic transducers" and a wide range of analog methods that can be upgraded with better position sensors, ADCs and small smart arrays.

The network of superconducting gravimeters has been collecting data for decades.  About 95 percent of the signal is the sun-moon signal.  I have an example of one month of data in an attached image.  You can just see a few spots of blue where the actual signal peeps out from underneath the calculated signal.

The calculation is quite simple, if you have a ready source for precise positions of the sun, moon and earth in station centered coordinates.  Luckily, NASA's online Horizons system, provided by the Jet Propulsion Laboratory (JPL) Solar System Dynamics group, has made it easy.  I will post how to generate and download the necessary data to calculate the signal to be expected, and post some programs on GitHub to take the positions, calculate the tidal signal, and help you do the linear regression needed for each axis of the instrument.
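As a sketch of how a Horizons request can be assembled, here is one way to build the query URL for geocentric position vectors of the moon. The parameter names follow the public Horizons API; treat the exact values (body 301 for the moon, center 500@399 for the geocenter) as assumptions to check against JPL's own documentation:

```python
from urllib.parse import urlencode

# Build a JPL Horizons API query for geocentric position vectors of the Moon.
# Parameter names follow the public Horizons API; values here are the usual
# conventions (301 = Moon, '10' would be the Sun, 500@399 = geocenter).
params = {
    "format": "text",
    "COMMAND": "'301'",
    "EPHEM_TYPE": "VECTORS",      # we want xyz position vectors
    "CENTER": "'500@399'",
    "START_TIME": "'2019-08-01'",
    "STOP_TIME": "'2019-09-01'",
    "STEP_SIZE": "'1m'",          # one-minute steps to match SG minute data
}
url = "https://ssd.jpl.nasa.gov/api/horizons.api?" + urlencode(params)
print(url)
```

Fetching that URL (or pasting it into a browser) returns a plain text table of dates and xyz vectors you can parse into the tidal calculation.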

This is a very forgiving method for getting started.  If you have gaps in the data, or periodic noise, earthquakes or other random interruptions, each measurement of the three axes in time can be individually compared to what is expected.  You will need to try to get your instrument aligned carefully to North, East and Vertical unit vectors.  You will need the exact GPS location of the instrument.  If you are off a bit, you will see that in the regression calculations, and can use the regressions to solve for the position and orientation. That is a bit advanced, but I will try to simplify it and post it on GitHub and here.
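The per-axis comparison is ordinary least squares of the station reading against the calculated tidal signal. A minimal sketch in pure Python, with made-up numbers standing in for real minute data:

```python
def calibrate_axis(predicted, measured):
    """Fit measured = a + b * predicted; return offset a, multiplier b, R^2."""
    n = len(predicted)
    mx = sum(predicted) / n
    my = sum(measured) / n
    sxx = sum((x - mx) ** 2 for x in predicted)
    sxy = sum((x - mx) * (y - my) for x, y in zip(predicted, measured))
    syy = sum((y - my) ** 2 for y in measured)
    b = sxy / sxx              # multiplier (related to the local Love number)
    a = my - b * mx            # offset (instrument-specific)
    r2 = sxy * sxy / (sxx * syy)
    return a, b, r2

# Synthetic check: a perfectly linear "station" with offset 2 and gain 1.16
tide = [float(i) for i in range(10)]
station = [2.0 + 1.16 * x for x in tide]
a, b, r2 = calibrate_axis(tide, station)
```

Run this once per axis; gaps and earthquake segments can simply be dropped from the paired lists before fitting.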

I have a few ideas myself.  I would like to try some of the MEMS, atom chip and other gravimeters that are getting sensitive enough to be called "gravimeter" rather than accelerometer.  In fact, "seeing" the sun-moon signal is a good indication that any new technology has reached a certain level of capability.  Large networks can potentially provide feedback to the JPL ephemeris process to help refine the values for GMsun and GMmoon.  Longer term, these "second" instruments can probably try to track Jupiter and Venus.

My more immediate goal is to find the instruments that are sensitive enough and can be logged at high sampling rates, in order to be able to apply "time of flight" and correlation methods to arrays of gravimeters for things like 3D imaging of earthquakes, the earth's interior, ocean currents, and atmospheric currents.  It will probably take one or two generations of people learning about and using gravitational fields on a daily basis for these things to be possible.   I thought I would try to share what I have learned by calibrating the SG and seismometer networks this way, and what I am learning from the many new device manufacturers.

The vector tidal acceleration at an instrument location on the earth is a simple Newtonian gravity calculation.  For now, it only uses the sun and moon.  The sun's gravitational potential at the instrument has a gradient that is the acceleration we measure.  But because the sun accelerates the earth too, you take the sun's acceleration at the instrument location and subtract the sun's acceleration at the center of the earth.  Then add to that the moon's acceleration at the instrument, minus the moon's acceleration at the center of the earth.  There are xyz values for each of these.  So you are taking the x value of the sun's acceleration at the instrument and subtracting the sun's x value at the center of the earth.  You also have to calculate the vector centrifugal acceleration at the instrument.  It sounds complicated, but it is mostly software and keeping data organized.
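Sketched in code, the calculation reads almost exactly like the paragraph above. Positions are in earth-centered metres, the GM values are standard, and the centrifugal term is left out of this sketch for brevity:

```python
import math

GM_SUN = 1.32712440018e20   # m^3/s^2
GM_MOON = 4.9048695e12      # m^3/s^2

def accel(gm, body, point):
    """Newtonian acceleration vector at `point` due to a body at `body`."""
    d = [b - p for b, p in zip(body, point)]
    r = math.sqrt(sum(c * c for c in d))
    return [gm * c / r ** 3 for c in d]

def tidal_accel(station, sun, moon):
    """Sun+moon tidal acceleration: each body's pull at the station minus its
    pull at the earth's center, summed over both bodies (centrifugal omitted)."""
    center = (0.0, 0.0, 0.0)
    total = [0.0, 0.0, 0.0]
    for gm, body in ((GM_SUN, sun), (GM_MOON, moon)):
        a_sta = accel(gm, body, station)
        a_ctr = accel(gm, body, center)
        for i in range(3):
            total[i] += a_sta[i] - a_ctr[i]
    return total

# Rough check: station on the x axis, sun and moon both overhead on that axis.
# The combined tide should come out near the familiar ~1.6 micrometre/s^2 peak.
a = tidal_accel((6.371e6, 0, 0), (1.496e11, 0, 0), (3.844e8, 0, 0))
```

With real Horizons positions fed in minute by minute, the three components of this vector are what each axis of the instrument gets regressed against.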

Once an instrument is calibrated by comparing it to what is expected, then it can begin reporting on what it measures.  The measurements can be solved for the position of the sun, assuming standard values for the moon, and solved for the position of the moon, assuming standard values for the sun.  These...



This is a sample of what the signal looks like from the Z (vertical) axis of a broadband seismometer in Cantley Canada. Again, only a linear regression is needed. The other two axes are similar, though horizontal accelerations tend to be a bit more noisy for different reasons.

JPEG Image - 89.09 kB - 08/22/2019 at 03:57


Bandung Indonesia Sample Calibration Spreadsheet Image ba000100.ggp.png

This is what the "sun moon" signal looks like from one month of minute readings from the superconducting gravimeter at Bandung Indonesia. You can see some earthquake activity (seismic jiggling of the detector). The pink is the calculated Newtonian tidal gravity. The blue is the station reading. The black is the residual, which is mostly from atmospheric density variations. The R squared (R2) is 0.996357. The a and b are the offset and multiplier from the linear regression. The offset is related to the level of helium. And the multiplier is related to the local (pretty consistent) Love number, which you can find under "earth tides".

Portable Network Graphics (PNG) - 209.77 kB - 08/22/2019 at 03:47


First Interference Ring Setup.jpg

This is a picture of the setup used for the "first interference ring" video. The batteries and laser diode are at the bottom, then the two mirrors, the screen, the camera and a postit note to block the direct beam. The laser points to the right edge of the farther mirror, the beam bounces back toward the bottom, hits the right edge of the mirror closest to the bottom, then goes to the right edge of the yellow postits. The scattered beam goes across the full space between the yellow postits and the camera. It is probably the front surface reflection, but I don't know for sure yet.

JPEG Image - 2.70 MB - 05/01/2019 at 13:42


Gravimeter Images.png

These are some of the images you might encounter when you are looking for analog sensors that will give you gravimeter data streams. Laser and magnetic levitation, pyrolytic levitation, the signals and someone's measurement of them. High eigenmodes of non-contact atomic force cantilevers. Ocean waves and natural sources of gravitational change.

Portable Network Graphics (PNG) - 740.68 kB - 04/11/2019 at 16:39


Magnetometer and Gravimeter Networks Image 2.png

This is the second diagram from the video "Spatial Resolution of Magnetometer and Gravimeter Imaging Networks".

Portable Network Graphics (PNG) - 61.42 kB - 04/07/2019 at 23:00



  • More information posted at ResearchGate on 3 axis gravimeters

    RichardCollins07/19/2022 at 17:15 2 comments

    My main "job" is the Internet Foundation, so I am constantly monitoring global issues, methods and activities on the Internet. But I have a heart for research groups and gravitational noise detector arrays.  A group of people started following me on ResearchGate.

    Just so you can see what others are doing with gravitational sensors "for real", here is a link to my notes on ResearchGate titled 

    Solar System Gravimetry and Gravitational Engineering

  • Piezoelectric film and fiber, Three Axis, High Sampling Rate Gravimeters for Imaging Arrays

    RichardCollins02/24/2022 at 00:32 1 comment

    I have not posted updates here for a while, but I work on this continually.  The latest possible technology that I will be trying to adapt is related to piezoelectric films and piezoelectric fibers.  I have seen several efforts to use piezo disks, but not these films.  As I am reading the history of these materials and devices, particularly polyvinylidene fluoride (PVDF) polymers, it goes back about 50 years.

    There are many related topics that make this messy, and lots of people grabbing after money. Too much marketing and high priced things.

    "piezo film", "piezoelectric film", "PVDF film", "PVDF energy harvesting", "piezoelectric energy harvesters", "piezoelectric nanogenerators", "piezoelectric polymers", and many more.

    ("piezoelectric film" OR "piezo film") ("gravimeter" OR "accelerometer") has close to 60,000 entry points.  Many of them false leads or hype.

    These films, mounted along X Y Z axes with some fairly simple electronics for interface - plus some sophisticated algorithms for noise classification, management and reporting - should be able to beat the MEMS gravimeters (hyped-up MEMS accelerometers sensitive enough to track the sun and moon and, at high time-of-flight sampling rates, all kinds of imaging of masses).  My target is to map the interior of the moon, but lots of things get found while looking for how to do it.

    The main reason I am excited about these films is they can support up to Gsps (samples per second) or GHz data streams for time of flight.  The key issue I realized many years ago is the need to use time of flight (gravity has exactly the same speed as light) for locating, characterizing and calibrating sources.  Once a source of gravitational (or electromagnetic) "noise" has been identified, it is no longer "noise" but a "signal" that can be used as a reference source. Since the natural thing is to find the strong noises first, that means many earth and solar system gravitational reference sources.  The strongest is NOT earthquakes, but rather atmospheric density fluctuations and flows. The reason they are "good" sources is that they can be independently verified by lidar and many 3D imaging methods now. Same with ocean waves and currents.  And subsurface imaging of seismic waves.
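The time-of-flight idea can be illustrated with a toy cross-correlation: two sensors record the same broadband "noise" source, one delayed, and the lag that maximizes the correlation recovers the delay. All the numbers here are invented for the sketch:

```python
import random

random.seed(1)
FS = 1_000_000          # 1 Msps sampling rate
C = 299_792_458.0       # speed of light and gravity, m/s
TRUE_LAG = 37           # delay in samples between the two sensors

# One broadband source; sensor B hears it TRUE_LAG samples after sensor A.
n = 4000
src = [random.gauss(0.0, 1.0) for _ in range(n + TRUE_LAG)]
a = src[TRUE_LAG:TRUE_LAG + n]
b = src[:n]

def best_lag(a, b, max_lag):
    """Lag of b relative to a that maximizes the raw cross-correlation."""
    best, arg = float("-inf"), 0
    for lag in range(-max_lag, max_lag + 1):
        s = sum(a[i] * b[i + lag]
                for i in range(len(a)) if 0 <= i + lag < len(b))
        if s > best:
            best, arg = s, lag
    return arg

lag = best_lag(a, b, 100)
path_difference = lag / FS * C   # extra travel distance to sensor B, metres
```

With real arrays the same idea runs pairwise across many sensors, and the set of recovered delays constrains the source position.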

    I have been learning how to design chips and PCBs.  How to order and manufacture test devices.  I have always known the data engineering and statistical side of the problem.

    I measured the speed of gravity about 20 years ago.  I hope to get a device that any kid in high school or college or on their own, can build and use with readily available tools.  I have many Arduino-type processors, Raspberry Pi, Jetson, Ryzen and Intel devices for the data handling.  95% of the problem is collecting and processing the data streams.

    "It is not hard, just tedious".  I don't know how much 7312347343*434388234377234 might be, but it is not hard to get an exact answer - just tedious to work out by hand.  This problem is now "gravitational engineering".  Nothing really much unknown, just tedious working out of tedious details.

  • Gravitational Sensor Requirements and Applications, Magnetic Flux Quantum

    RichardCollins01/08/2021 at 15:31 0 comments

    I have spent the last year going over possible technologies and methods.  My criteria for the gravitational imaging network are:

    Three axes: so that each sensor can determine direction. Each axis of the signal is very precise. Fitting a one dimensional signal is ambiguous.  Fitting three orthogonal axes' signals at once is very precise.  A three axis gravimeter can be as precise or more precise than a GPS station.  And you can solve for the orientation of the sensor.  We are not yet at the point where someone walking around can use it like a gravitational compass, but I think in the future that will be possible.  I started out just trying to track the sun and moon precisely. But in the last 17 years I have learned a few things.

    High Sampling Rate:  So that arrays of sensors can correlate and solve for direction of the source.  Higher data rates mean more samples and data to characterize and study the source.  Higher rates over time mean a wide range of single pixel and super-resolution and correlation techniques can be applied.

    Arrays: From the beginning, I knew that the current sensors are crude, and the gravitational signals are mixed in with magnetic, thermal, seismic, acoustic, and many other sources of noise. But even though the signals move at the speed of light and gravity, at a million samples per second each microsecond sample spans 299.8 meters. At a gigasample per second that shrinks to 0.2998 meters, or 29.98 centimeters.
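The arithmetic behind those numbers is just the propagation speed divided by the sampling rate; a back-of-the-envelope check:

```python
C = 299_792_458.0  # speed of light and gravity, m/s

def sample_spacing(rate_sps):
    """Distance a gravitational signal travels during one sample interval."""
    return C / rate_sps

# 1 Msps -> ~299.8 m per sample; 1 Gsps -> ~0.2998 m (29.98 cm)
```

That spacing is effectively the raw spatial resolution of a time-of-flight array before any super-resolution or averaging is applied.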

    I spent much of the last couple of decades trying to find sensors that are sensitive to gravity, and to magnetic and electromagnetic fields. My intention is to then correlate with global "magnetic" and "electric" and "acoustic" and "gravitational" and "thermal" sensor networks, and then use statistical methods to sort out and calibrate all the signals.  Most of the noise is a mixture of radiation field (particularly thermal radiation), low frequency magnetic and electric, and many many natural and man made sources of electromagnetic noise.  The movements of air are there too (remember we are looking at things at Msps and Gsps, so all the acoustic and ultrasonic events and air movements are there in principle).

    An earthquake is not an instantaneous explosion. Rather it is the movement of mass in one area that causes movement to spread out in a pretty well known, and "model-able" way. The surface waves are fairly obvious. Cubic voxels that only held air get filled with soil and rock and water. Later they get filled with air again.  Those seismic surface waves and interior waves are tracked precisely by thousands of sensitive and daily more integrated sensor networks. So the gravitational potential changes with time can be calculated.  That signal travels out at the speed of light, and its gradient (the acceleration field) at sensor locations is reliable and can be used to learn more about the mass distributions, speeds, volumes and locations.

    The electron is a wonderful tool. It has electric charge, so it responds to electric fields. It has a magnetic moment, so it responds to changing magnetic fields and gradients. It has mass, so it responds to changes in the gravitational potential and its gradients.  I have come to treat the gravitational and electromagnetic fields as one field. And I can use the properties and behaviors of electrons to sort out and quantify the contributions from each field to the motions and orientations of single electrons, and groups of electrons. The atoms and molecules and particles and gluons and states of matter and the vacuum are there as a backdrop.

    I am spending more and more time looking at and organizing information related to "single electron" and "single photon" devices. It seems rather wonderful that people are getting down to that level of precision. But I finally understood where to put the "magnetic flux quantum".

    If you ask, what is the energy of a photon whose frequency is 1 Hertz, in electron Volts, you get...
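Working that number out, with the exact 2019 SI values of h and e (and the magnetic flux quantum h/2e from the paragraph above coming along for free):

```python
h = 6.62607015e-34      # Planck constant, J/Hz (exact, 2019 SI)
e = 1.602176634e-19     # elementary charge, C (exact, 2019 SI)

E_1Hz_joules = h * 1.0          # energy of a 1 Hz photon, in joules
E_1Hz_eV = E_1Hz_joules / e     # ~4.1357e-15 eV: numerically h in eV/Hz
flux_quantum = h / (2 * e)      # magnetic flux quantum, ~2.0678e-15 Wb
```

So a 1 Hz photon's energy in electron volts is numerically just h expressed in eV/Hz, and the flux quantum is half that value in webers.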


  • Low Cost Gravimeter Video 1 - My first interference picture and thoughts​

    RichardCollins05/01/2019 at 11:31 0 comments

    Here is a video of my first interference ring.  It is a 10 for $4.99 mini laser diode at 650 nm, two AA batteries taped together with a brass nail taped on one end and the laser diode wires to the +/- on the other end. The diode is lying on a book.  I have two 2" round mirrors (50 for $10.99).  The laser hits the edge of the first one, bounces back and hits the one closer to the laser, then goes off and hits a piece of white paper about a foot away.  The two mirrors are about 5 cm apart. I have a $39.99 1600x1200x30 fps Crosstour sports cam that can be plugged in as a webcam, sitting a few inches from the screen.  The first dark ring is about 1 centimeter (cm) across.  I am using APowerSoft Screen Pro to record the video.  [ For the next video, I have a Javascript/html program that I wrote to read the camera frames and collect statistics, display various pictures of the camera stream, and try to determine what all this wonderful data can be used for.  I have to try to use it for monitoring the pendulum, but I just love the statistics in their own right.]
      Oh, I have four 1 inch round magnets for a dollar, and a pack of 12 for a dollar half inch paper binding clips.  I clip the clip on the mirror, squeeze the wires and remove them, stick the mirror and clip on the magnet.  My table happens to be a wrought iron outdoor table so they stick, but they are heavy enough to sit on a table.  I had to learn to touch the mirror and only move the magnets carefully.  I will measure the angles and restrictions for getting this image because it took me a couple of hours to wrap my head around what it was doing.  Finally, I have a stack of sticky notes blocking the laser beams from hitting the screen, so the camera is not washed out.   The lights are out, except for the computer screens.  Looking forward to seeing what I can do with the data stream.  Will put a picture and diagram. Will record some data and play a bit.  I have other things I am doing, but will try to do more later today or tonight.

  • More Measurement and positioning techniques at sub-nanometer resolution

    RichardCollins04/21/2019 at 22:27 0 comments

    I am reviewing laser measurement techniques at nanometer resolution.  I need to go beyond that, but wanted to be sure I have a good foundation.  This article is helpful:

    "A review of nanometer resolution position sensors: Operation and performance"
    by Andrew J. Fleming

    It includes many low cost methods, and encourages the use of statistical measures that cannot be gamed or spun for marketing purposes.  My interest is in the "interferometer" and "encoder" categories, since the sensors need to be insensitive to magnetic fields. When he wrote this in 2013, the sampling rate was in the kilosamples per second (ksps) range.  Now the same methods can use Msps and Gsps low cost solutions.

    [ "capacitive position sensors" "nanometer" ] and [ non contact "atomic force" "nanometer" ] yield many useful efforts, but much of what is reported is heavy on potential markets and dreams, rather than practical low cost commodity sensors you and I can use.

    I definitely need to look more at piezo devices and linear motors.  If my pendulum starts to swing I need to get out of the way, then track closer and closer as it settles down.

    A LOT of these things are the end result of people pushing older amplifier and ADC technologies, and they are about a thousand times more expensive than the new ones. But I cannot ignore anything.  I wish there were a way to help all the older instrument makers to upgrade.  There are still a lot of older instruments that do not use embedded processors, data sharing, modeling or statistics at all.

    This one seems well reasoned and helpful, but one glance at the photos and I know it is well beyond my meager budget.  They do mention "Simple capacitive sensors, such as those used in inexpensive proximity switches or elevator touch switches, are simple devices and in their most basic form could be designed in a high school electronics class.", so I am going to be reading "Capacitive Sensor Operation and Optimization (How Capacitive Sensors Work and How to Use Them Effectively)" to learn from a master craftsman.

    Got to run.  I am reading so much my sight keeps going out. Wish I had some help.

  • Monitoring still smaller masses for a gravimeter - atoms, electrons, photons

    RichardCollins04/21/2019 at 20:20 0 comments

    The mass of the object being monitored keeps shrinking.  The old seismometers had big masses on a spring, or supported by a wire and able to swing.  But as electronic techniques make the measurement of position faster, smaller and more precise, the big masses are still useful.

    That is why I am trying to instrument a simple pendulum with precise sensors.

    But follow the thought all the way down.

    The "atom interferometers" make use of the fact that the wavelength for a particle is h/(mv), where h is Planck's constant, m is the mass of the particle, and v is the velocity of the particle. The larger the mass of the particle being used, the smaller the wavelength.

    h = 6.626070040E-34 Joules/Hertz
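For a feel of the numbers, here is the de Broglie wavelength for a rubidium-87 atom (a common interferometer species) versus an electron at the same speed. The masses are standard values; the 1 cm/s velocity is just an illustrative choice:

```python
h = 6.626070040e-34          # Planck constant, J*s (value quoted above)
AMU = 1.66053906e-27         # atomic mass unit, kg
M_RB87 = 86.909 * AMU        # rubidium-87 atom mass, kg
M_E = 9.1093837e-31          # electron mass, kg

def de_broglie(mass_kg, velocity_ms):
    """Matter wavelength: lambda = h / (m * v)."""
    return h / (mass_kg * velocity_ms)

lam_atom = de_broglie(M_RB87, 0.01)      # Rb-87 at 1 cm/s: ~459 nm
lam_electron = de_broglie(M_E, 0.01)     # electron at 1 cm/s: ~7.3 cm
```

The heavier particle has the much shorter wavelength at the same velocity, which is exactly why atoms are attractive for precision interferometry.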

    But other than some experiments with atom interferometer chips, the cost and complexity of the "atom" methods seems difficult to implement.  I will keep looking, but it is hard to find something I can adapt during the short period of this contest.  Oh, most of the "quantum" experiments with Bose Einstein condensates, including some superconducting configurations, can be treated as simple atom methods.

    My note here goes to still smaller particles, the electrons.  These are used in large quantities in our current (pun intended) devices.  We store them in our capacitors, move them from place to place, modulate them, and find them generally fairly useful.

    But we have not (so far as I have found yet) used the fact that they have mass that is sensitive to the gravitational potential (time dilation effects), and to gravitational potential gradients (acceleration and velocity effects).  The atom interferometers make use of well studied internal states of atoms and molecules to manipulate and monitor them for use in sensing.  But there are very specific interactions of electrons that can be used as well.

    I do not like the term "spin spin", because it doesn't tell me what is happening or what I can do with it.  I like the term "permanent magnetic dipole interaction".  I will put up with "hyperfine interaction" when it is applied to interactions of magnetic dipoles, magnetic quadrupoles or any combination of Schrodinger states of atoms, molecules or particles.  If it can be represented as a "particle" whose field has multipoles, and these interact, I would say "multipole interactions".

    So I am looking at all the electron magnetic dipole interactions to see which phenomena might be "hacked" to make a low cost, small, precise gravimeter.

    One magnetic dipole interaction I have used for a long time is electron-electron magnetic dipole binding, where two electrons bind magnetically to form a stable pair.  I think that is the basis of superconductivity generally, but I am trying to stay on track to solve this gravimeter problem in time. I am pretty sure the same thing is going on with proton-proton magnetic dipole binding in neutron stars and everyday nuclei here on earth.  I use magnetic dipole binding to estimate nuclear reactions where particles with permanent magnetic dipoles bind with nuclear energies.  Sorry, just reminding myself of all the pathways I have investigated over the years.  I want one to help me here.

    So in a radio receiver, the fluctuations in the voltage of the electrons in a capacitor are related to "kT".  But part of that signal is gravity, part is the earth's magnetic field, part seismic, part human noise, and part kinetic fluctuations and phonons in the parts.  We distinguish "kT" in resistors, but it is just the same mix of signals coming through the potentials affecting the particular devices in our circuits called resistors.  I tried to use a "kT" (Johnson noise) sensor recently, but so far the results are inconclusive, because you have to measure "everything" to sort out the source of the noise.
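For scale, the Johnson (kT) noise voltage across a resistor follows the standard sqrt(4 k T R B) formula; the 1 kOhm / 300 K / 10 kHz numbers below are just an example:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K (exact, 2019 SI)

def johnson_vrms(resistance_ohm, temp_k, bandwidth_hz):
    """RMS thermal (Johnson-Nyquist) noise voltage across a resistor."""
    return math.sqrt(4.0 * K_B * temp_k * resistance_ohm * bandwidth_hz)

# 1 kOhm resistor at room temperature over a 10 kHz bandwidth: ~0.41 microvolts
v = johnson_vrms(1e3, 300.0, 1e4)
```

That sub-microvolt floor is the baseline any "kT sensor" signal has to be dug out of, which is why every other noise source has to be measured too.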

    So, in parallel, I am gathering data from magnetometer arrays, seismometer arrays, gravimeter arrays, VLF ELF ULF and all frequencies of electromagnetic sources, power...


  • Trying Optical Feedback Interferometer/Interference Sensors

    RichardCollins04/21/2019 at 19:33 0 comments

    Ben Krasnow has uncovered a simple method for laser measurement.  It will take some effort to convert it to a convenient tool, but he gave enough instructions to do that.  Here is his video: "Laser diode self-mixing: Range-finding and sub-micron vibration measurement."

    He found some laser diodes that have an integrated "internal monitor photodiode" with feedback.  The feedback signal is what he is tracking.  You get "interference" because the reflected wave timing matches the timing of the outgoing wave.  So you should be able to get the same effect with a purpose-built emitter and receiver pair.

    I checked just now, and I think "Self-mixing laser diode vibrometer" (December 2002) is the same approach.

    Looking more deeply, the 2004 paper "Self-Mixing Laser Diode Velocimetry: Application to Vibration and Velocity Measurement" by Lorenzo Scalise, Yanguang Yu, and Guido Giuliani explains:

    "Laser Doppler velocimetry  and laser Doppler vibrometry  are well-known measurement techniques widely used for the precise remote measurement of the velocity of fluids, and for accurate measurement of the displacement, velocity and acceleration of solid objects. With these types of instruments, it is possible to measure the velocity and displacement of the target surface, simply by using a light beam."

    So there is a rich source of useful techniques - once you know to google "velocimetry" "laser" and "internal monitor photodiode".

    Still further checking finds "Microcantilever Displacement Measurement Using a Mechanically Modulated Optical Feedback Interferometer".  By 2016 they had broadened the concept from piezo to cantilever, and cleared up that it is an optical feedback loop.

    Finally, "Optical feedback interferometry" today has over 10,000 results for the exact phrase.

    Now to find one to adapt to my problem and move on.  I just need a data stream.

  • Gravitational Potential Type Detectors

    RichardCollins04/11/2019 at 17:05 0 comments

    This project's immediate goal is low cost acceleration field measurement and imaging techniques. 

    But the gravitational potential changes the rate of clocks, and of nuclear and chemical processes, at the surface of the earth - particularly because of the changing distances and orientations of the sun, moon, earth and many things on earth.  These "direct potential" instruments derive from resonance measurements on the electronic and magnetic states of atoms - cesium and rubidium as a start - for use as precise atomic clocks.  As such, they found that clocks do change their rate at rest because of the absolute value of the gravitational potential (the acceleration is just the gradient of this potential), and they can be "inverted" to report on the potential itself.
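The fractional rate change is tiny but well within reach of modern clocks; near the surface it is g times the height difference over c squared. Standard constants, nothing exotic assumed:

```python
C = 299_792_458.0    # speed of light, m/s
G0 = 9.80665         # standard gravity, m/s^2

def clock_rate_shift(delta_height_m):
    """Fractional frequency shift between two clocks separated vertically
    by delta_height_m near the earth's surface: g * dh / c^2."""
    return G0 * delta_height_m / C ** 2

# One metre of height: ~1.1e-16, resolvable by today's best optical clocks
shift = clock_rate_shift(1.0)
```

The tidal potential of the sun and moon modulates a clock's rate at a comparable fractional level, which is what makes "inverting" a good clock into a potential sensor plausible.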

    The Mossbauer effect (very precise accounting for recoil energies during state changes in atoms and molecules with narrow linewidths) can be used to measure the gravitational redshift, which depends on an integral of the potential between two points.  Mossbauer's innovation was to find materials where the lattice surrounding the emitting (and absorbing) atoms (molecules) could absorb the recoil energy on the timescale in which the state change occurs.  That is difficult because finding and characterizing materials to have precise atomic and nuclear properties is hard for bulk materials.

    With laser tools and fast ADCs and computing, it should be possible to use a wide variety of paired emissions and absorptions where the recoil can be tracked, accounted for, and compensated for.  There should be cyclotron versions and microwave plasma versions as well.  I am reviewing all those methods and possibilities, and will report later, or as I have updates.

    The importance of the gravitational potential detectors is that there are broad classes of instruments and experiments being proposed, just started, or going on, sensitive enough that they need to account for (1) earth tides, (2) small variations in station location, and (3) changes in atomic and molecular rates, frequencies, and energies due to the changing potential of the sun and moon relative to the station. The earth itself is changing shape, and its potential changes are tracked by the International Center for Global Gravity Field Models (ICGEM), along with its temporal variations.

    This last affects many Bose Einstein condensate, quantum, superfluid, plasma and nuclear experiments.  I am trying to lay out the general rules, but my advice is that if you are saying "nano" and starting to whisper "pico" and "femto" and "atto", you need to check your local gravitational potential and acceleration field "weather report".  :)

    Now it is a general rule of thumb that any system that has to account for these types of changes can invert its correction models to become a reporting node in a global network of sensors.  Sensors monitoring the positions of the sun and moon, and the shape of and gravitational events on the surface of the earth and in the solar system.  So if someone's G experiment is giving them fairly wide variation from place to place, and time to time, it might well be they have not accounted for the sun, moon and earth portions of the potential and its gradient.

    I do not know exactly how the signal is initiated and travels from where the change is made to the "direct potential" or gradient sensors.  For a pendulum swinging nearby, used as a reference source; or keeping video tracking and identification sensors on the poles of the nearby highway to identify them and correlate with gravity signals; or using 3D video and imaging to get the shape of the ocean to calculate its gravity signal at your site - all of these are quite complex.  Easy to do, but with noise, and careful work required. The signals travel at the speed of (light and gravity), so you need to sample at rates appropriate to your needs.  If you want to use a gravity...


  • Glasgow MEMS Gravimeter making progress

    RichardCollins04/11/2019 at 01:29 0 comments

    This MEMS gravimeter at the University of Glasgow uses an interesting "optical shadow" detector.  And they have added tilt and temperature monitoring and compensation. They still have not calibrated all three axes, nor are they doing routine sun-moon calibration, but they seem to be making progress.

    A High Stability Optical Shadow Sensor with Applications for Precision Accelerometers

    Field tests of a portable MEMS gravimeter

  • MagQuest $1.2 Million Dollar Contest to Improve Global Magnetic Measurements

    RichardCollins04/07/2019 at 23:19 0 comments

    The National Geospatial-Intelligence Agency is looking for people to find novel approaches to gathering and utilizing magnetic data in conjunction with their development and support of the World Magnetic Model (WMM).  They rely heavily on the European Space Agency "Swarm" satellites to measure and calibrate the World Magnetic Model.  But they would like to use sensor data fusion to improve and verify the space-based sensor network.  So anything is fair game.

    Personally, I think they should at least upgrade the existing magnetic networks with decent ADCs and continuous high sensitivity monitoring arrays, and get their Internet data sharing up to current practices at least.

    Here is what they say about the WMM:  "The WMM is embedded in thousands of systems. More than a billion smartphone users depend on the WMM to point them in the right direction when they use mobile navigation apps. Drivers rely on the WMM to power the compasses in their cars.  The WMM is also critical for military and commercial uses around the world. Among other applications, it supports navigation and attitude determination for submarines, satellites, and aircraft, while also informing operational logistics like the numbering of runways."

    So they have an existing user base, but apparently are not collecting ground truth from them.  If you could hack the cell phone to provide ground truth, or plug millions of low cost sensors into the Internet so they can be used to constrain the solutions when they try to build their model, it will help them with their current plans.  And, presumably, pay for you to continue to help them.

    But it might be that there are other ways to provide better solutions to the fundamental problems and needs of their clients.  Perhaps there are better ways to solve for orientation and location using existing GPS, or by upgrading selected cell phones.  Or by letting people put in local nodes to provide very precise updates on magnetic field variations.  I can think of about 30 things to try. But, my main concern is they also upgrade sampling rates so whatever sensors are created or upgraded, can be used for magnetic time of flight imaging arrays.

billybowden wrote 08/31/2022 at 06:48 point

For a high school student who was interested in something a bit simpler to start with, do you have a recommendation for an accelerometer that might be suitable for measuring the difference in gravity between one degree of latitude for example? I had a look on sparkfun but there's so many factors - range, sensitivity, resolution - that it's difficult to know which to choose.
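For a sense of scale: the latitude dependence of surface gravity can be estimated with the standard WGS84 normal-gravity (Somigliana) formula, and the difference across one degree of latitude at mid-latitudes is roughly 9e-4 m/s^2 (about 90 mGal), which sets the resolution the accelerometer would need.  A sketch, not tied to any particular sensor:

```python
import math

def normal_gravity(lat_deg):
    """WGS84 Somigliana formula: normal gravity at sea level, in m/s^2."""
    s2 = math.sin(math.radians(lat_deg)) ** 2
    return (9.7803253359 * (1.0 + 0.00193185265241 * s2)
            / math.sqrt(1.0 - 0.00669437999013 * s2))

# Gravity difference across one degree of latitude, 45 deg to 46 deg:
dg = normal_gravity(46.0) - normal_gravity(45.0)
print(f"{dg:.3e} m/s^2  ({dg * 1e5:.1f} mGal)")   # ~9e-4 m/s^2, ~90 mGal
```

Most hobby MEMS accelerometers resolve only milli-g (~1e-2 m/s^2), so this signal is near or below their noise floor; it takes long averaging or a higher-grade part.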


RichardCollins wrote 10/08/2019 at 13:09 point

@Nelson Phillips 

I am going through the software defined radios, and the many related technical groups that use low noise amplifiers for electronic signals: oscilloscopes, software defined radios, most scientific and technical sensors.  The "audio" ADCs are 24 bit at 96 ksps, and the 32 bit 192 ksps and 384 ksps parts are dropping in price.  The $30-60 SDRs are mostly capable of 3 Msps now, and the newer $150-300 ones are in the 10 Msps to 50 Msps range.  I am tracking all the progress. So taking an electrical or optical data stream from a sensor of any kind, amplifying it, and fitting it precisely to the bit size and sampling rate of low cost ADCs is maturing into a science and not a hack.

The atom interferometer alternatives to LIGO (laser optical interferometry) are three to four orders of magnitude more precise. They suffer from trying to use atoms for everything.  But they can be interfaced, with electrical and optical low cost methods, to ADCs and into the standard processing chain - storage, sharing, analysis, visualization, research, modeling, development groups, small manufacturing (where needed) and global sharing.

I am looking more closely at electron interferometry, and at the associated super-resolution methods from every group I could find that has tackled using massive amounts of data to characterize, magnify, illuminate, understand and utilize small and complex processes.  So if we need atom by atom assembly, some groups are doing that.  If we need X-ray imaging and subassemblies, there are people doing that.  I suppose my innovation is taking 50 years to learn every scientific and technical language, and spending 21 years studying internet groups to see what wastes most of their time.

Yes, small clusters of sensors at a given location to monitor volcanoes perhaps, but sharing their data globally for earthquake early warning, and sharing that to image Venus and the sun's interior.  I mapped out many of the existing and emerging sensor networks. They all share the same methods, but not all know what is known.

I am starting to talk to manufacturers about making small arrays.  I just bought a "vibration sensor" from Analog Devices that is a higher sampling rate, three axis accelerometer.  I asked them to let me show them how to improve it to be used as a gravimeter.  They charge $600 because they do not know what they are aiming for.  They have ten markets, but know none of them well.  I do not know if they will listen.  I just work as hard as I can to do what comes to me, as well as I can, in the hope that is what I should be doing.  I just know my ideas about gravity from 15 years ago are starting to find confirmation in the sensor data I am collecting. And, my no regrets policy: if I do a good job at everything I do, I do not regret it if I do not find the specific things I wanted to find.  It is the truth we are seeking, not fame or fortune, or even being right.

Sorry,  I think about a lot of things as I gather and organize information about groups, technologies and how things fit together.

If I get some of these devices in for testing, I will try to post my notes and comments online somewhere.  I might have to use some of the "expensive" $500 hacks to get it down to $10. But there are a LOT of $10-1000 problems in the world right now that good amplifiers, ADCs, data storage and sharing can help.



Comedicles wrote 10/07/2019 at 16:43 point

I have an old Hughes Research paper by Robert Forward that describes sub-angstrom measurements in interferometers. It is from work on space based detectors for gravity waves, but the basic idea and setup is actually rather simple. I can scan it if you like.


RichardCollins wrote 10/08/2019 at 12:41 point

@Comedicles   I started my gravity research based on Robert Forward's dissertation on the transmission and reception of gravity signals.  He is the one who laid the foundation for the design of LIGO, I think.  He worked for Hughes but was involved in gravity research.

I would like to read that.

LIGO is the Laser Interferometer Gravitational-Wave Observatory. The newer atom interferometric methods can basically allow that 4 km laser (optical) interferometer to be replaced with a room sized, perhaps desktop, system.  I am looking for low cost systems that can be used for earth-based gravity scanning and imaging, AND for deep gravitational astronomy across ALL the frequencies.

The gravitational frequencies cover the same range as electromagnetism. And LIGO is only looking at a tiny, tiny piece and only cares about distant things.  I want to see the tiny details of the earth, the sun, the oceans, the atmosphere, Mars, Venus, the planets and the asteroids - in 3D fly-through videos and controlled, carefully calibrated simulations. That is possible with the atom interferometer gravitational potential detectors, but even more so with the gravimeter based ones (measuring the gravitational potential gradient).  I cannot bet on any one right now, so I am following and encouraging all of them.

I have no resources to work in space.  I am 70 years old, and even if I can help get people to work on gravitational engineering seriously, I doubt it will be ready to get my old bones into space.  So I am working to encourage the development of gravitational transmission, reception, imaging and detection here on earth.  I am also working on profiling the work needed to get practical gravity control.  If I tell the computer where to move an object and it does, with no wires or solid things touching - that is good enough for me.  If we need higher energy density field sources, there are very direct ways to organize global groups to tackle that.  If there are people ready to start on warp drives, they can at least do the practical engineering modeling to identify what needs to be done.  In my world, most anything is possible.  It took me my whole life and I work very very very hard at it.  But in spite of exhaustion, I am encouraged.


RichardCollins wrote 08/05/2019 at 20:29 point


I have been working on a number of related problems.  Once I determined that "gravitational" signals can be considered part of the electromagnetic spectrum - or rather, that electromagnetic waves propagate in an underlying dense gravitational potential - I wanted to work out the theory and underlying processes.  So I have been a bit diverted for the last couple of months to see the consequences of combining gravity and electromagnetism in this way.  Today I was working on neutrino oscillations. I had to come up with a working model of mass and inertia that takes care of most situations.  I can do pure theory, but I am aiming at gravitational engineering - I want to build things, or give people tools to calculate and build real devices.

Analog Devices sells Gsps (giga samples per second) and Msps (mega samples per second) ADCs. The giga parts are about $1500 and the mega about $150.  These are digital, so I say "samples per second" or "frames per second" for digital data streams.  But I looked more closely at the Bose-Einstein condensate gravimeter designs. They are using analog methods for correlation, so for them I would say gigahertz or GHz.  They talk about detection at optical frequencies.  When I looked at that still more closely, I found that in order to image the surfaces of stars, you can use infrared interferometry with arrays of optical detectors around the world.  So I had to change my thinking about what is possible.

ALL of these gravimeters are simply very sensitive accelerometers.  And an accelerometer simply tracks the position of ANY mass and calculates the second derivative with respect to time.  The oldest designs use rigid masses; today people use smaller and smaller masses.  So the pendulum, cantilever, MEMS, levitated mass and similar devices are improving on the oldest approach.  Optical interferometry allows measuring the position in very tiny increments.

The atom interferometer uses the much smaller wavelength of atoms to measure the distances of the rigid masses still more precisely.  If you take nanometer distances and differentiate twice with respect to time, you get nanometers per second squared.  But the atom interferometers and a host of new super-resolution methods allow going down to the pico and femto ranges.  So I have been playing catch-up, simply locating and identifying ALL the new technologies.
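That position-to-acceleration chain is just numerical differentiation.  A minimal sketch (the 1 kHz readout rate and 1 nm, 5 Hz displacement are made-up numbers, not any particular instrument), using a central second difference:

```python
import numpy as np

fs = 1000.0                               # readout rate, Hz (assumed)
t = np.arange(0.0, 1.0, 1.0 / fs)
x = 1e-9 * np.sin(2 * np.pi * 5.0 * t)    # synthetic 1 nm swing at 5 Hz

# Central second difference: a[i] ~ (x[i] - 2*x[i+1] + x[i+2]) / dt^2
a = np.diff(x, 2) * fs**2                 # acceleration, m/s^2

# Peak acceleration of A*sin(w*t) is A*w^2 ~ 9.87e-7 m/s^2 here,
# i.e. about 0.1 mGal from nanometer-level position tracking.
print(a.max())
```

In a real instrument the double differentiation also amplifies high frequency position noise by w^2, which is why the sampling rate, ADC bit depth and filtering choices matter so much.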

I spent about a month since I last wrote going through the LIGO data and methods.  I consider it a large optical interferometer whose Fabry-Perot cavity length allows them to measure changes in the gravitational potential down to sub-femtometer levels.  They are a "direct potential detector", rather than an accelerometer (which measures the gradient of the potential).  There are plans to use atom interferometry to replace those monster devices with something compact.  I did not have time to work on that.

I did try to convince LIGO to share their earth-based data and not just the highly fiddled strain data.  The global networks of gravimeters, seismometers, magnetometers and other similar devices are mostly in the nano range right now.  And LIGO's subsystem that tracks and suppresses what they call "Newtonian noise" from the earth and solar system is not as capable as the global networks. But the global networks need next generation detectors, and LIGO has sucked up some of the best and brightest - even if it locks them into a mindset that only allows them to work on distant black hole and neutron star events. They disdain anything to do with the earth, but most of the real problems and practical uses of gravitational imaging are here on earth.  So I would like LIGO to help the global accelerometer/gravimeter arrays, and in turn, let the global arrays help them refine and improve their early detection of seismic waves so they don't have to shut down their experiment every time there is a little or big shake.

But I also tried to process the LIGO data as a 16,384 samples per second detector array, to use their three instruments to scan the interior of the earth and sun.  At 16,384 sps, and using the speed of light and gravity as 2.99792458E8 meters/second, that is about 18.298 kilometer resolution.  Breaking the earth into that many voxels gives a starting point for 3D imaging. The problem I ran into is that the LIGO software sucks.  And the good data is in the 95% of their system they do not share.  They could probably run that earth-based gravitational signal suppression system as a 10 Msps detector in its own right, get good data on the earth, and have a much easier time characterizing, tracking and anticipating the exact sources of the local gravitational signals.  Then do a better job looking at the universe.
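The 18.298 km figure is just the one-sample light-travel distance, c divided by the sampling rate.  A quick sketch of that arithmetic:

```python
c = 2.99792458e8            # speed of light (and, assumed here, of gravity), m/s

def resolution_km(sps):
    """Distance light travels in one sample period: the crudest voxel
    size for a time-of-flight array sampling at `sps` samples/second."""
    return c / sps / 1000.0

for sps in (1.0, 16384.0, 1e6):
    print(f"{sps:>10.0f} sps -> {resolution_km(sps):10.3f} km")
# 1 sps is a light-second (~3.0e5 km, most of the way to the moon),
# 16384 sps gives 18.298 km, and 1 Msps gives 0.300 km.
```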

I spent several weeks looking at all the solar observatories and solar models, to see if I could use the sun to calibrate the earth-based magnetometer arrays.  The problem is that the sensitive gravimeters/accelerometers/seismometers all pick up magnetic variations, and most nano, pico and femto experiments on earth need to be corrected for the changing gravitational potential at their location.

Sorry, I was just writing out some notes to remind myself of all the things I have been doing, and trying to do.  It could be rather exciting, but I am too tired, and have no time to celebrate what otherwise would be big steps.

Rather than putting several devices into local arrays to increase bandwidth using digital techniques, I have been looking at using analog correlators to compare time of flight signals.  It DOES make sense to use several detectors, like an array of antennas used for electromagnetic scanning (radio astronomy, radar, lidar, etc).  My problem is that I can only work 12 to 18 hours a day, seven days a week.

To simplify things: take ANY flow of electrons, any change in the charge of capacitors or capacitor arrays, ANY electrons in antennas.  Their second derivatives are related to the acceleration of the electrons, which is an acceleration field.  Say an acceleration field hits a regular electromagnetic antenna.  If you are looking at averages and low sampling rates, you cannot tell if the acceleration is from electromagnetic sources or from gravitational sources. But with time of flight and imaging, you can determine if the signal is from some astronomical or earth-based source where you can independently measure the charge, the magnetic field, the mass distribution and the changes in mass distribution.

If you have a mass ejection on the sun, it is going to change the gravitational potential there; then, with a delay of about 499 seconds, the changes will diffuse to the earth-based detector arrays.  If they are operating at 1 sps, a light second is out to the moon.  At 16,384 sps they can resolve to 18.3 km.  At 1 Msps/MHz they can get down to 300 meters. And the array can determine direction.  The reason I looked at the solar data is to see what correlations are possible.  So the images of the solar ejection can be processed to estimate the EXACT timing and character of the signal that will arrive at earth.  The correlation gets easier, because you are not correlating the weak signals at the earth with each other, but rather the detected signals at the earth with the strong and clear signals from the solar optical and electromagnetic arrays watching the sun. The only things they need are better global coordination of data sharing, better policies on formats, and a clearer policy on what I call "real accessibility". And they could use some faster algorithms for their models of spicules, granules and other observables on the sun that can be mined for gravitational and low frequency magnetic signals.
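A minimal sketch of that template-correlation idea on synthetic data (the sampling rate, amplitudes and noise levels are all made up): cross-correlating a noisy "detector" stream against a clean "solar" template recovers the 499 second delay even when the delayed copy is ten times weaker than the local noise.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 10.0                               # detector sampling rate, sps (assumed)
n = 20000
template = rng.normal(size=n)           # stand-in for the strong solar signal

delay_s = 499.0                         # Sun-Earth travel time at c
shift = int(delay_s * fs)
earth = np.zeros(n + shift)
earth[shift:] = 0.1 * template          # weak, delayed copy at the detector
earth += rng.normal(size=earth.size)    # buried in local noise

# Cross-correlate the noisy detector stream against the clean solar
# template; the lag of the correlation peak recovers the delay.
corr = np.correlate(earth, template, mode="valid")
lag = int(np.argmax(corr))
print(lag / fs)                         # ~499.0 seconds
```

This is why correlating against the strong remote template beats correlating the weak earth-side streams against each other: the template term enters the product at full amplitude.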

I haven't completely answered your questions.  For Hackaday.IO, I cannot be selfish.  So I have looked at how to improve coordination on the fundamental tools needed for the design of anything.  The weakness of HackaDay.IO is that most people do not know exactly why they are doing things.  It gets couched in statements like "Isn't this cool?" or "I took something old and applied new materials and tools to look at it in a new way."  Ultimately I think most people want to feel a sense of worth - either to contribute something worthwhile to help others, just because it is a good thing, or to share a sense of wonder and delight in doing a good job.  Some are looking for a way to make a living, or have ideas they feel would help MANY people, and they need the resources to make that happen.  I listened to a chat on the manufacturing and sale of project devices from HackaDay.IO.  What struck me is there is no way to carry on a long term discussion about what is needed.  I could use help on ADCs, radio astronomy type arrays, and many other things.  Many of the designers have holes in their experience. To do good work and create good things takes groups.  Right now everyone here is working alone, or with haphazard ad hoc groups.  One could be much more deliberate about the whole thing.

For the Internet Foundation, which at 70 years of age is my full time job, I review global topic networks.  The ones I have been looking at most closely are those that require the close cooperation of tens of thousands, hundreds of thousands, up to hundreds of millions of people working for years or decades toward their solution.  Such global cooperation IS possible.  I love gravitational problems and have worked at them steadily for at least 40 years.  But the real benefit I might offer is to help large groups work together effectively. So I spend 40 hours a week on Internet policies, only another 30 or 40 hours on gravity, and then another 20 or 30 hours helping people find birth parents using Ancestry DNA.

I respond to questions.  The more specific your questions, the more careful and specific my answers.

If we could get gravitational imaging arrays - using any combination of detector technologies - they would need calibration, and there are people who have tried to build gravitational sources, or find natural ones, that can be used to calibrate the arrays.  I think I am right that every electromagnetic sensor can also be used to measure changes in acceleration fields, and an acceleration field does not care whether the source is a random array of atoms and molecules in a rock or a purpose-built material designed to generate unique signals in an acceleration field detector device.

So what should be possible right now?  Tracking and imaging seismic waves from earthquakes; tracking and imaging ocean waves and currents, atmospheric waves and currents, deep earth motions and currents, and solar interior motions and currents; full sky scans of low frequency signals from the Milky Way; better imaging of magnetic fields.  If you can think of it, it is probably possible.

For 21 years (I started the Internet Foundation in July 1998) I have been reviewing topic groups on the internet - mostly math and science, computers, modeling and design, and global issues, because that is my background.  What I found is that if I can think of something, there are almost always ten thousand people who have already been working on it quietly, determinedly, and independently in most cases.  So I have gained great humility, because everything I might do or propose, someone has probably done better than me.  But that is hopeful too - because for ANYTHING you might want to do, there are probably many people who have already tried all or portions of it.

My "come one, come all" project is somewhat tongue in cheek.  But it is deadly serious as well, because, as HackaDay people show every day, if you can think of it, you can probably do it.  Things that used to cost millions can now be done with scraps and left-overs or hardware hacks.

Oh, I looked into that warp drive.  I think when we understand the relations between gravity, electromagnetism, mass, energy and the universal potential better, that sort of thing can be routine "gravitational engineering". I would rather try it and fail than not try at all. Working a difficult problem to death and then failing, if it is documented well, can teach others what to do better.  If you saw the massive waste of human time on duplicated efforts that I do, you would not hesitate at something that only takes a few hundred thousand man-years.

I think I found a way to use super-resolution methods for my pendulum.  But I will probably use a "trifilar pendulum", household items, and some cheap laser pointers.  I wish I knew how to build a very stable power supply for R G B IR laser pointer diodes; they have to run for months at a time.  I need to monitor tiny power and laser power fluctuations to be sure to separate them from high frequency electromagnetic and gravitational noise.  So much to do, and so much that can be done.
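One common super-resolution trick for a laser-pointer readout is intensity-weighted centroiding of the spot on a pixel array, which locates the spot to a small fraction of a pixel.  A sketch with a synthetic spot (the 128-pixel line, spot width and noise level are assumptions, not measurements):

```python
import numpy as np

pixels = np.arange(128.0)                  # 1-D sensor line (assumed geometry)
true_center = 63.37                        # spot center, in pixel units
spot = np.exp(-0.5 * ((pixels - true_center) / 4.0) ** 2)         # Gaussian spot
spot += 0.01 * np.random.default_rng(1).normal(size=pixels.size)  # readout noise

# Window around the brightest pixel so background noise far from the
# spot cannot pull the centroid.
peak = int(np.argmax(spot))
w = slice(max(peak - 12, 0), peak + 13)

# Intensity-weighted centroid: sub-pixel position estimate.
est = float(np.sum(pixels[w] * spot[w]) / np.sum(spot[w]))
print(est)                                 # close to 63.37
```

With a stable spot and long averaging, the same idea pushes well below a hundredth of a pixel, which is where the laser power stability mentioned above starts to matter.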

Richard Collins, The Internet Foundation


Nelson Phillips wrote 08/05/2019 at 11:13 point

Super interesting project. 

How is the project going? 

How are you planning to sample at such a rate? The sampling rate presents many problems - is the main problem the front end ADC, or a discrete logging device?

Have you considered multiple sensors in one location to increase the bandwidth? Kind of in the manner this paper presents.


RichardCollins wrote 10/11/2019 at 12:34 point

@Nelson Phillips I am having to be very creative to find ways to store the data efficiently, or to compress it without loss. If I can manage not to lose any information, then I can trade sampling rate for resolution and back.  If you use statistical techniques, the resolution goes as sqrt(N); but if you use lossless methods, it goes as N.  That seems to be a universal rule.  The people who have been solving that partly use direct memory access (DMA) to give signal processing and FPGA systems access to local high speed, large memory.
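The sqrt(N) part of that rule is easy to demonstrate: averaging N independent samples of the same quantity shrinks the noise by sqrt(N).  A quick sketch:

```python
import numpy as np

rng = np.random.default_rng(42)
for n in (1, 100, 10000):
    # 500 independent trials of averaging n unit-variance noise samples.
    means = rng.normal(0.0, 1.0, size=(500, n)).mean(axis=1)
    print(n, means.std())   # scatter ~ 1/sqrt(n): about 1.0, 0.1, 0.01
```

The catch is that this only holds while the N samples are statistically independent, which connects to the clustered-seismometer point below: identical streams average to the same stream and buy nothing.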

That paper is a good one.  They save much of the important information from the image with their algorithm - kind of like "let's keep track of the red ones, they are more important".  If I apply my "lossless" rule, you could get the same thing by using larger buffers and more memory local to the sensor.

I am kind of going to the extreme.  I want all my sensors to remember everything they have ever gathered, apply robust statistical summary algorithms to pull out most of the actual information for users in a concise form, and then use the raw data for global correlations.  I wish I could show you the hundreds of tests I have done with different kinds of data.

I was looking at some seismometers clustered in one area yesterday.  They actually lost out by doing that, since they had the sensors too close together.  The thing they wanted to watch was so large, and the sensors so close together and all seeing identically the same thing, that they lost the advantage of the array - which is to multiply the information.  Since they had identical information, they were averaging, not getting N unique streams.  So the sensors have to be calibrated by a common standard, but placed in unique and independent locations.

If you have a thousand accelerometers all piled on one acre, you will only learn about the things nearby, unless you have precise control of the sampling times and very precise tracking of the signal levels.  You can use close arrays, but they have to be sensitive enough to give unique signals.  I try to push sensors until they give essentially pure noise, but keep track of the control signals.  It becomes a classic null measurement.  But the purpose is to force each sensor to match its local environment as far as that can be pushed.  Not to get too zen about it, but if you are measuring gravity you want the sensor to become "one" with the field.  It loses its own identity and simply follows the field. Same with magnetic, position, velocity, acceleration, electric, charge, orientation and the thousands of other measurements we have come up with.

I wish I could go through that paper's data myself.  But I am getting SO TIRED of trying to learn thousands of different private languages, formats, development environments and private settings - just to play with the data and algorithms.  I spend ten hours getting the data, and half an hour checking the algorithm and improving it.

Thanks for the link to that one.  It is really good.  I am just getting worn out.  I spent most of the night looking at MEMS components that I can hack to get more sensitive analog signals to start with.  For ALL the analog sensors, I want to use SDRs of my own choice, not someone's cheap modulation, ADC and data handling choices.

