A hardware sampler based on the Raspberry Pi running bare metal (so no Linux)

A hardware sampler implementation built on Raspberry Pi hardware.

So I'm very into samplers. I create sample-based music; if you're curious about that, you can check it out on Spotify:

I have absolutely loved samplers since I was a kid, but couldn't afford one. For some reason, I was especially in love with the ones from Akai. I remember gazing at an image of an Akai s6000 in the sales catalogue of a large music shop, dreaming about all the crazy things it could do, even though I did not really have a good understanding of what a sampler actually was back then. It just looked like an interesting, futuristic, huge machine with a Game Boy-like removable device.

Skip ahead to 2019. I finally acquired one when the old audio post-production studio was being broken down at the television network where I worked at the time.

It was in fairly good condition; it just needed some external cleaning. Checking the contents of the internal hard drive, I found the remnants of sounds used by the "Who Wants To Be A Millionaire" show. It's been a long time since that last aired in my country, so the machine must have been sitting there for a while. I ordered a SCSI2SD kit, removed the internal SCSI hard drive and installed the SCSI2SD kit in its place, leaving the original floppy drive intact. Great: I now had a fully working, future-proofed Akai s6000 to play with. It's great and all, but I just wish the firmware source code had leaked somewhere so I could start making changes to it.

My main sampler is an Akai MPC1000. It checks all the boxes I want from a sampler, is easy to operate, and has a very raw processing feel to it. When pitching samples there seems to be no low-pass filter, which is something I actually like a lot. I even own two of them, just in case one breaks down. But it's not completely perfect. While JJOS is famous for adding a lot of features to the stock firmware, it also adds things I don't really need, and in my opinion it makes the UI more complex.

In Utopia, there would be a sampler that combines the s6000's friendly UI and feature set with all the strengths of the MPC1000.

That's when I started thinking: as a programmer, if I really put my mind to it, I could build one myself these days with all the information out there. So I started researching hardware. First I looked at DSPs. Full disclosure: I suck at maths. I don't even understand simple concepts. But I'm a good handyman and can puzzle things together, so ideally I would find a base to work with and puzzle my way through. I read some more, and finally found someone who said that the Raspberry Pi's ARM core is probably powerful enough to blow these old DSPs out of the water. A while back I experimented with programming a bare metal MIDI processor on the Pi 3 from scratch using David Welch's tutorials in C, so I had already gained some knowledge on how to do it. The Pi also has a ton of working memory, unlike MCUs such as the Arduino, AVR, PIC or even the ESP32. While researching I also came across Circle, a bare metal framework for the Raspberry Pi series that already has a lot of awesome work put into it, and figured it would be the right choice for my project. I'm not really familiar with C++, but I have a lot of experience with object-oriented languages such as C# and Java, so the object concept should not be an obstacle.

We're going to take things slow, step by step. This has the potential to be one of those huge-mountain-of-work projects that never gets finished, so I need to keep things slow and simple if I don't want to discourage myself.

I imagine the steps would be like this:

1. Set up a bare metal compiler environment and workflow with Circle on macOS.

2. Create a simple program that plays a sample through the headphone jack.

3. Find out how to mix samples for playing back multiple overlapping voices.

4. Implement classes for voices and mixing.

5. ...
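Steps 3 and 4 mostly come down to summing the current sample of each active voice into a wider accumulator and clipping the result back into 16-bit range. A minimal sketch in C of what that mixing could look like (the function and names are mine, not Circle's actual API):

```c
#include <stdint.h>

// Mix the current samples of several voices into one output sample.
// A 32-bit accumulator avoids overflow while summing; the result is
// then hard-clipped back into the signed 16-bit range.
int16_t mix_voices(const int16_t *samples, int count)
{
    int32_t acc = 0;
    for (int i = 0; i < count; i++)
        acc += samples[i];
    if (acc > 32767)  acc = 32767;   // clip high
    if (acc < -32768) acc = -32768;  // clip low
    return (int16_t)acc;
}
```

In a real engine each voice would supply its sample from its own playback position; here the per-voice values are passed in directly to keep the sketch small.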

Like I said, I suck at maths, so I'm going the...


  • Getting back into the project slowly

    Nick Verlinden, 03/11/2021 at 08:20, 0 comments

    It's been a long time since the last log. A lot has happened in that time, and there was no room for openSampler in my schedule. I'm typing this on the train to work, so forgive me if this log contains more typing errors than usual. It makes me think about just how bad mobile phone keyboards are. I keep making the same mistakes because of muscle memory; surely there must be some AI algorithm to assist me in this day and age...

    Anyway, I'm going to keep this short; I only have about 20 minutes to finish this message. Long story short, in 2020 I changed from full self-employment to 80% employment doing my all-time dream job at Cinematek, which is the Royal Belgian Film Archive. I was lucky enough to be hired as an expert in film restoration and digitization. It's fantastic working with film, and also very satisfying to clean it up digitally. I'm sure many would find it one of the most boring jobs imaginable, cleaning dust all day, but to me it's heaven. I'm also buying a house right now, at about the worst possible time. The housing market in Belgium is flooded with prospective buyers, and stories are floating around that people are buying houses after just looking at the pictures, without visiting them first. That's crazy. My motivation is different, but I managed to find something I really like. So this is taking up all my time for the moment. Well, not all of it, but I seem to have difficulty focusing on multiple big projects. I'm constantly thinking about the house project, and sometimes about openSampler, but I don't feel like working on openSampler unless I know I can dedicate some real time to it, and not just 10 minutes here, 15 minutes there.

    So, what about the project? Last time I think I talked about getting it to work on the TWatch; well, it all worked out, and it's working well. Yesterday I finally spent some time setting up my dev environment on my new M1 MacBook. Yes, I bought one, but only after witnessing one at Cinematek, where I tested editing 10-bit h265 6K anamorphic footage from my GH5, and it worked like a charm. I was so amazed that I bought a MacBook Air, because my old one could not even play back the videos in QuickTime or VLC. Anyway, the M1 comes with its problems. I installed the Parallels Desktop technical preview and the arm64 version of Ubuntu. All works great. Then I installed some dev libraries, nothing special there. Then I got to the esp-idf. It's not officially supported on ARM (yet; they seem to be working on it), so I had to do some tinkering to get it to work. Basically it came down to: compile crosstool-NG myself, register the path, modify the esp-idf script to bypass the xtensa installation (because it would block you from continuing, as there are no precompiled binaries for ARM), and then also modify it to skip the checks for the xtensa compilers for the S2 and S3 versions, which I don't need. After that it worked great and I got it to compile. Then I tried to flash the watch, but I ran into some permission problems and had to chown the /dev/ttyUSB0 device before it would work. After that, eureka! By the way, make sure your VM has more than 2 GB of RAM, because otherwise the xtensa gcc will not compile and will fail at a step like installing the final gcc binary, with a vague message about the process being killed. I changed my VM to 4 GB, and all is fine.

    I also did some housecleaning in the project and made sure everything is nicely separated. Now I need to implement mouse and keyboard events, and then the UI system should be usable. I can't wait until this relatively boring task is done, but I'm almost there.

    I'm also going to try to get the gcc compilers for Circle working on the Ubuntu ARM VM, but I don't think that will be a problem; it looks like it already has good ARM support.

    I've arrived in Brussels. Talk to you soon!

  • Getting the engine to work on the ESP32

    Nick Verlinden, 11/14/2020 at 16:22, 1 comment

    I just couldn't help myself. Remember the lengthy talk we had about perfection and OCD in one of the previous logs? Well, the demon strikes again. I was busy with work, which had me working on a modular casing system that is about 90% complete. I have to wait for some parts to come in from China, so that's set aside for now. Then I got sidetracked because I found this Hyperpixel 4 display, which uses up all the pins on a Pi, so I decided to rewrite my graphics layer to allow for remote commands over i2c. And while at it, I decided to do it in plain C, because then I can reuse my graphics layer on the ESP32. I know you can use C++ on the ESP; I just prefer not to.

    I just bought this brilliant watch, the Lilygo TWatch 2020. The graphics layer is also about 90% finished and ready to be used. Then I thought: gee, I have all these ESP32s lying around with nothing to do with them. Wouldn't it be awesome if I could make some kind of micro sampler reusing my existing (simplistic) audio engine? My Pi with touchscreen is a bit bulky to develop on and take with me on the road, so I decided to go for it. First I made it work on the internal DAC, then using an external DAC. Then, just for fun, I tried to get it to run on the watch. After some fiddling I realised that all the examples for this watch are written with the Arduino framework. I'm using esp-idf, so I had to figure some things out. I discovered the reason it didn't work on the watch: the DAC is not powered on when the ESP32 boots. Makes sense; it's a watch, and power management is very important there. You need to talk to the power management unit in order to power it on. I traced the code required to do that from several examples online. Once that was done, it was working on the watch as well. You would think the watch would use the internal DAC to drive a speaker, but they decided to go with an all-in-one i2s codec + amplifier package.

    Oh boy, that feeling of accomplishment... You see, I fell down this rabbit hole of side-quests, but I gained a lot of experience in doing so ;-). I also learned that I could get about 20 mono voices with pitching and gain to work on a single core of the ESP32. Impressive; I thought it was going to be a lot less.

    Anyway, the plan is now to work more on the engine, so I hope to see some progress there. I'm also still learning how to properly structure and organise a project targeting multiple hardware platforms, so unfortunately there will be a lot of refactoring along the way. But a tidy house is a nice house :-)

    Hope to talk to you soon.

  • Hyperpixel 4 Warning

    Nick Verlinden, 09/30/2020 at 20:03, 0 comments

    A short warning for all those interested in using the HyperPixel 4.0 Square display: it uses ALL the GPIO pins. In my experience, not even the UART is accessible anymore. That's a no-go for openSampler, but I still want to use this display because the dimensions are absolutely perfect. I'm thinking about using a Pi Zero to drive the display, and letting the Pi 4 (or the CM4 when it comes out) send display commands over i2c to the Pi Zero. That way I can also use the display with any MCU that can talk i2c, such as the ESP32. I'm not the first to do such a thing; as far as I know, NerdSeq also uses a Pi Zero to drive a display.

    For those who also want to use this display in bare metal: you need to add some lines to config.txt, and add the hyperpixel4.dtbo file to the overlays directory on the boot volume. This file needs to be built by the install script. They used to have a compiled version in their GitHub repo, but it's gone; I don't know why. I 'compiled' it on a Pi 4, and tested it by using the same SD card in a Pi Zero, where it worked without modification. If you need it, you can find my compiled version here: ''. You also need some display init code in your project. The code can be found in this file: ''. You need to do some rewriting so that it uses Circle's GPIO functions. The code basically bit-bangs some display init commands over one of the GPIO pins. I find it a little weird, because it looks like some sort of SPI, so I don't know why they didn't just use the SPI bus. If someone knows, shout it out in the comments please.

    This is the output of the install script from HyperPixel; all the lines added to config.txt are of interest:

    Config: Added dtoverlay=hyperpixel4 to /boot/config.txt
    Config: Added overscan_left=0 to /boot/config.txt
    Config: Added overscan_right=0 to /boot/config.txt
    Config: Added overscan_top=0 to /boot/config.txt
    Config: Added overscan_bottom=0 to /boot/config.txt
    Config: Added framebuffer_width=720 to /boot/config.txt
    Config: Uncommented framebuffer_height=720 in /boot/config.txt
    Config: Added enable_dpi_lcd=1 to /boot/config.txt
    Config: Added display_default_lcd=1 to /boot/config.txt
    Config: Added dpi_group=2 to /boot/config.txt
    Config: Added dpi_mode=87 to /boot/config.txt
    Config: Added dpi_output_format=0x7f226 to /boot/config.txt
    Config: Added hdmi_timings=720 0 15 15 15 720 0 10 10 10 0 0 0 60 0 35113500 6 to /boot/config.txt

  • Quick Update

    Nick Verlinden, 09/27/2020 at 20:31, 0 comments

    Just a quick update here. I've been very busy with work, which should be a good thing in these times. Unfortunately, that also means less time to play around. But... openSampler is always on my mind, and I have lots of time while commuting to think about the execution. It really helps not to take things too fast, because while staring out the train window, often an idea will pop into my mind that solves some problem I was having.

    I've been working on a modular 3D-printable case system in the evenings. It's nice, but it takes some time to print parts: trial and error. The casing system will be used for openSampler. I have this idea that the controls for the sampler should be completely customisable. If you want a rotary control, you can have one. If you want dedicated hardware buttons for something specific, you can have them. Heck, if you just want a touchscreen and no other controls, fine!

    I'm thinking about making all the control boards i2c modules. That way you can add controls as you please, remove them if they're not working out for you, or even remap their function to something else. I'm pretty excited by this idea. In my head it's all easy-peasy, but I suspect the reality will be a little more complex.

    Anyway, I'm focusing on perfecting the 3D-printable case system now; next I'm going to try another touch display. I've bought this square HyperPixel touch screen, which seems perfect for me. I found the Raspberry Pi 7-inch touch screen a little too bulky. I think the display will work out of the box, but the touch screen might not: it's i2c based, and I think Circle's built-in touch driver is for the memory-based touch screen only. i2c is relatively easy, so implementing this should not be a huge problem. Let's see how that works out.

    Talk to you soon!

  • bixl, 1-bit pixel editor is born!

    Nick Verlinden, 08/05/2020 at 08:27, 0 comments

    If you read the previous logs, you might have gotten the message that I love 1-bit GUIs. I loved Atari TOS and the early versions of Mac OS. To create 1-bit graphics, you can draw lines or pixels using code like this, for instance (this is not real code!):
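    A rough sketch in C of what such pixel plotting might look like, assuming a packed 1-bit framebuffer where each byte holds 8 horizontal pixels (all names here are illustrative, not the project's actual API):

```c
#include <stdint.h>

// Packed 1-bit framebuffer: one bit per pixel, MSB first within a byte.
#define FB_WIDTH  128
#define FB_HEIGHT 64

static uint8_t fb[FB_WIDTH * FB_HEIGHT / 8];

void set_pixel(int x, int y, int on)
{
    if (x < 0 || x >= FB_WIDTH || y < 0 || y >= FB_HEIGHT)
        return;
    int idx = y * (FB_WIDTH / 8) + x / 8;  // byte that holds this pixel
    uint8_t mask = 0x80 >> (x % 8);        // bit within that byte
    if (on) fb[idx] |= mask;
    else    fb[idx] &= ~mask;
}

// A horizontal line is then just a run of pixels.
void draw_hline(int x0, int x1, int y)
{
    for (int x = x0; x <= x1; x++)
        set_pixel(x, y, 1);
}
```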


    To draw fonts, there are several different solutions out there. Having followed David Welch's bare metal programming tutorials, I was using the psf font format, which is very simple. It contains a 32-byte header with a few important parameters, such as the width and height of the characters. You can create these fonts by hand if you want: just create a C header file with a byte array and start entering bytes manually according to the spec.
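    For reference, the 32-byte PSF version 2 header can be described as a C struct along these lines (field names are mine; double-check the spec before relying on the details):

```c
#include <stdint.h>

// PSF version 2 header: eight little-endian 32-bit fields, 32 bytes total.
typedef struct {
    uint32_t magic;      // 0x864ab572 identifies a PSF2 file
    uint32_t version;    // currently 0
    uint32_t headersize; // offset of the glyph data, normally 32
    uint32_t flags;      // bit 0 set: a unicode table follows the glyphs
    uint32_t length;     // number of glyphs
    uint32_t charsize;   // bytes per glyph = height * ((width + 7) / 8)
    uint32_t height;     // glyph height in pixels
    uint32_t width;      // glyph width in pixels
} psf2_header;
```

Each glyph is then a simple bitmap, one row per `(width + 7) / 8` bytes, which is why hand-entering bytes in a C array works at all.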

    Even though the format is quite simple, creating fonts this way is a very tedious job. I looked around, and there seemed to be no simple (free) app to create 1-bit pixel graphics. I also have a Samsung Note 9, and it seemed like a romantic idea to use the included pen for drawing. I have to travel to work a lot this month, so I had some train time to fill. That's when I created bixl, a very simple pixel drawing program.

    It's a web app that is available here:

    I had some experimentation code lying around for creating a PWA, so this web app is also a PWA that you can install directly from the browser onto your computer/phone/...

    You can start from scratch, or load an existing psf font. Only psf version 2 is supported at this time, but I'm not planning to add support for more file formats. If you want to add it yourself, dig in, it's open source! :-)

    Fighting the lifelong enemy called perfection

    Now is a great time to talk about the worst enemy I have to face all the time: perfection. If you don't want a lecture about that, just skip the rest of the log, as there will be no new technical information.

    When working on stuff like this, it's easy to lose yourself in details. I constantly think: 'oh, it would be nice to add feature x', or: 'people will probably need feature y'. This process keeps going, and if you give in to it, you will never deliver, or it will take you a very long time. And even then it will never feel complete; there's always something more you can add.

    A few years ago, I was dreaming of a distributed system where you could include packages directly from GitHub/GitLab/... in a web app. Think of it like npm, but without the need to download and keep these libraries locally; your browser would just fetch them when needed. So I started experimenting and writing code. At one point I had a working system. It was fast, and it worked. Then the OCD kicked in and said: you need to rewrite this now, because people are going to see this mess you created. So I started to rewrite it with the knowledge I had gained by creating it. JavaScript promises were the new thing and everyone was using them, so I had to use them in my rewrite as well. Then, when that version was almost finished, I decided I wanted to add a feature because it seemed nice to have, or it just made sense (though it was not required!). But then I patched something and the code seemed messy again (it really wasn't, but I thought it was), so I decided to rewrite it from the ground up with this new knowledge. By the end, you could write plugins for loading from your own custom repository and more, quite fancy. It just took three years of train travel time, and a decent amount of spare time from my life, to create it. A few months later, the project was made almost obsolete by the JavaScript import API. I never started using it. So when I look back at it now, it just seems like one big waste of time. Imagine you spent your life building your own car, but by the time you finished, cars were no longer being used and had been replaced by drones. You created a nice car. For yourself. Great. Now you know how to build a car.

    So it was then that I realised that I have to fight the OCD and perfectionism when it strikes. With...


  • Getting the 64-bit compiler working... Failure!

    Nick Verlinden, 07/27/2020 at 09:18, 0 comments

    Failure is a part of life; we just have to deal with it and move on. After several hours of trying to compile the aarch64 gcc cross compiler on macOS myself, I have given up for now. I will continue development and testing in 32-bit, and just compile and test for aarch64 on Linux when a major version has been released.

    I got as far as producing a binary for the g++ compiler, but using it with Circle results in:

      CPP   actled.o
    'armv8-a+crc' is not a recognized processor for this target (ignoring processor)
    'armv8-a+crc' is not a recognized processor for this target (ignoring processor)
    /var/folders/q1/0jsxyqlx6jl_592pvc_z3lx00000gn/T//ccLAjiBh.s:1:2: error: unknown directive
            .arch armv8-a+crc
    /var/folders/q1/0jsxyqlx6jl_592pvc_z3lx00000gn/T//ccLAjiBh.s:10:2: error: unknown directive
            .type   _ZN7CActLEDC2Eb, %function
    /var/folders/q1/0jsxyqlx6jl_592pvc_z3lx00000gn/T//ccLAjiBh.s:17:16: error: brackets expression not supported on this target
            stp     x29, x30, [sp, -48]!
    /var/folders/q1/0jsxyqlx6jl_592pvc_z3lx00000gn/T//ccLAjiBh.s:21:2: error: unknown use of instruction mnemonic without a size suffix
            mov     x29, sp
    /var/folders/q1/0jsxyqlx6jl_592pvc_z3lx00000gn/T//ccLAjiBh.s:25:2: error: invalid instruction mnemonic 'adrp'
            adrp    x2, .LANCHOR0

     This is the procedure that finally gave me a (non-working) g++ binary:

    #download aarch64 gcc arm compiler source (
    #extract and cd into directory
    mkdir build && cd build
    ../configure --build=x86_64-build_apple-darwin19.5.0 --host=x86_64-build_apple-darwin19.5.0 --target=aarch64-none-elf --prefix=/Users/nick/Downloads/aarch64/aarch64-none-elf --with-local-prefix=/Users/nick/Downloads/aarch64/aarch64-none-elf/aarch64-none-elf --with-gnu-as --with-gnu-ld --disable-libstdcxx --without-headers --with-newlib --enable-threads=no --disable-shared --disable-__cxa_atexit --disable-libffi --disable-libgomp --disable-libmudflap --disable-libmpx --disable-libssp --disable-libquadmath --disable-libquadmath-support --enable-lto --enable-target-optspace --disable-nls --disable-multilib --enable-languages=c,c++
    make -j 8

    libstdc++-v3 gave me a lot of problems when compiling that I could not figure out, so I tried to compile g++ without libstdc++. I figure that is the root cause of all the subsequent problems. Anyway, I wanted to share my workflow in case someone else wants to have a stab at it.

    For the record, compiling the c compiler with the configure parameters below works, it's the c++ compiler that's giving me a hard time.

    download gcc arm source
    mkdir build && cd build
    ../configure --build=x86_64-build_apple-darwin19.5.0 --host=x86_64-build_apple-darwin19.5.0 --target=aarch64-none-elf --prefix=/Users/nick/Downloads/aarch64/aarch64-none-elf --with-local-prefix=/Users/nick/Downloads/aarch64/aarch64-none-elf/aarch64-none-elf --without-headers --with-newlib --enable-threads=no --disable-shared --disable-__cxa_atexit --disable-libgomp --disable-libmudflap --disable-libmpx --disable-libssp --disable-libquadmath --disable-libquadmath-support --enable-lto --enable-target-optspace --disable-nls --disable-multilib --enable-languages=c
    make -j 8
    make install-strip

  • Eureka! It's working!

    Nick Verlinden, 07/22/2020 at 09:38, 0 comments

    Yesssss, making some good progress here! It's been a wild adventure already. I am very grateful Patrick already did the heavy lifting on the Audio Injector Octo card in his vGuitar project, because it took me a while to get it to work properly. 

    When I started using some of his code I knew zero about i2s or DMA. Nothing, nada, zip. And to be honest, I still don't fully understand it, but after staring at the code for hours I now have at least a superficial understanding of how it works, and can move on to the next steps. Right now only 2 output channels are working, using a modified version of Circle's built-in i2s output code. Patrick actually rewrote it into something he could use with the Teensy audio library, and there is a chance I might do something similar myself in the future when I want to get audio input working. The reason it doesn't work out of the box with Circle's code is that the Audio Injector has to be the i2s master, while the Circle code is written for the Pi to be the i2s master. It took some poking around to get things working. But like I said, thanks to Patrick for figuring out the CS42448 initialisation part, and the part that makes the Pi accept a master clock from the i2s bus.

    The second reason is that the Octo actually uses something called TDM which, to my understanding, abuses the i2s protocol to transfer more than 2 channels. Apparently i2s was only designed to transport two channels simultaneously; TDM is a way to get multiple channels working over the same bus.

    The entire project now seems very promising, since it has left the theoretical concept stage. I've been giving the GUI some thought as well, and think I'm going to create it with two types of input in mind: the touchscreen, and a rotary encoder (plus other buttons). You should be able to do everything with the touchscreen, but also be able to connect a rotary encoder like on the MPC1000 and s6000 to change values fast.

    I'm working as a freelance video editor six days a week in August, so development on this project might slow down during that time: I will be mentally drained when I get home in the evening, and I imagine I'll need my Sunday to recuperate from screen fatigue.

  • Update

    Nick Verlinden, 07/15/2020 at 11:14, 0 comments

    Today I received the Audio Injector Octo soundcards. I'm looking forward to figuring out how to get audio input and output working. I'll have a look at the code of Patrick's vGuitar rig project, because he has already figured out a lot of the details.

    In the meantime I have been busy creating the UI code. Let's get something straight about that: I just absolutely love 1-bit GUIs, and both the MPC1000 and s6000 have 1-bit GUIs because of their 1-bit graphic displays. So I'm creating a 1-bit GUI for this project. I hear you thinking: this guy... I know, tastes differ, and you may not like the pixellated look of 1-bit displays, but a minimalistic design comes with those limitations, and that's the way I like it. Furthermore, if at some point we discover that we don't like touchscreens, we can always replace the touchscreen with a 1-bit SPI/i2c display with minimal change. Also, since this is an open source project, you are free to fork it and create a GUI to your liking! While looking at the GUIs of the MPC1000 and the s6000, I stumbled upon the GUI of the MPC4000. It's like a mix between the MPC1000 and s6000, and it's a great source to base the GUI on.

    The source of the first working version is going up on the GitLab repo soon. Stay tuned!

  • Manipulating audio sample data

    Nick Verlinden, 07/10/2020 at 13:46, 0 comments

    Last time we figured out how sound works in code, so now we can get to the fun stuff and do some manipulation on that sound! 

    Now that we know that a sample is the amplitude, it was easy to figure out that by multiplying or dividing the value you can make it louder or quieter. Now let's see what our sampler needs to be able to do.

    When you use a sample on an Akai MPC1000, you can specify its tuning during recording and manipulate the pitch after recording. On the MPC1000 this can sound quite grungy. I did not immediately know how pitch works in code, so I let it rest for a moment and went on to re-watch some episodes of Community. My brain kept doing some thinking in the background while I was watching. I don't know if this will make sense to you, or if I'm even explaining it clearly, but let's have a shot. In music theory, for a tone to be an octave higher, the wave needs to move at double the speed. So when you want it to be an octave lower, it needs to move at half the speed.

    Could it be really that simple?

    It was already late, but I ran to the basement where my 'lab' is to test my theory. I came up with this piece of code:

    nLevel = Sound[int(sPos++ * pitch)];

    In case you haven't figured it out, this needs to be in a loop where sPos increments through the sample data.
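    Expanded into a self-contained loop, the idea might look like this in C (illustrative names, not the project's actual engine code): stepping through the source at `pitch` times the normal speed, with no interpolation or filtering, which is exactly where the aliasing comes from.

```c
#include <stdint.h>
#include <stddef.h>

// Naive resampling: pitch = 2.0 plays an octave up (skipping every
// other sample), 0.5 an octave down (repeating samples).
// Returns the number of output samples written.
size_t pitch_render(const int16_t *sound, size_t len,
                    int16_t *out, size_t out_len, float pitch)
{
    size_t written = 0;
    for (size_t sPos = 0; written < out_len; sPos++) {
        size_t idx = (size_t)(sPos * pitch);  // nearest-sample lookup, no filter
        if (idx >= len)
            break;                            // ran off the end of the sample
        out[written++] = sound[idx];
    }
    return written;
}
```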

    And you know what? To my big surprise, this actually sounds just like it would on the MPC1000. There is a reason for that: I suspect the MPC1000 does not have a low-pass filter or any fancy algorithm correcting the aliasing that occurs when resampling the audio. So the artefacts you hear are the imperfections introduced when modifying the length of the wave sample data.

    Figure out how pitching works: CHECK.

    If you have listened to my music, you will probably have noticed that I often like to use aliasing as an effect. Take this song for instance: ''; you can hear it very clearly in the vocal buildup in the middle of the song at 1:40.

    With the newly gained knowledge about how and when aliasing occurs, we can create a decimator effect (also often present in bit-crusher style effects). This is what I came up with:

    // decimator
    float decimate = 1.0f;
    if (decimate != 1) {
        int idx = int(int(sPos++ * decimate) / decimate);
        nLevel = Sound[int(idx * pitch)];
    }

    Now I'm not going to explain this in detail; just have a look at it and try to figure it out, knowing that 'Sound' is the signed 16-bit sample data, and nLevel is a signed short (in other words, a 16-bit signed int) that is going to be sent to the audio device's output. If you use a value of 0.1f for decimate, you will get a really lo-fi, gritty sound. Just the way I like it.

    Talking about bit crushing, how do we approach that? Simple: by removing bits like this:

    // bit reduction
    int bits = 9;
    nLevel = nLevel >> bits;
    nLevel = nLevel << bits;

    But on the MPC1000 the effect is called 'Bit Grunger', and it does not sound like the bit-reduction technique in the code above. Instead, I think they drive the sound by compressing the quiet parts so that it fits into the new bit depth. Think of it like rescaling the wave so it fits in the newly specified bit depth. The code above just throws the bottom part away, but the code below adds some 'drive' to the sound.

    // drive
    float depth = 0;
    if (depth > 0) {
        nLevel = 32767 * (tanh((float(nLevel)/32767)*depth));
    }

    If you do the drive part before the bit reduction, it will sound more like the 'Bit Grunger' effect on the MPC1000. By the way, at this point I would like to thank Heikki Rasilo for helping me out with the math part of the drive. I did not even know what a tangent function was, and couldn't have done it without him. You know I suck at math: in high school I even had extra after-school classes for algebra, but it just didn't work. I failed maths that year, and went on to the lower-grade woodworking course. But that...


  • Finding out how sound works in code

    Nick Verlinden, 07/08/2020 at 11:14, 0 comments

    Last time we set up the compiler environment. I'm using it extensively now and it works great. I have only had a few times where the bootloader did not start the image after sending it over serial (and then I had to reset it and send the image again, so no drama).

    I decided to start out with the '34-sounddevices' sample project, because it looks really simple to modify it to play sound data instead of a 440 Hz tone.

    So my idea is to add sound data to the project and patch it in where the 440 Hz tone is fed into the buffer. I'm really in unfamiliar territory here, but I have heard of the double buffering pattern, and I think that's what the original author does here. We have an audio buffer of a fixed length, and a loop that fetches data from this buffer and writes parts of it to the audio device's output buffer. We are supposed to write data to the first buffer, so that the loop can write it to the second. The reason behind this is to prevent audio dropouts: by having two audio buffers, we make sure that the strictly timed audio device always has data, even if the timing of the processing/generation code is not that strict. Audio is very time sensitive. Much like a video plays at 30 frames per second, CD-quality audio (wow, it's been a while since I called it that) needs 44100 samples per second (times two if you want stereo output).
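    The double buffering idea can be sketched in a few lines of C (illustrative names, not the sample project's actual code): the device drains one buffer while the application fills the other, and the two swap roles each time the device finishes.

```c
#include <stdint.h>

// Two fixed-size audio buffers: while the device reads one,
// the application fills the other.
#define CHUNK 256

static int16_t buffers[2][CHUNK];
static int active = 0;  // index of the buffer the device is reading

// Called when the device finishes the active buffer: hand the other
// buffer to the device, and return the drained one for refilling.
int16_t *swap_buffers(void)
{
    active ^= 1;                 // device now reads the other buffer
    return buffers[active ^ 1];  // application refills the drained one
}
```

In a real driver the swap would happen inside a DMA-complete or need-data callback; the point is only that the device never waits on the code generating the audio.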

    If you're already experienced with how sound works in bits and bytes, I suggest you skip the rest of this log, as it really only explains the fundamentals, no advanced processing (yet).

    Now let's talk about what a sample actually is. Before starting this project, I had a vague representation in my head of how a sample is stored in computer memory. But diving deeper into this, I picked up a few things.

    So let's get it out of the way: sound consists of waves. A wave goes up and down, and the louder the sound, the higher the wave goes. But like I said, a wave goes up and down; when it goes below the midpoint, it is still getting louder. This is called the amplitude. It's an audio-wave thing; if this sounds weird to you, I suggest you look up how sound reproduction works, for example how a speaker reproduces sound.

    A computer does not store waves; it stores bits. So how can a computer store a wave? By taking samples of it. Samples are points in time at which the amplitude of the wave is measured. For CD-quality audio, 44100 points are measured every second.

    There is a good image on Wikipedia that illustrates this; every point on the wave is a sample:

    For the time being, assume we're talking about 1 channel (mono, not stereo). When your audio is stored as 8-bit, your computer gives you an array of 44100 bytes for each second of audio, representing the loudness of those parts of the wave. The bytes represent time sequentially, so byte 22050 contains a sample of the wave at 0.5 seconds. Every byte is 1 sample. The midpoint of a wave is silence. In the illustration from Wikipedia above, the midpoint is represented by the line in the middle. The image below illustrates this a little better because it shows you a longer waveform over time; you could say it's 'zoomed out' compared to the illustration from Wikipedia above, which is heavily 'zoomed in'.

    In 8-bit audio the midpoint is 127, so silence is represented by 127. That is, when we are talking about 8-bit audio. For those of you who lived and played games in the MS-DOS era: 8-bit audio just sounds yuck. CD-quality audio is 16-bit, and in 16-bit you have the choice to see your samples represented as a signed or unsigned integer. In case you wonder, a 'signed' integer is a number that can go below zero. Knowing that, when using a 16-bit signed integer, the amplitude a sample can have ranges from -32768 to 32767, and 0 equals silence. On the other hand, if audio is stored as an 'unsigned' 16-bit integer, the highest amplitude is...


