• A fugly amplifier

    12/26/2023 at 08:09 0 comments

    My sturdy NAD amplifier developed a fault where the left channel got fainter and fainter. I have the service manual and I think I know where the fault is, but I haven't got a round tuit fixing it. Meanwhile I listen to music all the time, so I would sorely miss an amplifier. #Playing music remotely with bluetooth was partly a response to the deprivation.

    Knowing now that class D amplifiers are small, cheap and powerful, I thought I would buy another, more powerful amplifier board and retrofit it to an old amplifier case. The amplifier you see came from some garage sale or other; I don't remember which. I do remember that I bought it because it offered quadraphonic decoding. For a while in the late 70s, one could buy quadraphonic LP records that squeezed 4 channels onto the 2-channel groove of the vinyl. I had one or two such LPs and only ever heard them in stereo. By the 2000s, though, multichannel digital recordings had obsoleted the quadraphonic hacks. You can see the 4 power amplifier ICs in this top view of the amplifier.

    I thought to put an SMPS inside the case and rewire the selector switch to choose between input sources. The original amplifier board is totally disconnected and the volume pots do nothing. I only kept it because the volume pots are soldered on, and removing them would leave even more holes in the front panel. (I may later take the pots off the PCB and remove the PCB, leaving the pots bolted to the case just for appearances' sake.)

    Here's the class D amplifier I bought for about $20 delivered. It takes a supply of up to 36V, and the SMPS, which I think used to feed a laptop or something, can supply 32V. It's based on the TDA7498 chip.

    So after a bit of hacksawing and rewiring, I ended up with the fugly amplifier in the first photo. Fugly because the board is mounted on top of the case; to mount it inside I would have to think of some way to move the volume pot off the board.

    Meanwhile I bought a modern AV receiver (not just a stereo amplifier, but multichannel, including tuners and even bluetooth input) on sale in the end-of-year offers, so that does duty these days. I'll put this fugly hack at the back of my to-do queue and get on with other projects.

  • Halving the execution time of an Arduino sketch

    12/13/2023 at 08:04 0 comments

    I have a large pile of 4164 and 41256 1-bit wide DRAM chips from the PC era. Many of them were extracted from sockets or boards. I was curious to know how many are faulty. A web search quickly turned up many designs for Arduino-based DRAM testers. Most of them required only a few extra components in addition to an Uno or a Nano. The wiring is also simple, so I built a tester on a breadboard.

    (There are complex versions that test a larger range of chips (e.g. with additional voltage supplies) and have fancy features like an onboard display, but this was a one-off task. For the same reason I have no interest in any of the published PCB designs for Arduino shields.)

    The first design I wired up was this:

    This worked well. I didn't even wire up the LEDs because the serial console displays the status. Each chip takes about 80s to test. For each subsequent chip I just have to press the reset button on the Nano, as the program is already flashed into the MCU. It found a handful of faulty chips. Interestingly, a whole batch of Mostek chips failed the test. Either their specs don't work with this tester, or a process fault ruined the chips over time.

    But this circuit didn't handle 41256 chips. So I turned to this design:

    The wiring is different. Here they have tried to minimise wire crossings from each side of the Arduino to the chip socket, at the expense of a less logical pin assignment. But that doesn't matter, since a table in the sketch handles the mapping.

    This also worked well. But with 4 times as many bits, the testing time ballooned out to 288s. That meant many minutes of waiting, and there is only so much surfing I can do on the computer while waiting for tests to complete.

    Hacking the sketch

    I know that the Arduino library routines digitalWrite() and digitalRead() do a lot behind the scenes, and this is reflected in the execution time. The Arduino documentation shows how to do direct I/O on the ports, but discourages it because for most sketches it doesn't matter and the sketch loses readability and portability. But the same page acknowledges there are situations where direct access is warranted. This testing sketch, which does a lot of bit I/O, is just such a situation.
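    To picture what direct access buys: digitalWrite() checks the pin's timer state and looks up the pin-to-port mapping at run time, while direct I/O is a single read-modify-write of a port register. Here's a minimal sketch of those two register operations, using a plain variable to stand in for the AVR's PORTD so it can run anywhere (the bit position PD2 is illustrative):

    ```c
    #include <assert.h>
    #include <stdint.h>

    /* Stand-in for the AVR's memory-mapped PORTD register (illustration only) */
    static volatile uint8_t sim_PORTD = 0;

    #define PD2 2  /* illustrative bit position of the pin within the port */

    int main(void) {
        /* digitalWrite(pin, HIGH) boils down to a single OR on the port... */
        sim_PORTD |= (1 << PD2);
        assert(sim_PORTD == 0x04);

        /* ...and digitalWrite(pin, LOW) to a single AND with the inverted mask */
        sim_PORTD &= ~(1 << PD2);
        assert(sim_PORTD == 0x00);
        return 0;
    }
    ```

    On the real chip each of these compiles to one sbi/cbi or read-modify-write instruction, versus dozens of cycles for the library call.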

    So I looked into rewriting the sketch to use direct I/O. A goal was to leave the structure of the code unchanged, using conditional defines to replace the digitalWrite() and digitalRead() routines.

    For digitalRead() on the data out pin there is only one instance that needs to be replaced with inline code to read the pin, so that is easy.

    Since writing HIGH and LOW are different code sequences in direct I/O, we define separate macros for the two situations, called dWH and dWL. When the optimisation is disabled, these map to digitalWrite(pin, HIGH-or-LOW) and the program runs as before. When optimisation is enabled, they become the routines dWH() and dWL(), which use a switch statement to dispatch to the appropriate code for each pin. To discover the appropriate direct I/O statement one needs to consult the pin table of the Nano; the pins are also labelled in the sketch comments. So the overhead of direct I/O on a pin is a call, a switch, and a return.

    Not all the digitalWrite() calls need to be substituted. Those dealing with the LEDs are not time critical and can be left as-is.

    #define FAST_RW
    #ifndef FAST_RW
    #define dR_DO() digitalRead(DO)
    #define dWH(p)  digitalWrite((p),HIGH)
    #define dWL(p)  digitalWrite((p),LOW)
    #else
    #define dR_DO() (PINB & 1)
    void dWH(int p) {
        switch (p) {
        case XA1:
            PORTD |= (1 << 2); break;
        case XA7:
            PORTD |= (1 << 3); break;
        case XA5:
            PORTD |= (1 << 4); break;
        case XA4:
            PORTD |= (1 << 5); break;
        case XA3:
            PORTD |= (1 << 6); break;
        case XA6:
            PORTD |= (1 << 7); break;
        case CAS:
            PORTB |= (1 << 1); break;
        case XA8:
            PORTC |= (1 << 0); break;
        case DI:
            PORTC |= (1 << 1); break;
        case WE:
            PORTC |= (1 << 2); break;
        case RAS:
            PORTC |= (1 << 3); break;
        case XA0:
            PORTC |= (1 << 4); break;
        case XA2:
            PORTC |= (1 << 5); break...

  • Installing gcc for the 8088 on an RPM-based Linux

    04/08/2023 at 09:50 0 comments

    That's right, there has been a GNU C compiler (gcc-ia16) that generates code for the 16-bit x86 CPUs for quite a few years now, but it's due to the fork by TK Chia that much progress toward a usable tool has been made in recent years. It's already in use by various projects, including the ELKS project.

    Why would one use gcc-ia16 in preference to other existing free compilers such as Turbo C (running under DOS, but you could run that in a VM) and Open Watcom, which is also enjoying a revival in interest? Well, with gcc you get the prospect of compliance with a more up-to-date C standard, plus the possibility of someday compiling C++.

    Why am I even tangling with this? I haven't had any 16-bit PCs for I forget how many years. Not even 32-bit-only PCs; they are all 64-bit capable. I do have some 8088 chips and might get a round tuit making an SBC board for them. The real reason is I'm bored, as I temporarily don't have access to my hardware toys. So I decided to see if I could install gcc-ia16 on my openSUSE Linux system.

    Now that's definitely possible, because one can compile gcc-ia16 from source. But I'm lazy and decided to see if I could adapt the Ubuntu packages, which are in DEB format. I remembered there is a utility called alien, which can convert between various package formats.

    First of all, alien is not in the official openSUSE repos, so I had to go to software.opensuse.org to find a contributed package. That went ok.

    Next, I downloaded the DEB packages from the Ubuntu PPA for gcc-ia16. There are heaps of packages there, including versions for 3 Ubuntu releases. To cut to the chase, these are the ones that are needed: gcc-ia16-elf, binutils-ia16-elf, libi86-ia16-elf, and libnewlib-ia16-elf.

    So I ran alien on one like this:

    sudo alien -v -k -r gcc-ia16-elf_6.3.0-20230219.07-ppa230219074~jammy_amd64.deb

    Alien needs to be run as root or it will not be able to assign the correct ownership to the files in the resulting package. The -v flag shows progress messages, -k preserves the version number, and -r means the output should be an RPM package. Basically alien unpacks the DEB and reassembles it as an RPM. It also checks the runtime dependencies, such as libraries. Although a compiler is a complex tool, a command-line driven one usually doesn't need much more than the standard C libraries.

    First problem: alien couldn't handle the Zstd compression of elements of the DEB. Zstd is a new-fangled compression scheme used in Ubuntu DEBs which replaces the old-fangled Gzip scheme.

    I needed a newer alien, but when I tried to install that, it turned out my Perl wasn't recent enough. Grr.

    Ok, try another tack: unpack the DEB, which is actually just a Unix ar archive, convert the .zst members to .gz, and repack. That worked.
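    For the record, the repack needs nothing more than ar, zstd and gzip. Here's a sketch of the steps, run against a dummy archive instead of the real DEB (in a real DEB the compressed members are control.tar.zst and data.tar.zst, and the debian-binary member must stay first when repacking):

    ```shell
    set -e
    cd "$(mktemp -d)"

    # Build a dummy "DEB" -- a plain ar archive holding one zstd-compressed
    # member, standing in for the real control.tar.zst/data.tar.zst members.
    echo "pretend tarball" > data.tar
    zstd -q data.tar -o data.tar.zst
    ar rc demo.deb data.tar.zst
    rm data.tar data.tar.zst

    # The fix: unpack the archive, recompress the .zst member as .gz, repack.
    ar x demo.deb
    unzstd -q data.tar.zst && rm data.tar.zst
    gzip data.tar
    ar rc demo-repacked.deb data.tar.gz

    ar t demo-repacked.deb
    ```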

    Oops, now it complains that the glibc version I have on my system, 2.31, isn't recent enough for this application, which wanted 2.32. Ok, instead of using the Jammy (Ubuntu 22.04) package, I'll drop back to the Focal (Ubuntu 20.04) package.

    sudo alien -v -k -r gcc-ia16-elf_6.3.0-20230219.07-ppa230219074~focal_amd64.deb

    And that converted ok.

    One of the packages wanted libisl22 when I only had libisl15. This turned out to be the Integer Set Library, originally from INRIA; compilers have to manipulate sets at various points during their operation. The package comprises just a dynamically loaded library and a symbolic link, and nothing required libisl15, so it would be quite safe to install libisl22 side by side and not worry about another application loading the wrong version.

    After I had converted all the packages, just because I'm a bit obsessive, I changed the ownership of the resulting RPMs from root to myself and made their mtimes the same as those of the corresponding DEB packages.
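    Those finishing touches are one command each. A sketch with stand-in file names (the real DEB and RPM names differ slightly, so pairing them up takes a little care):

    ```shell
    set -e
    cd "$(mktemp -d)"

    # Stand-ins for a converted pair: the DEB with its original timestamp, and
    # the RPM that alien (run via sudo) left owned by root and freshly dated.
    touch -d '2023-02-19 07:00' pkg_focal_amd64.deb
    touch pkg.focal.x86_64.rpm

    # Hand the RPM back to the invoking user (needs root for real, so shown
    # commented out here)...
    # sudo chown "$USER:" pkg.focal.x86_64.rpm

    # ...and copy the DEB's mtime onto the RPM.
    touch -r pkg_focal_amd64.deb pkg.focal.x86_64.rpm
    ```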

    Now, would the RPM packages install correctly? I tried:

    sudo zypper in gcc-ia16-elf-6.3.0_20230219.07-ppa230219074~focal.x86_64.rpm

    Notice I used zypper and not rpm, because zypper will detect any dependencies and satisfy them.

    One other...
