Improbable Secret Project

Probability this can work: 98%, working well: 70%, working within 1k: 10%. A LOT of work, and utterly ridiculous.

It's too improbable to tell the details, just yet...
But let's say it's got something to do with an 8088/8086 and an AVR

(random note to self: 18868086881)

Pretty certain this won't work in 1k... Despite the initial input being just 1 byte, there are allegedly something like 300 possible inputs once you account for the optional second byte... That's a pretty big lookup-table, never mind all the code.

But I have *some* ideas about how to shrink it... maybe at the cost of execution-speed.

  • We Have A BIOS Extension!

    esot.eric 6 days ago 6 comments

    UPDATE: Adding screenshot from 'dosbox'

    ------------

    Thanks to @Shaos's hard work getting a PCjr ROM-Cartridge working over at #PCjrCart (https://hackaday.io/project/19160-pcjrcart/log/51792-creating-a-bios-extension), my work getting my first ROM extension was greatly reduced!

    Here it is running as a .COM under 'dosbox'

    And here it is as a ROM-Extension on my PC/XT clone:

    Apparently the XT runs its BIOS extensions almost immediately in the boot process, so this is on a completely black background. (Note that it cycles through text-colors as well as background-colors.)

    And I learned a lot, along the way.

    We've got the BIOS in the ZIF socket on the left, because that's where I intend to do most of my work, later down the road... (and I found another BIOS to try out floating in the 'old chips' box).

    And we've got the "BIOS-Extension ROM" further to the right.

    A couple notes:

    • It Boots FAST (no memory-test beforehand?!)
    • The PC/XT apparently doesn't start in the same video-page as the PCjr, so a simple fix is to use INT 10H (function AH=0Fh returns the active page in BH) to read the current page, then reuse that value instead of hard-coding it. Sum total of 1 additional instruction, and one modified.
    • The PC/XT doesn't use a CRC to validate the ROM-extension; instead it uses a simple 8-bit checksum (all the ROM's bytes must sum to zero, mod 256). My quick solution: hand-edit the binary, replacing the last two bytes with 00 and the checksum byte. (A sketch of that calculation follows this list.)
    • My PC/XT's video arrangement is pretty wonky (which you probably guessed if you've read my last logs), so the original choice of RED for the text resulted in basically "_| -| || -_ |-" which is why I've modified his code to cycle through the foreground and background colors to find a decent choice (of which there are few).
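
    Here's a rough sketch of that checksum-patching step as a tiny host-side C tool. It's hypothetical -- not part of @Shaos's pcjrcart utility, and the filename is made up -- it just sets the final byte so the whole image sums to zero, mod 256.

    /* fixsum.c -- hypothetical helper: patch the last byte of a ROM image so
     * that the 8-bit sum of every byte comes out to zero, which is what the
     * XT BIOS checks before running an extension ROM.
     * Usage: ./fixsum test.bin   (modifies the file in place)               */
    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        if (argc != 2) { fprintf(stderr, "usage: %s rom.bin\n", argv[0]); return 1; }

        FILE *f = fopen(argv[1], "r+b");
        if (!f) { perror("fopen"); return 1; }

        fseek(f, 0, SEEK_END);
        long size = ftell(f);
        unsigned char *buf = malloc(size);
        fseek(f, 0, SEEK_SET);
        fread(buf, 1, size, f);

        buf[size - 2] = 0x00;                    /* the hand-edited "00" byte        */
        unsigned char sum = 0;
        for (long i = 0; i < size - 1; i++)      /* sum everything but the last byte */
            sum += buf[i];
        buf[size - 1] = (unsigned char)(0x100 - sum);  /* make the total 0 (mod 256) */

        fseek(f, 0, SEEK_SET);
        fwrite(buf, 1, size, f);
        fclose(f);
        free(buf);
        return 0;
    }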

    Here are the first experiments. Before loading it into a ROM, you can try out the .COM executable under DOS...

    -------------------

    I won't give it *all* away... but go grab @Shaos's code from github (again https://hackaday.io/project/19160-pcjrcart/log/51792-creating-a-bios-extension)

    Make sure you've got 'nasm' installed (the Assembler)... Look up "INT 10H" (wikipedia's got it)

    You might want to install 'dosbox' as well... when running it, slow the "cycles" down to ~300 to mimic a PC/XT's execution-speed.

    Here's a makefile to make things easier...

    default:
    	gcc -o pcjrcart pcjrcart.c crc16.c
    	nasm test.asm -fbin -o test.com
    	./pcjrcart test.com
    
    clean:
    	rm -f pcjrcart test.com test.bin
    
    run:
    	dosbox -machine cga test.com

  • shoulda known better...

    esot.eric 01/14/2017 at 10:19 0 comments

    ...than to click on a youtube video in the midst of having-written for well over an hour, without Lazarus(?) installed... :/

    Whelp, suffice to say, I'm trying out @Shaos's instructions, here: https://hackaday.io/project/19160-pcjrcart/log/51792-creating-a-bios-extension/discussion-74094 and made some modifications to make it XT-compatible (test the current page, rather than assuming page 7). And some more modifications because my stupid video-setup doesn't show anything other than "|-_|| -| |_" when displayed in red, or most any color, for that matter.

    Anyways, a great starting-point.

    I wasn't planning on doing BIOS extensions, as the whole point (at this point in the project) of doing a custom ROM is to limit the executed instructions to those I've actually implemented (at that time, when I get there)... A BIOS extension wouldn't limit that at all, since the BIOS itself runs before any extensions...

    BUT: his work, there, is a *GREAT* starting-point... showing how to use nasm, and more...

    And, I think, PCjr cartridges basically work *identically* to PC/XT BIOS-extensions. There are also some ideas 'round that link regarding how to use those BIOS-extension/PCjr-cartridge memory-locations as read/write, despite there being no /WE pin... in which case you can throw custom hardware there, too.

  • state of the cpu address

    esot.eric 01/13/2017 at 22:23 0 comments

    Thanks to @jaromir.sukuba for inspiring this writing...

    The point of this project is[/was?] to implement the 8088 chip itself, with an AVR... making use of the other peripherals already on the original mother-board, not because they're *better* than the peripherals available in an AVR, but because *nearly everything* is interfaced to the 8088 in the same way, including the "program memory" (BIOS), the RAM, and the I/Os...

    There's been discussion--as well as browsings mentioned in past logs--wherein it's suggested to use the AVR's internal peripherals to *replace* those on the motherboard... That'd be smart. In fact, it'd probably be *much* faster if that was done.

    E.G. writing a byte from a string stored in the AVR's internal memory into the TX-register in the AVR's serial-port would probably take something like 2 AVR instructions, maybe 4 AVR clock-cycles. On the other hand, doing-so, even without emulating the 8088's instruction-set, to an externally-attached ISA RS-232 card would require, at the very minimum, 4 8088 bus clocks (one 8088 bus cycle).
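
    For a sense of scale, here's the "internal peripheral" case as a minimal AVR-C sketch. I'm assuming an ATmega-style USART (register names like UDR0/UCSR0A are my assumption; other AVRs name them slightly differently):

    #include <avr/io.h>
    #include <stdint.h>

    /* Busy-wait transmit of one byte on an ATmega-style USART.
     * The wait loop costs a few cycles if the buffer's full, but the store
     * into UDR0 itself is a single instruction -- no external 8088-style
     * bus cycle involved at all.                                          */
    static void uart_tx(uint8_t c)
    {
        while (!(UCSR0A & (1 << UDRE0)))   /* wait for the TX buffer to empty */
            ;
        UDR0 = c;                          /* one OUT/STS instruction         */
    }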

    I think I can somewhat-reasonably expect to execute 5 AVR instructions per 8088 bus clock... (4.77MHz * 5 ≈ 23.9MHz, a slightly-overclocked AVR). So we're talking a bare-minimum of 20 AVR clock-cycles to transact a single byte of instruction/memory/I/O data... And that's assuming no wait-states, assuming the Bus-Interface-Unit (BIU) isn't already in the middle of a transaction (e.g. caching the next instruction-byte), that DMA isn't refreshing the DRAM, etc.

    That's a *huge* hit, for *every* byte-transaction. And, again, doesn't even consider the fact that the AVR will be emulating the 8088's instruction-set.

    Now, again, consider (as @jaromir.sukuba brought up, and was brought up in previous logs) just how much could actually be implemented *within* the AVR... The PC/XT BIOS-ROM is 8KB... Many AVRs could fit that and still have plenty of space for AVR-code. So, the BIOS itself could be stored in the AVR... Reducing the read of each byte from 20 AVR cycles to 2-3, during boot, interrupts, and more. The RS-232 port, maybe I2C for a keyboard... (or even bitbanged PS/2!)... Even SPI for an LCD. Could even load a bunch of RAM in there, as well. And all these "devices" would communicate *significantly* faster when directly-interfaced with the AVR-core, rather than going through the 8088-style bus.
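
    For instance, a sketch of the "BIOS lives inside the AVR" idea using avr-gcc's PROGMEM (the array name and contents are placeholders, not a real image):

    #include <stdint.h>
    #include <avr/pgmspace.h>

    /* Hypothetical: the 8KB PC/XT BIOS image baked into AVR flash. */
    static const uint8_t bios_rom[8192] PROGMEM = { 0xEA /* ...the rest of the image... */ };

    /* Fetching a byte is a single LPM-based read -- a few AVR cycles,
     * versus ~20 to run a full bus-transaction out on the ISA bus.    */
    static inline uint8_t bios_read(uint16_t offset)
    {
        return pgm_read_byte(&bios_rom[offset]);
    }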

    Also, consider that the 8088 contains two "units" which run *in parallel*. The Execution Unit (EU) actually executes instructions, while the Bus-Interface Unit (BIU) grabs data from/writes data to the bus. The BIU runs *in parallel* to the EU, so while the EU is executing instructions, the BIU is often fetching the bytes for the next instructions, simultaneously. This could, in a way, be considered like a DMA transaction... Well, 8-bit AVRs don't have DMA, especially not one that'd be compatible with the 8088's bus-interface. So that means each transaction on the bus has to be handled by the "CPU", making the entire system more like a lock-stepped single-core system, rather than the dual-threaded, dual-core-ish system that the 8088's EU/BIU more closely approximate.

    Similarly, consider an 8088 bus-transaction being 4 bus-clocks, which is 20 AVR clocks, which is ~20 AVR instructions... Worst-case, a bus-transaction involves setting up something like 6 bytes' worth of signals: 3 bytes for the address, 1 for data, and numerous control signals. The vast majority of those signals need to be set up early in the transaction... That means the AVR "core" would be executing numerous instructions as fast as possible to set up those byte-transactions, but thereafter it'll just have to twiddle its thumbs waiting out the various bus-clocks, wait-states, and delays.

    Those bus-clocks, again, come at something like 1 every 5 AVR cycles... So, one might think...

    Read more »

  • The Photographic Saga of Building the XT

    esot.eric 01/13/2017 at 14:27 0 comments

    The textual saga was described in the early logs of this project...

    The basic gist is that I built an IBM PC/XT clone, based on stuff I had laying 'round the stockpiles... The Motherboard was in a box filled with scrapped PCBs, as was an ISA card, or two. Though some I was actually smart enough to store in a box specifically marked "ISA Cards", and some were even in anti-static bags. Most, on the other hand, were not.

    Getting this system running was a bit**... it kept stringing me along with little moments of utter surprise that certain things worked out, despite how they'd been abused over the years or how unlikely they were, and then days on end of roadblocks with things that should've been easy, or with tools I'd been using in pretty-durn-functional states for quite some time.

    The earlier logs describe it better... here's some photos.

    -----------

    Dug through the ol' box of Keyboards, pretty certain I didn't have an XT keyboard (which is not at all compatible with AT/PS2)... and lucked-out with a nice surprise:

    So, of course, this project was meant to be, right?

    First things first, apparently somewhere down the road I thought it more likely I'd use the cables (I was kinda into MIDI, at the time... same connector) than that I'd use these old beat-up keyboards... so years ago I cut most of my grubby keyboards' cables and stored them separately... (and numerous times I'd thought to just dump that box of keyboards, glad I didn't). Dug through that box of cables and found exactly the right one... (!) Hadn't even been repurposed!

    I can't recall what the result was, powering the system up that first time... was it a nasty long beep, or...?

    Anyways, turns out the switch-settings on the motherboard were incorrectly set for an FPU. (Did I desocket the 8087 FPU long ago, explaining why I found one in my "unsorted chips" bin a few weeks earlier, or did the switches get bumped in all those years stored in a box with dozens of other scrap PCBs and no anti-static bags?)

    Changing the switch settings (nicely-documented on the web!) solved that problem!

    Next, the video card wasn't syncing up with the TV... In fact, for quite some time no video was showing up at all... until finally I connected it straight to my old CRT TV--rather than an LCD, or through a VCR--and saw something, though not legible.

    So I fought long-and-hard with the color-burst adjustment capacitor on the motherboard... It seemed to get better with adjustment but never quite legible, so I cut the trace between the capacitor and the crystal, and experimented with series caps. No go, there, so tried a parallel cap, still nogo... Finally replaced the crystal altogether... Still nogo.

    Note that the board is labelled 14.318MHz, as I recall its crystal also being labelled. I grabbed a 14.31818MHz crystal this time.

    I 'scoped it out... and I can't explain this, but my 'scope showed 14.04MHz, even after these changes, and varying the capacitor had only a *tiny* change in frequency. 14.04MHz is *way* off, so I figured maybe the capacitor was old-and-crusty or something... maybe the loading of the driver circuit varied elsewhere... I dunno. Another thought, early on, was that the original crystal might've "cracked" in the decades of abuse, which would detune it, right?

    I'm sure I've checked this countless times... That's ten peaks, 14.04MHz. I don't get it, and you'll understand why, when we get through this ordeal...

    The CGA video card had an unpopulated space for a crystal, as well... I followed the traces and it seemed that there was a soldered-in wire where there could've been a set of jumpers to select either the clock signal from the ISA slots, or from the onboard (unpopulated) crystal's unpopulated circuitry... Since I didn't know what support components were necessary, and since the signal coming from the ISA slot was...

    Read more »

  • A better idea 2.0... uC's *as* ROM

    esot.eric 01/12/2017 at 11:49 4 comments

    Duh... as mentioned in a previous log...

    #A 4$, 4ICs, Z80 homemade computer on breadboard

    has some great ideas...

    I still like the idea of actually using original PC/XT peripherals for this project, I dunno why exactly, but that's kinda the point.

    BUT... one of the things @Just4Fun also did over at that project was to implement the ROM/BIOS in an AVR. The AVR's not *executing* the code, since it's Z80 code. It's making it available as though the AVR was a ROM.

    Welllll... sheesh... here I am thinking about piggy-backing a ROM on top of the original ROM... (last log) Why not use an AVR *as* the ROM?

    Plausible, maybe even probable. AND, most-importantly, In-System-Programmable!

    ------

    So, realistically, I've a bit of an AVR-shortage 'round here... I *might* have a spare 644P with a few blown I/Os that'd work, but otherwise we're talking 8515's with enough pins and only 8K of storage (though if I write 8K of assembly, that'd be absurd). So, another option is some PIC32's I've in-stock.

    I hesitate to use these for *this* project, because half of the point is to see what a lowly 8-bitter can do. But I don't see using a PIC32 as a ROM as being out of the spirit of the original intent... since the same could just as easily be done with an EPROM and a ZIF socket. It'd mainly just be a time-saver.

    Why not, eh?

    -----

    Oh, and I could probably fit the original ROM as well.

    :/ I kinda liked the idea of piggybacking, but this makes a lot more sense...

    ------

    FOOL!

    Those're 3.3V-only! ...

    ---------

    Alright, the 100-pin TQFPs I have have *just enough* 5V-tolerant pins... That's a bit ridiculous, but doable, I s'pose.

  • First Attempt at raw 8086/88 assembling...

    esot.eric 01/12/2017 at 10:19 7 comments

    Here we go! First attempt at assembling raw 8088/86 code...

    The point, here, is to create raw binary machine-code that can be directly loaded into the BIOS/ROM chip... NOT to create an executable that would run under an operating system like linux, or DOS.

    (Though, as I understand, technically this output is identical to a .COM file, in DOS, or could also plausibly be written to a floppy disk via 'dd').

    This needs 'bin86' and maybe 'bcc', both of which are available as packages in Debian Jessie, so probably most Linux distros...

    Note that you can't use the normal 'as' (I coulda sworn it was called 'gas') assembler that goes along with gcc (if you're using an x86)... because... it will compile 32-bit code, rather than 16-bit code. (See note at the bottom)

    ---------

    So, here's a minimal buildable example:

    # https://www.win.tue.nl/~aeb/linux/lk/lk-3.html
    #  "Without the export line, ld86 will complain ld86: no start symbol."
    
    
    export _main
    _main:
       nop   ; -> 0x90 (XCHG AX?)
    
       xchg ax, ax ; Yep, this compiles identically with as86 (not with gnu as)
    
    Note that, technically, the 8088/86 doesn't have a "NOP" instruction... It's implemented by the (single-byte) opcode associated with 'xchg ax, ax'.

    I got a little carried away with my makefile. Hope it's not too complicated to grasp:

    #This requires bcc and its dependencies, including bin86, as86, etc.
    
    target = test
    
    asmTarget = $(target).a.out
    
    ldTarget = $(target).bin
    
    default:
       as86 -o $(asmTarget) $(target).s
       ld86 -o $(ldTarget) -d $(asmTarget)
    
    clean:
       rm -r $(asmTarget) $(ldTarget)
    
    hexdump:
       hexdump -C $(ldTarget)
    
    #objdump86 only shows section-info, can't do disassembly :/
    objdump:
       objdump86 $(asmTarget)
    
    Note that you can't *only* use as86 (without using ld86), because, like normal ol' 'as', it'll produce an object file with header-information, symbols, etc... meant for linking and running under an operating-system.

    ld86 links that... or, really, in our case (with the -d flag)... strips all that header/symbol information, basically extracting a raw binary.

    So, now, if you look at the output of hexdump (or objdump86) you'll see the file starts with 0x90 0x90, as expected.

    I'm not yet sure why, but the file is actually four bytes... 0x90 0x90 0x00 0x00.

    Guess we'll come back to that.

    ---------

    So, my plan is to pop the original 8088 PC/XT-clone's ROM/BIOS chip, and insert a new one that contains nothing but a "jump" to one of the (many) other (empty) ROM sockets, where I'll write my own code in another chip.

    Actually, what I think I'll do is copy the original ROM/BIOS and piggy-back another chip right atop the copy (keeping the original in a safe location). Then I'll put a SPDT switch between the /CS input from the socket and the two ROMs' /CS pins (and a couple pull-up resistors). That way I can easily choose whether I want to boot with the normal BIOS or whether I want to boot with my experimental code.

    I guess I'll have to make sure that my secondary/experimental ROM chip does NOT start with the 0x55 0xAA expansion-ROM signature, as that's the indicator to a normal BIOS that the chip contains a ROM extension (for those times when I want to boot normally). Maybe the easiest/most-reliable way would just be to start it with 0x90 (nop), then have my custom code run thereafter.

    ---------

    So, so far this doesn't take into account the jumping. Note that x86s boot from address 0xFFFF0; they expect a "jump" from there to [most-likely] the beginning of the ROM/BIOS chip's memory-space, where the actual code will begin.

    So, next I'll have to learn how to tell 'as86' that I want [some of] my code to be located at 0xffff0... and I suppose that means I need to make sure my EPROM is the right size to do-so, based on whatever address-space the original ROM occupied... and, obviously, my EPROM won't be 1MB, so... what location...
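
    As a quick sanity-check on where that reset-vector lands inside an EPROM mapped flush against the top of the 1MB space (the sizes below are just examples, not necessarily what the original BIOS socket decodes):

    #include <stdio.h>

    int main(void)
    {
        const unsigned long reset_vector = 0xFFFF0UL;   /* where the 8088 starts fetching */
        const unsigned long top = 0x100000UL;           /* top of the 1MB address space   */
        const unsigned long sizes[] = { 8*1024UL, 32*1024UL, 64*1024UL };  /* example EPROMs */

        for (int i = 0; i < 3; i++) {
            unsigned long base = top - sizes[i];        /* ROM mapped at the very top */
            printf("%3luKB ROM at 0x%05lX -> reset vector at ROM offset 0x%04lX\n",
                   sizes[i] / 1024, base, reset_vector - base);
        }
        return 0;
    }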

    Read more »

  • A better idea...

    esot.eric 01/11/2017 at 05:49 0 comments

    #A 4$, 4ICs, Z80 homemade computer on breadboard

    Yeahp... AVRs are chock-full of peripherals that older computers used to implement via separate dedicated chips... So... In the 8086/8088-realm, they could be kinda like "super-IO" controller-chips, or south-bridges...?

    Hmmm... Maybe I'm going about this all wrong. Or not, since it's kinda been curbed.

  • Output Timings are strict! Look at input-timings instead!

    esot.eric 01/04/2017 at 06:24 0 comments

    Guess my brain wasn't in it, when I drew up the timing-diagrams shown in the last log.

    OF COURSE the timing-specifications from the 8088 manual are for the 8088's *outputs*... They're guarantees of what the 8088 chip, itself, will do... so that, when interfacing with other chips you can make sure those other chips' input-timing requirements are met.

    ---------------

    There's a *huge* difference between what's required to be within-specs, and what's guaranteed by the 8088.

    ---------------

    SO... Again, there's basically no way my 8-bit AVR can possibly change 20 address-bits (three ports) in less than the 110ns guaranteed by the 8088, when a single AVR instruction (at 20MHz) is already 50ns, and that'd only handle *one* byte, of the three.

    So far I've only dug up specs for the 8288 (which converts /S2:0 into /RD, /WR, etc.) and the 8087 FPU (which I don't intend on using in this early stage, BUT, should probably be a decent resource for expectations of the other 8088 outputs, like A19:0).

    And...

    Yep, those specs are *way* more lenient.

    FURTHER, they're *MUCH* more indicative of what's going on...

    I couldn't figure out from the 8088's timing-diagrams *when* these signals were supposed to be *sampled*... Falling-edge? Rising-edge? (are they level-sensitive? E.G. A15:8 being fed directly to a memory-device?)

    But e.g. the 8288 datasheet shows, for the /S2:0 signals, very clear "setup" and "hold" times, very clearly surrounding a specific clock-edge. Similarly of the 8087's datasheet showing which *edge* the Address-bits need to be setup for (and held after).

    Those setup/hold times are shown as minimums, with no maximums... and worst-case we have a minimum setup-time of 35ns.

    One 8088 bus clock-cycle is 1/4.77MHz≈210ns, leaving a whopping ~175ns of extra potential setup-time in many cases!

    So, then, rather than having to switch all the Address inputs (three bytes) within the 8088's spec'd 110ns, we actually have ~175ns to work with. That's doable.

    And, further, some of those signals may not even be sampled at every clock, so might be changed in a prior clock-cycle than the one where it's needed.

    E.G. If it can be determined that A15:8 are only paid-attention-to after the latching of A19:16 and A7:0, and only until, e.g., the end of Data-Out, then it might be possible to change the address-bits *before* the next cycle, e.g. in T4 alongside the change of /S2:0...

    Similarly, it might be possible to stretch those timings a bit:

    A prime example might be, again, A15:8. The higher and lower address-bits are time-multiplexed with status-signals and data, but A15:8 aren't multiplexed at all. Since the others are Muxed, and must therefore be *latched* through a separate latch-chip, that means the entirety of A19:0 won't be available to devices until those latches are latched *and* their propagation-delays... So, then, realistically... it's probably reasonable to assume that no devices attached to the address-bits actually look at the address until *after* that time... so then A15:8 could plausibly be changed even slightly after the other bits' latch-clock-edge. (Or, at the very least, should probably be the *last* bits written, when writing the address-bits).
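
    In AVR-C terms, that ordering idea might look something like this (the PORTA/PORTB/PORTC assignments are purely hypothetical, not an actual pin-out):

    #include <avr/io.h>
    #include <stdint.h>

    /* Drive the multiplexed lines first, and save the un-multiplexed A15:8
     * for last -- nothing downstream can use the full address until the
     * external '373-style latches have latched and propagated anyway.     */
    static void drive_address(uint32_t addr)
    {
        PORTA = addr & 0xFF;            /* AD7:0  -- multiplexed, gets latched     */
        PORTC = (addr >> 16) & 0x0F;    /* A19:16 -- multiplexed with status bits  */
        PORTB = (addr >> 8) & 0xFF;     /* A15:8  -- not multiplexed; written last */
    }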

    Anyways, it's starting to seem less implausible. And maybe even possible without synchronizing the AVR clock to the 8088 clock too accurately.

  • Secret-Revealed, and Too Many I/O!

    esot.eric 01/02/2017 at 03:52 18 comments

    UPDATE: Re: Project "Reveal" and realistic goals in the following few paragraphs...

    ------

    This should've been two separate project-logs... The "secret-reveal" was supposed to be a minor bit merely as lead-up to the current state-of-things (interfacing with the bus).

    -------

    In case you haven't figured out "the secret" of this project... the idea is to use an AVR to emulate an 8088 *in circuit*... To pop out the original 8088 in my (finally-functioning) PC/XT clone, and pop-in my AVR, and see what the blasted thing's capable of.

    A key observation, here, is that AVRs run at roughly 1 instruction per clock-cycle, while 8088s can take *dozens* of clock-cycles for a single instruction. I read somewhere that the 4.77MHz 8088 in a PC/XT runs at something like 500*K*IPS. Whereas, an AVR running at 20MHz would run at something like 20MIPS! FORTY TIMES as many instructions-per-second!

    (albeit, simpler instructions, only 8-bit, and I imagine much of the limitation of an 8088 is the fact it's got to fetch instructions from external memory at a rate of 1/4 Byte per clock-cycle... which of course an AVR wouldn't be immune to, either.)

    Anyways, I think it's plausible an AVR could emulate an 8088 *chip* at comparable speed to the original CPU. (Yahknow, as in, taking minutes, rather than hours, to boot DOS). Maybe even play BlockOut! (3D Tetris).

    But those are *Long Term* goals, and it's entirely likely I'll lose steam before even implementing the majority of the instruction-set. For now, I intend to pull out the BIOS ROM, and replace it with a custom "program" using an *extremely* reduced instruction-set...

    First goal is to output "Hello World" via the RS-232 port... I think that wouldn't take much more than a "jump" and a bunch of "out"s... so should be doable, even by the likes of me.

    ...And, maybe, just maybe, I could fit that and the reduced "emulator" in less than 1K of total program-memory on the two systems (and not making use of other ROM sources such as LUTs for character-drawing on the CGA card)... A bit ridiculous; there's only a few days left for the https://hackaday.io/contest/18215-the-1kb-challenge, and I haven't even decided on which AVR to use....

    (Regardless of the contest, it's an interesting challenge to try to keep this as compact as possible. You can probably see from the previous logs I've already been trying to figure out how to minimize the code-space requirements for *parsing* instructions... *executing* them is still a *long* ways off ;)

    ------------

    I was planning to use my trusty Atmega8515, since I've a few in stock...

    I think it has *exactly* as many I/O pins as I need...


    But... it gets complicated because:

    • The 8088 clock runs at 4.77MHz, derived from a 14.318MHz crystal (divided by 3), but that signal's not available at the processor.
    • 8-bit AVR clocks generally max-out around 16-20MHz (but can be overclocked a little, and sometimes even quite a bit).
    • The 8515 is rated for 16MHz. And, worse for this project, its internal (and *calibratable*) clock is limited to somewhere around 8MHz.
    • So, say I could bump that up to 4.77*2=9.54MHz via the OSCCAL (oscillator calibration) register... that means I'd only be able to execute *two* AVR instructions between each of the 8088's clock-cycles.
    • And, seeing as how the AVR is only an 8-bit processor, there's no way it'd come close to the 8088's bus-timing, what with needing to write 20 bits in a single 8088 cycle (A19:16, A15:8, A7:0, at the beginning of T1).
    • In my initial estimates, I was planning on having 4 AVR clock-cycles for every 8088 clock-cycle. That's not far-fetched... 19.09MHz. (Again, most AVRs these days are rated for 20MHz, and the 16MHz-rated 8515 could probably handle 19MHz.) Doable, but I'd need an external clock for the 8515, and somehow would need to calibrate it to closely match...

    Read more »

  • It Has Begun!

    esot.eric 12/30/2016 at 12:09 0 comments

    It has finally begun... *This* project, rather'n the slew of random-tangents that this project took me on.

    I think it'll be a *tremendous* accomplishment if somehow I manage to fit this within 1K... As in, I think it's pretty much unlikely. But I am definitely designing with minimal-code-space in mind... (which probably serves little benefit to the end-goals, besides this contest, heh! Oh well, an interesting challenge nonetheless).

    Amongst the first things... I needed a 4BYTE circular-buffer... and decided it would actually be more code-space efficient (significantly-so) to use a regular-ol' array, instead of a circular-buffer. Yeahp, that means each time I grab something from the front of the array, I have to shift the remaining elements. Still *much* smaller. And, aside from the "grab()" function, significantly faster, as well.
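
    Roughly this sort of thing -- a minimal sketch, with made-up names (push/grab), not my actual code:

    #include <stdint.h>

    #define BUF_LEN 4

    static uint8_t buf[BUF_LEN];   /* plain array instead of a circular buffer */
    static uint8_t count = 0;      /* how many bytes are currently queued      */

    /* Append a byte at the end -- no head/tail/wrap-around bookkeeping. */
    static void push(uint8_t b)
    {
        if (count < BUF_LEN)
            buf[count++] = b;
    }

    /* Grab the front byte, then shift the rest down one slot. Costs a tiny
     * loop here, but saves the modulo/index-wrapping code a real circular
     * buffer would need everywhere else.                                   */
    static uint8_t grab(void)
    {
        if (count == 0)
            return 0;                       /* nothing queued */
        uint8_t front = buf[0];
        for (uint8_t i = 1; i < count; i++)
            buf[i - 1] = buf[i];
        count--;
        return front;
    }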

    Also amongst the first things... I need to sum a bunch of values located in pointers. Except, in some cases those pointers won't be assigned. But you can't add *NULL and assume it'll have the value 0, so the options are to test for p==NULL, or assign a different default.
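
    E.g., something like the following sketch of the two options (names are hypothetical):

    #include <stdint.h>
    #include <stddef.h>

    /* Option A: allow NULL entries and test for them while summing.       */
    static uint8_t sum_checked(const uint8_t *p[], uint8_t n)
    {
        uint8_t sum = 0;
        for (uint8_t i = 0; i < n; i++)
            sum += (p[i] != NULL) ? *p[i] : 0;
        return sum;
    }

    /* Option B: never leave an entry NULL -- point unassigned ones at some
     * shared `static const uint8_t zero = 0;` -- so no test is needed.     */
    static uint8_t sum_defaulted(const uint8_t *p[], uint8_t n)
    {
        uint8_t sum = 0;
        for (uint8_t i = 0; i < n; i++)
            sum += *p[i];               /* safe: every entry points somewhere */
        return sum;
    }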

    ---------

    Now... I'm trying to figure out how to take an input Byte and route it appropriately. The fast solution is a lookup-table. But we're talking 256 elements in that table, alone! That's already 1/4 of the 1KB limit!

    One observation, so far: One of the things I need to "look up" is a simple true/false value. It turns out, apparently, that except for 3 cases (in red), the lowest bit doesn't affect the true-ish-ness/false-ish-ness of the particular characteristic I'm looking for. Alright! So I only need a 128-element look-up-table (and a handful of if-then statements).

    Now, better-yet... Currently I'm only looking up a True/False value. That's only a single bit. So, shouldn't I be able to divide that 128-lookup down to 128/8=16 BYTES? Now we're talking.

    First idea: Simply use the first four-bits of the value to address the LUT. Then, the remaining three bits (8 values) will select one of the 8 bits in the LUT's returned byte.

    Simple!
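
    In C, that packed lookup might look something like this -- a sketch only: the table contents below are dummy placeholders (the real bits depend on the actual opcode map), and I'm assuming the table sits in AVR flash via PROGMEM:

    #include <stdint.h>
    #include <avr/pgmspace.h>

    /* 128 one-bit entries packed into 16 bytes (contents are dummies). */
    static const uint8_t needs_2nd_byte[16] PROGMEM = {
        0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
        0xFF, 0xFF, 0xFF, 0xFF, 0x00, 0x00, 0x00, 0x00,
    };

    /* Drop the lowest bit of the opcode (it mostly doesn't matter), then use
     * the top 4 of the remaining 7 bits to pick a byte and the low 3 bits to
     * pick a bit within it. The handful of exceptions get if-tests elsewhere. */
    static uint8_t lookup(uint8_t opcode)
    {
        uint8_t idx  = opcode >> 1;                 /* the 7 significant bits */
        uint8_t byte = pgm_read_byte(&needs_2nd_byte[idx >> 3]);
        return (byte >> (idx & 0x07)) & 0x01;
    }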

    ---------

    The thing is, *that one* true/false value is only *one* of the *many* characteristics I'll be needing to look-up. So... The question becomes... Does this method save any space...? That has yet to be determined.

    On the one hand, this particular true/false value is necessary early-on. On the other hand, if I used a 256-element LUT, I could determine *many* characteristics, simultaneously. On the other other hand, a 256-Byte LUT isn't quite enough to determine *all* the characteristics I'll be needing to determine... as, in many (but not all) cases a *second* byte is necessary.

    So, the first 16-byte LUT will tell us whether a second byte is required (the true/false value I'm looking for, currently).

    So, let's just pretend there was no second-byte required to determine the necessary characteristics, and the original 256-element LUT would give us all the info we'd need. I'd still need to know whether additional-bytes are necessary (in this imagining: not to determine the characteristics, but to determine *arguments*). So, somewhere in that 256-element LUT's output, I'd *still* need an indicator of whether additional bytes are necessary. Which would mean one bit in each of 256 elements (32 bytes). Which, I've already managed to reduce to 128 bits (16 bytes), and a handful of if-then statements (~10? bytes).

    Now that wouldn't sound like much... I've managed to save something like 6 bytes... And, in the end, it's quite plausible the 256-element LUT will still be necessary to determine the *other* characteristics.

    So, here's a reality-check... I *do* need to know whether the input requires a second byte. So, I *do* need that bit of data for each element in the 256-element LUT. But, one of the other characteristics I need to "look up" happens...

    Read more »

View all 16 project logs


Discussions

Ted Yapo wrote 12/09/2016 at 15:26

You left a hint on your "project list" / "to-do" page.  I think I guessed what it is - but I'm not telling :-)

Good luck!  I'll PM you a pertinent link.


esot.eric wrote 12/09/2016 at 17:41

Shhh! :)

Oh, and thanks for the luck-wishing, I'll need it!

