Update Q1 2022

I will try to keep this section of the Hackaday site going with short updates for any interested folks.  The most recent update is about the real-time clock.

Realtime clock

I wasn't that bothered about adding a real-time clock until I got to being 'almost complete' with the FAT16 handling. After using the filesystem for a while, it becomes obvious that files need timestamps.  I decided to use a part from the period, the Dallas DS12887, which was found in many PCs from the late 80s onwards.  The problem is that these units have a built-in battery and it eventually dies.. the solution I learnt from research is that some clever folks found you can cut away the packaging to expose hidden pins that give access to the battery terminals - and then hack on a button cell such as a CR2032!

The biggest pain, though, was the CPU interface - the DS12887 is not directly compatible with the 6502 bus signals, but again, after a lot of research, I managed to find the right configuration.

Then came the low-level software - OS routines to get and set the time (plus checking that the clock is OK on boot-up).  After that I could code the filesystem timestamp handling - both setting and reading the time.  It's actually a bit of a pain on an 8-bit-only CPU!
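To make the timestamp handling concrete, here is a minimal C sketch of the idea (the real routines are 6502 OS code; the memory-mapped base address and the century handling are assumptions on my part, while the register offsets follow the standard MC146818-compatible map and the FAT16 date/time packing is the standard layout):

```c
#include <stdint.h>

/* Hypothetical memory-mapped base of the DS12887 -- the real address depends
   on my IO decode. Register offsets follow the standard MC146818-compatible
   map used by the DS12887. */
#define RTC_BASE   ((volatile uint8_t *)0x0400)
#define RTC_SEC    0x00
#define RTC_MIN    0x02
#define RTC_HOUR   0x04
#define RTC_DAY    0x07
#define RTC_MONTH  0x08
#define RTC_YEAR   0x09
#define RTC_REG_A  0x0A    /* bit 7 = UIP (update in progress) */

/* Read one register, waiting for any in-progress update to finish first.
   Assumes register B has been set up for binary (not BCD) and 24-hour mode. */
static uint8_t rtc_read(uint8_t reg)
{
    while (RTC_BASE[RTC_REG_A] & 0x80)
        ;                              /* spin while UIP is set */
    return RTC_BASE[reg];
}

/* FAT16 packs a timestamp into two 16-bit words:
   time = hhhhhmmm mmmsssss  (seconds stored as 2-second units)
   date = yyyyyyym mmmddddd  (years counted from 1980)          */
void fat16_timestamp(uint16_t *fat_time, uint16_t *fat_date)
{
    uint8_t  sec   = rtc_read(RTC_SEC);
    uint8_t  min   = rtc_read(RTC_MIN);
    uint8_t  hour  = rtc_read(RTC_HOUR);
    uint8_t  day   = rtc_read(RTC_DAY);
    uint8_t  month = rtc_read(RTC_MONTH);
    uint16_t year  = 2000 + rtc_read(RTC_YEAR);   /* assumed century base */

    *fat_time = (uint16_t)((hour << 11) | (min << 5) | (sec >> 1));
    *fat_date = (uint16_t)(((year - 1980) << 9) | (month << 5) | day);
}
```

A more robust version would read the time twice and compare, in case the chip rolls over between individual register reads.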

In-line assembler 

This now means I have a lot of power when I need it - for example, being able to run code off interrupts.  I am quite pleased with how I ended up slotting this into the interpreter.  The interpreter stores all keywords as tokens to speed up execution, and tokens are identified by bit 7 being set in the code.  That means 128 tokens maximum - which is fine for BASIC, but with at least 56 mnemonics in the 6502 on top of the existing keywords, I couldn't treat them as normal tokens.

My solution was to make the in-line assembler a kind of sub-interpreter, invoked by the main interpreter whenever it encounters a '.' (dot) as the first non-whitespace character.  At that point the in-line assembler takes over the decoding, so the 128-token limit is side-stepped.
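As a rough illustration of the dispatch (a C sketch with made-up hook names, not dflat's actual code): the statement executor checks for the leading dot first and hands the rest of the line to the assembler; otherwise anything with bit 7 set is treated as a keyword token.

```c
#include <stdint.h>

/* Illustrative hooks into the interpreter -- the names are made up. */
void exec_keyword(uint8_t token, const uint8_t **p);  /* run one keyword      */
void exec_assembler_line(const uint8_t *p);           /* in-line assembler    */
void exec_other(const uint8_t **p);                   /* variables, literals  */

/* Execute one tokenised line. Keyword tokens have bit 7 set, which caps the
   keyword table at 128 entries; a '.' as the first non-whitespace character
   routes the whole line to the assembler sub-interpreter instead, so the
   6502 mnemonics never need token codes of their own. */
void exec_line(const uint8_t *p)
{
    while (*p == ' ')                /* skip leading whitespace          */
        p++;

    if (*p == '.') {                 /* in-line assembly statement       */
        exec_assembler_line(p + 1);
        return;
    }

    while (*p) {
        if (*p & 0x80) {             /* bit 7 set => keyword token       */
            uint8_t tok = *p++;
            exec_keyword(tok, &p);
        } else {
            exec_other(&p);          /* everything else in the line      */
        }
    }
}
```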

My other problem was how to treat labels and variables.  I decided to use the same variable namespace as the main code - which is handy, because variables can be set in BASIC and then used by the assembler.  This makes for a potentially very powerful macro assembler.

My final problem was resolving address locations.  Some 6502 addressing modes are ambiguous - zero page versus absolute, for example - and there is no way to tell which one is wanted until the address is known.  But if that address comes from a variable which is a forward reference, and so hasn't been declared yet, what then?  The solution is to do one pass where everything unknown is assumed to be zero page, then a second pass with the forward references filled in.  This may turn some zero page modes into absolute - which takes an extra byte - and if that happens, all the later forward references are wrong again until they are reached and their values updated!  So a third and final pass uses the correct values.

Hence this is a 3-pass assembler.
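The heart of it can be sketched in C roughly as follows (simplified, and not dflat's real data structures) - the only decision that changes between passes is how many bytes an address operand takes:

```c
#include <stdint.h>
#include <stdbool.h>

/* Simplified sketch of the pass logic -- not dflat's real data structures. */
typedef struct {
    bool     known;   /* has the label/variable been defined yet?  */
    uint16_t value;   /* its address once it is known              */
} operand_t;

/* How many operand bytes does an address-type operand need?
   Pass 1: unknown forward references are optimistically assumed zero page.
   Pass 2: everything is known, but operands that grew from zero page to
           absolute shift the code after them, so earlier guesses at the
           addresses of later labels are stale again.
   Pass 3: all addresses are stable, so the emitted bytes are correct.   */
uint8_t operand_bytes(const operand_t *op, int pass)
{
    if (pass == 1 && !op->known)
        return 1;                         /* assume zero page        */
    return (op->value < 0x0100) ? 1 : 2;  /* zero page vs absolute   */
}
```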

What can I do with this that was not possible in BASIC?  Well, aside from the obvious massive speed increase, it would be impossible to do effects like this, which rely on handling interrupts with microsecond precision to change video memory settings.

Update (Nov 2018) - Enhanced memory map and decoding - 128K RAM and 64K ROM

I'll copy the notes from my schematic by way of an explanation..

Objectives

My computer uses 16K ROM and 48K RAM (minus addresses mapped to IO), but I have 64K EEPROM and 128K SRAM chips on the breadboard.  The 65C02 can only address 64K through its 16-bit address bus, so the route I will go down is memory banking - the ability to switch different parts of the total RAM/ROM in and out of the 16-bit address range.

The full set of features is as follows:

So I need somewhere to hold the RAM and ROM bank numbers, to be decoded to switch in the correct part of memory, plus the ability to switch ROM out altogether to enable a full 64K of RAM (minus the IO window).  I also need to move the IO window from B000-BFFF (a 4KB block) somewhere lower, and make it smaller too, as no IO device needs more than 16 bytes of addressing space.
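From the software side, the end result can be pictured with a small C sketch (the latch addresses, bank sizes and bit meanings below are purely illustrative assumptions, not the actual decode):

```c
#include <stdint.h>

/* Hypothetical latch addresses somewhere inside the relocated IO window --
   the real locations and bit layouts depend on the decode logic below.   */
#define RAM_BANK_REG  (*(volatile uint8_t *)0x0400)  /* RAM bank number    */
#define ROM_BANK_REG  (*(volatile uint8_t *)0x0401)  /* ROM bank number    */
#define ROM_CTRL_REG  (*(volatile uint8_t *)0x0402)  /* bit 0: ROM enable  */

/* Switch a different slice of the 128K RAM / 64K ROM into the 6502's
   64K address space by writing a bank number to the latch.               */
static inline void select_ram_bank(uint8_t bank) { RAM_BANK_REG = bank; }
static inline void select_rom_bank(uint8_t bank) { ROM_BANK_REG = bank; }

/* Switch ROM out altogether to expose a full 64K of RAM (minus the IO window). */
static inline void rom_disable(void) { ROM_CTRL_REG = 0; }
```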

Constraints

But I need to work within the existing constraints of space and parts to hand:

Solution approach

I engineered the features through 3 solution approaches as follows..

Solution 1  - Move IO access from B000 to 0400 and reduce to 4 pages (1KB block)

Implementation

Solution 2 - Add a ROM disable output and write-to-ROM detection

Implementation

Solution 3 - Add RAM and ROM bank selection bits

Implementation


Update (June 2018) - Clock speed doubled - now running @ 5.36MHz

The clock circuit of my project uses a 21.47727MHz master crystal, which is fed through a 4-bit counter to produce the right clock speeds for various devices:

I also have a 5.36MHz clock available, but in previous attempts at using it, I ended up with a blank screen and/or a frozen machine.  This led me to believe that my RAM was not fast enough to keep up with the demand at 5.36MHz.

So I tried swapping out the 70ns RAM chip for a very fast 15ns RAM chip.  I also made some software adjustments, mainly putting in extra NOP instructions to slow down the rate at which the TMS9918 was being driven and the BBC keyboard was being strobed.  To cut a long story short, it turns out that the 15ns chip is glitchy, but the 70ns chip works fine once the software allows for the faster CPU cycles.

I also got a huge amount of insight from the expert folks over at the 6502 forum - it turns out my assumptions about what the 6502 does across one clock cycle were incorrect.  My current design still works without any issues, but it's not optimal and I will sort it out in due course.

So I'm mega pleased.  I would never have dreamt of a 5.36MHz 6502 back in the early 80s.  The BBC Micro was the fastest 6502 micro around, running at 2MHz, followed by the Atari 8-bits at 1.79MHz.  My machine is clocked more than 2.5 times faster than the BBC, and with the tokenising BASIC interpreter I built (dflat), a serious amount of power is available to write interesting programs :-)


Update (Oct 2017) - Improvements to dflat

I thought I had made a more recent post than last December - time flies!  I haven't been able to spend as much time as I would have liked, but in the last few weeks I have built a few improvements to dflat:


Update (Dec 2016) - Tetris game in dflat!

I felt the need to do something more sophisticated than the Invaders game in dflat, so I have attempted a decent implementation of Tetris. The screenshot below shows the game - I am quite pleased with it, as it uses only my custom dflat interpreter. I am using sprites, sound, user-defined graphics and a multi-colour palette to give the game a nice look and feel. In addition, to optimise the performance of scrolling the game map when a line is completed, I am using fairly intense string handling to maintain the game map rather than the original 20x10 integer array. Whilst developing this game, I decided to add timer functions to dflat to help with timing the various animations in the game loop - a benefit of building one's own computer language!
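As a rough sketch of the string-based map idea (shown in C for clarity - the actual game is written in dflat): each playfield row is a string, so clearing a completed line becomes a handful of whole-row copies that the interpreter can do as single string operations, rather than a nested loop over a 20x10 integer array.

```c
#include <string.h>

#define MAP_W 10      /* playfield width in cells  */
#define MAP_H 20      /* playfield height in rows  */

/* One string per row: '.' for empty cells, other characters for blocks. */
static char game_map[MAP_H][MAP_W + 1];

/* Remove a completed row: shift every row above it down by one, then put a
   fresh empty row at the top. Each shift is a single row-string copy.     */
void clear_line(int row)
{
    for (int r = row; r > 0; r--)
        memcpy(game_map[r], game_map[r - 1], MAP_W + 1);
    memset(game_map[0], '.', MAP_W);
    game_map[0][MAP_W] = '\0';
}
```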

Invaders retro game done!

I've been a bit busy with work, but have managed to create a small and simple Space Invaders-type game entirely in dflat. It takes me right back to around 1984, when I learnt to program from magazine listings! Also, I have added a couple of videos with commentary on YouTube - one on the hardware (see below) and one on dflat (see the dflat log).


State (April 2016)

As of the last update, my home brew has the following features:

First for a bit of context (back to around late 2014)

I guess I have always been interested in the lower-level details of computers and how they work. As a kid of the 80s in the UK, I got to know the inner workings of my old Oric-1 and Atari 800XL, which entailed learning 6502 assembly and the I/O chips (including sound and video).

Then I had to grow up. I went to university, and then into the world of work. My first couple of jobs were low-level (sort of), building software for embedded systems using a mixture of C, C++ and assembly.

But for almost 20 years I have been designing, architecting and consulting around distributed business systems. So I moved completely away from the low level, working at the business-outcome level - which is great in many, many ways.

However I still had a passion for being able to wield the computer at its lowest levels. By the 2010s I started to gain a desire to build my own computer - (almost) the ultimate expression of this. I did some research and found not only loads of homebrew computer enthusiasts, but even homebrew processor enthusiasts. But as is often the way, life and work took over and I put it to one side.

Then 2014..

Around late 2013, the desire to build my own computer hadn't gone away, and I had continued my research, looking longingly at other enthusiasts' efforts. Finally, I resolved to stop pondering and actually get on with it. I needed a hobby that would fit around a demanding job and home life, so this would be something I could let progress as my time allowed.

Future

I had a number of initial goals:

I then set some more goals:

So what next?  I seem to have done everything I set out to do.  Well, this whole project is really more of a hobby, so what the 128K RAM and 64K ROM now let me do is put even more software features into my computer.  Especially with the quadrupling of ROM, I think I can extend the custom language to add loads of cool graphics and perhaps sound features.  In addition, I think I need a proper DOS-type command line and drivers that will deal with long file names and directories (perhaps even FAT32 rather than FAT16).

One thing I do want to do is to add an assembler to dflat - then my language will be almost as good as the legendary BBC BASIC from 1982! :-)