Keyboard Interrupt Issues

A project log for Cat-644

An 8-bit computer based around an AVR ATMega-644 and junk-box SRAM.

Mars 08/09/2014 at 08:32

The keyboard caused me a lot of trouble throughout this project.  It ended up being re-done twice!

Reading a PS/2 keyboard is fairly straightforward.  Here is a snip from my Project Details page:

The PS/2 protocol consists of 2 signals: CLK and DATA. These are on an open-collector bus running at TTL (5v) levels. Open collector refers to the way the transistors are arranged, but that isn't that important here. What is important are three things:

In normal operation, the keyboard controls the clock. Whenever a key is pressed, a scan code is transmitted from the keyboard to the host.

One important detail: DATA is considered valid on the falling edge of the CLK signal. This means the keyboard puts out a data bit, THEN drops the clock. When watching the clock, we read the data at the high-to-low transition.

11 bits make up a PS/2 keyboard frame: 1 start bit (always 0), 8 data bits (sent LSB first), 1 parity bit, and 1 stop bit (always 1).

The AVR has a feature called the pin-change interrupt, which means when an I/O pin changes state, an interrupt is run.  The pin to watch is the PS/2 CLK pin; when that signal starts bouncing up and down, the user has pressed a key. The PS/2 data line is valid when CLK is low, so IF a pin change interrupt has occurred AND CLK is low, we read the data bit.  If we save these bits as we read them, we should get 1 complete keyboard scan code.  The code to do this is here:

#define KEY_PORT  PORTA
#define KEY_DDR   DDRA
#define KEY_PIN   PINA
#define KEY_MASK  0b00111111
#define KEY_CLK   0b10000000   /* A7 is connected to PS/2 clock */
#define KEY_DATA  0b01000000   /* A6 is connected to PS/2 DATA */

void key_init()
{
    
    KEY_DDR &= KEY_MASK;  //CLK and DATA pins (A7, A6) become inputs
    KEY_PORT |= KEY_CLK | KEY_DATA; //enable the internal pullups on CLK and DATA
    
    //enable interrupt
    PCICR = (1 << PCIE0);  //enable pin change interrupt
    PCMSK0 = (1<<PCINT7);     //enable PCINT7    
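    //note: the global interrupt enable (sei()) is assumed to be done elsewhere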
    
}

//these are volatile so that non-ISR code reading these values doesn't assume they never change

volatile char keyb_readpos=0;  //written by reader
volatile char keyb_writepos=0; //written by driver
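//KEYB_SIZE (the length of the scan-code ring buffer) is defined elsewhere in the project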
volatile char keyb[KEYB_SIZE];
volatile char keybits=0;
volatile char keybitcnt=0;

ISR (PCINT0_vect)
{

    //if KEY_CLK is low, this was a falling edge    

    if (!(KEY_PIN & KEY_CLK))
    {
	//get the DATA bit (A6) and shift it left 1 so it lands in bit 7
        unsigned char inbit = (KEY_PIN&KEY_DATA)<<1;
        
	//are we the start bit, and is it zero?
        if (keybitcnt==0)
        {  
            if (inbit==0)
                keybitcnt++;
        }
        else if (keybitcnt<9)
        {
	    //are we a data bit?

            keybits = keybits>>1; //shift right all bits-so-far
            keybits |= inbit;   // OR the new bit on the left
            keybitcnt++;
        }
        else if (keybitcnt==9)
	    //are we the parity bit?  (ignore it)
            keybitcnt++;
        else { 
	    //i guess we are the stop bit
            if (inbit==128)    //are we a '1'?
            {
		//we have a data byte!

                keyb[keyb_writepos] = keybits;  //write in buffer
                if (keyb_writepos==(KEYB_SIZE-1))
                    keyb_writepos=0;
                else
                    keyb_writepos++;
                
            }
            //if we got to the stop bit and it is not a '1', something went wrong.
	    //reset and try again later
            keybitcnt=0;
        }
        
    }
    
}
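
Not shown above is the reader side of that ring buffer. As a minimal sketch (key_available and key_get are illustrative names, not the actual functions from my source), a consumer could look something like this:

unsigned char key_available()
{
    //single-byte reads are atomic on the AVR, so no interrupt locking is needed here
    return keyb_readpos != keyb_writepos;
}

char key_get()
{
    char code;
    while (keyb_readpos == keyb_writepos)
        ;  //spin until the ISR has stored a scan code
    code = keyb[keyb_readpos];
    if (keyb_readpos==(KEYB_SIZE-1))   //wrap around, same as the writer does
        keyb_readpos=0;
    else
        keyb_readpos++;
    return code;
}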


The above works well, as long as the video interrupt is not running.  If the video interrupt IS running, we can still read the keyboard just fine, but the screen sometimes loses sync.  Why?


Keyboard re-do:

Trying to read the keyboard while outputting VGA proved problematic. The monitor wants precise timing. If the user presses a key right before the video routine is supposed to run, the keyboard routine blocks the VGA routine. I do generate the hsync signal with a hardware timer, so the hsync edge can still happen in the right place, but the start of the video interrupt routine has now shifted too late.  HSYNC-to-low is done by the timer, but turning HSYNC high, and toggling VSYNC, is still done in software.  Also, if the video interrupt starts too late, it won't finish in time for the next HSYNC, and the timing will break.

It turns out the PS/2 spec has the answer. (or so I thought at the time)

The host (computer) can also send commands to the keyboard. To send a command, the host pulls the CLK line low. The keyboard is supposed to immediately stop transmitting and listen. When the host has finished its command, the keyboard can attempt to resend the last thing it was sending. The keyboard is supposed to buffer any keystrokes the user makes during this time, and send them when it can. Using this, I would pull the PS/2 CLK low before the active video portion of the frame. During the vertical blank period (where the AVR sends only hsync/vsync signals, and no video), the CLK line was released back to being an input w/ pullup. This worked reasonably well... with one of my keyboards. Another keyboard was erratic, and a third keyboard didn't seem to work at all.  It all seems to depend on the keyboard's clock timing. If the keyboard's clock is fast enough to send a full code during the vertical blanking... fine. If the keyboard is too slow, it never successfully sends a code before the 'shut up' signal is applied again. But, having 1 keyboard work while processing video was an important milestone.
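
For reference, 'pulling CLK low' on an open-collector bus just means driving the pin as an output low, and releasing it means turning it back into an input with the pullup on. A rough sketch of the two operations (key_inhibit and key_release are illustrative names, not the actual functions from my source):

static inline void key_inhibit()
{
    KEY_PORT &= ~KEY_CLK;   //output latch low first, so we never drive the bus high
    KEY_DDR  |=  KEY_CLK;   //CLK pin becomes an output, pulling the line low
}

static inline void key_release()
{
    KEY_DDR  &= ~KEY_CLK;   //CLK pin back to an input...
    KEY_PORT |=  KEY_CLK;   //...with the internal pullup enabled again
}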

Keyboard re-re-do:

I needed a way to read the keyboard while processing video. Using the CLK signal as a way to 'shut up' the keyboard wasn't working very well. It turns out I already have an interrupt firing periodically. The video interrupt fires for every scanline of video... this is at a rate of about 33 kHz. The maximum PS/2 CLK rate is 15 kHz. So... if I sample the PS/2 port at 30 kHz or faster (twice the maximum clock rate), I should be able to see both the low and high parts of each clock cycle. (Nyquist/Shannon limit)

At the beginning of the video interrupt is a cycle-counting wait that equalizes interrupt latency. At the end of that wait, the value of PORTA (PINA actually) is captured to a variable.  At the end of the video scanline interrupt, the captured PINA value is inspected and compared to the previously captured value (from the previous firing of the interrupt) to see 1) if CLK changed, and 2) if the transition was from high to low. If it was a high-to-low transition, the captured PS/2 DATA value is processed in the same way the keyboard interrupt did previously. The video interrupt now handles video, AND polls the keyboard.

There's just one problem: when the video interrupt decides it's time to run the keyboard logic, the interrupt takes too long. By the time the keyboard logic is finished, all the registers are popped off the stack, and the interrupt handler returns to wherever the main program was before it fired, we are already past the point where the next video interrupt should have started. At best, the next line of video is scrambled; at worst, the slight variation in sync signaling has broken the monitor's video sync. It's the same problem all over again.  What to do?

The simple solution is to not restore machine state. Instead of carefully putting the processor back to its pre-interrupt state, we just jump to the top of the interrupt routine and directly proceed into processing the next video scanline. This only happens when the PS/2 keyboard sends a bit of data... so every time the user presses a key, the CPU loses the little bit of time in between a few scanlines... which doesn't even amount to much. Now the keyboard code works with all 3 of the keyboards I've tested, AND causes no interference to the video picture at all.

unsigned char oldclock=1;  //last state of ps2 clock

ISR (TIMER1_COMPA_vect)  //video interrupt (TIMER1 )
{
    unsigned char i;
    unsigned char kp;     /* captured value of the PS/2 port (PINA) */
    unsigned char laddr;
    unsigned char lddr;
    unsigned char ldata;
    unsigned char lctrl;
       
    asm volatile ("revideo:");

    wait_until(62);  /* Wait until TIMER1 == 62 clocks */
       
    kp = KEY_PIN;  //capture ps/2 port at regular interval
   

 /* Output 1 scanline here
    (not shown, since it's not important to keyboard function) */


    i = kp & KEY_CLK;  /* isolate the CLK bit */
    
    if ((!i) && (i != oldclock))  /* If the CLK bit is low, but was high last time, we had a transition*/
    {
        oldclock=i;
           
        keypoll(kp);  //do keyboard logic, with the captured port value
                
        while (TCNT1>50);//wait for counter to roll over 

        TIFR1 |= (1 << OCF1A); //clear timer interrupt flag  (setting this bit clears the flag) 

	/* Note: we have to clear this flag, because otherwise as soon as the video interrupt finally
		returns, it will immediately re-enter because the timer has re-tripped
	*/


        asm volatile ("rjmp revideo"); /* go up to next scanline */
        
    }

    oldclock = i;  /* remember the CLK state for the next scanline, so later falling edges are still detected */
}

inline void keypoll(unsigned char portin) {
	/*This is largely the same as the pin change interrupt used for PS/2*/    

        unsigned char inbit = (portin & KEY_DATA)<<1;
             
        if (keybitcnt==0)
        {
            if (inbit==0)
            keybitcnt++;
        }
        else if (keybitcnt<9)
        {
            keybits = keybits>>1;
            keybits |= inbit;
            keybitcnt++;
        }
        else if (keybitcnt==9)
        	keybitcnt++;
        else { //cnt is 10 (11th bit of frame)
            if (inbit==128)
            {
                keyb[keyb_writepos] = keybits;  //write in buffer
                if (keyb_writepos==(KEYB_SIZE-1))
                    keyb_writepos=0;
                else
               	    keyb_writepos++;    
            }    
            keybitcnt=0;
        }    
    
    }

The 'rjmp revideo' is unfortunately ugly inline assembly.  For whatever reason, when a simple goto was used, gcc tried to be too clever and reordered things so that a lot of the keyboard logic ran before the video logic, completely breaking the video signal.

Nearly the entire video interrupt is written in C, with only small parts in assembly.  A lot of care is taken to make sure the assembly output of gcc does what I expect, in the order I expect it, since timing is so important.  This process is fairly fragile.  What I will probably end up doing eventually, so that others who might not have the same version of gcc can still build it, is take the assembly output of gcc for this function, beautify and comment it, and use that as the source code for the video interrupt routine.
