-
Dead screen debugging
05/29/2021 at 10:33 • 0 comments
You learn new things each day you work on a project, right? For larger projects I like to work with vscode and platformio. This makes life easier because you can work on code for both processors in one editor, and I really like the additional functions like split screen, a function index and being able to see all references to a function or variable. So when I finished working on the code for the I2C monitor in a development or test workspace, copied it to the main project and checked if it played nicely with everything else, something strange happened …
The code compiled with the usual nags, which are nothing serious and mostly library related, but when I uploaded it the oled screen stayed dead. How can this be? First you check all your cabling; I didn’t use the oled while working on the I2C-monitoring code in the test workspace. The cabling seemed fine, so I checked with another oled screen, but this one also played dead … oops. Next I commented out some of the changes I made in the housekeeping and the main loop, but that didn’t make any difference at all. Then I focused on the setup routine and added a delay after the initialization; this way the screen should show the Adafruit logo. It didn’t. I tried a quick I2C scanner and it showed a response at the correct address, so the screen must be ok, right? Next I loaded the Adafruit example into the test workspace and uploaded it to the Teensy and … success, it worked! Good news and bad news. The good news: the hardware is working. The bad news: the fault is entirely my own.
I did find out that the oled.begin function in my project did not succeed. This is probably a library thing, right? I didn’t change anything about the oled code in the setup since the last time it worked, so maybe changed or different library versions, or dependency-related issues due to updates? I copied the library versions from the test workspace with the working Adafruit example to the main project, but that didn’t help. I simply could not figure out why the example code worked and my own program didn’t. Off to the forums to find some answers, and I can tell you, that is not easy: searching for why the example works and your own code does not is an issue a lot of beginners face. But I did find one mention of not having enough memory for the library to create an off-screen display buffer on an Uno. Could this be the issue? Ah, my main program still had an array size of 8192 items for the capture buffer. This was ok with DMA alone, but the new code needed more buffers: one word and two byte arrays instead of one word and one byte array. The Teensy is 32 bits, so in hardware it probably ends up as three word arrays. After I reverted to array sizes of 4096 items, the screen came back to life.
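For what it’s worth, this failure is easy to catch early: Adafruit’s begin() returns false when it can’t allocate the off-screen display buffer. A minimal sketch of the check (display type, I2C address and buffer names are my assumptions, not necessarily what the project uses):

#include <Wire.h>
#include <Adafruit_SSD1306.h>

#define CAPTURE_ITEMS 4096                 // was 8192; together with the extra byte array
uint16_t captureWords[CAPTURE_ITEMS];      // that left too little RAM for the display library
uint8_t  captureBytes1[CAPTURE_ITEMS];
uint8_t  captureBytes2[CAPTURE_ITEMS];

Adafruit_SSD1306 oled(128, 64, &Wire);     // assumed 128x64 I2C display

void setup() {
  Serial.begin(115200);
  // begin() allocates the off-screen buffer; it returns false when there is
  // not enough free memory left, which is what happened here.
  if (!oled.begin(SSD1306_SWITCHCAPVCC, 0x3C)) {
    Serial.println("oled.begin failed - not enough memory?");
    while (true) {}                        // halt so the failure is obvious
  }
  oled.display();                          // shows the Adafruit splash screen
}

void loop() {}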
I should look into some memory optimization for the buffers, if it does not take too much interrupt time; another item for the to-do list.
-
I2C Monitor/Sniffer, part 2
05/20/2021 at 17:53 • 0 comments
Well that was a bit presumptuous: “Now to integrate this into the main program”. Yes, we can get the timing information this way, but it won’t be the correct time. I added a delay(address); to the Arduino Uno sketch to check the timing; this way each address scan is delayed by one additional millisecond, so the time information should increase by one ms each stop/start cycle. Alas, this was not the case. It seems the DMA channel for the time misses triggers. Thinking some more about this, it is obvious: a DMA routine to transfer 1 byte from a peripheral to SRAM takes 16 cycles of the F_BUS and transferring 4 bytes takes (probably?) 19 cycles, so both DMA requests back-to-back take between 32 and 37 cycles. At 72MHz and an F_BUS of 36MHz this means the system is busy for at least 890ns processing the DMA requests. Should be doable, right? Wrong. A 400kHz I2C bus means SCL toggles 800,000 times a second, so that’s every 1250ns. Ok, so still doable, right? Nope. Although strictly speaking we only need to trigger on the rising SCL edges, in order to detect start, restart and stop signalling we need to trigger on any change of the SDA line. A rising SDA edge right before a rising SCL edge will lead to missed clock signals. For both changes to be detected and processed they would need to be more than 890ns apart, and that just isn’t the case when SCL is low and a one is put on the bus.
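For reference, the Uno side of that check was nothing more than the usual scanner loop with the extra delay; roughly like this (a sketch from memory, not the actual code):

#include <Wire.h>

void setup() {
  Wire.begin();                       // the Uno is the I2C master being sniffed
}

void loop() {
  for (uint8_t address = 0x01; address <= 0x7E; address++) {
    Wire.beginTransmission(address);  // start condition + address byte
    Wire.endTransmission();           // stop condition
    delay(address);                   // each scan waits one millisecond longer than the previous one
  }
}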
Using interrupts I was able to capture timing information at 100kHz. Maybe with overclocking we can get the speed we need? I wrote a simple test program that interrupts on any change of pin 5; in the interrupt routine I set pin 6 high, processed the data and time, and set pin 6 low again. On an oscilloscope we can now see how long the interrupt processing takes, from trigger to interrupt routine, and how long the interrupt routine itself takes. At 72MHz entering the interrupt took about 900ns and the processing took another xxxns. Enough for an I2C bus @100kHz but not fast enough for 400kHz and faster. Overclocking to 144MHz reduced those times to 500ns and xxxns, but that is still not fast enough if there are multiple interrupts close together.
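The test program boiled down to something like this (reconstructed; the exact processing in the real routine differs):

volatile uint8_t  portSample;
volatile uint32_t timeSample;

void pinChangeISR() {
  digitalWriteFast(6, HIGH);   // scope marker: we are inside the interrupt routine
  portSample = GPIOD_PDIR;     // "process the data": sample the PortD input register
  timeSample = micros();       // and the time
  digitalWriteFast(6, LOW);    // scope marker: done
}

void setup() {
  pinMode(5, INPUT);
  pinMode(6, OUTPUT);
  attachInterrupt(5, pinChangeISR, CHANGE);  // interrupt on any change of pin 5
}

void loop() {}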
So, mission impossible? No, not really. We just need a way to filter out the unwanted interrupts. We are only interested in SDA changes while SCL is high, so we convert an SDA change into a pulse and only let that pulse reach the Teensy when SCL is high. The last part is easy: just put it through an AND gate. The first part is tricky but doable: if we delay the SDA signal a bit, say 50ns, and feed both the undelayed and the delayed signal into an XOR gate, we get a 50ns wide pulse each time the SDA signal changes. We can make an XOR gate with 3 NAND gates and 2 inverters (or 5 NAND gates). If we use the 74HC00 and the 74HC14, we get 4 NAND gates and 6 inverters, and we can use the inverters to delay the signal: just put an RC network between two inverters.
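In Boolean terms (my shorthand, not from a schematic): what reaches the Teensy is SCL AND (SDA XOR delayed_SDA), where delayed_SDA is the copy of SDA that went through the RC-delayed inverter pair. The XOR produces a short pulse on every SDA edge, and the AND only lets that pulse through while SCL is high.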
Using this “filtering circuit” we can capture signals and time information for an I2C bus @ 400kHz; with overclocking to 144MHz we can probably get up to 1MHz.
-
I2C Monitor/Sniffer
05/05/2021 at 16:51 • 0 comments
The project already included an I2C scanner, which scans an I2C bus to determine the slave addresses on it, but I also wanted to add an I2C monitor or sniffer. However, the GPIO pins that are broken out are pins 5 & 6, and they are not an I2C bus. The scanner uses the SoftWire library, but for sniffing you need to passively listen to the signals on the bus.
My first try was using a pin-change interrupt on GPIO1 & 2, aka digital 5 & 6, aka PortD bit 7 & PortD bit 4, aka SDA and SCL. No need to look at the falling edge of SCL, nothing interesting happens there, but the falling edge of SDA indicates a Start condition. So digital 5 became an on-change interrupt and digital 6 a rising-edge interrupt. I added an Arduino Uno to scan the I2C bus by opening each address between 0x01 and 0x7E for writing and seeing if any device acknowledges the address. The interrupt routines just grabbed the complete first byte of PortD and placed it, together with the current value of micros(), into an array.
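The interrupt routines were essentially this (my variable names; the real buffers are described in the later logs):

#define CAPTURE_ITEMS 4096
volatile uint8_t  capturePins[CAPTURE_ITEMS];   // first byte of PortD: contains SDA (bit 7) and SCL (bit 4)
volatile uint32_t captureTime[CAPTURE_ITEMS];   // micros() at the moment of the trigger
volatile uint16_t captureIndex = 0;

void busEventISR() {
  if (captureIndex < CAPTURE_ITEMS) {
    capturePins[captureIndex] = GPIOD_PDIR;     // grab the complete first byte of PortD
    captureTime[captureIndex] = micros();
    captureIndex++;
  }
}

void setup() {
  pinMode(5, INPUT);                            // SDA, PortD bit 7
  pinMode(6, INPUT);                            // SCL, PortD bit 4
  attachInterrupt(5, busEventISR, CHANGE);      // SDA: any change, catches start/restart/stop
  attachInterrupt(6, busEventISR, RISING);      // SCL: rising edge, clocks the data bits
}

void loop() {}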
This resulted in a perfect capture of the signals on the I2C bus, complete with time-codes. Nice! But then I saw the bus was only running at the default 100kHz. Does it also work on a 400kHz bus? Sadly, no. Could I work with just 100kHz? Probably, but this Teensy has more capabilities than just raw speed and memory, it also has DMA channels. Let’s try that rabbit hole; it’ll be fun, they said.
There are no tutorials on how to use DMA on the Teensy 3.2, but there are some mentions in the forums and Paul made a nice wrapper in the core: DMAChannel. Here are some of the information sources that helped me figure it out: https://www.nxp.com/docs/en/application-note/AN4419.pdf - an application note on how to use the PIT (periodic interrupt timer) and two DMA channels to create a PWM pin, which shows how powerful these DMA channels can be. One thing bothered me though, a line on page 5: “2. The output GPIO pin number is 1 per each GPIO port. 2 output pins from 1 port is not accepted. Because GPIO can accept only 1 DMA trigger.” But thankfully a DMA channel, on the other hand, can receive triggers from more than one GPIO pin if they are from the same port. Another useful bit of information is this forum post (no. 3) from Paul: https://forum.pjrc.com/threads/63353-Teensy-4-1-How-to-start-using-DMA and of course the source code of DMAChannel.h: https://github.com/PaulStoffregen/cores/blob/master/teensy3/DMAChannel.h.
So putting all this together:
Create a DMA channel, feed it the PortD input register as source, point it to a buffer as destination, tell it how many bytes to transfer each time and after how many transfers to stop and trigger an interrupt (don’t forget to tell it to stop!) and last but not least: tell it when to do all this. So we take a look at the datasheet of the processor. And we take another look, and another, because reading this datasheet is worse than reading a machine-translated Chinese datasheet with too many pictures in Chinese which can’t be translated. But on page 79 of the datasheet (MK20DX256 manual) it says that source number 52 can be used to trigger a DMA channel for port D.

But how do I tell the DMA channel, and how does it know which pins to trigger on? Easy with DMAChannel.h, just add dmachannel.triggerAtHardwareEvent(DMAMUX_SOURCE_PORTD); and the pins are easy as well: each pin has a configuration register (PORTx_PCRn), and looking at that register (page 227, section 11.14.1) we see 4 bits named IRQC; the table shows that if we set these bits to 0011 we get a DMA trigger on pin changes. Paul made this easy as well, just issue this command after declaring the pin as an INPUT: CORE_PIN5_CONFIG = PORT_PCR_IRQC(3)|PORT_PCR_MUX(1);

The input register for PortD (GPIOD_PDIR) is 32 bits wide, so we would need a buffer 4 bytes wide for each sample we want to capture, and that eats too much into our memory. But the DMA channel starts with the LSB, so if we tell it to transfer just 1 byte, we get the byte we need.
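Put into code with DMAChannel.h it looks roughly like this (buffer size, the pin 6 configuration and the names are my additions; only the pin 5 line is literally from the log):

#include <DMAChannel.h>

#define CAPTURE_ITEMS 4096
static uint8_t capturePins[CAPTURE_ITEMS];      // one byte of PortD per bus event
DMAChannel dmaPins;

void captureFullISR() {
  dmaPins.clearInterrupt();                     // buffer is full, capture has stopped
}

void setup() {
  pinMode(5, INPUT);                            // SDA
  pinMode(6, INPUT);                            // SCL
  CORE_PIN5_CONFIG = PORT_PCR_IRQC(3) | PORT_PCR_MUX(1);   // IRQC 0011: DMA request on either edge
  CORE_PIN6_CONFIG = PORT_PCR_IRQC(1) | PORT_PCR_MUX(1);   // IRQC 0001: DMA request on rising edge (my assumption)

  dmaPins.source(*(volatile uint8_t *)&GPIOD_PDIR);        // only the LSB of the 32-bit input register
  dmaPins.destinationBuffer(capturePins, sizeof(capturePins));
  dmaPins.disableOnCompletion();                            // don't forget to tell it to stop!
  dmaPins.interruptAtCompletion();
  dmaPins.attachInterrupt(captureFullISR);
  dmaPins.triggerAtHardwareEvent(DMAMUX_SOURCE_PORTD);      // source number 52: port D
  dmaPins.enable();
}

void loop() {}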
But wait, can we get the time as well? Maybe. We can’t call millis() from DMA, but maybe we can read the variable millis() uses directly from its memory location? Except millis() is too coarse and micros() is not a single variable we can read. Luckily the Teensy 3.2 has four 32-bit timers called periodic interrupt timers (PIT), and they run at F_BUS, which is 36MHz with the default 72MHz configuration. Theoretically that gives a resolution of 27.8 nanoseconds and a maximum timer period of 119.3 seconds. So if we trigger a DMA channel to store the PIT value in a buffer and set a PIT to interrupt every 60 seconds, we get the elapsed time at each pin change. And if we store the DMA destination address on each PIT interrupt in an additional buffer, we can count the minutes, so we can capture periodic bursts with some dead time in between and still have fairly accurate timekeeping. I got it working in a separate test sketch and could capture accurate data up to 530kHz (higher became problematic due to the added capacitance of the I2C bus in spite of the added 1k pull-ups). Now to integrate this into the main program.
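And the time-keeping half, in the same style (again my names; how the two channels are tied together, here via channel linking with triggerAtTransfersOf(), is my interpretation, not necessarily what the test sketch did):

#include <DMAChannel.h>

#define CAPTURE_ITEMS 4096
static uint8_t  capturePins[CAPTURE_ITEMS];
static uint32_t captureTicks[CAPTURE_ITEMS];    // PIT counter value per bus event
DMAChannel dmaPins;                             // configured as in the previous sketch
DMAChannel dmaTime;

void setup() {
  pinMode(5, INPUT);
  pinMode(6, INPUT);
  CORE_PIN5_CONFIG = PORT_PCR_IRQC(3) | PORT_PCR_MUX(1);
  CORE_PIN6_CONFIG = PORT_PCR_IRQC(1) | PORT_PCR_MUX(1);

  // PIT channel 0 as a free-running down-counter at F_BUS (36MHz by default)
  SIM_SCGC6 |= SIM_SCGC6_PIT;                   // clock the PIT module
  PIT_MCR = 0;                                  // enable the module
  PIT_LDVAL0 = 0xFFFFFFFF;                      // maximum period, about 119 seconds at 36MHz
  PIT_TCTRL0 = PIT_TCTRL_TEN;                   // start the timer

  dmaPins.source(*(volatile uint8_t *)&GPIOD_PDIR);
  dmaPins.destinationBuffer(capturePins, sizeof(capturePins));
  dmaPins.disableOnCompletion();
  dmaPins.triggerAtHardwareEvent(DMAMUX_SOURCE_PORTD);

  dmaTime.source(PIT_CVAL0);                    // current counter value, 32 bits wide
  dmaTime.destinationBuffer(captureTicks, sizeof(captureTicks));
  dmaTime.disableOnCompletion();
  dmaTime.triggerAtTransfersOf(dmaPins);        // each pin transfer kicks off a time transfer

  dmaTime.enable();
  dmaPins.enable();
}

void loop() {}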