I have started writing the main code for the system. There is a simple state machine (I will probably post the diagram for it later) that manages switching between modes and determining which node is transmitting and when to receive. In short it works like this:
- All nodes in standby
- When a node wants to transmit audio, it sends a packet (with ECC) to the other node(s).
- The other node(s) reply with an ACK
- Once an ACK is received the master goes into streaming mode and the slave(s) go into streamingRX mode (a rough code sketch of this state machine is below).
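Roughly, I picture the mode handling looking something like this. The state/event names and helper functions here are just placeholders for the post, not my actual code:

```c
/* Rough sketch of the mode-switching state machine (all names are placeholders). */
typedef enum { STANDBY, WAIT_FOR_ACK, STREAMING_TX, STREAMING_RX } node_state_t;
typedef enum { EV_WANT_TO_TALK, EV_GOT_REQUEST, EV_GOT_ACK, EV_END_OF_STREAM } event_t;

static node_state_t state = STANDBY;

static void send_request_packet(void) { /* would send the ECC-protected request */ }
static void send_ack(void)            { /* would send the ACK */ }

static void on_event(event_t ev)
{
    switch (state) {
    case STANDBY:
        if (ev == EV_WANT_TO_TALK) {        /* this node wants to transmit audio */
            send_request_packet();
            state = WAIT_FOR_ACK;
        } else if (ev == EV_GOT_REQUEST) {  /* another node asked to transmit */
            send_ack();
            state = STREAMING_RX;           /* slave receives the stream */
        }
        break;
    case WAIT_FOR_ACK:
        if (ev == EV_GOT_ACK)
            state = STREAMING_TX;           /* master starts streaming audio */
        break;
    case STREAMING_TX:
    case STREAMING_RX:
        if (ev == EV_END_OF_STREAM)         /* end-of-stream marker, see below */
            state = STANDBY;
        break;
    }
}
```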
Everything works great up to this point. Here's where things get funky.
How do you end a transmission when the entire data pipe is dedicated to transmit audio?
What I am doing now: when the master wants to end the transmission, it sends 1000+ audio bytes of 0xAA. This is a repeating 1010 pattern that allows for good signal integrity. The slave then looks for, say, 500 of these in a row to end the transmission.
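On the receive side that detection is basically just a run counter, something like this (the 500 threshold and the names are placeholders):

```c
#include <stdint.h>
#include <stdbool.h>

#define END_MARKER      0xAAu   /* repeating 1010... pattern sent by the master      */
#define END_RUN_LENGTH  500     /* how many marker bytes in a row end the stream     */

static unsigned run = 0;

/* Called for every received audio byte; returns true when the stream should end. */
static bool end_of_stream(uint8_t byte)
{
    if (byte == END_MARKER) {
        if (++run >= END_RUN_LENGTH)
            return true;
    } else {
        run = 0;                /* any other byte breaks the run */
    }
    return false;
}
```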
Problem: I don't know if this is a "real" issue or not (it could be a breadboard signal integrity issue), but I am often seeing bits being skipped by the receiver (maybe 1 in 100,000 samples or so). For example, if a byte sequence is this:
10101010 10101010 10101010
I sometimes get this:
10101010 10101101 01010101
Where a bit is dropped, causing the byte sequence to be permanently off by 1 bit. This is not only bad for the end-transmission sequence, but the audio data can also get wacky. 1 in 100,000 samples is something like one error every couple of seconds, so it is pretty frequent.
I fear that the solution may be to double the data rate, or increase it by a couple of bits per sample. This would allow for ECC (error correcting codes) or some other way to detect a missing bit. Manchester encoding might also help.
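For example, Manchester encoding would double the line rate, but it guarantees a transition in every bit cell, and an illegal 00/11 half-bit pair immediately flags a slipped bit. A rough sketch of what I mean (not tested on the actual hardware):

```c
#include <stdint.h>

/* Manchester-encode one byte: each data bit becomes two line bits.
 * Convention used here (IEEE 802.3): 0 -> 01, 1 -> 10. MSB first. */
static uint16_t manchester_encode(uint8_t byte)
{
    uint16_t out = 0;
    for (int i = 7; i >= 0; i--) {
        out <<= 2;
        out |= (byte & (1u << i)) ? 0x2 : 0x1;  /* 10 for 1, 01 for 0 */
    }
    return out;
}

/* Decode 16 line bits back to a byte; returns -1 if an illegal
 * 00 or 11 pair is seen, which is how a slipped bit would show up. */
static int manchester_decode(uint16_t line)
{
    uint8_t byte = 0;
    for (int i = 7; i >= 0; i--) {
        uint8_t pair = (line >> (2 * i)) & 0x3;
        byte <<= 1;
        if (pair == 0x2)      byte |= 1;
        else if (pair != 0x1) return -1;        /* invalid pair -> slip/error */
    }
    return byte;
}
```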
My initial thoughts (with a rough code sketch after the list):
- Make each audio packet 12 bits long
- first 3 bits are always '111'
- 8 bits of audio sample
- 1 bit checksum
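In code I imagine the framing looking roughly like this. I am assuming the 1-bit checksum is just even parity over the sample; the part I am least sure about is the resync logic, where the receiver would slip one bit at a time until frames start validating again:

```c
#include <stdint.h>
#include <stdbool.h>

#define SYNC_BITS   0x7u    /* the fixed '111' header                */
#define FRAME_BITS  12      /* 3 sync + 8 sample + 1 parity          */

/* Even parity over the 8 sample bits (the 1-bit "checksum"). */
static uint8_t parity8(uint8_t v)
{
    v ^= v >> 4; v ^= v >> 2; v ^= v >> 1;
    return v & 1u;
}

/* Pack one sample into a 12-bit frame: [111][sample:8][parity:1]. */
static uint16_t pack_frame(uint8_t sample)
{
    return (uint16_t)((SYNC_BITS << 9) | (sample << 1) | parity8(sample));
}

/* Try to unpack a 12-bit frame; returns false if the sync pattern
 * or parity is wrong, which is the cue to resync by one bit. */
static bool unpack_frame(uint16_t frame, uint8_t *sample)
{
    if (((frame >> 9) & 0x7u) != SYNC_BITS)
        return false;                       /* lost alignment */
    uint8_t s = (frame >> 1) & 0xFFu;
    if ((frame & 1u) != parity8(s))
        return false;                       /* bit error inside the frame */
    *sample = s;
    return true;
}
```

The receiver would hunt bit-by-bit until unpack_frame() succeeds for a handful of consecutive frames, which is how it would recover from a dropped bit.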
Does anyone have any experience with correcting for missing bits in a bit-stream for this type of application?