When creating a state machine, there are two things to consider. The most obvious is how to represent the state you are currently in. This may seem like a trivial concern, but as you'll see, there is more to it than meets the eye.
The second decision is how you translate your logic into a state machine. There are many possible ways, of course, but most people prefer to use the following methodology:
- Use a clocked always block to store the current state.
- Create one or more pieces of combinatorial logic to compute the output for the current state.
- Create one or more pieces of combinatorial logic to compute the next state.
The machines we have looked at so far have a pretty small number of states. So there's no real problem representing the states as normal binary numbers. For example, a parity checker might have an idle state (0), an odd state (1), and an even state (2). You use two flip flops and you are "wasting" one state (since 3 is an illegal state). But that's not a big deal.
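In Verilog, that binary encoding might be declared like this (the names are just illustrative, not from any particular design):

```verilog
localparam idle = 2'b00, odd = 2'b01, even = 2'b10;  // 2'b11 is the unused, illegal state
reg [1:0] state;   // two flip flops cover three states
```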
However, any time you want to decode the state, you have to examine both state variables. Again, for two that's not a big deal, but what if you had, say, 100 states in 7 bits? Or 1000 states in 10? Finding state 821 will take a few inverters and an AND gate. Of course, storing 1000 states in 10 bits is relatively compact, too.
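For instance, picking out one state of 1000 means a full ten-bit compare; something like this sketch:

```verilog
wire in_state_821 = (state == 10'd821);  // a wide AND plus a few inverters
```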
But what if you don't care about compactness? You care about speed. Probably the fastest method to represent state is the so-called "one hot" method. In the case of the parity machine we would have three binary bits for the state. Idle might be 001, odd would be 010, and even would be 100. At all times, there is only one flip flop set -- which makes sense if you think about the name one-hot.
Now it is easy to detect a particular state by examining a single bit. So it is fast and easy. What's the downside? You'll need to pay attention to reset because 000 isn't a legitimate state. You sometimes see people use a "modified one-hot" where 000 is the initial state, but if you ever need to decode that state, you lose the advantage.
Space is another problem. If you really had 1000 states (and, yes, that's probably bigger than you should have) then you'd need 1000 bits of state. Another more subtle problem is reliability. If you want to detect an illegal state, you'll need some sort of gating arrangement that asserts only if more than one state bit asserts. That gets complicated fast with too many bits.
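One common sketch for flagging a bad one-hot value uses the old more-than-one-bit-set trick; note the wide gating this implies on a large state register:

```verilog
// state & (state - 1) is nonzero only when two or more bits are set
wire multi_hot = |(state & (state - 1));
wire illegal   = multi_hot | (state == 0);  // all-zeros is also illegal for one-hot
```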
Still, one-hot is often a great choice for state encoding, and you see it used frequently. With negative logic you sometimes see "one cold" encoding, which is the same as one-hot except the active bit is a zero.
Speaking of reliability, Gray code encoding is also possible in some cases. This is similar to the numeric representation, except the bit patterns are selected so that only one bit changes per state transition. That wasn't true of the parity machine we mentioned before, for example. In binary, the states were 00, 01, and 10. If you can go from 01 to 10 or vice versa, that's two bit flips. We say the Hamming distance between those states is two.
If we made the parity machine Gray, it would turn into a one-hot. But consider a different example. Let's say we have a state machine that will count the number of button pushes from 1 to 3. A simple encoding would be two binary bits: 00, 01, 10, and 11, where 00 is the idle state. But, again, the transition between 01 and 10, as well as between 11 and 00, requires two bit flips. With a Gray code, the states might be: 00 01 11 10. Now each transition only takes one bit flip.
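Named in Verilog, the Gray-coded push counter's states might look like this (the names are illustrative):

```verilog
// each adjacent pair, including the 10 -> 00 wraparound, differs in exactly one bit
localparam pushes0 = 2'b00, pushes1 = 2'b01, pushes2 = 2'b11, pushes3 = 2'b10;
```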
There are other encodings like Johnson and O'Brien that seek to minimize bit changes between adjacent states. These schemes can be useful when you want to use combinatorial logic to detect state changes or you are worried about power consumption.
It isn't always possible to build a gray or similar sequence without some work. For example, consider the 00, 01, 11, 10 sequence mentioned above. That's fine if the states go from one to another in sequence. But what if state 00 could go to state 11? You'd need to rearrange the code or use more bits. This is a big problem if you add or change transitions in the middle of the design.
Let's consider a state machine to figure out if the sum of positive numbers is even or odd. It doesn't matter how many bits the numbers are, because all we need to do is look at the last bit of any number to know if it is even or odd. An even number plus an even number is even. An odd number plus an odd number is even. And an even number plus an odd number stays odd.
We will need three states: idle, odd, and even. Let's use one-hot encoding, just for practice. It is handy to use localparam to name the states, so:
```verilog
module odd_even_sum_detect(input clk, input reset, input bit0,
                           output oddled, output evenled);
   localparam idle = 3'b001, odd = 3'b010, even = 3'b100;
   reg [2:0] state;
```
Of course, this is a simple example, so you might even have had:
```verilog
reg idle, odd, even;
```
This gets clumsy, though, when you need to reset things or examine several states at once.
Remember, we said it is customary to put the state machine into at least three parts. The first part is state management. We'll also use this as an excuse to save the input bit so if it changes externally, we won't care.
```verilog
reg [2:0] nextstate;
reg inbit;   // registered copy of bit0

always @(posedge clk)
  if (reset) begin
     state <= idle;
     inbit <= 0;
  end else begin
     state <= nextstate;
     inbit <= bit0;   // capture the input so later external changes don't matter
  end
```
Obviously, the trick now is to compute nextstate. This could be done with an assign or with a combinatorial always block. I'll use the always block:
```verilog
always @(*)
  if (state==odd) nextstate = inbit ? even : odd;
  else nextstate = inbit ? odd : even;   // idle and even both behave as even
```
There are a few interesting things to notice here. First, the idle state represents a sum of zero, so it is intrinsically even. The else branch catches both cases. That's the second point, though: you must assign nextstate in all cases so you don't infer a latch. If you fail to do so, the synthesizer will assume you want to hold the previous value, but not with a flip flop, and you will get a latch. This is usually undesirable.
Because this is combinatorial, you can use = instead of <= and the last assignment is going to "win." So you could, for example, assign something at the top of the block as a default or, if using a case statement, be sure to include a default case. In this example, even an illegal state (if one were possible) or an idle state would still get an assignment because we only check for odd and then do something for everything else.
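As a sketch, the same next-state logic with both safety nets -- a default assignment up top and a default case -- might look like this:

```verilog
always @(*) begin
   nextstate = even;                           // default assignment: no latch inferred
   case (state)
     odd:     nextstate = inbit ? even : odd;
     default: nextstate = inbit ? odd : even;  // catches even, idle, and illegal states
   endcase
end
```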
You might also notice that if state is odd, nextstate is the inverse of inbit; otherwise, it is inbit. So you could write this as:
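That observation boils down to an XOR; a sketch of the compact version (treating idle as even, as before):

```verilog
always @(*)
   nextstate = ((state == odd) ^ inbit) ? odd : even;
```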
However, this is less intuitive and -- as we will talk about later -- the tools may even figure that out on our behalf.
That leaves one more piece to write: the output section, which is, again, combinatorial. This time I'll use assign, but I could do another always @(*) if I wanted to.
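A sketch of the assign version, comparing the whole state vector against the named values:

```verilog
assign oddled  = (state == odd);
assign evenled = (state == even);
```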
You'd hope your synthesizer would figure out that checking all the bits is not necessary, but if it really checks for 010 and 100, that is kind of a waste. If you want to be more explicit, you could say:
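Since odd is bit 1 and even is bit 2 of the one-hot vector, each output can simply tap a single bit:

```verilog
assign oddled  = state[1];  // the one-hot "odd" bit
assign evenled = state[2];  // the one-hot "even" bit
```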
Having a single clocked piece of code along with a combinatorial part to compute the next state and another to compute the output is known as the "3 process style" of state machine. However, some argue that this is more error prone. You can easily put everything in one place (the "1 process style"):
```verilog
always @(posedge clk)
  if (reset) begin
     state   <= idle;
     oddled  <= 0;
     evenled <= 0;
  end else if (state==odd) begin
     state   <= bit0 ? even : odd;
     oddled  <= ~bit0;
     evenled <= bit0;
  end else begin
     state   <= bit0 ? odd : even;
     oddled  <= bit0;
     evenled <= ~bit0;
  end
```
For something this simple, it is fine. Note that I now don't need to latch inbit because that's done inherently when I store state. Of course, now oddled and evenled are reg types. I could have stayed with the assign and had a "2 process" state machine.
Which one is best? That's a hotly debated topic. In general, you should pick the style you are most comfortable with for the task at hand.