
Markov chains

A project log for Humanoid robot named Murphy

Created to explore new ways of artificial intelligence

M. Bindhammer • 04/20/2016 at 07:27 • 0 Comments

A mind must be able to predict the future to some extent to do any planning. In the following example, Markov chains are used to develop a very simple weather model (should I take my umbrella tomorrow, Murphy?). The probabilities of the weather conditions (rainy or sunny), given the weather on the preceding day, are represented by the transition matrix:
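With the states ordered as (sunny, rainy), each column holding today's weather and each row tomorrow's (an ordering assumed here, since the original matrix image is not reproduced), the values below give:

$$P = \begin{pmatrix} 0.9 & 0.5 \\ 0.1 & 0.5 \end{pmatrix}$$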

The matrix P represents the weather model in which the probability that a sunny day is followed by another sunny day is 0.9, and that a rainy day is followed by another rainy day is 0.5. Markov chains are usually memoryless: the next state depends only on the current state and not on the sequence of events that preceded it. But we can change the initial state if we have other experience, to model the weather or something else. One more note: the columns must sum to 1 because P is a stochastic matrix.
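As a quick check with the values given above: $0.9 + 0.1 = 1$ for the sunny column and $0.5 + 0.5 = 1$ for the rainy column, so P is indeed column-stochastic.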

The weather on the current day 0 is either sunny or rainy. This is represented by the following vectors (100% sunny weather and 0% rainy weather, or 0% sunny weather and 100% rainy weather):
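Writing the state vector on day n as $x_n$ (a notation chosen here), the two possible starting vectors in the same (sunny, rainy) ordering are:

$$x_0 = \begin{pmatrix} 1 \\ 0 \end{pmatrix} \text{ (sunny)} \qquad \text{or} \qquad x_0 = \begin{pmatrix} 0 \\ 1 \end{pmatrix} \text{ (rainy)}$$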

The general rule for day n is now:
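(reconstructed in the notation introduced above, as the original equation image is not shown)

$$x_{n+1} = P\,x_n$$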

or equally
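$$x_n = P^n\,x_0$$

(the same rule applied n times back to day 0).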

This leaves us with the question of how to compute these matrix multiplications for our example:
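For a general state vector with sunny probability s and rainy probability r (symbols chosen here for illustration), the multiplication works out component-wise as:

$$P \begin{pmatrix} s \\ r \end{pmatrix} = \begin{pmatrix} 0.9 & 0.5 \\ 0.1 & 0.5 \end{pmatrix} \begin{pmatrix} s \\ r \end{pmatrix} = \begin{pmatrix} 0.9\,s + 0.5\,r \\ 0.1\,s + 0.5\,r \end{pmatrix}$$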

Let us assume the weather on day 0 is sunny; then the weather on day 1 can be predicted by:
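Using the reconstructed matrix and the sunny starting vector:

$$x_1 = P\,x_0 = \begin{pmatrix} 0.9 & 0.5 \\ 0.1 & 0.5 \end{pmatrix} \begin{pmatrix} 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 0.9 \\ 0.1 \end{pmatrix}$$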

Hence, the probability that day 1 will be sunny is 0.9.

Applying the general rule again, the weather on day 2 can be predicted by:
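Again with the reconstructed values:

$$x_2 = P\,x_1 = \begin{pmatrix} 0.9 & 0.5 \\ 0.1 & 0.5 \end{pmatrix} \begin{pmatrix} 0.9 \\ 0.1 \end{pmatrix} = \begin{pmatrix} 0.86 \\ 0.14 \end{pmatrix}$$

so the chance of a sunny day 2 works out to 0.86.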


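As a side note, the same calculation is easy to script. Here is a minimal Python/NumPy sketch (not part of the original log) that iterates the reconstructed model for a few days:

import numpy as np

# Column-stochastic transition matrix reconstructed from the text:
# columns = today's weather (sunny, rainy), rows = tomorrow's weather.
P = np.array([[0.9, 0.5],
              [0.1, 0.5]])

# Day 0: 100% sunny, 0% rainy.
x = np.array([1.0, 0.0])

for day in range(1, 4):
    x = P @ x  # general rule: x_{n+1} = P x_n
    print(f"day {day}: sunny = {x[0]:.3f}, rainy = {x[1]:.3f}")

Running it reproduces the hand calculation above: 0.9 sunny on day 1 and 0.86 on day 2.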