
Markov Chains Application (Weather Forecasting).

Introduction: Markov Chains Application "Weather Forecasting".

Markov chains are an important mathematical tool in stochastic processes, and they are used widely across many disciplines. A Markov chain is a stochastic process that satisfies the Markov property, which means that the past and future are independent when the present is known. In other words, if one knows the current state of the process, no additional information about its past states is needed to make the best possible prediction of its future. This simplicity allows for a great reduction in the number of parameters when studying such a process.

Markov Chains State Space. The state space is the set of all possible states that a random system can be in. It is called S; for example, S could be a set of numbers or labels.

Transition Probabilities. The second component is the table of transition probabilities. Each entry in the table gives the probability that an object transitions from one state to another. Every entry must be greater than or equal to 0, and the probabilities out of each state must sum to 1.
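The two components above can be sketched in a few lines of code. This is a minimal illustration: the state labels and probability values below are made up for demonstration, not taken from the article.

```python
# Sketch of a Markov chain's two components: a state space S and a
# transition probability table. All values here are illustrative.
states = ["A", "B", "C"]  # the state space S

# transition[i][j] = probability of moving from states[i] to states[j]
transition = [
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.2, 0.3, 0.5],
]

# Check the two rules from the text: every entry >= 0, and the
# probabilities out of each state sum to 1.
for row in transition:
    assert all(p >= 0 for p in row)
    assert abs(sum(row) - 1.0) < 1e-9
print("valid transition table")
```

Storing the table row-per-source-state makes the "rows sum to 1" rule easy to verify.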

State Space. محمد سامي محمد عبد العزيز عشرة.

Transition Probabilities. [Transition probability table]

Applications. A Markov chain is a useful tool for prediction: it is possible to predict future trends by analyzing an object's previous behavior. It therefore has a variety of applications, including the natural sciences and population studies, weather prediction ("forecasting"), economics, finance, and more.

Weather Forecasting. In our simplified universe, the weather can be in one of 3 states: sunny, cloudy, or rainy. The probability of tomorrow's weather depends on its state today, whether it is sunny, cloudy, or rainy.

If we assume that tomorrow's weather depends only on today's weather, the process is called a first-order Markov chain. For example: if today is sunny, what is the probability that the coming days are sunny, rainy, cloudy, cloudy, sunny? We multiply the corresponding transition probabilities in order: (0.5)(0.4)(0.3)(0.5)(0.2) = 0.006.
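The calculation above is just a product of the transition probabilities along the path. A minimal sketch, using the five factors quoted in the example (the full transition table is not reproduced in the text, so these are taken at face value):

```python
import math

# Probability of one specific weather path in a first-order Markov
# chain: multiply the transition probability of each step.
# Factors quoted in the example: sunny->sunny, sunny->rainy,
# rainy->cloudy, cloudy->cloudy, cloudy->sunny.
path_factors = [0.5, 0.4, 0.3, 0.5, 0.2]

probability = math.prod(path_factors)
print(round(probability, 6))  # 0.006
```

Because each factor depends only on the previous state, the whole path probability factorizes into this simple product.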

كريم محمود عبد السلام غنيم.

رشاد السيد ابراهيم أحمد.

Markov chains are an important concept in stochastic processes. They can be used to greatly simplify processes that satisfy the Markov property, namely that the future state of a stochastic variable depends only on its present state. Mathematically, a Markov chain consists of a state space (a vector whose elements are all the possible states of a stochastic variable), the present state of the variable, and the transition matrix. The transition matrix contains all the probabilities that the variable will transition from one state to another or remain the same. To calculate the probabilities of the variable ending up in certain states after n discrete steps of time, one simply multiplies the present state vector by the transition matrix raised to the power of n. There are different types of Markov chains depending on the nature of the parameters and the application area: they can be computed over discrete or continuous time, and the state space can be finite or countably infinite, with different behavior depending on which.