Mathematics has been pushed to its limits and is beginning to emulate human decision-making. A few decades from now, mathematics may well run in our blood, with (r)evolutionary effects evident in the portions of the brain that solve mathematical problems.
Forecasting (or rather, educated guessing) has been a major motivation for introducing mathematics into fields as diverse as biology (and its numerous subfields), geology, economics, and finance.
Gibbs sampling is a major probabilistic tool that allows one to model large systems of "particles". It draws its inspiration from physical systems of magnetic interaction, and more generally from anything that resembles an interacting particle system. The ubiquitous Ising model, in which one models interacting particles and determines the typical polarity (spin) of each particle in a large system, is a classic application of the Gibbs sampling technique.
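To make this concrete, here is a minimal sketch of a Gibbs sampler for a small 2D Ising model in Python. The lattice size, inverse temperature `beta`, and number of sweeps are illustrative choices of mine, not anything prescribed above:

```python
import numpy as np

def gibbs_ising(L=20, beta=0.6, sweeps=500, rng=None):
    """Gibbs-sample a 2D Ising model: resample each spin from its
    conditional distribution given its four lattice neighbours."""
    rng = rng or np.random.default_rng(0)
    spins = rng.choice([-1, 1], size=(L, L))
    for _ in range(sweeps):
        for i in range(L):
            for j in range(L):
                # Sum of neighbouring spins (periodic boundary).
                nb = (spins[(i - 1) % L, j] + spins[(i + 1) % L, j]
                      + spins[i, (j - 1) % L] + spins[i, (j + 1) % L])
                # Conditional probability that this spin is +1.
                p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * nb))
                spins[i, j] = 1 if rng.random() < p_up else -1
    return spins

spins = gibbs_ising()
print("mean magnetisation:", spins.mean())
```

Each spin is resampled from its exact conditional distribution given its neighbours, which is precisely what makes this a Gibbs sampler rather than a generic Monte Carlo method.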
Consider a very large, arbitrary set of outcomes whose occurrence is governed by a stable energy function. To make the idea clearer, start with, say, a lattice where the particle interactions determine the energy level of the system. Once we have an energy function identified, we model the system by saying that it prefers to be in a state of lower energy. But the state at any point in time is random, so the most natural thing to do (once you have done enough math) is to use the exponential function to define a probability distribution on the set of outcomes (or states) that assigns lower probability to high-energy states and vice versa. This distribution is what is called the Gibbs distribution.
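In the usual notation, if $E(x)$ denotes the energy of state $x$ and $T$ the temperature, the Gibbs distribution is

$$P(x) = \frac{e^{-E(x)/T}}{Z}, \qquad Z = \sum_{x'} e^{-E(x')/T},$$

where the normalising constant $Z$ is known as the partition function. The higher the energy $E(x)$, the smaller the probability $P(x)$, exactly as described above.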
So now we have a probability (a value between 0% and 100%) attached to every possible state of the system. A physical system may depend on parameters or external conditions, for instance temperature (T). The temperature affects the behaviour of the system, and hence we need to gauge its effect on the Gibbs probability distribution.
To make the notion clearer, consider two copies of the system held at two different temperatures. The system at the higher temperature will have a lower probability of ending up in a low-energy state than the system at the lower temperature.
In this sense temperature is analogous to volatility (variance) in a probability distribution. If the variance is low, then the mode (the most probable outcome), which is the lowest-energy state, is more likely to appear.
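A quick numerical sketch of this claim, using a made-up set of four energy levels (the values are purely illustrative):

```python
import numpy as np

def gibbs_probs(energies, temperature):
    """Gibbs distribution over a finite set of states:
    P(x) is proportional to exp(-E(x)/T), normalised to sum to 1."""
    weights = np.exp(-np.asarray(energies) / temperature)
    return weights / weights.sum()

energies = [1.0, 2.0, 3.0, 5.0]  # toy energy levels, lowest first
for T in (0.5, 5.0):
    p = gibbs_probs(energies, T)
    print(f"T={T}: P(lowest-energy state) = {p[0]:.3f}")
# Low T concentrates probability mass on the minimum-energy state;
# high T flattens the distribution towards uniform.
```

At T = 0.5 the lowest-energy state carries almost all of the probability, while at T = 5.0 the four states are nearly equally likely.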
One very interesting idea is to apply the technique of Gibbs sampling to minimize complex functions. If one samples from the Gibbs distribution obtained by taking the function to be minimized as the energy function, then the low-energy states, having higher probability, will be sampled most often, and the lowest-energy state is the point of minimum.
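This idea is essentially what simulated annealing does: sample from the Gibbs distribution of the objective while gradually lowering the temperature. Here is a minimal sketch using the standard Metropolis acceptance rule (a close cousin of Gibbs sampling); the target function, cooling schedule, and step size are all illustrative assumptions of mine:

```python
import math
import random

def anneal(f, x0, steps=20000, t_start=2.0, t_end=0.01, step=0.5):
    """Minimise f by sampling from the Gibbs distribution
    P(x) proportional to exp(-f(x)/T) while slowly lowering T."""
    random.seed(0)
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    for k in range(steps):
        t = t_start * (t_end / t_start) ** (k / steps)  # geometric cooling
        y = x + random.uniform(-step, step)              # propose a move
        fy = f(y)
        # Metropolis rule: always accept downhill, sometimes uphill.
        if fy <= fx or random.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < best_f:
                best_x, best_f = x, fx
    return best_x, best_f

# Toy example: a wiggly function with many local minima.
f = lambda x: (x - 1.0) ** 2 + 2.0 * math.sin(5.0 * x)
print(anneal(f, x0=8.0))
```

At high temperature the sampler roams freely over the landscape; as the temperature falls, the Gibbs distribution concentrates on the low-energy states and the chain settles near the minimum.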
Other applications include image segmentation and data clustering. These applications hover at the boundaries of mathematics; one wrong step can take us into foreign territory and invite ridicule. That is why one needs to keep an open mind and be cautious at the same time.