

Markov Chains (markov_chain)

Definition

A Markov Chain is a graph G in which each edge has an associated non-negative integer weight w[e]. For every node with at least one outgoing edge, the total weight of its outgoing edges must be positive. A random walk in a Markov chain starts at some node s and then performs steps according to the following rule:

Initially, s is the current node. Suppose node v is the current node and that e_0, ..., e_{d-1} are the edges out of v. If v has no outgoing edge, no further step can be taken. Otherwise, the walk follows edge e_i with probability proportional to w[e_i], for 0 <= i < d. The target node of the chosen edge becomes the new current node.
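The edge selection rule can be illustrated by the following self-contained C++ sketch (an illustration only, not LEDA's implementation): draw an integer r uniformly from [0, W), where W is the total weight of the outgoing edges, and take the first edge whose prefix sum of weights exceeds r.

// Illustrative sketch only: choose index i with probability
// w[i] / (w[0] + ... + w[d-1]) using prefix sums.
#include <vector>
#include <numeric>
#include <random>
#include <iostream>

int choose_edge(const std::vector<int>& w, std::mt19937& rng)
{
    int total = std::accumulate(w.begin(), w.end(), 0);  // total outgoing weight, must be > 0
    std::uniform_int_distribution<int> dist(0, total - 1);
    int r = dist(rng);                                   // uniform in [0, total)
    int prefix = 0;
    for (std::size_t i = 0; i < w.size(); ++i) {
        prefix += w[i];
        if (r < prefix) return int(i);                   // edge e_i is chosen
    }
    return int(w.size()) - 1;                            // not reached when total > 0
}

int main()
{
    std::mt19937 rng(42);
    std::vector<int> w = { 1, 3, 6 };                    // weights of e_0, e_1, e_2
    std::vector<int> count(w.size(), 0);
    for (int t = 0; t < 100000; ++t) count[choose_edge(w, rng)]++;
    for (std::size_t i = 0; i < w.size(); ++i)           // observed frequencies approach 0.1, 0.3, 0.6
        std::cout << "e_" << i << ": " << count[i] / 100000.0 << "\n";
    return 0;
}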

#include <LEDA/markov_chain.h>

Creation

markov_chain M(graph G, edge_array<int> w, node s = nil)
    creates a Markov chain for the graph G with edge weights w. The node s is taken as the start vertex (G.first_node() if s is nil).

Operations

void M.step(int T = 1) performs T steps of the Markov chain.

node M.current_node() returns the current vertex.

int M.current_outdeg() returns the outdegree of the current vertex.

int M.number_of_steps() returns the number of steps performed.

int M.number_of_visits(node v)
    returns the number of visits to node v.

double M.rel_freq_of_visit(node v)
    returns the number of visits to node v divided by the total number of steps performed.
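
A minimal usage sketch follows. It assumes a pre-namespace LEDA installation in which graph, node, edge, edge_array and the forall_nodes macro are available globally via <LEDA/graph.h>; header paths and namespace handling differ in later LEDA versions.

#include <LEDA/graph.h>
#include <LEDA/markov_chain.h>
#include <iostream>

int main()
{
    // build a small weighted digraph
    graph G;
    node a = G.new_node();
    node b = G.new_node();
    node c = G.new_node();

    edge ab = G.new_edge(a, b);
    edge ac = G.new_edge(a, c);
    edge ba = G.new_edge(b, a);
    edge ca = G.new_edge(c, a);

    edge_array<int> w(G);             // one non-negative weight per edge
    w[ab] = 1; w[ac] = 3;             // from a: to b with prob 1/4, to c with prob 3/4
    w[ba] = 1; w[ca] = 1;

    markov_chain M(G, w, a);          // random walk starting at node a
    M.step(10000);                    // perform 10000 steps

    node v;
    forall_nodes(v, G)
        std::cout << M.number_of_visits(v) << " "
                  << M.rel_freq_of_visit(v) << std::endl;

    std::cout << "steps: " << M.number_of_steps() << std::endl;
    return 0;
}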

