My Knowledge Base
Homogeneous Markov Chain
Dec 10, 2025
1 min read
math/probability
math/probability/stochastic_process
Definition
A Markov chain $\{X_t\}$ such that the transition probabilities $P(X_{t+1} = j \mid X_t = i)$ are the same for any $t$.
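The time-independence of the transition probabilities can be sketched in code: a minimal simulation of a two-state homogeneous chain, where a single fixed transition matrix `P` (a hypothetical example, not from the note) is used at every step, so $P(X_{t+1} = j \mid X_t = i)$ never depends on $t$.

```python
import random

# Hypothetical 2-state transition matrix; each row sums to 1.
# Because P is fixed, P(X_{t+1}=j | X_t=i) is the same for every t.
P = [
    [0.9, 0.1],  # from state 0: P(next=0), P(next=1)
    [0.5, 0.5],  # from state 1: P(next=0), P(next=1)
]

def step(state, rng):
    """Sample the next state from row `state` of P."""
    return 0 if rng.random() < P[state][0] else 1

def simulate(n_steps, start=0, seed=0):
    """Return the trajectory X_0, X_1, ..., X_{n_steps}."""
    rng = random.Random(seed)
    states = [start]
    for _ in range(n_steps):
        states.append(step(states[-1], rng))
    return states

chain = simulate(10)
```

A non-homogeneous chain, by contrast, would draw each step from a matrix `P_t` that changes with the time index.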
Backlinks
Aperiodic Markov Chain
Detailed Balance Condition
Ergodic Markov Chain
Ergodic Theorem
Irreducible Markov Chain
Limiting Distribution
Period of Markov Chain
Recurrence
Stationary Distribution
Stochastic Process Note
Stochastic Process
Transition Probability Matrix
Transition Probability