Dynamic Bayesian network
A Dynamic Bayesian Network (DBN) is a Bayesian network that relates variables to each other over adjacent time steps. It is often called a two-timeslice BN (2TBN) because the value of a variable at time T is determined by its parents within the same slice (the internal regressors) and by its value at the immediately preceding time step, T−1. DBNs are common in robotics and have shown potential for a wide range of data mining applications; for example, they have been used in speech recognition, digital forensics, protein sequencing, and bioinformatics. DBNs generalize hidden Markov models and Kalman filters.[1]
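As a minimal illustration of the 2TBN idea, the sketch below performs forward filtering in the simplest possible DBN: one hidden variable per slice (whether it is raining) with one noisy observation (whether an umbrella is seen), which is exactly a hidden Markov model. The variable names and probability values follow the well-known umbrella example from Russell and Norvig[1], but they are included here only as illustrative assumptions, not as part of the article.

```python
# Sketch of a two-timeslice DBN (2TBN) with one hidden variable ("Rain")
# and one observation ("Umbrella") per slice. This special case is exactly
# a hidden Markov model; the probability values are illustrative assumptions.

# P(Rain_t | Rain_{t-1}): the inter-slice (transition) arcs of the 2TBN
transition = {True: {True: 0.7, False: 0.3},
              False: {True: 0.3, False: 0.7}}

# P(Umbrella_t | Rain_t): the intra-slice (observation) arcs
observation = {True: {True: 0.9, False: 0.1},
               False: {True: 0.2, False: 0.8}}

def filter_step(belief, evidence):
    """One forward (filtering) update: predict with the transition model,
    then weight by the likelihood of the evidence and normalise."""
    predicted = {r: sum(belief[prev] * transition[prev][r] for prev in belief)
                 for r in (True, False)}
    unnormalised = {r: observation[r][evidence] * predicted[r] for r in predicted}
    z = sum(unnormalised.values())
    return {r: p / z for r, p in unnormalised.items()}

# Start from a uniform prior over Rain_0 and fold in two umbrella sightings.
belief = {True: 0.5, False: 0.5}
for umbrella_seen in (True, True):
    belief = filter_step(belief, umbrella_seen)
print(belief)  # posterior P(Rain_2 | Umbrella_1, Umbrella_2), roughly 0.88 for True
```

Because this DBN has a single discrete hidden node and a single observation node per slice, the filtering update is the standard HMM forward step; richer DBNs replace the single transition and observation tables with a product of conditional distributions, one per variable in the slice.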
References
- ^ Stuart Russell; Peter Norvig (2010). Artificial Intelligence: A Modern Approach (PDF) (3rd ed.). Prentice Hall. p. 566. ISBN 978-0136042594. Retrieved 22 October 2014. "dynamic Bayesian networks (which include hidden Markov models and Kalman filters as special cases)"
- Murphy, Kevin (2002). Dynamic Bayesian Networks: Representation, Inference and Learning. UC Berkeley, Computer Science Division.
- Ghahramani, Zoubin (1997). "Learning Dynamic Bayesian Networks". Lecture Notes in Computer Science. 1387: 168–197. CiteSeerX 10.1.1.56.7874.
- Friedman, N.; Murphy, K.; Russell, S. (1998). "Learning the structure of dynamic probabilistic networks". UAI'98. Morgan Kaufmann. pp. 139–147. CiteSeerX 10.1.1.75.2969.
Software
- BNT at Google Code: the Bayes Net Toolbox for Matlab, by Kevin Murphy (released under a GPL license)
- DBmcmc: Inferring Dynamic Bayesian Networks with MCMC, for Matlab (free software)
- GlobalMIT Matlab toolbox at Google Code: modeling gene regulatory networks via global optimization of dynamic Bayesian networks (released under a GPL license)
- libDAI: C++ library that provides implementations of various (approximate) inference methods for discrete graphical models; supports arbitrary factor graphs with discrete variables, including discrete Markov Random Fields and Bayesian Networks (released under the FreeBSD license)