Title: Hierarchical hidden Markov model
Article Id: WHEBN0009862271
Author: World Heritage Encyclopedia
Language: English
Subject: Hidden Markov model, Hierarchy, List of statistics articles
Publisher: World Heritage Encyclopedia

Hierarchical hidden Markov model

The hierarchical hidden Markov model (HHMM) is a statistical model derived from the hidden Markov model (HMM). In an HHMM, each state is considered to be a self-contained probabilistic model; more precisely, each state of the HHMM is itself an HHMM.

HHMMs and HMMs are useful in many fields, including pattern recognition.

Background

It is sometimes useful to impose specific structure on an HMM in order to facilitate learning and generalization. For example, even though a fully connected HMM can always be used if enough training data is available, it is often useful to constrain the model by disallowing arbitrary state transitions. In the same way, it can be beneficial to embed the HMM in a larger structure which, in theory, may not be able to solve any problems the basic HMM cannot, but which can solve some of them more efficiently in terms of the amount of training data required.
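A common example of such a constraint is the left-to-right (Bakis) topology, in which each state may only persist or advance. The following sketch contrasts the transition matrix of a fully connected HMM with a constrained one; the probabilities chosen here are illustrative, not estimated from data:

```python
import numpy as np

n_states = 4

# Fully connected HMM: every transition is allowed, so all
# n_states * n_states transition probabilities must be learned.
full = np.full((n_states, n_states), 1.0 / n_states)

# Constrained (left-to-right) HMM: each state may only stay put or
# advance to its successor, which zeroes out most entries and greatly
# reduces the number of free parameters to estimate.
left_right = np.zeros((n_states, n_states))
for i in range(n_states - 1):
    left_right[i, i] = 0.5       # self-loop
    left_right[i, i + 1] = 0.5   # advance
left_right[-1, -1] = 1.0         # final state absorbs

# Each row of either matrix is still a valid probability distribution.
assert np.allclose(full.sum(axis=1), 1.0)
assert np.allclose(left_right.sum(axis=1), 1.0)
```

The zeros in the constrained matrix are structural: they are fixed before training, so no training data is spent estimating them.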

The hierarchical hidden Markov model

In the hierarchical hidden Markov model (HHMM), each state is considered to be a self-contained probabilistic model; more precisely, each state of the HHMM is itself an HHMM. This implies that the states of an HHMM emit sequences of observation symbols, rather than the single observation symbols emitted by standard HMM states.

Illustration of the structure of an HHMM. Gray lines show vertical transitions; horizontal transitions are shown as black lines. The light gray circles are internal states, and the dark gray circles are terminal states that return control to the activating state. The production states are not shown in this figure.

When a state in an HHMM is activated, it activates its own probabilistic model, i.e. it activates one of the states of the underlying HHMM, which in turn may activate its own underlying HHMM, and so on. The process repeats until a special state, called a production state, is activated. Only the production states emit observation symbols in the usual HMM sense. When a production state has emitted a symbol, control returns to the state that activated it. The states that do not directly emit observation symbols are called internal states. The activation of a state in an HHMM under an internal state is called a vertical transition. After a vertical transition is completed, a horizontal transition occurs to a state within the same level. When a horizontal transition leads to a terminating state, control is returned to the state higher up in the hierarchy that produced the last vertical transition.

Note that a vertical transition can result in further vertical transitions before reaching a sequence of production states and finally returning to the top level. The production states visited thus give rise to a sequence of observation symbols that is "produced" by the state at the top level.
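The generative process described above can be sketched as a recursive sampler. The toy model below is hypothetical (state names, the nested-dict encoding, and the use of `None` as a stand-in for the terminal state are all illustrative choices, not part of any standard API); it shows how vertical transitions recurse downward and horizontal transitions walk along a level until it terminates:

```python
import random

# Hypothetical two-level HHMM. An internal state performs a vertical
# transition into a sub-model; a production state emits a single symbol.
# A horizontal transition to None plays the role of the terminal state,
# returning control to the activating state.
hhmm = {
    "root": {
        "type": "internal",
        "start": "A",                          # vertical transition target
        "horizontal": {"A": "B", "B": None},   # None marks level termination
    },
    "A": {"type": "production", "emits": ["a"]},
    "B": {"type": "production", "emits": ["b"]},
}

def sample(state, model, out):
    """Activate `state`: production states emit one symbol; internal
    states take a vertical transition, then horizontal transitions
    until the level terminates and control returns to the caller."""
    node = model[state]
    if node["type"] == "production":
        out.append(random.choice(node["emits"]))
        return
    current = node["start"]                    # vertical transition
    while current is not None:
        sample(current, model, out)            # may recurse further down
        current = node["horizontal"][current]  # horizontal transition

symbols = []
sample("root", hhmm, symbols)
# With this deterministic toy structure, the root "produces" ['a', 'b'].
```

A realistic model would draw the vertical and horizontal successors from probability distributions rather than following fixed pointers; the control flow, however, is exactly the one described in the text.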

The methods for estimating the HHMM parameters and model structure are more complex than those for the HMM; the interested reader is referred to (Fine et al., 1998).

It should be pointed out that the HMM and the HHMM belong to the same class of classifiers; that is, they can be used to solve the same set of problems. In fact, the HHMM can be transformed into a standard HMM. However, the HHMM exploits its structure to solve a subset of these problems more efficiently.
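The transformation to a flat HMM can be illustrated for a minimal two-level case. Suppose the top-level state expands vertically into two production states with entry distribution `pi`, horizontal transition matrix `H`, and per-state termination probabilities `end`, and assume (an assumption of this sketch, not a general recipe) that the top-level state simply re-activates whenever its child level terminates. Then each flat transition is either a direct horizontal move or "terminate, return to the top, re-enter vertically":

```python
import numpy as np

# Hypothetical two-level HHMM parameters (illustrative numbers).
pi = np.array([0.7, 0.3])        # vertical entry distribution
H = np.array([[0.2, 0.5],
              [0.4, 0.1]])       # horizontal transitions within the level
end = np.array([0.3, 0.5])       # probability the level terminates after
                                 # each state; each row of H plus end sums to 1

# Flat HMM over the production states: a direct horizontal move, plus
# the composed path "terminate (end[i]), re-enter vertically (pi[j])".
A_flat = H + np.outer(end, pi)   # A_flat[i, j] = H[i, j] + end[i] * pi[j]

# The flattened matrix is a proper stochastic matrix.
assert np.allclose(A_flat.sum(axis=1), 1.0)
```

For deeper hierarchies the same composition is applied level by level, which is why the flattened HMM can grow much larger than the HHMM that generated it.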

References

  • S. Fine, Y. Singer and N. Tishby, "The Hierarchical Hidden Markov Model: Analysis and Applications", Machine Learning, vol. 32, pp. 41–62, 1998.
  • K. Murphy and M. Paskin, "Linear Time Inference in Hierarchical HMMs", NIPS-01 (Neural Information Processing Systems), 2001.
  • H. Bui, D. Phung and S. Venkatesh, "Hierarchical Hidden Markov Models with General State Hierarchy", AAAI-04 (National Conference on Artificial Intelligence), 2004.
This article was sourced under the Creative Commons Attribution-ShareAlike License; additional terms may apply.