1
A Hidden Markov model for progressive multiple alignment
  • Ari Loytynoja and Michel C. Milinkovitch
  • Presented by Santosh Kumar Kodicherla

2
HMM Applications
  • Hidden Markov Models are used to find optimal solutions in many applications, for example:
  • 1. Predicting transmembrane helices
  • 2. Deciding whether a die is fair or loaded
  • 3. Decision tree applications, neural networks, etc.

3
How the HMM Works for Simple Pairwise Alignment
  • We compare the two sequences and build their unknown parent sequence (maximizing similarity).
  • This forms the basis of the current algorithm.

(Figure: unknown parent node with its two child sequences, Seq1 and Seq2)
4
Steps in how the HMM works
5
Alignments
  • Pairwise Alignment
  • PDGIVTSIGSNLTIACRVS
  • PPLASSSLGATIRLSCTLS
  • Multiple Alignment
  • DREIYGAVGSQVTLHCSFW
  • TQDERKLLHTTASLRCSLK
  • PAWLTVSEGANATFTCSLS
  • LPDWTVQNGKNLTLQCFAD
  • LDKKEAIQGGIVRVNCSVP
  • SSFTHLDQGERLNLSCSIP
  • DAQFEVIKGQTIEVRCESI
  • LSSKVVESGEDIVLQCAVN
  • PAVFKDNPTEDVEYCCVAD

6
Systems and Models
  • Build the multiple alignment in order of decreasing similarity.
  • Compute a probabilistic alignment at each step.
  • Keep track of child pointers.
  • For each site, a vector of probabilities over the alternative characters A/C/G/T/- is calculated (see the sketch after this list).
  • The newly generated node is aligned with another internal sequence, and the process continues.
  • Once the root node of the multiple alignment is defined, recursive backtracking is used to generate the multiple alignment.
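
As a rough illustration of the per-site probability vectors mentioned above, the following Python sketch (not the authors' code; the alphabet and representation are assumptions for illustration) turns a leaf sequence into a list of vectors in which each observed character gets probability 1:

CHARS = "ACGT-"

def leaf_profile(seq):
    """Turn a raw sequence into a list of per-site probability vectors."""
    profile = []
    for ch in seq:
        vec = {c: 0.0 for c in CHARS}
        vec[ch] = 1.0          # observed characters are certain (probability 1)
        profile.append(vec)
    return profile

if __name__ == "__main__":
    for site in leaf_profile("ACGT"):
        print(site)            # internal nodes will later hold fractional values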

7
Substitution Model
  • Consider sequences x and y that generate the parent sequence z.
  • Terms
  • pa(xi): the probability that site xi has character a.
  • If a character is observed, it is given probability 1.
  • Character a has a background probability qa.
  • The probability that a evolves into b is represented by sab.
  • Comparing characters gives the substitution term.
  • GAP
  • pxi,yj represents the probability that xi and yj are aligned and generate zk.
  • Summing over all the character states a in zk:
  • pxi,yj = pzk(xi, yj) = Σa pzk^a(xi, yj), where
  • pzk^a(xi, yj) = qa (Σb sab pb(xi)) (Σb sab pb(yj))
  • (A code sketch of this match probability follows.)
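
The following Python sketch illustrates the match probability defined above. The background frequencies q and the substitution matrix s are toy values, not the paper's estimates, and gap characters are left to the HMM insert states:

CHARS = "ACGT"
Q = {c: 0.25 for c in CHARS}                                            # background frequencies qa (toy)
S = {a: {b: (0.91 if a == b else 0.03) for b in CHARS} for a in CHARS}  # substitution probabilities sab (toy)

def match_probability(px, py, q=Q, s=S):
    """pzk(xi, yj) = sum_a qa (sum_b sab pb(xi)) (sum_b sab pb(yj))."""
    total = 0.0
    for a in CHARS:
        from_x = sum(s[a][b] * px[b] for b in CHARS)   # sum_b sab pb(xi)
        from_y = sum(s[a][b] * py[b] for b in CHARS)   # sum_b sab pb(yj)
        total += q[a] * from_x * from_y
    return total

if __name__ == "__main__":
    # observed child characters get probability 1, as on this slide
    px = {"A": 1.0, "C": 0.0, "G": 0.0, "T": 0.0}
    py = {"A": 0.0, "C": 1.0, "G": 0.0, "T": 0.0}
    print(match_probability(px, px))   # identical characters: higher probability
    print(match_probability(px, py))   # different characters: lower probability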

8
  • Steps in the Algorithm
  • Look-back HMM
  • Pairwise alignment
  • Calculate posterior probabilities
  • Multiple alignment
  • Testing the algorithm

9
  • Look-back HMM
  • Defines 3 states:
  • match M, x-insert X, y-insert Y.
  • The probability of moving from M to X or Y is denoted δ (delta).
  • The probability of staying in an insert state is ε (epsilon).
  • The remaining probability moves back to M (an illustrative transition table follows).
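
A minimal illustration of these transitions, with assumed values for δ and ε (the real values are model parameters, not taken from the paper):

DELTA = 0.05    # gap-open probability (M -> X or M -> Y), assumed value
EPSILON = 0.4   # gap-extension probability (stay in X or Y), assumed value

TRANSITIONS = {
    "M": {"M": 1.0 - 2.0 * DELTA, "X": DELTA, "Y": DELTA},
    "X": {"M": 1.0 - EPSILON, "X": EPSILON, "Y": 0.0},   # no direct X <-> Y moves
    "Y": {"M": 1.0 - EPSILON, "X": 0.0, "Y": EPSILON},
}

for state, row in TRANSITIONS.items():
    assert abs(sum(row.values()) - 1.0) < 1e-9           # each row sums to 1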

10
Pairwise alignment
  • In dynamic programming, we define a matrix and fill it with recursive calls, choosing the best path.
  • Backtracking is used to recover that best path.
  • The Viterbi path gives the best alignment path.
  • This is used to find the parent vector that represents both children (a minimal Viterbi sketch follows).
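
The following is a minimal, self-contained Viterbi sketch for the three-state pair HMM (match, x-insert, y-insert) with backtracking. It illustrates the general technique, not ProAlign's implementation, and all parameter values are assumptions:

import math

DELTA, EPSILON = 0.05, 0.4                      # gap-open / gap-extend (assumed)
P_MATCH, P_MISMATCH, P_GAP = 0.22, 0.01, 0.25   # toy emission probabilities

def _log(p):
    return math.log(p) if p > 0 else float("-inf")

# log transition probabilities between the three states (no direct X <-> Y moves)
T = {
    ("M", "M"): _log(1 - 2 * DELTA), ("M", "X"): _log(DELTA), ("M", "Y"): _log(DELTA),
    ("X", "M"): _log(1 - EPSILON),   ("X", "X"): _log(EPSILON),
    ("Y", "M"): _log(1 - EPSILON),   ("Y", "Y"): _log(EPSILON),
}

def viterbi_align(x, y):
    """Return the highest-probability global alignment of x and y."""
    n, m = len(x), len(y)
    NEG = float("-inf")
    V = {s: [[NEG] * (m + 1) for _ in range(n + 1)] for s in "MXY"}
    ptr = {s: [[None] * (m + 1) for _ in range(n + 1)] for s in "MXY"}
    V["M"][0][0] = 0.0                           # start in the match state
    for i in range(n + 1):
        for j in range(m + 1):
            if i > 0 and j > 0:                  # M consumes one character from each
                emit = _log(P_MATCH if x[i-1] == y[j-1] else P_MISMATCH)
                best, prev = max((V[s][i-1][j-1] + T[(s, "M")], s) for s in "MXY")
                V["M"][i][j], ptr["M"][i][j] = best + emit, prev
            if i > 0:                            # X consumes a character from x only
                best, prev = max((V[s][i-1][j] + T[(s, "X")], s) for s in "MX")
                V["X"][i][j], ptr["X"][i][j] = best + _log(P_GAP), prev
            if j > 0:                            # Y consumes a character from y only
                best, prev = max((V[s][i][j-1] + T[(s, "Y")], s) for s in "MY")
                V["Y"][i][j], ptr["Y"][i][j] = best + _log(P_GAP), prev
    # backtracking: follow the stored pointers from the best final state
    state = max("MXY", key=lambda s: V[s][n][m])
    i, j, top, bottom = n, m, [], []
    while i > 0 or j > 0:
        prev = ptr[state][i][j]
        if state == "M":
            top.append(x[i-1]); bottom.append(y[j-1]); i -= 1; j -= 1
        elif state == "X":
            top.append(x[i-1]); bottom.append("-"); i -= 1
        else:
            top.append("-"); bottom.append(y[j-1]); j -= 1
        state = prev
    return "".join(reversed(top)), "".join(reversed(bottom))

if __name__ == "__main__":
    aligned_x, aligned_y = viterbi_align("ACGTACG", "ACTACG")
    print(aligned_x)
    print(aligned_y)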

11
Forward and backward recursions.
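
As an illustration, the forward recursion below uses the same three-state model and toy parameters as the Viterbi sketch above, replacing the maximum with a sum over incoming paths. Paired with the symmetric backward recursion (not shown), it would give the posterior probabilities mentioned on slide 8; it is a sketch of the standard pair-HMM technique, not the authors' code:

DELTA, EPSILON = 0.05, 0.4                      # same assumed parameters as above
P_MATCH, P_MISMATCH, P_GAP = 0.22, 0.01, 0.25

def forward_total(x, y):
    """Return P(x, y): the probability of the pair summed over all alignments."""
    n, m = len(x), len(y)
    F = {s: [[0.0] * (m + 1) for _ in range(n + 1)] for s in "MXY"}
    F["M"][0][0] = 1.0                           # start in the match state
    for i in range(n + 1):
        for j in range(m + 1):
            if i > 0 and j > 0:
                e = P_MATCH if x[i-1] == y[j-1] else P_MISMATCH
                F["M"][i][j] = e * ((1 - 2 * DELTA) * F["M"][i-1][j-1]
                                    + (1 - EPSILON) * (F["X"][i-1][j-1] + F["Y"][i-1][j-1]))
            if i > 0:
                F["X"][i][j] = P_GAP * (DELTA * F["M"][i-1][j] + EPSILON * F["X"][i-1][j])
            if j > 0:
                F["Y"][i][j] = P_GAP * (DELTA * F["M"][i][j-1] + EPSILON * F["Y"][i][j-1])
    return F["M"][n][m] + F["X"][n][m] + F["Y"][n][m]

if __name__ == "__main__":
    print(forward_total("ACGT", "ACT"))
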
12
Multiple Alignment Observations.
  • The pairwise algorithm works progressively from the tips of the tree to its root.
  • Once the root node is defined, the multiple alignment can be generated.
  • If a gap is introduced in the process, the recursive call does not proceed past it.
  • At a given column, most sequences are well aligned, except for a few that may contain gaps.
  • (A backtracking sketch follows this list.)
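
The sketch below illustrates one way such recursive backtracking can be organized; the Leaf/Internal classes and site maps are hypothetical data structures for illustration, not the authors' implementation:

class Leaf:
    def __init__(self, name, seq):
        self.name, self.seq = name, seq

    def column(self, site, out):
        out[self.name] = self.seq[site] if site is not None else "-"

class Internal:
    def __init__(self, left, right, site_map):
        # site_map[k] = (i, j): parent site k was generated from left site i and
        # right site j; i or j is None where that child holds a gap.
        self.left, self.right, self.site_map = left, right, site_map

    def column(self, site, out):
        i, j = self.site_map[site] if site is not None else (None, None)
        self.left.column(i, out)
        self.right.column(j, out)

def multiple_alignment(root, n_root_sites, names):
    """Generate the multiple alignment by recursing from the root, one column at a time."""
    rows = {name: [] for name in names}
    for k in range(n_root_sites):
        col = {}
        root.column(k, col)
        for name in names:
            rows[name].append(col[name])
    return {name: "".join(chars) for name, chars in rows.items()}

if __name__ == "__main__":
    a, b, c = Leaf("A", "ACGT"), Leaf("B", "AGT"), Leaf("C", "ACT")
    ab = Internal(a, b, [(0, 0), (1, None), (2, 1), (3, 2)])     # B lacks site 1
    root = Internal(ab, c, [(0, 0), (1, 1), (2, None), (3, 2)])  # C lacks site 2
    print(multiple_alignment(root, 4, ["A", "B", "C"]))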

13
Testing the new Algorithm