0 votes

We are given an HMM with 3 hidden states: Red, Green, Blue (RGB).

Emission probabilities:  

R --> 1 (p=1/3), 2 (p=1/3), 3 (p=1/3)

G --> 1 (p=1/3), 2 (p=1/3), 3 (p=1/3)

B --> 1 (p=1/3), 2 (p=1/3), 3 (p=1/3)

Transition Matrix

R --> R (p=1/3), G (p=1/3), B (p=1/3)

G --> R (p=1/3), G (p=1/3), B (p=1/3)

B --> R (p=1/3), G (p=1/3), B (p=1/3)

These are the observations: 1, 2, 3, 1, 1, 2, 1, 2, 3, 1, 1, 2, 2

What is the most likely explanation for this observation sequence?
Initial state is not known and is equally likely from all states.

in HMM by AlgoMeister (2.1k points)

15 Answers

+1 vote
 
Best answer

Without solving this problem mathematically, I think it is fairly obvious: since all the emission probabilities and all the entries of the transition matrix are equal, every state sequence explains the observations equally well. Therefore, there is no unique most likely explanation; any state sequence of the same length is equally probable.

If we solve this problem using the mathematical equations:

P(S, O) = P(s1) ∏_{t=2}^{T} P(st | st−1) ∏_{t=1}^{T} P(ot | st)

P(s1) = 1/3

P(st | st−1) = 1/3

P(ot | st) = 1/3

After putting the values into the equation, we get:

P(S, O) = (1/3)^(2T), which means P(S, O) is independent of S, so every state sequence has the same likelihood (here T = 13, so P(S, O) = (1/3)^26).
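As a quick numerical check (a small Python sketch, not part of the original answer; the names `init`, `trans`, `emit`, and `joint_prob` are illustrative), the uniform model can be encoded directly and the joint probability compared across paths:

```python
# Uniform 3-state HMM, as given in the question: initial, transition,
# and emission probabilities are all 1/3.
states = ["R", "G", "B"]
init = {s: 1 / 3 for s in states}
trans = {(i, j): 1 / 3 for i in states for j in states}
emit = {(s, o): 1 / 3 for s in states for o in (1, 2, 3)}

def joint_prob(path, obs):
    # P(S, O) = P(s1) P(o1|s1) * product over t of P(st|st-1) P(ot|st)
    p = init[path[0]] * emit[(path[0], obs[0])]
    for t in range(1, len(obs)):
        p *= trans[(path[t - 1], path[t])] * emit[(path[t], obs[t])]
    return p

obs = [1, 2, 3, 1, 1, 2, 1, 2, 3, 1, 1, 2, 2]
p1 = joint_prob(list("RGBRGBRGBRGBR"), obs)
p2 = joint_prob(list("GGGGGGGGGGGGG"), obs)
# p1 == p2 == (1/3)**26: the chosen path makes no difference
```

Any other length-13 path gives the same value, which is the point of the argument above.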

by AlgoStar (392 points)
0 votes

From the given question, we can see that all emission probabilities are equal and all transition probabilities are equal, so every possible hidden state sequence has exactly the same probability. We can write this as below:

  • P(R) = P(G) = P(B) = 1/3     (initial state probabilities)
  • P(1|R) = P(2|R) = P(3|R) = 1/3     (emission probabilities)
  • P(R|R) = P(G|R) = P(B|R) = 1/3     (transition probabilities)

So for any hidden-state sequence of length 13:


P(States, observations) = P(s1) ∏_{t=1}^{13} P(ot | st) ∏_{t=2}^{13} P(st | st−1), or simply


P(states, observations) = P(s1) * P(o1|s1) * P(s2|s1) * P(o2|s2) * P(s3|s2) * P(o3|s3) * ... * P(s13|s12) * P(o13|s13)

P(states, observations) = (1/3)^26

So, all hidden-state sequences are equally likely. For example:

1) B,G,R,B,R,G,G,B,R,R,G,B,B

2) G,G,G,G,G,G,G,G,G,G,G,G,G

3) R,G,B,R,G,B,R,G,B,R,G,B,R

all have the same probability. 
Therefore, there is no single most likely state sequence, as every possible RGB state sequence of length 13 is equally likely.

by (236 points)
0 votes
Since all probabilities are the same, all states are symmetric. This means that any RGB sequence of length 13 is equally likely, with probability (1/3)^26 for each.
by Active (264 points)
0 votes

We have an HMM with states St ∈ {R, G, B} and observations Ot.

Model

Initial: P(S1 = s) = 1/3

Transition:
P(St = j | St−1 = i) = 1/3, ∀ i, j ∈ {R, G, B}

Emission:
P(Ot = k | St = s) = 1/3, ∀ s ∈ {R, G, B}, k ∈ {1, 2, 3}

Joint probability

For any hidden sequence S1:T and observed sequence O1:T:
P(S1:T, O1:T) = P(S1) ∏_{t=2}^{T} P(St | St−1) ∏_{t=1}^{T} P(Ot | St)

Substituting the values:
P(S1:T, O1:T) = (1/3) * (1/3)^(T−1) * (1/3)^T = (1/3)^(2T)

So for this problem (T = 13):
P(S1:T, O1:T) = (1/3)^26

Interpretation

Now notice something important:

The expression (1/3)^26 does not depend on which states we choose.

It is the same for every possible hidden sequence S1:T.

So when we compute the posterior:
P(S1:T | O1:T) = P(S1:T, O1:T) / P(O1:T)

the numerator is identical for all state paths, meaning P(S1:T | O1:T) is constant over all state sequences S1:T.
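This uniform-posterior claim can be brute-force checked (an illustrative sketch, not from the answer; a shorter horizon T = 4 keeps the enumeration small at 3^4 = 81 paths):

```python
from itertools import product

# Enumerate all hidden paths for a short horizon and verify the
# posterior P(S | O) is uniform when every model probability is 1/3.
states = ["R", "G", "B"]
obs = [1, 2, 3, 1]  # any observations give the same result here
T = len(obs)

def joint(path):
    # P(s1) * P(o1|s1), then (transition * emission) per later step;
    # every factor is 1/3 in this model, regardless of the path.
    p = (1 / 3) * (1 / 3)
    for _ in range(1, T):
        p *= (1 / 3) * (1 / 3)
    return p

joints = {path: joint(path) for path in product(states, repeat=T)}
p_obs = sum(joints.values())                 # P(O), marginalizing out S
posteriors = [p / p_obs for p in joints.values()]
# every entry equals 1 / 3**T: the posterior is uniform over all 81 paths
```

The same argument scales to T = 13, where each posterior is 1/3^13.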
​​

    by Active (264 points)
    0 votes

    This HMM has no unique most likely hidden-state sequence. The reason is that, for every state:

    R emits 1,2,3 with probability 1/3

    G emits 1,2,3 with probability 1/3

    B emits 1,2,3 with probability 1/3

    And every transition also has probability 1/3:

    R/G/B -> R/G/B all equally likely.

    So for any hidden sequence, the probability is the same. There are 13 observations, so there are 13 hidden states.

    For any hidden sequence:

    P(states, observations) = P(s1) ∏ P(st | st−1) ∏ P(ot | st)

    Since everything is 1/3:

    P = (1/3) · (1/3)^12 · (1/3)^13 = (1/3)^26

    So all possible hidden-state sequences have exactly the same probability.

    by Active (312 points)
    0 votes

    Hi, 

    All states are equally likely to start, all transitions are equally likely, and all emissions are equally likely.

    So every possible hidden-state sequence has the same probability.

    Observation sequence:

    1, 2, 3, 1, 1, 2, 1, 2, 3, 1, 1, 2, 2

    For any hidden-state sequence of length 13, the probability is:

    Initial probability * transition probabilities * emission probabilities

    = (1/3) * (1/3)^12 * (1/3)^13

    = (1/3)^26

    So there is no single most likely explanation.

    Answer:

    Any state sequence is equally likely.

    For example:

    R, R, R, R, R, R, R, R, R, R, R, R, R is just as likely as R, G, B, R, G, B, R, G, B, R, G, B, R, and just as likely as any other sequence.

    by Active (264 points)
    0 votes
    The observation sequence could have been produced by any state sequence, since every state emits each observation with equal probability. Moreover, the transition matrix assigns equal probability to every possible state transition, which further confirms that no particular hidden sequence is more likely than any other to lie behind this observation.
    by (224 points)
    0 votes
    In this HMM, all probabilities are equal.

    Each state starts with probability 1/3.  

    Each transition has probability 1/3.  

    Each emission has probability 1/3.  

    The observation sequence has length T = 13, so there are 13 hidden states.

    For any chosen state sequence S₁:₁₃, the joint probability is:

    P(S, O) = (1/3) × (1/3)¹² × (1/3)¹³  

            = (1/3)²⁶  

    This value is the same for every possible sequence of 13 states.

    So no sequence is better than another. The model assigns equal likelihood to all of them.

    For example, these two sequences:

    R R R R R R R R R R R R R  

    R G B R G B R G B R G B R  

    both have probability (1/3)²⁶.

    Conclusion: there is no unique most likely sequence. Any sequence of length 13 is an MLE.
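The tie can also be seen by running Viterbi itself; a minimal log-space sketch (my own illustration, not from the answer above) shows that the best score is 26·log(1/3) and that every state achieves it at every step:

```python
import math

# Minimal log-space Viterbi on this uniform HMM. Because every
# probability is 1/3, the per-state scores are identical at every
# step, so any backpointer choice yields an equally good "best" path.
states = ["R", "G", "B"]
obs = [1, 2, 3, 1, 1, 2, 1, 2, 3, 1, 1, 2, 2]
log3 = math.log(1 / 3)

# Initialization: delta_1(s) = log P(s1) + log P(o1 | s1)
delta = {s: log3 + log3 for s in states}

# Recursion: delta_t(s) = max_p [delta_{t-1}(p) + log P(s|p)] + log P(ot|s)
for _ in obs[1:]:
    delta = {s: max(delta[p] + log3 for p in states) + log3 for s in states}

best_log = max(delta.values())
# best_log == 26 * log(1/3), and all three final scores tie
```

Since the max at each recursion step is over three equal values, backtracking can pick any predecessor, which is exactly why no unique sequence exists.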
    by Active (272 points)
    0 votes
    Hello,

    We have 3 hidden states:

    R, G, B

    Emission probabilities:

    For R:
    P(1|R) = 1/3, P(2|R) = 1/3, P(3|R) = 1/3

    For G:
    P(1|G) = 1/3, P(2|G) = 1/3, P(3|G) = 1/3

    For B:
    P(1|B) = 1/3, P(2|B) = 1/3, P(3|B) = 1/3

    Transition probabilities:

    From any state, probability to go to R, G, or B is always 1/3.

    Also initial state is equally likely:

    P(R at start) = P(G at start) = P(B at start) = 1/3

    Observation sequence:

    1, 2, 3, 1, 1, 2, 1, 2, 3, 1, 1, 2, 2

    Now the important point is:

    All 3 states have exactly the same emission probabilities

    and

    all transitions are also exactly the same

    So no matter which hidden-state sequence we choose:

    each observation contributes factor 1/3

    each transition contributes factor 1/3

    initial state also contributes factor 1/3

    That means every possible hidden-state sequence of length 13 has exactly the same probability.

    For any particular state sequence q1, q2, ..., q13:

    P(q1, ..., q13, observations)

    = P(q1) * product of transitions * product of emissions

    = (1/3) * (1/3)^12 * (1/3)^13

    = (1/3)^26
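The factor count above can be verified exactly with rational arithmetic (a small sanity check, not part of the original answer):

```python
from fractions import Fraction

# 1 initial factor + 12 transition factors + 13 emission factors
# = 26 factors of 1/3 in total.
third = Fraction(1, 3)
p = third * third ** 12 * third ** 13
# p is exactly 1 / 3**26
```

Using Fraction avoids any floating-point rounding in the comparison.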

    So all explanations have equal probability.

    Therefore there is no single most likely explanation.

    Every possible RGB state sequence of length 13 is tied.

    Examples:

    R, R, R, R, R, R, R, R, R, R, R, R, R

    and

    R, G, B, R, G, B, R, G, B, R, G, B, R

    and

    B, B, G, R, R, G, B, R, G, B, B, R, G

    all have the same probability.

    Final answer:

    There is no unique most likely hidden-state sequence. Because all states emit observations with the same probabilities and all transitions are uniform, every possible state sequence is equally likely. So the observation sequence does not give any preference for one explanation over another.
    by Active (252 points)
    0 votes

    Hello Professor,

    In this HMM, there are 3 hidden states:

    R, G, B

    The observation sequence is:

    1, 2, 3, 1, 1, 2, 1, 2, 3, 1, 1, 2, 2

    So there are 13 observations, which means there are 13 hidden states.

    The important part is that all probabilities are equal.

    The initial state probabilities are:

    P(R) = 1/3
    P(G) = 1/3
    P(B) = 1/3

    The transition probabilities are also all equal:

    From any state, the probability of going to R, G, or B is 1/3.

    The emission probabilities are also all equal:

    Each state emits 1, 2, or 3 with probability 1/3.

    So no hidden state is more likely than another, and no transition is more likely than another.

    For any hidden-state sequence of length 13, the probability is:

    initial probability * transition probabilities * emission probabilities

    There is 1 initial probability:

    1/3

    There are 12 transitions:

    (1/3)^12

    There are 13 emissions:

    (1/3)^13

    So for any hidden-state sequence:

    P(sequence and observations) = (1/3) * (1/3)^12 * (1/3)^13

    P(sequence and observations) = (1/3)^26

    This value is the same for every possible hidden-state sequence.

    For example:

    R, R, R, R, R, R, R, R, R, R, R, R, R

    has the same probability as:

    R, G, B, R, G, B, R, G, B, R, G, B, R

    and also the same probability as any other sequence of 13 states.

    Final answer:

    There is no unique most likely explanation. Every possible RGB state sequence of length 13 is equally likely.

    So the observation sequence does not give preference to any one hidden-state sequence.

    by (212 points)
