Graphical Models of Probability
Graphical models use directed or undirected graphs over a set of random variables to specify variable dependencies explicitly, allowing less restrictive independence assumptions while limiting the number of parameters that must be estimated.
Bayesian Networks: directed acyclic graphs that indicate causal structure.
Markov Networks: undirected graphs that capture general dependencies.
Middleware, CCNT, ZJU, 10/02/11
Hidden Markov Model
Yueshen Xu, CCNT, Zhejiang University
Overview: Markov Chain; HMM; Three Core Problems and Algorithms; Applications.
Markov Chain Instance
We can regard the weather as having three states: state 1: Rain; state 2: Cloudy; state 3: Sun. With long-term observation we can estimate the transition matrix:

Today \ Tomorrow   Rain   Cloudy   Sun
Rain               0.4    0.3      0.3
Cloudy             0.2    0.6      0.2
Sun                0.1    0.1      0.8
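Multi-step forecasts under this matrix follow by matrix multiplication. A minimal sketch in plain Python, using the numbers and state order (Rain, Cloudy, Sun) from the table above:

```python
# Weather transition matrix: rows = today, columns = tomorrow.
# State order: Rain, Cloudy, Sun.
P = [
    [0.4, 0.3, 0.3],  # Rain
    [0.2, 0.6, 0.2],  # Cloudy
    [0.1, 0.1, 0.8],  # Sun
]

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Two-step probabilities: P2[i][j] = P(state j in two days | state i today).
P2 = matmul(P, P)
print(P2[2][2])  # P(Sun in two days | Sun today) = 0.1*0.3 + 0.1*0.2 + 0.8*0.8 ≈ 0.69
```

Each row of P2 still sums to 1, as any stochastic matrix power must.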
Definition
The one-step transition probability is p_ij = P(X_{t+1} = j | X_t = i). That is to say, the evolution of the stochastic process relies only on the current state and has nothing to do with the states before it: P(X_{t+1} | X_t, X_{t-1}, ..., X_0) = P(X_{t+1} | X_t). We call this the Markov property, and such a process is regarded as a Markov process. The state space is the set of values X_t can take; an observation sequence is a realized sequence of states.
Keystone
State transition matrix A = (a_ij), where a_ij = P(X_{t+1} = j | X_t = i), with a_ij >= 0 and each row summing to 1.
Initial state probability vector pi, where pi_i = P(X_1 = i).
HMM
An HMM is a doubly stochastic process, consisting of two coupled parts:
Markov chain (pi, A): describes the transitions between states, which are unobservable, by means of the transition probability matrix.
General stochastic process (B): describes the stochastic process of the observable events emitted from those states.
State sequence q1, q2, ..., qT (unobservable) --> observation sequence o1, o2, ..., oT (observable). The core feature: only the observations are seen; the states stay hidden.
Example: a three-state arc-emission process over states S1, S2, S3. Each arc carries a transition probability and its own output distribution over {a, b}:

a11 = 0.3 (a: 0.8, b: 0.2)   a12 = 0.5 (a: 1.0, b: 0.0)   a13 = 0.2 (a: 0.0, b: 1.0)
a22 = 0.4 (a: 0.3, b: 0.7)   a23 = 0.6 (a: 0.5, b: 0.5)

What is the probability that this stochastic process produces the sequence "aab"?
Instance 1: S1 -> S1 -> S2 -> S3: 0.3*0.8 * 0.5*1.0 * 0.6*0.5 = 0.036
Instance 2: S1 -> S2 -> S2 -> S3: 0.5*1.0 * 0.4*0.3 * 0.6*0.5 = 0.018
Instance 3: S1 -> S1 -> S1 -> S3: 0.3*0.8 * 0.3*0.8 * 0.2*1.0 = 0.01152
Therefore, the total probability is: 0.036 + 0.018 + 0.01152 = 0.06552
We only see "aab", but we don't know "S? S? S?" -- that's the point.
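The sum over all hidden paths can be checked mechanically. A small sketch that enumerates every state path of this arc-emission model, with the transition and output probabilities read off the example above:

```python
# Arc-emission HMM from the example: each arc has a transition
# probability and its own output distribution over {a, b}.
arcs = {
    "S1": [("S1", 0.3, {"a": 0.8, "b": 0.2}),
           ("S2", 0.5, {"a": 1.0, "b": 0.0}),
           ("S3", 0.2, {"a": 0.0, "b": 1.0})],
    "S2": [("S2", 0.4, {"a": 0.3, "b": 0.7}),
           ("S3", 0.6, {"a": 0.5, "b": 0.5})],
    "S3": [],  # absorbing final state
}

def total_prob(state, symbols, final="S3"):
    """Sum P(path emits `symbols` and ends in `final`) over all state paths."""
    if not symbols:
        return 1.0 if state == final else 0.0
    return sum(p * emit[symbols[0]] * total_prob(nxt, symbols[1:], final)
               for nxt, p, emit in arcs[state])

print(total_prob("S1", "aab"))  # 0.036 + 0.018 + 0.01152 = 0.06552
```

The recursion implicitly visits the three paths listed on the slide; every other path contributes zero probability.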
Description
An HMM is identified by the following parameters:
N: the number of states
M: the number of observable events per state
A: the state transition matrix
B: the observation probability matrix
pi: the initial state distribution
We generally record the model compactly as lambda = (A, B, pi).
Three Core Problems
Evaluation: given an observation sequence O and a model lambda, how do we calculate P(O | lambda)?
Optimization (decoding): given O and lambda, how do we choose a state sequence Q that best explains the observation sequence?
Training: how do we adjust the model parameters lambda to maximize P(O | lambda)?
We know O, but we don't know Q.
Solution
There is no need to expound these algorithms here, since our focus is the application context:
Evaluation -- dynamic programming: the Forward and Backward algorithms.
Optimization -- dynamic programming as well: the Viterbi algorithm.
Training -- iterative: Baum-Welch (an EM procedure) with maximum likelihood estimation.
You can think these methods over and derive them after the workshop.
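To make the evaluation problem concrete, here is a minimal forward algorithm for a standard state-emission HMM lambda = (A, B, pi). The two-state numbers are invented for illustration, not taken from the slides; replacing the sum with a max (and keeping back-pointers) turns this into Viterbi.

```python
def forward(A, B, pi, obs):
    """Compute P(O | lambda) with the forward algorithm.
    A[i][j]: transition i->j; B[i][k]: prob of symbol k in state i; pi[i]: initial."""
    n = len(pi)
    # Initialization: alpha_1(i) = pi_i * b_i(o_1)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    # Induction: alpha_{t+1}(j) = [sum_i alpha_t(i) * a_ij] * b_j(o_{t+1})
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    # Termination: P(O | lambda) = sum_i alpha_T(i)
    return sum(alpha)

A  = [[0.7, 0.3], [0.4, 0.6]]   # invented transition matrix
B  = [[0.9, 0.1], [0.2, 0.8]]   # invented emission matrix (2 symbols)
pi = [0.5, 0.5]
print(forward(A, B, pi, [0, 1, 0]))  # ≈ 0.099375
```

The cost is O(N^2 T), versus O(N^T) for naive path enumeration -- which is exactly why dynamic programming is the answer to the evaluation problem.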
Application Context
Think over the features of an HMM: which kinds of problems can it describe and model? Two related stochastic sequences, where one relies on the other; one can be "seen", but the other cannot. Thinking about the three core problems, we can draw a conclusion: use one sequence to deduce and predict the other -- in other words, find out who is behind the "iceberg".
Application Context (1): Speech Recognition
Statistical description: the acoustic feature sequence obtained by sampling, T = t1, t2, ..., tn, and the word sequence W(n) = W1, W2, ..., Wn. What we care about is P(W(n) | T).
Formal description: what we have to solve is k = arg max P(W(n) | T).
Application Context (1): Speech Recognition
Recognition framework: extract features from the speech-database waveforms, then run Baum-Welch re-estimation on the HMMs (lambda_1, lambda_2, ..., lambda_7), looping until the models converge.
Application Context (2): Text Information Extraction
Figuring out the HMM model:
Q1: What is the state, and what is the observation event? State: what you want to extract; observation event: a text block or each word, etc.
Q2: How do we obtain the parameters, such as a_ij? Through training samples.
Application Context (2): Text Information Extraction
Extraction framework: training samples -> document partitioning -> HMM -> state list (e.g., country, state, city, street; or title, author, email, abstract) -> extracted sequence.
Application Context (3): Other Fields
Face recognition, POS tagging, Web data extraction, bioinformatics, network intrusion detection, handwriting recognition, document categorization, multiple sequence alignment, ...
Which field are you interested in?
Bayes Belief Network
Yueshen Xu
Overview: Bayes Theorem; Naïve Bayes; Bayes Belief Network; Application.
Bayes Theorem
The basic Bayes formula -- the basis of everything here, but vital:
P(A | B) = P(B | A) P(A) / P(B)
P(A) is the prior probability and P(A | B) is the posterior probability; the denominator expands by the complete (total) probability formula, P(B) = sum_i P(B | A_i) P(A_i). Bayes' rule thus inverts the direction of conditioning.
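A worked numeric instance of the formula, using a hypothetical diagnostic test whose numbers are invented for illustration:

```python
# Hypothetical test: 1% prevalence, 95% sensitivity, 5% false-positive rate.
p_d = 0.01          # prior P(disease)
p_pos_d = 0.95      # P(positive | disease)
p_pos_not_d = 0.05  # P(positive | no disease)

# Complete probability formula for the denominator P(positive).
p_pos = p_pos_d * p_d + p_pos_not_d * (1 - p_d)

# Bayes' rule: posterior P(disease | positive).
posterior = p_pos_d * p_d / p_pos
print(posterior)  # ≈ 0.161 -- the prior inverts a seemingly accurate test
```

Despite the 95% sensitivity, the low prior keeps the posterior around 16%, which is the classic illustration of why the prior term matters.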
Naïve Bayes
The naïve Bayes model is a simple probabilistic model that applies Bayes' theorem with strong (conditional) independence assumptions. By the chain rule and conditional independence,
P(C | F_1, ..., F_n) ∝ P(C) * prod_i P(F_i | C).
Naïve Bayes is itself a simple Bayes net: a class node C with feature children F_1, F_2, ..., F_n.
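The factorization turns directly into a tiny classifier. A sketch with invented priors and likelihoods (two classes, features assumed conditionally independent given the class):

```python
# Score each class with P(C) * prod_i P(F_i | C), then normalize.
priors = {"spam": 0.4, "ham": 0.6}           # invented class priors
likelihood = {                                # invented P(word | class)
    "spam": {"free": 0.5, "hello": 0.2},
    "ham":  {"free": 0.1, "hello": 0.6},
}

def posterior(words):
    scores = {}
    for c in priors:
        p = priors[c]
        for w in words:
            p *= likelihood[c][w]             # conditional-independence product
        scores[c] = p
    z = sum(scores.values())                  # normalizer P(F_1, ..., F_n)
    return {c: p / z for c, p in scores.items()}

post = posterior(["free"])
print(post)  # spam ≈ 0.769, ham ≈ 0.231
```

Note the normalizer z is shared by all classes, so for pure classification (arg max) it can be skipped entirely.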
Bayes Belief Network: Graph Structure
A directed acyclic graph (DAG): nodes are random variables, and edges indicate causal influences (each edge runs from a parent to its descendants).
Example: Burglary -> Alarm <- Earthquake; Alarm -> JohnCalls; Alarm -> MaryCalls.
Bayes Belief Network: Conditional Probability Table
Each node has a conditional probability table (CPT) that gives the probability of each of its values for every possible combination of values of its parents. Roots (sources) of the DAG, which have no parents, are given prior probabilities.

P(B) = .001    P(E) = .002

B  E  | P(A|B,E)        A | P(J|A)        A | P(M|A)
T  T  | .95             T | .90           T | .70
T  F  | .94             F | .05           F | .01
F  T  | .29
F  F  | .001
Bayes Belief Network: Joint Distributions
A Bayesian network implicitly defines a joint distribution:
P(x_1, ..., x_n) = prod_i P(x_i | parents(X_i)).
By conditional independence, for example,
P(J, M, A, ~B, ~E) = P(J|A) P(M|A) P(A|~B,~E) P(~B) P(~E).
Therefore an inefficient approach to inference is: 1) compute the joint distribution using this equation; 2) compute any desired conditional probability from the joint.
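Both steps can be sketched directly with the CPTs of the burglary network: the product rule yields any full joint entry, and summing joint entries answers conditional queries. Variable order is B, E, A, J, M:

```python
from itertools import product

# CPTs of the burglary network.
P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}   # P(Alarm | B, E)
P_J = {True: 0.90, False: 0.05}                       # P(JohnCalls | Alarm)
P_M = {True: 0.70, False: 0.01}                       # P(MaryCalls | Alarm)

def pr(p, value):      # probability that a Boolean variable takes `value`
    return p if value else 1.0 - p

def joint(b, e, a, j, m):
    """P(b,e,a,j,m) = P(b) P(e) P(a|b,e) P(j|a) P(m|a)."""
    return (pr(P_B, b) * pr(P_E, e) * pr(P_A[(b, e)], a)
            * pr(P_J[a], j) * pr(P_M[a], m))

# The joint entry from the slide: P(J, M, A, ~B, ~E) = .9*.7*.001*.999*.998
print(joint(False, False, True, True, True))   # ≈ 0.000628

# Inefficient inference: P(B | J, M) by summing joint entries over E and A.
num = sum(joint(True, e, a, True, True) for e, a in product([True, False], repeat=2))
den = num + sum(joint(False, e, a, True, True) for e, a in product([True, False], repeat=2))
print(num / den)   # ≈ 0.284 -- burglary is still unlikely even if both call
```

Summing the joint this way costs time exponential in the number of variables, which is precisely why it is called the inefficient approach.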
Conditional Independence & D-separation
Let X, Y, and Z be three sets of nodes. If X and Y are d-separated by Z, then X and Y are conditionally independent given Z. X is d-separated from Y given Z if every undirected path between them is blocked. Path blocking comes in three cases, corresponding to the three basic independence structures (chain, common cause, and common effect).
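The chain case can be checked numerically. A toy sketch with a three-node chain A -> B -> C and invented CPTs: A and C are dependent marginally, but observing B blocks the path.

```python
# Chain A -> B -> C; the joint factorizes as P(a) P(b|a) P(c|b).
p_a = 0.3
p_b_given_a = {True: 0.9, False: 0.2}   # P(B=T | a), invented
p_c_given_b = {True: 0.7, False: 0.1}   # P(C=T | b), invented

def p_c_given_a(a):
    """P(C=T | A=a), marginalizing out B."""
    total = 0.0
    for b in (True, False):
        p_b = p_b_given_a[a] if b else 1.0 - p_b_given_a[a]
        total += p_b * p_c_given_b[b]
    return total

# Marginally, A influences C through B: the path is open.
print(p_c_given_a(True), p_c_given_a(False))   # 0.64 vs 0.22 -> dependent

# Once B is observed, the factorization gives P(C | A, B) = P(C | B),
# independent of A: the chain path A -> B -> C is blocked by B.
```

The other two blocking structures (common cause A <- B -> C and common effect A -> B <- C, where observing B respectively blocks and *opens* the path) can be verified the same way.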
Application: Simple Document Classification (1)
Step 1: assume for the moment that there are only two mutually exclusive classes, S and ~S (e.g., spam and not spam), such that every element (email) is in either one or the other; that is to say, P(S|D) + P(~S|D) = 1 for any document D.
Step 2: what we care about is the posterior P(S|D).
Application: Simple Document Classification (2)
Step 3: dividing one posterior by the other gives a ratio that can be re-factored with Bayes' rule and the naïve independence assumption:
P(S|D) / P(~S|D) = [P(S) / P(~S)] * prod_i [P(w_i|S) / P(w_i|~S)].
Step 4: take the logarithm of all these ratios to reduce the amount of calculation; classify the document as spam when the log-ratio is > 0 and as not spam when it is < 0. The word likelihoods are estimated from known, labeled training samples.
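Steps 3 and 4 can be sketched as a log-likelihood-ratio filter. The word likelihoods below are invented; in practice they would be estimated from labeled training samples:

```python
import math

p_spam = 0.5   # prior P(S); P(~S) = 1 - p_spam
lik = {        # (P(word | S), P(word | ~S)): invented illustration values
    "free":    (0.40, 0.05),
    "meeting": (0.01, 0.10),
}

def log_ratio(words):
    """ln[P(S|D)/P(~S|D)] = ln[P(S)/P(~S)] + sum_i ln[P(w_i|S)/P(w_i|~S)]."""
    score = math.log(p_spam / (1.0 - p_spam))
    for w in words:
        ps, pn = lik[w]
        score += math.log(ps / pn)
    return score

print(log_ratio(["free", "free", "meeting"]) > 0)   # True  -> classified as spam
print(log_ratio(["meeting"]) > 0)                   # False -> classified as not spam
```

Working in log space replaces a long product of small probabilities with a sum, which is both cheaper and numerically safer.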
Application: Overall
Medical diagnosis: the Pathfinder system outperforms leading experts in the diagnosis of lymph-node disease.
Microsoft applications: problem diagnosis (e.g., printer problems) and recognizing user intents for HCI.
Text categorization and spam filtering.
Student modeling for intelligent tutoring systems.
Biochemical data analysis: predicting mutagenicity.
And many more... Which field are you interested in?