DMI-ST. JOHN THE BAPTIST UNIVERSITY
LILONGWE, MALAWI
Module Code: 055CS62
Subject Name: Theory of computation
Unit V Detail Notes
School of Computer Science
Module Teacher: Fanny Chatola
Syllabus
Basic Definitions, Equivalence of Acceptance by Finite State & Empty stack, PDA & Context Free Language,
Equivalence of PDA and CFG, Parsing & PDA: Top-Down Parsing, Top-down Parsing Using Deterministic PDA,
Bottom-up Parsing, Closure properties and Deterministic PDA.
A Language that is not Recursively Enumerable, An Undecidable Problem that is RE, Post Correspondence Problem,
The Classes P and NP: Problems Solvable in Polynomial Time, An Example: Kruskal's Algorithm,
Nondeterministic Polynomial Time, An NP Example: The Traveling Salesman Problem,
Polynomial-Time Reductions NP Complete Problems, An NP-Complete Problem: The Satisfiability Problem,
Tractable and Intractable Representing SAT Instances, NP Completeness of the SAT Problem,
A Restricted Satisfiability Problem: Normal Forms for Boolean Expressions, Converting Expressions to CNF,
The Problem of Independent Sets, The Node-Cover Problem.
Pushdown Automata:
Regular languages can be characterized as the languages accepted by finite automata. Similarly, we can
characterize the context-free languages as the languages accepted by a class of machines called
"Pushdown Automata" (PDA). A pushdown automaton is an extension of the NFA.
It is observed that FAs have limited capability (in the sense that the class of languages accepted or
characterized by them is small). This is due to the "finite memory" (number of states) and "no external
memory" involved with them. A PDA is simply an NFA augmented with an "external stack memory". The
addition of a stack provides the PDA with a last-in, first-out memory management capability. This "stack"
or "pushdown store" can be used to record a potentially unbounded amount of information. It is due to this
memory management capability, with the help of the stack, that a PDA can overcome the memory
limitations that prevent an FA from accepting many interesting languages, such as { a^n b^n | n ≥ 0 }.
Although a PDA can store an unbounded amount of information on the stack, its access to the information
on the stack is limited. It can push an element onto the top of the stack and pop an element off the top of
the stack. To read down into the stack, the top elements must be popped off and are lost. Due to this
limited access to the information on the stack, a PDA still has some limitations and cannot accept some
other interesting languages.
As shown in the figure, a PDA has three components: an input tape with a read-only head, a finite control,
and a pushdown store.
The input head is read-only and may only move from left to right, one symbol (or cell) at a time. In each
step, the PDA pops the top symbol off the stack; based on this symbol, the input symbol it is currently
reading, and its present state, it can push a sequence of symbols onto the stack, move its read-only head
one cell (or symbol) to the right, and enter a new state, as defined by the transition rules of the PDA.
PDAs are nondeterministic by default. That is, ε-transitions are also allowed, in which the PDA can pop
and push, and change state, without reading the next input symbol or moving its read-only head. Besides
this, there may be multiple options for possible next moves.
Formal Definition: Formally, a PDA M is a 7-tuple M = (Q, Σ, Γ, δ, q0, z0, F)
where,
• Q is a finite set of states,
• Σ is a finite set of input symbols (the input alphabet),
• Γ is a finite set of stack symbols (the stack alphabet),
• δ, the transition function, maps Q × (Σ ∪ {ε}) × Γ to finite subsets of Q × Γ*,
• q0 ∈ Q is the start state,
• z0 ∈ Γ is the initial stack symbol, and
• F ⊆ Q is the set of final or accept states.
Explanation of the transition function, δ:
If, for some a ∈ Σ, δ(q, a, z) = { (p1, α1), (p2, α2), …, (pk, αk) }, this means intuitively that whenever the
PDA is in state q reading input symbol a with z on top of the stack, it can nondeterministically, for any i
(1 ≤ i ≤ k),
• go to state pi,
• pop z off the stack,
• push αi onto the stack (where αi ∈ Γ*; the usual convention is that if αi = X1X2…Xn, then X1 will be at the
top and Xn at the bottom), and
• move the read head right one cell past the current symbol a.
If a = ε, then δ(q, ε, z) = { (p1, α1), (p2, α2), …, (pk, αk) } means intuitively that whenever the PDA is in
state q with z on the top of the stack, regardless of the current input symbol, it can nondeterministically,
for any i (1 ≤ i ≤ k),
• go to state pi,
• pop z off the stack,
• push αi onto the stack, and
• leave its read-only head where it is.
State transition diagram: A PDA can also be depicted by a state transition diagram. The labels on the
arcs indicate both the input and the stack operation. The transition (p, α) ∈ δ(q, a, z) is depicted by an
arc from state q to state p labelled a, z/α.
Final states are indicated by double circles, and the start state is indicated by an arrow to it from nowhere.
Configuration or Instantaneous Description (ID):
A configuration or an instantaneous description (ID) of the PDA at any moment during its computation
is an element of Q × Σ* × Γ* describing the current state, the portion of the input remaining to be read
(i.e. under and to the right of the read head), and the current stack contents. Only these three elements
can affect the computation from that point on and, hence, are parts of the ID.
The start or initial configuration (or ID) on input w is (q0, w, z0). That is, the PDA always starts in
its start state q0, with its read head pointing to the leftmost input symbol and the stack containing
only the start/initial stack symbol, z0.
The "next move relation", written ⊢, describes how the PDA can move from one configuration to
another in one step. Formally,
(q, aw, zβ) ⊢ (p, w, αβ) iff (p, α) ∈ δ(q, a, z),
where 'a' may be ε or an input symbol.
Let I, J, K be IDs of a PDA. We write I ⊢^i K if ID I can become K after exactly i moves. The
relations ⊢^i and ⊢* are defined as follows:
I ⊢^0 I
I ⊢^n J if there exists an ID K such that I ⊢^(n-1) K and K ⊢ J
I ⊢* J if there exists an n ≥ 0 such that I ⊢^n J.
That is, ⊢* is the reflexive, transitive closure of ⊢. We say that I ⊢* J if the ID J follows from the
ID I in zero or more moves.
(Note: the subscript M in ⊢_M can be dropped when the particular PDA M is understood.)
Language accepted by a PDA M
There are two alternative definitions of acceptance, as given below.
1. Acceptance by final state:
Consider the PDA M = (Q, Σ, Γ, δ, q0, z0, F). Informally, the PDA M is said to accept its input w by final
state if it enters any final state in zero or more moves after reading its entire input, starting in the start
configuration on input w.
Formally, we define L(M), the language accepted by final state, to be
L(M) = { w ∈ Σ* | (q0, w, z0) ⊢* (q, ε, γ) for some q ∈ F and γ ∈ Γ* }.
2. Acceptance by empty stack (or null stack): The PDA M accepts its input w by empty stack if, starting in the
start configuration on input w, it ever empties the stack without pushing anything back on after reading the entire
input. Formally, we define N(M), the language accepted by empty stack, to be
N(M) = { w ∈ Σ* | (q0, w, z0) ⊢* (q, ε, ε) for some q ∈ Q }.
Note that the set of final states, F, is irrelevant in this case and we usually let F be the empty set,
i.e. F = ∅.
Example 1: Here is a PDA that accepts the language { a^n b^n | n ≥ 1 }.
M = ({q1, q2, q3, qf}, {a, b}, {a, z}, δ, q1, z, {qf}), where δ consists of the following transitions:
1. δ(q1, a, z) = {(q2, az)}
2. δ(q2, a, a) = {(q2, aa)}
3. δ(q2, b, a) = {(q3, ε)}
4. δ(q3, b, a) = {(q3, ε)}
5. δ(q3, ε, z) = {(qf, z)}
The PDA can also be described by the adjacent transition diagram.
Informally, whenever the PDA M sees an input a in the start state q1 with the start symbol z on the top of
the stack, it pushes a onto the stack and changes state to q2 (to remember that it has seen the first a). In
state q2, if it sees any more a's, it simply pushes them onto the stack. Note that when M is in state q2, the
symbol on the top of the stack can only be a. In state q2, if it sees the first b with a on the top of the stack,
then it needs to start comparing the number of a's and b's, since all the a's at the beginning of the input
have already been pushed onto the stack. It starts this process by popping the a off the top of the stack
and entering state q3 (to remember that the comparison process has begun). In state q3, it expects only
b's in the input (if it sees any more a's in the input, then the input is not of the form a^n b^n); hence there is
no move on input a when it is in state q3. In state q3, it pops an a off the top of the stack for every b in the
input. When it sees the last b in state q3 (i.e. when the input is exhausted), the last a is popped off the
stack and the start symbol z is exposed. This is the only case in which the PDA M (on ε-input) moves to
state qf, which is an accept state.
We can show the computation of the PDA on a given input using the IDs and the next move relation. For
example, the following are the computations on two input strings.
i) Let the input be aabb. We start with the start configuration and proceed to the subsequent IDs using the
transition function defined:
(q1, aabb, z) ⊢ (q2, abb, az)  (using transition 1)
⊢ (q2, bb, aaz)  (using transition 2)
⊢ (q3, b, az)  (using transition 3)
⊢ (q3, ε, z)  (using transition 4)
⊢ (qf, ε, z)  (using transition 5)
qf is a final state. Hence, accept. So the string aabb is rightly accepted by M.
ii) Let the input be aabab.
(q1, aabab, z) ⊢ (q2, abab, az) ⊢ (q2, bab, aaz) ⊢ (q3, ab, az)
No further move is defined at this point.
Hence the PDA gets stuck and the string aabab is not accepted.
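The two computations above can be traced mechanically. The following Python sketch (added for illustration; the state names follow Example 1, and δ is encoded as a dictionary with '' standing for ε) simulates the PDA by exhaustive search over IDs and accepts by final state.

```python
# Sketch: simulate the PDA of Example 1 for { a^n b^n | n >= 1 },
# accepting by final state.  delta maps (state, input, stack_top)
# to a list of (next_state, pushed_string); '' denotes epsilon,
# and pushed strings are written top-first.

delta = {
    ('q1', 'a', 'z'): [('q2', 'az')],   # transition 1
    ('q2', 'a', 'a'): [('q2', 'aa')],   # transition 2
    ('q2', 'b', 'a'): [('q3', '')],     # transition 3
    ('q3', 'b', 'a'): [('q3', '')],     # transition 4
    ('q3', '',  'z'): [('qf', 'z')],    # transition 5 (epsilon move)
}

def accepts(w, start='q1', stack='z', finals={'qf'}):
    """Exhaustive search over reachable IDs (state, unread input, stack)."""
    frontier = [(start, w, stack)]
    seen = set()
    while frontier:
        q, rest, stk = frontier.pop()
        if rest == '' and q in finals:
            return True                 # entire input read, final state
        if (q, rest, stk) in seen or stk == '':
            continue
        seen.add((q, rest, stk))
        top, below = stk[0], stk[1:]
        for p, push in delta.get((q, '', top), []):      # epsilon moves
            frontier.append((p, rest, push + below))
        if rest:                                         # input-consuming moves
            for p, push in delta.get((q, rest[0], top), []):
                frontier.append((p, rest[1:], push + below))
    return False
```

Here accepts('aabb') succeeds along exactly the ID sequence shown above, while accepts('aabab') gets stuck after reaching (q3, ab, az).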
Example 2: We give an example of a PDA M that accepts the set of balanced strings of parentheses [ ]
by empty stack.
The PDA M = ({q}, {[, ]}, {z, [}, δ, q, z, ∅) is given below, where δ is defined as:
1. δ(q, [, z) = {(q, [z)}
2. δ(q, [, [) = {(q, [[)}
3. δ(q, ], [) = {(q, ε)}
4. δ(q, ε, z) = {(q, ε)}
Informally, whenever it sees a [, it pushes [ onto the stack (the first two transitions), and whenever it
sees a ] and the top of the stack symbol is [, it pops the symbol [ off the stack (the third transition).
The fourth transition is used when the input is exhausted, in order to pop z off the stack (to empty the
stack) and accept. Note that there is only one state and no final state. The following is a sequence of
configurations leading to the acceptance of the string [ [ ] [ ] ] [ ]:
(q, [[][]][], z) ⊢ (q, [][]][], [z) ⊢ (q, ][]][], [[z) ⊢ (q, []][], [z) ⊢ (q, ]][], [[z) ⊢ (q, ][], [z)
⊢ (q, [], z) ⊢ (q, ], [z) ⊢ (q, ε, z) ⊢ (q, ε, ε). Accept.
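Because Example 2's PDA has a single state, its empty-stack acceptance can be sketched as a simple loop (Python, added for illustration): the ε-move on z (transition 4) is applied only once the input is exhausted, which is the only point at which it leads to acceptance.

```python
# Sketch: Example 2's one-state PDA for balanced [ ] strings,
# accepting by empty stack.

def accepts_by_empty_stack(w):
    stack = ['z']                    # initial stack symbol z
    for c in w:
        if not stack:
            return False             # stack emptied before input ended
        if c == '[':                 # transitions 1 and 2: push [
            stack.append('[')
        elif c == ']' and stack[-1] == '[':
            stack.pop()              # transition 3: pop the matching [
        else:
            return False             # no move defined: reject
    if stack == ['z']:
        stack.pop()                  # transition 4: pop z on empty input
    return stack == []               # accept iff the stack is empty
```

For instance, accepts_by_empty_stack('[[][]][]') follows the configuration sequence shown above.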
Equivalence of acceptance by final state and empty stack:
It turns out that the two definitions of acceptance of a language by a PDA - acceptance by final state and
acceptance by empty stack - are equivalent, in the sense that if a language can be accepted by empty
stack by some PDA, it can also be accepted by final state by some other PDA, and vice versa. Hence it
doesn't matter which one we use, since each kind of machine can simulate the other. Given any arbitrary
PDA M that accepts a language L by final state or empty stack, we can always construct an equivalent
PDA M' that accepts exactly the same language L. The construction of M' from M and the proof of
equivalence of M and M' are given below.
There are two cases to be considered.
CASE I: PDA M = (Q, Σ, Γ, δ, q0, z0, F) accepts L by final state. Let qf be a new state not in Q.
Consider the PDA M' = (Q ∪ {qf}, Σ, Γ, δ', q0, z0, {qf}), where δ' contains all the transitions of δ, as well
as the following transition:
(qf, z) ∈ δ'(q, ε, z) for every q ∈ F and z ∈ Γ.
It is easy to show that M and M' are equivalent, i.e. L(M) = L(M').
Let w ∈ L(M). Then (q0, w, z0) ⊢*_M (q, ε, γ) for some q ∈ F and γ ∈ Γ*.
Then (q0, w, z0) ⊢*_M' (q, ε, γ) ⊢_M' (qf, ε, γ).
Thus M' accepts w.
Conversely, let M' accept w, i.e. w ∈ L(M'). Then (q0, w, z0) ⊢*_M' (qf, ε, γ) for some γ ∈ Γ*.
M' inherits all other moves except the last one from M.
Hence (q0, w, z0) ⊢*_M (q, ε, γ) for some q ∈ F.
Thus M accepts w. Informally, on any input M' simulates all the moves of M and enters its own final state
qf whenever M enters any one of its final states in F. Thus M' accepts a string w iff M accepts it.
CASE II: PDA M = (Q, Σ, Γ, δ, q0, z0, ∅) accepts L by empty stack.
We will construct M' from M in such a way that M' simulates M and detects when M empties its stack.
M' enters its final state when and only when M empties its stack. Thus M' will accept a string w iff M
accepts it.
Let M' = (Q ∪ {q0', qf}, Σ, Γ ∪ {X}, δ', q0', X, {qf}), where δ' contains all the transitions of δ, as well as
the following two transitions:
1. δ'(q0', ε, X) = {(q0, z0X)} and
2. δ'(q, ε, X) = {(qf, ε)} for every q ∈ Q.
Transition 1 causes M' to enter the initial configuration of M, except that M' will have its own
bottom-of-stack marker X, which is below the symbols of M's stack. From this point onward M' will
simulate every move of M, since all the transitions of M are also in M'.
If M ever empties its stack, then M' when simulating M will empty its stack except for the symbol X at the
bottom.
At this point, M' will enter its final state qf by using transition rule 2, thereby (correctly) accepting the
input. We will prove that M and M' are equivalent.
Let M accept w. Then
(q0, w, z0) ⊢*_M (q, ε, ε) for some q ∈ Q.
But then
(q0', w, X) ⊢_M' (q0, w, z0X)  (by transition rule 1)
⊢*_M' (q, ε, X)  (since M' includes all the moves of M)
⊢_M' (qf, ε, ε)  (by transition rule 2).
Hence, M' also accepts w. Conversely, let M' accept w.
Then (q0', w, X) ⊢_M' (q0, w, z0X) ⊢*_M' (q, ε, X) ⊢_M' (qf, ε, ε) for some q ∈ Q.
Every move in the sequence (q0, w, z0X) ⊢*_M' (q, ε, X) was taken from M.
Hence, M starting with its initial configuration will eventually empty its stack and accept the input, i.e.
(q0, w, z0) ⊢*_M (q, ε, ε).
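The CASE II construction can be written out as a table transformation. The sketch below (Python, added for illustration; the names q0p, qf and X for q0', qf and the new bottom-of-stack marker are our own) takes the transition table of an empty-stack PDA M and produces the table of the equivalent final-state PDA M'.

```python
# Sketch of the CASE II construction: from a PDA M accepting by empty
# stack, build M' accepting by final state.  delta maps
# (state, input, top) -> list of (next_state, push_string); push strings
# are written top-first, '' denotes epsilon.

def to_final_state_pda(delta, states, q0, z0):
    new_delta = {k: list(v) for k, v in delta.items()}   # M' inherits M's moves
    # rule 1: (q0', eps, X) -> (q0, z0 X): enter M's start configuration
    new_delta[('q0p', '', 'X')] = [(q0, z0 + 'X')]
    # rule 2: (q, eps, X) -> (qf, eps) for every q in Q: X exposed means
    # M has emptied its stack, so M' accepts
    for q in states:
        new_delta.setdefault((q, '', 'X'), []).append(('qf', ''))
    return new_delta, 'q0p', 'X', {'qf'}
```

Applying it to the transition table of Example 2, for instance, yields a PDA that accepts the balanced-bracket language by final state.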
Equivalence of PDA’s and CFG’s:
We will now show that pushdown automata and context-free grammars are equivalent in expressive
power, that is, the languages accepted by PDAs are exactly the context-free languages. To show this, we
have to prove each of the following:
i) Given any arbitrary CFG G, there exists some PDA M that accepts exactly the same
language generated by G. ii) Given any arbitrary PDA M, there exists a CFG G that
generates exactly the same language accepted by M.
(i) CFG to PDA
We will first prove the first part, i.e. we want to show how to convert a given CFG to an equivalent PDA.
Let the given CFG be G = (N, Σ, P, S). Without loss of generality we can assume that G is in Greibach
Normal Form, i.e. all productions of G are of the form A → aα, where a ∈ Σ and α ∈ N*.
From the given CFG G we now construct an equivalent PDA M that accepts by empty stack. Note that
there is only one state in M. Let
M = ({q}, Σ, N, δ, q, S, ∅), where
• q is the only state,
• Σ is the input alphabet,
• N is the stack alphabet,
• q is the start state,
• S is the start/initial stack symbol, and δ, the transition relation, is defined as follows:
For each production A → aα in P, (q, α) ∈ δ(q, a, A).
We now want to show that M and G are equivalent, i.e. L(G) = N(M); that is, for any w ∈ Σ*,
w ∈ L(G) iff w ∈ N(M).
If w ∈ L(G), then by definition of L(G) there must be a leftmost derivation starting with S and deriving w,
i.e. S ⇒* w.
Again, w ∈ N(M) iff (q, w, S) ⊢* (q, ε, ε). Therefore we need to show that, for any w ∈ Σ*,
S ⇒* w iff (q, w, S) ⊢* (q, ε, ε).
But we will prove a more general result, as given in the following lemma. Replacing A by S (the start
symbol) and setting α = ε gives the required proof.
Lemma: For any x ∈ Σ*, A ∈ N and α ∈ N*,
A ⇒^n xα via a leftmost derivation iff (q, x, A) ⊢^n (q, ε, α).
Proof: The proof is by induction on n.
Basis: n = 0.
A ⇒^0 xα iff x = ε and α = A, iff (q, x, A) ⊢^0 (q, ε, α).
Induction Step: Assume the statement for derivations and computations of fewer than n steps.
First, assume that A ⇒^n xα via a leftmost derivation. Let the last production applied in the derivation be
B → bβ. Then, for some y ∈ Σ* and γ ∈ N*,
A ⇒^(n-1) yBγ ⇒ ybβγ = xα,
where x = yb and α = βγ.
Now, by the induction hypothesis, we get
(q, y, A) ⊢^(n-1) (q, ε, Bγ). ……(1)
Again, by the construction of M, we get (q, β) ∈ δ(q, b, B),
so, from (1), we get
(q, yb, A) ⊢^(n-1) (q, b, Bγ) ⊢ (q, ε, βγ), i.e. (q, x, A) ⊢^n (q, ε, α), since x = yb and α = βγ.
Conversely, assume that (q, x, A) ⊢^n (q, ε, α), and let (q, β) ∈ δ(q, b, B) be the transition used in the last
move. Then, for some y ∈ Σ* and γ ∈ N*,
(q, x, A) ⊢^(n-1) (q, b, Bγ) ⊢ (q, ε, βγ),
where x = yb and α = βγ.
Now, by the induction hypothesis, we get A ⇒^(n-1) yBγ via a leftmost derivation.
Again, by the construction of M, B → bβ must be a production of G [since (q, β) ∈ δ(q, b, B)].
Applying this production to the sentential form yBγ, we get
A ⇒^(n-1) yBγ ⇒ ybβγ = xα,
i.e. A ⇒^n xα via a leftmost derivation.
Hence the proof.
Example: Consider the CFG G in GNF:
S → aAB
A → a | aA
B → a | bB
The one-state PDA M equivalent to G is shown below. For convenience, a production of G and the
corresponding transition in M are marked by the same number.
(1) S → aAB : δ(q, a, S) ∋ (q, AB)
(2) A → a : δ(q, a, A) ∋ (q, ε)
(3) A → aA : δ(q, a, A) ∋ (q, A)
(4) B → a : δ(q, a, B) ∋ (q, ε)
(5) B → bB : δ(q, b, B) ∋ (q, B)
We have used the same construction discussed earlier.
Some Useful Explanations:
Consider the moves of M on input aaaba leading to acceptance of the string.
Steps
1. (q, aaaba, S)
2. ⊢ (q, aaba, AB)  (using transition 1)
3. ⊢ (q, aba, AB)  (using transition 3)
4. ⊢ (q, ba, B)  (using transition 2)
5. ⊢ (q, a, B)  (using transition 5)
6. ⊢ (q, ε, ε)  (using transition 4). Accept by empty stack.
Note: the numbers here show the transition rule applied at every step.
Now consider the derivation of the same string under grammar G. Once again, the production used at
every step is shown by its number.
S ⇒(1) aAB ⇒(3) aaAB ⇒(2) aaaB ⇒(5) aaabB ⇒(4) aaaba
Observations:
• There is a one-to-one correspondence between the sequence of moves of the PDA M and the
derivation sequence under the CFG G for the same input string, in the sense that the number of
steps is the same in both cases and the transition rule corresponding to the same production is
used at every step (as shown by the matching numbers).
• Considering the moves of the PDA and the derivation under G together, it is also observed that at
every step the input read so far and the stack contents together are exactly identical to the
corresponding sentential form, i.e.
<what is read><stack> = <sentential form>
Say, at step 2: read so far = a, stack = AB, sentential form = aAB. From this
property we claim that
(q, x, S) ⊢* (q, ε, α) iff S ⇒* xα. If the claim is
true, then apply it with x = w and α = ε, and we get
(q, w, S) ⊢* (q, ε, ε) iff S ⇒* w, i.e. w ∈ N(M) iff w ∈ L(G) (by definition).
Thus N(M) = L(G) as desired. Note that we have already proved a more general version of the claim.
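The part (i) construction is mechanical enough to sketch in code. The following Python fragment (added for illustration; function and variable names are our own) builds the one-state PDA from the example GNF grammar and replays its empty-stack computation.

```python
# Sketch of the part (i) construction: each GNF production A -> a alpha
# becomes a transition (q, alpha) in delta(q, a, A) of a one-state PDA
# accepting by empty stack.  Stack strings are written top-first.

gnf = {                 # the example grammar: S -> aAB, A -> a | aA, B -> a | bB
    'S': ['aAB'],
    'A': ['a', 'aA'],
    'B': ['a', 'bB'],
}

def gnf_to_pda(productions):
    delta = {}
    for head, bodies in productions.items():
        for body in bodies:
            a, alpha = body[0], body[1:]    # leading terminal, then nonterminals
            delta.setdefault(('q', a, head), []).append(('q', alpha))
    return delta

def accepts_empty_stack(delta, w, start_symbol='S'):
    """Exhaustive search over IDs (unread input, stack)."""
    frontier, seen = [(w, start_symbol)], set()
    while frontier:
        rest, stk = frontier.pop()
        if rest == '' and stk == '':
            return True                     # input consumed, stack empty
        if (rest, stk) in seen or not rest or not stk:
            continue
        seen.add((rest, stk))
        for _, alpha in delta.get(('q', rest[0], stk[0]), []):
            frontier.append((rest[1:], alpha + stk[1:]))
    return False
```

Here accepts_empty_stack(gnf_to_pda(gnf), 'aaaba') retraces the six-ID computation shown above.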
PDA and CFG:
We now want to show that for every PDA M that accepts by empty stack, there is a CFG G such that
L(G) = N(M).
We first see whether the "reverse of the construction" that was used in part (i) can be used here to
construct an equivalent CFG from any PDA M.
It can be shown that this reverse construction works only for single-state PDAs. That is, for every
one-state PDA M there is a CFG G such that L(G) = N(M): for every move (q, α) ∈ δ(q, a, A) of the
PDA M we introduce a production A → aα in the grammar G = (N, Σ, P, S), where N = Γ and S = z0.
We can now apply the proof in part (i) in the reverse direction to show that L(G) = N(M).
But the reverse construction does not work for PDAs with more than one state. For example, consider a
PDA M with more than one state accepting some language L by empty stack, and construct a CFG from it
using the "reverse" construction (with N = Γ and S = z0), turning each transition of M into a corresponding
production of G. We can derive strings like aabaa which are in the language; but under this grammar we
can also derive some strings which are not in the language.
Therefore, to complete the proof of part (ii) we would also need to prove the following claim.
Claim: For every PDA M there is some one-state PDA M' such that N(M') = N(M).
It is quite possible to prove the above claim. But here we will adopt a different approach. We start with
any arbitrary PDA M that accepts by empty stack and directly construct an equivalent CFG G.
PDA to CFG
We want to construct a CFG G to simulate any arbitrary PDA M with one or more states. Without loss of
generality we can assume that the PDA M accepts by empty stack.
The idea is to use nonterminals of the form <pAq>: such a nonterminal will derive w whenever the PDA
M, in state p with A on top of the stack, can consume w, arrive at state q and remove A from the stack.
That is, we would like to show, in general, that <pAq> ⇒* w iff the PDA M, when started from state p with
A on the top of the stack, will finish processing w, arrive at state q and remove A from the stack. We are
now ready to give the construction of an equivalent CFG G from a given PDA M. We need to introduce
two kinds of productions in the grammar, as given below. The reason for the introduction of the first kind
of production will be justified at a later point. The introduction of the second type of production has been
justified in the above discussion.
Let M = (Q, Σ, Γ, δ, q0, z0, ∅) be a PDA. We construct from M an equivalent CFG
G = (N ∪ {S}, Σ, P, S),
where
• N is the set of nonterminals of the form <pAq> for p, q ∈ Q and A ∈ Γ, S is a new start symbol, and P
contains the following two kinds of productions:
1. S → <q0 z0 q>, for every q ∈ Q.
2. If (q1, B1B2…Bn) ∈ δ(q, a, A), then for every choice of the sequence of states q2, q3, …, qn+1 ∈ Q,
include the following production:
<q A qn+1> → a <q1 B1 q2> <q2 B2 q3> … <qn Bn qn+1>.
If n = 0, then the production is <q A q1> → a.
For the whole exercise to be meaningful we want <qAp> ⇒* w to mean that there is a sequence of
transitions (for PDA M), starting in state q and ending in state p, during which the PDA M consumes the
input string w and removes A from the stack (and, of course, all other symbols pushed onto the stack in
A's place, and so on).
That is, we want to claim that
<qAp> ⇒* w iff (q, w, A) ⊢* (p, ε, ε).
If this claim is true, then S ⇒ <q0 z0 q> ⇒* w iff (q0, w, z0) ⊢* (q, ε, ε) for some q ∈ Q, i.e. S ⇒* w
iff the PDA M accepts w by empty stack, or L(G) = N(M).
Now, to show that the above construction of CFG G from any PDA M works, we need to prove the
proposed claim.
Note: At this point, the justification for the introduction of the first type of production (of the form
S → <q0 z0 q>) in the CFG G is quite clear. This helps us derive a string from the start symbol of the grammar.
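Both kinds of productions can be generated mechanically from a transition table. The Python sketch below (added for illustration; tuples (p, A, q) play the role of <pAq>, 'S' is the new start symbol, and the table encoding with '' for ε is our own) enumerates exactly the productions of the construction.

```python
# Sketch of the PDA-to-CFG construction.  delta maps (state, input, top)
# to a list of (next_state, pushed_string), push strings written top-first.
from itertools import product

def pda_to_cfg(delta, states, q0, z0):
    prods = []                                    # list of (head, body) pairs
    for q in states:                              # type 1: S -> <q0 z0 q>
        prods.append(('S', [(q0, z0, q)]))
    for (q, a, A), moves in delta.items():        # type 2
        for (q1, pushed) in moves:
            n = len(pushed)
            for qs in product(states, repeat=n):  # choices of q2 ... q_{n+1}
                chain = (q1,) + qs                # q1, q2, ..., q_{n+1}
                body = [a] if a else []           # '' (epsilon) contributes nothing
                body += [(chain[i], pushed[i], chain[i + 1]) for i in range(n)]
                prods.append(((q, A, chain[n]), body))
    return prods
```

Applied to the one-state PDA of Example 2, for instance, it yields the five productions of the corresponding bracket grammar.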
Proof of the claim: <qAp> ⇒* w iff (q, w, A) ⊢* (p, ε, ε), for p, q ∈ Q, A ∈ Γ, w ∈ Σ*.
The proof is by induction on the number of steps in a derivation of G (which, of course, is equal to the
number of moves taken by M). Let the number of steps taken be n.
The proof consists of two parts: the 'if' part and the 'only if' part. First, consider the 'if' part:
if (q, w, A) ⊢^n (p, ε, ε), then <qAp> ⇒* w.
Basis: n = 1.
Then (q, w, A) ⊢ (p, ε, ε). In this case, it is clear that w ∈ Σ ∪ {ε} and (p, ε) ∈ δ(q, w, A). Hence, by
construction, <qAp> → w is a production of G.
Then <qAp> ⇒ w.
Inductive Hypothesis: For all 1 ≤ k < n, if (q, w, A) ⊢^k (p, ε, ε), then <qAp> ⇒* w.
Inductive Step:
For n > 1, let w = ax for some a ∈ Σ ∪ {ε} and x ∈ Σ*, and consider the first move of the PDA M, which
uses the general transition (q1, B1B2…Bm) ∈ δ(q, a, A):
(q, ax, A) ⊢ (q1, x, B1B2…Bm) ⊢^(n-1) (p, ε, ε).
Now M must remove B1B2…Bm from the stack while consuming x in the remaining n - 1 moves.
Let x1 be the prefix of x that M has consumed when B2 first appears at the top of the stack. Then there
must exist a sequence of states q2, q3, …, qm+1 in M (as per the construction), with qm+1 = p, such that
(q1, x1, B1) ⊢* (q2, ε, ε)  [this step implies (q1, x1x2…xm, B1B2…Bm) ⊢* (q2, x2…xm, B2…Bm)]
(q2, x2, B2) ⊢* (q3, ε, ε)  [this step implies (q2, x2…xm, B2…Bm) ⊢* (q3, x3…xm, B3…Bm)]
...
(qm, xm, Bm) ⊢* (qm+1, ε, ε), where x = x1x2…xm.
[Note: Each of these step sequences takes at most n - 1 moves, because the total number of moves
required is n - 1.]
That is, in general, (qi, xi, Bi) ⊢* (qi+1, ε, ε) in at most n - 1 moves, for 1 ≤ i ≤ m.
So, applying the inductive hypothesis, we get
<qi Bi qi+1> ⇒* xi, for 1 ≤ i ≤ m.
But corresponding to the original move (q1, B1B2…Bm) ∈ δ(q, a, A) in M we have added the following
production in G:
<q A qm+1> → a <q1 B1 q2> <q2 B2 q3> … <qm Bm qm+1>.
Therefore,
<qAp> = <q A qm+1> ⇒ a <q1 B1 q2> … <qm Bm qm+1> ⇒* a x1x2…xm = ax = w.
The 'only if' part (if <qAp> ⇒* w, then (q, w, A) ⊢* (p, ε, ε)) is proved similarly, by induction on the
number of steps in a leftmost derivation of w from <qAp>.
Deterministic PDA:
Regular Languages and DPDA’s: The DPDA’s accept a class of languages that lies in between the regular
languages and the CFL’s.
Deterministic Pushdown Automata (DPDA) and Deterministic Context-free Languages (DCFLs)
Pushdown automata as we have already defined and discussed them are nondeterministic by default; that is, there may be
two or more moves involving the same combination of state, input symbol, and top of the stack, and again, for
some state and top of the stack, the machine may either read an input symbol or make an ε-transition (without
consuming any input).
In a deterministic PDA, there is never a choice of move in any situation. This is handled by preventing the above-mentioned
two cases, as described in the definition below.
Definition: Let M = (Q, Σ, Γ, δ, q0, z0, F) be a PDA. Then M is deterministic if and only if both the following conditions
are satisfied:
1. δ(q, a, z) has at most one element, for any q ∈ Q, a ∈ Σ ∪ {ε} and z ∈ Γ (this condition prevents
multiple choices for any combination of (q, a, z)).
2. If δ(q, ε, z) ≠ ∅, then δ(q, a, z) = ∅ for every a ∈ Σ
(this condition prevents the possibility of a choice between a move with or without an input symbol).
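The two conditions translate directly into a check on a transition table. A minimal Python sketch (added for illustration; the table maps (state, input, stack-top) to a list of moves, with '' standing for ε):

```python
# Sketch: check the two determinism conditions of a DPDA on a
# transition table delta mapping (state, input, top) -> list of moves.

def is_deterministic(delta):
    for (q, a, z), moves in delta.items():
        if len(moves) > 1:                     # condition 1: at most one move
            return False
        if a != '' and delta.get((q, '', z)):  # condition 2: no choice between
            return False                       # an eps-move and an input move
    return True
```

For instance, the PDA of Example 1 passes this check, so it is in fact a deterministic PDA.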
Empty Production Removal
The productions of context-free grammars can be coerced into a variety of forms without
affecting the expressive power of the grammars. If the empty string does not belong to a
language, then there is a way to eliminate the productions of the form A→ λ from the
grammar. If the empty string belongs to a language, then we can eliminate λ from all
productions save for the single production S → λ. In this case we can also eliminate any
occurrences of S from the right-hand side of productions.
Procedure to find a CFG without empty productions
(i) Find the set of nullable variables: a variable A is nullable if A ⇒* λ. Every variable with a production
A → λ is nullable, and A is also nullable if it has a production whose right-hand side consists entirely of
nullable variables; repeat until no new nullable variable is found.
(ii) For every production, add the new productions obtained by deleting nullable variables from the
right-hand side in all possible combinations, and then remove all λ-productions (retaining S → λ only if λ
belongs to the language).
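The two steps above can be sketched directly in Python (an illustration with our own encoding: a grammar is a dict from a variable to a list of right-hand-side strings, with '' standing for λ):

```python
# Sketch of lambda-production removal: compute the nullable variables,
# then rewrite each production with every combination of nullable
# symbols deleted from its right-hand side.
from itertools import combinations

def remove_lambda(grammar):
    nullable = {A for A, bodies in grammar.items() if '' in bodies}
    changed = True
    while changed:                    # A is nullable if some body consists
        changed = False               # entirely of nullable variables
        for A, bodies in grammar.items():
            if A not in nullable and any(
                    b and all(s in nullable for s in b) for b in bodies):
                nullable.add(A)
                changed = True
    new = {A: set() for A in grammar}
    for A, bodies in grammar.items():
        for b in bodies:
            slots = [i for i, s in enumerate(b) if s in nullable]
            for r in range(len(slots) + 1):
                for drop in combinations(slots, r):
                    nb = ''.join(s for i, s in enumerate(b) if i not in drop)
                    if nb:            # drop all lambda-productions
                        new[A].add(nb)
    # (if lambda belongs to the language, S -> lambda must be re-added by hand)
    return {A: sorted(bs) for A, bs in new.items()}
```

For example, remove_lambda({'S': ['aSb', '']}) yields S → aSb | ab, which generates the same language minus λ.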
Unit production removal
Left Recursion Removal
NORMAL FORMS
Two kinds of normal forms, viz. Chomsky Normal Form (CNF) and Greibach Normal Form (GNF), are
considered here.
Chomsky Normal Form (CNF)
Any context-free language L without any λ-production is generated by a grammar in
which the productions are of the form A → BC or A → a, where A, B, C ∈ VN and a ∈ VT.
Procedure to find Equivalent Grammar in CNF
(i) Eliminate the unit productions, and λ-productions if any,
(ii) Eliminate the terminals on the right hand side of length two or more.
(iii) Restrict the number of variables on the right hand side of productions to two.
Proof:
For Step (i): Apply the following theorem: “Every context-free language can be generated by
a grammar with no useless symbols and no unit productions”.
At the end of this step the RHS of any production has a single terminal or two or more symbols.
Let us assume the equivalent resulting grammar is G = (VN, VT, P, S).
For Step (ii): Consider any production of the form A → x1x2…xn, n ≥ 2. If xi is a terminal, say a,
introduce a new variable Ca with the production Ca → a and replace xi by Ca, so that every right-hand
side of length two or more consists only of variables.
Example
Obtain a grammar in Chomsky Normal Form (CNF) equivalent to the grammar G with
productions P given
Solution
Pumping Lemma for CFG
A “Pumping Lemma” is a theorem used to show that, if certain strings belong to a language,
then certain other strings must also belong to the language. Let us discuss a Pumping Lemma
for CFL. We will show that , if L is a context-free language, then strings of L that are at least
‘m’ symbols long can be “pumped” to produce additional strings in L. The value of ‘m’
depends on the particular language. Let L be an infinite context-free language. Then there is
some positive integer ‘m’ such that, if S is a string of L of Length at least ‘m’, then
(i) S = uvwxy (for some u, v, w, x, y),
(ii) |vwx| ≤ m,
(iii) |vx| ≥ 1, and
(iv) u v^i w x^i y ∈ L, for all non-negative values of i.
It should be understood that
(i) If S is sufficiently long string, then there are two substrings, v and x, somewhere in S.
There is stuff (u) before v, stuff (w) between v and x, and stuff (y), after x.
(ii) The stuff between v and x won’t be too long, because | vwx | can’t be larger than m.
(iii) Substrings v and x won’t both be empty, though either one could be.
(iv) If we duplicate substring v, some number (i) of times, and duplicate x the same number
of times, the resultant string will also be in L.
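Property (iv) can be checked concretely. Below is a small Python sketch (added for illustration): for the context-free language { a^n b^n | n ≥ 1 } and the string S = aaaabbbb, the decomposition u = aaa, v = a, w = λ, x = b, y = bbb satisfies conditions (ii) and (iii), and every pumped string u v^i w x^i y stays in the language.

```python
# Sketch: pumping the string S = aaaabbbb of L = { a^n b^n | n >= 1 }
# with the decomposition u = aaa, v = a, w = '', x = b, y = bbb.

def in_anbn(s):
    """Membership test for { a^n b^n | n >= 1 }."""
    n = len(s) // 2
    return len(s) % 2 == 0 and n >= 1 and s == 'a' * n + 'b' * n

u, v, w, x, y = 'aaa', 'a', '', 'b', 'bbb'
pumped = [u + v * i + w + x * i + y for i in range(5)]
results = [in_anbn(s) for s in pumped]   # every pumped string is in L
```

Pumping v and x in step duplicates a's and b's equally, which is why the strings stay in the language.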
Definitions
A variable is useful if it occurs in the derivation of some string. This requires that
(a) the variable occurs in some sentential form (you can get to the variable if you start from S),
and (b) a string of terminals can be derived from the sentential form (the variable is not a “dead
end”). A variable is “recursive” if it can generate a string containing itself. For example,
variable A is recursive if A ⇒* xAy for some strings x and y (not both empty).
Proof of Pumping Lemma
(a) Suppose we have a CFL given by L. Then there is some context-free grammar G that
generates L. Suppose
(i) L is infinite; hence there is no proper upper bound on the length of strings belonging to
L,
(ii) L does not contain λ, and
(iii) G has no unit productions or λ-productions.
There are only a finite number of variables in a grammar and the productions for each variable
have finite lengths. The only way that a grammar can generate arbitrarily long strings is if
one or more variables is both useful and recursive. Suppose no variable is recursive. Since
the start symbol is non recursive, it must be defined only in terms of terminals and other
variables. Then since those variables are non recursive, they have to be defined in terms of
terminals and still other variables and so on.
After a while we run out of “other variables” while the generated string is still finite. Therefore
there is an upper bound on the length of the strings which can be generated from the start symbol.
This contradicts our statement that the language is infinite.
Hence, our assumption that no variable is recursive must be incorrect.
(b) Let us consider a string X belonging to L. If X is sufficiently long, then the derivation of X
must have involved recursive use of some variable A. Since A was used in the derivation, the
derivation should have started as
S ⇒* uAy ⇒* uvAxy ⇒* uvwxy = X.
Usage of Pumping Lemma
To show that a language L is not context-free, we assume that L is context-free, choose a sufficiently
long string S in L, and show that every decomposition S = uvwxy satisfying conditions (ii) and (iii) can be
pumped to produce a string outside L. Hence our original assumption, that L is context-free, must be
false; the language L is not context-free.
Example
Check whether the language given by L = { a^m b^m c^n : m ≤ n ≤ 2m } is a CFL or not.
Solution
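For this particular language the pumping-lemma argument can even be checked mechanically for a small constant. The Python sketch below (added for illustration; the helper names are our own) tests every decomposition S = uvwxy of S = a^4 b^4 c^4 with |vwx| ≤ 4 and |vx| ≥ 1, and finds that each one already leaves L when pumped with i = 0 or i = 2, which supports the conclusion that L is not context-free.

```python
# Sketch: brute-force pumping check for L = { a^m b^m c^n : m <= n <= 2m }.
# If L were context-free, some decomposition of s = a^4 b^4 c^4 would pump;
# the search shows that none does (testing i = 0 and i = 2 is enough to
# refute "in L for all i").

def in_L(s):
    """Membership test for { a^m b^m c^n : m <= n <= 2m, m >= 1 }."""
    m = 0
    while m < len(s) and s[m] == 'a':
        m += 1
    if m == 0 or s[m:2 * m] != 'b' * m:
        return False
    rest = s[2 * m:]
    return rest == 'c' * len(rest) and m <= len(rest) <= 2 * m

def some_decomposition_pumps(s, const):
    """Is there u,v,w,x,y with |vwx| <= const and |vx| >= 1 that pumps?"""
    for i in range(len(s) + 1):
        for j in range(i + 1, min(i + const, len(s)) + 1):   # vwx = s[i:j]
            for p in range(i, j + 1):                        # v = s[i:p]
                for r in range(p, j + 1):                    # w = s[p:r], x = s[r:j]
                    u, v, w, x, y = s[:i], s[i:p], s[p:r], s[r:j], s[j:]
                    if v + x == '':
                        continue                             # condition |vx| >= 1
                    if all(in_L(u + v * k + w + x * k + y) for k in (0, 2)):
                        return True
    return False
```

Here some_decomposition_pumps('aaaabbbbcccc', 4) returns False: no decomposition survives, exactly as the pumping-lemma argument predicts.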
Closure properties of CFL – Substitution
Applications of substitution theorem
Reversal
Inverse Homomorphism:
theory of computation notes for school of engineering
theory of computation notes for school of engineering

prevents an FA from accepting many interesting languages such as { a^n b^n : n ≥ 0 }. Although a PDA can store an unbounded amount of information on the stack, its access to that information is limited: it can push an element onto the top of the stack and pop an element off the top of the stack. To read down into the stack, the top elements must be popped off and are lost. Due to this limited access to the information on the stack, a PDA still has some limitations and cannot accept some other interesting languages.

As shown in the figure, a PDA has three components: an input tape with a read-only head, a finite control, and a pushdown store. The input head is read-only and may only move from left to right, one symbol (or cell) at a time. In each step, the PDA pops the top symbol off the stack; based on this symbol, the input symbol it is currently reading, and its present state, it can push a sequence of symbols onto the stack, move its read-only head one cell (or symbol) to the right, and enter a new state, as defined by the transition rules of the PDA.
PDAs are nondeterministic by default. That is, ε-transitions are also allowed, in which the PDA can pop, push, and change state without reading the next input symbol or moving its read-only head. Besides this, there may be multiple options for possible next moves.

Formal Definitions: Formally, a PDA M is a 7-tuple M = (Q, Σ, Γ, δ, q0, z, F) where,
• Q is a finite set of states,
• Σ is a finite set of input symbols (the input alphabet),
• Γ is a finite set of stack symbols (the stack alphabet),
• δ is a transition function from Q × (Σ ∪ {ε}) × Γ to finite subsets of Q × Γ*,
• q0 ∈ Q is the start state,
• z ∈ Γ is the initial stack symbol, and
• F ⊆ Q is the set of final or accept states.

Explanation of the transition function δ: If δ(q, a, z) = { (p1, α1), (p2, α2), ..., (pk, αk) } for some a ∈ Σ, this means intuitively that whenever the PDA is in state q reading input symbol a with z on top of the stack, it can nondeterministically, for any i (1 ≤ i ≤ k),
• go to state pi,
• pop z off the stack,
• push αi onto the stack (the usual convention is that if αi = b1b2...bm, then b1 will be at the top and bm at the bottom), and
• move the read head right one cell past the current symbol a.

If a = ε, then δ(q, ε, z) = { (p1, α1), ..., (pk, αk) } means intuitively that whenever the PDA is in state q with z on the top of the stack, regardless of the current input symbol, it can nondeterministically, for any i (1 ≤ i ≤ k),
• go to state pi,
• pop z off the stack,
• push αi onto the stack, and
• leave its read-only head where it is.

State transition diagram: A PDA can also be depicted by a state transition diagram. The labels on the arcs indicate both the input and the stack operation: the transition (p, α) ∈ δ(q, a, z) is depicted by an arc from q to p labelled a, z/α. Final states are indicated by double circles and the start state is indicated by an arrow to it from nowhere.

Configuration or Instantaneous Description (ID):
A configuration or instantaneous description (ID) of a PDA at any moment during its computation is an element of Q × Σ* × Γ* describing the current state, the portion of the input remaining to be read (i.e. under and to the right of the read head), and the current stack contents. Only these three elements can affect the computation from that point on and, hence, are parts of the ID.

The start or initial configuration (or ID) on input w is (q0, w, z). That is, the PDA always starts in its start state q0, with its read head pointing to the leftmost input symbol and the stack containing only the start/initial stack symbol z.

The "next move relation", written ⊢, describes how the PDA can move from one configuration to another in one step. Formally,

(q, aw, zβ) ⊢ (p, w, αβ) iff (p, α) ∈ δ(q, a, z), where a may be ε or an input symbol.

Let I, J, K be IDs of a PDA. We write I ⊢i K if ID I can become K after exactly i moves. The relations ⊢* and ⊢+ are defined as follows:

I ⊢* J iff I = J, or there exists an ID K such that I ⊢ K and K ⊢* J.
I ⊢+ J iff there exists an ID K such that I ⊢ K and K ⊢* J.

That is, ⊢* is the reflexive, transitive closure of ⊢. We say that I ⊢* J if the ID J follows from the ID I in zero or more moves. (Note: the subscript M in ⊢M can be dropped when the particular PDA M is understood.)

Language accepted by a PDA M: There are two alternative definitions of acceptance, as given below.

1. Acceptance by final state: Consider the PDA M = (Q, Σ, Γ, δ, q0, z, F). Informally, the PDA M is said to accept its input w by final state if it enters any final state in zero or more moves after reading its entire input, starting in the start configuration on input w. Formally, we define L(M), the language accepted by final state, to be

L(M) = { w ∈ Σ* | (q0, w, z) ⊢* (p, ε, γ) for some p ∈ F and γ ∈ Γ* }
2. Acceptance by empty stack (or null stack): The PDA M accepts its input w by empty stack if, starting in the start configuration on input w, it ever empties the stack without pushing anything back on after reading the entire input. Formally, we define N(M), the language accepted by empty stack, to be

N(M) = { w ∈ Σ* | (q0, w, z) ⊢* (p, ε, ε) for some p ∈ Q }

Note that the set of final states F is irrelevant in this case, and we usually let F be the empty set, i.e. F = ∅.

Example 1: Here is a PDA that accepts the language { a^n b^n | n ≥ 1 }. Let M = ({q0, q1, q3, qf}, {a, b}, {a, z}, δ, q0, z, {qf}), where δ consists of the following transitions:

1. δ(q0, a, z) = { (q1, az) }
2. δ(q1, a, a) = { (q1, aa) }
3. δ(q1, b, a) = { (q3, ε) }
4. δ(q3, b, a) = { (q3, ε) }
5. δ(q3, ε, z) = { (qf, ε) }

The PDA can also be described by the adjacent transition diagram.

Informally, whenever the PDA M sees an input a in the start state q0 with the start symbol z on the top of the stack, it pushes a onto the stack and changes state to q1 (to remember that it has seen the first a). In state q1, if it sees any more a's, it simply pushes them onto the stack. Note that when M is in state q1, the symbol on the top of the stack can only be a. In state q1, if it sees the first b with a on the top of the stack, it needs to start comparing the numbers of a's and b's, since all the a's at the beginning of the input have already been pushed onto the stack. It starts this process by popping the a off the top of the stack and enters state q3 (to remember that the comparison process has begun). In state q3, it expects only b's in the input (if it sees any more a's, the input is not of the proper form a^n b^n); hence there is no move defined on input a
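The ID/next-move machinery above can be made concrete with a small simulator. The following sketch (not part of the original notes; the dictionary encoding of δ is an illustrative choice) does a breadth-first search over IDs (state, unread input, stack) using the Example 1 transitions, accepting by final state:

```python
from collections import deque

# Transitions of the a^n b^n PDA of Example 1, written as
# (state, input symbol or '' for an epsilon-move, stack top) -> {(state, push)}.
# The leftmost character of the pushed string ends up on top of the stack.
DELTA = {
    ("q0", "a", "z"): {("q1", "az")},
    ("q1", "a", "a"): {("q1", "aa")},
    ("q1", "b", "a"): {("q3", "")},
    ("q3", "b", "a"): {("q3", "")},
    ("q3", "", "z"): {("qf", "")},
}
FINAL = {"qf"}

def accepts(w, delta=DELTA, start="q0", stack="z", final=FINAL):
    """BFS over instantaneous descriptions (state, unread input, stack)."""
    seen, queue = set(), deque([(start, w, stack)])
    while queue:
        q, rest, st = queue.popleft()
        if (q, rest, st) in seen:
            continue
        seen.add((q, rest, st))
        if not rest and q in final:          # entire input read, final state
            return True
        if st:
            top, below = st[0], st[1:]
            # epsilon-moves: pop/push without consuming input
            for p, push in delta.get((q, "", top), ()):
                queue.append((p, rest, push + below))
            # moves consuming one input symbol
            if rest:
                for p, push in delta.get((q, rest[0], top), ()):
                    queue.append((p, rest[1:], push + below))
    return False

print(accepts("aabb"), accepts("aabab"))  # True False
```

The `seen` set guards against repeating IDs, so cycles of ε-moves cannot loop forever.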
when it is in state q3. In state q3 it pops an a off the top of the stack for every b in the input. When it sees the last b in state q3 (i.e. when the input is exhausted), the last a is popped off the stack and the start symbol z is exposed. This is the only case in which the PDA M (on ε-input) can move to state qf, which is an accept state.

We can show the computation of the PDA on a given input using the IDs and the next-move relation. For example, the following are the computations on two input strings.

i) Let the input be aabb. We start with the start configuration and proceed to the subsequent IDs using the transition function defined above:

(q0, aabb, z) ⊢ (q1, abb, az) (using transition 1)
⊢ (q1, bb, aaz) (using transition 2)
⊢ (q3, b, az) (using transition 3)
⊢ (q3, ε, z) (using transition 4)
⊢ (qf, ε, ε) (using transition 5)

qf is a final state. Hence, accept. So the string aabb is rightly accepted by M.

ii) Let the input be aabab.

(q0, aabab, z) ⊢ (q1, abab, az) ⊢ (q1, bab, aaz) ⊢ (q3, ab, az)

No further move is defined at this point. Hence the PDA gets stuck and the string aabab is not accepted.

Example 2: We give an example of a PDA M that accepts the set of balanced strings of parentheses [ ] by empty stack. The PDA M = ({q}, {[, ]}, {[, z}, δ, q, z, ∅) is given below, where δ is defined as:

δ(q, [, z) = { (q, [z) }
δ(q, [, [) = { (q, [[) }
δ(q, ], [) = { (q, ε) }
δ(q, ε, z) = { (q, ε) }

Informally, whenever it sees a [, it pushes [ onto the stack (the first two transitions), and whenever it sees a ] with [ on top of the stack, it pops the [ off the stack (the third transition).
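Because Example 2's machine has a single state and no nondeterminism on any reachable input, it can be simulated directly with a loop. A minimal sketch (illustrative, not from the notes), accepting by empty stack:

```python
def balanced(w):
    """Direct simulation of the one-state Example 2 PDA that accepts
    balanced strings of [ ] by empty stack."""
    stack = ["z"]                      # initial stack symbol
    for c in w:
        if c == "[":                   # push [ for every opening bracket
            stack.append("[")
        elif c == "]" and stack[-1] == "[":
            stack.pop()                # match found: pop the [
        else:
            return False               # no move defined: the PDA gets stuck
    if stack == ["z"]:
        stack.pop()                    # epsilon-move: pop z, emptying the stack
    return not stack                   # accept iff the stack is empty

print(balanced("[[][]][]"))  # True
```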
The fourth transition is used when the input is exhausted, in order to pop z off the stack (emptying the stack) and accept. Note that there is only one state and no final state. The following is a sequence of configurations leading to the acceptance of the string [ [ ] [ ] ] [ ]:

(q, [[][]][], z) ⊢ (q, [][]][], [z) ⊢ (q, ][]][], [[z) ⊢ (q, []][], [z) ⊢ (q, ]][], [[z) ⊢ (q, ][], [z) ⊢ (q, [], z) ⊢ (q, ], [z) ⊢ (q, ε, z) ⊢ (q, ε, ε)

Equivalence of acceptance by final state and empty stack

It turns out that the two definitions of acceptance of a language by a PDA, acceptance by final state and by empty stack, are equivalent, in the sense that if a language can be accepted by empty stack by some PDA, it can also be accepted by final state by some other PDA, and vice versa. Hence it doesn't matter which one we use, since each kind of machine can simulate the other. Given any arbitrary PDA M that accepts the language L by final state or empty stack, we can always construct an equivalent PDA M′ that accepts exactly the same language L. The construction of M′ from M and the proof of equivalence of M and M′ are given below. There are two cases to be considered.

CASE I: PDA M = (Q, Σ, Γ, δ, q0, z, F) accepts by final state. Let qf be a new state not in Q. Consider the PDA M′ = (Q ∪ {qf}, Σ, Γ, δ′, q0, z, {qf}), where δ′ contains all the transitions of δ, as well as the following transition:

δ′(p, ε, X) ∋ (qf, X) for every p ∈ F and X ∈ Γ.

It is easy to show that M and M′ are equivalent, i.e. L(M) = L(M′). Let w ∈ L(M). Then (q0, w, z) ⊢*M (p, ε, γ) for some p ∈ F and γ ∈ Γ*. Then (q0, w, z) ⊢*M′ (p, ε, γ) ⊢M′ (qf, ε, γ). Thus M′ accepts w. Conversely, let M′ accept w. Then (q0, w, z) ⊢*M′ (p, ε, γ) ⊢M′ (qf, ε, γ) for some p ∈ F, since M′ inherits all other moves except the last one from M. Hence (q0, w, z) ⊢*M (p, ε, γ) for some p ∈ F; thus M accepts w. Informally, on any input M′ simulates all the moves of M and enters its own final state qf whenever M enters any one of its final states in F. Thus M′ accepts a string w iff M accepts it.

CASE II: PDA M accepts by empty stack. We will construct M′ from M in such a way that M′ simulates M and detects when M empties its stack; M′ enters its final state when and only when M empties its stack. Thus M′ will accept a string iff M accepts it.
Let M = (Q, Σ, Γ, δ, q0, z, ∅), let q0′ and qf be new states, and let X be a new stack symbol. Consider M′ = (Q ∪ {q0′, qf}, Σ, Γ ∪ {X}, δ′, q0′, X, {qf}), where δ′ contains all the transitions of δ, as well as the following two transitions:

1. δ′(q0′, ε, X) = { (q0, zX) }, and
2. δ′(q, ε, X) = { (qf, ε) } for every q ∈ Q.

Transition 1 causes M′ to enter the initial configuration of M, except that M′ will have its own bottom-of-stack marker X, which is below the symbols of M's stack. From this point onward M′ simulates every move of M, since all the transitions of M are also in δ′. If M ever empties its stack, then M′, when simulating M, will empty its stack except for the symbol X at the bottom. At this point, M′ enters its final state by using transition rule 2, thereby (correctly) accepting the input.

We will prove that M and M′ are equivalent. Let M accept w. Then (q0, w, z) ⊢*M (q, ε, ε) for some q ∈ Q. But then

(q0′, w, X) ⊢M′ (q0, w, zX) (by transition rule 1)
⊢*M′ (q, ε, X) (since δ′ includes all the moves of M)
⊢M′ (qf, ε, ε) (by transition rule 2)

Hence, M′ also accepts w. Conversely, let M′ accept w. Then (q0′, w, X) ⊢M′ (q0, w, zX) ⊢*M′ (q, ε, X) ⊢M′ (qf, ε, ε) for some q ∈ Q. Every move in the sequence (q0, w, zX) ⊢*M′ (q, ε, X) was taken from M. Hence M, starting with its initial configuration, will eventually empty its stack and accept the input, i.e. (q0, w, z) ⊢*M (q, ε, ε).

Equivalence of PDA's and CFG's: We will now show that pushdown automata and context-free grammars are equivalent in expressive power, that is, the languages accepted by PDAs are exactly the context-free languages. To show this, we have to prove each of the following:

i) Given any arbitrary CFG G, there exists some PDA M that accepts exactly the same language generated by G.
ii) Given any arbitrary PDA M, there exists a CFG G that generates exactly the same language accepted by M.

(i) CFG to PDA. We first prove part (i), i.e. we show how to convert a given CFG to an equivalent PDA. Let the given CFG be G = (N, T, P, S). Without loss of generality we can assume that G is in Greibach
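The Case II construction is purely mechanical: add a new bottom marker, a new start state, and a new final state. A sketch, assuming δ is represented as a dictionary; the primed names and the marker "X" are illustrative:

```python
def empty_stack_to_final_state(delta, states, q0, z):
    """Wrap a PDA accepting by empty stack into one accepting by final
    state (Case II).  delta maps (state, input or '' for epsilon, stack
    top) to a set of (state, push) pairs."""
    new_delta = dict(delta)
    # Transition 1: enter M's start configuration with X kept underneath.
    new_delta[("q0'", "", "X")] = {(q0, z + "X")}
    # Transition 2: M has emptied its own stack exactly when X is on top,
    # so from any old state, move to the new final state and pop X.
    for q in states:
        new_delta[(q, "", "X")] = {("qf'", "")}
    return new_delta, "q0'", "X", {"qf'"}
```

Running the wrapped machine under a final-state simulator accepts exactly the strings the original accepted by empty stack.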
Normal Form, i.e. all productions of G are of the form A → aα, where a ∈ T and α ∈ N*. From the given CFG G we now construct an equivalent PDA M that accepts by empty stack. Note that there is only one state in M. Let M = ({q}, T, N, δ, q, S, ∅), where

• q is the only state,
• T, the set of terminals of G, is the input alphabet,
• N, the set of nonterminals of G, is the stack alphabet,
• q is the start state,
• S, the start symbol of G, is the start/initial stack symbol, and
• δ, the transition relation, is defined as follows: for each production A → aα in P, (q, α) ∈ δ(q, a, A).

We now want to show that M and G are equivalent, i.e. L(G) = N(M); that is, for any w ∈ T*, w ∈ L(G) iff w ∈ N(M). If w ∈ L(G), then by definition of L(G) there must be a leftmost derivation starting with S and deriving w, i.e. S ⇒*lm w. Again, w ∈ N(M) iff (q, w, S) ⊢* (q, ε, ε). Therefore we need to show that, for any w ∈ T*,

S ⇒*lm w iff (q, w, S) ⊢* (q, ε, ε).

But we will prove a more general result, as given in the following lemma. Replacing A by S (the start symbol) and α by ε gives the required proof.

Lemma: For any x, y ∈ T*, A ∈ N and α ∈ N*, A ⇒n xα via a leftmost derivation iff (q, xy, A) ⊢n (q, y, α).

Proof: The proof is by induction on n.

Basis: n = 0. A ⇒0 xα iff x = ε and α = A, iff (q, y, A) ⊢0 (q, y, A).
Induction Step: First, assume that A ⇒n xα via a leftmost derivation, for n ≥ 1, and let the last production applied in the derivation be B → bβ, with b ∈ T ∪ {ε}. Then, for some x′ ∈ T* and γ ∈ N*,

A ⇒(n−1) x′Bγ ⇒ x′bβγ = xα

where x = x′b and α = βγ. Now by the induction hypothesis, we get

(q, x′by, A) ⊢(n−1) (q, by, Bγ) .............................................................(1)

Again, by the construction of M, we get (q, β) ∈ δ(q, b, B). So, from (1), we get

(q, x′by, A) ⊢(n−1) (q, by, Bγ) ⊢ (q, y, βγ)

i.e. (q, xy, A) ⊢n (q, y, α), since x = x′b and α = βγ.

Conversely, assume that (q, xy, A) ⊢n (q, y, α), and let (q, β) ∈ δ(q, b, B) be the transition used in the last move. Then, for some x′ ∈ T* and γ ∈ N*,

(q, x′by, A) ⊢(n−1) (q, by, Bγ) ⊢ (q, y, βγ)

where x = x′b and α = βγ. Now, by the induction hypothesis, we get A ⇒(n−1) x′Bγ via a leftmost derivation. Again, by the construction of M, B → bβ must be a production of G [since (q, β) ∈ δ(q, b, B)]. Applying this production to the sentential form x′Bγ, we get

A ⇒(n−1) x′Bγ ⇒ x′bβγ = xα
i.e. A ⇒n xα via a leftmost derivation. Hence the proof.

Example: Consider the CFG G in GNF with productions

(1) S → aAB
(2) A → a
(3) A → aA
(4) B → a
(5) B → bB

The one-state PDA M equivalent to G is shown below. For convenience, a production of G and the corresponding transition in M are marked by the same number. We have used the same construction discussed earlier:

(1) δ(q, a, S) = { (q, AB) }
(2), (3) δ(q, a, A) = { (q, ε), (q, A) }
(4) δ(q, a, B) = { (q, ε) }
(5) δ(q, b, B) = { (q, B) }

Some Useful Explanations: Consider the moves of M on input aaaba leading to acceptance of the string.

Steps
1. (q, aaaba, S) ⊢ (q, aaba, AB) (transition 1)
2. ⊢ (q, aba, AB) (transition 3)
3. ⊢ (q, ba, B) (transition 2)
4. ⊢ (q, a, B) (transition 5)
5. ⊢ (q, ε, ε) (transition 4)

Accept by empty stack. (The numbers show the transition rule applied at every step.)

Now consider the derivation of the same string under grammar G. Once again, the production used at every step is shown:

S ⇒ aAB (production 1) ⇒ aaAB (production 3) ⇒ aaaB (production 2) ⇒ aaabB (production 5) ⇒ aaaba (production 4)

Observations:
• There is a one-to-one correspondence between the sequence of moves of the PDA M and the derivation sequence under the CFG G for the same input string, in the sense that the number of
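This one-state machine can be exercised directly. A sketch (illustrative, not from the notes) reusing the BFS idea from Example 1, but accepting by empty stack; since the machine is built from a GNF grammar it has no ε-moves, so every move consumes one input symbol:

```python
from collections import deque

# One-state PDA built from the GNF grammar S->aAB, A->a|aA, B->a|bB:
# each production X -> c alpha becomes a transition (q, c, X) -> (q, alpha).
DELTA = {
    ("q", "a", "S"): {("q", "AB")},
    ("q", "a", "A"): {("q", ""), ("q", "A")},
    ("q", "a", "B"): {("q", "")},
    ("q", "b", "B"): {("q", "B")},
}

def accepts_empty_stack(w, delta=DELTA, start="q", stack="S"):
    """Accept iff some computation reads all of w and empties the stack."""
    seen, queue = set(), deque([(start, w, stack)])
    while queue:
        q, rest, st = queue.popleft()
        if (q, rest, st) in seen:
            continue
        seen.add((q, rest, st))
        if not rest and not st:          # input consumed, stack empty
            return True
        if rest and st:                  # no epsilon-moves in this machine
            for p, push in delta.get((q, rest[0], st[0]), ()):
                queue.append((p, rest[1:], push + st[1:]))
    return False

print(accepts_empty_stack("aaaba"))  # True
```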
steps is the same in both cases, and the transition rule corresponding to the same production is used at every step.
• Considering the moves of the PDA and the derivation under G together, it is also observed that at every step the input read so far and the stack contents together are exactly identical to the corresponding sentential form, i.e.

<what is read><stack> = <sentential form>

Say, at step 2: read so far = a, stack = AB, sentential form = aAB. From this property we claim that (q, xy, S) ⊢* (q, y, α) iff S ⇒*lm xα. If the claim is true, then applying it with y = ε and α = ε we get (q, w, S) ⊢* (q, ε, ε) iff S ⇒*lm w, i.e. w ∈ N(M) iff w ∈ L(G) (by definition). Thus N(M) = L(G) as desired. Note that we have already proved a more general version of the claim.

PDA and CFG: We now want to show that for every PDA M that accepts by empty stack, there is a CFG G such that L(G) = N(M). We first see whether the "reverse of the construction" that was used in part (i) can be used here to construct an equivalent CFG from any PDA M. It can be shown that this reverse construction works only for single-state PDAs.

• That is, for every one-state PDA M there is a CFG G such that L(G) = N(M). For every move (q, α) ∈ δ(q, a, A) of the PDA M we introduce a production A → aα in the grammar G, where N = Γ and T = Σ. We can now apply the proof in part (i) in the reverse direction to show that L(G) = N(M).

But the reverse construction does not work for PDAs with more than one state. For example, consider a PDA M with more than one state that accepts a language containing the string aabaa, and construct a CFG from it using the "reverse" construction (transitions in M and the corresponding productions in G).
We can derive strings like aabaa, which is in the language. But under this grammar we can also derive some strings which are not in the language. Therefore, to complete the proof of part (ii) we would need to prove the following claim as well.

Claim: For every PDA M there is some one-state PDA M′ such that N(M) = N(M′).

It is quite possible to prove the above claim, but here we will adopt a different approach: we start with any arbitrary PDA M that accepts by empty stack and directly construct an equivalent CFG G.

PDA to CFG: We want to construct a CFG G to simulate any arbitrary PDA M with one or more states. Without loss of generality we can assume that the PDA M accepts by empty stack. The idea is to use nonterminals of the form <pAq>, intended to derive exactly those strings on which the PDA M, in state p with A on top of the stack, can go to state q erasing A from the stack. That is, for a given transition (r, B1B2...Bn) ∈ δ(q, a, A) of the PDA, we add corresponding productions in the grammar as shown below. And we would like to show, in general, that <pAq> ⇒* w iff the PDA M, when started from state p with A on the top of the stack, can finish processing w, arrive at state q, and remove A from the stack.

We are now ready to give the construction of an equivalent CFG G from a given PDA M. We need to introduce two kinds of productions in the grammar, as given below. The reason for the introduction of the first kind of production will be justified at a later point; the introduction of the second type of production has been justified in the above discussion.

Let M = (Q, Σ, Γ, δ, q0, z, ∅) be a PDA. We construct from M an equivalent CFG G = (N, Σ, P, S), where

• N is the set of nonterminals of the form <pAq> for p, q ∈ Q and A ∈ Γ, together with the start symbol S, and P contains the following two kinds of productions:

1. S → <q0 z q> for every q ∈ Q.
2. If (r, B1B2...Bn) ∈ δ(q, a, A), then for every choice of the sequence of states q1, q2, ..., qn ∈ Q, include the production

<q A qn> → a <r B1 q1> <q1 B2 q2> ... <q(n−1) Bn qn>.

If n = 0, then the production is <q A r> → a. For the whole exercise to be meaningful we want that <q A p> ⇒* w
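The two kinds of productions above can be generated mechanically. A sketch in Python (the representation is an illustrative choice: nonterminals are tuples (p, A, q), "S" is the start symbol, and δ is a dictionary as in the earlier simulators):

```python
from itertools import product

def pda_to_cfg(states, delta, q0, z):
    """Triple construction: nonterminal (p, A, q) is meant to derive
    exactly the strings on which the PDA can go from state p with A on
    top of the stack to state q, erasing A.  delta maps
    (state, input or '' for epsilon, stack top) to {(state, push)}."""
    prods = []
    for p in states:                       # kind 1: S -> (q0, z, p) for every p
        prods.append(("S", [(q0, z, p)]))
    for (q, a, A), options in delta.items():
        for r, push in options:            # kind 2, one per move and state choice
            n = len(push)
            if n == 0:                     # pop move: (q, A, r) -> a
                prods.append(((q, A, r), [a] if a else []))
            else:                          # guess n intermediate states
                for ps in product(states, repeat=n):
                    starts = (r,) + ps[:-1]
                    body = ([a] if a else []) + [
                        (starts[i], push[i], ps[i]) for i in range(n)]
                    prods.append(((q, A, ps[-1]), body))
    return prods
```

For a two-state PDA whose only move is a pop, this yields the two start productions plus one terminal production.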
means there is a sequence of transitions (of the PDA M), starting in state q, ending in p, during which the PDA M consumes the input string w and removes A from the stack (and, of course, all other symbols pushed onto the stack in A's place, and so on). That is, we want to claim that

<q A p> ⇒* w iff (q, w, A) ⊢* (p, ε, ε).

If this claim is true, then let w ∈ N(M), i.e. (q0, w, z) ⊢* (q, ε, ε) for some q ∈ Q. But then S ⇒ <q0 z q> ⇒* w, i.e. w ∈ L(G); conversely, if w ∈ L(G), then S ⇒ <q0 z q> ⇒* w for some q ∈ Q, and hence (q0, w, z) ⊢* (q, ε, ε), i.e. w ∈ N(M). So, for all w ∈ Σ*, w ∈ L(G) iff the PDA M accepts w by empty stack, i.e. L(G) = N(M).

Now, to show that the above construction of CFG G from any PDA M works, we need to prove the proposed claim.

Note: At this point, the justification for the introduction of the first type of production (of the form S → <q0 z q>) in the CFG G is quite clear. This helps us derive a string from the start symbol of the grammar.

Proof of the claim: <q A p> ⇒n w iff (q, w, A) ⊢n (p, ε, ε), for some n ≥ 1. The proof is by induction on the number of steps n in a derivation of G (which of course is equal to the number of moves taken by M). The proof consists of two parts: the 'if' part and the 'only if' part.

First, consider the 'if' part: if (q, w, A) ⊢n (p, ε, ε), then <q A p> ⇒n w.

Basis: n = 1. Then (q, w, A) ⊢ (p, ε, ε). In this case, it is clear that w = a for some a ∈ Σ ∪ {ε} and (p, ε) ∈ δ(q, a, A). Hence, by construction, <q A p> → a is a production of G. Therefore <q A p> ⇒ a = w.

Inductive Hypothesis: Assume the claim holds whenever fewer than n moves (derivation steps) are involved.

Inductive Step:
For n > 1, let w = ax for some a ∈ Σ ∪ {ε} and x ∈ Σ*, and consider the first move of the PDA M, which uses a general transition (r, B1B2...Bk) ∈ δ(q, a, A):

(q, ax, A) ⊢ (r, x, B1B2...Bk).

Now M must remove B1B2...Bk from the stack while consuming x in the remaining n − 1 moves. Let x1 be the prefix of x that M has consumed when B2 first appears at the top of the stack. Then there must exist a sequence of states q1, q2, ..., qk in M (with qk = p), such that, writing x = x1x2...xk,

(r, x1, B1) ⊢* (q1, ε, ε) [this step implies <r B1 q1> ⇒* x1]
(q1, x2, B2) ⊢* (q2, ε, ε) [this step implies <q1 B2 q2> ⇒* x2]
...
(q(k−1), xk, Bk) ⊢* (p, ε, ε) [this step implies <q(k−1) Bk p> ⇒* xk]

[Note: each of these computations takes at most n − 1 moves, because together they account for the n − 1 moves remaining after the first.]

So, applying the inductive hypothesis, we get the derivations shown in brackets. But corresponding to the first move of M, we have added the following production in G:

<q A p> → a <r B1 q1> <q1 B2 q2> ... <q(k−1) Bk p>.

Therefore <q A p> ⇒ a <r B1 q1> ... <q(k−1) Bk p> ⇒* a x1x2...xk = ax = w.
Deterministic PDA: Regular Languages and DPDA's. The DPDA's accept a class of languages that lies in between the regular languages and the CFL's.

Deterministic Pushdown Automata (DPDA) and Deterministic Context-free Languages (DCFLs): Pushdown automata, as defined and discussed so far, are nondeterministic by default; that is, there may be two or more moves involving the same combination of state, input symbol, and top of the stack, and again, for some state and top of the stack the machine may either read an input symbol or make an ε-transition (without consuming any input). In a deterministic PDA there is never a choice of move in any situation. This is ensured by preventing the two cases just mentioned, as described in the definition below.

Definition: Let M = (Q, Σ, Γ, δ, q0, z, F) be a PDA. Then M is deterministic if and only if both of the following conditions are satisfied:
1. δ(q, a, z) has at most one element for any q ∈ Q, a ∈ Σ ∪ {ε} and z ∈ Γ (this condition prevents multiple choices for any combination of q, a and z).
2. If δ(q, ε, z) ≠ ∅, then δ(q, a, z) = ∅ for every a ∈ Σ (this condition prevents the possibility of a choice between a move with or without an input symbol).

Empty Production Removal

The productions of context-free grammars can be coerced into a variety of forms without affecting the expressive power of the grammars. If the empty string does not belong to a language, then there is a way to eliminate the productions of the form A → λ from the grammar. If the empty string belongs to a language, then we can eliminate λ from all productions save for the single production S → λ. In this case we can also eliminate any occurrences of S from the right-hand side of productions.

Procedure to find a CFG without empty productions
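The two determinism conditions translate directly into a check over the transition table. A sketch (illustrative, assuming the dictionary encoding of δ used earlier, with '' standing for ε):

```python
def is_deterministic(delta, states, sigma, gamma):
    """Check the two DPDA conditions:
    (1) |delta(q, a, z)| <= 1 for every q, a (including epsilon) and z;
    (2) if delta(q, epsilon, z) is nonempty, then delta(q, a, z) is
        empty for every input symbol a."""
    for q in states:
        for zt in gamma:
            for a in list(sigma) + [""]:          # "" plays the role of epsilon
                if len(delta.get((q, a, zt), set())) > 1:
                    return False                  # condition (1) violated
            if delta.get((q, "", zt)):
                if any(delta.get((q, a, zt)) for a in sigma):
                    return False                  # condition (2) violated
    return True
```

For instance, the a^n b^n machine of Example 1 passes both conditions: its only ε-move, from q3 with z on top, has no competing input-consuming move.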
Unit Production Removal

Left Recursion Removal

NORMAL FORMS

Two kinds of normal forms, viz. Chomsky Normal Form (CNF) and Greibach Normal Form (GNF), are considered here.

Chomsky Normal Form (CNF)

Any context-free language L without any λ-production is generated by a grammar in which the productions are of the form A → BC or A → a, where A, B, C ∈ VN and a ∈ VT.
Procedure to find an Equivalent Grammar in CNF
(i) Eliminate the unit productions, and λ-productions if any.
(ii) Eliminate the terminals on the right-hand side of productions of length two or more.
(iii) Restrict the number of variables on the right-hand side of productions to two.

Proof: For step (i), apply the following theorem: "Every context-free language can be generated by a grammar with no useless symbols and no unit productions." At the end of this step the RHS of any production has a single terminal or two or more symbols. Let us denote the resulting equivalent grammar by G = (VN, VT, P, S).

For step (ii), consider any production of the form A → x1x2...xn, n ≥ 2: if xi is a terminal a, replace it by a new variable Ba and add the production Ba → a.

Example

Obtain a grammar in Chomsky Normal Form (CNF) equivalent to the grammar G with productions P given.
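Steps (ii) and (iii) can be sketched in code. The following assumes step (i) has already been done, single-character symbols with uppercase for variables, and uses illustrative helper names (T_a for terminal variables, X0, X1, ... for the chain variables of step (iii)):

```python
def to_cnf(prods):
    """Apply CNF steps (ii) and (iii) to a list of (head, body) pairs,
    where body is a list of symbols (lowercase = terminal)."""
    out, term_var, counter = [], {}, [0]

    def var_for(t):                       # step (ii): introduce T_a -> a
        if t not in term_var:
            term_var[t] = "T_" + t
            out.append((term_var[t], [t]))
        return term_var[t]

    for head, body in prods:
        if len(body) == 1:                # A -> a is already in CNF form
            out.append((head, body))
            continue
        body = [s if s.isupper() else var_for(s) for s in body]
        while len(body) > 2:              # step (iii): A -> B X, X -> rest
            x = f"X{counter[0]}"; counter[0] += 1
            out.append((head, [body[0], x]))
            head, body = x, body[1:]
        out.append((head, body))
    return out
```

For example, S → aSb becomes S → T_a X0, X0 → S T_b, T_a → a, T_b → b, so every resulting right-hand side is a single terminal or exactly two variables.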
  • 23. Solution
Pumping Lemma for CFLs
A “Pumping Lemma” is a theorem used to show that, if certain strings belong to a language, then certain other strings must also belong to the language. Let us discuss a Pumping Lemma for CFLs. We will show that, if L is a context-free language, then strings of L that are at least ‘m’ symbols long can be “pumped” to produce additional strings in L. The value of ‘m’ depends on the particular language.
Let L be an infinite context-free language. Then there is some positive integer ‘m’ such that, if S is a string of L of length at least ‘m’, then
  • 24. (i) S = uvwxy (for some strings u, v, w, x, y)
(ii) |vwx| ≤ m
(iii) |vx| ≥ 1
(iv) uvⁱwxⁱy ∈ L for all non-negative values of i.

It should be understood that:
(i) If S is a sufficiently long string, then there are two substrings, v and x, somewhere in S. There is stuff (u) before v, stuff (w) between v and x, and stuff (y) after x.
(ii) The stuff between v and x won’t be too long, because |vwx| can’t be larger than m.
(iii) Substrings v and x won’t both be empty, though either one could be.
(iv) If we duplicate substring v some number (i) of times, and duplicate x the same number of times, the resultant string will also be in L.

Definitions
A variable is useful if it occurs in the derivation of some string. This requires that (a) the variable occurs in some sentential form (you can get to the variable if you start from S), and (b) a string of terminals can be derived from that sentential form (the variable is not a “dead end”). A variable is “recursive” if it can generate a string containing itself. For example, variable A is recursive if A ⇒* xAy for some strings x and y.

Proof of Pumping Lemma
(a) Suppose we have a CFL given by L. Then there is some context-free grammar G that generates L. Suppose
(i) L is infinite, hence there is no proper upper bound on the length of strings belonging to L;
(ii) L does not contain λ;
(iii) G has no unit productions or λ-productions.
There are only a finite number of variables in a grammar, and the productions for each variable have finite lengths. The only way that a grammar can generate arbitrarily long strings is if one or more variables is both useful and recursive. Suppose no variable is recursive. Since the start symbol is nonrecursive, it must be defined only in terms of terminals and other variables. Then, since those variables are nonrecursive, they have to be defined in terms of terminals and still other variables, and so on. After a while we run out of “other variables” while the generated string is still finite.
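Conditions (i)–(iv) can be exercised on a known CFL. A small sketch for L = {aⁿbⁿ : n ≥ 1}, where the decomposition shown really does pump within L; the particular choice of u, v, w, x, y here is ours, for illustration:

```python
# Illustrating conditions (i)-(iv) on the CFL L = {a^n b^n : n >= 1}.
# For s = 'aabb' the decomposition below satisfies |vwx| <= m (for any
# m >= 2) and |vx| >= 1, and pumping keeps the string inside L.

def pump(u, v, w, x, y, i):
    """Condition (iv): the string u v^i w x^i y."""
    return u + v * i + w + x * i + y

def in_L(s):
    """Membership in L = {a^n b^n : n >= 1}."""
    n = len(s) // 2
    return len(s) % 2 == 0 and n >= 1 and s == 'a' * n + 'b' * n

u, v, w, x, y = 'a', 'a', '', 'b', 'b'   # s = uvwxy = 'aabb'
assert all(in_L(pump(u, v, w, x, y, i)) for i in range(5))
```

Pumping with i = 0 gives 'ab', i = 2 gives 'aaabbb', and so on; every pumped string keeps the a-count equal to the b-count, exactly as the lemma promises for a context-free language.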
Therefore there is an upper bound on the length of the strings which can be generated from the start symbol. This contradicts our assumption that the language is infinite. Hence, our assumption that no variable is recursive must be incorrect.
(b) Let us consider a string X belonging to L. If X is sufficiently long, then the derivation of X must have involved recursive use of some variable A. Since A was used in the derivation, the derivation should have started as S ⇒* uAy ⇒* uvAxy ⇒* uvwxy = X.
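Before turning to full proofs, condition (iv) can be explored experimentally. The sketch below pumps one chosen decomposition of a string of L = {aᵐbᵐcⁿ : m ≤ n ≤ 2m} out of the language; this illustrates, but does not prove, the contradiction a pumping-lemma argument derives, since a real proof must rule out every possible decomposition.

```python
# Experiment (illustrative only -- a proof must handle every decomposition):
# pump one decomposition of a string of L = {a^m b^m c^n : m <= n <= 2m}
# and watch the pumped string leave the language.
import re

def in_L(s):
    match = re.fullmatch(r'(a*)(b*)(c*)', s)
    if not match:
        return False
    na, nb, nc = (len(g) for g in match.groups())
    return na == nb and na <= nc <= 2 * na

def pump(u, v, w, x, y, i):
    return u + v * i + w + x * i + y

s = 'aaabbbccc'                          # m = n = 3, so s is in L
u, v, w, x, y = 'aa', 'a', 'bb', 'b', 'ccc'
assert u + v + w + x + y == s and in_L(s)
assert in_L(pump(u, v, w, x, y, 1))      # i = 1 reproduces s
assert not in_L(pump(u, v, w, x, y, 4))  # a^6 b^6 c^3: 6 <= 3 fails
```

Here v and x sit in the a-block and b-block, so pumping keeps the a-count equal to the b-count but eventually pushes both past n = 3, violating m ≤ n.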
  • 26. Usage of the Pumping Lemma
Hence our original assumption, that L is context-free, must be false. Hence the language L is not context-free.
Example
Check whether the language given by L = {aᵐbᵐcⁿ : m ≤ n ≤ 2m} is a CFL or not.
Solution
Closure properties of CFL – Substitution