
CMSC 303 Assignment 5 solved


1. [6 marks] This question asks you to examine the formal definitions of a TM and related concepts closely. Based
on these definitions, answer the following.
(a) A configuration of a Turing Machine (TM) consists of three things. What are these three things?
(b) Can a Turing machine ever write the blank symbol ⊔ on its tape?
(c) Can the tape alphabet Γ be the same as the input alphabet Σ?
(d) Can a Turing machine’s head ever be in the same location in two successive steps?
(e) Can a TM contain just a single state?
(f) What is the difference between a decidable language and a Turing-recognizable language?
2. [8 marks] This question gets you to practice describing TMs at a semi-low level. Give an implementation-level
description of a TM that decides the language
L = {x | x contains twice as many 0s as 1s}.
By implementation-level description, we mean a description similar to Example 3.11 in the text (i.e. describe
how the machine’s head would move around, whether the head might mark certain tape cells, etc.). Please do
not draw a full state diagram (for your sake and for ours).
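One common marking strategy for this language is to repeatedly cross off one unmarked 1 together with two unmarked 0s, accepting only if everything gets crossed off. The Python sketch below (an illustrative simulation of that head-marking idea, not a required or official solution; the function name is our own) mimics the passes a TM head would make over the tape:

```python
def decides_twice_as_many_zeros(x: str) -> bool:
    """Simulate the marking strategy on a simulated tape:
    each pass crosses off one unmarked 1 and two unmarked 0s.
    Accept iff every symbol ends up crossed off."""
    tape = list(x)
    while True:
        # Pass 1: scan for an unmarked 1.
        try:
            i = tape.index('1')
        except ValueError:
            # No 1s left: accept iff no unmarked 0s remain either.
            return '0' not in tape
        tape[i] = 'X'              # mark the 1
        # Pass 2: scan for two unmarked 0s to pair with it.
        for _ in range(2):
            try:
                j = tape.index('0')
            except ValueError:
                return False       # fewer than twice as many 0s
            tape[j] = 'X'          # mark the 0
```

Translating this back to a TM: each `tape.index` call is a left-to-right head sweep, and each `'X'` write is the head marking a cell, exactly the level of detail an implementation-level description should give.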
3. [9 marks] This question investigates a variant of our standard TM model from class. Our standard model
included a tape which was infinite in one direction only. Consider now a TM whose tape is infinite in both
directions (i.e. you can move left or right infinitely many spaces on the tape). We call this a TM with doubly
infinite tape.
(a) [3 marks] Show that a TM with doubly infinite tape can simulate a standard TM.
(b) [5 marks] Show that a standard TM can simulate a TM with doubly infinite tape.
(c) [1 mark] What does this imply about the sets of languages recognized by both models?
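For part (b), a standard trick is to "fold" the doubly infinite tape onto a singly infinite one: non-negative cells land on even positions and negative cells on odd positions. The sketch below (one possible encoding, assuming this folding approach) shows the index map and that it is a bijection from ℤ onto ℕ:

```python
def fold(i: int) -> int:
    """Map a doubly infinite tape index i (an integer) to a
    singly infinite index (a natural number): cell i >= 0 goes
    to position 2i, cell i < 0 goes to position -2i - 1."""
    return 2 * i if i >= 0 else -2 * i - 1

# Cells ..., -2, -1, 0, 1, 2, ... interleave as 3, 1, 0, 2, 4, ...
positions = [fold(i) for i in range(-3, 4)]
```

A head move of one cell on the doubly infinite tape becomes a move of two cells on the folded tape, with a special case when the head crosses cell 0.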
4. [10 marks] This question studies closure properties of the decidable and Turing-recognizable languages.
(a) [5 marks] Show that the set of decidable languages is closed under concatenation.
(b) [5 marks] Show that the set of Turing-recognizable languages is closed under concatenation. (Hint: This
is trickier than part (a) because if a (deterministic) Turing machine decides to split an input string x as
x = yz and check if y ∈ L1 and z ∈ L2, i.e. to check if x ∈ L1 ◦ L2, then running the recognizer for L1
on y (say) may result in an infinite loop if y ∉ L1.)
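The standard way around the infinite-loop problem in the hint is dovetailing: run the recognizers for every split point in parallel, one step each per round, and accept as soon as some split has both halves accepted. The Python sketch below (an illustrative simulation under our own conventions: a "recognizer" is a generator that yields False while still computing and True once it accepts, possibly yielding False forever) shows the scheduling idea:

```python
def recognize_concat(x, rec1, rec2):
    """Dovetailed recognizer for L1 ∘ L2.  rec1/rec2 map a string
    to a generator yielding False each step until acceptance, at
    which point it yields True (a rejecting run may loop forever).
    Loops forever when x is not in L1 ∘ L2, as a recognizer may."""
    # One pair of running computations per split point x = y z.
    runs = [(rec1(x[:k]), rec2(x[k:])) for k in range(len(x) + 1)]
    done = [[False, False] for _ in runs]
    while True:
        for idx, (g1, g2) in enumerate(runs):
            if not done[idx][0]:
                done[idx][0] = next(g1)   # advance left half one step
            if not done[idx][1]:
                done[idx][1] = next(g2)   # advance right half one step
            if all(done[idx]):
                return True               # some split accepted

# Toy recognizers for the demo: L1 = a*, L2 = b*.  A string outside
# the language makes the generator loop forever, mimicking a
# recognizer that never halts on non-members.
def rec_as(w):
    while any(c != 'a' for c in w):
        yield False
    while True:
        yield True

def rec_bs(w):
    while any(c != 'b' for c in w):
        yield False
    while True:
        yield True
```

Because every split advances by one step per round, a split whose halves both accept is found after finitely many rounds even though other splits loop forever.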
5. [5 marks] This question allows you to explore variants of the computational models we’ve defined in class.
Let a k-PDA be a pushdown automaton that has k stacks. In this sense, a 0-PDA is an NFA and a 1-PDA
is a conventional PDA. We know that 1-PDAs are more powerful (recognize a larger class of languages) than
0-PDAs. Show that 2-PDAs are more powerful than 1-PDAs. (Hint: Recall from A4 that the language
L = {a^n b^n c^n | n ≥ 0} is not context-free.)
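To see why two stacks suffice for the hint's language, note a 2-PDA can push the a's onto one stack, match each b against an a while pushing it onto the second stack, then match each c against a b. The sketch below (an illustrative simulation of that control logic in Python; the function name and phase bookkeeping are our own) makes the idea concrete:

```python
def two_stack_accepts(w: str) -> bool:
    """Simulate a 2-PDA for {a^n b^n c^n : n >= 0}:
    stack s1 counts a's; stack s2 records b's matched against a's;
    the phase variable enforces the a* b* c* ordering."""
    s1, s2 = [], []
    phase = 'a'
    for ch in w:
        if ch == 'a' and phase == 'a':
            s1.append('a')                 # push each a
        elif ch == 'b' and phase in ('a', 'b'):
            phase = 'b'
            if not s1:
                return False               # more b's than a's
            s1.pop()                       # match b against an a
            s2.append('b')                 # remember the b
        elif ch == 'c' and phase in ('b', 'c'):
            phase = 'c'
            if not s2:
                return False               # more c's than b's
            s2.pop()                       # match c against a b
        else:
            return False                   # letters out of order
    return not s1 and not s2               # all counts matched exactly
```

Since this language is not context-free (the hint), no 1-PDA recognizes it, while the two-stack machine above does, which is exactly the separation the question asks for.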