Probabilistic rebound Turing machines

Lan Zhang, Katsushi Inoue, Akira Ito, Yue Wang
2002 Theoretical Computer Science  
This paper introduces the probabilistic rebound Turing machine (PRTM) and investigates its fundamental properties. We first prove a sublogarithmic lower bound on the space complexity of this model with bounded error for recognizing specific languages. This lower bound strengthens a previous lower bound for conventional probabilistic Turing machines with bounded error. Using this lower bound and an idea from its proof, we then show that (i) ℒ[PRTM(o(log n))] is incomparable with the class of context-free languages, (ii) there is a language accepted by a two-way deterministic one-counter automaton but not in ℒ[PRTM(o(log n))], and (iii) there is a language accepted by a deterministic one-marker rebound automaton but not in ℒ[PRTM(o(log n))], where ℒ[PRTM(o(log n))] denotes the class of languages recognized by o(log n) space-bounded PRTMs with error probability less than 1/2. Furthermore, we show that there is an infinite space hierarchy within ℒ[PRTM(o(log n))]. Finally, we show that ℒ[PRTM(o(log n))] is not closed under concatenation, Kleene +, or length-preserving homomorphism. This paper answers two open problems posed in a previous paper.
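The "error probability less than 1/2" condition in the class definition is what makes bounded-error recognition robust: any recognizer whose error is strictly below 1/2 can be amplified by repeated runs and majority vote. A minimal Python sketch of this standard amplification argument (the `noisy_recognizer` below is a hypothetical stand-in for a bounded-error machine, not the PRTM construction from the paper):

```python
import random

def noisy_recognizer(in_language, error=0.3, rng=random):
    # Hypothetical bounded-error recognizer: answers correctly with
    # probability 1 - error, where error < 1/2 as in the class definition.
    return in_language if rng.random() >= error else not in_language

def majority_accept(in_language, trials=101, error=0.3, rng=random):
    # Amplification: run the recognizer an odd number of times and take
    # the majority vote; a Chernoff bound shows the error probability
    # drops exponentially in the number of trials.
    votes = sum(noisy_recognizer(in_language, error, rng) for _ in range(trials))
    return votes > trials // 2

rng = random.Random(42)
# Empirical error rates for a word that is in the language.
single_errors = sum(not noisy_recognizer(True, rng=rng) for _ in range(2000)) / 2000
amplified_errors = sum(not majority_accept(True, rng=rng) for _ in range(200)) / 200
print(single_errors, amplified_errors)
```

With error exactly 1/2 the recognizer is a fair coin and no amount of repetition helps, which is why the class is defined with a strict inequality.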
doi:10.1016/s0304-3975(01)00098-6