Alan Turing was a founder of computer science: a mathematician,
philosopher, codebreaker and visionary.
He was born on 23 June 1912 in Paddington, London. He studied at
Sherborne School during 1926-31 and was then an undergraduate at
King's College, Cambridge, during 1931-34. In 1935 he was elected a
Fellow of King's College for his work on quantum mechanics, probability
and logic during 1932-35. Before the advent of the digital computer,
i.e. in the 1930s, several mathematicians had begun to think about what
it means to be able to compute a function.
Turing's definitive contribution was the relation between logical
instructions, the action of the mind, and a machine which could in
principle be embodied in a practical physical form. Thus, in 1936, he
introduced the Turing machine, computability and the universal machine.
At the same time the American logician Alonzo Church had been working
on the same problem and reached a parallel conclusion. It was seen,
however, that Turing's approach was original and different: Church
relied on an assumption internal to mathematics, rather than appealing
to operations that could actually be carried out by real things or
people in the physical world. Church and Turing thus independently
arrived at equivalent conclusions. As we might phrase their common
definition now: a function is computable precisely when some Turing
machine can compute it.
Turing expanded this idea to show that there exists a "universal" Turing machine: a single machine which can carry out any such computation, given the appropriate instructions. The set of instructions can be thought of as a program for the Turing machine.
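As a sketch of how such a table of instructions acts as a program, here is a minimal Turing-machine simulator; the machine it runs (which increments a unary number) is a made-up example for illustration, not one taken from Turing's paper.

```python
# A minimal one-tape Turing-machine simulator. The machine below is a
# hypothetical example: it appends one '1' to a unary number, i.e. adds 1.

def run_turing_machine(rules, tape, state="start", blank="_", max_steps=1000):
    """Run the machine until it reaches the 'halt' state or max_steps expires."""
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        # Each rule maps (state, symbol read) -> (new state, symbol to write, move)
        state, write, move = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    cells = sorted(tape)
    return "".join(tape[i] for i in range(cells[0], cells[-1] + 1))

# The "program": scan right over the 1s, write a 1 on the first blank, halt.
rules = {
    ("start", "1"): ("start", "1", "R"),
    ("start", "_"): ("halt", "1", "R"),
}

print(run_turing_machine(rules, "111"))  # -> 1111
```

Feeding a different rule table to the same simulator runs a different machine, which is exactly the sense in which the instruction set is a program.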
Turing's paper, 'On Computable Numbers', was published near the end of 1936. The paper attracted considerable attention, and the ideas reached the mathematician John von Neumann. Turing extended his original ideas in the work for his Ph.D. thesis during 1936-38, completing a Ph.D. in logic, algebra and number theory at Princeton University.
It has been shown that recurrent neural networks are equivalent to Turing machines, i.e. they can compute any algorithmically computable function. Heikki Hyötyniemi of Helsinki University of Technology (Control Engineering Laboratory, Otakaari 5A, FIN-02150 Espoo, Finland) published a paper establishing this result, 'Turing Machines are Recurrent Neural Networks'.
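As an informal illustration of a recurrent network carrying out a computation, here is a hand-wired network of threshold units that computes the parity of a bit stream. The construction is a toy example of this article's own devising, not the encoding used in Hyötyniemi's paper.

```python
import numpy as np

# A hand-wired recurrent network of McCulloch-Pitts threshold units that
# computes the parity of a bit stream. The recurrent state h carries the
# parity seen so far; each step computes h' = h XOR x with fixed weights.

def step(z):
    return (z > 0).astype(float)  # Heaviside threshold unit

def parity_rnn(bits):
    h = 0.0  # recurrent state: parity of the bits seen so far
    for x in bits:
        # Hidden layer: OR and AND of (state, input)
        u = step(np.array([h + x - 0.5, h + x - 1.5]))
        # Output unit: XOR = OR AND NOT AND, fed back as the new state
        h = float(step(np.array([u[0] - u[1] - 0.5]))[0])
    return int(h)

print(parity_rnn([1, 0, 1, 1]))  # three 1s, odd parity -> 1
print(parity_rnn([1, 1, 0]))     # two 1s, even parity  -> 0
```

No weights are learned here; the point is only that fixed recurrent connections can maintain state and update it on each input, the same mechanism the equivalence results rely on.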
The class P, informally, is the class of decision problems that can be
solved by some algorithm within a number of steps bounded by some fixed
polynomial in the length of the input; an exact solution can be
obtained within a reasonable amount of time. Such problems fall in
class P, where P stands for polynomial time complexity. Problems for
which there is a polynomial-time algorithm are called tractable.
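As an illustration, one classic tractable problem is deciding whether a graph is connected: breadth-first search settles it in time polynomial (in fact linear) in the size of the graph.

```python
from collections import deque

# Deciding graph connectivity, a problem in class P: breadth-first
# search visits each vertex and edge at most once, so the running time
# is bounded by a fixed polynomial in the input size.

def is_connected(adjacency):
    """adjacency: dict mapping each vertex to a list of its neighbours."""
    vertices = list(adjacency)
    if not vertices:
        return True
    seen = {vertices[0]}
    queue = deque(seen)
    while queue:
        v = queue.popleft()
        for w in adjacency[v]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return len(seen) == len(vertices)

print(is_connected({0: [1], 1: [0, 2], 2: [1]}))  # -> True
print(is_connected({0: [1], 1: [0], 2: []}))      # -> False
```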
To define the problem in a precise way it is necessary to give a formal
model of a computer. The Turing machine is the standard computer model
in computability theory which was introduced by Alan Turing in 1936.
The model was introduced before physical computers were built; but it
still continues to be accepted as the proper computer model for the
purpose of defining computable function.
NP-complete problems are the hardest problems in the class NP. Such
problems are singled out because if a deterministic polynomial-time
algorithm is ever found for one of them, then all of the problems in NP
must have deterministic polynomial-time algorithms.
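A defining feature of problems in NP is that a proposed solution (a certificate) can be checked in polynomial time, even when finding one appears hard. Here is a sketch of such a verifier for Boolean satisfiability (SAT), the original NP-complete problem; the clause encoding is a common convention, chosen here for illustration.

```python
# Verifying a SAT certificate in polynomial time. A CNF formula is a
# list of clauses; each clause is a list of literals, where +i denotes
# variable i and -i denotes its negation.

def verify_sat(clauses, assignment):
    """Check a truth assignment against a CNF formula in linear time."""
    def literal_true(lit):
        value = assignment[abs(lit)]
        return value if lit > 0 else not value
    # The formula holds if every clause has at least one true literal.
    return all(any(literal_true(lit) for lit in clause) for clause in clauses)

# (x1 or not x2) and (x2 or x3)
clauses = [[1, -2], [2, 3]]
print(verify_sat(clauses, {1: True, 2: True, 3: False}))   # -> True
print(verify_sat(clauses, {1: False, 2: True, 3: False}))  # -> False
```

Finding a satisfying assignment may take exponential search, but checking one, as above, takes only one pass over the formula; that asymmetry is what NP captures.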
Use of Neural Networks to Solve P, NP, NP-Hard and NP-Complete Problems
Turing machines can be used to compute both P and NP problems, so
recurrent neural networks can in principle be used for the same. In
practice, both Hopfield and recurrent neural networks are used for
solving problems of P and NP complexity.
The maximum clique problem (MCP) is a classic graph optimization
problem with many real-world applications. The problem is NP-complete
and computationally intractable even to approximate within certain
absolute performance bounds. Discrete competitive Hopfield neural
networks have been used to find near-optimum solutions for the MCP.
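To make "near-optimum solutions" concrete, here is a simple greedy heuristic for finding a large clique. This is a baseline sketch for illustration only, not the discrete competitive Hopfield network from the cited work.

```python
# A greedy heuristic for the maximum clique problem: try vertices in
# order of decreasing degree and keep each one that remains adjacent to
# everything chosen so far. Fast, but only near-optimal in general.

def greedy_clique(adjacency):
    """adjacency: dict mapping each vertex to the set of its neighbours."""
    order = sorted(adjacency, key=lambda v: len(adjacency[v]), reverse=True)
    clique = []
    for v in order:
        if all(v in adjacency[u] for u in clique):
            clique.append(v)
    return clique

# A 5-vertex example graph whose largest clique is {0, 1, 2}.
graph = {
    0: {1, 2, 3},
    1: {0, 2},
    2: {0, 1},
    3: {0, 4},
    4: {3},
}
print(sorted(greedy_clique(graph)))  # -> [0, 1, 2]
```

Like the Hopfield-network approach, such a heuristic gives no optimality guarantee, which is what one expects for an NP-complete problem that is hard even to approximate.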
Thus, recurrent neural networks can be used for universal computation
and universal approximation; that is, everywhere Turing machines can be
used. Universal approximation means the ability to approximate any
function to an arbitrary degree of accuracy. Universal computation
means the ability to compute anything that can be computed in
principle, i.e. being equivalent in computing power to a Turing
machine, the lambda calculus, or a Post system.
Prof. Ashay Dharwadker