Abstract
Computational complexity was introduced and developed by Hartmanis and Stearns to provide methods for measuring the difficulty of computational problems and functions, as well as the efficiency of algorithms. The field grew out of the study of numerous hard problems and the analysis of the issues they raise.
The history of computational complexity begins with the theoretical model of computation, or machine, developed by Alan Turing during the 1930s. In his paper, Turing gave the scientific community a convincing formal definition of what it means to compute a function by an effective algorithm. The Turing machine paved the way for the development of digital computers during the 1940s and 1950s. After this development, researchers began to question several limitations of the Turing machine model (Arora & Barak, 2007). One of these limitations is that the model ignores the time and memory an algorithm requires; other concerns involved the relative computational difficulty of different functions. These considerations led to the development of computational complexity.
In 1965, Juris Hartmanis and Richard Stearns published the paper that introduced the subject of computational complexity (Arora & Barak, 2007). In it, the concept of computational complexity was developed as a way of comparing the difficulty and efficiency of algorithms and computational functions. The paper also introduced the key idea of measuring time and memory as a function of the input length. Another paper on computational complexity appeared during the 1960s: "The intrinsic computational difficulty of functions," by Alan Cobham. In it, Cobham argued that the relative difficulty of functions could not be properly understood without a theory of computational complexity.
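To make this idea concrete, the short sketch below (our own illustration, not code from the 1965 paper) counts the basic steps performed by two simple procedures as the input length n grows; one cost grows linearly in n and the other quadratically.

```python
# A minimal sketch (our own illustration) of the idea of measuring cost as a
# function of input length: count the basic steps that simple procedures
# perform on inputs of length n and observe how the counts grow.

def linear_search_steps(items, target):
    """Number of comparisons a linear search makes (worst case: n)."""
    steps = 0
    for x in items:
        steps += 1
        if x == target:
            break
    return steps

def pairwise_compare_steps(items):
    """Number of comparisons when every pair is compared: n*(n-1)/2."""
    steps = 0
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):
            steps += 1
    return steps

for n in (10, 100, 1000):
    data = list(range(n))
    # Searching for a value that is absent gives the worst case.
    print(n, linear_search_steps(data, -1), pairwise_compare_steps(data))
```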
The early development of computational complexity raised numerous issues, and these issues in turn led to several of the field's central concepts. One of the main questions was which measure of complexity is the most appropriate. The early papers quickly convinced researchers that time and space are the most useful measures, although others explored alternatives such as the amount of work performed and abstract axiomatic measures. Today it is clear that time and space are the essential measures of complexity.
During the 1970s, various concepts emerged that helped develop the subject of computational complexity. One concept that contributed greatly is NP-completeness (NP stands for nondeterministic polynomial time). NP-complete problems are decision problems that have resisted fast solution; they captured the attention of researchers and led to many results branching out from this subject (Arora & Barak, 2007). Widely known problems with no known efficient algorithm include many combinatorial optimization problems, such as the traveling salesman problem and integer programming. Researchers study NP-complete problems because they believe that finding a fast solution to a computational problem is closely tied to being able to verify a proposed solution quickly.
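To make the contrast concrete, the sketch below (an illustration under standard definitions, not code from the cited sources) uses subset sum, a classic NP-complete problem: checking a proposed solution takes time linear in the input, while the obvious search examines exponentially many subsets.

```python
# A minimal sketch of the gap between finding and verifying solutions that
# motivates the study of NP-completeness. Subset sum: given numbers and a
# target, decide whether some subset sums to the target.

from itertools import combinations

def verify_certificate(numbers, target, subset):
    """Verification is fast: check that the proposed subset uses only the
    given numbers and actually sums to the target."""
    remaining = list(numbers)
    for x in subset:
        if x in remaining:
            remaining.remove(x)
        else:
            return False
    return sum(subset) == target

def find_certificate(numbers, target):
    """Search is slow: no polynomial-time algorithm is known; this brute
    force examines up to 2^n subsets."""
    for r in range(len(numbers) + 1):
        for subset in combinations(numbers, r):
            if sum(subset) == target:
                return list(subset)
    return None

nums = [3, 34, 4, 12, 5, 2]
cert = find_certificate(nums, 9)                 # exponential-time search
print(cert, verify_certificate(nums, 9, cert))   # fast check of the answer
```

Whether the exponential search can always be avoided for such problems is exactly the P versus NP question.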
During the 1970s, the issue of structural complexity also emerged and greatly advanced computational complexity. Structural complexity captured researchers' attention because of the emergence of NP-complete problems. To study the properties of NP-complete problems, researchers developed conjectures, ideas, and theories about them. One of these is the isomorphism conjecture of Berman and Hartmanis, which states that all NP-complete sets are polynomial-time isomorphic to one another. This conjecture has been useful in formulating many ideas about NP-complete problems. Researchers also organized problems that appear harder than NP, yet still admit a similar classification, into the polynomial-time hierarchy.
Another significant idea that helped in understanding structural complexity is the concept of alternation, developed by Chandra, Kozen, and Stockmeyer (Fortnow & Homer, 2013). Alternation provides a way to classify combinatorial problems and functions: Chandra, Kozen, and Stockmeyer generalized the nondeterministic Turing machine to an alternating Turing machine in order to obtain the classification. Other classes studied by researchers include polynomial time (P), deterministic logarithmic space (L), and nondeterministic logarithmic space (NL); work on these classes produced problems and questions that greatly improved the understanding of computational complexity. A further concept that helped in understanding structural complexity is the oracle, which uses relativized computation to probe the structure of complexity classes and hard problems.
Another concept developed during the 1970s that provided a way to study hard problems such as the NP-complete problems is counting complexity, or the study of counting classes. Instead of asking whether a solution exists, a counting problem asks how many accepting computation paths, or equivalently how many solutions, a problem has. Counting complexity has played a significant role in understanding computational complexity and has been applied to problems involving circuits. Counting functions are denoted by the number sign #; the counting classes corresponding to nondeterministic polynomial-time computation are denoted #P (Fortnow & Homer, 2013).
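As a concrete example (our own illustration, following the standard definition rather than the cited paper), the canonical #P function is #SAT, which counts how many assignments satisfy a Boolean formula instead of merely asking whether one exists; the brute-force sketch below shows what is being counted.

```python
# A minimal sketch of a counting problem: #SAT asks *how many* satisfying
# assignments a Boolean formula has, rather than whether one exists.

from itertools import product

def count_satisfying(clauses, num_vars):
    """Count assignments satisfying a CNF formula given as a list of clauses;
    each clause is a list of literals (positive or negative 1-based variable
    indices). Brute force over all 2^n assignments."""
    count = 0
    for assignment in product([False, True], repeat=num_vars):
        if all(any(assignment[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            count += 1
    return count

# (x1 OR x2) AND (NOT x1 OR x3): 3 variables, 4 satisfying assignments.
print(count_satisfying([[1, 2], [-1, 3]], 3))
```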
Before the development of computational complexity, the use of random choices in computer science was difficult to treat rigorously, even though random numbers can be useful in studying combinatorial problems. During the 1970s, probabilistic algorithms began to be taken seriously. Notable developments include Berlekamp's algorithm for factoring polynomials and the probabilistic prime recognition algorithm of Solovay and Strassen (Fortnow & Homer, 2013). Probabilistic algorithms played an important role in the development of computational complexity; concepts that grew out of them during the 1980s include interactive proof systems, probabilistically checkable proofs, and derandomization.
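For concreteness, a minimal sketch of the Solovay-Strassen test follows (a standard textbook formulation written here, not code from the original paper): a prime always passes each random trial, while a composite fails at least half of them, so repeating the trial drives the error probability down exponentially.

```python
# A minimal sketch of the Solovay-Strassen probabilistic primality test.
# For odd n > 2 and random a, a prime must satisfy
#   a^((n-1)/2) = (a/n) (mod n),
# where (a/n) is the Jacobi symbol; a composite fails for at least half of all a.

import random

def jacobi(a, n):
    """Jacobi symbol (a/n) for odd n > 0."""
    a %= n
    result = 1
    while a != 0:
        while a % 2 == 0:
            a //= 2
            if n % 8 in (3, 5):
                result = -result
        a, n = n, a
        if a % 4 == 3 and n % 4 == 3:
            result = -result
        a %= n
    return result if n == 1 else 0

def solovay_strassen(n, rounds=20):
    """Return True if n is probably prime (error probability <= 2**-rounds)."""
    if n < 2:
        return False
    if n in (2, 3):
        return True
    if n % 2 == 0:
        return False
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = jacobi(a, n)
        if x == 0 or pow(a, (n - 1) // 2, n) != x % n:
            return False
    return True

print(solovay_strassen(561))     # Carmichael number: correctly rejected
print(solovay_strassen(104729))  # the 10000th prime: accepted
```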
With the development of formal languages in computer science, computational complexity also came to need a measure of the complexity of the logical language required to define a problem. Descriptive complexity addresses this need (Fortnow & Homer, 2013): it connects computational complexity to mathematical logic by measuring the complexity of the logical language needed to define the problem, rather than the time or space needed to solve it.
Measures of complexity have also been developed for other models of computation, such as circuit complexity, communication complexity, and proof complexity, and computational complexity has been extended to parallel and probabilistic models of computation. One emerging area of computational complexity theory is quantum computing, or quantum complexity, whose main objective is the simulation of quantum systems on computers. Other future directions for computational complexity include separating P from NP and developing further models of complexity. According to Hartmanis, efficient computational power is inherent in these models, and the techniques of computational complexity can help us understand them.
References
Arora, S., & Barak, B. (2007). Computational Complexity: A Modern Approach. Princeton University Press.
Fortnow, L., & Homer, S. (2013). A Short History of Computational Complexity. NEC Laboratories. Retrieved from http://people.cs.uchicago.edu/~fortnow/beatcs/column80.pdf.