Normal algorithm
The name attached to algorithms (cf. Algorithm) of a certain precisely characterized type. Together with recursive functions (cf. Recursive function) and Turing machines (cf. Turing machine), normal algorithms are among the most widely used precise refinements of the general intuitive idea of an algorithm. The concept of a normal algorithm was developed in 1947 by A.A. Markov in the course of his research on the identity problem for associative systems (see Associative calculus). The detailed definition and general theory of normal algorithms are set forth in [1] (Chapts. I–V); see also [2] (Chapts. I–VII).
Every normal algorithm $\mathfrak A$ works in a certain alphabet $A$ and gives rise to a well-defined process performed on words (cf. Word). The specification of this alphabet is an obligatory constituent part of the definition of $\mathfrak A$; one then says that $\mathfrak A$ is a normal algorithm in the alphabet $A$. Every normal algorithm in a fixed alphabet $A$ is completely determined by indicating its scheme — an ordered finite list of substitution formulas in $A$. Every such formula is in essence an ordered pair $(U,V)$ of words in $A$. Here $U$ is called the left and $V$ the right part of the formula. Among the formulas of a given scheme some are specially distinguished and are declared to be conclusive. As a rule, in the scheme of a normal algorithm a conclusive formula is written in the form $U\to\cdot V$ and an inconclusive one in the form $U\to V$.
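For concreteness, a scheme can be recorded in a program as an ordered list of substitution formulas. The following minimal sketch is written in Python; the triple representation, the flag name and the sample alphabet are illustrative choices made here, not part of the definition.

```python
# A scheme is an ordered list of substitution formulas (U, V, conclusive):
# U is the left part, V the right part, and the Boolean flag distinguishes
# a conclusive formula U -> . V from an inconclusive one U -> V.
# Illustrative scheme in the alphabet {'|', '+'}: the single inconclusive
# formula '+' -> '' (erasing plus signs adds unary numbers written side by side).
unary_addition_scheme = [("+", "", False)]
```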
A normal algorithm $\mathfrak A$ in an alphabet $A$ is a prescription for constructing, starting from an arbitrary word $P$ in $A$, a sequence of words $P_i$ according to the following rule. The word $P$ is taken as the initial term $P_0$ of the sequence, and the construction proceeds as follows. Suppose that for some $i\geq0$ the word $P_i$ has been constructed and that the process of constructing the sequence is not yet complete. If the scheme of $\mathfrak A$ contains no formula whose left part occurs in $P_i$, then one puts $P_{i+1}$ equal to $P_i$ and the process of constructing the sequence is considered complete. If, on the other hand, the scheme of $\mathfrak A$ does contain formulas whose left parts occur in $P_i$, then one takes as $P_{i+1}$ the result of substituting the right part of the first such formula for the first occurrence of its left part in $P_i$ (cf. Imbedded word); the process of constructing the sequence is considered complete if the substitution formula applied at this step is conclusive, and continues otherwise. If the process of constructing the sequence terminates, then one says that the normal algorithm $\mathfrak A$ is applicable to the word $P$. The last term $Q$ of the sequence is the result of applying $\mathfrak A$ to $P$ and is denoted by $\mathfrak A(P)$; one says that $\mathfrak A$ transforms $P$ into $Q$ and writes $\mathfrak A(P)=Q$. A normal algorithm in some extension of $A$ is called a normal algorithm over the alphabet $A$.
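The rule just described translates directly into a short interpreter. The sketch below assumes the list-of-triples representation of a scheme from the previous example; the function name run_normal_algorithm and the step bound guarding against non-terminating processes are illustrative choices, not part of Markov's definition.

```python
def run_normal_algorithm(scheme, word, max_steps=10_000):
    """Apply the normal algorithm with the given scheme to a word.

    scheme is an ordered list of triples (left, right, conclusive);
    the result A(P) is returned if the process terminates within max_steps steps.
    """
    for _ in range(max_steps):
        for left, right, conclusive in scheme:
            if left in word:  # the first formula whose left part occurs in P_i
                # Substitute the right part for the first occurrence of the left
                # part; an empty left part counts as occurring at the beginning.
                word = word.replace(left, right, 1)
                if conclusive:
                    return word       # a conclusive formula ends the process
                break                 # continue with the next word P_{i+1}
        else:
            return word               # no formula applies: the result is P_i
    raise RuntimeError("step limit reached; the algorithm may not be applicable to this word")


# Example: erasing '+' signs adds unary numbers, so '||+|||' becomes '|||||'.
print(run_normal_algorithm([("+", "", False)], "||+|||"))
```

Note that Python's str.replace(left, right, 1) substitutes exactly at the leftmost occurrence and treats an empty left part as occurring at the start of the word, which matches the convention used in the definition above.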
There are strong reasons to believe that the concept of a normal algorithm is an adequate precise version (formalization) of the general idea of an algorithm in an alphabet. More precisely, one may assume that for every algorithm $\mathfrak A$ in an alphabet $A$ one can construct a normal algorithm $\mathfrak B$ over $A$ that transforms every word $P$ in $A$ into the same result as the original algorithm $\mathfrak A$. This assertion is known in the theory of algorithms as the normalization principle. The formalization of the concept of an algorithm based on that of a normal algorithm turns out to be equivalent to the other known formalizations (see, for example, [3]). As a consequence, the normalization principle is equivalent to the Church thesis, which proposes that the concept of a partial recursive function is an adequate formalization of the informal, intuitive concept of a computable arithmetic function (cf. Computable function).
Having first arisen in connection with algebraic problems, normal algorithms have proved to be a convenient working tool in much research requiring a precise notion of an algorithm, especially when the basic objects of study are not arithmetical in nature but admit a natural representation as words in suitable alphabets (as is the case, for example, in constructive analysis).
References
[1] A.A. Markov, "Theory of algorithms", Israel Program for Scientific Translations (1961) (Translated from Russian) (Also: Trudy Mat. Inst. Steklov. 42 (1954))
[2] A.A. Markov, N.M. Nagornyi, "The theory of algorithms", Kluwer (1988) (Translated from Russian)
[3] E. Mendelson, "Introduction to mathematical logic", Van Nostrand (1964)
Comments
In 1936–1937, A.M. Turing formulated the thesis that his formalization of the intuitive notion of effective computation in terms of Turing machines coincides precisely with the intuitive notion of computability. After several different formalizations turned out to be equivalent, this was extended to the assertion that every sufficiently strong formalization captures precisely the same concept (Church's thesis).