Algorithm

In mathematics, computer science, and related subjects, an algorithm is an effective method for solving a problem using a finite sequence of instructions. Algorithms are used in calculation, data processing, and many other fields.
Each algorithm is a list of well-defined instructions for completing a task. Starting from an initial state, the instructions describe a computation that proceeds through a well-defined series of successive states, eventually terminating in a final state. The transition from one state to the next is not necessarily deterministic; some algorithms, known as randomised algorithms, incorporate randomness.
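The state-transition view above can be made concrete with a small sketch (an illustration chosen for this purpose, not one named in the text). Euclid's algorithm for the greatest common divisor is deterministic: its state is the pair (a, b), and each instruction maps one state to the next until a terminating state is reached. A randomised algorithm, by contrast, lets a coin flip influence the next state.

```python
import random

def gcd(a, b):
    # Deterministic algorithm: the state is the pair (a, b).
    # Each loop iteration is one well-defined state transition,
    # and the computation terminates because b strictly decreases.
    while b != 0:
        a, b = b, a % b
    return a

def randomized_pick(items):
    # Randomised algorithm: the transition to the next state
    # depends on a random choice, so it is not deterministic.
    return random.choice(items)

print(gcd(48, 18))  # 6
```

Here `gcd(48, 18)` passes through the states (48, 18), (18, 12), (12, 6), (6, 0) before halting with 6.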
A partial formalisation of the concept began with attempts to solve the Entscheidungsproblem (the “decision problem”) posed by David Hilbert in 1928. Subsequent formalisations were framed as attempts to define “effective calculability” or “effective method”; those formalisations included the Gödel-Herbrand-Kleene recursive functions of 1930, 1934 and 1935, Alonzo Church’s lambda calculus of 1936, Emil Post’s “Formulation 1” of 1936, and Alan Turing’s Turing machines of 1936–7 and 1939.
The adjective “continuous” when applied to the word “algorithm” can mean: 1) an algorithm operating on data that represents continuous quantities, even though this data is represented by discrete approximations (such algorithms are studied in numerical analysis); or 2) an algorithm in the form of a differential equation that operates continuously on the data, running on an analog computer.
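The first sense can be sketched with a standard numerical-analysis example (the Euler method, chosen here as an illustration, not drawn from the text): the true solution of a differential equation is a continuous quantity, but the algorithm works only with discrete approximations of it.

```python
def euler(f, y0, t0, t1, steps):
    # Approximates the solution of dy/dt = f(t, y) with y(t0) = y0.
    # The underlying quantity y(t) is continuous, but the algorithm
    # represents it by discrete values at step points t0, t0 + h, ...
    h = (t1 - t0) / steps
    t, y = t0, y0
    for _ in range(steps):
        y += h * f(t, y)
        t += h
    return y

# dy/dt = y with y(0) = 1 has the exact solution y(t) = e**t,
# so euler(...) at t = 1 should approximate e ≈ 2.71828.
approx_e = euler(lambda t, y: y, 1.0, 0.0, 1.0, 100000)
```

Refining the step count shrinks the gap between the discrete approximation and the continuous quantity it represents, which is the central concern of numerical analysis.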