Stefanos Gritzalis
Department of Information and Communication Systems, University of the Aegean, Research Unit, 30 Voulgaroktonou St., Athens GR-11472, GREECE. Tel: +30-1-6456.688, Fax: +30-1-6448.428, Email: sgritz@aegean.gr
Department of Informatics, Technological Educational Institute (T.E.I.) of Athens, Ag. Spiridonos St., Aegaleo, Athens GR-12243, GREECE. Tel: +30-1-5910974, Fax: +30-1-5910975, Email: sgritz@acm.org

Diomidis Spinellis
Department of Information and Communication Systems, University of the Aegean, Samos GR-83200, GREECE. Tel: +30-273-33919, Fax: +30-273-35483, Email: dspin@aegean.gr

Panagiotis Georgiadis
Department of Informatics, University of Athens, TYPA Buildings, Athens GR-15771, GREECE. Tel: +30-1-7257560, Fax: +30-1-7219561, Email: georgiad@di.uoa.gr
Formal methods, theory, and supporting tools can aid the design, analysis, and verification of the security-related and cryptographic protocols used over open networks and distributed systems. We review the most commonly followed techniques for the ex-post application of formal methods to the analysis and verification of cryptographic protocols (the analysis approach), and we examine robustness principles and the limitations of these techniques. Modern high-level specification languages and tools can be used for automatically analysing cryptographic protocols. Recent research work focuses on the ex-ante use of formal methods in the design stage of new security protocols (the synthesis approach). Finally, we outline current trends in the utilisation of formal methods for the analysis and verification of modern complicated protocols and protocol suites for the real commercial world.
Keywords: Protocol analysis tools, Formal methods, Security protocols, Cryptographic protocols.
A protocol is a set of rules and conventions that define the communication framework between two or more agents. These agents, known as principals, can be end-users, processes, or computing systems. In cryptographic protocols, part of at least one message is encrypted. Security-related and cryptographic protocols are used to establish secure communication over insecure open networks and distributed systems. These protocols use cryptographic techniques to achieve goals such as confidentiality, authentication of principals and services, message integrity, non-repudiation, order and timeliness of the messages, and distribution of cryptographic keys. Unfortunately, open networks and distributed systems are vulnerable to hostile intruders who may try to subvert the protocol design goals.
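For concreteness, the core message exchange of the Needham-Schroeder public-key protocol [2] can be written in the arrow notation commonly used in the literature (a simplified rendering that omits the key-server messages): principals A and B exchange the nonces N_A and N_B under each other's public keys K_B and K_A in an attempt to authenticate one another.

\[
\begin{array}{lll}
1. & A \rightarrow B : & \{N_A, A\}_{K_B} \\
2. & B \rightarrow A : & \{N_A, N_B\}_{K_A} \\
3. & A \rightarrow B : & \{N_B\}_{K_B}
\end{array}
\]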
Given such requirements, it is not surprising that there have been several examples of security-related and cryptographic protocols that were published, believed to be sound, and later shown to have several security flaws [1] [2] [3]. When flaws are discovered in a protocol, they are often corrected, or approaches are adopted that avoid relying on the reasoning of the flawed protocol [4]. These facts have increasingly prompted research into the development of several different formal methods for detecting protocol failures, following an analysis approach to designing secure protocols. As in the analysis of conventional communication protocols, two kinds of techniques have been applied to this problem: those that attempt to construct inferences, using specialised logics based on notions of knowledge and belief, showing that protocol participants can confidently reach the desired conclusions; and those that attempt to construct possible attacks, using the algebraic properties of the algorithms in the protocols.
Inference-construction methods utilise modal logics similar to those that have been developed for the analysis of the evolution of knowledge and belief in distributed systems. These methods are widely used [5] [6] [7]. A number of specific problems are associated with them [8] [9] [10] [11] [12], relating to: the analysis of zero-knowledge protocols, the detection of parallel-session multi-role flaws, the transformation of messages and propositions into idealised messages, the fact that there is no complete semantics for the logic, and the modelling of freshness.
Attack-construction methods construct probable attack sets based on the algebraic properties of a protocol's algorithms. These methods [13] [14] [15] [16] [17] [18] [19] [20] [21] [22] are targeted towards ensuring authentication, correctness, or security properties; they do not depend on the correctness of a proposed logic. Their main disadvantage lies in the large number of possible events that must be examined.
The attempt to avoid the exponential searches of the attack-construction methods, or to extend analyses to protocols that involve arbitrarily large numbers of participants and messages, has given rise to a third approach to the analysis of protocol failures: the recent proof-construction methods. These have the potential of being as thorough as attack construction in finding possible attacks, while avoiding exponential searches by replacing them with theorems about those searches. Proof-construction methods are complementary to inference-construction methods, since they are also based on formalising the problem through hypotheses and authentication properties, but they rely on problem-specific properties and on a specification at a finer level of precision.
Proof-construction methods formally model the actual computations performed in protocols and prove theorems about these computations. This approach has been taken by Snekkenes [23], Bolignano [25], Paulson [24] [26], and Brackin [27].
In this paper we provide a review of the state of the art in the application of formal methods and the development of relevant tools for the analysis, design, and verification of security-related and cryptographic protocols, and we outline the major trends of research in this area. The remainder of this paper is organised as follows: in Sections 2, 3, and 4 we describe, under the analysis approach, the most commonly followed approaches to the ex-post application of formal methods to already designed cryptographic protocols. In Section 5 we outline the use of security-related formal specification languages and tools for automatically analysing cryptographic protocols. In Section 6 we present an assortment of helpful principles and limitations encapsulating experience of good and bad practice that can be used in the design of error-free cryptographic protocols. In Section 7 we discuss recent trends for the ex-ante use of formal methods in the design stage of new cryptographic protocols, using the synthesis approach. Finally, in Section 8, we outline modern trends in the utilisation of formal methods for the analysis and verification of modern complicated protocols and protocol suites for the real commercial world.
Attack-construction methods can be divided into three sub-categories based on their theoretical foundation. In the following paragraphs we present these sub-categories and describe the basic features of each method.
These methods analyse a cryptographic protocol as they would any other program whose correctness is to be proved. This is achieved by specifying the protocol as a finite-state machine [21] [22], using predicate calculus [14], or within a process algebra [20] [15].
Sidhu and Varadharajan map the protocol to a finite-state machine [21] [22]. The first analysis method [21] verifies the basic properties of a number of protocols and detects basic flaws, but cannot detect flaws due to the re-use of old messages, as no temporal assumptions are made. The second method [22] also verifies the basic properties of a number of protocols, but exhibits a number of problems as the number of states increases. In addition, in order to deal with flaws related to the re-use of old messages, the authors propose incorporating data from the session-key message contents into the analysis.
Another approach, introduced by Kemmerer, is based on predicate calculus extensions [14]. This method uses the specification language Ina Jo and the Formal Development Methodology. Ina Jo [28] is a non-procedural assertion language that extends first-order predicate calculus. Formal specifications written in Ina Jo specify definitions, initial conditions, transforms, axioms, and criteria. Criteria are used to specify critical requirements for a secure state. Ina Jo formal specifications can then be executed and verified with related tools, such as Inatest. This approach has proved successful in locating both active and passive attack flaws, since in both cases the intruder is a separate entity in the model's mathematical framework.
Roscoe proposed a more rigorous approach [20], based on modelling all the agents taking part in the protocol, including the communicating principals and the intruder, as Communicating Sequential Processes (CSP). The proposed method can be used to formalise messages, traces, intruders, and nonce challenges. The Failures-Divergences Refinement checker (FDR) is a general-purpose tool that is then used to determine whether an implementation refines a specification: FDR takes as input two CSP processes, a specification and an implementation, and tests whether the implementation refines the specification. Initially, this approach was used to analyse many sorts of systems, including distributed databases and communications protocols [29]. Recently it has also been used to analyse security protocols [20] [15]. In the case of protocol authentication, FDR is used to test whether the protocol correctly achieves authentication and to discover a specific kind of attack on the protocol: the case where an intruder masquerades as another agent within a protocol run. The protocol is then adapted to remove the potential flaw, and FDR is used to verify that there are no attacks upon a small system running the protocol. Tests using FDR have proven to be rather fast. However, the main problem of the effectiveness of this approach in the examination of large-scale systems remains: as FDR requires a user-specified limit on properties such as the number of objects it will consider, failure to find an attack only asserts that an attack cannot be found within those specified limits.
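To make the notion of refinement concrete, the following Python fragment is a rough, hedged analogue of a trace-refinement check: every finite trace of the implementation must also be a trace of the specification. It is not FDR's actual algorithm, and the toy transition systems and event names are illustrative assumptions only.

# Hedged toy analogue of trace refinement: every finite trace of the
# implementation must also be a trace of the specification.
# This is not FDR's algorithm; it only illustrates the notion of refinement.

def traces(lts, start, depth):
    """All event traces of length <= depth of a labelled transition system."""
    result = {()}
    frontier = {((), start)}
    for _ in range(depth):
        nxt = set()
        for trace, state in frontier:
            for event, succ in lts.get(state, []):
                extended = trace + (event,)
                result.add(extended)
                nxt.add((extended, succ))
        frontier = nxt
    return result

def refines(spec, spec_start, impl, impl_start, depth):
    """True if, up to the given depth, every implementation trace is allowed."""
    return traces(impl, impl_start, depth) <= traces(spec, spec_start, depth)

# Illustrative systems: the specification allows a response only after a
# request, whereas the faulty implementation may also respond spontaneously.
spec = {"s0": [("request", "s1")], "s1": [("response", "s0")]}
impl = {"i0": [("request", "i1"), ("response", "i0")], "i1": [("response", "i0")]}
print(refines(spec, "s0", impl, "i0", depth=4))   # False: refinement fails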
Roscoe and Goldsmith [30] have described how a fully potent cryptographic protocol attacker can be modelled using a given inference system in CSP. Their approach utilises the FDR2 tool by Formal Systems. It uses a lazy exploration strategy which examines the subset of intruder states reachable by the protocol rules, effectively exploring the behaviour of the intruder in parallel with the protocol's evolution. A particular advantage of their methodology lies in its ability to reason about the absence of denial-of-service attacks. Their technique requires the production of a CSP description of the protocol by hand. This has proved not only time-consuming but also error-prone, even for experts in this area. To semi-automate the production of the CSP description, Lowe designed a program named Casper [31]. Casper is an effective front-end for the aforementioned approach and will be presented in Section 5.
J. C. Mitchell, M. Mitchell, and Stern use a general-purpose state enumeration tool named Murφ [32] (pronounced "Mur-phi", from the Greek letter φ) to analyse security-related protocols [19]. The methodology is similar to the approach used in CSP model checking of cryptographic protocols. It involves modelling the protocol and the desired properties in the Murφ language. Murφ then verifies, using breadth-first or depth-first full state enumeration, that all reachable states of the system satisfy the specification. Typically, the methodology for analysing protocols involves the following successive steps: formulate the protocol, add an adversary to the system, state the desired correctness condition, run the protocol for some specific choice of system size parameters, experiment with alternate formulations, and repeat.
Murφ has been used to demonstrate already known flaws, such as those in the TMN protocol [33] and Kerberos version 5 [34]. A useful aspect of the Murφ approach is that it is feasible to modify a system description to reflect a situation where one or more pieces of secret information have been compromised.
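As a hedged illustration of the state-enumeration methodology (not Murφ itself, whose models are written in the Murφ description language), the following Python sketch performs a breadth-first exploration of a deliberately flawed one-message toy protocol and checks a secrecy invariant in every reachable state; all names and rules are assumptions made for the example.

# Hedged sketch of exhaustive state enumeration with an intruder and an
# invariant, in the spirit of the Murphi methodology; the toy protocol rules
# below are illustrative assumptions, not part of the surveyed work.
from collections import deque

# A state records whether A has already sent its message and what the
# intruder has observed on the network.
INITIAL = (False, frozenset())

def successors(state):
    a_sent, observed = state
    succs = []
    if not a_sent:
        succs.append((True, observed | {"secret"}))       # flawed: sent in clear
        succs.append((True, observed | {"{secret}K"}))     # sent under key K
    return succs

def secrecy_holds(state):
    return "secret" not in state[1]           # intruder must never see plaintext

def check(initial):
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if not secrecy_holds(state):
            return state                       # counterexample: invariant violated
        for succ in successors(state):
            if succ not in seen:
                seen.add(succ)
                queue.append(succ)
    return None                                # invariant holds in all states

print(check(INITIAL))                          # prints the violating state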
The standardised language LOTOS [35] [36] has also been used to specify security protocols and cryptographic operations [37] and aid the verification of a protocol's robustness to intruder attacks. LOTOS is made up of two main components: a process algebra with a structured operational semantics and an abstract datatype language. The LOTOS formal language has been used to model the Equicrypt protocol [38] for conditional access to multimedia services and to find some successful attacks against it [39]. LOTOS has also been used by Germeau and Leduc to specify a registration protocol for the mutual authentication between a Trusted Third Party and a user [40].
Another general-purpose formal specification language used in this area is ASTRAL [41], whose strength lies in the specification of real-time systems. Dang uses the ASTRAL model checker [42] to check whether an ASTRAL specification satisfies its critical requirements by enumerating possible runs of transitions within a given time. ASTRAL has been applied to the Needham-Schroeder public-key authentication protocol [2] and the TMN protocol [33]. The ASTRAL model checker missed a bug in TMN, because it required excessive execution time under the given ASTRAL coding of the specification. Furthermore, the ASTRAL approach uncovers simple bugs also uncovered by the Murφ tool. The above results are preliminary, but it is expected that ASTRAL will prove more effective in the investigation of real-time protocols.
All the approaches described above have been shown to discover attacks caused by lack of explicitness in the protocol messages. Unfortunately, they suffer from the large size of the state space under exploration. Additionally, if a method fails to find an attack, this only means that there is no attack on the particular small system analysed; an attack may still exist for some larger system running the same protocol. The effectiveness of the aforementioned methods in the examination of large-scale systems therefore remains to be demonstrated. Lowe [43] presents sufficient conditions on the protocol and its environment guaranteeing that, if there is no breach of secrecy when the protocol is run by an appropriate small system, then there is no breach of secrecy on any system.
Although these general-purpose methods have been judged an important contribution to the field, research has often turned in more specialised directions. The driving force behind this turn is the strong desire to use reasoning knowledge specific to the cryptography domain.
The algebraic simplification methods model a protocol with a collection of rules for transforming and reducing algebraic expressions representing messages. Representative methods in this category have been proposed by Dolev and Yao [13], and Meadows [16] [17].
Dolev and Yao presented the basic model for the state-machine approach [13]. Under their model an intruder is in full control of the network being able to read, modify, create, and delete messages; effectively, the intruder is using the system being attacked as a machine to generate words (messages). The words follow some rewrite rules based, for example, on the properties of symmetric encryption. The intruder's task is to discover a word that should have been secret. Thus, the protocol security problem is transformed into a search based on a term-rewrite system. This approach was used to develop analysis algorithms for some restricted protocol classes.
Within this work two models were developed: the cascade protocol model, in which users can apply cryptographic operations in several layers to form messages, and the name-stamp protocol model, in which users are allowed to append, delete, and check names encrypted together with the plaintext. The name-stamp protocol model can be used to model layers of encryption. The main drawbacks of the Dolev-Yao model are its failure to model the principals' ability to remember state information between states, and the fact that it can only detect protocol deficiencies.
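The word-generation view can be illustrated with a small, hedged sketch: the intruder's knowledge is the closure of the observed messages under a few rules such as unpairing and decryption with known keys. The nested-tuple message representation and the message names are assumptions made for the example.

# Hedged illustration of the Dolev-Yao intruder as a word generator: compute
# the closure of observed messages under projection of pairs and decryption
# with known keys. The message encoding is an assumption of this sketch.

def analyse(observed):
    """Closure of the observed messages under unpairing and decryption."""
    known = set(observed)
    changed = True
    while changed:
        changed = False
        for msg in list(known):
            if isinstance(msg, tuple) and msg[0] == "pair":
                parts = set(msg[1:])                       # split a pair
            elif (isinstance(msg, tuple) and msg[0] == "enc"
                  and ("key", msg[1]) in known):
                parts = {msg[2]}                           # decrypt with known key
            else:
                continue
            if not parts <= known:
                known |= parts
                changed = True
    return known

# The intruder records {Na, Kab}Kas from the network and later learns Kas.
observed = {("enc", "Kas", ("pair", "Na", ("key", "Kab"))), ("key", "Kas")}
print(("key", "Kab") in analyse(observed))                 # True: Kab is exposed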
Meadows's NRL Protocol Analyzer [16] [17] is a prototype verification tool, written in Prolog, that can be used to assist either in the verification of security properties of cryptographic protocols or in the detection of security flaws. The NRL model takes the same approach as the term-rewrite model of Dolev-Yao. The main difference between the two models is that the Dolev-Yao model treats a protocol as a machine for producing words, while NRL Protocol Analyzer treats a protocol as a machine for producing not only words, but also beliefs and events. In the NRL model each protocol participant possesses a set of beliefs. These beliefs are created or modified as the result of receiving messages made up of words, while messages are sent depending upon both beliefs and messages received. Events represent the state transitions in which new words are generated and beliefs are modified. Thus an intruder who controls the dissemination of messages can use the protocol to produce words, beliefs, and events.
The NRL Protocol Analyzer, in common with the Interrogator model [18] [44], uses a backward search strategy to construct a path from a specified insecure state to an initial state. The main difference between the NRL model and the Interrogator stems from their end goals: the NRL model aims to prove that a protocol is secure, while the Interrogator is designed to search for ways to reach insecure states, without guaranteeing that the protocol is secure if the search fails. However, unlike the Interrogator model, the NRL Analyzer can construct a single path using an arbitrary number of protocol rounds, thereby working in an infinite state space. This approach allows the NRL Analyzer to discover attacks based on a combination of protocol runs.
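The backward-search idea can be sketched as follows (a hedged Python illustration, not the term-rewriting and unification machinery of the NRL Protocol Analyzer or the Interrogator): starting from a description of an insecure goal state, repeatedly ask which rule application could have established it, until a state consistent with the initial assumptions is reached. The facts and rules are assumptions made for the example.

# Hedged sketch of backward search from an insecure state towards an initial
# state. Rules map a set of premises to a set of conclusions; every fact and
# rule shown is an illustrative assumption.

RULES = [
    ({"intercepts(I, {Kab}Kas)", "knows(I, Kas)"}, {"knows(I, Kab)"}),
    ({"knows(I, Kab)", "intercepts(I, {secret}Kab)"}, {"knows(I, secret)"}),
]
# Facts assumed to hold initially: the intruder sees traffic and holds Kas.
INITIAL_FACTS = {"intercepts(I, {Kab}Kas)", "intercepts(I, {secret}Kab)",
                 "knows(I, Kas)"}

def predecessors(state):
    """States from which one rule application establishes part of `state`."""
    preds = []
    for premises, conclusions in RULES:
        if conclusions & state:
            preds.append(frozenset((state - conclusions) | premises))
    return preds

def attack_path(goal, limit=10):
    """Search backwards from the insecure goal towards the initial facts."""
    frontier = [[frozenset(goal)]]
    for _ in range(limit):
        nxt = []
        for path in frontier:
            if path[-1] <= INITIAL_FACTS:
                return list(reversed(path))        # attack: initial state to goal
            nxt.extend(path + [pred] for pred in predecessors(path[-1]))
        frontier = nxt
    return None                                    # no attack within the bound

print(attack_path({"knows(I, secret)"}))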
The NRL Protocol Analyzer has been used successfully to locate a series of previously unknown flaws in a number of protocols [45] [46], and to demonstrate flaws that were already known in the literature [47]. The main drawback of the current implementation is that some drastic simplifying assumptions are required to keep the state space workable. In addition, as with most rule-rewrite systems, it is not clear how well the system scales, as more complicated algorithms will need to be expressed using an ever-increasing set of rules. Another source of difficulty in using the NRL Protocol Analyzer lies in the generation of lemmas stating that infinite classes of states are unreachable: these have to be proved by hand. In Section 5 we describe an effective procedure [48] that makes this task easier by automating the lemma-generation process.
One of the earliest systems that used the Dolev-Yao approach is Millen's Interrogator model [18] [44]. The Interrogator is a software tool written in Prolog that incorporates a protocol state-transition model. While the abstract model includes the usual state variable for the intruder's set of known items, the search algorithms, expressed recursively, use a state representation with no explicit mention of the known set.
In addition, the Interrogator has an equation-solving facility for terms using encryption and other operators used in authentication protocols. This facility, called generalised narrowing, implements a multiple-theory approach which handles commutative operators like exclusive-or and others, such as a limited form of finite-field exponentiation, to which prior narrowing algorithms do not apply.
Protocol participants are modelled as communicating state machines whose messages to each other are intercepted by an intruder who can destroy messages, modify them, or let them pass through unmodified. Given a final state in which the intruder knows some word that should be secret, the Interrogator will try, using operations defined by non-confluent rewrite systems, all possible ways of constructing a path by which that state can be reached. If it finds such a path, then it has identified a security flaw; however, its failure to find an attack does not constitute a proof that no attack exists within its model. The Interrogator model has not uncovered previously unknown attacks in well-known protocols, but it has been able to reproduce a number of already known attacks [47].
A formal logic model called BAN logic [5], presented by Burrows, Abadi, and Needham, has been widely used for the analysis of authentication protocols. The BAN logic of belief belongs to the class of KD45 modal logics, which practically means that any fact is only a belief and does not need to be universal in time and space. It assumes that authentication is a function of integrity and freshness, and uses logical rules to trace both of those attributes through the protocol.
There are three main stages for the analysis of a protocol using BAN logic. The first step is to express the assumptions and goals as statements in a symbolic notation so that the logic can proceed from a known state to one where it can ascertain whether the goals are in fact reached. The second step is to transform the protocol steps into symbolic notation. Finally, a set of deduction rules called postulates are applied. The postulates should lead from the assumptions, via intermediate formulas, to the authentication goals.
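For illustration, two of the central BAN postulates are the message-meaning rule for shared keys (if P believes that K is a key it shares with Q and P sees X encrypted under K, then P believes that Q once said X) and the nonce-verification rule (if P believes that X is fresh and that Q once said X, then P believes that Q believes X). In the standard notation:

% Representative BAN postulates, reproduced for illustration:
% the message-meaning rule (left) and the nonce-verification rule (right).
\[
\frac{P \mid\equiv Q \stackrel{K}{\leftrightarrow} P
      \qquad P \lhd \{X\}_{K}}
     {P \mid\equiv Q \mid\sim X}
\qquad\qquad
\frac{P \mid\equiv \#(X)
      \qquad P \mid\equiv Q \mid\sim X}
     {P \mid\equiv Q \mid\equiv X}
\]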
BAN logic has been a success. It has found flaws in several protocols, including Needham-Schroeder [2] and CCITT X.509 [49]. It has uncovered redundancies in many protocols, including Needham-Schroeder, Kerberos [50], Otway-Rees [51], and CCITT X.509 [49]. Many published papers use BAN logic to make claims about their protocol's security [52] [53].
Inevitably, critiques of various features of BAN logic have been published. According to Liebl, it is difficult to prove properties of BAN logic, such as completeness, and the logic does not take into consideration the release of message contents or the interaction of runs of the same protocol at different times [54]. Nessett criticised BAN logic with respect to its claimed authentication goals [55], constructing a specific example to demonstrate BAN logic's failure to discover flaws which violate security in a basic sense. Snekkenes examined BAN logic's limitation of providing only partial correctness proofs [56]. Syverson described common misunderstandings about BAN logic's goals and explained a problem of informality in BAN logic's operational semantics [10]. For this reason, specific measures to formalise BAN logic have been proposed by Mao and Boyd [57]. This formalisation is desirable, not only for its potential in providing rigorous analysis of security protocols, but also for its ability to support computer-aided analysis.
The most criticised points of BAN logic are the lack of a complete semantics for the logic and the modelling of freshness. The lack of complete semantics may lead to problems in modelling, as some facts may have an unclear meaning. It usually causes problems at the idealisation step, due to ambiguity and vagueness, particularly where a message is idealised into a formula containing information not present in the message itself. An interesting research goal for overcoming this drawback would be the development of an efficient method for authenticating protocol idealisations. Such a method would presumably be based on rule-based techniques and would make it possible to refine a large protocol message transformation step into smaller, simpler, and easier to understand steps. It would reduce the possibility of errors in the informal protocol idealisation steps and would ease the diagnosis of lower-level design flaws. Mao and Boyd have worked towards this goal [58], but their work neither covers protocols using public-key algorithms nor includes a theoretic proof of the soundness of the proposed idealisation rules. Regarding the modelling of freshness, it is not possible, as is the case in most modal logics, to distinguish between freshness of creation and freshness of receipt. The abstract level of BAN logic models results in hypotheses and protocol descriptions that are difficult to assess. According to Syverson [59], BAN logic's results seem to be less reliable than the NRL Protocol Analyzer's, but are easier to come by. Other published logic systems are designed as extensions to BAN logic [6] [60] [9] or correct perceived weaknesses [56] [57].
A successful but rather complicated approach, called GNY logic, was proposed by Gong, Needham, and Yahalom [6], increasing the scope of BAN logic. GNY logic aims to analyse a protocol step by step, making explicit any assumptions required, and drawing conclusions about the final position it attains. This logic offers important advantages over BAN logic. The GNY approach places a strong emphasis on the separation between the content and the meaning of messages, which increases consistency in the analysis and introduces the ability to reason at more than one level. In GNY logic, principals can include in messages data which they do not believe in, but merely possess. It is also possible to express the ability of a recipient to identify the expected messages, and to determine that certain messages are not replays of the recipient's own previous messages in a given session.
GNY logic has a number of drawbacks: it addresses only authentication and is much more complicated and elaborate than other methods as it has many rules which have to be considered at each stage [61].
A Higher Order Logic (HOL) theory [62] formalising an extended version of GNY, named BGNY logic, has been introduced by Brackin [8]. This belief logic is used by software that automatically proves authentication properties of cryptographic protocols. Similarly to GNY logic, BGNY addresses only authentication. However, BGNY extends GNY logic by including the ability to specify protocol properties at intermediate stages and by being able to specify protocols that use multiple encryption and hash operations, message authentication codes, hash codes as keys, and key-exchange algorithms.
Another logic, called SvO, presented by Syverson and van Oorschot [7], is designed to capture the features of extensions and variants of four logics, namely BAN, GNY, AT [63], and vO [60], in a single unified framework. In addition, the authors provide model-theoretic semantics with respect to which the logic is sound. The SvO logic was intended to encompass the reasoning of these other logics while providing a rigorous understanding of its formal expressions. The SvO logic is considered to be simpler to use and more expressive than any of the logics from which it is derived.
Kailar proposes a special-purpose logic to be used for the analysis of communication protocols that require accountability [64], such as those for secure electronic transactions. This logic looks at what can be achieved without making any assumptions about freshness. A set of postulates which are applicable to the analysis of proofs in general and the proofs of accountability in particular are proposed. In the same framework, an authentication logic presented by Kessler and Neumann [65] can analyse the accountability of transactions in the framework of electronic commerce protocols. Their work is based on the AUTLOG semantics developed earlier [9]. New rules and predicates are used to model accountability and to prove that the new calculus is correct with respect to the formal semantics.
Wedel and Kessler also propose a logic for the analysis of authentication protocols [66], providing formal semantics with which to prove its soundness. This logic can handle a wide variety of cryptographic mechanisms using a minimum of notation. In their approach, the elimination of formulae from the idealised messages leads to a clear distinction between the protocol itself and the assumptions about it.
As mentioned before, inference-construction methods do not address secrecy, often lack clear semantics, and it is sometimes difficult to say exactly what a belief-logic proof actually proves. On the other hand, attack-construction methods may have to search spaces that grow exponentially with the size of the protocol, so the time and space they require can easily exceed all available resources.
In order to confront the aforementioned drawbacks, Bolignano has proposed an approach targeting the generation of human-readable proofs [25]. Such proofs can be used as part of a vulnerability analysis or formal code inspections. In order to achieve this goal, specific properties of the problem are used to formalise the requirements and simplify the proofs. The approach places particular emphasis on a clear description of the problem: a clear separation between reliable and unreliable principals; the precise description of their roles, beliefs, and control structures; the imposition of sequencing constraints; the expression of authentication properties using temporal features; and the formalisation of algorithm properties. The use of powerful invariants and the axiomatisation of intruder knowledge result in a verification process whose conciseness is comparable to that of modal-logic based approaches. The process can be automated within a framework of typed logics using the Coq proof assistant [67].
Paulson has independently developed a similar approach synthesising the inference-construction method idea of protocol message guarantees and the attack-construction method notion of events [24] [26]. Unlike Bolignano, who models the states of the four agents A, B, S, and the Spy, Paulson defines protocols inductively as the set of all possible event traces. Agents receiving a trace can forward it and extend it according to the protocol rules while remaining agnostic to the message's true sender. This approach allows the modelling of both attacks and key losses. Again, the process is partially automated using the Isabelle theorem prover [68].
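A hedged sketch of the inductive idea, written in Python rather than in Paulson's Isabelle formalisation: the set of traces is defined by rules, each of which extends an existing trace with one event, and the spy's rule lets it resend anything it has already observed. The events and rules are assumptions made for the example.

# Hedged sketch of inductively defined protocol traces: each rule takes an
# existing trace and appends one event. Events and rules are illustrative.

def rule_A(trace):
    """A starts a run by sending its nonce (at most once in this toy model)."""
    if not any(sender == "A" for sender, _, _ in trace):
        yield trace + [("A", "B", "Na")]

def rule_B(trace):
    """B acknowledges any message it has apparently received."""
    for sender, receiver, msg in trace:
        if receiver == "B":
            yield trace + [("B", sender, ("ack", msg))]

def rule_spy(trace):
    """The spy may replay any message already seen on the network."""
    for _, _, msg in trace:
        yield trace + [("Spy", "B", msg)]

RULES = [rule_A, rule_B, rule_spy]

def traces(depth):
    """All traces obtainable by at most `depth` rule applications."""
    current = {()}
    for _ in range(depth):
        extended = set()
        for trace in current:
            for rule in RULES:
                for longer in rule(list(trace)):
                    extended.add(tuple(longer))
        current |= extended
    return current

print(len(traces(3)))          # number of distinct traces after three steps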
Within the same framework, Schneider presents a general approach for the analysis and verification of authentication properties in the language of CSP [69]. The focus of this research work is the development of a specific theory targeted towards the analysis of authentication protocols and built on top of the general CSP semantic framework. The CSP syntax provides a precise way to describe authentication protocols in terms of the messages accepted and transmitted by the individual protocol participants. This approach aims to bridge the gap between the ability to express authentication protocols in a precise way and the facility to reason formally about the properties they exhibit. The aforementioned theory has been successfully applied [70] to the modelling and analysis of the Zhou and Gollmann fair non-repudiation protocol [71].
More recently, Fabrega, Herzog, and Guttman introduced the notion of a strand space [72] [73]. They propose a model and a set of proof methods for cryptographic protocols along the lines of the NRL Protocol Analyzer, Schneider's work, and Paulson's inductive definitions. A strand is defined as a sequence of events within the domain of a security protocol, representing the actions of a legitimate party or a penetrator. Defining a strand space as a graph of strands allows protocol correctness to be expressed as connections between different kinds of strands. In conjunction with this formalism the authors use the concept of ideals to prove bounds on a penetrator's capabilities, independently of the security protocol being analysed. This approach is characterised by the simplicity of the model and the effortless production of reliable protocol correctness proofs. An interesting aspect of this method is the pictorial approach [74], which is used as a heuristic for stating and proving correctness results. Strand spaces can help users draw informative pictures of security protocols, attacks, correctness theorems, and crucial steps in the proofs, thereby focusing on the protocol goals and their satisfaction.
Finally, another related approach is the implementation of the protocol security theory developed by Snekkenes [23] within the HOL theorem prover [62]. The prover works on an explicit identification of the participants' message extractions, computations, tests, and actions, exploiting the algebraic properties of the cryptographic algorithms used. The theory also distinguishes participants from the roles they play, thereby modelling attacks in which an attacker leads legitimate participants to think they are at different stages in the protocol than they really are. Snekkenes has also developed proof tools, although these rely on large amounts of human input.
The techniques outlined above cannot easily be applied by analysts other than the developers themselves. The main reason for this difficulty is that the protocols have to be re-specified for each technique, and it is not easy to transform the published description of a protocol into the required formal system. For this reason, some tools are designed as automatic translators. The input to any such translator still requires a formally defined language, but it can be made similar to the message-oriented protocol descriptions that are typically published in the literature. This introduces the idea of designing a single common protocol specification language that could be used as the input format for any formal analysis technique.
Meadows [48] describes a heuristic procedure for automating language generation using the NRL Protocol Analyzer. In contrast to other methods, where languages are defined by hand and unreachability properties are automatically proved, this approach combines the language generation with the unreachability proof. By using this approach the languages are efficiently created and are also amenable to automated analysis thereby improving the performance of the NRL Protocol Analyzer.
Brackin specifies a simple Interface Specification Language (ISL) and describes an Automatic Authentication Protocol Analyzer (AAPA) which can automatically either prove that specific protocols satisfy the desired properties, or identify precisely where these proof attempts fail [75] [76] [77]. The AAPA produces its proofs using the BGNY protocol analysis belief logic implemented in the HOL family of proof tools. The AAPA can be used either alone or as part of the Convince system [78]. The Convince tool facilitates the modelling and analysis of cryptographic protocols using a HOL theorem prover with automated support. This tool, developed by Lichota, Hammonds, and Brackin implements BGNY belief logic. It consists of the AAPA together with a graphical user interface that automatically creates ISL specifications from user-created graphical protocol representation.
The time and space required to perform an AAPA analysis grow only quadratically with the size of the protocol, making it possible for the AAPA to analyse large and complicated protocols quickly. A creditable demonstration of the AAPA's performance is the analysis of fifty-two protocols from "A Survey of Authentication Protocol Literature" by Clark and Jacob [79], a continually updated library of protocols analysed in the protocol-failure literature. As mentioned before, the time for protocol analysis proved to be quite brief: an experienced user needed eighty working hours to model and analyse the fifty-two protocols. However, the AAPA misses some failures, most notably non-disclosure failures, because the BGNY belief logic makes authentication deductions by assuming that there have been no non-disclosure violations. Nevertheless, the AAPA remains one of the most effective modern tools aiding the design and analysis process.
Another promising language named Common Authentication Protocol Specification Language (CAPSL), partly inspired by ISL, is being developed by Millen [80]. CAPSL is proposed as a single common protocol specification language that can be used as the input format for any formal analysis technique, such as Prolog state-search analysis tools [18], the NRL Protocol Analyzer [16] [17], model-checking with FDR [15], and HOL [8]. The main objectives of the CAPSL design are usability, abstraction, completeness, extensibility, parsability, and scalability.
A program named Casper, developed by Lowe [31], semi-automatically produces the CSP description from a more abstract description, thus greatly simplifying the modelling and analysis process. Casper compiles high-level protocol descriptions, written in a notation similar to that used in the academic literature, into CSP scripts for checking with FDR2. Casper does not yet cover all the features found in security protocols, but has been applied to a number of known protocols, such as the Andrew protocol, the Kerberos protocol, the CCITT X.509 protocol, and the Yahalom protocol [5]. Some of these case studies are available via the Casper Web page [81].
The main differences between CAPSL and the Casper language stem from the fact that a Casper input file has to define not only the protocol itself, but also the system to be checked; CAPSL defines only the protocol. The most important consequence for Casper is that, when designing the input syntax, one has to provide a mechanism for defining the agents who take part in the system, the specific roles they play, and the data items they use.
Recently, Abadi and Gordon developed the Spi calculus, a language for specifying cryptographic protocols [82]. The Spi calculus is an extension of the Pi calculus [83], an existing language for specifying mobile computation, which does not include any constructs for encryption and decryption. Within the Spi calculus, protocols are represented as processes, while their security properties are represented using the notion of protocol equivalence. The Spi calculus approach resembles the modal-logic reasoning about channel utterances and the characterisation of knowledge within a state-machine environment. It differs, however, from other approaches by representing integrity and secrecy as equivalences and by defining the environment as a Spi calculus process.
Finally, Brackin proposed a tool targeted towards automating proof construction [27] [84]. The HOL theory Protocol Description Logic (PDL) formalises the low-level details of the actions performed by processes executing a protocol. PDL is not sufficiently expressive to formalise all protocols specified in CAPSL [80]. The main advantage of Brackin's approach is that the complexities that arise in proving desired belief inferences need not be considered again once these inferences are proved. Thereby, protocol analyses using PDL are not only as trustworthy as those produced by attack-construction methods, but also comparable in speed to those performed using inference-construction methods.
Aiming towards the design of effective cryptographic protocols, a complementary approach is to try to encapsulate experience of good and bad practice into empirical rules. Robustness principles are helpful in that adherence to them contributes to the simplicity of protocols and avoids a considerable number of published confusions and mistakes. Anderson and Needham [85] propose a number of robustness principles, and Abadi and Needham [86] [87] present thorough analyses of desirable protocol properties and relevant limitations. Some of these principles are mentioned below:
It is remarkable that, in many cases, following one design principle will lead to violating another. This is almost expected, since we are dealing with empirical rules. In addition, even following all the above rules will not guarantee a sound design. Many authors have considered the question of what the appropriate goals are in the context of protocol analysis. Accordingly, Boyd [88] reviewed some design goals of authentication protocols and proposed classifying them into intentional and extensional goals. Intentional goals are generally concerned with ensuring that the protocol runs correctly as specified, while extensional goals are concerned with what the protocol achieves for its participants. It has been suggested that attacks should be measured by whether or not they violate extensional specifications, even if intentional ones have been used to find the attacks in the first place. Boyd proposes a hierarchy of extensional protocol goals which includes the major proposed goals for key establishment. He furthermore demonstrates how these extensional goals can be exploited to motivate the design of entity authentication protocols.
In conclusion, it is becoming widely accepted that both formal methods and structured design rules must be taken into account in a complementary way during all phases of protocol design in order to achieve effective and reliable cryptographic protocols.
The design of secure cryptographic protocols is a very complex and difficult process. Until recently, researchers were oriented towards the use of formal methods for the analysis and verification of existing protocols. These methods have proved successful at discovering flaws in existing protocols, sometimes previously unrecognised ones. Criticisms of formal verification include the fact that key distribution protocols claimed to be secure under BAN logic have already been broken [89]. Furthermore, BAN-like logics do not prove that a weakness in the protocol implies that the cryptoscheme on which it is based can be broken. Therefore, a great deal of doubt remains as to whether any of the existing techniques is sufficient to provide a proof that a given protocol is correct. This situation has a fair analogy in the verification of general-purpose computer programs, where reliable testing techniques allow many bugs to be found, but do not provide a basis for a complete proof of correctness.
Therefore, it would be a prudent and mature trend to design specific methods and implement tools that aid the initial correct design of cryptographic protocols. The incorporation of formal methods into the design process can be implemented in various ways.
One approach is to develop and use protocol design methodologies that lend themselves to formal method analysis [90]. This is exemplified by the modular design proposed by Heintze and Tygar [91]. Using protocol security reasoning tools and a composition theorem they can state sufficient conditions for combining two secure protocols to form a new one with similar properties. Based on secret-security and time-security notions they can provide examples of how unmet conditions result in an insecure protocol.
Another approach is based on the development of design principles which are used to develop protocols whose security is easy to evaluate. To satisfy this goal Gong and Syverson propose the notion of fail-stop protocols [92]. The main idea came from earlier work where they proposed the concept of a fail-stop processor, which, when failing, stops before any effect is visible to the outside environment [93]. Similarly, a fail-stop protocol halts in response to active attacks interfering with the protocol execution. Given such a protocol its security analysis involves only the examination of possible passive attacks such as eavesdropping. It is therefore much easier to conclude whether the secrecy assumption can be violated.
The suggested proof methodology for a fail-stop protocol comprises three phases: the verification that the protocol is fail-stop, the validation of the secrecy assumption, and the application of BAN-like logics. The proposed methodology applies BAN-like logics because, even for a fail-stop protocol, the residue from its execution may be useful to an attacker [55]. Another encouraging point for this methodology is that the specifications of fail-stop protocols satisfy some of the main prudent engineering principles from [86] and [87]. Accordingly, when the GNY logic is used to analyse a fail-stop protocol, the proof complexity can be dramatically reduced. According to the research team's investigation, many existing protocols have proved to be fail-stop [92]; therefore, the new notions are not too limiting.
The most prominent approach in this area seems to be the layered approach proposed by Meadows [90]. This approach can be used together with Heintze and Tygar's approach [91]. It is based on a stack of models at different levels of abstraction. As a first step, the protocol designer uses a relatively abstract model to construct and verify the security protocol. If this protocol is correct at that top layer, the designer focuses on a more detailed model which refines the abstract one. The repeated execution of this process leads to the final production of a detailed specification. Much of the existing work on requirements specifications [11] has this specific flavour. For the application of BAN Logic [94], the approach is based on a parser that translates members of a limited class of protocol specifications into BAN Logic.
Within this framework, Rudolph introduces an approach for designing an abstract model for cryptographic protocols that can be used as the top layer of a layered design method [95]. The main idea of Rudolph is the usage of Asynchronous Product Automata. The whole design process starts with a relatively abstract model at the top layer and ends in a refined specification that can be proven to be an implementation of the top level. This model reaches a higher level of abstraction than the model presented in the work of Heintze and Tygar [91] through the use of logical secure channels instead of encryption.
The notion of channels is also utilised by Buttyan, Staaman, and Wilhelm to present a simple logic for authentication protocol design [96]. These channels are abstract views of various types of secure communication links between principals. The way channels are used is similar to the use of Pi calculus channel primitives [83]. The proposed Simple Logic preserves the simplicity of BAN logic and adopts some concepts from GNY logic. It consists of a language and a small number of inference rules. The language is used to describe assumptions, events, and the protocol goals. The inference rules are used to derive new statements about the system. The goal of the analysis is to construct a witnessing deduction, which is a derivation of the goals from the assumptions and the formal protocol description. The protocol is correct in the case where such a deduction exists. The lack of a witnessing deduction means that the protocol may not be correct.
Boyd and Mao proposed another technique for designing key exchange protocols [97] which are guaranteed to be correct in the sense that a specified security criterion will not be violated if the protocol principals act correctly. This technique is developed from basic cryptographic properties that can be expected to hold for a variety of cryptographic algorithms. Protocols can be developed abstractly, and any particular type of algorithm that possesses the required property can then be used in a concrete implementation.
Gollmann [98] suggests that the design of authentication protocols has proven to be error prone partly due to a language problem. The objectives of entity authentication are usually given in terms of human encounters while we actually implement message passing protocols. The author proposes various translations of the high level objectives into a language appropriate for communication protocols.
Several researchers believe that in the near future more effort will be spent on designing secure protocols and less on formal verification. As expected, this trend has received criticism similar in nature to that expressed towards the use of formal methods in program design and implementation [99]. Specifically, Meadows argues [90] that design specifications do not guarantee that protocols will meet security goals that were not foreseen by the design approach, that the protocols designed are sometimes impractical, and that, due to the imprecision of design principles, flawed protocols may in any case be designed.
We presented an overview of the modern trends in the application of formal methods for the analysis and verification of cryptographic protocols. The three method families we described are useful at various levels of abstraction. The more abstract models can be used efficiently at earlier points in the design stage, when implementation details have not yet been decided. A protocol analysis toolkit-based usage scenario can be described as follows [24] [25] [90]: initially, use an inference-construction method, like BAN, to determine what the role of each message of a protocol should be and to ensure freshness properties; then, use an attack-construction method, like the NRL Protocol Analyzer, to find simple attacks quickly; and finally, utilise a proof-construction method to investigate deeper properties with a modest amount of effort.
Furthermore, several areas of current research have emerged. Bolignano is working towards investigating the use of his approach in the context of an ITSEC evaluation. Another interesting research direction is the investigation of the potential integration of methods like the NRL Protocol Analyzer and the Interrogator model within the methodology of fail-stop protocols, for protocols that do not satisfy the fail-stop requirements [100]. To ease and extend the use of the Murφ tool, it would be useful to achieve automatic translation of a higher-level protocol specification language such as CAPSL into Murφ, and to combine analyses using exhaustive finite-state analysis and formal logic methods.
Moreover, the research community is also working towards developing tools that take easy-to-write specifications of protocols and their expected properties and quickly perform formal analyses, checking for failures of these protocols to achieve their desired properties. AAPA [75] and CAPSL [80] seem to be the most promising approaches for bridging the gap between the typical informal presentations of protocols given in research papers and the precise characterisations required to conduct formal analysis. Representative research and development attempts at designing effective tools include the work of Brackin on a new, currently unnamed, protocol analysis tool which, unlike the AAPA, will use CAPSL rather than ISL as its input language and will use PDL as the basis for its proofs. In addition, the co-operation of Millen, Meadows, and Brackin has resulted in an as yet unpublished multi-purpose CAPSL translator [84]. This translator will be able to translate CAPSL protocol specifications into a HOL theory to be used with PDL, into input for the NRL Protocol Analyzer, or into human-readable algorithm descriptions.
An interesting research trend lies in the fact that many current activities use formal methods for analysing and verifying modern protocols and protocol suites intended for the commercial world. These suites consist of a set of individual protocols which interact with each other, giving rise to previously unknown potential vulnerabilities. Within this context, Brackin describes how the AAPA is used to formally analyse two large commercial protocols [101]: the main- and coin-sequence protocols developed by CyberCash, Inc. Murφ has been used to analyse the handshake protocol of SSL 3.0, a complex de facto standard for achieving secure Internet communication [102], and Paulson's inductive method has been applied [103] to analyse the descendant of SSL 3.0, known as the TLS Internet protocol. Recently, Paulson's inductive method has also been used [104] [105] to formalise Kerberos version IV, a real-world timestamp-based protocol. In this work, a complete formalisation of the whole protocol is achieved, and several guarantees about its entangled operation are proved using the Isabelle theorem prover. Furthermore, Bolignano generalised [106] his earlier approach [25] in order to achieve the verification of electronic payment protocols, such as C-SET and SET. The NRL Protocol Analyzer has also been utilised by Meadows for the analysis and verification [107] of the Internet Key Exchange protocol, IKE (formerly ISAKMP/Oakley) [108].
Finally, from the security protocol designer's point of view, the research community has adopted the synthesis approach, working towards more effective techniques for the ex-ante design of protocols that are guaranteed to be reliable and correct in the first place. Most of the recent research in this area focuses on the application of the notion of channels in order to effectively implement the layered approach.
The authors would like to thank C. Meadows, Naval Research Laboratory, S. H. Brackin, ARCA Systems, Inc., L. C. Paulson, University of Cambridge, C. Boyd, Queensland University of Technology, W. Mao, Hewlett-Packard, and the anonymous referees for their insightful remarks and constructive comments.
Dr. Stefanos Gritzalis holds a BSc (Physics), an MSc (Electronic Automation), and a PhD (Informatics), all from the University of Athens, Greece. Currently he is an Assistant Professor with the Department of Informatics of the Technological Educational Institute (TEI) of Athens, Greece, and a Research Associate with the Department of Information and Communication Systems of the University of the Aegean, Greece. His professional experience includes senior consulting and research positions in a number of private and public institutions. He has been involved in several national and EU funded R&D projects, such as COSACC (DGXIII-Telematics for Administration), KEYSTONE (DGXIII-ETS II), EUROMED-ETS (DGXIII-ETS I), ISHTAR (DGXIII-Healthcare Telematics), ERMIS (DGXVI-Action2), PD4/5 (DGXIII-Value II), etc. His published scientific work includes two (2) books (in Greek) on Information and Communication Technologies topics, and more than fifteen (15) journal and national and international conference papers. The focus of these publications is on Computer Systems Security, Cryptography, and Distributed Systems. Dr. S. Gritzalis is a Member of the Board (Treasurer) of the Greek Computer Society.
Dr. Diomidis Spinellis holds an MEng in Software Engineering and a PhD in Computer Science both from Imperial College (University of London, UK). Currently he is lecturing at the Department of Information and Communication Systems, University of the Aegean, Greece. He has provided consulting services to a number of Greek and international Information Technology companies. He has been involved in several national and EU funded R&D projects in the areas of Computer Security, Language Engineering, and Scientific Visualisation. He is the author of more than 30 technical papers and conference presentations. He has contributed software to the 4.4BSD Unix distribution, the X-Windows system, and is the author of a number of public domain software packages, libraries, and tools. His research interests include Information Security, Software Engineering, and Programming Languages. Dr. Spinellis is a member of the ACM, the IEEE, and a founding member of the Greek Internet User's Society. He is a co-recipient of the Usenix Association 1993 Lifetime Achievement Award.
Dr. Panagiotis Georgiadis holds a B.Sc. in Physics from the University of Athens, Greece, an M.Sc. in Computer Science from Warwick University, UK, and a Ph.D. in Computer Science from the University of Athens, Greece. Currently he is an Associate Professor with the Department of Informatics, University of Athens, Greece. He has been involved with several funded R&D projects in the areas of Distributed Systems and Software Engineering. He has authored three books and more than fifty (50) papers and conference presentations. His research interests include Simulation, Distributed Systems, Operating Systems, and Computer Systems Security. Dr. Georgiadis is a member of the ACM and the Greek Computer Society.