Logic and information security
The
interdisciplinary workshop 'Logic and information security' focuses on the
logical analysis of protocols for secure communication, including additional communicative aspects that are susceptible to attack. Disciplines and subdisciplines involved are: modal logic, combinatorial
mathematics, information science, computer science, cryptography, computer
security, philosophy, cognitive science, linguistics. Areas of interest are
combinatorial mathematics for bit exchange, information-based analysis with or
without modal logic, zero-knowledge protocols and oblivious transfer (and in
particular epistemic logical analyses of such phenomena), and model checking
protocols - both temporal epistemic and dynamic epistemic model checking, as well as more real-time protocol verification (often with first-order logics and systems based on term or equational rewriting). The
combinatorial mathematical approaches originate in design theory - an area of
mathematics with roots in the 1860s, e.g. in work by Kirkman and Steiner - and in proposals for bit exchange protocols based on random deals of cards, such as those by Winkler, Fischer, Wright and others from the 1980s onward. These involve the exchange of secret bits between some players by way of
public signals. These combinatorial mathematical approaches provide interesting
unconditionally secure protocols. They can be applied in security settings that
presume a prior distribution of information, such as interpreted systems, a
well-known architecture for knowledge-based systems pioneered in the 1980s by Vardi, Halpern and others. Both
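The card-deal setting can be made concrete with a small sketch. The three-card scenario below is an illustrative assumption of this sketch, not a protocol taken verbatim from the literature: cards 0, 1, 2 are dealt, one each, to Alice, Bob and eavesdropper Eve, and we consider a run in which the publicly announced pair happens to consist of Alice's and Bob's cards (as after a successful exchange). The shared secret bit is whether Alice holds the smaller card of the pair. Enumerating all deals consistent with Eve's information, interpreted-systems style, confirms that the bit is unconditionally secure against her:

```python
from itertools import permutations

def deals():
    """All deals (alice, bob, eve) of the three cards 0, 1, 2."""
    return list(permutations([0, 1, 2]))

def secret_bit(alice, bob):
    """Shared bit: does Alice hold the smaller card of the pair?"""
    return int(alice < bob)

def eve_cannot_learn_bit():
    for alice, bob, eve in deals():
        announced = frozenset({alice, bob})      # the public signal
        # Deals Eve considers possible: consistent with her own card
        # and with the announced pair.
        possible = [(a, b) for a, b, e in deals()
                    if e == eve and frozenset({a, b}) == announced]
        bits = {secret_bit(a, b) for a, b in possible}
        if bits != {0, 1}:                       # bit determined: a leak
            return False
    return True

# Alice and Bob each know the bit (own card plus the announced pair),
# while for Eve both values always remain possible.
print(eve_cannot_learn_bit())  # True
```

Security here rests on indistinguishability of deals, not on any computational hardness assumption, which is exactly the sense in which such protocols are unconditionally secure.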
zero-knowledge protocols (as in work by Goldreich dating back some 20 years) and 'epistemically
motivated' approaches like Rabin's oblivious transfer are currently being modelled in logical frameworks (Kramer, Palamidessi,
Maffei). There is some
overlap with model checking applications: toy zero-knowledge protocols such as
the Dining Cryptographers are implemented and validated in epistemic model
checkers. From the
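The Dining Cryptographers protocol just mentioned can itself be validated by brute-force enumeration in a few lines. The following is a minimal sketch of the three-party case (the function names and setup are illustrative, not the input language of any of the model checkers discussed): each adjacent pair shares a coin, each cryptographer announces the XOR of her two coins, flipped if she paid, and the parity of the announcements reveals whether a cryptographer paid without revealing who.

```python
from itertools import product

def run(payer, coins):
    """coins[i] is the coin shared by cryptographers i and (i+1) % 3.
    Returns the public announcements, one bit per cryptographer."""
    announcements = []
    for i in range(3):
        left, right = coins[(i - 1) % 3], coins[i]
        announcements.append(left ^ right ^ (1 if payer == i else 0))
    return announcements

def nsa_paid(announcements):
    """Even parity of the announcements: an outsider (the NSA) paid."""
    return sum(announcements) % 2 == 0

# Correctness: the parity reveals *whether* a cryptographer paid ...
for payer in [None, 0, 1, 2]:
    for coins in product([0, 1], repeat=3):
        assert nsa_paid(run(payer, coins)) == (payer is None)

# ... but not *who*: non-paying cryptographer 0 sees her two coins
# (coins[2] and coins[0]) and all announcements, yet both other
# cryptographers remain possible payers.
def observer0_suspects(ann, c2, c0):
    suspects = set()
    for payer in [1, 2]:
        for coins in product([0, 1], repeat=3):
            if coins[2] == c2 and coins[0] == c0 and run(payer, coins) == ann:
                suspects.add(payer)
    return suspects

for payer in [1, 2]:
    for coins in product([0, 1], repeat=3):
        ann = run(payer, coins)
        assert observer0_suspects(ann, coins[2], coins[0]) == {1, 2}
```

The anonymity check at the end is exactly the kind of epistemic property ("observer 0 does not know who paid") that the epistemic model checkers below verify on larger instances.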
workshop topics, these areas provide the most interdisciplinary aspect. There are relations between zero-knowledge protocols and various ways in which ignorance and knowledge diffuse in communities of human agents, and the formalization
of intentions. Some epistemic approaches model the intention to keep a secret
directly as a postcondition of protocol execution. In
more philosophical or linguistic terms this could be called drawing the
pragmatics of language, such as communicative expectations and suppositions,
into its formal semantics. Investigation of such aspects therefore involves the
philosophical/linguistic subdisciplines of pragmatics
and discourse analysis, such as work by van Benthem,
van Eijck, and Veltman. Propositional
model checkers have been used extensively for general satisfiability
problems. The recent development of dedicated dynamic or epistemic model
checkers is particularly useful for the verification of information-based
protocols, because we can then specify the protocols in terms of their natural
epistemic requirements and let the model checker do the dirty work of model
construction and validation of such epistemic formulas, including dynamic aspects such as epistemic postconditions of
protocol execution, or epistemic invariants during protocol execution. Examples
of epistemic model checkers with dynamic features are DEMO (van Eijck), MCMAS (Lomuscio), and MCK
(van der Meyden). An issue
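To make the idea concrete, here is a toy explicit-state epistemic model checker in the spirit of such tools (the representation is a simplifying assumption of this sketch, not the actual input language of DEMO, MCMAS or MCK): states are assignments to atomic propositions, an agent's indistinguishability relation is induced by the atoms it observes, and a public announcement updates the model by discarding states.

```python
def knows(states, observed, s, phi):
    """K_agent phi at state s: phi holds in every state the agent cannot
    distinguish from s, i.e. every state agreeing on the observed atoms."""
    return all(phi(t) for t in states
               if all(t[p] == s[p] for p in observed))

def announce(states, phi):
    """Public announcement of phi: restrict the model to phi-states."""
    return [s for s in states if phi(s)]

# Two coins 'a' and 'b'; Alice observes only coin 'a'.
states = [{'a': x, 'b': y} for x in (0, 1) for y in (0, 1)]
s = {'a': 1, 'b': 1}                                       # the actual state

assert knows(states, {'a'}, s, lambda t: t['a'] == 1)      # Alice knows a=1
assert not knows(states, {'a'}, s, lambda t: t['b'] == 1)  # but not b=1

# Dynamic aspect: after the public announcement "the coins agree",
# Alice's own coin fixes the value of b - an epistemic postcondition
# of the announcement.
updated = announce(states, lambda t: t['a'] == t['b'])
assert knows(updated, {'a'}, s, lambda t: t['b'] == 1)
```

The scalability issue mentioned above is visible even in this sketch: the state list grows exponentially with the number of atoms, which is why the dedicated tools use symbolic representations rather than explicit enumeration.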
in the epistemic model checking community is scalability - these tools often do
not scale up well. Closer to real-time protocol verification, logical means and methods are deployed in first-order and term-rewriting-based model checkers and similar tools, for example by Toulouse- and Nancy-based researchers such as Balbiani, and by the group of Basin.