A copy of this work was available on the public web and has been preserved in the Wayback Machine; the capture dates from 2017.
Lecture Notes in Computer Science
A key-encapsulation mechanism (KEM) is a cryptographic primitive that allows anyone in possession of some party's public key to securely transmit a key to that party. A KEM can be viewed as a key-exchange protocol in which only a single message is transmitted; the main application is in combination with symmetric encryption to achieve public-key encryption of messages of arbitrary length. The security of KEMs is usually defined in terms of a certain game that no efficient adversary can win with non-negligible advantage. A main drawback of game-based definitions is that they often do not have clear semantics, and that the security of each higher-level protocol that makes use of KEMs needs to be proved by showing a tailor-made security reduction from breaking the security of the KEM to breaking the security of the combined protocol. We propose a novel approach to the security and applications of KEMs, following the constructive cryptography paradigm by Maurer and Renner (ICS 2011). The goal of a KEM is to construct a resource that models a shared key available to the honest parties. This resource can be used in designing and proving higher-level protocols; the composition theorem guarantees the security of the combined protocol without the need for a specific reduction. doi:10.1007/978-3-642-42001-6_16
Introduction
Key establishment is a cryptographic primitive that allows two parties to obtain a shared secret key, which can subsequently be used in cryptographic mechanisms such as encryption schemes or message authentication codes (MACs). The most important application of key-establishment protocols is in the setup phases of protocols for secure communication, such as TLS or IPSec; furthermore, their unidirectional variant, key-encapsulation mechanisms (KEMs), is an important building block in most practical public-key encryption schemes. This paper is dedicated to Johannes Buchmann on the occasion of his 60th birthday. The topic of the paper, the key-establishment problem, a fundamental problem in cryptography, is one of the areas to which he has contributed significantly (e.g., [21]). In this paper, we focus on the particular case where keys are established using KEMs and only unidirectional communication. We build on earlier work in which public-key encryption is treated in constructive cryptography, and some parts of this work are taken from that paper.
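The KEM syntax sketched in this abstract (key generation, encapsulation, decapsulation) can be illustrated with a minimal Diffie-Hellman-based instantiation. The group parameters, generator, and hash-based key derivation below are illustrative assumptions, with toy-sized parameters that are not secure; this is a sketch of the interface, not any scheme from the paper.

```python
# Toy KEM: KeyGen / Encaps / Decaps over a small Diffie-Hellman group.
# Parameters are deliberately toy-sized -- for exposition only, NOT secure.
import hashlib
import secrets

P = 2**127 - 1   # a Mersenne prime; toy-sized modulus
G = 3            # assumed generator, for illustration

def keygen():
    """KeyGen: sample a secret exponent sk, publish pk = G^sk mod P."""
    sk = secrets.randbelow(P - 2) + 1
    return pow(G, sk, P), sk

def encaps(pk):
    """Encaps: anyone holding pk derives a fresh key K and a ciphertext c."""
    r = secrets.randbelow(P - 2) + 1
    c = pow(G, r, P)   # the single transmitted message
    K = hashlib.sha256(str(pow(pk, r, P)).encode()).digest()
    return c, K

def decaps(c, sk):
    """Decaps: the holder of sk recovers the same K from c."""
    return hashlib.sha256(str(pow(c, sk, P)).encode()).digest()

pk, sk = keygen()
c, K_sender = encaps(pk)
assert decaps(c, sk) == K_sender   # both parties now share K
```

The single message `c` is exactly the "key-exchange protocol in which only a single message is transmitted" from the abstract; `K` is the shared-key resource the constructive view makes explicit.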
Security Notions for Key-Encapsulation Mechanisms
An important question for the application of KEMs is which level of KEM security is required in order for a higher-level protocol that makes use of a KEM to be secure. To define KEM security, game-based security notions for public-key encryption have been adapted to work with KEMs. A game-based definition is usually characterized by a security property that is to be maintained in the presence of an adversary launching a certain attack against the scheme in question. Both the security property and the attack are encoded only implicitly into the security game. As a consequence, the traditional answer to the above question is that for each protocol one needs to identify the appropriate security notion and provide a reduction proof to show that a KEM satisfying this notion yields a secure protocol. An alternative approach is to capture the semantics of a security notion by characterizing directly what it achieves, making explicit in which applications it can be used securely. The constructive cryptography framework [15, 16] was proposed with this general goal in mind. Resources such as different types of communication channels and keys are modeled explicitly, and the goal of a cryptographic protocol or scheme π is to construct a stronger or more useful resource S from an assumed resource R, denoted as R π→ S.
We show a transitivity property of nonlocal correlations: there exist tripartite nonsignaling correlations of which the bipartite marginals between A and B as well as B and C are nonlocal, and any tripartite nonsignaling system between A, B, and C consistent with them must be such that the bipartite marginal between A and C is also nonlocal. This property represents a step towards ruling out certain alternative models for the explanation of quantum correlations, such as hidden communication at finite speed. Whereas it is not possible to rule out this model experimentally, it is the goal of our approach to demonstrate this explanation to be logically inconsistent: either the communication cannot remain hidden, or its speed has to be infinite. The existence of a three-party system that is pairwise nonlocal is of independent interest in the light of the monogamy property of nonlocality. doi:10.1103/physrevlett.107.100402 pmid:21981484
In a recent paper, Coretti et al. introduced a new middle-ground security notion for encryption, termed indistinguishability under (chosen-ciphertext) self-destruct attacks (IND-SDA) in this paper ... However, Coretti et al. ... doi:10.1007/978-3-662-49096-9_13
When analyzing the round complexity of multi-party protocols, one often overlooks the fact that underlying resources, such as a broadcast channel, can by themselves be expensive to implement. For example, it is well known that it is impossible to implement a broadcast channel by a (deterministic) protocol in a sub-linear (in the number of corrupted parties) number of rounds. The seminal works of Rabin and Ben-Or from the early 80's demonstrated that limitations such as the above can be overcome by using randomization and allowing parties to terminate at different rounds, igniting the study of protocols over point-to-point channels with probabilistic termination and expected constant round complexity. However, absent a rigorous simulation-based definition, the suggested protocols are proven secure in a property-based manner or via ad hoc simulation-based frameworks, therefore guaranteeing limited, if any, composability. In this work, we put forth the first simulation-based treatment of multi-party cryptographic protocols with probabilistic termination. We define secure multi-party computation (MPC) with probabilistic termination in the UC framework and prove a universal composition theorem for probabilistic-termination protocols. Our theorem allows one to compile a protocol using deterministic-termination hybrids into a protocol that uses expected-constant-round protocols for emulating these hybrids, preserving the expected round complexity of the calling protocol. We showcase our definitions and compiler by providing the first composable protocols (with simulation-based security proofs) for the following primitives, relying on point-to-point channels: (1) expected-constant-round perfect Byzantine agreement, (2) expected-constant-round perfect parallel broadcast, and (3) perfectly secure MPC with round complexity independent of the number of parties. doi:10.1007/s00145-018-9279-y
doi:10.1007/978-3-662-53015-3_9 (conference version of the preceding entry). * An extended abstract of this work appeared at CRYPTO 2016.
Lecture Notes in Computer Science
The security of public-key encryption (PKE), a widely used cryptographic primitive, has received much attention in the cryptology literature. Many security notions for PKE have been proposed, including several versions of CPA-security, CCA-security, and non-malleability. These security notions are usually defined via a game that no efficient adversary can win with non-negligible probability or advantage. If a PKE scheme is used in a larger protocol, then the security of this protocol is proved by showing a reduction of breaking a certain security property of the PKE scheme to breaking the security of the protocol. A major problem is that each protocol requires in principle its own tailor-made security reduction. Moreover, which security notion of the PKE scheme should be used in a given context is a priori not evident; the employed games model the use of the scheme abstractly through oracle access to its algorithms, and the sufficiency for specific applications is neither explicitly stated nor proven. In this paper we propose a new approach to investigating the application of PKE, based on the constructive cryptography framework [24, 25]. The basic use of PKE is to enable confidential communication from a sender A to a receiver B, assuming A is in possession of B's public key. One can distinguish two relevant cases: the (non-confidential) communication channel from A to B can be authenticated (e.g., because messages are signed) or non-authenticated. The application of PKE is shown to provide the construction of a secure channel from A to B from two (assumed) authenticated channels, one in each direction, or, alternatively, if the channel from A to B is completely insecure, the construction of a confidential channel without authenticity. Composition then means that the assumed channels can either be physically realized or can themselves be constructed cryptographically, and also that the resulting channels can directly be used in any applications that require such a channel. The composition theorem of constructive cryptography guarantees the soundness of this approach, which eliminates the need for separate reduction proofs. We also revisit several popular game-based security notions (and variants thereof) and give them a constructive semantics by demonstrating which type of construction is achieved by a PKE scheme satisfying which notion. doi:10.1007/978-3-642-42033-7_8
In particular, the necessary and sufficient security notions for the above two constructions to work are CPA-security and a variant of CCA-security, respectively.
Introduction
Public-key encryption (PKE) is a cryptographic primitive devised to achieve confidential communication in a context where only authenticated (but not confidential) communication channels are available [11, 34]. The cryptographic security of PKE is traditionally defined in terms of a certain distinguishing game in which no efficient adversary is supposed to achieve a non-negligible advantage. There exists quite a wide spectrum of security notions and variants thereof. These notions are motivated by clearly captured attacks (e.g., a chosen-ciphertext attack) that should be prevented, but in some cases they seem to have been proposed mainly because they are stronger than previous notions or can be shown to be incomparable. This raises the question of which security notion for PKE is suitable or necessary for a certain higher-level protocol (using PKE) to be secure. The traditional answer to this question is that for each protocol one (actually, a cryptography expert) needs to identify the right security notion and provide a reduction proof to show that a PKE scheme satisfying this notion yields a secure protocol.1 An alternative approach is to capture the semantics of a security notion by characterizing directly what it achieves, making explicit in which applications it can be used securely. The constructive cryptography paradigm [24, 25] was proposed with this general goal in mind. Resources such as different types of communication channels are modeled explicitly, and the goal of a cryptographic protocol or scheme π is to construct a stronger or more useful resource S from an assumed resource R, denoted as R π→ S.
1 Note that this work is orthogonal to the foundational problem of designing practical PKE schemes provably satisfying certain security notions, based on realistic hardness assumptions.
The seminal CCA-secure PKE scheme based on the DDH assumption by Cramer and Shoup [9, 10] falls into this category, as do, e.g., [13, 32, 19, 21, 35].
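The game-based style of definition discussed in this entry can be made concrete with a small harness: a sketch of the IND-CPA distinguishing experiment, run here against a deliberately broken (deterministic) toy scheme so that a generic adversary wins with advantage close to 1. The harness, the toy scheme, and all names below are illustrative assumptions, not constructions from the paper.

```python
# Sketch of the IND-CPA game: the adversary gets oracle access to encryption,
# picks two messages, receives the encryption of one of them, and must guess
# which.  A secure scheme keeps the win rate near 1/2.
import secrets

def toy_keygen():
    return secrets.token_bytes(16)      # toy: one key, used via the oracle only

def toy_encrypt(pk, m):
    # Deterministic XOR "encryption" -- insecure on purpose.
    return bytes(a ^ b for a, b in zip(m, pk))

def ind_cpa_game(keygen, encrypt, adversary, trials=200):
    """Run the distinguishing experiment `trials` times; return the win rate."""
    wins = 0
    for _ in range(trials):
        pk = keygen()
        enc_oracle = lambda m: encrypt(pk, m)
        m0, m1, state = adversary.choose(enc_oracle)
        b = secrets.randbelow(2)
        challenge = encrypt(pk, (m0, m1)[b])
        wins += adversary.guess(enc_oracle, challenge, state) == b
    return wins / trials

class RepeatAdversary:
    """Wins whenever encryption is deterministic: re-encrypt m0 and compare."""
    def choose(self, enc):
        return b"\x00" * 16, b"\xff" * 16, None
    def guess(self, enc, challenge, state):
        return 0 if enc(b"\x00" * 16) == challenge else 1

rate = ind_cpa_game(toy_keygen, toy_encrypt, RepeatAdversary())
```

Against the deterministic toy scheme, `rate` is 1.0; against a properly randomized scheme this adversary would hover around 0.5. Note how the harness exposes the scheme only through oracle access, which is exactly the "implicit semantics" problem the abstract raises.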
Secure multi-party computation (MPC) allows several mutually distrustful parties to securely compute a joint function of their inputs and exists in two main variants: in synchronous MPC, parties are connected by a synchronous network with a global clock, and protocols proceed in rounds with strong delivery guarantees, whereas asynchronous MPC protocols can be deployed even in networks that deliver messages in an arbitrary order and impose arbitrary delays on them. The two models, synchronous and asynchronous, have to a large extent developed in parallel, with results on both feasibility and asymptotic efficiency improvements in either track. The most notable gap in this parallel development is with respect to round complexity. In particular, although under standard assumptions on a synchronous communication network (availability of secure channels and broadcast), synchronous MPC protocols with (exact) constant rounds have been constructed, to the best of our knowledge, thus far no constant-round asynchronous MPC protocols are known, with the best protocols requiring a number of rounds that is linear in the multiplicative depth of the arithmetic circuit computing the desired function. In this work we close this gap by providing the first constant-round asynchronous MPC protocol that is optimally resilient (i.e., it tolerates up to t < n/3 corrupted parties), adaptively secure, and makes black-box use of a pseudo-random function. It works under the standard network assumptions for protocols in the asynchronous MPC setting, namely, a complete network of point-to-point (secure) asynchronous channels with eventual delivery and asynchronous Byzantine agreement (aka consensus). We provide formal definitions of these primitives and a proof of security in the Universal Composability framework. doi:10.1007/978-3-662-53890-6_33
Our contributions. In this paper, we first formalize the asynchronous model with eventual delivery in the universal composability (UC) framework, introduce a suitable formal notion of asynchronous round complexity, and formulate the basic communication resources (such as asynchronous secure channels and asynchronous Byzantine agreement [A-BA]) as ideal functionalities in that model.4 (See Section 3.)
We then present the (to the best of our knowledge) first constant-round MPC protocol for this asynchronous setting (i.e., a protocol whose round complexity is independent of the multiplicative depth of the evaluated circuit and the number n of parties) based on standard assumptions, namely, the existence of pseudo-random functions (PRFs).5 The protocol is UC-secure in the secure-channels model with A-BA, and makes black-box use of the underlying PRF, tolerating a computationally bounded, adaptive adversary actively corrupting up to t < n/3 parties, which is optimal for this setting.6 At a high level, here is how we construct our constant-round protocol. First, we devise a constant-depth circuit for computing the keys, masked values, and (shares of the) garbled gates needed for a distributed evaluation of a Yao garbled circuit that encodes the function the parties wish to compute. This circuit is then evaluated by means of a linear-round (in the depth of the circuit and in n) asynchronous protocol. However, this circuit is Boolean, whereas all existing asynchronous protocols evaluate arithmetic circuits. To deal with this mismatch, we devise an asynchronous protocol for computing Boolean circuits by appropriately adapting the protocol by Ben-Or, Kelmer, and Rabin. Any party who receives the output from the evaluation of the Boolean circuit uses it to encrypt shares of each garbled gate, which it sends to all other parties. Finally, each party locally evaluates the (distributed) garbled circuit by decrypting incoming encrypted shares of each gate and reconstructing the function table of the gate as soon as sufficiently many consistent shares have arrived, until all gates are evaluated. Once all gates are evaluated in this fashion, the party is in possession of the output. The protocol and its analysis are presented in Section 4. Related work. Beaver, Micali, and Rogaway were the first to provide a constant-round MPC protocol in the synchronous stand-alone model. (Refer to Appendix A for a more detailed and historical account of the development of MPC protocols in both the synchronous and asynchronous settings, together with the tools that are used in each setting.) Their protocol is secure in the computational setting and tolerates an adaptive adversary who actively corrupts up to t < n/2 parties. The complexity of this protocol was improved by Damgård and Ishai, who provided the first constant-round protocol making black-box use of the underlying cryptographic primitive (a pseudo-random generator). Importantly, both works assume a broadcast channel, an assumption essential for obtaining constant-round MPC. Indeed, as proved in [21, 19], it is impossible to implement such a broadcast channel from point-to-point communication in a constant number of rounds, and although expected constant-round broadcast protocols exist in the literature (e.g., [20, 29]), using them to instantiate calls within these constructions would not yield an expected constant-round protocol. The intuitive
4 Note that while the UC framework already is asynchronous, asynchronous communication with eventual delivery has not been modeled in it so far.
5 A recent approach based on threshold fully homomorphic encryption was proposed by Cohen; see the discussion in the section on related work below.
6 Refer to the discussion in related work below on the necessity of this bound in the asynchronous setting.
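The garbled-circuit evaluation step this entry describes rests on the classical Yao gate: two random labels per wire, and an encrypted truth table from which one label per input wire unlocks exactly the correct output label. The hash-based sketch below illustrates only this single-party building block, not the distributed, share-based garbling of the protocol; the zero-padding check and all parameters are illustrative assumptions.

```python
# Toy Yao garbled gate: encrypt each truth-table row under the pair of input
# wire labels; the evaluator, holding one label per wire, can open exactly
# one row.  Hash-based "encryption" with a zero-padding validity check.
import hashlib
import random
import secrets

def garble_gate(gate_fn):
    """Garble a 2-input boolean gate; return wire labels and the garbled table."""
    labels = {w: (secrets.token_bytes(16), secrets.token_bytes(16))
              for w in ("a", "b", "out")}
    table = []
    for x in (0, 1):
        for y in (0, 1):
            pad = hashlib.sha256(labels["a"][x] + labels["b"][y]).digest()
            plain = labels["out"][gate_fn(x, y)] + b"\x00" * 16
            table.append(bytes(p ^ q for p, q in zip(pad, plain)))
    random.shuffle(table)   # hide which row corresponds to which inputs
    return labels, table

def eval_gate(label_a, label_b, table):
    """Evaluate with one label per input wire, learning only the output label."""
    pad = hashlib.sha256(label_a + label_b).digest()
    for row in table:
        plain = bytes(p ^ q for p, q in zip(pad, row))
        if plain.endswith(b"\x00" * 16):   # only the matching row decrypts cleanly
            return plain[:16]
    raise ValueError("no row decrypted cleanly")

labels, table = garble_gate(lambda x, y: x & y)        # an AND gate
out = eval_gate(labels["a"][1], labels["b"][1], table)
assert out == labels["out"][1]                          # AND(1, 1) = 1
```

In the protocol above, the rows of such tables are additionally secret-shared among the parties and reconstructed from "sufficiently many consistent shares"; the local evaluation step is the same idea as this sketch.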
This statement is shown by combining a simple information-theoretic argument with the constructive cryptography perspective on PKE (Coretti et al., Asiacrypt '13). ... doi:10.1007/978-3-662-46494-6_22
Leibniz International Proceedings in Informatics, Schloss Dagstuhl-Leibniz-Zentrum für Informatik
An important benchmark for secure multi-party computation (MPC) protocols is their round complexity. For several important MPC tasks, (tight) lower bounds on the round complexity are known. However, for some of these tasks, such as broadcast, the lower bounds can be circumvented when the termination round of every party is not a priori known and simultaneous termination is not guaranteed. Protocols with this property are called probabilistic-termination (PT) protocols. Running PT protocols in parallel affects the round complexity of the resulting protocol in somewhat unexpected ways. For instance, an execution of m protocols with constant expected round complexity might take O(log m) rounds to complete. In a seminal work, Ben-Or and El-Yaniv (Distributed Computing '03) developed a technique for parallel execution of arbitrarily many broadcast protocols while preserving expected round complexity. More recently, Cohen et al. (CRYPTO '16) devised a framework for universal composition of PT protocols and provided the first composable parallel-broadcast protocol with a simulation-based proof. These constructions crucially rely on the fact that broadcast is "privacy free," and do not generalize to arbitrary protocols in a straightforward way. This raises the question of whether it is possible to execute arbitrary PT protocols in parallel without increasing the round complexity. In this paper we tackle this question and provide both feasibility and infeasibility results. We construct a round-preserving protocol compiler, secure against a minority of actively corrupted parties, that compiles arbitrary protocols into a protocol realizing their parallel composition, while having black-box access to the underlying protocols. Furthermore, we prove that the same cannot be achieved using known techniques given only black-box access to the functionalities realized by the protocols, unless merely security against semi-honest corruptions is required, for which case we provide a protocol.
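The O(log m) phenomenon quoted above can be checked numerically: modeling each protocol's termination round as a Geometric(1/2) random variable (constant expected rounds), the slowest of m parallel instances takes about log2 m rounds on average. The model and simulation below are a sanity check under that assumption, not part of the paper.

```python
# Simulate m parallel probabilistic-termination protocols, each finishing in
# an expected-constant number of rounds, and measure how long until ALL have
# terminated.  The max of m Geometric(1/2) variables grows like log2(m).
import random
import statistics

def geometric(p=0.5):
    """Rounds until one instance terminates: succeed each round w.p. p."""
    r = 1
    while random.random() > p:
        r += 1
    return r

def expected_parallel_rounds(m, trials=2000):
    """Average rounds until all m parallel instances have terminated."""
    return statistics.mean(max(geometric() for _ in range(m))
                           for _ in range(trials))

for m in (1, 16, 256):
    print(m, round(expected_parallel_rounds(m), 2))
```

A single instance averages 2 rounds, yet 256 parallel instances average roughly log2(256) + O(1) rounds, which is exactly the blow-up that round-preserving parallel composition has to avoid.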
We would like to thank Martin Hirt, Sandro Coretti and anonymous referees for their valuable comments about the paper. ... doi:10.1007/978-3-662-47666-6_56
Three of our students (Janick Bernet, Sandro Coretti, and Christian Oberholzer) have successfully implemented some functions in a Microsoft Word add-in. ... doi:10.1109/imcsit.2009.5352721 dblp:conf/imcsit/MahlowP09
Of particular help were discussions with Joël Alwen, Christian Badertscher, Ran Canetti, Sandro Coretti, Grégory Demay, Yevgeniy Dodis, Peter Gaži, Martin Hirt, Dennis Hofheinz, Daniel Jost, Christian ... doi:10.1007/978-3-662-53641-4_1
Lecture Notes in Computer Science
These people include Divesh Aggarwal, Kfir Barhum, David Basin, Sandro Coretti, Grégory Demay, Martin Hirt, Dennis Hofheinz, Thomas Holenstein, Christoph Lucas, Sebastian Mödersheim, Krzysztof Pietrzak ... doi:10.1007/978-3-642-27375-9_3
Springer, 2014. Sandro Coretti, Ueli Maurer, Björn Tackmann, and Daniele Venturi. ... In a recent work, Coretti et al. applied split-state non-malleable codes with n states to get a weaker notion of multi-bit CCA security. ... doi:10.4230/lipics.icalp.2016.31 dblp:conf/icalp/ChandranGMPU16
Cars. 168, fol. 291. Spiral staircases became necessary in order to guarantee access to the coretti of the church.93
93 Presumably, in Girolamo's design the coretti were accessible from the palace through a ... As documents in the Archivio Spada show, the campanili there were under construction in the spring of 1654 (Heimburger Ravalli, p. 263; Sandro Corradini, »Inediti del Borromini nella ristrutturazione di S. ... doi:10.11588/rjbh.1996.31.80315