IA Scholar Query: Unbounded Inner-Product Functional Encryption with Succinct Keys.
https://scholar.archive.org/
Internet Archive Scholar query results feed
Language: en | Contact: info@archive.org | Build date: Tue, 02 Aug 2022 00:00:00 GMT | Generator: fatcat-scholar | Help: https://scholar.archive.org/help | TTL: 1440

Quantum Computing: Lecture Notes
https://scholar.archive.org/work/2pcfo6u7jzg25alp6mv6fq3w2y
This is a set of lecture notes suitable for a Master's course on quantum computation and information from the perspective of theoretical computer science. The first version was written in 2011, with many extensions and improvements in subsequent years. The first 10 chapters cover the circuit model and the main quantum algorithms (Deutsch-Jozsa, Simon, Shor, Hidden Subgroup Problem, Grover, quantum walks, Hamiltonian simulation, and HHL). They are followed by 3 chapters about complexity, 4 chapters about distributed ("Alice and Bob") settings, a chapter about quantum machine learning, and a final chapter about quantum error correction. Appendices A and B give a brief introduction to the required linear algebra and some other mathematical and computer science background. All chapters come with exercises, with some hints provided in Appendix C.
Ronald de Wolf | Tue, 02 Aug 2022 00:00:00 GMT

Verifiable Encodings for Secure Homomorphic Analytics
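The Deutsch-Jozsa algorithm mentioned in the lecture-notes abstract above admits a compact classical simulation of its key quantity (this sketch is not from the notes; function names are illustrative): after the Hadamard layer, phase oracle, and second Hadamard layer, the amplitude of the all-zeros outcome is (1/2^n)·Σ_x (-1)^f(x), which has magnitude 1 for a constant f and is exactly 0 for a balanced f.

```python
from itertools import product

def dj_zero_amplitude(f, n):
    """Amplitude of the all-zeros outcome after H^n, the phase oracle for f,
    and H^n again. Equals (1/2^n) * sum_x (-1)^f(x): magnitude 1 if f is
    constant, exactly 0 if f is balanced, so one oracle call separates them."""
    return sum((-1) ** f(x) for x in product((0, 1), repeat=n)) / 2 ** n

n = 3
constant_f = lambda x: 1        # constant function
balanced_f = lambda x: x[0]     # balanced: outputs 1 on exactly half the inputs

print(abs(dj_zero_amplitude(constant_f, n)))  # 1.0
print(abs(dj_zero_amplitude(balanced_f, n)))  # 0.0
```

The classical simulation of course takes 2^n oracle evaluations; the point of the quantum algorithm is that a single query suffices.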
https://scholar.archive.org/work/e5g7nawczngoda6h4wtyixjyu4
Homomorphic encryption, which enables the execution of arithmetic operations directly on ciphertexts, is a promising solution for protecting the privacy of cloud-delegated computations on sensitive data. However, the correctness of the computation result is not ensured. We propose two error-detection encodings and build authenticators that enable practical client-side verification of cloud-based homomorphic computations under different trade-offs and without compromising on the features of the encryption algorithm. Our authenticators operate on top of trending fully homomorphic encryption schemes over the integers based on ring learning with errors. We implement our solution in VERITAS, a ready-to-use system for verification of outsourced computations executed over encrypted data. We show that, contrary to prior work, VERITAS supports verification of any homomorphic operation, and we demonstrate its practicality for various applications, such as ride-hailing, genomic-data analysis, encrypted search, and machine-learning training and inference.
Sylvain Chatel, Christian Knabenhans, Apostolos Pyrgelis, Jean-Pierre Hubaux | Thu, 28 Jul 2022 00:00:00 GMT

Shielding Probabilistically Checkable Proofs: Zero-Knowledge PCPs from Leakage Resilience
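The VERITAS abstract above leaves the mechanics of homomorphic authenticators implicit. As a hedged toy illustration of the general idea (not VERITAS's actual encodings, which operate on RLWE ciphertexts), here is a minimal linear authenticator over plaintext integers: the tag of x is α·x mod p for a client-held secret α, so tags combine homomorphically under addition and the client can check an untrusted server's sum. All parameters are hypothetical.

```python
import random

# Toy linear homomorphic authenticator: tag(x) = alpha * x mod p.
# Tags of a sum equal the sum of tags, so a client can verify an
# additive computation performed by an untrusted server.
p = 2**61 - 1                     # a Mersenne prime modulus (toy choice)
alpha = random.randrange(1, p)    # client's secret verification key

def tag(x):
    return (alpha * x) % p

data = [17, 42, 99]
tags = [tag(x) for x in data]

# The untrusted server evaluates the additive function on data and tags alike.
result = sum(data) % p
result_tag = sum(tags) % p

print(result_tag == tag(result))            # True: the honest result verifies
print(result_tag == tag((result + 1) % p))  # False: tampering is detected
```

Verifying arbitrary homomorphic operations, as the paper claims, requires substantially richer encodings than this additive toy.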
https://scholar.archive.org/work/pxwy5brfm5hh7ft4lk3q4gvapi
Probabilistically Checkable Proofs (PCPs) allow a randomized verifier, with oracle access to a purported proof, to probabilistically verify an input statement of the form "x∈L" by querying only a few proof bits. Zero-Knowledge PCPs (ZK-PCPs) enhance standard PCPs to additionally guarantee that the view of any (possibly malicious) verifier querying a bounded number of proof bits can be efficiently simulated up to a small statistical distance. The first ZK-PCP construction of Kilian, Petrank and Tardos (STOC 1997), and the following constructions employing similar techniques, require that the honest verifier make several rounds of queries to the proof. This undesirable property, which is inherent to their technique, translates into increased round complexity in cryptographic applications of ZK-PCPs. We survey two recent ZK-PCP constructions, due to Ishai, Yang and Weiss (TCC 2016-A) and to Hazay, Venkitasubramaniam and Weiss (ITC 2021), in which the honest verifier makes a single round of queries to the proof. Both constructions use entirely different techniques compared to previous ZK-PCP constructions, by showing connections to the seemingly unrelated notion of leakage resilience. These constructions are incomparable to previous ZK-PCP constructions: while on the one hand the honest verifier makes only a single round of queries to the proof, these ZK-PCPs either obtain a smaller (polynomial) ratio between the query complexity of the honest and malicious verifiers, or obtain a weaker ZK guarantee in which the ZK simulator is not necessarily efficient.
Mor Weiss | Wed, 13 Jul 2022 00:00:00 GMT

Factoring and Pairings Are Not Necessary for IO: Circular-Secure LWE Suffices
https://scholar.archive.org/work/vrfzybcuejaxlc7juc3xg73xgi
We construct indistinguishability obfuscation (iO) solely under circular-security properties of encryption schemes based on the Learning with Errors (LWE) problem. Circular-security assumptions were used before to construct (non-leveled) fully homomorphic encryption (FHE), but our assumption is stronger and requires circular randomness-leakage-resilience. In contrast with prior works, this assumption can be conjectured to be post-quantum secure, yielding the first provably secure iO construction that is (plausibly) post-quantum secure. Our work follows the high-level outline of the recent work of Gay and Pass [STOC 2021], who showed a way to remove the heuristic step from the homomorphic-encryption-based iO approach of Brakerski, Döttling, Garg, and Malavolta [EUROCRYPT 2020]. They thus obtain a construction proved secure under circular-security assumptions on natural homomorphic encryption schemes; specifically, they use homomorphic encryption schemes based on LWE and DCR, respectively. In this work we show how to remove the DCR assumption and remain with a scheme based on the circular security of LWE alone. Along the way we relax some of the requirements in the Gay-Pass blueprint and thus obtain a scheme that is secure under a different assumption. Specifically, we do not require security in the presence of a key-cycle, but rather only in the presence of a key-randomness cycle. An additional contribution of our work is to point out a problem in one of the building blocks used by many iO candidates, including all existing provable post-quantum candidates: namely, the transformation to iO from exponentially-efficient iO (XiO) due to Lin, Pass, Seth and Telang [PKC 2016]. We show why their transformation inherently falls short of achieving the desired goal, and then rectify this situation by showing that shallow XiO (i.e., one where the obfuscator is depth-bounded) does translate to iO using LWE.
Zvika Brakerski, Nico Döttling, Sanjam Garg, Giulio Malavolta, Mikołaj Bojańczyk, Emanuela Merelli, David P. Woodruff | Tue, 28 Jun 2022 00:00:00 GMT

Reclaiming scalability and privacy in the decentralized setting
https://scholar.archive.org/work/iyr7hfriavfklb6scz5p5gdy6i
The advent of blockchains has expanded the horizon of possibilities to novel decentralised applications and protocols that were not possible before. Designing and building such applications, be it for offering new ways for humans to interact or for circumventing the shortcomings of existing blockchains, requires analysing their security with a rigorous and multi-faceted approach. Indeed, the attack surface of decentralised, trustless applications is vastly more expansive than that of classical, server-client-based ones. Desirable properties such as security, privacy and scalability are attainable via established and widely applied approaches in the centralised case, where clients can afford to trust third-party servers. Is it possible, though, for clients to self-organise and attain these properties in use cases of interest without reliance on central authorities? We examine this question in the setting of a variety of blockchain-based applications. With an explicit aim of improving the state of the art and extending the limits of possible decentralised operations with precision and robustness, the present thesis explores, builds, analyses, and improves upon payments, content curation and decision making.
Orfeas Stefanos Thyfronitis Litos, University Of Edinburgh, Aggelos Kiayias, Kousha Etessami | Wed, 22 Jun 2022 00:00:00 GMT

Proceedings of the 2022 Joint Workshop of the German Research Training Groups in Computer Science
https://scholar.archive.org/work/lvykkw5kcfhlvolc6paa2sxczu
Having spent two successive years running online to prevent the spread of the coronavirus, the traditional annual meeting of the German Research Training Groups (RTGs) funded by the Deutsche Forschungsgemeinschaft (DFG) in the field of computer science returns to Schloss Dagstuhl – Leibniz Center for Informatics, one of the world's premier venues for computer science-related seminars. Returning to Dagstuhl and hosting this meeting as an in-person-only event was a deliberate decision to revive interaction modes that many of the funded researchers had yet to experience: fostering personal interchange of ideas and experiences in order to strengthen the connection within the German computer science community. This volume documents the abstracts of the research topics of funded researchers in the participating RTGs. The event was jointly organized by RTG 2475 (Cybercrime and Forensic Computing) and RTG 2428 (ConVeY: Continuous Verification of Cyber-Physical Systems). It took place between Sunday, June 12 and Wednesday, June 15, 2022, as in-person-only Dagstuhl Event 22243. The meeting featured the usual sequence of research presentations by funded researchers, networking meetings for PIs and RTG coordinators, as well as two invited talks, one by Professor Martina Seidl (JKU Linz, Austria) on "Competitions as Scientific Method" and another by Professor Jennifer Byrne (School of Medical Sciences, The University of Sydney, Australia) titled "An introduction to research paper mills". Because last year's event marked the 25th anniversary of the workshop series, it featured a live interview with Professor Otto Spaniol, who had initiated the workshop series in 1996. We document the interview in this volume.
Felix Freiling, Helmut Seidl | Tue, 17 May 2022 00:00:00 GMT

Cyber Counterintelligence: Assets, Audiences, and the Rise of Disinformation
https://scholar.archive.org/work/p67uwtb5bvcefbn5sjx5pqtl6u
In April 2021, Facebook suffered yet another data breach that affected hundreds of millions of accounts. The private information of over 500 million people had been stolen by hackers - names, phone numbers, email addresses, locations and more. The cache is potentially valuable to a host of malicious actors, from criminals motivated by financial gain to hostile foreign actors microtargeting voters through information operations. It follows an evolution of threats in cyberspace targeting government agencies, utilities, businesses, and electoral systems. With a focus on state-based actors, this thesis considers how state threat perception of cyberspace has developed, and if that perception is influencing the evolution of cyber counterintelligence (CCI) as a response to cyber-enabled threats such as disinformation. Specifically, this thesis traces the threat elevation of cyberspace through the evolution of the published national security documentation of the United Kingdom, asking how threat elevation corresponds to the development of CCI, if at all, and what sort of responses these processes generate to combat the rising threat of disinformation campaigns conducted against liberal democracies. Democratic audiences are a target of influence and information operations, and, as such, are an intelligence and security vulnerability for the state. More so than in previous decades, to increase national resilience and security in cyberspace, the individual, as part of the democratic audience, is required to contribute to personal counterintelligence and security practices. 
This research shows that while assets and infrastructure have undergone successful threat elevation processes, democratic audiences have been insufficiently recognised as security vulnerabilities and are susceptible to cyber-enabled disinformation.
CJ O'Connor, The Australian National University | Thu, 21 Apr 2022 00:00:00 GMT

Functional encryption: definitional foundations and multiparty transformations
https://scholar.archive.org/work/omlzdabwgrhazgzdcm2qfdowqa
Classical cryptographic primitives do not allow for any fine-grained access control over encrypted data. From an encryption of some data x, a decryptor, who is in possession of a decryption key, can either obtain the whole data x or nothing. The notion of functional encryption overcomes this drawback and enables access control over encrypted data. In this setting, a setup generator is responsible for generating the public parameters and so-called functional keys. These functional keys are decryption keys that are associated with a function f such that, when used in the decryption procedure, the decryptor obtains f(x), the result of the function f applied to the encrypted data x. The standard security definition of functional encryption prevents a malicious decryptor from learning more about the encrypted data than what can be obtained from the functional keys it owns. In this thesis, we introduce the notion of consistency, a security definition that protects an honest decryptor against a malicious encryptor and/or setup generator. We formally introduce this notion using different security games and show that our notions are completely separated from existing confidentiality notions. Additionally, we analyze existing schemes and show how they can be modified to achieve consistency. Furthermore, we construct black-box compilers that turn any functional encryption scheme into a consistent one. Finally, we also analyze consistency in the universal composability (UC) framework and show that the consistency games imply UC security. A more general notion of functional encryption is the notion of multi-client functional encryption, which allows a decryptor to evaluate multi-input functions on multiple ciphertexts generated by several different clients. This notion also requires a setup generator that generates the encryption keys for the different clients as well as the functional keys for the decryptor.
A corrupted setup generator is able to compromise the privacy of all the clients in the system by generatin [...]
Hendrik Waldner, University Of Edinburgh, Aggelos Kiayias, Markulf Kohlweiss, Vassilis Zikas | Thu, 31 Mar 2022 00:00:00 GMT

Cryptography from Pseudorandom Quantum States
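The functional-encryption abstract above describes the interface informally: a setup generator issues functional keys, and decrypting an encryption of x with a key for f yields f(x) and nothing more. The following intentionally insecure, secret-key toy (all names illustrative; a real scheme would never place the master key inside sk_f) only makes the syntax and the correctness property Dec(sk_f, Enc(x)) = f(x) concrete:

```python
import secrets

# Toy secret-key "functional encryption": Setup, KeyGen, Enc, Dec.
# Insecure by construction; it illustrates the interface, not a scheme.
def setup():
    msk = secrets.randbits(64)      # master secret key
    return msk

def keygen(msk, f):
    return (msk, f)                 # functional key for the function f

def enc(msk, x):
    return x ^ msk                  # toy masking, NOT real encryption

def dec(sk_f, ct):
    msk, f = sk_f
    return f(ct ^ msk)              # the decryptor obtains f(x), not x

msk = setup()
sk_mod10 = keygen(msk, lambda x: x % 10)   # key for "last decimal digit of x"
ct = enc(msk, 1234)
print(dec(sk_mod10, ct))                   # 4
```

The thesis's consistency notion asks, in addition, that a malicious setup generator or encryptor cannot make an honest decryptor accept an inconsistent value; this toy makes no such guarantee.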
https://scholar.archive.org/work/7bniib53pnawto2l5fhg2czzeu
Pseudorandom states, introduced by Ji, Liu and Song (Crypto'18), are efficiently-computable quantum states that are computationally indistinguishable from Haar-random states. One-way functions imply the existence of pseudorandom states, but Kretschmer (TQC'20) recently constructed an oracle relative to which there are no one-way functions but pseudorandom states still exist. Motivated by this, we study the intriguing possibility of basing interesting cryptographic tasks on pseudorandom states. We construct, assuming the existence of pseudorandom state generators that map a λ-bit seed to an ω(log λ)-qubit state, (a) statistically binding and computationally hiding commitments and (b) pseudo one-time encryption schemes. A consequence of (a) is that pseudorandom states are sufficient to construct maliciously secure multiparty computation protocols in the dishonest-majority setting. Our constructions are derived via a new notion called pseudorandom function-like states (PRFS), a generalization of pseudorandom states that parallels the classical notion of pseudorandom functions. Beyond the above two applications, we believe our notion can effectively replace pseudorandom functions in many other cryptographic applications.
Prabhanjan Ananth, Luowen Qian, Henry Yuen | Tue, 15 Mar 2022 00:00:00 GMT

Foundations of decentralised privacy
https://scholar.archive.org/work/bid4o2gkj5alxbwjmzc3s426ge
Distributed ledgers, and specifically blockchains, have been an immensely popular investment in the past few years. The heart of their popularity is due to their novel approach toward financial assets: They replace the need for central, trusted institutions such as banks with cryptography, ensuring no one entity has authority over the system. In the light of record distrust in many established institutions, this is attractive both as a method to combat institutional control and to demonstrate transparency. What better way to manage distrust than to embrace it? While distributed ledgers have achieved great things in removing the need to trust institutions, most notably the creation of fully decentralised assets, their practice falls short of the idealistic goals often seen in the field. One of their greatest shortcomings lies in a fundamental conflict with privacy. Distributed ledgers and surrounding technologies rely heavily on the transparent replication of data, a practice which makes keeping anything hidden very difficult. This thesis makes use of the powerful cryptography of succinct non-interactive zero-knowledge proofs to provide a foundation for re-establishing privacy in the decentralised setting. It discusses the security assumptions and requirements of succinct zero-knowledge proofs at length, establishing a new framework for handling security proofs about them, and reducing the setup required to that already present in commonly used distributed ledgers. It further demonstrates the possibility of privacy-preserving proof-of-stake, removing the need for costly proofs-of-work for a privacy-focused distributed ledger.
Finally, it lays out a solid foundation for a smart contract system supporting privacy – putting into the hands of contract authors the tools necessary to innovate and introduce new privacy features.
Thomas Kerber, University Of Edinburgh, Aggelos Kiayias, Markulf Kohlweiss | Tue, 01 Feb 2022 00:00:00 GMT

Quantum cryptography with classical communication: parallel remote state preparation for copy-protection, verification, and more
https://scholar.archive.org/work/n2gcpkczwjgedgucslq7ulj3d4
Quantum mechanical effects have enabled the construction of cryptographic primitives that are impossible classically. For example, quantum copy-protection allows for a program to be encoded in a quantum state in such a way that the program can be evaluated, but not copied. Many of these cryptographic primitives are two-party protocols, where one party, Bob, has full quantum computational capabilities, and the other party, Alice, is only required to send random BB84 states to Bob. In this work, we show how such protocols can generically be converted to ones where Alice is fully classical, assuming that Bob cannot efficiently solve the LWE problem. In particular, this means that all communication between (classical) Alice and (quantum) Bob is classical, yet they can still make use of cryptographic primitives that would be impossible if both parties were classical. We apply this conversion procedure to obtain quantum cryptographic protocols with classical communication for unclonable encryption, copy-protection, computing on encrypted data, and verifiable blind delegated computation. The key technical ingredient for our result is a protocol for classically-instructed parallel remote state preparation of BB84 states. This is a multi-round protocol between (classical) Alice and (quantum polynomial-time) Bob that allows Alice to certify that Bob must have prepared n uniformly random BB84 states (up to a change of basis on his space). Furthermore, Alice knows which specific BB84 states Bob has prepared, while Bob himself does not. Hence, the situation at the end of this protocol is (almost) equivalent to one where Alice sent n random BB84 states to Bob. 
This allows us to replace the step of preparing and sending BB84 states in existing protocols by our remote state preparation protocol in a generic and modular way.
Alexandru Gheorghiu, Tony Metger, Alexander Poremba | Mon, 31 Jan 2022 00:00:00 GMT

Locality-Preserving Hashing for Shifts with Connections to Cryptography
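The abstract above revolves around random BB84 states; as a small, hedged reminder of what those are (this sketch is not from the paper), the four BB84 states are the computational-basis and Hadamard-basis single-qubit states, and a state from one basis has squared overlap 1/2 with each state of the other basis:

```python
import random

# The four BB84 states as real 2-vectors; s = 1/sqrt(2).
# Basis 0 (computational): |0> = (1, 0), |1> = (0, 1)
# Basis 1 (Hadamard):      |+> = (s, s), |-> = (s, -s)
s = 2 ** -0.5
BB84 = {(0, 0): (1.0, 0.0), (0, 1): (0.0, 1.0),
        (1, 0): (s, s),     (1, 1): (s, -s)}

def random_bb84_state():
    """A uniformly random BB84 state, as Alice would send (or, in the
    protocol above, certify that Bob has prepared)."""
    basis, bit = random.randrange(2), random.randrange(2)
    return basis, bit, BB84[(basis, bit)]

def overlap_sq(u, v):
    """Squared inner product |<u|v>|^2 between two real state vectors."""
    ip = u[0] * v[0] + u[1] * v[1]
    return ip * ip

# Within a basis the states are orthogonal; across bases the squared
# overlap is 1/2, which is why measuring in the wrong basis reveals nothing.
print(overlap_sq(BB84[(0, 0)], BB84[(0, 1)]))                # 0.0
print(round(overlap_sq(BB84[(0, 0)], BB84[(1, 0)]), 10))     # 0.5
```

The protocol's point is that these states need not be physically sent: a classical Alice can certify that Bob holds them.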
https://scholar.archive.org/work/aw5tjgtfkvf7peorwi54qahpdq
Can we sense our location in an unfamiliar environment by taking a sublinear-size sample of our surroundings? Can we efficiently encrypt a message that only someone physically close to us can decrypt? To solve this kind of problem, we introduce and study a new type of hash functions for finding shifts in sublinear time. A function h:{0,1}^n→ℤ_n is a (d,δ) locality-preserving hash function for shifts (LPHS) if: (1) h can be computed by (adaptively) querying d bits of its input, and (2) Pr[h(x) ≠ h(x ≪ 1) + 1] ≤ δ, where x is random and ≪ 1 denotes a cyclic shift by one bit to the left. We make the following contributions.
* Near-optimal LPHS via Distributed Discrete Log: We establish a general two-way connection between LPHS and algorithms for distributed discrete logarithm in the generic group model. Using such an algorithm of Dinur et al. (Crypto 2018), we get LPHS with near-optimal error of δ = Õ(1/d^2). This gives an unusual example of the usefulness of group-based cryptography in a post-quantum world. We extend the positive result to non-cyclic and worst-case variants of LPHS.
* Multidimensional LPHS: We obtain positive and negative results for a multidimensional extension of LPHS, making progress towards an optimal 2-dimensional LPHS.
* Applications: We demonstrate the usefulness of LPHS by presenting cryptographic and algorithmic applications. In particular, we apply multidimensional LPHS to obtain an efficient "packed" implementation of homomorphic secret sharing and a sublinear-time implementation of location-sensitive encryption whose decryption requires a significantly overlapping view.
Elette Boyle, Itai Dinur, Niv Gilboa, Yuval Ishai, Nathan Keller, Ohad Klein | Sun, 09 Jan 2022 00:00:00 GMT

ABE for Circuits with Constant-Size Secret Keys and Adaptive Security
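The shift property in the LPHS definition above can be made concrete with a toy hash (not from the paper): taking h(x) to be the index of the lexicographically smallest rotation of x reads all n bits (d = n, so it is not query-efficient like the paper's sublinear constructions), but it satisfies h(x) = h(x ≪ 1) + 1 (mod n) exactly whenever x is aperiodic, hence with δ close to 0 for random x.

```python
import random

def shift1(x):
    """Cyclic shift of the bit-string x by one position to the left."""
    return x[1:] + x[:1]

def h(x):
    """Toy shift hash: the index of the lexicographically smallest rotation.
    Reads all n bits (d = n); satisfies h(x) = h(x << 1) + 1 (mod n)
    whenever x is aperiodic, which holds for random x w.h.p."""
    n = len(x)
    return min(range(n), key=lambda i: x[i:] + x[:i])

random.seed(1)
n, trials = 32, 200
ok = sum(
    h(x) == (h(shift1(x)) + 1) % n
    for x in (''.join(random.choice('01') for _ in range(n)) for _ in range(trials))
)
print(ok, trials)   # nearly all trials satisfy the shift property
```

The paper's contribution is achieving a comparable δ while querying only d = o(n) bits, which this full-read toy makes no attempt at.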
https://scholar.archive.org/work/k35wvwejhvdeximlbgyqcl5jja
An important theme in research on attribute-based encryption (ABE) is minimizing the sizes of the secret keys and ciphertexts. In this work, we present two new ABE schemes with constant-size secret keys, that is, the key size is independent of the sizes of policies or attributes and depends only on the security parameter λ.
• We construct the first key-policy ABE scheme for circuits with constant-size secret keys, |sk_f| = poly(λ), which concretely consist of only three group elements. The previous state-of-the-art construction by [Boneh et al., Eurocrypt'14] has key size polynomial in the maximum depth d of the policy circuits, |sk_f| = poly(d, λ). Our new scheme removes this dependency of the key size on d while keeping the ciphertext size the same, which grows linearly in the attribute length and polynomially in the maximal depth, |ct_x| = |x| poly(d, λ).
• We present the first ciphertext-policy ABE scheme for Boolean formulae that simultaneously has constant-size keys and succinct ciphertexts of size independent of the policy formulae, in particular, |sk_f| = poly(λ) and |ct_x| = poly(|x|, λ). Concretely, each secret key consists of only two group elements. Previous ciphertext-policy ABE schemes either have succinct ciphertexts but non-constant-size keys [Agrawal-Yamada, Eurocrypt'20; Agrawal-Wichs-Yamada, TCC'20], or constant-size keys but large ciphertexts that grow with the policy size as well as the attribute length. Our second construction is the first ABE scheme achieving double succinctness, where both keys and ciphertexts are smaller than the corresponding attributes and policies tied to them.
Our constructions feature new ways of combining lattices with pairing groups for building ABE and are proven selectively secure based on LWE and in the generic (pairing) group model.
We further show that when replacing the LWE assumption with its adaptive variant, introduced in [Quach-Wee-Wichs, FOCS'18], the constructions become adaptively secure.
Hanjun Li, Huijia Lin, Ji Luo

Formal Foundations for Anonymous Communication
https://scholar.archive.org/work/o2wnazwljjcdxjjggtq35ntzfq
With every online activity we leave digital footprints. Companies and governments use the private information that can be derived from the vast amounts of data in these online traces to manipulate their users and citizens. Anonymous communication networks have been proposed as a countermeasure. However, these lack comprehensive formal foundations, and consequently comparison between different approaches is possible only to a very limited extent. With a common foundation shared by all researchers and developers of anonymous communication networks, misunderstandings can be avoided and the urgently needed development of these networks is accelerated. With comparability between solutions, the networks best suited to a given use case can be identified more readily, and development efforts can thus be directed to projects in a more targeted manner. Furthermore, formal foundations and comparability make it possible to gain a deeper understanding of the limits and effects of the techniques employed. This thesis first provides new insights into general formalisations of anonymous communication before concentrating on the techniques most widely used in practice: onion routing and mix networks. First, comparability between privacy goals is ensured by defining them formally and comparing them with one another, resulting in an extensive hierarchy of distinct privacy goals. Second, proposed networks are analysed in order to identify their basic building blocks and to locate the protection they provide within the hierarchy. These foundations make it possible to discover and resolve conflicts and weaknesses in existing work. More precisely, they reveal that formal versions with different strengths of protection have arisen from the same informal definition.
Furthermore, this thesis uses the notions to compare existing impossibility results for anonymous communication. In doing so, [...]
Christiane Kuhn, Thorsten Strufe

Fictions of Distance in Recent American Literature
https://scholar.archive.org/work/lmi6dlhuevcwrc6a4ltfv7dz5q
In the past few decades, various attempts have been made to overcome exactly the kind of self-reflexive detachment practiced by critics in the seventies and eighties. As Rita Felski, one of the most influential proponents of "postcritique," explains, their critical practices ("symptomatic reading, ideology critique, Foucauldian historicism") demand "an attitude of vigilance, detachment, and wariness (suspicion)" and equate a pose of disinterested disengagement with intellectual rigor (3; emphasis in original). Drawing on theories of affect, the postcritical movement implies that critique has learned nothing from its own criticism of Enlightenment ideals but still thinks that aloofness equals intellectual adeptness. Hence the need to move beyond critique itself and become postcritical. Yet, by taking critique as the object of its criticism, postcritique may not be so much overcoming distance as once more extending the critical project, taking not merely critical methods but the very person of the critic – affects, feelings, attitudes, and all – into its fold. Self-evidently, such an attack on critical detachment and "suspicion" as the prevailing "mood and method" (Felski 1) of the twentieth century downplays the crucial role distance has played in the formation of the modern intellect more generally: If we believe Hannah Arendt, modernity was ushered in by Galileo, who used a telescope – a technology of distance, if there ever was one – to demonstrate that, contrary to sensory perception and common belief, the earth and its inhabitants are not the center of the universe. "The immediate philosophic reaction to this reality was not exultation but Cartesian doubt," Arendt writes, "by which modern philosophy – that 'school of suspicion' as Nietzsche once called it – was founded" (260).
If distance, and by extension suspicion (primarily the suspicion of oneself), is understood as the prevailing ethos of modern, post-Enlightenment thought, overcoming it by a simple tuning of one's "mood," as Felski suggests, seems unlikely and ultimately undesirable. The way out of this critical impasse is indicated by Anderson. Lamenting what she takes to be "incoherence on the subject of detachment in contemporary theory," Anderson calls for a more nuanced understanding of distance as a distinctly modern intellectual practice. In her view – and the above overview seems to support this claim – "contemporary thinkers generate false oppositions and exclusions in their consideration of differing modes and practices of detachment," which results in "truncated forms of theory" (24). Instead of assuming that distance is both a constant and constantly good or bad, Anderson argues that being able to "imagine critical distance as a temporary vantage," and "disengagement" as a "'stance' . . . among others," allows us to see the value of detachment as "an aspiration more than a certainty" (32-33). [...] to the potential for significant togetherness toward which human beings may aspire in their interactions with each other – a togetherness, however, which paradoxically consists of a communion between separate, distinct beings. According to affect theorist Lauren Berlant, on the other hand, intimacy is even less real than this; rather, intimacy denotes a "utopian" vision which operates in the interstices between fantasy and reality (282). For Berlant, intimacy does not constitute a feeling or specific kind of emotional attachment, as it does for Yousef, but "an aesthetic, an aesthetic of attachments" (285). If intimacy is rarely instantiated in reality – "virtually no one knows how to do intimacy," claims Berlant (282) – most critics seem to think that it is even less likely to find expression in virtual spaces.
In Alone Together, psychologist and social scientist Sherry Turkle claims that digital technology has become "the architect of our intimacies," to the detriment of the real thing: "Our networked life allows us to hide from each other, even as we are tethered to each other," and instead of actual intimate connections, we settle for "our new intimacy with machines," which are by definition unable to hold up their end of the bargain by sustaining the sense of significant togetherness we crave (1-3). Whereas Berlant and Yousef emphasize the constructed, linguistically-mediated nature of all intimacy, Turkle draws a sharp distinction between "authentic" intimate bonds between human beings and the "inauthentic," simulated intimacies that occur in technologically-mediated environments, especially those that take place between human and non-human actors: For her study, Turkle interviewed hundreds of people whose desire for intimacy was not directed at other people, or even at pets or other non-human animals, but at sociable robots, such as Paro, the Japanese robot in the shape of a baby seal marketed as a companion for elderly people. Turkle's pessimism seems both reasonable and out of place: On the one hand, there is no question that the lonely and the isolated deserve more than battery-operated toys to keep them company; on the other, human beings have always relied on technology where actual face-to-face interaction is either undesirable or impossible. More broadly, Turkle's nostalgic yearning for a golden age of intimacy recalls deep-seated anxieties about technology and its capability to erode "authentic" human contact. Such lamentations over a lost art of intimacy are especially compelling in the ongoing pandemic – and not entirely without cause, given the alarming spikes in reported loneliness and feelings of isolation after its onset.
One way to gain some critical distance from these anxieties about intimacy in the digitally mediated world is to examine an older, analogue form of mediated intimacy, one that predates social media, video conferences, and artificial intelligence: literature. If what makes digital technologies "seductive," in Turkle's opinion, is their ability to provide an "illusion of companionship without the demands of friendship" (3), then this is no less true about the allure of reading. Would Turkle think that readers of literature also "navigate intimacy by skirting it" altogether (10)? Unpacking these prejudices serves as an exercise in critical (self-)distancing. Let us consider the young woman who tells Turkle that she would exchange her human boyfriend for a robot one in a heartbeat were such technology available. For Turkle, it comes as a genuine shock that certain people would find engaging in such "inauthentic" intimacies "preferable . . . to . . . the sometimes messy, often frustrating, and always complex" kind of intimacy found in the "real" world (7). Evidently, in Turkle's view, not only are there "authentic" and "inauthentic" kinds of intimacy, but people might also be unknowingly seeking illusory intimacies, the true nature of which it is the critic's job to expose. Were she less certain of her own objectivity, and more prone to the kind of hesitant, self-reflexive detachment called for by Anderson, Turkle might have noted the biases that underlie her questioning - at the very least by placing her inquiry in a historical context of similar anxieties concerning various new media. As literary critics, we suggest contextualizing such anxieties by examining the paradoxes of intimacy instantiated by the novel. On the one hand, the rise of the novel in the eighteenth century coincides with what Jürgen Habermas has discussed as the demarcation of Intimsphäre, the intimate, private sphere of the bourgeois home, from Öffentlichkeit, the public sphere.
Concomitant with this "transformation of the public sphere," Habermas argues, the relations between "author, work, and public . . . became intimate mutual relationships between privatized individuals," and especially the "author and the reader became actors who 'talked heart to heart'" (50). According to Habermas, the modern "psychological novel fashioned for the first time the kind of realism that allowed anyone to enter into the literary action as a substitute for his own, to use the relationships between the figures, between the author, the characters, and the reader, as a substitute for reality" (51) - or as a substitute for 'real' intimacy. The novel, according to this view, is from the start an art of the intimate, which affords readers an unprecedented familiarity with the intimate inner lives of (fictional) others, but also has the potential to entrap them in a false, in Turkle's words "inauthentic," form of intimacy. Yet, Habermas argues, only between such psychologically self-distanced, self-reflexive individuals could "the public sphere of a rational-critical debate" emerge (51), proving that even "inauthentic" intimacy can have a potentially transformative effect on society. On the other hand, the novel has been criticized for exactly the opposite reason - for a failure to sustain any form of intimacy. Thinking about the novel as an art of isolation could begin with Walter Benjamin's 1936 essay "The Storyteller," which is rendered throughout in a language of intimacy and distance. Regardless of whether one agrees with Benjamin's claim that "the [...]
[...] to the physical, effectively claiming in poetry an "elementary communication" comparable to the mutual, tactile correspondence of a handshake, [...] the meaning of the word fiction, from the Latin fingere, to "form" or "contrive."
The poetics of postmodernism, far from being overly detached and disengaged, are revealed to be the very opposite of this: By seeking to represent the tactile act of crafting that is involved in the making of all fictions, postmodern fiction is a testament to the idea that the art of poetics is an inherently intimate one. Beyond its intimate poetics, postmodern fiction also demonstrates a deep, if ambivalent, engagement with the idea of intimacy on the level of content. A comprehensive study of intimacy in postmodern fiction has so far not been undertaken, but our own ongoing research indicates potential avenues for conducting such a survey. Many classics of postmodern literature are doubtful or suspicious about the possibility of intimacy. Frequently, when the potential for intimacy emerges within the postmodern novel - whether between fictional characters or between the reader and the implied author - it assumes the [...] which is to say they are not simply implied, not primarily to be understood as rhetorical constructions or immortalized placeholders. The text's existence depends not only on a writer but also on a particular reader at a particular place and time" (206; emphasis in original). Through an emphatic acknowledgment of readers as present within their texts, the writers in question employ a rhetoric of sincerity to come out of 'hiding' and expose themselves with all [...]
work_lmi6dlhuevcwrc6a4ltfv7dz5q
On the Impossibility of Algebraic Vector Commitments in Pairing-Free Groups
https://scholar.archive.org/work/ferwlccjyrbz3mjwggue43xksu
Vector Commitments allow one to (concisely) commit to a vector of messages so that one can later (concisely) open the commitment at selected locations. In the state of the art of vector commitments, algebraic constructions have emerged as a particularly useful class, as they enable advanced properties, such as stateless updates, subvector openings and aggregation, that are for example unknown in Merkle-tree-based schemes. In spite of their popularity, algebraic vector commitments remain poorly understood objects. In particular, no construction in standard prime order groups (without pairing) is known. In this paper, we shed light on this state of affairs by showing that a large class of concise algebraic vector commitments in pairing-free, prime order groups are impossible to realize. Our results also preclude any cryptographic primitive that implies the algebraic vector commitments we rule out, as special cases. This means that we also show the impossibility, for instance, of succinct polynomial commitments and functional commitments (for all classes of functions including linear forms) in pairing-free groups of prime order.Dario Catalano, Dario Fiore, Rosario Gennaro, Emanuele Giuntawork_ferwlccjyrbz3mjwggue43xksu
Bounded Functional Encryption for Turing Machines: Adaptive Security from General Assumptions
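The Merkle-tree-based schemes that the abstract above contrasts with algebraic constructions can be sketched in a few lines. Below is a minimal, illustrative hash-based vector commitment (the function names, the power-of-two length restriction, and the domain-separation prefixes are our own choices, not taken from the paper): committing hashes the vector into a single root, and an opening at position i is one sibling hash per tree level.

```python
import hashlib

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def leaf(m: bytes) -> bytes:
    return H(b"\x00" + m)          # domain-separate leaves from inner nodes

def node(left: bytes, right: bytes) -> bytes:
    return H(b"\x01" + left + right)

def commit(msgs):
    """Commit to a vector of messages; returns the root and all tree levels."""
    assert len(msgs) and (len(msgs) & (len(msgs) - 1)) == 0, "power-of-two length"
    levels = [[leaf(m) for m in msgs]]
    while len(levels[-1]) > 1:
        cur = levels[-1]
        levels.append([node(cur[i], cur[i + 1]) for i in range(0, len(cur), 2)])
    return levels[-1][0], levels

def open_at(levels, i):
    """Opening proof for position i: sibling hashes along the path to the root."""
    path = []
    for lvl in levels[:-1]:
        path.append(lvl[i ^ 1])    # sibling at this level
        i >>= 1
    return path

def verify(root, i, m, path):
    """Recompute the root from the claimed message and the sibling path."""
    h = leaf(m)
    for sib in path:
        h = node(h, sib) if i % 2 == 0 else node(sib, h)
        i >>= 1
    return h == root
```

The proof for a vector of length n has log2(n) hashes, so openings are concise; what such hash-based schemes lack, per the abstract, are the algebraic features (stateless updates, subvector openings, aggregation) that motivate the pairing-free question.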
https://scholar.archive.org/work/7a5qop5c3nhxjikzyppafjsvjq
The recent works of Agrawal et al. [Crypto '21] and Goyal et al. [Eurocrypt '22] concurrently introduced the notion of dynamic bounded collusion security for functional encryption (FE) and showed a construction satisfying the notion from identity based encryption (IBE). Agrawal et al. [Crypto '21] further extended it to FE for Turing machines in the non-adaptive simulation setting from the sub-exponential learning with errors (LWE) assumption. Concurrently, the work of Goyal et al. [Asiacrypt '21] constructed attribute based encryption (ABE) for Turing machines achieving adaptive indistinguishability based security against bounded (static) collusions from IBE, in the random oracle model. In this work, we significantly improve the state of the art for dynamic bounded collusion FE and ABE for Turing machines by achieving adaptive simulation-style security from a broad class of assumptions, in the standard model. In more detail, we obtain the following results: 1. We construct an adaptively secure (AD-SIM) FE for Turing machines, supporting dynamic bounded collusion, from sub-exponential LWE. This improves the result of Agrawal et al., which achieved only non-adaptive (NA-SIM) security in the dynamic bounded collusion model. 2. Towards achieving the above goal, we construct a ciphertext policy FE scheme (CPFE) for circuits of unbounded size and depth, which achieves AD-SIM security in the dynamic bounded collusion model from IBE and laconic oblivious transfer (LOT). Both IBE and LOT can be instantiated from a large number of mild assumptions such as the computational Diffie-Hellman assumption, the factoring assumption, and polynomial LWE. This improves the construction of Agrawal et al., which could only achieve NA-SIM security for CPFE supporting circuits of unbounded depth from IBE. 3.
We construct an AD-SIM secure FE for Turing machines, supporting dynamic bounded collusions, from LOT, ABE for NC1 (or NC) and private information retrieval (PIR) schemes which satisfy certain properties. This significantly expands the class of assumptions on which AD-SIM secure FE for Turing machines can be based. In particular, it leads to new constructions of FE for Turing machines including one based on polynomial LWE and one based on the combination of the bilinear decisional Diffie-Hellman assumption and the decisional Diffie-Hellman assumption on some specific groups. In contrast, the only prior construction by Agrawal et al. achieved only NA-SIM security and relied on sub-exponential LWE. To achieve the above result, we define the notion of CPFE for read only RAM programs and succinct FE for LOT, which may be of independent interest. 4. We also construct an ABE scheme for Turing machines which achieves AD-IND security in the standard model supporting dynamic bounded collusions. Our scheme is based on IBE and LOT. Previously, the only known candidate that achieved AD-IND security from IBE by Goyal et al. relied on the random oracle model.Shweta Agrawal, Fuyuki Kitagawa, Anuja Modi, Ryo Nishimaki, Shota Yamada, Takashi Yamakawawork_7a5qop5c3nhxjikzyppafjsvjq
Counting Vampires: From Univariate Sumcheck to Updatable ZK-SNARK
https://scholar.archive.org/work/rn5gtop3ybcvxkij7v2pupzhmy
We propose a univariate sumcheck argument Count with essentially optimal communication efficiency of one group element. While the previously most efficient univariate sumcheck argument of Aurora is based on polynomial commitments, Count is based on inner-product commitments. We use Count to construct a new pairing-based updatable and universal zk-SNARK Vampire with the shortest known argument length (five group elements and two integers) for NP. In addition, Vampire uses the aggregated polynomial commitment scheme of Boneh et al. Unlike the previous (efficient) work, both Count and Vampire have an updatable SRS that consists of non-consequent monomials.Helger Lipmaa, Janno Siim, Michal Zajacwork_rn5gtop3ybcvxkij7v2pupzhmy
Structure-Preserving Compilers from New Notions of Obfuscations
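A standard algebraic fact underlying univariate sumcheck arguments such as the one in the abstract above: if H is a multiplicative subgroup of order n in a prime field F_q and p is a polynomial of degree less than n, then the sum of p over H equals n times p's constant coefficient, because the power sums of H vanish except at exponents divisible by n. Below is a toy numeric check of that identity (the parameters q = 97, n = 8 and all variable names are our own illustration, not the Count protocol):

```python
# Toy check of the univariate sumcheck identity:
#   sum_{h in H} p(h) = n * p_0  (mod q)
# for a multiplicative subgroup H of order n in F_q and deg(p) < n,
# where p_0 is the constant coefficient of p.
q = 97   # prime field F_97; the multiplicative group F_97* has order 96
n = 8    # desired subgroup order, dividing q - 1

# Find a generator of the order-n subgroup: candidates x^((q-1)/n) have order
# dividing n; keep the first one whose order is exactly n (i.e. c^(n/2) != 1).
g = next(c for c in (pow(x, (q - 1) // n, q) for x in range(2, q))
         if pow(c, n // 2, q) != 1)
H = [pow(g, i, q) for i in range(n)]
assert len(set(H)) == n            # g really generates n distinct elements

def poly_eval(coeffs, x, q):
    """Horner evaluation of sum(coeffs[k] * x^k) mod q."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % q
    return acc

coeffs = [5, 3, 11, 7, 2, 0, 1]    # arbitrary polynomial of degree 6 < n = 8
lhs = sum(poly_eval(coeffs, h, q) for h in H) % q
rhs = (n * coeffs[0]) % q
assert lhs == rhs
```

This identity is what lets a prover reduce "p sums to sigma over H" to a polynomial-divisibility claim that can be checked with a single commitment opening; Count replaces the polynomial commitments used for this step in Aurora with inner-product commitments.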
https://scholar.archive.org/work/odnpzt2iojax7fncjmickkqmne
The dream of software obfuscation is to take programs, as they are, and then compile them into obfuscated versions that hide their secret inner workings. In this work we investigate notions of obfuscation weaker than virtual black-box (VBB) but which still allow obfuscating cryptographic primitives while preserving their original functionality as much as possible. In particular we propose two new notions of obfuscation, which we call oracle-differing-input obfuscation (odiO) and oracle-indistinguishability obfuscation (oiO). In a nutshell, odiO is a natural strengthening of differing-input obfuscation (diO) and allows obfuscating programs for which it is hard to find a differing input when given only oracle access to the programs. An oiO obfuscator allows one to obfuscate programs that are hard to distinguish when treated as oracles. We then show applications of these notions, as well as positive and negative results around them. A few highlights include:
- Our new notions are weaker than VBB and stronger than diO.
- As is the case for VBB, we show that there exist programs that cannot be obfuscated with odiO or oiO.
- Our new notions allow us to compile several flavours of secret-key primitives (e.g., SKE, MAC, designated-verifier NIZK) into their public-key equivalents (e.g., PKE, signatures, publicly verifiable NIZK) while preserving one of the algorithms of the original scheme (function-preserving), or the structure of their outputs (format-preserving).Matteo Campanelli, Danilo Francati, Claudio Orlandiwork_odnpzt2iojax7fncjmickkqmne
Adaptively Secure Single Secret Leader Election from DDH
https://scholar.archive.org/work/nqlymboxd5b3bh7y47ototpwfy
Single Secret Leader Election protocols (SSLE, for short) allow a group of users to select a random leader so that the latter remains secret until she decides to reveal herself. Thanks to this feature, SSLE can be used to build an election mechanism for proof-of-stake based blockchains. In particular, a recent work by Azouvi and Cappelletti (ACM AFT 2021) shows that in comparison to probabilistic leader election methods, SSLE-based proof-of-stake blockchains have significant security gains, both with respect to grinding attacks and with respect to the private attack. Yet, as of today, very few concrete constructions of SSLE are known. In particular, all existing protocols are only secure in a model where the adversary is supposed to corrupt participants before the protocol starts - an assumption that clashes with the highly dynamic nature of decentralized blockchain protocols. In this paper we make progress in the study of SSLE by proposing new efficient constructions that achieve stronger security guarantees than previous work. In particular, we propose the first SSLE protocol that achieves adaptive security. Our scheme is proven secure in the universal composability model and achieves efficiency comparable to previous, less secure, realizations in the state of the art.Dario Catalano, Dario Fiore, Emanuele Giuntawork_nqlymboxd5b3bh7y47ototpwfy
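To make the SSLE interface concrete, here is a toy commit-then-beacon lottery sketch (all names and helpers below are hypothetical, and this is far weaker than the paper's protocol: it provides no uniqueness proof, no grinding resistance, and certainly no adaptive security). Each participant's score depends on a private secret, so nobody can compute the winner before she reveals herself; after a reveal, anyone can check the claimant's commitment and compare revealed scores.

```python
import hashlib, os

def H(*parts: bytes) -> bytes:
    """Length-prefixed SHA-256 over several byte strings."""
    h = hashlib.sha256()
    for p in parts:
        h.update(len(p).to_bytes(4, "big") + p)
    return h.digest()

class Participant:
    def __init__(self, name: str):
        self.name = name.encode()
        self.secret = os.urandom(32)           # kept private
    def commitment(self) -> bytes:
        return H(self.name, self.secret)       # published before the beacon
    def score(self, beacon: bytes) -> bytes:
        return H(beacon, self.secret)          # computable only by the owner
    def reveal(self):
        return self.name, self.secret

def verify_claim(commitments, beacon, name, secret, rival_reveals):
    """Check the claimant's commitment and that no revealed rival scores lower."""
    if H(name, secret) not in commitments:
        return False
    my = H(beacon, secret)
    return all(H(beacon, s) >= my for (_, s) in rival_reveals)

# Commit phase, then public randomness, then the (lowest-score) winner reveals.
participants = [Participant(n) for n in ("alice", "bob", "carol")]
commitments = {p.commitment() for p in participants}
beacon = os.urandom(32)
winner = min(participants, key=lambda p: p.score(beacon))
name, secret = winner.reveal()
others = [p.reveal() for p in participants if p is not winner]
assert verify_claim(commitments, beacon, name, secret, others)
```

Note that verifying "single" leadership here requires the rivals' secrets to be revealed, which is exactly the kind of gap real SSLE constructions close with publicly verifiable cryptographic machinery.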