328 Hits in 7.6 sec

Remember What You Want to Forget: Algorithms for Machine Unlearning [article]

Ayush Sekhari, Jayadev Acharya, Gautam Kamath, Ananda Theertha Suresh
2021 arXiv   pre-print
For the setting of convex losses, we provide an unlearning algorithm that can unlearn up to O(n / d^(1/4)) samples, where d is the problem dimension.  ...  We initiate a rigorous study of generalization in machine unlearning, where the goal is to perform well on previously unseen datapoints. Our focus is on both computational and storage complexity.  ...  Acknowledgements We thank Robert Kleinberg, Mehryar Mohri, and Karthik Sridharan for helpful discussions.  ...
arXiv:2103.03279v2 fatcat:f7ont7in4zcx7mpzb6kaq4ikl4
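
For orientation, the deletion-capacity claim above is usually stated against a differential-privacy-style definition of unlearning, roughly of the following form (notation generic, not copied from the paper): a learning algorithm A and an unlearning algorithm Ā, which sees only the deletion requests U, the trained model A(S), and some stored statistics T(S), should satisfy, for every measurable event E,

    \Pr\big[\bar{A}(U,\, A(S),\, T(S)) \in E\big]
        \;\le\; e^{\varepsilon}\,\Pr\big[\bar{A}(\emptyset,\, A(S \setminus U),\, T(S \setminus U)) \in E\big] + \delta

together with the symmetric inequality. Read this way, the snippet's bound says that for convex losses the algorithm can honour on the order of n / d^(1/4) deletion requests while preserving such a guarantee.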

Forgetting Fast in Recommender Systems [article]

Wenyan Liu, Juncheng Wan, Xiaoling Wang, Weinan Zhang, Dell Zhang, Hang Li
2022 arXiv   pre-print
To our knowledge, this work represents the first attempt at fast approximate machine unlearning for state-of-the-art neural recommendation models.  ...  Users of a recommender system may want part of their data to be deleted, not only from the data repository but also from the underlying machine learning model, for privacy or utility reasons.  ...  We would also like to thank Dr. Yuanshun Yao (ByteDance) and Dr. Chong Wang (ByteDance) for helpful discussions.  ...
arXiv:2208.06875v1 fatcat:epammlofcveapatoxomcaoyi6u

Ash Stories: A Spell against Forgetting

Madeleine Collie
2021 Performance Philosophy  
an exploration of ash migrations to the colonies via acclimatisation ...  This paper will explore The Ash Project (2016-2019), which worked to commission a memorial sculpture and a series of walks, talks, workshops and exhibitions to create closer relationships between ash trees  ...  that has sustained care for Country on these lands for over 60,000 years.  ...
doi:10.21476/pp.2021.62320 fatcat:hv7t36nqgrbh5jzmxolwmehzki

Selective Forgetting of Deep Networks at a Finer Level than Samples [article]

Tomohiro Hayase, Suguru Yasutomi, Takashi Katoh
2020 arXiv   pre-print
Experimental results show that the proposed methods can make the model forget to use specific information for classification.  ...  Moreover, we introduce the forgetting procedure as an optimization problem on three criteria: the forgetting, the correction, and the remembering term.  ...  Bourtoule et al. (2019) also employed a similar definition of forgetting and named it machine unlearning.  ...
arXiv:2012.11849v2 fatcat:ocq3qey6ynexdejwbumkguptom
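
The snippet above casts selective forgetting as a three-term objective. A rough, illustrative sketch of what such an objective can look like follows; the concrete terms, weights, and names here are placeholders, not the paper's exact formulation.

    # Illustrative three-term selective-forgetting objective (placeholder names;
    # not the exact terms used by Hayase et al.).
    import torch
    import torch.nn.functional as F

    def selective_forgetting_loss(model, original_model, forget_batch, retain_batch,
                                  lam_forget=1.0, lam_correct=1.0, lam_remember=1.0):
        xf, yf = forget_batch   # inputs carrying the information to be forgotten
        xr, _ = retain_batch    # inputs whose behaviour should be preserved

        logits_f = model(xf)

        # Forgetting term: push predictions on the targeted inputs toward the
        # uniform distribution, so the model stops relying on that information.
        uniform = torch.full_like(logits_f, 1.0 / logits_f.size(-1))
        forget_term = F.kl_div(F.log_softmax(logits_f, dim=-1), uniform,
                               reduction="batchmean")

        # Correction term: keep those inputs correctly classified using whatever
        # information remains (plain cross-entropy on their labels).
        correct_term = F.cross_entropy(logits_f, yf)

        # Remembering term: stay close to the original model on retained data.
        logits_r = model(xr)
        with torch.no_grad():
            logits_r_orig = original_model(xr)
        remember_term = F.mse_loss(logits_r, logits_r_orig)

        return (lam_forget * forget_term
                + lam_correct * correct_term
                + lam_remember * remember_term)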

Eternal Sunshine of the Spotless Net: Selective Forgetting in Deep Networks [article]

Aditya Golatkar, Alessandro Achille, Stefano Soatto
2020 arXiv   pre-print
We explore the problem of selectively forgetting a particular subset of the data used for training a deep neural network.  ...  The method does not require retraining from scratch, nor access to the data originally used for training.  ...  Acknowledgements: We would like to thank the anonymous reviewers for their feedback and suggestions.  ... 
arXiv:1911.04933v5 fatcat:fa2sx7kenfdtznslcro67g7e5q
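
The scrubbing procedure in this paper avoids retraining and access to the original training data; very roughly, weights are perturbed with noise shaped by an approximation of the Fisher information computed on the data to be retained. The sketch below is heavily simplified, uses placeholder names, and does not reproduce the paper's exact noise scaling.

    # Simplified Fisher-weighted scrubbing sketch (illustrative only).
    import torch

    def diagonal_fisher(model, retain_loader, loss_fn):
        """Approximate the diagonal Fisher information on the retained data."""
        fisher = [torch.zeros_like(p) for p in model.parameters()]
        batches = 0
        for x, y in retain_loader:
            model.zero_grad()
            loss_fn(model(x), y).backward()
            for f, p in zip(fisher, model.parameters()):
                if p.grad is not None:
                    f += p.grad.detach() ** 2
            batches += 1
        return [f / max(batches, 1) for f in fisher]

    def scrub(model, fisher, sigma=0.01, eps=1e-8):
        """Add noise that is small where the retained data constrain a weight
        strongly (large Fisher) and larger where they do not."""
        with torch.no_grad():
            for p, f in zip(model.parameters(), fisher):
                std = (sigma / (f + eps).sqrt()).clamp(max=1.0)
                p.add_(torch.randn_like(p) * std)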

Catastrophic Forgetting and the Pseudorehearsal Solution in Hopfield-type Networks

ANTHONY ROBINS, SIMON McCALLUM
1998 Connection science  
To Jayson Mackie for giving feedback on the final draft. And finally to the women who have shared this journey with me, Liana, Trish, and of course Rachael.  ...  forgetting.  ...  The animal could try to store everything that happens to it, forget those things that were not connected to survival, or just forget all the events and just remember simple relationships.  ... 
doi:10.1080/095400998116530 fatcat:4g5qyus4pjcvzbvkraujxxqxoe

Eternal Sunshine of the Spotless Net: Selective Forgetting in Deep Networks

Aditya Golatkar, Alessandro Achille, Stefano Soatto
2020 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)  
We explore the problem of selectively forgetting a particular subset of the data used for training a deep neural network.  ...  The method does not require retraining from scratch, nor access to the data originally used for training.  ...  Acknowledgements: We would like to thank the anonymous reviewers for their feedback and suggestions.  ... 
doi:10.1109/cvpr42600.2020.00932 dblp:conf/cvpr/GolatkarAS20 fatcat:3coiyuo3ingf7po7ghtzgr4lkq

An Investigation on Learning, Polluting, and Unlearning the Spam Emails for Lifelong Learning [article]

Nishchal Parne, Kyathi Puppaala, Nithish Bhupathi, Ripon Patgiri
2021 arXiv   pre-print
Machine unlearning for security is studied in this context. Several spam email detection methods exist, each of which employs a different algorithm to detect undesired spam emails.  ...  So, to act deftly in such situations, the model needs to readily unlearn the polluted data without the need for retraining.  ...  It is possible to determine precisely what kind of data users wish to forget by providing the users with the option to choose degrees of granularity.  ...
arXiv:2111.14609v2 fatcat:ot55sm7e6nan5g4yomxs6hzliq

Deep Unlearning via Randomized Conditionally Independent Hessians [article]

Ronak Mehta, Sourav Pal, Vikas Singh, Sathya N. Ravi
2022 arXiv   pre-print
Recent legislation has led to interest in machine unlearning, i.e., removing specific training samples from a predictive model as if they never existed in the training dataset.  ...  models that may require unlearning samples identified for exclusion.  ...
arXiv:2204.07655v2 fatcat:epwwcqah7bf6fcr63ztcu2wf4a
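
Setting aside the paper's specific estimator (a randomized, conditionally independent Hessian), Hessian-based unlearning generally applies a Newton-style correction at the trained solution. A generic sketch for L2-regularised logistic regression is below, with placeholder names; it is the textbook influence-function-style update, not the method proposed in the paper.

    # One-step Newton "forgetting" update for regularised logistic regression.
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def newton_unlearn(w, X_retain, X_forget, y_forget, lam=1e-2):
        """Approximately remove (X_forget, y_forget) from a model w trained to
        minimise sum-of-losses + (lam/2)*||w||^2, with labels in {0, 1}."""
        # Gradient contribution of the deleted points at the current solution.
        p_f = sigmoid(X_forget @ w)
        g_forget = X_forget.T @ (p_f - y_forget)

        # Hessian of the objective restricted to the retained points.
        p_r = sigmoid(X_retain @ w)
        d = p_r * (1.0 - p_r)
        H = X_retain.T @ (X_retain * d[:, None]) + lam * np.eye(len(w))

        # At the trained w the full-data gradient is ~0, so the retained-set
        # gradient is ~ -g_forget; one Newton step therefore adds H^{-1} g_forget.
        return w + np.linalg.solve(H, g_forget)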

The Noise in the Archive: Oblivion in the Age of Total Recall [chapter]

Jean-François Blanchette
2011 Computers, Privacy and Data Protection: an Element of Choice  
and constraints to the business of remembering and oblivion.  ...  His current research focuses on developing a theoretical framework for analysing the materiality of computing, and its implications for long-term preservation of digital objects.  ...  Can they be made to forget? What is the relation between human, institutional, and machine memory?  ... 
doi:10.1007/978-94-007-0641-5_2 fatcat:2lnglattobdplbevu5blgh2sca

Machine Unlearning for Random Forests [article]

Jonathan Brophy, Daniel Lowd
2021 arXiv   pre-print
Responding to user data deletion requests, removing noisy examples, or deleting corrupted training data are just a few reasons for wanting to delete instances from a machine learning (ML) model.  ...  At the lower levels, splits are chosen to greedily optimize a split criterion such as Gini index or mutual information.  ...  Acknowledgments We would like to thank Zayd Hammoudeh for useful discussions and feedback and the reviewers for their constructive comments that improved this paper.  ... 
arXiv:2009.05567v2 fatcat:ovcox5e3zbgerke6ppl7i74pmm
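
The split criterion named in the snippet is the standard one for decision trees; as a concrete reference point, a minimal Gini-impurity scorer (names here are illustrative, not taken from the paper's codebase):

    # Gini impurity and the weighted score of a candidate split (lower is better).
    from collections import Counter

    def gini(labels):
        """Gini impurity of a node: 1 - sum_k p_k^2."""
        n = len(labels)
        if n == 0:
            return 0.0
        return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

    def split_score(left_labels, right_labels):
        """Size-weighted Gini impurity of splitting a node into two children."""
        n = len(left_labels) + len(right_labels)
        return (len(left_labels) * gini(left_labels)
                + len(right_labels) * gini(right_labels)) / n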

Theories of Parenting and their Application to Artificial Intelligence

Sky Croeser, Peter Eckersley
2019 arXiv   pre-print
As machine learning (ML) systems have advanced, they have acquired more power over humans' lives, and questions about what values are embedded in them have become more complex and fraught.  ...  Much work is underway to try to weave ethics into advancing ML research.  ...  Acknowledgments The authors are highly grateful to Elizabeth Przywolnik for excellent assistance and feedback.  ... 
arXiv:1903.06281v1 fatcat:45eaqfmmqvgwfifwsktmob4qsm

Modernising education: unlearned lessons from Frederick Taylor

Marina A. Shulga, Galyna A. Poperechna, Liubov R. Kondratiuk, Halyna R. Petryshyn, Oleh A. Zubchyk
2021 Linguistics and Culture Review  
, thus, about what is an "educated person" and how it is possible to create one today.  ...  The methods of an integrated approach allowed for demonstrating that the understanding of the content of the so-called "unlearned lessons" from Frederick Taylor regarding the problem of "human capacity  ...  For example: "I want you to teach me to understand what I read" (Bradbury, 1953) ; "I tried to remember, but if I look away, I forget everything" (Bradbury, 1953) ; "do you think that if you are taught  ... 
doi:10.21744/lingcure.v5ns2.1332 fatcat:ioqvks6umrb2jnjzmc6w7cj3mq

A Conversation on Racism and Computer Music. Complete interview transcription

2021 array. the journal of the ICMA  
It forgets racist hierarchy and modes of coloniality. Colonial modes of acting, systems, etc., are not reproduced. That is what this call tries to say but that's, of course, not the case.  ...  I mean, computer music is not isolated from the world; that's one thing that we need to remember.  ...  Acknowledgements This project was funded by the Center for Humanities, the Institute for Creativity, Arts, and Technology, the Office of Diversity and Inclusion, and the School of Performing Arts at Virginia  ... 
doi:10.25370/array.v20213277 fatcat:xtl3jlsdc5evbagmvx47iqocea

Nootemporality, Sociotemporality, and the Internet

Mark Aultman
2009 KronoScope  
An older generation remembers a time before Xerox copiers and fax machines; a younger generation does not remember a time before the Internet or mobile phones.  ...  But this does not mean that we can forget how the past has led to the present or that what we might wish to preserve in the present has come easily.  ... 
doi:10.1163/156771509x12638244967755 fatcat:hjbp2a2gfvcsnma4bcxxy2rbre
Showing results 1 — 15 out of 328 results