30,039 Hits in 2.0 sec

Learning Logistic Circuits [article]

Yitao Liang, Guy Van den Broeck
2019 arXiv   pre-print
We show that parameter learning for logistic circuits is convex optimization, and that a simple local search algorithm can induce strong model structures from data.  ...  This paper proposes a new classification model called logistic circuits.  ...  Structure Learning This section presents an algorithm to learn a compact logical circuit structure for logistic circuits from data.  ... 
arXiv:1902.10798v1 fatcat:74xmyjr2zfbxnfa7pmrjpuhiyy

Learning Logistic Circuits

Yitao Liang, Guy Van den Broeck
2019 Proceedings of the Thirty-Third AAAI Conference on Artificial Intelligence (AAAI-19)
We show that parameter learning for logistic circuits is convex optimization, and that a simple local search algorithm can induce strong model structures from data.  ...  This paper proposes a new classification model called logistic circuits.  ...  Structure Learning This section presents an algorithm to learn a compact logical circuit structure for logistic circuits from data.  ... 
doi:10.1609/aaai.v33i01.33014277 fatcat:qde6abznufbpvffu3lnoxwb4uu
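The convexity claim in the two entries above has a concrete reading: once the circuit structure is fixed, a logistic circuit's output is a logistic function of a weighted sum of circuit-induced features, so the weights can be fit by ordinary convex logistic-loss minimization. A minimal sketch with plain gradient descent; the hand-made binary features stand in for circuit features, and all names and data are illustrative, not from the paper:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(features, labels, lr=0.5, steps=2000):
    """Gradient descent on the logistic loss, which is convex in the
    weights once the feature map is fixed."""
    n, d = features.shape
    w = np.zeros(d)
    for _ in range(steps):
        p = sigmoid(features @ w)
        w -= lr * (features.T @ (p - labels)) / n
    return w

# Toy data: the label is 1 exactly when the first feature fires.
X = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.0, 0.0]])
y = np.array([1.0, 1.0, 0.0, 0.0])
w = fit_logistic(X, y)
preds = (sigmoid(X @ w) > 0.5).astype(float)
```

Because the loss is convex in the weights, any descent method reaches the global optimum; the hard part, as the snippet notes, is the search over circuit structures.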

Secure Multi-pArty Computation Grid LOgistic REgression (SMAC-GLORE)

Haoyi Shi, Chao Jiang, Wenrui Dai, Xiaoqian Jiang, Yuzhe Tang, Lucila Ohno-Machado, Shuang Wang
2016 BMC Medical Informatics and Decision Making  
Conclusions: In this study, we developed a circuit-based SMAC-GLORE framework. The proposed framework provides a practical solution for secure distributed logistic regression model learning.  ...  Unlike our previous work in GLORE, SMAC-GLORE protects not only patient-level data, but also all the intermediary information exchanged during the model-learning phase.  ...  In practice, advanced circuits are required to handle more complicated tasks, such as secure distributed logistic regression, where only the learned model parameters are allowed to be released as circuit  ... 
doi:10.1186/s12911-016-0316-1 pmid:27454168 pmcid:PMC4959358 fatcat:qwtnfd2qnnaf7hox2gga2vrvty

ABG: A Multi-Party Mixed Protocol Framework for Privacy-Preserving Cooperative Learning [article]

Hao Wang, Zhi Li, Chunpeng Ge, Willy Susilo
2022 arXiv   pre-print
Additionally, we design specific privacy-preserving computation protocols for some typical machine learning methods such as logistic regression and neural networks.  ...  In this work, we propose a multi-party mixed protocol framework, ABG^n, which effectively implements arbitrary conversion between Arithmetic sharing (A), Boolean sharing (B) and Garbled-Circuits sharing  ...  [33] used federated learning for the training of logistic regression models. Yang et al. [34] adopted transfer learning to collaboratively learn a model.  ... 
arXiv:2202.02928v2 fatcat:oup6mqpk2zcyrkj7q2etphllzm
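The "arbitrary conversion between Arithmetic sharing (A), Boolean sharing (B) and Garbled-Circuits sharing" in this entry builds on two standard two-party sharing forms that are easy to sketch in isolation. A toy two-party sketch over Z_{2^32}; this is not the ABG^n protocol itself, just the sharing primitives it converts between:

```python
import secrets

MOD = 2 ** 32  # illustrative ring Z_{2^32}

def arith_share(x):
    """Additive (arithmetic, 'A') sharing: x = s0 + s1 (mod 2^32);
    each share alone is uniformly random."""
    s0 = secrets.randbelow(MOD)
    return s0, (x - s0) % MOD

def bool_share(x):
    """Boolean ('B') sharing: x = s0 XOR s1, bitwise."""
    s0 = secrets.randbelow(MOD)
    return s0, x ^ s0

x = 123456
a0, a1 = arith_share(x)
b0, b1 = bool_share(x)
rec_a = (a0 + a1) % MOD
rec_b = b0 ^ b1

# Arithmetic shares add locally: shares of x and of y reconstruct x + y
# without either party seeing the other's inputs.
y_val = 654321
c0, c1 = arith_share(y_val)
rec_sum = (a0 + c0 + a1 + c1) % MOD
```

Arithmetic sharing is cheap for additions and multiplications, boolean sharing for comparisons; frameworks like the one above exist precisely because real workloads (e.g. logistic regression) need both.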

On Tractable Computation of Expected Predictions [article]

Pasha Khosravi, YooJung Choi, Yitao Liang, Antonio Vergari, Guy Van den Broeck
2019 arXiv   pre-print
In fact, the task is intractable even for simple models such as logistic regression and a naive Bayes distribution.  ...  Specifically, we consider expressive probabilistic circuits with certain structural constraints that support tractable probabilistic inference.  ...  The expectation of such logistic circuit σ • g m w.r.t.  ... 
arXiv:1910.02182v2 fatcat:lzl5jtp2arfbjnl5u6dz7bll7q
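For intuition on why this entry's task is studied: even for plain logistic regression, the expected prediction under a feature distribution has no closed form, so without tractable circuits one falls back on Monte Carlo sampling. A toy sketch under an assumed fully factorized Bernoulli feature distribution (weights and probabilities made up for illustration), which also shows that naively plugging in the mean input is biased:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
w = np.array([2.0, 1.0])    # illustrative logistic-regression weights
p = np.array([0.5, 0.5])    # assumed independent Bernoulli features

# Monte Carlo estimate of E_x[sigmoid(w @ x)].
samples = rng.random((100_000, 2)) < p
mc = sigmoid(samples @ w).mean()

# Naive plug-in sigmoid(w @ E[x]) ignores the Jensen gap.
plug_in = sigmoid(w @ p)
```

Here the exact expectation is the average of sigmoid over the four binary inputs, about 0.766, while the plug-in gives sigmoid(1.5) ≈ 0.818; circuit-based methods compute the exact value without sampling.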

One-step regression and classification with cross-point resistive memory arrays

Zhong Sun, Giacomo Pedretti, Alessandro Bricalli, Daniele Ielmini
2020 Science Advances  
Here, we show that a cross-point resistive memory circuit with feedback configuration can train traditional machine learning algorithms such as linear regression and logistic regression in just one step  ...  Machine learning has been getting attention in recent years as a tool to process big data generated by the ubiquitous sensors used in daily life.  ...  The same approach might be applied to pretrained deep networks by the concept of transfer learning (38) , thus enabling the one-step training capability for a generalized range of learning tasks.  ... 
doi:10.1126/sciadv.aay2378 pmid:32064342 pmcid:PMC6994204 fatcat:aoeu626nerhljkzxdv5kwciedm
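The "one step" in this entry refers to the analog crossbar circuit physically settling to the least-squares solution. In software, the same solution is the one-shot normal-equation solve, sketched here on synthetic noise-free data (names and numbers illustrative only):

```python
import numpy as np

# The analog crossbar settles to the least-squares weights; in software
# that is the closed-form solve w = (X^T X)^{-1} X^T y.
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
true_w = np.array([2.0, -1.0, 0.5])   # synthetic ground truth
y = X @ true_w                        # noise-free targets for clarity

w = np.linalg.solve(X.T @ X, X.T @ y)
```

With noise-free targets the recovered weights match the ground truth exactly; the hardware contribution is doing this solve in constant time rather than via iterative descent.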

SecureML: A System for Scalable Privacy-Preserving Machine Learning

Payman Mohassel, Yupeng Zhang
2017 2017 IEEE Symposium on Security and Privacy (SP)  
In this paper, we present new and efficient protocols for privacy preserving machine learning for linear regression, logistic regression and neural network training using the stochastic gradient descent  ...  Machine learning is widely used in practice to produce predictive models for applications such as image processing, speech and text recognition.  ...  Acknowledgements We thank Jing Huang from Visa Research for helpful discussions on machine learning, and Xiao Wang from University of Maryland for his help on the EMP toolkit.  ... 
doi:10.1109/sp.2017.12 dblp:conf/sp/MohasselZ17 fatcat:y6lwerwfwnfghjkia76yqenmiy
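A key trick in MPC training systems of this kind is replacing the sigmoid, whose exponential is expensive under secret sharing, with a function built only from comparisons and additions. A sketch of such an MPC-friendly activation, evaluated in the clear; the clipped-linear form below is of the kind SecureML proposes, but treat the exact shape as an assumption:

```python
import numpy as np

def mpc_friendly_sigmoid(z):
    """Clipped-linear sigmoid substitute: 0 below -1/2, 1 above +1/2,
    linear in between -- only comparisons and additions, no exp."""
    return np.clip(z + 0.5, 0.0, 1.0)

z = np.linspace(-4.0, 4.0, 17)
exact = 1.0 / (1.0 + np.exp(-z))
approx = mpc_friendly_sigmoid(z)
max_err = np.abs(exact - approx).max()   # worst near z = +/- 0.5
```

The pointwise error is large near the clipping boundaries, but such substitutes are reported to work well for training because SGD only needs the activation to be roughly monotone and saturating.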

Scan-Chain-Fault Diagnosis Using Regressions in Cryptographic Chips for Wireless Sensor Networks

Hyunyul Lim, Minho Cheong, Sungho Kang
2020 Sensors  
Scan structures, which are widely used in cryptographic circuits for wireless sensor networks applications, are essential for testing very-large-scale integration (VLSI) circuits.  ...  Faults in cryptographic circuits can be effectively screened out by improving testability and test coverage using a scan structure.  ...  Linear regression and logistic regression were used as the machine-learning models through scikit-learn [39] .  ... 
doi:10.3390/s20174771 pmid:32846955 pmcid:PMC7506763 fatcat:uekdvhlnbnfcjlbtm7qlgf2k34

Faster Secure Data Mining via Distributed Homomorphic Encryption [article]

Junyi Li, Heng Huang
2020 arXiv   pre-print
By using the HE technique, it is possible to securely outsource model learning to powerful but not fully trusted public cloud computing environments.  ...  For example, we successfully train a logistic regression model to recognize the digit 3 and 8 within around 5 minutes, while a centralized counterpart needs almost 2 hours.  ...  More importantly, the depth of the computational circuit in distributed learning is bounded, as each worker refreshes its parameters every few iterations.  ... 
arXiv:2006.10091v1 fatcat:ir2zw4hfz5givfe4ls3rcg3vlu

Logistic Regression on Homomorphic Encrypted Data at Scale

Kyoohyung Han, Seungwan Hong, Jung Hee Cheon, Daejun Park
2019 Proceedings of the Thirty-Third AAAI Conference on Artificial Intelligence (AAAI-19)
Machine learning on (homomorphic) encrypted data is a cryptographic method for analyzing private and/or sensitive data while keeping privacy.  ...  We also evaluate our algorithm on the public MNIST dataset, and it takes ∼2 hours to learn an encrypted model with 96.4% accuracy.  ...  We encrypted the MNIST dataset and executed our logistic regression algorithm. Table 1 shows the result. With 32 iterations, our logistic algorithm took 132 minutes to learn an encrypted model.  ... 
doi:10.1609/aaai.v33i01.33019466 fatcat:lodh5a3h2nh4pd4ejaavx3a54u
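Homomorphic schemes evaluate only additions and multiplications, so systems like this one replace the sigmoid with a low-degree polynomial fitted over the range the inputs are expected to occupy. A least-squares sketch; the degree and interval are illustrative choices, not necessarily the paper's:

```python
import numpy as np

z = np.linspace(-8.0, 8.0, 401)          # assumed input range
sig = 1.0 / (1.0 + np.exp(-z))

coeffs = np.polyfit(z, sig, 7)           # degree-7 least-squares fit
approx = np.polyval(coeffs, z)
max_err = np.abs(approx - sig).max()
```

The fit is only valid on the chosen interval, which is why such systems must also keep the linear predictor's magnitude bounded during training (e.g. by normalizing features).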

A Low-Depth Homomorphic Circuit for Logistic Regression Model Training [article]

Eric Crockett
2020 IACR Cryptology ePrint Archive  
We take a fresh look at one specific task: training a logistic regression model on encrypted data.  ...  Machine learning is an important tool for analyzing large data sets, but its use on sensitive data may be limited by regulation.  ...  In this work, we examine these steps for a particular function, logistic regression model training, a basic task in supervised machine learning.  ... 
dblp:journals/iacr/Crockett20 fatcat:ltwd34em4vfrnmp6wdaibikaji
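Why "low-depth" matters: in leveled homomorphic encryption, each ciphertext-by-ciphertext multiplication consumes a modulus level, so the longest multiplication chain, not the circuit's size, drives the parameter cost. A back-of-the-envelope comparison for evaluating a degree-d polynomial, counting only ciphertext multiplications (a simplification of real HE cost models):

```python
from math import ceil, log2

def horner_depth(d):
    """Horner's rule: one sequential ciphertext multiply per step."""
    return d

def power_tree_depth(d):
    """Building x^1..x^d by repeated squaring and combining needs only
    ceil(log2 d) sequential ciphertext multiplications."""
    return ceil(log2(d)) if d > 1 else 0

depths = {d: (horner_depth(d), power_tree_depth(d)) for d in (7, 15, 31)}
```

Restructuring the training circuit to shave even a few levels lets the scheme use smaller parameters, which is where most of the speedups in this line of work come from.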

Asynchronous network of cellular automaton-based neurons for efficient implementation of Boltzmann machines

Takashi Matsubara, Kuniaki Uehara
2018 Nonlinear Theory and Its Applications IEICE  
They are special types of cellular automata and are implemented as small asynchronous sequential logic circuits.  ...  Artificial neural networks with stochastic state transitions and calculations, such as Boltzmann machines, have excelled over other machine learning approaches in various benchmark tasks.  ...  The proposed CAN is a good approximation of the logistic function: y = (1/2)(1 + erf((x − μ)/√(2σ²)))  (10). Digital circuit implementation: this section considers digital circuit implementations of the CAN  ... 
doi:10.1587/nolta.9.24 fatcat:2l42p2kw25b6thujaqcw2wkcbe
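The equation quoted in this entry, y = (1/2)(1 + erf((x − μ)/√(2σ²))), is the Gaussian CDF, which is numerically close to the logistic function once the input is rescaled — the classic probit/logit correspondence. A quick check; the 1.702 scale factor is the standard textbook constant, not taken from the paper:

```python
import numpy as np
from math import erf

def gaussian_cdf(x, mu=0.0, sigma=1.0):
    # Eq. (10) from the entry: y = (1/2)(1 + erf((x - mu)/sqrt(2 sigma^2)))
    return 0.5 * (1.0 + erf((x - mu) / np.sqrt(2.0 * sigma ** 2)))

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

# Probit/logit correspondence: Phi(x) ~ logistic(1.702 * x).
xs = np.linspace(-5.0, 5.0, 201)
max_err = max(abs(gaussian_cdf(x) - logistic(1.702 * x)) for x in xs)
```

The worst-case gap is under 0.01, which is why an erf-shaped stochastic circuit can stand in for a logistic neuron.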

A Tractable Probabilistic Model for Subset Selection

Yujia Shen, Arthur Choi, Adnan Darwiche
2017 Conference on Uncertainty in Artificial Intelligence  
Our proposed model is interpretable and subsumes a previously introduced model based on logistic regression.  ...  We highlight the intuitive structures that we learn via case studies.  ...  Moreover, unlike our recursive model, there is no structure to be learned in a logistic n-choose-k model. PARAMETER LEARNING Suppose we are given a set of n binary variables X.  ... 
dblp:conf/uai/ShenCD17 fatcat:neqbafnoqvgglcmriwlb4vl6la

What Matters: Agreement between U.S. Courts of Appeals Judges

Daniel L. Chen, Xing Cui, Lanyu Shang, Junchao Zheng
2016 Social Science Research Network  
Courts of Appeals (13 Circuits including 12 Regional and one for the Federal Circuit); and U.S. District Courts. The middle level is also called Federal Circuit Courts.  ...  The reason data scientists have applied machine learning is because of the perception that machine learning can provide a generic, robust, and fully predictive method.  ...  Appendix 'case_circuit': Circuit of Court, Categorical. 'case_MajSelfCertainWords': The number of words in the verdict that indicate self-certainty, Numerical.  ... 
doi:10.2139/ssrn.2839305 fatcat:ixj4xckcsvgxrbj5uslotjphxy

Parallel Reinforcement Pathways for Conditioned Food Aversions in the Honeybee

Geraldine A. Wright, Julie A. Mustard, Nicola K. Simcock, Alexandra A.R. Ross-Taylor, Lewis D. McNicholas, Alexandra Popescu, Frédéric Marion-Poll
2010 Current Biology  
(Figure 1C; logistic regression: χ²₄ = 22.1, p < 0.001).  ...  Honeybees quickly learn to associate floral cues with food [3], a trait that makes them an excellent model organism for studying the neural mechanisms of learning and memory.  ...  Error bars represent ±SEM. ... influence the circuits involved in appetitive learning.  ... 
doi:10.1016/j.cub.2010.11.040 pmid:21129969 pmcid:PMC3011020 fatcat:5aohzid4hbe5fmrbq3t4xtcu4i
Showing results 1 — 15 out of 30,039 results