
Federated Functional Gradient Boosting [article]

Zebang Shen, Hamed Hassani, Satyen Kale, Amin Karbasi
2021 arXiv   pre-print
First, in the semi-heterogeneous setting, when the marginal distributions of the feature vectors on client machines are identical, we develop the federated functional gradient boosting (FFGB) method that  ...  In this paper, we initiate a study of functional minimization in Federated Learning.  ...  In this setting, we propose the federated functional gradient boosting (FFGB) method.  ... 
arXiv:2103.06972v1 fatcat:io3lm3pozzg5vlizx5iu3uviia
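
The snippet frames federated learning as functional minimization tackled with a boosting-style method. The sketch below is only a rough illustration of that idea, not the FFGB algorithm from the paper: the choice of depth-3 regression trees as weak learners, the squared loss, and the simple server-side averaging of client learners are all assumptions made here for concreteness.

```python
# Illustrative sketch (not the paper's FFGB algorithm): one federated
# functional-gradient round for squared loss. Each client fits a weak learner
# to its local residuals; the server's update is the average of those learners.
import numpy as np
from sklearn.tree import DecisionTreeRegressor  # assumed weak learner

class AveragedLearner:
    """Server-side weak learner: the mean prediction of the clients' learners."""
    def __init__(self, learners):
        self.learners = learners
    def predict(self, X):
        return np.mean([h.predict(X) for h in self.learners], axis=0)

def ensemble_predict(ensemble, X, lr=0.1):
    return sum(lr * h.predict(X) for h in ensemble) if ensemble else np.zeros(len(X))

def federated_boosting_round(clients, ensemble, lr=0.1):
    """clients: list of locally held (X, y) pairs; ensemble: list of weak learners."""
    local_learners = []
    for X, y in clients:
        residual = y - ensemble_predict(ensemble, X, lr)   # negative L2 functional gradient
        local_learners.append(DecisionTreeRegressor(max_depth=3).fit(X, residual))
    ensemble.append(AveragedLearner(local_learners))
    return ensemble
```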

Practical Federated Gradient Boosting Decision Trees

Qinbin Li, Zeyi Wen, Bingsheng He
2020 Proceedings of the AAAI Conference on Artificial Intelligence
Gradient Boosting Decision Trees (GBDTs) have become very successful in recent years, with many awards in machine learning and data mining competitions.  ...  There have been several recent studies on how to train GBDTs in the federated learning setting.  ...  Gradient Boosting Decision Trees (GBDTs) The GBDT is an ensemble model which trains a sequence of decision trees.  ... 
doi:10.1609/aaai.v34i04.5895 fatcat:rcehcdwr5bcspbjcnu2wm4ivve
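
The snippet's one-line description of a GBDT (a sequence of decision trees, each fit to correct the trees built before it) can be made concrete with a short sketch. The shrinkage rate, tree depth, and use of scikit-learn regressors below are illustrative choices, not details taken from the paper.

```python
# Minimal (centralized) GBDT sketch for squared loss: each new tree is fit
# to the residuals left by the trees built so far.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_gbdt(X, y, n_trees=100, lr=0.1, max_depth=3):
    trees, pred = [], np.zeros(len(y))
    for _ in range(n_trees):
        residual = y - pred                       # negative gradient of 1/2 (y - F)^2
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residual)
        pred += lr * tree.predict(X)              # shrinkage-scaled update
        trees.append(tree)
    return trees

def predict_gbdt(trees, X, lr=0.1):
    return lr * sum(tree.predict(X) for tree in trees)
```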

Practical Federated Gradient Boosting Decision Trees [article]

Qinbin Li, Zeyi Wen, Bingsheng He
2019 arXiv   pre-print
Gradient Boosting Decision Trees (GBDTs) have become very successful in recent years, with many awards in machine learning and data mining competitions.  ...  There have been several recent studies on how to train GBDTs in the federated learning setting.  ...  Gradient Boosting Decision Trees (GBDTs) The GBDT is an ensemble model which trains a sequence of decision trees.  ... 
arXiv:1911.04206v2 fatcat:ccasyn6lorg3ddnwn73jjywmxi

SecureGBM: Secure Multi-Party Gradient Boosting [article]

Zhi Feng, Haoyi Xiong, Chuanyuan Song, Sijia Yang, Baoxin Zhao, Licheng Wang, Zeyu Chen, Shengwen Yang, Liping Liu, Jun Huan
2019 arXiv   pre-print
jointly obtain a shared Gradient Boosting machines model while protecting their own data from the potential privacy leakage and inferential identification.  ...  In this paper, we proposed a novel Gradient Boosting Machines (GBM) framework SecureGBM built-up with a multi-party computation model based on semi-homomorphic encryption, where every involved party can  ...  The key idea of gradient boosting is to consider the procedure of boosting as the optimization over certain cost functions [14] .  ... 
arXiv:1911.11997v1 fatcat:fexgdbtlw5a4nhnslunrrf6lsq
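
The "optimization over certain cost functions" view mentioned in the snippet is the standard functional-gradient reading of boosting (Friedman 2001). A compact statement of that view, with notation chosen here rather than taken from the SecureGBM paper:

```latex
% Boosting as functional gradient descent (standard formulation;
% the notation is not taken from the SecureGBM paper).
F_m(x) = F_{m-1}(x) + \rho_m h_m(x), \qquad
h_m(x) \approx -\left.\frac{\partial L\bigl(y, F(x)\bigr)}{\partial F(x)}\right|_{F = F_{m-1}}, \qquad
\rho_m = \arg\min_{\rho} \sum_{i=1}^{n} L\bigl(y_i, F_{m-1}(x_i) + \rho\, h_m(x_i)\bigr).
```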

SecureBoost+ : A High Performance Gradient Boosting Tree Framework for Large Scale Vertical Federated Learning [article]

Weijing Chen, Guoqiang Ma, Tao Fan, Yan Kang, Qian Xu, Qiang Yang
2021 arXiv   pre-print
Gradient boosting decision tree (GBDT) is a widely used ensemble algorithm in the industry.  ...  It makes effective and efficient large-scale vertical federated learning possible.  ...  Among these vertical federated algorithms, tree-based algorithms, especially gradient boosting decision trees (GBDT), are among the most popular.  ...
arXiv:2110.10927v3 fatcat:ovyk374r65gaxkopvizhawrnzm
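
For vertical federated GBDT of the SecureBoost family, the core exchange is an encrypted gradient histogram: the label holder computes per-sample gradients, the feature-holding party aggregates them per bucket under additively homomorphic encryption, and the label holder decrypts only the bucket sums to score candidate splits. The sketch below illustrates that flow only; the `encrypt`/`decrypt` stubs stand in for a real scheme such as Paillier, and none of the function names come from SecureBoost+.

```python
# Illustrative sketch of vertical federated split finding (not the SecureBoost+
# implementation). The label holder computes and encrypts per-sample gradients
# g, h; the passive party only sees ciphertexts and returns per-bucket sums;
# the label holder decrypts the sums and scores splits.
import numpy as np

def encrypt(x):   # stand-in for an additively homomorphic scheme (e.g., Paillier)
    return x      # identity here, just to keep the sketch runnable

def decrypt(x):
    return x

def bucket_sums(enc_g, enc_h, bucket_ids, n_buckets):
    """Passive party: per-bucket sums of encrypted gradients for one feature."""
    G, H = [0] * n_buckets, [0] * n_buckets
    for eg, eh, b in zip(enc_g, enc_h, bucket_ids):
        G[b] += eg        # homomorphic additions on ciphertexts in a real deployment
        H[b] += eh
    return G, H

def best_split_gain(G, H, lam=1.0):
    """Label holder: decrypt bucket sums and score splits with the XGBoost gain
    (the constant complexity penalty gamma is omitted)."""
    G = np.array([decrypt(g) for g in G], dtype=float)
    H = np.array([decrypt(h) for h in H], dtype=float)
    G_tot, H_tot = G.sum(), H.sum()
    gains = []
    for k in range(1, len(G)):            # split: buckets [0, k) go left
        GL, HL = G[:k].sum(), H[:k].sum()
        GR, HR = G_tot - GL, H_tot - HL
        gains.append(0.5 * (GL**2 / (HL + lam) + GR**2 / (HR + lam)
                            - G_tot**2 / (H_tot + lam)))
    return int(np.argmax(gains)), max(gains)
```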

Boosting Privately: Privacy-Preserving Federated Extreme Boosting for Mobile Crowdsensing [article]

Yang Liu, Zhuo Ma, Ximeng Liu, Siqi Ma, Surya Nepal, Robert Deng
2020 arXiv   pre-print
Inspired by the two challenges, we propose FedXGB, a federated extreme gradient boosting (XGBoost) scheme supporting forced aggregation. FedXGB mainly achieves the following two breakthroughs.  ...  Recently, Google and 24 other institutions proposed a series of open challenges towards federated learning (FL), which include application expansion and homomorphic encryption (HE).  ...  Our contributions can be summarised as follows: • Federated Extreme Gradient Boosting.  ...
arXiv:1907.10218v2 fatcat:4alhvnuffvdkzkyzc4ivzuzad4

L2-Boosting for Economic Applications

Ye Luo, Martin Spindler
2017 The American Economic Review  
of Boosting as a functional gradient descent optimization (minimization) method.  ...  The goal is to minimize a loss function, e.g., an L 2 -loss or the negative log-likelihood function of a model, by an iterative optimization scheme.  ... 
doi:10.1257/aer.p20171040 fatcat:od22dfnadbdg5o6ciscl5yjtoe
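
The snippet describes boosting as iterative minimization of an L2 loss. A common concrete instance is componentwise L2-Boosting: at each step, regress the current residuals on the single covariate that fits them best, then take a small shrunken step. The step size and iteration count below are illustrative choices, not values from the paper.

```python
# Illustrative componentwise L2-Boosting: at each step, fit the current
# residuals with the single covariate that reduces them most, then take a
# shrunken step in that direction.
import numpy as np

def l2_boost(X, y, n_steps=200, nu=0.1):
    n, p = X.shape
    beta = np.zeros(p)
    intercept = y.mean()
    residual = y - intercept
    for _ in range(n_steps):
        # Least-squares coefficient of the residual on each covariate.
        coefs = X.T @ residual / (X**2).sum(axis=0)
        sse = ((residual[:, None] - X * coefs) ** 2).sum(axis=0)
        j = int(np.argmin(sse))           # best single covariate this step
        beta[j] += nu * coefs[j]          # shrunken coordinate update
        residual = y - intercept - X @ beta
    return intercept, beta
```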

Cloud-based Federated Boosting for Mobile Crowdsensing [article]

Zhuzhu Wang, Yilong Yang, Yang Liu, Ximeng Liu, Brij B. Gupta, Jianfeng Ma
2020 arXiv   pre-print
The application of federated extreme gradient boosting to mobile crowdsensing apps brings several benefits, in particular high efficiency and strong classification performance.  ...  In this paper, we propose a secret sharing based federated learning architecture FedXGB to achieve the privacy-preserving extreme gradient boosting for mobile crowdsensing.  ...  CONCLUSION In this paper, we proposed a privacy-preserving federated learning architecture (FedXGB) for the training of the extreme gradient boosting model (XGBoost) in crowdsensing applications.  ...
arXiv:2005.05304v1 fatcat:vt6wzcpqffbl3lgn4e6o4mp3h4
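
A small sketch of the additive secret sharing primitive that architectures like this typically build on: each client splits its (integer-encoded) local gradient sum into random shares so that no single share reveals anything, while the shares of all clients still sum to the true total. The modulus, share count, and function names are assumptions made here, not the FedXGB protocol.

```python
# Illustrative additive secret sharing over a prime field.
import secrets

P = 2**61 - 1  # illustrative prime modulus

def share(value, n_parties):
    """Split an integer into n_parties random shares that sum to it mod P."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

# Example: two clients secret-share their integer-encoded gradient sums among
# three aggregators; summing per aggregator and then reconstructing yields the
# global sum without exposing either client's value.
g1, g2 = 1234, 5678
s1, s2 = share(g1, 3), share(g2, 3)
per_aggregator = [(a + b) % P for a, b in zip(s1, s2)]
assert reconstruct(per_aggregator) == (g1 + g2) % P
```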

L_2Boosting for Economic Applications [article]

Ye Luo, Martin Spindler
2017 arXiv   pre-print
We attribute this to missing theoretical results for boosting.  ...  Boosting algorithms represent one of the major advances in machine learning and statistics in recent years and are suitable for the analysis of such data sets.  ...  We follow the interpretation of (Breiman 1998) and (Friedman 2001) of Boosting as a functional gradient descent optimization (minimization) method.  ... 
arXiv:1702.03244v1 fatcat:6oqwbes3mramdm3rjz3qvrpisi

PipAttack: Poisoning Federated Recommender Systems forManipulating Item Promotion [article]

Shijie Zhang and Hongzhi Yin and Tong Chen and Zi Huang and Quoc Viet Hung Nguyen and Lizhen Cui
2021 arXiv   pre-print
federated recommender.  ...  Then, by uploading carefully crafted gradients via a small number of malicious users during the model update, we can effectively increase the exposure rate of a target (unpopular) item in the resulted  ...  , we can effectively trick the federated recommender to become biased towards our target item, thus boosting its exposure.  ... 
arXiv:2110.10926v1 fatcat:gobgsvmhurd4hf4jb22mtacjzy

Distributed Parametric Optimization with the Geneva Library [chapter]

Rudiger Berlich, Sven Gabriel, Ariel Garcia, Marcel Kunze
2011 Data Driven e-Science  
Boost: • Extremely portable • Contains implementations of the upcoming C++ library standard • Boost::Function • Generalized function callbacks • Boost::Bind • Generalized parameter binding • Boost:  ...  is available  Gradient descents • Basic algorithm • Very simple!  ...
doi:10.1007/978-1-4419-8014-4_24 fatcat:auekcqkwbjca5prafm7rfw3wrq

Gradient boosting for the prediction of gas chromatographic retention indices

Dmitriy D. Matyushin, Anastasia Yu. Sholokhova, Aleksey K. Buryak
2019 Сорбционные и хроматографические процессы  
The same data sets and the set of descriptors are used for the neural network and gradient boosting.  ...  Various machine learning methods are used for this task, but methods based on decision trees, in particular gradient boosting, are not used widely.  ...  [Figure legend: O - gradient boosting; + - neural network with one hidden layer.] The work was supported by the Ministry of Science and Higher Education of the Russian Federation.  ...
doi:10.17308/sorpchrom.2019.19/2223 fatcat:2t3r5rwecnddlalkaqv3fydgim

New object detection features in the OpenCV library

P. N. Druzhkov, V. L. Erukhimov, N. Yu. Zolotykh, E. A. Kozinov, V. D. Kustikova, I. B. Meerov, A. N. Polovinkin
2011 Pattern Recognition and Image Analysis  
ACKNOWLEDGMENTS This research was supported by the Russian Federal Program "Scientists and Educators in Russia of Innovations," contract no. 02.740.11.5131.  ...  GRADIENT BOOSTING TREES Gradient boosting trees is a serial ensemble of decision trees [8] where every new tree constructed relies on previously built trees.  ...  We suggest replacing the support vector machine classifier with gradient boosting trees and compare the modified implementation with the original.  ...
doi:10.1134/s1054661811020271 fatcat:sxrwgngx7zfyvn4epaghsluuyy

Efficient Batch Homomorphic Encryption for Vertically Federated XGBoost [article]

Wuxing Xu, Hao Fan, Kaixin Li, Kai Yang
2021 arXiv   pre-print
In this paper, we studied the efficiency problem of adapting the widely used XGBoost model in real-world applications to the vertical federated learning setting.  ...  To address the data privacy and security concerns, federated learning has attracted increasing attention from both academia and industry to securely construct AI models across multiple isolated data providers  ...  Liu, “LightGBM: A highly efficient gradient boosting decision  ...  ciphertext transmission almost in half.  ...
arXiv:2112.04261v1 fatcat:ryai3p5ssjf5jjt3v6iswqkahe
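
One common way to batch values so that a single homomorphic ciphertext (and a single homomorphic addition) carries several of them is fixed-point packing with padding bits that absorb carries. The slot widths and encoding below are an illustrative assumption, not the batching scheme of the paper; plain integer additions stand in for additions on ciphertexts.

```python
# Illustrative fixed-point packing (not the paper's encoding): pack the first-
# and second-order gradients g, h into one integer so that a single additively
# homomorphic encryption and addition processes both slots at once.
SCALE = 2**16     # fixed-point scale for fractional gradients
SLOT = 2**40      # slot width; headroom absorbs carries when many values are summed
OFFSET = 2**20    # shift so that encoded values stay non-negative

def encode(x):
    return int(round(x * SCALE)) + OFFSET

def decode(v, n_added=1):
    return (v - n_added * OFFSET) / SCALE

def pack(g, h):
    return encode(g) * SLOT + encode(h)

def unpack(packed, n_added=1):
    g_enc, h_enc = divmod(packed, SLOT)
    return decode(g_enc, n_added), decode(h_enc, n_added)

# Adding packed plaintexts mirrors what the homomorphic scheme would do to the
# ciphertexts: both slots are summed in one operation.
total = pack(0.25, 1.0) + pack(-0.5, 1.0)
print(unpack(total, n_added=2))   # -> (-0.25, 2.0)
```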

Analyzing Federated Learning through an Adversarial Lens [article]

Arjun Nitin Bhagoji, Supriyo Chakraborty, Prateek Mittal, Seraphin Calo
2019 arXiv   pre-print
We explore a number of strategies to carry out this attack, starting with simple boosting of the malicious agent's update to overcome the effects of other agents' updates.  ...  Federated learning distributes model training among a multitude of agents, who, guided by privacy concerns, perform training using their local data but share only model parameter updates, for iterative  ...  While the loss is a function of a weight vector w, we can use the chain rule to obtain the gradient of the loss with respect to the weight update δ, i.e. ∇_δ L = α_m ∇_w L.  ...
arXiv:1811.12470v4 fatcat:imyfzkrhmfet5d4rymph6vjcr4
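
The chain-rule identity quoted in the snippet is what makes explicit boosting of the malicious update effective: if the server applies the update with aggregation weight α_m, an attacker who wants the aggregate to move by a chosen δ̃ can pre-scale its submission by 1/α_m. A compact statement of that step, with the weighted aggregation rule written out here as an assumption consistent with the snippet's notation:

```latex
% Weighted server aggregation (assumed form) and the boosted malicious update.
w_{t+1} = w_t + \alpha_m \delta_m + \sum_{i \neq m} \alpha_i \delta_i,
\qquad
\nabla_{\delta_m} L = \alpha_m \nabla_{w} L,
\qquad
\delta_m = \frac{\tilde{\delta}}{\alpha_m} \;\Rightarrow\; \alpha_m \delta_m = \tilde{\delta}.
```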
Showing results 1 — 15 out of 9,625 results