
A General Analysis Framework of Lower Complexity Bounds for Finite-Sum Optimization [article]

Guangzeng Xie, Luo Luo, Zhihua Zhang
2019 arXiv   pre-print
This paper studies the lower bound complexity for the optimization problem whose objective function is the average of n individual smooth convex functions.  ...  This construction is amenable to the analysis of proximal oracles and also extends naturally to the general convex and averaged smooth cases.  ...  Acknowledgements: We thank Dachao Lin and Yuze Han for their helpful discussions about Lemma 2.8.  ... 
arXiv:1908.08394v1 fatcat:5uvlm2m5ercy5ckdcfr6r3abka

On the Complexity of Minimizing Convex Finite Sums Without Using the Indices of the Individual Functions [article]

Yossi Arjevani, Amit Daniely, Stefanie Jegelka, Hongzhou Lin
2020 arXiv   pre-print
In this work, we exploit the finite noise structure of finite sums to derive a matching O(n^2) upper bound under the global oracle model, showing that this lower bound is indeed tight.  ...  Following a similar approach, we propose a novel adaptation of SVRG which is both compatible with stochastic oracles and achieves complexity bounds of Õ((n^2 + n√(L/μ)) log(1/ε)) and O(n√(L/ε)), for μ > 0  ... 
arXiv:2002.03273v1 fatcat:hitl2afuijhurcsk7pmhle436e

Tight Lower Complexity Bounds for Strongly Convex Finite-Sum Optimization [article]

Min Zhang, Yao Shu, Kun He
2022 arXiv   pre-print
To break these limitations, we derive tight lower complexity bounds for randomized incremental gradient methods, including SAG, SAGA, SVRG, and SARAH, for two typical cases of finite-sum optimization.  ...  Finite-sum optimization plays an important role in the area of machine learning, and hence has triggered a surge of interest in recent years.  ... 
arXiv:2010.08766v2 fatcat:av6ziaknk5emdecw7o2hueqeei

Learning the distribution with largest mean: two bandit frameworks [article]

Emilie Kaufmann
2017 arXiv   pre-print
For both frameworks (regret minimization and best arm identification) we present recent, asymptotically optimal algorithms.  ...  We compare the behavior of the sampling rule of each algorithm as well as the complexity terms associated with each problem.  ... 
arXiv:1702.00001v3 fatcat:ua3oi6bd6zhu5fiadonfdhol6i

On Reinforcement Learning for Turn-based Zero-sum Markov Games

Devavrat Shah, Varun Somani, Qiaomin Xie, Zhi Xu
2020 Proceedings of the 2020 ACM-IMS on Foundations of Data Science Conference  
This is nearly optimal, as we establish a lower bound of Ω(ε^-(d+2)) for any policy.  ...  We consider the problem of finding a Nash equilibrium for two-player turn-based zero-sum games. Inspired by the AlphaGo Zero (AGZ) algorithm [26], we develop a Reinforcement Learning based approach.  ...  A Generic Lower Bound: To understand how good the above sample complexity upper bound is, we establish a lower bound for any algorithm under any sampling policy.  ... 
doi:10.1145/3412815.3416888 dblp:conf/fods/ShahSXX20 fatcat:ihmhh5lxyjh6td7lvjycwlw2jq

Exploration Analysis in Finite-Horizon Turn-based Stochastic Games

Jialian Li, Yichi Zhou, Tongzheng Ren, Jun Zhu
2020 Conference on Uncertainty in Artificial Intelligence  
We propose a framework, Upper Bounding the Values for Players (UBVP), to guide exploration in FTSGs.  ...  One is Uniform-PAC with a sample complexity of Õ(1/ε^2) to get an ε-Nash Equilibrium for arbitrary ε > 0, and the other has a cumulative exploitability of Õ(√T) with high probability.  ... 
dblp:conf/uai/LiZR020 fatcat:ohadzqri7rdaniljbmhazo6rvq

The Complexity of Nonconvex-Strongly-Concave Minimax Optimization [article]

Siqi Zhang, Junchi Yang, Cristóbal Guzmán, Negar Kiyavash, Niao He
2021 arXiv   pre-print
This paper studies the complexity for finding approximate stationary points of nonconvex-strongly-concave (NC-SC) smooth minimax problems, in both general and averaged smooth finite-sum settings.  ...  We establish nontrivial lower complexity bounds of Ω(√κ ΔL ε^-2) and Ω(n + √(nκ) ΔL ε^-2) for the two settings, respectively, where κ is the condition number, L is the smoothness constant, and Δ is the initial  ...  Lower Bounds for NC-SC Minimax Problems: In this section, we establish lower complexity bounds (LB) for finding approximate stationary points of NC-SC minimax problems, in both general and finite-sum settings  ... 
arXiv:2103.15888v1 fatcat:7htfnvvs25ftnblfkyrrriwpza

Efficient Smooth Non-Convex Stochastic Compositional Optimization via Stochastic Recursive Gradient Descent

Huizhuo Yuan, Xiangru Lian, Chris Junchi Li, Ji Liu, Wenqing Hu
2019 Neural Information Processing Systems  
bound for stochastic compositional optimization: O((n + m)^(1/2) ε^-2) in the finite-sum case and O(ε^-3) in the online case.  ...  Such a complexity is known to be the best one among IFO complexity results for non-convex stochastic compositional optimization.  ...  We emphasize that the set of assumptions for compositional optimization is different from vanilla optimization, and claiming optimality of the IFO complexity requires a corresponding lower bound result  ... 
dblp:conf/nips/YuanLLLH19 fatcat:l3ld7pyycbdjnmdca7jvjxl2qq

Learning the distribution with largest mean: two bandit frameworks

Emilie Kaufmann, Aurélien Garivier, Jean-François Coeurjolly, Adeline Leclercq-Samson
2017 ESAIM Proceedings and Surveys  
For both frameworks (regret minimization and best arm identification) we present recent, asymptotically optimal algorithms.  ...  We compare the behavior of the sampling rule of each algorithm as well as the complexity terms associated with each problem.  ... 
doi:10.1051/proc/201760114 fatcat:frh2rxyqf5asfmywaf6xu3rwza

Nearly Optimal Private Convolution [chapter]

Nadia Fawaz, S. Muthukrishnan, Aleksandar Nikolov
2013 Lecture Notes in Computer Science  
We prove optimality via spectral lower bounds on the hereditary discrepancy of convolution matrices.  ...  We study algorithms for computing the convolution of a private input x with a public input h, while satisfying the guarantees of (ε, δ)-differential privacy.  ...  of our algorithm, we need to prove optimal lower bounds on the noise complexity of private algorithms for computing convolutions.  ... 
doi:10.1007/978-3-642-40450-4_38 fatcat:4qgye52ohrd2tlnh5lfnbmadw4

A Catalyst Framework for Minimax Optimization

Junchi Yang, Siqi Zhang, Negar Kiyavash, Niao He
2020 Neural Information Processing Systems  
We introduce a generic two-loop scheme for smooth minimax optimization with strongly-convex-concave objectives.  ...  Furthermore, when extended to the nonconvex-concave minimax optimization, our algorithm again achieves the state-of-the-art complexity for finding a stationary point.  ...  Acknowledgments and Disclosure of Funding This work was supported in part by ONR grant W911NF-15-1-0479, NSF CCF-1704970, and NSF CMMI-1761699.  ... 
dblp:conf/nips/YangZKH20 fatcat:2ebpuvfnbzajfcj6forpspdxba

A Multi-Objective Optimization Framework for URLLC with Decoding Complexity Constraints [article]

Hasan Basri Celebi and Antonios Pitarokoilis and Mikael Skoglund
2021 arXiv   pre-print
In this paper, we introduce a multi-objective optimization framework for the optimal design of URLLC in the presence of decoding complexity constraints.  ...  We investigate the optimal selection of a transmission rate and power pair, while satisfying the constraints. For this purpose, a multi-objective optimization problem (MOOP) is formulated.  ...  Here, we take our analysis in [23] one step further and cast the optimal design of URLLC systems in a MOOP framework.  ... 
arXiv:2102.12274v1 fatcat:mi4ofqdgxrcmjotuk2r5kpxdy4

A Survey on MIMO Transmission with Discrete Input Signals: Technical Challenges, Advances, and Future Trends [article]

Yongpeng Wu, Chengshan Xiao, Zhi Ding, Xiqi Gao, Shi Jin
2017 arXiv   pre-print
Particularly, a unified framework which designs low-complexity transmission schemes applicable to massive MIMO systems in upcoming 5G wireless networks is provided for the first time.  ...  Multiple antennas have been exploited for spatial multiplexing and diversity transmission in a wide range of communication applications.  ...  Table XIV: Performance analysis of MIMO MB systems.  ... 
arXiv:1704.07611v1 fatcat:4gep4gnbefdytjdggniocdora4

Variance Reduction via Accelerated Dual Averaging for Finite-Sum Optimization [article]

Chaobing Song, Yong Jiang, Yi Ma
2021 arXiv   pre-print
In this paper, we introduce a simplified and unified method for finite-sum convex optimization, named Variance Reduction via Accelerated Dual Averaging (VRADA).  ...  Meanwhile, VRADA matches the lower bound of the general convex setting up to a log log n factor, and matches the lower bounds in both regimes n ≤ Θ(κ) and n ≫ κ of the strongly convex setting, where κ denotes  ...  This work successfully exploits the finite-sum structure to improve performance on this class of problems in both theory and practice.  ... 
arXiv:2006.10281v4 fatcat:vh7jrud4abea7jqtwbvthviusi

Variance reduction for Riemannian non-convex optimization with batch size adaptation [article]

Andi Han, Junbin Gao
2020 arXiv   pre-print
As a result, we also provide simpler convergence analysis for R-SVRG and improve complexity bounds for R-SRG under finite-sum setting.  ...  We show that this strategy can achieve lower total complexities for optimizing both general non-convex and gradient dominated functions under both finite-sum and online settings.  ...  We first draw a comparison between IFO complexities of vanilla R-SVRG under two analysis frameworks.  ... 
arXiv:2007.01494v1 fatcat:k6ujwrgjxzcwtddk7e2lxaubha
Showing results 1 — 15 of 115,199.