Discrete symbolic optimization and Boltzmann sampling by continuous neural dynamics: Gradient Symbolic Computation [article]

Paul Tupper, Paul Smolensky, Pyeong Whan Cho
2018, arXiv pre-print
Gradient Symbolic Computation (GSC) is proposed as a means of solving discrete global optimization problems using a neurally plausible continuous stochastic dynamical system. Gradient symbolic dynamics involves two free parameters that must be adjusted as functions of time to obtain the global maximizer at the end of the computation. We summarize what is known about the GSC dynamics for special settings of the parameters, and establish that there is a schedule for the two parameters under which convergence to the correct answer occurs with high probability. These results put the empirical results already obtained for GSC on a sound theoretical footing.
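The general scheme the abstract describes can be illustrated with a minimal sketch. This is not the authors' actual GSC dynamics; it is a toy example of the same idea: noisy continuous gradient ascent on an objective ("harmony") function, with two time-varying parameters, a temperature that is annealed toward zero and a quantization strength that is ramped up to force the continuous state toward discrete values. The harmony function, penalty, schedules, and step sizes below are all invented for illustration.

```python
# Toy sketch (assumed, not the paper's exact dynamics): anneal noise (T)
# down and quantization strength (q) up while following the gradient of
# a harmony function, so the state settles on a discrete maximizer.
import numpy as np

rng = np.random.default_rng(0)

# Toy harmony H(x) = 0.5 x^T W x + b^T x, to be maximized over x in {0,1}^2.
# The discrete maximizer here is (1, 1).
W = np.array([[0.0, 1.0],
              [1.0, 0.0]])
b = np.array([0.2, 0.2])

def grad_H(x):
    return W @ x + b

def grad_Q(x):
    # Gradient of the quantization penalty Q(x) = sum_i x_i^2 (1 - x_i)^2,
    # which is minimized exactly at the discrete points {0, 1}.
    return 2.0 * x * (1.0 - x) * (1.0 - 2.0 * x)

x = np.array([0.5, 0.5])       # start midway between the discrete states
steps, dt = 5000, 0.01
for k in range(steps):
    t = k / steps
    T = 0.1 * (1.0 - t)        # parameter 1: temperature, annealed to zero
    q = 0.1 + 4.0 * t          # parameter 2: quantization strength, ramped up
    drift = grad_H(x) - q * grad_Q(x)
    noise = np.sqrt(2.0 * T * dt) * rng.standard_normal(2)
    x = np.clip(x + dt * drift + noise, -0.5, 1.5)  # clip for numerical safety

print(np.round(x))  # the state settles near the discrete maximizer (1, 1)
```

The two schedules play the roles the abstract attributes to GSC's free parameters: early on, high temperature and weak quantization let the state explore; late in the run, low temperature and strong quantization commit it to a single discrete answer.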
arXiv:1801.03562v1