A copy of this work was available on the public web and has been preserved in the Wayback Machine; the capture dates from 2017.
This paper extends earlier work on an approach to accelerating fuzzy clustering by transferring methods that were originally developed to speed up the training of (artificial) neural networks. The core idea is to treat the difference between two consecutive steps of the alternating optimization scheme of fuzzy clustering as a gradient. This "gradient" can then be modified in the same way a gradient is modified in error backpropagation in order to accelerate the optimization.

doi:10.1016/j.ins.2008.09.017
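The idea described above can be sketched in a few lines: run the usual fuzzy c-means alternating optimization, treat the difference between two consecutive cluster-center updates as a "gradient", and modify it with a backpropagation-style momentum term. This is a minimal illustrative sketch, not the paper's exact method; the function name, the momentum rule, and the parameter `beta` are assumptions introduced here for illustration.

```python
import numpy as np

def fuzzy_c_means_momentum(X, c=2, m=2.0, beta=0.5, iters=50, seed=0):
    """Fuzzy c-means where the center-update difference is treated as a
    'gradient' and accelerated with a momentum term (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n = len(X)
    centers = X[rng.choice(n, size=c, replace=False)].astype(float)
    step = np.zeros_like(centers)  # accumulated momentum term
    u = None
    for _ in range(iters):
        # membership update (standard fuzzy c-means)
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.maximum(d, 1e-12)  # guard against division by zero
        u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1)),
                         axis=2)
        # plain alternating-optimization center update
        w = u ** m
        new_centers = (w.T @ X) / w.sum(axis=0)[:, None]
        # treat the update difference as a "gradient" and add momentum,
        # as momentum modifies the gradient in error backpropagation
        step = (new_centers - centers) + beta * step
        centers = centers + step
    return centers, u
```

With `beta = 0` this reduces to plain alternating optimization; a positive `beta` lets successive update steps accumulate in consistent directions, which is exactly the effect momentum has on gradients in backpropagation.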