Rough sets and fuzzy sets in natural computing

Hung Son Nguyen, Sankar K. Pal, Andrzej Skowron
2011 Theoretical Computer Science  
Natural Computing (NC) is a discipline that builds a bridge between computer science and the natural sciences. It deals mainly with methodologies and models that take inspiration from nature (or are based on natural phenomena) for problem-solving, use computers (or computational techniques) to synthesize natural phenomena, or employ natural materials (e.g., molecules) for computation. The constituent technologies for performing these tasks include cellular automata, artificial neural networks (ANN), evolutionary algorithms, swarm intelligence, artificial immune systems, fractal geometry, artificial life, DNA computing, granular computing and perception-based computing.

For example, artificial neural networks attempt to emulate the information representation and processing scheme and the discriminatory ability of biological neurons in the human brain, together with characteristics such as adaptivity, robustness, ruggedness, speed and optimality. Similarly, evolutionary algorithms create a biologically inspired tool based on powerful metaphors from the natural world; they mimic some of the processes observed in natural evolution, such as crossover, selection and mutation, leading to a stepwise optimization of organisms. Perception-based computing, on the other hand, provides the capability to compute and reason with perception-based information as humans do when performing a wide variety of physical and mental tasks without any measurement and computation. Reflecting the finite ability of the sensory organs (and finally the brain) to resolve details, perceptions are inherently fuzzy-granular (f-granular) [21]. That is, the boundaries of perceived classes are unsharp and the values of the attributes they can take are granulated (a clump of indistinguishable points or objects) [20,5]. Granulation is also a computing paradigm that, like self-reproduction, self-organization, functioning of the brain, Darwinian evolution, group behavior, cell membranes, and morphogenesis, is abstracted from natural phenomena. A good survey of natural computing explaining its different facets is provided in [4]. Granulation is inherent in human thinking and reasoning processes.
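The f-granularity of a perceived class can be sketched in a few lines: a concept such as "warm" has an unsharp boundary, which is commonly modelled by a fuzzy membership function taking intermediate degrees between 0 and 1. The trapezoidal shape and the temperature breakpoints below are illustrative assumptions, not taken from the text.

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: 0 outside (a, d), 1 on [b, c], linear in between."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if a < x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# "Warm" as a fuzzy granule over temperature in degrees Celsius
# (hypothetical breakpoints chosen for illustration).
def warm(t):
    return trapezoid(t, 15, 20, 26, 32)

for t in (10, 18, 23, 30, 35):
    print(t, round(warm(t), 2))
```

Membership falls off gradually at the edges of the granule, so objects near the boundary belong to the class only to a partial degree, which is exactly the unsharpness of perceived classes described above.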
Granular computing (GrC) provides an information processing framework in which computation and operations are performed on information granules. It is based on the realization that precision is sometimes expensive and not very meaningful in modeling and controlling complex systems. When a problem involves incomplete, uncertain, and vague information, it may be difficult to differentiate distinct elements, and so one may find it convenient to consider granules for handling it. The structure of granulation can often be defined using methods based on rough sets, fuzzy sets or their combination. In this combination, rough sets and fuzzy sets work synergistically, often with other soft computing approaches, using the principle of granular computing. The developed systems exploit the tolerance for imprecision, uncertainty, approximate reasoning and partial truth under the soft computing framework, and are capable of achieving tractability, robustness, and a close resemblance to human-like (natural) decision-making for pattern recognition in ambiguous situations [19]. Qualitative reasoning and modeling in NC require the development of methods supporting approximate reasoning under uncertainty about non-crisp, often vague, concepts. One very general scheme of tasks for such qualitative reasoning can be described as follows: from some basic objects (called patterns, granules or molecules) it is required to construct (induce) complex objects satisfying a given specification (often expressed in natural language) to a satisfactory degree. For example, in learning concepts from examples we deal with tasks where partial information about the specification is given by examples and counterexamples concerning the classified objects.
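How rough sets define a granulation structure can be illustrated with a minimal sketch (the decision table below is toy data, not from the text): objects that are indiscernible under the chosen attributes form granules, and a vague concept X is approximated from below by the granules entirely contained in X and from above by those that overlap it.

```python
from collections import defaultdict

def approximations(objects, attrs, X):
    """Lower and upper approximations of X w.r.t. indiscernibility on attrs."""
    # Group objects into indiscernibility classes (the granules).
    classes = defaultdict(set)
    for obj, desc in objects.items():
        classes[tuple(desc[a] for a in attrs)].add(obj)
    lower, upper = set(), set()
    for granule in classes.values():
        if granule <= X:   # granule lies entirely inside X
            lower |= granule
        if granule & X:    # granule overlaps X
            upper |= granule
    return lower, upper

# Toy decision table: attribute values per object (illustrative only).
objects = {
    1: {"colour": "red",  "size": "big"},
    2: {"colour": "red",  "size": "big"},
    3: {"colour": "blue", "size": "big"},
    4: {"colour": "blue", "size": "small"},
}
X = {1, 3}  # the vague concept to approximate
lo, up = approximations(objects, ["colour", "size"], X)
print("lower:", lo, "upper:", up)
```

Objects 1 and 2 are indiscernible yet only one of them belongs to X, so they fall into the boundary region (upper minus lower approximation); it is exactly this region that captures the vagueness of the concept.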
As instances of such complex objects one can consider classifiers studied in machine learning or data mining, new medicines against some viruses, or behavioral patterns of cell interaction induced from the interaction of biochemical processes realized in cells. Over the years, we have learned how to solve some of these tasks; however, many of them still pose great challenges. One of the reasons for this is that the discovery of complex objects relevant to a given specification requires multilevel reasoning, with the necessity of discovering at each level the relevant structural objects and their properties. The search space for such structural objects and properties is huge, and this in particular requires fully automatic methods that are not feasible using existing computing technologies. However, this process can be supported by domain knowledge, which can be used for generating hints in the search process (see, e.g., [1]). This view is consistent with [2] (see page 3 of the Foreword).
doi:10.1016/j.tcs.2011.05.036