On a nascent mathematical-physical latency-information theory, part II: the revelation of guidance theory for intelligence and life system designs

Erlan H. Feria, Sos S. Agaian, Sabah A. Jassim
Mobile Multimedia/Image Processing, Security, and Applications 2009
Since its introduction more than six decades ago by Claude E. Shannon, information theory has guided, with two performance bounds, namely the source-entropy H and the channel capacity C, the design of sourced intelligence-space compressors for communication systems, where the units of intelligence-space are 'mathematical' binary digit (bit) units of a passing-of-time uncertainty nature.
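(For reference, the H and C invoked here are the classical Shannon quantities; a minimal reminder of their standard discrete-memoryless definitions follows. The remaining six LIT bounds introduced below are specific to the author's framework and are not restated here.)

```latex
% Classical Shannon definitions, assumed in their usual discrete memoryless form:
% source entropy H in bits per source symbol, channel capacity C in bits per channel use.
H(X) = -\sum_{x \in \mathcal{X}} p(x)\,\log_2 p(x),
\qquad
C = \max_{p(x)} I(X;Y).
```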
Recently, motivated by both a real-world radar problem treated in the first part of the present paper series and previous uncertainty/certainty duality studies of digital-communication and quantized-control problems by the author, information theory was discovered to have a 'certainty' time-dual that was named latency theory. Latency theory guides, with two performance bounds, i.e., the processor-ectropy K and the sensor-consciousness F, the design of processing intelligence-time compressors for recognition systems, where the units of intelligence-time are 'mathematical' binary operator (bor) units of a configuration-of-space certainty nature. Furthermore, these two theories have been unified to form a mathematical latency-information theory (M-LIT) for the guidance of intelligence system designs, which has been successfully applied to real-world radar. Also recently, M-LIT has been found to have a physical LIT (P-LIT) dual that guides life system designs. This novel physical theory addresses the design of motion life-time and retention life-space compressors for physical signals and also has four performance bounds. Two of these bounds are the mover-ectropy A and the channel-stay T for the design of motion life-time compressors for communication systems. An example of a motion life-time compressor is a laser system, inclusive of a network router for a certainty, or multi-path, life-time channel. The other two bounds are the retainer-entropy N and the sensor-scope I for the design of retention life-space compressors for recognition systems. An example of a retention life-space compressor is a silicon semiconductor crystal, inclusive of a leadless chip carrier for an uncertainty, or noisy, life-space sensor. The eight performance bounds of our guidance theory for intelligence and life system designs will be illustrated with practical examples. Moreover, a four-quadrant LIT revolution (quadrants I and III for the two physical theories and quadrants II and IV for the two mathematical ones) is advanced that highlights both the discovered dualities and the fundamental properties of signal compressors, leading to a unifying communication embedded recognition (CER) system architecture.

M-IT's intel-space source-coding guides the design of intel-space communication systems using as performance bounds the expected source-information in bits, denoted as the source-entropy H, and the channel-capacity C of a noisy intel-space channel. More specifically, these bounds guide the design of intel-space channel and source integrated (CSI) coders, with the integrated intel-space channel-coder advancing overhead knowledge in the form of mathematical signals, e.g., parity bits extracted from bit streams, which have the effect of increasing the amount of channeled intel-space. The certainty intel-time dual for this scheme was found to be M-LT's intel-time sensor-coding (or 'the mathematical theory of recognition,' as identified from a duality perspective) [1]. Intel-time sensor-coding guides the design of intel-time recognition systems using as performance bounds the minmax processor-latency criterion in bor units, denoted as the processor-ectropy K, and the sensor-consciousness F of a limited intel-time sensor. More specifically, these bounds guide the design of intel-time sensor and processor integrated (SPI) coders, with the integrated intel-time sensor-coder advancing overhead knowledge in the form of mathematical signals, such as interferences (clutter inclusive) and noise, which have the effect of increasing the amount of sensed intel-time.
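As a side illustration of the 'overhead knowledge' role played by parity bits above, the following is a minimal sketch of plain even-parity coding, assuming nothing beyond the textbook idea; it is not the CSI coder construction of the paper, and the function names are hypothetical.

```python
# Minimal illustration of channel-coding overhead: one even-parity bit appended
# to each block of source bits. The parity bit enlarges the channeled bit stream
# (the "overhead knowledge") but lets the receiver detect any single-bit error.

def add_even_parity(block: list[int]) -> list[int]:
    """Append one parity bit so the total number of 1s in the block is even."""
    return block + [sum(block) % 2]

def check_even_parity(block: list[int]) -> bool:
    """Return True if the received block (data bits + parity bit) has even weight."""
    return sum(block) % 2 == 0

if __name__ == "__main__":
    data = [1, 0, 1, 1]                  # 4 source bits
    sent = add_even_parity(data)         # 5 channeled bits: 1 bit of overhead
    print(sent, check_even_parity(sent)) # [1, 0, 1, 1, 1] True

    corrupted = sent.copy()
    corrupted[2] ^= 1                    # a single bit flip in the noisy channel
    print(corrupted, check_even_parity(corrupted))  # [1, 0, 0, 1, 1] False
```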
Moreover, the unification of M-LT and M-IT has yielded a mathematical latency-information theory (M-LIT), or mathematical guidance theory for intelligence system designs, which, for mathematical intelligence signals, addresses in a unified fashion the uncertainty communication issues of intel-space and the certainty recognition issues of intel-time. Some details and insightful illustrations of the aforementioned mathematical performance bounds are presented in this paper. It is of interest to note that the aforementioned uncertainty-information-space/certainty-latency-time duality revelation had its roots in an earlier 1978 discovery by the author of an uncertainty-communication/certainty-control duality between an uncertainty 'digital' communication problem [4] and a certainty 'quantized' control problem [5]. This newly found duality in turn led him to the formulation of a novel and practical parallel-processing methodology for quantized control in which the controlled processor can be modeled with any certainty linear or nonlinear state-variable representation. He named this control scheme Matched-Processors, since it is the 'certainty' control dual of the 'uncertainty' Matched-Filters communication problem. It should further be noted that this uncertainty/certainty duality perspective for a 'discrete' communication/control problem is also exhibited by Kalman's LQG control formulation of 1960 [6] for the complementary 'continuous' communication/control problem.
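For readers less familiar with the 'uncertainty' side of that duality, the sketch below illustrates the classical Matched-Filters idea in its simplest binary form: correlate the noisy received waveform against each known template and pick the larger correlation. This is only the standard textbook concept under an additive-white-Gaussian-noise assumption, not the author's Matched-Processors dual; the signal shapes and function name are hypothetical.

```python
# Classical matched-filter decision for two known, equal-energy templates
# observed in additive white Gaussian noise: correlate and take the maximum.

import numpy as np

def matched_filter_decide(received: np.ndarray, templates: list[np.ndarray]) -> int:
    """Return the index of the template with the largest correlation score."""
    scores = [float(np.dot(received, t)) for t in templates]
    return int(np.argmax(scores))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 64
    t = np.arange(n)
    s0 = np.cos(2 * np.pi * 4 * t / n)   # template for symbol 0
    s1 = np.cos(2 * np.pi * 9 * t / n)   # template for symbol 1 (orthogonal to s0)

    received = s1 + 0.8 * rng.standard_normal(n)   # symbol 1 sent over a noisy channel
    print("decided symbol:", matched_filter_decide(received, [s0, s1]))
```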
doi:10.1117/12.819057