Artificial Intelligence in Aerospace [chapter]

David John
2010 Aerospace Technologies Advancements  
model, but the model is computationally expensive. In this situation, a machine-learning "wrapper" can be applied to the deterministic model, providing us with a "code accelerator". A good example of this arises in atmospheric photochemistry, where we need to solve a large coupled system of ordinary differential equations (ODEs) at a large grid of locations. Applying a neural network wrapper to the system was found to provide a speedup of between a factor of 2 and 200, depending on the conditions.

Second, when we do not have a deterministic model but do have data available, machine learning enables us to learn the behaviour of the system empirically. Examples include learning the inter-instrument bias between sensors with a temporal overlap, and inferring physical parameters from remotely sensed proxies.

Third, machine learning can be used for classification, for example in providing land surface type classifications. Support vector machines perform particularly well on classification problems.

Now that we have an overview of the typical applications, the sections that follow introduce two of the most powerful machine learning approaches, neural networks and support vector machines, and then present a variety of examples.

Machine learning

Neural networks

Neural networks are multivariate, non-parametric, 'learning' algorithms (Haykin, 1994; Bishop, 1995; Haykin, 2001a; Haykin, 2001b) inspired by biological neural networks. A computational neural network (NN) consists of an interconnected group of artificial neurons that process information in parallel using a connectionist approach to computation. A NN is a non-linear statistical data-modelling tool that can be used to model complex relationships between inputs and outputs, or to find patterns in data.

The basic computational element of a NN is a model neuron, or node. A node receives input from other nodes, or from an external source (e.g. the input variables). A schematic of an example NN is shown in Figure 1. Each input has an associated weight, w, that can be modified to mimic synaptic learning. The unit computes some function, f, of the weighted sum of its inputs:
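The node computation just described, f applied to the weighted sum of the inputs, can be sketched in a few lines of Python. The tanh activation and the example weights here are illustrative assumptions, not values from the chapter:

```python
import math

def node_output(inputs, weights, f=math.tanh):
    """Model neuron: apply activation f to the weighted sum of the inputs."""
    weighted_sum = sum(w * x for w, x in zip(weights, inputs))
    return f(weighted_sum)

# Illustrative call: two inputs with assumed weights.
y = node_output([0.5, -1.0], [0.8, 0.3])
```

Any smooth non-linearity could be substituted for tanh; training a network amounts to adjusting the weights w so that the outputs match the data.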
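The support vector machines mentioned above find a large-margin boundary between classes. As a hedged illustration (not the chapter's implementation), here is a minimal linear SVM trained by full-batch sub-gradient descent on the regularised hinge loss, with small made-up 2-D data standing in for, say, two land-surface classes:

```python
def train_linear_svm(points, labels, lam=0.01, eta=0.1, epochs=500):
    """Fit a linear SVM (w, b) by sub-gradient descent on the regularised
    hinge loss: lam/2 * ||w||^2 + mean(max(0, 1 - y * (w.x + b)))."""
    w = [0.0, 0.0]
    b = 0.0
    n = len(points)
    for _ in range(epochs):
        gw = [lam * w[0], lam * w[1]]  # gradient of the regulariser
        gb = 0.0
        for (x1, x2), y in zip(points, labels):
            if y * (w[0] * x1 + w[1] * x2 + b) < 1:  # margin violated
                gw[0] -= y * x1 / n
                gw[1] -= y * x2 / n
                gb -= y / n
        w = [w[0] - eta * gw[0], w[1] - eta * gw[1]]
        b -= eta * gb
    return w, b

def classify(w, b, x):
    """Predict the class from the sign of the decision function."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else -1

# Made-up toy data: two well-separated clusters.
pts = [(2, 2), (3, 2), (2, 3), (-2, -2), (-3, -2), (-2, -3)]
ys = [1, 1, 1, -1, -1, -1]
w, b = train_linear_svm(pts, ys)
```

In practice one would use an established library implementation; kernelised SVMs extend the same idea to non-linear decision boundaries.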
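Returning to the "code accelerator" pattern from the opening paragraph: the expensive deterministic model is sampled once offline, and a cheap statistical fit is queried in its place. In this minimal sketch, expensive_model is a made-up stand-in for the photochemistry ODE solver, and a lookup table with linear interpolation stands in for the neural-network wrapper:

```python
import math
from bisect import bisect_left

def expensive_model(x):
    # Stand-in for a costly solver (in the chapter, a coupled ODE system).
    return math.sin(x) + 0.5 * x

# 1. Sample the expensive model once, offline, over the input range of interest.
xs = [i * 3.0 / 200 for i in range(201)]
ys = [expensive_model(x) for x in xs]

# 2. Cheap surrogate: linear interpolation in the precomputed table.
def surrogate(x):
    i = min(max(bisect_left(xs, x), 1), len(xs) - 1)
    x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
```

The same pattern carries over when the surrogate is a trained neural network: fit it once on solver output, then evaluate the cheap model at each of the many grid locations instead of re-running the solver.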
doi:10.5772/6941