Environmental Modelling - Introduction
by Thomas Lux and Achim Sydow
The topic of environmental modelling was last addressed in Issue 34 of ERCIM News, in July 1998. Influenced by the Kyoto Protocol (adopted on 11 December 1997), public awareness of environmental problems had at that time reached a peak. The goal of this thematic issue is to look at the spread of modern information and communication technology into the environmental and ecological sciences.
The power of today's computational and communication resources means that we are able to create modelling, simulation and decision-support tools of unprecedented quality. Modelling the biosphere with ever-greater numbers of biotic and abiotic components remains a great challenge of our time. Climate research (space weather included) uses models dealing with varying scales and resolutions, and will require new architectures with access to distributed resources. Branch-oriented simulation systems should prove to be the right software tools, since they can be flexibly adapted to the special structures and data of complex environmental systems.
Environmental applications carry with them a number of special demands. These include complexity (dimension, structure with abiotic and biotic subsystems), scale (amount of data, distribution, heterogeneity), modelling for different purposes (scenario analysis, emergency response, risk management etc), the need for adaptability (coupling of models, parameter adjustment etc), longevity of data and applicability to different purposes.
These demands have led to a variety of current research themes, such as parallel, distributed and Grid computing, knowledge from data, decision support, intelligent/adaptive user interfaces and visualization, standardization of metadata and system interfaces, workflows for automatic access to distributed resources, and the generic nature of information and simulation systems.
The ERCIM Working Group Environmental Modelling provides a platform for the discussion of research in this area and invites interested groups to join.
History and Milestones
While environmental modelling based on physics has a long tradition, modelling biotic components is a relatively new challenge. All these models have only become really useful, however, in connection with computer applications. Mathematical models and simulation software (including numerical analysis) are therefore strongly dependent on each other. It is interesting to note that simulation is both the tool and the research aim in environmental engineering. Without simulation there would have been few discoveries in this field; chaotic processes are one example. This mutual dependency is illustrated by two examples from the history of environmental modelling.
Example 1: Atmospheric Processes
On 1 July 1946, J. von Neumann (1903-1957) began to look at meteorology: his aim was to develop a numerical weather prediction model. He simplified the basic meteorological equations such that only meteorologically relevant solutions were produced, and the numerical methods were stable. Two years later the meteorologist Jule Charney (1917-1981) joined the team and also dealt with this question.
The complete equations, with their meteorologically insignificant higher-frequency oscillations, had been considered thirty years earlier by L. F. Richardson (Weather Prediction by Numerical Process, 1922) on a model domain with 'staggered grids'. However, Richardson estimated that solving them would have required the efforts of 64,000 human computers. At one point, von Neumann was able to use the army's computer, ENIAC (Electronic Numerical Integrator and Computer), for a month in order to solve the simplified model. He created basic methods for programming algorithms, including sub-programs, iteration blocks and recursive blocks, all of which are now ubiquitous in software technology. Von Neumann considered the problem of modelling atmospheric processes to be one of the most complicated problems possible, second only to the analysis of human behaviour in conflict situations.
Today a hierarchy of models exists for studying the space-time spectrum of phenomena such as long waves, fronts, hurricanes, thunderstorms, tornados and micro-turbulences. In connection with projects of the ERCIM Working Group on Environmental Modelling, ozone models and remote sensing via satellites have also been included. Since the time of those pioneers of numerical simulation, one challenge that remains is coping with model instabilities.
Example 2: Growth Processes / Population Dynamics
Growth processes are the basis of ecological modelling. In 1961, the meteorologist E. N. Lorenz used greatly simplified weather forecast equations to show that tiny errors in initial conditions could make forecasts beyond a certain time horizon impossible (deterministic chaos). This chaotic behaviour was also found in models of basic growth processes. For example, the equation developed by Verhulst in 1845 (whose discrete version is today known as the logistic map, describing growth with limited food) also produces chaotic behaviour. An interesting offshoot from this discovery is the development of wonderful two-dimensional computer art.
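The chaotic behaviour of discrete logistic growth is easy to reproduce. The sketch below, with illustrative parameter values of our own choosing (not taken from the article), iterates the map x_{n+1} = r·x_n·(1 − x_n) and contrasts the tame regime, where trajectories settle onto a fixed point, with the chaotic one, where two nearly identical starting values drift far apart:

```python
# Logistic map x_{n+1} = r * x_n * (1 - x_n): the discrete form of
# Verhulst's growth-with-limited-food equation. The values of r and
# the starting points below are illustrative choices.

def logistic_map(r, x0, steps):
    """Iterate the logistic map 'steps' times and return the trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Tame regime: at r = 2.5 every trajectory converges to the fixed
# point 1 - 1/r = 0.6.
settled = logistic_map(2.5, 0.2, 100)

# Chaotic regime: at r = 4.0 two trajectories starting a distance of
# only 1e-6 apart diverge until they are effectively uncorrelated.
a = logistic_map(4.0, 0.200000, 50)
b = logistic_map(4.0, 0.200001, 50)
divergence = max(abs(u - v) for u, v in zip(a, b))
```

The amplification of the tiny initial difference is the same sensitivity to initial conditions that Lorenz observed in his simplified weather equations.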
It is often desirable in ecology to analyse distributed growth processes structured in food chains (A. J. Lotka, 1880-1949; E. P. Odum, 1983). Lotka and Volterra developed their famous predator-prey model along these lines. Models with as few as three species can display chaotic behaviour, depending on the non-linearities coupling the species.
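The predator-prey dynamics can be sketched with a short simulation. The equations are the standard Lotka-Volterra pair; the coefficients, the explicit Euler scheme and the starting populations below are illustrative assumptions of ours, not values from the article:

```python
# Classic Lotka-Volterra predator-prey model:
#   dx/dt =  a*x - b*x*y   (prey: grows alone, eaten by predators)
#   dy/dt = -c*y + d*x*y   (predator: starves alone, grows by predation)
# integrated with a simple explicit Euler scheme. All parameter values
# are illustrative choices.

def lotka_volterra(x, y, a=1.0, b=0.1, c=1.5, d=0.075, dt=0.001, steps=20000):
    """Integrate the predator-prey equations; return both population series."""
    prey, pred = [x], [y]
    for _ in range(steps):
        dx = (a * x - b * x * y) * dt
        dy = (-c * y + d * x * y) * dt
        x, y = x + dx, y + dy
        prey.append(x)
        pred.append(y)
    return prey, pred

prey, pred = lotka_volterra(40.0, 9.0)
```

With these coefficients the equilibrium sits at (c/d, a/b) = (20, 10); starting away from it, both populations oscillate around those values without dying out, which is the cyclic behaviour Lotka and Volterra derived.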
The basic tools include models for logistic growth, delayed logistic growth (J. Maynard Smith, 1968), exponential growth etc. Depending on the ecosystems to be analysed, transport, diffusion and other processes must also be modelled.
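For reference, the basic growth laws mentioned above can be written as follows, using the conventional symbols N for population size, r for the growth rate, K for the carrying capacity and τ for the delay (the article itself fixes no notation):

```latex
\frac{dN}{dt} = rN
  \qquad \text{(exponential growth)}

\frac{dN}{dt} = rN\left(1 - \frac{N}{K}\right)
  \qquad \text{(logistic growth, Verhulst)}

\frac{dN}{dt} = rN(t)\left(1 - \frac{N(t-\tau)}{K}\right)
  \qquad \text{(delayed logistic growth)}
```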
Determining the initial conditions for model runs, whether in weather forecasting, climate research or ecosystems research, is an enormous problem. Extensive data assimilation via different methods, including remote sensing by satellite measurements, is needed. This remains a great challenge for environmental modelling.