Scientists try to penetrate natural phenomena with their understanding, seeking to reduce all complexity to a few fundamental laws. The cool rationality of science and technology has pervaded and transformed the world to such an extent that it could destroy human life.
The creed of the “fundamentalists” has lost its exclusive attractiveness despite the great unifying successes of modern science (elementary particle physics, molecular genetics, and so on). It becomes more and more important to figure out the patterns through which the basic laws show themselves in reality: more than just fundamental laws are operating in what actually “is”.

Every non-linear process leads to forks in the path at which decisions are made whose consequences cannot be predicted, because each decision has the character of an amplification. These decisions may blow up and have far-reaching effects. Sooner or later the initial knowledge of the system becomes irrelevant; from then on there is an uncontrollable process in which information is generated and retained. Such processes become unpredictable over very long periods of time. For example, the old problem of the stability of the solar system (a problem of gravitational interactions) is still unsolved. Analogous problems arise in almost all other disciplines: we have no controlled nuclear fusion because we have no adequate understanding of the chaotic motion of charged particles in a magnetic mirror system.

Phenomenology has its own laws. At every new stage of organisation, new rules take effect. We know this well from everyday life, but it calls for a completely new orientation in science.
On the one hand, some people look upon the computer as a diabolical instrument; on the other hand, some others are completely addicted to it. Used with some reflection, however, it can also help us lift the veil on nature’s secrets.
Where scientists of earlier generations had to simplify their equations or give up completely, we are able to see their full content on the display monitor of a computer. Graphical representation of natural processes stimulates new ideas and associations. In connection with the computer, many new fields have sprung up, such as Synergetics: the systematic attempt to find the rules by which order arises in complex systems.
Fractals are part of Synergetics. They deal with chaos and order and with their competition or coexistence. The processes chosen here come from various physical or mathematical problems, such as order versus disorder or the magnetic versus the non-magnetic state. The pictures represent processes that are simplified idealisations of reality.
The principle of self-similarity is nonetheless realised approximately in nature: coastlines, riverbeds, cloud formations, trees, and so on. It was Benoit B. Mandelbrot who opened our eyes to the fractal geometry of nature.
The processes that generate fractals are simple feedback processes in which the same operation is carried out repeatedly.
The accompanying picture illustrates this feedback loop.
The only requirement here is a non-linear relation between input and output. The rule x → f(x) will depend on a parameter c, whose influence won’t be discussed here, because it would be too complex.
Our interest now is the behaviour of this iteration over a long period of time. What will the sequence do? Reach a limit value and rest there, settle into a typical cycle of values that repeats over and over again, or remain unpredictable for all time?
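All three long-term behaviours can be seen in one short sketch. The logistic map f(x) = c·x·(1−x) is used here as a stand-in for the unspecified non-linear rule x → f(x) in the text, and the parameter values are standard illustrations, not taken from the original:

```python
# Iterate the feedback rule x -> f(x) with f(x) = c*x*(1-x), discard a
# long transient, and look at the tail of the sequence.

def orbit(c, x0=0.2, skip=500, keep=8):
    """Iterate x -> c*x*(1-x), discard the transient, return the tail."""
    x = x0
    for _ in range(skip):
        x = c * x * (1.0 - x)
    tail = []
    for _ in range(keep):
        x = c * x * (1.0 - x)
        tail.append(round(x, 4))
    return tail

print(orbit(2.8))   # settles on a limit value: the same number repeats
print(orbit(3.2))   # a cycle of period 2: two values alternate forever
print(orbit(3.9))   # chaos: the values wander without ever repeating
```

Only the parameter c changes between the three calls; the rule itself stays the same, which is exactly why its influence is so delicate.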
Physicists like to think in terms of infinitesimal time-steps: natura non facit saltus (“nature makes no leaps”). Biologists often prefer to look at changes from year to year or from generation to generation. Both views are possible, and only the circumstances determine which description is appropriate.
What do we mean by chaos?
In simple terms, the system has gone out of control: there is no way to predict its long-term behaviour. The surprise: the sequence is determined by its initial value, and yet it cannot be predicted other than by letting it run. The problem is that any real description of the initial value of the sequence, its representation in a computer for instance, can only be given with finite precision. The process can be viewed as an unfolding of information: the longer we observe it, the more we know.
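The effect of that finite precision can be made concrete. In this sketch the logistic map at c = 4 serves as a standard fully chaotic example (the map and the parameter are illustrative assumptions, not specified in the text); two starting values that agree to ten decimal places soon disagree completely:

```python
# Sensitive dependence on initial conditions: follow two trajectories
# whose starting values differ by only 1e-10 and watch the gap grow.

def trajectory(x0, steps, c=4.0):
    """Return the list [x0, f(x0), f(f(x0)), ...] for f(x) = c*x*(1-x)."""
    xs = [x0]
    for _ in range(steps):
        xs.append(c * xs[-1] * (1.0 - xs[-1]))
    return xs

a = trajectory(0.3, 60)
b = trajectory(0.3 + 1e-10, 60)
for n in (0, 10, 30, 50):
    print(n, round(a[n], 6), round(b[n], 6), round(abs(a[n] - b[n]), 6))
```

The tiny initial difference is amplified at every step, so after a few dozen iterations the two sequences are unrelated: exactly the “unfolding of information” described above.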
The most exciting aspect is not the chaos as such, but the scenario by which the order turns into chaos.
An exact analysis of the bifurcation points (the parameter values at which the period of the oscillation doubles) in this scenario shows that the doubling factor approaches a universal value of d = 4.669201 ... as the period increases. This number is called the “Feigenbaum number”, because Mitchell Feigenbaum discovered its universality.
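The number can be estimated with a few lines of code. A common route (an assumption here, since the text names no particular map) uses the logistic map f(x) = c·x·(1−x): a parameter c with f^(2^n)(0.5) = 0.5 is called “superstable”, and the gaps between successive superstable parameters shrink by the Feigenbaum factor. This is only a sketch; the bracketing heuristic below works for the first few doublings but is not a general-purpose root finder:

```python
# Estimate the Feigenbaum constant from superstable parameters of the
# logistic map.  The first two are known exactly: c = 2 (since f(0.5)
# = 0.5 there) and c = 1 + sqrt(5); the rest are found by bisection.

def iterate(c, x, n):
    """Apply f(x) = c*x*(1-x) n times."""
    for _ in range(n):
        x = c * x * (1.0 - x)
    return x

def g(c, n):
    """f^(2^n)(0.5) - 0.5; zero exactly at superstable parameters."""
    return iterate(c, 0.5, 2 ** n) - 0.5

def bisect(fn, lo, hi, tol=1e-12):
    flo = fn(lo)
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if flo * fn(mid) <= 0:
            hi = mid
        else:
            lo, flo = mid, fn(mid)
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

s = [2.0, 1.0 + 5 ** 0.5]                  # superstable c for periods 1 and 2
for n in range(2, 5):                      # periods 4, 8, 16
    step = s[-1] - s[-2]
    guess = s[-1] + step / 4.669           # geometric extrapolation
    lo, hi = guess - step / 20, min(guess + step / 20, 3.5699)
    s.append(bisect(lambda c: g(c, n), lo, hi))

deltas = [(s[i] - s[i - 1]) / (s[i + 1] - s[i]) for i in range(1, len(s) - 1)]
print(s)        # superstable parameters, approaching ~3.5699
print(deltas)   # successive ratios, approaching d = 4.669...
```

Even with only four doublings the ratios already land within a fraction of a percent of Feigenbaum’s value.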
The discovery has spurred enormous activity among scientists in many fields. Mathematicians, for example, are still trying to fully understand this unexpected universality. Perhaps more importantly, it has boosted a general hope that non-linear phenomena may not be out of reach of systematic scientific classification.
The history of fractals before Mandelbrot
Like new forms of life, new branches of mathematics and science don’t appear from nowhere. The ideas of fractal geometry can be traced to the late nineteenth century, when mathematicians created shapes (sets of points) that seemed to have no counterpart in nature. By a wonderful irony, the “abstract” mathematics descended from that work has now turned out to be more appropriate than any other for describing many natural shapes and processes.
Perhaps we shouldn’t be surprised. The Greek geometers worked out the mathematics of the conic sections for its formal beauty; it was two thousand years before Copernicus and Brahe, Kepler and Newton overcame the preconception that all heavenly motions must be circular, and found the ellipse, parabola and hyperbola in the paths of planets, comets, and projectiles.
In the 17th century Newton and Leibniz created calculus, with its techniques for “differentiating”, or finding the derivative of, functions: in geometric terms, finding the tangent of a curve at any given point. True, some functions were discontinuous, with no tangent at a gap or an isolated point, and some had singularities: abrupt changes in direction at which the idea of a tangent becomes meaningless. But these were seen as exceptional, and attention was focused on the “well-behaved” functions that worked well in modelling nature.
Beginning in the early 1870s, though, a 50-year crisis transformed mathematical thinking. Weierstrass described a function that was continuous but nondifferentiable (no tangent could be drawn at any point). Cantor showed how a simple, repeated procedure could turn a line into a dust of scattered points, and Peano generated a convoluted curve that eventually touches every point on a plane. These shapes seemed to fall “between” the usual categories of one-dimensional lines, two-dimensional planes and three-dimensional volumes. Most mathematicians still saw them as “pathological” cases, but here and there they began to find applications.
In other areas of mathematics, too, strange shapes began to crop up. Poincaré attempted to analyse the stability of the solar system in the 1880s and found that the many-body dynamical problem resisted traditional methods. Instead, he developed a qualitative approach, a “state space” in which each point represented a different planetary orbit, and studied what we would now call the topology (the “connectedness”) of whole families of orbits. This approach revealed that while many initial motions quickly settled into the familiar curves, there were also strange, “chaotic” orbits that never became periodic and predictable.
Other investigators, trying to understand fluctuating, “noisy” phenomena (the flooding of the Nile, price series in economics, the jiggling of molecules in Brownian motion in fluids), found that traditional models could not reproduce their apparently arbitrary scaling features: spikes in the data became rarer as they grew larger, but never disappeared entirely.
For many years these developments seemed unrelated, but there were tantalising hints of a common thread. Like the pure mathematicians’ curves and the chaotic orbital motions, the graphs of irregular time series often had the property of self-similarity: a magnified small section looked very similar to a large one over a wide range of scales.
Who is this “Mandelbrot”, anyway?
While many pure and applied mathematicians advanced these trends, it is Benoit B. Mandelbrot above all who saw what they had in common and pulled the threads together into the new discipline.
He was born in Warsaw in 1924, and moved to France in 1935. In a time when French mathematical training was strongly analytic, he visualised problems whenever possible, so that he could attack them in geometric terms. He attended the Ecole Polytechnique, then Caltech, where he encountered the tangled motions of fluid turbulence.
In 1958 he joined IBM, where he began a mathematical analysis of electronic “noise” and began to perceive a structure in it, a hierarchy of fluctuations of all sizes, that could not be explained by existing statistical methods.
Through the years that followed, one seemingly unrelated problem after another was drawn into the growing body of ideas he would come to call fractal geometry.
As computers gained more graphic capabilities, the skills of his mind’s eye were reinforced by visualisation on display screens and plotters. Again and again, fractal models produced results (series of flood heights, or cotton prices) that experts said looked like “the real thing”.
Visualisation was extended to the physical world as well. In a provocative essay titled “How long is the coast of Britain?” Mandelbrot noted that the answer depends on the scale at which one measures: it grows longer and longer as one takes into account every bay and inlet, every stone, every grain of sand. And he codified the self-similarity characteristic of many fractal shapes: the reappearance of geometrically similar features at all scales.
First in isolated papers and lectures, then in two editions of his seminal book, he argued that many of science’s traditional mathematical models are ill-suited to natural forms and processes: in fact, that many of the “pathological” shapes mathematicians had discovered generations before are useful approximations of tree bark and lung tissue, clouds and galaxies.
Mandelbrot was named an IBM Fellow in 1974, and continues to work at the IBM Watson Research Centre. He has also been a visiting professor and guest lecturer at many universities.
Fractals have three important properties:
1. They are generated by relatively simple calculations, repeated over and over, feeding the results of each step back into the next: something computers can do very rapidly.
2. They are, quite literally, infinitely complex: they reveal more and more detail without limit as you plot smaller and smaller areas.
3. They can be astonishingly beautiful, especially using a PC colour display’s ability to assign colours to selected points and to “animate” the images by quickly shifting those colour assignments.
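All three properties come together in the best-known fractal of all, the Mandelbrot set: iterate z → z² + c for each point c of the plane and colour c by how quickly z escapes. A minimal text-mode sketch (the resolution, viewing window, and character ramp are arbitrary choices made here for illustration):

```python
# Escape-time rendering of the Mandelbrot set as ASCII art.  Points
# inside the set never escape and get the densest character; outside
# points are shaded by their escape count.

def escape_count(c, limit=30):
    """Number of iterations of z -> z*z + c before |z| exceeds 2."""
    z = 0j
    for n in range(limit):
        z = z * z + c
        if abs(z) > 2.0:
            return n
    return limit

chars = " .:-=+*#%@"
for row in range(21):
    y = 1.2 - row * 0.12
    line = ""
    for col in range(64):
        x = -2.1 + col * 0.05
        n = escape_count(complex(x, y))
        line += chars[min(n * len(chars) // 30, len(chars) - 1)]
    print(line)
```

Zooming in (shrinking the x and y ranges around a boundary point) reveals ever more detail, which is property 2 in action; swapping the character ramp for a colour palette gives property 3.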