Sunday, October 11, 2020

Chaos Still Reigns!!

A quick follow-on to last week's post. It would be blasphemous to talk about chaos without at least mentioning the most iconic fractal since fractals became a thing via The Fractal Geometry of Nature: the set of complex numbers named after that book's author, Benoit Mandelbrot. I've spent a lot of time exploring the Mandelbrot set ever since I've had access to computers, so you'd think I'd have plenty to say about it. And I do, just not this week. Indeed it is this, more than anything else, that started me on the visualization track. Instead, I'll leave you with the real reason the thing is so damn captivating in the first place, with, what else, a visualization.

The Mandelbrot Set with $-1.01 \leq\operatorname{Re}(z) \leq -1.006$ and $0.309 \leq \operatorname{Im}(z) \leq 0.313$.
 
The Mandelbrot set is in fact relevant to the strongest theme of this blog, that of parameter spaces. Namely, it is the set of all parameters $c$ such that the iteration $z \mapsto z^2 + c$, starting at $z=0$, remains bounded. It is in fact a catalogue of another family of fractals, the Julia sets (just like having a catalogue of all lines in the plane), which describe, for each fixed $c$, which starting values of the iteration $z \mapsto z^2 + c$ stay bounded. In other words, Julia sets describe a collection involving the $z$ values of the iteration, while the Mandelbrot set describes a collection involving the $c$ values. The main dark blobby part of the image is the actual set; the fancy colors just encode how long it takes for a point to escape outside a certain disk in the plane ($|z|\leq 2$). Here the colors are drawn from the Viridis palette, and the iteration is carried out up to 1000 times (a point surviving all 1000 iterations is simply considered to be in the set). The palette is cycled by the number of iterations modulo 50 (the color repeats every 50 iterations), which is why you see those interesting discontinuous jumps in color. Enjoy. (The next post will probably include some examples of Julia sets, showing precisely how the Mandelbrot set is a catalogue of them.) I leave you with my all-time favorite from the '90s, the "Jewel Box," so named by Rollo Silver in his fractal newsletter Amygdala. Unfortunately, I don't know where he is now, nor what the state of the newsletter is!
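The escape-time coloring described above is straightforward to sketch in code. This is a minimal version under my own naming; the original images were of course rendered over a full grid of $c$ values, which is omitted here:

```python
def escape_time(c, max_iter=1000, radius=2.0):
    """Iterate z -> z^2 + c from z = 0 and return the number of steps
    before |z| exceeds `radius`; a point surviving all `max_iter`
    steps is treated as a member of the set."""
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > radius:
            return n
    return max_iter

def palette_index(n, cycle=50):
    """Cycle through the palette every `cycle` iterations; this modulo
    is what produces the discontinuous jumps in color."""
    return n % cycle
```

Mapping `palette_index` through Viridis (e.g., matplotlib's `viridis` colormap) over a grid of $c$ values reproduces the kind of image shown above.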

Centered at $-0.74542846 + 0.11300888\mathsf{i}$ and going about $10^{-5}$ on either side in the $x$-direction


Saturday, October 3, 2020

Chaos Reigns!!!

 

Chaos and fractals were all the rage when I was growing up, and were probably among the major factors in getting me interested in math, programming, and differential equations, since much of this is ultimately about how systems evolve over time. Oh, and in getting me into TOTALLY AWESOME visualizations. This is a graph of the various possible final states of the quadratic iterator $x_{n+1} = ax_n (1-x_n)$. What this means is: for a given parameter $a$, and some start value in the range $(0, 1)$, we consider iterating the map $F(a, x) = ax(1-x)$, namely, the sequence $F(a, x)$, $F(a, F(a, x))$, $F(a, F(a, F(a, x))), \dots$. For some ridiculous history here: one of my favorite things to do as a child was to repeatedly press buttons on a calculator over and over again to see what happens. Little did I know that this is basically, in math, the only thing that actually exists: everything is built on the foundation of recursion. Of course, I don't actually believe recursion is the only thing that exists; rather, there is interesting conceptual structure that is built up from fundamental components, and it is damn amazing how something so simple can ultimately create something so complex. And here is one example in which this is true.

Anyway, to get back to it: if we keep iterating $F$ in the variable $x$, we will eventually fall into some stable pattern. For small $a$, iterating $F$ in the variable $x$ eventually converges to a fixed point, namely $1-\frac{1}{a}$. How did we find this? If we have a fixed point $x = F(a, x)$, then $x = ax(1-x)$, or $-ax^2 + (a-1)x = 0$. This gives two solutions: $x = 0$, and, from $-ax + (a-1) = 0$, $x = (a-1)/a = 1 - \frac{1}{a}$.
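You can watch this convergence happen with a few lines of code (a minimal sketch; the function name and the particular values of $a$ and the start point are my own choices):

```python
def F(a, x):
    """One step of the quadratic iterator: x -> a x (1 - x)."""
    return a * x * (1 - x)

a, x = 2.5, 0.2
for _ in range(200):
    x = F(a, x)
# x has settled on the fixed point 1 - 1/a = 0.6
```

This is exactly the calculator-button game from above, just automated.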

Now, for each such $a$ (which we take to be the horizontal axis), we plot this fixed point (on the vertical axis). As you move up to $a=3$, all is peaceful: all iterations eventually settle on one and only one point. But at $a = 3$, something strange happens. You stop getting a single fixed point; instead, the iteration starts bouncing between two values. We can actually solve for what these two values are using none other than ... the cubic equation (that's the small extent of continuity with the previous posts!). The way to see this: if there's a period-2 sequence, then $x = F(a, y)$ and $y = F(a, x)$ for the two values, which means $x = F(a, F(a, x))$ and therefore $x = aF(a, x)(1-F(a, x)) = a(ax(1-x))(1- ax(1-x))$. This is a priori a quartic equation, since the $x$'s eventually multiply out to degree $4$, but once you realize there's a trivial solution $x = 0$ (because a factor of $x$ multiplies both sides), it reduces to the cubic equation $a^2(1-x)(1-ax(1-x)) - 1 = 0$. I'm not going to try solving this, despite having had much fun wrangling with Cardano's formula in the past few weeks, because that distracts from the ultimate insight (and, as fun as that formula is, along with the beautiful theory that eventually grows out of trying to solve polynomial equations, Galois theory, you quickly enter territory in which you stop being able to solve for things in terms of roots). To visualize, we instead proceed numerically, directly iterating from a start value and waiting for things to stabilize around a set of values. One sees that as $a$ grows higher and higher, the number of final points keeps doubling, faster and faster (in fact, successive doubling intervals shrink by a factor approaching the Feigenbaum constant), until all chaos breaks loose and the iterator never settles on any finite set of points. It's very weird how, just by varying the parameter $a$, one can go from very nice fixed-point behavior to wild chaos.
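The numerical procedure just described (iterate, discard the transient, record what the orbit settles onto) can be sketched as follows; the function name, cutoffs, and rounding tolerance are my own choices:

```python
def logistic_attractor(a, n_transient=500, n_keep=100, x0=0.5):
    """Iterate x -> a x (1 - x), throw away the transient, and return
    the sorted distinct values the orbit settles onto (rounded so that
    near-duplicates from floating point merge)."""
    x = x0
    for _ in range(n_transient):
        x = a * x * (1 - x)
    orbit = set()
    for _ in range(n_keep):
        x = a * x * (1 - x)
        orbit.add(round(x, 6))
    return sorted(orbit)
```

Plotting `logistic_attractor(a)` against `a` over a fine grid of parameters is precisely how diagrams like the one above are made: one value below $a = 3$, two just past it, four a while later, and a smear of points once chaos sets in.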

OK, so you can't get very far analytically, but we can still try anyway:


To explain it, we'll have to back up a little. Note that I said the $2$-periodic solution involves solving a cubic equation, yet only two new branches appear. Where is the third root? (Actually, as previously mentioned, the equation is really quartic, and $x=0$ is a solution.) Could the cubic have a zero discriminant, i.e., a repeated root? Not generically. Remember that these "fixed" points come about from iterating the mappings, i.e., there is dynamics involved. Solving the cubic, you will actually get $1 - 1/a$, the original fixed point, as one of the solutions (in fact, here is again where numerical analysis helps: you can get around using Cardano's formula if you realize from the picture that the original fixed point continues; then you guess the solution $1-1/a$, factor it out with synthetic division, and solve the remaining quadratic equation; this is exactly how I taught college algebra students to solve cubics). Of course, $1 - 1/a$ satisfies $x = F(a, F(a, x)) = G(a, x)$ trivially, while the other two solutions are genuine period-2 points: they are fixed points of $G$, and when $F$ is evaluated at either one, it alternates between the two values. What goes wrong is that the point $1 - 1/a$ (as well as the point $x=0$) is unstable: if you start iterating from some point even slightly away from it, the iteration will run away from $1 - 1/a$. What is significant about the value $a = 3$ is that below it, the discriminant of the cubic forces two of the roots to be complex conjugates, so there is only one real solution; and the sign change of the discriminant at $a = 3$ is also the switchover point between stability of the original fixed point and stability of the new pair. A fixed point $x$ is stable whenever $|\partial G/\partial x| < 1$ at that point. This can be roughly seen by linearizing about the fixed point, a standard technique:
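Carrying out that synthetic division concretely: after factoring $x = 0$ and $x = 1 - 1/a$ out of $x = F(a, F(a, x))$, what remains is the quadratic $a^2x^2 - a(a+1)x + (a+1) = 0$, whose discriminant contains the factor $(a+1)(a-3)$ and so changes sign exactly at $a = 3$. A sketch (function names mine):

```python
import math

def F(a, x):
    """One step of the quadratic iterator: x -> a x (1 - x)."""
    return a * x * (1 - x)

def period2_points(a):
    """Roots of a^2 x^2 - a(a+1) x + (a+1) = 0, the quadratic left over
    after factoring x = 0 and x = 1 - 1/a out of F(F(x)) = x.
    Real only for a >= 3, which is exactly where the 2-cycle is born."""
    disc = (a + 1) * (a - 3)
    if disc < 0:
        return None          # below a = 3: complex conjugate roots
    r = math.sqrt(disc)
    return ((a + 1 - r) / (2 * a), (a + 1 + r) / (2 * a))

p, q = period2_points(3.2)
# F swaps the two points: F(p) = q and F(q) = p
```

The check that $F$ swaps the two roots is exactly the statement that they form a period-2 orbit rather than two separate fixed points.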

\[G(a, x+h) \approx G(a, x) + (\partial G/\partial x(a, x))h + O(h^2) = x + (\partial G/\partial x) h + O(h^2).\]

In other words, $G(a, x+h)$ is closer to $x$ than $x + h$ is, because $h$ is being multiplied by something of magnitude $< 1$; iterating brings points close to the fixed point even closer. It can be seen that, at least for a little while after $a = 3$, the central branch $1-1/a$ continues, but the other two solutions are the stable fixed points. So if you plot the curve $x = G(a, x)$, you'll get something that splits into $3$ branches at $a = 3$ (referred to as a pitchfork bifurcation). Similarly, if you plot $x = F(a, F(a, F(a, F(a, x)))) = H(a, x)$, the 4th-order iterate, it will divide into pitchforks twice. This is what the above plot shows. To figure out which fixed points will actually appear if you iterate from a random starting point, you can determine where $|\partial H/\partial x| < 1$: these are the regions in the plane where solutions are stable. You can alternatively plot the complement, where the magnitude is $\geq 1$, which can be made to cover up the parts of the pitchforks that are not stable solutions. The weird gloopy visualization of these regions is just cool to look at, even where there aren't any stable points, and the curves you see inside are still the correct initial branches before everything dissolves into a complete mess. It of course quickly becomes ferociously hard to compute for higher-order iterates, so a numerical study is still the way to go. Still, this is conceptually fun!
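The stability criterion itself is easy to evaluate via the chain rule, since $\partial G/\partial x = F'(F(x))\,F'(x)$. A small sketch with my own function names, using $a = 3.2$ as the test parameter:

```python
def F(a, x):
    """One step of the quadratic iterator: x -> a x (1 - x)."""
    return a * x * (1 - x)

def dF(a, x):
    """Partial derivative of F with respect to x."""
    return a * (1 - 2 * x)

def G_multiplier(a, x):
    """d/dx of G(a, x) = F(a, F(a, x)) at x, by the chain rule."""
    return dF(a, F(a, x)) * dF(a, x)

a = 3.2
fixed = 1 - 1 / a   # the old fixed point, now unstable
p = (a + 1 - ((a + 1) * (a - 3)) ** 0.5) / (2 * a)   # one period-2 point
# |dG/dx| > 1 at the old fixed point, < 1 on the 2-cycle
```

At $a = 3.2$ the multiplier at $1 - 1/a$ comes out to $1.44$ (unstable), while on the 2-cycle it is well below $1$, which is exactly why the numerical iteration lands on the pair of values rather than the middle branch.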

I leave you with a similar implicit function graph, but now color-coded:


The red and blue regions correspond to $|\partial G/\partial x| \geq 1$, and the green regions are where it is $< 1$.

Another version, executed much faster using TensorFlow (yes, it can be used for non-machine-learning applications!).