A "maximizing plant" if you prefer.
I uploaded this function because it is special: most functions used to test optimization algorithms are designed to be continuous, and the best algorithms also require the function to be differentiable once or twice.
This fractal algorithm can surely be beaten on those functions by standard algorithms, but it can scan any kind of function (even random noise can be maximized to some degree) in any number of dimensions.
The function shown is clearly not continuous, and I deliberately placed a gap between the initial position and the global best, yet the particle still converges quite directly to the best.
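To illustrate why a method that only evaluates the function (no gradients) can cross such a gap, here is a minimal sketch. This is *not* the fractal algorithm itself, just a generic greedy random search as a stand-in; the test function `gapped` and all parameter values are my own invented examples of a discontinuous landscape with a "forbidden" region between the start and the global best.

```python
import random

def gapped(x):
    """Hypothetical discontinuous objective: a penalized 'gap' region in
    (2, 3) plus a jump at x = 4; global maximum of 1.0 at x = 5."""
    if 2.0 < x < 3.0:                       # the gap: never worth entering
        return -100.0
    return -abs(x - 5.0) + (1.0 if x > 4.0 else 0.0)

def stochastic_search(f, x0, steps=5000, sigma=1.0, seed=0):
    """Keep the best point seen; propose Gaussian perturbations around it.
    Needs only function evaluations: no continuity, no derivatives."""
    rng = random.Random(seed)
    best_x, best_f = x0, f(x0)
    for _ in range(steps):
        cand = best_x + rng.gauss(0.0, sigma)
        fc = f(cand)
        if fc > best_f:                     # greedy: accept only improvements
            best_x, best_f = cand, fc
    return best_x, best_f

x, fx = stochastic_search(gapped, x0=0.0)   # start on the far side of the gap
print(x, fx)
```

Because proposals are drawn around the best point rather than following a gradient, a single large perturbation can leap over the penalized region, which is why discontinuities and gaps do not trap this kind of search the way they would trap a derivative-based method.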
But what about convergence? Can you start from any initial point and still reach the global optimum? Well, the fractal does quite a nice job of scanning the state space, so the most I can say is: it will try hard to converge from any initial point.
It is good at converging, but I cannot formally prove it in any way; that is the naked truth (fractals are not so nice for proving things). Just have a look at the following video.
I will show you 400 initial positions and their convergence to the global best (blue dots correspond to a slow fractal growth, while yellow ones use a more aggressive approach):
As you can see, all points converge, slower or faster, to the right position... and the algorithm can then be easily parallelized.
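The parallelization is trivial because each run from an initial position is fully independent. A sketch of the "many initial positions" experiment, again using a generic greedy random search as a stand-in for the fractal algorithm (the objective `f`, the 400 starting points, and all parameters are illustrative assumptions of mine):

```python
import random
from concurrent.futures import ThreadPoolExecutor

def f(x):
    """Hypothetical discontinuous objective: jump at x = 4,
    global maximum of 1.0 at x = 5."""
    return 1.0 - abs(x - 5.0) if x > 4.0 else -abs(x - 5.0)

def search(x0, steps=3000, sigma=1.0):
    """One independent greedy random-search run, seeded by its start."""
    rng = random.Random(int(x0 * 1000))
    best_x, best_f = x0, f(x0)
    for _ in range(steps):
        cand = best_x + rng.gauss(0.0, sigma)
        if f(cand) > best_f:
            best_x, best_f = cand, f(cand)
    return best_x, best_f

starts = [i * 0.1 for i in range(-200, 200)]    # 400 initial positions
with ThreadPoolExecutor() as pool:              # runs share nothing
    results = list(pool.map(search, starts))
near = sum(1 for _, fb in results if fb > 0.5)
print(near, "of", len(starts), "runs ended near the optimum")
```

A thread pool is enough here to show the structure; for a CPU-heavy objective you would reach for a process pool instead, since the runs never need to communicate.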
I hope you enjoyed watching!
UPDATE: I have added a benchmark of this algorithm in this post.