After a chat with David Ha about this idea, I was convinced it was worth a shot. It is a simplified implementation of Ken Stanley's Compositional Pattern Producing Network (CPPN) with a twist. Ken's CPPNs are similar to Artificial Neural Networks, but they differ in the non-homogeneous way they use both inputs and activation functions, and in the fact that they usually take coordinates as inputs and output a single value (pixel intensity, or pixel color if you output 3 values) for those coordinates. Because of this mapping from coordinates to intensities / colors, they are used for drawing generative art, as in PicBreeder.
When I say non-homogeneous, what I mean is that neurons can have any number of inputs (including the outputs of other neurons) and can apply to them any sort of function (including sine and cosine, or RBFs, for instance). Because of this generality, one of the problems CPPNs have is that there can be cycles in the connectivity path from inputs to outputs. Several strategies for resolving this exist in the literature.
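To make this concrete, here is a minimal sketch (not from the actual codebase; `cppn_node` and its weights are hypothetical) of what a single CPPN-style node looks like: the same weighted sum of inputs can be passed through any activation, a sine as easily as a Gaussian RBF.

```python
import numpy as np

# Hypothetical CPPN-style node: unlike a standard neuron, it may apply
# any activation (sine, Gaussian/RBF, ...) to its weighted inputs.
def cppn_node(inputs, weights, activation):
    return activation(np.dot(inputs, weights))

gaussian = lambda z: np.exp(-z ** 2)     # RBF-like activation
coords = np.array([0.5, -0.25, 1.0])     # e.g. x, y, and a bias input
w = np.array([1.2, -0.7, 0.3])

out_sin = cppn_node(coords, w, np.sin)   # periodic response
out_rbf = cppn_node(coords, w, gaussian) # bump-shaped response
```

Since nodes can also take other nodes' outputs as inputs, nothing in this formulation prevents a cycle, which is exactly the issue mentioned above.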
David Ha implemented in TensorFlow a simplified version of CPPNs in the form of a classic layered Artificial Neural Network architecture that takes coordinates as inputs, projects them linearly to obtain a hidden representation, and then feeds that into a regular Neural Network. The output is still pixel intensity or color, as in a CPPN.
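A rough NumPy sketch of this layered variant (my own simplification, not David Ha's code; the choice of `(x, y, r)` inputs and tanh layers is an assumption based on common coordinate-network setups): every pixel's coordinates go through the same small network, and the scalar output is that pixel's intensity.

```python
import numpy as np

rng = np.random.default_rng(0)
size, hidden = 64, 32

# One (x, y, r) coordinate triple per pixel of a size x size image.
xs = np.linspace(-1, 1, size)
x, y = np.meshgrid(xs, xs)
r = np.sqrt(x ** 2 + y ** 2)
coords = np.stack([x.ravel(), y.ravel(), r.ravel()], axis=1)

w_in = rng.standard_normal((3, hidden))
w_h = rng.standard_normal((hidden, hidden))
w_out = rng.standard_normal((hidden, 1))

h = np.tanh(coords @ w_in)             # linear projection + nonlinearity
h = np.tanh(h @ w_h)                   # regular hidden layer
img = 1 / (1 + np.exp(-(h @ w_out)))   # sigmoid -> intensity in (0, 1)
img = img.reshape(size, size)
```

Because the network is a function of continuous coordinates, the same weights can render the image at any resolution by just sampling a denser grid.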
I made some improvements to his code, adding a video recording feature and porting it to Python 3 and a newer version of TensorFlow, alongside some other cleanup, but the main twist concerns one of the assumptions of the CPPN. As said before, loops are an issue for CPPNs, but because of the fixed nature of the architecture and the lack of repetition in the unfolding of the layers in David Ha's implementation, the model can't produce recursive patterns. So I added an additional input parameter to the model: the number of times the model should feed its own output back as its input (it is actually a metaparameter). In this way, by controlling the depth of the recursion and constructing the architecture of the network and its computational graph dynamically, the model regains the ability to produce recursive, fractal-like patterns. I named this model Recursive Pattern Producing Network (RPPN), and it can generate beautiful high-resolution abstract images like the first two in the gallery, as well as fractal-like patterns like the third one.
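The recursive twist above can be sketched as follows (again an illustrative NumPy reconstruction, not the RPPN source; the exact way the previous output is fed back in is an assumption): the network gets the previous output as an extra input channel, and the graph is unrolled `depth` times.

```python
import numpy as np

rng = np.random.default_rng(1)
size, hidden, depth = 64, 32, 3   # depth: the recursion metaparameter

xs = np.linspace(-1, 1, size)
x, y = np.meshgrid(xs, xs)
r = np.sqrt(x ** 2 + y ** 2)

def forward(inputs, w_in, w_out):
    h = np.tanh(inputs @ w_in)
    return 1 / (1 + np.exp(-(h @ w_out)))  # intensity in (0, 1)

w_in = rng.standard_normal((4, hidden))    # x, y, r, previous output
w_out = rng.standard_normal((hidden, 1))

out = np.zeros((size * size, 1))           # initial feedback value
for _ in range(depth):
    # The model's own output becomes a fourth input channel.
    inputs = np.stack([x.ravel(), y.ravel(), r.ravel(),
                       out.ravel()], axis=1)
    out = forward(inputs, w_in, w_out)

img = out.reshape(size, size)
```

Unrolling the feedback this way keeps the computational graph acyclic (so no special cycle-resolution strategy is needed) while still letting the output at one depth shape the output at the next, which is where the self-similar, fractal-like structure comes from.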