March 13, 2012

GenShaders – Part 1

by Cody Brocious

I recently embarked on a new experiment which I believe will be of particular interest to the Displayhack audience.
I started with a fairly simple question: is it possible to use genetic algorithms to generate coherent, aesthetically pleasing textures?

As the idea evolved in my mind and in conversations with friends, I decided to build a simple app to test out the concept and write about it as it progressed.

Before I dive in, give the current version a shot. You can access the bleeding edge at demoseen.com/genshaders and the current version at demoseen.com/genshaders-1 – I wanted to keep these separate so that you could see where I was at each point.

Note: You need a WebGL-friendly browser for this to work.

First Steps

My first thought for an interface was a simple 4×3 grid of WebGL canvases. Why WebGL? It’s easy, fast, and shaders can be used to generate the textures. The design was largely inspired by the Chrome new tab page — hey, it works!

From there, I had to actually start generating shaders. I decided that the best place to start was doing it purely randomly, so I created a shader builder based on a few simple practices:

  • The body of the shader was `gl_FragColor = vec4($genValue$, $genValue$, $genValue$, 1.0);`, meaning each value going into the output color was generated separately.
  • The genValue function — during shader generation — would randomly return one of the following:
    • An existing value (initially either the X or Y coord for the current pixel)
    • A random float
    • Two other genValue’d expressions combined with a random arithmetic operation
    • A genValue’d expression passed through a function such as sin, cos, tan, sqrt, etc.
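The recursive scheme above can be sketched in a few lines of JavaScript. This is an illustrative reconstruction, not the article's actual code: the helper names (other than genValue), the operator and function lists, and the equal weighting of the four cases are all assumptions.

```javascript
// Sketch of a purely random expression generator for a fragment shader.
// Each call returns a GLSL float expression as a string.
const existingValues = ['p.x', 'p.y']; // X and Y coords of the current pixel
const operators = ['+', '-', '*', '/'];
const functions = ['sin', 'cos', 'tan', 'sqrt'];

function choice(list) {
  return list[Math.floor(Math.random() * list.length)];
}

function genValue() {
  switch (Math.floor(Math.random() * 4)) {
    case 0: // an existing value
      return choice(existingValues);
    case 1: // a random float literal
      return (Math.random() * 10).toFixed(4);
    case 2: // two sub-expressions joined by a random arithmetic operator
      return '(' + genValue() + ' ' + choice(operators) + ' ' + genValue() + ')';
    default: // a sub-expression wrapped in a unary function
      return choice(functions) + '(' + genValue() + ')';
  }
}

// Splice three independently generated expressions into the output color.
const shader = 'gl_FragColor = vec4(' +
  genValue() + ', ' + genValue() + ', ' + genValue() + ', 1.0);';
```

With these weights the expected branching factor is below one, so the recursion terminates on its own; the depth-capping described later becomes necessary once richer (and more heavily recursive) cases are added.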

Based on that simple scheme, I got some really interesting results.

But around this time, I realized that my initial concept of this as a standalone tool that lets you drill down to the exact texture you want was simply not the best way to use it.

Evolution #1: Time

I mentioned earlier that genValue could pass back an existing value. Well, the first step I took away from generating static textures was to throw time (as floating point seconds) into the mix. Immediately, shaders that looked completely uninteresting before started looking amazing. Everywhere you looked, there was action. You’d get everything from old-school plasma effects to amazing shifting colors and rays of light.
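Wiring time in is just a matter of feeding elapsed seconds to a uniform each frame. A minimal sketch, assuming the fragment shader declares a `uniform float time;` (the name is my assumption) and that `gl` and `program` are an existing WebGL context and linked shader program:

```javascript
// Sketch: drive a `time` uniform with floating-point seconds on every frame.
// Assumes `gl` (WebGL context) and `program` (linked shader program) exist.
const start = Date.now();

function secondsSinceStart(nowMs) {
  return (nowMs - start) / 1000; // time as floating-point seconds
}

function frame() {
  gl.uniform1f(gl.getUniformLocation(program, 'time'),
               secondsSinceStart(Date.now()));
  gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4); // full-screen quad
  requestAnimationFrame(frame);
}
```

Once the uniform exists, the generator only has to add `time` to its pool of existing values and every expression that samples it starts animating for free.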

Once I saw the effects of this, I decided to expand it. Instead of just floats and unary functions, I created vector-creation methods and expanded the function catalog to include binary float functions (e.g. mod, atan, pow). But as the complexity increased, so did the odds of it going off the deep end: it would often recurse through hundreds of genValue/genVec* calls, causing stack overflows, or produce shaders so large that they simply wouldn't compile. To get around this, I added a level value to each generator function, so that it could keep track of how deep it was in the hierarchy at any point.

One nice thing about having the level value around was that I could point the random generator in the right direction; if I was more than n levels deep, I wouldn’t generate vectors, for instance, because I knew it was highly likely that they’d end up tanking the rest of the code. I spent a while just tweaking the values here to get to a place where things compiled quickly and still looked good.
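The depth-limited version might look something like the sketch below. Only the ideas come from the text: a level counter threaded through every generator, no vectors past a certain depth, and terminals-only near the cap. The threshold values, case weights, and helper names are all illustrative assumptions.

```javascript
// Depth-limited sketch of the expanded generator.
const MAX_LEVEL = 8;   // assumed hard cap on expression depth
const VEC_CUTOFF = 3;  // assumed depth past which vectors aren't generated

const existingFloats = ['p.x', 'p.y', 'time']; // time is now in the mix
const unaryFuncs = ['sin', 'cos', 'tan', 'sqrt'];
const binaryFuncs = ['mod', 'atan', 'pow'];    // two-argument float functions
const operators = ['+', '-', '*', '/'];

function choice(list) { return list[Math.floor(Math.random() * list.length)]; }

function genValue(level) {
  // Near the cap, only emit terminals so the expression tree stops growing.
  if (level >= MAX_LEVEL) {
    return Math.random() < 0.5 ? choice(existingFloats)
                               : (Math.random() * 10).toFixed(4);
  }
  switch (Math.floor(Math.random() * 5)) {
    case 0: return choice(existingFloats);
    case 1: return (Math.random() * 10).toFixed(4);
    case 2: return '(' + genValue(level + 1) + ' ' + choice(operators) +
                   ' ' + genValue(level + 1) + ')';
    case 3: return choice(unaryFuncs) + '(' + genValue(level + 1) + ')';
    default: return choice(binaryFuncs) + '(' + genValue(level + 1) +
                    ', ' + genValue(level + 1) + ')';
  }
}

function genVec3(level) {
  // More than VEC_CUTOFF levels deep, a full vector expression would likely
  // tank the rest of the code, so splat a single float across all lanes.
  if (level > VEC_CUTOFF) return 'vec3(' + genValue(level) + ')';
  return 'vec3(' + genValue(level + 1) + ', ' +
                   genValue(level + 1) + ', ' +
                   genValue(level + 1) + ')';
}
```

The tuning the article mentions is exactly the knob-twiddling these constants invite: raise MAX_LEVEL for more intricate shaders, lower it (or the vector cutoff) when compiles get slow.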

I tried to get a screenshot of this step that really captures what this looks and feels like, but without the animation it really just can’t be understood. I strongly recommend you take a look at the live demo at this point, if you haven’t already.

Next Step

The next few changes are going to be big ones. Rather than this just being a standalone bit of JS, it’s going to be attached to a server backend that will keep track of the genome. Everyone going to the site will be working on the same gene pool, so we end up acting as a collective fitness function.

In terms of the interface, it’s going to be considerably nicer. Rather than the big 4×3 grid, you’ll be presented with two shaders at a time and asked to choose the one you prefer; think Hot or Not for shaders.

To Be Continued

Well, that’s about it for where I’m at now. Things are still just random, but I’m loving what I’m getting out of it now — I can only imagine the insane things I’m going to get once everyone is unleashed on it. I plan on putting out these mini-articles as I work on this, so whether it works or not, you’ll get to see what I do and hopefully learn something along the way.

About the author, Cody Brocious

Outside of his day job as a hacker for Mozilla, Cody (known as Daeken to many of you) is moving closer and closer to making non-mediocre demos and games. One day, when he grows up, he plans to run a game studio with a technical focus on procedurally generated content.

1 Comment

  1. cdvolko
     May 5, 2013

    I did something similar. Check out the gallery of images created by my program: http://www.hugi.scene.org/adok/miscellaneous/gpgl.htm

    The program itself (C++, OpenGL, Windows) is available at http://www.hugi.scene.org/adok/index_works.htm.
