Well, I dug my Shaders for Game Artists book out again with the idea of reading up on how corona effects are done around light sources - which turned out to be covered in a whole chapter of the book dealing with lots of interesting effects: blurs, colour manipulation and so on.
Right, I'm completely out of my depth here, but it looks like these effects are mostly done by rendering the normal camera view to a separate 'offscreen' texture, which is then manipulated by shader code to get the desired effect, before being re-combined with the normal camera view and finally sent to the screen for the player to see. Is that right?
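For anyone curious, here's my rough understanding of what the per-pixel half of that looks like in GLSL - just a sketch of the general blur-and-recombine idea, not anything from Oolite or the book, and all the names (`u_sceneTex`, `u_texelSize`, `v_texCoord`) are ones I've made up:

```glsl
// Fragment shader sketch of a post-process 'glow' pass.
// Assumes the engine has already rendered the camera view to an
// offscreen texture and is now drawing a full-screen quad with it.
uniform sampler2D u_sceneTex;   // the normal camera view, rendered offscreen
uniform vec2 u_texelSize;       // 1.0 / texture resolution
varying vec2 v_texCoord;

void main(void)
{
    // Simple 3x3 box blur of the offscreen scene texture
    vec4 blurred = vec4(0.0);
    for (int x = -1; x <= 1; x++)
        for (int y = -1; y <= 1; y++)
            blurred += texture2D(u_sceneTex,
                                 v_texCoord + vec2(float(x), float(y)) * u_texelSize);
    blurred /= 9.0;

    // Keep only the bright parts of the blur, then add them back on
    // top of the original scene - bright light sources 'bleed' a halo
    vec4 scene  = texture2D(u_sceneTex, v_texCoord);
    vec4 bright = max(blurred - vec4(0.6), vec4(0.0));  // crude bright-pass threshold
    gl_FragColor = scene + bright;
}
```

The important bit (as far as I can tell) is that the shader itself only does the 'manipulate the texture' step - the rendering-to-an-offscreen-texture and the full-screen recombine pass have to be set up by the engine before this shader ever runs.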
Am I right in thinking this 'rendering to offscreen textures' part of the process can only be done by programming it into the Oolite graphics engine, and isn't something you can do in GLSL alone? I'm just asking because I'm kind of interested in these effects, but I don't want to spend much time worrying about them if the technique needed to achieve them isn't supported in Oolite - uh, that's not a feature request, by the way.
Edit: just started flipping through pages at random and, oo, this sounds interesting: 'polynomial texture mapping' - self-shadowing bumpmaps!