Sunday, 26 June 2016

Normal mapping (part 31)

Okay, we're taking a quick detour. I've been wanting to add this to the engine for a little while now but just hadn't gotten around to it yet. As I was adding a model to the scene that came with normal maps I figured, why not...

Now I'll be the first to admit I'm not the greatest expert here. The last time I messed around with normal mapping was in the mid-nineties, and that was completely faking it on the CPU because an 80386 didn't have the oomph to do this.

The idea behind normal mapping is simple. The normals of a surface are the key factor in determining its lighting: they allow us to determine the angle at which light hits the surface and thus how to shade it. Now if there is a groove in our surface, we need to introduce additional complexity to our model to light it properly. Take a brick wall for instance: we would really need to model each and every brick, with all its jagged edges, to create believable lighting of the wall.
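To make this concrete, the most basic diffuse (Lambert) term in a fragment shader looks something like the sketch below, where the shading is driven entirely by the normal. The names lightDir, lightColor and matColor are just placeholders for this illustration, not names from our engine:
  vec3 N = normalize(Nv);                     // our interpolated surface normal
  float NdotL = max(0.0, dot(N, lightDir));   // 1.0 facing the light, 0.0 at a right angle
  vec3 color = NdotL * lightColor * matColor; // surfaces turning away from the light go dark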

Normal mapping allows us to approximate this while still using a flat surface, by adding a texture map that contains all the normals for the surface. So take our brick wall: at the edges of our bricks the normals won't be pointing outwards but sideways, following the contours of the bricks, and the end result is that the bricks are shaded properly. Now this is very limited, as we won't be casting any shadows or changing the geometry in any meaningful way, so if you come very close you'll see that we're being tricked, but it is very effective for convincingly representing a rough surface.

The biggest issue with normal mapping is that the texture assumes the surface is pointing in one direction (facing us), so we need to rotate all the normals in the map to match the orientation of our surface.
Funnily enough, single pass (forward) shaders optimise this by doing the opposite and rotating the lights to match our normal map; this avoids a relatively expensive matrix calculation in the fragment shader.
In a deferred shader we don't have that luxury and will apply our matrix in our fragment shader.
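For comparison, in a forward shader you would typically do that rotation once per vertex using the tangent and bitangent we'll get to in a moment; a sketch (lightDir is assumed to be the light direction in view space, tsLightDir an output heading to the fragment shader):
  // rotate the light into tangent space so the fragment shader can use the
  // normal from the normal map directly, without any per-fragment matrix math
  mat3 viewToTangent = transpose(mat3(Tangent, Binormal, Nv));
  tsLightDir = viewToTangent * lightDir;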

To do this we not only need the normals for our vertices, we also need what are called the tangent and bitangent of each normal. These are two vectors perpendicular to the normal that further define the orientation of our surface. Most implementations I've seen use the adjacent vertices of each vertex to calculate its tangent and bitangent and add those to the vertex data. I was planning on doing the same until I read this post: Deferred rendering and normal mapping.
I have no idea why this works, but it does, so I've used it.
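For reference, the conventional approach derives the tangent of each triangle from its positions and texture coordinates and averages the results per vertex. A minimal sketch of that calculation, written in GLSL syntax for readability though you would normally run it on the CPU while loading the mesh:
// solve edge1 = dUV1.x * T + dUV1.y * B and edge2 = dUV2.x * T + dUV2.y * B for the tangent T
vec3 calcTangent(vec3 p0, vec3 p1, vec3 p2, vec2 uv0, vec2 uv1, vec2 uv2) {
  vec3 edge1 = p1 - p0;
  vec3 edge2 = p2 - p0;
  vec2 dUV1 = uv1 - uv0;
  vec2 dUV2 = uv2 - uv0;
  float f = 1.0 / (dUV1.x * dUV2.y - dUV2.x * dUV1.y);
  return normalize(f * (dUV2.y * edge1 - dUV1.y * edge2));
}
The bitangent follows the same pattern, or can simply be reconstructed as cross(N, T).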

Preparing your mesh

Now the first thing is that we actually need a normal map for our object. The house I've added to our scene had a normal map included, but I've also created one for our tree using this brilliant site: NormalMap Online
This little online tool is amazing. It lets you create all sorts of maps from source images. I've used our tree texture directly and we get this result:

A word of warning here: the assumption behind this tool is that darker colors are 'deeper' than lighter colors, and that is what drives the generation of the normal map. It usually gives a convincing result, but sometimes it pays to create a separate 'depth' image to get a much better one. Indeed, I believe the normal map for the house was created by similar means and gives some incorrect results.
For our purposes today, it will do just fine.

We had already added a "bumpmap" texture to our shader structure for our heightmap implementation, so we'll reuse that here. That means I just had to change the material file loader to support wavefront's map_bump directive and add that to our tree material file.
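In the material file that directive looks something like this (the filenames here are just examples):
newmtl tree
map_Kd tree.png
map_bump tree-normalmap.png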

Changes to our shader

You'll see I've added two shaders to our list of shaders, BUMP_SHADER and BUMPTEXT_SHADER, and both use our standard vertex shader and standard fragment shader. We simply add "normalmap" as a define to pull in the code we need. The first shader applies this to a single color material and the second to a textured material. We could easily combine our environment map with this as well, though I have not changed that code to use the normal from our normal map instead of the vertex normal.
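What that boils down to is prepending the definition to the shader source before compiling it, so the #ifdef normalmap sections shown below end up included; roughly:
#version 330          // whatever version directive the shader already uses comes first
#define normalmap     // injected by the engine for the BUMP_SHADER variants
// ... the standard vertex or fragment shader source follows ...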

In our vertex shader you can see that we simply take the first two columns of our normalView matrix as our tangent and bitangent:
...
#ifdef normalmap
out vec3          Tangent;        // tangent
out vec3          Binormal;       // binormal
#endif

void main(void) {
  ...

  // N after our normalView matrix is applied
  Nv = normalize(normalView * N);
#ifdef normalmap
  Tangent = normalize(normalView[0]);   // first column of normalView
  Binormal = normalize(normalView[1]);  // second column of normalView
#endif 

  ...
}
As I said before, I have no idea why this works; I'm sure if I dive into the math I'll figure it out or find out it is not a complete solution, but it seems to have the right effect. (My best guess: the columns of normalView are the object's local X and Y axes transformed into view space, so this only gives a true tangent and bitangent when the texture's UV directions happen to follow those axes.) All in all I may get back to this, precalculate all my tangents and bitangents, and make them part of our vertex data.
Do note that the normal, tangent and bitangent get interpolated between our vertices.

In our fragment shader we simply replace the code that outputs our normal as-is with code that takes the normal from our normal map and applies the matrix we've just prepared:
...
#ifdef normalmap
uniform sampler2D bumpMap;                          // our normal map
in vec3           Tangent;                          // tangent
in vec3           Binormal;                         // binormal
#endif
...
void main() {
  ...
#ifdef normalmap
  // TangentToView matrix idea taken from http://gamedev.stackexchange.com/questions/34475/deferred-rendering-and-normal-mapping
  mat3 tangentToView = mat3(Tangent.x, Binormal.x, Nv.x,
                            Tangent.y, Binormal.y, Nv.y,
                            Tangent.z, Binormal.z, Nv.z);
  vec3 adjNormal = normalize((texture(bumpMap, T).rgb * 2.0) - 1.0);
  adjNormal = adjNormal * tangentToView;
  NormalOut = vec4((adjNormal / 2.0) + 0.5, 1.0); // our normal adjusted by view
#else
  NormalOut = vec4((Nv / 2.0) + 0.5, 1.0); // our normal adjusted by view
#endif
  ...
}
So we create a tangentToView matrix using our tangent, bitangent and normal, then fetch the normal from our normal map, apply our matrix, and write it out to our normal geobuffer. Note that because GLSL's mat3 constructor is column-major, this matrix ends up with our tangent, bitangent and normal as its rows; multiplying with the vector on the left therefore gives us adjNormal.x * Tangent + adjNormal.y * Binormal + adjNormal.z * Nv, which is exactly the tangent-to-view transform we're after.

Note, by the way, that the code for writing to our normal geobuffer has changed slightly to bring our -1.0 to 1.0 range into a 0.0 to 1.0 range; you'll see that I've updated our lighting shaders to reverse this. Without this change we would lose the negative values in our normals. I'm fairly surprised this didn't cause bigger problems during lighting before now.
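The matching change on the lighting side is a one-liner; a sketch, assuming we sample our normal geobuffer as normalBuffer:
  // undo the 0.0 to 1.0 encoding to recover our -1.0 to 1.0 normal
  vec3 N = normalize(texture(normalBuffer, T).rgb * 2.0 - 1.0);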

Anyways, here's the difference in shading between our tree trunk without normal mapping and with normal mapping:
Tree without normal mapping
Tree with normal mapping

And here are a few renders of the house I added: without normal mapping, with normal mapping but without our texture maps, and the full end result:
House without normal mapping

House with normal mapping but no textures

House with normal mapping and textures

Well, that's enough for today. Source code will be on github shortly :)