## Author Archive

December 2, 2013

I found this TechRepublic page, which gives an algorithm for converting a standard RGB image to sepia tone as follows:

```
outputRed   = (inputRed * 0.393) + (inputGreen * 0.769) + (inputBlue * 0.189)
outputGreen = (inputRed * 0.349) + (inputGreen * 0.686) + (inputBlue * 0.168)
outputBlue  = (inputRed * 0.272) + (inputGreen * 0.534) + (inputBlue * 0.131)
```

You can obviously tweak these values if you like – there is no exact formula (sepia itself being a natural pigment, originally derived from the ink sac of the cuttlefish and now produced using a variety of chemicals).
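To see the formula in action outside a shader, here's a plain Python sketch of the conversion for a single pixel (channels assumed to be in the 0–255 range; note the weighted sums can overshoot 255 for bright pixels, so each channel needs clamping):

```python
def to_sepia(r, g, b):
    """Apply the sepia weighting matrix to one RGB pixel (0-255 channels)."""
    out_r = r * 0.393 + g * 0.769 + b * 0.189
    out_g = r * 0.349 + g * 0.686 + b * 0.168
    out_b = r * 0.272 + g * 0.534 + b * 0.131
    # The weighted sums can exceed 255 for bright pixels, so clamp each channel
    return tuple(min(255, int(round(c))) for c in (out_r, out_g, out_b))

print(to_sepia(255, 255, 255))  # pure white maps to a warm off-white
print(to_sepia(0, 0, 0))        # black stays black
```

Notice that the red and green row weights each sum to more than 1, which is what pushes midtones towards the warm end.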

You can apply this algorithm in the finalcolor modifier of a Unity surface shader as follows:

```
Shader "Custom/Sepia" {
    Properties {
        _MainTex ("Base (RGB)", 2D) = "white" {}
    }
    SubShader {
        Tags { "RenderType"="Opaque" }
        LOD 200

        CGPROGRAM
        #pragma surface surf Lambert finalcolor:Sepia

        sampler2D _MainTex;

        struct Input {
            float2 uv_MainTex;
        };

        void surf (Input IN, inout SurfaceOutput o) {
            half4 c = tex2D (_MainTex, IN.uv_MainTex);
            o.Albedo = c.rgb;
            o.Alpha = c.a;
        }

        void Sepia (Input IN, SurfaceOutput o, inout fixed4 color) {
            // Weight the final lit colour by the sepia conversion matrix
            fixed3 sepia;
            sepia.r = dot(color.rgb, half3(0.393, 0.769, 0.189));
            sepia.g = dot(color.rgb, half3(0.349, 0.686, 0.168));
            sepia.b = dot(color.rgb, half3(0.272, 0.534, 0.131));

            color.rgb = sepia;
        }

        ENDCG
    }
    FallBack "Diffuse"
}
```

Which gives:

November 26, 2013

## Lighting Models and BRDF Maps

A Bi-directional Reflectance Distribution Function (BRDF) is a mathematical function that describes how light is reflected when it hits a surface. This largely corresponds to a lighting model in Unity-speak (although note that BRDFs are concerned only with reflected light, whereas lighting models can also account for emitted light and other lighting effects).

The “bi-directional” bit refers to the fact that the function depends on two directions:

• the direction at which light hits the surface of the object (the direction of incidence, ωi)
• the direction at which the reflected light is seen by the viewer (the direction of reflection, ωr).

These are typically both defined relative to the normal vector of the surface, n, as shown in the following diagram (rather than simple angles, each direction is actually modelled in the BRDF using spherical coordinates (θ, φ) making the BRDF a four-dimensional function):

Given that our perception of a material is determined to a large extent by its reflectance properties, it’s understandable that several different BRDFs have been developed, with different effectiveness and efficiency at modelling different types of surfaces:

• Lambert: Models perfectly diffuse smooth surfaces, in which apparent surface brightness is affected only by angle of incident light. The observer’s angle of view has no effect.
• Phong and Blinn-Phong: Models specular reflections on smooth shiny surfaces by considering both the direction of incoming light and that of the viewer.
• Oren-Nayar: Models diffuse reflection from rough opaque surfaces (considers the surface to be made from many Lambertian micro-facets).
• Torrance-Sparrow: Models specular reflection from rough opaque surfaces (considers surface to be made from many mirrored micro-facets).
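To make the difference between the first two models concrete, here's a rough Python sketch of the Lambert and Blinn-Phong terms (plain functions rather than shader code; all direction vectors are assumed normalised, and the shininess exponent is an illustrative value):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def lambert(normal, light_dir):
    # Diffuse term: depends only on the angle between surface normal and light
    return max(0.0, dot(normal, light_dir))

def blinn_phong(normal, light_dir, view_dir, shininess=32):
    # Specular term: also depends on the viewer, via the "half vector"
    half = normalize(tuple(l + v for l, v in zip(light_dir, view_dir)))
    return max(0.0, dot(normal, half)) ** shininess

n = (0.0, 0.0, 1.0)             # surface normal
l = normalize((0.0, 1.0, 1.0))  # light arriving at 45 degrees

# Lambert is the same for any viewer...
print(lambert(n, l))
# ...but Blinn-Phong peaks when the viewer sits at the mirror direction
print(blinn_phong(n, l, normalize((0.0, -1.0, 1.0))))
print(blinn_phong(n, l, (0.0, 0.0, 1.0)))
```

The Lambert value is identical wherever you stand, while the Blinn-Phong highlight falls off sharply as the viewer moves away from the mirror direction – exactly the "bi-directional" dependence described above.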

In addition to the preceding links, there’s a good article explaining some of the maths behind these models at http://www.cs.princeton.edu/courses/archive/fall06/cos526/tmp/wynn.pdf

## BRDF Maps

In the case of game development, it’s often not necessary to strive for physically-accurate BRDF models of how a surface reacts to light. Instead, it’s sufficient to aim for something that “looks” right. And that’s where BRDF maps come in (sometimes also called “Fake BRDF”).

A BRDF map is a two-dimensional texture. It's used in a similar way to a one-dimensional "ramp" texture, which is commonly used to look up replacement values for individual lighting coefficients. However, the BRDF map represents a different parameter on each of its two axes – the incoming light direction and the viewing direction, as shown below:
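In spirit, the lookup is just a 2D array index: compute the two dot products, remap them to the 0–1 range, and read the corresponding texel. Here's a minimal Python sketch (nearest-neighbour sampling rather than the filtered sampling a GPU performs, and the tiny 2x2 "texture" is made up purely for illustration):

```python
def brdf_lookup(brdf_map, ndotl, ndotv):
    """Nearest-neighbour lookup into a 2D BRDF map indexed by (NdotL, NdotV)."""
    h = len(brdf_map)
    w = len(brdf_map[0])
    # Remap NdotL from [-1, 1] to [0, 1] ("half" / diffuse-wrap lighting)
    u = ndotl * 0.5 + 0.5
    # Front-facing normals mean NdotV should already be positive; clamp for safety
    v = max(0.0, ndotv)
    x = min(w - 1, int(u * w))
    y = min(h - 1, int(v * h))
    return brdf_map[y][x]

# A toy 2x2 "texture": dark when lit from behind, warm when facing the light
ramp = [
    [(20, 20, 40), (255, 200, 60)],
    [(40, 20, 20), (255, 60, 60)],
]

print(brdf_lookup(ramp, 1.0, 0.0))   # fully lit, grazing view
print(brdf_lookup(ramp, -1.0, 1.0))  # back-lit, head-on view
```

A real BRDF map would of course be a painted texture of, say, 256x256 texels, but the indexing logic is the same as this sketch.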

A shader can use a tex2D lookup based on these two parameters to retrieve the pixel colour value for any point on a surface as a very cheap way of modelling light reflection. Here’s an example Cg BRDF surface shader:

```
Shader "Custom/BRDF Ramp" {
    Properties {
        _MainTex ("Texture", 2D) = "white" {}
        _BRDF ("BRDF Ramp", 2D) = "gray" {}
    }
    SubShader {
        Tags { "RenderType" = "Opaque" }
        CGPROGRAM
        #pragma surface surf Ramp

        sampler2D _BRDF;

        half4 LightingRamp (SurfaceOutput s, half3 lightDir, half3 viewDir, half atten) {

            // Calculate dot product of light direction and surface normal
            // 1.0 = facing each other perfectly
            // 0.0 = right angles
            // -1.0 = parallel, facing same direction
            half NdotL = dot (s.Normal, lightDir);

            // NdotL lies in the range -1.0 to 1.0
            // To use it as a texture lookup we need to remap it to the range 0.0 to 1.0
            // We could simply clamp it, but instead we'll apply softer "half" lighting
            // (which Unity calls "Diffuse Wrap")
            NdotL = NdotL * 0.5 + 0.5;

            // Calculate dot product of view direction and surface normal
            // Note that, since we only render front-facing normals, this will
            // always be positive
            half NdotV = dot(s.Normal, viewDir);

            // Look up the corresponding colour from the BRDF texture map
            half3 brdf = tex2D (_BRDF, float2(NdotL, NdotV)).rgb;

            half4 c;

            // For illustrative purposes, let's set the pixel colour based entirely on the BRDF texture
            // In practice, you'd normally also include Albedo and light colour terms here
            c.rgb = brdf * (atten * 2);
            c.a = s.Alpha;
            return c;
        }

        struct Input {
            float2 uv_MainTex;
        };
        sampler2D _MainTex;
        void surf (Input IN, inout SurfaceOutput o) {
            o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG
    }
    Fallback "Diffuse"
}
```

And here’s the image it produces – notice how the shading varies from red to yellow based on view direction, and from light to dark based on direction to the light source.

As a slightly less trivial example, here's another BRDF texture map that again uses light direction relative to the surface on the x axis, but this time, instead of using view direction, uses curvature of the surface on the y axis (the gradient in the y axis is quite subtle, but you should be able to see a reddish hue towards the top centre of the image and a blueish tint at the top right):

This map can be used to generate convincing diffuse reflection of skin that varies across the surface of the model (such that, say, the falloff at the nose appears different from the forehead), as shown here:

November 14, 2013

## Procedural Terrain Splatmapping

If you try searching for information on how to assign textures to Unity terrain through code (i.e. derive a “splatmap” of textures based on the terrain heightmap profile itself), you’ll probably end up being sent to this post on the Unity Answers site. Despite being over 3 years old, it still seems to be pretty much the only helpful demonstration of accessing Unity’s somewhat poorly-documented terrain functions through script.

However, while it's a good starting point, the script there suffers from several shortcomings (I'm not blaming the original author – I imagine that at the time he wrote it, he probably didn't expect it to become the authoritative source on the matter!):

• It only works if your terrain’s splatmap has the same dimensions as its heightmap.
• It allows for only three textures to be blended.
• The y/x axes are inverted.
• The method of normalisation is incorrect. (`Vector3.Normalize()` sets the magnitude of the vector representing the texture weights to 1 – i.e. two equal textures will each have a weight of 0.707. What is instead required is that the component weights sum to 1 – i.e. each has a weight of 0.5.)
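To make that last point concrete, here's the difference between the two normalisation schemes in Python, using two equal weights:

```python
import math

weights = [1.0, 1.0]

# Vector normalisation (what Vector3.Normalize does): magnitude becomes 1,
# so two equal weights each come out as ~0.707 and sum to ~1.414
magnitude = math.sqrt(sum(w * w for w in weights))
by_magnitude = [w / magnitude for w in weights]
print(by_magnitude)

# Sum normalisation (what a splatmap needs): the weights sum to 1,
# so two equal weights each come out as exactly 0.5
total = sum(weights)
by_sum = [w / total for w in weights]
print(by_sum)
```

Since each splatmap texel is a blend of texture weights, anything other than sum-to-1 weights will brighten or darken the terrain.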

Attached below is my version of an automated splatmap creation script which attempts to correct some of the issues in that earlier code. It allows for any number of terrain textures to be blended based on the height, normal, steepness, or any other rules you create for each part of the terrain. Just attach the AssignSplatMap C# script below onto any terrain gameobject in the scene (having first assigned the appropriate textures to the terrain) and hit play.

```
using UnityEngine;
using System.Collections;
using System.Linq; // used for Sum of array

public class AssignSplatMap : MonoBehaviour {

    void Start () {
        // Get the attached terrain component
        Terrain terrain = GetComponent<Terrain>();

        // Get a reference to the terrain data
        TerrainData terrainData = terrain.terrainData;

        // Splatmap data is stored internally as a 3d array of floats, so declare a new empty array ready for your custom splatmap data:
        float[, ,] splatmapData = new float[terrainData.alphamapWidth, terrainData.alphamapHeight, terrainData.alphamapLayers];

        for (int y = 0; y < terrainData.alphamapHeight; y++)
        {
            for (int x = 0; x < terrainData.alphamapWidth; x++)
            {
                // Normalise x/y coordinates to range 0-1
                float y_01 = (float)y / (float)terrainData.alphamapHeight;
                float x_01 = (float)x / (float)terrainData.alphamapWidth;

                // Sample the height at this location (note GetHeight expects int coordinates corresponding to locations in the heightmap array)
                float height = terrainData.GetHeight(Mathf.RoundToInt(y_01 * terrainData.heightmapHeight), Mathf.RoundToInt(x_01 * terrainData.heightmapWidth));

                // Calculate the normal of the terrain (note this is in normalised coordinates relative to the overall terrain dimensions)
                Vector3 normal = terrainData.GetInterpolatedNormal(y_01, x_01);

                // Calculate the steepness of the terrain
                float steepness = terrainData.GetSteepness(y_01, x_01);

                // Set up an array to record the mix of texture weights at this point
                float[] splatWeights = new float[terrainData.alphamapLayers];

                // CHANGE THE RULES BELOW TO SET THE WEIGHTS OF EACH TEXTURE ON WHATEVER RULES YOU WANT

                // Texture[0] has constant influence
                splatWeights[0] = 0.5f;

                // Texture[1] is stronger at lower altitudes
                splatWeights[1] = Mathf.Clamp01((terrainData.heightmapHeight - height));

                // Texture[2] stronger on flatter terrain
                // Note "steepness" is unbounded, so we "normalise" it by dividing by the extent of heightmap height and scale factor
                // Subtract result from 1.0 to give greater weighting to flat surfaces
                splatWeights[2] = 1.0f - Mathf.Clamp01(steepness * steepness / (terrainData.heightmapHeight / 5.0f));

                // Texture[3] increases with height but only on surfaces facing positive Z axis
                splatWeights[3] = height * Mathf.Clamp01(normal.z);

                // Sum of all texture weights must add to 1, so calculate normalisation factor from sum of weights
                float z = splatWeights.Sum();

                // Loop through each terrain texture
                for (int i = 0; i < terrainData.alphamapLayers; i++) {

                    // Normalise so that sum of all texture weights = 1
                    splatWeights[i] /= z;

                    // Assign this point to the splatmap array
                    splatmapData[x, y, i] = splatWeights[i];
                }
            }
        }

        // Finally assign the new splatmap to the terrainData:
        terrainData.SetAlphamaps(0, 0, splatmapData);
    }
}
```

Here are some examples of rules you might want to implement for various textures:

• Texture weight based on surface normal (useful for, e.g., snow accumulating on one side of a mountain, or moss growing on the north side of a hill)
• Texture weight based on height (e.g. ice caps at high altitudes, sand near sea level)
• Texture weight based on steepness (e.g. grass grows on relatively flat terrain)

Using this script with Unity’s standard terrain textures turns the following default grey terrain:

Into this slightly more attractive scene:

Sure, there's a lot more you could do to improve the mapping, and it doesn't compare to the output of professional world-modelling tools such as World Machine. But, then again, it's free, and it gives you a good starting point from which to refine your terrain further.

November 12, 2013

## Importing DEM Terrain Heightmaps for Unity using GDAL

I know that some folks reading my blog are from the spatial/mapping community, and may have been disappointed that my posts of late have been more influenced by game development and Unity than by spatial data and Bing Maps. Well, good news, spatial fans – this post is about mapping terrain from DEM data! (to create terrain for a Unity game… )

I’ve written a few posts in the past (such as here and here) that have made use of Digital Elevation Model data, such as collected by the SRTM or GTOPO30 datasets, via GDAL into Bing Maps, WPF etc. In this post I’ll be running through a similar workflow to that which I’ve used before, but this time with the target output being a Unity terrain object.

So, here goes:

1.) Get a GeoTIFF file of DEM data. For whole-world SRTM coverage, use the Google Earth browser at http://www.ambiotek.com/topoview, or download directly from e.g. http://srtm.geog.kcl.ac.uk/portal/srtm41/srtm_data_geotiff/srtm_36_02.zip (although I note that King’s have disabled directory browsing on their server, so you’ll have to know the name of the SRTM file you want to access)

If you want to preview the data in the DEM file, you can open it up in MicroDEM (File –> Open –> Open DEM –> Select GeoTiff file). It should appear something like this:

Note that although you can load up GeoTIFF files in image applications such as Photoshop or GIMP, they won’t visualise the geographic data encoded in the file, and you’ll probably just see this:

2.) (Optional) Use gdalwarp to transform the data to an alternative projection and/or crop it to a particular area of interest. Like all forms of spatial data, DEM data can be provided in various different projections. The SRTM data I’m using here is provided in WGS84 decimal degrees, but as it’s data for Great Britain, I’d like to reproject it to the National Grid of Great Britain (EPSG:27700) instead:

```
gdalwarp -multi -t_srs EPSG:27700 srtm_36_02.tif srtm_36_02_warped.tif
```

I’ll then crop a 100km × 100km area from the warped image so that it covers the Ordnance Survey grid square “SH”, which includes North Wales and Snowdonia National Park, as shown in the red square here:

Here’s the gdalwarp command to crop the image:

```
gdalwarp -te 200000 300000 300000 400000 srtm_36_02_warped.tif srtm_36_02_warped_cropped.tif
```

If you were to view the warped, cropped image in MicroDEM again now, it would look like:

3.) Use gdal_translate to convert the GeoTIFF file to the raw heightmap format expected by Unity

```
gdal_translate -ot UInt16 -scale -of ENVI -outsize 1025 1025 srtm_36_02_warped_cropped.tif heightmap.raw
```

The settings used here are as follows:

• `-ot UInt16` Use a 16-bit channel. Most graphics applications use 32-bit colour images consisting of four channels (R, G, B, A), each of 8 bits. That means that, in any given channel, you can only represent an integer value between 0 and 255. We’ll be encoding the heightmap data in a single channel, so having only 256 unique values would not give us very precise resolution. Instead, we’ll specify a 16-bit channel, which gives 65,536 unique values.
• `-scale` To make the most of our 16-bit channel, we need to scale the height values from their original range to fill the full range of values available in the 16-bit channel (0 – 65535). Specifying `-scale` with no parameters does this automatically.
• `-of ENVI` This outputs the result in the raw binary file format Unity expects.
• `-outsize 1025 1025` Unity terrain heightmaps must have dimensions equal to a power of 2, plus 1: i.e. 65x65, 129x129, 257x257, 513x513, 1025x1025, 2049x2049, etc.
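The effect of `-scale` can be sketched in Python: linearly remap the source elevation range onto the full 16-bit range (the minimum and maximum elevations below are made-up values for illustration):

```python
def scale_to_uint16(value, src_min, src_max):
    """Linearly remap a height from [src_min, src_max] onto [0, 65535]."""
    t = (value - src_min) / (src_max - src_min)
    return round(t * 65535)

# Hypothetical DEM whose elevations span -5 m to 1085 m
print(scale_to_uint16(-5, -5, 1085))    # lowest point maps to 0
print(scale_to_uint16(1085, -5, 1085))  # highest point maps to 65535
print(scale_to_uint16(540, -5, 1085))   # a mid elevation maps near the middle
```

Note that this stretch means the raw file records relative heights only; you set the real-world vertical extent separately on the Unity terrain object.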

4.) Import into Unity

Create a new terrain object and, from the settings tab of the Inspector, click the button to Import Raw heightmap. If you browse to select the heightmap.raw file created in the previous step, it should populate the correct settings automatically (note that, even when running under Windows, gdal appears to use Mac Byte Order). Note that Width and Height need to match the Heightmap Resolution field in the terrain inspector, not the Width and Height of the terrain itself:

If all goes well, your blank terrain should magically update to reflect the heightmap and you’ll see the following:

5.) Correct the orientation

The observant amongst you will notice one slight problem with the previous image – the heightmap has been rotated anti-clockwise by 90 degrees. It turns out that Unity treats heightmap coordinates with (0,0) at the bottom-left corner, whereas most other applications, including gdal, interpret (0,0) as the top-left corner. Fortunately the fix is quite simple. Just attach the following script to the terrain object and click play (changes will be saved to the terrain, so the script can be disabled or deleted once it has run):

```
using UnityEngine;
using System.Collections;

public class RotateTerrain : MonoBehaviour {

    void Start () {

        Terrain terrain = GetComponent<Terrain>();

        // Get a reference to the terrain data
        TerrainData terrainData = terrain.terrainData;

        // Populate an array with the current height data
        float[,] orgHeightData = terrainData.GetHeights(0, 0, terrainData.heightmapWidth, terrainData.heightmapHeight);

        // Initialise a new array of the same size to hold the flipped data
        float[,] mirroredHeightData = new float[terrainData.heightmapWidth, terrainData.heightmapHeight];

        for (int y = 0; y < terrainData.heightmapHeight; y++)
        {
            for (int x = 0; x < terrainData.heightmapWidth; x++)
            {
                // Mirror each row vertically, so gdal's top row becomes Unity's bottom row
                mirroredHeightData[y, x] = orgHeightData[terrainData.heightmapHeight - y - 1, x];
            }
        }

        // Finally assign the new heightmap to the terrainData:
        terrainData.SetHeights(0, 0, mirroredHeightData);
    }
}
```
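The origin fix above can be illustrated outside Unity too: in Python, the vertical flip is just reversing the row order of the height grid (the tiny 2x3 grid below is made up for illustration):

```python
def flip_rows(grid):
    """Mirror a 2D heightmap vertically, converting between a top-left
    origin (gdal's convention) and a bottom-left origin (Unity's)."""
    height = len(grid)
    return [grid[height - y - 1] for y in range(height)]

heights = [
    [0.1, 0.2, 0.3],  # top row in gdal's convention...
    [0.7, 0.8, 0.9],
]
print(flip_rows(heights))  # ...becomes the bottom row in Unity's
```

Flipping twice gets you back to the original grid, which is why re-running the attached script on an already-corrected terrain would undo the fix.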

And there you have it – DEM data of Wales imported and ready to play in Unity. Now, if you want, you can texture it using procedural splat mapping (or just paint textures on manually).