Shader Q&A

Last week, Mike asked me a bunch of really well-worded and well-thought-out questions on my “Writing Your First Shader” post https://josephstankowicz.wordpress.com/2013/06/22/writing-your-first-shader/#comment-72. Answering them seemed like a great topic for this week’s post.

1. As I was going through the steps, I wasn’t initially clear how to assign my shader to the material. I think this is because I looked in the Shader drop-down with the material selected and couldn’t find the one I had just created (the trick is that it was in the Custom menu, but I think I solved it by dragging and dropping). This is actually a really minor issue in retrospect but someone unfamiliar with the process might not know what to do here.

Mike covered the solutions in his question, but it’s probably worth demonstrating with an animated gif. Unity’s shader selection menu for a material sorts shaders into submenus based on the shader name. When naming your shader, which is generally done on the first line of the shader file, you can use “/” to create subfolders in the path. Here is an example of a shader I named “Path/To/Shader/ExampleShader”: http://i.imgur.com/eh37jiA.png. You can also assign a shader by dragging it into the material’s shader field, as seen in this animated gif: http://i.imgur.com/mh1BEeD.gif.
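For reference, here is what that naming line looks like at the top of the shader file (a minimal sketch; the path segments are placeholders I made up):

Shader "Path/To/Shader/ExampleShader" {
    // Properties, SubShader blocks, and so on go here as usual.
}

Everything before the last “/” becomes a submenu in the material’s shader drop-down, and the final segment is the entry you actually click.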

2. Applying the created material to the newly created sphere might need a little bit of explanation, since beginners might not know that you need to assign it to the materials property of the Mesh Renderer component, since a sphere has a Sphere Collider component which also has a Material Property (for physics material). Again in retrospect this seems like a really basic issue but I’m thinking of someone not completely familiar with Unity.

There are two things to cover here: how to actually assign a material to an object, and the difference between physics material and rendering materials.

You can apply a material to an object that has a Mesh Renderer component in a few different ways. The material for the mesh lives in the mesh renderer’s materials list http://i.imgur.com/rjYLC2l.png, which means you can have more than one material per mesh. Unity handles multiple materials per mesh through multiple passes: it renders the parts of the mesh assigned to the first material in one pass, then renders the mesh again for the second material.

One way to assign a material to a mesh is to press the selection circle to the right of the material name and use Unity’s selection window to pick out a material. You can also drag the material onto a game object that has a mesh renderer in your hierarchy view, as seen here http://i.imgur.com/jxAWDsI.gif, or drag it onto the mesh in your 3D scene view, as seen here http://i.imgur.com/pxM8HD5.gif. Like most component fields in Unity, you can also drag the material over the field it is replacing, as seen here http://i.imgur.com/IFgqzZa.gif. The last common way is to drag it into the empty space near the bottom of the inspector http://i.imgur.com/uz49fWX.gif. You are probably wondering why the inspector shows the material as if it were a component when it is also a field in the mesh renderer component; that is just Unity providing a custom inspector to make it easy to edit and customize materials when you have an object selected.

An unfortunate part of software engineering is name collisions. So far we have been using the term “material” to describe just the visual properties of an object, but in the real world “material” describes many other properties besides how an object looks. As a result, the physics system in Unity has its own “physics material”: a different file with a different set of properties, unrelated to how the object looks and concerned instead with how the object behaves within the physics system. A physics material contains an object’s friction and bounciness information.

3. I was never really clear what a SubShader is or how this differs from a pass. Can I have multiple SubShaders, and what’s the difference between having multiple subshaders versus multiple passes? I guess it just struck me as something unexpected, because I would just think “Hey, we have a shader… why are we defining something called a ‘sub’ shader?”

Generally, multiple passes are used for multipass rendering: rendering an object several times, with different rules or behavior each time. There are many uses for this; the example I will share is often used in RTS games to keep units behind buildings visible. Here is an animated gif showing the two-pass behavior http://i.imgur.com/0HW1oef.gif. The first pass does the opposite of a standard Z test. Normally a pixel is rejected for rendering if anything is in front of it, but here the first pass uses a Z test of “GEqual”, or greater than or equal to, so it only renders pixels that have something between themselves and the camera. This pass renders those pixels in solid red. The second pass is the standard unlit texture from the “Writing Your First Shader” example. Here is the source for this shader:

Shader "RTSShader" {
    Properties {
        _MainTex ("Texture", 2D) = "white" { }
    }
    SubShader {
        // First pass: draw solid red wherever the object is hidden behind something.
        Pass {
            ZWrite Off
            ZTest GEqual
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct appdata {
                float4 vertex : POSITION;
            };

            struct v2f {
                float4 pos : SV_POSITION;
            };

            v2f vert(appdata v) {
                v2f o;
                o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                return o;
            }

            fixed4 frag(v2f i) : COLOR {
                return fixed4(1, 0, 0, 1);
            }
            ENDCG
        }

        // Second pass: the standard unlit texture, drawn where the object is visible.
        Pass {
            ZTest LEqual
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float4 _MainTex_ST;

            struct appdata {
                float4 vertex : POSITION;
                float2 texcoord : TEXCOORD0;
            };

            struct v2f {
                float4 pos : SV_POSITION;
                float2 uv : TEXCOORD0;
            };

            v2f vert(appdata v) {
                v2f o;
                o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                o.uv = TRANSFORM_TEX(v.texcoord, _MainTex);
                return o;
            }

            fixed4 frag(v2f i) : COLOR {
                return tex2D(_MainTex, i.uv);
            }
            ENDCG
        }
    }
}

Subshaders are generally used to account for different video cards. Very few game developers are in a position where they are making a game for a single target piece of hardware. Even if you are targeting a single platform, such as iOS, Apple releases new iPhones, iPads, and iPod Touches every year, each with slightly different hardware, and if you are releasing a PC or Android game, the available graphics hardware is extremely variable. Subshaders account for this by providing different implementations of the shader for different video cards; Unity uses the first subshader in the file that the video card can run. The first use of this is probably obvious: you can write subshaders with fewer features and less functionality for weaker video cards. Maybe you write two subshaders for your “Normal Mapped Specular Lit” shader: the first subshader is the full-featured shader, and the second is the fallback, which might use a faster but uglier specular lighting algorithm and drop the normal mapping behavior. The second use of subshaders is to write different implementations of the same shader that do the exact same thing, just optimized for different video cards. All iOS devices use PowerVR GPUs, whereas many of the more powerful Android devices use NVIDIA’s Tegra line of GPUs. If you compare similarly powerful devices from both product lines, each has its own strengths and weaknesses, and you can use subshaders to make sure your shader runs as fast as it can on each GPU platform.
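Here is a rough sketch of what that tiered setup can look like (the shader name and the choice of fallback are just for illustration). The first subshader is the CG unlit texture from earlier; the second is a simpler fixed-function version for hardware that cannot run the first; and the Fallback line points at a built-in shader if neither works:

Shader "Examples/TieredUnlitTexture" {
    Properties {
        _MainTex ("Texture", 2D) = "white" { }
    }

    // Unity runs the first SubShader the video card supports.
    SubShader {
        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float4 _MainTex_ST;

            struct appdata {
                float4 vertex : POSITION;
                float2 texcoord : TEXCOORD0;
            };

            struct v2f {
                float4 pos : SV_POSITION;
                float2 uv : TEXCOORD0;
            };

            v2f vert(appdata v) {
                v2f o;
                o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                o.uv = TRANSFORM_TEX(v.texcoord, _MainTex);
                return o;
            }

            fixed4 frag(v2f i) : COLOR {
                return tex2D(_MainTex, i.uv);
            }
            ENDCG
        }
    }

    // Simpler fixed-function version for older hardware.
    SubShader {
        Pass {
            SetTexture [_MainTex] { combine texture }
        }
    }

    // If neither SubShader runs, fall back to a built-in shader.
    Fallback "Diffuse"
}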

4. I really often see shader code that has very simple, short, single-letter variable names like ‘v’. There’s no harm if I use long variable names, right? I doubt it, I just want to make sure there isn’t some convention at work here.

Yeah, there is no problem using longer variable names in shader code. Some engineers just prefer shorter names.

5. I may have missed it, but I didn’t see an explanation of the types used and their typical purposes. Seeing a “float4”, for example, was pretty self-explanatory, but it threw me off when the fragment shader returned a “fixed4”. I would have just expected it to return a “float4” but I didn’t see any explanation as to why this was different.

I have not covered this yet; it is a slightly more advanced topic I would probably get to eventually. As you write more advanced shaders, performance can become a big concern. float, half, and fixed are the three variable types for tracking non-integer values with a decimal point. float is the most precise of the three but the heaviest and slowest; fixed is the least precise but takes the least space; half sits in the middle. (On most mobile GPUs, float is a full 32-bit value, half is 16-bit, and fixed is a low-precision format of roughly 11 bits.) fixed4 / half4 / float4 are different ways to represent a set of four values together, and are useful for representing a color with red, green, blue, and alpha values. Unity has a page on shader performance that covers more http://docs.unity3d.com/Documentation/Components/SL-ShaderPerformance.html. Note that while you want to use the lowest precision variable type you can for a given piece of data, it is important to avoid mixing data types; converting between float, half, and fixed is expensive.
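As a quick illustration (the variable names are made up, and the right choice always depends on your target hardware), here is how you might pick types inside a CGPROGRAM block:

// Colors sit in a small 0..1 range, so low precision is fine.
fixed4 tintColor = fixed4(1.0, 0.5, 0.25, 1.0);

// Texture coordinates usually survive half precision on mobile GPUs.
half2 scrolledUV;

// World-space positions need the full range and accuracy of float.
float3 worldPosition;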

6. I was confused about the Properties block. So, is that just data that gets initialized for the shader before it runs? Is it initialized once for the entire render operation? The _MainTex syntax was confusing also and I wasn’t sure what all the components meant. It looks like an odd method call with parameters being passed, or just some properties table or something, and the parentheses at the end didn’t make sense to me. It reminds me of Lua script or something. I also see that _MainTex is being referenced elsewhere in the program, so clearly it’s important. Is “Texture” some reserved string in that initialization? What does the “2D” mean?

Shader properties are the data input to the shader from the material. The property line is not a function call but a variable declaration; it describes information you will set in the Unity interface, data you can control from outside the shader code.

Breaking down the _MainTex line: “_MainTex” is the internal variable name you will use in the shader. The value “Texture” in quotes is the name the property will have in the material view in Unity; it is not a reserved string. As you write more complex shaders, you might put more detailed descriptions here to help those using the shader set the data properly. For example, if you are using the alpha channel of a texture as a special data set, you might write “Texture (RGB) Intensity (A)”. The “2D” is the type of the property; a “2D” property is a two-dimensional texture. The braces at the end hold any extra options on texture properties.

To use a shader property in your shader code, you declare the variable again within your shader code, matching the “_MainTex” name. Generally, people start shader property names with “_” to make it clear that it is a shader property and not a local variable. Here is Unity’s page on shader properties, covering the different property types Unity supports http://docs.unity3d.com/Documentation/Components/SL-Properties.html and here is a Unity page on usage of properties http://docs.unity3d.com/Documentation/Components/SL-PropertiesInPrograms.html.
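To make the pieces concrete, here is a sketch of a Properties block with a few common property types (the names and default values are just examples):

Properties {
    // internal name ("display name in material view", type) = default value
    _MainTex ("Texture (RGB) Intensity (A)", 2D) = "white" { }
    _Color ("Tint Color", Color) = (1, 1, 1, 1)
    _Shininess ("Shininess", Range(0.01, 1)) = 0.5
}

Each of these would then be declared again inside the CGPROGRAM block (for example sampler2D _MainTex; fixed4 _Color; half _Shininess;) before the shader code can read them.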

7. Where is _MainTex_ST ever used? I see it declared but never used, which is really confusing.

This is used in the TRANSFORM_TEX macro, which is defined in UnityCG.cginc. On Windows that file lives in C:\Program Files (x86)\Unity411\Editor\Data\CGIncludes, and on Mac in Applications/Unity/Contents/CGIncludes. The TRANSFORM_TEX macro is defined as:

#define TRANSFORM_TEX(tex,name) (tex.xy * name##_ST.xy + name##_ST.zw)

If you were not using the macro, the code would look like:

o.uv = v.texcoord.xy * _MainTex_ST.xy + _MainTex_ST.zw;

The xy values of _MainTex_ST are the scale, and the zw values are the offset. When viewing a material in Unity, these are the “Tiling” and “Offset” values, as seen here: http://i.imgur.com/otT8Nb9.png
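As a quick worked example with made-up numbers: with Tiling set to (2, 2) and Offset set to (0.5, 0), an incoming texture coordinate of (0.25, 0.25) becomes (0.25 * 2 + 0.5, 0.25 * 2 + 0) = (1.0, 0.5).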

That covers Mike’s questions. If anyone else has any questions, go ahead and ask them; you’re not going to be the only person with those questions.

