Unity, Consoles, Shaders...
Friday, May 8, 2015 at 6:10PM
n00body in Shaders, Unity3D

UPDATE 2

Also, it seems like a good idea to test in OpenGL mode on an AMD or Intel GPU. Their GLSL compilers seem especially strict, so you're likely to catch the most bugs on those platforms.


UPDATE

Apparently you also need to observe some of GLSL's strict assignment and swizzling rules. Certain things, like accidentally assigning a four-component vector to a three-component variable, will pass under Direct3D but blow up once you hit OpenGL.
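To make that concrete, here's a minimal sketch of the failure. This isn't code from any real shader; _MainTex, v2f, and the variable names are stand-ins for illustration:

    sampler2D _MainTex;  // hypothetical texture, for illustration
    struct v2f { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

    fixed4 frag(v2f i) : SV_Target
    {
        float4 texColor = tex2D(_MainTex, i.uv);

        // Direct3D's HLSL compiler lets this through with only an
        // implicit-truncation warning; GLSL treats it as a hard error:
        // float3 albedo = texColor;

        // Portable version: swizzle explicitly so every compiler agrees.
        float3 albedo = texColor.rgb;

        return fixed4(albedo, texColor.a);
    }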


ORIGINAL

To all the people authoring shaders for Unity with the goal of having them work on all of Unity's supported platforms, I say the following:

  1. The shaders are HLSL! Unity still calls them Cg for legacy reasons, but they've all but abandoned that language. So when looking up how to do something in these shaders, look for HLSL tips!
  2. Read the header "HLSLSupport.cginc" and burn its contents into your memory. Inside you will find all the crazy preprocessor juggling that Unity has to do to hide platform and language API differences (see the sketch just after this list).
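I won't quote the header verbatim, but the pattern inside looks roughly like this. SHADER_API_D3D11 and SHADER_API_GLES are real Unity platform defines; MY_COLOR_TYPE is a made-up macro standing in for the kind of per-platform aliasing the header actually does:

    // Illustrative sketch only, not an excerpt from HLSLSupport.cginc.
    #if defined(SHADER_API_GLES)
        // Mobile GL: low-precision types are cheap there, so alias down.
        #define MY_COLOR_TYPE fixed4
    #else
        // Desktop APIs like D3D11: half/fixed are effectively float anyway.
        #define MY_COLOR_TYPE half4
    #endif

    // Shader code uses the macro and gets the right type per platform.
    MY_COLOR_TYPE SampleAlbedo(sampler2D tex, float2 uv)
    {
        return tex2D(tex, uv);
    }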

Today I learned this lesson the hard way with Alloy's parallax occlusion mapping code. Basically, in Cg, if you want to sample a texture with explicit texture coordinate derivatives, you use an overloaded version of tex2D(). In HLSL, you have to use a dedicated intrinsic called tex2Dgrad().

Guess which one is platform-safe in Unity's shaders? To find the answer, follow tip #2. :p
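Here's roughly what the fix looks like. This is a trimmed sketch, not Alloy's actual code; _HeightTex and the helper are stand-ins:

    sampler2D _HeightTex;  // hypothetical height map, for illustration

    float SampleHeight(float2 uv)
    {
        // Parallax occlusion mapping samples inside a loop, where ddx/ddy
        // are unreliable (divergent flow control). So take the derivatives
        // once, up front...
        float2 dx = ddx(uv);
        float2 dy = ddy(uv);

        // ...then feed them in explicitly at each sample.
        // Cg-style overload: compiles under Direct3D, but it's the one
        // that bit me across platforms:
        //     return tex2D(_HeightTex, uv, dx, dy).r;

        // HLSL intrinsic: the spelling that stays portable in Unity.
        return tex2Dgrad(_HeightTex, uv, dx, dy).r;
    }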
