High-level shader language

High Level Shading Language (HLSL) is a programming language developed for DirectX that is used for programming shaders. Occasionally the term HLSL is also used for the group of high-level shading languages as a whole.


In computer graphics, shading refers to the modification of individual vertices or fragments within the graphics pipeline. This is preferably executed directly on the hardware, which for a long time necessitated the use of assembly language. Programming in assembler is, however, quite impractical, error-prone and dependent on the hardware manufacturer. High-level shading languages were created to resolve this: they provide high-level language constructs that make programming easier and thus allow the programmer to concentrate on his goal. A compiler translates the code of the high-level language into machine language for the GPU. The DirectX-specific high-level language HLSL is translated at runtime of the application by the DirectX library, with the help of the graphics driver, into the assembly language appropriate for the current graphics hardware. Different shaders for Nvidia or ATI/AMD graphics cards are therefore no longer necessary.
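Alternatively, shaders can be compiled ahead of time. A sketch using Microsoft's effect compiler fxc, which translates HLSL source for a given shader profile and entry point (the file names here are illustrative):

```
fxc /T vs_4_0 /E MyVertexShader /Fo MyShader.cso MyShader.fx
```

Here /T selects the target profile (vertex shader model 4.0), /E names the entry-point function, and /Fo specifies the output file for the compiled bytecode.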

Language elements

HLSL does not offer OOP concepts like other languages; it is strongly oriented towards C, but with built-in data types and operations optimized for shader programming.

Global shader parameters

Parameters that are passed to a shader are globally available in the complete HLSL code and are written outside of methods or structs, usually at the beginning of the code.

float4x4 World;         // Define a 4x4 floating-point matrix, here the world matrix
float4x4 WorldViewProj; // The world-view-projection matrix, calculated as World * View * Proj
float3 lightDir;        // A 3-element vector

float4 LightColor = { 0.5, 0.5, 0.5, 1 }; // Color of the light (vector with predefined value)
float4 Ambient    = { 0.5, 0.5, 0.5, 1 }; // Color of the ambient light
float4 LightDir   = { 0, 0, -1, 0 };      // Direction of the light (here: vertically from above)

Texture2D Tex0;             // A texture
SamplerState DefaultSampler // The "sampler" defines the parameters for texture mapping
{
    Filter = MIN_MAG_MIP_LINEAR; // Interpolation for texture stretching
    AddressU = Clamp;            // Clamp texture coordinates outside [0..1]
    AddressV = Clamp;
};

For the meaning of the above matrices, see the article graphics pipeline.

Input and output of the vertex shader

Of course, each parameter could be written individually in the parameter list of a shader method, but in practice structs are common, to save typing and to provide more clarity. In principle, any values and vectors can be passed in such a struct, but a position is almost always among them.

// Input to the vertex shader
struct MyShaderIn
{
    float4 Position  : POSITION;  // Tells the compiler what the variable "means". Here: this is a position
    float4 Normal    : NORMAL0;   // The vertex normal, used for the illumination
    float2 TexCoords : TEXCOORD0; // Texture coordinates
};

struct MyShaderOut
{
    float4 Position  : POSITION;
    float2 TexCoords : TEXCOORD0;
    float4 Normal    : TEXCOORD1;
};

The "in" struct describes the data as they are passed from the wireframe model to the shader, i.e. to the vertex shader. The vertex shader processes these data and returns an "out" struct. That struct is then passed on to the pixel shader, which at the end returns only a single float4 (or similar) containing the final pixel color.

Vertex and pixel shader methods

One method each must exist for the vertex shader and the pixel shader. It takes a data structure and processes it accordingly. The vertex shader is called once per vertex, the pixel shader once per pixel of the texture to be rendered.

MyShaderOut MyVertexShader(MyShaderIn In)
{
    MyShaderOut Output = (MyShaderOut)0;
    // The next line performs the projection multiplication. The position of the current point is
    // multiplied by the 4x4 matrix combined from the world, camera and projection matrices
    // to obtain the screen coordinates
    Output.Position = mul(In.Position, WorldViewProj);
    Output.TexCoords = In.TexCoords; // Texture coordinates are simply passed through in this simple example
    Output.Normal = normalize(mul(In.Normal, (float3x3)World)); // The normal is rotated
    return Output;
}

// A helper function
float DotProduct(float3 LightPos, float3 Pos3D, float3 Normal)
{
    float3 lightDir = normalize(Pos3D - LightPos);
    return dot(-lightDir, Normal);
}

// The pixel shader returns only a single color (possibly with alpha)
float4 MyPixelShader(MyShaderOut In) : COLOR0
{
    // Illuminance of the surface (the dot product of the negative light vector and the
    // normal vector of the surface is > 0 if the surface faces the light source)
    float Sunlight = dot(-(float3)LightDir, (float3)In.Normal);
    float4 SunlightColor = float4(Sunlight, Sunlight, Sunlight, 1); // Set the alpha channel
    SunlightColor *= LightColor;             // Factor in the color of the light
    SunlightColor = saturate(SunlightColor); // Clamp the color values to [0..1]
    // Fetch the texture color at the point to be drawn. We do not need to worry about the
    // interpolation of the texture coordinates; hardware and compiler take care of that.
    float4 BaseColor = Tex0.Sample(DefaultSampler, In.TexCoords);
    float4 BrightnessColor = BaseColor * (SunlightColor + Ambient); // Factor in the illumination
    BrightnessColor = (BrightnessColor + OffsetBrightness) * (1.0 + OffsetContrast); // Brightness and contrast
    return BrightnessColor;
}

Geometry shader

The implementation of a geometry shader is optional and allows each primitive to be replaced by 0 to n new primitives. The type of the output primitives and the maximum number of vertices produced must, however, be made known at compile time. The implementation is procedural and uses specially introduced data structures (PointStream<T>, LineStream<T> and TriangleStream<T>). There is also the possibility to access neighboring triangles or lines; this is achieved with the input modifiers triangleadj and lineadj. Typical applications for geometry shaders are the generation of point sprites and rendering into cube-map textures. Here is a simple geometry shader that shrinks each triangle to which it is applied towards its centroid:

[maxvertexcount(3)]
void GS(triangle MyShaderOut input[3], inout TriangleStream<MyShaderOut> OutputStream)
{
    MyShaderOut point;
    float4 center = (input[0].Position + input[1].Position + input[2].Position) / 3.0; // Centroid of the triangle

    point = input[0];
    point.Position = lerp(center, input[0].Position, 0.9);
    OutputStream.Append(point);

    point = input[1];
    point.Position = lerp(center, input[1].Position, 0.9);
    OutputStream.Append(point);

    point = input[2];
    point.Position = lerp(center, input[2].Position, 0.9);
    OutputStream.Append(point);

    OutputStream.RestartStrip();
}

Techniques

Finally, the methods defined above must be assigned in the form of techniques and passes so that the compiler implements them accordingly. The syntax of the shaders changed slightly with DirectX 10, therefore the target version is stated explicitly in the technique.
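For comparison, a technique for the older DirectX 9 syntax might look like the following sketch (assuming the same method names, compiled against shader model 2 profiles):

```
technique MyTechnique // For DirectX 9
{
    pass Pass0
    {
        VertexShader = compile vs_2_0 MyVertexShader();
        PixelShader = compile ps_2_0 MyPixelShader();
    }
}
```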

technique10 MyTechnique // For DirectX 10
{
    pass Pass0
    {
        VertexShader = compile vs_4_0 MyVertexShader();
        PixelShader = compile ps_4_0 MyPixelShader();
    }
}

Alternatives

  • RenderMan (shading language of Pixar Animation Studios)
  • Cg (C for graphics)
  • OpenGL Shading Language
  • OpenGL ES Shading Language