
Building Shaders With Babylon.js and WebGL: Theory and Examples

Sponsored Content

This sponsored post features a product relevant to our readers while meeting our editorial guidelines for being objective and educational.

In the keynote for Day 2 of //Build 2014 (see 2:24-2:28), Microsoft evangelists Steven Guggenheimer and John Shewchuk demoed how Oculus Rift support was added to Babylon.js. One of the key pieces of this demo was the work we did on a specific shader to simulate lenses, as you can see in this picture:

Lens simulation image

I also presented a session with Frank Olivier and Ben Constable about graphics on IE and Babylon.js.

This leads me to one of the questions people often ask me about Babylon.js: "What do you mean by shaders?" So in this post, I am going to explain to you how shaders work, and give some examples of common types of shaders.

The Theory

Before we start experimenting, we must first see how things work internally.

When dealing with hardware-accelerated 3D, there are two processors involved: the main CPU and the GPU. The GPU is a kind of extremely specialized CPU.

The GPU is a state machine that you set up using the CPU. For instance, the CPU will configure the GPU to render lines instead of triangles, or it will turn transparency on, and so on.

Once all the states are set, the CPU will define what to render—the geometry, which is composed of a list of points (called vertices, stored in an array called the vertex buffer) and a list of indices (the faces, or triangles, stored in an array called the index buffer).

The final step for the CPU is to define how to render the geometry, and for this specific task, the CPU will define shaders for the GPU. Shaders are a piece of code that the GPU will execute for each of the vertices and pixels it has to render.

First, some vocabulary: think of a vertex (vertices when there are several of them) as a “point” in a 3D environment (as opposed to a point in a 2D environment).

There are two kinds of shaders: vertex shaders, and pixel (or fragment) shaders.

Graphics Pipeline

Before digging into shaders, let’s take a step back. To render pixels, the GPU will take the geometry defined by the CPU and will do the following:

Using the index buffer, three vertices are gathered to define a triangle: the index buffer contains a list of vertex indices. This means that each entry in the index buffer is the index of a vertex in the vertex buffer. This is really useful for avoiding duplicated vertices. 

For instance, the following index buffer is a list of two faces: [1 2 3 1 3 4]. The first face contains vertex 1, vertex 2 and vertex 3. The second face contains vertex 1, vertex 3 and vertex 4. So there are four vertices in this geometry: 

Chart showing four vertices

The vertex shader is applied on each vertex of the triangle. The primary goal of the vertex shader is to produce a pixel for each vertex (the projection on the 2D screen of the 3D vertex): 

vertex shader is applied on each vertex of the triangle

Using these three pixels (which define a 2D triangle on the screen), the GPU will interpolate all values attached to the pixel (at least its position), and the pixel shader will be applied on every pixel included into the 2D triangle in order to generate a color for every pixel: 

pixel shader will be applied on every pixel included into the 2D triangle

This process is done for every face defined by the index buffer. 

Obviously, due to its parallel nature, the GPU is able to process this step for a lot of faces simultaneously, and thereby achieve really good performance.

GLSL

We have just seen that to render triangles, the GPU needs two shaders: the vertex shader and the pixel shader. These shaders are written using a language called GLSL (OpenGL Shading Language). It looks like C.

For Internet Explorer 11, we have developed a compiler to transform GLSL to HLSL (High Level Shader Language), which is the shader language of DirectX 11. This allows IE11 to ensure that the shader code is safe (you don’t want a web page using WebGL to reset your computer!):

Flow chart of transforming GLSL to HLSL

Here is a sample of a common vertex shader:
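The original listing is not reproduced here, but a representative vertex shader — a sketch consistent with the attributes, uniforms, and varyings described in the next section — looks like this:

```glsl
precision highp float;

// Attributes: per-vertex data supplied by the CPU
attribute vec3 position;
attribute vec2 uv;

// Uniforms: values set by the CPU, constant for the whole draw call
uniform mat4 worldViewProjection;

// Varying: values interpolated and passed on to the pixel shader
varying vec2 vUV;

void main(void) {
    // Project the 3D position onto the 2D screen
    gl_Position = worldViewProjection * vec4(position, 1.0);

    // Pass the texture coordinates through unchanged
    vUV = uv;
}
```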

Vertex Shader Structure

A vertex shader contains the following:

  • Attributes: An attribute defines a portion of a vertex. By default, a vertex should at least contain a position (a vector3: x, y, z). But as a developer, you can decide to add more information. For instance, in the shader above, there is a vector2 named uv (texture coordinates that allow us to apply a 2D texture on a 3D object).
  • Uniforms: A uniform is a variable used by the shader and defined by the CPU. The only uniform we have here is a matrix used to project the position of the vertex (x, y, z) to the screen (x, y).
  • Varying: Varying variables are values created by the vertex shader and transmitted to the pixel shader. Here, the vertex shader will transmit a vUV (a simple copy of uv) value to the pixel shader. This means that a pixel is defined here with a position and texture coordinates. These values will be interpolated by the GPU and used by the pixel shader. 
  • main: The function named main() is the code executed by the GPU for each vertex and must at least produce a value for gl_Position (the position on the screen of the current vertex). 

We can see in our sample that the vertex shader is pretty simple. It generates a system variable (starting with gl_) named gl_Position to define the position of the associated pixel, and it sets a varying variable called vUV.

The Voodoo Behind Matrices

In our shader, we have a matrix named worldViewProjection. We use this matrix to project the vertex position to the gl_Position variable. That is cool, but how do we get the value of this matrix? It is a uniform, so we have to define it on the CPU side (using JavaScript).

This is one of the complex parts of doing 3D. You must understand complex math (or you will have to use a 3D engine, like Babylon.js, which we are going to see later).

The worldViewProjection matrix is the combination of three different matrices: the world matrix (the object’s position, rotation, and scale), the view matrix (the camera’s point of view), and the projection matrix (the projection onto the 2D screen):

Diagram: combining the world, view, and projection matrices

Using the resulting matrix allows us to transform 3D vertices into 2D pixels while taking into account the point of view and everything related to the position/scale/rotation of the current object.

This is your responsibility as a 3D developer: to create and keep this matrix up to date.

Back to the Shaders

Once the vertex shader is executed on every vertex (three times, then) we have three pixels with a correct gl_position and a vUV value. The GPU will then interpolate these values on every pixel contained in the triangle produced by these pixels.

Then, for each pixel, it will execute the pixel shader:
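The original listing is not reproduced here; a representative pixel shader — a sketch matching the structure described in the next section — could be:

```glsl
precision highp float;

// Varying: interpolated texture coordinates from the vertex shader
varying vec2 vUV;

// Uniform: the sampler used to read colors from the texture
uniform sampler2D textureSampler;

void main(void) {
    // Fetch the texture color at the interpolated coordinates
    gl_FragColor = texture2D(textureSampler, vUV);
}
```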

Pixel (or Fragment) Shader Structure

The structure of a pixel shader is similar to a vertex shader:

  • Varying: Varying variables are values created by the vertex shader and transmitted to the pixel shader. Here the pixel shader will receive a vUV value from the vertex shader. 
  • Uniforms: A uniform is a variable used by the shader and defined by the CPU. The only uniform we have here is a sampler, which is a tool used to read texture colors.
  • main: The function named main is the code executed by the GPU for each pixel and must at least produce a value for gl_FragColor (the color of the current pixel). 

This pixel shader is fairly simple: It reads the color from the texture using texture coordinates from the vertex shader (which in turn got it from the vertex).

Do you want to see the result of such a shader? Here it is:

This is being rendered in real time; you can drag the sphere with your mouse.

To achieve this result, you will have to deal with a lot of WebGL code. Indeed, WebGL is a really powerful but really low-level API, and you have to do everything by yourself, from creating the buffers to defining vertex structures. You also have to do all the math and set all the states and handle texture loading and so on…

Too Hard? BABYLON.ShaderMaterial to the Rescue

I know what you are thinking: shaders are really cool, but I do not want to bother with WebGL internal plumbing or even with math.

And that's fine! This is a perfectly legitimate ask, and that is exactly why I created Babylon.js.

Let me present to you the code used by the previous rolling sphere demo. First of all, you will need a simple webpage:
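The original markup is not reproduced here; a minimal sketch of such a page (the file names and element ids are illustrative) could look like this:

```html
<!DOCTYPE html>
<html>
<head>
    <title>Babylon.js shader sample</title>
    <script src="babylon.js"></script>

    <!-- Shaders stored in script tags; Babylon.js can read them by id -->
    <script type="application/vertexShader" id="vertexShaderCode">
        precision highp float;
        attribute vec3 position;
        attribute vec2 uv;
        uniform mat4 worldViewProjection;
        varying vec2 vUV;
        void main(void) {
            gl_Position = worldViewProjection * vec4(position, 1.0);
            vUV = uv;
        }
    </script>

    <script type="application/fragmentShader" id="fragmentShaderCode">
        precision highp float;
        varying vec2 vUV;
        uniform sampler2D textureSampler;
        void main(void) {
            gl_FragColor = texture2D(textureSampler, vUV);
        }
    </script>
</head>
<body>
    <canvas id="renderCanvas"></canvas>
    <script src="index.js"></script>
</body>
</html>
```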

You will notice that the shaders are defined by <script> tags. With Babylon.js you can also define them in separate files (.fx files).

You can get Babylon.js here or on our GitHub repo. You must use version 1.11 or higher to get access to BABYLON.ShaderMaterial.

And finally the main JavaScript code is the following:
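The original listing is not reproduced here; a minimal sketch of the setup (element ids such as renderCanvas, vertexShaderCode, and fragmentShaderCode are illustrative) might look like this:

```javascript
// Create the engine and a basic scene
var canvas = document.getElementById("renderCanvas");
var engine = new BABYLON.Engine(canvas, true);
var scene = new BABYLON.Scene(engine);

var camera = new BABYLON.ArcRotateCamera("camera", 0, Math.PI / 2, 10,
    BABYLON.Vector3.Zero(), scene);
camera.attachControl(canvas);

var sphere = BABYLON.Mesh.CreateSphere("sphere", 16, 5, scene);

// Compile and link the shaders stored in the page's <script> tags
var amigaMaterial = new BABYLON.ShaderMaterial("amiga", scene, {
    vertexElement: "vertexShaderCode",
    fragmentElement: "fragmentShaderCode"
}, {
    attributes: ["position", "uv"],
    uniforms: ["worldViewProjection"]
});

amigaMaterial.setTexture("textureSampler",
    new BABYLON.Texture("amiga.jpg", scene));

sphere.material = amigaMaterial;

engine.runRenderLoop(function () {
    sphere.rotation.y += 0.05;
    scene.render();
});
```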

You can see that I use a BABYLON.ShaderMaterial to get rid of all the burden of compiling, linking and handling shaders.

When you create a BABYLON.ShaderMaterial, you have to specify the DOM element used to store the shaders or the base name of the files where the shaders are. If you choose to use files, you must create a file for each shader and use the following filename pattern: basename.vertex.fx and basename.fragment.fx. Then you will have to create the material like this:
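For instance, with files named myShader.vertex.fx and myShader.fragment.fx (the base name myShader is illustrative), the creation could look like this:

```javascript
// "myShader" resolves to myShader.vertex.fx and myShader.fragment.fx
var material = new BABYLON.ShaderMaterial("material", scene, "myShader", {
    attributes: ["position", "uv"],
    uniforms: ["worldViewProjection"]
});
```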

You must also specify the names of any attributes and uniforms that you use. Then, you can set directly the value of your uniforms and samplers using the setTexture, setFloat, setFloats, setColor3, setColor4, setVector2, setVector3, setVector4, and setMatrix functions.

Pretty simple, right?

Do you remember the previous worldViewProjection matrix? Using Babylon.js and BABYLON.ShaderMaterial, you have nothing to worry about! The BABYLON.ShaderMaterial will automatically compute it for you because you declare it in the list of uniforms.

BABYLON.ShaderMaterial can also handle the following matrices for you:

  • world 
  • view 
  • projection 
  • worldView 
  • worldViewProjection 

No need for math any longer. For instance, each time you execute sphere.rotation.y += 0.05, the world matrix of the sphere is generated for you and transmitted to the GPU.

CYOS: Create Your Own Shader

So let’s go bigger and create a page where you can dynamically create your own shaders and see the result immediately. This page is going to use the same code that we previously discussed, and is going to use a BABYLON.ShaderMaterial object to compile and execute shaders that you will create.

I used the ACE code editor for CYOS. It is an excellent code editor with syntax highlighting. Feel free to have a look at it here. You can find CYOS here.

Using the first combo box, you will be able to select pre-defined shaders. We will look at each of them shortly.

You can also change the mesh (the 3D object) used to preview your shaders using the second combo box.

The Compile button is used to create a new BABYLON.ShaderMaterial from your shaders. The code used by this button is the following: 
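The original listing is not reproduced here; a sketch of what the Compile button does (element ids and variable names are illustrative) might be:

```javascript
// Recreate the material from the shaders currently in the editor
shaderMaterial = new BABYLON.ShaderMaterial("shader", scene, {
    vertexElement: "vertexShaderCode",
    fragmentElement: "fragmentShaderCode"
}, {
    attributes: ["position", "normal", "uv"],
    uniforms: ["world", "worldView", "worldViewProjection"]
});

// The two textures mentioned below, pre-loaded for your shaders
shaderMaterial.setTexture("textureSampler",
    new BABYLON.Texture("amiga.jpg", scene));
shaderMaterial.setTexture("refSampler",
    new BABYLON.Texture("ref.jpg", scene));

mesh.material = shaderMaterial;
```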

Brutally simple, right? The material is ready to send you three pre-computed matrices (world, worldView and worldViewProjection). Vertices will come with position, normal and texture coordinates. Two textures are also already loaded for you:

  • amiga.jpg 
  • ref.jpg 

And finally, here is the renderLoop where I update two convenient uniforms:

  • one called time in order to get some funny animations 
  • one called cameraPosition to get the position of the camera into your shaders (which will be useful for lighting equations) 
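A sketch of such a render loop (variable names are illustrative):

```javascript
var time = 0;

engine.runRenderLoop(function () {
    mesh.rotation.y += 0.001;

    if (shaderMaterial) {
        // Animate shaders over time
        shaderMaterial.setFloat("time", time);
        time += 0.02;

        // Let shaders know where the camera is
        shaderMaterial.setVector3("cameraPosition", camera.position);
    }

    scene.render();
});
```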

Thanks to the work we did on Windows Phone 8.1, you can also use CYOS on your Windows Phone (it is always a good time to create a shader):

CYOS on Windows Phone

Basic Shader

So let’s start with the very first shader defined on CYOS: the Basic shader.

We already know this shader. It computes gl_Position and uses texture coordinates to fetch a color for every pixel.

To compute the pixel position, we just need the worldViewProjection matrix and the vertex’s position:
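A sketch of the Basic vertex shader, consistent with this description:

```glsl
precision mediump float;

// Attributes
attribute vec3 position;
attribute vec2 uv;

// Uniforms
uniform mat4 worldViewProjection;

// Varying
varying vec2 vUV;

void main(void) {
    // Project the vertex position onto the screen
    gl_Position = worldViewProjection * vec4(position, 1.0);

    // Pass texture coordinates through unmodified
    vUV = uv;
}
```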

Texture coordinates (uv) are transmitted unmodified to the pixel shader.

Please note that we need to add precision mediump float; on the first line of both vertex and pixel shaders because Chrome requires it. It specifies that, for better performance, we do not use full-precision floating-point values.

The pixel shader is even simpler, because we just need to use texture coordinates and fetch a texture color:
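A sketch of the Basic pixel shader:

```glsl
precision mediump float;

varying vec2 vUV;
uniform sampler2D textureSampler;

void main(void) {
    // Fetch the texture color at the interpolated coordinates
    gl_FragColor = texture2D(textureSampler, vUV);
}
```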

We saw previously that the textureSampler uniform is filled with the “amiga” texture, so the result is the following:

Basic Shader result

Black and White Shader

Now let’s continue with a new shader: the black and white shader.

The goal of this shader is to use the previous one but with a "black and white only" rendering mode. To do so, we can keep the same vertex shader, but the pixel shader must be slightly modified.

The first option we have is to take only one component, such as the green one:
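A sketch of this green-only pixel shader:

```glsl
precision mediump float;

varying vec2 vUV;
uniform sampler2D textureSampler;

void main(void) {
    // .ggg replicates the green channel into r, g, and b
    gl_FragColor = vec4(texture2D(textureSampler, vUV).ggg, 1.0);
}
```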

As you can see, instead of using .rgb (this operation is called a swizzle), we used .ggg.

But if we want a really accurate black and white effect, it would be a better idea to compute the luminance (which takes into account all color components):
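A sketch of the luminance-based pixel shader:

```glsl
precision mediump float;

varying vec2 vUV;
uniform sampler2D textureSampler;

void main(void) {
    // Weighted sum of the color components (see the dot product below)
    float luminance = dot(texture2D(textureSampler, vUV).rgb,
                          vec3(0.3, 0.59, 0.11));
    gl_FragColor = vec4(luminance, luminance, luminance, 1.0);
}
```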

The dot operation (or dot product) is computed like this:

result = v0.x * v1.x + v0.y * v1.y + v0.z * v1.z

So in our case:

luminance = r * 0.3 + g * 0.59 + b * 0.11 (these values are based on the fact that the human eye is more sensitive to green)
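As a quick sanity check outside of GLSL, here is the same dot product in plain JavaScript (the dot3 helper is illustrative, mirroring GLSL's built-in dot):

```javascript
// Mirrors GLSL's dot() for 3-component vectors:
// result = v0.x * v1.x + v0.y * v1.y + v0.z * v1.z
function dot3(v0, v1) {
    return v0[0] * v1[0] + v0[1] * v1[1] + v0[2] * v1[2];
}

// Luminance of a pure green pixel: 0 * 0.3 + 1 * 0.59 + 0 * 0.11
var luminance = dot3([0.0, 1.0, 0.0], [0.3, 0.59, 0.11]);
console.log(luminance); // 0.59
```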

Sounds cool, doesn’t it?

Black and white shader result

Cell Shading Shader

Now let’s move to a more complex shader: the cell shading shader.

This one will require us to get the vertex’s normal and the vertex’s position in the pixel shader. So the vertex shader will look like this:
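A sketch of such a vertex shader, passing world-space position and normal to the pixel shader:

```glsl
precision mediump float;

// Attributes
attribute vec3 position;
attribute vec3 normal;
attribute vec2 uv;

// Uniforms
uniform mat4 world;
uniform mat4 worldViewProjection;

// Varying
varying vec3 vPositionW;
varying vec3 vNormalW;
varying vec2 vUV;

void main(void) {
    gl_Position = worldViewProjection * vec4(position, 1.0);

    // World-space position and normal, for lighting in the pixel shader
    vPositionW = vec3(world * vec4(position, 1.0));
    vNormalW = normalize(vec3(world * vec4(normal, 0.0)));

    vUV = uv;
}
```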

Please note that we also use the world matrix, because position and normal are stored without any transformation and we must apply the world matrix to take into account the object’s rotation.

The pixel shader is the following:
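A sketch of what this pixel shader can look like (the light position and the lower thresholds and levels are illustrative; the 0.95 and 0.5 thresholds with the 0.8 factor follow the description below):

```glsl
precision mediump float;

// Varying
varying vec3 vPositionW;
varying vec3 vNormalW;
varying vec2 vUV;

// Uniforms
uniform sampler2D textureSampler;

void main(void) {
    // Fixed light position (the light is not moving)
    vec3 lightPosition = vec3(0.0, 20.0, 10.0);
    vec3 lightVectorW = normalize(lightPosition - vPositionW);

    // Diffuse intensity: angle between the normal and the light direction
    float ndl = max(0.0, dot(vNormalW, lightVectorW));

    // Texture color for this pixel
    vec3 color = texture2D(textureSampler, vUV).rgb;

    // Quantize the intensity into discrete brightness levels
    if (ndl > 0.95) {
        color *= 1.0;
    } else if (ndl > 0.5) {
        color *= 0.8;
    } else if (ndl > 0.25) {
        color *= 0.6;
    } else {
        color *= 0.4;
    }

    gl_FragColor = vec4(color, 1.0);
}
```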

The goal of this shader is to simulate a light, and instead of computing a smooth shading we will consider that light will apply according to specific brightness thresholds. For instance, if light intensity is between 1 (maximum) and 0.95, the color of the object (fetched from the texture) will be applied directly. If intensity is between 0.95 and 0.5, the color will be attenuated by a factor of 0.8, and so on.

So, there are mainly four steps in this shader:

  • First, we declare thresholds and levels constants.
  • Then, we need to compute the lighting using the Phong equation (we assume that the light is not moving): 

The intensity of light per pixel is dependent on the angle between the normal and the light's direction.

  • Then we get the texture color for the pixel.
  • And finally we check the threshold and apply the level to the color.

The result looks like a cartoon object: 

Cell shading shader result

Phong Shader

We used a portion of the Phong equation in the previous shader. So let’s try to use the whole thing now.

The vertex shader is clearly simple here, because everything will be done in the pixel shader:
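A sketch of it — the same pattern as the cell shading vertex shader, since the pixel shader needs world-space position and normal:

```glsl
precision mediump float;

// Attributes
attribute vec3 position;
attribute vec3 normal;
attribute vec2 uv;

// Uniforms
uniform mat4 world;
uniform mat4 worldViewProjection;

// Varying
varying vec3 vPositionW;
varying vec3 vNormalW;
varying vec2 vUV;

void main(void) {
    gl_Position = worldViewProjection * vec4(position, 1.0);

    vPositionW = vec3(world * vec4(position, 1.0));
    vNormalW = normalize(vec3(world * vec4(normal, 0.0)));
    vUV = uv;
}
```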

According to the equation, you must compute the diffuse and specular part by using the light direction and the vertex’s normal:
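A sketch of the Phong pixel shader (the light position and the specular exponent of 64 are illustrative):

```glsl
precision mediump float;

// Varying
varying vec3 vPositionW;
varying vec3 vNormalW;
varying vec2 vUV;

// Uniforms
uniform vec3 cameraPosition;
uniform sampler2D textureSampler;

void main(void) {
    // Fixed light position (illustrative)
    vec3 lightPosition = vec3(0.0, 20.0, 10.0);

    // View and light directions in world space
    vec3 viewDirectionW = normalize(cameraPosition - vPositionW);
    vec3 lightVectorW = normalize(lightPosition - vPositionW);

    vec3 color = texture2D(textureSampler, vUV).rgb;

    // Diffuse term
    float ndl = max(0.0, dot(vNormalW, lightVectorW));

    // Specular term, using the half vector between view and light
    vec3 angleW = normalize(viewDirectionW + lightVectorW);
    float specComp = max(0.0, dot(vNormalW, angleW));
    specComp = pow(specComp, 64.0);

    gl_FragColor = vec4(color * ndl + vec3(specComp), 1.0);
}
```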

We already used the diffuse part in the previous shader, so here we just need to add the specular part. This picture from a Wikipedia article explains how the shader works:

Diffuse plus Specular equals Phong Reflection
By Brad Smith aka Rainwarrior.

The result on our sphere:

Phong shader result

Discard Shader

For the discard shader, I would like to introduce a new concept: the discard keyword. This shader will discard every non-red pixel and will create the illusion of a "dug" object.

The vertex shader is the same as that used by the basic shader.

The pixel shader will have to test the color and use discard when, for instance, the green component is too high:
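A sketch of such a pixel shader (the 0.3 thresholds are illustrative):

```glsl
precision mediump float;

varying vec2 vUV;
uniform sampler2D textureSampler;

void main(void) {
    vec3 color = texture2D(textureSampler, vUV).rgb;

    // Drop every pixel that is not predominantly red
    if (color.g > 0.3 || color.b > 0.3) {
        discard;
    }

    gl_FragColor = vec4(color, 1.0);
}
```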

The result is funny:

Discard shader result

Wave Shader

We’ve played a lot with pixel shaders, but I also wanted to show you that we can do a lot of things with vertex shaders.

For the wave shader, we will reuse the Phong pixel shader.

The vertex shader will use the uniform called time to get some animated values. Using this uniform, the shader will generate a wave with the vertices’ positions:
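A sketch of the wave vertex shader (the amplitude and frequency values are illustrative):

```glsl
precision mediump float;

// Attributes
attribute vec3 position;
attribute vec3 normal;
attribute vec2 uv;

// Uniforms
uniform mat4 world;
uniform mat4 worldViewProjection;
uniform float time;

// Varying
varying vec3 vPositionW;
varying vec3 vNormalW;
varying vec2 vUV;

void main(void) {
    vec3 v = position;

    // Displace the vertex with a sine of position.y, animated by time
    v.x += sin(2.0 * position.y + time) * 0.5;

    gl_Position = worldViewProjection * vec4(v, 1.0);

    vPositionW = vec3(world * vec4(v, 1.0));
    vNormalW = normalize(vec3(world * vec4(normal, 0.0)));
    vUV = uv;
}
```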

A sine is applied to position.y, and the result is the following:

Wave shader result

Spherical Environment Mapping

This one was largely inspired by this tutorial. I’ll let you read that excellent article and play with the associated shader. 

Spherical environment mapping shader

Fresnel Shader

I would like to finish this article with my favorite: the Fresnel shader.

This shader is used to apply a different intensity according to the angle between the view direction and the vertex’s normal.

The vertex shader is the same one used by the cell shading shader, and we can easily compute the Fresnel term in our pixel shader (because we have the normal and the camera’s position, which can be used to evaluate the view direction):
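A sketch of the Fresnel pixel shader (the white base color is illustrative):

```glsl
precision mediump float;

// Varying
varying vec3 vPositionW;
varying vec3 vNormalW;

// Uniforms
uniform vec3 cameraPosition;

void main(void) {
    vec3 color = vec3(1.0, 1.0, 1.0);
    vec3 viewDirectionW = normalize(cameraPosition - vPositionW);

    // Fresnel term: stronger where the view grazes the surface
    float fresnelTerm = dot(viewDirectionW, vNormalW);
    fresnelTerm = clamp(1.0 - fresnelTerm, 0.0, 1.0);

    gl_FragColor = vec4(color * fresnelTerm, 1.0);
}
```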

Fresnel Shader result

Your Shader?

You are now more prepared to create your own shader. Feel free to use the comments here or at the Babylon.js forum to share your experiments!

If you want to go further, here are some useful links:

And some more learning that I’ve created on the subject:

Or, stepping back, our team’s learning series on JavaScript: 

And of course, you are always welcome to use some of our free tools in building your next web experience: Visual Studio Community, Azure Trial, and cross-browser testing tools for Mac, Linux, or Windows.

This article is part of the web dev tech series from Microsoft. We’re excited to share Microsoft Edge and the new EdgeHTML rendering engine with you. Get free virtual machines or test remotely on your Mac, iOS, Android, or Windows device @ http://dev.modern.ie/.
