Hello 🙋‍♂️🙋‍♂️. In my last few articles, I’ve shown how to set up your WebGPU project and how to create a basic render loop, but we never did anything fun with it. So far all we’ve done is create triangles and “squares” (2 triangles) aaaannnnndddd that’s it. This, as you might expect, is boring.
Now, don’t get me wrong, you really should give respect to the triangle as it’s the foundation, but a solid foundation without a house on top is meaningless. So, I figured we might as well dive into the magical world of shaders and make some cool stuff.
Understanding the basics
For us to understand the power of shaders, we first need to keep a few things in mind. Let’s start with the vertex shader. To summarize, the vertex shader is responsible for defining where on the screen things will be drawn. It does this by outputting a collection of points, typically grouped into triangles, that define the areas which will eventually be filled in. A further thing to remember is that the GPU uses clip space to abstract away the actual canvas it will render to. This keeps the pipeline independent of the canvas resolution, and working with numbers in the -1 to +1 range also plays nicely with how floating-point numbers are implemented, since they are most precise near zero.
As can be seen from the image, both the x and y axes range from -1 to +1, and any coordinates outside these values will lead to the things in that area not being drawn on the screen, hence the name clip space.
For this article, this is all we need to know about the vertex shader, since we’ll mostly be focusing on the fragment shader, as playing with colours is more fun. I’ll do another article on the vertex shader if there’s a need.
Basic shader script
To start with, I made this basic shader:
struct Props {
  width: f32,  // the width of the canvas
  height: f32, // the height of the canvas
  time: f32,   // the number of seconds since the program started
}

@group(0) @binding(0) var<uniform> props: Props;

@vertex
fn vs(@builtin(vertex_index) idx: u32) -> @builtin(position) vec4f {
  var pos: array<vec2f, 3> = array(
    vec2f(-0.5,  0.5), // top left point
    vec2f(-0.5, -0.5), // bottom left point
    vec2f( 0.5, -0.5), // bottom right point
  );
  return vec4f(pos[idx], 0, 1);
}

@fragment
fn fs(@builtin(position) pos: vec4f) -> @location(0) vec4f {
  return vec4f(1, 0, 0, 1);
}
If you’ve read my previous articles, this code should look familiar. It draws a red triangle onto the screen.
I won’t cover the plumbing for getting the shader code to run, but I’ve linked the code to the project here. If you want to know how that works, you can check out my previous articles. This is mainly because it’s not that different from the render-loop setup, with the exception of getting the data into the props uniform, plus I think it’d be a great learning opportunity.
Explaining the shader script
You can find an accessible in-depth explanation of the WGSL syntax here.
At the start of the script, you’ll see I create something called a struct. This can be thought of as a class definition, here called Props, which is then used to describe the structure of the props variable you’ll see on the subsequent line. An important thing to note is that WGSL, the WebGPU shading language, is a statically typed language. It also cares about the exact size of each type, so in this case f32 means a 32-bit floating-point number.
Another thing you’ll notice is the array, vec2f and vec4f datatypes. The array datatype should be obvious, but you can understand the other two as 2- and 4-dimensional vectors of 32-bit floats respectively. Conceptually, you can think of a vector as an array with a fixed number of elements, so vec2f is a 2-element array and vec4f is a 4-element array.
Moving on, you’ll notice the vertex shader defines the three points of the triangle, which should make sense if you look at the clip space reference image. As I mentioned earlier, we aren’t really concerned with this function, but I would like to draw to the whole screen instead of just the tiny triangle, as it’s more interesting, so we can modify the vertex shader to look like below.
@vertex
fn vs(@builtin(vertex_index) idx: u32) -> @builtin(position) vec4f {
  var pos: array<vec2f, 3> = array(
    vec2f(-1,  3),
    vec2f(-1, -1),
    vec2f( 3, -1),
  );
  return vec4f(pos[idx], 0, 1);
}
This is a hack I learnt from this article; it defines the triangle shown below. As can be seen from the image, this triangle perfectly covers the -1 to 1 range, and we don’t need to worry about the extra areas since they get clipped out anyway. It isn’t a particularly important trick, but I like it because it lets me define only 3 points instead of the 6 I’d need for a rectangle covering the whole screen.
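If you want to convince yourself that this oversized triangle really does cover the whole clip-space square, here’s a quick sanity check in plain JavaScript (host-side math for intuition, not shader code), using a standard same-side point-in-triangle test:

```javascript
// The oversized triangle from the vertex shader.
const tri = [[-1, 3], [-1, -1], [3, -1]];

// A point is inside a triangle if it lies on the same side of all 3 edges.
function inTriangle([px, py], [a, b, c]) {
  const side = ([x1, y1], [x2, y2]) =>
    (x2 - x1) * (py - y1) - (y2 - y1) * (px - x1);
  const s1 = side(a, b), s2 = side(b, c), s3 = side(c, a);
  return (s1 >= 0 && s2 >= 0 && s3 >= 0) || (s1 <= 0 && s2 <= 0 && s3 <= 0);
}

// All four corners of the -1..+1 clip-space square are covered.
const corners = [[-1, -1], [1, -1], [-1, 1], [1, 1]];
console.log(corners.every((c) => inTriangle(c, tri))); // true
```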
At this moment, your screen should now look like this.
As expected, we’ve now covered the whole screen and can start messing around with the colours.
Understanding the fragment shader
With that out of the way, let’s move on to the star of the show. As you’ll recall, our fragment shader looks like this.
@fragment
fn fs(@builtin(position) pos: vec4f) -> @location(0) vec4f {
  return vec4f(1, 0, 0, 1);
}
To understand what’s going on here, remember that all the fragment shader cares about is assigning a colour to a given pixel on the screen. So, if we want different colours in different areas of the screen, we need to know which pixel we are drawing to. This is why we have the pos argument to the function, a built-in value provided by WGSL.
One thing we need to know is that the x and y values of the pos vector range from 0 to the canvas width and 0 to the canvas height respectively. This is a problem, as the colour output range is from 0 to 1, meaning that if we just passed the coordinates through directly, everything would clip to 1, which is not ideal. To fix this, we divide the x and y values by the canvas width and height respectively, as shown in the code below.
@fragment
fn fs(@builtin(position) pos: vec4f) -> @location(0) vec4f {
  var uv = pos.xy / vec2f(props.width, props.height);
  return vec4f(uv.x, 0, 0, 1);
}
You’ll notice I’ve introduced a funny notation here: pos.xy. This is called swizzling, and it’s shorthand for vec2f(pos.x, pos.y). It’s very versatile, and you can use it to quickly construct vectors like pos.zyxx, pos.xyx and pos.xyz. Another thing you’ll notice is that you can implicitly carry out element-wise operations, in this case division, between two vectors of the same size.
Moving on, after running this code, you’ll get the following result.
An interesting thing you’ll notice is that we now have a colour gradient going from black on the left to bright red on the right. This matches our expected result of the x range going from 0 to 1. Similarly, we can also inspect our y value by modifying the code like so.
@fragment
fn fs(@builtin(position) pos: vec4f) -> @location(0) vec4f {
  var uv = pos.xy / vec2f(props.width, props.height);
  return vec4f(uv.y, 0, 0, 1);
}
This gives us the following result.
You’ll notice now that the gradient starts at black at the top and goes to red at the bottom.
Taking these two observations into account, we can conclude that our current coordinate space, in terms of the uv variable, is as shown in the image below.
We can also visualise this coordinate space by making use of both the red and green channels, with the following modification that uses the red channel for x and the green channel for y.
@fragment
fn fs(@builtin(position) pos: vec4f) -> @location(0) vec4f {
  var uv = pos.xy / vec2f(props.width, props.height);
  return vec4f(uv, 0, 1);
}
This now gives us this result.
As you’d expect, the top left is black, corresponding to coordinate (0,0), and the bottom right is yellow, corresponding to (1,1), with the colours becoming increasingly red as you move right and increasingly green as you move down.
You might now notice that when it comes to the fragment shader, there’s no difference between colours and spatial coordinates. This is a very important thing to remember as it’s the core principle underlying all the crazy things we can make with the fragment shader as you’ll see.
If you want to learn more about this, I recommend you check out the Book of Shaders website. It unfortunately uses WebGL instead of WebGPU, but the shading languages aren’t too different from each other, and the lessons it imparts are invaluable.
Now that we’ve got a basic understanding of the fragment shader, let’s have some fun! 🥳🥳
Playing with the fragment shader
So the first thing I want to do is convert my uv coordinates, which range from 0 to 1, to clip space coordinates, which range from -1 to +1. This is not a fundamentally necessary step, but the examples I have make use of a lot of symmetry, so it makes sense to make the centre of the screen (0,0) instead of (0.5,0.5) like it is at the moment.
To show you how you can achieve this, let’s look at each axis individually.
For the x-axis, we know it ranges from 0 to 1 from left to right. This means that if we double our range and subtract 1, we’ll get numbers that range from -1 to +1 from left to right.
For the y-axis, we know it ranges from 0 to 1 from top to bottom. This means that if we negate the numbers, double the values and add 1, we’ll get numbers that range from -1 to +1 from bottom to top.
With this in mind, we make the following modifications to our fragment shader.
@fragment
fn fs(@builtin(position) pos: vec4f) -> @location(0) vec4f {
  var uv = pos.xy / vec2f(props.width, props.height);
  uv.x = 2 * uv.x - 1;
  uv.y = -2 * uv.y + 1;
  return vec4f(uv, 0, 1);
}
This gives us the following result.
You’ll notice that the top right corner is the expected yellow color and the entire bottom left quadrant is black since all the numbers in that region are negative.
We could also compact the clip space transformation into one line by using element-wise operations, as shown below.
@fragment
fn fs(@builtin(position) pos: vec4f) -> @location(0) vec4f {
  var uv = pos.xy / vec2f(props.width, props.height);
  uv = 2 * vec2f(uv.x, -uv.y) + vec2f(-1, 1);
  return vec4f(uv, 0, 1);
}
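As a quick sanity check, here’s the same mapping written out in plain JavaScript (again, host-side math rather than shader code), applied to a few known uv points:

```javascript
// uv -> clip space: uv' = 2 * (uv.x, -uv.y) + (-1, 1)
function toClip([u, v]) {
  return [2 * u - 1, -2 * v + 1];
}

console.log(toClip([0, 0]));     // [-1, 1]  top left
console.log(toClip([1, 1]));     // [1, -1]  bottom right
console.log(toClip([0.5, 0.5])); // [0, 0]   centre of the screen
```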
Before continuing, let’s take a moment to appreciate everything we’ve learnt. While I won’t claim that this is all you need to know, I strongly believe that with this basis, you can now understand why certain shader tricks work so well, as the following examples will show.
First art piece
Our first piece is a replica of this art piece.
Let us look at our first WGSL function, distance, which gives you the distance between two points.
@fragment
fn fs(@builtin(position) pos: vec4f) -> @location(0) vec4f {
  var uv = pos.xy / vec2f(props.width, props.height);
  uv = 2 * vec2f(uv.x, -uv.y) + vec2f(-1, 1);
  var d = distance(uv, vec2f(0));
  return vec4f(d, d, d, 1);
}
Having obtained the distance d, we plug it into all the colour channels, which gives us a nice grayscale output.
So, as we might expect, the distance starts at zero and progressively becomes white as we radiate out. You might notice something funny though. If you look closely, you’ll see the black region is more elliptical than circular, and this is because of the aspect ratio of the canvas. To fix it, you multiply the uv x coordinate by the aspect ratio, that is, the width divided by the height.
@fragment
fn fs(@builtin(position) pos: vec4f) -> @location(0) vec4f {
  var uv = pos.xy / vec2f(props.width, props.height);
  uv = 2 * vec2f(uv.x, -uv.y) + vec2f(-1, 1);
  uv.x *= props.width / props.height;
  var d = distance(uv, vec2f(0));
  return vec4f(d, d, d, 1);
}
And now we get the correct circular output.
If we want a perfect circle with a hard edge, we can use another function named step, which behaves like the typical step function. For those not aware, the step function is 0 before a certain value and 1 afterwards.
@fragment
fn fs(@builtin(position) pos: vec4f) -> @location(0) vec4f {
  var uv = pos.xy / vec2f(props.width, props.height);
  uv = 2 * vec2f(uv.x, -uv.y) + vec2f(-1, 1);
  uv.x *= props.width / props.height;
  var d = distance(uv, vec2f(0));
  d = step(0.5, d);
  return vec4f(d, d, d, 1);
}
This gives us the following output.
Not bad, but you’ll notice the edge is quite sharp. We can fix this by using the smoothstep function instead, which behaves like step but performs a smooth transition from 0 to 1 between the two provided edge values.
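Under the hood, smoothstep clamps its input to the two edges and runs it through a cubic curve. A plain JavaScript version, for intuition (WGSL has this built in):

```javascript
// smoothstep(e0, e1, x): 0 below e0, 1 above e1, and a smooth cubic
// (Hermite) ramp in between.
function smoothstep(e0, e1, x) {
  const t = Math.min(Math.max((x - e0) / (e1 - e0), 0), 1);
  return t * t * (3 - 2 * t);
}

console.log(smoothstep(0.5, 0.55, 0.4)); // 0 -> fully inside the circle
console.log(smoothstep(0.5, 0.55, 0.6)); // 1 -> fully outside
console.log(smoothstep(0, 1, 0.5));      // 0.5 -> halfway through the ramp
```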
@fragment
fn fs(@builtin(position) pos: vec4f) -> @location(0) vec4f {
  var uv = pos.xy / vec2f(props.width, props.height);
  uv = 2 * vec2f(uv.x, -uv.y) + vec2f(-1, 1);
  uv.x *= props.width / props.height;
  var d = distance(uv, vec2f(0));
  d = smoothstep(0.5, 0.55, d);
  return vec4f(d, d, d, 1);
}
We can also invert the colours like so.
@fragment
fn fs(@builtin(position) pos: vec4f) -> @location(0) vec4f {
  var uv = pos.xy / vec2f(props.width, props.height);
  uv = 2 * vec2f(uv.x, -uv.y) + vec2f(-1, 1);
  uv.x *= props.width / props.height;
  var d = distance(uv, vec2f(0));
  d = 1 - smoothstep(0.5, 0.55, d);
  return vec4f(d, d, d, 1);
}
A cool thing we can do is form a ring outline by subtracting a smaller circle from a bigger circle.
@fragment
fn fs(@builtin(position) pos: vec4f) -> @location(0) vec4f {
  var uv = pos.xy / vec2f(props.width, props.height);
  uv = 2 * vec2f(uv.x, -uv.y) + vec2f(-1, 1);
  uv.x *= props.width / props.height;
  var d = distance(uv, vec2f(0));
  var innerCircle = 1 - smoothstep(0.5, 0.52, d);
  var outerCircle = 1 - smoothstep(0.55, 0.57, d);
  d = outerCircle - innerCircle;
  return vec4f(d, d, d, 1);
}
Pay attention to the subtraction trick I did right here. You’ll see this a lot in shader work, where we create new shapes by subtracting existing shapes from each other or adding new ones.
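To see the subtraction at work on the CPU, here’s the same ring mask mirrored in plain JavaScript (smoothstep as defined above, not shader code): the inner disc cancels out the middle of the outer disc, leaving only the ring between them.

```javascript
// smoothstep(e0, e1, x): clamped cubic ramp, same as the WGSL built-in.
function smoothstep(e0, e1, x) {
  const t = Math.min(Math.max((x - e0) / (e1 - e0), 0), 1);
  return t * t * (3 - 2 * t);
}

// outer filled disc minus inner filled disc = ring.
function ring(d) {
  const innerCircle = 1 - smoothstep(0.5, 0.52, d);
  const outerCircle = 1 - smoothstep(0.55, 0.57, d);
  return outerCircle - innerCircle;
}

console.log(ring(0.3));  // 0 -> inside both discs, they cancel out
console.log(ring(0.53)); // 1 -> between the discs: the white ring
console.log(ring(0.8));  // 0 -> outside both discs
```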
Now let’s see if we can get some repetition going. For this, we’re going to make use of the sin function.
@fragment
fn fs(@builtin(position) pos: vec4f) -> @location(0) vec4f {
  var uv = pos.xy / vec2f(props.width, props.height);
  uv = 2 * vec2f(uv.x, -uv.y) + vec2f(-1, 1);
  uv.x *= props.width / props.height;
  var d = distance(uv, vec2f(0));
  d = sin(d);
  d = 1 - smoothstep(0.7, 0.71, d);
  return vec4f(d, d, d, 1);
}
You’ll notice I’ve gone back to the black circle code but added a sin in between. Let’s see what we get.
@fragment
fn fs(@builtin(position) pos: vec4f) -> @location(0) vec4f {
  var uv = pos.xy / vec2f(props.width, props.height);
  uv = 2 * vec2f(uv.x, -uv.y) + vec2f(-1, 1);
  uv.x *= props.width / props.height;
  var d = distance(uv, vec2f(0));
  d = sin(d);
  d = smoothstep(0.7, 0.71, d);
  return vec4f(d, d, d, 1);
}
🤔🤔 You’ll notice nothing changed at all. This is because, at the moment, the on-screen distances only span a fraction of a single sine cycle, so sin(d) is still just one smooth ramp. Let’s squeeze in a few more cycles.
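To see why the number of cycles matters, we can count, in plain JavaScript, how many times sin(k * d) crosses the 0.7 smoothstep threshold as d sweeps over the visible range of distances; each crossing becomes a ring edge on screen.

```javascript
// Count threshold crossings of sin(k * d) - 0.7 as d sweeps 0 -> 2
// (roughly the range of distances visible on screen).
function crossings(k) {
  let count = 0;
  let prev = Math.sin(0) - 0.7;
  for (let d = 0.001; d <= 2; d += 0.001) {
    const cur = Math.sin(k * d) - 0.7;
    if (Math.sign(cur) !== Math.sign(prev)) count++;
    prev = cur;
  }
  return count;
}

console.log(crossings(1));  // 1 -> a single edge: one circle, same as before
console.log(crossings(30)); // many edges -> lots of concentric rings
```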
@fragment
fn fs(@builtin(position) pos: vec4f) -> @location(0) vec4f {
  var uv = pos.xy / vec2f(props.width, props.height);
  uv = 2 * vec2f(uv.x, -uv.y) + vec2f(-1, 1);
  uv.x *= props.width / props.height;
  var d = distance(uv, vec2f(0));
  d = sin(d * 30);
  d = smoothstep(0.7, 0.71, d);
  return vec4f(d, d, d, 1);
}
The last thing we can do is add a time component. That way, we’ll finally have things moving on the screen.
@fragment
fn fs(@builtin(position) pos: vec4f) -> @location(0) vec4f {
  let t = props.time;
  var uv = pos.xy / vec2f(props.width, props.height);
  uv = 2 * vec2f(uv.x, -uv.y) + vec2f(-1, 1);
  uv.x *= props.width / props.height;
  var d = distance(uv, vec2f(0));
  d = sin(d * 30 + t * 4);
  d = smoothstep(0.7, 0.71, d);
  return vec4f(d, d, d, 1);
}
Conclusion
And there we have it, our first art piece. Feel free to continue with the video I linked if you want to create more things, but I hope this has helped demystify the world of shaders a bit. 😁 I wish you well.