Stupid Shader Tricks: Fun with Arrows in Godot

Introduction – Gettin’ Shady

Motivation

I’ve been having a blast experimenting with shaders in Godot, which are programs that run on the GPU to accomplish visual effects (and more!) in games. Specifically, I’ve been exploring CanvasItem shaders, which are Godot’s special shaders for 2D, but the principles involved are the same as in 3D, since even in 3D, shaders generally determine what a surface will look like.

I had been exploring some game development projects recently, first with Unity and now with Godot, and while I have loved learning about visual arts and using GIMP to create sprites, I got it in my head that I wanted to see how far I could go using only graphics generated by the game engine and shader code.

What I love about shaders is probably what most folks who are new to game development hate: shaders are inherently mathematical. There’s no getting around it. Shader language is inherently a vector language, and this is wildly powerful and also, at times, mind-bending.

What’s Inside

First, I give my version of a very brief intro to shaders, but I point to (and rely on) some great links for getting started, for those who don’t already know this material, rather than trying to improve on the great resources already out there.

Then I show a few shaders of increasing complexity, which illustrate how we can connect our understanding of linear equations in mathematics to drawing straight lines in shaders, including how to get a straight line through the center of a texture at any angle. I make some observations that will hopefully be useful for folks who are new(ish) to shaders. Ultimately, this shows off a handful of the basic concepts used in fragment shaders. The shaders were written using Godot 4.2.2.

What is a shader? A tiny intro about Godot shaders

I am not going to fully explain how shaders work here, but there are tons of great intros, and I highly recommend reading more about them. For example, the Godot Shader Introduction is great. For shaders more generally, The Book of Shaders is probably one of the best resources around for newcomers; it is truly one of the treasures of the internet. For video introductions, check out the FencerDevLog and Godotneers YouTube channels. I also highly recommend the videos of Freya Holmér (note: Freya is using Unity in the linked video, so that video doesn’t provide Godot-specific info, but it is still a great general resource).

But here is a two-minute overview: GPUs (essentially) run on all your pixels at once. A shader program gets a rectangle, called a Texture, to draw on, and typically that rectangle already has something drawn on it. A shader can modify what is already there, or completely overwrite it.

Probably the most important variable in shader coding is UV: this is the coordinate of the pixel in the texture, except normalized. This means that no matter what dimensions the texture really is, both U and V only range between 0 and 1, just like the square with \(0 < x < 1\) and \(0 < y < 1\) in the \(x,y\) plane. This “squeezing” is done by just scaling down, so if the image is, say, 256×128, then the \((u,v)\) coordinates of a point with “actual” coordinates \((x,y)\) will be given by \(u=\frac{x}{256}\) and \(v = \frac{y}{128}\). One thing to note, though, is that in Godot shaders, \((0,0)\) is at the upper-left, and V increases toward 1 as you move downward.

Godot shaders have a built-in input vector UV, representing the coordinates of the pixel currently being rendered. Since this is two-dimensional, it’s called a vec2. When you are working on a shader, you’re focusing on just that single UV pixel. It may seem counterintuitive, but since UV is a variable, it’s more accurate to say that you are writing code for a generic pixel in that texture. This enables you to determine what to do with all pixels in the texture in one fell swoop. Your code never knows a specific “constant” coordinate pair for (U,V); it has to be written so that the same bit of code works for all pixels in the texture at the same time. That can be tricky to get used to, relative to “normal” coding, but it is also one reason shaders are so powerful.
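
To make the normalization concrete, here is the same arithmetic sketched in plain Python (the helper and its name are mine, purely for illustration; this is not shader code):

```python
# Sketch: how "actual" pixel coordinates map to normalized UV coordinates
# for a hypothetical 256x128 texture, as described above.
def to_uv(x, y, width=256, height=128):
    """Scale pixel coordinates down into the 0..1 UV square."""
    return (x / width, y / height)

print(to_uv(0, 0))      # upper-left  -> (0.0, 0.0)
print(to_uv(256, 128))  # lower-right -> (1.0, 1.0)
print(to_uv(128, 64))   # center      -> (0.5, 0.5)
```

Remember the flipped vertical axis: the point with \(v = 0\) is at the top of the texture, not the bottom.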

Shaders calculate a variety of things in rendering a pixel, but I’m only going to use fragment shaders here, which just determine what color (and transparency) a pixel should have. This is done in RGBA (red, green, blue, alpha) format. Each of these four values is represented as a floating point number from 0 to 1. As is typical, an alpha of 0 means fully transparent and 1 means fully opaque. The Godot fragment function has a built-in input called COLOR, which is a vec4, since it needs to represent these four values.

Now’s a good time to warn you that I am not an expert on shaders; so feel free to let me know if I say something you think is off-the-wall or misleading.

Some Shader Arrows

Preliminary – Setting up a Godot Shader

If you’ve never used a shader in Godot, it’s worth checking one of the aforementioned intros to see how to get to the point where you can actually apply a shader to an object. You’ll need to create a scene and add something with a texture to that scene. A ColorRect works, as does a Sprite2D or a TextureRect (with a CanvasTexture subresource, scaled large enough so that you can see it). You don’t need to import any image into the project to do this, but you could always use the lovable icon.svg that comes with every Godot project as your base texture. Click on the object you’ve created, and in the Inspector, scroll down to Material, select “New Shader Material”, and then within that select “New Shader.”

A screenshot showing the CanvasItem area of the Godot Inspector, with "Material" and "New Shader" option.

A simple start

Below is the full shader code for a blue arrow facing upward. If you are not familiar with the shader function smoothstep, be sure to read up on that first. It’s one of the first things you need to learn about shaders. (You might want to read about step first.)
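
If it helps to see them outside the engine first, here is a rough Python sketch of what step and smoothstep compute, following the standard GLSL formulas (these are my own illustrative definitions, not Godot code):

```python
def step(edge, x):
    """0 below the edge, 1 at or above it: a hard cutoff."""
    return 0.0 if x < edge else 1.0

def smoothstep(edge0, edge1, x):
    """Clamp, then cubic Hermite interpolation, as in GLSL."""
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

# With the edges reversed (edge0 > edge1), as in the shader below,
# the blend runs from 1 down to 0 as x grows:
print(smoothstep(0.05, 0.0, 0.0))    # on the line        -> 1.0
print(smoothstep(0.05, 0.0, 0.05))   # distance .05 away  -> 0.0
print(smoothstep(0.05, 0.0, 0.025))  # halfway            -> 0.5
```

(Strictly speaking, GLSL leaves smoothstep with reversed edges undefined, but the interpolation formula itself works fine in both directions, which is what matters for the shape of the blend.)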

shader_type canvas_item;

void fragment() {
	COLOR.rgb = vec3(0.,0.,1.);
		
	COLOR.a = max(smoothstep(.05,.0,abs(UV.x-.5)), 
			      max(smoothstep(.05,.0, abs(UV.x-.5-UV.y)), 
				      smoothstep(.05,.0, abs(UV.y-.5+UV.x))));
}

We get a picture like this:

A screenshot of the shader applied in Godot, showing a blue arrow pointing up.

Some notes:

  • We first set COLOR.rgb to the vector <0,0,1>. Those decimal points in the code are crucial; a shader does not convert between integer and floating-point representations for you, and if you write vec3(0,0,1) instead, the shader compiler will refuse to compile it. The COLOR vector is a vec4, and the entries of a vec4 are floats, not integers. Shaders need to be fast, so they make sure they’re not doing that busy work for you. This can be really frustrating, but (I’m sure) it helps ensure shaders operate as efficiently as possible. (I’m probably going to render that moot by doing stupid things with shaders.)
  • But notice we used a vec3 in that first line, not a vec4. This is because we’re only overwriting the values of the first three components of the color vector: r, g, and b. We do this by writing COLOR.rgb = ... . Shaders taketh away, but they also giveth. It’s really easy to refer to, and assign to, the components of vectors, which makes it easier to write shader code when you know what you’re doing.
  • Interestingly, vectors commonly refer to both points in space and colors. Coordinates of points in space up to four dimensions are typically referred to as x, y, z, and w. Color coordinates, as we noted, are r, g, b, and a. The shader doesn’t inherently know or care what a particular vector you’ve created represents, so shader language conveniently lets you use either xyzw notation or rgba notation interchangeably. It makes sense to try to coordinate your choice with the meaning of the vector, but part of the fun of shaders is translating coordinates into colors, so that can become ambiguous depending on what you write. With the vector UV, we use UV.x and UV.y. (It’s a bit weird, since “UV” is so-named because it thinks of its coordinates as the normalized values u and v, but this is how we refer to the first and second components. It would be intuitive for UV.u to mean the same as UV.x or UV.r, which we noted both refer to the same value, but UV.u is not defined.)
  • I then just assign a value to COLOR.a. This is a single line of code that takes the maximum value over 3 smoothstep calls. Each smoothstep call essentially just checks how far the point UV is from a certain line, and blends between alpha=1 (opaque) when the UV point is on the line and alpha=0 (transparent) when it is at least distance .05 away from the line. (Remember, the total width of the rectangle is 1, so 0.05 is 5% of the width.) The lines we check are: first, the vertical line with \(x=.5\); second, the diagonal line with \(x-y=.5\); and finally, the diagonal line with \(x+y=.5\). In the code I don’t want to write an equation; rather, I’ve moved everything to the same side of the equation to obtain an expression such as \(x-y-.5\), and the second smoothstep parameter being 0 then corresponds to \(x-y-.5=0\).
  • It can be helpful sometimes to use a graphing tool like Desmos to quickly see what different equations give, but remember that in the shader, the \(y\)-coordinate increases as we move downward, which is not uncommon in computing but is not the norm for graphing calculators.
A screenshot of Desmos with the equations x = .5 {0 < y < 1}, x - y = .5 {.5 < x < 1}, and y + x = .5 {0 < x < .5} graphed.
When drawing the lines in Desmos, the arrow points down because y points up.
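
Putting the notes above together, here is the same alpha computation sketched in Python (not shader code; arrow_alpha is an illustrative name of mine), evaluated at a few sample UV points:

```python
def smoothstep(edge0, edge1, x):
    """The GLSL smoothstep formula: clamp, then cubic Hermite blend."""
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def arrow_alpha(u, v, w=0.05):
    """Opaque near any of the three lines, transparent elsewhere."""
    return max(
        smoothstep(w, 0.0, abs(u - 0.5)),      # vertical shaft: x = .5
        smoothstep(w, 0.0, abs(u - 0.5 - v)),  # diagonal: x - y = .5
        smoothstep(w, 0.0, abs(v - 0.5 + u)),  # diagonal: x + y = .5
    )

print(arrow_alpha(0.5, 0.8))  # on the shaft            -> 1.0
print(arrow_alpha(0.3, 0.2))  # on the left diagonal    -> 1.0
print(arrow_alpha(0.1, 0.9))  # far from all three lines -> 0.0
```

Each pixel only asks “how close am I to one of these three lines?”, which is exactly the generic-pixel mindset described in the intro.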

Fancy Arrows

I wanted to draw arrows to indicate the velocity of some object, and my idea was to have one in each of the four cardinal directions. Each of the four arrows would be “on” or “off” depending on whether the object’s velocity has a positive component in that direction. The following shader adds some features so that an active arrow is thicker and green, while an inactive arrow is thinner and red. It lets you choose a “facing” of 0, 1, 2, or 3, each corresponding to one of the cardinal directions:

shader_type canvas_item;

uniform bool active = false;
uniform vec4 inactive_color : source_color;
uniform vec4 active_color : source_color;
uniform float inactive_width : hint_range(0.0,0.3,.01) = .05;
uniform float active_width : hint_range(0.0, 0.3, .01) = .1;
uniform int facing : hint_range(0, 3, 1) = 0; // 0 = up, 1 = left, 2 = right, 3 = down


void fragment() {
	COLOR.rgb = active ? active_color.rgb : inactive_color.rgb;
	float line_width = active ? active_width : inactive_width;
	vec2 uv = UV;
	if (facing == 1)
		uv.y = 1.-uv.y;
	if (facing == 2)
	{
		uv.x=1.-UV.y;
		uv.y=UV.x;
	}
	if (facing == 3) 
	{
		uv.x=UV.y;
		uv.y=1.-UV.x;
	}
		
	COLOR.a = max(smoothstep(line_width,.0,abs(uv.x-.5)), 
			  max(smoothstep(line_width,.0, abs(uv.x-.5-uv.y)), 
				  smoothstep(line_width,.0, abs(uv.y-.5+uv.x))));
}

And here are the parameters I used for that:

Screenshot showing the shader with an arrow pointing left.

Gettin’ Fancier

You might decide you want the arrow to point in any direction. An astute linear algebra student will notice that the modifications used above to point left, down, or right correspond to certain transformations: a reflection and two quarter-turn rotations. These are almost linear transformations, except the point which is “preserved” (held fixed) under the change is the middle of the texture, which is \((.5,.5)\), not the origin \((0,0)\). Due to the mirror symmetry of the arrow, the reflection gives the same result as a half turn, so altogether the three facings behave like rotating by 1, 2, or 3 “clicks” of 90 degrees.

To rotate the arrow by an arbitrary angle \(\theta\), linear algebra gives us the lovely rotation matrix \[\begin{bmatrix} \cos(\theta) & -\sin(\theta) \\ \sin(\theta) & \cos(\theta) \end{bmatrix}\]. To represent this matrix in the Godot shader language, we need to use the mat2 type. We perform this rotation through matrix multiplication in the code below.

[Mathy Side Note] My friend, multiplying by that matrix gives you the angle sum formulas. If you’re familiar with matrix multiplication, try applying it to the vector \((\cos\alpha, \sin\alpha)\): you get \((\cos(\alpha+\theta), \sin(\alpha+\theta))\). That’s why this matrix performs a rotation; it’s literally just giving you a new vector corresponding to adding the angles.
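
If you’d rather check numerically than multiply symbols, here is a small Python sketch of that side note (rotate is my own illustrative helper, not shader code):

```python
import math

# Multiplying (cos a, sin a) by the rotation matrix for angle t
# should give (cos(a+t), sin(a+t)): the angle sum formulas.
def rotate(vec, t):
    x, y = vec
    return (math.cos(t) * x - math.sin(t) * y,
            math.sin(t) * x + math.cos(t) * y)

a, t = 0.3, 0.7
rx, ry = rotate((math.cos(a), math.sin(a)), t)
print(abs(rx - math.cos(a + t)) < 1e-12)  # True
print(abs(ry - math.sin(a + t)) < 1e-12)  # True
```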

However, the rotation matrix only rotates around the origin. As noted, we need to rotate around the point \((.5,.5)\). Fortunately, it is as easy as first subtracting that point from UV, then rotating, then adding back. Geometrically, we shift our desired center to the origin, rotate, then shift back.
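
Here is that shift-rotate-shift recipe sketched in Python (an illustrative helper of mine, not shader code), confirming that the center stays fixed:

```python
import math

# Rotate (u, v) by theta around the texture center (cx, cy):
# shift the center to the origin, rotate, then shift back.
def rotate_about_center(u, v, theta, cx=0.5, cy=0.5):
    x, y = u - cx, v - cy
    rx = math.cos(theta) * x - math.sin(theta) * y
    ry = math.sin(theta) * x + math.cos(theta) * y
    return (rx + cx, ry + cy)

# The center is preserved, and a quarter turn sends (1, .5) to (.5, 1).
print(rotate_about_center(0.5, 0.5, 1.234))        # (0.5, 0.5)
u, v = rotate_about_center(1.0, 0.5, math.pi / 2)
print(round(u, 6), round(v, 6))                    # 0.5 1.0
```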

shader_type canvas_item;

uniform vec4 active_color : source_color;
uniform float active_width : hint_range(0.0, 0.3, .01) = .1;
uniform float angle : hint_range(0.0, 6.28318, .01) = 0.0;

void fragment() {
	COLOR.rgb = active_color.rgb;
	float line_width = active_width;
	mat2 rotation = mat2(vec2(cos(angle), -sin(angle)), vec2(sin(angle), cos(angle)));
	vec2 uv = (UV-vec2(.5))*rotation + vec2(.5);

	COLOR.a = max(smoothstep(line_width,.0,abs(uv.x-.5)), 
			  max(smoothstep(line_width,.0, abs(uv.x-.5-uv.y)), 
				  smoothstep(line_width,.0, abs(uv.y-.5+uv.x))));
	COLOR.a -= smoothstep(.5,.52, length(uv - vec2(.5)));
}

At the end, we also added one extra line. Since the arrow now won’t have its “tip” at the edge of the texture, this rotation reminds us that we haven’t “restricted” those lines in any way: the shader is really trying to draw the full lines, so you get a kind of “x” or asterisk shape as they continue past the arrow point. So we add another smoothstep to cut them off at the point of the arrow by making anything too far from the center transparent.
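
Sketched in Python (illustrative, not shader code), that cutoff term behaves like this:

```python
import math

def smoothstep(edge0, edge1, x):
    """The GLSL smoothstep formula: clamp, then cubic Hermite blend."""
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def cutoff(u, v):
    """Amount subtracted from alpha, based on distance from the center."""
    d = math.hypot(u - 0.5, v - 0.5)  # length(uv - vec2(.5)) in the shader
    return smoothstep(0.5, 0.52, d)   # 0 inside the circle, 1 outside

print(cutoff(0.5, 0.5))  # center -> 0.0 (nothing subtracted)
print(cutoff(0.0, 0.0))  # corner -> 1.0 (fully cut off)
```

Note that the edges run the “usual” direction here (0.5 below 0.52), so the value ramps up with distance, and subtracting it erases everything outside the inscribed circle.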

More Stupid Tricks?

One of the best questions to ask is “where can we go from here; how can we push this?”

How can you tweak the above to rotate in the other direction? What if we want the “four cardinal” version to have a background that is, say, a highly transparent black? Or if we want the “any direction” arrow to have a circular translucent background, what changes can we make? What other fun things can we do with this? What other natural extensions are there? What about arrows not passing through the center?

Conclusion & Discussion

Is it stupid to try to draw your images with shaders rather than importing static images? I’m no expert, but I think it depends on a few factors. If you won’t be modifying an image, except maybe scaling or translating it, then it is surely more efficient to import static images. It’s likely easier to maintain them with sprite atlases, etc. Also, in professional settings involving dedicated artists who are not coders or mathematicians, the workflow uses artist tools like Photoshop, and it doesn’t make sense to even consider something like this. But if you (a) have highly dynamic images, (b) are much more comfortable with math than with traditional avenues to art, and/or (c) will be placing rather involved shaders on top of images anyway, then maybe it makes sense to just create your images with shaders when it is easy from a work-effort perspective.

Do you have a favorite shader trick, stupid or otherwise?