Unity Shader Graph – Energy Shader – by Casey Farina

August 9, 2021

Shader Graph

Unity


Title:
ART 200: Creating an energy Shader in Shader Graph in Unity

By:
Casey Farina


Youtube – Tutorial

Description:
Tutorial on creating an energy effect on a surface with Unity’s Shader Graph.


Introduction

This tutorial covers the creation of a shader in Unity's Shader Graph that coats the surface of an object with a glowy, plasma-like effect. Later in the tutorial they show how to make the less pronounced parts of the effect completely transparent, which looks cool on its own and also lets the shader be layered over other effects or surfaces, with the gaps showing through to whatever is below.

Fig. 1: My Results of the Energy Shader Along with Shader Graph View

Quick Notes

HDR Color and Emission

The tutorial starts by feeding an HDR color with significant intensity into Emission to create a glowing, radiant effect.

Voronoi Noise in Layers

Voronoi noise is used to create a moving-globule effect within the energy, with a Contrast node applied to make the dots more distinct.

They then create two Voronoi noise nodes with different cell densities and combine them with a Blend node, giving a mix of tiny particles and larger shapes moving throughout the material.

Transparency and Alpha Clip Threshold

By setting the Surface type to Transparent in the Graph Inspector, feeding the Blend result into the now-available Alpha input on the Fragment node, and setting a reasonable Alpha Clip Threshold (0.1 is usually a good start), you get an effect where the dimmer parts of the energy shader become fully transparent.

Summary

This tutorial helped me make a simple yet effective energy shader that works decently well on multiple surfaces. The extra segment on adding transparency to the effect really took it up a level for me and opened up a lot of uses. Transparency also expands the effect's options: it could be applied to a larger mesh surrounding an object to create a kind of aura around it, and it makes layering with other effects easier.

They also covered a short segment on using the Gradient node for some stranger effects. That part didn't particularly appeal to me, so I didn't try it myself. It could add more variety to the tool, but it just wasn't something I wanted right now.

My result can be seen in action below with a few variations in speed of the effect and color!

via Blogger http://stevelilleyschool.blogspot.com/2021/08/unity-shader-graph-energy-shader-by.html

Unity Shader Graph – Signed Distance Fields – Update with Subgraph Fix

June 24, 2021

Shader Graph

Unity


Title:
Drawing Boxes and Rectangles in URP Shader Graph with 2D SDFs! 2021.1 | Unity Game Dev Tutorial

By:
Ned Makes Games


Youtube – Tutorial

Description:
Exploration into calculating signed distance fields and using them with Unity’s Shader Graph.


Title:
Rectangle SDFs and Procedural Bricks! Video Production Stream | Unity Game Dev Livestream

By:
Ned Makes Games 2


Youtube – Tutorial

Description:
The full stream that most of the previous tutorial is pulled from; useful for any more in-depth questions about it.


Overview

When I visited this tutorial yesterday I ran into an issue in Unity 2021.1.3 where working with subgraphs was extremely buggy and error prone. I had seen online that later versions potentially fixed the issue, so I downloaded the latest version, 2021.1.12, and this did indeed fix it for me, making the tutorial much easier to follow along with.

This tutorial was mostly just looking at the subgraphs and shader graphs they built and following along to build them out myself. It was still a decent learning experience for getting familiar with the workflow of setting up subgraphs for shader graphs, as well as for using a lot of the math nodes.

Helper Scripts to Show Off Shader

Along with building the shader, they made two simple scripts to make the shader a bit interactive and more flexible.

SetPoints

This class was responsible for letting the user move the two points dictating the general length of the rectangle shape by clicking and dragging. It did not work immediately for me, however, as they were using a helper class named MousePointer that I did not have.

I was able to get a similar result by replacing their process of getting the point:


var p = MousePointer.GetWorldPosition(Camera.main);



with my replacement:


var p = Camera.main.ScreenToWorldPoint(new Vector3(Input.mousePosition.x, Input.mousePosition.y, distanceToPlane));

distanceToPlane is a user-set value for the distance from the camera to the flat plane facing the camera that is used to test the shader. As long as that same distance is used for the z-value of ScreenToWorldPoint, the points move in exact correspondence with where the user drags them.
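For reference, a minimal sketch of how this can be wired up is below. It is not the tutorial's SetPoints script: the click-to-drag handling and the shader property names ("_PointA", "_PointB") are my assumptions, with the ScreenToWorldPoint replacement from above dropped in.

using UnityEngine;

// Minimal sketch: drag two points with the mouse and push them into the shader.
// The property names "_PointA" and "_PointB" are assumptions, not the tutorial's.
public class SetPointsSketch : MonoBehaviour
{
    [SerializeField] private Material material;           // material using the SDF shader graph
    [SerializeField] private float distanceToPlane = 10f; // camera-to-test-plane distance

    private Vector3 pointA;
    private Vector3 pointB;

    private void Update()
    {
        // Convert the mouse position to a world position on the test plane.
        var p = Camera.main.ScreenToWorldPoint(
            new Vector3(Input.mousePosition.x, Input.mousePosition.y, distanceToPlane));

        // Left mouse drags one point, right mouse drags the other.
        if (Input.GetMouseButton(0)) pointA = p;
        if (Input.GetMouseButton(1)) pointB = p;

        // Pass the points to the shader graph properties by their exact string names.
        material.SetVector("_PointA", pointA);
        material.SetVector("_PointB", pointB);
    }
}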

RectangleThicknessByScrollWheel

This class lets the user control the thickness, or width, of the rectangle with the scroll wheel, and it worked directly as shown in the tutorial.

General Notes with Scripts Interacting with ShaderGraph Properties

Integrating the scripts with the Shader Graph properties was actually pretty easy, and it works similarly to working with the Animator in Unity. You just use methods like SetFloat() with two parameters: the exact string name of the property you want to set, and the value you are passing into that property. It is worth noting this is all accessed through the Material; there is no separate Shader Graph object that needs to exist or anything like that.
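As an illustration of that pattern, here is a rough sketch of what a thickness-by-scroll-wheel script can look like. It is not the tutorial's RectangleThicknessByScrollWheel class, and the "_Thickness" property name is an assumption; use the exact name from your graph's Blackboard.

using UnityEngine;

// Sketch of driving a Shader Graph float property from the scroll wheel.
public class ThicknessByScrollWheelSketch : MonoBehaviour
{
    [SerializeField] private Material material;
    [SerializeField] private float scrollSensitivity = 0.1f;

    private float thickness = 0.5f;

    private void Update()
    {
        thickness += Input.mouseScrollDelta.y * scrollSensitivity;
        thickness = Mathf.Max(0f, thickness);

        // Same pattern as the Animator: exact property name string, then the value.
        material.SetFloat("_Thickness", thickness);
    }
}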

An example of my implementation of the tutorial can be seen below.


Unity Shader Graph: SDF Rainbow Pulse from Tutorial by NedMakesGames from Steve Lilley on Vimeo.

Video Example of my Following of the Pulse Shader in the Ned Makes Games Tutorial

via Blogger http://stevelilleyschool.blogspot.com/2021/06/unity-shader-graph-signed-distance_24.html

Unity Shader Graph – Signed Distance Fields and Subgraph Errors

June 23, 2021

Shader Graph

Unity


Title:
Drawing Boxes and Rectangles in URP Shader Graph with 2D SDFs! 2021.1 | Unity Game Dev Tutorial

By:
Ned Makes Games


Youtube – Tutorial

Description:
Exploration into calculating signed distance fields and using them with Unity’s Shader Graph.


Overview

This shader tutorial quickly explores calculating signed distance fields and using that for interesting effects. These effects were built in HLSL in the tutorial video originally, but they also show how these can be implemented with Unity’s Shader Graph system. I wanted to use the Shader Graph approach, but unfortunately I found that Unity’s Shader Graph Subgraphs have some major issues.

Signed Distance Fields (SDFs)

Signed Distance Fields (SDFs): calculate the distance from any arbitrary point to a specific shape

Principles of Calculating

To start, they look at an example using a rectangle whose center is at the origin (0, 0).

First, they find the distance from the point, p, to the center of the rectangle, which is just the length of the Vector2 p because the center is at the origin.

Then, using the symmetry of the rectangle, the absolute value of point, p, and the half dimensions of the rectangle are used to determine the distance of the point to any corner of the rectangle.

To get the positive (outside) results, they take the vector from the corner of the rectangle to the absolute value of point p, convert any negative components to 0, and take its length.

Since the core of an SDF is that it is signed (a point inside the shape returns a negative value and a point outside returns a positive value), they expand the formula to handle negative distances. The vector d, from the corner of the rectangle to the absolute value of point p, corresponds to a point inside the shape only when both of its components are negative.

When both components of d are negative, the previous step already evaluates to 0, so they add a second term that returns a negative value in that case. Using min(max(d.x, d.y), 0) gives that distance, because a point inside the rectangle is exactly as far inside as its distance to the nearest wall (or to both walls equally when those values are identical). This is also why there is no rounded effect inside the rectangle.

Moving the rectangle’s center from the origin just requires an additional offset argument.

Rotation then requires another argument and some rotation matrix math (something I covered in my investigation into changing vector spaces).
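To tie these steps together, here is a minimal C# sketch of the rectangle SDF described above. The tutorial builds this in HLSL and Shader Graph; the function and parameter names here are my own.

using UnityEngine;

public static class RectangleSdfSketch
{
    // Signed distance from point p to a rectangle with the given center, half size,
    // and rotation (in radians). Negative inside the rectangle, positive outside.
    public static float Distance(Vector2 p, Vector2 center, Vector2 halfSize, float rotation)
    {
        // Offset: move the point into the rectangle's local space.
        p -= center;

        // Rotation: rotate the point by the inverse of the rectangle's rotation.
        float c = Mathf.Cos(rotation);
        float s = Mathf.Sin(rotation);
        p = new Vector2(c * p.x + s * p.y, -s * p.x + c * p.y);

        // Symmetry: fold the point into one quadrant and measure against the corner.
        Vector2 d = new Vector2(Mathf.Abs(p.x), Mathf.Abs(p.y)) - halfSize;

        // Outside distance: length of d with negative components clamped to 0.
        float outside = new Vector2(Mathf.Max(d.x, 0f), Mathf.Max(d.y, 0f)).magnitude;

        // Inside distance: distance to the nearest wall, negative, and 0 when outside.
        float inside = Mathf.Min(Mathf.Max(d.x, d.y), 0f);

        return outside + inside;
    }
}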

Unity Problem with Subgraphs

While following along to mimic their shader graphs, I came across a Unity issue that shows up especially when working in Sub Graphs. When creating properties and moving those property nodes around, Unity consistently runs into ArgumentNullException errors, which completely shut the graph down and prevent any further progress until it is closed and reopened. Apparently Unity versions 2021.2 and up may work better with this, so I will have to look into more Unity versions in the future.

via Blogger http://stevelilleyschool.blogspot.com/2021/06/unity-shader-graph-signed-distance.html

Unity Shader Graph – Liquid Effect by Gabriel Aguiar Prod.

June 18, 2021

Shader Graph

Unity


Title:
Unity Shader Graph – Liquid Effect Tutorial

By:
Gabriel Aguiar Prod.


Youtube – Tutorial

Description:
Quick Shader Graph tutorial exploring interesting effects of a moving shader based on pseudo-physics.


Overview

This shader tutorial looked like a great way to extend my knowledge of Unity's Shader Graph, since it appears to have some neat and unique mechanics. Having the shader respond to changes in the object's position and rotation explores ways to make shaders follow physics-like rules and create visual effects that mimic physical phenomena. Many of this user's tutorials also include setting up the initial models, but I think this effect can work decently with simple shapes I can easily make with ProBuilder.

via Blogger http://stevelilleyschool.blogspot.com/2021/06/unity-shader-graph-liquid-effect-by.html

Coding Adventure: Ray Marching by Sebastian Lague

April 12, 2021

Ray Marching

Unity


Title:
Coding Adventure: Ray Marching


By: Sebastian Lague


Youtube – Tutorial

Description:
Introduction to the concept of ray marching with some open available projects.


Overview

This video covers some of the basics of ray marching while visualizing the approach and creating some interesting visual effects and renders using signed distance math and ray marching logic. The main ray marching method they show is sphere tracing: from the current point, a sphere is grown out until it touches the nearest surface, the point then moves along the ray direction by that sphere's radius, and another sphere is emitted. The process repeats until the sphere's radius falls below a very small threshold, at which point a collision is registered.
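To make the stepping logic concrete, here is a rough C# sketch of a sphere-tracing loop. This is not Sebastian Lague's code (in the video this runs inside a shader); it just expresses the general idea against an arbitrary signed distance function.

using System;
using UnityEngine;

public static class SphereTracingSketch
{
    // March along the ray until the signed distance field says we are within hitThreshold
    // of a surface, or until we give up. Returns the hit point, or null on a miss.
    public static Vector3? March(Vector3 origin, Vector3 direction, Func<Vector3, float> sdf,
                                 int maxSteps = 100, float hitThreshold = 0.001f, float maxDistance = 100f)
    {
        float traveled = 0f;
        for (int i = 0; i < maxSteps; i++)
        {
            Vector3 p = origin + direction * traveled;
            float dist = sdf(p);                // radius of the largest "safe" sphere at p

            if (dist < hitThreshold) return p;  // sphere is tiny enough: call it a collision
            traveled += dist;                   // safe to step forward by that sphere's radius

            if (traveled > maxDistance) break;  // ray escaped the scene
        }
        return null;
    }
}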

The resulting project made is available, and I think it would be very useful and interesting to explore. The Youtube video description also holds many links to various sources used to create all the tools and effects in the video, which could also be beneficial for further research into these topics.

Fig. 1: Example of Raytracing Visual from Video (by Sebastian Lague)

via Blogger http://stevelilleyschool.blogspot.com/2021/04/coding-adventure-ray-marching-by.html

Drawing and Rendering Many Cube Meshes in Unity (Part 1 of Part 1)

March 03, 2021

Shaders for Game Devs

Shaders


Title:
Shader Basics, Blending & Textures • Shaders for Game Devs [Part 1]


By: Freya Holmér


Youtube – Tutorial

Description:
Talk introducing shader basics, blending, and textures for game developers.


Overview

I have been exploring shaders as an option for efficiently generating large amounts of geometry and came across this recent talk covering shaders all the way from the beginning. This seems like a good opportunity to at least get a better understanding of what they are and good cases to look into using them.

Intro to Shaders

Shaders: code that runs on the GPU in their truest form
This was their answer for the simplest way to explain what a shader is from a game development point of view, and I liked it as a good starting foundation for my understanding. Textures, normal maps, bump maps, etc. are all examples of data that can be used as input for shaders; the shader then uses that input along with its code to determine how to visualize and render the result.

Fresnel Shader: as a surface faces away from you, you get a stronger light.
It often looks like an outline-type effect, but it is not an outline effect. It highlights features that are turning away from your view: as the angle between the camera and a surface becomes very shallow, the effect gets stronger. This is just a commonly used type of shader.

Structures of a Shader

Structure within Unity (Description) [Language or Tool to Modify]:

Shader
– Properties (input data) [ShaderLab]
  – Colors
  – Values
  – Textures
  – Mesh
  – Matrix4x4 (transform data: where it is, how it's rotated, how it's scaled)
– SubShader (can have multiple in a single shader) [ShaderLab]
  – Pass (render/draw pass; can have multiple)
    – Vertex Shader [HLSL]
    – Fragment Shader ("Pixel" Shader) [HLSL]

Vertex Shader

This deals with all the vertices of the mesh; it is similar to a foreach loop that runs over every vertex you have. One of the first common tasks in a vertex shader is placing the vertices. Shaders, however, do not particularly care about world space; they generally deal with positions in clip space, normalized coordinates that determine where vertices land on the screen. This is often as simple as taking the local space coordinates and transforming them with an MVP matrix to convert them to clip space, and you are done.
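To make that transform chain explicit, the same model-view-projection product can be written out with Unity's C# matrix API. This is only an illustrative sketch of the math; a real vertex shader does this on the GPU with built-in transforms.

using UnityEngine;

public class MvpSketch : MonoBehaviour
{
    private void Update()
    {
        // Model, view, and projection matrices for this object and the main camera.
        Matrix4x4 m = transform.localToWorldMatrix;
        Matrix4x4 v = Camera.main.worldToCameraMatrix;
        Matrix4x4 p = Camera.main.projectionMatrix;
        Matrix4x4 mvp = p * v * m;

        // A local-space vertex position run through the same chain a vertex shader uses.
        Vector3 localPos = Vector3.zero;
        Vector4 clipPos = mvp * new Vector4(localPos.x, localPos.y, localPos.z, 1f);

        Debug.Log($"Clip-space position of the object's origin: {clipPos}");
    }
}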

The vertex shader is often used to animate water or sway grass and foliage in the wind; it is commonly where movement and animation are added. They mention that vertex UV coordinates can be manipulated in either the vertex shader or the fragment shader, but if the work can be done in the vertex shader, it should be done there first. All you really do here is set the positions of vertices or pass data along to the fragment shader.

Fragment Shader

This is similar to a foreach loop that runs over each fragment. A pixel usually refers directly to a pixel being rendered on the screen, which is not always what a fragment shader is dealing with; however, the two commonly overlap, which is why some call this a pixel shader. The fragment shader generally comes down to determining what color to output for every fragment or pixel. The vertex shader always runs before the fragment shader, and data can be passed from the vertex shader to the fragment shader, but not vice versa.

Shaders vs. Materials

The Mesh and Matrix4x4 are normally supplied by your mesh renderer component (or something similar), whereas colors, values, and textures are something you must define yourself. These properties are generally defined with materials: the material contains the parameters, which are then passed in to the shader. You never "add a shader to an object" in Unity; you add a material, which in turn references the shader to be used. You can think of materials as preconfigured parameters for rendering something with a shader, and you can have multiple materials that all use the same shader but have different input properties.
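As a small sketch of that last point, here are two materials sharing one shader but carrying different parameters. The shader and property names ("Universal Render Pipeline/Lit", "_BaseColor") are just URP defaults used as an example, not anything from the talk.

using UnityEngine;

public class SharedShaderSketch : MonoBehaviour
{
    private void Start()
    {
        // One shader...
        Shader lit = Shader.Find("Universal Render Pipeline/Lit");

        // ...two materials with different preconfigured parameters.
        var redMat = new Material(lit);
        redMat.SetColor("_BaseColor", Color.red);

        var blueMat = new Material(lit);
        blueMat.SetColor("_BaseColor", Color.blue);
    }
}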

via Blogger http://stevelilleyschool.blogspot.com/2021/03/drawing-and-rendering-many-cube-meshes_3.html

Drawing and Rendering Many Cube Meshes in Unity

March 02, 2021

Drawing and Rendering Meshes

Unity


Title:
Drawing 1 Million cubes!


Unity – Forum

Description:
Discussions and code for drawing and rendering many cube meshes.


Overview

I wanted to replicate the Gizmos.DrawCube visuals I am using to portray data for my architecture project, both in the game scene and eventually in builds of the project. To do this I need a very lightweight way to draw many small cube meshes, so I looked into drawing/rendering my own meshes and shaders. The option I came across in a Unity forum, drawing and rendering meshes on Update, seemed decent enough that I investigated a simpler version for my needs.

Implementation

I could get away with a much simpler version of the CubeDrawer script found in the forum comments, stripping out the batching and the randomization since I need very specific cubes and nowhere near the million cubes they were rendering. I am normally looking at somewhere in the thousands to tens of thousands, at very specific locations.

So I stripped the script down and tweaked it for my needs so I could feed the position and color data I was already generating from my node grids and heatmaps into the simpler CubeDrawer and have it build and render the cubes. It gave me the visual results I wanted, but with a significant performance cost: the AI agents had stuttery movement and the camera control lagged a bit. I'll need to investigate ways to reduce the performance hit, but it's a step in the right direction.
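For context, the general shape of this kind of drawer looks roughly like the sketch below. It is not the forum's CubeDrawer or my stripped-down version, just a minimal example that re-draws a list of cubes each frame with GPU instancing (per-cube colors would additionally need a MaterialPropertyBlock).

using System.Collections.Generic;
using UnityEngine;

// Minimal sketch: re-draw a set of cube meshes every frame without creating GameObjects.
// Assumes a cube Mesh and a Material with "Enable GPU Instancing" turned on.
public class SimpleCubeDrawerSketch : MonoBehaviour
{
    [SerializeField] private Mesh cubeMesh;
    [SerializeField] private Material cubeMaterial;
    [SerializeField] private float cubeSize = 0.25f;

    public List<Vector3> positions = new List<Vector3>();

    private void Update()
    {
        // DrawMeshInstanced accepts at most 1023 matrices per call, so submit in batches.
        var matrices = new List<Matrix4x4>(1023);
        foreach (Vector3 pos in positions)
        {
            matrices.Add(Matrix4x4.TRS(pos, Quaternion.identity, Vector3.one * cubeSize));
            if (matrices.Count == 1023)
            {
                Graphics.DrawMeshInstanced(cubeMesh, 0, cubeMaterial, matrices);
                matrices.Clear();
            }
        }
        if (matrices.Count > 0)
            Graphics.DrawMeshInstanced(cubeMesh, 0, cubeMaterial, matrices);
    }
}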

via Blogger http://stevelilleyschool.blogspot.com/2021/03/drawing-and-rendering-many-cube-meshes.html

Course on HLSL – Shader Fundamentals

September 26, 2019

Intro to HLSL Shaders

Shader Tutorials

80 lv – Series on HLSL Shader Fundamentals

Article

By: 80.lv


Introduction – HLSL Shader Creation 1 – HLSL Shader Fundamentals

Tutorials – Youtube List

By: Ben Cloward


The first link leads to the YouTube playlist that contains the full course on HLSL shader fundamentals. This may be exactly what I was looking for (although the course was apparently first published in 2007, so some of the information is a bit dated). Regardless of age, it is supposed to be a very good rundown of HLSL and shaders in general, and it will help me understand the basics of a shader language while I continue to learn about Unity's Shader Graph separately.

Intro to Unity’s Shader Graph

September 13, 2019

Unity Shader Graph

Intro Tutorial

Youtube – Basics of Shader Graph – Unity Tutorial

Tutorial #1

By: Brackeys


This tutorial introduces you to Unity’s shader graph, so it starts with the bare minimum and goes over some simple features to get you started. Messing with vertex colors got me back into shaders in Unity, and shader graphs provide a really simple way to get some interesting visual effects when I need a break from trying to figure out shader coding.

I already ran into a lot of snags that kept things from working initially. I missed the step at the beginning of the tutorial where they change the Unity project template to Lightweight RP as they're creating the project. I was able to get around this by doing what they did in their vertex color tutorial, where I created a Lightweight Render Pipeline Asset and dragged that into the Graphics tab of the Project Settings.

I then found out that Unity does not like it when you change the name of a shader. I expected this as a possible issue since scripts need to have some of the coding changed to have the class name match the file name, so this seemed like it could be a similar problem. Since the material/shader still was not working, I simply deleted the shader and made a new one with the proper name and everything was finally displaying as a non-error material.

The general effect they were going for was a glow, so they started with a Fresnel Effect node in the shader graph. This produces exactly that sort of glowing effect, with a power value that controls how glowy the material is. We then explored applying colors to node effects like this one, which is done by creating a Color node and multiplying it together with the node you intend to color.

There is an interesting feature in the shader graph that lets you open the generated code and see what it is doing behind the scenes. This could be useful for exploring the shader code behind the vertex color node I was interested in checking out.

Initially, everything is only controllable within the shader graph, with no way to edit anything directly in the Unity Inspector. You can change this by right-clicking a node and selecting "Convert to Property". You can still edit it within the shader graph, but it is now found in the Blackboard. This also exposes the property on your material, so you can have differently colored/affected objects using the same base shader graph.

Finally, they just quickly show that the shader graph can accept and apply textures as well. They just do a simple occlusion example to show this off.

My Final Example Following Tutorial

Vimeo – See Animated Shader

Model: Free Asset from Polygonal Series by Meshtint

PROBLEMS

The shader graph was a bit finicky for me and had weird issues. Some of the previews within the shader graph did not appear for me; I tried changing up the graph and opening/closing the shader, but they remained gone. I will have to try making a project with the Lightweight RP template from the initial setup to see if that changes any settings that affect this.

Once I promoted the color node to a property, it was hard to control the final color on the object. Sometimes changing the color in the Blackboard would be reflected on the object and sometimes it wouldn't, and editing the color on the material's exposed property would also only sometimes change it. I have not found exactly when control passes from being editable in the shader graph to being editable in the material (or sometimes not editable at all), but reapplying the material did seem to fix these issues at times.

NEXT STEP

More shader graph tutorials would be nice to check out just to get a basic grasp of how to use it to get some nice effects pretty quickly, although I’d still like to look into the coding of shaders as well. I would also be interested in looking how to tie them together, and control some of the parameters of the shader graph through script.

Brackeys Tutorials: Terrain Generation and Vertex Colors (Cont.)

September 12, 2019

Graphics Tutorials

Mesh Generation/Alteration and Vertex Colors

Youtube – MESH GENERATION in Unity – Basics

Tutorial #1

By: Brackeys


Youtube – PROCEDURAL TERRAIN in Unity! – Mesh Generation

Tutorial #2

By: Brackeys


Youtube – MESH COLOR in Unity – Terrain Generation

Tutorial #3

By: Brackeys


Tutorial #3

Tutorial #3 covers UVs and applying textures to surfaces, as well as vertex colors and shaders in Unity. It starts with UVs and applying textures, and the main takeaway is that you want to normalize your UV vectors so they fall within the range [0, 1], since textures are sampled in that normalized space when working with UVs.
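For the generated grid meshes from the earlier tutorials, that normalization is just dividing each vertex position by the grid dimensions. A rough sketch is below; xSize and zSize are assumed to be the grid dimensions used when generating the vertices.

using UnityEngine;

public static class GridUvSketch
{
    // Assign normalized UVs to a generated grid mesh so each coordinate falls in [0, 1].
    // Assumes the vertices were laid out across an xSize-by-zSize area in the XZ plane.
    public static void ApplyUvs(Mesh mesh, float xSize, float zSize)
    {
        Vector3[] vertices = mesh.vertices;
        Vector2[] uvs = new Vector2[vertices.Length];
        for (int i = 0; i < vertices.Length; i++)
        {
            uvs[i] = new Vector2(vertices[i].x / xSize, vertices[i].z / zSize);
        }
        mesh.uv = uvs;
    }
}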

It then gets into setting the vertex colors. This is done by creating a Color array and setting the mesh's Mesh.colors equal to that array. We created a colors array that was the same size as our vertices array, which makes sense, since the mesh colors array needs one entry per vertex.

To set the colors, the underlying concept is that each color should correspond to the height of a vertex. This needs some more normalization: we keep track of the minimum and maximum terrain heights so we can use them as the bounds of a [0, 1] range. This matters because we are coloring with a Gradient. Unity's Gradient.Evaluate() takes a value between 0 and 1 and returns a color from the given gradient (0 being one end of the gradient, 1 being the other, and values in between being the middle colors or blends).
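Put together, the coloring step looks roughly like this sketch of the approach described above (the method and parameter names are my own).

using UnityEngine;

public static class HeightColorSketch
{
    // Color each vertex by its normalized height using a gradient.
    public static void ApplyHeightColors(Mesh mesh, Gradient gradient,
                                         float minTerrainHeight, float maxTerrainHeight)
    {
        Vector3[] vertices = mesh.vertices;
        Color[] colors = new Color[vertices.Length];

        for (int i = 0; i < vertices.Length; i++)
        {
            // Map the vertex height into [0, 1] using the tracked terrain bounds.
            float height01 = Mathf.InverseLerp(minTerrainHeight, maxTerrainHeight, vertices[i].y);
            colors[i] = gradient.Evaluate(height01);
        }

        mesh.colors = colors;
    }
}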

Once we've created the Color array and applied it to our mesh, it does not display yet, because most of Unity's standard shaders don't use vertex colors. To resolve this, they used Unity's Shader Graph to feed the vertex colors into the surface's actual color. This requires installing the Lightweight Render Pipeline and Shader Graph packages. You had to create a Lightweight Render Pipeline Asset and drag it into the Graphics section of your Project Settings, for reasons that are not explained. You then create a shader with the "PBR Graph" option to build it with Shader Graph's visual node system. Once there, we simply added a Vertex Color node and connected it to the Albedo input to apply the vertex colors.

Example Generated Terrain with Vertex Colors – by me

FURTHER TESTING

I wanted to experiment a bit more with this to see how applying it to existing meshes might work. I started simple with a basic Unity cube. I made a quick script that grabbed a reference to its mesh and ran through the entire mesh.vertices array to check exactly how many vertices there were; there were 24, as I expected (the 8 corners each have 3 vertices, one for each of the 3 different normals of the 3 intersecting faces). I then made a few public Color options that are fed into a small 3-color array and distributed to the vertices around the cube. I applied my material with the created shader, and sure enough, this produced the various vertex color effects around my cube.

Here is the script breakdown for this small test:

using UnityEngine;

public class CubeVertexColor : MonoBehaviour
{
    Mesh mesh;
    int vertexCounter = 0;

    public Color color1;
    public Color color2;
    public Color color3;

    private Color[] colorArray;
    private Color[] vertexColors;

    private void Start()
    {
        // Grab the cube's mesh from its MeshFilter.
        mesh = GetComponent<MeshFilter>().mesh;

        colorArray = new Color[] { color1, color2, color3 };
        vertexColors = new Color[24];

        // Cycle the three colors across the cube's 24 vertices.
        for (int i = 0; i < mesh.vertices.Length; i++)
        {
            vertexColors[i] = colorArray[i % 3];
            Debug.Log("The position of this vertex is: " + mesh.vertices[i]);
            vertexCounter = i + 1;
        }

        mesh.colors = vertexColors;

        Debug.Log($"There are {vertexCounter} vertices in the cube mesh.");
    }
}

Example Cube with Vertex Colors – by me

NEXT STEP

I would still like to look into an actual shader script that can give similar results for vertex colors. I have gathered some resources today that I will most likely make a post with. I was hoping this could be a helpful tool for creating objects within Unity, but it still seems like it will be easier to make them outside of Unity and apply the vertex colors in other software. This approach will still be helpful for actually displaying those colors, though, if that is the route I take.