Drawing and Rendering Many Cube Meshes in Unity (Part 1 of Part 1)

March 03, 2021

Shaders for Game Devs

Shaders


Title:
Shader Basics, Blending & Textures • Shaders for Game Devs [Part 1]


By: Freya Holmér


Unity – Forum

Description:
Discussions and code for drawing and rendering many cube meshes.


Overview

I have been exploring shaders as an option for efficiently generating large amounts of geometry and came across this recent talk, which covers shaders from the very beginning. It seems like a good opportunity to get a better understanding of what they are and of the cases where they are worth using.

Intro to Shaders

Shaders: code that runs on the GPU, in their truest form.
This was their answer for the simplest way of explaining what a shader is from a game development point of view, and I liked it as a foundation for my own understanding. Textures, normal maps, bump maps, etc. are all examples of data that can be fed to a shader as input. The shader then combines that input with its own code to determine how to visualize and render the result.

Fresnel Shader: as a surface faces away from you, you get a stronger light.
It often looks like an outline-type effect, but it is not an outline effect: it highlights the parts of a surface that are turned away from your view. As the angle between the camera’s view direction and the surface becomes very shallow (a grazing angle), the effect gets stronger. This is just one commonly used type of shader.
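As a rough sketch of the idea (my own illustration, not code from the talk), a fresnel term boils down to comparing the surface normal against the view direction inside the fragment shader. Here i.normal and i.worldPos are assumed interpolator names holding the world-space normal and position.

    // Fresnel sketch for a Unity fragment shader (illustrative only).
    float3 viewDir = normalize(_WorldSpaceCameraPos - i.worldPos);    // surface toward camera (Unity built-in)
    float fresnel  = 1 - saturate(dot(normalize(i.normal), viewDir)); // 0 facing the camera, 1 at grazing angles
    fresnel = pow(fresnel, 4);                                        // optional: tighten the rim
    return float4(fresnel, fresnel, fresnel, 1);                      // brighter where the surface turns away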

Structures of a Shader

Structure of a shader within Unity (each part with a description and the language or tool used to modify it); a minimal skeleton example follows the outline:

Shader
– Properties (input data) [ShaderLab]
  – Colors
  – Values
  – Textures
  – Mesh
  – Matrix4x4 (transform data: where it is, how it’s rotated, how it’s scaled)
– SubShader (a shader can have multiple) [ShaderLab]
  – Pass (render/draw pass; can have multiple)
    – Vertex Shader [HLSL]
    – Fragment Shader (“Pixel” Shader) [HLSL]
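To make the outline concrete, here is a minimal skeleton of an unlit Unity shader showing where each piece lives. The shader name and the _Color/_MainTex properties are placeholders of my own, not something from the talk.

    Shader "Unlit/SkeletonExample" // hypothetical name, for illustration only
    {
        Properties // input data, written in ShaderLab
        {
            _Color ("Color", Color) = (1,1,1,1)
            _MainTex ("Texture", 2D) = "white" {}
        }
        SubShader
        {
            Pass // a single render/draw pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                sampler2D _MainTex;
                float4 _Color;

                struct appdata { float4 vertex : POSITION; float2 uv : TEXCOORD0; };
                struct v2f     { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

                // Vertex shader: runs once per vertex, outputs a clip-space position.
                v2f vert (appdata v)
                {
                    v2f o;
                    o.pos = UnityObjectToClipPos(v.vertex); // local space -> clip space (MVP)
                    o.uv = v.uv;                            // pass data along to the fragment shader
                    return o;
                }

                // Fragment ("pixel") shader: runs once per fragment, outputs a color.
                float4 frag (v2f i) : SV_Target
                {
                    return tex2D(_MainTex, i.uv) * _Color;
                }
                ENDCG
            }
        }
    }

The Mesh and Matrix4x4 inputs do not appear in the file itself; Unity supplies them from the renderer when the object is drawn.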

Vertex Shader

This deals with all the vertices of the mesh, similar to a foreach loop that runs over every vertex you have. One of the first and most fundamental jobs of a vertex shader is placing the vertices. Shaders do not particularly care about world space, though; they ultimately output positions in clip space, a normalized coordinate space (roughly -1 to 1 across the screen) that determines where vertices land on screen. This is often done simply by taking the local-space coordinates and transforming them with an MVP (model-view-projection) matrix to convert them to clip space, and you are done.

The vertex shader is often used to animate water or to make grass and foliage sway in the wind; it is a common place to add movement or animation. They mention that UV coordinates can be manipulated in either the vertex shader or the fragment shader, but if something can be done in the vertex shader, it should be done there first. All you really do here is set the positions of the vertices and pass data along to the fragment shader.
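As a rough sketch of that idea (my own example, not code from the talk), the vertex shader can offset positions with a time-based sine wave before doing the usual clip-space transform. _SwayStrength is an assumed property, and appdata/v2f match the skeleton above.

    float _SwayStrength; // hypothetical sway amount, exposed through the Properties block

    v2f vert (appdata v)
    {
        v2f o;
        float3 worldPos = mul(unity_ObjectToWorld, v.vertex).xyz;                 // world position for per-object variation
        v.vertex.x += sin(_Time.y * 2 + worldPos.x) * _SwayStrength * v.vertex.y; // higher parts of the mesh sway more
        o.pos = UnityObjectToClipPos(v.vertex);                                   // then convert to clip space as usual
        o.uv = v.uv;
        return o;
    }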

Fragment Shader

This is similar to a foreach loop that runs over each fragment. A pixel usually refers directly to a pixel being rendered on the screen, which a fragment does not always correspond to; however, the two commonly overlap, which is why some call this a pixel shader. The job here generally comes down to deciding what color to output for every fragment or pixel. The vertex shader always runs before the fragment shader, and data can be passed from the vertex shader to the fragment shader, but not vice versa.
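A minimal sketch (my own illustration): the fragment shader only decides a color, here by visualizing the UV data that the vertex shader passed along through the v2f struct.

    // Fragment shader: output one color per fragment.
    float4 frag (v2f i) : SV_Target
    {
        return float4(i.uv, 0, 1); // interpolated UVs shown as red/green
    }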

Shaders vs. Materials

Mesh and Matrix4x4 are normally supplied by your mesh renderer component (or something like it), whereas colors, values, and textures are something you must define yourself. These properties are generally defined with materials: the material holds these parameters, which are then passed in to the shader. You never “add a shader to an object” in Unity; it is effectively done by adding a material, which in turn references the shader to be used. You can think of a material as a preconfigured set of parameters used when rendering something with a shader. You can have multiple materials that all use the same shader but have different input properties.
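For example (a sketch with placeholder property names), the Properties block is exactly what a material exposes in the inspector; several materials can reference the same shader while each storing its own values for these slots.

    Properties
    {
        _Color      ("Tint", Color)            = (1,1,1,1)  // each material stores its own tint
        _MainTex    ("Albedo", 2D)             = "white" {} // and its own texture reference
        _Glossiness ("Smoothness", Range(0,1)) = 0.5
    }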

via Blogger http://stevelilleyschool.blogspot.com/2021/03/drawing-and-rendering-many-cube-meshes_3.html

Procedural Generation – Far Cry 5 GDC Content Generation

Procedural Generation – GDC 2018 Talk – Procedural World Generation of ‘Far Cry 5’

By: Etienne Carrier

GDC 2018 – Procedural World Generation of ‘Far Cry 5’ – GDC Vault Link

A GDC talk in which Etienne details a large-scale tool suite, built with procedural generation techniques, that helped create the world of Far Cry 5. It covers both the user-facing side (for world editors/designers) and the behind-the-scenes processes that make everything possible. They use Dunia and Houdini together to build these tools.