Intro to Quadtree and Octree in Unity

January 19, 2019

Quadtrees and Octrees

Intro to Oct/Quadtree Data Structures

These sources begin to explain what quadtree and octree systems are and their potential uses in a game environment. There are a few tutorials attached as well explaining how to set them up with programming. Searching for these topics did not turn up a lot, but these few sources seem to be good starts.

Youtube – Lets Make an Octree in Unity

By: World of Zero

This is a basic approach to setting up an octree system in Unity.

Youtube – Building the Quadtree – Lets Make 2D Voxel Terrain – Part 1

By: World of Zero

This is the result of a challenge to the creator to make a Worms-style terrain destruction system. This would be approached by creating a 2D voxel system.

Youtube – Coding Challenge #98.1: Quadtree – Part 1

By: The Coding Train

This video gives a very good conceptual introduction to what a quadtree really is and its uses in programming. It then sets up how to create one in JavaScript.
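The video builds its quadtree in JavaScript; as a language-neutral companion, a minimal point quadtree might look like this Python sketch (the class shape, `capacity` default, and method names are my own, not taken from the video):

```python
class Quadtree:
    """Minimal point quadtree: a node subdivides once it exceeds capacity."""
    def __init__(self, x, y, w, h, capacity=4):
        self.x, self.y, self.w, self.h = x, y, w, h  # boundary rectangle
        self.capacity = capacity
        self.points = []
        self.children = None  # four child quadrants once subdivided

    def contains(self, px, py):
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

    def insert(self, px, py):
        if not self.contains(px, py):
            return False
        if self.children is None:
            if len(self.points) < self.capacity:
                self.points.append((px, py))
                return True
            self._subdivide()
        return any(c.insert(px, py) for c in self.children)

    def _subdivide(self):
        hw, hh = self.w / 2, self.h / 2
        x, y = self.x, self.y
        self.children = [Quadtree(x, y, hw, hh, self.capacity),
                         Quadtree(x + hw, y, hw, hh, self.capacity),
                         Quadtree(x, y + hh, hw, hh, self.capacity),
                         Quadtree(x + hw, y + hh, hw, hh, self.capacity)]
        for p in self.points:  # push existing points down into the children
            any(c.insert(*p) for c in self.children)
        self.points = []

    def query(self, qx, qy, qw, qh, found=None):
        """Collect all stored points inside the query rectangle."""
        if found is None:
            found = []
        if (qx + qw < self.x or qx > self.x + self.w or
                qy + qh < self.y or qy > self.y + self.h):
            return found  # query rectangle does not overlap this node
        for (px, py) in self.points:
            if qx <= px <= qx + qw and qy <= py <= qy + qh:
                found.append((px, py))
        if self.children:
            for c in self.children:
                c.query(qx, qy, qw, qh, found)
        return found
```

An octree is the same idea extended to 3D: eight children and box bounds instead of four children and rectangles.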

Learning Foundations of Unity Shaders

January 18, 2019

Intro to Shaders in Unity

Glitchy Man Material

Unity3D – Live Training Session: Writing Your First Shader In Unity

This tutorial was pulled from the tutorial list I created January 17th (“Assorted Unity Tutorials – Structs, Shaders, and Unity Architecture”). It not only introduced me to the core components that make up a shader in Unity, it also covered a lot of terminology and behind-the-scenes information to give me a better foundational understanding of how shaders operate.

Types of shaders you can create in Unity:
  • Surface Shaders: a code-generation approach that makes writing lit shaders easier than using low-level vertex/pixel shader programs
  • Unlit Shaders: don’t interact with Unity lights; useful for special effects
  • Image Effect Shaders: typically a postprocessing effect that reads a source image, does calculations, and renders the result
  • Compute Shaders: programs that run on the graphics card, outside the normal rendering pipeline; used for massively parallel GPGPU algorithms or to accelerate parts of the game’s rendering
Explaining Basic Shader Script

Shaders are attached to a material and determine how that material is rendered. A standard Unity shader is written in a shader language (ShaderLab wrapping Cg/HLSL). The Properties block is similar to public variables in Unity, as its entries can be seen in the editor. The Pass block is where the script passes logic to the renderer. Tags explain how the shader wants to be rendered. Two structs carry data into the main functions: the vertex function (vert) and the fragment function (frag).

Core Terminology for Shader Scripts
  • Vertex Function: takes the shape of the model and potentially modifies it; gets the vertices of the model ready to be rendered; converts from object space to clip space (relative to the camera); the result goes to the fragment function
  • Fragment Function: applies color to the shape output by the vertex function; this paints in the pixels
  • Property Data: colors, textures, and values set by the user in the inspector
  • LOD (Level of Detail): how detailed an object is (usually associated with the idea in games that closer objects have higher detail and far objects have lower detail)

Shaders do not use inheritance. Most classes in Unity start as MonoBehaviour, which gives you a lot of nice base functions. Shaders need similar functionality included explicitly, which is what the line #include "UnityCG.cginc" is for. This pulls in a bunch of helpful helper functions.

Two important structs: appdata and v2f. appdata passes in information about the vertices of the 3D model. These are passed in as a packed array (a variable with 4 floating point numbers: x, y, z, w). POSITION is a semantic binding, which tells the shader how something will be used in rendering. v2f is short for “vert to frag”.

Coordinate system translations:

Local space -> World Space -> View Space -> Clip Space -> Screen Space
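As a hand-rolled illustration of the first two hops in that chain, each step is just a matrix multiply. This sketch uses translation-only matrices for clarity; Unity's real transforms also apply rotation, scale, and (for clip space) projection:

```python
def mat_vec(m, v):
    # multiply a 4x4 matrix (row-major nested lists) by a 4-component vector
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def translation(tx, ty, tz):
    return [[1, 0, 0, tx],
            [0, 1, 0, ty],
            [0, 0, 1, tz],
            [0, 0, 0, 1]]

# A vertex at the local-space origin of an object placed at world (5, 0, 0),
# seen by a camera sitting at world (0, 0, -10).
local = [0.0, 0.0, 0.0, 1.0]      # w = 1 marks this as a position
model = translation(5, 0, 0)      # local space -> world space
view = translation(0, 0, 10)      # world space -> view space (inverse of camera)
world = mat_vec(model, local)
view_pos = mat_vec(view, world)
```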

Looking into fragment sections

fixed4 can be either x, y, z, w or, for color, r, g, b, a. Created a variable _TintColor in Properties, which showed up in the Unity inspector under the Unlit_Hologram material. This then needed to be used in the CGPROGRAM block to actually do anything. We added this color to the fixed4 col found in the frag function, which “adds” the colors together.
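In spirit, adding the tint is just a component-wise add on the (r, g, b, a) channels; here is a toy Python model of that operation (the clamp stands in for the render target's saturation and is my addition, not part of the shader's `+`):

```python
def add_colors(a, b):
    # component-wise (r, g, b, a) addition, clamped to 1.0 for display
    return tuple(min(c1 + c2, 1.0) for c1, c2 in zip(a, b))
```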

Making a transparent Shader

First, changed RenderType in Tags from Opaque to Transparent. Also needed to add "Queue" = "Transparent" here, as the order in which things are rendered is also important: you want everything else rendered before the transparent object, so that the transparent thing is rendered “on top”. There are several primary queue tags for rendering order. The following is the general order of rendering, from first to last.

Primary Queue Tags for Render Order:
  • Background (first, back)
  • Geometry (Default)
  • AlphaTest
  • Transparent
  • Overlay (last, top)

Add the ZWrite Off keyword. This tells the renderer not to write to the depth buffer, which is usually done for non-solid (i.e. semi-transparent) objects.

Displacing vertices and clipping pixels

Used the clip function in the frag function to clip out pixels within a certain threshold. Added a sin function, along with several variables (Speed, Amplitude, Distance, Amount (a multiplicative factor)), to the vertex function to move vertices around in object space, relative to the object. This is done before passing into the frag function. _Amount was a factor in a range between 0 and 1, just to control how much of the shader effect was happening. The amount was important for the C# script used to control the effect on a time-based interval. The C# script, HoloManGlitcher, could access the variables within the shader script simply through the material (i.e. holoRenderer.material.SetFloat("_Amount", 1f);).
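The displacement itself is just a sine offset applied per vertex; a rough Python model of the math (parameter names mirror the tutorial's _Speed, _Amplitude, _Distance, and _Amount properties, but the exact formula in the video may differ from this sketch):

```python
import math

def displace_x(x, y, t, speed=1.0, amplitude=0.5, distance=8.0, amount=1.0):
    """Offset a vertex's x by a sine wave, as in the glitch vertex function.

    `speed` scales time, `distance` scales how the wave varies along y,
    `amplitude` scales the offset, and `amount` (0..1) fades the whole
    effect in and out -- which is what the C# controller script animates.
    """
    return x + math.sin(t * speed + y * distance) * amplitude * amount
```

Setting amount to 0 disables the effect entirely, which is why driving that one float from C# is enough to toggle the glitch.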

Assorted Unity Tutorials – Structs, Shaders, and Unity Architecture

January 17, 2019

Unity Tutorial Assortment

Structs, Shaders, and Unity Architecture

Youtube – HOW TO MAKE COOL SCENE TRANSITIONS IN UNITY – EASY TUTORIAL

By: Blackthornprod

Youtube – Beginning C# with Unity – Part 15 – Structs

By: VegetarianZombie

Youtube – Reduce Garbage Collection in Unity with Structs

By: Unity3d College

Youtube – Unity Architecture – Composition or Inheritance?

By: Unity3d College

Youtube – Shaders 101 – Intro to Shaders

By: Makin’ Stuff Look Good

Unity3D – Live Training Session: Writing Your First Shader In Unity

By: Unity

This is just a list of some useful tutorial resources for things I would like to get around to soon. They cover some basic Unity functionality, as well as some more in-depth programming concepts to help aid in building my code.

Field of View Tutorial E02 – FoV Mesh – by Sebastian Lague

January 16, 2019

Field of View

Tutorial – E02

Youtube – Field of view visualisation (E02)

By: Sebastian Lague

This is the second part of a tutorial creating a field of view (FoV) for a character in Unity. This part focuses on an in-game visualization of the field of view that fills the entire viewing area. The FoV visualization is a generated mesh, built from tris made up of the character’s position and the end point of each ray cast within the FoV.

A struct was created to hold all of the information for the raycasts. Used transform.InverseTransformPoint to convert a point (Vector3) from a global value to a local value.

The FoV mesh initially had an issue dealing with the corners/edges of obstacles. A high resolution (many rays cast) was needed to reduce the visual jitter around obstacle corners/edges. The tutorial addressed this by determining when rays go from hitting an obstacle to missing it, labeling these the “min” and “max” rays. We know the edge must lie somewhere between these two rays, so you cast rays directly in between them until you find the edge/corner (or somewhere very close).

This solution handled the case of the FoV mesh hitting a single obstacle very well, but still had issues when two sequential view rays hit two different objects. Since both rays register a hit, the logic treats that case as fine, yet it produces a problem similar to the one before. To fix this, a distance threshold between the two ray hit locations was added, on the assumption that two hit points far apart most likely belong to two different obstacles. This made the FoV mesh much smoother.
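The edge search between the "min" and "max" rays is effectively a bisection; a minimal sketch of the idea (with `cast` as a hypothetical stand-in for the tutorial's raycast, not its actual code):

```python
def find_edge(cast, hit_angle, miss_angle, iterations=8):
    # Bisect between a ray known to hit the obstacle and one known to miss
    # it; the obstacle's edge lies between the two returned angles.
    # `cast(angle)` returns True when a ray at that angle hits the obstacle.
    for _ in range(iterations):
        mid = (hit_angle + miss_angle) / 2
        if cast(mid):
            hit_angle = mid       # edge is farther out than mid
        else:
            miss_angle = mid      # edge is closer in than mid
    return hit_angle, miss_angle
```

Each iteration halves the uncertainty, so a handful of extra casts per edge replaces the brute-force "cast hundreds of rays" approach.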

PROBLEMS

My FoV mesh was glitching out sometimes when interacting with obstacles: some of the points of the mesh were jumping to very far away, incorrect positions. This did not appear to happen at all in the tutorial video, but happens pretty frequently with my setup, so it may just be a typo within my code somewhere. I will have to investigate to determine the issue.

SOLVED

The tutorial provides the scripts through GitHub, so I was able to compare them side by side and find the error. It was a typo. In the DrawFieldOfView method, in the check for whether the sequential raycasts hit the same type of thing (the edge-case check), either pointA or pointB is added to the viewPoints list depending on which one is not equal to zero. I had both of these cases adding pointA specifically, which was causing the very strange mesh shapes in cases where my FoV was hitting obstacles.

level-design.org and a GDC Talk on Level Designers

January 15, 2019

Level Design – Resources

Talks and Online Source

Youtube – GDC 2015 – Level Design in a Day: Level Design Histories and Futures
level-design.org site

These have been helpful level design resources for games that were brought up in our class today for Architectural Approach to Level Design. While not particularly useful now, they could be good resources to go back to.

Field of View Tutorial E01 – by Sebastian Lague

January 14, 2019

Field of View

Tutorial – E01

Youtube – Field of view visualisation (E01)

By: Sebastian Lague

This tutorial sets up a field of view (FoV) for a character. This FoV gives basic sight to a character. This sight has a radius and a viewing angle, and it is blocked by obstacles. This FoV also has a global setting that determines if it follows the character’s rotation or not. There are also editor elements that make it visually very clear what is happening by drawing out all of these factors.

Unity angles: normally 0 degrees points to the right of the circle, but Unity puts 0 at the top of the circle (where you’d normally have 90 degrees) for its angular math. Since sin(90 – x) = cos x, this just means we swap sin and cos in Unity math.
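A quick numeric check of that swap (my own sketch; I believe the tutorial's DirFromAngle helper does the same thing in C#, with the result in the xz plane):

```python
import math

def dir_from_angle(angle_degrees):
    # Unity-style convention: 0 degrees points "up"/forward and angles
    # increase clockwise, so sin gives the sideways component and cos
    # gives the forward component -- the sin/cos swap described above.
    rad = math.radians(angle_degrees)
    return (math.sin(rad), math.cos(rad))
```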

PROBLEM

I had an issue with the GetAxisRaw command not working for mouse inputs with the axes “Horizontal” and “Vertical”; it just constantly returned a value of 0. Keyboard inputs worked fine, however.

Thesis Terms: Game Mechanics and MDA

January 13, 2019

Defining Terms for Thesis Research

Focus on MDA Terms

Game – “Type of play activity, conducted in the context of a pretended reality, in which the participants try to achieve at least one arbitrary, nontrivial goal by acting in accordance with the rules” [4] p. 1

Mechanics
  • describes the particular components of the game, at the level of data representation and algorithms [1]
  • various actions, behaviors, and control mechanisms afforded to the player within a game context [1]
  • the rules and procedures of the game [2] p. 40
  • elements of the game; “rules of the game” [2] p. 138
  • methods invoked by agents for interacting with the game world [3] p. 1
  • “the rules, processes, and data at the heart of a game” [4] p. 1

Dynamics
  • describes the run-time behavior of the mechanics acting on the player inputs and each others’ outputs over time [1]
  • “runtime behavior(s) of the game; when players interact with the rules, what happens?” [2] p. 138

Aesthetics
  • describes the desirable emotional responses evoked in the player, when she interacts with the game system [1]
  • emotional results generated from the game [2] p. 138

References
  • [1] R. Hunicke, M. LeBlanc, and R. Zubek, “MDA: A Formal Approach to Game Design and Game Research,” p. 5.
  • [2] Z. Hiwiller, Players Making Decisions: Game Design Essentials and the Art of Understanding Your Players. New York?: New Riders/NRG, 2016.
  • [3] M. Sicart, “Defining Game Mechanics,” The International Journal of Computer Game Research, vol. 8, no. 2, December 2008.
  • [4] E. Adams and J. Dormans, Game Mechanics: Advanced Game Design. Berkeley, CA: New Riders, 2012.

Unity Basic Enemy AI Patrolling

January 12, 2019

Unity Basic AI

Patrolling

Youtube – PATROL AI WITH UNITY AND C# – EASY TUTORIAL

This tutorial just sets up a basic patrolling AI where empty GameObject waypoints (called movespots in the tutorial) determine the enemy’s movement. It used an array of positions containing all the possible waypoints, then randomly selected between them and moved the enemy there. This movement isn’t particularly useful for much, but it showcased a basic waypoint system well enough.

This tutorial actually attempted to correct the issue I brought up in my last blog about the previous tutorials, where they used if statements that ended when a position exactly equaled another position, which was very susceptible to math errors. They even used my quick (but still not great) workaround of using distance and “less than” to set a small acceptable range that accounts for those small math errors.
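That workaround boils down to a radius check instead of an exact-equality check; a small sketch (function name and 0.1 default are my own):

```python
def reached(position, target, threshold=0.1):
    # Treat the waypoint as reached when within a small radius, instead of
    # testing exact equality -- which floating-point error rarely satisfies.
    dx, dy = target[0] - position[0], target[1] - position[1]
    return (dx * dx + dy * dy) ** 0.5 < threshold
```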

Unity Basic Enemy AI Following and Spacing

January 11, 2019

Basic Unity AI Tutorials

Player Following and Spacing

Youtube – AI TUTORIALS WITH UNITY AND C#
Youtube – SHOOTING/FOLLOW/RETREAT ENEMY AI WITH UNITY AND C# – EASY TUTORIAL

By: Blackthornprod

This first video used the Unity command MoveTowards, which I thought was giving me some issues, so I created my own follow AI using simple vector math (this was a good opportunity to brush up on vector math again). It turned out everything was fine and both ways worked, but I like really knowing the math behind what my objects are doing. The addition of a stopping distance, so the enemy stops moving towards you at a certain range, was a nice extra touch that’s very easy to add with a simple if statement tied to the distance between player and enemy.
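For reference, the vector math behind a MoveTowards-style step is short; this 2D Python sketch (my own naming, not the tutorial's code) normalizes the offset to the target and steps along it without overshooting, which is the same behavior as Unity's Vector2.MoveTowards:

```python
def move_towards(current, target, max_delta):
    # Step from current toward target by at most max_delta, snapping to the
    # target once it is within reach so we never overshoot past it.
    dx, dy = target[0] - current[0], target[1] - current[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= max_delta or dist == 0.0:
        return target
    return (current[0] + dx / dist * max_delta,
            current[1] + dy / dist * max_delta)
```

The stopping distance is then just an if around the call: only step while the distance to the player exceeds the stop radius.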

The second video was a bit more interesting: the enemy has three ranges: very far away, away, and too close. At very far away, it moves closer. At away, it stays in position. At too close, it moves away from the player. This could be a good setup for testing a small state machine, to get practice using those with AI. I could have each range be a state and use an enum switch-case setup to decide which action the enemy should use.
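That state-machine idea could look like this sketch (the thresholds and names are invented for illustration, not from the video):

```python
from enum import Enum

class EnemyState(Enum):
    APPROACH = 1  # very far away: move closer
    HOLD = 2      # in the sweet spot: stay in position
    RETREAT = 3   # too close: back away from the player

def pick_state(distance, retreat_range=3.0, hold_range=8.0):
    # Map the distance to the player onto one of three states; a switch
    # (or if/elif chain) elsewhere then performs that state's action.
    if distance < retreat_range:
        return EnemyState.RETREAT
    if distance < hold_range:
        return EnemyState.HOLD
    return EnemyState.APPROACH
```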

This was also a nice little refresher on instantiating projectiles, although the logic for them was strange and bad. The projectileScript class would get the position of the player and then end at that position. This was then “remedied” in the tutorial by having the projectile destroy itself when its position was equal to that original target position, but math errors prevented the values from EXACTLY matching for me, which kept the projectiles from being destroyed. As a quick solution, I just changed the if statement to fire when the distance between the projectile’s position and the target position was less than 0.1, so it just has to be pretty close, which allows for some math error. I imagine this is not a great solution either, though, as constantly calculating distance in Update for a projectile sounds too computationally aggressive.

Compiling List of Physical Phenomena for Game Analysis

January 10, 2019

Physical Properties to Observe in Games for Thesis

List of Physical Properties/Phenomena/Measures

I am compiling a list of physical factors that could be considered as the system or parameters to control/set for my thesis project. This will help me determine what factors I should be looking for in the games I am analyzing, as well as helping me develop a concept for the game to showcase the thesis design.

What are physical properties that could be coded as systems in games? Look for equations for general physical properties:

  • Force: Wikipedia – Force
    • F = dp/dt = d(mv)/dt
    • If mass remains constant (which it generally does in most normal physical situations), this simplifies to the better-known F = ma (because dv/dt = a) [1]
    • F = ma
    • Gravity
    • Magnetism
    • Thrust – increases velocity of object
    • Torque – produces changes in rotational speed
    • Mechanical Stress – distribution of forces through extended body where each part applies force to adjacent parts
      • Causes no acceleration of body because forces within body balance each other
      • i.e. Polybridge games
    • Friction
    • Equilibrium
    • Continuum Mechanics
      • Pressure – distribution of many small forces applied over an area of a body
      • Stress(?) – usually causes deformation of solid materials or flow in fluids
      • Drag – decreases velocity of objects
    • Elastic Force
      • Elasticity
      • Hooke’s Law: F = kx
    • Four Fundamental Forces of Nature:
      • Gravitation
      • Electromagnetic
      • Strong Nuclear
      • Weak Nuclear
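Several of these forces drop straight into game code via F = ma. As a tiny worked instance (my own sketch, not from the source), a mass on a spring combines Hooke's law with one semi-implicit Euler integration step:

```python
def spring_step(x, v, dt, k=4.0, m=1.0):
    # One semi-implicit Euler step of a mass on a spring: Hooke's law
    # F = -k * x gives a = F / m, then integrate velocity, then position.
    a = -k * x / m
    v += a * dt
    x += v * dt
    return x, v
```

Called once per physics tick, this produces the oscillating motion you would expect from an elastic force.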

Energy: Wikipedia – Energy

  • “In physics, energy is the quantitative property that must be transferred to an object in order to perform work on, or to heat, the object”
  • Work = Force * Distance
  • W = ∫_C F · ds
    • Work is equal to the line integral of the force along a path C
  • Potential energy – stored by an object’s position in a force field
    • Potential gravitational energy
    • Electrical
    • Magnetic
  • Kinetic Energy – for moving objects
  • Elastic Energy
  • Chemical energy – caused by fuel burning
  • Radiant Energy – carried by light
  • Thermal energy – due to object’s temperature
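The line-integral definition of work above can be checked numerically; this sketch sums F · Δs over small straight segments of a path (the `force` and `path` helpers are hypothetical, introduced just for this illustration):

```python
def work_along_path(force, path, steps=1000):
    # Approximate W = integral over C of F . ds by summing F(midpoint) . ds
    # over small straight segments. `path(t)` maps t in [0, 1] to an (x, y)
    # point on the curve; `force(x, y)` returns the force vector there.
    total = 0.0
    for i in range(steps):
        t0, t1 = i / steps, (i + 1) / steps
        x0, y0 = path(t0)
        x1, y1 = path(t1)
        mx, my = (x0 + x1) / 2, (y0 + y1) / 2
        fx, fy = force(mx, my)
        total += fx * (x1 - x0) + fy * (y1 - y0)
    return total
```

For a constant force along a straight line this reduces to the simpler Work = Force × Distance formula above.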