Lerp Fundamentals in Unity

May 10, 2021

Lerp

Unity


Title:
How to Use Lerp (Unity Tutorial)

By:
Ketra Games


Youtube – Tutorial

Description:
A brief coverage of exactly how Lerp works in Unity with a couple ways to use it.


Overview

I was recently researching ways to use Lerp effectively in Unity and came across some strange implementations that made me unsure I understood how it worked. This video specifically covers that case and explains that it does work, but that it is not a particularly ideal solution.

The case covered is specifically called the “Incorrect Usage” in this video: passing Time.deltaTime alone as the time parameter for Lerp. As I suspected, this just enters some tiny number as the time parameter on each call, which nudges the value a small fraction of the way between the initial and final values entered into the Lerp. It is not very controlled, and it leads to a strange situation where the value keeps getting updated but theoretically never reaches the final position (although floating point precision may eventually land it there in practice).

Finally, they cover a couple of time parameters to use with Lerp to get varied results. SmoothStep is one they suggest for a result similar to the “Incorrect Usage” but done properly: it adds a slowing effect as the value gets closer to the final result. They also show using an Animation Curve to shape the value mathematically in many different ways over time.
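To make the difference concrete, here is a minimal sketch contrasting the two patterns; the class name and duration value are my own assumptions, not taken from the video:

using UnityEngine;

public class LerpExample : MonoBehaviour
{
    [SerializeField] private Transform startPoint;
    [SerializeField] private Transform endPoint;
    [SerializeField] private float duration = 2.0f; // assumed timing value

    private float elapsed;

    private void Update()
    {
        // "Incorrect usage": t is a tiny per-frame value, so the object eases
        // toward the target each frame but in theory never reaches it:
        // transform.position = Vector3.Lerp(transform.position, endPoint.position, Time.deltaTime);

        // Controlled usage: accumulate time and shape it with SmoothStep,
        // which slows the value as it approaches the final result.
        elapsed += Time.deltaTime;
        float t = Mathf.SmoothStep(0.0f, 1.0f, Mathf.Clamp01(elapsed / duration));
        transform.position = Vector3.Lerp(startPoint.position, endPoint.position, t);

        // Alternative: sample a designer-authored AnimationCurve instead:
        // float t = curve.Evaluate(Mathf.Clamp01(elapsed / duration));
    }
}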

via Blogger http://stevelilleyschool.blogspot.com/2021/05/lerp-fundamentals-in-unity.html

Game Project: Flying Game – Part 1 – Introductory Movement and Camera Controller

April 30, 2021

Flying Game

Game Project


Overview

I wanted to do some work on a small, simple 3D game in Unity that I could make within a week or so. My original focus was on player movement, and from there I decided to hone in on a flying game of some kind. I have been watching Thabeast721 on Twitch, and he recently played Superman 64, which I have also seen at Games Done Quick (GDQ) events, so I thought building off of and improving its flying controller could be a fun project.

Player Controllers

Controller #1 – Rotate Forward Axis

My initial thought for a basic flying player controller was to have left/right rotate the player on the y-axis, up/down rotate the player on the x-axis, and a separate button propel the player in their current forward direction.

As a controller standard, I also tend to read the player's input in the Update method so it is captured as often as possible, but then apply those inputs as movement in FixedUpdate to keep the motion consistent on machines with different frame rates.

using UnityEngine;

// Note: the class declaration and field defaults are assumed for completeness;
// the original post shows only the method bodies.
public class FlyingPlayerController : MonoBehaviour
{
    [SerializeField] private float rotationSpeed = 90.0f;
    [SerializeField] private float movementSpeed = 10.0f;

    private float horizontalInput;
    private float verticalInput;
    private float inversion = 1.0f; // -1.0f inverts the up/down controls
    private bool isMoving;

    private void Update()
    {
        horizontalInput = Input.GetAxisRaw("Horizontal");
        verticalInput = Input.GetAxisRaw("Vertical");

        isMoving = Input.GetButton("Jump");
    }

    private void FixedUpdate()
    {
        // Inputs in reverse position for direction vector because that
        // influences which axis the input rotates AROUND
        Vector3 direction = Vector3.Normalize(new Vector3(verticalInput * inversion, horizontalInput, 0.0f));
        Flying(direction);
    }

    private void Flying(Vector3 dir)
    {
        RotatePlayer(dir);
        MovePlayer(dir);
    }

    private void RotatePlayer(Vector3 dir)
    {
        transform.Rotate(dir * rotationSpeed * Time.deltaTime);
    }

    private void MovePlayer(Vector3 dir)
    {
        if (isMoving)
        {
            transform.position += transform.forward * movementSpeed * Time.deltaTime;
        }
    }
}

Controller #2 – Rotate on Y-Axis but Translate Directly Vertically

With my second approach I wanted to try to emulate the flying controller from Superman 64 just to see how it felt. It seems like a more arcade-y style of flying with easier controls, so I thought it would be a good option to investigate. For this, horizontal rotation (rotation on the y-axis) remained the same, as this is pretty standard with grounded player controllers as well.

The up and down rotation (rotation on the x-axis), however, was completely removed. The up/down inputs from the player simply influence the movement vector, adding some amount of up or down movement to the player. This makes it much easier to keep the player's forward vector relatively parallel to the ground and is much less disorienting than free-form rotational movement in the air.

using UnityEngine;

// As above, the class declaration and field defaults are assumed for completeness.
public class ArcadeFlyingPlayerController : MonoBehaviour
{
    [SerializeField] private float rotationSpeed = 90.0f;
    [SerializeField] private float movementSpeed = 10.0f;
    [SerializeField] private bool isInvertedControls;

    private float horizontalInput;
    private float verticalInput;
    private float inversion = 1.0f;
    private bool isMoving;

    private void Start()
    {
        if (isInvertedControls)
        {
            inversion = -1.0f;
        }
    }

    private void Update()
    {
        horizontalInput = Input.GetAxisRaw("Horizontal");
        verticalInput = Input.GetAxisRaw("Vertical");

        isMoving = Input.GetButton("Jump");
    }

    private void FixedUpdate()
    {
        // Inputs in reverse position for direction vector because that
        // influences which axis the input rotates AROUND
        Vector3 direction = Vector3.Normalize(new Vector3(verticalInput * inversion, horizontalInput, 0.0f));
        Flying(direction);
    }

    private void Flying(Vector3 dir)
    {
        RotatePlayer(dir);
        MovePlayer(dir);
    }

    private void RotatePlayer(Vector3 dir)
    {
        // Only rotate around the world y-axis (horizontal turning)
        transform.Rotate(dir.y * Vector3.up * rotationSpeed * Time.deltaTime);
    }

    private void MovePlayer(Vector3 dir)
    {
        if (isMoving)
        {
            // Up/down input adds direct vertical movement instead of pitch rotation
            Vector3 movementDirection = Vector3.Normalize(transform.forward + dir.x * Vector3.up);
            transform.position += movementDirection * movementSpeed * Time.deltaTime;
        }
    }
}

Camera Controller

Follow Position

To follow the player's position, I am using a really simple approach where the camera follows the player at a fixed offset. The offset is determined by the initial position of the camera relative to the player, and for the rest of the run the camera's position is simply the player's position summed with that offset.

Where transform.position is the position of the camera object:

// captured once at startup
offset = transform.position - player.transform.position;

// applied every update thereafter
transform.position = player.transform.position + offset;

Rotate to Follow

My first test to initialize the camera follow was simply to child the camera to the player to see how it looked. This worked okay for following position, but having multiple rotation influences made it nearly impossible to use at first. As I changed the player controller to the more general arcade style, it worked better but was still poor for rotation.

To fix this, I made the camera a child of a separate empty gameobject. This gameobject could then follow the player and rotate to swing the camera around the player while keeping it at a fixed offset distance. This also made determining the rotation angle/looking vector from the camera to the player much simpler. Since the camera is rotated downward a bit, its forward vector is no longer in line with the world z-axis. The camera container, however, keeps its axes aligned with the world's axes, which meant I could simply align the container's forward vector with the player's forward-facing vector on the xz-plane. Doing so just required the following:

private void RotateView()
{
	Vector3 lookDirection = new Vector3(player.transform.forward.x, 0.0f, player.transform.forward.z);

	transform.rotation = Quaternion.LookRotation(lookDirection, Vector3.up);
}

Unity’s Quaternion.LookRotation method allows me to set the rotation of the object based on the direction of a forward facing vector (lookDirection in this case), with a perpendicular upward vector to make sure I keep the rotation solely around the y-axis.

The following is a quick look at how the final player controller and camera controller interact from this initial prototype approach:

Flying Game Project: Initial Player Controller and Camera Controller Prototypes from Steve Lilley on Vimeo.

via Blogger http://stevelilleyschool.blogspot.com/2021/04/game-project-flying-game-part-1.html

How to Make a Multiplayer Game in Unreal Engine 4 [Blueprint Tutorial] (Unreal) – Part 1 – by DevAddict

April 27, 2021

Multiplayer

Unreal


Title:
How to Make a Multiplayer Game in Unreal Engine 4 [Blueprint Tutorial]

By:
DevAddict


Youtube – Tutorial

Description:
A tutorial extension to the previous Unreal platformer tutorial that shows multiplayer implementation.

Summary

This tutorial extends the previous tutorial on making a platformer in Unreal found here:

Youtube – Lets Make a Platformer – Unreal Engine 4.26 Beginner Tutorial

The original tutorial follows one created by Unreal with some extra steps added. This tutorial is an expansion made by DevAddict specifically to show how to add multiplayer to this project.

Lesson 1: Introduction to Multiplayer

Play Modes

When going into Play Mode in Unreal, there are many options for testing and debugging multiplayer.

    Play Modes → Net Modes:

  • Play as Offline (Standalone): (Default) You are the server
  • Play as Listen Server: Editor acts as both the Server and the Client
  • Play as Client: Editor acts solely as Client and a Server is started behind the scenes for you to connect to

Testing Multiplayer Settings

Approach #1

  • Set Number of Players to: 2
  • Set Play Mode to: Play as Client

This tests both windows as if they were both individual clients.

Approach #2

  • Set Number of Players to: 2
  • Set Play Mode to: Play as Listen Server

The Editor will act as the Server (host) and the extra windows will act as Clients connected to that Server. This helps point out differences occurring between the Server and Clients for debugging multiplayer actors.

Editing Blueprints

Editing Game Mode

The Game Mode class only exists on the Server. It does NOT exist on any of the Clients.

This fact is why transitioning this tutorial to multiplayer causes many issues, the first of which is fixing the UI so it displays for all players. It originally displays only for the Server player because much of its programming is within the Game Mode class. Similarly, the respawn code is also only in the Game Mode.

Use “Event OnPostLogin” node

-> “Cast To BP_PlatformerController” node

-> Client Draw HUD (Event we created in the BP_PlatformerController class)

This tells the Game Mode (Server) that when a new player logs in and has their own Player Controller created, that specific instance will create its own HUD for that individual Client. Note that they initially tried the “Event Handle Starting New Player” node in place of the “Event OnPostLogin” node, which did create the UI but did NOT create the character (so in the Unreal editor you just moved around as a camera). That approach may work with some extra modifications, but it did not directly work in this instance.

Player Controller

The Player Controller is very powerful in multiplayer because it is replicated on the Server and the Client. They like to keep UI on the Player Controller because it exists throughout the play session. While the character may be destroyed in some instances, the Player Controller generally persists. This makes the Player Controller beneficial for respawning mechanisms as well.

Building a Player Controller:

Right-Click -> Blueprint Class -> Player Controller
Named: BP_PlatformerController

You need to connect the Player Controller and your Game Mode, as they work together to relay information between players and the Server.

In the Game Mode class (ThirdPersonGameMode) Event Graph -> Details -> Classes -> Player Controller Class -> Use dropdown to select new Player Controller (BP_PlatformerController)

Common Error – Event BeginPlay to Initialize Player Controller Blueprint

When initializing their Player Controller class, many may try using the “Event BeginPlay” node. This works for single player, which is why it may be prevalent, but it does not work for a multiplayer project. Instead you want an event that will run on the Client ONLY.

Moving HUD from Game Mode (Server) to Player Controller (Client):

Add Custom Event

Connect Custom Event to start of class

In Details of Custom Event -> Graph -> Replicates: Run on owning Client -> Replicates: Check ON Reliable

via Blogger http://stevelilleyschool.blogspot.com/2021/04/how-to-make-multiplayer-game-in-unreal.html

Observer Pattern in Unity with C# – by Jason Weimann

April 22, 2021

Observer Pattern

Game Dev Patterns


Title:
Observer Pattern – Game Programming Patterns in Unity & C#

By:
Jason Weimann


Youtube – Tutorial

Description:
Introduction to the observer pattern and implementing it in Unity through C#.


Overview

This tutorial covers the basics of the observer pattern in game development with two ways of implementing it in Unity. The first approach is relatively simple just to establish the concept, whereas the second approach uses events with C# to create a more flexible system.

Observer Pattern Basics

An object, called the subject, maintains a list of its dependents, called observers, and notifies them automatically of any state changes (usually by calling one of their methods).

Example Uses in Games:

UI elements updating when data behind them changes

Achievement systems

Implementation #1: Inheriting Observer and Subject Classes

Observer Class:

Abstract class your observers will inherit from

OnNotify() method that accepts a value and notification type


Subject Class:

Abstract class your subjects will inherit from

Hold information on their list of observers they report to

RegisterObserver() method to add Observers to list of those reporting to

Notify() passes a value and notification type on to all the observers they report to in their list, in turn calling their OnNotify methods with the passed on data
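As a rough sketch of this inheritance-based setup (the enum values and member names are illustrative assumptions, not the video's exact code):

using System.Collections.Generic;

public enum NotificationType { PlayerDamaged, PlayerDied }

public abstract class Observer
{
    // Called by a Subject to pass a value and notification type along.
    public abstract void OnNotify(float value, NotificationType type);
}

public abstract class Subject
{
    // The list of observers this subject reports to.
    private readonly List<Observer> observers = new List<Observer>();

    public void RegisterObserver(Observer observer)
    {
        observers.Add(observer);
    }

    protected void Notify(float value, NotificationType type)
    {
        // Forward the data to every registered observer's OnNotify.
        foreach (Observer observer in observers)
        {
            observer.OnNotify(value, type);
        }
    }
}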

This example gets the point across, but it is not particularly well suited for Unity or C# projects, as it requires a base class on all observers and subjects (although it could possibly be changed to use an interface system).

Implementation #2: Using Events

Subject objects create a ‘static event Action‘, which they then call when the requirements are met.

Observer objects have their own methods to perform when those same requirements are met, and they are connected by having the observers subscribe their relevant methods to that same ‘static event’.
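A minimal sketch of the event-based approach; the Player/AchievementSystem classes and event name are my own assumptions, not the video's exact code:

using System;
using UnityEngine;

public class Player : MonoBehaviour // the subject
{
    public static event Action OnPlayerDied;

    private void Die()
    {
        OnPlayerDied?.Invoke(); // notify whoever is listening
    }
}

public class AchievementSystem : MonoBehaviour // an observer
{
    private void OnEnable()
    {
        Player.OnPlayerDied += UnlockDeathAchievement;
    }

    private void OnDisable()
    {
        // Unsubscribe to avoid dangling handlers on disabled/destroyed objects.
        Player.OnPlayerDied -= UnlockDeathAchievement;
    }

    private void UnlockDeathAchievement()
    {
        Debug.Log("Achievement unlocked!");
    }
}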

This reduces the direct coupling between the observers and subjects, as well as any other objects involved. You should profile this approach, as calling events every frame can start to lead to performance loss. As always, you also need to be careful to properly unsubscribe methods from events when needed (such as when deactivating or destroying objects).

via Blogger http://stevelilleyschool.blogspot.com/2021/04/observer-pattern-in-unity-with-c-by.html

Architecture AI Pathing Project: Fixing Highlight Selection Bugs

March 22, 2021

Highlight Selection

Architecture AI Project


Overview

I was having a bug come up with our selection highlight effect where sometimes, when moving around quickly, several objects could be highlighted at the same time. This was not intended, as there should only ever be a single object highlighted at once, and it should be the object the user is currently hovering over with the mouse.

Bugfix

Assessing Bug Case

Since it generally happened when moving around quickly, it was difficult to nail down the direct cause at first. After testing for a while, though, it was noticeable that the bug occurred most often when moving the cursor over a non-selectable object and then directly onto a selectable object. This helped isolate where the problem was.

Solution

In my highlight selection SelectionManager class, I was only unhighlighting an object if the ray did not hit anything, or if it both: 1) hit an object and 2) that object had a Selectable component. I was not, however, unhighlighting an object if the ray: 1) hit an object and 2) that object did NOT have a Selectable component. This matched up with the bug cases I was seeing, so this was the area I focused on fixing.

That missing case was indeed where the error was coming from. By adding an additional catch to also unhighlight the previous object when moving from a selectable object over a non-selectable one, the bug was fixed.
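Below is a hedged sketch of the corrected flow; Selectable matches the component named above, but currentSelection and the helper method are my own stand-ins for the actual SelectionManager code:

using UnityEngine;

public class Selectable : MonoBehaviour { } // stand-in for the project's component

public class SelectionManager : MonoBehaviour
{
    private GameObject currentSelection;

    private void Update()
    {
        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);

        if (Physics.Raycast(ray, out RaycastHit hit) &&
            hit.collider.GetComponent<Selectable>() != null)
        {
            if (currentSelection != hit.collider.gameObject)
            {
                Unhighlight(); // swap the highlight to the newly hovered object
                currentSelection = hit.collider.gameObject;
                // ... enable highlight effect on currentSelection ...
            }
        }
        else
        {
            // Covers BOTH exit cases: the ray hit nothing, or it hit an object
            // WITHOUT a Selectable component (the case that was missing).
            Unhighlight();
        }
    }

    private void Unhighlight()
    {
        if (currentSelection != null)
        {
            // ... disable highlight effect on currentSelection ...
            currentSelection = null;
        }
    }
}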

Architecture AI Project: Fixing Selection Highlight Bug from Steve Lilley on Vimeo.

Summary

This was a case of making sure you properly exit the states of your system in every case where you want to exit. This could probably use a very small and simple state machine, but that seemed like overkill for this part of the project. It may be worth moving toward that type of solution if it gets any more complex, however.

via Blogger http://stevelilleyschool.blogspot.com/2021/03/architecture-ai-pathing-project-fixing_22.html

Architecture AI Pathing Project: Fixing Weird Build Bugs

March 11, 2021

Build Issues

Architecture AI Project


Overview

After working on the project with a focus on using it in the Editor for a while, we finally decided to see if we could build the project and work with it from there. Initially the build did not provide anything usable. It would build, but nothing could be done. After a while, I isolated that the colliders were causing trouble, so I manually added them for a time, which helped set up the base node grid. The file reader, however, was unable to provide data to the node grid, so only one aspect of applying data to the pathing node grid worked.

These issues made the build fairly unusable, but they pointed to areas to modify in order to fix it. After some work focusing on applying colliders and reading/writing files, I was able to get the builds into a decently workable spot, with hope of getting the full project usable in a build soon!

Unable to Apply Colliders with Script: Working with Mesh Colliders at Run Time

The first issue, right from opening the build, was that my script for applying mesh colliders to all parts of the model of interest was not working. This made sense as a cause for the node grid not existing, as raycasts need something to hit to send data to the node grid. Further testing by simply dropping a ball into the build showed it passed right through the model, clearly indicating no colliders were added.

I temporarily used a band-aid fix by manually adding all the colliders before building, just to see how much this fixed. This allowed the basic node grids to work properly again (the walkable and influence-based checks). The daylighting (data from the file reader) was still not working, however, which pointed to another issue, but it was a step in the right direction.

Solution

With some digging, I found that imported meshes in Unity have a “Read/Write Enabled” option that appears to be set to false on import by default. While this does not seem to have an effect when working in the editor, even in the game scene, it does apply in a build. Without this checked, the meshes lose some editing capabilities at run time, which prevented the colliders from being added by script. Upon checking this option, adding the colliders worked as intended.
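For reference, a minimal sketch of adding mesh colliders by script; this is the kind of run-time mesh access that fails in a build when “Read/Write Enabled” is unchecked (the class name is assumed):

using UnityEngine;

public class ColliderApplier : MonoBehaviour
{
    private void Awake()
    {
        // Add a MeshCollider to every mesh under this object;
        // requires the imported meshes to have Read/Write Enabled checked.
        foreach (MeshFilter filter in GetComponentsInChildren<MeshFilter>())
        {
            MeshCollider meshCollider = filter.gameObject.AddComponent<MeshCollider>();
            meshCollider.sharedMesh = filter.sharedMesh;
        }
    }
}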

File Reader Not Working: Differences Between Reading and Writing Text Files in Unity, and the Difficulties of Writing

While this got the build up and working, we were still missing a lot of options with the node grid unable to read in data from the File Reader. Initially I thought the files being read were non-existent or packaged incorrectly, so I checked that first. I was loading the files through Unity's Resources.Load() with the files in the Resources folder, so I thought they were safe, but I still needed to check. To do so, I added a UI text that displayed the name of the file if it was loaded and an error if it was not found. This consistently displayed the name of the file, indicating it was being found and that this was not the problem.

Difference Between “Build” and “Build and Run” in Unity

I was doing all my testing by building the project and then finding the .exe and running it myself. Eventually I tried “Build and Run” just to test a bit faster, and to my surprise, the project totally worked! The File Reader was now working as intended, and the extra pathing type was being read in properly and applied to the underlying node grid. But this was not a true solution.

To double-check, I closed the application and tried to open it again directly from the .exe. Once I did, I found that again the project was NOT applying the data properly and the file reader was NOT working as intended. This is important to note, as “Build and Run” may give false positives that your build works when it actually does not when run on its own.

I found an attempt at an explanation while looking for the cause, as I hoped it would also help me find a solution:

Unity Forums – Differences between Build – Build&Run?

One user suggests that some assets read from the Assets folder within Unity's editor may still be in memory when doing “Build and Run”, which is not the case when simply running a built .exe. Further research would be needed to clarify what causes this issue.

Solution

This did not directly lead me to my solution, but it did get me looking at Unity's development builds and the player.log file to find what issues arose while running the build. This showed me that one part of the system was having trouble writing out debug logs that were still carrying over into the build.

Since these were not important for the build, I tested commenting them out. This actually fixed the process, and the File Reader was able to progress as expected! It read the file in at run time and applied the extra architectural data to the pathing node grid as intended.

Reading vs. Writing Files through Unity

This showed me some differences between reading and writing files through Unity, and how writing requires a bit more work in many cases. Unity's built-in Resources.Load() works fine as a quick and dirty way to read files in, even in a build, as I have seen. Writing out files, however, requires much more finesse, especially if you are doing anything with direct path names.

Writing out files pretty much requires .NET methods as opposed to built-in Unity methods, and as such might not work as quickly and cleanly as you hope without some work. When done improperly, as I had set it up initially, it directly causes errors and stops in your application once you finally build it, as the file references differ from those in the Unity editor. This is something I will need to explore more, as another aspect of the project does need to write out files.
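A small sketch contrasting the two approaches under these constraints (the file names and paths are assumptions):

using System.IO;
using UnityEngine;

public static class FileExample
{
    // Reading: Resources.Load works even in builds, as long as the asset
    // lives under a Resources folder (path is relative, no extension).
    public static string ReadCsv()
    {
        TextAsset csv = Resources.Load<TextAsset>("RevitData/doors");
        return csv != null ? csv.text : null;
    }

    // Writing: requires .NET IO and a path that exists at run time;
    // hard-coded editor paths are what break once the project is built.
    public static void WriteCsv(string contents)
    {
        string path = Path.Combine(Application.persistentDataPath, "output.csv");
        File.WriteAllText(path, contents);
    }
}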

Summary

If you want to modify meshes in your builds and run into issues, make sure to check whether the mesh has “Read/Write Enabled” checked. Reading files with Unity works consistently when using a Resources.Load() approach, but writing out files is much trickier. Finally, use a development build and the player.log file to help with debugging at that stage.

via Blogger http://stevelilleyschool.blogspot.com/2021/03/architecture-ai-pathing-project-fixing.html

Drawing and Rendering Many Cube Meshes in Unity (Part 1 of Part 1)

March 03, 2021

Shaders for Game Devs

Shaders


Title:
Shader Basics, Blending & Textures • Shaders for Game Devs [Part 1]


By: Freya Holmér


Unity – Forum

Description:
Discussions and code for drawing and rendering many cube meshes.


Overview

I have been exploring shaders as an option for efficiently generating large amounts of geometry and came across this recent talk covering shaders all the way from the beginning. This seems like a good opportunity to at least get a better understanding of what they are and good cases to look into using them.

Intro to Shaders

Shaders: code that runs on the GPU, in their truest form.
This was their answer for the simplest explanation of what a shader is from a game development point of view, and I liked it as a good foundation for my understanding. Textures, normal maps, bump maps, etc. are all examples of tools that can be used as input for shaders. Shaders then use the information those provide, along with their code, to determine how to visualize and render that information.

Fresnel Shader: as a surface angles away from you, you get a stronger light.
It often looks like an outline effect, but it is not one. It highlights features that face away from your view: as the angle between the camera and the surface becomes very shallow, the effect grows stronger. This is just a commonly used type of shader.

Structures of a Shader

Structure within Unity (Description) [Language or Tool to Modify]:

  • Shader
    • Properties (input data) [ShaderLab]
      • Colors
      • Values
      • Textures
      • Mesh
      • Matrix4x4 (transform data: where it is, how it's rotated, how it's scaled)
    • SubShader (can have multiple in a single shader) [ShaderLab]
      • Pass (render/draw pass; can have multiple)
        • Vertex Shader [HLSL]
        • Fragment Shader (“Pixel” Shader) [HLSL]

Vertex Shader

This deals with all the vertices of the mesh, similar to a foreach loop that runs over every vertex you have. One of the first common jobs of a vertex shader is placing the vertices. Shaders, however, do not particularly care about world space. They generally deal with positions in clip space, which uses normalized coordinates to determine where vertices land on the screen. This can often be done simply by taking the local space coordinates and transforming them with an MVP matrix to convert them to clip space, and you are done.

The vertex shader is often used to animate water or sway grass and foliage in the wind; it is commonly the place to provide movement or animation. They mention that vertex UV coordinates can be manipulated in either the vertex shader or the fragment shader, but if the work can be done in the vertex shader, it should be done there first. All you do here is set the positions of vertices or pass data along to the fragment shader.

Fragment Shader

This is similar to a foreach loop that runs over each fragment. A pixel usually refers directly to a pixel being rendered on the screen, which a fragment does not always correspond to. However, the two commonly overlap, which is why some call this a pixel shader. This stage generally comes down to determining what color to set for every fragment or pixel. The vertex shader always runs before the fragment shader, and data can be passed from the vertex shader to the fragment shader, but not vice versa.

Shaders vs. Materials

Mesh and Matrix4x4 are normally supplied by your mesh renderer component or something similar, whereas colors, values, and textures are something you must define yourself. These properties are generally defined with materials. The material contains these parameters, which are then passed into the shaders. You never “add a shader to an object” in Unity; it is effectively done by adding a material that references the shader to be used. You can think of materials as preconfigured parameters to be used when rendering something with a shader. You can have multiple materials that all use the same shader but have different input properties.
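As a quick illustration of that last point, here is a hedged C# sketch of two materials sharing one shader with different parameters; it assumes the built-in render pipeline's Standard shader:

using UnityEngine;

public class MaterialExample : MonoBehaviour
{
    private void Start()
    {
        Shader shader = Shader.Find("Standard");

        // Two preconfigured parameter sets for the same shader.
        Material redMaterial = new Material(shader);
        redMaterial.SetColor("_Color", Color.red);

        Material blueMaterial = new Material(shader);
        blueMaterial.SetColor("_Color", Color.blue);

        // "Adding a shader" to this object really means assigning a
        // material that references it.
        GetComponent<MeshRenderer>().material = redMaterial;
    }
}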

via Blogger http://stevelilleyschool.blogspot.com/2021/03/drawing-and-rendering-many-cube-meshes_3.html

Architecture AI Pathing Project: Upward Raycast for Better Opening Detection

January 11, 2021

Raycast for A* Nodes

Architecture AI Project


Original Downward Raycast and Downfalls

The raycasting system for detecting the environment to set up the base A* pathing nodes originally used downward raycasts. These rays traveled until they hit the environment, then set the position of the node as well as whether it was walkable or not. An extra sphere collision check around the point of contact was also conducted to check for obstacles right next to the node.

This works for rather open environments, but it had a major downside for our particular needs: it failed to detect openings within walls, primarily doors. Doors are almost always found within a wall, so the raycast would hit the wall above the door and read as unwalkable. This left most doors as unwalkable areas because of the walls above and around them.

Upward Raycast System Approach

Method

The idea of using an upward raycast was to alleviate this issue of marking openings within walls as unwalkable when they should be walkable. By firing the rays upward, the first object hit can in most cases safely be assumed to be the floor, because our system only works with a single level at this time. Upon hitting the first walkable target (the assumed floor), this point is recorded and another raycast is fired upward until it hits an unwalkable target. If no target is hit, the node is marked walkable (as there is clearly nothing unwalkable blocking the way); if a target is hit, the distance between the two contact points is calculated. This distance is then compared with a constant height check, and if the distance is greater, the node is still marked walkable even though the ray eventually hit an unwalkable object.

This approach effectively measures the available space between the floor and any unwalkable objects above it. If a wall is set directly onto the floor, as many are, the distance will be very small, so the node is correctly set unwalkable. If there is a walkable door or a large opening such as an arch, the distance between the floor and the wall above the door or arch should be large enough that the system still marks the area walkable.
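A minimal sketch of this check; the layer masks, height constant, and method shape are my assumptions, not the project's actual code:

using UnityEngine;

public class UpwardRaycastCheck : MonoBehaviour
{
    [SerializeField] private LayerMask walkableMask;
    [SerializeField] private LayerMask unwalkableMask;
    [SerializeField] private float minOpeningHeight = 2.0f; // assumed height check

    public bool IsNodeWalkable(Vector3 originBelowFloor)
    {
        // First upward ray: the first walkable hit is assumed to be the floor
        // (the system only handles a single level).
        if (!Physics.Raycast(originBelowFloor, Vector3.up, out RaycastHit floorHit,
                Mathf.Infinity, walkableMask))
        {
            return false; // no floor found
        }

        // Second upward ray from the floor: look for anything unwalkable above.
        if (Physics.Raycast(floorHit.point, Vector3.up, out RaycastHit obstructionHit,
                Mathf.Infinity, unwalkableMask))
        {
            // Walkable only if the opening between the floor and the obstruction
            // (e.g., the wall above a door) is tall enough.
            return (obstructionHit.point.y - floorHit.point.y) >= minOpeningHeight;
        }

        return true; // nothing unwalkable above: open space
    }
}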

Sphere Collision Check for Obstacles

Similarly to the original system, we still wanted a sphere collision check to catch obstacles very close to nodes, so that obstacles would not slip between the rays cast and effectively become walkable. We included this check in much the same way, but it is worth noting that the initial hit used is now below the floor, so the thickness of the floor must be accounted for. Currently a larger check radius is simply needed so the sphere can reach up through the floor. In future edits, it could make sense to store a constant floor thickness and perform the sphere check that distance above the initial raycast contact point.

Test Results

Details

I compared the two raycast methods in a few areas of our current test project. In the following images, the nodes are visualized with Unity's cube gizmos. The yellow nodes are “walkable” and the red nodes are “unwalkable”. Most of the large white objects are walls, and the objects highlighted with a light blue material are the doors being observed.

Test 1

A high density door area was chosen to observe the node results of both systems.

The downward check can work for doors when the ray does not directly hit a wall, as the sphere check properly marks them walkable from the floor. However, the rays hitting the walls can be seen in the unwalkable nodes placed on top of them. A clear barrier is formed along the entirety of the walls, effectively making the doors unwalkable.

The upward raycast test clearly shows that every door has at least a single-node-wide gap of walkable nodes in every case. The doors that were originally unwalkable with the downward raycast are now walkable, as the height check was met.

Test 1: Downward Raycast

Test 1: Upward Raycast

Test 2

A larger room with a single notable door was observed.

The downward check poses a problem here, as a full line of unwalkable nodes can be seen on the floor blocking access through the door and into the room. Because the problem shows up as nodes on the floor rather than nodes on top of the wall, this is actually a case where the sphere collision check is the problem rather than the raycast itself. Changing the collision radius for the check could potentially solve the issue here.

The upward raycast is able to cleanly present a walkable path through this door into the room. While this gives the result we desired, it should be noted again that this difference can be attributed to the difference in the sphere collision check for obstacles. The same radius was used for both tests, but the upward raycast's sphere originates from the bottom of the floor, so the extra distance it has to travel through the floor is what really opens up this path.

Test 2: Downward Raycast

Test 2: Upward Raycast

Summary

The upward raycast seems extremely promising in providing the results we wanted for openings, doors especially. The tests clearly demonstrate that it helps with a major issue case the downward check had, so it is at worst a significant upgrade to the original approach. The upward raycast with distance check also succeeded at other door locations, supporting its consistency. It will still have trouble from time to time with very narrow openings, but it should work in a majority of cases.

via Blogger http://stevelilleyschool.blogspot.com/2021/01/architecture-ai-pathing-project-upward.html

Architecture AI Pathing Project: Revit Data Applying Helper Class

December 9, 2020

File Reader

Architecture AI Project


Enhancing the Extendability of the Revit Parameter Data Application Class through a Helper Class

I got the basics of reading a specified set of Revit parameter data into Unity to apply to the model, but we will need options for using many sets of data from there. To handle this, I wanted to make it easy to add ways to handle all the different types of logic associated with these data sets. This started with creating a foundational interface that any class utilizing the data would implement, but I also needed a way to tie these specific classes to their corresponding data sets. I ended up doing this mostly with a helper class to the CSVReaderRevitModel class, named RevitModelDataHandlerManager.

Connecting Specific Data Handler Classes with Specific Data Sets Using a Dictionary

Because of the nature of this data, we know we will be searching for several specific strings somewhere along the way, so I wanted to hard-code those strings in one global area. That way, if anything needs to be modified or added, I only have to do it in one location, which also reduces string input errors along the way. This was applied to the construction of a dictionary in the new helper class, RevitModelDataHandlerManager.

RevitModelDataHandlerManager contains a hard coded initialized dictionary which associates a string term with a specific class implementing the IRevitModelDataHandler interface:

  • key = string of sheet name
  • value = class implementing IRevitModelDataHandler specifically tied to that type of sheet

This way, once the name of the sheet of interest is known, it can be entered here as the key for its proper data set a single time. Then, any time that type of data is filtered through the system, this dictionary determines exactly which class implementing the IRevitModelDataHandler interface to use, so the data is translated into the proper functions within the system.

RevitModelDataHandlerManager then has a method that takes the two inputs required by any IRevitModelDataHandler class (the GameObject being modified and the string value used for modification), plus a third string input that determines which IRevitModelDataHandler to use from the dictionary (the string name associated with a particular handler, generally a sheet name). All the sheet names from CSVReaderRevitModel can then be funneled through RevitModelDataHandlerManager, which uses the constructed dictionary to determine which IRevitModelDataHandler classes to use for which data.
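A hedged sketch of what this manager might look like; the sheet names, method name, and handler stubs are illustrative assumptions:

using System.Collections.Generic;
using UnityEngine;

public interface IRevitModelDataHandler
{
    void ModifyModelWithData(GameObject modelObject, string dataValue);
}

public class RevitDoorDataHandler : IRevitModelDataHandler
{
    public void ModifyModelWithData(GameObject modelObject, string dataValue) { /* door logic */ }
}

public class RevitWallDataHandler : IRevitModelDataHandler
{
    public void ModifyModelWithData(GameObject modelObject, string dataValue) { /* wall logic */ }
}

public class RevitModelDataHandlerManager
{
    // key = string of sheet name, value = handler class tied to that type of sheet
    private readonly Dictionary<string, IRevitModelDataHandler> handlers =
        new Dictionary<string, IRevitModelDataHandler>
        {
            { "Doors", new RevitDoorDataHandler() },
            { "Walls", new RevitWallDataHandler() }
        };

    // Routes one row of sheet data to the correct handler by sheet name.
    public void ApplyData(string sheetName, GameObject target, string value)
    {
        if (handlers.TryGetValue(sheetName, out IRevitModelDataHandler handler))
        {
            handler.ModifyModelWithData(target, value);
        }
        else
        {
            Debug.LogWarning("No data handler registered for sheet: " + sheetName);
        }
    }
}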

Summary

After applying all these modifications, testing showed the system working well and providing results similar to before, when the system was more rigid and only handled a single data set. Testing multiple data sets ran smoothly and operated as intended.

Right now the system specifically accepts combinations of a sheet name and a single column name (the column holding the data values it is looking for). It may make sense in the future to use a string array for the column names, because the user may want to search for several types of data within the same sheet at a given time. This can technically be accomplished currently by passing in the same sheet name multiple times and associating a different column name with it each time, but that is not the most efficient process.

Next Step

The foundation of the Revit parameter data reading system is basically fully functional at this time; it is just a matter of determining the various IRevitModelDataHandler classes to create to handle all of our needs for now. Fixing the raycasting system to handle obstacle detection is the next major step to look towards.

via Blogger http://stevelilleyschool.blogspot.com/2020/12/architecture-ai-pathing-project-revit.html

Architecture AI Pathing Project: Applying Revit Parameter Data to Model in Unity

December 8, 2020

File Reader

Architecture AI Project


Applying Revit Parameter Data to Model in Unity

After reading the Revit parameter data into Unity, it could now be used to modify the model within Unity itself. The original goal was to read the data to determine which objects within the model should be placed on which Unity layer, specifically the “Unwalkable” layer in most cases. It should then be extended to add components to specific objects as well, meaning it needs a diverse set of functionality available when modifying these individual objects.

Building Data Arrays as Dictionaries for Flexibility

Since the full extent of the use of this data is not yet known, having a flexible and extendable way to organize and search through the data made sense here. As the data is read in one sheet at a time and placed into separate 2D string arrays, I decided to organize those arrays within a dictionary. The keys of the dictionary are single strings, the sheet names, and the values are the entire 2D string arrays. This allows various methods to easily find a single array (or multiple arrays) of data for their specific needs without searching through all the data read in.
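A minimal sketch of this layout; the variable names and the sample sheet contents are assumptions:

using System.Collections.Generic;
using UnityEngine;

public static class SheetDataExample
{
    public static void Build()
    {
        var sheetData = new Dictionary<string, string[,]>();

        // Each sheet read in is stored whole under its sheet name.
        sheetData["Doors"] = new string[,]
        {
            { "ID",  "Function"   },
            { "101", "Unwalkable" }
        };

        // Methods can grab just the sheet they need without scanning everything.
        string[,] doors = sheetData["Doors"];
        Debug.Log(doors[1, 0]); // "101"
    }
}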

Interface for Building Classes to Handle Various Functions for Modifying Objects in Model

Because we will need a wide variety of functionality for modifying the objects in the model (starting with either changing layers or adding various components), and this will consistently involve a GameObject (the object being modified) and some string value (the data read from the Revit parameters), I looked to creating an interface foundation for this system.

I created the IRevitModelDataHandler interface and started with two classes implementing it: RevitDoorDataHandler and RevitWallDataHandler. IRevitModelDataHandler has a single method, named ModifyModelWithData, that takes a GameObject and a string as inputs. RevitDoorDataHandler uses that method to apply logic to doors in the scene, and RevitWallDataHandler applies logic to walls. At this time both modify layers, though they currently do so in different ways. These classes are both invoked from a centralized location (currently the CSVReaderRevitModel class, though this may be moved elsewhere for organization).
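The interface itself is small; the door handler body below is an illustrative assumption of the layer-changing logic, not the project's exact code:

using UnityEngine;

public interface IRevitModelDataHandler
{
    void ModifyModelWithData(GameObject modelObject, string dataValue);
}

public class RevitDoorDataHandler : IRevitModelDataHandler
{
    public void ModifyModelWithData(GameObject modelObject, string dataValue)
    {
        // e.g., place doors flagged by the Revit parameter on the Unwalkable layer
        if (dataValue == "Unwalkable") // assumed parameter value
        {
            modelObject.layer = LayerMask.NameToLayer("Unwalkable");
        }
    }
}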

Finding the Objects to Modify

I took the approach of finding objects within the model based on their ID values, and this worked well. Each sheet has an ID column as its first column, so the ID can consistently be located in the same place for all data. As the system goes through the sheets of interest one row at a time, it takes the ID from the first column and searches through the GameObjects in the architecture model until it finds one whose name contains that ID number. Once found, it knows this is the object it will modify with that row's data.
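A rough sketch of that lookup (the method shape is assumed):

using UnityEngine;

public static class RevitModelSearch
{
    public static GameObject FindByRevitId(Transform modelRoot, string id)
    {
        // Revit-exported object names contain the element ID, so a simple
        // name search matches a data row to its scene object.
        foreach (Transform child in modelRoot.GetComponentsInChildren<Transform>())
        {
            if (child.name.Contains(id))
            {
                return child.gameObject;
            }
        }
        return null;
    }
}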

Finding and Applying the Correct Data

The data we are interested in for any given functionality is determined by the name of the column (the column header). When preparing a specific functionality, the system knows which header title to search for, and upon finding it, that column index is noted and retained. Then, as the system goes through a sheet of interest one row at a time, it knows which column holds the data it will actually apply to the found object. Both this value and the found GameObject can then be passed as input parameters to any class implementing the IRevitModelDataHandler interface.

Summary

With flexibility and extendability at the forefront, I built the Revit parameter model-modifying system around a dictionary of 2D string data arrays and an interface system to apply the various logic necessary. So far this appears to be an effective approach, and it is working in test runs to apply the “Unwalkable” layer to specific doors noted by the Door Revit parameter data.

Reorganizing where some of the core system methods live could be beneficial. The CSVReaderRevitModel class holds a lot of the major methods right now and uses an awkward switch statement to determine which interface-implementing classes to use for which data. This is okay for now, but it will not scale well. Ideally, the interface implementation should provide an avenue to mitigate this through more proper use of polymorphism.

Next Step

Reorganizing and cleaning the underlying data application system for the Revit parameter data is a clear next step, so hopefully I can clean out the CSVReaderRevitModel class and use the interfaces more properly. I then need to get it working with several sheets of data (namely Doors and Walls), and then implement a version that can apply components, since that will also be needed in the near future. The next large goal remains updating the raycasting system of the project.

via Blogger http://stevelilleyschool.blogspot.com/2020/12/architecture-ai-pathing-project.html