Components of Game Engines


In this essay I will discuss a range of components that make up a game engine, the definition of which was established in a previous essay. I will tackle a number of different components, explaining what they are and how they can affect a game.


In computer graphics, Level of Detail (LOD) involves decreasing the complexity of a 3D object as it moves further away from the player's location, or according to other measures such as object importance or position.


In the image above, from The Elder Scrolls V: Skyrim (using a LOD mod), you can see that in the left image the waterfall and water are static, with no animation for the waterfall, while on the right the increased level of detail allows the water to be animated and the waterfall animation to continue.
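The core of a distance-based LOD system is very simple: measure how far the object is from the camera and pick a mesh variant accordingly. A minimal sketch, with made-up distance thresholds and mesh names (these are assumptions for illustration, not values from any particular engine):

```python
import math

# Hypothetical thresholds: (max distance, mesh variant to use)
LOD_THRESHOLDS = [
    (20.0, "lod0_high"),
    (60.0, "lod1_medium"),
    (float("inf"), "lod2_low"),
]

def select_lod(camera_pos, object_pos):
    """Pick a mesh variant based on the object's distance from the camera."""
    dist = math.dist(camera_pos, object_pos)
    for max_dist, mesh in LOD_THRESHOLDS:
        if dist <= max_dist:
            return mesh

print(select_lod((0, 0, 0), (5, 0, 0)))    # nearby -> lod0_high
print(select_lod((0, 0, 0), (100, 0, 0)))  # far away -> lod2_low
```

Real engines also hysteresis-smooth these transitions so objects don't visibly "pop" between variants at the threshold distance.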


In computer graphics, lighting is the simulation of light within a scene. Software such as Radiance attempts highly accurate photorealistic rendering, although videogames tend to employ non-photorealistic rendering in favour of more stylised lighting in scenes.


In the images above, again taken from The Elder Scrolls V: Skyrim (although this time the bottom image uses the Realistic Lighting Overhaul mod), you can see a huge difference in the lighting between the scenes and how it can affect the game. In contrast to the top image from the vanilla version of the game, the bottom image is darker overall but seems to have more depth in its lighting: it is more obvious that there are multiple interacting light sources, while in the top image everything seems to be lit by a single source.
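At the heart of most game lighting models is the diffuse (Lambertian) term: a surface is brightest when it faces the light directly, and darkens as it turns away. A small sketch of that idea (not how any particular engine implements it):

```python
def normalize(v):
    """Scale a vector to unit length."""
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def lambert(normal, light_dir, intensity=1.0):
    """Diffuse term: brightness falls off with the angle to the light,
    clamped so surfaces facing away receive nothing."""
    return intensity * max(0.0, dot(normalize(normal), normalize(light_dir)))

print(lambert((0, 1, 0), (0, 1, 0)))  # light directly overhead -> 1.0
print(lambert((0, 1, 0), (1, 0, 0)))  # light at a grazing angle -> 0.0
```

Summing this term over several lights is what produces the "multiple interacting light sources" feel described above; a single-light scene flattens out the same way the vanilla screenshot does.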


Texture mapping is a way to add detail and colour to a computer-generated graphic or 3D model. It was pioneered in 1974 by Edwin Catmull, now President of Pixar and Walt Disney Animation Studios.

The image below displays a 3D model with (2) and without (1) textures.
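The mapping itself works by giving each point on the model a pair of UV coordinates in [0, 1] that address into the texture image. A toy sketch of the simplest lookup, nearest-neighbour sampling, using a 2x2 checkerboard as a stand-in for real image data:

```python
# A tiny 2x2 "texture" (an illustrative stand-in for a real image).
TEXTURE = [
    ["black", "white"],
    ["white", "black"],
]

def sample_nearest(texture, u, v):
    """Map UV coordinates in [0, 1] to the nearest texel."""
    h, w = len(texture), len(texture[0])
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return texture[y][x]

print(sample_nearest(TEXTURE, 0.1, 0.1))  # top-left -> black
print(sample_nearest(TEXTURE, 0.9, 0.1))  # top-right -> white
```

GPUs typically use bilinear or trilinear filtering instead of nearest-neighbour, blending the surrounding texels to avoid a blocky look up close.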



Fogging, or distance fog, is a technique used in 3D computer graphics to enhance the perception of distance by simulating a fog effect. Because rendering distant scenery in full detail is typically expensive, a fog effect is used so that objects further away are obscured by "fog", also hiding the point at which they stop being drawn.

One of the most famous examples of fogging in videogames is Silent Hill, which worked the effect into its storyline: the town of Silent Hill is shrouded in a layer of thick fog.


However, despite technological advances negating fogging as a graphical necessity, the feature was kept in later Silent Hill games.
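Under the hood, fogging is just a blend between the object's colour and the fog colour, weighted by distance. A sketch of the two classic fog curves (linear and exponential, as found in old fixed-function graphics pipelines):

```python
import math

def linear_fog(distance, fog_start, fog_end):
    """Blend factor: 0 = no fog, 1 = fully fogged, ramping between the bounds."""
    t = (distance - fog_start) / (fog_end - fog_start)
    return min(1.0, max(0.0, t))

def exp_fog(distance, density):
    """Exponential fog: thickens smoothly with distance, never quite reaching 1."""
    return 1.0 - math.exp(-density * distance)

def apply_fog(color, fog_color, f):
    """Blend an RGB colour toward the fog colour by factor f."""
    return tuple((1 - f) * c + f * fc for c, fc in zip(color, fog_color))

# An object well past fog_end is hidden entirely by the fog colour --
# exactly the effect Silent Hill leaned on.
print(apply_fog((1.0, 0.0, 0.0), (0.5, 0.5, 0.5), linear_fog(200, 10, 100)))
```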


Ambient occlusion is a method of measuring how exposed each part of a surface is to ambient lighting. It adds a level of depth to what would otherwise typically be a flat lighting scenario.


The above image shows how ambient occlusion can enhance an image, with corners and crevices appearing darker in the final image rather than the lighting appearing "flat".
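The arithmetic behind that darkening is straightforward: sample a set of directions over the hemisphere above a surface point, count how many are blocked by nearby geometry, and dim the ambient light by the exposed fraction. A sketch of just that final step (it assumes the blocked-sample counts come from a separate ray or depth test, which is where real AO techniques differ):

```python
def ambient_occlusion(blocked_samples, total_samples):
    """AO term: the fraction of hemisphere directions NOT blocked by geometry.
    1.0 = fully exposed (bright), values near 0 = tucked-away corner (dark)."""
    return 1.0 - blocked_samples / total_samples

def shade_ambient(ambient_brightness, ao):
    """Ambient light is simply scaled down by the occlusion term."""
    return ambient_brightness * ao

open_ground = ambient_occlusion(blocked_samples=2, total_samples=16)
corner = ambient_occlusion(blocked_samples=12, total_samples=16)
print(open_ground, corner)  # 0.875 0.25 -- the corner ends up much darker
```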


Shadow mapping is the process of adding shadows to 3D computer graphics; the concept was introduced by Lance Williams in 1978. It is based around testing whether a pixel is visible from a light source, by comparing its depth to a z-buffer/depth image rendered from the light source's point of view and stored as a texture.


The above images show the contrast between an image with no shadows and one with shadows; as you can see in the image on the right, the addition of shadows greatly enhances the sense of depth in the scene.
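The depth comparison at the centre of shadow mapping can be sketched in a few lines. Here the "render from the light's view" pass is faked with a plain dictionary of occluder depths, which is an assumption for illustration; only the comparison logic mirrors the real technique:

```python
def build_shadow_map(occluder_depths, size):
    """Depth image from the light's point of view: nearest occluder per texel.
    occluder_depths maps texel index -> depth; a stand-in for a real depth pass."""
    shadow_map = [float("inf")] * size
    for texel, depth in occluder_depths.items():
        shadow_map[texel] = min(shadow_map[texel], depth)
    return shadow_map

def in_shadow(shadow_map, texel, pixel_depth_from_light, bias=0.01):
    """A pixel is shadowed if something sits closer to the light along its ray.
    The small bias avoids surfaces shadowing themselves ("shadow acne")."""
    return pixel_depth_from_light - bias > shadow_map[texel]

shadow_map = build_shadow_map({3: 5.0}, size=8)
print(in_shadow(shadow_map, 3, 9.0))  # occluder at depth 5 blocks a pixel at 9 -> True
print(in_shadow(shadow_map, 4, 9.0))  # nothing nearer along this texel -> False
```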


BSP (Binary Space Partitioning) was developed with 3D computer graphics in mind: the structure of a BSP tree allows information about the objects in a scene that is useful for rendering, such as their drawing order from a given viewpoint, to be accessed rapidly.

It also gives its name to the BSP file format prominently used by Valve's Source engine.
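The classic rendering use of a BSP tree is getting a correct back-to-front drawing order almost for free: at each node, recurse into the half of the scene on the far side of the splitting plane first. A sketch with the planes reduced to wall positions along a single axis (a deliberate simplification; real trees split 3D space with arbitrary planes):

```python
class BSPNode:
    """One splitting plane (reduced here to a 1D wall position) plus children."""
    def __init__(self, split, front=None, back=None):
        self.split = split
        self.front = front
        self.back = back

def back_to_front(node, viewer, out):
    """Visit walls farthest-first relative to the viewer -- the classic
    painter's-algorithm traversal that BSP trees make cheap."""
    if node is None:
        return
    if viewer >= node.split:        # viewer is on the 'front' side
        back_to_front(node.back, viewer, out)
        out.append(node.split)
        back_to_front(node.front, viewer, out)
    else:                           # viewer is on the 'back' side
        back_to_front(node.front, viewer, out)
        out.append(node.split)
        back_to_front(node.back, viewer, out)

# Walls at x = 2, 5 and 8, pre-built into a tiny tree
tree = BSPNode(5, front=BSPNode(8), back=BSPNode(2))
order = []
back_to_front(tree, viewer=10, out=order)
print(order)  # [2, 5, 8] -- the farthest wall is drawn first
```

Moving the viewer to the other side of the scene reverses the order without rebuilding anything, which is why the tree is precomputed once and traversed every frame.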


In 3D computer graphics, occlusion culling is a process that determines which surfaces, and which parts of surfaces, are not visible from a certain viewpoint. It can greatly enhance performance in games with expansive and complex scenes such as The Elder Scrolls V: Skyrim.
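The principle can be shown in one screen dimension: walk the objects front to back, and cull anything whose screen extent is already fully covered by nearer geometry. This is a toy illustration of the idea only; real engines use hierarchical depth buffers, hardware occlusion queries, or precomputed visibility sets:

```python
def fully_covered(lo, hi, intervals):
    """Check whether [lo, hi] lies completely inside the union of intervals."""
    point = lo
    for a, b in sorted(intervals):
        if a > point:
            return False        # a gap before our interval is covered
        point = max(point, b)
        if point >= hi:
            return True
    return point >= hi

def visible_objects(objects):
    """Toy occlusion cull: objects are (distance, screen_min, screen_max);
    anything fully hidden behind nearer objects is dropped."""
    covered, visible = [], []
    for dist, lo, hi in sorted(objects):        # front to back
        if not fully_covered(lo, hi, covered):
            visible.append((dist, lo, hi))
        covered.append((lo, hi))
    return visible

scene = [
    (1, 0, 10),   # a near wall covering screen interval [0, 10]
    (5, 2, 8),    # completely hidden behind the wall -> culled
    (5, 9, 12),   # pokes out past the wall's edge -> still visible
]
print(visible_objects(scene))  # [(1, 0, 10), (5, 9, 12)]
```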


Rendering is the process of generating an image from a 2D or 3D model by computer programs; the result can be called a render. There are several techniques involved in rendering, as accurately rendering an entire image from scratch would take an impractical amount of time, so certain techniques and restrictions are employed to achieve the desired result.

Rasterization, including scanline rendering, geometrically projects objects in the scene onto an image plane. Ray casting renders the scene as observed from a particular point of view by firing a ray through each pixel and finding the first object it hits, while ray tracing is similar to ray casting but more advanced, following rays as they bounce between surfaces, and typically achieves more photorealistic effects, although usually at a huge performance cost.
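The building block shared by ray casting and ray tracing is the ray-object intersection test. The sphere is the classic example because it reduces to a quadratic equation; a self-contained sketch (the scene setup is invented for illustration):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to its first intersection with a sphere,
    or None on a miss. direction must be normalised (quadratic a-term = 1)."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None                     # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0    # nearer of the two roots
    return t if t >= 0 else None

# Camera at the origin looking down +z at a sphere centred at z = 5
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
print(ray_sphere_hit((0, 0, 0), (0, 1, 0), (0, 0, 5), 1.0))  # None (miss)
```

A ray caster stops at this first hit and shades it; a ray tracer spawns further rays from the hit point (reflections, shadows, refractions), which is where the extra realism and the extra cost both come from.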



Anti-aliasing refers to a number of different techniques used to combat aliasing: the distortion or artefacts that appear when the rendered image differs from the original image.



Spatial anti-aliasing is used when representing a high-resolution image at a lower resolution; supersample (SSAA) and multisample (MSAA) anti-aliasing are versions of spatial anti-aliasing.
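Supersampling is the easiest of these to sketch: render at several sub-pixel positions and average the results, so a hard edge through a pixel becomes a blend rather than a jagged step. The `render_subpixel` callback here stands in for a real renderer:

```python
def supersample(render_subpixel, x, y, factor=2):
    """SSAA: sample the scene at factor*factor sub-pixel positions within
    pixel (x, y) and average them into one final colour."""
    total, n = 0.0, factor * factor
    for i in range(factor):
        for j in range(factor):
            sx = x + (i + 0.5) / factor
            sy = y + (j + 0.5) / factor
            total += render_subpixel(sx, sy)
    return total / n

# A hard vertical edge: bright for x < 0.5, dark after -- a classic aliasing case.
edge = lambda x, y: 1.0 if x < 0.5 else 0.0
print(supersample(edge, 0, 0))  # 0.5 -- the edge pixel is smoothed to grey
```

MSAA gets a similar visual result more cheaply by taking multiple coverage/depth samples per pixel while shading each pixel only once.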



Pathfinding is the plotting, by a computer program, of the shortest route between two points. In videogames it is often a source of frustration, for example in missions where you have to escort NPCs who get caught on corners of the environment, or in large-scale RTS games like the Total War series where soldiers and units don't move where you want them to in a logical manner.
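Grid-based pathfinding can be sketched with breadth-first search, which always finds a shortest route on an unweighted grid; games usually reach for A*, which is BFS plus a distance heuristic to explore fewer cells. A minimal version on an invented map:

```python
from collections import deque

def shortest_path(grid, start, goal):
    """BFS on a grid of 0 (walkable) / 1 (blocked) cells; coordinates are (x, y).
    Returns the list of cells from start to goal, or None if no route exists."""
    h, w = len(grid), len(grid[0])
    came_from = {start: None}
    queue = deque([start])
    while queue:
        x, y = queue.popleft()
        if (x, y) == goal:
            path, node = [], goal
            while node is not None:          # walk the breadcrumbs back
                path.append(node)
                node = came_from[node]
            return path[::-1]
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < w and 0 <= ny < h and grid[ny][nx] == 0 \
                    and (nx, ny) not in came_from:
                came_from[(nx, ny)] = (x, y)
                queue.append((nx, ny))
    return None  # no route -- the NPC is well and truly stuck

grid = [
    [0, 1, 0],
    [0, 1, 0],
    [0, 0, 0],
]
print(shortest_path(grid, (0, 0), (2, 0)))  # routes around the wall via the bottom row
```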



Inverse kinematics (IK) computes how a character's joints should move so that the body reacts believably to a situation in a videogame. For instance, a character who crouches may actually bend their knees and lower themselves, with their body reacting and animating accordingly; or, in a game like The Last of Us, when you take cover the main character will put a hand up to steady himself against the object, and if his companion takes cover alongside him he will raise his arm so she can duck under, with the rest of his body animating accordingly.
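The simplest IK case, two bones reaching for a target (hip-knee-ankle, or shoulder-elbow-hand), has a closed-form answer via the law of cosines. A 2D sketch of that analytic solve; real character rigs layer constraints and blending on top of this:

```python
import math

def two_bone_ik(l1, l2, target_x, target_y):
    """Analytic two-bone IK in 2D: given bone lengths and a target for the
    end effector, return (root angle, middle-joint interior angle) in radians."""
    d = math.hypot(target_x, target_y)
    d = min(d, l1 + l2)  # clamp unreachable targets to full extension
    # Interior angle at the knee/elbow, from the law of cosines
    cos_mid = (l1**2 + l2**2 - d**2) / (2 * l1 * l2)
    mid = math.acos(max(-1.0, min(1.0, cos_mid)))
    # Root angle = direction to target plus the triangle's offset angle
    cos_off = (l1**2 + d**2 - l2**2) / (2 * l1 * d)
    root = math.atan2(target_y, target_x) + math.acos(max(-1.0, min(1.0, cos_off)))
    return root, mid

root, knee = two_bone_ik(1.0, 1.0, 2.0, 0.0)  # target exactly at full reach
print(round(math.degrees(knee)))  # 180 -> the leg is fully straight
```

Pulling the target closer shrinks the interior angle, which is exactly the knee bending as a character crouches.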


Particle systems are techniques in physics and computer graphics that use a large number of small objects or sprites to simulate phenomena which are otherwise hard to reproduce. These can include fire, explosions, moving water (the Skyrim picture at the start of this essay displays a static and an animated waterfall of sorts), sparks, leaves, snow, fog, dust and more.


Note the fiery spell effect from World of Warcraft, as shown in the above image.



Havok may be the most well-known example of a middleware physics engine. While it can be customised to the developer's needs and wishes, it focuses on real-time collision detection and realistic physics in 3D environments. It runs on a large range of systems including Windows, Linux, Android, iOS, Mac OS X, Xbox One, PlayStation 4 and Wii U, among others.

