The Quest system is used to unlock various skills, award experience and skill points, and drive progression through the game. Each quest can have multiple objectives and may optionally include waypoints that show up on the compass.
Objectives are modular and may be reused in any quest; for example, returning to a particular character after doing a task, or completing a previous objective. Each objective is completed by performing specific tasks, which the objective itself monitors. Because objectives are blueprints with C++ base classes, they have all the functionality of any other blueprint. Once all objectives are completed, the quest itself is completed and any skill/experience points are awarded.
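To illustrate the quest/objective split described above, here is a minimal, engine-agnostic C++ sketch. The actual system is built from UE4 blueprints with C++ base classes; the class names here (Objective, ReturnToCharacterObjective, Quest) are hypothetical, chosen only to show the pattern of self-monitoring objectives and a quest that completes when all of them do.

```cpp
#include <memory>
#include <string>
#include <vector>

// Hypothetical base class: each objective monitors its own completion.
class Objective {
public:
    virtual ~Objective() = default;
    virtual bool IsComplete() const = 0;
    virtual std::string Description() const = 0;
};

// A reusable objective: return to a named character after doing a task.
class ReturnToCharacterObjective : public Objective {
public:
    explicit ReturnToCharacterObjective(std::string who) : who_(std::move(who)) {}
    void NotifyReached() { reached_ = true; }  // called by an overlap/trigger
    bool IsComplete() const override { return reached_; }
    std::string Description() const override { return "Return to " + who_; }
private:
    std::string who_;
    bool reached_ = false;
};

class Quest {
public:
    void AddObjective(std::shared_ptr<Objective> o) {
        objectives_.push_back(std::move(o));
    }
    // The quest completes only once every objective reports complete.
    bool IsComplete() const {
        for (const auto& o : objectives_)
            if (!o->IsComplete()) return false;
        return !objectives_.empty();
    }
private:
    std::vector<std::shared_ptr<Objective>> objectives_;
};
```

Because any Objective subclass can be added to any Quest, the same objective type can be reused across quests, which is the modularity the system relies on.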
When an objective spawns, it shows an information popup so the player knows what the current objective is, and it also updates the journal and active-quest UI to indicate which objective is currently active. The image below shows a test level, with test characters and a test quest, with the first objective active.
The InterpActors allow any mesh, or hierarchy of meshes, to have its local transformations (translation, rotation and scale) interpolated over time, at a constant or varying speed, either cycling through or ping-ponging between the beginning and end transforms. They can also be set up to interpolate in response to triggers, for example opening a door and auto-closing it after a set time, using these blueprints.
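The cycle vs. ping-pong behaviour reduces to how elapsed time maps onto an interpolation parameter. This is a simplified, standalone sketch of that mapping, not the actual blueprint API; the function names and the per-component Lerp are illustrative.

```cpp
#include <cmath>

// Cycle restarts from the beginning each pass; PingPong reverses direction.
enum class LoopMode { Cycle, PingPong };

// Maps elapsed seconds to an interpolation alpha in [0, 1].
// "duration" is one traversal from the start transform to the end transform.
double InterpAlpha(double elapsed, double duration, LoopMode mode) {
    double t = std::fmod(elapsed / duration,
                         (mode == LoopMode::PingPong) ? 2.0 : 1.0);
    if (mode == LoopMode::PingPong && t > 1.0)
        t = 2.0 - t;  // travelling back from the end transform to the start
    return t;
}

// Linear blend applied per transform component (translation/rotation/scale).
double Lerp(double a, double b, double alpha) { return a + (b - a) * alpha; }
```

A trigger-driven door would simply drive `elapsed` forward on open and, after the auto-close delay, back toward zero.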
The DirectMoveActors can move any character, including AI, in a specific direction at a specific speed; when combined, they increase the movement speed as the character passes through the objects, based on triggers/overlap spheres.
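One way the combined speed-up could work is for each overlapped volume to contribute a multiplier, so passing through several at once compounds the boost. This is a hypothetical sketch of that stacking; the 1.5x per-volume factor is an assumed illustrative value, not taken from the actual actors.

```cpp
// Each DirectMove volume the character currently overlaps multiplies the
// base speed; overlapping several volumes compounds the boost.
// perVolumeMultiplier is an illustrative assumption, not an engine value.
double EffectiveSpeed(double baseSpeed, int overlappingVolumes,
                      double perVolumeMultiplier = 1.5) {
    double speed = baseSpeed;
    for (int i = 0; i < overlappingVolumes; ++i)
        speed *= perVolumeMultiplier;
    return speed;
}
```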
This post details the high-level workflow I used to create a huge 8x8 km open world, densely populated with vegetation, trees, grass and mountains, using Unreal Engine 4 World Composition along with several other tools, which I will go through briefly.
This character was created to learn Maya (modelling, UV mapping, rigging and skinning, blend shapes, animation) and Substance Painter + Designer, and to establish the workflow between Unreal Engine and all of the aforementioned tools.
It takes a long time to learn something, but it takes longer to learn a lot of somethings. Learning Maya is like learning a new programming language: you keep finding new things the longer you use it, and in this case it was a very different experience from using 3DS Max.
World building for the towns and Castle of Drescalus is nearing completion, though it will still need polish here and there. In total there are 34 scenes: each shop, each house, the Drescalus exterior, and the Castle areas are scenes of their own, making this world quite large.
Game Development Started: Nov 2013 (3 Months to complete Phase I, World Building)
Current Stats: World Build Completion Level: 85%
There are 2 towns in Drescalus, each with 10 houses. There is one town square with 5 shops, and one castle containing 2 wings and many different rooms and areas of interest.
All houses, shops and interior geometry can be entered via doors; at the moment the shops are empty, as game content is still being created.
The goal of the last few months has been to build new, or use existing, technology to allow for a huge open-world type of game that will typically run on high-end current hardware.
This post will detail creating the rocks and trees, and the tools created/used in the process of building the final scene for morphing and LOD tests in a huge open world. Every bit of visible scenery/geometry (except the clouds) is traversable terrain that any character can walk to.
This post will detail the workflow for creating clothing using Max's Garment Maker and Cloth modifier, including automatic skinning and extracting morph targets to match character body-part customizations, such as body muscle size, using Max's SkinWrap and custom MAXScripts.
The system already allows for different accessories such as hair and facial hair, with their morph targets for facial and customization animation, but now also calculates tangents to enable bump-mapped shaders, along with heavily optimized vertex and tangent caches.
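Tangent calculation for bump/normal mapping typically uses the textbook per-triangle formulation: solve for the direction in which the texture's U coordinate increases across the triangle, from the edge vectors and UV deltas. The sketch below is that standard derivation, not the engine's exact code, and omits per-vertex averaging and orthonormalization.

```cpp
#include <array>

struct Vec3 { double x, y, z; };
struct Vec2 { double u, v; };

// Per-triangle tangent from positions and UVs: the tangent is the object-
// space direction along which U increases, solved from the two triangle
// edges and their UV deltas. Assumes the UV mapping is not degenerate.
Vec3 TriangleTangent(const std::array<Vec3, 3>& p, const std::array<Vec2, 3>& uv) {
    Vec3 e1{p[1].x - p[0].x, p[1].y - p[0].y, p[1].z - p[0].z};
    Vec3 e2{p[2].x - p[0].x, p[2].y - p[0].y, p[2].z - p[0].z};
    double du1 = uv[1].u - uv[0].u, dv1 = uv[1].v - uv[0].v;
    double du2 = uv[2].u - uv[0].u, dv2 = uv[2].v - uv[0].v;
    double r = 1.0 / (du1 * dv2 - du2 * dv1);  // inverse UV-area determinant
    return {(e1.x * dv2 - e2.x * dv1) * r,
            (e1.y * dv2 - e2.y * dv1) * r,
            (e1.z * dv2 - e2.z * dv1) * r};
}
```

In a full pipeline, per-triangle tangents are accumulated onto shared vertices and renormalized, which is also where the vertex and tangent caches earn their keep.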
The morph targets were created in Max. Those for the accessories (hair, beard, etc.; clothing and armor coming soon too) were auto-generated by scripts from the base character's morph targets, which were created manually to alter the facial and body features of the character.
These characters were modelled, fully UV mapped, plainly textured (solid skin color only, for now), rigged and skinned for biped skeletal animation, and contain a few base animations to test all major joints. They will serve as a base to create other characters from, using a custom Face and Body Morph MAXScript.
A base low-poly mesh was created, UV mapped, rigged and skinned for skeletal animation; then selected face and body parts/properties are morphed between minimum and maximum extents, e.g. eye spacing, eye height, nose bridge width, ears, cheeks, jaw, etc.
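Morph-target (blend-shape) evaluation of the kind described above reduces to adding weighted per-vertex deltas onto the base mesh, with the weight sliding each feature between its minimum and maximum extent. The structure below is an illustrative sketch, not the MAXScript itself; for brevity it stores one coordinate per vertex.

```cpp
#include <vector>

// A morph target stores per-vertex offsets from the base mesh; its weight
// blends the feature between minimum (0.0) and maximum (1.0) extent,
// e.g. eye spacing or nose bridge width.
struct MorphTarget {
    std::vector<double> deltas;  // one coordinate per vertex, for brevity
    double weight = 0.0;
};

// Returns the base vertices with every active morph target blended in.
std::vector<double> ApplyMorphs(std::vector<double> base,
                                const std::vector<MorphTarget>& targets) {
    for (const auto& t : targets)
        for (std::size_t i = 0; i < base.size(); ++i)
            base[i] += t.deltas[i] * t.weight;
    return base;
}
```

Accessory morphs can then be derived by sampling how the base character's surface moves under each target, which is what the auto-generation scripts exploit.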
This demo shows the AI character system. The scene contains more than 100 active AI characters, but only those within range are rendered and animated; those that fall outside that range still animate and engage in combat but are not rendered, and those completely outside the larger range are not active at all. The frame rates speak for themselves, mostly hovering above 100.
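The three activity tiers above can be sketched as a simple distance classification against two radii around the player. This is an illustrative standalone version; the tier names and threshold values are assumptions, not the shipped code.

```cpp
// Three tiers of AI activity, keyed on distance from the player:
//  - RenderAndAnimate: fully simulated and drawn
//  - AnimateOnly:      animates and fights, but is not rendered
//  - Inactive:         completely dormant outside the large range
enum class AITier { RenderAndAnimate, AnimateOnly, Inactive };

AITier ClassifyAI(double distance,
                  double renderRange = 50.0,   // illustrative radius
                  double activeRange = 150.0)  // illustrative radius
{
    if (distance <= renderRange) return AITier::RenderAndAnimate;
    if (distance <= activeRange) return AITier::AnimateOnly;
    return AITier::Inactive;
}
```

Skipping rendering for the middle tier and skipping everything for the outer tier is what lets 100+ characters coexist while the frame rate stays high.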
This project was done as a demonstration of the type of environment and level building technologies we will be using for upcoming Games.
The level itself consists of multiple assets created using the Modular Level Design methodology, built in part to prove and test that methodology, specifically its workflow, timeline estimates and processes.
Small, reusable assets were created and used to build the Castle and its floors from the ground up.