
Game Level with AI in One Day

Complete workflow for building a stylized dark fantasy environment in UE5 in a single day — from AI concept generation and Varco 3D asset creation to Blender cleanup, Substance Painter texturing, and final scene polish with toon shaders.

Creating a stylized 3D environment used to take days or even weeks of concepting, modeling, cleanup, texturing, and scene assembly.

Today, with the right AI-assisted workflow, it is possible to go from an idea to a fully assembled environment in Unreal Engine 5 in a single day.

In this article, I break down the exact process I used to create a mysterious stylized environment inspired by the atmosphere of Alice in Wonderland, The Nightmare Before Christmas, and other dark fantasy references.

The final environment — built in one day using AI-generated 3D assets

The core workflow:

  1. Establish a strong visual direction
  2. Convert the concept into individual assets
  3. Generate the assets as 3D models
  4. Clean and prepare them in Blender
  5. Assemble and polish the final scene in Unreal Engine 5
Free Assets: All assets from this project are available for download. Grab them from Google Drive and follow along.
1. Define the Visual Style

Before generating any 3D assets, you need a clear visual reference that defines the overall style, the mood of the scene, the composition, and the key objects within the environment.

In this case, NanoBanana was used to generate a mysterious fantasy location with a whimsical yet eerie tone — something between Alice in Wonderland and The Nightmare Before Christmas.

At this stage, technical precision is not the goal. Instead, the focus is on answering creative questions:

  • What kind of world is this?
  • What materials and shapes define it?
  • Which objects are essential to the scene?
The concept image — this becomes the visual blueprint for the entire environment
Once the concept image feels right, it becomes the visual blueprint for the entire environment. Everything that follows is built around this reference.
2. Break the Concept Into Individual Assets

After choosing the environment reference, the next step is to break the image into separate assets — environment pieces, props, hero objects, and decorative elements.

A practical way to do this is to pair the concept image with ChatGPT: isolate individual objects from the scene, upload them, and ask ChatGPT to generate a clean prompt describing each asset.

Using those prompts together with the visual references, refined asset images are generated in NanoBanana through an image-to-edit workflow.

Asset reference board — each object isolated on a clean background for 3D generation
The goal is to produce clean asset references placed on a neutral or white background. This makes the 3D generation stage significantly easier and more consistent.
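One reason a clean white background matters: near-white pixels can be trivially separated from the asset during later processing. This is a minimal, hypothetical sketch of that idea in pure Python — not part of any tool mentioned in this article:

```python
# Hypothetical helper: treat near-white pixels as background so the
# asset itself is easy to isolate from the reference image.

def is_background(pixel, threshold=240):
    """A pixel is background if all RGB channels are near-white."""
    r, g, b = pixel
    return r >= threshold and g >= threshold and b >= threshold

def foreground_mask(image):
    """image: 2D list of (r, g, b) tuples -> 2D list of booleans."""
    return [[not is_background(px) for px in row] for row in image]

# Tiny 2x2 example: one dark "asset" pixel, three white background pixels
image = [
    [(255, 255, 255), (40, 30, 20)],
    [(250, 252, 251), (255, 255, 255)],
]
print(foreground_mask(image))  # [[False, True], [False, False]]
```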
3. Generate and Optimize the 3D Assets

Once the asset references are ready, the next stage is 3D generation. In this workflow, models are generated using Varco 3D based on the prepared images.

This stage involves two main processes:

  • Generating the base geometry from the reference images
  • Initial optimization pass — PBR materials are generated and a remesh process reduces polygon density
Varco 3D generation — converting 2D references into 3D models with PBR materials
A generated asset ready for import into Blender
AI-generated models are often too dense or topologically messy, so this optimization step is essential when preparing assets for real-time engines like Unreal Engine 5.
4. Clean Up and Prepare Assets in Blender

After generation, all models are imported into Blender. This is where raw AI-generated models are turned into production-ready assets.

The cleanup process includes:

  • Running Merge by Distance to remove duplicate vertices
  • Adjusting pivot points
  • Applying All Transforms
  • Correcting the scale of assets
Merge by Distance — removing duplicate vertices from AI-generated geometry
All assets cleaned, scaled, and organized in Blender
Setting the correct scale in Blender helps avoid fixing proportions later inside Unreal Engine. This is also a good moment to standardize naming conventions for assets, materials, and textures.
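Conceptually, Merge by Distance collapses vertices that sit closer together than a threshold into one. This pure-Python sketch illustrates the idea (it is not Blender's actual implementation, which runs over real mesh data):

```python
# Illustrative version of "Merge by Distance": keep the first vertex of
# each near-coincident cluster, drop everything within the threshold.
import math

def merge_by_distance(vertices, threshold=0.0001):
    """vertices: list of (x, y, z) tuples. Returns a deduplicated list."""
    merged = []
    for v in vertices:
        if all(math.dist(v, m) > threshold for m in merged):
            merged.append(v)
    return merged

verts = [(0, 0, 0), (0, 0, 0.00005), (1, 0, 0)]  # first two are duplicates
print(merge_by_distance(verts))  # [(0, 0, 0), (1, 0, 0)]
```

In Blender itself the operation lives under Mesh ▸ Clean Up ▸ Merge by Distance, with the same threshold parameter.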
5. Refine Meshes and Create Custom Textures

Some environment elements benefit from additional mesh adjustments. In this project, the floating islands generated by AI required more control over the grass surface areas.

The refinement process:

  1. The top grass areas were separated into their own mesh parts
  2. The geometry was cleaned manually
  3. A new UV layout was created
  4. Surfaces were textured precisely in Substance Painter
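The separation criterion in the first step can be thought of as filtering faces by normal direction: grass tops are the faces whose normals point mostly upward. A minimal sketch with hypothetical data (in Blender this is done interactively, not via this code):

```python
# Hypothetical split: faces whose unit normal has a large Z component
# are "grass top"; the rest are "rock side".

def split_by_up_normal(faces, min_up=0.7):
    """faces: list of (name, (nx, ny, nz)) with unit normals.
    Returns (grass_faces, rock_faces)."""
    grass = [f for f in faces if f[1][2] >= min_up]  # normal points up
    rock = [f for f in faces if f[1][2] < min_up]
    return grass, rock

faces = [
    ("top", (0.0, 0.0, 1.0)),    # flat grass top
    ("slope", (0.0, 0.6, 0.8)),  # gentle slope, still grass
    ("side", (1.0, 0.0, 0.0)),   # vertical rock side
]
grass, rock = split_by_up_normal(faces)
print([f[0] for f in grass])  # ['top', 'slope']
```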
Grass surfaces separated into their own mesh parts for precise texturing
Custom texturing in Substance Painter — precise control over grass and terrain surfaces
This step demonstrates an important principle: hybrid workflows produce the best results. AI quickly generates the base shapes, while manual editing restores artistic control where needed.
6. Scene Setup in Blender

Before moving to Unreal Engine, the environment can be roughly assembled in Blender. This helps test composition, object scale, and overall visual balance.

One additional detail was created at this stage: a separate emission texture for the house windows, painted directly in Blender.

Emission map workflow:

  1. A new texture was created inside Blender
  2. It was named Emission
  3. Using the brush tool, an emission map for the house windows was painted manually
Painting the emission map for house windows
Result — windows ready to glow in the engine

This allowed the windows to glow in the engine and added much more atmosphere to the environment.
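What the painted emission map encodes is simple: a black texture where only the window pixels are white. White (1.0) pixels emit light in the engine; black (0.0) pixels do not. A hypothetical sketch of "brushing" one window onto a tiny map:

```python
# Illustrative emission map: a 2D array of brightness values where
# painted window areas glow and everything else stays black.

def blank_emission_map(width, height):
    return [[0.0] * width for _ in range(height)]

def paint_rect(tex, x0, y0, x1, y1, value=1.0):
    """'Brush' a rectangular window area onto the emission map."""
    for y in range(y0, y1):
        for x in range(x0, x1):
            tex[y][x] = value

emission = blank_emission_map(8, 8)
paint_rect(emission, 2, 2, 4, 5)  # one window, 2 wide x 3 tall
lit = sum(px > 0 for row in emission for px in row)
print(lit)  # 6 pixels will glow in the engine
```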

Once the scene composition looked good, the setup was exported as GLB — a convenient bridge between Blender and Unreal Engine.
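GLB is the binary container format for glTF. Per the glTF 2.0 specification, every GLB file starts with a 12-byte header: the magic bytes `glTF`, a version number, and the total file length. A quick sanity check like this can verify that a Blender export produced a well-formed file:

```python
# Minimal GLB header check (glTF 2.0 spec): 4-byte magic b"glTF",
# uint32 version, uint32 total length, all little-endian.
import struct

def check_glb_header(data):
    """data: bytes of a .glb file. Returns (ok, version, declared_length)."""
    if len(data) < 12:
        return False, None, None
    magic, version, length = struct.unpack("<4sII", data[:12])
    return magic == b"glTF", version, length

# Minimal fake header for illustration (a real file has JSON/BIN chunks too)
fake = struct.pack("<4sII", b"glTF", 2, 12)
print(check_glb_header(fake))  # (True, 2, 12)
```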
7. Assemble and Polish in Unreal Engine 5

After importing everything into Unreal Engine 5, the main focus becomes scene polish and atmosphere.

This stage includes configuring:

  • Directional Light for key lighting
  • Post Process settings for visual mood
  • A Toon Shader plugin with outline-style visual treatment
  • Additional props and environmental details for density
UE5 scene with basic setup — the starting point for scene dressing
Before post-processing
After post-processing + toon shader
This final dressing stage has a massive impact on the perceived quality of the environment. Lighting, shaders, and post-processing transform a set of assets into a complete world.
8. Add a Character and Retarget Animation

To complete the environment, a character was added. The character serves two important purposes: providing a scale reference and adding a storytelling element.

Character preparation:

  1. Prepare and scale the character in Blender
  2. Rig the character using Mixamo
  3. Export the model as GLB
  4. Import into UE5 and use retargeted animations from the Third Person template
Character rigged in Mixamo — fast and effective for stylized characters
Retargeting animations from the Third Person template
Character animated and placed in the environment
This is a fast way to add motion and life to the scene without building a custom animation system. Retargeted animations from the standard Third Person template work surprisingly well.
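Under the hood, retargeting starts with mapping bones between the two skeletons. Mixamo prefixes its bone names with `mixamorig:`; the target names below are illustrative only — in practice the mapping is configured visually in UE5's IK Retargeter, not written by hand:

```python
# Illustrative bone-name mapping from Mixamo to Mannequin-style names.
# The map entries here are examples, not a complete or official mapping.
BONE_MAP = {
    "mixamorig:Hips": "pelvis",
    "mixamorig:Spine": "spine_01",
    "mixamorig:Head": "head",
}

def retarget_bone(mixamo_name):
    """Translate a Mixamo bone name; fall back to a cleaned-up name."""
    if mixamo_name in BONE_MAP:
        return BONE_MAP[mixamo_name]
    return mixamo_name.removeprefix("mixamorig:").lower()

print(retarget_bone("mixamorig:Hips"))      # pelvis
print(retarget_bone("mixamorig:LeftHand"))  # lefthand
```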

Key Takeaways

Start With Style, Not Geometry

A strong concept image defines the world much faster than building assets blindly.

Use AI for Both Ideation and Production

AI tools generate clean asset references and base 3D models — not just inspiration.

Blender Remains Essential

Even with AI-generated models, geometry cleanup, pivots, transforms, scale adjustments, and UV work still require manual attention.

Hybrid Workflows Are Stronger

The best results come from combining AI speed with manual artistic control.

Unreal Engine Is the Final Stage

Lighting, shaders, post-processing, and scene dressing transform a set of assets into a complete environment.

Conclusion

AI-generated 3D assets are no longer just experimental tools. With the right workflow, they can become part of a practical environment production pipeline for Unreal Engine 5.

The key takeaway is not that AI replaces artists. Instead, AI dramatically accelerates the early and middle stages of environment creation.

For artists who already understand composition, style, and real-time scene assembly, this speed can become a real advantage — making it possible to build a complete stylized environment in a single day.

Watch the full video walkthrough on the Stefan 3D YouTube channel for a detailed visual breakdown of every step.

Want to compare these tools yourself? Check out our 3D AI Arena.
