Creating a stylized 3D environment used to take days or even weeks of concepting, modeling, cleanup, texturing, and scene assembly.
Today, with the right AI-assisted workflow, it is possible to go from an idea to a fully assembled environment in Unreal Engine 5 in a single day.
In this article, I break down the exact process I used to create a mysterious stylized environment inspired by the atmosphere of Alice in Wonderland, The Nightmare Before Christmas, and other dark fantasy references.

The core workflow:
1. Establish a strong visual direction
2. Convert the concept into individual assets
3. Generate the assets as 3D models
4. Clean and prepare them in Blender
5. Assemble and polish the final scene in Unreal Engine 5

Define the Visual Style
Before generating any 3D assets, you need a clear visual reference that defines the overall style, the mood of the scene, the composition, and the key objects within the environment.
In this case, NanoBanana was used to generate a mysterious fantasy location with a whimsical yet eerie tone — something between Alice in Wonderland and The Nightmare Before Christmas.
At this stage, technical precision is not the goal. Instead, the focus is on answering creative questions:
- What kind of world is this?
- What materials and shapes define it?
- Which objects are essential to the scene?

Break the Concept Into Individual Assets
After choosing the environment reference, the next step is to break the image into separate assets — environment pieces, props, hero objects, and decorative elements.
A practical way to do this is to pair the concept image with ChatGPT: crop individual objects out of the scene, upload them, and ask ChatGPT to generate a clean prompt describing each asset.
Using those prompts together with the visual references, refined asset images are generated in NanoBanana through an image-to-edit workflow.

Generate and Optimize the 3D Assets
Once the asset references are ready, the next stage is 3D generation. In this workflow, models are generated using Varco 3D based on the prepared images.
This stage involves two main processes:
- Generating the base geometry from the reference images
- Running an initial optimization pass, in which PBR materials are generated and a remesh step reduces polygon density


Clean Up and Prepare Assets in Blender
After generation, all models are imported into Blender. This is where raw AI-generated models are turned into production-ready assets.
The cleanup process includes:
- Running Merge by Distance to remove duplicate vertices
- Adjusting pivot points
- Applying All Transforms
- Correcting the scale of assets
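The cleanup steps above can be batched with a short script in Blender's scripting tab. This is a hypothetical sketch using the `bpy`/`bmesh` API (it runs only inside Blender); the merge distance and origin mode are illustrative choices, not fixed rules.

```python
# Batch cleanup for imported AI-generated meshes (run inside Blender).
import bpy
import bmesh

MERGE_DISTANCE = 0.0001  # same value you would type into Merge by Distance

for obj in bpy.data.objects:
    if obj.type != 'MESH':
        continue

    # Merge by Distance: collapse duplicate vertices via bmesh
    bm = bmesh.new()
    bm.from_mesh(obj.data)
    bmesh.ops.remove_doubles(bm, verts=bm.verts, dist=MERGE_DISTANCE)
    bm.to_mesh(obj.data)
    bm.free()

    # Adjust the pivot: place the origin at the geometry's median point
    bpy.context.view_layer.objects.active = obj
    obj.select_set(True)
    bpy.ops.object.origin_set(type='ORIGIN_GEOMETRY', center='MEDIAN')

    # Apply All Transforms so location, rotation, and scale are baked in
    bpy.ops.object.transform_apply(location=True, rotation=True, scale=True)
    obj.select_set(False)
```

Scale correction is left manual here, since AI-generated assets often need per-object judgment rather than a uniform factor.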


Refine Meshes and Custom Textures
Some environment elements benefit from additional mesh adjustments. In this project, the floating islands generated by AI required more control over the grass surface areas.
The refinement process:
1. The top grass areas were separated into their own mesh parts
2. The geometry was cleaned manually
3. A new UV layout was created
4. The surfaces were textured precisely in Substance Painter


Scene Setup in Blender
Before moving to Unreal Engine, the environment can be roughly assembled in Blender. This helps test composition, object scale, and overall visual balance.
An additional detail was created at this stage for the houses: a separate emission texture was painted in Blender for the house windows.
Emission map workflow:
1. A new texture was created inside Blender
2. It was named "Emission"
3. Using the brush tool, an emission map for the house windows was painted manually
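Wiring the painted texture into a house material can also be scripted. The sketch below is a hypothetical `bpy` snippet (run inside Blender); the material name, strength value, and socket name are assumptions — Blender 4.x calls the input "Emission Color", while older versions call it simply "Emission".

```python
# Connect a hand-painted "Emission" texture to a material's emission input.
import bpy

mat = bpy.data.materials["House"]          # assumed material name
nodes = mat.node_tree.nodes
links = mat.node_tree.links

tex = nodes.new('ShaderNodeTexImage')
tex.image = bpy.data.images["Emission"]    # the texture painted in Blender

bsdf = nodes["Principled BSDF"]
links.new(tex.outputs['Color'], bsdf.inputs['Emission Color'])
bsdf.inputs['Emission Strength'].default_value = 5.0  # illustrative glow strength
```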


This allowed the windows to glow in the engine and added much more atmosphere to the environment.
Assemble and Polish in Unreal Engine 5
After importing everything into Unreal Engine 5, the main focus becomes scene polish and atmosphere.
This stage includes configuring:
- Directional Light for key lighting
- Post Process settings for visual mood
- A Toon Shader plugin with outline-style visual treatment
- Additional props and environmental details for density
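The key-light setup can be scripted too. This is a hypothetical sketch using UE5's Editor Python API (it requires the Python Editor Script Plugin and runs only inside the editor); the light position, rotation, and intensity are illustrative values.

```python
# Spawn and configure a Directional Light in the UE5 editor.
import unreal

actor_subsys = unreal.get_editor_subsystem(unreal.EditorActorSubsystem)

# Spawn the Directional Light that serves as the key light
light = actor_subsys.spawn_actor_from_class(
    unreal.DirectionalLight,
    unreal.Vector(0, 0, 500),
    unreal.Rotator(-40, 30, 0),
)

# Set an illustrative intensity on the light component
light_comp = light.get_component_by_class(unreal.DirectionalLightComponent)
light_comp.set_editor_property('intensity', 8.0)
```

Post-process volumes and the Toon Shader plugin are easier to tune interactively, so they are left to manual setup here.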

Add a Character and Retarget Animation
To complete the environment, a character was added. The character serves two important purposes: providing a scale reference and adding a storytelling element.
Character preparation:
1. Prepare and scale the character in Blender
2. Rig the character using Mixamo
3. Export the model as GLB
4. Import into UE5 and use retargeted animations from the Third Person template
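The GLB export step can be done from Blender's scripting tab as well. A minimal hypothetical sketch (runs only inside Blender); the file path and option choices are assumptions.

```python
# Export the selected, rigged character as a single binary .glb for UE5.
import bpy

bpy.ops.export_scene.gltf(
    filepath="//character.glb",   # "//" = path relative to the .blend file
    export_format='GLB',          # single binary .glb container
    use_selection=True,           # export only the selected character
    export_apply=True,            # apply modifiers on export
)
```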

Key Takeaways
Start With Style, Not Geometry
A strong concept image defines the world much faster than building assets blindly.
Use AI for Both Ideation and Production
AI tools generate clean asset references and base 3D models — not just inspiration.
Blender Remains Essential
Even with AI-generated models, geometry cleanup, pivots, transforms, scale adjustments, and UV work still require manual attention.
Hybrid Workflows Are Stronger
The best results come from combining AI speed with manual artistic control.
Unreal Engine Is the Final Stage
Lighting, shaders, post-processing, and scene dressing transform a set of assets into a complete environment.
Conclusion
AI-generated 3D assets are no longer just experimental tools. With the right workflow, they can become part of a practical environment production pipeline for Unreal Engine 5.
The key takeaway is not that AI replaces artists. Instead, AI dramatically accelerates the early and middle stages of environment creation.
For artists who already understand composition, style, and real-time scene assembly, this speed can become a real advantage — making it possible to build a complete stylized environment in a single day.