Workflow

This page documents a typical workflow used internally by the Soft8Soft team and by Verge3D users (examples include the Scooter Configurator, Teapot Heater, Industrial Robot, and Jewelry Configurator, among others).

Steampunk WebGL-based visualization of teapot heater
Teapot Heater demo.

To create interactive 3D web content you can use the Blender, 3ds Max, or Maya version of Verge3D with equal success. The chart below shows the main stages of creating a typical Verge3D application. Virtually all of the work can be done by a single person with basic 3D generalist skills.

Verge3D workflow chart

Modeling and Baking

Our team practices the following modeling pipeline. First, high-poly versions of the models are created. Next, low- to middle-poly models are obtained by simply removing the subsurf modifier or by retopology. We recommend no more than 100k triangles per model.

Low-poly modelling workflow

The low-poly models are then UV-unwrapped. Finally, the meshes are triangulated; this is not a requirement of Verge3D, but it is recommended for baking maps. Triangulated models are also better suited for loading into third-party texture editors such as Substance Painter.
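If you work in Blender, the triangulation step can be scripted as well. The following is only a minimal sketch, assuming the target meshes are selected in the viewport; it adds a non-destructive Triangulate modifier to each of them:

```python
# Minimal sketch: add a Triangulate modifier to every selected mesh
# in Blender, e.g. before baking or exporting to a texturing tool.
import bpy

for obj in bpy.context.selected_objects:
    if obj.type == 'MESH':
        obj.modifiers.new(name='Triangulate', type='TRIANGULATE')
```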

Normal and occlusion maps, if needed, are baked from the high-poly mesh superimposed over the triangulated low-poly mesh.
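In Blender, for instance, such a selected-to-active bake can also be scripted. The sketch below is only an illustration: it assumes the Cycles render engine, hypothetical object names, and that the target image is already selected in an Image Texture node of the low-poly material:

```python
# Minimal sketch of a selected-to-active normal map bake in Blender.
# Assumes Cycles and a target image assigned to an active Image Texture
# node in the low-poly material; object names below are hypothetical.
import bpy

bpy.context.scene.render.engine = 'CYCLES'

high = bpy.data.objects['TeapotHigh']
low = bpy.data.objects['TeapotLow']

# Select the high-poly mesh and make the low-poly mesh active.
bpy.ops.object.select_all(action='DESELECT')
high.select_set(True)
low.select_set(True)
bpy.context.view_layer.objects.active = low

# Bake the normals; the cage extrusion offsets the ray origins so that
# the high-poly surface is captured reliably.
bpy.ops.object.bake(type='NORMAL', use_selected_to_active=True, cage_extrusion=0.02)
```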

Choosing a Material System

In most cases, you should base your material workflow on the native materials of your favorite modelling suite.

You may also check out the following videos explaining how to use materials in your Verge3D-based apps: Blender, 3ds Max, Maya.

Image Formats, Resolution and Best Practices

PNG or JPEG

For best efficiency, we recommend using web-friendly formats for your textures: lossless PNG with maximum compression, or lossy JPEG (although Verge3D also supports WebP, GIF, BMP, and TIFF). As a rule of thumb, if your texture does not need an alpha channel, prefer JPEG over PNG.
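This choice can be automated in an asset pipeline. The following sketch uses the Pillow package; the textures folder and quality value are just examples. It converts PNG textures whose alpha channel is unused into JPEG:

```python
# Minimal sketch: convert PNG textures that do not actually use their
# alpha channel to JPEG (requires the Pillow package; paths are examples).
from pathlib import Path
from PIL import Image

for path in Path('textures').glob('*.png'):
    img = Image.open(path)
    if 'A' in img.getbands() and img.getchannel('A').getextrema() != (255, 255):
        continue  # the alpha channel carries data, keep the PNG
    img.convert('RGB').save(path.with_suffix('.jpg'), quality=90)
```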

Normal Maps

Normal maps should be saved as PNG even if the alpha channel goes unused, because normal maps stored as JPEG often produce visible shading artifacts. PNG images, however, can be quite heavy in terms of file size, so use normal maps only when there is a strong need for them.

Resolution

Avoid overly detailed textures unless you really need them. Big images can adversely impact performance, drain video memory (which is scarce on handheld devices, sometimes to the point of crashing), and significantly prolong loading. The resolution of most of your textures should be 1024 px or below.

Beware of NPOT

The resolution of the textures should follow the power-of-two rule (256, 512, or 1024 px is great, while 1000 px is bad). The engine re-scales all non-power-of-two (NPOT) images upon loading anyway, so carefully review your textures in order to achieve maximum efficiency with regard to file size and loading time.
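If you prefer to fix such textures in a batch script rather than in an image editor, here is a minimal sketch using the Pillow package (file names are examples) that rescales an image to the nearest power-of-two size, capped at 1024 px:

```python
# Minimal sketch: rescale a texture to the nearest power-of-two resolution,
# capped at 1024 px (requires the Pillow package; file names are examples).
from PIL import Image

def nearest_pot(n, limit=1024):
    pot = 1
    while pot * 2 <= min(n, limit):
        pot *= 2
    # Pick the closer of the two neighbouring powers of two, within the cap.
    return pot * 2 if (pot * 2 <= limit and n - pot > pot * 2 - n) else pot

img = Image.open('diffuse_1000px.png')
size = (nearest_pot(img.width), nearest_pot(img.height))
if size != img.size:
    img.resize(size, Image.LANCZOS).save('diffuse_1024px.png')
```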

Aspect

Textures may be square or rectangular, for example 1024x512 px.

Reuse

Always try re-using image files in your materials, and texture maps/nodes in your shaders, rather than making copies.

Resist the impulse to reuse textures as UI icons; icons should be pre-scaled to their intended size, even if it means more files.

Pack BW in RGBA

If you have several black-and-white images (AO, light maps, transparency masks, color masks, etc.), consider packing them into the RGBA channels of a single texture.
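Such packing can be done in any image editor, or scripted. A minimal sketch using the Pillow package follows (file names are examples):

```python
# Minimal sketch: pack four grayscale maps into the RGBA channels of one
# texture (requires the Pillow package; file names are examples).
from PIL import Image

size = (1024, 1024)
ao    = Image.open('ao.png').convert('L').resize(size)
light = Image.open('lightmap.png').convert('L').resize(size)
mask  = Image.open('color_mask.png').convert('L').resize(size)
alpha = Image.open('opacity.png').convert('L').resize(size)

Image.merge('RGBA', (ao, light, mask, alpha)).save('packed_maps.png')
```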

Load on Demand

If you are developing a customizer or a similar app, you may consider loading only a limited set of textures on startup. The other textures can be loaded and applied to your models on demand with the replace texture puzzle. Another method is to load parts of your scenes dynamically with the append scene puzzle.

Environment Map (Image-Based Lighting)

The environment map is a key component of a real-time scene, providing the background and material reflections. We recommend using equirectangular images in HDR format. For performance reasons, the environment map should not exceed 2048x1024 px.
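If your HDRI is larger than that, it can be downscaled in an image editor or with a small script. The following sketch uses the opencv-python package (file names are examples):

```python
# Minimal sketch: downscale an equirectangular HDR environment map to
# 2048x1024 px (requires the opencv-python package; file names are examples).
import cv2

env = cv2.imread('studio_4k.hdr', cv2.IMREAD_UNCHANGED)  # float32 HDR data
env_small = cv2.resize(env, (2048, 1024), interpolation=cv2.INTER_AREA)
cv2.imwrite('environment.hdr', env_small)
```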

Assigning HDR environment in Blender

Besides that, HDR textures can be used to fake complex lighting conditions, for example when there are too many light sources for the engine to handle, or when the lights cannot be approximated by simple point or rectangular (area) lights.

The default cube projects for Blender, 3ds Max, and Maya include the HDR texture called environment.hdr, which you can reuse in your apps.

You may also check out the following videos explaining how to set up the HDR environment in your scene: Blender, 3ds Max, Maya.

HDR Rendering

The HDR (high dynamic range) rendering pipeline can be enabled with the corresponding checkbox in the Verge3D export settings window (3ds Max, Maya) or on the Verge3D settings panel under the Render tab (Blender). In this mode, the engine uses half-float textures for better precision and intensity range, which is particularly important for proper rendering of the bloom post-processing effect.

Scooter, a 3D visualization which uses HDR rendering capabilities
Scooter, a 3D visualization which uses HDR rendering capabilities.

Animation

Animation clips are created for relevant model parts as usual. Skinning, whole-object, morph-target, and material animation can be used to produce various effects (see more in the Blender, 3ds Max, or Maya animation guide).

Creating animation for interactive rendering

You might want to give animated objects human-readable names so that they can be easily found in the Puzzles editor.
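In Blender, for example, such renaming can be batched with a short script; the object names below are purely hypothetical:

```python
# Minimal sketch: give animated objects human-readable names in Blender
# so they are easy to pick in the Puzzles editor (names are hypothetical).
import bpy

renames = {
    'Cube.001': 'lid',
    'Cube.002': 'handle',
    'Armature.001': 'robot_arm',
}
for old, new in renames.items():
    if old in bpy.data.objects:
        bpy.data.objects[old].name = new
```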

Animation puzzles

You may also check out the following videos explaining how to create animation: Blender, 3ds Max, Maya.

Layout and UI

We suggest three different methods for creating user interfaces for your Verge3D-based apps. The first and most straightforward is to design UI elements right in your modelling suite and then use camera parenting and the fit to camera edge feature to position them on the screen (see the Blender, 3ds Max, and Maya settings). Since this method is not based on HTML, it also works well for AR/VR applications.

E-learning demo with 3D UI
E-learning, a Verge3D demo which uses UI modelled as 3D.

Another approach is to build app interfaces with HTML puzzles. This way you can design complex UIs using HTML/CSS, and still do so in a completely code-less manner.

Farmers journey game with HTML UI
Farmers Journey, a browser game featuring HTML UI made with Puzzles.

To learn more about HTML puzzles, see the following tutorial series.

The last approach is to use external web design software. Any WYSIWYG editor capable of exporting HTML/CSS files will work. One example is Webflow, which we used to create Verge3D demos such as Scooter, Industrial Robot, and Recliner. You can, of course, build the HTML interface manually with code or use other tools instead.

Using Webflow editor for creating 3D web applications

With this approach, interface elements (menus, buttons, info boxes, etc.) are created as part of a separate web page in which the Verge3D app is embedded. See the detailed guide on creating an HTML-based GUI for more details and examples.

Post-Processing

Ambient Occlusion (GTAO) can be enabled and exported from within Blender / Max / Maya.

Also, the following effects can be enabled with post-processing puzzles: bloom (which works best with HDR enabled), brightness-contrast, grayscale, depth of field, afterimage (motion blur), screen-space reflection, and screen-space refraction. Parameters for these effects can be changed at run time or animated with high performance thanks to internal caching. There is also a puzzle for removing all post-processing effects from a scene.

Assigning post-processing effects in Verge3D

Audio

Background music and/or event sounds can be added with sound puzzles to be triggered by the user. You should use the mp3 format for your audio files as it is supported in all web browsers.
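If your source sounds are in WAV format, they can be batch-converted, for instance with the pydub package (which relies on an ffmpeg installation; the folder name and bitrate below are examples):

```python
# Minimal sketch: batch-convert WAV files to MP3 for use with sound puzzles.
# Requires the pydub package and ffmpeg available on the PATH.
from pathlib import Path
from pydub import AudioSegment

for path in Path('sounds').glob('*.wav'):
    sound = AudioSegment.from_wav(str(path))
    sound.export(str(path.with_suffix('.mp3')), format='mp3', bitrate='128k')
```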

Using sounds in interactive web apps

There is a special restriction on playing sounds: sound playback can only be initiated by direct interaction with the web page. For example, the following setup works everywhere, including on Apple devices:

Audio puzzles that work

However, if a sound is played upon some event which is not caused by a direct user action, it will not work:

Audio puzzles that do not work

To overcome this problem, you can use the on event / touchstart puzzle to play and immediately pause all the sounds used in the scene when the user taps the screen for the first time:

Audio puzzles with workaround

The above setup is available from the Puzzle library under the name Sound iOS Workaround.

Asset Compression

When the app is complete, you can optimize the loading of its scene files as described in the corresponding section of this manual.

You may also check out the following videos explaining how to enable asset compression for your apps: Blender, 3ds Max, Maya.

Publishing

You can publish your project using any of the methods mentioned in the publishing subsection of the App Manager guide.

Got Questions?

Feel free to ask on the forums!