Materials from an Artist's Perspective . . .
2018-04-27 at 12:30 am #3995 mcolinp (Customer)
I have been waiting and waiting for Verge3D materials to finally make sense to me as an experienced 3D digital artist. Sadly I seem to be making little to no progress.
There are fundamentals of basic materials that should be easily understood by the user, to allow them to make subtle changes to get materials exactly as they would appear in the real world.
Part of my struggle is that most Blender users are trying to create materials that are not based on “Real World” materials; they are for animated characters or games, etc.
I design products, and as such need standard materials such as Aluminum and Stainless Steel (in Polished, Brushed, and Matte finishes), as well as Gloss and Matte Plastics, Rubber, Glass, Clear Plastics, etc.
With the right understanding of inputs, I should be able to create any of the materials listed above using the following inputs as things that influence the appearance:
Diffuse
Specular
Reflectivity
Fresnel
Base Color
Alpha
(A perhaps simplified list, but bear with me . . .)
What I find really aggravating is that there have been certain things claimed to be implemented which I have never gotten to work, or which do not work in a way that is intuitive to me as the user.
**Environments**
My main focus here is on equirectangular HDRs, as they are the most widely used in other software. In theory I should be able to set up a world environment that automatically influences reflections in materials that are reflective. I have NEVER successfully achieved this. The environment should also be able to be toggled visible or not visible in the properties panel, while still creating reflections even when it is not visible in the background.
**Materials**
Back when I was playing with Blend4web (I know it's a bad word here), I was able to create a very realistic brushed Stainless material using 2-3 versions of the same background environment as inputs, which went from crisp to blurred and were applied with a Fresnel setting to make the surface appear correct at any given viewing angle.
I would say it would be better overall to create a multi-layered shader that reflects the existing World Environment with different amounts of blur, set by the user.
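As a rough illustration of the layering I have in mind, here is a minimal node sketch built with Blender's Python API: it simply mixes a crisp and a blurred glossy layer by Fresnel. The roughness values are only guesses, and I realize standard Cycles nodes may not be what Verge3D supports today.

```python
import bpy

# Sketch of a "layered reflection" material: a crisp glossy layer and a
# blurred glossy layer mixed by Fresnel, so grazing angles pick up the
# sharper reflection of the world environment.
mat = bpy.data.materials.new("BrushedSteel_sketch")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
nodes.clear()

output = nodes.new("ShaderNodeOutputMaterial")
mix = nodes.new("ShaderNodeMixShader")
fresnel = nodes.new("ShaderNodeFresnel")
crisp = nodes.new("ShaderNodeBsdfGlossy")    # sharp reflection layer
blurred = nodes.new("ShaderNodeBsdfGlossy")  # blurred reflection layer

crisp.inputs["Roughness"].default_value = 0.05    # guessed values
blurred.inputs["Roughness"].default_value = 0.35

links.new(fresnel.outputs["Fac"], mix.inputs["Fac"])
links.new(blurred.outputs["BSDF"], mix.inputs[1])  # facing angles: blurred
links.new(crisp.outputs["BSDF"], mix.inputs[2])    # grazing angles: crisp
links.new(mix.outputs["Shader"], output.inputs["Surface"])
```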
Is there any way that the team could create a demo video setting up an equirectangular world environment, and then show how to manually set up a material that reflects that world environment?
When I open the example files I generally find that they do not really reflect “Real World” materials, and most of the materials seem overly complicated and require normal maps, etc. I have little to no interest in normal maps as a requirement to create a material. I did take a look at the “Cube” example, which uses normal maps. I was able to remove the normal textures (as I would not need them), but still could not figure out where the reflection map came from and how I might change it to something else. (Again, shouldn't I just be able to reflect a world environment?)
As an Artist I look at the characteristics of what influences the appearance of any given material. The materials I have seen seem to require more of a programmer's approach. Even the “Verge PBR” node makes me want to pull my hair out, as it rarely, if ever, creates what I expect. (Most of my materials have Fresnel and reflections, which I have never been successful in creating . . .)
I am most interested in understanding the direct effect of any given node when it is added to a node tree (and where, in some cases). I have not found sufficient documentation that gives a needed overview of not only which nodes are supported, but what they do, as well as where, why, and how to use them. It literally feels like taking a “stab in the dark”, which is often what the results look like . . .
I would be happy to provide examples or further discuss how materials can become more user friendly to help the community.
2018-04-27 at 1:57 am #3997 mcolinp (Customer)
A possible aspect that could help: create a flag for nodes that are not supported. It could be a yellow triangle with an exclamation point, or a red glow outline (only around unsupported nodes) in the Blender node editor. Having this kind of feedback is essential to know what is not working/supported.
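As a rough sketch of the kind of feedback I mean, written against Blender's Python API; the set of unsupported node types below is purely hypothetical:

```python
import bpy

# Hypothetical list of node types an exporter might not support.
UNSUPPORTED = {"ShaderNodeScript", "ShaderNodeWireframe"}

def flag_unsupported(material):
    """Tint unsupported nodes red in the node editor so they stand out."""
    if not material.use_nodes:
        return
    for node in material.node_tree.nodes:
        if node.bl_idname in UNSUPPORTED:
            node.use_custom_color = True
            node.color = (0.8, 0.1, 0.1)  # red warning tint

for mat in bpy.data.materials:
    flag_unsupported(mat)
```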
2018-04-27 at 9:30 am #4001 Mikhail Luzyanin (Staff)
There are fundamentals of basic materials that should be easily understood by the user, to allow them to make subtle changes to get materials exactly as they would appear in the real world.
Yes, you are right in most cases. The GLSL material method is a bit old and tricky, and some of the settings are not clear and not connected to real-world properties, but it is faster for WebGL. These materials are based on the Blender Internal render, which is now gone, and this material system will be gone soon too. So don't worry about that.
With the right understanding of inputs, I should be able to create any of the materials listed above using the following inputs as things that influence the appearance:
Diffuse
Specular
Reflectivity
Fresnel
Base Color
Alpha
Yes, it's a common PBR pipeline and we will implement this in the future.
**Environments**
My main focus here is on equirectangular HDRs, as they are the most widely used in other software. In theory I should be able to set up a world environment that automatically influences reflections in materials that are reflective. I have NEVER successfully achieved this. The environment should also be able to be toggled visible or not visible in the properties panel, while still creating reflections even when it is not visible in the background.
You can set up equirectangular environments by following this tutorial, but they influence materials only in the Verge3D PBR shader. In the standard GLSL pipeline you will only see them as the environment background.
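For reference, the basic world setup can also be scripted from Blender; here is a minimal sketch assuming a Cycles-style world node tree (the HDR path is only a placeholder):

```python
import bpy

# Minimal equirectangular world environment via the node-based world setup.
world = bpy.data.worlds.new("EnvWorld")
world.use_nodes = True  # creates default "Background" and "World Output" nodes
nodes, links = world.node_tree.nodes, world.node_tree.links

env = nodes.new("ShaderNodeTexEnvironment")
env.projection = 'EQUIRECTANGULAR'
env.image = bpy.data.images.load("//environment.hdr")  # placeholder path

background = nodes["Background"]
links.new(env.outputs["Color"], background.inputs["Color"])

bpy.context.scene.world = world
```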
Back when I was playing with Blend4web (I know it's a bad word here), I was able to create a very realistic brushed Stainless material using 2-3 versions of the same background environment as inputs, which went from crisp to blurred and were applied with a Fresnel setting to make the surface appear correct at any given viewing angle.
Don't worry, we all came from the Blend4Web team, and it's not a red flag for us. They had a simple implementation of Cycles nodes that is easier to set up and closer to real-world properties, which we will add soon. We have already discussed this and we plan to replace the old GLSL nodes with Cycles nodes with the same behavior.
I would say it would be better overall to create a multi-layered shader that reflects the existing World Environment with different amounts of blur, set by the user.
Is there any way that the team could create a demo video setting up an equirectangular world environment, and then show how to manually set up a material that reflects that world environment?
It already exists, just check our YouTube channel, in the Blender Tutorials playlist; also, to understand how to set up GLSL nodes, check the Blender Community Tutorials playlist.
When I open the example files I generally find that they do not really reflect “Real World” materials, and most of the materials seem overly complicated and require normal maps, etc. I have little to no interest in normal maps as a requirement to create a material.
There's no real-time reflection in Verge3D now. To imitate reflection you need to use cubemaps and insert them directly into the material as textures. It's a common method for game engines to use cubemaps as an imitation of reflection; you can check Unreal Engine, for example, and many more. It's tricky and not clear at first look, but it's more optimized for the web. If you check the Blend4Web demos that have real-time reflection, they are very slow; that's the price for rendering reflections. But they are needed, so we hope to implement them in the future.
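As a rough sketch of what "inserting a reflection image into the material as a texture" can look like via Blender 2.79's Python API; the image path and material name are placeholders, and this generic reflection-mapped texture slot is not necessarily the exact setup from our cubemap tutorial:

```python
import bpy

# Sketch: attach a pre-baked reflection image to a Blender Internal
# material using "Reflection" texture coordinates (Blender 2.79 API).
img = bpy.data.images.load("//baked_reflection.png")  # placeholder path

tex = bpy.data.textures.new("ReflectionTex", type='IMAGE')
tex.image = img

mat = bpy.data.materials["SteelMat"]    # assumed existing material name
slot = mat.texture_slots.add()
slot.texture = tex
slot.texture_coords = 'REFLECTION'      # map the image by the view reflection vector
slot.use_map_color_diffuse = False
slot.use_map_color_spec = True          # affect the specular color only
```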
I did take a look at the “Cube” example, which uses normal maps. I was able to remove the normal textures (as I would not need them), but still could not figure out where the reflection map came from and how I might change it to something else. (Again, shouldn't I just be able to reflect a world environment?)
When you remove a normal map that was connected to the cubemap, you just need to connect the Normal output of a Geometry node to the Normal input of the Cubemap node to get the cubemap reflections working again.
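A minimal sketch of that reconnection via Blender's Python API, assuming a Blender Internal node material whose cubemap texture node is literally named "Cubemap" (the node and material names here are assumptions about the example file):

```python
import bpy

# Sketch: after removing the normal-map texture, feed the mesh normal
# back into the cubemap lookup so the reflections keep working.
mat = bpy.data.materials["CubeMat"]      # assumed material name
nodes, links = mat.node_tree.nodes, mat.node_tree.links

geom = nodes.new("ShaderNodeGeometry")   # Blender Internal Geometry node
cubemap = nodes["Cubemap"]               # assumed existing cubemap texture node

links.new(geom.outputs["Normal"], cubemap.inputs["Normal"])
```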
As an Artist I look at the characteristics of what influences the appearance of any given material. The materials I have seen seem to require more of a programmer's approach. Even the “Verge PBR” node makes me want to pull my hair out, as it rarely, if ever, creates what I expect. (Most of my materials have Fresnel and reflections, which I have never been successful in creating . . .)
Verge3D PBR is based on the glTF PBR standard, which is simpler than, for example, the Principled node, but gives you compatibility with many engines and platforms like Godot, Facebook 3D posts, and so on. We have a plan to support the Cycles Principled node to open up all the possibilities of the PBR pipeline.
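For reference, the glTF 2.0 metallic-roughness material model that the Verge3D PBR node targets boils down to a few factors (plus optional textures); shown here as a Python dict with purely illustrative values:

```python
# Minimal glTF 2.0 material entry (metallic-roughness model).
# The numeric values are only illustrative.
gltf_material = {
    "name": "BrushedSteel",
    "pbrMetallicRoughness": {
        "baseColorFactor": [0.9, 0.9, 0.92, 1.0],
        "metallicFactor": 1.0,      # fully metallic
        "roughnessFactor": 0.35,    # controls how blurred reflections look
    },
}
```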
I would be happy to provide examples or further discuss how materials can become more user friendly to help the community.
It would be helpful.
A possible aspect that could help: create a flag for nodes that are not supported. It could be a yellow triangle with an exclamation point, or a red glow outline (only around unsupported nodes) in the Blender node editor. Having this kind of feedback is essential to know what is not working/supported.
We do a cleanup of unsupported nodes from time to time, but some of them are still there! :)
Co-founder and lead graphics specialist at Soft8Soft.
2018-04-27 at 12:41 pm #4002 mcolinp (Customer)
There's no real-time reflection in Verge3D now. To imitate reflection you need to use cubemaps and insert them directly into the material as textures. It's a common method for game engines to use cubemaps as an imitation of reflection; you can check Unreal Engine, for example, and many more. It's tricky and not clear at first look, but it's more optimized for the web. If you check the Blend4Web demos that have real-time reflection, they are very slow; that's the price for rendering reflections. But they are needed, so we hope to implement them in the future.
Thank you for your detailed responses. I do wonder about this approach with cubemaps; my gut senses that too many cubemaps will make your file size balloon . . . Is this true? This is why I suggested making material layers that can control blurred reflections from a master environment image, controlled overall with a Fresnel node (in each material). In my mind, this eliminates the many separate images needed when using cubemaps in every material.
2018-04-27 at 1:15 pm #4003 Mikhail Luzyanin (Staff)
Thank you for your detailed responses. I do wonder about this approach with cubemaps; my gut senses that too many cubemaps will make your file size balloon . . . Is this true? This is why I suggested making material layers that can control blurred reflections from a master environment image, controlled overall with a Fresnel node (in each material). In my mind, this eliminates the many separate images needed when using cubemaps in every material.
In most cases you will just need to bake 3 or 4 cubemaps for the scene if there's not much difference in the objects' locations. Artists who are familiar with Cycles, V-Ray, or any other biased or unbiased renderer are more comfortable with a reflection/roughness method: they set the value they need and everything is done automatically, but that is not a real-time pipeline.
If you think it is not the same as, for example, baking cubemaps for each object and inserting them into the materials, I can say that for the renderer it is the same, it is just done automatically. If you read the docs for the 3ds Max scanline renderer or Eevee (or any other; I just remember these two where it is described very clearly), you will find that for each object with reflections a cubemap is rendered from its center, and if the material has different roughness on its parts, each object will have multiple rendered cubemaps, one for each roughness pass. So in reality a huge number of cubemaps will be rendered for your scene, a real balloon (as you said) of textures for your video processor.
The only difference is that with the first method you do this yourself, and with the second method it is done for you by the engine and your video processor. The first is inconvenient for you and increases the size of your blend file in megabytes; the second is inconvenient for your video processor and increases the amount of video memory used for the job.
Of course you would prefer the first, and it will come to Verge3D when Eevee is released, I think, but for now we recommend using cubemaps. You can find how to make them on our YouTube channel; if you have any questions about baking and using them, just ask here on the forum and you will get help fast.
Co-founder and lead graphics specialist at Soft8Soft.
2018-04-27 at 3:59 pm #4007 mcolinp (Customer)
In your first reply, you mentioned a tutorial for Equirectangular Environment maps . . .
Do you have a link?
Also, when you mentioned creating 3-4 cubemaps depending on the location of objects . . . this seems to me like something that could be automated. The user should be able to go into the node tree of each object or material and add a node (or node group) that calculates the appropriate cubemaps from the World environment in relation to that object/material, perhaps with options for how many levels of different cubemaps it creates and toggles between (1-2 up to maybe 5 or 6 total). Think of how much time would be saved with such a setup!
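Here is a rough sketch of what I imagine such automation could look like, using the reflection probes planned for 2.8's Eevee (the operator and enum names are my assumptions about the 2.8 Python API):

```python
import bpy

# Rough sketch: drop a cubemap reflection probe at the center of every
# selected mesh object, so reflections get captured per object automatically.
# Assumes the Blender 2.8x Eevee light-probe API.
for obj in bpy.context.selected_objects:
    if obj.type != 'MESH':
        continue
    bpy.ops.object.lightprobe_add(type='CUBEMAP', location=obj.location)
    probe = bpy.context.active_object
    probe.name = obj.name + "_probe"
```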
I also realize that there are huge changes coming in 2.8, which you have alluded to, and I am excited to see where things will go with the coming changes. Hopefully this discussion will help shape some of the vision and oversight in those changes. I encourage anyone else invested in these workflows to also pitch in to the discussion. My main hope is that there is some unification around what makes “sense” from a user perspective, to streamline the user experience (for creators) as well as increase the quality of the scenes created.
2018-04-27 at 4:17 pm #4009 mcolinp (Customer)
Also curious if there is any beta support for 2.8/Eevee that people can be involved in. Perhaps a dedicated forum for it would be good. I noticed that the plugin does load in the latest 2.8 daily build, but I've not really spent any time trying to see what I can do differently, if anything . . .
2018-04-27 at 4:45 pm #4010 Alexander Kovelenov (Staff)
Hi!
Thank you for your suggestions. I'm sure we will find a good, artist-friendly solution for everyday usage which is both easy to use and optimized for all platforms, not only for high-end workstations that power ray-traced renderers such as Cycles, V-Ray, etc. I think Eevee is a good first step in this direction, but we still need an even more lightweight and fast solution. Stay tuned!
2018-04-27 at 5:02 pm #4011 mcolinp (Customer)
Being able to support the Principled BSDF PBR shader in Eevee would be a game changer, along with supporting reflections of the World Environment. I realize there are benefits to workflows which are more complex, but I feel there is a need for both, as you have also indicated. Getting something that renders correctly should be a relatively easy path, while optimizing for the best of all possible scenarios should be a secondary and more involved stage.
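For instance, a brushed-metal look would ideally reduce to a setup like this (a sketch against the standard Principled BSDF via Blender's Python API, assuming a 2.8 default node tree; the values are guesses):

```python
import bpy

# Sketch: a simple brushed-metal material using only the Principled BSDF.
# Assumes Blender 2.8, where a new node material already contains a
# "Principled BSDF" node by default; values are illustrative.
mat = bpy.data.materials.new("Steel_Principled")
mat.use_nodes = True
bsdf = mat.node_tree.nodes["Principled BSDF"]
bsdf.inputs["Base Color"].default_value = (0.85, 0.87, 0.9, 1.0)
bsdf.inputs["Metallic"].default_value = 1.0
bsdf.inputs["Roughness"].default_value = 0.3
```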
Perhaps a way to simulate multiple devices and give feedback on what might help in various cases could be a future development goal. I've used Tumult Hype to create really nice animated HTML5 interactive content. It includes a feedback system that specifies which features of your design will be unsupported or rendered differently than expected in certain OSes and browsers. I like this approach as it leaves it up to me to decide what is acceptable. (I can always make a note on a webpage that certain features aren't supported in specific setups . . .)