
What's the best way to bake textures in Blender for SL?



Hello,

When you bake a texture in Blender Cycles for use in SL, what params do you use? Do you have any tip or advice? I always spend hours trying different combinations and I can't find any general rule or guideline to follow... Any tip welcome, thank you!

I'm talking about this feature of Blender:

[image: Blender's Bake settings panel]


Blender's materials do not map one-to-one into SL materials, so I'd say it depends.

SL's diffuse and specular don't map to any single bake: baking a lot of lighting/gloss into the diffuse texture will look nice from a single angle in SL in a controlled environment, but SL rarely restricts the view to a single angle or a controlled environment. So my philosophy is to bake as little lighting/shadow into the diffuse texture as possible, just enough to get a little extra definition going, and to let the advanced lighting shader handle the rest. That will, of course, look more dull for someone who isn't using advanced lighting, but I never found turning it off an acceptable sacrifice even when I used a 2012 laptop for several years.

I'm absolutely not a pro and there are probably better approaches, but personally, I set up a scene with subtle AO/shadowing - just enough for extra definition, but no stark shadows or shine - and bake a Combined bake for the diffuse texture (with all options selected; I've never had any reason not to have all options in a Combined bake enabled for SL texture purposes). The normal map bake, luckily, is simple, as it corresponds directly to SL's. For SL's specular map, environment (the specular map's alpha channel) and glossiness (the normal map's alpha channel), I temporarily edit the Blender materials to display the appropriate parts of the material directly as emission, bake an Emit map, and then use an image editing program to combine them manually.

For example, if I was generating the material's roughness as a procedural texture connected into the roughness slot of a Principled shader, I'd instead plug that texture into an Emission shader and output that - i.e. if you're using Node Wrangler, just use the Connect to Output feature. Mind you, SL uses glossiness, not roughness, so you have to invert the generated map before applying it to the SL-bound texture.
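In bpy terms, that temporary rewiring could look roughly like this (just a sketch - the node names are placeholders for whatever your material actually uses, and it assumes Cycles with the object and a bake-target image node already selected):

```python
import bpy

# Temporarily route the roughness source into an Emission shader so it can be
# baked with the "Emit" bake type. Node names below are placeholders.
mat = bpy.data.materials["MyMaterial"]
nodes, links = mat.node_tree.nodes, mat.node_tree.links

rough_src = nodes["Roughness Texture"]   # whatever node feeds the Roughness input
output = nodes["Material Output"]

emit = nodes.new("ShaderNodeEmission")
links.new(rough_src.outputs[0], emit.inputs["Color"])
links.new(emit.outputs["Emission"], output.inputs["Surface"])

bpy.context.scene.render.engine = 'CYCLES'
bpy.ops.object.bake(type='EMIT')
# Afterwards: restore the original Surface link, and invert the baked map
# (SL wants glossiness, not roughness).
```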



Here's my workflow - simple, but it seems to produce decent results in-world...

Texture the model in Blender, using as many materials as you want/need and UV mapping it to suit that process, ignoring SL's limitations.

Make a separate UV map for SL, this time planning how your SL material faces will be laid out. Tie that SL UV map to the image you're using as a bake target in Blender, so the islands are baked using the UV that SL will end up seeing.

Making the SL diffuse map:
Bake AO and Diffuse (influence ONLY from "color", not any Blender lighting, so you get a "real" diffuse map out) separately. Load both as separate layers into GIMP. Color-to-alpha on the AO map and set the layer as a whole to 50% transparency. Save the visible result as your SL diffuse map. If you're not going to be using the alpha channel on that face, save it without one.
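If it helps, here's a rough scripted equivalent of those two bakes (a sketch only - it assumes Cycles, the model selected, and an image texture node picked as the bake target for each pass):

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# "Real" diffuse: bake only the Color contribution, no direct/indirect lighting.
scene.render.bake.use_pass_direct = False
scene.render.bake.use_pass_indirect = False
scene.render.bake.use_pass_color = True
bpy.ops.object.bake(type='DIFFUSE')

# Separate AO bake (switch the active target image node first so the
# diffuse result isn't overwritten).
bpy.ops.object.bake(type='AO')
```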

Making the SL normal map:
Trying to "bake normals" like this produces artefacts. Temporarily change your Blender materials to connect the normal map to "base color" and bake it as a diffuse instead.

Making the SL specular map:
This is a pain in the a** in Blender. I make mine manually in GIMP.

Now duplicate the model, add slots for the SL materials you plan, and create them using the textures you've baked and saved and the "SL UV map" you created earlier. Assign all the faces from the working textures in Blender to the appropriate "SL face" in the copy of your model. Remove all other materials and UV maps from the copy. Preview your model copy in Blender textured with this "SL-like" arrangement (you won't get it exact, but you'll see any glitches or weirdness).

Make your LOD models from the copy and upload.


7 hours ago, Da5id Weatherwax said:

Bake AO and Diffuse (influence ONLY from "color", not any Blender lighting, so you get a "real" diffuse map out) separately.

Yep.

Quote

Load both as separate layers into GIMP. Color-to-alpha on the AO map and set the layer as a whole to 50% transparency.

Much better to set the AO layer's blending method to "Multiply". Leaving it on "Normal" and trying to work in some varying transparency will still result in non-shadow areas looking a little foggy, as the AO map's gray and white pixels get slightly applied.
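For what it's worth, the multiply combine can also be scripted outside of GIMP with Pillow; a minimal sketch, assuming both bakes share the same resolution (file names are placeholders):

```python
from PIL import Image, ImageChops

diffuse = Image.open("diffuse_color_bake.png").convert("RGB")
ao = Image.open("ao_bake.png").convert("RGB")

# Multiply blend: white AO pixels leave the diffuse untouched, dark ones darken it.
combined = ImageChops.multiply(diffuse, ao)
combined.save("sl_diffuse.png")
```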

Quote

Trying to "bake normals" like this produces artefacts. Temporarily change your Blender materials to connect the normal map to "base color" and bake it as a diffuse instead.

Well, normal maps aren't made by baking a texture render, so I'm not surprised it doesn't work. I don't understand your alternate method though.

For Blender to bake a normal map, you need two mesh objects (or one that you've sculpted with a multiresolution modifier): the one intended for SL and a high-poly one. The baking depends purely on geometry and doesn't care about lighting, shader nodes or textures (except to determine where to store the bake result).

Or you could use something like GIMP or an online filter to generate a normal map from your diffuse, though that's really just a rough approximation that assumes the diffuse texture can be treated as kind of a bump map if you just do some edge detection or grayscaling on it, maybe.


15 minutes ago, Quarrel Kukulcan said:

Well, normal maps aren't made by baking a texture render, so I'm not surprised it doesn't work. I don't understand your alternate method though.

For Blender to bake a normal map, you need two mesh objects (or one that you've sculpted with a multiresolution modifier): the one intended for SL and a high-poly one. The baking depends purely on geometry and doesn't care about lighting, shader nodes or textures (except to determine where to store the bake result).

Or you could use something like GIMP or an online filter to generate a normal map from your diffuse, though that's really just a rough approximation that assumes the diffuse texture can be treated as kind of a bump map if you just do some edge detection or grayscaling on it, maybe.

I'm using my own library of PBR textures - they include normal maps for each material and those are included in my initial texturing of the model. Baking the normal maps down as a diffuse map - so that Blender handles the color data just like any other image texture and puts it in the right place on the "mapped to SL UV" target - gives me a normal map I can upload to SL, and all the normals are right for each texture.

I'm generally not using normal maps to "fake geometry" beyond that. I do make them in GIMP if I'm creating a bas-relief or carved surface: for those I make a greyscale heightmap and convert it to a normal map in GIMP, but then that becomes part of my input normal maps alongside the PBR ones I use for the materials - and it needs baking to the SL UV map as color data just like the others.

I totally get that using a more finely subdivided and detailed mesh as a source for a normal map is a thing - it's just that in that final step I HAVE all my normal maps, they are in the Blender render pipeline already - and while it is intuitive to "bake the normal map" in that situation to try and make a "normal map for SL", that isn't what Blender expects when baking normals and the end result will not be what you want. Bake it as a diffuse map to convert it from whatever UV suited your input textures to the one UV that will remain when you upload the model and textures to SL.


2 hours ago, Da5id Weatherwax said:

I'm using my own library of PBR textures - they include normal maps for each material and those are included in my initial texturing of the model. Baking the normal maps down as a diffuse map - so that Blender handles the color data just like any other image texture and puts it in the right place on the "mapped to SL UV" target - gives me a normal map I can upload to SL, and all the normals are right for each texture

Ahhh. Gotcha.

(Make sure your texture nodes are marked "non-color data" instead of "sRGB" so Blender doesn't do any gamma correcting. That will mess up your vectors slightly.)


9 minutes ago, Quarrel Kukulcan said:

(Make sure your texture nodes are marked "non-color data" instead of "sRGB" so Blender doesn't do any gamma correcting. That will mess up your vectors slightly.)

I got burned by that ONCE. I'd forgotten to set one texture node holding a normal map to non-color. Those bits of that face looked like sh**. I was like "WTF? I did this right just like the others..." and then had a total D'OH moment when I spotted the one misconfigured node. Since I do not wish to feel that stupid again, I now check all of them before I hit the "bake" button for the first time on any model.
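For anyone equally paranoid, a quick check can also be run from Blender's Python console; just a sketch, but it flags any image node that feeds a Normal Map node and isn't set to Non-Color:

```python
import bpy

for mat in bpy.data.materials:
    if not mat.use_nodes:
        continue
    for node in mat.node_tree.nodes:
        if node.type != 'TEX_IMAGE' or node.image is None:
            continue
        # Does this image node feed a Normal Map node?
        feeds_normal = any(link.to_node.type == 'NORMAL_MAP'
                           for out in node.outputs for link in out.links)
        if feeds_normal and node.image.colorspace_settings.name != 'Non-Color':
            print(f"{mat.name}: '{node.image.name}' should be set to Non-Color")
```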


On 5/20/2022 at 3:20 AM, Aglaia said:

Hello,

When you bake a texture in Blender Cycles for use in SL, what params do you use? Do you have any tip or advice?

 

Ok, so I've actually spent a lot of time trying to find the 'correct' answer for this in the past. Apologies that my post is a little longer than the others, but I hope it will be helpful.

Channel Packing

Firstly, the most important thing to know is that SecondLife uses channel packing. Channel packing means using the alpha channel of an image to store data other than transparency, in order to save on disk space.

The alpha channel of every pixel is a value between 0 and 1 (or 0 and 255 in image terms) - data we pack into our image from another greyscale (black and white) image.

According to the Material Data for SecondLife documentation, SecondLife uses three textures, each using the alpha channel for a different purpose:

Quote
 
Parameter    | Red           | Green         | Blue          | Alpha
Diffuse Map  | Red           | Green         | Blue          | selectable (see Alpha Mode)
Normal Map   | Normal X Axis | Normal Y Axis | Normal Z Axis | Specular exponent
Specular Map | Red           | Green         | Blue          | Environment intensity

 

OK, so this might be a bit overwhelming, but I'll go over them one by one and try to break them down as best I understand them.

Texture 1 - Diffuse map

The RGB channels are the diffuse map. At a passing glance you might think this is just the color of the object - it's similar, but not quite. What this really describes is how light is diffused over the object.

So, to be technically correct, the 'Diffuse' bake from the Bake Type dropdown in Blender will get you the RGB channels for Diffuse, as that takes the lighting and material of the object into account.

Some people might opt to use the 'Combined' bake but that is a debate to be had about graphics settings and a whole can of worms I'm just going to avoid.

What about the alpha channel on Diffuse?

The alpha channel on a diffuse map is optional; in SecondLife it can be used for three different purposes:

  • Transparency - Alpha Blending
  • Transparency - Alpha Masking
  • Emission

Emission data is useful for objects that emit light in different places - think, for example, of a tech control panel with LEDs and such; you can have just little spots of the texture light up. For that you'd bake an Emission map and then use channel packing to combine it with your diffuse (I will describe channel packing further down, and there's a quick sketch of this particular pack just after these three points).

Alpha blending is good for objects with semi-transparent materials - Note that this is expensive to render as the computer drawing each pixel has to now do additional maths using the alpha channel. Avoid alpha blending where you can.

Alpha masking is good for stuff like tree leaves etc where it's either fully visible or it's fully transparent - This is less expensive than alpha blending.
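Jumping ahead a little, the emission case is just another channel pack; a minimal Pillow sketch, assuming matching resolutions (file names are placeholders, and the full channel-packing section is further down):

```python
from PIL import Image

diffuse = Image.open("diffuse_bake.png").convert("RGB")
emission = Image.open("emission_bake.png").convert("L")   # greyscale emission mask

r, g, b = diffuse.split()
packed = Image.merge("RGBA", (r, g, b, emission))
packed.save("sl_diffuse_emissive.png")
```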

Texture 2 - Normal Map

You would get the RGB data by selecting the bake type 'Normal'

The alpha channel - What the heck is a 'Specular Exponent?'

Quote

The alpha channel of the Normal Map may contain a specular exponent value that is multiplied by the "Glossiness" parameter

So what specular exponent in the normal map's alpha really means is that it takes each pixel's alpha value from the normal map (from 0, fully transparent, to 1, solid) and multiplies it against the Glossiness setting in-world:

[image: the in-world Glossiness setting]

What this means in practice is that the alpha channel on your normal map is really important, because it carries the glossiness data of your material.

The confusing part

Ok so we know that the alpha channel on a normal map controls glossiness, so we need to bake 'Glossy' from the Bake Types in Cycles. Right? Right??

Unfortunately not, baking Glossy in Cycles produces RGB data that is actually used for our specular map >.<

So what the heck do we bake?

Well, we want a grayscale image to use as glossy, and unfortunately there is no option for that - however, there is an option to bake the inverse of what we want: the Roughness bake, which is literally our glossiness data inverted. So, to get the correct alpha channel:

1. Bake Roughness
2. Use an image editor to invert the roughness - this is our greyscale Glossiness map. It's also possible to invert the texture inside Blender's Image Editor by going to Image -> Invert -> Invert Image Colors (a scripted version of that step is sketched below).
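If you'd rather do the inversion from Blender's Python console instead, something like this works directly on the baked image's pixels (slow for large images; "Roughness_Bake" is a placeholder for your image's name):

```python
import bpy

img = bpy.data.images["Roughness_Bake"]   # placeholder: your baked roughness image
px = list(img.pixels)                      # flat RGBA float list
# Invert R, G and B, but leave alpha (every 4th value) untouched.
img.pixels = [p if i % 4 == 3 else 1.0 - p for i, p in enumerate(px)]
```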

Texture 3 - Specular Map

RGB Channels - I've reached the conclusion that this is the 'Glossy' bake from Blender, as it produces the data that looks most correct to me in my experience.

Alpha Channel - Controls the reflected environment map; i.e. it's a multiplier for the in-world Environment setting:

[image: the in-world Environment setting]

In theory this is good for metals, but honestly SecondLife's environment map sucks, so I tend to avoid using this. Some PBR materials you might download come with reflection maps - I think what you'd do is simply plug the reflection map into emission inside your node editor and then bake Emit to get this data, if you wanted to reflect the 'environment', but honestly today that effect sucks, so :/

SecondLife is currently in the process of implementing PBR/glTF, and when that happens, using the alpha channel on specular will become more desirable and worth experimenting with. But by then you'll be uploading PBR materials, so this whole thing I'm teaching you now will become legacy information.

How do I channel pack textures?

In my experience, photo editors such as Photoshop and Affinity Photo do not work, as they destroy RGB data when a pixel becomes fully transparent, for reasons beyond my understanding.

The best way to channel pack a texture is to use a tool built explicitly for channel packing

I recommend the Multi-Channel Image Tool: it is free, open-source, and the easiest-to-use channel packer I've found.

[image: the Multi-Channel Image Tool]

If you're not familiar with Github, you download it by clicking "V.1.2.0 - Optimized..." under 'Releases' and then clicking 'Multi-Channel-Image-Tool.zip' to download the program.

On the Combine tab in the program there are R, G, B and A tabs, and you set the data for each channel in its respective tab.

Example

So for example, if I wanted to channel pack a normal map for SecondLife, I'd use my normal map texture for the R/G/B tabs, and set the channel to extract from the image to R/G/B respectively.

Then for the Alpha channel, I'd select my Glossiness texture (inverted Roughness). It will default to extracting Alpha data from the Glossiness map, which is wrong since our image is black and white, so we'll just choose 'Red' - black and white images have the same value for Red/Green/Blue.

Finally I'd click Save As, and now I've saved a channel-packed image!
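If you'd ever prefer to script that pack instead of using the GUI, here's a rough Pillow sketch of the same normal + glossiness combine (file names are placeholders, all images must share one resolution, and PNG keeps the RGB values even where alpha is 0):

```python
from PIL import Image, ImageOps

normal = Image.open("normal_bake.png").convert("RGB")
roughness = Image.open("roughness_bake.png").convert("L")

glossiness = ImageOps.invert(roughness)    # SL wants glossiness, not roughness
r, g, b = normal.split()
packed = Image.merge("RGBA", (r, g, b, glossiness))
packed.save("sl_normal_packed.png")
```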

 

Anyway that's my wealth of knowledge on the subject. I hope it helps you and any other SecondLife resident looking for this info :)

FWIW, the upcoming PBR materials are planned to use the following spec (Subject to change!) :

Channels | Texture                                                                    | Modifiers
RGBA     | Albedo (+ transparency)                                                    | RGB Tint
RGB      | Normal (MikkT space, Y+)                                                   | (none)
RGB      | Emissive                                                                   | RGB Tint
RGB      | ORM (combined greyscale maps: R = Occlusion, G = Roughness, B = Metalness) | Metallic factor (float), Roughness factor (float)

Conveniently, that channel layout maps to the existing conventions used by UE4 (except for the normal map: UE uses Y-, whereas SL expects Y+) and Godot (which uses Y+ normal maps, so no incompatibilities!), so most PBR-friendly programs already support export in that format.

Note that in this (proposed) new layout, the only alpha channel in use is contained within the Albedo texture.
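To illustrate the ORM part of that layout, here's a small Pillow sketch that packs three baked greyscale maps into one RGB texture (file names are placeholders; the maps must share one resolution):

```python
from PIL import Image

occlusion = Image.open("ao_bake.png").convert("L")
roughness = Image.open("roughness_bake.png").convert("L")
metalness = Image.open("metallic_bake.png").convert("L")

orm = Image.merge("RGB", (occlusion, roughness, metalness))
orm.save("orm.png")
```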


Interesting that we will be able to color emission independently.

At first reading, seeing that the alpha channels are going underutilized seemed bad (think: disk space), but then the alpha channel has always been problematic to pack due to image editors like Photoshop and Affinity Photo being 'smart' with the alpha channel, so at least it'll make life easier.

I suppose since in practice most textures won't use Emission, it remains 3 texture uploads like before, so 30L$ a pop.

The Occlusion channel is interesting; I suppose this is just for 'fake' detail and not a replacement for the occlusion rendered by the viewer.

Since these objects would render under different rules anyway, it would seem like a good opportunity to also make the real-time occlusion a bit less weak for objects rendered using PBR, so creators don't feel the need to bake AO onto everything and we can use repeating textures for houses and large objects instead of lots of baked ones.

 


8 minutes ago, Extrude Ragu said:

At first reading, seeing that the alpha channels are going underutilized seemed bad (think: disk space), but then the alpha channel has always been problematic to pack due to image editors like Photoshop and Affinity Photo being 'smart' with the alpha channel, so at least it'll make life easier.

Yeah, Photoshop is particularly bad for that: unless you're working in TGA (with an alpha channel, NOT transparency), all of the colour data in the transparent areas is lost.

Another thing, leaving those alpha channels unused is actually a good thing -- It makes implementing BoM for Materials muuuuch easier as now those materials textures can contain transparency information, which means they can be layered. (Note that this reasoning is from my headcanon, not anything official, but I'm advocating for continuing on this route for this reason)


On 6/17/2022 at 3:39 AM, Extrude Ragu said:

The alpha channel on a diffuse map is optional; in SecondLife it can be used for three different purposes:

  • Transparency - Alpha Blending
  • Transparency - Alpha Masking
  • Emission

It also modulates Glow level. This is true regardless of alpha mode -- even None.


15 hours ago, Jenna Huntsman said:

Another thing, leaving those alpha channels unused is actually a good thing -- It makes implementing BoM for Materials muuuuch easier as now those materials textures can contain transparency information, which means they can be layered. (Note that this reasoning is from my headcanon, not anything official, but I'm advocating for continuing on this route for this reason)

From the table, it doesn't seem that the alpha component is actually being taken into account. Seeing the base color marked as RGBA and the others just as RGB, apparently the alpha channel would be discarded altogether...


21 minutes ago, OptimoMaximo said:

From the table, it doesn't seem that the alpha component is actually being taken into account. Seeing the base color marked as RGBA and the others just as RGB, apparently the alpha channel would be discarded altogether...

That's okay as the individual channels are uploaded as textures, so the alpha channel is preserved when uploaded, but unused (/ discarded by the shader) for PBR materials - so it's information the bakes service can use in order to be able to layer materials textures on top of each other. (hypothetically)

This also means that for a 4-channel PBR material, the upload cost would be 40 L$ (under current pricing)


On 6/19/2022 at 4:03 PM, Jenna Huntsman said:

That's okay as the individual channels are uploaded as textures, so the alpha channel is preserved when uploaded, but unused (/ discarded by the shader) for PBR materials - so it's information the bakes service can use in order to be able to layer materials textures on top of each other. (hypothetically)

This also means that for a 4-channel PBR material, the upload cost would be 40 L$ (under current pricing)

That leaves out the detail of how the bake service will treat those channels discarded by the shader, because in the end those layers will be applied to the body and therefore to the shader.

I would make the feature request to LL clear and detailed, so that it doesn't get lost in their WIP as a secondary detail that, considering how they work, would inevitably fall into the "oh, we forgot about that, but now we can't change things to make it work" category.

