Posts posted by Kyrah Abattoir

  1. On 1/27/2024 at 4:14 PM, Charlotte Bartlett said:

    Removing topology to create an optimized mesh, especially if not clean, for a novice is far harder than adding it once you have a quality low poly set up.

    Once they master that part of the workflow and the application they can adjust as relevant.


    Debatable.

    I prefer the other way around, as I don't necessarily know in advance what my low poly will look like: the high poly is also part of my design phase.

    • Thanks 2
  2. 15 hours ago, Extrude Ragu said:

    Another case of path to hell paved with good intentions

    Sounds like every argument I ever have with Henri on SL things. I'm sure he has good intentions.

    As a creator I want to use the new features available to me to make better things, and not in ten years. I want to challenge my workflow and be able to make cooler, faster, and better things. The diffuse slot being shared between basic shading and materials already created an impossible situation for creators (a good example of a "transition period" that never effectively ended).

    The official client got rid of basic shading entirely as an option; as far as I am concerned, that's the "law" now.

    This is a good move towards unifying the visual experience for everyone and massively simplifying the rendering pipeline. Shaders are finally the only option.

    Sometimes, less is more.

    2 hours ago, Extrude Ragu said:

    We should give people realistic expectations.

    A realistic expectation would be that a legacy rendering mode is not going to provide the full experience going forward and should be considered unsupported.

    • Like 3
  3. I do want to point out that strictly baking high to low is not necessarily the be-all, end-all approach. There are a few things you should still consider even if you bake high to low:

    • Detail elements (studs, buckles, zipper tabs, chain links): do these really need their own UVs on the texture, or can you bake just one and have them all share the same UVs?
    • Mirrored areas: sometimes the front and back of an object are very close, if not identical; can you get away with having them share UVs?
    • Repeating sections (zips, trims): these typically look bad when baked due to texture resolution limits; having them use a very small repeating texture more often than not looks better.
    • Some things are just better baked separately: especially when you have a lot of overlaps, areas that have natural seams between them can be separated and baked on their own.
    • Don't forget UV seam padding: I usually recommend 4 to 8 pixels of padding between UV islands, to ensure they do not bleed into each other even when the texture is stuck at half resolution (see the sketch after this list).
    • UV packing: if you have a lot of free space in your UV map, it might be worth considering whether you could make better use of the texture space by splitting your UVs differently, using a non-square texture size, or even slightly deforming your UV islands.
    • Normal maps prefer horizontal and vertical seams: if you need to split an area and are worried about the seam it will create in the texture, you usually get better results if the seam is perfectly straight and either vertical or horizontal.
    • More mirroring: you can't always use a mirrored normal map, but just because you can't mirror that map doesn't mean you can't mirror the others and effectively save 50% of your texture memory usage on those.
    • Accents and UV optimization: if you intend on having accents that use a separate face for customization, it might be worth putting these on a separate, smaller texture with its own UVs; this way you don't end up doubling the texture memory usage when the base and details don't share the same texture.
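
    To make the padding rule concrete, here is a minimal sketch (my own illustration with a hypothetical helper name, not from any viewer code): the padding between UV islands shrinks along with the texture, so whatever margin you want left at a reduced resolution has to be doubled for every halving.

    // Illustrative only: each halving of the texture also halves the
    // padding between UV islands, so the authored padding must grow by
    // 2^halvings to keep a given margin at reduced resolution.
    #include <cstdio>

    int required_padding(int margin_px, int halvings)
    {
        return margin_px << halvings; // doubled for each halving
    }

    int main()
    {
        // Keeping a 2 px margin at half resolution needs 4 px at full
        // resolution, and 8 px at quarter resolution, consistent with
        // the 4-to-8-pixel rule of thumb above.
        printf("%d\n", required_padding(2, 1)); // 4
        printf("%d\n", required_padding(2, 2)); // 8
        return 0;
    }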
    • Like 1
  4. 8 hours ago, Modulated said:

    Your prerogative to do so, I support that...but you need to label your products CLEARLY as pbr only so people who don't use it don't get duped into buying something they can't use.

    What makes you think I wouldn't?

  5. 9 hours ago, Henri Beauchamp said:

    I might blacklist the UUID of that texture

    Singling out a random creator isn't exactly a good way to stay in LL's good graces as a TPV dev.

    9 hours ago, Henri Beauchamp said:

    your products will not look revoltingly ugly to people who do not use PBR.

    There is always that, I suppose, but it isn't really promoting PBR adoption.

    It will look revoltingly ugly regardless; they will just assume I can't make textures rather than realizing this content isn't made for them.

    9 hours ago, Henri Beauchamp said:

    And there was no possible fallback for mesh rendering in old viewers

    There was: you saw a generic primitive. I used face 0 to hold the "this is a mesh!" texture, and that's all you saw on a non-mesh viewer.

    • Like 1
  6. On 1/13/2024 at 11:16 PM, gwynchisholm said:

    (long quote)

    Second Life is built under the assumption that all assets are potentially subject to change, which is very hard to optimize. 99.99% of games out there use "cooked" content: every asset that can be considered static is optimized as such, and once you know what will never change you can apply all kinds of advanced culling methods.

    Yes, SL isn't exactly cutting edge in a lot of its approaches, but the core philosophy of its design makes most common optimization approaches impossible.

    We would be in a different situation if, let's say, SL had embraced a low-fi aesthetic more suitable to a streamed world, but LL has been completely hands-off on content guidelines. And so users decided that they want SL to look like an AAA game, despite the fact that it simply cannot do that AND run well.

    You can't blame a Hummer for being unable to outrun an F1 car.

    • Like 1
  7. On 12/5/2023 at 10:59 PM, Codex Alpha said:

    It's more of an ethical choice than a legal one. Most likely you can get away with the legal issues (mostly), but it's your integrity and character (and the satisfaction you get) from  your activities that you alone can decide.

    I would love to get rid of this "decision" :3

  8. If this is any consolation, VRChat (the other major platform that's 100% user-generated content) also runs horribly. 40-person events are a slideshow unless you drop your "max rendered avatars". It could be worse, but there are quirks to the platform that limit the damage players can do.

    Just like in SL, most VRChat content creators do not care about the rating of the avatars they sell, because the users don't care, and therefore no one is paying for that.

    Just as with SL, it is mostly a content problem.


    • Like 1
  9. 5 hours ago, Frionil Fang said:

    Out of curiosity I dug around a bit, and if I compress your old, better quality image with openjpeg 2.5 (I know the version used in self-built Firestorm is 1.4) with the same parameters as defined in the FS code ("-r 1920,480,120,30,10 -mct 1 -I"), I get a 77k file with a mostly identical look to the original. If I leave out the lowest compression rate (10), I get a 26k file with a crusty look mostly identical to the new upload. Could be the version used in FS self-builds is somehow leaving out the intended best-quality compression level; beats me why and how, though.

    So the client IS doing the compression, huh... But at least that's somewhere I could start looking, yeah.

    This is what I dug out of llimagej2coj.cpp:

    JPEG2KEncode(const char* comment_text_in, bool reversible)
    {
        // Zero out the encoder parameters and the event manager, then
        // hook up the logging callbacks.
        memset(&parameters, 0, sizeof(opj_cparameters_t));
        memset(&event_mgr, 0, sizeof(opj_event_mgr_t));
        event_mgr.error_handler = error_callback;
        event_mgr.warning_handler = warning_callback;
        event_mgr.info_handler = info_callback;
    
        opj_set_default_encoder_parameters(&parameters);
        parameters.cod_format = OPJ_CODEC_J2K; // raw J2K codestream, no JP2 container
        parameters.cp_disto_alloc = 1;         // rate allocation driven by tcp_rates
        parameters.max_cs_size = (1 << 15);    // cap the codestream at 32768 bytes
    
        if (reversible)
        {
            // Lossless path: a single layer, rate 1.0 (no compression target).
            parameters.tcp_numlayers = 1;
            parameters.tcp_rates[0] = 1.0f;
        }
        else
        {
            // Lossy path: five quality layers; each rate is a compression
            // ratio (raw size / rate), ordered from coarsest to finest.
            parameters.tcp_numlayers = 5;
            parameters.tcp_rates[0] = 1920.0f;
            parameters.tcp_rates[1] = 960.0f;
            parameters.tcp_rates[2] = 480.0f;
            parameters.tcp_rates[3] = 120.0f;
            parameters.tcp_rates[4] = 30.0f;
            parameters.irreversible = 1;       // irreversible 9/7 DWT
            parameters.tcp_mct = 1;            // multi-component (RGB->YCC) transform
        }
    
        // Replace any previous comment string with a copy of the new one.
        if (comment_text)
        {
            free(comment_text);
        }
        comment_text = comment_text_in ? strdup(comment_text_in) : nullptr;
    
        parameters.cp_comment = comment_text ? comment_text : (char*)"no comment";
        llassert(parameters.cp_comment);
    }

    This appears to be completely unchanged compared to the one in the official viewer source.

    Out of curiosity I've tried the rates you mentioned, which do not match the ones in the source, but no dice.
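
    For reference, Frionil's comparison can be reproduced from the command line with the opj_compress tool that ships with OpenJPEG (a rough sketch; the file names are placeholders, and the rates are the ones quoted above, not the ones from the viewer source):

    # Five quality layers at the quoted rates, irreversible DWT, MCT on:
    opj_compress -i texture.tga -o texture_5layers.j2k -r 1920,480,120,30,10 -mct 1 -I
    # Same encode without the lowest compression rate (the best-quality layer):
    opj_compress -i texture.tga -o texture_4layers.j2k -r 1920,480,120,30 -mct 1 -I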

  10. I've just done a test on the official LL viewer, and my files appear to upload normally there. How very strange, but at least it's not malicious as I feared.

    Currently updating my dev viewer to see if the problem goes away... and if I can figure out what caused it to happen at all, since I don't really mess with the uploader code.

    EDIT: Scratching my head on this one. Why would the client matter for this?

    I'm wondering if it is because of the FS OpenJPEG implementation for people who build without KDU? I'm tempted to open a Jira about it, but who compiles their own viewer anymore anyway? That wouldn't explain why it only started happening this year, though.

    • Thanks 1
  11. 12 hours ago, Frionil Fang said:

    Dunno where your .jp2 format files are coming from though, when I save to disk from the viewer I don't have an option to save it that way. PNG or TGA only.

    That's the native format used internally by SL's asset storage and the viewer when it fetches textures.

    Of course, I cannot "prove" this, but exactly the same file was uploaded.

    Original vs. the 2022 upload: (screenshot comparison)

    Original vs. the 2023 upload: (screenshot comparison)

    (I increased contrast obviously, but it is the same contrast for both)

  12. I've noticed that texture quality is noticeably worse on upload, and has been for at least a couple of months.

    Out of curiosity I've checked file sizes after upload (the files served by SL to the viewer): an old file from last year sits at 77 KB once converted to JPEG 2000.

    (screenshot: file size of the old upload)

    The same file re-uploaded today is now 26 KB and has a very, VERY noticeable loss of quality.

    These two were both uploaded from the same .tga texture, which has remained untouched on my disk drive since October 2022.

    (screenshot: side-by-side comparison of the two uploads)

    I'll keep the language of this post clean, but that's purely out of self-control.

    • Like 1