Posts posted by Fluffy Sharkfin

  1. 7 hours ago, Chic Aeon said:

    MEANWHILE I just wanted to note that it very much DOES matter as those jagged edges happen INSIDE the connected faces on the UV whenever there is a diagonal (this includes circles :D).   Adding padding doesn't help at all in this case.

    Sorry I just assumed that you were referring to the pixelation on the edges outside the UV islands as well since you mentioned in your first post that "I have the same choppy diagonals both on outer and inner edges".

      

    7 hours ago, Chic Aeon said:

    I realize that I could make the islands separate but in some pieces of this project that would have made a TON of islands and I am not really big on scattered UV maps :D. So finding a way to fix the bake was my aim!

    Not to mention that the more scattered and broken up your UV map is the more UVs/vertices the model will have which can sometimes result in higher download weights in the LI calculation.

     

    Since the amount of cleanup needed will vary on a texture-by-texture basis, your ideal solution will likely vary accordingly.  On textures that have a lot of problem areas it may be quicker to use a combination of adding some anti-aliasing during the baking process (assuming you can find settings that work for you), baking at a higher resolution and then resampling/scaling down the image, whereas for textures with fewer pixelated areas fixing them manually in Photoshop will probably be the quicker solution.  In the end it boils down to effective time management, i.e. how many problem areas there are on the texture in question, and how long baking at a higher resolution with anti-aliasing takes vs how fast you are at bitmap editing.

  2. 14 minutes ago, OptimoMaximo said:

    @Chic Aeon Well well, look what I found by chance on the internet

    sQ6Wg.png

    Here is what you need to switch from the default in order to access AA Samples. I would suggest also to zero out Transmission, Subsurface and Volume to avoid computational resources waste on features you most likely aren't using.

    Lol, if you check the last link I posted (Improving AA in Cycles / Removing jaggies) you'll find that image is from the top rated answer. ;) 

  3. I don't use Blender myself, but it sounds like you need to tweak your texture filtering, interpolation and/or anti-aliasing options in your render settings.

    Blender Image Options

    Blender Anti-aliasing Options

    Apparently if you're using Cycles then you shouldn't need anti-aliasing but you could perhaps try increasing the pixel filter width as mentioned here.

    There's also some info that you may find useful on this page Improving AA in Cycles / Removing jaggies.

    Technically the outer edges being pixelated shouldn't matter since they fall outside the UV edges and will never be seen, although you may want to leave a little more room between each UV island as it looks as though the padding is almost bleeding into the neighbouring island in that first screenshot.

    • Thanks 1
  4. I don't really take a lot of pics myself, but I hear the Black Dragon viewer is quite popular with SL photographers; it has a lot of nice ease-of-use features for tweaking your viewer's environmental/visual effects.  Whether you choose to do any post-processing work on your photos is entirely up to you, but if you're looking for free alternatives to Photoshop you may want to try Gimp or Krita.

  5. 6 minutes ago, KanryDrago said:

    The problem as I see it is Sansar is being marketed currently to creators and consumers seem to be an afterthought to Linden labs.

    That's most likely because Sansar is still in "Creator Beta" and LL are trying to encourage creators to sign up and create content so that when they open the Tourists/Roleplayers/Fashionistas Beta there won't be hundreds of angry SL residents signing up and complaining that there aren't 100,000 prefab houses/hair styles/pairs of shoes available on the marketplace? :)

    • Like 1
  6. 40 minutes ago, Cindy Evanier said:

    I had that problem a while ago and the advice at the time was to change my password, wait 24 hours for it to sync and try  the beta grid again.  It worked :)

    That doesn't work anymore unfortunately, according to the Aditi wiki page.

    Quote

    WARNING: June 2016 - If you are a new user and have never logged into ADITI, you will need to contact Support to gain access. This is a temporary bug in Aditi login. 

     

  7. If you're looking for tutorials I'd recommend googling for terms like "low poly game hair tutorial" and "creating textures for hair cards" etc.  The number of results in the videos section alone seems to range between 500,000 and 1.5 million depending on the search terms used, at least the first few pages of which are usually YouTube video tutorials.

    It does seem that rather than using hair "cards" (flat 2D planes) a lot of creators in SL use tubes to give the illusion of volume from a wider range of angles, but the process of creating the textures will be pretty much identical regardless of which geometry you choose.

  8. @Yasojith If you're trying to achieve a smooth transition (i.e. pulsing effect) and are only using a blank texture on the object rather than a specific image then I would definitely recommend exploring the llSetTextureAnim approach rather than attempting to do it all via script.

    Texture animation is a client-side effect, so it's far more resource-friendly and also won't suffer from any slowdown on laggy sims where lots of other scripts are eating up CPU time.  Additionally, since the intensity of the glow is determined by the value of the pixels in the alpha channel of a texture, you may find it simpler to set up more complex transitions and effects in a bitmap editor using gradients.

    For example:

    default
    {
        state_entry()
        {
            llSetLinkPrimitiveParamsFast(LINK_THIS,[
                PRIM_ALPHA_MODE, ALL_SIDES, PRIM_ALPHA_MODE_EMISSIVE, 0, 
                PRIM_GLOW, ALL_SIDES, 0.2, 
                PRIM_TEXTURE, ALL_SIDES, "5c2f0866-56b7-8ea3-00cf-cca31a07a220", <1,1,0>, <0,0,0>, 0.0]);
                
            llSetTextureAnim(ANIM_ON|LOOP, ALL_SIDES, 128, 1, 0, 128, 32);
        }
    }

    uses a 128 x 32 plain white texture with an alpha channel like the attached gradient (EmissiveGradient1.jpg) to produce a steady transition from 0% to 100% and back to 0%.  The actual amount of glow is determined by the settings for the prim, i.e. PRIM_GLOW, ALL_SIDES, 0.2; however, emissive mask mode affects the Full Bright value of the prim, so it will appear slightly brighter than using glow alone.

    Changing the PRIM_TEXTURE settings to

    PRIM_TEXTURE, ALL_SIDES, "79de213f-bc72-748f-d6e1-285e58aff1a4", <1,1,0>, <0,0,0>, 0.0]);

    would use a texture with an alpha channel like the attached gradient (EmissiveGradient2.jpg), which results in a faster transition and a shorter pulse with a longer pause in between.

    Similarly changing the PRIM_TEXTURE settings to

    PRIM_TEXTURE, ALL_SIDES, "2341f9d9-b7a3-e18f-c294-d2c24106de7f", <1,1,0>, <0,0,0>, 0.0]);

    uses a texture with an alpha channel like the attached gradient (EmissiveGradient3.jpg), which would give a 2 second pause followed by an instant transition to max glow, then a 2 second fade back to zero.

    And finally changing the PRIM_TEXTURE settings to

    PRIM_TEXTURE, ALL_SIDES, "1be86627-0563-ad17-0ed5-5006176d231b", <1,1,0>, <0,0,0>, 0.0]);

    uses a texture with the attached alpha channel (EmissiveGradient4.jpg) to produce 3 short half-second pulses at 50% glow (50% grey), followed by a 1 second pulse at 100% glow (white) with a slow fade in and a faster fade out.

     

    As you can see from the above examples, you can easily control the timing and intensity of the pulses simply by adjusting the gradient that you use in the alpha channel of the texture.  And since the script only runs once and then stops, it uses no additional CPU time (in fact, since texture animation is treated as a property of the prim, you can even remove the script once it's been run and the texture will continue animating regardless).
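    As an aside, if you ever want to halt the effect later, a minimal sketch (my own example, not part of the scripts above) would be to call llSetTextureAnim again without the ANIM_ON flag, which stops the animation and leaves the texture on its current frame:

    default
    {
        touch_start(integer total_number)
        {
            // Passing FALSE (no ANIM_ON flag) stops any running texture animation
            llSetTextureAnim(FALSE, ALL_SIDES, 0, 0, 0.0, 0.0, 1.0);
        }
    }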

    • Like 2
    • Thanks 2
  9.  

    4 hours ago, Rolig Loon said:

    Or, if you want another entry in the Even Shorter Sweepstakes:
     

    
    float glow;
    
    default
    {
        state_entry()
        {
            llSetTimerEvent(5.0);
        }
    
        timer()
        {
            llSetLinkPrimitiveParams(LINK_THIS, [PRIM_GLOW, ALL_SIDES, 0.2 * (glow = !glow)]);
        }
    }

    :)

    And yes, always use brackets.

     

    Or how about applying a small blank texture with this for an alpha channel ...

    [attached image: emissive mask alpha channel]

    then setting the Alpha mode to Emissive mask and the Glow to a value of 0.20 and then all you'd need in the script would be...

    default
    {
        state_entry()
        {
            llSetTextureAnim(ANIM_ON|LOOP, ALL_SIDES, 2, 1, 0, 2, 0.2);
        }
    }

    :)

     

    And yes, always always always use brackets! :D

    • Haha 1
  10. 2 minutes ago, Yasojith said:

    Taff Nouvelle and Fluffy Sharkfin thank you so much, that helped me a lot.

    You're welcome!

    And, since I can't seem to sleep right now, here's an even shorter version without the need for an if statement or any variables aside from the integer flag.  You just need to recast the flag from an integer to a float and divide by 5 to give a value of either 0.2 or 0.0, depending on whether the flag is TRUE or FALSE (1 or 0), then use that as the glow value in the llSetLinkPrimitiveParamsFast command. ;)

    integer glow = FALSE;
    
    default
    {
    	state_entry()
    	{
    		llSetTimerEvent(5.0);
    	}
    
    	timer()
    	{
    		glow = !glow;
    		llSetLinkPrimitiveParamsFast(LINK_THIS, [PRIM_GLOW, ALL_SIDES, (float)glow/5]);
    	}
    }

     

    • Like 2
  11. Actually that part of the script is fine; if you put the llSetLinkPrimitiveParamsFast command inside the brackets then it will only ever execute when glowVal is 0.2, so the glow won't turn on and off.

    The real problem is that llSetTimerEvent doesn't return a value, so integer tFace = llSetTimerEvent(5.0); won't work.  tFace needs to be defined either as a global variable at the start of the script, i.e. before default, or as a local variable in the event or function in which it's to be used, in this case the timer event (you can specify a particular face number or use the constant ALL_SIDES to have it change all the faces at once).  The llSetTimerEvent command then goes on its own line.

    Also you could get rid of the llGetLinkPrimitiveParams command and simply use an integer with a boolean value as a flag instead.

    So basically your script would look something like this.

    integer tFace = ALL_SIDES;
    integer glow = FALSE;
    
    default
    {
        state_entry()   
        {           
            llSetTimerEvent(5.0);  
        }    
     
        timer() 
        {
            float glowVal;
            if(!glow)
                glowVal = 0.2;
            else
                glowVal = 0.0;
            llSetLinkPrimitiveParamsFast(LINK_THIS, [PRIM_GLOW, tFace, glowVal]);
            glow = !glow;
    }
    }

     

    • Like 1
    • Thanks 1
  12. 1 hour ago, Klytyna said:

    imagine the lag on a sim set to limit it's self to 50 avatars, if it had 1200 animesh npc bots scampering about...

    Other than being composed of polygons rigged to an animated skeleton, avatars and animesh objects have no similarity whatsoever.   Trying to compare an animesh object to an avatar connected to a client which is constantly sending data to (and receiving data from) the sim is like comparing a car being driven by a human to the automatic barrier at a car park or a railway crossing.

    • Like 1
  13. 4 minutes ago, Qie Niangao said:

    So now I'm thinking, given enough animations, the jellyfish too might have moved "randomly enough", and even if that didn't completely offset the extra rendering, it could be a much easier bit of content to create. I mean, instead of all the offline math I did to figure out movement constraints and the scripting to make it happen, with an Animesh approach it could be just some animations to create and trivially script together. It seems that could be packaged into a pretty accessible toolkit for creators.

    I think it would help if there were more scripted control over the playback of animations.  Something similar to the level of control we have with texture animation, such as the ability to choose custom start and end frames, set playback rate (preferably with interpolation to guarantee smooth motion rather than simply changing the number of frames displayed per second), options like loop, reverse and ping-pong and possibly even some control over the ease in and ease out settings for each animation which would make seamless blending of animations possible. 

    That would allow for the creation of more "random" movement without the need to upload a large number of animations, and also make it a lot easier to create more realistic moving animesh objects since you wouldn't need separate animations to accommodate different movement speeds.  You could control the speed at which wheels/limbs are animated based on the speed at which the object is travelling, making it a lot easier to avoid situations where objects appear to slide across the ground, etc.

    • Like 1
  14. 3 hours ago, Qie Niangao said:

    A big LI would much constrain the applications, that's for sure. Similarly with Pathfinding, the minimum 15 LI imposed on characters seemed to discourage adoption for exactly those simple applications where that limited technology might be useful. But a Pathfinding character came with some hefty overhead, so maybe that LI was kinda necessary.

    And similarly a fairly big LI may be appropriate for the whole Animesh mechanism, which seems very heavyweight, based on animation of a full avatar skeleton. That's all fine for the examples we've seen (basically decorative NPCs), but massive overkill for many applications that would benefit from a simpler way to deform some simple mesh geometry.

    I'm thinking of a little script I wrote for a friend who wanted mesh jellyfish to swim randomly in a tank. Both the jellyfish and the motion were quite simple, the trick being to convincingly coordinate the simple animation with the semi-random path and speed of locomotion. The point is, they were like 2 LI per jellyfish, and a single script could run a bunch of them. I'd have loved a way to flex a simpler mesh rather than texture-animate through faces of a more complex one, but it would be crazy irresponsible to foist on viewers a full animated avatar skeleton rig for each little jellyfish. Fun little applications like this are simply never going to be appropriate for Animesh.

    And that's fine, but there's an important grey area: On balance, will pets and breedables - now nearly "jellyfish simple" - get laggier as they adopt Animesh? And so is a pretty big LI necessary to constrain that?

    I'm guessing that the potential for lag from animesh is linked more closely to the number of polygons being transformed during animation than it is the number of bones being animated? 

    If that's not the case, couldn't they examine each mesh object and take into account the number of bones it's rigged to (since it's now possible to upload mesh that includes only a partial rig)?  That number could then be used as a modifier when calculating the land impact for each mesh, rather than basing the calculation on triangle count alone, so that simpler mesh objects rigged to only 3 or 4 bones are less heavily penalized than those rigged to the full avatar skeleton.

  15. On 10/5/2017 at 6:54 PM, Violaine Villota said:

    with silk velvet the light is reflected more from the fibers that are on the side or edge of whatever it's wrapping around rather than from the front.

    Unfortunately that's something that SL materials simply can't reproduce, for that you'd need a custom shader (which SL doesn't have).

    As has been pointed out you can fake the effect, but in order to do so you'd need to either find/create a custom shader in your 3D rendering app of choice and bake the lighting info into the diffuse texture, or alternatively hand paint the lighting on.  Since you'd then have static highlights and shadows on your texture, the effect is going to work a lot better on, say, a piece of furniture in an indoor environment with a fixed light source (i.e. where the object isn't moving and you can position it so that the lighting on the textures matches the direction of light from any visible light sources) than it will on a piece of clothing that will be moving around and viewed in a variety of differently lit environments.

    • Like 1
  16. 7 minutes ago, carley Greymoon said:

    oh wow, Thank you so much Fluffy:) it's working now as separate scripts. one for each section. 3 sections each with their own buttons. which is just fine with me, it serves my purpose.

    but i don't understand how all 3 could work with one script. the user can input 3 UUIDs into one text box? i know you must be right, i just don't get how the script could know which UUID belongs to what section. lets say the torso texture UUID is 111-1111-1111-111 and the lower is 222-2222-222. how could the script know texture 111-1111-111 belongs to list torso = [42, 43, 44, 45, 46, 47, 48, 49, 53, 54, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134];?

    in any case i thank you profusely:) and everyone else as well.

    In the original script that I posted earlier the textures are meant to be sent in a certain order (head, torso, lower); the script simply assumes that's the order in which the three UUIDs have been sent and applies them to the corresponding bodyparts.

    As for the new script, since you're sending them one at a time you have to specify which bodypart the texture applies to and include that information in the message (i.e. "torso,111-1111-1111-111" or "lower,222-2222-222").  As I suggested earlier, you could include an llDialog command to open a menu with three buttons (head, torso, lower); when the user presses one, the script automatically opens the text input window to allow them to input the texture UUID.  That way they don't have to type in the name of the bodypart, they can just select it from the menu.  Having separate buttons for each section works just as well, although you can also do that using a single script rather than three separate ones if you put the script in the root object and use llDetectedLinkNumber() in the touch event. ;)
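    For reference, a rough sketch of that llDialog approach on the sender side might look something like this (the channel numbers are placeholders of my own choosing, and it assumes the same receiver script as before):

    integer DIALOG_CHANNEL = -54322; // placeholder channel for the menu
    integer INPUT_CHANNEL = -54321;  // placeholder channel for the text box
    string bodypart;                 // remembers which button was pressed
    
    default
    {
        state_entry()
        {
            llListen(DIALOG_CHANNEL, "", "", "");
            llListen(INPUT_CHANNEL, "", "", "");
        }
    
        touch_start(integer total_number)
        {
            // Menu with one button per bodypart
            llDialog(llDetectedKey(0), "Select a bodypart to texture",
                ["head", "torso", "lower"], DIALOG_CHANNEL);
        }
    
        listen(integer channel, string name, key id, string msg)
        {
            if (channel == DIALOG_CHANNEL)
            {
                // Remember the choice, then ask for the texture UUID
                bodypart = msg;
                llTextBox(id, " \n Enter a texture UUID", INPUT_CHANNEL);
            }
            else
            {
                // Forward "bodypart,UUID" to the receiver script
                llRegionSay(-12345, bodypart + "," + msg);
            }
        }
    }

    The receiver script from my earlier post should understand the forwarded message without any changes.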

     

    Anyway, I'm glad it's working for you now! :)

  17. Okay since you're sending texture UUIDs one at a time rather than all three at once, try this...

    Sender Script (based on the script you currently have)

    default
    {
        state_entry()
        {
            llListen(-54321,"","","");
        }
    
        touch_start(integer total_number)
        {
            llTextBox(llDetectedKey(0)," \n Enter a bodypart and texture UUID",-54321);
        }
       
        listen(integer channel, string name, key id, string msg)
        {
            llRegionSay(-12345,msg); 
        }
    }

    Receiver Script (based on the script I posted earlier)

    list head = [1,2,3];
    list torso = [4,5,6];
    list lower = [7,8,9];
    
    setTexture(list link_numbers, key texture)
    {
        integer i;
        for (i=0; i < llGetListLength(link_numbers); i++)
            llSetLinkPrimitiveParamsFast(llList2Integer(link_numbers,i), [PRIM_TEXTURE, ALL_SIDES, texture, <1,1,0>, <0,0,0>, 0.0]);
    }
    
    default
    {
        state_entry()
        {
            llListen (-12345,"","","");
        }
    
        listen(integer channel, string name, key id, string msg)
        {
            if (llGetOwnerKey(id) == llGetOwner()){
                list data = llCSV2List(msg);
                string bodypart = llList2String(data,0);
                key texture = llList2Key(data,1);
                if (bodypart == "head")
                    setTexture(head,texture);
                else if (bodypart == "torso")
                    setTexture(torso,texture);
                else if (bodypart == "lower")
                    setTexture(lower,texture);
            }
        }
    }

     

    In order for it to work you need to specify both which bodypart you wish to apply the texture to and the texture UUID as a CSV.

    For example:

    Quote

    head,89556747-24cb-43ed-920b-47caed15465f

    will texture the head with the default plywood texture.

     

    It's not the most elegant solution for selecting which bodypart to texture; personally I'd go with a HUD or at least an llDialog menu rather than having to type the bodypart name into the text box, but at least you can test it and see it working. :)

     

  18. 1 hour ago, Nova Convair said:

    llList2Key only works if the list has stored a key it does not convert a string to a key

    From the wiki 

    Quote

    If the type of the element at index in src is not a key it is typecast to a key. If it cannot be typecast null string is returned.

     

    carley, the reason the script isn't working as you have it now is that it was written to work with three texture UUIDs.  If you're passing a single UUID to it then you simply need to change

    setTexture(lower,(key)llList2String(textures,2));

    to

    setTexture(lower,(key)llList2String(textures,0));

     

    But, that would defeat the purpose of separating the message into a list of UUIDs since there's only one.

    The idea is to handle changing all the textures of the linkset using a single script rather than using multiple scripts for each texture. :)
