
Yingzi Xue

Resident
  • Posts

    259
  • Joined

  • Last visited

Posts posted by Yingzi Xue

  1. Question:  Traffic games that use go-between game "currency" as payout and then let you cash out at their ATM... are these games considered within the same category as "Skill Games" that pay out?

    Personally, I am glad these games are being limited to private regions.  General Terms and Conditions, item 2 states, "Skill Gaming is not permitted in the Second Life area in which Linden Lab is the estate owner (the “Mainland”)."  Yee freaking haw!  The impact one traffic game can have on a region is enough to warrant keeping them off the mainland.  When you can't even access your own land because a traffic game is attracting a large number of players, something needs to be done.  As a long-time mainland owner, I've intermittently been denied entry to my own land for years because of a sploder or game.

  2. 1) It sounds like you're describing an RPG.  SL is not an RPG or an MMORPG, but it has role-playing areas.  As has been said many times in this thread, the people make SL what it is.  If you want something to exist, make it.  To expect LL to suddenly become a content creator after 11 years of being the facilitator/infrastructure provider is a bit much.

    2) Here's the problem I have with any corporation hiring graphic designers to make anything.  Look at Microsoft's operating systems, from Windows 3.1 through Windows 8.  The graphics have sucked for the whole life of Windows.  That's just one example.  The real problem is that when someone isn't emotionally or creatively invested in a project, it ends up lifeless.

    3) This is correct, which is what makes SL what it is, being able to do your own thing, create your own world.

    4) They're not going to upgrade 11-year-old technology.  That's like asking Microsoft to rewrite Windows XP to work with new hardware and new technologies--gigs of memory, larger hard drives and expanded graphics capabilities.  The architecture doesn't support it and never will.

  3. Even though it's 11 years old, Second Life would be huge shoes to fill.  There are several things I would require before I'd even think about moving to another platform.  If you've taken a look at High Fidelity, it has some really cool features that I think will be a necessity for anyone who wants to compete with it.  Measured against what High Fidelity is doing, SL has its work cut out for it.  Look at what High Fidelity can and will be able to do.

     

    • Scripting language with powerful control over what you create.  HF currently uses JavaScript.
    • Intuitive building system that supports mesh in-world and voxels that can be sized to atomic level.
    • Complete in-world avatar customization, including custom skeletons/bone arrangements.
    • Allows people to share their computers with each other to act as servers or as scripted interactive content.
    • Allows many different people and institutions to deploy virtual world servers, interconnecting those servers so that people and digital objects can travel among them, and harnessing shared computing devices to scale their content and load.
    • When used with the new display and input devices coming to market, High Fidelity will enable a planetary-scale virtual space with room for billions of people, served by billions of computers.
    • The server architecture alone is genius.
  4. I am excited about the possibilities of SL 2.0, but my confidence that LL will be able to pull it off is low.  The good news is, if LL doesn't fill those shoes, High Fidelity or someone else may come along and do it.  That it's happening at all is what's promising; this is the future of virtual worlds.  If none of them make the grade, we still have Second Life or an equivalent.

  5. Although it's a cool idea to tie Google Maps in with Second Life for real-world/in-world locations, there are inherent problems with that.

     

    1. Second Life mainland is unique and doesn't resemble real-world land mass.
    2. The space and resources needed to recreate real-world continents alone would be prohibitive.
    3. If you used existing mainland continents for real-world location replicas, it wouldn't translate well; there would be a disconnect, because you'd have a real-world location sitting on a fictional continent.  Google Maps would then be pointless, except as a pointer to an in-world location.
    4. Who would maintain these locations?  Who would run them?  LL wouldn't.  Finding people to adopt these locations and try to turn them into places the community wants to congregate, that's a challenge in itself.

    Second Life is what you make it.  Therefore, if that's what you want to see, make it happen, create it.  It should be noted that OpenSim is more suitable for a Google Maps project, since you can separate the grids.

  6. Second Life is community driven.  This means the world is what it is because the users make it that way.  That's the beauty of Second Life.  LL has many areas that were made specifically for premium members (with help from users), but having a design team at LL make content for the world isn't how Second Life was envisioned, and I think it would detract from the world as a whole.  If you've ever been to an SL hub in-world, you might notice that all of them have one thing in common--no soul.  They feel empty and pointless.  This would surely happen if LL started creating their own content.  In Second Life, people want complete control of their content, and they don't want that content coming from a corporation.  What makes Facebook thrive?  It's the user-shared content, not content made by Facebook.  Even if LL made all of the attractions you requested, they would be empty, because people prefer something that feels active and vibrant; that only comes from users who passionately create in Second Life, not some department at a corporation that craps out canned content.

    Second Life IS NOT A GAME, it's a virtual world.  While there are games in-world, and there is role playing and combat, that's a small part.  Second Life is a community.  It's a medium that allows people to create worlds only they can imagine and experience things they can't do in real life.  It's not an MMO and it's not about getting your next gaming fix with throw-away content.  Second Life means different things to different people, but it's important to understand what Second Life is and was meant to be.  Losing sight of that is a bad thing.

    Second Life as it is has reached the end of its life as far as appealing to new users is concerned.  We're on the cusp of new technologies and the next generation of virtual worlds.  I liken it to BBS's and the internet.  While Second Life is still valid as a virtual world, it's aged 11 years.  New technologies will eclipse it in the next several years and it will be a distant fond memory.  Oh there will still be open sim and people who will enjoy Second Life in some form; there are still BBS's, but they are empty nostalgic time capsules that no one uses anymore.

    I love Second Life.  It's the best creative outlet I've ever found.  I've been in SL since 2007 and I still find it as addicting as the day I joined.  I never get tired of it.  Despite its clunkiness and its flaws, it is still the best virtual world.  I can't see LL spending a lot of time or effort on an old platform.  They're focused on the next generation of virtual world, which will be more accessible, easier to use and more immersive.  Look at what High Fidelity is doing and how it works.  It's going to be some amazing stuff.  If LL can equal or better it, we're going to see some really cool things in the next few years (I hope).

    I see the writing on the wall.  Second Life isn't going to live forever.  I'll be here til the last day, unless something else is so good it pulls me away.  Where we're headed is exciting.  I see social media being merged into virtual worlds, where there is no limit to what you can do.  I think Facebook sees it too, which is why they're invested in it.

    Second Life isn't going to change much from here on out.  I'm fine with that.  Still, there are things that could be done to bring more people in: reducing prices across the board, building a community-based support network, fostering more community-driven events and locations, and bringing back the institutions and real-world business presences that left SL because of the price increase, then letting them create their unique content.  Then show the world through ads how appealing SL is as a virtual world and what you can do inside it.  I still think they've been missing a huge opportunity all along in how they present SL to new users, and they do nothing to keep it from seeming so complex and daunting.  There's no real official training on how to build or script, except for the wiki.  There are many things LL could do in an official capacity that they aren't doing, but making content beyond their infrastructure role is not something I want them to do.

    My $0.02 worth.

  7. Each router is different.  Check with the manufacturer website for the manual.  Some use a local web address that points to the router, others use software to configure.  Then it's just a matter of looking at the settings.  Here's an excerpt from a thread where someone had the same problem and fixed it.

    http://community.secondlife.com/t5/Technical/Voice-not-working-tried-everything/qaq-p/1754063

    I had a similar problem with voice suddenly not working any longer. My issue turned out to be a problem with my home network. I had to reset my router back to factory settings and I replaced a network cable that wasn't making a secure connection. Despite this being what fixed it, it wasn't affecting any other programs.

  8. llSensor/llSensorRepeat can't detect attachments.  The scripts have worked so far because one has always been in-world (not attached to an avatar).

    Since you can't detect attachments with a sensor, you have to resort to other methods. 

    One possible scenario:

    ---

    Chat negotiation, where two attachments negotiate with each other for arrows.

    Since you can't detect attachments using a sensor, you must spam chat to find other attachments to negotiate with.  This can be limited by using a sensor to only send out chat negotiations when avatars are detected within a certain range (AGENT_BY_LEGACY_NAME instead of PASSIVE | SCRIPTED).  You can further limit chat spam by adding in pauses between sending chat messages.

    Chat negotiation would work fine between two attachments where each receives a message to send arrows to the target.  It's when you have a third attachment (a third avatar) or a whole group of attachments that we run into chaos.  Attachments will fight each other.  Arrows will go from one to another based on which attachment received which message first.  Then when the chat spam continues (because it will, since you're detecting avatars), arrows will change targets.  It'll be a mess.  Since all attachments are created equal, there have to be master/slave relationships, otherwise your scripts will be fighting each other.  An easy fix would be to allow the avatar to select who they want to target with the attachment via dialog menu.  You could also use script states to compartmentalize negotiation/listen/sensor/chat.  Keeping track of avatars or attachments (or both) that are negotiating/being interacted with might help keep things in check.

    Once negotiation happens and is locked in, all further communications would need to be shut off until there are no avatars around.  You can accomplish this by removing the listen and stopping the script from sending out chat messages.  Then, in the no_sensor() event, when no avatars are detected, drop the current target and turn chat back on for a new negotiation.

    Particles can only have one target at a time.  Therefore, you can only target one attachment to another at any given time, unless your object has multiple prims, which can then be used to target multiple attachments using llLinkParticleSystem.  This opens up a whole new can of exponential worms.

    ---
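    For what it's worth, here's a bare-bones sketch of that lock/release flow.  The channel number, message format and ranges are placeholders I made up for illustration, not a finished script:

    integer chat_channel = -86654; // Placeholder negotiation channel.
    integer listen_handle;
    key current_target = NULL_KEY; // Our locked-in negotiation partner, if any.

    default
    {
        state_entry()
        {
            listen_handle = llListen(chat_channel, "", "", "");
            // Only advertise when avatars are nearby.
            llSensorRepeat("", "", AGENT_BY_LEGACY_NAME, 10.0, PI, 5.0);
        }

        listen(integer channel, string name, key id, string message)
        {
            if (current_target == NULL_KEY) // Negotiate only while unlocked.
            {
                current_target = id; // Lock in this partner...
                llListenRemove(listen_handle); // ...and shut off further negotiation.
            }
        }

        sensor(integer found)
        {
            // Avatars detected: advertise for a partner, but only while unlocked.
            if (current_target == NULL_KEY)
                llRegionSay(chat_channel, "negotiate@" + (string)llGetKey());
        }

        no_sensor()
        {
            if (current_target != NULL_KEY) // We had a partner; reset for next time.
            {
                current_target = NULL_KEY; // Drop the current target.
                listen_handle = llListen(chat_channel, "", "", ""); // Reopen negotiation.
            }
        }
    }

    You'd still need the master/slave or menu-selection layer on top of this, but the toggle idea is the same as in the scripts further down.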

    Maybe someone else will have some input.  My brain doesn't want to work today.  I'm running on one or two cylinders. lol

    Sounds like you have your work cut out for you.  Only you know what you're trying to design and what the parameters are.  I tried to help as best I could with the descriptions you gave, as you gave them, but there is no clear picture as to exactly what you're aiming for with this project.  Each of your posts has been a little more revealing, but it's not enough to form a definite direction for what you want.  I'm glad to help with the process, but I won't write the project for you, that's not what this forum is for. 

    If you have further questions and need help, post your code and we'll do what we can.  Please make sure you're being completely descriptive in the functionality you're looking for.

  9. This IF statement passes touches from your child prims but filters out touches on your root prim.  Simply remove the IF statement from your script (don't forget its closing bracket) and the root will process touches as well.

     

        if (!(llDetectedLinkNumber(0) == llGetLinkNumber())) // We can use a condition like this to filter from which prims touches are triggered.
        {

     

    Your root script would look like this...

     

    string animation;
    string CONTROLLER_ID = "A"; // See comments at end regarding CONTROLLERS.
    integer AUTO_START = TRUE;  // Optionally FALSE only if using CONTROLLERS.
    list particle_parameters = []; // Stores your custom particle effect, defined below.
    list target_parameters = []; // Remembers targets found using TARGET TEMPLATE scripts.
    integer told;
    integer once_is_too_often;

    default
    {
        state_entry()
        {
            told = FALSE;
        }

        touch_start(integer total_number)
        {
            particle_parameters = [ // Start of particle settings.
                // Texture Parameters:
                PSYS_SRC_TEXTURE, llGetInventoryName(INVENTORY_TEXTURE, 0),
                PSYS_PART_START_SCALE, <0.04, 0.3, FALSE>, PSYS_PART_END_SCALE, <0.12, 0.05, FALSE>,
                PSYS_PART_START_COLOR, <0.6, 0.6, 0.6>, PSYS_PART_END_COLOR, <0.3, 0.4, 0.6>,
                PSYS_PART_START_ALPHA, (float)0.75, PSYS_PART_END_ALPHA, (float)0.50,

                // Production Parameters:
                PSYS_SRC_BURST_PART_COUNT, (integer)5,
                PSYS_SRC_BURST_RATE, (float)0.01,
                PSYS_PART_MAX_AGE, (float)1.0,
                PSYS_SRC_MAX_AGE, (float)0.0,

                // Placement Parameters:
                PSYS_SRC_PATTERN, (integer)8, // 1=DROP, 2=EXPLODE, 4=ANGLE, 8=ANGLE_CONE

                // Placement Parameters (for any non-DROP pattern):
                PSYS_SRC_BURST_SPEED_MIN, (float)0.3, PSYS_SRC_BURST_SPEED_MAX, (float)0.9,
                PSYS_SRC_BURST_RADIUS, 0.1,

                // Placement Parameters (only for ANGLE & CONE patterns):
                PSYS_SRC_ANGLE_BEGIN, (float)0.08*PI, PSYS_SRC_ANGLE_END, (float)0.002*PI,
                // PSYS_SRC_OMEGA, <0,0,0>,

                // After-Effect & Influence Parameters:
                PSYS_SRC_ACCEL, <0.0, 0.0, -2.0>,
                // PSYS_SRC_TARGET_KEY, llGetLinkKey(llGetLinkNumber() + 1),

                PSYS_PART_FLAGS, (integer)( 0
                    // Texture Options:
                    | PSYS_PART_INTERP_COLOR_MASK
                    | PSYS_PART_INTERP_SCALE_MASK
                    | PSYS_PART_EMISSIVE_MASK
                    | PSYS_PART_FOLLOW_VELOCITY_MASK
                    // After-effect & Influence Options:
                    // | PSYS_PART_WIND_MASK
                    // | PSYS_PART_BOUNCE_MASK
                    // | PSYS_PART_FOLLOW_SRC_MASK
                    // | PSYS_PART_TARGET_POS_MASK
                    // | PSYS_PART_TARGET_LINEAR_MASK
                )
            ]; // End of particle settings.

            if (AUTO_START) llParticleSystem(particle_parameters);
            if (!once_is_too_often)
            {
                llRequestPermissions(llDetectedKey(0), PERMISSION_TRIGGER_ANIMATION);
            }
        }

        run_time_permissions(integer perm)
        {
            if (perm & PERMISSION_TRIGGER_ANIMATION)
            {
                animation = llGetInventoryName(INVENTORY_ANIMATION, 0);
                llStartAnimation(animation);
                llOwnerSay("animation will end in 5 seconds");
                llSetTimerEvent(5.0);
                told = TRUE;
            }
        }

        timer()
        {
            llParticleSystem([]);
            llStopAnimation(animation);
            llSetTimerEvent(0.0);
        }
    }

     

    If you want your root touch to do something different, keep the script you have for the root and add in an else statement after the IF I showed you at the top, then put your code there.  Like this:

     

    string animation;
    string CONTROLLER_ID = "A"; // See comments at end regarding CONTROLLERS.
    integer AUTO_START = TRUE;  // Optionally FALSE only if using CONTROLLERS.
    list particle_parameters = []; // Stores your custom particle effect, defined below.
    list target_parameters = []; // Remembers targets found using TARGET TEMPLATE scripts.
    integer told;
    integer once_is_too_often;

    default
    {
        state_entry()
        {
            told = FALSE;
        }

        touch_start(integer total_number)
        {
            if (!(llDetectedLinkNumber(0) == llGetLinkNumber())) // Filter which prims trigger touches.
            {
                particle_parameters = [ // Start of particle settings.
                    // Texture Parameters:
                    PSYS_SRC_TEXTURE, llGetInventoryName(INVENTORY_TEXTURE, 0),
                    PSYS_PART_START_SCALE, <0.04, 0.3, FALSE>, PSYS_PART_END_SCALE, <0.12, 0.05, FALSE>,
                    PSYS_PART_START_COLOR, <0.6, 0.6, 0.6>, PSYS_PART_END_COLOR, <0.3, 0.4, 0.6>,
                    PSYS_PART_START_ALPHA, (float)0.75, PSYS_PART_END_ALPHA, (float)0.50,

                    // Production Parameters:
                    PSYS_SRC_BURST_PART_COUNT, (integer)5,
                    PSYS_SRC_BURST_RATE, (float)0.01,
                    PSYS_PART_MAX_AGE, (float)1.0,
                    PSYS_SRC_MAX_AGE, (float)0.0,

                    // Placement Parameters:
                    PSYS_SRC_PATTERN, (integer)8, // 1=DROP, 2=EXPLODE, 4=ANGLE, 8=ANGLE_CONE

                    // Placement Parameters (for any non-DROP pattern):
                    PSYS_SRC_BURST_SPEED_MIN, (float)0.3, PSYS_SRC_BURST_SPEED_MAX, (float)0.9,
                    PSYS_SRC_BURST_RADIUS, 0.1,

                    // Placement Parameters (only for ANGLE & CONE patterns):
                    PSYS_SRC_ANGLE_BEGIN, (float)0.08*PI, PSYS_SRC_ANGLE_END, (float)0.002*PI,
                    // PSYS_SRC_OMEGA, <0,0,0>,

                    // After-Effect & Influence Parameters:
                    PSYS_SRC_ACCEL, <0.0, 0.0, -2.0>,
                    // PSYS_SRC_TARGET_KEY, llGetLinkKey(llGetLinkNumber() + 1),

                    PSYS_PART_FLAGS, (integer)( 0
                        // Texture Options:
                        | PSYS_PART_INTERP_COLOR_MASK
                        | PSYS_PART_INTERP_SCALE_MASK
                        | PSYS_PART_EMISSIVE_MASK
                        | PSYS_PART_FOLLOW_VELOCITY_MASK
                        // After-effect & Influence Options:
                        // | PSYS_PART_WIND_MASK
                        // | PSYS_PART_BOUNCE_MASK
                        // | PSYS_PART_FOLLOW_SRC_MASK
                        // | PSYS_PART_TARGET_POS_MASK
                        // | PSYS_PART_TARGET_LINEAR_MASK
                    )
                ]; // End of particle settings.

                if (AUTO_START) llParticleSystem(particle_parameters);
                if (!once_is_too_often)
                {
                    llRequestPermissions(llDetectedKey(0), PERMISSION_TRIGGER_ANIMATION);
                }
            }
            else // Process root differently.
            {
                // Code to process root.
            }
        }

        run_time_permissions(integer perm)
        {
            if (perm & PERMISSION_TRIGGER_ANIMATION)
            {
                animation = llGetInventoryName(INVENTORY_ANIMATION, 0);
                llStartAnimation(animation);
                llOwnerSay("animation will end in 5 seconds");
                llSetTimerEvent(5.0);
                told = TRUE;
            }
        }

        timer()
        {
            llParticleSystem([]);
            llStopAnimation(animation);
            llSetTimerEvent(0.0);
        }
    }

     

     

     

  10. I know you said you have an exception on your firewall for SLVoice, but just to make sure the exception isn't configured wrong, try turning off your firewall and anti-virus software and see if SLVoice works.  That's the last thing I know to check if your headset works with everything else.  Make sure Windows Firewall is off too.  If this doesn't fix the problem, I'm out of ideas; you've tried about everything.

    I just thought of something.  Does your internet modem have a built-in router?  It's possible it could be blocking the port(s) involved.

  11. Link order goes in reverse.  The first prim you select is the last link in the linkset.  The last prim you select becomes the root.  If you mix and match, meaning you link unlinked prims to linked prims or unlink them all and select/deselect, the link order changes and your root prim will most likely change.  You should always be aware of what is linked and what isn't.  Reordering to get a new root prim is as simple as unlinking, holding shift, deselecting the prim you want to be the root, selecting it again, then linking the prims together.  Voilà.

    Also, in Firestorm, the link number display in the edit window (if using Edit Linked Parts) can potentially display the wrong link number for individual prims in a linkset; not sure if it's a bug or what.  If you need to know a specific prim's link number, it's a good idea to use a self-deleting script that reports that prim's link number.  Here's the script I wrote; I use it all the time.  A value of 0 means the prim isn't linked; 1 through x is the prim's link number in the linkset.

     

    // Drop me in a linked prim to get its link number.
    // Script auto deletes after one second.

    default
    {
        state_entry()
        {
            llSay(0, "Link #: " + (string)llGetLinkNumber()); // Say the link # in local chat.
            llSetTimerEvent(1.0); // Give the script a short pause before deleting itself.
        }

        timer()
        {
            llRemoveInventory(llGetScriptName()); // Delete the script.
        }
    }

     

  12. You're welcome.

    Your description didn't specify two attached prims, so the first set was for two rezzed prims, then you said one attached, so I changed it to one attached.  lol  Had I known you wanted two scripted attachments, one the emitter and one the receiver, I would've written the scripts differently.  As it is, everything you need to achieve what you want is within these scripts.  You might have to mod them a bit, but the code is there. 

    I've done most of the work.  I'm leaving it in your hands.

  13. An example of how to do it.

    The emitter_name and object_name strings should be unique and set identically in both scripts or it won't work.

    The emitter object names itself "My Emitter" as per the emitter_name string.  The emitter listens for a chat message from an object named "My Object" (as set in object_name), containing the text object_name+"@".  If that test passes, anything after @ is assumed to be a key.  The key is tested first in case the message doesn't contain a valid key.  If valid, the key is used to point particles to the attached object.

     

    // This goes in your emitter object.

    string emitter_name = "My Emitter"; // Name of emitter object to detect.
    string object_name = "My Object"; // The name of the attached object.
    float sensor_interval = 5.0; // Sensor detects objects every x seconds.
    float sensor_range = 10.0; // Range in meters.
    integer chat_channel = -86654; // Our script-to-script chat channel.

    arrow(key id)
    {
        llParticleSystem([
            PSYS_PART_FLAGS, PSYS_PART_EMISSIVE_MASK | PSYS_PART_FOLLOW_SRC_MASK | PSYS_PART_FOLLOW_VELOCITY_MASK | PSYS_PART_INTERP_COLOR_MASK | PSYS_PART_INTERP_SCALE_MASK | PSYS_PART_TARGET_LINEAR_MASK | PSYS_PART_TARGET_POS_MASK,
            PSYS_SRC_PATTERN, PSYS_SRC_PATTERN_ANGLE_CONE,
            PSYS_PART_START_ALPHA, 1.000000,
            PSYS_PART_END_ALPHA, 1.000000,
            PSYS_PART_START_COLOR, <1.000000, 1.000000, 1.000000>,
            PSYS_PART_END_COLOR, <1.000000, 1.000000, 1.000000>,
            PSYS_PART_START_SCALE, <0.250000, 0.250000, 0.000000>,
            PSYS_PART_END_SCALE, <0.250000, 0.250000, 0.000000>,
            PSYS_PART_MAX_AGE, 10.000000,
            PSYS_SRC_MAX_AGE, 0.000000,
            PSYS_SRC_ACCEL, <0.000000, 0.000000, 0.000000>,
            PSYS_SRC_ANGLE_BEGIN, 0.000000,
            PSYS_SRC_ANGLE_END, 0.000000,
            PSYS_SRC_BURST_PART_COUNT, 1,
            PSYS_SRC_BURST_RATE, 0.750000,
            PSYS_SRC_BURST_RADIUS, 0.000000,
            PSYS_SRC_BURST_SPEED_MIN, 0.500000,
            PSYS_SRC_BURST_SPEED_MAX, 0.500000,
            PSYS_SRC_OMEGA, <0.000000, 0.000000, 0.000000>,
            PSYS_SRC_TARGET_KEY, id,
            PSYS_SRC_TEXTURE, "c0ceae16-cad3-be86-ea1f-66b8c52a595a"]);
    }

    default
    {
        state_entry()
        {
            llSetObjectName(emitter_name); // Ensure the object is named correctly.
            llParticleSystem([]); // Remove particles if they exist.
            // Start a listen for messages from "My Object".
            llListen(chat_channel, object_name, "", "");
        }

        listen(integer channel, string name, key id, string message)
        {
            // Look for object_name+"@" in the message. If it's absent, it's not one we want.
            // @ is a separator between the expected received phrase and the key.
            // (See the attachment script.)
            if (llSubStringIndex(message, object_name+"@") > -1)
            {
                // Pull our key from the position after @ to the end of the string.
                string target_key = llGetSubString(message, llSubStringIndex(message, "@")+1, -1);
                if ((key)target_key) // TRUE if it's a valid key.
                {
                    arrow(target_key); // Set our particle target; turn on particles.
                }
            }
        }
    }

     

    Attached object script:

    Note that the sensor is hard set to look for the string emitter_name ("My Emitter").  This limits the scope of the sensor for maximum efficiency.  The sensing interval is set to every 5 seconds.  If "My Emitter" is detected within 10 meters, it sends a chat message.  The chat message is hard set to object_name+"@"+(string)llGetKey().    The object MUST BE ATTACHED TO AN AVATAR for the sensor to activate and work.

     

    // This script goes in your attachment object.

    string emitter_name = "My Emitter"; // Name of emitter object to detect.
    string object_name = "My Object"; // The name of the attached object.
    float sensor_interval = 5.0; // Sensor detects objects every x seconds.
    float sensor_range = 10.0; // Range in meters.
    integer chat_channel = -86654; // Chat channel for script-to-script communication.
    integer toggle; // Our on/off switch for the chat message.

    default
    {
        state_entry()
        {
            llSetObjectName(object_name); // Ensure our object is named correctly.
        }

        attach(key avatar)
        {
            if (avatar) // Attached; start sensor.
            {
                llSensorRemove(); // Clear any sensor left over from the last detach.
                toggle = FALSE; // Flip our switch off to enable messaging again.

                // Restart sensor.
                llSensorRepeat(emitter_name, "", PASSIVE | SCRIPTED, sensor_range, PI, sensor_interval);
            }
        }

        sensor(integer objects_found)
        {
            // If this event is triggered, we found an object named "My Emitter".
            // Send the trigger message to receive an arrow pointing to the object.

            if (toggle == FALSE) // Keeps from having repeated chat messages.
            {
                // Send our chat message to enable the pointer arrow.
                llRegionSay(chat_channel, object_name+"@"+(string)llGetKey());
                toggle = TRUE; // Message was sent. Flip our switch on.
            }
        }

        no_sensor() // We've left the area; My Emitter is no longer detected.
        {
            if (toggle == TRUE) toggle = FALSE; // Flip switch off, in case we find My Emitter again.
        }
    }

     

    NOTES:  If there's no way to get around having a constant listen or sensor in a script, limit its scope as much as possible for efficiency.  In these examples we're only listening for "My Object" to send a message, and our sensor will only detect an object named "My Emitter".  This way we're only getting valid messages and only detecting a valid object.  We further limit script run time by keeping the scripts as idle as possible: a slower detect interval of 5.0 seconds, and an integer switch so we don't process any code unnecessarily.

  14. What are you using as a device to communicate?  Is it a USB headset?  Is it a mic/headset plugged into your mic jack on your sound card?

    Without loading Firestorm, go into the Control Panel and Sound.  Under the Recording tab, see if your device is checked as the active device.  If not, set it as default.  Then click Configure and go into the settings.  Ensure your mic is turned up.  Go through the microphone setup and ensure your mic is working properly.  If you do all these things and can't get sound out of it, it could either be a bad headset/mic or you could have a driver issue with your device.  You could try reinstalling drivers from the manufacturer to see if that fixes it.  Some devices come with their own set of controls.  If yours does, run the software and check the settings for the mic.  If you get the mic running in Windows, but not Firestorm, edit the settings via Preferences - Sound & Media - click the Audio Device Settings button - use the dropdowns to set up your input/output devices.  If they are grayed out, your device isn't being detected and it could be a hardware/software issue.

    If all of this fails, borrow a mic or headset from someone just to try.  OR, buy a cheap mic or headset from Wal-mart or even Goodwill (although there's a chance a Goodwill device will be bad) and try it.

    This should help narrow down your problem.  It sounds like hardware or drivers to me.  Either the headset died, the drivers got messed up, or the sound card mic port died (if that's what you're using).  In the case of a bad sound card mic port, just get a USB headset; problem solved.

  15. Always glad to help.

    If furniture is what you're buying and you want to learn how to set up your own, it's hard to beat zEd's Multi-Pose System.  The initial investment might seem like a lot, but it is a powerful feature-rich system for setting up furniture.  It'll meet all of your needs, from the smallest project to the largest.  As long as the items are modifiable, you could delete the existing scripts (which are no modify) and create your own sits with animations using such a tool.

  16. It sounds like you've already proven the pergola is the issue.

    The problem sounds like it's the physics shape of the pergola affecting your ability to interact with the couch.  Sometimes creators don't take this into account and you end up with a problem where objects inside or behind cannot be interacted with.  A bad physics shape which, for instance, covers the entrance of a pergola like an invisible box, will cause this issue.

     

    1. Press CTRL-3 to bring up the edit window and select the pergola. 
    2. If it has an all-yellow halo and is modifiable, you can link a dummy prim as the root and then set the pergola physics shape type to NONE in the Features tab of the edit window.  With a dummy root, no part of the pergola is the root, so the physics shape type of the whole pergola can be changed.  A root prim's physics shape type cannot be changed to NONE, but children's can.  Setting the pergola to NONE negates its physics shape, which is most likely causing the problem.

     

    • Variation 1:  If the pergola is made up of more than one prim, the root will have a yellow halo and the child prims will have blue halos.  If the root prim isn't obstructing your couch, you may be able to fix the problem by setting all child prims to physics shape type NONE.  This negates their physics shapes, which should let you walk through the pergola and hopefully click through it as well.  A small window will pop up saying you can't change the root prim to NONE.  This is normal; the root prim keeps its physics shape type and the rest are set to NONE.
    • Variation 2:  Check the Edit Linked Parts checkbox in the edit window, select individual prims in the pergola object and change their physics shape type to NONE (the surgical approach).  Hold SHIFT to select multiple prims in the linkset if you want to change several at once, then change the setting in the Features tab.
    • Side note - To see physics shapes (Firestorm), enable Advanced menu with CTRL-ALT-SHIFT-D, then go into that menu and enable Develop menu.  Under the Develop menu you'll find Render Metadata.  Under that you'll find Physics Shapes.  Turn this on and you will see if the physics shape conforms to the object or is boxing in what should be open areas.

    Worst case scenario, if none of this works, contact the creator of the pergola and have them fix their product.
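    The same change can also be scripted instead of clicked through the edit window.  Below is a minimal sketch, assuming the pergola is modifiable and you're able to drop a script into it; it sets every child prim's physics shape type to NONE and leaves the root alone (the root can't be set to NONE anyway):

```lsl
// ASSUMPTION: this script sits inside the modifiable pergola. Touch it to run.
// Sets every child prim's physics shape type to NONE; the root keeps its shape.
default
{
    touch_start(integer total_number)
    {
        integer link;
        integer prims = llGetNumberOfPrims();
        for (link = 2; link <= prims; link++) // link 1 is the root; children start at 2
        {
            llSetLinkPrimitiveParamsFast(link, [PRIM_PHYSICS_SHAPE_TYPE, PRIM_PHYSICS_SHAPE_NONE]);
        }
        llOwnerSay("Child prim physics shapes set to NONE.");
        llRemoveInventory(llGetScriptName()); // optional: the script deletes itself when done
    }
}
```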

    1. Modify your arrow() function to allow you to pass it a key.  Example:  arrow(key id) instead of arrow(), then use "id" in your target setting in the llParticleSystem statement:  PSYS_SRC_TARGET_KEY, id,
    2. You'll need to turn on PSYS_PART_TARGET_LINEAR_MASK or PSYS_PART_TARGET_POS_MASK in your arrow function's llParticleSystem statement, depending on whether you want a direct route or a gradual route to your target.
    3. Then you need to know the target's key.  You can use a sensor to acquire nearby objects within a range, using llSensor for a single detect or llSensorRepeat for repeated detections (you can also use llSensor with a timer for more control).  You can also use the name setting in your llSensor/llSensorRepeat statement to sense a particular object's name, which allows you to pull the key for only that object.
    4. In the sensor event, use a loop to cycle through the detected objects, compare llDetectedName(x) to your desired target, and point the particles at the new target by calling arrow(llDetectedKey(x)).

     

    Below is a very simple example of how it's done.  This only runs the sensor once, searching for passive objects (non-scripted, not active or physical; doing nothing, being nothing), to detect the name "My Object" within a 10m range, 360 degrees (PI) around the object.  If not detected it lets you know with a whisper.  Note the llSensor event specifies the name we're looking for, which narrows the sensor down to that particular object name.  All other objects are ignored.

     

    string name = "My Object";

    arrow(key id)
    {
        llParticleSystem([
            PSYS_PART_FLAGS, PSYS_PART_EMISSIVE_MASK | PSYS_PART_FOLLOW_SRC_MASK | PSYS_PART_FOLLOW_VELOCITY_MASK | PSYS_PART_INTERP_COLOR_MASK | PSYS_PART_INTERP_SCALE_MASK | PSYS_PART_TARGET_LINEAR_MASK | PSYS_PART_TARGET_POS_MASK,
            PSYS_SRC_PATTERN, PSYS_SRC_PATTERN_ANGLE_CONE,
            PSYS_PART_START_ALPHA, 1.000000,
            PSYS_PART_END_ALPHA, 1.000000,
            PSYS_PART_START_COLOR, <1.000000, 1.000000, 1.000000>,
            PSYS_PART_END_COLOR, <1.000000, 1.000000, 1.000000>,
            PSYS_PART_START_SCALE, <0.250000, 0.250000, 0.000000>,
            PSYS_PART_END_SCALE, <0.250000, 0.250000, 0.000000>,
            PSYS_PART_MAX_AGE, 10.000000,
            PSYS_SRC_MAX_AGE, 0.000000,
            PSYS_SRC_ACCEL, <0.000000, 0.000000, 0.000000>,
            PSYS_SRC_ANGLE_BEGIN, 0.000000,
            PSYS_SRC_ANGLE_END, 0.000000,
            PSYS_SRC_BURST_PART_COUNT, 1,
            PSYS_SRC_BURST_RATE, 0.750000,
            PSYS_SRC_BURST_RADIUS, 0.000000,
            PSYS_SRC_BURST_SPEED_MIN, 0.500000,
            PSYS_SRC_BURST_SPEED_MAX, 0.500000,
            PSYS_SRC_OMEGA, <0.000000, 0.000000, 0.000000>,
            PSYS_SRC_TARGET_KEY, id,
            PSYS_SRC_TEXTURE, "c0ceae16-cad3-be86-ea1f-66b8c52a595a"]);
    }

    default
    {
        state_entry()
        {
            llSensor(name, "", PASSIVE, 10.0, PI);
        }

        sensor(integer objects_found)
        {
            if (objects_found) arrow(llDetectedKey(0));
        }

        no_sensor()
        {
            llWhisper(0, "'" + name + "' not found.");
        }
    }

     

    Another example using llSensorRepeat instead and detecting all passive objects, then using a for loop in the sensor event to search for the name "My Object" in the results:

     

    string name = "My Object"; // Name of object to detect
    float sensor_interval = 5.0; // Sensor detects objects every x seconds
    float sensor_range = 10.0; // Range in meters

    arrow(key id)
    {
        llParticleSystem([
            PSYS_PART_FLAGS, PSYS_PART_EMISSIVE_MASK | PSYS_PART_FOLLOW_SRC_MASK |
                PSYS_PART_FOLLOW_VELOCITY_MASK | PSYS_PART_INTERP_COLOR_MASK |
                PSYS_PART_INTERP_SCALE_MASK | PSYS_PART_TARGET_LINEAR_MASK | PSYS_PART_TARGET_POS_MASK,
            PSYS_SRC_PATTERN, PSYS_SRC_PATTERN_ANGLE_CONE,
            PSYS_PART_START_ALPHA, 1.000000,
            PSYS_PART_END_ALPHA, 1.000000,
            PSYS_PART_START_COLOR, <1.000000, 1.000000, 1.000000>,
            PSYS_PART_END_COLOR, <1.000000, 1.000000, 1.000000>,
            PSYS_PART_START_SCALE, <0.250000, 0.250000, 0.000000>,
            PSYS_PART_END_SCALE, <0.250000, 0.250000, 0.000000>,
            PSYS_PART_MAX_AGE, 10.000000,
            PSYS_SRC_MAX_AGE, 0.000000,
            PSYS_SRC_ACCEL, <0.000000, 0.000000, 0.000000>,
            PSYS_SRC_ANGLE_BEGIN, 0.000000,
            PSYS_SRC_ANGLE_END, 0.000000,
            PSYS_SRC_BURST_PART_COUNT, 1,
            PSYS_SRC_BURST_RATE, 0.750000,
            PSYS_SRC_BURST_RADIUS, 0.000000,
            PSYS_SRC_BURST_SPEED_MIN, 0.500000,
            PSYS_SRC_BURST_SPEED_MAX, 0.500000,
            PSYS_SRC_OMEGA, <0.000000, 0.000000, 0.000000>,
            PSYS_SRC_TARGET_KEY, id,
            PSYS_SRC_TEXTURE, "c0ceae16-cad3-be86-ea1f-66b8c52a595a"]);
    }

    default
    {
        state_entry()
        {
            llParticleSystem([]); // Remove particles if they exist

            // Depending on your object type, you may use ACTIVE and/or SCRIPTED instead of PASSIVE.
            // In this case we're detecting PASSIVE, which are non-scripted objects that are inactive
            // and non-physical. PI (3.14; 360 degrees) is the arc of detection.
            llSensorRepeat("", "", PASSIVE, sensor_range, PI, sensor_interval);
        }

        sensor(integer objects_found) // objects_found = number of sensed objects returned.
        {
            // Sensor returns a list of detected objects from 0 to objects_found-1 (max 16; 0-15).
            // We can reference information about the detected objects via the llDetected* functions.
            // In this example we use llDetectedName(number) to reference the name of
            // one of the objects.

            integer loop_counter; // Set up an integer to count from 0 to objects_found-1.

            // Loop through each of the detected objects in sequential order from 0 to objects_found-1.
            // Use llDetectedName(loop_counter) to compare the current detected object against name.
            for (loop_counter = 0; loop_counter < objects_found; loop_counter++)
            {
                // Does this particular object's name match our name?
                // Use llToLower on both sides to make the comparison case-insensitive.
                if (llToLower(llDetectedName(loop_counter)) == llToLower(name))
                {
                    llWhisper(0, "'" + name + "' found.  Target set.");
                    arrow(llDetectedKey(loop_counter)); // Set our particle target; turn on particles.
                    llSensorRemove(); // Stop the sensor; also discards all sensor results.
                    return; // Exit the sensor event and go idle.
                }
            }

            // Not found after checking all sensed objects.
            // Note that this whisper repeats until the object is found, for testing purposes.
            llWhisper(0, "'" + name + "' not found.  Continuing to search.");
            llSensorRepeat("", "", PASSIVE, sensor_range, PI, sensor_interval); // Restart sensor
        }

        no_sensor()
        {
            // Unless you're in a place where no objects exist (per
            // your specified type), this event will never be triggered.
        }
    }

     

    These examples should get you started.  You can just as easily set the script to sense avatars instead, using AGENT_BY_LEGACY_NAME as the type, for instance.  See llSensor and llSensorRepeat in the wiki for more info (links to those at the top of this page).
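    To illustrate the avatar variant, here's a minimal, self-contained sketch.  It just whispers the detected key rather than reusing the arrow() function from the examples above, and "Firstname Lastname" is a placeholder you'd replace with the target's actual legacy name:

```lsl
string avatar_name = "Firstname Lastname"; // PLACEHOLDER: the target avatar's legacy name

default
{
    state_entry()
    {
        // Scan every 5 seconds, 20m range, full arc around the object.
        llSensorRepeat(avatar_name, "", AGENT_BY_LEGACY_NAME, 20.0, PI, 5.0);
    }

    sensor(integer num)
    {
        // Pass llDetectedKey(0) to your arrow() function here; this sketch just reports it.
        llWhisper(0, "Found avatar with key " + (string)llDetectedKey(0));
        llSensorRemove(); // Stop scanning once found.
    }
}
```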

     

    NOTE:  Each prim has its own particle emitter and can only have one target.  If the object being targeted no longer exists, the particle emission stops.  Sensors only return the closest 16 objects.
