Posts posted by animats

  1. 39 minutes ago, SkyIntruder said:

    I'm not satisfied with "pretty good". Only the maximum ... Only "Hardcore" ... The question was not how to set up to play (I have been playing for several years and I know all the subtleties of the settings), but how to make the game work at 100% of the computer's power. but thanks for the answer.

SL needs a new viewer to do that. The existing ones are mostly single-threaded and do not use modern GPUs fully.

  2. 55 minutes ago, Arduenn Schwartzman said:

    What do you mean 'ourselves'? You never had control over SL. You're only using it as a commodity.

The only real asset Linden Lab has is its users and creators. The viewer is open source and others have implemented viewers. The server has been duplicated as Open Simulator. The really smart people who developed the system are all gone. LL doesn't even own their servers any more.

    All LL has left is a solid base of reasonably happy users. It's worth reminding LL management of this now and then.

  3. I'm not quite seeing it either.

Projectors normally track the motion of the object to which they are attached quite well. Most vehicle headlights are projectors, and they reliably point where the vehicle is pointed.

llTargetOmega is for spinning things where you're not that concerned about exact position. Use it for propellers, wheels, and decorative spinning objects. It's usually all viewer side and doesn't load down the sim.

    If you need more controlled rotation, I'd suggest using keyframe animation. That's a combination of server side and viewer side, and will get you smooth motion.

    A script frantically updating angles many times per second is the least efficient approach, but will work.
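
    Here's a minimal sketch of those two approaches, assuming a simple one-prim object; the axis, rate, and timing values are placeholders, not recommendations:

      // Sketch: the two rotation approaches mentioned above.
      // You would normally pick one of these, not both.
      default
      {
          touch_start(integer n)
          {
              // Viewer-side spin: cheap, good for propellers and decorations.
              // Spin around the local Z axis at about 1 radian per second.
              llTargetOmega(<0.0, 0.0, 1.0>, 1.0, 1.0);

              // Server-side keyframed motion: smooth, controlled rotation.
              // Rotate the object 90 degrees around Z every 2 seconds, looping.
              // The object must be non-physical; keyframes are deltas, not absolutes.
              rotation quarterTurn = llEuler2Rot(<0.0, 0.0, PI_BY_TWO>);
              llSetKeyframedMotion(
                  [ZERO_VECTOR, quarterTurn, 2.0],
                  [KFM_MODE, KFM_LOOP]);
          }
      }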

    By the way, an SL light has four parts:

    1. The projection or emission of light from the object. This lights other objects, but not the light source.
    2. The color of the face emitting the light. This is often just full bright.
    3. Glow. A little glow around a light looks good, but don't overdo it. About 0.02 is good for most lights.
    4. The visible light cone. That's a separate object, usually translucent and full bright. Textures can give it a nice falloff effect.

    Each of those has to be set up separately. But if all are attached to the same object, they should stay in sync.
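
    For the script-settable parts, here's a minimal sketch assuming a single light prim; the color, intensity, radius, falloff, and glow values are placeholders:

      // Sketch: light emission, full bright, and glow on one prim.
      default
      {
          state_entry()
          {
              llSetLinkPrimitiveParamsFast(LINK_THIS, [
                  // 1. Light emission: lights nearby objects, not the source itself.
                  PRIM_POINT_LIGHT, TRUE, <1.0, 0.9, 0.7>, 1.0, 10.0, 0.5,
                  // 2. Full bright on the emitting face so it looks lit.
                  PRIM_FULLBRIGHT, ALL_SIDES, TRUE,
                  // 3. A little glow; about 0.02 is enough for most lights.
                  PRIM_GLOW, ALL_SIDES, 0.02]);
              // 4. The visible light cone is a separate translucent, full-bright
              //    prim linked to the same object, so everything stays in sync.
          }
      }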

[Image: live traffic camera view of Shibuya Crossing displayed on a prim]

    Streaming video in SL! Live traffic camera at Shibuya Crossing in Tokyo. This is just a plain prim where I set the media URL for one face.

For YouTube streams, you want the URL in the "embed" code. That's just the video, not the whole webpage. This one is https://www.youtube.com/embed/lkIJYc4UH60.
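
    If you want a script to do the setup instead of the build tools, here's a minimal sketch using the embed URL above; face 0 is an assumption, so pick whichever face should show the video:

      // Sketch: put the YouTube embed stream on one face of a prim.
      default
      {
          state_entry()
          {
              llSetPrimMediaParams(0, [    // face 0 is an assumption
                  PRIM_MEDIA_CURRENT_URL, "https://www.youtube.com/embed/lkIJYc4UH60",
                  PRIM_MEDIA_HOME_URL,    "https://www.youtube.com/embed/lkIJYc4UH60",
                  PRIM_MEDIA_AUTO_PLAY,   TRUE]);
          }
      }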

    Lots of potential here. Clubs, performances, etc. If you have your own streaming server you can set things up so that everyone sees the same thing.

  5. On 12/30/2020 at 3:50 AM, ItHadToComeToThis said:

    What is the best and most accurate method for increasing someone’s movement speed?

    Get on a vehicle.

    There's an F-16 on Marketplace, and it maxes out around 130m/sec. At one region crossing every 2 seconds, something usually breaks within a few minutes, but it does work. For road travel, it's quite possible to drive 30m/sec on Linden roads, and faster on long straightaways. You have to do some things to increase vehicle downforce, as in real life.
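
    As a sketch only, one common way to add downforce is to apply a constant downward force scaled to the vehicle's mass. This is a general approach, not a recipe from any particular vehicle, and the 0.5 multiplier is a placeholder to tune.

      // Sketch: extra downforce on a physical vehicle.
      // Only has an effect while the object is physical.
      default
      {
          state_entry()
          {
              float downforce = llGetMass() * 9.8 * 0.5;   // half a gravity of extra downforce
              llSetForce(<0.0, 0.0, -downforce>, FALSE);   // FALSE = region axes
          }
      }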

  6. Interesting. From those messages, the connection to the login URL, the first connection you make to SL at startup, timed out.

    When you can't get SL to connect, try to connect to "login.agni.lindenlab.com" with a browser. You should get "FORBIDDEN", because you're not sending a username and password. If that times out, it's not anything viewer-related.

     

  7. I think that Oz said recently you can now purchase land from Linden Lab directly again. It's not yet automated; you have to contact Support.

    There are essentially unlimited isolated sims available from LL now. That was part of the purpose of moving everything to Amazon Web Services - ease of adding more capacity. Before, LL was out of space in their data center in Arizona.

  8. 5 hours ago, Doc Carling said:

    And the die hard fan fraction in SL is willing to accept all technical issues as long as SL is up.

    To some extent, that's true. SL is doing well from the perspective of long-time SL users. But it would be laughed off Steam as too buggy. As Sansar was. This is a big problem in retaining new users, who see SL as broken.

    What's badly broken right now?

    • Group messaging. If you're going to be an online social network, messaging has to work reliably.
    • Region crossings, about which I've said much in the past and won't repeat myself.
    • Content loading stalls, seen in world as things taking way too long to appear. I suspect throttling by the Akamai content delivery network.
    • Voice is still unreliable.
    • EEP made the mainland world too dim, due to some bad defaults, and that hasn't been fixed yet.

    Those are the big ones, and they should be LL's priorities for Q1 2021. What else?

    Server uptime is pretty good, and was maintained through the transition to AWS. That's an achievement, considering how much state is being maintained server side.

  9. 3 hours ago, VirtualKitten said:

      if(euler.y>361) return_value = _fmod(360,euler.y); else return_value = euler.y;
        llOwnerSay("Adjusted for modulus:"+(string)return_value);
       
        if(llFabs(return_value) != return_value && return_value !=180) return_value = 360+return_value; else if((integer)return_value == 0 && angle == 0) return_value = 180; else return_value = llFabs(return_value) ;

    This is why everybody in graphics uses a homogeneous representation of rotations. No special cases at zero degrees. In Second Life, that's quaternions.

    The key idea here is that you can multiply LSL rotations to get a new rotation. So, compute the rotation quaternion for the change you want to make using llEulerToQuat, then multiply the object's old rotation (from llGetRotation or llGetLinkPrimitiveParams) by the quaternion for the change to get the object's new rotation. Then apply that to the object with llSetRotation or llSetLinkPrimitiveParams.
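
    A minimal sketch of that sequence, with a 10-degree increment around Z as an example value:

      // Sketch: rotate an object by multiplying rotations; no angle special cases.
      default
      {
          touch_start(integer n)
          {
              rotation delta = llEuler2Rot(<0.0, 0.0, 10.0 * DEG_TO_RAD>);
              llSetRot(llGetRot() * delta);   // old rotation times the change (region axes)
          }
      }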

    Ref: http://wiki.secondlife.com/wiki/Rotation

    This area is confusing but well documented.

    If you just want to spin something, see the functions which mention "Omega".

  10. LSL has a parser for JSON. If you can make your site emit JSON, that's easier to deal with than trying to parse HTML yourself in LSL. What you're trying to do is quite possible, but more work than necessary.

LSL also has llCSV2List for parsing comma-separated values.

Web pages don't have to be HTML. They can be text/plain or application/json, as well as text/html. There are lots of resources available about making web sites do what you want. In general, it's better to make a web site that talks easily to LSL than to make LSL talk to an existing web site. LSL has severe memory and compute limitations, and the language is not well suited to parsing.
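
    A minimal sketch of both parsers; the JSON shape and field names here are made up for illustration, and in practice the string would arrive in an http_response event:

      // Sketch: pull values out of a small JSON reply and a CSV string.
      default
      {
          state_entry()
          {
              string json = "{\"name\":\"Gateway Plaza\",\"visitors\":42}";   // made-up example data
              string name = llJsonGetValue(json, ["name"]);                    // "Gateway Plaza"
              integer visitors = (integer)llJsonGetValue(json, ["visitors"]);  // 42

              // Comma-separated values are even simpler.
              list fields = llCSV2List("red, green, blue");   // ["red", "green", "blue"]

              llOwnerSay(name + ": " + (string)visitors + " visitors, "
                  + (string)llGetListLength(fields) + " CSV fields.");
          }
      }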

  11. 3 hours ago, Drayke Newall said:

    Thought this to be interesting with regards to the metaverse. Virtual Property of the famous New York Stock Exchange Sells for US$23,000 in Upland Metaverse Auction

    Upland is a blockchain-based metaverse that is mapped to real-world addresses and is basically a literal virtual world monopoly game.

    Upland is strange. I suspect they will at some point run into trouble with the SEC. It's not a game, it's a Make Money Fast scheme. You can't do anything with your "parcel" except trade it.

There are several "blockchain"-based virtual worlds, notably Decentraland and Somnium Space. People trade land there for excessive amounts of money, but don't go in world much. The amount of land is severely restricted to keep the prices up.

    Both have browser clients, so you can visit those low-rez worlds easily. At least you can go in world there, unlike Upland.

12. I've had the same problem clothing my animesh NPCs. I had to have some clothing custom-made.

    Various animesh outfits

    From left to right,

    • Shoes and jacket custom-made for me as low-LI animesh clothing by Duck Girl. Hat is from Marketplace, just for the holiday season. Hair from Meli Imako's line of low-LI animesh hair. (She has some other items for animesh, too.) T-shirt and sweatpants are texture clothing. 50 LI, most of which is the hat.
    • Dress from Marketplace. Not a good item for animesh, because it has a cheat where the lower levels of detail are blank. Animesh, unlike avatars, use levels of detail the same way regular objects do, so you will see clothing at low LODs. This dress disappears at distance, but the character model, which has reasonable lower LODs, does not. 48 LI, most of which is the dress.
    • T-shirt and sweat pants are texture clothing. Shoes are the same as at the left. 33 LI.
    • An entire animesh character from Marketplace. Not dressable. Probably a minor character from some game. 24 LI.

    The character model for the left three is from Uno Blokke, custom-modified for me to have reasonable lower LODs. That model has SL-standard UVs and rigging, so it can wear some classic avatar clothing.

    Animesh don't have Bakes on Mesh support, and there's no "wear" command. You can't just use system clothing. This model has a "head" part, an "upper" part, and a "lower" part, for which you have to assemble a texture from a skin texture layer plus clothing layers using Photoshop or GIMP. I have some standard templates for simple clothing items, and made up the tops and bottoms you see here. 

    Dressing animesh with rigged mesh is done by linking the mesh clothing object to the animesh character. Then it will snap into place. Most mesh clothing has huge LI when rezzed in world, and usually has terrible lower levels of detail. Just using some standard mesh clothing will result in LIs from 130 to "parcel full".

    There are also many animesh characters with simple animations on Marketplace. Too many are ripped from either Star [Wars|Gate|Trek|Craft] or the Marvel Overextended Universe. Those are not re-dressable with standard SL clothing.

    So you can definitely get a pantsuit onto your AI character, but it's not easy. See if you can find some pantsuit you like that fits standard avatars, and if the LI is too high, work with the creator to get it down.

     

     

  13. Monitors are getting wider.

[Image: 49-inch curved ultrawide monitor]

    49 inch curved monitor. The price on these is coming down. Many are now below US$1000, with the cheapest below US$750. They're essentially half of a 4K TV display, and as those have gone mainstream, making wider monitors has become cheaper.

    Now if you had three of these in a semicircle...

[Image: wraparound multi-monitor setup]

    Who needs a VR headset?

  14. 2 hours ago, FairreLilette said:

    There would be no way for the viewer to know other than, well...you know, private parts the avatar is wearing perhaps.  But, imagine the viewer scanning for what private parts the avatar is wearing.  I think it's safe to say 'not gonna happen'.

Any object can get a list of the attachments of any nearby avatar. I have a pose stand that does this, as an aid to helping others with clothing problems. The avatar doesn't need to sit or touch anything. Any desired restrictions could be built into a security orb.
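
    A minimal sketch of that kind of attachment scan, assuming a simple repeating sensor; the 20 m range and 60-second interval are placeholders:

      // Sketch: list the visible attachments of nearby avatars.
      default
      {
          state_entry()
          {
              llSensorRepeat("", NULL_KEY, AGENT, 20.0, PI, 60.0);   // scan every 60 seconds
          }

          sensor(integer n)
          {
              integer i;
              for (i = 0; i < n; ++i)
              {
                  key av = llDetectedKey(i);
                  list attachments = llGetAttachedList(av);   // HUDs are not included
                  integer j;
                  integer count = llGetListLength(attachments);
                  for (j = 0; j < count; ++j)
                  {
                      key item = llList2Key(attachments, j);
                      string itemName = llList2String(
                          llGetObjectDetails(item, [OBJECT_NAME]), 0);
                      llOwnerSay(llDetectedName(i) + " wears: " + itemName);
                  }
              }
          }
      }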
