
Polymath Snuggler


Everything posted by Polymath Snuggler

  1. I don't know if you've actually looked at what Sansar IS. Based on your remarks I'm just going to assume that you haven't ~ because it's got almost nothing to do with virtual living in the manner that SL provides. Sansar is a virtual reality hosting platform: a service for producing virtual environments and sharing them with other people via the web. What it's not designed to be is a virtual living space and social network. So unless you're in the habit of creating art installations or doing online education, it's really not going to have much appeal to most people already on SL. Sansar is Linden Lab's attempt at making a product for everyone else who needs a VR environment but doesn't really like, want, or need something like SL.
  2. That's a really interesting assertion. I have only been in SL for a little over 2 years, but I'd like to see the documentation about this!! I've been on VERY laggy sims with high script counts and almost zero physics collisions taking place ( think any shopping event on SL ): tons of vendor scripts, tons of avatar scripts, lots of people wandering around but not doing much else. The lag on those sims is always very, very present. Not FPS-based lag ( there's that too ) but the inability to walk in a normal manner, sort of "snapping" back to your previous location as if on a leash, etc etc. So I'm curious ~ if the lag doesn't come from the 1000's of scripts running at shopping events, then where does it come from? No one is colliding; people can't really move to begin with. We're mostly just standing around camming vendor stalls. The material evidence says you're not quite correct ~ but I'd really like to see the documentation though!
  3. No no! ~ Script count still matters. It's just that Avatar Complexity matters TOO; they each affect different things. Script count bogs down the simulator ( the CPU of the sim ): too many scripts and you'll find it difficult to walk around, type, and enjoy yourself. Even if you use the viewer settings to turn everyone into a Jelly AV, if the script count on the sim is too high, walking will seem boggy and you may not be able to move your avatar around at all. However, you'll be able to see, at a nice fluid 30 FPS, that you can't walk around properly, and you can cam over to the other side of the sim without crashing ~ you just can't walk your avatar over there. What Quick Graphics addresses is the OTHER type of lag: the kind that turns SL into a slide show when you land in a highly populated sim, as your poor computer tries to draw the scene over and over again but can't keep up because there's too much stuff in it. You can be on a sim where walking around is quick and responsive ( no scripts at all ) but your viewer is crawling along at 2-3 FPS. Your avatar will be able to walk around, but camming will be difficult, as it'll look all stuttery and slide-show-like. Frequently both types of lag go hand in hand, as populated sims have a lot of overly complex avatars ( which causes the slide-show FPS lag ) who are also loaded down with a bunch of scripts that make walking incredibly difficult. Usually these places have 100's of VENDOR scripts too ~ ( because C88! ) and so walking becomes even harder. *Both still matter*.
  4. When you first start the viewer, it has to pull all of the surrounding info from the world around you. This makes things seem really laggy for a while, until it finishes downloading and then everything is better again.
  5. Yes ~ approximately 45% of all avatars are over 80K. I did a little measuring test and submitted a JIRA on it in the hopes that they would back off the 80K line, but they seemed rather intent on keeping it there. ( Though supposedly it changes based on what your graphics card is? ) Here's the study: https://jira.secondlife.com/browse/BUG-10967
  6. All current mesh heads ( human and non-human alike ) are either completely static and boring or they have complex multilayered meshes that are used to fake playing animations. That's how non-human avatars blink, smile, open their mouths, pretend to talk. Those polys are wasted and inefficient. Removing them because they will be replaced by an efficient bone and animation system reduces the render cost of the avatar. I don't know how you haven't quite picked up on that yet. Perhaps you're the owner of some really inefficiently made mesh avatars that are utterly stoic and statuesque in the facial region. But that's not the case for the vast majority of non-human content on SL.
  7. Don't argue with me when you're that wrong. Note: some of these avatars actually had under 80K ARC ( the upper right corner showed a nice elegant low 30K ) even with their multi-alpha-layered head meshes. That doesn't mean that Bento won't let them be even lower ~ and more animated and interesting. Most were 200K+ though.
  8. The vast majority of non-human avatars are presently comprised of sculpted prims or compounded "alpha-layered meshes", both of which are easily outperformed by a rigged mesh using the new skeleton bones. If, as a designer, I want to make a minotaur out of sculpts, doing so convincingly may use somewhere between 50-160K complexity just to achieve the form of the creature, never mind attempts to animate it. With the new skeleton that same form can be created more efficiently and animated more convincingly, while still keeping its complexity rating at roughly 25-30K total.
  9. It's a cumulative effect. The 80K limit is the default. If you personally feel your hardware is capable of handling more than that, that's why the slider exists: you can increase it. It's true that ARC-based de-rendering hits non-human AVs particularly hard, especially older ones created from a ton of sculpties. I would feel that this was a "nasty bias towards fur things" ~ were it not for the fact that project Bento is literally just down the road on the release schedule. And we ( the creators collaborating with the Lindens on the Bento beta ) deliberately planned out the new SL skeleton capabilities pretty much exclusively with non-human avatars in mind. The new skeleton is literally a quadruped, and all of our animation use cases took non-human avatar design into account, almost ahead of human ones. Ease of use for non-human avatars was the primary reason for a lot of choices. So non-human avatars are about to get a lot more impressive, for a much, much lower complexity cost than the presently existing ones available on the market.
  10. Ahh, shoes are another good point ~ especially sculpted shoes. I'll add that in. As for the rest of the mechanical explanations of the 'why' ~ that's way too in-depth for a simple "quick simple reference". I was trying to explain it in 2 paragraphs or less.
  11. I know Black Dragon has already adopted the Jelly-based auto-derendering thing. I imagine Firestorm will ~ with a modification here or there ~ soon enough as well. I can't speak to their actual release schedule, but I imagine you'll be seeing this in most major viewers ( Firestorm included ) by the middle of June or early July, possibly even earlier than that! http://wiki.secondlife.com/wiki/Third_Party_Viewer_Directory As for multi-layered onion avatars: the well made ones ( Slink and Maitreya ) are actually, surprisingly, only 5-7K ARC each. Some of the more poorly designed ones will probably run aground on the Jelly-AV de-render, but that's what this system is supposed to do: help people understand what content causes lag, and what content is well designed!! So far the hardest hit avatars are sadly the old sculptie-based Furry type avatars. Fortunately, good news for them: Bento is on its way soon!!
  12. Try to avoid changing states when using permissions ~ strange things happen. Just pretend the option to switch states in scripts isn't there. Declare a TRUE/FALSE variable for whether you want the script to do one thing or the other, then switch that variable. Honestly, I haven't read your script in depth; this is just a bit of cursory advice based on the fact that I saw 2 states with permission changes and didn't want to read into it any more than that.
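A minimal sketch of what that advice looks like in practice ~ one default state and a single flag instead of a second state, so the permission grant received in run_time_permissions is never discarded by a state change. The animation name "sit" is just a placeholder:

```lsl
// Single-state pattern: toggle a flag instead of switching states,
// so the runtime permission grant survives. "sit" is a placeholder.
integer gAnimating = FALSE;

default
{
    touch_start(integer num)
    {
        if (gAnimating)
        {
            llStopAnimation("sit");
            gAnimating = FALSE;
        }
        else
        {
            llRequestPermissions(llDetectedKey(0), PERMISSION_TRIGGER_ANIMATION);
        }
    }

    run_time_permissions(integer perms)
    {
        if (perms & PERMISSION_TRIGGER_ANIMATION)
        {
            llStartAnimation("sit");
            gAnimating = TRUE;
        }
    }
}
```

Everything the two hypothetical states would have done is handled by testing gAnimating, and the permission grant stays valid for the life of the script.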
  13. You need to be more specific please. Did this happen in the viewer? Have you tried relogging? Is this taking place in the VMM interface? What's going on?
  14. What is a good Complexity rating? By default the number you're trying to stay under is 80K. If you're OVER 80K and are confused as to why, here's a quick guide:
#1 Old sculptie items: If you're wearing things that are made out of sculpts and not mesh, chances are taking them off will practically halve your complexity rating. Old sculptie hair is terrible. So are old sculptie shoes, even ( especially ) the really fancy ones like Moodys.
#2 Just wearing too much stuff: I often see avatars with jewelry everywhere ~ wrists, waist, ankles, a dense cluster of earrings and 2-3 necklaces, clusters of rings on every finger. Add on top of that a mesh body, mesh head, three or four little animated shoulder pets, an animated tail, some huge wings, horns on their head, jewelry on those horns ~ etc etc etc. While that look can be done with a low avatar render complexity rating, it rarely is. You might need to find some different designers for your look. Which brings us to #3.
#3 Wearing mesh from inexperienced designers: This is where it gets a bit difficult to tell the "how and why". Some mesh is bad for SL; other mesh looks absolutely fantastic and has really low cost. The thing is ~ when you're just looking at an item in the store, they both look the same. Here the best advice I can give you is TRY DEMOS. Demos will give you a good idea of how complex an item is. As a general rule, if a single item adds over 30K to your complexity score, you should probably find an alternative. ( Most well made items add 2-9K each. )
Lastly ~ a lot of people have complained that the new feature punishes people for other designers' mistakes: "Why should I be de-rendered because a designer made something with 60K complexity and I bought it?" This is sadly the downside to this new feature. The hope is that designers AND residents alike will become more aware of the impact they have on causing lag in SL, and will create better ( less laggy ) things to wear and enjoy.
  15. Your avatar complexity has just become an important part of your SL. The hope is to teach people that items from some designers are bad for SL, while others look amazing and have very low cost. In a perfect world, we'll all pay attention to those complexity numbers and SL will become a less laggy place!
  16. Sorry to bug you about this, Gaia ~ but I think you're the one with the highest likelihood of knowing the answer to this. For Linden .anim files on the really old skeleton, before fitted mesh, the limits were MAX_CONSTRAINTS = 10 and MAX_JOINTS = 32. What are these values now with Bento?
  17. I can already tell you Vir's answer to this: "We're not looking to add this sort of functionality at this time, but maybe at a future date." That being said ~ influencing the avatar_lad.xml file on the fly would be an interesting proposal ( that is what you're suggesting here, if I understand it correctly? ). The problem is ~ I keep getting mired in the sticky details, such as: "I'm wearing two mesh heads, one restricts the eye spacing slider, one restricts the nose length slider ~ which one wins out?" If you take the least restrictive setup, then one head is going to have its sliders unlocked beyond the constraint you wished to place on it anyway, thus defeating the purpose of the feature. If you take the most restrictive, theoretically you can lock up every slider to the point of unusability. What about pre-existing shapes that have the slider values already set? Will they be altered upon wearing them? There are a lot of questions that make this ~ a bit less than simple to solve.
  18. Using wing bones to animate hair is perfectly fine, or better yet you can even use the Hind Limb bones for animating hair, dress trains, etc etc ~ there are endless possibilities. Adding even more bones would just be irresponsible at this point. As designers we have to be able to understand the limitations of the SL environment. Hair is presently one of the largest causes of lag on SL. Animating the dense meshes that people produce for convincing hair would compound the lag problem catastrophically. I'm already absolutely terrified of what would happen if someone, say, theoretically... linked a 1.2 million polygon head up to the current facial bones that Bento has. The amount of lag and crashes that would cause would be even worse than static 1.2 million polygon heads. I hope that all designers keep their poly counts in check for the long-term survival of SL. There should never be a single entity that has 1,200,448 triangles ~ let alone a RIGGED one.
  19. Sadly that would require modifying the mesh uploader extensively to read a DAE file with that embedded data in it. That DAE file would also no longer be a standardized XML DAE file that every mesh program in the world knows how to make, so it would require creating some sort of bizarro custom DAE file format for SL use only ~ then writing exporters for 3ds Max, Maya, Lightwave, and Blender to produce said bizarro DAE files. It's really not possible to do anything like this within the release time frame that Bento is shooting for. If you have an actual mechanical methodology for how this could be implemented that doesn't involve suddenly making proprietary Collada files ~ by all means file a feature suggestion JIRA!
  20. It's likely most mesh heads will not use identical rigs ~ as every face will be weighted differently. I submitted a proposed system to deal with this situation~ but it isn't something that you want to do via AO's.
  21. Given the release state of Oculus Rift, has LL considered gearing a substantial portion of Sansar's compatibility towards browser ( or browser app ) based rendering that would allow the platform to enjoy widespread usage without a long installation? Does Linden Lab have any plans to make Sansar content available in this manner? If not, why? Follow-up question: There are literally thousands of people and companies who would enjoy the use of a small, easily-hostable, easily-viewable VR experience for things such as simple walkthroughs of apartment layouts, or small-scale teaching experiences that are loadable on-demand, etc etc. Is LL attempting to capture that business demographic? If not, why?
  22. I posted this on the AV_Sitter2 forums and got in contact with Code Violet about this idea. It's still an idea ~ just that. It's a simple idea, though, and easy to implement, but it would solve a LOT of problems. As project Bento progresses, we, the design community, should probably start thinking about adopting some sort of standardized method for playing facial animations. Since it's unlikely that any given mesh head will have the same rigging and weighting as another one ( especially in the case of non-human avatars ), I'm proposing we standardize a simple single-script ( or partial script ) implementation for broadcasting facial animation requests. I'm trying to start this standardization early so that creators can adopt it early and save ourselves the hassle of muddling through until a standard arises later on. The nature of this script is simple: if your piece of furniture has an animation to play that you would like accompanied by a facial animation, it will send a targeted animation request to an avatar UUID to play an animation of that type. IE: if you have a kissing animation, the seated avatars will receive an animation play request on a pre-determined negative channel for "MESH_KISS". If there are any avatars wearing a mesh head that understands that request, they will attempt to play their own internal "MESH_KISS" animation. This way we can skip over the problem of trying to standardize rigs and weighting, as we can just have a list of "standardized animations" that each head can perform ( obviously not a requirement ). I just want the infrastructure to be there. To specify the example: all I'm proposing is that when an animation is played, a message similar to the following will be broadcast on a channel.
[AVATAR KEY] | PLAY | "MESH_KISS" | [LOOP TRUE/FALSE] | [LOOPTIME] | [ STOP_BLINK TRUE/FALSE ] ( recipient avatar, play animation, animation name, whether it's a looping animation, whether the animation should stop after a specified time, whether the animation should stop you from blinking while you're doing it ). The difficult part of establishing the protocol is defining what sort of animations should come "standard" on a mesh head. Just as examples: MESH_SMILE, MESH_YAWN, MESH_KISS, MESH_LICK, MESH_PH_A ( Mesh Phonetic A ), MESH_PH_O, MESH_PH_S, MESH_PH_P ( etc etc etc ) ~ and so on and so forth. The list need not be exhaustive and can always have more animations added at a later time. The furniture scripts need only have the animation names tacked onto their pre-load notecards. The majority of the "work" will be the animation players inside the heads. The idea is simply to establish the protocol NOW so we don't have to muddle through disparate rigs and weirdness later on. This idea is just that ~ an idea at the moment. I'm eagerly awaiting feedback on what else it should contain / do, and on what people think the "basic" animations are that a mesh head should be able to run. Please note: the list need not be exhaustive, as any animation requests that are not understood will simply be ignored. The idea is simply to get this project going early so we have a reliable infrastructure to count on later. My hope is to nudge the community in the direction of keeping facial animations and body animations as two separate animations. The situation I'm trying to avoid is animators animating the face when creating their own somatic animations, and then the head-designer community fighting against the incursion of "bad head animations" that don't work for their particular content. I want to set precedents and minimize user AND creator confusion from the start.
In addition, it would help solve the animation file-size problem if we had an established 'split' between facial and body animations, but I don't want to go so far as to ask the Lab to enforce a restriction, as that would limit the creativity of the platform. Ideally, though, I would like an announcement that establishes this precedent when Bento goes live.
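A rough head-side sketch of the listener half of the proposal, just to make the message layout concrete. The channel number here is an arbitrary placeholder ( the proposal only says "a pre-determined negative channel" ), and the parsing assumes exactly the pipe-delimited layout above ~ none of this is an adopted standard:

```lsl
// Hypothetical receiver inside a mesh head for the proposed protocol.
// The channel value and field layout are placeholders, not a standard.
integer PROTOCOL_CHANNEL = -774411;  // example negative channel only

default
{
    state_entry()
    {
        // Attachments receive animation permission without a dialog.
        llRequestPermissions(llGetOwner(), PERMISSION_TRIGGER_ANIMATION);
        llListen(PROTOCOL_CHANNEL, "", NULL_KEY, "");
    }

    listen(integer chan, string name, key id, string msg)
    {
        // Expected: [AVATAR KEY] | PLAY | MESH_KISS | LOOP | LOOPTIME | STOP_BLINK
        list parts = llParseString2List(msg, ["|"], []);

        // Ignore requests targeted at some other avatar.
        if ((key)llStringTrim(llList2String(parts, 0), STRING_TRIM) != llGetOwner())
            return;

        string cmd  = llStringTrim(llList2String(parts, 1), STRING_TRIM);
        string anim = llStringTrim(llList2String(parts, 2), STRING_TRIM);

        // Animation names this head doesn't contain are silently
        // ignored, exactly as the proposal describes.
        if (cmd == "PLAY" && llGetInventoryType(anim) == INVENTORY_ANIMATION)
            llStartAnimation(anim);
    }
}
```

The loop, looptime, and stop-blink fields are left unhandled here; a real head script would read them the same way and run a timer to stop looping animations.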
  23. Okay ~ I've actually got a proper answer-answer to this question. The internal animation format for Second Life does not contain scaling data at all. The entire data array inside the actual .anim files is exclusively rotation and positional information, stored on a per-joint basis and packed into a compact binary format for file-size reasons. So scaling isn't possible presently without rewriting the entire internal animation file structure, then rewriting the animation interpreters to deal with the new structure. On top of that ~ any old viewers would look at the new animation files and be very, very confused, so it would have to be a massive update ~ etc etc. Basically ~ I don't think it's worth it.