Fluffy Sharkfin (Resident · 944 posts)

Everything posted by Fluffy Sharkfin

  1. I agree it's an undeniably cumbersome solution, and having someone write a simple lightweight app that allows you to pose a figure and have that data relayed to the puppeteer plugin would definitely be preferable. I was just thinking in terms of the simplest solution to implement in order to test how well the theory works in practice, and since the puppeteer plugin uses Python, which is supported in Maya & Blender, that seemed like a logical first step. I must admit I've yet to try BD so haven't had a chance to try the in-viewer posing tool, but I assume that would be the easiest solution of all (if LL were inclined to implement it).
  2. I was actually thinking about something similar, i.e. using Blender or Maya to pose the SL armature and having that information mirrored in real time on your avatar in SL via the puppeteer plugin. That could potentially be useful for SL photographers wanting to fine-tune poses when taking pictures.
  3. As a child I was raised by my father's oldest sister, who was around 70 years older than me (she was born in the early 1900s), so I had a very "traditional" upbringing myself. 😅
  4. Hmm, no thank you, I already cringe whenever I meet an English person with a first name like "Sir" or "Lord", let's not encourage them further! Anyway, as has been pointed out, the British monarchy is too divisive a subject and, despite her unusually long reign, I see no real reason to single out this monarch for special treatment over any other head of state or world leader.
  5. I already own several mesh bodies, but if the new NUX avatars are of reasonable quality then I fully intend to start using them instead, for a variety of reasons (none of which are "because I'm too cheap"). Having a readily available dev kit so I can make my own clothes, etc. is definitely the most compelling reason to swap, and, as has been pointed out, the successful adoption of LL's new mesh avatars will rely heavily on the amount of content available for them, so I'm eager to pitch in. While I believe there will always be a market for third-party bodies, I think it's about time that market became a little more niche and less an essential component of avatar creation. New users sign up to SL for a multitude of reasons, some of which don't actually require the level of realism or functionality that aftermarket bodies provide, so this weird "mesh snobbery" that seems to have developed over the years is a little misplaced. The assumption seems to be that if someone isn't wearing a mesh body and clothing they're not invested enough in SL or simply don't care about their appearance. However, the truth is that, for some, spending lots of money on their avatar's appearance does very little to enrich their experience (beyond avoiding the derision of those aforementioned "mesh snobs"), so judging people for not wearing mesh is essentially discriminating against them because they don't share your interests. I expect there will still be those who turn their noses up at people using a "freebie" body, but at least they won't be able to hide behind the excuse of "it's not mesh" and will instead be forced to admit they're just snobs.
  6. 100% agree with Rolig, the LSL wiki has been in my browser bookmarks since the day I signed up for SL and is an invaluable tool. Another invaluable tool when it comes to learning to script is this forum. Use the search function liberally and on the rare occasion that you can't find an existing thread containing an answer to your question, you can always start a new one and ask it yourself. There are some very knowledgeable (and equally helpful) people here who will be happy to point you in the right direction.
  7. Take a look at llSitTarget on the wiki, which allows you to set the offset and rotation of an avatar when sitting on an object (see the example sketch at the end of this list).
  8. Hopefully LL will either provide it in a widely supported format like glTF or will release it in multiple formats to accommodate as many creators as possible. If not, then no doubt it won't be more than a day or two before someone goes to the trouble of converting the dev kit so it can be used in other software; SL residents are very resourceful and quite adept at stepping in and providing solutions when LL drop the ball on little details like this.
  9. It should be a relatively simple process to set up a full-bright background in SL and then set the SL window as a source in OBS so you can overlay your puppeteered avatar onto whatever live video you're capturing/streaming. It will be interesting to see whether any streamers will risk the wrath of Twitch moderation and try using their SL avatar as a cheap alternative to the more expensive custom-made VTuber avatars some streamers currently use, and just how seriously Twitch takes that infraction (assuming the feature ends up working well enough to be a viable alternative, of course).
  10. While it's nice to theorize on the possibilities a feature like this may present, I doubt things like working collision bones or IK are going to be part of what LL delivers at the end of its development. I suspect this is essentially what we're getting: turn on your webcam and you can wave at other avatars and nod or shake your head. I personally don't see any point in developing it beyond that; it's a neat toy that some people will enjoy and a few will use when making machinima or tutorial videos. If LL want to do something to improve avatar interactions or provide new features for creators, I think their efforts would be best spent in other areas.
  11. You have to possess a lot of dedication to the platform to want to spend days (if not weeks) creating something like this in Second Life when other platforms running on modern hardware are capable of this...
  12. Technically speaking, most (if not all) current skins already make some attempt to mimic SSS by using subtle changes in the colouration of the skin to accentuate certain features. In reality a person's skin isn't different colours on different areas of the face; the reason the cheeks are redder and the bony areas are lighter in colour is due to subsurface scattering, so any skin which uses colour variation to highlight features like cheeks, knuckles, etc. is emulating SSS.
  13. Subsurface scattering is basically a lighting effect so trying to bake it into the texture works about as well as baked lighting or reflections. A good real world example of SSS is the effect you get when you hold your hand up in front of strong sunlight, that red glow you see around the edge of your fingers is the result of subsurface scattering (i.e. light passing through the surface of your skin and scattering as it bounces around on the blood vessels, etc. beneath). Obviously as you rotate your hand the glow continues to only appear on the edges of your fingers where the light can pass through. Essentially the effect is "dynamic", which is why you can't fake subsurface scattering using the skin texture.
  14. If that's the case then I think LL are in a catch-22 situation, because if they make the avatars more stylized and less realistic then new users are going to instantly feel out of place once they start mingling with the general population. I think a big part of the problem is that the standard SL avatar has hardly changed at all in the last two decades, and most of the improvements to its visual quality are the result of innovation and hard work by creators/residents. The downside of that is the majority of the avatar customization process is now essentially out of LL's hands, so even if they came up with a groundbreaking new system for avatar creation it would still have very little impact, since there's no way to integrate the vast wealth of user-created options which are necessary to create the type of realism you see in-world. Personally I'd be surprised if avatar realism were an issue, since the characters in BDO are arguably just as realistic and detailed as most SL avatars and yet people seemed to love the character creator and happily spent hours in it tweaking their characters (they even have a gallery where people can share their creations with other players).
  15. I suspect the fact that people sign up for such a myriad of reasons may actually be a contributing factor to the poor retention rate of new users, as it complicates the process of guiding users through the signup process and delivering them to the type of content that actually prompted them to try SL in the first place. Being presented with a lot of information which isn't particularly relevant to your reason for visiting SL is probably off-putting. On the subject of allowing new users to customize their avatars during the signup process, I read an article about a company called ReadyPlayerMe that's offering a third-party avatar creation service (you can try the avatar creation process for free here). It's mostly aimed at the new crop of "metaverse" platforms, so the style of avatar is quite cartoonish/stylized, but I think the general approach is something that could be applicable to the SL signup process too (although personally I'd rather see them copy the Black Desert Online character creation process, but that's probably asking a little too much).
  16. Since the term "metaverse" was coined by Neal Stephenson in his novel Snow Crash (which was apparently one of the inspirations behind Second Life), I've always considered his description of it in the book to be the canonical definition. I see Second Life as an early prototype of how a metaverse may look, but it's a little too insular and disconnected from reality to be a full-fledged metaverse.
  17. Yeah but, weird FOV choices & questionable structural integrity aside, the quality of the graphics isn't all that bad. 😄
  18. I don't have any particular love for the guy either, but I get the feeling that the media (and the internet as a whole) have pushed that narrative into the public consciousness over the last few years. In response to the recent widespread mockery of Meta over a screenshot he tweeted, he posted a couple of updated screenshots from Meta (the lighting/rendering of which seems more on par with SL than previous images) on his Instagram. Since I was there I figured I'd take a look at some of his other posts, and in some of them he looks positively emotive. Perhaps he's just spent more time practicing in the mirror lately, or they upgraded the emotion subroutines in his firmware or something, but at times it almost seems like he's one of us!
  19. Nothing I've seen of Meta really impresses me so far, especially not when you consider the amount of money that's gone into development (seriously, what have they been doing with all that cash? Are the coders all sitting around with solid gold keyboards? Does each of the designers have a diamond-encrusted mouse?). But I must say I do feel a certain sympathy for the guys tasked with creating the avatars. Everyone assumes that they were asked to create realistic human avatars (and if that's the case they deserve all the scorn directed at them), however if "the Zuck" just showed up one day and said "make an avatar that looks like me!"... I think they pretty much nailed it... the lifeless eyes, the pallid complexion, even the weird hair and gormless expression.
  20. I'm just sad that they didn't include the store greeter guy from Idiocracy saying "Welcome to Walmart, I love you!" at the start of the shopping experience!
  21. Oh if it were a technical demo on how to perform an oil change for the specific make and model of the vehicle you own in real life, sure. But in that video it appeared to be a virtual mechanic changing your virtual oil whilst you were in the virtual store playing with virtual groceries ... and you didn't even get to watch them doing it! 🤣
  22. I'm with you (and Nvidia) on how I perceive the metaverse, although I don't think of the metaverse as purely an entertainment platform but rather a collaborative tool which can be used to create social spaces as well. So far the only tech demos I've seen for self-proclaimed metaverse products that have really piqued my interest and made me stop to wonder at the possibilities are the ones concerning Nvidia's Omniverse project which, rather than focusing on creating a virtual world, focuses on the framework required to connect various applications and development platforms so that people can work collaboratively in real time. All these new virtual worlds claiming to be the metaverse seem more like marketing schemes; the foundation for a functioning metaverse will be the technology that connects multiple virtual worlds together.
  23. Yes, to my mind putting virtual oil in your virtual car is no less silly than being forced to put unwanted virtual milk back into a virtual refrigerator. The video does a great job of illustrating one of the pitfalls of designing virtual environments: the designers have opted for familiarity over functionality, implementing features which serve no purpose beyond emulating the tedium of the real-life experience rather than using their imagination and the technology available to create something innovative that offers a real advantage to shopping in the metaverse.
  24. Perhaps the designers decided that in order to really recreate the shopping experience the user should be forced to start the process by driving from their virtual home to the virtual store in their virtual car; clearly words like "innovative" and "efficient" weren't part of their design brief.
  25. The part of the video where the user gets sent back to the virtual refrigerated section so they can put the non-existent milk back into the theoretical fridge was hilarious. I guess if the metaverse does crash and burn, at least Walmart can recoup some of the cost of developing this by releasing it as a Shopping Simulator game, like PowerWash Simulator, only with shopping carts!
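
Regarding post 7 above: a minimal sketch of how llSitTarget might be used, assuming the script lives in the prim the avatar sits on (the offset and rotation values here are arbitrary placeholders, not anything from the original post):

    default
    {
        state_entry()
        {
            // Seat the avatar half a metre above the prim's centre,
            // rotated 90 degrees around the vertical axis (placeholder values).
            llSitTarget(<0.0, 0.0, 0.5>, llEuler2Rot(<0.0, 0.0, 90.0> * DEG_TO_RAD));
            llSetSitText("Sit Here");
        }
    }

Note that changing the sit target only affects avatars who sit after the call; anyone already seated keeps the old offset until they stand and sit again.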