
Fluffy Sharkfin

Resident
  • Posts: 916
  • Joined
  • Last visited

Everything posted by Fluffy Sharkfin

  1. You can use llDetectedTouchST to detect the surface coordinates at which a prim has been touched; then it's just a case of having a lookup table of acceptable coordinate ranges for each "button" on the texture for that face. When a user touches a face you can test whether the touch falls within the valid coordinates for a button on that face's texture and, if so, trigger the corresponding function in the script.
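The lookup-table approach described above can be sketched in LSL roughly as follows (the button names and coordinate ranges here are hypothetical, purely to illustrate the idea):

```lsl
// Sketch: map touch ST coordinates to named "buttons" on a texture.
// Button names and coordinate ranges below are made-up examples.
list buttons = [
    // name, min S, min T, max S, max T
    "play", 0.10, 0.60, 0.40, 0.90,
    "stop", 0.60, 0.60, 0.90, 0.90
];

default
{
    touch_start(integer num)
    {
        vector st = llDetectedTouchST(0);
        if (st == TOUCH_INVALID_TEXCOORD) return; // no surface coords available

        integer i;
        for (i = 0; i < llGetListLength(buttons); i += 5)
        {
            if (st.x >= llList2Float(buttons, i + 1) &&
                st.y >= llList2Float(buttons, i + 2) &&
                st.x <= llList2Float(buttons, i + 3) &&
                st.y <= llList2Float(buttons, i + 4))
            {
                llOwnerSay("Pressed: " + llList2String(buttons, i));
                return;
            }
        }
    }
}
```

llDetectedTouchST returns coordinates in the 0.0 to 1.0 range on each axis of the touched face, so the ranges in the lookup table use that same scale.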
  2. Alternatively, since it probably gets lonely standing around all day creating things, maybe they just do it in the hope that one of their customers will IM them asking for modifiable versions and then they'll have someone to talk to? 🤔
  3. I wouldn't be so sure. While a lot of the current 3D generative AI tools do produce some truly atrocious topology, there are some that have made significant improvements. The topology of the results displayed in the video below may not be perfect or particularly well optimized, but it's still on par with some of the content I've seen in SL. Additionally, since the resulting topology is quad-based, it would be very easy to optimize by hand if you really wanted to maximize performance. Generative AI is still in its infancy and, while it may not be ready to produce content entirely unaided yet, as a tool for creatives it's already becoming a part of artists' workflows and is being incorporated into industry-standard software.
  4. That's already inevitable, unfortunately. A brief Google search for "AI generated 3D models" yields plenty of results, including an article listing the seventy best AI tools for creating 3D content, and that list will only grow over time. There's no stuffing the AI genie back in its bottle; if you aren't creating mesh purely for the enjoyment of it then you may as well stop now, or at least pivot to incorporating AI into your workflow. The only benefit to having existing 3D skills is if you understand enough of the theory behind 3D creation to refine and optimize the sorts of results that AI is capable of producing, thereby allowing you to produce a better final product than those who use generative AI without any "post production" process.
  5. In light of recent developments in this discussion I'd like to revise my previous answer and suggest that, in lieu of free animesh ponies, LL distribute wearable saddles to all new residents which can be "hired" via a tip jar, so that they can earn money by giving people rides to and from the newly reinstated telehubs!
  6. I'll admit I've got way more experience working with the Maitreya body and haven't really bothered with the Legacy. I guess since most people go with the typical hourglass or pear shapes I just assumed Legacy wasn't as versatile as the Maitreya. I agree you can definitely create a wider variety of shapes than you normally see around SL, it's just a matter of choosing the right bodies for the right shapes.
  7. I'd say it's even easier to achieve a wide variety of shapes with the Maitreya body, especially if you use deformers and add-ons. Certain bodies just aren't going to work for certain shapes because the creators have already accentuated some aspects of the anatomy beyond what the appearance sliders can compensate for, but there are enough popular bodies that you can find one or two that will accommodate almost any shape.
  8. Add a forum feature that changes random words of every post to "banlines"!
  9. Release the A.I. powered hippos! Free animesh ponies for everyone! Hire Morgan Freeman to follow me around and narrate my second life! Legalize shortness!
  10. I think in a few years' time BCI and AR could be a real game-changing combination, but both technologies are still a long way from being commercially viable consumer products. As for using a direct interface between your brain and a computer to play Second Life, I think the lack of filter between people's brains and their keyboards is enough of a problem already.
  11. I'm not sure what games you've been playing or what sims you hang out on but I feel like we lead very different second lives! 🤣
  12. My mind flashed back to the scene in Trainspotting but I decided not to link it since it's wildly off-topic and also really gross! 😅 I like my reality just the way it is, and while I find the whole subject of implanted technology and brain interfaces fascinating I'm with Paul on this one, I'd rather keep my tech on the outside for now. I've had the OpenBCI site bookmarked for a while now and have often toyed with the idea of trying out some of their more affordable options.
  13. I believe you can get a variety of suppositories that have similar effects, do those count? 😅
  14. Of course not! I will defend any person's right to be a blob if that's what makes them happy! But we all know how marketing works, pretty soon everyone will be thin shaming and the next thing you know we're living in a Blobocracy (but not an actual blobocracy cos apparently that's a real thing 😮) where everyone is judged not by their merits or actions but purely by their roundosity!
  15. I can definitely see a number of very useful applications for brain implants that allow you to remotely control technology, but the question is will it be used to improve people's lives and empower them, or will it simply enable people who are, as Sid put it.. Ultimately it could be used to improve humanity or we could all end up looking like this...
  16. I guess it's not as permanent as a factory reset at least. 😅 Neuralink will probably be a lot more appealing in years to come once the technology has advanced, but so far the most advanced real-world application they seem to have tested is one of the recipients playing a game of online chess, which isn't exactly pushing the boundaries of human limitations. It's definitely something that could benefit a lot of people with disabilities in the future though, I just hope that development is allowed to focus on that and not dictated purely by profit.
  17. Oh I think I may have tried that a few times! Isn't that when you drink alcohol until you lose the ability to talk, your limbs don't work properly and you end up vomiting on yourself?
  18. That quirky old name for the Windows error screen could potentially become a lot more literal if you happen to be driving or operating heavy machinery! We could soon be living in a world where fatal error messages precede actual fatalities... 🤔
  19. As much as I love technology, I'd have to say no because the automatic updates and pop-ups would be a deal breaker for me!
  20. If LL do institute a minimum height policy in Second Life I assume they'll also be doing random spot checks on people's private properties to make sure nobody is resizing their houses and furniture to make themselves feel shorter?!
  21. Obviously you'll need to use llStartAnimation (and other related functions for requesting/checking permissions, etc.), but in addition you should also take a look at llGetAgentInfo, and specifically AGENT_ON_OBJECT, which you can use to detect whether an avatar is seated or standing.
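A minimal sketch of how those pieces might fit together (the animation name "sit_anim" is a placeholder for one stored in the prim's inventory):

```lsl
// Sketch: trigger an animation only when the touching avatar is seated.
default
{
    touch_start(integer num)
    {
        key av = llDetectedKey(0);
        if (llGetAgentInfo(av) & AGENT_ON_OBJECT)   // seated on an object?
            llRequestPermissions(av, PERMISSION_TRIGGER_ANIMATION);
        else
            llOwnerSay("Avatar is standing, not animating.");
    }

    run_time_permissions(integer perm)
    {
        if (perm & PERMISSION_TRIGGER_ANIMATION)
            llStartAnimation("sit_anim");  // placeholder animation name
    }
}
```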
  22. I agree that in most cases the individual creating the assets will not be the same person responsible for their implementation within a game engine, but there are still considerable differences between a level designer working as part of a development team and the way in which SL residents utilize assets within SL. To use your "mirrored" keyword as an example: if a level designer were facing the same issue being described in this thread, they could simply ask one of the coders to add a function to the shader which automatically inverts the appropriate channel of the normal map whenever it encounters an asset containing the "mirrored" keyword, whereas SL residents obviously don't have the ability to modify the shaders used to display the assets they're working with.

      Incidentally, both Unreal and Unity do support texture mirroring. In Unreal Engine you can mirror a texture simply by multiplying the UV coordinates by -1 on the required axis (similar to how textures are mirrored in SL), while in Unity you can set the TextureWrapMode to Mirror/MirrorOnce rather than Repeat and then offset the texture accordingly.

      As an alternative to adding checkboxes which allow users to manually invert the red or green channel of a normal map, LL could potentially take an approach similar to Unity's: provide the option to toggle between repeating and mirrored tiling, and automatically invert the correct channel within the shader whenever the mirrored version of the normal map is displayed. Then, rather than setting the texture scale to -1, residents could set the texture offset to 1.0 on the required axis in order to display a mirrored version of the textures/materials.
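For reference, this is roughly how mirroring via a negative repeat value already works from LSL (a sketch; face 0 and the use of the first inventory texture are arbitrary choices for illustration). The texture is flipped, but the normal map's red/green channels are not corrected, which is the limitation discussed above:

```lsl
// Sketch: mirror the texture on one face using a negative X repeat,
// the existing way to mirror textures in SL.
default
{
    touch_start(integer num)
    {
        llSetPrimitiveParamsFast(LINK_THIS, [
            PRIM_TEXTURE, 0,                          // face 0 (arbitrary)
            llGetInventoryName(INVENTORY_TEXTURE, 0), // first texture in inventory
            <-1.0, 1.0, 0.0>,                         // repeats: -1 on X mirrors horizontally
            <0.0, 0.0, 0.0>,                          // offsets
            0.0                                       // rotation (radians)
        ]);
    }
}
```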
  23. Because the vast majority of those people aren't developing for a platform built entirely from user-created content. Second Life requires end users to have as much control as possible over how content is displayed, whereas most game developers are entirely responsible for creating and optimizing all of the environments, props, and characters, and provide users with little to no ability to directly modify any of the content within the game/platform. It should be a matter of simply providing a couple of extra checkboxes in the texture tab of the build window which, when checked, invert the red or green channel of the normal map; this would allow users to flip a normal map horizontally and/or vertically and then invert the corresponding channel(s) to compensate for the reversed texture orientation.