Everything posted by Fluffy Sharkfin
-
You can use llDetectedTouchST to detect the surface coordinates at which a prim has been touched; then it's just a case of having a lookup table of acceptable coordinate ranges for each "button" on the texture for that face. When a user touches a face you can test whether the touch falls within the valid coordinates for a button on that face's texture and, if so, trigger the corresponding function in the script.
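A minimal sketch of the idea, assuming a simple 2x2 button grid on the touched face (the grid layout and the response are illustrative, not a real product's layout):

```lsl
// Sketch: dividing a face's texture into a 2x2 button grid and using
// llDetectedTouchST to work out which "button" was pressed.
default
{
    touch_start(integer num_detected)
    {
        // ST coordinates run 0.0-1.0 across the touched face
        vector st = llDetectedTouchST(0);

        // Some viewers/objects can't supply touch coordinates
        if (st == TOUCH_INVALID_TEXCOORD) return;

        // Map the coordinates onto a 2x2 lookup grid
        integer col = (integer)(st.x * 2.0);  // 0 = left,  1 = right
        integer row = (integer)(st.y * 2.0);  // 0 = bottom, 1 = top
        integer button = (row * 2) + col;     // button index 0..3

        // In a real script you'd branch to the function for this button
        llOwnerSay("Button pressed: " + (string)button);
    }
}
```

For irregular button layouts you'd replace the grid arithmetic with an explicit list of min/max ST ranges per button and scan it on each touch.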
-
I wouldn't be so sure. While a lot of the current 3D generative AI tools do produce some truly atrocious topology, there are some that have made significant improvements. The topology of the results displayed in the video below may not be perfect or particularly well optimized, but it's still on par with some of the content I've seen in SL. Additionally, since the resulting topology is quad-based, it would be very easy to optimize by hand if you really wanted to maximize performance. Generative AI is still in its infancy and, while it may not be ready to produce content entirely unaided yet, as a tool for creatives it's already becoming part of artists' workflows and is being incorporated into industry-standard software.
-
That's already inevitable, unfortunately. A brief Google search for "AI generated 3D models" yields plenty of results, including an article listing the seventy best AI tools for creating 3D content, and that list will only grow over time. There's no stuffing the AI genie back in its bottle; if you aren't creating mesh purely for the enjoyment of it then you may as well stop now, or at least pivot to incorporating AI into your workflow. The only benefit to having existing 3D skills is understanding enough of the theory behind 3D creation to refine and optimize the sort of results AI is capable of producing, thereby allowing you to produce a better final product than those who use generative AI without any "post-production" process.
-
Silly Ideas to Fix Second Life
Fluffy Sharkfin replied to Charlemagne Allen's topic in General Discussion Forum
In light of recent developments in this discussion I'd like to revise my previous answer and suggest that, in lieu of free animesh ponies, LL distribute wearable saddles to all new residents which can be "hired" via a tip jar, so that they can earn money by giving people rides to and from the newly reinstated telehubs!
I'll admit I've got way more experience working with the Maitreya body and haven't really bothered with the Legacy. I guess since most people go with the typical hourglass or pear shapes I just assumed Legacy wasn't as versatile as the Maitreya. I agree you can definitely create a wider variety of shapes than you normally see around SL, it's just a matter of choosing the right bodies for the right shapes.
-
I'd say it's even easier to achieve a wide variety of shapes with the Maitreya body, especially if you use deformers and add-ons. Certain bodies just aren't going to work for certain shapes, because the creators have already accentuated some aspects of the anatomy beyond what the appearance sliders can compensate for, but there are enough popular bodies that you can find one or two that will accommodate almost any shape.
-
Silly Ideas to Fix Second Life
Fluffy Sharkfin replied to Charlemagne Allen's topic in General Discussion Forum
Them's fighting words... Ban bans!
Silly Ideas to Fix Second Life
Fluffy Sharkfin replied to Charlemagne Allen's topic in General Discussion Forum
Add a forum feature that changes random words of every post to "banlines"!
-
Silly Ideas to Fix Second Life
Fluffy Sharkfin replied to Charlemagne Allen's topic in General Discussion Forum
Release the A.I. powered hippos! Free animesh ponies for everyone! Hire Morgan Freeman to follow me around and narrate my second life! Legalize shortness!
I think in a few years' time BCI and AR could be a real game-changing combination, but both technologies are still a long way from being commercially viable consumer products. As for using a direct interface between your brain and a computer to play Second Life, I think the lack of filter between people's brains and their keyboards is enough of a problem already.
-
My mind flashed back to the scene in Trainspotting but I decided not to link it since it's wildly off-topic and also really gross! 😅 I like my reality just the way it is, and while I find the whole subject of implanted technology and brain interfaces fascinating I'm with Paul on this one, I'd rather keep my tech on the outside for now. I've had the OpenBCI site bookmarked for a while now and have often toyed with the idea of trying out some of their more affordable options.
-
Of course not! I will defend any person's right to be a blob if that's what makes them happy! But we all know how marketing works, pretty soon everyone will be thin shaming and the next thing you know we're living in a Blobocracy (but not an actual blobocracy cos apparently that's a real thing 😮) where everyone is judged not by their merits or actions but purely by their roundosity!
-
I can definitely see a number of very useful applications for brain implants that allow you to remotely control technology, but the question is: will it be used to improve people's lives and empower them, or will it simply enable people who are, as Sid put it... Ultimately it could be used to improve humanity, or we could all end up looking like this...
-
I guess it's not as permanent as a factory reset at least. 😅 Neuralink will probably be a lot more appealing in years to come once the technology has advanced, but so far the most advanced real-world application they seem to have tested is one of the recipients playing a game of online chess, which isn't exactly pushing the boundaries of human limitations. It's definitely something that could benefit a lot of people with disabilities in the future though; I just hope that development is allowed to focus on that and not dictated purely by profit.
-
As much as I love technology, I'd have to say no because the automatic updates and pop-ups would be a deal breaker for me!
-
Official Statement from Oberwolf Linden
Fluffy Sharkfin replied to Scylla Rhiadra's topic in General Discussion Forum
If LL do institute a minimum height policy in Second Life, I assume they'll also be doing random spot checks on people's private properties to make sure nobody is resizing their houses and furniture to make themselves feel shorter?!
Obviously you'll need to use llStartAnimation (and other related commands for requesting/checking permissions, etc.) but in addition you should also take a look at llGetAgentInfo and specifically AGENT_ON_OBJECT which you can use to detect whether or not an avatar is seated or standing.
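A minimal sketch of combining the two, assuming the animation should only play for a seated toucher (the built-in "sit" animation name is just a placeholder for whatever animation the script would actually use):

```lsl
// Sketch: only request animation permissions when the toucher is seated,
// using llGetAgentInfo with the AGENT_ON_OBJECT flag.
default
{
    touch_start(integer num_detected)
    {
        key toucher = llDetectedKey(0);

        // AGENT_ON_OBJECT is set when the avatar is sitting on an object
        if (llGetAgentInfo(toucher) & AGENT_ON_OBJECT)
        {
            llRequestPermissions(toucher, PERMISSION_TRIGGER_ANIMATION);
        }
        else
        {
            llOwnerSay("Avatar is standing, not starting the animation.");
        }
    }

    run_time_permissions(integer perm)
    {
        if (perm & PERMISSION_TRIGGER_ANIMATION)
        {
            llStartAnimation("sit");  // placeholder animation name
        }
    }
}
```

Note that llStartAnimation must be called after the permission is actually granted, which is why it lives in the run_time_permissions event rather than directly in touch_start.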
-
Mirroring normal maps... too much to ask?
Fluffy Sharkfin replied to Rick Nightingale's topic in Building and Texturing Forum
I agree that in most cases the individual creating the assets will not be the same person responsible for their implementation within a game engine, but there are still considerable differences between a level designer working as part of a development team and the way in which SL residents utilize assets within SL. To use your "mirrored" keyword as an example: if a level designer were facing the same issue being described in this thread, they could simply ask one of the coders to add a function to the shader that automatically inverts the appropriate channel of the normal map whenever it encounters an asset containing the "mirrored" keyword, whereas SL residents obviously don't have the ability to modify the shaders used to display the assets they're working with.

Incidentally, both Unreal and Unity do support texture mirroring. In Unreal Engine you can mirror a texture simply by multiplying the UV coordinates by -1 on the required axis (similar to how textures are mirrored in SL), while in Unity you can set the TextureWrapMode to Mirror/MirrorOnce rather than Repeat and then offset the texture accordingly.

As an alternative to adding checkboxes that let users manually invert the red or green channel of a normal map, LL could potentially implement an approach similar to the one used in Unity: provide the option to toggle between repeating and mirrored tiling, and automatically invert the correct channel within the shader whenever the mirrored version of the normal map is displayed. Then, rather than setting the texture scale to -1, residents could set the texture offset to 1.0 on the required axis in order to display a mirrored version of the textures/materials.
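For context, this is how residents currently mirror a texture from a script: a negative repeat value on the desired axis flips the texture on that face. It's a sketch with placeholder values (TEXTURE_BLANK and face 0 are just for illustration), and it's exactly this trick that causes the problem being discussed, since the normal map's channels aren't inverted to match:

```lsl
// Sketch: mirroring a face's texture horizontally via a negative repeat.
// Any normal map applied to the same face is flipped too, but its red
// channel is NOT inverted to compensate, which breaks the shading.
mirrorFaceHorizontally(integer face)
{
    llSetPrimitiveParams([
        PRIM_TEXTURE, face,
        TEXTURE_BLANK,       // placeholder; normally an inventory texture or UUID
        <-1.0, 1.0, 0.0>,    // repeats: -1.0 on x mirrors horizontally
        <0.0, 0.0, 0.0>,     // offsets
        0.0                  // rotation in radians
    ]);
}

default
{
    touch_start(integer num_detected)
    {
        mirrorFaceHorizontally(0);  // mirror face 0 as a demonstration
    }
}
```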
Mirroring normal maps... too much to ask?
Fluffy Sharkfin replied to Rick Nightingale's topic in Building and Texturing Forum
Because the vast majority of those people aren't developing for a platform built entirely from user-created content. Second Life requires end users to have as much control as possible over how content is displayed, whereas most game developers are entirely responsible for creating and optimizing all of the environments, props and characters, and provide users with little to no ability to directly modify any of the content within the game/platform. It should be a matter of simply providing a couple of extra checkboxes in the texture tab of the build window which, when checked, invert the red or green channel of the normal map; this would allow users to flip a normal map horizontally and/or vertically and then invert the corresponding channel(s) to compensate for the reversed texture orientation.