face2edge

See something different...


Question: Is there a method that allows one viewer or avatar to see something different from another user at the same time?

Example 1: As I walk by, the sign is looking right at me, but someone else says it is looking right at them. Of course, this could occur in multiple viewers.

Example 2: As any avatar within a certain range moves around an object like a sphere, the color changes, so that users on opposite sides may see blue or yellow at the same time in their different viewers.

Challenge: The challenge is for each viewer to have an object display properties based on that avatar's position. I am not talking about following the closest avatar; that is simple. I am not describing how a bump map's highlights change as you walk by, but that is close, in that each angle shows something different.

Desire: I want to create a hologram in a simulated world like Second Life. Mirrors and lasers do not exist, but is there a way to code for these phenomena?

Just remember: There is no spoon.


The only thing that comes to mind is like those jiggy-pictures you sometimes find in cereal boxes or chewing gum packs.

That's triangles with different images on each side, so from your angle you see a different side than others do.

The best translation might be flip-picture.

But that sounds prim-heavy...

Monti


Example 1, the sign: you can do it with particles, as they always face the viewer. Just time them to emit and die at around the same time, so there is always a particle sign to look at.

 

You may be able to extrapolate that into example two as well.
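A minimal sketch of such a particle "sign" in LSL. The texture UUID is a placeholder you would replace with your own sign texture, and the timings are a guess; the point is just that each particle lives longer than the emission interval, so one is always visible:

```lsl
// Sketch: a one-prim particle sign that always faces whoever looks at it.
// SIGN_TEXTURE is a hypothetical placeholder UUID -- use your own texture key.
key SIGN_TEXTURE = "00000000-0000-0000-0000-000000000000";

default
{
    state_entry()
    {
        llParticleSystem([
            PSYS_SRC_PATTERN, PSYS_SRC_PATTERN_DROP,  // emit in place, no velocity
            PSYS_SRC_TEXTURE, SIGN_TEXTURE,
            PSYS_PART_START_SCALE, <1.0, 1.0, 0.0>,   // sign size in meters
            PSYS_PART_MAX_AGE, 2.0,                   // each particle lives 2 s...
            PSYS_SRC_BURST_RATE, 1.0,                 // ...but one is emitted every 1 s,
            PSYS_SRC_BURST_PART_COUNT, 1,             // so a sign is always on screen
            PSYS_PART_FLAGS, PSYS_PART_EMISSIVE_MASK  // full-bright so it reads clearly
        ]);
    }
}
```

Because the viewer draws every particle facing its own camera, each avatar walking past would see the sign facing them, with no per-avatar scripting at all.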


By 'Jiggy Pictures', are you referring to the images that shift when viewed from different angles? The old, very poor-quality style was done with little stripes built into the outer layer so each eye would see a different view.

My ultimate goal is to simulate a hologram that shifts as you walk by.  It sounds more like an animated texture that shifts as your position moves.  The problem is that I don’t want it to track one single user like the closest avatar.  I want this magic wall to change colors as people move by it.

For instance, an object may look blue from far away, but it may be red up close.  It will change colors as you approach it, but I want that for everyone and not just the closest person.  I also want to use textures to simulate the hologram instead of simple color properties.

I don’t know how to script an object so that all users see something different, or one that interacts with each avatar in a unique way at the same time.


A "sign" that always faces the viewer is almost certainly done with particles. Somewhere in my inventory I have an "NPC Generator": a one-prim scripted object that displays above itself a single particle bearing an avatar-sized flat image of an avatar. For example, I could use that to display a castle guard on the far side of a room. The illusion fails, though, if the viewer isn't on the same level, because the particle continues to display as if perpendicular to the viewer's line of sight. So viewed from the balcony over the castle wall, that guard seems to be lying on his back!

Displaying a "hologram" in SL is easy. Just make the thing out of normal prims, sculpties and/or mesh, and set it to be phantom and partially transparent (and monochrome toned, if it is a monochrome hologram you are simulating).
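A sketch of that phantom, half-transparent setup in LSL (the cyan tint and 50% alpha here are arbitrary choices for a "monochrome hologram" look):

```lsl
// Sketch: turn the prim this script sits in into a "hologram" --
// phantom, half-transparent, tinted, and full-bright so it seems to glow.
default
{
    state_entry()
    {
        llSetStatus(STATUS_PHANTOM, TRUE);  // avatars pass through it
        llSetPrimitiveParams([
            PRIM_COLOR, ALL_SIDES, <0.4, 1.0, 1.0>, 0.5,  // cyan tint, 50% alpha
            PRIM_FULLBRIGHT, ALL_SIDES, TRUE              // ignore scene lighting
        ]);
    }
}
```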


Making an image display something different for each user would be harder, and I am not sure how to approach that. Things like a sphere that seems a different color to different users when viewed at the same time from different angles may be a texturing trick. For example, I can use multiple flat planes with an alpha texture on them, and what each user will see will be, generally, the texture on the surface that is most perpendicular to their line of sight. Prim trees made of three or four intersecting planes use this trick to look three-dimensional. There's no reason those images have to be the same, though. In fact, a good prim tree will have the image mirror-reversed on the opposite side of each plane, and each plane would be an image of that tree from that angle of view. I've made a few very nice prim trees by modelling them in a 3D app like Bryce, which can make procedural, fractal-based models of trees, and rendering the same tree from several angles to make the images for my SL prim tree.


Particles sound like a great method for example 1.  It is interesting that they always face the viewer.  I am looking for that kind of functionality in this example, but it does not work for the others.  I can understand how you would simulate something looking at you no matter where you are, and if particles act this way for each viewer, then maybe what I am looking for is actually built into the viewer software instead of the virtual world itself.

I really have to wonder what kind of programming it takes for an object like a particle to look at each viewer at the same time.  That makes me think the particles are rendered on the client side.  I read something similar when talking about a heads up display (HUD).  The HUD always looks directly at the viewer.

To my knowledge, HUDs are static for a specific user.  I know that they can be manipulated, but I mean that the HUD isn’t part of the virtual world like a sign would be.  The HUD is part of a layer in the viewer.

I also do not think HUDs are the way to go, because these are loaded by users, so any avatar just browsing by would not see the hologram.  HUDs are worn, so certain people would have them, but not passers-by.


I know that you could make a 3D object look holographic, but I want to make a hologram.  For instance I want a texture where the eyes are always watching you, but it’s a flat picture on a wall in a creepy mansion.  As you move from side to side, the eyes would track your position.  I want every observer to think they are being watched.  Of course, that is another example, but very similar to what I need.

I can understand how a particle with a continuous look-at function would misrepresent an object in a world where avatars can fly overhead all the time.  At some point in my reasoning, I thought that an object could track avatar positions and rotate only on the ground plane to follow movements.  This is unlike the particles and would require scripting.
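That ground-plane tracking could be sketched in LSL like this (the 20 m range and 1 s scan rate are arbitrary; note this is the server-side approach, so it follows only the nearest detected avatar and every viewer sees the same orientation):

```lsl
// Sketch: rotate this prim around its vertical axis only, to face the
// nearest avatar within range. Unlike particles, this happens in-world,
// so all viewers see the same rotation.
default
{
    state_entry()
    {
        // scan for avatars: 20 m range, full sphere, once per second
        llSensorRepeat("", NULL_KEY, AGENT, 20.0, PI, 1.0);
    }

    sensor(integer num_detected)
    {
        vector offset = llDetectedPos(0) - llGetPos(); // nearest avatar first
        offset.z = 0.0;                                // ignore height: yaw only
        float yaw = llAtan2(offset.y, offset.x);
        llSetRot(llEuler2Rot(<0.0, 0.0, yaw>));        // rotate on ground plane
    }
}
```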


I have seen plants displayed in slices, so you see different slices as you move around.  That is also a very similar example.  I want that effect with a single object, and yes, it is probably a texture trick: I want a texture to change based on the position of an avatar.  The whole idea of tracking each unique position for all viewers is the challenge.  I understand the process for making a tree in SL, but that isn’t really what I want to do.  The level of detail and precision of my original idea have nothing to do with making a plant you can walk around.  I want a static plane that can shift based on position for all viewers.


This brings up an interesting point.  At some point in my early experiences with SL, I flew into a tree house while exploring.  Imagine my surprise when a laser started tracking my position, and once it had a lock on me for 20 seconds, it teleported me out of there.  That is very awesome.  That is the kind of challenge I am describing, and I am new to this.  I want to test whether that tracking system can track two or more avatars at the same time.  I believe that if two separate viewers see laser beams pointing at both avatars at the same time, then we are part of the way there with that design.  The next challenge is still to find out how both viewers can show different color properties at the same time.


I think you answered your own question. A HUD that each user wears, displaying your hologram in the scene in an almost augmented-reality way, would allow control over each user's view. I haven't seen AR used in SL, and there may be some limits I'm not aware of. But it sure sounds neat!


Face2edge:

(1) Particles are, indeed, rendered client-side. (Well, technically, everything is rendered client-side, but you know what I mean :) )  So there's no special programming involved on your side, as the object creator and scripter, to make the particles face two or more AVs in different physical positions simultaneously; every AV within sighting distance of the generated particles will automatically see these particles as facing their own particular viewpoint, no matter where it (and they) happen to be in relation to each other.

(2) As far as the tracking laser goes -- that, too, can be done with particles; a particle chain can be drawn between any two objects, and avatars count as "objects" in this scenario. :)  However, as far as I know, each prim can only emit one particle chain at a time, so if you want your laser to be able to target multiple avatars simultaneously, you'll need to have one prim for each AV you want to target, then work out some way of coordinating them so that when an AV comes within range, the coordinating script finds the first "dormant" emitter in the set and activates it, then deactivates it again when the AV moves out of range.

Not impossible, but it will involve some serious LSL scripting work to pull off. 
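A single-target emitter along those lines might be sketched like this (range, colors, and timings are guesses to tune; the coordination script that assigns one emitter per avatar is omitted):

```lsl
// Sketch: one emitter prim drawing a particle "laser" beam to one
// scanned avatar. For multiple simultaneous targets you would need one
// emitter prim per avatar, coordinated as described above.
default
{
    state_entry()
    {
        llSensorRepeat("", NULL_KEY, AGENT, 20.0, PI, 2.0); // scan every 2 s
    }

    sensor(integer num_detected)
    {
        llParticleSystem([
            // stream the particles toward a target object -- an avatar counts
            PSYS_PART_FLAGS, PSYS_PART_TARGET_POS_MASK
                           | PSYS_PART_FOLLOW_VELOCITY_MASK,
            PSYS_SRC_TARGET_KEY, llDetectedKey(0),    // beam ends at this avatar
            PSYS_PART_START_COLOR, <1.0, 0.0, 0.0>,   // red "laser"
            PSYS_PART_START_SCALE, <0.06, 0.06, 0.0>, // thin beam segments
            PSYS_SRC_BURST_RATE, 0.01,                // dense, continuous stream
            PSYS_PART_MAX_AGE, 0.6
        ]);
    }

    no_sensor()
    {
        llParticleSystem([]); // nobody in range: switch the beam off
    }
}
```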


HUDs exist on the top layer of the viewer and they always face directly at the viewer, right?  How would I have an object inside the virtual world that can interact with the HUD?  Particles also face the viewer, but that is not similar to the other examples where you have a texture that changes as you move by it.


I was surprised by the functionality of the laser tracking security system.  I have not found it again to test it with two avatars at the same time, to see if it tracks multiple avatars like you discussed.  Ultimately, I want a texture to track avatar positions, which is simple enough, but I want it to track everyone within range.  Your thoughts have given me hope, but that approach would mean creating a multitude of sensors to track, store, and update multiple avatars at the same time.  That would be tedious at best for an entire room of people.  I want a hall of holographic pictures, and I don’t want to track each person individually.

If the particles exist in the virtual world but face the viewer by default, there must be a way to have an object that can be programmed to follow all viewers like the particles do.  Maybe this is a feature request.  I want the power of a particle in a regular object, so absolutely any 3D object can look at the viewer, and the same functionality could be used to get the position of every avatar and look at each of them in their respective viewers, just like the particles.

That would be neat.  I have seen 3D objects that simulate a moving hologram, but I want a simple animated texture to follow everyone!  Just as a particle can face each viewer, I want the texture to move with each viewer.


HUDs always face the viewer, yes.

An in-world object can interact with the HUD in the same way as with any other attachment or object -- by sending messages back and forth between a script in the HUD and one in the in-world object, using llRegionSay().  By picking an appropriate channel number (a large negative number that isn't easily guessed, such as -99650, is usually best since it decreases the odds anything else will be talking on that channel at the same time), the HUD and in-world object can send messages to each other describing their current state or asking each other to take actions.

This, too, will require more than a bit of LSL scripting work, though. :) 
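A minimal sketch of the in-world half of that handshake. The channel number and the "ping"/"pong" messages are arbitrary choices; the HUD side would simply call llRegionSay on the same channel, for instance from a touch_start handler:

```lsl
// Sketch: in-world object listening for a HUD on a shared private channel.
// CHANNEL is an arbitrary hard-to-guess negative number (assumption) that
// both scripts must agree on.
integer CHANNEL = -99650;

default
{
    state_entry()
    {
        llListen(CHANNEL, "", NULL_KEY, ""); // hear anything on the channel
    }

    listen(integer chan, string name, key id, string msg)
    {
        if (msg == "ping")
            llRegionSay(CHANNEL, "pong"); // reply so the HUD knows we're here
    }
}
```

In practice you would also want to filter the listen by owner or by message format, so strangers on the same channel can't drive your object.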


I guess I'm a bit vague still on what you're trying to accomplish, here.  Are you saying you want something that physically follows people around the sim?

I have seen items that will do that; you walk into a sim and a faerie, or a ghost, or something like that will start following you around for a certain amount of time or distance.  I've not tried to make one myself, but I suspect there's a "primary" object that senses when a new avatar comes into range, then temp-rezzes the player-follower object and hands off the newly-arrived avatar's key to a script inside the player-follower which causes it to, well, follow the player around. :)  To get your hologram effect, simply make the actual object 100% transparent and put the particle-emitter script inside it as described previously; then the "hologram" particle it emits will always face whoever is looking at it.

The only downside to this approach is that temp-rezzed objects will automatically "die" and disappear from the sim after a certain period of time, usually around 60~70 seconds.  You could also rezz the player-followers as regular objects, but you'd want to be very careful to include a mechanism in the follower's script to make it die on its own as soon as the player it's chosen to follow has moved out of range, or if it attempts to cross a boundary into another sim, or after a fixed period of time has elapsed, so you don't end up with heaps of abandoned objects floating around.
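The rezzer half of that player-follower setup might be sketched like this ("follower" is a hypothetical object name that must exist in the prim's inventory, and the channel number is an arbitrary assumption; bookkeeping to avoid rezzing duplicate followers for the same avatar is omitted):

```lsl
// Sketch: "primary" object that senses a newly arrived avatar, rezzes a
// follower object, and hands it the avatar's key on a private channel.
integer CHANNEL = -48210; // arbitrary private channel (assumption)
key gTarget;              // avatar most recently detected

default
{
    state_entry()
    {
        llSensorRepeat("", NULL_KEY, AGENT, 15.0, PI, 5.0); // scan every 5 s
    }

    sensor(integer num_detected)
    {
        gTarget = llDetectedKey(0); // remember whom to follow
        llRezObject("follower", llGetPos() + <0.0, 0.0, 1.0>,
                    ZERO_VECTOR, ZERO_ROTATION, 0);
    }

    object_rez(key id)
    {
        // tell the freshly rezzed follower whose key to track
        llRegionSay(CHANNEL, (string)gTarget);
    }
}
```

The follower's own script would listen on the same channel, store the key, and include the self-destruct conditions described above.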


I was playing a game to see how particles and HUDs are used so I have an idea of how the virtual world can communicate with the HUDs.  I have also read articles about avatars being followed from sim to sim.  They are either tracking the worn objects or the avatars directly.  That is great, but also very common.  The LookAt function can look at whatever you tell it to.  I understand that proximity can be used so an object will look at the closest avatar within range, but I want all of them to be looked at with the same object.

My next question is about particles again.  Do you have to spawn many particles to simulate them looking at each viewer, or can one particle actually be rendered so as to look at each viewer?  Here is what I mean: if I have one particle, can that particle track multiple people so that each viewer sees the particle as facing them?  If so, could I have an object with the same function, so that one object can track every avatar?  Can an object track the single particle’s orientation so that the object can show different properties to each viewer?  For instance, can I make my 3D object not only track each avatar but also show different rotation values to each viewer?  Maybe the rotation of a particle is done in some way other than local or world coordinates.  Perhaps the particle uses a reference rotation to always point at every viewer instead of pointing at a location inside the actual virtual world.

I really need to find out how the code behind particles works, because it sounds like I could use that functionality somehow to simulate this feature.  Imagine walking into a room where a statue is looking at you and tracks you around the room.  I also want each unique avatar in that room to see the same view at the same time; each person would feel they were being watched.  Even if I could do this by creating a particle for each avatar and tracking multiple positions, it would be extremely code- and processor-intensive.  Maybe this idea is as powerful, and as hard on resources, as a high-polygon structure would be.


OK, let me see if I can make this a little more simple. :)

You do not have to do anything to make a particle always appear to face the viewer.  There is no "tracking" of avatars involved in having particles face whoever is looking at them.

Why?  Because particles aren't real objects, they do not actually "exist" within the sim, and the LSL script inside a particle-emitting object does not actually rezz and orient the particles in-world.

Particles only exist in the viewer client.  The sim just tells the client that "this prim is a particle emitter, and here are the emission characteristics for it." (size, rate, color, texture, etc.)  It is the viewer client which creates the particles in its own local memory and keeps track of them, and those particles will always be drawn so that they face directly towards whatever particular viewpoint the client's camera happens to be in.  Your particle-emitting prim will never know, nor will it need to care, how many people are looking at the particles or where they are positioned in relation to the emitter.  The viewer client will always, always, ALWAYS draw the particles to be visible directly "in front of" the player's camera.

Does that clarify it? :)

 


That is simpler, but now I have more questions!  I did some thinking about how particles are rendered and came to many of the same conclusions.

I want my object to look at the avatars in much the same way as a particle always looks at the viewer.  The key here is that I want each viewer to see a unique view of the object.  I understand that this may be client-side rendering.  Although there may be no tracking involved, the particle emitter must have some level of communication with the client-side renderer, because the particles can be controlled like any other object.  That communication may run from emitter to client, but why can’t it be reversed, so the emitter is told to do something by the client?  In my reasoning, this reminds me of how a HUD communicates with the virtual world.  For instance, an avatar wears the HUD, but it resides and is controlled in the viewer and doesn’t really exist in the virtual world.  Sound familiar?

So, how can I get an object in the virtual world to use client-side rendering in order to look at each viewer’s avatar?  This is my current question.  If communication were allowed between a ghost particle, the viewer, and another object, would you be able to track the orientation of the particle and feed that information back into the orientation of an object inside the world?  You would still have to use client-side rendering so each viewer sees a unique view of the object’s orientation.  So, the next question is about that point: can you use client-side rendering on objects other than HUDs and particles?  If you could enable client-side rendering at will, could you make higher-level-of-detail objects that render outside of the Second Life system?  Could this be turned on for an entire sim so all models are hardware accelerated?

I understand that some of what I am trying to accomplish may be feature requests, but how can I do any of this now?  Would I have to be an inside developer to figure it out?  



face2edge wrote:

So, how can I get an object in the virtual world to use client-side rendering in order to look at each viewer’s avatar?  This is my current question.  If communication were allowed between a ghost particle, the viewer, and another object, would you be able to track the orientation of the particle and feed that information back into the orientation of an object inside the world?  You would still have to use client-side rendering so each viewer sees a unique view of the object’s orientation.  So, the next question is about that point: can you use client-side rendering on objects other than HUDs and particles?  If you could enable client-side rendering at will, could you make higher-level-of-detail objects that render outside of the Second Life system?  Could this be turned on for an entire sim so all models are hardware accelerated?

First, everything in SL is rendered in the client. Second, what you're asking for is the ability to treat 3D objects as if they were particles, which is a bit silly considering particles were invented to do things you couldn't do with 3D objects.

Particles were created to simulate fuzzy objects that don't have a solid shape and are thus difficult or impossible to create with 3D objects. Another way of putting it is that 3D objects are used for solid matter, whereas particles are used for gases and fluids, or things that behave like them. Originally that meant particulate clouds such as smoke, water mist, rain/sand/dust clouds, fire, sparks, etc. More advanced particle systems, which SL does not have, can simulate things like hair, fur, grass and such. Particles used in conjunction with a fluid simulation can also be used to mimic flowing water.

It's best to think of a particle as a flat sheet of paper that always faces the camera, because that's what it is. But just because it's a piece of paper doesn't mean you can't trick people into thinking it's something else. I suggest you spend some time playing around with sl's particle system to see what it can do and how it reacts to the various settings. This will give you a much better idea of what you can and can't do with them and will effectively answer your question for you.

 


Of course the virtual world is rendered by the client, and I know particles are very special.  The only reason I am interested in particles at the moment is that I want an object to always look at the avatar in each unique viewer at the same time.  When I walk by, the object tracks my movement, but it also appears to be doing the same for everyone else at the same time.  In other words, the object will be watching me, just as a particle always faces each viewer, irrespective of whether I see it or not.  Maybe I do have to create a multitude of particles and have each one follow one avatar at a time, but particles look at the viewer, not the avatar.  HUDs also look at the viewer and not the avatar.

I want to use the functionality of a particle to always look at an avatar so each viewer sees an object watching or tracking them around a room.  I don’t want a particle system to do this.  I want to code an object to follow each avatar at the same time because this is something I have not seen before.  I can make an object follow whatever I want, but those are regular objects.

Somehow, I want to use the particle example to have an object appear the same to each viewer simultaneously instead of changing when examined.  This is tough to conceptualize in a virtual environment, because I think I really want an object to have the properties of a particle.  One main difference is that I want the object to follow an avatar like a particle follows the viewer.

It does seem that particles get some extra processing on the client side, even though it is all technically rendered there.  I know how convoluted it sounds.  If a single particle appears face-forward to each viewer, then I want an object to be processed the same way, with the exception of where it is pointing, so the single particle would face the avatar instead of the viewer.

Perhaps what I am suggesting would be the birth of a new type of object.  Imagine a particle that allows for an attribute change like changing which object to look at instead of the viewer.  I wonder what this would actually do or why we would want that.  I guess it would be a static particle that would not die away unless it was killed.  A super particle might be a better description.


I would like to make an experiment where an object rotates as a particle does.  I don’t know if that level of communication is allowed between the particles and an object.  The emitter may not even care because the particles are rendered with respect to the viewer and by the client.  It’s almost like there are two levels of client rendering and I want an object to tap into the orientation of the particles as rendered.  I know this is incredibly difficult if it is even possible and that one would probably need to be a developer to do this.  I don’t know of any graphics system that can do this, but I am not a graphics developer, either.


Why do you keep insisting that it has to be an object that follows the avatar? Go and try to build what you want with particles first before you ask for a new and rather weird object type that would have very limited uses (i.e. almost none).


I want to use particle like behavior.  For instance, I want something like a particle to always point at the avatar instead of facing the viewer.  I know how to make an object look at one avatar, but I like the way particles always look at the viewer because it is not loaded in the virtual world.  That’s why I think the function I am after would be a hybrid.

Is it the physics engine that generates the particles?  I want a local object or sub particle to behave the same way except to look at whatever I select instead of the viewer.  I would like to see communication between particles and the virtual world.  Can you actually create a particle without going through the graphics engine that renders them outside of the virtual world?  Is that what I mean?

I know I am beating a dead horse if this takes a specific kind of object type, and I know it’s a very specific purpose, but I can think of many areas where I could use something like this.  It’s way outside the box; I think that’s why the need for it is so hard to understand right now.  Particles have been around for years and have only steadily advanced.  I don’t feel their potential has been fully realized.

Maybe this is beyond the scope of a system like Second Life.  Perhaps I am talking more about a graphics system that creates movies in a professional setting.


Yes, you can do something like that with some extensive modifications to both the viewer and the sim; the point is, why do we need that? Aside from your one example, what exactly would use this? We're talking about tens of thousands of dollars worth of development work for your pet project. Why can't you build this thing with the tools that are already available?

And I'm sorry for being blunt, but you need to learn a lot more about how this stuff works so you can start asking the right questions. Some of the things you're asking or presuming are just way out there; it makes it very hard to give you a proper answer, because I'd have to type out several pages to explain the necessary background info.

