About face2edge
  1. I don’t even see a way to create this effect using textures. Bump maps may simulate a similar effect, but only if I can load a custom bump. I would have to create a bump that gives a higher-dimensional feel to a flat surface, so it would appear to move as you walked by. I did some brief testing and found that SL has predefined bumps, and those are static. For instance, the brick bumps don’t seem to shift as you walk by; the bump appears to be baked together with your texture and rendered accordingly. I want it to move like a dynamic reflection. I like how well water is rendered, but why hasn’t SL started using bumps that way? I did read something about changing from textures to materials. That would be interesting. Perhaps I could do my effect then?
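Whether a surface effect shifts as you walk by comes down to whether the shading is view-dependent. A baked bump modulates diffuse lighting, which ignores the camera, while reflections and specular highlights move with it. A minimal sketch of that distinction in plain Python, nothing SL-specific (the positions, light direction, and use of the Blinn–Phong model here are all illustrative assumptions):

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def blinn_phong_specular(normal, to_light, to_viewer, shininess=32):
    # Half vector between light and view directions; the specular term
    # peaks when the surface normal lines up with it, so it changes as
    # the viewer moves even though the surface and light stay fixed.
    half = normalize(tuple(l + v for l, v in zip(to_light, to_viewer)))
    return max(0.0, dot(normal, half)) ** shininess

normal = (0.0, 0.0, 1.0)              # flat wall facing +Z
to_light = normalize((0.0, 1.0, 1.0))

# Same surface, same light -- only the viewer moves.
head_on = blinn_phong_specular(normal, to_light, normalize((0.0, 0.0, 1.0)))
oblique = blinn_phong_specular(normal, to_light, normalize((1.0, 0.0, 0.2)))

print(head_on > oblique)  # the highlight brightens and dims as you walk by
```

A diffuse-only term, `dot(normal, to_light)`, has no viewer input at all, which is why a baked bump looks frozen from every angle.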
  2. I understand the effect will be purely local, but that is what I am going for. I know how to generate huge polygonal models, but that would be very slow and create lag when moving around my area. I suppose I could make a very restricted area that would not load any external objects. For instance, the inside of a building with no windows should limit rendering to the room itself; I know how slowly the rest of the world renders when I am outside a building. The effect I am going for needs to be unique for each viewer, so I know that client-side rendering would be useful. I have been reading that this would allow the OS to communicate with in-world objects through the viewer, but that seems very experimental right now. It would be nice to see an extension that controls SL objects from outside the viewer. I think that is how particles are rendered: the emitter object is in world, and it tells the graphics card what to render, so the particles don’t exist in the virtual world. So, the particles are client-side, but I would need a new type to manipulate them the way I want. I don’t have a problem learning how to get deep into scripting with SL, but I also see how different viewers are capable of changing how they interact with SL. I would love to see a viewer that allows for OS- or graphics-specific languages. Perhaps I can find a viewer with these functions, and I know it may only work for me if it depends on a script or extension. As it is now, I am going to start working on this effect to render the way I want for no one other than myself. I want my object to key in on my avatar only, so it interacts just with me. At a later time, I can think more about tracking everyone and doing some client-side work or beyond.
  3. I would imagine that I have to get all the way to Wonderland in order to learn what I must to accomplish my task. I feel like I have to mind-meld with a Linden to prove my point. I have been doing some reading about client-side scripting; that seems like the path I will have to take. I found the Particle Laboratory to be very interesting, and it showed me all the flavors of particles and how to manipulate them, but the core function I require seems to be wholly based in client-side scripting, so that an object’s rotation appears unique to each client. For now, I have chosen to move forward with learning the scripting, and I will have it key on my movements. At some point, I would like to add other features, but working out the particulars of texture animations keyed on avatar positions will be a challenge for now.
  4. Thanks for all the consideration, everyone! It's time for deeper research...
  5. That's exactly what I thought. I know this is way out there. I can’t think of how I want to do it, but I also realize I don’t know what the system is capable of. So far, I don’t see how my experiment is possible. That’s what I do: I pick a challenge that takes me through learning so many little pieces. You seem to be well informed. Do you have some heavy resources that can clutter my mind and help me figure this mess out? I understand most of the basic tutorials, but I have to get into how particles are used, and maybe try another viewer, since the SL viewer is limited for tweaking. Any help is greatly appreciated!
  6. I want to use particle-like behavior. For instance, I want something like a particle that always points at the avatar instead of facing the viewer. I know how to make an object look at one avatar, but I like the way particles always look at the viewer because they are not loaded in the virtual world. That’s why I think the function I am after would be a hybrid. Is it the physics engine that generates the particles? I want a local object or sub-particle to behave the same way, except looking at whatever I select instead of the viewer. I would like to see communication between particles and the virtual world. Can you actually create a particle without going through the graphics engine that renders them outside the virtual world? Is that what I mean? I know I am beating a dead horse if this takes a specific kind of object type, and I know it’s a very specific purpose, but I can think of many areas to use it if I could simulate something like this. It’s way outside the box; I think that’s why it is so hard to understand the need for it right now. Particles have been around for years with only incremental technological advancements, and I don’t feel their potential has been fully realized. Maybe this is beyond the scope of a system like Second Life. Perhaps I am talking more about a graphics system that creates movies in a professional setting.
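The "particle faces the viewer" behavior is just a look-at recomputed every frame against the camera position; pointing at an avatar instead is the same math with a different target. A rough Python sketch of that geometry (all positions are invented; SL does the billboard version inside the renderer, not in a script):

```python
import math

def facing_angles(source, target):
    """Yaw and pitch (radians) that point a flat quad at `target`.
    Billboarding: target = the camera position, recomputed per frame,
    per viewer. The 'watcher' effect: target = an avatar's position."""
    dx, dy, dz = (t - s for t, s in zip(target, source))
    yaw = math.atan2(dy, dx)
    pitch = math.atan2(dz, math.hypot(dx, dy))
    return yaw, pitch

quad = (0.0, 0.0, 2.0)
camera = (10.0, 0.0, 2.0)    # hypothetical viewer camera
avatar = (0.0, -5.0, 0.0)    # hypothetical avatar position

print(facing_angles(quad, camera))  # billboard: face the camera
print(facing_angles(quad, avatar))  # the asked-for hybrid: face the avatar
```

The hybrid is cheap per target; what makes it unusual is *where* it would run, since only the client knows its own camera, while a script only knows in-world positions.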
  7. I would like to make an experiment where an object rotates the way a particle does. I don’t know if that level of communication is allowed between the particles and an object. The emitter may not even care, because the particles are rendered with respect to the viewer and by the client. It’s almost like there are two levels of client rendering, and I want an object to tap into the orientation of the particles as rendered. I know this is incredibly difficult, if it is even possible, and one would probably need to be a developer to do it. I don’t know of any graphics system that can do this, but I am not a graphics developer, either.
  8. Of course the virtual world is rendered by the client, and I know particles are very special. The only reason I am interested in particles at the moment is that I want an object to always look at the avatar in each unique viewer at the same time. When I walk by, the object tracks my movement, but it also appears to be doing the same for others simultaneously. In other words, the object will be watching me the way a particle always faces each viewer, whether I see it or not. Maybe I do have to create a multitude of particles and have each one follow one avatar at a time, but particles look at the viewer, not the avatar. HUDs also look at the viewer and not the avatar. I want to use the functionality of a particle to always look at an avatar, so each viewer sees an object watching or tracking them around a room. I don’t want a particle system to do this; I want to code an object to follow each avatar at the same time, because this is something I have not seen before. I can make an object follow whatever I want, but those are regular objects. Somehow, I want to use the particle example to have an object appear the same to each viewer simultaneously instead of changing when examined. This is tough to conceptualize in a virtual environment, because I think I really want an object to have the properties of a particle. One main difference is that I want the object to follow an avatar like a particle follows the viewer. It does seem that particles have some extra processing on the client side, even though everything is technically rendered there. I know how convoluted it sounds. If a single particle appears face-forward to each viewer, then I want an object processed the same way, with the exception of where it is pointing, so the single particle would face the avatar instead of the viewer. Perhaps what I am suggesting would be the birth of a new type of object.
Imagine a particle that allows for an attribute change, like changing which object to look at instead of the viewer. I wonder what this would actually do, or why we would want that. I guess it would be a static particle that would not die away unless it was killed. A super-particle might be a better description.
  9. That is simpler, but now I have more questions! I did some thinking about how particles are rendered and came to many of the same conclusions. I want my object to look at avatars in much the same way as a particle always looks at the viewer. The key here is that I want each viewer to see a unique view of the object. I understand that this may be client-side rendering. Although there may be no tracking involved, the particle emitter must be in some level of communication with the client-side rendering, because the particles can be controlled like any other object. That communication may run from emitter to client, but why can’t it be reversed, so the emitter is told to do something by the client? In my reasoning, this reminds me of how a HUD communicates with the virtual world. For instance, an avatar wears the HUD, but it resides and is controlled in the viewer and doesn’t really exist in the virtual world. Sound familiar? So, how can I get an object in the virtual world to use client-side rendering in order to look at each viewer’s avatar? This is my current question. If communication were allowed between a ghost particle, the viewer, and another object, would you be able to track the orientation of the particle and feed that information back into the orientation of an object inside the world? You would still have to use client-side rendering so each viewer sees a unique view of the object’s orientation. So, the next question is about that point. Can you use client-side rendering on objects other than HUDs and particles? If you could enable client-side rendering at will, could you make higher-level-of-detail objects render outside of the Second Life system? Can this be turned on for an entire sim so all models are hardware accelerated? I understand that some of what I am trying to accomplish may be feature requests, but how can I do any of this now? Would I have to be an inside developer to figure it out?
  10. I was playing a game to see how particles and HUDs are used, so I have an idea of how the virtual world can communicate with HUDs. I have also read articles about avatars being followed from sim to sim. They are either tracking the worn objects or the avatars directly. That is great, but also very common. The LookAt function can look at whatever you tell it to. I understand that proximity can be used so an object will look at the closest avatar within range, but I want all of them to be looked at by the same object. My next question is about particles again. Do you have to spawn many particles to simulate them looking at each viewer, or can one particle actually be rendered so as to look at each viewer? Here is what I mean. If I have one particle, can that particle track multiple people so that each viewer sees the particle as facing them? If that is true, could I have an object with the same function, so that one object can track every avatar? Can an object track the single particle’s orientation so that the object can show different properties to each viewer? For instance, can I make my 3D object not only track each avatar, but also show different rotation values to each viewer? Maybe the rotation of a particle is done in some way other than local or world. Perhaps the particle uses a reference rotation to always point at every viewer instead of pointing at a point inside the actual virtual world. I really need to find out how to break down the code that particles use, because it sounds like I could use that functionality somehow to simulate this feature. Imagine walking into a room where a statue is looking at you and tracks you around the room. I also want each unique avatar in that room to see the same view at the same time. Each person would feel they were being watched. Even if I could do this by creating a particle for each avatar and tracking multiple positions, it would be extremely code- and processor-intensive.
Maybe this idea is very powerful and hard on resources, like a high-polygon structure would be.
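Doing this server-side means one script recomputing a facing angle for every tracked avatar on every update tick, which is where the processing cost comes from. A toy Python sketch of that loop (the names and positions are invented; in SL a script would gather positions from a sensor scan instead):

```python
import math

# Hypothetical in-world positions; a real script would refresh these
# from a sensor scan each tick.
statue = (0.0, 0.0, 0.0)
avatars = {
    "ava": (4.0, 4.0, 0.0),
    "ben": (-3.0, 1.0, 0.0),
    "cam": (0.0, -6.0, 0.0),
}

def per_avatar_yaw(source, positions):
    """One pass over every tracked avatar: O(N) work per update tick.
    Note the catch: the server computes N different angles, but a single
    in-world object can only *show* one of them; making each client
    apply only its own entry is the missing client-side piece."""
    return {
        name: math.atan2(p[1] - source[1], p[0] - source[0])
        for name, p in positions.items()
    }

angles = per_avatar_yaw(statue, avatars)
for name, yaw in angles.items():
    print(f"{name}: {math.degrees(yaw):.1f} deg")
```

The math scales linearly and is cheap; the bottleneck in practice would be sensor polling and object updates, not the trigonometry.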
  11. I was surprised by the functionality of the laser-tracking security system. I have not found it again to test with two avatars at the same time, to see if it tracks multiple avatars like you discussed. Ultimately, I want a texture to track avatar positions, which is simple enough, but I want it to track everyone within range. Your thoughts have given me hope, but that would mean creating a multitude of sensors to track, store, and update multiple avatars at the same time. That would be tedious at best for an entire room of people. I want a hall of holographic pictures, and I don’t want to track each person. If the particles exist in the virtual world but face the viewer by default, there must be a way to have an object that can be programmed to follow all viewers like the particles do. Maybe this is a feature request. I want the power of a particle in a regular object, so absolutely any 3D object can look at the viewer, and this same functionality can be used to get the position of every avatar and look at each of them in their respective viewers, just like the particles. That would be neat. I have seen 3D objects that simulate looking at a moving hologram, but I want a simple animated texture to follow everyone! Just as the particle can face each viewer, I want the texture to move with each viewer.
  12. HUDs exist on the top layer of the viewer and they always face directly at the viewer, right? How would I have an object inside the virtual world that can interact with the HUD? Particles also face the viewer, but that is not similar to the other examples where you have a texture that changes as you move by it.
  13. This brings up an interesting point. At some point in my early experiences with SL, I flew into a tree house while exploring. Imagine my surprise when a laser started tracking my position, and once it had a lock on me for 20 seconds, it teleported me out of there. That is very awesome. That is the kind of challenge I am describing, and I am new to this. I want to test whether that tracking system can track two or more avatars at the same time. I believe that if two separate viewers see the laser beams pointing at both avatars simultaneously, then we are part of the way there with that design. The next challenge is still to find out how both viewers can be shown different color properties at the same time.
  14. I have seen plants displayed in slices, so you see different slices as you move around. That is also a very similar example. I want to have that effect with a single object. Yes, it is probably a texture trick: I want a texture to change based on the position of an avatar. The whole idea of tracking each unique position for all viewers is the challenge. I understand the process for making a tree in SL, but that isn’t really what I want to do. The level of detail and precision of my original idea have nothing to do with making a plant you can walk around. I want a static plane that can shift based on position for all viewers.
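The sliced-plant trick amounts to picking one pre-rendered texture frame from the avatar's bearing around the plane. An illustrative Python sketch of that mapping, with made-up coordinates and an assumed 8-frame texture strip:

```python
import math

def slice_index(plane, avatar, num_slices=8):
    """Map the avatar's bearing around `plane` to one of `num_slices`
    pre-rendered texture frames (impostor slices)."""
    bearing = math.atan2(avatar[1] - plane[1], avatar[0] - plane[0])
    # Normalize -pi..pi to 0..1, then scale to a frame index.
    frac = (bearing + math.pi) / (2 * math.pi)
    return int(frac * num_slices) % num_slices

plane = (0.0, 0.0)
print(slice_index(plane, (5.0, 0.0)))   # east of the plane: one slice
print(slice_index(plane, (0.0, 5.0)))   # north of it: a different slice
```

The per-viewer difficulty the post identifies is real: a scripted texture change is a shared object property, so everyone sees the same frame, whereas true impostors swap the frame inside each client's renderer.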
  15. I know that you could make a 3D object look holographic, but I want to make a hologram. For instance, I want a texture where the eyes are always watching you, but it’s a flat picture on a wall in a creepy mansion. As you move from side to side, the eyes would track your position. I want every observer to think they are being watched. Of course, that is another example, but very similar to what I need. I can understand how a particle with a continuous look-at function would misrepresent an object in a world where avatars can fly overhead all the time. At some point in my reasoning, I thought that an object could track avatar position and rotate only on the ground plane to follow movements. This is unlike the particles and would require scripting.
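The ground-plane idea can be sketched as yaw-only tracking: rotate about the vertical axis toward the avatar and ignore height, so someone flying overhead doesn't tip the object backward the way a full look-at would. Illustrative Python with invented positions:

```python
import math

def ground_plane_yaw(obj, avatar):
    """Yaw-only tracking: rotate around the vertical (Z) axis toward
    the avatar, ignoring the height difference entirely."""
    return math.atan2(avatar[1] - obj[1], avatar[0] - obj[0])

obj = (0.0, 0.0, 1.0)
walking = (3.0, 3.0, 1.0)
flying = (3.0, 3.0, 40.0)   # same spot on the ground, far overhead

# Both cases give the same yaw: height is simply dropped.
print(math.degrees(ground_plane_yaw(obj, walking)))
print(math.degrees(ground_plane_yaw(obj, flying)))
```

Dropping the pitch term is exactly what distinguishes this from a particle billboard, which always tilts to face the camera in full 3D.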