Everything posted by Mircea Lobo

  1. I wasn't sure if that's possible, but if it is, this should totally work. It also sounds like OpenSim should have it by now, and llDetectedType is marked as implemented over here, so I won't worry. Of course, there is still a little problem: if I stack multiple crates on top of each other, the event still triggers even though the objects are at their initial position. So I'll probably still need to do some distance comparison, like llVecDist... something along the lines of the sketch below. Otherwise this should solve the problem... thanks.
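     Something like this is what I mean (untested, with a made-up 0.5 m threshold and a one-minute timer, and the actual return-to-home part left out):

     vector homePos;   // where the crate was rezzed

     default
     {
         state_entry()
         {
             homePos = llGetPos();
         }

         collision_start(integer num)
         {
             // only care about avatars and other physical (ACTIVE) objects
             if (!(llDetectedType(0) & (AGENT | ACTIVE))) return;

             // stacked crates still generate collisions at rest, so only arm
             // the reset once the crate has actually been displaced
             if (llVecDist(llGetPos(), homePos) > 0.5)
                 llSetTimerEvent(60.0);
         }

         timer()
         {
             llSetTimerEvent(0.0);
             // the actual return-to-home logic would go here
         }
     }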
  2. Yes I am, and I totally plan to use the NPC function for scripted characters! With bots however, I prefer using primitives... since that gives me more control, and they don't really need to be treated like real avatars (e.g. appear on the minimap).
  3. Bullet is now the default physics engine in OpenSim, and what I use as well. It's working pretty nicely, really. Other than that, I plan to have physical objects on my sim either way... such as crates and other stuff you can push around. So having the bots be physical wouldn't be accidental or unique, and I'd prefer it because it's more realistic. For example, shooting a small floating droid can push it back by a few meters. If I have to go with non-physical however, llSetKeyframedMotion indeed sounds like the best idea. And now that I know llCastRay exists, a function to decide a trajectory would be even easier (see the sketch below)! Thanks for the info.
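     For example, a crude probe like this is what I'd try, assuming OpenSim's llCastRay behaves like the SL one; the 0.5 m offset, the 5 m test distance and the helper name are just my own picks, and nothing here is tested:

     // crude probe: TRUE if nothing solid lies within probeDist meters along dir
     integer directionIsClear(vector dir, float probeDist)
     {
         // start just outside the bot's own hull (0.5 m is a guess at its size)
         vector start = llGetPos() + llVecNorm(dir) * 0.5;
         vector end = start + llVecNorm(dir) * probeDist;
         // one hit is enough to know the path is blocked; terrain counts as a hit too
         list hits = llCastRay(start, end, [RC_MAX_HITS, 1]);
         integer status = llList2Integer(hits, -1); // hit count, or a negative RCERR_* code on failure
         return (status == 0);
     }

     default
     {
         touch_start(integer num)
         {
             // quick test: probe 5 m straight ahead along the prim's local X axis
             vector forward = <1.0, 0.0, 0.0> * llGetRot();
             llOwnerSay("Path ahead clear: " + (string)directionIsClear(forward, 5.0));
         }
     }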
  4. Thanks for the suggestions. Funny, I was never aware of llSetRegionPos, but it sounds like it should solve this. As for comparing position / rotation in a timer event, that can work, but I'd rather have the object detect the last moment it was pushed by something and set the timer event then... roughly like the sketch below. Is it possible for a script to detect that the prim it's in has been pushed by an avatar or another physical object, but ignore collisions with terrain or non-physical prims?
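     Roughly what I have in mind, assuming llSetRegionPos and llDetectedType work the same way in OpenSim as in SL; the one-minute delay is just a placeholder and none of this is tested:

     vector homePos;     // position at rez time
     rotation homeRot;   // rotation at rez time

     default
     {
         state_entry()
         {
             homePos = llGetPos();
             homeRot = llGetRot();
         }

         collision_start(integer num)
         {
             // avatars carry the AGENT flag and physical objects the ACTIVE flag;
             // plain non-physical prims only report PASSIVE, so they fall through
             // (terrain should only trigger the land_collision events anyway)
             if (llDetectedType(0) & (AGENT | ACTIVE))
                 llSetTimerEvent(60.0);   // reset one minute after the last push
         }

         timer()
         {
             llSetTimerEvent(0.0);
             llSetStatus(STATUS_PHYSICS, FALSE); // stop residual motion so the position calls stick
             llSetRegionPos(homePos);            // works anywhere in the region, unlike llSetPos
             llSetRot(homeRot);
             llSetStatus(STATUS_PHYSICS, TRUE);
         }
     }

     Disabling physics before the move is mostly there because I'm not sure the position and rotation calls behave reliably on a prim that is still physical.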
  5. I'm working on a slightly scifi themed sim (OpenSim / OSGrid). Among many things, I wish to add bots that do various things. Scripting those things shouldn't be a problem, but I'm not sure how to script the bot movement itself. I don't need anything too fancy. What I basically want is a physical object that moves in a random direction (choosing a new one every X seconds) and faces the direction of movement; there's a rough outline of the idea below. If possible, I'd like the object to detect collisions and obstacles well enough that it doesn't foolishly push into a wall until a new direction is triggered... though if that's too difficult I can live without it. It should however never try to go outside the region's boundaries. No pathfinding please. OpenSim doesn't have that, and I'm not really a fan of it. llMoveToTarget might be a good candidate, and maybe llApplyImpulse can work too. I doubt llSetKeyframedMotion works for physical objects though. Note that I want the bots to also act as physical objects, so avatars or other physical objects can bump them. So the script must be aware that unforeseen circumstances might change the position and rotation of the bot at any time, and still aim for the same trajectory. Other than that, I'd like the script to be compatible with both air and ground bots, so I can make floating droids as well as wheeled / ground ones. In the case of flying bots, it would be a nice addition if a random direction could be chosen on all axes rather than just horizontally.
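     Here is a rough, untested outline of the movement loop I'm imagining, built around llMoveToTarget and llRotLookAt; all the numbers are placeholders, and it assumes a standard 256 m region:

     float WANDER_INTERVAL = 10.0;  // seconds between picking a new direction
     float WANDER_RANGE = 8.0;      // how far ahead each hop aims, in meters
     integer FLYING = TRUE;         // TRUE: vary Z too (air bots); FALSE: stay roughly level

     default
     {
         state_entry()
         {
             llSetStatus(STATUS_PHYSICS, TRUE);   // the bot stays physical so it can be bumped
             llSetTimerEvent(WANDER_INTERVAL);
         }

         timer()
         {
             // pick a random direction; add a vertical component only for flying bots
             vector dir = <llFrand(2.0) - 1.0, llFrand(2.0) - 1.0, 0.0>;
             if (FLYING) dir.z = llFrand(1.0) - 0.5;
             dir = llVecNorm(dir);

             // always start from the current position, so getting bumped by avatars
             // or other physical objects doesn't break the trajectory
             vector target = llGetPos() + dir * WANDER_RANGE;

             // keep the target inside the region (assumes a 256 m region)
             if (target.x < 2.0) target.x = 2.0;
             if (target.x > 254.0) target.x = 254.0;
             if (target.y < 2.0) target.y = 2.0;
             if (target.y > 254.0) target.y = 254.0;

             // face the direction of movement, then pull toward the target
             llRotLookAt(llRotBetween(<1.0, 0.0, 0.0>, dir), 1.0, 0.4);
             llMoveToTarget(target, 2.0);
         }
     }

     For ground bots, the target's Z could probably be snapped to llGround() instead of keeping the current height, but I haven't tried that.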
  6. I have a region (OpenSim, mind you) on which I plan to add a lot of physical objects... such as crates, trash cans, chairs, or other stuff that can be pushed around. However, I want those objects to re-appear at their original location and rotation after a while, so avatars can't permanently push them god knows where. I've seen this done on another sim, but never got access to the original script myself. Usually I'm good with LSL. In this case however, there are a few things that get in the way. First of all, how do you know when an object was pushed by an avatar or another physical object? If you simply use the collision_start event, it might also detect the floor as an obstacle and run the function unnecessarily. Second, llSetPos has a limit of 10 meters... and although I can override it in OpenSim, I'd rather do things the right way. How would I teleport the object back if it's been pushed more than 10m away, without having to use llSetPos in a loop? I tried Google, but the only term that comes to mind is auto-return... and in SL that refers to parcel auto-return, not returning physical objects to their original position a minute after they've been last pushed. Any ideas?
  7. This is a feature I've badly wanted to see, given that I like immersion and large landscapes. Currently some simulators work around this with a huge landscape mesh spanning 3 x 3 regions... but it's usually obvious that this is a hack, and it makes neighboring sims impossible. My idea was that region and / or parcel owners could attach objects to the sky. In some engines those are known as infinite-distance objects, I believe. Such objects would only appear to avatars present in that region / parcel. They would be rendered behind everything in the world, but in front of the windlight sky. They would move together with the view, with the object's center always located at the camera center. Such objects couldn't cast or receive local light on / from objects in the simulator of course, but would at least be lit by the sky. This could be used to add a grand environment to simulators. In a city region, for instance, people could texture several cubes with bricks and windows and attach the object to the sky, making everyone feel like they're in a huge city. In nature sims it would be easy to add large landscapes with trees, rivers and other details, using an object consisting of just a few sculpt primitives. In a scifi sim you could put custom planets in the sky, or huge spaceships that circle the place (using llTargetOmega). What do you think about this idea? And if other people want it too, is there someone who could attempt coding it in the viewer, or embark on the difficult adventure of convincing LL to look into it?
  8. As everyone can agree, one of the best things about SL is the flexibility with which you can build and design nearly anything. But recently, especially since getting more into voxel engines, I'm starting to find the current concept of primitives a bit old and limiting. You only have a few basic shapes to work with (cube, cylinder, sphere, etc.), each with only a few traits that can be edited. Of course, there are sculpties and mesh too. But those have their own limitations, which is why I only like them when they're really needed. The biggest one is that you can't edit them in-world: you have to create and export them from a 3D modeling program then upload, with no changes possible after that. You also can't script changes to their shape traits... like making the interior of a hollow sphere change size when touched, using llSetPrimitiveParams. And since primitives are procedural geometry computed in realtime, other things like level of detail work in a better way. My idea was to have an interface which would allow you to define basic shapes, as well as settings that influence their geometry (like how current shapes have "hollow" or "taper"). Being able to define them in text (such as XML) would also work nicely. The plan is that the user could specify vertex points, whether the surface between them should be flat or round, whether the faces between them are continuous or can be textured separately, and possibly create traits which can be scripted in LSL and edited in the Build menu. The current primitive types would become default templates, available in the Library inventory or something... but just items within the new system. I know many would say this is too much work for nothing, especially now that there's mesh. Like I said, I believe primitives have advantages and a flexibility that mesh / sculpt do not, and IMO should remain a strong point in Second Life. Being able to write your own simple shapes, rather than just having the classic "cube" and "sphere" and so on, would be a welcome feature that a lot could be done with. I can already imagine pinched spheres, rounded cubes, bulged cylinders, etc., and some of those shapes even being animated :) Would this be possible to add to SL at this stage, in a way that converts old primitives to the new system and leaves existing functionality unchanged?
  9. Although any discussion is welcome, I'd rather this doesn't turn into a debate about which driver is best and should be used. I know most people prefer the proprietary video drivers for gaming (even on Linux), and might have no problems maintaining them or might get better performance. It's everyone's personal choice and experience... I have my own reasons for sticking with the free drivers after trying both. What's important is that the built-in drivers are striving to become better, and some are getting there. I never tried Nouveau since I have ATI (the Radeon driver). But as surprising as it might sound, I can barely notice any lower performance compared to fglrx in any game, especially since last month's update. That includes high-end games (like Xonotic on "Ultra" settings) and older Windows games run under WINE (NFS Carbon actually runs better). In Second Life, performance is absolutely the same, and every feature works perfectly. The only exception is those shader issues, which are likely due to few people testing SL with the free drivers, so neither side notices its problems in relation to the other. Even if it's just to support Linux and its free software (including the drivers), I aim to get such bugs solved.
  10. I dug up more information on when the color outlines occur. As previously mentioned, they appear when the Advanced Lighting Model (deferred pipeline) is enabled. However, two other options must be turned on too: "Local Lights" and "Bump mapping and shiny" (RenderLocalLights + RenderObjectBump). Experimenting with light sources also revealed the in-world trigger: the intersection of multiple light sources. There's no junk if only one local light affects an area, or if no lights are present. But if anything is touched by two or more local lights, the outlines form... with the noise increasing with the number of lights.
  11. Proprietary drivers are crucial on Windows, but very bad on Linux. At least that's my experience; fglrx caused me many problems, and I was glad to get rid of it and return to the OSS ones. Except for some issues like the SL shaders of course. Besides... the free video drivers are catching up to the proprietary ones in this era, and becoming usable for gaming. Performance in all 3D engines (including SL) is almost the same with both drivers here! So the free ones need someone to test them too and report issues. I don't mind being that someone for Radeon and Second Life :matte-motes-bashful-cute-2:
  12. Upgraded from Mesa 9.2.2 to 9.2.3. The issue is still present and unchanged. I found a way to fix the noise caused by Basic Shaders. Setting RenderMaxTextureIndex to a lower value removes the graphical junk. The default value is 16, and the problem is fixed if I set it to 4 or less (any higher causes the corruption). However, this doesn't fix the color outlines when enabling Advanced Lighting Model.
  13. I believe I have the same artifacts described here with Mesa / Gallium on Radeon. I've been looking for ways to fix this for days, and found this topic in the process. See this thread where I also posted some screenshots. I'm glad to hear Firestorm is working on a fix. I tend to prefer other viewers over Firestorm, but it sounds like I'll be going back to it if they fix it first. Hopefully SL and / or the video drivers will fix it soon, whichever is at fault.
  14. This issue addresses people who run Second Life on Linux and use the free video drivers (Mesa / Gallium 3D). I have an ATI card and therefore use the OSS Radeon driver; I don't know if this problem also applies to Nouveau. Important: Before anyone can test this, you need to modify the viewer's shaders to be compatible with Mesa. See this discussion. You need to comment out all #extension lines, and change all double underscores into single underscores (__ to _), in every *.glsl file. This is needed due to another problem, but it's a separate subject. Now onto the actual issue: Shaders cause graphical glitches when enabled. There are primarily two symptoms. The most common and obvious one is corruption showing over some surfaces, consisting of little squares / patches of various colors. The second symptom, which is subtler, is that textures are sometimes swapped: occasionally, the real texture and another texture blend together depending on the angle you look at the surface from. This only appears to happen where the corruption is. The problem also has two stages. The less severe one takes place when you enable Basic Shaders. It causes only a bit of junk on some surfaces and the texture blending. The second stage comes when you also enable Advanced Lighting Model. This adds a new layer of graphical corruption, consisting of patches of a single color outlining various objects. After some testing, I found that this is caused by local lights in combination with bump mapping (turning either off stops the issue). Below are two screenshots. The first one shows the issue when only basic shaders are turned on, and the second shows the additional color outlines when enabling deferred rendering: First of all, I'm curious whether anyone else with a Linux machine and the free (non-proprietary) video drivers (especially Radeon) can confirm this. I'd then like to know if anyone can figure out how to fix it, so hopefully we can bring the issue to a Linden who can include a solution. I'd also like to work around the problem temporarily until either Mesa or SL fix it, but have no idea where in the shaders to look.
  15. Finally, I found a simple way to temporarily fix most of the shaders. I can now enable Basic Shaders followed by Atmospheric Shaders, but Advanced Lighting Model will still not work, seemingly due to the same issue in a form that's not as easy to fix. I would be grateful if someone could use the knowledge I posted up to this point to figure out how to solve that as well. What fixes the primary shaders is commenting out all #extension lines in the glsl files. To make it easy for everyone to apply the fix, I created a bash script that does it automatically. Create an empty file in SecondLife/app_settings/shaders and paste the code below inside, then save and execute it. Make sure to back up your shaders folder first; this will permanently edit your glsl files! EDIT - I was able to fix the remaining shaders today. Their cause was different: double underscores are not allowed in macro names under Mesa. The shader file class1/deferred/fxaaF.glsl uses them in several places. I updated the script below to also correct this, renaming "__" to "_". At this point all shaders can be enabled, but there is graphical corruption which may or may not be related to the sloppy fixes I used here.

     #!/bin/bash
     # this script does a sloppy correction of Second Life shaders for Mesa compatibility
     files=$(find . -type f -name "*.glsl")
     for f in ${files}
     do
         echo "Fixing shader file ${f}"
         # #extension directives must come before any non-preprocessor tokens
         sed -i 's/#extension/\/\/ #extension/g' "${f}"
         # macro names may not contain two consecutive underscores
         sed -i 's/__/_/g' "${f}"
     done
  16. Tried the latest official viewer (3.6.10) on openSUSE 13.1 x64, KDE 4.11.2. The fontconfig issue no longer exists, but the regex one does. As instructed here, temporarily removing ~/.gtkrc* lets the viewer start up. I hope Linden fixes this soon.
  17. Sorry for posting so many times in a row... this thread is pretty inactive apart from myself. I spoke with the teams behind the components involved in the issue (X11, Mesa, Radeon) and they found a problem in the viewer's shaders. Apparently a directive is placed incorrectly, and Mesa isn't meant to accept that. Here is the exact reply I got: "According to the GLSL specification (1.30, page 14), #extension directives must come before any non-preprocessor tokens. These shaders are invalid and we correctly fail to compile them. So, this is a bug in the Second Life viewer. I would try filing a bug with them." I posted the problem on the SL Jira, and hope this is something Linden will get to soon. Here is that report, along with the reports another person and I filed against Mesa about the same problem (the last one has a lot more information): https://jira.secondlife.com/browse/BUG-4451 https://bugs.freedesktop.org/show_bug.cgi?id=71591 https://bugs.freedesktop.org/show_bug.cgi?id=69226 In the meantime, does anyone know how I can manually repair these shaders until the viewer fixes them?
  18. Update: Someone on IRC asked me to take a full GLSL dump output, to see exactly what's failing in shader compilation. It didn't help clarify the issue, but I thought to post the detailed error here as well in case it might help: http://pastebin.com/raw.php?i=CsHgM1gV
  19. I upgraded to openSUSE 13.1 yesterday, thanks to the software repositories being made available earlier than the DVD release. With this upgrade I dumped fglrx and let Linux switch to the Radeon driver with Mesa. I then ran Firestorm again to see how things go. Sadly, it's pretty bad. Absolutely no shaders seem to work at all. I can run SL fine and at optimal performance, but cannot enable any effects (not even glow). The "Basic shaders" preference can be ticked on, but does not persist across a viewer restart. I'm not going back to the proprietary driver, since it's very buggy on Linux and removing it was already a pain. I need a way to fix this and get shaders working with this setup. Mesa 9.2 supports OpenGL up to 3.1, so I'm confused why GLSL would fail so badly. Here is the console output the viewer prints when I'm already in-world and try to enable the shaders. What can I try?
  20. Although the SL viewer has been giving me issues since I moved to Linux (openSUSE 12.3 currently), I'm usually able to run it OK. I have an ATI Radeon HD 6870 card, and currently use the fglrx driver since the open-source Radeon driver has problems running some games. I tested SL on both fglrx and radeon, and performance is the same on either without major bugs. However, there's one issue which happens with the open-source driver: shaders can't be enabled. The "Shaders" checkbox in the video preferences menu isn't grayed out, so I can turn it on. Once I enable shaders I can also enable the other individual effects (such as depth of field). However, no change is visible in the world after I click OK, although the options remain ticked in the menu. After I restart the viewer, I find the shader option turned off again. This sounds like the viewer is unable to detect shader support under the open-source Radeon driver (it works fine with fglrx, the proprietary ATI driver). I know the Radeon driver and Mesa are capable of running any shader (possibly apart from some OpenGL 3.0 features), since I run Linux games with advanced graphics and all their shaders work fine. I'm probably not going to post this on Jira, since even minor tickets take years to get noticed by Linden Lab, then a few more years to get solved. If someone else with more popularity there wants to post it in my place, that would be appreciated. Note that I'm staying on fglrx until the radeon driver fixes some problems, so I can only test again after my next distribution upgrade, when I'll try the free driver again.
  21. Yeah, I'm familiar with buggy drivers and the things that can go wrong there. Especially since moving to Linux, where video drivers are one of the greatest points of instability. Still, multi-layering of textures is one of the simpler tasks (in its basic form at least). I don't know how the SL viewer does it and whether more advanced magic is involved, but typically even the oldest OpenGL applications can easily layer multiple textures on the same model. I'm a bit unclear on why it's easier to do this server-side instead of using a simple and stable way of doing it in the viewer. Apart, of course, from sending one merged texture, which should mean less networking. Anyway, any optimization done the right way (and I'm not certain this is one) is helpful, so I'll note this on the list of good things LL did. But personally, I'd still think more about a smart system to make all textures load faster... including terrain and in-world objects.
  22. As I was looking through my YouTube subscriptions, I saw this video on the list. It explains that SL is getting a new system called "Project Sunshine" meant to improve avatar loading, and since I hate gray and blurry avatars I watched curiously to see how it's implemented and how much it's going to improve loading. But the description halfway through the video left me a bit confused, at least. Apparently this is meant to solve gray avatars by processing some textures server-side. Although I don't know the SL code, I always thought the issue was textures and shapes taking too long to transfer over the network, not the viewer having too few resources to process them (since any modern CPU / GPU probably handles that quickly). Being a developer myself, it makes little sense to me how this is helpful... so I was wondering if someone can offer a better technical description. There's only one thing the server could do apart from better compression: when avatars have multiple textures applied to the same area of the mesh, combine all the textures and send just one. So instead of sending two textures and letting the client map them, the server can indeed combine them (the same way you flatten multiple layers in Photoshop) and send that to the viewer. But from what I've seen, that's a rarer case. Most complex avatars today are problematic because they have a lot of attachments, while typically using simple / few clothes of the native avatar system. Something else I'm thinking about: although the gray / blurry textures affect avatars most visibly, this isn't an avatar issue. Any mesh, sound or texture takes long to load, including objects in the sim. I'd rather Linden tried to find an ingenious system to fix the problem entirely, including this. Although the main factor in textures loading slowly is networking (which only technology in the (non-Linden) labs can fix), there might be ways to slightly optimize the networking of assets. For example, some textures getting stuck and never loading at all (I know that used to happen in the past at least). I'm not saying this won't be helpful at all, since anything that reduces networking and loading is good. But IMO Linden might be a bit overly triumphant about what is a minor optimization. So yeah... those are my 2 cents on the Project Sunshine idea. I was curious how exactly the server can optimize avatar loading time, apart from texture compression and server-side layer merging. Are there also clear benchmarks comparing texture loading with and without this?
  23. Sounds like it's all good in that case, thanks. The navmesh display is the last thing I could imagine ever caring about, so it makes no difference to me and I can stick to only one viewer (unless / until OpenSim goes its own way and its viewers no longer work with SL grids).
  24. OK. It's not a Firestorm question specifically; I'm wondering about any custom viewer that doesn't have the Havok library. Even if you downloaded the official viewer's source code and manually compiled it with --loginuri support (and of course without Havok).
  25. We all know the story of LL removing the --loginuri parameter and OpenSim support in favor of adding pathfinding to SL. The official Second Life viewer can no longer connect to OpenSim grids for this reason. However, I was wondering if it will also go the other way around: will SL viewers without the Havok library remain supported on the Linden grid, for someone who doesn't care about seeing the client-side physics? I use both SL and OpenSim, and in the current context I'd need the SL viewer for SL and a custom viewer (like Firestorm) for OpenSim. But I don't want to maintain two viewers... and at the same time I don't give a damn about pathfinding and using it on the main grid. So I'd prefer to use the OpenSim viewer of my choice to connect to the Linden grid too. Currently this still works: the OpenSim version of Firestorm as well as the Teapot viewer connect fine to the main grid. But is there a risk of this changing, and logins being blocked in the future for clients that don't have the Havok library? Or of something else breaking if they don't? I assume not, but I'm still curious to know for sure.