
arabellajones

Resident
  • Posts

    685
  • Joined

  • Last visited

Everything posted by arabellajones

  1. So the Firestorm developers decided to make it easily accessible for the users without any documentation... I am not surprised. The latest version of Firestorm, the animesh beta, is misbehaving on Linux. Other viewers on the same version of Linux are not, and (I had to ask somebody to check it) the Win 10 version of animesh Firestorm is OK. And JIRA as a bug-reporting tool is about as user-friendly as a rabid honey badger with a hangover.
  2. Yep, very standard stuff. It's an explicit file format in Kerbal Space Program, but all the textures are already on your hard drive. Looks like the way I've been thinking of textures, just big enough for a 1-to-1 mapping of texture pixels to screen pixels, which is not the same as LOD switching. But that means some of those 1024 textures will never get used at the full size.
  3. Thanks for the sources on that. It's pretty much what I try to do, though there are elements of SL which seem poorly documented. I'm pretty sure that some sort of mip-map system has been used, different sized textures sent to the viewer for objects at different distances, but where is it described? Your example points up one reason: I'm not using the right bit of jargon, but the only evidence I have is what happens when downloads are sluggish, and I see the low-res texture switch to a higher resolution. And how does it relate to how LOD works on the mesh? Yeah, lower texel density on such things as the underside of a vehicle is a good move. For mesh, you have to start with the UV mapping. I can think of a couple of projects I have where that could be done.
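For what it's worth, "different sized textures for objects at different distances" is how standard GPU mipmapping behaves. Here is a minimal sketch of how a renderer might pick a mip level using the usual one-texel-per-screen-pixel rule; this is the generic technique, not SL's actual (undocumented) selection logic:

```python
import math

def mip_level(texture_size_px, screen_coverage_px, num_levels):
    """Pick the mip level where one texel maps to roughly one screen pixel.

    texture_size_px: width of the base (level 0) texture, e.g. 1024
    screen_coverage_px: how many screen pixels the textured surface spans
    num_levels: total mip levels available (level 0 is full resolution)
    """
    if screen_coverage_px <= 0:
        return num_levels - 1  # vanishingly small on screen: smallest mip
    # Each mip level halves the resolution, so the level is the log2 ratio.
    level = math.log2(texture_size_px / screen_coverage_px)
    return min(num_levels - 1, max(0, round(level)))

# A 1024 texture seen across only 256 screen pixels wants level 2
# (1024 -> 512 -> 256), so the full-size texture never gets fetched.
print(mip_level(1024, 256, 11))  # -> 2
```

Which is exactly the "some of those 1024 textures will never get used at full size" point: unless the surface covers 1024+ screen pixels, a lower mip does the job.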
  4. Oh, accidents happen. I've uploaded textures with an alpha channel that wasn't needed. But there's that old saying in Chicago. Once is happenstance, twice is coincidence, but three times...
  5. I wanted to check what the four numbers are which are shown by the "Color under Cursor" function. The first three are RGB values, and all the third-party sources I can find say the fourth is an Alpha value, but when I made a box with no transparency, completely opaque, default blank texture, the fourth component was reported as "65". I can't see that making sense. Part of the problem I am having is that, while every answer coming up from a Google search shows the Alpha value answer, nothing comes up from Linden Lab. I've not found anything pointing to the Wiki, or anything else that could be called a reliable source. The LSL code for a color vector uses floats in the range 0-1.000, other methods use 0-255 or the hexadecimal equivalent, but is this 1-100 or 0-100 or 0-99 or 1-99? And then I am looking in the viewer, and the values go weird, apparently inverted. A dark blue has RGB values over 200, pale colours have low values. Oh, and I checked. It's not behaving like an HSV set. Since I use Linux, there doesn't seem any point in submitting a JIRA, there isn't any supported viewer, some jobsworth will just close it with a "not our problem, guvnor". But where is the documentation? Have we just been using a piece of buggy code for all these years and nobody has noticed? Have we all been passing around knowledge from the Viewer 1 era, and never noticed?
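For reference, the two common component scales convert like this; the point is that neither of them turns a fully opaque surface into a reported "65", which is the puzzle:

```python
def float_to_byte(c):
    """Convert an LSL-style colour component (float 0.0-1.0) to 0-255."""
    return round(max(0.0, min(1.0, c)) * 255)

def byte_to_float(b):
    """Convert a 0-255 component back to the 0.0-1.0 float range."""
    return b / 255.0

# Fully opaque alpha on the common scales:
print(float_to_byte(1.0))  # -> 255 on a 0-255 scale
print(round(1.0 * 100))    # -> 100 on a percentage scale
# Neither scale makes a fully opaque surface report "65".
```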
  6. One thing has been confusing me. I use The GIMP for graphics creation and editing, and that uses "Linear" and "Cubic" to label the interpolation methods used when scaling an image. I had to do a bit of digging to find out whether they were the same as "bilinear" and "bicubic" (this is why I get so picky about multiple labels for the same thing: "download weight" or "streaming weight", guys?). The reason for the apparent sharpness is that bilinear interpolation adds an artefact similar to what an edge-enhancement filter does.
Problem 1: What happens when multiple texture pixels contribute to the same screen pixel? A 1024 texture is tall enough to fill your viewer display vertically. Some objects use the UV mapping to put several views of an object onto one texture. I did that with an ISO shipping container, which means each side is using a 256-pixel high block, but how often do you see it from close enough for that to matter? This can also be where anti-aliasing comes into play, which is a sort of blurring. But the precise vertical lines in text don't need to be blurred to avoid the step patterns you get on diagonals and curves. So what do you do? And does it matter if you scale the image before or after doing anti-aliasing?
Problem 2: Different parts of the image respond differently to the same tool: some can look worse and some can look better. This is one reason to use layers. It's fairly easy to anti-alias a layer carrying text but not the rest of the image, but it gets more complicated if you want to use different forms of interpolation for scaling the image. Also, scaling the whole image, still split into layers, can be less than ideal.
Problem 3: Sometimes you just have to try the alternatives and see what works best. At least I can use the Local Textures option in Firestorm, because the Viewer and the nature of the object can have an effect on it all.
As with a mesh and the smoothing, I am not sure there is any way you can reliably see what happens without using a Viewer. Truth be told, there are huge numbers of textures being used in Second Life which don't need to be 1024 pixels across. How close do you have to be for somebody's eye to be that many screen pixels? Most of the time I work on a texture at 2048 size, and scale it down to a 512 for upload and use. And why do people still leave an alpha channel in? Yes, there are reasons to use an alpha channel in a specular or normal map, but in a diffuse map, set to 100% opaque for the whole image, it's just a huge lump of unused data.
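The "how close do you have to be" question above can be put in numbers with a simple pinhole-camera estimate. The screen height and field of view below are assumptions for illustration, not values taken from any particular viewer:

```python
import math

def screen_pixels(object_height_m, distance_m, screen_height_px=1080,
                  vertical_fov_deg=60.0):
    """Roughly how many vertical screen pixels an object spans.

    Pinhole-camera estimate: the camera sees a vertical slice of the world
    2 * d * tan(fov/2) metres tall at distance d, mapped onto the screen.
    """
    visible_height_m = 2.0 * distance_m * math.tan(
        math.radians(vertical_fov_deg) / 2.0)
    return screen_height_px * object_height_m / visible_height_m

# A 1-metre texture region viewed from 5 metres away on a 1080p screen
# spans well under 512 pixels, so a 1024 texture is wasted at that range.
print(round(screen_pixels(1.0, 5.0)))  # -> 187
```

So for the 2048-work-then-upload-at-512 workflow described above, the viewer would need to be within a metre or two of the surface before the extra resolution could even reach the screen.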
  7. What on earth are you Lindens doing? A shift of over five hours in the timing of this "scheduled maintenance". I'm in Europe. I was asleep. I'm not your only customer running on those times. And this is your own clock on the wall you're using.
  8. Oh yes it is. And the debug trick is one for people who really know what they're doing. Why doesn't the slider allow direct numeric entry, like all the others do?
  9. I have very mixed feelings about this. First, my usual all-mesh hair, body, clothes, and tail has a complexity/display weight of under 10k. Second, I usually run my viewer with a limit of 100k, and very few people go above 150k. But the UI for the complexity setting sucks big-time. The maximum value that can be set is somewhere just over 350k; above that the only option is "Unlimited", and an extra couple of big steps could be worthwhile. If the last three steps were 350k, 700k, and 1100k, that would still shut the door against the ultra-complexity griefer tools, while allowing a handful of extreme avatars. From what I see, Complexity is the only one controlled by a slider without a direct numeric input. Incidentally, the size of the steps, masked by the slider, is bizarre. It's not measured to 2 or 3 significant figures. It's not jumping from 10,000 to 11,000 but from something such as 9768 to 10732. (Those aren't real numbers, but that's the style.) That's meaningless precision.
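To illustrate what rounding to 2 significant figures would look like, here is a small sketch. The 9768/10732 values are the made-up style examples from above, not real slider steps:

```python
from math import log10, floor

def round_sig(x, sig=2):
    """Round x to the given number of significant figures."""
    if x == 0:
        return 0
    # Position of the leading digit decides how many places to keep.
    return round(x, -int(floor(log10(abs(x)))) + (sig - 1))

# Slider steps rounded to 2 significant figures read as clean values:
print(round_sig(9768))   # -> 9800
print(round_sig(10732))  # -> 11000
```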
  10. It does look careless. There are a few other things, like not having a distinct folder for settings and cache. I can be a bit picky sometimes, but there doesn't seem to be an easy escape. It's a BETA version, dammit! I'm happy to use it, but the Firestorm mob are doing a good impression of two short planks on some things. I use Linux, I tell them that, and they still want to know if I can replicate the problem on the "official" viewer. The Linux version of the Linden Lab viewer is over a year old. I'm not sure it can even be called "official" any more, but just from the dates, there must be a lot of things that just can't be replicated, and that fact blows a hole in the procedures they want us to follow. Like I said, I'm a Linux user, and the tendency to suggest Windows-specific solutions gets a bit tedious. I have libraries, supplied with my operating system, which do things that Second Life viewers have problems with, because somebody says they can't supply the library with the viewer. It makes me wonder how so many other programs make use of the libraries without any apparent problems. Well, I have not seen any problems with the Linux version. I am not worried by it. But I make back-ups regularly. I've had to roll back from new versions of programs until some bug is fixed, but I have the tools to manage that. The Firestorm team, and maybe Linden Lab too, seem to make some damnably dangerous choices. Call me paranoid, if you wish, but is that really so crazy?
  11. It can get complicated, and not all mesh-creation programs handle sharp edges the same way. But there is an Upload option, "Generate Normals", which can work well. It would be good for something such as that cube, less good for other things. I've used it to preserve a sharp step that would otherwise be lost in smoothing, specifically a lapel edge on a coat. I also wonder if any external program does smoothing in quite the same way as SL. But remember that low-LOD models can be far enough away that you can't see that sort of detail. Will sharp edges matter at a distance?
  12. Thanks... Did the Lindens document any of this? I find it hard to be confident.
  13. I am not going to claim to be a particularly skilled or fast mesh maker, but I do make stuff. I was lucky to buy a mesh avatar which is well optimised. The marketplace is no help in finding low-complexity items. Anyway, I am walking around as the pictured avatar, total complexity under 10k. And some creator who sees this will now start pointing and screaming "Cheating!" because I made those meshes for what SL viewers do. I've done tests, zooming my camera out and seeing what detail my screen can show. My meshes are comparable in triangle size to the classic avatar. I use normal maps, but there is detail in those clothes which doesn't depend on them. It would help a lot if the Lindens could write documentation for humans rather than computers. It would help if some creators could read documentation. There have been times when, despite my clearly saying "I use Linux", helpful people have recommended wonderful Windows programs. That outfit isn't quite accurate, I've since seen pictures with some better info. "Run and find out." is a pretty good motto.
  14. Oh dear. I do a lot of my building in another program, not Blender, though I use it with Avastar for rigged mesh. Using both, incidentally, I wonder if any program does mesh smoothing in quite the same way as SL. Both export a .dae that the uploader can handle, but this sounds like yet another undocumented can of worms.
  15. I was checking up on something, and there is a test Omega Applier for Bakes on Mesh. A possible problem is that Omega Appliers use the texture UUID, and access to that is limited as a barrier to content theft. I have a couple of mesh bodies that can use Omega appliers, they have classic-compatible mapping on a distinct cloth layer, and I have made a couple of basic Omega Appliers for classic clothing. One thing I have found: an Omega option to clear the clothing is very useful. I have seen a few bodies that have that built in, but there are standalone appliers just for that. Bakes on Mesh could pay off, only needing one clothing-mesh layer, cutting avatar complexity. I have one mesh body that I can usually keep under 20k complexity, with clothes. I regularly see people running avatars with over ten times that. Some I have seen were over 450k complexity. The lack of info in the Marketplace isn't helping.
  16. If you're not thinking about how big it will come out in screen pixels, you maybe should. A pair of classic pants puts about 1 metre against the height of the texture, and my whole screen is 1080 pixels high. So you would have to be zooming in pretty close on somebody's pants before the difference between a 512 and a 1024 mattered. Maybe you have some expensive hardware with more screen pixels, and your UV mapping could make that 1024 into effectively four 512 textures, but if you're not thinking about that, you're a lousy creator.
  17. A day or two ago, there was some in-world discussion of basic instrument panels for aircraft, a HUD with a set of basic instruments, that wasn't wasting resources. There have been several, and it emerged that some have vanished. Still in my inventory, but gone from the marketplace. Still working, with relatively low streaming and render weights, not triggering the warnings that some maker-provided HUDs do about such things as excessive texture use, and giving useful indications in SL. Why does a HUD give a numeric readout to 7 or 8 decimal places? I reckon there's room for something like this, using up-to-date scripts and modelling. I am going to drop this in as a real-world example, the RAF Blind Flying Panel. Speeds don't need to be exact. At Mach 1 it's less than a second to cross a sim, a Spitfire could cross the sim in a couple of seconds, and maybe there needs to be a scale factor setting, so that the ASI can show an appropriate speed for the plane without having to use crazy-fast in-world movement.
  18. A lot of this is beyond me, but if you teleport to the road in the Kuula sim, and then walk into the NCI parcel west of the road when it's busy, you'll see something about the difference between mesh rezzing and other sources of load. It's set up so you can't see the avatars in the parcel from outside, and when you cross the parcel boundary they all load at the same time. There is a lot of stuff downloading to your viewer when you TP into a sim, and you have a chunk of data about you being sent from one sim to another. The way things work, it's hard to get a good figure for Download Weight, and complexity is something a bit different anyway, but, for what little it's worth, my usual complexity is under 15k, there are other people up around 150k, and the slider for limiting complexity is nearly useless when the complexity is over about 250k. There is a lag indicator in the parcel, and it does show red pulses when people TP in. I have been seeing the same sort of high script times lately as the OP has described. The mesh rezzing problem isn't new. I think the script-time problem is, and there have been changes of occupier on some other parcels. I make a bit of effort to keep the mesh I make as low-complexity as possible. I look at the LOD distances and triangle counts (use Firestorm to get the info), and try to avoid triangles smaller than a screen pixel. There are rumours of some changes in the works, a change in LOD distances, which might spoil the look of what I make. I can't say I shall be happy about that, but, frankly, the Lindens couldn't document their way out of a soggy-wet paper bag. And the same problem seems to be part of this mess.
  19. It's arguable that there needs to be something better than these meetings, coming from Linden Lab. Some things get clearly announced. Some don't. This is one of the things that got left to drift, and I don't think it should have been. I hope they're paying Inara Pey really well for reporting these events, and doing the job that they should be.
  20. I don't think I've seen anything elsewhere to say Vivox no longer support Linux. It makes all the talk of a community-supported Linux version look a bit of a sham. It was easy enough to confirm through Google, but it's all indirect. All you can see from Vivox is total silence about Linux. I'd be a bit less grumpy if Linden Lab were at least honest about this detail. They seem to have a devotion to voice, which I am not sure is all that useful in practice, and I can't help but recall that line about a verbal contract not being worth the paper it's written on. After my experience of SL15B, it looks like part of a larger pattern of bloody-minded non-information. I know that as a Linux user I am in a tiny minority, but I reckon the problems I have would be the same, or worse, under Windows. Switching won't, for instance, change the quality of the documentation for Second Life.
  21. Some companies do it right. They take the trouble to test their product. For them, Wine is effectively another version of Windows. I use one of these programs. Some of the choices Linden Lab have made make it a little difficult for me to regard them as competent.
  22. It is my understanding that voice in Second Life depends on a module supplied through Linden Lab, which uses proprietary code from Vivox. All third party viewers depend on this code and use a module called "SLVoice", marked as an executable. I have not seen any clear statement of when this code was last updated. While Linux support is suspended, and a future "Alex Ivy" viewer for Linux depends on work from the TPV community, which I see is being done, this particular component appears to depend on Linux code that ceased maintenance four years ago. I refer to the gstreamer-0.10 module. It already appears not to work any more. Since it is not open source, there is an obvious problem in an update depending on the TPV community. I can see a couple of possible fixes, but they depend on there being a legal way for somebody to work with the Vivox code. Essentially, it needs somebody with Linux skills to do the work under contract, but is it Vivox or Linden Lab who has to employ them? While there are similarities with the situation over the Havok code for mesh import, I would argue that Voice is not the same. We don't need to agree to extra T&Cs to use Voice, as we do to import mesh. Is there a long-term plan, or should I assume that voice is now a dead option for Linux viewers? Should Voice be assumed by event organisers to be universal? (I hit problems at SL15B, where presentations by Linden Lab used Voice with no apparent alternative.) (Having somebody speaking on a media stream while listening for questions on voice could be a fix for these big events, but you would need to be careful about feedback loops.) The voice problem also raises questions about the regular in-world user group meetings. I am not enthusiastic about using Windows for a special event, but if it is going to be necessary, I would prefer something a little more explicit than saying Voice is used. It isn't working with Linux. Say these events don't work with Linux viewers.
  23. It's pretty clear from the start of the thread that this is about the Windows viewer, but I would have mentioned "Windows OS" in the thread title. I don't consider just the version number to be useful to us ordinary humans.
  24. As I recall, the point of the thermal paste is to compensate for imperfect thermal contact between the cooler and the top of the chip casing. Neither is flat (I once saw an optical flat. Wow!). The paste isn't really there to be a good conductor; it just increases the area for heat flow. A good cooler rig can be pretty cheap, though the room in the case will matter. I had an Arctic cooler running for years on my old computer; my current machine is stock Dell. Fans can wear out, and I would look at how your motherboard controls fan speed. Some have a BIOS setting that runs them at continuous max speed. I would say the big mistake is not replacing the thermal paste. I give an example here, from eBay in the UK. The thermal conductivity quoted is typical, but you can get better without a huge bill. Arctic is a good brand, but some sellers don't quote hard numbers. Example of thermal paste sold in the UK. Since you only have a thin film of thermal paste, there's not really going to be very much temperature difference, and a 2:1 difference in conductivity is going to mean fractions of a degree. Spend a little more on branded paste and you may get something that lasts longer, but I worry more about making a very thin film. You're not spreading jam on a sandwich; Marmite is maybe a better analogy. Airflow in the case is a tangled issue. Some cases have the PSU at the bottom, essentially with its own cooling air. It's usual to have a front intake fan set low, the exhaust fan high, and the video card with its fan/heatsink venting at the back. The sort of commonplace CPU rig that works well with a side vent sounds similar to what you have. Higher-grade coolers have a huge heatsink and blow from the side. Best to rig it all front to back. Keep your cabling tidy; that can be the killer. You're not getting excessive temperatures, but Second Life viewers do a lot of work, and it comes at the high end of what people see.
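The "fractions of a degree" claim can be checked with the standard 1-D conduction formula, dT = P * t / (k * A). The film thickness, spreader area, and power below are assumed illustrative numbers, not measurements of any particular CPU or paste:

```python
def paste_delta_t(power_w, thickness_m, conductivity_w_mk, area_m2):
    """Temperature drop across a thin thermal-paste film (1-D conduction).

    dT = P * t / (k * A): power times film thickness, divided by
    thermal conductivity times contact area.
    """
    return power_w * thickness_m / (conductivity_w_mk * area_m2)

# 100 W through a 50-micron film over a 30 mm x 30 mm heat spreader:
area = 0.03 * 0.03
print(paste_delta_t(100, 50e-6, 5.0, area))   # k = 5 W/mK: about 1.1 C
print(paste_delta_t(100, 50e-6, 10.0, area))  # k = 10 W/mK: half the drop
# Doubling the conductivity saves only around half a degree.
```

So a 2:1 conductivity difference between pastes really does come out at well under a degree for a thin, even film; a thick or patchy film costs far more than a cheaper paste does.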
  25. I think one of the problems with the freebies is that there isn't a common target, which there was in the days before mesh. I use a particular mesh avatar, which has a look that suits me and works well, but there was so little, even if I used the classic AV, that it wasn't even worth taking freebies. I don't have an avatar to use, so I can't comment on the quality.