Everything posted by Kwakkelde Kwak

  1. steph Arnott wrote: Actually no, but if you think I am that is your choice. No, that was my perception, which is something completely different. This is a forum where coders and scripters work to solve and improve. If you post your code the others can see what you wish to prove. When did you get to decide what these forums are for? And when did you get to decide what I want to do, what I have to do, or what I do? I do not want to "prove" anything. Like in the other thread, I was simply trying to explain how the memory limit works. You on the other hand seem to have the strange urge to "be right". By acting like that you don't see, or don't want to see, how things work, and you make up the most confusing posts, change them as quickly as others can answer them, draw conclusions from thin air and completely ignore any relevant answers you get. In the end all you do is throw up your hands and ask "what good is this memory limit?" I already answered that, in fact I did so more than once. You just don't want to hear it. The average visitor of these forums is probably a better scripter than I am, and maybe a better communicator too. So maybe they can explain to you how things work, since I am obviously unable to do so. I'm not going to repost 3 pages of forum here. If people want to see the scripts I posted (and yours) and everything that relates to them, I gave the link so they can have a look. EDIT In the very unlikely case that I am wrong about the limit, please post a scenario/script showing so. It's never too late to learn.
  2. steph Arnott wrote: some seem to be convinced that the limit set is fixed and never changes. The only time it is fixed is when the script is in a code block that is racking up the available memory until it reaches the set limit. By "some" you mean me, I suppose. You still haven't read what I wrote or looked at my explanatory scripts, or at least you didn't understand a thing about either. I'll try to explain as clearly as I can, again: You execute the llSetMemoryLimit function. At that moment there are two scenarios: the memory use is either lower or higher than the requested limit. If the limit is lower than the memory use, the function is ignored completely and the limit remains at its default 64k. That way the script will at least run (as long as memory use isn't higher than 64k, of course). If the limit is higher than the memory use, that will be the limit until you change it by script, in other words by executing llSetMemoryLimit again. It is not possible to go over the limit; you'll get a stack-heap collision. While a script is running you can change both the memory limit and the memory use, but that doesn't change the two scenarios I described in any way.
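     A minimal sketch of the two scenarios, assuming the behaviour described above (the numbers are only illustrative, not something you'd use in a real script):

     default
     {
         state_entry()
         {
             // Scenario 1: requested limit is below current use,
             // so it should be ignored and the limit should stay at 64k.
             llSetMemoryLimit(2048);
             llOwnerSay("After requesting 2048: limit = " + (string)llGetMemoryLimit()
                 + ", used = " + (string)llGetUsedMemory());

             // Scenario 2: requested limit is above current use,
             // so it becomes the real ceiling for the script.
             llSetMemoryLimit(llGetUsedMemory() + 1024);
             llOwnerSay("After requesting used + 1024: limit = " + (string)llGetMemoryLimit());
         }
     }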
  3. Cephus Twine wrote: but what is there to point you to somewhere you might need to go on the preview grid like a sandbox? The search box in the map menu, which is what I used. I typed "Mesh Sandbox " and "Sandbox". Or these forums. It's a test grid, so don't expect a lot of guidance or help from LL. I only use it for final tests before uploading objects or textures to the main grid and can't think of another reason to log into it.
  4. I can see three mesh sandboxes are up and running, Mesh Sandbox 3, 6 and 22. Those have a 24 hour return time. Sandbox Wanderton is also open, with a three hour return.
  5. If you mean how one makes a mesh object in SL with multiple faces, you assign a multi/sub material in 3ds Max to your object. Each sub material translates to a texturable face in SL. So ID #1 is face 0, ID #2 is face 1, ID #3 is face 2 etc. Take a look at this thread.
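     For illustration, a minimal LSL sketch of how those faces are then addressed in SL (the texture names are hypothetical textures in the prim's inventory, one per sub material):

     default
     {
         state_entry()
         {
             // Max multi/sub ID #1 -> SL face 0, ID #2 -> face 1, ID #3 -> face 2
             llSetTexture("texture_for_ID1", 0);
             llSetTexture("texture_for_ID2", 1);
             llSetTexture("texture_for_ID3", 2);
         }
     }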
  6. Shelby Silverspar wrote: anthonyc12 wrote: Many of my family members have built their own homes, I have not (I do not swing a hammer......it's just not a skill I have ). you do know they are talking about building a home in Second Life, not in real life, right? You do know you're probably talking to a bot right?
  7. No, my code proves that if the memory limit is active, you will get a stack-heap error when that limit is exceeded. Therefore the script is not useless, it's just not a script you'd use otherwise. It's pretty clear. I can think of countless scripts that in a technical way do exactly what my little script does. For example, keeping a list of people who visited your sim in the last day. In scripts like that you either have to make the memory limit variable, or, if the limit is almost reached, have the script remove some names from the list before adding new ones, or move parts of the list to a new script with more memory available, or a combination of these (see the sketch below). OR you just forget about this entire memory limit, since the only use for it is giving an estimation of memory use of mono scripts. In other words, only the estate tools and some lag meters will pick up on it. These tools do not pick up actual memory use and are therefore pretty useless themselves. Like I posted before, the function is a leftover from the days when LL wanted to put a cap on memory use per avatar (and per region, if I'm not mistaken). I do have to correct myself on a small but important thing: the memory limit isn't set when the script is compiled. It's set when llSetMemoryLimit is executed, of course. That might have caused your assumption that (sometimes) the limit is ignored. So if a script gathers some (variable) data before the limit is set, it's possible that in some cases the limit is actually set and sometimes not. That still doesn't mean that if the memory use goes over the limit, the limit is ignored. It means that if the memory use is higher than the limit on execution of the function, the limit is simply not set. I would say that's some pretty poor coding; the limit needs to be set in such a way that it's never lower than the memory in use, which is very easy to check and very easy to set. EDIT One thing that crossed my mind: this whole argument started when someone said their memory use was this and that for this and that amount of scripts. I don't know how that was measured, but if those scripts were all duplicates of a mono script, that's very easy to accomplish, since duplicate mono scripts share their memory. In other words, in theory you could have hundreds of scripts with a total memory use of just a couple of thousand bytes.
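     A rough sketch of that visitor-list idea, assuming touches stand in for actual visitor detection; the limit and the 512-byte safety margin are arbitrary numbers:

     list gVisitors;

     default
     {
         state_entry()
         {
             llSetMemoryLimit(llGetUsedMemory() + 8192); // illustrative limit
         }

         touch_start(integer n)
         {
             // Drop the oldest names while memory is getting tight, then add the new one.
             while (llGetFreeMemory() < 512 && llGetListLength(gVisitors) > 0)
             {
                 gVisitors = llDeleteSubList(gVisitors, 0, 0);
             }
             gVisitors += llDetectedName(0);
             llOwnerSay((string)llGetListLength(gVisitors) + " names stored, "
                 + (string)llGetFreeMemory() + " bytes free.");
         }
     }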
  8. string Something;

     default
     {
         state_entry()
         {
             llSetMemoryLimit(4450);
             Something = "A";
             llSetTimerEvent(1);
         }

         timer()
         {
             llSay(0, (string)llGetFreeMemory() + " bytes out of " + (string)llGetMemoryLimit() + " free, " + (string)llGetUsedMemory() + " bytes used.");
             Something = Something + "A";
         }
     }

     Another one, maybe this will make it clearer. Just put that script in a prim and watch what happens below.

     [08:18] Object: 38 bytes out of 4450 free, 4412 bytes used.
     [08:18] Object: 36 bytes out of 4450 free, 4414 bytes used.
     [08:18] Object: 34 bytes out of 4450 free, 4416 bytes used.
     [08:18] Object: 32 bytes out of 4450 free, 4418 bytes used.
     [08:18] Object: 30 bytes out of 4450 free, 4420 bytes used.
     [08:18] Object: 28 bytes out of 4450 free, 4422 bytes used.
     [08:18] Object: 26 bytes out of 4450 free, 4424 bytes used.
     [08:18] Object: 24 bytes out of 4450 free, 4426 bytes used.
     [08:18] Object: 22 bytes out of 4450 free, 4428 bytes used.
     [08:18] Object: 20 bytes out of 4450 free, 4430 bytes used.
     [08:18] Object: 18 bytes out of 4450 free, 4432 bytes used.
     [08:18] Object: 16 bytes out of 4450 free, 4434 bytes used.
     [08:18] Object: 14 bytes out of 4450 free, 4436 bytes used.
     [08:18] Object: 12 bytes out of 4450 free, 4438 bytes used.
     [08:18] Object: 10 bytes out of 4450 free, 4440 bytes used.
     [08:18] Object: 8 bytes out of 4450 free, 4442 bytes used.
     [08:18] Object: 6 bytes out of 4450 free, 4444 bytes used.
     [08:18] Object: 4 bytes out of 4450 free, 4446 bytes used.
     [08:18] Object: 2 bytes out of 4450 free, 4448 bytes used.
     [08:18] Object: 0 bytes out of 4450 free, 4450 bytes used.

     And then, surprise, a stack-heap collision.
  9. Apparently you didn't listen to what I said about your scripts. Apparently you didn't even bother to use the script I posted. Apparently you didn't set the memory limit high enough on compile. You can test all you want, but if you run the wrong tests you will get wrong results. Show a script that ignores the memory limit AFTER compile, not DURING compile. In other words, make a script that has a limit lower than 64k (not 3k, that is a stupid number that can never be a limit), then keep adding data to the script until it runs out of memory. Just make a simple script with a memory limit of, let's say, 4-5k, then keep adding to a list on touch so the memory use increases a bit every time. Oh wait, that's the script I posted to show you how it works.
  10. That's what I'd say. SL once burned one of my video cards. It was just the wrong card for the job, so I don't blame anyone. It fried a part of my motherboard in 2007 and I think a power supply went dead in 2009. Chances are the same happened to WADE1, possibly the power supply (which is cheap and easy to replace). On the other hand, a while ago, I really can't remember which viewer versions, CPUs were stressed to a constant 100% for some people when SL was active. A well-cooled PC should be able to handle this, mine can render for hours (4 cores at 100%) with normal temps, but it's not the way it's supposed to be for a program like SL and it can seriously damage "average" computers. Never hurts to monitor all temperatures, or even the power drawn at the wall socket. If temperatures get too high and cleaning doesn't help, add fans, replace heatsinks or improve airflow (or, when using a laptop, add a cooling pad) depending on what's overheating. If you draw more power than your power supply can handle, upgrade it. For what it's worth, I don't experience any more lag than last year.
  11. Kelli May wrote: Don't get me started. I've done some living history in RL, so I'm regularly frustrated/entertained by the average merchant's concept of historical garb. 'Medieval' as a search term will bring up classical Roman & Greek clothing, Viking weapons, Elizabethan dresses, Restoration coats, Victorian underwear and more Gorean silks than you can shake a whip at. What's wrong with viking weapons? The viking era was smack in the middle of the Middle Ages.
  12. I think Drongle explained somewhat why I was confused. UV map and UV template (the baked texture from the Edit UVW menu) are not the same. Furthermore, "applying a UV map" is impossible in Max (I am surprised this isn't the case in Blender). All geometry has UV data, it just isn't organised very well initially, especially after moving around vertices, extruding faces and other UV-altering operations. All the things Drongle isn't sure about regarding Max can be done. As Drongle suggested, you can use the entire canvas of the texture for each ID / submaterial. If you use four different textures for the model you have now, that would be an enormous waste of texture canvas and therefore of memory. Sometimes it's useful to make sure different IDs aren't overlapping. You can use a single texture for an object then, but use different shininess settings (for example) for each individual face in SL. I take it both Drongle and Medhue answered your question about the faces. If that doesn't solve your issue, don't be shy and ask again. Good luck!
  13. I'm not quite sure what you mean by "applying the UV map" and "replacing the UV map". All UV maps are already applied before you render the templates. You can export the model as it is when you render those. In SL the four IDs should translate into faces 0, 1, 2 and 3. You can apply your textures (or, with the way you made the UV map, a single texture) in SL. If you want the UV templates to show in 3ds Max, just assign the rendered templates to the four corresponding diffuse map slots. EDIT. You can assign multiple UV maps to any "face", but that serves no purpose in your case. Only one channel can be shown. Anyway, do you mean "assigning the UV template as a texture" when you say "applying the UV map"? If that's the case, aren't you assigning them to the same ID? You still need 4 IDs for the templates, or for your final textures.
  14. WuShin wrote: If you have a decent computer that complies with SL requirements you should never experience "lag" or freezing (unless there is a server problem) Scripts run on the server, so causing server problems is exactly what they do, well, if there are too many anyway. You can have the fastest computer on earth, but that has no influence at all on server performance.
  15. From what you post, unfortunately, no. Could you post the .dae file somewhere? or maybe the .max file? or both?
  16. I don't know what you're trying to show with these two scripts. All I can conclude from those two is that the limit is too small and therefore ignored when the script is compiled. No script can use just 3000 bytes; a simple state_entry with a single function uses more than that. The stack-heap collision occurs when the memory limit IS set and the memory use gets higher than that limit. It is not likely to happen in a script like yours without any variables. In the second script you can see the difference between the first and second events; it's a small difference. Set the limit to 4916 instead of 3000 and you'll see the script will compile, but give a stack-heap collision right away.
  17. Erik Verity wrote: Considering there is no hint of this behavior mentioned in the wiki, and it could only show up if you set a limit between that (almost surely small amount) between max potential identified at compile time and max possible at run-time, it would be worth editing the wiki to clarify. Every indication is given that a limit set by the script is only for use by script checkers and ignored otherwise; I don't think anyone would normally be estimating a limit in state_entry beforehand in the first place, which might be about the only way to set a value too low and still be unrecognized. The function is a bit of an oddity in our toolbox. It was meant to help scripters with the dreaded memory ceiling that luckily never saw Second Life daylight. The limit is a limit though, no way around it. The only use for it now is to inform lag-tracking meters and it does a poor job at that, since those meters show the limit, not the use. I agree the wiki could be more clear on this, but I'm not exactly a seasoned scripter, so I don't feel that's my job:) Edit: I realize limits are set in state_entry - but it wouldn't make sense to get a useful value from llGetUsedMemory() from there - it kind of defeats the purpose of that function call altogether to try to get a value of memory used before the script has even run. The state entry isn't the best place for it of course. It would make sense to set the limit "on the fly" after memory changing commands to give the lag meters some useful and reliable information. My scripts were just examples to show how you can run out of memory, something you'd want to avoid in a real script I'd say.
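      Something like this is what I mean by setting it on the fly (a sketch only; the 2048-byte headroom is an arbitrary number):

      list gData;

      default
      {
          touch_start(integer n)
          {
              gData += llDetectedName(0);
              // Re-set the limit after the memory-changing operation, so meters
              // that read the limit see something close to actual use.
              llSetMemoryLimit(llGetUsedMemory() + 2048);
          }
      }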
  18. Nova Convair wrote: My slider goes from 0 to 8 maybe that got my attention. Weird, mine goes (low to high) from 0-2, as I said. Same viewer version. A higher LOD factor allows to see detailed mesh/sculpt in greater distances or allows to see them at all in normal distances. For LOD 4 a medium machine is fully sufficient. ( that is a former high end machine but times change ) Well, I'll admit nothing is cast in stone in relation to this. But I build my items with "normal" settings in mind, no debug or TPV settings. That means I can often get away with a very detailed highest LoD. If people set their LODFactor too high, those objects will be seen from twice the distance, which can add a terrible amount of geometry on screen at any given time (not to mention the lower LoDs switching later). Most local lag is due to texture memory I suppose, so it might not be that bad. It's really hard to measure. Anyway, I build my items in such a way that LODFactor 2 is perfectly fine.
  19. I added 64k by adding "blah"? Then I wonder how much memory a real script uses.... You know that's not the case. A memory limit is a memory limit. As you can see, even with a lot of blah only 4k of memory is used. EDIT Ok, I see you meant 64. I added 64 because after the script sets the limit, there's another event. The 64 bytes allow that event to fit in memory. Anything on top of that, not even close to 64k, causes a stack-heap collision, which means the script ran out of available memory. @Erik, as you can see by my first script, I tested it as well. I have done so before. Like I said, my conclusion is that the limit is set when the script compiles. If it's too small it will be ignored; if it's enough, it will be a true limit which can't and won't be ignored once the script is running. Both scripts were running in mono; how else would I get a stack-heap collision with the first script? More edit... no, one click is enough to trigger the error.
  20. Then why do I get a stack-heap collision when I make a super simple script with a memory limit of its initial use + 64, which adds a small amount of data to a list on touch?
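      For reference, this is roughly the script I mean (a sketch only; the 64 bytes of headroom and the touch data are arbitrary):

      list gData;

      default
      {
          state_entry()
          {
              llSetMemoryLimit(llGetUsedMemory() + 64);
          }

          touch_start(integer n)
          {
              gData += "some data"; // each touch uses a little more memory
              llOwnerSay((string)llGetFreeMemory() + " bytes left.");
          }
      }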
  21. Unless LL changed it, that is only the case when you compile the script. Once running, you will get an error when you go over the limit. "ADDED: The convo was about the 200 not 10 or 50." I don't understand that.
  22. Well, I'm sorry, but you brought it up. Scripts can have a 100% steady or 100% predictable memory use; I bet most small child scripts have that. So it's perfectly normal to set the memory limit to a number lower than 10k. As far as I know, it won't be ignored either. If you go over the memory limit, you'll get a stack-heap error, just like you get when an LSL script goes over 16k. What point would there be for a memory limit if it didn't work? The whole function is pointless enough already. The 2.5 in your example doesn't cost any resources at all. Like you said yourself, it's a limit, not actual use.
  23. steph Arnott wrote: anything below a 10k limit would risk it being ignored and reverted back to 64K Enlighten me....