Revived JIRA for increase in memory limit


Arduenn Schwartzman


Recommended Posts

2 minutes ago, Gadget Portal said:

If you've got so many poses in a single couch or bed that you're hitting memory limits, I'd be extremely impressed.

You should be extremely impressed by every one of the hundreds of thousands of pieces of furniture out there on the grid, then. If you open up the average piece of furniture in SL, be it single- or multi-user, you'll see that the number of scripts related to sitting is at least three. That alone should be a strong indicator of the demand for more memory per script, which could save significant overhead overall.


1 minute ago, Arduenn Schwartzman said:

You should be extremely impressed by every one of the hundreds of thousands of pieces of furniture out there on the grid, then. If you open up the average piece of furniture in SL, be it single- or multi-user, you'll see that the number of scripts related to sitting is at least three. That alone should be a strong indicator of the demand for more memory per script, which could save significant overhead overall.

There may be three if old; mine have one.


21 minutes ago, steph Arnott said:

There may be three if old; mine have one.

Congratulations and thank you for being so script-aware with your interior. I genuinely and wholeheartedly applaud you for that.

However, I just jumped into SL and sampled some bleeding-edge furniture at an interior decoration event called The Boardwalk, http://maps.secondlife.com/secondlife/Von Strauss/67/28/24 , go see for yourself. The results were even worse than I thought. My rough estimate of an average of 3 scripts per piece of furniture was very conservative:

Quote

[10:29] Script info: 'Object': [8/8] running scripts, 512 KB allowed memory size limit, 0.007344 ms of CPU time consumed.
[10:29] Script info: 'Object': [8/8] running scripts, 512 KB allowed memory size limit, 0.007768 ms of CPU time consumed.
[10:29] Script info: '.:UR:. Mid-century Modern Living - Sofa w/ extra (fpm)': [4/4] running scripts, 256 KB allowed memory size limit, 0.003478 ms of CPU time consumed.
[10:29] Script info: 'Rotund Pillow': [2/2] running scripts, 128 KB allowed memory size limit, 0.001836 ms of CPU time consumed.
[10:29] Script info: 'Object': [8/8] running scripts, 464 KB allowed memory size limit, 0.006337 ms of CPU time consumed.
[10:29] Script info: 'Object': [9/9] running scripts, 576 KB allowed memory size limit, 0.007079 ms of CPU time consumed.
[10:29] Script info: 'Silla': [10/10] running scripts, 640 KB allowed memory size limit, 0.005176 ms of CPU time consumed.
[10:30] Script info: 'UprightPianoStool': [15/15] running scripts, 960 KB allowed memory size limit, 0.012023 ms of CPU time consumed.
[10:30] Script info: 'chair seat': [8/8] running scripts, 512 KB allowed memory size limit, 0.004772 ms of CPU time consumed.
[10:30] Script info: 'chair seat': [8/8] running scripts, 512 KB allowed memory size limit, 0.004822 ms of CPU time consumed.

That was an honest, unbiased sample of the latest in furniture in just 4 or 5 booths directly around me at the landing point.

Edited by Arduenn Schwartzman

2 minutes ago, Arduenn Schwartzman said:

 

Well all you are doing is proving what LL proved. Many people write badly inefficient scripts. This is a very basic example:

if (llListFindList(sims, [sim]) == -1)

if (~llListFindList(sims, [sim]))

Both do the same job, but the latter uses less bytecode and runs a lot faster.
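For anyone unfamiliar with the idiom: llListFindList returns -1 when the item is not found, and the bitwise NOT of -1 is 0, which LSL treats as false. A minimal sketch with made-up list contents:

```lsl
default
{
    state_entry()
    {
        list sims = ["Von Strauss", "Ahern"];  // illustrative data
        string sim = "Ahern";

        // Verbose form: compare the search result against -1 explicitly.
        if (llListFindList(sims, [sim]) == -1)
            llOwnerSay(sim + " is not on the list.");

        // Compact form: ~(-1) == 0 (false); any found index, including 0,
        // yields a non-zero result (true). Note the test's sense is inverted.
        if (~llListFindList(sims, [sim]))
            llOwnerSay(sim + " is on the list.");
    }
}
```

The two tests have opposite senses: the verbose form fires on "not found", the compact form on "found".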


5 hours ago, Arduenn Schwartzman said:

In two 64-kb scripts:

  • Step 1: Script 1 dumps parameters into request string
  • Step 2: Script 1 sends request string to Script 2
  • Step 3: Script 2 parses request string into parameters
  • Step 4: Script 2 extracts or processes data from list using said parameters
  • Step 5. Script 2 sends data or confirmation feedback signal to Script 1
  • Step 6. Script 1 parses the response and acts accordingly
  • (Events triggered: 2--actually 4--Scripts also receive their own llMessageLinked)

Yeah, if that's what you're doing it can use up a lot of time and resources.  When I run up against a memory limitation and need to split a script in two, I try to avoid putting myself in the position of having to make the scripts play endless ping-pong.  For example, if I were designing a retexturing system, I would put my menu dialogs and whitelisting code into one script and all of the texture UUIDs and the SLPPF operations that do the actual texturing in a second script.  The first script would handle the access authorizations and simply pass the user's menu choice to the second script.  The second script wouldn't need to send data or confirmation back to the first one at all.  If I were clever enough, I would write that second script to be as generic as possible so that it could receive choices from dialog menus in more than one spot in the first script --- sort of like treating the second script as a large user-defined function that serves the first script.
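A minimal sketch of that one-way split; the names, link-message number, and dialog channel are illustrative, not from any actual product:

```lsl
// Script 1: access checks and menus only; forwards the choice and forgets it.
integer CHAN_TEXTURE = 42;     // illustrative link-message number
integer DIALOG_CHAN = -77001;  // illustrative dialog channel

default
{
    touch_start(integer n)
    {
        // ...whitelist / authorization check would go here...
        llListen(DIALOG_CHAN, "", llDetectedKey(0), "");
        llDialog(llDetectedKey(0), "Pick a finish:", ["Oak", "Teak"], DIALOG_CHAN);
    }

    listen(integer chan, string name, key id, string choice)
    {
        llMessageLinked(LINK_SET, CHAN_TEXTURE, choice, id);
        // No reply expected; Script 2 acts on its own.
    }
}
```

and the texturing script:

```lsl
// Script 2: owns the texture data and does the SLPPF work; sends nothing back.
integer CHAN_TEXTURE = 42;  // must match Script 1

default
{
    link_message(integer sender, integer num, string choice, key id)
    {
        if (num != CHAN_TEXTURE) return;
        // Look up the UUID for 'choice' here; TEXTURE_BLANK stands in for it.
        llSetLinkPrimitiveParamsFast(LINK_THIS,
            [PRIM_TEXTURE, ALL_SIDES, TEXTURE_BLANK,
             <1.0, 1.0, 0.0>, ZERO_VECTOR, 0.0]);
    }
}
```

The point of the design is that messages flow only one way, so neither script ever waits on the other.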

  • Like 1

59 minutes ago, steph Arnott said:

Well all you are doing is proving what LL proved. Many people write badly inefficient scripts.

Or, alternatively, I'm proving that there's a dire need for a memory increase that allows simpler and more efficient scripts.

59 minutes ago, steph Arnott said:

Both do the same job, but the latter uses less bytecode and runs a lot faster.

That's just scraping for a few measly bytes. Perfect for optimizing an almost-finalized product and decreasing the chance of getting a stack-heap collision, but it's a far cry from freeing up space for extra functionality.

Edited by Arduenn Schwartzman

7 minutes ago, Rolig Loon said:

  The first script would handle the access authorizations and simply pass the user's menu choice to the second script.  The second script wouldn't need to send data or confirmation back to the first one at all.  If I were clever enough, I would write that second script to be as generic as possible so that it could receive choices from dialog menus in more than one spot in the first script --- sort of like treating the second script as a large user-defined function that serves the first script.

Well, that was really what I was on about.


2 minutes ago, Arduenn Schwartzman said:

@Rolig Loon Your example definitely suffers less from script splitting side-effects than mine. They're opposites on a spectrum of different strategies and needs, all of which, nonetheless, will benefit from the ability to exist in a single script.

Well it is not going to happen. LL have decided that and they are not going to change that decision.


41 minutes ago, Arduenn Schwartzman said:

 Your example definitely suffers less from script splitting side-effects than mine. They're opposites on a spectrum of different strategies and needs, all of which, nonetheless, will benefit from the ability to exist in a single script. 

Yeah, I wasn't necessarily making an argument against increasing the memory limit on Mono scripts.  I think Steph is probably right that LL isn't likely to raise that limit any time soon, though, so my fallback position is to (1) keep my scripts as small and tight as I can and then (2) if I am forced to split a script, make the split parts as nearly independent as possible, so that I don't need to pass very many link messages back and forth between them.  I've been following those principles long enough that the workflow is almost instinctive now.  Speaking only for myself, I can live with the current 64K limit without too much muttering under my breath.

  • Like 1

2 hours ago, Arduenn Schwartzman said:

You should be extremely impressed by every one of the hundreds of thousands of pieces of furniture out there on the grid, then. If you open up the average piece of furniture in SL, be it single- or multi-user, you'll see that the number of scripts related to sitting is at least three. That alone should be a strong indicator of the demand for more memory per script, which could save significant overhead overall.

On the other hand, the average creator in SL, especially ones that don't write their own scripts, is terrible at any kind of optimization.

  • Like 1

The claim that increasing the limit to 128k would make things less laggy may seem counterintuitive, but it is ultimately true. The amount of lag isn't directly related to the size of a script.

In fact, having a limit at all (even 128k) makes no sense when you can have almost as many scripts as you want inside a single prim, all of which run at the same time, while a single script is limited to one event at a time. And even a cap on the total memory used by a linkset wouldn't be perfect, because then you might force many scripters to use several linksets listening to each other (sounds like a nightmare if you ask me).

An idea could be, however, to have bigger scripts use more Land Impact, just as having several scripts does.

Also, as Strife suggested many years ago, LL could have scripts use 64k by default and make llSetMemoryLimit finally relevant by allowing it to also increase that value, rather than just decrease it. Personally I like that idea.
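For reference, llSetMemoryLimit today can only reserve less than the 64 KB Mono default; Strife's idea would let the same call go above it. A sketch, where the 128k call is hypothetical and fails under current semantics:

```lsl
default
{
    state_entry()
    {
        // Works today: a small script voluntarily shrinks its reservation.
        llSetMemoryLimit(16384);
        llOwnerSay((string)llGetMemoryLimit() + " bytes reserved");

        // Hypothetical under Strife's proposal: opt in to more than 64 KB.
        // Under current semantics this call cannot raise the limit.
        // llSetMemoryLimit(131072);
    }
}
```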

  • Like 2

6 hours ago, steph Arnott said:

Because ten identical scripts that are a copy of the compiled original act as if they are one script. Filling up one and moving to the next has already achieved what you want. Also, LL ran a feasibility study and concluded that giving people the ability to use what was required threw up an issue that Fionalein pointed out: most will just write bad scripts. Also, PHP (if you know how to use it) is vastly superior to increasing script memory allocation. Personally I do not do anything that needs resorting to an external server.

Stack sharing is not what this topic is about.


17 hours ago, Arduenn Schwartzman said:

Hi folks,

I just reissued a (slightly altered) feature request involving increased memory for scripts. Most of you will probably be able to imagine the potential benefits of a memory increase.

The old JIRA (closed, unimplemented) can be found here: https://jira.secondlife.com/browse/BUG-134167

As a solution to potential abuse of high-memory scripts, the following pay-per-script scheme is proposed:

  • A creator pays 500 L$ for a 128-kb script to emerge in their inventory (L$ 2000 for a 256-kb script, etc?).
  • This script remains No Trans to the creator (albeit Copy and Mod) until a 'finalize' feature is activated by menu or check box and confirmation dialog.
  • Upon 'finalize', the script becomes Copy/Trans, yet No Mod to the creator and anyone else. The creator can set to either No Copy or No Trans.
  • Optionally, limit this feature to Premium members.

The new JIRA is here: https://jira.secondlife.com/browse/BUG-226311

Any suggestions for or against it in this thread (since not every scripter visits the JIRA site on a regular basis)?

Clearly I'm not thinking straight but I don't understand this dance with the permissions.

I mean, I pay my L$500 for a 128-kb script to appear in my inventory,  called New Script (128 kb) or whatever, which is fully copy and mod.   So I take a copy of it, save that as My Big Script 1, and write my script.   I choose "Finalise" and it becomes Copy/Trans but no Mod.    Meanwhile, I've still got my original New Script (128 kb) fully mod and copy, so I take another copy, save it as My Big Script 2, delete all the content and write a completely different script, finalise that, and so on.

Unless I've missed something, unless I mess up and finalise my original New Script (128 kb) without having a backup, I only pay the one fee ever.   So what's the point of messing with the perms?    

I think I must have misunderstood something here.

Edited by Innula Zenovka
  • Like 1
  • Thanks 1

4 hours ago, Innula Zenovka said:

Meanwhile, I've still got my original New Script (128 kb) fully mod and copy, so I take another copy, save it as My Big Script 2, delete all the content and write a completely different script, finalise that, and so on.

You got a valuable point there. Upon finalize, all copies and forks of that first emerged script must become either No Mod or non-finalizable (and thus remain No Trans), except for that one finalized version, or the whole point of pay per script won't work.

Or, alternatively, do the transaction every time the script gets finalized, instead of paying to emerge a script. In fact, that's a better alternative, making the whole process simpler as well. I'm adjusting the proposal.

  • Checking a 128 kB checkbox on the script window enables 128-kb and turns the script No Trans to the creator.
  • Clicking a 'Finalize' button on the script window brings up a payment dialog.
  • Payment irreversibly turns the script No Mod/Copy/Trans to the creator.
Edited by Arduenn Schwartzman

3 hours ago, Arduenn Schwartzman said:

I'm pretty sure you can think of ways to not lose the code.

FYI, LL only intended an increase to 32kb. During testing it was found that certain 16kb LSL2 scripts were using four times the memory when converted to Mono. LL decided that, in order not to break those scripts, the increase would have to be 64kb. Now, seeing as LL never wanted 64kb in the first place but had no choice, there is zero chance of them increasing it to 128kb. That information is somewhere in the wiki. By a quirk of fate we have 64kb scripts, and we should be thankful for that quirk, because otherwise we would only have 32kb scripts.


Just now, Wulfie Reanimator said:

Is this documented anywhere?

Look in the wiki. I read it a few years ago and am not going to trudge through it again. Even a fast scan tells you this ' In some extreme cases Mono scripts can use up to four times the memory as LSL2 scripts. To maintain backwards compatibility, the script size limit has been increased from 16KB to 64KB. '


3 minutes ago, steph Arnott said:

Look in the wiki. I read it a few years ago and am not going to trudge through it again. Even a fast scan tells you this ' In some extreme cases Mono scripts can use up to four times the memory as LSL2 scripts. To maintain backwards compatibility, the script size limit has been increased from 16KB to 64KB. '

I know it says that, but I haven't ever seen anything saying that they wanted only 32KB.

Edited by Wulfie Reanimator
