Can we talk about increasing script memory from 64KB?


Coffee Pancake


5 minutes ago, Coffee Pancake said:

As per the title: script memory being capped at 64KB is arbitrary, and it's half the reason many complex scripted objects end up running a handful of scripts.

I suppose "we" can, but how will this initiate a change?

Can we also talk about adding another language to Second Life to be used in addition to LSL?  I used to love the heated debates between residents about what language should be added.

How much 'script memory' would you suggest LSL have access to?

What language would you like to see added and how much memory should it have access to?

(Or shut me down by saying "No additional languages to be discussed in my thread!")


If you'll allow me to be the devil's advocate, it needs to be capped somewhere, and 64kb seems fine unless you're doing something that involves large lists of strings.

If there were no cap, imagine the amount of memory even a novice scripter could use up (possibly by accident) all on LL's dime.


1 hour ago, Quistess Alpha said:

If you'll allow me to be the devil's advocate, it needs to be capped somewhere, and 64kb seems fine unless you're doing something that involves large lists of strings.

If there were no cap, imagine the amount of memory even a novice scripter could use up (possibly by accident) all on LL's dime.

Possible solution: Current and future scripts stay 64KB by default, but llSetMemoryLimit can be used to increase the size up to some new maximum, such as 128KB. (Seems like a safe increase as a first experiment, and we'll be able to know when scripts might be using more memory than normal.) This way, most scripters won't be using any more memory than usual.

I'm all for increasing memory capacity somehow (I wish KVP wasn't Premium), but what I'd be concerned about is "what if something does go wrong?" And I don't mean things like bugs, I mean long-term real use-cases. If it does start causing significant degradation of sim performance across the grid and LL is unwilling or unable to improve their hardware with Amazon, going back to 64KB is equivalent to pulling out a barbed arrow. It's gonna hurt more coming out than going in.
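The opt-in idea above could look something like the following. llSetMemoryLimit is a real LSL function today, but it can only lower a Mono script's ceiling; the 131072-byte (128KB) argument is the hypothetical part of this sketch.

```lsl
// Sketch of the proposed opt-in. llSetMemoryLimit exists today but only
// accepts values up to 65536; passing 131072 (128KB) is hypothetical.
default
{
    state_entry()
    {
        if (llSetMemoryLimit(131072)) // hypothetical opt-in above 64KB
        {
            llOwnerSay("Opted in to 128KB; free memory: "
                + (string)llGetFreeMemory() + " bytes.");
        }
        else
        {
            llOwnerSay("Larger limit refused; still capped at 64KB.");
        }
    }
}
```

Because the default would stay at 64KB, only scripts that explicitly opt in would ever consume more, which is what would keep the change safe for the rest of the grid.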

Edited by Wulfie Reanimator

2 hours ago, Ardy Lay said:

What language would you like to see added and how much memory should it have access to?

(Or shut me down by saying "No additional languages to be discussed in my thread!")

I think that's better in its own thread, to be honest.


More memory is always pretty nice.

Might also wish for some efficient shared memory between scripts in the same object, e.g. allowing one script to "export" a read-only list to other parts of a script set, to avoid tons of messages and duplication back and forth. That could simplify some things even if the 64KB limit of writeable memory per script survives.
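There's no shared-memory primitive in LSL today, but the "exported read-only list" idea can be approximated with a holder script that answers indexed requests over link messages. This is only a sketch; the channel number and the send-an-index, get-one-vector protocol are arbitrary choices.

```lsl
// Hypothetical pattern: one "holder" script serves a read-only list to
// sibling scripts over link messages. CHANNEL and the request protocol
// (send an index, get one vector back) are arbitrary choices for this sketch.
integer CHANNEL = -77;
list gWaypoints = [<10.0,10.0,20.0>, <12.0,14.0,20.0>, <15.0,18.0,21.0>];

default
{
    link_message(integer sender, integer num, string msg, key id)
    {
        if (num != CHANNEL) return;
        integer i = (integer)msg; // requester sends the index it wants
        llMessageLinked(sender, CHANNEL + 1,
            (string)llList2Vector(gWaypoints, i), id);
    }
}
```

A requester would call `llMessageLinked(LINK_SET, -77, "2", NULL_KEY)` and read the reply on channel -76; the data still gets copied per request, which is exactly the overhead real shared memory would remove.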

 


3 minutes ago, Profaitchikenz Haiku said:

Is increasing script memory going to have an effect on Sim-crossings and TPs?

Almost certainly, due to the process of how scripts are transferred between sims (and why LSO scripts are much faster than Mono scripts for that).

Edited by Wulfie Reanimator

13 minutes ago, Wulfie Reanimator said:

and why LSO scripts are much faster than Mono scripts for that

Are they faster just because they're a quarter the size of Mono? Or is there some other aspect to the way Mono scripts are loaded that is the reason?

It seems to me that the need for larger scripts (and I accept there is a case for wanting more memory) is mostly for storing data, and in this case, perhaps there could be a more efficient way of storing data?


If the cap were raised, we would find a way to fill it up and still not have enough room for everything we might want to do.

If I wanted anything, I would want KVP to be grid-wide in scope, please. Just the KVP, thanks. I can live without the experience permissions system being grid-wide.

 


9 minutes ago, Profaitchikenz Haiku said:

perhaps there could be a more efficient way of storing data?

Capital idea!  How about we store data in a STATIONARY SCRIPT on a STATIONARY SCRIPT SERVER and provide some communication mechanism between stationary scripts and the scripts we have, wherever they may roam.

But, here's the deal:  Any new storage mechanism dreamt up must also have an expiration system whereby the owner must actively maintain it or it goes static, then later, evaporates.


I'd like to have the stack memory counted separately from code and heap. Large incoming messages, and some LSL calls, can briefly use up large amounts of stack and cause a stack-heap collision. Scripts don't get a chance to discard or reject or break up or process a big incoming message before it overflows memory. This makes some scripts unreliable. So I'd like to have 64K of code and data, plus up to 64K of stack. Stack memory is temporary; when you're waiting for an event, there's no stack space in active use, although space may be allocated.
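To illustrate the point: by the time an http_response event fires, the whole body is already in script memory, so the script never gets a chance to reject it. The only real defense today is capping the response size up front; a sketch, where the URL and the 2048-byte cap are placeholder values:

```lsl
// By the time http_response runs, the body is already counted against the
// script's memory; it cannot be rejected after the fact. Capping the size
// up front is the only defense. example.com and 2048 are placeholders.
key gReq;

default
{
    touch_start(integer n)
    {
        gReq = llHTTPRequest("https://example.com/data",
            [HTTP_BODY_MAXLENGTH, 2048], ""); // truncate anything larger
    }
    http_response(key id, integer status, list meta, string body)
    {
        if (id != gReq) return;
        llOwnerSay("Received " + (string)llStringLength(body) + " bytes; "
            + (string)llGetFreeMemory() + " bytes free.");
    }
}
```

A separate stack allowance would make this kind of defensive truncation unnecessary for most scripts.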

As for larger scripts, that's a problem with technical debt. It recently slipped out at a user group meeting that the server processes still run in 32-bit mode, so a sim can't use more than 4GB of memory. That's a hard ceiling on sim memory. Understand, in the Linux world (the servers run Linux) that's rare: just about everything that runs on Linux is 64-bit today, and you have to download old libraries to even build 32-bit programs on Linux.


1 minute ago, Mollymews said:

KVP

Maybe this is a solved issue (as many times as I've thought about it, I've never actually used KVP), but as long as we're dreaming, I'd like a better way to segment the KVP database for different projects. Something like domain+key+value, and an easy way to get only the keys associated with a given domain, or even just something like

llSearchKeysValue(string query, integer first, integer count);

which would fetch the names of keys whose name starts with query.


Just now, Ardy Lay said:

How about we store data in a STATIONARY SCRIPT on a STATIONARY SCRIPT SERVER

That would obviously work, but I was thinking more along the lines of speeding up notecard reads, i.e. finding a way of improving what's already implemented.

My thinking is that whilst scripts are server-side and possibly need re-compiling with each TP because they're going to a different server, notecards just need to transit as an asset with no additional work required during the handover.


17 minutes ago, Profaitchikenz Haiku said:

more efficient way of storing data?

Amazon DynamoDB, perhaps. It's like KVP, but you can store all the data you can pay for, terabytes if necessary; the first 25GB is free. Someone who needs that should write LSL code to access AWS DynamoDB. It talks HTTP and JSON, so that's not too tough.
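A rough sketch of what the in-world side might look like. DynamoDB's API requires AWS Signature Version 4 signing, which is impractical to implement in LSL, so this assumes a hypothetical HTTPS proxy that adds the signature and forwards the JSON body to DynamoDB's GetItem action; the URL, table, and key names are all placeholders.

```lsl
// Sketch only: DynamoDB requires AWS SigV4 request signing, which is not
// practical in LSL, so this assumes a hypothetical proxy at PROXY_URL
// that signs and forwards the JSON body to DynamoDB's GetItem action.
string PROXY_URL = "https://example.com/dynamo"; // hypothetical endpoint
key gRequest;

default
{
    touch_start(integer n)
    {
        string body = llList2Json(JSON_OBJECT, [
            "TableName", "Waypoints",
            "Key", llList2Json(JSON_OBJECT, [
                "Route", llList2Json(JSON_OBJECT, ["S", "circuit-1"])])]);
        gRequest = llHTTPRequest(PROXY_URL,
            [HTTP_METHOD, "POST", HTTP_MIMETYPE, "application/json"], body);
    }
    http_response(key id, integer status, list meta, string resp)
    {
        if (id != gRequest) return;
        if (status == 200) llOwnerSay("Item: " + resp);
        else llOwnerSay("Failed or timed out (status " + (string)status + ").");
    }
}
```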


6 minutes ago, animats said:

It talks HTTP and JSON, so that's not too tough.

But would it be fast enough? Imagine you have position and rotation coordinates stored for a journey. In a script you'd probably use link messages to get the next sets, and the delay is trivial. If you read them from a notecard there are the inherent delays of llReadNotecardLine, but if you have to make a call to an outside web service there are not just the internet transits but also the question of what to do if the request never gets answered.

I realise the other inherent problem with notecards is that they too are capped at an upper limit, but as they don't need to be bytecoded or thrown across the wires by HTTP requests, they do seem the most convenient option to look at for better data access.


2 minutes ago, Profaitchikenz Haiku said:

But would it be fast enough? Imagine you have position and rotation coordinates stored for a journey. In a script you'd probably use link messages to get the next sets, and the delay is trivial. If you read them from a notecard there are the inherent delays of llReadNotecardLine, but if you have to make a call to an outside web service there are not just the internet transits but also the question of what to do if the request never gets answered.

I realise the other inherent problem with notecards is that they too are capped at an upper limit, but as they don't need to be bytecoded or thrown across the wires by HTTP requests, they do seem the most convenient option to look at for better data access.

Does your journey have any stops? How about loading up on waypoints while at a stop, zipping through them from memory, then doing it again at the next stop?


16 minutes ago, Ardy Lay said:

Does your journey have any stops? How about loading up on waypoints while at a stop, zipping through them from memory, then doing it again at the next stop?

My NPCs actually do that sort of thing, with coarse path planning, fine path planning, and path execution all in different scripts, running concurrently. They give the illusion of being real-time, but are really executing plans made a few seconds earlier. If anything goes wrong, they stop and stand, arms folded, while replanning takes place. All this is to be able to handle overloaded sims with 64K scripts. Huge headache to write and debug.


14 minutes ago, Profaitchikenz Haiku said:

That would obviously work, but I was thinking more along the lines of speeding up notecard reads, i.e. finding a way of improving what's already implemented.

My thinking is that whilst scripts are server-side and possibly need re-compiling with each TP because they're going to a different server, notecards just need to transit as an asset with no additional work required during the handover.

I vote for this. Writeable notecards. Then we could have as much persistent read/write data storage as we can stuff notecards into object contents. Something like:

 

integer result = llWriteNotecardLine(nameofnotecard, data, linenumber); // overwrites the existing line, or appends if linenumber is >= EOF

if (result == -1) llOwnerSay(nameofnotecard + " write fail. Probable cause: out of memory.");

if (result >= 0) llOwnerSay("line number written to: " + (string)result);

... and probably as well

integer availablememory = llGetNotecardMemory(nameofnotecard);

if (availablememory > 256) result = llWriteNotecardLine(nameofnotecard, somedatalessthan256bytes, linenumber);

... and also

integer result = llDeleteNotecardLines(nameofnotecard, beginline, endline); // assuming the delete call returns the notecard's remaining free memory

llOwnerSay("available memory is: " + (string)result);

 


3 minutes ago, Mollymews said:

I vote for this. Writeable notecards. Then we could have as much persistent read/write data storage as we can stuff notecards into object contents.

Isn't every edited notecard a new notecard, and we just hope and pray LL does garbage collection?


58 minutes ago, Quistess Alpha said:

Maybe this is a solved issue (as many times as I've thought about it, I've never actually used KVP), but as long as we're dreaming, I'd like a better way to segment the KVP database for different projects. Something like domain+key+value, and an easy way to get only the keys associated with a given domain, or even just something like

llSearchKeysValue(string query, integer first, integer count);

 

the way this is typically scripted is

string DOMAIN = "MyApp";

string q = DOMAIN + "key1";
q = DOMAIN + "key2";

.. and so on

... and in another app:

string DOMAIN = "MyOtherApp";

string q = DOMAIN + "key1";
q = DOMAIN + "key2";

 


13 minutes ago, Mollymews said:

the way this is typically scripted is

My point is that all of the apps need to know beforehand what 'key1' and 'key2' are. You can't easily create a key in script A and expect it to be discoverable in script B, unless script B iterates over every single key in the database (which might be fine for a database only used by one application, but would otherwise be infeasible for large databases).

If you're dynamically adding keys, you might like to ask questions like: How many keys are there in this domain? Which keys in the database are relevant to this application? You could of course have more keys which store the answers to these questions, but that could get a bit messy to keep accurate.

Edited by Quistess Alpha

1 hour ago, Profaitchikenz Haiku said:

Are they faster just because they're a quarter the size of Mono? Or is there some other aspect to the way Mono scripts are loaded that is the reason?

There is a distinct difference in the way LSO and Mono scripts are handled.

LSO scripts always take up 16KB of memory, even if there are no variables or events; they are allocated the full 16KB regardless.

Mono scripts have dynamic memory. While the maximum capacity is 64KB, a script will only take up a fraction of that in most cases.

This is significant when a script needs to be transferred from one sim to the next. To do that, the current sim needs to figure out the script's current size so it can be serialized for transfer (don't forget bytecode sharing and heap/stack memory). For LSO scripts that's very easy, since they're always guaranteed to be exactly 16KB; for Mono the sim has to work out how much dynamic memory is actually in use. The destination sim has even more work, since it has to receive that information, allocate the space, and then initialize those scripts. Mono scripts are very slow to initialize and start up compared to LSO, and this can easily be observed in your day-to-day interaction with scripts.

1 hour ago, Profaitchikenz Haiku said:

It seems to me that the need for larger scripts (and I accept there is a case for wanting more memory) is mostly for storing data, and in this case, perhaps there could be a more efficient way of storing data?

No, there are quite a few things that are memory intensive without doing any long-term storage. HTTP requests, large llSetLinkPrimitiveParamsFast calls, multiple raycasts (I'm sure @animats's NPCs would benefit from more memory), general data processing, etc.

Edited by Wulfie Reanimator

3 minutes ago, Quistess Alpha said:

My point is that all of the apps need to know beforehand what 'key1' and 'key2' are. You can't easily create a key in script A and expect it to be discoverable in script B, unless script B iterates over every single key in the database (which might be fine for a database only used by one application, but would otherwise be infeasible for large databases).

We can test for the presence of a key with llReadKeyValue: http://wiki.secondlife.com/wiki/LlReadKeyValue

If the key doesn't exist, it returns an error.
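A minimal sketch of that probe pattern. The script must be associated with an Experience, and "MyApp:color" is a hypothetical domain-prefixed key; llReadKeyValue returns a request handle, and the dataserver event delivers "1," followed by the value on success, or "0," followed by an error code on failure.

```lsl
// Probe for a domain-prefixed key; requires an Experience-enabled script.
// "MyApp:color" is a hypothetical key name for this sketch.
key gQuery;

default
{
    state_entry()
    {
        gQuery = llReadKeyValue("MyApp:color");
    }
    dataserver(key id, string data)
    {
        if (id != gQuery) return;
        if (llGetSubString(data, 0, 0) == "1")
            llOwnerSay("Key exists, value: " + llGetSubString(data, 2, -1));
        else
            llOwnerSay("Key missing, error: " + llGetSubString(data, 2, -1));
    }
}
```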

