
Webserver Efficiency: Saving and Retrieving Data from an External Database


Artix Ruxton


Recommended Posts

Greetings all, I had a few questions revolving around "how much traffic is too much traffic?" when sending HTTP requests to a PHP script on a webserver with "unlimited bandwidth".

The implementation is for a combat system that handles player statistics. When a new version of the system is available, the old one detaches and a new one is sent out. On attach of the new one, the player's stats are loaded from the database.

Which brings me to the issue of stat-saving efficiency. Can I save a player's stats to the database every time they gain XP, say MULTIPLE TIMES IN A MINUTE? Or, if 50+ players are playing and gaining XP by the minute, would that put too much stress on the webserver and slow things down?

How many HTTP requests can you send to a PHP script in a short amount of time without overloading it? Imagine the web host is a service with "unlimited bandwidth", such as HostGator or Bluehost.
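To make that concrete, here is roughly the kind of call I have in mind (just an untested sketch; the save.php URL and the field names are placeholders, not a real endpoint):

// Rough sketch only - URL and field names are placeholders.
string SAVE_URL = "https://example.com/save.php";

save_stats(key player, integer xp)
{
    llHTTPRequest(SAVE_URL,
        [HTTP_METHOD, "POST",
         HTTP_MIMETYPE, "application/x-www-form-urlencoded"],
        "player=" + (string)player + "&xp=" + (string)xp);
}

default
{
    touch_start(integer num)
    {
        // stand-in for "player gained XP" - in the real system this could fire several times a minute
        save_stats(llDetectedKey(0), 10);
    }

    http_response(key id, integer status, list meta, string body)
    {
        if (status != 200) llOwnerSay("Save failed: " + (string)status);
    }
}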

 

Thanks,

Artix

 


Your first question simply cannot be answered, for a number of reasons:

  1. Which provider are you talking about?
  2. What kind of plan are you talking about (a free plan, a virtual server plan, a dedicated server plan)?
  3. Why do you think these providers use vague phrases? They don't want to, and can't really, define "too much".

Next point - and you're mixing things up here: they promise unlimited bandwidth because in most cases it's not the bandwidth that limits performance, but the computing power. You won't find much on that topic in the descriptions of their services. And in your case - sending and receiving text - bandwidth is no issue at all; imagine how many lines of text you can send for the price of a small image.

Processing power and speed are much trickier. In terms of a webserver just handing out simple HTML pages, a regular blade server should be able to serve hundreds of requests a minute. But you want an application - PHP plus a database - which is far more expensive than just serving a cached web page. To add to the complexity: how good is the database server, how many webservers share that database server, and how much strain are you putting on it?

Let's jump to the last question: I'm sure HostGator's dedicated server plan would be enough (much too much, actually) - I'm not so sure about their Hatchling Plan.

At the end of the day, if you're serious about this, you have to test - do simulations of the peak traffic you expect and run them at internet peak times (!), when the virtual hosts are likely to have their peaks as well. Or go for a package that is sure to serve your needs, which will be rather expensive.
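If you want a very rough feel for it from inside SL before paying for anything, a crude probe like this works (untested sketch; test.php is a placeholder, and one script only tells you about latency, not what happens when 50 scripts hammer the box at once):

// Crude latency probe - fires one GET per second for a minute and reports round-trip times.
string TEST_URL = "https://example.com/test.php";   // placeholder
integer sent;

default
{
    state_entry()
    {
        llSetTimerEvent(1.0);   // one request per second stays well inside the LSL HTTP throttle
    }

    timer()
    {
        llResetTime();                                   // time the round trip from here
        llHTTPRequest(TEST_URL, [HTTP_METHOD, "GET"], "");
        ++sent;
        if (sent >= 60) llSetTimerEvent(0.0);            // stop after a minute
    }

    http_response(key id, integer status, list meta, string body)
    {
        llOwnerSay("Status " + (string)status + ", round trip ~" + (string)llGetTime() + " s");
    }
}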


I pretty much agree with the first poster. The amount of text data you send from SL is minuscule compared to images or video. The bigger concern would be making sure the SL region can handle the traffic. According to the limits identified in the LSL wiki, your script can send 1 HTTP request every second with a maximum of 2k of data (including headers). So 50 avatars in a region sending one request every second would be 100 KB per second (that's bytes, not bits), assuming I did the math right - and that is at the upper limit of the load you described. Bandwidth won't be your issue; even with multiple regions full of avatars all sending requests at the maximum rate, you aren't going to be using very much.


It's a pretty common misconception, caused by the caveats attached to the events that both of those functions are associated with... the events have 2k limits, but the functions don't.

The reason it's noted on the function pages is that they are often used in conjunction with the events and with each other, so not knowing about the limit could cause problems in cases where they trigger each other.

ETA:
As darkie mentions, most servers have limits on URL length, although PUT/POST data usually doesn't see the same limits.
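For what it's worth, both points show up directly in the llHTTPRequest call: HTTP_BODY_MAXLENGTH raises the 2048-byte cap on the body handed to http_response (up to 16384 under Mono), and putting your data in the POST body keeps the URL short. Rough sketch only; the URL is a placeholder:

// Sketch - example.com/save.php is a placeholder.
default
{
    touch_start(integer n)
    {
        string payload = "lots of player stats here...";
        llHTTPRequest("https://example.com/save.php",
            [HTTP_METHOD, "POST",                  // data travels in the body, so the URL stays short
             HTTP_MIMETYPE, "application/x-www-form-urlencoded",
             HTTP_BODY_MAXLENGTH, 16384],          // raise the default 2048-byte cap on the response body
            "data=" + llEscapeURL(payload));
    }

    http_response(key id, integer status, list meta, string body)
    {
        // body may now be up to 16384 bytes instead of being truncated at 2048
        llOwnerSay((string)status + ": " + llGetSubString(body, 0, 200));
    }
}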


Even for the POST method there is a limit on the size of the body, but it is generally huge compared to the size limit on URLs.

 

For instance, for an Apache web server the limit is declared with the directive

LimitRequestBody

inside your config file httpd.conf.

Check the other parameters too:

#LimitRequestLine: Limit on bytes in the Request-Line (Method + URI + HTTP-version).
#Maximum value set by DEFAULT_LIMIT_REQUEST_LINE in httpd.h (8190 bytes)

#LimitRequestFieldSize: Limit on bytes in any one header field.
#Maximum value set by DEFAULT_LIMIT_REQUEST_FIELDSIZE in httpd.h (8190 bytes)

#LimitRequestBody: Limit on bytes in the request body.

#LimitRequestFields: Limit on the number of request header fields.
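If you ever need to change them, it is just a matter of adding the directives to httpd.conf; the values below are purely illustrative, not recommendations:

LimitRequestLine 8190
LimitRequestFieldSize 8190
LimitRequestFields 100
LimitRequestBody 102400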

The application server can also limit the size of requests.

For instance, for a PHP application server, the limits are declared inside your php.ini config file with these parameters:

upload_max_filesize = 2M
post_max_size = 2M

Check your web hosting provider's documentation to find out the limits.

 

Anyway, you will be limited by the size of your string inside the LSL script: 64 KB of script memory in Mono with 16-bit characters, so at most about 32 K characters, less the size of your code and the other data you use.
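A rough way to keep an eye on that from inside the script (sketch only; the 2048-character and 4096-byte thresholds below are arbitrary, not LSL constants):

// Warn before the outgoing payload gets anywhere near the string/memory ceiling.
string payload;

default
{
    touch_start(integer n)
    {
        payload += "some more stats;";   // stand-in for real data accumulation
        if (llStringLength(payload) > 2048 || llGetFreeMemory() < 4096)
        {
            llOwnerSay("Payload is " + (string)llStringLength(payload) + " chars with "
                       + (string)llGetFreeMemory() + " bytes free - time to send and clear it.");
        }
    }
}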

 

For the maximum number of active connections, you should also check your httpd.conf file if you are using Apache.

It is easy to modify, but often difficult to tune correctly.

For instance, for the Apache web server you may check these directives:

MaxKeepAliveRequests for static content

MaxClients for dynamic content

Nevertheless, there are many other directives you should check too.

 

A simple, common configuration on a local machine may handle 400 active requests if you are serving static documents (images, for instance), but only around 50 active requests for dynamic documents (PHP, JSP, ASP...) without hitting a timeout.

To go back to the first post: 50 connections per minute is easily handled; 50 connections per second could fail. It depends on how fast you generate the answer.
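For reference, these settings live in blocks like the one below; the numbers are roughly the stock Apache 2.2 prefork defaults and are shown only as a starting point, not as tuning advice:

<IfModule mpm_prefork_module>
    StartServers          5
    MinSpareServers       5
    MaxSpareServers      10
    MaxClients          150
    MaxRequestsPerChild   0
</IfModule>

KeepAlive On
MaxKeepAliveRequests 100
KeepAliveTimeout 5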

 

 

If you are using a database, you should also check its configuration. Generally, each database has a pool of connections. If you use that pool, fine, but you will be limited by its size. If you don't use the pool inside your PHP, JSP or ASP page and instead open a new connection inside the page, you will quickly run into failures and timeouts.


All of the default values, except request size (URL length), for any compile I've seen are well above the max that LSL can send (~60 KB)... the only limits to watch for in general are the 1-8 KB limit on URLs (I wouldn't push more than 1 KB), and whether the server can actually work with what you send (or get to it, in the case of LSL HTTP-in).

