
Server Bake Code


Patrick032986




solstyse wrote:

 

What I mean by "bigger than SL can display" is exactly what you linked to. If the maximum for clothing is 512, then anything bigger is, well, bigger. And what is a maximum, if not a specified limit?

Now, I have uploaded 1024 textures for clothing, when the max is 512. I just forgot to shrink the template before saving the .png. But here's where it gets interesting. A full viewer, such as Firestorm or the official viewer, just displays the item as if I hadn't made that mistake, but a cut-down viewer, like Lumiya, doesn't scale the texture down. It just refuses to show the item.

Those limits mean exactly that: they are the maximum texture sizes for an item.

For example:

 

1. You upload a shirt texture as 1024×1024. Will SL show this shirt on the avatar as a 1024×1024 texture? No, it won't. When the shirt texture is applied to the avatar it is reduced to 512×512, which is the maximum size for a clothing item. So you cannot accidentally apply too large a texture to the avatar. If you upload too big a texture for an item, the system takes care of it and reduces it to the proper size.

2. You upload a texture as 2048×2048 that you intend to use on a prim. Will the texture show as 2048×2048 on the prim? Once again, no, it won't. Because the maximum texture size is 1024×1024, the system will reduce the oversized texture to the allowed size.

If the Lumiya viewer has a problem with that, then there must be a bug in the viewer; it does not work as it should.
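To make the clamping behaviour described above concrete, here is a minimal sketch in Python with Pillow (not LL's actual upload or bake code; the function name, the limits table, and the choice of resampling filter are illustrative assumptions):

```python
# Minimal sketch (assumed behaviour, not LL's code): clamp an uploaded image
# to the maximum size allowed for its use, as described in the post above.
from PIL import Image

MAX_SIZE = {"prim": 1024, "clothing": 512}  # limits quoted in the thread

def clamp_texture(path: str, use: str) -> Image.Image:
    """Downscale an image so neither side exceeds the limit for its use."""
    img = Image.open(path)
    limit = MAX_SIZE[use]
    if max(img.size) > limit:
        scale = limit / max(img.size)
        new_size = (max(1, round(img.width * scale)), max(1, round(img.height * scale)))
        img = img.resize(new_size, Image.LANCZOS)
    return img

# A 2048x2048 upload intended for a prim comes out 1024x1024;
# a 1024x1024 shirt texture ends up 512x512 when used as clothing.
```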



Coby Foden wrote:

2. You upload a texture as 2048×2048 that you intend to use on a prim. Will the texture show as 2048×2048 on the prim? Once again, no, it won't. Because the maximum texture size is 1024×1024, the system will reduce the oversized texture to the allowed size.


 

This doesn't work the way you think it works. The 2048x2048 texture is resized *at upload* to 1024x1024, not when applied to the prim.

 

But the point being, clothing isn't the 'do all, be all' of SL. There are very good reasons to have textures for building that are 1024x1024. In fact, I wish 2048x2048 were still available, as it was up until 2008-2009, since there are certain instances where the larger dimensions are desirable. But c'est la vie.

 


I agree with you partly on the "why" of Windows 8's efficiency. I would also add that I think the flood of low-end laptops and the release of Atom-powered tablets have a lot to do with Microsoft's decisions too. Also, I think that the gaming crowd's desire to spend on hardware only once a decade is contributing significantly to the slowdown of x86 performance gains.

My software use, I'd say, is probably 50/50. I spend more time using my phone than my PC, and I work in a place that encourages that. We have multiple Wi-Fi networks and computers that are slower than our phones, so when someone needs a quick answer, we're actually told to use our phones. I also dual-boot my computer, so it's nice to have access to my stuff without rebooting when it's in a partition that I'm not using.

You and I disagree about the idea that making more happen server-side requires a complete rewrite, unless I'm misunderstanding you. I think over the course of years, almost nothing would be the same anyway. So as time goes on, if the migration to server-side is gradual, it wouldn't feel as extreme as what you're describing. In a way, if they just start making the new code server-side, then it'll happen eventually no matter what, just through natural evolution. I think compatibility is key in this economy.

Your definition of exodus is a bit more severe than mine. There was a while when people were saying that PC gaming was dead. Obviously, they were wrong. But there was a reason why they thought that. And from what I have seen, for the more "hardcore" games, consoles are still the weapon of choice. Now that the current generation is so old, computers have a chance. And gamers will never abandon the PC completely. I guess to me the move to consoles was significant enough to fit the word exodus, while to you it wasn't.

Typically, I think that computer costs should be about $125 per year in hardware, if you average it out. So if you spent $500, you should be happy with your performance for about 4 years. Of course, that isn't always perfect. And more of these laptops have the graphics right on the CPU, which doesn't help. Computer replacement should happen only if either you've tried everything else, or you're generally unsatisfied with it, not just for SL. And a wired connection isn't necessary; Wi-Fi technology has come a long way. The thing I despise, though, is those people who will just repeat the "upgrade your hardware" mantra and then berate the person who said they can't or won't. That is invariably followed by a lecture about the direction technology is heading. It's the same tired idiot script, and it tells me that the responder is more interested in flaming than in answering. That, and being misquoted. I tend to become a bit irrational when that happens.

 

As for what Lumiya does with 1024 textures on clothing: I actually use that "bug" to my advantage. After I make something, I'll log on with my phone to make sure I don't look naked, lol. Not that I make much, but I want what I make to be optimized. Now, if these textures were to, say, automatically be reduced at the time they're stored on the asset server, then they wouldn't have to be scaled down for delivery to the viewers every time. If LL can make the grid, they can make that an automated process. I don't think they can make everything perfect, just better. Because you're right: it's the people who make SL unique.



solstyse wrote:

Also, I think that the gaming crowd's desire to spend on hardware only once a decade is contributing significantly to the slowdown of x86 performance gains.

Who said the gaming crowd desires to only upgrade once a decade? If not needed, they would never upgrade; I know I wouldn't. But the reality is I buy a new computer every five or so years and often buy some minor hardware upgrades in between.


My software use, I'd say, is probably 50/50. I spend more time using my phone than my PC, and I work in a place that encourages that. We have multiple Wi-Fi networks and computers that are slower than our phones, so when someone needs a quick answer, we're actually told to use our phones.

A phone is faster than a PC in some cases. If both are turned off and you need to send some e-mail, a phone will be faster. When you need computing power, though, the phone only wins if you have a seriously fast phone and a seriously outdated computer. You can't compare the two.


You and I disagree about the idea that making more happen server-side requires a complete rewrite, unless I'm misunderstanding you. I think over the course of years, almost nothing would be the same anyway. So as time goes on, if the migration to server-side is gradual, it wouldn't feel as extreme as what you're describing. In a way, if they just start making the new code server-side, then it'll happen eventually no matter what, just through natural evolution. I think compatibility is key in this economy.

I don't see how you picture this "just start making new code server side". Some things can be done server side, some are preferably done server side, and some can only be done server side. We already have inventory storage, (most) script execution and all physics calculation done server side. That simply can't be done in a reasonable way at the user's end. Now LL wants to move texture baking to the servers. This doesn't need any real change in the architecture of SL. We already download textures. These textures are sent from the server, then on our end the ambient occlusion and shadows are baked in, for example. That can be done server side and we could download the baked texture. Not a big change really.

The real challenge for the end user's computer is the processing of textured geometry (texels/triangles). Moving this process to the servers isn't easy by any means, since unlike physics, inventory, textures etc., it's not shared between even two people. It certainly can't be done using the software (and LL hardware) currently available. Not only that, even the baking has some potential pitfalls. Moving this process to the servers reduces the load on (lower-end) systems, but also ignores the potential of high-end hardware. Baking in the shadows means draw distance won't affect shadows from objects in the sky anymore. If (for testing purposes) I raise my draw distance to 512 or 1024, every single skybox or platform will cast shadows on the ground, and it doesn't look pretty. Again, if the picture LL produces is as nice as the picture my PC can produce, I'm all for it. If it means lower quality, for me it's a big no-no.

What can't be done server side at all is producing the final image (pixels on screen). I probably overlooked or forgot something, but I think I pretty much covered what the viewer does at this point. The main point is that you can't simply move everything that's viewer side to the server by changing a couple of lines of code on both ends; secondly, most processes are already done server side.
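For a rough picture of what the "baking" step amounts to, here is a toy sketch (my own illustration under assumed simplifications, not LL's bake service; the file names and the fixed 512x512 size are taken from the discussion): the skin and clothing layers are alpha-composited into one texture, and that single result is what viewers need to download.

```python
# Toy illustration of avatar texture baking: composite skin and clothing
# layers into one 512x512 texture. Whether this runs in the viewer (current
# behaviour) or on a bake server (the planned change), the output is the same.
from PIL import Image

BAKE_SIZE = (512, 512)  # clothing bake size discussed in the thread

def bake_layers(layer_paths):
    """Alpha-composite texture layers, bottom to top, at the bake resolution."""
    baked = Image.new("RGBA", BAKE_SIZE, (0, 0, 0, 255))
    for path in layer_paths:
        layer = Image.open(path).convert("RGBA")
        if layer.size != BAKE_SIZE:          # oversized uploads are clamped here
            layer = layer.resize(BAKE_SIZE, Image.LANCZOS)
        baked = Image.alpha_composite(baked, layer)
    return baked

# bake_layers(["skin_upper.png", "shirt.png", "jacket.png"]).save("upper_bake.png")
```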


Your definition of exodus is a bit more severe than mine. There was a while when people were saying that PC gaming was dead. Obviously, they were wrong. But there was a reason why they thought that. And from what I have seen, for the more "hardcore" games, consoles are still the weapon of choice. Now that the current generation is so old, computers have a chance. And gamers will never abandon the PC completely. I guess to me the move to consoles was significant enough to fit the word exodus, while to you it wasn't.

Ok, that's clear. Maybe you should call it "a significant shift years and years ago" rather than an exodus, which implies something huge and ongoing.


Typically, I think that computer costs should be about $125 per year in hardware, if you average it out. So if you spent $500, you should be happy with your performance for about 4 years. Of course, that isn't always perfect. And more of these laptops have the graphics right on the CPU, which doesn't help. Computer replacement should happen only if either you've tried everything else, or you're generally unsatisfied with it, not just for SL. And a wired connection isn't necessary; Wi-Fi technology has come a long way. The thing I despise, though, is those people who will just repeat the "upgrade your hardware" mantra and then berate the person who said they can't or won't. That is invariably followed by a lecture about the direction technology is heading. It's the same tired idiot script, and it tells me that the responder is more interested in flaming than in answering. That, and being misquoted. I tend to become a bit irrational when that happens.

What your personal opinion on computer cost is (or mine, for that matter) is rather irrelevant. Anyway, if you spend $500 on a desktop, it will probably last you 4 years without any issues; if you spend that on a laptop it might not, depending on what you use it for. People should think before they buy a computer. I understand lots of people already have a computer when they start using SL, of course. LL should keep their minimum requirements low to keep their potential market as wide as possible. As I said before, the recommended requirements are really low compared to any other real-time rendering program I know.

 

 


As for what Lumiya does with 1024 textures on clothing: I actually use that "bug" to my advantage. After I make something, I'll log on with my phone to make sure I don't look naked, lol. Not that I make much, but I want what I make to be optimized. Now, if these textures were to, say, automatically be reduced at the time they're stored on the asset server, then they wouldn't have to be scaled down for delivery to the viewers every time. If LL can make the grid, they can make that an automated process. I don't think they can make everything perfect, just better. Because you're right: it's the people who make SL unique.

The textures you see on an avatar are already stored on a server; they are stored on the server running the region. Nobody will ever see three 1024x1024 or 256x256 textures on an avatar, or even download them from the server. They are always 512x512. The server sends these 512x512 textures to all viewers, I suspect even to your own viewer, which would explain the delay in your body sharpening after a texture rebake.



Darien Caldwell wrote:


Coby Foden wrote:

2. You upload a texture as 2048×2048 that you intend to use on a prim. Will the texture show as 2048×2048 on the prim? Once again, no, it won't. Because the maximum texture size is 1024×1024, the system will reduce the oversized texture to the allowed size.


This doesn't work the way you think it works. The 2048x2048 texture is resized *at upload* to 1024x1024, not when applied to the prim.

I think my wording wasn't clear, unfortunately leading to misunderstanding.  I didn't actually say that the 2048×2048 texture will be resized to 1024×1024 when the texture is applied to the prim.

Naturally, as the system does not allow textures bigger than 1024×1024, it will indeed resize oversized textures to 1024×1024 at upload.

 

 

However, for avatar textures, what happens is this:

 

For example, we make a shirt texture at 1024×1024. We upload it. The system will happily accept it as it is; the system has no idea at all that it is a shirt texture.

Ok, now we apply that 1024×1024 texture to the avatar. The system will mumble to itself *hey, that's too big for that purpose*. Then it will resize the texture to 512×512 (because that is the allowed size for that purpose) and apply it to the avatar.

My understanding is (from what I have read) that on the server the shirt texture is still 1024×1024. So every time that shirt texture is applied to the avatar, the system has to do that downsizing (causing unnecessary extra work and lag). Therefore uploading avatar textures in the wrong size should be avoided.



Coby Foden wrote:

 

My understanding is (from what I have read) that on the server the shirt texture is still 1024×1024. So every time that shirt texture is applied to the avatar, the system has to do that downsizing (causing unnecessary extra work and lag). Therefore uploading avatar textures in the wrong size should be avoided.

True, as long as you don't confuse the original texture with the baked one. Both are stored on servers, but only the baked 512x512 one is sent to viewers as the avatar texture. Every time your texture is rebaked (whether you do it manually, change clothes, or possibly even when logging on or teleporting), the big texture has to be processed by your own viewer. It's not that big of a deal really, but it's unnecessary nonetheless.


Kwakkelde, I pretty much agree with everything in your latest post, except maybe for some minor details.

You're right about the gaming crowd. The age of the consoles that are out there is causing them to look again to PCs. It'll go back and forth like that for a while, I think. A new console comes out, and only the most dedicated stick to PC gaming. Consoles age, and people come back for the better hardware. The biggest advantage that the consoles have is standardized hardware. Upgrading was a constant thing before consoles became networked. Now, as you and I said, computers have stabilized enough that most people can be happy for 4-5 years on a $500 purchase. I'm of the opinion that software needs to be written to last the average lifespan of the machine that runs it. Thus, I think a computer should need to be at least 4-5 years old before it struggles with SL. We can blame marketing and manufacturers for some people having problems with computers that are 2-3 years old, but now it's LL who has to deal with their loss of satisfaction, either by letting those customers go or by using their coding skills to keep them.

And I also agree that whether a phone or PC is faster depends on circumstance. There are a few factors in this. One is the code. ARM devices typically use less code to perform the same function compared to x86. So, yeah, I'm fully aware that the hardware specs are still a long way off. But the efficiency of the code makes it feel like the divide is much smaller than it is. I think that in a big way, Microsoft is trying to address that with the more lightweight apps that are available for the first time in Windows 8.

It's common in America for business computers to be some of the oldest around, so yes, they are very outdated and very slow. Also, they use older software that is just "patched" for new functions, which of course leads to bugs and broken code. Obviously, tech companies are an exception. One of the biggest complaints of American workers was "My home computer is faster, cheaper, and more reliable." Now, we say the same about our phones.

In a major way, the problem is that x86 has the power advantage. But ARM has a very clear marketing advantage. Nobody complains about replacing their phone every two years, because phones are subsidized. But the average person now demands a minimum of 4 years from a PC. The reason is that phones are subsidized, while computers are not. Plus, x86 has already had its rapid growth spurt. That led to some sloppy code. Now that growth is slowing, x86 software can refine and mature into cleaner code. But this presents a problem to hardware manufacturers. Storage is virtually free. HDDs' dollar-to-gigabyte ratio is under $1/GB. With 3 TB drives available, and code becoming more efficient, the need for further development is no longer felt by the consumer. 2-7 free gigabytes of cloud storage are easy to get from multiple companies.

So the issue is that as consumers become capable of doing more and more on the hardware that their cellular provider practically gives away, they are going to demand cheaper and cheaper hardware. Even savvy consumers who account for the added power of the PC are going to expect that PC to have a longer and longer lifespan, simply because every dollar spent on a PC comes from their own pocket.

I honestly would double the timeframe per dollar amount that you've quoted. I expect that if you spend $500 on a computer, then after 4 years a laptop would be ready for replacement, while a desktop would be ready for a bit of upgrading. But everyone's personal experiences are different.

Much of what LL does is already server-side. And the recent change to character baking kind of hints that they're thinking the same thing I am. Maybe what I'm saying can be considered giving their new direction a vote of confidence?

Can I make a confession right now? I, like many people I now find myself arguing with frequently on these forums, used to feel that ARM was weak, that the cloud was just a bullshiz trend, and that the power of the PCs we currently use is necessary. But then I saw an OnLive system in action. Not at a show or anything; in my brother's living room. Here's a couple of links: http://www.onlive.com/game-system http://www.onlive.com/about What I saw was the "console", which is an ARM device about the size of a cigarette pack. The first time I saw it, my brother was using it to play Saints Row 2, or 3 maybe. One of those. And it was flawless. My mind was changed. The most impactful statement comes from http://www.gizmag.com/onlive-tv-cloud-game-system-announced/17084/ where the author says, "As the games are actually stored at Onlive's data centers, users should always have the latest version..... Another major advantage for players is that many of the hardware upgrades necessary for the frequently changing gaming landscape can be undertaken at the server end and not by the consumer."

What I'm calling OnLive isn't the inventor or genius behind the points I'm trying to make, but they are the "poster child, or example, or representative, mascot"... whichever word you find most fitting. These kinds of server-based services also make DirectX games available to OpenGL operating systems such as Linux, Apple, and even Android. That's why I say it's possible for LL to do the same thing. If a $99 cigarette-pack-sized device can play both Xbox and PS3 games, then why should we think that using SL on a $500 laptop should be considered impossible after only 4-5 years?

Kwakkelde, I would have disagreed too if I hadn't seen the things I'm referencing in action. It's been a fun debate, and I respect you as a person. I can vouch for all the links I've posted; I've seen these things in action. If you remain even slightly skeptical after reading the links I posted, then test these things for yourself. If you want to take me up on that challenge without spending $99 for an OnLive console, then download a distribution of Ubuntu, use Wubi to install it so that it doesn't harm your partition table, then try any of the clients that are mentioned in this video

Almost all of these are for Windows-native games playing in environments they weren't made for, and they're mostly cloud-based.

Sometimes, I think that we ignore what we're capable of, and instead limit ourselves to what others tell us we can do.

Interesting fact... I would have posted this from my Ubuntu 12.10 partition, but I need to prepare Windows 7 to be upgraded to Windows 8... all because someone told me that Windows 8 doesn't like SL. Let's all do what we're told can't be done. Let's all challenge ourselves. "It can't be done" is an answer that is against our nature to accept. Let us all refuse to be limited by what we're told. There is no such thing as impossible.


What that console does seems to be more or less what I described. Pretty much everything is done at the server's end, so all your box has to do is draw a picture and tell the server what you're up to. This means the hardware on such a thing can be very low-spec and very dedicated to one simple task and therefore cheap, like you say. It also means the software needs to be adapted to this system, which I have mentioned a couple of times now. That can't be a slow, steady change I think, as I explained before. It also means LL will have to make a huge investment in new hardware. Again, if LL could pull off such a system, I'd be all for it, as long as my SL experience doesn't suffer from it. I have no desire to run my SL on a 1500 euro computer with the looks and features of Lumiya.

It also means there is probably zero compatibility with other systems that don't share the same architecture. Like any console, the programs for it are very dedicated to one particular setup. This allows better performance, again with lower specs, but it also means it's pretty much useless for PCs, both low-end and high-end systems, since no two are the same.

So without trying it out (I'm not going to mess with this computer as it is my workstation and, frankly, I don't need to be convinced that the performance is good; I'll take your word on that), I can honestly say I remain very skeptical.

BTW, the phone market seems to be changing; over here in the Netherlands the subsidies are dropping at a fast rate. If anything costs money, it's a high-end smartphone. A typical subscription here is around 25 euros a month I think, so for two years that is 600 euros. Not exactly peanuts for a lot of people, and it probably won't get any cheaper. A smartphone just does other things than your average PC. A smartphone is also something people use in public; people want to show off their new gadget, and a four-year-old gadget doesn't really exist. Phone technology is improving at a pretty fast rate (the question is how long this will continue, which will depend on potential new functions) and therefore a phone depreciates quickly in desirability and functionality. I don't think cost is the leading factor here.

Oh, and $1/GB? That's for SSDs, not a 3TB disc. One of those would cost $3000 then, a bit steep :) A 3TB disc costs more in the region of 5 cents/GB. In the SSD market there's plenty of room for improvement. I'd love to have a 3TB SSD instead of a combo, but I'm not spending the $3000 I mentioned on it. Call me old-fashioned, but I'd rather have my personal files on my personal computer, not in a cloud somewhere. No digital database or system has been proven secure, quite the opposite. I understand any computer can be hacked and any CD can be stolen, but that would only happen by pure chance or by a very directed attempt, not by a big sweep and analysis of a complete database.


Kwakkelde, my personal feelings are much the same as yours. My cloud use is for things that reveal nothing about who I am. For anything sensitive, I'm much the same as you.

As for personal privacy, I don't do Facebook or Twitter. I separate SL from RL, etc. My views on how the internet should be handled are very old-fashioned. But I see a growing trend of people who disagree with me. I see people who think like me, and I suspect like you, as a shrinking demographic.

As for the OnLive system, I have gotten a bit of an education about how it, and other server-based systems, work. You're exactly right when you say, "Pretty much everything is done at the server's end, so all your box has to do is draw a picture and tell the server what you're up to. This means the hardware on such a thing can be very low-spec and very dedicated to one simple task and therefore cheap, like you say."

However, experience has shown me that you're wrong when you say, "It also means there is probably zero compatibility with other systems that don't share the same architecture. Like any console, the programs for it are very dedicated to one particular setup. This allows better performance, again with lower specs, but it also means it's pretty much useless for PCs, both low-end and high-end systems, since no two are the same." The whole setup is designed to allow programs that are built on completely different proprietary hardware (Xbox 360 vs PS3) to run on the same device. The "console" that I showed you is based on ARM architecture, but the subscription service is available both for the console and for PCs. Furthermore, in the video that I linked, DirectX games can be accessed from a Linux machine. Now, DirectX belongs to Microsoft and isn't available for Linux or Mac OS. But by using servers, users of both of those other OSes are capable of playing games that are written exclusively for Windows.
 

The bottom line is that the more "server side" a program is, the more compatible it is, with a greater range of hardware options and a greater range of operating system options. Without trying it, I can't blame you for your skepticism. I was a skeptic before I saw it in action.

The phone market seems to be in a state of constant change, much like the PC market used to be. But PCs have in a way matured, and their growth potential has slowed dramatically. There used to be a rule that whatever resources the hardware provided were used up at the same rate by software developers adding new features, so the consumer saw no boost in speed. And then there's the fact that Moore's law seems to have taken a shine to the newer architectures.

What I keep trying to say is that the weaker architecture of ARM is now growing exponentially, while x86 in its maturity is struggling to gain clock speed. The result, inevitably, is that the old "bloat" of x86 software is taking a trip to the proverbial gym, while ARM continues to become capable of running larger programs. Now, most of the ARM operating systems are based on the Linux kernel, and Linux is much more "lightweight" than Windows. What I see the market demanding in the coming years is the power of x86 with the efficiency of ARM. In other words, the best of both. x86 would be the clear winner for years to come, except that the cost of x86 falls purely on the wallet of the user, while ARM is paid for mainly by the provider.

Now, we're from two different countries, so I'm not sure how the phone market is handled in yours. And you made me curious about the dollar-to-euro exchange rate, but I'll look that up after I've posted. In the US, electronic devices often cost less to buy than they did to make. The catch is that you often pay more for use. For example, a printer often costs as much as the ink for the very same printer, and the included ink cartridges often hold less than the "replacement" cartridges. The Amazon Kindle is sold at a loss, because Amazon knows that if you have the device, you'll pay for content. Similarly, cell phone companies often sell the phones for less than they're worth, practically giving away all but the high end like what you described, because they know that they'll get that money back over the term of the contract, which is so itemized it's not funny. On my own contract, which includes three lines, the "middle" line in terms of usage is the most expensive to the provider, because it is the "primary" line. In other words, the contract prices go to the line that uses a middle-of-the-road amount of data. Of course, that is because there is a "voluntary lock" on the lowest phone, and the highest phone spends less time within 802.11 networks than the "middle primary" phone. I really think that a phone's growth will become more dependent on clock speed, as Canonical is trying to bring Ubuntu to the phone, Motorola screwed up the whole "phone desktop" thing, and Samsung continues their war with Apple with Microsoft in their sights.

I think Windows 8 is proof that Microsoft knows they need to do something drastic. And I think that the "ultrabook" line by Intel is telling the same story. I have faith that both companies will stay around for a long, long time, but I think that the reason is that they both seem to understand the market very well.

And yes, I said below $1/GB. That's because I see SSDs as the way of the future. HDDs are kind of archaic; they are the cheapest way of getting more storage than you need. And you're right, HDDs are far below what I've quoted. But really, I see local storage as an old-fashioned way of doing things now. 3 TB is way more than most people need. I don't even use the 500 GB of storage that came with my machine. With the efficiency upgrades I see coming to x86 architecture, I firmly believe that the demand for local storage will decrease, while demand for read/write speeds will greatly increase.


The compatibility I was talking about was not the potential compatibility between the two ends, being the server and the user. The more that's done at the server's end, the more compatibility of course, since more hardware is shared between all users. I'm sure the server code can be written to be compatible with iOS, MacOS, Windows, Xbox, PS, Linux, Android etc. In fact, I think it already is. We already have different viewers for Windows, Mac and Linux (and even a TPV for Android), so this shouldn't be an issue.

What is more important, I think, is the fact that the architecture of the hardware is completely different. For example, a PS3 game is optimised for the PS3. It makes use of its strong points and tries to avoid the weak points. I don't know all that much about the internals of any computer, but I do know that what works well on one machine does not work well on another. I also know "porting" a program from one platform to the next often results in a second-rate experience.

The most important thing in relation to all this is the fact that, in order to make such a platform a reality, the current server code will not work. Not only in a "code" way, but also in practical ways. A simple example: in order to make SL work on a current console, tablet, or phone, the texture load needs to be lowered, since it's simply too high. Lumiya takes care of this by lowering the maximum texture size (and draw distance). Lots of people use a 1024x1024 texture with text on it. If the texture shown in the viewer is lowered (by ignoring the highest part of the mipmap?) to 512x512 or even 256x256, you can't read the text. This means broken content, something LL tries to avoid as much as possible.
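As a concrete illustration of the mipmap point (a sketch only; the real viewer works with progressively decoded texture data rather than separate image files): building the chain of successively halved images shows why a 1024x1024 sign texture that is cut down to its 256x256 level is no longer readable.

```python
# Sketch of a mipmap chain: each level is half the previous one.
# A viewer that skips the top level(s) of a 1024x1024 text texture effectively
# shows the 512x512 or 256x256 level, where small lettering blurs away.
from PIL import Image

def mipmap_chain(path):
    """Return the image plus successively halved copies, down to 1x1."""
    levels = [Image.open(path)]
    while min(levels[-1].size) > 1:
        w, h = levels[-1].size
        levels.append(levels[-1].resize((max(1, w // 2), max(1, h // 2)), Image.LANCZOS))
    return levels

# levels = mipmap_chain("sign_with_text_1024.png")
# levels[0] is the full 1024x1024; levels[2] is the 256x256 a cut-down viewer might show.
```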

LL tries to bring a shared experience to us residents. The only way I see this being possible at this point is by either lowering the quality to the lowest thing out there, being the phone, or by doing server-side rendering, which means both software and hardware upgrades and, importantly, would demand a high-speed internet connection. I don't think LL has the resources for it, and even if they do, I think those can be put to better use in other places, at least for now. This doesn't mean LL shouldn't keep a very sharp eye on the market, of course.

I don't know anything about phone contracts in other countries, but here you usually get a phone with a two-year contract and a certain amount of (minutes and) data you can use every month, for a certain amount of money every month. Going over this limit results in extra costs. I'm sure this is where the phone companies make a good share of their money. Legislation, and customers getting a bit smarter, have pretty much put an end to data use outside the contract limit. Phone companies are in rough weather here, so costs will probably go up.

Where the "war" between the producers is going I don't know, but looking at the investments by the Korean state into Samsung, the balooned value of Apple shares and MS trying to catch up, it might be an interesting soap.

I don't have any illusions about privacy on the internet or privacy as a whole nowadays. The most personal information about me or any other person around here is stored in databases. I think medical and financial data are more personal than the files I have on my computer. I do what I can to minimise the potential damage I guess.


Oh, it WILL be an interesting year. That's for sure. When the new Samsungs come out with 8 cores, it'll be fun to see the ripple they make in the industry.

I guess what I've been trying to say is that LL should consider better support for voluntary downsampling, both to help those with aging x86 devices and those with the newer ARM devices. I know plenty of people who change their video settings frequently, so that they can walk around without feeling like they're getting nowhere, and so that they can up their graphics settings to take pics once they get where they're going.

If you combine that idea with automatically storing assets at the maximum resolution that the highest-end viewer will display, I think you would get your wish of keeping all the LL eye candy intact, while those on slower machines would get their wish of a faster-performing SL.

Using clothing as an example: if the top-of-the-line computers will always display it as 512, then it should reside on the asset server as 512. If a "lightweight" viewer can only display it at 256, then instead of ignoring the texture, what is to stop the lightweight viewer from downsizing a second time locally to display that texture at 256? For that matter, if the higher-end viewers (which means 99% of them) can interpret the 1024 texture as 512, then why not Lumiya? That must mean that the viewer is handling that locally on the user's machine. Now, if that was done server-side, preferably at the time of saving the object or item of clothing, I think even the high-end computers would see a benefit, without sacrificing any of the looks that you've grown to love. And it seems that a patch like that to how textures are handled wouldn't be too difficult for the people who invented the grid.

But then I wonder, if that's not exactly what they're preparing to do, why else would they change baking to server-side?
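To sketch the "downsize a second time locally" idea above (purely illustrative; the function name and the 256 limit are assumptions, not anything Lumiya or LL actually implements): a lightweight viewer could clamp whatever texture it receives to its own display maximum instead of refusing it.

```python
# Sketch of a viewer-side clamp: rather than refusing a texture bigger than
# the device limit, shrink whatever arrives to the device's own maximum.
from PIL import Image

def fit_to_device(texture: Image.Image, device_max: int = 256) -> Image.Image:
    """Clamp a received texture to what this device is willing to display."""
    if max(texture.size) <= device_max:
        return texture
    scale = device_max / max(texture.size)
    return texture.resize((max(1, round(texture.width * scale)),
                           max(1, round(texture.height * scale))), Image.LANCZOS)
```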



solstyse wrote:

I know plenty of people who change their video settings frequently, so that they can walk around without feeling like they're getting nowhere, and so that they can up their graphics settings to take pics once they get where they're going.


Exactly, this seems to work pretty well, so I don't see what LL should change. Users can make their own decision on their performance/quality ratio.


If you combine that idea with automatically storing assets at the maximum resolution that the highest-end viewer will display, I think you would get your wish of keeping all the LL eye candy intact, while those on slower machines would get their wish of a faster-performing SL.

People already have this in their own hands. Everything IS stored at a certain quality.


Using clothing as an example: if the top-of-the-line computers will always display it as 512, then it should reside on the asset server as 512. If a "lightweight" viewer can only display it at 256, then instead of ignoring the texture, what is to stop the lightweight viewer from downsizing a second time locally to display that texture at 256? For that matter, if the higher-end viewers (which means 99% of them) can interpret the 1024 texture as 512, then why not Lumiya? That must mean that the viewer is handling that locally on the user's machine. Now, if that was done server-side, preferably at the time of saving the object or item of clothing, I think even the high-end computers would see a benefit, without sacrificing any of the looks that you've grown to love. And it seems that a patch like that to how textures are handled wouldn't be too difficult for the people who invented the grid.

There's a difference between "clothing" and "textures". The textures that are used as clothing are stored at any supported resolution; clothing is always 512x512 on the server. The piece of code that resamples the original texture to 512x512 is used once, not before rendering, but before sending the combined clothing texture to the server. This piece of code, or something similar, could be used another time before rendering, but I'm not sure if that's a good idea. Instead of resampling one texture, the viewer would have to resample all textures. Especially on a relatively slow device such as a phone or tablet, this could take too much time.

What I think Lumiya does is ignore the highest part(s) of the mipmap. If you look at something in the distance, your viewer will download a downsampled version of the original texture, which is readily available. (That's at least what I picked up, that some sort of mipmapping is done server-side in SL.) As you come closer, your viewer will download the higher-quality ones. Changing the distance at which a certain size is downloaded to zero will ignore the higher resolution(s). The result, as I mentioned earlier, is that objects that really need the high-resolution textures (text, for example) won't show correctly.
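A rough sketch of the distance-based selection described above (my own illustration; the actual viewer heuristics, and whatever the server provides, are more involved than this): pick a level from the distance to the object, with level 0 as the full-resolution texture, and clamp the minimum level so a cut-down viewer never requests the largest sizes.

```python
# Toy sketch of distance-based texture level selection. Level 0 is the full
# 1024x1024 image, level 1 is 512x512, level 2 is 256x256, and so on.
# A "lightweight" viewer could clamp min_level to 1 or 2 to skip the big
# levels, at the cost of unreadable small text on signs.
import math

def mip_level_for_distance(distance_m, full_res=1024, base_distance=8.0,
                           min_level=0, max_level=5):
    """Every doubling of distance beyond base_distance drops one level."""
    if distance_m <= base_distance:
        level = 0
    else:
        level = int(math.log2(distance_m / base_distance)) + 1
    level = max(min_level, min(level, max_level))
    return level, full_res >> level

# mip_level_for_distance(4)               -> (0, 1024)  close up, full size
# mip_level_for_distance(40)              -> (3, 128)   far away
# mip_level_for_distance(4, min_level=2)  -> (2, 256)   lightweight clamp
```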

The missing 1024x1024 clothing texture in Lumiya has to be a bug. I can think of something like this: all textures that are rendered on screen are "downsampled" to something smaller than 1024x1024. I never used Lumiya, but the screenshots clearly show the "downsampling"; to what size I do not know. A 1024x1024 texture for your avatar is handled differently. It never reaches your screen, so it's processed by a different piece of code. The baking process in Lumiya probably can't handle the big texture and ignores it. Doing this baking server-side is exactly what LL is planning to do. It should result in a small performance gain for everyone. (And it means the bugged code in Lumiya becomes obsolete.)

