I have fully explained this once before, I believe during the discussion of a Chromebook, where you injected a personal opinion about how they are not laptops.
Hey, look at this, I can make words bold too, like some kind of simplified wikiHow article on a topic such as dog jousting or model boat hobbyist hunting for dummies.
The point I made there was that I'm not looking at the requirements for "technically it runs," and I even stated there that I have run "technically it runs": I brought up a post I made a long while back where I ran below spec and at the stated system requirements, and said that while the game will "technically run," it is not enjoyable or comfortably playable. I even went to the extent of explaining that the system requirements page is entirely out of date, and that what is on paper considered technically playable should not be, because such an experience is not playable in the slightest (namely, a 6600GT, 1GB of RAM, and a dual core from 2008 struggling to get over 1 frame per second as the "minimum"). Technically it runs, but that should not be considered any kind of minimum. Never have I looked at someone's ages-old A4 APU or a Core 2 Duo laptop, gone "it will play," and left it at that; I always make sure to mention that while the game will start, it will not be enjoyable.
In the same sense, when that person started talking about their Chromebook and how to run SL on it, instead of immediately replying with something like "a Chromebook is not a laptop" and giving them none of the information they actually wanted, I told them a way to do it and warned them that it could be a very unpleasant experience depending on what hardware was in their Chromebook, and potentially not possible at all if they had an ARM model rather than an x86-based one. They later posted their machine's specs, and it would be capable of SL, likely at low settings, but far from unplayable, and far from the above below-minimum-spec PC with its whopping 5fps. A fairly modern low-voltage mobile quad core with 2 or 4GB of RAM would absolutely be able to play Second Life without it being a low-FPS nightmare at minimum settings. My Athlon 5350 APU is outclassed by the current generation of mobile Celerons, its IGP included, and that 5350 was doing me just fine for a while at 1080p on its own.
And if you had any idea what you were talking about, and weren't just replying to forum posts for the sake of argument and sharing personal opinions rather than providing useful and relevant information, you would have known this as well and been able to recognize the general capabilities of a modern Chromebook and share information gathered on the topic.
You've been responding to me pretty much every time we're in the same thread and reacting to many of my posts with that little "haha" face.
I normally respond to you when I see you type out the equivalent of a middle school book report in a discussion entirely unrelated to it.
This thread is a great example: in a discussion about 4k resolution and whether SL would ever be really playable at 4k, you came in and started going off about your personal opinion of 4k resolution and the technologies related to it. Instead of any relevant discussion, you went straight for personal opinion. You would make a great clickbait news article writer.
So let's go over this sentence by sentence, from your original footstep into this topic:
I doubt there was talk in 2003 about running SL at resolutions higher than what existed at the time, even for the super-high-end enthusiast. There were 1920x1080 workstation monitors, mainly SGI stuff, that were incredibly expensive, and most hardware couldn't display anything at that resolution, let alone actually game at it. The highest-end GPU of 2003, the FX 5950 Ultra, topped out at QXGA spec, 2048x1536. And that was just "it will drive the display"; it could not game at that resolution at all. That card was at its best at 1280x1024, the common resolution of the era, with 1600x1200 being, I guess, the ages-old equivalent of 1440p as we see it today.
But it wouldn't matter, because it's a non-issue from the very start. Dynamic resolution scaling and adjustment has been a thing for a very long time, and was a thing when SL was new. When you resize this game's window, the game adjusts. It does not crash, it does not freeze, the window is not locked to a specific size you have to adjust in-game; it supports any resolution you can make the window. And it's not like 4k has some special requirement: just like any resolution switch, the higher the resolution, the more demanding it is on the system. They would not need to plan for SL to run at 4k, because SL will run at any resolution you want. 8k, 16k? It'll do it. I'm sure at some point you would start to see FOV issues before you saw the game failing to run at a higher resolution.
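To illustrate the point, here is a minimal sketch (not Second Life's actual code, just the general pattern any resizable renderer follows): everything size-dependent is recomputed from the current window size, so there is nothing special about "4k support," and rendering cost simply scales with pixel count.

```python
# Hedged sketch of a resolution-agnostic renderer. The class and method
# names are hypothetical; the point is that no resolution is special-cased.

class Renderer:
    def __init__(self, width, height):
        self.resize(width, height)

    def resize(self, width, height):
        # Everything derived from the window size is rebuilt here;
        # the rest of the engine never cares what the numbers are.
        self.width = width
        self.height = height
        self.aspect = width / height
        self.pixels = width * height

    def relative_cost(self, base=(1920, 1080)):
        # Per-pixel rendering cost scales roughly with pixel count,
        # so "4k" (3840x2160) is about 4x the work of 1080p.
        return self.pixels / (base[0] * base[1])

r = Renderer(1920, 1080)
r.resize(3840, 2160)          # user drags the window out to 4k size
print(r.relative_cost())      # -> 4.0: four times the pixels of 1080p
```

The same arithmetic is why 8k or 16k "just works" too; the only thing that changes is how much work the GPU has to do per frame.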
If someone wants higher resolution, they should have higher resolution. That's the reason. It's a want, not a need. You can look at a 640x480 screen and go "wow, I really need a higher resolution to do what I'm doing." You do not NEED a higher resolution than 720p for 99% of things these days. Games will look OK, web pages will render OK, YouTube videos will look OK. But 1080p is nicer. 1440p is a bit excessive but still looks nicer. 4k is definitely beyond overkill and some people likely won't even notice it, but why not have it? There's no reason not to if you can afford it and your system can drive a display at that resolution.
Just to drop in as well: that's a personal opinion again. Those are your roommate's words, not the general consensus of the PC gaming community or anything close to it. Who cares if it's frivolous? I'm currently posting this from a PC with a GTX 970 in it, even though my monitor is 1680x1050 at 60Hz and nothing I could ever do on this PC would dip below 100fps at this resolution. But it's a quiet card. Frivolous: a want, not a need.
So it's reasonable if it's a big display and not a computer monitor? I can understand where you're coming from, that at a distance you would want a higher resolution rather than up close. But wouldn't that also raise the question of whether, from far away, you would even notice the difference between a 55" 1080p display and a 55" 4k display?
3D modelling is also one of the places where 4k is extremely beneficial: more stuff on your screen without your UI getting all scrunched up and low-res. If you have four or more applications open on one monitor, a higher resolution on that smaller display gives those applications much-needed clarity and makes them usable at small window sizes on an average computer monitor. Even with just one application, especially 3D modelling, where you have multiple viewports open at once and your actual workspace is fairly small, the monitor's high resolution makes everything much easier to see.
"Youre not going to use something that big for that, you're just not."
Reading this, I'm realizing there are two ways this whole phrase can be taken, so I'm going to cover both. Using a 55" 4k display as a monitor? A lot of people use large TVs as computer monitors, usually because they sit farther away or their PC is a multipurpose device. My parents regularly use the 65" 4k TV in their bedroom as both a Comcast/Amazon streaming device and a YouTube streaming device, and have a PC plugged into it to web browse and do old-people stuff from the couch.
Using 4k on a small screen is covered above.
Also, personal opinions once again.
Have you seen a 4k display? Have you gamed at 4k? There is a noticeable difference with each resolution jump. There's a reason it's a popular topic and people want to game at 4k: why there are consoles now promising 4k resolution, why TVs are marketed as "4k UHD" these days, and why there are movies and streaming services that offer 4k options. People notice the change in visuals.
"dont even attempt to argue" try me
4k is a thing because people like the increase in visual clarity and visible detail. It is noticeable; if you can't notice it, either something is wrong with your eyes or you just haven't been paying attention.
It irritates you that technology is advancing? Don't get me wrong, I'd love it if we had aesthetically peaked with our PCs in 2002: lots of UV plastic and aluminum, it looked really nice. But technology always moves forward and always gets better. We've gone from 1080p being the norm to 1440p and 4k becoming realistic resolutions that many people can game at. There will always be improvement, there will always be something better; everything in the world of technology has gotten better. Today's integrated graphics on 15W mobile processors are more powerful than the flagship GPUs of 2006-2008. The lowest-end desktop processor you can buy new today, the Athlon 200GE, beats the best consumer processor of 2008, the QX9775. Your power supply is drastically more efficient, your motherboard has double the estimated lifespan, the average amount of RAM went from 2GB to 16GB, storage went from 250GB being a lot to 1TB being the norm, and storage speeds and technologies changed too with SSDs, going even within a short span from SATA to M.2 and NVMe.
Like, damn. One more thing on storage: right now a 250GB Samsung 970 EVO NVMe SSD is around $100, but for $100 you can also get a 4TB WD Blue 7200rpm HDD. That's 16x the capacity for the same price. So why would you ever buy a 970 EVO? Surely there's got to be some reason, right? Yeah, it's faster, but most users won't notice too drastic a difference between an NVMe SSD and a hard drive. A lot of stuff would load much faster, but how often are you really waiting on things to load?
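For anyone who wants the math behind that comparison, here it is spelled out (using the ballpark prices from this post, not current listings, so treat the numbers as illustrative):

```python
# Price-per-gigabyte comparison for the two drives mentioned above.
# The $100 figures are the rough street prices quoted in the post.

def price_per_gb(price_usd, capacity_gb):
    return price_usd / capacity_gb

nvme_ssd = price_per_gb(100, 250)    # Samsung 970 EVO, 250GB
hdd = price_per_gb(100, 4000)        # WD Blue 7200rpm, 4TB

print(f"NVMe SSD: ${nvme_ssd:.2f}/GB")            # -> $0.40/GB
print(f"HDD:      ${hdd:.3f}/GB")                 # -> $0.025/GB
print(f"Capacity at equal price: {4000 / 250:.0f}x")  # -> 16x
```

Same money, sixteen times the space; what the SSD buys you instead is latency and throughput, which is exactly the want-versus-need trade-off being argued here.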
Maybe it's a want and not a need? Maybe it's a want for the people who want those small improvements, like that three-second power-button-to-desktop boot time.
In the same sense, the people who want things like 4k will be the market for it, and will be the driving force behind the development of 4k-capable hardware and related technologies.