filz Camino

Everything posted by filz Camino

  1. I'm sceptical that the demand for it is real. I've had Lumiya on my phone for years and I never use it. For me, SL is best experienced on a device with a decent-sized screen and a keyboard, and SL is never what I want to spend time on when I'm out and about. I guess I would be (slightly) more likely to use an SL app that runs on my iPad rather than my phone, though.
  2. Having had Apple Macs for most of the last 20 years, but having switched to Windows last year, it really does have to be said that Mac SL performance is shockingly bad, and it is likely to get worse in ways that can't be mitigated through graphics settings once the upcoming Firestorm PBR release arrives, because it will no longer be possible to turn the graphics settings right down in the way that is possible today. If you want a light and efficient laptop that can still run SL reasonably well, it might be worth considering a Windows laptop with an AMD 7840U or 8840U. These APUs are aimed at the handheld gaming console market, so (despite having integrated graphics) they work pretty well for SL and are also very power efficient. Here's a 16" 7840U laptop that weighs only 1.23 kg: https://www.acer.com/us-en/laptops/swift/swift-edge
  3. It depends very much on what you are doing in SL. Firestorm uses about 3 GB when I'm alone in my skybox, and as much as 20 GB if I am in a busy club with 40+ avatars (and that is also with my 12 GB of video memory maxed out). (Try going to Peak nightclub, which is usually pretty busy. A quick test there right now shows Firestorm alone using 18 GB.)
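If you want to check figures like these on your own machine, a small script can sample the viewer's resident memory alongside overall system usage. This is just a rough sketch, assuming the psutil package is installed and that the viewer's process name contains "firestorm" (adjust the match string for your viewer and platform):

```python
# Rough sketch: report how much RAM the SL viewer is using right now.
# Assumptions: psutil is installed (pip install psutil) and the viewer's
# process name contains "firestorm" -- adjust for your viewer/platform.
import psutil

VIEWER_NAME = "firestorm"  # match string is an assumption, not an official name

def viewer_memory_gb() -> float:
    """Sum the resident set size (RSS) of every process matching VIEWER_NAME."""
    total_bytes = 0
    for proc in psutil.process_iter(["name", "memory_info"]):
        name = (proc.info["name"] or "").lower()
        if VIEWER_NAME in name:
            total_bytes += proc.info["memory_info"].rss
    return total_bytes / 1024 ** 3

if __name__ == "__main__":
    print(f"Viewer RSS: {viewer_memory_gb():.1f} GB")
    print(f"System RAM in use: {psutil.virtual_memory().used / 1024 ** 3:.1f} GB")
```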
  4. This was discussed here previously, and the general consensus seemed to be that Second Life doesn't make use of the extra cache. (I have a 7800X3D, but I also play other games that do benefit from it.)
  5. It depends exactly what you are doing in SL and in the other apps, but I think it could slow down quite badly trying to do all that with just 16 GB of total memory. Remember that on Apple computers, the unified memory is used as both system RAM and video RAM. Running Firestorm and no other apps, my computer typically uses 10 GB of combined system and video memory, and that's a minimum use case when I'm just sitting alone in my skybox. If I go to a busy sim, combined memory use often increases to well over 30 GB. So if I were getting a new computer right now, I would want more than 16 GB of memory. For combined system and video memory, around 32 GB is a comfortable minimum in 2024 that will cover most use cases and also future-proof the machine for a reasonable length of time. I understand that the extortionate prices Apple charges for extra memory make this a difficult choice, though. If money is really tight, I guess you could get a 16 GB MacBook, test it carefully with your typical use case, and return it if you are not happy?
  6. Yes - I have had problems in the past running SL over a tethered 4G/5G connection, but right now it seems to be working perfectly (assuming a strong signal).
  7. The Linden Lab system requirements seem pretty unrealistic in this respect, IMHO. I can't imagine how badly SL performs on their 4 GB system minimum, and I really doubt how well it will perform in busy areas even with their "recommended" 8 GB of RAM (especially if the computer has integrated graphics).
  8. That screenshot was taken at "House of Booty", which just happened to be the busiest club at the time of posting. That amount of RAM use is pretty normal in a busy club, though - I've often seen figures like that at Peak or Warehouse 21. I've not modified any debug settings in Firestorm. Depending on the size of your video memory, RAM usage may go even higher. My desktop PC's graphics card has 12 GB of VRAM, and in a busy club it often runs out and allocates 2 or 3 GB of system memory to the graphics card. (In a busy club, my laptop with 64 GB of memory and integrated graphics can use up to 40 GB of memory just running Firestorm!)
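One way to see that spill-over happening is to watch how full the card's dedicated memory gets while you're in a busy club. A minimal sketch, assuming an NVIDIA GPU with nvidia-smi on the PATH (other vendors and operating systems need a different tool, e.g. Task Manager's GPU memory columns on Windows); the 95% threshold is an arbitrary choice:

```python
# Rough sketch: check whether the GPU's dedicated memory is close to full,
# which is when the driver starts spilling textures into system RAM.
# Assumes an NVIDIA card with nvidia-smi on the PATH.
import subprocess

def vram_usage_mb() -> tuple[int, int]:
    """Return (used_mb, total_mb) for the first GPU reported by nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = (int(x) for x in out.splitlines()[0].split(","))
    return used, total

if __name__ == "__main__":
    used, total = vram_usage_mb()
    print(f"VRAM: {used} / {total} MB ({used / total:.0%})")
    if used / total > 0.95:
        print("VRAM nearly full - expect spill-over into system memory.")
```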
  9. If you want top-shelf performance in SL, I think 32 GB is a definite requirement these days. Here's me in a club 5 minutes ago, with nothing but SL running and 23.6 GB of memory in use (in a busy club, this is the norm). And, on top of that, it is useful to have free memory available to open things like a web browser while in SL.
  10. I get 165 fps at ground level near the sea at 4K (i.e. HiDPI) full-screen resolution on my Windows gaming PC. Up in the sky, it can reach over 400 fps. Obviously that is overkill, but more usefully, in a busy club I can set Max Avatars to 40 and still get around 50 fps. I'm sure that Apple silicon Macs are a reasonable step up from Intel Macs, but I have a feeling even an M3 Mac Pro is still very significantly slower in SL than a Windows machine. I've used Macs for the last 10 years or so and only recently switched to Windows; obviously Macs are superior in some respects, but when it comes to SL, I just can't quite believe what I've been missing out on. Looking at the results of my original test, I think most of the problem is in the Apple software - either the drivers or perhaps the version of OpenGL that Apple uses, which I understand is rather old. But when it comes to gaming performance, I would say that PC hardware tends to be superior in terms of absolute performance, because Apple silicon is optimised for lower TDP - great for laptops, but for a desktop, a power-hungry gaming PC is likely to be significantly faster. Even when it finally arrives, I doubt an M3 Ultra GPU will be as fast as a 2022 RTX 4090.
  11. Not quite - the way it seems to work is that the frame rate limiter only has a small number of meaningful values, e.g. I find that setting it anywhere between 35 fps and 60 fps caps at about 30 fps, and setting it anywhere between 65 fps and 105 fps caps at about 60 fps. So rather than allowing you to enter an arbitrary number for the frame rate cap, it would be better if Firestorm allowed the user to select the cap from a list of the values that are actually achievable.
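A rough sketch of that "select from a list" idea follows. It assumes the limiter can only hit divisors of the display refresh rate (which would explain the ~30 fps / ~60 fps behaviour described above); the list of achievable caps is an assumption, not something taken from the Firestorm source:

```python
# Rough sketch: snap a requested frame cap to the nearest value the limiter
# can actually hit. ACHIEVABLE_CAPS is an assumption (divisors of a 60 Hz
# refresh rate), chosen because it would explain the behaviour described above.
REFRESH_HZ = 60
ACHIEVABLE_CAPS = sorted({REFRESH_HZ // n for n in range(1, 7)})  # 10, 12, 15, 20, 30, 60

def snap_frame_cap(requested_fps: int) -> int:
    """Return the highest achievable cap that does not exceed the request."""
    candidates = [cap for cap in ACHIEVABLE_CAPS if cap <= requested_fps]
    return max(candidates) if candidates else min(ACHIEVABLE_CAPS)

if __name__ == "__main__":
    for req in (35, 45, 60, 65, 90, 105):
        print(f"requested {req:>3} fps -> capped at {snap_frame_cap(req)} fps")
```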
  12. If being able to run on Ultra settings is the priority, but you want a computer that is as small as possible, check out small form factor gaming computers (Mini-ITX or micro-ATX builds), e.g. https://www.chillblast.com/pages/hej Different sizes and shapes of small case are available, and in many of them there is just enough space to squeeze in the components required for a really top-end gaming machine.
  13. That is how Windows is supposed to work. Whenever a program loses focus, Windows assumes that you're not making active use of it and reduces its processing time to save resources for the program that currently has focus. Looks like it may be possible to turn this off, though: https://answers.microsoft.com/en-us/windows/forum/all/disabling-under-prioritization-of-background-tasks/c4749a47-76f0-4c5c-bb30-b9d319f610fc
  14. I've updated both my desktop and laptop computers in the last few months - both have AMD processors and both perform really well with SL. I don't find any of the current Intel options compelling for my use case. AMD is using an advanced TSMC 4nm manufacturing process, and that gives them a pretty big edge in efficiency and power consumption compared to Intel's current manufacturing process and CPU offerings. Intel CPU performance does currently lead for productivity tasks, but only by operating at eye-wateringly high power levels. When it comes to overall gaming performance, the fastest processor right now is AMD's 7800X3D; it is faster even than the new Intel i9-14900K and uses a small fraction of the power. So I think the TL;DR is: if you want the absolute best productivity performance and don't care about fan noise and power consumption, buy Intel. For most other use cases, consider buying AMD. (This is only true right now - in a few years' time the situation may have changed, although the outlook for the next year or so is that AMD may actually have moved more convincingly into the lead.)
  15. Just tested TPS. I get a far higher reading for TPS on a complex sim with lots of avatars than I do in a simple skybox, and the TPS variability factor between the different scenes (5.3) actually exceeds the variability factor for FPS (2.4). So whilst neither is a perfect measurement of absolute hardware performance, FPS does seem to be the slightly more reliable measurement of the two. Busy club (748,625 tps, 70 fps): https://gyazo.com/6996775c3b9f37aad7e24c0fe8d73a29 Skybox (142,039 tps, 193 fps): https://gyazo.com/5ceb9525f95a1e937a8d5ca8bf574808
  16. I think the reason that FPS is used by everyone is that FPS defines how fluid the game experience is, so it is meaningful in the sense that it is a good specification of whether a game is playable or not. TPS doesn't have to be high for a game to be fluid and playable; a game with a low triangle count and high FPS might not look great, but it will be perfectly playable. In a sense, FPS is a reasonable performance specification of the game software combined with the computer hardware, and by taking either the game software or the hardware performance as a given, a measurement of FPS can be used to critique the performance of the other. E.g. low FPS in a game when using an RTX 4090 is evidence of a badly optimised game, whereas low FPS in a game known to run well on low-end hardware is evidence of a poorly performing computer.
  17. I take your point about FPS being a badly specified measure of hardware performance, but is triangles per second (TPS) any better? It would be easy to test: if TPS were a good specification of hardware performance, then TPS should remain roughly constant regardless of the scene content. My hunch is that TPS is probably a bit like FPS in that it will vary somewhat depending on the specifics of the scene being displayed; otherwise, surely we'd already be using TPS as a hardware performance metric? (I think we all know that FPS fails to separate scene complexity from hardware performance - that's why we make an effort to specify the scene when quoting it. And "we" here means the entire gaming industry.)
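The test described above is easy to script once you have per-scene readings taken from the viewer's statistics window. A minimal sketch; the sample figures below are placeholders, not real measurements:

```python
# Rough sketch of the "does TPS stay constant across scenes?" test.
# The readings dict holds placeholder values -- replace them with figures
# taken from the viewer's statistics floater in each scene you compare.
import statistics

readings = {  # scene name -> (triangles_per_second, frames_per_second)
    "skybox":    (140_000, 190.0),   # placeholder values
    "busy club": (750_000, 70.0),    # placeholder values
}

def variability(values: list[float]) -> float:
    """Coefficient of variation: stdev / mean. 0 means perfectly constant."""
    return statistics.pstdev(values) / statistics.mean(values)

tps_values = [tps for tps, _ in readings.values()]
fps_values = [fps for _, fps in readings.values()]

print(f"TPS spread: max/min = {max(tps_values) / min(tps_values):.1f}, "
      f"CV = {variability(tps_values):.2f}")
print(f"FPS spread: max/min = {max(fps_values) / min(fps_values):.1f}, "
      f"CV = {variability(fps_values):.2f}")
# If TPS really were a pure hardware metric, its spread would stay close to 1
# (and its CV close to 0) no matter what the scene contains.
```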
  18. I'm starting to think you are just a troll acting in bad faith, because that's not a remotely reasonable reading of anything I have ever said. My own view of myself is that I'm often wrong. I've also noticed that I'm actually pretty good at acknowledging it when I'm proven to be wrong.
  19. Zen Buddhism has a concept for the ideal state of mind in this respect which it calls "beginner's mind". If someone who is truly an expert can nevertheless retain "beginner's mind" in their approach to the world, it is very powerful. https://zenhabits.net/beginner/
  20. Here's my theory: reality is a system of opposites - everything has a positive side and a negative side. So how does that apply to knowledge and expertise? The good side of knowledge is obvious: knowledge and expertise are a huge asset for understanding the world and solving problems. The bad side is less obvious. The problem with knowledge is that the more of an expert someone feels themselves to be, the more sure they are that they fully understand the field, and the less need they feel to listen to the people around them, particularly when those people are saying something that contradicts what they already believe. So, ironically, one quality of knowledge is that it degrades our capacity to acquire new knowledge. I'm pretty sure that's all that is going on here, and I don't think any of us are exempt from this phenomenon - I have even observed it in myself in my own domain of expertise; I'm less likely to be open-minded in areas where I already have considerable experience. It is a bit irritating, though, when the whole topic of this thread is the fact that in 2023 the situation has changed, and yet even on this thread people are still just posting yesterday's (out-of-date) knowledge on auto-pilot.
  21. I agree, and for the reasons you stated. I tend to use the heuristic that many gamers use, which is just to look at GPU usage and assume that if the GPU is maxed out, then the game is GPU limited, but if it is not maxed out, then the game is probably CPU limited. (Most games can max out the GPU but not all cores of the CPU, so if the game is CPU limited, it is likely to be limited by the single core performance rather than the full multi-core performance of the CPU).
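That heuristic is easy to automate on an NVIDIA card. A minimal sketch, assuming nvidia-smi is on the PATH and the viewer is the only heavy 3D load running; the 90% threshold is an arbitrary choice, not an official cut-off:

```python
# Rough sketch of the "maxed-out GPU => GPU limited" heuristic.
# Assumes an NVIDIA GPU with nvidia-smi available. Run it while the viewer
# is rendering the scene you care about.
import subprocess
import time

def gpu_utilization_percent() -> int:
    """Read the current GPU utilisation (%) for the first GPU via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.splitlines()[0].strip())

if __name__ == "__main__":
    samples = []
    for _ in range(10):          # sample once per second for 10 seconds
        samples.append(gpu_utilization_percent())
        time.sleep(1)
    avg = sum(samples) / len(samples)
    verdict = "probably GPU limited" if avg >= 90 else "probably CPU limited"
    print(f"Average GPU utilisation over 10 s: {avg:.0f}% -> {verdict}")
```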
  22. OK, perhaps that makes a difference, although I didn't enable texture compression when I tried the Linden Lab viewer, so it is possible that something on the sim changed between your first 12 fps visit and your (and my) subsequent 20 fps visits.
  23. @Henri Beauchamp fair enough, it is true that I have been condescending in my language towards you and I apologise for that if you find it unacceptable. However, it has been in response to condescension from yourself. You will notice that my comments to you were consistently polite and respectful up until your sarcastic and condescending dismissal of my (politely stated) view on the "performance" thread. (If someone treats me with condescension, I don't complain, I just mirror it!)
  24. Don't worry - it wasn't a serious suggestion, I'm fully aware Henri will just ignore it!