
Ormand Lionheart


Posts posted by Ormand Lionheart

  1. Howdy

    Task Manager gives you an average of CPU usage across all threads, including the many that sit at zero, so the rendering core might actually be close to maxed out even when the overall number looks low. That means it won't give you an accurate comparison between the viewers; MSI Afterburner will be more useful. Check out the Alchemy Viewer: for some reason it uses a higher GPU % than other viewers and runs faster with shadows enabled than with them turned off. I have no explanation for that, just a very interesting observation. (There's a quick per-core readout at the end of this post if you want to see what I mean.)

    cheers

     

    (attached screenshot: 2060s.Alchemy.90fps.shadows.JPG)
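
    Here's the kind of per-core readout I mentioned above. It's a minimal sketch in Python and assumes you have the psutil package installed (pip install psutil); MSI Afterburner will show you the same thing with graphs. Run it next to the viewer and compare the busiest core against the average:

        import psutil

        # Sample per-core CPU load for ~10 seconds. The all-core average that
        # Task Manager shows by default can look low even while one core is pegged.
        for _ in range(10):
            per_core = psutil.cpu_percent(interval=1.0, percpu=True)
            average = sum(per_core) / len(per_core)
            print(f"avg {average:5.1f}%  |  busiest core {max(per_core):5.1f}%  |  {per_core}")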

  2. On 2/5/2020 at 12:24 PM, Pussycat Catnap said:

    I didn't know this.

    It seems counter intuitive.

    I just went to 16x. My FPS went from 30-60 up to 80-120...

     

    That's bizarre and doesn't make sense on paper. Did you change anti-aliasing in the viewer or in the GPU control panel? I'll have to give it a try. Which GPU model and driver version are you using?

  3. On 2/19/2020 at 7:54 PM, Lyssa Greymoon said:

    Generally, no, I think GeForce cards are better performers and much better values for SL. Personally, I think there are two good use cases for Quadro cards in SL. First is upgrading an old office PC that can't power a GeForce card. Old Quadros will run in practically anything. The second is if someone hands you the corporate credit card and tells you to go nuts (or the lotto thing).

    The Quadros I have are all older models I bought on eBay for pretty good prices, so take it for what it's worth. No RTX 8000s here. None of them are "jaw droppingly fast" compared to GeForce cards in SL. Sorry, as far as I can tell, the stories that Quadros have OpenGL secret sauce that make all your SL dreams come true are myths.

    Thanks for the info. Which Quadro models do you have, and which GTX models were you comparing them to?

  4. Howdy folks, and thanks for the fountain of knowledge. I have another question re: graphics performance. Does anyone recommend a Quadro? Their hardware specs seem lower than their GTX counterparts and they're excessively expensive, but from what I've read they perform very well with 3D programs, and isn't that what SL is? I've read some forums etc. that suggest they are "jaw droppingly fast", but never with a good reference card to compare against. Does anyone have any experience? If I win the lotto I might give one a run.

    thanks

    Ormand

  5. When you crap out, try sliding the particle slider down to zero. A lot of them you don't even know are on the SIM, and they tend to be a hard hit. It seems to me that most of the time your GPU is being bottlenecked by the CPU even though it looks like your CPU usage is low. If you look at the cores individually, one of the threads will be close to maxed out; the number you usually see is an average of all threads, most of which are close to zero (there's a quick check sketched below). SL does seem to be trying to use more of the other threads, I think. I have compared my video card to people with 1080 Tis on the same SIM looking at the same objects, and they tank as well. I think your weapon of choice is the best CPU you can get, but they had 8700s and still had issues depending on where they were. SL is just too inefficient due to non-professional builders; I'm sure if they knew, they would optimize or polish their skills. Some places are wicked fast. If I built a laggy SIM I think I would find out why and try to fix it.
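
    As a rough way to check the "GPU bottlenecked by one CPU thread" idea above, here's a small sketch. It assumes an Nvidia card plus the psutil and nvidia-ml-py (pynvml) Python packages, so treat it as a starting point rather than anything official:

        import psutil
        from pynvml import nvmlInit, nvmlDeviceGetHandleByIndex, nvmlDeviceGetUtilizationRates

        nvmlInit()
        gpu = nvmlDeviceGetHandleByIndex(0)

        # Low GPU utilization while one core sits near 100% usually means the
        # render thread, not the GPU, is the limit.
        for _ in range(15):
            cores = psutil.cpu_percent(interval=1.0, percpu=True)
            gpu_load = nvmlDeviceGetUtilizationRates(gpu).gpu  # percent
            tag = "likely CPU-bound" if max(cores) > 90 and gpu_load < 60 else ""
            print(f"busiest core {max(cores):3.0f}%  |  GPU {gpu_load:3d}%  {tag}")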

  6. 5 hours ago, Lillith Hapmouche said:

    As you figured out on your own: try your new CPU with the GPU you already got and see if you like the performance. (By the way: a fresh Windows setup is a must)

    The 5700 (XT) and the 2060 are two different leagues, actually. Either look at the 5700/2070 (Super) series... or perhaps a used 1070 for a good price. Or save some money and get into the 5600 or 2060 range, if you really want an upgrade now.

    Personally, I eventually ran into the driver issues from hell that several AMD cards - especially 5700 XT ones - have been encountering since December 2019. Black and blue screens, cold restarts, games not loading, artefacts and errors with multi-monitor setups - those things don't show up on all cards and recent drivers, but a good number are affected. Whether it's really "just" driver programming or a hardware fault in early production batches, nobody can tell for sure. It's a hot debate on Reddit, was very well presented in a "Hardware Unboxed" YouTube video, and is getting picked up by gaming and hardware sites.

    So as of now, I would only recommend a 5700 card if you are willing to fiddle with drivers and settings. If you aren't, go with Nvidia. I didn't follow the reports too closely, so I can't tell if the 5500 and 5600 cards suffer from the same problems.

    My solution, in the meantime: I'm running a 19.10.x driver release and set the WattMan tool to automatically underclock the GPU. No more issues at all, and I'm happily playing COD: MW now, which used to pester me with all kinds of issues. Second Life always ran fine regardless.

     

    I'd seriously love a "roll eyes" emoticon to react to such statements. 🙄

    I agree with everything you said. I recall a few years ago, when the GTX 670 came out, I and quite a few others were having issues with SL, including the pink screen and crashes. Those were fixed when new drivers came out, but in my case I think my model was simply clocked too high. It was an Asus DC2 model, and it eventually stopped working, with artifacts. I then RMA'd it and received a replacement, which was obviously a refurb; I tried it immediately with SL and it didn't even last 5 minutes. Went through the RMA process again, then went for a Zotac 780, which has had zero issues for a number of years now. I've been following the RX series driver issues on YouTube; people aren't happy, and there are some interesting theories on the problem. AMD doesn't seem to have had a good rep re: drivers for years. I've never had an AMD card, so I'm just curious. The best case would be to try both in a side-by-side comparison, but that's not possible unless I can return one if need be and get the Nvidia instead. But like you said, I may not even need to upgrade my current GPU. My new AMD system is currently using my GTX 670, but I haven't run it yet due to time issues, so it will be a fresh Windows install. Thanks for the info.

  7. On 6/21/2019 at 2:58 AM, NiranV Dean said:

    After a bit of sleep I'm back to explain further when shadows actually ramp up GPU usage.

    In the very last part of shadow rendering, shadows are translated and applied to the world; that's the part that runs on the GPU. Before it does these calculations, however, it does a simple check against lighting. If the point in question is on the opposite side from where the sun is coming from (say you pulled the sun all the way down below the terrain and the moon hasn't yet risen enough to be shown), the shader simply "early outs" with "we're facing away from sunlight, we MUST be in shadow". There's no need to calculate where the shadow is if the point in question isn't in sunlight, which means we can skip the rest and assume the point is shadowed. Any and all surfaces pointing away from the sun are automatically considered shadowed, and this saves a lot of processing power with high-resolution shadows and/or far shadow distances.

    However, there is a point - specifically a sun angle - which is a super edge case: you can put the sun at such an angle that the shader doesn't yet consider everything in shadow, but the entire world is still shadowed because the sun is so low that everything has a shadow drawn across it. This is the post-apocalyptic scenario you never want. In this case possibly every single point needs to calculate shadows fully, check against the shadow map and decide how shadowed it is. This scenario can often be achieved with very foggy or low-sunlight, high-ambient-fog windlights, which basically eliminate any sunlight and leave the entire world shadowed; that makes the shadow calculation run rampant, which is why @Nova Convair here might see such a huge framerate impact with "fog".

    Note that said shader runs on the GPU, so a good GPU will negate most of this. But keep in mind that the shader does a crazy amount of extra texture lookups, which can be costly, plus comparisons against 4 shadow maps, possibly 2 at the same time (where shadow maps overlap) or more in possibly unseen weird instances. Combined with higher shadow resolution (which in all viewers except Black Dragon usually comes from higher screen resolution), this can quickly and massively inflate the number of times this calculation is run and can tax the GPU to the point where it can't keep up anymore.

    I thought about moving shadows from the pipeline into shaders to transfer more of the load from the CPU to the GPU, but I haven't really looked into the pipeline shadow calculation that much. Most of it, however, should be just calculations which the shader could do too... I wonder if it would be worth moving those into shaders.

    I noticed this exact behavior while benchmarking FPS as I browsed through various Windlight settings. However, I use a GTX 780 (GPU usage maxed out), whereas my friend who doesn't have this issue has a GTX 1070. Would you at any point recommend the newer AMD GPUs over Nvidia, or is that a non-starter? Thanks for the valuable information. Also, would your viewer favor the specs of one product over the other due to shader specs etc.?
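
    To make the early-out idea above concrete, here's a toy illustration in plain Python - not the viewer's actual GLSL, just the shape of the logic NiranV describes. Surfaces facing away from the sun skip the shadow-map lookups entirely, and the expensive path only runs when that cheap test fails, which is exactly what blows up when the sun sits at a very low angle:

        # Toy sketch of the "early out": points facing away from the sun are assumed
        # fully shadowed, so the costly shadow-map comparisons are skipped.
        def dot(a, b):
            return sum(x * y for x, y in zip(a, b))

        def shadow_factor(surface_normal, sun_dir, sample_shadow_maps):
            if dot(surface_normal, sun_dir) <= 0.0:
                return 0.0                      # facing away from sunlight: cheap path
            # Otherwise do the expensive part: compare against the cascaded shadow maps.
            return sample_shadow_maps(surface_normal, sun_dir)

        # Hypothetical stand-in for the real shadow-map sampling.
        expensive_lookup = lambda normal, sun: 1.0

        print(shadow_factor((0, 0, 1), (0, 0, -1), expensive_lookup))  # sun below: early out -> 0.0
        print(shadow_factor((0, 0, 1), (0, 0, 1), expensive_lookup))   # sun above: full calculation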

  8. 1 minute ago, cheesecurd said:

    You would notice overall better performance out of the 5700 but it’s debatable how much either would really improve performance in SL since most of the lighting is cpu bound.

    The cpu upgrade probably did the most for lighting related performance, might not be worth upgrading the gpu unless you play a lot of other games that would utilize it more.

    Thanks

    I haven't yet used the new Ryzen CPU with my 780, but I suspect the same thing: that I may not need a GPU upgrade. The Ryzen CPU is twice as fast as the Xeon in single-thread Cinebench tests, and that should be a huge boost. Mostly I just like to tinker, and the AMD products look interesting, but the driver issues look like a pain in the ass. Once I get the new CPU running I will be able to tell if an upgrade is even worth it.

  9. On 12/28/2019 at 3:19 PM, Lillith Hapmouche said:

    On top of Alwin's explanation, in terms of that "not rezzing" problem, the easiest fix is a simple TP out and back to that spot.

    Also, the current driver versions with the number 19.12.x are a hit and miss release. Better wait for the bugfixes and stick with the 19.11 generation.

    Besides that, there isn't really any special setting you need to check.

    Regards from my Gigabyte RX 5700 XT Gaming OC.

    Hi

    What are your thoughts on upgrading to a 5700 vs a 2060 for Second Life, which is the only game I play? I already upgraded my CPU to a 3600 to pair with my current GTX 780, so I'm guessing that combo may be sufficient as is. I am using a Xeon 5650 atm, and it is excellent, but it loses performance with specific Windlight settings when shadows are enabled. I determined that it depends on how many objects are being influenced by the sun and the various angles of the shadows. With shadows turned off, performance is no issue at all from one Windlight to the next. I know someone with a GTX 1070 who does not have that issue, so I could get a used one, but I've always wanted to try an AMD since the hardware is interesting.

  10. What he describes does happen on my GTX 780. I believe it has something to do with the Low settings not enabling the features that let the video card use its hardware advantages over lower-end cards. If you enable some of those graphics settings in the preferences panel, you will see usage rise significantly. Kind of like having an 8-cylinder engine running on 2 cylinders: still a powerful car, just underutilized.

  11. Have you had someone compare stats with you using a different video card on the same SIM at the same time? Some SIMs are brutally slow. My GTX 780 gets higher FPS if I overclock my CPU, and unparking my CPU cores increases FPS further. I'm guessing that high-resolution monitor is dipping the FPS too; I use 1080p. At your resolution, I thought you could disable AA entirely and not see the jaggies? And if you limit frames to 24-60 you will see even less GPU utilization, less power use, and lower temps. I have seen it stated on the internet that a 1080 is bottlenecked by most CPUs. Did you notice you have dropped some packets? Maybe that's an issue?

  12. Try using GPU-Z to see what your GPU stats are while running SL. Sometimes Windows does not recognize SL as a game and keeps the GPU throttled unless you make a game profile and set it to maximum power mode in the Nvidia Control Panel. At worst you will see what your GPU usage is, etc. If you are using a 4K monitor, I believe you don't even need AA enabled, since aliasing isn't noticeable at that resolution? And 4K will reduce FPS compared to 1080p at the same settings.
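
    If you'd rather log it than watch GPU-Z, here's a minimal Python sketch, assuming an Nvidia card with nvidia-smi on the PATH. It polls utilization, clocks, power draw and the performance state while the viewer runs; a P0/P2 state under load is what you want to see, while sitting at P8 in SL suggests the card is still being treated as a background app:

        import subprocess
        import time

        # Poll basic GPU stats every couple of seconds while SL is running.
        query = ["nvidia-smi",
                 "--query-gpu=utilization.gpu,clocks.sm,power.draw,pstate",
                 "--format=csv,noheader"]

        for _ in range(10):
            result = subprocess.run(query, capture_output=True, text=True)
            print(result.stdout.strip())
            time.sleep(2)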

  13. Shadowplay doesn't slow down my PC either, mainly because all the preferences are greyed out, so I can't even test it. I use FRAPS, which gives me no issues on my current system, although the files are LARGE. Everyone seems to be touting Shadowplay, but a Google search shows there are a lot of people like me who can't get it to authorize. I've tried all the "solutions" from Google searches, but nothing has worked so far. Did you have that problem? Does SL have to be running before you can use Shadowplay? I tried that too. I'm guessing I might have a service disabled that it needs?

     

    regards

    Ormand

  14. I hope you are not expecting any benefit in SL from SLI, since the performance will be horrendous compared to a single card (unless things have changed). You might be better off with a single Pascal card rather than 2 Titans, unless you are playing other games that do benefit? If you do get a single Titan, it would be interesting to see the performance difference in SL versus the AMD.

  15. I think the biggest issue may be that YouTube is very strict if you use copyrighted music; some people I know actually lost their accounts for that reason. Also, for some of us, RL got in the way. 4K textures are not needed. You just need hardware that can handle shadows, projected lights and depth of field, plus interesting camera angles and audio, and you can do plenty of interesting things.

  16. But... if you were to upload at 1023x1023, which is going to be downscaled to 512x512, wouldn't that 512 texture look better than if you had uploaded a 512x512 to begin with? I know that the more information you start out with, the better the quality when compressing to JPG in Photoshop etc. The same thing happens when uploading to YouTube: the more data they have, the better the compressed file. Would be an interesting experiment?
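
    Here's one way to try that experiment locally with Pillow (pip install Pillow); "source.png" is just a placeholder for your original full-resolution texture. It only compares the resampling side, not the JPEG2000 compression SL applies on upload, so treat it as a rough first pass:

        from PIL import Image, ImageChops

        src = Image.open("source.png").convert("RGB")

        # Path A: resize straight to 512. Path B: go through an intermediate 1024
        # first, mimicking an upload that gets rescaled before the final downsize.
        direct = src.resize((512, 512), Image.LANCZOS)
        via_1024 = src.resize((1024, 1024), Image.LANCZOS).resize((512, 512), Image.LANCZOS)

        direct.save("direct_512.png")
        via_1024.save("via_1024_512.png")

        # Crude numeric check; open both files side by side for the real eyeball test.
        diff = ImageChops.difference(direct, via_1024)
        print("max per-channel difference:", [band_max for (_, band_max) in diff.getextrema()])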

  17. Did you check to see if your video card was running at 100%? A lot of the viewers are not recognized as a game, so Windows runs the video card at 50% or so. Generally I have to make a game profile in the Nvidia Control Panel to get it working at 100%. Try using GPU-Z and CPU-Z when you run these tests; they may give you some useful info, especially where temps and usage are concerned. Were you running shadows and/or DoF? What was your viewer resolution? That can make a huge difference depending on your video card specs. The better the card, the less those extra features at high resolution will impact you.

  18. Is that the correct driver for your video card? Also, you are losing 8+% of packets. Possibly Windows Update changed which driver you are using, since it happened 3 months ago? It seems you are using the GPU on the CPU, which is going to have a very tough time running SL, especially depending on what settings you are using. The other issue is probably "rubber banding" from lag? It looks like an on-chip GPU. Is there actually a more powerful GPU in your notebook, and you're using the wrong one since an update?
