Gentle Welinder

Resident
  • Posts: 11
  • Joined
  • Last visited

Reputation: 1 Neutral


  1. If it happens only after the viewer has been running for some time, say at least 10 minutes or so, it could *possibly* be a mild overheat issue too. (Though that tends to look like little spots and blocks of static appearing and disappearing quickly at random.) Older client versions did not demand as much of the hardware, so a marginal heatsink install, dry paste, etc. can also cause these issues if toggling VBO as mentioned earlier does not immediately fix it. The blowing-up prims and hair thing happened with an older Radeon-based machine I had, until I updated drivers AND turned off VBO. It can also happen with nVidia boards running the older, release-version drivers packed onto the DVD in the box that you never updated; the problem on nVidia stems from improper domain clocks being set at application launch, leaving the cores running too slow. If you have a relatively modern ATI-based video card, a *clean* re-install of the latest drivers likely goes miles toward solving many sins. If you have nVidia, I have no issues on two machines with different CPU architectures running the latest beta drivers tweaked for Skyrim. (Verde 290.53 beta on the notebook and the same revision number but for GeForce on the desktop, both running Win7 x64 + 5xx series cards.)
  2. To Danny & all interested: Hate to revisit old threads, but I have been a tad busy with tinkering and other first-life issues far too serious to go into here... But yes, SL's *implementation* is terrible; there is nothing wrong with OpenGL or Direct-X themselves. In fact, OGL is an industry standard that Microsoft "borrowed" and integrated into their own API. Hence, if there is simply no silicon present to actually render OpenGL properly, it falls back through a driver wrapper and then onto the CPU as a last-ditch effort at some sort of visual accuracy and speed. The current "fixes" and implementation in LL's release code are a bit dodgy, hence its poor performance on nearly any hardware. Fix the pipeline and the OpenGL code, tweak for specific hardware enhancements, and it is amazing to behold.
     As for where I was when I took that snapshot? It does not matter - I could be on the ground level in Ahern and still get a flat 60 FPS until I start winding out the draw distance on my particular hardware. By the time it starts to chug, the app has run itself out of its 32-bit protected RAM footprint and shells out. The Lab, I am certain, is aware of the performance issues. But when the majority of users log in with the equivalent of desktop calculators and cannot support a shader class above the bare minimum of "0," there is no incentive to update the engine, optimize for hardware, or in essence restrict access to "elite" hardware owners. The right engine and render-pipe optimizations on native OpenGL hardware, with a transistor for every in-hardware OpenGL function call... It. Is. Breathtaking. The laptop with the nVidia GT 555m chipset spanks SL at 100+ FPS, easily, on "Ultra" settings, without tweaking. I am not going to get into a description of what happens when you rip it apart and put it back together with native 64-bit help compiled in and capable of performing FXAA, hardware alpha transparency, and a few other tweaks to support shaders and shadow rendering for even particles... again, nobody would believe me.
     So in short, I agree and disagree. OpenGL, for what it does and what it GIVES SL as a community and an industry, is golden, nigh untouchable. Going to Direct-X is hardly a solution, nor does it come with the appropriate price tag to implement (*cough-cough* free *cough*), and licensing alone would tie up negotiations and force SL to become paid software. Do not blame the API; blame the shoddy implementation that leaves everyone, frankly, with a lackluster experience they can all enjoy... or kvetch about. ;> If you have got some elite hardware, there are third-party viewers that have these little problems already fixed. Kirsten's was mentioned, but sadly it is no longer viable as a solution due to the developer's withdrawal from the playing field. But they exist, and if you are that hell-bent on getting something out of the box with no tweaking to maximize hardware and visual performance, you already know where to look. ;> (I will not personally endorse any one viewer over another - it is a subjective experience based on tastes and individual hardware configurations and abilities; that which works miracles for me will more than likely crash the hell out of someone's HP laptop, Dell XPS desktop, or (fill in your hardware maker/vendor here) in a heartbeat.) (Emphasis mine for the TL;DR crowd - and no worries, performance issues are being looked into and "fixed"; these things take time so they work for *everyone* and not just a handful of folks with rendering cores rated in the realm of PetaFLOPs.)
  3. Sometimes a "stray" SLVoice.exe process is left running after a crash, the thread never shutting down completely on its own. You can use the Performance Monitor/Task Manager in Windows to view all currently running processes and manually terminate the voice process before relaunching your client viewer. If you leave a former voice process running, it will hang even when you restart the client and you will lose voice services in-world. Alternatively, you can just log out from your Start menu and log back into your machine; this terminates all apps and restarts the desktop, video drivers, etc. And of course a reboot absolves all sins if you do not feel like dinking about in your system Task Manager. :> (If you would rather script the cleanup, see the little sketch after this list.)
  4. Generally, SL and wireless technologies are mutually exclusive. If it's your own home, private network? Plug your machine in via an Ethernet patch cable. Hard-line copper trumps wireless in nearly every case of SL performance issues. There are some subtle exceptions, of course, but since you mentioned you are on 56 Mb, implying a consumer-grade 802.11g access point, it is best to tough it out on a physical wire. I am assuming here, of course. That is not to say it is not possible, but it would require the purchase and installation of upgraded hardware and infrastructure, plus a card/laptop replacement, to get the speeds and packet management required for silky-smooth SL performance over WiFi. I cruise SL wirelessly, but I utilize Cisco business-class gear and 3x3 5 GHz "N" radios over gigabit backhaul to make it work... If I drop over to forcing the radio exclusively to 56 Mb/sec "G," performance does not suffer much, but it can become a little touch-and-go twitchy even with "pro" gear. Good luck, grab a 12' patch cable and enjoy. <- The quick fix
  5. I have had this too many times to count, especially after upgrading to a *new* VGA card. It takes some time for SL to catch up and add a mainstream/new VGA card to the video tables in the client, so the error can occur with new cards and the latest driver builds when SL simply does not recognize the card. Until they roll out an update that "natively" sets a GPU table and feature set, you can experiment: generally, it is best to start with LOW graphics settings to get in-world. Then open the graphics tab under general preferences and slowly ramp up features, shader functions, and draw distance along with anti-aliasing, anisotropic filtering, etc. Do be careful with the "Frame Buffer Objects" box under the advanced hardware graphics controls on ATI boards, though; it can lead to random if not immediate crashes. Other times it may run perfectly well, but general forum consensus over the years is to disable that feature for all ATI cards and it will run. If a setting crashes or destabilizes the machine, just relaunch the application, disable the suspect feature, and log back in as usual. (If a bad toggle keeps you from even reaching preferences, see the settings-file sketch after this list.) Enjoy.
  6. Picked up a Dell L702x XPS 17" with a Full HD WLED screen, 6GB RAM, Windows 7 x64 + Intel i7 with integrated HD 3000 graphics and an nVidia GT 555m with 3GB DDR5 VRAM in an Optimus configuration. Runs SL sweeter than heck on high~ultra and allows for 15 FPS with shadows and DOF enabled - just enough to grab your screenshot and save. It takes a little goofing with the nVidia control panel to set a 3D application profile that defaults SL rendering to the high-performance nVidia DGPU, but other than that, it's a dream. By default it will try to run SL on the Intel integrated HD graphics, and it can, but it gets a lousy 3~4 FPS in that state. The first thing out of the box is to load in the latest Verde nVidia mobile drivers and update the Intel drivers for the chipset, Huron River, Renesas USB hub, and integrated HD graphics. Otherwise Optimus will not hand off properly to the nVidia DGPU no matter how much you monkey with the driver or attempt to right-click-launch the app selecting "High Performance nVidia GPU." When not plugged in, the laptop will still run the nVidia array, but it slogs and chops a teensy bit at about 14 FPS as it attempts to throttle the GPU and CPU back to conserve battery power as much as possible. Once back on the power brick, it will immediately ramp up to full speed and run liquid smooth. And you will not break the bank trying to grab up an Alienware or (insert gamer-style notebook name here).
  7. Do you have a shield HUD or device that rezzes a shell around your body when you stop moving? They can sometimes crowd the cam up against your head and downward when active... just a quick thought.
  8. Reduce ALL OVERCLOCKS. Repeat: return the motherboard, RAM, CPU, and GPU settings to bone stock. Break the 590 apart in the driver by deselecting "Maximize 3D performance" and setting the card for single GPU by ticking "Disable Multi-GPU Mode." Use/reset the stock nVidia driver application profile for SecondLife.exe. Do not use any hardware/driver application overrides for ANYTHING. Try relaunching SL after a reboot of the system with these stock settings in place. Overclocks and overrides can quickly destabilize even the hardiest of liquid-cooled systems while in use for Second Life. If anything, you *may* need to mildly OVERVOLT your GTX 590 for it to be stable at stock speeds when the clock domains ramp up and the cores load... (It only takes 1~2 *milli*volts to get the desired effect!!) And even then? Unless you pick apart the client from scratch, chances are slim to none of getting any decent SLI performance gains outside of multi-card anti-aliasing modes, i.e. 32x QHD FSAA. ;D
  9. Not a gaming rig by any measure. I decoupled two of the three SLI'd cards, ran in single-GPU/multiple-display mode with the bone-stock nVidia control panel application entry for "SecondLife.exe," and ran it full screen on one display panel rather than spanned across two. Basically representing an "Average Joe" computer built within the last year, year and a half. No tweaks. No fuss... ...No problem. :matte-motes-big-grin-squint: Now if I brought it up to speed and spread processing across all GPUs and LCDs, with full shadows, light, and magic turned on? Well, likely someone will still never believe me. It is not fanboyism I see... it is a general lack of the user-end optimizations and tweaks one can manage on one's own.
  10. Owning a Dell XPS with an nVidia GT 555m with 3GB RAM + Intel HD 3000 graphics paired via Optimus, I have discovered a workaround that seems to work well with the current 3.0.3 beta releases and previous 2.x clients, including Kirsten's. (R.I.P. and come back soon, Kirsten!) Something seems to stuff up the Optimus switching process when SL is in the downloading-clothing stage, and it has caused numerous nVidia error code 7 and 8 messages, including the "nVidia drivers XXX.XX have stopped working and have recovered..." Windows Vista and 7 messages. Perhaps it is the hand-off from the Core i7 HD 3000 graphics to the nVidia DGPU while under heavy load that causes the TDR timeout. *shrug* I am just guessing here from what I understand of the architecture and how it works. Getting my L702X to launch and run SL clients at the moment is twofold:
      1) Make an nVidia 3D application profile for the SecondLife.exe/SecondLifeBeta.exe files and permanently override the default "Integrated Graphics (Intel)" with "Use High Performance Video (nVidia)". For poops and giggles, turn on the system-tray notification for nVidia graphics, just to make sure that your DGPU is running rather than the integrated graphics. Click Apply, then exit; do not just "X" out of the control panel.
      2) Launch Second Life from the desktop by right-clicking and selecting "Run with Graphics Processor -> High-Performance NVIDIA Processor." (This should be redundant if you set the profile for the application in the nVidia control panel, but it does not hurt either.) When the client starts, you should see a rainbow-colored nVidia GPU Activity systray notification, "1 program," that looks like a grid/chip. Double-check it is running before login by hitting Help -> About Second Life and seeing which GPU is currently active. If it says anything but "nVidia GT 555m," you did something wrong; check the steps above!
      Now comes the silly part. Because the GPU still seems to start in a low-power standby state until the clock domains ramp up on the DGPU, you will get it stuffed quickly; it takes a few moments for Optimus to forget the Intel HD 3000 and grab, load, and spin up the 555 cores. So open SL's preferences (Ctrl-P) and set the graphics slider all the way left to "Low." Log in as usual. Wait for your immediate area to load, and at least your avatar shape, until you see the beginnings of the skin bake. Once settled? Open Preferences -> Graphics again and set the slider wherever you want. I jump right onto Ultra or High after my skin and clothes load, and it goes just fine after that, GPU loaded and just humming away at 50 FPS or so. (Anisotropic and full-scene anti-aliasing set to 8x in this case - no shadows.) Nothing will make the machine stop. It has been ultra stable for me this way, and I stay logged in until I get bored, honestly. This is tested to work with all *DELL* website drivers for my machine completely up to date, running nVidia Verde 285.27 beta drivers from www.nvidia.com. Hit the Microsoft website too and grab the web installer for the latest Direct-X end-user runtime updates. *Knocks on aluminum hood of that 17" Dell* Works a treat!
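
A footnote to number 3 above, for anyone who would rather script the cleanup than hunt through Task Manager: the little sketch below is just one possible way to do it, assuming you have Python and the third-party psutil package installed. The process name is the one from the post; double-check in your own Task Manager if your viewer names it differently.

# Minimal sketch, assuming Python + the psutil package on Windows:
# find and terminate any stray SLVoice.exe left behind by a viewer crash.
import psutil

def kill_stray_voice(name="SLVoice.exe"):
    killed = 0
    for proc in psutil.process_iter(["name"]):
        if (proc.info["name"] or "").lower() == name.lower():
            print(f"Terminating stray voice process, PID {proc.pid}")
            proc.terminate()              # ask politely first
            try:
                proc.wait(timeout=5)
            except psutil.TimeoutExpired:
                proc.kill()               # force it if it refuses to exit
            killed += 1
    return killed

if __name__ == "__main__":
    # Run this after a crash, before relaunching the viewer.
    print(f"{kill_stray_voice()} stray voice process(es) handled")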
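
And a footnote to number 5: if a render toggle takes the client down before you can get back into preferences to undo it, the override can sometimes be flipped outside the viewer. The sketch below is a rough, unofficial example only; it assumes Python, a Windows 7 install with the default SecondLife user_settings location, that the file is the usual LLSD XML, and that the setting you want (RenderVBOEnable is used purely as an example name) has already been written to settings.xml as a user override. Verify the path and setting name on your own machine, back the file up, and close the viewer before editing.

# Rough sketch, not an official tool: flip a boolean in the viewer's
# per-user settings.xml (assumed LLSD XML layout). Path and setting name
# are assumptions; only settings you have already changed appear in this file.
import xml.etree.ElementTree as ET
from pathlib import Path

# Assumed default location on Windows 7 -- adjust for your account/viewer.
SETTINGS = Path.home() / "AppData" / "Roaming" / "SecondLife" / "user_settings" / "settings.xml"

def set_boolean(name, value):
    tree = ET.parse(SETTINGS)
    top = tree.getroot().find("map")      # <llsd><map> ... </map></llsd>
    nodes = list(top)
    for i, node in enumerate(nodes):
        if node.tag == "key" and node.text == name:
            setting = nodes[i + 1]        # the <map> describing this setting
            kids = list(setting)
            for j, kid in enumerate(kids):
                if kid.tag == "key" and kid.text == "Value":
                    kids[j + 1].text = "1" if value else "0"
                    tree.write(SETTINGS)
                    return True
    return False                          # setting not present (never overridden)

if __name__ == "__main__":
    # Example: turn vertex buffer objects back off after a bad experiment.
    print(set_boolean("RenderVBOEnable", False))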