Message boards : Graphics cards (GPUs) : Overclocking GPU...
Author | Message |
---|---|
When overclocking a GPU one can normally modify three values: core, shader and memory. When altering these values, which one is the most effective at speeding up crunching? | |
ID: 2788 | Rating: 0 | rate: / Reply Quote | |
so there's no need for a 'good' picture... ;) Yes, but a need for accurate calculations ;) You need both core and shader clock, whereas memory clock should be pretty irrelevant. MrS ____________ Scanning for our furry friends since Jan 2002 | |
ID: 2793 | Rating: 0 | rate: / Reply Quote | |
Sorry, but that's a very popular myth on the memory. In the F@H GPU community people say that the memory doesn't matter; it's just not true. Set the core and shaders, then start upping the memory and watch the PPD go up. Lower the memory and watch the PPD go down. It works that way with every card that I have (4). I have absolutely no idea why it would be any different for GPUGrid. Just try it for yourself. | |
ID: 2799 | Rating: 0 | rate: / Reply Quote | |
I remember that in folding@home GPU 1 a 50% drop in memory clock would cause a slow-down of 10-15%. So if I'd increase my 9800 mem clock from 1100 to 1160 I could expect a 1.6% performance increase. | |
ID: 2800 | Rating: 0 | rate: / Reply Quote | |
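The back-of-the-envelope estimate above can be made explicit. A minimal sketch, assuming a simple linear sensitivity model (performance change is proportional to relative clock change); only the 50% drop / 10-15% slowdown figures and the 1100 to 1160 MHz step come from the post, the model itself is just an assumption:

```python
# Linear extrapolation: a 50% memory underclock cost ~10-15% performance
# in the F@H GPU1 client, so the sensitivity of performance to memory
# clock is roughly 0.2-0.3.

def expected_gain(old_mhz, new_mhz, sensitivity):
    """Relative performance change for a relative clock change."""
    return sensitivity * (new_mhz - old_mhz) / old_mhz

# Sensitivity inferred from the 50% drop -> 10-15% slowdown data point:
low, high = 0.10 / 0.50, 0.15 / 0.50   # 0.2 .. 0.3

for s in (low, high):
    print(f"sensitivity {s:.1f}: {expected_gain(1100, 1160, s) * 100:.1f}% gain")
# sensitivity 0.2: 1.1% gain
# sensitivity 0.3: 1.6% gain
```

The upper end of the range reproduces the ~1.6% figure quoted in the post.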
Wow, this was unexpected! First result is in: | |
ID: 2808 | Rating: 0 | rate: / Reply Quote | |
Wow, this was unexpected! First result is in: Good to know. Thanks to that info I just decided to push the memory clock of my GTX260 a little bit further. I'll post some results. Maybe the increase will not be the same across different cards, as some may not be as memory-bandwidth starved as others. Cu KyleFL | |
ID: 2810 | Rating: 0 | rate: / Reply Quote | |
When running at stock values (600/1500/1000) I had about 65 ms/step... | |
ID: 2813 | Rating: 0 | rate: / Reply Quote | |
Maybe the increase will not be the same across different cards, as some may not be as memory-bandwidth starved as others. Yes, I'd expect as much. But most cards are pretty balanced anyway. On GT200 the larger caches should also help. MrS ____________ Scanning for our furry friends since Jan 2002 | |
ID: 2814 | Rating: 0 | rate: / Reply Quote | |
The next 2 WUs are in, 62.7 ms with some interactive use and 61.3 ms with only minor use. Damn those long term averages.. | |
ID: 2825 | Rating: 0 | rate: / Reply Quote | |
Can you remind me what the tool to overclock from Linux is? | |
ID: 2826 | Rating: 0 | rate: / Reply Quote | |
There are CoolBits from Nvidia and NVclock... | |
ID: 2827 | Rating: 0 | rate: / Reply Quote | |
If you are lucky and the card and driver support it, somehow... | |
ID: 2828 | Rating: 0 | rate: / Reply Quote | |
I've got an ASUS 9600GT, stock at 650/1625/1800, and am wondering the 'best' way to OC it. Do I keep all the ratios the same, or just the engine/shader ratio? | |
ID: 2856 | Rating: 0 | rate: / Reply Quote | |
Has anybody overclocked a GTX280? | |
ID: 2862 | Rating: 0 | rate: / Reply Quote | |
Not a GTX280, but a GTX260 (if that helps) | |
ID: 2863 | Rating: 0 | rate: / Reply Quote | |
I've got an ASUS 9600GT, stock at 650/1625/1800, and am wondering the 'best' way to OC it. Do I keep all the ratios the same, or just the engine/shader ratio? I must admit I'm underwhelmed by the responses to what I thought was a fairly simple question, anyhoo..... The first WU completed at the new speeds shows a dramatic decrease in time. Fantastic stuff! I'll see how the next one goes. Live long and BOINC! | |
ID: 2886 | Rating: 0 | rate: / Reply Quote | |
I must admit I'm underwhelmed by the responses to what I thought was a fairly simple question, anyhoo..... No, it's not simple. Intuition tells me that the ratios don't matter, as long as you don't go extreme. But I can't be 100% sure since I didn't test it specifically. My suggestion: find the maximum stable clock for engine and shader first, then for the memory and afterwards back off a bit for safety. And don't care about the ratios at all. MrS ____________ Scanning for our furry friends since Jan 2002 | |
ID: 2887 | Rating: 0 | rate: / Reply Quote | |
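The procedure MrS describes can be sketched as a simple search loop. This is only an illustration: `passes_stress_test` is a hypothetical stand-in for whatever OC tool and stress test you actually use (RivaTuner, FurMark, a sample WU, ...), and in practice you would run it once per clock domain, core/shader first, then memory:

```python
# Step one clock domain up until the stability test fails,
# then back off a safety margin below the last good clock.

def find_stable_clock(base_mhz, step_mhz, margin_mhz, passes_stress_test):
    """Raise the clock in steps until instability, then back off."""
    mhz = base_mhz
    while passes_stress_test(mhz + step_mhz):
        mhz += step_mhz
    return mhz - margin_mhz  # safety margin below the last good clock

# Toy stand-in: pretend the card happens to be stable up to 702 MHz.
result = find_stable_clock(650, 13, 13, lambda m: m <= 702)
print(result)  # 689 with these toy numbers
```

Note the loop never touches the ratios at all, matching the advice above.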
My suggestion: find the maximum stable clock for engine and shader first, then for the memory and afterwards back off a bit for safety. And don't care about the ratios at all. Agreed, that's pretty much how I approach it.... | |
ID: 2890 | Rating: 0 | rate: / Reply Quote | |
I have it (GTX280) overclocked to: | |
ID: 2891 | Rating: 0 | rate: / Reply Quote | |
My suggestion: find the maximum stable clock for engine and shader first, then for the memory and afterwards back off a bit for safety. And don't care about the ratios at all. Excellent! Many thanks for the responses. Looks like the 11% engine/shader OC and 8% memory OC gave a 15% decrease in overall GPU time for the first WU. The WU time is now below 24 hrs! | |
ID: 2899 | Rating: 0 | rate: / Reply Quote | |
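A quick sanity check of those numbers: a 15% reduction in runtime corresponds to a throughput increase of more than 15%, and indeed more than the 11% engine/shader OC alone would explain if performance scaled purely with those clocks. The extra presumably comes from the 8% memory OC (assuming roughly linear scaling, which is only an approximation):

```python
# A 15% shorter runtime means 1/0.85 ~ 1.176x the work per unit time.

time_reduction = 0.15
throughput_gain = 1 / (1 - time_reduction) - 1
print(f"{throughput_gain * 100:.1f}% more work per hour")  # 17.6% more work per hour
```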
Maybe I should run that test utility that finds out what the card can do ... :) | |
ID: 5311 | Rating: 0 | rate: / Reply Quote | |
I have an XFX GTX280 XXX with all the following stock settings: | |
ID: 5323 | Rating: 0 | rate: / Reply Quote | |
I have an XFX GTX280 XXX with all the following stock settings: My 280 is running 27 - 32 ms per step over a bunch of tasks ... the 9800 GT is 91 - 92 ms ... | |
ID: 5324 | Rating: 0 | rate: / Reply Quote | |
Sorry guys, dumb question.. | |
ID: 5370 | Rating: 0 | rate: / Reply Quote | |
Sorry guys, dumb question.. Look under each computer at the tasks, it is in the output. For example: http://www.gpugrid.net/result.php?resultid=193826 | |
ID: 5371 | Rating: 0 | rate: / Reply Quote | |
The older posts in this thread from October were during a period where only one kind of workunit was being examined. Now we have three different kinds (with credits of 32xx, 24xx, and 18xx) that have widely varying ms per step on the same card. It would be interesting to see how cards are doing with these different types of work and how much spread there really is. I suspect that the spread is fairly tight on high-end cards with the differences becoming more substantial as one moves to the low-end. Not sure if different aspects of different OC would have different performance effects on different types of work? and yes, I could not fit another "different" into that last sentence ;) | |
ID: 5372 | Rating: 0 | rate: / Reply Quote | |
That explains a lot to me! I haven't been following up on the message boards and probably miss a lot of topics and project progress. I have a 280, and just quickly sampled my tasks. I've noticed that the three credit WU types are almost equally spread, making things fair I suppose. Another thing I've noticed is that the 18xx tend to get close to 35-40 ms while the 32xx tend to do 20-25 ms. The 24xx fall between 25-35 ms. Just a crude estimate. Perhaps the project admins can come up with a more accurate picture by querying the database and averaging out the cards/tasks. | |
ID: 5373 | Rating: 0 | rate: / Reply Quote | |
The older posts in this thread from October were during a period where only one kind of workunit was being examined. Now we have three different kinds (with credits of 32xx, 24xx, and 18xx) that have widely varying ms per step on the same card. It would be interesting to see how cards are doing with these different types of work and how much spread there really is. I started (1) with a GTX 260/192 @stock, later I added (2) a second GTX 260/216 @stock and (3) flashed both to 666/1500/1150 (since 26./27. Nov.); meanwhile I switched the cards between computer IDs (around 4. Jan. the last time), so don't be surprised to see mixed values when viewing the different task lists. Here are the results for the three credit groups 1888/2435/3232 (the 2933-credit WU may be an exception): GTX 260/192 (666/1500/1150) (TaskID > credits > ms/step) - some examples
| |
ID: 5379 | Rating: 0 | rate: / Reply Quote | |
Okay... On the low end I have numbers from a 9500GT with 512MB (700 core, 1750 shader, 2000 memory) and a mid-range 9600 GSO with 384MB (600, 1700, 1800). Using 6.5.0 for all unless noted otherwise. | |
ID: 5384 | Rating: 0 | rate: / Reply Quote | |
I've noticed that the three credit wu types are almost equally spread, making things fair I suppose.
The WUs do differ in ms/step because they differ in complexity. They also differ in number of steps, so the overall time consumed corresponds to the credits, and therefore the statistical spread of the WU types does not matter (nothing fair or unfair about it). Currently the 1888-credit WUs are off and give less credit per time, but the problem has already been reported.
Not sure if different aspects of different OC would have different performance effects on different types of work?
I don't think there'll be any dramatic effects here.. the WUs are not that much different. The more complex ones could respond more strongly to memory frequency increases, though.
So I always OC to highest stable core and the rest doesn't really matter.
No. You need both, core and shader frequency. Some utilities clock both up synchronously; maybe that's why you didn't notice the shader went up as well? And memory clock also matters, just not as much as the other two.
Does anyone know... My 260 won't OC. Is there a proggy besides the Nvidia stuff?
If RivaTuner can't do it, probably no one can ;)
Are those MDIO errors signs of serious problems ? Do not hope with OC-settings.
Nope. They just tell you there was no file to resume computation from, because you didn't stop & restart BOINC during the WU. MrS ____________ Scanning for our furry friends since Jan 2002 | |
ID: 5461 | Rating: 0 | rate: / Reply Quote | |
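MrS's fairness point can be illustrated with a toy calculation: what matters is credits per unit time, not ms/step on its own. The ms/step values below are the rough GTX 280 figures from this thread; the step counts are invented for the illustration so that runtime tracks credits, which is the behaviour MrS describes:

```python
# Credits per day for two WU types: a high-credit WU with fast steps
# but many of them, and a low-credit WU with slow steps but fewer.

DAY_S = 24 * 3600

def credits_per_day(credits, steps, ms_per_step):
    runtime_s = steps * ms_per_step / 1000
    return credits * DAY_S / runtime_s

big   = credits_per_day(3232, 400_000, 22.0)   # 32xx type, ~22 ms/step
small = credits_per_day(1888, 139_000, 37.0)   # 18xx type, ~37 ms/step
print(round(big), round(small))  # both come out around 31,700 credits/day
```

With step counts chosen this way the two types pay the same per day, despite very different ms/step; the 1888-credit WUs being temporarily "off" is then a credit-assignment bug, not a property of the scheme.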
Are those MDIO errors signs of serious problems ? Do not hope with OC-settings. Ah, good to know, thanks ! | |
ID: 5463 | Rating: 0 | rate: / Reply Quote | |
Update for the 9600 GSO: 29xx workunit > 175 ms. Also, since this one took about 4 hours longer than the 3232-credit units, it looks like these might be 'off' in a similar manner as the 18xx-credit units. | |
ID: 5495 | Rating: 0 | rate: / Reply Quote | |
I was having the same problem and I found an app put out by EVGA simply called | |
ID: 6419 | Rating: 0 | rate: / Reply Quote | |
I was having the same problem and I found an app put out by EVGA simply called EVGA Precision only works in Windows & only overclocks one core of a GTX 295, though. For now I'm using EVGA Precision to control the fan speed & the Gainward EXPERTool to control the second core of the 295. | |
ID: 6434 | Rating: 0 | rate: / Reply Quote | |
I didn't know about the program only controlling one core of a GTX 295 because I have yet to work up the nerve to buy one. I am looking to build a new system but am going nuts trying to figure out what hardware to purchase. Its main purpose will be crunching BOINC apps and a lot of video conversions, both of which can benefit from a great GPU, but I am going to be putting together a whole system: case, power supply, video card, motherboard, RAM, processor... the list seems endless. I am hoping that during the time I spend trying to get the nerve up to lay out what for me will be a small fortune, the prices on video cards at least will stabilize. I saw a BFG GTX 280 OC for $365 CAN today and the 295 is around $660, but I think the 295's and now the 285's will push the 280's price down even further very soon. The problem is I don't game at all; it is something I have never been interested in, and while this will not only date me, it's also a little embarrassing to say the last game I played was something like Frogger or Pac-Man back in university. I was even considering getting dual 295's as I can probably scrape up the cash, but can I justify that kind of outlay just to increase my RAC and hopefully do some worthwhile science in the process? | |
ID: 6442 | Rating: 0 | rate: / Reply Quote | |
The problem is I don't game at all, it is something I have never been interested in and while this will not only date me it's also a little embarrassing to say the last game I played was something like frogger or Pac-Man back in University. Frogger! Wasn't there a 3D remake of that? :-)
The GTX 295 offers very good performance per dollar because it's two-in-one. You can always buy one now, and get the second later. Get at least a 750 or 850 Watt power supply up front so in the future you can simply add the second card without additional PSU swapping. | |
ID: 6444 | Rating: 0 | rate: / Reply Quote | |
If this is going to be the rig for a few years ... get the best MB with 3 PCIe slots. Then the best and fastest CPU and memory you can afford ... if need be, skimp on the memory, as that can be replaced more cheaply than the CPU ... the high-quality MB for the same reason. | |
ID: 6445 | Rating: 0 | rate: / Reply Quote | |
Then the best and fastest CPU and memory you can afford ... -> Then the best and fastest CPU and memory you want to afford ... ;) MrS ____________ Scanning for our furry friends since Jan 2002 | |
ID: 6451 | Rating: 0 | rate: / Reply Quote | |
h | |
ID: 6460 | Rating: 0 | rate: / Reply Quote | |
hello everybody, I would like your advice on my overclocking. Is it good? | |
ID: 6463 | Rating: 0 | rate: / Reply Quote | |
Well, it's certainly a high OC, but not an extreme one. And you finished one WU. But is it good? If it's stable then "yes", otherwise "no". How did you test stability? | |
ID: 6464 | Rating: 0 | rate: / Reply Quote | |
How did you test stability? Stressed the GPU for 48 hours with FurMark in "extreme mode" (1280x1024) with 16x anisotropic filtering (average 17 fps). Max temp: 81°C (the GPU crashes at 83°C). Also 6 hours using 3DMark 2006 (74°C). ____________ | |
ID: 6465 | Rating: 0 | rate: / Reply Quote | |
So the answer is a big fat "yes"! | |
ID: 6474 | Rating: 0 | rate: / Reply Quote | |
So the answer is a big fat "yes"! I think the answer is maybe, and that's final! | |
ID: 6479 | Rating: 0 | rate: / Reply Quote | |
hello all, | |
ID: 6911 | Rating: 0 | rate: / Reply Quote | |
Had the same experience as you: my card errored out each & every WU on GG, and when I switched to SETI it ran without any problems... | |
ID: 6922 | Rating: 0 | rate: / Reply Quote | |
Had the same experience as you: my card errored out each & every WU on GG, and when I switched to SETI it ran without any problems... I believe something has changed in the applications myself. I have an 8800GT OC that was running the WUs just fine, then a few days ago it started erroring out every WU after 3 seconds. Like other people, I can run the SETI GPU WUs just fine, though. I've even underclocked the 8800GT OC & it still errors out the WUs, so lowering your clock speed isn't necessarily a cure for the errors; at least it wasn't in my case. The 8800GT OC runs perfectly fine for everything else @ its stock or the overclocked speed it came with, except for running the GPUGrid WUs, so like I said I think it's something in the applications that's changed ... ??? | |
ID: 6923 | Rating: 0 | rate: / Reply Quote | |
Tixx, how do you determine that it's stable everywhere else? | |
ID: 6930 | Rating: 0 | rate: / Reply Quote | |
I've run benchmarks on 3DMark, stress tests on SiSoft Sandra, several games, and SETI WUs. | |
ID: 6949 | Rating: 0 | rate: / Reply Quote | |
I've run benchmarks on 3DMark, stress tests on SiSoft Sandra, several games, and SETI WUs. I get the same thing with my 8800GT OC: every so often it will actually run & finish a WU; the rest error out after just 3 seconds ... | |
ID: 6950 | Rating: 0 | rate: / Reply Quote | |
Had the same experience as you: my card errored out each & every WU on GG, and when I switched to SETI it ran without any problems... I must say that I am seeing the same basic effect with my Asus 8800GT. I run 3x 8800GS and 1x 8800GT. The GSs are running fine as usual. The GT collected dust for several months, but is now back up and running, and kicking out one compute error after another. This particular card has always been the biggest pita out of the 4 (even on F@H). I just lowered my OC a bit once again, but I expect it to start throwing errors again... maybe not, though. I thought it was the card and/or OC... however, after reading this, I'm wondering if there is an 8800GT issue. It's a G92 card..... | |
ID: 6954 | Rating: 0 | rate: / Reply Quote | |
I run 5 8800GTs (all OC'ed) from three different manufacturers and have the extremely rare work unit have a problem... I am still running 6.5.0 on all my machines, so maybe the problem is with the 6.6.xx application............ Had the same experience as you: my card errored out each & every WU on GG, and when I switched to SETI it ran without any problems... | |
ID: 6956 | Rating: 0 | rate: / Reply Quote | |
Some programs are more sensitive to errors than others. I have been running for two months with very few errors. The system may pass all tests, yet not run a specific application. If you bring down the OC and the application begins to run without errors, then regardless of other tests the OC was at fault. | |
ID: 6960 | Rating: 0 | rate: / Reply Quote | |
I run 5 8800GTs (all OC'ed) from three different manufacturers and have the extremely rare work unit have a problem... I am still running 6.5.0 on all my machines, so maybe the problem is with the 6.6.xx application
I'm running the 6.5.0 client too & not the 6.6.xx client, which I think you meant ... ??? If you meant the 6.62 application, I've already surmised that & brought that point up ...
@Paul D.B.: If you bring down the OC and the application begins to run without errors, then regardless of other tests the OC was at fault.
I've already stated that I've underclocked the 8800GT with the same results. Took the card all the way down to 550 core speed, which is 100 below its stock speed setting, & the WUs still erred out 3 seconds after starting. That consistent 3-second failure is what leads me to believe the application is at fault & not the card.
Another cause can be the amount of memory on the card
That's a distinct possibility, but up until a week or so ago the card ran the WUs just fine, so it must have had enough memory up until then if that's the reason. The card does have 512MB of memory & only uses about 79MB of it when running the WUs, so I don't think that's the reason.
A third cause can be other events on the system that the system as a whole does not react well to and causes tasks to error out
I've put the 8800GT OC in 4 different systems with the same results; the systems were already running GTX 260s just fine before trying the 8800GT in them.
At SaH one of the error modes is that once tasks fail, all will fail until the system is rebooted. There are dozens of reasons that this can happen, with the simplest being that the API does not properly initialize the GPU after certain errors, so that the error on one task will contaminate all other tasks started
I've tried the reboot trick already, but the WUs still error out after rebooting.
Basically I've retired the 8800GT from running GPU WUs either here or at the SETI project, because it won't run the WUs here anymore and is just creating a lot of erred-out WUs, & to me it's just a waste of electricity to run the SETI ones for the credit it gets from them. I have 2 ATI 4870s coming, which should be here either today or tomorrow at the latest, & one of them is going to go into the box the 8800GT is in now, and the other into a box that doesn't have a GPU-capable video card in it already. The 8800GT will just become a spare card in the event I need backup for one of the 200-series cards I have, if one goes bad, until I can get a replacement for it. It won't be able to run the WUs, but it still can be used for display-only purposes ... | |
ID: 6964 | Rating: 0 | rate: / Reply Quote | |
Don't get me wrong with my earlier post...I'm not discounting it being the card. This particular card has always been the most...uh,..."sensitive". I like Asus MBs, but I won't be buying any more of their vid cards. I can set the OC on this card and it will run fine...then start throwing errors left and right for no reason. Lowering the OC usually helps, but really just starts the cycle all over again. Then eventually it will OC back up....just to start the eventual spiral back downward. This isn't the only MB/system that it's been in, but seems to be starting once again. And You're right, Paul...there's many possible reasons why this can happen. It's simply that when I read the previous post...it seemed very familiar....very familiar. So, if there's any kind of "GT" issue with the app, wus, etc...It would be nice to know. | |
ID: 6965 | Rating: 0 | rate: / Reply Quote | |
Just trying to brainstorm ... | |
ID: 6968 | Rating: 0 | rate: / Reply Quote | |
PoorBoy, | |
ID: 7047 | Rating: 0 | rate: / Reply Quote | |
ATItool is a small free app that's very useful for testing video overclocks. It contains a "scan for artifacts" function that can catch things the naked eye cannot. My usual arsenal of apps for testing the stability of my system is: | |
ID: 7109 | Rating: 0 | rate: / Reply Quote | |
Hmm lol, I am not really a hero in this area called OC. | |
ID: 7233 | Rating: 0 | rate: / Reply Quote | |
UPDATE, just in case anybody was using the stability test I encouraged earlier in this thread: CPU testing with Prime95 small FFTs is NOT sufficient unless you let it run for around 40 hours. | |
ID: 7520 | Rating: 0 | rate: / Reply Quote | |