I did a quick test with GPU-Grid and 3DMark 2001. The good news: GPU-Grid doesn't crash (also tested with a few games). The bad news: performance suffers so much that games become unplayable. Even if the frame counter shows 200 fps, there is clearly visible stuttering. I lost so badly in good old Quake 3 :p
Conclusion: if you want to play a game, suspend GPU-Grid first. But if you forget, there's likely no problem apart from some inconvenience on your side.
And some numbers:
- overall score without BOINC: 49780
- overall score with 4x QMC: 43492
- overall score with 3x QMC & 1x GPU-Grid: 14671
- in the last case individual scores drop to 10 - 50% of their normal values, but that doesn't matter much, because neither 30 nor 300 fps is fluent / smooth
MrS
____________
Scanning for our furry friends since Jan 2002
Kokomiko
Most games would not have big problems, but if you play a high-speed racing game like Trackmania United, you sometimes miss the next turn and crash into the wall ;).
____________
> Most games would not have big problems
What do you mean by that? Have you tried it yourself?
Even if the frame counter shows 200 or 300 fps, the image is not smooth. It feels like 15 - 20 fps, but not exactly the same, so I used the term "stuttering". Might give you headaches after a while :p
MrS
____________
Scanning for our furry friends since Jan 2002
There IS a problem: if BOINC starts a GPU task while a game is running, the WU will crash (I tried 3 times, but with the 6.29 app).
You mean only if BOINC starts a new task? So launching a game while a WU was already in progress was fine before? I remembered the problem as "WU crashes when a game is launched".
MrS
____________
Scanning for our furry friends since Jan 2002
Kokomiko
> > Most games would not have big problems
> What do you mean by that? Have you tried it yourself?
> MrS
Yesterday evening I played Trackmania United for an hour while a WU was running on my XFX GTX 280, maybe because the card has 1 GB RAM. I will test again after the currently running WU completes; not one that is already at 84%, but a freshly started one ;) Will test Assassin's Creed in 2 hours.
____________
> > > Most games would not have big problems
> > What do you mean by that? Have you tried it yourself?
> > MrS
> Yesterday evening I played Trackmania United for an hour while a WU was running on my XFX GTX 280, maybe because the card has 1 GB RAM. I will test again after the currently running WU completes; not one that is already at 84%, but a freshly started one ;) Will test Assassin's Creed in 2 hours.
When I was running 6.41 I was able to play Anno 1701 and it did not harm PS3GRID computing. But the performance of the game is TERRIBLE while computing...
Now I will test 6.43 and see if you can play and crunch together...
> But the performance of the game is TERRIBLE while computing...
Sounds like the same experience as mine. However, I already had 6.43.
MrS
____________
Scanning for our furry friends since Jan 2002
Kokomiko
There are two possible reasons why I don't have these problems: first, my Phenom 9850 is a quad, and second, I run Vista Ultimate 64-bit. Maybe this combination simply performs better.
____________
Some new input / ideas:
Thanks to the new clients we can see the time needed per step. On my 9800GTX+ it is a little over 60 ms with the new client, and more like 51 ms with the previous one (estimated). Latencies of 50 or 60 ms correspond to frequencies of 20 and 16.7 Hz, respectively. This is just what I reported initially, based on my subjective impression!
This looks like it tells us something: the GPU gets a new CUDA task, i.e. "calculate one step", executes it, and only after it has finished is it available to the game again. This explains perfectly why Kokomiko didn't experience such strong lag on his GTX 280 - the time per step is shorter, and therefore the lag is less noticeable.
To GDF:
Is it possible to change the granularity of the calculation? It seems all we'd need to eliminate the sluggish GUI and the stuttering in games (not talking about the start-new-WU-while-gaming issue) is a reduction of the "batch size" by a factor of 2 to 4 (depending on the card).
The client could calculate the first step of a WU normally, look at the time it took, and then decide how to split up the remaining work against a maximum allowed step time / latency. This should be shorter than 1/(25 fps) = 40 ms, though some testing may be needed here.
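To make the suggestion concrete, here is a minimal sketch of such an adaptive split; the names, the 40 ms budget and the halving strategy are all made up for illustration, nothing like this exists in the client:

/* Purely illustrative sketch of the proposed adaptive splitting:
 * time the first full step, then subdivide until one piece of
 * work fits within a latency budget. All names are made up. */
#define MAX_STEP_MS 40.0               /* < 1/(25 fps), as argued above */

int choose_split_factor(double first_step_ms)
{
    int split = 1;
    while (first_step_ms / split > MAX_STEP_MS)
        split *= 2;                    /* halve the batch each time */
    return split;                      /* 60 ms step -> 2, 120 ms -> 4 */
}

On a 9800GTX+ with its ~60 ms steps this would give a factor of 2, i.e. ~30 ms pieces, just below the budget.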
I understand that dividing the calculation into steps is natural, that further partitioning introduces extra overhead, and that it may be difficult if you're doing things like matrix inversions... so just give the idea a thought or two :)
Another option would be to have normal and short WUs plus a user preference, so that people with slower cards could choose short WUs and thereby keep latency in check. But of course smaller systems are scientifically less interesting.
MrS
____________
Scanning for our furry friends since Jan 2002
> Some new input / ideas:
> Thanks to the new clients we can see the time needed per step. On my 9800GTX+ it is a little over 60 ms with the new client, and more like 51 ms with the previous one (estimated). Latencies of 50 or 60 ms correspond to frequencies of 20 and 16.7 Hz, respectively. This is just what I reported initially, based on my subjective impression!
With my GTX 280, a benchmark at full settings and max resolution in Devil May Cry 4 gave 80 FPS; with PS3GRID running it gave an average of 40 FPS, with a few barely noticeable frame skips.
I can safely say the same happens with a lot of games: performance drops of around 50% with PS3GRID on. I used Fraps on Overlord and Warhammer and noticed the same 50% drop, since those games don't have an FPS counter.
It's not annoying, but I prefer to pause PS3GRID when gaming, for silky performance.
As I tried to express in my first post, measured fps is not the problem, and reporting fps tells us nothing about it.
If a 3DMark test gives me, for example, 800 fps without GPU-Grid and drops to 200 or 300 fps with GPU-Grid, one would assume no problems: 25 fps is generally considered fluent, and 50 fps is beyond any doubt.
However, the 200 fps which I might get with GPU-Grid are not smooth / fluent; the image is "stuttering". Consider it like this: the GPU renders 10 frames, then it executes one GPU-Grid step, which freezes the screen for ~60 ms. Afterwards it renders 10 frames very quickly again, pauses again, and so on. In the end you may count 100 or 200 frames rendered over the duration of 1 s, but this value tells you nothing about the 60 ms interruptions.
The resulting image is not the same as getting one frame every 60 ms (=16.7 fps), but it's not much better either. And you can't see this effect as clearly, because your card is faster and therefore your interruptions are shorter.
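To put made-up but representative numbers on this, say the card renders 10 frames at 5 ms each and then stalls for one 60 ms GPU-Grid step, repeating:

/* Back-of-the-envelope illustration of the pattern described above:
 * the frame counter looks fine while the worst-case gap is awful. */
#include <stdio.h>

int main(void)
{
    int    frames   = 10;     /* quick frames per cycle       */
    double frame_ms = 5.0;    /* time per quick frame         */
    double stall_ms = 60.0;   /* one GPU-Grid step in between */

    double cycle_ms = frames * frame_ms + stall_ms;   /* 110 ms    */
    double avg_fps  = 1000.0 * frames / cycle_ms;     /* ~91 fps   */
    double gap_fps  = 1000.0 / stall_ms;              /* ~16.7 fps */

    printf("counter: ~%.0f fps, worst gap: %.1f fps\n", avg_fps, gap_fps);
    return 0;
}

The counter reports a comfortable ~91 fps, yet once per cycle the image freezes for 60 ms, which is exactly what the eye picks up as stuttering.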
MrS
____________
Scanning for our furry friends since Jan 2002
GDF (project developer)
We have tried splitting the kernels, but the performance penalty was too high, and personally I did not notice a big improvement in window refreshing on an 8800GT either. Your analysis is spot on. I am not too sure why the performance degraded so much. We might try again in the future.
[unrelated]
I think you once asked how we managed to reduce the CPU usage. We poll the GPU for kernel termination every millisecond. If the sleep function behaves correctly, this should not slow things down by more than 1 ms per step, which is indeed what I see on my computer.
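For illustration, a poll-and-sleep loop along these lines might look as follows; this is a sketch against the public CUDA event API, not the actual client code:

/* Sketch of 1 ms completion polling as described above; uses the
 * public CUDA event API and may differ from the real client code. */
#include <cuda_runtime.h>
#ifdef _WIN32
#include <windows.h>
#define sleep_ms(n) Sleep(n)
#else
#include <unistd.h>
#define sleep_ms(n) usleep((n) * 1000)   /* usleep takes microseconds */
#endif

void wait_for_step(cudaEvent_t step_done)   /* event recorded after the kernel */
{
    /* cudaEventQuery returns cudaErrorNotReady while the GPU is busy;
     * sleeping between polls keeps CPU usage near zero, at the cost
     * of up to ~1 ms extra latency per step. */
    while (cudaEventQuery(step_done) == cudaErrorNotReady)
        sleep_ms(1);
}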
gdf
On my system, to play e.g. STALKER I must turn off GPUGRID, because the game is not playable otherwise. Also, when I am watching movies (DivX, Xvid or Matroska), I must turn off GPUGRID, because the movie lags and loses sync between audio and video.
...
Thanks for the information, GDF!
[still unrelated]
It may be possible to reduce the maximum loss of 1 ms (probably 0.5 ms on average) even further at the same CPU overhead: knowing the average time per step for that GPU, and the spread of those times, the client could start polling more often only when the GPU should be ready soon. But for now this is certainly not necessary, and 1 ms is perfectly fine, provided it works as expected; that seems to be the case under Linux, but not so much under Windows yet. A sketch follows below.
[/still unrelated]
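Continuing the hypothetical poll loop from the earlier sketch (expected_ms and spread_ms stand in for the per-GPU statistics mentioned above; all names invented):

/* Sketch of adaptive polling: sleep coarsely while the kernel
 * cannot possibly be done yet, then poll every 1 ms near the
 * expected finish. */
#include <cuda_runtime.h>
#ifdef _WIN32
#include <windows.h>
#define sleep_ms(n) Sleep(n)
#else
#include <unistd.h>
#define sleep_ms(n) usleep((n) * 1000)
#endif

void wait_adaptive(cudaEvent_t step_done, double expected_ms, double spread_ms)
{
    double slept = 0.0;
    /* phase 1: coarse 5 ms sleeps up to the earliest plausible finish */
    while (slept + 5.0 <= expected_ms - spread_ms) {
        sleep_ms(5);
        slept += 5.0;
    }
    /* phase 2: fine 1 ms polls until the GPU reports completion */
    while (cudaEventQuery(step_done) == cudaErrorNotReady)
        sleep_ms(1);
}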
MrS
____________
Scanning for our furry friends since Jan 2002
GDF (project developer)
> Thanks for the information, GDF!
> [still unrelated]
> It may be possible to reduce the maximum loss of 1 ms (probably 0.5 ms on average) even further at the same CPU overhead: knowing the average time per step for that GPU, and the spread of those times, the client could start polling more often only when the GPU should be ready soon. But for now this is certainly not necessary, and 1 ms is perfectly fine, provided it works as expected; that seems to be the case under Linux, but not so much under Windows yet.
> [/still unrelated]
> MrS
So maybe the sleep implementation under Windows that we used is crap. Does anyone know a nanosleep version for Windows?
gdf
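For reference, the closest Win32 analogue to nanosleep is a waitable timer: the due time is given in 100 ns units, although the effective resolution is still limited by the system timer period (timeBeginPeriod can bring that down to roughly 1 ms). A minimal sketch:

/* Minimal sketch of a nanosleep-like wait on Windows using a
 * waitable timer. The due time is in 100 ns units; a negative
 * value means "relative to now". Actual granularity still
 * depends on the system timer period (see timeBeginPeriod). */
#include <windows.h>

int win_nanosleep(LONGLONG nanoseconds)
{
    LARGE_INTEGER due;
    HANDLE timer = CreateWaitableTimer(NULL, TRUE, NULL);
    if (!timer)
        return -1;
    due.QuadPart = -(nanoseconds / 100);   /* relative, 100 ns units */
    if (!SetWaitableTimer(timer, &due, 0, NULL, NULL, FALSE)) {
        CloseHandle(timer);
        return -1;
    }
    WaitForSingleObject(timer, INFINITE);
    CloseHandle(timer);
    return 0;
}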
I'm wondering, is there an issue with memory leaks in the nVidia driver?
I play a game every now and then, and before GPU-Grid this was fine with an ATI card. Not much about my applications or usage has changed, but now, after a few weeks of uptime, my "idle" memory consumption rises to 1.5 - 1.8 GB (that's the point where I notice it, because I *only* have 2 GB), whereas after a reboot it is 1 GB, including BOINC. Restarting BOINC does not help, so something else must be eating my memory.
MrS
____________
Scanning for our furry friends since Jan 2002
I didn't think GPU-Grid ran on ATI cards...
> I didn't think GPU-Grid ran on ATI cards...
It doesn't.
What he is saying is that his games ran fine on an ATI card, before GPU-Grid came along and he replaced the ATI with an nVidia card.