
Message boards : Number crunching : Is it worth the try?

micropro
Joined: 4 Feb 20
Posts: 8
Credit: 674,423
RAC: 0
Message 57845 - Posted: 16 Nov 2021 | 12:16:55 UTC

Is this project worth trying with the setup I own?

I mean, the tasks are huge, and it's hard to complete them in time.

I've got an i5-9600K (the K is silent, haha: I disabled turbo boost).
Add to that 16 GB of RAM (2666 MHz CL13) and a GTX 1650 (4 GB GDDR6).
I tweaked the configuration files a bit to lock the graphics card's frequency, so it runs at stock frequency with the Nvidia drivers.
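In case it helps anyone, this is the kind of read-only check I run to confirm the lock actually took effect. It is only a sketch, assuming the nvidia-ml-py (pynvml) package is installed; it reads clocks and temperature and changes nothing:

# read-only sanity check of the current GPU state via NVML
# (assumes: pip install nvidia-ml-py)
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
    gpu_clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
    mem_clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
    temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
    print(f"GPU clock: {gpu_clock} MHz, memory clock: {mem_clock} MHz, temperature: {temp} C")
finally:
    pynvml.nvmlShutdown()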

I've switched to Ubuntu 20.04 LTS to try some work units under a Linux OS.

But really, given the time needed to complete a task, should I switch to another project, or am I good to go with this one?

Under Windows I did the math: I would have had to let my card crunch non-stop until the deadline, with quite a bit of luck on top. There was no way I could make it in time.

With the price of hardware (among other things), I'm not ready to buy another card (even one I'd sometimes use for gaming).

Do people with a "low-end" graphics card stand a chance on this project, or should I quit?

Thank you for reading this message.

Best regards,

micropro

ServicEnginIC
Joined: 24 Sep 10
Posts: 566
Credit: 5,945,802,024
RAC: 10,645,296
Message 57852 - Posted: 16 Nov 2021 | 22:58:23 UTC - in response to Message 57845.

Is this project worth trying with the setup I own?

I joined the Gpugrid project in September 2010, with 0 credits and 0 RAC.
I've never had any high-end host, but by managing modest resources with persistence, I'm proud to have contributed to several scientific studies based on Gpugrid work units.

I've got an i5-9600K (the K is silent, haha: I disabled turbo boost).
Add to that 16 GB of RAM (2666 MHz CL13) and a GTX 1650 (4 GB GDDR6).

Two of my currently most productive hosts are based on GTX 1650 graphics cards, paired with similar processors and memory.
Due to their (relatively) low power consumption (75 watts at full load), I find GTX 1650 GPUs easy to keep at reasonable temperatures and very reliable for working 24/7.

I've switched to Ubuntu 20.04 LTS to try some work units under a Linux OS.

That's my current preferred OS.
Working in this environment, a GTX 1650 GPU is able to process the heaviest Gpugrid tasks in less than 3.5 days, well inside the 5-day deadline.
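As a rough back-of-the-envelope check with those numbers (nothing more than simple arithmetic on my own run times):

# deadline margin for the heaviest tasks on a GTX 1650 under Linux (my own numbers)
run_time_days = 3.5        # observed worst-case run time
deadline_days = 5.0        # Gpugrid deadline
uptime_needed = run_time_days / deadline_days
print(f"GPU uptime needed: {uptime_needed:.0%} (about {uptime_needed * 24:.1f} hours/day)")
print(f"Margin when crunching 24/7: {deadline_days - run_time_days:.1f} days")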

Do people with a "low-end" graphics card stand a chance on this project, or should I quit?

I've already expressed my extended opinion on this in the Managing non-high-end hosts thread.

micropro
Joined: 4 Feb 20
Posts: 8
Credit: 674,423
RAC: 0
Message 57854 - Posted: 17 Nov 2021 | 8:58:47 UTC - in response to Message 57852.

Hi,

Thank you very much for the explanation, your advice and the information given!

That thread about managing the project with non-high-end hardware is great.

I'm definitely going to give it a solid try (working with persistence) for a while.

Best regards,

micropro

micropro
Joined: 4 Feb 20
Posts: 8
Credit: 674,423
RAC: 0
Message 58156 - Posted: 18 Dec 2021 | 10:54:24 UTC - in response to Message 57852.

Just one more question if I can...

What are the frequencies of your GPU and its memory while it is computing a task?

I'm using MSI Afterburner (I've got an MSI card, so...) and I've locked the GPU frequency at 1350 MHz and left the memory frequency untouched.

On the Nvidia website, the specs put the base clock at 1410 MHz for my GTX 1650 (the GDDR6 version; it's different for GDDR5).

So, the question is... can I lock it at 1410 MHz?
Can I lower the memory frequency?

Best regards,

micropro

ServicEnginIC
Joined: 24 Sep 10
Posts: 566
Credit: 5,945,802,024
RAC: 10,645,296
Message 58159 - Posted: 19 Dec 2021 | 19:01:53 UTC - in response to Message 58156.

I've locked the GPU frequency at 1350 MHz and left the memory frequency untouched.

All my GTX 1650 graphics cards are based on GDDR5; I haven't tested GDDR6 models.
I've searched on TECHPOWERUP and found the NVIDIA GeForce GTX 1650 GDDR6 specifications.
If there isn't any temperature or reliability problem, locking the GPU clock to 1350 MHz means you're accepting a potential performance loss.
You should have no problem locking its clock to the standard base frequency of 1410 MHz.
You could even try up to 1590 MHz, its specified boost clock, if reliability and temperatures remain fine.
Regarding memory, I usually prefer not to raise the original clock frequency, because (as far as I know) there is no way to monitor whether the memory chips are overheating or not...
As for memory underclocking, I've never needed to try it.
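If you ever want to do the same thing from a script instead of Afterburner, something along these lines should work with a recent driver. It is only a sketch, assuming the nvidia-ml-py (pynvml) package; setting locked clocks needs root/administrator rights, and nvidia-smi -lgc 1410,1410 does the same from the command line:

# lock the graphics clock and keep an eye on the temperature (needs root/admin)
import time
import pynvml

TARGET_MHZ = 1410   # base clock of the GDDR6 GTX 1650; 1590 (boost) may also be worth a try

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    # pin the graphics clock: minimum and maximum set to the same value
    pynvml.nvmlDeviceSetGpuLockedClocks(handle, TARGET_MHZ, TARGET_MHZ)
    for _ in range(10):   # sample for about a minute
        clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
        print(f"GPU clock {clock} MHz, temperature {temp} C")
        time.sleep(6)
    # pynvml.nvmlDeviceResetGpuLockedClocks(handle)   # uncomment to remove the lock again
finally:
    pynvml.nvmlShutdown()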

jjch
Joined: 10 Nov 13
Posts: 98
Credit: 15,288,150,388
RAC: 1,732,962
Message 58160 - Posted: 19 Dec 2021 | 19:46:26 UTC
Last modified: 19 Dec 2021 | 19:50:25 UTC

What I do with all my Nvidia GPUs is run EVGA's Precision X software and set the fan curve to Aggressive.
When the cards are cooled properly, they will automatically run at the boost clock until they hit their power limit.
You can also raise the power target a bit, to 110%, and they will run up to the maximum boost clock possible.

For example, I have a GTX 1070 with a rated boost clock of 1683 MHz that actually runs at 1873 MHz. The temperature is also only 50-51 °C.

It's definitely the easy button for getting the best performance without fiddling around with all of the settings manually.
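If you're on Linux and don't run Precision X, the power-limit part can be scripted too. This is only a sketch with the nvidia-ml-py (pynvml) package; it needs root, and the driver caps the value at whatever the card actually allows:

# raise the power limit toward its maximum (needs root)
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(handle)
    _, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
    # aim for 110% of the default limit, but never above what the card allows
    target_mw = min(int(default_mw * 1.10), max_mw)
    pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
    print(f"Power limit set to {target_mw / 1000:.0f} W "
          f"(default {default_mw / 1000:.0f} W, card maximum {max_mw / 1000:.0f} W)")
finally:
    pynvml.nvmlShutdown()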

micropro
Joined: 4 Feb 20
Posts: 8
Credit: 674,423
RAC: 0
Message 58165 - Posted: 20 Dec 2021 | 18:09:30 UTC

Thank you both for your advice.

I locked my GTX 1650 because I also run some Primegrid, and I had read somewhere that we have to be extremely cautious about hardware stability to maximise the chances of getting a valid result.

I somehow mixed up the requirements here and there and... well, my apologies.

They have a point, though: reliable and stable hardware matters (for all projects, actually).

Thanks again!

Best regards,

micropro
