|
Post by Aleslammer on May 10, 2019 7:50:23 GMT -5
Vin, picked up two of the PCIe x16 extenders for possible use on the board I'm using. The image gives a better idea of the size, plus it shows it sits flat; I wasn't sure from the Newegg image. The VGA card in the image is a 960.
|
|
|
Post by Vinster on May 10, 2019 8:58:26 GMT -5
wow, that's a good size. Wonder if one of those would fit in my Haf-X case...
Thanks for the pic
Vin
|
|
|
Post by Aleslammer on May 10, 2019 11:06:33 GMT -5
Looking at this from the competitive side, we're a long way from anything close to the gargantuan teams at the top, but we are kicking along at a pretty good pace at present production. Got a little more comfortable with the spreadsheet at extreme overclocking.
|
|
|
Post by ShrimpBrime on May 10, 2019 11:56:38 GMT -5
Whoa, 5.3 mil yesterday.... Droch pulled 2.8 mil yesterday... that's more than a 2080 Ti... there must be a few GPUs/systems pulling that. Vin, we'd have to gauge this on how many WUs were submitted between 3-hour shifts; I'm at work so I cannot do that ATM.
|
|
|
Post by Vinster on May 10, 2019 12:26:01 GMT -5
Aleslammer said: "Looking at this from the competitive side we're a long way from anything close to the gargantuan teams at the top but we are kicking along at a pretty good pace at present production. Got a little more comfortable with the spreadsheet @ extreme overclocking." Great spreadsheet over there. It would be hard to match or compete with that data availability... and 100% agree on the competitiveness, but we are, as always, doing well. For reference:
we are currently in position 718 with a new 4 mil ppd average, and it's still climbing as the average propagates. The next team ahead of us putting out more than 1.0-1.5 mil ppd in a 24-hour average is Team Picard, in position 357.
Yep, we're not a whale, but we'll be in the top 400 in 2 months at this rate. Top 200 is attainable in under a year. Top 100 is 2 years away based on our 4 mil ppd average.
If we were to stop outright, it would take a few years for us to be knocked down under the position-1200 region for tracking user points.
Vin
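A minimal sketch of the overtake math behind projections like these. The 4M ppd figure is from the post; the point gap and the other team's rate in the example are made up for illustration:

```python
# Rough overtake calculator for team standings. Assumes both teams
# keep a constant daily points average, which real standings won't.

def days_to_overtake(point_gap, our_ppd, their_ppd):
    """Days until we pass a team `point_gap` points ahead of us.
    Returns None if we never catch them at these rates."""
    rate_diff = our_ppd - their_ppd
    if rate_diff <= 0:
        return None
    return point_gap / rate_diff

# Hypothetical example: a team 50M points ahead producing 1.2M ppd,
# versus our 4M ppd average from the post.
print(days_to_overtake(50_000_000, 4_000_000, 1_200_000))  # ~17.9 days
```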
|
|
|
Post by Aleslammer on May 10, 2019 14:19:32 GMT -5
|
|
|
Post by Vinster on May 10, 2019 15:04:11 GMT -5
ah, yes.. WU/Hr/Day.. I didn't factor that in. Then I'd be with you on one or more systems with multiple smaller GPUs. Smaller GPUs get smaller work units and complete them faster (same as CPU WUs)... Maybe 4-6 980Ti/1070/1080s?
With an average of 2.5 mil ppd across, say, 4 GPUs, that's 625,000 ppd/GPU... across 6 GPUs it's 416,666 ppd/GPU....
so,
980Ti ~ 650k ppd
1070 ~ 700k ppd
1080 ~ 850k ppd
Radeon RX Vega 56 ~ 740k ppd
Radeon RX Vega 64 ~ 740k ppd
I'm just running my 1080Ti and R9 290X, and I pull 11-14 WU/day and average 1.3 mil ppd... I have 2 GPUs for that. Droch is getting over 3 times my WUs, so that's where I landed on the 4-6 GPUs.
That's my thinking...
Vin
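The per-GPU split above can be checked with a one-liner. The 2.5M ppd target and the GPU counts are from the post; the script is just a convenience:

```python
# Divide a target daily output across a number of cards
# (2.5M ppd and the 4/6-GPU counts are from Vin's post).
target_ppd = 2_500_000
for gpus in (4, 6):
    print(f"{gpus} GPUs -> {target_ppd // gpus:,} ppd per GPU")
# 4 GPUs -> 625,000 ppd per GPU
# 6 GPUs -> 416,666 ppd per GPU
```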
|
|
|
Post by Aleslammer on May 11, 2019 10:36:37 GMT -5
I'm down hardware problems, might be awhile before I'm up again.
|
|
|
Post by ShrimpBrime on May 11, 2019 10:52:30 GMT -5
Me too. Giving the HW a little break. Did a little gaming. Went offline yesterday after work. I don't push the rigs too hard myself. I run the 2700X stock, no boost, 16 threads. Stays nice and cool, mid-50C range, with the CPU fan under 50% this way. Nice and quiet. The video card I boost maybe 50 MHz and leave the fan profile stock; it also stays under 50% fan load. May turn it on for a while here and there. I got a Saturday off work for once.
|
|
|
Post by Bones on May 11, 2019 12:07:06 GMT -5
Getting another one done right now and may do it again later. Won't be long before I won't be able to fold; temps will be too high, and the power bill will be too, with the AC running roughly 24/7 here. I'm just throwing in work while I can and will have to wait out summer before starting again, for the most part.
|
|
|
Post by Vinster on May 14, 2019 7:08:24 GMT -5
Yesterday was our new highest daily output at 5,682,264 points. Wow, that is a ton... We're now in position 668, and we'll be in the top 500 in under 5 days.
Today we are passing an "nVidia" and "Harvard" team.
in 4 days we are passing Free-DC (don't remember why that name is familiar)
Vin
|
|
|
Post by WhiteWulfe on May 14, 2019 7:19:37 GMT -5
Free-DC is a stats aggregator, most commonly known for tracking BOINC projects. I think they also track Folding@Home.
|
|
|
Post by Vinster on May 14, 2019 8:04:44 GMT -5
WhiteWulfe said: "Free-DC is a stats aggregator, most commonly known for tracking BOINC projects. I think they also track Folding@Home." That's why it rings a bell... FYI, I blame my memory loss on the pot I smoked when I was younger.
thank you for the info
Vin
|
|
|
Post by Aleslammer on May 16, 2019 13:30:31 GMT -5
I'm finishing up on my daily. Picked up a Kill-A-Watt meter; the box is using 620 watts to produce about 980K (just what's fed through the PSU). Looking through some information, a single RTX 2060 in a Z platform would be around 250 watts for 900K. Added: Never mind, just checked the weather and it's going to be cold for the next 4 or 5 days, so I'm not going to need the AC. With the heat dump I might not need the heater. Besides, I have to get by the vegans; being a strong believer in PETA, I have to show I care.
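The efficiency gap Aleslammer is describing is easier to see as points per watt. Both sets of numbers are from the post; the function is just a convenience:

```python
# Points-per-watt for the current box (620 W for ~980K ppd) versus
# the estimated RTX 2060 build (250 W for ~900K ppd).

def ppd_per_watt(ppd, watts):
    return ppd / watts

current = ppd_per_watt(980_000, 620)   # ~1,581 ppd/W
rtx2060 = ppd_per_watt(900_000, 250)   # 3,600 ppd/W
print(f"current box: {current:,.0f} ppd/W, RTX 2060 build: {rtx2060:,.0f} ppd/W")
```

Roughly a 2.3x efficiency improvement at these figures, before counting the AC savings.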
|
|
|
Post by Vinster on May 16, 2019 21:07:10 GMT -5
^ I love it.. that's good... haha
Vin
|
|
|
Post by ShrimpBrime on May 16, 2019 21:51:19 GMT -5
|
|
|
Post by ShrimpBrime on May 17, 2019 22:48:39 GMT -5
Hmm, been thinking. You guys/gals been folding up a storm and raking in big points. But I'm killing all ya'll by number of work units completed!
|
|
|
Post by WhiteWulfe on May 18, 2019 0:51:20 GMT -5
ShrimpBrime said: "Hmm, been thinking. You guys/gals been folding up a storm and raking in big points. But I'm killing all ya'll by number of work units completed!" Wow, it's even 3x more than what I've done all in, over 15 or so years...
|
|
|
Post by Vinster on May 18, 2019 20:39:29 GMT -5
in 1 day we will be passing the Bot team point count....
and Droch_Leanna has added HW to the fold pulling 3,000,000ppd now....
Vin
|
|
|
Post by Vinster on May 18, 2019 23:21:08 GMT -5
Ales what did you add to put in that whoopin?
Vin
|
|
|
Post by Aleslammer on May 19, 2019 5:40:31 GMT -5
Did some swapping with the daughter and got the 1080 Ti back; it got here Friday.
The X79 has a bad LAN port (2-3 hour uploads), so I'm trying a USB WiFi adapter; working so far. Going to run for a few days to get a baseline, then go down to switch the two 1080s out to a Z board and give Linux a go. Although I can't go much longer at this rate: it's heating up in the next couple of days, I'll be using more power than the solar panels produce, and PG&E power is flat-out costly.
|
|
|
Post by ShrimpBrime on May 19, 2019 11:28:17 GMT -5
|
|
|
Post by Aleslammer on May 19, 2019 12:21:42 GMT -5
|
|
|
Post by ShrimpBrime on May 19, 2019 16:12:04 GMT -5
Back on track. OK Aleslammer, did some more digging. Looks like we need a board with 2x PCI-E slots and to run the TCC driver.
docs.nvidia.com/gameworks/content/developertools/desktop/nsight/tesla_compute_cluster.htm
Setting TCC Mode for Tesla Products
NVIDIA GPUs can exist in one of two modes: TCC or WDDM. TCC mode disables Windows graphics and is used in headless configurations, whereas WDDM mode is required for Windows graphics. NVIDIA GPUs also come in three classes:
GeForce: typically defaults to WDDM mode; used in gaming graphics.
Quadro: typically defaults to WDDM mode, but often used as TCC compute devices as well.
Tesla: typically defaults to TCC mode. Current drivers require a GRID license to enable WDDM on Tesla devices.
NVIDIA Nsight Graphics debugging requires the target GPU to be in WDDM mode. NVIDIA Nsight Compute debugging does not require a specific driver mode. The NVIDIA Control Panel will show you what mode your GPUs are in; alternately, you can use the nvidia-smi command to generate a table that will display your GPUs and what mode they are using.
To change the TCC mode, use the NVIDIA SMI utility. This is located by default at C:\Program Files\NVIDIA Corporation\NVSMI. Use the following syntax to change the TCC mode:
nvidia-smi -g {GPU_ID} -dm {0|1}
0 = WDDM
1 = TCC
|
|
|
Post by Aleslammer on May 19, 2019 16:40:46 GMT -5
Shrimpy. THANKS
|
|
|
Post by Aleslammer on May 20, 2019 6:28:08 GMT -5
I'm dropping out; I'm only going to be good for about a week a month. Summer might be a little less productive: the AC, plus I have to ramp the pool pumps up to summer mode. Those last two have been off, since it's been cool enough the last couple of days.
|
|
|
Post by Vinster on May 20, 2019 13:32:34 GMT -5
lol, I gamed the other day and forgot to re-enable folding on my 1080Ti... oh well, pretty big hit in points.
Aleslammer, doing great on those points. I read that Linux will net you an extra 5-15% depending on the distro.
Vin
|
|
|
Post by Bones on May 20, 2019 20:25:08 GMT -5
Same as Ale, weather has caught up with me so for now I'm having to lay back.
|
|
|
Post by prodigit on May 21, 2019 21:30:25 GMT -5
Found your group via extremeoverclocking. For a while you were 'on my heels', as my server had an outage twice this week.
I'm currently running 6.3M PPD at 750W, or could run 6M PPD at 720W, or 6.6M PPD at 780W: one 2080 Ti, two 2080s, and one 2060. The 2080 Ti = 2.2-2.6M PPD, the 2080s are 1.4-1.5M PPD each, and the 2060 = 1.04M PPD. And a second 350W, 2.1M PPD unit, running a 2070 (~1.1M PPD) and a 2060 (~925k PPD) on and off.
Wished you guys would try to overtake me. We started around the same time, and I'm only 200M points away from you: folding.extremeoverclocking.com/team_summary.php?t=44511
For more info on properly setting up a Linux machine, follow this thread. It'll also give you tips on overclocking and lowering your power bill. The RTX cards can cut power by 1/3rd for nearly identical performance: foldingforum.org/viewtopic.php?f=16&t=31595
If you want more info on which cards are the best to buy, you can read up below, or just take my word for it that the RTX 2060 is the best bang for the buck, the 1660 or 1660 Ti are the lowest cards you should consider, and the RTX 2080 or 2080 Ti are the fastest: forums.evga.com/HARDWARE-GUIDE-Folding-at-Home-what-hardware-should-I-get-m2927907.aspx
And AMD is no match efficiency-wise. They're much slower for folding, cost a ton in electricity, and generate more heat.
Max GPUs recommended per system: 6 on an Intel system; 3 to 4 on a quad-core Intel; 1 to 2 on a dual core; 6-8 on an AMD system (with 24 PCIe lanes and at least 8 cores, or 6 cores / 12 threads).
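A quick sanity check on prodigit's main-server figures: the per-card PPD estimates (midpoints where a range was given) should roughly sum to the quoted 6.3M at 750 W. All numbers are from the post:

```python
# Per-card PPD estimates from prodigit's post; ranges replaced by midpoints.
cards = {
    "RTX 2080 Ti": 2_400_000,   # midpoint of 2.2-2.6M
    "RTX 2080 #1": 1_450_000,   # midpoint of 1.4-1.5M
    "RTX 2080 #2": 1_450_000,
    "RTX 2060":    1_040_000,
}
total = sum(cards.values())
print(f"total ~{total / 1e6:.2f}M ppd, ~{total / 750:,.0f} ppd per watt at 750 W")
```

That works out to roughly 8,400 ppd/W, several times the Pascal-era efficiency mentioned earlier in the thread.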
|
|
|
Post by ShrimpBrime on May 21, 2019 22:19:12 GMT -5
Firstly and fore-mostly, welcome to Warp9 prodigit!
That is one heck of a setup, to be able to maintain a lead against all of us (though we're not a large team...). Some of our members are very new to this, so the information you've shared is very nice of you! Thank you! Everything helps!!
Yes power bills matter a lot here. I could never think about running 780w constantly unless I can get my kids to move out... then that could change! Thanks for the tips on which GPUs to buy into...
I'm curious if you've run Tesla cards, a K20 for example, and have some information about setting them up and which motherboards would be the best choice. I cannot for the life of me get a board to POST with my K20 installed. Nothing happens. I'm lost with it, really and honestly.
|
|