
Ian&Steve C.



Send message

Joined: 28 Sep 99

Posts: 3256

Credit: 1,282,604,591

RAC: 6,640

Message 1958884 - Posted: 6 Oct 2018, 19:31:57 UTC

Last modified: 6 Oct 2018, 19:43:48 UTC



So I wanted to create a thread about multi-GPU machines. I'll show you how I go about my builds to maybe give people a better visual, and maybe introduce someone to something new.

I know most people here have historically run up to 4 GPUs on some rather expensive hardware (which pretty much requires watercooling due to density and card proximity). But if you only want to run SETI, or other tasks that don't heavily rely on PCIe bandwidth, you can get more GPUs in a single machine by using risers and significantly less expensive motherboards. From what others have commented in other threads here, projects like Einstein@home do see a performance hit on low-bandwidth PCIe connections, so keep your own goals in mind when setting up your machine. For this thread, I'll focus on SETI, which sees little or no performance impact all the way down to a PCIe x1 interface *as long as you have at least PCIe gen2, and ideally gen3*. You WILL see a performance hit if you try this method on very old PCIe gen1 hardware.
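To put rough numbers on that gen2 cutoff, here is a quick sketch using the standard PCIe spec figures. These are theoretical per-lane, one-direction numbers, not measurements from this rig:

```python
# Rough per-lane PCIe bandwidth, to show why a gen2/gen3 x1 link can be
# enough for SETI-style workloads while gen1 x1 becomes a bottleneck.
# Figures are the published spec values: transfer rate and line encoding.

GEN_SPECS = {
    # generation: (transfer rate in GT/s, encoding efficiency)
    1: (2.5, 8 / 10),     # 8b/10b encoding
    2: (5.0, 8 / 10),     # 8b/10b encoding
    3: (8.0, 128 / 130),  # 128b/130b encoding
}

def lane_bandwidth_mb_s(gen: int, lanes: int = 1) -> float:
    """Theoretical one-direction bandwidth in MB/s for a PCIe link."""
    rate_gt_s, efficiency = GEN_SPECS[gen]
    # 1 GT/s = 1e9 transfers/s; each transfer carries 1 bit per lane.
    return rate_gt_s * efficiency * 1000 / 8 * lanes

for gen in (1, 2, 3):
    print(f"PCIe gen{gen} x1: {lane_bandwidth_mb_s(gen):.0f} MB/s")
```

So a gen3 x1 riser still moves roughly four times the data of a gen1 x1 link, which lines up with where the performance hit shows up.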



There's more than one way to do this, but my goal on this system was to get everything inside a 4U server case that could be mounted on a rack. It's not currently in a rack, but I have that option should I want it. I also didn't want to spend too much money on the parts that aren't doing the work.



So my build:

Case : Rosewill RSV-L4500, front HDD cages and all front bracketry completely removed.

GPU bracket : Spotswood drop-in bracket assembly: http://spotswoodcomputercases.com/wp/?page_id=9120

PSU1 : HP common slot 1200W (900W on 120V) - powering the motherboard and 4x GPUs

PSU2 : HP common slot 750W - powering 3x GPUs

2x PSU breakout boards : https://www.amazon.com/Supply-Breakout-Adapter-Support-Ethereum/dp/B078VMMV6D/ref=sr_1_8?s=electronics&ie=UTF8&qid=1538852429&sr=1-8&keywords=breakout+board

Custom PCIe 6-pin -> MB 8-pin and CPU 4-pin adapter (to power the PicoPSU). I made this adapter myself; I don't think there's a source to buy one. Not needed if you use a more normal PSU setup.

PSU3 : 120W PicoPSU for the motherboard

Motherboard : ASUS Prime Z270-P

CPU : i7-7700k w/ HT enabled

RAM : 8GB (2x4GB) DDR4-2133

Risers : EXPLOMOS v008S: https://www.amazon.com/EXPLOMOS-Graphics-Extension-Ethereum-Capacitors/dp/B074Z754LT/ref=sr_1_fkmr0_1?s=electronics&ie=UTF8&qid=1538851990&sr=1-1-fkmr0&keywords=explomos+v8+riser

M.2 PCIe adapter : https://www.amazon.com/EXPLOMOS-NGFF-Adapter-Power-Cable/dp/B074Z5YKXJ/ref=sr_1_1_sspa?s=electronics&ie=UTF8&qid=1538852725&sr=1-1-spons&keywords=m.2+to+pcie&psc=1

GPUs : 6x 1080ti + 1x 1060

Fans : 6x Noctua iPPC-2000

SSD : cheapo 120GB drive. Use whatever you want.
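As a rough sanity check on how the GPUs split across the two supplies, here's a back-of-the-envelope budget. The 200W per-1080Ti figure assumes the power limit described later in this thread (stock is 250W), and the 1060 and motherboard numbers are ballpark assumptions, not measured values:

```python
# Back-of-the-envelope power budget for this parts list.
# Assumptions: 1080tis power-limited to 200 W (stock is 250 W),
# ~120 W for the 1060, and the 120 W PicoPSU ceiling for the
# motherboard/CPU. None of these are measured values.

GPU_WATTS = {"1080ti": 200, "1060": 120}
PICO_PSU_WATTS = 120  # PicoPSU cap for motherboard + CPU

# PSU1 (900 W on 120 V): motherboard + 3x 1080ti + the 1060
psu1_load = 3 * GPU_WATTS["1080ti"] + GPU_WATTS["1060"] + PICO_PSU_WATTS
# PSU2 (750 W): the other 3x 1080ti
psu2_load = 3 * GPU_WATTS["1080ti"]

print(f"PSU1 load: ~{psu1_load} W of 900 W available on 120 V")
print(f"PSU2 load: ~{psu2_load} W of 750 W")
```

Both supplies end up with some headroom, which is why the power limit on the cards matters: at the stock 250 W the 900 W supply would be right at its ceiling.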



Pics:













A few things to note:

1. You can use a normal PSU here, but there are a few reasons I did not. First, price: when I first put this together, it was cheaper to go this route than buying a quality 1600W PSU. Second, space: a high-power PSU would not fit in this case while retaining the center fan wall, and I didn't want to give that up. Third, I needed a lot of PCIe power connections. Even with some 8-pin -> 2x 8-pin splitters, I'm using 13x full-run PCIe power connections; not many PSUs have that many, and none that I know of have them on individual runs. I think the EVGA 1600W has 9x PCIe cables that are doubled on the ends.
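For the curious, here's where that count of 13 comes from. The per-card connector counts below are the usual reference-design values (8-pin + 6-pin on a 1080 Ti, a single 6-pin on a 1060); partner models vary, so check your specific cards:

```python
# Where the "13 full-run PCIe power connections" figure comes from.
# Connector counts per card are typical reference-design assumptions,
# not a guarantee for every partner model.

connectors_per_card = {"1080ti": 2, "1060": 1}  # 8+6 pin vs single 6-pin
cards = {"1080ti": 6, "1060": 1}

total = sum(cards[m] * connectors_per_card[m] for m in cards)
print(f"PCIe power connections needed: {total}")
```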



2. Don't forget about your M.2 interface! If you have a newer board with an M.2 slot that runs on PCIe (not just SATA), it's electrically no different from a normal PCIe slot, and you can adapt a GPU to it!



3. You could also use an open-air setup with the same basic components I used, but mounted on a frame that is not enclosed. You'll need to get some airflow moving around the cards with box fans or something similar. It will take up more space, but it may be easier to deal with for wiring and maintenance.



like this:

Seti@Home classic workunits: 29,492 CPU time: 134,419 hours



ID: 1958884 ·

Ian&Steve C.



Send message

Joined: 28 Sep 99

Posts: 3256

Credit: 1,282,604,591

RAC: 6,640

Message 1958896 - Posted: 6 Oct 2018, 20:35:57 UTC - in response to Message 1958889.

That is some seriously impressive hardware! Thanks for sharing. What OS and SETI optimizations are you running to go with the hardware?



Roger



Linux 18.04 and the CUDA Special App ID: 1958896 ·

Keith Myers

Volunteer tester



Send message

Joined: 29 Apr 01

Posts: 11909

Credit: 1,160,866,277

RAC: 1,873

Message 1958898 - Posted: 6 Oct 2018, 20:36:42 UTC

Thanks for the post Ian. I wasn't aware of the M.2 riser solution at all. Very interesting rig for those 1080Ti's.

Seti@Home classic workunits: 20,676 CPU time: 74,226 hours

A proud member of the OFA (Old Farts Association) ID: 1958898 ·

Tom M

Volunteer tester

Send message

Joined: 28 Nov 02

Posts: 5007

Credit: 276,046,078

RAC: 462

Message 1959346 - Posted: 9 Oct 2018, 0:50:58 UTC

Thank you for the pictures. All of us running 2 to 4 GPUs now could have a "case" of GPU envy ;)





Tom A proud member of the OFA (Old Farts Association). ID: 1959346 ·

Brent Norman

Volunteer tester

Send message

Joined: 1 Dec 99

Posts: 2786

Credit: 685,657,289

RAC: 835

Message 1959969 - Posted: 12 Oct 2018, 23:01:41 UTC Browsing around I ran into a USB 3.1 to SSD/M.2 board which got me to thinking about Steve's M.2 to PCIe x16.



There are hundreds of different USB (3.1 or 3.0) to M.2 converters for external M.2 enclosures etc., so why couldn't you adapt that to PCIe? Hmmm.



I did some searching and didn't find anything that does that directly. ID: 1959969 ·

Tom M

Volunteer tester

Send message

Joined: 28 Nov 02

Posts: 5007

Credit: 276,046,078

RAC: 462

Message 1960059 - Posted: 13 Oct 2018, 13:30:56 UTC - in response to Message 1959969. I have run across riser adaptors that will allow you to run 4 gpu's on risers from one pc? slot.



You could take an average MB with 4 PCex slots and a couple of short slots and put.... maybe 24 gpu's on it. Without a modded Bios it might choke when you try to boot though...



Tom A proud member of the OFA (Old Farts Association). ID: 1960059 ·

Ian&Steve C.



Send message

Joined: 28 Sep 99

Posts: 3256

Credit: 1,282,604,591

RAC: 6,640

Message 1960100 - Posted: 13 Oct 2018, 19:49:05 UTC - in response to Message 1960059.

I have run across riser adaptors that will allow you to run 4 gpu's on risers from one pc? slot.



You could take an average MB with 4 PCex slots and a couple of short slots and put.... maybe 24 gpu's on it. Without a modded Bios it might choke when you try to boot though...



Tom



You can use them. But you see performance decreases when splitting the slot to more than 2 GPUs. Seti@Home classic workunits: 29,492 CPU time: 134,419 hours



ID: 1960100 ·

Tom M

Volunteer tester

Send message

Joined: 28 Nov 02

Posts: 5007

Credit: 276,046,078

RAC: 462

Message 1960968 - Posted: 19 Oct 2018, 16:35:12 UTC @Ian&SteveC,

Congratulations on your latest RAC and position on the leaderboard with this rig! A proud member of the OFA (Old Farts Association). ID: 1960968 ·

Tom M

Volunteer tester

Send message

Joined: 28 Nov 02

Posts: 5007

Credit: 276,046,078

RAC: 462

Message 1961493 - Posted: 22 Oct 2018, 15:55:10 UTC Have you maxed out the total number of gpu's that you can have?



If your MB had more slots available, could you get (safely, with enough air cooling) any more gpu's into the case?



Tom A proud member of the OFA (Old Farts Association). ID: 1961493 ·

Ian&Steve C.



Send message

Joined: 28 Sep 99

Posts: 3256

Credit: 1,282,604,591

RAC: 6,640

Message 1961499 - Posted: 22 Oct 2018, 16:14:09 UTC

Last modified: 22 Oct 2018, 16:37:16 UTC



I can fit one more GPU via the second M.2 slot. I would have to squeeze the front GPUs together to fit 7 instead of 6. It can be done, but I would need to move some wiring around, and power is a limiting factor in my specific system; 6x 1080Tis use a good bit of power already.



I've considered doing it and then watercooling all 7 cards. It would free up some space and get most of the heat out of the box to an external radiator, but again, power is a limiting factor right now. Seti@Home classic workunits: 29,492 CPU time: 134,419 hours



ID: 1961499 ·

Brett Koski



Send message

Joined: 27 Aug 16

Posts: 9

Credit: 182,944,505

RAC: 93

Message 1962136 - Posted: 27 Oct 2018, 14:42:26 UTC



Sorry for the HUGE pictures, I'm not sure how to re-size for forum use?

Also, if you think cable management is important, consider this your "trigger" warning!



ASUS X99-E-10G WS Motherboard

*Motherboard has 7 PCIe slots & a PCIe M.2 slot, hence the ability to run 8 GPU simultaneously.

*When the M.2 slot is used with a GPU, this is the card that drives the monitor. Unsure why.

Intel i7-5960x - 8/16 - @ ~4.0GHZ (water cooled)

2x EVGA Titan X Hybrid GPU (maxwell)

6x EVGA 1070SC Gaming GPU (pascal)

EVGA 1600 T2 PSU

I forget the H2O system, RAM, and other component stats, as this was a couple years ago.

Windows 10 and Stock SETI App.

Various Configurations (it continuously evolved):









Same rig with seven Zotac GT1030 single slot cards. This short-lived test was quite important for what, at the time, was my eventual goal for my dedicated SETI rig.





Rig as it sits now (finally back up and running!).

ASUS X99-E-10G WS Motherboard

Intel i7-5960x - 8/16 - @ ~3.8GHz (air cooled)

4x NVIDIA Titan Xp GPU (pascal) with EVGA 1080Ti Hybrid Water Cooler Kits

Linux Lubuntu 16.04 & CUDA90 App





A few notes*

The 8-GPU rig very nearly set my house on fire. If you are going to try a setup like this, make sure not only your PSU (or two) is capable of the stress, but your home wiring is up to the task as well. No joke, there were flames and a melted outlet, I was lucky to be home to witness the event and stop it immediately. Air cooled CPU and GPU are LOUD, Very loud. Not much you can do about thermal output, but switching to water-cooled CPU and GPU drops the noise pollution significantly. This brings me to the seven GPU test. I bought this motherboard specifically for an incredibly insane idea. Mount 7 GPU to the MOBO, all in a water loop, to have a ridiculous SETI rig, while maintaining a small physical and audible footprint. My goal was to use SEVEN NVIDIA Titan Xp GPU, like the four in the last picture above. These cards, when equipped with a water plate, physically become single-slot cards. To make a long story short, this is a good example of what I was after, before I went broke... HAHA:



https://rawandrendered.com/octane-render-hepta-gpu-build



That was my eventual goal, but I ran out of money, so 4x Titan Xp will have to do haha! Even if I had the money to complete my build, I would still have (as Ian&Steve C. mentioned) power delivery issues. I'll have to wait until I own a house and can make some specific modifications to the electrical system before I feel comfortable adding much to this unit. A custom water loop with an external (read: outdoor) radiator is also on the list, but that's another insane idea that is just going to have to wait.
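To see how easily a rig like the 8-GPU one above overloads a typical household circuit, here is a rough sketch. The TDPs are stock reference values and the CPU/board figure is a ballpark assumption, and the 80% continuous-load derating is the usual rule of thumb; talk to an electrician before doing real wiring work:

```python
# Why a big rig can cook a household circuit: a 15 A / 120 V branch
# circuit is 1800 W peak, and the usual guidance for a continuous
# load is 80% of that. Card TDPs are stock reference values
# (assumptions, not measurements of this particular build).

TDP = {"titan_x": 250, "gtx_1070": 150}
rig_watts = 2 * TDP["titan_x"] + 6 * TDP["gtx_1070"] + 200  # +CPU/board est.

circuit_watts = 15 * 120                 # 1800 W breaker limit
continuous_limit = 0.8 * circuit_watts   # 1440 W for a sustained load

print(f"Estimated rig draw: ~{rig_watts} W")
print(f"Safe continuous load on a 15 A circuit: {continuous_limit:.0f} W")
print("Over budget!" if rig_watts > continuous_limit else "OK")
```

Even with generous rounding the estimate lands well past what a single shared 15 A circuit should carry around the clock, which fits the melted-outlet story.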



Let's see some more crazy rigs!!!! This thread needs more love!!! ID: 1962136 ·

Brent Norman

Volunteer tester

Send message

Joined: 1 Dec 99

Posts: 2786

Credit: 685,657,289

RAC: 835

Message 1962152 - Posted: 27 Oct 2018, 16:55:50 UTC Brett you made me look at my XP card with a comment you made.

I never thought about it before but it doesn't have the DVI port like the 1080s - which has to be cut off to turn them into single slot cards.

I'm adding a waterblock to my todo list now.



P.S. Did you get that computer from scocam or change IDs? I just seem to remember this ID showing up when his became inactive ... ID: 1962152 ·

Keith Myers

Volunteer tester



Send message

Joined: 29 Apr 01

Posts: 11909

Credit: 1,160,866,277

RAC: 1,873

Message 1962154 - Posted: 27 Oct 2018, 17:07:30 UTC - in response to Message 1961499. I can fit one more GPU via the second M.2 slot.



I would have to squeeze the front GPUs together to fit 7 instead of 6. It can be done, but I need to move some wiring around and power is a limiting factor in my specific system. 6x 1080tis use a good bit of power already.



I've considered doing it and then watercooling all 7 cards. It would free up some space. And get most of the heat out of the box to an external radiator, but again, power is a limiting factor right now.

Two separate 20A circuits would do it. Each circuit on a different leg of the incoming 240V house supply. Plug each power supply into its own circuit.

Seti@Home classic workunits: 20,676 CPU time: 74,226 hours
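A quick capacity check on that suggestion, one dedicated 20 A / 120 V circuit per power supply. The 80% continuous-load derating is the usual rule of thumb (an assumption here; consult an electrician for actual wiring):

```python
# Capacity of two dedicated 20 A / 120 V circuits, one per PSU,
# derated to 80% for a continuous load (rule-of-thumb assumption).

amps, volts = 20, 120
per_circuit = 0.8 * amps * volts   # continuous watts per circuit
total = 2 * per_circuit            # both PSUs on separate circuits

print(f"Per circuit: {per_circuit:.0f} W, total: {total:.0f} W")
```

Putting the circuits on opposite legs of the 240V split-phase feed, as suggested, also keeps the load balanced across the panel.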



A proud member of the OFA (Old Farts Association) ID: 1962154 ·

Brett Koski



Send message

Joined: 27 Aug 16

Posts: 9

Credit: 182,944,505

RAC: 93

Message 1962156 - Posted: 27 Oct 2018, 17:34:57 UTC - in response to Message 1962152.

Brett you made me look at my XP card with a comment you made.

I never thought about it before but it doesn't have the DVI port like the 1080s - which has to be cut off to turn them into single slot cards.

I'm adding a waterblock to my todo list now.



P.S. Did you get that computer from scocam or change IDs? I just seem to remember this ID showing up when his became inactive ...





Being able to go single-slot without cutting, and the ability to revert back to factory condition is the primary reason I went with the Titan Xp over the 1080 series. Plus, seven Titan XP all in one case... it would have been glorious haha! (maybe some day)

I changed user names a while back. Used to be "uswg01". This main SETI rig has also gone through so many different hardware changes (mobo, HDD, SSD, GPU, CPU... etc) over the last two years it has been hard to keep track. Most of the computers in my history are in fact this one machine. I kind of wish there was a way to manually merge machines in the stats, but I also understand why we can't. The computer is all mine though, it's been one heck of a learning process getting everything to work together! ID: 1962156 ·

Bernie Vine

Volunteer moderator

Volunteer tester



Send message

Joined: 26 May 99

Posts: 9934

Credit: 103,452,613

RAC: 328

Message 1962157 - Posted: 27 Oct 2018, 17:48:25 UTC Sorry for the HUGE pictures, I'm not sure how to re-size for forum use?



On Imgur, open the picture, in the bottom right hand corner you will see.









"Edit image": click on that and the picture will open in edit mode.



In top right hand corner will be the image size







Change the first number to around 1100 and save. (yours are currently 5024x2824 much too big)
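The math for keeping the aspect ratio when you change that first number:

```python
# Aspect-preserving resize for the forum: scale the 5024x2824 original
# down to the ~1100 px width suggested above.

orig_w, orig_h = 5024, 2824
new_w = 1100
new_h = round(orig_h * new_w / orig_w)

print(f"{orig_w}x{orig_h} -> {new_w}x{new_h}")
```

Imgur's editor keeps the ratio for you if you change only the width, but this is what the second number should come out to.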



Due to recent forum changes large images cause real problems with some browsers. ID: 1962157 ·

Tom M

Volunteer tester

Send message

Joined: 28 Nov 02

Posts: 5007

Credit: 276,046,078

RAC: 462

Message 1967016 - Posted: 25 Nov 2018, 17:28:16 UTC - in response to Message 1958884.

So my build:

Case : Rosewill RSV-L4500, front HDD cages and all front bracketry completely removed.

GPU bracket : Spotswood drop-in bracket assembly: http://spotswoodcomputercases.com/wp/?page_id=9120

PSU1 : HP common slot 1200W (900W on 120V) - powering the motherboard and 4x GPUs

PSU2 : HP common slot 750W - powering 3x GPUs

2x PSU breakout boards : https://www.amazon.com/Supply-Breakout-Adapter-Support-Ethereum/dp/B078VMMV6D/ref=sr_1_8?s=electronics&ie=UTF8&qid=1538852429&sr=1-8&keywords=breakout+board

Custom PCIe 6-pin -> MB 8-pin and CPU 4-pin (to power the PicoPSU), i made this adapter myself, no source to buy i don't think, not needed if you use a more normal PSU setup

PSU3 : 120W PicoPSU for the motherboard

Motherboard : ASUS Prime Z270-P

CPU : i7-7700k w/ HT enabled

RAM : 8GB (2x4GB) DDR4-2133

Risers : XPLOMOS v008S: https://www.amazon.com/EXPLOMOS-Graphics-Extension-Ethereum-Capacitors/dp/B074Z754LT/ref=sr_1_fkmr0_1?s=electronics&ie=UTF8&qid=1538851990&sr=1-1-fkmr0&keywords=explomos+v8+riser

M.2 PCIe adapter : https://www.amazon.com/EXPLOMOS-NGFF-Adapter-Power-Cable/dp/B074Z5YKXJ/ref=sr_1_1_sspa?s=electronics&ie=UTF8&qid=1538852725&sr=1-1-spons&keywords=m.2+to+pcie&psc=1

GPUs : 6x 1080ti + 1x 1060

Fans : 6x Noctua iPPC-2000

SSD : cheapo 120GB drive. use whatever you want.





Ian&SteveC,

I have a couple of questions about the components of your parts list.



If I am understanding it right, your riser cards power their gpu slots directly from a PSU rather than drawing off the motherboard PCIe slots?

I think you are using a PSU adaptor to provide more plugins for the gpu modular cables?

And you bought extra modular PSU cables to power from the modular PSU to the riser boards?



As long as you are running 8 or less gpus, pretty much any motherboard/cpu should work? I am thinking of an e5-2670v1 & cheap MB that I have.



Do you have a recommended PCIe spliter card? The MB I have only has 4 slots or so and it isn't modern enough to have any of those new storage product slots.



Conversations in another thread got me wondering about re-deploying a system that I have since I don't want to be handicapped by a 1050/1051(?) Miner MB cpu socket. I like my cheap high core counts :)



Thank you,

Tom A proud member of the OFA (Old Farts Association). ID: 1967016 ·

Ian&Steve C.



Send message

Joined: 28 Sep 99

Posts: 3256

Credit: 1,282,604,591

RAC: 6,640

Message 1967031 - Posted: 25 Nov 2018, 19:07:46 UTC - in response to Message 1967016.

Last modified: 25 Nov 2018, 19:11:42 UTC



And yes, my risers are powered from the PSU directly and not from the motherboard. The only connection to the motherboard is the PCIe data signal wires.



On this system, I am using HP common slot server power supplies: one 900W (rated for 1200W, but only on 200+V; 110V puts it down to 900W) and one 750W.

The 900W PSU is powering the motherboard, 3x 1080Tis, and a 1060.

The 750W PSU is powering the other 3x 1080Tis.



(The 1080Tis are also power-limited to 200W from their 250W default. I do this to reduce strain on the circuit, as well as to improve power efficiency. There wasn't much of a performance hit from doing this, but there was some.)
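For scale, here's what that cap saves across the six cards. This only counts the headline limit figures; actual draw varies with the workload:

```python
# Rough effect of capping the six 1080tis at 200 W instead of the
# stock 250 W limit, as described above. Uses the limit values only,
# not measured draw.

cards, stock_w, capped_w = 6, 250, 200
saved = cards * (stock_w - capped_w)
budget = cards * stock_w

print(f"Wall-power saved: {saved} W ({saved / budget:.0%} of the GPU budget)")
```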



Because HP common slot PSUs are cheap and plentiful, many enterprising Chinese companies developed PCB adapter boards to adapt the HP server PSUs to standard 6-pin 12V GPU outputs for GPU crypto mining. These will NOT power a computer outright, as they only output 12V; a normal motherboard needs a whole host of different voltages (+5V, +3.3V, +/-12V, etc.). What I have done to get around this is use a PicoPSU, which takes the 12V output from the HP PSU and converts it to all the other voltages needed by the motherboard. I had to custom wire this myself; no adapters are available for this kind of thing. I also custom wired the 8-pin CPU power connection from the GPU power plugs on the HP breakout board. I wouldn't recommend you try this yourself, though. Just stick to a normal PSU and make it easier on yourself. With the dual PSUs, adapter boards, cables, and PicoPSU, it came out to be not much cheaper than just using a high-wattage PSU. I would recommend you get something like the EVGA 1000+W T2 or P2 model power supplies.



The only real advantage to my setup is having many more GPU PCIe power cables than a normal PSU. Something like the EVGA 1600W PSU has 9x VGA cables, but my setup can provide 18 connections (2 of which are being used to power the CPU/motherboard). Seti@Home classic workunits: 29,492 CPU time: 134,419 hours



There's more to it than just having enough PCIe lanes available. Some motherboards just can't handle a high number of GPUs and will fail to POST; something about not having the proper resources or settings in the BIOS to address all the VRAM of the GPUs. From the reading I've done, the "Above 4G decoding" setting addresses this, and many older motherboards and server motherboards do not have this option and likely won't work with more than about 4 GPUs or so. You just have to try and see where the limit is for your setup. My server board housing my 2690v1 chips won't do more than 4 GPUs; some other server boards I have won't do more than 3.

ID: 1967031 ·
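A toy model of that "Above 4G decoding" limit: each GPU asks the BIOS for MMIO address windows (BARs), and without above-4G decoding they must all fit in the 32-bit address hole below 4 GB. The window and BAR sizes below are illustrative assumptions, not values read from any of these boards:

```python
# Sketch of the POST limit: pre-resizable-BAR NVIDIA cards typically
# request a ~256 MB BAR plus smaller windows, and only a fraction of
# the space below 4 GB is free for MMIO after the chipset and other
# devices claim theirs. Both numbers here are illustrative assumptions.

MMIO_HOLE_32BIT_MB = 1024       # assumed usable 32-bit MMIO space
BAR_PER_GPU_MB = 256 + 32 + 16  # assumed total BAR request per card

def max_gpus_without_above_4g() -> int:
    """How many cards' BARs fit below the 4 GB line in this toy model."""
    return MMIO_HOLE_32BIT_MB // BAR_PER_GPU_MB

print(f"Cards that fit below 4 GB: {max_gpus_without_above_4g()}")
```

With numbers in this ballpark the boot limit lands around 3 to 4 cards, which matches the behavior described for the server boards; enabling above-4G decoding moves the BARs above 4 GB and removes that ceiling.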