C_Payne said: In my imagination it could be done physically

C_Payne said: This leads me to think you do not have the technical knowledge to pull this off. Sorry.

Ryvaeus said: This is madness and I wish you the best of luck!

Neapolitan6th said: How about a carefully considered case mod.

Wow, thank you everyone for pitching in! This is very helpful.

You don't know how much it means to hear this from someone who made a working solution.

Right on, no need to be sorry. That won't deter me from thinking outside the box though! C_Payne, Saper_PL, I'm keeping your pointers about signal integrity and the caveats of connecting from under the motherboard in mind.

So here goes!

I've been taking a closer look at the PCIe slot, and I've noticed there are holes giving access to the pins at the top of the connector. These pins definitely match a specification that ensures PCIe 3.0 throughput, unlike the solder joints under the motherboard.

Connecting a thin flexible flat cable between the slot and the card is the next possibility, since a tape that thin fits. From what I gathered, though, flexible flat cables aren't that great for high-frequency signal integrity over a distance. I was comparing the available options against the PCIe 3.0 differential pair speed (8 GT/s, or 984.6 MB/s of throughput per lane in one direction), and Ethernet cables came to mind, specifically Cat 6a: rated for 500 MHz, double-shielded, capable of 10 Gbps (1.25 GB/s) over 100 m. (See the quick bandwidth sketch at the end of this post.)

Here is the idea: several short Ethernet cables, carefully sized and stripped, soldered to short thin flexible flat cable strips that fit in the PCIe slot, using the holes in the slot to daisy-chain the necessary signals. The REFCLK signal can be dealt with using both methods: picked up through the holes and brought to the extra PCB, then sent back to the slot with a small flexible flat cable, making sure the card's contact pads are insulated from the pins on the mobo while the card still gets a REFCLK signal.

Edit: maybe even avoid using all 4 pairs available in one cable, to further reduce the possibility of crosstalk.

Why, thank you!

The idea is ultimately to get PCIe bifurcation working for more people. It's not every day that you hear "mini-ITX" and "SLI/XF" in the same thread, unless it's to call it a drawback of going mini-ITX!

The best already-available answer to "SLI/XFire in a mini-ITX enclosure" is mini-DTX. That standard didn't take off, and AFAIK there's only one motherboard with two physical PCIe x16 slots at mini-DTX width (fitting in most mini-ITX cases with two PCIe brackets): the Shuttle X79. It requires going with at least an i7-3930k on the Sandy Bridge-E architecture, so expect to give up on M.2, DDR4, or building with an AMD processor...

The solution found in this thread works, there's no denying it, but I don't believe it's perfect. This thread touches three niche markets: "mini-ITX"/"SFF", "SLI/XFire", and "custom case design". I'm trying to move away from the "custom case design" niche to, at worst, the "water-cooling" niche. Let's face it: in the mini-ITX case market, either you go with an alternative card (a video capture card, a sound card, whatever you need) or you go SLI/XFire, which means the first GPU will need cooling within a single slot's depth. The ATX-laid-out Fractal Design Define Nano S (and the hopefully upcoming Meshify Nano S) is a great case for such an application: space for one 240 mm and one 280 mm radiator, two PCIe brackets, a mini-ITX motherboard, and an ATX PSU (the SilverStone ST85F-PT fits in the case and can deliver enough power for two 1080 Ti and a 7700k, all overclocked).

One step further: take the NCASE M1. It has three PCIe brackets. Forget SLI/XFire, and even water-cooling. Use a solution that doesn't "sacrifice" the PCIe slot to a riser, and there you can put a capture card in the first slot and a blower-style GPU in the second slot.
Add the upcoming 6-core i7, and that could be an SFF enthusiast streamer's dream!
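For anyone who wants to sanity-check the bandwidth numbers I'm comparing, here's a quick back-of-the-envelope sketch. This is just my own arithmetic in Python, using the nominal PCIe 3.0 line rate with 128b/130b encoding and the nominal 10GBASE-T rating for Cat 6a, not anything measured:

```python
# Rough arithmetic only: nominal ratings, not measurements.

PCIE3_RATE_GT_S = 8e9        # PCIe 3.0 line rate per lane, per direction (8 GT/s)
PCIE3_ENCODING = 128 / 130   # 128b/130b encoding overhead

pcie3_lane_mb_s = PCIE3_RATE_GT_S * PCIE3_ENCODING / 8 / 1e6
print(f"PCIe 3.0, one lane, one direction: {pcie3_lane_mb_s:.1f} MB/s")  # ~984.6 MB/s

CAT6A_RATE_BIT_S = 10e9      # Cat 6a is rated for 10GBASE-T (10 Gbps) over 100 m
cat6a_mb_s = CAT6A_RATE_BIT_S / 8 / 1e6
print(f"Cat 6a nominal 10GBASE-T rating: {cat6a_mb_s:.1f} MB/s")         # 1250 MB/s
```

On paper the cable rating sits above what one PCIe 3.0 lane needs in one direction, which is what drew me to Cat 6a in the first place; whether the pairs actually behave well enough at 8 GT/s outside a real 10GBASE-T link is of course the open question.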