Nick Krader (Production Technician)

[BuildLog] Dual Socket Broadwell-EP Xeon Insanity

Written on July 12, 2016 by Nick Krader

Here at Puget Systems, we make some amazing computers, and I always thought it would be cool to show some of you what it looks like when a pile of parts transforms into a Puget System.  My name is Nick -- I'm an assembly technician who also does some of the Instagram photography here, and this is the first build log of a monthly series I will be doing for 2016.

 


This build features parts from some great companies, and here is what is going into it.

Case - Fractal Design - Define XL R2
Motherboard - Asus - Z10PE-D8 WS
CPUs - 2x Intel Xeon - E5-2697 v4 (THAT IS 72 THREADS @ 2.3 GHz!)
CPU Cooling - 2x Corsair - H60
Memory - 8x 32GB Samsung DDR4-2133 (256GB Total!)
Power Supply - EVGA - 1200 P2 - 80+ Platinum
Video Cards - 2x EVGA - GTX 1080 Founders Edition
Drives - 
Samsung 950 Pro M.2 - 512GB NVMe
Samsung 850 Pro SATA - 1TB
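The headline numbers in that spec list come from simple multiplication -- the E5-2697 v4 is an 18-core, 36-thread part per Intel's published specs. A quick sanity check:

```python
# Arithmetic behind the spec list (core/thread counts from Intel's
# published E5-2697 v4 specifications).
cpus = 2
cores_per_cpu = 18       # E5-2697 v4 is an 18-core part
threads_per_core = 2     # Hyper-Threading doubles the thread count
dimms, gb_per_dimm = 8, 32

total_cores = cpus * cores_per_cpu
total_threads = total_cores * threads_per_core
total_ram_gb = dimms * gb_per_dimm

print(total_cores, total_threads, total_ram_gb)  # 36 72 256
```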


This will be an incredibly powerful machine, and I'm pretty excited to get this system up and running.  With these specs, there's no doubt this system has some very complex computational tasks ahead of it.


The motherboard is out of the box.  Man, I love these huge boards -- they are such amazing pieces of tech. Here's a close-up of the LGA 2011-3 socket, with its astounding 2,011 pins that carry all that power to those little CPUs. They install easily enough, and then it's on to the next step.


In goes the RAM! Some would consider eight sticks of 32GB each to be overkill, but in the world of workstations and servers, this is quite an average amount of RAM. In this day and age, it's needed when you have such an enormous amount of parallel processing power.
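To put "average for a workstation" in perspective, divide the memory across the hardware threads -- rough arithmetic only:

```python
# With 72 hardware threads sharing the pool, 256 GB is roughly
# 3.6 GB per thread -- comfortable for heavily parallel workloads.
total_ram_gb = 8 * 32
threads = 72
print(round(total_ram_gb / threads, 1))  # 3.6
```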


I'm installing the NVMe drive into its dedicated PCIe x4 M.2 slot on the motherboard, and then the SATA drive goes into its little home in the case. This machine will also be prewired with data and power cables for four additional SATA drives, so when the customer receives the machine they can simply install the four HDDs/SSDs they already have.


Next up are the GPUs! These ultra-powerful Nvidia GTX 1080s should handle pretty much anything you can throw at them, from gaming to CUDA compute projects. The new heatsink design makes them look so much more beautiful than previous generations of Nvidia graphics cards.

Powering this monstrosity of a machine will be the >90%-efficient, 1200-watt EVGA power supply. It has more than enough power and connectors to suit this build wonderfully -- especially since, with every revision of computer hardware, everything becomes more powerful while somehow using less electricity?!
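Efficiency here describes the DC output relative to what the unit pulls from the wall. Assuming roughly 90% efficiency at full load (the actual figure varies with load level), the arithmetic looks like this:

```python
# What ~90% efficiency means at the wall for a 1200 W rated supply:
# the difference between wall draw and DC output is shed as heat.
dc_load_w = 1200
efficiency = 0.90          # assumed full-load efficiency

wall_draw_w = dc_load_w / efficiency
waste_heat_w = wall_draw_w - dc_load_w

print(round(wall_draw_w), round(waste_heat_w))  # 1333 133
```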

Testing the PSU for correct voltages.  This makes sure our power supply is free from defects that could otherwise damage the rest of the parts.
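The check itself is simple in principle: each main rail must read within a tolerance band around its nominal voltage (the ATX specification allows ±5% on these rails). A minimal sketch, with made-up sample readings for illustration:

```python
# Sketch of the tolerance check a PSU tester performs. The ATX spec
# allows +/-5% on the main rails; sample readings are hypothetical.
ATX_NOMINALS = {"+12V": 12.0, "+5V": 5.0, "+3.3V": 3.3}

def rail_ok(rail, measured, tolerance=0.05):
    """Return True if the measured voltage is within tolerance."""
    nominal = ATX_NOMINALS[rail]
    return abs(measured - nominal) <= nominal * tolerance

readings = {"+12V": 12.18, "+5V": 5.02, "+3.3V": 3.28}
print(all(rail_ok(r, v) for r, v in readings.items()))  # True
```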

Now to install the CPU heatsinks. First things first, I remove the stock thermal paste. The stock paste really isn't bad -- if you are doing this at home, I would recommend using the paste that comes preinstalled on the Corsair units, because it really is quite good. Here at Puget, though, we replace the paste mostly for consistency's sake.


With that paste out of the way, I apply a fresh layer of ARCTIC MX-2 thermal paste. Then I install the mounting hardware and get those heatsinks mounted up, making sure to always tighten the screws down in a progressive criss-cross pattern.


And with both heatsinks installed, it's time to bench-test the hardware and update the BIOS to the latest available revision.



Next is unboxing the chassis, and checking to make sure there was no damage in shipping. After all, scratches do happen. This one looks great!

After that, I remove all the grommets, drive sleds, fans, etc., because the next step requires some drilling: two of the standoff positions that the EATX Asus motherboard needs are not predrilled in this chassis.


Then, using a trusty silver Sharpie, I mark both of the standoff locations that need to be drilled and use a center punch to mark the exact center. I drill and tap the holes so the brass standoffs thread right in like they were meant to be there.


With the standoffs installed, the board goes right in and mounts up with all available mounting points populated. Now it's just a matter of mounting the radiators to the chassis, installing a few more small bits, and getting to the major task of routing and cleaning up all the cables that run all over inside of this thing.


To protect the GPUs themselves and the motherboard slots, we use custom laser-cut brackets to brace the expansion cards so they don't get broken in transit. A few parts are needed to achieve this: two small brackets attach to the end of the cards at the OEM mounting points, and a plate and arm mount to the drive cage in front of them.  After the cards are installed, the cage simply slides back into place, capturing the cards and locking them in place.



Ugh. Slowly and steadily these cables will be pruned and routed to make it as clean as possible. (With a build like this you can't really make it “perfect” due to the sheer number of cables and the size of the chassis.)
 

 

 

A little better.

 

 




Getting better. Now to add the prewiring for the four customer drives and zip-tie a little more into place. The inside is looking pretty dang good; the backside only needs to be functional. After all, you won't see it apart from the occasional time you open the back panel for cleanup or an upgrade.


Almost completed!!


After completion, I sign the machine so we can keep track of who built your computer -- look inside any newer Puget Systems machine and you will see the signature of the assembly technician who built it from start to finish. Then we run tests on the memory and drives to make sure the hardware is in peak condition, and send it off for installation of the OS and software of choice. After that, it's passed to QC for thermal imaging and stress testing, and then it's packaged up and shipped out.

 
 

Tags: Build Log, Broadwell-EP, GTX 1080, Xeon
Patrick Browne

Okay, what's the damage hehe ($??.??), & is this ECC across the board, please :-)? Thank You.

Posted on 2016-07-28 01:34:05
slayerizer

impressive, what a dream machine... I'm curious, is the GTX 1080 better served for gaming with an i7 CPU or the Xeon? I was watching older E5-2683 v3 CPUs on eBay (QS). What do you plan to do with that machine? Minecraft? ahah

I have two i7-4790 builds (one configured as a desktop, the other as a Hyper-V server)

Posted on 2016-08-10 19:23:30

This is so great...

Randomly placed request:

1. I'd love to see you dive into NIC performance differences, both in bandwidth and latency – the state of the art in current 1 GigE NICs, from the Intel adapters built into the motherboards (the Intel I219-V you've got on one of the desktops looks pretty good), to this whole Killer souped-up NIC thing (maybe there are other boutique NIC brands out there too). Something I've wondered about is whether, and to what extent, the NIC TCP offload features matter on the desktop. The built-in Intel NICs have some offload features, but I have no idea if Windows 10 or 7 leverages those, and how it affects performance (especially latency on things like gaming). (Cool paper: http://www.barrelfish.org/publ...

2. Relatedly, I think some professional desktop users are starting to bump up against the limits of 1 GigE connections, and have access to > 1 Gbps pipes, so I wonder if you've tested any of the new 2.5 GigE and 5 GigE cards (I think one of the emerging standards is IEEE 802.3bz). Or fiber on SFP+ or something. At some point we really ought to push for fiber all the way down from the last mile to a port in my laptop. Anyway, I'm curious about > 1 GigE.

Posted on 2016-08-20 03:25:19
Leonardo Graca

Is the 950 SSD bootable on this motherboard? Is there a big performance gain compared to a normal SSD?

Posted on 2016-10-15 02:45:13