Build for Deep Learning / Machine Learning (Post Date: 2017-04-24)
jabalin
Newbie | Joined: 24 Apr 2017 | Posts: 8
Topic: Build for Deep Learning / Machine Learning | Posted: 24 Apr 2017 at 12:13pm
Hi all. I'm helping my son configure a build specifically for use in the study of Deep Learning / Machine Learning.
Strawman configuration below, but there are a few questions. First, we're really trying to avoid liquid cooling. Not planning to overclock anything. We have added the fan blow-holes for CPU and GPUs. Will that be enough? Second, are the UEFI temp/fan tools available post-boot?

Budget: High-End
Expectations: Stable, able to run for long periods of time (days)
Usage: CUDA-based Deep Learning / Machine Learning applications
Special Needs: 2x Titan Xp
Saved Ticket #: 1651306

Specifications:
Chassis Model: Aventum 3
Exterior Finish: Black Metallic Matte Finish
Processor: Intel Core i7 6850K 3.6GHz (6-Core) (Unlocked CPU)
Motherboard: ASUS X99-E WS/USB 3.1 (Intel X99 Chipset) (Workstation Class) (Up to 5x PCI-E Devices)
System Memory: 128GB DDR4 2666MHz Corsair Dominator Platinum DHX (Extreme-Performance)
Power Supply: 1200W Corsair AX1200i (Digitally Controlled Power)
Optical Drive: DVD-R/RW/CD-R/RW (DVD Writer 8x / CD-Writer 8x) (Internal)
Storage Set 1: 1x M.2 SSD (1TB Samsung 960 EVO) (NVM Express)
Storage Set 2: 1x Storage (3TB Seagate / Toshiba)
Internet Access: High Speed Network Port (Supports High-Speed Cable / DSL / Network Connections)
Graphics Card(s): 2x GeForce GTX TITAN Xp 12GB (Pascal) in SLI (NVIDIA Founders Edition) (VR Ready)
Sound Card: Integrated Motherboard Audio
Cooling: Air: Stage 1: High-Performance Copper Heat Pipe Cooler
HydroLux Tubing Style / Fluid Color: Not Applicable (no custom HydroLux liquid cooling selected)
Cable Management: Premium Cable Management (Strategically Routed & Organized for Airflow)
Chassis Fans: Corsair Airflow Performance Edition Fans
Internal Lighting: Remote Controlled LED Lighting System (Multiple color options and lighting effects)
Chassis Mods: Fan Blow-Holes: CPU & Graphics Card Area; Laser Cut and Mount (2x) Side Window Fans
Boost Processor: Stock Factory Turbo Boost Advanced Automatic Overclocking
Windows OS: Microsoft Windows 10 Home (64-Bit Edition)
Recovery Tools: Factory Reset Feature (Restore Windows + Drivers) (Partition up to 50GB of Storage)
Virus Protection: FREE: McAfee AntiVirus Plus (1 Year Service Activation Card) (Not Pre-installed) ($35 Value)
Priority Build: No Thanks, Ship Within 15-20 Business Days After Order Is Successfully Processed
Warranty: Life-time Expert Care with 3 Year Limited Warranty (3 Year Labor & 1 Year Part Replacement)
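For context on the second question: post-boot we'd plan to watch temperatures and fan speeds with something like nvidia-smi (it ships with the NVIDIA driver) rather than relying on the UEFI tools. A rough sketch of parsing its CSV output; the sample lines below are made up for illustration, and on a real system the text would come from running the command itself:

```python
# Sketch: parse the CSV printed by
#   nvidia-smi --query-gpu=index,temperature.gpu,fan.speed --format=csv,noheader,nounits
# The sample output below is invented; on a real machine it would come from
# subprocess.check_output([...]).
import csv
import io

sample_output = """0, 62, 45
1, 67, 52
"""

def parse_gpu_stats(text):
    """Return a list of (gpu_index, temp_C, fan_percent) tuples."""
    rows = []
    for row in csv.reader(io.StringIO(text)):
        idx, temp, fan = (field.strip() for field in row)
        rows.append((int(idx), int(temp), int(fan)))
    return rows

stats = parse_gpu_stats(sample_output)
hottest = max(stats, key=lambda r: r[1])
print(hottest)  # the GPU running warmest in the sample data
```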
DS Veteran | Joined: 28 Oct 2014 | Posts: 1674
Posted: 24 Apr 2017 at 12:45pm
I'd really only recommend the Aventum for water-cooled builds. For an air-cooled build I'd recommend the Velox instead; it's already well optimized for airflow, so there's no need to add extra venting. If you need access to lots of 3.5" drives, consider the Apollo.
If you're not going to use ECC RAM, there's no reason to use the WS board. 850W should be more than sufficient for a non-overclocked build like this. I'd recommend Western Digital drives over Seagate and Toshiba.
If you want ECC RAM and more processor cores, a Xeon-based build would be the way to go.
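For reference, here's the rough power math behind that PSU suggestion. The TDP figures are nominal numbers I'm assuming for illustration, not measured draw, so treat this as a back-of-the-envelope check:

```python
# Back-of-the-envelope power budget for the proposed build.
# TDP values are nominal/assumed figures, not measurements.
components_w = {
    "i7-6850K CPU": 140,
    "Titan Xp #1": 250,
    "Titan Xp #2": 250,
    "board/RAM/drives/fans": 100,  # rough allowance for everything else
}

total_w = sum(components_w.values())

# How loaded each candidate PSU would be at the estimated peak draw.
for psu_w in (850, 1200):
    load = total_w / psu_w
    print(f"{psu_w} W PSU -> {load:.0%} loaded at estimated peak")
```

With no overclocking, the estimated peak stays well under either unit's rating; the larger PSU just runs closer to its efficiency sweet spot.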
jabalin
Posted: 25 Apr 2017 at 12:30pm
Thanks for the feedback.
The reasoning for the X99-E WS was room for the two Titan Xp cards and one Intel 750 Series PCIe SSD. The Velox does seem to be a better solution for air cooling. Too bad it does not allow for custom exterior colors.
DS Veteran
Posted: 25 Apr 2017 at 1:11pm
I would think DS could accommodate you on the custom color...they've done it in the past.
I see what you want to do with the PCI-e lanes. The motherboard you'd really want is the original Rampage V Extreme, which can do x16/x0/x16/x4 plus a x4 M.2 card. It seems all the motherboards now cater to 28-lane processors, and none have that configuration anymore, including the Rampage V Extreme Edition 10. So you know, the 750 SSD will share bandwidth with a GPU through a PCI-e switch, unless the two GPUs are sharing the same switch in the top or bottom pair of x16 PCI-e slots.
jabalin
Posted: 25 Apr 2017 at 1:35pm
So if we seat Titan #1 in slot 1 (the x16 slot closest to the CPU) and Titan #2 in slot 3 (x16/x8), then put the SSD in the last slot (x16/x8), would that optimize the bandwidth?
DS Veteran
Posted: 25 Apr 2017 at 5:21pm
For a 40-lane CPU, the first two grey x16 slots (32 lanes total) go to a PCI-e switch and share x16 lanes from the processor. The same goes for the bottom two grey slots. The black slots share bandwidth via quick switches with the slot below them (so if something is in a black slot, it runs at x8 and the grey slot below it drops to x8, whether it is populated or not). They still go through the PCI-e switches, so the top black slot goes through the first switch and the other two go through the second.
The remaining lanes from the processor go to the x4 M.2 slot, which again shows the board catering to 28-lane processors. Both the Aventum and Velox are inverted set-ups, so when I say "top" above, looking at it in one of those cases it would actually be the bottom.
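To make that slot-to-switch mapping concrete, here's a small sketch encoding it. The PCIEX16_n slot names follow the usual ASUS silkscreen convention, and the mapping is my reading of the description above, so verify it against the motherboard manual before relying on it:

```python
# Which PLX PCI-e switch each x16 slot hangs off on the X99-E WS with a
# 40-lane CPU, per the description above. Slot names and the exact mapping
# are assumptions to be checked against the board manual.
SLOT_TO_SWITCH = {
    "PCIEX16_1": 1,  # grey, top pair
    "PCIEX16_2": 1,  # black, quick-switched with the grey slot below it
    "PCIEX16_3": 1,  # grey, top pair
    "PCIEX16_4": 2,  # black
    "PCIEX16_5": 2,  # grey, bottom pair
    "PCIEX16_6": 2,  # black
    "PCIEX16_7": 2,  # grey, bottom pair
}

def share_switch(slot_a, slot_b):
    """True if devices in the two slots would sit behind the same PCI-e switch."""
    return SLOT_TO_SWITCH[slot_a] == SLOT_TO_SWITCH[slot_b]

# The placement discussed in this thread: both Titans behind switch 1,
# the PCIe SSD behind switch 2.
print(share_switch("PCIEX16_1", "PCIEX16_3"))  # GPUs together
print(share_switch("PCIEX16_1", "PCIEX16_7"))  # SSD isolated from the GPUs
```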
jabalin
Posted: 25 Apr 2017 at 5:42pm
That clears it up. Thanks a bunch!
DS Veteran
Posted: 25 Apr 2017 at 11:00pm
I'll add that for GPU-to-GPU communication, when the two GPUs are on the same switch, the traffic can bypass the processor altogether. So maximizing bandwidth to the processor may or may not be best for you, depending on what sort of communication you expect to see.
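Once the system is assembled, you can verify the two cards actually share a switch with `nvidia-smi topo -m`, which prints an interconnect matrix; "PIX" there means the path crosses a single PCI-e switch. A sketch of checking that, where the matrix text is an invented sample of the command's output format:

```python
# Sketch: check whether GPU0 and GPU1 share a PCI-e switch using the matrix
# printed by `nvidia-smi topo -m` ("PIX" = single PCI-e switch between them).
# The sample text below is invented; on a real machine it would come from
# subprocess.check_output(["nvidia-smi", "topo", "-m"]).
sample_topo = """\
        GPU0    GPU1
GPU0     X      PIX
GPU1    PIX      X
"""

def link_type(topo_text, dev_a, dev_b):
    """Return the interconnect label between two devices (e.g. PIX, PHB)."""
    lines = topo_text.strip().splitlines()
    header = lines[0].split()
    col = header.index(dev_b)
    for line in lines[1:]:
        fields = line.split()
        if fields[0] == dev_a:
            return fields[1 + col]
    raise KeyError(f"{dev_a} not found in topology matrix")

print(link_type(sample_topo, "GPU0", "GPU1"))  # PIX in this sample: same switch
```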
jabalin
Posted: 26 Apr 2017 at 6:12am
That's the plan:

Place the Titans on the same switch and use SLI:
Titan #1 seated in PCIEX16_1
Titan #2 seated in PCIEX16_3

Place the Intel 750 Series 1.2TB SSD on the other switch:
SSD in PCIEX16_6 or PCIEX16_7

I guess it does not matter if the SSD is in slot 6 or slot 7. Or does it?
DS Veteran
Posted: 26 Apr 2017 at 11:01am
Nope, it should not matter. It only has a x4 PCI-e connector, so bandwidth is not a concern there.