Welcome to the third installment of Learning AI if You Suck at Math. If you missed the earlier articles, be sure to check out part 1 and part 2.

Today we’re going to build our own Deep Learning Dream Machine .

We’ll source the best parts and put them together into a number smashing monster. We’ll also walk through installing all the latest deep learning frameworks step by step on Ubuntu linux 16.04.

This machine will slice through neural networks like a hot laser through butter. Other than forking over $129,000 for Nvidia’s DGX-1 , the AI supercomputer in a box, you simply can’t get better performance than what I’ll show you right here.

Lastly, if you’re working with a tighter budget, don’t despair: we’ll outline very budget-friendly alternatives.

First, a TL;DR, Ultracheap Upgrade Option

Before we dig into building a DL beast, I want to give you the easiest upgrade path.

If you don’t want to build an entirely new machine, then you have one perfectly awesome option.
Learning AI if You Suck at Math ― P3 ― Building an AI Dream Machine or Budget  ...

Simply upgrade your GPU (with either a Titan X or a GTX 1080) and get VMware Workstation, or any other virtualization software that supports GPU passthrough!

Install Ubuntu and the DL frameworks via the tutorial at the end of the article and bam, you just bought yourself a deep learning superstar on the cheap!

All right, let’s get to it.

I’ll mark dream machine parts and budget parts like so:

MINO = Money is No Object, aka Dream Machine
ADAD = A Dollar and a Dream, aka Budget Alternative

Dream Machine Parts Extravaganza

GPUs First

CPUs are no longer the center of the universe. AI applications have flipped the script. If you’ve ever built a custom rig for gaming, you probably pumped it up with the baddest Intel chips you could get your hands on.

But times change.

Nvidia is the new Intel.

The most important component of any deep learning world destroyer is the GPU (or GPUs).

While AMD has made headway in cryptocurrency mining in the last few years, they have yet to make their mark on AI. That will change soon as they race to capture a piece of this exploding field, but for now Nvidia is king. And don’t sleep on Intel either. They purchased Nervana Systems and plan to put out their own deep learning ASICs in 2017.

The King of DL GPUs

Let’s start with MINO. The ultimate GPU is the Titan X. It has no competition.

It’s packed with 3,584 CUDA cores at 1,531 MHz and 12GB of GDDR5X memory, and it boasts a memory speed of 10 Gbps.

In DL, cores matter and so does more memory close to those cores.

DL is really nothing but a lot of linear algebra. Think of it as an insanely large Excel sheet. Crunching all those numbers would slaughter a standard 4- or 8-core Intel CPU.
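To make that concrete, here is a back-of-the-envelope sketch in plain Python (the layer sizes are hypothetical, chosen only to illustrate the scale) of how many floating-point operations a single fully connected layer costs per batch:

```python
def dense_layer_flops(batch, inputs, outputs):
    """FLOPs to multiply a (batch x inputs) matrix by an
    (inputs x outputs) weight matrix: one multiply and one
    add per term of the matrix product."""
    return 2 * batch * inputs * outputs

# A modest hidden layer: batch of 256, 4096 inputs, 4096 outputs.
flops = dense_layer_flops(256, 4096, 4096)
print(f"{flops / 1e9:.1f} GFLOPs per forward pass")  # ~8.6 GFLOPs
```

Run that billions of times during training, across dozens of layers, and it becomes clear why thousands of small parallel cores beat a handful of big sequential ones.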

Moving data in and out of memory is a massive bottleneck, so more memory on the card makes a real difference, which is why the Titan X is the king of the world.
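That headline 10 Gbps memory speed translates into total bandwidth once you account for the card's bus width (384 bits is the published Titan X Pascal spec; the rest is just unit conversion):

```python
# Bandwidth = per-pin speed (Gbps) x bus width (bits) / 8 (bits per byte)
gbps_per_pin = 10        # 10 Gbps GDDR5X memory speed
bus_width_bits = 384     # Titan X (Pascal) memory bus
bandwidth_gb_s = gbps_per_pin * bus_width_bits / 8
print(f"{bandwidth_gb_s:.0f} GB/s")  # 480 GB/s
```

That is roughly an order of magnitude more bandwidth than a typical desktop CPU gets to its system RAM, which is exactly why the data-shuffling bottleneck hurts GPUs so much less.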

You can get the Titan X directly from Nvidia for $1,200 MSRP. Unfortunately, you’re limited to two. But this is a Dream Machine and we’re buying four. That’s right: quad SLI!

For that you’ll need to pay a slight premium to a third-party seller. Feel free to get two from Nvidia and two from Amazon. That will bring you to $5,300, by far the bulk of the cost for this workstation.

Now, if you’re just planning to run Minecraft, the game will still look blocky. But if you want to train a model to beat cancer, these are your cards. :)

Gaming hardware benchmark sites will tell you that anything more than two cards is well past the point of diminishing returns, but that’s for gaming only! When it comes to AI, you’ll want to hurl as many cards at it as you can. Of course, AI has its point of diminishing returns too, but it’s closer to dozens or hundreds of cards (depending on the algorithm), not four. So stack up, my friend.

Please note you will NOT need an SLI bridge unless you’re also planning to use this machine for gaming. SLI is strictly for graphics rendering, and we’re doing very little graphics here other than plotting a few graphs in matplotlib.
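That's because deep learning frameworks address each card individually through CUDA rather than ganging them together like SLI does. The usual pattern is data parallelism: every training batch is split evenly across the GPUs, each card computes gradients on its slice, and the results are combined. A framework-free toy sketch of just the splitting step (the `cuda:N` device names are illustrative):

```python
def split_batch(samples, n_gpus):
    """Divide a batch of samples as evenly as possible across n_gpus devices."""
    base, extra = divmod(len(samples), n_gpus)
    chunks, start = [], 0
    for gpu in range(n_gpus):
        size = base + (1 if gpu < extra else 0)  # spread the remainder
        chunks.append(samples[start:start + size])
        start += size
    return chunks

batch = list(range(10))
for gpu_id, chunk in enumerate(split_batch(batch, 4)):
    print(f"cuda:{gpu_id} gets {len(chunk)} samples")
```

In practice the frameworks handle this for you; the point is simply that four cards means four independent workers, no bridge required.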

Budget Friendly Alternative GPUs
Your ADAD card is the GeForce GTX 1080 Founders Edition. The 1080 packs 2,560 CUDA cores, a lot fewer than the Titan X, but it rings in at half the price, with an MSRP of $699.

It also carries less RAM: 8GB versus 12.

EVGA has always served me well, so grab four of them for your machine. At $2,796 vs $5,300, that’s a lot of savings for nearly equivalent performance.

The second-best choice for ADAD is the GeForce GTX 1070. It packs 1,920 CUDA cores, so it’s still a great choice. It comes in at around $499 MSRP, but superclocked EVGA 1070s will run you only $389, which brings the price of four to a more budget-friendly $1,556. Very doable.

Of course if you don’t have as much money to spend you can always get two or three cards. Even one will get you moving in the right direction.

Let’s do the math on best bang for the buck with two or three cards:
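One way to rank the options is raw CUDA cores per dollar. A quick sketch, using the core counts and street prices quoted in this article:

```python
# Each entry: configuration -> (total CUDA cores, total price in dollars)
configs = {
    "3 x Titan X":  (10752, 3800),
    "2 x Titan X":  (7168,  2400),
    "3 x GTX 1080": (7680,  2097),
    "2 x GTX 1080": (5120,  1398),
    "3 x GTX 1070": (5760,  1167),
    "2 x GTX 1070": (3840,   778),
}

# Rank by cores per dollar, best value first.
ranked = sorted(configs.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True)
for name, (cores, price) in ranked:
    print(f"{name}: {cores / price:.2f} CUDA cores per dollar")
```

The 1070s win on pure cores per dollar, but the 1080 builds keep more total cores and RAM in the box, which is what the breakdown below reflects.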

3 x Titan X = 10,752 CUDA cores, 36GB of GPU RAM = $3,800
2 x Titan X = 7,168 CUDA cores, 24GB of GPU RAM = $2,400
3 x GTX 1080 = 7,680 CUDA cores, 24GB of GPU RAM = $2,097
2 x GTX 1080 = 5,120 CUDA cores, 16GB of GPU RAM = $1,398
3 x GTX 1070 = 5,760 CUDA cores, 24GB of GPU RAM = $1,167
2 x GTX 1070 = 3,840 CUDA cores, 16GB of GPU RAM = $778

The sweet spot is 3 GTX 1080s. For a little over half the price you’re only down 3,072 cores. Full disclosure, that’s how I built my workstation.

SSD and Spinning Drive
You’ll want an SSD, especially if you’re building Convolutional Neural Nets and working with lots of image data.
