Question

I am new to deep learning. I am running OS X Yosemite on a MacBook Pro (upgraded from Snow Leopard). I don't have a CUDA-enabled GPU, and running the code on the CPU is extremely slow. I heard that I can rent instances on AWS, but it seems that they don't support Mac OS.

My question is: to continue with deep learning, do I need to purchase a graphics card, or is there another solution? I don't want to spend too much on this...


Solution

I would recommend familiarizing yourself with AWS spot instances. They are the most practical solution I can think of for your problem, and they work fine from your computer. So no, you don't have to buy an Nvidia card, but as of today you will want to use one, since almost all of the common deep learning toolkits rely on them.
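
To make that concrete, below is a minimal sketch of requesting a GPU spot instance from Python with the boto3 library; the AMI ID, key pair name, region, and bid price are placeholders you would replace with your own values, and g2.2xlarge was the GPU instance type AWS offered at the time.

    # Minimal sketch (assumes boto3 is installed and AWS credentials are
    # configured, e.g. via `aws configure`). The AMI ID, key name, region,
    # and bid price below are placeholders: substitute your own.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    response = ec2.request_spot_instances(
        SpotPrice="0.20",              # maximum hourly bid in USD (placeholder)
        InstanceCount=1,
        Type="one-time",
        LaunchSpecification={
            "ImageId": "ami-xxxxxxxx",     # placeholder: a CUDA-ready AMI
            "InstanceType": "g2.2xlarge",  # AWS GPU instance type (as of 2015)
            "KeyName": "my-key-pair",      # placeholder: your EC2 key pair
        },
    )

    request_id = response["SpotInstanceRequests"][0]["SpotInstanceRequestId"]
    print("Spot request submitted:", request_id)

Once the request is fulfilled, you connect to the instance over SSH from your Mac, as described further below.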

OTHER TIPS

AWS GPU instances are an option if you want to do CUDA development. If you don't want to rely on the cloud, you can look into the Nvidia Jetson TK1 development kits. They cost about 200 dollars (as of July 2015) and have 192 CUDA cores, a quad-core ARM processor, and 2 GB of RAM.

Alternatively, the same amount of money could buy you a GeForce GTX 750 Ti with 640 CUDA cores, or perhaps a GTX 960 with 1024 CUDA cores. A GT 720 with 1 GB of RAM and 192 cores could be had for 45 dollars (as of July 2015).

You don't have to use Nvidia GPUs for deep learning, but a GPU will speed things up dramatically. Keep in mind, though, that common deep learning toolkits have very little support for non-Nvidia GPUs.
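
As a rough illustration, here is a small sketch using Theano, one of the common toolkits at the time; the matrix size and iteration count are arbitrary. Running it with device=gpu in THEANO_FLAGS targets a CUDA GPU, while device=cpu falls back to the CPU, so you can compare timings on the same machine.

    # Minimal sketch (assumes Theano is installed). Run with
    #   THEANO_FLAGS=device=gpu,floatX=float32 python check_gpu.py   # CUDA GPU
    #   THEANO_FLAGS=device=cpu python check_gpu.py                  # CPU only
    # and compare the timings. Matrix size and iteration count are arbitrary.
    import time
    import numpy
    import theano
    import theano.tensor as T

    # A large shared matrix and a simple elementwise function compiled by Theano.
    x = theano.shared(numpy.random.rand(4096, 4096).astype(theano.config.floatX))
    f = theano.function([], T.exp(x))

    start = time.time()
    for _ in range(100):
        f()
    elapsed = time.time() - start

    print("device:", theano.config.device)
    print("100 calls took %.2f seconds" % elapsed)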

AWS does "support" Mac OS in the sense that you can use any SSH client on your Mac to access a GPU instance.
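
From Terminal, that just means running ssh with the private key for your EC2 key pair. If you want to script the connection instead, the sketch below uses the paramiko library; the hostname, username, and key path are placeholders.

    # Minimal sketch (assumes the paramiko SSH library is installed). The
    # hostname and key path are placeholders for your own instance and key pair.
    import paramiko

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(
        "ec2-xx-xx-xx-xx.compute-1.amazonaws.com",  # placeholder public DNS
        username="ubuntu",                           # default user on Ubuntu AMIs
        key_filename="/Users/you/.ssh/my-key.pem",   # placeholder key file
    )

    # Check that the CUDA driver sees the GPU on the remote instance.
    stdin, stdout, stderr = client.exec_command("nvidia-smi")
    print(stdout.read().decode())
    client.close()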

Licensed under: CC-BY-SA with attribution