I am quite dumb. Six months ago I bought a Tensorbook for $4,000. It has an RTX 3080 Max-Q, which is the whole point of buying a Tensorbook: a powerful GPU with 15 TFLOPS and 16 GB of VRAM.
But now I can buy a P100 on eBay for less than $200 from China, or a K80 for less than $100 (prices excluding import tax). The P100 has 16 GB of VRAM, 19 TFLOPS, and much more memory bandwidth; the K80 has 24 GB of VRAM and 5 TFLOPS. On top of that, the Tensorbook often crashes unless I prop it up so air can circulate underneath it. My guess is that it can't handle the heat under load. Oops!
It would have been much better to buy a laptop without a GPU and just use cloud computing, or even just Google Colab. And now that I am at a somewhat permanent location that already has a powerful tower computer, buying a cheap P100 seems like a good option. All this becomes even stupider when you consider that I have only used the GPU for tens of hours.
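For what it's worth, the cost gap can be made concrete with some back-of-the-envelope arithmetic using the rough figures above. This is only a sketch: it ignores import tax, power and cooling, architectural differences between generations, and the fact that the $4,000 buys a whole laptop rather than just a GPU.

```python
# Rough dollars-per-TFLOP and dollars-per-GB-of-VRAM, using the
# approximate prices and specs quoted in the post.
gpus = {
    # name: (price_usd, tflops, vram_gb)
    "Tensorbook (RTX 3080 Max-Q)": (4000, 15, 16),  # whole-laptop price, not just the GPU
    "P100 (used, eBay)": (200, 19, 16),
    "K80 (used, eBay)": (100, 5, 24),
}

for name, (price, tflops, vram) in gpus.items():
    print(f"{name}: ${price / tflops:.0f}/TFLOP, ${price / vram:.1f}/GB VRAM")
```

Even with the caveat that the laptop price covers far more than the GPU, the used data-center cards come out more than an order of magnitude cheaper per TFLOP.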