Robin may have been assuming abundant memory and scarce CPU time? I agree though that unless memory costs are very low this is a problem in the examples.
Robin may have been assuming abundant memory and scarce CPU time?
He’s not saving on CPU time (i.e., total number of instructions executed), but substituting more, slower processors for fewer, faster processors, and also using more memory. We don’t see a lot of this today. For example, render farms and data centers all use essentially the fastest CPUs available. Some operators might back off a few notches from the bleeding edge in order to save money, but it’s not even close to 2x, much less 21x. My earlier “doesn’t seem plausible” may be too strong, but I don’t understand why Robin seems to be predicting this as the most likely scenario. If he has some specific reasons why the economics will likely work out this way, I’d very much like to see them.
We don’t see a lot of this today...it’s not even close to 2x much less 21x.
We see plenty of this today. Every processor you use with multiple slower cores rather than a single core screaming at 4 GHz is making the slow-parallel vs. fast-serial tradeoff. Processor migration and power-saving modes are other examples where the tradeoff is made dynamically. ARM processors are hugely abundant in embedded and mobile spaces, and the ARM design is an example of trading off CPU time for other things like reduced transistor count or (especially) power consumption. ARM and Atom chips are making inroads into datacenters because power consumption & cooling are becoming such issues, and we can expect parallelisation to continue for power saving.
Hmm, apparently my knowledge of server hardware was a bit outdated. ARM processors being used in data centers are running at about 1.5 GHz, and it looks like with extreme overclocking it’s possible to push x86 processors up to 8 GHz, which gives a factor of about 5x. So probably there will be some significant difference between the fastest and slowest uploads, and 21x may not be totally implausible.
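A quick sanity check on that ratio (the 1.5 GHz and 8 GHz figures are the illustrative endpoints from the comment above, not precise benchmarks):

```python
# Rough clock-speed ratio between a slow datacenter ARM core and a
# heavily overclocked x86 core, using the figures cited above.
slow_ghz = 1.5   # typical datacenter ARM clock
fast_ghz = 8.0   # extreme-overclocked x86 clock

ratio = fast_ghz / slow_ghz
print(f"fast/slow clock ratio: ~{ratio:.1f}x")  # ~5.3x
```

Clock speed alone understates the real performance gap (IPC, memory bandwidth, and cache sizes differ too), so 5x is a floor on the plausible spread rather than an estimate of it.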
One benefit of running at a lower speed is that you can interact with things farther away from you while it still seems instantaneous, although I have no idea why that would be more important for the workers than for the boss.
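This effect can be made concrete with a back-of-the-envelope sketch. Assume (as an illustrative threshold not stated in the thread) that a round trip must fit within about 100 ms of *subjective* time to feel instantaneous; an upload running k times faster than real time then tolerates only 100/k ms of wall-clock latency, so its "instantaneous" interaction radius shrinks by a factor of k:

```python
# How far away can an upload interact "instantaneously"?
# Assumption: a round trip within ~100 ms of subjective time feels
# instantaneous (an illustrative figure, not from the comment).
C_KM_PER_S = 300_000        # speed of light in vacuum, an upper bound on signal speed
SUBJECTIVE_MS = 100         # assumed threshold for "feels instantaneous"

def instant_radius_km(speedup: float) -> float:
    """Max one-way distance whose round trip fits the subjective threshold."""
    wall_clock_s = (SUBJECTIVE_MS / 1000) / speedup
    return C_KM_PER_S * wall_clock_s / 2   # round trip halves the radius

for k in (1, 5, 21):
    print(f"{k:>2}x upload: ~{instant_radius_km(k):,.0f} km")
```

At 1x the radius is continental-scale (~15,000 km), while a 21x upload is already confined to a few hundred kilometers; real network latency would shrink these radii further.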