My meaning is very straightforward. While we often treat computation as abstract information processing, in reality it requires and depends on particular resources, notably a computing substrate and a usable inflow of energy. The availability of these resources can and often does limit what can be done.
Physically, biologically, and historically, resource limits are what usually constrain the growth of systems.
The limits in question are rarely absolute, of course, and often enough there are ways to find more resources or to engineer away the need for some particular resource. However, that itself consumes resources (notably, time). For a growing intelligence, resource constraints might not be a huge problem in the long term, but they are often the bottleneck in the short term.
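A toy way to picture this (my own sketch, not something from the thread): logistic growth, the standard textbook model where a resource ceiling K caps an otherwise exponential process. The rate r and ceiling K here are arbitrary illustrative values.

```python
def logistic_step(x, r=0.5, K=1000.0):
    """One Euler step (dt = 1) of dx/dt = r * x * (1 - x / K):
    near-exponential growth while x << K, flattening as x approaches K."""
    return x + r * x * (1 - x / K)

# Start small and iterate; growth is fast early, then resource-limited.
x = 1.0
trajectory = [x]
for _ in range(50):
    x = logistic_step(x)
    trajectory.append(x)
```

The short-term/long-term asymmetry shows up directly: in early steps the constraint is invisible, but the trajectory never exceeds K, and "finding more resources" amounts to raising K, which takes steps of its own.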
The fourth option is that there are resource constraints and they matter.
You mean that our use of resources is already close to optimal, so that higher intelligence won’t boost results a whole lot?
See my reply here.
Could you rephrase what you mean more specifically? Does this apply to AI as well?