How much worse is sitting outside in the shade compared to the sun, for the purpose of experiencing enough lux to feel alert?
Exaggerating the difference between shade and sun a bit beyond what it probably is in practice, we can take “sun” to mean “10,000 lux” and “shade” to mean “1,000 lux”. Data to back this up:
Wikipedia gives these lux values for different conditions:
20,000 lux “Shade illuminated by entire clear blue sky, midday”
1,000–2,000 lux “Typical overcast day, midday”
<200 lux “Extreme of thickest storm clouds, midday”
NOAO (National Optical Astronomy Observatory) says:
10,000 lux “Full Daylight”
1,000 lux “Overcast Day”
100 lux “Very Dark Day”
[Side question: What exactly does Wikipedia mean by “Shade illuminated by entire clear blue sky”? Is this a strange way of describing sunlight, or does it refer to the light experienced while standing in the shade of an object on an otherwise sunny day? The wording sounds like the latter to me, but the former fits better with NOAO’s description. I don’t think this distinction breaks my question either way.]
These are the same orders of magnitude; it just seems that NOAO is averaging over the whole day while Wikipedia is describing midday.
Meanwhile, this LW post discusses how to install indoor lighting to get about 1,000 lux inside. If 1,000 lux is enough to keep a human alert and non-SAD, then it seems like shade is no worse than sun. Or maybe people installing lumenators in their homes just stop at 1,000 lux because getting an order of magnitude more lux than that is impractical.
One way of phrasing the argument in favor of lumenators is that going from 100 lux to 1,000 lux provides huge benefits. Can the same thing be said about going from 1,000 lux to 10,000 lux, or are there diminishing returns? In other words, should I sit in the sun instead of the shade?
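To make the arithmetic in the question explicit, here is a minimal sketch. The lux values are the rounded figures quoted above; the idea that the alerting response might scale with log(lux) is an assumption introduced here for illustration, not something established in the question:

```python
import math

# Rough lux values quoted in the question (rounded to orders of magnitude)
very_dark_day = 100   # NOAO "Very Dark Day"
shade = 1_000         # "shade" / overcast day / a ~1,000-lux lumenator setup
full_sun = 10_000     # NOAO "Full Daylight"

# Linear ratios: both jumps are the same factor of 10
step1 = shade / very_dark_day   # 100 -> 1,000 lux
step2 = full_sun / shade        # 1,000 -> 10,000 lux

# Under the (assumed) logarithmic response, each jump is one log-unit,
# so the two steps would be equally valuable; under a response with
# diminishing returns above ~1,000 lux, the second jump would matter less.
log_step1 = math.log10(shade) - math.log10(very_dark_day)
log_step2 = math.log10(full_sun) - math.log10(shade)
print(step1, step2, log_step1, log_step2)
```

So whether sun beats shade comes down to the shape of the dose–response curve between 1,000 and 10,000 lux, not to the raw ratio, which is the same at both steps.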