Is a Purely Rational World a Technologically Advanced World?

What would our world be today if humans had started off with a purely rational intelligence?

It seems as though a dominant aspect of rationality deals with risk management. For example, an irrational person might feel that the thrill of riding a zip line for a few seconds is well worth the risk of injuring themselves, contracting a flesh-eating bug, and losing a leg along with both hands (sorry, but that story has been freaking me out for the past few days; I in no way mean to trivialize the woman’s situation). A purely rational person would (I’m making an assumption here because I am certainly not a rational person) weigh the probability of something going catastrophically wrong against the minimal gain of a short-lived thrill and determine that the risks were too steep.
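To make that trade-off concrete, here is a minimal sketch of the zip-line decision framed as an expected-value calculation. Every probability and utility below is invented purely for illustration; the point is the shape of the reasoning, not the numbers.

```python
# Toy expected-value calculation for the zip-line decision.
# All probabilities and utilities are made up for illustration.

outcomes = {
    # outcome: (probability, utility in arbitrary "well-being" units)
    "uneventful thrill": (0.980, +1),
    "minor injury":      (0.019, -50),
    "catastrophic harm": (0.001, -100_000),  # e.g. a severe infection
}

expected_value = sum(p * u for p, u in outcomes.values())
print(f"Expected value of riding: {expected_value:+.2f}")  # -99.97

# With these (made-up) numbers the expectation is strongly negative,
# so the purely rational agent declines the ride. An irrational agent
# effectively ignores or discounts the low-probability disaster term.
```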

But how does a purely rational intelligence—even an intelligence at the current human level, with a limited ability to analyze probabilities—impact the advancement of technology? As an example, would humanity have moved forward with the internal combustion engine and motor vehicles as purely rational beings? History shows us that humans tend to leap headlong into technological advancements with very little thought regarding the potential damage they may cause. Every technological advancement of note has had negative impacts that, run through the probability equations of a purely rational perspective, might have been deemed too steep a price.

Would pure rationality have severely limited the advancement of technology?

Taken further, would a purely rational intelligence far beyond human levels be so burdened by risk probabilities as to be paralyzed… suspended in a state of infinite stagnation? Or would a purely rational mind simply ensure that advancement proceeds more cautiously (which would certainly have slowed things down)?
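One way to make the paralysis-versus-caution question concrete is a toy model in which an agent refuses to act until the uncertainty in its risk estimate falls below some tolerance. The 1/√n shrinkage of the standard error with n independent observations is standard statistics, but framing it as a decision rule here is my own illustrative assumption, not a claim about how a real rational agent would work.

```python
import math

def observations_needed(tolerance: float, spread: float = 1.0) -> int:
    """Observations needed to push a risk estimate's standard error
    (roughly spread / sqrt(n)) below the given tolerance."""
    return math.ceil((spread / tolerance) ** 2)

for tol in (0.1, 0.01, 0.001, 0.0001):
    print(f"tolerance {tol:>7}: ~{observations_needed(tol):>12,} observations")

# tolerance     0.1: ~         100 observations
# tolerance    0.01: ~      10,000 observations
# ...
# A finite tolerance yields slow-but-finite caution; demanding perfect
# certainty (tolerance -> 0) requires unbounded evidence, which is
# exactly the "infinite stagnation" limit described above.
```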

Many of humanity’s great success stories began as highly irrational ventures with extremely low chances of success. Humans, being irrational and not all that intelligent, are quite capable of ignoring risk, or of simply failing to recognize the level of risk inherent in a given situation. But to what extent would a purely rational approach limit a being’s willingness to take action?

*I apologize if these questions have already been asked and/or discussed at length. I did do some searches but did not find anything that seemed specifically related to this line of thought.*