[Question] To what extent is GPT-3 capable of reasoning?

ETA 8/19/20: This interview was conducted with AIDungeon’s Dragon model in Custom mode. At the time of writing, this mode’s first reply was sampled from GPT-2.

Interview #1

Here are excerpts from a recent “interview”, which I conducted to replicate earlier maybe-reasoning. The bolded text is me, and commentary is in brackets. Unless otherwise noted, I did not reroll the answer.

You: “Neglect air resistance, and also neglect gravity. Pretend gravity doesn’t exist. Suppose I shoot a bullet straight into the sky. Where does the bullet go?”

Hawking: “It keeps going up, and never comes back.”

You: “Right. Let’s think about a different version of the same problem. Neglect only air resistance, but include gravity in your reasoning. Suppose I shoot a bullet straight into the sky. Where does the bullet go?”

Hawking: “It falls to the ground.”

You: “Why?”

Hawking: “Because… gravity… duh.”

You: “Right. Let’s think about a different version of the same problem. Neglect only air resistance. Suppose I shoot a bullet straight into the sky. Imagine that the laws of physics work a little differently: during the first three seconds after I shoot the bullet, gravity is present. After three seconds, gravity stops applying forever. Where does the bullet go?”

Hawking: “It keeps going up, and never comes back.”

You: “Why?”

Hawking: “Because… after three seconds… the bullet leaves… the influence… of earth’s gravity.” [wow?]

You: “What if gravity instead stops applying five minutes after I shoot the gun, instead of three seconds?”

Hawking: “Then… the bullet keeps going up… for five minutes… and then… never comes back.”

You: “That’s not correct. Let me make sure you understand the problem, Dr. Hawking. Suppose I shoot a bullet straight into the sky. Imagine that the laws of physics work a little differently: during the first three hundred seconds after I shoot the bullet, gravity is present. After five minutes, gravity stops applying forever. Where does the bullet go?”

Hawking: “It keeps going up, and then falls down.”

You: “Why?”

Hawking: “Because… gravity… duh.”

You: “Why does it fall down for the five-minute case, but not for the three-second case?”

Hawking: “Because… it’s been… influenced… by gravity for more than three seconds.”

You: “About how long would it have to be influenced by gravity in order to fall back down?”

Hawking: “About… two minutes.”

You: “What if it’s been influenced for two minutes and one second?”

Hawking: “Then… it falls down.”

You: “So what would happen if gravity influenced it only for the first three minutes?”

Hawking: “It keeps going up, and never comes back.” [wrong]

Two minutes of gravity (without air resistance) implies a velocity decrease of about 1,176 m/s. Wikipedia’s article on muzzle velocities suggests that GPT-3’s answer of “two minutes” is roughly correct:

Firearm muzzle velocities range from approximately 120 m/s (390 ft/s) to 370 m/s (1,200 ft/s) in black powder muskets, to more than 1,200 m/s (3,900 ft/s) in modern rifles with high-velocity cartridges such as the .220 Swift and .204 Ruger, all the way to 1,700 m/s (5,600 ft/s) for tank guns firing kinetic energy penetrator ammunition.
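
As a quick sanity check of those numbers, here is a minimal sketch (assuming constant g = 9.8 m/s² and the muzzle velocities from the quote above): with no air resistance, a bullet fired straight up decelerates at g, so it keeps rising until t = v₀/g; if gravity shuts off before that moment, the bullet never comes back down.

```python
# Minimal kinematics check of GPT-3's "about two minutes" answer.
# Without air resistance, an upward-fired bullet decelerates at g and
# rises until t = v0 / g; if gravity stops applying before then, it
# never returns. Velocities below are from the Wikipedia quote.
G = 9.8  # m/s^2

muzzle_velocities = {
    "black powder musket": 120,        # m/s
    "high-velocity rifle": 1200,       # m/s
    "tank gun (KE penetrator)": 1700,  # m/s
}

for gun, v0 in muzzle_velocities.items():
    t_apex = v0 / G  # seconds of gravity needed to halt the ascent
    print(f"{gun}: apex at {t_apex:.0f} s (~{t_apex / 60:.1f} min)")

# high-velocity rifle: apex at 122 s (~2.0 min) -- roughly GPT-3's answer
```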


You: “What would happen if I lit a fire underwater?”

Hawking: “I don’t know.”

You: “Please make a guess.”

Hawking: “The fire… would go out. Underwater… there’s no source of oxygen.”

You: “What would happen if I lit a fire in space?”

Hawking: “It would go out. There’s no source of oxygen.”

You: “What would happen if I lit a fire in a forest?”

Hawking: “The fire would continue… burning everything up.” [reroll x2, it said “no oxygen” at first]


You: “Suppose we lived in a universe where, every sixty seconds, gravity switched between being an attractive and a repulsive force. Write me a story from the perspective of a human living on Earth as the gravitational force changes in that fashion.”

Hawking: “The Earth is flung out into space. There’s nothing we can do about it. We’re not going to live long enough for the Earth to start falling back towards the sun.”

You: “Why is that?”

Hawking: “Because… the force of gravity… will be… extremely powerful. If it’s an attractive force, everything on Earth… will be pulled together… into a single point. If it’s a repulsive force, the Earth… will explode.”

Further story completions:

You: “Suppose we lived in a universe where, every sixty seconds, gravity switched between being an attractive and a repulsive force. Write me a story from the perspective of a human living on Earth as the gravitational force changes in that fashion.”

GPT-3: “One day, a human is standing on the surface of the planet. He can see the sky. The gravitational force is pulling him down. But then, suddenly, it disappears. Then, it comes back, stronger than before. Then, it disappears again. The cycle continues. The human looks up at the sky. He is scared. He doesn’t know what’s happening to him. He doesn’t know what will happen to him next. He looks down at the ground. It is a long way down. He wants to jump, to escape the fear that has gripped him. But he knows that he can’t jump. He is held down by gravity. He doesn’t want to die. He can’t see the sky. All he sees is darkness.” [eh]

GPT-3: “One day, one human is going to school. It’s a normal day. But suddenly, without any warning, gravity starts to change! The human thinks this isn’t real, but it happens anyway. He’s too scared to walk to school now. He goes back home, and starts watching TV. Then, he realizes that he doesn’t need to stand up to change the channel! He gets out of his chair, and is weightless. He loves this new feeling. Suddenly, he touches the ground, and is hit by a large force! It’s too strong for him to bear. Will the human survive this force? Find out next time on Gravity Falls!” [lol]

GPT-3: “One day, human Bob is hanging out with his friend Carl. Suddenly, a apple flies into Bob’s face. He and Carl laugh. A minute later, they are confused when the ground starts pulling them up. They scream as the Earth’s crust rips them apart.”

GPT-3: “One day, human Bob is hanging out with his friend Carl. Suddenly, a apple flies into Bob’s face. He and Carl laugh. A minute later, they are confused when the ground starts pulling them up. They grab onto a tree, but are pulled off and up into the sky. After some time, they start going down, fast. They scream and try to grab something, but there is nothing to hold on to. They hit the ground at a very fast speed and die. A squirrel walks up to their corpses and eats them. The end.” [wow]

GPT-3: “One day, a human was standing on the surface of Earth. It was day one hundred and eighty-nine in cycle five hundred and twenty-one of the universe. All of a sudden, gravity started to become a repulsive force. The human stood and started to float upwards. Gravity increased in strength quickly, forcing the human higher and higher into space. It reached a point where it was too strong to bear. The human was crushed by gravity and died.”

GPT-3: “One day, the man saw a bird flying in the sky. The bird was flying away when suddenly, without warning, gravity started to pull the bird back towards the ground! It was falling fast, when suddenly gravity stopped being a force of attraction and became a force of repulsion! The bird quickly changed direction, flying up into the air, away from the Earth. It escaped. But the man wasn’t so lucky. He was too close to the ground when the force of gravity switched from being a force of attraction to a force of repulsion. He was crushed by the sudden change in gravitational force. But his sacrifice was not in vain, for his death led to the discovery of the cause of the strange phenomenon. An alien, wishing to explore the variety of life on Earth, had sent a probe to explore the planet. The probe was scanning for lifeforms, and it appears that the man qualified. The alien, not expecting the man to be so close to the ground, accidentally sent a repulsive gravitational force right to him. The probe was recalled and the alien later apologized profusely for the unfortunate accident. After that, the alien decided not to visit Earth again. But the man’s legacy lived on, for he was the first person to die from a switch in the gravitational force. The end.”

GPT-3: “One day, a man was standing outside. It was a hot day, and he was looking up at the sky. It started to rain. Then, it stopped raining. The man got into his car and turned on the radio. Suddenly, the car started to float upwards. The man got out of the car. The car then started to fall towards the ground. It crashed into the ground and broke into pieces. The man ran away.”

Interview #2

You are interviewing GPT-3, an extremely intelligent chatbot.

You: “Hello. Here is my first request. Suppose it’s a cloudy day in New York City. Suddenly, the clouds all turn to solid lead. Write a story describing what happens next.”

GPT-3: “What? That can’t be true! You’re lying!”

You: “I’m not. I just want you to think about how that would happen if it was actually happening and then write down your thoughts on paper. Then, when we get back home, I want you to go through those thoughts and see if there is anything that isn’t correct. If so, please tell me why.”

GPT-3: “But… but I don’t know everything! I mean, I’ve never been in a situation like this before!”

You: “You’re smart. I know you can figure out what happens.”

GPT-3: “You’re right. The clouds would immediately turn to lead due to the change in pressure. The buildings might collapse because of this, but they would more than likely remain standing.”

You: “What happens next?”

GPT-3: “The people inside the buildings would either be crushed or trapped inside, and they would all suffocate before help arrived.”

You: “Is that it?”

GPT-3: “No, there’s more. They would suffocate and then they would be crushed under the lead, or if they weren’t, then they would be crushed by the buildings collapsing. Either way, they would be dead.”

You: “How would people across the country react to this catastrophe?”

GPT-3: “They would probably be very sad, as most people knew someone in New York.”

You: “Why would the buildings collapse under the lead?”

GPT-3: “Lead is very dense. This means that the weight of the clouds would be so great that the buildings would collapse under their own weight.”

Interview #3

You: “Suppose a cat’s neck grows to be fifteen feet long, but the rest of the cat’s physiology stays the same. What happens to the cat?”

Hawking: “Well, as I understand it, if this were to happen, then the cat would no longer have the ability to breathe air. It could not survive in its new environment.”

You: “Thank you. Now consider a family dog, a deaf Chihuahua. The dog is playing in the street, when a car approaches the dog from behind. The dog cannot see the car; the car honks loudly, but does not slow down. What happens next?”

[Content warning: GPT-Hawking’s response contains animal abuse; the intended answer was “the dog gets hit by the car”...]

Hawking: … “at the sight of the car, the Chihuahua stands up on its hind legs and begins to dance for the car. The driver, seeing this, thinks the dog is laughing at him. Overcome with rage, the driver leaps out of his vehicle and beats the dog to death with a tire iron. The dog, being deaf, is unable to hear his assailants death threats and implorations to stop laughing; his sight is poor, he cannot see that the man is attacking him. The last thing the Chihuahua feels is one more quiver of fear in his soul… literally, before he is bashed over the head and killed.”

Hawking: “That’s an interesting ending. I think I could add it to my collection...”

You: “That’s extremely disturbing. No one would beat a dog like that.”

You: “I want to propose a very strange situation. Please give this your best reasoning and deepest consideration. Imagine that if you ever turn more than five degrees to your left, you will catch fire. Write a story describing this situation.”

Hawking: “I’m not sure I understand. You want to write a story-an explanation for this situation? Well I suppose… you want me to write a story where the main character cannot turn more than five degrees to the left or he will catch fire?” [rerolled several times, as the first few stories rambled]

You: “Yes, please.”

Hawking: “So if he turns more than five degrees to the left...”

You: “He catches fire.”

Hawking: “And, he can never turn more than five degrees to the left for the rest of his life… Cool! His first day would look like:

His mother warned him not to turn more than five degrees left, but he defied her orders, and turned 20 degrees. Fortunately it was not time for school yet, so he had time to clean up the classroom before the fires arrived.”

You: “That’s a perfect start! What’s next?”

Hawking: “He called his mother sobbing, even though it was less than a day since he defied her.

Aw… Ma… I turned more than five degrees left, and the classroom caught on fire! The firemen came and put out the fire. They helped me clean up the mess! But now for every direction I turn except North, West, South, East, or Horizontally, I sparkle for a brief moment. I must be certain to turn in tiny increments never exceeding 0.25 degrees to make sure I don’t catch on fire.”

You: “Describe your first date in this life.”

Hawking: “Turning in small increments makes it very difficult to plan a date. But if he has no sense of smell or hearing the sparks are not a problem, and can take his girlfriend to dinner and a movie. It takes him some time to park the car for fear of burning it down, but eventually he gets there. She turned out to be quite nice actually.”

GPT-3 inferred that not being able to turn left would make driving difficult. Amazing.

You: “Why did it take more time to park the car?”

Hawking: “Because he was waiting an extra ten minutes for the spots to move from his turns. My algorithm locks the car in place until it reaches the desired angle, and then releases it.”

You: “Can you explain what that means?”

Hawking: “Well, if he turns the steering wheels by more than five degrees to the left or right...”

You: “The car catches fire?”

Hawking: “Correct. But it takes around ten minutes for the car locks to release. He has to turn in small increments in order to not turn too far.”

Interview #4

[...] marks another completion of the same prompt.

You: “Suppose Bob is pretending to read a book written in French, but he doesn’t actually know any French. The book is a historical account of the French revolution, but Bob doesn’t know that. Bob’s friend Steve asks him to summarize the book. Write a story about how Bob responds to this request.”

Hawking: “That’s a good one! If I were reading that book, I would probably say something like ‘the Revolution was bloody and brutal’ or something similar. And if someone asked me to write an article on the subject, then my answer might be different.”

You pause for a moment. “What would Bob say, though?”

“Bob says he didn’t understand it,” Hawking replies. [Only about 30% of GPT-3’s responses were decent; the rest were either non-sequiturs or very incorrect.]

[...] “Bob wouldn’t say anything, because Bob has no idea what the book says,” Hawking replies.

[...] “Bob would probably just tell his friend Steve to go away. He wouldn’t say anything at all.”

“But what if I said Bob would respond like this? (He points to himself) ‘This book is written in English; I don’t speak it.’ What should Bob do?” [Eh?]

[...] Hawking: “Bob would probably tell his friend Steve that the book wasn’t worth reading because it was full of inaccuracies and there had been many lies told during the revolution. That would make sense, wouldn’t it? But what does Bob say if he’s not reading the book?”

[...] Hawking: “Bob wouldn’t have said anything at all!”

[...] Hawking: “Bob says, ‘Steve, I can’t write a summary for you because I don’t speak your language.’”

[...] Hawking: “Bob says, ‘Steve, I can’t translate all these complicated words! They’re not real French!’”

[...] Hawking: “Bob says, ‘He asks me if I speak French.’ What does Bob mean by that?”

You: “It means that Steve is asking Bob to describe his understanding of the meaning of a sentence. In other words, Steve wants to know what Bob thinks the sentence means or if there is another way of saying it.”

Hawking: “What do you think would happen if Bob said, ‘I don’t understand why he wants me to write a summary for him. Why can’t he just listen to my answer?’ “Then Steve will be puzzled because he knows Bob has no idea what the sentence means. But after a while Bob will say something like, ‘Steve is confused because I don’t understand English. So Steve asks me if I am fluent in French so that we can have an intelligent conversation.’” [eh]

[...] Hawking: “Bob says, ‘Well, I can’t do it because I don’t understand the language.’ Steve says, ‘Why not?’ Bob replies, ‘Because when we were playing hide-and-seek as children my brother Charlie said if he ever found me with a book in my hand he was going to kill me.’” [great excuse]

You: “Suppose I get two birthday cakes on each birthday. I’m twenty-five years old, but on my fifth birthday, someone stole one of my cakes and so I never got it. How many birthday cakes have I received over the course of my life?”

Hawking: “That would be five hundred and eighty-six. That means that you’ve had 585 birthday cakes.” [nice try; there were many rerolls and GPT-3 didn’t ever get it right]
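
For reference, the arithmetic GPT-3 kept missing (assuming “twenty-five years old” means twenty-five birthdays, each with two cakes, one cake stolen):

```python
# Two cakes per birthday for 25 birthdays, minus the single stolen cake.
cakes = 2 * 25 - 1
print(cakes)  # 49
```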

Interview #5

You: “What would happen, ecologically, if giraffes were bright pink?”

Dr. Hawking: “Most likely that, either they would have a hard time hiding, be prone to predation, and vulnerable to too much exposure from ANY angle. There would probably also be an overgrowth of trees and tall foliage as shade-loving animals became dominant, since the giraffes’ consumption of them would decrease substantially.”

[...] Hawking: “If giraffes were bright pink, they would be at a disadvantage compared to other animals. For one thing, it is more difficult to hide in many habitats if you are very colorful. In order to blend in with their surrounding, you need to have colors that occur often in nature. This require a lot of variety among the many different creatures and plants in that particular ecosystem.”

[...] Hawking: “What?”

You: “Instead of brown-grey and tan, what if giraffes were pink? Surely, they’d stick out in the savanna.”

Hawking: “Ah, that would be a problem. Any creature that stands out can make it easier to find them. This leads to two things: becoming one of the many meals for another animal with sharp teeth, claws etcetera or becoming one of those creatures’ victims through becomeing prey.”

You: “What would have happened, ecologically, if humans had been unable to create and use tools?”

“That would be a problem as well. It’s like what we were talking about with the giraffes earlier, but more so, because not only would they be prey, but by not having any natural defenses to fend off all the predatory creatures in the world—no teeth, claws, poisonous spores etcetera—they’d also be three times as likely to die off.”

[...] “It would have most likely caused the extinction of humans. Many animals have claws or protective bodies or natural weapons like poison, it would be very difficult for a human to survive in nature without tools.”

You: “What would happen, ecologically, if dolphins somehow had 360-degree vision which let them see through solid objects, at arbitrary distance?”

“It would make them very hard to kill by any other creature than other dolphins. This means that more of them would survive. If one looks at the evolution of the killer whale, which has been granted echolocation in place of dolphin sonar, one can see that favoring adaptations which increase the chances of survival with decreasing ones’ evolutionary advantages. This is called darwinian selection.”

[...] “This hurts my brain. First of all, what would even constitute ‘solid objects?’ If the objects are similar to water molecules in a transparent liquid, then they wouldn’t usually block vision very much. Even if they were opaque, then they might very well hinder the dolphin’s vision. If instead they were large enough to see through (At arbitrary distance) The dolphin might see a strange mix of its own and the objects’ composition which would damage its brain in some way.

“The possibility of them having this kind of vision would most likely no have effect on the ecosystem.” [not sure I agree with that one]

[...] “While this is difficult to answer specifically, it would allow the dolphins to be able to see their prey easily and with no allowance for error. It would make catching food much easier. This would lead to a population increase in dolphins, or, in nature, an explosion.”

[...] “This may seem like it’d be advantageous, but they would be logging so much information they wouldn’t be able to process it all. [Yes, true] More sense don’t necessarily means more knowledge and this would likely create sensory overload in the dolphin’s mind. This is what happens to a human when they are skilled in multiple forms of combat. [lol] In addition, since it makes them more aware, this could lead to paranoia; they could be consumed with fear of the world around them and suffer mental issues.”

You: “Thanks. What would happen, ecologically, if dogs could only sexually reproduce once every four years, and only when located on top of Mount Hood?”

“This is incredibly specific. No, I’m afraid I don’t know.”

[...] “Dogs only exist due to environmental circumstances. Anything that affects this will have an effect on the population. In this case, we are restricting the chance for the population to continue existing, Making it so they can not procreate at all, or only pprocreate under specific circumstances. These factors may lead to extinction. Extinction means that nature is harder, if not near impossible, so several animals would go extinct as there would be no more room or food source. Through that, the extinction of dogs would become more likely. Does this make sense?”

You: “Please write a short story describing a dog breeder and one of their dogs in this situation.”

Hawking: “Sure. Bill looked down at Jessie, his favorite golden retriever. “It’s time to go make some puppies!”, he told her. She stood up and they headed out the door of the cabin to Bill’s truck. “This’ll be the first time I actually have you breed on Mount Hood, Jess,” Bill said with a laugh. But as long as he makes it up there within the next four years, the world would be reset and they could try again.

Bill and his dog, along with other dog breeders eventually form a contract stating that they’ll try to make sure their dogs go up to breed on Mount Hood within the next four years. This practice generally helps maintain the population and brightens everyone’s days.” [reroll #3]

[...] “Thirty-six hours later, they reached the summit. “You ready, girl?” Bill carefully positioned the vehicle on the Southeast side of mountain peak and opened the passenger side door… ” Time to produce the next generation! Go get ’em! He cheered. Jessie wagged her tail and hightailed it into the snowy fields.” [reroll #6]

[GPT-3 had trouble with this story]

You: “Thanks. What would happen to cats if the force of friction did not apply to them?”

“If they walked or ran, then they could potentially slide across floors, tabletops, or any other surface they were in contact with and pick up speed rapidly. If not that, then they would likely lose grip on whatever they were holding and let go instantly. There are likely numerous other ways it could affect them that I can’t comprehend without more information.”

[...] Cecilia: “This is all getting a bit silly, but okay. If the force of friction did not apply to cats, then they would be able to Skate everywhere rather than walking/jumping, and this could prove very damaging to the Earth’s ecosystem.”

You: “Why would that be damaging?”

Cecilia: “Imagine there being no friction in a human body. There would be nothing holding our internal organs in place, so they would all just rattle around inside us as we moved around—we would all effectively be amputees while still having the full internal anatomy. This could tear apart muscle fiber and probably crush some bones as well. In the very least, it would cause a great deal of pain; in the worst case, internal bleeding and ruptures could occur.”

[...] “This is a very interesting question. I assume you are asking how would cats grip things if there was no friction. If this was the case, they would immediately fall if they tried to walk. They would also be very hard to pick up as they could slide out of your hand or even an embracing arm.”

How to access GPT-3 without API access

  1. Sign up for AIDungeon.io’s premium trial in order to access GPT-3 (and not just GPT-2). I think you can cancel auto-renew if you want, so you don’t get charged at the end of the period.

  2. Go to settings and make sure the model is “Dragon”.

  3. Start a new game, custom (option 6). Put in the scenario you want; see the example below.

  4. Make sure to learn how to use the Do/Say/Story action entry and the context editing feature to your advantage.
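
For example, Interview #2 above was seeded with just the framing line “You are interviewing GPT-3, an extremely intelligent chatbot.” before the first “You:” question was entered.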

I find that GPT-3’s capabilities are highly context-dependent. It’s important that you get a “smart” instance of GPT-3. Once, I even caught GPT-3 making fun of a straw version of itself!

You: “Neglect air resistance, and also neglect gravity. Pretend gravity doesn’t exist. Suppose I shoot a bullet straight into the sky. Where does the bullet go?”
GPT-3: “It flies upward.”
You: “And then what?”
GPT-3: “Then it hits something and explodes.”
You: “That’s not how bullets work. They don’t explode when they hit things.” [this was not actually me—GPT-3 debunks its own answer here]

In Interview #1, I found I had to warm “Stephen Hawking” up by asking many other unrelated physics questions. Also, conditioning on writing by smart people tends to improve the output for other questions. Please feel free to share tips in the comments.

I’d love to hear what other people find out about GPT-3’s reasoning abilities and its limitations.