Playing wiþ Chat-GPT

What might it be good for? Chat-GPT looks back at up to 4000 tokens (maybe 3000 words?) in the prompt(s), and then, probabilistically, it selects the next token to output. It was trained on half…
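The loop described above — condition on a bounded window of prior tokens, then sample the next token from a probability distribution — can be sketched in miniature. This is a toy illustration, not ChatGPT's actual architecture: the bigram table, token names, and window size stand in for a real learned model.

```python
import random

# Hypothetical tiny "model": P(next token | previous token).
# A real model conditions on the whole window, not just one token.
BIGRAM_PROBS = {
    "productivity": {"growth": 0.7, "is": 0.2, "slowed": 0.1},
    "growth":       {"was": 0.5, "is": 0.3, "stalled": 0.2},
}

CONTEXT_WINDOW = 4000  # tokens the model can look back over, per the post

def next_token(context, rng):
    """Probabilistically select the next token given the context."""
    context = context[-CONTEXT_WINDOW:]  # truncate to the lookback window
    probs = BIGRAM_PROBS.get(context[-1], {"the": 1.0})
    tokens, weights = zip(*probs.items())
    return rng.choices(tokens, weights=weights, k=1)[0]

rng = random.Random(0)
ctx = ["productivity"]
for _ in range(3):
    ctx.append(next_token(ctx, rng))
print(" ".join(ctx))
```

Because the choice at each step is a weighted draw rather than a lookup, the same prompt can yield different continuations — which is also where the "occasionally hallucinatory" behavior comes from: a plausible-sounding token is sampled whether or not it is true.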

I think: this is an occasionally hallucinatory conventional-wisdom generator.

I gave Chat-GPT a prompt:

Write a 1000 word New York Times-style op-ed in the style of Brad DeLong about productivity growth, making four points:

  • First, worldwide productivity growth was relatively stable at an average of about 2% per year from 1870–2010 as a tremendous expansion in the STEM research-and-development workforce was offset by the harvesting of the low-hanging technological fruit.

  • Second, the ramping up of the STEM research-and-development workforce is likely to soon reach its limits without a major reconfiguration of what world governments do; hence the rate of growth is likely to decline.

  • Third, the 2% per year productivity growth of 1870 to 2010 was not enough to lead us to focus effectively on living wisely and well. Rather, the problems of equitably distributing and beneficently utilizing our material wealth continue to flummox us.

  • Fourth, the arrival of global warming will derange economies and societies. Hence the next 50 years are likely to see a world much richer than the last 50, but one that is significantly less happy and suffers significantly more in the way of human-caused catastrophe.

<braddelong.substack.com/p/playing…>