Matthew Clapham
1 min read · Jul 24, 2023


For another piece I wrote about automation I was trying to place a quote from John Betjeman, and couldn't find it, because I was dumbly looking for 'cheery chintziness' instead of 'chintzy cheeriness'.

I asked ChatGPT and it immediately and bullishly came back with a few lines from the poem Dawlish containing the phrase.

Except they were completely made up.

They sounded vaguely like Betjeman, but had nothing to do with the poem Dawlish (which does actually exist). Nor, as I said, did Betjeman ever write 'cheery chintziness'.

It's just so weird (and unhelpful) that rather than identifying the units 'chintzy' and 'cheery', placing them correctly in Death in Leamington, and providing me with the corrected quote (all of which it is more than capable of doing), it just 'decided' to make some vaguely plausible-sounding shit up.

As I say to anyone who'll listen: if you know that anything these programs produce could be complete bullshit, you have to doubt every single thing, however plausible it might seem.
