The article discusses how writers use GPT and other models to learn new technology or work out technical steps. However, it emphasizes that GPT-generated text is not used in published articles, for reasons including style clash, an increased editing and QA burden, and the potential to 'poison' an article in search engine rankings. The article also points out that GPT is often good at writing content that doesn't need to be written, and can sometimes plant unnecessary doubt in the writer's mind.
Despite these drawbacks, the article highlights several useful applications of GPT: creating code samples, generating ideas when stuck, teaching new concepts, reformatting text as markdown, and reviewing text to suggest improvements. Even so, it advises writers to treat GPT's suggestions as a guide and to make the final corrections themselves.
Key takeaways:
GPT's style is noticeably different from human writing and can create a style clash in articles.
Using GPT-generated text can push more of the production burden onto editing and QA.
GPT is best at rehashing existing knowledge, which often adds no value for readers.
GPT can still be useful for creating code samples, generating ideas when stuck, teaching new concepts, formatting, and reviewing text.