DetectGPT: How to Detect GPT-Generated Text (Beginner Friendly)

Ching (Chingis)
7 min read · Apr 30, 2023

Welcome to DetectGPT, an innovative approach for detecting GPT-generated text that has the potential to significantly improve the accuracy of existing zero-shot methods for model sample detection. In contrast to other methods, DetectGPT does not require the training of a separate classifier, the collection of a dataset of real or generated passages, or explicit watermarking of generated text.

Instead, DetectGPT uses only log probabilities computed by the model of interest and random perturbations of the passage from another generic pre-trained language model, such as T5. This approach has been shown to be more discriminative than existing zero-shot methods, with significant improvements in detecting fake news articles generated by the powerful 20B-parameter GPT-NeoX model. In fact, DetectGPT achieved an impressive 0.95 AUROC, compared to the 0.81 AUROC of the strongest zero-shot baseline.

In this blog, we will delve deeper into the mechanics of DetectGPT and explore how it can be used to identify GPT-generated text with high accuracy, even for those with little technical expertise. Whether you’re a researcher, journalist, or simply someone interested in the intersection of language models and AI, this blog will provide you with valuable insights into the latest advancements in GPT detection. So join us on this journey and discover the power of DetectGPT!

DetectGPT

  • ChatGPT is a hot topic. People are discussing whether we can detect a passage generated by a Large Language Model (LLM), such as GPT.
  • A new curvature-based criterion, DetectGPT, is defined for judging whether a passage was generated by a given LLM.
  • DetectGPT does not require training a separate classifier, collecting a dataset of real or generated passages, or explicitly watermarking generated text. Thus, it is a fully zero-shot algorithm.
  • It uses only log probabilities computed by the model of interest and random perturbations of the passage from another generic pre-trained language model, such as T5. A minimal sketch follows this list.
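To make the criterion concrete, here is a minimal sketch of the perturbation-discrepancy test in Python. It is an illustration under stated assumptions, not the paper’s implementation: GPT-2 stands in for the model of interest, and a crude word-dropping perturbation stands in for T5 mask-filling. (The paper also normalizes the discrepancy by the standard deviation of the perturbed log probabilities, which is omitted here for brevity.)

```python
# A minimal sketch of DetectGPT's perturbation discrepancy, assuming
# Hugging Face transformers. GPT-2 stands in for the "model of interest",
# and a simple word-dropping perturbation stands in for T5 mask-filling.
import random
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

def avg_log_prob(text: str) -> float:
    """Average per-token log probability of `text` under the model."""
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        out = model(**enc, labels=enc["input_ids"])
    return -out.loss.item()  # out.loss is the mean negative log-likelihood

def perturb(text: str, drop_frac: float = 0.15) -> str:
    """Crude stand-in for the paper's T5 mask-and-refill perturbation:
    randomly drop a fraction of the words."""
    words = text.split()
    kept = [w for w in words if random.random() > drop_frac]
    return " ".join(kept) if kept else text

def perturbation_discrepancy(text: str, k: int = 20) -> float:
    """log p(x) minus the mean log p over k perturbed copies of x.
    Model-generated text tends to sit near a local maximum of the
    model's log probability, so its discrepancy tends to be larger."""
    original = avg_log_prob(text)
    perturbed = [avg_log_prob(perturb(text)) for _ in range(k)]
    return original - sum(perturbed) / len(perturbed)

print(perturbation_discrepancy("The quick brown fox jumps over the lazy dog."))
```

A passage whose discrepancy exceeds a chosen threshold is flagged as model-generated: perturbing model-written text almost always lowers its log probability, while perturbing human-written text can move it in either direction.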

Log Probability

Log probability is simply the logarithm of the probability of a particular event or outcome, given some evidence or context. Working in log space turns products of probabilities into sums and avoids numerical underflow when probabilities are tiny, which is why language models report likelihoods this way.

Consider it in terms of how LLMs like GPT generate text. A GPT model generates text one token at a time, based on the learned probability distribution over its vocabulary, conditioned on the tokens that came before. The log probability of a whole passage is then the sum of the log probabilities of its individual tokens.
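To ground this, here is a short sketch that scores a sentence token by token, using GPT-2 via Hugging Face transformers (the choice of GPT-2 and the example sentence are assumptions for illustration; any causal LM exposes the same quantities):

```python
# Per-token log probabilities of a sentence under GPT-2
# (GPT-2 is an assumption here; any causal LM works the same way).
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

ids = tokenizer("The cat sat on the mat.", return_tensors="pt")["input_ids"]

with torch.no_grad():
    logits = model(ids).logits  # shape: (1, seq_len, vocab_size)

# Position i predicts token i+1, so shift to align predictions with targets.
log_probs = torch.log_softmax(logits[0, :-1], dim=-1)
token_lp = log_probs[torch.arange(ids.shape[1] - 1), ids[0, 1:]]

for tok, lp in zip(tokenizer.convert_ids_to_tokens(ids[0, 1:].tolist()), token_lp):
    print(f"{tok!r:>12}  log p = {lp.item():.3f}")

print("passage log probability:", token_lp.sum().item())
```

Summing these per-token values gives the passage’s log probability, which is exactly the quantity DetectGPT compares before and after perturbation.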

