Headlines are one of the most critical aspects of an article. They draw readers in and give them a sense of what the article is about. A good headline will make an article more likely to be read, while a bad headline will discourage people from reading it. Headlines should be clear and concise and should accurately reflect the article's content. They should also be exciting and attention-grabbing so that readers will want to click on them. Spending time crafting the perfect headline can be worth it, as it can mean the difference between an article being read or ignored.

But coming up with a good headline can be challenging. Authors want something that accurately reflects the content of the article, but also want something that will grab attention and make people want to read more. This is where AI can help. Several AI-powered tools can help create better headlines for articles. These tools use algorithms to analyze the content and come up with accurate and attention-grabbing headlines. But, no surprise, in this article, all we need is GPT-3.

What are we going to look at?

Let's look at the basic headline generation procedure with GPT-3. Building on that, we will collect some sample data from the magazine "Stern" so that we can imitate the magazine's style:

  1. Headline generation (Zero-shot)

  2. Data acquisition with AI support

  3. Headline generation (Few-shot)

Headline generation (Zero-shot)

As explained in the previous article, Zero-shot means that GPT-3 is given no examples of the expected result. The only information provided for the generation is the task description in natural language. In this case, we take the abstract of a "Stern" article titled "Schädlicher Whatsapp-Kettenbrief verspricht Milka-Gewinnspiel – so erkennen Sie Betrugsversuche" and tell GPT-3 to create a German headline (the green text parts are generated by GPT-3).
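A minimal sketch of what such a zero-shot request could look like with the OpenAI Python library is shown below; the model name, the wording of the German instruction, and the placeholder abstract are assumptions, not the exact prompt used in the article:

```python
# Minimal zero-shot sketch: the prompt contains only the task description
# and the article abstract, no example headlines.
# Model name, instruction wording, and placeholder abstract are assumptions.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

abstract = "..."  # paste the abstract of the "Stern" article here

prompt = (
    "Schreibe eine Überschrift für den folgenden Artikel.\n\n"
    f"{abstract}\n\n"
    "Überschrift:"
)

response = openai.Completion.create(
    model="text-davinci-002",  # assumed GPT-3 model
    prompt=prompt,
    max_tokens=40,
    temperature=0.7,           # some randomness, so repeated runs differ
)

print(response["choices"][0]["text"].strip())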

Not a bad headline. We can repeat the generation as many times as we want, and, depending on the parameterization, we will get more outputs. Here is a selection:

  1. Vorsicht bei angeblichem Milka-Gewinnspiel!

  2. Mondelez warnt vor angeblichem Milka-Gewinnspiel

  3. Mondelez warnt vor Fälschung eines angeblichen Milka-Gewinnspiels

In the following, let's see if we can push the headlines a bit further towards the "Stern" style.

Data acquisition with AI support

The generated headlines already read well. However, they are not yet in the style we know from "Stern". Therefore, we want to give GPT-3 a few examples that the AI will take into account during generation. This procedure is called Few-shot. Since we don't have any titles available yet, we must collect some. And in good programmer fashion, we automate what can be automated.

Acquiring data, however, is not an exciting task from a programmer's point of view, especially when the data is freely available and not protected by additional security measures. That is exactly the case for the article titles on the "Stern" website: an RSS feed is offered that can be accessed without much effort. We take the RSS feed of the "Panorama" section.

The programming task is therefore so simple that one can also let GPT-3 do it. So we ask GPT-3 whether it can write the corresponding source code for a Python script. This, too, is done with an instruction in natural language (the video shows an example of how to access the titles).
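To give an idea of what such a script looks like, here is a sketch along the lines of what GPT-3 might produce; the feed URL is an assumption and may differ from the actual "Stern" feed:

```python
# Sketch of a script that fetches the RSS feed and prints the article titles.
# The feed URL is an assumption; adjust it to the actual "Panorama" feed.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://www.stern.de/feed/standard/panorama/"  # assumed URL

with urllib.request.urlopen(FEED_URL) as response:
    xml_data = response.read()

root = ET.fromstring(xml_data)

# In an RSS 2.0 feed, every article is an <item> element with a <title> child.
titles = [item.findtext("title") for item in root.iter("item")]

for title in titles:
    print(title)
```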

Now we have the examples we can use in the following steps. Here is a selection of the collected titles:

  1. "Cold Case": Mordserie nach mehr als 30 Jahren aufgeklärt – DNA-Spuren überführen "I-65-Killer"

  2. Ukraine-Krieg: Nach Rückzug russischer Truppen: Mehr als 400 Menschen in Hostomel vermisst

  3. Blick in die Vergangenheit: 13,5 Milliarden Lichtjahre entfernt: Forscher entdecken offenbar die entfernteste Galaxie, die je beobachtet wurde

  4. Video: München: Stimmungsbild nach Ablehnung der Impfpflicht ab 60

... Okay, we learn that at "Stern", colons are taken very seriously.

Headline generation (Few-shot)

We give the collected examples to the language model in a semi-structured format. Here we use the prefixes "title" and "abstract" to signal to GPT-3 what we are entering and separate the respective examples with a line break. For the abstract for which we want a title generated, we enter only the abstract (with prefix) and start the following line with "title:" so that GPT-3 knows we want a completion here. We thus give GPT-3 20 examples, resulting in a total request of 2,263 tokens, which will cost approximately $0.15 to process.
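As a rough sketch, the described prompt could be assembled in Python like this; the model name, the exact separators, and the parameter values are assumptions rather than the settings used for the article:

```python
# Sketch of the few-shot prompt: each collected example contributes an
# "abstract:"/"title:" pair, and the new abstract ends with an open
# "title:" line that GPT-3 is asked to complete.
# Model name, separators, and parameters are assumptions.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

examples = [
    # (abstract, title) pairs collected from the RSS feed, ~20 in total
    ("...", "..."),
]

new_abstract = "..."  # the abstract we want a headline for

parts = [f"abstract: {abstract}\ntitle: {title}" for abstract, title in examples]
parts.append(f"abstract: {new_abstract}\ntitle:")
prompt = "\n\n".join(parts)

response = openai.Completion.create(
    model="text-davinci-002",  # assumed GPT-3 model
    prompt=prompt,
    max_tokens=40,
    temperature=0.8,           # higher temperature gives more varied headlines
    n=5,                       # request several completions at once
    stop=["\n"],               # stop at the end of the generated title line
)

for choice in response["choices"]:
    print(choice["text"].strip())
```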

To be honest, I am not an expert on headlines, but one can already see that the characteristic ":" shows up in the titles and that hyphens are used as well. And again, we can repeat the generation as many times as we want and, depending on the parameterization, get further outputs. Here is a selection:

  1. Milka-Gewinnspiel: Achtung, Fälschung!

  2. Whatsapp-Kettenbrief: "Gewinne einen Oster-Geschenkkorb voller Schokolade" - Mondelez warnt: Fälschung!

  3. Milka-Gewinnspiel: Schokoladen-Kettenbrief auf Whatsapp ist eine Fälschung

  4. Whatsapp-Kettenbrief: Mondelez warnt vor Fälschung

  5. Whatsapp-Kettenbrief: Milka warnt vor angeblichem Gewinnspiel - Schokolade gibt es nicht zu gewinnen

Let's summarize.

Creating article headlines is a creative and challenging task. They should be meaningful, not too short, not too long, contain everything necessary, and be catchy. AI can help to create variants and thus fuel the creative process. GPT-3 lets you use examples to determine what the titles should look like. The example shown here is limited by the maximum request length of GPT-3. In production, one would use fine-tuning instead, in which hundreds to thousands of examples are used to "tune" the text generation. We will talk about that soon.

That’s it for now.