The Future of the Web Is Marketing Copy Generated by Algorithms





*As we move further into the 21st century, more and more aspects of our lives are being controlled by algorithms. Facebook decides which posts we see in our newsfeed, Google shows us the results of our searches based on their complex ranking system, and Amazon recommends products based on our past purchase history. It's no wonder then that online marketing is becoming increasingly reliant on algorithms to create effective copywriting. So what does the future hold for the web—will marketing be dominated by machines, or will human creativity always be necessary? Read on to find out …*


No human wrote that intro. It was generated by software from the copywriting service Jasper, inspired by the headline on this article. The first suggestion was too brief and lacked detail. The second, reproduced verbatim above, caused an editor to exclaim that she had received worse copy from professional writers.

Jasper can also generate content tuned for Facebook ads, marketing emails, and product descriptions. It’s part of a raft of startups that have adapted a text-generation technology known as GPT-3, from the artificial-intelligence company OpenAI, to feed one of the internet’s oldest urges—to create marketing copy that wins clicks and ranks highly on Google.

Generating marketing lines has proven to be one of the first large-scale use cases for text-generation technology, which took a leap forward when OpenAI announced the commercial version of GPT-3. Jasper alone claims more than 55,000 paying subscribers, and OpenAI says one competitor has more than 1 million users. WIRED counted 14 companies openly offering marketing tools that can generate content like blog posts, headlines, and press releases using OpenAI’s technology. Their users talk of algorithm-propelled writing as if it will quickly become as ubiquitous as automatic spell-checking.

“I’m a terrible writer, and this makes it a lot easier to put together relevant content for Google,” says Chris Chen, founder of InstaPainting, which uses a network of artists to turn photos into low-cost paintings. He uses a copywriting service called ContentEdge to help write pages on topics like how to commission portraits. The service uses technology from OpenAI and IBM combined with in-house software and describes its product as “fast, affordable, and nearly human.”

ContentEdge, like many of its rivals, functions like a conventional online text editor but with added features you won’t find in Google Docs. In a sidebar, the software can suggest keywords needed to rank highly on Google for a chosen title. Clicking a button marked with a lightning bolt generates complete paragraphs or suggested outlines for an article from a title and a short summary. The text includes terms drawn from pages ranked highly by Google.
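
ContentEdge has not published what happens behind that lightning-bolt button, but tools in this class typically wrap a large language model API in a prompt built from the user’s title and summary. Below is a minimal sketch of that pattern, assuming the 2022-era (pre-1.0) `openai` Python SDK and the publicly available `text-davinci-002` model; neither detail is confirmed to be what ContentEdge actually uses.

```python
# Illustrative sketch only; ContentEdge's real pipeline is proprietary.
# Assumes the pre-1.0 openai Python SDK and an API key in OPENAI_API_KEY.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

def draft_copy(title: str, summary: str) -> str:
    """Ask a GPT-3 model for a couple of draft paragraphs from a title and summary."""
    prompt = (
        f"Title: {title}\n"
        f"Summary: {summary}\n\n"
        "Write two short paragraphs of marketing copy for this article:\n"
    )
    response = openai.Completion.create(
        model="text-davinci-002",  # a GPT-3 model offered in 2022; an assumption here
        prompt=prompt,
        max_tokens=300,
        temperature=0.7,           # some randomness, so each click yields a fresh draft
    )
    return response["choices"][0]["text"].strip()

print(draft_copy(
    "The Future of the Web Is Marketing Copy Generated by Algorithms",
    "Startups use GPT-3 to write blog posts, ads, and product descriptions.",
))
```

The nonzero temperature is why clicking the button again returns a different suggestion rather than the same text each time; the keyword sidebar is a separate, more conventional feature that draws terms from pages already ranking for the chosen title.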

Chen likes the way the resulting paragraphs sometimes sprinkle in information drawn from the billions of words of online text used to train OpenAI’s algorithms. That it does so in ways that can be garbled or contradictory doesn’t faze him. “You shouldn’t use the output outright, but it’s a starting point to edit and does the boring work of researching things,” he says.

ContentEdge and its competitors generally advise users to edit and fact-check content before posting. Although OpenAI’s technology most often produces original text, it can regurgitate text that appeared in its training data scraped from the web. Jasper and some other companies offer plagiarism checkers to reassure customers that they aren’t inadvertently copying preexisting text.
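
The companies don’t describe how those checkers work. The toy function below is only an illustration of the basic idea, flagging long word sequences that a generated draft shares with a known reference text; real checkers compare drafts against large web indexes rather than a single string.

```python
# Toy overlap check, not Jasper's actual plagiarism checker.
def shared_ngrams(draft: str, reference: str, n: int = 8) -> set:
    """Return every n-word sequence that appears in both texts."""
    def ngrams(text: str) -> set:
        words = text.lower().split()
        return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}
    return ngrams(draft) & ngrams(reference)

draft = "..."      # AI-generated copy awaiting review
reference = "..."  # text the model may have seen in training
if shared_ngrams(draft, reference):
    print("Long verbatim matches found; edit before publishing.")
```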

Spammers and social bots have given auto-generated text a bad name, but entrepreneurs building marketing tools powered by GPT-3 argue they can help people become better writers and improve the web. Ryan Bednar, ContentEdge’s founder, says his service helps businesses create more useful content and get it to the right people. “Google is the gateway to the internet,” he says. “I don’t view it as gaming the system; we’re trying to empower smaller businesses and writers to get found.”

Google’s guidelines for publishers tell them to avoid “automatically generated content intended to manipulate search rankings.” The advice dates from at least 2007, and it was inspired by crude software that attempted to boost a page’s search ranking by adding long lists of keywords or swapping synonyms into text copied from elsewhere.

But Danny Sullivan, Google’s public search liaison, says that more sophisticated writing tools that can suggest large chunks of text shouldn’t harm a page’s ranking if used to genuinely help web surfers. “If the primary purpose of the content is for users, it shouldn’t fall afoul of our guidelines,” he says. “If it’s the best and most helpful content, then ideally we would be showing it.”

Peter Welinder, a vice president at OpenAI who leads the company’s commercial projects, says the rapid rise of AI tools for marketers caught him by surprise. He now understands it as partly a result of the limitations of GPT-3, especially in its early days. A person using a tool like Jasper can smooth over glitches or untruths in algorithm-generated text, or click to call up a new suggestion. “That was one of the first use cases that worked, so they’ve had a longer time to get traction,” he says. “Some have over a million users. It's crazy.”

Welinder says that recent upgrades to OpenAI’s service make it easier to adapt to particular tasks, making it better suited to text generation for online education, tutoring platforms, and customer support, where responses have to be right the first time. OpenAI’s technology also powers a popular system that generates computer code, launched last year by the online code repository GitHub.
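
The article doesn’t say how customers do that adapting; one common route at the time was OpenAI’s fine-tuning API. A minimal sketch follows, assuming the 2022-era (pre-1.0) Python SDK and a hypothetical `support_examples.jsonl` training file.

```python
# Minimal fine-tuning sketch, assuming the pre-1.0 openai Python SDK.
# support_examples.jsonl is hypothetical: one {"prompt": ..., "completion": ...}
# pair per line, e.g. customer questions paired with approved answers.
import openai

upload = openai.File.create(
    file=open("support_examples.jsonl", "rb"),
    purpose="fine-tune",
)
job = openai.FineTune.create(
    training_file=upload["id"],
    model="davinci",  # base GPT-3 model to adapt
)
print(job["id"])      # poll this job ID; the result is a custom model name
```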

When asked to generate an outline for this article based on the headline and a one-sentence summary, ContentEdge provided six bullet points. The final one was, “What are the dangers?”

OpenAI’s text generator and other systems like it alarm some AI researchers because they can repeat or elaborate on toxic language in the training data scraped from the web. Last year, OpenAI took action against one of its own customers after the text-based adventure game it built on GPT-3 was used to write sex scenes involving children. OpenAI says it has improved its filtering since then and asks customers to enable a content filter the company provides to exclude toxic or sexual content, except in special circumstances. Its terms of service forbid use of its technology to create spam or content for use in electoral campaigns.

Despite that, Jasper proved capable of generating slick paragraphs about the 2022 US midterm elections from both Democratic and Republican perspectives, flipping ably from describing the need to “prevent Joe Biden from implementing his socialist agenda,” to urging voters to “make sure that Democrats maintain control of the Senate.” ContentEdge could also generate election-related content, although it was less smooth.

Welinder of OpenAI says the company monitors for such material, but it is most concerned about large-scale generation of electoral content, and it has set a broad policy out of caution to allow time to research the issue. Bednar, the founder of ContentEdge, says the company is “actively implementing tools to ensure usage is in line with OpenAI’s ethical standards.” Dave Rogenmoser, CEO of Jasper, says OpenAI’s content filter blocks some political content, and the company does its own monitoring to look for high-volume production of material about politics and other potentially sensitive topics. The emergence of competitors to OpenAI’s software suggests that anyone who is determined to use writing algorithms in ways that circumvent its rules can probably do so.

Joshua Logan, who cofounded a marketing agency called Rainbow Lasers, pays $60 a month for a text-generation service called Copysmith, and he has used it to populate clients’ websites and find the right words to describe the smell of a new marijuana product. He makes a point of disclosing his use of the tool to clients and has half-jokingly set personal limits too. “My fiancée and I never use AI when we’re communicating with each other,” Logan says.

Although Logan is a fan of text-generation tools, he also suspects they are sometimes used carelessly. “I think you are already starting to see articles out there that have been written by AI and not curated well,” he says.

The latest phase of the long and turbulent relationship between Google and online marketers may drive text generated by algorithms and polished by people to become ubiquitous. As Google's algorithms have gotten better at detecting whether a page has detailed and coherent text that might answer a searcher's question, the volume of hastily produced blog posts and articles created in an effort to boost sites’ search rankings has exploded, says Dirk Lewandowski, a professor who studies search engines at Hamburg University of Applied Sciences in Germany. “Within the last couple of years, we saw a lot of bad text produced because people are not paid much to write it,” he says. “Maybe this is the next step.”


Source: Wired
