{"id":38019,"date":"2018-04-19T14:05:41","date_gmt":"2018-04-19T21:05:41","guid":{"rendered":"https:\/\/www.eigene-homepage-erstellen.net\/?p=38019"},"modified":"2018-04-19T14:05:41","modified_gmt":"2018-04-19T21:05:41","slug":"essay-natural-language-processing-impact-writers-content-digital-marketing","status":"publish","type":"post","link":"https:\/\/www.eigene-homepage-erstellen.net\/blog\/content\/essay-natural-language-processing-impact-writers-content-digital-marketing.htm","title":{"rendered":"What Does Natural Language Processing Mean for Writers, Content, and Digital Marketing?"},"content":{"rendered":"\n

<p><strong>Terrible Movie Pitch:<\/strong><\/p>\n

<blockquote><p><em>\nThe battle for the voice of the internet has begun. In one corner, we have computer programs fortified by algorithms, Artificial Intelligence, Natural Language Processing, and other sexy STEM buzzwords. In the other corner, we have millions of copywriters armed with the only marketable skill a liberal arts education can provide: communication. Who will lol the last lol?
<\/em>\n<\/p><\/blockquote>\n

<p><strong>Spoiler:<\/strong><\/p>\n

<blockquote><p>Writers, your jobs are probably safe for a long time. And content teams stand to gain more than they stand to lose.<\/p><\/blockquote>\n

<p>I remember the day someone told me a computer had written a best-selling novel in Russia. My first thought? “I need to get the hell out of content marketing.”<\/p>\n

<p>The book was called <em>True Love<\/em>—an ambitious topic for an algorithm. It was published in 2008 and “authored” by Alexander Prokopovich, chief editor of the Russian publishing house Astrel-SPb. It combines the story of Leo Tolstoy’s <em>Anna Karenina<\/em> with the style of Japanese author Haruki Murakami, and draws influence from 17 other major works.<\/p>\n

<p>Frankly, that sounds like it’d make for a pretty good book. It also sounds a lot like how brands create their digital marketing strategies.<\/p>\n

<p>Today, <strong>every brand is a publisher<\/strong>. Whether you’re a multi-billion-dollar technology company or a family-run hot sauce manufacturer, content rules your digital presence. Maybe this means web guides, blog posts, or help centers. Maybe it means a robust social media presence or personalized chatbot dialogue. Maybe you feel the need to “publish or perish,” and provide value and engagement in a scalable way.<\/p>\n

<p>Brands require a constant influx of written language to engage with customers and maintain search authority. And in a way, all the content they require is based on 26 letters and a few rules of syntax. Why <em>couldn’t<\/em> a machine do it?<\/p>\n

<p>In the time since I first heard about <em>True Love<\/em>, I’ve moved from content writing to content strategy and UX, trying to stay one step ahead of the algorithms. But AI in general and Natural Language Processing in particular are only gaining momentum, and I find myself wondering more and more often what they’ll mean for digital marketing.<\/p>\n

<p>This essay will endeavor to answer that question through conversations with experts and my own composite research.<\/p>\n

<h2>Portent’s Matthew Henry Talks Common Sense<\/h2>\n

<blockquote>\n<p>“The Analytical Engine has no pretensions to originate anything. It can do <em>whatever we know how to order it to perform<\/em>.”<\/p>\n

<p>-Lady Ada Lovelace, 1842, as quoted by Alan Turing (her italics)<\/p><\/blockquote>\n

<p>Lady Lovelace might have been the first person to contend that computers will only ever know as much as they’re told. But today’s white-hot field of machine learning and Artificial Intelligence (AI) hinges on computers making inferences and synthesizing data in combinations they were never “ordered to perform.”<\/p>\n

<p>One application of machine learning and AI is Natural Language Processing (NLP), which involves the machine parsing of spoken or written human language. A subfield of NLP is Natural Language Generation (NLG), which involves <em>producing<\/em> human language. NLP is kind of like teaching computers to read; NLG is like teaching them to write.<\/p>\n
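To make the read-versus-write distinction concrete, here is a deliberately tiny sketch in plain Python (no NLP library; the function names, regex, and template are invented for illustration, and real systems are vastly more sophisticated). The first function "reads" by breaking raw text into tokens, which is where actual NLP pipelines begin; the second "writes" by filling a template with structured data, which is how the simplest NLG systems work.

```python
import re

def parse(sentence):
    # NLP at its simplest: turn raw text into tokens a program can work with.
    return re.findall(r"[A-Za-z']+", sentence.lower())

def generate(subject, verb, obj):
    # NLG at its simplest: turn structured data back into a sentence.
    return f"{subject.capitalize()} {verb} {obj}."

tokens = parse("Computers can read, sort of.")
# tokens == ['computers', 'can', 'read', 'sort', 'of']
print(generate("computers", "produce", "language"))
```

The gap between this toy and genuine understanding is, of course, the entire point of the sections that follow.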

<p>I asked Portent’s Development Architect Matthew Henry what he thinks about the possibilities for NLP and content marketing. Matthew has spent over a decade developing Portent’s library of proprietary software and tools, including a crawler that mimics Google’s own. Google is one of the leading research laboratories for NLP and AI, so it makes sense that our resident search engine genius might know what the industry’s in for.<\/p>\n

<p>I half expected to hear that he’s already cooking up an NLP tool for us. Instead, I learned he’s pretty dubious that NLP will be replacing content writers any time soon.<\/p>\n

<p>“No computer can <em>truly<\/em> understand natural language like a human being can,” says Matthew. “Even a ten-year-old child can do better than a computer.”<\/p>\n

<p>“A computer can add a million numbers in a few seconds,” he continues, “which is a really hard job for a human being. But if a cash register computer sees that a packet of gum costs $13,000, it won’t even blink. A human being will instantly say, ‘Oh, that’s obviously wrong.’ And that’s the part that’s really hard to program.”<\/p>\n

<p>“<strong>Knowing that something is obviously wrong is something we do all the time without thinking about it, but it’s an extremely hard thing for a computer to do<\/strong>. Not impossible—to extend my analogy, you could program a computer to recognize when prices are implausible, but it would be a giant project, whereas for a human being, it’s trivial.”<\/p>\n
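To see why that project would be so giant, here is a hypothetical sketch of what a hand-coded plausibility check might look like (the product list and price thresholds are invented for illustration). Every expectation has to be told to the machine explicitly, one item at a time; anything nobody thought to list gets no judgment at all, while a human applies common sense to all of it at once.

```python
# Hypothetical price sanity check: every expectation must be spelled out by hand.
PLAUSIBLE_RANGES = {
    "gum":    (0.25, 10.00),
    "coffee": (1.00, 25.00),
}

def price_is_plausible(item, price):
    # For items with no entry, the machine has no opinion; it accepts anything.
    low, high = PLAUSIBLE_RANGES.get(item, (0.0, float("inf")))
    return low <= price <= high

print(price_is_plausible("gum", 13000.00))  # prints False: the $13,000 gum from Matthew's example
```

Note what happens for an item the programmer never anticipated: the check waves it through, blinking at nothing, exactly as Matthew describes.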

<p>It’s not news that there are things computers are really good at that humans are bad at, and some things humans are really good at that computers can’t seem to manage. That’s why Amazon’s Mechanical Turk exists. As they say,<\/p>\n

<blockquote><p>“Amazon Mechanical Turk is based on the idea that there are still many things that human beings can do much more effectively than computers, such as identifying objects in a photo or video, performing data de-duplication, transcribing audio recordings, or researching data details.”<\/p><\/blockquote>\n

<p>Amazon calls the work humans do through Mechanical Turk “Human Intelligence Tasks,” or HITs. Companies pay humans small sums of money to perform these HITs. (A made-up example might be identifying pictures where someone looks “sad” for 10 cents a pop.)<\/p>\n

<p>Matthew might instead call these HITs “Common Sense Tasks,” like knowing a pack of gum shouldn’t cost $13,000.<\/p>\n

<p>“People underestimate the power of common sense,” Matthew says. “No one has ever made a computer program that truly has common sense, and I don’t think we’re even close to that.”<\/p>\n

<p>And here’s the real quantum leap for not only NLP but Artificial Intelligence: right now, computers only know what they’ve been told. Common sense is knowing something without being told.<\/p>\n

<p>It sounds cheesy to say that our imaginations are what separate us from the machines, but imagination isn’t just about being creative. Today, computers can write poetry and paint like Rembrandt. Google made a splash in 2015 when the neural networks they’d trained on millions of images were able to generate pictures from images of random noise, something they called neural net “dreams.” And in 2016, they announced Project Magenta, which uses Google Brain to “create compelling art and music.”<\/p>\n

<p>So it’s not “imagination” in any artistic terms. It’s imagination in the simplest, truest form: <strong>knowing something you haven’t been told.<\/strong> Whether it’s Shakespeare inventing 1,700 words for the English language, or realizing that kimchi would be really good in your quesadilla, that’s the basis of invention. That’s also the basis of common sense and of original thought, and it’s how we achieve understanding.<\/p>\n

<p>To explain what computers can’t do, let’s dig a little deeper into one of the original Common Sense Tasks: understanding language.<\/p>\n

<h2>Defining “Understanding” for Natural Language<\/h2>\n

<p>NLP wasn’t always called NLP. The field was originally known as “Natural Language Understanding” (NLU) during the 1960s and ’70s. Folks moved away from that term when they realized that what they were really trying to do was get a computer to <em>process<\/em> language, not <em>understand<\/em> it; understanding is more than just turning input into output.<\/p>\n

<p>Semblances of NLU do exist today, perhaps most notably in Google search and the Hummingbird algorithm that enables semantic inferences. Google understands that when you ask, “How’s the weather?” you probably mean, “How is the weather in my current location today?” It can also correct your syntax intuitively:<\/p>\n