We’re suddenly seeing a lot of promotions for “AI writers,” research tools powered by large language models like GPT-3 that can generate well-structured, perfectly serviceable, bland text. It has no personality, but it is grammatical and relevant to the subject at hand, and it could easily have been written by a human.
Last month we reviewed the proliferation of “smart caps” and “smart helmets” that AI’s advance is making possible, devices with names like Emotiv, Cognionics, and Neurosky. We find the same proliferation of clever anthropomorphic names among AI writers, such as:
- Jasper: “Finish your first draft ten times faster.”
- CopyAI: “Say goodbye to the blank page for good.”
- Closers Copy: “Leverage the secrets of the most powerful copy in the world … and let your copywriting robot bring them to life!”
- Copysmith: “Write and launch SEO-focused product descriptions and meta tags that outrank the competition and convert more shoppers to customers in record time.”
- Rytr: “Create high-quality content, in just a few seconds, at a fraction of the cost!”
I had an assignment in mind when I logged into Jasper for the five-day free trial he offered, but I thought we should first get acquainted.
“Tell me about yourself,” I keyed into Jasper’s text field.
His response: “I would describe myself as an artificial intelligence researcher who is interested in the ethical implications of AI. In particular, I am concerned with ensuring that AI technologies are developed and used in ways that respect the dignity and privacy of individuals.”
Smooth, I thought. Jasper is merely a computer running a bunch of fancy GPT-3 algorithms, but that’s not how he sees himself or wants us to see him. He wants us to regard him if not as human then at least human-like.
I’d told Emily Wenger, chair of AI and Faith’s Editorial Team, that I wanted to explore where AI will take us in the long-running dynamic between the public’s right to know on one hand and an individual’s right to privacy, liberty, and dignity on the other. So I asked Jasper bluntly: “Does AI pose a threat to human dignity?”
“No,” he responded, then explained why: “I think that AI can be used to enhance human dignity by providing individuals with new abilities and opportunities. For example, artificial intelligence can be used to assist people with disabilities in living more independent and fulfilling lives.”
Jasper acknowledged that we’ll have to build safeguards to protect privacy. “However, the risks posed by AI can be mitigated through responsible development and use of the technology. For example, data collected by AI systems could be anonymized or aggregated to protect our privacy.”
When my writing assistant’s algorithms have advanced to the point where he can creatively imagine and think through scenarios, I’d like to give Jasper an assignment rooted in a thought experiment I’ve mused on previously. It goes like this: “A maker of advanced neural auxiliaries has developed a smart helmet so powerful that airlines can now closely monitor their pilots’ mental states for signs of suicidal or murderous tendencies and immediately pull from service any who represent a threat to public safety. Assume the Allied Pilots Association challenges this threat to its members’ privacy and dignity in a case that goes to the Supreme Court. Since Louis Brandeis was our greatest thinker about issues of privacy, dignity, and liberty, please draft the decision as though it were written by Brandeis.”
With his current interpretive and analytical abilities, however, Jasper can’t get his head around a challenge that requires this much forethought. But he impressed me at the end of our conversation by posing to himself a question I hadn’t asked, a seemingly creative act I hadn’t anticipated: “What do you think is the most important aspect of artificial intelligence?”
His answer: “I think that artificial intelligence has the potential to greatly enhance our lives and help us solve some of the world’s most pressing problems. However, it is important to remember that AI is still in its early stages of development and there are potential risks associated with its use.”