Being Human

Spokane's Next IT opens new dialogue with computers, creating opportunities and risks

Next IT photo: Erin, a life-size hologram, uses Next IT software to speak.

In the lobby of Next IT, on the 16th floor of the historic Paulsen Building on Riverside Avenue, a two-dimensional animated hologram glows ghostlike as its artificial intelligence deconstructs a spoken question. With a smile, the life-size, red-haired female hologram, named Erin, calls up the appropriate answer and responds with a description of the company.

"Next IT," she says in a demo video, "creates human-emulation software that is redefining the relationship between people and technology."

When speaking, Erin motions casually with her hands. She can blink and wave. She even tells a Chuck Norris joke. Her lifelike interactions offer just one example of how Spokane-based Next IT uses AI software to process and interpret the complexities of language in text and speech.

Mitch Lawrence, executive vice president of sales and marketing, says the company's virtual avatars, or "agents," will revolutionize how we communicate with computers. They can act as digital tour guides during complicated tasks like filing insurance claims online, or serve as a pocket nurse that helps people remember their meds or track health information.

Most of the company's products operate as friendly, animated chatbots on corporate websites, helping customers book flights or manage investments. But in developing those, Lawrence says, Next IT built massive language libraries, creating a universal framework for translating speech into data that can be used both to talk with Erin and to track larger patterns.

Such powerful technology may unlock amazing opportunities for turning computers into digital assistants that help us navigate the world. But privacy advocates with the Electronic Frontier Foundation warn that the software also could be used to eavesdrop on huge streams of private online communications, or elicit sensitive information through subtle questioning.

"Here's the reality of the way a lot of people use data today — they abuse it," Lawrence admits. "They don't tell you what they're doing with it. We don't do that."

Founder and CEO Fred Brown started Next IT in 2002 in hopes of making technology more accessible, Lawrence says. Customers don't have to learn new programs or software if they can simply tell a computer what to do, as in Star Trek or other sci-fi universes. Brown's company has since grown to 170 employees, almost entirely based in Spokane.

Some of the biggest companies and organizations in the world, including United Airlines, Merrill Lynch and the U.S. Army, now turn to Next IT for chat technology. The company recently expanded from 10 to 28 clients, Lawrence says, though some companies prefer to remain confidential. He says the latest revenues increased 50 percent over the previous year.

Logging on to the Aetna health insurance website, Lawrence opens a chat window with Ann, one of the company's newer agents. She pops up with a wide grin and dark, curly hair, looking more like a photo-quality avatar than some of the previous, obviously animated agents. Ann can recall individual medical histories, help change policies or schedule appointments.

"With Ann, I can just talk to her like I'm talking to you," he says. "She's wicked smart."

Next IT plans to shift many of its operations toward health care technology, he says, using virtual assistants to help people live longer. Imagine an app that tracks how you're feeling, helps you schedule medication and offers tips for a healthier diet. The app also could monitor some of your habits to help your doctor spot long-term trends in your health.

"There's a huge unmet need," Lawrence says. "There are not enough health care providers. ... [Our agent] is not as good as a nurse. It's not as good as a doctor, but it's as good a knowledge as you need to have to live a healthier lifestyle."

In certain cases, anonymous avatars have an edge. Lawrence says Army recruiters quickly discovered that people would ask the Sgt. STAR (Strong, Trained And Ready) agent sensitive questions they didn't feel comfortable bringing up in person, such as questions about sexual orientation or showering policies. Using that insight, the Army added answers to Sgt. STAR's database to address those concerns.

Dave Maass, an investigative researcher with the EFF, says the human-like illusions behind chatbot technology have always fascinated him. It takes a lot of programming to develop software that can hold its own in a casual conversation, and Next IT leads the way in creating avatars that not only provide accurate responses, but also showcase individual personalities.

Next IT photo: U.S. Army avatar Sgt. STAR

Dissecting the Army's use of Sgt. STAR, who is portrayed as a gruff, no-nonsense recruiter, Maass last month released a report on the potential risks of such chat programs, which he says may use human traits to socially manipulate people into divulging information they would not typically share with another person.

"Military, law enforcement and intelligence agencies have employed virtual people capable of interacting with and surveilling the public on a massive scale," he writes in his report, "and every answer raises many, many more questions."

Maass learned that the Army first introduced Sgt. STAR in 2006, after a post-Sept. 11 surge in enlistment drove a 40 percent increase in live chat traffic. The Sgt. STAR program now has 835 scripted responses. He replaces about 55 human operators, engages an average of 1,550 people per day and has answered 10.5 million questions in the past five years.

Next IT and Army officials both emphasize that Sgt. STAR remains an anonymous interaction. He does not save personal data, or even cookies. Next IT spokeswoman Jennifer Snell contends that the EFF report focuses on hypothetical abuses while ignoring many of the program's potential benefits.

"Privacy and security is definitely a conversation that needs to be had," Snell says. "[But] the EFF report relied on ... some assumptions based on what could possibly happen. ... It wasn't necessarily grounded in facts."

Beyond chatbots, Maass says Next IT's language interpretation software has the capability to initiate and monitor conversations online. He cites an old Next IT program called ActiveSentry, which federal documents suggest was once used by the FBI and CIA to scout out conversations with suspected pedophiles or terrorists, allowing one investigator to oversee up to 30 chats at once.

"That's the most fascinating thing about this," he says, hinting at a hidden campaign of digital entrapment.

Next IT says the ActiveSentry program primarily served as a security feature for banks to identify suspicious transactions. The company has been phasing out the program. News archives show that the former head of ActiveSentry was fired in 2007 and later sued by Next IT over alleged fraud.

Next IT reports that its virtual assistants now answer more than 60,000 questions a day, with numbers only continuing to grow as more companies adopt the technology. Lawrence says the agents save thousands of hours of personnel time and often provide more timely assistance for customers. Quality audits show they can answer about 95 percent of questions accurately.

"Our technology is the most advanced on the planet for nature language processing — it just is," he says. "We have really sophisticated algorithms for driving the technology, to understand what humans are telling us."

And the agents become more lifelike every day. Lawrence says customers appreciate the agents' unique personalities. Visitors sometimes flirt with the avatars, pushing the boundaries between humanity and technology. But he doesn't expect software to cross that line in his lifetime.

"I don't ever see artificial intelligence getting to the point ... of bumping up against those human aspects of feeling — empathy, hope, those kinds of things that we experience as human beings," he says. "At the end of the day, [it's] a machine."

The virtual ghost behind your computer or smartphone screen still just mimics humanity. Pull up Sgt. STAR on the Army website and ask if he feels empathy. Even the disciplined cadence of his digital voice hints at an existential uncertainty.

"That is a good question, however, I am not positive that I understand what you're asking," he replies. "Try rephrasing your question. I understand simple questions best."♦

Jacob Jones

Staff writer Jacob Jones covers criminal justice, natural resources, military issues and organized labor for the Inlander.