CAN CHATBOTS BE BENEFICIAL FOR SOCIETY?
October 30, 2018
Viktoriya Polyarush, Social Media Coordinator, an avid reader with a passion for technology.

When you think about chatbots, what comes to mind? Most likely, images of business, revenue and convenience come up first. However, some chatbots have been created for society’s benefit, to help address humanitarian crises and advance social impact. Today we want to share several examples of how chatbots can be used in important humanitarian initiatives.

 

BOTS RAISE AWARENESS & FUNDS

Unfortunately, according to a UN report, less than 50% of the population in Ethiopia has access to clean water and only 21% enjoys proper sanitation services. However, cold statistics like these seldom move people to take action. That’s why Charity: Water teamed up with Lokai and AKQA to create Yeshi, a Facebook Messenger chatbot that humanizes the water crisis in Ethiopia. Yeshi is a young girl in Ethiopia who walks 2.5 hours every day to the nearest reliable water source. She travels alone and carries huge plastic jugs so she can bring gallons of water home to her family. You learn about her dreams of going to school and see a map of her journey.

Yeshi even asks you to send her a picture of where you live. “Wow! Our worlds are so different,” she remarks, before leaving you to continue her tedious walk alone again. The experience of “Walking With Yeshi” is undeniably emotional. Conversational experiences like this can be a powerfully effective way to convey the humanitarian challenges facing the global poor and to inspire action. Besides raising awareness, charities can also use bots and messaging platforms to raise critical funds: Charity: Water recently worked with Assist to enable donors to give directly from Facebook Messenger.

 

BOTS FIGHT BUREAUCRACY & INEQUALITY

There are some people who stand out in the crowd. 19-year-old Joshua Browder is no typical teenager. The Stanford computer science undergraduate has single-handedly beaten over 160,000 unfair parking tickets with his bot, Do Not Pay. The sophisticated “robot lawyer” also helps tenants fight negligent landlords and the homeless apply for much-needed government support. Browder was inspired to help the most vulnerable members of society get legal help they would otherwise never be able to afford.

“So many government bureaucracies can be automated, like the DMV. Eliminating bureaucracy will actually save the government money,” points out Browder. “In the UK, there is this really broken system where the government pays a lawyer to file an application back to the government for a homeless person to receive support. The government wastes so much money with the application process when they should just spend that money on houses.”

Browder’s vision for Do Not Pay reaches far beyond fighting off parking tickets and filing homelessness applications. While some aspects of the law, like bankruptcy, are complicated and unintuitive, many legal processes can be modeled by computers as logical decision trees. Browder’s mission is to turn Do Not Pay into a legal bot platform where lawyers can identify aspects of the law that are automatable and create their own bots.
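To make the decision-tree idea concrete, here is a minimal Python sketch of a parking-ticket appeal flow. The questions, branches, and outcomes are invented for illustration; they are not taken from Do Not Pay.

```python
# A minimal sketch of modeling a legal process as a decision tree.
# The questions and outcomes are invented for illustration and are
# NOT taken from Do Not Pay or any real appeals procedure.

PARKING_APPEAL_TREE = {
    "question": "Was the parking sign visible from where you parked? (yes/no)",
    "yes": {
        "question": "Was your permit or payment valid at the time? (yes/no)",
        "yes": "Possible grounds for appeal: valid payment. Draft an appeal letter.",
        "no": "Weak grounds for appeal based on these answers alone.",
    },
    "no": "Possible grounds for appeal: signage was not visible. Draft an appeal letter.",
}

def walk(node):
    """Walk the tree, asking yes/no questions until an outcome is reached."""
    while isinstance(node, dict):
        answer = ""
        while answer not in ("yes", "no"):
            answer = input(node["question"] + " ").strip().lower()
        node = node[answer]
    print(node)

if __name__ == "__main__":
    walk(PARKING_APPEAL_TREE)
```

A lawyer who can express part of the law as questions and branches like these could, in principle, hand the routine traversal over to a bot.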

Government bureaucracy is so pervasive that many other bots have cropped up to simplify civic matters. Against the backdrop of election cycle drama, several voter registration bots – HelloVote, GoVoteBot, and VotePlz – emerged to let voters skip onerous and error-prone paperwork and register simply through SMS and Facebook Messenger.

 

BOTS ENCOURAGE THE RIGHT ACTIONS

Do you feel like you can’t give up your nicotine addiction but are too embarrassed to ask a friend for help every time you feel a craving? Public Health England experimented with a Facebook Messenger bot for its month-long Stoptober campaign to help smokers quit. Stoptober successfully helped 500,000 people quit smoking last year, an impressive 20% of the 2.5 million smokers who registered. PHE’s marketing director Sheila Mitchell believes the addition of the Facebook Messenger bot as a support tool for smokers will increase the percentage of successful quitters. “The heart of the campaign is social,” explains Mitchell. “We found that the big numbers and responses come from social and that within this Facebook is absolutely dominant.”

 

BOTS SUPPORT MENTAL HEALTH

Many non-profits and government agencies offer hotlines and support groups that face high demand with insufficient human staff. Some, like the suicide prevention group Samaritans, are reportedly working on chatbots to offer faster response times and around-the-clock support. Such social support, whether given by a human or a bot, has a huge impact on people. Even gifting senior citizens a robotic seal has been shown to reduce stress and improve socialization. Beyond building mechanical robots to address the physical challenges of old age, social chatbots can be built to address emotional and mental needs.

Conversational avatars like Ellie, a digital therapist developed by USC’s Institute for Creative Technologies, can interview patients and detect depression and anxiety by analyzing words, facial expressions, and tone. Professor Louis-Philippe Morency, co-creator of Ellie, says the bot cannot replace a human therapist, but is a decision support tool that helps to “gather and analyze an interaction sample” for doctors.

Tess is a mental health chatbot. If you’re experiencing a panic attack in the middle of the day, want to vent, or need to talk things out before going to sleep, you can connect with her through an instant-messaging app such as Facebook Messenger (or, if you don’t have an internet connection, by texting a phone number), and Tess will reply immediately. She’s the brainchild of Michiel Rauws, the founder of X2 AI, an artificial intelligence startup in Silicon Valley. The company’s mission is to use AI to provide affordable, on-demand mental health support. Rauws’s own struggles with chronic illness as a teenager brought on a depression that led him to seek help from a psychologist. In learning to manage his depression, he found himself able to coach friends and family who were going through their own difficulties. It became clear to him that lots of people wanted help but, for a number of reasons, couldn’t access it. After a stint at IBM, where he worked with state-of-the-art AI, Rauws had his “aha” moment: if he could create a chatbot smart enough to think like a therapist and hold its own in a conversation, he could help thousands of people at once and relieve some of the wait times for mental health care.

One of the things that makes Tess different from many other chatbots is that she doesn’t use pre-selected responses. From the moment you start talking, she’s analyzing you, and her system is designed to react to shifting information. Tell Tess you prefer red wine and you can’t stand your co-worker Bill, and she’ll remember. She might even refer back to things you have told her. “One of the major benefits of therapy is feeling understood,” says Shanthy Edward, a clinical psychologist. “And so if a machine is not really reflecting that understanding, you’re missing a fundamental component of the benefits of therapy.”
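X2 AI has not published how Tess works internally, but the general idea of a bot that extracts facts from what you say and refers back to them later can be sketched in a few lines of Python. The patterns, phrasing, and example messages below are invented for illustration only.

```python
# A rough sketch of a conversational bot remembering user facts and
# referring back to them. A generic illustration, not a description
# of how Tess or X2 AI actually works.

import re

class ConversationMemory:
    def __init__(self):
        self.facts = {}  # e.g. {"likes": "red wine", "dislikes": "my co-worker Bill"}

    def observe(self, message: str) -> None:
        """Extract simple preference statements and store them."""
        for clause in re.split(r"\band\b|[,.]", message):
            like = re.search(r"\bI (?:prefer|like|love) (.+)", clause, re.I)
            dislike = re.search(r"\bI (?:can't stand|hate|dislike) (.+)", clause, re.I)
            if like:
                self.facts["likes"] = like.group(1).strip(" .!")
            if dislike:
                self.facts["dislikes"] = dislike.group(1).strip(" .!")

    def reply(self, message: str) -> str:
        self.observe(message)
        if "likes" in self.facts and "dislikes" in self.facts:
            return (f"You told me you like {self.facts['likes']} and that "
                    f"'{self.facts['dislikes']}' is a source of stress. "
                    "How are things today?")
        return "Tell me a bit more about what's on your mind."

memory = ConversationMemory()
print(memory.reply("I prefer red wine and I can't stand my co-worker Bill."))
print(memory.reply("Rough day again."))
```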

BOTS PROVIDE SOCIAL & HEALTH SERVICES

With the threat of Zika looming over the Americas, knowing whether you’ve contracted the disease is critical to getting timely treatment. GYANT, a health bot on Facebook Messenger, walks you through a questionnaire of symptoms to estimate your likelihood of having Zika. Concerned users can get a personalized answer immediately rather than wait for a doctor’s appointment or ignore the problem. In the healthcare industry, providers are overwhelmed by the number of patients, most of whom need continuous social and emotional support outside of their doctor and hospital visits. Sensely is a digital nurse bot with a human-like avatar that can save up to 20% of a clinician’s time by monitoring whether patients are dutifully following their prescribed regimens.
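GYANT’s actual triage logic is not public, but a symptom questionnaire of this kind can be thought of as a weighted checklist. The following toy Python sketch uses made-up symptoms, weights, and thresholds purely to illustrate the flow; a real tool would rely on clinically validated rules.

```python
# A toy sketch of a symptom-questionnaire bot. Symptoms, weights, and
# thresholds are made up for illustration; this is not medical advice
# and not GYANT's real logic.

QUESTIONS = [
    ("Have you had a fever in the last week?", 2),
    ("Do you have a rash?", 2),
    ("Do you have joint pain?", 1),
    ("Do you have red or irritated eyes?", 1),
    ("Have you recently traveled to an area with active Zika transmission?", 3),
]

def run_questionnaire():
    score = 0
    for question, weight in QUESTIONS:
        answer = ""
        while answer not in ("yes", "no"):
            answer = input(question + " (yes/no) ").strip().lower()
        if answer == "yes":
            score += weight

    if score >= 6:
        print("Your answers suggest you should contact a doctor promptly.")
    elif score >= 3:
        print("Some answers are consistent with Zika; consider seeing a doctor.")
    else:
        print("Your answers suggest a low likelihood, but see a doctor if symptoms persist.")

if __name__ == "__main__":
    run_questionnaire()
```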

Public agencies are using chatbots to connect with citizens and engage diverse stakeholders in addressing social challenges. Cities in the U.S. are utilizing text-based services to aid citizens and government employees: the city of Mesa, Arizona is testing a text message chatbot that can answer frequently asked questions about available services. Residents can use text messaging services to ask questions about their billing information or to update their credit card information. Elsewhere, public agencies are using chatbots to help clients complete transactions. For instance, the Australian Tax Office deployed a chatbot called Alex in March 2016 to assist citizens with questions related to taxes. Alex has already conducted more than a million conversations with citizens. These examples show how chatbots improve service delivery and help governments better respond to citizens’ needs.

Some public agencies are using chatbots to receive instant feedback and understand citizens’ perspectives about issues. Gwinnett County in the Atlanta metro area used Textizen, an interactive text messaging platform, to engage residents about the future of local transportation. The effort focused on collecting residents’ comments and opinions about improving county transportation services. The county received more than 1,400 survey responses and 2,700 text survey responses in a week, and the data is presented visually to track progress over time. By using chatbots to conduct surveys and gather information in real-time, public agencies are opening up new avenues to hear citizens’ voices about issues facing communities.

 

THE AREAS WHERE HUMANITARIAN BOTS FALL SHORT

Addressing social issues requires emotional sensitivity, a critical skill that bots universally lack. LawBot is a legal bot created by Cambridge University students to help users in the UK understand the complexities of the law and identify whether a crime has been committed. People can use the bot to report rape, sexual harassment, injuries and assaults, and property disputes. Unfortunately, the bot uses a strict checklist to assess whether a “crime” has been committed.

While artificial intelligence has not yet evolved to the point where bots can respond with emotional acuity to difficult situations, a better solution for LawBot would be to connect distressed users to sympathetic hotlines, support groups, or expert human lawyers once the conversation has exceeded the bot’s domain expertise.
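That handoff pattern can be sketched roughly as follows: the bot watches for distress signals and for low classifier confidence, and falls back to a human when either appears. The keywords, the stand-in intent classifier, and the threshold below are all placeholders invented for illustration.

```python
# A minimal sketch of the human-handoff pattern described above: when the
# conversation signals distress or falls outside the bot's domain, stop
# automating and route the user to a human. Keywords, the stand-in
# classifier, and the threshold are placeholders, not real resources.

DISTRESS_KEYWORDS = {"assault", "rape", "suicide", "hurt myself", "unsafe"}
CONFIDENCE_THRESHOLD = 0.6

def classify_intent(message: str):
    """Stand-in for a real intent classifier: returns (intent, confidence)."""
    if "parking" in message.lower():
        return "parking_ticket", 0.9
    return "unknown", 0.2

def handle(message: str) -> str:
    text = message.lower()
    if any(keyword in text for keyword in DISTRESS_KEYWORDS):
        return ("I'm not able to help with this on my own. "
                "Connecting you with a trained human advisor now.")
    intent, confidence = classify_intent(message)
    if confidence < CONFIDENCE_THRESHOLD:
        return ("This is outside what I can handle. "
                "Would you like me to put you in touch with a human lawyer?")
    return f"Okay, let's work on your {intent.replace('_', ' ')} together."

print(handle("I want to appeal a parking ticket."))
print(handle("I was assaulted last night."))
```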

“Humans change our behavior in reaction to how whoever we are talking to is feeling or what we think they’re thinking. We want systems to be able to do the same thing,” says William Mark, head of SRI International’s Information and Computing Sciences Division.

From a policy perspective, this is a critical concern because of the digital divide: how many people have access to the internet or smartphones to make use of these services? Policymakers have long struggled with designing electronic services that cater to the needs of all citizens. If the goal is to improve customer service and include marginalized populations in decision-making, public agencies must examine the demographics of the people using chatbots. As chatbots become widespread, regulators also need to think about rules to manage the security and privacy concerns associated with these new tools. Hackers and scammers could use chatbots to gather valuable personal information by contacting organizations and posing as clients. Similarly, they could design bots to target unsuspecting users. The issues of security and privacy will only become more complicated as chatbots carry out more tasks and transactions.