Voice assistants might be simple enough for anyone to build thanks to the wide range of development tools available. However, even scrappy apps must comply with the law. Chatbot and voice assistant technology has grown explosively, and it now faces greater scrutiny from regulators and an increasingly privacy-minded public.

That’s not necessarily a bad thing, though. Complying with the law might complicate development by adding a few hoops to jump through, but consider regulations an opportunity to build trust between your assistant and its users. Here are four legal areas you might want to consider before releasing your voice assistant or bot into the wild.

State Your AI Status

A recently passed California law, taking effect in July, states that some bots must disclose that they’re AI and not human. The law comes hot on the heels of a couple of controversies from the past year: the increasing number of politically polarized Twitter bots and that time Google Duplex creeped out the internet with its startlingly convincing verbal tics.

While it’s a California law, anyone with users in California should take notice, even if they don’t operate within the state. But the law is limited to voting-related or commercial bots, so only those selling products or delving into political issues will need to worry.
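One simple way to satisfy a disclosure requirement is to prepend a clear "I'm a bot" notice to the assistant's first reply. Here is a minimal sketch; the function name, the wording of the notice, and the `needs_disclosure` flag are all hypothetical, and whether disclosure is actually required for a given user is a question for your lawyer, not your code.

```python
# Hypothetical sketch: prepend an AI-status disclosure to a bot's first
# reply for users who may be covered by a bot-disclosure law.

AI_DISCLOSURE = (
    "Hi! I'm an automated assistant, not a human. "
    "How can I help you today?"
)

def first_reply(bot_answer: str, needs_disclosure: bool) -> str:
    """Return the bot's first reply, adding the disclosure when required."""
    if needs_disclosure:
        return f"{AI_DISCLOSURE}\n{bot_answer}"
    return bot_answer
```

Putting the disclosure in the very first turn, rather than burying it in a settings page, keeps the bot's status unambiguous from the start.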

Ensure a GDPR-Compliant Chatbot & Voice Assistant

Although GDPR arrived early in the year, many businesses are still scrambling to comply. Even novice developers need to heed the law: anyone investing in voice assistant technology will have to take special steps to ensure their chatbot or assistant is GDPR compliant.

Because the law governs the handling of user data—including voice data—your first step is to provide a clear privacy policy and ask for consent before collecting anything, including voice assistant usage. After users consent, you can employ conversational analytics, which provide key demographic data such as location to help you get to know your users better. You should always encrypt voice analytics data, and it’s a good idea to give users the ability to access and delete their data should they choose.

Voice Assistant Technology Doesn’t Replace Professional Advice

If your voice assistant provides legal or medical advice, it’s worth reminding users that it doesn’t fully replace advice from a professional. So while a voice assistant might assess symptoms, it should frame its service as informing users of possible issues they can raise with their doctor, not as replacing a trip to the doctor. It’s also helpful to connect users to local, professional resources relevant to their case.

Follow Regulations for Child-Focused Apps

If your voice bot’s audience is primarily children, make sure it complies with COPPA, the Children’s Online Privacy Protection Act. In fact, Amazon requires that child-focused Alexa skills comply with the regulation, so don’t forget to ensure you are covered before submitting to the skill store. Among the chief concerns for COPPA compliance is the collection of personal data from children under 13 years old, which is forbidden without verifiable parental consent. Obtaining that consent requires reaching out to the parent directly with an explanation of what data you will collect and why. Learn more about verifiable parental consent here.
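The under-13 rule above boils down to a single gate in your data-collection path. Here is a minimal sketch; the function name and parameters are hypothetical, and determining what counts as "verifiable" consent under COPPA is a legal question beyond any code check.

```python
# Hypothetical sketch: a COPPA gate that blocks personal-data collection
# for users under 13 unless verifiable parental consent is on file.

def may_collect_personal_data(age: int, parental_consent_verified: bool) -> bool:
    """Return True only if collecting this user's personal data is allowed."""
    if age < 13:
        return parental_consent_verified
    return True
```

Calling this check before every collection point, rather than once at signup, guards against a child’s data slipping through a code path added later.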