
Can Alexa Be Used Against Your Business in Court?

Voice data recorded, collected, and stored by smart speakers dredges up a host of legal issues for businesses around discoverability and privacy

In just a few short years, digital assistants—voice-operated, internet-connected devices like Amazon's Alexa and Google Home—have gone from novelties to household standbys. And now they are starting to appear in businesses as well.

To date, more than 100 million of the market-leading Alexa-powered devices have been sold, according to Amazon. That kind of popularity doesn’t happen unless a product provides some real benefits to users—and this one offers many, from assisting with online shopping and filling prescriptions to getting news and weather alerts.

Despite the conveniences that these Internet-connected devices provide, legal issues surrounding them remain murky, particularly for businesses that own the devices, as well as employees who may have bought one to use in the office.

The biggest privacy concern with a device like Alexa isn't that it's hearing your mundane conversations about the weather or this week's movie listings. It's that you could inadvertently trigger the device by saying a word that sounds like its wake word. When you do, Alexa records what you say and responds; that recording becomes part of your voice history and is easily accessible through the Alexa app. Hypothetically, it could record a crime being committed or a conversation involving sensitive material, such as trade secrets.

Made for Consumers, Not Businesses

For the most part, devices like Alexa are consumer-focused and haven't been designed to be productivity tools for enterprises, at least not yet (although Microsoft's recent announcement regarding upcoming improvements to its Cortana assistant was targeted squarely at the enterprise). So, having them in an office setting offers limited benefits to the business. But they do carry a degree of added risk, so companies should start considering those risks and taking precautions now to prevent potential privacy and other issues from arising later.

Indeed, these devices strike at one of the most critical business concerns today: the ownership, use, security, and protection of a company’s data. With products aimed at the enterprise, the corporate owner of the product typically owns the data. Yet with consumer devices like these, there’s more of a gray area.

Consumer device user agreements often say that the user owns the data, but those agreements also can give the company selling the device or storing the data the right to access and analyze that data at will, terms that would be unacceptable to most companies. And that’s just one example of how consumer product terms often do not meet corporate data governance guidelines.

Could Be Used in Your Next Litigation

It's not just data privacy and security concerns, though; these devices could come up the next time the company faces a lawsuit. Take the example of a former employee suing a company manager for sexual harassment. If the former worker claims that an Alexa device in the office might provide proof of the harassment, could the employee obtain that information, and would it be admissible in court? Yes, likely so.

The first hurdle in litigation is gaining access to the device and data, and that often depends on who owns the device. Information on a company-owned device, even if the information is personal in nature, is generally fair game in U.S. courts (although that’s not the case in the E.U., so location matters).

The barrier can be a little higher if the device is owned by an employee who is not personally named in the litigation. In that circumstance, courts will often require that the requesting party meet a higher bar to show that the collection and use of that information is likely to be relevant. But in the end, as long as it can be shown that the device was used for a business purpose or in a business setting and that there’s a likelihood that the information on the device could be relevant to the case, then most courts will require the information be collected and produced.

Once access to the data is granted, then it’s largely treated just like any other evidence. From a legal perspective, the medium on which the data is located doesn’t really matter. Even though voice-activated devices like Alexa are a whole new medium, the data from them is treated the same as any other digital or physical evidence.

And while collection and review of that data can be difficult at times today, that will change over time as well, with discovery and compliance functions eventually being built into those devices. That, in turn, will only further encourage the use of that information in legal proceedings, as it’ll become more accessible and routine.

For comparison's sake, when Gmail launched it was a consumer-focused product, not a business-focused one. But when Google introduced G Suite, a cloud-based competitor to Microsoft Office aimed at businesses, it had to build in discovery and compliance components that allowed enterprises to pull information when needed. I suspect digital assistants will follow a similar path.

What Companies Should Do

The growing popularity of digital assistants is similar to what companies started encountering five to 10 years ago with smartphones and tablets. Today, like then, companies have only a few options for tackling the problem: They can prohibit use of the devices entirely, create acceptable use policies, or stick their heads in the sand and ignore the problem. The last option is what a lot of corporations did then—and may do again with digital assistants. But that would be a mistake.

If you're concerned about this issue at all, your best option for now may be discouraging use of the devices in the workplace, at least until you can develop acceptable usage policies. Certainly, you should outright prohibit them on financial trading floors and in other places where confidential information is routinely discussed.

The next step would be to create usage policies and expectations similar to what companies did with smartphones, making sure employees receive regular reminders about the policies as well as the consequences for violating them. And employees should be warned that using such devices in the workplace or for business purposes will subject the data to review in litigations, investigations, and other proceedings.

A New Way to Access Information

If history is any guide, it’s entirely possible that in the coming years, digital assistants will become as integrated into our businesses as smartphones. After all, these devices provide yet another way of interacting with the massive trove of digital information that we’re constantly accumulating, and they are communicating with us in more natural conversation styles, which will only encourage increased use.

Whether companies want to adopt these devices or not, eventually they will likely have little choice, just as with smartphones. But companies should learn from history and start examining the impact on their organization now, before they are forced to. Whether the devices should be allowed, to what extent, and how they should be used are all questions to consider and address sooner rather than later. Those that prepare for and adopt these new technologies now will not only better protect their company and its data, but will most likely find that doing so gives them a competitive advantage over rivals who are slow to adapt.
