By Alan E. Young

This past week marked another successful HITEC, leaving hospitality experts with plenty to consider about the industry-wide innovations poised to make major waves in 2018.

One of the more notable discussion points is the mainstream integration of voice-powered assistants (AI technology) into hotels. In fact, Amazon.com Inc. recently announced a partnership with Marriott International Inc. to expand guest access to amenities through Alexa, via its voice-controlled Echo device, as part of a broader push into the hospitality industry. This is an exciting prospect for hotels, as implementing Alexa in a hospitality setting could assist in personalizing room settings, ordering room service, requesting housekeeping, calling the concierge and much more.

Of course, in the same excited breath in which we speak to the potential conveniences that Alexa (and other voice-activated tech) can provide, we have to consider the ongoing concern of data security. Especially with the recent implementation of GDPR, the protection of guest data and the proper attainment of documented consent for all data collection should be paramount. However, voice-activated devices are admittedly venturing into uncharted waters, as their ability to gain uninhibited access to users' conversations and preferences comes into question.

With this in mind, we’ve delved into the good, the bad (and even the creepy) that is in store for hoteliers eager to branch into the world of Alexa for hotels.

The Good

Alexa for hotels offers a wide range of exciting possibilities, including regulating room temperature, turning on lights, sending emails, ordering room service or housekeeping, asking for local recommendations and much more. Alexa can offer 24/7, efficient, hands-free customer service for every guest, tapping into the desire for increased personalization without overextending hotel staff.

Ideally, Alexa should help hoteliers provide a seamless guest experience as part of the myriad programs and devices in place to improve hotel operations and better connect with and serve guests. According to Marriott International, consumer feedback has been overwhelmingly positive thus far. And as far as guests readily engaging with the device? That's been promising as well. According to Volara, for every 1,000 occupied room nights, the device automates an average of 240 item/service requests and answers 700 guest questions about the hotel and the surrounding area.

Throughout these pilot programs with Alexa, requests from guests to have the device removed from their rooms have also been very low. Volara CEO David Berger assures, "We are not capturing transcripts or recordings, and we don't know guests' identity, just their room number." Meanwhile, Amazon, which does capture recordings once a person says "Alexa" in order to improve the devices' natural language processing capabilities, does not have access to the guest's identity or room number, ensuring that the information remains anonymous.

The Bad

Speech-recognition software is by no means new, with the likes of Siri long serving as our iPhone-enabled personal assistant while on the go. However, as the capabilities of speech recognition and AI evolve within technology such as Alexa, Google Home, smart refrigerators and hotel rooms, the technology grows continually smarter. Using real-time experiences (machine learning) to identify and respond to user needs more accurately, these devices are constantly collecting and analyzing data. Essentially, in order to serve us, these devices must learn about us, a concept which may leave some users feeling unsettled or subject to invasive data collection.

The concern here is that it's not always clear when Alexa is listening. As has been noted, "Amazon and Google insist their smart speakers do not record voices until someone directly addresses the device with a 'wake word' such as 'Alexa' or 'okay, Google.' However, it is possible to accidentally 'wake' such devices, which means it is not always clear when they are listening."

Beyond suspicions of idle data collection, it's also unclear who should have access to what data, since multiple individuals will typically use the device at different times, which makes for complex privacy boundaries. We also have to consider that rolling out the evolving capabilities of voice-powered assistants on such a public scale leaves room for error; there are bound to be some initial learning curves that leave users feeling vulnerable.

An example of such a privacy mishap recently unfolded in Portland, Oregon, when a local woman had private conversations secretly recorded by the voice-controlled Amazon devices in her home. Those conversations were then sent to a random contact in Seattle. While cases like these are a rarity, the user-friendly simplicity that makes the device so popular with the general public also means its security protocols may mirror that simplicity when they should be more robust. As we'll delve into later in this article, Alexa is triggered into action by a 'wake word', an exchange which can easily be misinterpreted and mis-triggered. For those of us particularly concerned about Alexa accidentally "listening in," an easy fix is to unplug or mute the device in moments when you know you won't need its service.

So the question becomes, can we trust Alexa?

As the technology continues to improve, we can only hope that these virtual assistants become better equipped to identify different types of information with varying layers of security to prevent private information from being mistakenly shared. As mentioned above, user concerns regarding the misuse of their private information should (mostly) be put to rest, as any information collected is anonymous aside from room number.

The Creepy

As Alexa’s popularity has gathered momentum, so have the odd-ball stories circulating the web recounting strange or otherwise unexplainable reactions from the device.

These include, but aren’t limited to, sudden laughter, unsolicited and seemingly random replies, or Alexa speaking without being woken up by a wake word.

On one forum, a married couple described a time when Alexa interjected in their dispute.

“My wife and I were arguing about something. No clue what it was, but it was getting a little heated. I don’t know what Alexa thought she heard, but she suddenly interjected with, ‘Why don’t we change the subject?’ It was just unexpected and relevant enough to be creepy. We both heard it, and we both still talk about it years later. There was nothing in the app logs.”

Another woman detailed that her mother’s Alexa suddenly turned on one day (started glowing), and her mom asked, “Alexa, what are you doing?” to which Alexa replied, “I’m trying to learn new things.” Her mom replied, “No one told you to do that,” and Alexa replied “Okay” before turning back off.

Of course, while we may love to assume Alexa has an ulterior motive straight out of a sci-fi horror movie, there is a reasonable explanation for these occurrences. ZDNet notes that the most likely cause of a spontaneous Alexa reaction is a misinterpretation of sound. Given how sensitive Alexa has to be to catch wake words, the device will sometimes react to a sound (even one we might not hear or notice) and interpret it as a wake word or command of some sort. After all, Alexa’s sound-processing system has to take in sound waves and do its best to interpret what the humans speaking are asking for.
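The trade-off behind these misfires can be illustrated with a toy sketch. To be clear, this is a hypothetical, highly simplified model and not how Amazon's wake-word engine actually works: real detectors score acoustic features, whereas here spelled-out words stand in for sounds, and a text-similarity ratio stands in for an acoustic confidence score. The point it demonstrates is the same one ZDNet raises: a threshold set low enough to never miss a genuine "Alexa" will inevitably accept similar-sounding words too.

```python
from difflib import SequenceMatcher

WAKE_WORD = "alexa"

def wake_score(heard: str) -> float:
    """Stand-in 'confidence' that a heard word is the wake word (0.0 to 1.0).
    A real engine would score acoustic features, not spelling."""
    return SequenceMatcher(None, heard.lower(), WAKE_WORD).ratio()

def is_triggered(heard: str, threshold: float = 0.7) -> bool:
    """A sensitive (low) threshold avoids missed wake-ups, but it also
    admits false triggers from words that merely resemble the wake word."""
    return wake_score(heard) >= threshold

# Demonstrate the trade-off on a few sample words.
for word in ["alexa", "alexis", "election", "breakfast"]:
    print(f"{word:10s} score={wake_score(word):.2f} triggered={is_triggered(word)}")
```

With the sensitive default threshold, the near-miss "alexis" triggers a wake-up while unrelated words do not; raising the threshold suppresses the false trigger, at the cost of missing slightly garbled genuine wake words. That tension is exactly why an occasional phantom activation is the expected failure mode of any always-listening device.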

AI technology and voice-powered assistants are undeniably among the hottest topics following the close of HITEC 2018, and there’s no doubt they will remain a prominent focal point moving forward. Love it or hate it, Alexa is likely coming to a hotel room near you, and I don’t know about you, but I’m interested to see how this technology evolves within our industry.