This year our team participated in the second annual Hackathon hosted by the Information Services department. Teams were given around seven hours to build something before presenting their creations to all the participants and being judged on their work. Awards were given out at the end in categories like simplification, partnership, and learner experience.

Our team set out to create some custom skills for Amazon Alexa – Amazon’s virtual assistant voice service. We wanted Alexa to be able to answer questions about OSU. Our team decided to use the APIs we’ve built as the data source for some of the answers we wanted from Alexa. As part of our project, we also had to create a new API to act as an intermediary between the Alexa voice service and the APIs providing the data. Amazon allows developers to use either an AWS Lambda function or an HTTPS endpoint to facilitate the interaction between the Alexa service and a backend data source.
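For reference, the Lambda option boils down to a single handler function that receives the request JSON from Alexa and returns the response JSON to speak. The sketch below is hypothetical (it isn’t our hackathon code) and only handles the simple case of an intent request:

```python
# Minimal, hypothetical Lambda handler for an Alexa custom skill.
# Alexa invokes the function with the request envelope as `event` and
# speaks whatever comes back in the response envelope.
def lambda_handler(event, context):
    # Assumes an IntentRequest for brevity; a real handler would also
    # deal with LaunchRequest and SessionEndedRequest types.
    intent_name = event["request"]["intent"]["name"]
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {
                "type": "PlainText",
                "text": f"You asked for the {intent_name} intent.",
            },
            "shouldEndSession": True,
        },
    }
```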

Since we opted for the HTTPS option, we had to build our API around the specific JSON schema that Alexa sends and expects to receive. Amazon provides the Alexa Skills Kit to allow developers to create a skill that has a number of intents. A skill always has an invocation name that lets Alexa know which skill a person wants to use. We decided to use “Benny” as the invocation name for our skill since the questions Alexa would answer would all be related to OSU. Intents are the types of actions that can be performed within a skill. To trigger an intent we created, we would start by saying “Alexa, ask Benny…”. When an intent is triggered, Alexa sends a request to the Alexa API we created during the hackathon. Depending on the intent, our API calls one of our backend APIs to get the data for a response. The API uses the data to create a text response that’s meant to be spoken and returns that response to the Alexa service.
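Here’s a rough sketch of how an HTTPS-hosted endpoint like ours can be shaped, using Python and Flask. The route, helper names, and dispatch table are all hypothetical, and a production endpoint also has to verify the signature and certificate Alexa attaches to every request, which is left out here:

```python
# Hypothetical sketch of an HTTPS endpoint for an Alexa skill using Flask.
# Request signature/certificate validation is omitted for brevity.
from flask import Flask, jsonify, request

app = Flask(__name__)

def speak(text):
    """Wrap plain text in the response envelope Alexa expects."""
    return jsonify({
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": True,
        },
    })

@app.route("/alexa", methods=["POST"])
def alexa_webhook():
    body = request.get_json()
    req = body["request"]
    if req["type"] != "IntentRequest":
        return speak("Try asking Benny a question about OSU.")
    # Look up a handler for the intent and let it build the spoken text
    # by calling one of our backend APIs.
    handler = INTENT_HANDLERS.get(req["intent"]["name"])
    if handler is None:
        return speak("Sorry, I don't know how to answer that yet.")
    return speak(handler(req["intent"].get("slots", {})))

# A table like this could map intent names to handler functions such as
# the location, directory, and PAC sketches later in this post.
INTENT_HANDLERS = {}
```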

We used the locations API for several of the intents we created. The data in the locations API allowed us to create intents to answer questions like “what restaurants are open right now?”, “is the library open today?”, and “what restaurants are close to me?”.
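A handler for the “open restaurants” question might look something like the sketch below. The locations API URL, query parameters, and response fields are assumptions made for illustration, not the API’s documented interface:

```python
# Hypothetical handler for an "open restaurants" intent.
import requests

LOCATIONS_API = "https://api.example.oregonstate.edu/v1/locations"  # hypothetical URL

def open_restaurants_handler(slots):
    # Assumed query: dining locations that are currently open.
    resp = requests.get(LOCATIONS_API, params={"type": "dining", "isOpen": "true"})
    resp.raise_for_status()
    names = [loc["attributes"]["name"] for loc in resp.json()["data"]]
    if not names:
        return "No restaurants are open right now."
    return "These restaurants are open right now: " + ", ".join(names)
```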

We used the directory API to create an intent to look up information about people on campus. We can ask things like “what is the email address for Edward Ray?” and “what is the phone number for Wayne Tinkle?”.
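A directory lookup handler could follow the same pattern. The slot name, query parameter, and response fields here are again assumptions for illustration:

```python
# Hypothetical handler for a "look someone up" intent.
import requests

DIRECTORY_API = "https://api.example.oregonstate.edu/v1/directory"  # hypothetical URL

def directory_email_handler(slots):
    # Assumes a custom slot (here called "PersonName") carrying the spoken name.
    name = slots.get("PersonName", {}).get("value", "")
    resp = requests.get(DIRECTORY_API, params={"q": name})
    resp.raise_for_status()
    people = resp.json()["data"]
    if not people:
        return f"I couldn't find anyone named {name} in the directory."
    email = people[0]["attributes"]["emailAddress"]
    return f"The email address for {name} is {email}."
```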

Our team also created intents that used our terms API and class search API. For example, to get a list of open terms, you’d say “Alexa, ask Benny what terms can I register for?”. We also created the PAC (physical activity course) intent. When I was a student, I would often find myself looking for a random 1-2 credit class to take that fit around the rest of my schedule. The PAC classes were nice because I could do fun things like biking, running, or rock climbing. The PAC intent allows you to ask “give me a PAC class for Fall 2017 at 2:00 PM on Mondays”. Alexa will then find a random PAC class that fits into that schedule.
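The PAC intent is a nice example of using slots, since the term, time, and day all come in as separate values from Alexa. The slot names and the class search API parameters below are assumptions for illustration:

```python
# Hypothetical handler for the PAC intent: pick a random PAC class that
# fits the requested term, time, and day.
import random
import requests

CLASS_SEARCH_API = "https://api.example.oregonstate.edu/v1/classes"  # hypothetical URL

def pac_handler(slots):
    term = slots.get("Term", {}).get("value")   # e.g. "Fall 2017"
    time = slots.get("Time", {}).get("value")   # e.g. "14:00"
    day = slots.get("Day", {}).get("value")     # e.g. "Monday"
    resp = requests.get(CLASS_SEARCH_API, params={
        "subject": "PAC", "term": term, "startTime": time, "day": day,
    })
    resp.raise_for_status()
    classes = resp.json()["data"]
    if not classes:
        return "I couldn't find a PAC class that fits that schedule."
    pick = random.choice(classes)["attributes"]
    return f"How about {pick['title']} at {time} on {day}?"
```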

After the hackathon, we created a video to demo some of the intents we created with an Amazon Echo. However, you don’t need an Amazon Echo to develop and test Alexa skills. There are many applications out there that allow you to test an Alexa skill, like EchoSim.

Video Demo: https://media.oregonstate.edu/media/t/0_vqlnak06

Amazon lets anyone beta test a skill they create by linking an Alexa-enabled device (like the Echo or EchoSim) to their account. Releasing a skill so that it’s available to any Alexa device requires approval from Amazon. Since the skill we created at the hackathon was a proof of concept, we didn’t submit it to be available on all Alexa devices, so the skill isn’t available to use publicly.
