We often get asked by customers, once they've had their integration for a while, whether there is more they could be doing with it. Usually this is expressed as a desire to go beyond just looking at completion status, pass/fail, or score (which, along with duration, make up 'the big 4' data points).
The good news is that if your course is recording a piece of data, Engine has it available for you. One of the areas where this additional data comes into play is with 'interactions' -- the question and answer details recorded when a learner takes a quiz of some sort. This granular assessment data can help feed analytics and reporting that is useful to instructional designers and those who want a fuller picture of what learners understand about the material.
In this article, we will look at how you can get this type of data out of Engine, some tips on doing that efficiently, and some caveats around what's available if you are supporting multiple learning standards through Engine.
Does my course support interactions?
All of the learning standards we support in Engine (other than LTI) have the ability to record some level of information about interactions. However, not every course actually sends us that data. If the course author (or the authoring tool they used) has not enabled interaction reporting, then we will not have it available in Engine.
This can sometimes be confusing, as you can clearly see a quiz with questions and answers in the course. In these cases, the content may be tracking quiz results in an internal data structure, but it does not make the appropriate learning-standard-specific calls to report those results so that we can see them.
Please note that there is no way for Engine to know ahead of time whether a course will actually record interactions without launching and taking the course to see what it does.
As well, there is no mechanism for getting a list of all possible questions -- you will only see a question in the context of a person having answered it. This means that if a person takes a 10-question quiz but only answers 5 of the questions, it's very possible you'd only see 5 entries in the registration details. [I say 'very possible' there, because a course could just record every question with a blank answer if the learner doesn't pick something.]
What data can I (potentially) see for each interaction?
There is a pretty standard set of data points available for interactions. We normalize those into a set of properties that may be filled in if the learning standard supports them. Here is a list of those properties, with some notes about support where necessary:
- Id: (string) The interaction id assigned by the course. This should be unique for a given question.
- Type: The type of question. Possible values are: TrueFalse, Choice (multiple choice), FillIn, LongFillIn, Likert, Matching, Performance, Sequencing, Numeric, Other. The type of interaction determines the types of answers that are allowed.
- Description: (string) A description of the interaction. This would usually contain the question text, for learning standards that support it (SCORM 2004, Tin Can/xAPI, cmi5).
- CorrectResponses: (List<string>) The list of correct responses to this question. Note that the amount of data here depends on the learning standard -- some only allow single-letter identifiers for answers, and some allow the full answer text.
- LearnerResponse: (string) The learner's actual response to the interaction.
- Result: (string) The result of the interaction -- did the learner answer correctly? Possible values: correct, incorrect, unanticipated, neutral, or a numeric value.
- Timestamp: (string) The timestamp of when the interaction was reported, in the format provided by the activity.
- TimestampUtc: (string) The ISO 8601 timestamp of when the interaction was reported.
- Weighting: (string) The weight this interaction carries relative to the other interactions in the activity. Can be used in scoring to determine how much a question impacts the score.
- Latency: (string) An ISO 8601 duration representing the amount of time it took the learner to make the interaction, i.e. how long it took the learner to answer the question.
- Objectives: (List<string>) A list of learning objective ids that this interaction addresses.
If you want the best chance of getting fully detailed interaction data, SCORM 2004 and cmi5 are the best bets, as they both allow courses to set the description and more detailed responses.
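If it helps to see those properties as a data structure while you build out parsing code, here is a minimal sketch as a Python dataclass. The field names simply mirror the list above; the exact property names and casing come from Engine's registration schema, so treat this as an illustration rather than a contract.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class RuntimeInteraction:
    # Mirrors the normalized interaction properties described above.
    # Any of these may be missing depending on the course and learning standard.
    id: Optional[str] = None                  # interaction id assigned by the course
    type: Optional[str] = None                # TrueFalse, Choice, FillIn, LongFillIn, ...
    description: Optional[str] = None         # usually the question text, where supported
    correct_responses: List[str] = field(default_factory=list)
    learner_response: Optional[str] = None
    result: Optional[str] = None              # correct, incorrect, unanticipated, neutral, or a number
    timestamp: Optional[str] = None           # as provided by the activity
    timestamp_utc: Optional[str] = None       # ISO 8601
    weighting: Optional[str] = None
    latency: Optional[str] = None             # ISO 8601 duration, e.g. 'PT12S'
    objectives: List[str] = field(default_factory=list)
```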
How do I get this data from Engine?
Now that we know it's possible to capture this data from courses, the next step is to get that data from Engine so that you can put it into your own data stores for reporting and analytics purposes.
There are two main ways that this can be done, just like with the other registration data you are already capturing: through a GET request to the API, or as part of the automatic postbacks you may be configured to receive. However, to see this set of data in those responses, there are some additional steps.
In the API, when you make the GetRegistrationProgress request, you will need to include the optional parameters: includeChildResults, includeRuntime, and includeInteractionsAndObjectives. If you set these to 'true', then the response will basically contain all of the data we have collected for the registration.
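As a rough sketch, the request could look like the following. The three query parameters are the ones named above; the base URL, endpoint path, credentials, and registration id are placeholders here, since those depend on your Engine installation and API version.

```python
import requests

# Placeholder values -- adjust to your Engine installation.
ENGINE_API = "https://engine.example.com/api/v2"
AUTH = ("appId", "secretKey")            # Engine API credentials
REGISTRATION_ID = "registration-123"

# Ask Engine to include runtime data (and therefore interactions) in the response.
response = requests.get(
    f"{ENGINE_API}/registrations/{REGISTRATION_ID}",
    auth=AUTH,
    params={
        "includeChildResults": "true",
        "includeRuntime": "true",
        "includeInteractionsAndObjectives": "true",
    },
)
response.raise_for_status()
registration = response.json()
```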
To capture it from the JSON sent to your postback endpoint when registrations are updated, you would need to set ApiRollupRegistrationFormat to 'FULL'. This essentially tells Engine to send you all of the data filled in on the registration schema, just like with the optional parameters above.
The important thing to note is that the interaction data is in the runtime section for an activity. It is not something that is collected for the course as a whole, but instead for each activity. While most courses consist of a single activity (SCO in SCORM terms, AU in AICC or cmi5), there is the possibility that a course you get has multiple activities and that more than one of them has interactions collected. Because of this, any code or process you use to pull interaction data from the registration schema must be a bit more robust.
The following is an example of an abbreviated registration schema from one of our SCORM 2004 golf samples, with some properties removed in order to highlight the location and data related to interactions:
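The shape is what matters here, so the sketch below uses placeholder values rather than the full output from the golf course:

```json
{
  "id": "registration-123",
  "activityDetails": {
    "id": "golf_sample_course",
    "children": [
      {
        "id": "item_1",
        "children": [],
        "runtime": {
          "runtimeInteractions": [
            {
              "id": "Question_1",
              "type": "Choice",
              "description": "Which of these is the correct par for this hole?",
              "correctResponses": ["4"],
              "learnerResponse": "4",
              "result": "correct",
              "timestampUtc": "2023-05-01T14:32:07Z",
              "latency": "PT9S"
            }
          ]
        }
      }
    ]
  }
}
```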
As you can see, in this case we have a single-activity course, and we have to drill in from the top-level activity to the child activity to find the interactions: activityDetails -> children -> runtime -> runtimeInteractions.
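In a simple single-activity case like this one, that drill-in can be done directly (again, assuming the registration JSON from the API example above):

```python
# Single-activity case: the first child activity holds the runtime data.
runtime_interactions = (
    registration["activityDetails"]["children"][0]["runtime"]["runtimeInteractions"]
)
```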
It's important to note that in the registration schema for most learning standards, we always have this hierarchy of a top-level activity and at least one child activity that holds the runtime. For Tin Can (xAPI) courses, though, there is only a single, top-level activity that holds the runtime data.
Because of this, if you are going to support all of the learning standards in Engine, your code should probably do something like the following:
1. Check whether the activity at the root of activityDetails has a non-null runtime property, and if so, check whether it has a runtimeInteractions array. If so, process the items in that array.
2. For each activity in the children array, repeat steps 1 and 2.
The reason we end up repeating these steps is that for some courses, there could be a hierarchy of activities to walk through that is multiple levels deep. That's generally not the case, but since you likely want your code to handle all possibilities, you have to account for it.
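Here is a minimal sketch of that recursive walk in Python, assuming the registration JSON has already been parsed into a dict (as in the API example above) and using the property names shown in the sketch earlier:

```python
def collect_interactions(activity, results=None):
    """Recursively collect (activity id, interaction) pairs from an activity tree."""
    if results is None:
        results = []

    # Step 1: check this activity's own runtime for interactions.
    runtime = activity.get("runtime")
    if runtime:
        for interaction in runtime.get("runtimeInteractions") or []:
            results.append((activity.get("id"), interaction))

    # Step 2: repeat for each child activity, however deep the hierarchy goes.
    # This also handles xAPI courses, where only the top-level activity has runtime data.
    for child in activity.get("children") or []:
        collect_interactions(child, results)

    return results


# 'registration' is the parsed JSON from the API response or postback.
interactions = collect_interactions(registration["activityDetails"])
for activity_id, interaction in interactions:
    print(activity_id, interaction.get("id"), interaction.get("result"))
```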
Some other things to note
There are a few other things to be aware of when working with this data:
- Since multiple activities might have interactions recorded, you will probably want to store/associate the interactions with the activity id ('item_1', in the sample above) so that you can distinguish which interactions go with which activity in reporting.
- Not every property will have a value for all courses or learning standards, so make sure you protect against that possibility when reading the data.
- If a person launches the course multiple times, and takes the quiz multiple times, then you may end up with more than one item in the array for each question (as identified by the interaction 'id' value). The order of the items in the array is based on the order of their arrival, so you can use the index in the array to understand which came first. You can also look at the timestamp value to determine that.
This behavior of recording the same interaction multiple times as the quiz is retaken is called 'journaling'. That's the most common way courses handle retakes, but it is also possible for a course to just update the existing interaction, such that you'd only have a single entry for each question.
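If you only care about the learner's most recent answer to each question, one way to handle a journaled list (building on the collect_interactions sketch above, and relying on arrival order) is to keep the last entry seen per activity id and interaction id:

```python
# Later entries overwrite earlier ones, so 'latest' ends up holding the most
# recent attempt for each (activity id, interaction id) pair. You could compare
# timestampUtc values instead if you prefer not to rely on array order.
latest = {}
for activity_id, interaction in interactions:
    latest[(activity_id, interaction.get("id"))] = interaction

for (activity_id, interaction_id), interaction in latest.items():
    print(activity_id, interaction_id,
          interaction.get("learnerResponse"), interaction.get("result"))
```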
If you've stayed with me this long, then you can see that while working with interactions is generally straightforward, the differing support across learning standards and individual courses means that we can't always depend on a perfect set of data. But hopefully this will get you started towards capturing this valuable information and making sense of it in your application's reporting.
As always, if you have questions about any of this, feel free to reach out to us at support@rusticisoftware.com.