Continuing our xAPI Governance blog series, the second aspect of Governance—after creating rules and processes—is documentation. So, why is keeping records of your captured xAPI data important? In this post, we’ll explain what you need to know when it comes to documentation and share steps to help get you started.
Why should I document xAPI data?
Creating documentation before implementation not only provides guidance to ensure future implementations follow a similar pattern, but also establishes a baseline for verifying that what you actually implemented matches what you planned to implement. It also gives you another point at which to test your design against your rules and processes before you spend time writing code and implementing xAPI.
What should I document?
For each application sending xAPI data to your learning record store (LRS), create documentation that includes the following:
- A list of the events you’re tracking and the captured data relating to those events
- A list of identifiers you will use including activity IDs, verb IDs, activity types, and extensions
- Explanations of the data structures for any extensions
- Example xAPI statements for each tracked event
You also may want to keep a record of the reasons behind statement design decisions—including why data is being captured in the first place and its intended use.
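For instance, a documented example statement for a "completed a course" event might look like the sketch below. The actor, activity ID, and course name are hypothetical placeholders; the verb and activity type IRIs shown are the standard ADL identifiers.

```json
{
  "actor": {
    "objectType": "Agent",
    "name": "Example Learner",
    "mbox": "mailto:learner@example.com"
  },
  "verb": {
    "id": "http://adlnet.gov/expapi/verbs/completed",
    "display": { "en-US": "completed" }
  },
  "object": {
    "objectType": "Activity",
    "id": "https://example.com/courses/xapi-101",
    "definition": {
      "name": { "en-US": "xAPI 101" },
      "type": "http://adlnet.gov/expapi/activities/course"
    }
  },
  "result": {
    "completion": true
  }
}
```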
As a worked example, suppose you're designing statements to represent completing a video. Before creating your own structure, look at how others have already documented this event. In this case, there are three valid paths you could follow, each with advantages and disadvantages.
The video recipe on The Registry uses this to represent reaching the end of the video:
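For illustration, a statement along those lines might look like the following sketch. The actor and activity details are placeholders, and you should check the Registry recipe itself for the exact identifiers; the ADL "completed" verb and the Activity Streams "video" activity type shown here are common choices.

```json
{
  "actor": {
    "name": "Example Learner",
    "mbox": "mailto:learner@example.com"
  },
  "verb": {
    "id": "http://adlnet.gov/expapi/verbs/completed",
    "display": { "en-US": "completed" }
  },
  "object": {
    "id": "https://example.com/videos/intro",
    "definition": {
      "name": { "en-US": "Intro Video" },
      "type": "http://activitystrea.ms/schema/1.0/video"
    }
  }
}
```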
The draft video profile uses this to represent watching all parts of the video at least once:
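A sketch of the profile's approach might look like the following, with `result.completion` set to `true` once every part of the video has been watched. The actor, activity ID, and duration are placeholders; consult the draft profile itself for the full set of required context and result extensions.

```json
{
  "actor": {
    "name": "Example Learner",
    "mbox": "mailto:learner@example.com"
  },
  "verb": {
    "id": "http://adlnet.gov/expapi/verbs/completed",
    "display": { "en-US": "completed" }
  },
  "object": {
    "id": "https://example.com/videos/intro",
    "definition": {
      "name": { "en-US": "Intro Video" },
      "type": "https://w3id.org/xapi/video/activity-type/video"
    }
  },
  "result": {
    "completion": true,
    "duration": "PT4M30S"
  }
}
```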
Kaltura (a popular video platform) uses the following with a progress extension value of “100” to represent reaching the end of the video:
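For illustration only, an extension-based statement of that shape might look like this sketch. Every IRI here is an example.com placeholder rather than Kaltura's actual identifiers; consult Kaltura's xAPI documentation for the real verb and extension IDs.

```json
{
  "actor": {
    "name": "Example Learner",
    "mbox": "mailto:learner@example.com"
  },
  "verb": {
    "id": "https://example.com/verbs/progressed",
    "display": { "en-US": "progressed" }
  },
  "object": {
    "id": "https://example.com/videos/intro",
    "definition": {
      "name": { "en-US": "Intro Video" }
    }
  },
  "result": {
    "extensions": {
      "https://example.com/extensions/progress": "100"
    }
  }
}
```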
You can use any of these approaches, but—whichever approach you choose—document that you’ve made that particular choice along with the reasons for doing so.
What about profiles?
So far, we’ve talked about defining rules and processes and documenting xAPI statements within an organization. Now, we’ll cover profiles, which are rules and documented statement structures created by a community of practice—including stakeholders from multiple organizations, collaborating on a common approach. In an ideal world, every xAPI implementation would follow publicly created profiles to ensure data consistency across organizations. But, putting on a pragmatist hat for a moment, the world is not ideal.
To quote “The LEGO® Movie 2”: “Everything’s not awesome.”
In practice, several challenges can make it impractical to create or follow a public xAPI profile:
Creating a profile is time consuming. It often takes years for a community of practice to form, agree, and publish a profile. This process is unlikely to be compatible with your xAPI project deadlines.
Profiles often fail to account for existing implementations. Too often, relevant product vendors are not part of the group that designs a profile, which leads to profiles that vary from established practice. As a result, new implementers are forced to choose between compatibility with existing, in-use tools or with a theoretical profile. It also can lead to a profile that’s impractical to implement, perhaps requiring the inclusion of data that’s prohibitively difficult to capture—sometimes for no practical benefit other than meeting the requirements of the profile.
Profiles may fail to account for functionality of existing Learning Analytics Platforms. Occasionally, profile authors will create odd data structures that are unlikely to be supported by existing tools. Implementers then have to choose between xAPI data that works with their LAPs or that which follows the profile.
But, to continue the “LEGO Movie 2” quote: “Everything’s not awesome […] but that doesn’t mean we shouldn’t try.”
When defining our own rules and designing our statements, we should definitely be aware of relevant profiles and consider the benefits of implementing them. We should also aim to be involved in profile communities of practice to help make better profiles that are compatible with existing practice.
What about off-the-shelf tools?
Oftentimes, when you’re implementing xAPI, you’re actually just buying an authoring tool, platform, app, or other product that has built-in xAPI support. In these cases, you rarely have much control over the data these products send, aside from a few configuration options. This is normally a good thing because it ensures data consistency for all product users.
It’s still important to document the data, however, so you have a record of:
- what the xAPI statements are supposed to look like (for comparison in case they change), and
- how the data is structured, for reference when configuring reports.
A vendor’s documentation can provide a good starting point for creating your own documentation, but you should also do your own testing to ensure your documentation matches the reality.
You also may be able to influence the vendor to ensure well-designed xAPI data. For example, some companies require vendors to become Certified Data Sources as part of their procurement process.
When you use multiple products, some data incompatibility, such as different verbs for the same action, is inevitable. (Don’t worry, though. We’ll cover this in an upcoming blog post and explain what you can do when it happens.)
Up Next: Test, Monitor & Enforce (Part 4)
Rules, processes, and documentation are only useful if they are followed. Next, we’ll look at how to test, monitor, and enforce your xAPI Governance strategy.
Keep your data clean.
A lack of good xAPI governance can lead to challenges, confusions, and inconveniences when reporting on data. Download this free checklist to help ensure consistent xAPI data across your organization.
NOTE: The LEGO Movie 2 is used here only to illustrate the examples in this blog post. Watershed is not associated with, sponsored by, or affiliated with Warner Bros, LEGO System A/S, or The LEGO Group.