Abstract Architecture

OpenLAP Architecture [1]

The abstract architecture of OpenLAP shows the interaction between its three main components, namely Data Collection and Management, the Indicator Engine, and the Analytics Framework.

The Data Collection and Management component of OpenLAP is responsible for collecting learning activity data from different sources in compliance with OpenLAP's privacy policies, and for generating the learner and context models from that data [1].

The aim of the Indicator Engine in OpenLAP is to achieve personalized and goal-oriented LA by following a Goal - Question - Indicator (GQI) approach that allows users to easily define new indicators through an intuitive and interactive UI. Additionally, it provides an administration panel to manage the analytics modules, analytics methods, and visualization techniques in OpenLAP [1].

The Analytics Framework is responsible for the management, generation, and execution of indicators. It combines the core components of OpenLAP: the OpenLAP-DataSet, the Analytics Modules, the Analytics Methods, the Visualizer, and the Analytics Engine.


User Scenarios

Student Scenario

Amir is a computer science student at ABC University. He is interested in web technologies. He uses the open learning analytics platform to collect data from his learning activities related to this subject on the university LMS, the edX MOOC platform, Khan Academy, his blog, Facebook, YouTube, Slideshare, and various discussion forums.

What Amir likes most about the open learning analytics platform is that it lets him select which learning activities from which application are collected in his profile. For Amir, privacy is a major concern. By default, all logged activity data are available only to him. He has, however, the option to specify which data will be publicly available, to whom, and for how long.

Amir is interested in monitoring his performance across the different platforms. He uses the indicator editor to generate a new indicator which aggregates marks from the university LMS, the peer-review feedback from the edX MOOC platform, and open badges from Khan Academy. He specifies that his marks compared to his peers should be visualized as a line chart, his peer-review feedback in a textual format, and his badges as a list view. The platform then generates the visualization code that Amir can embed in the assessment module of the university LMS. Further, Amir is interested in getting recommendations related to web technologies in the form of lecture slides, videos, online articles, blog posts, and discussion forums. He generates a new indicator that recommends learning resources to him from different sources. He then embeds the generated indicator in the learning materials module of the university LMS. [3]

...

Teacher Scenario

Rima is a lecturer at ABC University, where she uses the university LMS to administer her courses. She uses the personalized dashboard of the open learning analytics platform, which gives her an overview of her courses using various indicators to augment and improve her teaching process. On the dashboard she has various predefined indicators, such as the participation rate of students in lectures, students' involvement rate in discussion forums, the most viewed/downloaded documents, and the progress of her students in assignments.

Recently, Rima came up with the requirement to see which learning materials are discussed most in the discussion forums. She looked through the list of available indicators but did not find one that could fulfill this requirement. She opened the indicator editor, which helped her generate the new indicator and define an appropriate visualization for it. The newly generated indicator is also added to the list of available indicators for future use by other users. [3]

...

Developer Scenario

Hassan is a researcher at ABC University. He developed a mobile application for collaborative annotation of lecture videos. He is interested in using the open learning analytics platform to analyze the social interactions of the application's users. Based on the data model specification and guidelines provided by the open learning analytics platform, he develops a new collector to collect activity data from his mobile application and send it to the platform. Further, he uses the indicator editor to define a new indicator that should apply the Gephi social network analysis method to the collected data. Unfortunately, this method is not yet available in the platform. Therefore, he uses the platform API to register Gephi as a new analysis method. Hassan then goes back to the indicator editor and selects the newly registered analysis method to be applied in his indicator. [3]
...

System Scenarios

Indicator Generation Process

Indicator Generation

The indicator generation process is realized by the Indicator Engine component of OpenLAP to let users dynamically define new indicators. During this process, the user interacts with the "Indicator Editor" UI of the Indicator Engine to specify the LA goal, such as monitoring, prediction, assessment, or reflection. Afterwards, the user formulates the LA question and associates multiple indicators with it to answer this question. To generate a new indicator, the user performs the following steps:

  • Explore the learning activities data stored in the Learning Context Data Model (LCDM) format.
  • Select an appropriate dataset for the indicator.
  • Apply different filters.
  • Choose the analytics method to be applied.
  • Map the columns of the dataset to the inputs of the analytics method.
  • Choose the visualization technique to be used.
  • Map the outputs of the analytics method to the inputs of the visualization technique.
  • Preview the visualization of the indicator.
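The steps above can be sketched as an indicator specification that is built up incrementally. This is an illustrative sketch only: the class, field, and example values below are hypothetical, not OpenLAP's actual API or data.

```python
# Illustrative sketch of the indicator generation steps above.
# IndicatorSpec and all field/example names are hypothetical, not OpenLAP's API.
from dataclasses import dataclass, field

@dataclass
class IndicatorSpec:
    dataset: str = ""                                    # selected LCDM-format dataset
    filters: list = field(default_factory=list)          # applied filters
    method: str = ""                                     # chosen analytics method
    method_mapping: dict = field(default_factory=dict)   # dataset column -> method input
    visualization: str = ""                              # chosen visualization technique
    vis_mapping: dict = field(default_factory=dict)      # method output -> visualization input

def build_indicator() -> IndicatorSpec:
    spec = IndicatorSpec()
    spec.dataset = "forum-activities"                    # steps 1-2: explore and select a dataset
    spec.filters.append({"column": "course", "op": "=", "value": "web-tech"})  # step 3: filter
    spec.method = "count-per-item"                       # step 4: analytics method
    spec.method_mapping = {"post_author": "item"}        # step 5: columns -> method inputs
    spec.visualization = "bar-chart"                     # step 6: visualization technique
    spec.vis_mapping = {"item": "x", "count": "y"}       # step 7: method outputs -> vis inputs
    return spec                                          # step 8: the preview renders this spec
```

In OpenLAP itself these choices are made interactively through the Indicator Editor UI; the sketch only shows what information each step contributes to the final indicator.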

During the indicator generation process, the Indicator Engine remains in continuous communication with the "Analytics Engine" component of the Analytics Framework to get all possible dataset parameters, the list of applicable analytics methods, and the visualization techniques that can be applied, to validate the specified mappings, and to generate the indicator preview. After the indicator is finalized, the "Analytics Engine" saves it to the database as a triad (indicator query ID - method ID - visualizer ID) and generates HTML- and JavaScript-based indicator request code for each indicator, which the user can embed in any client application (e.g. a web page, dashboard, or LMS) to visualize the indicator with current data [1].
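The persisted triad and the embeddable request code could be sketched roughly as follows. The field names, URL, and request-code format here are assumptions for illustration; [1] only specifies that a triad of IDs is stored and that HTML/JavaScript request code is generated.

```python
# Hypothetical sketch of the triad persisted by the Analytics Engine.
# Field names, the base URL, and the snippet format are illustrative only.
from dataclasses import dataclass

@dataclass(frozen=True)
class Triad:
    indicator_query_id: int
    method_id: int
    visualizer_id: int

def request_code(triad: Triad, base_url: str) -> str:
    """Return an HTML/JavaScript-style snippet a user could embed in a client page."""
    return (f'<script src="{base_url}/indicator'
            f'?query={triad.indicator_query_id}'
            f'&method={triad.method_id}'
            f'&vis={triad.visualizer_id}"></script>')
```

When the client page loads such a snippet, the referenced endpoint would return the visualization code for that indicator, which is what the execution process below describes.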

...
Indicator Execution Process

Indicator Execution

The indicator execution process is handled by the Analytics Framework component of OpenLAP. The indicator request code embedded in the client application requests the Analytics Framework to visualize the specific indicator. The "Analytics Engine" intercepts the request and performs the following steps:

  • Check whether the request is valid.
  • Communicate with the respective Analytics Module to get the triad.
  • Get the query related to the requested indicator from the database.
  • Execute the query and get the raw data.
  • Transform the raw data into the OpenLAP-DataSet.
  • Send the OpenLAP-DataSet and the indicator query - method mapping to the analytics method referenced in the triad for analysis.
  • Receive the analyzed data as an OpenLAP-DataSet.
  • Send the OpenLAP-DataSet and the method - visualization mapping to the visualization technique referenced in the triad.
  • Receive the indicator visualization code and forward it to the requesting client application to visualize the indicator.
[1]
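The data flow of these steps can be sketched end to end with toy stand-ins for the components involved. Everything below (the dataset wrapper, the counting method, the list visualizer) is a simplified assumption for illustration, not OpenLAP's real OpenLAP-DataSet, analytics method, or visualizer API.

```python
# Illustrative end-to-end sketch of the indicator execution steps above.
# All classes are toy in-memory stand-ins for OpenLAP components.

def to_openlap_dataset(rows):
    # Step 5: wrap raw rows in a minimal OpenLAP-DataSet-like structure.
    return {"columns": list(rows[0].keys()) if rows else [], "rows": rows}

class CountMethod:
    """Toy analytics method: counts occurrences of the mapped 'item' column."""
    def analyze(self, dataset, mapping):
        col = mapping["item"]                      # indicator query -> method mapping
        counts = {}
        for row in dataset["rows"]:
            counts[row[col]] = counts.get(row[col], 0) + 1
        return to_openlap_dataset(
            [{"item": k, "count": v} for k, v in counts.items()]
        )

class ListVisualizer:
    """Toy visualization technique: renders the analyzed data as an HTML list."""
    def render(self, dataset, mapping):
        items = "".join(f"<li>{r['item']}: {r['count']}</li>" for r in dataset["rows"])
        return f"<ul>{items}</ul>"

def execute_indicator(raw_rows, method, visualizer, query_method_mapping, method_vis_mapping):
    if not raw_rows:                               # step 1: validate the request/data
        raise ValueError("invalid indicator request")
    dataset = to_openlap_dataset(raw_rows)         # step 5: transform to OpenLAP-DataSet
    analyzed = method.analyze(dataset, query_method_mapping)     # steps 6-7: analyze
    return visualizer.render(analyzed, method_vis_mapping)       # steps 8-9: vis code
```

In OpenLAP the triad lookup and query execution (steps 2-4) happen against the database before this point; the sketch starts from the raw query result and shows how the two mappings carry data from method to visualizer.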
...

System Workflow