The abstract architecture of OpenLAP shows the interaction between the three main components in OpenLAP, namely Data Collection and Management, Indicator Engine, and Analytics Framework.
The Data Collection and Management component in OpenLAP is responsible for collecting learning activity data from different sources, adhering to the privacy policies of OpenLAP, and generating the learner and context models from it.
The aim of the Indicator Engine in OpenLAP is to achieve personalized and goal-oriented LA by following a Goal-Question-Indicator (GQI) approach that allows users to easily define new indicators through an intuitive, interactive UI. Additionally, it provides an administration panel to manage the analytics modules, analytics methods, and visualization techniques in OpenLAP.
The Analytics Framework is responsible for the management, generation, and execution of indicators. It combines the core components of OpenLAP: the OpenLAP-DataSet, the Analytics Modules, the Analytics Methods, the Visualizer, and the Analytics Engine.
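To make the composition concrete, the following is a rough Python sketch of how the Analytics Framework's core components relate to one another. All class and attribute names here are illustrative assumptions for explanation only; OpenLAP itself is not a Python library and does not expose this API.

```python
class OpenLAPDataSet:
    """Illustrative stand-in for the common data-exchange format that the
    components pass between each other (a set of named columns)."""
    def __init__(self, columns):
        self.columns = columns  # e.g. {"platform": [...], "verb": [...]}

class AnalyticsEngine:
    """Illustrative stand-in for the coordinating component: it wires the
    Analytics Modules, Analytics Methods, and the Visualizer together
    around the OpenLAP-DataSet."""
    def __init__(self, modules, methods, visualizer):
        self.modules = modules        # Analytics Modules: indicator catalogs
        self.methods = methods        # Analytics Methods: analysis routines
        self.visualizer = visualizer  # Visualizer: visualization techniques
```

The point of the sketch is only the dependency direction: the engine mediates between the other components, and the dataset is the shared currency they exchange.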
Student Scenario
Amir is a computer science student at ABC University. He is interested in web technologies. He uses the open learning analytics platform to collect data from his learning activities related to this subject on the university LMS, the edX MOOC platform, Khan Academy, his blog, Facebook, YouTube, Slideshare, and various
What Amir likes most about the open learning analytics platform is that it lets him select which learning activities from which application are collected in his profile. Privacy is one of Amir's main concerns. By default, all logged activity data are available only to him. He has, however, the option to specify which data will be publicly available, to whom, and for how long.
Amir is interested in monitoring his performance across the different platforms. He uses the indicator editor to generate a new indicator that aggregates marks from the university LMS, peer-review feedback from the edX MOOC platform, and open badges from Khan Academy. He specifies that his marks should be visualized against his peers' as a line chart, his peer-review feedback in textual format, and his badges as a list view. The platform then generates the visualization code that Amir can embed in the assessment module of the university LMS. Further, Amir is interested in getting recommendations related to web technologies in the form of lecture slides, videos, online articles, blog posts, and discussion forums. He generates a new indicator that recommends learning resources to him from different sources. He then embeds the generated indicator in the learning materials module of the university LMS.
Teacher Scenario
Rima is a lecturer at ABC University, where she uses the university LMS to administer her courses. She uses the personalized dashboard of the open learning analytics platform, which gives her an overview of her courses through various indicators to augment and improve her teaching process. On the dashboard she has various
Recently, Rima wanted to see which learning materials are discussed most in the discussion forums. She looked through the list of available indicators but did not find one that fulfills this requirement. She opened the indicator editor, which helped her generate the new indicator and define an appropriate visualization for it. The newly generated indicator is also added to the list of available indicators for future use by other users.
Developer Scenario
Hassan is a researcher at ABC University. He developed a mobile application for collaborative annotation of lecture videos. He is interested in using the open learning analytics platform to analyze the social interactions of the application’s users. Based on the data model specification and guidelines provided by the open learning
The indicator generation process is realized by the Indicator Engine component of OpenLAP, which lets users dynamically define new indicators. During this process, the user interacts with the "Indicator Editor" UI of the Indicator Engine to specify the LA goal, such as monitoring, prediction, assessment, or reflection. Afterwards, the user formulates the LA question and associates multiple indicators with it to answer this question. To generate a new indicator, the user performs the following steps:
- Explore the learning activities data stored in the Learning Context Data Model (LCDM) format.
- Select an appropriate dataset for the indicator.
- Apply different filters.
- Choose the analytics method to be applied.
- Map the columns of the dataset to the inputs of the analytics method.
- Choose the visualization technique to be used.
- Map the outputs of the analytics method to the inputs of the visualization technique.
- Preview the visualization of the indicator.
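The steps above can be sketched as a small data pipeline: filter a selected dataset, map its columns to the inputs of an analytics method, map the method's outputs to the inputs of a visualization technique, and preview the result. The following Python sketch is a minimal illustration under assumed names and data shapes; none of these functions belong to OpenLAP's actual codebase.

```python
def apply_filters(rows, predicates):
    """Steps 1-3: explore/select the dataset and keep only rows
    that satisfy every filter predicate."""
    return [r for r in rows if all(p(r) for p in predicates)]

def run_method(rows, column_mapping, method):
    """Steps 4-6: project dataset columns onto the analytics method's
    inputs (column_mapping: method input -> dataset column) and run it."""
    projected = [{inp: r[col] for inp, col in column_mapping.items()}
                 for r in rows]
    return method(projected)

def render(result, output_mapping, visualization):
    """Steps 7-8: map method outputs to visualization inputs and preview."""
    inputs = {vis_in: result[out] for out, vis_in in output_mapping.items()}
    return visualization(inputs)

# Hypothetical example: count Amir's activities per platform and
# "visualize" the counts as a plain-text list.
activities = [
    {"user": "amir", "platform": "LMS", "verb": "posted"},
    {"user": "amir", "platform": "edX", "verb": "watched"},
    {"user": "amir", "platform": "LMS", "verb": "viewed"},
]

def count_per_source(rows):
    counts = {}
    for r in rows:
        counts[r["source"]] = counts.get(r["source"], 0) + 1
    return {"counts": counts}

filtered = apply_filters(activities, [lambda r: r["user"] == "amir"])
result = run_method(filtered, {"source": "platform"}, count_per_source)
preview = render(result, {"counts": "items"},
                 lambda i: ", ".join(f"{k}: {v}" for k, v in i["items"].items()))
# preview == "LMS: 2, edX: 1"
```

The two explicit mappings (dataset column to method input, method output to visualization input) mirror steps 6 and 8, which is what lets OpenLAP combine arbitrary methods with arbitrary visualization techniques.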
The indicator execution process is handled by the Analytics Framework component of OpenLAP. The indicator request code embedded in the client application requests the Analytics Framework to visualize the specific indicator. The "Analytics Engine" intercepts the request and performs the following steps:
- Validate the request.
- Communicate with the respective Analytics Module to get the triad.
- Get the query related to the requested indicator from the database.
- Execute the query and get the raw data.
- Transform the raw data into the OpenLAP-DataSet.
- Send the OpenLAP-DataSet and the indicator query-to-method mapping to the analytics method referenced in the triad for analysis.
- Receive the analyzed data as an OpenLAP-DataSet.
- Send the analyzed OpenLAP-DataSet and the method-to-visualization mapping to the visualization technique referenced in the triad.
- Receive the indicator visualization code and forward it to the requesting client application to visualize the indicator.
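The execution flow above can be summarized as a single dispatch function. The sketch below is a hypothetical Python rendering of the Analytics Engine's steps; the function names, triad fields, and signatures are assumptions for illustration and do not reflect OpenLAP's actual interfaces.

```python
def execute_indicator(request, get_triad, get_query, run_query,
                      to_dataset, methods, visualizations):
    """Illustrative walk-through of the Analytics Engine's execution steps."""
    # 1. Validate the request.
    if "indicator_id" not in request:
        raise ValueError("invalid indicator request")
    # 2. Ask the responsible Analytics Module for the triad
    #    (query, analytics method, visualization technique).
    triad = get_triad(request["indicator_id"])
    # 3-4. Fetch the indicator's query and execute it to obtain raw data.
    raw = run_query(get_query(triad["query_id"]))
    # 5. Transform the raw data into an OpenLAP-DataSet.
    dataset = to_dataset(raw)
    # 6-7. Apply the analytics method using the query-to-method mapping,
    #      receiving the analyzed data back as a dataset.
    analyzed = methods[triad["method"]](dataset, triad["query_method_mapping"])
    # 8-9. Apply the visualization technique using the method-to-visualization
    #      mapping, and return the visualization code to the client.
    return visualizations[triad["visualization"]](analyzed,
                                                 triad["method_vis_mapping"])
```

A client application embedding the indicator request code would ultimately receive the return value of step 9, i.e., the client-side visualization code for the indicator.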