Usa-DSL Framework
The Usa-DSL Process is an iterative usability evaluation process developed based on the concepts of the Usa-DSL Framework.
Main Description

Usa-DSL Structure 

The Usa-DSL Framework structure is based on the project life cycle process [Stone:2005], which is composed of phases, steps and activities.

Basically, the Usa-DSL Framework is organized in phases, within which a set of steps has to be taken. For each step in a phase, there is either one activity to be executed or none.

Notice that some steps, in certain phases, have no activities; e.g. step “2 - Ethical and Legal Responsibilities” has no activity in the Analysis phase, while the same step in the Execution phase has the activity “E2 - Introduce the Form and Collect Signatures of Subjects”.

There are four Phases in the Usa-DSL Framework (the PEAR phases; see Usa-DSL Lifecycle):

  • Planning
  • Execution
  • Analysis
  • Reporting

Each Phase can be split into the following set of Steps:

  1. Evaluators Profiles,
  2. Ethical and Legal Responsibilities,
  3. Data Type, 
  4. Empirical Study Method (SE),
  5. Evaluation Method (HCI),
  6. Metrics,
  7. Gathering Instruments,
  8. Evaluation Instructions,
  9. Evaluation Conduction,
  10. Data Packaging and
  11. Evaluation Reporting.

It is important to notice that, for each step, the PEAR phases have to be executed in that order.

Finally, there are thirty-two (32) Activities distributed across the Phases and Steps.

The Usa-DSL Framework structure was designed to be adaptable to the needs of each evaluation. It is possible to begin the “Planning” Phase from any of the steps present in the framework. For example, the evaluator can start the evaluation planning with the “P1 - Define Evaluators Profiles” Activity, or with the “P3 - Define Data Type” Activity.

This improves the framework’s flexibility, since it allows different evaluators to start the evaluation with the activities they feel more comfortable with, the ones for which they already have some data, or even the activities that are easier to perform for a specific DSL.

Besides, if the evaluator wants to perform a single step across the PEAR Phases, that is also possible: for example, it is possible to execute all activities from step “1 - Evaluators Profiles” in all PEAR phases before starting activities in any other step. Furthermore, not all steps have to be performed; for example, the “4 - Empirical Study Method (SE)” step is only needed if end users will be involved.
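
To make this structure concrete, the sketch below (in Python) models the phase/step/activity matrix as a sparse mapping. This is an illustration of ours, not an artifact of the framework: the phase and step names come from the lists above, only a few of the 32 activities are listed, and the helper function is hypothetical.

    # Hypothetical sketch of the Usa-DSL phase/step/activity matrix.
    # The sparse dict encodes the fact that some (phase, step) cells
    # have no activity.
    PHASES = ["Planning", "Execution", "Analysis", "Reporting"]  # PEAR order

    STEPS = {
        1: "Evaluators Profiles",
        2: "Ethical and Legal Responsibilities",
        3: "Data Type",
        # ... steps 4-11 omitted for brevity
    }

    # Only a few of the 32 activities are listed, for illustration.
    ACTIVITIES = {
        ("Planning", 2): "P2 - Define Informed Consent Term",
        ("Execution", 2): "E2 - Introduce the Form and Collect Signatures of Subjects",
        # ("Analysis", 2) is absent: step 2 has no activity in that phase.
    }

    def activities_for_step(step):
        """Walk one step through the PEAR phases, in order, skipping empty cells."""
        for phase in PHASES:
            activity = ACTIVITIES.get((phase, step))
            if activity is not None:
                yield activity

    # Executing all activities of step 2 across the PEAR phases:
    print(list(activities_for_step(2)))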

Usa-DSL Phases 

As mentioned before, the Usa-DSL Framework contains the PEAR Phases. Each phase has a set of activities, each related to a respective Step.

Phase 1 - Planning: in this phase, the evaluator plans the aspects that will guide the DSL evaluation. Documents must be defined and created, and decisions must be made about, for example, the data to be collected and the kind of user who will take part in the evaluation. To summarize, this is the phase in which the structure and planning of the evaluation are constructed.

Phase 2 - Execution: in this phase, the previously created documents are used, subjects are recruited, environments are set up and the evaluation is performed, following the protocol already defined.

Phase 3 - Analysis: this phase accomplishes the analysis of the artifacts created in the Planning and Execution phases. In the Planning phase, analysis is performed so that the documents can be adapted and, therefore, decisions about the evaluation execution can be made; in this phase, the analysis focuses on the collected data and on the tasks created.

Phase 4 - Reporting: in this phase, the evaluator registers the protocol used, the artifacts created and the data analyzed.


Usa-DSL Steps 

The Usa-DSL framework is composed of eleven (11) steps, which are described next (see Figure 1).

Step 1 - Evaluators Profiles: in this step the evaluator profile is defined, instruments to identify the evaluator are applied, the evaluator profile is analyzed, and a report on it is written [Albuquerque:2015, Barisic:2014, Cuenca:2015, Ewais:2014, Gibbs:2015].

Step 2 - Ethical and Legal Responsibilities: similarly to the DECIDE framework, which is an evaluation guide [Preece:2015], Usa-DSL follows best practices on ethical and legal issues to protect the users’ data, dignity, anonymity and well-being. Furthermore, the consent documents must inform users that they can stop the evaluation at any time if they are not comfortable with any aspect of the evaluation process. At the end of this step, all the documents signed by the subjects are organized.

Step 3 - Data Type: in this step the type of data that will be used is defined, i.e., the evaluator defines whether the collected data will be quantitative, qualitative or both. This depends on the method that will be used; for example, usability testing uses quantitative data, while user observation can use qualitative data. This step contains only one activity, which is performed during the Planning phase.

Step 4 - Empirical Study Method (SE): the Empirical Study Method suggested for Usa-DSL is based on the proposal of Wohlin et al. [Wohlin:2012], and can be a survey, a case study or a controlled experiment. The method can be chosen based on, for example, the evaluator’s profile (Step 1) or the data that will be collected (Step 3). The Empirical Study Method can be combined with other evaluation methods, e.g. usability testing or heuristic evaluation. However, the restrictions and characteristics of every method must always be respected.

Step 5 - Evaluation Method (HCI): the evaluation methods defined in Usa-DSL can be, for example, user observation evaluation, usability testing, inspection evaluation, or heuristic evaluation. User observation evaluation must be applied when the intention of the study is to obtain the end users’ opinion about the usability aspects of the DSL. Inspection evaluation aims to verify the relevance of the language at the usability specialist level.

Step 6 - Metrics: the metrics used in Usa-DSL were defined from a systematic literature mapping [Rodrigues:2017]. They are comprehension/learning, ease of use, effort/conclusion time, observed complexity and efficiency. These metrics guide the definition of the questions of the evaluation instruments to be applied during the evaluation. Similarly to Step 3, this step has only one activity, which is performed during the Planning phase.

Step 7 - Gathering Instruments: the instruments were based on the studies of [Preece:2015] and [Rouly:2014], e.g. heuristic checklist, ergonomic checklist, questionnaires, interviews, user observation or user action recording.

Step 8 - Evaluation Instructions: according to Wohlin et al. [Wohlin:2012], the evaluation instructions can be composed of a use manual, instruments or tasks to be performed. These instruments must be distributed and used when executing an empirical method. They are used, for example, to clarify to the participants what will be evaluated and when the evaluation will take place.

Step 9 - Evaluation Conduction: this is the step in which the aspects defined in the previous steps are applied. Therefore, the previous steps must have been executed and thoroughly tested before the evaluation participants are involved. Hence, a pilot test must be executed prior to applying the evaluation to the actual participants, to guarantee that the evaluation is viable. Furthermore, it is also important to guarantee that the needed number of participants will be reached; otherwise, if a quantitative evaluation is being performed, the results may not be statistically relevant.
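
As a rough illustration of this concern (not something the framework itself prescribes), a power analysis can estimate, before recruiting, how many participants a quantitative comparison needs. The Python sketch below assumes a two-group design with conventional settings and uses the statsmodels library.

    # Hypothetical sample-size check for a two-group quantitative
    # comparison, using a standard power analysis.
    from math import ceil

    from statsmodels.stats.power import TTestIndPower

    n_per_group = TTestIndPower().solve_power(
        effect_size=0.8,  # assumed large effect size (Cohen's d); adjust per study
        alpha=0.05,       # conventional significance level
        power=0.8,        # conventional statistical power
    )
    print(f"participants needed per group: {ceil(n_per_group)}")  # 26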

Step 10 - Data Packaging: when the evaluation is finished, the material used for training and the collected data should be stored in a safe place with easy access, in order to allow replication of the study when necessary. This allows future evaluations of the language and their comparison with newly collected data.

Step 11 - Evaluation Reporting: this report must follow the evaluation method chosen in step “5 - Evaluation Method (HCI)”. Each evaluation method provides a specific report with different fields that must be filled in.



Usa-DSL Activities 

The Usa-DSL framework activities are composed of a set of actions used to plan, execute, analyze and report the evaluation. These activities define the whole evaluation protocol and are therefore described thoroughly next. It is worth mentioning that the identification of each of the 32 activities is composed of an ID and a name. The ID is composed of a letter and a number: the letter represents a phase and the number a step, e.g. “E5 - Prepare the Evaluation” is an activity that belongs to the Execution phase and is associated with the “5 - Evaluation Method (HCI)” step.
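
As an illustration of this naming scheme, the hypothetical Python helper below decodes an activity ID into the phase and step it refers to; it is not part of the framework itself.

    # Hypothetical helper that decodes a Usa-DSL activity ID such as
    # "E5" into the phase (letter) and step (number) it refers to.
    PHASE_BY_LETTER = {
        "P": "Planning",
        "E": "Execution",
        "A": "Analysis",
        "R": "Reporting",
    }

    def decode_activity_id(activity_id):
        """Split an ID like 'E5' into ('Execution', 5)."""
        letter, number = activity_id[0], int(activity_id[1:])
        if letter not in PHASE_BY_LETTER or not 1 <= number <= 11:
            raise ValueError(f"not a valid Usa-DSL activity ID: {activity_id!r}")
        return PHASE_BY_LETTER[letter], number

    print(decode_activity_id("E5"))   # ('Execution', 5)
    print(decode_activity_id("P10"))  # ('Planning', 10)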


Planning Phase Activities

P1 - Define Evaluators Profiles: the goal of this activity is to define the evaluators’ profiles, which are related to the evaluation method that will be used. The evaluation can be performed by, for example, an HCI expert, a domain analyst, a domain developer or a domain tester.

P2 - Define Informed Consent Term: this is a formal document that describes the evaluation goal, how the evaluation will take place, how the data will be collected, how the data will be protected, and so on. Usually, the use of ethical codes from organizations such as the Association for Computing Machinery (ACM) is recommended (https://www.acm.org/about-acm/acm-code-of-ethics-and-professional-conduct).

P3 - Define Data Type: the data collected in the evaluation can be quantitative and/or qualitative. Quantitative data are numeric results that indicate the number of answers attributed to a given item of a question. Qualitative data consist of subjective information related to the participants’ opinion about the studied object. The chosen data types determine what kind of information the evaluator intends to obtain. Albuquerque et al. suggest the use of both data types, in order to obtain a wider and more complete view of the participants’ opinions. Barisic et al., on the other hand, use quantitative data and consider that to be sufficient for the goal of their research.

P4 - Define Empirical Study Method: there are different empirical evaluation methods that can be used to evaluate usability. These methods have to involve users during data collection. This activity is closely related to activity P2. Examples of empirical methods are: Controlled Experiment, Survey or Case Study. 

P5 - Define Evaluation Usability Type: as mentioned in the description of step “5 - Evaluation Method (HCI)”, the evaluation can be performed by end users or by HCI or DSL experts. This activity is related to activities P1, P3 and P4.

P6 - Define Metrics for Language Validation: the metrics depend on the evaluation goal and on the usability criteria one wants to evaluate. Examples of criteria that may be evaluated are: ease of learning, ease of remembering, ease of use, effort/conclusion time, perceived complexity, utility, satisfaction, conclusion rate, task error rate, efficiency or effectiveness.

P7 - Define the Instruments of Data Gathering: instruments that can be used to collect data include a heuristic checklist, log capture, user observation, interviews or questionnaires.

P8 - Define the Instruments of Instruction and Training: the Usa-DSL framework uses the following instruments: DSL guide, user scenario and language presentation. This activity also defines the tasks that will be executed by the user when an empirical method is chosen. In that case, this activity is closely related to P3, P4 and P5.

P9 - Define Execution Place: the place where the evaluation will take place depends on the data type that will be collected, the empirical study method that was chosen, or even the usability type. For example, the evaluation could take place in a laboratory, via e-mail or through the web, or even at the users’ workplace.

P10 - Define Data Storage: data packaging is an important activity, since the data might be used later to replicate the evaluation.

P11 - Define Study Reporting: this activity is responsible for describing the way the results of the evaluation will be registered. 


Execution Phase Activities 

E1 - Apply Instruments to Identify Profiles: a questionnaire that characterizes the profile of the evaluation participants is applied. This document is used to obtain information such as DSL/domain experience time, training, and area of activity.

E2 - Introduce the Form and Collect Signatures of Subjects: in this activity, the consent form must be presented to the participants; after they have read and consented to it, it must be signed and a copy given to the researcher conducting the evaluation. The consent form provides the subjects with sufficient written information to decide whether or not to participate in the study.

E4 - Develop and Conduct Protocol: this activity consists of developing the evaluation protocol, describing all the steps and documentation that will be used, such as the type of evaluation, the experimental study, the context, hypotheses, study variables, participant profiles, instruments, data type, data storage and how the study will be reported. This protocol must be carried out by the researcher, carefully following the planned steps and activities.

E5 - Prepare the Evaluation: the evaluation instruments should be organized, the equipment arranged in the rooms, the participants available at the scheduled date and time, and the questionnaires answered.

E7 - Data Collection: the data obtained by applying the characterization questionnaires and the gathering instruments, as well as the data recorded in audio or video, must be compiled and stored for later tabulation, transcription and analysis.

E8 - Introduce Instruments of Instruction and Conduct Training: the presentation and training are intended to explain the functioning of the language, regarding its syntax and semantics, as well as to instruct on the usage scenario, that is, to explain the task to be performed in the evaluation process. The delivery of the language manual and usage scenario refers to the delivery of the printed or online documents. These documents describe the functioning of the language, its syntax and semantics, and contain the description of the usage scenario, which must be expressed as a requirement or task.

E9 - Execution of Tasks and Evaluation Conduction: the task must be modeled according to the usage scenario delivered to the participants and must be performed using the tool that supports the execution of the language. Upon completion of the task modeling, the researcher may conduct an interview with the participants and thereby obtain their opinion about the language being evaluated. Instead of the interview, the researcher can choose to use only a questionnaire, filled in by the participants after completing their tasks.

E10 - Store Data Obtained: after performing the evaluation, the collected data should be stored in a database or another location so that they can be compiled later. If the data are quantitative, it is important to tabulate them so that their behavior can be observed later and conclusions drawn. If the data are qualitative, it is important to process the interviews, annotations, answers to open questions, recordings and access logs, seeking patterns and a set of information relevant to the study.
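
For illustration, such a tabulation of quantitative questionnaire data could look like the Python sketch below; the answers are made up (a 1-5 scale) and the question names merely echo two of the framework’s metrics.

    # Hypothetical tabulation of made-up Likert-scale answers (1-5),
    # so their distribution can be observed and conclusions drawn.
    from collections import Counter

    answers = {
        "ease of use": [4, 5, 3, 4, 4, 2, 5],
        "observed complexity": [2, 3, 2, 1, 3, 2, 4],
    }

    for question, values in answers.items():
        counts = Counter(values)
        mean = sum(values) / len(values)
        print(f"{question}: counts={dict(sorted(counts.items()))}, mean={mean:.2f}")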


Analysis Phase Activities

A1 - Analyze Evaluators Profiles: the analysis of the profiles is used to gather the number of participants and the type of knowledge they have. These profiles can be classified as: 

- Beginner: one who does not have solid knowledge of the domain or of the DSL.

- Intermediate: one who has some knowledge of the domain and/or of the DSL.

- Advanced: one who has solid knowledge of both the domain and the DSL.

A4 - Analyze the Developed Protocol: in this activity, all the described steps of the study protocol, and how they will be performed, should be reviewed in detail in order to ensure the validity of the study.

A7 - Analyze the Collected Data: when analyzing the data collected during the evaluation, standardization, hypothesis testing, analysis of images and logs, and transcription of interviews and videos are performed.

A9 - Analyze the Performed Tasks: the developed models should be checked by more than one researcher in order to verify them and to obtain the task execution rate and the error rate of the participants. Afterwards, the evaluations of those who did not reach the objectives of the task, or did not complete the intended task, are discarded.
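
For instance, the two rates mentioned above reduce to simple ratios, as in the Python sketch below; all numbers are made up.

    # Minimal sketch of the rates mentioned above, with made-up numbers.
    completed_tasks = 18    # participants who reached the task objectives
    total_participants = 24
    errors_made = 7         # errors observed across all performed tasks
    tasks_performed = 96

    execution_rate = completed_tasks / total_participants
    error_rate = errors_made / tasks_performed

    print(f"task execution rate: {execution_rate:.0%}")  # 75%
    print(f"error rate: {error_rate:.1%}")               # 7.3%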

A11 - Analyze the Documentation: the documentation used in the evaluation must all be analyzed by the researcher and checked by a second researcher to ensure the consistency of the produced information and documentation.


Reporting Phase Activities

R1 - Report Evaluator Profiles: when reporting the participants' profile, the classification and the total number of participants who performed the evaluation should be taken into account. Furthermore, other information should be described if it appears in the characterization questionnaire.

R2 - Report Subjects Number and the Form Used: all documents used in the evaluation should be described in detail and attached to the final report.

R4 - Report the Developed Protocol: the study protocol should be described for each planned, executed and analyzed step. 


R5 - Report Conduction Evaluation: the HCI evaluation methods used must be described:

- Usability Testing: this evaluation aims to test whether potential users would use the language developed to model the domain for which it was proposed;

- Heuristic Evaluation: this is a usability inspection method, which is applied by HCI specialists who are guided by a set of heuristics developed by Nielsen. This method aims to identify usability problems in the evaluated language.

R7 - Report Data Analysis:

- Quantitative data: should be reported through charts, spreadsheets or hypothesis testing (see the sketch after this list).

- Qualitative data: can be represented by an image, interview transcript, annotation excerpts, categorization and standards, high-level video narratives, and fragments of open-ended questions.
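
As an example of the quantitative case, the Python sketch below reports a hypothesis test comparing the task-completion times of two groups. The numbers are made up, and SciPy’s independent-samples t-test is only one possible choice; the appropriate test depends on the study design.

    # Hypothetical report of quantitative data via a hypothesis test:
    # comparing task-completion times (seconds) of two made-up groups.
    from scipy import stats

    times_group_a = [210, 185, 240, 198, 225, 202]  # e.g., participants using the DSL
    times_group_b = [260, 245, 290, 270, 255, 265]  # e.g., a baseline notation

    t_stat, p_value = stats.ttest_ind(times_group_a, times_group_b)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
    # A p-value below the chosen alpha (commonly 0.05) is reported as a
    # statistically significant difference between the groups.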

R8 - Report the Instruments: the instruments used (characterization questionnaire, language manual, usage scenario, interview script, opinion questionnaire, among others) must be detailed at a high level in the protocol and arranged in an appendix to the document used to present the study.

R9 - Report Tasks Analysis: the evaluation must be reported according to the chosen method. A usability test is reported through the protocol of an experiment, case study or survey. When a heuristic evaluation is performed, the analysis carried out by the specialists, as well as the activities performed and the generated models, should all be reported.

R11 - Report the Results and Analyzed Information: at the end of the evaluation, the data should be fully described in a report format, with all the documents attached to the report.