2. Quality Factor: Configuration

The Eventide Quality Factor software option provides a structured way for organizations to analyze and report on call and dispatch handling performance.

Eventide Quality Factor software requires a NexLog recorder running NexLog software version 2.1 or later and an Evaluation Agents license sufficient to cover all Agents who will be evaluated. On the NexLog recorder, you will need to add a custom field named Agent_ID, of type Text. Also required is a web browser supported by the MediaWorks DX web client:

  • Internet Explorer 11 & Edge

  • Chrome

  • Firefox

Quality Factor software is configured using the NexLog Configuration Manager in the Quality Factor Software section, as seen here:

Quality Factor

Fig. 2.1 Quality Factor

2.1. Skill Groups, Answer Sets, Questions, and Evaluation Forms

This chapter describes how to establish Skill Groups for Questions, how to create Answer Sets, how to create Questions, and finally how to create and revise the Evaluation Forms.

2.1.1. Skill Groups

Skill Groups are a way of organizing questions, both to ease the creation of Evaluation Forms and to increase the usefulness of the evaluation results. Before you create new questions, take a look at the sample Skill Groups that ship with the Eventide Quality Factor option.

These are:

  • Assignment of Incident

  • CAD Skills

  • Call Control

  • Dispatch Information Flow

  • Empathy

  • Information Gathering

  • Initial Contact

  • Interview Questions

  • Mutual Aid

  • Radio Protocol

  • Speaking Skills

  • Summarization

  • Supervisor Overview

  • Telephone Protocol

If these groups cover your needs, you can proceed to Answer Sets and then to Questions. Otherwise, you can add and edit the list of available skill groups by navigating to the Quality Factor Software -> Skill Groups page in the NexLog Configuration Manager web-based software.

Note

Questions are assigned to skill groups when created or edited via the Questions page.

2.1.2. Answer Sets

Questions can be scored using 4 types of Answer Sets:

  • Multiple Choice: A list of 2 or more answers, where the first choice is worth 100% and the last choice is worth 0%, in even increments. Examples:

    • For the default Answer Set “Yes/No”, Yes is worth 100% and No is worth 0%. (If you want No to be worth 100%, then create a new Answer Set with “No/Yes”).

    • If you create a 3-choice Answer Set such as Yes/Partial/No, then Yes is worth 100%, Partial is worth 50%, and No is worth 0%. (If you want No to be worth 100%, then create a new Answer Set with “No/Partial/Yes”).

  • Range 1 through 10: This type of answer set allows a Question to be scored using a sliding scale with 10 steps, where step 1 is worth 0% and step 10 is worth 100%, with equal increments for each step. You can name step 1 (such as “Poor”) and you can name step 10 (such as “Excellent”).

  • Range 1 through 5: The same as Range 1 through 10, except this Answer Set allows a Question to be scored using a sliding scale with 5 steps, where step 1 is worth 0% and step 5 is worth 100%, with equal increments for each choice. You can name step 1 (such as “Poor”) and you can name step 5 (such as “Excellent”).

  • Freeform: This answer set is unique in that it has no effect on how the form is scored. It is useful for questions that do not have a qualitative answer, such as “Summarize the contents of the call” or “Was there anything unusual in this call worth noting?”
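The even-increment rule for Multiple Choice Answer Sets can be sketched as a quick calculation. This is a hypothetical helper for illustration only, not part of the product:

```python
def choice_values(choices):
    """Return the percentage value of each choice in a Multiple Choice
    Answer Set: 100% for the first choice down to 0% for the last,
    in even increments."""
    n = len(choices)
    step = 100 / (n - 1)  # even spacing between consecutive choices
    return {choice: round((n - 1 - i) * step) for i, choice in enumerate(choices)}

print(choice_values(["Yes", "No"]))             # {'Yes': 100, 'No': 0}
print(choice_values(["Yes", "Partial", "No"]))  # {'Yes': 100, 'Partial': 50, 'No': 0}
```

This is why reversing the order of the answers (e.g. “No/Yes”) is the way to make No the full-credit answer.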

Answer Sets

Fig. 2.2 Answer Sets

2.1.3. Questions

Questions are at the core of evaluation forms. Questions should be developed based on your agency’s Evaluation criteria. Eventide Quality Factor software includes a set of sample Questions that will aid in understanding how questions are created, scored, and categorized into skills. You are free to customize the sample questions and add additional questions as you see fit.

Questions

Fig. 2.3 Questions

Each Question has three options:

  • A text field for the question itself

  • An Answer Set that the question is to be scored with

  • A Skill Group that the Question belongs to

Question Creation and Edit page

Fig. 2.4 Question Creation and Edit page

Questions can be “published” (meaning they are ready for use) or “unpublished” (meaning they are either in rough draft form or have been retired and replaced by a newer question). Once unpublished, questions already in use remain in the forms that use them, but they cannot be added to new forms.

Note that editing a question that is already in use will affect every instance of that question, including on existing forms. Edit questions only to correct typos or to make minor changes in wording; never make a change that would alter the meaning of an answer that has already been submitted.

If you need to make a substantial change, unpublish the original question (Edit Question -> Published checkbox) and then create a new question.

2.1.4. Forms

The Forms page shows you a list of the Evaluation Forms on the system. One sample form, three standard Call Taking forms, and three Dispatch forms are included.

Quality Factor: Forms page

Fig. 2.5 Quality Factor: Forms page

The forms are configured by the System Administrator from the Quality Factor Software “Forms” page of the NexLog Configuration Manager. The Form creation page, shown below, is organized into three separate fields:

  • Evaluation Form (top section) shows all evaluation questions that have been added to the form.

  • Skills (bottom-left) lists the configured skill groups with which to filter the questions library.

  • Questions Library (bottom-right) shows the bank of questions that have been developed for use in one or more Evaluation Forms; the list is filtered based on the selected Skill Group in the Skills field.

Form Creator

Fig. 2.6 Form Creator

There is no limit to the number of different evaluation forms that may be created. For instance, different call handling teams or call types may have specific technical needs that should be evaluated differently, and thus may require a separate form.

Forms are created by dragging and dropping questions from the Questions field into the Form field. You can then re-order questions by dragging (up or down) and dropping, and you can delete previously added questions by clicking the red X in the upper-right corner of the question you want to delete.

The minimum score to pass and the number of flags to fail are configured at the top; more information about both topics is provided in the Evaluation Scoring section later in this chapter.

Form in Progress

Fig. 2.7 Form in Progress

Note

The evaluation form is updated in real-time; if you check the Allow N/A checkbox, you will see the addition of a radio button for N/A. If a question has a 1-5 or 1-10 range, you can see how it will work by selecting an answer in the range boxes yourself. The scoring settings are not saved as default criteria, but simply exist on a per form basis.

From here, you can save the form, cancel, or preview the form, which shows you how the form will look when run in MediaWorks DX:

Form Preview

Fig. 2.8 Form Preview

2.1.5. Publishing Forms and Questions

You’ll notice that the creation and editing pages for Forms and Questions have a checkbox for Published. When a Form or Question is “Published,” it is available for use in MediaWorks DX; an unpublished form or question is either not yet ready for use or has been retired and replaced. Unpublishing is different from deleting: you aren’t allowed to delete a form or question that has been used in a completed or in-progress evaluation, but by unpublishing it, you can prevent its use in the future.

Questions can be edited after they have been used in a completed evaluation, but this will affect every evaluation that used the question; in other words, edit a question only to correct a typo or to clarify its meaning in a way that leaves already-submitted answers correct. For major changes, unpublish the old version of the question and create a new question with the changed text or answer set. This allows you to use the new version of the question on forms you create in the future (or on future revisions of your current forms) without affecting existing forms or completed evaluations.

When a form that has been used is edited, it will be saved as a new Revision, and the previous version will be automatically unpublished. The unpublished, previous versions will show up on the Form page in grey italics. The Form page has a check box to “Show published forms only,” which is useful for clarity when many forms have been through several revisions.

Previous revisions remain on the logger because they are in use by previously completed evaluations, but are not available to be used for new evaluations.

Form Unpublished Because of New Revision

Fig. 2.9 Form Unpublished Because of New Revision

2.2. Evaluation Scoring

The way Evaluations are scored is straightforward but flexible enough to handle many evaluation requirements. The simplest case is a form with a number of questions and a percentage value that must be met in order to pass. A form designed for more complex needs can weight certain questions more heavily than others, can include questions that trigger an instant fail of the entire evaluation if the Agent doesn’t meet expectations, and can require an Evaluator to comment on a failed question. We’ll go over the basics of scoring first, and then tackle the Autofail, Flags to Fail, and Commenting options.

2.2.1. Basic Scoring

Each question in a form is assigned a maximum value; this defaults to 5 points. The value of each question is relative to the other questions, so if you want a question to be worth twice as much as the others, you can set it to 10 points and leave the rest alone. Because the completed evaluation page will show the score as a percentage out of 100, there is no need to manually manage the total number of per-question points.

Scoring for the default Answer Sets works like this:

  • 1 to 5: A five is worth 100%, a four is worth 75%, a three is worth 50%, a two is worth 25% and a one is worth 0%.

  • 1 to 10: Rounding to the nearest whole percent, a ten is worth 100%, a nine is worth 89%, an eight is worth 78%, a seven is worth 67%, a six is worth 56%, a five is worth 44%, a four is worth 33%, a three is worth 22%, a two is worth 11%, and a one is worth 0%.

  • Yes / No: Yes is worth 100% and No is worth 0%. Note that this isn’t a special kind of question; it’s just a two-answer Multiple Choice question. If you want questions where No is the correct answer, create a Multiple Choice Answer Set called No / Yes.

  • Outstanding / Good / Fair / Poor / Unacceptable: This is a five-answer Multiple Choice question, and it scores the same as the 1 to 5 range Answer Set, with Outstanding worth 100%, Fair worth 50%, and Unacceptable worth 0%.

  • Freeform Answer: Freeform answers do not affect score, as they are for non-quantitative questions that don’t easily translate into numbers.
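To make the arithmetic above concrete, here is a sketch of how the range values and the weighted total could be computed. The function names are hypothetical; the recorder’s actual implementation is not exposed:

```python
def range_value(step, steps):
    """Percentage value of a position on a 1-to-N sliding scale:
    step 1 is worth 0% and step N is worth 100%, evenly spaced,
    rounded to the nearest whole percent."""
    return round(100 * (step - 1) / (steps - 1))

def form_score(answers):
    """Overall percentage score from (answer_fraction, max_points) pairs.

    answer_fraction is the answer's value as a fraction of its question's
    points (e.g. 0.75 for a "4" on a 1-to-5 range); max_points is the
    question's weight (5 by default). Freeform and N/A answers are
    excluded before calling, since they do not affect the score.
    """
    earned = sum(frac * points for frac, points in answers)
    possible = sum(points for _, points in answers)
    return round(100 * earned / possible)

print(range_value(8, 10))                  # 78
# A "4" on a 1-to-5 scale (75%) on a 5-point question, plus a "Yes"
# (100%) on a 10-point question: (3.75 + 10) / 15 -> 92
print(form_score([(0.75, 5), (1.0, 10)]))  # 92
```

Because the final score is reported as a percentage, only the relative weights of the questions matter, as described above.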

2.2.2. Autofail

Sometimes the answer to a question is important enough to be pass/fail for the entire evaluation. For example, you likely have a hardline stance against call takers using profanity when talking to callers. This is where the Autofail feature might be of use. Any question in an Evaluation form can be set to Autofail if it is scored below a certain percentage.

Question from Form Edit with Autofail Under % Highlighted

Fig. 2.10 Question from Form Edit with Autofail Under % Highlighted

If you want a multiple choice or range-based question to Autofail only when the agent scores below an acceptable threshold, set this value to, for example, 50% or 25%; the question will then fail the evaluation only for scores under that value.

Note that Autofail will not preclude the rest of the questions on a form from being evaluated, so when the evaluation has been completed, you may see a percentage score alongside the failure that is greater than the minimum score needed to pass.

Autofail Form with High Score

Fig. 2.11 Autofail Form with High Score

2.2.3. Flags

Another way to make scoring more sophisticated is through the use of Flags. A flagged question is one that requires attention because it was scored too low to be considered acceptable. If a form uses flags (and if Flags to Fail has been set to a number greater than 0), then an Agent receiving at least that number of flags on the form will fail the evaluation, regardless of the total score achieved.

If Flags to Fail is set to 0, then no amount of flags will fail the form, but individual questions can still be set to be flagged if you want.

Flags To Fail and Flag Under % Highlighted

Fig. 2.12 Flags To Fail and Flag Under % Highlighted

A flag (resulting from a low score on a question) can also be configured to require the evaluator to write a comment.

2.2.4. Comments

With the exception of Freeform questions, each question on a form can have a field for comments. A pull-down menu on the Evaluation Form builder offers four options:

  • None: (default)

  • Optional: comments provide a text-entry area for the evaluator to make a note if something unusual catches their ear

  • Require: a comment will always be required and an evaluation cannot be saved as complete until the comment has been filled out

  • Require if Flagged: a comment is required if an evaluator gives a low enough score to this question to trigger the flag.

    This comment would typically explain why the question was scored so low. If you are setting up a question with the Autofail feature and want to require a comment in that case, set the ‘Flag Under’ % to the same as the Autofail Under % and set the comment field to ‘Require if Flagged’.

Comment Mode Menu in Form Builder

Fig. 2.13 Comment Mode Menu in Form Builder

2.2.5. Allow N/A

Each question can be configured to have a Not Applicable (N/A) answer that removes the question from scoring. A question answered N/A will not affect the total score or the breakdown by skill group. N/A is configured per-question in the form editor and, when checked, will result in an “N/A” radio button being added next to the Answer Set.

Flags To Fail and Flag Under % Highlighted

Fig. 2.14 Flags To Fail and Flag Under % Highlighted

2.2.6. Scoring Summary

In summary, an evaluation passes if all three of the following conditions are met:

  1. No questions with Autofail capability have been failed.

  2. The number of flagged questions is less than the form’s Flags to Fail quantity.

  3. The percentage score is equal to or greater than the form’s Passing Score.

An evaluation cannot be saved as completed if:

  1. Any questions have not been answered.

  2. Any questions requiring a comment are lacking a comment.

  3. Any flagged question for which a comment is ‘required if flagged’ lacks a comment.
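The three pass conditions can be combined into a single check, sketched here with hypothetical names (the recorder’s actual logic is not exposed):

```python
def evaluation_passes(score_pct, passing_score,
                      autofails, flags, flags_to_fail):
    """Return True only if all three pass conditions are met."""
    if autofails > 0:                  # 1. no Autofail question was failed
        return False
    # flags_to_fail == 0 means flags never fail the form
    if flags_to_fail > 0 and flags >= flags_to_fail:
        return False                   # 2. flag count must stay under Flags to Fail
    return score_pct >= passing_score  # 3. score meets the Passing Score

print(evaluation_passes(85, 80, 0, 1, 3))  # True: passing score, flags under limit
print(evaluation_passes(95, 80, 1, 0, 3))  # False: an Autofail question was failed
print(evaluation_passes(90, 80, 0, 3, 3))  # False: too many flagged questions
```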

2.3. Evaluation Form Options and Actions

Forms have a variety of options that can be accessed in the Form Options tab in the Form Editor.

Form Options

Fig. 2.15 Form Options

The Form Options include:

  • Enable Evaluation Live Scoring

  • Evaluation Comment Mode: No Comments, Simple Comments, Multiple Comments with Actions

  • Email on Evaluation Submit

  • Email on Comments/Actions

  • Enable Comment Acknowledgement

  • Allow Comment Modification

  • Enable Agent Digital Signature

These actions are taken when viewing a completed evaluation, at the bottom of the evaluation form:

Evaluation Actions

Fig. 2.16 Evaluation Actions

Here you can see the action history of this evaluation. In this case it is simple: the evaluation was completed, and then the evaluator commented on the evaluation with the text “Example of a Comment.” The grey rows show an audit history in the format: Action Taken, by (Agent/Evaluator/Group Leader/SuperEvaluator) (User Name) on Date. The available actions depend on both the form’s options and the user viewing the form. For example, Acknowledge Above Comments is greyed out because the logged-in user is not the Agent the evaluation is about; if we were logged in as that Agent, only Comment and Acknowledge Above Comments would be accessible, because the other actions are restricted to the other Evaluation roles.

2.3.1. Live Scoring

Live Scoring

Fig. 2.17 Live Scoring

Live Scoring for a form will allow evaluators to see the current score of the form as it is filled out. You may want to use this just while designing a form, to see if the weight of each question is appropriate, or leave it on so that evaluators can see whether the calculated score is in line with what they are expecting.

2.3.2. Evaluation Comment Mode

Evaluation Comment Mode

Fig. 2.18 Evaluation Comment Mode

This option controls the commenting workflow for Evaluations. You can choose None, which is the simplest situation: calls are evaluated, and that’s it. You can choose Single Agent Comment, which allows for one Agent comment; this was the only style prior to the NexLog DX-Series.

Multiple Comments with Actions is the new default as of NexLog 2.7. This option is required for some of the options below: Enable Comment Acknowledgement, Allow Comment Modification, and Enable Agent Digital Signature. It also provides additional actions that an Agent, Evaluator, or SuperEvaluator can take on a completed evaluation; these actions (reopen, acknowledge comment, lock, protect, and digital signature) are covered later in this manual.

2.3.3. Email On Evaluation Submit

Email on Evaluation Submit

Fig. 2.19 Email on Evaluation submit

This setting controls who will be emailed when an evaluation is completed. In the example above, the Agent, their Group Leaders, and all Super Evaluators will be emailed. To select more than one recipient, hold the Ctrl key while clicking.

Note

This feature relies on an email server being properly configured in the Configuration Manager and on each recorder user being configured with an email address. See the NexLog Manual for more details.

2.3.4. Email On Comment or Action

Email on Comment or Action

Fig. 2.20 Email on Comment or action

Emails can also be sent whenever an evaluation is commented on or other actions (locked, digitally signed, etc.) are taken. This setting allows those emails to go to a different set of users. In this case, it is configured to send to the Group Leader of the Agent involved in the Evaluation. As above, email must be properly configured for emails to be sent.

2.3.5. Enable Comment Acknowledgement

Click this checkbox to allow users to respond to a new comment on an evaluation with an “Acknowledge Above Comments” action. This allows users to acknowledge that they have seen the above comments at a certain time, without requiring a response or comment at the same time. They can also include a comment as part of this acknowledgement, but it is not required.

2.3.6. Enable Comment Modification

Comment Modification Pencil

Fig. 2.21 Comment Modification Pencil

With comment modification enabled, the pencil icon on the right will appear and can be used by any user with appropriate permissions to modify the text of any comment. A modified comment will show, on its right side, who made the most recent modification and when, as shown in the image above.

2.3.7. Enable Agent Digital Signature

Digital Signature

Fig. 2.22 Digital Signature

For sites that require a digital signature, this option allows an Agent to sign to confirm that they saw the evaluation at a specific time. They can only sign once unless the evaluation is modified again. Only the Agent can sign, and text must be entered into the field.

2.4. Evaluation Form Import and Export

Once you are happy with a form, you can export the form and import it to other NexLog DX-Series systems so that you don’t have to re-create it from scratch on every system you administer. Only users in the Admin or SuperEvaluator permission groups can use the export and import features.

2.4.1. Export Form and Associated Questions

Export Form and Associated Questions

Fig. 2.23 Export Form and Associated Questions

Select a form by clicking on it in the list, then click Export Form and Associated Questions. The form and its constituent parts (questions, skill groups, and answer sets) will be processed and turned into a formName.eva file; the button will change to read “Export Complete! Click to download form,” which you can click to download the file.

2.4.2. Import Form

Import Forms

Fig. 2.24 Import Forms

To import a form, click the “Import Form” button, click “Browse…”, select the formName.eva file you want to load onto the recorder, and press OK. The form will be imported, unless the exact same form already exists on the recorder, in which case you will be warned that it cannot be imported.

If a form with the same name already exists, the imported form will be saved as a new revision and be set as unpublished; you can then choose to publish it when it is ready to supersede the existing form.

Any answer sets, questions, or skill groups involved in the form that don’t already exist on the recorder will be added and made available for new form construction; any answer sets, questions, or skill groups that already exist will remain as they are.