Guide to: Quality Assurance (QA) in Quality Flow Projects


Quality Flow is designed to let you run ongoing expert review of your work jobs, receive detailed metrics on data quality and contributor performance, and provide on-the-job feedback to contributors. To that end, Quality Flow provides a pre-configured QA job, called a "Following QA Job". This job is created from a Work Job on the Jobs canvas.

However, you may also want to load pre-annotated data and run a leading QA job to collect metrics on how well the annotation was done, or to further examine or revise work done in your Quality Flow Work Jobs. For these purposes, you can use a "Leading QA Job". This job is created directly from ALL DATA in the Jobs canvas.

This article describes how to create QA Jobs and explains the various additional settings available for these jobs.


Following QA Jobs

Following QA Jobs branch off, or "follow", a Leading Work Job. These jobs are created by clicking the small (+) icon at the bottom of the Leading Job on the Jobs canvas. (Note: this icon will not appear if your Work Job is collecting more than one judgment per row; Following QA Jobs on multiple judgments are not currently supported.)


You will be prompted to give the Job a title.



The job will then appear on the Canvas branching directly from the Leading Job.


  • This job will automatically be a copy of the Leading job with the addition of Rejection options and a feedback box.

  • You can edit the QA job's design (for example, to add questions); however, this is not recommended.


Quality Configurations in Following QA Jobs

In Following QA Jobs, you will see several settings to choose from under the QUALITY tab.

  • Allow QA contributor to modify the original contributor’s judgments:

    • If you select this option, your QA checkers will be able to edit the responses. Your Quality Dashboard will display an Acceptance Rate at the unit level and per contributor, and, if your use case is supported, detailed metrics on specific tags, labels, words, etc. (see Dashboards).

  • Require the Original Contributor To Respond to Feedback

    • If this is checked, contributors will receive feedback left by QA and will need to Acknowledge or Dispute the feedback before they can continue working. If the contributor Disputes the feedback, it will go to the Feedback dashboard for the supervisor to arbitrate (see Dashboards).

  • Do not allow QA contributors to modify judgments
    • If you select this option, QA checkers will only be able to Accept or Reject each unit, and your Quality Dashboard will display the Acceptance Rate only (you can also elect to add reasons for rejection; see below).

    • additionally, you will be asked whether Rejected units should be sent back to the pool to be redone, or not:

        • if you do choose this option, the original contributor, the QA checker, and the next contributor all receive payment for the unit.

        • you will also be able to find the rejected units and manually route them to a Rework task later.


If you choose to include "Rejection Reasons", enter one or more reasons (one per line); QA checkers will then need to choose one of these reasons when they reject a unit. This option is not available when QA checkers are able to modify the result. Rejection reasons will be included in the data download and visible in the DATASET table and the Quality Dashboard.
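For example, you might enter a list like the following (the reasons themselves are purely illustrative; choose ones that fit your project's error taxonomy):

```
Incorrect label
Missed annotation
Wrong span boundaries
Guidelines not followed
```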

Sample Settings in Following QA Jobs

To set how many units you would like to review in QA, click on the small grey circle between your Leading and Following Jobs on the Jobs Canvas.



The following dialogue box will appear, where you can select what percentage of each contributor's work you would like to sample.

  • You can also set a minimum number of units per contributor.
  • If you are using unit groups, and you want the sample to be equally distributed across unit groups, you should toggle Segment Setting to ON.

A note on QA Sampling

It is possible to change the sample rate during the QA process:

  • Pause the QA job
  • Click on the grey circle
  • Adjust the Sample rate
  • Run the Job

Note, however, that if you previously set the sample rate to 10% and you raise it to 15%, the new rate will apply only to future submissions. An additional 5% of the previously sampled pool will not be routed to QA.

However, you can go back, select additional rows, and route them to QA via the Actions menu on the DATASET table.
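The rate-change behavior above can be sketched in a few lines of Python. This is not Quality Flow's internal logic, just an illustration of the principle: the sample rate is applied to each submission as it arrives, so raising the rate never re-samples units that were already submitted. All names here are made up for the example.

```python
import random

def route_to_qa(units, rate, rng):
    """Independently sample each incoming unit at the current rate."""
    return [u for u in units if rng.random() < rate]

rng = random.Random(0)

# Units submitted while the sample rate was 10%.
first_batch = list(range(100))
qa_pool = route_to_qa(first_batch, 0.10, rng)

# Raising the rate to 15% does not revisit first_batch:
# only units submitted from now on are sampled at the higher rate.
second_batch = list(range(100, 200))
qa_pool += route_to_qa(second_batch, 0.15, rng)
```

To review more of the already-submitted pool, you would select those rows by hand, which is what the DATASET table's Actions menu lets you do.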





Prices & Rows

On the Prices & Rows page on the LAUNCH tab for QA Jobs, there is a toggle that controls whether QA checkers may QA their own work. Keep it toggled OFF if some of the same people do both Work and QA and you do not want QA checkers to receive their own original work in the QA Job.



Leading QA Jobs

When creating leading QA Jobs, there are a few things to be aware of, especially if you are used to using ADAP jobs and workflows:

  • Leading QA Jobs are designed to take data & annotations from other Work or QA jobs as inputs. 

  • a Leading QA Job is therefore not exactly the same as using task-type="qa" in ADAP jobs, because you do not configure where the "review-data" comes from; it is automatic.

  • To re-create the ADAP experience (e.g. if you have external pre-annotations that you want verified or updated), you can select "Work Job" and then design your job using task-type="qa" in the CML, as you would in ADAP.
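For illustration only, such a Work Job's CML might look roughly like the sketch below. The element name, the data column names ({{text}}, {{preannotation}}), and the output name are all placeholders, and the exact attributes vary by task type; consult the CML reference for your tool before using anything like this.

```xml
<!-- Hypothetical sketch: a Work Job designed as an ADAP-style QA job.
     Column names in {{...}} are placeholders for your own dataset fields. -->
<cml:text_annotation
  name="reviewed_annotation"
  source-data="{{text}}"
  task-type="qa"
  review-data="{{preannotation}}"
  validates="required"
/>
```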

You can use leading QA jobs to do further examination or revision of work done in your quality flow work jobs.

A quick way to create a revision leading QA job is to copy a following QA job, using the copy icon at the top right of the page.

Name your job and choose from the options in the dialogue box (currently, the only source option is Project Data Source). This will create a new Leading QA Job; all settings will remain adjustable, as usual, from creation until the job is running.


Leading QA Job configurations

For Leading QA Jobs you will need to make the following selections under the QUALITY tab. Note that the feedback loop is not available for Leading QA Jobs.



  • Choose either to Allow the QA checker to Modify the unit, or not

  • If you choose Do Not Allow, you will then be given the option to send Rejected units back to the original contributor to rework, or not.

  • If you choose Do Not Allow, you will also be able to specify Rejection Reasons.



