
Changing Guidelines: Best Practices for Maintaining Data Quality

May 3, 2024

Natalia Elvira Astoreca

Well-defined annotation guidelines are crucial for ensuring data quality and consistency in any annotation project. However, developing comprehensive guidelines can be challenging, and it's not uncommon for guidelines to evolve as the project progresses and you gain a better understanding of the data and the problem you're solving. So, how can we manage these changes effectively without compromising data integrity?

Mastering the MAMA cycle: a scientific approach

In an ideal world, you would begin your annotation project with the MAMA (Model-Annotate-Model-Annotate) cycle, a scientific approach to refining your guidelines. During this "babbling" phase, you experiment with a sample of your data, iteratively refining your guidelines based on questions, feedback and edge cases identified by your team. Don't worry about the quality of annotations just yet; focus on grasping the task at hand and ensuring a shared understanding among the team.

As you progress, pay attention to Inter-Annotator Agreement (IAA) metrics. Low agreement on certain labels or questions may indicate areas where your guidelines need clarification. Iterate and refine your guidelines until you're satisfied with the IAA scores and the guidelines feel stable. Note that for subjective questions, such as collecting human preferences over LLM generations, you may be comfortable with lower IAA scores, but you still want to check that there is a shared understanding of what makes a generation preferable or good (e.g. safety, honesty, etc.). Once you reach this stage, you can confidently move on to annotating your gold standard dataset, either reannotating the data used for the MAMA cycle or starting fresh with new data.
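As a rough illustration, here is a minimal sketch of a pairwise IAA check using scikit-learn's cohen_kappa_score; the two label lists are made-up annotations from two annotators over the same ten records.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical labels from two annotators for the same ten records
annotator_a = ["pos", "neg", "pos", "neu", "pos", "neg", "neu", "pos", "neg", "pos"]
annotator_b = ["pos", "neg", "neu", "neu", "pos", "neg", "neu", "pos", "pos", "pos"]

# Cohen's kappa corrects raw agreement for agreement expected by chance:
# 1.0 is perfect agreement, 0.0 is chance-level agreement.
kappa = cohen_kappa_score(annotator_a, annotator_b)
print(f"Cohen's kappa: {kappa:.2f}")
```

If a specific label or question consistently drags the score down, that is usually the part of the guidelines that needs clarification. For more than two annotators, or when not every record is annotated by everyone, a measure such as Krippendorff's alpha (available, for example, in the krippendorff package) is a common alternative.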

Spending enough time in this phase helps you gradually develop a deep understanding of your data and task while allowing for guideline iterations without fear of breaking changes. However, we often face time and resource constraints, or struggle to reach stable guidelines and satisfactory IAA scores. So, what can we do when the MAMA cycle isn't feasible?

Guidelines management in action: strategies for efficiency

Sometimes it isn’t practical to complete a MAMA cycle: time and resources are limited, and it may seem like you’re never close to finishing this phase. If that’s the case, you can jump straight to annotating with your initial guidelines, keeping in mind some practical steps to manage guideline changes effectively:

  • Maintain a changelog: Keep a record of changes made to your guidelines, including dates. This helps you track the evolution of your guidelines and communicate changes to your team.
  • Implement versioning: Assign versions to your guidelines, and "publish" updates regularly (e.g., weekly). Communicate significant changes, such as modified label definitions, to your team, ensuring everyone is working with the latest version. Extra tip: with a fixed weekly "release" day for your guidelines, you know that big changes will always show up in the annotations from a specific day of the week onwards. The first sketch after this list shows one way to tie annotations to guideline versions.
  • Identify major changes: Be vigilant about significant guideline changes, such as new labels or substantial alterations to existing label definitions. These changes may require reviewing previously annotated records to ensure consistency.
  • Prioritize review: Prioritize reviewing and updating records affected by the changes to maintain data integrity. In Argilla, you can use response filters, keyword searches, or similarity searches to identify those records.
  • Focus on the test split: If reviewing all annotated data is impractical, ensure that at least your test split aligns with the latest version of the guidelines. For the training set, you may tolerate noisier annotations and give more weight to recent annotations to compensate for inconsistencies; the second sketch after this list shows one simple weighting scheme.
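To make the changelog, versioning, and review steps concrete, here is a small sketch in plain Python (not tied to any particular SDK); the changelog entries, record structure, and field names are assumptions for illustration only.

```python
from datetime import date

# Hypothetical changelog: one entry per "published" guideline version
GUIDELINE_CHANGELOG = {
    "v1.0": {"released": date(2024, 4, 1), "notes": "Initial guidelines"},
    "v1.1": {"released": date(2024, 4, 8), "notes": "Clarified edge cases for the SARCASM label"},
    "v2.0": {"released": date(2024, 4, 15), "notes": "Major change: redefined the OTHER label"},
}

# Hypothetical annotated records, each tagged with the guideline version
# that was in force when it was labeled
records = [
    {"id": 1, "label": "OTHER", "guideline_version": "v1.0"},
    {"id": 2, "label": "SARCASM", "guideline_version": "v1.1"},
    {"id": 3, "label": "OTHER", "guideline_version": "v2.0"},
]

# After the major v2.0 change, flag records annotated under older versions,
# prioritizing those carrying the label whose definition changed
cutoff = GUIDELINE_CHANGELOG["v2.0"]["released"]
needs_review = [
    r for r in records
    if GUIDELINE_CHANGELOG[r["guideline_version"]]["released"] < cutoff
    and r["label"] == "OTHER"
]
print([r["id"] for r in needs_review])  # -> [1]
```

In Argilla, the same idea can be applied by, for example, storing the guideline version in each record's metadata, so that metadata filters, response filters, or searches can surface the records that need a second look.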
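And for the last point, a minimal sketch of down-weighting older training annotations, assuming each example carries the date it was annotated; the exponential decay and half-life are arbitrary choices for illustration.

```python
from datetime import date

# Hypothetical training examples carrying the date they were annotated
train = [
    {"text": "...", "label": "pos", "annotated_on": date(2024, 4, 2)},
    {"text": "...", "label": "neg", "annotated_on": date(2024, 4, 20)},
]

latest = max(ex["annotated_on"] for ex in train)

# Simple exponential decay: an annotation's weight halves every 14 days of "age".
# The half-life is arbitrary; tune it to how quickly your guidelines changed.
HALF_LIFE_DAYS = 14
for ex in train:
    age_days = (latest - ex["annotated_on"]).days
    ex["weight"] = 0.5 ** (age_days / HALF_LIFE_DAYS)

print([round(ex["weight"], 2) for ex in train])  # -> [0.41, 1.0]
```

The resulting weights can then be passed to your trainer, for instance via the sample_weight argument that most scikit-learn estimators accept in fit().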

Final Thoughts

Changing guidelines is a natural part of the annotation process, reflecting a growing understanding of your data. By following the MAMA cycle when possible and adopting practical management strategies when it isn't, you can maintain high data quality and consistency. Remember, guidelines are living documents, and adapting to changes is key to successful annotation projects.

For further insights into the MAMA cycle and other best practices for natural language annotation projects, check out Pustejovsky, J. & Stubbs, A. (2012) Natural Language Annotation for Machine Learning, especially Chapter 6: Annotation and Adjudication.