Get the Most Effective Annotation Solution
Alegion Managed Services: Our Process
Alegion Managed Services allows data science teams to offload millions of hours of data annotation work to our team of data engineers, project managers, and a highly skilled annotation workforce. Our process has three main stages: discovery, annotation, and quality assurance. Throughout this process, we follow a philosophy of continuous improvement to ensure we deliver highly accurate training data to our clients.


Discovery
We start the discovery process by analyzing your representative sample data and delivering a proof of concept based on your goals and requirements. We work together to set and optimize the annotation requirements, which determine the number and types of annotations for each image, frame, or other data type. We also structure the classification ontology so that it captures the appropriate object classes and relationships.
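To make the idea of a classification ontology concrete, here is a minimal sketch of one expressed as plain Python data. The class names, attributes, and relationship types are illustrative assumptions for this example, not Alegion's actual schema.

```python
# Illustrative only: a minimal classification ontology as plain Python data.
# Class names, attributes, and relationship types are hypothetical.
ontology = {
    "classes": {
        "vehicle": {"attributes": ["color", "occluded"]},
        "car":     {"parent": "vehicle", "attributes": ["make"]},
        "truck":   {"parent": "vehicle", "attributes": ["trailer_attached"]},
        "person":  {"attributes": ["pose"]},
    },
    # Relationships let annotators link objects to each other,
    # e.g. marking which person is driving which vehicle.
    "relationships": [
        {"name": "driver_of", "from": "person", "to": "vehicle"},
    ],
}

def ancestors(ontology, class_name):
    """Walk parent links so downstream tools can reason about the hierarchy."""
    chain = []
    current = ontology["classes"].get(class_name, {})
    while "parent" in current:
        chain.append(current["parent"])
        current = ontology["classes"][current["parent"]]
    return chain

print(ancestors(ontology, "car"))  # ['vehicle']
```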
After we determine the best segmentation types and ontology, we build out several candidate annotation workflows. We consult with our annotators and run A/B tests to find the most intuitive way to label objects, ensuring annotation efficiency and accuracy.
Concurrently, we recruit the appropriate number of annotators for the project from our global and US-based workforce. We train and qualify annotators against ground truth annotations and enroll them in the project.
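As a rough illustration of qualification against ground truth, the sketch below scores a candidate annotator's bounding boxes with intersection-over-union (IoU) and requires a minimum average score. The box format and the 0.8 threshold are assumptions for the example, not Alegion's actual qualification criteria.

```python
# Illustrative only: qualify an annotator by comparing their boxes to
# ground truth with IoU. Threshold and box format are hypothetical.

def iou(box_a, box_b):
    """Boxes are (x_min, y_min, x_max, y_max) in pixels."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    inter_w = max(0, min(ax2, bx2) - max(ax1, bx1))
    inter_h = max(0, min(ay2, by2) - max(ay1, by1))
    inter = inter_w * inter_h
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union else 0.0

def qualifies(candidate_boxes, ground_truth_boxes, min_mean_iou=0.8):
    """Pair each ground-truth box with the candidate's box for the same object."""
    scores = [iou(c, g) for c, g in zip(candidate_boxes, ground_truth_boxes)]
    return sum(scores) / len(scores) >= min_mean_iou

print(qualifies([(10, 10, 50, 50)], [(12, 11, 52, 49)]))  # True (IoU ~0.86)
```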
Solution Deployment
Our solutions engineers design and deploy the final annotation task workflows. Then, we combine human annotation with Alegion's machine learning-augmented software platform to annotate frames in just seconds, reducing annotation time by up to 90%. Our project managers ensure that batches of data are delivered to your time, budget, and quality requirements, for projects of up to millions of data points.
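One common way machine-assisted tooling shortens video annotation is to have a human label a handful of keyframes and let the platform propose boxes for the frames in between. The linear interpolation below is a simplified stand-in for that idea, not a description of Alegion's actual assistance models.

```python
# Illustrative only: fill in boxes between human-labeled keyframes by
# linear interpolation. Real assistance models are far more sophisticated.

def interpolate_boxes(keyframes):
    """keyframes: {frame_index: (x_min, y_min, x_max, y_max)} drawn by a human."""
    frames = sorted(keyframes)
    filled = dict(keyframes)
    for start, end in zip(frames, frames[1:]):
        b0, b1 = keyframes[start], keyframes[end]
        for f in range(start + 1, end):
            t = (f - start) / (end - start)
            filled[f] = tuple(a + t * (b - a) for a, b in zip(b0, b1))
    return filled

# A human draws boxes on frames 0 and 10; the platform proposes the other 9.
boxes = interpolate_boxes({0: (10, 10, 50, 50), 10: (30, 10, 70, 50)})
print(boxes[5])  # (20.0, 10.0, 60.0, 50.0)
```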
Quality Checks
As annotators submit work, automated worker quality scores are sent to our Quality Assurance team, and we address quality issues as soon as they surface to prevent costly delays. Submitted work can also be reviewed in the Alegion Review Portal, where clients can accept annotations or provide feedback.
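As a sketch of what automated worker scoring can look like, the example below tracks a rolling acceptance rate per annotator from review decisions and flags anyone who drops below a threshold. The window size, threshold, and notify_qa hook are hypothetical, not part of Alegion's platform.

```python
# Illustrative only: rolling per-annotator quality score from review results.
from collections import defaultdict, deque

WINDOW = 50            # score over the last 50 reviewed tasks
ALERT_THRESHOLD = 0.9  # hypothetical minimum acceptance rate

recent_results = defaultdict(lambda: deque(maxlen=WINDOW))

def notify_qa(annotator_id, score):
    # Hypothetical hook into a QA review queue.
    print(f"QA alert: {annotator_id} rolling accuracy {score:.0%}")

def record_review(annotator_id, accepted):
    """Call whenever a reviewed task is accepted (True) or rejected (False)."""
    results = recent_results[annotator_id]
    results.append(1 if accepted else 0)
    score = sum(results) / len(results)
    if score < ALERT_THRESHOLD:
        notify_qa(annotator_id, score)

record_review("annotator-42", accepted=True)
record_review("annotator-42", accepted=False)  # triggers a QA alert at 50%
```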
Continuous Calibration
With any machine learning project, we know that experimentation is required to reach the most accurate model outputs. As batches of labels (typically a hundred thousand at a time) are delivered and fed into your model, data labeling requirements can change based on what the model outputs show. Our project managers are ready to update your annotation guidelines accordingly, so that models hit quality requirements across different annotation types and edge cases.