Job Title: Data Labeling Analyst
Duration: 6+ Months
Location: 100% Remote
Responsibilities
DLA IIs (Data Labeling Analyst IIs) form the cornerstone of the Client's labeling operations and act as both annotators and auditors, depending on Product and project needs. Their responsibilities include, but are not limited to:
- Labeling / auditing a set amount of data daily, at the direction of Project Coordinators and Project Managers
- Reviewing and validating labeled datasets for accuracy and consistency
- Performing QA in our in-house annotation tooling
- Flexing between projects as demand requires; adjusting to new project guidelines week over week and learning multiple project guidelines at once
- Comprehending complex guidelines and product requirements to accurately label ML data to refine AI models
- Collaborating with team members to identify improvements to the labeling process to drive efficiency and promote high quality
- Supporting the training of new DLAs who join a project through shadowing sessions (i.e., letting new DLAs shadow their work)
- Operationalizing Product feedback and bringing data labeling best practices downstream to peer DLAs
- Supporting any additional tasks that help drive program excellence
Educational Requirements
- BA/BS required
- Knowledge of a foreign language preferred
- STEM or MLG background preferred
- Must pass our English annotation test
KPIs
- Annotation quality (consistency, adherence to guidelines, etc.)
- Audit quality, precision, and accuracy
- Quality to speed / quality to throughput
- Accountability
- Engagement