Data Engineer - Remote (India Offshore)
Role: Data Engineer
Location: Remote
Duration: Long Term
Experience:
Bachelor's degree in Computer Science, Engineering, or a related field.
10+ years of experience in data engineering, ELT development, and data modeling.
Proficiency in using Apache Airflow and Spark for data transformation, data integration, and data management.
Experience implementing workflow orchestration using tools such as Apache Airflow, SSIS, or similar platforms.
Demonstrated experience in developing custom connectors for data ingestion from various sources.
Strong understanding of SQL and database concepts, with the ability to write efficient queries and optimize performance.
Experience implementing DataOps principles and practices, including data CI/CD pipelines.
Excellent problem-solving and troubleshooting skills, with a strong attention to detail.
Effective communication and collaboration abilities, with a proven track record of working in cross-functional teams.
Familiarity with data visualization tools such as Apache Superset, and experience with dashboard development.
Understanding of distributed systems and working with large-scale datasets.
Familiarity with data governance frameworks and practices.
Knowledge of data streaming and real-time data processing technologies (e.g., Apache Kafka).
Strong understanding of software development principles and practices, including version control (e.g., Git) and code review processes.
Experience with Agile development methodologies and working in cross-functional Agile teams.
Ability to adapt quickly to changing priorities and work effectively in a fast-paced environment.
Excellent analytical and problem-solving skills, with a keen attention to detail.
Strong written and verbal communication skills, with the ability to effectively communicate complex technical concepts to both technical and non-technical stakeholders.