
Need AWS Data Engineer - Remote

Salary undisclosed


Role: Sr. AWS Data Engineer

Location: Remote

Duration: Long Term

Responsibilities

  • Software Development: Design, build, and maintain robust and scalable software applications.
  • Data Pipeline Development: Design, build, and maintain robust and scalable data pipelines on AWS, ensuring the efficient and timely transfer of data from various sources to our data warehouse.
  • Data Integration: Implement and manage third-party data integration tools like Fivetran to streamline data collection and transformation processes.
  • Data Quality Assurance: Implement data quality checks and validation processes to ensure the accuracy and reliability of data throughout the pipeline.
  • Performance Optimization: Continuously monitor and optimize data pipelines for improved speed, efficiency, and cost-effectiveness.
  • Security and Compliance: Implement data security best practices and ensure compliance with data protection regulations, such as GDPR, as applicable.
  • Documentation: Create and maintain comprehensive documentation for data pipelines, workflows, and procedures to facilitate knowledge sharing and collaboration within the team.
  • Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand their data requirements and provide support in delivering relevant datasets.
  • Technology Evaluation: Stay updated on emerging data engineering technologies and best practices, making recommendations for improvements and optimizations.

Qualifications

  • Bachelor's degree in Computer Science, Engineering, or a related field. Master's degree is a plus.
  • Proven experience as a Software Engineer, with a minimum of 8 years working on software engineering, data pipelines, data integration, and database management.
  • Strong expertise in AWS services, including S3, Glue, Redshift, and other relevant data-related services.
  • Hands-on experience working with data integration tools, preferably Fivetran.
  • Hands-on experience working with orchestration services, preferably Airflow.
  • Proficiency in database management systems, such as Amazon Aurora and Snowflake.
  • Solid programming skills, primarily in Java (roughly 80% of the work) and Python (roughly 20%).
  • Knowledge of data modeling, ELT processes, and data warehousing concepts.
  • Familiarity with data quality and data governance principles.
  • Strong problem-solving skills and an ability to work independently and as part of a team.
  • Excellent communication skills to collaborate with cross-functional teams and articulate technical concepts to non-technical stakeholders.
  • AWS or other relevant certifications are a plus.
  • Experience working with distributed teams; must be a self-starter and proactive in communication.

Thanks
Vishnu Kiran Gadila
