
Lead Data Engineer

  • Full Time, onsite
  • Nityo Infotech Corporation
  • Hybrid, United States of America
Salary undisclosed


Job Title - Lead Data Engineer

Location - Irvine/Los Angeles, CA (Local Only)

Duration - 12+ Months

Job Type - Contract

Job Description:

We are seeking a highly skilled and experienced Lead Data Engineer/Architect to join our dynamic team. The ideal candidate will have a strong background in data engineering, with extensive experience in Python, PySpark, AWS services, PostgreSQL, Apache Airflow, and Docker.

This role requires a professional with a proven track record of designing, implementing, and maintaining robust data pipelines and architectures.

Key Responsibilities:

  • Design and Develop Data Pipelines: Create and maintain scalable data pipelines using Python and PySpark to process large volumes of data efficiently.
  • Cloud Integration: Utilize AWS services (such as S3, CloudWatch, ECS, ECR, Lambda) to build and manage cloud-based data solutions.
  • Database Management: Design, implement, and optimize PostgreSQL databases to ensure high performance and reliability.
  • Workflow Orchestration: Use Apache Airflow to schedule and monitor complex data workflows.
  • Containerization: Implement and manage Docker containers to ensure consistent and reproducible environments for data processing tasks.
  • Data Quality and Governance: Ensure data quality, integrity, and security across all data pipelines and storage solutions.
  • Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions that meet business needs.
  • Mentorship: Provide guidance and mentorship to junior data engineers and contribute to the continuous improvement of the team's skills and processes.
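As a rough illustration of the data-quality responsibility above, here is a minimal sketch of a record-level validation gate in plain Python. The field names (`id`, `event_time`, `amount`) are hypothetical; in a production pipeline this logic would run inside PySpark jobs orchestrated by Airflow rather than over plain dictionaries.

```python
# Minimal, hypothetical sketch of a record-level data-quality gate.
# Field names are invented for illustration only.

REQUIRED_FIELDS = ("id", "event_time", "amount")

def validate_record(record: dict) -> list:
    """Return a list of data-quality violations for one record."""
    errors = []
    for field in REQUIRED_FIELDS:
        if record.get(field) is None:
            errors.append(f"missing field: {field}")
    # Example domain rule: amounts must be non-negative.
    if isinstance(record.get("amount"), (int, float)) and record["amount"] < 0:
        errors.append("amount must be non-negative")
    return errors

def split_valid_invalid(records):
    """Partition records into (valid, invalid) for load vs. quarantine."""
    valid, invalid = [], []
    for r in records:
        (invalid if validate_record(r) else valid).append(r)
    return valid, invalid
```

Routing invalid rows to a quarantine location instead of failing the whole batch is one common design choice here, since it preserves pipeline throughput while keeping bad data auditable.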

Qualifications:

Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Experience: 10+ years of overall experience in data engineering or related fields.

Technical Skills:

  • Proficiency in Python and PySpark.
  • Extensive experience with AWS services (S3, CloudWatch, ECS, ECR, Secrets Manager, Cloud9 IDE).
  • Strong knowledge of PostgreSQL and database optimization techniques.
  • Hands-on experience with Apache Airflow for workflow orchestration.
  • Proficiency in Docker for containerization.

Soft Skills:

  • Excellent problem-solving and analytical skills.
  • Strong communication and collaboration abilities.
  • Ability to work in a fast-paced, dynamic environment.

Preferred Qualifications:

  • Familiarity with CI/CD pipelines and DevOps practices.
  • Knowledge of data warehousing concepts and ETL processes.

Thanks & Regards

Dilshadul Karim

(Technical Recruiter)
