ETL Developer (Apache Airflow, AWS)

Salary undisclosed

Greetings,

TechProjects is a rapidly growing System Integration (SI) and IT staffing company dedicated to providing exceptional value for our clients' time and investment. We offer IT services to a wide array of commercial clients, from small businesses to large corporations, across diverse industries including supply chain & groceries, healthcare, and financial services. Additionally, we serve government entities at the city, state, and other public sector levels, with a particular focus on social services, school systems, and higher education.

Job Title: ETL Developer (Apache Airflow, AWS)

Location: New Jersey (Hybrid, 3 days onsite)

Duration: 12 Months

SQL and ETL experience are mandatory; dbt experience is good to have.

Job Description:

A dbt (data build tool) Developer is responsible for leveraging dbt to transform, model, and manage data within an organization's data analytics stack. dbt Developers play a crucial role in streamlining data workflows, ensuring data accuracy, and providing a structured foundation for data analysis and reporting.

Responsibilities:

1. Data Transformation and Modeling:

- Design and implement data transformations and models using dbt to create structured, cleaned, and aggregated datasets.

- Develop and maintain dbt models that accurately represent business logic and data requirements.
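To make this responsibility concrete, here is a minimal stdlib-Python sketch of the kind of clean-and-aggregate logic a dbt model typically expresses in SQL. The table and column names (`raw_orders`, `customer`, `amount`, `status`) are illustrative assumptions, not part of this posting:

```python
from collections import defaultdict

# Raw source rows, as a dbt "source" might expose them (hypothetical columns).
raw_orders = [
    {"order_id": 1, "customer": "acme",   "amount": 120.0, "status": "complete"},
    {"order_id": 2, "customer": "acme",   "amount": 80.0,  "status": "complete"},
    {"order_id": 3, "customer": "globex", "amount": 50.0,  "status": "cancelled"},
]

def model_customer_totals(rows):
    """Clean and aggregate raw orders into one row per customer,
    mirroring what a dbt model's filter + GROUP BY would produce."""
    totals = defaultdict(float)
    for row in rows:
        if row["status"] != "complete":  # drop cancelled orders before aggregating
            continue
        totals[row["customer"]] += row["amount"]
    return [{"customer": c, "lifetime_value": v} for c, v in sorted(totals.items())]
```

In an actual dbt project this logic would live in a `.sql` model file; the Python version is only meant to show the shape of the transformation.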

2. SQL Expertise:

- Write and optimize SQL queries within dbt to extract, manipulate, and join data from various sources (e.g., databases, APIs, flat files).

- Ensure SQL code follows best practices for readability, performance, and maintainability.
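As an illustration of those SQL best practices, the sketch below (using Python's built-in `sqlite3` and an invented `orders` table) shows a named CTE, an explicit column list, and a parameterized filter rather than string interpolation:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES (1, 'acme', 120.0), (2, 'acme', 80.0), (3, 'globex', 50.0);
""")

# A named CTE and explicit column list keep the query readable and
# self-documenting, much as a dbt model file would.
query = """
WITH customer_totals AS (
    SELECT customer, SUM(amount) AS total_spend
    FROM orders
    GROUP BY customer
)
SELECT customer, total_spend
FROM customer_totals
WHERE total_spend > ?      -- parameterized, never string-interpolated
ORDER BY total_spend DESC
"""
rows = conn.execute(query, (100.0,)).fetchall()
```

The same structure carries over to warehouse SQL (Snowflake, Redshift, BigQuery), where dbt compiles models into queries of this form.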

3. Version Control:

- Use version control systems (e.g., Git) to manage dbt codebase, enabling collaborative development and tracking changes over time.

- Collaborate with data engineers and analysts to coordinate code changes.

4. Testing and Documentation:

- Implement unit tests within dbt to verify the accuracy and reliability of data transformations.

- Document dbt models, data lineage, and transformations to facilitate understanding and collaboration.

5. Automation:

- Schedule and automate dbt runs to keep data models up-to-date and synchronized with source systems.

- Implement data orchestration and scheduling as needed.
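In this role the scheduling would normally be done in Apache Airflow (e.g. a task running `dbt build`). As a bare-bones, dependency-free sketch of the idea, the snippet below uses the stdlib `sched` module; the dbt command itself is a labeled assumption:

```python
import sched
import subprocess
import time

def run_dbt_build():
    # Hypothetical invocation: assumes the dbt CLI is installed and a
    # profiles.yml is configured. In production this step would typically
    # be an Airflow task instead.
    subprocess.run(["dbt", "build"], check=True)

def schedule_runs(action, interval_s, count):
    """Run `action` every `interval_s` seconds, `count` times: a bare-bones
    stand-in for the scheduling an orchestrator like Airflow provides."""
    scheduler = sched.scheduler(time.monotonic, time.sleep)
    for i in range(count):
        scheduler.enter(i * interval_s, 1, action)
    scheduler.run()
```

An orchestrator adds what this sketch lacks: retries, dependency ordering between tasks, backfills, and run history.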

6. Data Quality Assurance:

- Develop and enforce data quality checks and validations within dbt to identify and rectify data issues.

- Monitor data quality and integrity, responding to anomalies or discrepancies.
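Two common shapes of such checks are a minimum-row-count guard and an accepted-range check (similar in spirit to dbt's range-style tests). A small sketch with invented data:

```python
def check_row_count(rows, min_rows):
    """Flag an anomaly when a model produces suspiciously few rows."""
    return len(rows) >= min_rows

def check_accepted_range(rows, column, low, high):
    """Return the offending rows whose `column` falls outside [low, high]."""
    return [r for r in rows if not (low <= r[column] <= high)]

daily_revenue = [
    {"day": "2024-01-01", "revenue": 1200.0},
    {"day": "2024-01-02", "revenue": -50.0},  # negative revenue: likely a data issue
]
bad_rows = check_accepted_range(daily_revenue, "revenue", 0.0, 1e9)
```

In practice the failing rows would be surfaced by `dbt test` or an alerting job rather than inspected by hand.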

7. Performance Optimization:

- Optimize dbt models and queries for performance, identifying and addressing bottlenecks.

- Analyze and fine-tune data processing pipelines to meet performance requirements.
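One routine optimization is adding an index on a frequently filtered column. The sketch below uses Python's built-in `sqlite3` and an invented `events` table to show the query plan switching from a full scan to an index search; warehouse engines expose analogous `EXPLAIN` tooling:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (event_id INTEGER, user_id INTEGER, ts TEXT)")

def plan_for(sql):
    """Return SQLite's query plan as a single string."""
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

# Without an index, filtering on user_id forces a full table scan.
before = plan_for("SELECT * FROM events WHERE user_id = 42")

# Adding an index on the filtered column lets the engine seek directly.
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
after = plan_for("SELECT * FROM events WHERE user_id = 42")
```

Reading plans before and after a change is the reliable way to confirm a bottleneck was actually removed, rather than guessing from runtimes alone.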

8. Collaboration:

- Collaborate closely with data engineers, data analysts, and business stakeholders to understand data requirements and deliver data solutions.

- Participate in cross-functional teams and contribute to data-related projects.

9. Security and Compliance:

- Ensure data security and compliance with relevant data protection regulations (e.g., GDPR, HIPAA) through appropriate data handling practices.

10. Knowledge Sharing:

- Share knowledge of dbt best practices and data modeling techniques with team members.

- Provide training and support to data analysts and other users of dbt.

Qualifications:

  • Bachelor's or Master's degree in computer science, data science, or a related field.
  • Strong proficiency in SQL and experience working with relational databases.
  • 3+ years of experience using dbt for data transformation and modeling in a data warehouse environment (e.g., Snowflake, BigQuery, Redshift).
  • 7+ years of experience building business rules using a business rules engine similar to dbt.
  • Familiarity with version control systems (e.g., Git) and code collaboration workflows.
  • Excellent data analysis and problem-solving skills.
  • Strong attention to detail and a commitment to data quality.
  • Understanding of data warehousing concepts and best practices.
  • Knowledge of data governance and data security principles.
  • Effective communication and collaboration skills to work with diverse teams.
  • Experience with other data tools and languages (e.g., Python, R, Looker) is a plus.