Snowflake and Data Engineer - Remote / Telecommute
Salary undisclosed
We are looking for a Snowflake and Data Engineer - Remote / Telecommute for our client in Dallas, TX.
Job Title: Snowflake and Data Engineer - Remote / Telecommute
Job Type: Contract
Job Description:
- Develop and maintain data models in dbt following modular, scalable design principles.
- Design, develop, and maintain scalable data pipelines using Snowflake and dbt.
- Build and optimize pipelines on Snowflake, ensuring performance, cost efficiency, and scalability.
- Collaborate with cross-functional teams to understand data requirements.
- Implement data quality checks and testing in dbt.
- Implement and optimize Airflow workflows for automation and scheduling of data processes.
- Maintain documentation for data models, transformations, and processes.
- Develop and maintain CI/CD pipelines for data pipeline deployment and version control.
- Troubleshoot and resolve issues related to data pipelines.
Requirements:
- Proven experience with Snowflake, dbt, Airflow, and CI/CD pipelines.
- Strong knowledge of SQL and data warehousing concepts.
- Experience in designing and optimizing ETL/ELT processes.
- Familiarity with cloud platforms (e.g., AWS).
- Strong problem-solving skills and the ability to work in a fast-paced environment.
- Excellent communication and collaboration skills.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.