Sr Snowflake Engineer - C2C - Remote - No OPT & CPT
Key Skills:
Snowflake, SQL, Airflow, UNIX, ETL, Data Warehousing, Shell Scripting, Performance Optimization, Troubleshooting, Cloud, Python, Git, Agile
Job Overview: We are looking for an experienced Snowflake Data Engineer with strong skills in SQL, Airflow, and UNIX to join our data engineering team. In this role, you will design, implement, and optimize Snowflake-based data solutions, using SQL for data transformations, Airflow for orchestrating workflows, and UNIX for system management and automation.
Key Responsibilities:
Design, build, and maintain Snowflake data warehouses and ETL pipelines.
Develop and optimize SQL queries for data extraction, transformation, and reporting.
Create and manage DAGs (Directed Acyclic Graphs) using Airflow for data workflows.
Implement best practices for Snowflake performance tuning, cost optimization, and scalability.
Manage UNIX-based environments, perform shell scripting, and automate system processes.
Collaborate with data analysts, data scientists, and other stakeholders to deliver high-quality data solutions.
Troubleshoot and resolve performance issues in Snowflake, Airflow, and UNIX environments.
Required Skills:
Snowflake expertise: In-depth understanding of Snowflake architecture, security, and performance optimization.
SQL proficiency: Strong ability to write and optimize complex SQL queries for data manipulation and reporting.
Airflow experience: Hands-on experience with designing, building, and maintaining data pipelines using Apache Airflow.
UNIX skills: Experience in UNIX/Linux environments, with proficiency in shell scripting and system administration.
Experience with ETL processes and data warehousing concepts.
Ability to analyze and solve complex technical issues across different environments.
Preferred Qualifications:
Experience with cloud services (AWS, Azure, Google Cloud Platform) for deploying and managing Snowflake environments.
Familiarity with Python or other scripting languages for data pipeline automation.
Knowledge of version control systems (e.g., Git).
Experience working in an Agile development environment.
Education & Experience:
Bachelor's degree in Computer Science, Information Systems, or a related field.
15 years of experience in Snowflake development, SQL, Airflow, and UNIX.