Lead Snowflake Architect, Data Engineering
New York, NY
We are seeking an experienced Data Engineering Architect specializing in Snowflake and Apache Spark, with a deep understanding of data architectures and large-scale data processing systems and a proven track record of designing and implementing robust, scalable data solutions using Snowflake, Apache Spark, and cloud platforms such as AWS, Azure, and Google Cloud Platform.
The Opportunity
- Engage with stakeholders to define project scopes, goals, and deliverables aligned with business objectives.
- Architect and implement data solutions on Snowflake across multiple cloud platforms: AWS, Azure, and Google Cloud Platform.
- Develop and maintain data lakehouse architectures ensuring optimal data storage, processing, and retrieval.
- Deliver high-quality presentations and case studies on data architecture solutions and business impacts.
- Lead code reviews and solution designs, and ensure best practices in data management and processing.
- Stay abreast of industry trends and advancements in data technology, AI, and machine learning.
- Oversee and support the design and development of scalable ETL and ELT pipelines using Apache Spark and Snowflake for multiple clients and projects.
- Oversee delivery, and mentor and guide team members to ensure the highest quality of deliverables on Snowflake-based projects.
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- At least 10 years of experience in data engineering with a strong focus on Snowflake and Apache Spark.
- IT consulting experience in the Banking and Financial Services space is highly preferred.
- Extensive experience in building and managing ETL/ELT pipelines.
- Proficiency in Snowpark/PySpark and hands-on experience with SQL and Python.
- Demonstrated experience with AI and ML concepts.
- Strong foundation in data warehousing and other data architectures.
- Excellent verbal and written communication skills with the ability to articulate complex technical ideas.
- Proven ability to develop detailed presentations and effectively communicate technical solutions to stakeholders.
- Flexibility to travel as required for project needs.
- Experience with and exposure to AWS, Azure, or Google Cloud Platform.
- Expert architect with experience creating architectures and designs for high-volume, low-latency, cloud-native 12-factor applications in AWS/Azure.
- Experience with application delivery through DevSecOps and agile development processes leveraging CI/CD pipelines.
- Experience applying cloud access security brokers (CASBs) and industry-standard cloud-native authentication mechanisms (OAuth, OpenID, SAML, etc.).
Additional:
- Snowflake Advanced certifications or similar certifications in Apache Spark, Python, or SQL.
- Technical leadership experience or previous architectural roles in data projects.
- Knowledge and understanding of AI and Machine Learning.