Google Cloud Platform Data Engineer - W2 Only

  • Full Time, onsite
  • Infinite Computer Solutions (ICS)
  • On Site, United States of America
Salary undisclosed


Hi,

Role: Google Cloud Platform Data Engineer

Location: Dallas, TX (onsite)

Job Summary

The Data Engineer (Google Cloud Platform BigQuery specialist) will be responsible for designing and implementing efficient data pipelines that extract, transform, and load (ETL) data from various relational databases into Google Cloud BigQuery. The ideal candidate has strong SQL optimization skills to ensure fast and efficient data processing and movement, and must be comfortable working in a multi-vendor, dynamic environment and collaborating with cross-functional teams.

Key Responsibilities

  1. Data Pipeline Development and Optimization
  • Design and build scalable ETL/ELT data pipelines to transfer data from relational databases (SQL Server, PostgreSQL, MySQL, Oracle, etc.) to Google Cloud Platform BigQuery.
  • Implement best practices for efficient data ingestion using tools such as Apache Airflow, Dataflow, Cloud Composer, and BigQuery Data Transfer Service.
  • Optimize SQL queries for high-performance data movement and processing in BigQuery.
  2. Performance Tuning and Optimization
  • Write and optimize complex SQL queries to ensure fast and cost-efficient data transformations.
  • Monitor and fine-tune BigQuery performance using clustering, partitioning, and query execution plans.
  • Implement best practices for cost optimization on BigQuery, reducing query costs and improving efficiency.
  3. Data Architecture and Schema Design
  • Design and implement schema models in BigQuery, including denormalization techniques, partitioning, and clustering strategies.
  • Collaborate with Data Architects to ensure scalability, reliability, and security of data architectures.
  • Work with business teams to define data models that align with analytics and reporting needs.
  4. Data Integration and Transformation
  • Develop real-time and batch data processing solutions using Cloud Dataflow, Pub/Sub, and Cloud Functions.
  • Implement data cleansing, enrichment, and transformation using SQL, Python, or Apache Beam.
  • Integrate data from multiple sources (structured, semi-structured, unstructured) into BigQuery.
  5. Cloud Infrastructure & Security
  • Manage Google Cloud Platform IAM roles and permissions to ensure secure access to data.
  • Implement data governance and compliance policies for handling sensitive data in BigQuery.
  • Utilize Google Cloud's operations suite (formerly Stackdriver), including Cloud Logging and Cloud Monitoring, for troubleshooting and performance monitoring.
  6. Collaboration in a Multi-Vendor, Dynamic Work Environment
  • Work in a fast-paced environment with multiple vendors, cloud service providers, and technology partners.
  • Collaborate with data analysts, data scientists, and business intelligence teams to understand requirements and provide efficient data solutions.
  • Coordinate with cross-functional teams across different geographical locations and time zones.
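
As a toy illustration of the batch ETL work described in the responsibilities above, here is a minimal sketch in plain Python. The source rows, function names, and in-memory sink are all invented stand-ins for a real database connection and a BigQuery load job, not part of any actual system named in this posting:

```python
# Minimal ETL sketch: extract rows from a (simulated) relational source,
# transform them, and batch-load them into a target table.
# extract_rows, load_batch, and the sample data are illustrative only.

def extract_rows():
    """Stand-in for a SELECT against a source database."""
    return [
        {"id": 1, "amount": "10.50", "region": " us-east "},
        {"id": 2, "amount": "3.25", "region": "eu-west"},
    ]

def transform(row):
    """Cleanse and type-convert a single row."""
    return {
        "id": row["id"],
        "amount": float(row["amount"]),   # cast text to numeric
        "region": row["region"].strip(),  # trim stray whitespace
    }

def load_batch(rows, sink):
    """Stand-in for a batch insert into BigQuery."""
    sink.extend(rows)

def run_pipeline(sink):
    """Run extract -> transform -> load end to end."""
    load_batch([transform(r) for r in extract_rows()], sink)
    return sink
```

In a real pipeline each stage would be an Airflow/Cloud Composer task, with the load step handled by a BigQuery load job or the Data Transfer Service.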

Required Qualifications

Education & Experience

  • Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
  • 5+ years of experience in data engineering, ETL pipeline development, and SQL optimization.
  • 3+ years of experience with Google Cloud Platform (GCP), specifically BigQuery.

Technical Skills

  • Expertise in SQL optimization for BigQuery and relational databases.
  • Proficiency in ETL/ELT development using Cloud Composer (Apache Airflow), Dataflow (Apache Beam), and the BigQuery Data Transfer Service.
  • Experience with Python, Shell scripting, or Java for automation and data transformation.
  • Strong knowledge of data partitioning, clustering, and materialized views in BigQuery.
  • Familiarity with data orchestration tools like Apache Airflow, Cloud Composer, or dbt (Data Build Tool).
  • Hands-on experience with Google Cloud Platform services such as Cloud Storage, Pub/Sub, Cloud Functions, and IAM.
  • Understanding of data warehouse design, star schema modeling, and data normalization/denormalization.
  • Experience with data quality frameworks and best practices for data validation and monitoring.
  • Working knowledge of version control tools like GitHub, Bitbucket, or GitLab.
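
The partitioning and clustering skills listed above come down to DDL like the following. This is a small sketch that only assembles the statement as a string, since actually running it would require a BigQuery client and credentials; the table and column names (`analytics.events`, `event_date`, `customer_id`) are hypothetical:

```python
# Build a BigQuery CREATE TABLE statement that date-partitions and
# clusters a table. Names are invented for illustration.

def partitioned_table_ddl(table, partition_col, cluster_cols):
    """Return DDL using date partitioning and clustering."""
    cols = ", ".join(c + " STRING" for c in cluster_cols)
    return (
        f"CREATE TABLE {table} (\n"
        f"  {partition_col} DATE,\n"
        f"  {cols}\n"
        f")\n"
        f"PARTITION BY {partition_col}\n"
        f"CLUSTER BY {', '.join(cluster_cols)}"
    )

print(partitioned_table_ddl("analytics.events", "event_date", ["customer_id"]))
```

Partitioning prunes the data scanned per query by date, and clustering sorts storage within each partition by the listed columns, which is where most BigQuery cost savings come from.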

Soft Skills

  • Strong problem-solving skills and the ability to troubleshoot performance issues.
  • Ability to work in a fast-paced, dynamic, and multi-vendor work environment.
  • Excellent communication skills to work effectively with stakeholders across teams.
  • Ability to multitask and manage multiple projects simultaneously.

Additional Qualifications (Must Have)

  • Google Cloud Platform Professional Data Engineer Certification or BigQuery-related certifications.
  • Experience with NoSQL databases (Bigtable, MongoDB, etc.).
  • Knowledge of data streaming technologies (Kafka, Apache Flink, or Google Cloud Pub/Sub).
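
As a hedged sketch of the streaming path these technologies enable, the consumer below turns a Pub/Sub-style JSON message into a flat, BigQuery-ready row. The payload shape and field names are invented for illustration:

```python
import json

# Toy consumer for a Pub/Sub-style JSON message. A real subscriber
# would receive this payload via a streaming pull or a Dataflow source.

def handle_message(payload: bytes) -> dict:
    """Decode one message, apply light enrichment, return a flat row."""
    event = json.loads(payload)
    return {
        "user_id": event["user_id"],
        "event_type": event.get("type", "unknown"),      # tolerate missing field
        "is_purchase": event.get("type") == "purchase",  # derived flag
    }
```

The same decode-and-enrich step would sit inside a Beam `DoFn` in a Dataflow streaming pipeline, writing rows out via the BigQuery streaming insert API.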

Preferred Qualifications (Nice to Have)

  • Experience with BI Tools (Looker, Tableau, Power BI, or Google Data Studio).
  • Familiarity with machine learning pipelines using BigQuery ML or Vertex AI.

Employers have access to artificial intelligence language tools ("AI") that help generate and enhance job descriptions, and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.