Google Cloud Platform Data Engineer
Salary undisclosed
We are seeking a highly skilled and experienced Data/Cloud Engineer who possesses a strong background in data engineering, cloud infrastructure (primarily Google Cloud Platform and Azure), ETL processes, and API development. This role requires expertise in managing and optimizing data pipelines and working seamlessly across multiple cloud and on-premises infrastructures.
Key Responsibilities:
- Design, build, and maintain scalable data pipelines using StreamSets, Azure Data Factory, and, where applicable, related Google Cloud Platform services.
- Work seamlessly across a multi-cloud environment, with 50% of infrastructure on Google Cloud Platform, 25% on Azure, and 25% on-premises.
- Manage and optimize API development, primarily in the .NET environment and, to a lesser extent, in Python.
- Develop and maintain event streaming services using Kafka and StreamSets (see the illustrative sketch after this list).
- Administer and work with databases including Snowflake, MongoDB, and FHIR servers (Firely and Azure FHIR).
- Ensure compliance with healthcare data standards such as UDAP, CDEX, FHIR, HL7, and ADT.
- Collaborate with cross-functional teams to gather requirements and deliver tailored data solutions that meet business and healthcare regulatory needs.
- Implement API gateways using tools such as Apigee and IBM DataPower.
- Provide technical leadership and mentorship to junior data engineers.
- Monitor, troubleshoot, and optimize the performance and scalability of data solutions.
- Stay updated with the latest trends and technologies in healthcare interoperability and cloud services.
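For context on the event-streaming responsibility above, here is a minimal illustrative sketch (not part of the role description) of publishing a simplified patient-admission event to Kafka from Python. The broker address, topic name, and payload shape are assumptions for illustration only; the pipelines described in this role are built with StreamSets and Azure Data Factory.

```python
# Illustrative sketch only: publish a simplified ADT admission event to Kafka.
# Assumes the kafka-python client, a local broker, and a hypothetical
# "adt.events" topic; real pipelines in this role use StreamSets / ADF.
import json
from datetime import datetime, timezone

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # assumed broker address
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

event = {
    "eventType": "ADT^A01",  # patient admission
    "patientId": "12345",    # example identifier only
    "timestamp": datetime.now(timezone.utc).isoformat(),
}

producer.send("adt.events", event)  # hypothetical topic name
producer.flush()
```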
Mandatory Skills:
- Expertise in working across multiple cloud infrastructures, with roughly 50% on Google Cloud Platform, 25% on Azure, and 25% on-premises.
- Proven experience with StreamSets, Azure Data Factory, and related ETL tools, including Databricks.
- Proficiency in managing API development primarily using .NET and familiarity with Python.
- API gateway management (e.g., Apigee, IBM DataPower).
- Hands-on experience with streaming technologies like Kafka and StreamSets.
- Working knowledge of healthcare data standards such as FHIR, HL7, and ADT.
- Expertise in databases such as Snowflake, MongoDB, and FHIR servers (Firely, Azure FHIR); see the sketch after this list.
- Understanding of Firely and Azure FHIR services.
- Strong understanding of container orchestration with Kubernetes (K8s) and Docker.
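As a small illustration of the FHIR server skills listed above, the sketch below reads a Patient resource over the standard FHIR REST API with Python. The base URL and patient id are placeholders, and authentication (which a real Firely or Azure FHIR endpoint would require) is omitted.

```python
# Illustrative sketch only: fetch a Patient resource from a FHIR server.
# The base URL and patient id are placeholders; a production Firely or
# Azure FHIR endpoint would also require an OAuth2 bearer token.
import requests

FHIR_BASE = "https://example.org/fhir"  # placeholder endpoint
patient_id = "12345"                    # placeholder id

resp = requests.get(
    f"{FHIR_BASE}/Patient/{patient_id}",
    headers={"Accept": "application/fhir+json"},
)
resp.raise_for_status()

patient = resp.json()
print(patient.get("resourceType"), patient.get("id"))
```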
Good to Have Skills:
- Familiarity with healthcare payor systems and regulatory standards such as HIPAA.
- Knowledge of programming languages such as .NET and Python, and of standards such as HL7 V2/V3, ADT, FHIR, and CCDA.
- Experience with other services in the Google Cloud Platform and Azure ecosystems.
Qualifications:
- 7 to 13 years of relevant experience in data engineering, with strong exposure to cloud services and healthcare interoperability.
- Strong analytical and problem-solving skills, with a proven ability to handle complex data systems.
- Excellent communication and collaboration abilities, especially in a multi-functional, multi-cloud environment.
- Proven track record of successfully leading and delivering complex data engineering projects.