Sr. Interop Data Engineer
Location: Remote
Experience Required: 9+ Years
Education: Engineering Degree (BE/ME/BTech/MTech/BSc/MSc)
Technical certification in multiple technologies is desirable.

What's in it for you?

As a Sr. Interop Data Engineer, you'll join an Agile team focused on building healthcare applications and implementing new features. You'll play a key role in developing interoperable solutions, ensuring high performance and scalability while adhering to industry best practices.

Key Responsibilities:

  • Design, develop, and maintain APIs with a focus on .NET to ensure high performance and scalability.
  • Implement and manage interoperability standards such as HL7 V2/V3, ADT, FHIR, and CCDA.
  • Utilize Firely and Azure FHIR services for effective data integration and interoperability.
  • Collaborate with cross-functional teams to gather requirements and deliver business and technical solutions.
  • Ensure data quality, integrity, and security across all interoperability processes.
  • Provide technical leadership and mentorship to junior engineers.
  • Stay updated with industry trends and leverage new technologies to enhance interoperability practices.

Mandatory Skills:

  • Strong knowledge of HL7 V2/V3, ADT, FHIR, and CCDA standards.
  • Experience with integration engines like Mirth Connect.
  • Proficiency in developing and managing APIs (primarily Python, with minimal Node.js/React.js).
  • Experience with Firely and Azure FHIR services (a brief, illustrative FHIR API call follows this list).
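
To give a concrete sense of the API and FHIR work described above, the following is a minimal, illustrative Python sketch of reading a FHIR Patient resource over REST. The base URL, bearer token, and patient ID are hypothetical placeholders, not details from this posting; both Firely Server and Azure FHIR services expose standard FHIR REST endpoints that can be queried in this style.

    # Minimal sketch: read one FHIR Patient resource over REST.
    # The base URL, token, and patient ID are placeholders for illustration only.
    import requests

    FHIR_BASE = "https://example-fhir-service.azurehealthcareapis.com"  # hypothetical FHIR endpoint
    TOKEN = "<oauth2-bearer-token>"  # obtained separately, e.g. via Azure AD

    def get_patient(patient_id: str) -> dict:
        """Fetch a single Patient resource from the FHIR server as JSON."""
        resp = requests.get(
            f"{FHIR_BASE}/Patient/{patient_id}",
            headers={
                "Authorization": f"Bearer {TOKEN}",
                "Accept": "application/fhir+json",
            },
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()

    if __name__ == "__main__":
        patient = get_patient("example-patient-id")
        print(patient.get("resourceType"), patient.get("id"))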

Good to Have:

  • Expertise in .NET development.
  • Experience with API gateways such as Apigee Hybrid and DataPower.
  • Knowledge of GitHub in Azure Cloud and Azure Repos in Google Cloud Platform.
  • Experience building and managing pipelines using StreamSets Scheduler, CA Automic Dollar Universe 6, Cron Jobs, and Databricks Scheduler.
  • Familiarity with container orchestration using Kubernetes (k8s).
  • Proficiency in ETL functions using StreamSets or similar tools (e.g., Databricks).
  • Hands-on experience with databases such as MongoDB, Snowflake, and PostgreSQL.
  • Experience with event streaming tools like Confluent Kafka.