Data Engineer
Salary undisclosed
Vital Care (www.vitalcare.com) is the premier pharmacy franchise business with franchises serving a wide range of patients, including those with chronic and acute conditions. Since 1986, our passion has been improving the lives of patients and healthcare professionals through locally-owned franchise locations across the United States. We have over 100 franchised Infusion pharmacies and clinics in 35 states, focusing on the underserved and secondary markets. We know infusion services, and we guide owners along the path of launch, growth, and successful business operations.
The Data Engineer will be responsible for the development and implementation of technology and data solutions to meet current and future data warehousing and reporting needs. This experienced engineer will collaborate with the Technical Lead and business partners to build and support the company's data warehouse, ETL, and reporting platforms.
What we offer:
- Comprehensive medical, dental, and vision plans, plus flexible spending, and health savings accounts.
- Paid time off, personal days, and company-paid holidays.
- Paid Parental Leave.
- Volunteerism Days off.
- Income protection programs include company-sponsored basic life insurance and long-term disability insurance, as well as employee-paid voluntary life, accident, critical illness, and short-term disability insurance.
- 401(k) matching and tuition reimbursement.
- Employee assistance programs, including mental health, financial, and legal services.
- Rewards programs offered by our medical carrier.
- Professional development and growth opportunities.
- Employee Referral Program.
Duties/Responsibilities:
Data Architecture and Design:
- Implement scalable and efficient data pipelines to ingest, process, and transform large volumes of structured and unstructured data from multiple sources.
- Define and implement data models, schemas, and storage solutions that optimize performance, reliability, and scalability.
- Implement data storage technologies, frameworks, and platforms based on the organization's requirements and objectives.
- Develop and maintain robust, fault-tolerant data pipelines using ETL (Extract, Transform, Load) processes.
- Implement data validation, testing, error handling, and monitoring mechanisms to ensure data quality and integrity throughout the pipeline lifecycle.
- Automate data workflows, scheduling, and orchestration tasks using tools such as Apache Airflow, Luigi, or similar frameworks.
- Integrate data from various internal and external sources, including databases, APIs, logs, and streaming platforms, to support business analytics, reporting, and machine learning initiatives.
- Transform raw data into meaningful insights and actionable information through data cleansing, enrichment, normalization, and aggregation techniques.
- Collaborate with business partners to translate business requirements into technical specifications and data processing workflows.
- Optimize the performance and efficiency of data pipelines, storage systems, and processing engines to meet service level agreements (SLAs) and performance targets.
- Identify and resolve performance bottlenecks, scalability issues, and resource constraints through system tuning, caching strategies, and infrastructure scaling.
- Implement and enforce data security controls, encryption mechanisms, and access policies to protect sensitive information and ensure compliance with regulatory requirements.
- Monitor data access patterns, audit trails, and user activities to detect and mitigate potential security threats and data breaches.
- Document data pipelines, workflows, and technical specifications to facilitate knowledge sharing, collaboration, and troubleshooting.
Required Skills/Abilities:
- Proficiency in SQL and other programming languages commonly used with data processing frameworks.
- Strong understanding of distributed systems, data modeling, database design principles, and performance optimization techniques.
- Hands-on experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services (e.g., S3, Redshift, BigQuery, Dataflow).
- Familiarity with containerization and orchestration tools like Docker, Kubernetes, and infrastructure-as-code (IaC) concepts.
- Excellent problem-solving skills, attention to detail, and the ability to work effectively in a fast-paced, collaborative environment.
- Strong communication and interpersonal skills, with the ability to communicate complex technical concepts to non-technical stakeholders and collaborate effectively with cross-functional teams.
- Preferred: Experience with cloud-based ETL tools (Azure Data Factory, Fivetran, Airbyte).
- Preferred: Experience leveraging Data Transformation Tools (dbt, Matillion) and integrating them with Cloud Warehouses (Snowflake preferred).
Education and Experience:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- Minimum of 3–5 years of experience in data engineering, database development, or related roles, with a focus on designing and building data pipelines and infrastructure.
Physical Requirements:
- Sitting: Prolonged periods of sitting are typical, often for the majority of the workday.
- Keyboarding: Frequent use of a keyboard for typing and data entry.
- Reaching: Occasionally reaching for items such as files, documents, or office supplies.
- Fine Motor Skills: Precise movements of the fingers and hands for tasks like typing, using a mouse, and handling paperwork.
- Visual Acuity: Good vision for reading documents, computer screens, and other detailed work.
Vital Care Infusion Services is an equal-opportunity employer and values diversity at our company. We do not discriminate on the basis of color, race, sex, age, religion, national origin, disability, genetic information, gender identity, sexual orientation, veterans’ status, or any other basis protected by applicable federal, state, or local law.
Vital Care Infusion Services participates in E-Verify.
This position is full-time.