Data Architect IV - No C2C

  • Full Time, On Site
  • CQuent Systems, Inc.
  • Washington DC, United States of America
Salary undisclosed



Onsite: Washington DC

No C2C

1. Scope of work

The scope of work includes the following activities:

  • Create reference architectures and standards to enable effective solution delivery.
  • Implement a modern data lakehouse architecture using Delta Lake, ensuring data reliability and performance.
  • Design scalable data pipelines using Azure Databricks for ETL/ELT processes and real-time data integration (an illustrative sketch follows this list).
  • Define data models for medallion architectures.
  • Design solutions for performance and cost optimization leveraging cloud technologies.
  • Design automation solutions for reusability and consistency.
  • Define approaches for data self-service leveraging Power BI and Databricks.
  • Establish Power BI integration patterns.
  • Collaborate with Data Product Managers to determine business-specific application needs.
  • Compile and implement application development plans for new or existing applications.
  • Lead the application development team and supervise the design, testing, and modification stages.
  • Demonstrate application prototypes and integrate user feedback.
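
For illustration of the kind of lakehouse work described above, the following is a minimal PySpark sketch of a bronze-to-silver step on Azure Databricks with Delta Lake. All paths, table names, and columns are hypothetical placeholders, not details from this posting.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # a SparkSession is provided automatically on Databricks

    # Bronze: land raw files as-is into an append-only Delta table (hypothetical path and table).
    raw = (spark.read.format("json")
           .load("abfss://landing@examplelake.dfs.core.windows.net/orders/"))
    (raw.withColumn("_ingested_at", F.current_timestamp())
        .write.format("delta").mode("append")
        .saveAsTable("bronze.orders"))

    # Silver: cleanse and de-duplicate the bronze data for downstream modeling.
    silver = (spark.table("bronze.orders")
              .dropDuplicates(["order_id"])          # assumes an order_id business key
              .filter(F.col("order_total") >= 0)     # example data-quality rule
              .withColumn("order_date", F.to_date("order_ts")))  # assumes an order_ts column
    (silver.write.format("delta").mode("overwrite")
           .option("overwriteSchema", "true")
           .saveAsTable("silver.orders"))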

2. Skills and experience required

  • Minimum of 15 years' experience in Data Warehousing, Data Engineering, and Analytics.
  • Minimum of 8 years of cloud data architecture experience, with at least 4 years on Azure.
  • Expertise in Cloud Data Architecture and Azure Databricks: roughly 5-7 years of hands-on data experience including at least one large program delivered end to end (15 years of experience overall); must have worked as a Data Architect/Data Engineer and remain hands-on with current technology as well as delivery.
  • High proficiency in developing complex ETL data pipelines using Databricks with PySpark, and in managing and maintaining a Data Lake and Lakehouse (CDC and SCD); see the illustrative sketch after this list. Experience with Unity Catalog is preferred.
  • Experience integrating with Power BI and defining the approach for data self-service.
  • Experience developing progressive information management solutions, with end-to-end development life-cycle support and SDLC processes.
  • Ability to articulate clear objectives and qualitative/quantitative measures of success.
  • Extensive experience in aligning application development with business needs.
  • Exceptional analytical and problem-solving skills.
  • Excellent communication and presentation skills, a confident and methodical approach, and the ability to work within a team environment.
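
For illustration of the CDC and SCD handling mentioned above, the following is a minimal PySpark sketch that applies a batch of change records to a Delta table with MERGE (a Type 1 upsert; a Type 2 dimension would additionally track effective and expiry dates). Table and column names are hypothetical, not details from this posting.

    from pyspark.sql import SparkSession
    from delta.tables import DeltaTable

    spark = SparkSession.builder.getOrCreate()  # a SparkSession is provided automatically on Databricks

    # Incoming change records, e.g. a CDC batch landed in the bronze layer (hypothetical table).
    changes = spark.table("bronze.customer_changes")

    # Target Delta table in the silver layer (hypothetical table).
    target = DeltaTable.forName(spark, "silver.customers")

    (target.alias("t")
        .merge(changes.alias("s"), "t.customer_id = s.customer_id")
        .whenMatchedDelete(condition="s.op = 'D'")          # apply deletes from the change feed
        .whenMatchedUpdateAll(condition="s.op <> 'D'")      # apply updates
        .whenNotMatchedInsertAll(condition="s.op <> 'D'")   # apply inserts
        .execute())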
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.