
Databricks Administrator/Architect - HYBRID

  • Full Time, hybrid
  • Chandra Technologies, Inc.
  • Hybrid, United States of America
Salary undisclosed


Job Description:

Corp-to-Corp resumes are accepted.

Location Requirement: All candidates must be local to the Triangle region of North Carolina; the position may require up to 1-2 days per month in a Triangle-area office for meetings.

The NCDIT-Transportation Database Team seeks a Databricks Administrator/Architect with proven skills for a 12-month engagement covering the creation, tuning, and support of the Databricks environment. This position will be responsible for designing and developing the Databricks environment at NCDIT-T. This individual will work with internal staff to plan, design, and maintain the Databricks environment and recommend the changes needed to accommodate growth as business needs dictate. This individual will facilitate changes through DIT-T's change process and work closely with the DBA and development staff on all aspects of the design and planning of the Databricks environment.

Responsibilities:

  • Provide mentorship, guidance, overall knowledge share, and support to team members, promoting continuous learning and development.
  • Oversee the design, implementation, and maintenance of Databricks clusters.
  • Ensure the platform's scalability, performance, and security.
  • Provide escalated support and troubleshooting to users.
  • Oversee maintenance of role-based access to data and features in the Databricks Platform using Unity Catalog.
  • Review cluster health checks and best-practices implementation.
  • Review and maintain documentation for users and administrators.
  • Design and implement tailored data solutions to meet customer needs and use cases, spanning API data ingestion, data pipelines, analytics, and beyond within a dynamically evolving technical stack.
  • Work on projects involving on-prem data ingestion into Azure using ADF.
  • Build data pipelines based on the medallion architecture that clean, transform, and aggregate data from disparate sources.
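On Databricks, the medallion pipelines described above would typically be PySpark jobs writing Delta tables. As a minimal, plain-Python sketch of the layered flow (record shapes and names are hypothetical illustrations, not the agency's actual data):

```python
# Plain-Python sketch of a medallion (bronze/silver/gold) pipeline.
# On Databricks this would be PySpark over Delta tables; all names
# and record shapes here are hypothetical.

bronze = [  # raw ingested records, as landed (may contain bad rows)
    {"sensor": "A", "reading": "10"},
    {"sensor": "A", "reading": "14"},
    {"sensor": None, "reading": "7"},   # missing key -> dropped in silver
    {"sensor": "B", "reading": "bad"},  # unparseable -> dropped in silver
]

def to_silver(rows):
    """Clean and conform: drop rows with missing keys or unparseable values."""
    out = []
    for r in rows:
        if r["sensor"] is None:
            continue
        try:
            out.append({"sensor": r["sensor"], "reading": int(r["reading"])})
        except ValueError:
            continue
    return out

def to_gold(rows):
    """Aggregate cleaned rows to a business-level metric per sensor."""
    totals = {}
    for r in rows:
        totals[r["sensor"]] = totals.get(r["sensor"], 0) + r["reading"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'A': 24}
```

The same clean-then-aggregate split is what separates the silver and gold layers in a real Databricks pipeline, with Delta Lake providing the storage and transaction guarantees.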

Required Skills:

Extensive hands-on experience implementing Lakehouse architecture using the Databricks Data Engineering platform, SQL Analytics, Delta Lake, and Unity Catalog.

Strong understanding of relational and dimensional modeling.
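Dimensional modeling typically organizes data into fact and dimension tables joined in a star schema. A minimal plain-Python sketch of that join-and-aggregate pattern (all table and column names are hypothetical illustrations):

```python
# Star-schema sketch: an additive fact table keyed to a dimension table.
# Table/column names are hypothetical, not from the actual engagement.
from collections import defaultdict

dim_route = {  # dimension: surrogate key -> descriptive attributes
    1: {"route_name": "I-40", "region": "Triangle"},
    2: {"route_name": "US-70", "region": "Coastal"},
}

fact_traffic = [  # fact: foreign key + additive measure
    {"route_key": 1, "vehicle_count": 1200},
    {"route_key": 1, "vehicle_count": 800},
    {"route_key": 2, "vehicle_count": 500},
]

def vehicles_by_region(facts, dim):
    """Join facts to the dimension and sum the measure by region."""
    totals = defaultdict(int)
    for row in facts:
        region = dim[row["route_key"]]["region"]
        totals[region] += row["vehicle_count"]
    return dict(totals)

print(vehicles_by_region(fact_traffic, dim_route))
# {'Triangle': 2000, 'Coastal': 500}
```

In SQL this is the classic fact-to-dimension join with a GROUP BY on a dimension attribute; the dimension carries the descriptive context, the fact carries the numbers.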

Demonstrated proficiency in Python, SQL, and PySpark, with the ability to prioritize performance, security, scalability, and robust data integrations.

Experience implementing serverless real-time/near-real-time architectures using a cloud platform (e.g., Azure, AWS, or Google Cloud Platform) and Spark technologies (Streaming and ML).

Experience with Azure infrastructure configuration (networking), architecting and building large data-ingestion pipelines, and conducting data migrations using ADF or similar technologies.

Experience working with SQL Server features such as SSIS and CDC.
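CDC (change data capture) produces an ordered feed of insert/update/delete events that must be applied to a target table; on Databricks this is usually a Delta Lake MERGE. A plain-Python sketch of the apply step (event shapes and keys are hypothetical):

```python
# Plain-Python sketch of applying a CDC feed to a target table.
# On Databricks this pattern is normally a Delta Lake MERGE INTO;
# event shapes and key values here are hypothetical.

def apply_cdc(target, events):
    """Apply insert/update/delete events, keyed by primary key, in order."""
    for ev in events:
        if ev["op"] in ("insert", "update"):
            target[ev["key"]] = ev["data"]   # upsert the latest row image
        elif ev["op"] == "delete":
            target.pop(ev["key"], None)      # remove the row if present
    return target

table = {101: {"status": "open"}}
feed = [
    {"op": "update", "key": 101, "data": {"status": "closed"}},
    {"op": "insert", "key": 102, "data": {"status": "open"}},
    {"op": "delete", "key": 101},
]
print(apply_cdc(table, feed))  # {102: {'status': 'open'}}
```

Event order matters: the delete of key 101 lands after its update, so only key 102 survives, which is exactly the semantics a MERGE over a correctly ordered CDC feed must preserve.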

Experience with Databricks platform, security features, Unity Catalog, and data access control mechanisms.

Experience with Git code-versioning software.

Desired Skills:

Databricks Certifications
