Databricks Administrator/Architect
- Full Time, onsite
- RedSalsa Technologies, Inc.
- Hybrid; must be local to the Triangle region of NC, United States of America
- This role is hybrid and there will be occasional need to be onsite. Do you accept this requirement?
- ONLY SUBMIT CANDIDATES CURRENTLY LIVING IN THE RALEIGH/DURHAM/CHAPEL HILL, NC AREA. Please list where your candidate is currently located.
- **The candidate must come onsite on the first day to collect equipment.**
- **All candidates must be local to the Triangle region of North Carolina, and the position may require up to 1-2 days per month in a Triangle area office for meetings.**
The Database Team seeks a Databricks Administrator/Architect with proven skills for a 12-month engagement covering the creation, tuning, and support of the Databricks environment.
This position will be responsible for developing and designing the Databricks environment at the client site. This individual will work with internal staff to plan, design, and maintain the Databricks environment and recommend changes needed to accommodate growth as business needs dictate.
This individual will facilitate changes through the client's change process and work closely with the DBA and development staff on all aspects of the design and planning of the Databricks environment.
Responsibilities:
- Provide mentorship, guidance, overall knowledge sharing, and support to team members, promoting continuous learning and development.
- Oversee the design, implementation, and maintenance of Databricks clusters.
- Ensure the platform's scalability, performance, and security.
- Provide escalated support and troubleshooting to users.
- Oversee maintenance of role-based access to data and features in the Databricks platform using Unity Catalog.
- Review cluster health checks and best-practices implementation.
- Review and maintain documentation for users and administrators.
- Design and implement tailored data solutions to meet customer needs and use cases, spanning data ingestion from APIs, data pipeline development, analytics, and beyond within a dynamically evolving technical stack.
- Work on projects involving on-prem data ingestion into Azure using ADF.
- Build data pipelines based on the medallion architecture that clean, transform, and aggregate data from disparate sources.
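For candidates unfamiliar with the term, the medallion architecture mentioned above layers data as bronze (raw), silver (cleaned), and gold (aggregated). A minimal sketch in plain Python, with hypothetical sample records standing in for real sources (in Databricks these layers would typically be Delta tables written by PySpark jobs):

```python
# Bronze layer: raw records ingested as-is, duplicates and bad rows included.
# The field names and values here are illustrative only.
bronze = [
    {"id": 1, "region": "east", "amount": "100"},
    {"id": 1, "region": "east", "amount": "100"},  # duplicate
    {"id": 2, "region": "west", "amount": "250"},
    {"id": 3, "region": "east", "amount": None},   # bad row
]

# Silver layer: clean and conform - drop bad rows, deduplicate on id,
# and cast amount from string to float.
seen = set()
silver = []
for row in bronze:
    if row["amount"] is None or row["id"] in seen:
        continue
    seen.add(row["id"])
    silver.append({**row, "amount": float(row["amount"])})

# Gold layer: aggregate for analytics - total amount per region.
gold = {}
for row in silver:
    gold[row["region"]] = gold.get(row["region"], 0.0) + row["amount"]

print(gold)  # {'east': 100.0, 'west': 250.0}
```

In a Databricks environment, Unity Catalog would govern which roles can read each layer, which ties this responsibility back to the access-control duties above.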
Skills:
| Skill | Required/Desired | Experience |
| --- | --- | --- |
| Extensive hands-on experience implementing Lakehouse architecture using the Databricks Data Engineering platform, SQL Analytics, Delta Lake, and Unity Catalog | Required | 5 years |
| Strong understanding of relational and dimensional modeling | Required | 5 years |
| Proficiency in Python, SQL, and PySpark, prioritizing performance, security, scalability, and robust data integrations | Required | 6 years |
| Experience implementing serverless real-time/near-real-time architectures using a cloud platform (e.g., Azure, AWS, or Google Cloud Platform) and Spark technologies (Streaming and ML) | Required | 2 years |
| Experience with Azure infrastructure configuration (networking), architecting and building large data ingestion pipelines, and conducting data migrations using ADF or similar technology | Required | 4 years |
| Experience working with SQL Server features such as SSIS and CDC | Required | 7 years |
| Experience with the Databricks platform, security features, Unity Catalog, and data access control mechanisms | Required | 2 years |
| Experience with Git code versioning software | Required | 4 years |
| Databricks certifications | Desired | |