Azure Data Platform Architect
Role: Azure Data Platform Architect
Work location: Las Vegas, US
Rate: $85/hr on C2C
Job Description:
Design Scalable Data Systems - Build secure, efficient, and scalable data solutions using Azure services such as Data Factory, Synapse, and Data Lake.
Work with Databricks - Design and optimize data pipelines in Databricks, using Delta Lake for reliable data processing (see the pipeline sketch after this list).
Build & Optimize Data Pipelines - Create fast, reliable ETL/ELT pipelines using Azure Data Factory, Databricks, and Spark.
Ensure Security & Compliance - Keep data secure and aligned with governance best practices using Azure security tooling.
Improve Performance - Tune Databricks clusters, Spark jobs, and SQL queries to reduce cost and improve speed (see the tuning sketch below).
Collaborate with Teams - Work closely with engineering, analytics, and business teams to design data solutions that support company goals.
Automate & Streamline - Use CI/CD pipelines and Infrastructure as Code (Terraform, Azure DevOps) to automate workflows.
Support AI/ML Projects - Help data scientists by designing efficient AI/ML data pipelines in Databricks (see the feature sketch below).
Migrate Legacy Systems - Lead data migration from legacy platforms such as Teradata and SQL Server to modern Azure-based solutions (see the migration sketch below).
Stay Updated & Innovate - Keep up with the latest Azure and Databricks developments and bring new ideas to improve existing systems.
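The sketch below is illustrative only and is not part of the original posting: a minimal PySpark/Delta Lake pipeline of the kind the Databricks and ETL/ELT responsibilities describe, assuming a Databricks-style environment with Delta Lake available. The storage path, column names, and the silver.orders table are hypothetical placeholders.

```python
# Illustrative sketch only (not from the posting): a minimal PySpark job that
# ingests raw landing-zone data and writes it to a Delta table.
# All paths, columns, and table names below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Read raw JSON dropped into the landing zone (e.g. by Azure Data Factory).
raw = spark.read.json("abfss://landing@<storage-account>.dfs.core.windows.net/orders/")

# Light transformations: normalize types, derive a partition column, drop bad rows.
clean = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .dropna(subset=["order_id", "order_ts"])
)

# Append to a Delta table, partitioned by date for efficient downstream reads.
(clean.write.format("delta")
      .mode("append")
      .partitionBy("order_date")
      .saveAsTable("silver.orders"))
```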
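Likewise illustrative, for the performance responsibility: one common tuning pattern is broadcasting a small dimension table to avoid a shuffle join, plus compacting a Delta table with OPTIMIZE/ZORDER (a Delta Lake command available on Databricks). Table and column names are hypothetical.

```python
# Illustrative tuning sketch only; table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

orders = spark.table("silver.orders")        # large fact table
customers = spark.table("silver.customers")  # small dimension table

# Broadcast the small dimension table so the join avoids a full shuffle.
enriched = orders.join(F.broadcast(customers), "customer_id")
enriched.write.format("delta").mode("overwrite").saveAsTable("gold.orders_enriched")

# Compact small files and co-locate rows on a frequent filter column
# (Delta Lake OPTIMIZE / ZORDER BY).
spark.sql("OPTIMIZE silver.orders ZORDER BY (customer_id)")
```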
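Also illustrative, for the AI/ML responsibility: a minimal feature-engineering step that aggregates a Delta table into a feature table data scientists can consume. The 90-day window and all names are assumptions, not requirements from the posting.

```python
# Illustrative feature-pipeline sketch only; all names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

orders = spark.table("silver.orders")

# Simple per-customer features over the last 90 days.
features = (
    orders.where(F.col("order_date") >= F.date_sub(F.current_date(), 90))
          .groupBy("customer_id")
          .agg(
              F.count("*").alias("order_count_90d"),
              F.sum("amount").alias("total_spend_90d"),
          )
)

features.write.format("delta").mode("overwrite").saveAsTable("gold.customer_features")
```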
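Finally, an illustrative sketch for the legacy-migration responsibility: reading a SQL Server table over JDBC and landing it as a Delta table. The connection string, credentials handling, and table names are hypothetical placeholders; a real migration would also cover incremental loads and validation.

```python
# Illustrative migration sketch only; connection details and names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read a legacy SQL Server table over JDBC.
legacy = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://<legacy-host>:1433;databaseName=sales")
    .option("dbtable", "dbo.orders")
    .option("user", "<user>")
    .option("password", "<password>")  # in practice, pull from a secret scope / Key Vault
    .load()
)

# Land the data as a Delta table in the lake for downstream modernization.
legacy.write.format("delta").mode("overwrite").saveAsTable("bronze.legacy_orders")
```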
Skills: Databricks Data Intelligence Platform, ETL, IT deployments, Microsoft Azure Data Factory, PySpark, business intelligence, data analysis, data intelligence, data migration, data mining, data processing, data science, information technology, Python data science, systems migrations, technology
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions, and AI may have been used to create this description. The position description has been reviewed for accuracy, and Dice believes it correctly reflects the job opportunity.