ML (Machine Learning) Engineers
ML (Machine Learning) Engineers: Unity Catalog / Databricks / Feature Store / Java 11 / Python / Kubernetes
The ML Engineers will support 3 web services applications with the following tech stack: Java 11 / Python / Azure / AKS / APIM
Seattle-based client
Hybrid: 3 days per week in the Seattle, WA office
Visa: USC / GC
What We Are Looking For:
- Strong experience with Unity Catalog in Databricks for managing data assets and access control
- Hands-on experience working with Databricks Feature Store or similar solutions
- Knowledge of building and maintaining scalable ETL pipelines in Databricks
- Familiarity with Azure tools like Azure Cosmos DB and ACR
- Understanding of machine learning workflows and how feature stores fit into the pipeline
- Strong problem-solving skills and a collaborative mindset
- Proficiency with Java
- Proficiency with Python and Spark for data engineering tasks
- Experience with monitoring tools like Splunk or Datadog to ensure system reliability
- Familiarity with AKS for deploying and managing containers
Required Key Skills:
Databricks Unity Catalog: Data governance, access control, data asset management.
Databricks Feature Store: Feature engineering, feature serving, ML workflow integration.
ETL Pipeline Development (Databricks): Data processing, data pipeline construction, data integration.
Azure Tools: Azure Cosmos DB, Azure Container Registry (ACR).
Machine Learning Workflows: Model training, deployment, and monitoring; understanding of feature store's role.
Programming Languages: Java, Python, Spark (for data engineering).
Monitoring Tools: Splunk, Datadog (system reliability and performance monitoring).
Containerization: Azure Kubernetes Service (AKS) for deployment and management.
Problem-Solving & Collaboration: Ability to work effectively in a team and address technical challenges.