
Senior Kafka Engineer
Job Title: Senior Kafka Engineer
Job type: CTH; 6+ months ongoing W2 contract.
Location: Remote
Program Related:
The client is looking to bring on a Sr. Kafka Engineer with strong skills in the Confluent Platform and Kafka Connect to work within their technology Center of Excellence. This person will work with Confluent Cloud, a Software as a Service (SaaS) platform, on an enterprise API/integration/messaging program. The program aims to deliver more seamless data streaming, messaging, pipelining, analytics, and integration for the client's enterprise applications via Confluent Cloud and Kafka. They are looking for outside expertise with deep Confluent Cloud knowledge so they can continue to make updates and also take on new development. Data will come from multiple sources such as Oracle (on-prem), Snowflake, Salesforce, and many others. The project is focused on streaming and pulling data and creating connectors to different applications and microservices.
Job Description:
As a Sr. Software Engineer, you will work closely with IT leadership to design, build, and implement SaaS platforms and connectors, while also providing governance of a large-scale streaming architecture solution that delivers both IT and business value across the organization. You will have the opportunity to both lead and execute on projects, always considering the bigger picture by proactively anticipating performance issues, troubleshooting, monitoring, quality concerns, etc. This role requires experience with multiple data sources and SaaS platforms, backend development and database querying with technologies such as Java and KSQL/stream processing, data connector/streaming technologies such as Kafka, performance tuning, data quality, and data visualization.
Responsibilities (10+ years of experience):
- Strong Kafka Connect knowledge, including SMTs, custom connector architecture and implementation, security, monitoring, and custom providers for credentials
- Strong KSQL skills for any new streaming and querying against Kafka, including the ability to write custom UDFs
- Strong knowledge and experience implementing Confluent Cloud connectors such as the Salesforce Connector, Snowflake Connector, Oracle CDC Connector, Azure Blob Connector, etc.
- At least 5 years of strong hands-on experience as a core Java developer; this person will need to edit and update existing code and, as the project continues, write new code as well
- Ability to recommend and institute best practices for setting up Confluent Cloud and Kafka
- Externalize Kafka and Kafka Streams configurations through application properties
- Strong experience with Kafka and Kafka Connect to stream data between multiple data streams and sources
- Strong querying and database knowledge
- Expertise in Confluent Kafka administration and maintenance, including creating KSQL streams, topics, etc.
- Experience with C3 (Confluent Control Center) for managing and monitoring Apache Kafka
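One responsibility above is externalizing Kafka and Kafka Streams configuration through application properties rather than hard-coding it. A minimal sketch of that pattern in plain Java follows; the file contents and property keys shown are illustrative assumptions, not the client's actual setup:

```java
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.util.Properties;

public class KafkaConfigLoader {

    // Load Kafka client settings from an external properties source so
    // brokers, application IDs, serdes, etc. can change without recompiling.
    public static Properties load(Reader source) throws IOException {
        Properties props = new Properties();
        props.load(source);
        return props;
    }

    public static void main(String[] args) throws IOException {
        // In a real deployment this would be a FileReader over an
        // application.properties file; inlined here so the sketch is
        // self-contained. Keys are hypothetical examples.
        String external =
            "bootstrap.servers=broker1:9092\n" +
            "application.id=orders-stream\n";
        Properties props = load(new StringReader(external));
        System.out.println(props.getProperty("bootstrap.servers")); // prints "broker1:9092"
    }
}
```

The resulting `Properties` object can be handed directly to a Kafka producer, consumer, or Streams constructor, which all accept `java.util.Properties`.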
Must-Have Skills (10+ years of experience):
1. Strong understanding of the Confluent Platform, with the ability to work with and implement the following connectors:
- File Pulse Connector
- S3 Connector (Sink & Source)
- Salesforce connector
- Snowflake Connector
- JDBC Connector (Sink & Source)
- Debezium CDC Connectors
- Oracle CDC Connector
- Azure Blob Connector
2. Strong experience with Kafka and Kafka Connect to stream data between multiple data streams, including SMT and custom provider experience
3. Must have experience working with the Confluent Platform API, which this role will use extensively
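The must-have items above combine connector implementation, SMTs, and custom providers for credentials. As one hedged illustration of how those pieces fit together, here is what a Kafka Connect connector definition can look like; the connector name, database details, topic prefix, and secrets path are all hypothetical, and the `${file:...}` placeholder assumes Kafka's standard `FileConfigProvider` (a custom provider would be referenced the same way):

```json
{
  "name": "orders-jdbc-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "${file:/opt/secrets/db.properties:connection.url}",
    "connection.user": "${file:/opt/secrets/db.properties:user}",
    "connection.password": "${file:/opt/secrets/db.properties:password}",
    "mode": "timestamp+incrementing",
    "timestamp.column.name": "updated_at",
    "incrementing.column.name": "id",
    "transforms": "route",
    "transforms.route.type": "org.apache.kafka.connect.transforms.RegexRouter",
    "transforms.route.regex": "(.*)",
    "transforms.route.replacement": "orders.$1"
  }
}
```

The `transforms.*` entries apply a single-message transform (here, `RegexRouter` renaming topics in flight), and the config-provider placeholders keep credentials out of the connector config itself.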
Nice-to-Have Skills (5+ years of experience):
- KSQLDB/UDF
- DevOps experience
- Agile experience and working in a SAFe methodology
Thank you.