
IICS Developer

  • Full Time, onsite
  • Starcom consulting limited
  • On Site, United States of America
Salary undisclosed


## Job Description

For a role based in Boston, supporting data management and transformation projects with a focus on specific cloud services and tools, here's a breakdown of the key technical requirements and skills:

### **Essential Technical Skills**

1. **Cloud Platforms (AWS Preferred)**
   - **Services Required:**
     - **S3 (Simple Storage Service):** For scalable storage solutions and managing large datasets.
     - **DMS (Database Migration Service):** For migrating databases to AWS and ensuring data integrity during the transfer.
     - **Glue:** For ETL (Extract, Transform, Load) operations and data cataloging.
   - **Why:** Proficiency in these AWS services will be crucial for handling data management and transformation tasks effectively.
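
As a rough illustration of the extract/transform/load pattern that a Glue job implements, here is a minimal sketch in plain Python. (Glue jobs actually run on PySpark against S3 data; the records, field names, and in-memory "warehouse" below are invented for illustration only.)

```python
# Minimal ETL sketch: the extract/transform/load pattern an AWS Glue job
# implements at scale. Pure Python here so the example is self-contained;
# a real Glue job would read from and write to S3 via PySpark/boto3.

def extract(raw_records):
    """Extract: parse raw CSV-like lines into dicts."""
    for line in raw_records:
        name, amount = line.split(",")
        yield {"name": name.strip(), "amount": float(amount)}

def transform(records):
    """Transform: normalize names and drop non-positive amounts."""
    for rec in records:
        if rec["amount"] > 0:
            yield {"name": rec["name"].upper(), "amount": rec["amount"]}

def load(records, target):
    """Load: append cleaned records to the target store (a list here;
    on AWS this step would be an S3 or warehouse write)."""
    for rec in records:
        target.append(rec)
    return target

warehouse = []
raw = ["alice, 10.5", "bob, -3", "carol, 7.25"]
load(transform(extract(raw)), warehouse)
# warehouse now holds only the cleaned, positive-amount records
```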

2. **SQL / Analytic Skills + NoSQL**
   - **SQL:** For querying and managing relational databases.
   - **NoSQL:** For working with non-relational databases and handling diverse data formats.
   - **Why:** Both SQL and NoSQL skills are necessary for managing different types of data and performing complex queries and analyses.
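
The contrast between the two models can be sketched briefly. The SQL half uses Python's built-in SQLite for a relational aggregate; the NoSQL half uses a plain dict standing in for a document store (the table, keys, and records are made up for illustration):

```python
import sqlite3

# SQL side: a relational aggregate query against an in-memory SQLite DB.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("alice", 40.0), ("bob", 15.0), ("alice", 25.0)])
rows = conn.execute(
    "SELECT customer, SUM(total) FROM orders GROUP BY customer ORDER BY customer"
).fetchall()
# rows == [("alice", 65.0), ("bob", 15.0)]

# NoSQL side: the same data as schemaless documents (a dict standing in
# for a document store such as DynamoDB or MongoDB).
documents = {
    "order:1": {"customer": "alice", "total": 40.0, "coupon": "SAVE10"},
    "order:2": {"customer": "bob", "total": 15.0},  # fields vary per document
}
alice_orders = [d for d in documents.values() if d["customer"] == "alice"]
```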

3. **Snowflake (Plus)**
   - **Importance:** Snowflake is a cloud-based data warehousing service that facilitates data storage, processing, and analysis.
   - **Why:** Experience with Snowflake will be beneficial for managing large-scale data warehousing and analytics tasks.
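
Much of day-to-day Snowflake work is analytic ANSI SQL such as window functions. The sketch below runs the same kind of query against SQLite (which also supports window functions) purely so the example is self-contained; the table and values are invented:

```python
import sqlite3

# Snowflake-style analytic SQL: RANK() over a partition. Snowflake runs
# this same ANSI SQL; SQLite (3.25+) is used here only so the example
# needs no warehouse credentials.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, rep TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("east", "alice", 300.0),
    ("east", "bob", 200.0),
    ("west", "carol", 150.0),
])
rows = conn.execute("""
    SELECT region, rep,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales ORDER BY region, rnk
""").fetchall()
# top rep per region gets rank 1 within its partition
```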

4. **Python, Airflow, Scripting (Plus)**
   - **Python:** For scripting, data manipulation, and automation.
   - **Airflow:** For managing and scheduling workflows and ETL processes.
   - **Why:** Python and Airflow will be useful for building and automating data pipelines, enhancing efficiency and scalability in data processing tasks.
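
Airflow's core idea is that pipeline tasks form a directed acyclic graph and run in dependency order. This toy sketch shows only that idea using the standard library's `graphlib`; real Airflow declares the same structure with `DAG` and operator objects plus a scheduler, and the task names here are invented:

```python
from graphlib import TopologicalSorter

# Toy illustration of Airflow's DAG model: tasks declare dependencies and
# execute in a valid topological order. NOT real Airflow code.
results = []

def extract():
    results.append("extract")

def transform():
    results.append("transform")

def load():
    results.append("load")

tasks = {"extract": extract, "transform": transform, "load": load}
# load depends on transform, which depends on extract
deps = {"transform": {"extract"}, "load": {"transform"}}

for name in TopologicalSorter(deps).static_order():
    tasks[name]()
# tasks ran in dependency order: extract, then transform, then load
```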

5. **Kafka (Nice to Have)**
   - **Importance:** Kafka is used for building real-time data pipelines and streaming applications.
   - **Why:** While not essential, experience with Kafka can be a valuable asset for handling real-time data streams and integrating with various data sources.
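
Kafka's model is an append-only log that consumers read from at their own offset. The toy class below only illustrates those semantics in memory; real producers and consumers talk to a broker through a client library such as `confluent-kafka`, and the event records are invented:

```python
# Minimal sketch of Kafka's log/offset model: producers append to a log,
# each consumer tracks its own read offset. In-memory toy, not a client.
class TopicLog:
    def __init__(self):
        self.log = []  # append-only log, like a single Kafka partition

    def produce(self, message):
        self.log.append(message)

    def consume(self, offset):
        """Return (messages, new_offset) for a consumer at `offset`."""
        return self.log[offset:], len(self.log)

topic = TopicLog()
topic.produce({"event": "click", "user": "alice"})
topic.produce({"event": "view", "user": "bob"})

msgs, offset = topic.consume(0)            # reads both messages
topic.produce({"event": "click", "user": "carol"})
new_msgs, offset = topic.consume(offset)   # reads only the new message
```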

6. **Salesforce (Nice to Have)**
   - **Importance:** Salesforce knowledge is useful for integrating with CRM systems and managing customer data.
   - **Why:** Experience with Salesforce can be an advantage if the project involves CRM data integration or reporting.

7. **APIs / How Things Work**
   - **Importance:** Understanding APIs and how they integrate different systems is crucial for data interoperability.
   - **Why:** Knowledge of APIs will help in integrating various services and systems, ensuring seamless data flow and functionality.
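
In practice this usually means consuming JSON REST responses and validating them before use. The payload and field names below are hypothetical; a real call would go through `urllib.request` or the `requests` library against the service's documented endpoint:

```python
import json

# Sketch of handling a JSON REST API response: validate the envelope,
# then extract the records. The payload shape here is invented.
payload = json.dumps({
    "status": "ok",
    "data": [{"id": 1, "name": "alice"}, {"id": 2, "name": "bob"}],
})

def parse_users(raw):
    """Validate an API response body and extract user records."""
    body = json.loads(raw)
    if body.get("status") != "ok":
        raise ValueError(f"API error: {body!r}")
    return {u["id"]: u["name"] for u in body["data"]}

users = parse_users(payload)
# users maps each id to its name
```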

### **Summary**

For this Boston-based role, a strong foundation in cloud services (with a focus on AWS), SQL, NoSQL, and scripting languages (Python) is critical. Familiarity with Snowflake, Airflow, and Kafka will add significant value, while knowledge of Salesforce and APIs can further enhance your ability to integrate and manage diverse data sources and systems effectively.