SENIOR DATA ENGINEER, Hybrid (Sacramento, CA), $60/HR

Mandatory Qualifications (MQ):

1. At least five (5) years of experience in
developing and monitoring data pipelines, ETL/ELT (Extract Transform Load /
Extract Load Transform).

2. At least five (5) years of experience developing
in SQL. At least one (1) year of experience translating between different
versions of SQL.

3. At least five (5) years of experience developing
with ETL tools including SSIS, SAS DI Studio, AWS Glue, and Lambda.

4. At least five (5) years of experience developing
REST APIs, including experience with JSON, C#, and Python.

5. At least three (3) years of experience in
database design, database development, and database security.

6. At least three (3) years of experience working
as a lead with data conversion and migration efforts involving legacy data
formats and legacy data repositories such as IBM Mainframe.

7. At least three (3) years of experience with
Microsoft Visual Studio and Git Code Repositories.

8. At least three (3) years of experience with
Cloud Technologies in AWS (e.g., Glue and Lambda) and Snowflake.

9. At least five (5) years of experience working on
large-scale modernization projects involving data conversion and migration with
budgets over $50M, designing and creating raw, integration, and presentation layers.

10. At least three (3) years of experience working with
Unemployment Insurance and Disability Insurance data.

11. At least three (3) years of experience designing
and optimizing data warehouses and data lakes.

12. At least three (3) years of experience with
Terraform, Kubernetes, Docker, and CI/CD pipelines.

13. At least three (3) years of experience in
automating workflows and data pipelines.

14. At least three (3) years of experience in
identifying bugs and bottlenecks, identifying poor performance, and
optimizing ETL/ELT pipelines.

15. At least three (3) years of experience working
with JSON, API, CSV, SQL and Parquet files.

16. Bachelor's degree in computer science,
engineering, information systems, math or a technology-related field.
Additional qualifying experience may be substituted for the required education
on a year-for-year basis.
Desirable Qualifications (DQ):

1. At least seven (7) years of experience in
developing and monitoring data pipelines, ETL/ELT (Extract Transform Load /
Extract Load Transform).

2. At least seven (7) years of experience
developing in SQL. At least three (3) years of experience translating between
different versions of SQL.

3. At least seven (7) years of experience
developing with ETL tools including SSIS, SAS DI Studio, AWS Glue, and
Lambda.

4. At least seven (7) years of experience
developing REST APIs, including experience with JSON, C#, and Python.

5. At least five (5) years of experience in
database design, database development, and database security.

6. At least five (5) years of experience working as
a lead with data conversion and migration efforts involving legacy data formats
and legacy data repositories such as IBM Mainframe.

7. At least five (5) years of experience with
Microsoft Visual Studio and Git Code Repositories.

8. At least five (5) years of experience with Cloud
Technologies in AWS (e.g., Glue and Lambda) and Snowflake.

9. At least seven (7) years of experience working on
large-scale modernization projects involving data conversion and migration with
budgets over $50M, designing and creating raw, integration, and presentation layers.

10. At least five (5) years of experience working with
Unemployment Insurance and Disability Insurance data.

11. At least five (5) years of experience designing and
optimizing data warehouses and data lakes.

12. At least five (5) years of experience with
Terraform, Kubernetes, Docker, and CI/CD pipelines.

13. At least five (5) years of experience in automating
workflows and data pipelines.

14. At least five (5) years of experience in
identifying bugs and bottlenecks, identifying poor performance, and
optimizing ETL/ELT pipelines.

15. At least five (5) years of experience working with
JSON, API, CSV, SQL and Parquet files.

16. Master's degree in computer science,
engineering, information systems, math or a technology-related field.
Additional qualifying experience may be substituted for the required education
on a year-for-year basis.