
Data Scientist/Big Data Engineer with Snowflake and SAS Viya. Local to TX

Salary undisclosed


This position is 100% remote; the candidate must currently reside in the State of Texas.

Data Scientist/Big Data Engineer 2
Location: US Remote - Telework

Resume and Photo ID required for the safety and security of our clients and their systems.

I. DESCRIPTION OF SERVICES

The OAG requires the services of a Data Scientist (Big Data Engineer) 2, hereafter referred to as the Candidate, who meets the general qualifications of Data Scientist (Big Data Engineer) 2, Data/Database Administration, and the specifications outlined in this document for the Office of the Attorney General of Texas.

The OAG Chief Data Office (CDO) is seeking a dynamic and visionary Data Scientist to join our team and support our System Modernization efforts for the Child Support Division.
This individual must be able to work in a heavily technical environment, preparing and optimizing data for Snowflake and utilizing SAS Viya to build comprehensive federal and state reports.
You will play a pivotal role in transforming, analyzing, and leveraging our data assets to drive strategic initiatives and business outcomes. You must be comfortable analyzing data,
developing predictive modeling algorithms, and communicating your findings through visualizations that enable the discovery of solutions to business problems.
As a Data Scientist your role will be impactful, visible, and rewarding.

Responsibilities will include:

Data Conversion and Reporting Support for System Modernization Efforts
Data Transformation and Integration: Prepare and optimize data for migration to Snowflake and SAS Viya platforms, ensuring seamless integration and functionality by creating data transformation processes using ETL, SQL, Python, and R.
Develop Federal and State Reports: Build comprehensive reports that meet federal and state requirements using Snowflake and SAS Viya, ensuring accuracy and compliance.
Scrum Team Collaboration: Work as a member of an agile team to deliver new features and functions, delivering best-in-class value-based technology solutions.
Data Quality Management: Develop and implement databases, ETL processes, data collection systems, and data quality strategies that optimize statistical efficiency, accuracy, and quality.
Problem Examination and Resolution: Examine problems within the Data Intelligence space using ETL, Lambda, and Glue, and implement necessary changes to ensure data quality improvement.
Data Analytics and Insights: Utilize advanced data analytics techniques to support strategic decision-making, ensuring data integrity, quality, and timeliness of results.
The above job description and requirements are general in nature and may be subject to change based on the specific needs and requirements of the organization and project.
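To illustrate the kind of data transformation work described in the responsibilities above, here is a minimal, hypothetical Python sketch of an ETL-style cleaning step performed before loading records into a warehouse such as Snowflake. The column names (`case_id`, `state`, `opened`) and cleaning rules are illustrative assumptions, not OAG specifics:

```python
import csv
import io
from datetime import datetime

def transform_rows(reader):
    """Normalize raw case rows before loading into a warehouse table.

    Hypothetical rules: drop rows missing a case_id, uppercase state
    codes, and standardize dates to ISO 8601 (YYYY-MM-DD).
    """
    for row in reader:
        if not row.get("case_id"):
            continue  # reject records missing the primary key
        row["state"] = row["state"].strip().upper()
        # Assume the source system emits MM/DD/YYYY dates.
        row["opened"] = datetime.strptime(row["opened"], "%m/%d/%Y").date().isoformat()
        yield row

# Example input standing in for an extracted source file.
raw = io.StringIO(
    "case_id,state,opened\n"
    "1001,tx,11/12/2024\n"
    ",tx,01/01/2024\n"      # dropped: no case_id
    "1002,Tx,08/31/2025\n"
)
clean = list(transform_rows(csv.DictReader(raw)))
```

In a real pipeline, the cleaned rows would then be bulk-loaded into the target platform (for example via Snowflake's bulk-loading tooling) rather than processed row by row in application code.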


II. CANDIDATE SKILLS AND QUALIFICATIONS

Minimum Requirements:
Candidates who do not meet or exceed the minimum stated requirements (skills/experience) will be displayed to customers but may not be chosen for this opportunity.
Years | Required/Preferred | Experience
4 | Required | Proven experience in data conversion and report building using Snowflake and SAS Viya.
4 | Required | Demonstrated experience in data transformation processes using ETL, SQL, Python, and R.
4 | Required | Experience working with data analytics and business intelligence tools.
4 | Required | Experience working in a Scrum or Agile development environment.
4 | Required | Proficiency in ETL processes and tools such as AWS Glue and Lambda.
4 | Required | Strong knowledge of database management and data warehousing concepts.
4 | Required | Expertise in SQL for data querying and manipulation.

III. TERMS OF SERVICE

Services are expected to start 11/12/2024 and are expected to complete by 08/31/2025. Total estimated hours per Candidate shall not exceed 1000 hours.
