Senior Data Architect
- Full Time
- ProHire Solution
- Hybrid (candidates must be willing to work on-site at least 2 days a week), United States of America
Key Required Skills:
Strong experience with database design, architecture, and development for big data and RDBMS platforms: Greenplum, PostgreSQL, SQL Server, Db2, Oracle, SQL, PL/SQL, T-SQL, Python, R, SAS, Ansible, UNIX/Linux shell scripting, JavaScript, SSRS/SSIS, JSON, Tableau.
Position Description:
10+ years of experience in data architecture, database design, data analysis, and data engineering.
Experience in designing and managing enterprise data warehouse environments, especially on Greenplum, PostgreSQL, and SQL Server.
Hands-on experience with cloud environments, particularly AWS.
Advanced skills with Greenplum, PostgreSQL, and SQL Server.
Proficient in designing data pipelines and ETL processes.
Strong experience with performance tuning and optimizing databases for large datasets.
Proficiency in Python and R for data manipulation, statistical analysis, and machine learning tasks.
Expertise in Linux and shell scripting for automation and task scheduling.
Familiarity with SAS Viya for advanced analytics and reporting.
Skilled in using Bitbucket and Git for code versioning and collaboration.
Experience with Visual Studio for development and Jupyter for interactive data analysis and visualization.
Strong expertise in PostgreSQL and Greenplum database architecture, administration, and optimization.
Proficiency in SQL and advanced query optimization techniques for PostgreSQL, Greenplum, SQL Server, Db2, and Oracle.
Experience with data modeling (conceptual, logical, and physical models) to support high-performance data structures.
Hands-on experience in designing data pipelines and ETL (Extract, Transform, Load) processes.
Familiarity with ETL tools and frameworks to facilitate data integration from diverse sources.
Proficiency in designing data warehouse schemas, including star, snowflake, and hybrid schemas.
Strong programming skills in Python for data manipulation, scripting, and automation.
Familiarity with data science libraries in Python (e.g., Pandas, SQLAlchemy) for data processing and transformation.
Knowledge of API integration using Python for data ingestion.
Familiarity with data partitioning and parallel processing concepts to enhance performance on large datasets.
Strong experience with AWS services (EC2, S3, RDS, Redshift, etc.).
Experience deploying and managing infrastructure on AWS and handling data security and governance on cloud platforms.
Excellent communication skills for technical and non-technical audiences.
Ability to translate business requirements into technical solutions.
Strong documentation and presentation skills to effectively communicate data architecture strategies.
Use MS Project, Visio, and IT governance frameworks to document the solution architecture and develop design documents and user guides.
Attend all customer technical discussion, design, and development meetings, and provide technical input to further enhance code quality and processes.
Impact functional strategy by developing new solutions, processes, standards, or operational plans that position Leidos competitively in the marketplace.
Provide guidance and support to junior and mid-level developers.
All other duties as assigned or directed.
Skills Requirements:
Basic Skills:
Bachelor's degree in Computer Science, Mathematics, Engineering, or a related field.
A Master's or doctorate degree may substitute for required experience.
10+ years of experience with databases, Python, and Linux.
Must be able to obtain and maintain a Public Trust clearance (contract requirement).
Required Skills:
Effective communication skills for working with cross-functional teams, including data scientists, analysts, and business stakeholders.
Ability to translate technical concepts into non-technical terms for business stakeholders.
Ability to manage multiple projects, prioritize tasks, and deliver within deadlines.
Ability to analyze complex data requirements and create scalable solutions.
Proficiency in debugging and optimizing SQL queries, Python scripts, and R scripts.
Strong understanding of data warehousing best practices and data governance principles.
Proven experience with PostgreSQL and Greenplum in a production environment.
Understanding of Greenplum as a Massively Parallel Processing (MPP) database and experience optimizing queries for distributed processing.
Knowledge of BI tools (Tableau, Power BI) and their integration with databases.
Familiarity with version control systems like Git and CI/CD pipelines for data projects.
Experience working on Agile projects and strong knowledge of Agile terminology and tools, including VersionOne.
Experience designing and deploying cloud-native applications on AWS with PostgreSQL.
Experience with modern software development tools for continuous integration, including Jenkins, Git/Bitbucket, Jira, Nexus, and Confluence.
Experience performance-tuning software applications for efficient CPU and memory utilization and fast response times, and building highly scalable applications in cloud environments that handle high volumes of data and traffic.
Strong communication and documentation skills.
Desired Skills:
Experience with Python and R for ETL and writing predictive models.
Experience with SAS, SAS Viya, Tableau, and multiple databases.
Prior experience with federal or state government IT projects.
Education:
Bachelor's degree with 7+ years of experience.