
Senior Data Architect

Synapse Business Systems

Position ID

SO-2122

Position Title

Sr. Data Architect

Labor Category

CSA3

Notes
• **THIS IS A CONTINGENT REQ**
• Selected candidate must reside within two (2) hours of Headquarters in Woodlawn, MD.
• Selected candidate must be willing to work on-site at least 2 days a week.

Key Required Skills:

Strong experience with database design and architecture/development for big data and RDBMS platforms: Greenplum, PostgreSQL, SQL Server, Db2, Oracle; Python, R, SAS, SQL, PL/SQL, T-SQL, Ansible, UNIX/Linux shell scripting, JavaScript, SSRS/SSIS, JSON, Tableau.

Position Description
• 10+ years of experience in data architecture, database design, data analysis, and data engineering.
• Experience in designing and managing enterprise data warehouse environments, especially on Greenplum, PostgreSQL, and SQL Server.
• Hands-on experience with cloud environments, particularly AWS.
• Advanced skills with Greenplum, PostgreSQL, and SQL Server.
• Proficient in designing data pipelines and ETL processes.
• Strong experience with performance tuning and optimizing databases for large datasets.
• Proficiency in Python and R for data manipulation, statistical analysis, and machine learning tasks.
• Expertise in Linux and shell scripting for automation and task scheduling.
• Familiarity with SAS Viya for advanced analytics and reporting.
• Skilled in using Bitbucket and Git for code versioning and collaboration.
• Experience with Visual Studio for development and Jupyter for interactive data analysis and visualization.
• Strong expertise in PostgreSQL and Greenplum database architecture, administration, and optimization.
• Proficiency in SQL and advanced query optimization techniques for PostgreSQL, Greenplum, SQL Server, Db2, and Oracle.
• Experience with data modeling (conceptual, logical, and physical models) to support high-performance data structures.
• Hands-on experience in designing data pipelines and ETL (Extract, Transform, Load) processes.
• Familiarity with ETL tools and frameworks to facilitate data integration from diverse sources.
• Proficiency in designing data warehouse schemas, including star, snowflake, and hybrid schemas.
• Strong programming skills in Python for data manipulation, scripting, and automation.
• Familiarity with data science libraries in Python (e.g., Pandas, SQLAlchemy) for data processing and transformation.
• Knowledge of API integration using Python for data ingestion.
• Familiarity with data partitioning and parallel processing concepts to enhance performance on large datasets.
• Strong experience with AWS services (EC2, S3, RDS, Redshift, etc.).
• Experience deploying and managing infrastructure on AWS and handling data security and governance on cloud platforms.
• Excellent communication skills for technical and non-technical audiences.
• Ability to translate business requirements into technical solutions.
• Strong documentation and presentation skills to effectively communicate data architecture strategies.
• Use MS Project, Visio and IT Governance Frameworks to document the solution architecture & develop design documents and user guides.
• Attend all customer technical discussions/design/development meetings and provide technical inputs to further enhance the code quality/process.
• Impact functional strategy by developing new solutions, processes, standards, or operational plans that position Leidos competitively in the marketplace.
• Provide guidance/support to other junior/mid-level developers.
• All other duties as assigned or directed.

Skills Requirements:

Basic Skills:
• Bachelor's Degree in Computer Science, Mathematics, Engineering, or a related field.
• Master's or Doctorate degree may substitute for required experience.
• 10+ years of experience with databases, Python, and Linux.
• Must be able to obtain and maintain a Public Trust clearance (contract requirement).

Required Skills:
• Effective communication skills for working with cross-functional teams, including data scientists, analysts, and business stakeholders.
• Ability to translate technical concepts into non-technical terms for business stakeholders.
• Ability to manage multiple projects, prioritize tasks, and deliver within deadlines.
• Ability to analyze complex data requirements and create scalable solutions.
• Proficiency in debugging and optimizing SQL queries, Python scripts, and R scripts.
• Strong understanding of data warehousing best practices and data governance principles.
• Proven experience with PostgreSQL and Greenplum in a production environment.
• Experience with data modeling (conceptual, logical, and physical models) to support high-performance data structures.
• Knowledge of API integration using Python for data ingestion.
• Understanding of Greenplum as a Massively Parallel Processing (MPP) database and experience optimizing queries for distributed processing.
• Familiarity with data partitioning and parallel processing concepts to enhance performance on large datasets.
• Knowledge of BI tools (e.g., Tableau).

Location: Chevy Chase, MD

Posted: Nov. 8, 2024, 6:46 a.m.
