Required Experience:
• 7+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
• 5+ years of ETL (Extract, Transform, Load) Programming experience
• 3+ years of experience designing and optimizing complex SQL queries involving table joins and correlated sub-queries on large-scale data tables
• 3+ years of experience writing relational database SQL queries and stored procedures, including query optimization and performance tuning
• 3+ years of experience with databases such as Oracle, DB2, SQL Server, or Teradata
• 2+ years of Hadoop/Big Data development experience
Job Responsibilities:
• Work closely with Data Modelers to structure data requirements into a data warehouse model for reporting and analytics
• Design, develop and debug ETL integration code to meet defined source-to-target mappings
• Handle complex ETL requirements and design
• Work with Business Stakeholders to build database objects that produce the desired output
• Build, maintain and govern the Hadoop data layer to ensure data consistency and a single version of the truth
• Complete unit testing to reduce the likelihood of defects and to verify that data matches back-end systems
• Ensure solutions are highly usable, scalable, and maintainable
• Understand the impacts of data layer performance factors and collaborate with the Data Architect to implement mitigating physical modeling solutions
• Perform code and design reviews to ensure performance, maintainability, and adherence to standards
• Work with the Data Architect to define data security roles, groups and policies
• Enforce Hadoop design standards, reusable objects, tools, best practices, and related development methodologies for the organization
• Partner with Business Stakeholders and developers to ensure deliverables meet business expectations
Location: Charlotte, NC
Posted: Sept. 26, 2024, 11:14 a.m.