
Senior Data Engineer

Weber

At Weber, grilling is a passion that’s reflected in everything we do. Our goal is to share this passion and spark inspiration with the people who matter most – our grilling community. Weber has been the world’s premier manufacturer of charcoal and gas grills and accessories since 1952. If you have the desire to work for a company that is recognized for exceptional quality products and high customer satisfaction, employment with Weber may be right for you. We provide a friendly working atmosphere with an environment of growth and opportunity through innovation, pride, and excellence.

Weber is committed to inclusive, equitable, and diverse hiring practices. Our goal is to create a workforce that reflects the rich, diverse communities in which we live, play, and serve every day.

Discover What’s Possible with a career at Weber.

Summary:

We are seeking an experienced Senior Data Engineer with strong technical depth to join our team. The ideal candidate will spearhead the development of our Enterprise data warehouse platform, Snowflake, and will be responsible for designing, developing, and maintaining the infrastructure and systems required for data storage, processing, and analysis. They will collaborate with key stakeholders and cross-functional teams to understand reporting requirements, drive data management projects, and innovate solutions that enhance the data platform. They will play a crucial role in building and managing the data pipelines and data solutions that enable efficient, reliable data integration, transformation, and delivery for all data users across the Enterprise.

We want to ensure our stakeholders can easily self-serve the data and build reports they need to drive their projects and make decisions. If you're excited to work on a fast-moving team using cutting-edge technologies to collect, store, transform, analyze, and model data, we want to meet you!

Essential Duties and Responsibilities (include the following, and other duties may be assigned):

Data Modeling:
• Apply a strong foundation in data modeling and reporting to ensure our data infrastructure supports efficient analytics and reporting operations.
• Collaborate with stakeholders and cross-functional teams to understand business requirements and define data structures and relationships.
• Design, develop, and maintain robust, scalable data models and schemas to support analytics and reporting requirements.

Data Integration:
• Integrate data from different sources, both internal and external, to create a unified and comprehensive view of the data.
• Work closely with cross-functional teams to understand data requirements and ensure successful integration.

ETL Development and Data Integrity:
• Develop, optimize, and maintain ETL/ELT processes to extract, load, and transform data from various sources into our Snowflake data platform.
• Implement data quality checks and validation routines within data pipelines to ensure the accuracy, consistency, and completeness of data.
• Interact and coordinate work with other technical and testing members in the team.
• Review and write code that meets set quality gates.

Performance Tuning:
• Optimize data infrastructure and enhance overall system performance.
• Optimize data pipelines and data processing workflows for performance, scalability, and efficiency.
• Optimize design efficiency to minimize data refresh lags, improve performance and enable data as a service through reusable assets.

Technical Leadership:
• Drive the design, coding, and maintenance of data engineering standards.
• Troubleshoot and resolve issues related to data processing and storage.
• Coordinate with team members on root-cause analysis and issue resolution.
• Perform quality checks and user acceptance testing.

Cross-Functional Team Collaboration:
• Meet with internal teams to clarify and document reporting requirements.
• Collaborate to understand existing issues and new requirements.

Documentation:
• Create, update, and maintain technical documentation of processes and configuration.
• Develop and maintain documentation for data processes, pipelines, and models.

Education and Experience Requirements:
• Bachelor's or Master’s Degree in Computer Science, Information Technology or a related field
• Minimum 10 years of experience in a technical role in data extraction, analysis, and reporting.
• 6+ years of advanced SQL development experience, coupled with strong Python or PySpark programming capabilities and extensive experience building and optimizing ETL pipelines, data models, and schemas.
• 4+ years of hands-on experience with cloud data warehousing technologies (Snowflake preferred; BigQuery, AWS, Azure, SAP BW).
• Experience with SAP ERP systems (OTC, Finance, Master Data) preferred.
• Certification in relevant areas (e.g., Snowflake, AWS Certified Data Analytics, Google Cloud Professional Data Engineer) preferred.

Location: Palatine, IL

Posted: Sept. 24, 2024, 11:13 p.m.
