Data Engineer

Must Have:

• AWS Platform

• Snowflake

• ETL tools – Informatica PowerCenter (required); Informatica IICS, Fivetran, dbt Labs, and AWS-managed Airflow (nice to have)

• Python and SQL – required

Familiarity with:

• Cloud technologies – AWS services such as Lambda, SQS/SNS, EMR, CloudWatch, RDS, and EKS

• Compute Technologies – Spark/Hadoop

• Streaming Technologies – Kafka, AWS Kinesis

• API technologies – MuleSoft, Amazon API Gateway

• CI/CD – Atlassian Bitbucket and Bamboo

• Build tools – Apache Ant and Maven

Responsibilities:

• Be an expert in the data warehouse domain and the relevant business function.

• Work with minimal supervision and provide status updates.

• Provide application support as part of an on-call rotation, resolving outages and user questions.

• Build large-scale datasets for a wide range of consumer needs.

• Build, test, and implement highly efficient data pipelines using a variety of technologies.

• Analyze new data sources to understand quality, availability, and content.

• Create and execute unit tests.

• Support the QA team during testing.

Knowledge/Skills/Abilities:

• Proven ability to model data for reporting and analytics needs.

• Proven ability to design and implement applications using best practices.

• Proven ability to analyze and understand existing processes and code.

• Works and communicates effectively with all levels of management.

• Excellent written, verbal, and people skills.

• Must work well within team-oriented environments.

• Proven ability to work within an Agile development methodology.

Mandatory Skills: Snowflake, dbt, ETL, AWS

Location: Rosemont, IL

Posted: Oct. 22, 2024, 10:34 p.m.
