About this role
When BlackRock started in 1988, its founders envisioned a company that combined the best of financial services with cutting-edge technology. They imagined a business that would provide financial services to clients as well as technology services to other financial firms. The result of their vision is Aladdin, our industry-leading, end-to-end investment management platform. With assets valued at over USD 7 trillion managed on Aladdin, our technology empowers millions of investors to save for retirement, pay for college, buy a home, and improve their financial wellbeing.
About Data Ops Engineering (DOE)
Data is at the heart of Aladdin and increasingly the ability to consume, store, analyse and gain insight from data has become a key component of our competitive advantage. The DOE team is responsible for the data ecosystem within BlackRock. Our goal is to build and maintain a leading-edge data platform that provides highly available, consistent data of the highest quality for all users of the platform, notably investors, operations teams, and data scientists. We focus on evolving our platform to deliver exponential scale to the firm, powering the future growth of Aladdin data.
As a lead data engineer at BlackRock, you will gain experience working at one of the most recognized financial companies in the world while being part of the team responsible for building the ETL framework for the next-generation Enterprise Data Platform (EDP), which automates data acquisition, ingestion, transformation, and distribution to support enterprise-wide data processing. You will partner with data and analytics experts to deliver high-quality data analytical solutions to our consumers.
It is important to have a long track record of delivering high-quality products frequently and repeatably. This role is expected to lead the design of highly complex and/or sensitive components, and to leverage the firm's broader technology strategy when making technology decisions.
Lead data engineers work on building frameworks for ETL/ELT use cases and metadata-driven data pipeline processing engines; candidates with knowledge of cloud-native services are ideal. We are looking for candidates who like to innovate and resolve complex data problems.
Responsibilities:
The lead data engineer is expected to be engaged from the inception of projects: understand requirements, design the architecture, and develop, deploy, and maintain the ETL/ELT framework. The goal is to increase the automation of data pipelines and to support both internal and external teams in leveraging reusable ETL components and adopting the standards of the modern data platform, while constantly enhancing the core ETL framework with new features and improving platform performance and scalability.
• Guide and contribute to the development life cycle and maintenance of the ETL framework, acquisition framework, and workflow orchestration for data pipelines, including participation in tactical projects and strategic programs.
• Participate in brainstorming sessions to generate ideas for designing and building data engineering solutions using standard frameworks. Provide ideas for process enhancements that will result in cycle time reductions.
• Explore new tools and technologies to enhance the capabilities for the frameworks.
• Work closely with program managers, product teams, and data stewards throughout the SDLC.
• Guide the development team in conducting end-to-end tests to ensure production operations run successfully after every release cycle.
• Provide L2 or L3 support for technical and/or operational issues.
• Provide technical expertise to the development teams and establish strong bonds with technologists outside the team.
• Coordinate and collaborate with engineers outside the team.
• Lead the build of highly complex ETL/ELT components. Adhere to the firm's technology strategy with best practices.
• Take ownership of multiple large components and/or initiatives requiring coordination and cooperation with others.
• Identify communication needs and ensure smooth flow of information within the organization.
• Participate in technology initiatives outside the direct team purview.
Required Experience:
• 5+ years’ experience as a data engineer
• Experience with ETL/ELT framework design and workflow orchestration
• Experience with object-oriented design patterns
• Experience with ETL Performance Tuning
• Experience in SQL and PL/SQL programming, stored procedures, and UDFs
• Experience in Python / PySpark programming
• Experience with Agile development concepts and related tools
• Experience conducting root cause analysis and resolving issues
• Excellent written and verbal communication skills
• Ability to operate in a fast-paced environment
• Strong interpersonal skills with a can-do attitude under challenging circumstances
• BS equivalent degree
Desired Qualifications:
• Experience with Snowf
Location: Atlanta, GA
Posted: Sept. 9, 2024, 9:45 a.m.