You thrive on diversity and creativity, and we welcome individuals who share our vision of making a lasting impact. Your unique combination of design thinking and experience will help us achieve new heights.
As a Lead Data Engineer within the Connected Commerce Technology Team, you serve as a seasoned member of an agile team to design and deliver trusted data collection, storage, access, and analytics solutions in a secure, stable, and scalable way. You are responsible for developing, testing, and maintaining critical data pipelines and architectures across multiple technical areas in support of the firm’s business objectives.
Job responsibilities
• Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems.
• Creates secure and high-quality production code.
• Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development.
• Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems.
• Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture.
• Designs and develops end-to-end data pipelines using Spark SQL, Java, and AWS services. Utilizes programming languages such as Java and Python, SQL and NoSQL databases, container orchestration services including Kubernetes, and a variety of AWS tools and services.
• Contributes to software engineering communities of practice and events that explore new and emerging technologies.
• Adds to team culture of diversity, equity, inclusion, and respect.
Required qualifications, capabilities, and skills
• BS degree in an applicable STEM field and 5+ years of software development experience
• Proficiency in designing, developing, testing, debugging, and maintaining code in Java, Scala, and Python.
• Hands-on experience developing Spark-based frameworks for end-to-end ETL, ELT, and reporting solutions using key components such as Spark SQL and Spark Streaming.
• Experience with relational and NoSQL databases.
• Cloud implementation experience with AWS, including:
• AWS Data Services: Proficiency in Lake Formation, Glue ETL or EMR, S3, Glue Catalog, Athena, Kinesis or MSK, and Airflow or Lambda + Step Functions + EventBridge
• Data De/Serialization: Expertise in at least two of the following formats: Parquet, Iceberg, Avro, JSON-LD
• AWS Data Security: Good understanding of security concepts such as Lake Formation, IAM, service roles, encryption, KMS, and Secrets Manager
• Proficiency in automation and continuous delivery methods.
• Proficient in all aspects of the Software Development Life Cycle.
• Solid understanding of agile methodologies such as CI/CD, application resiliency, and security.
Preferred qualifications, capabilities, and skills
• Snowflake experience
• In-depth knowledge of the financial services industry and its IT systems
Location: Jersey City, NJ
Posted: Aug. 11, 2024, 11:18 p.m.