
Cloud Data Engineer with Databricks

Diaconia

Diaconia is looking for a talented Cloud Data Engineer to join our amazing team!

If you're looking to join a company that truly appreciates you and your talents, look no further! At Diaconia, we are committed to serving and caring for our colleagues, our clients, and our community. Our team is made up of talented individuals who appreciate having the opportunity to contribute their knowledge and experience to further the growth and development of our industry. Our ideal candidates embrace diverse thinking, enjoy partnering with others, and are seeking to make a difference!

We are currently searching for a new full-time team member for the position of:

Cloud Data Engineer with Databricks
• U.S. Citizenship is REQUIRED

This position supports the Consumer Financial Protection Bureau (CFPB), and U.S. citizenship is required per federal requirements. The Cloud Data Engineer with Databricks experience is responsible for managing, developing, and maintaining data infrastructure and pipelines, with a specific focus on Databricks, to support the agency's data-driven initiatives and regulatory responsibilities. The CFPB is a U.S. government agency tasked with regulating and overseeing financial products and services to protect consumers.

Job Title: Cloud Data Engineer with Databricks

Job Summary: The Cloud Data Engineer with Databricks at the Consumer Financial Protection Bureau (CFPB) is responsible for designing, building, and maintaining data infrastructure and ETL pipelines using Databricks and cloud-based technologies. The role plays a crucial part in enabling data-driven decision-making and ensuring data accuracy and accessibility for regulatory purposes.

Key Responsibilities
• Collaborate on and contribute to the architecture, design, development, and maintenance of large-scale data and analytics platforms, system integrations, data pipelines, data models, and API integrations.
• Prototype emerging business use cases to validate technology approaches and propose potential solutions.
• Data Pipeline Development: Design, develop, and maintain data pipelines using Databricks, Apache Spark, and other cloud-based technologies to ingest, transform, and load data from various financial institutions and sources.
• Data Transformation: Implement data transformation processes to ensure data quality, integrity, and consistency, meeting regulatory standards. Create a transformation path to migrate data from on-premises pipelines and sources to AWS.
• Data Integration: Integrate data from diverse sources, including financial databases, APIs, regulatory reporting systems, and internal data stores, into the CFPB's data ecosystem.
• Data Modeling: Develop and optimize data models for regulatory analysis, reporting, and compliance, following data warehousing and data lake principles.
• Performance Optimization: Monitor and optimize data pipelines for efficiency, scalability, and cost-effectiveness while ensuring data privacy and security.
• Data Governance: Ensure data governance and regulatory compliance, maintaining data lineage and documentation for audits and reporting purposes.
• Collaboration: Collaborate with cross-functional teams, including data analysts, legal experts, and regulatory specialists, to understand data requirements and provide data support for regulatory investigations.
• Documentation: Maintain comprehensive documentation for data pipelines, code, and infrastructure configurations, adhering to regulatory compliance standards.
• Troubleshooting: Identify and resolve data-related issues, errors, and anomalies to ensure data reliability and compliance with regulatory requirements.
• Continuous Learning: Stay updated with regulatory changes, industry trends, cloud technologies, and Databricks advancements to implement best practices and improvements in data engineering.

Disclaimer "The responsibilities and duties outlined in this job description are intended to describe the general nature and level of work performed by employees within this role. However, they are not exhaustive and may be subject to change or modification at any time to meet the evolving needs of the organization

Minimum Qualifications
• U.S. citizenship is required per federal requirements; no exceptions.
• Bachelor's or higher degree in computer science, data engineering, or a related field.
• The Databricks Certified Data Engineer Professional certification is required; no exceptions.
• Minimum of 3 years of experience in the following:
• Strong understanding of data lake, lakehouse, and data warehousing architectures in a cloud-based environment.
• Hands-on experience with Databricks, including data ingestion, transformation, and analysis.
• Proficiency in Python for data manipulation, scripting, and automation.
• In-depth knowledge of AWS services relevant to data engineering, such as Amazon S3, EC2, Database Migration Service (DMS), DataSync, EKS, CLI, RDS, Lambda, etc.
• Understanding of data integration patterns and technologies.

Location: United States

Posted: Oct. 20, 2024, 11:17 p.m.
