Spark Developer (Remote, 12-Month Contract)

We are working with a strategic customer and are looking specifically for an individual with deep expertise in identifying and fixing bottlenecks in Spark jobs. The role also requires Structured Streaming expertise to build and optimize streaming jobs. Please let us know if you can share a couple of profiles; we will be happy to review them as soon as possible.

Identifying and Fixing Bottlenecks in Spark Jobs:
Dive deep into Spark applications to identify performance bottlenecks. Analyze execution plans, query performance, and resource utilization. Optimize data serialization, shuffling, and memory usage. Fine-tune Spark configurations for maximum efficiency (a configuration sketch follows this posting).

Structured Streaming Expertise:
Leverage your knowledge of Structured Streaming to build and optimize real-time data pipelines. Handle continuous data streams using Spark Structured Streaming. Ensure fault tolerance, late-data handling, and scalability (see the streaming sketch below).

Creating Scala/Spark Jobs:
Develop robust Scala/Spark jobs for data transformation and aggregation. Clean, preprocess, and enrich raw data from various sources. Implement business logic and calculations efficiently (see the batch-job sketch below).
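To illustrate the bottleneck-tuning responsibility, here is a minimal sketch of the kind of configuration and plan inspection involved. It assumes Spark 3.x; the application name, input path, and every config value are hypothetical starting points, not prescriptions.

```scala
import org.apache.spark.sql.SparkSession

object TuningSketch {
  def main(args: Array[String]): Unit = {
    // Common knobs for shuffle, serialization, and join behavior;
    // the values are illustrative and should be tuned per workload.
    val spark = SparkSession.builder()
      .appName("bottleneck-analysis-sketch")
      .config("spark.sql.adaptive.enabled", "true")       // let AQE coalesce shuffle partitions
      .config("spark.sql.shuffle.partitions", "400")      // tune to data volume and cluster size
      .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
      .config("spark.sql.autoBroadcastJoinThreshold", (64 * 1024 * 1024).toString) // 64 MB
      .getOrCreate()

    // Hypothetical input path; replace with the real dataset.
    val events = spark.read.parquet("/data/events")

    val byUser = events.groupBy("user_id").count()

    // Inspect the physical plan for expensive exchanges (shuffles) and scans.
    byUser.explain("formatted")

    byUser.write.mode("overwrite").parquet("/data/events_by_user")
    spark.stop()
  }
}
```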
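For the Structured Streaming responsibility, a minimal sketch of a fault-tolerant, late-data-aware pipeline follows. The Kafka brokers, topic, and checkpoint location are placeholders, and the Kafka source assumes the spark-sql-kafka connector is on the classpath.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object StreamingSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("structured-streaming-sketch").getOrCreate()

    // Hypothetical Kafka source; servers and topic are placeholders.
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "events")
      .load()

    // Use the record timestamp as event time and keep the payload as a string.
    val events = raw.selectExpr("timestamp AS event_time", "CAST(value AS STRING) AS payload")

    // The watermark bounds state and tolerates data arriving up to 10 minutes late.
    val counts = events
      .withWatermark("event_time", "10 minutes")
      .groupBy(window(col("event_time"), "5 minutes"))
      .count()

    // Checkpointing provides fault-tolerant progress tracking across restarts.
    val query = counts.writeStream
      .outputMode("update")
      .format("console")
      .option("checkpointLocation", "/tmp/checkpoints/streaming-sketch")
      .start()

    query.awaitTermination()
  }
}
```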
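Finally, for the Scala/Spark job responsibility, here is a short batch sketch covering cleaning, enrichment, and aggregation. The input and output paths, column names, and the "customer segment" business rule are all assumptions for illustration.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object TransformSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("scala-spark-job-sketch").getOrCreate()

    // Hypothetical inputs: raw orders plus a small customer dimension used for enrichment.
    val orders = spark.read.parquet("/data/raw/orders")
    val customers = spark.read.parquet("/data/dim/customers")

    // Clean and preprocess: drop malformed rows, derive a typed date column.
    val cleaned = orders
      .filter(col("order_id").isNotNull && col("amount") > 0)
      .withColumn("order_date", to_date(col("order_ts")))

    // Enrich with customer attributes; broadcasting the small table avoids a large shuffle.
    val enriched = cleaned.join(broadcast(customers), Seq("customer_id"), "left")

    // Business logic: daily revenue and order counts per customer segment.
    val daily = enriched
      .groupBy(col("order_date"), col("segment"))
      .agg(sum("amount").as("revenue"), count("order_id").as("orders"))

    daily.write.mode("overwrite").partitionBy("order_date").parquet("/data/marts/daily_revenue")
    spark.stop()
  }
}
```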
Location: San Francisco, CA, United States
Posted: Aug. 21, 2024, 11:02 p.m.