Title – Scala Developer
Responsibilities:
• 8+ years of hands-on experience in software development.
• Must have 2+ years of experience with Scala and 3+ years with Java and the Hadoop/Big Data stack. Must have experience with Google BigQuery and Google Dataproc. Experience with Automic, Airflow, or other scheduler tools is required.
• Experience developing Spark ELT pipelines and working on large data sets; experience with distributed computing (MapReduce, Hadoop, Hive, Apache Spark, etc.) is a plus.
• Hands-on experience with database procedural languages (such as SQL, PL/SQL, or database scripting in DB2, Oracle, or Teradata).
• Understanding of NoSQL data stores, messaging or pub/sub queuing systems, and data processing frameworks.
• Experience extracting data from multiple structured and unstructured feeds by building and maintaining scalable ETL pipelines on distributed software systems.
• Exposure to object-oriented design, distributed computing, performance/scalability tuning, advanced data structures and algorithms, real-time analytics, and large-scale data processing.
• Experience working in an Agile/Scrum model.
Location: Anywhere
Posted: Aug. 22, 2024, 12:18 p.m.