Spark/Scala Dev

Offered by UST Global


About this job

Job type: Full-time
Role: Backend Developer



Technologies

scala, hadoop, java



Job description

Looking for resources with: Scala, Java, JSON, XML

Must-have skills:

    • Experience working with large data sets and pipelines using tools and libraries of the Hadoop ecosystem such as Spark, HDFS, YARN, Hive, and Oozie (see the illustrative sketch after this list).
    • Experience and working knowledge of distributed/cluster computing concepts.
    • Solid understanding of SQL and of relational and NoSQL databases.
    • Solid understanding of multi-threaded applications: concurrency, parallelism, locking strategies, and merging datasets.
    • Solid understanding of memory management, garbage collection, and performance tuning.
    • Solid experience in Linux environments, with strong knowledge of shell scripting and file systems.
    • Experience working in cloud-based environments such as AWS.
    • Knowledge of version-control, build, and CI tools such as Git, Maven, SBT, Jenkins, and Artifactory/Nexus.
    • Self-managed and results-oriented, with a strong sense of ownership.
    • Excellent analytical, debugging, and problem-solving skills.
    • Experience with Agile/Scrum development methodologies is a plus.
    • Minimum Bachelor’s degree in CS or equivalent, with 8+ years of industry experience.
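
A minimal sketch of the kind of Spark/Scala pipeline work described above, assuming a cluster with Spark and a configured Hive metastore; the HDFS path, column names, and output table name are hypothetical:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object EventCountJob {
      def main(args: Array[String]): Unit = {
        // Assumes Spark with Hive support is available on the cluster.
        val spark = SparkSession.builder()
          .appName("event-count-job")
          .enableHiveSupport()
          .getOrCreate()

        // Read JSON events from HDFS (hypothetical path and schema).
        val events = spark.read.json("hdfs:///data/events/*.json")

        // Count events per user per day.
        val daily = events
          .withColumn("day", to_date(col("eventTime")))
          .groupBy(col("userId"), col("day"))
          .agg(count(lit(1)).as("eventCount"))

        // Write the result to a Hive table (hypothetical name).
        daily.write.mode("overwrite").saveAsTable("analytics.daily_event_counts")

        spark.stop()
      }
    }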

