|Experience Required|8 years|
|Employment Type|Contract / Freelance Project|
|Vacancies Available|3|
We have an urgent requirement for one of our clients at their Singapore location. Please review the following job description and let us know if you are comfortable with the requirements.
Please share your resume at email@example.com to discuss rates.
Singapore citizens, PR holders, and Dependent Pass holders are eligible for this requirement.
Contract duration: 6 months, extendable.
Total years of experience: 6–10
Relevant years of experience: 3–5
Detailed JD (Roles and Responsibilities)
Mandatory skills: Java, Hadoop, Spark (Scala is a plus)
Desired skills: Java, Spark
Domain: Big Data, Hadoop, Data
Work Location: Singapore
• Bachelor’s degree or foreign equivalent required. Will also consider one year of relevant work experience in lieu of every year of education.
• At least 3 years of experience in the Big Data space.
• Strong Hadoop experience (MapReduce, Hive, Pig, Sqoop, Oozie) – MUST
• Hands-on experience with Java, APIs, and Spring – MUST
• Good exposure to Spark.
• End-to-end delivery experience on complex, high-volume, high-velocity projects.
• Good experience with at least one scripting language, such as Scala or Python.
• Good exposure to Big Data architectures.
• Experience building frameworks on Hadoop.
• Very good understanding of the Big Data ecosystem.
• Experience sizing and estimating large-scale Big Data projects.
• Good database knowledge with SQL tuning experience.
• Strong experience with SQL
• Experience with the Apache Parquet data format.
• Past experience and exposure to ETL and data warehouse projects
• Experience with Kafka
• Cloudera / Hortonworks certified
• Experience and desire to work in a Global delivery environment