We are looking for a new member of our Big Data team. You will work on a Cloudera-based Hadoop platform, with a focus on ingesting data and storing it in Hadoop. Your main goal will be to design and develop custom applications.
- Own the whole lifecycle of a big data product: analysis, solution design, development, deployment, documentation creation (incl. model) and further support
- Program in Scala, Bash and Python; develop Spark jobs
- Work with Impala, Hive, Kafka, HBase and Zeppelin
- Define architecture patterns
- Support users, incl. training and documentation maintenance
- Prepare for and participate in Scrum meetings
- 2+ years of Big Data experience with focus on Cloudera platform
- Proficiency in coding in Scala
- Basic knowledge of Python
- Understanding of NoSQL databases such as HBase
- Experience with Hadoop, Hive, Impala, Spark and Kafka
- Proficiency in Linux environment
- Experience working with Git and SVN
- Agile/Scrum development cycle understanding
- Motivating salary + yearly bonuses.
- Extra week of holidays.
- 2 sick days/year.

For your better working environment:
- Smart phone.
- Free tea. Subsidized coffee.
- Meal allowances.
- Relax room & Activity room.
- Referral Program Bonuses.
- Unforgettable corporate and team events.
- And of course you will be part of an IT professional team with access to new technologies, in a place where you can cultivate your expertise and use your knowledge to the full extent.

For your future:
- Possible internal growth.
- Technical trainings and certificates.
- Loyalty bonus.
- Allowance to pension scheme.
- Potential business trips to exotic destinations (e.g. China, Russia, Vietnam, the Philippines, Indonesia, India, …).