#10439 | 2018-10-05 Malmö area, Sweden

Big Data Operations Engineer (Hadoop, Spark)

Job Summary:
We are seeking a solid Big Data Operations Engineer to administer and scale our multi-petabyte Hadoop clusters and the related services that run on them. The role focuses primarily on provisioning, ongoing capacity planning, monitoring, and management of the Hadoop platform and the applications/middleware that run on it.

Job Description:
  • Responsible for maintaining and scaling production Hadoop, HBase, Kafka, and Spark clusters.
  • Responsible for the implementation and ongoing administration of the Hadoop infrastructure, including monitoring, tuning, and troubleshooting.
  • Provide hardware architectural guidance, plan and estimate cluster capacity, and deploy Hadoop clusters.
  • Improve scalability, service reliability, capacity, and performance.
  • Triage production issues together with other operational teams when they occur.
  • Conduct ongoing maintenance across our large-scale deployments.
  • Write automation code for managing large Big Data clusters.
  • Work with development and QA teams to design ingestion pipelines and integration APIs, and provide Hadoop ecosystem services.
  • Participate in the occasional on-call rotation supporting the infrastructure.
  • Troubleshoot incidents hands-on: formulate theories, test hypotheses, and narrow down possibilities to find the root cause.

Education:
Bachelor's or Master's degree in Computer Science or a similar technical degree.

Competence demands:
  • Hands-on experience with managing production clusters (Hadoop, Kafka, Spark, and more).
  • Strong development/automation skills; must be very comfortable reading and writing Python and Java code.
  • 10+ years of overall experience, with at least 5 years of Hadoop experience in production on medium to large clusters.
  • Tools-first mindset: you build tools for yourself and others to increase efficiency and to make hard or repetitive tasks easy and quick.
  • Experience with configuration management and automation.
  • Organized; focused on building, improving, resolving, and delivering.
  • Good communicator in and across teams, taking the lead.


Start: as soon as a suitable candidate is found
Duration: long term assignment
Work location: Malmö area, Sweden
Requirements: Min. 5 years of professional IT experience.
Job type: Freelance


Note: If we feel that you are the right candidate for this project, we will contact you personally. Your CV data will not be sent to the client without talking to you first.

If you have any questions about this project, you are welcome to contact the resource department:

Olga Saibel
Sourcing Specialist

Email
Mobile: +46 76 843 39 34
