Data Engineer DataOps Job in London

Data Engineer (DataOps) - Global
  • London, United Kingdom
Job Description

Data Engineer (DataOps)
Reporting of the role
This role reports to the Head of Data Engineering.
Overview of job
We are looking for a skilled DevOps/DataOps engineer to join our growing data engineering function. The role aims to accelerate the development of data products by providing consistent and reliable operations for the data engineering and data science teams, through the development and advancement of template repositories, DevOps pipelines, and the provisioning of cloud infrastructure capable of scaling to our big data needs. Working in the Media and Advertising domain, you will contribute to delivering data products, and the processes which underpin them, to both internal and external users.
3 best things about the job

  • Working in an innovative and rapidly growing team with a modern cloud-native architecture and toolset.
  • Playing a central role in crafting and defining our Data/DevOps architecture at a pivotal time for data strategy at Global.
  • Building and owning the DevOps practices within the data engineering domain, and working with DevOps professionals across Global to craft best practice.

Measures of success
In the first few months, you would have:

  • Understood our existing build and deployment architecture.
  • Applied your expertise to make improvements to our current processes and infrastructure.
  • Offered technical leadership and mentorship to engineers across the team in Data/DevOps best practices.

Responsibilities of the role

  • Owning the AWS environment(s) in which data products are developed and delivered.
  • Innovating to improve application monitoring and CI/CD (Jenkins pipelines) to ensure smooth builds and deployments.
  • Keeping up to date with DevOps, security and AWS developments and how they might be applicable to Global.
  • Assisting other members of the team to help them increase their knowledge and grow.
  • Ensuring good practice is applied to all we do, including reviews, testing and documentation.

What you will need
The ideal candidate will be proactive, willing to develop and implement innovative solutions, and capable of the following:

  • Highly self-motivated, able to absorb new ideas and concepts quickly, with previous experience of infrastructure as code and CI/CD pipelines.
  • Good knowledge of Terraform, Kubernetes and AWS services.
  • Knowledge of containerisation (Docker and Helm).
  • Understanding of DevOps concepts and ideas on how these can be applied to data engineering.
  • Good knowledge of CI systems (such as Jenkins).
  • Familiarity with working in an Agile environment.
  • A basic understanding of Python would be beneficial but is not required.
  • Exposure to big data technologies (Spark, Kafka, Apache Druid).

Everyone is welcome at Global
Just like our media and entertainment platforms are for everyone, so are our workplaces. We know that we can't possibly serve our diverse audiences without first nurturing and celebrating that diversity in our people, and that's why we work hard to create an inclusive culture for everyone. We believe that different will set us apart, so no matter what you look like, where you come from or what your favourite radio station is, we want to hear from you.
Although we cannot make guarantees, we welcome conversations about flexible working for all roles at Global.
