Role Summary:

Telmar, acquired by LiiV in 2018, is a SaaS provider to advertising and media companies, giving the most influential agencies, media owners and planners innovative cloud-based tools to create, plan and predict successful media campaigns. With over 50 years of experience providing data and analytical tools for the marketing and advertising industry, Telmar’s easy-to-use software products and solutions support over 8,000 databases, making it the industry’s most trusted third-party data analysis software globally.

In the role of Senior Data Automation Engineer, you will be responsible for designing, expanding, and optimizing full and partial automation of our data pipeline architecture. A key focus of this role is to optimize data flow and collection using large-scale workflow management platforms. This role will allow you to programmatically author, schedule and monitor workflows/pipelines within tools like Apache Airflow.
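For illustration, here is a minimal sketch of what programmatically authoring and scheduling a pipeline in Airflow looks like, assuming Airflow 2.x; the DAG name, schedule and task callables are hypothetical placeholders, not Telmar’s actual pipeline:

    # A minimal, hypothetical Airflow 2.x DAG: two Python tasks
    # scheduled daily, with extract running before load.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_survey_data():
        # Hypothetical placeholder for a data extraction step.
        print("extracting respondent survey data")

    def load_to_warehouse():
        # Hypothetical placeholder for a load step.
        print("loading transformed data to the warehouse")

    with DAG(
        dag_id="survey_ingestion",        # hypothetical DAG name
        start_date=datetime(2022, 1, 1),
        schedule_interval="@daily",       # the schedule is authored in code
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract", python_callable=extract_survey_data)
        load = PythonOperator(task_id="load", python_callable=load_to_warehouse)

        extract >> load  # the DAG edge: extract must succeed before load runs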

You will be responsible for the architecture, build and management of automated data pipelines running in the AWS cloud. You will work with Python Engineers and Data Engineers to deliver secure, performant and maintainable data automation. The ideal candidate will be able to pioneer ways of working which other team members can adopt.

Working closely with the Enterprise Architect, you will be a crucial part of a dynamic, agile Data function, collaborating with a multidisciplinary team: Python Engineers, Data Engineers and Product Management, to name a few.

For respondent survey pipeline building, you will automate components of a manual load process and then add these to the pipeline. The aim is to reduce manual intervention and increase processing speed, so that we can repeatedly process more data, at higher frequencies, in the future. Where needed, you may choose to add manual tasks to the automation workflow.
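Where a manual step must remain, one common pattern is to represent it inside the automated workflow as a sensor that pauses the pipeline until an operator signals completion, for example by dropping a sign-off file. A minimal sketch, again assuming Airflow 2.x, with hypothetical names and paths:

    # A minimal, hypothetical sketch of a manual task inside an
    # automated Airflow 2.x workflow: the load is gated on a human
    # sign-off, represented by a FileSensor.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from airflow.sensors.filesystem import FileSensor

    def load_survey_batch():
        # Hypothetical placeholder for the automated load step.
        print("loading respondent survey batch")

    with DAG(
        dag_id="survey_load_with_manual_gate",  # hypothetical name
        start_date=datetime(2022, 1, 1),
        schedule_interval=None,  # triggered on demand
    ) as dag:
        # The "manual task": the pipeline waits here until an operator
        # drops a sign-off file, polling every 10 minutes.
        wait_for_signoff = FileSensor(
            task_id="wait_for_manual_signoff",
            filepath="/data/incoming/signoff.ok",  # hypothetical path
            poke_interval=600,
        )

        load = PythonOperator(
            task_id="load_survey_batch",
            python_callable=load_survey_batch,
        )

        wait_for_signoff >> load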

This is an awesome opportunity to build cutting-edge data automation essential to the media planning industry. We are focused on delivering improvements to clients and delivering robust, high-quality solutions on AWS using the latest development practices.

What you will do:

  • Use large-scale workflow management platforms, e.g. Airflow, to extend and optimize our automated data ingestion platform and its data loading routines
  • Contribute individually, using Python to build custom ETL processes in the design and implementation of data pipelines
  • Use Python to convert, parse and manipulate data files using various database programs and utilities
  • Advise on the ingestion of data from source systems, identifying and resolving data quality and other related issues
  • Document the functional state of current workflows for the wider Technical team, e.g. the automation pipeline and the utilities used within it
  • Investigate current data loading procedures, planning the pipelines and steps required to automate extract, transform and load (ETL) processes
  • Collaborate with the wider team, in particular Infrastructure Engineers, to deploy automation
  • Use best-practice CI/CD methodologies to ensure that build and deployment pipelines are fast, robust and secure
  • Coach and mentor engineers within the Data Automation team, providing a framework that supports learning

Person Specification:

  • Proficiency in the Python scripting language
  • Experience with large-scale implementations of workflow management platforms
  • Prior experience with Apache Airflow is highly desirable, e.g. understanding the concepts of DAGs (Directed Acyclic Graphs) and Operators used to schedule jobs
  • Prior experience with data analysis, data warehousing and customer data platforms
  • Working knowledge of message queuing, stream processing, and highly scalable “big data” data stores
  • Experience manipulating, processing and extracting value from large, disconnected datasets
  • Ability to place security as a primary concern in the architecture, secure coding, build and deployment of solutions
  • Experience performing root cause analysis on internal/external data and processes
  • Technical expertise with data models, data mining, and segmentation techniques
  • Proficiency with GitHub for source code repositories
  • Knowledge of popular Amazon Web Services (AWS) infrastructure & services e.g. EC2, RDS, S3, Lambda
  • Excellent Linux scripting skills
  • Experience with JIRA and the ability to define technical acceptance criteria for stories

Personal Attributes:

  • Passionate about the power of data to drive better business outcomes for customers
  • Proven ability to work effectively in a distributed working environment
  • Outstanding written and verbal communication skills
  • Ability to estimate the effort of your own tasks and those of others in your domain of expertise
  • Organized, detail-oriented, and deadline-driven
  • A champion of good code quality and architectural practices
  • Strong interpersonal skills and the ability to work proactively and as a team player
  • Ability to work efficiently and productively in a fast-paced environment
  • Excellent problem-solving skills
  • Willingness to learn new skills

Telmar’s Perks:

  • Flexible working
  • Medical cover
  • Employee Assistance Program (24/7)
  • Weekly open door meeting with CEO and the HR Team
  • Summer Team Building Events
  • Christmas Parties
  • Office gatherings (when COVID-19 restrictions allow)
  • Random coffee dates every other week
  • Meeting free Fridays

Apply for This Job
