Data Engineer

Summary

Salary: Competitive
Team: Information Technology
Location: Malmesbury - United Kingdom

About us

Dyson is growing fast and our ambition is huge – more categories, more locations and more people.

Pioneering technology takes more than just inventive engineers. At Dyson, we take a problem-solving approach to everything we do.

Our IT team provides modern solutions that help to remove complexity and drive efficiency as our business changes and grows.

Data and analytics excellence at Dyson is delivered by a diverse and collaborative global community spread across Dyson locations from Bristol to Chicago and Malmesbury to Singapore. Domain-specific experts form ‘spoke’ analytics teams, enabled by a central team at the hub. All teams benefit from significant recent investment in cloud technologies and tools, combined with an expansive scope and no shortage of ambition and momentum; data and analytics are recognised throughout the organisation, to the highest level, as critical to all of Dyson’s strategic objectives.

With a ‘one-team’ approach, the global community are on a mission to:

  • Evolve existing solutions to stay ahead
  • Embed emerging solutions to capitalise on potential benefits
  • Deliver conceptualised and future solutions to introduce net-new capability

As the ‘hub’ team delivering the data, technology and community provision that enables Dyson’s global data and analytics capabilities, Global Data Services (GDS) have end-to-end responsibility for data, from foundations (data quality, master data management) to management (data platforms, integrations) to value realisation (analytics enablement and delivery).

GDS are a multi-disciplinary, global team providing round-the-clock development and operations – including product and project management, community enablement, governance, data architecture, data engineering, data science, and analytics expertise.   

Involved with every aspect of Dyson’s global business – from finance to product development, manufacturing to owner experience – GDS are enjoying record-breaking investment and a mandate for 2021 and beyond, seeking to deliver solutions that generate tangible business value.



About the role

As a Data Engineer you will be responsible for developing, industrialising, and optimising Dyson's big data platform running on GCP. You will ingest new data sources, write data pipelines as code, and transform, enrich and publish data using the most efficient methods.
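
For a flavour of what “data pipelines as code” can look like in this context, here is a minimal sketch assuming Apache Beam running on Dataflow (one of the technologies mentioned in this posting). The project, bucket and table names are hypothetical placeholders, not Dyson systems:

    import json
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def enrich(record: dict) -> dict:
        # Stand-in for real enrichment logic: derive one extra field.
        record["is_uk"] = record.get("country") == "GB"
        return record

    def run():
        # Hypothetical project/bucket; use runner="DirectRunner" to test locally.
        options = PipelineOptions(
            runner="DataflowRunner",
            project="example-project",
            region="europe-west2",
            temp_location="gs://example-bucket/tmp",
        )
        with beam.Pipeline(options=options) as p:
            (
                p
                # Assumes newline-delimited JSON event files in Cloud Storage.
                | "Read" >> beam.io.ReadFromText("gs://example-bucket/raw/events-*.json")
                | "Parse" >> beam.Map(json.loads)
                | "Enrich" >> beam.Map(enrich)
                | "Publish" >> beam.io.WriteToBigQuery(
                    "example-project:analytics.events",
                    # Assumes the target table already exists, so no schema is needed.
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                )
            )

    if __name__ == "__main__":
        run()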

Working with data from across Dyson’s global data estate, you will understand the best way to serve data at scale to a global audience of analysts. You will work closely with data architects, data scientists and data product managers on the team to ensure that we are building integrated, performant solutions.

Ideally you will have a software engineering mindset, be able to leverage CI/CD, and apply critical thinking to the work you undertake. The role would suit candidates looking to move from traditional big data stacks such as Spark and Hadoop to cloud-native technologies (Dataflow, BigQuery, Docker/Kubernetes, Pub/Sub, Redshift, Cloud Functions). Candidates with strong software development skills who wish to make the leap to working with data at scale will also be considered.

Responsibilities include:

  • Designing and building end-to-end data engineering solutions on the Google Cloud Platform.
  • Being a proactive member of a DevOps/Agile scrum-driven team, always looking for ways to tune and optimise all aspects of work delivered on the platform.
  • Aligning work to both core development standards and architectural principles.


About you

  • Strong programming skills in languages such as Python, Java or Scala, including building, testing and releasing code into production
  • Strong SQL skills and experience working with relational/columnar databases (e.g. SQL Server, Postgres, Oracle, Presto, Hive, BigQuery)
  • Knowledge of data modelling techniques and integration patterns
  • Practical experience writing data analytics pipelines
  • Experience integrating/interfacing with REST APIs / Web Services
  • Experience handling data securely
  • Experience with DevOps software delivery and CI/CD processes
  • A willingness to learn and find solutions to complex problems
  • Resilient and comfortable with a high pace of change

Desirable:

  • Experience migrating from on-premises data stores to cloud solutions
  • Experience designing and building real-time/near-real-time solutions using streaming technologies (e.g. Dataflow/Apache Beam, Flink, Spark Streaming)
  • Hands-on experience with cloud environments (GCP & AWS preferred)
  • Building APIs and apps using Python/JavaScript or an alternative language
  • Practical experience with traditional big data stacks (e.g. Spark, Flink, HBase, Flume, Impala, Hive)
  • Experience with non-relational database solutions (e.g. BigQuery, Bigtable, MongoDB, DynamoDB, HBase, Elasticsearch)
  • Experience with AWS Data Pipeline, Azure Data Factory or Google Cloud Dataflow
  • Working with containerisation technologies (Docker, Kubernetes, etc.)
  • Experience working with data warehouse solutions, including extracting and processing data using a variety of programming languages, tools and techniques (e.g. SSIS, Azure Data Factory, T-SQL, PL/SQL, Talend, Matillion, NiFi, AWS Data Pipeline)

Benefits

  • 27 days holiday plus eight statutory bank holidays 
  • Pension scheme 
  • Performance related bonus 
  • Life assurance 
  • Sport centre 
  • Free on-site parking 
  • Subsidised café and restaurants 
  • Discounts on Dyson machines

Interview guidance

We are following the government guidelines regarding COVID-19. At this time, all interviews will be conducted via video or telephone. We’re taking these precautionary measures to protect the wellbeing of both our employees and candidates. Our Talent Acquisition team will work with you and provide further information as appropriate.