Data Engineer


Digital, Information Technology, Data Science
Bristol, Malmesbury - United Kingdom


As a Data Engineer you will be responsible for developing, industrialising, and optimising Dyson's big data platforms across AWS and GCP. You will ingest new data sources, write data pipelines as code, and transform and enrich data using the most efficient methods.


  • Working with data from across Dyson’s global data estate, you will understand the best way to serve up data at scale to a global audience of analysts.
  • You will work closely with data architects, data scientists and data product managers on the team to ensure that we are building integrated, performant solutions.
  • Ideally you will have a Software Engineering mindset, be able to leverage CI/CD and apply critical thinking to the work you undertake.
  • The role would suit candidates looking to move from traditional big data stacks such as Spark and Hadoop to cloud-native technologies (Dataflow, BigQuery, Docker/Kubernetes, Pub/Sub, Redshift, Cloud Functions).
  • Candidates with strong software development skills who wish to make the leap to working with data at scale will also be considered.


  • Strong programming skills in languages such as Python, Java or Scala, including building, testing and releasing code into production.
  • Strong SQL skills and experience working with relational/columnar databases (e.g. SQL Server, Postgres, Oracle, Presto, Hive, BigQuery).
  • Knowledge of data modelling techniques and integration patterns.
  • Experience migrating from on-premises data stores to cloud solutions.
  • Building APIs and apps using Python, JavaScript or an alternative language.
  • Practical experience with traditional big data stacks (e.g. Spark, Flink, HBase, Flume, Impala, Hive).
  • Experience with AWS Data Pipeline, Azure Data Factory or Google Cloud Dataflow.
  • Experience working with data warehouse solutions, including extracting and processing data using a variety of programming languages, tools and techniques (e.g. SSIS, Azure Data Factory, T-SQL, PL/SQL, Talend, Matillion, NiFi, AWS Data Pipeline).


  • 27 days holiday plus eight statutory bank holidays
  • Pension scheme
  • Performance related bonus
  • Life assurance
  • Free on-site parking
  • Lift share scheme
  • Subsidised café and restaurants
  • Discount on Dyson machines