Data Engineer


Salary will be determined based on credentials
Information Technology
Chicago - USA


As a Data Engineer, you will be responsible for developing, industrializing, and optimizing Dyson's big data platforms across AWS and GCP. You will ingest new data sources, write data pipelines as code, and transform and enrich data using the most efficient methods.


  • Working with data from across Dyson’s global data estate, you will understand the best way to serve up data at scale to a global audience of analysts.
  • You will work closely with data architects, data scientists and data product managers on the team to ensure that we are building integrated, performant solutions.
  • Ideally, you will have a software engineering mindset, leverage CI/CD, and apply critical thinking to the work you undertake.
  • The role would suit candidates looking to make the move from working with traditional big data stacks such as Spark and Hadoop to using cloud-native technologies (Dataflow, BigQuery, Docker/Kubernetes, Pub/Sub, Redshift, Cloud Functions).
  • Candidates with strong software development skills who wish to make the leap to working with data at scale will also be considered.


  • Strong programming skills in languages such as Python, Java, or Scala, including building, testing, and releasing code into production.
  • Strong SQL skills and experience working with relational/columnar databases (e.g. SQL Server, Postgres, Oracle, Presto, Hive, BigQuery).
  • Knowledge of data modelling techniques and integration patterns.
  • Experience migrating from on-premises data stores to cloud solutions.
  • Building APIs and apps using Python/JavaScript or an alternative language.
  • Practical experience with traditional big data stacks (e.g. Spark, Flink, HBase, Flume, Impala, Hive).
  • Experience with AWS Data Pipeline, Azure Data Factory, or Google Cloud Dataflow.
  • Experience working with data warehouse solutions, including extracting and processing data using a variety of programming languages, tools, and techniques (e.g. SSIS, Azure Data Factory, T-SQL, PL/SQL, Talend, Matillion, NiFi, AWS Data Pipeline).
  • Keywords: GCP, Google Cloud Platform, AWS, BigQuery, Dataflow, Apache Beam, Flink, Spark, Kubernetes, Data Analytics, Pipelines, Python, Java, Scala, Pub/Sub, Streaming Analytics, Big Data


Dyson US monitors the market to ensure competitive salaries, holidays and retirement plans. Beyond that, you’ll also enjoy profit-related bonuses and life and disability cover. But financial rewards are just the start of a Dyson career. Rapid professional growth, leadership development and new opportunities abound, driven by regular reviews and dynamic workshops. And with a vibrant culture, the latest devices and a relaxed dress code reflecting our engineering spirit, it’s an exciting team environment geared to fueling and realizing ambition.
Dyson is committed to providing reasonable accommodations to individuals with disabilities. If you are interested in applying for employment with Dyson and need a reasonable accommodation for any part of the application process, please send an e-mail with your contact information, the job number of the position for which you are interested in applying, and the nature of your request to or call (312) 706-2260. Determinations of requests for reasonable accommodation are made on a case-by-case basis.

Posted: 03 September 2019