- Competitive salary
- Information Technology
- Malmesbury - United Kingdom
Whilst pockets of insight and analytics exist, building our analytical capabilities is a core pillar of Dyson's new global data strategy. As more consumers engage with Dyson, own more types of products, and do so across more markets, the volume of data is now vast. Understanding how to use this data to improve everything from customer experiences to campaign development and retailer business planning will be key to Dyson's success.
This role sits within the GB&I Business Intelligence & Analytics team and is tasked with ensuring that the GB&I market is able to leverage rich, accurate, timely data that drives reporting and analytics to generate insights and make data-driven decisions. You will own our data ingestion roadmap, lead our modelling capabilities, and design supporting processes to ensure smooth delivery of high-availability data that feeds seamlessly into reporting. You will work closely with our Business Intelligence and Analytics team to ensure that requirements and timelines on all deliverables are clear, owning the provision of timely and accurate data. In addition, you will work closely with our Data Engineers to ensure we adhere to process and best practice within the context of the Dyson Data Platform.
Accountabilities
- Lead and develop a small team of data analysts responsible for ingesting, modelling, and enriching data within GCP.
- Ensure that data is ingested and analysed in a secure, efficient, and well-governed manner.
- Work alongside our Global Data Services team of Data Engineers to help shape Dyson's data platform from a GB&I perspective by ingesting, modelling, and enriching data, leveraging the latest cloud technologies and programming languages.
- Develop and optimise Dyson's data platform to meet GB&I-specific data and reporting requirements.
- Ingest new data sources, write data pipelines, and transform and enrich data using the most efficient methods.
- Maintain existing data pipelines and build automated pipelines that include log monitoring and data quality checks.
Skills
- Previous experience with data lakes and/or data warehouses, preferably on Google Cloud Platform.
- Skilled in writing performant SQL, Python, and Spark.
- Experience with Google Cloud data analytics products such as BigQuery, Dataflow, Dataproc, etc.
- Ability to specify, design, develop, test, and deploy solutions.
- Familiarity with data architecture and design principles.
- Ability to lead, manage, and motivate a team.
- Enthusiasm to explore new technologies, methods, and techniques.
- 27 days holiday plus eight statutory bank holidays
- Pension scheme
- Life assurance
- Sport centre
- Free on-site parking
- Subsidised café and restaurants
- Discounts on Dyson machines