Contracted Hours: 39
Contract Type: Permanent
Location: Chester House, Epsom Ave, Handforth, Cheadle, Greater Manchester, SK9 3DF
This is a fantastic opportunity to come and work for a company voted 5th in the Great Place to Work (Best Large Workplaces) awards 2018, and Number 15 in Europe in 2017! We have over 450 stores, over 310 Groom Rooms and we’re the UK's number one pet care business. Our business is fast-paced, innovative and fun and it's our people that make the difference.
Since the appointment of the Chief Data Officer onto the Executive team we have outlined a strategy to establish “Pet Care Analytics for All”. Analytics will become the lifeblood of our Pet Care ambitions across the Group. We will establish an internal set of capabilities (people, process, technology and culture) that will allow the Group to:
- Get more impactful, relevant and timely insight out to colleagues and partners
- Increase the velocity and effectiveness of our VIP activity and customer experience journeys
- Leverage advanced analytics to optimise our supply chain, customer spend, workload, colleague rotas, pricing, next best action and vet practice profitability
- Introduce further artificial intelligence to improve pet welfare and diagnosis
To support this, we are looking for a Data Engineer to be based at our Support Office in Handforth (Cheshire).
We are at the early stages of our journey and, as a Data Engineer, you will have a lot of opportunity to influence the technical and strategic direction of the team. You will also have the opportunity to take ownership of parts of the existing platform and new greenfield development. This role sits within the newly formed Pet Care Analytics Group (PCAG), specifically the Data Engineering team reporting to our Lead Data Engineer.
Key Responsibilities:
- Develop, maintain and improve our cloud data platform and help plan, design, monitor, communicate and execute data projects
- Assist the analytics teams in the implementation of their machine learning use-cases
- Evangelise our data platform, products and Data Engineering capabilities to other departments, in order to bring more relevant data into our ecosystem and develop future data products that solve real business problems
- Maintain simple, useful technical documentation, ensuring that our services and applications are easy for the analytics community to understand and use
- Deliver software that is scalable, highly available and fault-tolerant
- Drive automation, particularly in continuous integration pipelines, infrastructure management and configuration
- Ensure that data entering and leaving the analytical platform is of the highest quality and complies with legal requirements
- Adopt and improve software development patterns and best practices for an analytical platform, particularly around open-source components
- Apply continuous delivery and DevOps experience to drive infrastructure automation, monitoring, logging, auditing and security practices within the Data Engineering team
- Conduct code reviews, pair programming and knowledge sharing sessions
Skills / Competencies Required:
- Demonstrable experience of working with and designing a cloud-based analytical platform including best practices around data ingestion on an industrial scale (batch and streaming ETL/ELT) and turning data science/machine learning algorithms into production-grade products
- Strong software development skills (particularly in Python): object-oriented and/or functional design, coding and testing patterns, relevant DevOps principles, and the ability to write clean documentation
- Solid knowledge of data modelling and structures, and experience with data lake and data warehouse tools and techniques (BigQuery, Spanner, Snowflake, Redshift etc.)
- Hands-on experience with ingesting and processing streaming data – messaging queues (RabbitMQ, Kafka, Pub/Sub etc.) and data flow orchestration (Dataflow, Apache NiFi, Airflow, Luigi etc.)
- Strong understanding of the end-to-end deployment process of data products (from raw code to scalable deployment), the relevant CI/CD tools (Jenkins, Spinnaker, TeamCity) and containerisation (Docker, Kubernetes, Helm)
- Strive to create the simplest solution to any problem, using the right tool for the job
- Focus on high-quality, reliable and fault tolerant software/systems
- Adapt to new technologies and technical challenges
- Support project delivery with pragmatic estimates and progress tracking
- Work collaboratively and support colleagues in areas where they are less experienced
- Excellent verbal and written communication
Benefits:
- Competitive salary plus bonus
- 36 days paid annual leave (including bank holidays), rising to 38 days after two years
- Birthday Leave - 1 day extra leave to celebrate your Birthday!
- The option to buy extra holidays
- An extra day's holiday when you become a new pet parent (Dog/Cat/Horse)
- Pension Scheme
- Colleague Discount- 20% discount for you (plus one family member). This can be used in Pets at Home Stores, Groom Rooms, Companion Care Veterinary Services, and Pet Plan Insurance!
- “Treats” benefits- an online range of offers and discounts which are available exclusively to Pets at Home Colleagues
- Life assurance
- Contributory Private Health Care
- Charity Leave- You are entitled to have one paid day’s leave each year to work for your favourite animal related charity.
- Celebrating Special Celebrations? You’ll get 1 extra week off and a gift to celebrate your wedding or civil partnership, and a gift from us if you are expecting or adopting a baby!
- Free Car Parking
- Free Gym
- Colleague of the Month Awards
- Cycle to Work Scheme
Organisation: Pets at Home
Date Posted: 18-05-2020
Expiry Date: 31-07-2020