
Lead Data Engineer
- Kraków, małopolskie
- Permanent contract
- Full-time
What we offer:
- Up to 29,000 PLN gross per month on an employment contract
- Private medical healthcare at LUXMED (including dental care) for you and your family
- Medicover sports card (Fit&More package)
- Life insurance financed by the employer
- 30-minute lunch break included in the 8-hour working day
- Work in a highly professional and stimulating atmosphere
- Training & Buddy programme that will allow you to quickly adapt to your new role
- Wellbeing programme for employees
- Co-financing of monthly public transport tickets in Kraków
- Comfortable working environment in the office and the possibility of home office
- Language courses, accounting courses, access to LinkedIn Learning and the possibility of co-financing studies and certification
- Employee referral programme
Your responsibilities:
- Work in an agile team, collaborate with other teams, and advise, mentor, and line-manage other Data Engineers.
- Act as an escalation point for the team, unblocking technical or project-related issues to maintain project progress.
- Lead the design, architecture, and delivery of new features and pipelines within the team.
- Support the Principal Data Engineer with resourcing and other day-to-day management duties.
- Maintain high-performing teams by driving continuous improvement of engineering and agile practices and standards.
- Be an active member of the Kingfisher-wide technical lead team, helping to define development processes and code-quality standards, and serve as a technical leader within the organisation.
- Promote test-driven development (TDD) to design and develop operationally maintainable solutions in accordance with Kingfisher standards.
- As a data engineer with a passion for leading and coaching, ensure competence in the specialism is sustained, developed, and encouraged within your team.
Requirements:
- Strong engineering background and proven experience in developing and maintaining large-scale distributed data processing systems, or equivalent expertise.
- Extensive knowledge of data modelling principles, data quality assurance, testing methodologies, and best practices for data access and storage.
- Strong command of scripting languages within the data ecosystem, including Python, SQL, and PySpark.
- Experience with Azure Databricks and dbt for building and optimising data pipelines and workflows.
- Experience in implementing cloud data solutions on at least one major cloud provider (Azure, AWS, or GCP).
- Knowledge of techniques for shortening development lead time and improving the data development lifecycle.
- Familiarity with Agile methodologies and working experience within Agile delivery frameworks.
- Excellent command of English (written and spoken).