We’re First Central Insurance & Technology Group (First Central for short), an innovative, market-leading insurance company. We protect the things customers love so they can get on with what matters to them in life.
Data drives us. It fuels our outstanding distribution, finance, technology and legal services. Our underwriting skills are built on data expertise; it creates the insights we need to give the right cover to the right customers at the right price. But it’s the people inside and outside our business that power us. They make us stand out, help us succeed. We’re ambitious. We’re growing. We’ve won awards.
And right now, we’re looking for a Principal Data Engineer.
At First Central we’re building a sector-leading Microsoft Azure data platform to meet our ambitions for a data-led business. Over the past two years we’ve created and grown a permanent Data Engineering and MLOps function into a fantastic team, and we’re now expanding our data engineering management team with a third Principal Data Engineer role reporting into the Head of Data Engineering. In return, you’ll work with some of the latest technologies, support the expansion of our advanced analytics capabilities and be part of an inclusive, high-performing team.
Our Principal Data Engineer role is a hands-on technical role in which you’ll provide data solution leadership on projects and for persistent data products. You’ll set the technical direction for data platform and data solution design in collaboration with Architecture, ensuring data solutions align with the overall data platform patterns. You’ll give clarity and technical direction to the data engineers who build solutions against the agreed designs. While the role has no direct people management responsibilities, you’ll support, coach and mentor the data engineers, ensuring they’re able to deliver to the required quality and integrity.
This is a flexible hybrid working role with occasional visits, when required, to our offices in Salford Quays (Manchester), Haywards Heath (West Sussex) or Guernsey. We offer great flexibility over working patterns and a business-wide culture to be proud of.
We'd love to have you on the team if:
- You’re an expert in Azure data engineering, including experience of Delta Lakehouse architecture.
- You have experience of data modelling concepts and are comfortable leading modelling design workshops.
- You can demonstrate hands-on experience of data engineering practices and techniques.
- You’re willing to share your experience and guide data engineers to deliver best-in-class artefacts in a cloud solution.
Powering the business with the right tools
Job responsibilities:
- Responsible for creating or guiding the low-level design of data solutions, taking high-level solution architecture artefacts and translating them into workable designs and work packages.
- Responsible for the quality of the overall data platform(s), ensuring that data pipelines and database solutions are implemented effectively from a re-use and performance optimisation perspective.
- Responsible for the coding standards, low-level designs and ingestion patterns for the data platform(s) that all users, including data engineers, follow.
- Develop highly complex, secure, governed, high-quality, efficient data pipelines from a variety of on- and off-premises, internal and external data sources.
- Set the standards and ensure that data is cleansed, mapped, transformed, and optimised for storage to meet requirements for business and technical use cases.
- Design and build data observability and data quality by design into all data pipelines, promoting self-testing pipelines that proactively identify processing issues or discrepancies.
- Build solutions that pipe transformed data into data lake storage areas, physical database models and reporting structures across the data lake, data warehouse, business intelligence systems and analytics applications.
- Build physical data models that are appropriately designed to meet business needs and optimise storage requirements, ensuring maximum re-use.
- Carry out unit testing of your own code and peer testing of others’ code to ensure appropriate quality, and be responsible for the completeness and integrity of solutions delivered on the data platform(s).
- Ensure that effective and appropriate documentation, bringing transparency and understandability, is in place for all content on the data platform(s).
- Coach and mentor Senior Data Engineers, Data Engineers and Associate Data Engineers.
- Create high-complexity BI solutions, including data marts, semantic layers, and reporting and visualisation solutions in recognised BI tools such as Power BI.
Job-specific competencies
Experience & knowledge
- Extensive experience (10+ years) of designing and building end-to-end data solutions.
- Exceptional at building strong, effective relationships with people from different disciplines.
- Experience of carrying out data engineering design and build activities using agile working practices (such as Scrum or Kanban).
- Data Factory/Synapse Workspace – for building Data Factory or Synapse Analytics pipelines.
- Data Lake – Delta Lake design pattern implementation experience in Azure Data Lake Gen2 with hierarchical namespace and low-level permissions management.
- Synapse Warehouse/Analytics – Experience in Synapse data mappings, external tables and schema creation from SSMS, and knowledge of how Synapse pools work behind the scenes.
- Azure Active Directory – for creating and using managed identities, or for generating service principals for authentication and authorisation.
- Version Control – Experience in building DataOps, i.e. CI/CD pipelines, in Azure DevOps with managed identities.
- Unit Testing – Experience in writing unit tests for data pipelines.
- Data Architecture – Knowledge or experience of implementing Kimball-style data warehouses. Experience in building metadata with Azure Purview or Data Lake Gen2.
- Data Quality – Experience in applying data quality rules within Azure Data Flow activities.
- Data Transformation – Extensive hands-on experience with Azure Data Flow activities for cleansing, transformation, validation and quality checks.
- Azure Cloud – Knowledge of, and confidence in communicating effectively about, Azure subscriptions, resource groups, subnets, VNets, private endpoint testing and firewall rules management on Azure data platform components.
- Keen and active interest in the use of data in the wider industry, with practical knowledge and networks.
Skills & Qualifications
- A creative problem solver who thrives on creating simplicity out of complexity.
- A passion for people and creating environments that enable others to flourish.
- Resilient and comfortable prioritising in demanding situations.
- Highly trustworthy and able to operate with integrity and discretion at all times.
- Energetic and proactive, and someone who motivates others by their “can do, will do” attitude.
- Able to operate with a minimal brief and a fast-moving set of changing priorities.
- Ability to bring together multiple different views and perspectives to create agreed designs and solutions.
- Attention to detail.
This is just the start. Imagine where you could end up! The journey’s yours…
What can we do for you?
People first. Always. We’re passionate about our colleagues and know the best people deserve an extraordinary working environment. We owe it to them, so that’s what we offer. Our workplaces are energetic, inspirational, supportive. To get a taste of the advantages you’ll enjoy, take a look at all our perks in full here.
Intrigued? Our Talent team can tell you everything you need to know about what we want and what we’re offering, so feel free to get in touch.