About the Role


We are seeking a Microsoft Fabric Developer to build data products and support the data layer that powers AI agents. The role splits between delivering pipelines, Lakehouses, and Power BI reports, and maintaining the Fabric‑based foundations used by AI agents across the programme.

A senior Data Engineer and Architect provides technical direction, so this position focuses on hands‑on delivery: building, maintaining, and iterating on pipelines, models, and reports. Strong Fabric and Power BI fundamentals, plus comfort with Python and SQL, are essential, along with an interest in AI‑adjacent data patterns.


Core Skills & Requirements

  • Ability to build and maintain Power BI reports and dashboards used by business stakeholders, including data modelling in Power BI Desktop (relationships, calculated columns, DAX measures) and publishing and managing reports in Fabric workspaces. Must understand the difference between a well‑structured model and a flat table report.

  • Hands‑on experience with Microsoft Fabric workspaces, Lakehouse, and Dataflows Gen2, with a solid understanding of how data moves through Fabric and how the Lakehouse connects to Power BI. Full platform expertise is not required; productivity with the core components and a willingness to learn the rest are essential.

  • Ability to build data pipelines in Fabric or Azure Data Factory to ingest and move data from SQL databases, APIs, SharePoint lists, and flat files into the Lakehouse, including basic scheduling and failure alerts. Pipelines do not need to be complex, but they must be reliable.

  • Strong SQL skills for querying, transforming, and preparing data, including writing views and transformation queries for reports and AI agents. Must be comfortable with moderately complex SQL without needing deep performance‑tuning expertise.

  • Ability to write and maintain PySpark or Python notebooks in Fabric for data transformation tasks that go beyond what pipelines can handle, including reading, editing, and running notebooks and using GitHub Copilot to accelerate this work.

  • Awareness of Azure AI Search for indexing Fabric and SharePoint content to support RAG‑based AI agents. Useful for the programme and preferred, but it can be learned on the job and is not a blocker.
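To illustrate the SQL expectation above (views and transformation queries that feed reports and agents), here is a minimal sketch using Python's built‑in sqlite3 as a stand‑in for a Lakehouse SQL endpoint; the table, view, and column names are hypothetical:

```python
import sqlite3

# In-memory database standing in for a Lakehouse SQL endpoint (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL, status TEXT);
    INSERT INTO orders VALUES
        (1, 'North', 120.0, 'complete'),
        (2, 'North', 80.0,  'cancelled'),
        (3, 'South', 200.0, 'complete');

    -- Transformation view: completed revenue per region, ready for a report.
    CREATE VIEW v_region_revenue AS
    SELECT region, SUM(amount) AS revenue, COUNT(*) AS order_count
    FROM orders
    WHERE status = 'complete'
    GROUP BY region;
""")
rows = conn.execute(
    "SELECT region, revenue FROM v_region_revenue ORDER BY region"
).fetchall()
print(rows)  # [('North', 120.0), ('South', 200.0)]
```

A view like this keeps the transformation logic in one place, so both a Power BI report and an AI agent read the same prepared shape rather than re‑deriving it.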
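The pipeline reliability expectation above (basic scheduling and failure alerts) can be sketched in plain Python; `run_with_alert` and `flaky_ingest` are hypothetical names, and the `alert` callable is a stand‑in for whatever notification channel the programme actually uses:

```python
import time

def run_with_alert(step, retries=2, delay_s=0, alert=print):
    """Run a pipeline step, retrying on failure and alerting if all attempts fail.

    `step` is any zero-argument callable; `alert` is a stand-in for a real
    notification channel (email, Teams webhook, etc.).
    """
    for attempt in range(1, retries + 2):
        try:
            return step()
        except Exception as exc:
            if attempt > retries:
                alert(f"Pipeline step failed after {attempt} attempts: {exc}")
                raise
            time.sleep(delay_s)  # back off before retrying

# Usage: a flaky ingestion step that succeeds on the second attempt.
calls = {"n": 0}
def flaky_ingest():
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("transient source error")
    return "42 rows ingested"

print(run_with_alert(flaky_ingest))  # 42 rows ingested
```

The point is not the wrapper itself but the habit it encodes: a simple pipeline that retries transient failures and raises a visible alert when it genuinely breaks.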

Core Responsibilities

  • Deliver Power BI reports and dashboards for business stakeholders across the client organisation.

  • Build and maintain Fabric pipelines that ingest data from operational systems, APIs, SharePoint lists, SQL sources, and flat files.

  • Develop and manage Lakehouse tables and semantic models that act as the single source of truth for reporting.

  • Provide ongoing maintenance, fixes, and enhancements to data products as business requirements evolve.

  • Prepare and maintain Gold layer tables used by AI agents at runtime.

  • Keep Azure AI Search indexes aligned with source documents to support RAG‑based agents.

  • Write outputs from AI agents (summaries, assessments, and flags) back into defined Fabric tables.

  • Monitor pipeline health to ensure agents always have access to fresh, accurate data.
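The write‑back responsibility above can be sketched as an idempotent upsert, so re‑running an agent updates its earlier row rather than duplicating it. Here sqlite3 stands in for a Fabric SQL endpoint, and the `agent_outputs` schema is hypothetical:

```python
import sqlite3

# sqlite3 stands in for a Fabric SQL endpoint; the schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE agent_outputs (
        item_id    TEXT PRIMARY KEY,
        summary    TEXT,
        assessment TEXT,
        flagged    INTEGER
    )
""")

def write_agent_output(conn, item_id, summary, assessment, flagged):
    """Idempotent write-back: re-running an agent updates its earlier row."""
    conn.execute(
        """
        INSERT INTO agent_outputs (item_id, summary, assessment, flagged)
        VALUES (?, ?, ?, ?)
        ON CONFLICT(item_id) DO UPDATE SET
            summary    = excluded.summary,
            assessment = excluded.assessment,
            flagged    = excluded.flagged
        """,
        (item_id, summary, assessment, int(flagged)),
    )

write_agent_output(conn, "case-001", "Initial summary", "low risk", False)
write_agent_output(conn, "case-001", "Revised summary", "medium risk", True)
row = conn.execute(
    "SELECT summary, flagged FROM agent_outputs WHERE item_id = 'case-001'"
).fetchone()
print(row)  # ('Revised summary', 1)
```

Keying on a stable identifier is what keeps the defined tables a single source of truth as agents re‑process the same items over time.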

Nice to Have

  • DAX capability beyond basic measures, including time‑intelligence patterns, calculated tables, and understanding of row context.

  • Experience with Azure Data Factory, which maps closely to Fabric Pipelines and significantly shortens the ramp‑up time.

  • Familiarity with Power BI Service administration, including workspace management, row‑level security, and deployment pipelines.

  • Understanding of Delta Lake fundamentals, including Bronze/Silver/Gold layers and incremental load patterns.

  • Exposure to KQL (Kusto Query Language) for Fabric Eventhouse and real‑time analytics scenarios, which are becoming increasingly important.
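The incremental‑load pattern mentioned above can be sketched in plain Python: on each run, only rows newer than a stored watermark move from the Bronze to the Silver layer. The layer names follow the medallion convention; the data and variable names are hypothetical:

```python
# Minimal watermark-based incremental load, Bronze -> Silver (hypothetical data).
bronze = [
    {"id": 1, "modified": "2024-01-01", "value": 10},
    {"id": 2, "modified": "2024-01-02", "value": 20},
]
silver = []
watermark = ""  # highest 'modified' value already loaded into Silver

def incremental_load():
    """Copy only rows newer than the watermark, then advance the watermark."""
    global watermark
    new_rows = [r for r in bronze if r["modified"] > watermark]
    silver.extend(new_rows)
    if new_rows:
        watermark = max(r["modified"] for r in new_rows)
    return len(new_rows)

print(incremental_load())  # 2  (first run loads everything)
bronze.append({"id": 3, "modified": "2024-01-03", "value": 30})
print(incremental_load())  # 1  (second run loads only the new row)
```

In a real Delta Lake setup the watermark would persist between runs (e.g. in a control table) instead of living in a variable, but the shape of the pattern is the same.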


Apply for this position now