Microsoft Teams can answer questions using Microsoft Fabric AI Skills and Copilot Studio
I personally think that AI Skills fill the gap perfectly when looking to answer those ad-hoc questions which cannot be catered for in Power BI Semantic Models. For example, while it is possible to calculate returning customers in Power BI, the measure is complex and uses a lot of resources to get the result. When comparing this to running…
How to create a Case Insensitive Warehouse in Microsoft Fabric
This is a quick blog post to show you how to use a Microsoft Fabric Notebook to quickly and easily create a Case Insensitive Warehouse. A quick note: when I talk about a Case Insensitive Warehouse, what that means is that the upper and lower casing of column names and text is ignored. By default, Warehouses and Lakehouses…
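For illustration, here is a minimal sketch of the approach (assuming the semantic-link `FabricRestClient` is available in the notebook; the workspace ID and warehouse name are placeholders). The key point is that the collation can only be set at creation time, via the `creationPayload` of the Fabric items REST API:

```python
# Minimal sketch: create a case-insensitive Warehouse from a Fabric notebook
# via the Fabric REST API. Workspace ID and display name are placeholders.
import sempy.fabric as fabric

client = fabric.FabricRestClient()
workspace_id = "00000000-0000-0000-0000-000000000000"  # placeholder workspace ID

payload = {
    "type": "Warehouse",
    "displayName": "WH_CaseInsensitive",  # placeholder name
    "description": "Warehouse created with a case-insensitive collation",
    "creationPayload": {
        # CI = case-insensitive; the collation cannot be changed after creation
        "defaultCollation": "Latin1_General_100_CI_AS_KS_WS_SC_UTF8"
    },
}

response = client.post(f"/v1/workspaces/{workspace_id}/items", json=payload)
print(response.status_code)  # 201/202 indicates the request was accepted
```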
How to read and write to different Lakehouses in different App Workspaces when using a Fabric Notebook
With the new schemas in a Lakehouse, it is now possible to read from Lakehouse A (in Workspace A) and write to Lakehouse B (in Workspace B). Here are more details about the schema preview: Lakehouse schemas (Preview) – Microsoft Fabric | Microsoft Learn. This opens a whole new world of possibilities. I also really like the fact that I…
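As a rough sketch of what this looks like in a notebook (the workspace, Lakehouse, schema, and table names below are all placeholders), both Lakehouses can be addressed through fully qualified OneLake ABFS paths:

```python
# Sketch: read a table from Lakehouse A (Workspace A) and write it to
# Lakehouse B (Workspace B) using fully qualified OneLake ABFS paths.
# All workspace/Lakehouse/schema/table names are placeholders.
source = (
    "abfss://WorkspaceA@onelake.dfs.fabric.microsoft.com/"
    "LakehouseA.Lakehouse/Tables/dbo/Sales"
)
target = (
    "abfss://WorkspaceB@onelake.dfs.fabric.microsoft.com/"
    "LakehouseB.Lakehouse/Tables/dbo/Sales"
)

# "spark" is the session Fabric notebooks provide out of the box
df = spark.read.format("delta").load(source)              # read from Workspace A
df.write.format("delta").mode("overwrite").save(target)   # write to Workspace B
```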
Understanding what is consuming your Fabric Storage
This blog post will show you how to understand what is consuming your Fabric Storage. If you want to know how I got this data, please read my previous blog post View all your Storage consumed in Microsoft Fabric – Lakehouse Files, Tables and Warehouses – FourMoo. With the semantic model below, I could also create alerts to notify based…
View all your Storage consumed in Microsoft Fabric – Lakehouse Files, Tables and Warehouses
One of the things I have found when working with my customers in Microsoft Fabric is that there is currently no way to easily view the total storage for the entire tenant. Not only that, but it would also be time-consuming and quite a challenge to then find out what is consuming the storage. Could it be large files…
Configuring Service Principal access to Microsoft Fabric OneLake APIs
This blog post explains how to configure access for my Service Principal to interact with the Azure Storage API and get details about Microsoft Fabric Storage. This is part of a blog post series where I am going to show you how to “View Total Storage consumed in Microsoft Fabric”. When I started this blog post…
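Since OneLake exposes an ADLS Gen2 compatible endpoint, one possible shape of the connection (tenant ID, client ID, secret, and workspace name below are placeholders) looks like this:

```python
# Sketch: authenticate a Service Principal against OneLake using the
# Azure Storage (ADLS Gen2) SDK. All credentials below are placeholders.
from azure.identity import ClientSecretCredential
from azure.storage.filedatalake import DataLakeServiceClient

credential = ClientSecretCredential(
    tenant_id="<tenant-id>",       # placeholder
    client_id="<app-client-id>",   # placeholder
    client_secret="<app-secret>",  # placeholder: use a secure store in practice
)

# OneLake speaks the ADLS Gen2 protocol; the "file system" is the workspace
service = DataLakeServiceClient(
    account_url="https://onelake.dfs.fabric.microsoft.com",
    credential=credential,
)
fs = service.get_file_system_client("WorkspaceName")  # placeholder workspace

# List the items (Lakehouses, Warehouses, folders) at the workspace root
for path in fs.get_paths(recursive=False):
    print(path.name)
```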
How to view or track the progress of a Notebook while it is running in Microsoft Fabric
I was recently working with a Notebook in Microsoft Fabric that was started via a Data Pipeline. The challenge I had was that I had no idea how far the notebook had gone (as there were quite a lot of cells in this particular…
Using a Service Principal to get all Entra ID Group Members into a JSON File using a Python Notebook
Sometimes it is useful to get all Group Members into a JSON file so that this can be used for reporting purposes. Reference Notebook: Get Entra ID Group Members – Power BI. In the steps below I will show you how I…
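A minimal sketch of the idea, assuming the Service Principal has been granted the Microsoft Graph `Group.Read.All` application permission (the IDs, secret, and output path are placeholders):

```python
# Sketch: use a Service Principal (client credentials flow) to pull all
# members of an Entra ID group from Microsoft Graph and save them as JSON.
import json
import msal
import requests

app = msal.ConfidentialClientApplication(
    client_id="<app-client-id>",                                # placeholder
    client_credential="<app-secret>",                           # placeholder
    authority="https://login.microsoftonline.com/<tenant-id>",  # placeholder
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])
headers = {"Authorization": f"Bearer {token['access_token']}"}

group_id = "<group-id>"  # placeholder group ID
url = f"https://graph.microsoft.com/v1.0/groups/{group_id}/members"

members = []
while url:  # follow @odata.nextLink paging until all members are collected
    page = requests.get(url, headers=headers).json()
    members.extend(page.get("value", []))
    url = page.get("@odata.nextLink")

# /lakehouse/default/Files/ is the default Lakehouse mount in a Fabric notebook
with open("/lakehouse/default/Files/group_members.json", "w") as f:
    json.dump(members, f, indent=2)
```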
How to get data from a Fabric Lakehouse File into Power BI Desktop – Using Scanner API JSON
In this blog post I am going to show you how I connected to my Scanner API JSON file, which is stored in the Files section of my Microsoft Fabric Lakehouse. Full credit for how to complete this goes to Marc’s blog…
Downloading Scanner API data using a Microsoft Fabric Notebook
I was recently working with a customer who had more than 100 app workspaces, and I was running into some challenges when using the Scanner API in Power Automate. I then discovered a blog post detailing how to download the Scanner API data (DataXbi – admin-scan.py); it was…
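For context, here is a stripped-down sketch of the Scanner API flow that a notebook can automate (token acquisition is omitted and the workspace IDs are placeholders; the `getInfo` call accepts at most 100 workspaces per batch, which is exactly where a notebook loop becomes useful):

```python
# Sketch of the admin Scanner API flow: submit a batch of workspace IDs,
# poll until the scan finishes, then download the result as JSON.
import time
import requests

base = "https://api.powerbi.com/v1.0/myorg/admin/workspaces"
headers = {"Authorization": "Bearer <access-token>"}  # placeholder token

# 1. Kick off a scan for a batch of workspaces (max 100 IDs per call)
body = {"workspaces": ["<workspace-id-1>", "<workspace-id-2>"]}  # placeholders
scan = requests.post(
    f"{base}/getInfo?lineage=true&datasourceDetails=true",
    headers=headers, json=body,
).json()

# 2. Poll the scan status until it reports "Succeeded"
scan_id = scan["id"]
while requests.get(f"{base}/scanStatus/{scan_id}", headers=headers).json()["status"] != "Succeeded":
    time.sleep(5)

# 3. Download the scan result
result = requests.get(f"{base}/scanResult/{scan_id}", headers=headers).json()
print(len(result.get("workspaces", [])), "workspaces scanned")
```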