With the new Power BI Get-PowerBIActivityEvent cmdlet, I wanted to find a way to automate the entire process so that it all runs in the cloud.

One of the current challenges with the Audit logs is that they are only stored for 90 days, so if you want to do analysis over a longer period the log files have to be stored somewhere else. Why not use Azure Blob Storage?

Whilst these steps might appear rather technical, if you follow them and you have access to an Azure subscription you can do this too.

This is a rather long blog post, but that is because I explain each step in detail. Actually doing it typically takes about 30 minutes.

Assumptions

You already have an Azure Blob Storage account created.

The account that you use to connect to Power BI and get the Audit logs must not have MFA (Multi-Factor Authentication) enabled. The good news is that this account does not have to have a Power BI Pro license, but it MUST have the Power BI Administrator (or Global Administrator) role in order to have the permissions to download the data.

Creating the Function App

Below are the steps that I completed to create the Function App (if you would rather script it, there is a hedged Az PowerShell sketch after this list):

  • In the Azure Portal, I went to All Resources and clicked on Add
  • I then searched for the Function App
  • Next I clicked Create
  • I then configured it with the following details.
    • My subscription that I wanted to use
    • For the Resource Group I used an existing one (this allows me to keep everything together)
    • Function App Name, I put in a name that I could easily recognize
    • Runtime stack, here I chose PowerShell Core because I am going to be running PowerShell scripts
    • Region, I chose the region where I store all my data.
    • Below is what it looks like
  • I then clicked Next for Hosting
    • Here I used an existing storage account
      • Make a NOTE of this storage account, this will be used later in your PowerShell Script. (1)
      • In my example the storage account was called “mvpnloganalytics”
    • I also left the Operating System to Windows
    • And for the plan type I left it on the Consumption plan, because the script does not take long to run and this option is well priced
  • I clicked on Monitoring, here I created a new Application Insights
  • I clicked Tags; here there is an option to use tags for your Function App.
    • This does not have to have anything put into the Name or Value areas and can be left blank.
    • I left them as blank as shown below.
  • Next I clicked, Review + create, here it validates that everything is good to go.
  • I then clicked Create
  • This then went and created my Function App
  • Once it was ready, I clicked on Go to Resource
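
If you would rather script this step than click through the portal, below is a minimal sketch of the same thing using Az PowerShell. This assumes the Az.Functions module is installed and that you are already signed in with Connect-AzAccount; the resource group, app name and region are just my example values, so change them for your environment.

# Sketch: create a PowerShell (Windows, Consumption) Function App against an existing storage account
New-AzFunctionApp -Name "fa-pbi-rest-api" `
    -ResourceGroupName "RG-PowerBI" `
    -StorageAccountName "mvpnloganalytics" `
    -Runtime PowerShell `
    -OSType Windows `
    -Location "North Europe" `
    -FunctionsVersion 3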

Getting the Storage Name and Key

In order to copy the files later, I needed to make a note of the Storage Account Name and its associated Key (there is also a small Az PowerShell sketch after these steps).

  • I went back into the Azure Portal, clicked on All Resources and found my storage account called “mvpnloganalytics”
    • NOTE: This was the storage account I selected in the earlier steps
  • I then clicked on mvpnloganalytics, this then opened the storage account.
  • Next, I clicked on Access keys
    • I made a note of the Key (2)
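
If you prefer PowerShell over the portal for this, the small sketch below fetches the same key. It assumes the Az.Storage module and that "RG-PowerBI" is a made-up example resource group name, so swap in your own.

# Sketch: get the access keys for the storage account and note down the first one (2)
$keys = Get-AzStorageAccountKey -ResourceGroupName "RG-PowerBI" -Name "mvpnloganalytics"
$keys[0].Value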

Finding the Location of your Blob Share linked to your Function App

This took me longer to figure out than I should admit! When you create a Function App, it creates a file share in the storage account you specified during creation (there is a PowerShell sketch after this list if you prefer checking it that way).

  • I went back into the Azure Portal, clicked on All Resources and found my storage account called “mvpnloganalytics”
  • I then clicked on mvpnloganalytics, this then opened the storage account.
  • Next, I clicked on Storage Explorer (Preview)
  • Now on the right-hand side I clicked on the arrow next to FILE SHARES to expand it
    • I then looked for my Function name, and with this working example it started with “fa-pbi-rest-api”
    • NOTE: There is a number appended to the Function App Name
    • Now make a note of the File Share (3)
  • While I was here, I also went into my BLOB CONTAINERS and made sure I could see my container called “pbitest”
    • If I did not already have a container, I would have created one here and given it a name.
      • Right click and select Create blob container
      • I then gave it a name and left the defaults
        • Then clicked OK
    • Make a note of the Destination Container Name (4)
  • You can then continue onto the next step below
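
As mentioned above, here is a minimal PowerShell sketch for the same checks. It assumes the Az.Storage module and uses the storage account name and key from notes (1) and (2); the placeholder key obviously needs to be replaced with your own.

# Sketch: build a storage context from the account name (1) and key (2)
$ctx = New-AzStorageContext -StorageAccountName "mvpnloganalytics" -StorageAccountKey "<key from note (2)>"

# List the file shares - the one starting with your Function App name is note (3)
Get-AzStorageShare -Context $ctx | Select-Object Name

# Create the destination blob container (4) if it does not already exist
New-AzStorageContainer -Name "pbitest" -Context $ctx -ErrorAction SilentlyContinue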

Downloading the Power BI Modules

The next step is to upload the Power BI Modules.

In order to do this, I first had to download the modules (instead of installing them via PowerShell)

To download them I did the following (with a quick verification sketch after the list).

  • I opened the PowerShell ISE as an Administrator
  • I created the output folder “D:\PowerShell Downloaded Modules”
  • Next I then ran the following PowerShell command
    • Save-Module -Name MicrosoftPowerBIMgmt -Path "D:\PowerShell Downloaded Modules"
    • I could then see it downloading the files
  • Once finished I went into the folder and I could see all the associated PowerShell Modules
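
As a quick sanity check (mentioned in the list above), the one-liner below simply lists the module folders that Save-Module created; you should see MicrosoftPowerBIMgmt plus its sub-modules.

# Sketch: confirm the downloaded Power BI module folders are in place
Get-ChildItem "D:\PowerShell Downloaded Modules" -Directory | Select-Object Name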

Uploading the Power BI Modules to the Function App

In order to have the PowerShell modules load at runtime in the Function App, I had to do the following (an alternative using managed dependencies is sketched after this list):

  • I went into my Function App, on the top right clicked on Platform Features, and then below selected Advanced tools (Kudu)
  • This opens another window in my browser
  • I then clicked on Debug Console and then selected CMD
  • I then clicked on Site and wwwroot to navigate to the wwwroot folder
    • I could see I was there successfully by looking at my location in the CMD prompt
  • I then clicked on the plus sign next to wwwroot and selected New Folder
  • I then typed in “modules” and pressed Enter
  • I then clicked on “modules” to go into the modules folder
    • It currently was blank
  • Now on my Windows PC I went to the folder where I had downloaded all the Power BI Management PowerShell Modules.
    • I then dragged and dropped all the folders into the Kudu command prompt
    • NOTE: I had 7 Folders
  • I could then see it uploading on the right-hand side
  • Once it was completed, I could then see all the folders and files.
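
As an aside (referenced in the intro above), newer PowerShell Function Apps can also pull modules in through managed dependencies instead of a manual upload. A minimal sketch of the requirements.psd1 entry is below; note this assumes managed dependencies are enabled for your Function App, and it is not the approach I used in this post.

# requirements.psd1 - managed dependency sketch (alternative to uploading the modules yourself)
@{
    'MicrosoftPowerBIMgmt' = '1.*'
}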

Creating a Folder to host the Audit Files

Next I had to create a folder to host the Audit files that will be saved by the PowerShell script (there is also a small sketch after this list for creating it from the script itself)

  • I still had my Kudu console open.
  • I then clicked on the plus sign next to wwwroot and selected New Folder
  • Under the wwwroot directory I created a new folder called AuditFiles
  • I then made a note of the AuditFiles location because I am going to need to change this in my PowerShell script
    • In my example I went into the AuditFiles folder to see the location in the Kudu console
  • As shown above the location is “D:\home\site\wwwroot\AuditFiles”
    • Make a note of the above location (5)
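
As mentioned above, if you would rather not depend on this folder existing, a small guard like the sketch below could be added near the top of the PowerShell script later on. It uses the same path noted in (5).

# Sketch: create the AuditFiles folder on first run if it is not there yet
$FolderAndCsvFilesLocation = "D:\home\site\wwwroot\AuditFiles"
if (-not (Test-Path $FolderAndCsvFilesLocation)) {
    New-Item -ItemType Directory -Path $FolderAndCsvFilesLocation | Out-Null
}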

Creating the Function

The next step was to create the function that would run my PowerShell script on a schedule

  • I made sure I was in my function app
  • I then clicked on the plus sign next to functions
  • I then selected In-Portal
  • I then clicked Continue
  • I wanted to have my Audit logs run on a schedule, so for my working example I selected Timer, then clicked Create
  • I was then prompted with the Function Name and Timer Trigger
    • I gave it the name of “PBI-GetActivityEvents”
    • And then I configured the schedule to run every day at 12:30:01 AM UTC

      1 30 0 * * *

    • NOTE: You can use this as a reference to set it to the time you want it to run. The schedule uses a CRON (NCRONTAB) expression, and there are a few more examples after this list
  • I then clicked Create
  • Once completed I could then see my new function
  • I then clicked on my function, and on the right-hand side I could see run.ps1, which is the PowerShell script.
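
For reference, here are a few more NCRONTAB expressions (the six-field {second} {minute} {hour} {day} {month} {day-of-week} format that the timer trigger uses), which you could swap in for the schedule above:

1 30 0 * * *      -> every day at 12:30:01 AM UTC (the schedule used above)
0 0 */6 * * *     -> every 6 hours, on the hour
0 0 6 * * 1-5     -> 6:00:00 AM UTC, Monday to Friday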

Creating and testing the PowerShell script

Next I created the PowerShell script and tested it in the steps below.

Whilst I acknowledge that it is NOT secure to put usernames and passwords into scripts, I did this initially to make sure everything works as expected. Later I can change this to be more secure.
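
As one example of how this could be tightened up later, the username and password could be moved into the Function App's Application settings, which surface as environment variables at runtime. The sketch below shows the idea; the setting names PBI_USERNAME and PBI_PASSWORD are just my own made-up examples.

# Sketch: read the credentials from Application settings instead of hard-coding them
$username = $env:PBI_USERNAME
$password = $env:PBI_PASSWORD | ConvertTo-SecureString -AsPlainText -Force
$myCred = New-Object System.Management.Automation.PSCredential($username, $password)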

One other thing to note is that the account that I used did NOT have MFA enabled, otherwise it would not be able to log in.

I put in the following PowerShell script below which allowed me to extract the Power BI Activity Events

# Input bindings are passed in via param block.

param($Timer)

# Enable the AzureRM aliases so the older AzureRM cmdlet names used further down still resolve to the Az module
Enable-AzureRmAlias

#1 User Account Details
$username = "user@domain.com"
$password = "MySecurePassword" | ConvertTo-SecureString -asPlainText -Force 
$credential = New-Object System.Management.Automation.PSCredential($username, $password)
# Reference for above: https://datagulp.com/powerbi/power-bi-rest-api-how-to-get-authentication-token-with-powershell-cmdlets/

#2. Authenticate to Power BI

# $password is already a SecureString (from #1 above), so it can be used directly in the credential below
$myCred = New-Object System.Management.Automation.PSCredential($username,$password)
 
#3. Login-PowerBI 

Connect-PowerBIServiceAccount -Credential $myCred

#4. Define export path and current date to retrieve

#Get Current Date Time
$CurrentDateTime = (Get-Date)

#Specify Folder Location for CSV Files to View & Export
$FolderAndCsvFilesLocation = "D:\home\site\wwwroot\AuditFiles"

#dir "X:\mgasia\BI Reporting - Documents\Audit Logs\*.csv" | 
$GetLastModifiedFileDateTime = Get-ChildItem "$FolderAndCsvFilesLocation\*.csv" | `

# Get the last 1 Days Files
Where{$_.LastWriteTime -gt (Get-Date).AddDays(-1)} | `

# Select the last File
 Select -First 1

#Convert the LastWriteTime to DateTime
$ConvertToDateTimeLastModified = [datetime]$GetLastModifiedFileDateTime.LastWriteTime
 
# Workout the Difference between the Dates
$DateDifference = New-timespan -Start $ConvertToDateTimeLastModified -End $CurrentDateTime

#Create a Variable with the Number of Days
$DaysDifference = $DateDifference.Days

#If Days Difference = 0 Make it 1
if ($DaysDifference -eq 0) {$DaysDifference = 1}

# List of Dates to Iterate Through
$DaysDifference..1 |
    foreach {
        $Date = (((Get-Date).Date).AddDays(-$_))
        $StartDate = (Get-Date -Date ($Date) -Format yyyy-MM-ddTHH:mm:ss)
        $EndDate = (Get-Date -Date ((($Date).AddDays(1)).AddMilliseconds(-1)) -Format yyyy-MM-ddTHH:mm:ss)

#FileName
$FileName = (Get-Date -Date ($Date) -Format yyyyMMdd)
 
# Export location of CSV FIles
$ActivityLogsPath = "$FolderAndCsvFilesLocation\$FileName.csv"

#4. Export out current date activity log events to CSV file

$ActivityLogs = Get-PowerBIActivityEvent -StartDateTime $StartDate -EndDateTime $EndDate | ConvertFrom-Json

$ActivityLogSchema = $ActivityLogs | `
    Select-Object `
        Id,CreationTime,CreationTimeUTC,RecordType,Operation,OrganizationId,UserType,UserKey,Workload,UserId,ClientIP,UserAgent,Activity,ItemName,WorkSpaceName,DashboardName,DatasetName,ReportName,WorkspaceId,ObjectId,DashboardId,DatasetId,ReportId,OrgAppPermission,CapacityId,CapacityName,AppName,IsSuccess,ReportType,RequestId,ActivityId,AppReportId,DistributionMethod,ConsumptionMethod, `
        @{Name="RetrieveDate";Expression={$RetrieveDate}}

$ActivityLogSchema | Export-Csv $ActivityLogsPath 

#Move the File to Azure Blob Storage

$StorageAccountName = "mvpnloganalytics" 
$StorageAccountKey = "sdlkjhsdjkhdsklnsdkjhsdnhjsdnmsd=l3n32kj2323nm"

$ctx = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey

# The Source Share Name found via Storage Explorer (Preview)
$SourceShareName = "fa-pbi-rest-apib28f"

#This is the location of the Source Files in the Blob Container
#You can remove the D:\Home from the location above
$SourceFilePath = "site\wwwroot\AuditFiles\$FileName.csv"

#The Destination Container Name where the Logs will be moved to
$DestinationContainerName = "pbitest"

Start-AzureStorageBlobCopy -SrcShareName $SourceShareName -SrcFilePath $SourceFilePath `
-DestContainer $DestinationContainerName -DestBlob "$FileName.csv" -Context $ctx -Force

#End of ForEach Loop
}

I am not going to explain all the details of the PowerShell script (If you really want me to, let me know in the comments section). I am just going to detail below what you need to change.

  • LINE 9
    • I changed this to my user account which has been assigned the Global Administrator Role or the Power BI Administrator Role in Office 365
    • NOTE: This user account DOES NOT require a Power BI license.
  • LINE 10
    • I put in the user accounts password (Scary I know!)
  • LINE 29
    • This is the location of where the Audit files will be exported to
    • You can use the value from note (5)
    • In my example it is D:\home\site\wwwroot\AuditFiles
  • LINE 78
    • This is where I put in my storage account name.
    • I used the value from note (1)
    • In my example it is mvpnloganalytics
  • LINE 79
    • This is where I put my Storage Account Key
    • I used the value from note (2)
    • In my example it is sdlkjhsdjkhdsklnsdkjhsdnhjsdnmsd=l3n32kj2323nm
  • LINE 84
    • This is the Share Name in the Azure Blob
    • I used the value from note (3)
    • In my example it is “fa-pbi-rest-apib28f”
  • LINE 88
    • This is the location of where the Audit files are in the Share
    • This value takes the path from LINE 29 above and removes the “D:\home” part
    • In my example it is “site\wwwroot\AuditFiles\$FileName.csv”
  • LINE 91
    • This is the destination blob container where I wanted to move my files to.
    • I used the value from note (4)
    • In my example it is “pbitest”
  • I then clicked on Save at the top
  • I was now ready to run the script, so I clicked on Run
  • I could then see the output in the Logs, where it showed that it executed successfully
  • I then went back to my Kudu console and into the AuditFiles Folder and I could see the file there
  • The final check was to make sure that I could see that the files had moved to my blob container (there is also a quick PowerShell sketch for this after these steps).
    • I went back into the Azure Portal, clicked on All Resources and found my storage account called “mvpnloganalytics”
    • I then clicked on mvpnloganalytics, this then opened the storage account.
    • Next, I clicked on Storage Explorer (Preview)
    • I then clicked on pbitest, and on the right-hand side I could see my files
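
If you prefer checking from PowerShell rather than the portal (as mentioned above), the quick sketch below reuses the storage account name and key from notes (1) and (2) to list what landed in the container.

# Sketch: list the copied audit files in the destination blob container
$ctx = New-AzStorageContext -StorageAccountName "mvpnloganalytics" -StorageAccountKey "<key from note (2)>"
Get-AzStorageBlob -Container "pbitest" -Context $ctx | Select-Object Name, LastModified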

My next blog post will detail how I connected to the Audit Log files in Azure Blob Storage using Power BI Desktop.

Summary

Shew, that was a long blog post! Hopefully it was mostly reading, with just a bit of copying and adapting to get it working in your environment.

I hope that this will help you be able to store your audit log files in a secure location, where they can be analysed over a longer period.

If there are any questions, or you have any suggestions please leave them in the comments below.

Thanks for reading.