How you can store All your Power BI Audit Logs easily and indefinitely in Azure
With the new Power BI Get-PowerBIActivityEvent cmdlet, I wanted to find a way to automate the entire process so that it all runs in the cloud.
One of the current challenges with the Audit logs is that they are only kept for 90 days, so if you want to do analysis over a longer period the log files have to be stored somewhere. Why not use Azure Blob Storage?
Whilst these steps might appear rather technical, if you follow them and have access to an Azure Subscription you can do this too.
This is a rather long blog post, but that is because I explain each step in detail. Actually doing it typically takes about 30 minutes.
Assumptions
You already have an Azure Blob Storage account created.
The account that you use to connect to Power BI and get the Audit logs must not have MFA (Multi-Factor Authentication) enabled. The good news is that the account used to download the audit logs does not need a Power BI Pro license, but it MUST be a Power BI Global Admin in order to have the permissions to download the data.
Creating the Function App
Below are the steps that I completed to create the Function App:
- In the Azure Portal, I went to All Resources and clicked on Add
- I then searched for the Function App
- Next I clicked Create
- I then configured it with the following details below.
- My subscription that I wanted to use
- Associated with this I used an existing Resource Group (this allows me to keep everything together)
- Function App Name, I put in a name that I could easily recognize
- Runtime stack, here I chose PowerShell Core because I am going to be running PowerShell scripts
- Region, I chose the region where I store all my data.
- I then clicked Next for Hosting
- Here I used an existing storage account
- Make a NOTE of this storage account, this will be used later in your PowerShell script. (1)
- In my example the storage account was called “mvpnloganalytics“
- I also left the Operating System set to Windows
- And for the plan type I left it on the Consumption plan, because the script does not take long to run and it is well priced
- I clicked on Monitoring, and here I created a new Application Insights resource
- I clicked on Tags; this is optional if you want to use Tags for your Function App.
- Nothing has to be put into the Name or Value areas and they can be left blank.
- I left them blank.
- Next I clicked Review + create, where it validates that everything is good to go.
- I then clicked Create
- This then went and created my Function App
- Once it was ready, I clicked on Go to Resource
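If you prefer to script the resource creation rather than click through the portal, below is a rough equivalent. This is a sketch only, assuming the Az PowerShell modules are installed; the resource group, names and region are placeholders and parameters can vary between Az.Functions versions.

# Sketch: create a Windows, consumption-plan Function App with the PowerShell runtime
# (resource group, storage account, app name and region below are placeholders)
New-AzFunctionApp -Name "fa-pbi-rest-api" `
    -ResourceGroupName "my-resource-group" `
    -StorageAccountName "mvpnloganalytics" `
    -Runtime PowerShell `
    -OSType Windows `
    -Location "australiasoutheast"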
Getting the Storage Name and Key
In order to copy the files later, I need to make a note of the Storage Name and associated Key.
- I went back into the Azure Portal, clicked on All Resources and found my storage account called “mvpnloganalytics”
- NOTE: This was the storage account I selected in the earlier steps
- I then clicked on mvpnloganalytics, which opened the storage account.
- Next, I clicked on Access keys
- I made a note of the Key (2)
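If you prefer PowerShell over the portal, the same keys can be retrieved with the Az module. A sketch, where the resource group name is a placeholder:

# Lists the access keys for the storage account (resource group name is a placeholder)
Get-AzStorageAccountKey -ResourceGroupName "my-resource-group" -Name "mvpnloganalytics" |
    Select-Object KeyName, Value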
Finding the Location of your Blob Share linked to your Function App
This took me longer to figure out than I should admit! When you create a Function App, it creates a file share in the storage account that was specified when the Function App was created.
- I went back into the Azure Portal, clicked on All Resources and found my storage account called “mvpnloganalytics”
- I then clicked on mvpnloganalytics, which opened the storage account.
- Next, I clicked on Storage Explorer (Preview)
- On the right-hand side I clicked next to FILE SHARES
- I then looked for my Function App name, which in this working example started with “fa-pbi-rest-api”
- NOTE: There is a number appended to the Function App Name
- Make a note of the File Share (3)
- While I was here, I also went into my BLOB CONTAINERS and made sure I could see my container called “pbitest”
- If I did not have a container I would have created one here and given it a name.
- Right click and select Create blob container
- I then gave it a name and left the defaults
- Then clicked OK
- Make a note of the Destination Container Name (4)
- You can then continue onto the next step below
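If you would rather not click through Storage Explorer, the file shares and blob containers can also be listed with PowerShell. A sketch, assuming the Az.Storage module and substituting your own account name (1) and key (2):

# Build a storage context from the account name (1) and key (2), then list shares and containers
$ctx = New-AzStorageContext -StorageAccountName "mvpnloganalytics" -StorageAccountKey "<storage key>"
Get-AzStorageShare -Context $ctx | Select-Object Name        # the Function App file share (3)
Get-AzStorageContainer -Context $ctx | Select-Object Name    # blob containers, e.g. "pbitest" (4)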
Downloading the Power BI Modules
The next step is to upload the Power BI Modules.
In order to do this, I first had to download the modules (instead of installing them via PowerShell)
To download them I did the following below.
- I opened the PowerShell ISE as an Administrator
- I created the output folder “D:\PowerShell Downloaded Modules”
- Next I ran the following PowerShell command

Save-Module -Name MicrosoftPowerBIMgmt -Path "D:\PowerShell Downloaded Modules"

- I could then see it downloading the files
- Once finished, I went into the folder and could see all the associated PowerShell Modules
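As a quick check (a sketch, assuming the same download path), you can list the saved module folders and the version sub-folder that Save-Module created for each:

# Lists each saved module folder and the version sub-folder inside it
Get-ChildItem "D:\PowerShell Downloaded Modules" -Directory | ForEach-Object {
    $version = (Get-ChildItem $_.FullName -Directory | Select-Object -First 1).Name
    "{0} {1}" -f $_.Name, $version
}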
Uploading the Power BI Modules to the Function App
In order to have the PowerShell modules load at runtime in the Function App, I had to do the following.
- I went into my Function App, on the top right clicked on Platform Features, and then below selected Advanced tools (Kudu)
- This opens another window in my browser
- I then clicked on Debug Console and selected CMD
- I then clicked on Site and wwwroot to navigate to the wwwroot folder
- I could see I was there successfully by looking at my location in the CMD prompt
- I then clicked on the plus sign next to wwwroot and selected New Folder
- I typed in “modules” and pressed Enter
- I then clicked on “modules” to go into the modules folder, which was currently blank
- Now on my Windows PC I went to the folder where I had downloaded all the Power BI Management PowerShell modules
- I then dragged and dropped all the folders into the Kudu command prompt
- NOTE: I had 7 folders
- I could then see it uploading on the right-hand side
- Once it was completed, I could see all the folders and files.
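As an aside, on newer PowerShell Function runtimes there is an alternative to uploading the modules by hand: the Functions host can download them for you via managed dependencies in requirements.psd1. A minimal sketch of that entry (the version range is only an example):

# requirements.psd1 in the Function App root - the host downloads the module on first run
@{
    'MicrosoftPowerBIMgmt' = '1.*'
}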
Creating a Folder to host the Audit Files
Next I had to create a folder to host the Audit files that will be saved by the PowerShell script.
- I still had my Kudu console open.
- I then clicked on the plus sign next to wwwroot and selected New Folder
- Under the wwwroot directory I created a new folder called AuditFiles
- I then made a note of the AuditFiles location because I am going to need it in my PowerShell script
- In my example I went into the AuditFiles folder to see the location in the Kudu console
- The location is “D:\home\site\wwwroot\AuditFiles”
- Make a note of the above location (5)
Creating the Function
The next step was to create and test my PowerShell script.
- I made sure I was in my Function App
- I then clicked on the plus sign next to Functions
- I then selected In-Portal
- I then clicked Continue
- I wanted to have my Audit logs run on a schedule, so for my working example I selected Timer, then clicked Create
- I was then prompted for the Function Name and Timer trigger
- I gave it the name of “PBI-GetActivityEvents”
- I then configured the schedule to run every day at 12:30:01 AM UTC

1 30 0 * * *

- NOTE: You can use this as a reference to set it to the time you want it to run. It is known as a CRON-style schedule; see the examples after this list.
- I then clicked Create
- Once completed I could see my function
- I then clicked on my function, and on the right-hand side I could see run.ps1, which is the PowerShell script.
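For reference, the Timer trigger uses a six-field CRON-style (NCRONTAB) expression of the form {second} {minute} {hour} {day} {month} {day-of-week}. A few examples, the first being the schedule used above:

# 1 30 0 * * *     -> every day at 00:30:01 UTC (the schedule used in this post)
# 0 0 3 * * *      -> every day at 03:00:00 UTC
# 0 0 3 * * Mon    -> every Monday at 03:00:00 UTC
# 0 */30 * * * *   -> every 30 minutes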
Creating and testing the PowerShell script
Next I created the PowerShell script and tested it in the steps below.
Whilst I acknowledge that it is NOT secure to put usernames and passwords into scripts, I did this initially to make sure everything works as expected. Later I can change this to be more secure.
One other thing to note is that the account that I used did NOT have MFA enabled, otherwise it would not be able to log in.
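As a side note on making this more secure later, one option (a minimal sketch, assuming you add application settings with the hypothetical names PBI_USER and PBI_PASSWORD under the Function App's Configuration) is to read the credentials from the Function App settings, which are exposed to the script as environment variables, instead of hard-coding them:

# Hypothetical app settings PBI_USER and PBI_PASSWORD defined under
# the Function App's Configuration > Application settings
$username = $env:PBI_USER
$password = $env:PBI_PASSWORD | ConvertTo-SecureString -AsPlainText -Force
$myCred   = New-Object System.Management.Automation.PSCredential($username, $password)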
I put in the following PowerShell script below which allowed me to extract the Power BI Activity Events
# Input bindings are passed in via param block.
param($Timer)

# Enable the AzureRM Aliasing for older Functions
Enable-AzureRmAlias

#1. User Account Details
$username = "user@domain.com"
$password = "MySecurePassword" | ConvertTo-SecureString -AsPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential($username, $password)

# Reference for above: https://datagulp.com/powerbi/power-bi-rest-api-how-to-get-authentication-token-with-powershell-cmdlets/

#2. Authenticate to Power BI
$SecPasswd = ConvertTo-SecureString $password -AsPlainText -Force
$myCred = New-Object System.Management.Automation.PSCredential($username, $password)

#3. Login to Power BI
Connect-PowerBIServiceAccount -Credential $myCred

#4. Define export path and current date to retrieve

# Get Current Date Time
$CurrentDateTime = (Get-Date)

# Specify Folder Location for CSV Files to View & Export
$FolderAndCsvFilesLocation = "D:\home\site\wwwroot\AuditFiles"

#dir "X:\mgasia\BI Reporting - Documents\Audit Logs\*.csv" |
$GetLastModifiedFileDateTime = Get-ChildItem "$FolderAndCsvFilesLocation\*.csv" | `
# Get the last 1 Days Files
Where{$_.LastWriteTime -gt (Get-Date).AddDays(-1)} | `
# Select the last File
Select -First 1

# Convert the LastWriteTime to DateTime
$ConvertToDateTimeLastModified = [datetime]$GetLastModifiedFileDateTime.LastWriteTime

# Work out the Difference between the Dates
$DateDifference = New-TimeSpan -Start $ConvertToDateTimeLastModified -End $CurrentDateTime

# Create a Variable with the Number of Days
$DaysDifference = $DateDifference.Days

# If the Days Difference is 0, make it 1
if ($DaysDifference -eq 0) {$DaysDifference = 1}

# List of Dates to Iterate Through
$DaysDifference..1 | foreach {

    $Date = (((Get-Date).Date).AddDays(-$_))
    $StartDate = (Get-Date -Date ($Date) -Format yyyy-MM-ddTHH:mm:ss)
    $EndDate = (Get-Date -Date ((($Date).AddDays(1)).AddMilliseconds(-1)) -Format yyyy-MM-ddTHH:mm:ss)

    # FileName
    $FileName = (Get-Date -Date ($Date) -Format yyyyMMdd)

    # Export location of CSV Files
    $ActivityLogsPath = "$FolderAndCsvFilesLocation\$FileName.csv"

    #5. Export out the current date's activity log events to a CSV file
    $ActivityLogs = Get-PowerBIActivityEvent -StartDateTime $StartDate -EndDateTime $EndDate | ConvertFrom-Json
    $ActivityLogSchema = $ActivityLogs | `
        Select-Object `
        Id,CreationTime,CreationTimeUTC,RecordType,Operation,OrganizationId,UserType,UserKey,Workload,UserId,ClientIP,UserAgent,Activity,ItemName,WorkSpaceName,DashboardName,DatasetName,ReportName,WorkspaceId,ObjectId,DashboardId,DatasetId,ReportId,OrgAppPermission,CapacityId,CapacityName,AppName,IsSuccess,ReportType,RequestId,ActivityId,AppReportId,DistributionMethod,ConsumptionMethod, `
        @{Name="RetrieveDate";Expression={$RetrieveDate}}
    $ActivityLogSchema | Export-Csv $ActivityLogsPath

    # Move the File to Azure Blob Storage
    $StorageAccountName = "mvpnloganalytics"
    $StorageAccountKey = "sdlkjhsdjkhdsklnsdkjhsdnhjsdnmsd=l3n32kj2323nm"
    $ctx = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey

    # The Source Share Name found via Storage Explorer (Preview)
    $SourceShareName = "fa-pbi-rest-apib28f"

    # This is the location of the Source Files in the File Share
    # You can remove the D:\home from the location above
    $SourceFilePath = "site\wwwroot\AuditFiles\$FileName.csv"

    # The Destination Container Name where the Logs will be moved to
    $DestinationContainerName = "pbitest"

    Start-AzureStorageBlobCopy -SrcShareName $SourceShareName -SrcFilePath $SourceFilePath `
        -DestContainer $DestinationContainerName -DestBlob "$FileName.csv" -Context $ctx -Force

# End of ForEach Loop
}
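Before wiring everything into the Function App, a quick local smoke test of the export part can be useful. This is a sketch only, assuming the MicrosoftPowerBIMgmt module is installed locally and you sign in interactively:

# Pull yesterday's activity events interactively and show a few columns
Connect-PowerBIServiceAccount
$day   = (Get-Date).Date.AddDays(-1)
$start = Get-Date -Date $day -Format yyyy-MM-ddTHH:mm:ss
$end   = Get-Date -Date $day.AddDays(1).AddMilliseconds(-1) -Format yyyy-MM-ddTHH:mm:ss
Get-PowerBIActivityEvent -StartDateTime $start -EndDateTime $end |
    ConvertFrom-Json |
    Select-Object -First 5 Id, CreationTime, Activity, UserId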
I am not going to explain all the details of the PowerShell script (If you really want me to, let me know in the comments section). I am just going to detail below what you need to change.
- LINE 9 ($username)
- I changed this to my user account which has been assigned the Global Administrator role or the Power BI Administrator role in Office 365
- NOTE: This user account DOES NOT require a Power BI license.
- LINE 10 ($password)
- I put in the user account’s password (Scary I know!)
- LINE 29 ($FolderAndCsvFilesLocation)
- This is the location where the Audit files will be exported to
- You can use the value from note (5)
- In my example it is D:\home\site\wwwroot\AuditFiles
- LINE 78 ($StorageAccountName)
- This is where I put in my storage account name.
- I used the value from note (1)
- In my example it is mvpnloganalytics
- LINE 79 ($StorageAccountKey)
- This is where I put my storage account key
- I used the value from note (2)
- In my example it is sdlkjhsdjkhdsklnsdkjhsdnhjsdnmsd=l3n32kj2323nm
- LINE 84 ($SourceShareName)
- This is the Share Name in the Azure storage account
- I used the value from note (3)
- In my example it is “fa-pbi-rest-apib28f”
- LINE 88 ($SourceFilePath)
- This is the location of the Audit files in the Share
- This value takes the value from LINE 29 ($FolderAndCsvFilesLocation) above and removes the “D:\home” part
- In my example it is “site\wwwroot\AuditFiles\$FileName.csv”
- LINE 91 ($DestinationContainerName)
- This is the destination blob container where I wanted to move my files to.
- I used the value from note (4)
- In my example it is “pbitest”
- I then clicked on Save at the top
- I was now ready to run the script.
- I clicked on Run to run the script
- I could then see the output in the Logs, where it showed that it executed successfully
- I then went back to my Kudu console and into the AuditFiles folder and could see the file there
- The final check was to make sure that the files had been moved to my blob container.
- I went back into the Azure Portal, clicked on All Resources and found my storage account called “mvpnloganalytics”
- I then clicked on mvpnloganalytics, which opened the storage account.
- Next, I clicked on Storage Explorer (Preview)
- I then clicked on pbitest, and on the right-hand side I could see my files
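If you prefer to verify from PowerShell instead of the portal, a quick sketch (using the same storage account name and key as in the script) is below:

# Lists the blobs in the destination container to confirm the copy
$ctx = New-AzStorageContext -StorageAccountName "mvpnloganalytics" -StorageAccountKey "<storage key>"
Get-AzStorageBlob -Container "pbitest" -Context $ctx | Select-Object Name, LastModified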
My next blog post will detail how I connected to the Audit Log files in Azure Blob Storage using Power BI Desktop.
Summary
Shew, that was a long blog post! Hopefully it was mostly reading, with just a bit of copying to get it working in your environment.
I hope that this will help you be able to store your audit log files in a secure location, where they can be analysed over a longer period.
If there are any questions, or you have any suggestions please leave them in the comments below.
Thanks for reading.
This was working so well…until I got the error about needing MFA. Guess I need to find an account that doesn’t need MFA and is also a tenant administrator…
Thanks for this anyway – great stuff!
Thanks for the heads up about MFA and it is something that I forgot to put into the blog post. I will update it now.
And thanks for the kind words!
Nice article.
I am not sure why you did not use Microsoft Flow.
Any leads on enabling detailed logging, i.e. every action, including the report parameters or values passed to the reports, dashboards, etc.?
Thanks in advance for the info
Hi there
I found that I could not call the API in the way that I wanted via Power Automate. Do you know of a way?
As far as I know there are no plans to have detailed logging.
This looks really good, as I am not able to use Logic Apps, and I would assume Power Automate would fail as well with the same error. I can use PowerShell locally, so this looks really helpful, thanks.
Thanks for the comment. And yes you can certainly do exactly the same thing using a local version of PowerShell.
I successfully used Power Automate to invoke the API. I started by creating a custom connector following this blog by Konstantinos Ioannou (https://medium.com/@Konstantinos_Ioannou/refresh-powerbi-dataset-with-microsoft-flow-73836c727c33). Details of the flow I created can be found here: https://1drv.ms/w/s!AjP2o3di4mw-lg8vRyjNICzaSFtf?e=ErXjp3
I used Logic Apps to invoke the API but my logic app has to use an additional for each loop to cycle through each activity when appending to the activityEventEntities array. I tried replicating your configuration and I get an error when appending to the variable that the variable expects an object but got an array.
Hi there
I am not PowerShell expert but this has been working for me for a few months.
Can you confirm that you are using PowerShell within Logic Apps?
Hello! Great article, the Power Automate solution is very useful, but I get an error after all the records are retrieved from the API and the continuation token becomes null, in “Update Continuation Token” step. The error is:
Unable to process template language expressions in action ‘Set_variable’ inputs at line ‘1’ and column ‘8392’: ‘The template language function ‘decodeUriComponent’ expects its parameter to be a string. The provided value is of type ‘Null’. Please see https://aka.ms/logicexpressions for usage details.’.
Am I doing something wrong?
Thank you!
Hi there
Thanks for the comment.
I personally have not encountered the error where it needs the continuation step.
Here are more details in this blog post on how to use the Continuation token: https://powerbi.microsoft.com/en-us/blog/the-power-bi-activity-log-makes-it-easy-to-download-activity-data-for-custom-usage-reporting/
Hello! Additionally, my flow is very, very slow… it runs for one or two days to retrieve one day of data. I’ve followed your steps exactly, do you know what could be the cause? The do-until causes the slow performance, even though when I check each iteration it looks like it takes under 1 minute (approx. 40 iterations).
Thanks!
Hi Bertrand,
Do you have a copy of the modified swagger file that’s linked to in your flow document? The file is no longer present on GitHub.
[…] on from my previous blog post How you can store All your Power BI Audit Logs easily and indefinitely in Azure, where every day it extracts the Audit logs into Azure Blob storage. One of the key things when […]
Getting an error whenever I try to schedule a data pull for the last 30 days. It runs for 1-2 days, then the following error pops up:
[Error] Timeout value of 00:05:00 exceeded by function ‘Functions.PBI-GetActivityEvents’ (Id: ‘d99d3e4c-107d-46fc-a184-a5d4f62d999d’). Initiating cancellation.
2020-09-24T14:32:37.005 [Error] Executed ‘Functions.PBI-GetActivityEvents’ (Failed, Id=d99d3e4c-107d-46fc-a184-a5d4f62d999d, Duration=300166ms)Timeout value of 00:05:00 was exceeded by function: Functions.PBI-GetActivityEvents
Hi there
It looks like you will need to use the continuation token.
Here is the link with the details from the API:
Get the next set of audit activity events by sending the continuation token to the API
Thanks Gilbert for the prompt response. Will try that out. One more query: the log file that is generated only gives information about which report was accessed by someone. How do I find which page in the report was opened? If you refer to the standard usage report in a workspace, it shows page-wise views.
Thanks for your detailed explanation. Would you be able to post a solution that works with MFA?
Hi there
Currently it does not work with MFA.
What I did was to create a simple Azure AD account and assign it the Power BI Global Admin role, which I then used in my implementation.
[…] on from my blog post on How you can store All your Power BI Audit Logs easily and indefinitely in Azure, I wanted to share with you how I connected to the Blob Storage Files and created my Power BI […]
Hi, please help me, I am getting the below error while connecting to the Power BI service account.
2020-04-10T09:24:20.809 [Error] ERROR: Connect-PowerBIServiceAccount : One or more errors occurred. (AADSTS700016: Application with identifier ” was not found in the directory ‘analysis.windows.net’. This can happen if the application has not been installed by the administrator of the tenant or consented to by any user in the tenant. You may have sent your authentication request to the wrong tenant.
Hi there
As per the error, it appears that the application is not running under an Admin account?
Hi Gilbert, thank you so much for writing such a detailed and awesome article. I followed your instructions and got the following error:
“2020-06-19T04:14:03Z [Error] ERROR: Connect-PowerBIServiceAccount : Failed to get ADAL token: Unhandled Exception: System.AggregateException: One or more errors occurred. —> Microsoft.IdentityModel.Clients.ActiveDirectory.AdalClaimChallengeException: AADSTS50079: Due to a configuration change made by your administrator, or because you moved to a new location, you must enroll in multi-factor authentication to access…”
1) I’ve triple-checked that the account used doesn’t have MFA and that it has Global Admin rights
2) With the same account, I could execute successfully on local Windows PowerShell
Can you or anyone shed some light pls, thank you.
Hi Luke, thanks for the kind words.
Could you please make sure that you have got the latest version of the Power BI PowerShell modules uploaded?
Hi Gilbert, yes I believe so – ran the module installation per your instruction yesterday so should be the latest.
I suspect the issue is due to Azure Active Directory authentication – my first sign in which was few months back was through my corporate network but now I’m signing in from home and it was deemed risky by AAD (https://docs.microsoft.com/en-us/azure/active-directory/conditional-access/location-condition) so was asked to enable MFA but clearly this wouldn’t work in this scenario so I’m trying to find a workaround.
But if you or somehow know how that can be fixed, please let me know. Thanks in advance!
Hi there
You are right in that you need an account without MFA enabled for this to work.
Hey,
Thanks for the post. I am using New-AzStorageContext instead and intermittently get an error that New-AzStorageContext is not recognized as a cmdlet. The problem is this works sometimes and sometimes it does not.
Have you faced this? Any idea?
Hi Gary
I also did find that to be an issue.
What I did was to change it to use AzCopy which I found was a lot more reliable.
If you need more details please let me know and I can email you through some details.
For me it is giving ‘Enable-AzureRmAlias’ is not recognized as the name of a cmdlet.
Could you suggest an alternative using the new Az commands?
Hi Arvind,
Maybe you could import the azure module into your Azure Function App and then use the ‘Import-Module’ in your PowerShell script?
Hi Gilbert, thank you so much for writing this article. I’m so sorry to bother you, but when I tried to run it, it returned the following error:
ERROR: Failed to get ADAL token: Unhandled Exception: System.AggregateException: One or more errors occurred. —> Microsoft.IdentityModel.Clients.ActiveDirectory.AdalClaimChallengeException: AADSTS53003: Access has been blocked by Conditional Access policies.
I would appreciate it very much if you could help me.
Thanks a lot
Hi there
The error indicates that you have been blocked by your Network Administrator
“Access has been blocked by Conditional Access policies”
I would suggest contacting your network/azure admin
Hi Gilbert,
Thank you for such a wealth of information on setting this up.
Questions:
1) I assume this runs every day to store the last day’s worth of logs since it last ran. So I’m guessing if we need the last 90 days, we do a one-off setup to load and store that history first, then set up this daily schedule to keep adding to it?
2) When connecting this to Power BI, will it automatically pick up the new files as they are added each day when the dataset refreshes?
Thanks,
Bee
Hi Bee,
Thanks for the kind words.
What you could do is change the script at Line 49, where it says
$DaysDifference = $DateDifference.Days
and let it be $DaysDifference = 95
This should loop through and get the last 95 days’ worth of logs.
And yes when you create the PBIX to look at the Azure Data Lake location, it will process all the files in the folder each time it refreshes.
Hi, Gilbert. Thank you for your kind article. I’ve followed your instructions up to running the PowerShell script, and got an error message like this:
2020-11-03T13:34:30.304 [Information] Executing ‘Functions.PBI-GetAudit-TimerTrigger1′ (Reason=’This function was programmatically called via the host APIs.’, Id=b8d412c1-930e-4e74-84e0-a5e88e2a99de)
2020-11-03T13:34:30.308 [Error] Executed ‘Functions.PBI-GetAudit-TimerTrigger1’ (Failed, Id=b8d412c1-930e-4e74-84e0-a5e88e2a99de, Duration=1ms)Value cannot be null. (Parameter ‘connectionString’)
Can you or anyone give me some idea about it?
Thank you.
Hi there,
Can you test and make sure that the account used to run the powershell command has got access to the power bi audit log?
I would run it manually in powershell to test
Hi, Gilbert.
Thanks for your reply.
Yes, I’ve run it manually in PowerShell and it works well. The only difference between running it manually and running it in Azure is the login part.
I can successfully log in when I test PowerShell manually, but in Azure the same error message occurs. I believe the reason is that I’m using an account that is MFA-enabled, and I have to keep MFA enabled on my account.
So, I would appreciate it if you or anyone can help me on how to log in to Power BI from Azure PowerShell with an MFA account.
Thank you.
Hi Eric
Currently it is not possible to automate the PowerShell script with an MFA enabled account.
What you will need to do is either use the Service principal or an account that does not have MFA enabled.
I have the same issue. I’ve followed these instructions verbatim alongside a colleague — Our steps performed within our own individual developer tenants were identical, and we cannot narrow down what other differences may exist (presumably outside of the function itself, but within our individual tenants/Azure subscriptions) that causes this to work for one of us but not both 🙁
Hi Alan,
Do you have an error that is happening?
Can you also verify which version of the Azure Function App you are running?
There are currently 3 versions.
In my blog post I am using the RunTime Version 2.
You can find the RunTime version in the Azure Function setting under Configuration.
[…] have not read my prior blog post where I detail how to store the Audit log you can read it here (How you can store All your Power BI Audit Logs easily and indefinitely in Azure) which will allow you to get a copy of the audit […]
[…] How you can store All your Power BI Audit Logs easily and indefinitely in Azure, Gilbert Quevauvilliers […]
I used the same runtime version as you. I ended up creating the resources in a different region and everything worked like a charm. No clue why the region made a difference since both regions offer equivalent resources…
Hello, can you please explain the process of not using individual accounts and how to use a Service Principal instead? That would help a lot.
Hi Arch
Thanks for the comment I will look and see when I can get this done.
Hi Gilbert, did you look into this request? It is much needed, so that instead of an individual account we can use a service account.
Hi Ajay
Unfortunately this is not possible at this time.
Hi Gilbert,
I have been using the above process to get the Activity Events and I am using my credentials, but after the recent mandatory password change it has started failing and giving me the following error. Any help is appreciated, as I tried different ways and couldn’t figure out how or why.
“”2021-05-19T15:31:53.211 [Error] Executed ‘Functions.PBI-GetActivityEvents’ (Failed, Id=f68164bf-137c-408a-acdc-69fd6fa23ca8, Duration=37ms)Result: FailureException: The script file ‘C:\home\site\wwwroot\PBI-GetActivityEvents\run.ps1’ has parsing errors:Missing argument in parameter list.Stack: at Microsoft.Azure.Functions.PowerShellWorker.AzFunctionInfo.GetParameters(String scriptFile, String entryPoint, ScriptBlockAst& scriptAst) in /home/vsts/work/1/s/src/FunctionInfo.cs:line 170at Microsoft.Azure.Functions.PowerShellWorker.AzFunctionInfo..ctor(RpcFunctionMetadata metadata) in /home/vsts/work/1/s/src/FunctionInfo.cs:line 73at Microsoft.Azure.Functions.PowerShellWorker.FunctionLoader.LoadFunction(FunctionLoadRequest request) in /home/vsts/work/1/s/src/FunctionLoader.cs:line 52at Microsoft.Azure.Functions.PowerShellWorker.RequestProcessor.ProcessFunctionLoadRequest(StreamingMessage request) in /home/vsts/work/1/s/src/RequestProcessor.cs:line 222″”
Hi Archana,
It would appear that something has changed with the function; have a look at line numbers 170, 73, 52 and 222.
I am not sure what else is causing the issue.
Hey Gilbert,
I have recreated the entire process in a UAT environment and now I am getting a different error. This process has worked beautifully for a couple of months and I couldn’t figure out what has changed. Google has not been my best friend either, so any help is appreciated here.
Error :
2021-05-24T20:05:09.408 [Warning] The Function app may be missing a module containing the ‘Enable-AzureRmAlias’ command definition. If this command belongs to a module available on the PowerShell Gallery, add a reference to this module to requirements.psd1. Make sure this module is compatible with PowerShell 7. For more details, see https://aka.ms/functions-powershell-managed-dependency. If the module is installed but you are still getting this error, try to import the module explicitly by invoking Import-Module just before the command that produces the error: this will not fix the issue but will expose the root cause.
2021-05-24T20:05:11.121 [Error] ERROR: The term ‘Enable-AzureRmAlias’ is not recognized as the name of a cmdlet, function, script file, or operable program.Check the spelling of the name, or if a path was included, verify that the path is correct and try again.Exception :Type : System.Management.Automation.CommandNotFoundExceptionErrorRecord
Hey Gilbert,
I’m running into issues running a timer based function in general, no matter what I do it leads to the same error.
2021-07-20T23:24:18.591 [Error] Executed ‘Functions.PBI-GetActivityEvents’ (Failed, Id=362962fc-9818-4ab1-b317-bf732dd5c14b, Duration=1ms)Could not create BlobServiceClient to obtain the BlobContainerClient using Connection: Storage
I’m wondering if there’s any insight into this and if it’s also because I am on a free trial account,
I appreciate all the help I can get, Thanks!
Hi John, I would suggest making sure you have got a full Power BI account.
And also make sure that you have got access to the Blob storage account.
[…] How you can store All your Power BI Audit Logs easily and … […]
Hi Gilbert
Thank you very much for the excellent solution to store Power BI audit logs in Azure. Can you please share video tutorials if you have any? It would be very helpful for implementing it step by step on our own.
Hi Ajay,
Unfortunately I do not do many videos, and at the current time I do not have the capacity to create one.
I do hope the blog post is good enough.
I am new to Azure and don’t know how much cost is involved in implementing this end-to-end solution as described.
What is the cost involved for:
1. Creating the function app
2. Running the PowerShell script on Azure on a daily basis
3. Storing the audit log files in blob storage
4. Any other Azure resource utilization
It would be very helpful to be able to show this cost part to my manager.
Hi Gilbert,
I have followed the above steps and almost completed them. When executing the PowerShell script in run.ps1 I am getting an HTTP output of 202 Accepted and the below message in the prompt output.
2023-07-07T08:17:53Z [Verbose] Sending invocation id: ‘xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
2023-07-07T08:17:53Z [Verbose] Posting invocation id:xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx on workerId:xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
2023-07-07T08:17:53Z [Information] INFORMATION: PowerShell timer trigger function ran! TIME: 07/07/2023 08:17:52
2023-07-07T08:17:53Z [Information] Executed ‘Functions.PBI-GetActivityEvents’ (Succeeded, Id=xxxxxxxxxxxxxxxxxxxxxxxx, Duration=14ms)
Do you have any idea about this error? Where did I go wrong?
Hi Vigneshwaran,
The 202 means that it ran successfully?
Hi Gilbert,
Thanks for the amazing post on pulling the audit log. I’m able to pull the data from the last 28 days; however, if I go beyond that period it gives me the error ‘Operation returned an invalid status code ‘BadRequest’’. Could you please let me know if this approach can be used to pull historical data beyond the current 30 days?
Thanks, Nikita
Hi Nikita,
What you could do is, on line 47, hard-code the $DaysDifference to be, say, 60 to go back 60 days.
I do know that the Power BI Service only keeps about 60-90 days’ worth of history.
Hi Gilbert,
This is a very useful and great blog for Power BI Audit Logs. I followed all the steps in this blog and when I run the run.ps1 I receive HTTP 202 Accepted and the below output:
2024-03-06T22:09:01Z [Information] Executing ‘Functions.PBIGetActivityEvents’ (Reason=’This function was programmatically called via the host APIs.’, Id=017ff5eb-5916-460c-aa4f-f866cfdbbb3e)
2024-03-06T22:09:01Z [Verbose] Sending invocation id: ‘017ff5eb-5916-460c-aa4f-f866cfdbbb3e
2024-03-06T22:09:01Z [Verbose] Posting invocation id:017ff5eb-5916-460c-aa4f-f866cfdbbb3e on workerId:09c1f4cc-ccc4-4f6c-9009-24736fc085a3
2024-03-06T22:09:01Z [Information] INFORMATION: PowerShell timer trigger function ran! TIME: 03/06/2024 22:09:01
2024-03-06T22:09:01Z [Information] Executed ‘Functions.PBIGetActivityEvents’ (Succeeded, Id=017ff5eb-5916-460c-aa4f-f866cfdbbb3e, Duration=13ms)
When I check “site\wwwroot\AuditFiles” I don’t see any .CSV files uploaded, and I also don’t have any files added to the Blob Container. Can you please let me know where I went wrong?
Note: I’m using my MFA account to sign in to Power BI
Thanks,
Srija
Hi Sai,
You would need to have an account that does not have MFA, as you cannot input this information automatically each time it runs.
Once you have an account that has got the Power BI Admin access and does not have MFA enabled, it should work.
Hi Gilbert,
Thanks for the quick response!! I can now see my .csv files in the file share but am unable to copy them to the blob, receiving the below error:
The Function app may be missing a module containing the ‘New-AzureStorageContext’ command definition. If this command belongs to a module available on the PowerShell Gallery, add a reference to this module to requirements.psd1. Make sure this module is compatible with PowerShell 7. For more details, see https://aka.ms/functions-powershell-managed-dependency. If the module is installed but you are still getting this error, try to import the module explicitly by invoking Import-Module just before the command that produces the error: this will not fix the issue but will expose the root cause.
2024-03-07T21:36:59Z [Error] ERROR: The term ‘New-AzureStorageContext’ is not recognized as a name of a cmdlet, function, script file, or executable program.
I have the right PowerShell version installed, and when I separately run the COPY TO BLOB code it works fine, but when running it all together in the loop it throws the above error. Please let me know your suggestions.
Thanks,
Srija