
Fabric - How to check if a data pipeline is running?

  • jonlunn27
  • 3 days ago
  • 5 min read

How do you check whether a pipeline is running, not from the Monitoring hub, but from within your own Data Pipelines?

Maybe you're like me and you have a Data Pipeline process that needs to check whether another pipeline is running. In my case I have to check because Delta tables prefer to have one process writing to them; otherwise you can get concurrency issues as two items try to update the same Delta table metadata file.

Those tricky metadata files like the writing process to be exclusive. It's not just a Delta table issue; this can happen with regular SQL database tables too. So you can use this technique anywhere you want to avoid a locking issue, need exclusive access to an object, or just don't want a process to run while another is doing its thing. 

As mentioned, in my case there is a batch process that runs a few times a day, as well as another process that runs every 30 minutes to bring in some data from another source. What I don't want is the 30-minute process running at the same time as the main pipeline. So how do I check whether a pipeline is running? 


Set up the pre-requisites


The first thing I did was Google the issue and found this great example of doing it in Azure Data Factory. Wow, he has the same name as my boss! 



There wasn't a good Fabric-based blog on it, so I decided to write one myself!

In Fabric it is a lot easier than in ADF: you don't have to worry about subscription Ids or resource groups, just the Workspace Id and the Pipeline Id. The process does have many of the same components, though: you hit the Fabric API to get the status of the pipeline, and you hit the Entra API to get a token that authorises access to the Fabric API. 


Pre-requisites needed

  • Tenant Id 

  • Service Principal Id 

  • Service Principal Secret 

  • Service Principal added to the relevant workspace 

  • If you are using a Workspace Identity, you need to generate a secret for it! See here 


Set up the pipeline tasks



Above is the overview of the pipeline. The first three Web tasks get the Service Principal Id, the secret and the Tenant Id from an Azure Key Vault; the rest of the activities check whether the pipeline is running. In this example I'm only going to show the checking part, not how to get items from a Key Vault. 


Connections

Before any of this is built we need to create two connections, one for the Entra API and one for the Fabric API. 

In Manage connections and gateways, accessed via the settings cog icon in the top right of Fabric, create the following two Web v2 connections: 


Entra API 

Connection Type: Web v2  

Data Source Path: https://login.microsoftonline.com  

Authentication: Anonymous 

Fabric API 

Connection Type: Web v2  

Data Source Path: https://api.fabric.microsoft.com  

Authentication: Anonymous

 

Pipeline Tasks 


The first task you need to create is a Web task using the Entra API connection, with the following details: 

Name: Web - Get Token 

Relative URL: /@{activity('Web - Get Tenant Id').output.value}/oauth2/v2.0/token  

Method: POST 

Body: @concat( 
  'grant_type=client_credentials', 
  '&client_id=', activity('Web - Get SP Id').output.value, 
  '&client_secret=', activity('Web - Get SP Secret').output.value, 
  '&scope=', uriComponent('https://api.fabric.microsoft.com/.default') 
) 

Headers: 

Name: Content-Type  

Value: application/x-www-form-urlencoded 



 

You need to replace the 'activity' items with the name of your Key Vault tasks. 
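To make the moving parts clearer, here's a minimal Python sketch of the same client-credentials request. The function name and placeholder IDs are mine, not part of the pipeline; it just builds the same URL, header and form-encoded body as the @concat expression above.

```python
from urllib.parse import quote

ENTRA_BASE = "https://login.microsoftonline.com"
FABRIC_SCOPE = "https://api.fabric.microsoft.com/.default"

def build_token_request(tenant_id: str, client_id: str, client_secret: str):
    """Build (url, headers, body) for the client-credentials token call,
    mirroring the 'Web - Get Token' task's @concat expression."""
    url = f"{ENTRA_BASE}/{tenant_id}/oauth2/v2.0/token"
    headers = {"Content-Type": "application/x-www-form-urlencoded"}
    body = (
        "grant_type=client_credentials"
        f"&client_id={client_id}"
        f"&client_secret={client_secret}"
        f"&scope={quote(FABRIC_SCOPE, safe='')}"  # uriComponent() equivalent
    )
    return url, headers, body
```

You could then POST it with, for example, `requests.post(url, headers=headers, data=body)` and read `access_token` from the JSON response.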

The next task takes the output of 'Web - Get Token' and passes that token to the Fabric API. You will need two things: the Workspace Id and the Pipeline Id of the item you want to check. In this case I'm using 'pipeline().DataFactory', which returns the Workspace Id of wherever the pipeline is running. The other item you need is the Pipeline Id. This can be taken from the URL of the pipeline you want to check. If you open up the pipeline you need to check, you should see a URL in the format of: 



You need to grab the pipeline_id, which should be in the format of: 

'11c1a909-9915-4632-8c76-a2469c502763' 

This is concatenated onto the Fabric API URL from the connection we set up, hitting the 'jobs/instances' endpoint. Details here.  



The response is normally limited to the last 100 runs. 

 

Name: Web - Get Pipeline Status 

Relative URL: @concat('/v1/workspaces/', pipeline().DataFactory, 

       '/items/', pipeline().parameters.pipeline_id, 

       '/jobs/instances') 

Method: GET 


Headers: 

Name: Authorization 

Value: @concat('Bearer ', activity('Web - Get Token').output.access_token) 

Name: Content-Type 

Value: application/json 
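The same call can be sketched in Python as a companion to the token builder above. Again the function name is mine; it just assembles the URL and headers that the task's relative URL expression and headers produce, against the Fabric API connection's base URL.

```python
def build_status_request(workspace_id: str, pipeline_id: str, token: str):
    """Build (url, headers) for the jobs/instances call, mirroring the
    'Web - Get Pipeline Status' task's relative URL and headers."""
    url = (
        "https://api.fabric.microsoft.com"   # Fabric API connection base
        f"/v1/workspaces/{workspace_id}"
        f"/items/{pipeline_id}"
        "/jobs/instances"
    )
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    return url, headers
```

A `requests.get(url, headers=headers)` with these values should return the same JSON shown below.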


 

Once set up correctly, you should get an output something like this: 

{
  "value": [
    {
      "id": "bc1ec9a6-ead8-4713-a2ba-d1e1a5403d21",
      "itemId": "11c1a909-9915-4632-8c76-a2468c490652",
      "jobType": "Pipeline",
      "invokeType": "Manual",
      "status": "Completed",
      "failureReason": null,
      "rootActivityId": "358c31ae-7ddc-ae0f-9fd5-474782a7e14a",
      "startTimeUtc": "2025-11-03T14:48:13.1182431Z",
      "endTimeUtc": "2025-11-03T14:51:20.5724756Z"
    },
    {
      "id": "2aef0f76-440f-401e-aedc-7c697e65ecd1",
      "itemId": "11c1a909-9915-4632-8c76-a2468c490652",
      "jobType": "Pipeline",
      "invokeType": "Manual",
      "status": "Completed",
      "failureReason": null,
      "rootActivityId": "fd09ab57-5151-091f-db63-da06d2ae1d34",
      "startTimeUtc": "2025-11-03T14:37:35.4511393Z",
      "endTimeUtc": "2025-11-03T14:40:43.0851735Z"
    }
  ],
  "ADFHttpStatusCodeInResponse": "200",
  "ResponseHeaders": {
    "Pragma": "no-cache",
    "Transfer-Encoding": "chunked",
    "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
    "X-Frame-Options": "deny",
    "X-Content-Type-Options": "nosniff",
    "RequestId": "9f5b0eb7-aaaf-4a10-ae7b-093a522ec8be",
    "Access-Control-Expose-Headers": "RequestId",
    "request-redirected": "true",
    "home-cluster-uri": "https://wabi-uk-south-c-primary-redirect.analysis.windows.net/",
    "Cache-Control": "no-store, must-revalidate, no-cache",
    "Date": "Mon, 03 Nov 2025 14:57:35 GMT",
    "Content-Type": "application/json; charset=utf-8"
  },
  "executionDuration": 0
}


Note: If you just get a 200 response with nothing in "value", that may mean the pipeline has never run, so there are no items to return! 

We don't need all the output; all we need is the "status".

{ 
"id": "2aef0f76-440f-401e-aedc-7c697e65ecd1", 
"itemId": "11c1a909-9915-4632-8c76-a2468c490652", 
"jobType": "Pipeline", 
"invokeType": "Manual", 
"status": "Completed", 
"failureReason": null, 
"rootActivityId": "fd09ab57-5151-091f-db63-da06d2ae1d34", 
"startTimeUtc": "2025-11-03T14:37:35.4511393Z", 
"endTimeUtc": "2025-11-03T14:40:43.0851735Z" 
} 

In this example it's "Completed", so it has run and done its thing; if you look carefully, it has run twice. The next task is to filter the output. We are not interested in "Completed" or anything else; we are looking for "InProgress", which means the pipeline is running. Just to note, the status of the pipeline is not real-time: there can be a delay between the pipeline starting and the status being updated, but the API tends to return the right status within seconds rather than minutes. 

So the next task is a filter on that output. 


Items: @activity('Web - Get Pipeline Status').output.value 

Condition: @equals(item().status,'InProgress') 


 

To check more than one status you can add an 'OR' in the condition part.

@or( 
    equals(item().status, 'InProgress'), 
    equals(item().status, 'NotStarted') 
) 

"InProgress" - The pipeline is running 

"NotStarted" - The pipeline has received a run request and getting ready to start 


This will now filter down to any item that is running or about to run. Once that is done, you can add an If Condition on that task's output to stop your pipeline or continue with your process. 
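The Filter and If Condition steps amount to this check, sketched in Python (the function name is mine): take the "value" array from the status response and see whether any instance is in a running or about-to-run state.

```python
# Statuses that mean the pipeline is running or queued to run.
RUNNING_STATUSES = {"InProgress", "NotStarted"}

def is_pipeline_busy(response: dict) -> bool:
    """Mirror the Filter + If Condition: True if any returned job
    instance is running or about to run."""
    return any(
        job.get("status") in RUNNING_STATUSES
        for job in response.get("value", [])
    )
```

If this returns True, wait or stop; if False, it is safe to continue with your process.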

 

Next steps 

  • You now have two new connections that you can use to hit the Fabric API with, and a method of hitting the API to get something back. Is there anything else you may need to hit the Fabric API for?  

  • If you add some parameters to your new pipeline, you have a repeatable element you can package up into a process, should you need to check whether other items across your pipelines are running.

 

 

 
 
 
