A SharePoint file shortcut and SharePoint list copy activity! But does it work? (ENG)

Published on January 30, 2026 at 2:29 PM

Finally, a SharePoint file shortcut and a SharePoint list copy activity!
But the real question is… does it actually work?

A few months ago, I wrote a blog about pulling data directly from a SharePoint list instead of relying on a OneLake Sync.

 

Integrate SharePoint lists instead of using Onelake (ENG) | youranalyticalbridge

Back then, I said:
“Business users can maintain the lists just as easily as they would in Excel, which keeps things intuitive and user‑friendly. But hey, it's still not perfect.”

And of course, not long after, a new business request came in, based on (you guessed it) Excel files. I’m definitely not a fan of that setup, but at the moment the information simply isn’t available in any other way.

So, I started working on the simplest possible approach to get this data into Fabric, with as little code as possible, while keeping ownership with the business.
However… giving the owner access to the OneLake environment wasn’t an option: too many folders, files, and lakehouses, and too many security concerns.
So a OneLake sync was off the table.

For the time being, I’ve been manually uploading the file to an Azure Blob Storage account and loading the data into Fabric using a simple copy activity pipeline.
Not perfect, but it worked for what we needed.

But finally… there are now two new methods that might work much better for my customer: the SharePoint file shortcut and the SharePoint list copy activity.
Time to put them to the test!

First of all, the SharePoint file shortcut

To try this out, I created a folder in my Teams location, which is also the client’s SharePoint team site.
From the Lakehouse in Fabric, I selected “Create shortcut to SharePoint”.

Using the service principal we set up earlier, I connected to the project site and selected the correct folder.
Once everything was configured, the shortcut appeared in my Lakehouse, linking directly to the SharePoint files.
Exactly what I hoped for!

Note that this shortcut still doesn’t sync a SharePoint list, but it does give your customer more options, depending on the data that needs to be synced, the frequency, dependencies, and so on.

Up to this point, everything works like a charm. The file is now available in the Lakehouse raw files section, and updating it in SharePoint syncs to Fabric almost immediately. Great!

Now, let’s use a copy activity to turn that file into a table.

Oops… now I’m running into multiple errors while doing this.

When I try to connect to the file and read (preview) the XLSX, I get the following errors:

  • “ExcelUnsupportedFormat, Only '.xls' and '.xlsx' format is supported in reading the excel file.”

and

  • “AuthenticationFailed”

If I try to read a CSV file instead, I still get the AuthenticationFailed message.

The funny thing is that both error messages make zero sense.

I have access to the Excel file, including via the service principal, and the file is definitely in XLSX format.

So… what’s going on?

If I go back to the blob storage, everything just works!

In the meantime, I’ve submitted this issue on the Microsoft Fabric forum, and for now I’m waiting for my API permissions to be updated in the Azure portal. So… let’s test the second option!

The SharePoint List Copy Activity

Luckily, there’s also the option to load a SharePoint list directly via the copy activity.
As shown in the following post, the copy activity in Fabric has been updated with a SharePoint List connector.

Simplifying data movement across multiple clouds with Copy job – Enhancements on incremental copy and change data capture | Microsoft Fabric Blog | Microsoft Fabric

As I mentioned earlier, I already have some SharePoint lists that I load daily for my client. Back then, I created an API-call notebook to pull data from these lists.
Right now, that process takes around 12 minutes to ingest the data.
Yeah… not exactly lightning-fast for just three datasets, especially when using Spark.
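For context, that old notebook did something along these lines: get a token for the service principal, then page through the list items via the SharePoint REST API. This is a minimal sketch, not the actual notebook; the host, site path, and list names are placeholders.

```python
# Minimal sketch of an API-call notebook for SharePoint lists.
# All identifiers (tenant, client, host, site, list) are placeholders.
import json
import urllib.parse
import urllib.request


def list_items_url(host: str, site_path: str, list_name: str) -> str:
    """Build the SharePoint REST endpoint for the items of one list."""
    return (f"https://{host}{site_path}/_api/web/lists/"
            f"getbytitle('{list_name}')/items")


def get_token(tenant_id: str, client_id: str, client_secret: str, host: str) -> str:
    """Client-credentials flow against Entra ID for the SharePoint resource."""
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": f"https://{host}/.default",
    }).encode()
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    with urllib.request.urlopen(urllib.request.Request(url, data=body)) as resp:
        return json.load(resp)["access_token"]


def get_list_items(token: str, host: str, site_path: str, list_name: str) -> list:
    """Fetch all items of one list, following OData pagination links."""
    url = list_items_url(host, site_path, list_name)
    items = []
    while url:
        req = urllib.request.Request(url, headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/json;odata=nometadata",
        })
        with urllib.request.urlopen(req) as resp:
            data = json.load(resp)
        items.extend(data.get("value", []))
        url = data.get("odata.nextLink")  # present only when more pages remain
    return items
```

Multiply the token round-trip, the paging, and the Spark session startup by three lists, and you can see where those 12 minutes go.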

So let’s try it out via the copy activity.

Configuring the connection to the SharePoint site and selecting the list is super easy. My destination? A brand‑new table in my Fabric Lakehouse.

When I run the activity, it takes only 1 minute to load the data!

And if I run this for all three tables separately, it still only takes about 1 minute total for all datasets.
That’s a massive improvement!

This is definitely a huge performance boost compared to my ‘old’ API notebook.
But now let’s try to loop through the lists with one simple, dynamic solution using the copy activity, just to make things a bit more flexible.
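The dynamic setup I had in mind looks roughly like this in pipeline JSON: a ForEach over a list-names parameter, with one parameterized copy activity inside. This is an illustrative sketch based on the Azure Data Factory JSON shape; the sink type name in particular is my assumption, and Fabric’s exact schema may differ.

```json
{
  "name": "ForEachSharePointList",
  "type": "ForEach",
  "typeProperties": {
    "items": {
      "value": "@pipeline().parameters.listNames",
      "type": "Expression"
    },
    "activities": [
      {
        "name": "CopyOneList",
        "type": "Copy",
        "typeProperties": {
          "source": { "type": "SharePointOnlineListSource" },
          "sink": { "type": "LakehouseTableSink" }
        }
      }
    ]
  }
}
```

Inside the copy activity, the list name and the destination table name would both be driven by `@item()`.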

This time, I get the following error:

  • “ErrorCode=MissingSchemaForAutoCreateTable,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Failed to auto create table for no schema found in Source.,Source=Microsoft.DataTransfer.TransferTask,'”

Apparently, the dynamic approach doesn’t work with SharePoint lists: auto-creating the destination table requires a specific schema, and because the copy activity tries to pick that schema up dynamically from the source, it finds none and the load fails!
That’s a pity.

So for now, I’ll stick with the three separate list loads to get the job done, at least until I get more information or help on fixing the error, improving the shortcut, or making the dynamic copy activity work.

Did you manage to get it working?