I have a CSV file stored in blob storage. The goal is to move this file into a SharePoint site and set some metadata. What would be the best way to do this? The client does not want us to use Power Automate or Logic Apps.
I tried using Azure Data Factory, but there seems to be an issue with writing data to SharePoint. I used the copy activity, but the 'sink' to SharePoint failed. Does Data Factory support writing to SharePoint?
The client does not want us to use Power Automate or Logic Apps.
Why not? This is the simplest way to achieve this, and it is also more maintainable than, for instance, custom C# code.
Does Data Factory support writing to SharePoint?
Yes, it does. However, using Data Factory only to copy a file to SharePoint is overkill.
If Logic Apps are not an option, have a look at an Azure Function that triggers automatically when the file is created in Azure Storage. For the upload itself, see for instance Upload File To SharePoint Office 365 Programmatically Using C# CSOM – PNP for a C# way of uploading a file to SharePoint.
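A minimal sketch of that setup could look like the following, assuming the Microsoft.NET.Sdk.Functions and Microsoft.SharePointOnline.CSOM NuGet packages; the container name, site URL, credentials, library name, and the "Department" metadata column are all placeholders, not from the original question:

```csharp
// Blob-triggered Azure Function: fires when a file lands in the "incoming" container,
// then uploads it to a SharePoint document library via CSOM and sets one metadata field.
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Microsoft.SharePoint.Client; // Microsoft.SharePointOnline.CSOM package

public static class BlobToSharePoint
{
    [FunctionName("BlobToSharePoint")]
    public static void Run(
        [BlobTrigger("incoming/{name}")] Stream blob, string name, ILogger log)
    {
        using (var ctx = new ClientContext("https://contoso.sharepoint.com/sites/demo"))
        {
            // Placeholder credentials; in practice use Key Vault / app-only auth.
            ctx.Credentials = new SharePointOnlineCredentials(
                "user@contoso.com", GetSecurePassword());

            // Upload the blob into a document library.
            var file = ctx.Web.GetFolderByServerRelativeUrl("Shared Documents")
                .Files.Add(new FileCreationInformation
                {
                    ContentStream = blob,
                    Url = name,
                    Overwrite = true
                });

            // Set metadata on the uploaded file's list item.
            var item = file.ListItemAllFields;
            item["Department"] = "Finance"; // example metadata column
            item.Update();
            ctx.ExecuteQuery();
            log.LogInformation($"Uploaded {name} to SharePoint.");
        }
    }

    private static System.Security.SecureString GetSecurePassword()
    {
        var s = new System.Security.SecureString();
        foreach (var c in "placeholder-password") s.AppendChar(c);
        return s;
    }
}
```

Note that SharePointOnlineCredentials relies on legacy authentication; depending on the tenant's settings you may need app-only (Azure AD) authentication instead.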
Below are code snippets of what I currently use to import and export Excel files. However, is it possible to make this work through an Azure Web App which is serverless?
C#
File.WriteAllBytes(@"c:\temp\report.xlsx", excel.GetAsByteArray());
SQL Server
INSERT INTO Employee (FirstName, Salary)
SELECT FirstName, Salary
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
'Excel 12.0; Database=C:\Temp\Data.xlsx', [Sheet1$]);
Approach 1:
This could be achieved using Azure Logic Apps along with web apps.
Logic Apps are a serverless offering from Azure where you can create workflows combining different steps, such as connecting to a database, extracting/exporting data to a CSV or Excel file, uploading the file to blob storage or an FTP location, or sending it as an email attachment. You can follow the steps here to create a Logic App.
But you want to access this from an Azure Web App. That's also possible, by triggering the Logic App with an HTTPS request from the web app. How to do that is described here.
Once the data is uploaded into blob storage, you can return the file link as the response to the caller.
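Triggering the Logic App from the web app is then just an HTTP POST to the callback URL that the "When a HTTP request is received" trigger generates when you save the flow. A sketch, where the URL (truncated with its SAS signature) and the JSON payload are placeholders:

```csharp
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class LogicAppCaller
{
    // The callback URL (including its sig= token) is shown on the trigger
    // after you save the Logic App; this one is a placeholder.
    private const string LogicAppUrl =
        "https://prod-00.westeurope.logic.azure.com/workflows/...?sig=placeholder";

    public static async Task<string> ExportToExcelAsync(HttpClient client)
    {
        // Example payload; the Logic App reads it via the trigger's JSON schema.
        var payload = new StringContent(
            "{ \"reportName\": \"employees\" }", Encoding.UTF8, "application/json");

        var response = await client.PostAsync(LogicAppUrl, payload);
        response.EnsureSuccessStatusCode();

        // If the flow ends with a Response action, it can return the blob link
        // of the generated file, which you hand back to the caller.
        return await response.Content.ReadAsStringAsync();
    }
}
```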
Approach 2:
Another way to achieve this is by coding the data extraction and upload logic at the database end in a stored procedure. However, your database needs to be either on-premises or on IaaS (VM-based) to be able to use the xp_cmdshell utility from a stored procedure.
Another approach is to map a network drive to an Azure file share. See here: https://techcommunity.microsoft.com/t5/fasttrack-for-azure/mapping-a-network-drive-to-an-azure-file-share-using-domain/ba-p/2220773
What's the simplest way for an Azure Functions to save a file into OneDrive? How does authentication work from a deployed Azure Function?
To make this discussion simpler, we have a string var content = "This is the file content" which needs to be saved as sample.txt file.
What if OneDrive folder is shared with an URL (real shared OneDrive link, will be removed - https://1drv.ms/f/s!Ak7ywxppmRtB8uRKhvT1FLmNBwXNwQ) and no authentication is required?
I'd recommend taking a look at the Microsoft Graph APIs for managing files stored in OneDrive.
They have great C# examples using the Microsoft.Graph NuGet package. You will need to implement an authentication mechanism, however, and I don't think there is a way around this. For the Graph, I'd recommend looking at these implementations for getting authenticated as a user.
Once you've authenticated in your Function app, you should be able to get to where you need to be using the Graph APIs for OneDrive available.
List a user's drives
Upload or update a file in a drive
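For the sample.txt case from the question, the upload itself is short once you have an authenticated GraphServiceClient. A sketch assuming the Microsoft.Graph NuGet package (classic fluent SDK style); the client is assumed to be already wired up with whatever auth provider you chose above:

```csharp
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Graph;

public static class OneDriveUploader
{
    // graphClient must already be authenticated (e.g. via an MSAL-based provider).
    public static async Task<DriveItem> SaveContentAsync(GraphServiceClient graphClient)
    {
        var content = "This is the file content";
        using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(content)))
        {
            // PUT /me/drive/root:/sample.txt:/content  (simple upload, files < 4 MB)
            return await graphClient.Me.Drive.Root
                .ItemWithPath("sample.txt")
                .Content
                .Request()
                .PutAsync<DriveItem>(stream);
        }
    }
}
```

For files larger than 4 MB, Graph requires an upload session instead of the simple PUT shown here.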
I need to connect my web app to SharePoint Online. I have read a lot of examples based on on-premises SharePoint, and they use a SharePoint DLL. Microsoft now suggests using Graph.
My question is about the best practice to implement these functions:
read a list of files/folders (list items)
create a folder
(optionally) upload a document
The best approach would be to develop a REST API flow using MS Flow. The flow's trigger would be "When a HTTP request is received"; inside the flow, handle the file reading and file uploading functionality. Once you save the flow, you get an automatically generated REST API URL; you call this API from your C# code and pass the parameters to it. Note:
1. You may need to develop two MS Flow APIs, one for reading the items and another for uploading the document.
If you want to access SharePoint Online data using C# in your web app, you could choose SharePoint Online CSOM library:
Microsoft.SharePointOnline.CSOM
Read List Item and other objects:
How To Read Various Objects Using SharePoint Online Client Side Object Model (CSOM)
Create folder:
Create Folder in SharePoint using CSOM
Upload document:
CSOM. Upload document
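Put together, the three operations look roughly like this in CSOM. This is a sketch assuming the Microsoft.SharePointOnline.CSOM package; the library name ("Documents"), folder name, and file path are placeholders, and the ClientContext is assumed to already carry your credentials:

```csharp
using Microsoft.SharePoint.Client;

public static class SharePointOnlineDemo
{
    public static void Run(ClientContext ctx) // ctx already authenticated
    {
        var library = ctx.Web.Lists.GetByTitle("Documents");

        // 1. Read list items (files/folders) from the library.
        var items = library.GetItems(CamlQuery.CreateAllItemsQuery());
        ctx.Load(items, col => col.Include(i => i["FileLeafRef"]));
        ctx.ExecuteQuery();

        // 2. Create a folder in the library's root.
        library.RootFolder.Folders.Add("Reports");
        ctx.ExecuteQuery();

        // 3. Upload a document into the new folder.
        var target = ctx.Web.GetFolderByServerRelativeUrl("Documents/Reports");
        target.Files.Add(new FileCreationInformation
        {
            Content = System.IO.File.ReadAllBytes(@"c:\temp\report.xlsx"),
            Url = "report.xlsx",
            Overwrite = true
        });
        ctx.ExecuteQuery();
    }
}
```

Note that `System.IO.File` is fully qualified here because CSOM also defines a `File` type in `Microsoft.SharePoint.Client`.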
I have some files (.txt, .doc, .xlsx, etc.) inside a bucket in my Amazon S3 drive. Is it possible to perform a content-level search through my C# application? That is, when the user types a string and presses a key in my application, every file that contains the searched string in its content should be listed.
Is there any way to achieve this, either using the SDK or even using web APIs?
Thanks in advance
Amazon S3 is purely a storage service. There is no search capability built into S3.
You could use services such as Amazon CloudSearch and Amazon Elasticsearch Service, which can index documents, but please note that this involves additional configuration and additional costs.
You won't be able to cover all the file types you listed, but for any of your files that are structured or semi-structured, you could consider the newly released Amazon Athena, which does allow searching S3 files using an SQL-like language:
https://aws.amazon.com/athena/faqs/
Amazon Athena is an interactive query service that makes it easy to analyze data in Amazon S3 using standard SQL. Athena is serverless, so there is no infrastructure to setup or manage, and you can start analyzing data immediately. You don’t even need to load your data into Athena, it works directly with data stored in S3. To get started, just log into the Athena Management Console, define your schema, and start querying. Amazon Athena uses Presto with full standard SQL support and works with a variety of standard data formats, including CSV, JSON, ORC, Apache Parquet and Avro. While Amazon Athena is ideal for quick, ad-hoc querying and integrates with Amazon QuickSight for easy visualization, it can also handle complex analysis, including large joins, window functions, and arrays.
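From C#, an Athena query can be started with the AWSSDK.Athena package, roughly as below. The table ("documents"), database ("mydb"), column names, and results bucket are placeholders, and the bucket's data must already be registered as an Athena table:

```csharp
using System.Threading.Tasks;
using Amazon.Athena;
using Amazon.Athena.Model;

public static class S3ContentSearch
{
    public static async Task<string> SearchAsync(IAmazonAthena athena, string term)
    {
        // LIKE '%term%' over a text column of a table defined on the S3 bucket.
        // In real code, sanitize/parameterize 'term' rather than interpolating it.
        var request = new StartQueryExecutionRequest
        {
            QueryString = $"SELECT filename FROM documents WHERE body LIKE '%{term}%'",
            QueryExecutionContext = new QueryExecutionContext { Database = "mydb" },
            ResultConfiguration = new ResultConfiguration
            {
                OutputLocation = "s3://my-athena-results/"
            }
        };

        var response = await athena.StartQueryExecutionAsync(request);
        // Athena is asynchronous: poll GetQueryExecution until it completes,
        // then page through GetQueryResults for the matching rows.
        return response.QueryExecutionId;
    }
}
```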
I am using SharePoint solely as a repository to store and retrieve large files (~100 MB). How can I authenticate a web application so that it can upload and download files to a document list on SharePoint 2007 without using Windows integrated authentication?
The web application will handle the authorization - it'll figure out which users are allowed to access the repository via integrated Windows authentication and a bunch of business rules that depend on the application's state. When the user wants a file, they will use the web app. The web app will then download that file on the user's behalf using some sort of credentials. I'd prefer that these credentials be somewhat permanent, so the password doesn't expire every so often. I was thinking of using basic authentication because the files I'm access-controlling aren't high-value (so its weaker security is tolerable), and it seems to be the simplest. What are my options?
I wouldn't recommend using SharePoint for this at all. Its value comes from the features it provides through its user interface. If you remove this then you are looking at an expensive and over-complicated data store.
SharePoint stores all data in a database. Storage for databases is more expensive than storage for files. It's more costly to configure, administer, back up, load-balance, scale, etc.
Development time is more costly with SharePoint. It's a big and complex product that's not trivial or quick to develop against. There needs to be a solid business case and using SharePoint for its back end only isn't a good one.
Please seriously consider this approach before going down it!
You are better off just enabling Windows authentication on your web application and then setting the permissions on the folders/files.
If you do just need to get the files, however, go to www.codeplex.com and search for "sharepoint powershell". There is a script there to upload files; I believe it could be modified to download as well.
As mentioned above, using SharePoint as a repository pretty much nullifies any of its benefits. You might as well just use a database to store your content (that's what SharePoint is doing behind the scenes anyway).