Basically I am creating an Azure WebJob that loops through the records of a SQL table holding SFTP location details. For each SFTP location, it connects using SftpClient and reads all the files in a folder.
For each file, it connects to Azure and stores the file in blob storage.
I have done all of the above in a single file, but I wish to do it in a proper object-oriented way. I am not sure what the right approach is; I am not very experienced with design patterns, but I would love it if someone could recommend a suitable one.
I would appreciate it if someone could help me achieve this in a proper object-oriented way.
Thanks
var azureBlob = AzureBlobStorage.Instance(StorageConnectionString, Containername);
List<SftpLocationData> sftps = null;
try
{
sftps = SftpLocationClient.GetSftpLocationDetails().ToList();
if (sftps == null || !sftps.Any()) return;
}
catch (Exception ex)
{
LogMessage(string.Format("Error getting sftp details : {0}", ex.Message), log);
return; // without this, sftps stays null and the loop below throws
}
try
{
foreach (var fileClient in sftps.Select(sftp => new FileTransferClient(sftp)))
{
using (var sftpClient = fileClient.CreateClient())
{
sftpClient.Connect();
var files = sftpClient.ListDirectory(path: fileClient.Data.Directory ?? ".").ToList();
if (files.Any())
{
var validFiles = files.Where(f => ext.Any(e => e == Path.GetExtension(f.Name))).ToList();
foreach (var file in validFiles)
{
var fileExists = azureBlob.FileExists(file.Name);
var blobUri = string.Empty;
var blobName = file.Name;
var fileImport = PopulateFileImportData(file);
if (fileExists)
{
int count = azureBlob.ListFiles(blobName);
blobName = (count == 0) ? blobName : String.Format("{0}_v{1}", blobName, count);
fileImport.Error = true;
fileImport.ErrorMsg = "Duplicate File";
}
fileImport.FileName = blobName;
try
{
var fileSaved = RbsClient.SaveFileImport(fileImport);
blobUri = azureBlob.UploadFile(fileSaved.FileName, sftpClient.OpenRead(file.FullName));
fileSaved.Archived = DateTime.Now;
RRClient.UpdateFile(fileSaved);
}
catch (Exception ex)
{
LogMessage(string.Format("Error saving fileimport detail : {0}", ex.Message), log);
}
if (fileImport.Error) continue;
IQueueGenerator queueGenerator = new QueueGenerator();
var queueName = queueGenerator.GetQueueName(
blobName,
fileClient.Data,
blobUri);
if (string.IsNullOrEmpty(queueName)) continue;
}
}
sftpClient.Disconnect();
}
}
}
catch (Exception ex)
{
LogMessage(string.Format("Error occurred in processing pending altapay requests. Error : {0}", ex.Message), log);
}
Hi, as I understand it, you need to copy files/folders from SFTP to Azure Blob Storage.
Step 1. Get list of SFTP locations
Step 2. Init Azure Blob Storage Connection
Step 3. Init SFTP
Step 4. Iterate Files in SFTP
Step 5. Store the file into Blob Storage
Step 6. Close SFTP connection
Step 7. Close Azure Blob Storage
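To give those steps an object-oriented shape, you could hide each responsibility behind a small interface and compose them in a processor class. Below is a minimal sketch, assuming SSH.NET (Renci.SshNet) for the SFTP side and Azure.Storage.Blobs for the blob side; the interface and class names (IFileSource, IFileSink, SftpFileSource, BlobFileSink, TransferProcessor) are illustrative, not taken from your code:
using System;
using System.Collections.Generic;
using System.IO;
using Azure.Storage.Blobs;
using Renci.SshNet;

public interface IFileSource                  // Steps 3/4: one SFTP folder
{
    IEnumerable<(string Name, Stream Content)> ReadFiles();
}

public interface IFileSink                    // Steps 2/5: one blob container
{
    string Upload(string name, Stream content);
}

public sealed class SftpFileSource : IFileSource
{
    private readonly string _host, _user, _pass, _directory;

    public SftpFileSource(string host, string user, string pass, string directory)
        => (_host, _user, _pass, _directory) = (host, user, pass, directory);

    public IEnumerable<(string Name, Stream Content)> ReadFiles()
    {
        using (var client = new SftpClient(_host, _user, _pass))
        {
            client.Connect();                 // Step 3
            foreach (var file in client.ListDirectory(_directory))
            {
                if (file.IsDirectory) continue;
                yield return (file.Name, client.OpenRead(file.FullName)); // Step 4
            }
        }                                     // Step 6: disposing disconnects
    }
}

public sealed class BlobFileSink : IFileSink
{
    private readonly BlobContainerClient _container;

    public BlobFileSink(string connectionString, string containerName)
        => _container = new BlobContainerClient(connectionString, containerName);

    public string Upload(string name, Stream content)
    {
        var blob = _container.GetBlobClient(name);
        blob.Upload(content, overwrite: true); // Step 5
        return blob.Uri.ToString();
    }
}

public sealed class TransferProcessor         // composes one source with one sink
{
    public void Run(IFileSource source, IFileSink sink)
    {
        foreach (var (name, stream) in source.ReadFiles())
            using (stream) sink.Upload(name, stream);
    }
}
The WebJob entry point then only loops over the SQL records and news up a SftpFileSource per row, so concerns like duplicate detection, queueing, and logging can be added as further collaborators (or decorators) without touching the transfer logic.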
I am using an Azure File share. I want to create a zip file only once but update it multiple times (upload multiple files after it is created).
Is it possible to create the .zip file only once and add more files to it later without overwriting the existing files in the zip?
When I try to add more files to the .zip, it overwrites the existing files in the zip with the new file.
private static async Task OpenZipFile()
{
try
{
using (var zipFileStream = await OpenZipFileStream())
{
using (var zipFileOutputStream = CreateZipOutputStream(zipFileStream))
{
var level = 0;
zipFileOutputStream.SetLevel(level);
BlobClient blob = new BlobClient(new Uri(String.Format("https://{0}.blob.core.windows.net/{1}", "rtsatestdata", "comm/2/10029.txt")), _currentTenantTokenCredential);
var zipEntry = new ZipEntry("newtestdata")
{
Size = 1170
};
zipFileOutputStream.PutNextEntry(zipEntry);
blob.DownloadToAsync(zipFileOutputStream).Wait();
zipFileOutputStream.CloseEntry();
}
}
}
catch (TaskCanceledException)
{
throw;
}
}
private static async Task<Stream> OpenZipFileStream()
{
BlobContainerClient mainContainer = _blobServiceClient.GetBlobContainerClient("comm");
var blobItems = mainContainer.GetBlobs(BlobTraits.Metadata, BlobStates.None);
foreach (var item in blobItems)
{
if (item.Name == "testdata.zip")
{
BlobClient blob = new BlobClient(new Uri(String.Format("https://{0}.blob.core.windows.net/{1}", "rtsatestdata", "comm/testdata.zip")), _currentTenantTokenCredential);
return await blob.OpenWriteAsync(true
, options: new BlobOpenWriteOptions
{
HttpHeaders = new BlobHttpHeaders
{
ContentType = "application/zip"
}
}
);
}
}
throw new FileNotFoundException("testdata.zip was not found in the container."); // all code paths must return or throw
}
private static ZipOutputStream CreateZipOutputStream(Stream zipFileStream)
{
return new ZipOutputStream(zipFileStream)
{
IsStreamOwner = false,
};
}
This is not possible in Azure storage. The workaround would be to download the zip, unzip it, add more files, re-zip it, and re-upload to storage.
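If you do need to script that workaround, a minimal sketch could look like the following, assuming Azure.Storage.Blobs and System.IO.Compression (the method name and parameters are illustrative). ZipArchiveMode.Update stands in for the unzip/re-zip step, since the archive is rewritten in memory before the re-upload:
using System.IO;
using System.IO.Compression;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

static async Task AppendToZipBlobAsync(BlobClient zipBlob, string entryName, Stream newContent)
{
    using (var buffer = new MemoryStream())
    {
        await zipBlob.DownloadToAsync(buffer);       // 1. download the existing zip
        buffer.Position = 0;
        using (var zip = new ZipArchive(buffer, ZipArchiveMode.Update, leaveOpen: true))
        {
            var entry = zip.CreateEntry(entryName);  // 2. add the new file
            using (var entryStream = entry.Open())
            {
                await newContent.CopyToAsync(entryStream);
            }
        }                                            // disposing rewrites the archive
        buffer.Position = 0;
        await zipBlob.UploadAsync(buffer, overwrite: true); // 3. re-upload the whole blob
    }
}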
I would like to download multiple files recursively from an FTP directory. To do this I'm using the FluentFTP library, and my code is this:
private async Task downloadRecursively(string src, string dest, FtpClient ftp)
{
foreach(var item in ftp.GetListing(src))
{
if (item.Type == FtpFileSystemObjectType.Directory)
{
if (item.Size != 0)
{
System.IO.Directory.CreateDirectory(Path.Combine(dest, item.Name));
await downloadRecursively(Path.Combine(src, item.Name), Path.Combine(dest, item.Name), ftp);
}
}
else if (item.Type == FtpFileSystemObjectType.File)
{
await ftp.DownloadFileAsync(Path.Combine(dest, item.Name), Path.Combine(src, item.Name));
}
}
}
I know you need one FtpClient per download, but how can I cap the number of connections at some maximum? I guess the idea is to create, connect, download, and close for every file I find, but with only X files downloading at the same time. I'm also not sure whether I should create Tasks with async, or threads, and, my biggest problem, how to implement all of this.
The answer from @Bradley here seems pretty good, but that question reads every file it has to download from an external file and doesn't have a maximum concurrent download value, so I'm not sure how to apply both requirements.
Use:
ConcurrentBag class to implement a connection pool;
Parallel class to parallelize the operation;
ParallelOptions.MaxDegreeOfParallelism to limit number of the concurrent threads.
var clients = new ConcurrentBag<FtpClient>();
var opts = new ParallelOptions { MaxDegreeOfParallelism = maxConnections };
Parallel.ForEach(files, opts, file =>
{
file = Path.GetFileName(file);
string thread = $"Thread {Thread.CurrentThread.ManagedThreadId}";
if (!clients.TryTake(out var client))
{
Console.WriteLine($"{thread} Opening connection...");
client = new FtpClient(host, user, pass);
client.Connect();
Console.WriteLine($"{thread} Opened connection {client.GetHashCode()}.");
}
string remotePath = sourcePath + "/" + file;
string localPath = Path.Combine(destPath, file);
string desc =
$"{thread}, Connection {client.GetHashCode()}, " +
$"File {remotePath} => {localPath}";
Console.WriteLine($"{desc} - Starting...");
client.DownloadFile(localPath, remotePath);
Console.WriteLine($"{desc} - Done.");
clients.Add(client);
});
Console.WriteLine($"Closing {clients.Count} connections");
foreach (var client in clients)
{
Console.WriteLine($"Closing connection {client.GetHashCode()}");
client.Dispose();
}
Another approach is to start a fixed number of threads with one connection for each and have them pick files from a queue.
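A minimal sketch of that fixed-worker variant with FluentFTP, assuming the same host, user, pass, sourcePath, destPath, files, and maxConnections values as the code above:
// N worker tasks, one FtpClient each, pulling file names from a shared queue.
// Needs: System.Collections.Concurrent, System.IO, System.Linq,
//        System.Threading.Tasks, FluentFTP.
var queue = new ConcurrentQueue<string>(files);
var workers = Enumerable.Range(0, maxConnections).Select(_ => Task.Run(() =>
{
    using (var client = new FtpClient(host, user, pass))
    {
        client.Connect();
        while (queue.TryDequeue(out var file))
        {
            client.DownloadFile(
                Path.Combine(destPath, Path.GetFileName(file)), // local path
                sourcePath + "/" + file);                       // remote path
        }
    }
})).ToArray();
Task.WaitAll(workers);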
For an example of an implementation, see my article for WinSCP .NET assembly:
Automating transfers in parallel connections over SFTP/FTP protocol
A similar question about SFTP:
Processing SFTP files using C# Parallel.ForEach loop not processing downloads
Here is a TPL Dataflow approach. A BufferBlock<FtpClient> is used as a pool of FtpClient objects. The recursive enumeration takes a parameter of type IEnumerable<string> that holds the segments of one filepath. These segments are combined differently when constructing the local and the remote filepath. As a side effect of invoking the recursive enumeration, the paths of the remote files are sent to an ActionBlock<IEnumerable<string>>. This block handles the parallel downloading of the files. Its Completion property eventually contains all the exceptions that may have occurred during the whole operation.
public static Task FtpDownloadDeep(string ftpHost, string ftpRoot,
string targetDirectory, string username = null, string password = null,
int maximumConnections = 1)
{
// Arguments validation omitted
if (!Directory.Exists(targetDirectory))
throw new DirectoryNotFoundException(targetDirectory);
var fsLocker = new object();
var ftpClientPool = new BufferBlock<FtpClient>();
async Task<TResult> UsingFtpAsync<TResult>(Func<FtpClient, Task<TResult>> action)
{
var client = await ftpClientPool.ReceiveAsync();
try { return await action(client); }
finally { ftpClientPool.Post(client); } // Return to the pool
}
var downloader = new ActionBlock<IEnumerable<string>>(async path =>
{
var remotePath = String.Join("/", path);
var localPath = Path.Combine(path.Prepend(targetDirectory).ToArray());
var localDir = Path.GetDirectoryName(localPath);
lock (fsLocker) Directory.CreateDirectory(localDir);
var status = await UsingFtpAsync(client =>
client.DownloadFileAsync(localPath, remotePath));
if (status == FtpStatus.Failed) throw new InvalidOperationException(
$"Download of '{remotePath}' failed.");
}, new ExecutionDataflowBlockOptions()
{
MaxDegreeOfParallelism = maximumConnections,
BoundedCapacity = maximumConnections,
});
async Task Recurse(IEnumerable<string> path)
{
if (downloader.Completion.IsCompleted) return; // The downloader has failed
var listing = await UsingFtpAsync(client =>
client.GetListingAsync(String.Join("/", path)));
foreach (var item in listing)
{
if (item.Type == FtpFileSystemObjectType.Directory)
{
if (item.Size != 0) await Recurse(path.Append(item.Name));
}
else if (item.Type == FtpFileSystemObjectType.File)
{
var accepted = await downloader.SendAsync(path.Append(item.Name));
if (!accepted) break; // The downloader has failed
}
}
}
// Move on to the thread pool, to avoid ConfigureAwait(false) everywhere
return Task.Run(async () =>
{
// Fill the FtpClient pool
for (int i = 0; i < maximumConnections; i++)
{
var client = new FtpClient(ftpHost);
if (username != null && password != null)
client.Credentials = new NetworkCredential(username, password);
ftpClientPool.Post(client);
}
try
{
// Enumerate the files to download
await Recurse(new[] { ftpRoot });
downloader.Complete();
}
catch (Exception ex) { ((IDataflowBlock)downloader).Fault(ex); }
try
{
// Await the downloader to complete
await downloader.Completion;
}
catch (OperationCanceledException)
when (downloader.Completion.IsCanceled) { throw; }
catch { downloader.Completion.Wait(); } // Propagate AggregateException
finally
{
// Clean up
if (ftpClientPool.TryReceiveAll(out var clients))
foreach (var client in clients) client.Dispose();
}
});
}
Usage example:
await FtpDownloadDeep("ftp://ftp.test.com", "", @"C:\FtpTest",
"username", "password", maximumConnections: 10);
Note: The above implementation enumerates the remote directory lazily, following the tempo of the downloading process. If you prefer to enumerate it eagerly, gathering all info available about the remote listings ASAP, just remove the BoundedCapacity = maximumConnections configuration from the ActionBlock that downloads the files. Be aware that doing so could result in high memory consumption, in case the remote directory has a deep hierarchy of subfolders, containing cumulatively a huge number of small files.
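For reference, a sketch of the eager variant's options, the only change to the code above:
new ExecutionDataflowBlockOptions()
{
    MaxDegreeOfParallelism = maximumConnections,
    // No BoundedCapacity: SendAsync never waits, so the whole remote tree
    // is enumerated and buffered in the block's input queue as fast as
    // the listings arrive.
}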
I'd split this into three parts.
Recursively build a list of source and destination pairs.
Create the directories required.
Concurrently download the files.
It's the last part that is slow and should be done in parallel.
Here's the code:
private async Task DownloadRecursively(string src, string dest, FtpClient ftp)
{
/* 1 */
IEnumerable<(string source, string destination)> Recurse(string s, string d)
{
foreach (var item in ftp.GetListing(s))
{
if (item.Type == FtpFileSystemObjectType.Directory)
{
if (item.Size != 0)
{
foreach(var pair in Recurse(Path.Combine(s, item.Name), Path.Combine(d, item.Name)))
{
yield return pair;
}
}
}
else if (item.Type == FtpFileSystemObjectType.File)
{
yield return (Path.Combine(s, item.Name), Path.Combine(d, item.Name));
}
}
}
var pairs = Recurse(src, dest).ToArray();
/* 2 */
foreach (var d in pairs.Select(x => x.destination).Distinct())
{
System.IO.Directory.CreateDirectory(d);
}
/* 3 */
var downloads =
pairs
.AsParallel()
.Select(x => ftp.DownloadFileAsync(x.destination, x.source)) // FluentFTP takes (localPath, remotePath)
.ToArray();
await Task.WhenAll(downloads);
}
It should be clean, neat code that is easy to reason about.
I have to create a zip file from a set of URLs, and it should have a proper folder structure.
So I tried:
public async Task<byte[]> CreateZip(Guid ownerId)
{
try
{
string startPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "zipFolder");//base folder
if (Directory.Exists(startPath))
{
DeleteAllFiles(startPath);
Directory.Delete(startPath);
}
Directory.CreateDirectory(startPath);
string zipPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, $"{ownerId.ToString()}"); //folder based on ownerid
if (Directory.Exists(zipPath))
{
DeleteAllFiles(zipPath);
Directory.Delete(zipPath);
}
Directory.CreateDirectory(zipPath);
var attachemnts = await ReadByOwnerId(ownerId);
attachemnts.Data.ForEach(i =>
{
var fileLocalPath = $"{startPath}\\{i.Category}";
if (!Directory.Exists(fileLocalPath))
{
Directory.CreateDirectory(fileLocalPath);
}
using (var client = new WebClient())
{
client.DownloadFile(i.Url, $"{fileLocalPath}//{i.Flags ?? ""}_{i.FileName}");
}
});
var zipFilename = $"{zipPath}//result.zip";
if (File.Exists(zipFilename))
{
File.Delete(zipFilename);
}
ZipFile.CreateFromDirectory(startPath, zipFilename, CompressionLevel.Fastest, true);
var result = System.IO.File.ReadAllBytes(zipFilename);
return result;
}
catch (Exception ex)
{
var a = ex;
return null;
}
}
Currently I'm writing all the files to my base directory (maybe not a good idea), and I have to manually delete all the folders and files to avoid exceptions/unwanted files. Can everything be written in memory?
What changes are required to write all the files and the folder structure in memory?
No you can't. Not with the built-in .NET APIs, anyway.
As per my comment, I would recommend storing the files in a custom location based on a Guid or similar, e.g.:
"/xxxx-xxxx-xxxx-xxxx/Folder-To-Zip/....".
This ensures you can handle multiple requests with the same files or similar file/folder names.
Then you just have to clean up and delete the folder afterwards so you don't run out of space (see the sketch below).
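A minimal sketch of that layout; the folder names are placeholders, and the download loop is elided:
using System;
using System.IO;
using System.IO.Compression;

static byte[] CreateZipInTempFolder()
{
    // Per-request working folder named by a Guid, so concurrent requests
    // with identical file names never collide.
    string root = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString());
    string startPath = Path.Combine(root, "Folder-To-Zip");
    Directory.CreateDirectory(startPath);
    try
    {
        // ... download the attachments into startPath here ...
        string zipPath = Path.Combine(root, "result.zip"); // outside startPath, so not zipped into itself
        ZipFile.CreateFromDirectory(startPath, zipPath);
        return File.ReadAllBytes(zipPath);
    }
    finally
    {
        Directory.Delete(root, recursive: true); // cleanup so disk space is reclaimed
    }
}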
Hope the below code does the job.
public async Task<byte[]> CreateZip(Guid ownerId)
{
try
{
string startPath = Path.Combine(Path.GetTempPath(), $"{Guid.NewGuid()}_zipFolder");//folder to add
Directory.CreateDirectory(startPath);
var attachemnts = await ReadByOwnerId(ownerId);
attachemnts.Data = filterDuplicateAttachments(attachemnts.Data);
//filtering youtube urls
attachemnts.Data = attachemnts.Data.Where(i => !i.Flags.Equals("YoutubeUrl", StringComparison.OrdinalIgnoreCase)).ToList();
attachemnts.Data.ForEach(i =>
{
var fileLocalPath = $"{startPath}\\{i.Category}";
if (!Directory.Exists(fileLocalPath))
{
Directory.CreateDirectory(fileLocalPath);
}
using (var client = new WebClient())
{
client.DownloadFile(i.Url, $"{fileLocalPath}//{i.Flags ?? ""}_{i.FileName}");
}
});
using (var ms = new MemoryStream())
{
using (var zipArchive = new ZipArchive(ms, ZipArchiveMode.Create, true))
{
System.IO.DirectoryInfo di = new DirectoryInfo(startPath);
var allFiles = di.GetFiles("*", SearchOption.AllDirectories); // an empty pattern matches nothing
foreach (var attachment in allFiles)
{
using (var file = File.OpenRead(attachment.FullName))
{
var type = attachemnts.Data.Where(i => $"{i.Flags ?? ""}_{i.FileName}".Equals(attachment.Name, StringComparison.OrdinalIgnoreCase)).FirstOrDefault();
var entry = zipArchive.CreateEntry($"{type.Category}/{attachment.Name}", CompressionLevel.Fastest);
using (var entryStream = entry.Open())
{
file.CopyTo(entryStream);
}
}
}
}
var result = ms.ToArray();
return result;
}
}
catch (Exception ex)
{
var a = ex;
return null;
}
}
I'm looking to parse the WebCacheV01.dat file using C# to find the last file location used for an upload in a web browser.
%LocalAppData%\Microsoft\Windows\WebCache\WebCacheV01.dat
I am using the Managed Esent NuGet packages:
Esent.Isam
Esent.Interop
When I try and run the below code it fails at:
Api.JetGetDatabaseFileInfo(filePath, out pageSize, JET_DbInfo.PageSize);
Or if I use
Api.JetSetSystemParameter(instance, JET_SESID.Nil, JET_param.CircularLog, 1, null);
at
Api.JetAttachDatabase(sesid, filePath, AttachDatabaseGrbit.ReadOnly);
I get the following error:
An unhandled exception of type
'Microsoft.Isam.Esent.Interop.EsentFileAccessDeniedException' occurred
in Esent.Interop.dll
Additional information: Cannot access file, the file is locked or in use
string localAppDataPath = Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData);
string filePathExtra = @"\Microsoft\Windows\WebCache\WebCacheV01.dat";
string filePath = string.Format("{0}{1}", localAppDataPath, filePathExtra);
JET_INSTANCE instance;
JET_SESID sesid;
JET_DBID dbid;
JET_TABLEID tableid;
String connect = "";
JET_SNP snp;
JET_SNT snt;
object data;
int numInstance = 0;
JET_INSTANCE_INFO [] instances;
int pageSize;
JET_COLUMNDEF columndef = new JET_COLUMNDEF();
JET_COLUMNID columnid;
Api.JetCreateInstance(out instance, "instance");
Api.JetGetDatabaseFileInfo(filePath, out pageSize, JET_DbInfo.PageSize);
Api.JetSetSystemParameter(JET_INSTANCE.Nil, JET_SESID.Nil, JET_param.DatabasePageSize, pageSize, null);
//Api.JetSetSystemParameter(instance, JET_SESID.Nil, JET_param.CircularLog, 1, null);
Api.JetInit(ref instance);
Api.JetBeginSession(instance, out sesid, null, null);
//Do stuff in db
Api.JetEndSession(sesid, EndSessionGrbit.None);
Api.JetTerm(instance);
Is it not possible to read this without making modifications?
Viewer: http://www.nirsoft.net/utils/ese_database_view.html
Python: https://jon.glass/attempts-to-parse-webcachev01-dat/
libesedb
impacket
Issue:
The file is probably in use.
Solution:
In order to free the locked file, stop the scheduled task \Microsoft\Windows\Wininet\CacheTask.
The Code
public override IEnumerable<string> GetBrowsingHistoryUrls(FileInfo fileInfo)
{
var fileName = fileInfo.FullName;
var results = new List<string>();
try
{
int pageSize;
Api.JetGetDatabaseFileInfo(fileName, out pageSize, JET_DbInfo.PageSize);
SystemParameters.DatabasePageSize = pageSize;
using (var instance = new Instance("Browsing History"))
{
var param = new InstanceParameters(instance);
param.Recovery = false;
instance.Init();
using (var session = new Session(instance))
{
Api.JetAttachDatabase(session, fileName, AttachDatabaseGrbit.ReadOnly);
JET_DBID dbid;
Api.JetOpenDatabase(session, fileName, null, out dbid, OpenDatabaseGrbit.ReadOnly);
using (var tableContainers = new Table(session, dbid, "Containers", OpenTableGrbit.ReadOnly))
{
IDictionary<string, JET_COLUMNID> containerColumns = Api.GetColumnDictionary(session, tableContainers);
if (Api.TryMoveFirst(session, tableContainers))
{
do
{
var retrieveColumnAsInt32 = Api.RetrieveColumnAsInt32(session, tableContainers, containerColumns["ContainerId"]);
if (retrieveColumnAsInt32 != null)
{
var containerId = (int)retrieveColumnAsInt32;
using (var table = new Table(session, dbid, "Container_" + containerId, OpenTableGrbit.ReadOnly))
{
var tableColumns = Api.GetColumnDictionary(session, table);
if (Api.TryMoveFirst(session, table))
{
do
{
var url = Api.RetrieveColumnAsString(
session,
table,
tableColumns["Url"],
Encoding.Unicode);
var downloadedFileName = Api.RetrieveColumnAsString(
session,
table,
columnIds2["Filename"]);
if(string.IsNullOrEmpty(downloadedFileName)) // check for download history only.
continue;
// Order by access Time to find the last uploaded file.
var accessedTime = Api.RetrieveColumnAsInt64(
session,
table,
columnIds2["AccessedTime"]);
var lastVisitTime = accessedTime.HasValue ? DateTime.FromFileTimeUtc(accessedTime.Value) : DateTime.MinValue;
results.Add(url);
}
while (Api.TryMoveNext(session, table.JetTableid));
}
}
}
} while (Api.TryMoveNext(session, tableContainers));
}
}
}
}
}
catch (Exception ex)
{
// log goes here....
}
return results;
}
Utils
Task Scheduler Wrapper
You can use the Microsoft.Win32.TaskScheduler.TaskService wrapper to stop it from C#; just add this NuGet package: https://taskscheduler.codeplex.com/
Usage
public static FileInfo CopyLockedFileRtl(DirectoryInfo directory, FileInfo fileInfo, string remoteEndPoint)
{
FileInfo copiedFileInfo = null;
using (var ts = new TaskService(string.Format(@"\\{0}", remoteEndPoint)))
{
var task = ts.GetTask(@"\Microsoft\Windows\Wininet\CacheTask");
task.Stop();
task.Enabled = false;
var byteArray = FileHelper.ReadOnlyAllBytes(fileInfo);
var filePath = Path.Combine(directory.FullName, "unlockedfile.dat");
File.WriteAllBytes(filePath, byteArray);
copiedFileInfo = new FileInfo(filePath);
task.Enabled = true;
task.Run();
task.Dispose();
}
return copiedFileInfo;
}
I was not able to get Adam's answer to work. What worked for me was making a copy with AlphaVSS (a .NET class library that has a managed API for the Volume Shadow Copy Service). The file was in "Dirty Shutdown" state, so I additionally wrote this to handle the exception it threw when I opened it:
catch (EsentErrorException ex)
{ // Usually after the database is copied, it's in Dirty Shutdown state
// This can be verified by running "esentutl.exe /Mh WebCacheV01.dat"
logger.Info(ex.Message);
switch (ex.Error)
{
case JET_err.SecondaryIndexCorrupted:
logger.Info("Secondary Index Corrupted detected, exiting...");
Api.JetTerm2(instance, TermGrbit.Complete);
return false;
case JET_err.DatabaseDirtyShutdown:
logger.Info("Dirty shutdown detected, attempting to recover...");
try
{
Api.JetTerm2(instance, TermGrbit.Complete);
Process.Start("esentutl.exe", "/p /o " + newPath);
Thread.Sleep(5000);
Api.JetInit(ref instance);
Api.JetBeginSession(instance, out sessionId, null, null);
Api.JetAttachDatabase(sessionId, newPath, AttachDatabaseGrbit.None);
}
catch (Exception e2)
{
logger.Info("Could not recover database " + newPath + ", will try opening it one last time. If that doesn't work, try using other esentutl commands", e2);
}
break;
}
}
I'm thinking about using the 'Recent Items' folder, since an entry is written there when you select a file to upload:
C:\Users\USER\AppData\Roaming\Microsoft\Windows\Recent
string recent = Environment.GetFolderPath(Environment.SpecialFolder.Recent);
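For instance, the shortcuts there can be listed newest-first to see the most recently picked files; a sketch only, since resolving a .lnk shortcut to its target path needs extra work (e.g. the Windows Script Host COM interop):
using System;
using System.IO;
using System.Linq;

string recent = Environment.GetFolderPath(Environment.SpecialFolder.Recent);
var newestFirst = new DirectoryInfo(recent)
    .EnumerateFiles("*.lnk")                     // each picked file leaves a shortcut
    .OrderByDescending(f => f.LastWriteTimeUtc)  // most recent selection first
    .Select(f => f.FullName);

foreach (var lnk in newestFirst.Take(10))
    Console.WriteLine(lnk);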
In my app, I am using OneDrive to keep data in sync. I am successfully writing the file to OneDrive, but I'm having no luck replacing the outdated local data with the newer OneDrive data.
My current method, which completes without throwing an exception, does not return the same text data that the file on OneDrive contains.
The goal of the method is to compare the DateModified of the OneDrive file to the local file and, if OneDrive is newer, write the contents of the OneDrive file to the local StorageFile and then return it to be deserialized.
private async Task<string> GetSavedDataFileAsync(string filename)
{
string filepath = _appFolder + @"\" + KOWGame + @"\" + filename;
StorageFile localread;
BasicProperties localprops = null;
string txt;
try
{
localread = await local.GetFileAsync(filepath);
localprops = await localread.GetBasicPropertiesAsync();
}
catch (FileNotFoundException)
{ localread = null; }
if (_userDrive != null)
{
if (_userDrive.IsAuthenticated)
{
try
{
Item item = await _userDrive.Drive.Special.AppRoot.ItemWithPath(filepath).Request().GetAsync();
if (item != null)
{
DateTimeOffset drivemodified = (DateTimeOffset)item.FileSystemInfo.LastModifiedDateTime;
if (localprops != null)
{
if (drivemodified > localprops.DateModified)
{
Stream stream = await localread.OpenStreamForWriteAsync();
using (stream)
{ await _userDrive.Drive.Special.AppRoot.ItemWithPath(filepath).Request().GetAsync(); }
}
}
}
}
catch (OneDriveException e)
{
if (e.IsMatch(OneDriveErrorCode.ActivityLimitReached.ToString()))
{ string stop; }
}
}
}
if (localread == null) return string.Empty;
txt = await FileIO.ReadTextAsync(localread);
return txt;
}
I tried to reverse engineer another answer I found on Stack Overflow regarding writing a StorageFile to OneDrive, in that I needed to open the stream of the local file, but it doesn't appear to be working properly.
To get the content of a OneDrive item, we need to use the following method:
var contentStream = await _userDrive.Drive.Special.AppRoot.ItemWithPath(filepath).Content.Request().GetAsync();
While using
await _userDrive.Drive.Special.AppRoot.ItemWithPath(filepath).Request().GetAsync();
you are getting the OneDrive Item, not its content.
So you can change your code like the following to write the content of a OneDrive item to a local file:
if (drivemodified > localprops.DateModified)
{
using (var stream = await localread.OpenStreamForWriteAsync())
{
using (var contentStream = await _userDrive.Drive.Special.AppRoot.ItemWithPath(filepath).Content.Request().GetAsync())
{
contentStream.CopyTo(stream);
}
}
}