Suppose I have something like this:
string TheFile = HttpRuntime.AppDomainAppPath + "\\SomeDir\\" + TheFilename + ".js";
System.IO.File.WriteAllText(TheFile , SomeText);
This works on my local machine: the file is created and visible in the file system and in the solution explorer. If I deploy on Azure, is the file going to be written only on the instance that's running this code or will it be written and available on all other instances?
Unless you write to an Azure Drive (or some equivalent), the change will of course be limited to that instance's filesystem. Changes on the instance filesystem are lost if the VM crashes, and in some other cases as well, so whatever you need to preserve should be stored in durable storage such as Azure Blob Storage.
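If the text ends up in blob storage instead, every instance (and every future deployment) sees the same copy. A minimal sketch, assuming the Microsoft.WindowsAzure.Storage client; the connection string variable and the container name "somedir" are placeholders:
using System.IO;
using System.Text;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

var account = CloudStorageAccount.Parse(connectionString); // your storage connection string
var container = account.CreateCloudBlobClient().GetContainerReference("somedir"); // placeholder container
container.CreateIfNotExists();

// Write SomeText as a blob instead of a file on the instance disk.
var blob = container.GetBlockBlobReference(TheFilename + ".js");
using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(SomeText)))
{
    blob.UploadFromStream(stream);
}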
if (Upload.ContentLength > 0)
{
    var FileName = Upload.FileName;
    var path = Path.Combine(Server.MapPath("~/Content/Images"), FileName);
    Upload.SaveAs(path);
    cement.ImageLocation = "/Content/Images/" + FileName;
}
This code works perfectly when hosted on local IIS, but it fails on AppHarbor. The error I got from the logging is this:
**Message**
An unhandled exception has occurred.
**Exceptions**
[DirectoryNotFoundException: Could not find a part of the path 'D:\Users\apphb9840cac8716388\app\_PublishedWebsites\RMQGrainsBeta\Content\Images\Capture.PNG'.]
I read some articles about this and got some information that the folder might be protected or something, so I removed the ReadOnly attribute on the Images folder and tried to push with Git Bash. The problem is that Git Bash doesn't recognize the difference and won't push the change to the folder.
Finally got an answer from AppHarbor. It turns out the code is not the problem; it is the AppHarbor service that blocks us from uploading to or directly accessing the file system. Here is what the tech from AppHarbor says:
About storing images: AppHarbor does not allow write access to the
application directory by default. This is because the local filesystem is
ephemeral and may be wiped on each deployment and/or during system
maintenance. For this reason it needs to be manually enabled on the
settings page, and it should only be used for temporary storage purposes such as caching.
I'd recommend using a cloud file storage solution such as Amazon S3. You can
upload files to S3 from your application or the client can upload files
directly to Amazon S3 by using presigned URLs. This will allow you to build
a scalable, distributed file storage feature that is suitable for cloud-based
platforms like AppHarbor. Let me know if you need help
implementing/architecting a solution that suits your needs!
Best,
Rune
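Following their advice, a rough sketch of uploading straight to S3 from the controller instead of saving to the app directory (the bucket name, region, and credential setup are assumptions; the AWSSDK.S3 package provides the client):
using Amazon;
using Amazon.S3;
using Amazon.S3.Model;

if (Upload.ContentLength > 0)
{
    // Credentials are assumed to come from app settings or an IAM profile;
    // the bucket name and region are placeholders.
    using (var s3 = new AmazonS3Client(RegionEndpoint.USEast1))
    {
        var request = new PutObjectRequest
        {
            BucketName = "my-app-images",                 // hypothetical bucket
            Key = "Content/Images/" + Upload.FileName,
            InputStream = Upload.InputStream,             // stream straight from the uploaded file
            ContentType = Upload.ContentType
        };
        s3.PutObject(request);
    }

    // Store the object's URL rather than a local path.
    cement.ImageLocation = "https://my-app-images.s3.amazonaws.com/Content/Images/" + Upload.FileName;
}
Presigned URLs, as mentioned in the quote, would additionally let the browser upload directly to S3 and skip the web server entirely.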
Sorry for wasting your time, guys, but thank you for looking at the problem.
Situation: OneDrive for Business syncs files from a SharePoint site document library to a local directory:
C:\Users\users\Sharepoint\Library\Test.pptx
However, with PowerPoint Interop,
presentation.Path
is:
https://company.sharepoint.com/Library/Shared%20Documents/
which is the correct path for SharePoint.
How can I access the local directory?
Update: I found a similar question on MSDN but no answer
According to this post, the synced folders can be looked up in this multi-string registry value:
HKEY_CURRENT_USER\Software\Microsoft\Office\15.0\Common\Internet\LocalSyncClientDiskLocation
Given that your local path and SharePoint URL look like
C:\Users\User\SharePoint\Library - Documents\Folder\SubFolder\Document.pptx and
https://***.sharepoint.com/Library/Shared%20Documents/Folder/SubFolder/Document.pptx,
you could try extracting the relative part Folder/SubFolder/Document.pptx from the URL, appending it to the local folder paths retrieved from the registry value, and checking for file existence.
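A rough sketch of that lookup (the Office version segment in the registry path and the serverRelativePath variable, i.e. the portion you extracted from the URL, are assumptions):
using System;
using System.IO;
using System.Linq;
using Microsoft.Win32;

// The synced root folders live in a REG_MULTI_SZ value; the Office version
// segment ("15.0") is an assumption and may differ on your machine.
var roots = (string[])(Registry.GetValue(
    @"HKEY_CURRENT_USER\Software\Microsoft\Office\15.0\Common\Internet",
    "LocalSyncClientDiskLocation",
    null) ?? new string[0]);

// serverRelativePath is assumed to already hold the part extracted from the URL,
// e.g. "Folder/SubFolder/Document.pptx".
string relative = Uri.UnescapeDataString(serverRelativePath)
    .Replace('/', Path.DirectorySeparatorChar);

// Probe each synced root until the file turns up.
string localFile = roots
    .Select(root => Path.Combine(root, relative))
    .FirstOrDefault(File.Exists);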
If I understand correctly, you want a way in PowerPoint (VSTO) to get the local path of the synced directory? A method in the PowerPoint object model like presentation.GetLocalPath()?
I don't know why (in the MSDN link in your question) the MSFT CSG engineer said that it was impossible.
Sorry for the mistake, after the further investigation, it is
impossible. For the Word Application, the file is stored on the
OneDrive, and the "Offline" is cache mode for OneDrive, it is
transparent to Word Application (Word Application only know it is a
document on OneDrive), so when you check the location of the opened
document, the location is "http:/d.docs.live.net/xxxx/xx.docx" rather
than "C:\XXX\XXX".
This article shows you how you can Change the location where you sync SharePoint libraries on your computer. Obviously there isn't a method for this in the PowerPoint Interop library, but it's definitely possible.
1) My first thought was to see if the OneDrive for Business sync app wizard saves a registry key or anything like that (using Process Monitor), but it probably stores the local directory in the cloud.
2) My second thought is a bit outside the box: just put a text file at https://company.sharepoint.com/Library/Shared%20Documents/LocalDrivePath.txt, then create an extension method called presentation.GetLocalPath() and use the WebClient class to download the string.
Pseudo code:
// Note: extension methods must live in a static class; WebClient is in System.Net.
public static string GetLocalPath(this Microsoft.Office.Interop.PowerPoint.Presentation presentation)
{
    using (var client = new WebClient())
    {
        string localPath = client.DownloadString(presentation.Path + "LocalDrivePath.txt");
        // Some good old protective programming to quickly identify the problem if the file doesn't exist
        if (string.IsNullOrEmpty(localPath))
            throw new Exception("Issue: LocalDrivePath.txt not found in " + presentation.Path + Environment.NewLine +
                                "Please add the file so this Office document's synced (offline) local folder can be identified.");
        return localPath;
    }
}
Call it like so:
presentation.GetLocalPath();
I keep my project files in my Dropbox folder so I can play around with them at the office as well.
My project contains an EmbeddableDocumentStore with UseEmbeddedHttpServer set to true.
const int ravenPort = 8181;
NonAdminHttp.EnsureCanListenToWhenInNonAdminContext(ravenPort);
var ds = new EmbeddableDocumentStore {
    DataDirectory = "Data",
    UseEmbeddedHttpServer = true,
    Configuration = { Port = ravenPort }
};
Then one day, when I started the project on my office PC, I saw this message: Could not open transactional storage: D:\Dropbox\...\Data
Since I'm at an early stage of development, I deleted the Data folder in my Dropbox and the project started flawlessly. Now that I'm back at home, I've run into the same issue! Of course I don't want to end up deleting this folder every time.
Can't I store my development data on my Dropbox? Should I bypass something to get this to work?
Set the data directory to a physical disk volume on your local computer. You will not be able to use any sort of mapped drive, network share, UNC path, Dropbox or SkyDrive folder as a data directory. Just because you have a drive letter does not mean you have a physical disk.
The only types of non-physical storage that even make sense are a LUN attached from a SAN over iSCSI or Fibre Channel, or an attached VHD in a virtualized or cloud environment. These all present as physical disks to the OS.
This would be the case for just about ANY data access environment. Try it with SQL Server if you don't believe me. In RavenDB's case, it is using ESENT as its data store, which requires direct access to the filesystem.
Update
To clarify, even if you are storing on a physical disk, you can't rely on any type of synchronization technology like DropBox or SkyDrive. Why? Because they will be taking a shared read lock on the files to watch for changes. Technologies like ESENT (which RavenDB is based upon) require an exclusive lock to the file.
Other technologies, such as SQL Server and Windows virtual machine disks, also take exclusive locks on their data stores. Why? Because they are constantly reading and writing bits of data to the file in a random-access manner. Would you really want Dropbox to try a sync operation for every bit of data that changes? It would be very inefficient and problematic.
Applications that use shared locks don't have this problem. For example, when you work on an MS Word document, it is all being done in memory. When you save the file, DropBox can read the entire file and sync it to the cloud. It can optimize by sending only the bits that have changed, but it still needs to be able to read the file to do so.
So if DropBox has a shared read lock on the ESENT file, then when RavenDB tries to open it exclusively, it gets an error and raises the exception you are seeing.
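In practical terms, the fix is just to point DataDirectory at a folder on a local, non-synced disk. A sketch based on the code in the question (the path below is purely illustrative):
const int ravenPort = 8181;
NonAdminHttp.EnsureCanListenToWhenInNonAdminContext(ravenPort);
var ds = new EmbeddableDocumentStore {
    // Keep the ESENT files on a local physical disk, outside any Dropbox/SkyDrive folder.
    DataDirectory = @"C:\RavenData\MyProject",   // illustrative local path
    UseEmbeddedHttpServer = true,
    Configuration = { Port = ravenPort }
};
ds.Initialize();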
I have an Azure web role and a separate computer. Both of them are on the same network and both share certain folders which the other can access. If I go on my Azure web role, through remote desktop, I can go to the other computer's shared folder using \\comp1\folder and add/remove/edit files there.
I have a few image files on my web role which I need to copy to the separate computer.
These images are uploaded to the web role and stored there.
How can I copy those images that are on the web role, to my other computer?
I have tried using File.Copy but it always gives me Access Denied errors.
I tried doing:
File.Copy(Server.MapPath("~/image/a.jpg"), @"\\comp1\folder\b.jpg");
Result: UnauthorizedAccessException
I don't think you can access the file system on Azure like that, except through Local Storage?
To quote Bill Wilder
Any of your code running in either (a) ASP.NET (e.g., default.aspx or
default.aspx.cs) or (b) WebRole.cs/WorkerRole.cs (e.g., methods
OnStartup, OnRun, and OnStop which are derived from RoleEntryPoint
class) will not have permission to write to the file system.
You can read and write to the Local Storage system:
try
{
    LocalResource myConfigStorage = RoleEnvironment.GetLocalResource("myConfigs");
    string s = System.IO.File.ReadAllText(System.IO.Path.Combine(myConfigStorage.RootPath, "myFile.txt"));
    // ... do your work with s
}
catch (Exception myException)
{
    // ... handle or log the failure to read from local storage
}
But having always used Azure with more than one instance, I have never seen the need for local storage and have used the blob store instead.
Read more: http://www.intertech.com/Blog/Post/Windows-Azure-Local-File-Storage-How-To-Guide-and-Warnings.aspx#ixzz26ce8rXpk
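If you go the blob route, a rough sketch for the image example above (the container name and the connection-string setting name are assumptions, using the Microsoft.WindowsAzure.Storage client):
using Microsoft.WindowsAzure.ServiceRuntime;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

var account = CloudStorageAccount.Parse(
    RoleEnvironment.GetConfigurationSettingValue("StorageConnectionString")); // assumed setting name
var container = account.CreateCloudBlobClient().GetContainerReference("images"); // assumed container
container.CreateIfNotExists();

var blob = container.GetBlockBlobReference("a.jpg");
using (var fileStream = System.IO.File.OpenRead(Server.MapPath("~/image/a.jpg")))
{
    blob.UploadFromStream(fileStream);
}
// The other computer can now download the image from blob.Uri
// (subject to container access permissions or a shared access signature).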
I'm attempting to migrate a fairly complex application to Windows Azure. In both the worker role and the web role there are many places where the application saves files to the local file system.
Here's an example:
string thumbnailFileName = System.IO.Path.GetDirectoryName(fileName) + "\\" + "bthumb_" + System.IO.Path.GetFileName(fileName);
thumbnail.Save(thumbnailFileName);
and another example:
using (System.IO.StreamWriter file = System.IO.File.AppendText(GetCurrentLogFilePath()))
{
    string logEntry = String.Format("\r\n{0} - {1}: {2}", DateTime.Now.ToString("yyyy.MM.dd#HH.mm.ss"), type.ToString(), message);
    file.Write(logEntry);
    file.Close();
}
In these examples we are saving images and log files to file locations specified in the app.config. Here's an example:
<add key="ImageFileDirectory" value="C:\temp\foo\root\auth\inventorypictures"/>
I'd like to make as few code changes as possible to support Azure Blob storage, both in case we ever decide to move back to a more traditional hosting environment and, more generally, to reduce the potential for creating unintended problems.
Based on this post I've decided that Azure Drive is not the best way to go.
Can someone guide me in the right direction (ideally with an example)? The best solution in my mind would be one that only requires a change to my config file. But I'm guessing that is not realistic.
Indeed, you want to use Azure Blob storage to save your files.
As for your coding question, consider creating an interface, call it IFileStore:
public interface IFileStore
{
    void Save(string filePath, byte[] contents);
    byte[] Read(string filePath);
}
Then you create two provider classes: one for the file system and one for Azure Blob storage.
The file system provider can be implemented like this:
public class FileSystemFileStore : IFileStore
{
    public void Save(string filePath, byte[] content)
    {
        // Make sure the target directory exists before writing.
        Directory.CreateDirectory(Path.GetDirectoryName(filePath));
        File.WriteAllBytes(filePath, content);
    }

    public byte[] Read(string filePath)
    {
        return File.ReadAllBytes(filePath);
    }
}
As for the Azure Blob provider, you will have to derive the storage path based on the filePath passed in to you.
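A rough sketch of such a provider, assuming the Microsoft.WindowsAzure.Storage client (the connection string, container name, and the path-to-blob-name mapping in ToBlobName are assumptions):
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public class BlobFileStore : IFileStore
{
    private readonly CloudBlobContainer _container;

    public BlobFileStore(string connectionString, string containerName)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        _container = account.CreateCloudBlobClient().GetContainerReference(containerName);
        _container.CreateIfNotExists();
    }

    public void Save(string filePath, byte[] contents)
    {
        using (var stream = new MemoryStream(contents))
        {
            _container.GetBlockBlobReference(ToBlobName(filePath)).UploadFromStream(stream);
        }
    }

    public byte[] Read(string filePath)
    {
        using (var stream = new MemoryStream())
        {
            _container.GetBlockBlobReference(ToBlobName(filePath)).DownloadToStream(stream);
            return stream.ToArray();
        }
    }

    private static string ToBlobName(string filePath)
    {
        // Hypothetical mapping: treat the remainder of the configured path
        // (with '/' separators) as the blob name.
        return filePath.TrimStart('\\', '/').Replace('\\', '/');
    }
}
Which provider gets instantiated can then be driven by an app setting, so switching between local disk and blob storage stays a configuration change.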
Generally
For your storage, I'd recommend using Blob and Table storage - this allows multiple instances to access the storage simultaneously. If you want to make the code portable, then I'd recommend abstracting your code behind interfaces/APIs (see @Philpp's answer).
e.g.
for your log file example, table storage might be the best thing to use (see the sketch after this list)
for your image files, blob storage might be the best thing to use
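A minimal sketch of logging to table storage, reusing the type and message values from the logging example above (the table name, entity shape, and connection string are assumptions):
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

// Hypothetical log entity; the PartitionKey/RowKey scheme here is just one option.
public class LogEntry : TableEntity
{
    public string Type { get; set; }
    public string Message { get; set; }
}

// Writing an entry:
var account = CloudStorageAccount.Parse(connectionString); // assumed connection string
var table = account.CreateCloudTableClient().GetTableReference("logs");
table.CreateIfNotExists();
table.Execute(TableOperation.Insert(new LogEntry
{
    PartitionKey = DateTime.UtcNow.ToString("yyyyMMdd"),
    RowKey = DateTime.UtcNow.ToString("HHmmss.fffffff") + "_" + Guid.NewGuid().ToString("N"),
    Type = type.ToString(),
    Message = message
}));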
If you really want to use Azure Drive
I'd only recommend using Azure Drive if you are only ever going to deploy a single instance of your role - otherwise you will end up fighting problems with sharing files across multiple instances (and remember that only one instance can mount the drive with write access at any one time).
If you are operating with a single instance, and you are only storing temp and log files, then you could also look at using local storage instead of Azure Drive - it's much simpler and cheaper to use than blob storage. One possible specialist alternative for your log file example is to write to local storage and let Azure Diagnostics upload that local storage to blob storage on a schedule, as sketched below.
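A rough sketch of that Diagnostics-driven transfer using the classic Azure SDK's DiagnosticMonitor API, typically placed in the role's OnStart (the local resource name "LogStorage" and the container name are assumptions):
using System;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

// Assumes a <LocalStorage name="LogStorage" ... /> resource is declared in the service definition,
// and that GetCurrentLogFilePath() writes under its RootPath.
var logStorage = RoleEnvironment.GetLocalResource("LogStorage");

var config = DiagnosticMonitor.GetDefaultInitialConfiguration();
config.Directories.DataSources.Add(new DirectoryConfiguration
{
    Path = logStorage.RootPath,
    Container = "wad-custom-logs",                    // assumed blob container for the transferred files
    DirectoryQuotaInMB = logStorage.MaximumSizeInMegabytes
});
config.Directories.ScheduledTransferPeriod = TimeSpan.FromMinutes(5);

DiagnosticMonitor.Start("Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", config);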