I'm developing a DLL that is meant to be widely used (distributed via NuGet, for example). Brief description: my DLL simplifies message exchange with a particular service. It lets you send a request, then retrieve a response. The service is asynchronous and may produce a response an hour or a day after accepting a request, so after making a request my DLL polls the service every few minutes to check for a response.
The problem is that the app using the DLL can be restarted, so storing the request queue in memory isn't a good option (I don't want to lose information about pending requests). Neither is serializing it to a file, because I can't know for sure where my DLL will be used - it could be a PC app, MVC, etc.
My main option is to serialize to a file, but give the user a way to set the location for the serialized files via web/app.config, or make the user think about it. But maybe there is a better solution for storing the request queue?
I would put these types of configuration or data files in a subfolder of the %appdata% folder. You will have write access to files in this folder, and the documentation is extensive.
In C# you can easily get this folder using:
var appdata = Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData);
Or use ProgramData:
var programdata = Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData);
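For example, here is a minimal sketch of persisting the queue there (the folder and file names are made up; you would pick your own or make them configurable):
using System;
using System.IO;

// Minimal sketch: persist the serialized request queue under a subfolder of
// %appdata%. "MyDllName" and "requests.json" are made-up names; replace them
// with your own, or expose the subfolder as a configuration setting.
var serializedQueue = "{ }"; // whatever serialization format you choose
var appdata = Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData);
var folder = Path.Combine(appdata, "MyDllName");
Directory.CreateDirectory(folder); // no-op if the folder already exists
File.WriteAllText(Path.Combine(folder, "requests.json"), serializedQueue);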
I recently made an API on AWS API Gateway. To make accessing it easier, I created a dev kit library (using .NET 6) that wraps the Gateway methods. To make the API URIs easier to access and store, I put all of the URIs into a JSON file in the library.
When doing unit testing on the library itself, everything passed. The library accesses the JSON file with the code below:
using (StreamReader reader = new StreamReader(Path.Combine(Directory.GetCurrentDirectory(), "Resources\\config.json")))
{
    json = reader.ReadToEnd();
}
I published the library as a NuGet package and then installed it in a .NET web app. When the app tries to call a method from this dev kit, it throws the following error:
Message=Could not find a part of the path 'C:\Projects\<ParentFolder>\<Subfolder>\<ProjectFolder>\Resources\config.json'
The file path given is the path of the consuming project, not the path that leads to the library's config.json. So the question is: how do you structure the file path to access a file that ships inside a standalone library like a dev kit? Or is that just a bad idea, and should I store the URIs for the methods in a different manner?
FYI - I do know the issue has nothing to do with the AWS API Gateway or Lambda functions. Those are running correctly and performing consistently.
TIA!
The current directory is a property of the process, which will be the application referencing the library. If you want the path of the library itself then I would think that something like this would do the job:
var rootFolderPath = Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location);
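Building on that, here is a sketch of loading the file relative to the library rather than the host process. This assumes Resources\config.json is actually copied next to the consuming app's output, which for a NuGet package may require packaging the file as content with copy-to-output:
using System.IO;
using System.Reflection;

// Sketch: resolve config.json relative to the library's own folder rather
// than the host application's current directory.
// Note: Assembly.Location can be empty for single-file published apps.
var assemblyDir = Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location);
var configPath = Path.Combine(assemblyDir, "Resources", "config.json");
string json = File.ReadAllText(configPath);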
I'm trying to use Realm Cloud in an Azure Function but it keeps giving me an error:
make_dir() failed: Permission denied Path: /realm-object-server/
Is there a way to configure Azure Functions to have permissions to create files? I'm new to Azure Functions, this is my first one so I'm not really sure of all the particulars.
I have found one solution to this. Inside an Azure Function, you can only create files or folders inside the temp folder. So if you use the following sync configuration, it should work. It worked for me.
var configuration = new FullSyncConfiguration(new Uri("/~/<your-realm-name>", UriKind.Relative), _realmUser, Path.Combine(Path.GetTempPath(), "realm-object-server"));
So basically, you are passing the folder in which to store the realm locally. If you don't pass a folder, it will try to store the realm in a default location, where you will get the access error.
I am not familiar with Realm, but Functions has permission to interact with the file system by default. See these links for information on the App Service file system (this applies to Functions too):
https://learn.microsoft.com/en-us/azure/app-service/operating-system-functionality#file-access
https://github.com/projectkudu/kudu/wiki/Understanding-the-Azure-App-Service-file-system
If the function is deployed using run-from-package, then wwwroot is read-only. But since the path in the error message doesn't point to wwwroot, this is probably not the issue.
My best guess is that the failing code is trying to write to an inappropriate location. The information in the links above should help resolve that. Maybe check whether Realm has a config setting that lets you specify where it should create "realm-object-server".
You can use
SyncConfigurationBase.Initialize(UserPersistenceMode.NotEncrypted, basePath: Path.GetTempPath());
I have this specific scenario:
1. The user sends me a request which contains a URL to a file in my private repository.
2. The server side catches this request and downloads the file.
3. The server makes some calculations on the downloaded file.
4. The server sends the results back to the client.
I implemented this in the "naive" way, meaning I download the file (step 2) for each request. In most cases the user will send the same file, so I thought of a better approach: keep the downloaded file in a short-term "cache".
This means I download the item once and use it for every user request.
Now the question is, how to manage those files?
In "perfect world", I will use the downloaded file for up to 30 minutes. After this time, I won't use it any more. So, optional solutions are:
Making a file system mechanism to handling files for short terms. Negative: Complex solution.
Using temporary directory to do this job (e.g. Path.GetTempFileName()). Negative: What if the system will start to delete those files, in the middle of reading it?
So, it's seems that each solution has bad sides. What do you recommend?
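For what it's worth, here is a minimal sketch of the 30-minute cache idea using an in-memory store instead of temp files. All names here are hypothetical; holding the bytes in memory sidesteps the temp-file deletion concern, at the cost of memory for large files:
using System;
using System.Collections.Concurrent;
using System.Net.Http;
using System.Threading.Tasks;

// Hypothetical sketch: cache downloaded file contents in memory, keyed by
// URL, and re-download once an entry is older than 30 minutes.
public class ShortTermFileCache
{
    private static readonly TimeSpan Lifetime = TimeSpan.FromMinutes(30);
    private readonly HttpClient _client = new HttpClient();
    private readonly ConcurrentDictionary<string, (DateTime FetchedAt, byte[] Data)> _cache =
        new ConcurrentDictionary<string, (DateTime FetchedAt, byte[] Data)>();

    public async Task<byte[]> GetAsync(string url)
    {
        if (_cache.TryGetValue(url, out var entry) &&
            DateTime.UtcNow - entry.FetchedAt < Lifetime)
        {
            return entry.Data; // still fresh, reuse the cached copy
        }

        // Expired or missing: download again. Two concurrent first requests
        // may both download; the last writer wins, which is harmless here.
        var data = await _client.GetByteArrayAsync(url);
        _cache[url] = (DateTime.UtcNow, data);
        return data;
    }
}
Usage would be something like var bytes = await cache.GetAsync(url); before running the calculation.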
This is my first time developing this kind of system, so many of these concepts are very new to me. Any and all help would be appreciated. I'll try to sum up what I'm doing as efficiently as possible.
Background: I have a web application running AngularJS with Bootstrap. The app communicates with the server and DB through a web service programmed using C#. On the site, users can upload files and reference them later using direct links. There's no restriction to file type (yet), so just about anything is allowed.
My Goal: Having direct links creates a big security problem for me, since the documents/images are supposed to be private data. What I would prefer to do is validate a user's credentials when the link is clicked, then load the file in the browser using a more generic URL path.
--Example--
"mysite.com/attachments/1" ---> (Image)
--instead of--
"mysite.com/data/files/importantImg.jpg"
Where I'm At: Not very far. My first thought was to add a page that sends the server request and receives a file byte stream along with the MIME type, which I can reassemble and present to the user. However, I have no idea whether this is possible using a web service that sends JSON requests, nor do I have a clue how the reassembly process would work client-side.
Like I said, I'll take any and all advice. I'd love to learn more about this subject for future projects as well, but for now I just need to be pointed in the right direction.
Your first thought is correct: you need to use the Response object, and more specifically its AddHeader and Write functions. Of course, this will be a separate page that handles only file downloads, so it will coexist perfectly fine with your JSON web service.
I don't think you want to do this with a web service. Just use a regular IHttpHandler to perform the validation and return the data. So you would have the URL "attachments/1" rewritten to "attachments/download.ashx?id=1". When you've verified access, write the data to the response stream. You can use the Content-Disposition header to set the file name.
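A minimal sketch of such a handler, assuming classic ASP.NET; UserHasAccess and the path lookup are placeholders for your own validation and storage logic:
using System.IO;
using System.Web;

// Hypothetical download.ashx handler: validates the caller, then streams
// the private file with a Content-Disposition header.
public class DownloadHandler : IHttpHandler
{
    public bool IsReusable => false;

    public void ProcessRequest(HttpContext context)
    {
        var id = context.Request.QueryString["id"];
        if (!UserHasAccess(context, id))
        {
            context.Response.StatusCode = 403;
            return;
        }

        // Placeholder lookup: map the attachment id to its physical path,
        // e.g. via a database record.
        var path = context.Server.MapPath("~/data/files/" + id);

        context.Response.ContentType = "application/octet-stream";
        context.Response.AddHeader("Content-Disposition",
            "attachment; filename=\"" + Path.GetFileName(path) + "\"");
        context.Response.TransmitFile(path);
    }

    private bool UserHasAccess(HttpContext context, string id)
    {
        // Placeholder check: substitute your real credential validation.
        return context.User != null && context.User.Identity.IsAuthenticated;
    }
}
For images that should display in the browser rather than download, you would send "inline" instead of "attachment" and set the actual MIME type.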
I'm writing a worker role for sending emails. It has some email template HTML files with build action = Content and copy to output = Copy Always.
How can I access those files from the code of the WorkerRole?
I don't want to store those files in a blob because I need to deploy this service as soon as possible, and I have to be able to edit those email templates easily without wiring up extra code for that.
EDIT:
Guys, I'm talking about Azure. This doesn't work with the regular method of loading the folder of the currently running assembly, because that gives you the Azure host process, which is located in a different place.
I figured it out - here's how to do it:
Path.Combine(Environment.GetEnvironmentVariable("RoleRoot") + @"\", @"approot\FileTemplates\");
Having build action = Content does not embed the file into the .dll; the file is deployed into your assembly directory (usually \bin), in the same folder structure you have the template in. That was confusing to write, so here is an example:
Project Directory
  - Templates
    - EmailTemplateA
When compiled and deployed, EmailTemplateA would be in the following location: \bin\Templates\EmailTemplateA
Now that we know where it is, we need to use it. Below is a code snippet that loads a template, replaces some values, and then sends the email:
public void SendRegistrationConfirmation(string toAddress, string confirmUrl)
{
    const string subject = "Your Registration";

    // Load the template (AssemblyDirectory is a helper property returning
    // the directory of the executing assembly)
    var template = File.OpenText(AssemblyDirectory + "\\Templates\\NewProgramRegistration.Template").ReadToEnd();

    // Replace content in the template.
    // We have this #URL# string in the places we want to actually put the URL
    var emailContent = template.Replace("#URL#", confirmUrl);

    // Just a helper that actually sends the email, configures the server, etc.
    this.SendEmail(toAddress, subject, emailContent);
}
You could store your templates in Local Storage:
http://convective.wordpress.com/2009/05/09/local-storage-on-windows-azure/
You should reconsider your design.
- As mentioned above, you can use Local Storage (you would need to copy the files there first) and then use the System.IO .NET API to manipulate the files.
The problem with this (since you said you are making changes):
- What if you scale horizontally? Now you have individual worker roles, each with its own local copy of the "template". You mention not wanting to use blob storage, but that should be an option, since it can act as a central/persisted repository for your templates. And you can copy them locally as needed, of course.
Another option is to use a SQL Azure DB. Reading something as small as templates should be super fast, and multiple roles can share them.
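A hedged sketch of that idea, where the table and column names ("EmailTemplates", "Body", "Name") are assumptions for illustration, not an existing schema:
using System.Data.SqlClient;

// Hypothetical sketch: fetch a shared template from a SQL Azure table so
// every role instance reads the same copy.
// connectionString would come from your role configuration.
string template;
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("SELECT Body FROM EmailTemplates WHERE Name = @name", conn))
{
    cmd.Parameters.AddWithValue("@name", "RegistrationConfirmation");
    conn.Open();
    template = (string)cmd.ExecuteScalar();
}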