XML files are downloaded in the wrong format - C#

We have a web application in C# where our users can download .xml files, but since we introduced IdentityServer3 in our latest update, downloads intermittently stop working. Some early-access users reported that (about 2 times out of 3) they download authorise.htm and/or a file without a format extension (which, if renamed, has all the content), and so on.
What part of our system is likely to be causing this? I'm a tester, so I'm no expert in debugging, but I am trying to understand what our suspects are.
Could it be some configuration issue? Firewall issues?
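
For anyone triaging this: the filename (and extension) the browser saves usually comes from the Content-Disposition header the server sends, and the authorise.htm symptom is typical of an unauthenticated request being redirected to the IdentityServer3 authorize endpoint, so the browser saves the login page instead of the XML. Below is a minimal sketch of a download action that sets these headers explicitly; the controller, service, and names are hypothetical, just to show where the extension comes from:

    // Hypothetical ASP.NET MVC download action; _xmlExportService is assumed.
    [Authorize] // if the auth cookie/token has expired, the request gets
                // redirected to IdentityServer and the browser may save the
                // login page (authorise.htm) as if it were the download
    public ActionResult DownloadReport(int id)
    {
        byte[] xmlBytes = _xmlExportService.GetReportXml(id); // assumed helper

        // Without an explicit download filename here, some browsers save the
        // response without an extension (the other symptom described above).
        return File(xmlBytes, "application/xml", "report-" + id + ".xml");
    }

If the download action already looks like this, the first suspects are the authentication setup (token lifetime vs. session length) rather than firewalls.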

Related

Windows Service doesn't complete what it's intended to do and it's suddenly stopped by the system?

I've been stuck with this annoying problem for the last couple of hours, and I've just exhausted my resources. This is what's going on:
I have a Windows Service created using the .NET Framework 4.0. The main functionality of this service is to read all .txt files contained in a folder, validate the formatting of those files, move them to another folder, create an XML file from them, and then create a PDF file using the information contained in the XML. I'm using the Report.NET library (http://sourceforge.net/projects/report/) to create these .pdf files.
When I run the service, it does everything it's supposed to do except generate the .pdf file, and then I get the following error:
"Service on Local Computer started and then stopped. Some service stop automatically if they have no work to do."
Also, I forgot to mention, if I debug the windows service using Visual Studio, everything works as intended and the .pdf files are generated correctly.
I've already tried attaching the "main" function to a thread and then starting that thread in the OnStart() method, but that doesn't seem to work, and neither does adding a timer.
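
Since it only fails outside the debugger, the usual cause is an unhandled exception on the service's thread; note also that services start with C:\Windows\system32 as the working directory, so relative paths that work in Visual Studio often break when run as a service. A minimal sketch of an OnStart that runs the work on a background thread and logs any failure, with ProcessFolder standing in for the real txt-to-xml-to-pdf pipeline (hypothetical name):

    using System;
    using System.IO;
    using System.ServiceProcess;
    using System.Threading;

    public partial class FileProcessingService : ServiceBase
    {
        private Thread _worker;

        protected override void OnStart(string[] args)
        {
            // Keep OnStart fast; do the real work on a background thread.
            _worker = new Thread(() =>
            {
                try
                {
                    ProcessFolder(); // hypothetical: the txt -> xml -> pdf work
                }
                catch (Exception ex)
                {
                    // A service has no console, so "started and then stopped"
                    // failures vanish unless they are logged somewhere.
                    File.AppendAllText(@"C:\Logs\service-error.log",
                        DateTime.Now + " " + ex + Environment.NewLine);
                    Stop();
                }
            });
            _worker.IsBackground = true;
            _worker.Start();
        }

        private void ProcessFolder() { /* existing processing code */ }
    }

If the log shows a FileNotFoundException or an access-denied error, check the service account's permissions and switch to absolute paths.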

Set *.exe file properties after upload to server

I'm trying to find a way to write "meta" information to EXE files that are uploaded to my IIS/ASP.NET web service. Here's a little bit of background:
- I need to write one arbitrary string into the properties
- It'll be a URL that I write as "metadata", if that matters
  Example: https://example.com/someFolder/someOtherFolder
- The files are mainly installers originally created by InnoSetup
- The web server is running IIS 7.5 with ASP.NET on top of Server 2008 R2 (Standard)
Why am I trying to write this information?
Ultimately the EXEs are made available to users for download. When the application runs, it needs to know the web URL in order to execute properly. Currently we have a plain text box where the user can input the URL, but that has proven to be error-prone (despite prompting/error checking/...)
Why can't I just write the metadata in the EXE when it's created?
I could do that, but the EXE could be uploaded to a variety of different servers, each with their own unique URL "metadata". I'm trying to avoid creating a separate build script for each server.
Why not just create a *.zip file with the *.exe and an extra piece of metadata?
I suppose I could do that too -- but then the user would have to actually unzip the download so that the real installer could read the metadata. I had something similar to this before and most people never unzipped the full download and that posed its own problems.
So is this even possible? I guess as a last resort I could use the uploaded EXE to create a new EXE, but I'm trying to avoid doing that (gets into problems with signed EXEs, etc.)
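
One approach worth testing: appending extra bytes ("overlay data") to the end of an EXE normally doesn't stop it from running, so the server could append the URL plus a marker after upload, and the installer could read its own file tail at startup. This does invalidate any Authenticode signature, so it only fits unsigned installers, and you should verify it against your specific Inno Setup builds. A rough sketch, with the marker and paths as assumptions:

    using System;
    using System.IO;
    using System.Text;

    static class ExeTagger
    {
        // Hypothetical marker the installer searches for in its own file tail.
        private const string Marker = "###URLMETA###";

        public static void AppendUrl(string exePath, string url)
        {
            // Appending overlay data leaves the PE image intact, but it
            // breaks any Authenticode signature on the file.
            byte[] payload = Encoding.UTF8.GetBytes(Marker + url);
            using (var stream = new FileStream(exePath, FileMode.Append, FileAccess.Write))
            {
                stream.Write(payload, 0, payload.Length);
            }
        }
    }

On the client side, the installer would read the last kilobyte or so of its own executable and take everything after the marker as the URL.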

Concurrency Issue on IO operation

I'm writing a multi-threaded console application which downloads PDF files from the web and copies them to our content server location (a Windows server). This is also the location from which the files are served to our website.
I am skeptical about this approach because of concurrency issues: if a user on the website requests a PDF file from the content server at the same time the file is being written or updated by the console application, there might be an IOException. (The application also updates the PDF files if the original contents change over time.)
Is there a way to control the concurrency issue?
You probably want your operations on creating and updating the files where they are served to be atomic, so that any other processes dealing with those files get the correct version, not the one that is still open for writing.
Instead of actually writing the files to where they will be served, you could write them to a temporary directory and then move them into the directory where they will be served from.
Similarly, when your application updates those PDFs, make sure the served files themselves are not changed until writing has finished. You could test this by making your application sleep after it has started writing to the file, for example.
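
A minimal sketch of that write-then-move approach, assuming the temp file lives on the same volume as the served directory (so the final move is a cheap rename) and the names are hypothetical:

    using System.IO;

    static void PublishPdf(byte[] pdfBytes, string servedPath)
    {
        // Write to a temp file on the SAME volume as the destination so the
        // final step is a rename, not a copy that readers could observe.
        string tempPath = servedPath + "." + Path.GetRandomFileName() + ".tmp";
        File.WriteAllBytes(tempPath, pdfBytes);

        if (File.Exists(servedPath))
        {
            // File.Replace swaps in the new contents in one step on NTFS.
            File.Replace(tempPath, servedPath, destinationBackupFileName: null);
        }
        else
        {
            File.Move(tempPath, servedPath);
        }
    }

Note that File.Replace can still fail if another process holds the destination open without delete sharing, so a retry loop around it is worth considering.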
The details depend on which web server software you are using, but the key to this problem is to give each version of the file a different name. The same URL, mind you, but a different name on the underlying file system.
Once a newer version of the file is ready, change the web server's configuration so that the URL points to the new file. In any reasonably functional web server this should be an atomic operation.
If the web server doesn't have built-in support for this, you could serve the files via a custom server-side script.
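
For example, in ASP.NET that script could be an IHttpHandler that maps the stable URL onto whatever the newest version of the file is; the version lookup below is an assumption, just to show the shape:

    using System.Web;

    // Hypothetical handler: /files/report.pdf -> report-v42.pdf on disk.
    public class PdfHandler : IHttpHandler
    {
        public bool IsReusable { get { return true; } }

        public void ProcessRequest(HttpContext context)
        {
            // VersionMap is an assumed lookup from URL to the current file name.
            string fileName = VersionMap.CurrentFileFor(context.Request.Path);
            context.Response.ContentType = "application/pdf";
            context.Response.TransmitFile(context.Server.MapPath("~/App_Data/" + fileName));
        }
    }

Swapping versions is then just updating the lookup, which is atomic from the reader's point of view.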
Mark the files hidden until the copy or update is complete.

IIS compression doesn't open Excel file properly. Shows message that the file cannot be accessed, may be corrupted, or is located on a server that is not responding

We have enabled IIS 7 compression on an ASP.NET application. The application downloads an Excel file, which opens properly when compression is not applied. When compression is applied, the Excel file downloads but doesn't open properly. It shows a message that the file cannot be accessed: the file may be corrupted, located on a server that is not responding, or read-only.
Please suggest what is going wrong, why the file opens with an error, and what resolution can be applied.
Thanks in advance.
Is the downloaded file perhaps still gzipped on the drive, but with a .xls extension? This shouldn't normally happen, but it would be my first guess. You can quickly check by comparing the downloaded file's size to the original.
Also, does this happen with all browsers, or just a specific one?
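
One quick way to test the still-gzipped theory: every gzip stream starts with the magic bytes 0x1F 0x8B, so inspect the first two bytes of the downloaded file. A small sketch (the path in the usage note is an assumption):

    using System.IO;

    static bool LooksGzipped(string path)
    {
        using (var stream = File.OpenRead(path))
        {
            // Every gzip stream begins with the magic bytes 0x1F 0x8B.
            return stream.ReadByte() == 0x1F && stream.ReadByte() == 0x8B;
        }
    }

    // e.g. LooksGzipped(@"C:\Downloads\report.xls") == true would mean the
    // response body was saved still compressed, pointing at a header mismatch.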

Webserver not using the latest files

One of the projects I am working on includes a website that is hosted on a cheap shared hosting server. Whenever I upload updated files to the server, they don't necessarily become available immediately. It can take 15 to 30 minutes before the server actually starts using the new files instead of the old ones, and in some cases I even need to upload the updated files a second time.
Some more info:
- C# WebForms files (.aspx and .aspx.cs)
- If there was no previous file with that name on the server, the new file always becomes available immediately
- But if I first delete the older file and refresh the page, I immediately get a "file not found" error; if I then upload the newer file, the "file not found" error stops immediately, but I get the older file back again
I understand that the server isn't actually serving the .aspx page but rather the compiled DLL version it has made (right?), so maybe this is a compilation problem on the server somehow?
I'm not sure if this would be better on serverfault.com, but as a programmer, SO is where I usually come.
Any idea why this is happening, and preferably a solution for how to fix this behavior so that when I upload an updated page we can start using it immediately?
Thank you.
Usually, touching your web.config file will recycle the application (the ASP.NET app domain), and that should flush any caches. Just upload a new web.config with a trivial change and see if that helps.
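
If you have any way to run code on the server (or a deploy step), "touching" just means bumping the file's timestamp, along these lines (the path is an assumption):

    using System;
    using System.IO;

    // Bumping web.config's timestamp forces an app-domain recycle on the
    // next request, which discards cached compilations. Adjust the path.
    File.SetLastWriteTimeUtc(@"C:\inetpub\wwwroot\mysite\web.config", DateTime.UtcNow);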
If you are using .NET 2.0 websites, you can have problems with the .dlls in the bin folder. Changing to a Web application should solve your problem permanently.
http://webproject.scottgu.com/
I have seen this behavior on one of my sites as well.
In my case the issues began just after the hosting provider had moved my site to their new SAN solution.
It turned out that this new storage solution did not support "file system watchers", so IIS never received any notification that a file had been updated.
The workaround they introduced was to move the applications into new application pools at regular intervals. (This produces exactly the symptoms you are describing, with updates only being applied periodically.)
The only solution I found was to move my sites to a different hosting provider.
