Getting code from a Windows Azure VM - C#

OK, so my hard disk just crashed. Big deal. All my web dev code that was on it went with it, and now I'm running ddrescue on Ubuntu trying to recover whatever data I can. The hard disk keeps disconnecting, and sometimes it quits responding for a long time, so it's really a pain in the ass.
Anyway, back to the main topic--I have my web dev code, which was packaged and uploaded to Azure; what I'm wondering is whether it's possible to obtain all my .cs files from the VM. I noticed the approot and siteroot folders, but all I saw were the views, the .asax file, and some other miscellaneous stuff; nothing with the .cs extension.
Is there any way I can get a copy of the code I packaged? Or (as a last resort) any way to get the .cspkg file and work from there?

The site you are seeing on the web role, and inside the cspkg file, is the output of the compile, so you can't get the original .cs files out of them. That said, you can use a tool like Reflector, JustDecompile, or a variety of other decompilers out there to reverse engineer your compiled bits into something that will be very close to the original C# code (note: I'm assuming this is your own code, or code that doesn't have a provision against reverse engineering). This will at least let you use the bits on the web role to get the majority of your code back, then review it to see how good a job the decompiler did.
Note that you can open the cspkg file: it's just a zip file. Rename it with a .zip extension and open it up, but you won't find the .cs files in there either. The only time you will find source files in the package is if you have multiple websites within a single web role: the default packager for Windows Azure doesn't compile the additional sites, it just packages up all the files in their root directories. That's not at all helpful for actual deployments, and it likely won't help you here.
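If you do want to poke around inside the package, extracting it programmatically is a one-liner. A minimal sketch, assuming .NET 4.5's System.IO.Compression.ZipFile (both paths are placeholders):

    using System.IO.Compression;

    class ExtractPackage
    {
        static void Main()
        {
            // A .cspkg is an ordinary zip archive, so ZipFile can extract it
            // directly. Both paths below are placeholders.
            ZipFile.ExtractToDirectory(
                @"C:\recovered\MyCloudService.cspkg",
                @"C:\recovered\cspkg-contents");
        }
    }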
You are likely well ahead of me on this, but I'd recommend using a personal source control system of some sort to avoid this issue in the future.

Related

Why are my ASP pages executing cached code?

I've got an application I'm taking over with a very strange issue.
The background: 60+ identical IIS applications running on Windows Server 2012 that I RDP into.
Each application is identical except for some image files and the web.config files. (yeah, I know)
The applications are not compiled but just run as .cs files. No project files or solution files either.
There is one compiled app, which runs as a scheduled task and uses some of the files in each of the application folders.
The code is C# and I'm editing it with notepad++.
The issue:
I've been trying to update some of the code in one of the test applications (specifically, to update a log file and send emails), but my changes don't seem to be taking effect. The current emails work, but my new one does not appear, nor do my log files show up.
I tried to test it in another test application just to see if it failed there too, and found that that website came up with an error in some code on a specific line of a specific file.
Thing is, this line of code is not on that line in the actual .cs file.
I then added another line higher up to see if I could trigger a divide-by-zero error.
Same result: the same line of code failed with the same line number. No change at all.
Seems like my code is being cached and I can't refresh it.
I tried making sure it's not being cached by the one scheduled task, and I cycled IIS entirely (at the root).
Still happening.
I know for a fact that it's not a matter of an .exe hiding somewhere, as two weeks ago I made a change to the code and it worked; my change showed up. I also know for a fact that I am editing the correct file: I opened the folder using Explore in IIS.
There are no obj folders. There is a bin folder in each application folder, but nothing in it except NuGet package DLLs.
OK. So after I spent a week trying to set each app's web.config to use a compilation tempDirectory based on the app name, I finally discovered that the previous coders had made two identical copies of the relevant file: one for use by the scheduled task and one for use by the website itself.
So all the time I was trying to detect changes in the result of my code, I was changing the copy the scheduled task used.
Of course, in the several times I discussed the problem with the previous developer (who was still working with us), he never bothered suggesting the solution, although he had been working with this code for several years.
Updating the correct copy solved the problem. I've also noticed my hair shedding faster in the last few weeks than ever before.
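For reference, the per-app tempDirectory mentioned above is an attribute on the compilation element in each web.config; a sketch with an illustrative path:

    <system.web>
      <!-- Give this app its own dynamic-compilation temp folder so its
           compiled pages can't collide with the other identical apps. -->
      <compilation tempDirectory="D:\AspNetTemp\TestApp1" />
    </system.web>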

Transactional file management in C#

I have a scenario in my application where I need to upload some files (zip files) from the client to the server; on the server I want to extract each zip file and move the extracted files into another folder, replacing what's there.
The files that need to be replaced are mostly DLL files, so one thing I need to ensure is that either all the files get replaced or none of them do.
Is there any way in C# to achieve this (like a transaction in SQL)? If anything bad occurs while replacing the files (for example, running out of space), every change made to the previous files should be rolled back.
Hope you understand the problem.
Any help?
NTFS supports file system transactions; see https://msdn.microsoft.com/en-us/magazine/cc163388.aspx
Having had a quick poke around, the only way I can see you doing this is through https://msdn.microsoft.com/en-us/magazine/cc163388.aspx, which involves some native code. Otherwise you could use a third-party tool such as http://transactionalfilemgr.codeplex.com/
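To give a feel for the native route, the TxF interop surface looks roughly like this. This is only a sketch of the Win32 declarations (and note that Microsoft has since deprecated TxF):

    using System;
    using System.Runtime.InteropServices;

    class NativeTransactions
    {
        // Kernel Transaction Manager: create, commit, or roll back a transaction.
        [DllImport("KtmW32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
        static extern IntPtr CreateTransaction(IntPtr securityAttributes, IntPtr uow,
            int createOptions, int isolationLevel, int isolationFlags, int timeout,
            string description);

        [DllImport("KtmW32.dll", SetLastError = true)]
        static extern bool CommitTransaction(IntPtr transaction);

        [DllImport("KtmW32.dll", SetLastError = true)]
        static extern bool RollbackTransaction(IntPtr transaction);

        // A transacted move only becomes visible when the transaction commits.
        [DllImport("Kernel32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
        static extern bool MoveFileTransacted(string existingFile, string newFile,
            IntPtr progressRoutine, IntPtr progressData, int flags, IntPtr transaction);

        [DllImport("Kernel32.dll", SetLastError = true)]
        static extern bool CloseHandle(IntPtr handle);
    }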
If you want to manage it yourself, or go for a simpler approach, I would suggest backing up the existing files somewhere before copying the new files in. That could be another folder or a zip archive. Then, if the copy fails, you handle the failure and revert all the files to their original state.
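A minimal sketch of that backup-then-replace idea (directory names are illustrative; on any failure it puts the originals back):

    using System.IO;

    class SafeReplace
    {
        // Replace every file in targetDir with its counterpart from sourceDir.
        // Originals are backed up first and restored if any copy fails.
        public static void ReplaceAll(string sourceDir, string targetDir)
        {
            string backupDir = targetDir.TrimEnd('\\') + "_backup";
            Directory.CreateDirectory(backupDir);
            try
            {
                foreach (string src in Directory.GetFiles(sourceDir))
                {
                    string dest = Path.Combine(targetDir, Path.GetFileName(src));
                    if (File.Exists(dest))
                        File.Copy(dest, Path.Combine(backupDir, Path.GetFileName(dest)), true);
                    File.Copy(src, dest, true);
                }
            }
            catch
            {
                // Roll back: restore everything we backed up. (Files that were
                // newly added, and so have no backup, are left behind in this sketch.)
                foreach (string bak in Directory.GetFiles(backupDir))
                    File.Copy(bak, Path.Combine(targetDir, Path.GetFileName(bak)), true);
                throw;
            }
            // Success: the backups are no longer needed.
            Directory.Delete(backupDir, true);
        }
    }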
Whatever you choose, make sure you have plenty of logging so you can see what's happening and if/when something goes wrong :)

Create a folder backed by software rather than OS?

I recently got put on a project where they're having issues with too many files in a folder slowing down access. I believe Windows starts to slow down access at around 10,000+ files in a single folder; we have something on the order of 50,000. All the files are small, and most of the time we only need to access the newest 0.1-0.2% of them via Windows file and print sharing. I'd look into dividing the files into subfolders, except that there is a bunch of legacy code that can only look at a single folder.
My idea - I don't know if it is possible or even plausible - is to create a small program that buffers the newest 0.1-0.2% of the files in memory and retrieves the rest from disk as needed.
I thought that years ago I'd read of a protocol that could simulate a folder on a hard drive. Is that possible?
Is there something out there that already does this? Is there a better option without major changes to the system?
What do other systems use for serving up a large number of files? Is there some other product that serves files that we could map as a network drive? Or some way to blend two folders so they look like one?
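For what it's worth, the buffering idea above might look something like this hypothetical sketch: preload the newest fraction of files into memory and fall back to disk for everything else.

    using System;
    using System.Collections.Concurrent;
    using System.IO;
    using System.Linq;

    class HotFileCache
    {
        readonly ConcurrentDictionary<string, byte[]> hot =
            new ConcurrentDictionary<string, byte[]>();

        // Preload the most recently written fraction of files (0.2% by default).
        public HotFileCache(string folder, double hotFraction = 0.002)
        {
            FileInfo[] files = new DirectoryInfo(folder).GetFiles()
                .OrderByDescending(f => f.LastWriteTimeUtc)
                .ToArray();
            int hotCount = (int)Math.Ceiling(files.Length * hotFraction);
            foreach (FileInfo f in files.Take(hotCount))
                hot[f.FullName] = File.ReadAllBytes(f.FullName);
        }

        // Serve from memory when we can, from disk when we can't.
        public byte[] Read(string path)
        {
            byte[] data;
            return hot.TryGetValue(path, out data) ? data : File.ReadAllBytes(path);
        }
    }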
Putting aside the "correct way to solve this problem" for the moment, what you're looking for is called a "shell namespace extension". There are several .NET resources for writing these Explorer extensions:
http://namespaceextension.codeplex.com/
http://www.codeproject.com/Articles/1649/The-Complete-Idiot-s-Guide-to-Writing-Namespace-Ex
http://www.codeproject.com/Articles/13515/A-Namespace-Extension-Toolkit
And perhaps many more.
Of course, we must remember why it isn't a good idea to write Explorer extensions in .NET.
Hope this helps.

Sync services like Dropbox: theory behind file indexing?

I have realised that by using the Amazon S3 service directly, I can save myself a lot of money. Instead of buying a client like GoodSync or Jungle Disk I thought it would be interesting to create my own Windows syncing application, which would sync my files to S3.
I have discovered that I can use FileSystemWatcher to monitor for changes to files and directories, but I am looking for the theory behind how other services like Dropbox index their files. Things like comparing the file size of a file with the size recorded in an index somewhere on the client PC, then using this information to determine whether to sync or not.
I am using C# and references to different libraries or code samples I could use would be helpful, but I am mainly looking for the best way to index files and for someone to point me in the right direction.
Thanks
I've gone down this path myself. In fact, now that Mozy has dropped their unlimited plan and Carbonite chooses NOT to back up certain files (like 3GP files and *.dat files) unless you routinely go in and manually add them, I am very disgruntled with online backups.
But your question was about syncing. Dropbox does it best, but it's expensive, and I'm not sure S3 would be any cheaper.
Anyway, you will face a lot of hurdles. In my experience, the problems I ran into were:
1) Propagating deletes
2) FileSystemWatcher simply missing events, such as when files are rapidly added to a folder and then deleted
3) etc.
Now, some ideas on how I would tackle this again:
1) Keep a small SQLite db locally for file names/paths
2) Copy files to a tmp directory before sending them to S3
3) On file changes/updates/deletions/etc., store that meta information in SQLite (see the sketch below)
Anyway, just some ideas.
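To make the indexing part concrete, here is a minimal sketch of the size/timestamp comparison (it assumes the Microsoft.Data.Sqlite NuGet package; the table and column names are made up). A file is a sync candidate when its current size or last-write time differs from what the index recorded at the last successful upload:

    using System.IO;
    using Microsoft.Data.Sqlite;

    class SyncIndex
    {
        readonly SqliteConnection db;

        public SyncIndex(string dbPath)
        {
            db = new SqliteConnection("Data Source=" + dbPath);
            db.Open();
            using (var cmd = db.CreateCommand())
            {
                cmd.CommandText = "CREATE TABLE IF NOT EXISTS files (" +
                    "path TEXT PRIMARY KEY, size INTEGER, lastWriteUtc TEXT)";
                cmd.ExecuteNonQuery();
            }
        }

        // True when the file is new or differs from what the index last recorded.
        public bool NeedsSync(string path)
        {
            var info = new FileInfo(path);
            using (var cmd = db.CreateCommand())
            {
                cmd.CommandText = "SELECT size, lastWriteUtc FROM files WHERE path = $p";
                cmd.Parameters.AddWithValue("$p", path);
                using (var reader = cmd.ExecuteReader())
                {
                    if (!reader.Read()) return true; // never seen before
                    return reader.GetInt64(0) != info.Length
                        || reader.GetString(1) != info.LastWriteTimeUtc.ToString("o");
                }
            }
        }

        // Record size and timestamp after a successful upload.
        public void MarkSynced(string path)
        {
            var info = new FileInfo(path);
            using (var cmd = db.CreateCommand())
            {
                cmd.CommandText =
                    "INSERT INTO files (path, size, lastWriteUtc) VALUES ($p, $s, $t) " +
                    "ON CONFLICT(path) DO UPDATE SET size = excluded.size, " +
                    "lastWriteUtc = excluded.lastWriteUtc";
                cmd.Parameters.AddWithValue("$p", path);
                cmd.Parameters.AddWithValue("$s", info.Length);
                cmd.Parameters.AddWithValue("$t", info.LastWriteTimeUtc.ToString("o"));
                cmd.ExecuteNonQuery();
            }
        }
    }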

C#: Cannot write to Application.ExecutablePath (on some boxes I can, on some I cannot); how do I delete my data file on uninstall?

So I have been writing to
Environment.SpecialFolder.ApplicationData
this data file that needs to be deleted upon uninstall. I am using Inno Setup to build my installer, and it works great for me. My data file hangs out in the above path because when I used to write it to
Application.ExecutablePath
certain boxes I tested it on would throw a nasty error at me for trying to write data there. I did some research, learned that it's not always writable, and that's how I came up with Environment.SpecialFolder.ApplicationData.
That is why my data file now resides in SpecialFolder.ApplicationData. The trouble is, if the user uninstalls and reinstalls, I need that file gone. It might be a shortcoming of my knowledge of Inno Setup, but I cannot figure out how to tell it where that file will be.
So then I thought I had a clever solution: Inno Setup can run a file when it's done uninstalling, so I had my program create a file, "uninstallData.bat", that says:
del "the file in my special folder application data path"
and I wrote it out to (drumroll)
Application.ExecutablePath
(Yes, it was a while in development, and I had forgotten that wasn't doable.)
So of course I am back to square one. I need to write a file to a path Inno Setup knows about ({app}), and I need it to be able to delete my data file in the SpecialFolder... I don't care how I do it, I just need that file gone.
Are there other Environment.* or Application.* approaches I have missed? Maybe somewhere that is viewable by an uninstaller AND can be written to?
As an aside, I am not sure why the box I develop on can write to the application folder with no issue but other boxes cannot... weird.
Any input would be great; I'm sorta lost as to how to crack this nut.
The environment location is in the user profile. If there are multiple users on the machine and they all run the application, then a copy of the file will be in each profile.
The path also depends on the OS.
Regardless, the current user's app data location is pointed to by %APPDATA% and %LOCALAPPDATA%. These Windows environment variables should be available within Inno Setup.
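In Inno Setup terms, that means the [UninstallDelete] section can point at the same location your C# code resolves with Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData). A sketch; the subfolder and file name are placeholders:

    [UninstallDelete]
    ; {userappdata} resolves to the current user's %APPDATA% folder, the same
    ; place Environment.SpecialFolder.ApplicationData points to.
    Type: files; Name: "{userappdata}\MyApp\uninstallData.dat"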
Application.ExecutablePath is not writable by standard definitions: the Program Files folder should never be manipulated by running applications. There are a number of special folders for that. Nice that you finally found... what has been properly documented by Microsoft for a LONG time now (minimum 10 years).
I suggest you get a proper installer; WiX comes to mind. Your problem is totally unrelated to C#: it seems to be entirely a "crappy installer" issue. Or provide a PROGRAM (not a bat file) to run at uninstall. What exactly is your problem there?
