I've got an application I'm taking over that has a very strange issue.
The background: 60+ identical IIS applications running on Windows Server 2012, which I RDP into.
Each application is identical except for some image files and the web.config files. (yeah, I know)
The applications are not compiled but just run as .cs files. There are no .csproj or .sln files either.
There is one compiled app, which runs as a scheduled task and uses some of the files in each of the application folders.
The code is C# and I'm editing it with Notepad++.
The issue:
I've been trying to update some of the code in one of the test applications (specifically, to update a log file and send emails), but my changes don't seem to be taking effect. The existing emails work, but my new one never appears, and my log files don't show up either.
I tried to test it in another test application just to see if it failed there too, and found that that website came up with an error in some code on a specific line of a specific file.
The thing is, that line of code is not at that line number in the actual .cs file.
I then added another line higher up to see if I could get a divide by zero error.
Same result. Same line of code failed with the same line number. No change at all.
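For reference, the extra line was something along these lines (a rough sketch; the variable names are made up, not from the real file):
int zero = 0;
int boom = 1 / zero;  // should throw a DivideByZeroException at runtime if this copy of the file is really the one being compiled and served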
It seems like my code is being cached and I can't refresh it.
I tried making sure it's not being cached by the only scheduled task, and I cycled IIS entirely (at the root).
Still happening.
I know for a fact that it's not a matter of an .exe hiding somewhere, because two weeks ago I made a change to the code and it worked; my change showed up. I also know for a fact that I am editing the correct file: I opened the folder using Explore in IIS.
There are no obj folders. There is a bin folder in each application folder, but there is nothing in it except NuGet package DLLs.
OK. So after I spent a week trying to set each app's web.config to point tempDirectory at a new temp directory based on the app name, I finally discovered that the previous coders had made two identical copies of the relevant file: one used by the scheduled task and one used by the website itself.
So the whole time I was trying to detect changes in the result of my code, I was editing the copy the scheduled task used.
Of course, in the several times I discussed the problem with the previous developer (who was still working with us), he never bothered suggesting the solution, although he had been working with this code for several years.
Updating the correct copy solved the problem. I've also discovered that my hair has been shedding faster over the last few weeks than ever before.
Related
I created a project in Unity and worked on it a little bit. Then I tried to delete one of the scripts I had created earlier, but every time I try to delete it a message shows up:
Fatal Error!
attempt to write a readonly database
UnityEditor.DockArea:OnGUI()
I tried to delete the file manually from the folder, but that caused a similar error every time I launched Unity.
Googling did not help, because all the answers are for Windows, where permissions are easy to set up. For OS X only one answer was found, which said to make the folder shared; that didn't help either.
Does anyone know the solution to this?
I get this crash whenever I try to delete a file from my project or create a build in Unity. I am using a Mac running macOS Sierra (though I expect that won't matter), but because of limited internal hard drive space I had the project saved on an external SSD plugged into my Mac. This seems to be the main issue. If you're running into this problem and have a similar setup, try copying the project folder over to your onboard hard drive and working on it from there.
I have not yet found another solution to this problem. For some reason Unity can save to external drives fine, but it doesn't like to delete or build from them.
Try deleting everything in the folder
/Users/Shared/Unity/4.0_Angrybots/Temp
I use Visual Studio Team Services to store the source code of my projects as I work on them. I love the service, especially that it is free, but lately I have been running into the biggest pain.
Randomly, when I go to save, modify, or check in/check out, I get this error for every single file I am modifying. So if I am trying to save changes to 8 files, I get this message 8 times, and it takes 45-60 seconds of trying to check out each file, meaning it takes 6-8 minutes for the errors to stop (even if I hit Cancel).
The local data store is currently in use by another operation
I looked it up online and found many people with the same issue but the response from MS has nothing to do with my situation.
http://blogs.msdn.com/b/phkelley/archive/2013/05/31/tf400030-the-local-data-store-is-currently-in-use-by-another-operation.aspx
It basically says this can happen when you have too many files in your workspace or have several large solutions open at once.
This does not apply to me, as I usually only have one solution open at a time and my projects are very small (400-500 files).
Ran into this issue as well on VS 2013 and TFS - every time I opened Team Explorer it would take 10+ seconds to show all projects, and then when I expanded the project in source control, another 10+ seconds would roll by.
Earlier today I began to experience the "local storage is being used" error when trying to save changes to class files. I did some research, and the following link saved the day for sure. Now TFS is blazing!
Local Data Store Solved
What you do is edit the workspace (including all associated projects) and change the "Location" dropdown from "Local" to "Server". It took about 4-5 minutes for the change to finish, but it was well worth it.
Hopefully this will help someone down the road.
Lately I started to get the same error message, and Visual Studio started to work very slowly with TFS and NuGet. I tried repairing and uninstalling, but that did not solve the problem. In the end it was so painfully slow that I could not continue working (expanding one item in Source Control Explorer took 10 seconds).
Here is my story and how my problem was solved:
I had mapped TFS folders separately, rather than getting the whole TFS tree, because there are lots of irrelevant documents. After trying lots of suggested fixes, I thought this might be the problem, because it was the first time I had done this kind of separate mapping in all the time I have been using TFS. I generally map and get all items at once and had never met this issue before.
I removed all the mappings and it was like magic: the error is gone, the slow TFS source control is gone, and it is rocket fast now. Just to be on the safe side, I also deleted my workspaces, created a new one, and got all TFS items at once.
I found the error would be triggered when I had more than one instance of VS 2012+ running and using the Source Control Explorer, Solution Explorer and/or Team Explorer windows. I've not had this problem when running a single instance of VS 2012+ (on Update 2+) using the Source Control Explorer, Solution Explorer and/or Team Explorer windows in tandem.
I found this article and gave its suggestion a shot: prevent multiple threads from accessing the data store simultaneously.
http://blogs.msdn.com/b/phkelley/archive/2013/05/31/tf400030-the-local-data-store-is-currently-in-use-by-another-operation.aspx
This proved to be a remedy for this issue.
For other users with large file repositories under source control who share this issue, I would add that it may be greatly beneficial to create a separate workspace for each of your branches/repositories. I found that doing this sped up my queries to TFS immensely and also helped with this error. I found the suggestion here: http://blogs.msdn.com/b/phkelley/archive/2013/05/30/using-multiple-workspaces-with-visual-studio.aspx. I mention this because other users report TFS running slowly.
I also started getting the same error this week. Maybe there's something wrong with VS Update 3?
I simply could not work on any of the projects in the "broken" local workspace anymore.
VS would show all files as being checked out, but none were really.
Other local workspaces were working fine.
I tried removing a project from the workspace, but when trying to confirm it, I would receive the same TF400030 error again.
Suggestion
If nothing else works, you might want to try this: simply delete the whole workspace and create it again, this time separating projects into different workspaces. This worked for me.
You'll probably want to back up your files first.
I did as described below and TFS started working fine:
Close all the VS instances
Go to: C:\Users\[UserName]\AppData\Local\Microsoft\VisualStudio\15.0_46af8b8e
Delete the privateregistry.bin file
Reopen the project solution
Above worked for me.
Had the same problem; it can be fixed in 3 quick steps:
Remove the current workspace: Source Control Explorer -> workspace list box -> Workspaces... and remove the workspace.
- Make sure that all pending changes are checked in first.
Delete the workspace's local folder.
- It's better to delete the folder entirely. If you end up keeping some folders, make sure to delete all $tf folders (the hidden folders inside the workspace folder).
Remap the projects you need (the fewer the better).
Hope that helps.
In my case the cause was a compressed folder containing my local data store, shown in blue in Windows Explorer. Removing the compression did the trick.
I ran into this error when renaming my workspace. After changing it back to the original name, everything worked fine again.
Restarting Visual Studio resolved the issue for me.
OK, so my hard disk just crashed. Big deal. All my web dev code that was on it went along with it, and now I'm running ddrescue on Ubuntu trying to recover whatever data I can. The hard disk keeps disconnecting, and sometimes it quits responding for a long time, so it's really a pain in the ass.
Anyway, back to the main topic: my web dev code was packaged and uploaded to Azure, and what I'm wondering is whether it's possible to obtain all my .cs files from the VM. I noticed the approot and siteroot folders, but all I saw were the views, the .asax file, and some other misc. stuff; nothing with the .cs extension.
Is there any way I can get a copy of the code I packaged? Or (as a last resort) any way to get the .cspkg file and work from there?
The site you are seeing on the web role and inside the .cspkg file is the output of the compile, so you can't get the original .cs files out of them. That said, you can use a tool like Reflector, JustDecompile, or a variety of other decompilers out there to reverse engineer your compiled bits into something that will be very close to the original C# code (note: I'm assuming this is your own code, or code that doesn't have a provision against reverse engineering). This will at least let you use the bits on the web role to get the majority of your code back, then review it to see how good a job it did.
Note that you can open the .cspkg file; it's just a zip file. You can rename it with a .zip extension and open it up, but you won't find the .cs files in there either. The only time you will is if you have multiple websites within a single web role: the default packager for Windows Azure doesn't compile the additional sites, it only packages up all the files in their root directories. That is not at all helpful for actual deployments, and it likely won't help you here.
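If you do want to crack the package open (for example, to pull out the compiled DLLs and feed them to a decompiler), something along these lines should work, since it is just a zip archive (a minimal sketch; the package and output folder names are placeholders, not from your setup):

using System.IO.Compression;   // requires a reference to System.IO.Compression.FileSystem (.NET 4.5+)

class ExtractCspkg
{
    static void Main()
    {
        // A .cspkg is an ordinary zip archive, so it can be extracted directly,
        // without even renaming it to .zip first.
        ZipFile.ExtractToDirectory("MyService.cspkg", "MyService_extracted");
    }
}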
You are likely well ahead of me on this, but I'd recommend using a personal source control system of some sort to avoid this issue in the future.
So I have been writing to
Environment.SpecialFolder.ApplicationData
this data file, which needs to be deleted upon uninstall. I am using Inno Setup to build my installer, and it works great for me. My data file hangs out in the above path, and I do that because when I used to try to write it to
Application.ExecutablePath
certain boxes I tested on would throw a nasty error at me when trying to write data there. I did some research; apparently that location is not always writable, and that is how I came up with Environment.SpecialFolder.ApplicationData.
That is why my data file now resides in SpecialFolder.ApplicationData. Trouble is, if the user uninstalls and reinstalls, I need that file gone. It might be a shortcoming of my knowledge of Inno, but I cannot figure out how to know where that file will be so I can tell Inno about it.
So then I thought I had a clever solution: Inno can run a file when it's done uninstalling, so I had my program create a file "uninstallData.bat" that says:
del "the file in my special folder application data path"
and I wrote it out to (drumroll)
Application.ExecutablePath
(Yes, it was a while into development and I had forgotten that wasn't doable.)
So of course I am back to square one: I need to write a file to a path Inno knows about ({app}), and I need it to be able to delete my data file in the SpecialFolder... I don't care how I do it, I just need that file gone.
Are there other Environment. or Application. approaches I have missed? Maybe somewhere that is viewable by an uninstaller AND can be written to?
As an aside, I am not sure why the box I develop on can write to the application folder with no issue, but other boxes cannot... weird.
Any input would be great; I'm sorta lost as to how to crack this nut.
The environment location is in the user profile. If there are multiple users on the machine and they all run the application, then a copy of the file will be in each profile.
The path also depends on the OS.
Regardless, the current user's app data location is pointed to by %APPDATA% and %LOCALAPPDATA%. These Windows environment variables should be available within Inno Setup.
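If you want to convince yourself that these variables point at the same folders your C# code is using, a quick check along these lines (just a sketch) should print matching paths on a typical setup:

using System;

class AppDataPaths
{
    static void Main()
    {
        // Roaming app data: SpecialFolder.ApplicationData vs. %APPDATA%
        Console.WriteLine(Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData));
        Console.WriteLine(Environment.GetEnvironmentVariable("APPDATA"));

        // Local app data: SpecialFolder.LocalApplicationData vs. %LOCALAPPDATA%
        Console.WriteLine(Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData));
        Console.WriteLine(Environment.GetEnvironmentVariable("LOCALAPPDATA"));
    }
}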
Application.ExecutablePath is not writable per standard definitions: the Program Files folder should never be manipulated by running applications. There are a number of special folders for that. Nice that you finally found... what has been properly documented by Microsoft for a LONG time now (at least 10 years).
I suggest you get a proper installer; WiX comes to mind. Your problem is totally unrelated to C#; it seems to be entirely a "crappy installer" issue. Or provide a PROGRAM (not a bat file) to run at uninstall. What exactly is your problem there?
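If you go the program route, the uninstall helper can be tiny. Something along these lines would do it (a rough sketch; "MyApp" and "data.dat" are placeholder names, not yours), and the installer can run it with a /cleanup argument as part of its uninstall step:

using System;
using System.IO;

class UninstallCleanup
{
    static void Main(string[] args)
    {
        // Only wipe the data file when explicitly asked to, e.g. by the uninstaller.
        if (args.Length == 1 && args[0] == "/cleanup")
        {
            string appData = Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData);
            string dataFile = Path.Combine(appData, "MyApp", "data.dat");  // placeholder path
            if (File.Exists(dataFile))
                File.Delete(dataFile);
        }
    }
}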
This is really weird behavior. Let's say I have an ASP.NET MVC project laid out as follows on my desktop (Vista):
/mvcapplication/app1
Then, over the course of development, I copy this solution to a Briefcase on a thumb drive so I can work on it from a laptop (XP).
When I insert the thumb drive back into the desktop, I notice it takes longer and longer to sync; eventually it took so long that it just hung there. I checked the project structure and found that it is now:
/mvcapplication/app1/app1/app1/app1
with each /app1 containing the entire project structure. I am not new to Visual Studio, and I am sure I opened and saved the solution and files just as I normally do, but this is just bizarre. I thought it was caused by Briefcase, but the same thing happened when I copied the solution into a plain folder on the thumb drive.
I would have left this alone, but with that sort of crazy folder structure it's really difficult to determine which folder has my current changes.
Anyone ever run into something like this?
Never had this happen, but then I don't use Briefcase. Then again, I try to avoid most things in Vista.
I just use good ol' copy and paste for thumb drive stuff.