I have written a console application that makes an HTTP POST to an ASMX web service (within SharePoint), and I am invoking it remotely with WinRM. I have written a PowerShell process that bundles up the built application, moves it to the destination server, and executes it locally. The process works completely fine (i.e. the moving, copying, what-have-you), but it fails when it gets to the web service call. Looking at the Fiddler logs, the HTTP POST request has no credentials in it at all, even though the application is being run with an account that has all the proper permissions.
Since the application is moved to the destination server, I can also run it there locally, which I have done, and it works completely fine; the Fiddler logs show the correct credentials in the header.
I am not sure where the issue resides, but I am having a hard time thinking it's my code, since it works fine when run locally. It may be the WinRM process, but the exact same process works for other environments (this project is part of a larger deployment solution that deploys databases, SSIS packages, etc.).
I am happy to provide code for certain pieces if needed, and any basic troubleshooting advice would be greatly appreciated.
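For what it's worth, the symptom described (credentials present when the app runs locally, absent when it is launched through WinRM) matches the classic WinRM "second hop" limitation: the remote session cannot delegate the caller's credentials onward to the SharePoint web service. A hedged sketch of the usual workaround, assuming CredSSP is acceptable in your environment (DESTSERVER and $scriptPath are placeholders):
# On the initiating machine: allow delegating credentials to the destination server
Enable-WSManCredSSP -Role Client -DelegateComputer 'DESTSERVER' -Force
# On the destination server, run once: Enable-WSManCredSSP -Role Server -Force
# Then ask for CredSSP so the remote session can make the second hop to the web service
Invoke-Command -ComputerName 'DESTSERVER' -FilePath $scriptPath -Credential $Credential -Authentication CredSSP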
EDIT
I specify the user credentials in an XML file that contains the username and an encrypted password, used like this in PowerShell:
powershell.exe -version 3.0 -ExecutionPolicy RemoteSigned -command "& { $cred=Import-Clixml '%CREDENTIAL_FILENAME%' ; $cred.Password=ConvertTo-SecureString $cred.Password ; $Credential=New-Object System.Management.Automation.PsCredential($cred.UserName, $cred.Password) ; invoke-command -computername %MACHINE% -filepath %Result% -Credential $Credential }"
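For context, a credential file in that shape can be produced once, up front; a minimal sketch (the file name and account are assumptions, and ConvertFrom-SecureString ties the encryption to the user and machine that ran it, which is why the file only decrypts for that same user):
# Prompt for the deployment account's password and store it DPAPI-encrypted
$entered = Get-Credential 'DOMAIN\deployuser'
$export = New-Object PSObject -Property @{
    UserName = $entered.UserName
    Password = $entered.Password | ConvertFrom-SecureString
}
$export | Export-Clixml 'credential.xml'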
Related
We have an app that updates a GitLab project and commits/pushes to master. When I run it locally, the program runs fine. When I run it on the server hosting the application (not the GitLab server), I get a 404 "Project not found".
It's not the private access token; we just made a new one.
I can sign in to GitLab on our app host server and find the project.
I verified the account can push straight to master.
There is no XDT transformation on the config files.
I logged into the account on the app host server and hit the project via the same API URL.
Here is the code that builds the URL:
// GET the file from the GitLab API; "/" in the file path must be encoded as %2F
RestRequest fileExists = new RestRequest(
    $@"api/v4/projects/{project.id}/repository/files/{action.file_path.Replace("/", "%2F")}?ref=master&private_token={ConfigurationManager.AppSettings["JiraPrivateKey"]}",
    Method.GET);
I'm almost thinking it's some sort of server setting, but then I would expect to get the 404 when I tried the API URL in the browser as well.
After thinking about it: since I use the ConfigurationManager, I had to copy my .exe.config file to the server, not just the exe itself. After dragging the config and the exe over, it worked. :-(
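A guard for next time might be to deploy the config file together with the exe; a minimal PowerShell sketch, with the paths assumed:
# ConfigurationManager reads MyApp.exe.config from beside the exe, so both must travel together
Copy-Item -Path 'bin\Release\MyApp.exe', 'bin\Release\MyApp.exe.config' -Destination '\\appserver\tools\'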
I am not sure if this is the right place to ask, but here it is: I created an ASP.NET Core web application and copied all the files to my Ubuntu 14.04 server. I can compile and run without a problem, but now I want this application/web site to run permanently.
I followed all the steps described here https://docs.asp.net/en/latest/publishing/linuxproduction.html, installed nginx as a reverse proxy to run with Apache, and all of this runs perfectly well.
BUT, trying to use supervisor to start the app, I systematically get the error /usr/bin/dotnet cannot execute binary file. Yet if I move into the directory where the application is published and manually type dotnet appname.dll, it starts without a glitch.
I am not sure where to look to get this working with supervisor. Thanks for your help (and if this question belongs somewhere else, let me know).
I finally solved my problem by replacing the equivalent of the line command=bash /usr/bin/dotnet /var/aspnetcore/HelloMVC/HelloMVC.dll (as described under "Configuring Supervisor" in https://docs.asp.net/en/latest/publishing/linuxproduction.html) with a little script. It was far from perfect at first, as I got a Warning: HOME environment variable not set; the export line below takes care of that.
Anyway, here is the script:
#!/bin/bash
# Give $HOME a value so dotnet stops warning that it is unset
# (the directory chosen here is an assumption; any writable path works)
export HOME=/var/aspnetcore/foesuivi
cd /var/aspnetcore/foesuivi/
dotnet FoESuivi.dll
As a Windows programmer, I don't know much about bash scripting, but I could see that $HOME needed to be given a value, which the export line at the top of the script now does.
Anyway, after doing chmod +x on the .sh file and pointing command= at the full path of the script, it is now working; I can reboot and my site is immediately available.
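For reference, the same fix can live in the supervisor configuration itself by setting the environment there, avoiding the wrapper script entirely; a hedged sketch, with the program name and HOME value assumed:
[program:foesuivi]
command=/usr/bin/dotnet FoESuivi.dll
directory=/var/aspnetcore/foesuivi
environment=HOME="/var/aspnetcore/foesuivi"
autostart=true
autorestart=true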
I am trying to use Azure cmdlets from a C# console application that runs as a console job hosted in my Azure virtual machine. This job runs automatically more than once a day, but my Azure cmdlets fail to execute. Yesterday it worked fine, but a day later it is not working. Please help me. ☺
Note: I am doing some administration tasks here and am using an organizational account, but it fails to add the account, and the other cmdlets then fail to work.
Install the Azure PowerShell tools. The download link is here: https://azure.microsoft.com/en-us/documentation/articles/powershell-install-configure/#how-to-install-azure-powershell
It's renamed to Add-AzureAccount.
I'm assuming that the original question was "The term 'Login-AzureRmAccount' is not recognized as the name of a cmdlet, function, script file, or operable program." I'm seeing the same thing, despite "Install-Module AzureRM" having been run (from an elevated PowerShell prompt) and "Get-Module AzureRM" returning version 4.3.1 (i.e. the PowerShell tools are installed).
The first part of the solution for me was to also run "Install-Module Azure" (from an elevated PowerShell prompt), after which "Add-AzureAccount" worked. However, a command such as "New-AzureRmResourceGroup" still failed with "Run Login-AzureRmAccount to login".
The second part of the solution was simply to reboot, after which:
Add-AzureRmAccount
New-AzureRmResourceGroup -Name wibble -Location UKSouth
worked.
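For anyone hitting the same thing, it may be worth confirming that both module sets are actually visible to the session before logging in; a hedged sketch:
# If either module is missing here, Install-Module (elevated) plus a reboot is what worked above
Get-Module -ListAvailable Azure, AzureRM | Select-Object Name, Version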
I have a Powershell script that starts a service on a remote machine.
It does something like this:
Invoke-Command -ComputerName $serverName -ScriptBlock { param($serviceexe) & $serviceexe -install } -ArgumentList $localExePath
It works OK on my local machine and also in TeamCity as a build step.
But when I try to start the service from some other tool, I get the message:
"cannot start service from the command line or a debugger".
This other tool is written in C#, using System.Management.Automation.
I've also tried it using the Process class, but still the same problem.
Any idea?
Have you considered Powershell Remote Requirements already?
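If remoting itself is in doubt, it can be verified independently of the tool; a hedged sketch ('MyService' is a hypothetical service name):
# Check WinRM connectivity from the calling machine
Test-WSMan -ComputerName $serverName
# Once the service is installed, starting it through the service control manager
# avoids the "cannot start service from the command line or a debugger" path
Invoke-Command -ComputerName $serverName -ScriptBlock { Start-Service -Name 'MyService' }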
I am trying to create a setup package that not only installs our app but also copies files to a remote app server and installs a service there. I thought I would just override the Install method in a custom action and have it kick off a PowerShell script to copy the files. Unfortunately, when the code calls the PowerShell script, I get this CmdletProviderInvocationException:
The system detected a possible attempt to compromise security. Please ensure that you can contact the server that authenticated you.
I was able to copy the code I use to call the PowerShell script into a test project, and it ran just fine. That is what I would expect, since I have logged in to the server through Windows Explorer, so my user should be authenticated. I think the reason the script won't work when called by the installer is that the installer switches users to gain admin permissions to install the app, and the admin user is not authenticated (although I could be wrong).
Does anyone know how I could get this to work?
Here's the custom action code:
// Requires: using System.Management.Automation; using System.Management.Automation.Runspaces;
Runspace runspace = RunspaceFactory.CreateRunspace();
runspace.Open();
Pipeline pipeline = runspace.CreatePipeline();
string scriptLoc = "c:\\sampleLocation";
// Invoke the script via the call operator (&) so the quoted path is handled correctly
pipeline.Commands.AddScript("& \"" + scriptLoc + "\\script.ps1\"");
Collection<PSObject> results = pipeline.Invoke();
runspace.Close();
and here's the script:
# UNC path to the service folder on the remote app server
$RemotePath = "\\SERVER\C$\Shared\Service"
$Source = "C:\sampleLocation\Service"
# Note: piping Get-ChildItem -Recurse into Copy-Item drops each item into
# the root of $RemotePath, so nested folder structure may not be preserved
Get-ChildItem $Source -Recurse | Copy-Item -Destination $RemotePath
There are two major requirements for copying files to a network location:
your custom action should run without impersonation
the network location should have full permissions for Everyone
An MSI installation runs under the LocalSystem account, so it doesn't matter whether your own user has permissions or not.
Since it's not easy to grant permissions to the SYSTEM account of a network machine, the easiest approach is to give full permissions to Everyone. This needs to be done on the machine that contains the shared folder.
According to @Cosmin Pirvu and the Microsoft documentation:
The LocalSystem account is a predefined local account used by the service control manager. It has extensive privileges on the local computer, and acts as the computer on the network.
If your shared folder is on a computer that is on a domain, you can give full permissions to the client computer's account instead of giving them to Everyone.
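A hedged sketch of that domain variant, assuming the share is named Service, the client machine is CLIENTPC, and the SMB cmdlets (Windows Server 2012+) are available; the NTFS permissions may need a matching grant:
# Run on the machine hosting the share; the trailing $ denotes the computer account
Grant-SmbShareAccess -Name 'Service' -AccountName 'DOMAIN\CLIENTPC$' -AccessRight Full -Force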