I am using Rotativa to generate a PDF from a view. It works on my localhost, but when I deploy to my server it does not work at all. The server has Windows Authentication and Impersonation enabled, which I need for this site.
This is the error I get when I run the code on the server:

Qt: Could not initialize OLE (error 80070005)
Error: Failed loading page https://api.mydomain.com/Reports/RedBluePDF?community=CommunityName&procedure=GetTasks (sometimes it will work just to ignore this error with --load-error-handling ignore)
Exit with code 1 due to http error: 1003
Here is my code:
public byte[] getReportsPDF(string community, string procedure)
{
    byte[] pdfBytes = new byte[] { };
    RouteData route = new RouteData();
    route.Values.Add("controller", "SiteSuperReports");
    route.Values.Add("action", "RedBluePDF");
    this.ControllerContext = new ControllerContext(new HttpContextWrapper(System.Web.HttpContext.Current), route, this);
    if (procedure == "GetProductionTasks")
    {
        var actionPDF = new Rotativa.ActionAsPdf("RedBluePDF", new { community = community, procedure = procedure })
        {
            PageSize = Size.A4,
            PageOrientation = Rotativa.Options.Orientation.Landscape,
            PageMargins = { Left = 1, Right = 1 }
        };
        try
        {
            pdfBytes = actionPDF.BuildFile(ControllerContext);
        }
        catch (Exception e)
        {
            Console.Write(e.Message);
        }
    }
    return pdfBytes;
}
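As a diagnostic aside (not part of the original question): the error text above itself suggests --load-error-handling ignore, and Rotativa can forward raw wkhtmltopdf switches through its CustomSwitches property. This is a sketch only; it masks the failed page load rather than fixing the underlying authentication problem.

```csharp
// Sketch: forwarding the wkhtmltopdf switch the error message mentions.
// This suppresses the load error instead of fixing the auth issue, so
// treat it as a diagnostic aid, not a solution.
var actionPDF = new Rotativa.ActionAsPdf("RedBluePDF", new { community, procedure })
{
    PageSize = Size.A4,
    PageOrientation = Rotativa.Options.Orientation.Landscape,
    PageMargins = { Left = 1, Right = 1 },
    CustomSwitches = "--load-error-handling ignore"
};
```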
And here is the RedBluePDF method, which just returns a View:
public ActionResult RedBluePDF(string community, string procedure)
{
    return View();
}
What am I doing wrong? Why does this work on my localhost but not on my server, and how do I get it working on my server?
Try one of these solutions:
1- Go to IIS > Site > Authentication, click on "ASP.NET Impersonation" and DISABLE it.
2- If you're calling a script or a file or whatever, specify the protocol explicitly. Change:
src="//api.mydomain.com/?????
to:
src="http://api.mydomain.com/?????
3- In your Application Pool's configuration, under Process Model, there is an option called "Load User Profile". It defaults to False; set it to True.
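For reference, solution 3 can also be applied from an elevated command prompt with appcmd; "MyAppPool" here is a placeholder for your actual application pool name.

```
%windir%\system32\inetsrv\appcmd.exe set apppool ^
    /apppool.name:"MyAppPool" ^
    /processModel.loadUserProfile:true
```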
I have added a connected service reference in Visual Studio 2019. It consumed an https endpoint and generated all the binding information needed into a reference.cs file.
It didn't generate an App.config file, so I suspected that what I needed was bundled into the reference.cs file; looking into it, it mostly was.
So I tried creating a client and specifying the client credentials in two ways, as you can see below, but no matter how I specify them, I get an exception when calling the code below.
public async Task SendFile(Stream fileStream, string fileName, Guid machineKey)
{
    _logger.LogInformation("Starting file sending to Manager 1.");
    _logger.LogInformation($"Sending file {fileName} from Machine {machineKey}");
    try
    {
        var client = new FileTransferClient(FileTransferClient.EndpointConfiguration.BasicHttpBinding_IFileTransfer, _options.FileTransferEndPoint)
        {
            ClientCredentials =
            {
                UserName =
                {
                    UserName = _options.FileTransferUsername,
                    Password = _options.FileTransferPassword
                }
            }
        };
        client.ClientCredentials.UserName.UserName = _options.FileTransferUsername;
        client.ClientCredentials.UserName.Password = _options.FileTransferPassword;
        using (new OperationContextScope(client.InnerChannel))
        {
        }
        await client.UploadAsync(new FileUploadMessage
        {
            // Assume that this is enough. Can't really supply file length...
            FileInfo = new FileTransferInfo
            {
                TransferId = new Guid(),
                MachineUUID = machineKey.ToString(),
                Name = fileName
            },
            TransferStream = fileStream
        });
    }
    catch (Exception e)
    {
        _logger.LogError("An unexpected exception occurred while sending file to Manager 1G.", e);
    }
    _logger.LogInformation("File sending finished.");
}
The exception is "The HTTP request is unauthorized with client authentication scheme 'Basic'. The authentication header received from the server was 'Basic Realm'."
I have compared it to similar APIs that use the aforementioned App.config approach, and have edited the reference.cs to match the security configuration I think it should have.
Specifically, I've added the security-related lines here:
private static System.ServiceModel.Channels.Binding GetBindingForEndpoint(EndpointConfiguration endpointConfiguration)
{
    if ((endpointConfiguration == EndpointConfiguration.BasicHttpBinding_IFileTransfer))
    {
        System.ServiceModel.BasicHttpBinding result = new System.ServiceModel.BasicHttpBinding();
        result.MaxBufferSize = int.MaxValue;
        result.ReaderQuotas = System.Xml.XmlDictionaryReaderQuotas.Max;
        result.MaxReceivedMessageSize = int.MaxValue;
        result.AllowCookies = true;
        result.Security.Mode = System.ServiceModel.BasicHttpSecurityMode.Transport;
        result.Security.Transport.ClientCredentialType = HttpClientCredentialType.Basic;
        result.Security.Transport.ProxyCredentialType = HttpProxyCredentialType.None;
        return result;
    }
    if ((endpointConfiguration == EndpointConfiguration.MetadataExchangeHttpsBinding_IFileTransfer))
    {
        System.ServiceModel.Channels.CustomBinding result = new System.ServiceModel.Channels.CustomBinding();
        System.ServiceModel.Channels.TextMessageEncodingBindingElement textBindingElement = new System.ServiceModel.Channels.TextMessageEncodingBindingElement();
        result.Elements.Add(textBindingElement);
        System.ServiceModel.Channels.HttpsTransportBindingElement httpsBindingElement = new System.ServiceModel.Channels.HttpsTransportBindingElement();
        httpsBindingElement.AllowCookies = true;
        httpsBindingElement.MaxBufferSize = int.MaxValue;
        httpsBindingElement.MaxReceivedMessageSize = int.MaxValue;
        result.Elements.Add(httpsBindingElement);
        return result;
    }
    throw new System.InvalidOperationException(string.Format("Could not find endpoint with name \'{0}\'.", endpointConfiguration));
}
What I found dumbfounding was that, even with the ClientCredentials set in the constructor's object initializer, they were not populated in any way when I inspected the client with a debug session attached. Hence I tried setting them explicitly afterwards.
But either way, the end result is the same: I get the same error.
How can I resolve that error in code?
I could in theory add an App.config and do it there, but I don't know the contract, and I am not sure what to look for in the generated reference.cs to identify it. So I'd prefer to learn to do this in code: the contract is already in place there, and I can supply the endpoint via _options, so it can be configured for different environments.
It turned out I had the username and password swapped; fixing that got me past this issue.
I am trying to deploy a website but am having an error with one of my forms that has an image upload.
Firstly, I am using Network Solutions for hosting, and one of the forms has a file upload input for images. When I run the application locally and upload the image to the FTP server, everything works correctly; however, when I deploy to the server, the connection appears to time out (it hangs for a few moments), then I get the message "Object reference not set to an instance of an object." One thing I should note is that SSL is not set up on this server; I am using an unsecured port.
[HttpPost]
[ValidateAntiForgeryToken]
public async Task<ActionResult> Submit(RegistrationViewModel viewModel)
{
    if (!ModelState.IsValid)
    {
        return RedirectToAction("Index", viewModel);
    }
    if (Request.Files.Count > 0)
    {
        string path = String.Empty;
        HttpPostedFileBase file = Request.Files[0];
        if (file != null && file.ContentLength > 0)
        {
            string fileName = Guid.NewGuid().ToString() + Path.GetExtension(file.FileName); // Path.GetFileName(file.FileName);
            try
            {
                using (WebClient client = new WebClient())
                {
                    client.Credentials = new NetworkCredential("***", "***");
                    byte[] buffer = new byte[file.ContentLength];
                    file.InputStream.Read(buffer, 0, buffer.Length);
                    file.InputStream.Close();
                    path = "ftp://***.***.com:21/pics/" + fileName;
                    client.UploadData(path, buffer);
                }
            }
            catch (WebException ex)
            {
                string status = ((FtpWebResponse)ex.Response).StatusDescription;
            }
        }
        Context.Registrations.Add(new Registration
        {
            FirstName = viewModel.FirstName,
            LastName = viewModel.LastName,
            Email = viewModel.Email,
            PhoneNumber = viewModel.PhoneNumber,
            Age = viewModel.Age,
            ImagePath = path,
            CreatedDate = DateTime.Now
        });
        await Context.SaveChangesAsync();
        ConfirmationViewModel confirmViewModel = new ConfirmationViewModel
        {
            FirstName = viewModel.FirstName,
            LastName = viewModel.LastName,
            Email = viewModel.Email,
            PhoneNumber = viewModel.PhoneNumber
        };
        return RedirectToAction("Confirm", "Register", confirmViewModel);
    }
    // Missing from the snippet as posted: a return for the no-files path so all code paths return a value
    return RedirectToAction("Index", viewModel);
}
I expect the image to save to the path as it does locally, but on the server I cannot get past this timeout/null exception. The exception in the stack trace occurs when UploadData is hit, at line 89 of the register controller (I'm showing the Submit action from that controller above). Since this issue only happens on the server, feedback on the error has been fairly limited. Without the try/catch I get an internal server error; with it, I get the null reference exception.
One thing I tried was removing the following lines, assuming something was null there, but I got the same result:
file.InputStream.Read(buffer, 0, buffer.Length);
file.InputStream.Close();
Any help would be greatly appreciated.
I have tracked down where the "Object reference not set to an instance of an object" exception comes from: the catch block. The exception does not seem to be a WebException, or at least the response I was casting was not an FtpWebResponse (it is very difficult to debug since I have to deploy each time). However, I am still timing out for some unknown reason.
I have successfully saved the image to the server, from the server! The issue came down to the request being made from the web server's domain to the FTP domain.
Hidden in the Network Solutions options was folder privileges (Security/Password Protection, for anyone looking). After allowing write access on the directory I needed, I switched my method to use HttpPostedFileBase to save the file directly to the server, and success! I wish I had found that option from the start and saved hours of headache.
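For anyone wanting the shape of that direct-save approach, here is a minimal sketch. The "~/pics" folder is an assumption (use whichever directory you granted write access to), and it reuses the same Guid-based file naming as the original code.

```csharp
// Sketch: save the upload straight to the web server's filesystem
// instead of round-tripping through FTP. Requires write access on
// the target folder ("~/pics" here is a placeholder).
HttpPostedFileBase file = Request.Files[0];
if (file != null && file.ContentLength > 0)
{
    string fileName = Guid.NewGuid() + Path.GetExtension(file.FileName);
    string physicalPath = Server.MapPath("~/pics/" + fileName);
    file.SaveAs(physicalPath);
    path = Url.Content("~/pics/" + fileName); // store a site-relative URL
}
```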
I have set up a webhook in my Braintree account at https://example.com/Webhook/Accept
I have the site set up to create a Customer & Subscription via a purchase link.
When I make a purchase via the link, both are created OK in the Braintree vault, the webhook fires, and I receive 2 notifications.
This all seems great and everything is working, except for the 'Check URL' check in the Braintree Control Panel's webhook section, where I'm getting an HTTP error 500 (internal server error).
I think it's a bt_challenge issue on the GET, but having tried a few variations I simply can't get it working.
Here is my controller:
public class WebhookController : Controller
{
    public IBraintreeConfiguration config = new BraintreeConfiguration();

    public ActionResult Accept()
    {
        CrmDB db = new CrmDB();
        Log log;
        var gateway = config.GetGateway();
        if (Request.HttpMethod == "POST")
        {
            WebhookNotification webhookNotification = gateway.WebhookNotification.Parse(
                Request.Params["bt_signature"],
                Request.Params["bt_payload"]
            );
            string message = string.Format(
                "[Webhook Received {0}] | Kind: {1} | Subscription: {2}",
                webhookNotification.Timestamp.Value,
                webhookNotification.Kind,
                webhookNotification.Subscription.Id
            );
            System.Console.WriteLine(message);
            // Save to Db
            log = new Log
            {
                Stamp = DateTime.Now,
                App = "api/Webhook/Accept",
                IsInsight = true,
                Insight = message
            };
            // Db
            db.Log.Add(log);
            db.SaveChanges();
            return new HttpStatusCodeResult(200);
        }
        else
        {
            string msg = gateway.WebhookNotification.Verify(Request.QueryString["bt_challenge"]);
            // Save to Db
            log = new Log
            {
                Stamp = DateTime.Now,
                App = "Webhook - bt_challenge",
                IsInsight = true,
                Insight = msg
            };
            // Db
            db.Log.Add(log);
            db.SaveChanges();
            return Content(msg);
        }
    }
}
I believe the issue is in the else branch (i.e. the HTTP GET), but I can't work out what it is.
Why am I getting the 500 internal server error from the webhook Control Panel?
I can't manage to get Pechkin or TuesPechkin to work on my Azure site.
Whenever I try to access the site, it just hangs with no error message (even with customErrors off). Is there any further setup I'm missing? Everything works perfectly locally.
For a 64-bit app I'm completing the following steps:
Create a new empty MVC app in Azure, making sure "Host in the cloud" is selected
Change the app to 64-bit
Log onto the Azure portal, upgrade the app to Basic hosting, and change it to 64-bit
Install the TuesPechkin.Wkhtmltox.Win64 and TuesPechkin NuGet packages
Add a singleton class to return the IConverter
public class TuesPechkinConverter
{
    private static IConverter converter;

    public static IConverter Converter
    {
        get
        {
            if (converter == null)
            {
                converter =
                    new ThreadSafeConverter(
                        new PdfToolset(
                            new Win64EmbeddedDeployment(
                                new TempFolderDeployment())));
            }
            return converter;
        }
    }
}
Add a Home controller with the following code in the Index action:
var document = new HtmlToPdfDocument
{
    GlobalSettings =
    {
        ProduceOutline = true,
        DocumentTitle = "Pretty Websites",
        PaperSize = PaperKind.A4, // Implicit conversion to PechkinPaperSize
        Margins =
        {
            All = 1.375,
            Unit = Unit.Centimeters
        }
    },
    Objects =
    {
        new ObjectSettings { HtmlText = "<h1>Pretty Websites</h1><p>This might take a bit to convert!</p>" },
        new ObjectSettings { PageUrl = "www.google.com" }
    }
};
byte[] pdfBuf = TuesPechkinConverter.Converter.Convert(document);
return File(pdfBuf, "application/pdf", "DownloadName.pdf");
As far as I know, you can't make it work in a web app. However, there is a way you can do it: create a cloud service and add a worker role to it. TuesPechkin will be installed in this worker role.
The workflow would be the following: from your cloud web app, you access the worker role (this is possible by configuring the worker role to host ASP.NET Web API 2). The worker role configures a converter using TuesPechkin and generates the PDF. We wrap the PDF in the Web API response and send it back. Now, let's do it...
To add a cloud service (assuming you have the Azure SDK installed), go to Visual Studio -> right-click your solution -> Add new project -> select the Cloud node -> Azure Cloud Service -> after you click OK, select Worker Role and click OK.
Your cloud service and your worker role are now created. The next thing to do is configure your worker role so it can host ASP.NET Web API 2.
This configuration is pretty straightforward if you follow this tutorial.
After you have configured your worker role to host a Web API, you will have to install the TuesPechkin.Wkhtmltox.Win64 and TuesPechkin NuGet packages.
Your configuration should now be ready. Next, create a controller in which we will generate the PDF: add a new class to your worker role that extends ApiController:
public class PdfController : ApiController
{
}
Add an action to the controller that returns an HttpResponseMessage object:
[HttpPost]
public HttpResponseMessage GeneratePDF(PdfViewModel viewModel)
{
}
Here we will configure the ObjectSettings and GlobalSettings objects that get applied to an HtmlToPdfDocument object.
You now have two options: you can generate the PDF from HTML text (perhaps the HTML of your page, sent in the request), or directly from a page URL.
var document = new HtmlToPdfDocument
{
    GlobalSettings =
    {
        ProduceOutline = true,
        DocumentTitle = "Pretty Websites",
        PaperSize = PaperKind.A4, // Implicit conversion to PechkinPaperSize
        Margins =
        {
            All = 1.375,
            Unit = Unit.Centimeters
        }
    },
    Objects =
    {
        new ObjectSettings { HtmlText = "<h1>Pretty Websites</h1><p>This might take a bit to convert!</p>" },
        new ObjectSettings { PageUrl = "www.google.com" }
    }
};
A nice thing to know is that when using a page URL, you can use the ObjectSettings object to post parameters:
var obj = new ObjectSettings();
obj.LoadSettings.PostItems.Add(
    new PostItem
    {
        Name = "paramName",
        Value = paramValue
    });
Also, per the TuesPechkin documentation, the converter should be thread-safe and should be kept somewhere static, or as a singleton instance:
IConverter converter =
    new ThreadSafeConverter(
        new RemotingToolset<PdfToolset>(
            new Win64EmbeddedDeployment(
                new TempFolderDeployment())));
Finally, wrap the PDF in the response content, set the response content type to application/pdf, add the Content-Disposition header, and that's it:
byte[] result = converter.Convert(document);
MemoryStream ms = new MemoryStream(result);
// The original snippet used an undeclared "response" variable; declare it here
var response = new HttpResponseMessage(HttpStatusCode.OK);
response.Content = new StreamContent(ms);
response.Content.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/pdf");
response.Content.Headers.Add("content-disposition", "attachment;filename=myFile.pdf");
return response;
I'm afraid the answer is that it's not possible to get wkhtmltopdf working on Azure.
See this thread.
I am assuming you mean running wkhtmltopdf on Windows Azure Websites. wkhtmltopdf uses Windows' GDI APIs, which currently don't work on Azure Websites.
Tuespechkin supported usage
It supports .NET 2.0+, 32- and 64-bit processes, and IIS-hosted applications.
Azure Websites does not currently support the use of wkhtmltopdf.
Workaround
I ended up creating an Azure Cloud Service that runs wkhtmltopdf.exe. I send the HTML to the service and get a byte[] in return.
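The calling side of that workaround could look roughly like this; the service URL and route are hypothetical, and it assumes the cloud service exposes a POST action that accepts raw HTML and responds with the PDF bytes.

```csharp
// Sketch: post HTML to a separately hosted PDF service and read the
// rendered PDF back as a byte array. URL and route are placeholders.
using (var http = new HttpClient())
{
    var response = await http.PostAsync(
        "http://mypdfservice.cloudapp.net/api/pdf",
        new StringContent(html, Encoding.UTF8, "text/html"));
    response.EnsureSuccessStatusCode();
    byte[] pdfBytes = await response.Content.ReadAsByteArrayAsync();
}
```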
I am using Primavera Web Services (version 6.2.1) to read data from a Primavera database (SQL Server 2008 R2) in a WinForms application (C#). I use HTTP cookie container authentication mode. Before I moved my database to a new server I was able to log in and read data from the Primavera database successfully, but since I moved the DB (using backup and restore), I can still log in, yet the web services return null for every request.
This is my code to login:
AuthenticationService authService = new AuthenticationService( );
authService.CookieContainer = new System.Net.CookieContainer( );
authService.Url = _P6wsAuthenticationService;
Login loginObj = new Login( );
loginObj.UserName = pv_Username;
loginObj.Password = pv_Password;
loginObj.DatabaseInstanceId = 1;
loginObj.DatabaseInstanceIdSpecified = true;
loginObj.VerboseFaults = true;
loginObj.VerboseFaultsSpecified = true;
LoginResponse loginReturn = authService.Login( loginObj );
ReadDatabaseInstancesResponseDatabaseInstance[] readdbInstances = authService.ReadDatabaseInstances("");
cookieContainer = authService.CookieContainer;
When I run this code, the login response for the new database is "true" and it shows the correct database instance information in readdbInstances.
I then run the following code to read some project info from the DB:
ProjectPortBinding pbProject = new ProjectPortBinding( );
pbProject.CookieContainer = cookieContainer;
pbProject.Url = _P6wsProjectService;
ReadProjects readProject = new ReadProjects( );
Primavera.Ws.P6.Project.ProjectFieldType[] pfProject = new Primavera.Ws.P6.Project.ProjectFieldType[6];
pfProject[0] = Primavera.Ws.P6.Project.ProjectFieldType.ObjectId;
pfProject[1] = Primavera.Ws.P6.Project.ProjectFieldType.Id;
pfProject[2] = Primavera.Ws.P6.Project.ProjectFieldType.Name;
pfProject[3] = Primavera.Ws.P6.Project.ProjectFieldType.Status;
pfProject[4] = Primavera.Ws.P6.Project.ProjectFieldType.StartDate;
pfProject[5] = Primavera.Ws.P6.Project.ProjectFieldType.FinishDate;
readProject.Filter = pv_ProjectList.Equals( String.Empty ) ? String.Empty : "Id IN (" + pv_ProjectList + ")";
readProject.Field = pfProject;
Primavera.Ws.P6.Project.Project[] aProject = pbProject.ReadProjects( readProject );
It sends the request to the server, but the response I receive contains no projects.
Before I moved the database I was able to read data with this same code. I changed the database instance for Primavera Web Services using its database configuration, and I'm sure it connects to the right DB; I'm just confused about why it cannot read data from it. When I use the Primavera client module to connect and read data from the new database, it works fine and I can see all my projects.
Check the credentials you are using in the user administration front end; you will need to increase the permissions there at a module level or project level. Once you enable those, you should see some data.
Kindly,
JK.