Using TuesPechkin with an MVC project in Azure - C#

I can't manage to get Pechkin or TuesPechkin to work on my Azure site.
Whenever I try to access the site it just hangs with no error message (even with customErrors off). Is there any further setup I'm missing? Everything works perfectly locally.
For a 64-bit app I'm completing the following steps:
Create a new Empty MVC App with Azure, make sure Host in the cloud is selected
Change the app to 64-bit
Log onto the Azure portal, upgrade the app to Basic hosting and change it to 64-bit
Install the TuesPechkin.Wkhtmltox.Win64 and TuesPechkin NuGet packages
Add a singleton class to return the IConverter
public class TuesPechkinConverter
{
    private static IConverter converter;

    public static IConverter Converter
    {
        get
        {
            if (converter == null)
            {
                converter =
                    new ThreadSafeConverter(
                        new PdfToolset(
                            new Win64EmbeddedDeployment(
                                new TempFolderDeployment())));
            }
            return converter;
        }
    }
}
Add a Home controller with the following code in the Index Action:
var document = new HtmlToPdfDocument
{
    GlobalSettings =
    {
        ProduceOutline = true,
        DocumentTitle = "Pretty Websites",
        PaperSize = PaperKind.A4, // Implicit conversion to PechkinPaperSize
        Margins =
        {
            All = 1.375,
            Unit = Unit.Centimeters
        }
    },
    Objects =
    {
        new ObjectSettings { HtmlText = "<h1>Pretty Websites</h1><p>This might take a bit to convert!</p>" },
        new ObjectSettings { PageUrl = "www.google.com" }
    }
};
byte[] pdfBuf = TuesPechkinConverter.Converter.Convert(document);
return File(pdfBuf, "application/pdf", "DownloadName.pdf");

As far as I know, you can't make it work in a web app. However, there is a way you can do it: create a cloud service and add a worker role to it. TuesPechkin will be installed in this worker role.
The workflow would be the following: from your cloud web app, you access the worker role (this is possible by configuring the worker role to host ASP.NET Web API 2). The worker role configures a converter using TuesPechkin and generates the PDF. We wrap the PDF in the Web API response and send it back. Now, let's do it...
To add a cloud service (assuming you have the Azure SDK installed), go to Visual Studio -> right-click your solution -> Add new project -> select the Cloud node -> Azure Cloud Service -> after you click OK, select Worker Role and click OK.
Your cloud service and your worker role are created. Next thing to do is to configure your Worker Role so it can host ASP.NET Web API 2.
This configuration is pretty straightforward if you follow this tutorial.
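In short, that tutorial boils down to a Startup class plus OnStart/OnStop overrides in the worker role. Here is a minimal sketch, assuming the Microsoft.AspNet.WebApi.OwinSelfHost NuGet package and an input endpoint named "Endpoint1" in your service definition (both assumptions, adjust to your project):
using System;
using System.Web.Http;
using Microsoft.Owin.Hosting;
using Microsoft.WindowsAzure.ServiceRuntime;
using Owin;

// OWIN startup: wires Web API routing into the self-hosted pipeline.
public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        var config = new HttpConfiguration();
        config.Routes.MapHttpRoute(
            name: "Default",
            routeTemplate: "api/{controller}/{action}/{id}",
            defaults: new { id = RouteParameter.Optional });
        app.UseWebApi(config);
    }
}

public class WorkerRole : RoleEntryPoint
{
    private IDisposable _app;

    public override bool OnStart()
    {
        // "Endpoint1" must match the endpoint declared in ServiceDefinition.csdef.
        var endpoint = RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["Endpoint1"];
        string baseUri = string.Format("{0}://{1}", endpoint.Protocol, endpoint.IPEndpoint);
        _app = WebApp.Start<Startup>(new StartOptions(url: baseUri));
        return base.OnStart();
    }

    public override void OnStop()
    {
        if (_app != null)
        {
            _app.Dispose();
        }
        base.OnStop();
    }
}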
After you have configured your Worker Role to host a Web API, install the TuesPechkin.Wkhtmltox.Win64 and TuesPechkin NuGet packages.
Your configuration should now be ready. Next, create the controller in which we will generate the PDF: add a new class to your Worker Role that extends ApiController:
public class PdfController : ApiController
{
}
Add an action to our controller, which will return an HttpResponseMessage object.
[HttpPost]
public HttpResponseMessage GeneratePDF(PdfViewModel viewModel)
{
}
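PdfViewModel isn't a TuesPechkin type; it's just a DTO you define for the request body. A minimal hypothetical shape, assuming you post either raw HTML or a page URL:
// Hypothetical request DTO (not part of TuesPechkin): carries either the raw
// HTML to convert or the URL of a page to fetch.
public class PdfViewModel
{
    public string HtmlText { get; set; }
    public string PageUrl { get; set; }
}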
Here we will configure the ObjectSettings and GlobalSettings objects that will be applied to an HtmlToPdfDocument object.
You now have two options: you can generate the PDF from HTML text (maybe you sent the HTML of your page in the request) or directly from a page URL.
var document = new HtmlToPdfDocument
{
    GlobalSettings =
    {
        ProduceOutline = true,
        DocumentTitle = "Pretty Websites",
        PaperSize = PaperKind.A4, // Implicit conversion to PechkinPaperSize
        Margins =
        {
            All = 1.375,
            Unit = Unit.Centimeters
        }
    },
    Objects =
    {
        new ObjectSettings { HtmlText = "<h1>Pretty Websites</h1><p>This might take a bit to convert!</p>" },
        new ObjectSettings { PageUrl = "www.google.com" }
    }
};
A nice thing to know is that when using a page URL, you can use the ObjectSettings object to post parameters:
var obj = new ObjectSettings();
obj.LoadSettings.PostItems.Add(
    new PostItem
    {
        Name = "paramName",
        Value = paramValue
    });
Also, per the TuesPechkin documentation, the converter should be thread-safe and should be kept somewhere static, or as a singleton instance:
IConverter converter =
    new ThreadSafeConverter(
        new RemotingToolset<PdfToolset>(
            new Win64EmbeddedDeployment(
                new TempFolderDeployment())));
Finally, wrap the PDF in the response content, set the response content type to application/pdf, add the Content-Disposition header, and that's it:
byte[] result = converter.Convert(document);
MemoryStream ms = new MemoryStream(result);
var response = new HttpResponseMessage();
response.StatusCode = HttpStatusCode.OK;
response.Content = new StreamContent(ms);
response.Content.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/pdf");
response.Content.Headers.Add("content-disposition", "attachment;filename=myFile.pdf");
return response;
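On the web app side, calling the worker role is then just an HTTP call that relays the PDF bytes to the browser. A sketch, assuming the cloud service is reachable at mypdfservice.cloudapp.net and the route matches the controller above (both placeholders), using Microsoft.AspNet.WebApi.Client for PostAsJsonAsync:
// MVC action in the web app: asks the worker role for the PDF and streams it
// to the browser. Host name and route are placeholders.
public async Task<ActionResult> DownloadPdf()
{
    using (var client = new HttpClient())
    {
        var response = await client.PostAsJsonAsync(
            "http://mypdfservice.cloudapp.net/api/pdf/generatepdf",
            new PdfViewModel { PageUrl = "www.google.com" });
        response.EnsureSuccessStatusCode();
        byte[] pdf = await response.Content.ReadAsByteArrayAsync();
        return File(pdf, "application/pdf", "DownloadName.pdf");
    }
}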

I'm afraid the answer is that it's not possible to get wkhtmltopdf working on Azure.
See this thread:
I am assuming you mean running wkhtmltopdf on Windows Azure Websites. wkhtmltopdf uses Windows' GDI APIs, which currently don't work on Azure Websites.
TuesPechkin supported usage
It supports .NET 2.0+, 32- and 64-bit processes, and IIS-hosted applications.
Azure Websites does not currently support the use of wkhtmltopdf.
Workaround
I ended up creating an Azure Cloud Service that runs wkhtmltopdf.exe. I send the HTML to the service and get a byte[] in return.
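If you take that route, the cloud service can simply shell out to the executable. A minimal sketch of the idea, assuming wkhtmltopdf.exe is deployed with the role; wkhtmltopdf accepts "-" to read HTML from stdin and write the PDF to stdout:
using System.Diagnostics;
using System.IO;

public static class WkHtmlToPdfRunner
{
    // Converts an HTML string to PDF bytes by piping through wkhtmltopdf.exe.
    public static byte[] Convert(string html)
    {
        var psi = new ProcessStartInfo
        {
            FileName = "wkhtmltopdf.exe",      // assumed to be deployed alongside the role
            Arguments = "-q - -",              // quiet; read HTML from stdin, write PDF to stdout
            UseShellExecute = false,
            RedirectStandardInput = true,
            RedirectStandardOutput = true
        };
        using (var process = Process.Start(psi))
        using (var output = new MemoryStream())
        {
            process.StandardInput.Write(html);
            process.StandardInput.Close();
            process.StandardOutput.BaseStream.CopyTo(output);
            process.WaitForExit();
            return output.ToArray();
        }
    }
}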


Change Microsoft Graph Resource for SharePoint Webhook

I am currently working through the Microsoft Graph tutorial with C# .NET Core, and in the process I came across the following C# method for creating a Subscription:
[HttpGet]
public async Task<ActionResult<string>> Get()
{
    var graphServiceClient = GetGraphClient();

    var sub = new Microsoft.Graph.Subscription();
    sub.ChangeType = "updated";
    sub.NotificationUrl = config.Ngrok + "/api/notifications";
    sub.Resource = "/users";
    sub.ExpirationDateTime = DateTime.UtcNow.AddMinutes(15);
    sub.ClientState = "SecretClientState";

    var newSubscription = await graphServiceClient
        .Subscriptions
        .Request()
        .AddAsync(sub);

    Subscriptions[newSubscription.Id] = newSubscription;

    if (subscriptionTimer == null)
    {
        subscriptionTimer = new Timer(CheckSubscriptions, null, 5000, 15000);
    }

    return $"Subscribed. Id: {newSubscription.Id}, Expiration: {newSubscription.ExpirationDateTime}";
}
and wanted to know how I can change it to subscribe to SharePoint lists instead of users.
If I change sub.Resource to /sites/{site-id} or similar, it does not work.
GitHub link: MS Repo
The Microsoft Graph API uses a webhook mechanism to deliver change notifications to clients. Using the Microsoft Graph API, an app can subscribe to changes to a list under a SharePoint site.
Resource Path - Changes to content within the list:
/sites/{id}/lists/{id}
For details on how to subscribe to and handle incoming notifications, see Set up notifications for changes in user data.
Also make sure you check the necessary permissions needed here.
I found the solution myself: set sub.Resource to /sites/{site-id}/lists/{list-id}.
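In the tutorial code above, that's a one-line change (the IDs are placeholders for your real site and list):
// Subscribe to changes on a SharePoint list instead of /users;
// replace {site-id} and {list-id} with real identifiers.
sub.Resource = "/sites/{site-id}/lists/{list-id}";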

How can I use a google speech to text api key in deployed WPF app?

I have created a C# WPF app that uses the Google Speech-to-Text API. Currently, on my development machine, I have added an environment variable in Windows that references the JSON file that Google gave me.
Will I need to create that environment variable on every machine I deploy my app to, or is there a way to store the JSON key on a server and reference it from there?
Possibly, Passing Credentials Using Code helps you.
This is the code copied from there:
// Some APIs, like Storage, accept a credential in their Create() method.
public object AuthExplicit(string projectId, string jsonPath)
{
    // Explicitly use service account credentials by specifying
    // the private key file.
    var credential = GoogleCredential.FromFile(jsonPath);
    var storage = StorageClient.Create(credential);
    // Make an authenticated API request.
    var buckets = storage.ListBuckets(projectId);
    foreach (var bucket in buckets)
    {
        Console.WriteLine(bucket.Name);
    }
    return null;
}

// Other APIs, like Language, accept a credentials path on their client builder.
public object AuthExplicit(string projectId, string jsonPath)
{
    LanguageServiceClientBuilder builder = new LanguageServiceClientBuilder
    {
        CredentialsPath = jsonPath
    };
    LanguageServiceClient client = builder.Build();
    AnalyzeSentiment(client);
    return 0;
}
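Since the question is about Speech-to-Text specifically, the same builder pattern should apply there as well; a sketch, assuming the Google.Cloud.Speech.V1 package:
using Google.Cloud.Speech.V1;

// Builds a SpeechClient from an explicit JSON key file instead of relying on
// the GOOGLE_APPLICATION_CREDENTIALS environment variable.
public SpeechClient CreateSpeechClient(string jsonPath)
{
    var builder = new SpeechClientBuilder { CredentialsPath = jsonPath };
    return builder.Build();
}
That way, jsonPath can come from your app's configuration or a file you fetch from your own server at startup, rather than a per-machine environment variable.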

Need to create a folder (and a file inside it) using C# inside an Azure DevOps repository - be it Git or TFVC

From the Azure DevOps portal, I can manually add a file/folder into a repository irrespective of whether the source code is cloned or not - image for illustration.
However, I want to programmatically create a folder and a file inside that folder within a repository from C# code in my ASP.NET Core application.
Is there an Azure DevOps REST API or any other way to do that? I'll use basic authentication through a PAT token only.
Note: I'm restricted from cloning the source code to a local repository.
An early reply is really appreciated.
I tried HttpClient, GitHttpClient and LibGit2Sharp but failed.
Follow the steps below in your C# code.
First, call the Get Refs REST API: https://dev.azure.com/{0}/{1}/_apis/git/repositories/{2}/refs{3}
This should return the object for your repository branch, which you can use to push your changes.
Next, call the Pushes REST API to create a folder or file in your repository:
https://dev.azure.com/{0}/{1}/_apis/git/repositories/{2}/pushes{3}
var changes = new List<ChangeToAdd>();

// Add files
var jsonContent = File.ReadAllText(@"./static-files/somejsonfile.json");
ChangeToAdd changeJson = new ChangeToAdd()
{
    changeType = "add",
    item = new ItemBase() { path = string.Concat(path, "/[your-folder-name]/somejsonfile.json") },
    newContent = new Newcontent()
    {
        contentType = "rawtext",
        content = jsonContent
    }
};
changes.Add(changeJson);

CommitToAdd commit = new CommitToAdd();
commit.comment = "commit from code";
commit.changes = changes.ToArray();

var content = new List<CommitToAdd>() { commit };
var request = new
{
    refUpdates = refs,
    commits = content
};

var personalaccesstoken = _configuration["azure-devOps-configuration-token"];
var authorization = Convert.ToBase64String(System.Text.ASCIIEncoding.ASCII.GetBytes(string.Format("{0}:{1}", "", personalaccesstoken)));

_logger.LogInformation($"[HTTP REQUEST] make a http call with uri: {uri} ");

// Make the HTTP call:
// https://dev.azure.com/{organizationName}/{projectName}/_apis/git/repositories/{repositoryId}/pushes{?api-version}
var result = _httpClient.SendHttpWebRequest(uri, method, data, authorization);
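Note that ChangeToAdd, ItemBase, Newcontent and CommitToAdd above are hand-rolled DTOs, not SDK types; minimal shapes that serialize to the JSON the Pushes API expects might look like this (lowercase property names keep the default serializer output aligned with the wire format):
// Hand-rolled DTOs matching the Pushes API JSON body.
public class ChangeToAdd
{
    public string changeType { get; set; }
    public ItemBase item { get; set; }
    public Newcontent newContent { get; set; }
}

public class ItemBase
{
    public string path { get; set; }
}

public class Newcontent
{
    public string contentType { get; set; }
    public string content { get; set; }
}

public class CommitToAdd
{
    public string comment { get; set; }
    public ChangeToAdd[] changes { get; set; }
}
The refs value used in refUpdates pairs the branch name with the objectId returned by the Get Refs call, e.g. new[] { new { name = "refs/heads/master", oldObjectId = "<objectId from the refs response>" } }.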

WebAPI + OWIN: zipping files directly to the output stream does not work as expected

I'm developing a website with ASP.NET MVC 5 + Web API. One of the requirements is that users must be able to download a large zip file, which is created on the fly.
Because I want to show progress to the user immediately, my idea was to use a PushStreamContent with a callback in the response. The callback creates the zip file and streams it to the response.
When I implement this as follows, starting from an empty ASP.NET MVC + Web API project, it works as expected. As soon as the result is returned to the client, the callback gets invoked and the zip file is streamed to the client, so the user sees progress as soon as the callback creates the zip archive and adds files to it.
[RoutePrefix("api/download")]
public class DownloadController : ApiController
{
    [HttpGet]
    public HttpResponseMessage Get()
    {
        var files = new DirectoryInfo(@"c:\tempinput").GetFiles();
        var pushStreamContent = new PushStreamContent(async (outputStream, httpContext, transportContext) =>
        {
            using (var zipOutputStream = new ZipOutputStream(outputStream))
            {
                zipOutputStream.CompressionLevel = CompressionLevel.BestCompression;
                foreach (var file in files)
                {
                    zipOutputStream.PutNextEntry(file.Name);
                    using (var stream = File.OpenRead(file.FullName))
                    {
                        await stream.CopyToAsync(zipOutputStream);
                    }
                }
            }
        });

        var response = new HttpResponseMessage(HttpStatusCode.OK)
        {
            Content = pushStreamContent
        };
        response.Content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
        response.Content.Headers.ContentDisposition =
            new ContentDispositionHeaderValue("attachment") { FileName = "MyZipfile.zip" };
        return response;
    }
}
Now, I have to integrate this in an existing website, which is configured to use Microsoft.Owin.OwinMiddleware. I used the same code as pasted above, but now the behavior is different: during the creation of the zip file, it's not streamed to the response, but only downloaded when the creation of the zip has finished. So the user doesn't see any progress during the creation of the file.
I also tried a different approach in my Web API + OWIN project, as described here: (generate a Zip file from Azure blob storage files).
In an empty ASP.NET MVC project (without OWIN middleware), this works exactly as expected, but when OWIN is involved, I get this HttpException and stack trace:
System.Web.HttpException: 'Server cannot set status after HTTP headers have been sent.'
System.Web.dll!System.Web.HttpResponse.StatusCode.set(int value) Unknown
System.Web.dll!System.Web.HttpResponseWrapper.StatusCode.set(int value) Unknown
Microsoft.Owin.Host.SystemWeb.dll!Microsoft.Owin.Host.SystemWeb.OwinCallContext.Microsoft.Owin.Host.SystemWeb.CallEnvironment.AspNetDictionary.IPropertySource.SetResponseStatusCode(int value) Unknown
It seems that OWIN wants to set a response status, although that was already done in my Get() method (HttpResponseMessage(HttpStatusCode.OK)).
Any suggestions how to fix this or ideas for a different approach?
Thanks a lot!

Implementing 3D Secure with Sitefinity Ecommerce

I want to add 3D Secure authentication to credit card payments taken through a website. I am using Sitefinity 8, the e-commerce plug-in and SagePay as the payment processor.
I have created a custom payment provider and can successfully redirect users to the 3D Secure page. I am able to perform the second authentication call to SagePay using the SagePay integration kit (i.e. externally from the e-commerce plugin). However, I am struggling to find a way to complete the payment due to the way the internal e-commerce classes function.
The difficulty is that the order processor treats the payment as declined if 3D Secure authentication is required, but there does not seem to be a way to process the order correctly without using the inbuilt functionality. From my inspection of the e-commerce libraries, there seems to be no way to extend or modify these classes due to internal modifiers and concrete implementations.
How can I process the order once I have completed authentication? Has anyone successfully implemented 3D Secure with e-commerce? Or know if it is possible?
This is my custom payment provider at the moment.
public class CustomSagePayProvider : SagePayProvider
{
    // Rest of code...

    protected override IPaymentResponse ParseReponse(string uniqueTransactionCode, string responseXml)
    {
        var paymentResponse = base.ParseReponse(uniqueTransactionCode, responseXml);
        if (Requires3DSecure(paymentResponse))
        {
            var responseFields = GetResponseAsDictionary(responseXml);
            Set3DSecureFields(responseFields, paymentResponse);
        }
        return paymentResponse;
    }

    private bool Requires3DSecure(IPaymentResponse paymentResponse)
    {
        return paymentResponse.GatewayCSCResponse == "OK";
    }

    private void Set3DSecureFields(Dictionary<string, string> responseFields, IPaymentResponse paymentResponse)
    {
        var postValues = new NameValueCollection();
        postValues.Add("MD", responseFields.ContainsKey("MD") ? responseFields["MD"] : string.Empty);
        postValues.Add("PAReq", responseFields.ContainsKey("PAReq") ? responseFields["PAReq"] : string.Empty);
        paymentResponse.GatewayRedirectUrlPostValues = postValues;
        paymentResponse.GatewayRedirectUrl = responseFields.ContainsKey("ACSURL") ? responseFields["ACSURL"] : string.Empty;
    }
}
And this is the 3D Secure payment process using the .NET SagePay integration kit:
using SagePay.IntegrationKit;
using SagePay.IntegrationKit.Messages;

// Rest of code

var sagePay = new SagePayIntegration();
IThreeDAuthRequest request = new DataObject();
request.Md = Request.Form["MD"];
request.PaRes = Request.Form["PaRes"];
sagePay.RequestQueryString = sagePay.BuildQueryString(request, ProtocolMessage.THREE_D_AUTH_REQUEST, ProtocolVersion.V_223);
sagePay.ResponseQueryString = sagePay.ProcessWebRequestToSagePay("https://test.sagepay.com/gateway/service/direct3dcallback.vsp", sagePay.RequestQueryString);
var result = sagePay.GetDirectPaymentResult(sagePay.ResponseQueryString);
if (result.Status == ResponseStatus.OK)
{
    // Process order
}
I was able to add 3D Secure authentication by treating the second authentication call as an offsite payment and implementing the IOffsitePaymentProcessorProvider interface on my payment provider class:
public class CustomSagePayProvider : SagePayProvider, IOffsitePaymentProcessorProvider
{
    // Triggered after payments that have been 3D Secure authenticated
    public IPaymentResponse HandleOffsiteNotification(int orderNumber, HttpRequest request, PaymentMethod paymentMethod)
    {
        var paymentResponse = new PaymentResponse() { IsOffsitePayment = true };
        var sagePay = new SagePayIntegration();
        var result = sagePay.GetDirectPaymentResult(request.Params.ToString());
        if (result.ThreeDSecureStatus == ThreeDSecureStatus.OK)
        {
            paymentResponse.IsSuccess = true;
            paymentResponse.GatewayTransactionID = result.TxAuthNo.ToString();
        }
        return paymentResponse;
    }

    public IPaymentResponse HandleOffsiteReturn(int orderNumber, HttpRequest request, PaymentMethod paymentMethod)
    {
        throw new NotImplementedException();
    }
}
I pass the notification URL as a query string parameter in the termUrl value posted to SagePay when first requesting the authentication.
(The URL must be /Ecommerce/offsite-payment-notification/ to use the inbuilt offsite payment notification handler.)
var notificationUrl = request.Url.GetLeftPart(UriPartial.Authority) + "/Ecommerce/offsite-payment-notification/";
In the callback from SagePay after the user completes authentication, I use the SagePay integration kit to process the result of the authentication.
var sagePay = new SagePayIntegration();
IThreeDAuthRequest request = new DataObject();
request.Md = md;
request.PaRes = paRes;
sagePay.RequestQueryString = sagePay.BuildQueryString(request, ProtocolMessage.THREE_D_AUTH_REQUEST, ProtocolVersion.V_223);
sagePay.ResponseQueryString = sagePay.ProcessWebRequestToSagePay("https://test.sagepay.com/gateway/service/direct3dcallback.vsp", sagePay.RequestQueryString);
return sagePay.GetDirectPaymentResult(sagePay.ResponseQueryString);
Finally, I trigger the HandleOffsiteNotification event by posting to the URL www.mysite.com/Ecommerce/offsite-payment-notification/. This marks the order as complete, updates stock levels and cleans up the user's basket. For simplicity in this example, I am using the SagePay integration kit object to build the query string and post to the URL.
var sagePay = new SagePayIntegration();
var ordersManager = OrdersManager.GetManager();
var query = sagePay.ConvertSagePayMessageToNameValueCollection(ProtocolMessage.DIRECT_PAYMENT_RESULT, typeof(IDirectPaymentResult), result, ProtocolVersion.V_223);

// Required Sitefinity fields to trigger HandleOffsiteNotification in IOffsitePaymentProcessorProvider
query.Add("invoice", orderNumber.ToString());
query.Add("provider", ordersManager.Provider.Name);
var queryString = sagePay.BuildQueryString(query);

// Post the 3D Secure details to this site to simulate an offsite payment processor response
sagePay.ProcessWebRequestToSagePay(notificationUrl, queryString);
