Embed user-specific data into an Authenticode-signed installer on download - C#

I have a Windows Forms application installed using InnoSetup that my users download from my website. They install this software onto multiple PCs.
The application talks to a Web API which must be able to identify the user. I am creating a web application where the user can log in and download the app. I would like to embed a universally unique ID into the installer so that they do not have to log in again after installation. I want them to download and run setup.exe, and have the application take care of itself.
I am considering a couple of options:
Embed a user-specific UUID into setup.exe and perform code-signing on-demand on the web server. Downside: not sure how to do this?
Embed a user-specific UUID into the name of the installer file (e.g. setup_08adfb12_2712_4f1e_8630_e202da352657.exe). Downside: this is not pretty and would fail if the installer is renamed
Wrap the installer and a settings file containing the UUID into a self-extracting zip
How can I embed user-specific data into a signed executable on the web server?

The entire PE is not signed. You can embed data into a signed PE by adding it to the signature table. This method is used by Webex and other tools to provide the one-click meeting utilities.
Technically, the PKCS#7 signature has a list of attributes that are specifically designated as unauthenticated, which could be used, but I know of no easy way to write to these fields without a full PE parser. Luckily, we already have signtool, and adding an additional signature to an already-signed file is a non-destructive operation that uses the unauthenticated fields.
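For reference, appending rather than replacing is what signtool's /as ("append signature") switch does. A minimal sketch of driving it from C# - the certificate name, variables, and exact arguments here are illustrative, not the demo's actual code:
using System.Diagnostics;

// Sketch: append an additional signature using signtool's /as switch.
// "Stamp Cert" is a hypothetical ephemeral certificate in the user's MY store;
// signToolPath and tempFile are placeholders.
var psi = new ProcessStartInfo
{
    FileName = signToolPath,
    Arguments = $"sign /as /fd SHA256 /n \"Stamp Cert\" \"{tempFile}\"",
    UseShellExecute = false,
    CreateNoWindow = true
};
using (var signTool = Process.Start(psi))
{
    signTool.WaitForExit();
    if (signTool.ExitCode != 0)
        throw new InvalidOperationException("signtool failed to append the signature");
}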
I put together a demo which uses this technique to pass data from an MVC website to a downloadable windows forms executable.
The procedure is to:
Start with the Authenticode-signed and timestamped exe produced by your standard build process (it must be able to run without dependencies - ILMerge or similar)
Copy the unstamped exe to a temp file
Create an ephemeral code signing certificate which includes the auxiliary data as an X509 extension (see the sketch after this list)
Use signtool to add the auxiliary signature to the temp file
Return the temp file to the client, delete it after the download completes
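A minimal sketch of the ephemeral-certificate step, assuming the CertificateRequest API (.NET Framework 4.7.2+ / .NET Core); the demo may well implement this differently:
using System;
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;

// Sketch: create a short-lived self-signed certificate carrying the
// payload as a custom, non-critical X509 extension.
static X509Certificate2 CreateStampCertificate(string subject, string oid, byte[] payload)
{
    using (var rsa = RSA.Create(2048))
    {
        var request = new CertificateRequest(
            "CN=" + subject, rsa, HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1);

        // The auxiliary data rides along under a known OID
        request.CertificateExtensions.Add(
            new X509Extension(new Oid(oid), payload, critical: false));

        // The returned certificate has its own copy of the private key
        return request.CreateSelfSigned(
            DateTimeOffset.UtcNow.AddMinutes(-5), DateTimeOffset.UtcNow.AddDays(1));
    }
}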
On the client side, the app:
Reads the signing certificates from the currently executing exe
Finds the certificate with a known subject name
Finds the extension with a known OID
Alters its behavior based on the data contained in the extension (the lookup is sketched below)
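In sketch form, the last three steps amount to the following, assuming the signer certificates have already been enumerated from the exe (the demo's StampReader does that part with native Win32 calls):
using System.Collections.Generic;
using System.Security.Cryptography.X509Certificates;

// Hypothetical helper: scan the extracted signer certificates for the stamp.
static byte[] FindStamp(IEnumerable<X509Certificate2> signerCerts,
                        string knownSubject, string knownOid)
{
    foreach (var cert in signerCerts)
    {
        if (cert.Subject != knownSubject)
            continue;
        foreach (var extension in cert.Extensions)
        {
            if (extension.Oid != null && extension.Oid.Value == knownOid)
                return extension.RawData; // the embedded auxiliary data
        }
    }
    return null; // the demo throws StampNotFoundException instead
}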
The process has a number of advantages:
No monkeying with the PE layout
The publicly trusted code signing certificate can stay offline (or even in an HSM), only ephemeral certificates are used on the web server
No outbound traffic is generated from the web server (as would otherwise be required if timestamping were performed)
Fast (<50ms for a 1MB exe)
Can be run from within IIS
Usage
Client side retrieval of data (Demo Application\MainForm.cs)
try
{
    // Read the stamp from the currently executing exe
    var thisPath = Assembly.GetExecutingAssembly().Location;
    var stampData = StampReader.ReadStampFromFile(thisPath, StampConstants.StampSubject, StampConstants.StampOid);
    var stampText = Encoding.UTF8.GetString(stampData);
    lbStamped.Text = stampText;
}
catch (StampNotFoundException ex)
{
    MessageBox.Show(this, $"Could not locate stamp\r\n\r\n{ex.Message}", Text);
}
Server side stamping (Demo Website\Controllers\HomeController.cs)
// IOFile is an alias for System.IO.File (using IOFile = System.IO.File;)
var stampText = $"Server time is currently {DateTime.Now} at time of stamping";
var stampData = Encoding.UTF8.GetBytes(stampText);
var sourceFile = Server.MapPath("~/Content/Demo Application.exe");
var signToolPath = Server.MapPath("~/App_Data/signtool.exe");
var tempFile = Path.GetTempFileName();
bool deleteStreamOpened = false;
try
{
    IOFile.Copy(sourceFile, tempFile, true);
    StampWriter.StampFile(tempFile, signToolPath, StampConstants.StampSubject, StampConstants.StampOid, stampData);
    // DeleteOnClose removes the temp file once the download stream is closed
    var deleteOnClose = new FileStream(tempFile, FileMode.Open, FileAccess.Read, FileShare.Read | FileShare.Delete, 4096, FileOptions.DeleteOnClose);
    deleteStreamOpened = true;
    return File(deleteOnClose, "application/octet-stream", "Demo Application.exe");
}
finally
{
    if (!deleteStreamOpened)
    {
        try
        {
            IOFile.Delete(tempFile);
        }
        catch
        {
            // no-op, opportunistic cleanup
            Debug.WriteLine("Failed to cleanup file");
        }
    }
}

Related

OfficeConverter issue when deploying to Azure App Service

I have a web API which simply:
clones a .docx file
converts that cloned .docx to .pdf format
using DocumentFormat.OpenXml.Packaging;

[HttpPost("clone")]
public IActionResult CloneBillFromTemplate()
{
    var templateFilePath = System.IO.Path.Combine(Directory.GetCurrentDirectory(), "Bill", "PaymentTempl.docx");
    var clonedFilePath = System.IO.Path.Combine(Directory.GetCurrentDirectory(), "Bill", "ClonedBill.docx");
    var pdfFilePath = System.IO.Path.Combine(Directory.GetCurrentDirectory(), "Bill", "FinalBill.pdf");

    using (WordprocessingDocument wordDoc = WordprocessingDocument.Open(templateFilePath, true))
    {
        var clonedDoc = wordDoc.Clone(clonedFilePath);
        System.Threading.Thread.Sleep(500);
        clonedDoc.Save();
        clonedDoc.Close();
    }

    using (var converter = new OfficeConverter.Converter())
    {
        converter.Convert(clonedFilePath, pdfFilePath);
    }

    return Ok();
}
Everything works fine when debugging (for sure :3) and also on IIS.
But when I deploy to Azure App Service, I get this error (stack trace + exception message):
Could not read registry to check Word version
Could not find registry key Word.Application\CurVer
at OfficeConverter.Word..ctor()
at OfficeConverter.Converter.get_Word()
at OfficeConverter.Converter.Convert(String inputFile, String outputFile, Stream logStream) at ....
Could you guys help me on this? Thanks all!!!
Feel free to ask for any more information you need to diagnose this issue.
Update
Looks like this is an issue with the PDF converter package I'm using, not the OpenXML one.
It seems your application is relying on the Windows registry in a way that is not supported. If you are running on a Linux App Service, that would be the first thing to swap out, though my guess is that you are already running on Windows.
Apps have read-only access to much (though not all) of the registry of the virtual machine they are running on. In practice, this means registry keys that allow read-only access to the local Users group are accessible by apps. One area of the registry that is currently not supported for either read or write access is the HKEY_CURRENT_USER hive.
Write-access to the registry is blocked, including access to any per-user registry keys.
https://learn.microsoft.com/en-us/azure/app-service/operating-system-functionality#registry-access
If you can't refactor your code to not rely on such dependencies, I would suggest you put your application inside a Windows docker container. If you can run that locally, it should run on App Service as well.
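If you do stay on a Windows App Service plan, you can at least fail fast with a clearer message. A hedged sketch probing the same registry key the stack trace complains about (illustrative only - the sandbox has no Word installation, so conversion cannot work there regardless):
using Microsoft.Win32;

// Probe the key from the stack trace before attempting conversion.
// On Azure App Service this comes back null.
using (var wordKey = Registry.ClassesRoot.OpenSubKey(@"Word.Application\CurVer"))
{
    if (wordKey == null)
        throw new NotSupportedException(
            "Word is not installed/registered on this host; OfficeConverter cannot run here.");
}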

CommonApplicationData folder read-only after using MSIX Packaging Tool

I have written a .NET Windows Forms application that uses the common application data folder to store log files and user accounts. The application is distributed using an InstallShield project and runs perfectly on all different Windows versions.
Some parts of the code from different files are shown below.
// Defining the path to use (in ProductInfo class)
public static string CommonApplicationDataPath
{
    get
    {
        string path = Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData);
        path = StringHelper.EnsureEndsWithSlash(path);
        path += Vendor + @"\" + ProductName + @"\";
        return path;
    }
}

// Configuring the logger and user manager instances at startup
Logger.Configure(string.Empty, ProductInfo.Password, ProductInfo.CommonApplicationDataPath);
UserManager.Configure(User.Empty, ProductInfo.Password, ProductInfo.CommonApplicationDataPath,
    ProductInfo.UserLimitCount);

// Example method for saving the users to file (in UserManager class)
public bool SaveUsers(AppUsers appUsers)
{
    AppUsersSerializer serializer = new AppUsersSerializer(_password, _fileName);
    if (serializer.Serialize(appUsers) == true)
    {
        return true;
    }
    else
    {
        Logger.Instance.Log(Logs.ErrorB.UserSave, _fileName);
        return false;
    }
}
I would now like to publish the application via the Windows Store and have used the MSIX Packaging Tool. To sign the package I have created a self-signed certificate and added it to the Trusted Root Certification Authorities. The .msix package is installed on the same PC as my old desktop version of the app.
The problem I have is that the application is not able to write to the files located in the CommonApplicationData folder. The application can read and load the data, but not update and write the changes to the files. Thus, the path to the files is correct, but some write permission seems to be missing. I have tried different capabilities on the package and even ticked all, but without any effect.
I have also browsed to the C:\Program Files\WindowsApps\<my app package>\ folder, checked the structure of the application, and located the files. They are there, but only readable for the app. Removing the files does not cause new ones to be created when they should be added, as happens in the old desktop Windows Forms version.
The application is quite big and contains lots of functionality which runs great in the Windows Store app context. The only missing piece is the above mentioned issues with the file writing.
Any advice would be really appreciated.
After some continued searching on different websites I came across a viable solution for my issue.
A Microsoft MSDN blog post describes how to use the different folders in an appropriate way:
http://blogs.msdn.microsoft.com/appconsult/2017/03/06/handling-data-in-a-converted-desktop-app-with-the-desktop-bridge/
The proposed solution is to change:
string path = Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData);
to:
string path = Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData);
It will place the files in the user's local directory, meaning the data will be available only to the current user. Sharing the same log file and user accounts between different users of the application will thus not be possible, but that is OK for now.
You may also need to make sure that the folder exists:
C:\Users\<user>\AppData\Local\<vendor>\<product>, because it might not always be created during installation of your application; it depends on whether the application has user-specific settings or not.
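A minimal sketch of both changes together (Vendor and ProductName are the question's own ProductInfo values):
using System;
using System.IO;

// Build the per-user path and make sure it exists before first use.
string path = Path.Combine(
    Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
    Vendor, ProductName);
Directory.CreateDirectory(path); // no-op if the folder already exists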
If you have converted your app to a Store app, we can regard it as a UWP app. In general, we store user info with the LocalSettings class, which keeps the data across app updates.
var localSettings = Windows.Storage.ApplicationData.Current.LocalSettings;

// Create a simple setting.
localSettings.Values["exampleSetting"] = "Hello Windows";

// Read data from a simple setting.
Object value = localSettings.Values["exampleSetting"];
if (value == null)
{
    // No data.
}
else
{
    // Access data in value.
}

// Delete a simple setting.
localSettings.Values.Remove("exampleSetting");
For more detail, please refer to Store and retrieve settings and other app data

Get modified files in FTP folder

I have a local folder which contains files and directories (>2000 files).
I uploaded this entire folder to my FTP.
Now for example let's say my FTP folder is called FTPFolder and my local folder is called LOCALFolder. These two folders are exactly the same for now.
And let's say both folders contain a file called test.txt.
Now what I would like to do:
If I change test.txt on the FTP, how could I detect that in C#?
Getting all local files and all FTP files and then comparing them just takes too long. Has anyone got another way of doing this?
Basically the goal is to download all files on the FTP which differ from their local counterparts.
The usual approach to synchronizing a local folder with an FTP folder is to compare file modification times.
Assuming that you are using the FtpWebRequest .NET class, this is actually not trivial to implement, as it has no standard way to retrieve file modification times as part of a directory listing.
See Retrieving creation date of file (FTP).
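FtpWebRequest can at least ask for one file's timestamp at a time (the FTP MDTM command), which works but costs a round trip per file - slow with >2000 files. A sketch, with host and credentials illustrative:
using System;
using System.Net;

// Fetch the remote modification time of a single file via MDTM.
var request = (FtpWebRequest)WebRequest.Create("ftp://example.com/FTPFolder/test.txt");
request.Method = WebRequestMethods.Ftp.GetDateTimestamp;
request.Credentials = new NetworkCredential("user", "mypassword");
using (var response = (FtpWebResponse)request.GetResponse())
{
    DateTime remoteModified = response.LastModified;
}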
Even better would be to use a file checksum, but that's hardly possible.
See FTP: copy, check integrity and delete.
It would be way easier to use a third-party FTP client library that has better support for retrieving the modification time, and even easier if you use an FTP client library that supports synchronization out of the box.
For example, with the WinSCP .NET assembly you can use the Session.SynchronizeDirectories method:
// Set up session options
SessionOptions sessionOptions = new SessionOptions
{
    Protocol = Protocol.Ftp,
    HostName = "example.com",
    UserName = "user",
    Password = "mypassword",
};

using (Session session = new Session())
{
    // Connect
    session.Open(sessionOptions);

    // Synchronize files
    session.SynchronizeDirectories(
        SynchronizationMode.Local, @"C:\local\path", "/remote/path", false).Check();
}
WinSCP GUI can generate a code template for you.
(I'm the author of WinSCP)
Here are the possible options:
Check a checksum - the best and fastest way, but checksums might not be available for the files in the first place.
Check the size of the file and its timestamp (see the sketch below). Not ideal, but it might work.
Other than that, I don't think there is anything you can do, and it's not a C# issue per se.
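A rough sketch of the size-and-timestamp option for a single file; remoteSize and remoteModified are assumed to come from the FTP side, however you obtain them:
using System;
using System.IO;

// Hypothetical values obtained from the FTP listing / MDTM:
long remoteSize = 0;
DateTime remoteModified = DateTime.MinValue;

var local = new FileInfo(@"C:\LOCALFolder\test.txt");
bool needsDownload = !local.Exists
    || local.Length != remoteSize
    || local.LastWriteTimeUtc < remoteModified.ToUniversalTime();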

Efficiently pass files from web server to file server

I have multiple web servers and one central file server inside my data center, and all my web servers store user-uploaded files on the central internal file server.
I would like to know what is the best way to pass the files from the web server to the file server in this case.
As suggested, I'll try to add more details to the question.
The solution I came up with was: after receiving files from the user at the web server, just do an HTTP POST to the file server. But I think there is something wrong with this, because it causes large files to be entirely loaded into memory twice (once at the web server and once at the file server).
Is your file server just another Windows/Linux server, or is it a NAS device? I can suggest a number of approaches based on your requirement. The question is why you would want to use the HTTP protocol when you have much better ways to transfer files between servers.
HTTP is best when you send text data, as HTTP itself is text-based. From the client side to the server side, HTTP is used because that is the only option browsers give you. But between your servers (I'm assuming you are using Windows, as the question is tagged IIS), I feel you should use the SMB protocol to move data. It will be orders of magnitude faster and much more efficient to transfer the same data over SMB vs HTTP.
And with the SMB protocol, you do not have to write any code or complex scripts to do this. As provided by one of the answers above, you can just issue a simple copy command and it will happen for you.
So, just summarizing the options for you (based on my preference):
1. Let the files get uploaded to some location on each IIS web server, e.g. C:\temp\UploadedFiles. You can write a simple 2-3 line PowerShell script which will copy the files from C:\temp\UploadedFiles to \\FileServer\Files\UserID\<FILEGUID>\uploaded.file. This same PowerShell script can delete the file once it has been moved to the other server successfully.
E.g. the script can be this simple and is easy to set up as a Windows scheduled task:
$Source = "C:\temp\UploadedFiles"   # source folder on the web server
$Destination = "\\FileServer\Files\UserID\<FILEGUID>\"
New-Item -ItemType directory -Path $Destination -Force
Copy-Item -Path $Source\*.* -Destination $Destination -Force
This script can be modified to suit your needs, e.g. to delete the files once they have been copied :)
2. In the ASP.NET application, you can save the file directly to the network location, i.e. give the network path itself in the SaveAs call. You have to make sure this network share is accessible to the IIS worker process and that it has write permission. Also, in my understanding ASP.NET saves the file to a temporary location first (you do not have control over this if you are using the ASP.NET HttpPostedFileBase or FormCollection). More details here.
You can even run this asynchronously so that your requests will not be blocked (a sketch follows the link below):
if (FileUpload1.HasFile)
{
    // Call to save the file.
    FileUpload1.SaveAs(@"\\networkshare\filename");
}
https://msdn.microsoft.com/en-us/library/system.web.ui.webcontrols.fileupload.saveas(v=vs.110).aspx
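A hedged sketch of the asynchronous variant, written as an MVC action with HttpPostedFileBase (the snippet above is WebForms; the UNC path is illustrative):
using System.IO;
using System.Threading.Tasks;
using System.Web;
using System.Web.Mvc;

public class UploadController : Controller
{
    [HttpPost]
    public async Task<ActionResult> Upload(HttpPostedFileBase file)
    {
        // Stream the upload straight to the share without blocking the request thread
        var path = Path.Combine(@"\\networkshare\uploads", Path.GetFileName(file.FileName));
        using (var target = new FileStream(path, FileMode.Create, FileAccess.Write,
                                           FileShare.None, 4096, useAsync: true))
        {
            await file.InputStream.CopyToAsync(target);
        }
        return new HttpStatusCodeResult(200);
    }
}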
3. Save the file the current way to a local directory and then use HTTP POST. This is the worst design possible, as you first read the contents and then transfer them chunked to the other server, where you have to set up another web service which receives the file. Then you have to read the file from the request stream and save it again to your location. I am not sure you need to do this.
Let me know if you need more details on any of the listed methods.
Or you just write it to a folder on the web servers, and create a scheduled task that moves the files to the file server every x minutes (e.g. via robocopy, whose /MOV switch moves files rather than copying them). This also makes sure your web servers are not reliant on your file server.
Assuming that you have an HttpPostedFileBase, the best way is just to call the .SaveAs() method.
You need the UNC path to the file server and that is it. The simplest version would look something like this:
public void SaveFile(HttpPostedFileBase inputFile) {
    var saveDirectory = @"\\fileshare\application\directory";
    var savePath = Path.Combine(saveDirectory, inputFile.FileName);
    inputFile.SaveAs(savePath);
}
However, this is simplistic in the extreme. Take a look at the OWASP Guidance on Unrestricted File Uploads. File uploads can be the source of many vulnerabilities in your application.
You also need to make sure that the web application has access to the file share. Take a look at this answer
Creating a file on network location in asp.net
for more info. Generally the best solution is to run the application pool with a special identity which is only used to access the folder.
The solution I came up with was: after receiving files from the user at the web server, just do an HTTP POST to the file server. But I think there is something wrong with this, because it causes large files to be entirely loaded into memory twice (once at the web server and once at the file server).
I would suggest not posting the file all at once - it is then fully in memory, which is not needed.
You could post the file in chunks using ajax. When a chunk arrives at your server, just append it to the file.
With the File Reader API, you can read the file in chunks in JavaScript.
Something like this (see a matching server-side sketch after the code):
/** upload file in chunks */
function upload(file) {
    var chunkSize = 8000;
    var start = 0;
    while (start < file.size) {
        var end = Math.min(start + chunkSize, file.size);
        var chunk = file.slice(start, end);
        var xhr = new XMLHttpRequest();
        xhr.onload = function () {
            // check whether all chunks have arrived; send the filename with the first/last request.
        };
        xhr.open("POST", "/FileUpload", true);
        xhr.send(chunk);
        start = end;
    }
}
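On the receiving side, each POST body can simply be appended to the target file. A hypothetical MVC action matching the /FileUpload URL above (the file name would have to travel with each request, e.g. as a query-string parameter, and out-of-order chunks are not handled here):
using System.IO;
using System.Web.Mvc;

public class FileUploadController : Controller
{
    [HttpPost]
    public ActionResult Index(string fileName)
    {
        // Append the raw request body to the file as chunks arrive
        var path = Path.Combine(Server.MapPath("~/App_Data"), Path.GetFileName(fileName));
        using (var output = new FileStream(path, FileMode.Append, FileAccess.Write))
        {
            Request.InputStream.CopyTo(output);
        }
        return new HttpStatusCodeResult(200);
    }
}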
It can be implemented in different ways. If you are storing files on the file server as files in the file system, and all of your servers are inside the same virtual network, then it is better to create a shared folder on your file server and, once files are received at the web server, save them directly into that shared folder on the file server.
Here are the instructions for creating shared folders: https://technet.microsoft.com/en-us/library/cc770880(v=ws.11).aspx
Just map a drive
I take it you have a means of saving the uploaded file on the web server's local filesystem. The question pertains to moving the file from the web server (which is probably one of many load-balanced nodes) to a central file system all web servers can access.
The solution to this is remarkably simple.
Let's say you are currently saving the files to some folder, say c:\uploadedfiles. The path to uploadedfiles is stored in your web.config.
Take the following steps:
Sign on as the service account under which your web site executes
Map a persistent network drive to the desired location, e.g. from the command line:
NET USE f: \\MyFileServer\MyFileShare /user:SomeUserName password
Modify your web.config and change c:\uploadedfiles to f:\
Ta da, all done.
Just make sure the drive mapping is persistent, and make sure you use a user with adequate permissions, and voila.

Installing trust anchors or certificates from UWP app

I am working on a key management application for the Universal Windows Platform and would like to install CA certificates and trust anchors that can be used by system apps and third-party apps. I have tried using a combination of CertificateStores.GetStoreByName and CertificateStore.Add, as well as a call accessed via P/Invoke to CertAddEncodedCertificateToStore. Unfortunately, in both cases the calls succeed, but the certificates are not visible using MMC and they do not appear to be used by other applications.
Is there a means of installing certificates such that they are usable system-wide (including outside the app container)? Is there any means of viewing what certificates have been installed within an app container?
By default, no. Please check the introduction to certificates article.
Shared certificate stores
UWP apps use the isolated application model introduced in Windows 8. In this model, apps run in a low-level operating system construct, called an app container, that prohibits the app from accessing resources or files outside of itself unless explicitly permitted to do so. The following sections describe the implications this has on public key infrastructure (PKI).
Certificate storage per app container
Certificates that are intended for use in a specific app container are stored in per-user, per-app-container locations. An app running in an app container has write access to only its own certificate storage. If the application adds certificates to any of its stores, these certificates cannot be read by other apps. If an app is uninstalled, any certificates specific to it are also removed. An app also has read access to local machine certificate stores other than the MY and REQUEST stores.
Anyway, you can add a capability to your application in Package.appxmanifest. The sharedUserCertificates capability grants an app container read access to the certificates and keys contained in the user MY store and the Smart Card Trusted Roots store.
<Capabilities>
    <uap:Capability Name="sharedUserCertificates" />
</Capabilities>
I just added it for testing purposes (UWP application) and the following code works fine. The certificate is added to the user MY store.
string pfxCertificate = null;
string pfxPassword = "";

FileOpenPicker filePicker = new FileOpenPicker();
filePicker.FileTypeFilter.Add(".pfx");
filePicker.CommitButtonText = "Open";

try
{
    StorageFile file = await filePicker.PickSingleFileAsync();
    if (file != null)
    {
        // file was picked and is available for read
        // try to read the file content
        IBuffer buffer = await FileIO.ReadBufferAsync(file);
        using (DataReader dataReader = DataReader.FromBuffer(buffer))
        {
            byte[] bytes = new byte[buffer.Length];
            dataReader.ReadBytes(bytes);
            // convert to Base64 for using with ImportPfx
            pfxCertificate = System.Convert.ToBase64String(bytes);
        }

        await CertificateEnrollmentManager.UserCertificateEnrollmentManager.ImportPfxDataAsync(
            pfxCertificate,
            pfxPassword,
            ExportOption.NotExportable,
            KeyProtectionLevel.NoConsent,
            InstallOptions.None,
            "Test");
    }
}
catch (Exception ex)
{
    Debug.WriteLine(ex.Message);
}
A sample for 8.1 is available if it helps: Cryptography and Certificate sample
