I would like to back up data from client computers with PowerShell. To do so I create a ZIP archive of the requested data, which basically is no problem at all. But when it comes to special characters that use Unicode, or when users build very long path names, I run into trouble. I have tried different things but haven't found a solution yet. My client computers run Windows 10 (build 1511). Installing the Anniversary Update (build 1607) isn't an option due to other dependencies, and neither is using 3rd-party software to create the ZIP files.
Below are three methods I have found and tried already. They all share the same problem: as soon as they hit long path names they either stop execution immediately or silently skip the rest of the folder structure.
1. Create a ZIP file with the PowerShell v5 cmdlet Compress-Archive
$Target = "C:\Temp\Test.zip"
$Source = "C:\Test"
Compress-Archive -Path $Source -DestinationPath $Target
2. Create a ZIP file with the .NET ZipFile class
Add-Type -assembly "system.io.compression.filesystem"
$Target = "C:\Temp\Test.zip"
$Source = "C:\Test"
[io.compression.zipfile]::CreateFromDirectory($Source, $Target);
3. Create a ZIP file with Windows Explorer (Compressed Folders)
$Source = Get-ChildItem "C:\Test" -Recurse
$Target = "C:\Temp\Test.zip"
if (-not (Test-Path $Target)) {
    # Create an empty ZIP file (the "PK" end-of-central-directory signature plus 18 zero bytes)
    Set-Content $Target ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
    (dir $Target).IsReadOnly = $false
}
$objShell = New-Object -Com Shell.Application
$objZIP = $objShell.NameSpace($Target)
foreach($File in $Source) {
    $objZIP.CopyHere($File.FullName)
    Start-Sleep -Milliseconds 500
}
Then I found out that it should be possible to access local drives via the UNC namespace, which would look like \\?\C:\Test. But this doesn't work on my Windows 10 build 1511; with build 1607 it does. What I don't understand is why. I tried installing the latest .NET version (4.6.2) on build 1511, but the problem still exists.
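For illustration, a minimal sketch of what that attempt looks like (same example paths as above; this is the form that fails for me on build 1511):
# Sketch only: pass the extended-length prefix straight to the .NET call
Add-Type -Assembly "System.IO.Compression.FileSystem"
$Source = '\\?\C:\Test'
$Target = '\\?\C:\Temp\Test.zip'
[IO.Compression.ZipFile]::CreateFromDirectory($Source, $Target)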
Can anybody help me with accessing (local) UNC namespaces or with creating the ZIP archives?
To access a UNC path, use the $ character to address the drive, like \\computername\c$\path\path\file.txt.
I use the .NET method because of its compatibility with PowerShell v2:
[IO.Compression.ZipFile]::CreateFromDirectory($SourcesFolder, $zipTempPath, $CompressionLevel, $False)
I use it with long UNC paths with no problems.
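Putting both points together, a minimal sketch of how that might look (the computer name and paths are illustrative):
# Sketch: zip a client folder addressed through the administrative share
Add-Type -Assembly "System.IO.Compression.FileSystem"
$SourcesFolder    = '\\CLIENT01\c$\Test'            # the client's local C:\Test, reached via the c$ admin share
$zipTempPath      = '\\CLIENT01\c$\Temp\Test.zip'   # where the archive is written
$CompressionLevel = [System.IO.Compression.CompressionLevel]::Optimal
[IO.Compression.ZipFile]::CreateFromDirectory($SourcesFolder, $zipTempPath, $CompressionLevel, $False)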
When I want to pack my library as a NuGet package, I get the below error
The DateTimeOffset specified cannot be converted into a Zip file timestamp
I'm using the below command to pack my project:
dotnet msbuild /t:pack /p:Configuration=Release /p:SourceLinkCreate=true
The problem is that some DLL files have dates that are invalid for a ZIP file (like 31/12/1979).
You can overcome this issue by updating the modification date of all the invalid DLL files.
Here's the PowerShell script that updates all the invalid DLLs.
Get-ChildItem -Path "C:\" -Recurse -File -Filter *.dll |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddYears(-20) } |
    ForEach-Object { try { $_.LastWriteTime = '01/01/2020 00:00:00' } catch {} }
It sets all the invalid DLL dates to 01/01/2020.
Change the -Path parameter for your computer.
My GitHub repositories are on my C drive, so I'm running this with -Path C:\.
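If you want to preview which files would be touched before changing anything, the same filter can be run read-only first (the path is illustrative):
# Read-only check: list DLLs whose timestamp is too old for the ZIP format, without modifying them
Get-ChildItem -Path "C:\" -Recurse -File -Filter *.dll |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddYears(-20) } |
    Select-Object FullName, LastWriteTime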
It seems that in most cases (based on research in GitHub issues) the problem was related to Microsoft.Extensions.* packages. In my case, updating Microsoft.Extensions.* to a newer version (from 3.1.0 to 3.1.4) fixed the problem. For reference:
https://github.com/dotnet/extensions/issues/2750
and, as mentioned in the comments:
https://github.com/NuGet/Home/issues/7001
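For example, the version bump can be done with the dotnet CLI; the project and package names below are just placeholders for whichever Microsoft.Extensions.* packages your project actually references:
# Update one Microsoft.Extensions.* package to 3.1.4 (repeat for each referenced package)
dotnet add .\MyLibrary.csproj package Microsoft.Extensions.Logging --version 3.1.4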
I made a cross-platform console app to fix invalid dates on a drive. It sets the LastModificationDate to 01/01/2000. You can run it without any arguments and it will scan all your drives, or you can specify a directory to search in.
Source-code on GitHub:
https://github.com/ebicoglu/FileBulkDateChanger
Usage:
FileBulkDateChanger.exe
or
FileBulkDateChanger.exe C:\
For macOS/Linux:
dotnet FileBulkDateChanger.dll
Run this tool and forget about this issue :)
Is it possible to transform **/*.tt files into *.cs files using an Azure DevOps pipeline?
Otherwise, is there a CLI command available for .NET Core using TextTransform?
I already tested T5.TextTransform.Tool, but it doesn't work (and is deprecated).
Thanks for your help.
How I solved this problem using a DevOps pipeline + script:
As mentioned by @Leo Liu-MSFT, install dotnet-t4
Install it globally (-g)
Create a PowerShell script that finds the .tt files
Search all *.tt files and convert them with the t4 command
Get-ChildItem -Path .\ -Filter *.tt -Recurse -File -Name | ForEach-Object {
    $file = [System.IO.Path]::GetFileName($_);
    $directory = [System.IO.Path]::GetDirectoryName($_)
    "Converting file: " + $file
    t4 "$directory\$file" -I="$directory"
}
NOTE: it is important to place the T4.ps1 file in the parent directory of your *.tt files.
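For reference, a minimal sketch of how the two steps might be wired into a pipeline's PowerShell task (assuming the script above is saved as T4.ps1 in the repository root):
# Install the t4 command as a global dotnet tool, then run the conversion script
dotnet tool install -g dotnet-t4
.\T4.ps1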
Is it possible to transform **/*.tt files into *.cs files using an Azure DevOps pipeline?
The answer is yes.
According to the status note on the T5.TextTransform.Tool package:
T5 was a stopgap measure for a time when Mono.TextTemplating was not available for .NET Core. Now that that is no longer the case, T5 is not needed and no longer being maintained. Use Mono.TextTemplating's dotnet-t4 instead.
So we could use Mono.TextTemplating instead of T5.TextTransform.Tool.
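For example, installing dotnet-t4 as a global tool and transforming a single template could look like this (the file names are illustrative):
# Install Mono.TextTemplating's command-line tool and transform one template into a .cs file
dotnet tool install -g dotnet-t4
t4 .\MyTemplate.tt -o .\MyTemplate.cs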
Besides, there is also an implementation of the TextTransform.exe command-line tool, so we could use the command line to transform the .tt file into a .cs file:
"%CommonProgramFiles%\Microsoft Shared\TextTemplating\1.2\texttransform.exe" -out %1.cs -P %2 -P "%ProgramFiles%\Reference Assemblies\Microsoft\Framework\v3.5" %1.tt
Check this thread for some more details.
Hope this helps.
I am trying to install the MATLAB runtime on a Docker image along with the project I'm working on. The project is an engine that runs a variety of measurements based on what is given to it, and many of these measurements use MATLAB. When I run the Docker container, though, I get an error that the "MWArray assembly failed to be initialized" or that a MATLAB DLL is missing.
I'm trying to run this in Docker for Windows due to a company requirement, and I have been unable to get the Dockerfile to recognize the MCR. Below is the code I've been playing with to get the MCR onto a Docker image.
FROM mcr.microsoft.com/dotnet/framework/runtime:4.7.2-windowsservercore-ltsc2019
ADD http://ssd.mathworks.com/supportfiles/downloads/R2017b/deployment_files/R2017b/installers/win64/MCR_R2017b_win64_installer.exe C:\\MCR_R2017b_win64_installer.zip
# Use PowerShell
SHELL ["powershell", "-Command", "$ErrorActionPreference = 'Stop'; $ProgressPreference = 'SilentlyContinue';"]
# Unpack ZIP contents to installation folder
RUN Expand-Archive C:\\MCR_R2017b_win64_installer.zip -DestinationPath C:\\MCR_INSTALLER
# Run the setup command for a non-interactive installation of MCR
RUN Start-Process C:\MCR_INSTALLER\bin\win64\setup.exe -ArgumentList '-mode silent', '-agreeToLicense yes' -Wait
# Remove ZIP and installation folder after setup is complete
RUN Remove-Item -Force -Recurse C:\\MCR_INSTALLER, C:\\MCR_R2017b_win64_installer.zip
WORKDIR /app
COPY /Project/bin/Debug/*.dll ./
COPY /Project/bin/Debug/Project.exe .
ENTRYPOINT ["C:\\app\\Project.exe"]
Edit: I think I've found a working solution, following the idea from the other answer about ltsc2019 not working with MATLAB R2017b. The code below has worked with R2017b inside a Docker container.
FROM mcr.microsoft.com/windows:1809
Windows Server 2019 is not supported by MATLAB R2017b, and support for it was not introduced until MATLAB R2019a.
For MATLAB R2017b you’ll need Windows Server 2016.
That’s not to say there may not be other issues as well.
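If in doubt, one way to check which Windows build a base image actually runs (so it can be matched against MATLAB's system requirements) is to query the registry from a PowerShell prompt inside the container:
# Show the Windows edition, release and build number of the running container
Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\Windows NT\CurrentVersion' |
    Select-Object ProductName, ReleaseId, CurrentBuildNumber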
I am running PowerShell Scripts from a C# Tool like this:
using (PowerShell pshell = PowerShell.Create())
{
    pshell.AddCommand(scriptFullPath);
    pshell.AddParameter("username", user);
    pshell.AddParameter("password", pass);

    PSDataCollection<PSObject> outputCollection = new PSDataCollection<PSObject>();
    PSInvocationSettings settings = new PSInvocationSettings();
    settings.ErrorActionPreference = ActionPreference.Stop;

    pshell.Invoke(null, outputCollection, settings);
}
Almost everything in the script works fine until I need special cmdlets from other assemblies. The Add-PSSnapin command always fails with:
Exception: The Windows PowerShell snap-in 'Microsoft.SharePoint.Powershell' is not installed on this computer.
Exception: Cannot bind parameter 'Path' to the target. Exception setting "Path": "Cannot find path 'D:\dev\tool\Microsoft.SharePoint.dll' because it does not exist."
when running
$snapin = Get-PSSnapin | Where-Object {$_.Name -eq "Microsoft.SharePoint.Powershell"}
if ($snapin -eq $null)
{
    Write-Host "Loading SharePoint Powershell Snapin"
    Add-PSSnapin "Microsoft.SharePoint.Powershell"
    Add-Type -Path "Microsoft.SharePoint.dll"
    Add-Type -Path "Microsoft.SharePoint.Runtime.dll"
}
Everything works fine when running the script directly in a PowerShell window, so I guess it has something to do with the PATH or scope not being forwarded from the C# tool. Playing around with the useLocalScope parameter of AddCommand and other parameters did not yield any results (although I am not sure this has anything to do with paths).
How can I make the script work and find the external assemblies?
The SharePoint PowerShell snap-in is available in 64-bit only. Your C# tool may be running as an x86 process, which would give you the "not installed" error. You may also have to run the program as Administrator, since some of the commands need that to work.
The second error occurs because, as you suspected, there is no PATH entry set for SharePoint by default. The workaround is to specify the full path to the DLL (changing the version number for your install), e.g.
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.dll"
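A combined sketch of both fixes as they might appear at the top of the script; the "15" hive in the path corresponds to SharePoint 2013 and is an assumption, so adjust it (and the list of DLLs) to your installation:
# Fail early if the hosting process is 32-bit (the SharePoint snap-in is 64-bit only)
if (-not [Environment]::Is64BitProcess) {
    throw "This script must run in a 64-bit process."
}
# Load the snap-in, then the assemblies by full path instead of relying on a PATH entry
if ($null -eq (Get-PSSnapin -Name "Microsoft.SharePoint.Powershell" -ErrorAction SilentlyContinue)) {
    Add-PSSnapin "Microsoft.SharePoint.Powershell"
}
$isapi = "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\ISAPI"
Add-Type -Path (Join-Path $isapi "Microsoft.SharePoint.dll")
Add-Type -Path (Join-Path $isapi "Microsoft.SharePoint.Runtime.dll")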
I have massive 500 GB index files on a Windows Server 2012 R2 machine which are constantly being updated by an application.
I have to zip the files using PowerShell, but when I try to zip them with the following snippet of code I get this exception:
Exception calling "CreateFromDirectory" with "4" argument(s): "The process cannot access the file 'E:\Program Files (x86)\Application Folder\File\Status.FCS' because it is being used
by another process."
$zip = "E:\Folder\File.zip"
$can = "E:\Program Files (x86)\Application Folder\File"
Import-Module AWSPowerShell
# functions
function ZipFiles( $zipfilename, $sourcedir )
{
    Add-Type -Assembly System.IO.Compression.FileSystem
    $compressionLevel = [System.IO.Compression.CompressionLevel]::Optimal
    [System.IO.Compression.ZipFile]::CreateFromDirectory($sourcedir, $zipfilename, $compressionLevel, $false)
}
ZipFiles $zip $can
I don't have any issues if I compress the files using the Windows GUI; the problem only happens when I use a PowerShell script. Also, if I stop the application services, the PowerShell script works fine (which I can't do in the prod environment).
One solution is to copy the folder with the 500 GB of index data and compress the copy (which should work), but I don't have enough disk space on the server to do so.
So is there any solution to compress the files while they are write-locked, using a PowerShell script?
You need to set a different directory for the zip file: [System.IO.Directory]::SetCurrentDirectory($inPath), where $inPath is the directory the archive should be written to.
So:
Function ZipFiles( $zipFileName, $pathTarget )
{
    Add-Type -Assembly System.IO.Compression.FileSystem
    # Output location: $inPath must hold the directory the archive should be written to
    [System.IO.Directory]::SetCurrentDirectory($inPath)
    $compressionLevel = [System.IO.Compression.CompressionLevel]::Optimal
    [System.IO.Compression.ZipFile]::CreateFromDirectory($pathTarget, $zipFileName, $compressionLevel, $false)
}
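A call with the paths from the question might then look like this; using E:\Folder as the working directory for the archive is an assumption:
# Illustrative usage: $inPath is the directory the function switches to before creating the archive
$inPath = "E:\Folder"
$zip    = "E:\Folder\File.zip"
$can    = "E:\Program Files (x86)\Application Folder\File"
ZipFiles $zip $can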