I have a free trial account for Azure Speech Services and I use the speech-to-text service in a C# program.
The purpose of the program is to convert audio files to text files via the speech-to-text API. The problem is that sometimes an error appears saying:
Status: Canceled. Reason: The recognition service encountered an internal error and could not continue. Response text: {"Duration":0,"Offset":0,"RecognitionStatus":"Error"}
Can someone tell me whether I have an error in my program, or whether the problem is with the free Azure account?
Thanks!
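For anyone hitting the same message: the SDK puts the underlying reason in the cancellation details of the result. A minimal sketch of how to surface them (assuming the Microsoft.CognitiveServices.Speech NuGet package, with placeholder key, region, and file name):

using System;
using System.Threading.Tasks;
using Microsoft.CognitiveServices.Speech;
using Microsoft.CognitiveServices.Speech.Audio;

class Program
{
    // Diagnostic sketch, not the asker's program: prints the cancellation
    // details hiding behind a "Canceled" result.
    static async Task Main()
    {
        var config = SpeechConfig.FromSubscription("<your-key>", "<your-region>");
        using var audio = AudioConfig.FromWavFileInput("Dummycall.wav");
        using var recognizer = new SpeechRecognizer(config, audio);

        var result = await recognizer.RecognizeOnceAsync();
        if (result.Reason == ResultReason.Canceled)
        {
            // ErrorDetails often indicates whether the audio format or the
            // subscription/quota is the problem.
            var details = CancellationDetails.FromResult(result);
            Console.WriteLine($"Canceled: {details.Reason} - {details.ErrorDetails}");
        }
        else
        {
            Console.WriteLine(result.Text);
        }
    }
}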
I found that the type of WAV file I used caused that error. I converted it to mono using FFmpeg in Docker, with the following command line in PowerShell:
mkdir $pwd\original\output\ -Force
docker run -v ${PWD}\original:/tmp/workdir jrottenberg/ffmpeg -i Dummycall.wav -map_channel 0.0.0 DummycallMono.wav
Please be aware that you may need to run this separately for the left and right channels, changing the argument to -map_channel 0.0.1 for the second channel.
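If you want to check a file before sending it, the channel count and sample rate sit near the start of the WAV header. A rough sketch (assuming a canonical RIFF/WAVE layout where the fmt chunk immediately follows the WAVE tag; files with extra chunks up front will be misread):

using System;
using System.IO;

class WavInfo
{
    // Reads NumChannels (byte offset 22) and SampleRate (byte offset 24) of a
    // canonical WAV header. Purely a sanity check before upload.
    static void Main(string[] args)
    {
        using var reader = new BinaryReader(File.OpenRead(args[0]));
        reader.BaseStream.Seek(22, SeekOrigin.Begin);
        short channels = reader.ReadInt16();
        int sampleRate = reader.ReadInt32();   // stream is now at offset 24
        Console.WriteLine($"Channels: {channels}, Sample rate: {sampleRate} Hz");
    }
}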
When I cd to my path and run "upgrade-assistant upgrade My sln here", it runs the assistant, I choose 1 (I use interactive mode as it's easier to see errors), and it acts like it will convert, but then it errors out with "unexpected error applying step System.UnauthorizedAccessException: Access to the path "Path to android csproj" is denied".
I have tried running PowerShell as administrator and normally multiple times with the same outcome. Any suggestions? Thank you for the help.
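One thing worth ruling out (a guess, not a confirmed fix): UnauthorizedAccessException is also what .NET throws when a tool tries to rewrite a file whose ReadOnly attribute is set, which some source-control setups do to project files. A quick check, with the path from the error message substituted in:

using System;
using System.IO;

class ReadOnlyCheck
{
    static void Main()
    {
        // Placeholder path: use the csproj named in the error message.
        string path = @"C:\path\to\YourAndroid.csproj";
        var attributes = File.GetAttributes(path);
        Console.WriteLine($"ReadOnly: {(attributes & FileAttributes.ReadOnly) != 0}");
    }
}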
I am developing a program that should work with the standard streams of a pwsh.exe process.
The official Microsoft Terminal application, which lets you work with various console programs, allows you to enter characters only once the previous command has fully executed, for example Write-Host hey1; Start-Sleep 10; Write-Host hey2. Yet in the profile settings for PowerShell Core there is only the path to the executable file.
I want to achieve similar behavior in my C# program, but I do not know by what mechanism Microsoft Terminal gets the information that the previous command has finished executing.
I tried checking the output stream; it doesn't contain any special characters for this.
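No special characters are guaranteed on plain pipes, so one workable mechanism over redirected standard streams is a sentinel: append a marker to each command and treat its appearance on stdout as "previous command finished". A sketch under those assumptions (the marker string and the pwsh startup flags are my choices; this is not a claim about how Windows Terminal itself works, which attaches programs through a ConPTY pseudoconsole rather than plain pipes):

using System;
using System.Diagnostics;

class PwshDriver
{
    // Hypothetical marker; any string unlikely to appear in real output works.
    const string Sentinel = "__PWSH_CMD_DONE__";

    static void Main()
    {
        var psi = new ProcessStartInfo("pwsh.exe", "-NoLogo -NoProfile")
        {
            RedirectStandardInput = true,
            RedirectStandardOutput = true,
            UseShellExecute = false
        };
        using var pwsh = Process.Start(psi)!;

        // Send the command, then emit the sentinel. Building the marker from
        // two halves keeps any echo of the command text itself from matching
        // before the command has actually run.
        pwsh.StandardInput.WriteLine(
            "Write-Host hey1; Start-Sleep 10; Write-Host hey2; " +
            "Write-Output ('__PWSH_CMD_' + 'DONE__')");
        pwsh.StandardInput.Flush();

        string? line;
        while ((line = pwsh.StandardOutput.ReadLine()) != null)
        {
            if (line.Contains(Sentinel)) break;   // safe to send the next command
            Console.WriteLine(line);
        }

        pwsh.StandardInput.WriteLine("exit");
        pwsh.WaitForExit();
    }
}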
I published my C# .NET 5.0 code to Azure Functions (Windows) and I'm getting this weird error message:
2021-06-21T01:56:53.465 [Error] Executed 'Function1' (Failed, Id=fdefdbba-49a7-44ad-8082-841d2941d90b, Duration=169ms)Unable to load DLL 'libgmp-10.dll' or one of its dependencies: The specified module could not be found. (0x8007007E)
I tried to view the \wwwroot files in the Azure Functions console, but then I get this error:
3 [main] ls (8392) C:\Program Files\Git\usr\bin\ls.exe: *** fatal error - Couldn't set directory to \\?\PIPE\ temporarily.
Any hints?
It seems that the deployment was not done correctly.
libgmp-10.dll is a native DLL (Dynamic Link Library) from the GNU Multiple Precision arithmetic library (GMP) that one of your dependencies needs at runtime; the 0x8007007E error means Windows could not find it alongside your deployed code.
Please delete the Azure Function, then re-create it and deploy fresh code by following Develop and publish .NET 5 functions using Azure Functions, or, if you are using Azure DevOps, Setting up a CI/CD pipeline for Azure Functions.
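If you would rather confirm what actually got deployed before recreating everything, a small check inside the function can report whether the native DLL made it into the published output at all. A sketch (assuming the usual ILogger parameter that the Functions runtime passes in):

using System.IO;
using Microsoft.Extensions.Logging;

public static class NativeDllCheck
{
    // Logs whether libgmp-10.dll is present next to the deployed assemblies,
    // which is what the 0x8007007E load error is complaining about.
    public static void Report(ILogger log)
    {
        string baseDir = System.AppContext.BaseDirectory;
        string candidate = Path.Combine(baseDir, "libgmp-10.dll");
        log.LogInformation("Base directory: {BaseDir}", baseDir);
        log.LogInformation("libgmp-10.dll present: {Present}", File.Exists(candidate));
    }
}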
Let me know if you have any follow up questions.
I'm trying to use the integrated cluster rendering for Unity. In order to do so, I need to run my application from a batch file with certain command-line options.
Here is the documentation:
https://docs.unity3d.com/Manual/ClusterRenderingDeployment.html
I tried to open the app in -batchmode using the command line, however this wasn't what the documentation meant, so I'm struggling to understand how to do this.
The command I tried is open ~/Desktop/PP/NewUnityProject/Buildtest.app -server 2 *:8888 *:* 10000
I am using a Mac.
I am working with the HDInsight .NET SDK to use a C# application with Pig. I am getting an error when specifying the C# application path.
Here is how I am defining the C# app in the Pig script:
DEFINE pigudf `PigUDF.exe` SHIP('wasb://book#storage.blob.core.windows.net/PIG/app/PigUDF.exe');
I am getting the error "invalid ship specification" 'wasb://bookstore#storage160203122480.blob.core.windows.net/PIG/app/PigUDF.exe' doesn't exists; however, PigUDF.exe does exist at the given path.
If I run the same query from the HDInsight cluster console, with both the Pig script file and the C# app stored locally on the cluster, it runs successfully, i.e. the following works on the HDInsight cluster console:
DEFINE pigudf `PigUDF.exe` SHIP('C:/PigUDF.exe');
where PigUDF.exe is stored locally on the cluster.
I even tried running it through HDInsight Tools for Visual Studio, but I get the same error.
Any help here will be appreciated.
Thanks,
Saleem
Try using http:// instead of wasb://. The wasb protocol is used to access Windows Azure blob storage.
DEFINE pigudf `PigUDF.exe` SHIP('http://book#storage.blob.core.windows.net/PIG/app/PigUDF.exe');
You can copy your UDF to local storage, update its permissions, ship it, and finally remove it:
fs -copyToLocal wasb://<container>#account/udf.exe
sh cacls udf.exe /E /G <group>:F
define myUdf `udf.exe` ship('udf.exe');
-- run your computation...
sh del udf.exe
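In case it helps anyone reproducing this setup: an executable named in DEFINE ... SHIP and used for streaming is a plain console program that reads tab-delimited tuples from stdin and writes tuples back to stdout. A minimal sketch of that shape (the transformation below is just an example; the real PigUDF.exe of course does its own work):

using System;

class PigUdf
{
    static void Main()
    {
        // Pig streams one tab-delimited line per input tuple; whatever is
        // written to stdout becomes the output tuples.
        string? line;
        while ((line = Console.ReadLine()) != null)
        {
            string[] fields = line.Split('\t');
            fields[0] = fields[0].ToUpperInvariant();   // example transformation
            Console.WriteLine(string.Join("\t", fields));
        }
    }
}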