I’m unable to use the Android.OS.Environment.GetExternalStorageState method as outlined here.
The Android.OS.Environment.ExternalStorageState property works. However, in order to query the state of the secondary external SD card I need to use the method (not the property) and pass in the path to that card. When calling this method I receive a .NET exception as follows:
No static method with name='getExternalStorageState' signature='(Ljava/io/File;)Ljava/lang/String;' in class Landroid/os/Environment;
A sample app that reliably reproduces the problem can be found here.
Note that this is on a device that has 2 SD cards. One is internal (not removable) and is the default SD card path. The other SD card is an external SD card which is removable. This test app assumes the external secondary removable SD card is at a root path of ‘sdcard1’ and the default internal is ‘sdcard0’. You may need to change this if your device is different.
Here is a trimmed-down version of the CheckState method from the sample app.
private bool CheckState(string pathStr)
{
    try
    {
        var path = new File(pathStr);

        // Querying the primary (internal) storage via the property works.
        var otherState = Environment.ExternalStorageState;

        // Querying the secondary (removable) card via the method throws.
        var state = Environment.GetExternalStorageState(path);

        // Good state - set values and be done
        return state.Equals(Environment.MediaMounted);
    }
    catch (Java.Lang.Exception ex)
    {
        Debug(Tag, $"Java Ex: {ex.Message}");
    }
    catch (Exception ex)
    {
        Debug(Tag, $".Net Ex: {ex.Message}");
    }
    return false;
}
In the example code above, the otherState variable is set with the state of the internal SD card. However, when the next line (which sets the state variable) executes, an exception is thrown.
Also, as an FYI - it is the .NET System.Exception that is thrown, not the Java.Lang.Exception.
If this code is correct and there is in fact a bug in the framework code somewhere (Xamarin and/or Android), then is there a recommended workaround to determine whether an SD card is available for reading and writing?
No static method with name='getExternalStorageState' signature='(Ljava/io/File;)Ljava/lang/String;' in class Landroid/os/Environment;
String getExternalStorageState (File path) was added in API 21.
re: https://developer.android.com/reference/android/os/Environment.html#getExternalStorageState(java.io.File)
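Since the File overload only exists on API 21 (Lollipop) and newer, one workaround is to guard the call with an API-level check and fall back to the parameterless property on older devices. A minimal sketch, assuming a Xamarin.Android project where pathStr points at the removable card's mount point as in the sample above:
private bool IsMounted(string pathStr)
{
    if (Android.OS.Build.VERSION.SdkInt >= Android.OS.BuildVersionCodes.Lollipop)
    {
        // API 21+: query the state of the specific volume.
        var state = Android.OS.Environment.GetExternalStorageState(new Java.IO.File(pathStr));
        return state == Android.OS.Environment.MediaMounted;
    }

    // Pre-21: only the primary external storage state can be queried.
    return Android.OS.Environment.ExternalStorageState == Android.OS.Environment.MediaMounted;
}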
I have an ASP.NET Core API that was recently updated from .NET 5 to .NET 6.
There is a piece of code that should read the duration of an audio file. The code that seems to have worked on previous versions was this:
try
{
    //
    // NAudio -- Windows only
    //
    using var fileReader = new AudioFileReader(filePath);
    return Convert.ToInt32(Math.Ceiling(fileReader.TotalTime.TotalSeconds));
}
catch (DllNotFoundException)
{
    try
    {
        //
        // LibVLCSharp is cross-platform
        //
        using var libVLC = new LibVLC();
        using var media = new Media(libVLC, filePath, FromType.FromPath);
        MediaParsedStatus parsed = Task.Run(async () => await media.Parse(MediaParseOptions.ParseNetwork, timeout: 2000).ConfigureAwait(false)).Result;
        if (parsed != MediaParsedStatus.Done) throw new ArgumentException("Could not read audio file");
        if (!media.Tracks.Any(t => t.TrackType == TrackType.Audio) || (media.Duration <= 100)) throw new ArgumentException("Could not read audio from file");
        return Convert.ToInt32(Math.Ceiling(TimeSpan.FromMilliseconds(media.Duration).TotalSeconds));
    }
    catch (Exception ex) when (ex is DllNotFoundException || ex is LibVLCSharp.Shared.VLCException)
    {
        try
        {
            //
            // NAudio's Mp3FileReader as the last fallback
            //
            using var fileReader = new Mp3FileReader(filePath);
            return Convert.ToInt32(Math.Ceiling(fileReader.TotalTime.TotalSeconds));
        }
        catch (InvalidOperationException)
        {
            throw new ArgumentException("Could not read audio file");
        }
    }
}
The application was deployed on Linux. I don't know which part of the code did the exact calculation (I am assuming the VLC part), but since the update to .NET 6 all of these fail, and since the last fallback is NAudio, we get the following exception:
Unable to load shared library 'Msacm32.dll' or one of its dependencies.
I am using Windows, but I tried running the app with WSL, and I can't get the VLC part to run either - it always throws the following exception (even after installing VLC and the VLC dev SDK):
LibVLC could not be created. Make sure that you have done the following:
Installed latest LibVLC from nuget for your target platform.
Unable to load shared library 'libX11' or one of its dependencies. In order to help diagnose loading problems, consider setting the LD_DEBUG environment variable: liblibX11: cannot open shared object file: No such file or directory
at LibVLCSharp.Shared.Core.Native.XInitThreads()
at LibVLCSharp.Shared.Core.InitializeDesktop(String libvlcDirectoryPath)
at LibVLCSharp.Shared.Helpers.MarshalUtils.CreateWithOptions(String[] options, Func`3 create)
Is there any clean way to read the duration of an audio file on all platforms?
Needless to say, NAudio works like a charm on Windows, and so does VLC (with the proper NuGet package).
If you install ffmpeg, you can do this quite easily. ffmpeg comes installed by default in most Linux distros, but in case it isn't, you can install it with your favorite package manager.
sudo apt install ffmpeg
To install it on Windows, you'll need to download the build files, extract them, and add the folder to your PATH.
Next, install the Xabe.FFmpeg package in your project.
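For example, via the .NET CLI (assuming you want the Xabe.FFmpeg NuGet package referenced by the snippet below):
dotnet add package Xabe.FFmpeg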
Finally, you can call the static method Xabe.FFmpeg.FFmpeg.GetMediaInfo() to get all the information about your audio file. Here is a sample snippet that I tested on my Linux machine.
using System;
using System.IO;
using Xabe.FFmpeg;

namespace Program;

public static class Program
{
    public static void Main(string[] args)
    {
        string filename;
        if (args.Length == 0)
        {
            Console.WriteLine("No arguments found! Provide the audio file path as argument!");
            return;
        }
        else if (File.Exists(filename = args[0]) == false)
        {
            Console.WriteLine("Given file does not exist!");
            return;
        }

        try
        {
            var info = FFmpeg.GetMediaInfo(filename).Result;
            TimeSpan duration = info.Duration;
            Console.WriteLine($"Audio file duration is {duration}");
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex);
        }
    }
}
The error you are seeing is because we were assuming that you would display a video on linux, using X11, so we are always initializing X11. See here.
We shouldn't do that for your use case (because you may not have a GUI available). Please report the issue here: https://code.videolan.org/videolan/LibVLCSharp/-/issues
or even better, submit a pull request on github or gitlab.
As for your question of why it worked on .NET 5 and not anymore, I'm not sure we have enough info to tell, because you didn't send us the error message from that machine.
I would encourage you to take a look at atldotnet. It is a small, well-maintained, completely managed, cross-platform library without any external dependencies, and it was accurate at detecting audio file duration in all of my test cases (more accurate than ffmpeg). Most common audio formats are supported.
var t = new Track(audioFilePath);
// Works the same way on any supported format (MP3, FLAC, WMA, SPC...)
System.Console.WriteLine("Duration (ms) : " + t.DurationMs);
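To match the return value of the original code (whole seconds, rounded up), the millisecond duration can be converted directly; a small sketch using the t variable from the snippet above:
// Round up to whole seconds, as in the original NAudio/LibVLC code
int seconds = Convert.ToInt32(Math.Ceiling(t.DurationMs / 1000.0));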
I am having issues with the .NET WebView2 control. I thought I had it fixed but it is not working. I have read numerous posts to no avail.
I have a WPF C# application that runs on a server. Various people log into the server via a web browser and run the app.
Within this app, I open up a WebView2 browser, setting the user data directory to a unique directory for each person.
When I set the user data directory and call EnsureCoreWebView2Async(), I get an exception: "Access is denied. (Exception from HRESULT: 0x80070005 (E_ACCESSDENIED))".
Below is the code:
public static async void InitializeWebView(WebView2 browser, string path)
{
    Directory.CreateDirectory(path);
    browser.CreationProperties = new CoreWebView2CreationProperties()
    {
        UserDataFolder = path
    };
    try
    {
        await browser.EnsureCoreWebView2Async();
    }
    catch (Exception ex)
    {
        Log.LogString("Ensure error: " + ex.Message);
    }
}
I have tried various things without success. What am I doing wrong? Any suggestions?
I'm not all that familiar with CoreWebView2CreationProperties, but according to the documentation:
Its main purpose is to be set to CreationProperties in order to
customize the environment used by a WebView2 during implicit
initialization...If you need complete control over the environment used by a WebView2 control then you'll need to initialize the control explicitly by creating your own environment with CreateAsync(String, String, CoreWebView2EnvironmentOptions) and passing it to EnsureCoreWebView2Async(CoreWebView2Environment) before you set the Source property to anything.
As mentioned in the documentation referenced above, implicit initialization occurs when the Source property is set and CoreWebView2 hasn't been explicitly initialized.
To explicitly initialize CoreWebView2, try the following:
public async Task InitializeCoreWebView2Async(WebView2 wv, string userDataFolder = null)
{
    //initialize CoreWebView2
    CoreWebView2EnvironmentOptions options = null;
    CoreWebView2Environment cwv2Environment = null;

    //it's recommended to create the userDataFolder in the same location
    //that your other application data is stored (ie: in a folder in %APPDATA%)
    //if not specified, we'll create a folder in %TEMP%
    if (String.IsNullOrEmpty(userDataFolder))
        userDataFolder = Path.Combine(Path.GetTempPath(), System.Reflection.Assembly.GetExecutingAssembly().GetName().Name);

    //create WebView2 Environment using the installed or specified WebView2 Runtime version.
    //cwv2Environment = await CoreWebView2Environment.CreateAsync(@"C:\Program Files (x86)\Microsoft\Edge Dev\Application\1.0.1054.31", userDataFolder, options);
    cwv2Environment = await CoreWebView2Environment.CreateAsync(null, userDataFolder, options);

    //initialize
    await wv.EnsureCoreWebView2Async(cwv2Environment);

    System.Diagnostics.Debug.WriteLine("UserDataFolder: " + userDataFolder);
}
Note: If one desires to explicitly initialize CoreWebView2, it must be done prior to setting the Source property for the WebView2 control.
Usage:
await InitializeCoreWebView2Async(webView21, Path.Combine(@"C:\Temp", System.Reflection.Assembly.GetExecutingAssembly().GetName().Name));
Resources:
CoreWebView2 Class
CoreWebView2CreationProperties
CoreWebView2Environment Class
CoreWebView2Environment.CreateAsync
WebView2.EnsureCoreWebView2Async(CoreWebView2Environment) Method
Thank you for your continued help. Due to character limitations, I am responding to your questions in an Answer section. In response to your comments...
If the Source was being set prior to initialization, I would be getting a different error (which I have had in the past).
Here is the xaml for the WebView2 control:
<Wpf:WebView2 Name="Browser" Margin="20,20,20,20" CoreWebView2InitializationCompleted="Browser_CoreWebView2InitializationCompleted" />
As you can see, I am subscribed to the CoreWebView2InitializationCompleted event, which is not being called due to the error.
Here is the source for the form that hosts the WebView2 control:
private async void _DoIt()
{
    try
    {
        var env = await CoreWebView2Environment.CreateAsync(null, Global._GetServerDataDirectory(_Person));
        await Browser.EnsureCoreWebView2Async(env);
    }
    catch (Exception ex)
    {
        Log.LogString("DoIt error: " + ex.Message);
    }
}

public PayPal(Person person)
{
    InitializeComponent();
    _Person = person;
    _DoIt();
    Log.LogString("After PayPal constructor");
}
It is erroring in _DoIt().
Okay, after many hours of trying to track this down, the root issue is that Thinfinity's VirtualUI product does not support applications using WebView2. This was my core issue and has been confirmed by the vendor.
Thank you once again for everyone's kind help with this.
Blank project repo with problematic code
I have the following code for querying all the supported audio codecs, following this article, using CodecQuery.FindAllAsync:
try
{
    var query = new CodecQuery();
    var queryResult = await query.FindAllAsync(CodecKind.Audio, CodecCategory.Encoder, "");
    var subTypes = queryResult
        .SelectMany(q => q.Subtypes)
        .ToHashSet();
    // Other codes
}
catch (Exception ex)
{
    Debug.WriteLine(ex);
    throw;
}
As the documentation mentioned,
To specify that all codecs of the specified kind and category should be returned, regardless of what media subtypes are supported, specify an empty string ("") or null for this parameter.
For the empty string "", it works for CodecKind.Video but not for CodecKind.Audio (for both CodecCategory.Encoder and CodecCategory.Decoder). If I specify a subtype, then it does not crash, for example:
var queryResult = await query.FindAllAsync(CodecKind.Audio, CodecCategory.Encoder, CodecSubtypes.AudioFormatMP3);
What is strange about that is that even though I have a try/catch with the generic Exception, the app just crashes straight through it and shows this instead:
I have tried restarting Windows, uninstalling the UWP app, and making a clean build. What is happening? How do I query all available audio codecs?
Update: after changing the debug settings, I could trace the error message:
Unhandled exception at 0x00007FFDCE5FD759 (KernelBase.dll) in ******.exe: 0xC0000002: The requested operation is not implemented.
In my testing, the content of this document is correct. When I set "" as the third parameter to find all audio codecs, the code works well, so there is not an error in this document.
You could open the Debug options and change the Debugger type from Managed Only to Mixed. It won't fix your exception, but you can trace it with the debugger. You could refer to this reply to get more information.
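If the empty-string call keeps crashing on your machine, one workaround worth trying (a sketch only; the documentation quoted above says null is also accepted to mean "all subtypes") is to pass null instead:
var query = new CodecQuery();
// null is documented as equivalent to "" (return every subtype)
var queryResult = await query.FindAllAsync(CodecKind.Audio, CodecCategory.Encoder, null);
foreach (var codecInfo in queryResult)
{
    Debug.WriteLine($"{codecInfo.DisplayName}: {string.Join(", ", codecInfo.Subtypes)}");
}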
I have a problem using the camera in a Windows Phone Silverlight 8.1 application. I just want to initialize the camera and see its preview (for now I don't need any photo or video capture). I found a nice and simple example on MSDN:
private CaptureSource captureSource;
private VideoCaptureDevice videoCaptureDevice;

private void InitializeVideoRecorder()
{
    try
    {
        if (captureSource == null)
        {
            captureSource = new CaptureSource();
            var a = captureSource.VideoCaptureDevice;
            videoCaptureDevice = CaptureDeviceConfiguration.GetDefaultVideoCaptureDevice();
            captureSource.CaptureFailed += OnCaptureFailed;

            if (videoCaptureDevice != null)
            {
                VideoRecorderBrush = new VideoBrush();
                VideoRecorderBrush.SetSource(captureSource);
                captureSource.Start();
                CameraStatus = "Tap record to start recording...";
            }
            else
            {
                CameraStatus = "A camera is not supported on this phone.";
            }
        }
    }
    catch (Exception ex)
    {
        CameraStatus = "ERROR: " + ex.Message;
    }
}
The code stops at captureSource.Start(), throwing System.UnauthorizedAccessException: Attempted to perform an unauthorized operation.
First of all, I found information (on the same page) that the ID_CAP_ISV_CAMERA capability is needed in WMAppManifest.xml. But I have a problem adding it, because:
I can't find this capability in the designer
I get an error when I add it manually to the .xml file
The errors are reproduced below:
Warning 1 The 'Name' attribute is invalid - The value 'ID_CAP_ISV_CAMERA' is invalid according to its datatype 'http://schemas.microsoft.com/appx/2010/manifest:ST_Capabilities' - The Enumeration constraint failed.
Error 3 App manifest validation failed. Value 'ID_CAP_ISV_CAMERA' of attribute '/Package/Capabilities/Capability/#Name' must be a valid capability.
I have even found the same solution on SO: WP8.1 SilverLight Microsoft.Devices.PhotoCamera Access Denied
Can somebody tell me why I can't use the original MSDN solution to this problem?
First, it looks like you're trying to add that capability to Package.appxmanifest instead of WMAppManifest.xml. You should be able to find WMAppManifest.xml under Solution Explorer -> <your project> -> Properties:
Opening that file should give you the option to add ID_CAP_* capabilities.
Second, you need to specify both ID_CAP_ISV_CAMERA and ID_CAP_MICROPHONE in order to use CaptureSource.Start(), even if you're only using one of the devices.
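For reference, the entries go inside the Capabilities element of WMAppManifest.xml (a sketch; your manifest will already contain other capabilities alongside these):
<Capabilities>
  <Capability Name="ID_CAP_ISV_CAMERA" />
  <Capability Name="ID_CAP_MICROPHONE" />
</Capabilities>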
I want to create a WinForms application that can connect to a digital camera attached to my computer. I want to see a live view of personnel on the computer and then take a picture of them.
How can I implement this?
Which cameras can I use?
Which components or libraries can I use?
Which SDK tools can I use?
Please help me.
You can do this with the Windows Image Acquisition API. Get started with Project + Add Reference, Browse tab, and navigate to c:\windows\system32\wiaaut.dll. That's a COM component; you'll get an interop library for it with interface types in the WIA namespace.
The first thing you want to do is get a reference to the camera; use CommonDialog.ShowSelectDevice(). It returns a Device object directly if there's only one camera attached, or shows a dialog to let the user select one if there are more. Like this:
public static WIA.Device SelectCamera() {
    var dlg = new WIA.CommonDialog();
    try {
        return dlg.ShowSelectDevice(WIA.WiaDeviceType.CameraDeviceType, false, false);
    }
    catch (System.Runtime.InteropServices.COMException ex) {
        // 0x80210015: no WIA device of the requested type is available
        if (ex.ErrorCode == -2145320939) return null;
        throw;
    }
}
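A hypothetical usage of that helper from your form code (the null check covers both "no camera attached" and "user cancelled the dialog"):
var camera = SelectCamera();
if (camera == null) {
    MessageBox.Show("No camera was selected.");
    return;
}
// From here you can browse camera.Items and transfer images;
// see the MSDN snippets mentioned below for taking pictures.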
That ought to get you started. Check out the code snippets at this MSDN page for more of the things you can do with the API. Beware that not all cameras allow you to use them interactively while they are attached to the machine. My cheapo point-and-shoot doesn't.