Cannot get real-time video from capture card - C#

object videoSource;
DsDevice device1 = DsDevice.GetDevicesOfCat(FilterCategory.VideoInputDevice)[1];
Guid baseIdentifier = typeof(IBaseFilter).GUID;
device1.Mon.BindToObject(null, null, ref baseIdentifier, out videoSource);
IBaseFilter sourceFilter = videoSource as IBaseFilter;
graphBuilder.AddFilter(sourceFilter, "Source");
hr = captureGraphBuilder.SetFiltergraph(this.graphBuilder);
DsError.ThrowExceptionForHR(hr);
mpeg2Demux = (IBaseFilter)new MPEG2Demultiplexer();
object cross;
captureGraphBuilder.FindInterface(FindDirection.UpstreamOnly, null, sourceFilter, typeof(IAMCrossbar).GUID, out cross);
IAMCrossbar crossbar2 = cross as IAMCrossbar;
int inputPinCount, outputPinCount;
crossbar2.get_PinCounts(out outputPinCount, out inputPinCount);
crossbar2.Route(0, 0);
If I try to render the video at this point, I get pictures delayed by about 5 seconds from the video that is playing into my capture card's input. My capture card outputs MPEG-2.
IBaseFilter demuxFilter = (IBaseFilter)new MPEG2Demultiplexer();
graphBuilder.AddFilter(demuxFilter, "Mpeg-2 Demultiplexor");
IPin sourceFilterOutputPin = DsFindPin.ByDirection(sourceFilter, PinDirection.Output, 0);
IPin demuxFilterPinIn = DsFindPin.ByDirection(demuxFilter, PinDirection.Input, 0);
hr = graphBuilder.Connect(sourceFilterOutputPin, demuxFilterPinIn);
DsError.ThrowExceptionForHR(hr);
IBaseFilter defaultRenderer = (IBaseFilter)new VideoRendererDefault();
hr = graphBuilder.AddFilter(defaultRenderer, "Default Renderer");
DsError.ThrowExceptionForHR(hr);
hr = captureGraphBuilder.RenderStream(null, MediaType.Video, sourceFilter, null, defaultRenderer);
DsError.ThrowExceptionForHR(hr);
At this point I get an error saying there is no combination of intermediate filters that can make the connection.
All I want to know is whether I am using the MPEG-2 demux correctly.

You shouldn't need to connect the source to the demux before rendering the stream; doing so may be confusing the render operation.
During a render operation, DirectShow will try to use the filters already present in the graph before trying other registered filters, so you can add your preferred filters to the graph without connecting them.
You may also be missing the appropriate MPEG-2 decoding or color space conversion filters. Have you simulated any of this with a graph editing tool (e.g. GraphStudioNext or GraphEdit from the Windows SDK)?
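For example, here is a minimal sketch of that approach, reusing the graphBuilder, captureGraphBuilder and sourceFilter objects from the question and assuming a suitable MPEG-2 video decoder is registered on the machine:
// Hedged sketch: add preferred filters unconnected and let RenderStream /
// Intelligent Connect build source -> demux -> decoder -> renderer.
IBaseFilter demux = (IBaseFilter)new MPEG2Demultiplexer();
hr = graphBuilder.AddFilter(demux, "MPEG-2 Demultiplexer");
DsError.ThrowExceptionForHR(hr);
IBaseFilter renderer = (IBaseFilter)new VideoRendererDefault();
hr = graphBuilder.AddFilter(renderer, "Default Renderer");
DsError.ThrowExceptionForHR(hr);
// No manual pin connections here; RenderStream tries the filters already in the graph first.
hr = captureGraphBuilder.RenderStream(null, MediaType.Video, sourceFilter, null, renderer);
DsError.ThrowExceptionForHR(hr);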


Add sound filters and connect sound pins

Context
A WPF UI with a user control that instantiates multiple COM objects and uses filters with DirectShow.NET.
Problem
The audio pins' names change depending on which video is playing (both are .avi files).
As you can see in the screenshots, the sound pins are not the same (one is 'Stream 01', while the other is '01 Microsoft wave form ..... ').
In my code, I use ConnectDirect and the GetPin method. To use GetPin, you have to pass a pin name.
Graphs
Graphs generated with exactly the same code; only the video files change.
Question
How do I connect the filters when the pin names change depending on which .avi file is playing? By the way, one AVI file is home-made while the other is a Microsoft AVI sample file (the 12-second blue clock).
Relevant code
//sound filter linker
IBaseFilter pACMWrapper = (IBaseFilter)new ACMWrapper();
hr = m_FilterGraph.AddFilter(pACMWrapper, "ACM wrapper");
//add the default DirectSound device
IBaseFilter pDefaultDirectSoundDevice = null;
try
{
pDefaultDirectSoundDevice = (IBaseFilter)new DSoundRender();
hr = m_FilterGraph.AddFilter(pDefaultDirectSoundDevice, "Default DirectSound Device");
IBaseFilter aviSplitter;
//find the AVI Splitter that gets added automatically when I connect the sample grabber to the source filter.
m_FilterGraph.FindFilterByName("AVI Splitter", out aviSplitter);
System.Windows.MessageBox.Show(""); // graph screenshot is from here.
hr = m_FilterGraph.Connect(GetPin(aviSplitter, "Stream 01"), GetPin(pACMWrapper, "Input"));
DsError.ThrowExceptionForHR(hr);
//connect audio filters
hr = m_FilterGraph.ConnectDirect(GetPin(pACMWrapper, "Output"), GetPin(pDefaultDirectSoundDevice, "Audio Input pin (rendered)"), null);
DsError.ThrowExceptionForHR(hr);
}
catch (Exception)
{
pDefaultDirectSoundDevice = null;
//log error, play video without sound
//throw;
}
GetPin code
private IPin GetPin(IBaseFilter destinationFilter, string pinName)
{
    IEnumPins pinEnum;
    int hr = destinationFilter.EnumPins(out pinEnum);
    DsError.ThrowExceptionForHR(hr);
    IPin[] pins = new IPin[1];
    IntPtr fetched = Marshal.AllocCoTaskMem(4);
    try
    {
        while (pinEnum.Next(1, pins, fetched) == 0)
        {
            PinInfo pInfo;
            pins[0].QueryPinInfo(out pInfo);
            bool found = (pInfo.name == pinName);
            DsUtils.FreePinInfo(pInfo);
            if (found)
                return pins[0];
            // release pins we are not returning
            Marshal.ReleaseComObject(pins[0]);
        }
        return null;
    }
    finally
    {
        // free the fetch counter and the enumerator
        Marshal.FreeCoTaskMem(fetched);
        Marshal.ReleaseComObject(pinEnum);
    }
}
You don't need to pick an output pin by a hardcoded name. Instead, and it is in fact more reliable, enumerate the pins - as your GetPin function already does - and then enumerate the media types on each pin. It is OK to look at just the first media type (if any). If its major type is MEDIATYPE_Audio, that is the pin to take, regardless of its actual name.
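A minimal sketch of that idea (a hypothetical helper, not code from the question), using DirectShowLib's pin and media type enumerators; MediaType.Audio is DirectShowLib's name for MEDIATYPE_Audio:
// Hypothetical helper: return the first output pin whose first media type has the given major type.
private IPin GetPinByMajorType(IBaseFilter filter, Guid majorType)
{
    IEnumPins pinEnum;
    DsError.ThrowExceptionForHR(filter.EnumPins(out pinEnum));
    IPin[] pins = new IPin[1];
    while (pinEnum.Next(1, pins, IntPtr.Zero) == 0)
    {
        PinDirection dir;
        pins[0].QueryDirection(out dir);
        if (dir == PinDirection.Output)
        {
            IEnumMediaTypes typeEnum;
            pins[0].EnumMediaTypes(out typeEnum);
            AMMediaType[] types = new AMMediaType[1];
            bool match = false;
            if (typeEnum.Next(1, types, IntPtr.Zero) == 0)
            {
                match = (types[0].majorType == majorType);
                DsUtils.FreeAMMediaType(types[0]);
            }
            Marshal.ReleaseComObject(typeEnum);
            if (match)
            {
                Marshal.ReleaseComObject(pinEnum);
                return pins[0];
            }
        }
        Marshal.ReleaseComObject(pins[0]);
    }
    Marshal.ReleaseComObject(pinEnum);
    return null;
}
With that, the connection becomes m_FilterGraph.Connect(GetPinByMajorType(aviSplitter, MediaType.Audio), GetPin(pACMWrapper, "Input")), regardless of how the splitter names its audio pin.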

Forcing Mpeg2Demultiplexer to use ffdshow to render H264 Digital TV Video

I spent a lot of time trying to make the DTVViewer sample of DirectShow work, unfortunately with no success. The video format of the DVB-T network is H264, and I found that the Intelligent Connect behavior of IFilterGraph prefers the MPEG-2 video format.
For those who want to see the code, here it is. If you do not know anything about DirectShow, I have shared my experience with this code below; the most likely problem is described in steps 5 and 6 of the tutorial.
The code for the helper function that connects filters:
public static void UnsafeConnectFilters(IFilterGraph2 graph, IBaseFilter source, IBaseFilter dest,
    Func<AMMediaType, bool> sourceMediaPredicate = null, Func<AMMediaType, bool> destMediaPredicate = null)
{
    // IteratePinsByDirection, GetMajorType and IsConnected are helper methods defined elsewhere in the project.
    foreach (IPin spin in IteratePinsByDirection(source, PinDirection.Output))
    {
        if (IsConnected(spin))
            continue;
        int fetched;
        AMMediaType[] sourceTypes = GetMajorType(spin, out fetched);
        if (fetched > 0)
        {
            Guid sourceType = sourceTypes[0].majorType;
            try
            {
                if (sourceMediaPredicate != null && !sourceMediaPredicate(sourceTypes[0]))
                    continue;
                foreach (IPin pin in IteratePinsByDirection(dest, PinDirection.Input))
                {
                    if (IsConnected(pin))
                        continue;
                    var types = GetMajorType(pin, out fetched);
                    try
                    {
                        if (fetched > 0)
                        {
                            Guid destType = types[0].majorType;
                            if (destMediaPredicate != null && !destMediaPredicate(types[0]))
                                continue;
                            if (sourceType == destType)
                            {
                                spin.Connect(pin, types[0]);
                                return;
                            }
                        }
                        else
                        {
                            spin.Connect(pin, sourceTypes[0]);
                            return;
                        }
                    }
                    finally
                    {
                        // free the media type fetched for this input pin
                        if (fetched > 0)
                            DsUtils.FreeAMMediaType(types[0]);
                    }
                }
            }
            finally
            {
                // free the media type fetched for the output pin
                DsUtils.FreeAMMediaType(sourceTypes[0]);
            }
        }
    }
}
Does anyone know:
How should I connect the H264 pin to ffdshow?
How can I tell the graph to prefer H264 video decoding?
Tutorial and details
Create the graph
_graph = (IFilterGraph2)new FilterGraph();
We are using a DVB-T network
IBaseFilter networkProvider = (IBaseFilter) new DVBTNetworkProvider();
... which must be tuned to 602000KHz#8MHz ONID=1 TSID=1 SID=6
ITuner tuner = (ITuner) networkProvider;
IDVBTuningSpace tuningspace = (IDVBTuningSpace) new DVBTuningSpace();
tuningspace.put_UniqueName("DVBT TuningSpace");
tuningspace.put_FriendlyName("DVBT TuningSpace");
tuningspace.put__NetworkType(typeof (DVBTNetworkProvider).GUID);
tuningspace.put_SystemType(DVBSystemType.Terrestrial);
ITuneRequest request;
tuningspace.CreateTuneRequest(out request);
ILocator locator = (ILocator) new DVBTLocator();
locator.put_CarrierFrequency(602000);
((IDVBTLocator) locator).put_Bandwidth(8);
request.put_Locator(locator);
IDVBTuneRequest dvbrequest = (IDVBTuneRequest) request;
dvbrequest.put_TSID(1);
dvbrequest.put_ONID(1);
dvbrequest.put_SID(6);
_graph.AddFilter(networkProvider, "Network Provider");
Create an MPEG-2 demux to get separate EPG/Video/Audio/Text streams out of the single TV stream
_mpeg2Demultiplexer = (IBaseFilter) new MPEG2Demultiplexer();
_graph.AddFilter(_mpeg2Demultiplexer, "MPEG-2 Demultiplexer");
Now we search the local filters for the BDA source filter, which in my case is the IT9135 BDA Filter
DsDevice[] devicesOfCat =
DsDevice.GetDevicesOfCat(FilterCategory.BDASourceFiltersCategory);
IBaseFilter iteDeviceFilter;
_graph.AddSourceFilterForMoniker(
devicesOfCat[0].Mon, null, devicesOfCat[0].Name, out iteDeviceFilter);
Now connect filters: [DVBT Net. Provider]->[BDA Src Filter]->[MPEG2Demux]-> ...
UnsafeConnectFilters(_graph, networkProvider, iteDeviceFilter);
UnsafeConnectFilters(_graph, iteDeviceFilter, _mpeg2Demultiplexer);
Two filters must be connected to the demux to provide EPG (program guide) data; sorry, I do not know what they specifically are doing :P. They are located under the BDATransportInformationRenderersCategory category. We try to find them by name and connect them to the demux:
DsDevice[] dsDevices =
DsDevice.GetDevicesOfCat(FilterCategory.BDATransportInformationRenderersCategory);
foreach (DsDevice dsDevice in dsDevices)
{
IBaseFilter filter;
_graph.AddSourceFilterForMoniker(
dsDevice.Mon, null, dsDevice.Name, out filter);
if(dsDevice.Name == "BDA MPEG2 Transport Information Filter")
_bdaTIF = filter;
else if(dsDevice.Name == "MPEG-2 Sections and Tables")
{
_mpeg2SectionsAndTables = filter;
}
UnsafeConnectFilters(_graph, _mpeg2Demultiplexer, filter);
}
Now the demux is connected to both the MPEG-2 Sections and Tables filter and the BDA MPEG2 Transport Information Filter.
Now create the H264 video type and add an output pin to the demux for this type
AMMediaType h264 = new AMMediaType();
h264.formatType = FormatType.VideoInfo2;
h264.subType = MediaSubType.H264;
h264.majorType = MediaType.Video;
IPin h264pin;
((IMpeg2Demultiplexer) _mpeg2Demultiplexer).CreateOutputPin(h264, "h264", out h264pin);
Below, I try to find the ffdshow Video Decoder, which is capable of processing H264 video and is located under the DirectShow Filters category (as shown in GraphStudio).
DsDevice[] directshowfilters =
DsDevice.GetDevicesOfCat(FilterCategory.LegacyAmFilterCategory);
IBaseFilter ffdshow = null;
foreach (DsDevice directshowfilter in directshowfilters)
{
if(directshowfilter.Name == "ffdshow Video Decoder")
{
_graph.AddSourceFilterForMoniker(
directshowfilter.Mon, null, directshowfilter.Name,
out ffdshow);
break;
}
}
Create a video renderer for video output ...
_videoRenderer = new VideoRendererDefault();
_graph.AddFilter((IBaseFilter)_videoRenderer, "Video Renderer");
... and audio ...
DsDevice defaultDirectSound =
DsDevice.GetDevicesOfCat(FilterCategory.AudioRendererCategory)[0];
_graph.AddSourceFilterForMoniker(
defaultDirectSound.Mon, null, defaultDirectSound.Name,
out _audioRender);
Here I try to connect the H264 output pin of the demux to ffdshow. This method call fails with an AccessViolationException. I'm not sure how to connect these two together :(.
Commenting out this line results in a graph that starts running but shows nothing, with a disconnected ffdshow Video Decoder filter left in the graph. Intelligent Connect wires the MPEG-2 video output to a locally available video decoder and, as I said, nothing is displayed.
// UnsafeConnectFilters(_graph, _mpeg2Demultiplexer, ffdshow, type => type.majorType == MediaType.Video && type.subType == MediaSubType.H264);
ConnectFilters is borrowed from the DTVViewer sample of DirectShowLib
ConnectFilters();
I moved actual tuning here
tuner.put_TuningSpace(tuningspace);
tuner.put_TuneRequest(request);
start the graph and wish for some sound or video to be displayed
int hr = (_graph as IMediaControl).Run();
DsError.ThrowExceptionForHR(hr);
check that the graph is running ...
FilterState pfs;
hr = (_graph as IMediaControl).GetState(1000, out pfs);
DsError.ThrowExceptionForHR(hr);
and it says that the graph is running.
Did you check that your ffdshow is enabled for H264/AVC? Open the filter properties; in the "Codecs" section, the H264/AVC format should be enabled (you can also disable the MPEG-2 decoder just to make sure it won't prefer that format).
Another thing: you can try a different MPEG-2 demultiplexer. The default "MPEG-2 Demultiplexer" does not behave the same across environments. There are many other filters that can demux a transport stream, and if you can invest some money, I'd recommend MainConcept or Elecard.
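As a quick programmatic sanity check (a hedged sketch, not from the original posts), you could ask ffdshow's input pin whether it would accept the H264 media type created for the demux output pin; an immediate refusal points at the codec configuration rather than at the graph:
// Hypothetical check, reusing the 'ffdshow' filter and the 'h264' AMMediaType created above.
IPin ffdshowIn = DsFindPin.ByDirection(ffdshow, PinDirection.Input, 0);
int accept = ffdshowIn.QueryAccept(h264);   // S_OK (0) means the pin claims it can accept this type
QueryAccept is only advisory; a successful Connect is the real test.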

DirectShow .NET custom graph

I am trying to build a custom filter graph and I am having problems.
I am using the C# DirectShow.NET library.
I am reading a file with VC-1 video and DTS audio. I add the source filter to the graph and it works fine; I can also add the splitter filter (LAV Splitter), but when I try to connect the file source filter to the LAV Splitter, it fails.
It fails because it doesn't find any input pin on the splitter... I know that output pins can be dynamic, but the input pin should be there, right?
This is the code:
_graphBuilder = (IGraphBuilder)new FilterGraph();
_dsRotEntry = new DsROTEntry((IFilterGraph)_graphBuilder);
LogInfo("Adding source filter...");
int hr = _graphBuilder.AddSourceFilter(_inputFilePath, _inputFilePath,
out _fileSource);
DsError.ThrowExceptionForHR(hr);
IPin pinSourceOut = DsFindPin.ByDirection(_fileSource, PinDirection.Output, 0);
if (pinSourceOut == null)
{
LogError("Unable to find source output pin");
}
IBaseFilter lavSplitter = CreateFilter(LAV_SPLITTER);
if (lavSplitter == null)
{
LogError("LAV Splitter not found");
}
hr = _graphBuilder.AddFilter(lavSplitter, "LAV Splitter");
DsError.ThrowExceptionForHR(hr);
bool result = TryConnectToAny(pinSourceOut, lavSplitter);
if (!result)
{
LogError("Unable to connect FileSource with LAV Splitter");
}
and
private bool TryConnectToAny(IPin sourcePin, IBaseFilter destinationFilter)
{
    IEnumPins pinEnum;
    int hr = destinationFilter.EnumPins(out pinEnum);
    DsError.ThrowExceptionForHR(hr);
    try
    {
        IPin[] pins = { null };
        while (pinEnum.Next(pins.Length, pins, IntPtr.Zero) == 0)
        {
            // Connect() uses Intelligent Connect, so intermediate filters may be inserted.
            int err = _graphBuilder.Connect(sourcePin, pins[0]);
            Marshal.ReleaseComObject(pins[0]);
            if (err == 0)
                return true;
        }
        return false;
    }
    finally
    {
        Marshal.ReleaseComObject(pinEnum);
    }
}
Most likely the input pin does exist, and it is the connection itself that fails. err holds an error code that may explain the problem. If the connection cannot be made, TryConnectToAny returns false exactly as it would if the filter had no input pins at all.
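For example, inside TryConnectToAny you could log the failing HRESULT before moving on (a hedged sketch; DsError.GetErrorText is the DirectShowLib helper, LogError is the question's own logger):
int err = _graphBuilder.Connect(sourcePin, pins[0]);
if (err != 0)
{
    // Translate the HRESULT into readable text, e.g. VFW_E_NO_ACCEPTABLE_TYPES
    LogError(string.Format("Connect failed: 0x{0:X8} - {1}", err, DsError.GetErrorText(err)));
}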

Am I using the GMFBridge.DLL to preview/capture a stream correctly?

I am trying to use GMFBridge so that I can preview a stream from a webcam and save it periodically without restarting the whole graph. However, I'm not sure whether this is correct; I was trying to follow examples, but the code has been updated and things have changed.
I try to create:
Webcam -> Smart Tee (Preview) -> AVI Decompressor -> Video Renderer
Smart Tee (Capture) -> BridgeSinkFilter
and also:
BridgeSourceFilter -> ffdshow video encoder -> Haali Matroska Muxer
(just because it's easy to use)
Input regarding getting the code to run properly would be greatly appreciated.
private void button2_Click(object sender, EventArgs e)
{
IGraphBuilder firstGraph = (IGraphBuilder)new FilterGraph();
IGraphBuilder secondGraph = (IGraphBuilder)new FilterGraph();
IBaseFilter BridgeSinkFilter;
IBaseFilter BridgeSourceFilter;
IBaseFilter Source;
IBaseFilter Mux;
IBaseFilter FileWriter;
IGMFBridgeController bridge = (IGMFBridgeController)new GMFBridgeController();
bridge.AddStream(1, eFormatType.eMuxInputs, 1);
BridgeSinkFilter = (IBaseFilter)bridge.InsertSinkFilter(firstGraph);
Source = FindFilter(FilterCategory.VideoInputDevice, "SG330");
firstGraph.AddFilter(Source, "source");
IBaseFilter SmartTee = FindFilter(FilterCategory.LegacyAmFilterCategory, "Smart Tee");
firstGraph.AddFilter(SmartTee, "Smart Tee");
IPin pinin, pinout;
pinout = FindPinByDirection( Source, PinDirection.Output);
pinin = FindPinByDirection( SmartTee, PinDirection.Input);
firstGraph.Connect(pinout, pinin);
pinout = FindPinByDirection(SmartTee, PinDirection.Output);
pinin = FindPinByDirection(BridgeSinkFilter, PinDirection.Input);
firstGraph.Connect(pinout, pinin);
IBaseFilter Decomp = FindFilter(FilterCategory.LegacyAmFilterCategory, "AVI Decompressor");
firstGraph.AddFilter(Decomp, "Avi Decompressor");
pinout = FindPinByDirection(SmartTee, PinDirection.Output);
pinin = FindPinByDirection(Decomp, PinDirection.Input);
firstGraph.Connect(pinout, pinin);
IBaseFilter Renderer = FindFilter(FilterCategory.LegacyAmFilterCategory, "Video Renderer");
firstGraph.AddFilter(Renderer, "Video Renderer");
pinout = FindPinByDirection(Decomp, PinDirection.Output);
pinin = FindPinByDirection(Renderer, PinDirection.Input);
firstGraph.Connect(pinout, pinin);
DsROTEntry g = new DsROTEntry(firstGraph);
BridgeSourceFilter = (IBaseFilter)bridge.InsertSourceFilter(BridgeSinkFilter, secondGraph);
DsROTEntry h = new DsROTEntry(secondGraph);
IBaseFilter Muxe = FindFilter(FilterCategory.VideoCompressorCategory, "ffdshow video encoder");
secondGraph.AddFilter(Muxe, "Mux");
pinout = FindPinByDirection(BridgeSourceFilter, PinDirection.Output);
pinin = FindPinByDirection(Muxe, PinDirection.Input);
secondGraph.Connect(pinout, pinin);
IBaseFilter MKV = FindFilter(FilterCategory.LegacyAmFilterCategory, "Haali Matroska Muxer");
IFileSinkFilter fs = (IFileSinkFilter)MKV;
fs.SetFileName("c:\\cool.mkv", null);
secondGraph.AddFilter(MKV, "mux");
pinout = FindPinByDirection(Muxe, PinDirection.Output);
pinin = FindPinByDirection(MKV, PinDirection.Input);
secondGraph.Connect(pinout, pinin);
bridge.BridgeGraphs(BridgeSinkFilter, BridgeSourceFilter);
IMediaControl mediacontrolforpartone = (IMediaControl)firstGraph;
mediacontrolforparttwo = (IMediaControl)secondGraph;
mediacontrolforpartone.Run();
mediacontrolforparttwo.Run();
}
To use GMFBridge correctly to my current knowledge:
Grab the GMFBridge DLL from: http://www.gdcl.co.uk/gmfbridge/
Grab the DirectShowLib DLL from: https://sourceforge.net/projects/directshownet/files/DirectShowNET/
include them both in your project.
Create 2 graphs. 1 is for preview the 2nd is for capture.
IGraphBuilder firstgraph = (IGraphBuilder) new FilterGraph();
IGraphBuilder secondgraph = (IGraphBuilder) new FilterGraph();
Create a Bridge which from the GMFBridge dll, will connect the two graphs
IGMFBridgeController Bridge = (IGMFBridgeController) new GMFBridgeController();
From here you setup the bridge to allow for muxed inputs
Bridge.AddStream(1, eFormatType.eMuxInputs, 1);
From here you can add your source video filter (it doesn't need to be connected to the bridge), add a Smart Tee, and connect the source to the Smart Tee.
Then create a filter to house the first bridge filter that will do the work:
BridgeSinkFilter = (IBaseFilter)Bridge.InsertSinkFilter(firstgraph);
This filter will continuously accept video from the Capture pin of the Smart Tee. If the second graph's bridge filter is connected and running, it passes the video from BridgeSinkFilter into the second graph; otherwise it just discards it, but it is always running.
Connect the BridgeSinkFilter to the Capture pin of the Smart Tee.
I found the best way to connect the pins is to use FindPinByDirection from https://splicer.svn.codeplex.com/svn/src/Splicer/Utilities/FilterGraphTools.cs and then just call
firstgraph.Connect(pinout, pinin)
as sketched below.
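A rough approximation of what such a helper looks like (my own sketch, not the exact Splicer code); it returns the first unconnected pin in the requested direction, which is what the calls above rely on when the Smart Tee's two output pins are connected one after the other:
// Sketch of a FindPinByDirection-style helper (not the exact Splicer implementation).
// Returns the first *unconnected* pin of the given direction, or null if none is found.
static IPin FindPinByDirection(IBaseFilter filter, PinDirection direction)
{
    IEnumPins pinEnum;
    DsError.ThrowExceptionForHR(filter.EnumPins(out pinEnum));
    IPin[] pins = new IPin[1];
    while (pinEnum.Next(1, pins, IntPtr.Zero) == 0)
    {
        PinDirection dir;
        pins[0].QueryDirection(out dir);
        IPin other;
        bool connected = pins[0].ConnectedTo(out other) == 0;
        if (connected)
            Marshal.ReleaseComObject(other);
        if (dir == direction && !connected)
        {
            Marshal.ReleaseComObject(pinEnum);
            return pins[0];
        }
        Marshal.ReleaseComObject(pins[0]);
    }
    Marshal.ReleaseComObject(pinEnum);
    return null;
}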
From here, to preview the video, add the AVI Decompressor filter from FilterCategory.LegacyAmFilterCategory and connect it to the Preview pin of the Smart Tee. Then add a Video Renderer and connect it to the AVI Decompressor.
That should take care of the first graph.
The second graph needs to start with the bridge. It pulls the stream from the first graph's BridgeSinkFilter into a second graph where we can do whatever we want with it. To do this we need the other side of the bridge:
IBaseFilter BridgeSourceFilter = (IBaseFilter)Bridge.InsertSourceFilter(BridgeSinkFilter,secondgraph);
This sets the source as the sink filter from the first graph but puts it in our second graph under the guise of BridgeSourceFilter.
Now add an encoder (ffdshow video encoder, etc.) and connect it to the BridgeSourceFilter.
Add a muxer (AVI Mux) and a File Writer, and connect them. That's everything for the second graph.
To finish the graphs off, we need two media controls that can start and stop the graphs:
IMediaControl MediaControl_FirstGraph = (IMediaControl)firstgraph;
IMediaControl MediaControl_SecondGraph = (IMediaControl)secondgraph;
Now we can call
MediaControl_FirstGraph.Run() to start previewing the video.
Then, to capture that video, we connect the bridge between the first and second graphs and run the second graph:
Bridge.BridgeGraphs(BridgeSinkFilter,BridgeSourceFilter);
MediaControl_SecondGraph.Run()
At any point, you can stop capturing by breaking the bridge connection and then stopping the second graph:
Bridge.BridgeGraphs(null, null);
MediaControl_SecondGraph.Stop();
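To start a new capture file without touching the preview graph (a hedged sketch based only on the calls above; the file name is just an example), the usual pattern is: drop the bridge, stop the capture graph, set a new file name on the muxer's IFileSinkFilter (fs in the question's code), run it again, and re-bridge.
// Hedged sketch: switch to a new output file while the preview graph keeps running.
Bridge.BridgeGraphs(null, null);                            // stop feeding the capture graph
MediaControl_SecondGraph.Stop();                            // stop the capture graph
fs.SetFileName("c:\\cool2.mkv", null);                      // example: point the muxer at the next file
MediaControl_SecondGraph.Run();                             // restart the capture graph
Bridge.BridgeGraphs(BridgeSinkFilter, BridgeSourceFilter);  // resume data flow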
I think that about covers what I've found out about GMFBridge :)
Hopefully this is up to Geraint Davies's standards.
If there is even the slightest error anywhere in the second graph, running the second graph will stop the first graph; that's a good indication something is wrong. For example, if you make AVI Mux -> File Writer try to save to a location that doesn't exist, it will stop the graph.
This code creates a new window and puts the video in it. To render the video into a panel on a form instead, you really only need to add four lines:
IVideoWindow videoWindow = firstgraph as IVideoWindow;
videoWindow.put_Owner(panel1.Handle);
videoWindow.put_WindowStyle(WindowStyle.Child | WindowStyle.ClipChildren);
videoWindow.SetWindowPosition(panel1.ClientRectangle.Left, panel1.ClientRectangle.Top, panel1.ClientRectangle.Width, panel1.ClientRectangle.Height);

Samplegrabber works fine on AVI/MPEG files but choppy with WMV

I have been using the latest version of WPFMediaKit. What I am trying to do is write a sample application that uses the SampleGrabber to capture the video frames of video files so I can have them as individual Bitmaps.
So far, I have had good luck with the following code for constructing and rendering my graph. However, when I use it to play back a .wmv video file with the SampleGrabber attached, playback is jumpy or choppy. If I comment out the line where I add the SampleGrabber filter, it plays fine. Again, the SampleGrabber works correctly with AVI/MPEG, etc.
protected virtual void OpenSource()
{
FrameCount = 0;
/* Make sure we clean up any remaining mess */
FreeResources();
if (m_sourceUri == null)
return;
string fileSource = m_sourceUri.OriginalString;
if (string.IsNullOrEmpty(fileSource))
return;
try
{
/* Creates the GraphBuilder COM object */
m_graph = new FilterGraphNoThread() as IGraphBuilder;
if (m_graph == null)
throw new Exception("Could not create a graph");
/* Add our preferred audio renderer */
InsertAudioRenderer(AudioRenderer);
var filterGraph = m_graph as IFilterGraph2;
if (filterGraph == null)
throw new Exception("Could not QueryInterface for the IFilterGraph2");
IBaseFilter renderer = CreateVideoMixingRenderer9(m_graph, 1);
IBaseFilter sourceFilter;
/* Have DirectShow find the correct source filter for the Uri */
var hr = filterGraph.AddSourceFilter(fileSource, fileSource, out sourceFilter);
DsError.ThrowExceptionForHR(hr);
/* We will want to enum all the pins on the source filter */
IEnumPins pinEnum;
hr = sourceFilter.EnumPins(out pinEnum);
DsError.ThrowExceptionForHR(hr);
IntPtr fetched = IntPtr.Zero;
IPin[] pins = { null };
/* Counter for how many pins successfully rendered */
int pinsRendered = 0;
m_sampleGrabber = (ISampleGrabber)new SampleGrabber();
SetupSampleGrabber(m_sampleGrabber);
hr = m_graph.AddFilter(m_sampleGrabber as IBaseFilter, "SampleGrabber");
DsError.ThrowExceptionForHR(hr);
/* Loop over each pin of the source filter */
while (pinEnum.Next(pins.Length, pins, fetched) == 0)
{
if (filterGraph.RenderEx(pins[0],
AMRenderExFlags.RenderToExistingRenderers,
IntPtr.Zero) >= 0)
pinsRendered++;
Marshal.ReleaseComObject(pins[0]);
}
Marshal.ReleaseComObject(pinEnum);
Marshal.ReleaseComObject(sourceFilter);
if (pinsRendered == 0)
throw new Exception("Could not render any streams from the source Uri");
/* Configure the graph in the base class */
SetupFilterGraph(m_graph);
HasVideo = true;
/* Sets the NaturalVideoWidth/Height */
//SetNativePixelSizes(renderer);
}
catch (Exception ex)
{
/* This exception usually happens if the media does
* not exist or could not be opened due to not having
* the proper filters installed */
FreeResources();
/* Fire our failed event */
InvokeMediaFailed(new MediaFailedEventArgs(ex.Message, ex));
}
InvokeMediaOpened();
}
And:
private void SetupSampleGrabber(ISampleGrabber sampleGrabber)
{
FrameCount = 0;
var mediaType = new AMMediaType
{
majorType = MediaType.Video,
subType = MediaSubType.RGB24,
formatType = FormatType.VideoInfo
};
int hr = sampleGrabber.SetMediaType(mediaType);
DsUtils.FreeAMMediaType(mediaType);
DsError.ThrowExceptionForHR(hr);
hr = sampleGrabber.SetCallback(this, 0);
DsError.ThrowExceptionForHR(hr);
}
I have read a few things saying that the .wmv or .asf formats are asynchronous or something. I have tried inserting a WMAsfReader to decode, which works, but once it goes to the VMR9 it shows the same behavior. I have also gotten it to work correctly by commenting out the IBaseFilter renderer = CreateVideoMixingRenderer9(m_graph, 1); line and calling filterGraph.Render(pins[0]); instead -- the only drawback is that it then renders in an ActiveMovie window of its own instead of my control, but the SampleGrabber functions correctly and without any skipping. So I am thinking the bug is somewhere in the VMR9 / sample grabbing.
Any help? I am new to this.
Some decoders will use hardware acceleration using DXVA. This is implemented by negotiating a partly-decoded format, and passing this partly decoded data to the renderer to complete decoding and render. If you insert a sample grabber configured to RGB24 between the decoder and the renderer, you will disable hardware acceleration.
That, I'm sure, is the crux of the problem. The details are still a little vague, I'm afraid, such as why it works when you use the default VMR-7 but fails when you use the VMR-9. I would guess that the decoder is trying to use DXVA and failing in the VMR-9 case, but has a reasonable software-only fallback that works well with the VMR-7.
I'm not familiar with WPFMediaKit, but I would think the simplest solution is to replace the explicit VMR-9 creation with an explicit VMR-7 creation. That is, if the decoder works software-only with the VMR-7, use that and concentrate on fixing the window-reparenting issue.
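A minimal sketch of that swap, assuming the m_graph field from the question (VideoMixingRenderer is DirectShowLib's VMR-7 class, VideoMixingRenderer9 the VMR-9 one); wiring the renderer into WPFMediaKit's presentation surface is left out:
// Hedged sketch: add a VMR-7 instead of the explicit VMR-9 from CreateVideoMixingRenderer9.
IBaseFilter vmr7 = (IBaseFilter)new VideoMixingRenderer();   // VMR-7
int hr = m_graph.AddFilter(vmr7, "Video Mixing Renderer");
DsError.ThrowExceptionForHR(hr);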
It turned out that the code I posted (which I had fairly shamelessly, and only slightly, modified from Jeremiah Morrill's WPFMediaKit source code) was in fact adequate to render the .WMV files and sample-grab them.
It seems the choppiness has something to do with running under the VS debugger, or with VS2008 itself. After messing around with the XAML in the visual editor for a while and then running the app, the choppy behavior appears. Shutting down VS2008 seems to remedy it. :P
So not much of an answer, but at least a fix (annoying as it is to restart VS2008) when it crops up.
