Streaming FFmpeg RTSP in C#

I wish to stream an RTSP feed from my camera through my C# client app and eventually receive it on my server.
To test it, I decided to stream it through my client and then receive the JPEG frames from it as well.
I first tested my command-line arguments in a DOS window.
It worked, in that it streamed to an output file without incident.
So now I have ported it over to my C# app, using the Process class.
This is the code:
Task.Run(() =>
{
    try
    {
        process.Start();
        byte[] buffer = new byte[200000];
        baseStream = process.StandardOutput.BaseStream as FileStream;
        bool gotHeader = false;
        bool imgFound = false;
        int counter = 0;
        while (true)
        {
            byte bit1 = (byte)baseStream.ReadByte();
            buffer[counter] = bit1;
            if (bit1 == 217 && gotHeader)
            {
                if (buffer[counter - 1] == 255)
                {
                    byte[] jpeg = new byte[counter];
                    Buffer.BlockCopy(buffer, 0, jpeg, 0, counter);
                    using (MemoryStream ms = new MemoryStream(jpeg))
                    {
                        pictureBox1.Image = Image.FromStream(ms);
                        ms.Close();
                    }
                    imgFound = true;
                }
            }
            else if (bit1 == 216)
            {
                if (buffer[counter - 1] == 255)
                {
                    gotHeader = true;
                }
            }
            if (imgFound)
            {
                counter = 0;
                buffer = new byte[200000];
                imgFound = false;
                gotHeader = false;
            }
            else
            {
                counter++;
            }
        }
    }
    catch (Exception ex2)
    {
        // log error here
    }
});
But I do not get any images; I just get the byte value 255 at this line:
buffer[counter] = bit1;
So, what am I doing wrong, please?
Thanks
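For reference, here is a minimal sketch of the same idea that reads stdout in chunks rather than byte by byte and splits on the JPEG SOI (0xFF 0xD8, i.e. 255 216) and EOI (0xFF 0xD9, i.e. 255 217) markers. It reuses process and pictureBox1 from the question and assumes the Process was started with RedirectStandardOutput = true and ffmpeg arguments that write MJPEG to stdout (for example "-f image2pipe -c:v mjpeg -", which is an illustrative assumption, not the asker's command line):
// Sketch only, not the original code.
Stream stdout = process.StandardOutput.BaseStream;  // treat it as a plain Stream, not a FileStream
var frame = new MemoryStream();
var chunk = new byte[64 * 1024];
int previous = -1;
bool insideFrame = false;
int read;
while ((read = stdout.Read(chunk, 0, chunk.Length)) > 0)
{
    for (int i = 0; i < read; i++)
    {
        byte b = chunk[i];
        if (!insideFrame && previous == 0xFF && b == 0xD8)   // start of a JPEG frame
        {
            frame.SetLength(0);
            frame.WriteByte(0xFF);
            insideFrame = true;
        }
        if (insideFrame)
        {
            frame.WriteByte(b);
            if (previous == 0xFF && b == 0xD9)               // end of the JPEG frame
            {
                // Keep the MemoryStream alive for the lifetime of the Image (a GDI+ requirement),
                // and marshal this assignment to the UI thread in real code.
                pictureBox1.Image = Image.FromStream(new MemoryStream(frame.ToArray()));
                insideFrame = false;
            }
        }
        previous = b;
    }
}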

Related

Extract audio from video using autogen ffmpeg C# in Unity

Hi, I'm using FFmpeg.AutoGen to extract audio from video in Unity, but when I follow this code, the output file cannot be written; it stays at 0 KB. What is the issue here, or does someone have an example of extracting audio with this library? Apologies for my English. This is the library's GitHub page:
https://github.com/Ruslan-B/FFmpeg.AutoGen
unsafe void TestExtractAudio()
{
string inFile = Application.streamingAssetsPath + "/" + strFileName;
string outFile = Application.streamingAssetsPath + "/" + strFileNameAudio;
AVOutputFormat* outFormat = null;
AVFormatContext* inFormatContext = null;
AVFormatContext* outFormatContext = null;
AVPacket packet;
ffmpeg.av_register_all();
inFormatContext = ffmpeg.avformat_alloc_context();
outFormatContext = ffmpeg.avformat_alloc_context();
if (ffmpeg.avformat_open_input(&inFormatContext, inFile, null, null) < 0)
{
throw new ApplicationException("Could not open input file.");
}
if (ffmpeg.avformat_find_stream_info(inFormatContext, null) < 0)
{
throw new ApplicationException("Failed to retrieve input stream info.");
}
ffmpeg.avformat_alloc_output_context2(&outFormatContext, null, null, outFile);
if (outFormatContext == null)
{
throw new ApplicationException("Could not create output context");
}
outFormat = outFormatContext->oformat;
AVStream* inStream = inFormatContext->streams[1];
AVStream* outStream = ffmpeg.avformat_new_stream(outFormatContext, inStream->codec->codec);
if (outStream == null)
{
throw new ApplicationException("Failed to allocate output stream.");
}
if (ffmpeg.avcodec_copy_context(outStream->codec, inStream->codec) < 0)
{
throw new ApplicationException("Couldn't copy input stream codec context to output stream codec context");
}
outFormatContext->audio_codec_id = AVCodecID.AV_CODEC_ID_MP3;
int retcode = ffmpeg.avio_open(&outFormatContext->pb, outFile, ffmpeg.AVIO_FLAG_WRITE);
if (retcode < 0)
{
throw new ApplicationException("Couldn't open output file");
}
int returnCode = ffmpeg.avformat_write_header(outFormatContext, null);
if (returnCode < 0)
{
throw new ApplicationException("Error occurred opening output file.");
}
while (true)
{
if (ffmpeg.av_read_frame(inFormatContext, &packet) < 0)
{
break;
}
if (packet.stream_index == 1)
{
inStream = inFormatContext->streams[1];
outStream = outFormatContext->streams[0];
// TODO: Replicate log packet functionality to print out what's inside the packet.
packet.pts = ffmpeg.av_rescale_q_rnd(packet.pts, inStream->time_base, outStream->time_base,
AVRounding.AV_ROUND_NEAR_INF | AVRounding.AV_ROUND_PASS_MINMAX);
packet.dts = ffmpeg.av_rescale_q_rnd(packet.dts, inStream->time_base, outStream->time_base,
AVRounding.AV_ROUND_NEAR_INF | AVRounding.AV_ROUND_PASS_MINMAX);
packet.duration = ffmpeg.av_rescale_q(packet.duration, inStream->time_base, outStream->time_base);
int returncode = ffmpeg.av_interleaved_write_frame(outFormatContext, &packet);
}
ffmpeg.av_packet_unref(&packet);
}
ffmpeg.av_write_trailer(outFormatContext);
ffmpeg.avformat_close_input(&inFormatContext);
ffmpeg.avformat_free_context(outFormatContext);
Console.WriteLine("Press any key to continue...");
Console.ReadKey();
}
The value of returnCode is less than 0, so can someone help fix this? Thanks so much.
The problem is here:
inStream = inFormatContext->streams[1];
outStream = outFormatContext->streams[0];
This code:
ffmpeg.av_interleaved_write_frame
performs the following validation:
static int check_packet(AVFormatContext *s, AVPacket *pkt)
{
    if (pkt->stream_index < 0 || pkt->stream_index >= s->nb_streams) {
        av_log(s, AV_LOG_ERROR, "Invalid packet stream index: %d\n",
               pkt->stream_index);
        return AVERROR(EINVAL);
    }
    ...
}
Since the output context has only one stream (at index 0), you need to change packet.stream_index from 1 to 0 before calling:
int returncode = ffmpeg.av_interleaved_write_frame(outFormatContext, &packet);
See:
if (type == AVMediaType.AVMEDIA_TYPE_AUDIO)
{
    inStream = inFormatContext->streams[1];
    outStream = outFormatContext->streams[0];
    // TODO: Replicate log packet functionality to print out what's inside the packet.
    ffmpeg.av_packet_rescale_ts(&packet, inStream->time_base, outStream->time_base);
    packet.stream_index = 0;
    int returncode = ffmpeg.av_write_frame(outFormatContext, &packet);
}
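As an aside, hard-coding streams[1] as the audio stream is fragile; a minimal sketch of locating the audio stream index at runtime with FFmpeg.AutoGen (reusing the inFormatContext from the question) could look like this:
// Sketch only: ask FFmpeg for the best audio stream instead of assuming it is index 1.
int audioStreamIndex = ffmpeg.av_find_best_stream(
    inFormatContext, AVMediaType.AVMEDIA_TYPE_AUDIO, -1, -1, null, 0);
if (audioStreamIndex < 0)
{
    throw new ApplicationException("No audio stream found in the input file.");
}
AVStream* audioInStream = inFormatContext->streams[audioStreamIndex];
// Packets with packet.stream_index == audioStreamIndex are the ones to remux,
// and their stream_index must still be rewritten to 0 before writing, as above.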

Parsing variable-sized packets

I'm working on a multiplayer game and I'm having an issue with the way I parse packets from the connection. When I'm debugging, the game runs at lower performance and the packets are received; when I'm not debugging, packets aren't fully received and the ParsePacket method isn't called.
My packet structure is this:
2-byte short command, 2-byte short payload size, optional payload bytes
IInputStream inputStream = null;
DataReader dataReader = null;
byte[] data = new byte[1024];
IBuffer buffer = data.AsBuffer();
try
{
inputStream = StreamSocket.InputStream;
dataReader = DataReader.FromBuffer(buffer);
dataReader.InputStreamOptions = InputStreamOptions.Partial;
dataReader.ByteOrder = ByteOrder.LittleEndian;
while (connected)
{
await inputStream.ReadAsync(buffer, 1024, InputStreamOptions.Partial);
Debug.WriteLine("Buffer " + buffer.Length);
if (buffer.Length >= PacketHeaderSize)
{
short command = dataReader.ReadInt16();
short payloadSize = dataReader.ReadInt16();
byte[] payload = null;
if (payloadSize == 0)
{
UpdateBuffer(buffer, (uint)(PacketHeaderSize + payloadSize));
Packet packet = new Packet(command, payloadSize, payload);
ParsePacket(packet);
}
else if (payloadSize > 0)
{
if (buffer.Length >= (PacketHeaderSize + payloadSize))
{
payload = new byte[payloadSize];
dataReader.ReadBytes(payload);
UpdateBuffer(buffer, (uint)(PacketHeaderSize + payloadSize));
Packet packet = new Packet(command, payloadSize, payload);
ParsePacket(packet);
}
}
}
}
}
catch (Exception e) {
// ...
}
private void UpdateBuffer(IBuffer buffer, uint bytesRead)
{
if (buffer.Length > bytesRead)
{
byte[] bufferBytes = new byte[buffer.Length - bytesRead];
System.Buffer.BlockCopy(buffer.ToArray(), (int)bytesRead, bufferBytes, 0, (int)(buffer.Length - bytesRead));
buffer = bufferBytes.AsBuffer();
}
else
{
byte[] bufferBytes = new byte[1024];
buffer = bufferBytes.AsBuffer();
}
}
What am I doing wrong?
Things fixed to make this work:
Keep the important variables outside the packet-parsing loop, since their values are needed the next time another packet is parsed.
Read the packet header only once if an incomplete packet is received.
Load only the bytes we need.
Read the header and payload only once they have been fully received, using UnconsumedBufferLength.
Code:
short command = 0;
short payloadSize = 0;
byte[] payload = null;
bool packetHeaderRead = false;
while (connected)
{
    if (!packetHeaderRead)
    {
        if (dataReader.UnconsumedBufferLength < PacketHeaderSize)
        {
            int headerBytesLeft = PacketHeaderSize - (int)dataReader.UnconsumedBufferLength;
            if (headerBytesLeft > 0)
            {
                await dataReader.LoadAsync((uint)headerBytesLeft);
                continue;
            }
        }
        else
        {
            command = dataReader.ReadInt16();
            payloadSize = dataReader.ReadInt16();
            packetHeaderRead = true;
            continue;
        }
    }
    else
    {
        int payloadBytesLeft = payloadSize - (int)dataReader.UnconsumedBufferLength;
        if (payloadBytesLeft > 0)
        {
            await dataReader.LoadAsync((uint)payloadBytesLeft);
        }
        if (payloadSize == 0)
        {
            Packet packet = new Packet(command, payloadSize, payload);
            ParsePacket(packet);
            packetHeaderRead = false;
        }
        else if (dataReader.UnconsumedBufferLength >= payloadSize)
        {
            payload = new byte[payloadSize];
            dataReader.ReadBytes(payload);
            Packet packet = new Packet(command, payloadSize, payload);
            ParsePacket(packet);
            packetHeaderRead = false;
        }
    }
}
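For completeness, the sending side has to use the same framing; here is a minimal sketch of writing one packet with DataWriter, assuming a connected StreamSocket (called socket here) and the same little-endian byte order as the reader:
// Sketch only: write "2-byte command, 2-byte payload size, payload bytes" to the socket.
var writer = new DataWriter(socket.OutputStream);
writer.ByteOrder = ByteOrder.LittleEndian;   // must match the reader's ByteOrder
writer.WriteInt16(command);
writer.WriteInt16((short)payload.Length);
writer.WriteBytes(payload);                  // skip this call for zero-length payloads
await writer.StoreAsync();                   // actually pushes the buffered bytes to the stream
writer.DetachStream();                       // keep the socket stream usable after the writer is gone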

TcpClient performance on Xamarin.Android

I have encountered a problem with TcpClient running on an Android device. The issue is that reading from NetworkStream takes a lot of time. I use it to send bitmaps from a server application (also written in C#) to my Android device (Nexus 9). I push 6000 bitmaps, each around 1.4 KB in size. That takes around 30 seconds to complete (5 ms per bitmap). On the other hand, I also implemented the same thing in Java, and it takes 5 seconds (less than 1 ms per bitmap)!
Are you aware of any issues regarding TcpClient in Xamarin.Android? I have to admit this is very surprising, as I have never had similar problems with Xamarin.
I pasted some code below so you can check whether it is me who messed something up. At the very bottom I also pasted the Java code and, as you can see, it is very similar. Do you have any suggestions? I appreciate any feedback.
Here is the method I use to receive bitmaps (it runs asynchronously and is started through Task.Factory.StartNew()):
private void HandleRequestBitmaps(TcpClient client, CancellationToken token)
{
if (client == null)
{
throw new ArgumentNullException("client");
}
var encoder = new UTF8Encoding();
using (var stream = client.GetStream())
{
stream.WriteMessageWithLength(ServerCommand.PushBitmaps.ServerCommandToBytes(encoder));
var initialMetadataBytes = stream.ReadMessageWithLength(client.ReceiveBufferSize);
int[] initialMetadata;
if (!TryParseMetadataMessage(initialMetadataBytes, encoder, 3, out initialMetadata))
{
BitmapReceiveEnd(this, new BitmapReceiveEndEventArgs(false));
return;
}
BitmapsReceiveBegin(this,
new BitmapsReceiveBeginEventArgs(initialMetadata[0], initialMetadata[1], initialMetadata[2]));
while (!token.IsCancellationRequested)
{
var messageBytes = stream.ReadMessageWithLength(client.ReceiveBufferSize);
if (ServerCommand.PushBitmapsDone.Equals(messageBytes.BytesToServerCommand(encoder)))
{
BitmapReceiveEnd(this, new BitmapReceiveEndEventArgs(true));
break;
}
int[] metadata;
if (!TryParseMetadataMessage(messageBytes, encoder, 3, out metadata))
{
BitmapReceiveEnd(this, new BitmapReceiveEndEventArgs(false));
return;
}
var bitmapBytes = stream.ReadMessageWithLength(client.ReceiveBufferSize);
BitmapReceived(this,
new BitmapReceivedEventArgs(metadata[0], new Point(metadata[1], metadata[2]), bitmapBytes));
}
}
}
The TryParseMetadataMessage method looks like this:
private static bool TryParseMetadataMessage(byte[] message, Encoding encoder, int expectedLength,
out int[] metadata)
{
var messageString = encoder.GetString(message, 0, message.Length);
var messageStringSeparated = messageString.Split('x');
if (messageStringSeparated.Length != expectedLength)
{
metadata = new int[0];
return false;
}
var parameteres = new List<int>();
foreach (var messageStringPart in messageStringSeparated)
{
int value;
if (!int.TryParse(messageStringPart, out value))
{
metadata = new int[0];
return false;
}
parameteres.Add(value);
}
metadata = parameteres.ToArray();
return true;
}
}
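For clarity, the metadata messages are plain UTF-8 strings of integers joined with 'x'; a small illustrative call (the value "12x640x480" is made up, not taken from the original protocol):
// Illustrative only.
var encoder = new UTF8Encoding();
int[] metadata;
bool ok = TryParseMetadataMessage(encoder.GetBytes("12x640x480"), encoder, 3, out metadata);
// ok == true and metadata == { 12, 640, 480 };
// a malformed message such as "12x640" or "12xABCx480" returns false instead.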
The ReadMessageWithLength extension method looks like this:
public static byte[] ReadMessageWithLength(this Stream stream, int bufferSize = 2048)
{
if (stream == null)
{
throw new ArgumentNullException("stream");
}
var messageSizeBuffer = new byte[4];
if (stream.Read(messageSizeBuffer, 0, messageSizeBuffer.Length) < 1)
{
return new byte[0];
}
if (BitConverter.IsLittleEndian)
{
Array.Reverse(messageSizeBuffer);
}
var totalBytesToRead = BitConverter.ToInt32(messageSizeBuffer, 0);
if (totalBytesToRead < 0)
{
throw new Exception("Number of bytes to read cannot be negative!");
}
using (var memoryStream = new MemoryStream())
{
var buffer = new byte[bufferSize];
int chunkBytesRead, totalBytesRead = 0;
while (
(chunkBytesRead = stream.Read(buffer, 0, Math.Min(totalBytesToRead - totalBytesRead, buffer.Length))) >
0)
{
memoryStream.Write(buffer, 0, chunkBytesRead);
totalBytesRead += chunkBytesRead;
if (totalBytesRead >= totalBytesToRead)
{
break;
}
}
return memoryStream.ToArray();
}
}
Now for the Java code. The function used to receive bitmaps:
public void requestBitmaps() {
if (socket == null || !socket.isConnected()) {
throw new IllegalStateException("Socket has to be initialized and connected to call this method!");
}
requestBitmapsThread = new Thread(new Runnable() {
@Override
public void run() {
try {
InputStream sis = new BufferedInputStream(socket.getInputStream());
OutputStream sos = new BufferedOutputStream(socket.getOutputStream());
writeMessageWithLength(sos, ServerCommand.PushBitmaps.getValueBytes());
byte[] initialMetadataBytes = readMessageWithLength(sis, socket.getReceiveBufferSize());
int[] initialMetadata = parseMetadataMessage(initialMetadataBytes, 3);
if (initialMetadata == null) {
listener.onBitmapReceivedEnd(false);
return;
}
listener.onBitmapReceivedBegin(initialMetadata[0], initialMetadata[1], initialMetadata[2]);
while (!Thread.currentThread().isInterrupted()) {
byte[] messageBytes = readMessageWithLength(sis, socket.getReceiveBufferSize());
if (ServerCommand.PushBitmapsDone.equals(ServerCommand.fromValueBytes(messageBytes))) {
listener.onBitmapReceivedEnd(true);
break;
}
int[] metadata = parseMetadataMessage(messageBytes, 3);
if (metadata == null) {
listener.onBitmapReceivedEnd(false);
break;
}
byte[] bitmapBytes = readMessageWithLength(sis, socket.getReceiveBufferSize());
listener.onBitmapReceived(metadata[0], metadata[1], metadata[2], bitmapBytes);
}
} catch (IOException e) {
e.printStackTrace();
}
}
});
requestBitmapsThread.start();
}
And the parseMetadataMessage method:
private int[] parseMetadataMessage(byte[] metadataBytes, int expectedLength) {
String metadataString = new String(metadataBytes, Charset.forName("UTF-8"));
String[] metadataStringSeparated = metadataString.split("x");
if (metadataStringSeparated.length != expectedLength) {
return null;
}
try {
int[] parameters = new int[metadataStringSeparated.length];
for (int i = 0; i < metadataStringSeparated.length; i++) {
parameters[i] = Integer.parseInt(metadataStringSeparated[i]);
}
return parameters;
} catch (NumberFormatException e) {
e.printStackTrace();
return null;
}
}
And readMessageWithLength:
private byte[] readMessageWithLength(InputStream is, int bufferLength) throws IOException {
byte[] messageLengthBuffer = new byte[4];
if (is.read(messageLengthBuffer) < 0) {
return null;
}
ByteBuffer messageLengthByteBuffer = ByteBuffer.wrap(messageLengthBuffer).order(ByteOrder.BIG_ENDIAN);
int totalBytesToRead = messageLengthByteBuffer.getInt();
if (totalBytesToRead < 0) {
throw new IllegalStateException("Total bytes to read cannot be less than 0!");
}
ByteArrayOutputStream bitmapBytesBuffer = null;
try {
bitmapBytesBuffer = new ByteArrayOutputStream();
byte[] buffer = new byte[bufferLength];
int chunkBytesRead, totalBytesRead = 0;
while ((chunkBytesRead = is.read(buffer, 0, Math.min(totalBytesToRead - totalBytesRead, buffer.length))) > 0) {
bitmapBytesBuffer.write(buffer, 0, chunkBytesRead);
totalBytesRead += chunkBytesRead;
if (totalBytesRead >= totalBytesToRead) {
break;
}
}
return bitmapBytesBuffer.toByteArray();
} finally {
if (bitmapBytesBuffer != null) {
bitmapBytesBuffer.close();
}
}
}
If you have come this far, thanks for your attention. I appreciate any feedback and hope it will be possible to speed up the code.
Best regards,
Bartosz
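One visible difference between the two clients: the Java version wraps the socket streams in BufferedInputStream/BufferedOutputStream, while the C# version reads the raw NetworkStream, so every 4-byte length prefix and metadata message is a separate socket read. Below is a minimal sketch of adding the equivalent buffering on the .NET side (assuming the same framing, and that WriteMessageWithLength is, like ReadMessageWithLength, an extension method on Stream):
// Sketch only, not a guaranteed fix for the slowdown: the 8 KB buffer size is an illustrative choice.
using (var raw = client.GetStream())
using (var stream = new BufferedStream(raw, 8 * 1024))
{
    stream.WriteMessageWithLength(ServerCommand.PushBitmaps.ServerCommandToBytes(encoder));
    stream.Flush(); // push the buffered command out before waiting for the reply
    var initialMetadataBytes = stream.ReadMessageWithLength(client.ReceiveBufferSize);
    // ... the rest of HandleRequestBitmaps stays the same ...
}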

Send parts of a byte[] by webclient to create a large file in server side

I am trying to transfer a large byte[], so I chose to split it into chunks of 20 MB: for the first chunk received, the server creates the file and writes it; for the rest, it opens the existing file and appends the remaining data. The problem I am having is that instead of sending the first part and then reconnecting to send the second part, it establishes two connections and sends the two chunks at the same time. How can I send the second chunk only after the first has finished?
client.OpenWriteAsync(ub.Uri);
void client_OpenWriteCompleted(object sender, OpenWriteCompletedEventArgs e)
{
if (e.Cancelled)
{
MessageBox.Show("Cancelled");
}
else if (e.Error != null)
{
MessageBox.Show("Deu erro");
}
else
{
try
{
using (Stream output = e.Result)
{
int countbytes;
//for (int i = 0; i < max; i++)
//{
if ( (max+1) != maxAux)
{
countbytes = zippedMemoryStream.Read(PartOfDataSet, 0 , 20000000);//maxAux * 20000000
output.Write(PartOfDataSet, 0, (int)countbytes);
if (max != maxAux)
{
client.OpenWriteAsync(ub.Uri);
}
maxAux++;
}
//}
//numeroimagem++;
}
}
catch (Exception ex)
{
MessageBox.Show(ex.Message);
//throw;
}
}
}
public void ProcessRequest(HttpContext context)
{
//context.Response.ContentType = "text/plain";
//context.Response.Write("Hello World");
string ImageName = context.Request.QueryString["ImageName"];
string UploadPath = context.Server.MapPath("~/ServerImages/");
byte[] bytes = new byte[20000000];
int bytesToRead = 0;
if (!File.Exists(UploadPath + ImageName))
{
using (FileStream stream = File.Create(UploadPath + ImageName))
{
try
{
//List<byte> bytes = new List<byte>();
while ((bytesToRead =
context.Request.InputStream.Read(bytes, 0, bytes.Length)) != 0)
//context.Request.InputStream.Read(bytes, 0, 200000)) != 0)
{
stream.Write(bytes, 0, bytesToRead);
stream.Close();
}
bytes = null;
}
catch (Exception ex)
{
string error = ex.Message;
throw;
}
}
}
else
{
using (FileStream stream = File.Open(UploadPath + ImageName,FileMode.Append))
{
try
{
while ((bytesToRead =
context.Request.InputStream.Read(bytes, 0, bytes.Length)) != 0)
{
stream.Write(bytes, 0, bytesToRead);
stream.Close();
}
bytes = null;
}
catch (Exception ex)
{
string error = ex.Message;
throw;
}
}
}
}
public bool IsReusable
{
get
{
return false;
}
}
In fact you gain nothing by opening the client again. You can just loop over your zippedMemoryStream in chunks and write it to the output stream. Then the data will arrive in order on the server side.
Otherwise look into the UploadData methods, if you really want to create a new connection each time.
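For illustration, here is a minimal sketch of that single-connection loop, reusing the zippedMemoryStream and the output stream obtained in client_OpenWriteCompleted:
// Sketch only: send the whole compressed stream over one connection,
// writing it to the request stream in 20,000,000-byte chunks (the same size used above).
using (Stream output = e.Result)
{
    zippedMemoryStream.Position = 0; // start from the beginning of the data
    byte[] chunk = new byte[20000000];
    int bytesRead;
    while ((bytesRead = zippedMemoryStream.Read(chunk, 0, chunk.Length)) > 0)
    {
        output.Write(chunk, 0, bytesRead);
    }
}
Note also that in the handler shown above, stream.Close() is called inside the while loop, so the FileStream is closed after the first chunk read from the request and any further writes in that loop will fail; moving the Close out of the loop (or relying on the using block) avoids that.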

Web Service returned error : Not Found

The web service I am using returned a web exception. The service is called from a Silverlight application. Basically, I am trying to convert an EPS file into a PNG. On localhost everything works well, but when I deployed to the web server I got this error.
The Silverlight code is:
private void btnUploadImage_Click(object sender, RoutedEventArgs e)
{
string fileextn = string.Empty;
OpenFileDialog openDialog = new OpenFileDialog();
if (openDialog.ShowDialog() == true)
{
try
{
string fileExtension = openDialog.File.Extension.ToString();
if (fileExtension.Contains("jpeg") || fileExtension.Contains("jpg") || fileExtension.Contains("png") || fileExtension.Contains("tif") || fileExtension.Contains("tiff") || fileExtension.Contains("bmp") || fileExtension.Contains("gif"))
{
using (Stream stream = openDialog.File.OpenRead())
{
HtmlPage.Window.Invoke("showProcessingAndBlockUI");
// Don't allow really big files (more than 2 MB).
if (stream.Length < 5 * 1024 * 1025)
{
MemoryStream tempStream = new MemoryStream();
byte[] data = new byte[stream.Length];
stream.Position = 0;
stream.Read(data, 0, (int)stream.Length);
tempStream.Write(data, 0, (int)stream.Length);
stream.Close();
ProductConfiguratorServiceClient pcs = new ProductConfiguratorServiceClient();
string virtualpath = HelperClass.GetVirtual();
pcs.Endpoint.Address = new System.ServiceModel.EndpointAddress(virtualpath + "/Services/ProductConfiguratorService.svc/basic");
pcs.GetFormattedImageCompleted += new EventHandler<GetFormattedImageCompletedEventArgs>(pcs_GetFormattedImageCompleted);
pcs.GetFormattedImageAsync(data);
pcs.CloseAsync();
tempStream.Close();
}
else
{
MessageBox.Show("Files must be less than 5 MB.");
}
}
}
else if (openDialog.File.Extension.Contains("eps"))
{
HtmlPage.Window.Invoke("showProcessingAndBlockUI");
using (Stream stream = openDialog.File.OpenRead())
{
if (stream.Length < 5 * 1024 * 1025)
{
MemoryStream tempStream = new MemoryStream();
byte[] data = new byte[stream.Length];
stream.Position = 0;
stream.Read(data, 0, (int)stream.Length);
tempStream.Write(data, 0, (int)stream.Length);
stream.Close();
ProductConfiguratorServiceClient pcs = new ProductConfiguratorServiceClient();
string virtualpath = HelperClass.GetVirtual();
pcs.Endpoint.Address = new System.ServiceModel.EndpointAddress(virtualpath + "/Services/ProductConfiguratorService.svc/basic");
pcs.GetEpsFileIntoPngCompleted += new EventHandler<GetEpsFileIntoPngCompletedEventArgs>(pcs_GetEpsFileIntoPngCompleted);
pcs.GetEpsFileIntoPngAsync(data);
tempStream.Close();
}
else
{
MessageBox.Show("Files must be less than 5 MB.");
}
}
}
else
{
MessageBox.Show("Please Check the Image Format.");
}
}
catch (Exception)
{
HtmlPage.Window.Invoke("hideBlockUI");
MessageBox.Show("Somr Error Occured, Please Try Again Later .");
}
}
}
void pcs_GetEpsFileIntoPngCompleted(object sender, GetEpsFileIntoPngCompletedEventArgs e)
{
busi.CurrentlySelectedOverlayImage.UploadedImageStream = e.Result;
busi.CurrentlySelectedOverlayImage.ImageFileType = "png";
RefreshStatus();
busi.CurrentlySelectedOverlayImageChanged = true;
HtmlPage.Window.Invoke("hideBlockUI");
}
void pcs_GetFormattedImageCompleted(object sender, GetFormattedImageCompletedEventArgs e)
{
try
{
if (e.Error != null)
{
MessageBox.Show(e.Error.ToString());
}
else
{
busi.CurrentlySelectedOverlayImage.UploadedImageStream = e.Result;
busi.CurrentlySelectedOverlayImage.ImageFileType = "jpeg";
RefreshStatus();
busi.CurrentlySelectedOverlayImageChanged = true;
}
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
}
//throw new NotImplementedException();
}
The error is at System.Net.Browser.BrowserHttpWebRequest.
Please assist me: what kind of problem is this?
It could have something to do with this line:
pcs.Endpoint.Address = new System.ServiceModel.EndpointAddress(virtualpath +
You are missing the rest of it.
I'm pretty sure that would be a compile error, though.
So either that error is getting eaten and the generic one is popping up, or you have some other issue along with this.
