I am trying to receive a desktop screen capture from my PC using WPF (both client and server side are WPF).
Here is my code:
private void ViewReceivedImage(byte[] buffer)
{
    try
    {
        using (MemoryStream memoryStream = new MemoryStream(buffer))
        {
            BitmapImage imageSource = new BitmapImage();
            imageSource.BeginInit();
            imageSource.StreamSource = memoryStream;
            imageSource.EndInit();
            // Assign the Source property of your image
            MyImage.Source = imageSource;
        }
        //MemoryStream ms = new MemoryStream(buffer);
        //BitmapImage bi = new BitmapImage();
        //bi.SetSource(ms);
        //MyImage.Source = bi;
        //ms.Close();
    }
    catch (Exception) { }
    finally
    {
        StartReceiving();
    }
}
The commented lines above are for the Windows Phone app; I have already tested them and they work with WP8 (client side). I have the problem only on the WPF client side.
The WPF server side and the WP8 client side work.
The WPF client side does not work, even though the connection is successful.
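For what it's worth, a likely cause of the blank image on the WPF client (my reading; the thread does not confirm it): BitmapImage decodes its StreamSource lazily, and the using block above disposes the MemoryStream before the image is ever rendered. Setting BitmapCacheOption.OnLoad forces decoding while the stream is still open. A sketch, keeping the original method shape:

```csharp
using System.IO;
using System.Windows.Media.Imaging;

private void ViewReceivedImage(byte[] buffer)
{
    try
    {
        using (var memoryStream = new MemoryStream(buffer))
        {
            var imageSource = new BitmapImage();
            imageSource.BeginInit();
            // Decode the image now, before the stream is disposed.
            imageSource.CacheOption = BitmapCacheOption.OnLoad;
            imageSource.StreamSource = memoryStream;
            imageSource.EndInit();
            imageSource.Freeze(); // frozen bitmaps can be used across threads
            MyImage.Source = imageSource;
        }
    }
    finally
    {
        StartReceiving();
    }
}
```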
This method sends the image:
void StartSending()
{
    while (!stop)
    {
        try
        {
            System.Drawing.Image oldimage = scr.Get_Resized_Image(wToCompare, hToCompare, scr.GetDesktopBitmapBytes());
            //Thread.Sleep(1);
            System.Drawing.Image newimage = scr.Get_Resized_Image(wToCompare, hToCompare, scr.GetDesktopBitmapBytes());
            byte[] buffer = scr.GetDesktop_ResizedBytes(wToSend, hToSend);
            //float difference = scr.difference(newimage, oldimage);
            //if (difference >= 1)
            //{
            SenderSocket.Send(buffer);
            //}
        }
        catch (Exception) { }
    }
}
and this is Get_Resized_Image:
public Image Get_Resized_Image(int w, int h, byte[] image)
{
    MemoryStream ms = new MemoryStream(image);
    Image bt = Image.FromStream(ms);
    try
    {
        Size sizing = new Size(w, h);
        bt = new System.Drawing.Bitmap(bt, sizing);
    }
    catch (Exception) { }
    return bt;
}
Edit: this is the output (screenshot not shown).
Hello, I would like to send an image of the screen over UDP, but I get an error about the buffer length on the client side. How can I compress the image to send it, or change the UDP byte[] limit? I already tried compressing it with Deflate and changing the image size, but the picture quality was bad (I used code from another post for the quality; it did not work). I want to show this image in a PictureBox in the other app.
static Rectangle bounds = Screen.GetBounds(Point.Empty);
static UdpClient socket = new UdpClient();

static void Main(string[] args)
{
    Console.WriteLine(GetScreenShot().Length);
    socket.Connect(new IPEndPoint(IPAddress.Parse("myServerAddress"), 6665));
    while (true)
    {
        byte[] arr = GetScreenShot();
        socket.Send(arr, arr.Length);
    }
}

static byte[] GetScreenShot()
{
    using (MemoryStream ms = new MemoryStream())
    {
        using (Bitmap bitmap = new Bitmap(bounds.Width, bounds.Height))
        {
            using (Graphics g = Graphics.FromImage(bitmap))
            {
                g.CopyFromScreen(Point.Empty, Point.Empty, bounds.Size);
            }
            bitmap.Save(ms, ImageFormat.Jpeg);
        }
        return Compress(ms.ToArray());
    }
}

static byte[] Compress(byte[] b)
{
    using (MemoryStream ms = new MemoryStream())
    {
        using (DeflateStream deflateStream = new DeflateStream(ms, CompressionMode.Compress))
        {
            deflateStream.Write(b, 0, b.Length);
        }
        return ms.ToArray();
    }
}
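For context (an assumption about the error, since the exact exception is not quoted above): a single UDP datagram cannot carry more than 65,507 bytes of payload, so a full-screen JPEG will often exceed the limit no matter how Deflate is tuned. One common workaround is to split each frame into numbered chunks that each fit in one datagram and reassemble them on the receiver. A minimal sketch with hypothetical helper names:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical helpers (not from the original post): split one encoded frame
// into datagram-sized chunks and join them back. Each chunk starts with a
// 4-byte header: chunk index (2 bytes, little-endian) + chunk count (2 bytes).
static class UdpFraming
{
    public const int MaxPayload = 60000; // safely under the 65,507-byte UDP limit

    public static List<byte[]> Split(byte[] frame)
    {
        int count = (frame.Length + MaxPayload - 1) / MaxPayload;
        var chunks = new List<byte[]>(count);
        for (int i = 0; i < count; i++)
        {
            int offset = i * MaxPayload;
            int size = Math.Min(MaxPayload, frame.Length - offset);
            var datagram = new byte[4 + size];
            datagram[0] = (byte)(i & 0xFF);        // chunk index, low byte
            datagram[1] = (byte)(i >> 8);          // chunk index, high byte
            datagram[2] = (byte)(count & 0xFF);    // chunk count, low byte
            datagram[3] = (byte)(count >> 8);      // chunk count, high byte
            Buffer.BlockCopy(frame, offset, datagram, 4, size);
            chunks.Add(datagram);
        }
        return chunks;
    }

    // Receiver side: concatenate the payloads in index order.
    public static byte[] Reassemble(IEnumerable<byte[]> datagrams)
    {
        return datagrams
            .OrderBy(d => d[0] | (d[1] << 8))
            .SelectMany(d => d.Skip(4))
            .ToArray();
    }
}
```

Each chunk would then be sent with `socket.Send(chunk, chunk.Length)`. Note that UDP may drop or reorder datagrams, so a lost chunk means discarding that frame; if every frame must arrive intact, TCP with length-prefixed frames is usually simpler.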
I simply want to stream a screen capture over the TCP protocol in C#.
private Bitmap bmpScreenshot;

private byte[] screenToByteArray()
{
    byte[] result;
    try
    {
        if (bmpScreenshot != null)
            bmpScreenshot.Dispose();
        bmpScreenshot = new Bitmap(SystemInformation.VirtualScreen.Width,
                                   SystemInformation.VirtualScreen.Height,
                                   PixelFormat.Format32bppArgb);
        using (var gfxScreenshot = Graphics.FromImage(bmpScreenshot))
        {
            gfxScreenshot.CopyFromScreen(SystemInformation.VirtualScreen.X,
                                         SystemInformation.VirtualScreen.Y,
                                         0,
                                         0,
                                         SystemInformation.VirtualScreen.Size,
                                         CopyPixelOperation.SourceCopy);
            result = ImageToByte(bmpScreenshot);
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine("[ERROR]screenToByteArray Error..{0}", ex.Message);
        result = null;
    }
    return result;
}

private byte[] ImageToByte(Image iImage)
{
    if (mMemoryStream != null)
        mMemoryStream.Dispose();
    mMemoryStream = new MemoryStream();
    iImage.Save(mMemoryStream, ImageFormat.Png);
    if (iImage != null)
        iImage.Dispose();
    return mMemoryStream.ToArray();
}
I use that code and send the result of screenToByteArray(), but I have a problem. If my screen does not contain much imagery (first screenshot), the listener sees the correct display, but when my screen shows complicated images (second screenshot), the listener sees a distorted display (third screenshot). Whenever my screen has any complicated image, the listener can't see the whole display. How can I fix that? Thanks for your help.
EDIT
My TCP code is below:
Socket socket = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
socket.Connect(new IPEndPoint(IPAddress.Parse("192.168.1.109"), 8500));
while (true)
{
    try
    {
        byte[] sendData = screenToByteArray();
        socket.Send(sendData, sendData.Length, SocketFlags.None);
        sendData = null;
    }
    catch (Exception ex)
    {
        Console.WriteLine("[ERROR]sendScreen Error..{0}", ex.Message);
        socket.Dispose();
        break;
    }
}
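A likely cause of the distortion (my assumption, based on the code above rather than anything stated in the thread): TCP is a byte stream, so a single Send can arrive as several Receives, and larger PNG frames are more likely to be split or merged mid-image on the receiving side. A common fix is to prefix each frame with its length so the receiver knows exactly where one image ends and the next begins. A minimal sketch with hypothetical helper names:

```csharp
using System;
using System.IO;

// Hypothetical length-prefix framing (not from the original post).
// The sender writes a 4-byte little-endian length before each frame;
// the receiver reads exactly that many bytes before decoding the image.
static class Framing
{
    public static void WriteFrame(Stream stream, byte[] frame)
    {
        byte[] header = BitConverter.GetBytes(frame.Length);
        stream.Write(header, 0, 4);
        stream.Write(frame, 0, frame.Length);
    }

    public static byte[] ReadFrame(Stream stream)
    {
        byte[] header = ReadExactly(stream, 4);
        int length = BitConverter.ToInt32(header, 0);
        return ReadExactly(stream, length);
    }

    // Loops until 'count' bytes arrive; a single Read may return fewer.
    static byte[] ReadExactly(Stream stream, int count)
    {
        var buffer = new byte[count];
        int read = 0;
        while (read < count)
        {
            int n = stream.Read(buffer, read, count - read);
            if (n == 0) throw new EndOfStreamException("connection closed mid-frame");
            read += n;
        }
        return buffer;
    }
}
```

On the sender, the socket can be wrapped in a NetworkStream and each capture sent with `Framing.WriteFrame(networkStream, sendData)`; the listener calls `ReadFrame` in a loop and decodes each returned byte array as one complete image.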
I solved it by changing the streamed picture format:
private byte[] ImageToByte(Image iImage)
{
    if (mMemoryStream != null)
        mMemoryStream.Dispose();
    mMemoryStream = new MemoryStream();
    iImage.Save(mMemoryStream, ImageFormat.Jpeg);
    if (iImage != null)
        iImage.Dispose();
    return mMemoryStream.ToArray();
}
I am writing a C# WPF application that uses the AForge library for video streaming.
Everything worked on the first PC, so I wanted to deploy the application.
Then I deployed it, and on the other PC I get the following error:
"Must create DependencySource on same Thread as the DependencyObject", even when using the Dispatcher.
This is the source code; I call it on every new frame I get from the webcam:
private void video_NewFrame(object sender, NewFrameEventArgs eventArgs)
{
    if (StreamRunning)
    {
        try
        {
            using (var bitmap = (Bitmap)eventArgs.Frame.Clone())
            {
                Image = ToBitmapImage(bitmap);
            }
            Image.Freeze();
        }
        catch (Exception e)
        {
            UIMessages = "Error: NewFrame " + e.Message;
        }
    }
}
ToBitmapImage Method:
private BitmapImage ToBitmapImage(Bitmap bitmap)
{
    var start = 420;
    var end = 1920 - 2 * 420;
    BitmapImage bi = new BitmapImage();
    bi.BeginInit();
    MemoryStream ms = new MemoryStream();
    Bitmap source = bitmap;
    Bitmap CroppedImage = source.Clone(new System.Drawing.Rectangle(start, 0, end, 1080), source.PixelFormat);
    CroppedImage.Save(ms, ImageFormat.Bmp);
    ms.Seek(0, SeekOrigin.Begin);
    bi.StreamSource = ms;
    bi.EndInit();
    return bi;
}
Further code:
private BitmapImage _image;
public BitmapImage Image {
get => _image;
set
{
_image = value;
OnPropertyChanged();
}
}
The start of the camera:
if (SelectedDevice != null)
{
_videoSource = new VideoCaptureDevice(SelectedDevice.MonikerString);
//var test = _videoSource.VideoCapabilities;
_videoSource.NewFrame += video_NewFrame;
_videoSource.Start();
}
UI:
<Image Height="400" Width="400" Source="{Binding Image, UpdateSourceTrigger=PropertyChanged}" Margin="0,0,0,0"/>
I also tried this out:
Dispatcher.CurrentDispatcher.Invoke(() => Image = ToBitmapImage(bitmap));
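One possible reason that Dispatcher attempt did not help (my reading, not confirmed in the question): Dispatcher.CurrentDispatcher, when called from the AForge capture thread, returns a dispatcher for that worker thread rather than for the UI thread. Marshaling through the application's dispatcher would look like this sketch:

```csharp
// Sketch only: marshal the property assignment to the WPF UI thread.
// Application.Current.Dispatcher is always the UI thread's dispatcher,
// whereas Dispatcher.CurrentDispatcher, called on a worker thread,
// creates/returns a dispatcher for that worker thread.
System.Windows.Application.Current.Dispatcher.Invoke(() =>
{
    Image = ToBitmapImage(bitmap);
});
```

Alternatively, a frozen Freezable (such as a BitmapImage after Freeze()) can be read from any thread, which avoids the marshaling entirely.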
When loading a BitmapImage from a Stream, you would usually close the Stream as soon as possible and make sure the BitmapImage is loaded immediately, by setting BitmapCacheOption.OnLoad.
private static BitmapImage ToBitmapImage(Bitmap bitmap)
{
var start = 420;
var end = 1920 - 2 * 420;
var croppedBitmap = bitmap.Clone(
new System.Drawing.Rectangle(start, 0, end, 1080),
bitmap.PixelFormat);
var bi = new BitmapImage();
using (var ms = new MemoryStream())
{
croppedBitmap.Save(ms, ImageFormat.Bmp);
ms.Seek(0, SeekOrigin.Begin);
bi.BeginInit();
bi.CacheOption = BitmapCacheOption.OnLoad;
bi.StreamSource = ms;
bi.EndInit();
}
bi.Freeze();
return bi;
}
I cannot decode images back from their encoded (JPEG) byte-array form, retrieved from my database, to use them as image sources in my WPF application.
The code I am using to encode them as a JPEG byte array is as follows:
public byte[] bytesFromBitmap(BitmapImage bit)
{
byte[] data;
JpegBitmapEncoder encoder = new JpegBitmapEncoder();
encoder.Frames.Add(BitmapFrame.Create(bit));
using (MemoryStream ms = new MemoryStream())
{
encoder.Save(ms);
data = ms.ToArray();
}
return data;
}
The image is taken directly from a webpage and assigned to an Image control like so:
var img = new BitmapImage(new Uri(entity.Image.ImageSrc)); //the entity has been saved in my DB, having been parsed from html
pbImage.Source = img;
This works just fine; the BitmapImage is encoded and saved without problems. But when I retrieve it from the DB and try to display it in another window, I cannot get it to work after trying every example I can find online: all of them render nothing, a black box, or a visual mess not at all similar to the image I encoded.
Neither of the following have worked for me:
public BitmapSource GetBMImage(byte[] data)
{
using (var ms = new MemoryStream(data))
{
JpegBitmapDecoder decoder = new JpegBitmapDecoder(ms, BitmapCreateOptions.PreservePixelFormat, BitmapCacheOption.Default);
BitmapSource frame = decoder.Frames[0];
return frame;
}
}
public static BitmapImage ImageFromBytes(byte[] imageData)
{
    if (imageData == null)
    {
        return null;
    }
    else
    {
        var image = new BitmapImage();
        using (var mem = new MemoryStream())
        {
            mem.Position = 0;
            image.BeginInit();
            image.CreateOptions = BitmapCreateOptions.PreservePixelFormat;
            image.CacheOption = BitmapCacheOption.OnLoad;
            image.UriSource = null;
            image.StreamSource = mem;
            image.EndInit();
        }
        image.Freeze();
        return image;
    }
} //this throws a 'No imaging component suitable to complete this operation was found' exception
Among other uses of memory streams and decoders I just can't get this to work - can anyone help?
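One visible problem in the ImageFromBytes version above: the MemoryStream is constructed empty, and imageData is never written into it, so the decoder sees zero bytes (which matches the "No imaging component suitable" exception). A sketch of the same method with the bytes passed into the stream:

```csharp
using System.IO;
using System.Windows.Media.Imaging;

public static BitmapImage ImageFromBytes(byte[] imageData)
{
    if (imageData == null)
        return null;

    var image = new BitmapImage();
    // Construct the stream over the JPEG bytes instead of an empty stream.
    using (var mem = new MemoryStream(imageData))
    {
        image.BeginInit();
        image.CreateOptions = BitmapCreateOptions.PreservePixelFormat;
        image.CacheOption = BitmapCacheOption.OnLoad; // decode now, before the stream is disposed
        image.StreamSource = mem;
        image.EndInit();
    }
    image.Freeze(); // safe to use from any thread afterwards
    return image;
}
```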
I have a Windows Forms application written in C# which uses the embedded Windows Media Player (AxInterop.WMPLib.dll and WMPLib.dll) to play video files. Now I need to add an option to capture an image from the video on a button click. If I set the windowless option to true, I am able to capture an image of the video, but with that option enabled I don't see the video at all on some computers. Without the windowless option I only get a black screen with this code:
System.Drawing.Image ret = null;
try
{
    Bitmap bitmap = new Bitmap(wmPlayer.Width - 26, wmPlayer.Height - 66);
    {
        Graphics g = Graphics.FromImage(bitmap);
        {
            Graphics gg = wmPlayer.CreateGraphics();
            {
                this.BringToFront();
                g.CopyFromScreen(
                    wmPlayer.PointToScreen(
                        new System.Drawing.Point()).X + 13,
                    wmPlayer.PointToScreen(
                        new System.Drawing.Point()).Y,
                    0, 0,
                    new System.Drawing.Size(
                        wmPlayer.Width - 26,
                        wmPlayer.Height - 66)
                );
            }
        }
        using (MemoryStream ms = new MemoryStream())
        {
            bitmap.Save(ms, System.Drawing.Imaging.ImageFormat.Png);
            ret = System.Drawing.Image.FromStream(ms);
            ret.Save(@"C:\WMP_capture.png");
            pictureBox1.Image = ret;
        }
    }
    bitmap.Dispose();
}
catch (Exception) { }
How can I capture a frame (snapshot) from a video playing in the embedded Windows Media Player without the windowless option in C#?
Or is there any other video player for C# Windows Forms that can be easily integrated and supports capture functionality?
Hope this code works for you:
if (!string.IsNullOrEmpty(axWindowsMediaPlayer1.URL))
{
    axWindowsMediaPlayer1.Ctlcontrols.pause();
    System.Drawing.Image ret = null;
    try
    {
        // take picture BEFORE saveFileDialog pops up!!
        Bitmap bitmap = new Bitmap(axWindowsMediaPlayer1.Width, axWindowsMediaPlayer1.Height);
        {
            Graphics g = Graphics.FromImage(bitmap);
            {
                Graphics gg = axWindowsMediaPlayer1.CreateGraphics();
                {
                    //timerTakePicFromVideo.Start();
                    this.BringToFront();
                    g.CopyFromScreen(
                        axWindowsMediaPlayer1.PointToScreen(
                            new System.Drawing.Point()).X,
                        axWindowsMediaPlayer1.PointToScreen(
                            new System.Drawing.Point()).Y,
                        0, 0,
                        new System.Drawing.Size(
                            axWindowsMediaPlayer1.Width,
                            axWindowsMediaPlayer1.Height)
                    );
                }
            }
            // afterwards save bitmap file if user wants to
            if (saveFileDialog1.ShowDialog() == DialogResult.OK)
            {
                using (MemoryStream ms = new MemoryStream())
                {
                    bitmap.Save(ms, System.Drawing.Imaging.ImageFormat.Png);
                    ret = System.Drawing.Image.FromStream(ms);
                    ret.Save(saveFileDialog1.FileName);
                }
            }
        }
    }
    catch (Exception ex)
    {
        Debug.WriteLine(ex.Message);
    }
}
One more demo for you: http://www.codeproject.com/Articles/34663/DirectShow-Examples-for-Using-SampleGrabber-for-Gr