We have an Aiwit camera that connects to our Wi-Fi. I would like to connect to it (in this case using C#) and retrieve data from the camera. Does anyone with experience with these cameras have advice, in particular on determining the camera's port number?
Some code I'm working with right now:
TcpClient client = new TcpClient("192.168.1.50", 80); // substitute the camera's actual IP and port
NetworkStream stream = client.GetStream();
// Request an image (HTTP/1.1 requires a Host header; "Connection: close" lets the read loop terminate)
byte[] request = Encoding.ASCII.GetBytes("GET /image.jpg HTTP/1.1\r\nHost: 192.168.1.50\r\nConnection: close\r\n\r\n");
stream.Write(request, 0, request.Length);
// Read the response; a single Read is not guaranteed to return the whole image
MemoryStream ms = new MemoryStream();
byte[] buffer = new byte[8192];
int bytesRead;
while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
{
    ms.Write(buffer, 0, bytesRead);
}
// Note: ms still contains the HTTP response headers, which must be stripped before the JPEG will parse
ms.Position = 0;
Image image = Image.FromStream(ms);
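If the camera exposes an HTTP snapshot endpoint at all, letting HttpClient speak the protocol avoids hand-parsing response headers. A minimal sketch; note the /image.jpg path and port 80 are assumptions, since Aiwit does not document a local HTTP API:

```csharp
using System;
using System.Drawing;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class SnapshotFetcher
{
    static async Task Main()
    {
        // Hypothetical endpoint -- substitute whatever the camera actually serves
        var url = "http://192.168.1.50:80/image.jpg";
        using (var http = new HttpClient())
        {
            byte[] jpeg = await http.GetByteArrayAsync(url); // headers are handled for us
            using (var ms = new MemoryStream(jpeg))
            using (var image = Image.FromStream(ms))
            {
                image.Save("snapshot.jpg");
            }
        }
    }
}
```

As for finding the port number, a scanner such as nmap (e.g. `nmap -p- <camera-ip>`) will list the camera's open TCP ports, which is faster than guessing.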
I want to make an app that transfers audio from a laptop or PC microphone live, in real time, like a YouTube stream but without video. I will describe my process:
1. I transfer a normal file by converting it to bytes and then back again.
2. I change the file type to MP3 or WAV and use NAudio. This works fine; I can also play the file after it is transferred, or while it is being received.
3. I change the input to the microphone and receive the audio.
Here is the problem: NAudio is unable to take the live stream from the mic and send it automatically. The buffer always gives me a NullReferenceException while debugging, and then another error saying the NAudio decoder did not receive any data (which is not acceptable, by the way). It should listen and keep receiving data until the port or connection closes.
I've searched for VoIP libraries but found nothing except Ozeki, with no tutorial on how to use it; all I found were old videos that no longer work. I've been searching for over a week with no result. I don't want a fully developed project, because I already found one, but it is too complex: about 2K lines of code. All I need is to know what to do, or to be given the code that solves the problem.
This is the client-side code:
public void client()
{
    try
    {
        //byte[] send_data = Audio_to_byte(); // original working code for file transfer
        byte[] send_data = new byte[BufferSize]; // 100 MB buffer size
        TcpClient client = new TcpClient(serverip, port);
        NetworkStream stream = client.GetStream();
        // sourceStream and wavein are global variables
        sourceStream = new NAudio.Wave.WaveIn();
        sourceStream.DeviceNumber = 1;
        sourceStream.WaveFormat = new NAudio.Wave.WaveFormat(44100, NAudio.Wave.WaveIn.GetCapabilities(1).Channels);
        wavein = new NAudio.Wave.WaveInProvider(sourceStream);
        //wavein.Read(send_data, 0, send_data.Length); // this buffer doesn't work; sometimes it gives me a full buffer
        BufferedWaveProvider pro = new BufferedWaveProvider(wavein.WaveFormat);
        pro.AddSamples(send_data, 0, send_data.Length); // empty buffer, or "buffer full" error
        stream.Write(send_data, 0, send_data.Length);
        stream.Close();
        client.Close();
    }
    catch (Exception e)
    {
        MessageBox.Show("Server is offline" + e, "Error");
        // The message here is either "buffer is full", or the data is sent empty and the
        // decoder throws because it received nothing -- whichever happens first
    }
}
And this is the server-side code, with the MP3 reader code:
IPAddress ip = Dns.GetHostEntry(serverip).AddressList[0];
TcpListener server_obj = new TcpListener(ip, port);
TcpClient client = default(TcpClient);
try
{
    server_obj.Start();
    while (true)
    {
        // accept any client
        client = server_obj.AcceptTcpClient();
        // byte storage for data from the network
        byte[] received_buffer = new byte[BufferSize];
        // get the data stream from the client
        NetworkStream stream = client.GetStream();
        // save data from the network to memory until finished, then save while playing
        MemoryStream ms = new MemoryStream();
        int numBytesRead = 0;
        while ((numBytesRead = stream.Read(received_buffer, 0, received_buffer.Length)) > 0)
        {
            // receive all data from the client
            ms.Write(received_buffer, 0, numBytesRead);
            // receive sound, then play it directly
            WaveOut(ms.ToArray());
        }
        Byte_to_audio(ms.ToArray()); // you can make or allow an override
    }
}
catch (Exception e)
{
    MessageBox.Show("Error Message : " + e, "Error");
}
}
This is the method that plays the MP3 data received from the network:
private void WaveOut(byte[] mp3Bytes)
{
    // MP3 format
    mp3Stream = new MemoryStream(mp3Bytes);
    mp3FileReader = new Mp3FileReader(mp3Stream);
    wave32 = new WaveChannel32(mp3FileReader, 0.3f, 3f);
    ds = new DirectSoundOut(); // declared globally
    ds.Init(wave32);
    ds.Play();
}
I recommend using UDP if it has to be in real time.
I use NAudio with VB.NET, so that is what I based this post on.
Client Example:
waveIn = new WaveIn();
waveIn.BufferMilliseconds = 50; // buffer length in milliseconds
waveIn.DeviceNumber = inputDeviceNumber;
waveIn.WaveFormat = HEREWaveFormatHERE;
waveIn.DataAvailable += waveIn_DataAvailable; // event that delivers each recorded buffer
waveIn.StartRecording();
void waveIn_DataAvailable(object sender, WaveInEventArgs e)
{
    // e.Buffer holds the recorded samples; e.BytesRecorded says how many bytes are valid.
    // Do not close the stream/client here -- this event fires for every 50 ms buffer.
    stream.Write(e.Buffer, 0, e.BytesRecorded);
}
RECEIVE:
1 - Create the WaveOut and BufferedWaveProvider globally
WOut = new WaveOut(WaveCallbackInfo.FunctionCallback());
BufferedWaveProvider pro = new BufferedWaveProvider(HEREWaveFormatHERE);
pro.BufferLength = 20 * 1024; // max size of the BufferedWaveProvider
pro.DiscardOnBufferOverflow = true;
WOut.Init(pro);
WOut.Play();
As long as there is no audio, BufferedWaveProvider will supply silence to WaveOut (or another output); it also queues everything that arrives, for continuous playback.
2 - Play and Enqueue
while ((numBytesRead = stream.Read(received_buffer, 0, received_buffer.Length)) > 0)
{
pro.AddSamples(received_buffer, 0, numBytesRead);
}
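Since this answer recommends UDP for real-time transport, here is a hedged sketch of the sending side over UdpClient, wiring the pieces above together. The receiver address, port, and wave format are assumptions, and WaveInEvent is used instead of WaveIn so no window handle is required:

```csharp
using System;
using System.Net;
using System.Net.Sockets;
using NAudio.Wave;

class UdpMicSender
{
    static void Main()
    {
        var udp = new UdpClient();
        var target = new IPEndPoint(IPAddress.Parse("192.168.1.10"), 5005); // assumed receiver

        var waveIn = new WaveInEvent();
        waveIn.BufferMilliseconds = 50;               // one datagram per 50 ms of audio
        waveIn.WaveFormat = new WaveFormat(44100, 1); // must match the receiver's BufferedWaveProvider

        // Each callback sends exactly the bytes recorded; nothing is closed per-buffer
        waveIn.DataAvailable += (s, e) => udp.Send(e.Buffer, e.BytesRecorded, target);
        waveIn.StartRecording();

        Console.ReadLine(); // record until Enter is pressed
        waveIn.StopRecording();
    }
}
```

On the receiving side, each datagram would go straight into pro.AddSamples, as in step 2 above.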
My knowledge of NAudio is limited to that.
(English via Google Translate.)
I am receiving H.264-encoded frames, but when I convert one into a bitmap I just get a black screen. The resolution is right. I have tried a lot of things and couldn't find a working way. Thank you!
Here is my code:
public System.Drawing.Bitmap CopyDataToBitmap(byte[] data)
{
    // Create the Bitmap with the known height, width and format
    System.Drawing.Bitmap bmp = new System.Drawing.Bitmap(2592, 1936, System.Drawing.Imaging.PixelFormat.Format24bppRgb);
    // Create a BitmapData and lock all pixels to be written
    System.Drawing.Imaging.BitmapData bmpData = bmp.LockBits(
        new System.Drawing.Rectangle(0, 0, bmp.Width, bmp.Height),
        System.Drawing.Imaging.ImageLockMode.WriteOnly, bmp.PixelFormat);
    // Copy the data from the byte array into BitmapData.Scan0
    Marshal.Copy(data, 0, bmpData.Scan0, data.Length);
    // Unlock the pixels
    bmp.UnlockBits(bmpData);
    // Return the bitmap
    return bmp;
}
public async void ListenVideo()
{
    byte[] data = new byte[1024];
    IPEndPoint ipep = new IPEndPoint(IPAddress.Any, 11111);
    UdpClient newsock = new UdpClient(ipep);
    IPEndPoint sender = new IPEndPoint(IPAddress.Any, 11111);
    data = newsock.Receive(ref sender);
    string message = Encoding.UTF8.GetString(data, 0, data.Length);
    while (true)
    {
        data = newsock.Receive(ref sender);
        message = Encoding.UTF8.GetString(data, 0, data.Length);
        MemoryStream stream = new MemoryStream(data);
        panel1.BackgroundImage = CopyDataToBitmap(data);
        await Task.Delay(2000);
    }
}
H.264 is an encoding for an elementary video stream; it is not an encoding for a standalone image.
This means that decoding involves setting up a video decoder. There is no single function (at least it is not supposed to work this way) that takes the bitstream and outputs a video frame.
You would typically set up a stream decoding context, then feed it input and pull output frames as they become ready.
Windows comes with a stock H.264 decoder, and the typical API for consuming such decoding services is Media Foundation. It is not a .NET API, hence: Media Foundation with C#.
There is also Media Foundation .NET project with C# wrappers over native Media Foundation API.
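The set-up/feed/pull pattern described above can be sketched as follows. The IH264Decoder interface here is hypothetical, standing in for a Media Foundation transform or an FFmpeg wrapper, so treat this as the shape of the loop, not a real API:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical decoder interface -- NOT a real API; a concrete implementation
// would wrap a Media Foundation transform or an FFmpeg decoder context
interface IH264Decoder
{
    void Feed(byte[] nalUnits);            // push raw bitstream (Annex B NAL units)
    bool TryGetFrame(out byte[] rgbFrame); // pull a decoded frame when one is ready
}

static class DecodeLoop
{
    public static void Run(IH264Decoder decoder, IEnumerable<byte[]> packets, Action<byte[]> onFrame)
    {
        foreach (var packet in packets)
        {
            decoder.Feed(packet);
            // One input packet may yield zero, one, or several frames --
            // the first frames only appear after SPS/PPS and a keyframe arrive
            byte[] frame;
            while (decoder.TryGetFrame(out frame))
                onFrame(frame);
        }
    }
}
```

This also explains the black screen in the question: the received bytes are compressed NAL units, not 24-bpp pixel data, so copying them straight into a bitmap can only produce garbage.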
I am trying to send a raw image from server to client using sockets in NETMF, but with no success.
Here is my server-side code. I am sending the image byte[] array in chunks, because of buffer-size limitations in NETMF:
if (serverSocket.IsConnected)
{
    Byte[] bytesToSend = capturedata;
    Byte[] outboundBuffer = new byte[512];
    int incomingOffset = 0;
    while (incomingOffset < bytesToSend.Length)
    {
        int length = System.Math.Min(outboundBuffer.Length, bytesToSend.Length - incomingOffset);
        Array.Copy(bytesToSend, incomingOffset, outboundBuffer, 0, length);
        incomingOffset += length;
        // Transmit the outbound buffer (note: always sends all 512 bytes, even for the final partial chunk)
        serverSocket.SendBinary(outboundBuffer);
    }
}
Here is my client-side code. I am trying to build the image from a MemoryStream and show it in a PictureBox:
public static void ReadCallback(IAsyncResult ar)
{
    // Retrieve the state object and the handler socket
    // from the asynchronous state object.
    StateObject state = (StateObject)ar.AsyncState;
    handler = state.workSocket;
    // Read data from the client socket.
    int bytesRead = handler.EndReceive(ar);
    if (bytesRead > 0)
    {
        memorystream.Write(state.buffer, 0, bytesRead);
        if (bytesRead <= 0)
        {
            PICTUREBOX_CAPTURED_IMAGE.Image = new System.Drawing.Bitmap(memorystream); // Here is an ArgumentException
        }
        else
        {
            // Not all data received. Get more.
            handler.BeginReceive(state.buffer, 0, StateObject.BufferSize, 0,
                new AsyncCallback(ReadCallback), state);
        }
    }
}
I don't know what the matter is, but it seems the server-side chunk buffers are not completely written to the MemoryStream on the client side: Byte[] bytesToSend = capturedata; is always about 12540 bytes, but when I inspect the client side, the memory stream's capacity is only about 3500-4600 bytes. I think that is why I get an ArgumentException when I try to build a Bitmap from the stream in this line of code:
if (bytesRead <= 0)
{
    PICTUREBOX_CAPTURED_IMAGE.Image = new System.Drawing.Bitmap(memorystream); // Here is an ArgumentException
}
Please help me find out what I am doing wrong.
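A common cause of exactly this symptom is missing message framing: TCP is a byte stream, so the receiver cannot tell where one image ends without extra information. A minimal sketch of length-prefixing the payload (the method names here are my own, not from the question):

```csharp
using System;
using System.IO;

static class Framing
{
    // Sender: write a 4-byte big-endian length, then the payload
    public static void WriteFrame(Stream s, byte[] payload)
    {
        byte[] len = BitConverter.GetBytes(payload.Length);
        if (BitConverter.IsLittleEndian) Array.Reverse(len);
        s.Write(len, 0, 4);
        s.Write(payload, 0, payload.Length);
    }

    // Receiver: read exactly 4 length bytes, then exactly that many payload bytes
    public static byte[] ReadFrame(Stream s)
    {
        byte[] len = ReadExactly(s, 4);
        if (BitConverter.IsLittleEndian) Array.Reverse(len);
        return ReadExactly(s, BitConverter.ToInt32(len, 0));
    }

    static byte[] ReadExactly(Stream s, int count)
    {
        byte[] buf = new byte[count];
        int offset = 0;
        while (offset < count)
        {
            int n = s.Read(buf, offset, count - offset);
            if (n <= 0) throw new IOException("stream closed mid-frame");
            offset += n;
        }
        return buf;
    }
}
```

Note also that the server as posted always transmits the full 512-byte outboundBuffer, so the final chunk carries stale trailing bytes; sending only `length` bytes per chunk avoids that.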
I have a Samsung IP camera and I want to stream it into my C# program, but when I run the program I get an 'invalid parameter' error.
private void button1_Click(object sender, EventArgs e)
{
    while (true)
    {
        string sourceURL = url;
        byte[] buffer = new byte[100000];
        int read, total = 0;
        // create HTTP request
        HttpWebRequest req = (HttpWebRequest)WebRequest.Create(sourceURL);
        req.Credentials = new NetworkCredential("admin", "4321");
        // get response
        WebResponse resp = req.GetResponse();
        // get response stream
        Stream stream = resp.GetResponseStream();
        // read data from stream
        while ((read = stream.Read(buffer, total, 1000)) != 0)
        {
            total += read;
        }
        Bitmap bmp = (Bitmap)Bitmap.FromStream(new MemoryStream(buffer, 0, total));
        pictureBox1.Image = bmp;
    }
}
What might be the problem?
You are not building up the buffer correctly; you are overwriting the old buffer contents with new data each time more arrives. An idea to fix it:
List<byte> fullData = new List<Byte>();
while ((read = stream.Read(buffer, 0, buffer.Length)) > 0) // loop while > 0, not == 0; Read returns 0 at end of stream
{
    fullData.AddRange(new List<Byte>(buffer).GetRange(0, read)); // only add the bytes actually read
}
byte[] dataRead = fullData.ToArray();
Bitmap bmp = (Bitmap)Bitmap.FromStream(new MemoryStream(dataRead, 0, dataRead.Length));
My guess (since you don't indicate the error) is that the image has gone over 100,000 bytes, which your code doesn't handle at all. I would, instead:
byte[] buffer = new byte[10 * 1024];
...
using (var ms = new MemoryStream())
{
    // read everything in the stream into ms
    while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
    {
        ms.Write(buffer, 0, read);
    }
    // rewind and load the bitmap
    ms.Position = 0;
    Bitmap bmp = (Bitmap)Bitmap.FromStream(ms);
    pictureBox1.Image = bmp;
}
Kinda late answer relative to the time of the question; however, since I'm hunting down similar issues and this has apparently not been answered, I'm adding my two cents...
There are two issues here as I see it:
First:
You don't want to put the event handler of a button click into an endless loop. This work should probably run on a thread of some kind so that the event handler can return.
Second:
As mentioned in another comment, your code expects the response to be a raw image of some type, and most likely it is not. Your camera may eventually send MJPEG, but that doesn't mean it arrives raw. Sometimes you have to send other commands to the camera first, and once you actually start getting the MJPEG stream you have to parse it and extract the headers before handing the image portion to a PictureBox. You are probably getting some kind of HTML response from the camera (as I am), and when you pass that data to a method expecting an image format (likely JPEG), you get the invalid-parameter error.
Can't say I know how to solve the problem cause it depends on the camera. If there is some kind of standard interface for these cameras I'd sure like to know what it is! Anyway, HTH...
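To illustrate the "parse the stream and extract headers" step: MJPEG over HTTP is usually a multipart/x-mixed-replace stream where each part has its own headers followed by a JPEG body. A hedged shortcut that often works for experiments is to scan for the JPEG start/end markers (FF D8 / FF D9) instead of parsing the multipart headers:

```csharp
using System;
using System.Collections.Generic;

static class MjpegScanner
{
    // Extract complete JPEG frames from a raw buffer by their SOI/EOI markers.
    // Simplified: robust code would honour the multipart boundary and
    // Content-Length headers rather than marker-scanning.
    public static IEnumerable<byte[]> ExtractJpegFrames(byte[] data)
    {
        int start = -1;
        for (int i = 0; i + 1 < data.Length; i++)
        {
            if (data[i] == 0xFF && data[i + 1] == 0xD8) // SOI: start of image
            {
                start = i;
            }
            else if (start >= 0 && data[i] == 0xFF && data[i + 1] == 0xD9) // EOI: end of image
            {
                int len = i + 2 - start;
                var frame = new byte[len];
                Array.Copy(data, start, frame, 0, len);
                yield return frame;
                start = -1;
            }
        }
    }
}
```

Each extracted frame can then be fed to Bitmap.FromStream individually.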
I want to preload a lot of images when my app starts up; it's a sort of once-off thing.
I have an Image class that contains the URL of its image, stored in the cloud as blob storage (this address is an https address, BTW).
I want to download the image bytes from the cloud, store them on the object, and then, when it comes time to show the image, load the image from its bytes.
I have all the code for this, but I keep getting the exception:
No imaging component suitable to complete this operation was found.
Here is my code (EDIT: updated with a fix):
//Loaded on start-up
private static void LoadImageBytes(Image img)
{
    var urlUri = new Uri(img.Url);
    var request = (HttpWebRequest)WebRequest.CreateDefault(urlUri);
    MemoryStream memStream = new MemoryStream();
    using (var response = request.GetResponse())
    {
        var buffer = new byte[4096];
        using (var stream = response.GetResponseStream())
        {
            int bytesRead = stream.Read(buffer, 0, buffer.Length);
            while (bytesRead > 0)
            {
                memStream.Write(buffer, 0, bytesRead);
                bytesRead = stream.Read(buffer, 0, buffer.Length);
            }
            img.ImageBytes = memStream.ToArray();
        }
    }
}
Then when I want to get the image on the screen I call this:
public BitmapImage ImageFromBuffer(Byte[] bytes)
{
    MemoryStream stream = new MemoryStream(bytes);
    stream.Seek(0, SeekOrigin.Begin);
    BitmapImage image = new BitmapImage();
    image.BeginInit();
    image.StreamSource = stream;
    image.EndInit();
    return image;
}
But in the EndInit() call I get the exception.
I have done some testing: if I load the file from my local filesystem, I get a different set of bytes than I do from the image in the cloud. I assume it's something to do with blob storage or https?
And yes, I can browse to that image, and it's not corrupted.
EDIT: fixed now, all good.
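One more thing worth checking when BitmapImage throws in EndInit(): by default the stream is decoded lazily, so it must remain open and readable after the method returns. Setting BitmapCacheOption.OnLoad forces the decode inside EndInit(), so the stream can be released immediately. This is a sketch of a defensive version of ImageFromBuffer, not necessarily the fix the poster found:

```csharp
public BitmapImage ImageFromBuffer(byte[] bytes)
{
    using (var stream = new MemoryStream(bytes))
    {
        var image = new BitmapImage();
        image.BeginInit();
        image.CacheOption = BitmapCacheOption.OnLoad; // decode now, not lazily
        image.StreamSource = stream;
        image.EndInit();
        image.Freeze(); // optional: allows use from other threads
        return image;
    }
}
```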
Are you sure this line is correct?
while (stream.Read(buffer, 0, buffer.Length) > 0)
    img.ImageBytes = buffer;
img.ImageBytes will only hold the last buffer read, not the whole image.