Unity System.Drawing Internal Buffer Overflow Exception - c#

Goal
Get a GIF working in Unity from a URL. I am currently using the WWW class to download a byte[] and convert it to a System.Drawing.Image. This works in the editor but not in any build.
Error:
"TypeLoadException: Could not load type System.IO.InternalBufferOverflowException from the assembly System.Drawing.Image" at line 111
Why?
It has to do with the built-in System.Drawing.Image.FromStream method; for some reason Unity doesn't like it. The other options are .FromFile and .FromHbitmap. I don't know how to use FromHbitmap, and, going back to my original plan, .FromFile is unusable for me.
Entire Code
using System.Collections.Generic;
using System.Drawing;
using System.Drawing.Imaging;
using UnityEngine;
using System.IO;
using UnityEngine.UI;
using System.Collections;
public class AnimatedGifDrawerBack : MonoBehaviour
{
public string loadingGifPath;
public float speed = 1;
public Vector2 drawPosition;
public string pName;
public float width;
public float height;
public float percentage;
public GameObject positionPlaceHolderGO;
public Vector2 positionPlaceHolder;
public Text debugText;
private SpriteImageArray sia;
private string url;
private WWW www;
public bool finishedWWW = false;
public bool hasWWW = false;
public bool canOnGUI = false;
List<Texture2D> gifFrames = new List<Texture2D>();
void Start()
{
percentage = 1.3f;
positionPlaceHolderGO = GameObject.FindGameObjectWithTag("PBLPlace");
positionPlaceHolder = positionPlaceHolderGO.transform.position;
}
void Update()
{
while (hasWWW == false)
{
Debug.Log("in while loop");
if (this.GetComponent<PokemonCreatorBack>().name == "")
{
}
else
{
debugText.text = "Name Found";
url = "www.pkparaiso.com/imagenes/xy/sprites/animados-espalda/" + this.GetComponent<PokemonCreatorBack>().PokemonName.ToLower() + ".gif";
StartCoroutine(WaitForRequest(positionPlaceHolderGO, url));
hasWWW = true;
debugText.text = "hawWWW = true";
}
}
}
void OnGUI()
{
height = (float)Screen.height - 80f / percentage;
//GUI.DrawTexture (new Rect (Screen.width-width, Screen.height - height, gifFrames [0].width * percentage, gifFrames [0].height * percentage), gifFrames [(int)(Time.frameCount * speed) % gifFrames.Count]);
if (canOnGUI)
GUI.DrawTexture(new Rect(positionPlaceHolder.x, positionPlaceHolder.y, gifFrames[0].width * percentage, gifFrames[0].height * percentage), gifFrames[(int)(Time.frameCount * speed) % gifFrames.Count]);
}
IEnumerator WaitForRequest(GameObject go, string url)
{
www = new WWW(url);
yield return www;
if (www.error == null)
{
Debug.Log("WWW Ok!: " + www.texture.name);
}
else
{
Debug.Log("WWW Error: " + www.error);
}
debugText.text = "finishedWWW = true";
finishedWWW = true;
}
public System.Drawing.Image ByteArrayToImage(byte[] byteArrayIn)
{
if (finishedWWW == false)
{
Debug.Log("Called too early");
}
if (byteArrayIn == null)
{
Debug.Log("Null byte array");
return null;
}
Debug.Log("Byte array length: " + byteArrayIn.GetLongLength(0));
MemoryStream ms = new MemoryStream(byteArrayIn);
System.Drawing.Image returnImage = System.Drawing.Image.FromStream(ms); //MAIN SOURCE OF ERROR HERE
finishedWWW = true;
debugText.text = "System.Image Created";
return returnImage;
}
public void loadImage()
{
Debug.Log("Called Load Image BACK");
debugText.text = "Called Load Image BACK";
System.Drawing.Image gifImage = ByteArrayToImage(www.bytes);
FrameDimension dimension = new FrameDimension(gifImage.FrameDimensionsList[0]);
int frameCount = gifImage.GetFrameCount(dimension);
for (int i = 0; i < frameCount; i++)
{
gifImage.SelectActiveFrame(dimension, i);
Bitmap frame = new Bitmap(gifImage.Width, gifImage.Height);
System.Drawing.Graphics.FromImage(frame).DrawImage(gifImage, Point.Empty);
Texture2D frameTexture = new Texture2D(frame.Width, frame.Height);
for (int x = 0; x < frame.Width; x++)
for (int y = 0; y < frame.Height; y++)
{
System.Drawing.Color sourceColor = frame.GetPixel(x, y);
frameTexture.SetPixel(frame.Width - 1 + x, -y, new Color32(sourceColor.R, sourceColor.G, sourceColor.B, sourceColor.A)); // for some reason, x is flipped
}
frameTexture.Apply();
gifFrames.Add(frameTexture);
}
Debug.Log("Starting ON GUI!");
debugText.text = "Starting OnGUI";
canOnGUI = true;
}
}
Thoughts
byteArrayIn.GetLongLength(0)
returns 80,000 at most.
The last debug statement coming through is "Called Image Loading BACK".
I will write my own file streamer if necessary; if it is necessary, can someone point me in the right direction for that?
I think the main workaround is dealing with Image.FromStream().
There are two of these in the scene.
All thoughts or solutions are welcome. I really just wish I knew how to tackle this error so that I could share it with the Unity community.

We faced the same problem this morning.
The app is not finding a type in the System.IO namespace that System.Drawing.Image requires.
The missing type has apparently been stripped from the System.dll that is packed during the build process.
To fix this, you need to copy and replace the Unity-generated System.dll with the original Mono System.dll.
In your build, replace projectName_Data\Managed\System.dll with the System.dll found in Unity's Mono installation folder:
Editor\Data\Mono\lib\mono\2.0 (relative to the root of the Unity installation folder).
Hope it helps!
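As an alternative to swapping the DLL by hand, Unity's bytecode stripping can usually be told to keep whole assemblies via a link.xml file placed in the Assets folder. This is only a sketch, assuming stripping (rather than a genuinely missing assembly) is the cause here:

```xml
<!-- Assets/link.xml: asks Unity's stripper to preserve these assemblies whole -->
<linker>
  <assembly fullname="System" preserve="all"/>
  <assembly fullname="System.Drawing" preserve="all"/>
</linker>
```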

Related

How to indicate when the game is saving to file and then show something on screen?

I have a saving system, and today I'm using a canvas with a text element, displaying the words "Saving Game" and making it flicker for 3 seconds.
What I want instead is for the "Saving Game" text to be shown not for a static 3 seconds (or any other time I pick) but for the time it actually takes to save the game.
For example, early on the game saves less data, so saving finishes faster and "Saving Game" should be displayed briefly; later in the game more data is saved, so the save takes longer and "Saving Game" should be displayed longer.
How can I know when the game is saving? Maybe by somehow checking whether the save-game file is busy/locked?
public IEnumerator SaveWithTime()
{
yield return new WaitForSeconds(timeToStartSaving);
if (objectsToSave.Count == 0)
{
Debug.Log("No objects selected for saving.");
}
if (saveManual == false && objectsToSave.Count > 0)
{
Save();
StartCoroutine(fadeInOutSaveGame.OverAllTime(savingFadeInOutTime));
}
}
public IEnumerator SaveWithTimeManual()
{
yield return new WaitForSeconds(timeToStartSaving);
if(objectsToSave.Count == 0)
{
Debug.Log("No objects selected for saving.");
}
if (saveManual && objectsToSave.Count > 0)
{
Save();
StartCoroutine(fadeInOutSaveGame.OverAllTime(savingFadeInOutTime));
}
}
}
At the bottom of the script I have two methods: SaveWithTime, which saves the game automatically at specific points in the game, and SaveWithTimeManual, which saves the game when I press a key.
In these two methods I use the canvas to display the "Saving Game" text.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
public class FadeInOutSaveGameText : MonoBehaviour
{
public Canvas canvas;
public float fadingSpeed;
private bool stopFading = false;
private const float THRESHOLD = 0.01F;
// Start is called before the first frame update
void Start()
{
//StartCoroutine(OverAllTime(5f));
}
IEnumerator CanvasAlphaChangeOverTime(Canvas canvas, float duration)
{
float alphaColor = canvas.GetComponent<CanvasGroup>().alpha;
while (true)
{
alphaColor = (Mathf.Sin(Time.time * duration) + 1.0f) / 2.0f;
canvas.GetComponent<CanvasGroup>().alpha = alphaColor;
// only break, if current alpha value is close to 0 or 1
if (stopFading && Mathf.Abs(alphaColor) <= THRESHOLD)//if (stopFading && (Mathf.Abs(alphaColor) <= THRESHOLD || Mathf.Abs(alphaColor - 1) <= THRESHOLD))
{
break;
}
yield return null;
}
}
public IEnumerator OverAllTime(float time)
{
stopFading = false;
StartCoroutine(CanvasAlphaChangeOverTime(canvas, fadingSpeed));
yield return new WaitForSeconds(time);
stopFading = true;
}
}
And this class does the actual writing to the save-game file:
using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;
using UnityEngine;
public static class SaveSystem
{
private static readonly string SAVE_FOLDER = Application.dataPath + "/save_";
public static void Init()
{
if (!Directory.Exists(SAVE_FOLDER))
{
Directory.CreateDirectory(SAVE_FOLDER);
}
}
public static void Save(string saveString)
{
string fileName = Path.Combine(SAVE_FOLDER, "savegame.txt");
File.WriteAllText(fileName, saveString);
}
public static string Load()
{
string content = "";
string fileName = Path.Combine(SAVE_FOLDER, "savegame.txt");
if (File.Exists(fileName))
{
content = File.ReadAllText(fileName);
}
else
{
Debug.Log("Save-game file does not exist:" +
" either the file has been deleted or the game has not been saved yet.");
}
return content;
}
}
How can I find the real time it takes to save, and show the "Saving Game" text while saving?
Use Async/Threading, or in Unity's case, Coroutines.
bool isSaving;
public void SaveGame()
{
    StartCoroutine(SaveGameCoroutine());
}
private IEnumerator SaveGameCoroutine()
{
    if (isSaving)
    {
        yield break; // a save is already in progress
    }
    ShowSavingCanvas();
    isSaving = true;
    Save(); // the actual save work; calling SaveGame() here would recurse forever
    isSaving = false;
    HideSavingCanvas();
}
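To make the "Saving Game" text track the actual save duration instead of a fixed 3 seconds, the save itself can be timed. A minimal sketch using System.Diagnostics.Stopwatch, where Save(), ShowSavingCanvas() and HideSavingCanvas() are hypothetical stand-ins for the methods in the question:

```csharp
using System.Collections;
using System.Diagnostics;
using UnityEngine;

public class SaveTimer : MonoBehaviour
{
    private bool isSaving;

    private IEnumerator SaveGameCoroutine()
    {
        if (isSaving) yield break;
        isSaving = true;
        ShowSavingCanvas();               // assumed helper: show the "Saving Game" canvas

        var watch = Stopwatch.StartNew(); // time the real save work
        Save();                           // assumed synchronous save routine
        watch.Stop();
        UnityEngine.Debug.Log("Save took " + watch.ElapsedMilliseconds + " ms");

        isSaving = false;
        HideSavingCanvas();               // assumed helper: hide the canvas again
    }

    // hypothetical stand-ins so the sketch is self-contained
    private void Save() { }
    private void ShowSavingCanvas() { }
    private void HideSavingCanvas() { }
}
```

Note that because Save() runs synchronously on the main thread, the canvas will only be visible for a noticeable time if the save work is spread across frames or moved to a background thread; a File.WriteAllText of a small save file finishes well within a single frame.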

Loading .FBX file without Content Pipeline in Monogame

I was wondering if there is a way to load a simple .fbx file containing just a cube into my game without using the MonoGame content pipeline. Here's the code I currently have; I essentially want to get rid of the requirement for the file to be a .xnb.
public static Model LoadModel(string model)
{
Model outModel = null;
// If model is already in memory, reference that instead of having to load it again
if (Loaded3DModels.ContainsKey(model))
{
Loaded3DModels.TryGetValue(model, out outModel);
}
else
{
if (File.Exists(Game.gameWindow.Content.RootDirectory + "/" + model + ".fbx"))
{
outModel = Game.gameWindow.Content.Load<Model>(model + ".fbx");
Loaded3DModels.Add(model, outModel);
}
else
{
Debug.LogError("The Model \"" + model + ".fbx\" does not exist!", true, 2);
}
}
return outModel;
}
So I managed to use Assimp to load a mesh, and then "convert" it to something MonoGame could render:
public Renderer3DComponent(Mesh mesh)
{
// Set up material
Material = new BasicEffect(Game.graphics.GraphicsDevice);
Material.Alpha = 1;
Material.VertexColorEnabled = true;
Material.LightingEnabled = false;
// Tris
for (int i = mesh.VertexCount - 1; i >= 0; i--)
{
var Vert = mesh.Vertices[i];
Verts.Add(new VertexPositionColor(new Vector3(Vert.X, Vert.Y, Vert.Z), Color.White));
}
// Buffer
Buffer = new VertexBuffer(Game.graphics.GraphicsDevice, typeof(VertexPositionColor), Verts.Count, BufferUsage.WriteOnly);
Buffer.SetData(Verts.ToArray());
}
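Once the vertices are in the buffer, drawing is the usual BasicEffect pass loop. A sketch against the fields above (it assumes Verts holds unrolled triangles; Assimp meshes are normally indexed, so an index buffer built from mesh.GetIndices() may be needed as well):

```csharp
// Draw call for the component, assuming the Buffer, Verts and Material
// fields built in the constructor above.
public void Draw(GraphicsDevice device)
{
    device.SetVertexBuffer(Buffer);
    foreach (EffectPass pass in Material.CurrentTechnique.Passes)
    {
        pass.Apply();
        // Three vertices per primitive when the list is unrolled triangles.
        device.DrawPrimitives(PrimitiveType.TriangleList, 0, Verts.Count / 3);
    }
}
```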

Unity 2019 - How can I save mixer audio output to an AudioClip or load a newly saved .wav to an audioclip at runtime?

I'm making a game with musical instruments and an in-game recording device. I want the user to be able to play the instruments and record them to an AudioClip, and to a wav file, which they can then play back as accompaniment while playing another instrument, with the output recorded again, letting them build tracks of combined instruments. So far I have everything working except that I can't figure out how to get the AudioClip: I have .wav files being saved to the asset folder, but I don't understand the code well enough to figure out where to set the AudioClip's data to the data being written.
I'm using code from Olkor here to save the recording as wav, which is working great: https://unitylist.com/p/za/Output-Audio-Recorder
My two options for getting an AudioClip out of this seem to be: a) save to an AudioClip at the same time as saving to disk (I can't figure this out), or b) save it to disk and then load it back into the game as an AudioClip using UnityWebRequest. I have tried the latter, but I get an error: either I cannot access the .audioClip property of an aborted DownloadHandlerAudioClip, or, if I invoke the function with a delay to load the file, there is an error decoding the audio. Either way I have a saved .wav audio file in my assets folder but no AudioClip.
InvalidOperationException: Cannot access the .audioClip property of an aborted DownloadHandlerAudioClip
UnityEngine.Networking.DownloadHandlerAudioClip.GetContent
(UnityEngine.Networking.UnityWebRequest www) (at
C:/buildslave/unity/build/Modules/UnityWebRequestAudio/Public/DownloadHandler
Audio.bindings.cs:49)
OutputAudioRecorder+<GetAudioClip>d__27.MoveNext () (at
Assets/Scripts/OutputAudioRecorder.cs:202)
UnityEngine.SetupCoroutine.InvokeMoveNext (System.Collections.IEnumerator
enumerator, System.IntPtr returnValueAddress) (at C:/buildslave/unity/build/Runtime/Export/Scripting/Coroutines.cs:17)
The script below is attached to the AudioListener, so it picks up all audio output from the game. I have looked at doing something with AudioListener.GetOutputData and passing it to AudioClip.SetData, but I haven't been able to figure it out, and I admit this is beyond my current ability. I'd really appreciate some insight into how to approach this problem in any of the possible ways.
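For the "record straight into an AudioClip" option, the same samples that OnAudioFilterRead delivers to the wav writer can be buffered and pushed into a clip with AudioClip.Create and SetData. A rough sketch (the class name is hypothetical, and the stereo channel count is an assumption matching the 2-channel wav header in the recorder script):

```csharp
using System.Collections.Generic;
using UnityEngine;

public class ClipRecorder : MonoBehaviour
{
    private readonly List<float> samples = new List<float>();
    public bool recording;

    void OnAudioFilterRead(float[] data, int channels)
    {
        if (recording)
            samples.AddRange(data); // same interleaved samples the wav writer sees
    }

    public AudioClip ToClip()
    {
        // lengthSamples is per channel, hence the divide by 2 for stereo.
        var clip = AudioClip.Create("recording", samples.Count / 2, 2,
                                    AudioSettings.outputSampleRate, false);
        clip.SetData(samples.ToArray(), 0);
        return clip;
    }
}
```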
using System;
using System.IO;
using UnityEngine;
using UnityEngine.Networking;
using System.Collections;
public class OutputAudioRecorder : MonoBehaviour
{
public const string DEFAULT_FILENAME = "record";
public const string FILE_EXTENSION = ".wav";
public bool IsRecording { get { return recOutput; } }
private int bufferSize;
private int numBuffers;
private int outputRate;
private int headerSize = 44; //default for uncompressed wav
private String fileName;
private bool recOutput = false;
private AudioClip newClip;
private FileStream fileStream;
private AudioClip[] audioClips;
private AudioSource[] audioSources;
public int currentSlot;
float[] tempDataSource;
void Awake()
{
outputRate = AudioSettings.outputSampleRate;
}
void Start()
{
AudioSettings.GetDSPBufferSize(out bufferSize, out numBuffers);
audioSources = new AudioSource[3];
audioSources[0] = GameObject.FindWithTag("RecSlot1").GetComponent<AudioSource>();
audioSources[1] = GameObject.FindWithTag("RecSlot2").GetComponent<AudioSource>();
audioSources[2] = GameObject.FindWithTag("RecSlot3").GetComponent<AudioSource>();
}
public void StartRecording(string recordFileName)
{
fileName = Path.GetFileNameWithoutExtension(recordFileName) + FILE_EXTENSION;
if (!recOutput)
{
StartWriting(fileName);
recOutput = true;
}
else
{
Debug.LogError("Recording is in progress already");
}
}
public void StopRecording()
{
recOutput = false;
WriteHeader();
UpdateClip();
}
private void StartWriting(String name)
{
fileStream = new FileStream(Application.dataPath + "/" + name, FileMode.Create);
var emptyByte = new byte();
for (int i = 0; i < headerSize; i++) //preparing the header
{
fileStream.WriteByte(emptyByte);
}
}
private void OnAudioFilterRead(float[] data, int channels)
{
if (recOutput)
{
ConvertAndWrite(data); //audio data is interlaced
}
}
private void ConvertAndWrite(float[] dataSource)
{
var intData = new Int16[dataSource.Length];
//converting in 2 steps : float[] to Int16[], //then Int16[] to Byte[]
var bytesData = new Byte[dataSource.Length * 2];
//bytesData array is twice the size of
//dataSource array because a float converted in Int16 is 2 bytes.
var rescaleFactor = 32767; //to convert float to Int16
for (var i = 0; i < dataSource.Length; i++)
{
intData[i] = (Int16)(dataSource[i] * rescaleFactor);
var byteArr = new Byte[2];
byteArr = BitConverter.GetBytes(intData[i]);
byteArr.CopyTo(bytesData, i * 2);
}
fileStream.Write(bytesData, 0, bytesData.Length);
tempDataSource = new float[dataSource.Length];
tempDataSource = dataSource;
}
private void WriteHeader()
{
fileStream.Seek(0, SeekOrigin.Begin);
var riff = System.Text.Encoding.UTF8.GetBytes("RIFF");
fileStream.Write(riff, 0, 4);
var chunkSize = BitConverter.GetBytes(fileStream.Length - 8);
fileStream.Write(chunkSize, 0, 4);
var wave = System.Text.Encoding.UTF8.GetBytes("WAVE");
fileStream.Write(wave, 0, 4);
var fmt = System.Text.Encoding.UTF8.GetBytes("fmt ");
fileStream.Write(fmt, 0, 4);
var subChunk1 = BitConverter.GetBytes(16);
fileStream.Write(subChunk1, 0, 4);
UInt16 two = 2;
UInt16 one = 1;
var audioFormat = BitConverter.GetBytes(one);
fileStream.Write(audioFormat, 0, 2);
var numChannels = BitConverter.GetBytes(two);
fileStream.Write(numChannels, 0, 2);
var sampleRate = BitConverter.GetBytes(outputRate);
fileStream.Write(sampleRate, 0, 4);
var byteRate = BitConverter.GetBytes(outputRate * 4);
fileStream.Write(byteRate, 0, 4);
UInt16 four = 4;
var blockAlign = BitConverter.GetBytes(four);
fileStream.Write(blockAlign, 0, 2);
UInt16 sixteen = 16;
var bitsPerSample = BitConverter.GetBytes(sixteen);
fileStream.Write(bitsPerSample, 0, 2);
var dataString = System.Text.Encoding.UTF8.GetBytes("data");
fileStream.Write(dataString, 0, 4);
var subChunk2 = BitConverter.GetBytes(fileStream.Length - headerSize);
fileStream.Write(subChunk2, 0, 4);
fileStream.Close();
}
void UpdateClip()
{
StartCoroutine(GetAudioClip());
}
IEnumerator GetAudioClip()
{
using (UnityWebRequest www = UnityWebRequestMultimedia.GetAudioClip("file://" + Application.dataPath + "myRecord1.wav", AudioType.WAV))
{
yield return www.Send();
if (www.isNetworkError)
{
Debug.Log(www.error);
}
else
{
AudioClip newClip = DownloadHandlerAudioClip.GetContent(www);
Debug.Log(newClip.name + "name " + newClip.length);
keyboardScript.audioClip = newClip;
}
}
}
Try to load it as an AudioClip using this:
private IEnumerator LoadAudio(string url)
{
    WWW www = new WWW("file:///" + url);
    yield return www;
    if (www != null && www.isDone)
    {
        AudioClip audioClip = www.GetAudioClip(true, false, AudioType.WAV);
    }
}
That works for me at least.
I thought this was the solution, but it still doesn't load at runtime. (It just refers to the old copy of the clip until I reload the game; I didn't notice this at first.)
While I wasn't able to get the web request to work for me, or WWW either, I did get the effect I wanted:
by creating a blank clip in the assets folder where the .wav saves, dragging it into the properties window as a reference, setting the audio clip to streaming, and then using the code above to write the wav, it automatically overwrites the clip with the data I recorded.
The audio has a bit of distortion (crackling), which I'll look into next.
I'm aware this solution is probably incredibly obvious (I guess I had to give Unity something to refer to but didn't realise it), and it doesn't solve the problem of loading files at runtime if they aren't predefined like mine are, but it fixed my problem at least.

Cardboard Magnet Detection

I'm having some trouble with my Unity Cardboard app; maybe some of you can help me.
I have built a little island with animations, and a second island as a main menu.
When the app starts, you see the island from above and the logo of the app.
When the user pulls down the magnet button on the side, the app should start another level.
I used these scripts:
http://www.andrewnoske.com/wiki/Unity_-_Detecting_Google_Cardboard_Click
Detecting Google Cardboard Magnetic Button Click - Singleton Implementation
CardboardMagnetSensor.cs and CardboardTriggerControlMono.cs
I created a script in my asset folder (CardboardMagnetSensor.cs) as in the description from the link. Then I created a second script (CardboardTriggerControlMono.cs) as in the description and dragged it onto the CardboardMain in my project.
The CardboardTriggerControlMono.cs looks like:
using UnityEngine;
using System.Collections;
public class CardboardTriggerControlMono : MonoBehaviour {
public bool magnetDetectionEnabled = true;
void Start() {
CardboardMagnetSensor.SetEnabled(magnetDetectionEnabled);
// Disable screen dimming:
Screen.sleepTimeout = SleepTimeout.NeverSleep;
}
void Update () {
if (!magnetDetectionEnabled) return;
if (CardboardMagnetSensor.CheckIfWasClicked()) {
Debug.Log("Cardboard trigger was just clicked");
Application.LoadLevel(1);
CardboardMagnetSensor.ResetClick();
}
}
}
The CardboardMagnetSensor.cs:
using UnityEngine;
using System.Collections.Generic;
public class CardboardMagnetSensor {
// Constants:
private const int WINDOW_SIZE = 40;
private const int NUM_SEGMENTS = 2;
private const int SEGMENT_SIZE = WINDOW_SIZE / NUM_SEGMENTS;
private const int T1 = 30, T2 = 130;
// Variables:
private static bool wasClicked; // Flips to true once set off.
private static bool sensorEnabled; // Is sensor active.
private static List<Vector3> sensorData; // Keeps magnetic sensor data.
private static float[] offsets; // Offsets used to detect click.
// Call this once at beginning to enable detection.
public static void SetEnabled(bool enabled) {
Reset();
sensorEnabled = enabled;
Input.compass.enabled = sensorEnabled;
}
// Reset variables.
public static void Reset() {
sensorData = new List<Vector3>(WINDOW_SIZE);
offsets = new float[SEGMENT_SIZE];
wasClicked = false;
sensorEnabled = false;
}
// Poll this once every frame to detect when the magnet button was clicked
// and if it was clicked make sure to call "ResetClick()"
// after you've dealt with the action, or it will continue to return true.
public static bool CheckIfWasClicked() {
UpdateData();
return wasClicked;
}
// Call this after you've dealt with a click operation.
public static void ResetClick() {
wasClicked = false;
}
// Updates 'sensorData' and determines if magnet was clicked.
private static void UpdateData() {
Vector3 currentVector = Input.compass.rawVector;
if (currentVector.x == 0 && currentVector.y == 0 && currentVector.z == 0) {
return;
}
if(sensorData.Count >= WINDOW_SIZE) sensorData.RemoveAt(0);
sensorData.Add(currentVector);
// Evaluate model:
if(sensorData.Count < WINDOW_SIZE) return;
float[] means = new float[2];
float[] maximums = new float[2];
float[] minimums = new float[2];
Vector3 baseline = sensorData[sensorData.Count - 1];
for(int i = 0; i < NUM_SEGMENTS; i++) {
int segmentStart = SEGMENT_SIZE * i;
offsets = ComputeOffsets(segmentStart, baseline);
means[i] = ComputeMean(offsets);
maximums[i] = ComputeMaximum(offsets);
minimums[i] = ComputeMinimum(offsets);
}
float min1 = minimums[0];
float max2 = maximums[1];
// Determine if button was clicked.
if(min1 < T1 && max2 > T2) {
sensorData.Clear();
wasClicked = true; // Set button clicked to true.
// NOTE: 'wasClicked' will now remain true until "ResetClick()" is called.
}
}
private static float[] ComputeOffsets(int start, Vector3 baseline) {
for(int i = 0; i < SEGMENT_SIZE; i++) {
Vector3 point = sensorData[start + i];
Vector3 o = new Vector3(point.x - baseline.x, point.y - baseline.y, point.z - baseline.z);
offsets[i] = o.magnitude;
}
return offsets;
}
private static float ComputeMean(float[] offsets) {
float sum = 0;
foreach(float o in offsets) {
sum += o;
}
return sum / offsets.Length;
}
private static float ComputeMaximum(float[] offsets) {
float max = float.MinValue;
foreach(float o in offsets) {
max = Mathf.Max(o, max);
}
return max;
}
private static float ComputeMinimum(float[] offsets) {
float min = float.MaxValue;
foreach(float o in offsets) {
min = Mathf.Min(o, min);
}
return min;
}
}
And my steps:
http://www.directupload.net/file/d/3887/mtjygjan_jpg.htm
(sorry, I'm not able to upload pictures here)
However, it won't work: when I start the app and pull down the magnet, nothing happens. Did I perhaps do something wrong with switching the level via the level index?
I use a Nexus 4 and 5 for testing the app.
Thanks a lot, and greetings to you!
Phillip
If you are using the Google Cardboard SDK for Unity, it currently has a bug that prevents Unity from seeing the magnet (and gyro, and accelerometer). That is probably why your script is not working. Until the bug is fixed, there is no good workaround, but you can instead use the property Cardboard.CardboardTriggered to detect if the magnet was pulled.
Update for Unity 5: The sensor bug is gone. Cardboard SDK does not block the sensors.
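Using the property mentioned above, the check can go straight into Update(). A sketch against that era's Cardboard SDK; treat the exact property path as an assumption, since it varied between SDK versions:

```csharp
void Update()
{
    // Cardboard.SDK is the SDK singleton in older Cardboard SDK releases.
    if (Cardboard.SDK.CardboardTriggered)
    {
        Application.LoadLevel(1); // same level switch as in the question
    }
}
```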

Converting a vb.Net Project to C# .Net Project

I've tried converting VB.NET to C#, but I keep getting errors while compiling. I am new to .NET.
This is my version of the converted image utilities class, Util.cs:
using Microsoft.VisualBasic;
using System;
using System.Collections;
using System.Collections.Generic;
using System.Data;
using System.Diagnostics;
using System.IO;
using System.Drawing;
using System.Drawing.Imaging;
using System.Runtime.InteropServices;
using Com.Griaule.IcaoFace;
using System.Windows.Forms;
namespace IcaoWF
{
public class Util
{
//Set if the mouth is important
public const bool USES_MOUTH = true;
//The number of supported pictures
const int SFile = 3;
//
public const int FEATURES_COLOR = (255 << 8);
public const int FEATURES_SIZE = 8;
//File Vector with spec ID
TFile[] VTFile = new TFile[SFile + 1];
//Pointers to the .NET classes
public FaceImage GrFaceImage = null;
public IcaoImage GrIcaoImage = null;
public CbeffImage GrCbeffImage = null;
public Cbeff GrCbeff = null;
ListBox log;
// raw image data type.
public struct TRawImage
{
// Image data.
public object img;
// Image width.
public int width;
// Image height.
public int height;
//Reduction Factor because stretch
public float frX;
public float frY;
//Eyes an mouth positions
public int lx;
public int ly;
public int rx;
public int ry;
public int mx;
public int my;
}
// File Enum Type
public enum EFile
{
BMP = 1,
JPEG2000 = 2,
CBEFF = 3,
NOTDEF = 0
}
//File Type
private struct TFile
{
//File Extension
public string fileExt;
//File Type
public EFile fileID;
}
//Class constructor
public Util(ListBox ltBox)
{
//Adding supported files
VTFile[1].fileExt = ".bmp";
VTFile[1].fileID = EFile.BMP;
VTFile[2].fileExt = ".jp2";
VTFile[2].fileID = EFile.JPEG2000;
VTFile[3].fileExt = ".cbeff";
VTFile[3].fileID = EFile.CBEFF;
log = ltBox;
}
public void WriteError(GriauleIcaoFaceException err)
{
WriteLog("Error: " + err.ToString());
}
// Write a message in box.
public void WriteLog(string message)
{
log.Items.Add(message);
log.SelectedIndex = log.Items.Count - 1;
log.ClearSelected();
}
//Get the ID File Type from file path name
public EFile GetFileType(string fileName)
{
EFile functionReturnValue = default(EFile);
int i = 0;
for (i = 0; i <= SFile; i++)
{
if (Strings.InStr(1, fileName, VTFile[i].fileExt) == Strings.Len(fileName) - Strings.Len(VTFile[i].fileExt) + 1)
{
functionReturnValue = VTFile[i].fileID;
return functionReturnValue;
}
}
functionReturnValue = EFile.NOTDEF;
return functionReturnValue;
}
//Loading an Image
public bool LoadImage(string fileName, PictureBox img)
{
// create face image from file
GrFaceImage = new FaceImage(fileName);
// display face image
DisplayFaceImage(img, false);
WriteLog("Image loaded successfully.");
return true;
}
//Process the raw Image to FaceImage Type and paint the points on pBox
public bool ProcessFaceImage(PictureBox pBox)
{
//Set mouth to be relevant to generate the ICAO
GrFaceImage.MouthDetectionEnabled = USES_MOUTH;
WriteLog("Finding the eyes and mouth positions. Please, wait...");
//Get the positions from mouth and eyes
if (GetPositionsFromFaceImage())
{
WriteLog("Eyes and mouth found. Drawing their positions on the image.");
//Display Face Image with eyes and mouth drawn
DisplayFaceImage(pBox, true);
return true;
}
else
{
//Display Face Image
DisplayFaceImage(pBox, false);
return false;
}
}
//Display the ICAO Image
public void DisplayIcaoImg(PictureBox imgIcao)
{
if (GrFaceImage.LeftEye.X <= 0 | GrFaceImage.LeftEye.Y <= 0 | GrFaceImage.LeftEye.X > GrFaceImage.Width | GrFaceImage.LeftEye.Y > GrFaceImage.Height)
{
WriteLog("Left eye is out of bounds.");
return;
}
if (GrFaceImage.RightEye.X <= 0 | GrFaceImage.RightEye.Y <= 0 | GrFaceImage.RightEye.X > GrFaceImage.Width | GrFaceImage.RightEye.Y > GrFaceImage.Height)
{
WriteLog("Right eye is out of bounds.");
return;
}
if (GrFaceImage.Mouth.X <= 0 | GrFaceImage.Mouth.Y <= 0 | GrFaceImage.Mouth.X > GrFaceImage.Width | GrFaceImage.Mouth.Y > GrFaceImage.Height)
{
WriteLog("Mouth is out of bounds.");
return;
}
//Get the GrIcaoImage
try
{
GrIcaoImage = GrFaceImage.FullFrontalImage(imgIcao.Width, 3.0 / 4.0, IcaoImage.IcaoFullFrontalMode.FullFrontal);
}
catch (GriauleIcaoFaceException ex)
{
WriteError(ex);
return;
}
//Getting the eyes positons from icao
if (GetPositionsFromIcaoImage())
{
//Displaying the icao image
DisplayIcaoImage(imgIcao);
}
WriteLog("ICAO image generated.");
}
//Display Face Image
public void DisplayFaceImage(PictureBox pBox, bool withFeatures)
{
if (withFeatures)
{
pBox.Image = GrFaceImage.ImageWithFeatures(8, Color.Green);
}
else
{
pBox.Image = GrFaceImage.Image;
}
pBox.Update();
}
//Display Cbeff Image
public void DisplayCbeffImage(PictureBox pBox)
{
pBox.Image = GrCbeffImage.Image;
pBox.Update();
}
//Display Icao Image
public void DisplayIcaoImage(PictureBox pBox)
{
pBox.Image = GrIcaoImage.Image;
pBox.Update();
}
//Save ICAO in CBEFF file format
public void SaveIcaoIntoCBEFFImage(string fileName)
{
// Create a CBEFF from Icao
if (GetCbeffFromIcao())
{
//Get the CBEFF buffer
try
{
SaveBuffer(fileName, ref GrCbeff.CBEFF);
}
catch (GriauleIcaoFaceException ex)
{
WriteError(ex);
}
}
}
//Load an ICAO image from CBEFF file format
public void LoadIcaoFromCBEFFImage(string fileName, PictureBox pBox)
{
//Creating the cbeff from the buffer
try
{
GrCbeff = new Cbeff(LoadBuffer(fileName));
GrCbeffImage = GrCbeff.Image(0);
}
catch (GriauleIcaoFaceException ex)
{
WriteError(ex);
}
// Display icao image
DisplayCbeffImage(pBox);
}
//Save ICAO image in JPEG2000 file format
public void SaveIcaoIntoJP2Image(string fileName)
{
// Create a CBEFF from Icao
if (!GetCbeffFromIcao())
{
return;
}
//Get Jpeg2000 buffer from CBEFF and save it in a file
SaveBuffer(fileName, ref GrCbeffImage.BufferJPEG);
}
//Save Byte Buffer into a file
private void SaveBuffer(string fileName, ref byte[] buffer)
{
System.IO.FileStream oFileStream = new FileStream(fileName, FileMode.Create, FileAccess.Write);
System.IO.BinaryWriter swb = new System.IO.BinaryWriter(oFileStream);
swb.Write(buffer);
swb.Close();
}
//Load stream from file
private byte[] LoadBuffer(string fileName)
{
// Open a file that is to be loaded into a byte array
FileInfo oFile = null;
oFile = new FileInfo(fileName);
System.IO.FileStream oFileStream = oFile.OpenRead();
long lBytes = oFileStream.Length;
byte[] fileData = new byte[lBytes + 1];
// Read the file into a byte array
oFileStream.Read(fileData, 0, lBytes);
oFileStream.Close();
return fileData;
}
//Get CBEFF image from an Icao image
private bool GetCbeffFromIcao()
{
//Create Cbeff Image Data pointer
GrCbeff = new Cbeff();
GrCbeffImage = GrCbeff.AddImage(GrIcaoImage, false, 0);
GrCbeffImage.Gender = CbeffImage.CbeffGender.Unknown;
GrCbeffImage.Eyes = CbeffImage.CbeffEyes.Unspecified;
GrCbeffImage.Hair = CbeffImage.CbeffHair.Unspecified;
GrCbeffImage.FeatureMask = 0;
GrCbeffImage.Expression = CbeffImage.CbeffExpression.Unspecified;
return true;
}
//Get eyes and mouth position from Face Image
public bool GetPositionsFromFaceImage()
{
float prob = 0;
//Get the eyes detection probabilty
prob = GrFaceImage.DetectionProbability;
if (prob == 0)
{
Interaction.MsgBox("There isn't any probability to find the eyes position.", Constants.vbCritical, "No probability");
return false;
}
return true;
}
//Get eyes and mouth position from ICAO Image
public bool GetPositionsFromIcaoImage()
{
//get the position from an icao image.
float prob = 0;
prob = GrIcaoImage.DetectionProbability;
if (prob <= 0)
{
WriteLog("There isn't any probability to find the eyes position.");
return false;
}
return true;
}
//Set left eye position on library
public void SetLeftEyePos(int x, int y)
{
GrFaceImage.LeftEye = new Point(x, y);
}
//Set right eye position on library
public void SetRightEyePos(int x, int y)
{
GrFaceImage.RightEye = new Point(x, y);
}
//Set mouth position on library
public void SetMouthPos(int x, int y)
{
if (x > 0 & x < GrFaceImage.Width & y > 0 & y < GrFaceImage.Height)
{
Point p = new Point(x, y);
GrFaceImage.Mouth = p;
}
}
//Marshal between the library and VB .NET: copy a Variant array into a byte[] vector
public byte[] ConvertArrayToVByte(Array buffer)
{
GCHandle handle = GCHandle.Alloc(buffer, GCHandleType.Pinned);
try
{
// Copy exactly as many bytes as the source array occupies
byte[] bytes = new byte[Buffer.ByteLength(buffer)];
Marshal.Copy(handle.AddrOfPinnedObject(), bytes, 0, bytes.Length);
return bytes;
}
finally
{
handle.Free(); // always release the pinned handle
}
}
// Show GriauleAfis version and type
public void MessageVersion()
{
int majorVersion = 0;
int minorVersion = 0;
GriauleIcaoFace.GetVersion(majorVersion, minorVersion);
MessageBox.Show("The GrIcaoFace DLL version is " + majorVersion + "." + minorVersion + ".", "GrIcaoFace Version", MessageBoxButtons.OK);
}
}
}
I keep getting these errors:
The type or namespace name 'ListBox' could not be found (are
you missing a using directive or an assembly reference?).
The type or namespace name 'PictureBox' could not be found (are you
missing a using directive or an assembly reference?).
And here is the formMain.cs
using Microsoft.VisualBasic;
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Windows.Forms;
namespace IcaoWF
{
public partial class formMain : Form
{
public formMain(): base()
{
Load += formMain_Load;
InitializeComponent();
}
// Current "set position" state: which button started the operation
// and which labels receive the clicked coordinates.
private struct TSetting
{
// Button that started the set operation.
public Button button;
// Labels that display the X and Y position.
public Label x;
public Label y;
public bool setting;
}
TSetting CSetting = new TSetting();
Util myUtil = default(Util);
private void formMain_Load(Object sender, EventArgs e)
{
InitializeInterface();
//Setting file filters
ldImg.Filter = "JPEG Images (*.jpg,*.jpeg)|*.jpg;*.jpeg|Gif Images (*.gif)|*.gif|Bitmaps (*.bmp)|*.bmp";
ldIcaoImg.Filter = "CBEFF (*.cbeff)|*.cbeff";
svIcaoImg.Filter = "JPEG2000 (*.jp2)|*.jp2|CBEFF (*.cbeff)|*.cbeff";
myUtil = new Util(logBox);
//Checking whether the mouth position is used
gbMouth.Enabled = myUtil.USES_MOUTH;
}
//Unlock the interface and update the clicked item
private void interfaceSetStop(int x, int y)
{
if (CSetting.setting) {
//Set the CSetting to false
CSetting.setting = false;
//Set positions from mouse to CSetting text selected
CSetting.x.Text = x.ToString();
CSetting.y.Text = y.ToString();
//Enable Set button again
CSetting.button.Enabled = true;
//Set the normal cursor above image
imgFace.Cursor = Cursors.Arrow;
//Re-enable all buttons disabled earlier
EnableButtons();
//Sets new position from image
//SetLeftEyePos/SetRightEyePos/SetMouthPos take ints, so parse the label text
myUtil.SetLeftEyePos(int.Parse(lbLeftEyeXPos.Text), int.Parse(lbLeftEyeYPos.Text));
myUtil.SetRightEyePos(int.Parse(lbRightEyeXPos.Text), int.Parse(lbRightEyeYPos.Text));
if (myUtil.USES_MOUTH) {
myUtil.SetMouthPos(int.Parse(lbMouthXPos.Text), int.Parse(lbMouthYPos.Text));
}
//Redraw img
myUtil.DisplayFaceImage(imgFace, true);
}
}
//Initialize the program interface
private void InitializeInterface()
{
//Disable buttons
DisableButtons();
//Disable the image picture box
imgFace.Enabled = false;
//Current setting eye or mouth to false
CSetting.setting = false;
//Disable Save ICAO image
mnFileSaveIcaoImg.Enabled = false;
//Reset the logBox
logBox.ResetText();
}
//Enable all buttons on the interface
private void EnableButtons()
{
btGenIcaoImage.Enabled = true;
btLeftEyeSet.Enabled = true;
btMouthSet.Enabled = true;
btRightEyeSet.Enabled = true;
btProcess.Enabled = true;
imgFace.Enabled = true;
}
//Set the interface to click on the image
private void btLeftEyeSet_Click(Object sender, EventArgs e)
{
interfaceSetStart(btLeftEyeSet, lbLeftEyeXPos, lbLeftEyeYPos);
}
//Set the interface to click on the image
private void lbRightEyeSet_Click(Object sender, EventArgs e)
{
interfaceSetStart(btRightEyeSet, lbRightEyeXPos, lbRightEyeYPos);
}
//Set the interface to click on the image
private void btMouthSet_Click(Object sender, EventArgs e)
{
interfaceSetStart(btMouthSet, lbMouthXPos, lbMouthYPos);
}
//Lock the interface to click on image
private void interfaceSetStart(Button button, Label x, Label y)
{
//Set the clicked button set
CSetting.button = button;
//set the label to update the position
CSetting.x = x;
CSetting.y = y;
//Enable set mode
CSetting.setting = true;
//Disable the button
button.Enabled = false;
//Enable Cross cursor on image
imgFace.Cursor = Cursors.Cross;
//Disable button to avoid user to click in another area
DisableButtons();
}
//Disable all buttons from interface
private void DisableButtons()
{
btGenIcaoImage.Enabled = false;
btLeftEyeSet.Enabled = false;
btMouthSet.Enabled = false;
btRightEyeSet.Enabled = false;
btProcess.Enabled = false;
}
//On click on the image, stop the interface and set the position in image coordinates
private void imgFace_MouseDown(object sender, MouseEventArgs e)
{
// Multiply before dividing so the integer math neither truncates to zero
// nor divides by zero when the image is larger than the PictureBox.
interfaceSetStop(e.X * myUtil.GrFaceImage.Width / imgFace.Width, e.Y * myUtil.GrFaceImage.Height / imgFace.Height);
}
//Gen the ICAO image from FaceImage
private void btGenIcaoImage_Click(Object sender, EventArgs e)
{
//Display ICAO image captured
myUtil.DisplayIcaoImg(imgIcaoImg);
//Enabled
mnFileSaveIcaoImg.Enabled = true;
}
//Load Icao IMAGE From CBEFF or JPEG2000
private void mnFileLoadIcaoImg_Click(Object sender, EventArgs e)
{
Util.EFile fileType = default(Util.EFile);
ldIcaoImg.FileName = "";
//load the ICAO image (ShowDialog is a method, so it needs parentheses)
if (ldIcaoImg.ShowDialog() == DialogResult.OK && !string.IsNullOrEmpty(ldIcaoImg.FileName))
{
fileType = myUtil.GetFileType(ldIcaoImg.FileName);
switch (fileType)
{
case Util.EFile.CBEFF:
//Save CBEFF image
myUtil.LoadIcaoFromCBEFFImage(ldIcaoImg.FileName, imgIcaoImg);
break;
default:
//Image type not found
myUtil.WriteLog("File type not supported.");
return;
}
}
}
//Save ICAO Image
private void mnFileSaveIcaoImg_Click(Object sender, EventArgs e)
{
Util.EFile fileType = default(Util.EFile);
svIcaoImg.FileName = "";
//save the ICAO image
if (svIcaoImg.ShowDialog() == DialogResult.OK && !string.IsNullOrEmpty(svIcaoImg.FileName))
{
fileType = myUtil.GetFileType(svIcaoImg.FileName);
switch (fileType)
{
case Util.EFile.CBEFF:
//Save CBEFF image
myUtil.SaveIcaoIntoCBEFFImage(svIcaoImg.FileName);
break;
case Util.EFile.JPEG2000:
//Save JPEG2000 image
myUtil.SaveIcaoIntoJP2Image(svIcaoImg.FileName);
break;
default:
//Image type not found
myUtil.WriteLog("File type not supported.");
break;
}
}
}
//Load Image
private void mnFileLoadImg_Click(Object sender, EventArgs e)
{
lbLeftEyeXPos.Text = "0";
lbLeftEyeYPos.Text = "0";
lbRightEyeXPos.Text = "0";
lbRightEyeYPos.Text = "0";
lbMouthXPos.Text = "0";
lbMouthYPos.Text = "0";
//Disable buttons
DisableButtons();
//Enable image
imgFace.Enabled = true;
//Set file name image to null
ldImg.FileName = "";
if (ldImg.ShowDialog() == DialogResult.OK && !string.IsNullOrEmpty(ldImg.FileName))
{
//load image from FileName into imgFace Picture Box
if (myUtil.LoadImage(ldImg.FileName, imgFace))
{
//Set the icaoImage to null
imgIcaoImg.Image = null;
imgIcaoImg.Refresh();
//Disble mnFileSaveIcaoImg to save
mnFileSaveIcaoImg.Enabled = false;
//Disable buttons
DisableButtons();
//Enable find eyes and mouth button
btProcess.Enabled = true;
}
}
}
//Close the program
private void MenuItem5_Click(Object sender, EventArgs e)
{
this.Close();
}
//Process the Face Image
private void btProcess_Click(Object sender, EventArgs e)
{
//Enable buttons to set eyes and mouth
EnableButtons();
//Process face image
if (myUtil.ProcessFaceImage(imgFace))
{
//Get positions from face image
lbLeftEyeXPos.Text = myUtil.GrFaceImage.LeftEye.X.ToString();
lbLeftEyeYPos.Text = myUtil.GrFaceImage.LeftEye.Y.ToString();
lbRightEyeXPos.Text = myUtil.GrFaceImage.RightEye.X.ToString();
lbRightEyeYPos.Text = myUtil.GrFaceImage.RightEye.Y.ToString();
lbMouthXPos.Text = myUtil.GrFaceImage.Mouth.X.ToString();
lbMouthYPos.Text = myUtil.GrFaceImage.Mouth.Y.ToString();
}
}
//Print the DLL version
private void mnVersion_Click(Object sender, EventArgs e)
{
myUtil.MessageVersion();
}
}
}
I can post the VB.NET version if needed.
EDITED: I have added a reference (System.Windows.Forms.dll) to the project, along with all the others I am using.
Thanks,
Nurcky
ListBox is defined in the assembly System.Windows.Forms.dll. Make sure you have a reference to that assembly.
Additionally you probably want
using System.Windows.Forms;
Both the ListBox and PictureBox controls are found in the System.Windows.Forms namespace. Add the following statement to the top of your code:
using System.Windows.Forms;
Then add a reference to System.Windows.Forms.dll
Both controls live in the System.Windows.Forms namespace; to use it, you also have to add System.Windows.Forms.dll as an assembly reference.
To add the reference, follow these steps:
In Solution Explorer, right-click on the project node and click Add Reference.
In the Add Reference dialog box, select the .NET tab and choose System.Windows.Forms and click OK.
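For an old-style (non-SDK) project, the steps above simply add a Reference element to the .csproj file. A sketch of what the Add Reference dialog generates (exact attributes such as version can vary with the target framework):

```xml
<ItemGroup>
  <!-- Added by the Add Reference dialog -->
  <Reference Include="System.Windows.Forms" />
  <Reference Include="System.Drawing" />
</ItemGroup>
```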
