Need your help.
I am making a simple application (I'm also new to Unity3D) which takes video from an IP camera and displays it on a Texture2D.
The video format is MJPG.
For plain JPG images the code below works fine, but when I try to display MJPG I just get a gray screen.
Did I make a mistake in the code?
public class testVid : MonoBehaviour {
    //public string uri = "http://24.172.4.142/mjpg/video.mjpg"; //url for example
    public Texture2D cam;

    public void Start() {
        cam = new Texture2D(1, 1, TextureFormat.RGB24, false);
        StartCoroutine(Fetch());
    }

    public IEnumerator Fetch() {
        while (true) {
            Debug.Log("loading... " + Time.realtimeSinceStartup);
            WWW www = new WWW("http://24.172.4.142/mjpg/video.mjpg");
            yield return www;
            if (!string.IsNullOrEmpty(www.error))
                throw new UnityException(www.error);
            www.LoadImageIntoTexture(cam);
        }
    }

    public void OnGUI() {
        GUI.DrawTexture(new Rect(0, 0, Screen.width, Screen.height), cam);
    }
}
I used this plugin: https://www.assetstore.unity3d.com/en/#!/content/15580
1. Add the script to a game object.
2. Set the URL of the video in the script.
3. Create a new material of type Unlit 2D.
4. Add this material to the movie script in the inspector.
5. Assign the same material to the game object you want to display the video on (e.g. a quad).
Hope it helps.
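If you'd rather not use a plugin: an MJPEG stream is essentially an endless multipart HTTP response where each part is a complete JPEG, and WWW.LoadImageIntoTexture only decodes a single JPEG/PNG image, which is likely why the original code only shows gray. A minimal hand-rolled sketch of pulling one frame out of such a stream (it naively scans for the JPEG start/end markers and assumes those byte pairs never occur inside frame data):

```csharp
using System.IO;

// Scan an MJPEG byte stream for the next complete JPEG frame by looking for
// the JPEG start-of-image (0xFF 0xD8) and end-of-image (0xFF 0xD9) markers.
static byte[] ReadNextJpegFrame(Stream stream)
{
    var frame = new MemoryStream();
    int prev = -1;
    int cur;
    bool inFrame = false;
    while ((cur = stream.ReadByte()) != -1)
    {
        if (!inFrame)
        {
            if (prev == 0xFF && cur == 0xD8) // start of image found
            {
                inFrame = true;
                frame.WriteByte(0xFF);
                frame.WriteByte(0xD8);
            }
        }
        else
        {
            frame.WriteByte((byte)cur);
            if (prev == 0xFF && cur == 0xD9) // end of image found
                return frame.ToArray();
        }
        prev = cur;
    }
    return null; // stream ended before a complete frame
}
```

Each returned byte array can then be fed to Texture2D.LoadImage on the main thread; the blocking read itself should live on a background thread, not in a coroutine.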
In Unity, I'd like to generate a 2D map from a large image file (9650×12573).
I already have a tile/image-pieces generator, but it generates 1900 images of 256×256, and it seems hard to create a map by hand like this (and I have more than one large image to process...).
These image pieces are named like this:
x/y.png
where x runs from left to right and y from top to bottom.
As MelvMay suggested to me here, I should use textures to achieve that, but how?
Do you have any idea how to achieve that programmatically? Or maybe with an existing bundle?
Thanks a lot!
[edit] As a workaround, I tried this :
using System.IO;
using UnityEngine;

public class MapGenerator : MonoBehaviour {
    private Renderer m_Renderer;
    private Texture2D m_MainTexture;

    // Use this for initialization
    void Start () {
        // Fetch the Renderer from the GameObject
        m_Renderer = GetComponent<Renderer> ();
        m_MainTexture = LoadPNG("Assets/Tiles/Ressources/Map/0/3.png");
        m_Renderer.material.mainTexture = m_MainTexture;
    }

    private Texture2D LoadPNG(string filePath) {
        Texture2D tex = null;
        if (File.Exists(filePath)) {
            Debug.Log(filePath);
            byte[] fileData = File.ReadAllBytes(filePath);
            tex = new Texture2D(2, 2);
            tex.LoadImage(fileData); // ...this will auto-resize the texture dimensions.
        }
        return tex;
    }
}
But even if I apply it to an empty object, I only see my dynamic texture in the inspector, not in the scene when I press Play...
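Extending that workaround into a full map: a sketch that loops over the question's x/y.png layout, loads each tile, and places it on its own quad (the columns/rows counts and the one-unit tile size are assumptions to adjust to your generator's output):

```csharp
using System.IO;
using UnityEngine;

public class TiledMapGenerator : MonoBehaviour
{
    public int columns = 38; // number of x folders (assumption)
    public int rows = 50;    // number of y tiles per folder (assumption)

    void Start()
    {
        for (int x = 0; x < columns; x++)
        {
            for (int y = 0; y < rows; y++)
            {
                string path = string.Format("Assets/Tiles/Ressources/Map/{0}/{1}.png", x, y);
                if (!File.Exists(path)) continue;

                Texture2D tex = new Texture2D(2, 2);
                tex.LoadImage(File.ReadAllBytes(path)); // resizes to the file's dimensions

                // One quad per tile; y grows downwards in the tile naming,
                // so negate it for Unity's upward y axis.
                GameObject quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
                quad.transform.position = new Vector3(x, -y, 0);
                quad.GetComponent<Renderer>().material.mainTexture = tex;
            }
        }
    }
}
```

Note that loading from an Assets/... path like this only works in the editor; for a build the tiles would have to live under StreamingAssets or in a Resources folder.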
I am trying to take a photo through the HoloLens with my PhotoCapture Unity script. I want to use a Vuforia Engine ARCamera so that I can see reality at the same time as the AR GUI I've created (for future functionality).
The main error I get is:
Failed capturing photo (hr = 0xC00D3704)
Why does it occur? How do I fix it?
The FocusManager singleton has not been initialized.
UnityEngine.Debug:Assert(Boolean, String)
HoloToolkit.Unity.Singleton`1:AssertIsInitialized() (at
Assets/HoloToolkit/Common/Scripts/Singleton.cs:51)
HoloToolkit.Unity.CanvasHelper:Start() (at
Assets/HoloToolkit/Utilities/Scripts/CanvasHelper.cs:30)
This error also occurs when starting the Unity scene, but I hadn't seen it before...
This is the code I am using, placed on an ARCamera (I have also tried a mixed reality camera with a Vuforia behaviour script, which didn't produce the second error). Also, I want to apologize to the person this code is borrowed from, because I don't remember the link to your site.
using System;
using System.IO;
using System.Linq;
using UnityEngine;
using UnityEngine.XR.WSA.WebCam;

public class PhotoCaptureExample : MonoBehaviour
{
    PhotoCapture photoCaptureObject = null;
    Texture2D targetTexture = null;
    public string path = "";
    CameraParameters cameraParameters = new CameraParameters();

    void Start()
    {
    }

    void Update()
    {
        if (Input.GetKeyDown("k"))
        {
            Debug.Log("k was pressed");
            Resolution cameraResolution = PhotoCapture.SupportedResolutions.OrderByDescending((res) => res.width * res.height).First();
            targetTexture = new Texture2D(cameraResolution.width, cameraResolution.height);

            // Create a PhotoCapture object
            PhotoCapture.CreateAsync(false, delegate (PhotoCapture captureObject)
            {
                photoCaptureObject = captureObject;
                cameraParameters.hologramOpacity = 0.0f;
                cameraParameters.cameraResolutionWidth = cameraResolution.width;
                cameraParameters.cameraResolutionHeight = cameraResolution.height;
                cameraParameters.pixelFormat = CapturePixelFormat.BGRA32;

                // Activate the camera
                photoCaptureObject.StartPhotoModeAsync(cameraParameters, delegate (PhotoCapture.PhotoCaptureResult result)
                {
                    // Take a picture
                    photoCaptureObject.TakePhotoAsync(OnCapturedPhotoToMemory);
                });
            });
        }
    }

    string FileName(int width, int height)
    {
        return string.Format("screen_{0}x{1}_{2}.png", width, height, DateTime.Now.ToString("yyyy-MM-dd_HH-mm-ss"));
    }

    void OnCapturedPhotoToMemory(PhotoCapture.PhotoCaptureResult result, PhotoCaptureFrame photoCaptureFrame)
    {
        // Copy the raw image data into the target texture
        photoCaptureFrame.UploadImageDataToTexture(targetTexture);
        Resolution cameraResolution = PhotoCapture.SupportedResolutions.OrderByDescending((res) => res.width * res.height).First();
        targetTexture.ReadPixels(new Rect(0, 0, cameraResolution.width, cameraResolution.height), 0, 0);
        targetTexture.Apply();
        byte[] bytes = targetTexture.EncodeToPNG();
        string filename = FileName(targetTexture.width, targetTexture.height);
        // Save to a folder under Assets
        File.WriteAllBytes(Application.dataPath + "/Snapshots/" + filename, bytes);
        Debug.Log("The picture was uploaded");
        // Deactivate the camera
        photoCaptureObject.StopPhotoModeAsync(OnStoppedPhotoMode);
    }

    void OnStoppedPhotoMode(PhotoCapture.PhotoCaptureResult result)
    {
        // Shut down the photo capture resource
        photoCaptureObject.Dispose();
        photoCaptureObject = null;
    }
}
It seems that I can't get to OnCapturedPhotoToMemory, or it breaks already at the method call. Trying it out again right now, the code occasionally won't even register that I have pressed k...
Any help is appreciated!!
The problem is: Vuforia's VuforiaBehaviour on the camera holds access to the device's real camera hardware => nothing else can use it meanwhile.
To fix this you could use a dedicated camera for Vuforia (simply place a new GameObject, e.g. VuforiaCamera, somewhere in the scene and attach a Camera component as well as a VuforiaBehaviour to it).
On the Vuforia camera, set Culling Mask to Nothing (we don't render anything with that camera) and Depth to e.g. -2 (higher values are rendered on top -> this puts it behind all other cameras).
You have to do this because otherwise Vuforia automatically adds the behaviour to the main camera (which we don't want to disable, because then we wouldn't see anything). By manually adding one to the scene, Vuforia automatically uses that one instead.
Everywhere in your scene where you need a Camera, you of course use the original camera from the HoloToolkit (your usual MainCamera). The problem: you can't fully rely on Camera.main in scripts, because at runtime the VuforiaBehaviour automatically marks its camera as MainCamera as well... (-_-) Vuforia... So additionally I always disabled and enabled the VuforiaBehaviour along with the GameObject, but maybe it is already enough to only disable and enable the GameObject. At least at game start it should probably stay disabled until everything relying on Camera.main has finished.
Then you can simply disable that VuforiaCamera that has the VuforiaBehaviour on it.
VuforiaBehaviour.Instance.gameObject.SetActive(false);
// Note: that part I'm just inventing since I did it different
// using reference etc. I hope vuforia does not destroy the Instance
// OnDisable .. otherwise you would need the reference instead of using Instance here
// but maybe it is already enough to just enable and disable the GameObject
VuforiaBehaviour.Instance.enabled = false;
By doing this the device's camera is "free", and PhotoCapture can now use it.
PhotoCapture.StartPhotoModeAsync(....
If you then need the camera again for Vuforia, first stop the PhotoCapture
PhotoCapture.StopPhotoModeAsync(..
and then, after it has stopped, re-enable the ARCamera (the object with the VuforiaBehaviour) and the VuforiaBehaviour again in the callback.
Something like:
private void Awake()
{
    var cameraResolution = PhotoCapture.SupportedResolutions.OrderByDescending((res) => res.width * res.height).First();
    targetTexture = new Texture2D(cameraResolution.width, cameraResolution.height);

    // Create a PhotoCapture object
    PhotoCapture.CreateAsync(false, captureObject =>
    {
        photoCaptureObject = captureObject;
        cameraParameters.hologramOpacity = 0.0f;
        cameraParameters.cameraResolutionWidth = cameraResolution.width;
        cameraParameters.cameraResolutionHeight = cameraResolution.height;
        cameraParameters.pixelFormat = CapturePixelFormat.BGRA32;
    });
}

private void Update()
{
    // If not initialized yet, don't take input
    if (photoCaptureObject == null) return;

    if (Input.GetKeyDown("k"))
    {
        Debug.Log("k was pressed");
        VuforiaBehaviour.Instance.gameObject.SetActive(false);

        // Activate the camera
        photoCaptureObject.StartPhotoModeAsync(cameraParameters, result =>
        {
            if (result.success)
            {
                // Take a picture
                photoCaptureObject.TakePhotoAsync(OnCapturedPhotoToMemory);
            }
            else
            {
                Debug.LogError("Couldn't start photo mode!", this);
            }
        });
    }
}

private static string FileName(int width, int height)
{
    return $"screen_{width}x{height}_{DateTime.Now:yyyy-MM-dd_HH-mm-ss}.png";
}

private void OnCapturedPhotoToMemory(PhotoCapture.PhotoCaptureResult result, PhotoCaptureFrame photoCaptureFrame)
{
    // Copy the raw image data into the target texture
    photoCaptureFrame.UploadImageDataToTexture(targetTexture);
    Resolution cameraResolution = PhotoCapture.SupportedResolutions.OrderByDescending((res) => res.width * res.height).First();
    targetTexture.ReadPixels(new Rect(0, 0, cameraResolution.width, cameraResolution.height), 0, 0);
    targetTexture.Apply();
    byte[] bytes = targetTexture.EncodeToPNG();
    string filename = FileName(targetTexture.width, targetTexture.height);
    // Save to a folder under StreamingAssets
    File.WriteAllBytes(Application.streamingAssetsPath + "/Snapshots/" + filename, bytes);
    Debug.Log("The picture was uploaded");
    // Deactivate the camera
    photoCaptureObject.StopPhotoModeAsync(OnStoppedPhotoMode);
}

private void OnStoppedPhotoMode(PhotoCapture.PhotoCaptureResult result)
{
    // Shut down the photo capture resource
    photoCaptureObject.Dispose();
    photoCaptureObject = null;
    VuforiaBehaviour.Instance.gameObject.SetActive(true);
}
The exception might, however, also be related to this issue.
I can successfully load a material saved in my Unity project using the code below:
RenderSettings.mat1 = (Material)Resources.Load ("images/img1.jpg", typeof(Material));
However, I am now trying to load an external image by downloading it.
Texture2D imgDownloaded;
string url = "http://www.intrawallpaper.com/static/images/1968081.jpg";

void Start()
{
    StartCoroutine(getImg());
    functionx();
}

public void functionx()
{
    RenderSettings.mat1 = (Material)imgDownloaded;
}

IEnumerator getImg()
{
    yield return 0;
    WWW dl = new WWW(url);
    yield return dl;
    imgDownloaded = dl.texture;
}
However, I get the message that I cannot convert from Texture2D to Material.
Is there any way to fix this?
Try:
yourMaterial.mainTexture = yourTexture;
A material consists of many properties, including textures, so naturally you can't convert a texture into a material.
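Applied to the code from the question, the coroutine could assign the downloaded texture to the existing material once the download has finished (a sketch; RenderSettings.mat1 is the question's own field, not a built-in Unity property):

```csharp
IEnumerator getImg()
{
    WWW dl = new WWW(url);
    yield return dl;
    // Assign the downloaded texture to the material's main texture slot
    // instead of trying to cast the Texture2D to a Material.
    RenderSettings.mat1.mainTexture = dl.texture;
}
```

This also avoids the timing problem in the original Start(): functionx() was called before the coroutine had finished downloading.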
In Unity, this script runs when the program starts. I want to download an image and then show it on the main screen. What should I do? The following code did not work.
My code:
using System.Collections;
using UnityEngine;

public class PushNotifications : MonoBehaviour {
    IEnumerator Start () {
        Texture2D textWebPic = null;
        WWW image = new WWW("http://www.test.com/image.png");
        yield return image;
        image.LoadImageIntoTexture(textWebPic);
    }

    void Update () {
    }
}
You can't pass null to LoadImageIntoTexture, since Unity then has no idea where to put the output (the parameter is not ref). The texture must be initialized first.
It does not really matter what size or format you initialize it with; Unity will resize it anyway. So you can initialize some dummy like this to load the image:
IEnumerator Start () {
    Texture2D textWebPic = new Texture2D(2, 2);
    WWW image = new WWW("http://www.test.com/image.png");
    yield return image;
    image.LoadImageIntoTexture(textWebPic);
}
Another, and probably better, option is to use WWW.texture instead of LoadImageIntoTexture, like this:
IEnumerator Start () {
    WWW image = new WWW("http://www.test.com/image.png");
    yield return image;
    Texture2D textWebPic = image.texture;
}
See the WWW class reference for more examples: http://docs.unity3d.com/ScriptReference/WWW.html
And then to show it on the screen you have multiple options: creating a material with this texture, creating a sprite from the texture (best for 2D games), or simply using Graphics.DrawTexture.
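For the sprite route, a minimal sketch (the centered pivot is just an example choice, and a SpriteRenderer on the same object is assumed):

```csharp
// Wrap the downloaded texture in a sprite and display it on a SpriteRenderer.
Sprite sprite = Sprite.Create(
    textWebPic,
    new Rect(0, 0, textWebPic.width, textWebPic.height),
    new Vector2(0.5f, 0.5f)); // pivot at the center

GetComponent<SpriteRenderer>().sprite = sprite;
```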
I am building an application in Unity3d, and I need to download textures from my server and apply them to prefabs.
I have two types of prefabs:
The first is a simple plane that I use to display 2D images, and the second is a prefab that plays videos and has a thumbnail texture that is displayed before the video is played in full screen.
I am having problems with the video prefab. If I create a public texture in my script and apply it to the prefab, everything works fine. However, if I download the texture from my server and apply it to the prefab, it appears black. This only happens on iOS; in the Unity editor everything appears fine.
Here is my code:
Instantiate the prefab:
newVideo = (GameObject)Instantiate(arvideo, new Vector3(15*i, 0, 0), Quaternion.identity);
newVideo.GetComponent<VideoPlaybackBehaviour>().m_path = ((Assets)Data.Assets[i]).AssetContent; // SET THE URL FOR THE VIDEO
string url = ((Assets)Data.Assets[i]).AssetThumbnail;
StartCoroutine(DownloadImage(url, newVideo, ((Assets)Data.Assets[i]).AssetFilename, "VIDEO"));
newVideo.transform.rotation = Quaternion.Euler(0, -180, 0);
Download IEnumerator:
public IEnumerator DownloadImage(string url, GameObject tex, string filename, string type)
{
    WWW www = new WWW(url);
    yield return www;
    /* EDIT: */
    if (!string.IsNullOrEmpty(www.error)) {
        Debug.LogWarning("LOCAL FILE ERROR: " + www.error);
    } else if (www.texture == null) {
        Debug.LogWarning("LOCAL FILE ERROR: TEXTURE NULL");
    } else {
    /* EOF EDIT */
        tex.GetComponent<VideoPlaybackBehaviour>().KeyframeTexture = www.texture;
        Color color = tex.renderer.material.color;
        color.a = 1f;
        tex.renderer.material.color = color;
    }
}
I struggled with the same issue. After several hours of searching, in my case I solved this problem by disabling "Metal Write-Only Backbuffer" in the iOS player settings. I'm using Unity 2020.1.8f1, and I actually have no idea why the backbuffer setting causes the black texture problem on iOS devices.