Texture downloaded from server appears black on iOS - c#

I am building an application in Unity3d, and I need to download textures from my server and apply them to prefabs.
I have two types of prefabs:
The first is a simple plane that I use to display 2D images; the second is a prefab that plays videos and has a thumbnail texture that is displayed before the video plays in full screen.
I am having problems with the video prefab. If I create a public texture in my script and apply it to the prefab, everything works fine. However, if I download the texture from my server and apply it to the prefab, it appears black. This only happens on iOS; in the Unity Editor everything appears fine.
Here is my code:
Instantiate the prefab:
newVideo = (GameObject)Instantiate(arvideo, new Vector3(15*i, 0, 0), Quaternion.identity);
newVideo.GetComponent<VideoPlaybackBehaviour>().m_path = ((Assets)Data.Assets[i]).AssetContent; // SET THE URL FOR THE VIDEO
string url = ((Assets)Data.Assets[i]).AssetThumbnail;
StartCoroutine(DownloadImage(url, newVideo, ((Assets)Data.Assets[i]).AssetFilename, "VIDEO"));
newVideo.transform.rotation = Quaternion.Euler(0, -180, 0);
Download IEnumerator:
public IEnumerator DownloadImage(string url, GameObject tex, string filename, string type)
{
    WWW www = new WWW(url);
    yield return www;
    /* EDIT: */
    if (!string.IsNullOrEmpty(www.error)) {
        Debug.LogWarning("LOCAL FILE ERROR: " + www.error);
    } else if (www.texture == null) {
        Debug.LogWarning("LOCAL FILE ERROR: TEXTURE NULL");
    } else {
    /* EOF EDIT */
        tex.GetComponent<VideoPlaybackBehaviour>().KeyframeTexture = www.texture;
        Color color = tex.renderer.material.color;
        color.a = 1f;
        tex.renderer.material.color = color;
    }
}
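As an aside, the WWW class used above has since been deprecated. A rough modern equivalent of the coroutine, using UnityWebRequestTexture, is sketched below; this is an untested sketch that assumes a recent Unity version (2020.2+ for `UnityWebRequest.Result`) and reuses `VideoPlaybackBehaviour` and its `KeyframeTexture` field from the question's own code:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class ThumbnailLoader : MonoBehaviour
{
    // Hypothetical replacement for the WWW-based coroutine above.
    // VideoPlaybackBehaviour / KeyframeTexture come from the asker's
    // project; everything else is standard Unity API.
    public IEnumerator DownloadImage(string url, GameObject target)
    {
        // nonReadable: false keeps the pixel data CPU-accessible,
        // which sidesteps some platform-specific texture issues.
        using (UnityWebRequest uwr = UnityWebRequestTexture.GetTexture(url, nonReadable: false))
        {
            yield return uwr.SendWebRequest();

            if (uwr.result != UnityWebRequest.Result.Success)
            {
                Debug.LogWarning("DOWNLOAD ERROR: " + uwr.error);
                yield break;
            }

            Texture2D tex = DownloadHandlerTexture.GetContent(uwr);
            target.GetComponent<VideoPlaybackBehaviour>().KeyframeTexture = tex;

            // Make sure the material is fully opaque.
            Material mat = target.GetComponent<Renderer>().material;
            Color color = mat.color;
            color.a = 1f;
            mat.color = color;
        }
    }
}
```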

I struggled with the same issue. After several hours of searching, I solved it in my case by disabling "Metal write-only BackBuffer" in the iOS Player Settings. I'm using Unity 2020.1.8f1, and I honestly have no idea why that back-buffer setting causes the black-texture problem on iOS devices.

Related

[Unity, C#] Video Player: video flickers/blinks on Android right before the video starts

This is the first time I've asked for help on this forum; I hope you can help me. :)
The issue: when the scene loads, I start an mp4 video with Unity's Video Player, using a Raw Image and a RenderTexture. In the Editor and in the BlueStacks emulator everything is fine, but when I start the game on an Android device, right before the video starts I get a black screen for about a second, and then the video plays normally. It happens every time, right before the video starts. Does anyone know how to fix this so the video starts cleanly? I would greatly appreciate it.
I am new to the Video Player and don't understand what could be wrong. I assign the textures in my code.
Image of component
void Start()
{
    renderTextureFirstVideo = new CustomRenderTexture(1080, 1920);
    renderTextureFirstVideo.initializationColor = new Color(0f, 0f, 0f, 0f);
    renderTextureLoopVideo = new CustomRenderTexture(1080, 1920);
    renderTextureLoopVideo.initializationColor = new Color(0f, 0f, 0f, 0f);
    time = _FirstVideo.clip.length;
    Timing.RunCoroutine(IShowVideo());
}

private IEnumerator<float> IShowVideo()
{
    yield return Timing.WaitForSeconds(1f);
    _FirstVideoRawImage.texture = renderTextureFirstVideo;
    _FirstVideo.targetTexture = _FirstVideoRawImage.texture as RenderTexture;
    _FirstVideo.gameObject.SetActive(true);
    _FirstVideo.Play();
    yield return Timing.WaitForSeconds((float)time);
    _LoopVideoRawImage.texture = renderTextureLoopVideo;
    _LoopVideo.targetTexture = _LoopVideoRawImage.texture as RenderTexture;
    _LoopVideo.gameObject.SetActive(true);
    _LoopVideo.Play();
}
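One common cause of a black frame at video start (not confirmed in this thread, so treat it as a suggestion) is calling Play() before the VideoPlayer has decoded its first frame. A minimal sketch of preparing the player first, assuming a `_FirstVideo` field like the one above:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Video;

public class PreparedPlayback : MonoBehaviour
{
    // Assumed to be assigned in the Inspector, like _FirstVideo above.
    public VideoPlayer _FirstVideo;

    IEnumerator Start()
    {
        // Buffer the video before showing it; calling Play() on an
        // unprepared player leaves the target texture empty (black)
        // until the first frame is decoded.
        _FirstVideo.Prepare();
        while (!_FirstVideo.isPrepared)
            yield return null;

        _FirstVideo.Play();
    }
}
```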

How can I update an image at runtime in Unity?

I'm building a smooth transition from a level scene to a menu scene.
The idea is to take a screenshot of the level just before the next scene (the menu) loads,
then have this screenshot overlay the entire screen and fade out to reveal the menu.
At the end of a level I take a screenshot:
ScreenCapture.CaptureScreenshot("Resources/DestroyedBoss.png");
When I load the next scene, this screenshot should load and overlay the entire scene.
public Texture2D myTexture;

void Start()
{
    // load texture from resource folder
    myTexture = Resources.Load("DestroyedBoss") as Texture2D;
    GameObject rawImage = GameObject.Find("RawImage");
    rawImage.GetComponent<RawImage>().texture = myTexture;
}
The screenshot will fade and the object containing it will be destroyed.
public RawImage imageToFade;

void Update()
{
    imageToFade.color -= new Color(0, 0, 0, 0.2f * Time.deltaTime);
    if (imageToFade.color.a < 0.1)
        Destroy(this.gameObject);
}
This all works fine, except for the loading of the screenshot itself.
It seems that Unity does not update the image while the game is running, and always uses the screenshot from the previous time I ran the game.
I have been searching, googling, and experimenting for two days now, and I am at a loss.
How can I make this work? Is there a way to update the PNG file while my game is running? Maybe create a temporary container for the screenshot and assign that to the RawImage?
As derHugo commented, you can use Application.persistentDataPath to save your image (note the path separator, which was missing from the original snippet):
ScreenCapture.CaptureScreenshot(Path.Combine(Application.persistentDataPath, "DestroyedBoss.png"));
For loading the image, you can use UnityWebRequestTexture.GetTexture.
IEnumerator SetImage()
{
    yield return new WaitForSeconds(1);
    using (UnityWebRequest uwr =
        UnityWebRequestTexture.GetTexture(Application.persistentDataPath + "DestroyedBoss.png"))
    {
        yield return uwr.SendWebRequest();
        if (uwr.isNetworkError || uwr.isHttpError)
        {
            Debug.Log(uwr.error);
        }
        else
        {
            // Get downloaded texture
            myTexture = DownloadHandlerTexture.GetContent(uwr);
            GameObject rawImage = GameObject.Find("RawImage");
            rawImage.GetComponent<RawImage>().texture = myTexture;
            FadeOut.instance.StartFade();
        }
    }
}
It works, but it takes a few seconds for the screenshot to appear in the folder, so for me it does not work great.
I'll be using another technique instead: rather than fading the screenshot out while loading the menu, I'll pause the game (Time.timeScale = 0), fade in a screenshot of the menu, and then load the actual menu scene.
Thanks again, derHugo.
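The delay mentioned above comes from CaptureScreenshot writing the file asynchronously. One possible workaround (a sketch, assuming the same file name as in the snippets above; the timeout value is arbitrary) is to poll for the file before trying to load it:

```csharp
using System.Collections;
using System.IO;
using UnityEngine;

public class ScreenshotWaiter : MonoBehaviour
{
    // Polls until CaptureScreenshot has produced the file, then hands
    // the path to a caller-supplied callback. Note File.Exists can in
    // principle return true while the file is still being written, so
    // this is a best-effort check, not a guarantee.
    public IEnumerator WaitForScreenshot(System.Action<string> onReady, float timeout = 5f)
    {
        string path = Path.Combine(Application.persistentDataPath, "DestroyedBoss.png");
        float deadline = Time.realtimeSinceStartup + timeout;

        while (!File.Exists(path) && Time.realtimeSinceStartup < deadline)
            yield return null;

        if (File.Exists(path))
            onReady(path);
        else
            Debug.LogWarning("Screenshot never appeared at " + path);
    }
}
```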

How to make a texture readable when the image is captured by the camera on Android or iOS in Unity?

I am trying to build an app that uses the camera to capture an image of a credit card and read the numbers from the image.
The values I need to read are the card number, expiry month, and expiry year.
I use a free asset from here:
Native Camera For Android iOS
to capture the image.
I am also using Google Tesseract from designspark, here:
Tesseract OCR Unity
to recognize text in the image.
But before I can read any text from the image, I hit a problem reading the texture.
My code in detail:
public void TakePicture(int maxSize)
{
    if (NativeCamera.IsCameraBusy())
    {
        return;
    }
    else
    {
        NativeCamera.Permission permission = NativeCamera.TakePicture((path) =>
        {
            Debug.Log("Image path: " + path);
            if (path != null)
            {
                // Create a Texture2D from the captured image
                Texture2D texture = NativeCamera.LoadImageAtPath(path, maxSize);
                if (texture == null)
                {
                    Debug.Log("Couldn't load texture from " + path);
                    return;
                }
                TesseractWrapper_And tesseract = new TesseractWrapper_And();
                string datapath = System.IO.Path.Combine(Application.persistentDataPath, "tessdata");
                tesseract.Init("eng", datapath);
                //Color32[] imageColors = texture.GetPixels32();
                string result = tesseract.RecognizeFromTexture(texture, false);
                //copy bufferColors to imageColors one by one with necessary logic.
                Card_Number.text = result ?? "Error: " + tesseract.errorMsg;
                // Assign texture to a temporary quad and destroy it after 5 seconds
                GameObject quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
                quad.transform.position = Camera.main.transform.position + Camera.main.transform.forward * 2.5f;
                quad.transform.forward = Camera.main.transform.forward;
                quad.transform.localScale = new Vector3(1f, texture.height / (float)texture.width, 1f);
                Material material = quad.GetComponent<Renderer>().material;
                if (!material.shader.isSupported) // happens when Standard shader is not included in the build
                    material.shader = Shader.Find("Legacy Shaders/Diffuse");
                material.mainTexture = texture;
                Destroy(quad, 5f);
                // If a procedural texture is not destroyed manually,
                // it will only be freed after a scene change
                Destroy(texture, 5f);
            }
        }, maxSize);
        Debug.Log("Permission result: " + permission);
    }
}
I got an error on the line:
string result = tesseract.RecognizeFromTexture(texture, false);
The error is:
AndroidPlayer(ADB#127.0.0.1:34999) UnityException: Texture '' is not readable, the texture memory can not be accessed from scripts. You can make the texture readable in the Texture Import Settings.
  at (wrapper managed-to-native) UnityEngine.Texture2D.GetPixels(UnityEngine.Texture2D,int,int,int,int,int)
  at UnityEngine.Texture2D.GetPixels (System.Int32 miplevel) [0x0002b] in <004fc436a9154f7fab4df9679445af6c>:0
  at UnityEngine.Texture2D.GetPixels () [0x00001] in <004fc436a9154f7fab4df9679445af6c>:0
  at OCR_Test+<>c__DisplayClass6_0.b__0 (System.String path) [0x00040] in F:\Github\Tesseract_OCR\Assets\script\OCR_Test.cs:80
  at NativeCameraNamespace.NCCameraCallbackAndroid.MediaReceiveCallback (System.String path) [0x0001d] in F:\Github\Tesseract_OCR\Assets\Plugins\NativeCamera\Android\NCCameraCallbackAndroid.cs:30
  at NativeCameraNamespace.NCCameraCallbackAndroid+<>c__DisplayClass3_0.b__0 () [0x00000] in F:\Github\Tesseract_OCR\Assets\Plugins\NativeCamera\Android\NCCameraCallbackAndroid.cs:19
  at NativeCameraNamespace.NCCallbackHelper.Update () [0x0001d] in F:\Github\Tesseract_OCR\Assets\Plugins\NativeCamera\Android\NCCallbackHelper.cs:21
(Filename: <004fc436a9154f7fab4df9679445af6c> Line: 0)
The texture is not readable.
This is how and where the image is captured and saved to a temporary location on disk:
Texture2D texture = NativeCamera.LoadImageAtPath(path, maxSize);
and the file is found here:
Image path: /data/user/0/com.Creativire.OCR/cache/IMG_camera.jpg
Questions:
How can I make the texture readable, given that the image is captured directly from the camera, so I cannot set this in the Texture Import Settings in the Inspector?
How can I make the designspark Tesseract recognize only the numbers?
Note: I have already tried the designspark Tesseract OCR with an image file saved inside the Unity project, and it worked; it just does not work when the image is captured directly from the camera.
Any explanation is much appreciated.
Thank you
Okay, I finally figured it out: you just have to pass false for LoadImageAtPath's markTextureNonReadable parameter (it defaults to true):
Texture2D texture = NativeCamera.LoadImageAtPath(path, maxSize, false, true);
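Building on that answer, a small defensive helper can make this class of failure easier to diagnose. This is a hypothetical guard (not part of NativeCamera or the Tesseract wrapper) that checks Texture2D.isReadable before calling GetPixels32, which is the call that threw the exception above:

```csharp
using UnityEngine;

public static class TextureGuard
{
    // Hypothetical helper: verifies the texture is CPU-readable before
    // handing its pixels to an OCR routine. GetPixels32 throws the
    // "Texture is not readable" exception when isReadable is false.
    public static Color32[] GetPixelsSafe(Texture2D texture)
    {
        if (texture == null || !texture.isReadable)
        {
            Debug.LogWarning("Texture missing or not readable; load it with markTextureNonReadable = false.");
            return null;
        }
        return texture.GetPixels32();
    }
}
```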

Play 360 Stereoscopic video with VideoPlayer

I want to play a stereo 360-degree video in virtual reality in Unity on Android. So far I have done some research, and I have two cameras for the right and left eye, each with a sphere around it. I also need a custom shader to make the image render on the inside of the sphere. I have the upper half of the image showing on one sphere by setting the y-tiling to 0.5, and the lower half shows on the other sphere with y-tiling 0.5 and y-offset 0.5. With this I can already show a 3D 360-degree image correctly. The whole idea is from this tutorial.
Now, for video, I need control over the video speed, so it turned out I need the VideoPlayer from the new Unity 5.6 beta. My setup so far would require the VideoPlayer to play the video on both spheres, with one sphere playing the upper part (one eye) and the other playing the lower part (other eye).
Here is my problem: I don't know how to get the VideoPlayer to play the same video on two different materials (since they have different tiling values). Is there a way to do that?
I got a hint that I could use the same material and achieve the tiling effect via UV, but I don't know how that works, and I haven't even got the VideoPlayer to play the video on two objects using the same material on both of them. I have a screenshot of that here. The right sphere just has the material videoMaterial, with no tiling, since I'd have to do that via UV.
Which way should I go, and how do I do it? Am I on the right track here?
Am I on the right track here?
Almost, but you are currently using a Renderer and Material instead of a RenderTexture and Material.
Which way should I go, and how do I do it?
You need to use a RenderTexture for this. Basically, you render the video to a RenderTexture, then you assign that texture to the materials of both spheres:
1. Create a RenderTexture and assign it to the VideoPlayer.
2. Create two materials for the spheres.
3. Set VideoPlayer.renderMode to VideoRenderMode.RenderTexture.
4. Set the texture of both spheres to the texture from the RenderTexture.
5. Prepare and play the video.
The code below does exactly that and should work out of the box. The only thing you need to do is modify the tiling and offset of each material to your needs.
You should also comment out:
leftSphere = createSphere("LeftEye", new Vector3(-5f, 0f, 0f), new Vector3(4f, 4f, 4f));
rightSphere = createSphere("RightEye", new Vector3(5f, 0f, 0f), new Vector3(4f, 4f, 4f));
and instead use a sphere imported from any 3D application. Those lines are only there for testing purposes, and it's not a good idea to play video on Unity's built-in sphere, because it doesn't have enough detail to make the video look smooth.
using UnityEngine;
using UnityEngine.Video;

public class StereoscopicVideoPlayer : MonoBehaviour
{
    RenderTexture renderTexture;
    Material leftSphereMat;
    Material rightSphereMat;
    public GameObject leftSphere;
    public GameObject rightSphere;
    private VideoPlayer videoPlayer;
    //Audio
    private AudioSource audioSource;

    void Start()
    {
        //Create Render Texture
        renderTexture = createRenderTexture();
        //Create Left and Right Sphere Materials
        leftSphereMat = createMaterial();
        rightSphereMat = createMaterial();
        //Create the Left and Right Spheres
        leftSphere = createSphere("LeftEye", new Vector3(-5f, 0f, 0f), new Vector3(4f, 4f, 4f));
        rightSphere = createSphere("RightEye", new Vector3(5f, 0f, 0f), new Vector3(4f, 4f, 4f));
        //Assign materials to the Spheres
        leftSphere.GetComponent<MeshRenderer>().material = leftSphereMat;
        rightSphere.GetComponent<MeshRenderer>().material = rightSphereMat;
        //Add VideoPlayer to the GameObject
        videoPlayer = gameObject.AddComponent<VideoPlayer>();
        //Add AudioSource
        audioSource = gameObject.AddComponent<AudioSource>();
        //Disable Play on Awake for both Video and Audio
        videoPlayer.playOnAwake = false;
        audioSource.playOnAwake = false;
        // We want to play from url
        videoPlayer.source = VideoSource.Url;
        videoPlayer.url = "http://www.quirksmode.org/html5/videos/big_buck_bunny.mp4";
        //Set Audio Output to AudioSource
        videoPlayer.audioOutputMode = VideoAudioOutputMode.AudioSource;
        //Assign the Audio from Video to AudioSource to be played
        videoPlayer.EnableAudioTrack(0, true);
        videoPlayer.SetTargetAudioSource(0, audioSource);
        //Set the mode of output to be RenderTexture
        videoPlayer.renderMode = VideoRenderMode.RenderTexture;
        //Set the RenderTexture to store the images to
        videoPlayer.targetTexture = renderTexture;
        //Set the Texture of both Spheres to the Texture from the RenderTexture
        assignTextureToSphere();
        //Prepare Video to prevent Buffering
        videoPlayer.Prepare();
        //Subscribe to prepareCompleted event
        videoPlayer.prepareCompleted += OnVideoPrepared;
    }

    RenderTexture createRenderTexture()
    {
        RenderTexture rd = new RenderTexture(1024, 1024, 16, RenderTextureFormat.ARGB32);
        rd.Create();
        return rd;
    }

    Material createMaterial()
    {
        return new Material(Shader.Find("Specular"));
    }

    void assignTextureToSphere()
    {
        //Set the Texture of both Spheres to the Texture from the RenderTexture
        leftSphereMat.mainTexture = renderTexture;
        rightSphereMat.mainTexture = renderTexture;
    }

    GameObject createSphere(string name, Vector3 spherePos, Vector3 sphereScale)
    {
        GameObject sphere = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        sphere.transform.position = spherePos;
        sphere.transform.localScale = sphereScale;
        sphere.name = name;
        return sphere;
    }

    void OnVideoPrepared(VideoPlayer source)
    {
        Debug.Log("Done Preparing Video");
        //Play Video
        videoPlayer.Play();
        //Play Sound
        audioSource.Play();
        //Change Play Speed
        if (videoPlayer.canSetPlaybackSpeed)
        {
            videoPlayer.playbackSpeed = 1f;
        }
    }
}
There is also a Unity tutorial on how to do this with a special shader, but it does not work for me and for some other people. I suggest you use the method above until VR support is added to the VideoPlayer API.
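To get the per-eye split the answer mentions ("modify the tiling and offset of each material"), the tiling values can also be set from code. A minimal sketch, assuming a top-bottom stereo layout and materials like the leftSphereMat/rightSphereMat fields in the answer's code (which half belongs to which eye depends on your video):

```csharp
using UnityEngine;

public class StereoTiling : MonoBehaviour
{
    // Materials as created in the answer's code; assign in the Inspector
    // or from script.
    public Material leftSphereMat;
    public Material rightSphereMat;

    void Start()
    {
        // Each eye samples half the video frame vertically.
        leftSphereMat.mainTextureScale = new Vector2(1f, 0.5f);
        leftSphereMat.mainTextureOffset = new Vector2(0f, 0.5f); // top half
        rightSphereMat.mainTextureScale = new Vector2(1f, 0.5f);
        rightSphereMat.mainTextureOffset = new Vector2(0f, 0f);  // bottom half
    }
}
```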

Capture and Show Video from IP camera source (Unity3d + c#)

Need your help.
I am making a simple application (I'm also new to Unity3d) which takes video from an IP camera and displays it on a Texture2D.
The video format is MJPG.
For plain JPG images the code below works fine, but when I try to display MJPG I just get a gray screen.
Did I make a mistake in the code?
public class testVid : MonoBehaviour {
    //public string uri = "http://24.172.4.142/mjpg/video.mjpg"; //url for example
    public Texture2D cam;

    public void Start() {
        cam = new Texture2D(1, 1, TextureFormat.RGB24, false);
        StartCoroutine(Fetch());
    }

    public IEnumerator Fetch() {
        while(true) {
            Debug.Log("loading... " + Time.realtimeSinceStartup);
            WWWForm form = new WWWForm();
            WWW www = new WWW("http://24.172.4.142/mjpg/video.mjpg");
            yield return www;
            if(!string.IsNullOrEmpty(www.error))
                throw new UnityException(www.error);
            www.LoadImageIntoTexture(cam);
        }
    }

    public void OnGUI() {
        GUI.DrawTexture(new Rect(0, 0, Screen.width, Screen.height), cam);
    }
}
I used this plugin: https://www.assetstore.unity3d.com/en/#!/content/15580
1. Add the script to a game object.
2. Set the URL of the video in the script.
3. Create a new material of Unlit 2D.
4. Add this material to the movie script in the inspector.
5. Then assign the same material to the game object you want to display the video on (e.g. a quad).
Hope it helps
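For context on why the question's code shows nothing: `yield return www` waits for the download to finish, but an MJPG stream never finishes, so the coroutine never resumes. A plugin-free alternative (a rough, untested sketch under the assumption that the camera serves a standard multipart JPEG stream; URL as in the question) is to read the raw stream on a background thread and cut it into frames at the JPEG start/end markers:

```csharp
using System.Collections.Generic;
using System.Net;
using System.Threading;
using UnityEngine;

public class MjpgReader : MonoBehaviour
{
    public string url = "http://24.172.4.142/mjpg/video.mjpg";
    Texture2D cam;
    byte[] latestFrame;
    readonly object frameLock = new object();

    void Start()
    {
        cam = new Texture2D(2, 2);
        new Thread(ReadStream) { IsBackground = true }.Start();
    }

    void ReadStream()
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        using (var stream = request.GetResponse().GetResponseStream())
        {
            var buffer = new List<byte>();
            int b, prev = -1;
            bool inFrame = false;
            while ((b = stream.ReadByte()) != -1)
            {
                if (!inFrame && prev == 0xFF && b == 0xD8)   // JPEG SOI marker
                {
                    inFrame = true;
                    buffer.Clear();
                    buffer.Add(0xFF);                        // re-add consumed 0xFF
                }
                if (inFrame) buffer.Add((byte)b);
                if (inFrame && prev == 0xFF && b == 0xD9)    // JPEG EOI marker
                {
                    inFrame = false;
                    lock (frameLock) latestFrame = buffer.ToArray();
                }
                prev = b;
            }
        }
    }

    void Update()
    {
        byte[] frame;
        lock (frameLock) { frame = latestFrame; latestFrame = null; }
        if (frame != null) cam.LoadImage(frame); // decode JPEG on the main thread
    }

    void OnGUI()
    {
        GUI.DrawTexture(new Rect(0, 0, Screen.width, Screen.height), cam);
    }
}
```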
