Unity - paint a texture with alpha to make it disappear partially - C#

I am trying to achieve this effect: I want to "paint" the image with alpha, erasing part of the image exactly where the user touches or uses the mouse.
I don't know where to start.

You could put each image on a separate Layer and tell your "eraser brush" to ignore the layer behind the one you want to erase using a LayerMask. When the user clicks on the image, it erases part of the image on the first layer, but leaves the image behind it alone.
Credit to Statement from here:
LayerMask, or any "Mask" in Computer Science, is an integer which uses its 32 bits to represent different flags. While it is an integer, the value doesn't mean anything useful numerically. What matters is which flags are turned on or off. In the case of LayerMask, each flag (each bit) represents one layer.
The layers are selectable at the top of the Inspector for each GameObject; you can add and name layers there, and also choose to view different combinations of them.
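For illustration, a minimal sketch of restricting an "eraser" raycast to a single layer with a LayerMask (the "Erasable" layer name and the script layout are assumptions, not part of the original answer):

using UnityEngine;

public class EraserRaycast : MonoBehaviour
{
    // Which layers the eraser is allowed to hit; can also be set in the Inspector.
    public LayerMask erasableMask;

    void Start()
    {
        // Each bit of the mask corresponds to one layer index.
        erasableMask = 1 << LayerMask.NameToLayer("Erasable");
    }

    void Update()
    {
        if (Input.GetMouseButton(0))
        {
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit;
            // Only colliders on the masked layer(s) are considered.
            if (Physics.Raycast(ray, out hit, 100f, erasableMask))
            {
                Debug.Log("Hit erasable image at " + hit.point);
            }
        }
    }
}

Anything on other layers is simply never reported by the raycast, so the image behind is left untouched.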

If you want to have 2 textures on a single plane:
- Make a custom shader with 3 textures (background, foreground, mask)
- Then you can paint into that mask texture with a grayscale color, and the shader fades between the 2 textures based on the mask color (for example, if the color is full white, only the foreground is visible)
Or, if you want to have a 3D object behind a separate texture plane (and you only need to erase the existing texture, not draw the same pattern back):
- Could use any default texture which supports alpha
- Just paint into the MainTexture of the material; painting with a color whose alpha is 0 makes that area transparent (see the sketch after this list)
- The texture needs to have [x] Read/Write Enabled checked in the importer settings to allow drawing on it
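A rough sketch of that erase-by-painting idea, along the lines of the textureCoord example linked below (the single-texel "brush" and the component layout are simplifications/assumptions; the material also needs a shader that actually renders alpha):

using UnityEngine;

public class TexturePainter : MonoBehaviour
{
    void Update()
    {
        if (!Input.GetMouseButton(0))
            return;

        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        RaycastHit hit;
        if (!Physics.Raycast(ray, out hit))
            return;

        // textureCoord is only available when the hit collider is a MeshCollider.
        if (!(hit.collider is MeshCollider))
            return;

        Renderer rend = hit.collider.GetComponent<Renderer>();
        Texture2D tex = rend.material.mainTexture as Texture2D;
        Vector2 uv = hit.textureCoord;

        // Erase a single texel; a real brush would loop over a small radius.
        tex.SetPixel((int)(uv.x * tex.width), (int)(uv.y * tex.height),
                     new Color(0f, 0f, 0f, 0f));
        tex.Apply();
    }
}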
Some related links:
Basic pixel drawing example from unity docs: http://docs.unity3d.com/ScriptReference/RaycastHit-textureCoord.html
Old demo scene, painting to a texture and using a dissolve shader: http://unitycoder.com/blog/2011/08/10/painting-holes-with-shader-v2-0/
And search for unity blend 2 textures with mask, unity draw to texture etc. to find more examples & shaders.

Related

How to animate alpha in an image in Unity?

OK, I don't know if this is possible, but I need to animate the alpha portion of an image. For example, I have an image with a shape cut out of it so the background shows through, and I want to animate the size of this hole in Unity's animation controller.
I know it is possible to animate images by stringing together a series of different images like sprite animation, but I want to know if there is a way to animate or cut a hole in another image using an image in Unity and/or animate the alpha portion.
Is this possible? As an example, picture an alpha image with a square hole cut in it; I would want to scale/animate that cut-out inner square.
In Unity, you can animate everything that can be changed in the Inspector, including the size and color (including alpha) of your sprite.
However, it is not possible to animate the pixels of a sprite separately, as far as I know, meaning that you cannot change the alpha value of just an area inside your sprite.
Do your images consist of a single color only or do you use complex images?
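If animating the whole sprite's alpha is enough, it can also be driven from code instead of the Animation window; a minimal sketch (the one-second fade-out is an arbitrary assumption):

using System.Collections;
using UnityEngine;

public class SpriteFader : MonoBehaviour
{
    public float duration = 1f;

    IEnumerator Start()
    {
        SpriteRenderer sr = GetComponent<SpriteRenderer>();
        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            Color c = sr.color;
            c.a = 1f - (t / duration);  // fade the whole sprite from opaque to transparent
            sr.color = c;
            yield return null;
        }
    }
}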
Well, I guess the shader you can find here will be the closest answer to your question.
Other than that, operations on your texture will be quite heavy to perform at runtime.
Hope this helps,

Remapping colours on textures dynamically in Unity

I'm making an RTS game in Unity. In such games, players can usually determine a unit's allegiance by its color markings. I'm now trying to implement a system that will remap the purple color to the unit owner's color.
One idea was to pick a color that will be used as a mask and can then be recoloured to any color. This could be done using some hue distribution function:
Solution 1
I used a function based on min()/max() clamping:
hue = min(hue, pow(Math.abs(hue-MASK_HUE),8)*5000000+RESULT_HUE)
This solution has two big flaws:
Purple can't be used (I don't like it anyway)
Only full colours are applicable (no brown, black, white...)
What you see above is just my fiddling. The actual project would run on Unity engine in C#.
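For reference, that clamping formula might look roughly like this in Unity C# (a sketch only; maskHue and resultHue stand for the MASK_HUE and RESULT_HUE placeholders above):

// uses UnityEngine.Mathf
float RemapHue(float hue, float maskHue, float resultHue)
{
    // min() keeps the original hue except near maskHue, where the pow() term
    // shrinks towards resultHue.
    return Mathf.Min(hue, Mathf.Pow(Mathf.Abs(hue - maskHue), 8f) * 5000000f + resultHue);
}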
Solution 2
My friend proposed a different approach: every image should use a map - either a faded (grayscale) mask or just a true/false array - to mark where the colours should be applied. I haven't tried this yet, as it's more complicated to test ad hoc.
Altering textures in Unity
It seems that a material's texture can be easily altered in Unity, so the question is:
Q: How should I implement dynamic texture coloring (generation) in Unity? Are either of my methods a good way? How should I produce the new textures, and using which functions?
Rather than full code, I need information about which functions I should use. I hope other users will also profit from the general info.
To help with answering, my guess is that the process will have 3 important parts:
Somehow get the texture of a model material (we know just the GameObject here)
If it hasn't been recolored already, use some algorithm to change the image properly. We'll need some pixel manipulation here
Apply the texture back to the model material
If you go the texture manipulation route, you'll need to make an additional copy of the texture stored in memory for each color, and this will increase the "loading" time of your scene. You can access the texture of a GameObject with renderer.material.mainTexture used on the GameObject that has the renderer component. Then you can use all sorts of pixel manipulation options such as SetPixel, or SetPixels to do it in batches for better performance.
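A rough sketch of that texture-manipulation route, assuming a readable texture and a simple hue tolerance (the markerHue/targetHue parameters and the 0.05 threshold are assumptions, not a definitive implementation):

using UnityEngine;

public static class UnitRecolor
{
    public static void Recolor(GameObject unit, float markerHue, float targetHue)
    {
        Renderer rend = unit.GetComponent<Renderer>();
        Texture2D source = rend.material.mainTexture as Texture2D;

        // Work on a copy so other units keep the original texture
        // (the source must have Read/Write Enabled in its import settings).
        Texture2D copy = new Texture2D(source.width, source.height);
        Color[] pixels = source.GetPixels();

        for (int i = 0; i < pixels.Length; i++)
        {
            float h, s, v;
            Color.RGBToHSV(pixels[i], out h, out s, out v);
            if (Mathf.Abs(h - markerHue) < 0.05f)              // pixel is close to the marker hue
            {
                Color recolored = Color.HSVToRGB(targetHue, s, v); // keep saturation/value, swap hue
                recolored.a = pixels[i].a;
                pixels[i] = recolored;
            }
        }

        copy.SetPixels(pixels);
        copy.Apply();
        rend.material.mainTexture = copy; // renderer.material is a per-renderer instance
    }
}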
There is, however, a third option that I would recommend. If you write/modify a custom shader, you can perform the color replacement at render time without significantly decreasing performance. This can be accomplished by adding a step where you convert your color output from RGB to HSV, change the Hue and Saturation, and then convert back to RGB.
By making the Hue and Saturation external parameters, you should be able to use a full range of colors including whatever you used for your marker color.
This post from the Unity forums should help with the hue shift shader.

Draw a one-pixel line around a square sprite

I have a 15 x 15 pixel box, of which I draw several in different colours using:
spriteBatch.Draw(texture, position, colour);
What I'd like to do is draw a one-pixel line around the outside, in different colours, thus making it a 17 x 17 box with (for example) a blue outline one pixel wide and a grey middle.
The only way I can think of doing it is to draw two boxes, one 17x17 in the outline colour, one 15x15 with the box colour, and layer them to give the appearance of an outline:
spriteBatch.Draw(texture17by17, position, outlineColour);
spriteBatch.Draw(texture15by15, position, boxColour);
Obviously the position vector would need to be modified but I think that gives a clear picture of the idea.
The question is: is there a better way?
You can draw lines and triangles using DrawUserIndexedPrimitives, see Drawing 3D Primitives using Lists or Strips on MSDN for more details. Other figures like rectangles and circles are constructed from lines, but you'll need to implement them yourself.
To render lines in 2D, just use an orthographic projection that mirrors the transformation matrix used by SpriteBatch.
You can find a more complete example in the Primitives sample from Xbox Live Indie Games, whose PrimitiveBatch class encapsulates the drawing logic.
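For instance, a hedged sketch of drawing a 17 x 17 outline as a line list in XNA 4.0 (uses Microsoft.Xna.Framework and Microsoft.Xna.Framework.Graphics; in real code the BasicEffect would be created once in LoadContent rather than per call):

void DrawOutline(GraphicsDevice device, Color outlineColour, Vector2 position)
{
    BasicEffect basicEffect = new BasicEffect(device)
    {
        VertexColorEnabled = true,
        // Pixel-space orthographic projection matching SpriteBatch coordinates.
        Projection = Matrix.CreateOrthographicOffCenter(
            0, device.Viewport.Width, device.Viewport.Height, 0, 0, 1)
    };

    float x = position.X, y = position.Y;
    VertexPositionColor[] corners =
    {
        new VertexPositionColor(new Vector3(x,      y,      0), outlineColour),
        new VertexPositionColor(new Vector3(x + 16, y,      0), outlineColour),
        new VertexPositionColor(new Vector3(x + 16, y + 16, 0), outlineColour),
        new VertexPositionColor(new Vector3(x,      y + 16, 0), outlineColour),
    };
    short[] indices = { 0, 1, 1, 2, 2, 3, 3, 0 }; // four edges: 0-1, 1-2, 2-3, 3-0

    foreach (EffectPass pass in basicEffect.CurrentTechnique.Passes)
    {
        pass.Apply();
        device.DrawUserIndexedPrimitives(
            PrimitiveType.LineList, corners, 0, corners.Length, indices, 0, 4);
    }
}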
Considering XNA can't draw "lines" the way OpenGL immediate mode can, it is far more efficient to draw a sprite with a pre-generated texture quad (2 triangles) than to draw additional geometry with dynamic texturing, particularly when each "line" requires a triangle of its own; 2 triangles versus 4, respectively, and fewer triangles and vertices in the former too.
So I would not try to draw a "thin" line using additional geometry that mimics lines around the outside of the box; instead, continue with what you are doing - drawing 2 different sprites (each is a quad anyway).
Every object drawn in 3D is drawn using triangles. - Would you like to know more?
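One way to avoid authoring two separate textures while still drawing two layered quads is a single reusable 1 x 1 white texture tinted per call; a sketch (not from the original answer; the variable names mirror the question, and the texture would normally be created once in LoadContent):

Texture2D pixel = new Texture2D(GraphicsDevice, 1, 1);
pixel.SetData(new[] { Color.White });

spriteBatch.Begin();
// 17 x 17 outline box first, then the 15 x 15 fill inset by one pixel on top.
spriteBatch.Draw(pixel, new Rectangle((int)position.X, (int)position.Y, 17, 17), outlineColour);
spriteBatch.Draw(pixel, new Rectangle((int)position.X + 1, (int)position.Y + 1, 15, 15), boxColour);
spriteBatch.End();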

Retrieve texture coordinates on a 3D Model?

Is it possible to retrieve the texture coordinates of an object, for example through hit testing?
As an example: I use a 1920x1080 texture on a simple plane, and I want to get the coordinates 1920, 1080 if I click in the bottom right. (The model is in reality slightly more complex, so trying to calculate the position via math isn't as easy.)
When math does not work for some reason, I have done the following graphical hit test: assign unique colors to each texel of your plane, then render one frame to an offscreen surface with lighting and effects disabled, then read the pixel color under the cursor and translate its value back to coordinates. This is quite efficient on complex models when you don't need to do such lookups too often (say, in games), because reading pixels back will stall the graphics hardware pipeline and drain performance. Also, this would potentially work with any projection: ortho or perspective.
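As a sketch of the colour-to-coordinate mapping part of that trick (using Unity's Color32 purely for illustration; the 12-bit packing assumes textures no larger than 4096 x 4096):

using UnityEngine;

public static class TexelColorCodec
{
    // Pack 12 bits of x and 12 bits of y into the 24 RGB bits of a colour.
    public static Color32 Encode(int x, int y)
    {
        int packed = (x << 12) | y;
        return new Color32((byte)((packed >> 16) & 0xFF),
                           (byte)((packed >> 8) & 0xFF),
                           (byte)(packed & 0xFF),
                           255);
    }

    // Recover the texel coordinates from the colour read back under the cursor.
    public static void Decode(Color32 c, out int x, out int y)
    {
        int packed = (c.r << 16) | (c.g << 8) | c.b;
        x = packed >> 12;
        y = packed & 0xFFF;
    }
}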

Issue with alpha in 2d lighting shader in XNA 4.0

I'm currently learning HLSL with XNA, I figured the best place to start after tutorials would be some simple 2D shaders. I'm attempting to implement a simple lighting shader in 2D.
I draw the scene without shadows to a render target, swap my render target to a shadowmap, draw my lights (each individually) onto the shadowmap via the alpha channel, swap my render target back to default, and render the scene and then the shadows on top.
The alpha of the light changes depending on the distance between the current pixel and the point of the light. This is all working fine for me, except that when I render the scene, if two lights overlap it causes a nasty blending issue.
I'm using AlphaBlend both when I draw onto the shadowmap and when I draw the shadowmap onto the scene.
Am I just using the wrong blending settings here? I don't know much about blendstates.
Sorry if the question was vague.
I've had this happen before when doing software lighting, where your values exceed 255 (or 1.0) and you end up back in the realms of blackness. I believe the values are wrapped using a modulo operation, causing 1.1 to become 0.1 or 256 to become 1. I notice that the black ellipse is actually two spheres' edges combined, which is what leads me to this conclusion.
Hope this gets you closer to understanding and finding your problem. I have no idea what code you are using already, but in your HLSL technique pass you could try adding:
AlphaBlendEnable = true;
SrcBlend = One;
DestBlend = One;
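Those three render states amount to additive blending (source added to destination). The XNA 4.0 C# equivalent, when drawing the lights onto the shadowmap, is BlendState.Additive; a short sketch (shadowMap, lightTexture and lightPosition are assumptions):

GraphicsDevice.SetRenderTarget(shadowMap);
spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.Additive);
// Overlapping lights now accumulate instead of alpha-blending over each other.
spriteBatch.Draw(lightTexture, lightPosition, Color.White);
spriteBatch.End();
GraphicsDevice.SetRenderTarget(null);

With a standard 8-bit render target the sum saturates at 1.0 rather than wrapping, which should avoid the dark overlap described above.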
