I'm making an RTS game in Unity. In such games, players can usually tell a unit's allegiance from its color markings, so I'm trying to implement a system that remaps a purple marker color to the unit owner's color.
One idea was to designate a color to act as a mask, which is then recolored to any target color. This could be done using some hue distribution function:
Solution 1
I used a function based on min():
hue = min(hue, pow(Math.abs(hue-MASK_HUE),8)*5000000+RESULT_HUE)
This solution has two big flaws:
Purple can't be used (I don't like it anyway)
Only pure, fully saturated colours are applicable as targets (no brown, black, white...)
What you see above is just my fiddling; the actual project will run in the Unity engine, in C#.
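For comparison, here is roughly what a per-pixel hue remap could look like in Unity C#. This is a minimal sketch that uses a simple tolerance threshold instead of the min()/pow() curve above, and keeps saturation and value untouched; the mask and target hue values are assumptions.

```csharp
using UnityEngine;

public static class TeamRecolor
{
    // Remap pixels whose hue is close to maskHue onto targetHue, keeping
    // saturation and value. Hues are in Unity's normalized 0..1 range.
    // Note: the naive Abs() comparison ignores hue wrap-around at 0/1.
    public static Color RemapHue(Color c, float maskHue, float targetHue, float tolerance = 0.05f)
    {
        Color.RGBToHSV(c, out float h, out float s, out float v);
        if (Mathf.Abs(h - maskHue) < tolerance)
            h = targetHue;
        return Color.HSVToRGB(h, s, v);
    }
}
```

Because saturation and value pass through unchanged, shaded and highlighted areas of the marking keep their shading, which sidesteps the "only full colours" flaw to some degree.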
Solution 2
My friend proposed a different approach: every image would come with a map, either a faded (grayscale) map or just a true/false array, marking where the colours should be applied. I haven't tried this yet, as it's more complicated to test ad hoc.
Altering textures in Unity
It seems that a material's texture can be easily altered in Unity, so the question is:
Q: How should I implement dynamic texture coloring (generation) in Unity? Is either of my methods a good way to do it? How should I produce the new textures, and using which functions?
Rather than full code, I need information about which functions I should use; I hope other users will also profit from the general info.
To help you answer, I'd guess the process has 3 important parts:
Somehow get the texture of a model's material (we only know the GameObject here)
If it hasn't been recolored already, use some algorithm to change the image properly; we'll need some pixel manipulation here
Apply the texture back to the model material
If you go the texture-manipulation route, you'll need to keep an additional copy of the texture in memory for each color, and this will increase your scene's "loading" time. You can access a GameObject's texture via renderer.material.mainTexture on the GameObject that has the Renderer component. Then you can use all sorts of pixel-manipulation options such as SetPixel, or SetPixels to work in batches for better performance.
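A minimal sketch of that route, assuming the texture has Read/Write enabled in the importer; IsMarkerColor is a hypothetical test for the mask hue.

```csharp
using UnityEngine;

public class UnitRecolor : MonoBehaviour
{
    public Color teamColor;

    void Start()
    {
        var rend = GetComponent<Renderer>();
        var source = (Texture2D)rend.material.mainTexture;     // needs Read/Write enabled
        var copy = new Texture2D(source.width, source.height); // uncompressed working copy

        Color[] pixels = source.GetPixels();  // batch read: much faster than GetPixel in a loop
        for (int i = 0; i < pixels.Length; i++)
            if (IsMarkerColor(pixels[i]))     // hypothetical mask test, see below
                pixels[i] = teamColor;

        copy.SetPixels(pixels);
        copy.Apply();                         // upload the changed pixels to the GPU
        rend.material.mainTexture = copy;     // .material gives this unit its own instance
    }

    bool IsMarkerColor(Color c)
    {
        Color.RGBToHSV(c, out float h, out _, out _);
        return Mathf.Abs(h - 0.8f) < 0.05f;   // ~purple marker hue, an assumption
    }
}
```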
There is, however, a third option that I would recommend. If you write or modify a custom shader, you can perform the color replacement at render time without significantly decreasing performance. This can be accomplished by adding a step that converts your color output from RGB to HSV, changes the hue and saturation, and then converts back to RGB.
By making the hue and saturation external parameters, you should be able to use the full range of colors, including whatever you used for your marker color.
This post from the Unity forums should help with the hue shift shader.
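Whatever the shader ends up looking like, the C# side only needs to feed it the parameters. A sketch, where _MaskHue and _TargetHue are hypothetical property names the custom shader would have to declare:

```csharp
using UnityEngine;

public class TeamShaderParams : MonoBehaviour
{
    [Range(0f, 1f)] public float targetHue;

    void Start()
    {
        var mat = GetComponent<Renderer>().material; // instantiates a per-unit material
        mat.SetFloat("_MaskHue", 0.8f);              // the purple marker hue, an assumption
        mat.SetFloat("_TargetHue", targetHue);
    }
}
```

If many units share one material, a MaterialPropertyBlock passed to Renderer.SetPropertyBlock avoids creating a material instance per unit.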
Related
To start: I have a bird's-eye-view 2D tilemap that I generate procedurally, and I color each tile using Tilemap.SetColor(position, new Color(r, g, b, a)). This works very well with the default shader, which does not process the lights in the scene. I want lighting in my game, so I swapped the shader, one by one, for every other shader, with none of them supporting the custom colors and lighting at the same time in legacy Vertex Lit mode.

I tried switching my rendering path to Forward and Deferred, and that did work, in that my custom colors showed on top of the lighting, but other problems occurred. The reason I switched to Vertex Lit in the first place was that I was getting artifacts at the boundaries of every tile: lines showing the cross-sections of tiles that overlapped. I also get a much lower frame rate in Forward/Deferred, so I prefer the legacy path. I know why the shader does this, and I tried writing my own, but I am very new to Unity, so I am not experienced enough to dive that far into graphics.
What is needed: either my custom tile colors together with lighting in legacy Vertex Lit mode, or a fix for the overlapping tiles lighting up in Forward and Deferred modes.
Thanks again for the help! I am losing my mind trying to solve this.
EDIT: I noticed that when I changed the layer or the z position of a tile in that map, the effect disappeared completely, in Forward and Deferred, for that tile only. I don't know if this is the solution I am looking for, but it's a start.
EDIT #2: I set the chunk rendering mode to Individual, and it fixes the problem as in the first edit, but performance takes a very big hit. Is there any way to tell the renderer/shader that each tile is separate?
OK, I eventually got it to work. What I did to get rid of the lines and keep custom coloring in the game was to switch the rendering path to Forward. Then you can go into the shader you are using for the tilemap and add #pragma noforwardadd.
I am trying to achieve this effect: I want to "paint" the image with alpha, erasing part of the image exactly where the user touches or uses the mouse.
I don't know where to start.
You could put each image on a separate Layer and tell your "eraser brush" to ignore the layer behind the one you want to erase using a LayerMask. When the user clicks on the image, it erases part of the image on the first layer, but leaves the image behind it alone.
Credit to Statement from here:
LayerMask, or any "mask" in computer science, is an integer which uses its 32 bits to represent different flags. While it is an integer, the value doesn't mean anything useful numerically; what matters is which flags are turned on or off.
In the case of LayerMask, each flag (each bit) represents one layer.
The layers are selectable at the top of the Inspector for each GameObject; you can add and name layers there, and also choose to view different combinations of them.
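A small sketch of that idea: restrict the eraser's raycast to a named layer so the image behind it can never be hit. The layer name is an assumption.

```csharp
using UnityEngine;

public class EraserRay : MonoBehaviour
{
    void Update()
    {
        if (!Input.GetMouseButtonDown(0)) return;

        int frontOnly = LayerMask.GetMask("FrontImage");   // bit mask with one bit set
        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        if (Physics.Raycast(ray, out RaycastHit hit, 100f, frontOnly))
            Debug.Log("Hit front image at UV " + hit.textureCoord);
    }
}
```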
If you want to have 2 textures on a single plane:
- Make a custom shader with 3 textures (background, foreground, mask)
- Then you can paint into that mask texture with a grayscale color, and the shader fades between the 2 textures based on the mask color (for example, if the color is full white, only the foreground is visible)

Or, if you want to have a 3D object behind a separate texture plane (and you only need to erase the existing texture, not draw the same pattern back):
- Could use any default texture which supports alpha
- Just paint into the mainTexture of the material; painting with color alpha 0 makes the area transparent
- Texture needs to have [x] read/write enabled in the importer settings to allow drawing on it
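A minimal sketch of that second option, assuming a MeshCollider on the plane (RaycastHit.textureCoord needs one) and a readable texture; the brush radius is an assumption.

```csharp
using UnityEngine;

public class AlphaPainter : MonoBehaviour
{
    public int brushRadius = 8;   // in pixels, an assumption

    void Update()
    {
        if (!Input.GetMouseButton(0)) return;

        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        if (!Physics.Raycast(ray, out RaycastHit hit)) return;

        var tex = (Texture2D)hit.collider.GetComponent<Renderer>().material.mainTexture;
        int cx = (int)(hit.textureCoord.x * tex.width);
        int cy = (int)(hit.textureCoord.y * tex.height);

        // Clear a filled circle of pixels around the hit point.
        for (int y = -brushRadius; y <= brushRadius; y++)
        for (int x = -brushRadius; x <= brushRadius; x++)
        {
            int px = cx + x, py = cy + y;
            if (x * x + y * y <= brushRadius * brushRadius &&
                px >= 0 && py >= 0 && px < tex.width && py < tex.height)
                tex.SetPixel(px, py, Color.clear);   // alpha 0 erases
        }
        tex.Apply();   // upload the modified pixels to the GPU
    }
}
```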
Some related links:
Basic pixel drawing example from unity docs: http://docs.unity3d.com/ScriptReference/RaycastHit-textureCoord.html
Old demo scene, painting to texture and using dissolve shader : http://unitycoder.com/blog/2011/08/10/painting-holes-with-shader-v2-0/
And search for "unity blend 2 textures with mask", "unity draw to texture", etc. to find more examples & shaders.
I just started making my own map generator, and I finished the Perlin noise yesterday. The problem now is that it gives me a texture with more holes in it than Swiss cheese! So I planned to write some code to cut out one of the holes with some kind of automatic algorithm. Later I could blend them together onto some background and make many differently sized caves, ores, and so on. Now I need an algorithm that can cut out one cave and put it on an alpha channel.
The way I would do it now is to write my own method which takes X and Y as parameters. This method would check whether the pixel at those coordinates is black. If it is, it puts this pixel down on a 2D texture and calls itself with the coordinates of the neighbouring pixels; if the pixel is not black, it does nothing. I would also have to make sure the pixels only get checked once. (A sketch of this idea follows below.)
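A minimal iterative version of the flood fill described above; an explicit stack avoids the stack overflow a recursive version would hit on large caves. The "is black" threshold and texture handling are assumptions.

```csharp
using System.Collections.Generic;
using UnityEngine;

public static class CaveExtractor
{
    // Copy one connected blob of dark pixels from source into a new texture
    // whose other pixels stay fully transparent.
    public static Texture2D ExtractBlob(Texture2D source, int startX, int startY)
    {
        var result = new Texture2D(source.width, source.height, TextureFormat.ARGB32, false);
        result.SetPixels(new Color[source.width * source.height]); // all (0,0,0,0)

        var visited = new bool[source.width, source.height];
        var stack = new Stack<Vector2Int>();
        stack.Push(new Vector2Int(startX, startY));

        while (stack.Count > 0)
        {
            Vector2Int p = stack.Pop();
            if (p.x < 0 || p.y < 0 || p.x >= source.width || p.y >= source.height) continue;
            if (visited[p.x, p.y]) continue;            // each pixel checked only once
            visited[p.x, p.y] = true;

            Color c = source.GetPixel(p.x, p.y);
            if (c.grayscale > 0.5f) continue;           // not "black": stop spreading here

            result.SetPixel(p.x, p.y, c);               // copy the cave pixel
            stack.Push(new Vector2Int(p.x + 1, p.y));   // spread to the 4 neighbours
            stack.Push(new Vector2Int(p.x - 1, p.y));
            stack.Push(new Vector2Int(p.x, p.y + 1));
            stack.Push(new Vector2Int(p.x, p.y - 1));
        }
        result.Apply();
        return result;
    }
}
```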
Now do you think this is a clever way of generating my terrain, ores, caves, etc?
I'm new to DirectX and having trouble getting colors & lighting to work.
I want to be able to load variously colored vertices into a single vertex buffer and render them with directional lighting enabled; however, every method I try has problems. Imagine I want to render 3 cubes, red, green and blue respectively, with white directional lighting from the back.
1) I don't want to set the color via device.Material because then I'd have to make a separate call to DrawPrimitives() for each cube.
2) Working from samples, I seem to be able to get something working by using CustomVertex.PositionNormalColored. The problem is I'm not really clever enough to set the normal vectors programmatically. (Any tips? I don't want anything fancy, just for the lighting to work.)
3) CustomVertex.PositionColored seems ideal, but it doesn't seem to work when I turn on lighting and add a directional light; it seems to want the normal vector.
I don't really have a stable code sample to provide but would appreciate any general advice on how to implement this - what CustomVertex to use, what's necessary for directional lighting to work, etc.
-Brendan
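(For what it's worth, a hedged sketch of point 2: for axis-aligned cube faces no clever math is needed, since each face's normal is simply the unit axis it points along. This assumes Managed DirectX's CustomVertex.PositionNormalColored; the Face helper is hypothetical.)

```csharp
using Microsoft.DirectX;
using Microsoft.DirectX.Direct3D;

static class CubeFaces
{
    // Shorthand: one lit, colored vertex carrying the face normal.
    static CustomVertex.PositionNormalColored V(Vector3 p, Vector3 n, int color)
    {
        return new CustomVertex.PositionNormalColored(p.X, p.Y, p.Z, n.X, n.Y, n.Z, color);
    }

    // Two triangles for one quad face a-b-c-d; 'normal' is the face's outward
    // unit axis, e.g. new Vector3(0, 0, -1) for the front face of a cube.
    // For arbitrary triangles, use:
    //   normal = Vector3.Normalize(Vector3.Cross(b - a, c - a));
    public static CustomVertex.PositionNormalColored[] Face(
        Vector3 a, Vector3 b, Vector3 c, Vector3 d, Vector3 normal, int color)
    {
        return new[] { V(a, normal, color), V(b, normal, color), V(c, normal, color),
                       V(a, normal, color), V(c, normal, color), V(d, normal, color) };
    }
}
```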
I am trying to make my lighting similar to Terraria's block lighting. Now, I know how to make blocks darker, and I can assign blocks a certain light level, but how would I make an actual light entity that emits light in a round shape (it can be diamond-shaped too)?
Help would be greatly appreciated, also, if I wasn't clear in my question, feel free to ask.
Basic 2D lighting is very simple: just do a distance check from your block to your light, and use that value to scale your light.
This is something you could do fairly simply, since SpriteBatch.Draw has a nice Color tint parameter [link].
A pseudo-function could be:
distance = (block.position - light.position).Length();
lightPower = 1 - (distance / light.MaxDistance);  // clamped to [0, 1]: full at the light, zero at MaxDistance
finalTint = light.Color * lightPower;
Render block with finalTint
For nicer-looking light, you could replace the linear falloff (1 - distance / light.MaxDistance) with a smoother curve.
If you also want lights to shine through a few blocks like in Terraria, you could count all the blocks between your block and the light source and scale your lightPower down by that amount; you get the same effect Terraria has.
Of course, this is an unoptimized way of doing it, but it should work.
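As a concrete version of the pseudo-function above, here is a minimal C# sketch assuming XNA types (Vector2, Color, MathHelper); the Light class is my own invention.

```csharp
using Microsoft.Xna.Framework;

public class Light
{
    public Vector2 Position;
    public float MaxDistance = 300f;   // range in pixels, an assumption
    public Color Color = Color.White;
}

public static class BlockLighting
{
    public static Color ComputeTint(Vector2 blockPosition, Light light)
    {
        float distance = (blockPosition - light.Position).Length();

        // 1 at the light, fading linearly to 0 at MaxDistance (clamped beyond it).
        float lightPower = MathHelper.Clamp(1f - distance / light.MaxDistance, 0f, 1f);

        return light.Color * lightPower;   // XNA scales all channels by the scalar
    }
}
```

A block would then be drawn with spriteBatch.Draw(blockTexture, blockPosition, BlockLighting.ComputeTint(blockPosition, light));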
The latest Terraria version, however, seems to have smooth per-pixel lighting instead of per-block [preview]. For that I assume they used a second render target and/or a pixel shader to keep performance fast. This could be a little difficult if you are not familiar with rendering pipelines, though.
Hope this helps!
I'm working on a game with a similar lighting model, and the way we do it is this:
Draw the scene, without lighting, to a render target (called the 'Scene Buffer')
Draw the scene's lights, represented as grayscale gradients of any required shape, to a second render target (called the 'Light Map')
Draw the Scene Buffer to the screen, passing in the Light Map as a parameter to the pixel shader
In the pixel shader, query the value of the Light Map at each pixel and adjust the color of the final pixel up or down as necessary.
This also gives you the ability to have colored lighting; all you have to do is tint the light gradients that you render to the Light Map. Just remember to use additive alpha blending.
The downside of this approach is that it's rather naive, and provides no easy way to occlude the lights (that is to say, they pass through walls). In our case, this isn't an issue; you might decide otherwise.
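For reference, the C#-side plumbing of this approach could look roughly like the sketch below, assuming XNA 4.0; the combine effect and its "LightMap" parameter are hypothetical, and the actual per-pixel multiply would live in that small pixel shader.

```csharp
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

public class LightMapRenderer
{
    RenderTarget2D sceneBuffer, lightMap;
    Effect combineEffect;      // pixel shader that scales the scene by the light map
    SpriteBatch spriteBatch;
    GraphicsDevice device;

    public void Draw()
    {
        // 1) Draw the scene, unlit, into the Scene Buffer.
        device.SetRenderTarget(sceneBuffer);
        device.Clear(Color.Black);
        DrawScene();

        // 2) Draw the light gradients, additively, into the Light Map.
        device.SetRenderTarget(lightMap);
        device.Clear(Color.Black);
        spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.Additive);
        DrawLights(spriteBatch);
        spriteBatch.End();

        // 3) Combine on the back buffer; the shader samples the Light Map per pixel.
        device.SetRenderTarget(null);
        combineEffect.Parameters["LightMap"].SetValue(lightMap);
        spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.Opaque, null, null, null, combineEffect);
        spriteBatch.Draw(sceneBuffer, Vector2.Zero, Color.White);
        spriteBatch.End();
    }

    void DrawScene() { /* game-specific */ }
    void DrawLights(SpriteBatch sb) { /* grayscale or tinted gradients per light */ }
}
```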