Change camera shader property - C#

I need to set a variable on a shader that has no material wrapping it.
I'll explain the problem and why it's not like the question "How can I access shader variables from script?".
Problem
My shader is similar to this (99% of irrelevant boilerplate code removed):
Shader "Hidden/XShader"
{
Properties
{
_x ("", float) = 0
}
SubShader
{
Pass
{
float _x;
half4 frag(v2f i) : SV_Target
{
// "col" and "wpos" have been correctly defined
if (wpos.x >= _x)
{
col.r = 1;
} else {
col.r = 0;
}
return col;
}
}
}
}
This shader is set through the Edit->Project Settings->Graphics->Deferred option. It is the default shader that the main camera uses.
Now I need to set the _x value from code attached to the camera:
public class XCameraController : MonoBehaviour
{
    public float x;

    void Update()
    {
        <something>.SetFloat("_x", x);
    }
}
The <something> placeholder would normally be a material, as SetFloat() is defined there. But the camera and shader do not have a material. The concept of material doesn't even apply to the default shader.
I've searched online and in the documentation for hours. I admit I failed and am at a loss here. I guess it must be simple but I can't find any documentation.
I don't expect an implemented solution, a pointer where I can find help will suffice!

But the camera and shader do not have a material. The concept of material doesn't even apply to the default shader.
True, but a material simply exposes a shader's properties, so it is relevant here since you want to change those properties.
You have a custom shader, but it is used to render the camera rather than a GameObject. A material is still needed to change the shader. If you don't want to use a material, you can use the Shader.SetGlobalXXX functions, such as Shader.SetGlobalFloat("_x", 3), but that sets the value globally for every shader that reads _x, which is usually not what you want.
The proper way to do this is to create a temporary material you will use to modify the shader, change the shader properties, then update the shader the camera is using. To do this, you have to:
Find the shader, or get a reference to it through a public variable:
Shader camShader = Shader.Find("Hidden/XShader");
Create a material from the shader:
Material camMat = new Material(camShader);
Modify the property as you wish
camMat.SetFloat("_x", 3);
Apply the modified shader to the camera:
Camera.main.SetReplacementShader(camShader, "RenderType");
If you're manually rendering the camera then use Camera.main.RenderWithShader(camShader, "RenderType") instead of Camera.main.SetReplacementShader(camShader, "RenderType").
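Putting those steps together, here is a minimal, untested sketch of what the controller from the question could look like under this approach (shader and property names taken from the question):
using UnityEngine;
public class XCameraController : MonoBehaviour
{
    public float x;
    private Shader camShader;
    private Material camMat;

    void Start()
    {
        // Find the hidden shader and wrap it in a temporary material
        camShader = Shader.Find("Hidden/XShader");
        camMat = new Material(camShader);
        // Have the camera render with this shader
        Camera.main.SetReplacementShader(camShader, "RenderType");
    }

    void Update()
    {
        // Update the shader property every frame
        camMat.SetFloat("_x", x);
    }
}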

Related

Access TextMesh Pro Texture Tiling and Offset how?

TextMesh Pro shaders have two unusual facilities for adjusting the Textures used for both the Face and the Outline: Tiling and Offset.
They're not accessible via the normal ways of accessing shader properties by name in Unity.
Is it possible to access these properties from Monobehaviours? How?
If you're wanting to see sample code, there's little point: I've tried all the normal ways of accessing shader properties in Unity and found that none of them work; they all throw errors about symbols not existing, or return nulls.
These properties are somewhat nested, somehow.
If you've ever successfully edited these values with a Script in Unity, you'll likely know that they're a bit different.
Within the Shader files for TextMesh Pro, these values are known as:
float4 _FaceTex_ST;
float4 _OutlineTex_ST;
Note, the .x and .y of these float4 are the scaling/tiling along their respective axis, whilst .z and .w are used for their offsets.
Depending a bit on which shader exactly you use (assuming for now one of the built-in ones, e.g. TextMeshPro/Distance Field (Surface)), you can search for the shader, e.g. in Assets/TextMeshPro/Shaders, select it, and see which properties are exposed in the Inspector.
In that case it would be the _FaceTex texture.
Now the Tiling and Offset are indeed quite tricky, since they are stored directly along with the Texture property itself! You can see this when setting the Inspector to Debug mode with the TextMeshPro object selected.
=> You want to use Material.SetTextureOffset and Material.SetTextureScale (= Tiling) with the property name of the texture itself
yourTextMeshPro.fontMaterial.SetTextureScale("_FaceTex", new Vector2(42, 42));
yourTextMeshPro.fontMaterial.SetTextureOffset("_FaceTex", new Vector2(123, 456));
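For reference, those calls correspond to the float4 _FaceTex_ST packing described in the question. A hedged sketch of reading it back as a raw vector (assuming the _ST value is exposed as an ordinary material vector):
// xy = tiling, zw = offset, per the packing described above
Vector4 faceST = yourTextMeshPro.fontMaterial.GetVector("_FaceTex_ST");
Vector2 faceTiling = new Vector2(faceST.x, faceST.y);
Vector2 faceOffset = new Vector2(faceST.z, faceST.w);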
The Tiling and Offset apparently have no effect for the Outline; see e.g. the Distance Field (Surface) shader documentation, which lists:
Outline
...
Tiling: ????
Offset: ????
You could still try, though, and do the same with _OutlineTex.
Thanks to the incomparable derHugo, the resultant code works perfectly, in both directions:
using TMPro;
using UnityEngine;

public class testMaterailProps : MonoBehaviour {
    public Vector2 FaceTiling;
    public Vector2 FaceOffset;
    public Vector2 OutLineTiling;
    public Vector2 OutlineOffset;
    public Material myFontMaterial;
    TextMeshPro myTexmMeshPro;

    static readonly int FaceTex = Shader.PropertyToID( "_FaceTex" );
    static readonly int OutlineTex = Shader.PropertyToID( "_OutlineTex" );

    void Start ( ) {
        myTexmMeshPro = GetComponent<TextMeshPro>( );
        myFontMaterial = myTexmMeshPro.fontSharedMaterial;
        FaceTiling = myFontMaterial.GetTextureScale( FaceTex );
        FaceOffset = myFontMaterial.GetTextureOffset( FaceTex );
        OutLineTiling = myFontMaterial.GetTextureScale( OutlineTex );
        OutlineOffset = myFontMaterial.GetTextureOffset( OutlineTex );
    }

    void OnValidate ( ) {
        myFontMaterial.SetTextureScale( FaceTex, FaceTiling );
        myFontMaterial.SetTextureOffset( FaceTex, FaceOffset );
        myFontMaterial.SetTextureScale( OutlineTex, OutLineTiling );
        myFontMaterial.SetTextureOffset( OutlineTex, OutlineOffset );
    }
}
This makes it possible to copy text styles accurately from one text object to another, since a bug in Unity's copy/paste of properties prevents these values from being copied through the Editor UI.

Distort the scene on item pick

I'm making a simple game, and I have an idea for one of the player items: when the player picks it up, the screen starts to wave (distort), but I can't get that working. I found a shader that distorts an image, but it distorts it permanently and works only on one picture, not the whole scene. I also tried Camera.SetReplacementShader, but then everything just turns light blue.
Any ideas appreciated!
The code of the shader is down below:
Shader "Custom/NewShader" {
Properties {
_MainTex ("Base (RGB)", 2D) = "transparent" {}
_SpeedX("SpeedX", float)=3.0
_SpeedY("SpeedY", float)=3.0
_Scale("Scale", range(0.005, 0.2))=0.03
_TileX("TileX", float)=5
_TileY("TileY", float)=5
}
SubShader {
Tags { "RenderType"="Opaque" }
LOD 200
CGPROGRAM
#pragma surface surf Lambert
sampler2D _MainTex;
float4 uv_MainTex_ST;
float _SpeedX;
float _SpeedY;
float _Scale;
float _TileX;
float _TileY;
struct Input {
float2 uv_MainTex;
};
void surf (Input IN, inout SurfaceOutput o)
{
float2 uv = IN.uv_MainTex;
uv.x += sin ((uv.x+uv.y)*_TileX+_Time.g *_SpeedX)*_Scale;
uv.x += cos (uv.y*_TileY+_Time.g *_SpeedY)*_Scale;
half4 c = tex2D (_MainTex, uv);
o.Albedo = c.rgb * 4;
o.Alpha = c.a * 4;
}
ENDCG
}
FallBack "Diffuse"
}
You can use Post Processing Effects to solve this. It is Unity's "visual enhancement" package, basically an Instagram filter for your game camera. You can change the visuals of your game completely using this package.
Setup:
Go to Unity > Window > Package Manager
Select Unity Registry at the top and type "Post Processing" in the search bar
Download and Import the package
Go to a GameObject and add the Post-process Volume component
Then go to your Main Camera and add the Post-process Layer component
On the Post-process Volume, create a New Profile and give the GameObject a new Layer ("PostProcessing", perhaps)
On the Post-process Layer (on your camera), select the Layer you assigned
You are all set. You can now go back to the Post-process Volume component, press "Add Effect", and play around with the various effects to create the one you want.
This works for both 2D and 3D; if there is a camera, this will work.
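For the "distort on pickup" part, here is a minimal sketch of toggling an effect at runtime. It assumes the Post Processing v2 package set up as above; the component name, the OnItemPicked hook, and the use of Lens Distortion as a stand-in for the wave effect are all illustrative:
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;
public class DistortOnPickup : MonoBehaviour
{
    public PostProcessVolume volume; // assign the Post-process Volume created above
    private LensDistortion distortion;

    void Start()
    {
        // Get the Lens Distortion settings from the profile, or add them if missing
        if (!volume.profile.TryGetSettings(out distortion))
            distortion = volume.profile.AddSettings<LensDistortion>();
        distortion.enabled.Override(false);
    }

    public void OnItemPicked() // call this from your item pickup logic
    {
        distortion.enabled.Override(true);
        distortion.intensity.Override(40f);
    }
}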

Unity Post-processing PostProcessEffectRenderer shows in Editor but not in build

After adding an implementation of a PostProcessEffectRenderer to the Unity post-processing stack the effect works perfectly in the Unity Editor, but does not show in the built game.
Changes to build quality have no effect; the effect does not show even at maximum quality settings when building for Windows x86_64.
Grayscale.cs
using System;
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

[Serializable]
[PostProcess(typeof(GrayscaleRenderer), PostProcessEvent.AfterStack, "Custom/Grayscale")]
public sealed class Grayscale : PostProcessEffectSettings
{
    [Range(0f, 1f), Tooltip("Grayscale effect intensity.")]
    public FloatParameter blend = new FloatParameter { value = 0.5f };
}

public sealed class GrayscaleRenderer : PostProcessEffectRenderer<Grayscale>
{
    public override void Render(PostProcessRenderContext context)
    {
        var sheet = context.propertySheets.Get(Shader.Find("Hidden/Custom/Grayscale"));
        sheet.properties.SetFloat("_Blend", settings.blend);
        context.command.BlitFullscreenTriangle(context.source, context.destination, sheet, 0);
    }
}
Grayscale.shader
Shader "Hidden/Custom/Grayscale"
{
HLSLINCLUDE
#include "Packages/com.unity.postprocessing/PostProcessing/Shaders/StdLib.hlsl"
TEXTURE2D_SAMPLER2D(_MainTex, sampler_MainTex);
float _Blend;
float4 Frag(VaryingsDefault i) : SV_Target
{
float4 color = SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, i.texcoord);
float luminance = dot(color.rgb, float3(0.2126729, 0.7151522, 0.0721750));
color.rgb = lerp(color.rgb, luminance.xxx, _Blend.xxx);
return color;
}
ENDHLSL
SubShader
{
Cull Off ZWrite Off ZTest Always
Pass
{
HLSLPROGRAM
#pragma vertex VertDefault
#pragma fragment Frag
ENDHLSL
}
}
}
After much trial and error I realised that this was caused by Unity excluding the hidden shader because it lacked a reference to anything in the game at build time. On build, Unity will only include shaders that are either attached to a material used in a scene or added to the 'Always Included Shaders' array in Project Settings (Graphics).
I tried both, and it solved my problem. It has been suggested that creating a dummy object in your game that references the hidden shader works better, as it leaves Unity to decide whether the shader is needed in a scene or not. Either way, this fixed it for me.
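A sketch of the "dummy reference" idea (the component name is made up; dropping it on any object in a built scene and assigning the hidden shader in the Inspector is enough to keep the shader in the build):
using UnityEngine;
// Holds direct references to shaders that nothing else in the scene references,
// so Unity includes them in the build (e.g. Hidden/Custom/Grayscale).
public class ShaderKeepAlive : MonoBehaviour
{
    public Shader[] shadersToInclude;
}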

Find and change property name in a shader

I want to change and set three things after setting the new material:
First, the shader to Unlit/Color.
Second, the albedo color, for example to 255,0,0,255.
Third, the metallic value from 0 to 1.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class DoorsLockManager : MonoBehaviour
{
    public bool locked;
    public Color lockedColor = Color.red;
    public Color unlockedColor = Color.green;
    public Renderer rend;

    private GameObject[] doorPlanes;

    private void Start()
    {
        doorPlanes = GameObject.FindGameObjectsWithTag("DoorPlane");
        for (int i = 0; i < doorPlanes.Length; i++)
        {
            rend = doorPlanes[i].GetComponent<Renderer>();
            if (locked)
            {
                rend.material.SetFloat("Metallic", 1);
                rend.material.color = lockedColor;
            }
            else
            {
                rend.material.color = unlockedColor;
            }
        }
    }

    // Update is called once per frame
    void Update ()
    {
    }
}
This line does nothing:
rend.material.SetFloat("Metallic", 1);
This is what I want to change:
When you need to change a shader property but don't know the proper name to use, select the material, click its settings (gear) icon, then click "Select Shader". From there you will see all the shader property names and their types, so you know which function and name to use to change the shader's properties.
With the default Standard shader, that list includes properties such as _Color and _Metallic.
You need to know this; otherwise you would have to ask a new question for each property you want to change.
Your Renderer reference:
public Renderer rend;
To set it, the SetXXX function is used:
rend.material.SetFloat("_Metallic", 1); //Metallic is a float
To get it the GetXXX function is used:
float metallic = rend.material.GetFloat("_Metallic"); //Metallic is a float
To change or get the albedo color (for example to 255,0,0,255):
rend.material.SetColor("_Color", new Color32(255, 0, 0, 255));
Color color = rend.material.GetColor("_Color");
Notice that I used Color32 instead of Color because Color32 takes values between 0 and 255 while Color expects values between 0 and 1.
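For example, these two values describe the same red, since Color32 converts implicitly to Color:
Color a = new Color(1f, 0f, 0f, 1f);   // components in the 0-1 range
Color b = new Color32(255, 0, 0, 255); // bytes in the 0-255 range
// a and b represent the same color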
To change the material's shader to "Unlit/Color", just find it then assign it to the material:
Shader shader = Shader.Find("Unlit/Color");
rend.material.shader = shader;
Note that the "Unlit/Color" shader doesn't have the _Metallic property. It has only one property, "_Color". You can use the first method I described to determine which properties it has before attempting to change them.
What if the shader is used on many different objects? I want all the objects to change when the property is changed on one of them.
To change all the objects that use the same material, use Renderer.sharedMaterial instead of Renderer.material. The shared material modifies the original material asset, and every other object picks up the change, as long as you have not accessed Renderer.material on that renderer, which creates a new copy of the material and disconnects the renderer's material from the original.
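As a minimal sketch, the locked branch from the question could use the shared material instead (assuming all door planes share one material asset; note that in the Editor this permanently modifies that asset):
if (locked)
{
    // Affects every renderer that uses this material asset
    rend.sharedMaterial.SetFloat("_Metallic", 1);
    rend.sharedMaterial.SetColor("_Color", lockedColor);
}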

How exactly do we bind an attribute name to a location in OpenGL?

I am using OpenTK, a wrapper for .NET. The version of OpenGL used is 4.5 implemented by NVIDIA. I am using Windows 10 Pro.
Issue
My issue is simple. I want to address the vertex attributes by their names, instead of hard coding their location in shader source.
I have a vertex shader called basicTexturedVert.glsl
#version 450 core

in vec4 position;
in vec2 normal;

uniform mat4 modelMatrix;
uniform mat4 viewMatrix;
uniform mat4 projectionMatrix;

out vec2 vs_uv;

void main(void)
{
    vs_uv = normal;
    gl_Position = projectionMatrix * viewMatrix * modelMatrix * position;
}
Things I have tried
Now, to do that, normally I would call GL.GetAttribLocation with the name of the attribute and the program, and it would return the attribute's location. Well, I tried everything, but it only returns the location of in vec4 position and not of in vec2 normal. And by everything I mean:
When I hard code the location of both attributes, GL.GetAttribLocation("position") always returns the correct location, but the same call for normal returns -1.
I thought it had to do with the name normal (maybe it is a reserved word in OpenGL), so I changed it to a random word like abcdef; still the same result.
Then I thought maybe it has to do with the order in which the attributes are declared in the shader source, so I moved normal before position; still the same result.
By now I am going insane trying to figure out why OpenGL always gives the right location for position. I thought maybe vec2 (which here is the only difference between the two) is not an accepted type; I checked online, and it is accepted.
As you can see, I tried many things before this one. I read that you can programmatically bind the attributes to names and assign a location of your choosing, so that is what I do in the following code.
First I create my Shader objects like this:
var basicTexturedVertexShader = new Shader("Basic Textured Vertex Shader",
    ShaderType.VertexShader,
    File.ReadAllText(@"Components\Shaders\Vertex\basicTexturedVert.glsl"),
    new[] { "position", "normal" }
);

var basicTexturedFragmentShader = new Shader("Basic Textured Fragment Shader",
    ShaderType.FragmentShader,
    File.ReadAllText(@"Components\Shaders\Fragment\basicTexturedFrag.glsl")
);
As you can see, each shader gets assigned:
- A name so I can understand which shader I am working on (during debug)
- The type of the shader (VertexShader or FragmentShader)
- The shader source code
- And optionally an array containing the names of the shader attributes like for the first one new[] { "position", "normal" } which will be assigned to a location during program linking
I then create a program and link them to it:
_texturedProgram = new ShaderProgram(basicTexturedVertexShader, basicTexturedFragmentShader);
_texturedProgram.Link();
Now inside the _texturedProgram.Link:
int location = 0; // This is a location index that starts from 0 then goes up

foreach (var shader in _shaders) {
    DebugUtil.Info($"Attaching shader {shader.Name} of handle {shader.Handle} of type {shader.Type} to program {_handle}");
    GL.AttachShader(_handle, shader.Handle);

    // If the shader we attached has attribute names with it
    // It means we need to give them a location
    if (shader.AttributeNames != null)
    {
        foreach (var shaderAttributeName in shader.AttributeNames)
        {
            _attributeLocation[shaderAttributeName] = location;
            GL.BindAttribLocation(_handle, location, shaderAttributeName);

            // We check if anything wrong happened and output it
            ErrorCode error;
            bool errorHappened = false;
            while ((error = GL.GetError()) != ErrorCode.NoError) {
                DebugUtil.Warning($"Problem during binding attrib location of {shaderAttributeName} of {shader.Name} to {location} in program {_handle}. Error: {error}");
                errorHappened = true;
            }
            if (!errorHappened) {
                DebugUtil.Info($"Shader attribute \"{shaderAttributeName}\" of {shader.Name} of program {Handle} SHOULD HAVE BEEN bound to location {location}");
            }

            location++;
        }
    }
}

// We link the program
GL.LinkProgram(_handle);

// Make sure the linking happened with no problem
var info = GL.GetProgramInfoLog(_handle);
if (!string.IsNullOrWhiteSpace(info)) {
    DebugUtil.Warning($"Info log during linking of shaders to program {_handle}: {info}");
}
else {
    DebugUtil.Info($"Program {_handle} linked successfully");
}

// We compare the locations we think have been assigned to the vertex attributes
// to the one that are actually stored in OpenGL
foreach (var attribute in _attributeLocation) {
    DebugUtil.Info($"[Program:{_handle}] According to OpenGL, {attribute.Key} is located in {GL.GetAttribLocation(_handle, attribute.Key)} when it is supposed to be in {attribute.Value}");
}

// We clean up :)
foreach (var shader in _shaders) {
    GL.DetachShader(_handle, shader.Handle);
    GL.DeleteShader(shader.Handle);
}

// No need for the shaders anymore
_shaders.Clear();
And here is the console output (screenshot not reproduced here):
Let's say that position's default location would have been 0 and that it was just a coincidence; so let's start the location index at, say, 5.
As you can see, my code works for position but not for normal...
It appears that, because the normal vertex attribute is never used by the subsequent stage (the fragment shader), the OpenGL shader compiler optimizes the program by eliminating the unused variable, so it gets no location.
Thanks to @Ripi2 for pointing that out.
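One way to confirm this (a sketch using OpenTK; exact overloads may vary between OpenTK versions) is to ask the linked program which attributes are actually active:
// After GL.LinkProgram(_handle): list the attributes that survived optimization.
GL.GetProgram(_handle, GetProgramParameterName.ActiveAttributes, out int activeAttribs);
for (int i = 0; i < activeAttribs; i++)
{
    string name = GL.GetActiveAttrib(_handle, i, out int size, out ActiveAttribType type);
    DebugUtil.Info($"Active attribute '{name}' ({type}) at location {GL.GetAttribLocation(_handle, name)}");
}
// An attribute eliminated by the compiler (like normal here) simply does not appear.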
