XNA 3.1 to 4.0 shader - c#

So my professor gave our class some code to look over to help us learn about Vector3D drawing, positioning, and movement. The code was originally written in XNA 3.1, and our lab at school is still on XNA 3.1; however, I do everything on my laptop, which has Visual Studio 2012 (I have 2013 but haven't moved XNA over to it yet). I have figured out and fixed all but one of the errors. I keep getting this error whenever I debug and run the game:
Error 1 Errors compiling C:\Users\Nicholas\Documents\Visual Studio 2013\Projects\MGH05_PrimitiveObjects\MGH05_PrimitiveObjects\Content\Shaders\PositionColor.fx:
C:\Users\Nicholas\Documents\Visual Studio 2013\Projects\MGH05_PrimitiveObjects\MGH05_PrimitiveObjects\Content\Shaders\PositionColor.fx(27,6): error X3000: syntax error: unexpected token 'VertexShader' C:\Users\Nicholas\Documents\Visual Studio 2013\Projects\MGH05_PrimitiveObjects\MGH05_PrimitiveObjects\Content\Content\Shaders\PositionColor.fx 27 6 MGH05_Win_PrimitiveObjects
My professor has been of no help, and neither has google. Anyone have any idea how to fix this? Here is the shader (PositionColor.fx) code:
float4x4 wvpMatrix : WORLDVIEWPROJ;
struct VSinput
{
float4 position : POSITION0;
float4 color : COLOR0;
};
struct VStoPS
{
float4 position : POSITION0;
float4 color : COLOR0;
};
struct PSoutput
{
float4 color : COLOR0;
};
// alter vertex inputs
void VertexShader(in VSinput IN, out VStoPS OUT)
{
// transform vertex
OUT.position = mul(IN.position, wvpMatrix);
OUT.color = IN.color;
}
// alter vs color output
void PixelShader(in VStoPS IN, out PSoutput OUT)
{
float4 color = IN.color;
OUT.color = clamp(color, 0, 1); // range between 0 and 1
}
// the shader starts here
technique BasicShader
{
pass p0
{
// declare & initialize ps & vs
vertexshader = compile vs_1_1 VertexShader();
pixelshader = compile ps_1_1 PixelShader();
}
}
When I rename VertexShader I still get the error, but now with PixelShader. When I rename them both it still gives me the VertexShader error.
If anyone has any thoughts let me know! Also, I apologize if this was asked on the wrong stack website. I'd think this one would be the proper place. If you need any extra info, let me know!
Thanks in advance!

As Romoku suggested, XNA 4.0 compiles effects against a newer HLSL version in which additional keywords are reserved, including VertexShader and PixelShader, so those names can no longer be used as identifiers.
The solution is to give the functions some other name (any will do). Also remember to update the names in the technique.
void FancyVertexShader(in VSinput IN, out VStoPS OUT)
{
// transform vertex
OUT.position = mul(IN.position, wvpMatrix);
OUT.color = IN.color;
}
// alter vs color output
void AwesomePixelShader(in VStoPS IN, out PSoutput OUT)
{
float4 color = IN.color;
OUT.color = clamp(color, 0, 1); // range between 0 and 1
}
// the shader starts here
technique BasicShader
{
pass p0
{
// declare & initialize ps & vs
vertexshader = compile vs_2_0 FancyVertexShader();
pixelshader = compile ps_2_0 AwesomePixelShader();
}
}
Edit: and as you pointed out yourself, XNA 4 uses version 2.0 for both the vertex and the pixel shader.

Related

Access TextMesh Pro Texture Tiling and Offset how?

TextMesh Pro shaders have two unusual facilities for adjusting the Textures used for both the Face and the Outline: Tiling and Offset.
They're not accessible via the normal ways of using keywords to access shader properties in Unity.
Is it possible to access these properties from Monobehaviours? How?
If you're wanting to see sample code, there's little point: I've tried all the normal ways of accessing shader properties in Unity and found that none of them work; they all throw errors about symbols not existing, or return nulls.
These properties are somewhat nested, somehow.
If you've ever successfully edited these values with a Script in Unity, you'll likely know that they're a bit different.
Within the Shader files for TextMesh Pro, these values are known as:
float4 _FaceTex_ST;
float4 _OutlineTex_ST;
Note, the .x and .y of these float4 are the scaling/tiling along their respective axis, whilst .z and .w are used for their offsets.
Depending a bit on which shader exactly you use (for now assuming one of the built-in ones, e.g. TextMeshPro/Distance Field (Surface)), you can search for the shader, e.g. in Assets/TextMeshPro/Shaders, select it, and see which properties are exposed in the Inspector.
In that case it would be the _FaceTex texture.
Now the Tiling and Offset are indeed quite tricky since they are stored directly along with the Texture property itself! You can see this when setting the Inspector to Debug mode with the TextMeshPro selected.
=> You want to use Material.SetTextureOffset and Material.SetTextureScale (= Tiling) with the property name of the texture itself
yourTextMeshPro.fontMaterial.SetTextureScale("_FaceTex", new Vector2(42, 42));
yourTextMeshPro.fontMaterial.SetTextureOffset("_FaceTex", new Vector2(123, 456));
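Under the hood these calls read and write the packed _FaceTex_ST vector mentioned above (tiling in .x/.y, offset in .z/.w). As a minimal sketch, assuming a TextMeshPro component whose font material follows that convention (the class name here is made up), you could also read the packed vector back directly:
using TMPro;
using UnityEngine;

public class FaceStReader : MonoBehaviour
{
    void Start()
    {
        // _FaceTex_ST packs tiling in (x, y) and offset in (z, w)
        Material mat = GetComponent<TextMeshPro>().fontSharedMaterial;
        Vector4 st = mat.GetVector("_FaceTex_ST");
        Vector2 tiling = new Vector2(st.x, st.y);
        Vector2 offset = new Vector2(st.z, st.w);
        Debug.Log($"Face tiling: {tiling}, offset: {offset}");
    }
}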
The Tiling and Offset have no effect for the Outline apparently. See e.g. Distance Field (Surface) Shader.
Outline
...
Tiling: ????
Offset: ????
You could still try though and do the same just with _OutlineTex
Thanks to the incomparable derHugo, the resultant code works perfectly, in both directions:
using TMPro;
using UnityEngine;
public class testMaterailProps : MonoBehaviour {
public Vector2 FaceTiling;
public Vector2 FaceOffset;
public Vector2 OutLineTiling;
public Vector2 OutlineOffset;
public Material myFontMaterial;
TextMeshPro myTexmMeshPro;
static readonly int FaceTex = Shader.PropertyToID( "_FaceTex" );
static readonly int OutlineTex = Shader.PropertyToID( "_OutlineTex" );
void Start ( ) {
myTexmMeshPro = GetComponent<TextMeshPro>( );
myFontMaterial = myTexmMeshPro.fontSharedMaterial;
FaceTiling = myFontMaterial.GetTextureScale( FaceTex );
FaceOffset = myFontMaterial.GetTextureOffset( FaceTex );
OutLineTiling = myFontMaterial.GetTextureScale( OutlineTex );
OutlineOffset = myFontMaterial.GetTextureOffset( OutlineTex );
}
void OnValidate ( ) {
myFontMaterial.SetTextureScale( FaceTex, FaceTiling );
myFontMaterial.SetTextureOffset( FaceTex, FaceOffset );
myFontMaterial.SetTextureScale( OutlineTex, OutLineTiling );
myFontMaterial.SetTextureOffset( OutlineTex, OutlineOffset );
}
}
This makes it possible to copy text styles accurately from one text object to another, since a bug in Unity's copy/paste of properties prevents these values from being copied through the Editor UI...

Unity Post-processing PostProcessEffectRenderer shows in Editor but not in build

After adding an implementation of a PostProcessEffectRenderer to the Unity post-processing stack the effect works perfectly in the Unity Editor, but does not show in the built game.
Changes to build quality have no effect; the effect does not show even at maximum quality settings, building for Windows x86_64.
Grayscale.cs
using System;
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;
[Serializable]
[PostProcess(typeof(GrayscaleRenderer), PostProcessEvent.AfterStack, "Custom/Grayscale")]
public sealed class Grayscale : PostProcessEffectSettings
{
[Range(0f, 1f), Tooltip("Grayscale effect intensity.")]
public FloatParameter blend = new FloatParameter { value = 0.5f };
}
public sealed class GrayscaleRenderer : PostProcessEffectRenderer<Grayscale>
{
public override void Render(PostProcessRenderContext context)
{
var sheet = context.propertySheets.Get(Shader.Find("Hidden/Custom/Grayscale"));
sheet.properties.SetFloat("_Blend", settings.blend);
context.command.BlitFullscreenTriangle(context.source, context.destination, sheet, 0);
}
}
Grayscale.shader
Shader "Hidden/Custom/Grayscale"
{
HLSLINCLUDE
#include "Packages/com.unity.postprocessing/PostProcessing/Shaders/StdLib.hlsl"
TEXTURE2D_SAMPLER2D(_MainTex, sampler_MainTex);
float _Blend;
float4 Frag(VaryingsDefault i) : SV_Target
{
float4 color = SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, i.texcoord);
float luminance = dot(color.rgb, float3(0.2126729, 0.7151522, 0.0721750));
color.rgb = lerp(color.rgb, luminance.xxx, _Blend.xxx);
return color;
}
ENDHLSL
SubShader
{
Cull Off ZWrite Off ZTest Always
Pass
{
HLSLPROGRAM
#pragma vertex VertDefault
#pragma fragment Frag
ENDHLSL
}
}
}
After much trial and error I realised that this was caused by Unity excluding the hidden shader because nothing in the game referenced it at build time. On build, Unity only includes shaders that are either attached to a material used in a scene or listed in the 'Always Included Shaders' array under Project Settings > Graphics.
I tried both and it solved my problem. It has also been suggested that creating a dummy object in your game that references the hidden shader works better, since it leaves Unity to decide whether the shader is needed in a scene. Either way, this fixed it for me.
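As a minimal sketch of that dummy-reference approach (the class and field names here are made up), a component like the following, placed in a built scene with the Hidden/Custom/Grayscale shader assigned in the Inspector, is enough to keep the shader from being stripped:
using UnityEngine;

// Attach to any object in a built scene and assign the hidden shader
// in the Inspector; the serialized reference prevents Unity from
// stripping the shader from the build.
public class ShaderKeeper : MonoBehaviour
{
    [SerializeField] private Shader grayscaleShader;
}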

Change camera shader property

I need to set a variable of a shader without a material that wraps around it.
I'll explain the problem and why it's not like the question "How can I access shader variables from script?".
Problem
My shader is similar to this (99% of irrelevant boilerplate code removed):
Shader "Hidden/XShader"
{
Properties
{
_x ("", float) = 0
}
SubShader
{
Pass
{
float _x;
half4 frag(v2f i) : SV_Target
{
// "col" and "wpos" have been correctly defined
if (wpos.x >= _x)
{
col.r = 1;
} else {
col.r = 0;
}
return col;
}
}
}
}
This shader is set through the Edit->Project Settings->Graphics->Deferred option. It is the default shader that the main camera uses.
Now I need to set the _x value from code attached to the camera:
public class XCameraController : MonoBehaviour
{
public float x;
void Update()
{
<something>.SetFloat("_x", x);
}
}
The <something> placeholder would normally be a material, as SetFloat() is defined there. But the camera and shader do not have a material. The concept of material doesn't even apply to the default shader.
I've searched online and in the documentation for hours. I admit I failed and am at a loss here. I guess it must be simple but I can't find any documentation.
I don't expect an implemented solution, a pointer where I can find help will suffice!
But the camera and shader do not have a material. The concept of
material doesn't even apply to the default shader.
True, but a material simply exposes all the properties of its shader, so it is still relevant here since you want to change shader properties.
You have a custom shader, but it is used to render the camera rather than a GameObject. A material is still needed to change the shader's properties. If you don't want to use a material, you can use the Shader.SetGlobalXXX functions such as Shader.SetGlobalFloat("_x", 3), but that sets the value globally for every shader that uses that property name, which is usually not what you want.
The proper way to do this is to create a temporary material you will use to modify the shader, change the shader properties, then update the shader the camera is using. To do this, you have to:
Find the shader or get a reference of the shader with a public variable:
Shader camShader = Shader.Find("Hidden/XShader");
Create material from the shader
Material camMat = new Material(camShader);
Modify the property as you wish
camMat.SetFloat("_x", 3);
Apply the modified shader to the camera
Camera.main.SetReplacementShader(camShader, "RenderType");
If you're manually rendering the camera then use Camera.main.RenderWithShader(camShader, "RenderType") instead of Camera.main.SetReplacementShader(camShader, "RenderType").
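Putting those steps together, a minimal sketch of the controller from the question that simply strings the steps above together (treat it as a starting point rather than a drop-in solution):
using UnityEngine;

public class XCameraController : MonoBehaviour
{
    public float x;

    private Shader camShader;
    private Material camMat;

    void Start()
    {
        // Wrap the hidden shader in a material so its properties can be set
        camShader = Shader.Find("Hidden/XShader");
        camMat = new Material(camShader);
        Camera.main.SetReplacementShader(camShader, "RenderType");
    }

    void Update()
    {
        camMat.SetFloat("_x", x);
    }
}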

How exactly do we bind an attribute name to a location in OpenGL?

I am using OpenTK, a wrapper for .NET. The version of OpenGL used is 4.5 implemented by NVIDIA. I am using Windows 10 Pro.
Issue
My issue is simple. I want to address the vertex attributes by their names, instead of hard coding their location in shader source.
I have a vertex shader called basicTexturedVert.glsl
#version 450 core
in vec4 position;
in vec2 normal;
uniform mat4 modelMatrix;
uniform mat4 viewMatrix;
uniform mat4 projectionMatrix;
out vec2 vs_uv;
void main(void)
{
vs_uv = normal;
gl_Position = projectionMatrix * viewMatrix * modelMatrix * position;
}
Things I have tried
Now to do that, normally I would call GL.GetAttribLocation with the name of the attribute in the program, and it would return its location. Well, I tried everything, but it only returns the location of in vec4 position and not of in vec2 normal. And by everything I mean:
When I hard-code the location of both attributes, GL.GetAttribLocation for "position" always returns the correct location, but the same call for "normal" returns -1.
I thought it had to do with the name normal, maybe it is a reserved word in OpenGL, so I changed it to a random word like abcdef; it still gives the same result.
Then I thought maybe it has to do with the order of declaration of the attributes in the shader source, so I moved normal before position; still the same result.
By now I was going insane trying to figure out why OpenGL always gives the right location for position. I thought maybe vec2 (which here is the only difference between the two) is not an accepted type; I checked online, and it is accepted.
As you can see, I tried many things before trying this one. I read that you can programmatically bind attribute names to locations of your choosing, so that is what I do in the following code.
First I create my Shader objects like this:
var basicTexturedVertexShader = new Shader("Basic Textured Vertex Shader",
ShaderType.VertexShader,
File.ReadAllText(@"Components\Shaders\Vertex\basicTexturedVert.glsl"),
new[] { "position", "normal" }
);
var basicTexturedFragmentShader = new Shader("Basic Textured Fragment Shader",
ShaderType.FragmentShader,
File.ReadAllText(@"Components\Shaders\Fragment\basicTexturedFrag.glsl")
);
As you can see, each shader gets assigned:
- A name so I can understand which shader I am working on (during debug)
- The type of the shader (VertexShader or FragmentShader)
- The shader source code
- And optionally an array containing the names of the shader attributes like for the first one new[] { "position", "normal" } which will be assigned to a location during program linking
I then create a program and link them to it:
_texturedProgram = new ShaderProgram(basicTexturedVertexShader, basicTexturedFragmentShader);
_texturedProgram.Link();
Now inside the _texturedProgram.Link:
int location = 0; // This is a location index that starts from 0 then goes up
foreach (var shader in _shaders) {
DebugUtil.Info($"Attaching shader {shader.Name} of handle {shader.Handle} of type {shader.Type} to program {_handle}");
GL.AttachShader(_handle, shader.Handle);
// If the shader we attached has attribute names with it
// It means we need to give them a location
if (shader.AttributeNames != null)
{
foreach (var shaderAttributeName in shader.AttributeNames)
{
_attributeLocation[shaderAttributeName] = location;
GL.BindAttribLocation(_handle, location, shaderAttributeName);
// We check if anything wrong happened and output it
ErrorCode error;
bool errorHappened = false;
while ((error = GL.GetError()) != ErrorCode.NoError) {
DebugUtil.Warning($"Problem during binding attrib location of {shaderAttributeName} of {shader.Name} to {location} in program {_handle}. Error: {error}");
errorHappened = true;
}
if (!errorHappened) {
DebugUtil.Info($"Shader attribute \"{shaderAttributeName}\" of {shader.Name} of program {Handle} SHOULD HAVE BEEN bound to location {location}");
}
location++;
}
}
}
// We link the program
GL.LinkProgram(_handle);
// Make sure the linking happened with no problem
var info = GL.GetProgramInfoLog(_handle);
if (!string.IsNullOrWhiteSpace(info)) {
DebugUtil.Warning($"Info log during linking of shaders to program {_handle}: {info}");
}
else {
DebugUtil.Info($"Program {_handle} linked successfully");
}
// We compare the locations we think have been assigned to the vertex attributes
// to the one that are actually stored in OpenGL
foreach (var attribute in _attributeLocation) {
DebugUtil.Info($"[Program:{_handle}] According to OpenGL, {attribute.Key} is located in {GL.GetAttribLocation(_handle, attribute.Key)} when it is supposed to be in {attribute.Value}");
}
// We clean up :)
foreach (var shader in _shaders) {
GL.DetachShader(_handle, shader.Handle);
GL.DeleteShader(shader.Handle);
}
// No need for the shaders anymore
_shaders.Clear();
And here is the console output:
Let's say that position's default location would have been 0 and that's just a coincidence, so let's set the location starting index at something like 5.
As you can see, my code works for position but not for normal...
It appears that, because the normal vertex attribute (passed on as vs_uv) is never actually used by the subsequent stage (the fragment shader), OpenGL optimizes the linked program by removing the unused variable, so GL.GetAttribLocation returns -1 for it.
Thanks to @Ripi2 for pointing that out.
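As a small defensive sketch against this (reusing the _handle, _attributeLocation and DebugUtil names from the code above), the post-link check can flag attributes that the linker removed instead of silently reporting a wrong location:
// After GL.LinkProgram, attributes the linker optimized away report location -1.
foreach (var attribute in _attributeLocation)
{
    int actual = GL.GetAttribLocation(_handle, attribute.Key);
    if (actual == -1)
    {
        DebugUtil.Warning($"Attribute \"{attribute.Key}\" was optimized away " +
                          "(its value never reaches a later stage), so it has no location.");
    }
}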

Can Short2 be used on WP7 for vertex positions?

I'm having trouble using Short2 for the (x,y) positions in my vertex data. This is my vertex structure:
struct VertexPositionShort : IVertexType
{
    private static VertexElement[] vertexElements = new VertexElement[]
    {
        new VertexElement(0, VertexElementFormat.Short2, VertexElementUsage.Position, 0),
    };
    private static VertexDeclaration vertexDeclaration = new VertexDeclaration(vertexElements);

    public Short2 Position;

    public static VertexDeclaration Declaration
    {
        get { return new VertexDeclaration(vertexElements); }
    }

    VertexDeclaration IVertexType.VertexDeclaration
    {
        get { return new VertexDeclaration(vertexElements); }
    }
}
Using the WP7 emulator, nothing is drawn if I use this structure - no artifacts, nothing! However, if I use an identical structure where the Short2 structs are replaced by Vector2 then it all works perfectly.
I've found a reference to this being an emulator-specific issue: "In the Windows Phone Emulator, the SkinnedEffect bone index channel must be specified as one of the integer vertex element formats - either Byte4, Short2, or Short4. This same set of integer data formats cannot be used for other shader input channels such as colors, positions, and texture coordinates on the emulator." (http://www.softpedia.com/progChangelog/Windows-Phone-Developer-Tools-Changelog-154611.html) However this is from July 2010 and I'd have assumed this limitation has been fixed by now...? Unfortunately I don't have a device to test on.
Can anyone confirm that this is still an issue in the emulator or point me at another reason why this is not working?
Solved, by Mr Shawn Hargreaves: "You can use Short2 in vertex data, but this is an integer type, so your vertex shader must be written to accept integer rather than float inputs. BasicEffect takes floats, so Short2 will not work with it. NormalizedShort2 might be a better choice?"
http://blogs.msdn.com/b/shawnhar/archive/2010/11/19/compressed-vertex-data.aspx
I can confirm that NormalizedShort2 does in fact work for position data, in both the WP7 emulator and on real devices.
Thanks, Shawn!
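For reference, a sketch of the same vertex structure switched to NormalizedShort2 (untested here, just following Shawn's suggestion; the struct name is made up):
using Microsoft.Xna.Framework.Graphics;
using Microsoft.Xna.Framework.Graphics.PackedVector;

struct VertexPositionNormalizedShort : IVertexType
{
    // NormalizedShort2 is unpacked to floats in [-1, 1] by the GPU,
    // which BasicEffect's float position input can consume.
    public NormalizedShort2 Position;

    private static readonly VertexElement[] vertexElements = new[]
    {
        new VertexElement(0, VertexElementFormat.NormalizedShort2, VertexElementUsage.Position, 0),
    };

    private static readonly VertexDeclaration vertexDeclaration = new VertexDeclaration(vertexElements);

    VertexDeclaration IVertexType.VertexDeclaration
    {
        get { return vertexDeclaration; }
    }
}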
