Today I tried an old project at my company and got an error that made me curious. The offending line looks something like this:
if((dynamic)com_list.GetIntValue() != (dynamic)container.GetEnumValue())
The exception clearly shows that you can't compare an Int32 with an Enum.
But I wonder: could this ever have worked under some circumstance?
Have there been changes to the dynamic keyword that no longer allow this?
BTW, he also built comparisons like this in the code:
if((dynamic)com_list.GetIntValue() != (dynamic)container.GetBooleanValue())
I'm still confused why somebody would put this kind of comparison into production code.
No. The dynamic specification hasn't changed, and I am pretty sure the compiler's evaluation of such a trivial comparison didn't change overnight from one release to another. Most likely that code never worked.
Without an additional cast from the enum to int (or the other way around), it won't work.
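As a minimal sketch (MyEnum, GetIntValue and GetEnumValue below are invented stand-ins for the methods in the question), this is how the comparison behaves with and without the cast:

using System;

enum MyEnum { A = 0, B = 1 }

static class Demo
{
    static int GetIntValue() => 1;
    static MyEnum GetEnumValue() => MyEnum.B;

    static void Main()
    {
        // Throws RuntimeBinderException at runtime: the binder applies the usual C# rules,
        // and there is no '!=' operator between int and MyEnum.
        // if ((dynamic)GetIntValue() != (dynamic)GetEnumValue()) { }

        // With an explicit cast on either side, the comparison binds fine:
        if (GetIntValue() != (int)GetEnumValue()) { Console.WriteLine("different"); }
        if ((MyEnum)GetIntValue() != GetEnumValue()) { Console.WriteLine("different"); }
    }
}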
I like switch expressions for mapping enums to values - while not the most scalable solution, it's fast and fairly clean if the enum represents something modal (and isn't huge).
One common source of bugs is adding members to an enum, which often leaves new cases unhandled.
But I think these bugs could be nearly wiped out if we could have compile errors for a non-exhaustive switch, making the omissions easily visible and fixable. (The default case has to be omitted though, otherwise it's all moot)
Is this possible? I'm thinking of something like this:
public string GetTargetValue(Target target)
{
    return target switch
    {
        Target.A => "foo",
        Target.B => "bar",
        // Compile error if someone added Target.C, otherwise works fine
        // no default case - it would defeat the point
    };
}
P.S.: I work primarily in Unity, but to my understanding newer versions of Unity use the Roslyn compiler (I don't use Burst), so I assume that doesn't matter.
Yes, you can.
This case raises the warning CS8509, as you can see here. To turn this into an error, add the following to your .editorconfig:
dotnet_diagnostic.CS8509.severity = error
You might also want to ignore CS8524, which is raised when you don't have a default arm even though CS8509 isn't being raised (i.e. even when you're covering all named values), because the enum could still hold an unnamed value; see here. In that case, the compiler will insert a default arm with throw new SwitchExpressionException(target):
dotnet_diagnostic.CS8524.severity = none
I suspect you probably need to be building with the .NET 5 SDK or higher for this warning to be generated.
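Putting both settings together, a minimal .editorconfig might look like the sketch below; note that the entries need to sit under a file-matching section such as [*.cs], or they won't apply:

# .editorconfig next to the solution or project
root = true

[*.cs]
# non-exhaustive switch expression over named enum values -> fail the build
dotnet_diagnostic.CS8509.severity = error
# don't demand a catch-all arm for unnamed enum values
dotnet_diagnostic.CS8524.severity = none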
In my code I have a variable of type int. Directly after initializing it, I receive a NullReferenceException. I am stumped as to why this is happening and how it is even possible.
Here is the code:
int lookupValue = 0;
if (0 == lookupValue)
And here is the debugger screen. The value of lookupValue is actually 0.
The debugger is showing the wrong line as the exception source. This does happen sometimes; you need to keep an eye on the surrounding code and the stack trace.
Since you're working with a web application, it's also quite possible that the debugging information is out of sync with the code. Rebuilding the whole project might help, unless your dependencies are badly arranged.
Look at code ahead of the comparison, and below it as well (Is Session null? Is Session.UserId null? Is SqlCommands.LookupInsertCommand throwing NullReferenceException?). You can use quick watch to check pieces of code and find the one causing the NullReferenceException.
As a side-note, try not to carry practices from other languages to C#. Initialize local variables when you actually have a reasonable value to initialize them with - don't worry, the compiler will not allow you to compile code reading a variable that hasn't been assigned yet. When you just assign a default value, you're losing out on a few sanity checks of the code. Also, don't compare constant == variable. There's no reason to do that in C#, because you can't just accidentally type variable = constant - it will not compile (the only exception being the bool type, but you shouldn't compare that to a constant anyway - just do if (boolValue) or if (!boolValue)). It just makes the code harder to read and understand.
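To illustrate those last two points, here is a small sketch (GetValueFromSomewhere is a made-up helper):

using System;

class Demo
{
    // Hypothetical stand-in for wherever the real value comes from.
    static int GetValueFromSomewhere() { return 42; }

    static void Main()
    {
        int lookupValue;                        // no dummy initializer needed
        // Console.WriteLine(lookupValue);      // compile error: use of unassigned local variable
        lookupValue = GetValueFromSomewhere();  // assign once you actually have a value

        // if (lookupValue = 0) { }             // does not compile: cannot convert int to bool
        if (lookupValue == 0)                   // so plain variable == constant is safe in C#
        {
            Console.WriteLine("zero");
        }
    }
}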
EDIT:
This case in particular is actually quite obvious if you know what you're looking for. You see, the if (0 == lookupValue) doesn't exist anywhere in the compiled binary - the compiler can safely ignore it, because lookupValue will always be 0. Usually, the debugging information will account for this, but missing by one line is quite common even when there's nothing as drastic as a whole missing line of code (in your case, likely more than one).
Since you are working with an ASP.NET application, part of the code isn't actually compiled by Visual Studio - it's compiled when you make a request. To generate proper debug information, you must also set the <compilation debug="true" /> in web.config (Compilation element).
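For reference, that setting lives in the compilation element under system.web; a minimal fragment looks like this:

<configuration>
  <system.web>
    <compilation debug="true" />
  </system.web>
</configuration>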
There is no error in your code. Anyway, try this:
if (lookupValue.CompareTo(0) == 0)
I am trying to track down a very elusive bug in an application that manipulates a FlowDocument. I have shown below three consecutive lines of debugging code, together with their output:
Debug.Assert(ReferenceEquals(document1, document2));
Debug.WriteLine(document1.Blocks.Count); // 1
Debug.WriteLine(document2.Blocks.Count); // 3
Can anyone help me to understand how two references to the same object can have different values for a given property? Or am I missing something about the way ReferenceEquals works?
Thanks,
Tim
Edit:
If I change the assertion to an if block, the debugging code never runs ...
if (ReferenceEquals(document1, document2))
{
Debug.WriteLine(document1.Blocks.Count);
Debug.WriteLine(document2.Blocks.Count);
}
... which makes me feel utterly stupid, because the ReferenceEquals test is clearly working, but I don't understand why the assertion is not working.
Two things that might be happening, off the top of my head:
Accessing Blocks or Blocks.Count might mutate state (it shouldn't, but it is possible).
The object might be changed on another thread between the two calls. Do you use multi-threading in the application?
Also, if the references are of different types (i.e. document2 is of a derived type), the property might be overridden to return something different. You could check whether document1.GetType() == document2.GetType().
Edit in response to your update
Debug.Assert will only ever run if the assembly is compiled in Debug mode. If you are running a Release build, it will not run. This is because Debug.Assert is decorated with the [Conditional("DEBUG")] attribute.
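You can see the same mechanism with your own code; calls to a method marked [Conditional("DEBUG")] (the Check helper below is just for illustration) are stripped by the compiler whenever the DEBUG symbol isn't defined, which is exactly what happens to Debug.Assert in a Release build:

using System;
using System.Diagnostics;

static class Guards
{
    [Conditional("DEBUG")]
    public static void Check(bool condition, string message)
    {
        if (!condition) throw new InvalidOperationException(message);
    }
}

class Demo
{
    static void Main()
    {
        // In a Debug build (DEBUG defined) this call runs and throws;
        // in a Release build the entire call is removed by the compiler.
        Guards.Check(1 == 2, "this only fires in Debug builds");
    }
}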
It seems that the issue is the fact that you indeed have 2 different objects.
If a property has side effects it can yield different results each time you call it. E.g. DateTime.Now does not always equal DateTime.Now.
Without knowing anything more about the code, that would be my guess.
EDIT: Using Reflector on FlowDocument shows that Blocks returns a new instance each time it is called. Additionally, the Count property of BlockCollection is rather elaborate, so I would take a closer look at that. Unfortunately I don't know the involved types very well, so I can't immediately tell you what is wrong.
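As a contrived sketch of the kind of thing that can bite you here (not the actual FlowDocument code), a property that does work on every access can report different values through the very same reference:

using System;

class Document
{
    private int accessCount;

    // Each read has a side effect, so two consecutive reads differ
    // even though they go through the same object reference.
    public int BlockCount { get { return ++accessCount; } }
}

class Demo
{
    static void Main()
    {
        var d1 = new Document();
        var d2 = d1;                                 // same reference
        Console.WriteLine(ReferenceEquals(d1, d2));  // True
        Console.WriteLine(d1.BlockCount);            // 1
        Console.WriteLine(d2.BlockCount);            // 2
    }
}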
Possibilities (some of which you have already discounted in the comments):
Some external process, say something that is loading Blocks into FlowDocument, is altering the value between writes.
Heisenberg: reading the Blocks property affects it. This happens sometimes when reading rows from a data source. I'm not familiar with FlowDocument so I'm not sure how feasible this is.
If the instances were declared as different types, their references would still be equal, but the value of Blocks (or Blocks.Count) could be overridden, resulting in different return values since different code might be called - like Object.ToString() vs Int.ToString().
You're somehow calling this debug code in the middle of a loop. This could happen if you're running it in the command window or some attached debugger instead of within the application.
You have dead pixels on your screen that make the first "3" look like a "1".
You live next to a nuclear reactor.
Some things to try:
Run your .Assert code in a loop and see if the values stabilize.
Set a read/write breakpoint on the Blocks value. (I know you can do this in C, but haven't tried it in C#)
Update
Regarding your additional question about .Assert() not working as expected:
Just looked at this note on MSDN regarding Debug.Assert().
By default, the Debug.Assert method works only in debug builds. Use the Trace.Assert method if you want to do assertions in release builds. For more information, see Assertions in Managed Code.
Are you running a debug build or a release build?
Are you certain that the Blocks object reference points to the same object? Try a
Debug.Assert(ReferenceEquals(document1.Blocks, document2.Blocks));
and see if that succeeds.
If I do this I get a System.StackOverflowException:
private string abc = "";
public string Abc
{
get
{
return Abc; // Note the mistaken capitalization
}
}
I understand why -- the property is referencing itself, leading to an infinite loop. (See previous questions here and here).
What I'm wondering (and what I didn't see answered in those previous questions) is why doesn't the C# compiler catch this mistake? It checks for some other kinds of circular reference (classes inheriting from themselves, etc.), right? Is it just that this mistake wasn't common enough to be worth checking for? Or is there some situation I'm not thinking of, when you'd want a property to actually reference itself in this way?
You can see the "official" reason in the last comment here.
Posted by Microsoft on 14/11/2008 at 19:52

Thanks for the suggestion for Visual Studio!

You are right that we could easily detect property recursion, but we can't guarantee that there is nothing useful being accomplished by the recursion. The body of the property could set other fields on your object which change the behavior of the next recursion, could change its behavior based on user input from the console, or could even behave differently based on random values. In these cases, a self-recursive property could indeed terminate the recursion, but we have no way to determine if that's the case at compile-time (without solving the halting problem!).

For the reasons above (and the breaking change it would take to disallow this), we wouldn't be able to prohibit self-recursive properties.

Alex Turner
Program Manager
Visual C# Compiler
Another point in addition to Alex's explanation is that we try to give warnings for code which does something that you probably didn't intend, such that you could accidentally ship with the bug.
In this particular case, how much time would the warning actually save you? A single test run. You'll find this bug the moment you test the code, because it always immediately crashes and dies horribly. The warning wouldn't actually buy you much of a benefit here. The likelihood that there is some subtle bug in a recursive property evaluation is low.
By contrast, we do give a warning if you do something like this:
int customerId;
...
this.customerId = this.customerId;
There's no horrible crash-and-die, and the code is valid code; it assigns a value to a field. But since this is nonsensical code, you probably didn't mean to do it. Since it's not going to die horribly, we give a warning that there's something here that you probably didn't intend and might not otherwise discover via a crash.
Property referring to itself does not always lead to infinite recursion and stack overflow. For example, this works fine:
private int count = 0;

public string Abc
{
    get
    {
        count++;
        if (count < 3) return Abc;   // bounded recursion: recurses a couple of times, then stops
        return "Foo";
    }
}
The above is a dummy example, but I'm sure one could come up with useful recursive code along similar lines. The compiler cannot determine whether infinite recursion will happen (halting problem).
Generating a warning in the simple case would be helpful.
They probably considered that it would unnecessarily complicate the compiler without any real gain.
You will discover this typo easily the first time you call this property.
First of all, you'll get a warning for unused variable abc.
Second, there is nothing bad about the recursion, provided that it's not endless. For example, the code might adjust some inner variables and then call the same getter recursively. However, there is no easy way at all for the compiler to prove whether some recursion is endless or not (in general that's undecidable). The compiler could catch some easy cases, but then the consumers would be surprised that the more complicated cases get through the compiler's checks.
The other cases it checks for (except recursive constructors) are invalid IL.
In addition, all of those cases (even recursive constructors) are guaranteed to fail.
However, it is possible, albeit unlikely, to intentionally create a useful recursive property (using if statements).
I'm having a little bit of trouble understanding what the problem is here. I have a bit of code that pulls records from a database using LINQ and puts them into an object which is cast into an interface. It looks a bit like this:
public IEnumerable<ISomeObject> query()
{
    return from a in dc.SomeTable
           select new SomeObject
           {
               //Assign various members here
           } as ISomeObject;
}
When I test this, I put the returned IEnumerable into a variable called results and run this line:
Assert.AreEqual(EXPECTED_COUNT, results.Count());
When this is run, I get a System.Security.VerificationException: "Operation could destabilize the runtime."
I found the solution here, which is this:
var results = from a in dc.SomeTable
              select new SomeObject
              {
                  //Assign various members here
              } as ISomeObject;
return results.OfType<ISomeObject>();
This works, but I'm having trouble understanding what's happening here. Why did I get the exception in the first place and how did the lines of code above fix it? The MSDN documentation seems to suggest that this is an issue of type safety, but I'm not seeing where the previous code was type-unsafe.
UPDATE
A little bit more information I found out. The first example works if I make the return type IQueryable. This sheds a little bit more light on what was going wrong, but I'm still confused about the why. Why didn't the compiler force me to cast the IEnumerable into an IQueryable?
I believe it is an issue of covariance or contravariance as noted by this forum post.
See Covariance and Contravariance in C#, Part Two: Array Covariance and the rest of the Covariance and Contravariance series at Eric Lippert's blog.
Although he is dealing with arrays in the article I linked, I believe a similar problem presents itself here. With your first example, you are returning an IEnumerable that could contain objects that implement an interface larger than ISomeObject (i.e. you could put a Turtle into an IEnumerable of Animals when that IEnumerable may only contain Giraffes). I think the reason it works when you return IQueryable is that IQueryable is larger/wider than anything you could return, so you're guaranteed to be able to handle whatever comes back(?).
In the second example, OfType is ensuring that what gets returned is an object that stores all the information necessary to return only those elements that can be cast to Giraffe.
I'm pretty sure it has something to do with the issues of type safety outlined above, but as Eric Lippert says Higher Order Functions Hurt My Brain and I am having trouble expressing precisely why this is a co/contravariant issue.
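The array version of the problem Eric describes is easy to reproduce, and it's a useful mental model for why the verifier gets nervous (Animal/Giraffe/Turtle are stand-ins, as in the blog series):

using System;

class Animal { }
class Giraffe : Animal { }
class Turtle : Animal { }

class Demo
{
    static void Main()
    {
        Giraffe[] giraffes = new Giraffe[1];
        Animal[] animals = giraffes;   // legal: arrays are covariant...

        // ...but not safely so. This compiles, yet fails at runtime with
        // ArrayTypeMismatchException, because the array can really only hold Giraffes.
        animals[0] = new Turtle();
    }
}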
I found this entry while looking for my own solution to "operation could destabilize the runtime". While the covariance/contra-variance advice above looks very interesting, in the end I found that I get the same error message by running my unit tests with code coverage turned on and the AllowPartiallyTrustedCallers assembly attribute set.
Removing the AllowPartiallyTrustedCallers attribute caused my tests to run fine. I could also turn off code coverage to make them run but that was not an acceptable solution.
Hopefully this helps someone else who makes it to this page trying to find a solution to this issue.
Just a guess, but the as operator may return null - so it may have to do with the actual implementation of the new SomeObject { ... } code, since it's syntactic sugar. The return results.OfType<ISomeObject>(); filters based on type, so your method's return statement will only return that type (ensuring type safety). I've run into a similar issue with returning generic types.
P.S. I love the "Operation could destabilize the runtime." exception. That's almost like the "You might blow up the internet" exception.
I came across this error with similar code:
IEnumerable<Table> records = (from t in db.Tables
where t.Id.Equals(1)
select t).ToList();
This seemingly harmless code was part of a UserControl method which was called from a Page. There was no problem in a .NET 4 development environment; however, when the site was precompiled and deployed to a .NET 3.5 server, I got this error.
I suspect this has something to do with the control being compiled into a separate DLL, combined with the security changes between the frameworks as described in this .NET security blog.
My solution: run the live site on .NET 4.
I added the following to the web.config file, inside the <system.web> section, and the System.Security.VerificationException: "Operation could destabilize the runtime." no longer occurs for me:
<system.web>
  <trust level="Full" />
</system.web>
Does it still fail if you change this:
select new SomeObject { ... } as ISomeObject;
to this:
select (ISomeObject) new SomeObject { ... };
?
If so (as I see you've confirmed), perhaps this has to do with the fact that an interface implementation could be either a class or a struct? Does the problem still appear if you cast to an abstract class rather than an interface?
I found that OfType had some nasty side effects when using LINQ to SQL. For example, parts of the LINQ that were previously evaluated after the query was run against the db were instead translated to SQL. This failed because those sections had no SQL equivalent. I ended up using .Cast instead, which seems to solve the problem as well.
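For anyone unsure of the semantic difference between the two operators (separate from how LINQ to SQL chooses to translate them), here is a small LINQ to Objects sketch:

using System;
using System.Linq;

class Demo
{
    static void Main()
    {
        object[] items = { "a", 1, "b" };

        // OfType<T>() silently filters out elements that aren't of the requested type.
        Console.WriteLine(items.OfType<string>().Count());   // 2

        // Cast<T>() assumes every element already is a T and throws
        // InvalidCastException when it reaches the boxed int.
        Console.WriteLine(items.Cast<string>().Count());
    }
}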
In my case I had wrongly declared the Storage property in the Column attribute of a LINQ to SQL class:
[Column(Storage = "_Alias", DbType = "NVarChar(50)")]
public string UserAlias
I had the same problem, but with inheritance.
I defined a class in assembly A and a subclass in assembly B.
After I added the attribute below to assembly A, the problem was solved:
[assembly: SecurityRules(SecurityRuleSet.Level1, SkipVerificationInFullTrust = true)]
I ran into this error while using the "Dynamic data access framework" Passive library. The source of the error was line 100 in the DynamicDatabase.cs file.
databaseDetectors = (databaseDetectors ?? Enumerable.Empty<DatabaseDetector>()).DefaultIfEmpty(new DatabaseDetector());
I changed that line of code to:
databaseDetectors = (databaseDetectors ?? Enumerable.Empty<DatabaseDetector>()).DefaultIfEmpty(new DatabaseDetector()).OfType<IDatabaseDetector>();
Doing so resolved the problem. I went ahead and forked the project and submitted the change to the original author.
Thank you, Jason Baker, for pointing out the solution in your original question.
On a side note, the original library ran fine on my local machine and on a Rackspace VPS, but when I pushed the same code to a shared hosting environment (GoDaddy and Rackspace's Cloud Sites), I began getting the "Operation could destabilize the runtime" error.
I guess LINQ to SQL may not support that cast when translating to a SQL statement.