I have a HashSet<int[]> foo where the int[] represents the coordinates of a point in a plane. The value at position 0 represents the x and the value at position 1 represents the y. I want to override the Equals and GetHashCode methods to be able to remove an element (the point represented as an array of size two) if its internal values are equal to a given one.
Already tried:
public override int GetHashCode()
{
    return this.GetHashCode();
}

public override bool Equals(object obj)
{
    if (obj == null || !(obj is int[]))
        return false;
    HashSet<int[]> item = obj as HashSet<int[]>;
    return item == this;
}
In my class Maze.
Thanks in advance.
EDIT
I found a way to do that
class SameHash : EqualityComparer<int[]>
{
    public override bool Equals(int[] i1, int[] i2)
    {
        return i1[0] == i2[0] && i1[1] == i2[1];
    }

    public override int GetHashCode(int[] i)
    {
        return base.GetHashCode();
    }
}
It may seem like you solved what you asked for, but there is something important that should be pointed out. When you implemented the EqualityComparer<int[]>, you wrote GetHashCode(int[] i) as return base.GetHashCode();, which is not correct even though it works. I took the time to write the code below so you can see the results of your implementation, and I also give you a possible solution.
Copy this code and run it in a Console project. Comment out your line of code, uncomment the line right below it, and run it again. You will see the difference!
Summarizing: when you return base.GetHashCode() you return the same hash code for every item. This causes a collision on every insertion, so the hash set behaves as slowly as a List<int[]> that you check with Contains before every insert. That is why, using the function I provide and the range of numbers I'm generating, you can insert up to one million items in less than a second, while with yours, no matter the range, around ten thousand insertions already take a second. This happens because every one of the n insertions collides, giving a time complexity of O(n^2), while the expected complexity for a HashSet with an evenly distributed hash function is O(n).
Check this out:
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;

namespace hashExample
{
    class Program
    {
        static void Main(string[] args)
        {
            List<int[]> points = new List<int[]>();
            Random random = new Random();
            int toInsert = 20000;

            for (int i = 0; i < toInsert; i++)
            {
                int x = random.Next(1000);
                int y = random.Next(1000);
                points.Add(new int[] { x, y });
            }

            HashSet<int[]> set = new HashSet<int[]>(new SameHash());
            Stopwatch clock = new Stopwatch();

            clock.Start();
            foreach (var item in points)
            {
                set.Add(item);
            }
            clock.Stop();

            Console.WriteLine("Elements inserted: " + set.Count + "/" + toInsert);
            Console.WriteLine("Time taken: " + clock.ElapsedMilliseconds);
        }

        public class SameHash : EqualityComparer<int[]>
        {
            public override bool Equals(int[] p1, int[] p2)
            {
                return p1[0] == p2[0] && p1[1] == p2[1];
            }

            public override int GetHashCode(int[] i)
            {
                return base.GetHashCode();
                //return i[0] * 10000 + i[1];
                //Notice that this is a very basic implementation of a HashCode function
            }
        }
    }
}
The only way I found to do this properly was by creating a class MyPair instead of using an array (int[]) like you did. Notice that I used X * 10000 + Y in the GetHashCode() function, but you can change the constant to get a better hash code for every item, or you can create your own. I just provided this one as a simple example, and because it is an easy way of getting distinct hash codes when the bounds for X and Y are relatively small (less than the square root of int.MaxValue).
Here you have the working code:
using System;
using System.Collections.Generic;
using System.Linq;

namespace hash
{
    public class MyPair
    {
        public int X { get; set; }
        public int Y { get; set; }

        public override int GetHashCode()
        {
            return X * 10000 + Y;
        }

        public override bool Equals(object obj)
        {
            MyPair other = obj as MyPair;
            return other != null && X == other.X && Y == other.Y;
        }
    }

    class Program
    {
        static void Main(string[] args)
        {
            HashSet<MyPair> hash = new HashSet<MyPair>();

            MyPair one = new MyPair { X = 10, Y = 2 };
            MyPair two = new MyPair { X = 1, Y = 24 };
            MyPair three = new MyPair { X = 111, Y = 266 };
            MyPair copyOfOne = new MyPair { X = 10, Y = 2 };

            Console.WriteLine(hash.Add(one));       // True
            Console.WriteLine(hash.Add(two));       // True
            Console.WriteLine(hash.Add(three));     // True
            Console.WriteLine(hash.Add(copyOfOne)); // False: equal to "one"
        }
    }
}
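If the bounds on X and Y are not known in advance, a common alternative (just a sketch, any well-mixed combination works) is to combine the two values with prime multipliers inside MyPair.GetHashCode():

public override int GetHashCode()
{
    unchecked // overflow simply wraps, which is fine for hashing
    {
        int hash = 17;
        hash = hash * 23 + X;
        hash = hash * 23 + Y;
        return hash;
    }
}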
Related
Similar to List<> OrderBy Alphabetical Order, we want to sort by one element, then another. We want to achieve the functional equivalent of
SELECT * from Table ORDER BY x, y
We have a class that contains a number of sorting functions, and we have no issues sorting by one element.
For example:
public class MyClass {
    public int x;
    public int y;
}

List<MyClass> MyList;

public void SortList() {
    MyList.Sort( MySortingFunction );
}
And we have the following in the list:
Unsorted Sorted(x) Desired
--------- --------- ---------
ID x y ID x y ID x y
[0] 0 1 [2] 0 2 [0] 0 1
[1] 1 1 [0] 0 1 [2] 0 2
[2] 0 2 [1] 1 1 [1] 1 1
[3] 1 2 [3] 1 2 [3] 1 2
Stable sort would be preferable, but not required. Solution that works for .Net 2.0 is welcome.
For versions of .Net where you can use LINQ OrderBy and ThenBy (or ThenByDescending if needed):
using System.Linq;
....
List<SomeClass> a = ...; // your existing list
List<SomeClass> b = a.OrderBy(x => x.x).ThenBy(x => x.y).ToList();
Note: for .Net 2.0 (or if you can't use LINQ) see Hans Passant's answer to this question.
Do keep in mind that you don't need a stable sort if you compare all members. The 2.0 solution, as requested, can look like this:
public void SortList() {
    MyList.Sort(delegate(MyClass a, MyClass b)
    {
        int xdiff = a.x.CompareTo(b.x);
        if (xdiff != 0) return xdiff;
        else return a.y.CompareTo(b.y);
    });
}
Do note that this 2.0 solution is still preferable to the popular 3.5 LINQ solution: it performs an in-place sort and does not have the O(n) storage requirement of the LINQ approach. Unless you prefer the original List object to be untouched, of course.
The trick is to implement a stable sort. I've created a Widget class that can contain your test data:
public class Widget : IComparable
{
    int x;
    int y;

    public int X
    {
        get { return x; }
        set { x = value; }
    }

    public int Y
    {
        get { return y; }
        set { y = value; }
    }

    public Widget(int argx, int argy)
    {
        x = argx;
        y = argy;
    }

    public int CompareTo(object obj)
    {
        int result = 1;
        if (obj != null && obj is Widget)
        {
            Widget w = obj as Widget;
            result = this.X.CompareTo(w.X);
        }
        return result;
    }

    static public int Compare(Widget x, Widget y)
    {
        int result = 1;
        if (x != null && y != null)
        {
            result = x.CompareTo(y);
        }
        return result;
    }
}
I implemented IComparable, so it can be unstably sorted by List.Sort().
However, I also implemented the static method Compare, which can be passed as a delegate to a sort method.
I borrowed this insertion sort method from C# 411:
public static void InsertionSort<T>(IList<T> list, Comparison<T> comparison)
{
    int count = list.Count;
    for (int j = 1; j < count; j++)
    {
        T key = list[j];

        int i = j - 1;
        for (; i >= 0 && comparison(list[i], key) > 0; i--)
        {
            list[i + 1] = list[i];
        }
        list[i + 1] = key;
    }
}
You would put this in the sort helpers class that you mentioned in your question.
Now, to use it:
static void Main(string[] args)
{
    List<Widget> widgets = new List<Widget>();
    widgets.Add(new Widget(0, 1));
    widgets.Add(new Widget(1, 1));
    widgets.Add(new Widget(0, 2));
    widgets.Add(new Widget(1, 2));

    InsertionSort<Widget>(widgets, Widget.Compare);

    foreach (Widget w in widgets)
    {
        Console.WriteLine(w.X + ":" + w.Y);
    }
}
And it outputs:
0:1
0:2
1:1
1:2
Press any key to continue . . .
This could probably be cleaned up with some anonymous delegates, but I'll leave that up to you.
EDIT: And NoBugz demonstrates the power of anonymous methods...so, consider mine more oldschool :P
This may help you,
How to Sort C# Generic List
I had an issue where OrderBy and ThenBy did not give me the desired result (or I just didn't know how to use them correctly).
I went with a list.Sort solution, something like this:
var data = (from o in database.Orders
            where o.ClientId.Equals(clientId)
            select new {
                OrderId = o.id,
                OrderDate = o.orderDate,
                OrderBoolean = (SomeClass.SomeFunction(o.orderBoolean) ? 1 : 0)
            }).ToList();

data.Sort((o1, o2) => o2.OrderBoolean.CompareTo(o1.OrderBoolean) != 0
    ? o2.OrderBoolean.CompareTo(o1.OrderBoolean)
    : o1.OrderDate.Value.CompareTo(o2.OrderDate.Value));
I have a double[] array and I want to use it as a key (not literally, but in the sense that a key matches only when all the doubles in the array match).
What is the fastest way to use the double[] array as a dictionary key?
Is it to use a Dictionary<string, string> (converting the double[] to a string), or is there something better?
Given that all key arrays will have the same length, either consider using a Tuple<,,, ... ,>, or use a structural equality comparer on the arrays.
With tuple:
var yourDict = new Dictionary<Tuple<double, double, double>, string>();
yourDict.Add(Tuple.Create(3.14, 2.718, double.NaN), "da value");
string read = yourDict[Tuple.Create(3.14, 2.718, double.NaN)];
With (strongly typed version of) StructuralEqualityComparer:
class DoubleArrayStructuralEqualityComparer : EqualityComparer<double[]>
{
    public override bool Equals(double[] x, double[] y)
    {
        return System.Collections.StructuralComparisons.StructuralEqualityComparer
            .Equals(x, y);
    }

    public override int GetHashCode(double[] obj)
    {
        return System.Collections.StructuralComparisons.StructuralEqualityComparer
            .GetHashCode(obj);
    }
}
...
var yourDict = new Dictionary<double[], string>(
new DoubleArrayStructuralEqualityComparer());
yourDict.Add(new[] { 3.14, 2.718, double.NaN, }, "da value");
string read = yourDict[new[] { 3.14, 2.718, double.NaN, }];
Also consider the suggestion by Sergey Berezovskiy to create a custom class or (immutable!) struct to hold your set of doubles. In that way you can name your type and its members in a natural way that makes it more clear what you do. And your class/struct can easily be extended later on, if needed.
Since all arrays have the same length and each item in the array has a specific meaning, create a class which holds the items as properties with descriptive names. E.g. instead of a double array with two items you can have a class Point with properties X and Y. Then override Equals and GetHashCode of this class and use it as the key (see What is the best algorithm for an overriding GetHashCode):
Dictionary<Point, string>
Benefits: instead of an array, you have a data structure whose purpose is clear. Instead of referencing items by index, you have nicely named properties, which also makes their purpose clear. And also speed: calculating the hash code is fast. Compare:
double[] a = new[] { 12.5, 42 };
// getting the first coordinate: a[0]

Point a = new Point { X = 12.5, Y = 42 };
// getting the first coordinate: a.X
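A minimal sketch of such a Point class, with Equals and GetHashCode overridden as described (the hash formula here is just one reasonable choice; see the linked question for alternatives):

public class Point
{
    public double X { get; set; }
    public double Y { get; set; }

    public override bool Equals(object obj)
    {
        Point other = obj as Point;
        return other != null && X == other.X && Y == other.Y;
    }

    public override int GetHashCode()
    {
        unchecked // overflow simply wraps
        {
            return (X.GetHashCode() * 397) ^ Y.GetHashCode();
        }
    }
}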
[Do not consider this a separate answer; this is an extension of #JeppeStigNielsen's answer]
I'd just like to point out that you can make Jeppe's approach generic as follows:
public class StructuralEqualityComparer<T> : IEqualityComparer<T>
{
    public bool Equals(T x, T y)
    {
        return StructuralComparisons.StructuralEqualityComparer.Equals(x, y);
    }

    public int GetHashCode(T obj)
    {
        return StructuralComparisons.StructuralEqualityComparer.GetHashCode(obj);
    }

    public static StructuralEqualityComparer<T> Default
    {
        get
        {
            StructuralEqualityComparer<T> comparer = _defaultComparer;
            if (comparer == null)
            {
                comparer = new StructuralEqualityComparer<T>();
                _defaultComparer = comparer;
            }
            return comparer;
        }
    }

    private static StructuralEqualityComparer<T> _defaultComparer;
}
(From an original answer here: https://stackoverflow.com/a/5601068/106159)
Then you would declare the dictionary like this:
var yourDict = new Dictionary<double[], string>(new StructuralEqualityComparer<double[]>());
Note: It might be better to initialise _defaultComparer using Lazy<T>.
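For reference, a sketch of what that Lazy<T> variant could look like (same class as above, only the Default plumbing changes):

public class StructuralEqualityComparer<T> : IEqualityComparer<T>
{
    // Lazy<T> is thread-safe by default and defers creation until first use.
    private static readonly Lazy<StructuralEqualityComparer<T>> _defaultComparer =
        new Lazy<StructuralEqualityComparer<T>>(() => new StructuralEqualityComparer<T>());

    public static StructuralEqualityComparer<T> Default
    {
        get { return _defaultComparer.Value; }
    }

    public bool Equals(T x, T y)
    {
        return StructuralComparisons.StructuralEqualityComparer.Equals(x, y);
    }

    public int GetHashCode(T obj)
    {
        return StructuralComparisons.StructuralEqualityComparer.GetHashCode(obj);
    }
}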
[EDIT]
It's possible that this might be faster; worth a try:
class DoubleArrayComparer : IEqualityComparer<double[]>
{
    public bool Equals(double[] x, double[] y)
    {
        if (x == y)
            return true;
        if (x == null || y == null)
            return false;
        if (x.Length != y.Length)
            return false;
        for (int i = 0; i < x.Length; ++i)
            if (x[i] != y[i])
                return false;
        return true;
    }

    public int GetHashCode(double[] data)
    {
        if (data == null)
            return 0;
        int result = 17;
        foreach (var value in data)
            result += result * 23 + value.GetHashCode();
        return result;
    }
}
...
var yourDict = new Dictionary<double[], string>(new DoubleArrayComparer());
OK, this is what I found so far. I insert one entry (a length-4 array) into the dictionary and access it 999999 times on my machine:
Dictionary<double[], string> with the DoubleArrayStructuralEqualityComparer takes 1.75 seconds.
Dictionary<Tuple<double, ...>, string> takes 0.85 seconds.
The code below takes 0.1755285 seconds (in line with Sergey's comment).
The fastest: the DoubleArrayComparer by Matthew Watson takes 0.15 seconds!
public class DoubleArray
{
    private double[] d = null;

    public DoubleArray(double[] d)
    {
        this.d = d;
    }

    public override bool Equals(object obj)
    {
        if (!(obj is DoubleArray)) return false;
        DoubleArray dobj = (DoubleArray)obj;
        if (dobj.d.Length != d.Length) return false;
        for (int i = 0; i < d.Length; i++)
        {
            if (dobj.d[i] != d[i]) return false;
        }
        return true;
    }

    public override int GetHashCode()
    {
        unchecked // Overflow is fine, just wrap
        {
            int hash = 17;
            for (int i = 0; i < d.Length; i++)
            {
                hash = hash * 23 + d[i].GetHashCode();
            }
            return hash;
        }
    }
}
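For reference, a usage sketch of that wrapper as a dictionary key (the values are arbitrary):

var lookup = new Dictionary<DoubleArray, string>();
lookup.Add(new DoubleArray(new[] { 3.14, 2.718, 1.0, 0.5 }), "da value");

// The lookup goes through DoubleArray.Equals/GetHashCode, so a different
// array instance with the same values finds the entry.
string read = lookup[new DoubleArray(new[] { 3.14, 2.718, 1.0, 0.5 })];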
I have a struct called "Complex" in my project (built in C#) and, as the name implies, it's a struct for complex numbers. The struct has a built-in method called "Modulus" so that I can calculate the modulus of a complex number. Things are quite easy up to now.
The thing is, I create an array of this struct and I want to sort the array according to the modulus of the complex numbers it contains (greater to smaller). Is there a way to do that? (Any algorithm suggestions are welcome.)
Thank you!!
Complex[] complexArray = ...
Complex[] sortedArray = complexArray.OrderByDescending(c => c.Modulus()).ToArray();
First of all, you can increase performance by comparing squared moduli instead of moduli.
You don't need the square root: "sqrt( a * a + b * b ) >= sqrt( c * c + d * d )" is equivalent to "a * a + b * b >= c * c + d * d".
Then, you can write a comparer to sort complex numbers.
public class ComplexModulusComparer :
    IComparer<Complex>,
    IComparer
{
    public static readonly ComplexModulusComparer Default = new ComplexModulusComparer();

    public int Compare(Complex a, Complex b)
    {
        return a.ModulusSquared().CompareTo(b.ModulusSquared());
    }

    int IComparer.Compare(object a, object b)
    {
        return ((Complex)a).ModulusSquared().CompareTo(((Complex)b).ModulusSquared());
    }
}
You can also write the reverse comparer, since you want to go from greater to smaller.
public class ComplexModulusReverseComparer :
    IComparer<Complex>,
    IComparer
{
    public static readonly ComplexModulusReverseComparer Default = new ComplexModulusReverseComparer();

    public int Compare(Complex a, Complex b)
    {
        return -a.ModulusSquared().CompareTo(b.ModulusSquared());
    }

    int IComparer.Compare(object a, object b)
    {
        return -((Complex)a).ModulusSquared().CompareTo(((Complex)b).ModulusSquared());
    }
}
To sort an array you can then write two nice extension methods...
public static void SortByModulus(this Complex[] array)
{
    Array.Sort(array, ComplexModulusComparer.Default);
}

public static void SortReverseByModulus(this Complex[] array)
{
    Array.Sort(array, ComplexModulusReverseComparer.Default);
}
Then in your code...
Complex[] myArray ...;
myArray.SortReverseByModulus();
You can also implement IComparable, if you wish, but from my point of view a more correct and formal approach is to use an IComparer.
public struct Complex : IComparable<Complex>
{
    public double R;
    public double I;

    public double Modulus() { return Math.Sqrt(R * R + I * I); }
    public double ModulusSquared() { return R * R + I * I; }

    public int CompareTo(Complex other)
    {
        return this.ModulusSquared().CompareTo(other.ModulusSquared());
    }
}
And then you can write a ReverseComparer that can be applied to every kind of comparer:
public class ReverseComparer<T> :
    IComparer<T>
{
    private IComparer<T> comparer;

    public static readonly ReverseComparer<T> Default = new ReverseComparer<T>();

    public ReverseComparer() :
        this(Comparer<T>.Default)
    {
    }

    public ReverseComparer(IComparer<T> comparer)
    {
        this.comparer = comparer;
    }

    public int Compare(T a, T b)
    {
        return -this.comparer.Compare(a, b);
    }
}
Then when you need to sort....
Complex[] array ...;
Array.Sort(array, ReverseComparer<Complex>.Default);
or in case you have another IComparer...
Complex[] array ...;
Array.Sort(array, new ReverseComparer<Complex>(myothercomparer));
RE-EDIT:
OK, I performed some speed tests.
Compiled with C# 4.0, in release mode, launched with all instances of Visual Studio closed.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Diagnostics;

namespace TestComplex
{
    class Program
    {
        public struct Complex
        {
            public double R;
            public double I;

            public double ModulusSquared()
            {
                return this.R * this.R + this.I * this.I;
            }
        }

        public class ComplexComparer :
            IComparer<Complex>
        {
            public static readonly ComplexComparer Default = new ComplexComparer();

            public int Compare(Complex x, Complex y)
            {
                return x.ModulusSquared().CompareTo(y.ModulusSquared());
            }
        }

        private static void RandomComplexArray(Complex[] myArray)
        {
            // We use always the same seed to avoid differences in quicksort.
            Random r = new Random(2323);
            for (int i = 0; i < myArray.Length; ++i)
            {
                myArray[i].R = r.NextDouble() * 10;
                myArray[i].I = r.NextDouble() * 10;
            }
        }

        static void Main(string[] args)
        {
            // We perform some first operations to ensure the JIT compiled and optimized everything before running the real test.
            Stopwatch sw = new Stopwatch();
            Complex[] tmp = new Complex[2];
            for (int repeat = 0; repeat < 10; ++repeat)
            {
                sw.Start();
                tmp[0] = new Complex() { R = 10, I = 20 };
                tmp[1] = new Complex() { R = 30, I = 50 };
                ComplexComparer.Default.Compare(tmp[0], tmp[1]);
                tmp.OrderByDescending(c => c.ModulusSquared()).ToArray();
                sw.Stop();
            }

            int[] testSizes = new int[] { 5, 100, 1000, 100000, 250000, 1000000 };
            for (int testSizeIdx = 0; testSizeIdx < testSizes.Length; ++testSizeIdx)
            {
                Console.WriteLine("For " + testSizes[testSizeIdx].ToString() + " input ...");

                // We create our big array
                Complex[] myArray = new Complex[testSizes[testSizeIdx]];

                double bestTime = double.MaxValue;

                // Now we execute repeatCount times our test.
                const int repeatCount = 15;
                for (int repeat = 0; repeat < repeatCount; ++repeat)
                {
                    // We fill our array with random data
                    RandomComplexArray(myArray);

                    // Now we perform our sorting.
                    sw.Reset();
                    sw.Start();
                    Array.Sort(myArray, ComplexComparer.Default);
                    sw.Stop();

                    double elapsed = sw.Elapsed.TotalMilliseconds;
                    if (elapsed < bestTime)
                        bestTime = elapsed;
                }

                Console.WriteLine("Array.Sort best time is " + bestTime.ToString());

                // Now we perform our test using linq
                bestTime = double.MaxValue; // i forgot this before

                for (int repeat = 0; repeat < repeatCount; ++repeat)
                {
                    // We fill our array with random data
                    RandomComplexArray(myArray);

                    // Now we perform our sorting.
                    sw.Reset();
                    sw.Start();
                    myArray = myArray.OrderByDescending(c => c.ModulusSquared()).ToArray();
                    sw.Stop();

                    double elapsed = sw.Elapsed.TotalMilliseconds;
                    if (elapsed < bestTime)
                        bestTime = elapsed;
                }

                Console.WriteLine("linq best time is " + bestTime.ToString());
                Console.WriteLine();
            }

            Console.WriteLine("Press enter to quit.");
            Console.ReadLine();
        }
    }
}
And here are the results:
For 5 input ...
Array.Sort best time is 0,0004
linq best time is 0,0018
For 100 input ...
Array.Sort best time is 0,0267
linq best time is 0,0298
For 1000 input ...
Array.Sort best time is 0,3568
linq best time is 0,4107
For 100000 input ...
Array.Sort best time is 57,3536
linq best time is 64,0196
For 250000 input ...
Array.Sort best time is 157,8832
linq best time is 194,3723
For 1000000 input ...
Array.Sort best time is 692,8211
linq best time is 1058,3259
Press enter to quit.
My machine is an Intel i5, 64-bit Windows 7.
Sorry! I made a small, stupid bug in the previous edit!
ARRAY.SORT OUTPERFORMS LINQ, yes, by a very small amount, but as suspected this amount grows with n, in a seemingly not-so-linear way. It looks like both code overhead and a memory problem (cache misses, object allocation, GC... I don't know).
You can always use SortedList :) Assuming modulus is int:
var complexNumbers = new SortedList<int, Complex>();
complexNumbers.Add(number.Modulus(), number);
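Note that SortedList<TKey, TValue> requires unique keys, so Add throws an ArgumentException if two numbers share the same modulus (and the list is kept in ascending key order). A minimal sketch that buckets numbers with equal modulus instead; numbers here is a placeholder for your own collection:

var complexNumbers = new SortedList<int, List<Complex>>();
foreach (Complex number in numbers)
{
    int key = number.Modulus(); // assuming the modulus is an int, as above
    List<Complex> bucket;
    if (!complexNumbers.TryGetValue(key, out bucket))
    {
        bucket = new List<Complex>();
        complexNumbers.Add(key, bucket);
    }
    bucket.Add(number); // equal moduli share a bucket instead of throwing
}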
public struct Complex : IComparable<Complex>
{
    // complex rectangular number: a + bi
    public decimal A;
    public decimal B;

    // synonymous with absolute value, or in geometric terms, distance
    public decimal Modulus() { ... }

    // CompareTo() is the default comparison used by most built-in sorts;
    // all we have to do here is pass through to Decimal's IComparable implementation
    // via the results of the Modulus() methods
    public int CompareTo(Complex other) { return this.Modulus().CompareTo(other.Modulus()); }
}
You can now use any sorting method you choose on any collection of Complex instances: Array.Sort(), List.Sort(), Enumerable.OrderBy() (it doesn't use your IComparable, but if Complex were a member of a containing class you could sort the containing class by its Complex member without having to go the extra level down to comparing moduli), etc.
You stated you wanted to sort in descending order; you might consider multiplying the result of the Modulus() comparison by -1 before returning it. However, I would caution against this because it may be confusing: you would have to use a method that normally gives you descending order to get the list in ascending order. Instead, most sorting methods allow you to specify either a sort direction or a custom comparison, which can still make use of the IComparable implementation:
// This will use your comparison, but reverse the sort order based on its result
myEnumerableOfComplex.OrderByDescending(c => c);

// This explicitly negates your comparison; you can also use b.CompareTo(a),
// which is equivalent
myListOfComplex.Sort((a, b) => a.CompareTo(b) * -1);

// DataGridView objects use a SortDirection enumeration to control and report
// sort order
myGridViewOfComplex.Sort(myGridViewOfComplex.Columns["ComplexColumn"], ListSortDirection.Descending);
I'm new to C#, have looked at numerous posts but am still confused.
I have an array list:
List<Array> moves = new List<Array>();
I'm adding moves to it using the following:
string[] newmove = { piece, axis.ToString(), direction.ToString() };
moves.Add(newmove);
And now I wish to remove duplicates using the following:
moves = moves.Distinct();
However it's not letting me do it. I get this error:
Cannot implicitly convert type 'System.Collections.Generic.IEnumerable' to 'System.Collections.Generic.List'. An explicit conversion exists (are you missing a cast?)
Help please? I'd be so grateful.
Steve
You need to call .ToList() after the .Distinct method, as it returns IEnumerable<T>. I would also recommend using a strongly typed List<string[]> instead of List<Array>:
List<string[]> moves = new List<string[]>();
string[] newmove = { piece, axis.ToString(), direction.ToString() };
moves.Add(newmove);
moves.Add(newmove);
moves = moves.Distinct().ToList();
// At this stage moves.Count = 1
Your code has two errors. The first is the missing call to ToList, as already pointed out. The second is subtle: Distinct compares objects by identity, but your duplicate list items are different array instances.
There are multiple solutions for that problem.
Use a custom equality comparer in moves.Distinct().ToList(). No further changes necessary.
Sample implementation:
class ArrayEqualityComparer<T> : EqualityComparer<T[]> {
    public override bool Equals(T[] x, T[] y) {
        if ( x == null ) return y == null;
        else if ( y == null ) return false;
        return x.SequenceEqual(y);
    }
    public override int GetHashCode(T[] obj) {
        if ( obj == null ) return 0;
        return obj.Aggregate(0, (hash, x) => hash ^ x.GetHashCode());
    }
}
Filtering for unique items:
moves = moves.Distinct(new ArrayEqualityComparer<string>()).ToList();
Use Tuple<string,string,string> instead of string[]. Tuple offers built-in structural equality and comparison. This variant might make your code cluttered because of the long type name.
Instantiation:
List<Tuple<string, string, string>> moves =
new List<Tuple<string, string, string>>();
Adding new moves:
Tuple<string, string, string> newmove =
Tuple.Create(piece, axis.ToString(), direction.ToString());
moves.Add(newmove);
Filtering for unique items:
moves = moves.Distinct().ToList();
Use a custom class to hold your three values. I'd actually recommend this variant, because it makes all your code dealing with moves much more readable.
Sample implementation:
class Move {
    public Move(string piece, string axis, string direction) {
        Piece = piece;
        Axis = axis;
        Direction = direction;
    }
    public string Piece { get; private set; }
    public string Axis { get; private set; }
    public string Direction { get; private set; }
    public override bool Equals(object obj) {
        Move other = obj as Move;
        if ( other != null )
            return Piece == other.Piece &&
                   Axis == other.Axis &&
                   Direction == other.Direction;
        return false;
    }
    public override int GetHashCode() {
        return Piece.GetHashCode() ^
               Axis.GetHashCode() ^
               Direction.GetHashCode();
    }
    // TODO: override ToString() as well
}
Instantiation:
List<Move> moves = new List<Move>();
Adding new moves:
Move newmove = new Move(piece, axis.ToString(), direction.ToString());
moves.Add(newmove);
Filtering for unique items:
moves = moves.Distinct().ToList();
The compiler error is because you need to convert the result to a list:
moves = moves.Distinct().ToList();
However it probably won't work as you want, because arrays don't have Equals defined in the way that you are hoping (it compares the references of the array objects, not the values inside the array). Instead of using an array, create a class to hold your data and define Equals and GetHashCode to compare the values.
Old question, but this is an O(n) solution using O(1) additional space:
public static void RemoveDuplicates(string[] array)
{
    int c = 0;
    int i = -1;
    for (int n = 1; n < array.Length; n++)
    {
        if (array[c] == array[n])
        {
            if (i == -1)
            {
                i = n;
            }
        }
        else
        {
            if (i == -1)
            {
                c++;
            }
            else
            {
                array[i] = array[n];
                c++;
                i++;
            }
        }
    }
}