I'm trying to do my own implementation of the knapsack algorithm, and I was wondering if it could be done without the multidimensional arrays that most examples online use.
The idea wasn't to make the most optimal version; I merely used the approaches I felt could get the job done (lists instead of int[]).
I feel like I'm close to getting it, but not quite there yet, and I'm not sure what I'm doing wrong.
The code below is fed by a driver function that parses, say, the following input (weight:value):
Max
50
10:60,20:100,30:120
This should print 220, along with the subset that produces it.
CheckSum returns true if the weight, when added to the running total, stays less than or equal to the max weight.
Admittedly, half the code is merely getting and placing the values...
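CheckSum itself isn't shown; judging from that description and the call CheckSum(summationWeight, weight2, W) below, it is presumably something along these lines:
public static bool CheckSum(List<int> weights, int newWeight, int maxWeight)
{
    // Hypothetical reconstruction: true if adding the new weight keeps the total within the limit.
    return weights.Sum() + newWeight <= maxWeight;
}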
public static bool Knapsack(int W, Dictionary<int, Tuple<int,int>> Items, List<int> numbers)
{
List<int> summationValue;
List<int> subset;
List<int> summationWeight;
List<KeyValuePair<int, List<int>>> finalSet = new List<KeyValuePair<int, List<int>>>();
for (int auto = 0; auto <= Items.Count-1; auto++)
{
Console.WriteLine("Iteration: " + auto);
int value1 = Items[auto].Item2;
int weight1 = Items[auto].Item1;
summationValue = new List<int> { value1 };
subset = new List<int> { (auto+1) };
summationWeight = new List<int> { weight1 };
//Initialize//var currentItem = itemIndex == 0 ? null : items[itemIndex - 1];
for (int other_nodes = 0; other_nodes <= Items.Count-1; other_nodes++)
{
if (other_nodes == auto)
{
other_nodes++;
}
int value2 = Items[other_nodes].Item2;
int weight2 = Items[other_nodes].Item1;
if (CheckSum(summationWeight, weight2, W))
{
summationValue.Add(value2);
subset.Add(other_nodes+1);
summationWeight.Add(weight2);
KeyValuePair<int, List<int>> kv = new KeyValuePair<int, List<int>>(summationValue.Sum(), subset);
finalSet.Add(kv);
break;
}
}
//After all item iteration, print outputs; [Highest value] [Subset] [Weight]
string printValue = summationValue.Sum().ToString();
subset.Sort();
string printWeight = summationWeight.Sum().ToString();
Console.WriteLine("Highest Value: ");
Console.WriteLine(printValue);
Console.WriteLine("Subsets: \n");
Console.Write("{ ");
foreach (var i in subset)
{
Console.Write(i + " ");
}
Console.WriteLine("}");
Console.WriteLine("Highest Weight: ");
Console.WriteLine(printWeight);
}
Console.WriteLine("[========================]");
var pair = finalSet.OrderByDescending(x => x.Key).First();
var keys = finalSet.Where(kvp => kvp.Key == pair.Key).Select(kvp => kvp.Key).ToArray();
var fkey = keys.Max();
Console.WriteLine(fkey);
Console.Write("{ ");
foreach (var i in pair.Value)
{
Console.Write(i + " ");
}
Console.Write("}");
return true;
}
Not included is the LINQ to get the largest key in the list, which corresponds to the greatest value; something like:
finalSet.OrderByDescending(x => x.Key).First();
I know that this isn't the best way to go about it, but I'd still like to ask what I'm doing wrong in the implementation.
Currently it prints 180 (with subset 1 3) for the final set, when it should have been 220.
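For reference, the classic 0/1 knapsack can be done without a multidimensional array by keeping a single one-dimensional table of best values per capacity. A minimal sketch (plain arrays instead of the Dictionary/Tuple layout above; it returns only the best value, not the subset):
public static int Knapsack1D(int W, int[] weights, int[] values)
{
    var best = new int[W + 1];                    // best[w] = best value achievable with capacity w
    for (int i = 0; i < weights.Length; i++)
        for (int w = W; w >= weights[i]; w--)     // walk capacity downward so each item is used at most once
            best[w] = Math.Max(best[w], best[w - weights[i]] + values[i]);
    return best[W];                               // 220 for W = 50 and items 10:60, 20:100, 30:120
}
Recovering the subset needs extra bookkeeping, e.g. remembering which item last improved each capacity.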
I have a list of something.
public List<Objects> obj;
The objects in this list need to be added to these other lists.
public List<Objects> objGroup1, objGroup2, objGroup3, objGroup4;
I assign them right now by doing this.
void AssignToGroups()
{
for(int i = 0; i < obj.Count ; i++)
{
//Need the first 4 for group 1 next 4 for group 2 and so on...
if(i < 4)
{
objGroup1.Add(obj[i]);
}
else if(i >= 4 && i < 8)
{
objGroup2.Add(obj[i]);
}
else if (i >= 8 && i < 12)
{
objGroup3.Add(obj[i]);
}
else if (i >= 12 && i < 16)
{
objGroup4.Add(obj[i]);
}
}
}
I'm planning on expanding, and my current method for grouping objects will fill my screen with endless if/else statements.
Four objects need to be assigned to each group.
Each group gets its objects in their order of arrangement,
e.g. group 1 gets obj 1-4, group 2 gets obj 5-8, and so on...
Does anyone have a better method of grouping objects like this?
You can utilize the Skip and Take methods.
You'll need the using System.Linq;:
objGroup1 = obj.Take(4).ToList(); //edited: use ToList() to keep the list format
objGroup2 = obj.Skip(4).Take(4).ToList();
objGroup3 = obj.Skip(8).Take(4).ToList();
objGroup4 = obj.Skip(12).Take(4).ToList();
objGroup5 = obj.Skip(16).Take(4).ToList();
Let me know if it works, since I am not able to test it now, except for the syntax.
You can also order the obj before Take(), such as:
var orderedobj = obj.OrderBy(i => "some order criteria").ToList();
objGroup1 = orderedobj.Take(4).ToList();
...
I referenced my answer on How to get first N elements of a list in C#?.
EDIT:
In case you somehow do not want to use LINQ, you can also use GetRange:
objGroup1 = obj.GetRange(0, 4); // GetRange(index, count): the index is zero-based
objGroup2 = obj.GetRange(4, 4);
objGroup3 = obj.GetRange(8, 4); // count stays the same since we always want 4 elements
objGroup4 = obj.GetRange(12, 4);
objGroup5 = obj.GetRange(16, 4);
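On .NET 6 or later you could also use Enumerable.Chunk, which splits a sequence into consecutive groups of a given size; a sketch, assuming you can target that runtime:
var chunks = obj.Chunk(4).Select(c => c.ToList()).ToList(); // each chunk holds up to 4 items
objGroup1 = chunks[0];
objGroup2 = chunks[1]; // and so on; the last chunk may have fewer than 4 items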
Using Keyur's excellent answer, you could create a method that will create the groups for you, based on any source list, with a configurable group size:
private static List<List<object>> AssignToGroups(List<object> source, int groupSize)
{
var groups = new List<List<object>>();
if (source == null || groupSize < 1) return groups;
for (int i = 0; i < (source.Count + groupSize - 1) / groupSize; i++) // round up so leftover items form a final, smaller group
{
groups.Add(source.Skip(groupSize * i).Take(groupSize).ToList());
}
return groups;
}
Usage
private static void Main()
{
var mainList = new List<object>
{
"one", "two", "three", "four","five",
"six","seven","eight","nine","ten",
"eleven", "twelve", "thirteen", "fourteen","fifteen",
"sixteen","seventeen","eightteen","nineteen","twenty",
"twentyone", "twentytwo", "twentythree", "twentyfour","twentyfive",
"twentysix","twentyseven","twentyeight","twentynine","thirty",
"thirtyone", "thirtytwo", "thirtythree", "thirtyfour","thirtyfive",
"thirtysix","thirtyseven","thirtyeight","thirtynine","forty",
};
var groups = AssignToGroups(mainList, 4);
for (var i = 0; i < groups.Count; i++)
{
Console.WriteLine($"Group #{i + 1}: {string.Join(", ", groups[i])}");
}
Console.WriteLine("\nDone!\nPress any key to exit...");
Console.ReadKey();
}
Output
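With the 40 sample items and a group size of 4, the program prints:
Group #1: one, two, three, four
Group #2: five, six, seven, eight
Group #3: nine, ten, eleven, twelve
Group #4: thirteen, fourteen, fifteen, sixteen
Group #5: seventeen, eightteen, nineteen, twenty
Group #6: twentyone, twentytwo, twentythree, twentyfour
Group #7: twentyfive, twentysix, twentyseven, twentyeight
Group #8: twentynine, thirty, thirtyone, thirtytwo
Group #9: thirtythree, thirtyfour, thirtyfive, thirtysix
Group #10: thirtyseven, thirtyeight, thirtynine, forty

Done!
Press any key to exit...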
Question: How do I distribute items across a number of lists? The following are examples:
Imagine I have 8 lists and 5 items. In this case, lists 1 to 5 will each have 1 item; the rest of the lists remain empty.
Now if I have 8 lists and 16 items, each list will have 2 items.
If I have 8 lists and 11 items, lists 1 to 3 will have two items and the rest of the lists will have 1 item.
var items = new List<object>();
var containers = new List<List<object>>();
int c = -1; // indexer for container.
for(int i = 0; i < items.Count; i++)
{
// distribute items to containers
if (i % (items.Count/containers.Count) == 0) c++; // this is wrong. when to increment?
containers[c].Add(items[i]);
}
I'm pretty sure only the if statement is wrong. It's getting confusing how to handle i.
Try this:
static void Main(string[] args)
{
var items = new List<object>() {1, 2, 3, 4, 5, 6};
var containers = new List<List<object>>() { new List<object>(), new List<object>(), new List<object>()};
int c = 0; // indexer for container.
int containerCount = containers.Count;
for (int i = 0; i < items.Count; i++, c++)
{
c = c % containerCount;
containers[c].Add(items[i]);
}
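// With the six items and three containers above, the containers end up as {1, 4}, {2, 5}, {3, 6}.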
}
Think about the algorithm. When inserting an item, you insert it into a list and then move on to the next one. When you reach the last list, start inserting into the first again. Code (written in Notepad, so it may not compile):
var items = new List<object>();
var containers = new List<List<object>>();
int c = 0;
for (int i = 0; i < items.Count; i++)
{
    containers[c].Add(items[i]);
    c++; // move to the next container
    // when we reach the end, wrap around and insert into the first list again
    if (c == containers.Count)
    {
        c = 0;
    }
}
This can be made a bit shorter:
var items = new List<object>();
var containers = new List<List<object>>();
int c = 0;
foreach (var item in items)
{
    containers[c].Add(item);
    c = (c == containers.Count - 1) ? 0 : c + 1;
}
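If you prefer LINQ, the same round-robin distribution can be written by grouping on the index modulo the container count; a sketch, assuming the containers list is already populated with empty lists:
var grouped = items.Select((item, index) => new { item, index })
                   .GroupBy(x => x.index % containers.Count, x => x.item);
foreach (var g in grouped)
    containers[g.Key].AddRange(g); // key 0 fills the first container, key 1 the second, and so on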
If you want to isolate this behavior into something reusable and testable:
public class ListBalancer
{
public void BalanceItemsBetweenLists<T>(
IEnumerable<T> input,
IEnumerable<IList<T>> targets)
{
var inputArray = input as T[] ?? input.ToArray();
var targetArray = targets as IList<T>[] ?? targets.ToArray();
var currentTargetIndex = 0;
foreach (var item in inputArray)
{
targetArray[currentTargetIndex].Add(item);
currentTargetIndex++;
if (currentTargetIndex == targetArray.Length) currentTargetIndex = 0;
}
}
}
[TestClass]
public class ListBalancerTests
{
[TestMethod]
public void BalancesListsWhenAddingItems()
{
var source = Enumerable.Range(1, 11);
var targets = Enumerable.Range(1, 8).Select(n => new List<int>()).ToArray();
new ListBalancer().BalanceItemsBetweenLists(source, targets);
Assert.AreEqual(2, targets[0].Count);
Assert.AreEqual(2, targets[1].Count);
Assert.AreEqual(2, targets[2].Count);
Assert.AreEqual(1, targets[3].Count);
}
}
It seems like a little bit of extra work. But you probably found that the process of debugging when it didn't do what was expected took a little extra time, too. It might have been necessary to start a console application or some other app to test the behavior. If you write a class with a unit test you might still have to debug, but it's faster and more self-contained. You finish the one class with the one behavior, test it, and then move on.
Personally I found that once I formed the habit I could write code a little faster and with fewer bugs because I made my debugging process smaller and easier.
I'm trying to take a list of numbers and put them into >= N groups such that the sums of the groups are approximately (but not necessarily exactly) equal, and 'outliers' can be in a group of their own.
So for a target of 3 groups and an input of something like:
[3, 2, 1, 4, 2, 5]
The output might be:
[[5,1], [4,2], [3,2]]
The respective sums of each group being
6, 6, 5
I think I've got the methodology down, as pseudocode it looks something like this:
let target = Ceil(Sum(Series) / NumberOfTargetGroups) //The ideal size of each group
while (count(UnpickedNumbers) > 0)
let CurrentGroup = new group
while (sum(CurrentGroup) < target)
for each Unpicked in sortDesc(UnpickedNumbers)
if (sum(CurrentGroup) + Unpicked <= target)
Add Unpicked to current group
Remove unpicked from available numbers
What I can't figure out is how to turn that logic into a GroupBy(n => ...); the reason for wanting to do this is that the list of numbers actually comes from a property on a series of objects that I want to group in this manner.
Partition is an NP-complete problem.
I've prepared a snippet:
public IEnumerable<IEnumerable<TObject>> Algo<TObject>(IEnumerable<TObject> source, int groups,
Func<TObject, int> intSelector)
{
if (source == null)
{
throw new ArgumentNullException("source");
}
source = source.OrderByDescending(intSelector);
var evaluated = source as IList<TObject> ?? source.ToList();
if (groups > evaluated.Count())
{
throw new ArgumentException("Invalid group count.");
}
var result = new List<List<TObject>>();
for (var i = 0; i < groups; i++)
{
result.Add(new List<TObject> { evaluated[i] });
}
for (var i = groups; i < evaluated.Count(); i++)
{
var bestIndex = 0;
var bestSum = result[bestIndex].Sum(intSelector);
for (var j = 1; j < result.Count; j++)
{
var sum = result[j].Sum(intSelector);
if (sum < bestSum)
{
bestSum = sum;
bestIndex = j;
}
}
result[bestIndex].Add(evaluated[i]);
}
return result;
}
It is not efficient (there are many ways to optimize it) and the result is not always optimal, but I hope it will be a base for your algorithm (maybe an approximation is enough for you; test it!).
EDIT:
I've modified the snippet for you - you don't have to use GroupBy. Usage:
var widgets = new List<Widget> { W1, W2, etc. };
var result = Algo(widgets, groups: 3, intSelector: widget => widget.Height);
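A hypothetical way to inspect the result (Widget and Height are just the names from the usage above):
foreach (var group in result)
    Console.WriteLine(string.Join(", ", group.Select(w => w.Height)) + "  sum: " + group.Sum(w => w.Height));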
I have been stumped on this one for a while. I want to take a List and order the list such that the Products with the largest Price end up in the middle of the list. And I also want to do the opposite, i.e. make sure that the items with the largest price end up on the outer boundaries of the list.
Imagine a data structure like this: 1,2,3,4,5,6,7,8,9,10
In the first scenario I need to get back 1,3,5,7,9,10,8,6,4,2
In the second scenario I need to get back 10,8,6,4,2,1,3,5,7,9
The list may have upwards of 250 items, the numbers will not be evenly distributed, and they will not be sequential, and I wanted to minimize copying. The numbers will be contained in Product objects, and not simple primitive integers.
Is there a simple solution that I am not seeing?
Any thoughts?
So for those of you wondering what I am up to, I am ordering items based on calculated font size. Here is the code that I went with...
The Implementation...
private void Reorder()
{
var tempList = new LinkedList<DisplayTag>();
bool even = true;
foreach (var tag in this) {
if (even)
tempList.AddLast(tag);
else
tempList.AddFirst(tag);
even = !even;
}
this.Clear();
this.AddRange(tempList);
}
The Test...
[TestCase(DisplayTagOrder.SmallestToLargest, Result=new[]{10,14,18,22,26,30})]
[TestCase(DisplayTagOrder.LargestToSmallest, Result=new[]{30,26,22,18,14,10})]
[TestCase(DisplayTagOrder.LargestInTheMiddle, Result = new[] { 10, 18, 26, 30, 22, 14 })]
[TestCase(DisplayTagOrder.LargestOnTheEnds, Result = new[] { 30, 22, 14, 10, 18, 26 })]
public int[] CalculateFontSize_Orders_Tags_Appropriately(DisplayTagOrder sortOrder)
{
list.CloudOrder = sortOrder;
list.CalculateFontSize();
var result = (from displayTag in list select displayTag.FontSize).ToArray();
return result;
}
The Usage...
public void CalculateFontSize()
{
GetMaximumRange();
GetMinimunRange();
CalculateDelta();
this.ForEach((displayTag) => CalculateFontSize(displayTag));
OrderByFontSize();
}
private void OrderByFontSize()
{
switch (CloudOrder) {
case DisplayTagOrder.SmallestToLargest:
this.Sort((arg1, arg2) => arg1.FontSize.CompareTo(arg2.FontSize));
break;
case DisplayTagOrder.LargestToSmallest:
this.Sort(new LargestFirstComparer());
break;
case DisplayTagOrder.LargestInTheMiddle:
this.Sort(new LargestFirstComparer());
Reorder();
break;
case DisplayTagOrder.LargestOnTheEnds:
this.Sort();
Reorder();
break;
}
}
The appropriate data structure is a LinkedList because it allows you to efficiently add to either end:
LinkedList<int> result = new LinkedList<int>();
int[] array = { 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 };
Array.Sort(array);
bool odd = true;
foreach (var x in array)
{
if (odd)
result.AddLast(x);
else
result.AddFirst(x);
odd = !odd;
}
foreach (int item in result)
Console.Write("{0} ", item);
No extra copying steps, no reversing steps, ... just a small overhead per node for storage.
C# Iterator version
(Very simple code to satisfy all conditions.)
One function to rule them all! It doesn't use an intermediate storage collection (see the yield keyword), and it orders the large numbers either into the middle or out to the sides, depending on the argument. It's implemented as a C# iterator.
// Pass forward sorted array for large middle numbers,
// or reverse sorted array for large side numbers.
//
public static IEnumerable<long> CurveOrder(long[] nums) {
if (nums == null || nums.Length == 0)
yield break; // Nothing to do.
// Move forward every two.
for (int i = 0; i < nums.Length; i+=2)
yield return nums[i];
// Move backward every other two. Note: Length%2 makes sure we're on the correct offset.
for (int i = nums.Length-1 - nums.Length%2; i >= 0; i-=2)
yield return nums[i];
}
Example Usage
For example with array long[] nums = { 1,2,3,4,5,6,7,8,9,10,11 };
Start with forward sort order, to bump high numbers into the middle.
Array.Sort(nums); //forward sort
// Array argument will be: { 1,2,3,4,5,6,7,8,9,10,11 };
long[] arrLargeMiddle = CurveOrder(nums).ToArray();
Produces: 1 3 5 7 9 11 10 8 6 4 2
Or, Start with reverse sort order, to push high numbers to sides.
Array.Reverse(nums); //reverse sort
// Array argument will be: { 11,10,9,8,7,6,5,4,3,2,1 };
long[] arrLargeSides = CurveOrder(nums).ToArray();
Produces: 11 9 7 5 3 1 2 4 6 8 10
Significant namespaces are:
using System;
using System.Collections.Generic;
using System.Linq;
Note: The iterator leaves the decision up to the caller about whether or not to use intermediate storage. The caller might simply be issuing a foreach loop over the results instead.
Extension Method Option
Optionally, change the static method header to use the this modifier, public static IEnumerable<long> CurveOrder(this long[] nums) {, and put it inside a static class in your namespace.
Then call the order method directly on any long[] array instance like so:
Array.Reverse(nums); //reverse sort
// Array argument will be: { 11,10,9,8,7,6,5,4,3,2,1 };
long[] arrLargeSides = nums.CurveOrder().ToArray();
Just some (unneeded) syntactic sugar to mix things up a bit for fun. This can be applied to any answers to your question that take an array argument.
I might go for something like this
static T[] SortFromMiddleOut<T, U>(IList<T> list, Func<T, U> orderSelector, bool largestInside) where U : IComparable<U>
{
T[] sortedArray = new T[list.Count];
bool add = false;
int index = (list.Count / 2);
int iterations = 0;
IOrderedEnumerable<T> orderedList;
if (largestInside)
orderedList = list.OrderByDescending(orderSelector);
else
orderedList = list.OrderBy(orderSelector);
foreach (T item in orderedList)
{
sortedArray[index] = item;
if (add)
index += ++iterations;
else
index -= ++iterations;
add = !add;
}
return sortedArray;
}
Sample invocations:
int[] array = { 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 };
int[] sortedArray = SortFromMiddleOut(array, i => i, false);
foreach (int item in sortedArray)
Console.Write("{0} ", item);
Console.Write("\n");
sortedArray = SortFromMiddleOut(array, i => i, true);
foreach (int item in sortedArray)
Console.Write("{0} ", item);
With it being generic, it could be a list of Foo and the order selector could be f => f.Name or whatever you want to throw at it.
The fastest (but not the clearest) solution is probably to simply calculate the new index for each element:
Array.Sort(array);
int length = array.Length;
int middle = length / 2;
int[] result2 = new int[length];
for (int i = 0; i < array.Length; i++)
{
result2[middle + (1 - 2 * (i % 2)) * ((i + 1) / 2)] = array[i];
}
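// With the ascending sort this yields 10 8 6 4 2 1 3 5 7 9 (largest at the ends);
// sort descending first to get 1 3 5 7 9 10 8 6 4 2 (largest in the middle).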
Something like this?
public IEnumerable<int> SortToMiddle(IEnumerable<int> input)
{
var sorted = new List<int>(input);
sorted.Sort();
var firstHalf = new List<int>();
var secondHalf = new List<int>();
var sendToFirst = true;
foreach (var current in sorted)
{
if (sendToFirst)
{
firstHalf.Add(current);
}
else
{
secondHalf.Add(current);
}
sendToFirst = !sendToFirst;
}
//to get the highest values on the outside just reverse
//the first list instead of the second
secondHalf.Reverse();
return firstHalf.Concat(secondHalf);
}
For your specific (general) case (assuming unique keys):
public static IEnumerable<T> SortToMiddle<T, TU>(IEnumerable<T> input, Func<T, TU> getSortKey)
{
var sorted = new List<TU>(input.Select(getSortKey));
sorted.Sort();
var firstHalf = new List<TU>();
var secondHalf = new List<TU>();
var sendToFirst = true;
foreach (var current in sorted)
{
if (sendToFirst)
{
firstHalf.Add(current);
}
else
{
secondHalf.Add(current);
}
sendToFirst = !sendToFirst;
}
//to get the highest values on the outside just reverse
//the first list instead of the second
secondHalf.Reverse();
sorted = new List<TU>(firstHalf.Concat(secondHalf));
//This assumes the sort keys are unique - if not, the implementation
//needs to use a SortedList<TU, T>
return sorted.Select(s => input.First(t => s.Equals(getSortKey(t))));
}
And assuming non-unique keys:
public static IEnumerable<T> SortToMiddle<T, TU>(IEnumerable<T> input, Func<T, TU> getSortKey)
{
var sendToFirst = true;
var sorted = input.OrderBy(getSortKey).ToList(); // duplicates are fine: we keep the items themselves rather than keying a dictionary on the sort key
var firstHalf = new List<T>();
var secondHalf = new List<T>();
foreach (var current in sorted)
{
    if (sendToFirst)
    {
        firstHalf.Add(current);
    }
    else
    {
        secondHalf.Add(current);
    }
    sendToFirst = !sendToFirst;
}
//to get the highest values on the outside just reverse
//the first list instead of the second
secondHalf.Reverse();
return firstHalf.Concat(secondHalf);
}
Simplest solution: order the list descending and create two new lists; put every odd-indexed item into the first and every even-indexed item into the other. Reverse the first list, then append the second to the first.
Okay, I'm not going to question your sanity here since I'm sure you wouldn't be asking the question if there weren't a good reason :-)
Here's how I'd approach it. Create a sorted list, then simply create another list by processing the keys in order, alternately inserting before and appending, something like:
sortedlist = list.sort (descending)
biginmiddle = new list()
state = append
foreach item in sortedlist:
if state == append:
biginmiddle.append (item)
state = prepend
else:
biginmiddle.insert (0, item)
state = append
This will give you a list where the big items are in the middle. Other items will fan out from the middle (in alternating directions) as needed:
1, 3, 5, 7, 9, 10, 8, 6, 4, 2
To get a list where the larger elements are at the ends, just replace the initial sort with an ascending one.
The sorted and final lists can just be pointers to the actual items (since you state they're not simple integers) - this will minimise both extra storage requirements and copying.
Maybe it's not the best solution, but here's a nifty way...
Let Product[] parr be your array.
Disclaimer: it's Java; my C# is rusty.
Untested code, but you get the idea.
int plen = parr.length;
int[] indices = new int[plen];
for (int i = 0; 2 * i < plen; i++)
    indices[i] = 2 * i;              // Line1: first half of the positions take every other element going up
for (int i = (plen + 1) / 2; i < plen; i++)
    indices[i] = 2 * (plen - i) - 1; // Line2: second half takes the remaining elements going back down
Product[] curved = new Product[plen];
for (int i = 0; i < plen; i++)
{
    curved[i] = parr[indices[i]];    // only references are copied, not the Product objects
}
The second case would be something like this:
int plen = parr.length;
int[] indices = new int[plen];
for (int i = 0; 2 * i < plen; i++)
    indices[i] = plen - 1 - 2 * i;   // first half of the positions take every other element going down
for (int i = (plen + 1) / 2; i < plen; i++)
    indices[i] = 2 * i - plen;       // second half takes the remaining elements going back up
Product[] curved = new Product[plen];
for (int i = 0; i < plen; i++)
{
    curved[i] = parr[indices[i]];    // again, only references are copied
}