Iterating with LINQ - C#

I am trying to find a way to access previous values from a Linq method in the same line.
I want to be able to use this general form in Linq:
var values = Enumerable.Range( 1, 100 ).Select( i => i + [last result] );
But I can't find a way to do something like this without multi-line lambdas and storing the results somewhere else.
So the best Fibonacci sum I've been able to do in LINQ is:
List<int> calculated = new List<int>( new int[] { 1, 2 });
var fibonacci = Enumerable.Range(2, 10).Select(i =>
{
    int result = calculated[i - 2] + calculated[i - 1];
    calculated.Add(result);
    return result; // and how could I just put the result in fibonacci?
});
Which seems ugly. I could do this in less code with a regular for loop.
for (int i = 2; i < 10; i++)
{
    calculated.Add(calculated[i - 2] + calculated[i - 1]);
}
It seems like if I could find a way to do this, I could use LINQ to do a lot of linear programming and sum a lot of iterative formulas.

If you are looking for a way to create a Fibonacci sequence generator, you would be better off writing your own generator function instead of using Linq extension methods. Something like this:
public static IEnumerable<int> Fibonacci()
{
    int a = 1;
    int b = 0;
    int last;
    for (;;) {
        yield return a;
        last = a;
        a += b;
        b = last;
    }
}
Then you can apply Linq methods to this enumerable to achieve the result you want (try iterating over Fibonacci().Take(20) for example).
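For instance, a quick sketch of how you might combine it with LINQ:
// Sum of the first 20 Fibonacci numbers.
int sumOfFirst20 = Fibonacci().Take(20).Sum();
// All Fibonacci numbers below 100.
var smallOnes = Fibonacci().TakeWhile(f => f < 100).ToList();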
Linq extension methods are not the solution for every programming problem, and I can only imagine how horrid a pure LINQ Fibonacci sequence generator would look.

The closest you can come to something like that with LINQ is the Enumerable.Aggregate method (a.k.a. fold from functional programming). You can use it, for example, to sum up the squares of a collection:
int sumSquares = list.Aggregate(0, (sum, item) => sum + item * item);
Since in LINQ the values are retrieved from a collection using an enumerator, i.e. taken one by one, there is by definition no concept of a "previous item". The items could even be generated and discarded on the fly, using some yield return magic. That said, you could always use a hack like:
long a = 1;
long b = 1;
var fibonacci = Enumerable.Range(1, 20).Select(i =>
{
    long last = a + b;
    b = a;
    a = last;
    return last;
});
but the moment you have to use and modify an outside variable to make the lambdas work, you are in code-smell territory.
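If you really want to stay inside LINQ, you can also thread the state through Aggregate itself instead of touching outside variables; a rough sketch:
// Carry (list built so far, current, next) through the fold; Range only drives the iteration count.
var fib = Enumerable.Range(0, 20)
    .Aggregate(
        (Items: new List<long>(), A: 1L, B: 1L),
        (acc, _) =>
        {
            acc.Items.Add(acc.A);
            return (Items: acc.Items, A: acc.B, B: acc.A + acc.B);
        })
    .Items;
It avoids the captured variables, but it is arguably even harder to read than the hack above.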

Related

Why does it take more time when you run a LINQ OrderBy before Select?

While writing a solution for a coding problem I discovered an interesting behavior of my LINQ statements. I had two scenarios:
First:
arr.Select(x => x + 5).OrderBy(x => x)
Second:
arr.OrderBy(x => x).Select(x => x + 5)
After a little bit of testing with System.Diagnostics.Stopwatch I got the following results for an integer array of length 100_000.
For the first approach:
00:00:00.0000152
For the second:
00:00:00.0073650
Now I'm interested in why it takes more time if I do the ordering first. I wasn't able to find anything on Google, so I thought about it myself.
I ended up with two ideas:
1. The second scenario has to convert to IOrderedEnumerable and then back to IEnumerable, while the first scenario only has to convert to IOrderedEnumerable and not back.
2. You end up having two loops: the first for sorting and the second for selecting, while approach 1 does everything in one loop.
So my question is: why does it take much more time to do the ordering before the select?
Let's have a look at the sequences:
private static void UnderTestOrderBySelect(int[] arr) {
    var query = arr.OrderBy(x => x).Select(x => x + 5);
    foreach (var item in query)
        ;
}

private static void UnderTestSelectOrderBy(int[] arr) {
    var query = arr.Select(x => x + 5).OrderBy(x => x);
    foreach (var item in query)
        ;
}

// See Marc Gravell's comment; let's compare Linq and in-place Array.Sort
private static void UnderTestInPlaceSort(int[] arr) {
    var tmp = arr;
    var x = new int[tmp.Length];
    for (int i = 0; i < tmp.Length; i++)
        x[i] = tmp[i] + 5;
    Array.Sort(x);
}
In order to benchmark, let's run each method 10 times and average the 6 middle results:
private static string Benchmark(Action<int[]> methodUnderTest) {
    List<long> results = new List<long>();
    int n = 10;
    for (int i = 0; i < n; ++i) {
        Random random = new Random(1);
        int[] arr = Enumerable
            .Range(0, 10000000)
            .Select(x => random.Next(1000000000))
            .ToArray();
        Stopwatch sw = new Stopwatch();
        sw.Start();
        methodUnderTest(arr);
        sw.Stop();
        results.Add(sw.ElapsedMilliseconds);
    }
    var valid = results
        .OrderBy(x => x)
        .Skip(2)                 // drop the 2 fastest runs
        .Take(results.Count - 4) // drop the 2 slowest runs
        .ToArray();
    return $"{string.Join(", ", valid)} average : {(long) (valid.Average() + 0.5)}";
}
Time to run and have a look at the results:
string report = string.Join(Environment.NewLine,
    $"OrderBy + Select: {Benchmark(UnderTestOrderBySelect)}",
    $"Select + OrderBy: {Benchmark(UnderTestSelectOrderBy)}",
    $"Inplace Sort: {Benchmark(UnderTestInPlaceSort)}");
Console.WriteLine(report);
Outcome: (Core i7 3.8GHz, .Net 4.8 IA64)
OrderBy + Select: 4869, 4870, 4872, 4874, 4878, 4895 average : 4876
Select + OrderBy: 4763, 4763, 4793, 4802, 4827, 4849 average : 4800
Inplace Sort: 888, 889, 890, 893, 896, 904 average : 893
I don't see any significant difference; Select + OrderBy seems to be slightly more efficient (about a 2% gain) than OrderBy + Select. In-place sort, however, performs far better (about 5 times faster) than either LINQ query.
Depending on which LINQ provider you have, there may be some optimization of the query. E.g. if you used some kind of database, chances are high your provider would create the exact same query for both statements, similar to this one:
select myColumn from myTable order by myColumn;
Thus performance should be identical, no matter whether you order first or select first in LINQ.
As this does not seem to happen here, you are probably using Linq2Objects, which has no optimization at all. So the order of your statements may have an effect, in particular if you have some kind of filter using Where that filters many objects out, so that later statements won't operate on the entire collection.
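For example (numbers here is just some hypothetical sequence):
// Where first: OrderBy only sorts the items that survive the filter.
var filterThenSort = numbers.Where(x => x % 1000 == 0).OrderBy(x => x);
// OrderBy first: the full sequence is sorted, then most of it is discarded.
var sortThenFilter = numbers.OrderBy(x => x).Where(x => x % 1000 == 0);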
To make a long story short: the difference most probably comes from some internal initialization logic. As a dataset of 100,000 numbers is not really big - at least not big enough - even some fast initialization has a big impact.

ParallelEnumerable.Aggregate for several methods

I'm starting to learn multithreading. I have three methods that calculate the sum, the average, and the product of the square roots of an array.
At first I made three separate blocking calls using PLINQ. Then I thought it would be nice to make it a single call that returns an object with the sum, product, and average at the same time. I read that ParallelEnumerable.Aggregate can help me with this, but I have no idea how to use it.
I would be really grateful for an explanation of how to use this function in my case, and the good/bad aspects of the approach.
public static double Average(double[] array, string tool)
{
    if (array == null) throw new ArgumentNullException(nameof(array));
    double sum = Enumerable.Sum(array);
    double result = sum / array.Length;
    Print(tool, result);
    return result;
}

public static double Sum(double[] array, string tool)
{
    if (array == null) throw new ArgumentNullException(nameof(array));
    double sum = Enumerable.Sum(array);
    Print(tool, sum);
    return sum;
}

public static void ProductOfSquareRoots(double[] array, string tool)
{
    if (array == null) throw new ArgumentNullException(nameof(array));
    double result = 1;
    foreach (var number in array)
    {
        result = result * Math.Sqrt(number);
    }
    Print(tool, result);
}
The three aggregated values (average, sum and product of square roots) that you want to compute can each be computed by performing a single pass over the numbers. Instead of doing this three times (one for each aggregated value) you can do this once and aggregate the three values inside the loop (this should save time).
The average is the sum divided by the count, and as you are already computing the sum, you only need the count in addition to get the average. If you know the size of the input you don't even have to count the items, but here I assume that the size of the input is unknown in advance.
If you want to use LINQ you can use Aggregate:
var aggregate = numbers.Aggregate(
    // Starting value for the accumulator.
    (Count: 0, Sum: 0D, ProductOfSquareRoots: 1D),
    // Update the accumulator with a specific number.
    (accumulator, number) =>
    {
        accumulator.Count += 1;
        accumulator.Sum += number;
        accumulator.ProductOfSquareRoots *= Math.Sqrt(number);
        return accumulator;
    });
The variable aggregate is a ValueTuple<int, double, double> with the items Count, Sum and ProductOfSquareRoots. Before C# 7 you would use an anonymous type. However, that would require an allocation for each value in the input sequence, slowing down the aggregation. By using a mutable value tuple the aggregation should become faster.
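For comparison, a pre-C# 7 sketch with an anonymous type would look something like this (note how every step has to allocate a new accumulator object):
var aggregate = numbers.Aggregate(
    new { Count = 0, Sum = 0D, ProductOfSquareRoots = 1D },
    (accumulator, number) => new
    {
        Count = accumulator.Count + 1,
        Sum = accumulator.Sum + number,
        ProductOfSquareRoots = accumulator.ProductOfSquareRoots * Math.Sqrt(number)
    });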
Aggregate works with PLINQ, so if numbers is of type ParallelQuery<T> and not IEnumerable<T>, the aggregation will be performed in parallel. Notice that this requires the aggregation to be both associative (e.g. (a + b) + c = a + (b + c)) and commutative (e.g. a + b = b + a), which in your case is true.
PLINQ has an overhead, so it might not perform better than single-threaded LINQ depending on the number of elements in your sequence and how complex the calculations are. You will have to measure this yourself to determine if PLINQ speeds things up. However, you can use the same Aggregate expression in both LINQ and PLINQ, making it easy to switch your code from single-threaded to parallel by inserting AsParallel() in the right place.
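As a rough sketch (not the only way to write it), the parallel version could use the PLINQ Aggregate overload that takes an explicit combiner for merging the per-partition accumulators (numbers being the same sequence as above):
var aggregate = numbers
    .AsParallel()
    .Aggregate(
        // Seed for each partition's accumulator.
        (Count: 0, Sum: 0D, ProductOfSquareRoots: 1D),
        // Fold one number into a partition's accumulator.
        (acc, number) => (Count: acc.Count + 1,
                          Sum: acc.Sum + number,
                          ProductOfSquareRoots: acc.ProductOfSquareRoots * Math.Sqrt(number)),
        // Merge the accumulators of two partitions.
        (left, right) => (Count: left.Count + right.Count,
                          Sum: left.Sum + right.Sum,
                          ProductOfSquareRoots: left.ProductOfSquareRoots * right.ProductOfSquareRoots),
        // Project the final accumulator into the result.
        acc => new { acc.Sum, Average = acc.Sum / acc.Count, acc.ProductOfSquareRoots });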
Note: you must initialize the result variable with the value 1, because otherwise you will always get 0.
Note 2: instead of Enumerable.Sum(array), just write array.Sum().
No, the Aggregate method won't help you to calculate the three functions at the same time. See Martin Liversage's answer.
KISS ;)
if (array == null) throw new ArgumentNullException(nameof(array));
var sum = array.Sum();
var average = array.Average();
var product = array.Aggregate(1.0, (acc, val) => acc * Math.Sqrt(val));
Can be simplified:
var average = sum / array.Length;
This eliminates an extra pass through the array.
Want to parallelize?
var sum = array.AsParallel().Sum();
//var average = array.AsParallel().Average(); // Extra pass!
var average = sum / array.Length; // Faster! Really!
var product = array.AsParallel().Aggregate(1.0, (acc, val) => acc * Math.Sqrt(val));
However, it will probably be slower than the previous method. Such parallelization is justified only for very large collections, with billions of elements.
Each pass through the collection takes time. The fewer passes, the better the performance. We already got rid of one pass when calculating the average; let's get it down to a single pass.
double sum = 0;
double product = 1;
foreach (var number in array)
{
    sum += number;
    product = product * Math.Sqrt(number);
}
double average = sum / array.Length;
Three results in one pass! We are the best!
Let's get back to the subject.
The Parallel.Invoke method allows you to execute several functions in parallel, but it does not get the results from them. It is suitable for calculations of the type "fire and forget".
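For illustration, a minimal sketch; any results have to be captured as side effects on outer variables:
double sum = 0, average = 0, product = 1;
Parallel.Invoke(
    () => sum = array.Sum(),
    () => average = array.Average(),
    () => product = array.Aggregate(1.0, (acc, val) => acc * Math.Sqrt(val)));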
We can parallelize the computation by running multiple tasks. With the help of Task.WhenAll we wait for them all to complete and get the results.
var results = await Task.WhenAll(
Task.Run(() => array.Sum()),
Task.Run(() => array.Average()),
Task.Run(() => array.Aggregate(1.0, (acc, val) => acc * Math.Sqrt(val)))
);
var sum = results[0];
var average = results[1];
var product = results[2];
It is also not efficient for small collections, but it may be more efficient than AsParallel in some cases.
Here is another way of writing this approach with tasks; perhaps it will seem clearer.
var sumTask = Task.Run(() => array.Sum());
var avgTask = Task.Run(() => array.Average());
var prodTask = Task.Run(() => array.Aggregate(1.0, (acc, val) => acc * Math.Sqrt(val)));
Task.WaitAll(sumTask, avgTask, prodTask);
sum = sumTask.Result;
average = avgTask.Result;
product = prodTask.Result;

Project sequence so that each element will become sum of all before it with LINQ

I have a following code that transforms each element of an array into sum of all elements before it. The procedural implementation is as follows:
float[] items = { 1, 5, 10, 100 }; // for example
float[] sums = new float[items.Length];
float total = 0;
for (int i = 0; i < items.Length; i++) {
    total += items[i];
    sums[i] = total;
}
How would I implement this as a LINQ one-liner?
I know it can be done for example as
items.Select((x, i) => items.Take(i + 1).Sum())
but I think it's not very efficient when the array size grows, as it has to do Sum() for each element.
LINQ doesn't support this case very cleanly, to be honest - you want a mixture of aggregation and projection. You can do it with side-effects, which is horrible:
// Don't use this!
float sum = 0f;
var sums = items.Select(x => sum += x).ToArray();
Side-effects in LINQ are nasty. Likewise you can do it using Take/Sum as shown by RePierre and L.B - but that takes an operation which is naturally O(N) and converts it into an operation which is O(N^2).
The MoreLINQ project I started a while ago does have support for this, in its Scan and PreScan members. In this case you want Scan, I believe:
var sums = items.Scan((x, y) => x + y);
If you don't want to use a third-party library, don't want to use side-effects, don't want the inefficiency of the Take solution, only need addition, and only need it for a single type (e.g. float in your case), you can easily introduce your own method:
public static IEnumerable<float> RunningSum(this IEnumerable<float> source)
{
    if (source == null)
    {
        throw new ArgumentNullException(nameof(source));
    }
    float sum = 0f;
    foreach (var item in source)
    {
        sum += item;
        yield return sum;
    }
}
As you'll have noticed, this is basically the same code as your original - but is lazily evaluated and applies to any sequence of floats.
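Usage would then be along the lines of:
float[] items = { 1, 5, 10, 100 };
var sums = items.RunningSum().ToArray(); // 1, 6, 16, 116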
var result = items.Select((item, index) => items.Take(index).Sum() + item);
EDIT
You can use the Aggregate method to create the sums:
var result = items.Aggregate(new List<float>(), (seed, item) =>
{
    seed.Add(seed.LastOrDefault() + item);
    return seed;
});
The Reactive Extensions team at Microsoft released an "Interactive Extensions" library that adds many useful extensions to IEnumerable<T>. One of them is Scan which does exactly what you want.
Here's the IX way of doing a running total:
IEnumerable<float> results = items.Scan(0.0f, (x1, x2) => x1 + x2);

Thoughts on foreach with Enumerable.Range vs traditional for loop

In C# 3.0, I'm liking this style:
// Write the numbers 1 thru 7
foreach (int index in Enumerable.Range( 1, 7 ))
{
Console.WriteLine(index);
}
over the traditional for loop:
// Write the numbers 1 thru 7
for (int index = 1; index <= 7; index++)
{
Console.WriteLine( index );
}
Assuming 'n' is small so performance is not an issue, does anyone object to the new style over the traditional style?
I find the latter's "minimum-to-maximum" format a lot clearer than Range's "minimum-count" style for this purpose. Also, I don't think it's really a good practice to make a change like this from the norm that is not faster, not shorter, not more familiar, and not obviously clearer.
That said, I'm not against the idea in general. If you came up to me with syntax that looked something like foreach (int x from 1 to 8) then I'd probably agree that that would be an improvement over a for loop. However, Enumerable.Range is pretty clunky.
This is just for fun. (I'd just use the standard "for (int i = 1; i <= 10; i++)" loop format myself.)
foreach (int i in 1.To(10))
{
    Console.WriteLine(i); // 1,2,3,4,5,6,7,8,9,10
}
// ...
public static IEnumerable<int> To(this int from, int to)
{
    if (from < to)
    {
        while (from <= to)
        {
            yield return from++;
        }
    }
    else
    {
        while (from >= to)
        {
            yield return from--;
        }
    }
}
You could also add a Step extension method too:
foreach (int i in 5.To(-9).Step(2))
{
    Console.WriteLine(i); // 5,3,1,-1,-3,-5,-7,-9
}
// ...
public static IEnumerable<T> Step<T>(this IEnumerable<T> source, int step)
{
    if (step == 0)
    {
        throw new ArgumentOutOfRangeException("step", "Param cannot be zero.");
    }
    return source.Where((x, i) => (i % step) == 0);
}
In C# 6.0 with the use of
using static System.Linq.Enumerable;
you can simplify it to
foreach (var index in Range(1, 7))
{
Console.WriteLine(index);
}
You can actually do this in C# (by providing To and Do as extension methods on int and IEnumerable<T> respectively):
1.To(7).Do(Console.WriteLine);
SmallTalk forever!
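If you want to try that line, a minimal Do extension might look like this (a sketch; To is the extension shown elsewhere in this thread):
public static void Do<T>(this IEnumerable<T> source, Action<T> action)
{
    // Apply the action to each element, e.g. 1.To(7).Do(Console.WriteLine).
    foreach (var item in source)
        action(item);
}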
I kind of like the idea. It's very much like Python. Here's my version in a few lines:
static class Extensions
{
    public static IEnumerable<int> To(this int from, int to, int step = 1) {
        if (step == 0)
            throw new ArgumentOutOfRangeException("step", "step cannot be zero");
        // stop if the next `step` reaches or oversteps `to`, in either +/- direction
        while (!(step > 0 ^ from < to) && from != to) {
            yield return from;
            from += step;
        }
    }
}
It works like Python's:
0.To(4) → [ 0, 1, 2, 3 ]
4.To(0, -1) → [ 4, 3, 2, 1 ]
4.To(4) → [ ]
7.To(-3, -3) → [ 7, 4, 1, -2 ]
I think foreach + Enumerable.Range is less error prone (you have less control and fewer ways to get it wrong, like decrementing the index inside the body so the loop never ends, etc.)
The readability problem is about the Range function's semantics, which can change from one language to another (e.g. given just one parameter, does it begin at 0 or 1? Is the end included or excluded? Is the second parameter a count or an end value?).
As for performance, I think the compiler should be smart enough to optimize both loops so they execute at a similar speed, even with large ranges (I suppose Range does not create a collection but returns an iterator).
I think Range is useful for working with some range inline:
var squares = Enumerable.Range(1, 7).Select(i => i * i);
You can ForEach over it. It requires converting to a list, but it keeps things compact when that's what you want.
Enumerable.Range(1, 7).ToList().ForEach(i => Console.WriteLine(i));
But other than for something like this, I'd use traditional for loop.
It seems like quite a long-winded approach to a problem that's already solved. There's a whole state machine behind Enumerable.Range that isn't really needed.
The traditional format is fundamental to development and familiar to all. I don't really see any advantage to your new style.
I'd like to have the syntax of some other languages like Python, Haskell, etc.
// Write the numbers 1 thru 7
foreach (int index in [1..7])
{
Console.WriteLine(index);
}
Fortunately, we've got F# now :)
As for C#, I'll have to stick with the Enumerable.Range method.
@Luke:
I reimplemented your To() extension method and used the Enumerable.Range() method to do it.
This way it comes out a little shorter and uses as much of the infrastructure .NET gives us as possible:
public static IEnumerable<int> To(this int from, int to)
{
    return from < to
        ? Enumerable.Range(from, to - from + 1)
        : Enumerable.Range(to, from - to + 1).Reverse();
}
How to use a new syntax today
Because of this question I tried out some things to come up with a nice syntax without waiting for first-class language support. Here's what I have:
using static Enumerizer;
// prints: 0 1 2 3 4 5 6 7 8 9
foreach (int i in 0 <= i < 10)
    Console.Write(i + " ");
Note the difference between <= and <.
I also created a proof of concept repository on GitHub with even more functionality (reversed iteration, custom step size).
A minimal and very limited implementation of the above loop would look something like this:
public readonly struct Enumerizer
{
    public static readonly Enumerizer i = default;

    public Enumerizer(int start) =>
        Start = start;

    public readonly int Start;

    // "0 <= i" captures the inclusive lower bound.
    public static Enumerizer operator <=(int start, Enumerizer _) =>
        new Enumerizer(start);

    public static Enumerizer operator >=(int _, Enumerizer __) =>
        throw new NotImplementedException();

    // "... < 10" yields the range up to the exclusive upper bound.
    public static IEnumerable<int> operator <(Enumerizer start, int end)
    {
        for (int i = start.Start; i < end; i++)
            yield return i;
    }

    public static IEnumerable<int> operator >(Enumerizer _, int __) =>
        throw new NotImplementedException();
}
There is no significant performance difference between traditional iteration and range iteration, as Nick Chapsas pointed out in his excellent YouTube video. Even the benchmark showed only a difference of nanoseconds for small numbers of iterations, and as the loop gets bigger the difference almost disappears.
Here is an elegant way of iterating over a range, taken from his content:
private static void Test()
{
    foreach (var i in 1..5)
    {
    }
}
Using this extension:
public static class Extension
{
    public static CustomIntEnumerator GetEnumerator(this Range range)
    {
        return new CustomIntEnumerator(range);
    }

    public static CustomIntEnumerator GetEnumerator(this int number)
    {
        return new CustomIntEnumerator(new Range(0, number));
    }
}

public ref struct CustomIntEnumerator
{
    private int _current;
    private readonly int _end;

    public CustomIntEnumerator(Range range)
    {
        if (range.End.IsFromEnd)
        {
            throw new NotSupportedException();
        }
        _current = range.Start.Value - 1;
        _end = range.End.Value;
    }

    public int Current => _current;

    public bool MoveNext()
    {
        _current++;
        return _current <= _end;
    }
}
Benchmark result:
I love this way of implementing it. The biggest issue with this approach, though, is that it cannot be used in an async method.
I'm sure everybody has their personal preferences (many would prefer the latter just because it is familiar across almost all programming languages), but I am like you and starting to like the foreach more and more, especially now that you can define a range.
In my opinion the Enumerable.Range() way is more declarative. New and unfamiliar to people? Certainly. But I think this declarative approach yields the same benefits as most other LINQ-related language features.
I imagine there could be scenarios where Enumerable.Range(index, count) is clearer when dealing with expressions for the parameters, especially if some of the values in that expression are altered within the loop. In the case of for, the expression is re-evaluated after each iteration, whereas the arguments to Enumerable.Range() are evaluated once, up front.
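A small sketch of that difference:
// With for, the bound is re-read on every iteration, so changing it inside the body matters:
int limit = 5;
for (int i = 0; i < limit; i++)
{
    if (i == 2) limit = 10; // the loop now continues up to 9
}
// With Enumerable.Range, the count is evaluated once, before iteration starts:
int count = 5;
foreach (var i in Enumerable.Range(0, count))
{
    if (i == 2) count = 10; // has no effect on this loop
}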
Other than that, I'd agree that sticking with for would normally be better (more familiar/readable to more people... readable is a very important value in code that needs to be maintained).
I agree that in many (or even most cases) foreach is much more readable than a standard for-loop when simply iterating over a collection. However, your choice of using Enumerable.Range(index, count) isn't a strong example of the value of foreach over for.
For a simple range starting from 1, Enumerable.Range(index, count) looks quite readable. However, if the range starts with a different index, it becomes less readable because you have to properly perform index + count - 1 to determine what the last element will be. For example…
// Write the numbers 2 thru 8
foreach (var index in Enumerable.Range( 2, 7 ))
{
Console.WriteLine(index);
}
In this case, I much prefer the second example.
// Write the numbers 2 thru 8
for (int index = 2; index <= 8; index++)
{
Console.WriteLine(index);
}
Strictly speaking, you misuse enumeration.
Enumerator provides the means to access all the objects in a container one-by-one, but it does not guarantee the order.
It is OK to use enumeration to find the biggest number in an array. If you are using it to find, say, first non-zero element, you are relying on the implementation detail you should not know about. In your example, the order seems to be important to you.
Edit: I am wrong. As Luke pointed out (see comments), it is safe to rely on the order when enumerating an array in C#. This is different from, for example, using "for in" to enumerate an array in JavaScript.
I do like the foreach + Enumerable.Range approach and use it sometimes.
// does anyone object to the new style over the traditional style?
foreach (var index in Enumerable.Range(1, 7))
I object to the var abuse in your proposal. I appreciate var, but, damn, just write int in this case! ;-)
Just throwing my hat into the ring.
I define this...
namespace CustomRanges {
    public record IntRange(int From, int Thru, int step = 1) : IEnumerable<int> {
        public IEnumerator<int> GetEnumerator() {
            for (var i = From; i <= Thru; i += step)
                yield return i;
        }

        IEnumerator IEnumerable.GetEnumerator()
            => GetEnumerator();
    };

    public static class Definitions {
        public static IntRange FromTo(int from, int to, int step = 1)
            => new IntRange(from, to - 1, step);

        public static IntRange FromThru(int from, int thru, int step = 1)
            => new IntRange(from, thru, step);

        public static IntRange CountFrom(int from, int count)
            => new IntRange(from, from + count - 1);

        public static IntRange Count(int count)
            => new IntRange(0, count - 1);

        // Add more to suit your needs. For instance, you could add in reversing ranges, etc.
    }
}
Then anywhere I want to use it, I add this at the top of the file...
using static CustomRanges.Definitions;
And use it like this...
foreach(var index in FromTo(1, 4))
Debug.WriteLine(index);
// Prints 1, 2, 3
foreach(var index in FromThru(1, 4))
Debug.WriteLine(index);
// Prints 1, 2, 3, 4
foreach(var index in FromThru(2, 10, 2))
Debug.WriteLine(index);
// Prints 2, 4, 6, 8, 10
foreach(var index in CountFrom(7, 4))
Debug.WriteLine(index);
// Prints 7, 8, 9, 10
foreach(var index in Count(5))
Debug.WriteLine(index);
// Prints 0, 1, 2, 3, 4
foreach(var _ in Count(4))
Debug.WriteLine("A");
// Prints A, A, A, A
The nice thing about this approach is that, thanks to the names, you know exactly whether the end is included or not.

Simple Sequence Generation?

I'm looking for an ultra-easy way to generate a list of numbers, 1-200.
(it can be a List, Array, Enumerable... I don't really care about the specific type)
Apparently .Net 4.0 has a Sequence.Range(min,max) method.
But I'm currently on .Net 3.5.
Here is a sample usage, of what I'm after, shown with Sequence.Range.
ShowOutput(Sequence.Range(1, 200));
For the moment, I need consecutive numbers 1-200. In future iterations, I may need arbitrary lists of numbers, so I'm trying to keep the design flexible.
Perhaps there is a good LINQ solution? Any other ideas?
.NET 3.5 has Range too. It's actually Enumerable.Range and returns IEnumerable<int>.
The page you linked to is very much out of date - it's talking about 3.5 as a "future version", and the Enumerable static class was called Sequence at one point prior to release.
If you wanted to implement it yourself in C# 2 or later, it's easy - here's one:
IEnumerable<int> Range(int count)
{
    for (int n = 0; n < count; n++)
        yield return n;
}
You can easily write other methods that further filter lists:
IEnumerable<int> Double(IEnumerable<int> source)
{
    foreach (int n in source)
        yield return n * 2;
}
But as you have 3.5, you can use the extension methods in System.Linq.Enumerable to do this:
var evens = Enumerable.Range(0, someLimit).Select(n => n * 2);
var r = Enumerable.Range( 1, 200 );
Check out System.Linq.Enumerable.Range.
Regarding the second part of your question, what do you mean by "arbitrary lists"? If you can define a function from an int to the new values, you can use the result of Range with other LINQ methods:
var squares = from i in Enumerable.Range(1, 200)
select i * i;
