I have a problem implementing an FFT. The target device is Windows Phone 7.
This is how I'm doing it. buffer is a byte array with a fixed size of 1024.
var o = Observable.FromEvent<EventArgs>(Microphone.Default, "BufferReady");
o.Subscribe(evt =>
{
    double[] dImageArray = this.buffer.Select(i => Convert.ToDouble(i)).ToArray();
    fftoutput = Saluse.MediaKit.Sample.FourierTransform.FFTDb(ref dImageArray);
});
The class I'm using (as you can see) is from Saluse MediaKit (source).
Is this the right path, or am I mistaken somewhere?
I've managed to perform a good FFT with AForge (this library has saved me several times). Here is the proper way to obtain the waveform info from the mic:
// Each sample is a signed 16-bit value, so read the buffer two bytes at a time
double[] sampleBuffer = new double[buffer.Length / 2];
int h = 0;
for (int i = 0; i < buffer.Length; i += 2)
{
    sampleBuffer[h] = Convert.ToDouble(BitConverter.ToInt16(buffer, i));
    h++;
}
Following up with another question: I would love to make a visual equalizer, but I have no idea how to.
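For the equalizer part, here is a minimal sketch of one way to turn the sampleBuffer above into band magnitudes with AForge; the band count and the averaging are my own assumptions, not anything from the original post:

using AForge.Math;

// Build a complex array from the real-valued samples
// (AForge's FFT requires a power-of-2 length; 1024 bytes -> 512 samples is fine)
Complex[] data = new Complex[sampleBuffer.Length];
for (int i = 0; i < sampleBuffer.Length; i++)
    data[i] = new Complex(sampleBuffer[i], 0);

FourierTransform.FFT(data, FourierTransform.Direction.Forward);

// Only the first half of the spectrum is meaningful for real input;
// group bins into a handful of bands and take the average magnitude
const int bandCount = 16; // arbitrary choice for a simple visualizer
int binsPerBand = (data.Length / 2) / bandCount;
double[] bands = new double[bandCount];
for (int b = 0; b < bandCount; b++)
{
    double sum = 0;
    for (int k = 0; k < binsPerBand; k++)
        sum += data[b * binsPerBand + k].Magnitude;
    bands[b] = sum / binsPerBand; // bar height for band b
}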
I'm programming a C# emulator and decided to output the PCM using CSCore.
When the sample size (for each channel) is one byte, the sound outputs correctly, but when I increase the sample size to 16 bits, the sound is very, very noisy.
A related question is how those two bytes are interpreted (are they signed? high byte first?).
This is roughly what I'm doing. First I generate the samples like this:
public void GenerateSamples(int sampleCount)
{
    while (sampleCount > 0)
    {
        --sampleCount;
        for (int c = 0; c < _numChannels; ++c)
        {
            _buffer[_sampleIndex++] = _outputValue;
        }

        // Advance by the number of ticks in one sample
        _tickCounter -= APU.MinimumTickThreshold;
        if (_tickCounter < 0)
        {
            _tickCounter = _tickThreshold;
            _up = !_up;
            // Replicating signed behaviour
            _outputValue = (short)(_up ? 32767 : -32768);
        }
    }
}
This will generate a simple square wave whose frequency is determined by _tickThreshold. If _buffer is a byte array, the sound is correct.
I want to output shorts because that lets me use signed samples and simply add multiple channels together in order to mix them.
This is how I'm outputting the sound:
for (int i = 0; i < sampleCount; ++i)
{
    for (int c = 0; c < _numChannels; ++c)
    {
        short sample = _channel.Buffer[_channelSampleIndex++];

        // Outputting the samples the other way around doesn't output
        // sound for me (16-bit PCM is little-endian: low byte first)
        _buffer[_sampleIndex++] = (byte)sample;
        _buffer[_sampleIndex++] = (byte)(sample >> 8);
    }
}
The WaveFormat I'm using is determined like this:
_waveFormat = new WaveFormat(_apu.SampleRate,     // 44000
                             _apu.SampleSize * 8, // 16
                             _apu.NumChannels);   // 2
I'm pretty sure there's something obvious I'm missing, but I've been debugging this for a while and can't seem to pinpoint where the problem is.
Thanks!
Walk of shame here.
The problem was that I wasn't taking into account that I now needed to generate half the number of samples (CSCore asks for a number of bytes, not samples).
In my example, I had to divide the sampleCount variable by the sample size to generate the correct amount of sound.
The noise came because I wasn't synchronizing the extra samples with the next Read call from CSCore (I'm generating sound on the fly instead of pre-buffering it; this way no delay is introduced by extra samples).
I found out about the problem by looking at this: SampleToPcm16.cs
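To make the byte-versus-sample accounting concrete, here is a minimal sketch of the conversion (the helper name is mine, not CSCore's). It also answers the two-byte question above: 16-bit PCM is signed, low byte first.

// CSCore's Read(byte[] buffer, int offset, int count) requests COUNT BYTES,
// so convert the byte count to a frame count before generating audio.
static int BytesToFrames(int byteCount, int bitsPerSample, int channels)
{
    int bytesPerFrame = (bitsPerSample / 8) * channels; // 4 for 16-bit stereo
    return byteCount / bytesPerFrame;
}

// Example: a Read request for 4096 bytes at 16-bit stereo
// means generating 1024 frames, not 4096.
int frames = BytesToFrames(4096, 16, 2); // 1024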
From an SDK I get images that have the pixel format BGR packed, i.e. BGRBGRBGR. For another application, I need to convert this format to RGB planar, RRRGGGBBB.
I am using C# .NET 4.5 32-bit, and the data is in byte arrays which have the same size.
Right now I am iterating through the source array and assigning the BGR values to their appropriate places in the target array, but that takes too long (180 ms for a 1.3-megapixel image). The processor the code runs on has access to MMX, SSE, SSE2, SSE3 and SSSE3.
Is there a way to speed up the conversion?
Edit: here is the conversion I am using:
// the array with the BGRBGRBGR pixel data
byte[] source;
// the array with the RRRGGGBBB pixel data
byte[] result;
// the number of pixels in one channel, width * height
int imageSize;

for (int i = 0; i < source.Length; i += 3)
{
    result[i / 3] = source[i + 2];             // R
    result[i / 3 + imageSize] = source[i + 1]; // G
    result[i / 3 + imageSize * 2] = source[i]; // B
}
Edit: I tried splitting the access to the source array into three loops, one per channel, but it didn't really help, so I'm open to suggestions.
for (int i = 0; i < source.Length; i += 3)
{
    result[i / 3] = source[i + 2]; // R
}
for (int i = 0; i < source.Length; i += 3)
{
    result[i / 3 + imageSize] = source[i + 1]; // G
}
for (int i = 0; i < source.Length; i += 3)
{
    result[i / 3 + imageSize * 2] = source[i]; // B
}
Bump, because the question is still unanswered. Any advice is very much appreciated!
You could try SSSE3's PSHUFB (Packed Shuffle Bytes) instruction. Make sure you are using aligned memory reads/writes. You will have to do something tricky to deal with the last dangling B value in each XMMWORD-sized block. It might be tough to get right, but it should be a huge speedup. You could also look for library code. I'm guessing you will need to make a C or C++ DLL and use P/Invoke, but maybe there is a way to use SSE instructions from C# that I don't know about.
Edit: this question is about a slightly different problem, ARGB to BGR, but the techniques used are similar to what you need.
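Not the PSHUFB route, but as a point of comparison, here is a plain C# sketch using unsafe pointers (compile with /unsafe; the method name is mine, and it assumes result is preallocated to source.Length). Dropping the per-pixel divisions and array bounds checks is often a solid first speedup before reaching for SIMD:

static unsafe void BgrToPlanarRgb(byte[] source, byte[] result, int imageSize)
{
    fixed (byte* src = source)
    fixed (byte* dst = result)
    {
        byte* s = src;
        byte* r = dst;                 // start of the R plane
        byte* g = dst + imageSize;     // start of the G plane
        byte* b = dst + 2 * imageSize; // start of the B plane
        for (int i = 0; i < imageSize; i++)
        {
            *b++ = *s++; // B
            *g++ = *s++; // G
            *r++ = *s++; // R
        }
    }
}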
Basler has an SDK for their cameras, called Basler Pylon, which works on Windows and Linux.
This SDK has APIs for C++, C# and more.
It has an image conversion class, PixelDataConverter, which seems to be what you need.
I am attempting to create a classifier/predictor using SURF and a naive Bayes classifier. I am pretty much following the technique from "Visual Categorization with Bags of Keypoints" by Dance, Csurka et al., but I am using SURF instead of SIFT.
My results are pretty horrendous and I am not sure where my error lies. I am using 20 car samples (ham) and 20 motorcycle samples (spam) from the Caltech set. I suspect the error is in the way I am creating my vocabulary. What I can see is that the EMGU/OpenCV kMeans2 clustering returns different results given the same SURF descriptor input, which makes me suspicious. Here is my code so far.
public Matrix<float> Extract<TColor, TDepth>(Image<TColor, TDepth> image)
    where TColor : struct, Emgu.CV.IColor
    where TDepth : new()
{
    ImageFeature[] modelDescriptors;
    using (var imgGray = image.Convert<Gray, byte>())
    {
        var modelKeyPoints = surfCPU.DetectKeyPoints(imgGray, null);
        //the SURF descriptor is a size-64 vector describing the intensity pattern
        //surrounding the corresponding modelKeyPoint
        modelDescriptors = surfCPU.ComputeDescriptors(imgGray, null, modelKeyPoints);
    }
    var samples = new Matrix<float>(modelDescriptors.Length, DESCRIPTOR_COUNT); //SURF descriptors have 64 elements
    for (int k = 0; k < modelDescriptors.Length; k++)
    {
        for (int i = 0; i < modelDescriptors[k].Descriptor.Length; i++)
        {
            samples.Data[k, i] = modelDescriptors[k].Descriptor[i];
        }
    }
    //group descriptors into clusters using k-means to form the feature vectors
    //create the "vocabulary" based on square-error partitioning k-means
    var centers = new Matrix<float>(CLUSTER_COUNT, samples.Cols, 1);
    var term = new MCvTermCriteria();
    var labelVector = new Matrix<int>(modelDescriptors.Length, 1);
    var cluster = CvInvoke.cvKMeans2(samples, CLUSTER_COUNT, labelVector, term, 3, IntPtr.Zero, 0, centers, IntPtr.Zero);
    //this is the quantized feature vector as described in Dance, Csurka et al., Bag of Keypoints (2004)
    var keyPoints = new Matrix<float>(1, CLUSTER_COUNT);
    //quantize the vector into a feature vector by making a histogram of the cluster label counts
    for (int i = 0; i < labelVector.Rows; i++)
    {
        var value = labelVector.Data[i, 0];
        keyPoints.Data[0, value]++;
    }
    //normalize the histogram since images have different numbers of keypoints
    keyPoints = keyPoints / keyPoints.Norm;
    return keyPoints;
}
The output gets fed into NormalBayesClassifier. This is how I train it.
Parallel.For(0, hamCount, i =>
{
    using (var img = new Image<Gray, byte>(_hams[i].FullName))
    {
        var features = _extractor.Extract(img);
        features.CopyTo(trainingData.GetRow(i));
        trainingClass.Data[i, 0] = 1;
    }
});

Parallel.For(0, spamCount, j =>
{
    using (var img = new Image<Gray, byte>(_spams[j].FullName))
    {
        var features = _extractor.Extract(img);
        features.CopyTo(trainingData.GetRow(j + hamCount)); // offset past the ham rows
        trainingClass.Data[j + hamCount, 0] = 0;
    }
});
using (var classifier = new NormalBayesClassifier())
{
    if (classifier.Train(trainingData, trainingClass, null, null, false))
    {
        classifier.Save(_statModelFilePath);
    }
}
When I call Predict using the NormalBayesClassifier, it returns 1 (match) for all of the training samples, ham and spam.
Any help would be greatly appreciated.
Edit: one other note is that I have tried CLUSTER_COUNT values from 5 to 500, all with the same result.
The problem was more conceptual than technical. I did not understand that the k-means clustering should build the vocabulary for the entire data set. The way to do it correctly is to give the CvInvoke.cvKMeans2 call a training matrix containing all of the features for every image. I was building the vocabulary each time based on a single image.
My final solution involved pulling the SURF code into its own method and running that on each ham and spam image. I then used the massive result set to build a training matrix and gave that to the CvInvoke.cvKMeans2 method. The training took quite a long time to finish; I have about 3000 images in total.
My results were better: the prediction rate was 100% accurate on the training data. My problem now is that I am likely suffering from overfitting, because the prediction rate is still poor for non-training data. I will play around with the Hessian threshold in the SURF algorithm as well as the cluster count to see if I can minimize the overfitting.
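A rough sketch of that fix, for anyone who lands here. The ExtractSurfDescriptors helper and the explicit termination criteria are my own placeholders; DESCRIPTOR_COUNT and CLUSTER_COUNT are the constants from the question, and _hams/_spams are the same file collections:

// 1) collect SURF descriptors from ALL images,
// 2) run k-means ONCE to build the shared vocabulary,
// 3) quantize each image against that vocabulary.
var allDescriptors = new List<Matrix<float>>();
foreach (var file in _hams.Concat(_spams))
{
    using (var img = new Image<Gray, byte>(file.FullName))
        allDescriptors.Add(ExtractSurfDescriptors(img)); // hypothetical per-image Nx64 extractor
}

// Stack every image's descriptors into one matrix and cluster once
int totalRows = allDescriptors.Sum(m => m.Rows);
var samples = new Matrix<float>(totalRows, DESCRIPTOR_COUNT);
int row = 0;
foreach (var m in allDescriptors)
{
    m.CopyTo(samples.GetRows(row, row + m.Rows, 1));
    row += m.Rows;
}

var labels = new Matrix<int>(totalRows, 1);
var centers = new Matrix<float>(CLUSTER_COUNT, DESCRIPTOR_COUNT);
CvInvoke.cvKMeans2(samples, CLUSTER_COUNT, labels,
    new MCvTermCriteria(100, 0.01), 3, IntPtr.Zero, 0, centers, IntPtr.Zero);

// 'centers' is now the shared vocabulary: quantize each image's descriptors
// against it (nearest center -> histogram bin) to build its feature vector.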
I want to write a distributed software system (a system where you can execute programs faster than on a single PC) that can execute different kinds of programs. (As it is a school project, I'll probably run programs like a prime finder and a Pi calculator on it.)
My preferences are that it should be written in C# with .NET, have good documentation, be simple to use (I'm not new to C# with .NET, but I'm not a professional), and make it easy to write tasks for the grid and/or to load programs onto the network directly from an .exe.
I've looked a little at:
MPAPI
Utilify(from the makers of Alchemy)
NGrid (Outdated?)
Which one is the best for my case? Do you have any experience with them?
P.S. I'm aware of many similar questions here, but they were either outdated, lacked proper answers, or didn't answer my question, and therefore I chose to ask again.
I just contacted the founder of Utilify (Krishna Nadiminti), and while active development has paused for now, he has kindly released all the source code here on Bitbucket.
I think it is worth continuing this project, as there is literally no comparable alternative as of now (even commercial). I may start working on it, but don't wait for me :).
I had the same problem. I tried NGrid, Alchemi and MS PI.net.
In the end I decided to start my own open-source project to play around with; check here: http://lucygrid.codeplex.com/.
UPDATE:
This is how the PI example looks. The function passed to AsParallelGrid will be executed by the grid nodes.
You can play with it by running the DEMO project.
/// <summary>
/// Distributes simple constant processing
/// </summary>
class PICalculation : AbstractDemo
{
    public int Steps = 100000;
    public int ChunkSize = 50;

    public PICalculation()
    {
    }

    public override string Info()
    {
        return "Calculates PI over the grid.";
    }

    public override string Run(bool enableLocalProcessing)
    {
        double sum = 0.0;
        double step = 1.0 / (double)Steps;

        /* ORIGINAL VERSION
        object obj = new object();
        Parallel.ForEach(
            Partitioner.Create(0, Steps),
            () => 0.0,
            (range, state, partial) =>
            {
                for (long i = range.Item1; i < range.Item2; i++)
                {
                    double x = (i - 0.5) * step;
                    partial += 4.0 / (1.0 + x * x);
                }
                return partial;
            },
            partial => { lock (obj) sum += partial; });
        */

        sum = Enumerable
            .Range(0, Steps)
            // Create buckets of ChunkSize indices each
            .GroupBy(s => s / ChunkSize)
            // Local variable initialization is not distributed over the grid
            .Select(i => new
            {
                Item1 = i.First(),
                Item2 = i.Last() + 1, // exclusive upper bound
                Step = step
            })
            .AsParallelGrid(data =>
            {
                double partial = 0;
                for (var i = data.Item1; i != data.Item2; ++i)
                {
                    double x = (i - 0.5) * data.Step;
                    partial += (double)(4.0 / (1.0 + x * x));
                }
                return partial;
            }, new GridSettings()
            {
                EnableLocalProcessing = enableLocalProcessing
            })
            .Sum() * step;

        return sum.ToString();
    }
}
I'm looking for a library or existing code to simplify fractions.
Does anyone have anything at hand, or any links?
P.S. I already understand the process but really don't want to reinvent the wheel.
Update
OK, I've checked out the fraction library on CodeProject, BUT the problem I have is a little bit trickier than simplifying a fraction.
I have to reduce a percentage split which could be 20% / 50% / 30% (always adding up to 100%).
I think you just need to divide by the GCD of all the numbers.
void Simplify(int[] numbers)
{
    int gcd = GCD(numbers);
    for (int i = 0; i < numbers.Length; i++)
        numbers[i] /= gcd;
}

int GCD(int a, int b)
{
    // Euclid's algorithm
    while (b > 0)
    {
        int rem = a % b;
        a = b;
        b = rem;
    }
    return a;
}

int GCD(int[] args)
{
    // using LINQ:
    return args.Aggregate((gcd, arg) => GCD(gcd, arg));
}
I haven't tried the code, but it seems simple enough to be right (assuming your numbers are all positive integers and you don't pass an empty array).
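For the percentage split from the question, usage would look like this:

int[] split = { 20, 50, 30 };
Simplify(split);  // GCD(20, 50, 30) == 10
// split is now { 2, 5, 3 }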
You can use Microsoft.FSharp.Math.BigRational, which is in the free F# PowerPack library. Although it depends on F# (which is gratis and included in VS2010), it can be used from C#.
BigRational reduced = BigRational.FromInt(4) / BigRational.FromInt(6);
Console.WriteLine(reduced);             // 2/3
Console.WriteLine(reduced.Numerator);   // 2
Console.WriteLine(reduced.Denominator); // 3
This library looks like it might be what you need:
var f = new Fraction(numerator, denominator);
numerator = f.Numerator;
denominator = f.Denominator;
I haven't tested it, though, so you may need to play around with it to get it working.
The best example of a Fraction (aka Rational) class I've seen is in Timothy Budd's "Classic Data Structures in C++". His implementation is very good, and it includes a simple implementation of the GCD algorithm.
It shouldn't be hard to adapt to C#.
A custom solution:
void simplify(int[] numbers)
{
    // Try every candidate divisor from 50 down to 1;
    // whenever all values are divisible, divide them through
    for (int divideBy = 50; divideBy > 0; divideBy--)
    {
        bool divisible = true;
        foreach (int cur in numbers)
        {
            // check for divisibility
            if (cur % divideBy != 0)
            {
                divisible = false;
                break;
            }
        }
        if (divisible)
        {
            for (int i = 0; i < numbers.Length; i++)
            {
                numbers[i] /= divideBy;
            }
        }
    }
}
Example usage:
int[] percentages = { 20, 30, 50 };
simplify(percentages);
foreach (int p in percentages)
{
    Console.WriteLine(p);
}
Outputs:
2
3
5
By the way, this is my first C# program. I thought it would be a fun problem to try a new language with, and now I'm in love! It's like Java, but everything I wished were a bit different is exactly how I wanted it.
<3 C#
Edit: By the way, don't forget to make it static if you're calling it from your Main method.