Looking for ML.NET sample - Machine Learning in Visual Studio

Created at 18 Oct 2020, 15:08

prosteel1


Looking for ML.NET sample - Machine Learning in Visual Studio
18 Oct 2020, 15:08


Hi,

I've only just found that Visual Studio has a machine learning implementation called ML.NET.

 

It looks like it might allow machine learning to be used in cTrader but I don't see many posts about it. 

Does it work or not?

It seems like the first step is to write a .csv file from cTrader, then train a model in Visual Studio, and then use that model in a cBot.
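
For the CSV step, here is a minimal sketch of what I have in mind (the folder, file layout and class name are just placeholders, not from any sample):

using System.IO;
using cAlgo.API;

namespace cAlgo.Robots
{
    // FileSystem access rights are needed so the cBot is allowed to write the file
    [Robot(TimeZone = TimeZones.UTC, AccessRights = AccessRights.FileSystem)]
    public class BarExporter : Robot
    {
        protected override void OnStart()
        {
            var folder = "C:\\MLData";               // placeholder output folder
            Directory.CreateDirectory(folder);
            var path = Path.Combine(folder, "bars.csv");

            using (var writer = new StreamWriter(path))
            {
                writer.WriteLine("Time,Open,High,Low,Close");

                for (int i = 0; i < Bars.Count; i++)
                {
                    writer.WriteLine("{0:yyyy-MM-dd HH:mm},{1},{2},{3},{4}",
                        Bars.OpenTimes[i], Bars.OpenPrices[i], Bars.HighPrices[i],
                        Bars.LowPrices[i], Bars.ClosePrices[i]);
                }
            }

            Stop();   // one-off export of the chart's bars, then stop the cBot
        }
    }
}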

Any further info or thoughts would be appreciated.

Regards


Replies

genappsforex
19 Oct 2020, 20:20

RE:

There is a problem with the framework (see the discussions on netstandard 2.0 in this forum).

 



prosteel1
21 Oct 2020, 13:18

RE: RE:

genappsforex said:

There is a problem with the framework (see the discussions on netstandard 2.0 in this forum).

 

Thanks,

I've been trying for the past 3 days to get the netstandard2.0-based ML.NET running in a cAlgo file in Visual Studio. Despite changing the target framework and target platform, I haven't succeeded.

@PanagiotisCharalampous mentioned that cTrader 4.0 would be upgraded to .NET Core; would that upgrade allow it to work?



ctid2032775
22 Oct 2020, 09:35

RE:

prosteel1 said:

Hi,

I've only just found that Visual Studio has a machine learning implementation called ML.NET.

 

It looks like it might allow machine learning to be used in cTrader but I don't see many posts about it. 

Does it work or not?

It seems like the first step is to write a .csv file from cTrader, then train a model in Visual Studio, and then use that model in a cBot.

Any further info or thoughts would be appreciated.

Regards

Hi,

I'm using machine learning in one of my cBots for training a model (a C4.5 decision tree), checking its accuracy (TP, FP, TN, FN) and making predictions before opening a position.

It's working fine, but I'm using the Accord.NET framework instead of ML.NET. With this library everything can be done directly in a cBot, without writing a CSV file and developing an additional stand-alone application for the machine learning part!

There are a lot of documents and code examples available, and the modules are really well explained. You can find all the information under the Accord.NET Framework documentation.
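
Roughly, the training and prediction flow looks like this, for example inside OnStart (a minimal sketch with made-up feature values; in the real cBot the inputs come from indicator values and the labels from trade outcomes):

using Accord.MachineLearning.DecisionTrees;
using Accord.MachineLearning.DecisionTrees.Learning;
using Accord.Math.Optimization.Losses;

// Toy training set: two numeric features per sample, label 0 = don't trade, 1 = trade
double[][] inputs =
{
    new double[] { 0.2, 1.3 },
    new double[] { 0.8, 0.1 },
    new double[] { 0.3, 1.1 },
    new double[] { 0.9, 0.2 }
};
int[] outputs = { 1, 0, 1, 0 };

// Train a C4.5 decision tree directly inside the cBot (no CSV, no separate application)
var teacher = new C45Learning();
DecisionTree tree = teacher.Learn(inputs, outputs);

// Accuracy check (here on the training set; in practice use a held-out test set)
int[] predicted = tree.Decide(inputs);
double error = new ZeroOneLoss(outputs).Loss(predicted);

// Prediction before opening a position
int decision = tree.Decide(new double[] { 0.25, 1.2 });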

BR,
Christian



prosteel1
22 Oct 2020, 15:25

RE: RE:

ctid2032775 said:

Hi,

I'm using machine learning in one of my cBots for training a model (a C4.5 decision tree), checking its accuracy (TP, FP, TN, FN) and making predictions before opening a position.

It's working fine, but I'm using the Accord.NET framework instead of ML.NET. With this library everything can be done directly in a cBot, without writing a CSV file and developing an additional stand-alone application for the machine learning part!

There are a lot of documents and code examples available, and the modules are really well explained. You can find all the information under the Accord.NET Framework documentation.

BR,
Christian

 

Looks great, thanks heaps!



prosteel1
23 Oct 2020, 17:23

RE: RE:

The founder of Accord.NET has left the project. It is listed as having 9 developers, but the project seems to be in a coma, on life support.

While it does have good documentation for some things, without much testing and fixing going on, the code will stay stuck where it is for a long time.

For example, I'm looking at Genetics (similar to cTrader Optimisation), but if I find a bug I'll have to fix it myself, which I likely don't have the ability to do.

cTrader Optimisation doesn't support multiple timeframes, hence my need for machine learning that can work across multiple timeframes.
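
To illustrate what I mean by multiple timeframes (a rough sketch; the feature layout is made up), a cBot can request bars for other timeframes and feed them to a model as extra features:

// Inside a cBot: pull additional timeframes alongside the chart's own bars
var h1Bars = MarketData.GetBars(TimeFrame.Hour);
var h4Bars = MarketData.GetBars(TimeFrame.Hour4);

// Example feature vector mixing the two timeframes (purely illustrative)
double[] features =
{
    h1Bars.ClosePrices.Last(1) - h1Bars.OpenPrices.Last(1),   // body of the last closed H1 candle
    h4Bars.ClosePrices.Last(1) - h4Bars.OpenPrices.Last(1)    // body of the last closed H4 candle
};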

 

On the flip side, ML.NET is in active development, and reported bugs are fixed quite quickly.

 

There is a danger in using a project that may never be supported: if or when a bug turns up or a feature is needed, the fix may never come.

 

I would like to see the upcoming cTrader version 4 support ML.NET through the upgrade of cTrader from the .NET 4.0 client framework to .NET Core. I can see that being a long-term solution that would benefit cTrader and its users greatly.

In the meantime I'll be looking to use Accord, knowing that I might come up against a brick wall but hoping that I don't :).



afhacker
23 Oct 2020, 19:29

cTrader will migrate to .NET Core, and in the future you will be able to use ML.NET. But if you are a serious algo trader and want to use ML in your algos, there is no need to wait; you don't have to use .NET at all for your ML work. Use Keras or any other ML library, such as scikit-learn, with the Spotware Open API.

You can also use ML.NET with the Open API. I use Python when it comes to ML, and for trading I use the Open API.

If you don't want to use the Open API, or the authentication is hard for you to implement, you can use Python ML libraries to build your model and then send the model's signals to a cBot over a socket connection.
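
On the cBot side that could look roughly like this (the host, port, the one-line "buy"/"sell" protocol and the volume are just assumptions for the sketch):

using System.IO;
using System.Net.Sockets;
using cAlgo.API;

namespace cAlgo.Robots
{
    // FullAccess is needed so the cBot may open a raw socket
    [Robot(TimeZone = TimeZones.UTC, AccessRights = AccessRights.FullAccess)]
    public class MlSignalReceiver : Robot
    {
        protected override void OnBar()
        {
            // Ask the local Python model process for a signal once per bar
            // (a blocking request/response kept deliberately simple for the sketch)
            using (var client = new TcpClient("127.0.0.1", 9000))
            using (var reader = new StreamReader(client.GetStream()))
            {
                var signal = reader.ReadLine();

                if (signal == "buy")
                    ExecuteMarketOrder(TradeType.Buy, SymbolName, 1000);
                else if (signal == "sell")
                    ExecuteMarketOrder(TradeType.Sell, SymbolName, 1000);
            }
        }
    }
}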

I love .NET, but for ML it's not so mature right now.

 



prosteel1
23 Oct 2020, 20:07

RE:

afhacker said:

cTrader will migrate to .NET Core, and in the future you will be able to use ML.NET. But if you are a serious algo trader and want to use ML in your algos, there is no need to wait; you don't have to use .NET at all for your ML work. Use Keras or any other ML library, such as scikit-learn, with the Spotware Open API.

You can also use ML.NET with the Open API. I use Python when it comes to ML, and for trading I use the Open API.

If you don't want to use the Open API, or the authentication is hard for you to implement, you can use Python ML libraries to build your model and then send the model's signals to a cBot over a socket connection.

I love .NET, but for ML it's not so mature right now.

 

Wow, thanks for the answer to my question "@PanagiotisCharalampous mentioned that cTrader 4.0 would be upgraded to .NET Core; would that upgrade allow it to work?"

While I am serious about my algos, I doubt I'm good enough to use the Spotware Open API. I know it's available, but it sounds like I would need to be a proper developer to use it, while I'm just a hack who's been learning C# for the past 2.5 years, simply because the cTrader team seem to provide the best support of any trading platform.

Your description of how you use it is very helpful and I appreciate you replying with great info.

I've been liking the advances in TensorFlow lately. The future looks good for using machine learning in algos!

 

Since I have all of my code in cTrader and am only looking to implement ML in the trading decision code, which I am about to rewrite, perhaps I could use ML for the decisions and the Spotware Open API to place the trades. I'll have a look, and thanks for the suggestion :)

I think I've just taken the blue pill and am about to see how far down the .NET rabbit hole goes lol.



prosteel1
26 Oct 2020, 13:00

I got a non-trading classifier example from https://docs.microsoft.com/en-us/archive/msdn-magazine/2012/july/test-run-classification-and-prediction-using-neural-networks running (it's the fourth download on https://docs.microsoft.com/en-us/archive/msdn-magazine/2012/july/july-2012-code-downloads), but the extra classes can't access the cAlgo Print method for some reason, so not all of the data is printed to the log.

It's a start to see ML working in cAlgo. It's from a YouTube video by a Microsoft dev who seems to know his stuff. This is ML written from scratch, without any external ML libraries.
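
One likely reason, plus a possible workaround (just a sketch; the DemoLog name is mine, not part of the sample below): Print is an instance method of the Robot class, so the plain helper classes have no way to call it, but the Robot can hand them a logging delegate in OnStart.

using System;

// Illustrative logging shim: the Robot assigns its Print method here in OnStart,
// and the helper classes call DemoLog.Write(...) instead of Console.WriteLine(...)
public static class DemoLog
{
    public static Action<string> Write = s => { };   // no-op until the Robot sets it
}

// In NeuralNet1.OnStart():
//     DemoLog.Write = s => Print(s);
//
// In Helpers / NeuralNetwork, replace e.g.:
//     Console.WriteLine("Processing complete");
// with:
//     DemoLog.Write("Processing complete");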

using System;
using System.IO;
using System.Linq;
using cAlgo.API;
using cAlgo.API.Indicators;
using cAlgo.API.Internals;
using cAlgo.Indicators;

namespace cAlgo.Robots
{
    [Robot(TimeZone = TimeZones.UTC, AccessRights = AccessRights.FileSystem)]
    public class NeuralNet1 : Robot
    {
        static Random rnd = null;

        protected override void OnStart()
        {
            // Put your initialization logic here
        }

        protected override void OnTick()
        {
            try
            {
                // Run the neural network classification demo (note: this executes on every tick)
                Print("\nBegin neural network classification demo\n");
                Print("Goal is to predict/classify color based on four numeric inputs\n");
                rnd = new Random(159);
                // 159 makes 'good' output
                Print("Creating 100 lines of raw data");
                string dataFile = "..\\..\\colors.txt";
                MakeData(dataFile, 100);

                Print("\nFirst few rows of raw data file are:");
                Helpers.ShowTextFile(dataFile, 4);

                double[][] trainMatrix = null;
                double[][] testMatrix = null;
                Print("\nGenerating train and test matrices using an 80%-20% split");
                MakeTrainAndTest(dataFile, out trainMatrix, out testMatrix);

                Print("\nFirst few rows of training matrix are:");
                Helpers.ShowMatrix(trainMatrix, 5);

                Print("\nCreating 4-input 5-hidden 3-output neural network");
                NeuralNetwork nn = new NeuralNetwork(4, 5, 3);

                Print("Training to find best neural network weights using PSO with cross entropy error");
                double[] bestWeights = nn.Train(trainMatrix);
                Print("\nBest weights found:");
                Helpers.ShowVector(bestWeights, 2, true);

                Print("\nLoading best weights into neural network");
                nn.SetWeights(bestWeights);

                Print("\nAnalyzing the neural network accuracy on the test data\n");
                double accuracy = nn.Test(testMatrix);
                Print("Prediction accuracy = " + accuracy.ToString("F4"));

                Print("\nEnd neural network classification demo\n");
                Console.ReadLine();
            } catch (Exception ex)
            {
                Print("Fatal: " + ex.Message);
                Console.ReadLine();
            }
        }

        protected override void OnStop()
        {
            // Put your deinitialization logic here
        }
        static void MakeData(string dataFile, int numLines)
        {
            double[] weights = new double[]
            {
                -0.1, 0.2, -0.3, 0.4, -0.5, 0.6, -0.7, 0.8, -0.9, 1.0,
                -1.1, 1.2, -1.3, 1.4, -1.5, 1.6, -1.7, 1.8, -1.9, 2.0,
                -0.5, 0.6, -0.7, 0.8, -0.9, 1.5, -1.4, 1.3, -1.2, 1.1,
                -1.0, 0.9, -0.8, 0.7, -0.6, 0.5, -0.4, 0.3, -0.2, 0.1,
                0.1, -0.3, 0.6
            };

            NeuralNetwork nn = new NeuralNetwork(4, 5, 3);
            nn.SetWeights(weights);

            FileStream ofs = new FileStream(dataFile, FileMode.Create);
            StreamWriter sw = new StreamWriter(ofs);

            for (int i = 0; i < numLines; ++i)
            {
                double[] inputs = new double[4];
                for (int j = 0; j < inputs.Length; ++j)
                    inputs[j] = rnd.Next(1, 10);

                double[] outputs = nn.ComputeOutputs(inputs);

                string color = "";
                int idx = Helpers.IndexOfLargest(outputs);
                if (idx == 0)
                {
                    color = "red";
                }
                else if (idx == 1)
                {
                    color = "green";
                }
                else if (idx == 2)
                {
                    color = "blue";
                }

                sw.WriteLine(inputs[0].ToString("F1") + " " + inputs[1].ToString("F1") + " " + inputs[2].ToString("F1") + " " + inputs[3].ToString("F1") + " " + color);
            }
            sw.Close();
            ofs.Close();
        }
        // MakeData
        static void MakeTrainAndTest(string file, out double[][] trainMatrix, out double[][] testMatrix)
        {
            int numLines = 0;
            FileStream ifs = new FileStream(file, FileMode.Open);
            StreamReader sr = new StreamReader(ifs);
            while (sr.ReadLine() != null)
                ++numLines;
            sr.Close();
            ifs.Close();

            int numTrain = (int)(0.8 * numLines);
            int numTest = numLines - numTrain;

            double[][] allData = new double[numLines][];
            // could use Helpers.MakeMatrix here
            for (int i = 0; i < allData.Length; ++i)
                allData[i] = new double[7];
            // (x0, x1, x2, x3), (y0, y1, y2)
            string line = "";
            string[] tokens = null;
            ifs = new FileStream(file, FileMode.Open);
            sr = new StreamReader(ifs);
            int row = 0;
            while ((line = sr.ReadLine()) != null)
            {
                tokens = line.Split(' ');
                allData[row][0] = double.Parse(tokens[0]);
                allData[row][1] = double.Parse(tokens[1]);
                allData[row][2] = double.Parse(tokens[2]);
                allData[row][3] = double.Parse(tokens[3]);

                for (int i = 0; i < 4; ++i)
                    allData[row][i] = 0.25 * allData[row][i] - 1.25;
                // scale input data to [-1.0, +1.0]
                if (tokens[4] == "red")
                {
                    allData[row][4] = 1.0;
                    allData[row][5] = 0.0;
                    allData[row][6] = 0.0;
                }
                else if (tokens[4] == "green")
                {
                    allData[row][4] = 0.0;
                    allData[row][5] = 1.0;
                    allData[row][6] = 0.0;
                }
                else if (tokens[4] == "blue")
                {
                    allData[row][4] = 0.0;
                    allData[row][5] = 0.0;
                    allData[row][6] = 1.0;
                }
                ++row;
            }
            sr.Close();
            ifs.Close();

            Helpers.ShuffleRows(allData);

            trainMatrix = Helpers.MakeMatrix(numTrain, 7);
            testMatrix = Helpers.MakeMatrix(numTest, 7);

            for (int i = 0; i < numTrain; ++i)
            {
                allData[i].CopyTo(trainMatrix[i], 0);
            }

            for (int i = 0; i < numTest; ++i)
            {
                allData[i + numTrain].CopyTo(testMatrix[i], 0);
            }
        }
        // MakeTrainAndTest
    }
    class NeuralNetwork
    {
        private int numInput;
        private int numHidden;
        private int numOutput;

        private double[] inputs;
        private double[][] ihWeights;
        // input-to-hidden
        private double[] ihSums;
        private double[] ihBiases;
        private double[] ihOutputs;
        private double[][] hoWeights;
        // hidden-to-output
        private double[] hoSums;
        private double[] hoBiases;
        private double[] outputs;

        static Random rnd = null;

        public NeuralNetwork(int numInput, int numHidden, int numOutput)
        {
            this.numInput = numInput;
            this.numHidden = numHidden;
            this.numOutput = numOutput;

            inputs = new double[numInput];
            ihWeights = Helpers.MakeMatrix(numInput, numHidden);
            ihSums = new double[numHidden];
            ihBiases = new double[numHidden];
            ihOutputs = new double[numHidden];
            hoWeights = Helpers.MakeMatrix(numHidden, numOutput);
            hoSums = new double[numOutput];
            hoBiases = new double[numOutput];
            outputs = new double[numOutput];

            rnd = new Random(0);
        }

        public void SetWeights(double[] weights)
        {
            int numWeights = (numInput * numHidden) + (numHidden * numOutput) + numHidden + numOutput;
            if (weights.Length != numWeights)
                throw new Exception("The weights array length: " + weights.Length + " does not match the total number of weights and biases: " + numWeights);

            int k = 0;
            // points into weights param
            for (int i = 0; i < numInput; ++i)
                for (int j = 0; j < numHidden; ++j)
                    ihWeights[i][j] = weights[k++];

            for (int i = 0; i < numHidden; ++i)
                ihBiases[i] = weights[k++];

            for (int i = 0; i < numHidden; ++i)
                for (int j = 0; j < numOutput; ++j)
                    hoWeights[i][j] = weights[k++];

            for (int i = 0; i < numOutput; ++i)
                hoBiases[i] = weights[k++];
        }

        public double[] ComputeOutputs(double[] currInputs)
        {
            if (currInputs.Length != numInput)
                throw new Exception("Inputs array length " + currInputs.Length + " does not match NN numInput value " + numInput);

            for (int i = 0; i < numHidden; ++i)
                this.ihSums[i] = 0.0;
            //for (int i = 0; i < numHidden; ++i)
            //  this.ihOutputs[i] = 0.0;
            for (int i = 0; i < numOutput; ++i)
                this.hoSums[i] = 0.0;
            //for (int i = 0; i < numOutput; ++i)
            //  this.outputs[i] = 0.0;


            for (int i = 0; i < currInputs.Length; ++i)
                // copy
                this.inputs[i] = currInputs[i];

            //Console.WriteLine("Inputs:");
            //ShowVector(this.inputs);

            //Console.WriteLine("input-to-hidden weights:");
            //ShowMatrix(this.ihWeights);

            for (int j = 0; j < numHidden; ++j)
                // compute input-to-hidden sums
                for (int i = 0; i < numInput; ++i)
                    ihSums[j] += this.inputs[i] * ihWeights[i][j];

            //Console.WriteLine("input-to-hidden sums:");
            //ShowVector(this.ihSums);

            //Console.WriteLine("input-to-hidden biases:");
            //ShowVector(ihBiases);

            for (int i = 0; i < numHidden; ++i)
                // add biases to input-to-hidden sums
                ihSums[i] += ihBiases[i];

            //Console.WriteLine("input-to-hidden sums after adding biases:");
            //ShowVector(this.ihSums);

            for (int i = 0; i < numHidden; ++i)
                // determine input-to-hidden output
                //ihOutputs[i] = StepFunction(ihSums[i]); // step function
                ihOutputs[i] = SigmoidFunction(ihSums[i]);
            //ihOutputs[i] = TanhFunction(ihSums[i]);

            //Console.WriteLine("input-to-hidden outputs after sigmoid:");
            //ShowVector(this.ihOutputs);

            //Console.WriteLine("hidden-to-output weights:");
            //ShowMatrix(hoWeights);


            for (int j = 0; j < numOutput; ++j)
                // compute hidden-to-output sums
                for (int i = 0; i < numHidden; ++i)
                    hoSums[j] += ihOutputs[i] * hoWeights[i][j];

            //Console.WriteLine("hidden-to-output sums:");
            //ShowVector(hoSums);

            //Console.WriteLine("hidden-to-output biases:");
            //ShowVector(this.hoBiases);

            for (int i = 0; i < numOutput; ++i)
                // add biases to hidden-to-output sums
                hoSums[i] += hoBiases[i];

            //Console.WriteLine("hidden-to-output sums after adding biases:");
            //ShowVector(this.hoSums);

            //for (int i = 0; i < numOutput; ++i)   // determine hidden-to-output result
            //  this.outputs[i] = SigmoidFunction(hoSums[i]);  // step function

            //double[] result = new double[numOutput];
            //this.outputs.CopyTo(result, 0);
            //return result;

            double[] result = Softmax(hoSums);

            result.CopyTo(this.outputs, 0);

            //Console.WriteLine("outputs after softmaxing:");
            //ShowVector(result);

            //Console.ReadLine();

            //double[] result = Hardmax(hoSums);
            return result;
        }
        // ComputeOutputs
        //private static double StepFunction(double x)
        //{
        //  if (x > 0.0) return 1.0;
        //  else return 0.0;
        //}

        private static double SigmoidFunction(double x)
        {
            if (x < -45.0)
                return 0.0;
            else if (x > 45.0)
                return 1.0;
            else
                return 1.0 / (1.0 + Math.Exp(-x));
        }

        private static double[] Softmax(double[] hoSums)
        {
            // determine max
            double max = hoSums[0];
            for (int i = 0; i < hoSums.Length; ++i)
                if (hoSums[i] > max)
                    max = hoSums[i];

            // determine scaling factor (sum of exp(each val - max))
            double scale = 0.0;
            for (int i = 0; i < hoSums.Length; ++i)
                scale += Math.Exp(hoSums[i] - max);

            double[] result = new double[hoSums.Length];
            for (int i = 0; i < hoSums.Length; ++i)
                result[i] = Math.Exp(hoSums[i] - max) / scale;

            return result;
        }

        // seek and return the best weights
        public double[] Train(double[][] trainMatrix)
        {
            int numWeights = (this.numInput * this.numHidden) + (this.numHidden * this.numOutput) + this.numHidden + this.numOutput;
            //double[] currWeights = new double[numWeights];

            // use PSO to seek best weights
            int numberParticles = 10;
            int numberIterations = 500;
            int iteration = 0;
            int Dim = numWeights;
            // number of values to solve for
            double minX = -5.0;
            // for each weight
            double maxX = 5.0;

            Particle[] swarm = new Particle[numberParticles];
            double[] bestGlobalPosition = new double[Dim];
            // best solution found by any particle in the swarm. implicit initialization to all 0.0
            double bestGlobalFitness = double.MaxValue;
            // smaller values better
            double minV = -0.1 * maxX;
            // velocities
            double maxV = 0.1 * maxX;

            // initialize each Particle in the swarm with random positions and velocities
            for (int i = 0; i < swarm.Length; ++i)
            {
                double[] randomPosition = new double[Dim];
                for (int j = 0; j < randomPosition.Length; ++j)
                {
                    double lo = minX;
                    double hi = maxX;
                    randomPosition[j] = (hi - lo) * rnd.NextDouble() + lo;
                }

                double fitness = CrossEntropy(trainMatrix, randomPosition);
                // smaller values better
                double[] randomVelocity = new double[Dim];

                for (int j = 0; j < randomVelocity.Length; ++j)
                {
                    double lo = -1.0 * Math.Abs(maxX - minX);
                    double hi = Math.Abs(maxX - minX);
                    randomVelocity[j] = (hi - lo) * rnd.NextDouble() + lo;
                }
                swarm[i] = new Particle(randomPosition, fitness, randomVelocity, randomPosition, fitness);

                // does current Particle have global best position/solution?
                if (swarm[i].fitness < bestGlobalFitness)
                {
                    bestGlobalFitness = swarm[i].fitness;
                    swarm[i].position.CopyTo(bestGlobalPosition, 0);
                }
            }
            // initialization
            double w = 0.729;
            // inertia weight.
            double c1 = 1.49445;
            // cognitive/local weight
            double c2 = 1.49445;
            // social/global weight
            double r1, r2;
            // cognitive and social randomizations
            Console.WriteLine("Entering main PSO weight estimation processing loop");
            while (iteration < numberIterations)
            {
                ++iteration;
                double[] newVelocity = new double[Dim];
                double[] newPosition = new double[Dim];
                double newFitness;

                // each Particle
                for (int i = 0; i < swarm.Length; ++i)
                {
                    Particle currP = swarm[i];

                    // each x value of the velocity
                    for (int j = 0; j < currP.velocity.Length; ++j)
                    {
                        r1 = rnd.NextDouble();
                        r2 = rnd.NextDouble();

                        newVelocity[j] = (w * currP.velocity[j]) + (c1 * r1 * (currP.bestPosition[j] - currP.position[j])) + (c2 * r2 * (bestGlobalPosition[j] - currP.position[j]));
                        // new velocity depends on old velocity, best position of particle, and best position of any particle
                        if (newVelocity[j] < minV)
                            newVelocity[j] = minV;
                        else if (newVelocity[j] > maxV)
                            newVelocity[j] = maxV;
                        // crude way to keep velocity in range
                    }

                    newVelocity.CopyTo(currP.velocity, 0);

                    for (int j = 0; j < currP.position.Length; ++j)
                    {
                        newPosition[j] = currP.position[j] + newVelocity[j];
                        // compute new position
                        if (newPosition[j] < minX)
                            newPosition[j] = minX;
                        else if (newPosition[j] > maxX)
                            newPosition[j] = maxX;
                    }

                    newPosition.CopyTo(currP.position, 0);

                    newFitness = CrossEntropy(trainMatrix, newPosition);
                    // compute error of the new position
                    currP.fitness = newFitness;

                    // new particle best?
                    if (newFitness < currP.bestFitness)
                    {
                        newPosition.CopyTo(currP.bestPosition, 0);
                        currP.bestFitness = newFitness;
                    }

                    // new global best?
                    if (newFitness < bestGlobalFitness)
                    {
                        newPosition.CopyTo(bestGlobalPosition, 0);
                        bestGlobalFitness = newFitness;
                    }

                }
                // each Particle
                //Console.WriteLine(swarm[0].ToString());
                //Console.ReadLine();

            }
            // while
            Console.WriteLine("Processing complete");
            Console.Write("Final best (smallest) cross entropy error = ");
            Console.WriteLine(bestGlobalFitness.ToString("F4"));

            return bestGlobalPosition;

        }
        // Train
        // (sum) Cross Entropy
        private double CrossEntropy(double[][] trainData, double[] weights)
        {
            // how good (cross entropy) are weights? CrossEntropy is error so smaller values are better
            this.SetWeights(weights);
            // load the weights and biases to examine
            double sce = 0.0;
            // sum of cross entropy
            // walk thru each training case. looks like (6.9 3.2 5.7 2.3) (0 0 1)  where the parens are not really there
            for (int i = 0; i < trainData.Length; ++i)
            {
                double[] currInputs = new double[4];
                currInputs[0] = trainData[i][0];
                currInputs[1] = trainData[i][1];
                currInputs[2] = trainData[i][2];
                currInputs[3] = trainData[i][3];
                double[] currExpected = new double[3];
                currExpected[0] = trainData[i][4];
                currExpected[1] = trainData[i][5];
                currExpected[2] = trainData[i][6];
                // not really necessary
                double[] currOutputs = this.ComputeOutputs(currInputs);
                // run the inputs through the neural network
                // compute ln of each nn output (and the sum)
                double currSum = 0.0;
                for (int j = 0; j < currOutputs.Length; ++j)
                {
                    if (currExpected[j] != 0.0)
                        currSum += currExpected[j] * Math.Log(currOutputs[j]);
                }
                sce += currSum;
                // accumulate
            }
            return -sce;
        }
        // CrossEntropy
        // returns the accuracy (percent correct predictions)
        public double Test(double[][] testMatrix)
        {
            // assumes that weights have been set using SetWeights
            int numCorrect = 0;
            int numWrong = 0;

            // walk thru each test case. looks like (6.9 3.2 5.7 2.3) (0 0 1)  where the parens are not really there
            for (int i = 0; i < testMatrix.Length; ++i)
            {

                double[] currInputs = new double[4];
                currInputs[0] = testMatrix[i][0];
                currInputs[1] = testMatrix[i][1];
                currInputs[2] = testMatrix[i][2];
                currInputs[3] = testMatrix[i][3];
                double[] currOutputs = new double[3];
                currOutputs[0] = testMatrix[i][4];
                currOutputs[1] = testMatrix[i][5];
                currOutputs[2] = testMatrix[i][6];
                // not really necessary
                double[] currPredicted = this.ComputeOutputs(currInputs);
                // outputs are in softmax form -- each between 0.0, 1.0 representing a prob and summing to 1.0
                //ShowVector(currInputs);
                //ShowVector(currOutputs);
                //ShowVector(currPredicted);

                // use winner-takes all -- highest prob of the prediction
                int indexOfLargest = Helpers.IndexOfLargest(currPredicted);

                // just a few for demo purposes
                if (i <= 3)
                {
                    Console.WriteLine("-----------------------------------");
                    Console.Write("Input:     ");
                    Helpers.ShowVector(currInputs, 2, true);
                    Console.Write("Output:    ");
                    Helpers.ShowVector(currOutputs, 1, false);
                    if (currOutputs[0] == 1.0)
                        Console.WriteLine(" (red)");
                    else if (currOutputs[1] == 1.0)
                        Console.WriteLine(" (green)");
                    else
                        Console.WriteLine(" (blue)");
                    Console.Write("Predicted: ");
                    Helpers.ShowVector(currPredicted, 1, false);
                    if (indexOfLargest == 0)
                        Console.WriteLine(" (red)");
                    else if (indexOfLargest == 1)
                        Console.WriteLine(" (green)");
                    else
                        Console.WriteLine(" (blue)");

                    if (currOutputs[indexOfLargest] == 1)
                        Console.WriteLine("correct");
                    else
                        Console.WriteLine("wrong");
                    Console.WriteLine("-----------------------------------");
                }

                if (currOutputs[indexOfLargest] == 1)
                    ++numCorrect;
                else
                    ++numWrong;

                //Console.ReadLine();
            }
            Console.WriteLine(". . .");

            double percentCorrect = (numCorrect * 1.0) / (numCorrect + numWrong);
            Console.WriteLine("\nCorrect = " + numCorrect);
            Console.WriteLine("Wrong = " + numWrong);

            return percentCorrect;
        }
    }
    public class Helpers
    {
        static Random rnd = new Random(0);

        public static double[][] MakeMatrix(int rows, int cols)
        {
            double[][] result = new double[rows][];
            for (int i = 0; i < rows; ++i)
                result[i] = new double[cols];
            return result;
        }

        public static void ShuffleRows(double[][] matrix)
        {
            for (int i = 0; i < matrix.Length; ++i)
            {
                int r = rnd.Next(i, matrix.Length);
                double[] tmp = matrix[r];
                matrix[r] = matrix[i];
                matrix[i] = tmp;
            }
        }

        public static int IndexOfLargest(double[] vector)
        {
            int indexOfLargest = 0;
            double maxVal = vector[0];
            for (int i = 0; i < vector.Length; ++i)
            {
                if (vector[i] > maxVal)
                {
                    maxVal = vector[i];
                    indexOfLargest = i;
                }
            }
            return indexOfLargest;
        }

        public static void ShowVector(double[] vector, int decimals, bool newLine)
        {
            string fmt = "F" + decimals;
            for (int i = 0; i < vector.Length; ++i)
            {
                if (i > 0 && i % 12 == 0)
                    Console.WriteLine("");
                if (vector[i] >= 0.0)
                    Console.Write(" ");
                Console.Write(vector[i].ToString(fmt) + " ");
            }
            if (newLine == true)
                Console.WriteLine("");
        }

        public static void ShowMatrix(double[][] matrix, int numRows)
        {
            int ct = 0;
            if (numRows == -1)
                numRows = int.MaxValue;
            for (int i = 0; i < matrix.Length && ct < numRows; ++i)
            {
                for (int j = 0; j < matrix[0].Length; ++j)
                {
                    if (matrix[i][j] >= 0.0)
                        Console.Write(" ");
                    if (j == 4)
                        Console.Write("-> ");
                    Console.Write(matrix[i][j].ToString("F2") + " ");
                }
                Console.WriteLine("");
                ++ct;
            }
            Console.WriteLine("");
        }

        public static void ShowTextFile(string textFile, int numLines)
        {
            FileStream ifs = new FileStream(textFile, FileMode.Open);
            StreamReader sr = new StreamReader(ifs);
            string line = "";
            int ct = 0;
            while ((line = sr.ReadLine()) != null && ct < numLines)
            {
                Console.WriteLine(line);
                ++ct;
            }
            sr.Close();
            ifs.Close();
        }
    }
    public class Particle
    {
        public double[] position;
        // equivalent to x-Values and/or solution
        public double fitness;
        public double[] velocity;

        public double[] bestPosition;
        // best position found so far by this Particle
        public double bestFitness;

        public Particle(double[] position, double fitness, double[] velocity, double[] bestPosition, double bestFitness)
        {
            this.position = new double[position.Length];
            position.CopyTo(this.position, 0);
            this.fitness = fitness;
            this.velocity = new double[velocity.Length];
            velocity.CopyTo(this.velocity, 0);
            this.bestPosition = new double[bestPosition.Length];
            bestPosition.CopyTo(this.bestPosition, 0);
            this.bestFitness = bestFitness;
        }

        public override string ToString()
        {
            string s = "";
            s += "==========================\n";
            s += "Position: ";
            for (int i = 0; i < this.position.Length; ++i)
                s += this.position[i].ToString("F2") + " ";
            s += "\n";
            s += "Fitness = " + this.fitness.ToString("F4") + "\n";
            s += "Velocity: ";
            for (int i = 0; i < this.velocity.Length; ++i)
                s += this.velocity[i].ToString("F2") + " ";
            s += "\n";
            s += "Best Position: ";
            for (int i = 0; i < this.bestPosition.Length; ++i)
                s += this.bestPosition[i].ToString("F2") + " ";
            s += "\n";
            s += "Best Fitness = " + this.bestFitness.ToString("F4") + "\n";
            s += "==========================\n";
            return s;
        }
    }
}

 

