Back Prop ANNs for Speech Recognition

GPU Processing Artificial Neural Networks (ANNs)

1. Speech, Character, General Patterns Recognition
2. C#, Unity3D Graphics, C++ Accelerated Massive Parallelism
3. Unity3D Audio Analysis

Artificial Neural Networks (ANNs)

     I could jump right into the subject of Artificial Neural Networks with Green's theorem and partial derivatives, as a professor or graduate student might who has long since progressed past the days when they first trained themselves to speak in precise engineering terms, to the expert level where they now use engineering mathematics in lectures, research papers, and books. That exacting mathematics is often necessary for faithfully reproducing a process in the computing and engineering sciences. I prefer, however, to introduce the background that led me to some unique insights, in the hope of cultivating more interest among novices and up-and-coming researchers in the science, and of making for quick, easy reading.

      In 1975, in my youth, I constructed an analog and a digital computing device for the science fair projects in junior high school. My interest in electronics continued through science fiction, high school electronics classes, literature, and education as world technology evolved from tubes to transistors, and finally into the full-blown information age of our world today. Even those earliest, smallest examples of a digital adder and an analog differentiator impressed on my mind how simple the connective design of the analog example was: only a dozen wires, three variable resistors, a battery, and a meter, yet it held a powerful function compared to the digital method, which took some hundred wires with complex connections to switches and light bulbs just to add two digits.

      So this first analog computing device for my science fair project, like several other formative concepts, led to my entry into ANN study. I eventually found myself studying the perceptron and ANNs over and over again, reading hundreds of papers, such as Warren McCulloch and Walter Pitts's work on a logical calculus of nervous activity, and on Hebbian learning. I sometimes found myself online in the university newsgroups, those precursors to today's internet, discussing types of ANN solutions such as the back propagation algorithms or the perceptron. I began building models on programmable calculators, then in MATLAB, Mathematica, and C++, and now I have moved on to GPUs through the hardware-agnostic C++ AMP (Accelerated Massive Parallelism) Microsoft libraries, linked into C# libraries and added to Unity as a 3D display tool. I also became very interested in the history of mathematics, to see why and when linear algebra was being used, and by how many of the students who studied computing in general. It was important to me to know why certain ideas were or were not progressing, and which scientists and graduates held which skill sets needed to advance these AI ANN tools. There are many opinions in these areas, but it is understandable that as literacy increased at a certain rate over the years across the countries and populations of the world, the sciences, engineering, and mathematics also had their own rates of increasing literacy, which affected how widely these subjects were discussed among the general population.

      I can still remember those early university years in the 1990s, after returning from Germany at the end of the Cold War. Sometimes, without notice, as a freshman I would find myself discussing through email and newsgroups with the professors whose books I had been consuming, as they pointed out solutions to non-linear problems of the type that require biases within the back propagation model. For example, by multiplying the other weights together and setting the bias with the result, the network could solve for a general amplitude, such as a brighter or darker picture; in other cases the bias breaks the ratios of similar equations in the solution of sets of equations. I enjoyed those first learning experiences in this area of science and discovered many techniques and important ideas in using these ANN methodologies. I hope you too can find a welcoming, easy entry into this useful and developing set of computing and engineering AI tools. That is the purpose of the endeavor I am continuing here and placing online for others to enjoy and participate in.

The Competition: Tesla K40, K80, Other GeForce 900s

Titan X:    $1,000, 6.9 TFLOPS, 6.9 GFLOPS/$
GTX 980 Ti: $670,   5.6 TFLOPS, 8.4 GFLOPS/$
GTX 980:    $500,   4.1 TFLOPS, 8.2 GFLOPS/$
Tesla K40:  $3,000, 4.3 TFLOPS, 1.4 GFLOPS/$
Tesla K80:  $5,000, 8.7 TFLOPS, 1.7 GFLOPS/$

Back Prop ANN Basic Functions

1. Feed Forward (Identify)
2. Feed Back (Correct Net)

Feed forward is like a matrix multiply: A * B = C.
Matrix A holds the inputs (patterns as a set of vectors).
The inputs in A are presented randomly, one pattern at a time.
Matrix B (one per layer) is the weight network, and C is your output.

Diverse Neural Network Methodology

Choose Your Type, then Bing or Wikipedia for Info

1. Backpropagation, Hebbian Learning, Convolutional Networks
2. Supervised, Unsupervised Learning, Weight Layers
3. C++ Accelerated Massive Parallelism & GPUs