# Neural Network Basics

The following information is provided to help you better understand the concept of Neural Networks.

Definition and Explanation:

Neural Networks are computer systems linking inputs with outputs in a network structure of nodes and arcs (connections). They are inspired by what is known about the way human brains function. In the human brain, neurons are connected in a complex network, with activity generated by impulses from one neuron to another. Neural networks have proven to be very good at identifying patterns, with some ability to generalize. These systems are not strictly programmed, but are given many cases of sets of inputs along with results. The systems "learn" by adjusting the weights governing the relative impact of inputs on the output, trying many combinations of weights until a good fit to the training cases is obtained. Then the resulting network can be used to evaluate future cases.

An early approach was to develop a computer system where a set of input values was linked to an output variable by a network. Each input variable and the output possibilities were represented by nodes. A network of arcs connected the nodes. Each arc had a weight associated with it. The weighted sum of the input values leading into an output node determined whether the output node fired or not. If this value was above the threshold value, the conclusion associated with this node was assumed. The weights on the arcs connecting nodes were adjusted during a training period, where the computer tried many combinations of weights, retaining those weights where improved accuracy was obtained and rejecting changed weights otherwise.
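The training scheme described above -- try many weight combinations and keep those that classify the training cases most accurately -- can be sketched in a few lines of Python. Everything here (the threshold value, the logical-OR training cases, the coarse weight grid) is a hypothetical toy, not anything from the original system:

```python
from itertools import product

def fires(weights, inputs, threshold=0.5):
    """The output node fires when the weighted input sum exceeds the threshold."""
    return sum(w * x for w, x in zip(weights, inputs)) > threshold

# Hypothetical training cases: inputs paired with the desired output
# (the logical OR function, chosen purely for illustration).
cases = [((0, 0), False), ((0, 1), True), ((1, 0), True), ((1, 1), True)]

def accuracy(weights):
    """Count the training cases this weight combination gets right."""
    return sum(fires(weights, inputs) == target for inputs, target in cases)

# "Try many combinations of weights" -- here, a coarse grid search,
# retaining the combination with the best accuracy on the training cases.
grid = [i / 10 for i in range(11)]
best = max(product(grid, repeat=2), key=accuracy)
print(best, accuracy(best))   # a weight pair that classifies all four cases
```

Real training algorithms adjust the weights far more cleverly than a grid search, but the principle is the same: keep the weights that improve accuracy on the training cases.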

This system can be viewed as being similar to a regression analysis. The input variables in fact are selected in much the same manner as independent variables in a regression model. Input variables are those characteristics that have explanatory power in selecting among alternative outcomes. Like regression models, the weights yielding the best fit in explaining the outcomes of the training sample are sought. Neural networks have proven quite effective at tasks humans do better than computers, such as recognizing faces, aircraft, fingerprints, voices and handwriting. They are usually relatively poor at doing things traditional computers do better than humans, such as accurate arithmetic computation, transaction processing, or anything requiring numerical precision.

### Introduction

A very simple neural network was developed to predict the number of runs scored by a baseball team in a game based on total team offensive statistics. The resulting model could then be used to:
• Compare the contribution of players to team run production based on individual statistics.
• Determine the key statistics and their relative importance in run production.
• Better identify the worth of individual players to the team for the purpose of supporting salary arbitration arguments.

The neural network's results were compared to a linear regression model's results. A regression model is a standard statistical tool and was used as the basis for comparing model performance. The same data were used to develop and test both models. Game totals of key offensive statistics such as hits, home runs and base on balls were used.
The data used for the model development consisted of a small set of 132 baseball games obtained from a realistic computer simulation based upon actual 1992 major league statistics. Both teams' data were used giving a dataset of 264 observations. A true measure of the complexity of the problem would require thousands of observations. This was beyond the resources available for this investigation. However, the number of observations available does make for an instructive problem.

### Results: Prediction of Runs Scored

After each model had been run using the 264 observations, the models were compared to determine prediction accuracy. Error is calculated as the predicted value from the model minus the actual value for all observations. The absolute value of each error was then taken (i.e., the sign of the error was removed). An average absolute error was calculated across all 264 observations for each model. With this measure of model accuracy, the neural network improved upon the traditional linear regression model by 32%. The average error for the regression model was 1.1 runs, as compared to 0.75 runs for the network.
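The error measure described above can be illustrated with a short Python sketch; the predicted and actual run totals below are hypothetical, not the article's data:

```python
def mean_absolute_error(predicted, actual):
    """Average of the absolute prediction errors (sign of each error removed)."""
    errors = [p - a for p, a in zip(predicted, actual)]
    return sum(abs(e) for e in errors) / len(errors)

actual_runs    = [4, 7, 2, 5]   # hypothetical game totals
predicted_runs = [5, 6, 2, 7]   # hypothetical model output
print(mean_absolute_error(predicted_runs, actual_runs))   # 1.0
```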
A second measure of fit used to compare the two models involved determining how many of the 264 observations were predicted more accurately by the regression model versus the network. Once again, the neural network showed superior results. The network was closer to actual for 164 of the 264 observations in the dataset, or 62% of the total. The regression was closer in 100 of the 264 cases, or 38%. Not only were the network predictions better more often, but when they were worse, the errors were not as bad as when the regression was worse. Where the regression gave a closer fit, the network prediction error was on average 0.54 runs worse than the regression prediction error (ignoring the sign of the error). In cases where the network was more accurate, the regression prediction error was on average 0.89 runs higher than the network's.
By all measures, the network model does a better job of predicting total runs than the regression model. These tests indicate the neural network captures the conditions leading to runs in a baseball game better than the regression model does. This is a basic requirement for future applications such as the one that follows.

### Results: Estimation of Offensive Contribution by Category

One use of the model is to predict the impact of increased offensive production -- for example, the impact of an additional single per game. This section compares the estimates of increased production from the regression model and the network model. As will be shown below, the network model produces results similar to the regression model's. However, the real strength of the network model is that its estimates of offensive production are conditional on the other offensive statistics in the game -- it captures the interactions between the offensive statistics. This result is consistent with the findings of the section above.
Because of the low sample size, statistically significant estimates could not be established for the stolen base, caught stealing and triple variables, so these variables were removed from the analysis. This problem would be eliminated by increasing the sample size.
The following table presents the estimated increased number of runs per game based on increasing the offensive production of each category.

#### Average Increase in the Number of Runs Per Game Based on an Increase of One of Each Offensive Category

| Category | Regression Model | Network Model |
| --- | --- | --- |
| Single | 0.456 | 0.439 |
| Double | 0.654 | 0.643 |
| Home Run | 1.648 | 1.659 |
| Base on Balls | 0.356 | 0.388 |
| Intentional Base on Balls | -0.209 | -0.012 |
| Hit By Pitcher | 0.293 | 0.341 |
| Sacrifice Fly | 1.034 | 0.942 |
| Sacrifice Bunt | 0.123 | 0.118 |
| Strike Outs | -0.044 | -0.048 |
| Hitting in Double Play | -0.322 | -0.315 |

Many of the estimates are similar. The intentional base on balls estimate is negative for both models, indicating that an intentional walk tends to reduce the opposing team's run production -- in other words, the manager's decision worked.
The regression model's estimates may be combined by simple addition, while the network model's results must be interpreted jointly. For example, to estimate (from the regression model) the increased number of runs from a single and a home run, the two values are added together, giving an estimated 2.104 additional runs a game. The neural network model considers the interactions of the offensive categories together; applying the two hits to the network model produces 2.147 additional runs a game. The two answers are very similar.
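The additivity of the regression estimates can be checked directly from the table's coefficients:

```python
# Regression coefficients taken from the table above; the additive model
# says the combined effect is simply the sum of the individual estimates.
regression_estimates = {"single": 0.456, "home_run": 1.648}
combined = regression_estimates["single"] + regression_estimates["home_run"]
print(round(combined, 3))   # 2.104
```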

### The Importance of the Interactions of Variables in the Neural Network

Standard regression treats each variable's effect as additive and independent: the variables are handled separately, and the regression coefficients do not capture how two or more variables interact. A neural network does not require the same independence. In fact, the ability to exploit interactions among variables is a strength of the approach. Another example demonstrates this value.
Consider the above example in a game-by-game analysis. Using the regression model, the impact of a single and a home run is the same (2.104 runs) regardless of the other offensive statistics. In the neural network the results differ. For example, consider the two extremes presented below. The columns present the offensive statistics before the additional single and home run.

#### Offensive Statistics Before Additional Hits

| Category | Game A | Game B |
| --- | --- | --- |
| Single | 1 | 7 |
| Double | 0 | 2 |
| Home Run | 0 | 0 |
| Base on Balls | 2 | 4 |
| Intentional Base on Balls | 0 | 0 |
| Hit By Pitcher | 0 | 0 |
| Sacrifice Fly | 0 | 1 |
| Sacrifice Bunt | 0 | 0 |
| Strike Outs | 3 | 4 |
| Hitting in Double Play | 1 | 0 |

Given the higher overall offense production in Game B, you would expect a higher output from the additional single and home run. The neural network model follows this expectation. The estimate of increased runs in Game A is 1.083 and the estimate for Game B is 4.204. These estimates are more believable than the unconditional estimates for the regression model.
Practically speaking, in Game A it is less likely that runners will be on base when the home run is hit or in scoring position when the single is hit. Also after the single is hit there is less likelihood the runner will be brought home. Therefore, more runs are expected from the home run and single in Game B than Game A.
This ability of the neural network approach allows for more accurate estimation under the specific conditions of a game. Most importantly, the performance of individual teams or individual players can be compared given the specific situations the player faces. This ability cannot be matched using traditional approaches.

Consider questions like these:
• To which customers should we direct our latest marketing efforts? How can we improve the response rate?
• What region has the greatest level of customer satisfaction, given the differences between regions?
• Is there a way to predict the outcome of our manufacturing process based on data collected early in the process?

These questions can be addressed with analytical methods; however, they are not academic problems. Answering them involves imperfect data, assumptions, and knowledge of what is important -- an element that is not easy to identify or quantify. Z Solutions will work with you to use neural networks and other data mining methods to achieve the results you are looking for.

• Z Solutions defines success as the solution to your business problem, not an academic or mathematical measure of success.
• Neural network techniques can be very powerful. However, in a complicated project that power is secondary to the client's knowledge of the problem and the data itself. A project consists of a careful problem definition developed with the client, along with careful analysis and preparation of the data. The network approaches and the interpretation of the models then refine that understanding.
• Transferral of the neural network methodology used to the client is required before the project is considered complete. Neural networks may best be utilized when all parties understand the methodologies and capabilities.

Neural Networks use a set of processing elements (or nodes) loosely analogous to neurons in the brain. (Hence the name, neural networks.) These nodes are interconnected in a network that can then identify patterns in data as it is exposed to the data. In a sense, the network learns from experience just as people do. This distinguishes neural networks from traditional computing programs, which simply follow instructions in a fixed sequential order.
The structure of a neural network looks something like the following:

The bottom layer represents the input layer, in this case with 5 inputs labeled X1 through X5. In the middle is something called the hidden layer, with a variable number of nodes. It is the hidden layer that performs much of the work of the network. The output layer in this case has two nodes, Z1 and Z2 representing output values we are trying to determine from the inputs. For example, we may be trying to predict sales (output) based on past sales, price and season (input).

#### More on the Hidden Layer

Each node in the hidden layer is fully connected to the inputs. That means what is learned in a hidden node is based on all the inputs taken together. This hidden layer is where the network learns interdependencies in the model. The following diagram provides some detail into what goes on inside a hidden node.

Simply speaking, a weighted sum is computed: X1 times W1, plus X2 times W2, on through X5 times W5. This weighted sum is computed for each hidden node and each output node, and it is how interactions are represented in the network.
Each sum is then transformed using a nonlinear function before the value is passed on to the next layer.
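What goes on inside one hidden node can be sketched as follows. The logistic sigmoid, input values and weights here are illustrative assumptions -- the text does not name the specific nonlinear function used:

```python
import math

def hidden_node(inputs, weights):
    """Weighted sum of all inputs, passed through a nonlinear squashing function."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    return 1.0 / (1.0 + math.exp(-weighted_sum))   # logistic sigmoid (assumed)

x = [0.2, 0.5, 0.1, 0.9, 0.4]     # hypothetical inputs X1..X5
w = [0.4, -0.6, 0.3, 0.8, 0.1]    # hypothetical weights W1..W5
print(hidden_node(x, w))          # a value squashed into the range (0, 1)
```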

#### Where Does the Network Get the Weights From?

That is the \$64,000 question. A full discussion of this topic is well beyond the scope of this introduction. But we can offer a simple explanation. The network is repeatedly shown observations from available data related to the problem to be solved, including both inputs (the X1 through X5 in the diagram above) and the desired outputs (Z1 and Z2 in the diagram). The network then tries to predict the correct output for each set of inputs by gradually reducing the error. There are many algorithms for accomplishing this, but they all involve an iterative search for the proper set of weights (the W1-W5) that will do the best job of accurately predicting the outputs.
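A minimal sketch of this iterative search, using a simple gradient-descent "delta rule" on a single linear unit. The data, learning rate and network are all hypothetical; real networks use more elaborate algorithms such as backpropagation:

```python
# Training cases: (inputs, desired output). These are consistent with the
# hypothetical "true" weights [1.0, 2.0], so the search can recover them.
cases = [([1.0, 2.0], 5.0), ([2.0, 1.0], 4.0), ([3.0, 3.0], 9.0)]
weights = [0.0, 0.0]
rate = 0.02   # learning rate (assumed)

for _ in range(2000):                         # repeated presentation of the data
    for inputs, target in cases:
        predicted = sum(w * x for w, x in zip(weights, inputs))
        error = predicted - target            # gradually reduced each pass
        weights = [w - rate * error * x for w, x in zip(weights, inputs)]

print(weights)   # converges toward [1.0, 2.0], which reproduces every target
```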

### Extracting Knowledge From Information

In his book, Powershift, Alvin Toffler writes of the power gained from knowledge. Powershift is the last in Toffler's trilogy preceded by Future Shock and The Third Wave. In his latest book he predicts that the most important weapon in the war for economic supremacy in the 21st century will be the organization of knowledge. The continued existence of organizations will depend on how well they use available information.
With the 21st century on the horizon, the 90's could be called the decade of re-engineering. Managers faced with streamlined staffs and reduced budgets are finding it difficult to determine how to best use information technology to improve performance. From the standpoint of an individual manager's team, the challenge is increasingly one of understanding and organizing large amounts of information to improve knowledge of the organization's business and markets. Without these workgroup-level improvements, the organization is at a competitive disadvantage in a changing macro-level economy.
In short, managers are faced with the task of getting more from less, often without the resources to develop the improvements needed. This includes the use of information technology. Although lack of information is seldom a problem, real measurable improvements in knowledge gained from the information and the resulting decisions based on the information are hard to find.
A powerful emerging technology can be used to efficiently process information to achieve greater knowledge and improved decision making. This technology is the field of artificial neural networks, commonly referred to as simply neural networks. Neural networks self-adapt to learn from information, providing powerful models representing knowledge about a specific problem.

### Origins of Neural Networks

Artificial neural networks are the result of academic investigations that involve using mathematical formulations to model nervous system operations. The resulting techniques are being successfully applied in a variety of everyday business applications.
Neural networks represent a meaningfully different approach to using computers in the workplace. A neural network is used to learn patterns and relationships in data. The data may be the results of a market research effort, the results of a production process given varying operational conditions, or the decisions of a loan officer given a set of loan applications. Regardless of the specifics involved, applying a neural network is a substantial departure from traditional approaches.
Traditionally a programmer or an analyst specifically "codes" every facet of the problem in order for the computer to "understand" the situation. Neural networks do not require the explicit coding of the problem. For example, to generate a model that performs a sales forecast, a neural network only needs to be given raw data related to the problem. The raw data might consist of: history of past sales, prices, competitors' prices, and other economic variables. The neural network sorts through this information and produces an understanding of the factors impacting sales. The model can then be called upon to provide a prediction of future sales given a forecast of the key factors.
These advancements are due to the creation of neural network learning rules, which are the algorithms used to "learn" the relationships in the data. The learning rules enable the network to "gain knowledge" from available data and apply that knowledge to assist a manager in making key decisions.

### What Can Neural Networks Be Used For?

Neural networks constitute a powerful tool for data mining. Data mining has become quite popular recently and really involves the extraction of knowledge from information. Organizations have more and more data from which they need to extract key trends in order to run their businesses more efficiently and improve decision making.
Applications of neural networks are numerous. Many receive their first introduction by reading about the applications of the techniques in financial market predictions. Claims are made by several well-known investment groups that at least some of their technical analysis of financial markets and portfolio selection is performed with neural networks.
Other successful applications of the techniques include: analysis of market research data and customer satisfaction, industrial process control, forecasting applications, and credit card fraud identification. Mellon Bank installed a neural network credit card fraud detection system and the realized savings were expected to pay for the new system in six months. A number of other banks are also using neural network-based systems to control credit card fraud. These systems are able to recognize fraudulent use based on past charge patterns with greater accuracy than other available methods.
Another example of using neural networks to improve decisions is in medical diagnosis. A neural network can be shown a series of case histories of patients, with a number of patient characteristics, symptoms, and test results. The network is also given the diagnosis for the case from the attending physician. The network can then be shown information regarding new patients and the network will provide a diagnosis for the new cases. This essentially creates a system containing the expertise of numerous physicians which can be called upon to give an immediate, real-time initial diagnosis of a case to medical personnel.

### Should I consider Neural Networks?

When approached with a proposal to apply a neural network, how should a business manager evaluate the proposal? Does this new capability offer real benefits, or is this the latest example of trendy approaches and buzzwords? Most importantly, are these techniques practical or are they academic approaches that are not practical or cost effective?
Given a steady increase in successful applications, neural networks are for real and offer substantial benefits. The technical details of neural networks are beyond the scope of this article, but successful applications share certain common characteristics that may be easily understood. First, there will exist interrelationships between the explanatory factors that are used to estimate the factor we don't know -- the outcome. Having interrelationships in the data means that two or more factors work together to predict model outcome. For example, a chemical process in a production facility may be dependent on temperature and humidity. These two factors combine to affect the outcome of the process. The second condition in which neural networks excel is when there is a non-linear relationship between the explanatory factors and the outcome. This simply means that the nature of the relationship between the factors and the outcome changes as the factors take on different values, which is the norm for everyday problems.
With regard to the trendiness issue: yes, neural networks are presently trendy -- at least in some circles. However, the need to improve processes by doing things better and cheaper is more important than ever in today's competitive business climate. Likewise, the desire to develop computer systems that can learn by themselves and improve decision-making is an ongoing goal of information technology. The neural network techniques we use today may not remain with us. However, the goal of developing computers that learn from past experience and lead to better business decisions will remain a high priority. Neural networks now represent one of the best practices in achieving this goal. Furthermore, continued achievements toward this goal are likely to be inspired or generated from these technologies.
The answer to the question of whether these approaches are practical and cost effective is a definitive "yes", although finding documented proof of this can be a challenge. It is true that the techniques are relatively new and that experience with these techniques is not as extensive as with traditional techniques. A great deal has been published about the technical approaches, the mathematics, and the learning rules. However, little has been written about the practical application of neural networks. It would be highly unlikely for you to find a source describing the application of neural networks to your specific problem. However, there is not a dearth of successful applications. Look at it this way - how likely would it be for you to share specifics of key information learned about your markets or business with your competitors?
The fact remains, however, that neural networks are proving their worth everyday in a wide variety of business applications, and saving their users time and money in the process.

### When to Consider a Neural Network

Neural networks should be applied in situations where traditional techniques have failed to give satisfactory results, or where a small improvement in modeling performance can make a significant difference in operational efficiency or in bottom-line profits. Direct marketing is an excellent example of where a small improvement can lead to significant results. The response rate on direct marketing campaigns is usually quite low. A five percent response rate is often considered very good. By reviewing the demographic data on those who respond, it may be possible to identify characteristics that would produce a 6% response rate. If a neural network is used to analyze the demographic characteristics and a 7% response rate is produced, then the cost of the direct mail campaign can be reduced while maintaining the same desired level of positive response from prospects.
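The arithmetic behind this can be made concrete. The target response count and per-piece cost below are hypothetical assumptions, chosen only to show how a higher response rate shrinks the required mailing:

```python
target_responses = 5000   # desired number of positive responses (assumed)
cost_per_piece = 0.50     # mailing cost in dollars per piece (assumed)

for rate in (0.05, 0.07):   # 5% baseline vs 7% with the network's targeting
    pieces = target_responses / rate
    print(f"{rate:.0%}: mail {pieces:,.0f} pieces at a cost of "
          f"${pieces * cost_per_piece:,.0f}")
```

At these assumed figures, raising the response rate from 5% to 7% cuts the mailing from 100,000 pieces to roughly 71,400 while yielding the same number of responses.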
An individual wanting to investigate this emerging technology and explore ways in which it can improve his/her organization is advised to consult with neural network practitioners who have experience in developing and implementing models for use in commercial applications. Z Solutions will be glad to discuss this with you.
The bottom line is that any manager interested in getting more useful information from available data should consider neural network technology as an option. They can be used by aggressive organizations to focus available resources more effectively, thus gaining a valuable competitive edge.

# Want to Try Neural Nets?

Z Solutions is in the business of helping organizations apply artificial neural networks. Neural networks are one of those "new" technologies that are touted as being able to revolutionize your understanding of data. The techniques are loosely inspired by biological learning and can be used to discover (learn) patterns not readily apparent in data. The techniques can be thought of as delivering a "little brain" that has learned much about your problem. Anyone reading popular business press these days reads about the term data mining. Neural networks are one of the most powerful techniques used in this area. The learning algorithms of neural networks can probe through data and learn relationships not readily apparent otherwise -- the techniques deliver on their promise.
Several recently published sources expound upon the power of these techniques to solve business problems. Leaving that aside I will focus on an even more remarkable capability of the technology. Perhaps the most interesting facet of these little brains is their ability to shut down previously well functioning bigger brains. If you think neural networks can help you solve a sticky problem you are probably right. However, perhaps the benefit of our experience will help you in that effort.

### Avoid the Magic Pill Syndrome

I always wondered why every third utility pole as you drive down the street has a sign that says: "Lose weight, now. Let me show you how!" or "Earn \$4,000 a week from home, no risk." I think I’m beginning to see why now. I have actually had people ask me, "we have no data and no history to learn from, can a neural network help me solve this problem?" Now I've got to tell you, after spending an hour and a half sweating over a detailed explanation of learning algorithms, the clever mathematics of a neural network, and the general concept of learning for both humans and machines, such a question is disheartening. It should come as no surprise that artificial neural networks, like our biological neural networks, perform much better when there is something to learn. Maybe I should just take their money and sell them some sugar-coated diet pills.
People are intrigued by the concept of neural networks. The idea of a mathematical technique that can learn using methods similar to the way we learn is fascinating. They hope the techniques can provide an answer where one was not apparent before. I am firmly convinced that neural network techniques can make many tasks easier and solve some problems that are not solvable otherwise. However, there is still a great deal of sweat involved.
This is our first example of the little brain controlling the big brain. Most of our clients recognize that making a significant difference in a significantly difficult problem will involve a significant effort. The magic pill will not make the weight go away without effort. In some cases a multiple-phase project may be in order, with each phase progressing to a well-defined conclusion. Even in these well-organized projects, the temptation is there to believe that some magic in neural networks is going to make a problem disappear.

### Maybe you didn’t learn all this in kindergarten, but...

A few years ago Robert Fulghum wrote a nice little book called, All I Really Need to Know I Learned in Kindergarten. It is a nice little book with a very simple premise. Many of the answers to life’s big problems were actually covered in kindergarten. You know, share, play fair, etc. My kindergarten education did not cover much in the way of analytical techniques, in fact my formal education didn’t cover anything at all about neural networks. But similar to Mr. Fulghum, my basic education did give me a lot of useful information I can apply to using neural networks.
Although the techniques represent great advances in learning and data analysis capability, several basic principles from your educational background apply. For example, while studying literature we had essay questions such as, "Summarize the basic premise of the book, detailing..." If my memory is correct (though I suspect it isn’t), I got this question once a week. It took me a while, but I finally learned that the best way to answer is not to throw every random fact I can remember from the book onto the paper. A much more structured approach to the essay question worked a whole lot better.
But guess what people do when given a neural network? As soon as the data is formatted and ready for the network’s learning algorithm, they throw it all at it. Remember Peppermint Patty from Peanuts fame? She repeatedly throws out unrelated facts on her school papers. What grade does she receive? Yes, a "D minus". Do you know what grade a neural network would get from what it learns with such an approach? Yes, a "D minus".
Another example of the little brain having an effect on the big brain. A structured, thoughtful approach to problem solving is, as always, the best approach. In our training classes and consulting assignments we stress repeatedly that 75% or more of your time in a neural network project is spent working with your data: understanding it, preparing it for the learning algorithm, and just plain looking at it.
Back to my formal education. Whether it was a literature class, a physics class or a decision sciences class (and probably kindergarten), the lesson was the same, although the wording differed: format and state your problem, define your research question, or just plain determine what is important.
I regret to say it, but we have had clients who can’t wait to solve their problem using these nifty algorithms. Without a close inspection of the data, they quickly load it into the software and develop a model that looks very nice. They have a graph which shows the model error is quite low -- meaning the network has learned the problem well. You can even tell which inputs are important. When you have such a model you can’t resist showing it off. Finally it is time to apply the model to an important problem. The results are terrible. Suspicion and dread now loom over the project. All involved spend a great deal of time looking at the data and the problem. Poring over graphs and descriptive statistics of the input data -- the step we recommend as the first step -- it is discovered that the data used to train the neural network does not cover the region where the network is applied. Basically, the network knows nothing about the problem you want to solve. There are technical terms to describe this state, but in realistic terms, you might as well go to your doctor for landscaping advice.

We have heard many stories where a neural network project was undertaken and it failed. And I am sure this is true of other technologies. The participants are still excited. They are certain that with new software (invariably due out next summer) and more powerful computers success is assured. This is better than the above described problems. At least the big brain is in gear -- just looking in the wrong place. We like to say, "the answers are in your data, not the learning algorithm." Not all learning algorithms are the same; some will find relationships that others will not. But as a general rule, working with your data and improving the data the network sees is a much higher-probability approach to solving your problem.
What are you trying to accomplish with this data analysis? You need to tell the neural network algorithm what you know about the problem and let the network learn what else there is to see in the data. The reality is the network cannot learn all that you know. Specifically, the network cannot learn what is important. The problem design is the most important aspect of the project.
Additionally, if you are asking a learning algorithm to find out more about your problem than you already know, then the least you can do is provide some help. For example, if you are a financial analyst you know certain important financial ratios apply to your problem. If you are an engineer, you are aware of certain data relationships that are based on the physics of the problem. Calculate those relationships and feed them to the network. The network may learn them itself, but why make it difficult?
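A sketch of this idea in Python: precompute a known domain relationship and append it as an extra input column. The "current ratio" and the raw figures below are hypothetical stand-ins for whatever relationships apply to your own problem:

```python
# Hypothetical raw input rows for a financial problem.
rows = [
    {"current_assets": 120.0, "current_liabilities": 80.0},
    {"current_assets": 95.0,  "current_liabilities": 100.0},
]

def add_ratio_feature(rows):
    """Append the precomputed ratio so the network need not derive it itself."""
    for row in rows:
        row["current_ratio"] = row["current_assets"] / row["current_liabilities"]
    return rows

for row in add_ratio_feature(rows):
    print(row["current_ratio"])
```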
I believe this is true of all technologies. The technology itself does not solve the problem; it is the tool used to implement the solution.

### Inertia

Even when the above concerns are well understood and dealt with, there is still a problem in applying any new technology: you are dealing with change, and change is hard. In discussions with our clients and other colleagues we hear a recurring theme -- new technologies can be difficult to implement. The participants appear willing, but the technology doesn’t get used. In his recent book, Sacred Cows Make the Best Burgers, Robert Kriegel presents four drivers of resistance to change. They are:
• Fear -- "What if ... I lose my job, look stupid, can’t adapt," etc.
• Feeling Powerless -- "No one asked me!"
• Inertia -- "It’s too much effort, too uncomfortable."
• Absence of Self-Interest -- "What’s in it for me?"

Of these four, the inertia driver is the one we see most often as the biggest impediment to implementing new technology. To continue the physics analogy Mr. Kriegel uses, a force greater than the inertia must be applied before there can be movement.
One of the key benefits of neural network solutions is their capacity to save time, an important consideration in today’s business world. Because of the adaptive nature of the learning algorithms, these approaches can solve problems faster than traditional techniques: less time is spent searching for the correct functional forms. But this benefit comes only after the techniques have been implemented, tested, and applied to the specific problem. The big brain needs to prepare for the time and expense required to achieve the desired changes. An investment up front is needed before the benefits can be expected.

### In conclusion

We are at a loss to explain why otherwise very intelligent people approach their problems this way. These people have sharp analytical minds and good business sense. The answer is that it is easy to be lulled by the power of neural networks, or to hope for the blessings of technology in general. Our clients, whether in industry or government, are dealing with a lot: corporations are restructuring and downsizing, budgets are being cut, markets are changing, and relationships are breaking down. All of this is occurring at a time when the amount of information pouring into organizations is increasing. Neural networks are said to be easier. They are said to learn what is important on their own. This is all true, to a degree, but it is a false hope to expect it to happen without a significant and thoughtful commitment to applying the technology.

# Neural Networks and Data Mining

Z Solutions defines data mining as the systematic exploration of data for the purpose of extracting key patterns or findings. Organizations in the late 1990’s typically have large stores of data available due to advances in information technology and reduced costs for data storage. The question is how best to utilize data that has grown in both volume and complexity. Companies that can more quickly and efficiently uncover useful information to help run their businesses have a distinct competitive advantage.

Neural networks are well suited for data mining tasks because of their ability to model complex, multi-dimensional data. As data availability has grown, so has the dimensionality of the problems to be solved, limiting many traditional techniques such as manual examination of the data and some statistical methods. Although there are many techniques and algorithms that can be used for data mining, some of which can be combined effectively, neural networks offer the following desirable qualities:
• Automatic search of all possible interrelationships among key factors
• Automatic modeling of complex problems without prior knowledge of the level of complexity
• Ability to extract key findings much faster than many other tools

We have found that the process alone of organizing the data for neural networks can be invaluable. The level of rigor applied is in itself sufficient to reveal findings in the data. Although neural networks are quite adept at finding hidden patterns in data, they do not directly reveal their findings to the developer. Examination of the final model is necessary to extract the key relationships uncovered. Z Solutions has developed a number of techniques to perform this task in order to make neural networks viable as a data mining tool.
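The source does not describe Z Solutions’ proprietary extraction techniques, but one common family of approaches is sensitivity analysis: nudge each input in turn and watch how much the model’s output moves. The sketch below is our own minimal illustration under that assumption; `toy_model` stands in for a trained network, and the probe works with any callable that maps an input array to outputs.

```python
import numpy as np

def sensitivity(model, X, delta=0.01):
    """Crude sensitivity probe: perturb each input feature by delta
    and record the average change in the model's output, scaled by
    delta. Larger scores suggest inputs the model leans on more.
    This is one simple way to examine a trained network, not the
    only one, and it reads local slopes rather than global truth."""
    base = model(X)
    scores = []
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] += delta
        scores.append(np.mean(np.abs(model(Xp) - base)) / delta)
    return np.array(scores)

# Toy stand-in for a trained network: the output depends strongly
# on input 0 and only weakly on input 1.
toy_model = lambda X: 5.0 * X[:, 0] + 0.1 * X[:, 1]
X = np.random.rand(100, 2)
print(sensitivity(toy_model, X))  # roughly [5.0, 0.1]
```

A probe like this turns a trained network from a black box into a ranked list of input influences, which is the kind of key relationship the paragraph above says must be extracted by examining the final model.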