Artificial Intelligence Learns When Beer Sells

by Kathleen Cason

The sales-prediction model was only about 30 percent accurate. So the North Carolina beer distributor turned to Jay Aronson, a management information systems professor at the University of Georgia, to create a better one. This, Aronson realized, was a job for “neural nets.”

Aronson and his graduate student Hui Wang looked at the distributor’s raw sales data and noticed upswings in sales one week, downswings the next, and then up again. Regression analysis — the standard statistical method on which the company’s sales forecasts were based — flattens such spikes. Because it averages the data and fits a single line to it, the forecast loses information about how sales actually vary over time. The result, according to Aronson, is that stores are overstocked one week and run out the next.
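
The snippet below is a rough sketch, using made-up weekly sales figures rather than the distributor's data, of the flattening Aronson describes: an ordinary least-squares line fitted to an up-one-week, down-the-next series hovers near the average and misses both the peaks and the dips.

    # A minimal sketch with invented numbers, not the distributor's actual data.
    import numpy as np

    weeks = np.arange(12)
    # Hypothetical cases sold: up one week, down the next
    sales = np.array([120, 80, 130, 75, 125, 85, 140, 70, 135, 90, 145, 80])

    # Ordinary least-squares line (degree-1 polynomial fit)
    slope, intercept = np.polyfit(weeks, sales, deg=1)
    fitted = slope * weeks + intercept

    for week, actual, predicted in zip(weeks, sales, fitted):
        print(f"week {week:2d}: actual {actual:3d}   regression {predicted:6.1f}")
    # The fitted line stays near the overall average, overshooting the low
    # weeks and undershooting the high ones -- the overstock/run-out problem.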

“We needed to model [these] zings — high-frequency upturns and downturns,” he said. “If you can predict sales, you can schedule delivery trucks better, which decreases costs for the distributor. Plus if you can predict store needs, you can get more shelf space and that means more sales. And you can schedule optimal use of marketing tools.”

[Illustration: Artificial neural networks operate much like the human brain. To recognize a face, for example, the input layer breaks it into aspects such as shape and color; these inputs are converted to numerical values that can trigger neurons in the hidden layer to “fire,” which in turn signals the output layer to fire and indicate recognition.]

So Aronson and Wang developed a model using a type of artificial intelligence called a neural network, which mimics how the human brain works, including the ability to learn from experience. Even in its initial and relatively naïve state, their model predicted beer sales with 50 percent accuracy, easily beating previous forecasts based on regression analysis.

Aronson demonstrates the principle of neural nets by drawing circles in three columns — six on the left, four in the middle and one on the right — and connecting each circle to the circles in the adjacent columns. Consider the left column to be the input, the middle a “hidden” layer that recodes the input, and the right the output. (See illustration.) By adjusting the strength and arrangement of these connections, the neural net learns, producing a better mathematical fit to the data and becoming better able to compute accurate solutions to problems.
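
The sketch below is one illustrative way to build the 6-4-1 network he draws on paper; the weight matrices stand in for the strength of the connections, and the training loop nudges them so the output fits a single made-up example. It is a toy demonstration of the idea, not Aronson and Wang's actual model.

    # A minimal 6-4-1 feedforward network: six inputs, four hidden units, one output.
    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # Connection weights between adjacent columns of "circles"
    W1 = rng.normal(scale=0.5, size=(6, 4))   # input  -> hidden
    W2 = rng.normal(scale=0.5, size=(4, 1))   # hidden -> output

    def forward(x):
        hidden = sigmoid(x @ W1)              # hidden units "fire" based on the inputs
        output = sigmoid(hidden @ W2)         # the output unit fires in turn
        return hidden, output

    # One example: six made-up input features, one target sales figure (scaled 0-1)
    x = np.array([[0.7, 0.2, 1.0, 0.0, 0.5, 0.9]])
    y = np.array([[0.8]])

    learning_rate = 0.5
    for step in range(1000):
        hidden, output = forward(x)
        # Backpropagation: adjust the connection strengths to shrink the error
        err_out = (output - y) * output * (1 - output)
        err_hid = (err_out @ W2.T) * hidden * (1 - hidden)
        W2 -= learning_rate * hidden.T @ err_out
        W1 -= learning_rate * x.T @ err_hid

    print("prediction after training:", forward(x)[1].ravel())  # approaches 0.8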

Real brains do pretty much the same thing by sensing the strength of electrical impulses that flow through the circuits of interconnecting neurons. To recognize a face, for example, the brain processes a vast amount of data — shapes, colors and relationships of facial components, etc. — to retrieve a name from memory. The more often a brain recalls something from memory, Aronson said, the stronger the neural connections become and the easier it is to recognize patterns in the data.

Neuroscientists do not yet completely understand how the brain processes complex information to arrive at a solution — in this case, a person’s name. Ordinary computers simplify the process considerably by manipulating data as zeroes and ones, such as brown-eyed or not brown-eyed, and dealing with fewer variables. In contrast, artificial neural nets — constructed as hardware or simulated in software — are more like brains because they can ingest the entire complex set of information, process it and arrive at a solution. Artificial neural nets have been developed to diagnose appendicitis, interpret X-rays, predict the success of heart-valve surgery and even detect abnormal cells in Pap smears.

Aronson has been applying neural nets to business models for projecting such things as foreign-currency exchange rates, strawberry yogurt sales and option prices on the stock market. Many of these tools have proved remarkably precise. His model for currency exchange rates at 30, 60 and 90 days out, for example, predicted the actual rates to within 3.62 percent. (The best result published up to that time had come only within 50 percent.)

Given neural nets’ capacity to learn, Aronson knew that he and Wang could improve on the accuracy of their first beer-sales model. They looked for factors that cause the zings, and ultimately identified variables that include shelf space, in-store advertising, holidays and sporting events like the Super Bowl.
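
One illustrative way to feed such factors into a network is to encode each store-week as a vector of normalized inputs. The field names, ranges and scaling below are assumptions made for the sketch, not details reported in the article.

    # Hypothetical encoding of one store-week as network inputs.
    from dataclasses import dataclass

    @dataclass
    class StoreWeek:
        shelf_space_ft: float      # linear feet of shelf space for the brand
        in_store_ads: int          # number of in-store displays or promotions
        is_holiday_week: bool      # a holiday falls in this week
        is_super_bowl_week: bool   # a major sporting event falls in this week

        def to_inputs(self) -> list[float]:
            """Scale everything to the 0-1 range a small network expects."""
            return [
                min(self.shelf_space_ft / 20.0, 1.0),   # assume 20 ft is a practical max
                min(self.in_store_ads / 5.0, 1.0),      # assume 5 promotions is a practical max
                1.0 if self.is_holiday_week else 0.0,
                1.0 if self.is_super_bowl_week else 0.0,
            ]

    week = StoreWeek(shelf_space_ft=12.0, in_store_ads=2,
                     is_holiday_week=False, is_super_bowl_week=True)
    print(week.to_inputs())   # [0.6, 0.4, 0.0, 1.0]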

With the next round, their model achieved 93 percent accuracy. To make it more precise still, Aronson and Wang are now adding demographics for individual stores, factoring in certain ethnic and state holidays, and taking into account sporting events popular in particular neighborhoods.

They also have discovered a new use for their beer-sales prediction model. Because the availability of recyclable materials correlates with beer and soft-drink sales, a recycler is looking at the model to schedule its own trucks.

For more information, e-mail Jay Aronson at jaronson@uga.edu.
