Last week, I needed to add a power switch to a wall in our home, so I pulled out my electric edge saw to make the precise rectangular cut in the sheetrock. I also used my fiber-optic endoscope to view the exact placement of every wire, water pipe and gas line inside the wall, displayed wirelessly in color on my iPad. Twenty years ago, I had none of these tools. Now I find them indispensable.
Likewise, when approaching a difficult predictive modeling problem, I find my genetic algorithms, sentiment analysis and neural network tools to be essential components of my actuarial toolkit. Yet, when I started as an actuary, that toolkit contained only commutation functions, linear regression techniques, and various graduation and curve fitting tools that seem primitive by today’s standards. And in my prior career as an engineer, I used a slide rule for my calculations. When we cover generalized linear models (GLMs) in the graduate-level artificial intelligence (AI) machine learning course I teach, I bring the slide rule and pass it around so the students can play with this ancestor to GLM software. Some students find this relic from the predawn of history quite fascinating.
Commutation functions go back at least to the 1800s. My 1895 edition of the Encyclopedia Britannica has several pages describing them. It also shows extensive techniques for interpolation, graduation, curve fitting and even generalized linear modeling techniques (before they were given that name in 1972). In 2018, I wrote an article stating that commutation functions were obsolete, and a senior staff actuary at the Society of Actuaries (SOA) corrected me. He pointed out that they are still in use on some defined benefit pension plans.
Sadly, many actuaries have chosen not to embrace the new AI and predictive analytics tools. They tend to be the same actuaries lamenting the incursion of data scientists, CPAs, MBAs, CFAs and other professionals into financial risk positions formerly held only by actuaries.
How did we let this happen? In fairness, some of the new techniques come with the baggage of intimidating names. Take genetic algorithms, for instance. Five years ago, I had the honor of speaking at the Human Genome Institute at Washington University’s medical school. My topic was genetic algorithms and, frankly, I went into the lecture hall terrified that within five minutes I would be thrown out as a fraud—because I am! These were geneticists, and even after several certificate courses at the university’s MiniMed School (13 semesters so far), I know very little about genetics. It is a vast and complex subject. But genetic algorithms were merely inspired by genetics. Genetic algorithms are easy, yet powerful, ways to solve problems involving lots of variables and no known closed solution. After two and a half hours, we had to give up the lecture hall because it was needed by another group. I left feeling pleased that geneticists were so eager to learn about genetic algorithms from an actuary!
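To show just how unintimidating a genetic algorithm can be, here is a minimal sketch of my own (a toy illustration, not from the lecture): it evolves a population of bit strings toward the all-ones string using the classic ingredients of tournament selection, one-point crossover, mutation and elitism. All parameter values are arbitrary choices for the demonstration.

```python
import random

random.seed(42)

N_BITS, POP, GENS, MUT = 20, 30, 80, 0.05

def fitness(bits):
    # "OneMax" toy problem: fitness is simply the count of 1s
    return sum(bits)

def tournament(pop):
    # pick two candidates at random; the fitter one gets to reproduce
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

# random initial population of bit strings
pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP)]

for _ in range(GENS):
    best = max(pop, key=fitness)
    nxt = [best[:]]                          # elitism: keep the best unchanged
    while len(nxt) < POP:
        p1, p2 = tournament(pop), tournament(pop)
        cut = random.randrange(1, N_BITS)    # one-point crossover
        child = p1[:cut] + p2[cut:]
        # mutation: flip each bit with small probability
        child = [1 - g if random.random() < MUT else g for g in child]
        nxt.append(child)
    pop = nxt

best = max(pop, key=fitness)
```

The same skeleton applies whenever there are many variables and no closed-form solution: only the fitness function and the encoding of a candidate solution change.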
Even more intimidating than genetic algorithms is the term neural network. What image does this data science term conjure up for you? For many, it conveys the emulation of a human brain. As I see it, that's unfortunate. Biological neurons are only an inspiration for a data science technique we use to form predictive models. The artificial neural network (ANN) is far simpler than a biological one. However, the "artificial" qualifier is usually dropped for convenience, and we use the term neural network (NN), a term many people find somewhat intimidating. But NNs are simple.
Most actuaries are familiar with the term linear model. It assumes the dependent variable y is some linear combination of the independent variables X1, X2, … Xn. We write this as y = β1X1 + β2X2 + … + βnXn, where the βi are constants. This is a useful prediction technique, but it has limitations. It can work well for Gaussian distributions, but it gives disappointing results for other distributions, such as Poisson. An approach not limited by this distribution assumption is the GLM. GLMs employ a link function g that connects the mean of y to the linear combination: g(E[y]) = β1X1 + β2X2 + … + βnXn.
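As a concrete (and entirely made-up) illustration of the link function at work, here is a tiny Poisson GLM with a log link, fit by gradient ascent on the log-likelihood. For a clean demonstration I use noise-free data where y equals exp(0.3 + 0.5x) exactly, so the fitted coefficients should recover 0.3 and 0.5; real count data would of course be noisy integers.

```python
import math

# hypothetical noise-free data: y = exp(0.3 + 0.5*x), x in [0, 2]
xs = [i * 0.04 for i in range(51)]
ys = [math.exp(0.3 + 0.5 * x) for x in xs]

b0, b1 = 0.0, 0.0        # start with zero coefficients
lr, n = 0.01, len(xs)

for _ in range(20000):
    g0 = g1 = 0.0
    for x, y in zip(xs, ys):
        mu = math.exp(b0 + b1 * x)   # inverse of the log link: mean = exp(linear predictor)
        g0 += (y - mu)               # score (gradient) of the Poisson log-likelihood
        g1 += (y - mu) * x
    b0 += lr * g0 / n                # gradient ascent step
    b1 += lr * g1 / n
```

The loop is nothing more than "predict the mean through the link, compare with the observed value, nudge the coefficients," which is the same pattern NN training follows.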
Essentially, our NN modeling technique is a set of layers, each resembling a GLM. At every layer, we take the dot product of a vector of weights with the input vector, then apply an activation function (playing the role of a link function) to produce that layer's output, which feeds the next layer. Is an NN more difficult to trace backward, to show all the breadcrumbs leading from your input vector to your output prediction? Sure, it is, because each layer is like a GLM and we are cascading several layers.
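The cascade can be sketched in a few lines. The weights below are hand-picked for illustration (steep sigmoids approximating OR, AND, and "OR but not AND"), so this toy two-layer network computes the classic XOR function that no single linear layer can.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def layer(inputs, weights, bias):
    # one GLM-like layer: dot product of weights and inputs, then activation
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(z)

def xor_net(x1, x2):
    h1 = layer([x1, x2], [10, 10], -5)     # approximately x1 OR x2
    h2 = layer([x1, x2], [10, 10], -15)    # approximately x1 AND x2
    return layer([h1, h2], [10, -10], -5)  # approximately h1 AND NOT h2 -> XOR

preds = [round(xor_net(a, b)) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]]
# preds is [0, 1, 1, 0]: the XOR truth table
```

Each call to `layer` is the same dot-product-plus-activation step described above; stacking two of them is all the "depth" this toy problem needs.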
The path backward becomes tedious, but that doesn't make it cerebral, or even magical. It is just a straightforward, albeit many-layered, way to build a model that often provides a superior prediction for classification or regression problems. We train the NN by assuming an initial set of weights (coefficients) for each layer, then refining them iteratively: we compare the output for a given input with the desired output for that input and apply simple calculus to compute progressively better weights. The process is very similar to what we do for GLMs.
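That training loop can be seen in miniature with a single sigmoid neuron, which is effectively a one-layer logistic model. In this made-up example it learns the AND gate: predict, compare with the desired output, and nudge the weights by the calculus-derived gradient. The learning rate and epoch count are arbitrary choices.

```python
import math, random

random.seed(1)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# AND-gate training data: inputs and desired outputs
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

# start from random weights, as NN training does
w = [random.uniform(-1, 1), random.uniform(-1, 1)]
b = random.uniform(-1, 1)
lr = 0.5

for _ in range(5000):
    for (x1, x2), target in data:
        pred = sigmoid(w[0] * x1 + w[1] * x2 + b)  # forward pass
        err = pred - target                        # compare with desired output
        w[0] -= lr * err * x1                      # gradient step for each weight
        w[1] -= lr * err * x2
        b    -= lr * err
```

A multi-layer network repeats exactly this pattern, with the chain rule carrying the error backward through each layer, which is all "backpropagation" means.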
Initially, NNs were thought to be of potentially great value for solving numerical regression and classification problems, while problems involving nonnumeric data, such as images, were considered beyond their reach. Enter the convolutional neural network (CNN). It turns out pictures can be digitized, and that provides the opportunity to feed the digital representations to deep NNs (those with many layers) for image recognition. NNs can even go beyond recognizing images and create new ones: a generative adversarial network (GAN) can create a painting in a specified genre convincing enough that an expert might attribute it to a specific famous artist.
NNs can solve numeric problems, perform image recognition and even generate creative images. Yet they are all simple in concept. In effect, they are not cerebral at all!
What’s in your toolkit?
Copyright © 2019 by the Society of Actuaries, Schaumburg, Illinois.