While actuaries are familiar with SQL and Microsoft Excel, and can use these tools to prepare data for consumption, these tools are not without limitations.
With the ever-increasing volume of data, desktop applications such as Excel and Access can quickly reach their limits. To overcome such constraints, cloud computing has introduced the concepts of elasticity and scalability, in which computing resources grow and shrink on demand, effectively without constraint.
Imagine a future where:
- Low-cost cloud data storage or data lakes (like Amazon S3) collect and store data.
- Elastic compute services (like Amazon EC2) run data-processing code written in Python or PySpark to clean, transform and structure the data.
- Scalable data warehouses and databases (like Amazon Redshift or DynamoDB) store the structured data.
- Business intelligence tools (like Tableau, Power BI or Microstrategy) consume data to unlock its potential.
- All of these tools work in concert with each other without any manual intervention, through workflow and robotic process automation.
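The clean-transform-structure step at the heart of this pipeline can be sketched in plain Python. This is an illustrative toy only: in practice the code would run in PySpark against files in a data lake such as S3, and the field names and records below are hypothetical.

```python
import csv
import io

# Hypothetical raw policy extract as it might land in a data lake (e.g., Amazon S3).
# In production this step would typically run in PySpark against cloud storage.
RAW_EXTRACT = """policy_id,issue_date,face_amount
P001,2015-03-01,100000
P002,2016-07-15,
P003,2017-11-30,250000
"""

def clean_and_structure(raw_csv: str) -> list[dict]:
    """Parse the raw extract, drop incomplete records, and cast types,
    producing rows ready to load into a structured warehouse table."""
    rows = []
    for record in csv.DictReader(io.StringIO(raw_csv)):
        if not record["face_amount"]:  # drop records missing a face amount
            continue
        rows.append({
            "policy_id": record["policy_id"],
            "issue_date": record["issue_date"],
            "face_amount": int(record["face_amount"]),
        })
    return rows

structured = clean_and_structure(RAW_EXTRACT)
print(len(structured))                             # 2 complete records survive cleaning
print(sum(r["face_amount"] for r in structured))   # 350000 total face amount
```

In a cloud deployment, the same logic would be expressed as PySpark DataFrame transformations so it scales across a compute cluster, with the output written to a warehouse table rather than held in memory.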
This architecture can provide a solid foundation for developing essential reporting and analytical capabilities, given the volume of data and the speed at which it is generated today.
This future is here. Many insurers have already adopted such technologies to support their actuarial workforce and equip them with the resources they need. Traditional actuarial analytics like trends, roll forwards, experience analysis, hedging and asset liability management analytics can be produced within minutes—no matter how large the data set or how many millions of records need to be analyzed. As a result of these improvements, actuaries have been able to shift a significant amount of their time from preparing data to performing value-adding analysis. In addition, the affordability of setup and maintenance of this architecture is helping to make a strong business case. Now, who would refuse that?
Copyright © 2019 by the Society of Actuaries, Schaumburg, Illinois.