The latest in our series of interviews with figures from the risk transfer and insurance-linked securities markets saw us speak with Dmitry Mnushkin, President of reinsurance and risk software consultancy Treefrog Consulting Ltd.
Treefrog Consulting Ltd is a Bermuda-based risk software firm which specialises in solutions for the reinsurance and insurance-linked securities (ILS) industry. Dmitry Mnushkin has been a software architect and developer in the reinsurance industry for the past 14 years.
In the current competitive pricing environment, topics such as modelling, risk analytics, portfolio construction and managing ‘big data’ are more critical than ever. Given Dmitry Mnushkin’s expertise in developing software systems to meet these challenges, we felt it would be interesting to hear his thoughts on this dynamic segment of the reinsurance market.
To begin, can you tell us a bit about your background and what led you to form Treefrog Consulting?
I have been interested in software design and development since I got my first computer in 1985. When I moved to Bermuda 15 years ago, I was fortunate to work with Renaissance Reinsurance for 14 years. During that time I became fascinated with how the reinsurance industry viewed and managed risk. In April 2013 I saw an opportunity to use the skills and knowledge I had accumulated to help companies better manage their portfolios of risk through the focused use of custom software. This led to the formation of Treefrog Consulting Ltd.
Can you tell us a little about the types of solutions Treefrog Consulting has delivered for clients so far?
We have been quite fortunate that in our first year we’ve had the chance to work with a broad range of clients. ILS and catastrophe bond fund managers, traditional reinsurers, mortgage insurers, and primary insurers have all engaged Treefrog’s services and expertise. Our smaller customers are quite happy with simpler solutions focused on management and reporting, while we build fully architected, enterprise risk portfolio management systems for our larger clients.
And tell us a bit about how you approach projects and your software development methodology?
We never build more than the client needs and believe that a quality product is one that requires minimal care and maintenance. We design our systems to be data-driven, which significantly reduces the need for, and cost of, code changes in future versions. Most of all we realize that the programs we build are there to do one thing: support the business.
Our software builds are very interactive with constant customer input throughout. This allows the client to steer software development from the very beginning, thus ensuring what gets delivered is what is needed by the business.
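To illustrate the data-driven design idea described above, here is a minimal sketch in Python. All names are hypothetical, not drawn from any actual Treefrog system; the point is only that the report layout lives in data, so adding or renaming a column is a data change rather than a code change.

```python
# Hypothetical example of a data-driven report definition: the layout
# lives in data (REPORT_SPEC), so the rendering code never changes
# when the business wants different columns.

REPORT_SPEC = [  # editable data, not code
    {"field": "deal_id", "label": "Deal"},
    {"field": "premium", "label": "Premium"},
    {"field": "limit",   "label": "Limit"},
]

def render_report(rows, spec=REPORT_SPEC):
    """Render a list of row dicts as text lines driven by the spec."""
    header = " | ".join(col["label"] for col in spec)
    body = [
        " | ".join(str(row.get(col["field"], "")) for col in spec)
        for row in rows
    ]
    return [header] + body
```

Adding a fourth column here means appending one entry to `REPORT_SPEC`; no rendering logic is touched, which is the cost saving the answer above refers to.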
Where are you seeing the most demand within the reinsurance and ILS markets right now and what kind of tools are clients looking for?
We’ve had a lot of interest from companies wanting to better analyse property catastrophe risk. Traditionally this has been an area with a lot of science behind it and also one with a large amount of uncertainty. As the sophistication of vendor models continues to advance and the output of those models grows, our clients are looking for better ways to consolidate and understand that data.
Another reason why we believe there is growing interest in our software systems is the dispersion of people from early technical reinsurance leaders into the rest of the industry. These individuals know what is possible and are driving their new employers to adopt similar techniques.
The tools we are most frequently asked to create vary greatly by the size of the client. Smaller shops are looking for aggregate controls on their portfolio and the solutions we build tend to be in technologies they can readily understand and manipulate such as Excel. This benefits the client by being inexpensive and eases their transition into a more tech-assisted approach to risk and portfolio management.
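An aggregate control of the kind mentioned above can be sketched very simply. This is a hypothetical illustration, not any client's actual system: it assumes each deal's exposure is keyed by zone, and flags any zone whose aggregate would breach its cap if a proposed deal were bound.

```python
def check_aggregate(portfolio, new_deal, limits):
    """Hypothetical aggregate control: sum exposed limit by zone across
    the in-force portfolio plus a proposed deal, and report each zone
    that would breach its aggregate cap (and by how much)."""
    totals = {}
    for deal in portfolio + [new_deal]:
        for zone, exposure in deal.items():
            totals[zone] = totals.get(zone, 0.0) + exposure
    # Return {zone: excess over cap} for every breached zone.
    return {
        zone: totals[zone] - cap
        for zone, cap in limits.items()
        if totals.get(zone, 0.0) > cap
    }

# Example: binding a 60 FL deal on top of 50 FL in force breaches a
# 100 FL cap by 10.
portfolio = [{"FL": 50.0, "TX": 20.0}]
breaches = check_aggregate(portfolio, {"FL": 60.0},
                           {"FL": 100.0, "TX": 100.0})
# breaches == {"FL": 10.0}
```

The same check is straightforward to express as an Excel formula over an exposure table, which is why spreadsheet delivery works well for smaller shops at this level of sophistication.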
Larger shops are frequently ready for an entire underwriting platform rebuild. Here the level of sophistication and granularity increases significantly. Solutions such as these require a team to implement and can take many months to do so. We have in the past provided the entire development team for such efforts or just the architect to work with the client’s own team on the software.
Where do you see the biggest opportunities for reinsurers to embrace software and data within their businesses?
This past 1/1 renewal season has seen some of the most challenging market conditions in recent memory. The softening market and the flood of new capital have created a much more competitive landscape for traditional reinsurers. Many believe these conditions will persist for some time, so in order to survive and prosper these companies need to understand their risk better than their competitors to manage term creep. Enhanced understanding leads to identifying short-term market opportunities but must be coupled with the ability to rapidly execute on them. For this we recommend clients build powerful underwriting platforms that allow the business to be flexible and informed about their risk.
How about for ILS investors, how can software make a difference to them?
What we are seeing now is a move towards increased sophistication by the ILS players. Companies that have directed their initial energies at formation, capital raising and deployment are now focusing on more granular management of their risk portfolios. Companies need to differentiate themselves from the competition, and frequently in this space that means a superior risk management approach that allows for selection of more profitable business.
This push to be different from everyone else goes hand in hand with developing in-house risk management software. If everyone is running off a common platform that does 60% of the job but is not particularly tailored to a given business it’s a tougher sell to convince investors why what you do is “better”.
Have you any thoughts or ideas about how software could help reinsurers and ILS investors to handle non-modelled perils within their portfolios?
This is an area where we have seen considerable strides made, especially by reinsurers. Traditionally non-modelled perils were handled by applying largely qualitative factors to existing model losses. Obviously this is a very broad approach that requires a diversified exposure base to be most effective. The advantage was simplicity of implementation.
What we are seeing now is actuarial and modelling departments working together to come up with their own event simulations based on loss experience. This mimics to some degree what the vendor models are doing in that a stochastic event set is used to get aggregate and occurrence based estimates of loss. It’s not a full model (although some have built that also), but it is considerably more granular than the non-scientific “fudge factor” approach and can yield much better results for less diversified books. We have had considerable experience integrating these loss curves with vendor model curves to produce overall portfolio rollups.
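The mechanics behind the aggregate and occurrence estimates mentioned above can be sketched with a toy simulation. This is a minimal illustration, not any particular vendor's or client's model: it assumes Poisson event frequency and exponential severities purely for demonstration, builds a year-loss table, and derives aggregate (AEP) and occurrence (OEP) exceedance probabilities from it. Combining books by pairing simulated years is one simple rollup approach, assuming the tables share the same simulation years.

```python
import math
import random

def year_loss_table(n_years, annual_freq, mean_sev, rng):
    """Simulate a year-loss table: for each year, draw a Poisson event
    count (Knuth's method) and an exponential severity per event.
    Distribution choices here are illustrative only."""
    def poisson(lam):
        limit, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                return k
            k += 1
    return [
        [rng.expovariate(1.0 / mean_sev) for _ in range(poisson(annual_freq))]
        for _ in range(n_years)
    ]

def aep(table, threshold):
    """Aggregate exceedance: share of years whose TOTAL loss exceeds threshold."""
    return sum(1 for yr in table if sum(yr) > threshold) / len(table)

def oep(table, threshold):
    """Occurrence exceedance: share of years whose LARGEST single loss exceeds threshold."""
    return sum(1 for yr in table if yr and max(yr) > threshold) / len(table)

def combine(table_a, table_b):
    """Roll two books into one portfolio by pairing simulated years
    (assumes both tables were simulated over the same years)."""
    return [a + b for a, b in zip(table_a, table_b)]
```

Since a year's largest event loss can never exceed its total loss, OEP at a given threshold is always at or below AEP at that threshold; the same year-pairing trick is what lets an in-house non-modelled-peril event set be rolled up alongside vendor model output.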
Where do you envision the use of technology in the ILS market in ten years’ time?
The success of ILS is due in no small part to its ability to assume risk at rates traditional reinsurers struggle to match. This is achieved, in part, by having a low operating expense and being more nimble than their larger cousins. We expect to see traditional reinsurers rise to meet these new challengers by leveraging their sophisticated analytics and deep underwriting platforms.
For ILS players to maintain their advantage while taking on more risk, companies will have to continue to grow their underwriting and risk management capabilities. They don’t have the luxury of throwing people at the problem, so systems must become sophisticated enough to empower an individual to do the work of a team.
Our thanks go to Dmitry for this insight into the world of software development for the risk, reinsurance and ILS sectors at Treefrog Consulting Ltd.
Read an article Dmitry wrote for Artemis last year: Aggregates; what you don’t know can hurt you.