How many decimals should I carry?

Here’s a good question I got by email recently. I think it’s a common question for people first starting out with fellowship exams:

I was wondering: what’s the best way to go about rounding intermediate values on the exam? Let’s say I’m showing my work on a problem and I write out some values to 4 decimal places, then use those values to calculate the final answer. Should I use those rounded values, or the unrounded values stored in my calculator?

I realize it probably won’t make a huge difference, but if I’m trying to find a mortality rate or confidence interval, I might get different answers based on the method I use.

The good news with FSA exams is that precision isn’t nearly as important as it is on the prelims. Graders are pretty forgiving when it comes to rounding error, and you will see a wide variety of rounding in model solutions posted by the SOA.

My general rule of thumb is to carry no more decimal places than absolutely necessary. In most situations I work with only 2 decimal places, and when the numbers are very large I usually work in whole numbers. Mortality rates are usually easier to work with on a per-1,000 basis, which effectively means 3-4 decimal places (e.g. 5 per 1,000 is 0.005).
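As a quick illustration of the per-1,000 convention, here’s a minimal sketch (the numbers are made up for the example, not from any exam):

```python
# Hypothetical example: converting a mortality rate quoted
# "per 1,000 lives" into decimal form.
rate_per_1000 = 5.0          # "5 per 1,000"
q = rate_per_1000 / 1000     # decimal form: 0.005 (3 decimal places)

# Expected deaths among 10,000 lives at that rate
lives = 10_000
expected_deaths = lives * q  # 50.0
print(q, expected_deaths)
```

Note that rounding the decimal form to 2 places would give 0.01, doubling the rate, which is why per-1,000 quantities need those 3-4 decimals.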

Annuity factors are notorious for causing frustration due to rounding, so I sometimes carry 4-5 decimals on annuity factors in examples to avoid confusion. But in an exam situation, working with an annuity factor like 10.15 is totally fine, since there usually aren’t many subsequent calculations in which things can go astray.
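To see how small the drift actually is, here’s a quick sketch (illustrative numbers, standard annuity-due formula, not from any exam): price a 10-year annuity-due at 5% using the full-precision factor versus the same factor rounded to 2 decimals.

```python
# Rounding drift in an annuity factor: full precision vs. 2 decimals.
i = 0.05
v = 1 / (1 + i)

# 10-year annuity-due factor: (1 - v^10) / i, times (1 + i) for due timing
a_full = (1 - v**10) / i * (1 + i)   # about 8.1078
a_rounded = round(a_full, 2)         # 8.11

payment = 1000
pv_full = payment * a_full           # about 8107.82
pv_rounded = payment * a_rounded     # 8110.00
print(pv_full, pv_rounded, abs(pv_full - pv_rounded))
```

The difference is a couple of dollars on a present value of roughly 8,100, which is exactly the kind of rounding error graders tend to forgive.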

So in summary, the best answer is: carry only as many decimals as you feel comfortable with, but no more. How’s that for precision? Ha.
