Fear of systemic risk often stops digitally supported advice initiatives before they even get started. It shouldn’t.
Current UK compliance infrastructure is built around the manual provision of advice by advisers. While there have been well-publicised episodes of mis-selling, most firms have a good understanding of how their advisers operate, most of the time with good controls and good outcomes.
Digital means changing these processes and systems, so while the opportunities are vast, the consequences of getting it wrong can be severe. If you deliver your financial planning services at scale using digital channels, any problems can magnify quickly. Delivering unsuitable advice means potential customer redress, sanctions by the FCA and potentially referral to the Financial Ombudsman Service.
Digital systems, whether driven by planners or by customers, will when properly calibrated deliver more consistent answers than manual approaches and reduce the variability of your financial planning and advice. The key, of course, is to make sure that any resulting advice is accurate and suitable.
Over the last decade there has been a revolution in UK financial planning with the adoption of risk profiling and asset allocation tools. Ten years ago the use of tools was very limited, with 80% of firms adopting a manual, home-grown approach. By 2014, according to research from IFA Census, most firms used a digital service as part of their process.
In the early years many of these tools were accessed via providers, who funded them and to a greater or lesser extent used them to support their own investment propositions. (Read Ian McKenna’s latest blog on the topic)
After the Retail Distribution Review and a move away from the dependency on life offices and platforms the sector has come to realise that the objectivity of a model is key to ensuring a customer is treated fairly. Regardless of whether you are an independent or restricted planner the use of independent digital models has come to dominate.
The benefits of this approach have been to bring:
- Greater consistency;
- Increased transparency and objectivity; and
- More data and auditability.
The profiling methodology, the asset allocation model and increasingly the centralised investment proposition used by your firm will by definition have been documented and codified within your digital service and will provide more data and a strong audit trail. Managing the risk of something which is documented and codified is considerably more effective than managing ‘the conversation on the couch’.
In our experience, the firms that have best managed the risk of introducing a digital service have followed three basic rules:
- Clearly define the scope, access and limits of the digital service to ensure it’s suitable for your target audience.
Whether this is the introduction of a digital risk profiling service or fully-fledged automated advice, being clear on its scope to your target audience is critical.
Introducing a service for those who have limited investment experience in particular needs to reflect their financial knowledge and capabilities and use language they understand. Assessing that experience is important.
- Due diligence.
Such firms understand the model they adopt and its strengths and limitations. All models have limitations: they are representations of reality and as such have to make approximations and assumptions.
It is important you understand the key assumptions being made in the asset and risk model engine, particularly long-term returns, volatilities and correlations. Ask for the documentation and sense-check these against data from alternative studies. Ask how the assumptions have performed historically versus their forecasts. Ask how often they are updated and what the process for doing so is. Is it rigorous? Does it have external validation?
The key is for you to be able to check that assumptions are reasoned and reasonable and that you are able to explain the key ones to your customers so that they too can understand them where they need to.
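The kind of sense check described above can be automated in a simple way. Below is a minimal sketch, assuming hypothetical, purely illustrative figures for a model provider's assumed returns and volatilities, and an equally hypothetical historical sample to compare them against; real due diligence would use the provider's documented assumptions and genuine market data.

```python
import statistics

# Hypothetical assumptions published by a model provider (illustrative figures only)
model_assumptions = {
    "uk_equity": {"annual_return": 0.070, "volatility": 0.16},
    "gilts":     {"annual_return": 0.030, "volatility": 0.06},
}

# Hypothetical historical annual returns for the same asset classes
historical_returns = {
    "uk_equity": [0.12, -0.08, 0.15, 0.05, 0.09, -0.11, 0.18, 0.07, 0.03, 0.10],
    "gilts":     [0.04, 0.02, 0.05, 0.01, 0.03, 0.06, -0.01, 0.02, 0.04, 0.03],
}

def sense_check(assumptions, history, tolerance=0.05):
    """Flag asset classes where the model's assumed return or volatility
    differs from the historical sample by more than `tolerance`."""
    flags = []
    for asset, assumed in assumptions.items():
        sample = history[asset]
        mean = statistics.mean(sample)       # sample average annual return
        vol = statistics.stdev(sample)       # sample volatility
        if abs(mean - assumed["annual_return"]) > tolerance:
            flags.append((asset, "return", mean, assumed["annual_return"]))
        if abs(vol - assumed["volatility"]) > tolerance:
            flags.append((asset, "volatility", vol, assumed["volatility"]))
    return flags
```

A flagged asset class is not necessarily wrong, since forward-looking assumptions can legitimately differ from history; it is simply a prompt to go back to the provider's documentation and ask why.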
Look at the asset allocations at each risk level as these will be a central driver of any financial planning and in a more automated service any advice. Do they represent a smooth gradient from cash through to emerging markets so that you can fairly compare what the customer has today without contorting their position?
Some models only cover the mid-range of investment risks; in our language risk profiles 3 to 8 out of 10. It’s impossible to show a customer where they are today if their savings are mostly in a bank account and your risk spectrum doesn’t start at cash.
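These two checks, covering cash at the bottom of the spectrum and a smooth gradient of risk between profiles, can be expressed directly. The sketch below uses hypothetical allocations expressed as the weight held in growth assets per risk profile; the numbers and the 10-profile scale are illustrative, not any particular provider's model.

```python
# Hypothetical growth-asset weight per risk profile (1 = lowest risk);
# the remainder is assumed held in cash and bonds. Illustrative numbers only.
growth_weights = {1: 0.00, 2: 0.10, 3: 0.25, 4: 0.40, 5: 0.50,
                  6: 0.60, 7: 0.70, 8: 0.80, 9: 0.90, 10: 1.00}

def covers_cash(weights):
    """The lowest profile should be (close to) all cash, so a customer whose
    savings sit in a bank account can be placed on the spectrum."""
    return weights[min(weights)] <= 0.05

def is_smooth_gradient(weights, max_step=0.20):
    """Growth exposure should rise monotonically between adjacent profiles,
    with no large jumps that would distort comparisons."""
    ordered = [weights[k] for k in sorted(weights)]
    steps = [b - a for a, b in zip(ordered, ordered[1:])]
    return all(0 <= s <= max_step for s in steps)
```

A model covering only profiles 3 to 8 would fail `covers_cash` immediately, which is exactly the gap described above.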
Your philosophy should be customer first, not product first, when you are implementing your digital service. In a smaller firm this often comes quite naturally: you are at the sharp end of dealing with customers every day.
In a larger firm with different teams responsible for customer relationships, investment strategy and product selection, it is critical that your assumptions and model are built from the 'outside in', not from the 'inside out'. Let the customer-facing and investment teams do their work first, before asking 'what product do I want to sell most of?'
While past performance does not guarantee future outcomes, check the model's history and how it has performed. What's its track record? Did the risk levels and asset allocations perform as expected over the long term and during periods of stock market stress? Again, ask for the documentation; most experienced model providers will have it for you. If you don't understand something – ask! You can see Dynamic Planner's [here].
Once the new service is available, testing it against a range of example customer scenarios and outcomes is an important part of the process. Working through expected and actual outcomes for example customers with different risk profiles, portfolio sizes, goals and needs makes sure it has been calibrated properly for your organisation. A test set might, for example, be made up of customers of different ages, wealth profiles, health, marital status and budgets, with different financial priorities and levels of affordability, debt, financial dependents and so on. The more automated the service, the more you will need to test outcomes in each of these areas and ensure they are as expected. If they aren't, referring the customer to a planner, where more information can be gathered and judgement exercised, is important.
In any test case, defining a 'good financial planning outcome' is key. Good outcomes, though, are subjective, and we have found that having a range of qualified and experienced individuals contributing to what good looks like for specific customer needs is the most effective way of creating test strategies. These might include: experienced financial planners, tax experts, compliance and investment management, in addition to asset and risk modelling.
The number and range of tests depends on the scope and level of automation of your service; testing a widely used and well-documented risk profiling service might entail documenting expected outcomes for a range of customer situations at each risk profile: customers seeking income, growth or both, for example.
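The scenario-testing approach above can be framed as a small expected-versus-actual harness. The sketch below is purely illustrative: `recommend` is a hypothetical stand-in for the digital service under test, and its rules (capping risk for income seekers, referring inexperienced indebted customers to a planner) are invented to show the shape of the test, not taken from any real service.

```python
def recommend(customer):
    """Hypothetical stand-in for the digital service: maps a customer
    scenario to a risk profile, or None when the case should be
    referred to a planner for human judgement."""
    if customer["investment_experience"] == "none" and customer["debt"] > 0:
        return None  # refer: more information and judgement needed
    base = customer["risk_profile"]
    if customer["goal"] == "income":
        base = min(base, 6)  # illustrative rule: cap risk for income seekers
    return base

# Each test case pairs an example customer scenario with the expected outcome
test_cases = [
    ({"risk_profile": 7, "goal": "growth", "investment_experience": "some", "debt": 0}, 7),
    ({"risk_profile": 8, "goal": "income", "investment_experience": "some", "debt": 0}, 6),
    ({"risk_profile": 5, "goal": "growth", "investment_experience": "none", "debt": 5000}, None),
]

def run_tests(cases):
    """Compare expected and actual outcomes; return the failures."""
    failures = []
    for customer, expected in cases:
        actual = recommend(customer)
        if actual != expected:
            failures.append((customer, expected, actual))
    return failures
```

The more automated the service, the larger and more varied this case list needs to be, and every failure should be investigated before wider roll-out rather than explained away.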
Some of our clients then start testing our service ‘in a corner of the business’ on a small number of customers before rolling out more widely. Some have tested our service on their staff first and then customers prior to a wider roll-out. Some dual run the new digital service with existing processes to compare the results for a period.
It's important from a TCF perspective to know how your customers will receive your advice, and the more you apply automation, moving from planner-driven to customer-driven services, the more important that is.
Particularly when no planner is involved, careful testing is fundamental.
Best practice learnings:
- Clearly defining the scope, access and limits of the digital service to ensure it’s suitable for your target audience. Ensure it reflects their financial knowledge and capabilities and uses language they understand.
- Due diligence and understanding the strengths and limitations of your model. Make sure you understand and are able to explain the key assumptions to customers. Check how the model has performed historically.
- Test and roll out carefully. Test on example customer scenarios first to ensure you are getting the expected outcomes. Then roll out slowly, perhaps initially to staff and then to a small number of customers before a wider roll-out.