There’s a term in network technology called ‘latency’, which refers to the delay between an instruction given by the user and its execution. You’ll hear it most often in the world of high-speed trading, where a slight increase in latency (on the order of a couple of milliseconds) can have a drastic negative impact on speed and thus performance.
As most of us realise, there is useful data and not-so-useful data, and merely having data at your disposal doesn’t mean you can tell the two apart. It’s often only in the processing phase, when we dig into the data and look for insights, that we discover whether what we’ve collected can actually drive us forward rather than lead us astray.
Critics of financial modelling will always tell you that there are simply too many moving parts and interdependencies within a company to arrive at an accurate prediction of the future. They’ll point to how easy it is to adjust a single input assumption and completely change what the model outputs. And to a certain extent, they’re right.
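To make that sensitivity concrete, here is a minimal sketch (not taken from any real model, with made-up figures) showing how a small tweak to a single growth assumption compounds over a five-year revenue projection:

```python
# Toy illustration: a small change in one input assumption compounds
# through a simple five-year revenue projection.

def project_revenue(base: float, growth_rate: float, years: int) -> float:
    """Project revenue forward by compounding a flat annual growth rate."""
    revenue = base
    for _ in range(years):
        revenue *= 1 + growth_rate
    return revenue

base = 1_000_000  # hypothetical starting annual revenue
low = project_revenue(base, 0.05, 5)   # assume 5% annual growth
high = project_revenue(base, 0.08, 5)  # nudge the assumption to 8%

print(f"5% growth:  {low:,.0f}")    # ~1,276,282
print(f"8% growth:  {high:,.0f}")   # ~1,469,328
print(f"difference: {high - low:,.0f}")
```

A three-point change in one assumption moves the year-five figure by nearly 200,000 on a one-million base, and a real model has dozens of such interacting assumptions.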
Much has been written about how changing tides, rapid disruption, and global trends impact the customer-facing side of business today. Open any business publication of your choice and you’ll hear stories of how technology has completely changed how companies think about their offering and their messaging to the market.
In the world of financial planning and analysis, we’ve long been hamstrung by financial products and tools that, while powerful, can be clunky to use. Traditionally they’ve demanded a great deal of time and training to climb the learning curve, and their sheer complexity has meant that user experience has been deprioritised while functionality has been front and centre.