The Hidden Data Factory that Masks Process Problems
Monday, June 1, 2020 at 12:02PM
Charlie in Digital Financial Reporting

The Harvard Business Review article by Thomas Redman, Bad Data Costs the U.S. $3 Trillion Per Year, mentions what the author calls "the hidden data factory": the work through which U.S. organizations waste $3 trillion correcting errors made by others in the same organization.

In another article, WHY DATA SHOULD BE A BUSINESS ASSET – THE 1-10-100 RULE, Martin Doyle differentiates prevention cost, correction cost, and failure cost. Essentially this describes the "1-10-100 Rule" that George Labovitz and Yu Sang Chang came up with in 1992, which is widely used as a tool to describe efficiency.  In summary: it costs about $1 to verify a record as it is entered (prevention), about $10 to clean up an error after the fact (correction), and about $100 if the error is never fixed and causes a downstream failure.
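The rule's arithmetic is simple enough to sketch in a few lines. The numbers below are illustrative only: the $1/$10/$100 unit costs come from the rule itself, and the error count is a hypothetical figure, not data from any of the articles cited.

```python
# Per-error unit costs from the 1-10-100 Rule (illustrative figures).
PREVENT, CORRECT, FAIL = 1, 10, 100

errors = 500  # hypothetical number of bad records in a report

# Total cost under each strategy for the same set of errors.
print(f"prevent at entry:    ${errors * PREVENT:,}")   # $500
print(f"correct after entry: ${errors * CORRECT:,}")   # $5,000
print(f"let failures happen: ${errors * FAIL:,}")      # $50,000
```

The same 500 errors cost two orders of magnitude more when nothing is done, which is the point of the rule.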

Said another way, "An ounce of prevention is worth a pound of cure."

Lean Six Sigma techniques and philosophies, which have been around for years, seem to be completely unknown to accounting departments and auditors.  Auditors I get; they bill by the hour.

I have demonstrated that you can connect accounting, reporting, audit, and analysis processes. If that is hard to follow, try walking through this narrative to understand what I am trying to get at. Or, read the documentation. Or even better, try processing these files yourself.

If you have 100% of the information you need within a process, so that information flows effectively from one end to the other, and if you have process control mechanisms in place, then automating certain specific tasks is trivial.

Even without the use of artificial intelligence, accounting, reporting, auditing, and analysis processes would benefit.  But with artificial intelligence, this becomes a no-brainer.

Article originally appeared on XBRL-based structured digital financial reporting (http://xbrl.squarespace.com/).
See website for complete article licensing information.