Understanding Base Criteria for Using SEC XBRL Financial Filings
There is a base set of criteria which must be satisfied in order to make use of information reported in SEC XBRL financial filings. Satisfying these criteria does not mean that all of the reported information is useful. Or, said another way, these base criteria are not sufficient. However, these base criteria are necessary.
All of these base criteria, in my personal opinion, can and should be verified to be true before the SEC even accepts an XBRL-based financial filing. Every one of these base criteria is automatable.
These base criteria were determined from analysis of 7160 SEC financial filings, all of which were 10-K filings. The analysis is summarized in the document Summary of Analysis of SEC XBRL Financial Filings.
The majority of SEC XBRL financial filers satisfy each category of these base criteria; fewer filings satisfy all of the criteria. The SEC clearly does not use its inbound validation to detect these base criteria, for if it did, these issues would not exist in the submitted filings. The issues would be detected by automated processes, flagged, reported to the SEC filer attempting to submit the filing, and the filer would be forced to correct them before the financial information could be submitted.
This is how the FDIC XBRL-based system works.
The closest thing that I have seen in the market which satisfies the detection of these base criteria so that they can be identified and corrected is XBRL Cloud's EDGAR Dashboard. However, that commercially available tool does not cover all of the base criteria.
You have to give the accountants creating these SEC XBRL financial filings credit for doing as good a job as they do using manual verification processes. But manual verification is time-consuming, expensive, and lets errors fall through the cracks. For digital financial reporting to be deemed to work, there cannot be cracks.
So, what are the base criteria? Here is a list which explains each criterion, where SEC XBRL financial filings are today based on my analysis, and an example or other information to help you understand why the criterion must be satisfied.
- Automated XBRL Technical Syntax Validation Errors: 99.9% of all SEC XBRL financial filings already satisfy this criterion, so it is not worth discussing in detail; it works. The 0.1% shortfall is caused by inconsistent XBRL implementations by software vendors. Fundamentally, if software cannot read the XBRL technical syntax without error, the information is somewhere between unreliable and unusable. The good news? This works very well.
- Automated EDGAR Filer Manual (EFM) Errors: 86.1% of all SEC XBRL filers pass the EFM validation rules. Why not 100%? Because of differing interpretations of the EFM rules by software vendors, including the SEC. The good news is that, from what I can see, most of the automated EFM validation errors relate to improperly formatted HTML within [Text Block]s.
- Automated US GAAP Taxonomy Architecture Model Structure Errors: 97.9% of all SEC XBRL financial filings properly distinguish and properly organize [Table]s, [Axis], [Member]s, [Line Items], [Abstract]s, and Concepts. A small number do not. This is a problem because if a [Member] is placed within a set of [Line Items], how should software using the information interpret that?
- Balance sheet date of current period identifiable: 97.2% of all SEC XBRL financial filings have the correct relationship between the balance sheet date, the fiscal year focus, the dei:DocumentPeriodEndDate, and the required context. If these dates are incorrect or inconsistent, it is not possible to figure out the date of the current balance sheet. Software might be able to figure it out, but it can guess incorrectly. A specific example helps. In this filing, note the fiscal year focus (which says 2011) in the document information, the context of the document information (which says 2012), the document period end date concept (which says 2011), and the balance sheet date (which says 2012). Again, software can guess, but because it is a guess it could get it wrong. (A sketch of a date-consistency check appears after this list.)
- Root reporting entity is discoverable: 99.2% of SEC XBRL financial filings provide an easily detectable root reporting entity as required by the EFM. A few don't. Here is an example. Go look at the balance sheet and notice that there are multiple balance sheets and the reporting entity's balance sheet cannot be detected. If you cannot detect the one legal entity or one scenario you should be using, you have to venture a guess, and if you have to guess, you can guess wrong. Besides, if guessing is unnecessary for 99.2% of filings, why can't the other 0.8% just get with the program?
- Fundamental accounting concepts identifiable/distinguishable and relations are intact: 98% of SEC XBRL financial filings report fundamental concepts and have the same relations between those concepts. For example, "Assets = Current assets + Noncurrent assets". Whether or not a filer explicitly reported one of these fundamental concepts, if software cannot sort this out, bad things happen when trying to use the information. Here is more information on what these fundamental concepts and relations are. Hard to disagree with those. Don't agree with my list? Fine, modify the list. But again, if this cannot be sorted out, the information is not usable. (A sketch of such a check appears after this list.)
- Primary financial statement computations proven to be intact: 79.1% of SEC XBRL financial filers provide the business rules which prove that their balance sheets, income statements, and cash flow statements add up (roll up) correctly. But 20.9% don't. As a result, filers in that 20.9% often reverse the polarity of a reported number; for example, they report (1000) when they should have reported 1000. Easy enough to fix: just provide the XBRL calculations which the SEC requires anyway. (A sketch of a roll-up check appears after this list.)
- No detectable accounting or reporting anomalies found to exist: I don't have a reliable number for this criterion. I have numbers on a test-by-test basis, and I have 26 tests. But this is really a wild card: the list of tests will grow, and grow, and grow. As the number of tests grows, accounting anomalies will be identified and corrected, and the information will become more usable. There will be thousands of tests like this; I am clearly only scratching the surface. Here is an example: if an SEC filing reports current assets but you cannot find current liabilities, how trustworthy is the information? Now you might say that if current liabilities is zero and not reported, that is not an anomaly. Well, I would say two things. First, if it is zero it should not be left unreported; it is zero and should be reported as zero. That will keep software from doing the wrong things. Second, most filers who have this error report "us-gaap:Liabilities" when they should have reported "us-gaap:LiabilitiesCurrent". That is a problem. (A sketch of this anomaly check appears after this list.)
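To make the date-consistency criterion concrete, here is a minimal sketch of the kind of automated check I have in mind. It assumes the dei:DocumentFiscalYearFocus, dei:DocumentPeriodEndDate, and current balance sheet instant have already been extracted from the filing; the function name and data shapes are mine, not part of any standard API, and fiscal years that do not end near December 31 would need extra handling.

```python
from datetime import date

def check_balance_sheet_date(fiscal_year_focus: int,
                             document_period_end_date: date,
                             balance_sheet_instant: date) -> list[str]:
    """Flag inconsistencies between the fiscal year focus, the
    dei:DocumentPeriodEndDate, and the current balance sheet instant."""
    issues = []
    if document_period_end_date != balance_sheet_instant:
        issues.append(
            f"DocumentPeriodEndDate {document_period_end_date} does not match "
            f"the balance sheet instant {balance_sheet_instant}"
        )
    # For most filers the fiscal year focus is simply the year of the
    # period end date; straddling fiscal years need filer-specific handling.
    if fiscal_year_focus != document_period_end_date.year:
        issues.append(
            f"Fiscal year focus {fiscal_year_focus} disagrees with the "
            f"DocumentPeriodEndDate year {document_period_end_date.year}"
        )
    return issues

# Hypothetical values mirroring the mismatch described in the example above:
print(check_balance_sheet_date(2011, date(2011, 12, 31), date(2012, 12, 31)))
```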
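Here is a minimal sketch of a fundamental-relations check, assuming the reported facts have already been pulled into a simple dictionary keyed by US GAAP concept name (without the us-gaap: prefix). The relations listed are a small illustrative subset; the real set also has to deal with things like noncontrolling interests, and the function and data structure are mine.

```python
def check_fundamental_relations(facts: dict[str, float],
                                tolerance: float = 0.0) -> list[str]:
    """Check a few fundamental accounting relations, imputing a missing
    concept when the other two sides of a relation are reported."""
    issues = []
    relations = [
        # total                              = part A              + part B
        ("Assets",                             "AssetsCurrent",       "AssetsNoncurrent"),
        ("Liabilities",                        "LiabilitiesCurrent",  "LiabilitiesNoncurrent"),
        ("LiabilitiesAndStockholdersEquity",   "Liabilities",         "StockholdersEquity"),
    ]
    for total, a, b in relations:
        values = {name: facts.get(name) for name in (total, a, b)}
        missing = [name for name, v in values.items() if v is None]
        if len(missing) > 1:
            continue                     # cannot test or impute this relation
        if len(missing) == 1:            # impute the single missing concept
            name = missing[0]
            if name == total:
                facts[name] = values[a] + values[b]
            elif name == a:
                facts[name] = values[total] - values[b]
            else:
                facts[name] = values[total] - values[a]
            continue
        # A nonzero tolerance may be needed when reported values are rounded.
        if abs(values[total] - (values[a] + values[b])) > tolerance:
            issues.append(f"{total} != {a} + {b}: "
                          f"{values[total]} != {values[a]} + {values[b]}")
    return issues

# Hypothetical facts where the balance sheet does not add up:
facts = {"Assets": 5000.0, "AssetsCurrent": 3000.0, "AssetsNoncurrent": 1500.0}
print(check_fundamental_relations(facts))   # flags Assets != 3000 + 1500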
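And a sketch of a roll-up check in the same spirit as XBRL calculation validation: a reported total should equal the weighted sum (weight +1 or -1) of its contributing line items. The cash flow facts at the bottom are hypothetical and the helper is mine; a real implementation would read the contributors and weights from the filing's calculation linkbase rather than hard-coding them.

```python
def check_roll_up(facts: dict[str, float],
                  total: str,
                  contributors: list[tuple[str, int]],
                  tolerance: float = 0.0) -> str | None:
    """Verify that a reported total equals the weighted sum of its
    contributing line items, as a calculation relationship expresses it."""
    if total not in facts:
        return None
    reported = [(name, weight) for name, weight in contributors if name in facts]
    computed = sum(facts[name] * weight for name, weight in reported)
    if abs(facts[total] - computed) > tolerance:
        return (f"{total} reported as {facts[total]} but its contributors "
                f"sum to {computed}; check for reversed polarity")
    return None

# Hypothetical cash flow roll-up where one item's sign was flipped:
facts = {
    "NetCashProvidedByUsedInOperatingActivities": 200.0,
    "NetCashProvidedByUsedInInvestingActivities": -1000.0,   # should be +1000
    "NetCashProvidedByUsedInFinancingActivities": 300.0,
    "CashAndCashEquivalentsPeriodIncreaseDecrease": 1500.0,
}
print(check_roll_up(
    facts,
    "CashAndCashEquivalentsPeriodIncreaseDecrease",
    [("NetCashProvidedByUsedInOperatingActivities", 1),
     ("NetCashProvidedByUsedInInvestingActivities", 1),
     ("NetCashProvidedByUsedInFinancingActivities", 1)],
))
```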
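Finally, the current assets/current liabilities anomaly mentioned above reduces to a trivial test once the facts are in hand; this sketch uses the same hypothetical facts dictionary as the checks above.

```python
def check_classified_balance_sheet(facts: dict[str, float]) -> list[str]:
    """Flag a classified balance sheet that reports current assets but no
    current liabilities (even a zero should be reported explicitly)."""
    issues = []
    if "AssetsCurrent" in facts and "LiabilitiesCurrent" not in facts:
        issues.append(
            "AssetsCurrent is reported but LiabilitiesCurrent is not; either "
            "LiabilitiesCurrent was omitted or us-gaap:Liabilities was reported "
            "where us-gaap:LiabilitiesCurrent was intended"
        )
    return issues
```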
While I have not characterized these base criteria as data quality logic or business logic, all are based on logic. This is not really a matter of opinion. In fact, it cannot be a matter of opinion if the information is to be called usable digital financial information.
All one needs to do to refute what I am pointing out is write a software application and prove that it can successfully use this information. Until someone can justify otherwise, and I would love to be wrong about this, each one of these base criteria, when not satisfied, leads to unsafe and unpredictable interpretation of the information. And I am not talking about creating a guessing game, a game which different software vendors would implement in different ways so that we end up with one financial statement having two different meanings depending on which software application you use. This information should be reliable, predictable, and easy to safely grab.
Unsafe, unreliable, and unpredictable information cannot exist if SEC XBRL-based financial information is to be considered appropriately usable.
How hard is it to create automated software routines to detect these base issues? Well, I did it. Therefore it cannot be considered that hard.