Historical Data Access Approaches⚓︎
Historically, Financial Intelligence data has been accessed via the following two mechanisms:
- Purpose-built analytical tools, and
- Bulk data extracts.
Purpose-built (or custom) applications provide pre-canned analysis options covering the most common internal and external use cases, while bulk data extracts are leveraged by sophisticated clients willing to deal with the significant data volumes and complex data structures involved. Both of these mechanisms have significant compromises.
Firstly, purpose-built analytical tools are costly to maintain, support and upgrade, and only support the originally scoped pre-canned analysis patterns. Bulk data extracts, meanwhile, multiply the processing and storage requirements of the data, distribute subject matter expertise (possibly resulting in misunderstandings of what the data actually means) and, due to the time involved in extracting, transferring and processing data, generally result in analysis being performed on stale data.
Emerging Data Access Approaches⚓︎
Over time, the above data access mechanisms have begun to be replaced by modern alternatives. Specifically:
- Purpose-built analytical tools have begun to be replaced by Commercial Off The Shelf (COTS) and Open Source Software (OSS) analytical software. COTS and OSS analytical applications have significant advantages over purpose-built software solutions, particularly in relation to support, maintenance and continual upgrades. Effectively, organisations are pooling their requirements and funding, then outsourcing the development work to specialised COTS software vendors or the OSS community.
- Bulk data extracts have begun to be replaced by well-documented APIs. Data access APIs have significant advantages over bulk data exchange. Specifically:
- Centralising the storage, processing, subject matter expertise, and management of the data within government.
- Formalising interface definitions.
- Ensuring all searches are performed on the latest data available.
- Allowing consumers to tailor the data they receive and how regularly they access it (i.e. they can get what they need when they need it).
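The tailoring point above can be sketched as a query against a hypothetical REST API. The endpoint URL, parameter names and field names below are all assumptions for illustration, not any real agency's interface; the point is that the consumer requests only the attributes and date range they need, rather than ingesting a full bulk extract.

```python
# Sketch of a tailored data request against a hypothetical
# financial-intelligence REST API (endpoint and parameters assumed).
from urllib.parse import urlencode

BASE_URL = "https://api.example.gov/fiu/v1/reports"  # hypothetical endpoint

def build_query(entity_id: str, fields: list, since: str) -> str:
    """Build a request URL asking only for the fields the consumer
    needs, from a given date onward, instead of a full extract."""
    params = {
        "entity": entity_id,
        "fields": ",".join(fields),  # tailor which attributes are returned
        "since": since,              # only data newer than this date
    }
    return f"{BASE_URL}?{urlencode(params)}"

url = build_query("ENT-12345", ["transactions", "counterparties"], "2024-01-01")
print(url)
```

Because each call names exactly the data it wants, the provider always serves the latest records and the consumer avoids the storage and processing burden of mirroring the whole dataset.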
Going forward, the use of APIs is expected to grow rapidly as partners integrate financial intelligence APIs into automated business workflows. Consider an existing process that requires an analyst to search financial intelligence data to reach a determination on a request (e.g. a visa request or an application for a grant). This process currently involves a skilled analyst logging in, performing a number of searches, analysing the results, then probably downloading and manually integrating the data found with other data sources (and in the majority of cases finding nothing untoward).
In the future, APIs can be used to fully automate almost all of these checks, with an analyst only needing to be involved in exception scenarios (allowing them to focus their skills on the hardest problems and most complex data). This type of automation would also increase the speed at which requests are completed, improving overall client satisfaction.
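The exception-only workflow described above can be sketched as follows. The function names, result structure and the stub standing in for the intelligence API are all assumptions for illustration: every request is checked automatically, and only results that are not a clear pass are routed to an analyst.

```python
# Illustrative sketch of the automated-check pattern: the API search
# runs for every request, and a human analyst is engaged only for
# exceptions. All names here are hypothetical, not a real agency API.

def automated_check(request_id: str, search_api) -> str:
    """Return 'approved' for clean results; escalate everything else."""
    result = search_api(request_id)       # query the intelligence API
    if not result["matches"]:             # nothing untoward found
        return "approved"                 # fully automated outcome
    return "escalated-to-analyst"         # exception: human review needed

# Stub standing in for a real financial-intelligence search endpoint.
def stub_api(request_id):
    return {"matches": ["hit"] if request_id == "VISA-007" else []}

print(automated_check("VISA-001", stub_api))  # approved
print(automated_check("VISA-007", stub_api))  # escalated-to-analyst
```

Since most searches find nothing untoward, the bulk of requests complete without human involvement, and analyst time is concentrated on the small set of escalated cases.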
More generally, providing APIs allows financial intelligence agencies to focus on their core value proposition (i.e. their monopoly position in domestic Financial Intelligence data), as opposed to expending resources on user interface development, which will only ever provide clients a single silo of data for analysis.