I am in meeting, conference, and travel mode, but here are a few thoughts on issues that have arisen over the last few days.
One of these is the concept of understanding how data flows within and between organizations. Confusion often arises from mixing issues of reporting and report formatting into the discussion. No doubt these perspectives continue to persist–a type of zombie argument–because it is hard for non-techies to understand that, given the data, such views–which are often posed as impediments and counterarguments to data optimization–become trivial.
Here is the deal: right now, especially in contractual relationships in the public sphere, project performance management data is submitted to demonstrate accountability and progress in the expenditure of public monies. A similar relationship exists in private industry, both among parties within large organizations and through contractual and fiduciary relationships.
When it comes to project management, there are a number of areas relevant to measuring progress from which data must be collected. The issue is determining what data to collect, store, and process–and the most effective way of disseminating, analyzing, and recording responses to that data once processing transforms it into information. Thus, we have a plethora of data streams in any typically complex organization or system. In working with project management data there are a number of areas of overlap, redundancy, and suboptimization. This last is typically represented by proprietary and stove-piped data repositories that resist optimization of data across all areas of the organization or system that require access. These islands also resist integration with other data that could add value by providing further insights.
To eliminate this redundancy and unnecessary complexity requires a systemic approach to data streaming. What are the systems of record? Who are the necessary consumers of the data? How can this data be optimized? What does not belong in this equation are concerns about report formatting (especially artifacts based on human-readable formatting conceived under non-digital technology) and about levels of reporting. This last consideration becomes trivial once the questions at the beginning of this paragraph are answered. Furthermore, technologies that break down proprietary barriers in the translation and integration of data are an important consideration toward optimization. For example, the application of the international UN/CEFACT XML standard with well-defined data exchange instructions (DEIs) focused on particular industries is one way of overcoming limitations imposed by proprietary barriers. The use of standard APIs, given open rules of engagement in defining data syntax, is another approach.
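To make the point concrete, here is a minimal Python sketch of why report formatting becomes trivial once data is exchanged in a common, open format. The element and attribute names below are hypothetical stand-ins, not the actual UN/CEFACT vocabulary; the point is only that once a submittal is translated into a neutral structure, any "report" is a simple derivation from it.

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified XML that two different proprietary tools
# could both export once mapped to a common exchange standard.
SUBMITTAL = """
<ProjectPerformance project="WBS-100" period="2020-06">
  <CostItem wbs="1.1" bcws="1000" bcwp="900" acwp="950"/>
  <CostItem wbs="1.2" bcws="500" bcwp="500" acwp="480"/>
</ProjectPerformance>
"""

def read_performance(xml_text):
    """Translate common-format XML into plain dictionaries that any
    downstream consumer (dashboard, database, analysis) can use."""
    root = ET.fromstring(xml_text)
    rows = []
    for item in root.iter("CostItem"):
        rows.append({
            "wbs": item.get("wbs"),
            "bcws": float(item.get("bcws")),
            "bcwp": float(item.get("bcwp")),
            "acwp": float(item.get("acwp")),
        })
    return rows

rows = read_performance(SUBMITTAL)

# With the data in a neutral structure, a "view" of it–here, cost
# performance index (BCWP/ACWP) per WBS element–is one line of code,
# not a formatting exercise.
cpi = {r["wbs"]: r["bcwp"] / r["acwp"] for r in rows}
print(cpi)
```

The same neutral structure could just as easily feed a database load, an API response, or a human-readable report–which is exactly why formatting concerns drop out of the optimization question.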
In the end, the objective is to reduce the number of data streams, resolving redundancies and serving the needs of multiple project stakeholders by leveraging commonalities. Such an approach also reduces the disruption of organizational processes caused by supplemental information requests, as well as the inefficiencies of relying on suboptimized human-readable submittals.