The highest-performing delivery organizations in today’s business environment are technology-driven across the board, using data for knowledge gain, cost avoidance, and revenue growth. Companies face difficult decisions about which products and technologies to implement, and often, these choices are made with a tactical rather than a strategic mentality.
The Case for Governance
Data governance describes the set of rules and processes by which data is handled and utilized. The most efficient and productive analytics teams have made governance a priority, and recognize it as a link between their mission as a business unit and the overall strategic orientation of the firm. Governance describes both how data works to move the organization toward its goals and the decision-making rights and responsibilities of individual roles. Furthermore, a defined system of governance outlines a code of conduct against which all technology planning is measured.
In our experience, decentralized business units (marketing, sales, product development, entry, operations, transportation, and delivery), each with its own definitions, explanations, and descriptions of data management, can cause serious issues in communication, decision-making, and operations.
According to Gartner, AI and machine learning are expected to be part of 75% of all software within the next five years. Choosing one transportation-optimization algorithm over another can save hundreds of thousands of dollars. As winning algorithms increasingly rely on data integrity and a single, shared meaning for each data element, good data governance practices will differentiate the companies that lead from those left behind.
Optimal Handling Through Master Data Management (MDM)
A core element of many governance frameworks is Master Data Management (MDM): a discipline by which all of an organization’s data is linked to a single master reference file.
An effective implementation allows sharing of data among employees and facilitates working in disparate programming environments. The complexities of MDM require organization-wide documentation on storage and formatting to be effective.
MDM is essentially version control, ensuring that teams aren’t working with outdated or differing versions of the same codebases and datasets. In the context of big data, small irregularities can cause exponential problems in modeling and repeatability. Thus, MDM is prescribed by a plan of governance to use only the most sanitized and up-to-date versions of all digital assets.
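The version-control analogy can be made concrete. The sketch below is a simplified illustration, not any particular MDM product; the registry class, dataset names, and file contents are all hypothetical. It uses content hashes so a team can verify that its local copy of a dataset matches the approved master version:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Content hash that identifies an exact dataset version."""
    return hashlib.sha256(data).hexdigest()

class MasterRegistry:
    """Toy master file: maps dataset names to their approved versions."""
    def __init__(self):
        self._approved = {}

    def approve(self, name: str, data: bytes) -> None:
        """Record this content as the sanctioned master version."""
        self._approved[name] = fingerprint(data)

    def is_current(self, name: str, data: bytes) -> bool:
        """True only if this copy matches the approved master version."""
        return self._approved.get(name) == fingerprint(data)

registry = MasterRegistry()
master = b"id,region\n1,NE\n2,SW\n"
registry.approve("routes.csv", master)

stale = b"id,region\n1,NE\n"  # an outdated local copy
print(registry.is_current("routes.csv", master))  # True
print(registry.is_current("routes.csv", stale))   # False
```

In practice the registry itself would be centrally governed and documented, so every team resolves a dataset name to the same sanitized, up-to-date version.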
Repeatable Processes and Replication of Results
Repeatable processes allow management to expect the same results each time they utilize a set of inputs. For any given variable, there should be a set of rules that govern its use—in which models is the variable suitable for use, and under what circumstances? Teams must have a plan in place to recognize, document, and codify the processes, schematics, and templates associated with each variable so they are easy to draw from for all future applications.
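As a simplified illustration of such rules (the variable names, model names, and approvals below are hypothetical), a governance plan might codify variable-to-model suitability in a machine-checkable form so the check happens before a variable enters a model:

```python
# Hypothetical usage rules: which models each variable is approved for.
USAGE_RULES = {
    "delivery_time_days": {"forecasting", "routing"},
    "customer_zip":       {"routing"},
}

def approved_for(variable: str, model: str) -> bool:
    """Check a variable against its governance rules before use."""
    return model in USAGE_RULES.get(variable, set())

print(approved_for("customer_zip", "routing"))      # True
print(approved_for("customer_zip", "forecasting"))  # False
```

Encoding the rules once, rather than re-deciding per project, is what makes the process repeatable across all future applications.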
The importance of standardization is reflected in the scale of government, nonprofit, and civilian regulatory bodies like Idealliance, IEEE, ISO, and UL. While these organizations develop and enforce standards primarily to improve worker and consumer safety, the philosophy translates directly to the data management function. Standardization represents a collective agreement on how things should be done within the organization, allowing teams to maximize repeatability and quality. Standards are developed and adopted both through collective expertise built over time and through topical on-the-job experience.
In the context of a governance plan, risk reduction prevents decisions from being made based on poor data. The effects of decisions informed by incorrect data can grow exponentially, such that even small errors can impact profitability and efficiency in major ways. Governance quantifies these risks and assigns tolerances for data accuracy based on specific decision-making cases.
Elimination of Key-Person Dependency
Documentation of standards and processes is critical to the success of any organization, including parcel, post, and logistics companies.
When one individual holds key knowledge or understanding of a procedure or an element of the technology workflow, bottlenecks can quickly arise—often at the most inopportune moments. Governance prioritizes documentation to avert this issue. Documentation consists of rules about how data is handled, how often it is updated, and from what sources it is compiled. If employees can access extensive documentation, the problem of one person having all the answers is averted.
When data is used to produce results, develop models, communicate with shareholders and regulatory authorities, or file 10-Ks, a strong data plan makes those results defensible.
Proving the validity of data is tied to the repeatability of processes and results as described above. Quality data becomes information, which is interpreted by management and used for business decision-making. Thus, being able to explain and prove the suitability of the methods used for data consumption and usage ensures that only the highest-quality information crosses management desks.
In terms of external forces like regulation, building defensible processes for data collection and management streamlines compliance checks, limits liability, and improves shareholder confidence.
Clear Lines of Accountability
In general terms, accountability refers to being held responsible for one’s actions, or to a set of responsibilities carried by a given employee. Like all executives, data executives are accountable to shareholders and their bosses for the profitability, efficiency, and budgetary adherence of their departments. Employees are accountable for the individual functions of their jobs. End-users of data infrastructure (the business’ other staff members) are accountable to the network and security policies dictated by technology teams.
A governance plan maps out individual accountability for aspects of the data ecosystem to each role. Various roles will be responsible for collecting, sanitizing, formatting, and distributing data. The system of accountability also sets a prescribed usage for datasets for each role on the team.
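A fragment of such an accountability map might be encoded as follows. The roles, responsibilities, and dataset names here are hypothetical, and a real plan would live in governance documentation rather than code; the point is that each role's duties and prescribed dataset usage can be stated explicitly enough to audit:

```python
from dataclasses import dataclass, field

@dataclass
class Role:
    """One row of a hypothetical accountability map."""
    name: str
    responsibilities: set = field(default_factory=set)  # e.g. collect, sanitize
    datasets: set = field(default_factory=set)          # prescribed dataset usage

ROLES = {
    "data_engineer": Role("data_engineer", {"collect", "sanitize"}, {"raw_scans"}),
    "analyst":       Role("analyst", {"distribute"}, {"delivery_metrics"}),
}

def may_use(role: str, dataset: str) -> bool:
    """Is this dataset within the role's prescribed usage?"""
    r = ROLES.get(role)
    return r is not None and dataset in r.datasets

print(may_use("analyst", "delivery_metrics"))  # True
print(may_use("analyst", "raw_scans"))         # False
```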
A well-planned and executed governance structure in posts, parcel, and logistics organizations maps out exactly who in each business unit will do what by when, enabling the team to exist as a source of competitive advantage rather than a cost center.
Trust in Data/Process/Results
In a highly systematized environment, the success or failure of a given venture can be analyzed and explained in terms of the strength or weakness of policy and procedure. Data Governance provides documentation and a regimen for all workflows, which, when coupled with strong accountability, leads to higher efficiency, faster turnaround, and work output that is aligned with the corporate mission.
The Mail.dat, Mail.XML, and Shipping Services File data specifications in the US have enabled over $42 billion in commercial commerce for over a decade, as acknowledged by the US Smithsonian Postal Museum.
Through Assurety’s experience leading and managing these specifications, and providing mailing and parcel-shipping software reliant on standard data specifications since 2004, we can say that governance and standardization work only when everyone in the organization (and, ideally, the industry in the case of B2B data specifications, including public/private partnerships) plays a critical role, and when participants have both business and technical knowledge.
We’ve led and directed training for new software releases to thousands of personnel for over a decade. Our clients’ staff works in post offices, sorting and induction facilities, cross-dock facilities, and delivery centers. Our clients also engage external industry professionals from commercial mailers, publishers, marketers, banks, universities, retailers, and eTailers. All direct and indirect employees must receive access to documentation and training on how to use new software, workflows, and processes. We believe well-trained people and formally governed data determine the success or failure of any multimillion-dollar software implementation and digital transformation. We also believe that exponential organizations (versus linear organizations), which we will discuss in a future article, are the ones poised for success.