If there has been one common theme during the course of my career, it has been process improvement around data. Whether I was working in Operations, Accounting, Finance, or IT, I took a consistent approach and usually delivered significant productivity gains. At the same time, I also increased quality and reduced costs. I called it working faster, better, and with less effort.

Here is my own simple graphic on this point.

Many organizations overlook easy opportunities for process improvement and productivity gains in their data processing efforts.

In this week’s blog, I am going to provide a high-level overview of how I distinguish between processing data and data processing. I will then lay out the steps for moving from one to the other and some of the rewards for making the shift.

Data Processing vs. Processing Data

| Data Processing | Processing Data |
| --- | --- |
| Standard files, fields, and data structures | Users define their own structures |
| Only required data; data validation built into the system | Free form |
| Global – dates, numbers, currency formats | Local setup / user defined |
| Upstream and downstream processes incorporated in design | As long as it works |
| Consistent, standard process across organizations; desktop procedures | Determined by user |
| Version control | User provides latest version |
| Requires technical support | Figure it out |
| Process driven | User driven |
| Limited manual intervention | Cut, paste, VLOOKUPs, pivot tables |
| Staff training investment and baseline skill requirements | Individuals determine their own training needs |
| Upfront investment; reduced learning curve | Investment when employees leave the company |
| Database skills, Java, VBA, C++, HTML, Excel – ability to identify the best tool for the job | Excel |

It is not an exaggeration to say that by moving only a few items from the right column to the left, organizations have seen significant productivity gains and quality improvements.

Over time, I have identified the items below as the key challenges to overcome. They may appear to be beyond the ability of a single manager or leader to address, but this process makes it possible to tackle them one by one.

I have added comments about change management and emotional issues at the end of this blog.

Systems that are not integrated – organic growth with limited resources usually leads to systems that don’t talk to each other.

Sometimes it is best to start with an entirely new system, but very few companies are willing to take the risk. Companies will continue to band-aid an existing ERP, GL, or other system. Many companies would sooner build a new building than change their ERP system.

Old systems with no identified owner.

Over-reliance on spreadsheets, which usually happens because the staff’s skills are spreadsheet based.

Limited analytical skills.

Processes are set up based on how they impact one department rather than the complete end-to-end (E2E) process. This can cause extra work for downstream groups.

Policies contradict data processing systems.

Multiple Vendor Systems.

By committing to a data processing mentality, you start to chip away at the above challenges and improve productivity. Barriers to E2E solutions fall by the wayside as processes improve.

Here are the simple steps:

1. Automate repetitive tasks.
2. Reduce “Data Gymnastics”.
3. Create single sources of truth.
4. Focus on improving staff skill sets.
5. Partner with IT.

The below chart outlines the “journey”.

What does this mean?

Automate repetitive tasks. In many cases, in preparation for analysis or for uploading data into an enterprise system, the same tasks are performed by the same people each week. These repetitive tasks add no value and can usually be automated with limited effort. Sometimes the answer is easy. Example: a team was downloading three reports from the same system, merging the data together, and uploading it into a different system. The solution was to program the first system to produce the report in the required format. It cost a bit of money, but it saved three hours per week and improved quality, since the data moved from one system to the other with no human interaction. Other times, macros or tools can be used to automate tasks.
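When the source system cannot be changed, the merge step itself can usually be scripted. Here is a minimal sketch in Python, assuming three report exports in CSV format that share an `id` column (the file contents and field names below are made up for illustration):

```python
import csv
import io

def merge_reports(report_files, key_field):
    """Merge rows from several CSV reports on a shared key field."""
    merged = {}
    for f in report_files:
        for row in csv.DictReader(f):
            # Rows with the same key are combined into one record.
            merged.setdefault(row[key_field], {}).update(row)
    return [merged[key] for key in sorted(merged)]

# In-memory files stand in for the downloaded reports.
report_a = io.StringIO("id,amount\n1,100\n2,200\n")
report_b = io.StringIO("id,region\n1,EMEA\n2,APAC\n")
rows = merge_reports([report_a, report_b], "id")
```

Once a script like this exists, the weekly merge becomes a single command rather than an hour of cutting and pasting, and it produces the same output every time.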

Create Single Sources of Truth (SSOT). Key data needs to be stored in one location, with a defined process for updating it. Sending spreadsheet files around via email is not productive. Whether you are a department of four or five people or a large organization, the SSOT is the foundation for success (my next blog will detail the SSOT).
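To make the idea concrete, here is one possible sketch of an SSOT: a single shared database with one sanctioned update path, instead of spreadsheet copies circulating by email. The `customers` table and fields are hypothetical; the demo uses an in-memory SQLite database where a real setup would use a shared file or server.

```python
import sqlite3

# One shared database instead of emailed spreadsheet copies.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, region TEXT)")

def upsert_customer(conn, cid, name, region):
    """The single sanctioned update path: every change flows through here."""
    conn.execute(
        "INSERT INTO customers (id, name, region) VALUES (?, ?, ?) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name, region = excluded.region",
        (cid, name, region),
    )
    conn.commit()

upsert_customer(conn, 1, "Acme", "EMEA")
upsert_customer(conn, 1, "Acme Corp", "EMEA")  # a correction updates the same row
rows = conn.execute("SELECT name, region FROM customers").fetchall()
```

Because every correction flows through the same path and lands in the same place, there is never a question of which copy is the latest version.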

Focus on improving staff skill sets. The old saying, “if the only tool you have is a hammer, then every problem starts to look like a nail,” can be updated to, “if the only tool you have is Microsoft Excel, you will create a lot of spreadsheets.” In many companies, people use Microsoft Excel as their primary data processing tool. Why? Because almost everybody knows how to use Microsoft Excel. The most advanced skills in use are VLOOKUPs and pivot tables. Time and energy are spent just getting the data into a format suitable for analytics or for feeding an enterprise database. Skill sets such as VBA, JavaScript, and database skills can make a big impact on productivity. The investment in these skill sets has always more than made up for the cost.
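As a small illustration of the payoff, here is what a VLOOKUP column becomes once basic scripting skills are in place. The price table and orders are made-up data; the dictionary plays the role of the lookup range.

```python
# Hypothetical lookup table: SKU -> unit price (the "VLOOKUP range").
prices = {"A100": 10, "B200": 25}

orders = [{"sku": "A100", "qty": 3}, {"sku": "B200", "qty": 1}]

# The script equivalent of =qty * VLOOKUP(sku, price_table, 2, FALSE)
# filled down a column:
for order in orders:
    order["total"] = order["qty"] * prices[order["sku"]]
```

The script version can run unattended on the full data set, and a missing SKU raises an error instead of silently producing an #N/A buried in a worksheet.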

Finally, partner with IT. As part of this process, great tools are created, but I did not want to be in the software support business. I always tried to meet with IT before starting the journey to explain the goal, and in most cases IT appreciated the effort. The benefit for IT is that once the efforts have matured, IT can step in and bring an enterprise focus with confidence that the business requirements are clear. IT can also serve as a consultant as processes are created. This has worked at multiple companies, so perhaps I am on to something.

Improved data processing can lead to better data and better reporting. Decisions based on data can be made with a higher level of confidence.

Here is a simple case study.

A yearly process required an audit of data before it was sent to individuals for approval. An analyst audited each record for ten separate items.

The process had been in place for many years until our team asked, “Why can’t we automate this process?”

Processing Data Approach (old).

- The analyst pulled a report from the system.
- The analyst performed 10 audit checks.
- This was a (supposedly) global standard process; 25 different ways of doing the above two steps were identified.
- Reports were manually collated, and metrics on completed audits were tracked manually and reported weekly.
- The process took about one hour to audit 25 records.

Data Processing (new).

- The audit steps were automated, and 35 additional checks were added.
- Items that passed the audit were tabulated and approved.
- Exceptions were reviewed, rejected, or fixed.
- The process was standardized to one global process.
- Metrics were tracked automatically and reported daily.
- An analyst processed about 150 records per hour.
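The automated pattern above can be sketched in a few lines: each audit check is a small function over one record, items that pass every check are approved, and anything else is queued as an exception. The record fields and checks below are hypothetical, not the actual audits from the case study.

```python
def amount_is_positive(rec):
    return rec["amount"] > 0

def has_approver(rec):
    return bool(rec["approver"])

CHECKS = [amount_is_positive, has_approver]  # the real process ran 45 checks

def run_audit(records, checks=CHECKS):
    """Split records into passed items and exceptions needing review."""
    passed, exceptions = [], []
    for rec in records:
        failures = [chk.__name__ for chk in checks if not chk(rec)]
        if failures:
            exceptions.append((rec, failures))  # queued for review/reject/fix
        else:
            passed.append(rec)  # tabulated and approved
    return passed, exceptions

records = [
    {"id": 1, "amount": 120, "approver": "jdoe"},
    {"id": 2, "amount": -5, "approver": ""},
]
passed, exceptions = run_audit(records)
```

Adding a 36th check in this structure is one new function appended to the list, which is why expanding from 10 to 45 audits cost so little once the framework existed.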

The above process was put in place at no incremental cost to the company. The identified savings in processing time were significant, and one of the additional audits found potentially ten million dollars in overcharges.

I briefly mentioned change management. I believe change management is one of the primary responsibilities of a manager. In today’s world, managers who do not have change management skills will not be successful. Change management was also a part of this journey.

One last item: security is a major concern in today’s data processing world. All of these processes ran behind the company firewall. The data was secured with passwords and, in my opinion, was much more secure than sending files around the world via email.

I will go into more details in future blogs, but this gives the high level overview of the process and benefits.
