SAP system migration management
When was the last time someone jumped up and offered to be a data steward? Can’t remember? Neither can I! It is one of the most needed positions in a company and in a systems implementation, yet it seems to be the forgotten function. Why?
Data is the driver
As everyone knows, data is the driver in today’s environment. Getting it right, keeping it right and managing it right are the ‘101’ of any decision-making activity. Yet we often tolerate unmanaged data, putting up with data downloaded from many sources and integrated into the ‘mother of all spreadsheets’ for operational reporting and/or decision-making. Remember that most companies are dealing with many complex legacy systems alongside some newer, more modern systems. How do these systems coexist? Who’s the master, who’s the slave? This is a decision that needs to be made before companies extend their portfolio footprint.
Data needs to be considered a corporate asset: something with intrinsic value, without which a company would find it difficult to exist. Without good data, decisions become painful and prone to nasty or unforeseen outcomes. We get someone to clean up data so it can be reported upon, then never update it in the source system – so someone else has the heartache of doing it again. Not a good practice: time-consuming, expensive, slow. So, what are some solutions to help resolve this dilemma?
A good place to start is to convince executive management that without good data, they’re just fishing for answers. Demonstrate examples of companies that have made decisions with poor or inconsistent data (there are lots of examples – just Google it) and put a dollar value on the impact of deciding without data that was owned, managed, curated and verified.
Then you need an ownership structure for key data. Naturally, you’re going to ask who determines what ‘key’ data is. Key data is data without which a company would struggle to survive: customers, pricing, cost, or whatever piece of data is critical to your business or vertical. Key data isn’t the whole library of possible data elements (that would be totally over the top!); it is the data that is key to your decision making. Determining what key data actually is and who should own it can be an arduous task, but once it’s done, you have alignment with the executives and functions of the company. You have also built the case for the management of this data.
At a former company, we had a huge project to completely replace existing legacy systems with new, better-integrated technology: move business decision-making into the functions where it mattered, yet provide a real-time reporting capability to the executives (and functions). Wow, what a great venture to be part of!
When I arrived at the company, the overview chart I reviewed had a small circle on it called ‘Master Data.’ I met with the CFO and said bluntly that if he didn’t make that ‘master data’ circle the biggest circle, the money being spent on the overall system would be largely wasted, and the company would end up with mountains of manual processes to control the very data it was trying to get a grip on and make real-time. We were not talking chicken feed: this was a multimillion-dollar endeavor that covered several years of implementation, training, integrations and more. Yet the concept of data management was essentially missing.
From here we went into a set of workshops to look at what we would need to do to control key data elements. Lots of discussions, documentation and, of course, opinions. This went on for weeks until the company realized (I was there!) that it was unprepared to establish a function to control its data. It wasn’t only a cost decision: the sheer fact that data ownership crossed several functional boundaries caused angst in the decision-making process. We ended up with a compromise that essentially gave control of the data to a quasi-central authority, yet this authority didn’t have all of the rights necessary to ensure data integrity. The good news is that the company recognized it was not prepared and built a data management capability into the next year’s budget. They punted, and it worked for this initial implementation cycle.
Here is an example of how putting in a data management function yielded results that actually changed the approach to new product development.
The situation: an international company, based in Europe, rolling out SAP to at least 13 countries, with every system being migrated to SAP different from the rest. Not only did they run in different languages, but the software (home-grown in many cases) was written using variables and words in those languages. The coding structures for products were different too – the same products in different countries had different coding structures. Imagine trying to figure out which of the same products were selling across Europe!
The first thing we did was evaluate the coding structures. That took six months. Then we derived a common coding structure: another three months. Selling it in took another three months still. Then came the dilemma: how to support it?
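The mechanics of that common coding structure can be sketched simply: each country keeps a mapping from its legacy product code to the agreed common code, so the same physical product resolves to one identifier everywhere. A minimal illustration follows; the code formats, countries and mapping table here are invented for the example, not the actual structure we derived.

```python
# Hypothetical sketch: translating country-specific legacy product
# codes into one common coding structure. All codes are illustrative.

# Each country's legacy catalogue maps its local code to the agreed
# common code for the same physical product.
LEGACY_TO_COMMON = {
    ("FR", "PRD-0042"): "EU-MAT-000042",
    ("DE", "42/A"):     "EU-MAT-000042",
    ("IT", "ART42"):    "EU-MAT-000042",
}

def to_common_code(country: str, legacy_code: str) -> str:
    """Translate a legacy code into the common structure, or fail loudly."""
    try:
        return LEGACY_TO_COMMON[(country, legacy_code)]
    except KeyError:
        raise ValueError(f"No common code mapped for {country}/{legacy_code}")

# With a shared code, sales of the same product can finally be
# aggregated across countries instead of guessed at.
assert to_common_code("FR", "PRD-0042") == to_common_code("DE", "42/A")
```

The key design point is failing loudly on an unmapped code: an unrecognized product should trigger a stewardship request, not silently create yet another local variant.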
Fortunately, when we approached the team in Paris, they understood the dilemma. One person volunteered to take on the role of data steward and then enlisted several colleagues to join him in this first venture for the company. We then developed some workflow processes and easy-to-use systems that allowed countries to submit a request for a new product to be evaluated, tested, and maybe, manufactured. The data team reached out to other functions to embellish the data with their own insights and knowledge yet kept track of the data.
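The workflow process described above – countries submit a request, other functions enrich it, and the data team keeps track as it moves toward manufacture – can be sketched as a small state machine. This is a hypothetical illustration; the stage names and fields are assumptions, not the actual system we built.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the new-product request workflow: a country
# submits a request, functions enrich it with their own insight, and
# the request advances through stages toward manufacture.
STAGES = ["submitted", "evaluated", "tested", "approved_for_manufacture"]

@dataclass
class ProductRequest:
    country: str
    description: str
    stage: str = "submitted"
    enrichments: dict = field(default_factory=dict)

    def enrich(self, function: str, data: dict) -> None:
        """Other functions add knowledge; the data team keeps track of it."""
        self.enrichments.setdefault(function, {}).update(data)

    def advance(self) -> str:
        """Move the request to the next stage in the workflow."""
        i = STAGES.index(self.stage)
        if i + 1 >= len(STAGES):
            raise ValueError("Request already at final stage")
        self.stage = STAGES[i + 1]
        return self.stage

req = ProductRequest(country="FR", description="New widget variant")
req.enrich("engineering", {"tooling_required": True})
req.advance()  # request moves from "submitted" to "evaluated"
```

The point of the sketch is that enrichment and stage progression are separate concerns: any function can add data at any stage, but only the workflow decides when a product moves forward.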
When we implemented SAP, this process became core to controlling what products were actually in production and what was being requested by the countries, and it drove a process to ensure core data elements were in place before any new product was actually produced. We gave the new system the acronym GMM (Global Material Master). Within six months, people were asking if data was ‘GMM compliant.’ The phrase meant that the data was curated, verified and capable of being reported upon without question. Every country after the first complied with the new ‘GMM’ architecture, and rollouts across Europe went without the usual debate over data ownership.
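A ‘GMM compliant’ gate like the one described – core data elements must be in place before a product enters production – boils down to a completeness check. Here is a minimal sketch; the required fields listed are assumptions for illustration, not the actual GMM schema.

```python
# Hypothetical sketch of a 'GMM compliant' gate: a material record may
# only enter production when every core data element is present.
# The field list below is illustrative, not the real GMM definition.
REQUIRED_FIELDS = ["common_code", "description", "unit_of_measure",
                   "standard_cost", "owning_country"]

def is_gmm_compliant(material: dict) -> bool:
    """True only if every core field is present and non-empty."""
    return all(material.get(f) not in (None, "") for f in REQUIRED_FIELDS)

complete = {
    "common_code": "EU-MAT-000042",
    "description": "Widget, 42mm",
    "unit_of_measure": "EA",
    "standard_cost": 1.25,
    "owning_country": "FR",
}
assert is_gmm_compliant(complete)                         # full record passes
assert not is_gmm_compliant({"common_code": "EU-MAT-000042"})  # partial fails
```

Making the gate a simple yes/no check is what allowed the question ‘is this GMM compliant?’ to become a shared vocabulary: the answer is the same no matter who asks.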