
ERP Data Conversions 101

When it's time to find a new software vendor, one of the biggest decisions you will need to make is what to do with your current data. Conversions are consistently a Quadrant II risk on projects: they are both critical to the success of the project and have a high likelihood of turning from risks into issues.





When I started my career in the Community Development domain there were still a lot of municipalities running their departments solely on paper. In these cases, many of them performed some basic imports of parcels, addresses, and owner names, but started from scratch for everything else. There was just no data to convert. Some of these customers back-entered data from the current year (or further) to allow them to manage existing projects within their new software. In other cases, municipalities had an existing system but either did not want to budget for a conversion of the data or deemed their existing data to be of too poor quality to bring forward.


As time went on, data conversions became an increasingly common requirement on RFPs.


Once you select a vendor, their conversion process will typically be defined to some extent in their Statement of Work. Some vendors may instead provide a separate document such as a Conversion Plan which outlines their process or they may even describe the process verbally. I've found that having the process documented in some manner is always a best practice. This makes it clear to all stakeholders what will be converted, how it will be done, and what the roles and responsibilities of the process are.


First, let's define the two main types of conversions you may see:


Standard vs Custom


In our previous blog entry, we looked at How Com Dev Data Goes Bad and How It Can Be Fixed. In that article, we talked about the difference between what we would call a standard and a custom conversion.


Standard Conversion


Some vendors require you to export data from your current vendor's system, 'massage' it a bit to hit a specific format, then they import it into the new software using a canned (or non-customized) import process. This process is typically the quickest and easiest, but it is the most susceptible to data loss because you may have to leave behind data that is not included in the import format. Also, since it is not custom, it provides no rules to conditionally route or clean up fields where data was not being stored consistently. This is usually best for converting from small or simple databases but is not ideal if the data in those databases needs some adjustments on its way across.
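As a concrete (and entirely hypothetical) illustration, 'massaging' an export usually means renaming columns to the canned format's headers and dropping anything the import doesn't accept. The column names below are invented for the sketch, not any real vendor's format:

```python
import csv
import io

# Hypothetical mapping: legacy export header -> canned import header.
COLUMN_MAP = {
    "PermitNo": "permit_number",
    "IssuedDt": "issue_date",
    "PermType": "permit_type",
}

def massage(export_csv):
    """Rename mapped columns and drop everything the import format ignores."""
    reader = csv.DictReader(io.StringIO(export_csv))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=list(COLUMN_MAP.values()))
    writer.writeheader()
    for row in reader:
        # Unmapped legacy columns (like InternalFlag below) are left behind.
        writer.writerow({new: row.get(old, "") for old, new in COLUMN_MAP.items()})
    return out.getvalue()

legacy = "PermitNo,IssuedDt,PermType,InternalFlag\nB-1001,2023-05-01,Building,Y\n"
print(massage(legacy))
```

Note that the dropped `InternalFlag` column is exactly the kind of silent data loss this approach is prone to.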


Custom Conversion


The other approach is a custom conversion. In this case, a vendor will either build a conversion from the ground up for the vendor you are leaving or will start with a base conversion if they have converted that system before.





Data Extraction


One of the first steps in converting data is the initial extraction. Some vendors may get a preliminary set of source data immediately, especially with systems they have not converted before. Others may wait until later in the process if they have converted the system many times and know it is a lower risk or that modifications to the base conversion are unlikely.


Source data can be in a multitude of different formats and may be exported in one of many different ways. Here are some common sources and methods that could be used to obtain historical data to convert:


  • Export - CSV or Excel files may be exported directly from your current vendor's interface

  • Reports - reports generated from your legacy system can sometimes be OCR'd and transformed into tabular or delimited data

  • Data Files - the legacy system may utilize data files which can be converted to a usable format such as CSV

  • DBMS Backup - a SQL Server, MySQL, Oracle, etc., backup of the legacy data may be easily obtained

  • Custom Programs - sometimes a custom program exists or can be written to extract legacy data and convert it to tabular or delimited data
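When a DBMS backup is available, the extraction step can be as simple as dumping each table to CSV. Below is a minimal sketch using Python's built-in sqlite3 module as a stand-in for whatever DBMS the legacy system actually uses; the table and column names are illustrative only.

```python
import csv
import os
import sqlite3
import tempfile

def dump_tables(db_path, out_dir):
    """Dump every table in a legacy database to one CSV file per table."""
    conn = sqlite3.connect(db_path)
    tables = [row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    written = []
    for table in tables:
        cur = conn.execute(f'SELECT * FROM "{table}"')
        path = os.path.join(out_dir, f"{table}.csv")
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow([col[0] for col in cur.description])  # header row
            writer.writerows(cur)  # data rows
        written.append(path)
    conn.close()
    return written

# Demo with a tiny stand-in legacy database.
tmp = tempfile.mkdtemp()
db = os.path.join(tmp, "legacy.db")
conn = sqlite3.connect(db)
conn.execute("CREATE TABLE permits (permit_no TEXT, status TEXT)")
conn.execute("INSERT INTO permits VALUES ('B-1001', 'Open')")
conn.commit()
conn.close()
files = dump_tables(db, tmp)
```

Real extractions are rarely this clean, but even a rough dump like this gives both parties something concrete to map against.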


As also mentioned in How Com Dev Data Goes Bad and How It Can Be Fixed, this data can be either flat or relational. We encourage you to read that article to see how this can affect the quality and level of detail of converted data.


Vendors will typically extract data at least once when the project begins and again just before go-live. However, data may be extracted several times in between. These can be pre-scheduled for the purpose of working with updated data throughout the development and review process or they can be requested ad-hoc due to issues found during the review process to track down missing data or other similar reasons.


Be aware that your current vendor may charge you a fee to extract data. This fee can be thousands of dollars per extraction and is typically only incurred when your current vendor has to assist with the data pull. Make sure you research these costs when budgeting for a project or you are sure to have cost overruns. This also may affect your overall conversion plan in terms of how many times you retrieve updated sets of source data during the project.





Data Mapping


Now that source data is obtained, someone needs to identify how the legacy data will line up with (or map to) the new database structure. Some basic record types such as permits, inspections, comments, etc., might match up well. However, you may have fields that exist in the source/legacy data that don't exist in the new system, or vice versa. Some of these attributes may even be required fields in the new system. Data can also be stored cryptically: field or table names that are not descriptive, values stored as enumerations, or even purposefully encrypted data fields. All of this data needs to be accounted for or it will be lost in the conversion.


In order to reconcile this, someone will typically need to go table-by-table, field-by-field to compare and map the source format to the destination. If the field names in the data tables aren't clear, you may need to look up a record in the legacy system and find the data in the user interface to see what label is associated with it before you can map it in the new system. With mature conversions which the vendor has used many times in the past, this process may be largely done; you may only need to review a small list of fields that are custom or specific to your jurisdiction. In other cases, you may be starting from scratch. It's best to know how much time your team will need to spend on this process before making a decision on a vendor, as it has the potential to consume a lot of resources on a project. If you are choosing between two vendors, one may have more experience converting the system you're leaving, making them a potentially better choice.
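One lightweight way to keep the field-by-field exercise honest is to record the mapping as data rather than in someone's head, so unmapped legacy fields can be listed instead of silently dropped. A sketch, where every table and field name is made up for illustration:

```python
# Hypothetical mapping worksheet: (legacy table, legacy field) -> destination,
# or None when a legacy field has no home in the new schema yet.
FIELD_MAP = {
    ("PERMITS", "PRMT_NO"):  ("permit", "permit_number"),
    ("PERMITS", "ISS_DT"):   ("permit", "issue_date"),
    ("PERMITS", "OLD_FLAG"): None,  # legacy field with no destination
}

def unmapped(field_map):
    """List legacy fields that will be left behind unless mapped."""
    return [src for src, dest in field_map.items() if dest is None]

print(unmapped(FIELD_MAP))
```

A list like this doubles as the agenda for mapping meetings: every `None` is a decision still owed by someone.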


Common approaches vendors may take with data mappings for conversions are:


  • Pre-defined - in this approach, the system you are leaving has likely been converted by the vendor many times. Tables and fields are already mapped and logic has already been developed to account for migrating data into the new system. When this is the case, you may still be presented with some limited decisions to make regarding mapping of fields, or you may not be able to make any changes at all. This method can be used by vendors to guarantee a limited scope conversion at a lower, fixed cost.

  • Customer Mapped - in this type, the customer is responsible for taking their current format and mapping it on their own, field-by-field, to a pre-set format provided by the new vendor. This can be less expensive up-front; however, unless you have someone on staff with strong knowledge of the schema of the existing database, you will usually end up having to reach out to your current vendor for assistance. This can create a few problems. The first is that you may be forced to notify your current vendor of your intention to move earlier than you had intended. Secondly, your current vendor may charge you to help map fields to the new format, which may negate the savings of going with this method.

  • Vendor Mapped - in many cases, your vendor may perform the majority (or all) of the mapping between the systems. When this happens, they may get a backup of the data and disappear under a curtain for a while to perform this work, emerging when it's time to get input from you or to begin the review process. This can reduce resource usage greatly among your team during the conversion process, however you may expend those resources instead more heavily during the review phase. Also, since you aren't involved in the process you may not find issues with the conversion logic until much later on, even months or years after go-live. This can help remedy short-term resource issues but can backfire later on in the project.

  • Collaboratively Mapped - from what I've seen, most vendors lean into this method the most. The vendor may start from scratch or with a base conversion. Then they may either review the current mappings and logic with you or you may work together to map the conversion from the beginning. This process may seem daunting, as some systems can have hundreds of tables and tens of thousands of fields. However, from what I've experienced, this method provides greater ROI as the complexity increases. The mapping itself may take several days at most, but it can take weeks off review time and can result in a higher quality conversion with fewer post-go-live issues. This is done in part by limiting the number of iterations needed during the review process, which we will talk about down below.




Attachments


One important consideration in conversions is how to handle attachments. These can be scanned documents, letters, images, etc., that were uploaded/attached to records in your legacy system.


We won't go into too much technical depth here but there are 3 common ways attachments are stored in systems:


  • File System - in this method, the files making up each attachment are stored in a file system such as a Windows directory. The database then contains records that reference those files, identifying their location on the network along with which record in the system they are attached to (permit number, address, parcel number, etc.). When a user opens the file from the program, it is simply retrieved from that directory and opened for the user. This method is commonly seen in On-Premise solutions.

  • Byte Array - in this method, the data for the file is stored in a data field within the database. The record containing the data is also attributed with other information such as file size, type, name, etc. When the user opens the file from the system, the software interprets the file data and displays it to the user. This is also commonly seen in On-Premise solutions.

  • Object Storage - object storage services such as AWS's S3 or Microsoft's Azure Blob Storage allow unstructured (think non-tabular) data such as attachments to be stored on a hosted/cloud service. Developers have tools that allow them to store and retrieve these attachments in this environment as needed. This is very commonly seen with hosted/cloud-based ERP solutions.
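To make the byte-array case concrete, here is a minimal sketch of pulling attachments out of a database column and writing them back out as files. It uses Python's sqlite3 module as a stand-in, and the attachments table and its columns are assumptions for illustration, not any real vendor's schema.

```python
import os
import sqlite3
import tempfile

def export_attachments(db_path, out_dir):
    """Write each BLOB-stored attachment out as a regular file."""
    conn = sqlite3.connect(db_path)
    count = 0
    for name, data in conn.execute("SELECT file_name, file_data FROM attachments"):
        with open(os.path.join(out_dir, name), "wb") as f:
            f.write(data)  # data arrives as bytes from the BLOB column
        count += 1
    conn.close()
    return count

# Demo with one fake attachment in a stand-in legacy database.
tmp = tempfile.mkdtemp()
db = os.path.join(tmp, "legacy.db")
conn = sqlite3.connect(db)
conn.execute("CREATE TABLE attachments (file_name TEXT, file_data BLOB)")
conn.execute("INSERT INTO attachments VALUES (?, ?)",
             ("site_plan.pdf", b"%PDF-1.4 demo"))
conn.commit()
conn.close()
exported = export_attachments(db, tmp)
```

Once attachments exist as plain files, they can be re-imported into whichever storage method the new system uses.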


You should know whether the conversion of attachments is included in the scope of the project when you are selecting a vendor. If it is, ensure the chosen vendor can successfully retrieve these files from the source system and has the necessary experience to convert them to whatever format and method will be used to store them going forward. Attachments are typically pretty straightforward to find and map within a conversion, but they can be easily overlooked during the process, so it's best to have them accounted for specifically in the SOW and/or Conversion Plan.



Translation Tables


Data cannot always be mapped verbatim. In some cases, you will need (or want) to transform data that is converted across to the new system. Maybe you want to rename permit or inspection types, or merge variations of stored values or control types into fewer variations. This could be due to new business processes, new functionality within the new system, or just an opportunity to clean-up bad data.


We recommend that you understand what your options are here before you sign a contract with a vendor. Some vendors may not have any tools to transform data; others may do minimal transformation on a case-by-case basis by hard-coding the transformations into the conversion logic. Some have what are known as translation tables: a user interface that allows your implementation team (or you yourself) to add records into the new database specifying values from the source database and mapping them to new values in the destination. These can go by different names, but the basic premise is the same. When the conversion is run, the translation tables are read and automatically factored into the logic. This can make adjustments more efficient by not having to rely on a developer to make these changes each iteration.


Example:

  Source Value        Destination Value
  Mechanical Rough    Rough Mechanical
  Electrical Final    Final Electrical
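Applied during conversion, a translation table amounts to a simple lookup: source values with an entry are rewritten, and everything else passes through unchanged. A sketch with illustrative values:

```python
# Illustrative translation table: legacy value -> value in the new system.
TRANSLATIONS = {
    "Mechanical Rough": "Rough Mechanical",
    "Plumbing Rough": "Rough Plumbing",
}

def translate(value):
    """Return the mapped value if a translation exists, else pass it through."""
    return TRANSLATIONS.get(value, value)
```

Because the table is data rather than code, your team can add or adjust rows between conversion runs without waiting on a developer.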



Open Records


When you perform your conversion for go-live, most commonly your final conversion, you will likely have open/active records from the legacy system. These could be outstanding invoices, issued permits, scheduled inspections, and so on. Since these records are open, they are typically the newest records in the system and were likely not included in prior data reviews (which we will talk about next) since they didn't exist at the time.


Be sure that you have a plan in place to account for open records at time of cutover. If a payment on an open permit comes in after final extraction of data but before the new system is live, you will want to know what to do with it. Will you process it in the legacy system and issue the permit, requiring you to re-enter and process it in the new system? Or will you hold it until the new system is live? How will online payments be handled? There are many considerations here and various methods of managing each. I have managed projects which handle these many different ways and I can't say one single method has been clearly the most effective. It usually comes down to customer comfort level, preference and/or cash-management requirements. The most important action that contributes to success here, regardless of the approach, is simply to have a plan. Document this plan and communicate it regularly.


Another consideration for open records unrelated to the cutover itself is how the configuration in the new system will affect open records. If the new system is designed with different or new record types than what is being converted, this can cause the converted records to not behave the same as new records which are not converted. For instance, a permit type of "Building" may not have a defined workflow path in the new system where the permit types are broken out by use-type like "Res Building". Ask this question early and know how converted records will be affected. Having a different post-go-live process for converted records vs new records may or may not have an adverse effect on user experience. You may be able to configure the system to account for these or you may be able to use translation tables to transform the legacy types into the new, configured types.




Data Review


An entire article could be written on this topic alone but I'll try to keep it simple. The data review step and the data mapping step are closely related. The better you map the data, the easier this step will be.


A vendor's data review process should be identified in the SOW or Conversion Plan. Like mapping, you will want to know what their process looks like and who is responsible for quality control of the converted data.


The possibilities are:


  • Vendor Review - the vendor performs the entire review process 'in-house' then delivers converted data to the customer at training and/or go-live. While this method has the benefit of putting the onus on the vendor for QC, it can actually cause your team to spend just as much time on data issues overall. This is because this method typically leads to more issues being identified later, even after go-live. When this happens, your team will need to spend time communicating and remedying these issues with the vendor. This will almost always lead to an unsatisfactory user experience.

  • Customer Review - the vendor delivers a raw version of the converted data to the customer, who is then required to review it against the source data. Flaws are communicated to the vendor, who fixes them and provides updated datasets for further review. This method is not recommended, since your team will be trying to review data in a system where you have little experience. This will undoubtedly lead to data issues down the line. In this method, you may also be responsible for quality to the extent that adjustments after a certain point incur an additional fee from the vendor.

  • Collaborative Review - the vendor and customer share review responsibilities. Data reviews are often scheduled in advance, with the customer and the vendor navigating the data together to combine their experience in vetting the data. This method, combined with the collaborative mapping approach, is our most recommended. As mentioned earlier, if the mapping process is done effectively, the review process will be much more successful.


During data reviews, you will likely look at complex cases and records to ensure they are converting successfully, verify previous mapping, logic, and translation table entries, and verify balances and calculated fields are accurate. Reports generated from the source data may be used to validate converted totals.
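Balance checks like the one described can be scripted: compute totals from the converted rows and compare them against the totals on a legacy report, flagging records that don't reconcile. The figures and record numbers below are made up for illustration.

```python
# Per-permit fee totals taken from a legacy system report (illustrative).
legacy_report_totals = {"B-1001": 450.00, "B-1002": 125.50}

# Individual fee rows as they landed in the converted data (illustrative).
converted_fee_rows = [
    ("B-1001", 300.00),
    ("B-1001", 150.00),
    ("B-1002", 125.50),
]

def mismatches(report, rows):
    """Return the set of permits whose converted total disagrees with the report."""
    computed = {}
    for permit, amount in rows:
        computed[permit] = computed.get(permit, 0.0) + amount
    return {p for p in report if abs(report[p] - computed.get(p, 0.0)) > 0.005}

print(mismatches(legacy_report_totals, converted_fee_rows))
```

An empty result means the totals reconcile; anything flagged becomes an item for the next review iteration.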


One often overlooked aspect of data review is testing. Be sure that there is a testing component to the data review process. Without this step, you can end up with great looking data that doesn't function properly with the new system.





Cutover Process


The exact cutover/go-live process (from legacy system to new system) is going to vary greatly based on project requirements but will typically consist of the following basic sequence:


  1. Blackout of Current System

  2. Final Data Extraction

  3. Transfer Backup (and Reports) to Vendor

  4. Vendor Conversion and Final QA

  5. Transfer Backup to Client Site (if on-premise solution)

  6. Restore Final Database to New System

  7. Connect Integrations

  8. Go-Live


Here are a few things to consider:


  • Complete or phased cutover - will there be multiple conversions of the final solution at various phased go-live milestones based on functional areas or will you have one complete final extraction, conversion, and go-live?

  • Down Time - how much time will you be down and when? Be sure to have a plan in place for internal staff and the public. This is one of the most critical components of your change management activities on the project. Try to avoid cutovers at peak times, such as just after a mass renewal or billing goes out and entries are being returned for processing. Don't try to minimize downtime to the detriment of the project, however. Listen to the advice of your vendor and try to make the recommended downtime work. The last thing you want to do is rush the vendor through the critical steps at go-live, causing further delays or issues due to mistakes.

  • Risk Assessment - have a risk management plan in place to ensure action plans are on standby for issues that materialize.

  • Parallel - sometimes it may be necessary to run a process in parallel. Paralleling can be used in data review as well to verify the outcome of a process is the same in both systems. Paralleling in the cutover process is more related to keeping operations open to the public or internal customers where data is entered twice, once into the legacy system and again into the new system. I am personally not a huge fan of this since it requires more work of the end-users but it does have its benefits in some situations where being completely offline for several days is not an option.


Regardless of how you go about it, the most important thing to remember about the cutover process is to plan well and document thoroughly. This is the point in the project where anxiety and emotion will likely be at their highest among stakeholders. Making last-minute, complex decisions under those conditions is not ideal. Having a good plan in place which is reviewed and adjusted routinely throughout the project, and which has been developed in collaboration with all of the stakeholders, can help put everyone at ease since your team will know how to respond to issues as they arise.



Whether you are upgrading with your current vendor or migrating to a new one, please consider partnering with Munivate to help guide you through the process. We are experts in ERP systems with a specialization in Community Development and Licensing. We'd love to chat with you about your project any time.


Message us using the chat in the corner or email us at info@munivate.com! We'd love to hear from you!







