Database Migration Testing

Database migration is the process of moving applications, cron jobs, all other components, and the data itself from a legacy database system to a new database system. Here "database system" means a data warehouse, Hadoop, or any other platform that can handle the existing functionality and the data volume without performance issues. A great deal of database testing is involved, especially in data migration projects. Typical data migration case studies include migrating a data warehouse to Hadoop, migrating an ODS (Operational Data Store) to Pentaho or another ETL tool, and so on. For any data migration, or when replacing one ETL tool with another, we should keep a parallel system running during the process so that data comparison becomes a fairly straightforward exercise.


What is the database migration testing strategy?

The main strategy here is to generate the source data (test data), and this can be a very challenging task before testing begins. Sometimes you won't have any data in either the legacy QA environment or the new QA environment, and you need to generate the data or simply copy it from production. For this you should have access to the databases to copy data from production to your test environment, or the concerned DBA team will do it for you. Once the source data is in place in both the old and the new QA environments, run the job, transaction, or workflow in both systems and compare the data.
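
For example, if both environments are Oracle, seeding a QA table from production might look like the minimal sketch below; the database link name prod_link and the schema/table names are assumptions for illustration only.

    -- Assumption: a database link named prod_link points to production
    -- and the tester has SELECT access to the source table.
    CREATE TABLE qa_schema.customers AS
    SELECT * FROM prod_schema.customers@prod_link;

    -- Quick sanity check that the copy completed.
    SELECT COUNT(*) FROM qa_schema.customers;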

What are the database migration test cases?

    The test cases we need to check here are listed below.

    1) Before running any job on either system, ensure that the corresponding tables or files contain the same source data on both systems.
    2) Then run the job and verify the data in all temp folders and final target tables. You can do this in one of the following ways.
    • For example, if the legacy system is Oracle and the new one is MySQL, copy the data from the target tables into spreadsheets and compare them quickly using "Spreadsheet Compare 2013".
    • Second, have a script or job that copies the data from the new system's target table into the old system; the comparison then becomes very easy using the MINUS command on Oracle or the EXCEPT command on MySQL (see the sketch after this list).
    • Third, come up with a small shell/Perl script that queries the tables, copies the data to text files, compares them, and finally writes the comparison results, with counts and mismatches, to a text file.
    • In case the output is files themselves, compare them using the comm or diff Unix commands.
    • Or, in case the data volume is very small, compare manually by random sampling.
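
As a rough illustration of the MINUS/EXCEPT approach above, assuming both target tables have been copied into the same Oracle instance (old_target and new_target are placeholder names):

    -- Rows present in the legacy target but missing from the new target.
    SELECT * FROM old_target
    MINUS
    SELECT * FROM new_target;

    -- Rows present in the new target but missing from the legacy target.
    SELECT * FROM new_target
    MINUS
    SELECT * FROM old_target;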


3) Sometimes in Oracle the temporary tables are created as global temporary tables. When the DBA creates the table with "on commit delete rows", the data is deleted from the temporary table whenever a commit occurs; with the other option, "on commit preserve rows", the data is preserved until the session ends, and only then is it deleted from the global temporary table. The new database may not offer this behaviour in the same form, so testers have to check for duplicate or cumulative data in the temporary tables.
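
For reference, a minimal Oracle sketch of the two options (the table and column names are illustrative only):

    -- Data disappears as soon as the loading job commits.
    CREATE GLOBAL TEMPORARY TABLE stg_orders_tmp (
      order_id  NUMBER,
      order_amt NUMBER
    ) ON COMMIT DELETE ROWS;

    -- Data survives commits and is cleared only when the session ends.
    CREATE GLOBAL TEMPORARY TABLE stg_orders_sess (
      order_id  NUMBER,
      order_amt NUMBER
    ) ON COMMIT PRESERVE ROWS;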


4) Once you are done testing a job, try to rerun the job; we should not get duplicate records in any of the target tables.
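
A simple duplicate check after the rerun is a GROUP BY on the business key; here target_table and order_id are placeholder names:

    -- Any rows returned indicate duplicates introduced by the rerun.
    SELECT order_id, COUNT(*)
    FROM target_table
    GROUP BY order_id
    HAVING COUNT(*) > 1;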

5) In migration projects we don't cut over from the old system immediately; it should keep running in parallel at least for some time, until we are confident that we are not going to face any issues in the future.

Data Migration Testing:

We can say data migration testing is a subset of database migration testing; let me clarify the difference between the two. In database migration testing we test a workflow, job, or task, compare the data, and conclude whether the job has been developed as per the requirement (or the legacy system) and is working as expected. Data migration testing comes into the picture when the job has already run for many years in the legacy system and has generated plenty of historical data in different tables. We then need to check whether that historical data has been migrated/transferred to the new system properly.


1) Check the counts in each table; if the counts themselves do not match, there could be a problem in the data copy or migration.
2) If the counts match, randomly pick records from both the old and the new system and compare the data. If there are no issues, check the functionality once from the application perspective.

3) If you do not have any tools to compare the data, just check the following; a combined sketch of the NULL-count checks appears after this list.

  • Check the count of NULL fields in a row.
  • Check the count of NULL fields in a column.
  • Check the counts using GROUP BY clauses.
  • SELECT column_name, count(*)
    FROM table_name
    GROUP BY column_name

    For more granularity you can add the following HAVING clause:
    HAVING count(*) >= 2
    
  • Check the lowest value in a number or int column.
  • SELECT min(column_name)
    FROM table_name

  • Check the highest value in a number or int column.
  • SELECT max(column_name)
    FROM table_name

  • Check the average value of a number or int column.
  • SELECT avg(column_name)
    FROM table_name

  • Check the total (sum) of values in a number or int column.
  • SELECT sum(column_name)
    FROM table_name
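
As a rough, combined sketch of the NULL-count checks above; column_name, col1, col2, col3, and table_name are placeholders, as in the other examples:

    -- Count of rows where a given column is NULL.
    SELECT COUNT(*)
    FROM table_name
    WHERE column_name IS NULL;

    -- Count of NULL fields in each row, across a few columns of interest.
    SELECT (CASE WHEN col1 IS NULL THEN 1 ELSE 0 END)
         + (CASE WHEN col2 IS NULL THEN 1 ELSE 0 END)
         + (CASE WHEN col3 IS NULL THEN 1 ELSE 0 END) AS null_fields_in_row
    FROM table_name;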
    
