
Upgrade Toolkit - Oracle Documentation


Contents

3.3.6 Verifying Data after Database/Module Upgrade
4.1 Introduction
4.3 Upgrade of Revamped Modules
4.4 Migrating Data from Loans and Deposits to Consumer Lending Module
4.4.1 Migrating Products from LD to CL
4.4.2 Migrating Contracts from LD to CL
4.4.3 Migrating Commitments
4.5 Migrating Data from LM Module to ELCM Module
4.5.1 Migration Approach
4.5.2 Prerequisites
4.5.3 Enabling Triggers
4.5.4 Migrating Data
4.5.5 Truncating Database
4.6 Migrating Data from Branch to Retail Teller
4.7 ATM POS Modules
4.8 Upgrading Existing Modules
4.8.1 Generic Conversion
4.8.2 Upgrading Core Module
4.8.3 Upgrading SM
This verification includes the following steps.

Generic Checks

The generic check includes the following:
- Check the main parameter table cstb_param for the parameter values.
- Menu organization is as per the static factory-shipped data and is handled from the import. The ELCM and CL modules are not handled by the script. You need to remove the LD and Limits menu functions and add the ELCM and CL menu functions.
- The bank may need to modify their user roles. You need to take care of the change of user roles for the new modules. A script will introduce all such roles into ALLROLES.
- Unlock and save all module maintenances such as Products, ICCF Rule, ICCF Class, UDF, etc.

EOD and Performance Testing

This verification includes the following:
- Configure the following as part of the FCJ configuration:
  - Install Oracle Web Cache as specified in the sizing document.
  - Change the Internet Explorer settings as per the IE settings recommended in the DBA server sizing document.
- Ensure that onsite changes are not done to introduce 'no cache' in the code.
- Configure realistic user roles instead of the usual ALLROLES.
- Launch basic screens of high-volume modules and process them onsite.
- Record and review the EOD/EOM timelines before and after the upgrade to check if there are any major variances.
- Check if all the programmes maintained the EOD window.
- Identify temp soft changes and take appropriate action to set up the target version with the temp soft changes. Temp soft changes refer to the customization changes and bug fixes on the source version.
- Set up the target version of Oracle FLEXCUBE Universal Banking Solutions.
- Perform module-specific changes, if any.
- Set up interfaces and adapters.
- After the upgrade, test the target version along with all interfaces and adapters.
- Get sign-off on the production environment upgrade.

Applying Temp Soft Changes

Temp soft changes refer to the customization changes and bug fixes that are applied on the source version of the application used by the customer. You need to identify the temp soft changes that should be applied in the target version.

Source File Changes

You can use DIFF tools to compare the base version of the source application and the version used by the customer.

Static Data Changes

For identifying the differences in the factory-shipped data, use the utility scripts mentioned in the Annexure.

Setting up Target Schema

You need to set up the target schema. For the purpose of illustration, let us consider a schema by the name DESTSCHEMA. You can use the Oracle FLEXCUBE Universal Banking Installer to set up the target schema. Follow the steps given below:

1. Create the target version database using the target version Installer. Refer to the installation manual of the required version for details on setting up the database.
2. Loa
7.5.4 Moving Extraction Data into History Table

For moving the extraction data into the history tables, follow the steps given below:

1. Check the following:
   - The head office branch for the source schema and the target schema are the same.
   - The TODAY column in the sttm_dates table should be the same for all the branches in the source and target systems.
   - Collect the branch code, stage and extraction date for the extraction data which is being moved into the history table.
2. In the source schema, run 1c_move_to_history_src.sql with the parameters branch code, stage and extraction date collected in the previous step.
3. In the target schema, run 1c_move_to_history_tar.sql with the parameters branch code, stage and extraction date collected in step 1.

8 Annexure

8.1 Utility Scripts

The utility scripts are given in the following table:

Script Name: Export_Site_FULL_Dump.par
Location: SOFT TOOLS > Upgradetoolkit > Soft > ImpExp
Remarks: Param file to export the production schema at site. In this par file, replace the word SOURCESCHEMA with the actual schema name to be exported. Also change the DIRECTORY, DUMPFILE and LOGFILE names as per the actual names used. This is applicable to all the par files supplied in t
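The checks in step 1 of Moving Extraction Data into History Table above can be sketched as a small pre-flight routine (illustrative Python; the dictionary layout is an assumption — in practice these values come from sttm_dates and the branch maintenance in each schema):

```python
# Illustrative sketch of the pre-checks before moving extraction data to history.
# The dict layout is an assumption standing in for queries against each schema.

def history_move_prechecks(source, target):
    """Return True only when the source and target schemas are aligned."""
    # The head office branch must be the same in both schemas.
    if source["head_office"] != target["head_office"]:
        return False
    # TODAY in sttm_dates must match for every branch in both systems.
    return all(
        source["today"].get(branch) == target["today"].get(branch)
        for branch in set(source["today"]) | set(target["today"])
    )

src = {"head_office": "000", "today": {"000": "2013-09-30", "001": "2013-09-30"}}
tar = {"head_office": "000", "today": {"000": "2013-09-30", "001": "2013-09-30"}}
print(history_move_prechecks(src, tar))  # True when both schemas are aligned
```

Only after these checks pass would the branch code, stage and extraction date be collected and passed to the move-to-history scripts.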
2A_VALIDATE_RECON_SCRIPTS_TAR.SQL
2B_UPDATE_INV_SCRIPTS_SRC.SQL
2B_UPDATE_INV_SCRIPTS_TAR.SQL
2_LD_COMT_EXTRACTION.SQL
3_RECON_MIGRT_SRC.SQL
3_RECON_MIGRT_TAR.SQL
4A_RECON_PARL_PREOD_SRC.SQL
4A_RECON_PARL_PREOD_TAR.SQL
4B_RECON_PARL_PSEOD_TAR.SQL
4C_RECON_PARL_ADHOC_SRC.SQL
4C_RECON_PARL_ADHOC_TAR.SQL
4_LD_LOAN_EXTRACTION.SQL
5_1_POST_MIGRATION_CHECKS.SQL
5_CL_UPLOAD.SQL
5_RECON_REPORTGEN_MIGRT.SQL
6A_RECON_REPORTGEN_PARL_PREOD.SQL
See Post Import Activities on page 3-6.

5.4 Installation of Other Components

The Gold Copy should be used for the setup in the production environment for all applicable components and various files.

6 Conversion Script Generation Tool

6.1 Introduction

You need to apply a set of conversion scripts on the target schema to upgrade the data after the production data is imported. This is necessary to make the target database compatible with the front-end application deployed. The new features introduced in the target version application may necessitate the application of some data conversion/upgrade scripts. Apart from this, the schematic differences in the database and constraints would necessitate certain scripts to be run in the back end before the application is opened to the bank's users.

The conversion utility is a set of scripts that includes a repository of data upgrade scripts and a PL/SQL utility to generate the scripts dynamically to address functional enhancements and the schema differences. This chapter discusses the method to use the Dynamic Script Generation tool.

6.2 Generating and Executing Scripts

The following steps are involved in the generation and execution of scripts:
- Set up parameters
- Generate the dynamic scripts and spool the module-wise spool files and control file
- Generate the dynamic script for specific modules
- Generate the dynamic script for a specific script identifier
- Generate
The tables eltb_migration_log and eltb_mig_exception_log are the log tables populated during the migration process.

6. You can control the behaviour of the migration using the function ID input parameter. If you pass the value 'ALL' as the input, then all liabilities, maintenance entities and utilizations will be migrated in one step. Alternatively, you can pass individual function IDs to migrate each entity one by one. The function IDs are listed below:
   - GEDMLIAB - Liabilities Maintenance
   - GEDCULIK - Liability Customer Link
   - GEDCOLTY - Collateral Types Maintenance
   - GEDSTYPE - Static Types Maintenance
   - GEDCOLCA - Collateral Categories Maintenance
   - GEDISSUR - Issuer Codes Maintenance
   - GEDSECTY - Securities Maintenance
   - GEDCOLLT - Collaterals Maintenance
   - GEDCOLTD - TD Collaterals
   - GEDMPOOL - Pool Collateral Linkages
   - GEDFACLT - Facilities Maintenance
   - GEDTRANS - Facilities Transfer
   - GEDBLOCK - Facilities Block
   - GEDTRKEXP - Country-wise Limits
   - GEDUTILUPD - Utilization Upload
   - GEDUTILMIG - Utilization Migration

Truncating Database

During the upgrade, you may need to iterate the migration of the LM to ELCM module. Before an iterative migration, the EL tables in the target database can be truncated in order to re-run the whole process. You can use the script ELCM_TRUNCATE.sql to truncate them. See Annexure, page 8-1, for details on the location and usage of ELCM_TRUNCATE.sql.
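The 'ALL' versus individual-function-ID behaviour in step 6 above can be sketched as a dispatch loop (illustrative Python; the function IDs and their order are from this guide, while `run_one` is an assumed stand-in for the actual fn_process call):

```python
# Function IDs from the guide, in the documented migration order.
FUNCTION_IDS = [
    "GEDMLIAB", "GEDCULIK", "GEDCOLTY", "GEDSTYPE", "GEDCOLCA",
    "GEDISSUR", "GEDSECTY", "GEDCOLLT", "GEDCOLTD", "GEDMPOOL",
    "GEDFACLT", "GEDTRANS", "GEDBLOCK", "GEDTRKEXP",
    "GEDUTILUPD", "GEDUTILMIG",
]

def migrate(function_id, run_one):
    """Dispatch the migration: 'ALL' runs every entity in the documented
    order, otherwise only the single requested function ID is run."""
    if function_id == "ALL":
        for fid in FUNCTION_IDS:
            run_one(fid)
    elif function_id in FUNCTION_IDS:
        run_one(function_id)
    else:
        raise ValueError(f"Unknown function ID: {function_id}")

ran = []
migrate("ALL", ran.append)
print(len(ran))  # 16 - all entities migrated in one step
```

Migrating one by one (for example, passing GEDFACLT alone) lets you verify the log tables after each entity before moving to the next.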
The SQL call is as follows:

cvpks_dynamic_script_gen.pr_generate_scripts('FLEXCUBE', 'S', <script_identifier>);

For generating spool files, see Spooling Module-wise Spool Files and Control File for a Run Number on page 6-3.

Generating Dynamic Script for Aborted Script Identifiers

You can regenerate all the scripts that were aborted for a specific run number. You can do this by executing the stub call_cvpks_specific_generation.sql. The parameters for this file are 'FLEXCUBE', 'A' and the run number. The parameter 'A' denotes that it is for the aborted script identifiers. The run number is the third argument.

Example

The SQL call to generate the scripts for aborted script identifiers is as follows:

cvpks_dynamic_script_gen.pr_generate_scripts('FLEXCUBE', 'A', <run_number>);

For generating spool files, see Spooling Module-wise Spool Files and Control File for a Run Number on page 6-3.

Spooling Module-wise Spool Files and Control File for a Run Number

You can generate module-wise spool files and a control file by executing the stub call_cvpks_generate_spools.sql. This stub creates the spool files for the code blocks that are already generated. The files are generated in the path maintained in CVTB_PARAM. You can generate the scripts for different run numbers by specifying the run number in the stub itself. The input to this stub is the source_code ('FLEXCUBE' by default) and then the run_number. For every run_number, the stub generates t
6B_RECON_REPORTGEN_PARL_PSEOD.SQL
6E_RECON_REPORTGEN_PARL_ADHOC.SQL
6_LIMITS_UPDATE.SQL
7_BILLS_UPDATE.SQL
8_POST_MIGRATION_UPDATES.SQL

A
ALTER_CONSTRAINTS_DISABLE.SQL
ALTER_TRIGGER_DISABLE.SQL

C
CALL_CVPKS_FULL_GENERATION.SQL
CALL_CVPKS_GENERATE_SPOOLS.SQL
CALL_CVPKS_SPECIFIC_GENERATION.SQL
CL_ACCOUNT_CREATION_ROLLBACK.SQL
CONSTRAINT_TRIGGER_DISABLE_SCRIPT.SQL
CONSTRAINT_TRIGGER_ENABLE_SCRIPT.SQL
CREATE_DB_LINK.SQL
CVPKS_RECON
CVPKS_RECON_EXTRACT.SQL

D
DROP_SEQUENCES.SQL
DROP_SEQUENCE_SCRIPT.SQL

E
ELCM_TRIGGERS_ENABLE.SQL
ELCM_TRUNCATE.SQL
EXISTING_TABLE_COLUMN_DIFF.SQL
EXPORT_SITE_FULL_DUMP.PAR
EX
CL product, there may be parameter differences in the schedule definition.
- Acquiring the unamortized portion of the discounted premium on the charge or fees component on the CL side is not supported by automated migration.

Migrating Commitments

The commitment migration strategy is similar to the loans migration strategy, except for the derivation of the commitment amount and value date. See Migrating Data from Loans and Deposits to Consumer Lending Module on page 4-2 for the migration flow of loans. You can follow the same method for commitment migration, except for the derivation of the commitment amount and value date. The derivation logic for the commitment amount and value date is given below:

Linking Type: Non-revolving
  Value Date: Minimum value date of the active loans outstanding as on the migration date
  Amount: Sum of the active loans outstanding plus the unutilized amount
  Maturity Date: Same as the original commitment
  Strategy: During CL migration, utilize the commitment

Linking Type: Revolving
  Value Date: Minimum value date of the active loans outstanding
  Amount: Original commitment amount
  Maturity Date: Same as the original commitment
  Strategy: During CL migration, utilize the commitment

Example

This example explains the derivation of parameters for non-revolving and revolving commitments. Assume that the commitment does not have VAMI events and the loans have simple BOOK and LIQD events.

Non-revolving Commitment
Event Date Utilized an FD Event Amount utili
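The derivation logic above can be sketched as follows (illustrative Python; the function name and data layout are assumptions, not part of the toolkit — the rules are the ones in the table above):

```python
from datetime import date

def derive_commitment(linking_type, active_loans, original_amount, unutilized):
    """Sketch of the commitment derivation rules above.
    active_loans: list of (value_date, outstanding) for active loans only."""
    # Value date: minimum value date among the active loans outstanding.
    value_date = min(vd for vd, _ in active_loans)
    if linking_type == "non-revolving":
        # Sum of active loans outstanding plus the unutilized amount.
        amount = sum(outstanding for _, outstanding in active_loans) + unutilized
    elif linking_type == "revolving":
        # Revolving commitments keep the original commitment amount.
        amount = original_amount
    else:
        raise ValueError(linking_type)
    # The maturity date stays the same as the original commitment.
    return value_date, amount

loans = [(date(2012, 1, 17), 400.0), (date(2012, 3, 15), 600.0)]
print(derive_commitment("non-revolving", loans, 2000.0, 500.0))
# (datetime.date(2012, 1, 17), 1500.0)
print(derive_commitment("revolving", loans, 2000.0, 500.0))
# (datetime.date(2012, 1, 17), 2000.0)
```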
and data migration, if any, in the migration area. Identify and document the verification strategy. Prepare the staging area for both the source schema and the target schema. The staging area for the source schema is required only if the strategy followed mandates it; otherwise it is not required. Identify the conversion scripts to be applied post upgrade. Prepare a plan with a timeline considering all changes required for a smooth upgrade.

3.3 Mock Upgrade Activity

The mock upgrade activity provides a safe platform for the actual production environment upgrade. You need to prepare a test area where the mock activity can be carried out. During the mock activity, you need to perform user acceptance testing (UAT) for the new modules and the functionality that are added in the higher versions. While performing the actual migration, you need to take the maintenances and parameterizations done in UAT to the production environment.

The target database after the mock upgrade serves as a Gold Copy for you to set up the upgraded production environment. You can truncate the P-data tables from the Gold Copy and re-import them from the production area. In the time between starting the mock run activity and starting the actual production upgrade activity, if any of the static data is changed, then you need to handle such data manually.

Mock upgrade involves the following steps:
- Upgrade the Oracle database to the new version
approach are as follows:
- While enabling the constraints which were disabled earlier during the process, there might be a few columns that violate the constraints. You need to handle these manually.

Upgrade Process Summary

A summary of the version upgrade process is provided below.

1. Complete the mock upgrade activity. The mock upgrade provides a safe platform for the actual production environment upgrade. Mock upgrade involves the following steps (see Upgrading Database on page 3-3):
   - Identify the source database schema. The source system should be at the 'EOTI just marked' stage on the migration date.
   - Set up the target database as per the installation manuals. Exit the installer immediately after loading the static data.
   - Retrofit the customization changes, if any, into the target database. Apply the DB object changes, source file changes and static data changes.
   - Take a full dump of the source schema.
   - In the target schema, disable all the triggers and selective constraints.
   - In the target schema, selectively enable the triggers as per the module-wise approach.
   - Import the table data alone into the target database from the full dump of the source schema.
   - In the target schema, apply the dynamic conversion scripts. This makes the target database compatible with the front-end application deployed. See Conversion Script Generation Tool on page 6-1.
   - In the target schema, enable all the trigge
extraction.sql
3. 3_commitment_upload.sql
4. 4_ld_loan_extraction.sql
5. 5_postextraction_check_html.sql
6. 5_cl_upload.sql
7. 5_1_post_migration_checks.sql
8. 6_limits_update.sql
9. 7_bills_update.sql
10. 8_post_migration_updates.sql

When the contracts should be migrated again, you need to execute the following scripts in the order given below:
1. cl_account_creation_rollback.sql
2. ld_extraction_rollback.sql

Note
Note the following:
- You need to create the CL products manually through the front end.
- Characteristics of the LD module like GL mapping, tenor, transaction code, holiday period, exchange rate code, etc. must be maintained AS-IS in CL. There should not be any deviation.
- Once the migration is completed, irrespective of the interest period basis for contracts in LD, the corresponding CL accounts will be based on 'Include from and Exclude to'.
- Schedules pertaining to a capitalized LD contract would not be reflected in CL after migration. Nevertheless, capitalization will be handled on the schedule due date.

General Migration Strategy

The strategy for migration of LD contracts to the CL module is given below:
1. The LD loans portfolio is migrated to the CL module. Only active LD contracts are considered for migration; inactive (liquidated or reversed) contracts are not considered.
2. The Contract Reference Number in the LD module is stored as the Alternate Account Number in the CL module. This Alternate A
field only if it is enabled. Otherwise, you will notice the option 'Reset Password' as enabled. Refer to the user manual Security Management System for details on this screen.

Upgrading Deposits Module

The existing deposits in the source version will have been created based on the 'Deposit' type of products as part of the LD (Loans and Deposits) module. These deposits will be migrated as corporate deposits and can be handled through the CD (Corporate Deposits) module. If required, you need to configure new TD products in the target version (12.0.0 or higher) to make use of the TD functionality.

Dynamic Package Generation for IC Rule

A stub is provided to generate the rule-based package dynamically. For each rule in the IC module, a package which is necessary for the functioning of the IC module is generated. The procedure PR_GEN_RULE is called for each IC rule defined.

Dynamic Package Generation for Products in CD/MM

A stub is provided to generate a dynamic package for each of the migrated products in the CD and MM modules. The package Ldpkss_Status_Rule_Gen_Fn_Gen is called for all CD and MM products.

Upgrading PC Module

Dynamic Package Generation for Products in PC Module

A stub is provided to generate a dynamic package for each of the migrated products in the PC module. This package will help i
is greater than zero, then the details of the contracts that are not eligible for migration will be populated in CSTB_LD_CONTRACT_CHECK. You need to scrutinize them.

Note
If you are not able to correct the non-eligible contracts, you need to manually reverse such LD contracts.

Prerequisites for Target

For the target version, you need to do the following maintenances:
- Create conversion GLs.
- Maintain CL branch parameters and bank parameters.
- Do the mapping between the LD product code and the CL product code.
- Do the mapping between the LD component name and the CL component name.
- Replicate the status codes maintained for the LD module in the CL module.
- Temporarily make the UDFs non-mandatory before migration.
- Temporarily waive the charges.
- Temporarily suppress the advices for the Book, Init and Disbursement events.
- Uncheck the flag 'Liquidated Back Value Schedules' before migration and revert it to its original value after migration.
- Maintain the interest calculation method.
- Maintain the holiday details in line with the source environment for all the branches.
- Change Manual Disbursement to Auto Disbursement before migration and revert it to its original value after migration.

Once the above maintenances are done, you can execute the scripts for migration. These scripts will migrate the eligible LD contracts to the CL module. You must execute the following scripts in the order given below:
1. 1_pre-migration_upd.sql
2. 2_ld_comt
Note
Partial truncation or partial re-migration is not supported.

4.6 Migrating Data from Branch to Retail Teller

In Oracle FCUBS version 10.0, the branch-related data was moved to a new set of tables. Conversion or upgrade scripts are not provided for the migration of branch data into retail teller. At the time of upgrade, the implementation team needs to ensure that no outstanding transactions are pending to be posted to the account at the time of cut-over in any of the web branches. The clearing checks which are yet to be paid will come as part of the host data migration and will be available in the upgraded system. Amendment or reversal of an old transaction which was entered before the upgrade is not supported in the new system after the upgrade. You can set up the new Retail Branch. For details, refer to the chapter Data Replication in the Savings user manual.

Note
Note the following:
- The table FBTB_TCDENM will be replicated when you maintain the data in the screen 'Confirm Receipts' (IVMDCONFR).
- The table FBTB_TCDENM_DESC will be replicated through the screen 'Denomination Details' (CSDDEMAN).

4.7 ATM POS Modules Impact

For handling ATM and POS transactions in Oracle FCUBS, the Switch module was introduced in version 10.3. From this version, a totally new set of tables is used. Migration scripts are not provided for the upgrade. At the time of cut-over, all the transactions shou
Contract Reference Number | Component | Due Date | Amount Due | Accrued Amount | Amount Settled
... | ... | ... | 52.960 | 52.960 | 52.960
550AMN208086xxxx | AMN2 INTER | 3/28/2012 | 37.570 | 37.570 | 37.570
550AMN208086xxxx | AMN2 INTER | 4/26/2012 | 31.170 | 31.170 | 31.170
550AMN208086xxxx | AMN2 INTER | 5/26/2012 | 24.170 | 24.170 | 24.170 (LFPS)
550AMN208086xxxx | AMN2 INTER | 6/27/2012 | 17.120 | 17.120 | 0.000
550AMN208086xxxx | AMN2 INTER | 7/26/2012 | 55.860 | 44.56 | 0.000
Total | | | 274.870 | 263.57 | 201.890

In the above example, the interest has been fully paid by the customer till 26 May 2012. The accrued interest in the above example will be 61.68. At the time of migration, the system will migrate the schedules which are partially paid or unpaid. The LD partially paid or unpaid schedules will be migrated and populated in the CL table mig_account_schedule as given below:

Contract Reference Number | Component | Due Date | Amount Due | Accrued Amount | Amount Settled
550AMN208086xxxx | AMN2 INTER | 6/27/2012 | 17.120 | 17.120 | 0.000
550AMN208086xxxx | AMN2 INTER | 7/26/2012 | 55.860 | 44.56 | 0.000
Total | | | 72.980 | 61.68 | 0.000

Note
From the migration date onwards, accrual will continue as per the existing CL calculation.

Status Change Rules

Status change rules are applicable to automatic as well as manual status changes.

Automatic Status Change Rules

After migration, irrespective of the status in the LD module, all the CL accounts have NORM status. During migration, when the CL acc
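The selection of schedules for mig_account_schedule illustrated above can be sketched as a filter over the schedule rows (illustrative Python; the tuple layout is an assumption, and the figures are the ones from the example, in the contract currency):

```python
# Sketch of selecting the schedules that migration carries into
# mig_account_schedule: only partially paid or unpaid schedules qualify.

schedules = [
    # (due_date, amount_due, accrued_amount, amount_settled)
    ("3/28/2012", 37.570, 37.570, 37.570),
    ("4/26/2012", 31.170, 31.170, 31.170),
    ("5/26/2012", 24.170, 24.170, 24.170),   # LFPS - fully paid
    ("6/27/2012", 17.120, 17.120, 0.000),    # unpaid
    ("7/26/2012", 55.860, 44.560, 0.000),    # unpaid
]

# A schedule is migrated when the settled amount is less than the amount due.
migrated = [s for s in schedules if s[3] < s[1]]

print(round(sum(s[1] for s in migrated), 2))  # 72.98 (amount due carried over)
print(round(sum(s[2] for s in migrated), 2))  # 61.68 (accrued amount)
```

The totals match the second table above: only the two schedules after the last fully paid one are carried into mig_account_schedule.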
H_CODE parameter needs to be the specific branch itself.

4. Follow the steps below in the target system:
   - Run 4c_recon_parl_adhoc_tar.sql with the parameter BRANCH_CODE as 'ALL' for all-branch extraction. For a specific branch, the BRANCH_CODE parameter needs to be the specific branch itself.
   - Run 6e_recon_reportgen_parl_adhoc.sql with the parameter BRANCH_CODE as 'ALL' for all-branch extraction. For a specific branch, the BRANCH_CODE parameter needs to be the specific branch itself.

Generating Parallel Run Reconciliation Report

Once the data is migrated, you need to run the EOD batch on both the source and the target environments at the same time. You can check specific entities and mark them for the parallel run. The parallel run reconciliation report provides the details of data reconciliation after the parallel EOD batch. For generating this report, follow the steps given below:

1. Check the following:
   - The head office branch for the source schema and the target schema are the same.
   - The branch for which the recon is planned to be executed.
   - The TODAY column in the sttm_dates table should be the same for the branch in the source and target systems for which the report is generated.
   - The stage during which the recon is planned to be executed. It can be MarkEOTI or PostBOD. The Oracle FLEXCUBE logical stage is the same for the source schema and target schema.
   - The report generation path is av
If it is set to 'N', the scripts are generated but not spooled.

For scripts to be spooled later, see Spooling Module-wise Spool Files and Control File for a Run Number on page 6-3.

6.2.2 Generating Dynamic Scripts and Spooling Files

Before you generate the dynamic scripts, ensure that the data in the parameter table cvtb_param is set as per the requirement. In order to generate and spool the scripts, execute the stub call_cvpks_full_generation.sql at the SQL prompt. The stub will generate the code for the script identifiers and spool the module-wise script files and control file in the folder specified in the WORK_AREA parameter.

6.2.3 Generating Dynamic Script for Specific Modules

You can generate scripts for all the script identifiers of one or more specific modules. In order to generate the scripts for specific modules, you need to execute the stub call_cvpks_specific_generation.sql. The parameters for this file are 'FLEXCUBE', 'M' and the list of modules. The parameter 'M' denotes that it is module specific. You need to provide the list of modules, separated by commas, as the third argument. The modules will be validated against the maintenance in cvtm_module.

Example

The SQL call to generate scripts for the modules BC and SI is as follows:

cvpks_dynamic_script_gen.pr_generate_scripts('FLEXCUBE', 'M', 'BC,SI');

In order to execute it at the SQL prompt, you need t
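The validation of the module list against cvtm_module described above can be sketched as follows (illustrative Python; the function name and the set standing in for the cvtm_module maintenance are assumptions):

```python
# Sketch of validating the comma-separated module list passed with the 'M'
# mode against the modules maintained in cvtm_module.

def validate_modules(requested, maintained):
    """Split the comma-separated list and reject any module that is not
    maintained; returns the validated module list."""
    modules = [m.strip() for m in requested.split(",")]
    unknown = [m for m in modules if m not in maintained]
    if unknown:
        raise ValueError(f"Not maintained in cvtm_module: {unknown}")
    return modules

print(validate_modules("BC,SI", {"BC", "SI", "LD", "CL"}))  # ['BC', 'SI']
```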
NTEREST | 3/15/2012 | 110 | 110
406LILP10040000Z | PRINCIPAL | 3/15/2012 | 1000 | 1000

Contract Reference Number | Component | Due Date | Amount Due | Amount Settled
406LILP10040000Z | INTEREST | 4/15/2012 | 110 | 110
406LILP10040000Z | PRINCIPAL | 4/15/2012 | 1000 | 1000
406LILP10040000Z | INTEREST | 5/16/2012 | 100 | 100
406LILP10040000Z | PRINCIPAL | 5/16/2012 | 1000 | 0
406LILP10040000Z | INTEREST | 6/15/2012 | 100 |
406LILP10040000Z | PRINCIPAL | 6/15/2012 | 1000 | 0
406LILP10040000Z | INTEREST | 7/15/2012 | 100 |
406LILP10040000Z | PRINCIPAL | 7/15/2012 | 1000 | 0

Example — Case 2

In this example, the Principal component is fully paid on March 30, 2012 and the Interest component is fully paid on April 15, 2012. LFPS in this case would be March 30, 2012.

Contract Reference Number | Component | Due Date | Amount Due | Amount Settled
406LILP10040000Z | INTEREST | 1/17/2012 | 120 | 120
406LILP10040000Z | INTEREST | 2/15/2012 | 120 | 120
406LILP10040000Z | INTEREST | 3/15/2012 | 110 | 110
406LILP10040000Z | PRINCIPAL | 3/30/2012 | 1000 | 1000
406LILP10040000Z | INTEREST | 4/15/2012 | 110 | 110
406LILP10040000Z | INTEREST | 5/16/2012 | 100 | 50
406LILP10040000Z | INTEREST | 6/15/2012 | 100 | 0
406LILP10040000Z | PRINCIPAL | 6/30/2012 | 1000 | 0

The migration date plays a vital role in this strategy. No operations (payments, amendment, rollover, reversals) are allowed on migrated loans prior to the migration date. The LD outstanding principal amount as on the migration da
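The LFPS derivation can be sketched as follows (illustrative Python; the rule — per component, take the latest fully settled due date, then the earliest of those across components — is one reading consistent with Case 2 above, and the data layout is an assumption):

```python
from datetime import date

def lfps(schedules):
    """Last Fully Paid Schedule: for each component, find the latest due date
    whose schedule is fully settled; LFPS is the earliest such date across
    components (consistent with Case 2 above)."""
    last_paid = {}
    for component, due_date, due, settled in schedules:
        if settled >= due:  # schedule fully settled
            prev = last_paid.get(component)
            if prev is None or due_date > prev:
                last_paid[component] = due_date
    return min(last_paid.values())

# The Case 2 rows from the table above.
case2 = [
    ("INTEREST",  date(2012, 1, 17),  120, 120),
    ("INTEREST",  date(2012, 2, 15),  120, 120),
    ("INTEREST",  date(2012, 3, 15),  110, 110),
    ("PRINCIPAL", date(2012, 3, 30), 1000, 1000),
    ("INTEREST",  date(2012, 4, 15),  110, 110),
    ("INTEREST",  date(2012, 5, 16),  100, 50),
    ("INTEREST",  date(2012, 6, 15),  100, 0),
    ("PRINCIPAL", date(2012, 6, 30), 1000, 0),
]
print(lfps(case2))  # 2012-03-30, matching the LFPS stated in Case 2
```

Interest is last fully settled on April 15 and Principal on March 30, so the earlier of the two, March 30, is the LFPS.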
ORACLE

Upgrade Toolkit User Guide
Oracle FLEXCUBE Universal Banking
Release 12.0.2.0.0
Part No. E49740-01
September 2013

Upgrade Toolkit User Guide
September 2013

Oracle Financial Services Software Limited
Oracle Park
Off Western Express Highway
Goregaon (East)
Mumbai, Maharashtra 400 063
India

Worldwide Inquiries:
Phone: +91 22 6718 3000
Fax: +91 22 6718 3001
www.oracle.com/financialservices

Copyright © 2007, 2013, Oracle and/or its affiliates. All rights reserved.

Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

U.S. GOVERNMENT END USERS: Oracle programs, including any operating system, integrated software, any programs installed on the hardware, and/or documentation, delivered to U.S. Government end users are "commercial computer software" pursuant to the applicable Federal Acquisition Regulation and agency-specific supplemental regulations. As such, use, duplication, disclosure, modification, and adaptation of the programs, including any operating system, integrated software, any programs installed on the hardware, and/or documentation, shall be subject to license terms and license restrictions applicable to the programs. No other rights are granted to the U.S. Government.

This software or hardware is developed for general use in a variety of information management applications. It is not developed or intended for use in any inherently da
PORT_SOURCE_PDATA.PAR

I
IMPORT_EM_DATA.PAR
IMPORT_PM_DATA.PAR
IMPORT_SEQUENCE.PAR

L
LD_EXTRACTION_ROLLBACK.SQL
LM_EL_MIG_STUB.SQL

P
POSTEXTRACTION_CHECK_HTML.SQL
PRELIM_SRC_DATA_CHK.SQL

T
TABLEDIFF_SOURCE_DEST.SQL
TRUNCATE_PDATA.SQL
S Module
4.8.4 Upgrading Deposits Module
4.8.5 Dynamic Package Generation for IC Rule
4.8.6 Dynamic Package Generation for Products in CD MM
4.8.7 Upgrading PC Module
4.8.8 LC Module Tracers Generation
4.8.9 Upgrading CASA Module (Lower Case Alphabets Account Number)
4.9 Module Wise Verification Check

5 Cut-over Upgrade Activities
5.1 Introduction
5.2 Activities in Production Environment
5.3 Database Upgrade in Production Environment
5.4 Installation of Other Components

6 Conversion Script Generation Tool
6.1 Introduction
6.2 Generating and Executing Scripts
6.2.1 Setting up Parameters
6.2.2 Generating Dynamic Scripts and Spooling Files
6.2.3 Generating Dynamic Script for Specific Modules
6.2.4 Generating Dynamic Script fo
Activity No | Activity Details | Source/Destination | Dependency

Activity 1 (Schema: Source; Dependency: none):
- For illustration purposes, consider that the name of the source schema used by the customer is SOURCESCHEMA. This contains the production data of the bank and the complete set of DB objects, including tables, constraints, indexes, sequences, source packages, triggers, etc.
- Disable the running Oracle jobs, if any.
- Create a full schema dump using the expdp utility in the SOURCESCHEMA. Name the export dump file SITE_FULL_DUMP.DMP. The parameter file Export_Site_FULL_Dump.par can be used for this export. See Annexure on page 8-1.

Activity 2 (Schema: Common):
- Configure the TNS in the source and destination databases to create the DB link.

Activity 3 (Schema: Destination; Dependency: Activities 1 and 2):
- Run the schema difference utility (see Annexure on page 8-1). This utility lists out the schema differences for the tables and columns.
- Run Create_DB_Link.sql in the destination schema. It will prompt for the site schema name, password and database name. Upon providing the details, the MIG_DB database link will be created, connecting the source schema.
- In case creating a DB link to the production schema is disallowed, a staging area can be created and the DB link can be created to point to the same.
- Run T
TM | BEFORE INSERT OR UPDATE, FOR EACH ROW | GETM_LIAB_CUST
5 | ELTR_LDTB_CONTRACT_MASTER | AFTER INSERT OR UPDATE, FOR EACH ROW | CSTB_CONTRACT
6 | ELTR_PRODUCT | AFTER INSERT OR UPDATE OR DELETE, FOR EACH ROW | CSTM_PRODUCT
7 | ELTR_STTM_CUSTOMER | BEFORE INSERT OR UPDATE, FOR EACH ROW | STTM_CUSTOMER
8 | ELTR_STTM_CUST_AC | AFTER INSERT OR UPDATE, FOR EACH ROW | STTM_CUST_ACCOUNT

Migrating Data

You need to migrate the data from the LM to the ELCM module in the target version. Follow the steps given below:

1. Ensure that the triggers related to ELCM are enabled.
2. Use the package ELPKS_LM_REPLICATION to migrate LM data to the ELCM module tables. You can use the stub LM_EL_MIG_STUB.sql to call the package. See Annexure on page 8-1 for details on the location and usage of LM_EL_MIG_STUB.sql.
3. Use the function fn_process for migrating maintenance entities and utilizations. You may migrate the entities either one by one or all in one stretch. The parameters to the function are as follows:
   p_user_id IN VARCHAR2
   p_ref_no IN OUT VARCHAR2
   p_function_id IN VARCHAR2
   p_process_no IN NUMBER
   p_remarks IN VARCHAR2
   p_errs IN OUT VARCHAR2
   p_prms IN OUT VARCHAR2
4. In the above list, the parameter p_function_id is a mandatory parameter. You may leave the other parameters blank. Remarks, if passed, will be used for updating the log tables.
5.
UBE Universal Banking Solutions from a lower version to a higher version.

Chapter 4: Module Upgrade discusses the data migration methods specific to Oracle FLEXCUBE Universal Banking Solutions modules, which is part of the mock upgrade activity.

Chapter 5: Cut-over Upgrade Activities explains the activities that you need to carry out during cut-over.

Chapter 6: Conversion Script Generation Tool discusses the method to use the Dynamic Script Generation tool, which makes the target database compatible with the front-end application deployed.

Chapter 7: Data Reconciliation explains the use of the Upgrade Reconciliation Tool, which compares the data on the source and target versions after migration and after running a parallel EOD.

Chapter 8: Annexure provides the details of utility scripts and conversion scripts used in the upgrade process.

Chapter 9: Glossary of Scripts lists out the scripts and links to the places in the document where they are used.

1.6 Related Information Sources

For more information, refer to the following documents:
- Oracle FLEXCUBE Installation Manual
- Oracle FLEXCUBE User Manuals

2 Upgrade and Conversion Approach

2.1 Introduction

This chapter gives an outline of the upgrade and conversion approaches. This also provides a summary of all the activities involved in the entire upgrade process. A
• Run TableDiff_Source_Dest.sql to identify the table differences between the SOURCESCHEMA and the DESTSCHEMA. Copy the results to an Excel file.
• Run Existing_Table_Column_Diff.sql to identify the table-column differences between the SOURCESCHEMA and the DESTSCHEMA. Copy the spooled result to the Excel file.
• This Excel file will act as a reference point for the schema differences between the source DB and the target DB.
• This file has the column-level information and details such as whether null values are allowed or not. For all the NOT NULL columns that are newly introduced in the target version, you need to handle the data import with special consideration, because the import for these tables will fail if records are present in the SOURCESCHEMA for the same.
• Based on the column differences, generate the scripts to disable the constraints for the new NOT NULL columns in the DESTSCHEMA. Along with this, generate the scripts to disable all the triggers.
• Use the stub Constraint_Trigger_Disable_Script.sql (see 'Annexure' on page 8-1) to generate the following scripts:
  – ALTER_TRIGGER_DISABLE.sql: contains the scripts to disable all the triggers.
  – ALTER_CONSTRAINTS_DISABLE.sql: contains the script to disable only the NOT NULL, unique and check constraints for columns without a default value.
• Execute the above two scripts before importing the table data from the site dump into the DESTSCHEMA.
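The disable scripts that the stub produces can be approximated with dictionary queries like the sketch below. This is only an illustration of the technique; prefer the shipped Constraint_Trigger_Disable_Script.sql, which also filters out columns with default values.

```sql
-- Hedged sketch: generate ALTER ... DISABLE statements from the
-- Oracle data dictionary of the destination schema.
SET PAGESIZE 0 LINESIZE 200 FEEDBACK OFF

SPOOL ALTER_TRIGGER_DISABLE.sql
SELECT 'ALTER TRIGGER ' || trigger_name || ' DISABLE;'
  FROM user_triggers;
SPOOL OFF

SPOOL ALTER_CONSTRAINTS_DISABLE.sql
SELECT 'ALTER TABLE ' || table_name ||
       ' DISABLE CONSTRAINT ' || constraint_name || ';'
  FROM user_constraints
 WHERE constraint_type IN ('C', 'U');  -- check/NOT NULL and unique constraints
SPOOL OFF
```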
…available in CSTB_PARAM (PARAM_NAME = 'RECON_REPORT_PATH'). The recon extraction modules are available in CSTB_PARAM (PARAM_NAME = 'RECON_MODULE_LIST').

2. In the source schema, complete the following activities:
   Run 1b_tab_recon_script_gen_src.sql
   Run 2a_validate_recon_scripts_src.sql
   Run 2b_update_inv_scripts_src.sql

3. In the target schema, complete the following activities:
   Run 1b_tab_recon_script_gen_tar.sql
   Run 2a_validate_recon_scripts_tar.sql
   Run 2b_update_inv_scripts_tar.sql

At this stage, you need to consider two instances of parallel run: at the MarkEOTI stage and at the PostBOD stage.

Case A: Parallel Run at MarkEOTI Stage

4. In the source schema, run 4a_recon_parl_preod_src.sql with the parameter BRANCH_CODE as ALL for all-branch extraction. For a specific branch, the BRANCH_CODE parameter needs to be the specific branch itself.

5. In the target schema, complete the following activities:
   Run 4a_recon_parl_preod_tar.sql with the parameter BRANCH_CODE as ALL for all-branch extraction. For a specific branch, the BRANCH_CODE parameter needs to be the specific branch itself.
   Run recon_reportgen_parl_preod.sql with the parameter BRANCH_CODE as ALL for all-branch extraction. For a specific branch, the BRANCH_CODE parameter needs to be the specific branch itself.

Case B: Parallel Run a…
Run 2a_validate_recon_scripts_src.sql
   Run 2b_update_inv_scripts_src.sql
   Run 1d_populate_mapping_tables_src.sql

3. In the target schema, complete the following activities:
   Update the target schema name, password and SID name in the script 1a_db_link_tar.sql and run the script.
   Run 1b_tab_recon_script_gen_tar.sql
   Run 2a_validate_recon_scripts_tar.sql
   Run 2b_update_inv_scripts_tar.sql
   Run 1d_populate_mapping_tables_tar.sql

4. Check the following parameters in the CVTB_PARAM table:

Parameter                Description
RECON_MODULE_LIST        Module list in tilde-separated values. This list will be used if the module code has been passed as ALL during data extraction.
RECON_REPORT_PATH        Path where the recon reports need to be generated.
TARGET_LM_INSTALLED      LM module installed in the target; it can be LM or EL.
SOURCE_LM_INSTALLED      LM module installed in the source; it can be LM or EL.
TARGET_LOAN_INSTALLED    Loans module installed in the target; it can be LD or CL.
SOURCE_LOAN_INSTALLED    Loans module installed in the source; it can be LD or CL.
RECON_ENVIRONMENT        The environment name. It will be appended as part of the report file name.

7.4 Releasing Additional Units (Delta Release)

If DDL files are released, they need to be applied in both the source schema and the target schema. If INC files are released, you nee
You need to separately handle the customization changes done at the site. Involve the bank authorities in the discussions and decide whether data migration is required for the tables added as part of customization. Identify such requirements and document them as an addendum to this guide. For the existing tables, if data conversion scripts are not provided for newly added columns or existing columns, you need to analyze and handle them manually.

4.3 Upgrade of Revamped Modules

Data migration scripts are provided for the following operations:

• Migration from the Loans and Deposits (LD) module to the Consumer Lending (CL) module. The CL module was introduced in place of LD in Kernel version FCUBS 7.1.0.0.0.0.2. The module architecture changed during this.
• Migration from the LM module to the ELCM module. The ELCM module was introduced in place of LM in Kernel version FCUBS 11.0.0. The module architecture changed during this.
• Migration from the WebBranch module to Retail Teller. The Retail Teller module was introduced in place of Web Branch in Kernel version FCUBS 10.3.0.

4.4 Migrating Data from Loans and Deposits to Consumer Lending Module

The loans portion of the LD module has been replaced with the CL module in versions later than FCUBS 7.1.0.0.0.0.2. You can use the migration scripts to upload all active loans from the LD module to the CL module. This involves the following steps:
The Account Number can be used for future reference. The status of the LD contracts migrated to the CL module will be updated as 'M' to indicate that they are migrated contracts, and changes will be done in the batches to ignore all contracts with status 'M'. All the loans migrated to the CL module will start with version 1 and event DSBR. Certain contracts will not be migrated.

The value date of the LD contract will be the original start date of the CL account. In case the original start date is available in the LD contract, then it will be taken as the original start date of the CL account. The value date of the migrated CL accounts will be determined from the Last Fully Paid Schedule's Due Date (LFPSD), irrespective of the schedule type, payment method, status and interest rate type (fixed/floating). This LFPSD will be the last fully paid schedule for the Principal/Interest components.

Example

Case 1: In the below example, the Principal and Interest components are fully paid on April 15, 2012, and Interest is fully paid on May 16, 2012, whereas the Principal component for May 16, 2012 is outstanding. So the LFPS would be April 15, 2012.

Contract Reference Number   Component   Due Date    Amount Due   Amount Settled
406LILP10040000Z            INTEREST    1/17/2012   120          120
406LILP10040000Z            PRINCIPAL   1/17/2012   1000         1000
406LILP10040000Z            INTEREST    2/15/2012   120          120
406LILP10040000Z            PRINCIPAL   2/15/2012   1000         1000
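The LFPSD rule described above (the least of the last fully paid schedules across components) can be sketched as a query. The table and column names below are illustrative assumptions; the actual LD schedule tables are not named in this guide.

```sql
-- Hedged sketch: derive the LFPSD for one contract as the earliest of
-- each component's latest fully settled schedule.
-- ld_schedules, amount_due and amount_settled are assumed names.
SELECT MIN(last_paid) AS lfpsd
  FROM (SELECT component,
               MAX(due_date) AS last_paid
          FROM ld_schedules
         WHERE contract_ref_no = '406LILP10040000Z'
           AND amount_settled >= amount_due   -- fully paid schedules only
         GROUP BY component);
```

In the example above, this would return April 15, 2012, because the May 16, 2012 Principal schedule is still outstanding.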
…load the static data using the installer. Refer to the installation manual of the required version for details on loading static data.

Exit the installer immediately after loading the static data. The basic setup step should not be done through the installer.

At this point, all data structures will be in place and the static data tables will have the data populated as of the target version. But all schema objects, like the source packages, triggers, procedures, functions, constraints, indexes, views, sequences, etc., will be available as of the base Kernel version. If there are any customization changes that need to be retrofitted in the target version schema, you may compile them now. You can also make the related static data changes. While doing the TEMPSOFT changes, you need to take care of the following: if the source version had an additional column with data, you need to manually move the same, as the import of data from production has already been done. Apply the additional static data onto the upgraded schema.

You need to create a dummy schema in the same Oracle instance as that of the target schema. The dummy schema will have the same name as that of the source schema from which the dump was exported. Provide the necessary grants for import/export. This is necessary to connect and import data later on from the dump.

3.3.3 Upgrading Database

The activities involved in the database upgrade are given in the table below.
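Creating the dummy schema and its grants can be sketched as below. The schema name, password and directory object are placeholders; the exact privileges needed depend on the site's Data Pump setup.

```sql
-- Hedged sketch: dummy schema named after the source schema, with
-- grants sufficient to run impdp against the exported dump.
CREATE USER srcschema IDENTIFIED BY srcschema
  DEFAULT TABLESPACE users
  TEMPORARY TABLESPACE temp;

GRANT CONNECT, RESOURCE TO srcschema;

-- Data Pump import privileges and access to the dump directory
-- (directory name dump_dir is an assumption).
GRANT DATAPUMP_IMP_FULL_DATABASE TO srcschema;
GRANT READ, WRITE ON DIRECTORY dump_dir TO srcschema;
```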
…d to perform the following activities:

1. Apply the CVTB table related INCs in both the source schema and the target schema. Apply the CVTM_RECON_DYN_SCRIPTS table related INCs only in the target schema. Apply the CVTM_RECON_REPORTS table related INCs only in the target schema.

In the source schema, complete the following activities:
   Update the source schema name, password and SID name in the script 1a_db_link_src.sql and run the script.
   Run 1b_tab_recon_script_gen_src.sql
   Run 2a_validate_recon_scripts_src.sql
   Run 2b_update_inv_scripts_src.sql

In the target schema, complete the following activities:
   Update the target schema name, password and SID name in the script 1a_db_link_tar.sql and run the script.
   Run 1b_tab_recon_script_gen_tar.sql
   Run 2a_validate_recon_scripts_tar.sql
   Run 2b_update_inv_scripts_tar.sql

In case SPC, SQL, VW units (cvpks_recon_extract.sql, cvpks_recon_extract.spc) are released, you need to execute them in both the source schema and the target schema.

7.5 Changing Source and Target Schema in Existing System

When the source schema and target schema in the existing system are changed, you need to follow the steps given below:

1. In the source schema, complete the following activities:
   Update the source schema name, password and SID name in the script 1a_db_link_src.sql and run the script.
   Run 1b_tab_recon_script_gen_src.sql
   Run 2a_validate_recon_scripts_src.sql
   Run 2b_update_inv_scripts_src.sql
   Run 1d_populate_mapping_tables_src.sql

2. In the target schema, complete the following activities:
   Update the target schema name, password and SID name in the script 1a_db_link_tar.sql and run the script.
   Run 1b_tab_recon_script_gen_tar.sql
   Run 2a_validate_recon_scripts_tar.sql
   Run 2b_update_inv_scripts_tar.sql
   Run 1d_populate_mapping_tables_tar.sql

3. Check the following parameters in the CVTB_PARAM table:

Parameter                Remarks
RECON_MODULE_LIST        Module list in tilde-separated values. This list will be used if the module code has been passed as ALL during data extraction.
RECON_REPORT_PATH        Path where the recon reports need to be generated.
TARGET_LM_INSTALLED      LM module installed in the target; it can be LM or EL.
SOURCE_LM_INSTALLED      LM module installed in the source; it can be LM or EL.
TARGET_LOAN_INSTALLED    Loans module installed in the target; it can be LD or CL.
SOURCE_LOAN_INSTALLED    Loans module installed in the source; it can be LD or CL.
RECON_ENVIRONMENT        The environment name. It will be appended as part of the report file name.

7.5.1 Generating Reports

You can generate various reports related to the reconciliation tool. This section discusses the methods to generate the following reports:

• Migration Reconciliation Report
• Adhoc Reconciliation Report
• Parallel Run Reconciliation Report

Generating Mig…
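Step 3 above (checking the reconciliation parameters) can be done with a query like the following sketch. The CVTB_PARAM column names PARAM_NAME and PARAM_VAL are assumptions based on the table's described contents.

```sql
-- Hedged sketch: review the reconciliation tool parameters.
SELECT param_name, param_val
  FROM cvtb_param
 WHERE param_name IN ('RECON_MODULE_LIST',
                      'RECON_REPORT_PATH',
                      'TARGET_LM_INSTALLED',
                      'SOURCE_LM_INSTALLED',
                      'TARGET_LOAN_INSTALLED',
                      'SOURCE_LOAN_INSTALLED',
                      'RECON_ENVIRONMENT')
 ORDER BY param_name;
```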
…sequences from the DESTSCHEMA. After dropping the sequences, import the sequences from SITE_FULL_DUMP.DMP using the import par file Import_Sequence.par.

Activity 5 (Destination; depends on Activity 3):
• Ensure that all the triggers and the selected constraints are disabled, as mentioned in Activity 3.
• Generate and apply the module-wise conversion scripts, EXCEPT for the LD to CL and LM to ELCM migration. For details on conversion script generation and application, see 'Conversion Script Generation Tool' on page 6-1.
• Enable all the triggers and constraints once the module-wise conversion scripts are generated and applied. You need to manually handle the errors encountered while enabling the triggers and constraints.
• Once the triggers and constraints are enabled, migrate the LD module to the CL module and the LM module to the EL module.
• Note that conversion from the LD module to the CL module and from the LM module to the EL module should be handled separately, as they are revamped modules. You may carry out these two migration activities after enabling the triggers and constraints. See 'Upgrade of Revamped Modules' on page 4-1.

Activity 6 (Destination; depends on Activity 5):
• The target database is now ready. Carry out the post-upgrade verification activities. You need to preserve the scripts applied while carrying out these activities, to use them again if required.
…affiliates will not be responsible for any loss, costs or damages incurred due to your access to or use of third-party content, products or services.

Contents

1. Preface ........................................................ 1-1
   1.1 Introduction ............................................... 1-1
   1.2 Intended Audience .......................................... 1-1
   1.3 Documentation Accessibility ................................ 1-1
   1.4 Scope ...................................................... 1-1
   1.5 Organization ............................................... 1-1
   1.6 Related Information Sources ................................ 1-2
2. Upgrade and Conversion Approach ................................ 2-1
   2.1 Introduction ............................................... 2-1
   2.2 Approach – Data Import to Target Schema .................... 2-1
       2.2.1 Advantages ........................................... 2-1
       2.2.2 Disadvantages ........................................ 2-1
   2.3 Upgrade Process ............................................ 2-1
3. Mock Upgrade ................................................... 3-1
   3.1 Introduction ............................................... 3-1
   3.2 Prerequisites .............................................. 3-1
   3.3 Mock Upgrade Activity ...................................... 3-2
       3.3.1 Applying Temp Soft Changes ........................... 3-2
       3.3.2 Setting up Target Schema ............................. 3-2
       3.3.3 Upgrading Database ................................... 3-3
       3.3.4 Deploying Front End .................................. 3-6
       3.3.5 Impact on Existing External Systems
…the document.

Script Name                           Location                                         Remarks
Create_DB_Link.sql                    SOFTTOOLS/Upgradetoolkit/Soft/ImpExp/Scripts     Script to create the database link. Before creating the DB link, configure the TNS connection between the source and destination databases.
TableDiff_Source_Dest.sql             SOFTTOOLS/Upgradetoolkit/Soft/ImpExp/Scripts     Script to list out the differences between the source schema and the target schema.
Existing_Table_Column_Diff.sql        SOFTTOOLS/Upgradetoolkit/Soft/ImpExp/Scripts     Script to list out the differences in tables which exist in both the schemas but whose columns differ.
Constraint_Trigger_Disable_Script.sql SOFTTOOLS/Upgradetoolkit/Soft/ImpExp/Scripts     Script to disable constraints and triggers.
Constraint_Trigger_Enable_Script.sql  SOFTTOOLS/Upgradetoolkit/Soft/ImpExp/Scripts     Script to enable constraints and triggers.
Drop_Sequence_Script.sql              SOFTTOOLS/Upgradetoolkit/Soft/ImpExp/Scripts     Script to drop the sequences.
Import_P_M_data.par                   SOFTTOOLS/Upgradetoolkit/Soft/ImpExp/Scripts     Param file to import static data.
Import_EM_data.par                    SOFTTOOLS/Upgradetoolkit/Soft/ImpExp/Scripts     Param file to import non-static data.
Import_Sequence.par                   SOFTTOOLS/Upgradetoolkit/Soft/ImpExp/Scripts     Par file to import sequences.
Truncate_PData.sql                    SOFTTOOLS/Upgradetoolkit/Soft/Im…                Script to truncate p-Data.
…the module-wise spool files and control files separately.

7. Data Reconciliation

7.1 Introduction

Once the data has been migrated from the source version to the target version, you need to reconcile the data. You can use the Upgrade Reconciliation Tool to compare the data on the source and target versions after migration and after running a parallel EOD. After data reconciliation, you can generate the reconciliation reports.

This chapter discusses the method of using the upgrade data reconciliation tool. The following points are discussed in connection with the reconciliation tool in this chapter:

• Setting up a new environment
• Releasing additional units (delta release)
• Changing the source and target schema in the existing system
• Extraction and report generation
  – Generating migration reconciliation report
  – Generating adhoc reconciliation report
  – Generating parallel run reconciliation reports
• Moving extraction data into the history table

7.2 Setting Up New Environment

When the reconciliation tool is set up in a fresh environment, you need to follow the steps given below:

1. Run the recon tool sources (DLL, INC, SPC, SQL, VW) in the source schema and the target schema.
2. In the source schema, complete the following activities:
   Update the source schema name, password and SID name in the script 1a_db_link_src.sql and run the script.
   Run 1b_tab_recon_script_gen_src.sql
   Run 2a_validate_recon_scripts_src.sql
…horized transactions in any module.
• Switch off the interfaces (SWITCH, SWIFT, etc.).
• Bring down the application server with due notification.
• Bring down the Gateway server.

Database Upgrade in Production Environment

In the Gold Copy, truncate the transaction data tables and re-import the same from the latest production data dump. You can use the script files Truncate_PData.sql and Export_Source_PData.par (see 'Annexure' on page 8-1) for this.

Ideally, the transaction data that has gone into the production database from the time of starting the mock run till now would be the data-level change in the source. You need to handle this. During the mock run and verification activities, the transaction data (p-Data) might have undergone changes, so you must truncate the p-Data tables in the Gold Copy using the scripts.

You need to repeat the database upgrade activity performed during the mock upgrade. See 'Upgrading Database' on page 3-3. You need not do any static data comparison at this point; the implementation team takes care not to do any static data changes in the production environment.

Selectively apply the post-upgrade checkpoints in the upgraded production area. See 'Verifying Data after Database Upgrade' on page 3-7. In order to make the database consistent and up to date, you can reapply the scripts discussed in the chapter 'Mock Upgrade Activity'. See 'Verifying Data after Database Upgrade' on page 3-7.
Migrate LD products to the CL module
• Migrate LD contracts to the CL module
• Understand the migration strategy
• Migrate commitments

You must migrate commitments before loans. For ease of reference, this document first discusses the loans migration strategy and then the commitment migration strategy.

4.4.1 Migrating Products from LD to CL

You cannot migrate the product data from the LD module to the CL module. You must create the products in CL manually.

4.4.2 Migrating Contracts from LD to CL

The method of migrating LD contracts to the CL module is explained below.

Prerequisites for Source

The below set of instructions is applicable for the source environment. You need to first identify the LD contracts which are not eligible for migration. You can identify such contracts using the script Prelim_src_data_chk.sql. This script is available in the location SOFTTOOLS/Upgradetoolkit/Soft/Migration/LD_CL/SCRIPTS/SOURCE.

Execute the script Prelim_src_data_chk.sql to populate the LD contracts which are not eligible for migration. The following objects need to be compiled in the source environment for successful execution of this script:

• DDL: CSTB_LD_CONTRACT_CHECK
• FNC: FN_LD_CONTRACT_CHECK.FNC

Once you execute the script Prelim_src_data_chk.sql in the source environment, it will print the following output:

querying from CSTB_LD_CONTRACT_CHECK
COUNT(1) query should return ZERO rows

If the count…
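The zero-row check that the script prints corresponds to a simple count against the check table, sketched below. The filter column is an assumption; the shipped script defines the actual eligibility conditions.

```sql
-- Hedged sketch: verify no LD contracts were flagged as ineligible.
-- A non-zero count means some contracts must be resolved before migration.
SELECT COUNT(1)
  FROM cstb_ld_contract_check;
```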
…migrated as is from the LD contract.

4.4.2.2 Accounting Strategy

During the migration process, accounting entries will not be passed in the CL module. All the relevant tables, especially GL balances, will be populated from the LD module into the CL module. The GL mapping between the CL tables and the LD tables should be the same; otherwise, you may notice inconsistencies in the GL balances.

4.4.2.3 Interest and Accrual

The accrued interest is populated in one table, which will hold the CL replica of the LD data from the last fully paid schedule due date of LD till the migration date, including the current running schedule. This is applicable to outstanding interest, penalty, unpaid principal amount and up-front fees collected.

During the migration process, accounting entries will not be passed in the CL module. All the relevant tables will be populated as they are from the LD module, starting from the last fully paid schedule till the migration date, including the current running schedule. Accruals from the migration date onwards will continue in the CL module.

Example

Consider the following details:

Loan Contract: 550AMN208086xxxx
Loan Book Date: January 25, 2012
LFPSD: May 26, 2012
Migration Date: July 20, 2012

Contract Reference Number   Component    Due Date    Amount Due   Amount Settled   Accrued Amount
550AMN208086xxxx            AMN2_INTER   1/26/2012   56,020       56,020           56,020
550AMN208086xxxx            AMN2_INTER   2/28/2012   52,…
…would have been posted to the accounts from the Switches. You need to ensure that there is no pending transaction in the Switches. Reversal of a transaction entered in the source version is not supported in the new system after the upgrade. The maintenance table of ATM/POS terminals may have huge data; the bank may want to migrate such data to the new system.

4.8 Upgrading Existing Modules

You need to upgrade the modules that are not revamped. The conversion scripts for new tables, columns and functional enhancements are available in the Conversion Script Repository. See 'Conversion Script Generation Tool' on page 6-1. Scripts are available for the following modules for upgrade from a lower version to a higher version:

• ST
• CIF
• CASA
• IC
• IS
• MS
• SMS
• BC
• FA
• FT
• LC
• MM
• SI

4.8.1 Generic Conversion Methods

The generic conversion methods are discussed under the following headings.

Note: If the source data has account numbers which are in mixed-case alphabets or contain characters that are considered invalid in the target version, then you need to change the respective RAD XML property. You must manually uncheck the uppercase property in RAD.

Node Update

The Node field is updated with the new database name in different tables across modules. The script for node update is available in the conversion script repository.

Basic Parameters Setup
Gold Copy – DB Schema Setup

Once the above activities are completed, you can use the DESTSCHEMA as the Gold Copy to set up the database during the production environment upgrade.

Gold Copy – Front End Setup and Interface

Use the latest available executables to set up the various components for the production upgrade. All interface-related changes available in various files need to be deployed.

4. Module Upgrade

4.1 Introduction

This chapter discusses the data migration methods specific to Oracle FLEXCUBE Universal Banking Solutions modules. This step is part of the mock upgrade activity. The following points are discussed in detail in this chapter:

• Scope
• Upgrade of revamped modules
• Upgrade of existing modules
• Module-wise verification checkpoints

4.2 Scope

The scope of the Oracle FCUBS module upgrade is discussed below.

Upgrade of Revamped Modules

The Oracle FCUBS modules such as LD and LM in the older versions are revamped and upgraded to the CL and ELCM modules respectively. This migration requires the bank's consent, and such cases must be handled separately. As the target version application may refer to a new set of tables, you need to move the data from the old set of tables to the new set.

Upgrade of Existing Modules

In the target version, new tables and columns may have been added as part of functional enhancements. You need to use the module-wise conversion scripts to migrate the data from such modules.
…in resolving the rule set at the product level for charges calculation. The function Fn_Create_Body of the package PCPKS_CHG_CALC is called for all PC products.

Bank Clearing Network Maintenance

In lower versions of Oracle FLEXCUBE, the bank clearing network maintenance was not mandatory. However, it is mandatory in the higher versions. Once the migration process is complete, and before you start the operations, maintenances need to be done for the required Bank Code and Network ID combinations.

LC Module – Tracers Generation

If the batch LCTRACER is maintained as part of the EOD/BOD batch, make sure that the advices are properly maintained for the TRGN event of the product. Otherwise, the LC tracer will fail due to non-maintenance of advices for the TRGN event.

Upgrading CASA Module

Lower Case Alphabets in Account Number

In the higher versions of Oracle FCUBS, only upper-case alphabets are allowed in account numbers. However, in the lower versions which need to be upgraded, the account number fields in the existing database may have alphabets in both upper and lower cases. You need to handle such cases separately. To handle such situations after the upgrade, you can correct the front-end UI file to accept lower-case input in the below fields:

• Account Number
…and tables affected by this issue. It is not recommended to create tables using the LONG data type; instead, CLOB is to be used.

3.3.4 Deploying Front End Application

For deploying the front-end application, follow the steps below:

1. Refer to the installation manual of the required version of the application.
2. Apply the temp soft changes, if any.
3. Ensure that the deployed EAR points to the upgraded database.

3.3.5 Impact on Existing External System Interfaces

If the customer has any external interfaces maintained in the source application, you need to follow the steps below:

1. Communicate any format-level changes (GI files, Gateway XSDs) in existing interfaces to the external systems.
2. Communicate the changes in queues, configuration file locations, etc. to the external systems.
3. Communicate the changes in the tag names of the XSD files which are shared with other systems to the respective external system owners.

3.3.6 Verifying Data after Database Upgrade

Once the database is upgraded, you need to do the following verifications:

• System-wide data verification of reports and other checkpoints
• Interface testing to check the connectivity
• Module-wise data verification of reports and other checkpoints
• Converted deals testing
• New deals testing
• New product maintenance testing
• Signoff

These verifications are explained in detail under the following headings.

3.3.6.1 System-wide Data Verification
…nding is 58,188.04 on the migration date.

CL Account: Disbursement Amount 58,188.04; Amount Financed 58,188.04. The original disbursement amount (87,000) has to be stored as a UDF at the CL account level.

To summarize, the least of the last fully paid schedules among the components pertaining to a loan will be considered as the value date of the account. During migration from the LD to the CL module, the fully paid schedules are not considered; only the partially paid and unpaid schedules are migrated to the CL module. The history of the fully paid schedules has to be viewed from the LD screens.

All overdue schedules (partial or full overdue) for all components are moved to CL with the current outstanding amount as it is, without any further calculations. This data is stored in a table and contains the data as is from the LD module. This table contains data for all schedules between the LFPSD and the migration date; the current running schedule will also be present in this table. This is applicable to outstanding interest, penalty, unpaid principal amount and up-front fees collected.

All the future schedules from the migration date onwards are calculated automatically in the CL module, based on the schedule definition at the contract level.

When the minimum tenor of a CL product is higher than the tenor of the active or pending loans of the corresponding LD contracts, the product definition of the CL product has to be modified manually for the migration purpose. You can later update it to the actual tenor…
…ngerous applications, including applications that may create a risk of personal injury. If you use this software or hardware in dangerous applications, then you shall be responsible to take all appropriate failsafe, backup, redundancy and other measures to ensure its safe use. Oracle Corporation and its affiliates disclaim any liability for any damages caused by use of this software or hardware in dangerous applications.

This software and related documentation are provided under a license agreement containing restrictions on use and disclosure and are protected by intellectual property laws. Except as expressly permitted in your license agreement or allowed by law, you may not use, copy, reproduce, translate, broadcast, modify, license, transmit, distribute, exhibit, perform, publish or display any part, in any form, or by any means. Reverse engineering, disassembly or decompilation of this software, unless required by law for interoperability, is prohibited.

The information contained herein is subject to change without notice and is not warranted to be error-free. If you find any errors, please report them to us in writing.

This software or hardware and documentation may provide access to or information on content, products and services from third parties. Oracle Corporation and its affiliates are not responsible for and expressly disclaim all warranties of any kind with respect to third-party content, products and services. Oracle Corporation and its af…
…to use the command EXECUTE. If the prompt is SQL>, then the screen will have the following text:

SQL> EXECUTE cvpks_dynamic_script_gen.pr_generate_scripts('FLEXCUBE', 'M', 'SI');

You can execute the same statement as a PL/SQL block within BEGIN–END as follows:

BEGIN
  cvpks_dynamic_script_gen.pr_generate_scripts('FLEXCUBE', 'M', 'SI');
EXCEPTION
  WHEN OTHERS THEN
    dbms_output.put_line('Error: ' || SQLERRM);
END;
/

Example 2

The SQL call to generate scripts for the module CA is as follows:

cvpks_dynamic_script_gen.pr_generate_scripts('FLEXCUBE', 'M', 'CA');

For generating spool files, see 'Spooling Module-wise Spool Files and Control File for a Run Number' on page 6-3.

Generating Dynamic Script for Specific script_identifier

You can generate scripts for specific script_identifiers. This is done by executing the stub CALL_CVPKS_SPECIFIC_GENERATION.SQL.

The parameters for this file are FLEXCUBE, S and the list of script identifiers. The parameter S denotes that the generation is script_identifier specific. You need to provide the list of script identifiers, separated by commas, as the third argument.

Example 1

The SQL call to generate scripts for the script identifiers CA_007, LD_001 and SI_009 is as follows:

cvpks_dynamic_script_gen.pr_generate_scripts('FLEXCUBE', 'S', 'CA_007,LD_001,SI_009');

Example 2

The SQL call to generate scripts for MS…
You may need to enable specific triggers during the import as a special case. Certain ELCM triggers need to be enabled during the data import process. For details on enabling ELCM-related triggers, see 'Enabling Triggers' on page 4-13.

Activity 4 (Destination; depends on Activities 1, 2 and 3):
• Note that we have already created a dummy schema with the same name as the source schema, to facilitate the impdp command used below.
• Import the table data from the site dump using the par files given below. Data Pump import command:

  impdp <source_schema_name>/<pwd>@<target_instance> PARFILE=<parameter file name with path>

• The parameter file Import_P_M_data.par can be used to import P-Data, M-Data and P/M-Data into the DESTSCHEMA. See 'Annexure' on page 8-1.
• The parameter file Import_EM_data.par can be used to import the E/M-Data into the DESTSCHEMA. See 'Annexure' on page 8-1.
• Refer to the import log to ensure that all the table data is imported without any error. If there is any failure in the import, you need to analyse and handle it manually.
• While comparing the SOURCESCHEMA and the DESTSCHEMA, the stub Drop_Sequence_Script.sql generates the drop script for the common sequences. The drop script file name will be DROP_SEQUENCES.sql. Execute this script to drop the common sequenc…
…account upload happens, the system processes the status change by suppressing the CL status. This action will move the status depending on the status change rules maintained for the CL products after the CL account upload. The status change rule should be maintained with the same conditions as in the LD module, to ensure that the account status will change from NORM to the new status. The status of the LD contract is matched against the changed status of the CL account to ensure that the status has not changed after migration.

Note: During migration, the status change of CL accounts is strictly based on the rules defined at the product level. Statuses that are supposed to be manual must be maintained as 'Manual' at the CL product level in order to avoid incorrect status derivation.

Manual Status Change Rules

There may be loan contracts in the LD module whose status has been changed manually. Such contracts will be migrated to the CL module with the status NORM. Once the migration is completed, you need to manually change the status of these accounts to their respective statuses in the LD module. You also need to manually reverse the accounting entries passed during the manual status change.

Limits Utilization

During the CL account migration process, limit utilization will be disabled. After migration, the ELCM tables will be updated with the new CL account number in place of the loan's reference number. After the migration process, the component…
This document guides you through the standard strategy for the upgrade activity. In this document, you can find the necessary information required to carry out the upgrade activity from any lower version of Oracle FLEXCUBE Universal Banking Solutions to the latest version.

Intended Audience

This document is intended for the following audience:
- Implementation team
- Partners

Documentation Accessibility

For information about Oracle's commitment to accessibility, visit the Oracle Accessibility Program website at http://www.oracle.com/pls/topic/lookup?ctx=acc&id=docacc

Scope

This document covers the upgrade strategy and methodology for migration from a lower version of Oracle FLEXCUBE Universal Banking Solutions (6.2.1.2) to the latest version (12.0.1).

Note: Upgrade from versions FCC 6.3, FCC 6.4 and FCC 6.5 is not in the scope, as the target versions may not have all the features of the source version.

Organization

This manual is organized into the following chapters:

Chapter 1: Preface - gives information on the intended audience. It also lists the various chapters covered in this User Manual.
Chapter 2: Upgrade and Conversion Approach - gives an outline of the upgrade and conversion approaches. This also provides a summary of all the activities involved in the entire upgrade process.
Chapter 3: Mock Upgrade - discusses the prerequisites and guides you through the process of mock upgrade of Oracle FLEXCUBE…
As per the upgrade strategy, E Data tables will not get imported into the target schema. The basic application-level parameter table CSTB_PARAM is of type E Data, so it cannot be imported into the target version. You can set up the basic parameters through the following methods:

- Directly through the Oracle FLEXCUBE Universal Banking Installer interface: at the time of installation, you can set up the parameters directly in the user interface screen popped up by the Installer.
- Load static data: you can use the 'Load Static Data' option provided by the Installer.
- Manual execution: you can use the INC script available for the table CSTB_PARAM in the VERCON area.

The implementation team should verify the parameter values wherever parameters are set up.

4.8.2 Upgrading Core Module

Conversion scripts are provided for various tables related to the GL, ST, CIF, CASA, IC, IS and MS modules. See Conversion Script Generation Tool on page 6-1. Import the source data into the target schema and apply the conversion scripts referred above. Once this step is completed, you need to do some maintenances for the imported data to work in the target environment.

In the case of the CIF screen, note that the data in the maintenance tables of the LOV fields (option lists, like Prefix, Customer Category, etc.) may have mixed-case text. But in the new setup of Oracle FCUBS 12.0 and higher, such data…
- Export_Source_PData.par (SOFTTOOLS/Upgradetoolkit/Soft/ImpExp_Scripts): Param file to export the required p_data tables
- ELCM_Triggers_Enable.sql (SOFTTOOLS/Upgradetoolkit/Soft/ImpExp_Scripts): Script to enable ELCM related tables
- ELCM_TriggersEnable.sql (SOFTTOOLS/Upgradetoolkit/Soft/ImpExp_Scripts): Script to enable ELCM related triggers
- ELCM_TRUNCATE.sql (SOFTTOOLS/Upgradetoolkit/Soft/Migration_LM_EL_Scripts): Script to truncate the ELCM database
- LM_EL_STUB.sql (SOFTTOOLS/Upgradetoolkit/Soft/Migration_LM_EL_Scripts): Script to call the package ELPKS_LM_REPLICATION, which migrates LM data to ELCM

9 Glossary of Scripts

Numerics
1A_DB_LINK_SRC.SQL .......... 7-1, 7-2
1A_DB_LINK_TAR.SQL .......... 7-1, 7-3
1B_TAB_RECON_SCRIPT_GEN_SRC.SQL .......... 7-1, 7-2, 7-4, 7-5
1B_TAB_RECON_SCRIPT_GEN_TAR.SQL .......... 7-1, 7-2, 7-3, 7-4, 7-5
1C_MOVE_TO_HISTORY_SRC.SQL .......... 7-6
1C_MOVE_TO_HISTORY_TAR.SQL .......... 7-6
1D_POPULATE_MAPPING_TABLES_SCR.SQL .......... 7-1, 7-3
1D_POPULATE_MAPPING_TABLES_TAR.SQL .......... 7-1, 7-3
1_PRE_MIGRATION_UPD.SQL .......... 4-3
2A_VALIDATE_RECON_SCRIPTS_SRC.SQL …
…with parameter BRANCH_CODE as head office branch. In normal cases it is CHO. Then run 5_recon_reportgen_migrt.sql with parameter BRANCH_CODE as head office branch. In normal cases it is CHO.

Generating Adhoc Reconciliation Report

You can generate the adhoc reconciliation report for the individual entities that you need to verify. For generating this report, you need to follow the steps given below:

1. Before you start the report generation, check the following:
   - The head office branch for the source schema and the target schema are the same.
   - The Today column in the sttm_dates table should be the same for all the branches in the source and target systems.
   - The report generation path is available in CSTB_PARAM (PARAM_NAME: RECON_REPORT_PATH).
   - The recon extraction modules are available in the PARAM table (PARAM_NAME: RECON_MODULE_LIST).
   Get the list of entities which need to be part of the adhoc report generation (module code and entities) and prepare the insert statement below:
   insert into cvtb_recon_adhoc_entity (module_code, entity) values (<module_code>, 'PERIODIC INSTRUCTIONS');
2. In the target schema, complete the following activities:
   - Run the insert statements prepared in the previous step.
   - Run commit.
3. In the source schema, complete the following activity:
   Run 4_recon_parl_adhoc_src.sql with parameter BRANCH_CODE as ALL for all-branch extraction. For a specific branch, the BRANCH_CODE parameter needs to be the specific branch itself.
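For step 1 above, the insert statements can be collected into a small script before running them in the target schema. The sketch below writes one such statement to a file; the module code 'MS' is only an illustrative value (the real module and entity list comes from your verification scope), and the column names follow the statement quoted above.

```shell
#!/bin/sh
# Sketch: collect the adhoc-recon entity inserts into one script to run in the
# target schema, followed by a commit. 'MS' is a made-up example module code.
ADHOC_SQL="/tmp/adhoc_recon_entities.sql"
cat > "$ADHOC_SQL" <<'EOF'
INSERT INTO cvtb_recon_adhoc_entity (module_code, entity)
  VALUES ('MS', 'PERIODIC INSTRUCTIONS');
COMMIT;
EOF
grep -c "INSERT INTO" "$ADHOC_SQL"   # prints 1: one insert prepared
```

One line per entity keeps the script easy to review before it is executed and committed in the target schema.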
Approach: Data Import to Target Schema

In this approach, you will prepare the staging area only with the target version schema. This is the recommended approach. The steps involved in this method are as follows:

1. Prepare the staging area with the target version schema.
2. Insert the E Data (factory-shipped static data) alone in the target schema using the consolidated insert scripts. You can do this using the Oracle FLEXCUBE Universal Banking Installer. The basic setup step should not be done through the Installer.
3. Disable the not-null and check constraints in the target schema. Disable all the triggers except the module-specific triggers required to be enabled for the upgrade activity. For the list of triggers to be enabled, see Module Upgrade on page 4-1.
4. Perform the data import from the export dump of the production data.
5. Apply the conversion scripts for the columns with not-null and check constraints in the target schema, to populate them with proper business data.
6. Enable the constraints in the target schema.

The target schema will now act as the gold copy for the customer to resume the verification and production activities.

Advantages

The advantages of this approach are as follows:
- Only one staging area is prepared, for the target version schema.
- The production P Data (transaction data) is imported without loss of data, as constraints are disabled during the import.

Disadvantages

The disadvantages of this approach are as follows:…
…During migration, the minimum tenor will be updated as one day; once the contract migration is over, the CL products should be updated to their original minimum tenor as in the LD product.

14. Contracts having only a bullet schedule for principal and interest components are migrated with the original value date.

15. For a given contract, the following values are migrated from the LD module to the CL module:

   CL Loans field: value taken from LD Loans
   - Alternate Account Number: Contract Reference Number
   - Value Date: Last Fully Paid Schedule's Due Date (LFPSD)
   - Book Date: the book date will be the migration date
   - Maturity Date: Maturity Date
   - Loan Amount: outstanding principal amount as on the migration date
   - Original Start Date: Original Start Date (Value Date, if the original start date is null in LD)
   - Number of Schedules: derived from the LD original schedules as on LFPSD
   - Frequency: same as in LD, from LFPSD
   - No. of Units: derived from LFPSD

16. For a given contract, the following details are migrated from the LD module to the CL module:

   LD Loans -> CL Loans
   - Contract -> CL Account Upload
   - Parties -> CL Account Parties Upload
   - Components -> CL Component Upload
   - Schedules -> CL Component Schedule Upload
   - Interest Rates -> CL Account UDE Upload
   - Linkages -> CL Linkages Upload
   - Settlement Details -> CL Account Settlement Upload
   - MIS Details -> M…
…for a Specific Script Identifier .......... 6-2
6.2.5 Generating Dynamic Script for Aborted Script Identifiers .......... 6-3
6.2.6 Spooling Module-wise Spool Files and Control File for a Run Number .......... 6-3
7 Data Reconciliation .......... 7-1
7.1 … .......... 7-1
7.2 Setting Up New Environment .......... 7-1
7.3 Releasing Additional Units (Delta) .......... 7-2
7.4 Changing Source and Target Schema in Existing … .......... 7-2
7.5 Generating Reports .......... 7-3
7.5.1 Generating Migration Reconciliation Report .......... 7-3
7.5.2 Generating Adhoc Reconciliation Report .......... 7-4
7.5.3 Generating Parallel Run Reconciliation Report .......... 7-5
7.5.4 Moving Extraction Data into History Table .......... 7-6
8 Annexure .......... 8-1
8.1 Utility Scripts .......... 8-1
9 Glossary of Scripts .......... 9-1

1 Preface

1.1 Introduction

Customers who use the lower versions of Oracle FLEXCUBE Universal Banking Solutions may need to upgrade to the latest version, as the support to the older versions will phase out.
Generating Migration Reconciliation Report

Once the source data is migrated to the target version environment, you can generate the migration reconciliation report. This is a complete reconciliation report and covers all the entities that need to be reconciled. For generating the migration recon report, you need to follow the steps given below:

1. Check the following details:
   - The head office branch for the source schema and the target schema must be the same.
   - The Today column in the sttm_dates table should be the same for all the branches in the source schema and the target schema.
   - The report generation path is available in CSTB_PARAM (PARAM_NAME: RECON_REPORT_PATH).
   - The recon extraction modules are available in the PARAM table (PARAM_NAME: RECON_MODULE_LIST).
2. In the source schema, run the following:
   - 1b_tab_recon_script_gen_src.sql
   - 2a_validate_recon_scripts_src.sql
   - 2b_update_inv_scripts_src.sql
3. In the target schema, run the following:
   - 1b_tab_recon_script_gen_tar.sql
   - 2a_validate_recon_scripts_tar.sql
   - 2b_update_inv_scripts_tar.sql
4. In the source schema, run 3_recon_migrt_src.sql with parameter BRANCH_CODE as head office branch. In normal cases it is CHO.
5. In the target schema, run 3_recon_migrt_tar.sql with parameter BRANCH_CODE as head office branch. In normal cases it is CHO.
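The run order of the recon scripts matters: generation, validation and update scripts first, alternating between the source and target schemas, and the extraction scripts last. As a sketch, the sequence can be captured in a small driver list; the connect strings and the actual sqlplus invocation are deliberately left out, since they are site-specific.

```shell
#!/bin/sh
# Sketch: the migration-recon scripts in execution order, tagged with the
# schema (SRC or TAR) they run against. This only prints the plan.
for step in \
  "SRC 1b_tab_recon_script_gen_src.sql" \
  "SRC 2a_validate_recon_scripts_src.sql" \
  "SRC 2b_update_inv_scripts_src.sql" \
  "TAR 1b_tab_recon_script_gen_tar.sql" \
  "TAR 2a_validate_recon_scripts_tar.sql" \
  "TAR 2b_update_inv_scripts_tar.sql" \
  "SRC 3_recon_migrt_src.sql" \
  "TAR 3_recon_migrt_tar.sql"
do
  echo "$step"
done
```

A wrapper like this also gives a natural place to stop the run if any single script fails, before the extraction scripts are reached.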
…triggers and constraints. In the target schema, proceed with the LD to CL migration and then the LM to ELCM migration. Complete the module-wise migration and verification checks. See Module Upgrade on page 4-1. In the target schema, complete the verification activities. See Verifying Data after Database Upgrade on page 3-7.

2. Complete the cut-over upgrade activity. During this, you will upgrade the production environment database. See Cut-over Upgrade Activities on page 5-1. During the cut-over upgrade, if a conversion script needs to be applied on any specific units, generate the required conversion script and apply it on the target schema. See Conversion Script Generation Tool on page 6-1.

3. Reconcile the data in the source and target databases using the data reconciliation tool. Run a parallel EOD and reconcile the data again. See Data Reconciliation on page 7-1.

3 Mock Upgrade

3.1 Introduction

This chapter discusses the prerequisites and guides you through the process of upgrading Oracle FLEXCUBE Universal Banking Solutions from a lower version to a higher version. The upgrade involves the following two activities:

- Mock upgrade activity
- Cut-over upgrade activity

The mock upgrade activity provides a safe platform for the actual production environment upgrade. Once the mock upgrade is completed, you will have a ready target database, which is termed the Gold Copy, for setting up…
…as provided in the next section. Carry out Activity 5 (Destination).

3.3.3.1 Post-Import Activities

Once the data import is completed, you need to perform the following post-import activities:
- Recompile invalid objects.

3.3.3.2 Issues in Data Import using IMPDP Utility

You may encounter any of the following issues while importing data using the IMPDP utility:

- Issue: Import options not recognized.
  Problem: Some of the import options may not be enabled in the server. One such example is the DATA_OPTIONS clause of the import, which is used in the E-M Data import par file.
  Cause: Oracle parameter setup.
  Resolution: The DBA needs to enable the same.

- Issue: Data import fails because of new indexes.
  Problem: If the value for a column is null in the imported data, and the column is going to be part of an index in the target, then the import fails.
  Cause: The existing column would have been added as part of a newly created unique index in the DESTSCHEMA. So, if the data for this column contains null values, the uniqueness is violated.
  Resolution: Disable the index, do the import, and then supply values to this column.

- Issue: Data import fails due to long columns.
  Problem: If a varchar2 column was changed to a long column in the higher versions, then the import fails.
  Cause: IMPDP does not support importing varchar2 columns into long columns. It is given in the Oracle documentation that long columns are deprecated.
  Resolution: As a workaround, instead of the impdp utility, use the imp utility to import the…
…is run without being skipped.
- Check if all aspects of the EOD, i.e. module functionality and reports generation, are covered.
- Test on a masked dump of the site if the testing is done offshore.

3.3.6.3 Interface Testing to Check Connectivity

As part of this verification, you need to perform the following activities:
- Test the incoming and outgoing interfaces (conversion of FLEXML formats to gateway, EMS to JEMS, and ATM/POS using the SWIG interface).
- Check all the channels that receive information from Oracle FCUBS.

3.3.6.4 Module-wise Data Verification of Reports and other Check Points

See Module Upgrade on page 4-1.

3.3.6.5 Converted Deals Testing

You need to test the converted deals as follows:
- As part of the upgrade, the system will have new tables as well as new columns in the existing tables. Check the sanity of the conversion utility, and populate the additional fields and tables by testing the converted (migrated) data.
- Perform basic life cycle testing for the converted contracts.
- Check the product maintenances and static maintenances for modifications.

3.3.6.6 New Deals Testing

You need to test the new deals as follows:
- Create new contracts on existing products and observe the validation of default values.
- Test the basic life cycle of new deals.

3.3.6.7 New Product Maintenance Testing

Once the upgrade is completed, create a new product in each module.

3.3.6.8 Signoff

Get the customer signoff to go ahead with the upgrade of the production environment.
…at the PostBOD stage.

4. In the source schema, complete the following activity: run 4b_recon_parl_pseod_src.sql with parameter BRANCH_CODE as ALL for all-branch extraction. For a specific branch, the BRANCH_CODE parameter needs to be the specific branch itself.

5. In the target schema, complete the following activities:
   - Run 4b_recon_parl_pseod_tar.sql with parameter BRANCH_CODE as ALL for all-branch extraction. For a specific branch, the BRANCH_CODE parameter needs to be the specific branch itself.
   - Run 6b_recon_reportgen_parl_pseod.sql with parameter BRANCH_CODE as ALL for all-branch extraction. For a specific branch, the BRANCH_CODE parameter needs to be the specific branch itself.

6. Check the following parameters in the PARAM table:
   - RECON_MODULE_LIST: Module list in tilde-separated values. This list will be taken if the module code has been passed as ALL during data extraction.
   - RECON_REPORT_PATH: Path where the recon reports need to be generated.
   - TARGET_LM_INSTALLED: LM module installed in the target; it can be LM or EL.
   - SOURCE_LM_INSTALLED: LM module installed in the source; it can be LM or EL.
   - TARGET_LOAN_INSTALLED: Loans module installed in the target; it can be LD or CL.
   - SOURCE_LOAN_INSTALLED: Loans module installed in the source; it can be LD or CL.
   - RECON_ENVIRONMENT: This is the environment name. It will be appended as part of the report file name.
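RECON_MODULE_LIST above is a single tilde-separated string, so a quick way to see (or iterate over) the individual module codes is to split it on '~'. The module codes below are an example value only, not factory-shipped data.

```shell
#!/bin/sh
# Sketch: split a tilde-separated RECON_MODULE_LIST into one module per line.
MODULE_LIST="CL~CASA~GL~MS"   # example value only
echo "$MODULE_LIST" | tr '~' '\n'
```

The same split is handy when looping over modules to check, for example, that each one produced a report file under RECON_REPORT_PATH.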
…is expected to be created in upper case. FCUBS 12.0.2.0.0 accepts mixed-case characters during modification of the records.

Limitations

Following are the limitations with regard to account creation:
- Location type was free text in the older version, but is an LOV field in the later versions. It has to be populated before saving the customer information.
- An invalid-characters check for external accounts has been introduced, which might prevent the account from being saved.

Conversion scripts are not provided for the above two cases.

4.8.3 Upgrading SMS Module

Password History

During migration, if the source version is older than Oracle FCUBS 11.3, you should not move the password history from source to target. You can control this by updating the column DATA_IMPORT_REQD in CVTM_TABLE_TYPES to 'N' where TABLE_NAME = 'SMTB_PASSWORD_HISTORY'.

Password Reset

Once the data upgrade activities are completed and the front-end application is set up, you may log in to the new system. However, you need to reset the password. Using the Oracle FLEXCUBE Universal Banking Installer, you can create two users with passwords. Refer to the Installation Manual (User Creation Utility) for details. You can log in to the new system with the user IDs created. Once logged in, you can reset the password for any or all of the users. You can use the User Credentials Change (SMDCHPWD) screen to reset the password. You can directly specify the password in the…
The outstanding principal amount as on the migration date is taken as Amount Disbursed and Amount Financed in the CL account. The original loan amount has to be stored as a UDF at the CL contract level. Contracts with fully paid principal amounts will be migrated with 0.01 as Amount Disbursed / Amount Financed.

Example

Case 1: The principal is outstanding from the last fully paid schedule (LFPS). No partial payment after LFPS.

Loan Date: December 15, 2011
LFPS: April 15, 2012
Migration Date: June 30, 2012

Contract Ref Number | Component | Due Date | Amount | Amount Settled
406LILP10040000X | PRINCIPAL | 1/17/2012 | 6734.31 | 6734.31
406LILP10040000X | PRINCIPAL | 2/15/2012 | 7394.01 | 7394.01
406LILP10040000X | PRINCIPAL | 3/15/2012 | 7571.56 | 7571.56
406LILP10040000X | PRINCIPAL | 4/15/2012 | 7112.08 | 7112.08 (LFPS)
406LILP10040000X | PRINCIPAL | 5/16/2012 | 7130.45 | 0.00
406LILP10040000X | PRINCIPAL | 6/15/2012 | 7306.74 | 0.00
406LILP10040000X | PRINCIPAL | 7/15/2012 | 7325.00 | 0.00
406LILP10040000X | PRINCIPAL | 8/15/2012 | 7186.67 | 0.00
406LILP10040000X | PRINCIPAL | 9/15/2012 | 7205.24 | 0.00
406LILP10040000X | PRINCIPAL | 10/17/2012 | 7068.41 | 0.00
406LILP10040000X | PRINCIPAL | 11/15/2012 | 7551.82 | 0.00
406LILP10040000X | PRINCIPAL | 12/15/2012 | 7413.71 | 0.00 (Maturity)
TOTAL | | | 87000.00 | 28811.96

Principal Outstanding = 87000.00 - 28811.96 = 58188.04

In the above case, since the principal outstanding…
- Generate the dynamic script for aborted script identifiers
- Spool the module-wise spool files and control file for a run number

6.2.1 Setting up Parameters

Set the appropriate values for the parameters in the table CVTB_PARAM before generating the dynamic scripts. You need to set the following parameters:

- SITE_VERSION: This refers to the Oracle FLEXCUBE version installed at the customer site. The scripts for data migration are picked up based on this parameter, as the repository master CVTM_REPOS_MASTER contains all the scripts.
- MIGRATION_TYPE: You can generate spool files for different table types depending on the time of data migration. For example, conversion scripts should not be applied for P Data during live cut-over.
- PARSE_STATEMENT: You can have 'Y' or 'N' as the value. If the value is 'Y', you can check the correctness of the syntax of the dynamic scripts by parsing the statement.
- WORK_AREA: This is a valid folder which is accessible and has Write permission. The module-wise spool of the conversion scripts and the control file is generated in this folder.
- RUN_NUMBER: This is used to identify the execution of the scripts end to end. If the scripts are already executed and applied onto the database, then you need to change the run number. Otherwise, the same run number can be retained.
- GENERATE_SPOOL_FILES: You can generate the scripts and spool the module files and control file by setting the parameter GENERATE_SPOOL_FILES to 'Y'.
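The parameters above are plain rows in CVTB_PARAM, so they are usually seeded with a short SQL script before the generation run. The sketch below writes such a script to a file; the column names (param_name/param_value) and the example values are assumptions to be adapted to the shipped toolkit DDL.

```shell
#!/bin/sh
# Sketch: seed CVTB_PARAM before generating the dynamic scripts.
# Column names and example values are assumptions; check the shipped DDL.
SETUP_SQL="/tmp/cvtb_param_setup.sql"
cat > "$SETUP_SQL" <<'EOF'
UPDATE cvtb_param SET param_value = 'Y'     WHERE param_name = 'PARSE_STATEMENT';
UPDATE cvtb_param SET param_value = '/work' WHERE param_name = 'WORK_AREA';
UPDATE cvtb_param SET param_value = 'Y'     WHERE param_name = 'GENERATE_SPOOL_FILES';
COMMIT;
EOF
grep -c "^UPDATE" "$SETUP_SQL"   # prints 3: three parameters staged
```

Keeping the parameter setup in one reviewed file makes it easy to re-run with a new RUN_NUMBER when the scripts have already been applied once.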
…the upgraded production environment.

Prerequisites

Following are the prerequisites for the upgrade activity:

1. Prepare a copy of the production system, covering all components of the product, for the mock upgrade.
2. Set up the Oracle DB parameters as per the FCUBS recommendations for the destination schema. It is ideal to follow the steps of installation of a new version.
3. Identify and list out the installed components of the production system.
4. List out the new components available in the new version that the customer proposes to use.
5. Identify and list out the details missed out in the new version. Internally discuss and suggest the actions proposed to address them.
6. Update the customer about the proposed plan and get the customer's concurrence.
7. The following are the other interfacing teams to be involved in discussions for qualification:
   - For qualifying with the new version: address all changes required for qualification with the new version.
   - For qualifying with the existing interfacing system: identify and address the new interfacing requirements for the interfacing system to remain intact.
8. You need to understand the database upgrade strategy proposed below:
   - Identify and document the migration steps that are planned.
   - Identify whether any module migrations are present, and collate the migration scripts in the migration area (like module migration from LD to CL, LM to ELCM, etc.).
   - Set up the utilities for data comparison.
…manuals of the target version for details on this setup.

- Ensure that the CSTB_PARAM values for the related parameters are properly set.
- Compile the POJO jars and ELCM Java files as per the ELCM setup document for the target version.
- Provide the Java grants in the schema as per the ELCM setup document for the target version.
- Verify all Java objects in the database and ensure that they have a valid status.

Note: The prerequisites given above are a few basic checks to be completed. For complete details, refer to the Installation Manuals of the target version.

4.5.3 Enabling Triggers

Before you import the data from LM to ELCM in the target schema, you need to enable certain ELCM-related triggers. You can enable the triggers using the script ELCM_TriggersEnable.sql. See Annexure on page 8-1 for details on the location and usage of the SQL file. As per the migration strategy, by default all the triggers are disabled before the data import. However, the triggers mentioned in this section must be enabled as exceptional cases. The list of triggers is given below:

Trigger Name | Type | Event | On Table
1. ELTR_ACC_CLASS | AFTER EACH ROW | INSERT OR UPDATE OR DELETE | STTM_ACCOUNT_CLASS
2. ELTR_CLTM_PRODUCT | AFTER EACH ROW | INSERT OR UPDATE OR DELETE | CLTM_PRODUCT
3. ELTR_CLTM_PRODUCT_UDE | AFTER EACH ROW | INSERT OR UPDATE OR DELETE | CLTM_PRODUCT_UDE
4. ELTR_GE…
…Utilization is as follows:

Commitment CX2 (amount 1000):
Event Date | Utilized | Un-utilized | Event | Amount
01-Mar-12 | 0 | 1000 | |
01-Mar-12 | 150 | 850 | LX12 BOOK | 150
01-Jun-12 | 300 | 700 | LX13 BOOK | 150

4.5 Migrating Data from LM Module to ELCM Module

The LM module in the source version has been revamped as the ELCM module in the target version. You need to migrate the LM data to the ELCM module. This involves the following steps:

- Understand the migration approach
- Ensure that the prerequisites are met
- Table mapping
- Enable EL-specific triggers
- Migrate data to the target version
- Truncate the database, if required

4.5.1 Migration Approach

The migration approach is as follows:

- Before starting the LM to ELCM data migration, the LD to CL migration must be completed.
- The migration of data from the old set of tables to the new set of tables is effected through a package. The DDLs corresponding to the package are also available in the shipment media.
- All maintenance data, such as liabilities, collateral, etc., is migrated by a simple table-to-table movement of data. The utilizations migration involves calls to the underlying limits processing packages.
- LM to ELCM migration is done as the last step in the module-wise conversion application process.

4.5.2 Prerequisites

The prerequisites for this migration are as follows:

- Complete the ELCM setup in the migration environment. Refer to the installation man…
…wise balances, interest rates, schedule definitions, UDF fields, calculation methods, date fields, etc. have to be compared and verified against the LD module contracts.

Commitments Linkage

You need to update the commitment linkages separately after the migration of both commitment and loan contracts.

MIS

During the CL account migration process, the MIS update is disabled. After the migration, the MIS tables will be updated with the new CL account number in place of the loans reference number.

UDF

You need to associate the user-defined fields defined at the LD product level with the CL products. Account-level UDFs attached to the LD contracts will be automatically migrated to CL accounts based on the product mapping done in CL. New UDFs have to be created for storing the original loan amount.

SMS

The customer needs to define the SMS roles for the new function IDs. Since the LD-CL function ID mapping is not always one-to-one, the bank needs to manually do the new role configuration.

Exceptions in Migration Strategy

The exceptions in the migration strategy are as follows:

- If a loan is of type Rule 78, you need to migrate it with its original value date.
- Holiday treatment in the CL module is only at the product level. For CL accounts, the holiday treatment is defaulted from the corresponding CL product. For the LD contracts, if the holiday treatment is different from the corresponding…
…text property in the RAD XML for the above fields. The RAD XML for these fields is STDCUSAC_RAD.XML. Once the changes are effected, you need to deploy the UI files and the related back-end packages. Refer to the Installation Manuals of the target version for details.

Module-wise Verification Check Points

You need to verify the modules to ensure the following:

- Module-wise maintenances: check whether the module-wise maintenances required for the module to function are completed or not. You may unlock or save the products, or do back-end updates.
- Correctness of sequences.
- Correctness of parameter values.
- Count of entities before and after migration: reconciliation scripts are provided for each module to verify the counts of different entities in the database before and after migration. See Data Reconciliation on page 7-1 for details on the reconciliation scripts.

5 Cut-over Upgrade Activities

5.1 Introduction

The upgrade activities that you need to carry out during cut-over are as follows:

- Activities in the production environment
- Database upgrade in the production environment
- Installation of other components

5.2 Activities in Production Environment

On the cut-over date, the following activities are required to be done at the production environment level:

- Run operations and bring the system to the TI (Transaction Input) stage of the migration date.
- Ensure that there are no unauthorized…
…Un-utilized:

Contract C1 (amount 1000):
Event Date | Utilized | Un-utilized | Event | Amount
01-Jan-12 | 0 | 1000 | |
01-Feb-12 | 100 | 900 | L1 BOOK | 100
01-Mar-12 | 300 | 700 | L2 BOOK | 200
01-May-12 | 300 | 700 | L1 LIQD | 100
01-Jun-12 | 150 | 550 | L3 BOOK | 150

Contract L1 is liquidated on 01-May-2012. Hence, L1 is not migrated. C1 is migrated as CX1, L2 is migrated as LX2, and L3 is migrated as LX3.

Migration Date: 15 June 2012
Commitment Value Date: 01 May 2012 (minimum of the value dates of the active loans L2 and L3)
Commitment amount: 350 + 550 = 900 (sum of the outstanding loans L2 and L3, plus the un-utilized amount)

Utilization is as follows:

Commitment CX1 (amount 900):
Event Date | Utilized | Un-utilized | Event | Amount
01-Mar-09 | 0 | 900 | |
01-Mar-09 | 200 | 700 | LX2 BOOK | 200
01-Jun-09 | 350 | 550 | LX3 BOOK | 150

Revolving Commitment

Contract C2 (amount 1000):
Event Date | Utilized | Un-utilized | Event | Amount
01-Jan-12 | 0 | 1000 | |
01-Feb-12 | 100 | 900 | L11 BOOK | 100
01-Mar-12 | 70 | 930 | L11 PAYM | 30
01-Mar-12 | 270 | 730 | L12 BOOK | 200
01-Apr-12 | 240 | 760 | L11 PAYM | 30
01-Apr-12 | 190 | 810 | L12 PAYM | 50
01-May-12 | 150 | 850 | L11 LIQD | 40
01-Jun-12 | 300 | 700 | L13 BOOK | 150

Contract L11 is liquidated on 01-May-2012. Hence, L11 is not migrated. C2 is migrated as CX2, L12 is migrated as LX12, and L13 is migrated as LX13.

Migration Date: 15 June 2012
Commitment Value Date: 01 March 2012 (minimum of the value dates of the active loans L12 and L13)
Commitment amount: 1000 (original amount…
