
Data Consolidation and Importing Software for Microsoft Dynamics


Contents

1. [Figure 2. System Architecture] The data consolidation and importing tool itself can be broken down into a user interface and three major functional components.
1. User interface: it allows users to enter the information necessary to set up and run the tool.
2. Configuration & verification unit: this unit performs the following tasks:
• Receives inputs from the user interface
• Checks configuration file availability, database connectivity, default application settings, etc.
• Checks whether the predefined data has been set up correctly…
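The startup checks performed by the configuration & verification unit can be sketched as follows. This is a minimal illustration only: the class name, method name, and result shape are hypothetical, and the database-connectivity and predefined-data checks are left as comments because they depend on the live environment.

```csharp
using System.Collections.Generic;
using System.IO;

// Hypothetical sketch of the configuration & verification unit's startup checks.
public static class StartupVerifier
{
    // Returns a list of problems found; an empty list means the tool may proceed.
    public static List<string> Verify(string configFilePath)
    {
        var problems = new List<string>();

        // Check configuration file availability.
        if (!File.Exists(configFilePath))
            problems.Add("Configuration file not found: " + configFilePath);

        // Database connectivity and predefined-data checks would go here,
        // e.g. opening a SqlConnection and querying the lookup tables.

        return problems;
    }
}
```

Collecting problems into a list (rather than failing on the first one) lets the user interface show all warnings at once, which matches the warning-message behavior described for this unit.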
2. Column Name        Data Type      Allow Nulls  Primary Key
ProjectID             smallint       No           Yes
TestCaseID            int            No           Yes
TestCaseName          nvarchar(255)  No           No
TestCaseType          char(50)       No           Yes
TestCaseTypeID        int            No           No
TestCaseOwner         char(50)       No           No
TestCaseOwnerID       smallint       No           No
TestCasePriority      smallint       No           No
ScenarioID            smallint       No           No
PerfCounterID         smallint       No           Yes
PerfCounterOrder      smallint       No           No
PerfCounterName       varchar(50)    No           No
PerfCounterUnitID     smallint       No           No
4.2.2 Data Access
From the data access point of view, the system can be largely divided into the following three layers:
• Presentation Layer: contains the user interface components.
• Business Logic Layer: contains the parsing and processing modules.
• Data Access Layer: used to access and perform operations on the database tables. It (1) connects to the database, (2) retrieves data from the database, and (3) uploads processed data to the database.
The Data Access Layer encapsulates database-related operations. As a result, it is easier to maintain the data manipulation methods without affecting other modules. The data access layer developed for this system includes two classes: ToolDataAccess and LoaderDataAccess. Please refer to the class diagrams (Figures 20 and 28) for their descriptions. These classes make use of Microsoft ADO.NET and stored procedures written with Transact-SQL statements. Following is a sample method in the ToolDataAccess class; the method is used to retrieve…
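The sample method itself is cut off in this chunk, so here is a hedged sketch of what an ADO.NET stored-procedure call of this kind typically looks like. The class shape, the stored procedure name (`usp_GetLookupTable`), and the parameter are illustrative assumptions, not the actual members of ToolDataAccess.

```csharp
using System.Data;
using System.Data.SqlClient;

// Illustrative sketch of a ToolDataAccess-style method using ADO.NET and a
// Transact-SQL stored procedure. All names here are assumptions.
public class ToolDataAccessSketch
{
    private readonly string _connectionString;

    public ToolDataAccessSketch(string connectionString)
    {
        _connectionString = connectionString;
    }

    // Retrieves lookup rows for one project via a stored procedure.
    public DataTable GetLookupTable(int projectId)
    {
        using (var connection = new SqlConnection(_connectionString))
        using (var command = new SqlCommand("usp_GetLookupTable", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@ProjectID", projectId);

            var table = new DataTable();
            // SqlDataAdapter opens and closes the connection itself.
            new SqlDataAdapter(command).Fill(table);
            return table;
        }
    }
}
```

Routing every query through stored procedures, as the report describes, keeps the Transact-SQL on the server side and leaves the C# layer with only parameter binding and result handling.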
3. [Figure 29. Class-Level Class Diagram: Loader and LoaderDataAccess]
4.2 Data Models and Data Access
The system requires two databases for data storage and access:
1. ABench: a database that has been used by a variety of performance test teams for their projects. It has the following features:
• It is the backend d…
4. [Figure 19. Class-Level Class Diagram: DataTable_viewer]
[Figure 20. Class-Level Class Diagram: ApplicationSettings and ApplicationItem]
[Figure 21. Class-Level Class Diagram: StatusForm]
[Class diagram detail for ToolHelper, UserInputs (fields and properties) and ParserHelper omitted]
5. [Figure 22. Class-Level Class Diagram: UserInputs]
[Figure 23. Class-Level Class Diagram: ObjectXMLSerializer]
[Figure 24. Class-Level Class Diagram: Driver]
[Class diagram detail for XMLParser, TXTParser, TestCaseCollection and Event omitted]
6. …Dynamics GP, which consists of a variety of applications. Raw test results are stored in a plain-text (TXT) format for some applications and an XML format for others. Currently these raw data have to be converted to Excel spreadsheets for management to interpret. The problems of this approach include:
1. With the amount of information generated in a given release, the number of spreadsheets to monitor becomes cumbersome and difficult to manage.
2. It requires considerable manual intervention from the testers to convert the data.
3. The spreadsheets are difficult to read and interpret.
4. It is not easy to conduct comparisons between results and goals, or with previous test runs.
5. It is hard to communicate the results to other teams, as the format is unique to a certain application type.
ABench, a web-based reporting website, was chosen to be the archiving and publishing system for the performance testing team. It provides a uniform framework for multiple projects, scenarios, test cases, performance metrics, execution tiers and baselines. This project is a part of the effort of publishing the performance test results of different applications to ABench. It is also desirable that the project be able to consolidate and process test results from other possible applications that use similar formats.
1.2 Objectives
The primary objectives for this system are:
1. To parse XML results and TXT results according to the application types and predefined…
7. [Rows of sample raw test data (timestamps, iteration numbers and counter values) omitted]
CHAPTER III. HIGH-LEVEL AND LOW-LEVEL DESIGN
Based on the specification analysis, several design models of the system are developed at different levels of abstraction. This chapter describes the high-level and low-level design in detail.
3.1 System Architecture
The system architecture is the top-level design that gives us an overview of the whole project. There are three main architectural pieces, as shown in Figure 2:
1. The data consolidation and importing tool, which can be further divided into several functional components.
2. The raw data files generated by the performance tests. These files are the input files for the import tool and must be stored in a consistent format.
3. The SQL database that stores the processed data. The production database is hosted on a remote SQL Server named RM_PERFORMANCE. For development and testing purposes, the project uses an ABench database on the local machine.
8. …Application Type from the list.
OS ComboBox:
• The control is for select only.
• Items are retrieved from the database.
• The user must select an existing Operating System from the list.
"Save above settings as default" Button:
• Sets the selected Application Type, OS and Tier as the default values. These data will be saved in the configuration file.
"Add or Select Build #" ComboBox:
• The user must either write in the box to add a new Build Number or select an existing one from the dropdown list.
• Dropdown list items are retrieved from the database.
• Input data validation is enabled.
"Actions on existing Build" RadioButton:
• This control depends on the user's action on control 5:
  i. Disabled if the user added a new Build Number, as shown in Figure 10.
  ii. Enabled if the user selected an existing Build Number.
• It has two options for the existing Build Number:
  i. Append new results to the previous test run (default value).
  ii. Overwrite the results of the previous test run.
[Figure 10. User Interface: Radio Button Disabled]
7. "Expected # of Iterations" TextBox:
• This control depends on data from the configuration file:
  i. Disabled if the <showExpectedIteration> value for this application type is False.
  ii. Enabled if the <showExpectedIteratio…
9. BPResults
• Number of test files: 18
• Type of test files: XML
• Sample of test file (only part of the contents is listed here):
<?xml version="1.0"?>
<root>
  <tests>
    <test>
      <name>HomePageLoad</name>
      <starttime>11/7/2005 6:11:22 PM</starttime>
      <type>performance</type>
      <machine>H16649</machine>
      <os>Win32NT</os>
      <osVersion>5.1.2600.0</osVersion>
      <netFramework>1.1.4322.2032</netFramework>
      <event>
        <time>6:11:22 PM</time>
        <type>Iteration</type>
        <message>1 0 214 01</message>
        <iteration>1</iteration>
        <data>2266</data>
        <threadid>556</threadid>
      </event>
      …
4. Inputs from the user interface. The inputs include:
• Application type: BP STD
• OS: Windows XP
• Tier: BP Standard
• Build: BP Test Qiang
• Expected # of Iterations: 10
5. Results. The results include:
• Files are parsed and processed. Figure 34 shows the valid data processed.
• Results are successfully uploaded to the ABench database. Figure 35 shows the information: in total, 192 test cases are parsed and 3302 rows are uploaded to the database.
• The ABench website publishes the newly updated data for this build number (BP Test Qiang), as shown in Figures 36, 37 and 38.
  i. Figure 36 shows the executive summary for the build BP Test Qiang. As we can see, the two test cases listed in the table…
10. (List of Figures, continued)
Class-Level Class Diagram: ToolHelper
Class-Level Class Diagram: DataTable_viewer
Class-Level Class Diagram: ApplicationSettings and ApplicationItem
Class-Level Class Diagram: StatusForm
Class-Level Class Diagram: UserInputs
Class-Level Class Diagram: ObjectXMLSerializer
Class-Level Class Diagram: Driver
Class-Level Class Diagram: TestCaseCollection, TestCaseItem, MachineItem and Event
Class-Level Class Diagram: XMLTestFile, XMLTestItem and XMLEventItem
Class-Level Class Diagram: TxtTestFile, TxtTestItem and TxtSingleUserItem
Class-Level Class Diagram: IParser, XMLParser, TXTParser, LookUpTable and ParserHelper
Class-Level Class Diagram: Loader and LoaderDataAccess
Error Message for an Invalid User Input
Table Display for Invalid Test Data
Table Display for Valid Test Data
Message Box After Processing
A Screenshot of ABench Webpage
A Screenshot of ABench Webpage
A Screenshot of ABench Webpage
11. …configurations; 2. To process and save the parsed data into the remote ABench SQL database; 3. To build a tool that is adaptable to other applications that use a similar output format.
1.3 Project Outline
The proposed system is a database-driven, multi-tier Windows application built with C# and .NET programming. By integrating with the ABench website, it is expected to solve the problems of data compatibility and presentation for performance testing data with minimal manual processing. After the implementation is completed, initial testing and debugging will be performed by the developers on the local system. The test team will then take the project for acceptance testing on the production system. Feedback from the test team will be collected to guide a second iteration of the software development life cycle.
This project requires knowledge of both the various performance testing results and the ABench SQL Server database. Many design and programming features are used to develop a generic tool, such as dynamic instantiation. In addition to software design and development, the following tasks are also critical for this research:
• Design and implement algorithms for processing the raw testing data.
• Establish precise mappings from the test results to the data schemas and models used in the ABench database.
• Design and implement a helper database for storing configuration information, lookup tables, etc.
• Implement an ABench website and database on a local machine for system development and testing.
12. …connectionString="Database=Helper;Server=(local)\SQLEXPRESS;Integrated Security=SSPI" providerName="System.Data.SqlClient" />
</connectionStrings>
• Change (local)\SQLEXPRESS to your database server instance name.
6. Start the system: click on PerfImportTool.exe, and the main-entry Windows form will display on the screen.
7. Input data using the user interface. Please refer to "User Interface Design" in Chapter 3 for a detailed explanation of the interface controls.
• Select an Application Type from the Application ComboBox, as shown in Figure 39.
[Figure 39. User Interface: Select Application, OS and Tier]
• Select an Operating System from the OS ComboBox, as shown in Figure 39.
• Select an Execution Tier from the Tier ComboBox, as shown in Figure 39.
• If you want to save the above settings, click on the "Save above settings as default" Button, as shown in Figure 39.
• Write in the "Add or Select Build #" ComboBox to add a new Build Number, or select an existing one from the dropdown list, as shown in Figure 40.
[Figure 40. User Interface: Add or Select Build Number]
• Check one of the "Actions on existing Build #" RadioButtons accordingly, as shown in Figure 41:
  i. Click on Append to append new results to the previous test run…
13. …data object.
[Figure 6. Level 3 Data Flow Diagram: Process User Input]
The four service functions are:
• Initialize user interface: this function displays the user interface to the end user. It queries the ABench database tables to create the menu options. It extracts data from the configuration file for the default application settings, which are used to set the selected menu items. It also gives warning messages if it fails to connect to the database or cannot find the configuration file.
• Get user input: this function gets the user input when the user selects items from a menu, inputs the test folder, or chooses to save the settings as default. It also saves the default settings to the configuration file if the user decides to do so.
• Parse folder and save file information: this function parses through the folder and displays the available files in the user interface. It takes input when the user selects from the displayed files. It also extracts the number of users from the folder and file names. Finally, it saves all the file information into an easy-access data object.
• Save user inputs: this function saves the user inputs into an easy-access data object.
Parse test files: the Level 3 DFD for this process is shown in Figure 7. As one can see in the diagram, this process takes the two data objects generated by the previous process. At the end of the process, it saves the results to a new test case results data object as an output. Here are t…
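The number-of-users extraction mentioned above can be sketched as a small helper. The class and method names are hypothetical; the naming convention (e.g. a folder named APTrx5User simulates five users) and the default of one user come from the test-results format described elsewhere in this report.

```csharp
using System.Text.RegularExpressions;

// Hypothetical helper for the "extract number of users from folder names" step.
public static class FolderNameHelper
{
    // Extracts the simulated user count from a folder name such as "APTrx5User".
    // Folders that do not follow the convention default to one user.
    public static int GetNumUsers(string folderName)
    {
        var match = Regex.Match(folderName, @"(\d+)User$");
        return match.Success ? int.Parse(match.Groups[1].Value) : 1;
    }
}
```

For example, `GetNumUsers("ARTrx10User")` yields 10, while a non-conforming name falls back to the documented default of one user.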
14. …find TestCase for 1 User "BadName" in H16649_2005 11 7_23 16 45 878.xml from the LookUp DB.
Parsing finished. See results in TestCasesCollection.xml.
Number of files parsed: 18
Number of TestCases generated: 192
Number of Events (Iterations) generated: 3312
********** Message from Loader **********
Number of valid Test Cases: 191
Number of valid Test Iterations (events): 3302
Number of invalid Test Cases: 1
Number of rows uploaded to database: 3302
[Figure 33. Message Box After Processing]
[Screenshot of the ABench webpage: Test Environment Settings (Build Number: BP Test Qiang; Tier: BP Standard; Goal: BP Goal; OS: Windows; Flavor: bp std), Summary Results chart with execution-rate and pass-rate legends, and Filter Settings]
There are 2 tests that match the criteria:
TestCaseName                           Type    Owner   Pri  PerfCounter        Goal
3000010 BP Admin UI Load Home 1 User   BP STD  gholma  3    Elapsed Time (ms)  6000
3000290 BP Admin UI Load Home 5 User   BP STD  gholma  3    Elapsed Time (ms)  6500
[Figure 34. A Screenshot of ABench Webpage]
TestCase: BP Admin UI Load Home 1 User; PerfCounter: Elapsed Time; TestC…
15. …have passed the predefined Goals. For example, the performance value of test case 3000290 (BP Admin UI Load Home 5 User) is 2683 milliseconds, which is below the Goal set at 6500 milliseconds.
  ii. Figure 37 shows the trend of the performance by comparing with the previous test results.
  iii. Figure 38 presents the test values of each user for the particular test case.
These tests indicate that the system meets the customer's requirements, both functional and non-functional. It can successfully consolidate and process the test results and publish them on the ABench website as expected.
[Figure 32. Table Display for Valid Test Data: valid data of BP STD, with columns ProjectName, MachineName, MachineCount, OSName, ProductRelease and FlavorID; the rows show Dynamics GP tests run on machines H16649 and H16658 under Windows XP for the build BP Test Qiang]
********** Message from Driver **********
Found parser type: XML
Found lookup table: BP STD
Getting in XMLParser
Could not…
16. …in database tables, such as test cases, performance counters, etc.
• If an existing build number has been selected from the user interface, prompts the user to choose whether to overwrite or append to the previous test run.
• Launches the data parser.
3. Data Parser: based on the file type of the test results, one of the parser classes is instantiated (XML Parser or TXT Parser). Its functionalities include:
• Get the input files.
• Parse through the files and process the data, conducting calculations when necessary.
• Organize and save the data in a consistent format.
• Pass the processed data to the data loader.
4. Data Loader:
• Get the processed data from the parser.
• Verify data integrity according to the database model and configuration file.
• Upload the valid results to the database.
3.2 Use Case Diagram
A use case diagram is used to show the interaction between actors and the system. An actor represents a user or another system that will interact with the system. A use case is an external view of the system that represents some action the user might perform in order to complete a task [1]. In this project the only actor is the user. The use cases are:
1. Input application settings, which include:
• Input application type
• Input operating system
• Input execution tier (hardware and server information)
2. Input build number for this application.
3. Specify input files. This task includes two steps:
• Input result folder
• Select test files to process
4. Start process: the user starts the…
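The parser-instantiation step above can be illustrated with a small factory. IParser, XMLParser and TXTParser appear in the report's class diagrams, but the stub bodies and the selection logic below are assumptions made for the sketch.

```csharp
using System;

// Reduced stubs for the parser types shown in the report's class diagrams.
public interface IParser { void Parse(string filePath); }
public class XMLParser : IParser { public void Parse(string filePath) { /* parse XML results */ } }
public class TXTParser : IParser { public void Parse(string filePath) { /* parse TXT results */ } }

public static class ParserFactory
{
    // Chooses a parser based on the configured parser type for the application.
    public static IParser GetParser(string parserType)
    {
        switch (parserType.ToUpperInvariant())
        {
            case "XML": return new XMLParser();
            case "TXT": return new TXTParser();
            default: throw new ArgumentException("Unknown parser type: " + parserType);
        }
    }
}
```

The report also mentions dynamic instantiation; an equivalent reflection-based variant could call Activator.CreateInstance on a type name read from configuration instead of using a switch.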
17. …tables, etc.
• Implement an ABench website and database on a local machine for system development and testing.
CHAPTER I. INTRODUCTION
1.1 Significance
Performance testing is one of the most crucial steps in the software development life cycle. It is used to test the run-time performance within the context of an integrated system. The application's features and transactions are tested and compared to measurable goals and objectives, such as the response time from the server for a web-based application. A final assessment report detailing executive summaries and pass/fail results is created for management to make decisions about the product release.
There are many different ways to go about performance testing, depending on the application type and test purpose. The tests are usually conducted by automated tools running against scripted test suites and test cases. Final test results are logged on local disks for further interpretation. These raw test results are organized in different ways depending on the testing tool. For a complex enterprise solution that normally has a family of applications, each application can have its own test tool, which will create test results in totally different forms and data formats. Because of this data inconsistency, creating and presenting compatible results is often a frustrating and time-consuming process.
This software engineering project was conducted at the Microsoft Corporation for a complex business solution…
18. …things happen when they shouldn't, or things don't happen when they should. Testing involves operating a system under controlled conditions and evaluating the results. The controlled conditions should include both normal and abnormal conditions.
5.1 System Testing
System testing provides evidence that the integration of the sub-systems has not resulted in unexpected behavior and that the software meets its functional and non-functional requirements. In this system, each module was tested first, after the coding. After integration, the whole system was tested to ensure each part of it communicates well and functions correctly. The scenarios tested are:
1. Valid data from the user interface. The system accepts valid inputs and starts processing.
2. Invalid data from the user interface. The system gives an error message for the invalid input, and processing is not started. For example, if we enter c:\NoSuchFolder in the Test Results Folder textbox and then click on the Select Files button, the system gives the error message shown in Figure 30: "Could not find a part of the path 'c:\NoSuchFolder'."
[Figure 30. Error Message for an Invalid User Input]
3. Valid database settings. The system starts processing.
4. Invalid database settings, including connection failure, wrong database server name, wrong database credentials, etc. The system gives an error me…
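The invalid-folder scenario above can be reproduced with a simple pre-check. This helper is hypothetical; in the described system the error text (which mirrors .NET's DirectoryNotFoundException message) is surfaced in a message box next to the input control.

```csharp
using System.IO;

// Hypothetical pre-check for the Test Results Folder input.
public static class InputValidator
{
    // Returns null when the folder is usable, otherwise an error message to display.
    public static string CheckResultsFolder(string path)
    {
        if (string.IsNullOrWhiteSpace(path))
            return "Please enter a test results folder.";
        if (!Directory.Exists(path))
            // Same wording the .NET runtime uses for a missing path segment.
            return "Could not find a part of the path " + path + ".";
        return null;
    }
}
```

Validating before starting the parse matches the tested behavior: invalid input produces a message and processing is never started.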
19. Figure 3 … 15
Figure 4 Level 1 Data Flow Diagram 17
Figure 5 Level 2 Data Flow Diagram 18
Figure 6 Level 3 Data Flow Diagram: Process User Input 20
Figure 7 Level 3 Data Flow Diagram: Parse Test Files 22
Figure 8 Level 3 Data Flow Diagram: Upload Results 23
Figure 9 User Interface: Main Entry 25
Figure 10 User Interface: Radio Button Disabled 27
Figure 11 User Interface: Folder Browser Dialog 28
Figure 12 User Interface: Select Files List 30
Figure 13 User Interface: File Viewer Example 31
Figure 14 User Interface: Message Box Example 32
Figure 15 Package Level Class Diagram: Process User Input 34
Figure 16 Package Level Class Diagram: Control Flow and Data Objects 34
Figure 17 Package Level Class Diagram: Parse Test Files 35
Figure 18 Package Level Class Diagram: Upload Results 35
Figure 19 Class Level Class Diagram: ToolDataAccess 36
20. …6. Process test results of other applications that use a similar output format.
Requirements, specification, design and implementation details are included in this report. Many design and programming features are used to develop a generic tool, such as dynamic instantiation. After the implementation was completed, the system was tested with real test files and gave correct results as expected. In conclusion, it solves the problems of data compatibility and presentation for performance testing data with minimal manual processing.
This project required knowledge of both the various performance testing results and the ABench SQL Server database. In addition to software design and development, the following tasks were also critical for this research:
• Design and implement algorithms for processing the raw testing data.
• Establish precise mappings from the test results to the data schemas and models used in the ABench database.
• Design and implement a helper database for storing configuration information, lookup tables, etc.
• Implement an ABench website and database on a local machine for system development and testing.
Future work for this project can include:
1. In the current system, users need to modify and save configurations in XML files. A configuration dialog in the user interface could be used to manage configurations.
2. In the current system, the data tables of the Helper database have to be created and populated manually. A function that w…
21. …APTrx, and only one user is simulated. If a folder name fails to follow the above convention, the default value for the number of users is one. There is no constraint on file names as long as they have a .xml extension.
APTrx1User
APTrx5User
APTrx10User
ARTrx1User
ARTrx5User
ARTrx10User
Customer1User
[Figure 1. Sample Folder Structure of XML Test Results]
The following example is an XML result file:
<?xml version="1.0" encoding="utf-8"?>
<root>
  <tests>
    <test>
      <name>PayablesInvoice</name>
      <starttime>6/29/2006 9:03:02 AM</starttime>
      <type>performance</type>
      <machine>H16649</machine>
      <os>Win32NT</os>
      <osVersion>5.1.2600.131072</osVersion>
      <netFramework>2.0.50727.42</netFramework>
      <event>
        <time>9:03:07 AM</time>
        <type>Iteration</type>
        <message>Iterations</message>
        <iteration>10</iteration>
        <data>208</data>
      </event>
      <event> … </event>
      <event> … </event>
    </test>
  </tests>
</root>
The <test> element contains the information of a test suite. The test tags include:
• <name>: test name
• <starttime>: time at which the results were logged
• <type>: the type of test
• <machine>: machine name on which the test was run
• <os>: operating system
• <osVersion>: version of the operating system
• <netFram…
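The XML layout above can be read with .NET's LINQ to XML. This is a sketch under the assumption that files follow the sample exactly; the real XMLParser implementation is not shown in this report, and the reader class here is invented for illustration.

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Xml.Linq;

// Illustrative reader for result files shaped like the sample above.
public static class XmlResultReader
{
    // Returns the <name> of every <test> under <root>/<tests>.
    public static List<string> ReadTestNames(string xmlContent)
    {
        var doc = XDocument.Parse(xmlContent);
        return doc.Root.Element("tests").Elements("test")
                  .Select(t => (string)t.Element("name"))
                  .ToList();
    }
}
```

The same Elements/Element navigation extends naturally to the <event> children (time, iteration, data) when building the in-memory test case collection.</tests></root>");
if (names.Count != 1 || names[0] != "PayablesInvoice") throw new System.Exception("expected one test named PayablesInvoice");
</test>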
22. [Figure 25. Class-Level Class Diagram: TestCaseCollection, TestCaseItem, MachineItem and Event]
[Figure 26. Class-Level Class Diagram: XMLTestFile, XMLTestItem and XMLEventItem]
[Class diagram detail for TxtTestFile and TxtTestItem omitted]
23. Data Consolidation and Importing Software for Microsoft Dynamics GP Performance Tests
by Qiang Zhang
Bachelor of Science, Nanjing University
A Project Submitted to the Graduate Faculty of the University of North Dakota in partial fulfillment of the requirements for the degree of Master of Science
Grand Forks, North Dakota
November 2006
This project document, submitted by Qiang Zhang in partial fulfillment of the requirements for the Degree of Master of Science from the University of North Dakota, has been read by the Faculty Advisor under whom the work has been done, and is hereby approved.
(Faculty Advisor)
This project document meets the standards for appearance, conforms to the style and format requirements of the Computer Science Department of the University of North Dakota, and is hereby approved.
(Graduate Director) (Date)
PERMISSION
Title: Data Consolidation and Importing Software for Microsoft Dynamics GP Performance Tests
Department: Department of Computer Science
Degree: Master of Science
In presenting this report in partial fulfillment of the requirements for a graduate degree from the University of North Dakota, I agree that the Department of Computer Science shall make it freely available for inspection. I further agree that permission for extensive copying for scholarly purposes may be granted by the professor who supervised my work or, in his absence, by the Chairperson of the Dep…
24. …Software. Users should follow these steps:
1. Check the system configuration. Make sure the computer running this system has the following configuration:
• Operating system: Windows Server 2003 or Windows XP Professional
• .NET Framework version 2.0
2. Obtain user permissions. The user must get appropriate credentials for the databases:
• The user has access to the ABench and Helper database server.
• The user has permission to upload and delete records on ABench.
3. Prepare the test results. The test results must be stored in a consistent, agreed-upon format, which includes the folder structure, folder naming convention and file structure.
• Please refer to "Format of Test Results" in Chapter 2.
4. Install the software. Copy the whole product folder of PerfImportTool to a local disk. The folder contains:
• One executable file: PerfImportTool.exe
• Three dynamic link library (DLL) files: Parser.dll, Microsoft.Practices.EnterpriseLibrary.Common.dll and Microsoft.Practices.EnterpriseLibrary.Data.dll
• Two configuration files: PerfImportTool.exe.config and PerfToolSettings.xml
5. Set up the configuration files.
• Open PerfImportTool.exe.config and find the following statement:
<connectionStrings>
  <add name="PerfImportToolHelper"
       connectionString="Database=ABench;Server=(local)\SQLEXPRESS;Integrated Security=SSPI"
       providerName="System.Data.SqlClient" />
  <add name="PerfResultsConnection"…
25. …local machine for system development and testing.
1.4 Background Information
Some background information about this research includes:
Microsoft Dynamics GP (formerly Microsoft Great Plains). Microsoft Dynamics GP is a comprehensive business management solution built on the highly scalable and affordable platform of Microsoft technologies. It offers a cost-effective solution for managing and integrating finances, e-commerce, supply chain, manufacturing, project accounting, field service, customer relationships and human resources [5].
ABench website and database. ABench is a scalable and generic framework for archiving and displaying performance data. It is used by a variety of performance test teams. The front end provides several viewing options that range from high-level executive summaries to detailed charts and tables. Performance results for various projects are reported using a standard format, making it easy for teams and management to read performance results. The database is hosted on a centrally located server using SQL Server. Users need to log on with Windows authentication to access the database. The database uses stored procedures to upload test results to the tables.
.NET and C#. .NET is a framework for programming on the Windows platform. Along with the .NET Framework, C# is a language that has been designed from scratch to work with .NET, as well as to take advantage of all the features provided by Visual Studio 2005, an object-o…
artment.

Signature ____________  Date ____________

iii

TABLE OF CONTENTS

PERMISSION ........ iii
TABLE OF CONTENTS ........ iv
LIST OF FIGURES ........ vi
LIST OF TABLES ........ ix
ACKNOWLEDGMENTS ........ x
ABSTRACT ........ xi

CHAPTERS
I. INTRODUCTION ........ 1
   1.1 ........ 1
   1.2 ........ 2
   1.3 Project Outline ........ 3
   1.4 Background Information ........ 4
   1.5 Report Organization ........ 5
II. REQUIREMENTS AND SPECIFICATIONS ........ 6
   2.1 Requirements ........ 6
   2.2 Specifications ........ 7
   2.3 Format of Test Results ........ 8
III. HIGH LEVEL AND LOW LEVEL DESIGN ........ 12
   3.1 System Architecture ........ 12
   3.2 Use Case Diagram ........ 14
   3.3 Data Flow Diagrams ........ 15
   3.4 User Interface Design ........ 24
IV. IMPLEMENTATION ........ 33
   4.1 Class Diagrams ........ 33
   4.2 Data Models and Data Access ........ 44
V. TESTING AND VERIFICATION ........ 49
   5.1 System Testing ........ 49
   5.2 Acceptance Testing ........ 51
VI. CONCLUSION AND FUTURE WORK ........ 57
REFERENCES ........ 59
APPENDIX A ........ 61
APPENDIX B ........ 65

LIST OF FIGURES

Figure 1. Sample Folder Structure of XML Test Results ........ 9
Figure 2. System Architecture ........ 1
ata storage for the ABench website.
   - It is the production database that holds all the processed test results.
   - In this system, it is the destination for uploading data; it is also used for system initialization and data validation.
2. Helper. It is a database designed and developed for this system only.
   - As the name suggests, it helps the system to do the parsing.
   - It holds the information that cannot be included in the ABench database.

4.2.1 Data Models

Whether the database is designed reasonably and sufficiently has a direct effect on the quality of the application. Data models focus on what data should be stored in the database. For a relational database, the data model is used to design the relational tables. The database designed for this system is the Helper database, which consists of two tables: AllTestCaseLookUp and AllTestCasePerfCounters. These two tables store the information of the lookup table and the performance test counters. Tables 1 and 2 show the data schemas of these two tables. The ABench database is briefly discussed in Chapter 1.

Table 1. AllTestCaseLookUp Database Table

Column Name     Data Type      Allow Nulls  Primary Key
ProjectID       smallint       No           No
TestCaseType    char(50)       Yes          No
NumUsers        smallint       Yes          No
FileName        varchar(64)    Yes          No
OrigTestCaseID  varchar(64)    Yes          No
TestCaseID      int            No           No
TestCaseName    nvarchar(255)  No           No

Table 2. AllTestCasePerfCounters Database Table
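The role of the AllTestCaseLookUp table can be illustrated with a small in-memory sketch. Python is used here for brevity (the actual tool is a C#/.NET application that queries the Helper database), and the row values and the choice of (FileName, NumUsers) as the lookup key are hypothetical, for illustration only.

```python
# Hypothetical in-memory analogue of the AllTestCaseLookUp table.
# Field names mirror Table 1; the values below are made up.
LOOKUP_ROWS = [
    {"ProjectID": 7, "TestCaseType": "BP STD", "NumUsers": 1,
     "FileName": "glent", "OrigTestCaseID": "1101",
     "TestCaseID": 4200, "TestCaseName": "GL Entry"},
]

def build_lookup(rows):
    """Index the lookup rows by (FileName, NumUsers), the way the tool
    builds its in-memory lookup table before parsing."""
    return {(r["FileName"], r["NumUsers"]): (r["TestCaseID"], r["TestCaseName"])
            for r in rows}

def find_test_case(lookup, file_name, num_users):
    """Resolve a raw result file to an ABench test case ID and name;
    returns None when the file has no registered test case."""
    return lookup.get((file_name, num_users))
```

A file that is not registered in the lookup table yields no test case, which is the situation the system reports as invalid data during validation.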
atabase. Any database schema and models designed for this system should be flexible enough to handle a wide variety of performance test types.
5. Data format. The data must be set up in a way that makes reporting from it easier than the current methods.
6. Interface. The interface must be easy to use and not cumbersome to set up. Implicit in this requirement is that the user should only have to specify a minimal amount of setup information each time results are imported.
7. System environments. The most popular environments are:
   - Windows Server 2003. Developers can also use the Windows XP environment for development and testing.
   - .NET 2.0 Framework
   - SQL Server 2005

2.3 Format of Test Results

As part of the input requirement for this system, the raw test results must be stored in a consistent, agreed-upon format, which includes folder structure, folder naming convention, and file structure. These files contain raw data that needs to be parsed and summarized into useful data.

2.3.1 XML Test Results

The test results should be stored in a two-level folder structure, as shown in Figure 1. The parent folder is the physical location for this test run. The subfolder contains the XML test files for a certain scenario or module, depending on the application type. The subfolder name should end with the number of users and the string "User" to indicate the user load. For example, the folder name APTrx1User means the scenario or module name tested is
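The subfolder naming convention described in this section can be sketched with a short parsing routine. This is an illustrative Python sketch, not part of the actual C# tool; the function name and return shape are assumptions for illustration.

```python
import re

def parse_subfolder_name(name):
    """Split a results subfolder name such as 'APTrx1User' into the
    scenario/module name and the simulated user count.

    Per the naming convention, the trailing '<number>User' encodes the
    user load; everything before it is the scenario or module name.
    Returns (scenario, users), or None if the name does not follow
    the convention.
    """
    match = re.fullmatch(r"(?P<scenario>.+?)(?P<users>\d+)User", name)
    if match is None:
        return None
    return match.group("scenario"), int(match.group("users"))
```

For example, `parse_subfolder_name("APTrx1User")` yields the scenario name `"APTrx"` with a user load of 1, while a folder that lacks the suffix is rejected rather than guessed at.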
ed in the accompanying Compact Disk.
........ vii

Figure 39. User Interface: Select Application, OS, and Tier
Figure 40. User Interface: Add or Select Build Number
Figure 41. User Interface: Select Action on Existing Build Number
Figure 42. User Interface: Enter Test Results Folder and Select Files
Figure 43. User Interface: Start Process
Figure 44. User Interface: Close the Software

viii

LIST OF TABLES

Table 1. AllTestCaseLookUp Database Table
Table 2. AllTestCasePerfCounters Database Table

ACKNOWLEDGMENTS

I would like to express sincere thanks to the many people who have contributed to the completion of this study. I would like to thank Dr. Wen-Chen Hu for his support and guidance through the duration of this work. Many thanks to Shawn Hanson, Mark Dowell, and Russ Brown for their encouragement, advice, and assistance on this project. Special thanks go to my family for their constant love, understanding, and many sacrifices, without which this work would have been impossible.

ABSTRACT

Data compatibility and presentation are always a major issue of computer science. This project was conducted at the Microsoft Corporation for a complex business application, Dynamics GP (formerly Microsoft Great Plains). It is a part of the efforts of publishing the performance te
eve the maximum value of a field in a database table. The inputs are the TableName string and the FieldName string. The output is an integer, or 0 if the result is not found. Sample usage is GetMax("TestCaseTable", "TestCaseID").

    // Retrieve the Max value of FieldName in TableName
    public int GetMax(string TableName, string FieldName)
    {
        DbCommand getMax = _db.GetSqlStringCommand(
            "SELECT Max(" + FieldName + ") FROM " + TableName);
        object result = _db.ExecuteScalar(getMax);
        if (result == null || result.ToString() == string.Empty)
            return 0;
        else
            return int.Parse(result.ToString());
    }

In this method, _db is the database connection. GetSqlStringCommand is a method that creates a database command from an in-line Transact-SQL query string. ExecuteScalar queries the database and returns the result to the object result. If result is not null or an empty string, it is parsed to an integer and finally returned.

CHAPTER V
TESTING AND VERIFICATION

A system that cannot be trusted to work correctly has no value. This means that the programs must function correctly and the results that come back are valid and complete. Software verification is the set of activities that ensure that software correctly implements a specific function and meets the customer's requirements. Testing plays an extremely important role in verification. It should intentionally attempt to find problems, for example
ework: version of the .NET Framework
- <event>: event information for this particular iteration. The tags inside the <event> tag include:
  - <time>: time at which this iteration of the test case was recorded
  - <type>: type of entry
  - <message>: description message
  - <iteration>: iteration index
  - <data>: total response time for this iteration

2.3.2 TXT Test Results

The test results should be stored in a one-level folder structure. The folder is the physical location for all the TXT files of this test run. No subfolder is allowed. The file name should end with the number of users to indicate the user load. For example, the file name glent7.txt means the scenario or module name is glent and seven users are simulated. The raw data are saved in delimited plain text files. Each line in the files contains the following data:
- TestID: test ID assigned to a test run
- UserID: user ID index
- EventID: a unique ID used to identify a test case
- Time: time at which the results are logged
- MSTime: number of milliseconds since the benchmark started
- Type: flag to show start or stop (0 = start, 1 = stop)
- RowID: row index

The following is an example of the contents from a result text file:

TestID  UserID  EventID  Time                 MSTime  Type  RowID
100     2       1101     1/1/1900 5:56:15 PM  18760   0     1
100     1       1101     1/1/1900 5:56:15 PM  18766   0     2
100     4       1101     1/1/1900 5:56:15 PM  18769   0     3
100     7       1101     1/1/1900 5:56:15 PM  18775   0     4
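The TXT record layout above lends itself to a simple line parser that pairs each start entry (Type 0) with the matching stop entry (Type 1) to derive elapsed milliseconds from MSTime deltas. The sketch below is illustrative Python, not the tool's C# parser; the tab separator is an assumption, since the report only says the files are delimited plain text.

```python
from collections import namedtuple

# Field order follows the report: TestID, UserID, EventID, Time,
# MSTime, Type (0 = start, 1 = stop), RowID.
Row = namedtuple("Row", "test_id user_id event_id time ms_time type_ row_id")

def parse_line(line, sep="\t"):
    """Parse one delimited result line into a typed Row.
    The tab separator is an assumption for this sketch."""
    f = line.rstrip("\n").split(sep)
    return Row(int(f[0]), int(f[1]), int(f[2]), f[3],
               int(f[4]), int(f[5]), int(f[6]))

def elapsed_times(rows):
    """Pair each start (Type 0) with the next stop (Type 1) for the
    same user and event, returning elapsed milliseconds computed
    from the MSTime deltas."""
    open_starts, results = {}, []
    for r in rows:
        key = (r.user_id, r.event_id)
        if r.type_ == 0:
            open_starts[key] = r.ms_time
        elif key in open_starts:
            results.append(r.ms_time - open_starts.pop(key))
    return results
```

Keying on (UserID, EventID) matters because, as the sample shows, several simulated users log interleaved rows for the same event within one test run.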
he service functions.
- Create parser: This function takes the parser type information from the user input data object and creates a parser object accordingly.
- Create lookup table: This function gets the application type from the user input data object. It then creates a lookup table in memory by querying the Helper database.
- Parse files: This function takes input from the file information data objects and then uses the lookup table to find test case information, including test case IDs and names. It parses through every test file, does the necessary calculations on the raw data, and summarizes all the results of this test run into a new test case results data object. It displays messages to the user if needed and records running information to the log file.

[Figure 7. Level 3 Data Flow Diagram: Parse test files]

- Upload results: Figure 8 represents the level 3 DFD for this process, which can be divided into two service functions. This process takes the test case results data object from the previous process. It validates the data and uploads valid data to the ABench database tables. The final outputs are inserted rows in the database. Running information is rec
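The "create parser" service function above is a small factory: it maps the parser type from the user input to a concrete parser. The report implements this with C# interfaces (IParser, with XMLParser and TXTParser implementations); the sketch below is an illustrative Python analogue, and the class and function names are assumptions for this sketch.

```python
class TxtParser:
    """Stand-in for the TXTParser class; parser_type mirrors the
    ParserType property on the IParser interface."""
    parser_type = "TXT"

class XmlParser:
    """Stand-in for the XMLParser class."""
    parser_type = "XML"

# Registry mapping the user-supplied parser type to a parser class.
PARSERS = {"TXT": TxtParser, "XML": XmlParser}

def create_parser(parser_type):
    """Instantiate the parser matching the requested type, or raise
    for an unsupported type (the tool would report this to the user)."""
    try:
        return PARSERS[parser_type.upper()]()
    except KeyError:
        raise ValueError("Unknown parser type: " + parser_type)
```

A registry like this also hints at the future-work item of supporting additional formats: adding an Excel parser would mean registering one more class rather than changing the control flow.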
ill initialize these tables from the user interface can be added to the system. The current system can only process test results in TXT and XML formats. It would be more beneficial if test results in other formats, such as Excel spreadsheets, could also be processed.

REFERENCES

[1] Martin Fowler and Kendall Scott, UML Distilled: A Brief Guide to the Standard Object Modeling Language, second edition, Addison-Wesley, 2000.
[2] Carlo Ghezzi, Mehdi Jazayeri, and Dino Mandrioli, Fundamentals of Software Engineering, second edition, Prentice Hall, 2002.
[3] Roger S. Pressman, Software Engineering, sixth edition, McGraw-Hill Science/Engineering/Math, 2005.
[4] Ramez Elmasri and Shamkant B. Navathe, Fundamentals of Database Systems, third edition, Addison-Wesley, 2000.
[5] Product information for Microsoft Dynamics GP. Retrieved November 10, 2006, from http://www.microsoft.com/dynamics/gp/product/productoverview.mspx
[6] Christian Nagel, Bill Evjen, Jay Glynn, Morgan Skinner, Karli Watson, and Allen Jones, Professional C# 2005, third edition, Wrox, 2005.
[7] Juval Lowy, Programming .NET Components, second edition, O'Reilly Media, 2005.
[8] Cem Kaner, Jack Falk, and Hung Q. Nguyen, Testing Computer Software, second edition, Wiley, 1999.
[9] Erik T. Ray, Learning XML, second edition, O'Reilly Media, 2003.

APPENDICES

APPENDIX A
User Manual

This user manual provides instructions on how to set up and use the Data Consolidation and Importing
n> value for this application type is True.
- The user needs to input an integer value in the box. The value is used to check whether the iterations processed for a test case match the expectation.
- Input data validation enabled.

[Figure 11. User Interface: Folder Browser Dialog]

8. Test Results Folder Button
- The user clicks this button to open a Folder Browser Dialog, as shown in Figure 11. The dialog is used to select the test results folder by navigating to the location and then clicking OK.
- The selected full path from the Folder Browser Dialog will be shown in the TextBox control (9) next to this button.

9. Test Results Folder TextBox
- The user has two options to populate this box with the full path of the test results folder:
  i. Click on the button in front of this box and select from the folder browser dialog, as described previously.
  ii. Write the full path in the box.
- Input data validation enabled.

10. Show Files / Select Files Button
- This control depends on the contents of control 9:
  i. Enabled if the Test Results Folder TextBox is not empty.
  ii. Disabled otherwise.
- The name depends on data from the configuration file:
  i. If the <alwaysParseAllFiles> value for this a
[Figure 27. Class Level Class Diagram: TxtTestFile, TxtTestItem, and XMLSingleUserItem — TxtTestFile holds Tests and SingleUserTests collections; TXTParser uses TxtTestItem, which has TxtTestSingleUserItem instances with fields such as _saveMax, _saveMean, _saveMin, _saveTimes, _userID, _valuePairs, _windowMax, _windowMean, _windowMin, _windowTimes and methods Calculate and isPairOfZeroOne]

[Figure 28. Class Level Class Diagram: IParser, XMLParser, TXTParser, LookUpTable, and ParserHelper — IParser defines the Message and ParserType properties and the ProcessData method, implemented by XMLParser and TXTParser; LookUpTable provides GetTestCaseIDNameByParserType, GetTestCaseIDNameByTXT, GetTestCaseIDNameByXML, and GetTriggersByFilename; ParserHelper provides GetTestsFromXML, LoadDelimitedTxtToDataTable, SaveTestCasesCollectionToXML, and SaveTestsToXML]

Loader
nterface design aims to create an effective communication medium between the user and the system. The design begins with identification of user, task, and environmental requirements. After the functionality analysis and modeling, user scenarios are created to define a set of interface objects and actions. Based on the interface objects and actions, layouts of the interface elements are generated, such as menus, icons, buttons, etc. This system uses a Windows-based graphical user interface, as shown in Figure 9. The interface was designed using Microsoft Visual Studio 2005, which provides rich user interface features for the Microsoft Windows operating system. The user interface is easy to use and set up. The main part of the interface is a Windows Form that prompts the user for additional information for menu items. It has the following components, Windows controls, and functionalities:

[Figure 9. User Interface: Main Entry — the main form with Application, OS, and Tier settings, a "Save above settings as default" option, Add or Select Build, Append/Overwrite actions on an existing build, Expected # of Iterations, Test Results Folder with Show Files, and Process and Close buttons]

1. Application ComboBox
- The control is for Select only.
- Items are retrieved from the database.
- The user must select an existing
ontext: Undefined. Click the arrow to show or hide the Baseline and Tier options.

[Figure 35. A Screenshot of ABench Webpage — a results chart with an elapsed-time axis from 2400 ms to 6000 ms]

There are 2 tests that match the criteria:

TestCaseName                         Type    Owner   Pri  PerfCounter        Priority  Goal  Result  Status
3000010 BP Admin UI Load Home 1 User BP STD  gholma  3    Elapsed Time (ms)  3         6000  2495    None
3000290 BP Admin UI Load Home 5 User BP STD  gholma  3    Elapsed Time (ms)            6500  2683

[Figure 36. A Screenshot of ABench Webpage — the PerfCounter Results window for TestCase 3000010, BP Admin UI Load Home 1 User, PerfCounter 1, Elapsed Time (ms), with Context, RequestQueueID, Repetition, and Perf Value columns]

CHAPTER VI
CONCLUSION AND FUTURE WORK

In this project, a data consolidation and importing software was designed and developed. It is a database-driven, multi-tier Windows application using C# and .NET programming. The project is a part of the efforts of publishing the performance test results of different applications to the ABench web reporting system. The main functionalities of this system are:
- Parse XML results and TXT results according to the application types and predefined configurations.
- Process and save the parsed data into the remote ABench SQL database.
orded in the log file and messages are displayed for the user.

[Figure 8. Level 3 Data Flow Diagram: Upload results]

- Validate and convert data: This function gets inputs from the user input data object and the test case results data object. It loops through the test case results to validate the data by querying the ABench database tables. It calls a private method to convert valid data into a data object ready for database uploading.
- Upload data: This function uploads the valid data data object to the ABench database tables. Depending on the user's need, it may overwrite or append the records in the tables for the current build number. Information is logged to file and displayed to the user.

3.3.4 Conclusion of Data Flow Diagrams

These three levels of data flow diagrams give clear insights into the system design in a top-down approach. Detailed design and implementation for each functional component can start from the level 3 DFDs. Please note that there might be a variety of smaller functional units to support each service function, such as methods for database access, searching, and sorting. These functional units are the smallest building blocks of the system; however, they are too detailed to be included in data flow diagrams.

3.4 User Interface Design

User i
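The validate-and-upload flow described for the "Upload results" process can be sketched in miniature. This is an illustrative Python sketch, not the C# Loader: the known-ID set stands in for querying the ABench tables, and the in-memory list stands in for a database table, so the function names and data shapes are assumptions.

```python
def validate_results(test_cases, known_ids):
    """Split parsed test case results into valid and invalid sets,
    mirroring the 'validate and convert data' check of parsed IDs
    against the ABench tables before upload."""
    valid, invalid = [], []
    for tc in test_cases:
        (valid if tc["TestCaseID"] in known_ids else invalid).append(tc)
    return valid, invalid

def upload(valid, table, overwrite=False, build=None):
    """Append valid rows to an in-memory 'table'. With overwrite=True,
    drop existing rows for the same build first, mirroring the
    append/overwrite choice the user interface offers for an
    existing build number. Returns the number of rows uploaded."""
    if overwrite:
        table[:] = [row for row in table if row.get("Build") != build]
    table.extend(valid)
    return len(valid)
```

Invalid rows are kept separate rather than discarded silently, matching the system's behavior of reporting skipped test cases to the user and the log file.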
pplication type is true, the name is "Show Files".
  ii. If the <alwaysParseAllFiles> value for this application type is false, the name is "Select Files".
- Clicking it will parse the folder specified in the Test Results Folder TextBox (control 9) and populate the Files ListBox below (control 11) with all eligible files found in the folder.

11. Files ListBox
- Disabled by default. Enabled after clicking the Show Files / Select Files Button (control 10).
- Lists all eligible files found in the test results folder.
- The user needs to select files to process:
  i. If control 10 is "Select Files", the user can select individual files to process (see Figure 12). If the user does NOT select any file in the list, ALL files will be processed by default.
  ii. If control 10 is "Show Files", the user cannot select individual files to process; ALL files will always be processed.

[Figure 12. User Interface: Select Files in List — the Files ListBox showing files such as benefit.txt, chxdump.txt, deduction.txt, department.txt, and endclosedump.txt]

12. File viewer
- Opens up when the user double-clicks a file name in the ListBox (control 11).
- Displays the contents of the file in a spreadsheet format for the user's convenience. Figure 13 shows an example. The file used in the example is glent.txt.
process via the User Interface. Figure 3 demonstrates the Use Case Diagram for this project.

[Figure 3. Use Case Diagram — use cases: Input application name; Input operating system; Input execution tier; Input result folder; Input application settings; Input build number; Specify input files. Actor: User]

3.3 Data Flow Diagrams

A Data Flow Diagram (DFD) shows the flow of data from external entities into the system and how the data moves from one process to another. A DFD may be partitioned into levels that represent increasing information flow and function detail. For this project, three levels of DFDs are developed. Figures 4 to 8 show the DFDs using the Gane and Sarson notation [1], which includes four symbols:
- Squares representing external entities
- Bubbles representing processes, which take data as input and output
- Arrows representing the data flows
- Open-ended rectangles representing data stores, such as databases or XML files

3.3.1 Level 1 Data Flow Diagram

The level 1 data flow diagram is also called the context model, or a fundamental system model. The whole system is considered as one process. As shown in Figure 4, the system takes four kinds of input data:
- User input: The user provides application settings, build number, and test file information.
- Helper database: A helper database is needed for the system to store additional data, such as lookup table and performance counter information. The system needs to acce
riented programming and development environment [6].

1.5 Report Organization

The organization of this report is as follows:
- Chapter II describes the requirements and specifications of this project.
- Chapter III contains the high-level and low-level design.
- Chapter IV focuses on the implementation of this project.
- Chapter V is testing and verification.
- Chapter VI includes the conclusion and future directions.
- Appendix A includes the user manual.
- The source code is stored in the CD-ROM attached as Appendix B.

CHAPTER II
REQUIREMENTS AND SPECIFICATIONS

A requirements specification describes the user's needs for this system. It serves as an agreement between the end user and the developer; it is viewed as a definition of what the implementation must achieve. The performance test team, as the end user of this system, provided the requirements and specifications.

2.1 Requirements

The users have the following functional requirements for the data consolidation system:
1. Provide a unified method of processing test results. The results are processed by different tools depending on the application type in the current system.
2. Publish results to the ABench web system with minimal manual intervention. Only Excel spreadsheets are created for reporting in the current system.
3. The system should be adaptable to other applications that use similar output formats.

After integration with the ABench website, the whole system should achieve the following goals:
1. Di
river, ObjectXMLSerializer, TestCaseCollection, TestCaseItem, MachineItem, Event, XMLTestFile, XMLTestItem, XMLTestEvent, TxtTestFile, TxtTestItem, TxtTestSingleUserItem

[Figure 16. Package Level Class Diagram: Control Flow and Data Objects]

[Figure 17. Package Level Class Diagram: Parse Test Files — IParser, XMLParser, TXTParser, ParserHelper, LookUpTable]

[Figure 18. Package Level Class Diagram: Upload Results — Loader, LoaderDataAccess]

4.1.2 Class Level Class Diagrams

Figures 19 to 31 demonstrate the class level class diagrams. The class functionalities and relationships are displayed in detail. Classes that are generated automatically by Visual Studio 2005 are not described here; these include MainForm, Program, Resources, and Settings.

[Figure 19. Class Level Class Diagram: ToolDataAccess — fields: _db; methods: GetApplicationTypeDataSet, GetBuildDataSet, GetMax, GetOSDataSet, GetTierDataSet; used by MainForm]

[Class-level class diagram (truncated): ToolHelper, ApplicationSettings, and IXMLObjectSerializer — methods include GetApplicationSettingsByType, GetDefaultSettings, GetNumUserFromTXTFileName, GetNumUserFromXMLSubFolderName, InitializeComboBox, LoadFileInfoToDataTable, LoadSubfolderToDataTable, SaveDefaultSettings, serializer_UnknownAttribute, and serializer_UnknownNode]
run (default value).
  ii. Click on Overwrite to overwrite the results of the previous test run.

[Figure 41. User Interface: Select Action on Existing Build Number]

- Click on the Test Results Folder Button to locate the test results folder, or write the full path in the Test Results Folder TextBox, as shown in Figure 42.
- Click on the Show Files / Select Files Button to populate the Files ListBox below with all eligible files found in the folder, as shown in Figure 42.
- Select files from the Files ListBox to process. Do not select any file if all files are to be processed, as shown in Figure 42.

[Figure 42. User Interface: Enter Test Results Folder and Select Files — the ListBox lists files such as H16649_logTimes.txt, H16658_logTimes.txt, and H16722_logTimes.txt]

- Click on the Process Button to start the process, as shown in Figure 43.

[Figure 43. User Interface: Start Process]

- After processing, click on the Close Button to close the software, as shown in Figure 44.

[Figure 44. User Interface: Close the Software]

APPENDIX B
Source Code

The source code is not printed. It is stor
rver 2005 is used for the backend database solution. Transact-SQL statements are used for database queries and scripts. The complete source code is in the attached CD. This chapter discusses implementation details by giving class diagrams, data models, and data accesses.

4.1 Class Diagrams

Class diagrams are used to describe a group of classes in a system and their relationships, such as containment, inheritance, and associations [2]. A class represents an entity of a given system that provides an encapsulated implementation of certain functionality [3]. In C#, classes are composed of three things: a name, attributes that include fields and properties, and some methods to fulfill the functionalities. This system consists of thirty classes. Among them, four classes are generated automatically by the VS development environment to start a Windows Forms application. The remaining classes are explained in detail in the class diagrams.

4.1.1 Package Level Class Diagrams

Classes that are either similar in nature or related are grouped in a package. This provides better readability for complex class diagrams. There are four class packages in this system, as shown in Figures 15 to 18.

[Figure 15. Package Level Class Diagram: Process User Input — MainForm, Program, Resources, Settings, ToolHelper, ToolDataAccess, ApplicationSettings, ApplicationItem, DataTable_viewer, StatusForm, UserInputs]

Control flow and data objects: D
splays results from performance test runs in an easy-to-read-and-interpret format.
2. Allows comparisons between results and goals or previous test runs.
3. Minimizes manual processing of raw data from test execution to publication.
4. Shows product performance over time.

The business justifications for this system are specified as follows:
1. Improved productivity for the performance test team. In other words, the system should help the team to cover more tests.
2. Consistency in reporting for the program team.
3. A single point of reference for all performance test results.

2.2 Specifications

The system specifications are discussed in the following list:
1. Input
- Be able to process TXT and XML performance test results.
- Be able to specify application type, build number, and testing environment information, such as operating system, data server, web server, etc.
- The user must have the following two options, according to the application type:
  a. Process all test results for a test run.
  b. Select and process multiple test results.
2. Output
- Upload processed results to the ABench database. Must be able to view test reports on the ABench website.
- Report errors and warning messages during the process.
- Create a log file at the end of the process.
3. Security: Must work within the constraints of an isolated environment. Performance tests are primarily run within their own networks; it is important that no interaction with outside domains be required.
4. D
ss these data to successfully parse the test files.
- Configuration file: This file contains default settings for this system, as well as configurations for each application type.
- Test files: Test results to be processed.

[Figure 4. Level 1 Data Flow Diagram]

And there are three kinds of output data:
- Configuration file: The system can save the current application settings as defaults into the configuration file.
- ABench database: Processed data are uploaded to the ABench database.
- Log file: The system saves the running record of this process to a log file, such as the number of files parsed, the number of data rows uploaded, etc.

3.3.2 Level 2 Data Flow Diagram

Since there is only one process shown in the level 1 DFD, the algorithm applied to transform the input to the output is unclear. We can partition the level 1 DFD into a level 2 DFD to reveal more detail, as shown in Figure 5.

[Figure 5. Level 2 Data Flow Diagram]

There are three processes in this Data Flow Diagram:
- Process user input, which includes:
  - Retrieve data from ABench database tables about available application types, operating systems, execution tiers, and build numbers.
  - Extract default application settings from the configura
ssage; processing is not started.
5. Good test results file settings, including correct folder structure, correct naming convention, and file format. The system processes the files and gives correct results. An example will be given in the next section.
6. Bad test results file settings. The system processes the files and displays information to the user. For example, if there is a piece of invalid test data in the XML file, the system displays the following table:

ErrorMessage:  TestCaseID Not exist in DB
ActionTaken:   TestCase skipped
TestCaseID:    3005890
TestCaseName:  BP EDD GR Cust Setting CustomerSetupTypes

[Figure 31. Table Display for an Invalid Test Data]

5.2 Acceptance Testing

Acceptance testing provides evidence that the system works with real-world data. The system is tested extensively using real test files, including both XML and TXT files. The following example gives a description of the system environments, the data used, and the results obtained.
1. System environments. They include:
- Windows XP Professional operating system
- .NET Framework 2.0
- Intel 1.66 GHz CPU
- 1.0 GB RAM
2. Database settings. The settings include:
- Database server: local machine (Sony Vaio laptop)
- Instance name: (local)\SQLEXPRESS
- Security: Windows authentication
- Database name: ABench
3. Test files information:
- Test results folder: C:\Qiang\project_implementation
st results of a project to a web-based reporting system, ABench. The proposed system tries to extract and process information from various sources and save the data into the ABench SQL database with minimal manual processing. The proposed system is a database-driven, multi-tier Windows application using C# and .NET. It includes four major components:
1. Graphical user interface: It lets users set up and run the system after entering the necessary information.
2. Configuration & verification unit: It performs the following tasks:
- Checks file availability, database connectivity, default application settings, etc.
- Checks predefined data in database tables.
- Launches the data parser.
3. Data parser: It includes two functions:
- Parses and processes data.
- Saves data by using a consistent format.
4. Data loader: It uploads the processed data to a database.

This project requires knowledge of both the different performance testing results and the ABench SQL Server database. Many design and programming features are used to develop a generic tool, such as dynamic instantiation. In addition to software design and development, the following tasks are also critical to this research:
- Design and implement algorithms for processing the raw testing data.
- Establish precise mappings from the test results to the data schemas and models used by the ABench database.
- Design and implement a helper database for storing configuration information, lookup
[Figure 13. User Interface: File Viewer Example — glent.txt displayed in a spreadsheet grid with columns TestID, UserID, EventID, RecordingTime, Value, StartStop, and RowID]

13. Process Button
- Disabled by default. Enabled after the Files ListBox (control 11) has been populated, which means there are files to process.
- Click to start the process, which includes:
  i. Validates input data.
  ii. Launches the backend process driver.

14. Close Button
- Closes the main Windows Form and terminates the system.

In addition to the controls in the main form discussed above, a few message boxes are used to display warnings, errors, progress status, etc., when necessary. For example, Figure 14 shows an error message for an invalid input value of Expected Iterations.

[Figure 14. User Interface: Message Box Example — "Expected iterations can not be empty"]

CHAPTER IV
IMPLEMENTATION

After completing the high-level and low-level design, the functional modules are implemented by using C# and the .NET Framework. The development environment is Visual Studio (VS) 2005, and SQL Se
tion file.
  - Display the above information on the User Interface and take user input.
  - Process and pass information to the next process.
- Parse test files, which includes:
  - Get information from the previous process.
  - Get information from the Helper database and parse the test files.
  - Pass the processed data to the next process.
  - Record running information to the log file.
- Upload results, which includes:
  - Get processed data from the previous process.
  - Upload data to the ABench database.
  - Record running information to the log file.

Please note that the level 2 DFD does not show the details about the data flows and transforms between the processes.

3.3.3 Level 3 Data Flow Diagrams

The level 2 DFD can be further partitioned into level 3 DFDs for each process. In these level 3 DFDs, the internal data objects used to transfer information between processes are explained.
- Process user input: This process is further broken down into four service functions, as shown in Figure 6. There are two kinds of output data for this process: the file information data object and the user input data object. These two data objects will be transferred to the next process.

[Figure 6. Level 3 Data Flow Diagram: Process user input — service functions: Initialize user interface, Get user input, Parse folder and save file info, Save user inputs]
