Wednesday, July 31, 2019

Database Development Essay

Abstract

This paper defines the phases of the Software Development Life Cycle (SDLC), specifically the Waterfall method, and reviews tasks that improve the quality of datasets throughout the cycle. It recommends actions that optimize record selection and enhance performance based on data quality assessment. Although a high degree of optimization can be reached during the SDLC itself, continued maintenance must be performed to keep the database error-free and protected, so three maintenance plans and their associated activities are evaluated. The paper then analyzes methods for planning concurrency control and lock granularity that minimize potential security risks. Finally, the serializability isolation model is introduced, which ensures that transactions produce less record-level locking while the system is in operation, and a verification method is described that allows review of proper inputs and error checks to increase consistency.

Introduction

Several Software Development Life Cycle methods are available, but the Waterfall SDLC is among the most widely used because of its simplicity and straightforward, sequential structure, and it is the model discussed throughout this paper. Its benefits include departmentalization and managerial control: a schedule can be set for each phase, much as a factory moves a product from one step to the next until it is complete. However, once the project reaches the testing phase it is difficult to go back and make additional changes (SDLC Models, n.d.).

Tasks to Improve Dataset Quality Using SDLC Methodology

The Waterfall SDLC incorporates the following stages of planning and executing software: requirements specification, design, implementation, testing, and maintenance. The requirements phase ensures that requirements are clearly defined by all parties involved in the process. Deliverables in this stage include a requirements document, containing descriptions of the requirements, diagrams, and references to supporting documentation, as well as a Requirements Traceability Matrix (RTM), which maps the components being developed to the requirements and to components that have already been developed. When requirements are properly defined, this phase lays the groundwork for dataset integrity throughout the rest of the SDLC (The Software Development Cycle (SDLC), n.d.). The design phase describes software features in detail with pseudocode, entity-relationship models (ERMs), hierarchy diagrams, layout hierarchies, tables of business rules, a full data dictionary, and business process diagrams; it transforms the requirements into system design specifications. In this phase it is important to review software and hardware specifications and the system architecture, because the design creates the foundation for the implementation phase. Finally, the implementation phase begins the coding process, in which portions of the programs are developed and tested. Clearly defined requirements are expressed as use-case scenarios, which provide context-based definitions and a visualization of the completed product that can be checked for clarity, accuracy, and completeness (SDLC Models, n.d.).
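To make the traceability idea above concrete, the following is a minimal sketch of an RTM represented as a small Python structure. The requirement IDs, design elements, and test names are hypothetical and exist only to illustrate the shape of the matrix; a real RTM would be populated from the project's requirements documents.

```python
# Minimal sketch of a Requirements Traceability Matrix (RTM).
# All identifiers below are hypothetical illustrations, not project data.
rtm = {
    "REQ-001": {"description": "Reject records with blank key fields",
                "design": ["input-completeness-check"],
                "tests": ["test_blank_fields_rejected"]},
    "REQ-002": {"description": "Prevent duplicate customer records",
                "design": ["unique-customer-constraint"],
                "tests": []},  # not yet covered by a test case
}

def untraced(matrix):
    """Return requirement IDs that lack a design element or a test case."""
    return [req for req, links in matrix.items()
            if not links["design"] or not links["tests"]]

print(untraced(rtm))  # ['REQ-002'] -> a gap to close before the testing phase
```

Reviewing such gaps at the end of each phase is one way the matrix keeps later phases aligned with the requirements defined up front.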
Actions to Optimize Record Selection and Improve Database Performance

Actions to optimize record selection and improve database performance include automated controls applied during the design phase of the SDLC. In this phase it is important for developers to define input, processing, and output controls that enhance the integrity, security, and reliability of the system and its datasets. Input controls such as completeness checks and duplication checks ensure that blank fields and duplicate information are not entered into the data sets, while automated process controls ensure that the system processes and records information correctly (FFIEC IT Examination Handbook InfoBase – Design Phase, n.d.). Quality management techniques that improve data quality assessment include error detection, process control, and process design; these techniques detect missing values, reduce recurring errors, and help optimize efficiency (Even & Shankaranarayanan, 2009).

Three Maintenance Plans and Three Activities to Improve Data Quality

Three types of maintenance plans that improve data quality are preventative, corrective, and adaptive maintenance; the corresponding activities include database backups, integrity checks, and index optimization. Preventative maintenance involves creating and continuously maintaining daily and/or weekly backups for data loss prevention. Corrective maintenance ensures that system errors are corrected; one corrective activity is resolving deadlocks, which occur when two or more tasks permanently block each other. Adaptive maintenance enhances system and database performance, for example by acting on utility assessments and optimizing queries (Coronel, Morris, & Rob, 2013). A short sketch of the backup and integrity-check activities appears after this section.

Methods for Planning Proactive Concurrency Control and Lock Granularity

Concurrency issues revolve around conflicts that arise when simultaneous tasks are performed on multiple systems; such conflicts can produce inconsistent results. The goal of concurrency control is to obtain consistent throughput and accurate results from concurrent operations. Granular locking schemes permit locking at the level of pages, tables, rows, and cells. The methodologies covered in "Process-centered Review of Object Oriented Software Development Methodologies" fall outside the scope of concurrency and lock granularity; however, two approaches, high-granularity (fine-grained) locking and low-granularity (coarse-grained) locking, can keep a distributed database consistent. Fine-grained locking offers maximum concurrency but requires more overhead, whereas coarse-grained locking minimizes overhead but reduces concurrency. Accepting the additional overhead of locking granularly at different levels of the object hierarchy supports proactive concurrency control within the system and provides additional security by controlling which users may modify the database at the same time (Ellis, n.d.).

System Analysis to Ensure Transactions Do Not Record-Level Lock the Database in Operation

In a multiuser database, transactions that execute simultaneously must produce consistent results, so it is vital to control concurrency and consistency. A transaction isolation model called serializability provides this control: it gives the illusion that transactions execute one at a time.
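Returning to the maintenance activities listed above, the following is a minimal sketch of two preventative-maintenance steps, an integrity check followed by an online backup. It uses the SQLite module from the Python standard library purely as a stand-in for whatever database engine is actually in production, and the file names are hypothetical.

```python
import sqlite3

SOURCE = "inventory.db"         # hypothetical production database file
BACKUP = "inventory_backup.db"  # hypothetical backup destination

src = sqlite3.connect(SOURCE)

# Integrity check: PRAGMA integrity_check returns the single value 'ok'
# when no corruption is detected; anything else calls for corrective
# maintenance before this file is trusted as a backup source.
status = src.execute("PRAGMA integrity_check").fetchone()[0]
print("integrity check:", status)

if status == "ok":
    # Online backup: Connection.backup() copies the database page by page
    # while the source remains available to other users.
    dst = sqlite3.connect(BACKUP)
    src.backup(dst)
    dst.close()

src.close()
```

Run on the daily or weekly schedule described above, these two steps keep a verified copy of the database available should corrective maintenance ever be required.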
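As a concrete illustration of the serializable, snapshot-style behavior described above, the sketch below uses SQLite in WAL mode; this engine choice is an assumption made only so the example runs with the standard library, since the cited sources discuss Oracle and SQL Server Compact rather than SQLite. A reader that has started a transaction continues to see its original snapshot while a concurrent writer commits, so the writer does not have to wait on a record-level lock held by the reader. The table and file names are hypothetical, and the PRIMARY KEY and NOT NULL constraints also encode the duplication and completeness checks discussed earlier as declarative rules.

```python
import sqlite3

DB = "parts.db"  # hypothetical file name for this sketch

# Set up a small table; the constraints double as input controls:
# PRIMARY KEY rejects duplicate part numbers, NOT NULL rejects blank quantities.
setup = sqlite3.connect(DB)
setup.execute("PRAGMA journal_mode=WAL")  # readers and writers do not block each other
setup.execute("""CREATE TABLE IF NOT EXISTS part (
                     part_no TEXT PRIMARY KEY,
                     qty     INTEGER NOT NULL)""")
setup.execute("INSERT OR REPLACE INTO part VALUES ('P-100', 5)")
setup.commit()
setup.close()

reader = sqlite3.connect(DB)
writer = sqlite3.connect(DB)

# The reader opens a transaction; its first SELECT establishes a snapshot.
reader.execute("BEGIN")
before = reader.execute("SELECT qty FROM part WHERE part_no = 'P-100'").fetchone()

# A concurrent writer updates the same row and commits. In WAL mode this
# succeeds without waiting for the reader's transaction to finish.
writer.execute("UPDATE part SET qty = qty + 1 WHERE part_no = 'P-100'")
writer.commit()

# Inside its transaction the reader still sees the original snapshot,
# so both SELECTs return the same value.
after = reader.execute("SELECT qty FROM part WHERE part_no = 'P-100'").fetchone()
print(before == after)  # True
reader.rollback()
reader.close()
writer.close()
```

This is the behavior the next paragraph describes as the multiversion consistency model: each user works from a consistent view of the data, and readers never hold record-level locks that block writers.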
The multiversion consistency model complements this by giving each of multiple concurrent users a separate, consistent view of the data, which prevents record-level locking from affecting the database (Data Concurrency and Consistency, n.d.). Once updates are committed, a verification routine such as the SQL Server Compact Verify method with a VerifyOption argument can be used to confirm the integrity of the data entered, which enhances system effectiveness (SqlCeEngine.Verify Method (VerifyOption), n.d.).

Conclusion

The material discussed includes an analysis of specific tasks that improve the quality of datasets within a database, framed by the Software Development Life Cycle (SDLC) and, more specifically, the Waterfall methodology. Recommended design-phase actions that enhance the optimization of record selection are considered, along with three maintenance plans and associated activities for improving data quality. The serializability isolation model ensures that transactions produce less record-level locking while the system is in operation, and verification methods allow review of proper inputs and error checks to increase consistency. Overall, the research shows that the usefulness of a multiuser distributed database depends on the specific functions built in from the origination of the product in the SDLC through the finished product, together with continued maintenance for consistent and efficient performance.

References

Data Concurrency and Consistency. (n.d.). Oracle Documentation. Retrieved September 12, 2013, from http://docs.oracle.com/cd/B10500_01/server.920/a96524/c21cnsis.htm

Ellis, R. (n.d.). Lock granularity. Granularity of locks and degrees of consistency in a shared database. Retrieved September 12, 2013, from www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=9&sqi=2&ved=0CF8QFjAI&url=http%3A%2F%2Fpages.cs.wisc.edu%2F~nil%2F764%2FTrans%2FGranularity.ppt&ei=kQ8yUqOhPIzl4AOM6oDIDw&usg=AFQjCNEdfijo3XG83N7W2WlglSi3cEJsQQ&sig2=WGLffPJ8amqYRjHXJAHLuQ&bvm=bv.52109

Even, A., & Shankaranarayanan, G. (2009). Quality in customer databases. ACM Computer Database, 15, 3-5. Retrieved September 12, 2013, from the ACM Computer database.

FFIEC IT Examination Handbook InfoBase – Design Phase. (n.d.). FFIEC IT Examination Handbook InfoBase. Retrieved September 12, 2013, from http://ithandbook.ffiec.gov/it-booklets/development-and-acquisition/development-procedures/systems-development-life-cycle/design-phase.aspx

Rob, P., & Coronel, C. (2002). Database systems: Design, implementation, and management (5th ed.). Boston, MA: Course Technology.

SDLC Models. (n.d.). One Stop QA. Retrieved September 12, 2013, from www.onestopqa.com/resources/SDLC%20Models.pdf

SqlCeEngine.Verify Method (VerifyOption) (System.Data.SqlServerCe). (n.d.). MSDN, the Microsoft Developer Network. Retrieved September 12, 2013, from http://msdn.microsoft.com/en-us/library/cc835509%28v=vs.100%29.aspx

The Software Development Cycle (SDLC). (n.d.). Pelican Engineering. Retrieved September 13, 2013, from www.pelicaneng.com/DevDocs/sdlc.pdf
