Issues in Software Methodologies - Essay Example

Summary
The paper "Issues in Software Methodologies" highlights that data migration has been and also will remain a necessity rather than a choice for organizations generating and dealing with data. It is now a routine IT operation undertaken by organizations…
Issues in Software Methodologies and Implementation

By _____________________________
Submitted to ____________________________

Acknowledgement

Research is like a never-ending relay race, where the baton of knowledge is passed on from researcher to researcher. I acknowledge all the previous researchers' efforts that have enabled me to write this report. Without the stepping stones laid by them, I could not have reached this level. I sincerely thank all of them. I also thank my mentor, Dr. _____, who has been my lighthouse. In the ocean of knowledge, not knowing where to start and when to stop, I would have ended up miles away from my desired destination without her direction. I would like to thank her for teaching me how to navigate through the ocean of knowledge. I am equally overwhelmed by the unflinching support of all my other professors, lecturers, administrative staff and library staff at the _____________ University, and I thank them profusely. The slightest disarrangement on their part would have distracted my focus from the research.

Author

Contents
Acknowledgement
Table of Figures
Image credit
List of Tables
Section 1: The Dilemma of Using Software Methodologies and the Implications for Object-Oriented Software Projects
1.1 Background
1.2 Concept and definition
1.3 Current Situation
1.3.1 Industry/Academic Considerations
1.3.2 Global Considerations
1.4 Conclusion
1.5 Recommendations
Section 2: The effects of legacy systems on future systems design and implementations
2.1 Background
2.2 Concept and Definition
2.3 Current situation
2.3.1 Effect of legacy systems on the future of system design and implementation
2.3.2 Modernisation
2.4 Conclusions
2.5 Recommendations
Section 3: Issues surrounding data migration
3.1 Background
3.2 Concept and Definitions
3.2.1 Concept
3.2.2 Definition
3.2.3 Migration Process
3.2.4 Data migration vs. database migration
3.3 Current Situation
3.3.1 Data migration issues
3.4 Conclusions
3.5 Recommendations
Bibliography of Section 1
Bibliography of Section 2
Bibliography of Section 3
Appendix 1: Bad Engineering Properties of Object-Oriented Languages

Table of Figures
Figure 1: Cardelli's five-point bad engineering properties of object-oriented languages
Figure 2: Mansfield's seven deficiencies of OOP
Figure 3: Problem areas encountered during the present E-commerce project
Figure 4: Characteristics of a legacy system
Figure 5: Four reasons why systems become legacy
Figure 6: Likely circumstances under which data migration is necessitated
Figure 7: The butterfly approach provides a phase model with five steps
Figure 8: The three-stage data migration strategy as suggested by IBM
Figure 9: Problems with data migration

Image credit
Front page: http://www.arri.de/uploads/pics/software-updates-header-image.jpg

List of Tables
Table 1: Top 10 issues surrounding data migration

"Think object-oriented programming (OOP) is the only way to go? You poor, misguided soul.
OOP is just the latest in a history of ideas that sound good in theory but are clumsy in practice."
– Richard Mansfield, author of 38 computer books

Section 1: The Dilemma of Using Software Methodologies and the Implications for Object-Oriented Software Projects

"Software is not written, it is re-written" – popular software adage

1.1 Background

Object-oriented programming (OOP) emerged in the computer software arena as early as the 1960s, when "data abstraction, polymorphism and modularisation were being applied to the procedural paradigm" (Cardelli, 1996). By the 1980s, research predicted that what structured programming was to the 1970s, object-oriented programming would be to the 1980s (Booch, 1986). Object-oriented software development prevailed not only throughout the 1980s but also through the next two decades. Yet as the globalised software market becomes ever more competitive, with increasing pressure to shorten development cycles and improve productivity, an important dilemma has engulfed software developers across the world: can object-oriented programming stand the test of time? The present section deals with the issues pertaining to using software methodologies and the implications for object-oriented software projects.

1.2 Concept and definition

Booch (1986) defined object-oriented development as "a partial-lifecycle software development method in which decomposition of a system is based upon the concept of an object" (Booch, 1986, p.211). In his paper "Object-Oriented Development", Booch argued that object-oriented technology is fundamentally different from the traditional functional approaches with respect to designing, serving and managing massive software-intensive systems (Booch, 1986). Nearly a decade and a half later, Montlick (1999) supported Booch's argument, suggesting that "object oriented software is all about objects" (Montlick, 1999). Defining an object as a "black box" designed to receive and send signals, Montlick (1999) suggests that a black box has two components: (1) code, the sequence of computer instructions, and (2) data, the information on which the instructions operate. In object-oriented programming the code and the data are bundled to form the "black box"; this bundling is termed encapsulation or information hiding, and a user of an object should never peek inside the box (Montlick, 1999). Explaining why one should not be tempted to break this primary rule of object-oriented programming, Montlick notes that "some of the costliest mistakes in computer history have come from software that breaks when someone tries to change it" (Montlick, 1999).

1.3 Current Situation

After a bull run of nearly three decades, during which it enjoyed huge popularity through commercial programming languages such as C++, Java, Smalltalk/V, Visual Smalltalk and VisualAge, object-oriented programming appears to be losing ground.

1.3.1 Industry/Academic Considerations

Many scholars and professional software developers now believe that there is little quantitative evidence to support the notion, developed over the decades, that object-oriented technology provides a solution to the pressure on software development productivity (Potok et al., 1999). Earlier, in his paper "Bad Engineering Properties of Object-Oriented Languages", Luca Cardelli (1996) had cited five problem areas, as depicted in Figure 1 below; the full details are reproduced in Appendix 1.
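To make the "black box" description from Section 1.2 concrete before turning to these criticisms, the short sketch below bundles data and the code that operates on it behind a small public interface. It is a minimal illustration only: the hypothetical PolicyAccount class, its fields and its figures are invented for this example and are not drawn from the cited sources or from the E-commerce project discussed later.

// A minimal illustration of encapsulation ("information hiding"):
// the data (balance) and the code that operates on it are bundled
// inside one object, and callers interact only through its signals
// (the public methods), never with the internals directly.
public class PolicyAccount {

    // Hidden data: not visible to users of the object.
    private double balance;

    public PolicyAccount(double openingBalance) {
        this.balance = openingBalance;
    }

    // Public "signals" the black box accepts.
    public void pay(double amount) {
        if (amount <= 0) {
            throw new IllegalArgumentException("Payment must be positive");
        }
        balance += amount;
    }

    public double getBalance() {
        return balance;
    }

    public static void main(String[] args) {
        PolicyAccount account = new PolicyAccount(100.0);
        account.pay(25.0);
        // account.balance = -1;  // would not compile: the data is hidden
        System.out.println("Balance: " + account.getBalance());
    }
}

Callers interact only through the public methods; any attempt to reach the hidden balance field directly is rejected by the compiler, which is the information hiding Montlick describes.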
Figure 1: Cardelli's five-point bad engineering properties of object-oriented languages. Source: Cardelli, L., 1996. Bad Engineering Properties of Object-Oriented Languages. [Online] Digital Equipment Corporation, Systems Research Center. Available at: http://lucacardelli.name/Papers/BadPropertiesOfOO.html [Accessed 06 October 2010].

Potok et al. (1999) argue that most studies of object-oriented productivity fail to consider the business process and culture under which the software is developed. In his white paper "Has OOP Failed?", Mansfield (2005) opines that computer programming is in serious difficulty because of what he calls the "quasi-religious cult" of object-oriented programming (OOP). He argues that, because of this cult, software development productivity has taken a back seat (Mansfield, 2005). Mansfield cites seven reasons to support his argument:
1. OOP's benefits are not unique
2. Runaway linguistic inflation
3. Dozens of terms where one will do
4. Bloated source code
5. OOP makes many simple tasks a struggle
6. Inconsistent grammar
7. OOP's features do not deliver what they promise

Further, as Mansfield (2005) asks in his blog, "why does OOP generate problems it must then deal with later?" A considerable contradiction is revealed between the OOP practices of encapsulation and inheritance. On the one hand, to keep the code bug-free, procedures and sometimes even data are encapsulated from other programmers; on the other hand, during inheritance the same programmers are asked to inherit, modify and reuse code that is meant to be opaque to them. They can see what goes in and out, yet have to remain unaware of what happens inside. Although OOP offers a feature to deal with this, Mansfield asks, "but why does OOP generate the problem in the first place?"

Figure 2: Mansfield's seven deficiencies of OOP. Source: Mansfield, R., 2005. OOP Is Much Better in Theory Than in Practice. [Online] QuinStreet Inc. Available at: http://www.devx.com/DevX/Article/26776 [Accessed 06 October 2010].

As Mansfield says, "All this leads to the familiar granularity paradox in OOP: should you create only extremely small and simple classes for stability (some computer science professors say yes), or should you make them large and abstract for flexibility (other professors say yes). Which is it?" (Mansfield, 2005). Moreover, the strong argument in favour of OOP, code reusability, does not hold good, since one can reuse code without OOP, many times by simply copy-pasting it. Finally, since most programming is done by individuals, hiding code from the programmer himself does not make sense (Mansfield, 2005). With all these problems, issues and situations engulfing OOP, it appears beyond doubt that an alternative to OOP could alleviate the industry's and academia's agonies and improve productivity. Some of the common problem areas encountered with OOP are shown below.

Figure 3: Problem areas encountered during the present E-commerce project

In the present project, an E-commerce solution for an auto brokerage insurance service provider, the company looks to maximise sales of insurance policies through a cost-effective solution and a minimal workforce.
Although the scope of the project is limited to transforming the company's current website to support sales and renewal of insurance policies, giving customers a comparison of different insurance companies' policy rates together with the services and options available, the many dilemmas of object-oriented software projects enumerated above suggest that undertaking the project with other approaches, especially procedure-based programming and 4GL (fourth-generation) languages that are deliberately constructed to resemble natural human language, needs to be considered. The idea is to ensure that, in taking on OOP's problems, the company does not get into bigger difficulties.

1.3.2 Global Considerations

IT now pervades the whole world. With more and more IT applications being built with OOP, Mansfield's concern appears relevant: with thousands of programmers indoctrinated every year into what he calls a quasi-religious cult, a solution may take many more decades.

1.4 Conclusion

In nearly five decades of OOP's evolution, many problems have been detected and solved, yet many issues remain to be sorted out. In the present context of providing an E-commerce solution to the auto insurance service provider, several problems have been encountered that need to be addressed, such as following the execution flow. Step-by-step debugging is a difficult proposition for verifying execution flow; it is no longer practical when, as planned in the present project, multiple threads run over the same code. As Intel's Shelly (2008) points out in his blog, "You single step one thread and another completes 5000 loops in the background. If you have multiple threads going over the same function they might all stop on the same breakpoint and you have no way of telling which is which effectively. Following execution flow today is a terrible problem" (Shelly, 2008). It is therefore time to think about alternatives and move beyond OOP. As Mansfield, one of OOP's most vociferous opponents, puts it, "In sum: like countless other intellectual fads over the years ("relevance," communism, "modernism," and so on--history is littered with them) OOP will be with us until eventually reality asserts itself. But considering how OOP currently pervades both universities and workplaces, OOP may well prove to be a durable delusion. Entire generations of indoctrinated programmers continue to march out of the academy, committed to OOP and nothing but OOP for the rest of their lives" (Mansfield, 2005).

1.5 Recommendations

Software may fail, or fall short of customer satisfaction, for many reasons. It is recommended that appropriate system methodologies be implemented and applied alongside object-oriented software, which will improve the management of information and data flow within the organisation.

Section 2: The effects of legacy systems on future systems design and implementations

2.1 Background

The designation "legacy system" is often associated with "the most inflexible and unmaintainable sort of application developed by the most unenlightened programmers for an obsolete mainframe computer" (Weisert, 2002). Ironically, maintaining and upgrading these legacy systems is one of the most critical challenges facing the IT fraternity (Sucharov & Rice, n.d.).
Studies conducted by the Gartner Group indicate that the primary concern among IT managers in mid-sized IT companies is integration with legacy systems. Sucharov & Rice (n.d.) of Erudine, in their white paper "The Burden of Legacy", opine that "large and complex IT projects will turn into legacy systems and that such collapse are inevitable" (Sucharov & Rice, n.d., p.3). By way of reasons, the authors argue that this pessimistic view stems from the fact that the elements that turn software into a legacy system have not been fully addressed. Besides, new programming techniques and strategies offer only small, incremental improvements in the size and complexity of new projects prior to an inescapable collapse (Sucharov & Rice, n.d., p.3). This view is seen somewhat differently in many quarters of the industry. For instance, Weisert (2002) asserts that in spite of many decades of continuous breakthroughs in software development methodologies, many companies are taken by surprise, and even disappointed, to find that "they have replaced those old applications with expensive new software" which is equally expensive to maintain (Weisert, 2002). Sucharov & Rice (n.d.), on the other hand, believe that "the real issues of legacy are all believed to be associated with the 'mental models' built up by the system developers and maintainers. Once a system becomes too complex for the maintainers to understand fully, or when the people who understand it leave the project, the system becomes a legacy system" (Sucharov & Rice, n.d., p.3). There is, however, unanimity amongst software vendors, academic programs, fad methodologists and contract development firms that the legacy system is a burden, and every stakeholder is responsible for this alarming situation (Weisert, 2002). Meanwhile, more legacy applications are still being developed as the debate continues.

2.2 Concept and Definition

Weisert (2002) states that, even though there is no officially accepted definition, a system is characterised as a legacy system when (1) its user interfaces are unfriendly and error-prone, (2) it is poorly documented, (3) its programs are disorganised, inflexible, difficult to understand and very expensive to change, and (4) its database contains many inconsistencies and redundancies (Weisert, 2002).

Figure 4: Characteristics of a legacy system. Source: Weisert, C., 2002. Legacy Systems of the Past, Present, and Future. [Online] Information Disciplines, Inc., Chicago. Available at: http://www.idinews.com/legacySys.html [Accessed 8 October 2010].

Sucharov & Rice (n.d.), on the other hand, state that the most accepted characterisation of legacy systems is that they "are hard to modify or update", and that a legacy system is a bespoke application that processes large or complex data and has become cumbersome and hard to maintain (Sucharov & Rice, n.d., p.3). The authors explain that "a legacy system is not necessarily an application that has been written 10 years ago in COBOL" (Sucharov & Rice, n.d., p.3); on the contrary, a legacy system may even be an application that has only just been delivered by an external service provider (Sucharov & Rice, n.d.). Sucharov and Rice (n.d.) offer four reasons why systems become legacy: (1) developers leave, (2) code complexity increases, (3) hardware and software become obsolete, and (4) systems are inherited or commissioned.

Figure 5: Four reasons why systems become legacy. Source: Sucharov, T. & Rice, P., n.d. The Burden of Legacy. [Online] Erudine. Available at: http://www.erudine.com/downloads/whitepaper_legacyburden.pdf [Accessed 08 October 2010].
2.3 Current situation

Ferguson (2005) states that while legacy systems present formidable challenges for IT executives, they also contain organisations' vital competitive advantages in the form of customer transactions, dealings with partners and internal working processes. Often the investment in legacy systems has been made over many years, and these systems therefore constitute the IT foundation of organisations (Ferguson, 2005). Nonetheless, more and more business organisations are coming to view legacy systems as a liability since they (1) limit the organisation's agility and ability to compete over time, (2) become more expensive to maintain and upgrade, (3) cannot be extended to new applications, interfaces and web services, and (4) at some point the complexity of the problems they pose can become unmanageable, leading to collapse of the entire system (Ferguson, 2005).

2.3.1 Effect of legacy systems on the future of system design and implementation

The effect of legacy systems on the future of system design and implementation is huge. By one estimate, nearly 80 percent of IT platforms currently run on legacy systems (Sucharov & Rice, n.d.). Two further estimates are noteworthy: according to International Data Corp., nearly ten thousand large mainframes run over two hundred billion lines of legacy code, while the Hurwitz Group estimates that ninety percent of companies have not "fully integrated their mission critical business processes" (Sucharov & Rice, n.d., p.3). Citing industry polls, Sucharov and Rice suggest that legacy systems currently claim as much as sixty to ninety percent of IT budgets for system operation and maintenance (Sucharov & Rice, n.d.). Legacy systems also affect future system design and development because companies are reluctant to move away from them: (1) they run mission-critical business processes, (2) they offer excellent accuracy, (3) they have run for many years, allowing time for errors to be fixed, and (4) replacement or modification of these systems may introduce initial bugs, which could prove disastrous for mission-critical processes (Sucharov & Rice, n.d.).

2.3.2 Modernisation

Ferguson (2005) is of the opinion that legacy systems can be modernised to Java, C# or .NET-compliant systems, which offer a superior alternative to emulation, screen scraping and replacement. Sooner or later, companies that depend on iSeries systems will find that modernisation is the only option.

2.4 Conclusions

The problem of legacy systems is here to stay unless organisations make serious efforts to get past them; indeed, the term "future legacy system" is already in use. Yet solutions do exist, including traditional approaches such as (1) producing plenty of documentation, (2) improving design patterns, (3) improving system architecture, (4) employing powerful software development processes, (5) employing reverse engineering tools, (6) refactoring, and (7) test-first development, a flavour of which is sketched below (Sucharov & Rice, n.d.).
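As a hedged illustration of the test-first point above (item 7), the sketch below pins down the observed behaviour of a hypothetical legacy routine with a characterisation test before any modernisation work touches it. The routine, its rules and the expected values are all invented for this example; in practice a test framework such as JUnit would normally be used, and the expected values would be captured from the running legacy system rather than from a specification.

// A hand-rolled characterisation test: before changing a legacy routine,
// record what it currently does so that any modernisation (refactoring,
// re-platforming) can be checked against the same observed behaviour.
public class LegacyPremiumCalculatorTest {

    // Stand-in for a legacy routine whose rules are no longer documented.
    static double legacyPremium(int driverAge, boolean priorClaims) {
        double premium = 500.0;
        if (driverAge < 25) premium *= 1.8;
        if (priorClaims)   premium += 150.0;
        return premium;
    }

    static void check(String name, double expected, double actual) {
        if (Math.abs(expected - actual) > 0.001) {
            throw new AssertionError(name + ": expected " + expected + " but got " + actual);
        }
        System.out.println(name + " passed");
    }

    public static void main(String[] args) {
        // Expected values would be captured by running the existing system,
        // not derived from any specification.
        check("young driver, no claims", 900.0, legacyPremium(22, false));
        check("older driver, claims",    650.0, legacyPremium(40, true));
    }
}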
As Ferguson states, "Regardless of the path they choose, carefully considering the four pillars of business--people, processes, applications, and data--will assure that they balance strategic business benefit with rational cost concerns to meet competitive pressures and enhance agility and market strength" (Ferguson, 2005, p.4).

2.5 Recommendations

The following recommendations are based on Ferguson's (2005) five-point path to modernising legacy systems:
1. Emulate the system on other systems.
2. Rewrite and replace the existing systems.
3. Scrape the display file DDS to provide a WebFaced solution.
4. Convert the legacy source code to Java or C#.
5. Convert the source code with an alternative compiler that supports familiar syntax and techniques while producing modern code applications.

Source: Ferguson, A., 2005. System Migration: Supporting the Four Pillars of Legacy Systems Through Modernization. [Online] MC Press Online, LP. Available at: http://www.asna.com/WhitePapers/Anne%20Ferguson%20MCPress%20Piece%20on%20Modernization.pdf [Accessed 08 October 2010].

Section 3: Issues surrounding data migration

3.1 Background

Data migration, one of the core data management activities, has been practised since the dawn of computing (Dhumne, 2009). Data migration, the process of moving application code along with the underlying data platform from one database to another, is an expensive affair with serious administrative issues that may pose many application development challenges. Typically, data migration is necessitated by technological redundancy or by competitive "win back" migrations, where competing database vendors try to win back their opponents' customers. Usually, incompatibilities between the source and target database platforms make data migration a costly effort. These incompatibilities may arise from "unique product specific extensions to support the language, procedural logic, DDL and administrative interfaces" (Xin et al., 2010, p.1426). The cost of migration, which the customer has to bear, depends upon the degree of compatibility between the platforms. This happens in spite of the existence of international standards for database definitions and language processing. The irony is that these migrations seldom "occur without development cost due to large number of non-standard language and interfaces extensions that many database vendors provide" (Xin et al., 2010, p.1426). Hence, when an application coded to operate on a particular source database has to be migrated to a newer target platform, a number of the features used by the source database are unsupported in the target database, which necessitates appropriate changes to the application or the database before the migration can begin (Xin et al., 2010). This section deals with the concept and definition of data migration and with the current situation and issues surrounding data migration, followed by conclusions and recommendations.

3.2 Concept and Definitions

3.2.1 Concept

Discussing the concept of data migration in his paper "Towards the Industrialization of Data Migration: Concepts and Patterns for Standard Software Implementation Projects", Haller (2009) explains that he was motivated by the efforts of the Swiss banks when they replaced their core-banking information systems. To enable this, the banks were required to migrate their data (such as accounts) from the old system into the new one (Haller, 2009).
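To ground the idea of moving such data from an old system into a new one, the sketch below copies account rows from a source database to a target, reshaping them on the way. It is only an illustrative sketch: the JDBC connection URLs, the table and column names and the cents-to-decimal conversion are assumptions made for this example, not details taken from Haller's paper or from any bank's actual systems.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

// Minimal copy-and-transform sketch: read account rows from the old
// system, reshape them to fit the new schema, and insert them into the
// target. Connection URLs, table and column names are hypothetical.
public class AccountMigration {
    public static void main(String[] args) throws Exception {
        try (Connection source = DriverManager.getConnection("jdbc:olddb://legacy-host/corebank");
             Connection target = DriverManager.getConnection("jdbc:newdb://new-host/corebank")) {

            try (Statement read = source.createStatement();
                 ResultSet rows = read.executeQuery(
                         "SELECT acct_no, cust_name, bal_cents FROM OLD_ACCOUNTS");
                 PreparedStatement write = target.prepareStatement(
                         "INSERT INTO accounts (account_id, holder, balance) VALUES (?, ?, ?)")) {

                int migrated = 0;
                while (rows.next()) {
                    // Transformation step: the old system stores balances in
                    // cents; the new schema expects a decimal amount.
                    write.setString(1, rows.getString("acct_no"));
                    write.setString(2, rows.getString("cust_name").trim());
                    write.setDouble(3, rows.getInt("bal_cents") / 100.0);
                    write.executeUpdate();
                    migrated++;
                }
                // A simple record count supports later reconciliation.
                System.out.println("Accounts migrated: " + migrated);
            }
        }
    }
}

The count printed at the end hints at the reconciliation step stressed later in this section: the number of rows written to the target should be checked against the number read from the source.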
Haller (2009) argues that data migration is a necessity, but not necessarily a business catalyst. The consequence of data migration is invariably pressure on costs, which has to be managed effectively through an efficient software development process along with industrialisation of the development (Haller, 2009).

3.2.2 Definition

John Morris, the author of "Practical Data Migration", opines that data migration is a practitioners' topic, i.e. very few publications exist (Morris, 2010). Many definitions have been proffered by practitioners; however, perhaps the most widely accepted is the one developed by IBM: "data migration is the process of making an exact copy of an organization's current data from one device to another device—preferably without disrupting or disabling active applications—and then redirecting all input/output (I/O) activity to the new device" (IBM Global Technology Services, 2007).

3.2.3 Migration Process

According to IBM, there is a variety of circumstances under which organisations may need data migration. These include (1) server or storage technology replacement or upgrade, (2) server or storage consolidation, (3) relocation of the data centre, and (4) server or storage equipment maintenance, including workload balancing or other performance-related maintenance (IBM Global Technology Services, 2007).

Figure 6: Likely circumstances under which data migration is necessitated. Source: IBM Global Technology Services, 2007. Best practices for data migration. White Paper. IBM.

Haller (2009) suggests that, whatever the reason for data migration, the process follows the butterfly approach, undertaken in five steps: (1) analysis, (2) development of the data mapping, (3) building up a sample data set in the target system, (4) migration of the system components to the target system without any data, and (5) step-by-step migration (Haller, 2009).

Figure 7: The butterfly approach provides a phase model with five steps. Source: Morris, J., 2006. Practical Data Migration. Swindon: The British Computer Society.

IBM, on the other hand, offers a three-stage data migration methodology, as shown in the figure below.

Figure 8: The three-stage data migration strategy as suggested by IBM. Source: IBM Global Technology Services, 2007. Best practices for data migration. White Paper. IBM.

3.2.4 Data migration vs. database migration

To differentiate between the two often-confused IT terms, data migration and database migration, Haller (2009) explains: "In data migration, application-related artefacts like triggers must not be migrated. Instead, data might have to be transformed to fit into the new systems' database schema. In contrast, migrating a database (e.g. from Microsoft SQL Server to Oracle) demands not only to copy the data but also all application-related artefacts (triggers, constraints etc.)" (Haller, 2009, p.63).

3.3 Current Situation

Dhumne (2009) suggests that, although it is part of core data management activity, data migration is often the lowest-priority item on IT managers' list of things to do, which results in poor-quality data in the target system. Dhumne states that, according to Bloor Research (September 2007), an estimated 84 percent of data migration projects fail (Dhumne, 2009). Dhumne argues that the impact of data migration project failures can be substantial and may include breakdown of target systems, poor data quality in the target environment, loss of business opportunities and cost overruns (Dhumne, 2009).
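Many of these failures trace back to data-quality surprises that only surface at load time. The sketch below shows the kind of lightweight pre-migration profiling check practitioners describe, counting the values in a supposedly numeric, mandatory column that would violate those assumptions; the JDBC URL, table and column names are invented for illustration and are not taken from the surveys cited in the next subsection.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Lightweight pre-migration profiling: scan a source column that the
// target schema expects to be numeric and mandatory, and count how many
// values would actually break that assumption. Table and column names
// are hypothetical.
public class PreMigrationProfiler {
    public static void main(String[] args) throws Exception {
        try (Connection source = DriverManager.getConnection("jdbc:olddb://legacy-host/crm");
             Statement stmt = source.createStatement();
             ResultSet rows = stmt.executeQuery("SELECT policy_number FROM OLD_POLICIES")) {

            int total = 0, nullOrBlank = 0, notNumeric = 0;
            while (rows.next()) {
                total++;
                String value = rows.getString("policy_number");
                if (value == null || value.trim().isEmpty()) {
                    nullOrBlank++;                 // mandatory field left empty
                } else if (!value.trim().matches("\\d+")) {
                    notNumeric++;                  // "integer" column holding characters
                }
            }
            System.out.printf("Profiled %d rows: %d null/blank, %d non-numeric%n",
                    total, nullOrBlank, notNumeric);
        }
    }
}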
3.3.1 Data migration issues

According to Softek's 2005 worldwide data migration survey of 700 IT personnel, "60 percent of respondents stated that they migrate data quarterly or more often – with 19 percent migrating weekly" (IBM Global Technology Services, 2007, p.4). In the same survey, more than 75 percent admitted that they had experienced problems during data migration. The report suggested that even routine processes can cause problems for IT administrators. The problems, grouped under four broad categories, include but are not limited to (1) extended or unexpected downtime, (2) data corruption, missing data or data loss, (3) application performance issues, and (4) technical compatibility issues (IBM Global Technology Services, 2007).

Figure 9: Problems with data migration. Source: IBM Global Technology Services, 2007. Best practices for data migration. White Paper. IBM.

At a more detailed level, Kevin Allen, a respondent commenting on 02 August 2007 on the blog of John Morris, author of "Practical Data Migration", identified the ten most common issues he came across while performing data migration, as placed in the box below.

Table 1: Top 10 issues surrounding data migration
1. Lack of data profiling and analysis - i.e. not realising that 10% of the integers are actually characters, or that mandatory fields contain nulls or spaces, etc.
2. Fixing data problems - having a practical plan for what to fix and what not to - e.g. solving referential integrity problems, entering mandatory data, etc.
3. Lack of specifications for transformations, especially when considering data profiling discoveries (e.g. 0/1, M/F, Male/Female all need mapping).
4. Insufficient planning / time allocated to the migration project.
5. Insufficient testing prior to the actual data migration.
6. Hot switch-over plan for volatile data. It is easy to migrate historic data that is not changing, but migrating bank account data while transactions are still coming in is a lot harder!
7. Merging duplicate data - when more than one data source is being migrated into a new single database, it is amazing how often there is no plan for identifying and resolving duplicates.
8. Creation of identifiers - especially when the target is a vendor package, the decision on how to allocate identifiers and whether/how to keep the old ones. Resist the temptation of intelligent keys!
9. Lack of reconciliation - queries are easy to get wrong. E.g. 1: a home and work address h1, w1 are migrated and suddenly there are two home addresses (h1, w1) and two work addresses (h1, w1). E.g. 2: 10% of records get dropped due to a mismatching key field. Even simple record counts can prevent these types of mistake.
10. Poor design of the new system - e.g. types and codes often need extra values to cope with Unknown or Undefined. Data profiling can often uncover requirements for the new system or package; doing profiling early enough can avoid having to change the system after it has already been built.

Source: Morris, J., 2010. Top Ten Data Migration Issues. [Online] Available at: http://www.bcs.org/server.php?show=conBlogPost.73 [Accessed 08 October 2010].

3.4 Conclusions

Data migration has been, and will remain, a necessity rather than a choice for organisations generating and dealing with data. It is now a routine IT operation undertaken by organisations. Even so, in many operations it causes considerable trouble, sometimes even leading to IT disasters.
Mismanagement of data migration can make a serious dent in a company's revenue, including the loss of clients.

3.5 Recommendations

To obviate the issues surrounding data migration, the following points are recommended:
1. Develop a consistent and reliable methodology that enables the company to plan, design, migrate and validate the migration.
2. Identify migration software that supports the specific migration requirements, including operating systems, storage platforms and performance.
3. Prefer migration products that maintain continuous data availability during the migration without affecting performance.
4. Do not hesitate to enlist the help of an experienced consultant or vendor.

Bibliography of Section 1

Booch, G., 1986. Object Oriented Development. IEEE Transactions on Software Engineering, SE-12(2), pp.211-21.
Cardelli, L., 1996. Bad Engineering Properties of Object-Oriented Languages. [Online] Digital Equipment Corporation, Systems Research Center. Available at: http://lucacardelli.name/Papers/BadPropertiesOfOO.html [Accessed 06 October 2010].
Mansfield, R., 2005. OOP Is Much Better in Theory Than in Practice. [Online] QuinStreet Inc. Available at: http://www.devx.com/DevX/Article/26776 [Accessed 06 October 2010].
Mansfield, R., 2005. Has OOP Failed? [Online] 4js.com. Available at: http://www.4js.com/en/fichiers/b_genero/pourquoi/Has_OOP_Failed_Sept_2005.pdf [Accessed 06 October 2010].
Mehmet, A. & Lodewijk, B., 1992. Obstacles in Object-Oriented Software Development. ACM SIGPLAN Notices, 27(10), pp.341-58.
Montlick, T., 1999. What is Object-Oriented Software? [Online] softwaredesign.com. Available at: http://www.softwaredesign.com/objects.html [Accessed 06 October 2010].
Potok, T.E., Vouk, M. & Rindos, A., 1999. Productivity Analysis of Object-Oriented Software Developed in a Commercial Environment. Software – Practice and Experience, 29(10), pp.833-47.
sampleresearchproposals.blogspot.com, 2006. Report on the Dilemma of Using Software Methodologies and the Implications for Object-Oriented Software Projects. [Online] Available at: http://sampleresearchproposals.blogspot.com/2008/07/report-on-dulemma-of-using-software.html [Accessed 06 October 2010].
Shelly, A., 2008. Flaws of Object Oriented Modeling. [Online] Intel. Available at: http://software.intel.com/en-us/blogs/2008/08/22/flaws-of-object-oriented-modeling/ [Accessed 06 October 2010].
zaemis.blogspot.com, 2009. What's Wrong with OOP. [Online] Available at: http://zaemis.blogspot.com/2009/06/whats-wrong-with-oop.html [Accessed 06 October 2010].

Bibliography of Section 2

ditii.com, 2008. Future of legacy compatibility in Windows. [Online] D Technology Weblog. Available at: http://www.ditii.com/2008/02/08/future-of-legacy-compatibility-in-windows/ [Accessed 08 October 2010].
Ferguson, A., 2005. System Migration: Supporting the Four Pillars of Legacy Systems Through Modernization. [Online] MC Press Online, LP. Available at: http://www.asna.com/WhitePapers/Anne%20Ferguson%20MCPress%20Piece%20on%20Modernization.pdf [Accessed 08 October 2010].
Sucharov, T. & Rice, P., n.d. The Burden of Legacy. [Online] Erudine. Available at: http://www.erudine.com/downloads/whitepaper_legacyburden.pdf [Accessed 08 October 2010].
Vo, H.H. et al., 2007. Environment for Executing Legacy Applications on a Native Operating System. [Online] USA. Available at: http://appft1.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PG01&p=1&u=/netahtml/PTO/srchnum.html&r=1&f=G&l=50&s1=%2220080034377%22.PGNR.&OS=DN/20080034377&RS=DN/20080034377 [Accessed 08 October 2010].
Weisert, C., 2002. Legacy Systems of the Past, Present, and Future. [Online] Information Disciplines, Inc., Chicago. Available at: http://www.idinews.com/legacySys.html [Accessed 8 October 2010].

Bibliography of Section 3

Anderson, E. et al., 2007. Algorithms for Data Migration. Algorithmica, 57(2), pp.349-80.
Dhumne, S., 2009. Proven Strategies for Large-Scale Data Migration Projects. [Online] The Data Administration Newsletter, LLC. Available at: http://www.tdan.com/view-articles/11185 [Accessed 08 October 2010].
Haller, K., 2009. Towards the Industrialization of Data Migration: Concepts and Patterns for Standard Software Implementation Projects. Advanced Information Systems Engineering, 5565/2009, pp.63-78.
IBM Global Technology Services, 2007. Best practices for data migration. White Paper. IBM.
Morris, J., 2006. Practical Data Migration. Swindon: The British Computer Society.
Morris, J., 2010. Top Ten Data Migration Issues. [Online] Available at: http://www.bcs.org/server.php?show=conBlogPost.73 [Accessed 08 October 2010].
Touchstone Systems Ltd., 2010. Top 10 things to watch out for in a data migration project. [Online] Available at: http://touchstone-systems.co.uk/blog/?p=25 [Accessed 08 October 2010].
Xin, R.S. et al., 2010. MEET DB2: Automated Database Migration Evaluation. Proceedings of the VLDB Endowment, 3(2), pp.1426-34.

Appendix 1: Bad Engineering Properties of Object-Oriented Languages

Economy of execution. Object-oriented style is intrinsically less efficient than procedural style. In pure object-oriented style, every routine is supposed to be a (virtual) method. This introduces additional indirections through method tables and prevents optimizations such as inlining. The traditional solution to this problem (analyzing and compiling whole programs) violates modularity and is not applicable to libraries.

Economy of compilation. Often there is no distinction between the code of a class and the interface of a class. Some object-oriented languages are not sufficiently modular and require recompilation of superclasses when compiling subclasses. Therefore, the time spent in compilation may grow disproportionally with the size of the system.

Economy of small-scale development. This is a big win of object-orientation: individual programmers can take good advantage of class libraries and frameworks, drastically reducing their work load. When the level of ambition grows, however, programmers must be able to understand the details of those class libraries, and this turns out to be more difficult than understanding module libraries (see also the next point). The type systems of most object-oriented languages are not expressive enough; programmers must often resort to dynamic checking or to unsafe features, damaging the robustness of their programs.

Economy of large-scale development. Teams of programmers are often involved in developing class libraries and specializing existing class libraries. Although reuse is a big win of object-oriented languages, it is also the case that these languages have extremely poor modularity properties with respect to class extension and modification.
For example, it is easy to override a method that should not be overridden, or to reimplement a class in a way that causes problems in subclasses. Other large-scale development problems include the confusion between classes and object types, which limits the construction of abstractions, and the fact that subtype polymorphism is not good enough for expressing container classes.

Economy of language features. Smalltalk was originally intended as a language that would be easy to learn. C++ is based on a fairly simple model, inherited from Simula, but is otherwise daunting in the complexity of its many features. Somewhere along the line something went wrong; what started as economical and uniform ("everything is an object") ended up as a baroque collection of class varieties. Java represents a healthy reaction to the complexity trend, but is more complex than many people realize.
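As a hedged illustration of the large-scale development hazard just described, where it is easy to override a method that should not be overridden, the short sketch below shows a subclass silently breaking an invariant its superclass relies on. The classes are invented for this example and are not taken from Cardelli's paper.

import java.util.ArrayList;
import java.util.List;

// Illustration of the override hazard: the base class assumes add() is the
// only way items enter the list, but a subclass overrides addAll() in a way
// that routes around the count, silently breaking the invariant.
public class OverrideHazard {

    static class CountingList {
        protected final List<String> items = new ArrayList<>();
        protected int added = 0;

        public void add(String item) {
            items.add(item);
            added++;
        }

        public void addAll(List<String> newItems) {
            for (String item : newItems) {
                add(item);            // keeps the count correct
            }
        }

        public int addedCount() {
            return added;
        }
    }

    static class FastCountingList extends CountingList {
        @Override
        public void addAll(List<String> newItems) {
            items.addAll(newItems);   // "optimisation" that bypasses add()
        }
    }

    public static void main(String[] args) {
        CountingList list = new FastCountingList();
        list.addAll(List.of("a", "b", "c"));
        // Prints 0, not 3: the overridden method broke the superclass invariant.
        System.out.println("Added count: " + list.addedCount());
    }
}

The program prints an added count of 0 even though three items were stored, because the overriding method routed around the bookkeeping the superclass assumed.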