Computer Security Analysis Project - Case Study Example


Abstract

The core business of a computer is to create, manipulate, store and retrieve data, and the file system supports all of these tasks; more than that, it provides a way of organizing, storing, retrieving and managing information on a computer disk. The file system also holds information about users and their authorization credentials, which lets system administrators monitor and track authorized and unauthorized accesses and enforce security measures over the file system. The main challenge arises when there are varying platforms in the network, since every platform has its own file system design and implementation. A number of file system integrity tools are available for system security, including iwatch (written in Perl and based on inotify), iScanner (written in Ruby), Samhain, and Tripwire, which was designed for UNIX. This paper focuses on the design and implementation of the Tripwire file system integrity checker, which uses interchangeable signature functions to catch alterations in the file system.

Introduction

Tripwire is a program designed to help UNIX system administrators watch for file system alterations, whether additions, deletions or modifications (Kim & Spafford, 1994a). The program has a built-in capability to detect possible system intrusion by unauthorized users. The seventh release of Tripwire, possibly the last revision, is said to have reflected the success of its design objectives of being portable, scalable, configurable, extensible, manageable and malleable enough to enjoy widespread use (Tripwire Inc.). The program was developed to counter frequent Internet attacks by intruders (Kim & Spafford, 1994a).

The basic idea of Tripwire is the creation of a database of distinct identifiers, one reflecting each file that has to be monitored. Comparing the saved identifiers against freshly computed ones makes it possible for administrators to track any changes; additions and deletions are also noticeable. An early example is the checklist approach in UNIX systems, in which disk space is saved by generating a small set of values (time last modified, size and owner) from the original file (Reinhardt, 1993). The values are regenerated frequently and compared against the stored copies, and any discrepancies are followed up, although some alterations to file contents may go unnoticed. An unauthorized user who gains access to the root account can also write to the raw disk in an attempt to change the stored data without the change showing up on the checklist.

In Linux systems there are change-notification mechanisms such as inotify and dnotify that can be used to detect changes made to the file system (Cowan et al., 1998). However, they have limitations that make them unsuitable for real-time file system monitoring: apart from their inability to detect content-preserving file changes, every directory has to be opened in order to register alterations within it. Another mechanism, fschange, can instead be used to monitor file changes and gives a better description of each change rather than a bare indication that a file changed. It also reports changes made through mmap and shows when the file system is mounted or unmounted (Craig & McNeal, 2003). Its advantage is the ability to report all changes occurring at every part of the file system tree.
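To make the baseline-and-compare idea behind Tripwire concrete, the following is a minimal sketch of such a checker, not Tripwire's actual implementation: the database filename and the monitored paths are placeholders chosen for illustration, and MD5 merely stands in for whichever signature function a site would select.

```python
# Minimal sketch of a baseline-and-compare integrity check.
# NOT Tripwire itself; DB_PATH and MONITORED are illustrative placeholders.
import hashlib
import json
import os

DB_PATH = "integrity_db.json"               # hypothetical baseline database
MONITORED = ["/etc/passwd", "/bin/login"]   # hypothetical monitored files

def signature(path):
    """Return a fixed-size signature (MD5 here) of a file's contents."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_baseline():
    """Create the baseline database of per-file signatures."""
    db = {p: signature(p) for p in MONITORED if os.path.exists(p)}
    with open(DB_PATH, "w") as f:
        json.dump(db, f, indent=2)

def check():
    """Compare current signatures against the baseline and report changes."""
    with open(DB_PATH) as f:
        baseline = json.load(f)
    current = {p: signature(p) for p in MONITORED if os.path.exists(p)}
    for path in baseline.keys() - current.keys():
        print(f"DELETED:  {path}")
    for path in current.keys() - baseline.keys():
        print(f"ADDED:    {path}")
    for path in baseline.keys() & current.keys():
        if baseline[path] != current[path]:
            print(f"MODIFIED: {path}")

if __name__ == "__main__":
    if not os.path.exists(DB_PATH):
        build_baseline()
    else:
        check()
```

The first run builds the baseline database; every later run reports files that were added, deleted or modified relative to it.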
Storing a value computed from the contents of each monitored file may be the most efficient way to detect file changes under these conditions. Its sufficiency depends on the value reflecting the entire file contents and on the difficulty of matching it with a random adjustment to the file. The signature of a file can then be saved instead of the contents of the file itself. It is also important that the signature function be easy to compute and difficult to invert (Garfinkel & Spafford, 1991). According to Rivest (1992), the design of Tripwire was partly a result of the unavailability of signature functions that could work effectively within the UNIX environment.

Discussion

This paper is based on empirical findings and aims to extend research on security implementation issues. The areas discussed are presented in the sections that follow.

1. Administration issues

Administration concerns arise from heterogeneous environments in which machines run on varied hardware platforms, different operating systems, different versions of software, and different disks and peripherals, and serve different functions, such as file and mail servers. All of these add up to complexity in system administration (Garfinkel & Spafford, 1991). These machines are administered according to their specific roles and the rules dictating backups, user accounts, access and security (Aranya & Zadok, 2004). Administering such systems calls for classifying configurations into logical classes based on purpose, which maximizes configuration reuse and reduces the chance of faults.

A properly designed integrity checker has to satisfy the following conditions. It must be scalable, so that it can handle the many computers in a network. It must be flexible, so that it can handle the varying configurations arising from different hardware platforms and software releases (Edward, 1993). It should also support reuse, so as to reduce complexity and exploit the commonality among logical classes of computer systems. In short, an integrity tool must handle varied configuration settings and support reuse of configuration settings based on common characteristics (Garfinkel & Spafford, 1991).

2. Reporting issues

System administrators use integrity checker tools to monitor system trends and to notice changes in the file system: added, deleted and altered files. Reporting changed files is difficult because a large number of files are likely to change, yet only some of them are critical enough for administrators to be concerned about; these include system binaries that may carry vital information controlling system behavior (Garfinkel & Spafford, 1991). Some files change frequently, giving administrators the hard task of interpreting large volumes of data, and some of these files carry unimportant data that the administrator has no reason to examine, which increases the risk of important reports being overlooked. Where an administrator has to find the report about a malicious file buried in a large mix of changed files, "trap files" can be placed against snooping intruders: a properly configured system lets the administrators know when a trapped file has been accessed by an intruder, because its last modification time is updated (Leadly, Rich & Sirota, 1991).

A global filter should therefore be supplied to manage the reports that are generated, but the challenge lies in defining policies that remove unwanted reports from a massive mix of events. A good integrity checker should incorporate a mechanism for reporting only the files of interest (Edward, 1993).
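One way to picture such a filter is a per-path selection mask that lists which attributes are worth reporting for which files. The sketch below only illustrates the idea; the mask notation and the example paths are invented here, and Tripwire's own tw.config syntax, discussed later in this paper, is different.

```python
# Sketch of a per-path selection mask: each monitored path is paired with the
# set of attributes that matter for it, so routine churn (say, mtime on a log
# file) does not flood the report.  Paths and mask contents are illustrative.
SELECTION_MASKS = {
    "/bin/login":    {"md5", "size", "mode", "uid"},  # report any of these
    "/var/log/wtmp": {"mode", "uid"},                 # ignore content/size churn
}

DEFAULT_MASK = {"md5", "size", "mode", "uid", "gid", "mtime"}

def filtered_report(path, changed_attrs):
    """Return only the changed attributes the administrator asked to see."""
    mask = SELECTION_MASKS.get(path, DEFAULT_MASK)
    return sorted(set(changed_attrs) & mask)

# Example: content and mtime changes on the log file are suppressed,
# but a change of ownership is still reported.
print(filtered_report("/var/log/wtmp", ["md5", "mtime", "uid"]))  # -> ['uid']
```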
3. Database issues

The database is the repository of the information used to check file integrity, and it needs special attention, since malicious intruders who gain access to it can completely defeat the integrity scheme (Leadly, Rich & Sirota, 1991). System administrators try to overcome this problem by storing the database on offline media such as printouts, but usability is then compromised. Usability is better when the database is stored on the computer system itself, but it is then exposed to intruders (Edward, 1993). The most suitable choice is to keep the database on read-only media: the machine can read the data but is prevented from changing it. The administrators can then easily monitor file trends, and at the same time users can monitor their own files (Edward, 1993).

Once reports of file additions, deletions or alterations have been dealt with, the database has to be updated so that the same changes are not reflected in future reports. It is hard, however, to update a database stored on read-only media, since writing is not allowed. There is therefore a need for a mechanism to install an updated version of the database securely (Edward, 1993). Considering that file systems are highly dynamic, a mechanism for updating only the changed part of the database may be important so as not to degrade performance (Zadok & Nieh, 2000). It is also worth considering a way of enumerating every file to be updated, since a good number of files may change (King & Chen, 2003). To ensure the security and integrity of the database, no information that might help intruders should reside in it. This allows databases to be shipped along with software distribution packages whose distribution cannot be limited (Aranya & Zadok, 2004).
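As an illustration of the kind of secure-update mechanism called for above, the sketch below seals the database with an HMAC computed under a key that would be kept offline, so that a copy reinstalled from read-only or offline storage can be verified before it is trusted. This is an assumption-laden illustration, not Tripwire's own protection scheme, which, as discussed later, relies on encryption and write-protected media.

```python
# Sketch: protecting the integrity database itself with an HMAC, so a copy
# brought back from offline or read-only storage can be verified before use.
# Key handling is deliberately left out; it is assumed the key lives offline.
import hashlib
import hmac

TAG_LEN = 32  # SHA-256 output size

def seal_database(db_bytes: bytes, key: bytes) -> bytes:
    """Return db_bytes with an HMAC-SHA256 tag appended."""
    tag = hmac.new(key, db_bytes, hashlib.sha256).digest()
    return db_bytes + tag

def open_database(sealed: bytes, key: bytes) -> bytes:
    """Verify the tag and return the database contents, or raise."""
    db_bytes, tag = sealed[:-TAG_LEN], sealed[-TAG_LEN:]
    expected = hmac.new(key, db_bytes, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("integrity database has been tampered with")
    return db_bytes
```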
4. File signature issues

An appropriate choice of signature has to be made for an integrity checker. The issues that accompany the selection of a signature-generating function include the following.

i. Change detection. It is time- and resource-intensive for the administrator to always compare earlier copies of files against their possibly changed versions (Wright et al., 2002). The only advantage offered by this method is that the administrator can easily tell exactly what change occurred to a given file. A better method of change detection records a fixed-size signature of each file in the database, though this results in a many-to-one mapping: numerous (almost infinitely many) files of various sizes generate the same signature. This helps in locking out intruders, since they cannot tell which file generated the signature (Kim & Spafford, 1994a).

ii. Signature spoofing. Where other file contents can be composed that generate the same signature, intruders can alter the file system and remain undetected. Brute-force search and inversion of the signature function are the ways intruders can use to gain access and damage files (Aranya & Zadok, 2004). Brute-force search can be an easy route for an intruder when the files are small. If, for example, an intruder wants to find a duplicate signature for the /bin/login program under SunOS 4.1, it does not take much on a common 12.5 MIPS machine: with a 16-bit CRC signature, a same-length substitute can be found in about 0.42 seconds (Abadi, Lomas & Needham, 1997). This shows that small files are more susceptible to intrusion than bigger ones; the same search against a 32-bit CRC signature takes about 4 hours, although a solid knowledge of how the signatures work renders exhaustive search unnecessary, since reverse engineering will do the job. For these reasons, stronger signature functions such as the message-digest algorithms have been adopted; their outputs are large, and attempts to invert them are rendered impractical (Rivest, 1992).

iii. Empirical results. The table below shows the collision frequencies of signatures gathered from file systems at Purdue University and Sun Microsystems, Inc. (254,686 signatures). The columns give the number of collisions per signature value, with the row total in the final column. It shows that collisions are far less likely to occur with larger signatures than with smaller ones.

Frequency of signature collisions (254,686 signatures)

Signature          1      2      3      4     5     6    7    8    >9    Total
16-bit checksum    14177  6647   2437   800   235   62   12   2    1     24375
16-bit CRC         15022  6769   2387   677   164   33   5    0    0     25059
32-bit CRC         3      1      1      0     0     0    0    0    0     5
64-bit DES-CBC     1      1      0      0     0     0    0    0    0     2
128-bit MD4        0      0      0      0     0     0    0    0    0     0
128-bit MD5        0      0      0      0     0     0    0    0    0     0
128-bit Snefru     0      0      0      0     0     0    0    0    0     0
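The trend in the table can be reproduced in spirit on synthetic data. The sketch below hashes a large number of random byte strings and counts how many inputs landed on an already-used signature value for a 16-bit CRC versus a 128-bit digest. The inputs are random bytes rather than real file systems, so the absolute counts will differ from the Purdue/Sun figures; the contrast between the two signature sizes is the point.

```python
# Sketch: collision behaviour of a 16-bit CRC versus a 128-bit digest over
# many random inputs.  Expect a large number of colliding inputs for the
# 16-bit CRC and essentially none for MD5.
import binascii
import hashlib
import os
from collections import Counter

N = 254_686  # same sample size as the table above

crc16_counts, md5_counts = Counter(), Counter()
for _ in range(N):
    data = os.urandom(64)                        # one synthetic "file"
    crc16_counts[binascii.crc_hqx(data, 0)] += 1  # CCITT CRC-16
    md5_counts[hashlib.md5(data).digest()] += 1

def colliding_inputs(counts):
    """Number of inputs that shared a signature with at least one other input."""
    return sum(c for c in counts.values() if c > 1)

print("16-bit CRC colliding inputs:", colliding_inputs(crc16_counts))
print("128-bit MD5 colliding inputs:", colliding_inputs(md5_counts))
```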
5. Performance and resource issues

File comparison can be simple in the sense that a previous copy is compared against a possibly altered file, but it requires more resources, both time and space. Generating signatures may seem a better approach, but it involves more computation. Some functions may be prohibitively expensive on some systems yet execute easily on others. It is therefore up to local policy to clearly indicate which signatures and resources are worth using for the level of trust required (Reinhardt, 1993).

6. Miscellaneous issues

It would be far better if security tools were self-contained. For instance, a tool that relies on diff or sum would malfunction if those programs failed, hence the need for tools that do not rely on outside programs to perform their expected tasks (Rivest, 1992). If the database were human-readable, it would ease the work not only of alternative database readers but also of users who want to check their own files. A standalone program for applying the signature functions to an arbitrary file lets the user compare the result against the signature database (Rivest, 1992). The program should also operate without special privileges. Additionally, the tool should only report changes and never effect them, although the user may use the output of the tool to drive changes (Zadok & Nieh, 2000).

Related research and ideas

There are existing tools, but they are less capable than Tripwire. Most of them fall into two categories: integrity checkers and static audit tools. The major tools include COPS, TAMU, Hobgoblin and ATP; there is also I3FS, an in-kernel integrity checking and intrusion detection file system (Kim & Spafford, 1994b).

COPS

COPS is a static audit tool that is comprehensive and easily configurable. As a static tool, its limitation is that it does not help in detecting an intrusion but only identifies possibilities of intrusion. It is based on CRC, so updating its database is a challenge. Certain changes cannot be monitored, because the CRC check does not cover the inode structure. Reporting cannot be incorporated into the CRC check, although its output can be transformed. In general, CRC signatures are not suitable for checking file system integrity (Garfinkel & Spafford, 1991).

TAMU

TAMU refers to a set of utilities distributed by Texas A&M University (Garfinkel & Spafford, 1991). The toolkit incorporates a signature database that checks system binaries against known signatures of patched files, and an advanced network traffic analyzer that system administrators use to assess outside attacks. TAMU's critical files are compared against those stored in the database to see whether a known version exists. Its limitation is that it has to be updated whenever a new operating system or new patches are released. It also fails to provide a scanning mechanism for changes in the file system (Garfinkel & Spafford, 1991).

Hobgoblin

Hobgoblin was meant to track local trends in file systems, for example where more than one user can install or delete files. However, it is limited in detecting file changes such as additions and deletions. It also lacks an interface for storing file signatures, and it is based on the assumption that the file contents recorded in its database do not change, which makes it unsuitable for use on dynamic file systems (Leadly, Rich & Sirota, 1991).

ATP

ATP makes use of both MD5 and 32-bit CRC algorithms for file checking. The tool is more advanced in that it goes beyond just reporting changes: by means of an action list, it can automatically change the ownership of a file once a change has been detected, blocking further access to the file (Edward, 1993). However, the tool uses DES in Cipher Block Chaining mode with checksumming to detect changes and block unauthorized modification, which renders it unusable in situations where automation is of interest.

I3FS

I3FS is an in-kernel integrity checking tool (Patil, Kashyap, Sivathanu & Zadok, 2004). It is able to compare checksums in real time. The tool uses cryptographic techniques to scan for changes and performs the specific actions it has been configured to take. In contrast to Tripwire, it has a mechanism for immediately blocking affected files and notifying the administrator. I3FS is also implemented inside the kernel as a loadable module of the system. Another advantage over Tripwire is that it can be implemented on top of any file system.
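The action-list behaviour attributed to ATP and the immediate blocking performed by I3FS can both be pictured as a small reaction table consulted whenever a change report arrives. The sketch below is purely illustrative: the paths, the reaction names and the chosen reaction (dropping all permission bits) are assumptions made here, not the actual behaviour of either tool.

```python
# Sketch of an "action list" reaction to a change report: monitored paths map
# to a reaction, and a lockdown strips all permissions so the file can no
# longer be used until an administrator reviews it.  Entirely illustrative.
import os

ACTION_LIST = {
    "/bin/login": "lockdown",   # hypothetical critical binary
    "/etc/motd":  "notify",     # hypothetical low-risk file
}

def react(path, change):
    action = ACTION_LIST.get(path, "notify")
    if action == "lockdown" and os.path.exists(path):
        os.chmod(path, 0o000)                    # block further access
        print(f"LOCKED {path} after {change}")
    else:
        print(f"NOTIFY: {path} was {change}")
```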
Critical analysis of the security issues

The Tripwire integrity checker has been implemented and is already in use, and many advances have been made in an attempt to improve its functionality. It was built over a period of two months in 1992 and was taken through a testing stage by over a hundred testers, who gave promising feedback on its portability and its other supporting characteristics. A formal release of Tripwire was made in 1993 after the discovered bugs had been removed (Tripwire Inc.).

Administrative model of Tripwire

The administrative model of Tripwire revolves around portability, scalability, configurability and flexibility.

Portability: Tripwire is able to run on 28 BSD and System V variants of UNIX; it also runs on Xenix and Unicos systems (Tripwire Inc.).

Scalability: the use of an M4-like preprocessing language helps system administrators write and reuse configuration files. The inclusion of common directives such as "@@include", "@@ifdef", "@@ifhost" and "@@define" lets administrators write core files for configuring a variety of machines in a heterogeneous environment (Tripwire Inc.).

Configurability and flexibility: the configuration files can be shared by machines that have identical configurations, while each machine creates its own database file, since Tripwire distinguishes between configuration files and database files (Tripwire Inc.).

Reporting model

The tw.config file contains information about the monitored files: it lists the names of files and directories together with their associated selection masks. System administrators classify files into categories by setting the flags. Files that contradict their database entries are interpreted according to their selection masks (Garfinkel & Spafford, 1991).

Database model

Tripwire makes use of a configuration file and an output database (Tripwire Inc.). Inviolability: the database is encrypted and, to prevent tampering, is kept on a write-protected disk so that logins can be securely monitored (Tripwire Inc.). Semantics: an appropriate action is taken based on whether the file appears in tw.config and whether it appears in the old and the newly created databases; changes can be file addition, file deletion, file update, entry addition, entry deletion or entry update (Patil, Kashyap, Sivathanu & Zadok, 2004). Interface: the command line can be used to interact with Tripwire, and there is also an interactive update mode in which the user is prompted for the changes to apply to the database (Tripwire Inc.).

Signature model

There are over ten signatures supported by Tripwire for each file. The latest version includes MD5, MD4, MD2, 4-pass Snefru, 128-bit HAVAL and SHA (Tripwire Inc.). Also included are a POSIX 1003.2 compliant CRC-32 signature and a CCITT compliant CRC-16 signature.
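A sketch of the "several signatures per file" idea follows. Python's standard library does not ship Snefru, HAVAL, MD2 or MD4, so MD5, SHA-1, SHA-256 and CRC-32 stand in here; the point is only that each database entry can carry multiple independent signatures, so that spoofing one of them is not enough.

```python
# Sketch: computing several independent signatures for one file in a single
# pass, as a stand-in for Tripwire's multi-signature entries.
import hashlib
import zlib

def multi_signature(path):
    md5, sha1, sha256 = hashlib.md5(), hashlib.sha1(), hashlib.sha256()
    crc = 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            md5.update(chunk)
            sha1.update(chunk)
            sha256.update(chunk)
            crc = zlib.crc32(chunk, crc)
    return {
        "md5":    md5.hexdigest(),
        "sha1":   sha1.hexdigest(),
        "sha256": sha256.hexdigest(),
        "crc32":  format(crc & 0xFFFFFFFF, "08x"),
    }
```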
Performance

Tripwire allows flexible policies to be applied at runtime without causing any change to the configuration files.

Impacts on the security of systems

Tripwire is reported to be in use at many thousands of sites around the world, and the number of retrievals of the tool is a clear indication of its effectiveness. Several cases of Tripwire successfully detecting intrusions have been reported. The ability of Tripwire to update its database is one of the features system administrators find most useful (Tripwire Inc.). The portability of Tripwire has also earned it credit as far as security is concerned: administrators can now manage a heterogeneous environment without hassle, and intruders who might have been taking advantage of platform variance have been locked out by the success of Tripwire (Aranya & Zadok, 2004). This has greatly improved system security.

Recommendations for further research

Though the tool is regarded as successful, there are still deficiencies to be addressed. Examples are the issues concerning reporting, the database, performance and resources, administration, signatures, and tool independence, so as to ensure availability of service even when other programs malfunction.

Conclusion

Tripwire has proven to be an excellent integrity tool. It can be built using readily available tools, and it is highly portable. The program is also scalable and reusable. The tool has been used at many sites around the world, and no complaints have been raised over its failure or incompatibility.

References

Abadi, M. T., Lomas, M. A., & Needham, R. (1997). Strengthening passwords. Technical Note 1997-033. California: DEC Systems Research Center.

Aranya, A. C. P., & Zadok, E. (2004). Tracefs: A file system to trace them all. In Proceedings of the Third USENIX Conference on File and Storage Technologies (FAST 2004) (pp. 129-143).

Cowan, C., Pu, C., Maier, D., Hinton, H., Bakke, P., Beattie, S., et al. (1998). StackGuard: Automatic adaptive detection and prevention of buffer overflow attacks. In Proceedings of the 7th USENIX Security Conference.

Craig, W., & McNeal, P. (2003). Radmind: The integration of filesystem integrity checking with file system management. In Proceedings of the 17th USENIX Large Installation System Administration Conference (LISA 2003).

Edward, D. (1993). Proceedings of the Security IV Conference. Berkeley, CA: USENIX Association.

Garfinkel, S., & Spafford, G. (1991). Practical UNIX security. Sebastopol, CA: O'Reilly & Associates, Inc.

Kim, G. H., & Spafford, E. H. (1994a). Experiences with Tripwire: Using integrity checkers for intrusion detection. In Proceedings of the USENIX Applications Development Symposium. Berkeley, CA: USENIX Association.

Kim, G. H., & Spafford, E. H. (1994b). Writing, supporting, and evaluating Tripwire: A publicly available security tool. In Systems Administration, Networking and Security Conference III. Berkeley, CA: USENIX Association.

King, S., & Chen, P. (2003). Backtracking intrusions. In Proceedings of the 19th ACM Symposium on Operating Systems Principles (SOSP '03).

Leadly, S., Rich, K., & Sirota, M. (1991). Hobgoblin: A file and directory auditor. Rochester, NY: University Computing Center, University of Rochester.

Patil, S., Kashyap, A., Sivathanu, G., & Zadok, E. (2004). I3FS: An in-kernel integrity checker and intrusion detection file system. In Proceedings of LISA '04: Eighteenth Systems Administration Conference (pp. 67-78). Atlanta, GA: USENIX Association.

Reinhardt, R. B. (1993). An architectural overview of UNIX network security. Technical report. ARINC Research Corporation.

Rivest, R. L. (1992). RFC 1321: The MD5 message-digest algorithm. Technical report, Internet Activities Board.

Tripwire Inc. Tripwire software. http://www.tripwire.com

Wright, C., Cowan, C., Morris, J., Smalley, S., & Kroah-Hartman, G. (2002). Linux Security Modules: General security support for the Linux kernel. In Proceedings of the 11th USENIX Security Symposium.

Zadok, E., & Nieh, J. (2000). FiST: A language for stackable file systems. In Proceedings of the Annual USENIX Technical Conference (pp. 55-70).
