Networking Security Engineering

The advance in communications and technology compels us to share information all over the world, which in turn has paved the way for e-commerce. The virtual business done on the Web has undergone a great transformation and is taking giant leaps. In the course of that development, however, a number of threats negatively affect doing business on the net. Unwanted intruders can deface a site to damage the company's reputation, denial of service to authorised users results in lost business, and theft of private information pushes the company into legal turmoil.

Consider one concrete misconfiguration. If the Web server root directory is the same as the FTP server root directory, and an attacker is able to upload a PHP script to a world-writeable FTP directory, that script can execute UNIX commands and create a shell server bound to a high port that is left open. To avoid this, a micro content management system can be designed to extract the maximum potential from the content, with the data stored as XML in a native XML database and made searchable using XPath. XPath queries can then be executed via a URL from within the user's browser, which sets clear limits on how content on the site can be used and reused. Much depends on how the user marks up the data, and the design resists spamming and illegal posting. Fast, stable, clean PHP code is required, written so that beginners can learn from it and advanced programmers can make complex modifications; the licensing should ensure that programmers are able to learn from it and improve it. Such a design may salvage a situation where no firewall is in place and, to some extent, prevent intruders from abusing a database process.

Several improvements should be made by adding tools that prevent and remove spam from weblogs. These tools could be bundled under a new section named 'spam smasher'. A spam tool must be available to search for spam and delete all matches, and a centralised version of the 'Blocked phrases' tool is needed in addition to the local version. Spam-recognition capability must be improved as the need arises, and the trackback system must be hardened from time to time. It must be easy for a user to remove spam with one click and to report it immediately, so that the administrator can add it to the global Blocked Phrases list. A spam log must be kept to detect, check, prevent, and remove spam from sites and weblogs. With all the vulnerable sections in mind, a high level of security must be maintained for the site. Built-in blockers for referrer, comment, and trackback spam should be present; they not only block spam as it arrives, they also go a long way towards keeping spammers from overtaxing the web server.
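As a rough illustration of how the local and global 'Blocked phrases' tools might fit together, here is a minimal sketch in Python; the phrase lists, function names, and rejection policy are illustrative assumptions, not the essay's actual implementation (which envisions PHP).

```python
# Minimal sketch of a "Blocked phrases" spam check (names are hypothetical).
# A global blocklist maintained by the administrator is merged with a local,
# per-blog list; any comment matching a blocked phrase is rejected.
from typing import Iterable

GLOBAL_BLOCKED_PHRASES = {"cheap pills", "casino bonus"}   # administrator-maintained
LOCAL_BLOCKED_PHRASES = {"dodgy-site.example"}             # per-blog additions

def is_spam(comment: str, extra_phrases: Iterable[str] = ()) -> bool:
    """Return True if the comment contains any blocked phrase (case-insensitive)."""
    text = comment.lower()
    for phrase in (*GLOBAL_BLOCKED_PHRASES, *LOCAL_BLOCKED_PHRASES, *extra_phrases):
        if phrase.lower() in text:
            return True
    return False

def purge_spam(comments: list[str]) -> list[str]:
    """'Search and delete all matches': keep only comments that pass the filter."""
    return [c for c in comments if not is_spam(c)]

if __name__ == "__main__":
    posts = ["Nice article!", "Buy CHEAP PILLS now at dodgy-site.example"]
    print(purge_spam(posts))   # -> ['Nice article!']
```

The one-click removal and reporting described above would simply feed the offending phrase back into the global list.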
All of this is feasible only with adequate knowledge of security strategies and sufficient awareness. The first principle to keep in mind is least privilege: users and processes must have access only to the privileges they require, and no more. Otherwise, the following can happen. Individuals visit blogs and post their adverts, along with a link to whatever dodgy website they are promoting. Nowadays the practice has been automated: a clever programmer working for one of these outfits writes a tool that walks a list of weblogs and collects information on the posts made to each. The tool then crafts the right HTML to fool the blogging software into thinking that a comment has been entered, and the resulting advert is posted to the blog as if it were legitimate. After all, a public blog with an accessible comments page is hardly a closed system, and even an acceptable use policy stating what sort of postings are welcome is not legally binding. Putting strong restrictions on comment posting would avoid this, but it undermines the blog's primary objective of removing restrictions on online conversation.

An automated tagging process may be suggested instead. Tagging is an old-new method for organizing data by assigning descriptors to documents and other sources of information. These descriptors, known as tags, provide an easy way to categorize, search, and browse the information they describe. Annotating documents with keywords is nothing new, but the collaborative form of it, with some unique properties added, has attracted a lot of attention on the Web. What distinguishes collaborative tagging from traditional keyword annotation is its open vocabulary, its non-hierarchical nature, and the fact that tags are assigned by the authors and users of the information rather than by professional annotators, with no rules or limitations. In this context we address the task of automatically assigning tags to weblog posts, for which an AutoTag system can be devised. Given a weblog post, it offers a small number of tags that seem useful for it; the blogger reviews the suggestions and selects those he or she finds helpful. The system not only simplifies the tagging process, it also improves its quality: first by increasing the chance that weblog posts get tagged at all, and second by offering relevant tags that might otherwise not have been applied. It thereby improves the tasks tagging is aimed at, providing better search and browse capabilities.

The basic approach to automating tag assignment is collaborative filtering. In a nutshell, a collaborative filtering system helps users find desirable products or services by analyzing their profiles and matching them with the profiles of other users; since similar users share similar tastes, it can also find products similar to the ones a user has expressed interest in. In this system the blog posts themselves take the role of users, and the tags assigned to them function as the products the users expressed interest in. Just as similar users buy similar products, the system identifies useful tags for a post by examining the tags assigned to similar posts. The recommendations are then further improved by incorporating external knowledge about the bloggers, the posts, or the tags. The assigned tags are aggregated into a ranked list of likely tags, which the system filters and re-ranks in a later stage. This ensures that the top-ranked tags are offered to the user, who selects the tags to attach to the post. The system uses Information Retrieval measures to estimate the similarity between weblog posts: the most similar posts are taken to be the highest-ranking ones retrieved from the collection using some retrieval model. A number of methods for generating queries from a post can be tried, including using the entire text of the post and using the links in it to locate citations associated with it.
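A minimal sketch of this retrieval-based tag suggestion follows, using scikit-learn's vector-space tools; the corpus, tags, and parameters (number of nearest posts, number of suggestions) are invented for illustration and are not from the essay.

```python
# Sketch of the AutoTag idea: suggest tags for a new post by retrieving
# similar, already-tagged posts and aggregating their tags by frequency.
from collections import Counter
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

tagged_posts = [
    ("new kernel release improves scheduler latency", ["linux", "kernel"]),
    ("comparing firewall rules and packet filters", ["security", "firewall"]),
    ("hardening a web server against comment spam", ["security", "spam"]),
]

def suggest_tags(new_post: str, k: int = 2, n_tags: int = 3) -> list[str]:
    """Rank tags from the k most similar posts (simple vector-space retrieval)."""
    texts = [text for text, _ in tagged_posts]
    vectorizer = TfidfVectorizer(stop_words="english")
    doc_matrix = vectorizer.fit_transform(texts)
    query = vectorizer.transform([new_post])
    scores = cosine_similarity(query, doc_matrix)[0]
    top = scores.argsort()[::-1][:k]          # indices of the k nearest posts
    counts = Counter(tag for i in top for tag in tagged_posts[i][1])
    return [tag for tag, _ in counts.most_common(n_tags)]

print(suggest_tags("blocking spam on my blog's web server"))
# e.g. -> ['security', 'spam', 'firewall']
```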
The tag model in this system uses simple heuristics to compose the ranked list of tags from the top-retrieved posts: each tag is scored according to its frequency. More complex schemes that take into account the retrieval rank or score of the posts may yield improvements in accuracy. One clear source of information about a blogger is the set of tags used before the post under analysis was written; if those tags appear on the ranked list, their scores are boosted by a fixed factor.

To evaluate the method, use a distributed corpus containing at least 10 million weblog posts collected over a three-week period, of which 1.8 million posts are tagged, carrying 2.3 million tags in total. For indexing and retrieval, use an open-source engine with a rather simple vector-space retrieval model, the text being stemmed with an English stemmer. As a first method, manually examine the tags assigned to a random subset of 50 posts from the collection and decide, for each tag, whether it is indeed a relevant label for the post. Given its cost, however, this can be applied to only a small number of posts, and because it is not automated it is of little help in improving the system. An automated evaluation must be capable of handling thousands of posts: the system's output is compared to the tags that were actually posted.

What has been proposed and evaluated, then, is an automated tool for tagging weblog posts based on a collaborative filtering approach. The system offers tag suggestions based on the tags assigned to other, similar posts; the final decision about a tag is left to the blogger. Despite a relatively small corpus for this type of task, the system shows good results and has the potential to benefit both bloggers and others making use of the tags assigned to weblog posts. Remaining work includes identifying effective ways to generate queries from a post and successful retrieval models to use, improving the aggregation of tags from the retrieved posts, and refining the methods for filtering and re-ranking the lists the system produces. In addition to the collaborative approach described here, a "local" approach to tag suggestion should be investigated, in which suggestions are made without access to the entire blogosphere, as the automated system requires, but using a deeper analysis of the contents of the post and the blog it belongs to.
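The two refinements just described, boosting a blogger's prior tags and evaluating suggestions against the actually posted tags, might look roughly like this; the candidate scores, boost factor, and cut-off are hypothetical.

```python
# Sketch of prior-tag boosting and an automated precision check.
def boost_prior_tags(ranked: dict[str, float], prior: set[str],
                     factor: float = 1.5) -> list[str]:
    """Multiply the score of any candidate tag the blogger has used before."""
    rescored = {t: s * factor if t in prior else s for t, s in ranked.items()}
    return sorted(rescored, key=rescored.get, reverse=True)

def precision_at_k(suggested: list[str], actual: set[str], k: int = 3) -> float:
    """Fraction of the top-k suggestions that the blogger actually used."""
    top = suggested[:k]
    return sum(1 for t in top if t in actual) / k if top else 0.0

candidates = {"security": 2.0, "spam": 1.5, "linux": 1.0}
ranking = boost_prior_tags(candidates, prior={"linux"}, factor=2.0)
print(ranking)                                         # -> ['security', 'linux', 'spam']
print(precision_at_k(ranking, {"security", "linux"}))  # -> 0.666...
```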
Multiple security layers provide limited access to the system. A weak link in an otherwise strong network increases its vulnerability, so the system should fail closed: files are kept safe if access is denied whenever the system fails. Diversity of defence is achieved through multiple mechanisms. Security through obscurity is possible, but that anonymity results in flat or negative business growth. In building a secure site, one has to cover all the bases and formally analyze the requirements. The policies must be defined under UK and Microsoft definitions, and the action standards must focus on implementing actions rather than simply creating documents. Security must be designed into all of the products so as to protect the data, from the utility's own employees as well as from hackers, from the customer portal to the data center and down to the microprocessor. System tools must restrict access to control systems and critical data using Identity Management software.

This software must easily authenticate, authorize, and audit personnel access, with single sign-on ease of use throughout the enterprise, quickly and effectively provisioning and de-provisioning users; it must also support collaboration with partners and contractors on specific projects. Employees must not download hidden viruses and worms onto the site's servers or computers. Employees are often tempted to download a game or an application that brings a virus, worm, or spyware with it, and some utilities have been infected with worms through such use. Desktop software must therefore enable IT administrators to restrict software downloads and dangerous computer use across the enterprise and at the server level.

Hackers must likewise be kept out by the site's operating system, which must securely host thousands of applications and multiple users on the same system. Its security features must verify system integrity with secure-execution and file-verification features. Risk is reduced by granting only the privileges needed, through user and process rights management. Administration is simplified by using an open-standards-based cryptographic framework for file encryption, and the IT systems are secured by leveraging IPsec and an IP-filter firewall for network traffic protection. Failed operating systems must be reset by data recovery systems: critical information systems must be kept up and running, or quickly restarted after a failure.

A host of tools and services must be present to help utilities comply with recovery plans for critical cyber assets. One of these is preventive services, whose engagement starts with a comprehensive assessment of the people, processes, and products that affect data center operations. It should identify data centre weaknesses by evaluating IT skills, security, environmental conditions, and procedures, and it must work to improve the customer's key performance indicators, such as severity level and the operational risk index. The disaster recovery solutions offered must manage the availability of application services and data across geographically dispersed clusters; in the event that a primary cluster goes down, the cluster edition must be able to start up the business services with replicated data on the secondary cluster. For critical storage environments, a detailed plan must be designed for recovery operations after an interruption. The services offered should include developing business continuity plans, disaster recovery plans, and high-availability plans. The system must remotely monitor data backups overnight and be able to provision new capacity or hosts at any time. It should also analyse performance-degradation trends and recurring faults, and use those analyses to make monthly recommendations for ongoing system updates and for improvements to data life-cycle management. These tasks must be engineered to eliminate inefficient storage use and poor performance.

Control over access to the site must remain in our hands. Care should be taken that no PHP script can execute UNIX commands, so that the firewall protection holds. There should be an audit check on all access-policy violations, and it is recommended to back up frequently and to collect logs on a separate, secure system. It is also better to have others review our plans and work, which gives us a critical assessment.
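A minimal sketch of a least-privilege access check that audits every policy violation appears below; the roles, privileges, and log destination are assumptions for illustration.

```python
# Least-privilege access checks with an audit trail of policy violations.
import logging

# Policy: each role is granted exactly the privileges it needs, no more.
POLICY = {
    "operator": {"read:telemetry"},
    "admin": {"read:telemetry", "write:config"},
}

audit = logging.getLogger("audit")
audit.setLevel(logging.INFO)
audit.addHandler(logging.StreamHandler())  # console for the demo
# In production the records would also go to a separate, secured log host,
# e.g. via logging.handlers.SysLogHandler(address=("loghost.example", 514)).

def check_access(user: str, role: str, privilege: str) -> bool:
    """Allow only privileges granted to the role; log every violation."""
    if privilege in POLICY.get(role, set()):
        return True
    audit.info("POLICY VIOLATION: user=%s role=%s requested=%s",
               user, role, privilege)
    return False

print(check_access("alice", "operator", "read:telemetry"))  # True
print(check_access("bob", "operator", "write:config"))      # False, audited
```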
Evaluation of the programme requires a three-fold approach to the evaluation and impact assessment of the digital programmes. First is strategic, policy-level evaluation, which considers the impact of the whole programme with respect to the digital content produced and maintained, how it targets user priorities and user groups, and how it demonstrates fulfilment of the plans against policy directives. This must include a study of which content is being accessed by users and why, identifying evidence of users benefiting from the materials available; the impact of the programme on overall and local strategies should support learning and the acceptability of the process. Second is assessment of the operation of the programmes, considering aspects such as project management, staff skills, competencies and capacity building, collaboration, good practice in managing and presenting content, and the impact of technical standards and guidance; this can be done through a series of case studies. Third is monitoring and evaluation at project level, needed to track and report the number of user visits. This should include a user-feedback device on the website; the feedback can be monitored to demonstrate responsiveness to user requirements, giving a report on the project-level evaluation's impact on users.

The aim of all the programmes on the site must be learning. Long-term access to the site is possible only through user satisfaction, so the digital materials on the site must still be in good condition in 100 years' time. The system must be easy to use and hard to spam, and the information a user requests must be available without frustration. The wider the access to the site, the more e-commerce is done, and a framework for future developments should be inherent in the system. The important thing is absolute clarity about a project's objectives before any attempt is made to assess performance; otherwise there is a real danger that the results of an assessment will not aid decision-making at any level.

The use of encryption prevents unwanted and unsolicited interventions. The bastion host can be the critical strongpoint in the network's security; such strongpoints must be hardened and audited regularly so that any malfunction is detected. The software used on the site must be designed, tested, and configured for safe operation with minimal vulnerability. In the perimeter network, if there is a demilitarized zone with the firewall system serving as a single point of entry, then an untrusted network on the outskirts of the private trusted network serves as an intermediary between the site and the internet, which may itself deny access. The main problem here is precisely that the firewall is made a single point of entry and can therefore become a single point of failure. Instead, the routers must provide access control alongside the firewall gateways, which are application-level gateways and packet filters. The bastion hosts are in general the email servers, WWW servers, and FTP servers, the usual victim machines, along with the switches and hubs; together they deny access to intruders and provide defence in depth. The work can be kept simple by taking a phased approach, which is attained through planning. In hardening a bastion host, the crucial first step is to enforce least privilege, so that processes and users get access only to the privileges they need at that moment. By using a few fixed TCP/IP ports we can block all the rest, and if we want enhanced security we do not even trust our own applications.
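One way to audit the "few fixed ports" rule is to compare the ports actually listening on the host against an allowlist. The sketch below is illustrative; the allowlist, host, and scanned range are assumptions, and a real bastion host would enforce the rule with its firewall rather than merely report on it.

```python
# Audit a bastion host's listening TCP ports against a small fixed allowlist.
import socket

ALLOWED_PORTS = {22, 25, 80}        # e.g. SSH, SMTP, HTTP on a bastion host

def open_ports(host: str = "127.0.0.1", ports=range(1, 1025)) -> set[int]:
    """Return the set of TCP ports accepting connections on the host."""
    found = set()
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(0.2)
            if s.connect_ex((host, port)) == 0:   # 0 means the connect succeeded
                found.add(port)
    return found

unexpected = open_ports() - ALLOWED_PORTS
if unexpected:
    print("Ports open outside the allowlist:", sorted(unexpected))
else:
    print("Only allowlisted ports are open.")
```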
Host design starts with a minimal OS carrying the latest service pack. Only the needed applications are loaded on top of it, the service pack is reapplied whenever necessary, and unneeded OS components are either removed or disabled. The OS is hardened further by restricting access to files and other objects. Among operating systems, Mac OS X is a BSD UNIX and is considered the most commercially secure solution, which argues that UNIX should be preferred over Windows: it has better tools for building a bastion host and far better remote management. Though Windows NT/2000 is in some respects stronger than UNIX, the network security it offers is much weaker, because too many ports are left open and too many services are offered; it becomes much harder to attack if UNIX-style hardening is applied, but is much weaker if not. Windows NT will be secure if it is configured for TCP/IP only and is kept off the public network until it is fully hardened. MS Office and other development tools must be avoided, and unnecessary applications, network services, and system processes must be removed or disabled. Instead of a Linux dual boot, it is safer to use Cygwin. On the Windows platform the NTFS file system is preferred, used on a standalone member server where no domains and no user accounts are present. Remote administration serves Windows servers well; an open-source SSH together with the Cygwin UNIX emulation should be set up.

On the subject of keeping the platform patched: kernel bugs should be fixed as soon as humanly possible, and any delay is basically just about making excuses. As many people as possible should know about a problem as early as possible, because any closed list just increases the risk of the thing getting lost and delayed for the wrong reasons. The bottom line on kernel security is that the kernel does have bugs, and they will need to be exposed and then patched. The creator of Linux made no excuses for kernel security and noted that users should take additional precautions of their own: quite frankly, nobody should ever depend on the kernel having zero holes. We have to do our best, but where real security is needed, other shields should be in place.

After that, to enhance security, a backup plan must be administered. The administrator should decide who will perform the backups and how often they are taken, and check whether they are taken locally or over the network. The storage location of the media must be kept in mind, it is very important to decide who may restore data to the system, and the frequency with which the backups are tested must be known.
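A minimal sketch of such a backup test, verifying the archive against a checksum recorded at backup time and performing a trial restore, might look as follows; the paths and archive format are hypothetical.

```python
# Backup test: record a checksum when the backup is made, then verify it
# and perform a trial restore into scratch space later.
import hashlib
import tarfile
import tempfile

def checksum(path: str) -> str:
    """SHA-256 of the backup archive, stored off-host alongside the media."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def test_restore(archive: str, expected_sha256: str) -> bool:
    """Verify the archive is intact and actually extractable."""
    if checksum(archive) != expected_sha256:
        return False                       # media corrupted or tampered with
    with tempfile.TemporaryDirectory() as scratch:
        with tarfile.open(archive) as tar:
            tar.extractall(scratch)        # trial restore to scratch space
    return True

# Usage (hypothetical archive): record the digest at backup time, test later.
# ok = test_restore("/backups/site-2024-01-01.tar.gz", recorded_digest)
```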
Bruce Schneier's rules of security should then be applied. According to these, the risk inherent in the system must be understood by the administrator, and secrecy is anathema to security; knowing the agendas of the people involved usually predicts their decisions and demystifies them. Securing e-commerce thus needs careful planning, defined policies, the required physical security, implemented access control (access is denied when it is not needed and granted when it is), and firewalls placed and managed carefully so that access and denial occur exactly when necessary.

After following all of the above, it is better still to become SSCP certified. This requires one year of experience in at least one knowledge area, along with continuing education and agreement to a code of ethics; the CISSP requires three to four years of experience and a six-hour exam covering ten areas. The agreement to the code of ethics is common to both, and the background approval and continuing-education requirements are the same. The SSCP knowledge areas are specified as access controls and administration; auditing and monitoring of operations; risk appraisal, response, and recovery; and, in addition, cryptography, data communications, and the removal of malicious code.

As computers and networks have become more complex, the approaches used to secure them have had to evolve, and here we focus on training for certification. An expert should investigate the frameworks and structures that make up typical computer systems; the curriculum sketches the evolution of security models and evaluation methods as they have struggled to keep pace with changing technology needs. The following topics need to be covered. The CISSP candidate gains a clear understanding of the trade-offs between levels of trust, assurance, and performance. Security mechanisms placed at the hardware, kernel, operating-system, services, or program layers are explored, along with the security of open (distributed) and closed (proprietary) systems. This section also covers the concept of the Trusted Computing Base, the subset of system components that make up the totality of protective mechanisms; the origins of the TCB are presented as they appear in the Orange Book. Concepts such as the security perimeter, the reference monitor and its requirements, the security kernel, object domains (privileged versus non-privileged), process and resource isolation, trust ratings, security layering and hiding, object and subject classifications, and least privilege are covered. These concepts are presented as a means by which security structures can be understood and therefore responsibly controlled.

This section then explores different types of security models and the attributes and capabilities that distinguish them. The Basic Security Theorem is covered: if a system initializes in a secure state and all state transitions are secure, then every subsequent state will be secure, no matter what inputs occur. Security modes describe the security conditions under which a system functions; systems can support one or more security modes, thus servicing one or more user security classification groups. This section should explore four such modes and also introduce the concept of trust assurance, where the level of trust is based on the integrity of the Trusted Computing Base.
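To make the Basic Security Theorem described above concrete, here is a toy model in which a reference monitor admits only secure state transitions, so a system that starts secure stays secure; the classification levels and Bell-LaPadula-style rules are simplified assumptions for illustration.

```python
# Toy Basic Security Theorem: start secure, allow only secure transitions,
# and every reachable state remains secure (no read up, no write down).
LEVELS = {"public": 0, "secret": 1, "top-secret": 2}

def may_read(subject_level: str, object_level: str) -> bool:
    """Simple security property: a subject may not read above its level."""
    return LEVELS[subject_level] >= LEVELS[object_level]

def may_write(subject_level: str, object_level: str) -> bool:
    """Star property: a subject may not write below its level."""
    return LEVELS[subject_level] <= LEVELS[object_level]

def transition(state: set[tuple], request: tuple) -> set[tuple]:
    """Reference monitor: apply a request only if it preserves security."""
    subj, op, obj = request
    allowed = may_read(subj, obj) if op == "read" else may_write(subj, obj)
    return state | {request} if allowed else state  # insecure requests rejected

state: set[tuple] = set()                    # initial state: secure (no accesses)
state = transition(state, ("secret", "read", "public"))       # granted
state = transition(state, ("public", "read", "top-secret"))   # rejected
print(state)   # only the secure access is present
```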
The Common Criteria global evaluation standard has its origins in independent efforts, one based on U.S. standards and the other representing pan-European standards. The Trusted Computer System Evaluation Criteria (TCSEC), also referred to as the U.S. Orange Book, describes specific criteria for several evaluation areas (security policy, identification, labels, documentation, accountability, life-cycle assurance, and continuous protection) and the formal process of evaluation executed by the National Computer Security Center (NCSC), which yields an evaluated product. The European community instead launched the Information Technology Security Evaluation Criteria (ITSEC), which looks primarily at functionality and assurance as two broad category areas with subheadings. The key difference between the U.S. and European approaches is their rating scheme: ITSEC applies a separate rating system for security functionality and for assurance, whereas the U.S. TCSEC uses a single rating system. The confusing relationship between these two rating schemes is compared and explored in depth. The Common Criteria were established as the global compromise standard intended to supersede both TCSEC and ITSEC, introducing the concept of protection profiles, which capture specific real-world needs in the industry. Students need to understand the different components of the Common Criteria, the evaluation process, and the assurance levels. Security evaluation must yield proof (or evidence of the lack) of operational security readiness, and confusing terminology, such as the difference between certification (expected versus achieved readiness level) and accreditation (authorization to operate), can be contrasted.

A further section covers security threats specific to security models and architecture. Among the threats explored are covert channels, developer backdoors, timing attacks that exploit race conditions at boot time, and buffer overflows; countermeasures should be discussed for each. Access control systems and methodology must be combined with application and system development, which results in sound business continuity planning. Cryptography offers security for the accounts; legal, investigative, and ethical requirements must be followed, maintaining operational security along with physical security; and security architecture and models, together with security management practices, must be followed in telecommunications and in network and Internet security.