Result for "Algorithms": 500 essays

4 pages (1000 words), Essay

Only on StudentShare

...ALGORITHMS GENETIC ALGORITHMS Genetic Algorithms
The N-Queens problem is a traditional AI problem. Its name derives from the allowed moves of the queen piece in chess. Queens are allowed to move horizontally, vertically, or diagonally, backward and forward, with the only restriction being that they can move in only one direction at a time. A queen that can reach another piece in one move captures it.
The N-Queens problem is based on the idea of trying to place N queens on an N x N grid, such that no queen will be able to capture any other queen. The N-Queens problem is typical of many combinatorial problems, in that it is easy... Running Head: GENETIC...
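The non-attack condition described above can be sketched in a few lines. This is an illustrative helper (class and method names are ours, not from the essay): it counts attacking pairs, the quantity a genetic algorithm for N-Queens would typically minimize as its fitness measure.

```java
// Illustrative sketch: counting attacking queen pairs, the usual fitness
// measure in a genetic-algorithm approach to N-Queens. board[i] holds the
// row of the queen placed in column i, so columns never clash by construction.
public class NQueensFitness {
    // Returns the number of pairs of queens that attack each other.
    static int conflicts(int[] board) {
        int count = 0;
        for (int i = 0; i < board.length; i++) {
            for (int j = i + 1; j < board.length; j++) {
                boolean sameRow = board[i] == board[j];
                boolean sameDiagonal = Math.abs(board[i] - board[j]) == j - i;
                if (sameRow || sameDiagonal) count++;
            }
        }
        return count;
    }
}
```

A board with zero conflicts is a solution; a GA would evolve a population of such arrays toward that score.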

3 pages (750 words), Essay

Only on StudentShare

...algorithms are used as a synthetic but extremely useful means of solving highly complex problems. They are a particularly beneficial means of problem solving because they offer an efficient way of managing large quantities of arbitrary information. The use of GAs is not limited to genetics; rather, they are used in business and other professional fields. For argument's sake, we will approach genetic algorithms in this paper from a biological perspective. The principles of the genetic algorithm rest on the principles of natural selection, evolution and Mendelian genetics (Jamshidi 2003). The preliminary step to a genetic algorithm is to identify a means of encoding any sort... of...
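Once candidates are encoded (commonly as bit strings), the GA recombines them. A minimal sketch of one-point crossover, the simplest recombination operator (names here are illustrative, not from the essay):

```java
// Illustrative sketch: one-point crossover on string-encoded chromosomes,
// the recombination step that follows the encoding the excerpt describes.
public class OnePointCrossover {
    // Combines two parent chromosomes at a cut point, producing one child.
    static String crossover(String parentA, String parentB, int cut) {
        return parentA.substring(0, cut) + parentB.substring(cut);
    }
}
```

A full GA would pair this with selection and a small mutation rate; crossover alone only reshuffles existing genetic material.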

2 pages (500 words), Assignment

Only on StudentShare

...Algorithms Routing Algorithms In this paper, some widely used complex routing algorithms are discussed which can be used to manage complicated networks and build routing tables. Today, OSPF is the most widely used, while the once popular RIP is undergoing extensive revision. (Forouzan 2006)
Open Shortest Path First (OSPF)
OSPF is a complex routing algorithm that analyzes the link-state situation of a given network. It continuously maintains routes for all possible destinations in the network. The routing table built by OSPF is based on a topology database, and routes are selected along shortest paths, where the shortest path is the one with the least total link cost (not necessarily the fewest nodes). Hence... , the...
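The shortest-path computation at the heart of a link-state protocol like OSPF is Dijkstra's algorithm over the topology database. A minimal sketch under illustrative assumptions (the class name, the adjacency-matrix representation, and the INF sentinel for missing links are ours):

```java
import java.util.Arrays;

// Sketch of the shortest-path-first computation a link-state router runs:
// Dijkstra's algorithm over a link-cost matrix. cost[u][v] is the link cost
// from u to v, or INF when no link exists.
public class LinkStateSPF {
    static final int INF = Integer.MAX_VALUE / 2;

    // Returns the least-cost distance from `source` to every node.
    static int[] dijkstra(int[][] cost, int source) {
        int n = cost.length;
        int[] dist = new int[n];
        boolean[] done = new boolean[n];
        Arrays.fill(dist, INF);
        dist[source] = 0;
        for (int iter = 0; iter < n; iter++) {
            // Pick the closest unfinished node.
            int u = -1;
            for (int v = 0; v < n; v++)
                if (!done[v] && (u == -1 || dist[v] < dist[u])) u = v;
            done[u] = true;
            // Relax all links out of u.
            for (int v = 0; v < n; v++)
                if (cost[u][v] < INF && dist[u] + cost[u][v] < dist[v])
                    dist[v] = dist[u] + cost[u][v];
        }
        return dist;
    }
}
```

Note that the result follows link costs, so a two-hop path of cost 2 beats a one-hop link of cost 5.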

7 pages (1750 words), Research Paper

Only on StudentShare

...Algorithms: Introduction In information theory and computer technology, source coding, reduction in the bit rate, or data compression involves encoding information using fewer bits than the original representation. Compression can be lossless or lossy. Lossless compression reduces bits by identifying and eliminating statistical redundancy; no information is lost. In contrast, lossy compression reduces bits by identifying marginally important information and eliminating it. This process of reducing the size of data is popularly known as data compression, though it is formally known as source coding. Compression...
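The idea of lossless redundancy removal can be shown with a toy example. Run-length encoding is far simpler than the entropy coders real compressors use, but it illustrates how repeated symbols are stored in fewer units without losing any information (the class name is ours):

```java
// Toy illustration of lossless compression: run-length encoding replaces a
// run of identical characters with the character and its count. The original
// text is fully recoverable, so no information is lost.
public class RunLength {
    static String encode(String s) {
        StringBuilder out = new StringBuilder();
        int i = 0;
        while (i < s.length()) {
            int j = i;
            while (j < s.length() && s.charAt(j) == s.charAt(i)) j++;
            out.append(s.charAt(i)).append(j - i);
            i = j;
        }
        return out.toString();
    }
}
```

Note RLE only wins when runs are long; that is exactly the "statistical redundancy" the excerpt refers to.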

9 pages (2250 words), Download 1, Research Paper

...Algorithms Table of Contents
Introduction 3
Mathematical Algorithms Overview 3
Evolution 5
Relation of Computer Science and Mathematical Algorithms 6
Algorithm-Supported Mathematical Theory 7
Mathematical Algorithm Analysis 8
Importance of Mathematical Algorithms 9
More Real-world Examples 10
Conclusion 10
Bibliography 11
Introduction
Discrete mathematics is a section or element of mathematics concerned with objects that can assume only separated, distinct values. The concept of discrete mathematics is thus applied in contrast with continuous mathematics, the subdivision of mathematics concerned with objects which have... DISCRETE MATHEMATICS Mathematical...

7 pages (1750 words), Research Paper

Only on StudentShare

...Algorithms In the power system world, breaker failure protection has become a critical element in providing backup protection for circuit breakers (CBs). Practically every apparatus is equipped with primary protection to interrupt the current flow whenever a fault occurs. Thus, the breaker failure relay opens adjacent breakers to isolate the problem. A fast and secure breaker failure detection algorithm is a critical challenge facing a numerical BF relay. The first part presents the need for breaker failure protection. The second part addresses issues worth considering when applying breaker failure protection, and lastly, advances toward breaker failure protection... Breaker Failure Detection...

8 pages (2000 words), Research Paper

Only on StudentShare

...ALGORITHMS Hashing Algorithms Introduction Hashing algorithms are used for several purposes: encoding a particular text or file, providing keyed access to a database, and converting a specified value into another format. Hashing algorithms are generally categorized based on the type of data for which they are used. The most popular hashing algorithms include the SHA family, whose members are distinguished by the bit length of their output. Another commonly used algorithm is MD5, whose basic function is to generate hash values for the given information. Apart from these algorithms, Whirlpool, Bloom filters and other algorithms are also widely used... Running head: HASHING...
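The SHA family mentioned above is available directly through the JDK's `MessageDigest` API. A minimal sketch (the class and helper names are ours) showing a string hashed down to a fixed-length SHA-256 value:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

// Sketch using the JDK's MessageDigest API: input of any size is reduced to
// a fixed-length SHA-256 digest, rendered here as a 64-character hex string.
public class HashDemo {
    static String sha256Hex(String input) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        byte[] digest = md.digest(input.getBytes(StandardCharsets.UTF_8));
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) hex.append(String.format("%02x", b));
        return hex.toString();
    }
}
```

Changing even a single input letter yields a completely different digest, which is the property the excerpts describe.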

7 pages (1750 words), Download 1, Essay

...Algorithm Applications in Social Media Data Structures and Algorithm Applications in Social Media Data structures are ways of organizing data in a computer's registers and memory, whilst algorithms are the procedures that operate on the data stored in those structures (Wirth, 1984). Algorithms are very useful for selecting the most relevant information during a search. There are recommended algorithms that can guide selective searches. In social media, such algorithms can identify a certain element, like a friend, and ignore others. The algorithms can identify a certain topic as “trending” from among millions of topics on a particular social media site. These can, hence, be defined as procedures... Data Structures and...

1 page (250 words), Essay

Only on StudentShare

...algorithms Differentiation of different types of cryptographic algorithms Cryptographic algorithms are instruments used in the process of encrypting data. Encryption is the process of coding information in the form of ciphertext to protect it. The ciphertext requires a decoding key to read the text. In cryptography, a key refers to a long sequence of bits used in decryption or encryption algorithms. There are various ways of classifying cryptographic algorithms. The dominant and most common is classification by the number of keys employed in encryption and decryption. There are three types of cryptographic algorithms. These are secret key... Cryptographic...
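The secret-key (single-key) category the excerpt introduces can be sketched with the JDK's own crypto API. This is an illustrative round trip only: the hard-coded key, class name, and ECB mode are assumptions for brevity; real systems derive keys securely and prefer authenticated modes such as GCM.

```java
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

// Sketch of secret-key (symmetric) encryption with JDK AES: the SAME key
// both encrypts and decrypts, which is what distinguishes this category.
// ECB mode is used only to keep the example short; it is not secure practice.
public class SecretKeyDemo {
    static byte[] aes(int mode, byte[] key, byte[] data) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
        cipher.init(mode, new SecretKeySpec(key, "AES"));
        return cipher.doFinal(data);
    }
}
```

Encrypting and then decrypting with the same 16-byte key recovers the original plaintext, which is exactly the single-shared-key property that separates secret-key algorithms from the public-key and hash categories.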

5 pages (1250 words), Research Paper

Only on StudentShare

...ALGORITHMS IN MINING BIOLOGICAL DATABASES Efficiency of clustering algorithms in mining biological databases Introduction Clustering analysis is increasingly being used in the mining of databases such as gene and protein sequences. Clustering is a common data mining technique whereby the data sets being examined are assigned to clusters on the basis of their similarities. In most cases, clustering algorithms are categorized into various groups depending on how they form their clusters. For example, hierarchical algorithms often work by either splitting or merging the groups being analyzed in order to develop a hierarchy of clusters... that is based on the...

10 pages (2500 words), Research Paper

Only on StudentShare

...ALGORITHMS FOR MINING LARGE BIOLOGICAL DATABASES With the enormous amount of gene sequences resulting from genome expression and the uncontrolled classification of data into functional families or classes, clustering of such large data sets has become a serious headache in functional and structural genomics. Accurate classification of sequences into classes has recently become a necessity, and as a result, computer programs have been developed to help with automatic clustering. Various clustering algorithms (methods) have addressed gene sequence clustering. They are categorized into partitioning, hierarchical and graph-based techniques. The most widely used... EFFICIENCY OF CLUSTERING...

4 pages (1000 words), Essay

Only on StudentShare

...Algorithms Over the years, numerous data compression algorithms have been developed to deal with specific data compression problems. Of the developed data compression algorithms, there does not exist a single one that compresses all data types efficiently. Therefore, each algorithm has a number of strengths as well as weaknesses. Compression algorithms are also commonly used in forensics. These algorithms are used to reduce the amount of space required to store data on a computer hard disk. Usually, these algorithms are employed on large files so that their size may be reduced. Mainly, there are two types of compression algorithms: lossless... Data Compression...

5 pages (1250 words), Research Paper

Only on StudentShare

...ALGORITHMS IN MINING BIOLOGICAL DATABASES Efficiency of clustering algorithms in mining biological databases
Introduction
Clustering analysis is increasingly being used in the mining of databases such as gene and protein sequences. Clustering is a common data mining technique whereby the data sets being examined are assigned to clusters on the basis of their similarities. In most cases, clustering algorithms are categorized into various groups depending on how they form their clusters. For example, hierarchical algorithms often work by either splitting or merging the groups being analyzed in order to develop a hierarchy of clusters... that is based on the...

6 pages (1500 words), Assignment

Only on StudentShare

...algorithm Data Structure and Algorithm Algorithm: binarySearch(A, e)
Input: An ordered array A and an element e to be searched for in the array.
Output: Index of element e (if found); otherwise return -1
lowestPossibleLocation ← 0
highestPossibleLocation ← A.getSize() - 1
while highestPossibleLocation >= lowestPossibleLocation do
middle ← (lowestPossibleLocation + highestPossibleLocation) / 2
if A[middle] = e then
return middle
else if A[middle] > e then
highestPossibleLocation ← middle – 1
else
lowestPossibleLocation ← middle + 1
return -1
Implementation Code
public class BinarySearch {
// Search for int X in array A containing N elements.
static int binarySearch(int... Data structure and...
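The implementation above is cut off by the excerpt. A completion sketch that follows the pseudocode exactly might look like the following (the parameter names are ours):

```java
// Completion sketch of the truncated implementation, following the
// pseudocode above: iterative binary search over a sorted int array,
// returning the index of e if found, or -1 otherwise.
public class BinarySearch {
    static int binarySearch(int[] a, int e) {
        int low = 0;
        int high = a.length - 1;
        while (high >= low) {
            // For very large arrays, low + (high - low) / 2 avoids overflow.
            int middle = (low + high) / 2;
            if (a[middle] == e) return middle;
            else if (a[middle] > e) high = middle - 1;
            else low = middle + 1;
        }
        return -1;
    }
}
```

Each iteration halves the remaining range, giving the O(log n) running time that motivates keeping the array sorted in the first place.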

6 pages (1500 words), Research Paper

Only on StudentShare

...Algorithms In Identifying Outliers/Noise In A Large Biological Database Efficiency Of Data Mining Algorithms In Identifying Outliers/Noise In A Large Biological Database Introduction The number of protein sequences in bioinformatics is estimated at over half a million. This calls for meaningful partitioning of the protein sequences so as to be in a position to detect the roles they play. Alignment methods were traditionally used in grouping and comparing protein sequences. At a later stage, local alignment algorithms were introduced to replace the earlier methods and perform more complex functions. The local algorithms were used to find amino acid... Efficiency Of Data Mining...

18 pages (4500 words), Essay

Only on StudentShare

...algorithms have been developed which assist in transforming Low Resolution images into High Resolution images. High Resolution (HR) images have a wide range of uses in various fields, for example, medical imaging, video surveillance, and satellite imaging. However, due to hardware limitations, many more Low Resolution (LR) images are obtained than High Resolution images. As a result, researchers have come... up with new techniques that help them obtain HR images from LR images. Researchers have come up with a reconstruction technique known as the Super-resolution (SR) technique (Bannore, 2009). The technique solves the problem of developing HR images from LR images since it allows the...

4 pages (1000 words), Essay

Only on StudentShare

...algorithm).
// - AdaptiveHuffmanProvider: Static methods used to compress and decompress... [value] = oldNYT.RightChild;
// Return the new value node.
return oldNYT.RightChild;
}
/// Find the highest-numbered node with the given count. (FGK algorithm)
/// The starting node.
/// The current node if no higher nodes were found; the higher node, otherwise.
///
/// Use this HighestWithSameCount function to apply the FGK algorithm for the adaptive
/// Huffman compression method. The FGK algorithm is often much faster than Vitter's
/// variation and often produces results just as good; however, in some cases
/// Vitter's method can produce better compression.
///
private... Task a) A...

1 page (250 words), Coursework

Only on StudentShare

...algorithms, they are in a position to handle different questions with different algorithms and still arrive at the same answer. This occurs in cases where both traditional algorithms and student-made algorithms are used. The student-made algorithm may be in a position to solve... the problem competently, delivering the answer.
Question Three
In handling algorithm problems, mathematical validity is a matter of testing efficiency. The search for validity leads to the granting of certain rules for solving math algorithms. The student-generated algorithm differs from the traditional algorithm when the...

10 pages (2500 words), Research Paper

Only on StudentShare

...ALGORITHMS Hashing Algorithms Introduction Hashing algorithms are used to translate input of any size to a smaller output of fixed size by making use of a hash function. In this scenario, the output value is known as the hash value. Basically, this hash value is a distinctive and very condensed mathematical representation of a piece of data. The translation from input to output is a compressing mapping; to be precise, the space taken by the hash value is a great deal smaller than that of the input. In fact, if any value or even a single letter of the input is changed, the hash value will also change. Additionally, the same hash... HASHING...

1 page (250 words), Admission/Application Essay

Only on StudentShare

...algorithms to complete a certain action or solve a problem. Applying mathematics in a program can be done through the use of mathematical functions and algorithms like sorting, searching, induction in recursive algorithms, and lambda calculus. Sorting algorithms are used for sorting arrays in an application or organizing names, and this makes searching for these values easier in a program. Mathematical sort algorithms are used to sort different data types, files or even URLs.
Variables in applications are names that provide a program with a named storage that allows programs... variables are declared to a specific type and used in accessing objects. Default values of reference...
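The sorting step described above can be sketched minimally. Insertion sort is shown here only for illustration (the class name is ours); in practice library routines such as `Arrays.sort` would be used:

```java
// Minimal sketch of sorting an int array with insertion sort: each element
// is shifted left past larger neighbours until it reaches its sorted slot.
public class InsertionSort {
    static void sort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int key = a[i];
            int j = i - 1;
            while (j >= 0 && a[j] > key) {
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;
        }
    }
}
```

Sorting the data first is what makes the fast searches the excerpt mentions (such as binary search) possible.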

6 pages (1500 words), Essay

Only on StudentShare

...algorithmic complexity, such as memory-bound problems, error convergence issues and supervised training, are prohibitive for large state and solution spaces or high dimensional state spaces. In addition, among popular search strategies, heuristic algorithms may not guarantee search optimality and are hard to approximate without closed-form utility functions. Furthermore, model-building algorithms require large computation per iteration, since every update needs to compute sums over the entire state space... Both pervasive directional graphs and data intensive algebraic operations require adaptive search functions in problem solving and reasoning with limited computing resources. Among search methods,...

5 pages (1250 words), Research Paper

Only on StudentShare

...algorithms as a mechanism to enable clients using cloud computing... to effectively locate their stored data within the cloud. String matching (fuzzy keyword) algorithms Generally, cloud data systems consist of the client, the data service provider and the cloud server. Advances in computing technology have enabled the use of networks and data identifier algorithms to build a mechanism that allows clients to locate their data in the cloud based on string matching of the data. Approximate string matching is an important technique that can potentially be used to find a string of data that closely matches the search...
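Approximate (fuzzy) matching is usually quantified by an edit distance. A minimal sketch of the standard Levenshtein dynamic program (class and method names are ours, not from the essay):

```java
// Illustrative sketch of approximate string matching: Levenshtein edit
// distance counts the fewest insertions, deletions and substitutions
// needed to turn one string into the other. A small distance means the
// stored keyword "closely matches" the search term.
public class FuzzyMatch {
    static int editDistance(String a, String b) {
        int[][] d = new int[a.length() + 1][b.length() + 1];
        for (int i = 0; i <= a.length(); i++) d[i][0] = i;
        for (int j = 0; j <= b.length(); j++) d[0][j] = j;
        for (int i = 1; i <= a.length(); i++)
            for (int j = 1; j <= b.length(); j++) {
                int sub = a.charAt(i - 1) == b.charAt(j - 1) ? 0 : 1;
                d[i][j] = Math.min(Math.min(d[i - 1][j] + 1, d[i][j - 1] + 1),
                                   d[i - 1][j - 1] + sub);
            }
        return d[a.length()][b.length()];
    }
}
```

A fuzzy keyword search would return stored entries whose distance to the query falls under a chosen threshold.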

6 pages (1500 words), Essay

Only on StudentShare

...algorithms that are employed in a uni-processor system and in a multi-processor system.
Methodology
In this study on the mutual exclusion, the following methodology will be adopted:
1. The problems that are resolved by the mutual exclusion are identified and listed out.
2. The solutions and the algorithms... process to run at a time. Mutual exclusion algorithms should ideally provide leeway for the following options:
1. Freedom from deadlock: Locking is the simplest way of preventing repeated use of critical sections. While locking can be effective in stopping the execution of one job while another is running, it might not be foolproof. For instance, if process 1 locks a...
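The locking idea above can be sketched with the JDK's `ReentrantLock`. The class and method names are illustrative; the point is that `tryLock` lets a process back off instead of blocking forever, which sidesteps the simplest deadlock scenario:

```java
import java.util.concurrent.locks.ReentrantLock;

// Sketch of mutual exclusion around a critical section. tryLock() returns
// immediately instead of waiting, so a process that cannot acquire the lock
// can retry later rather than deadlock while holding other resources.
public class CriticalSection {
    private final ReentrantLock lock = new ReentrantLock();
    private int counter = 0;

    // Returns true if the critical section was entered and the work done.
    boolean incrementIfFree() {
        if (!lock.tryLock()) return false;  // back off rather than block
        try {
            counter++;                      // the protected shared state
            return true;
        } finally {
            lock.unlock();
        }
    }

    int value() { return counter; }
}
```

The `finally` block guarantees the lock is released even if the critical section throws, which is one of the "freedom from deadlock" properties the list above asks for.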

2 pages (500 words), Assignment

Only on StudentShare

...algorithms?) to accomplish these tasks?
Answer:
According to my perspective, in the future program verification and performance tuning will be accomplished with the help of algorithms. Automatic methods are much faster and more efficient than manual ones. Through algorithms, verification and performance tuning can be done more effectively and in a well-organized manner (Dave, 2008). With the advancement of technology, more and more... Week 4 DQ2: Program verification How do you envision the ways program verification and performance tuning will be accomplished in the future? Will it still be a work of art? Will it always be the result of one's experience? Or do you forecast more and more automatic ways...

6 pages (1500 words), Essay

Only on StudentShare

...algorithms that are employed in a uni-processor system and in a multi-processor system.
Methodology
In this study on the mutual exclusion, the following methodology will be adopted:
1. The problems that are resolved by the mutual exclusion are identified and listed out.
2. The solutions and the algorithms... Mutual Exclusion in Multiprocessing Systems The Problem Any processing environment that runs more than one program or task at the same time has the problem of restricting resource access by multiple processes at the same time. In the course of a normal execution of any two programs on the same processor (uniprocessor systems), there is always a possibility that the programs might request the...

4 pages (1000 words), Essay

Only on StudentShare

...algorithms has been reviewed (J. Kolter and M. Maloof, 2003... 1. Is a Neural Network with one or more hidden layers more powerful than a single-layer perceptron? Explain. (Hint: in terms of learning, can a neural network with one or more hidden layers learn functions more complex than the perceptron?)
An artificial neural network contains networked neurons working together to solve complex problems. A typical single-layer perceptron consists of one or more artificial neurons arranged in parallel and is not computationally complex. In a single-layer perceptron, as the number of inputs increases, the proportion of learnable functions decreases drastically. The drawbacks of the single-layer perceptron have been described already (Minsky an...

3 pages (750 words), Essay

Only on StudentShare

...algorithms have been invented. These algorithms can be categorized into three types: encryption algorithms, hashing algorithms, and signature-based algorithms. Encryption algorithms are used for encrypting data to ensure privacy and protection. Hashing algorithms are used to ensure data is not modified in transit, that is, to ensure data integrity. Lastly, signature-based algorithms are used for authentication, providing a digital or electronic signature of the user. As the ABC institute of research has to protect highly sensitive information from its rivals, we will discuss symmetric and asymmetric encryption algorithms... Full Paper Introduction Over the years, many cryptographic...

10 pages (2500 words), Dissertation

Only on StudentShare

...Algorithms 8 4. An Example of a 10-bar Truss 10 5. Conclusions 11 References 12 1. Introduction In recent years, the employment of genetic algorithms (GAs) to obtain optimal designs for civil engineering structures has been studied (Leon, 1970; Bunday, 1984; Bell, 1974; Makowski, 1965; Collatz and Wetterling, 1975). Ghasemi et al. (1999) have shown the suitability of genetic algorithms for dealing with large trusses that have numerous indefinite... Table of Contents 1. Introduction 2 1.1. Background 2 1.2. Literature Review 3 1.3. Objective 4 2. Theory 4 2.1. Structural optimization 4 2.2. The GA Principle 6 2.3. Truss Structures 7 3. Structural Optimization Using Genetic...

1 page (250 words), Research Paper

Only on StudentShare

...algorithms used to assess whether an error took place.
Though this paper is focused on reviewing and critiquing the approach to medication error data collection and analysis, Ferner (2009) wrote it in a manner that shows how the design of this field of research in general might be flawed. This paper critiques and reviews the approach to research on medication... Answer to a Question from a Previous Source Ferner (2009) applied a systems analysis in his work. He stated: “In this review I consider some of the problems in counting and classifying medication errors” (Ferner, 2009, p. 614). He undertook a systems analysis by taking into consideration not only psychological factors, but also factors such as...

3 pages (750 words), Download 1, Essay

Free

...algorithms. According to Deo (1974, p. 284), computer algorithms are essentially sets of instructions followed to resolve certain problems... . The implication is that every step of an algorithm must be defined unambiguously and precisely, such that the algorithm has definiteness, finiteness, output, input and effectiveness. These features of an algorithm are achieved if the computer program is written in a language understandable by the machine, or in English and then converted. Usually, an algorithm is first expressed in ordinary language, then converted into a flow chart, and finally written in a language that the machine can execute....

3 pages (750 words), Essay

Only on StudentShare

...Algorithms... a wide range of challenges that undermine the results of this service delivery approach. In health care units, the P4P approach involves setting achievable targets and working towards them. The development of these targets is an issue that has posed a challenge to many health practitioners. The question of whether the targets are too low or too high is controversial. Consequently, it becomes hard to define the reward system and how to punish or reward a health facility. Cromwell et al. (2011) identify the challenge of implementing the pay algorithms that accompanies the challenge of setting up standards for quality services. Therefore, it...

4 pages (1000 words), Essay

Only on StudentShare

...algorithms and (b) distribution algorithms.
Research goals:
There are various methods which have been used for software optimization. Genetic algorithms are a method for solving... Proposal In the current global environment, digital technology and the increase in the volume of online transmissions require the availability of considerable amounts of bandwidth, further exacerbated by the limited availability of storage space on computers. Software optimization offers an excellent solution to both problems, providing for more efficient usage with fewer resources.
This research study will focus upon software optimization through a comparison of two different methods of software optimization: (a) genetic algor...

8 pages (2000 words), Essay

Only on StudentShare

...algorithms that can query data in every way possible. In the sections that follow, data mining technology and the tools it provides to assist auditing are analysed.
Data Mining: An Introduction
Data mining refers to extracting or 'mining' knowledge from large amounts of data. Jiawei Han and Micheline Kamber (Han and Kamber, 2001) aptly state... Data Mining for Auditing Contents of the Report Introduction Auditing: An Introduction to the Problem Domain The Solution to the Problem
Data Mining: An Introduction
Integration of Data Mining with Auditing
Introduction
The transition of the applied information technology from the primitive file processing systems to sophisticated database systems can be tra...

4 pages (1000 words), Assignment

Only on StudentShare

...Algorithms (SPAs) and the minimum spanning tree concept. A computer network refers to a linkage of one computer to another in order to exchange information. Computer networking depends heavily on theoretical concepts from graph theory. A simple computer network can be represented using graph theory (Hart, 2013).
Shortest Path Algorithms (SPAs) are graph theory concepts that are widely used in present-day computer networking technology. A shortest path algorithm refers to a method used to determine the least-cost or best path from one node to another. The shortest path problems... Graph theory in Networking Graph theory in networking Graph theory is one of...

2 pages (500 words), Download 1, Essay

...algorithm that doesn't require the two parties (sender and receiver) to first exchange a secret key in order to communicate. In this scheme, the public part of the key is used by the sender for encryption, while the receiver's private part is applied for decryption. However, in order to make this communication safe, it is necessary to make sure that only the intended receiver is able to access the private part of the key. Moreover... PUBLIC-KEY CRYPTOGRAPHY AND INFORMATION SECURITY Public-Key Cryptography and Information Security Public Key Cryptography (PKC) uses a key with two elements, a "public key" and a "private key", for the implementation of an encryption...

7 pages (1750 words), Essay

Only on StudentShare

...algorithms such as least squares... Echo Cancellation and Applications: Cancelling the effect of echo is called echo cancellation. Echo is the delayed and distorted voice of the original sound. There are certain techniques for echo cancellation, called echo cancellers, belonging to the field of adaptive filtering. Their scope lies in full duplex data transmission over wire circuits. Echo is observed in telecommunications. Certain filter structures are used, including FIR, IIR, lattice structures, frequency domain structures and memory-type structures, among which FIR is the most used. Standards bodies such as CCITT evaluate methods and performance requirements for echo cancellers. Two...

8 pages (2000 words), Essay

Only on StudentShare

...algorithm, and it is widely used by websites to prevent credit-card or other personal information from being intercepted by snoopers. Although it is in itself extremely secure, SSL secures only the link between... What is Security? There are thousands of definitions of security available on the Internet. The definition on www.businessdictionary.com covers the basics and states it as “Prevention of and protection against assault, damage, fire, fraud, invasion of privacy, theft, unlawful entry, and other such occurrences caused by deliberate action”. In the context of a network security definition, it consists of concerns related to network communication privacy, confidentiality of...

5 pages (1250 words), Essay

Only on StudentShare

...algorithms that are hard for humans to comprehend. The recent financial crisis resulted in increased unemployment, which is an indicator of the increased inefficiency of the stock market. This paper agrees with Stiglitz's opinion that the Flash Crash will lead to less investment in information, which is harmful to the market's price discovery function and hence to the financial market. The paper will oppose the opinion that the Flash Crash could be a positive feedback loop of the trading environment. Computer... 6 MAY FLASH CRASH Introduction The Flash Crash of 6 May caused a stir even amongst economic scholars. It was characterized by a sharp drop and recovery in the prices of securities. In less than half...

5 pages (1250 words), Thesis Proposal

Only on StudentShare

...algorithms that will be used in the creation of reliable anti-recognition and anti-segmentation techniques. This will eventually make user information and the use of CAPTCHAs very secure against automated computer bots.
Methodology
We will perform a deep analysis of the different anti-segmentation techniques... Thesis Proposal Topic for the Thesis “Analysis of Distorted Character Visual CAPTCHAs”
Aim of the Research
The Internet's increase in size and services has brought about many conveniences, but for this to be achieved a lot of challenges are being faced. Some of these challenges lie in differentiating between legitimate computer users and unauthorized computer bots....

2 pages (500 words), Essay

Only on StudentShare

...algorithms of local optimization with calculation of first-order partial derivatives; local optimization with calculation of first- and second-order partial derivatives, for example, Newton's method; stochastic optimization algorithms, for example, the Monte Carlo method; algorithms of global optimization; etc.
References
1. P. Chan, W. Fan, A. Prodromidis & S. Stolfo. 1999. Distributed data mining in credit card fraud detection, IEEE Intelligent Systems, 14(6): 67–74.
2. J. Stolfo, D. W. Fan, W. Lee and A. L. Prodromidis. 1999. Credit Card Fraud Detection Using Meta-Learning: Issues and Initial Results, IEEE Intelligent Systems, 20(7): 67–85.... The methodology of risk decreasing and fraud...

2 pages (500 words), Essay

Only on StudentShare

...algorithms of local optimization with calculation of first-order partial derivatives; local optimization with calculation of first- and second-order partial derivatives, for example, Newton's method; stochastic optimization algorithms, for example, the Monte Carlo method; algorithms of global optimization; etc.
References
1. P. Chan, W. Fan, A. Prodromidis & S. Stolfo. 1999. Distributed data mining in credit card fraud detection, IEEE Intelligent Systems, 14(6): 67–74.
2. J. Stolfo, D. W. Fan, W. Lee and A. L. Prodromidis. 1999. Credit Card Fraud Detection Using Meta-Learning: Issues and Initial Results, IEEE Intelligent Systems, 20(7): 67–85.... methodology of risk decreasing and fraud...

10 pages (2500 words), Research Paper

Only on StudentShare

...ALGORITHM FOR MULTI-DIMENSIONAL MATRIX MULTIPLICATION OPERATIONS REPRESENTATION USING KARNAUGH MAP Multi-dimensional arrays are widely used in many scientific studies, but some issues have been encountered regarding efficient operations on these multi-dimensional arrays. In this paper, the extended Karnaugh Map representation (EKMR) scheme is proposed as an alternative to the traditional matrix representation (TMR), which caused multi-dimensional array operations to be inefficient when extended to dimensions higher than two. The EKMR scheme has managed to successfully optimize the performance of multi-dimensional array operations to the nth dimension of the array... Topic: PARALLEL...

4 pages
(1000 words)
, Essay

Only on StudentShare

...algorithms can be done through the validation of data mining models. The process involves assessing the performance of the mining models against real data. This is done by understanding the characteristics and the quality of the algorithms before deploying them into the production environment (Chung & Gray, 1999). To determine the reliability of data mining algorithms, different statistical validity measures are deployed to establish whether there are issues in the model or the data. The reliability of data... Data Mining School a). Using predictive analytics to comprehend behaviors: To remain competitive within the market, sellers need to comprehend present...
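The validation step described in the excerpt — checking a trained model's predictions against held-out real data with simple statistical validity measures — can be sketched generically. The classifier and data here are placeholders:

```python
def accuracy(y_true, y_pred):
    # fraction of held-out records the model labels correctly
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def confusion_counts(y_true, y_pred):
    # (actual, predicted) tallies for binary labels: the raw material
    # for precision, recall, and other validity measures
    counts = {(a, p): 0 for a in (0, 1) for p in (0, 1)}
    for t, p in zip(y_true, y_pred):
        counts[(t, p)] += 1
    return counts

# placeholder "mined model": predict 1 when the single feature > 0.5
model = lambda x: int(x > 0.5)

holdout_x = [0.1, 0.7, 0.4, 0.9, 0.6]   # real data kept out of training
holdout_y = [0, 1, 0, 1, 0]
preds = [model(x) for x in holdout_x]
```

A systematic gap between the holdout metrics and the training metrics is exactly the kind of issue in the model or the data that the validation step is meant to surface before deployment.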

3 pages
(750 words)
, Assignment

Only on StudentShare

...algorithms Routing algorithms, also known as routing protocols, are formulas used by routers to determine the best and most appropriate path onto which packets are to be forwarded... Week 4 Hand-in Assignment. Router and IP Addresses: Yes, routers have IP addresses, one IP address per router interface. The number of IP addresses a router has depends on the number of interfaces connected to the router. Routers have at least two IP addresses, since they have the function of deciding where to send the packet next. There is also another IP address for management purposes, usually the router's own address, for convenient access to the router by the user. Routers are generally used for switching and...
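The "best path" that a routing algorithm selects is classically computed as a shortest path over the network graph; link-state protocols, for instance, run such a computation over their topology database. A minimal Dijkstra sketch with a toy topology and hypothetical link costs:

```python
import heapq

def dijkstra(graph, src):
    """Shortest-path distances from src over a weighted digraph given
    as {node: [(neighbor, cost), ...]}."""
    dist = {src: 0}
    pq = [(0, src)]                       # (distance-so-far, node)
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd              # found a shorter route to v
                heapq.heappush(pq, (nd, v))
    return dist
```

For the toy graph `{"A": [("B", 1), ("C", 5)], "B": [("C", 2)], "C": []}`, the distances from A come out as A=0, B=1, C=3 — the two-hop A→B→C route beats the direct link.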

20 pages
(5000 words)
, Essay

Only on StudentShare

...algorithms 7
2.1. The Gerchberg-Saxton “error-reduction” algorithm 7
2.2. Gradient-search algorithms 12
2.3. Input-output algorithms 13
3. Autocorrelation and its role in an object reconstruction 15
4. Phase retrieval stagnation: problems and solutions 17
4.1 Simultaneous twin images 17
4.2 Stripes superimposed upon the image 18
4.3 Unintentional truncation by the support constraint 19
6. Conclusion 22
Abstract
For many years lens-based optical techniques have been used for imaging, promoting the evolution of science. Nowadays coherent diffraction imaging (CDI) is a modern approach... to reconstructing an object's image on the basis of its diffraction pattern, which is recorded as a result...
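The error-reduction iteration named in the contents alternates between enforcing the measured Fourier magnitude and the object-domain constraints (support, non-negativity). A minimal NumPy sketch of one common variant (Fienup-style error reduction); the array sizes and iteration count are illustrative:

```python
import numpy as np

def error_reduction(fourier_mag, support, n_iter=300, seed=0):
    """Alternately enforce the measured Fourier magnitude and the
    object-domain constraints (support, non-negativity)."""
    rng = np.random.default_rng(seed)
    g = rng.random(fourier_mag.shape) * support   # random start in support
    for _ in range(n_iter):
        G = np.fft.fft2(g)
        # Fourier-domain step: keep the phase, replace the magnitude
        G = fourier_mag * np.exp(1j * np.angle(G))
        g = np.fft.ifft2(G).real
        # object-domain step: non-negativity and support constraint
        g = np.clip(g, 0.0, None) * support
    return g
```

Each iteration is a projection onto one constraint set and then the other; the stagnation modes discussed in section 4 (twin images, stripes, support truncation) are exactly the fixed points where this alternation stops making progress.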

3 pages
(750 words)
, Assignment

Only on StudentShare


...algorithms that are better oriented towards meeting the needs of service users.
The Filter, which currently provides its services to music- and video-related web sites, intends to expand its operations in the coming times. It is faced with the decision of entering new arenas... The Filter, as the name implies, is a company that deals with the filtering of relevant digital data to provide suggestions to users of internet websites. Although this trend has been in the market for quite a few years, this company, which started its business with a direct approach to the customer, claims to provide better recommendation technology than other companies due to enhancements and improvements in its...

2 pages
(500 words)
, Assignment

Only on StudentShare

...algorithm that was created and posted by another to solve the same problem, and analyze the differences between the effectiveness of the two algorithms. Post the results of the analysis, along with improvements.
DQ1:CK,ML,JL,AL,KSem
Code 1
The following optimizations have been used:
1) To verify whether a number is divisible by three, it is enough to make sure the sum of the number's digits is divisible by three. A counter ModSoFar is kept to track the digit sum modulo three for the currently tested string. This optimization replaces the very large division that would otherwise occur on all of the tests with a mod of a number that is at most 11. This might be optimized even further by changing... Choose one...
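The first optimization can be sketched directly; `mod_so_far` below plays the role of the ModSoFar counter:

```python
def divisible_by_3(digits):
    """Divisibility-by-3 test for a decimal string: keep a running
    digit sum modulo 3 instead of dividing the (possibly huge)
    number itself."""
    mod_so_far = 0
    for ch in digits:
        # the value being reduced is at most 2 + 9 = 11, matching the
        # "11 at most" bound described above
        mod_so_far = (mod_so_far + int(ch)) % 3
    return mod_so_far == 0
```

This works because 10 ≡ 1 (mod 3), so every decimal digit contributes its own value to the number's residue modulo 3.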

8 pages
(2000 words)
, Essay

Only on StudentShare

...algorithms. In particular, we will discuss the Blowfish encryption algorithm and its integration with applications that handle highly sensitive customer data. Blowfish Encryption... Introduction Information system security is becoming a dominant and challenging concern for organizations, as it carries many risks that are constantly changing. Every now and then, there are new security breaches resulting in massive losses in terms of customer confidence as well as revenue. As information technology is now considered a fundamental function, every organization acquires information systems for business automation. Moreover, electronic commerce has also introduced many businesses...
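Blowfish is a 16-round Feistel cipher operating on 64-bit blocks split into two 32-bit halves. The sketch below shows only that Feistel skeleton: the P-array and S-box are arbitrary stand-ins rather than Blowfish's key-derived subkeys (real Blowfish also uses four S-boxes, not one), so this toy is neither secure nor interoperable with the real cipher:

```python
MASK32 = 0xFFFFFFFF

# stand-in subkeys; real Blowfish derives these from the user key
P = [(i * 0x9E3779B9 + 1) & MASK32 for i in range(18)]
SBOX = [(i * 2654435761) & MASK32 for i in range(256)]

def F(x):
    # Blowfish-style round function: mix the four bytes of x via the S-box
    a, b = (x >> 24) & 0xFF, (x >> 16) & 0xFF
    c, d = (x >> 8) & 0xFF, x & 0xFF
    return ((((SBOX[a] + SBOX[b]) & MASK32) ^ SBOX[c]) + SBOX[d]) & MASK32

def encrypt_block(L, R):
    for i in range(16):
        L = (L ^ P[i]) & MASK32
        R ^= F(L)
        L, R = R, L
    L, R = R, L                    # undo the final swap
    R = (R ^ P[16]) & MASK32
    L = (L ^ P[17]) & MASK32
    return L, R

def decrypt_block(L, R):
    # the Feistel structure makes decryption the same walk in reverse
    L = (L ^ P[17]) & MASK32
    R = (R ^ P[16]) & MASK32
    for i in reversed(range(16)):
        R ^= F(L)
        L = (L ^ P[i]) & MASK32
        L, R = R, L
    L, R = R, L                    # undo the final swap
    return L, R
```

The symmetry is the point: decryption applies the same round function with the subkeys in reverse order, which is why Feistel ciphers like Blowfish never need an inverse of F.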

1 pages
(250 words)
, Essay

Only on StudentShare

...algorithms will be employed to develop the cyber security walls between cyber sites at critical infrastructures.
This research concludes that the evidence-based model adequately sheds light on the ambiguity, or uncertainty, in user feedback to the CPS evaluation, and thus in the provisional risk assessments for the whole physical and cyber protection scheme. The actions required to improve... Evidence-Based Techniques for Evaluating Cyber Protection Systems for Critical Infrastructures
At present, modern societies confront new problems, such as global terrorism and other malevolent attacks, which make securing critical infrastructures more challenging. This, alongside the growing reliance of societ...