Greedy and Dynamic Programming in DAA

Title: Greedy and Dynamic Programming in DAA
Course: Design and Analysis of Algorithms
Institution: SRM Institute of Science and Technology
Pages: 8



Description

SRM INSTITUTE OF SCIENCE AND TECHNOLOGY, KATTANKULATHUR
DESIGN AND ANALYSIS OF ALGORITHMS (18CSC204J)
GREEDY AND DYNAMIC PROGRAMMING

Greedy Method

The greedy method is an effective strategy for optimization problems that have the following two properties:

1. Greedy-choice property: a globally optimal solution can be reached by making the best local choice at each step.
2. Optimal substructure: an optimal solution to the problem contains optimal solutions to its subproblems.

The second property can make greedy algorithms look like dynamic programming; however, the two approaches are quite different.
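The greedy-choice property can be illustrated with the classic coin-change example (the example and function names below are illustrative and not from the notes). For canonical denominations such as 25, 10, 5, 1, the locally optimal choice of always taking the largest coin that fits yields a globally optimal answer; for arbitrary denominations the same strategy can fail.

```python
# A minimal sketch of the greedy-choice property: make change by always
# taking the largest coin that still fits. Works optimally for canonical
# coin systems (e.g. 25/10/5/1), but not for arbitrary denominations.

def greedy_change(amount, coins=(25, 10, 5, 1)):
    """Return the list of coins chosen greedily for `amount` (in cents)."""
    chosen = []
    for coin in coins:          # coins assumed sorted in decreasing order
        while amount >= coin:   # greedy choice: largest coin that fits
            chosen.append(coin)
            amount -= coin
    return chosen

print(greedy_change(63))  # [25, 25, 10, 1, 1, 1]
```

Each choice commits irrevocably, which is exactly why greedy algorithms are fast but can miss the optimum when the greedy-choice property does not hold.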

Applications

Greedy algorithms often (but not always) fail to find the globally optimal solution, because they generally do not consider all of the data: committing to a choice too early can prevent them from reaching the best overall solution later. Examples of greedy algorithms:
• Kruskal's algorithm for finding a minimum spanning tree
• Prim's algorithm for finding a minimum spanning tree
• Huffman's algorithm for building optimal Huffman trees
• Greedy routing in networks

Greedy algorithms also appear in network routing. Using greedy routing, a message is forwarded to the neighbor "closest" to the destination. The notion of a node's location (and hence "closeness") may be determined by its physical position, as in the geographic routing used by ad hoc networks. Location may also be an entirely artificial construct, as in small-world routing and distributed hash tables.

Dynamic Programming

Dynamic programming is used where the solution can be recursively described in terms of solutions to subproblems (optimal substructure). The algorithm finds solutions to subproblems and stores them in memory for later use. This is far more efficient than "brute-force" methods, which solve the same subproblems over and over again.

Optimal substructure: an optimal solution to a problem contains optimal solutions to its subproblems.

Overlapping subproblems: there are few subproblems in total, but many recurring instances of each.

Bottom-up: solve subproblems from the bottom up, filling in a table of solved subproblems that are then used to solve larger problems.

Huffman Encoding

Huffman codes are among the most effective and widely used data-compression techniques. The Huffman coding problem is to find a minimum-length binary code that can be used to encode a sequence of symbols. It uses a table of the frequency of occurrence of each character to represent each character as an optimal binary string, and it is implemented with a min-priority queue. Each leaf is labeled with a character and its frequency of occurrence; each internal node is labeled with the sum of the frequencies of the leaves in its subtree.

Huffman coding is a lossless data-compression algorithm. The idea is to assign variable-length codes to the input characters, with the length of each code based on the frequency of the corresponding character: the most frequent character gets the smallest code and the least frequent character gets the largest code. The variable-length codes assigned to input characters are prefix codes, meaning the bit sequences are assigned so that the code of one character is never the prefix of the code of any other character. This is how Huffman coding guarantees that there is no ambiguity when decoding the generated bit stream. Huffman's greedy algorithm uses the frequency of occurrence of each character to build its optimal binary representation. There are two major parts to Huffman coding:
1) Build a Huffman tree from the input characters.
2) Traverse the Huffman tree and assign codes to the characters.

Steps to Build a Huffman Tree

Input: an array of unique characters along with their frequencies of occurrence. Output: the Huffman tree.
1. Create a leaf node for each unique character and build a min-heap of all leaf nodes. (The min-heap is used as the priority queue; the frequency field is used to compare two nodes, so initially the least frequent character is at the root.)
2. Extract the two nodes with the minimum frequency from the min-heap.
3. Create a new internal node with frequency equal to the sum of the two extracted nodes' frequencies. Make the first extracted node its left child and the second its right child, then add this new node to the min-heap.
4. Repeat steps 2 and 3 until the heap contains only one node. The remaining node is the root, and the tree is complete.
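The four steps above can be sketched as follows. This is a minimal illustration using Python's `heapq` as the min-heap; the tuple layout, node representation, and function names are my own choices, not from the notes.

```python
import heapq

# Sketch of the Huffman-tree construction steps: leaves go into a min-heap
# keyed by frequency; repeatedly merge the two minimum-frequency nodes
# until one root remains, then walk the tree to read off the codes.

def build_huffman_codes(freq):
    """freq: dict mapping character -> frequency. Returns char -> bit string."""
    # Step 1: one leaf per character in a min-heap keyed by frequency.
    # The integer tie-breaker keeps heapq from ever comparing the dicts.
    heap = [(f, i, {"char": c}) for i, (c, f) in enumerate(sorted(freq.items()))]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Steps 2-3: extract the two minimum-frequency nodes and merge them
        # into a new internal node whose frequency is their sum.
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, counter, {"left": left, "right": right}))
        counter += 1
    # Step 4 done: the single remaining node is the root. Traverse for codes
    # (left edge = "0", right edge = "1").
    codes = {}
    def walk(node, prefix):
        if "char" in node:                       # leaf node
            codes[node["char"]] = prefix or "0"  # single-symbol edge case
            return
        walk(node["left"], prefix + "0")
        walk(node["right"], prefix + "1")
    walk(heap[0][2], "")
    return codes

codes = build_huffman_codes({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5})
```

With these frequencies the most common character ("a") receives a 1-bit code while the rarest ("e", "f") receive 4-bit codes, and no code is a prefix of another.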

Building a Huffman Tree

Huffman's greedy algorithm constructs an optimal prefix code, called a Huffman code. The algorithm builds the tree T corresponding to the optimal code in a bottom-up manner: it begins with a set of |C| leaves and performs |C| - 1 "merging" operations to create the final tree.

Pseudocode (data structure used: min-priority queue Q)

HUFFMAN(C)
    n = |C|
    Q = C
    for i = 1 to n - 1
        z = ALLOCATE-NODE()
        x = left[z] = EXTRACT-MIN(Q)
        y = right[z] = EXTRACT-MIN(Q)
        f[z] = f[x] + f[y]
        INSERT(Q, z)
    return EXTRACT-MIN(Q)

Analysis
• Q is implemented as a binary min-heap.
• Line 2 can be performed with BUILD-HEAP in O(n) time.
• The for loop executes n - 1 times, and each heap operation requires O(lg n) time, so the loop contributes (n - 1) · O(lg n) = O(n lg n).
• Therefore the total running time of HUFFMAN on a set of n characters is O(n lg n).

Fractional Knapsack

There are n items in a store. For i = 1, 2, . . . , n, item i weighs wi > 0 and is worth vi > 0. A thief can carry at most W pounds in a knapsack. In this version of the problem, items may be broken into smaller pieces, so the thief may take only a fraction xi of item i, where 0 ≤ xi ≤ 1. Item i then contributes xi·wi to the weight in the knapsack and xi·vi to the value of the load.

Greedy-fractional-knapsack(w, v, W)
    for i = 1 to n
        x[i] = 0
    weight = 0
    while weight...
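The pseudocode above is truncated in the source, so here is a sketch of the standard greedy strategy for this problem: take items in decreasing order of value per unit weight, splitting the last item to fill the remaining capacity. Function and variable names are my own.

```python
# Sketch of the greedy fractional-knapsack strategy: sort items by value
# per unit weight, take whole items while they fit, then take a fraction
# of the next item to fill the remaining capacity exactly.

def fractional_knapsack(w, v, W):
    """w, v: weights and values of the n items; W: knapsack capacity.
    Returns (x, total_value), where x[i] is the fraction of item i taken."""
    n = len(w)
    x = [0.0] * n
    weight = 0.0
    # Consider items in decreasing order of value per unit weight.
    order = sorted(range(n), key=lambda i: v[i] / w[i], reverse=True)
    for i in order:
        if weight + w[i] <= W:
            x[i] = 1.0                   # take the whole item
            weight += w[i]
        else:
            x[i] = (W - weight) / w[i]   # take only the fitting fraction
            break
    total_value = sum(x[i] * v[i] for i in range(n))
    return x, total_value
```

For example, with weights [10, 20, 30], values [60, 100, 120], and W = 50, the greedy choice takes items 1 and 2 whole and two-thirds of item 3, for a total value of 240. Unlike the 0/1 knapsack (which requires dynamic programming), the fractional version has the greedy-choice property, so this strategy is optimal.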

