Final Exam Notes

Course: Software Development Methods
Institution: University of Virginia
Author: Anonymous User


MULTITHREADING/CONCURRENCY

● Running threads
  ○ Often it is useful for a program to carry out two or more tasks at the same time. This can be achieved by implementing threads.
  ○ Thread: a program unit that is executed independently of other parts of the program
  ○ The Java Virtual Machine executes each thread in the program for a short amount of time (a "time slice")
  ○ This gives the impression of parallel execution
  ○ If a computer has multiple central processing units (CPUs), then some of the threads can run in parallel, one on each processor
● Running a thread
  ○ Create a task to be run in a thread by implementing the Runnable interface:
    ■ public interface Runnable { void run(); }
● Thread scheduler
  ○ Thread scheduler: runs each thread for a short amount of time (a time slice)
  ○ Then the scheduler activates another thread
  ○ There will always be slight variations in running times
  ○ You cannot guess, predict, or plan which thread will run when
● Terminating threads
  ○ A thread terminates when its run method terminates
  ○ Do not terminate a thread using the deprecated stop method; use interrupt instead
  ○ interrupt does not cause the thread to terminate; it sets a boolean variable in the thread's data structure
  ○ sleep() suspends execution of the thread

CONCURRENCY DAY 2

● Race conditions
  ○ When threads share a common object, they can conflict with each other
  ○ Sample program: multiple threads manipulate a bank account
    ■ Create two sets of threads
  ○ A race condition occurs if the effect of multiple threads on shared data depends on the order in which they are scheduled
● Synchronizing object access
  ○ To solve the problem, use a lock object
    ■ Lock object: used to control threads that manipulate shared resources
  ○ Code that manipulates a shared resource is surrounded by calls to lock and unlock
    ■ lock, try (manipulate the shared resource), finally unlock
  ○ When a thread calls lock, it owns the lock until it calls unlock
  ○ A thread that calls lock while another thread owns the lock is temporarily deactivated
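The lock / try / finally-unlock pattern described above can be sketched as follows. This is a minimal illustration using the bank-account scenario from the notes; the class and field names (BankAccount, balanceLock) are assumptions, not taken from the course materials.

```java
import java.util.concurrent.locks.Lock;
import java.util.concurrent.locks.ReentrantLock;

// Sketch of surrounding shared-resource code with lock/unlock.
// Class and field names are illustrative assumptions.
public class BankAccount {
    private double balance = 0;
    private final Lock balanceLock = new ReentrantLock();

    public void deposit(double amount) {
        balanceLock.lock();               // acquire the lock (or wait for it)
        try {
            balance = balance + amount;   // manipulate the shared resource
        } finally {
            balanceLock.unlock();         // always release, even on exception
        }
    }

    public double getBalance() {
        balanceLock.lock();
        try {
            return balance;
        } finally {
            balanceLock.unlock();
        }
    }
}
```

Putting unlock in a finally clause guarantees the lock is released even if the guarded code throws, so no other thread stays deactivated forever.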





  ○ The thread scheduler periodically reactivates the thread so it can try to acquire the lock
  ○ Eventually, the waiting thread can acquire the lock
● Avoiding deadlocks
  ○ Deadlock: occurs if no thread can proceed because each thread is waiting for another to do some work first
  ○ Use a condition object - allows a thread to temporarily release a lock, and to regain the lock at a later time
    ■ Each condition object belongs to a specific lock object
● Condition objects
  ○ As long as the test is not fulfilled, call await on the condition object
    ■ Makes the current thread wait
    ■ Allows another thread to acquire the lock object
  ○ To unblock, another thread must execute signalAll on the same condition object
    ■ signalAll unblocks all threads waiting on the condition
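The await/signalAll pattern above can be sketched with a condition object guarding withdrawals: a withdrawing thread waits until a deposit makes the balance sufficient. The class and condition names (GuardedAccount, sufficientFunds) are illustrative assumptions.

```java
import java.util.concurrent.locks.Condition;
import java.util.concurrent.locks.Lock;
import java.util.concurrent.locks.ReentrantLock;

// Sketch of a condition object belonging to a specific lock object.
// Names are illustrative, not from the notes.
public class GuardedAccount {
    private double balance = 0;
    private final Lock balanceLock = new ReentrantLock();
    private final Condition sufficientFunds = balanceLock.newCondition();

    public void deposit(double amount) {
        balanceLock.lock();
        try {
            balance = balance + amount;
            sufficientFunds.signalAll();  // unblock ALL threads waiting on this condition
        } finally {
            balanceLock.unlock();
        }
    }

    public void withdraw(double amount) throws InterruptedException {
        balanceLock.lock();
        try {
            while (balance < amount) {    // as long as the test is not fulfilled...
                sufficientFunds.await();  // ...release the lock and wait
            }
            balance = balance - amount;
        } finally {
            balanceLock.unlock();
        }
    }

    public double getBalance() {
        balanceLock.lock();
        try {
            return balance;
        } finally {
            balanceLock.unlock();
        }
    }
}
```

Note the test around await is a while loop, not an if: after being signaled, the thread re-acquires the lock and must re-check the condition before proceeding.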

REVIEW

● Running Threads
  ○ A thread is a program unit that is executed concurrently with other parts of the program
  ○ The start method of the Thread class starts a new thread that executes the run method of the associated Runnable object
  ○ The sleep method puts the current thread to sleep for a given number of milliseconds
  ○ When a thread is interrupted, the most common response is to terminate the run method
  ○ The thread scheduler runs each thread for a short amount of time, called a time slice
● Terminating Threads
  ○ A thread terminates when its run method terminates
  ○ The run method can check whether its thread has been interrupted by calling the interrupted method
● Race Conditions
  ○ A race condition occurs if the effect of multiple threads on shared data depends on the order in which the threads are scheduled
● Synchronizing Object Access
  ○ By calling the lock method, a thread acquires a Lock object
  ○ Then no other thread can acquire the lock until the first thread releases the lock
● Avoiding Deadlocks
  ○ A deadlock occurs if no thread can proceed because each thread is waiting for another to do some work first
  ○ Calling await on a condition object makes the current thread wait and allows another thread to acquire the lock object



  ○ A waiting thread is blocked until another thread calls signalAll on the condition object for which the thread is waiting
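The start/run/interrupt lifecycle reviewed above can be sketched in a few lines. The class name and loop bound here are illustrative assumptions.

```java
// Sketch of the thread lifecycle: start() launches a new thread that
// executes run(); interrupt() sets a flag that run() checks.
// Class name and loop bound are illustrative assumptions.
public class CountingRunnable implements Runnable {
    public void run() {
        int i = 0;
        // Check the interrupted flag; the common response is to end run().
        while (i < 1000 && !Thread.currentThread().isInterrupted()) {
            i++;
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Thread t = new Thread(new CountingRunnable());
        t.start();       // the scheduler now runs t in time slices
        t.interrupt();   // does not kill t; just sets its boolean flag
        t.join();        // wait for run() to terminate
    }
}
```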

RECURSION 1: INTRODUCTION TO RECURSION AND RECURSIVE ALGORITHMS

● Different views of recursion
  ○ Recursive definition: n! = n * (n-1)!
  ○ Recursive procedure: a procedure that calls itself
  ○ Recursive data structure: a data structure that contains a pointer to an instance of itself
● Some important definitions
  ○ Base case: the case for which the solution can be stated non-recursively (solved directly)
  ○ Recursive case: the case for which the solution is expressed in terms of a smaller version of itself
● Why do recursive methods work?
  ○ Activation records on the run-time stack are the key:
    ■ Each time you call a function (any function) you get a new activation record
    ■ Each activation record contains a copy of all local variables and parameters for that invocation
    ■ The activation record remains on the stack until the function returns; then it is destroyed

LINKED LISTS:

● List Class
  ○ A List combines a series of Nodes to form a list
    ■ We must keep track of at least the "first" node (often called "head")
    ■ Useful but not required to keep track of the tail
  ○ The List class doesn't directly keep track of every node

INTRO TO TREES AND BINARY TREES:

● Linked Lists (high-level/brief)
  ○ Lists keep things in order
  ○ Arrays keep things in a fixed block of memory, which is good for some operations and not as good for others
  ○ Linked lists use reference pointers between list nodes (elements) to maintain order
● Trees
  ○ Data types can be simple or composite
  ○ Data structures are composite data types
    ■ Def: a collection of elements that are some combination of primitive and other composite data types
  ○ Tree classification:
    ■ Trees are a composite, hierarchical, and graph-like data structure
    ■ Trees grow down, not up

    ■ Predecessors are up, successors are down
  ○ Trees are composed of:
    ■ Nodes:
      ● Elements in the data structure (hold data)
      ● Only one parent (unique predecessor)
      ● Zero, one, or more children (successors)
      ● Leaf nodes: nodes without children (terminal)
      ● Root node: top or start node; no parent
      ● Internal nodes: nodes with children (non-terminal)
      ● Measure of degree: how many children
    ■ Edges
      ● Link a parent node with its child nodes (if applicable)
  ○ Height is the longest path (# of nodes) from root to leaf (including the root)
● Tree Definitions and Terms
  ○ Binary tree:
    ■ Each node has at MOST 2 children (a left or right child)
  ○ General tree definition:
    ■ A set of nodes T with a distinguished node, the root
      ● Made up of subtrees; each node is the root of a subtree
● Trees: Recursive Data Structure
  ○ Recursive data structure: a data structure that contains references (or pointers) to instances of that same type
  ○ Tree operations may come in two flavors:
    ■ Node-specific (e.g. hasParent() or hasChildren())
    ■ Tree-wide (e.g. size() or height()) - requires tree traversal
● Two-class strategy for recursive data structures
  ○ "Top" (tree) class
    ■ References the "first" node
    ■ Methods and fields that apply to the entire data structure
  ○ Node class
    ■ Recursively defined: references to other node objects
    ■ Contains the data stored at the node
    ■ Methods defined in this class are specific to this node or recursive (this node and its references)

BINARY SEARCH TREES:

● Binary Search Trees
  ○ A binary tree with comparable data values
  ○ Binary search tree (BST) property:
    ■ Values of all descendants in the left subtree are less than the parent node's data
    ■ Values of all descendants in the right subtree are greater than the parent node's data
  ○ BST requirement:



    ■ The data variable should be a Comparable type (not Object) in order for the data comparisons to work
● BST Find and Insert
  ○ Find an element in the tree
    ■ Compare with the root; if less, traverse left, else traverse right; repeat
    ■ Stops when found or at a leaf
    ■ Sounds like binary search
    ■ Time complexity: O(log n) for a balanced tree; in the worst case, proportional to the height of the tree
  ○ Insert a new element into the tree
    ■ Do a find operation and add at a leaf node
● Deleting from a BST
  ○ Removing a node requires
    ■ Moving its left and right subtrees
    ■ If 0 children, delete the node
    ■ If 1 child, replace the node with its only child
    ■ If 2 children, find the next largest (or smallest) value to fill in
  ○ After removing the element, find the node with which to replace it (the successor)
  ○ Where to find the successor?
    ■ The next "largest" element
    ■ The next "smallest" element
  ○ Where would these exist in the BST?
    ■ Next largest: in the right subtree
    ■ Next smallest: in the left subtree

TREE TRAVERSALS:

● Tree Traversals
  ○ A tree traversal is a specific order in which to trace the nodes of a tree
  ○ 3 common tree traversals for binary trees: pre-order, in-order, post-order
  ○ All recursive
  ○ The left subtree is ALWAYS traversed (recursively) before the right subtree is traversed
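The BST find rule and a recursive in-order traversal can be sketched as below, following the two-class idea with a minimal assumed Node class (int data instead of a Comparable type, to keep the sketch short).

```java
// Sketch of BST find (compare, then descend left or right) and an
// in-order traversal. The Node class here is a minimal assumption.
public class BstDemo {
    static class Node {
        int data;
        Node left, right;
        Node(int d) { data = d; }
    }

    // Compare with the node; go left if smaller, right if larger.
    // Stops when found or when it falls off at a leaf.
    static boolean contains(Node root, int value) {
        if (root == null) return false;      // reached past a leaf: not present
        if (value == root.data) return true; // found
        return value < root.data
            ? contains(root.left, value)
            : contains(root.right, value);
    }

    // In-order: left subtree, then the node, then the right subtree.
    // For a BST this visits the values in ascending order.
    static void inOrder(Node n, StringBuilder out) {
        if (n == null) return;
        inOrder(n.left, out);
        out.append(n.data).append(' ');
        inOrder(n.right, out);
    }
}
```

Pre-order and post-order differ only in where the append happens relative to the two recursive calls.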



● Iterative depth-first search
  ○ Depth-first search (DFS) goes deeply into the tree and then backtracks when it reaches the leaves
  ○ The DFS pseudocode algorithm uses a Stack
● Iterative breadth-first search
  ○ Breadth-first search (BFS) visits all nodes on the same level before going to the next
  ○ The BFS pseudocode algorithm uses a Queue

BINARY HEAP:

● Heaps ("binary heaps")
  ○ The heap data structure is an example of a balanced binary tree
  ○ Useful in solving 3 types of problems
    ■ Finding the min or max value within a collection
    ■ Sorting numerical values into ascending or descending order
    ■ Implementing another important data structure called a priority queue
  ○ A binary heap is a heap data structure created using a binary tree
  ○ Can be seen as a binary tree with 2 additional constraints
    ■ Shape property:
      ● The heap is a complete binary tree: a binary tree of height (i) in which all leaf nodes are located on level (i) or level (i-1), and all leaves on level (i) are as far to the left as possible
    ■ Order (heap) property:
      ● The data value stored in a node is less than or equal to the data values stored in all of that node's descendants
      ● (the value stored in the root is always the smallest value in the heap)
● Min Heap vs Max Heap
  ○ We could just as easily define a heap in which a node's value is greater than or equal to the data values stored in all of that node's descendants
● Min heap
  ○ The smallest value is the root of the tree
  ○ All nodes are smaller than ALL of their descendants
  ○ NOT a binary search tree; values larger than the root can appear on either side as children
● Implementation
  ○ A heap must be a complete tree
  ○ All leaves are on the lowest 2 levels
  ○ Nodes are ADDED on the lowest level, from left to right
  ○ Nodes are REMOVED (to replace the root) from the lowest level, from right to left
● Binary Heap
  ○ The 2 most important mutator methods on heaps are:
    ■ Inserting a new value into the heap and













      Retrieving the smallest value from the heap (in other words, removing the root)
  ○ The insertHeapNode() method adds a new data value to the heap
    ■ Must ensure the insertion maintains both the order and shape properties of the heap
    ■ The retrieval method, getSmallest(), removes and returns the smallest value in the heap, which must be the value stored in the root
    ■ The method also rebuilds the heap because it removes the root, and all nonempty trees must have a root by definition
● Inserting a node into a Heap
  ○ Add the element to the bottom level of the heap - maintaining the shape property
  ○ Compare the added element with its parent; if they are in the correct order, stop
  ○ If not, swap the element with its parent and return to the previous step (the parent must be less than or equal to its children - maintaining the order property)
  ○ The number of operations required depends on the number of levels the new element must rise to satisfy the heap property
  ○ Time complexity: O(log n)
● Adding to a heap
  ○ The min value is always the root element (in a min heap)
  ○ In this case, since '9' was added to the heap, and it was the smallest item, it rose to the top and became the root of the heap
● Deleting a node from a Heap
  ○ Replace the root of the heap with the last element on the last level - maintaining the shape property
  ○ Compare the new root with its children; if they are in the correct order, stop
  ○ If not, swap the element with one of its children and return to the previous step
    ■ Swap with its smaller child in a min-heap and its larger child in a max-heap - maintaining the order property
  ○ In the worst case, the new root has to be swapped with its child on each level until it reaches the bottom level of the heap, meaning the delete operation has time complexity relative to the height of the tree
    ■ Time complexity: O(log n)
  ○ When retrieving the smallest element, delete the root node
● Removing an element from a heap
  ○ For a priority queue, you always remove the least-value element and replace it with the last node on the right at the bottom level of the heap
  ○ When retrieving the smallest element, we delete the root node (min heap)
● Remove() method: remove root
  ○ Calling remove() does not specify which element to remove
  ○ It always removes the root of the binary heap (nothing else)







  ○ So that the tree (heap) doesn't remain without a root node, replace it with the last node on the right at the bottom level of the heap
● Why keep a balanced Tree?
  ○ The height of the tree determines the time required for adding and removing elements - keeping the tree balanced maximizes performance
  ○ Adding an element requires 1 step for every level of height
  ○ Adding and removing elements is O(log n)
● Heaps in 1D Arrays
  ○ We can store the elements of our heap in a one-dimensional array in strict left-to-right, level order ("breadth-first traversal")
  ○ That is, we store all of the nodes on level i from left to right before storing the nodes on level i + 1
    ■ This one-dimensional array representation of a heap is called a heapform
  ○ We do not need pointers in this array-based representation because the parent, children, and siblings of a given node must be placed into array locations that can be determined with some simple calculations...
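The ideas above can be combined into one array-backed min-heap sketch: level-order storage with computed parent/child indices (no pointers), insert with sift-up, and removal of the root with sift-down. Method names here (insert, removeMin) are illustrative assumptions, not the insertHeapNode()/getSmallest() names used in the notes.

```java
import java.util.ArrayList;

// Array-backed min-heap sketch (0-based indexing assumed).
// Parent/child positions come from simple index calculations.
public class ArrayMinHeap {
    private final ArrayList<Integer> heap = new ArrayList<>();

    static int parent(int i) { return (i - 1) / 2; }
    static int left(int i)   { return 2 * i + 1; }
    static int right(int i)  { return 2 * i + 2; }

    public void insert(int value) {
        heap.add(value);                    // bottom level, leftmost open slot (shape)
        int i = heap.size() - 1;
        while (i > 0 && heap.get(i) < heap.get(parent(i))) {
            swap(i, parent(i));             // rise until the order property holds
            i = parent(i);
        }
    }

    public int removeMin() {
        int smallest = heap.get(0);              // the root is always the minimum
        int last = heap.remove(heap.size() - 1); // last node on the bottom level
        if (!heap.isEmpty()) {
            heap.set(0, last);              // replace the root (shape property)
            int i = 0;
            while (true) {
                int m = i;
                if (left(i) < heap.size() && heap.get(left(i)) < heap.get(m)) m = left(i);
                if (right(i) < heap.size() && heap.get(right(i)) < heap.get(m)) m = right(i);
                if (m == i) break;          // order property restored
                swap(i, m);                 // swap with the smaller child
                i = m;
            }
        }
        return smallest;
    }

    private void swap(int i, int j) {
        int tmp = heap.get(i);
        heap.set(i, heap.get(j));
        heap.set(j, tmp);
    }
}
```

Both loops move one level per iteration, which is why insert and removeMin are O(log n) on a complete tree.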

