Data and structures notes

Author: Rit Mat
Course: Algorithms and their Applications
Institution: Brunel University London


Algorithms and Data Structures SUMMARY 

Big O notation

● Big O notation is the language we use for talking about how long an algorithm takes to run. It's how we compare the efficiency of different approaches to a problem (see the sketch after these bullets).



● Big O ignores constants, but sometimes the constants matter. If we have a script that takes 5 hours to run, an optimization that divides the runtime by 5 might not affect big O, but it still saves you 4 hours of waiting.



● Beware of premature optimization. Sometimes optimizing time or space negatively impacts readability or coding time. For a young startup it might be more important to write code that's easy to ship quickly or easy to understand later, even if this means it's less time and space efficient than it could be.
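To make the notation concrete, here is a minimal sketch (the function names are illustrative, not from the notes) of three routines with different Big O costs:

    def get_first(items):
        # O(1): one step, no matter how long the list is.
        return items[0]

    def contains(items, target):
        # O(n): in the worst case we scan every element once.
        for item in items:
            if item == target:
                return True
        return False

    def has_duplicate(items):
        # O(n^2): for each element we rescan the whole list.
        for i in range(len(items)):
            for j in range(len(items)):
                if i != j and items[i] == items[j]:
                    return True
        return False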

Data Structures

● Arrays have O(1)-time lookups. But you need enough uninterrupted space in RAM to store the whole array. And the array items need to be the same size.



● Another problem with arrays is that you have to specify their size ahead of time. There are two ways to get around this: dynamic arrays and linked lists. Linked lists have faster appends and prepends than dynamic arrays, but dynamic arrays have faster lookups.



● Fast lookups are really useful, especially if you can look things up not just by indices (0, 1, 2, 3, etc.) but by arbitrary keys ("lies", "foes"...any string). That's what hash tables are for. The only problem with hash tables is that they have to deal with hash collisions, which means some lookups could be a bit slow.
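As a quick sketch of hash-table lookups, Python's built-in dict is a hash table under the hood (the example data here is illustrative):

    # Keys can be arbitrary (hashable) values, not just integer indices.
    word_counts = {"lies": 4, "foes": 1}

    # Average-case O(1) lookup by key.
    print(word_counts["lies"])        # 4

    # Inserts are also average-case O(1); a hash collision can make an
    # individual operation slower, but that's the uncommon case.
    word_counts["friends"] = 2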

Arrays

Strengths

● Fast lookups. Retrieving the element at a given index takes O(1) time, regardless of the length of the array.



● Fast appends. Adding a new element at the end of the array takes O(1) time.

Weaknesses ● Fixed size. You need to specify how many elements you're going to store in your array ahead of time. (Unless you're using a fancy dynamic array.)

● Costly inserts and deletes. You have to "scoot over" the other elements to fill in or close gaps, which takes worst-case O(n) time.
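A minimal sketch of the "scooting" cost, using a Python list to stand in for the array (the values are illustrative):

    nums = [10, 20, 30, 40, 50]

    # O(1): jump straight to an index.
    third = nums[2]          # 30

    # O(n) worst case: inserting at the front forces every existing
    # element to scoot one slot to the right.
    nums.insert(0, 5)        # [5, 10, 20, 30, 40, 50]

    # Deleting from the front similarly scoots everything back left.
    nums.pop(0)              # [10, 20, 30, 40, 50] again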

In-place Algorithms

An in-place function modifies data structures or objects outside of its own stack frame (i.e., stored on the process heap or in the stack frame of a calling function). Because of this, the changes made by the function remain after the call completes.

In-place algorithms are sometimes called destructive, since the original input is "destroyed" (or modified) during the function call.

● Working in-place is a good way to save time and space. An in-place algorithm avoids the cost of initializing or copying data structures, and it usually has an O(1) space cost.



● But be careful: an in-place algorithm can cause side effects. Your input is "destroyed" or "altered," which can affect code outside of your function.



● Generally, out-of-place algorithms are considered safer because they avoid side effects. You should only use an in-place algorithm if you're space-constrained or you're positive you don't need the original input anymore, even for debugging.
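A minimal sketch contrasting the two styles (the function names are illustrative, not from the notes):

    def square_in_place(nums):
        # Mutates the caller's list: O(1) extra space, but the
        # original values are destroyed.
        for i in range(len(nums)):
            nums[i] *= nums[i]

    def square_out_of_place(nums):
        # Returns a new list: O(n) extra space, but the caller's
        # input is left untouched.
        return [n * n for n in nums]

    values = [1, 2, 3]
    squares = square_out_of_place(values)   # values is still [1, 2, 3]
    square_in_place(values)                 # values is now [1, 4, 9]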

Dynamic Arrays

A dynamic array is an array with a big improvement: automatic resizing.



One limitation of arrays is that they're fixed size, meaning you need to specify the number of elements your array will hold ahead of time.



A dynamic array expands as you add more elements. So you don't need to determine the size ahead of time.

Strengths

● Fast lookups. Just like arrays, retrieving the element at a given index takes O(1) time.



● Variable size. You can add as many items as you want, and the dynamic array will expand to hold them.



● Cache-friendly. Just like arrays, dynamic arrays place items right next to each other in memory, making efficient use of caches.

Weaknesses ● Slow worst-case appends. Usually, adding a new element at the end of the dynamic array takes O(1) time. But if the dynamic array doesn't have any room for the new item, it'll  need to expand, which takes O(n) time.

● Costly inserts and deletes. Just like arrays, elements are stored adjacent to each other. So adding or removing an item in the middle of the array requires "scooting over" other elements, which takes O(n) time.
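A toy sketch of the resizing idea (assuming the common capacity-doubling strategy; this is not a production implementation):

    class DynamicArray:
        def __init__(self):
            self.capacity = 1                     # slots allocated
            self.size = 0                         # slots in use
            self.data = [None] * self.capacity

        def append(self, value):
            if self.size == self.capacity:
                self._resize(2 * self.capacity)   # the rare O(n) step
            self.data[self.size] = value          # the usual O(1) step
            self.size += 1

        def _resize(self, new_capacity):
            # Allocate a bigger block and copy every element over.
            new_data = [None] * new_capacity
            for i in range(self.size):
                new_data[i] = self.data[i]
            self.data = new_data
            self.capacity = new_capacity

    arr = DynamicArray()
    for n in range(10):
        arr.append(n)   # capacity doubles at sizes 1, 2, 4 and 8

Because the capacity doubles each time, the occasional O(n) copy averages out, and appends are O(1) amortized.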

 3...

