Axiomatic definition of ADT behavior
- List ADT
- Stack ADT
- Queue ADT
- Bounded Stack ADT
- Bounded Queue ADT

List data structure
- general ops: get kth item, insert at slot k, delete at slot k
- implementing lists: as array, as linked cells
- ordered lists are especially nice
  - order information can assist in list searching
  - binary search can find an item in O(log N) time worst case
- special lists:
  - Stack
    - general ops: push, pop, top, empty, size
    - keep a pointer to the most recent slot (t)
    - push is insert at slot t; pop is delete from slot t
    - push, pop are O(1) since we access the top slot directly via the pointer
      - no searching through the list, no rearrangement of locations
      - this applies to implementations by links or array
  - Queue
    - general ops: add, remove, front, empty, size
    - keep pointers to the front slot and the back slot
    - add is insert at the back slot; remove is delete from the front slot
    - add, remove are O(1) since we access those slots directly by pointer
      - no searching through the list, no rearrangement of locations
      - this applies to implementations by links or array
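The O(1) stack and queue operations above can be sketched in Python (a minimal illustration, not from the notes; class and method names follow the ops listed above). The list's end and the deque's two ends serve as the directly accessed slots, so no searching or shifting is needed:

```python
from collections import deque

class Stack:
    """Stack backed by a Python list; the end of the list is the top slot."""
    def __init__(self):
        self._items = []
    def push(self, x):
        self._items.append(x)        # insert at the top slot: O(1)
    def pop(self):
        return self._items.pop()     # delete from the top slot: O(1)
    def top(self):
        return self._items[-1]       # direct access via the "pointer", no search
    def empty(self):
        return len(self._items) == 0
    def size(self):
        return len(self._items)

class Queue:
    """Queue backed by a deque: add at the back slot, remove from the front slot."""
    def __init__(self):
        self._items = deque()
    def add(self, x):
        self._items.append(x)        # insert at the back slot: O(1)
    def remove(self):
        return self._items.popleft() # delete from the front slot: O(1)
    def front(self):
        return self._items[0]
    def empty(self):
        return len(self._items) == 0
    def size(self):
        return len(self._items)
```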

Bubble sort
- O(N^2) time required to sort a list of N items
- pro: easy to program
- con: very slow; useless for sorting lists of any serious size
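A sketch of the algorithm (not from the notes): each pass sweeps the list, swapping adjacent out-of-order pairs, so the largest remaining item "bubbles" to the end; up to N-1 passes of up to N-1 comparisons give the O(N^2) worst case:

```python
def bubble_sort(a):
    """Bubble sort in place: O(N^2) comparisons/swaps in the worst case."""
    n = len(a)
    for i in range(n - 1):
        swapped = False
        # after i passes, the last i slots already hold the largest items
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:   # no swaps in a full pass means the list is sorted
            break
    return a
```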

Tree data structure in general (n-ary tree)
Binary Tree data structure
- any n-ary tree can be represented as a binary tree
Binary Search Tree
- order property: all nodes in the left child subtree have values smaller than the root; all nodes in the right child subtree have values greater
Tree balancing:
- AVL: imbalance condition, rotations to re-balance
  - rotations alter the tree structure but maintain the order property
  - know the worst case time complexity for insert, remove, contains
- Splay: another approach to balancing
  - splaying is done any time a node is accessed
  - splaying applies order-preserving rotations until the accessed node is moved to the root
  - know the amortized worst case time complexity
Traversals: in-order, pre-order, post-order, breadth-first
- in-order gives a sorted list from a BST
- pre-order and post-order give lists dependent on the tree structure
BST Sort
- not a real sorting algorithm name, but a way to remember that a fairly efficient way to sort a list is to first build a binary search tree, then traverse it in the right way
- average case analysis:
  1) build a BST from the list: O(N log N)
     - one insert in a BST: O(log N) for a tree with N elements
     - doing N inserts: O(N log N)
  2) in-order traversal to get the sorted order of elements: O(N)
     - since we have to visit each of the N nodes in the BST
  - total average case time complexity is O(N log N) + O(N), which is O(N * (1 + log N)), which is O(N log N)
- worst case analysis:
  1) build a BST from the list: O(N * N), which is O(N^2)
     - one insert in a BST: O(N) for a tree with N elements
     - doing N inserts: O(N * N), which is O(N^2)
  2) in-order traversal to get the sorted order of elements: O(N)
  - total worst case time complexity is O(N^2) + O(N), which is dominated by O(N^2)
AVL Tree Sort
- like BST sort, but we use a balanced BST, and this affects the worst case time complexities
- so know the worst case time complexities for AVL tree operations
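The two steps of BST sort (build, then in-order traversal) can be sketched as follows (a minimal unbalanced-BST illustration, not from the notes; names like `bst_sort` are my own):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def insert(root, value):
    """One BST insert: O(log N) average, O(N) worst case (degenerate tree)."""
    if root is None:
        return Node(value)
    if value < root.value:
        root.left = insert(root.left, value)
    else:
        root.right = insert(root.right, value)
    return root

def in_order(root, out):
    """In-order traversal visits each of the N nodes once: O(N)."""
    if root is not None:
        in_order(root.left, out)
        out.append(root.value)
        in_order(root.right, out)

def bst_sort(items):
    """Step 1: N inserts (O(N log N) average). Step 2: in-order traversal (O(N))."""
    root = None
    for x in items:
        root = insert(root, x)
    out = []
    in_order(root, out)
    return out
```

Feeding this an already-sorted list builds the degenerate (list-shaped) tree, which is exactly the O(N^2) worst case in the analysis above.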

"Big Oh" notation for expressing growth rates of functions List of length N insert is O(N) remove is O(1) once you find the element find is O(N) Queue containing N items add is O(1) remove is O(1) front is O(1) can't get middle elements like general List Stack containing N items push is O(1) pop is O(1) top is O(1) can't get middle elements like general List Binary Search Tree with N nodes find is O(log N) insert is O(log N) remove is O(log N) because we have to find it to remove it traversals are O(N) (list every element in some order) we get ordering (sorting) information from the tree structure degenerate case of BST is a List "Big Oh" is an upper bound on growth speed of an operation/algorithm but it is not necessarily a good (tight) upper bound if some operation F is O(N^2) in worst case time use then F is also O(N^3)... and then F is also O(2^N) etc. but O(N^2) is the lowest (tightest) bound of those given. we would say O(N^2) is the "best" worst case time complexity of those choices

Binary heap
- binary tree with two properties:
  - heap order property
  - structure property: always a complete binary tree, with the last level filled from left to right but perhaps not full
- representation of a binary heap as an array
- heaps can be Min or Max
- operations:
  - getMin (Max): O(1) worst and average case, since it is at the root of the heap tree
  - removeMin (Max): O(log N) worst case, O(log N) average case
  - add: O(log N) worst case, O(1) average case (75% of the heap elements are in the last 2 tree levels)
  - build:
    - simple way is O(N log N), by doing N adds
    - clever way is O(N): if you have all the elements on hand, use the special "build" operation
Priority queue
- implement a priority queue using a binary heap
- elements stored have a value and a priority; the priority is used to place elements in the binary heap
- operations:
  - enq: add an item to the priority queue
  - deq: take an item out of the priority queue; that item has the "highest" (or lowest) priority
  - front: produce the element in the queue that has the "highest" (or lowest) priority
  - size, empty, etc.
Heap sort
- special "build" a binary heap, then do repeated removeMin() to create the ordered sequence
- know the worst case time complexity if we do repeated insert to construct the heap
- know the worst case time complexity if we do the special build to construct the heap
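Heap sort as described above can be sketched on an array-based min-heap (my own illustration, not from the notes): the special O(N) build sifts down every internal node bottom-up, then N removeMin operations at O(log N) each produce the sorted order:

```python
def sift_down(a, i, n):
    """Restore the min-heap order property at index i within a[:n]."""
    while True:
        left, right = 2 * i + 1, 2 * i + 2   # implicit links via subscript arithmetic
        smallest = i
        if left < n and a[left] < a[smallest]:
            smallest = left
        if right < n and a[right] < a[smallest]:
            smallest = right
        if smallest == i:
            return
        a[i], a[smallest] = a[smallest], a[i]
        i = smallest

def build_heap(a):
    """Special O(N) build: sift down each internal node, from the bottom up."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):
        sift_down(a, i, n)

def heap_sort(items):
    """O(N) build + N removeMin ops at O(log N) each = O(N log N) worst case."""
    a = list(items)
    build_heap(a)
    out = []
    n = len(a)
    while n > 0:
        out.append(a[0])     # getMin: the root, O(1)
        n -= 1
        a[0] = a[n]          # move the last element to the root
        sift_down(a, 0, n)   # removeMin's sift-down: O(log N)
    return out
```

Using repeated add instead of `build_heap` would make the construction step O(N log N) worst case, but the overall sort is O(N log N) either way since removeMin dominates.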

Sorting
we have discussed and analyzed these sorting methods:
-- keeping a basic List sorted
-- Bubble Sort: rearranging the items in an array/list to put them in order
-- BST sort: build a binary search tree and do an in-order traversal
-- AVL sort: build a balanced BST and do an in-order traversal
-- sorting with a binary heap

Tree representations
we have represented a tree (binary tree) as
-- linked cells (BST, AVL, Splay)
-- array (binary heap)
in the array representation, the links are implicit and are obtained via array subscript arithmetic
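The subscript arithmetic can be written out as follows (a sketch assuming 0-based indexing with the root at index 0; some texts use 1-based indexing, which changes the formulas):

```python
# Implicit links for a complete binary tree stored in an array,
# with the root at index 0.

def parent(i):
    return (i - 1) // 2

def left_child(i):
    return 2 * i + 1

def right_child(i):
    return 2 * i + 2
```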