Data structures make heavy use of pointers and dynamically allocated memory.
ADT | data structure |
---|---|
list of supported operations | specify exactly how data is represented |
what should happen | algorithms for operations |
not: how to do it | has concrete costs (space and running time) |
not: how to store data | |
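To make the distinction concrete, here is a minimal Java sketch (the `IntStack` name and the choice of a stack of ints are illustrative, not from the notes): the interface plays the role of the ADT, listing operations and their meaning; a class implementing it would be the data structure.

```java
// ADT: only the supported operations and what they should do;
// nothing about representation or cost.
public interface IntStack {
    void push(int x);    // add x on top
    int pop();           // remove and return the most recently pushed element
    boolean isEmpty();   // any elements stored?
    int size();          // number of stored elements
}
```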
// Digression: resizing arrays
Arrays have fixed size (supplied at creation).
maintain capacity 𝐶 = 𝑆.length so that ¼𝐶 ≤ 𝑛 ≤ 𝐶
How to maintain this invariant?
before push: If 𝑛 = 𝐶, allocate new array of size 2𝑛, copy all elements.
after pop: If 𝑛 < ¼𝐶, allocate new array of size 2𝑛, copy all elements.
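A sketch of these two rules in Java (class name, initial capacity, and int elements are illustrative; actual course code may differ): grow to size 2𝑛 when the array is full, shrink to size 2𝑛 when it falls below a quarter full.

```java
import java.util.Arrays;

// Resizing-array stack: a concrete data structure realizing the stack ADT above.
// Apart from very small sizes it maintains the invariant C/4 <= n <= C.
public class ResizingArrayStack {
    private int[] S = new int[4];   // capacity C = S.length
    private int n = 0;              // number of stored elements

    public void push(int x) {
        if (n == S.length) resize(2 * n);              // full: grow to 2n
        S[n++] = x;
    }

    public int pop() {
        if (n == 0) throw new IllegalStateException("empty stack");
        int x = S[--n];
        if (n > 0 && n < S.length / 4) resize(2 * n);  // below C/4: shrink to 2n
        return x;
    }

    public boolean isEmpty() { return n == 0; }
    public int size()        { return n; }

    // Theta(n): allocate a new array and copy all elements over.
    private void resize(int capacity) { S = Arrays.copyOf(S, capacity); }
}
```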
Any individual push / pop operation can be expensive! Θ(𝑛) time to copy all elements to the new array.
But: one expensive operation of cost 𝑇 means the next Ω(𝑇) operations are cheap!
Formally: consider “credits/potential” Φ = min{𝑛 − ¼𝐶, 𝐶 − 𝑛} ∈ [0, 0.6𝑛]
amortized cost of an operation = actual cost (array accesses) − 4 · change in Φ
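As a quick sanity check of this definition (assuming the expanding push is charged about 2𝑛 array accesses, a read plus a write per copied element; the exact constant depends on the counting convention):

```latex
% Expanding push: n = C just before, so \Phi = \min\{n - \tfrac{1}{4}C,\; C - n\} = 0.
% Afterwards the capacity is 2n, so \Phi \approx \min\{\tfrac{n}{2},\; n\} = \tfrac{n}{2}.
\text{amortized cost} \;\approx\; 2n \;-\; 4\cdot\Bigl(\tfrac{n}{2} - 0\Bigr) \;=\; 0 \;=\; O(1)
% An ordinary push/pop costs O(1) and changes \Phi by at most 1,
% so its amortized cost is O(1) as well.
```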
Priority queue (PQ): a bag in which elements have different priorities.
Operations: insert(𝑥, 𝑝), max(), delMax(), changeKey(𝑥, 𝑝′), isEmpty(), size()
PQ implementations
Why complete binary tree shape?
Why heap ordered?
Add the new element at the only possible place: bottom-most level, next free spot.
Let the element swim up to repair the heap order.
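A sketch of insert with swim in Java, assuming the usual 1-indexed array layout of the complete tree (children of node 𝑘 are 2𝑘 and 2𝑘+1); it stores keys only, without a separate element, and omits resizing and overflow checks. The names MaxHeap, pq, and swim are illustrative.

```java
// Max-heap in a 1-indexed array: pq[1..n], children of node k are 2k and 2k+1.
public class MaxHeap {
    private int[] pq;   // pq[0] unused; keys live in pq[1..n]
    private int n;      // current number of keys

    public MaxHeap(int capacity) { pq = new int[capacity + 1]; }

    public void insert(int key) {
        pq[++n] = key;   // bottom-most level, next free spot
        swim(n);         // repair heap order on the path up to the root
    }

    // Exchange with the parent while the key is larger than its parent.
    private void swim(int k) {
        while (k > 1 && pq[k / 2] < pq[k]) {
            int tmp = pq[k / 2]; pq[k / 2] = pq[k]; pq[k] = tmp;
            k /= 2;
        }
    }
}
```

Each exchange moves the key one level up, and a complete tree has height ⌈log 𝑛⌉, which gives the 𝑂(log 𝑛) bound for insert in the table below.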
Building a heap by 𝑛 individual inserts ⇝ Θ(𝑛 log 𝑛)
instead: build the heap bottom-up, sinking every internal node from 𝑛/2 down to the root ⇝ 𝑂(𝑛) (see the sketch below).
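A sketch of that bottom-up construction, continuing the 1-indexed pq/n layout of the MaxHeap sketch above (the names construct and sink are illustrative; the 𝑂(𝑛) bound matches construct(𝐴[1…𝑛]) in the table below):

```java
// Two more methods for the MaxHeap sketch above.

// Bottom-up construction: nodes n/2+1..n are leaves and already heaps,
// so it suffices to sink every internal node, last one first.
private void construct() {
    for (int k = n / 2; k >= 1; k--) sink(k);
}

// Push the node at k down, always swapping with its larger child,
// until heap order holds below it.
private void sink(int k) {
    while (2 * k <= n) {
        int j = 2 * k;
        if (j < n && pq[j] < pq[j + 1]) j++;   // pick the larger child
        if (pq[k] >= pq[j]) break;
        int tmp = pq[k]; pq[k] = pq[j]; pq[j] = tmp;
        k = j;
    }
}
```

Most nodes sit near the bottom of the tree and sink only a few levels, so the total number of exchanges sums to 𝑂(𝑛) rather than 𝑂(𝑛 log 𝑛); delMax() reuses the same sink step at the root, which accounts for its 𝑂(log 𝑛) bound below.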
Analysis
Operation | Running Time |
---|---|
construct(𝐴[1…𝑛]) | 𝑂(𝑛) |
max() | 𝑂(1) |
insert(𝑥,𝑝) | 𝑂(log 𝑛) |
delMax() | 𝑂(log 𝑛) |
changeKey(𝑥,𝑝′) | 𝑂(log 𝑛) |
isEmpty() | 𝑂(1) |
size() | 𝑂(1) |