**By: Toqa Mahmoud**

**Data Structure**

Basic Data Structures:

- No single data structure is the best for every task; we compare them by their performance.

- Each has costs related to its maintenance: debugging, modification, updating, and improvement.

Which measure is more important?

Time complexity analysis:

- asymptotic notations: big-Oh, big-Theta, big-Omega

- Linked Lists

- Stacks

- Queues

- etc.
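As a quick sketch of the structures listed above (the class and variable names here are my own illustrative choices, not from the notes), here is how a stack, a queue, and a singly linked list behave in Python:

```python
from collections import deque

# Stack (LIFO): push and pop happen at the same end.
stack = []
stack.append(1)    # push
stack.append(2)
top = stack.pop()  # removes 2: last in, first out

# Queue (FIFO): enqueue at one end, dequeue at the other.
# deque gives O(1) operations at both ends.
queue = deque()
queue.append(1)          # enqueue
queue.append(2)
front = queue.popleft()  # removes 1: first in, first out

# Minimal singly linked list node.
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

head = Node(1, Node(2, Node(3)))  # the list 1 -> 2 -> 3
values = []
node = head
while node:               # traverse by following next pointers
    values.append(node.value)
    node = node.next
# values is now [1, 2, 3]
```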

Algorithms on data structures:

- Searching

- Sorting

Analysis of Algorithms

The efficiency of an algorithm can be measured in terms of:

- Execution time (time complexity), i.e. performance (not compilation time)

- The amount of memory required (space complexity)

Which case?

- Best case: among all possible input data of length n, we consider the one with the lowest T(n).

- Worst case: among all input data of length n, we consider the one with the highest T(n).

- Average case: T(n) is the average cost for input data of length n.
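To make the three cases concrete, here is a small illustrative example (my own, not from the notes): linear search over a list of n = 10 elements, counting the comparisons T(n) it performs.

```python
def linear_search(data, target):
    """Return (index, comparisons); index is -1 if target is absent."""
    comparisons = 0
    for i, x in enumerate(data):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

data = list(range(10))  # n = 10

# Best case: target is the first element, so T(n) = 1.
_, best = linear_search(data, 0)

# Worst case: target is absent, so every element is checked: T(n) = n.
_, worst = linear_search(data, 99)
# best == 1, worst == 10
```

On average, a successful search examines about n/2 elements, which has the same order of growth as the worst case.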

We will always consider the worst case:

- It is an upper bound: the algorithm cannot take longer.

- It is very relevant from an engineering point of view.

- For some applications, it occurs very often (Murphy's Law...).

- The average case is often just as bad when we consider the order of growth.

Order of growth

We are not interested in the precise value of T(n), but only in its order of growth with respect to n. That is, we are interested in the asymptotic behavior of the cost functions, not in their exact expression. We introduce suitable notations to highlight this asymptotic behavior:

– big-Oh notation (O)

– big-Omega notation (Ω)

– big-Theta notation (Θ)

Big-Oh notation (O)

The big-Oh notation indicates an asymptotic upper bound (worst case).

Given a function g(n), O(g(n)) is the following set of functions:

O(g(n)) = {f(n) | ∃ c, n0 > 0 such that ∀ n ≥ n0: 0 ≤ f(n) ≤ c·g(n)}


The most commonly used notation for specifying asymptotic complexity is the big-O notation.

The Big-O notation, O(g(n)), is used to give an upper bound (worst-case) on a positive runtime function f(n) where n is the input size.

Definition of Big-O:

For a function f(n) that is non-negative for all n ≥ 0, we say that f(n) = O(g(n)) (“f(n) is Big-O of g(n)”)

if there exist n0 ≥ 0 and a constant c > 0 such that f(n) ≤ cg(n) for all n ≥ n0.
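As a numeric sanity check of this definition (the function f(n) = 3n + 10 and the witnesses c = 4, n0 = 10 are illustrative choices of mine, not from the notes): since 3n + 10 ≤ 4n exactly when n ≥ 10, f(n) is O(n).

```python
# Illustrative example: f(n) = 3n + 10 is O(n).
f = lambda n: 3 * n + 10   # a hypothetical runtime function
g = lambda n: n

c, n0 = 4, 10  # witnesses: 3n + 10 <= 4n holds exactly when n >= 10
holds = all(f(n) <= c * g(n) for n in range(n0, 10_000))
# holds is True: f(n) <= c*g(n) for every checked n >= n0.
# Note that the bound fails just below n0: f(9) = 37 > 4*9 = 36,
# which is why the definition only requires the inequality for n >= n0.
```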

Big-Omega notation (Ω)

The big-Omega notation indicates an asymptotic lower bound.

Given a function g(n), Ω(g(n)) is the following set of functions:

Ω(g(n)) = {f(n) | ∃ c, n0 > 0 such that ∀ n ≥ n0: 0 ≤ c·g(n) ≤ f(n)}


Similarly, Ω(g(n)) is used to give a lower bound on a positive runtime function f(n) where n is the input size.

Definition:

For a function f(n) that is non-negative for all n ≥ 0, we say that f(n) = Ω(g(n)) (“f(n) is big-Omega of g(n)”)

if there exist n0 ≥ 0 and a constant c > 0 such that f(n) ≥ cg(n) for all n ≥ n0.
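The same illustrative function can be checked against this definition (again, f(n) = 3n + 10 and the witnesses c = 3, n0 = 0 are my own example): since 3n + 10 ≥ 3n for every n ≥ 0, f(n) is Ω(n).

```python
# Illustrative example: f(n) = 3n + 10 is Omega(n).
f = lambda n: 3 * n + 10   # a hypothetical runtime function
g = lambda n: n

c, n0 = 3, 0  # witnesses: 3n + 10 >= 3n for all n >= 0
holds = all(f(n) >= c * g(n) for n in range(n0, 10_000))
# holds is True: f(n) >= c*g(n) for every checked n >= n0.
```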

Big-Theta notation (Θ)

The big-Theta notation indicates both an upper and a lower bound.

Given a function g(n), Θ(g(n)) is the following set of functions:

Θ(g(n)) = {f(n) | ∃ c1, c2, n0 > 0 such that ∀ n ≥ n0: 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)}


Similarly, Θ(g(n)) is used to give a tight bound on a positive runtime function f(n) where n is the input size.

Definition:

For a function f(n) that is non-negative for all n ≥ 0, we say that f(n) = Θ(g(n)) (“f(n) is big-Theta of g(n)”)

if f(n) = O(g(n)) and f(n) = Ω(g(n)).
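Combining the two previous checks gives a Theta check for the same illustrative function (f(n) = 3n + 10 with witnesses c1 = 3, c2 = 4, n0 = 10, all my own example): 3n ≤ 3n + 10 ≤ 4n for every n ≥ 10, so f(n) is Θ(n).

```python
# Illustrative example: f(n) = 3n + 10 is Theta(n),
# i.e. both O(n) and Omega(n) with the witnesses below.
f = lambda n: 3 * n + 10   # a hypothetical runtime function
g = lambda n: n

c1, c2, n0 = 3, 4, 10  # witnesses: 3n <= 3n + 10 <= 4n for n >= 10
holds = all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
# holds is True: the sandwich c1*g(n) <= f(n) <= c2*g(n)
# holds for every checked n >= n0.
```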

Thank You