CISC 235, Queen's University. Instructor: Robin W Dawes. Lecture 2.

# CISC 235 Lecture Notes - Lecture 2: Random-Access Machine, Virtual Memory, Memory Address

by OC962936

What criterion should we use to choose an appropriate data structure for an application?

How about "I already understand data structure A, and I don't understand data structure B"? .... umm, no.

Perhaps "There is a built-in module for data structure A, but I would have to code data structure B myself"? .... fail!

What could be left? What could be right?

Why, computational complexity, of course! We will prefer structure A to structure B if A has a lower order of complexity for the operations we need in our particular application.

Before we discuss computational complexity, we need to agree on the type of computer we are dealing with. In particular, we need to clarify which operations can be completed in constant time.

## The R.A.M. model of computation

We will use the Random Access Machine model, which can be summarized as follows:

- sequential operations only (i.e. no parallel computation)
- all fundamental operations take constant time:
  - +, -, *, / and comparisons for integers and floating point numbers
  - comparisons on booleans
  - comparisons and type conversions on characters
  - execution control
  - accessing a memory address
  - assigning a value to a variable
- no hardware accelerators (such as caches or virtual memory) are employed

It is important to note that this model implies an upper limit on the number of digits in any number. This is true of virtually all programming languages.

The RAM model does not assume constant-time operations on strings. A string is considered to be a data structure consisting of a sequence of characters.
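A minimal sketch of why string operations are not constant time under this model: comparing two strings may require examining every character, and each character comparison is one constant-time step. (The function name and implementation below are illustrative, not from the lecture.)

```python
def strings_equal(s, t):
    """Character-by-character string comparison.

    Under the RAM model, each character comparison is one
    constant-time operation, so the whole comparison takes
    O(min(len(s), len(t))) time in the worst case, not O(1).
    """
    if len(s) != len(t):
        return False
    for a, b in zip(s, t):
        if a != b:      # one constant-time character comparison
            return False
    return True
```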

I'm assuming we are all familiar with "big O" complexity classification:

Let f(n) and g(n) be non-negative valued functions on the set of non-negative numbers. f(n) is in O(g(n)) if there are constants c and n0 such that f(n) <= c*g(n) for all n >= n0.

The significance of this is that as n gets large, the growth rate of f(n) is no greater than the growth rate of g(n). In other words, the growth of g(n) is an upper bound on the growth of f(n).

There are several complexity classes that we encounter frequently. Here is a table listing the most common ones.

| Dominant Term | Big-O class |
| --- | --- |
| c (a constant) | O(1) |
| c * log n | O(log n) |
| c * n | O(n) |
| c * n * log n | O(n * log n) |
| c * n^2 | O(n^2) |
| c * n^3 | O(n^3) |
| c * n^k where k is a constant > 3 | O(n^k) |
| c * k^n where k is a constant > 1 | O(k^n) |
| c * n! | O(n!) |
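To get a rough feel for how far apart these classes sit, here is a small illustrative sketch (the sample sizes are my own choice) tabulating a few of the dominant terms:

```python
import math

# Compare the growth of log n, n, n log n, and n^2 at a few sizes.
# The base of the logarithm only changes the constant factor c,
# so base 2 is used here for concreteness.
for n in (10, 100, 1000):
    print(f"n={n:5d}  log n={math.log2(n):6.1f}  "
          f"n log n={n * math.log2(n):10.0f}  n^2={n**2:10d}")
```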

## Combinations of Functions

If f1(n) is in O(g1(n)), and f2(n) is in O(g2(n)), then:

- f1(n) + f2(n) is in O(max(g1(n), g2(n)))
- f1(n) * f2(n) is in O(g1(n) * g2(n))
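The two rules correspond directly to code structure: blocks that run one after another add (so the larger term dominates), while nested loops multiply. A sketch, assuming simple step-counting loops (the function names are illustrative):

```python
def sequential(n):
    # An O(n) loop followed by an O(n^2) loop:
    # O(n) + O(n^2) = O(max(n, n^2)) = O(n^2).
    steps = 0
    for i in range(n):          # f1(n) = n
        steps += 1
    for i in range(n):          # f2(n) = n^2
        for j in range(n):
            steps += 1
    return steps                # n + n^2 total, which is in O(n^2)

def nested(n):
    # An O(n) loop whose body is itself O(n):
    # O(n) * O(n) = O(n^2).
    steps = 0
    for i in range(n):
        for j in range(n):
            steps += 1
    return steps                # n * n total
```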

So far this should all be very familiar. But O classification is just the smallest first step in the field of computational complexity. There are many other ways of grouping functions together based on the resources (time and/or space) they require. We will consider two more: Omega classification and Theta classification.

Consider this function:

```python
def A(n):
    if n % 2 == 0:                  # even n: print n stars
        for i in range(1, n + 1):
            print('*')
                                    # odd n: do nothing
```

