
CMPT 115 Lecture Notes - Lecture 12: Process Control Block, The Queue, Multiprocessing

8 pages · Winter 2016

Department
Computer Science
Course Code
CMPT 115
Professor
Jason Bowie
Lecture
12

The Queue ADT
CMPT 115 lecture notes
Notes written by Michael Horsch, Mark Eramian, Ian McQuillan, Lingling Jin, and Dmytro Dyachuk
Objectives
After this topic, students are expected to
1. describe what queues are in their own words.
2. define the data structure of the Queue ADT.
3. relate the data structure of the Queue ADT to that of the List ADT.
4. define and implement the operations of the Queue ADT.
5. relate the operations of the Queue ADT to those of the List ADT (reusing some List ADT operations).
6. analyze the complexities of the operations of the Queue ADT.
7. apply the Queue ADT in different applications.
Outline
Contents
1 Queues 1
2 Applications of Queues 2
2.1 Buffering .............................................. 2
2.2 Process Scheduling ...................................... 3
3 Queue ADT 5
3.1 Time Complexity of Queue Operations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
1 Queues
A Queue
A queue is another type of ordered, linear list.
Like stacks, they are restricted in the way that we can insert and delete from them.
A familiar concept: a first-come, first-served lineup.
Anybody who joins the lineup joins at the end.
People who leave the lineup do so at the front.
Queues are known as a first-in first-out (FIFO) data structure.
The first element added to the queue will be the first element to come out of the queue.
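This first-come, first-served behaviour can be sketched with Python's standard-library collections.deque (an illustrative choice, not the course's own implementation, which comes later):

```python
from collections import deque

# A lineup: people join at the back, leave from the front.
lineup = deque()
lineup.append("Alice")   # first to arrive
lineup.append("Bob")
lineup.append("Carol")

# First in, first out: Alice, who arrived first, leaves first.
print(lineup.popleft())  # Alice
print(lineup.popleft())  # Bob
```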
Queues
Deleting from the front of the queue is known as a dequeue. We can dequeue the front element from
the queue.
Adding to the back of the queue is known as an enqueue. We can enqueue some new element onto the
back of the queue.
These are the only ways in which we can insert and delete from the queue.
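A minimal sketch of an ADT offering just these two operations, backed by a Python list (the operation names follow the lecture; the implementation details are an assumption for illustration):

```python
class Queue:
    """A queue built on a Python list: enqueue at the back, dequeue from the front."""

    def __init__(self):
        self._items = []

    def enqueue(self, item):
        """Add item to the back of the queue."""
        self._items.append(item)

    def dequeue(self):
        """Remove and return the front element of the queue."""
        if not self._items:
            raise IndexError("dequeue from empty queue")
        return self._items.pop(0)  # O(n) with a list; a linked structure can make this O(1)

    def is_empty(self):
        return len(self._items) == 0
```

Note that popping from the front of a Python list costs O(n); the time-complexity section later in these notes discusses what it takes to make both operations O(1).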
Picture from http://www.cs.mcgill.ca/ cs251/OldCourses/1997/topic6/
2 Applications of Queues
Buffering
Process Scheduling
2.1 Buffering
[Embedded video player: a YouTube cat video (4:59), used as the running buffering example]
YouTube sends video data ("producer"); video player displays data ("consumer")
Variables that affect video experience:
Resolution of the video (320p, 720p, 1080p, HD, etc.)
Network lag, varies over time ("data delivery time")
Computer speed
Video data is buffered: accumulated to allow smooth(er) viewing.
Program P (e.g., YouTube) produces some data elements.
Program C (e.g., viewer) consumes those data elements.
Data elements (e.g., video) are communicated in pieces.
Potential problems:
If P produces faster than C can consume, P will have to wait for C.

If C consumes faster than P produces, C will have to wait.
Communication rate varies over time (sometimes faster/slower)
A buffer allows more than one element to be "in communication" at the same time.
This potentially reduces the amount of time either P or C has to wait for the other.
It is particularly effective if P and/or C produce/consume data in bursts.
Buffers are built using queues, because:
In order to preserve communication order, data must be removed from the buffer in the same
order in which it was put in.
We aren’t allowed to read from an empty buffer, or write to a full buffer.
Buffers are used anywhere the flow of data needs to be “smoothed”:
streaming video
network communication
data transfer ("input/output") within a computer system (e.g., console input/output, file input/output)
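The buffer rules above (FIFO order, no reading when empty, no writing when full) can be sketched as a small bounded queue; the capacity, method names, and chunk labels here are assumptions for illustration:

```python
from collections import deque

class Buffer:
    """Bounded FIFO buffer between a producer and a consumer (illustrative sketch)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._data = deque()

    def write(self, chunk):
        """Producer side: refuse writes to a full buffer."""
        if len(self._data) == self.capacity:
            return False           # producer must wait
        self._data.append(chunk)
        return True

    def read(self):
        """Consumer side: refuse reads from an empty buffer."""
        if not self._data:
            return None            # consumer must wait
        return self._data.popleft()

# Producer bursts ahead of the consumer; order is preserved.
buf = Buffer(capacity=3)
for chunk in ["frame1", "frame2", "frame3", "frame4"]:
    buf.write(chunk)               # "frame4" is rejected: buffer is full
print(buf.read())                  # frame1 (the oldest chunk comes out first)
```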
2.2 Process Scheduling
Queues are very common in operating systems for process scheduling in multiprocessing systems.
What is a multiprocessing system?
Multiple programs all running "at the same time", sharing the same CPU(s).
If there is one CPU and several programs, are they really all running at the same time?
What is process scheduling?
When, and for how long, (and on which processor) does a program get to run?
In other words, process scheduling seeks to share limited CPU resources between multiple processes
in a fair way.
Operating systems manage processes by storing all of the information about a process in a structure
called a process control block (PCB).
In the simplest form of scheduling, the process control blocks are placed on a queue. When a PCB
gets to the front of the queue, that process gets to run.
New processes are enqueued on the runnable queue.
When a process makes it to the front of the runnable queue, it runs for a fixed amount of time, or until
it releases the CPU on its own (whichever comes first).
A process may release the CPU early if it terminates, or enters an I/O wait state, etc.
A process that uses its full time gets re-enqueued on the tail of the runnable queue.
Processes that stop to wait for I/O are enqueued on an I/O wait queue (depending on the kind of I/O).
Processes that are dequeued from an I/O wait queue (because their I/O is done) are enqueued on an
auxiliary runnable queue. This queue has priority over the normal runnable queue.
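The round-robin discipline described above can be sketched as a simulation in which each PCB is reduced to a process name and its remaining CPU time (the time quantum and example processes are assumptions; I/O wait states and the auxiliary queue are omitted for brevity):

```python
from collections import deque

def round_robin(processes, quantum):
    """Simulate simple round-robin scheduling.

    processes: list of (name, total_time) pairs; quantum: fixed time slice.
    Returns the order in which the processes finish.
    """
    runnable = deque(processes)                # queue of (name, remaining) "PCBs"
    finished = []
    while runnable:
        name, remaining = runnable.popleft()   # dequeue the front PCB
        remaining -= quantum                   # run for one time slice
        if remaining > 0:
            runnable.append((name, remaining)) # used its full slice: re-enqueue at the tail
        else:
            finished.append(name)              # terminated: releases the CPU
    return finished

print(round_robin([("A", 3), ("B", 1), ("C", 2)], quantum=1))  # ['B', 'C', 'A']
```

With a quantum of 1, process B (needing only 1 unit) finishes on its first turn, while A and C cycle back to the tail of the queue until their remaining time runs out.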