1700 Midterm Review


Department: Media, Information and Technoculture
Course Code: Media, Information and Technoculture 2200F/G
Professor: Svitlana Matviyenko

Lecture Two: Pictures

Charles Babbage (1791-1871):
• Wanted to move calculation from printed tables to mechanical computation, since tables were prone to error
• The Difference Engine:
o Automatic mechanical calculator designed to tabulate polynomial functions
o Functions commonly used by navigators and scientists can be approximated by polynomials, which made the difference engine useful
o Funded by the British government, but the project was killed as Babbage went on to design the Analytical Engine before the Difference Engine was complete
• The Analytical Engine:
o The first design for a general-purpose computer
o Designed by English mathematician Charles Babbage
o Used an arithmetic logic unit, control flow and memory; it would have been Turing complete
o Babbage was never able to complete construction of any of his machines; a working difference engine based on his design was later produced by Georg Scheutz

Ada Lovelace (1815-1852):
• Worked on Babbage's early general-purpose computer, the Analytical Engine
• Her notes include what is recognized as the first algorithm intended to be carried out by a machine
• Often described as the world's first computer programmer
• Had a vision of computers going beyond mere calculating or number-crunching

Claude Shannon:
• Quantified information
• Explained how symbols of communication are transmitted and how symbols convey meaning
• Information is a statistical measure of the uncertainty or entropy of a system
• Used Boolean algebra (binary logic) to explain switching circuits, bringing logic into electrical engineering
• Through digital logic Shannon was approaching something like software
• First worked on an analog machine (the Differential Analyzer) with Vannevar Bush; later made the analog digital

A Mathematical Theory of Communication (1948):
• Article written by Claude Shannon; later republished with Weaver and made less mathematical
• One of the founding works in the field of information theory
• Explains how symbols of communication are transmitted, how transmitted symbols convey meaning, and the effect of the received meaning
• Five moments in the communication process: an information source, a transmitter, the message, the channel of communication, the receiver
• Information is the content of communication; it is what needs to be transported with minimum loss of quality from sender to receiver
• Also developed the concepts of information entropy and redundancy, and the term "bit" as a unit of information
• Basic elements of communication laid out in the article:
o An information source that produces a message
o A transmitter that operates on the message to create a signal which can be sent through a channel
o A channel, which is the medium over which the signal is sent
o A receiver, which transforms the signal back into the message intended for delivery
o A destination, which can be a person or a machine for whom the message is intended
• Before Shannon, information was generally regarded as meaning rather than as "a mathematically defined quantity divorced from any concept of news or meaning"
• Shannon instead defined information through the statistical probabilities of occurrence, i.e. the variation in the message transmitted

Noise:
• Information is defined by its relation to noise (i.e. what threatens to distort or corrupt it)
• Information is a statistical measure of the uncertainty or entropy of a system
• For Shannon, information exists as long as there is a choice of alternative messages (no uncertainty = no information)
• Entropy: the expected measure of uncertainty
o Entropy is usually measured in bits (a binary digit, 0 or 1)
o Coin: low entropy; dice: high entropy (harder to guess the outcome, so more uncertainty; see the worked example below)
• **IMPORTANT: measure the amount of information/entropy in the message in order to determine the minimum channel capacity required to reliably transmit the encoded message
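A quick worked example, not from the lecture, of how Shannon entropy puts numbers on the coin vs. dice comparison above. The sketch below is in Python and assumes a fair coin, a fair six-sided die, and a biased coin as illustrative sources; the entropy helper is just for this example.

```python
from math import log2

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over all outcomes."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# Fair coin: two equally likely outcomes -> 1 bit of uncertainty per toss
print(entropy([1/2, 1/2]))   # 1.0

# Fair die: six equally likely outcomes -> about 2.585 bits, i.e. higher entropy
print(entropy([1/6] * 6))    # 2.5849...

# A biased coin (90/10) is easier to guess, so it carries less information
print(entropy([0.9, 0.1]))   # about 0.469
```

This is the sense in which more uncertainty means more information to transmit, and why the entropy of the source sets the minimum channel capacity needed for reliable transmission.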
Symbolic Logic and Switching Theory:
• A Symbolic Analysis of Relay and Switching Circuits (1938, Claude Shannon)
• Shannon showed that Boolean algebra can describe the operation of switching circuits
• **Uses Boolean logic and binary algebra to explain the operation of electromechanical relays
o "the calculus is shown to be exactly analogous to the calculus of propositions used in the symbolic study of logic"
o Shannon used logic in the realm of electrical engineering
• Shannon proved that Boolean algebra can be used to simplify the arrangement of relays used in telephone routing switches (i.e. it got rid of human switchboard operators)
• Shannon's work became the foundation of practical digital circuit design
• Symbolic logic for Boole became digital logic for Shannon
• Through digital logic Shannon was approaching something like software
• Logic Gates:
o Used by Shannon; related to the idea of digital logic
o Has one or more inputs and one output, and the logical operations inside use Boolean logic (see the code sketch after this section)
• Binary System: either/or (two digits, 0 and 1)
• Decimal System:
o 10-digit system: 0, 1, 2, 3, 4, 5, 6, 7, 8, 9
• Digital Logic:
o Claude Shannon used Boolean logic to found digital circuit theory and the digital computer
o Took symbolic logic to digital logic
o Took the analog circuit to the digital circuit

George Boole (1815-1864):
• Boolean Logic/Algebra: everything can be boiled down to a universal law of logic
• Values of variables are true and false, usually denoted 1 and 0 respectively
• Operations of Boolean algebra are the conjunction (and), the disjunction (or) and the negation (not)
• Boolean logic is fundamental in the development of computer science and digital logic
• Famously said "no general method for the solution of questions in the theory of probabilities can be established which does not explicitly recognize those universal laws of thought which are the basis of all reasoning..."
o i.e. it all comes down to simple logic
• Conjunction, disjunction, negation
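As an illustration, my own sketch rather than anything from the lecture, of the point that Boolean algebra describes switching circuits: Boole's three operations can be written as logic gates in code and then composed into a circuit. The half adder below is a standard textbook circuit used here as an example; it is not named in the notes.

```python
# Boole's three operations, written as logic gates (inputs and outputs are 0 or 1)
def AND(a, b): return a & b   # conjunction
def OR(a, b):  return a | b   # disjunction
def NOT(a):    return 1 - a   # negation

# Gates compose into circuits: XOR built only from and/or/not
def XOR(a, b):
    return OR(AND(a, NOT(b)), AND(NOT(a), b))

# A half adder adds two binary digits: the kind of relay circuit Shannon
# showed could be designed and simplified with Boolean algebra
def half_adder(a, b):
    return XOR(a, b), AND(a, b)   # (sum bit, carry bit)

for a in (0, 1):
    for b in (0, 1):
        print(a, "+", b, "=", half_adder(a, b))
# 0 + 0 = (0, 0); 0 + 1 = (1, 0); 1 + 0 = (1, 0); 1 + 1 = (0, 1)
```

The takeaway is the one in the notes: once switching elements behave like Boolean variables, circuit design becomes algebra, which is what turned symbolic logic into digital logic.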
Terms: 'information' (old, rare):
• "a mathematically defined quantity, divorced from any concept of news or meaning, specifically one which represents the degree of choice exercised in the selection or formation of one particular symbol, message etc. out of a number of possible ones and which is defined logarithmically in terms of the statistical probabilities of occurrence of the symbol or the elements of the message"
• Before Shannon: information as meaning; communication is to make oneself understood, to convey meaning
• Shannon and after: information as entropy; communication is reproducing at one point, either exactly or approximately, a message selected at another point

Vannevar Bush:
• Constructed the first widely practical Differential Analyzer at MIT in 1930
• The Differential Analyzer was a mechanical, analog computer that solved differential equations with the help of shafts and gears; a 100-ton iron platform
• Was an analog "thinking machine", a "mechanical brain"
• Claude Shannon worked with Bush on the Differential Analyzer

Lecture Three

Industrial vs. post-industrial (Bell):
• We have transitioned from an industrial to a post-industrial society
• Key characteristics of post-industrial society:
o The economy undergoes a transition from the production of goods to the provision of services
o Knowledge becomes a form of capital
o Producing ideas becomes the main way to grow the economy
o Through processes of globalization and automation, blue-collar, unionized, manual labour declines, while professional workers (scientists, creative-industry professionals, etc.) grow in value and prevalence
• Social change is seen as being determined by technological change
• The Third Wave by Alvin Toffler:
o A book that describes the transition of developed countries from industrial-age society (the 2nd wave) to information-age society (the 3rd wave)
▪ The 1st wave is settled agricultural society
▪ A 4th wave: how our technology makes us feel about ourselves; more personal/semantic

Information Revolutions:
• Each revolution radically changed the way human beings understand themselves and their place in the universe
• Copernicus (1st revolution):
o The Earth is not the centre of the universe
• Darwin (2nd revolution):
o The human is not the king of the kingdom of animals
• Freud (3rd revolution):
o We do not know what we are and what we want
• Information Revolution (4th revolution), defined by Luciano Floridi:
o Floridi argues that technology has changed our relationship to one another and to the world
o Information and communication technologies are showing us that we are informational organisms
o We can interact with reality informationally

Information Society:
• A society where the creation, distribution, use, integration and manipulation of information is a significant economic, political, cultural, etc. activity