Shannon: Information Theory | IELTS Reading Practice Test

Passage

A A satellite in the solar system took pictures of Jupiter and Saturn that were meant to be sent back to the earth. Unfortunately, the satellite was broken, so it took a while for the earth to receive the collected photographs. Eventually, the malfunctioning satellite drifted out of the solar system. All this information transmitting technology should be credited to Claude E. Shannon (1916-2001) and his information theory.

B Shannon was born in Petoskey, Michigan. His father, Claude Sr (1862-1934), a descendant of the early New Jersey settlers, was a self-made businessman and, for a while, Judge of Probate. Shannon showed an inclination towards mechanical things. His best subjects were science and mathematics, and at home he constructed such devices as models of planes, a radio-controlled model boat and a wireless telegraph system to a friend’s house half a mile away. While growing up, he worked as a messenger for Western Union. His childhood hero was Thomas Edison, who, he later learned, was a distant cousin. Both were descendants of John Ogden, a colonial leader and an ancestor of many distinguished people.

C Shannon first began his research in the information field simply to determine the correctness of a piece of information. The main concepts of information theory can be grasped by considering the most widespread means of human communication: language. Two important aspects of a concise language are as follows. First, the most common words (e.g., “a”, “the”, “I”) should be shorter than less common words (e.g., “roundabout”, “generation”, “mediocre”), so that sentences will not be too long. Such a tradeoff in word length is analogous to data compression and is the essential aspect of source coding. Second, if part of a sentence is unheard or misheard due to noise, e.g., a passing car, the listener should still be able to glean the meaning of the underlying message. Such robustness is as essential for an electronic communication system as it is for a language; properly building such robustness into communications is done by channel coding. Source coding and channel coding are the fundamental concerns of information theory.
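The word-length tradeoff described above can be illustrated with a small sketch. The word frequencies and codewords below are hypothetical, chosen only to show the principle: giving shorter codes to more frequent words lowers the average encoded length, which is the idea behind source coding.

```python
# Hypothetical word frequencies and a prefix-free code assigning
# shorter codewords to more frequent words (the source-coding idea).
freqs = {"the": 0.5, "a": 0.3, "mediocre": 0.15, "roundabout": 0.05}
codes = {"the": "0", "a": "10", "mediocre": "110", "roundabout": "111"}

# Average code length, weighted by how often each word occurs.
avg_len = sum(freqs[w] * len(codes[w]) for w in freqs)

# A fixed-length code for 4 symbols would need 2 bits per word.
fixed_len = 2

print(avg_len)    # 1.7 bits per word on average
print(avg_len < fixed_len)   # True: the variable-length code is shorter
```

The variable-length code averages 1.7 bits per word versus 2 bits for a fixed-length code, precisely because the common words were given the short codewords.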

D Note that these concerns have nothing to do with the importance of messages. For example, a platitude such as “Thank you; come again” takes about as long to say or write as the urgent plea, “Call an ambulance!” while the latter may be more important and more meaningful in many contexts. Information theory, however, does not consider message importance or meaning, as these are matters of the quality of data rather than the quantity and readability of data, the latter of which is determined solely by probabilities.

Shannon information theory

E Information theory is closely associated with a collection of pure and applied disciplines that have been investigated and reduced to engineering practice under a variety of rubrics throughout the world over the past half century or more: adaptive systems, anticipatory systems, artificial intelligence, complex systems, complexity science, cybernetics, informatics, machine learning, along with systems sciences of many descriptions. Information theory is a broad and deep mathematical theory, with equally broad and deep applications, amongst which is the vital field of coding theory.

F Coding theory is concerned with finding explicit methods, called codes, of increasing the efficiency and reducing the net error rate of data communication over a noisy channel to near the limit that Shannon proved is the maximum possible for that channel. These codes can be roughly subdivided into data compression (source coding) and error-correction (channel coding) techniques. The rate of transmitting information depends on the amount of noise. In the latter case, it took many years to find the methods Shannon’s work proved were possible. A third class of information theory codes is cryptographic algorithms (both codes and ciphers). Concepts, methods and results from coding theory and information theory are widely used in cryptography and cryptanalysis. See the article ban (information) for a historical application. Information theory is also used in information retrieval, intelligence gathering, gambling, statistics, mobile phones and even in musical composition.

G A key measure of information is known as entropy, which is usually expressed by the average number of bits needed to store or communicate one symbol in a message. Entropy quantifies the uncertainty involved in predicting the value of a random variable. For example, specifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (six equally likely outcomes). Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s and JPEGs), and channel coding (e.g. for Digital Subscriber Line (DSL)). The field is at the intersection of mathematics, statistics, computer science, physics, neurobiology, and electrical engineering. Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones, the development of the Internet, the study of linguistics and of human perception, the understanding of black holes, and numerous other fields. Important subfields of information theory are source coding, channel coding, algorithmic complexity theory, algorithmic information theory, information-theoretic security, and measures of information.

Questions

Questions 27-32 The Reading Passage has seven paragraphs, A-G. Which paragraph contains the following information? Write the correct letter, A-G, in boxes 27-32 on your answer sheet.

NB You may use any letter more than once.

27 The process of condensing data

28 Indispensable roles of information theory in many areas

29 The readability of data

30 Numerous subjects concerning information theory

31 Channel coding as a branch of information theory

32 Aspects in which coding theory specializes

Questions 33-37 Complete the following summary of the paragraphs of the Reading Passage, using no more than three words from the Reading Passage for each answer. Write your answers in boxes 33-37 on your answer sheet.

In the 33 ……………….. , a 34 ……………….. was sent to gather photographs of Jupiter and Saturn, and the pictures were meant to be sent back to the earth. However, something unexpected happened to the satellite, so the photographs which were 35 ……………….. did not reach the earth as planned. In the end, the 36 ……………….. satellite went out of the solar system. It was thanks to Claude E. Shannon and his information theory that the 37 ……………….. could contribute to accomplishing the demanding task.

Questions 38-40 Do the following statements agree with the information given in Reading Passage 3? In boxes 38-40 on your answer sheet, write

TRUE if the statement agrees with the information

FALSE if the statement contradicts the information

NOT GIVEN if there is no information on this

38 The original purpose that drove Shannon to begin his research was to tell whether the information was correct.

39 The amount of information transmitted is determined by the level of noise.

40 Nearly all fields are concerned with information theory.

Answers

Shannon information theory answers
