Coding Theory
Ian Bathelt
Math 221
02/16/2015
Cory Bennet

Coding theory is the study of codes, their properties, and their suitability for particular applications. There are four general types of coding: line coding, error correction or channel coding, data compression, and cryptographic coding. Alongside error-detecting codes there exist error-correcting codes. The purpose of this paper is to explain these types of coding and to cover Hamming distance, perfect codes, generator matrices, parity check matrices, and Hamming codes. We will also give an example of how coding theory could be applied in a real-world application, and a brief history of coding theory. According to "WolframMathWorld" (2015), "Coding theory, sometimes called algebraic coding theory, deals with the design of error-correcting codes for the reliable transmission of information across noisy channels. It makes use of classical and modern algebraic techniques involving finite fields, group theory, and polynomial algebra." Coding theory has its roots in the communications field. Claude Shannon first published "A Mathematical Theory of Communication" in the Bell System Technical Journal; it dealt with encoding the information transmitted by a sender, and its foundations included probability theory, which Shannon applied to communication. This work gave rise to what later became known as information theory. Later, a binary code known as the Golay code came to fruition: an error-correcting code that could correct up to three errors in a 24-bit word and detect the presence of a fourth. Decades later, Richard Hamming came along and began extensive work on numerical methods, automatic coding systems, and error-detecting and error-correcting codes. His study of these systems led to Hamming codes and the Hamming distance.
Hamming codes use a set of mathematical parameters that allow them to detect errors of up to two bits and correct single-bit errors. Most codes before the Hamming code could not correct errors at all. Hamming codes are known as perfect codes, meaning they achieve the highest rate possible for codes of their block length and minimum distance. Hamming codes add only a little redundant information to the data they protect, so they can detect and correct errors only when the error rate in the information is considerably low. This brings us to the Hamming distance. The Hamming distance between two strings of equal length is the number of positions at which their symbols differ; equivalently, it is the number of symbol changes it would take to turn one string into the other. The differing positions can be thought of as errors, where correcting one string means changing it to match the other. This all gives way to error-correcting codes: techniques for controlling the errors that arise when data is transmitted over communication channels that are A: unreliable, and B: noisy. The basic concept is that the sender encodes the message in a redundant way using an error-correcting code. The redundancy gives the receiver the ability to detect a limited number of errors anywhere within the message, as well as the ability to correct those errors without having to request a retransmission of the message. This is called forward error correction, which is essentially the main mechanism of error-correcting codes. According to Permutationpuzzles.org (2007), "Since a code is a finite dimensional vector space over a finite field, it only has finitely many elements. To represent a code in a computer one may of course store all the elements" (para. 1).
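The Hamming distance described above is simple enough to compute directly. The following is a minimal sketch (the function name and example strings are chosen here for illustration):

```python
def hamming_distance(a: str, b: str) -> int:
    """Number of positions at which two equal-length strings differ."""
    if len(a) != len(b):
        raise ValueError("strings must be the same length")
    return sum(x != y for x, y in zip(a, b))

# "10110" and "10011" differ in two positions (the third and fifth bits),
# so two symbol changes would make the strings identical.
print(hamming_distance("10110", "10011"))  # -> 2
```

For a code, the minimum Hamming distance between any two codewords determines how many errors can be detected and corrected.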
For this type of information you would use a generator matrix. The rows of a generator matrix form a basis for a linear code: the codewords are the linear combinations of those rows, so the code is the row space of the matrix, that is, the set of all vectors its rows can generate. Next we have the parity check matrix. A parity check matrix of a linear block code describes the linear relations that the components of a codeword must satisfy. It is used to decide whether or not a vector is a codeword, and it is used in decoding algorithms. A parity check matrix for a code is also a generator matrix for the dual code. The rows of a parity check matrix are the coefficients of the equations they check; that is, they show linear combinations of codeword components that must equal zero. The "Pulse.embs.org" (2015) website notes, "In 1997, a science fiction film titled Gattaca premiered in U.S. theaters, depicting a society sometime in the not-so-distant future in which people are crafted for and judged by the quality of their genetic material." To me, the most relevant real-world application of coding theory is its relation to DNA. If the techniques of coding theory could be applied to DNA, the human race might one day realize the dream of a world in which no human has to suffer from inferior genetics. You take someone's genetic code and develop an error-detecting code that flags the genetic mutations that cause disease. If you had a code that could detect and then correct such errors, I think the world would be a better place. I think the diseases to which coding theory could most relevantly be applied would be autoimmune diseases.
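The roles of the generator matrix and parity check matrix can be sketched with the classic Hamming(7,4) code. The specific systematic matrices G = [I4 | P] and H = [P^T | I3] below are one standard choice, assumed here for illustration; the essay does not fix particular matrices. All arithmetic is over GF(2), i.e. modulo 2.

```python
# Parity portion P of a systematic Hamming(7,4) generator matrix (one
# standard choice, assumed for illustration).
P = [[1, 1, 0],
     [1, 0, 1],
     [0, 1, 1],
     [1, 1, 1]]

# Generator matrix G = [I4 | P]: its 4 rows are a basis for the code.
G = [[int(i == j) for j in range(4)] + P[i] for i in range(4)]

# Parity check matrix H = [P^T | I3]: each row is one parity equation.
H = [[P[j][i] for j in range(4)] + [int(i == j) for j in range(3)]
     for i in range(3)]

def encode(data):
    """Codeword = data * G over GF(2): a linear combination of G's rows."""
    return [sum(d * G[i][j] for i, d in enumerate(data)) % 2 for j in range(7)]

def syndrome(received):
    """s = H * r^T over GF(2); an all-zero syndrome means r is a codeword."""
    return [sum(h * r for h, r in zip(row, received)) % 2 for row in H]

c = encode([1, 0, 1, 1])
print(c, syndrome(c))   # valid codeword, syndrome [0, 0, 0]

c[2] ^= 1               # flip one bit in transit
print(syndrome(c))      # nonzero syndrome: the error is detected
```

The nonzero syndrome equals the column of H at the flipped position, which is what lets a decoder locate and correct a single-bit error.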
If you took someone with an autoimmune disease and ran their genetic (DNA) code through a genetic error-detecting scheme, such as a Hamming code, you could pinpoint the genetic information that causes that specific disease and replace it with different, healthier information. To conclude, coding theory is everywhere in today's society. It was born a child of World War II, with the computer and the nuclear bomb as siblings. Communication theory gave birth to coding theory and its many different forms, including error-detecting and error-correcting codes, Hamming distances and Hamming codes, perfect codes, and the generator and parity check matrices.

References

WolframMathWorld. (2015). Retrieved from http://mathworld.wolfram.com/CodingTheory.html

Permutationpuzzles.org. (2007). Retrieved from http://www.permutationpuzzles.org/AAAbook/node125.html

Pulse.embs.org. (2015). Retrieved from http://pulse.embs.org/january-2015/state-art/
