    Information Theory

    Name of the subject in Hungarian: Információelmélet

    Last updated: 8 July 2018

    Budapest University of Technology and Economics
    Faculty of Electrical Engineering and Informatics
    Engineering Information Technology, MSc, Branching Common Subject
    Course ID: VISZMA03
    Semester: 1, 2
    Assessment: 3/0/0/f (3 lecture hours per week, mid-semester grade)
    Credit: 4
    Subject semester:
    3. Course coordinator and department Dr. Pintér Márta
    4. Instructors Gabor Simonyi
    5. Required knowledge Probability Theory
    6. Pre-requisites
    Mandatory:
    NEM ( TárgyEredmény( "BMEVISZM101" , "jegy" , _ ) >= 2
    VAGY
    TárgyEredmény("BMEVISZM101", "FELVETEL", AktualisFelev()) > 0
    VAGY
    TárgyEredmény( "BMEVISZMA13", "jegy" , _ ) >= 2
    VAGY
    TárgyEredmény("BMEVISZMA13", "FELVETEL", AktualisFelev()) > 0)

    The formula above is Neptun's own format; for technical reasons it has been left unchanged.

    The mandatory prerequisites can be found on the website and in the curriculum of the given programme.

    Recommended:
    Probability Theory
    7. Objectives, learning outcomes and obtained knowledge The course deals with the theoretical problems arising during the
    transfer and storage of information. The theoretical limits of data
    compression and of reliable information transmission are presented.
    The basic properties of Shannon's information measures are covered,
    and several data compression techniques are taught. Course topics
    include the main principles of channel coding, along with basic
    examples of situations in which such coding is required.

    Students completing the course are expected to

    (1) know the theoretical limits on the efficiency of variable-length source coding

    (2) know the main codes realizing the above limits

    (3) be acquainted with the main principles of lossy source coding

    (4) develop a basic understanding of the main concepts of classical
    information theory

    (5) be able to correctly model situations in which the task is information
    transmission in a noisy environment.
    8. Synopsis 1. Variable length source coding
    Unique decodability, prefix coding

    2. McMillan's theorem and Kraft's theorem

    3. Jensen's inequality
    The entropy function and its main properties

    4. Shannon-Fano coding
    Huffman coding

    5. Lempel-Ziv type algorithms

    6. The entropy of a source, Markov source
    Conditional entropy and its properties

    7. Mutual information and its properties

    8. Quantization

    9. Lloyd-Max algorithm

    10. The discrete memoryless channel model

    11. Channel capacity
    Fano's inequality

    12. Converse of the channel coding theorem
    Channel coding theorem

    13. Basic principles of error correction, Hamming codes

    14. Zero-error codes.
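
    The following short Python sketch is not part of the official course
    material; it is only an illustration of the source coding topics above
    (Kraft's inequality, the entropy function, Huffman coding). The example
    distribution and the helper names entropy and huffman_code_lengths are
    hypothetical, chosen only for this illustration. The sketch checks that
    the average Huffman codeword length L satisfies H(X) <= L < H(X) + 1,
    the theoretical limit of variable-length source coding referred to in
    point 7 above.

    import heapq
    from math import log2

    def entropy(probs):
        """Shannon entropy (in bits) of a probability distribution."""
        return -sum(p * log2(p) for p in probs if p > 0)

    def huffman_code_lengths(probs):
        """Codeword length assigned to each symbol by Huffman's algorithm."""
        # Heap entries: (probability, tie-breaker, symbol indices of the subtree).
        heap = [(p, i, [i]) for i, p in enumerate(probs)]
        heapq.heapify(heap)
        lengths = [0] * len(probs)
        counter = len(probs)  # unique tie-breaker so symbol lists are never compared
        while len(heap) > 1:
            p1, _, syms1 = heapq.heappop(heap)
            p2, _, syms2 = heapq.heappop(heap)
            for s in syms1 + syms2:   # merging two subtrees adds one bit
                lengths[s] += 1       # to every codeword inside them
            heapq.heappush(heap, (p1 + p2, counter, syms1 + syms2))
            counter += 1
        return lengths

    probs = [0.4, 0.2, 0.2, 0.1, 0.1]              # hypothetical source distribution
    lengths = huffman_code_lengths(probs)
    H = entropy(probs)
    L = sum(p * l for p, l in zip(probs, lengths))
    assert sum(2.0 ** -l for l in lengths) <= 1.0  # Kraft's inequality (topic 2)
    assert H <= L < H + 1                          # variable-length source coding bound
    print(f"H(X) = {H:.3f} bits, average codeword length L = {L:.3f} bits")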
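
    As a worked illustration of the channel coding topics above (channel
    capacity and the coding theorem), the standard binary symmetric channel
    example, treated in the Cover-Thomas textbook listed in point 13, gives
    the capacity in closed form:

        C = \max_{P_X} I(X;Y) = 1 - h(p),
        where h(p) = -p \log_2 p - (1-p) \log_2 (1-p)

    is the binary entropy function and p is the crossover probability. For
    example, with p = 0.1 the capacity is C = 1 - h(0.1) ≈ 0.531 bits per
    channel use; the channel coding theorem and its converse (topic 12)
    state that reliable transmission is possible at every rate below C and
    at no rate above it.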

    9. Method of instruction 3 lectures per week
    10. Assessment There are three midterm tests during the semester. To complete the
    course with a valid grade, at least 40% of the total score must be
    achieved on each of the midterms. If this requirement is met, the course
    grade is calculated by averaging the results of the three midterms
    with equal weights.

    In the exam period: ---
    11. Recaps There will be a make-up test for each of the three midterms during the
    semester. One more make-up test can be taken in the week right after
    the semester if one (and only one) midterm result is still below 40%.
    12. Consultations By appointment.
    13. References, textbooks and resources T. M. Cover and J. A. Thomas: Elements of Information Theory, 2nd edition, Wiley, 2006.
    14. Required learning hours and assignment
    Contact hours: 42
    Preparation for classes during the semester:
    Preparation for the midterm tests: 78
    Preparation of homework:
    Learning of the assigned written material:
    Exam preparation:
    Total: 120
    15. Syllabus prepared by Gabor Simonyi