Claude Shannon Explains Information Theory

Science and Technology

#informationtheory #claudeshannon #technology
Claude Shannon, the mastermind behind modern information theory, delves into the idea of treating information as a tangible quantity, much like energy. Shannon's insight was that to truly grasp the essence of information, one must approach it objectively, disregarding its meaning or content. He highlights the crucial role of information in reducing uncertainty and introduces the simplest unit of information: the bit, the answer to a single binary choice. Shannon's theory states that the more predictable an event is, the less information it carries, while complete randomness carries the most. This concept has revolutionized telecommunications, cryptography, and beyond. Shannon's thinking on information continues to inspire and captivate experts in diverse fields even today.
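The coin-flip intuition in the description can be made concrete with Shannon's entropy formula, H = -Σ p·log₂(p). A minimal Python sketch of that standard formula (an illustration, not anything shown in the video):

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)) of a probability distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: one full bit per toss.
print(entropy_bits([0.5, 0.5]))   # 1.0
# A heavily biased coin is easier to guess, so each toss tells us less.
print(entropy_bits([0.9, 0.1]))   # ~0.47
# A two-headed coin ([1.0]) is fully predictable and carries 0 bits.
```

This is exactly the "predictable events carry less information" claim in numbers: as the distribution gets more lopsided, the entropy drops toward zero.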
Richard Feynman on the human mind:
• Inside the Mind of a P...
Russell Ackoff on changing systems:
• Russell Ackoff on Tran...
Claude Shannon shows off his inventions:
• Inside the Boundless M...
Carl Sagan on The Power to Change The World
• The Power of Humanity:...
Claude Shannon - Father of the Information Age
• Claude Shannon - Fathe...
Steve Jobs on Shaping Your Life:
• Video
Claude Shannon was an American mathematician and electrical engineer who is widely regarded as the father of modern digital circuit design theory and information theory. He was born on April 30, 1916, and died on February 24, 2001.
Information theory is a mathematical framework for understanding and quantifying information. It provides a way to measure information in a way that is independent of the meaning or interpretation of that information. Shannon's theory deals with the efficient encoding, transmission, and storage of information in the presence of noise and errors.
Shannon's most famous work, A Mathematical Theory of Communication, was published in 1948 and laid the foundations for modern digital communication. In this work, he introduced the concept of entropy as a measure of the amount of uncertainty or randomness in a signal. He also formulated the concept of channel capacity, which represents the maximum amount of information that can be transmitted over a communication channel in the presence of noise.
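The channel-capacity result mentioned above is usually written as the Shannon–Hartley theorem, C = B·log₂(1 + S/N). A small Python sketch (the telephone-line numbers are illustrative assumptions, not figures from this page):

```python
import math

def channel_capacity(bandwidth_hz, snr):
    """Shannon-Hartley limit: C = B * log2(1 + S/N), in bits per second.

    snr is the linear signal-to-noise power ratio (not dB).
    """
    return bandwidth_hz * math.log2(1 + snr)

# Illustrative example: a ~3 kHz analog phone line at 30 dB SNR (S/N = 1000)
# cannot carry more than roughly 30 kbit/s, no matter how clever the coding.
print(round(channel_capacity(3000, 1000)))  # 29902
```

The theorem gives a hard ceiling: noise does not forbid reliable communication, it only caps the rate at which it is possible.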
Shannon's ideas have had a profound impact on a wide range of fields, including telecommunications, computer science, cryptography, and linguistics. Information theory is still an active area of research and has many practical applications in areas such as data compression, error correction, and network communication.
Shannon was the first to mathematically quantify the concept of information and introduce the idea of treating it as a tangible entity. He developed the concept of entropy as a measure of uncertainty in information and provided the foundation for digital circuit design and modern communication systems. His groundbreaking work also helped make the first digital computers possible, and he is considered one of the key pioneers of digital computing. His innovations and inventions have transformed the world and underpin many of the technological advancements we enjoy today.
#ClaudeShannon #InformationTheory #BinaryInformation #Probability #Communication #MeasuringInformation #ObjectiveMeasurement #RemovingUncertainty #ZeroInformation #CompletelyRandom #GreatLiterature #PhysicalQuantity #FatherOfModernInformationTheory #SignificantImpact #Telecommunications #Cryptography

Comments: 29

  • @Discern_
    @Discern_ 1 year ago

    If predictable observations contain less information than randomness, what does that say about the way we communicate and process information in our daily lives? Is it possible that we are undervaluing the importance of randomness and unpredictability in our pursuit of knowledge and understanding?

  • @user-fk6sv2ch4p
    @user-fk6sv2ch4p 11 months ago

    He is something else... brilliant!

  • @Lancelote.
    @Lancelote. 6 months ago

    That would come in the form of curiosity and exploration! But I have the feeling that "predictable" is relative.

  • @winonafrog
    @winonafrog 4 months ago

    I'd say it points to a nondual meaning of knowledge: to know and not to know are two sides of a coin #thaumatropy

  • @Jiyoon02
    @Jiyoon02 1 year ago

    I'm just a mere undergraduate, but I remember when I read Shannon's acclaimed paper on information theory. It was one of the most elegant things I had seen on this planet. Not only is the theory amazing, but the way the paper is structured is elegant too. I knew at first sight that this paper must have inspired countless mathematicians, engineers, and computer scientists to start their careers in the field.

  • @jamesvstone9925
    @jamesvstone9925 1 year ago

    This is not Shannon speaking, but an actor in a fine movie called 'The Bit Player'.

  • 1 year ago

    It seems he did a good job.

  • @savage22bolt32
    @savage22bolt32 6 months ago

    A dead ringer. I looked him up on the computer.

  • @winonafrog
    @winonafrog 4 months ago

    @savage22bolt32 I am witnessing this on the iPhone 13 YouTube app

  • @skilifavas4016
    @skilifavas4016 3 months ago

    An actor indeed, and a very good one apparently.

  • @rolandclark918
    @rolandclark918 7 months ago

    What I get out of this is that there's more knowledge in the things we don't know than in the things we do.

  • @notdarkon
    @notdarkon 11 months ago

    Fascinating.

  • @brandoncarrillo7202
    @brandoncarrillo7202 3 months ago

    So information in this case is considered knowability, yes? As in, if I know the answer is heads because both sides are heads, then I fully comprehend the information before me and therefore gain no knowledge. So if I added the stipulation of some sort of bet made upon the coin toss, say "if it's heads you have to cross the street", would there be more information in the crossing of the street, in turn giving the coin toss a greater amount of information despite our previous knowability of the outcome?

  • @philippwaag2173
    @philippwaag2173 13 days ago

    how did you create that video type? crazy quality of optics

  • @maheshkanojiya4858
    @maheshkanojiya4858 6 months ago

    Anybody who wants to learn more about information theory, Shannon, and related things in a fascinating way should read the book "The Information" by James Gleick, one of the best non-fiction books ever.

  • @winonafrog
    @winonafrog 4 months ago

    so vitally important to the inability of quantum indeterminacy, in a universe of entangled bits, to square easily with our local realist notions of gravity, imo

  • @arjunmenonkandanat6328
    @arjunmenonkandanat6328 3 months ago

    2:15 What a statement!

  • @redberries8039
    @redberries8039 2 months ago

    There's more information in gibberish because gibberish is perfectly random, and that makes the letters hard to guess. That's his definition of information: "guessability". Words have patterns, and patterns are predictable, so it's easier to guess the next letter. It becomes: gibberish is harder to predict than words.
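This "guessability" reading can be sketched numerically: estimating entropy from single-letter frequencies (a rough model, ignoring letter-to-letter dependencies) already scores patterned text lower than gibberish. The sample strings are arbitrary illustrations:

```python
import math
from collections import Counter

def per_char_entropy(text):
    """Entropy estimate in bits per character, from single-letter frequencies."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Patterned text reuses few symbols, so each next letter is easier to guess;
# gibberish spreads probability over many symbols and is harder to predict.
english = "the cat sat on the mat"
gibberish = "xq9z!kv2mw7rtb8#pj4ns5"
print(per_char_entropy(english) < per_char_entropy(gibberish))  # True
```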

  • @musicjunk8266
    @musicjunk8266 2 months ago

    Interesting...

  • @winonafrog
    @winonafrog 4 months ago

    Same!

  • @brianplatt7041
    @brianplatt7041 9 months ago

    Following bread crumbs. 😎

  • @eylulagar7512
    @eylulagar7512 6 months ago

    May God make you happy, I kiss your forehead, mwah

  • @billy-cg1qq
    @billy-cg1qq 8 months ago

    The f did he just say? I didn't understand a thing lmao

  • @zyxyuv1650
    @zyxyuv1650 6 months ago

    He's saying it takes a lot more drive space to store random, useless, incompressible information than it takes to store a perfect structure or a perfect formula. The more perfect you make something, the less information is actually there, because the default is for maximum information (noise) to be everywhere. Except this universe of maximum information is always headed for destruction by becoming more similar to itself, until there's nothing left because there's no difference between anything in the end.

  • @billy-cg1qq
    @billy-cg1qq 6 months ago

    @zyxyuv1650 If the universe tends to maximize information (noise), doesn't that mean we'll end up with a lot of different things that are not the same, as opposed to what you said, that we end up with the same thing? Idk, this whole information theory seems like a trick to me. He's trying to put a quantity that is purely in the mind into a mathematical formula that depends heavily on the content you're working with. This is not a generalization, which is what mathematics hopes to achieve.

  • @coderoshi
    @coderoshi 5 months ago

    @billy-cg1qq "this whole information theory seems like a trick to me" he types, unironically, into his computer and sends over the internet...

  • @luispint0
    @luispint0 4 months ago

    @coderoshi 😂

  • @AdolfDahmer
    @AdolfDahmer 10 months ago

    This guy’s 🧠 isn’t from here.
