I don't believe the 2nd law of thermodynamics. (The most uplifting video I'll ever make.)

Science & Technology

Learn more about differential equations (and many other topics in maths and science) on Brilliant using the link brilliant.org/sabine. You can get started for free, and the first 200 will get 20% off the annual premium subscription.
The second law of thermodynamics says that entropy will inevitably increase. Eventually, it will make life in the universe impossible. What does this mean? And is it correct? In this video, I sort out what we know about the arrow of time and why I don't believe that entropy will kill the universe.
💌 Support us on Donatebox ➜ donorbox.org/swtg
🤓 Transcripts and written news on Substack ➜ sciencewtg.substack.com/
👉 Transcript with links to references on Patreon ➜ / sabine
📩 Sign up for my weekly science newsletter. It's free! ➜ sabinehossenfelder.com/newsle...
🔗 Join this channel to get access to perks ➜
/ @sabinehossenfelder
🖼️ On instagram ➜ / sciencewtg
00:00 Introduction
1:00 The Arrow of Time
3:04 Entropy, Work, and Heat
7:07 The Past Hypothesis and Heat Death
9:34 Entropy, Order, and Information
11:38 How Will the Universe End?
15:46 Brilliant Sponsorship

Comments: 5,300

  • @MrEiht
    @MrEiht · 10 months ago

    As Abba said: "Entropy killed the radio star..."

  • @chris.hinsley

    @chris.hinsley

    10 ай бұрын

    Come on Sabine ! Do another techno pop song ! Please.

  • @nagualdesign

    @nagualdesign

    10 ай бұрын

    Abba? I think you mean The Buggles.

  • @MagnumInnominandum

    @MagnumInnominandum

    10 ай бұрын

    That was Jefferson No Ship...😮

  • @chris.hinsley

    @chris.hinsley

    10 ай бұрын

    Don’t get me wrong I love the physics. But “Catching Light” was the bomb !

  • @jorriffhdhtrsegg

    @jorriffhdhtrsegg

    10 ай бұрын

    Abba surely not

  • @mikebarushok5361
    @mikebarushok5361 · 10 months ago

    I have witnessed water running upwards from a ditch onto a road then becoming airborne and creating a cloud. But, it was because of a tornado passing near to my house and I hope never to see anything like that again.

  • @SabineHossenfelder

    @SabineHossenfelder

    10 ай бұрын

    😮😮😮

  • @Dead-Not-Sleeping

    @Dead-Not-Sleeping

    10 ай бұрын

    Fascinating! Scary, but fascinating.

  • @rickmorty7284

    @rickmorty7284

    10 ай бұрын

    💀💀

  • @benjaminbeard3736

    @benjaminbeard3736

    10 ай бұрын

    I thought you were going to say mushrooms.

  • @ultramovier

    @ultramovier

    10 ай бұрын

    When water evaporates it goes into the sky to form clouds.

  • @tayzonday
    @tayzonday · 10 months ago

    If I don’t shower, all the good air in a room definitely moves into the corner.

  • @jdtv50

    @jdtv50

    10 ай бұрын

    Chocolate RAIINNNN!

  • @AttackOfTheTube

    @AttackOfTheTube

    10 ай бұрын

    Some stay dry

  • @stevenlayne9227

    @stevenlayne9227

    10 ай бұрын

    @tayzonday we definitely have the same algorithm. You keep behaving like a tyrant in the comment sections 😂

  • @bossoholic

    @bossoholic

    10 ай бұрын

    When I don't shower, everybody suffocates and my flowers die

  • @alexandershendi7428

    @alexandershendi7428

    10 ай бұрын

    @tayzonday And if you move into *that* corner, it will move into the opposite corner.

  • @deebarker1969
    @deebarker1969 · 19 days ago

    Sabine, I'm a chemist, and in conversations in the past, I attempted to make the arguments about entropy you've made so eloquently in this video (especially the associations/ideas about order). From now on, rather than arguing, I'll recommend watching your video. Thank you so much for this great video!

  • @Thomas-gk42
    @Thomas-gk42 · 2 months ago

    Some of your vids I come back to months later, again and again, because they feel new and lovely every time. This is one of them.

  • @truejim
    @truejim · 10 months ago

    “My videos can only go downhill from here…” The perfect ending to a video about entropy! 😂

  • @aggies11

    @aggies11

    10 ай бұрын

    Yep, that joke is brilliant and works on so many levels.

  • @yeroca

    @yeroca

    10 ай бұрын

    The entropy of explanations of entropy. So meta! The explanations can only be as good, and will likely be worse from now on.

  • @GoDodgers1

    @GoDodgers1

    10 ай бұрын

    She is probably right, not that they were anything special from the start.

  • @hyperduality2838

    @hyperduality2838

    10 ай бұрын

    SYNTROPY (convergence) is dual to increasing ENTROPY (divergence) -- the 4th law of thermodynamics! Making predictions to track targets, goals & objective is a syntropic process -- teleological. Teleological physics (syntropy) is dual to non teleological physics (entropy). Concepts (mathematics) are dual to percepts (physics) -- the mind duality of Immanuel Kant. Mathematicians create new concepts from their perceptions (geometry) all the time! Noumenal (rational, analytic, mathematics) is dual to phenomenal (empirical, synthetic, physics) -- Immanuel Kant. Mathematics (concepts) is dual to physics (measurements, perceptions). Deductive inference is dual to inductive inference -- Immanuel Kant. Inference is dual. The rule of two -- Darth Bane, Sith lord. "Always two there are" -- Yoda. Subgroups (discrete, quantum) are dual to subfields (continuous, classical) -- the Galois Correspondence. Classical reality is dual to quantum reality synthesizes true reality -- Roger Penrose using the Hegelian dialectic.

  • @MR-ub6sq

    @MR-ub6sq

    10 ай бұрын

    @@hyperduality2838 Yeah

  • @theprinceofinadequatelighting
    @theprinceofinadequatelighting · 10 months ago

    "It can only go downhill from here" Sabine's humor, much like our universe, is chaotic.

  • @truejim

    @truejim

    10 ай бұрын

    It was a brilliant joke!

  • @davido.newell4566

    @davido.newell4566

    10 ай бұрын

    Well ordered!

  • @jonathanbourret2968

    @jonathanbourret2968

    10 ай бұрын

    I think she meant, "It can only go high entropy from here".

  • @cdorman11

    @cdorman11

    10 ай бұрын

    11:45 "If you stop a random physicist on the street and ask them if they agree..."

  • @keithwollenberg5237

    @keithwollenberg5237

    10 ай бұрын

    I wonder if she was aware how funny it sounds to anglophone ears to have someone with a German accent expound the desirability of order.

  • @vixeni3365
    @vixeni3365 · 2 months ago

    “and luckily so because it would be inconvenient if you entered a room and all the air went to a corner” just made my day

  • @un4given868
    @un4given868 · 7 months ago

    Thank you for making clear to me what entropy is and as a bonus giving me a solid idea of what Necessity could mean as well .

  • @DrAndrewSteele
    @DrAndrewSteele · 10 months ago

    Aging biologist here! I’d like to add an uplifting comment to this uplifting video. :) I’m always a bit sad to see entropy given as the reason we get old. As Sabine discusses later in the video, open systems with access to a source of low entropy can use that to decrease their own entropy, so given that we can take in low-entropy food, there’s no in-principle reason we couldn’t use this to keep the entropy of our bodies roughly constant with time. So not aging is totally allowed by the laws of physics! It’s even well within the laws of biology-there are plenty of animals that don’t age, from tiny hydra to giant tortoises, and even one of nature’s most beautiful animals, the naked mole-rat. Their risk of death and disease doesn’t change with time, which basically means they keep their entropy constant throughout their adult lives. Now all we need to do is crack this biological entropy preservation using science…but that’s another story!

  • @elio7610

    @elio7610

    10 ай бұрын

    Unfortunately, we already have an overpopulated Earth, so preventing people from aging is going to exacerbate the issue. I would not be against preventing aging though; even if my life were no longer than normal, a life without aging is far more enjoyable than living with a constant gradual decline.

  • @DrAndrewSteele

    @DrAndrewSteele

    10 ай бұрын

    @@elio7610 If you’re worried about overpopulation, I made a video about exactly that which you might enjoy :)

  • @edcunion

    @edcunion

    10 ай бұрын

    Planaria?

  • @TitanOfClash

    @TitanOfClash

    10 ай бұрын

    Wow, I can't believe that I ever espoused that exact view. When you put it like that, entropy as a reason for aging makes no sense. Thank you for ridding me of that misunderstanding.

  • @DrAndrewSteele

    @DrAndrewSteele

    10 ай бұрын

    @@edcunion I’m not sure if we’ve got any lifespan data on them (happy to be corrected!), but given their regenerative powers I’d not be surprised!

  • @maninspired
    @maninspired · 10 months ago

    I'm a mechanical engineer who focused on thermodynamics in college. Despite years of study and using entropy in formulas, this is by far the best explanation of entropy I've ever heard.

  • @deltalima6703

    @deltalima6703

    10 ай бұрын

    Would you agree that entropy is fishy?

  • @johndemeritt3460

    @johndemeritt3460

    10 ай бұрын

    There's a reason my father, who trained as a mechanical engineer but became a computer programmer/systems analyst, always referred to thermodynamics as "thermogoddamics".

  • @oosmanbeekawoo

    @oosmanbeekawoo

    10 ай бұрын

    Classic response on a video that does not explain Entropy. It's far easier to believe we understand than believe we've been fooled into understanding.

  • @chrisj5443

    @chrisj5443

    10 ай бұрын

    My education was the same. I was waiting for her to say entropy tends to increase in a CLOSED SYSTEM, but she never did. She must assume the universe as we understand it is a closed system, but that's very debatable, given the required low entropy at the Big Bang. Also, questioning the semantics of the 2nd Law (the meaning of "order") seems trivial to me. After all, I recall the word "disorganized" being used in the 2nd Law, which I think is better.

  • @karol_p

    @karol_p

    10 ай бұрын

    She actually makes the mistake of calling heat a form of energy, when in fact heat is a form of energy transfer.

  • @markusk2289
    @markusk2289 · 7 months ago

    This video reminds me how it always frustrated me in school as well as later at Uni that things were described or defined in simplified ways that made them wrong, harder to actually grasp or both. Sometimes the actual truth of the matter would slowly reveal itself often leading to an „aha“ moment years later and a feeling of „I knew it“. Also funny how oftentimes simply explaining the meaning of a latin or greek term would have almost explained the whole concept behind it, yet somehow no professor ever did that.

  • @Cre8tvMG
    @Cre8tvMG · 6 days ago

    I enjoy your sense of humor keeping things light and entertaining while simultaneously tackling deep concepts. Great blend.

  • @ToddPangburn
    @ToddPangburn · 10 months ago

    THANK YOU for pushing back against entropy being described as order vs. disorder! Through years of schooling entropy was this poorly defined, almost spooky concept of order. Then I was finally introduced to entropy as probabilities of microstates (with gas in a box), and it was completely logical and clear.

  • @billschlafly4107

    @billschlafly4107

    10 ай бұрын

    Entropy is to heat transfer what friction is to an object in motion. Entropy reduces the available energy in a system just like friction. Order vs disorder isn't useful at all IMO and only serves to cloud what's happening.

  • @sjoerdthabozz

    @sjoerdthabozz

    10 ай бұрын

    Agree. Order is a matter of human opinion. Don’t think nature cares.

  • @xxportalxx.

    @xxportalxx.

    10 ай бұрын

    I think they're synonymous ways of talking about it, it's just that order as a concept has more to unpack to get to the point. Order means that there are rules that limit the number of microstates. A bookshelf ordered alphabetically has very few allowed microstates (defining the microstates as the books' arrangement), a disordered bookshelf on the other hand would have as many microstates as there are ways to arrange the books.

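That bookshelf picture can be put in numbers with Boltzmann's S = k·ln(W). A minimal sketch in Python (illustrative counts only; k is set to 1, so the entropies come out in natural units):

    import math

    n_books = 10

    # Macrostate "alphabetical order": only one arrangement of the books qualifies.
    W_ordered = 1

    # Macrostate "any order will do": every permutation of the books qualifies.
    W_disordered = math.factorial(n_books)               # 10! = 3,628,800

    # Boltzmann entropy S = k*ln(W), with k = 1 for illustration.
    print(math.log(W_ordered), math.log(W_disordered))   # 0.0 vs ~15.1

The fewer microstates a macrostate admits, the lower its entropy, which is the point about rules that limit the number of arrangements.
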
  • @blueckaym

    @blueckaym

    10 ай бұрын

    I agree! I've always found the order-based and statistical description of entropy to be fundamentally wrong. And if you look, you can see that this approach exists in many other fields of science (of SCIENCE! Which is supposed to be OBJECTIVE :)) The trouble is that scientists are not always objective - actually everyone is at least a little subjective at times, due to our limited resources (attention and time, at the very least).

    But this is how science generally works: you observe a phenomenon, you describe it somehow, you test whether your description allows you to predict the same phenomenon in the near future; if your prediction fails (or is imprecise) you improve your description until you get accurate enough predictions... and at this point you "KNOW" that your best description is the CAUSE behind your prediction... so it's easy to miss the fact that the Universe doesn't give a rat's ass about your descriptions :) It's just one of our many biases - one that many scientists also fall for.

    Ultimately our descriptions may become so perfect that they fit reality 100%, and then it would be only a philosophical game to distinguish between the two... but even with our most precise sciences we're still not there. But people (and especially scientists) love to believe that the theories they learn (and especially the ones they make) are close to perfect, and thus that bias becomes really, really strong! Have you seen a scientist trying to explain some phenomenon by stating some equation and ending with something like "and as the equation tells us, the object has to do this and that"? Well, sure, if your theory and equations are perfect you'll get it right... except for the understanding part! Very little of such descriptions actually explains anything, and that's exactly because they jump over the actual forces and treat the description as the cause. In that case, if your description is for example statistical (which many sciences use today... and particularly QM), then it's really easy to believe that probabilities and statistics are the CAUSE of things like entropy! ... but this is still fundamentally wrong of course! :)

  • @Bryan-Hensley

    @Bryan-Hensley

    10 ай бұрын

    Kinda like dark matter

  • @shivasive
    @shivasive · 10 months ago

    A friend of mine once said ”if studying physics doesn't humble you, you're doing it wrong."

  • @PrivateSi

    @PrivateSi

    5 ай бұрын

    If you don't confidently, arrogantly even, question physics you're doing it wrong... Entropy should be redefined as Simplicity or Uniformity... Complexity can be static and structured, or dynamic (and structured)... Closed systems simplify over time. Energy can increase complexity. This redefinition solves many issues.

  • @moneteezee

    @moneteezee

    Ай бұрын

    @@PrivateSi You haven't been humbled yet clearly. Entropy has already been defined precisely in physics, from the mid 19th century. What loose definitions you get from science communicators isn't reflecting the reality of this situation, hence you must study physics. (Even this video enunciated this so I truly don't know where your head is)

  • @PrivateSi

    @PrivateSi

    Ай бұрын

    @@moneteezee .. You can either make a new term or change the Law of Entropy, because that law is wrong when over-applied to the entire universe. It's fine for gases in a closed system, but fails as a universal law, unlike my redefinition.. You could call it the Law of Simplicity if you prefer, as long as you stop declaring the Law of Entropy to hold at all times.

  • @PrivateSi

    @PrivateSi

    Ай бұрын

    @@moneteezee.. If fields are real they have to be made of discrete elements that have to be as equidistant as possible throughout the entire field for the field to be empty. This is a perfectly ordered state that's as simple as possible. Hence, The Law of Entropy should not be a law or Entropy should be redefined. It's a simple, logical argument given the evidence for QUANTISED FIELD(S).

  • @M-dv1yj

    @M-dv1yj

    Ай бұрын

    @@moneteezeeentropy is likely the unwinding of quantum entanglement from the source of singular entanglement of all possible quantum expressions into less entanglement limited expressions … it entropy is the cost or the balancing side of the of increased complexity that rebirths cycles of quantum tangling and de tangling that at some phase of transition leads to us. 😊 Entropy must be considered within the scope it’s relationship to quantum complexity. In short small scale fluctuations must run dry for the whole quantum baseline to reset into the potential of singulars expression. 😊

  • @GetMoGaming
    @GetMoGaming · 8 months ago

    _"Not enough data for a meaningful answer."_

  • @Sarita41248
    @Sarita41248 · 9 months ago

    Thanks Sabine, I am a curious dilettante in science and I really enjoy being able to understand a little more about everything that science has advanced.

  • @randomwalk5095
    @randomwalk5095 · 10 months ago

    I completely agree with you. When I was young, I used to tell my mom that my room was never in disorder because disorder itself doesn't truly exist. Instead, what exists are infinite states of possible order, and none of them holds more inherent sense than another.

  • @methylene5

    @methylene5

    10 ай бұрын

    If a workspace becomes disorganised, the ability to get work done rapidly approaches zero. So I would argue that certain states of possible order do make more inherent sense than others.

  • @lemurpotatoes7988

    @lemurpotatoes7988

    10 ай бұрын

    If I understood Ramsey theory I could say something intelligent here.

  • @petermaunsell4575

    @petermaunsell4575

    10 ай бұрын

    So your mother was the intervening force. If you keep your room tidy today as an adult, it's probably her victory, or your partner's :)

  • @jumpycat

    @jumpycat

    10 ай бұрын

    An extremely organized workspace looks amazing, but it can intimidate some people and drain creativity. On the other hand, organizing is really good work hygiene after a project is done.

  • @AlexanderTome
    @AlexanderTome · 10 months ago

    This has brought me closer to reconciling questions I've had about entropy than I've ever been before. Thank you for that.

  • @whyofpsi

    @whyofpsi

    10 ай бұрын

    I've made some visualisation that might further enhance your understanding: kzread.info/dash/bejne/notrqrZveduvj9I.html

  • @billwindsor4224
    @billwindsor4224 · 9 months ago

    Excellent video on entropy and statistical probabilities; thank you, Sabine!

  • @elizabethco6116
    @elizabethco6116 · 17 days ago

    I’ve been hoping to hear something like this from you. Thank you for this.

  • @geoicons1943
    @geoicons1943 · 10 months ago

    LOL “It’d be inconvenient if you entered a room and all the air went into a corner” @sabine you totally cracked me up with this joke. Thank you!

  • @SeattleShelby
    @SeattleShelby · 10 months ago

    Sabine’s Carnot efficiency at explaining this stuff is 100%.

  • @redandblue1013

    @redandblue1013

    10 ай бұрын

    So there is an infinite temperature gradient across her body?

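The joke trades on the Carnot bound, eta = 1 - T_cold/T_hot: 100% efficiency would need a cold reservoir at absolute zero (or an infinitely hot source). A quick check in Python, with made-up temperatures in kelvin:

    def carnot_efficiency(t_hot, t_cold):
        # Maximum efficiency of any heat engine running between two reservoirs.
        return 1.0 - t_cold / t_hot

    print(carnot_efficiency(600.0, 300.0))   # 0.5 -> at best 50% efficient
    print(carnot_efficiency(600.0, 0.0))     # 1.0 -> 100% only with a cold side at 0 K
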
  • @saraawwad9163

    @saraawwad9163

    9 ай бұрын

    Not possible 😂

  • @nancymatro8029

    @nancymatro8029

    9 ай бұрын

    Are you claiming to have Carnot knowledge of Sabine?

  • @msimon6808

    @msimon6808

    8 ай бұрын

    There are always losses. Superconductors radiate when the current flow changes.

  • @IsoMorphix

    @IsoMorphix

    8 ай бұрын

    ​@@nancymatro8029I appreciate this joke.

  • @mishelleilieva9657
    @mishelleilieva9657 · 9 months ago

    You should read Asimov's story called "The Last Question". It covers the topic of entropy, and this video explains the science behind it perfectly. The story also follows Sabine's idea about future life forms in a higher-entropy universe. It is one of my favorite Asimov pieces.

  • @RobertR3750

    @RobertR3750

    5 ай бұрын

    Asimov.

  • @mishelleilieva9657

    @mishelleilieva9657

    5 ай бұрын

    @@RobertR3750 oops, you're right. English is not native for me and we spell it with a "z".

  • @tezzerii

    @tezzerii

    11 күн бұрын

    I've read that. Very clever story ! I love Asimov.

  • @mishelleilieva9657

    @mishelleilieva9657

    9 күн бұрын

    @@tezzerii me, too! I think I read almost everything he wrote..

  • @audistik1199

    @audistik1199

    6 күн бұрын

    As a young boy I became a science fiction devotee, and Asimov was one of my favorites, along with Heinlein and Clarke. I presume you are 😊 from an older generation like me to have these greats on your reading list. It was a seminal experience in my life.

  • @minifix
    @minifix · 7 months ago

    Sabine is not only intelligent enough to master a complex subject, but also think in new ways about how the data should be interpreted. It absolutely blew my mind when you explained how macro systems are a relative concept, and not an absolute value that happens to match our anthropocentric view of the universe. Thank you.

  • @katebuckley4523

    @katebuckley4523

    2 ай бұрын

    I often think about how the anthropocentric view tends to divide ‘reality’ into separate things where there are none .. ie. a kind of fiction that perceives order v chaos, living v non-living, observer v observed etc.. ‘we’ seem to separate ourselves out along with everything else. My hunch is that the energetic solidity we feel in the body gives rise to the appearance of everything else as distinct and knowable.. a kind of ‘specialness’ that has something to do with the apparent body’s drive to conserve low entropy... what we call ‘me’ is only a kind of interpretation and nothing at all in reality.

  • @suomeaboo
    @suomeaboo · 9 months ago

    9:55 Finally someone said it! I always considered a homogeneous mixture as "ordered", contrary to how lots of entropy explainers describe it as "disordered". This led me to lots of confusion over the years, until my recent physics classes cleared things up.

  • @cameronbartlett6593

    @cameronbartlett6593

    9 ай бұрын

    now run outside and play

  • @b1ff

    @b1ff

    9 ай бұрын

    @cameronbartlett6593 no u

  • @limitlessenergy369

    @limitlessenergy369

    9 ай бұрын

    EZ water / exclusion zone worth a read

  • @limitlessenergy369

    @limitlessenergy369

    9 ай бұрын

    @@cameronbartlett6593I am a plasma engineer and I would beat you in physical anything best of 3, pick your best sport it wont matter even if I have never done it you will still lose because I will also pick my best sport and third party picks the third sport meaning you have low chances of winning. Go say your bs to yourself in the mirror. Not every nerd minded dude isn’t physically able.

  • @yazmeliayzol624

    @yazmeliayzol624

    8 ай бұрын

    Boy oh boy... I love checking comments and only seeing 2 of the supposed 4 comments left before me... it tells me I've really done my job well and been blocked by someone who is closed minded... I only see a comment from Cameron and darkside... But yup... I've said it for years the chaos and order are entirely indistinguishable in their ultimate forms... you are 100% right in saying homogeny is perfect order... it is also perfect chaos as no one part is distinguishable from the whole...

  • @jeffb3357
    @jeffb3357 · 10 months ago

    The entropy of KZread must have decreased, since Veritasium and Sabine both posted videos on entropy within two weeks of each other :) They're both great, but Sabine definitely lives up to her motto of no gobbledygook... thanks Sabine!

  • @geoffwales8646

    @geoffwales8646

    9 ай бұрын

    Veritasium explains it better for the layperson, IMO.

  • @michalgric

    @michalgric

    9 ай бұрын

    This video about life and entropy kzread.info/dash/bejne/nWGqz5WTh9Gzh84.html is really good match to these two videos you mentioned.

  • @siddified

    @siddified

    9 ай бұрын

    I like my gobbledygook with a good glass of razzmatazz

  • @AnthonyCassidy50

    @AnthonyCassidy50

    9 ай бұрын

    I agree, they are both great. Sabine's was posted on Jun18, and Veritassium's on Jun2nd . So we know Sabine wasn't responding to his (since she was first) and since Veritassium takes longer than two weeks to make one of his elaborate videos, we know he started his before he knew Sabine was doing one. That's perfect. Also cool how they both didn't like the word "disorder" as entropy,, Veritassium likes entropy as "energy spread out", and Sabine presents her own argument.

  • @andrewmycock2203

    @andrewmycock2203

    9 ай бұрын

    @@geoffwales8646 Not nitpicking, correcting. 👍🏻

  • @anthonybelz7398
    @anthonybelz7398 · 2 months ago

    One of the best commentaries on Entropy I've encountered thanks SB, especially as "The # of microstates per macrostate" - Given that a macrostate is simply an arbitrary human classification/aggregation, does this mean that entropy is an arbitrary physical aggregate outcome? I think I'm missing something, so I'll listen to your commentary until I can discern this thing. 🥝🐐

  • @huynguyen4450
    @huynguyen4450 · 7 months ago

    Thanks Sabine! This makes me appreciate my current stat mech class more!

  • @alexandretorres5087
    @alexandretorres5087 · 10 months ago

    This remembers me of the book "A Choice of Catastrophes" by Isaac Asimov. He talks about how to survive heat death by exploiting low entropy fluctuations. The book was written in 1979.

  • @A_Stereotypical_Guy

    @A_Stereotypical_Guy

    10 ай бұрын

    Ah good remembers

  • @bullpuppy7455

    @bullpuppy7455

    10 ай бұрын

    @@A_Stereotypical_Guy gooder:)

  • @Syncrotron9001

    @Syncrotron9001

    10 ай бұрын

    We won't make it anywhere near that long

  • @Syncrotron9001

    @Syncrotron9001

    10 ай бұрын

    True vacuum coming soon

  • @perendinatorian

    @perendinatorian

    10 ай бұрын

    also in ''the last question''. yah boi was preoccupied with surviving heat death.

  • @marknovak6498
    @marknovak6498 · 10 months ago

    This is the coolest clearest and most concise explanation of entropy ever. I wish my physics professors had taught it this way. It would have saved me so much sleep.

  • @aggies11

    @aggies11

    10 ай бұрын

    True say. While almost anyone can understand a concept/subject, it really takes a special mind to be able to explain it in a way that can be understood by someone else. Sabine's way of conveying information is so refreshing.

  • @aurelienyonrac

    @aurelienyonrac

    10 ай бұрын

    Yes. Finally someone who admits it is a human bias and not a law. 😅 To compare is not fair. Everything is perfect compared to itself.

  • @FGBFGB-vt7tc

    @FGBFGB-vt7tc

    10 ай бұрын

    @@aggies11 I am a firm believer in the Feynman methodology: start as simple as you can to build up knowledge. If you can explain a complex idea in such a way that a child (or a non-specialist) can understand it while still being faithful to the core concepts then you are good!

  • @ricktownend9144

    @ricktownend9144

    10 ай бұрын

    @@aurelienyonrac Yes, I get fed up with it being called a law ... to call it an assumption, or - as sabine does - a matter of probability, would be much more accurate and satisfying.

  • @monnoo8221

    @monnoo8221

    10 ай бұрын

    maybe, but she misses the point. and hence your professor too

  • @ShipOfFreaks
    @ShipOfFreaks · 8 months ago

    This is a great philosophical point. I love your subtlety, Sabine.

  • @Grantnatnian
    @Grantnatnian · 4 months ago

    Thanks Sabine! A few years out of college and missing my p Chem courses, this brought back happy memories

  • @tyfooods
    @tyfooods · 10 months ago

    The idea that there are always macrostates capable of turning a high entropy system into a low entropy system is fascinating. Entropy being constant, but perceived by us macrostates as variable, and thus subjective is powerful. I love how more and more physicists are adopting such a perspective! 😁✨

  • @adamt5

    @adamt5

    10 ай бұрын

    time crystals are a great example. Check that out!

  • @thearpox7873

    @thearpox7873

    10 ай бұрын

    It may be powerful, but that doesn't mean it is particularly useful. Saying that life may be able to emerge post-heat death of the universe just because it perceives reality differently may be a great Douglas Adams book, but belongs right alongside the parallel dimensions theories in plausibility.

  • @florianp4627

    @florianp4627

    10 ай бұрын

    ​@@thearpox7873isn't the idea similar to Penrose's cyclical cosmos hypothesis? And he has proposed some actual ways to experimentally verify that

  • @thearpox7873

    @thearpox7873

    10 ай бұрын

    @@florianp4627 It depends on what you mean by "idea" and what you mean by "similar". But I personally find Penrose's hypothesis intellectually coherent, interesting and plausibly congruent with reality, while Sabine here is engaging in the exact same cognitive hocus-pocus that she makes fun of certain other physicists so much for.

  • @dzcav3

    @dzcav3

    10 ай бұрын

    Sounds like Maxwell's demon

  • @abelriboulot7166
    @abelriboulot7166 · 10 months ago

    Great video! But a tiny correction: high entropy means high information, not low information. The more random something is, the less succinctly it can be explained, at least for Shannon entropy. For instance, imagine water in its solid form with molecules neatly aligned: it can be described succinctly as a repeating pattern, whereas in its (high-entropy) gaseous form each particle can be anywhere in the volume that contains it; describing the microstates would require describing the position of each molecule (high information).

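To illustrate that correction: Shannon entropy H = -sum(p·log2 p) is indeed larger for the spread-out (gas-like) distribution than for the sharply peaked (ice-like) one, meaning more bits are needed to pin the state down. A small Python sketch with made-up probabilities:

    import math

    def shannon_entropy_bits(probs):
        # H = -sum(p * log2(p)), skipping zero-probability outcomes.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    peaked  = [0.97, 0.01, 0.01, 0.01]    # "ice-like": almost all weight on one state
    uniform = [0.25, 0.25, 0.25, 0.25]    # "gas-like": every state equally likely

    print(shannon_entropy_bits(peaked))    # ~0.24 bits
    print(shannon_entropy_bits(uniform))   # 2.0 bits
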
  • @commentarytalk1446

    @commentarytalk1446

    10 ай бұрын

    That's a nice description - I'm just watching the vid, but based on what you say I wonder if this video can do better than that concise description. Maybe the simpler the explanation, the more encompassing it is as well? Entropy feels like such a description tending towards that... hence its ubiquity/applicability.

  • @willthecat3861

    @willthecat3861

    10 ай бұрын

    Ya... I think one has to be careful when analyzing a physical example using Shannon. In physics it's about information known (or information that can potentially be known). It's not conceivable that we could know very much about the microstates of gas molecules, yet we have a lot of information about the microstates of ice. Thus, in the context of the ice and gas example you gave, it's kind of the opposite of what you said. There is a subtle connection between the two concepts of entropy. A good paper on it is by Jaynes, if you want to read that. Sean Carroll also writes about this too.

  • @lawrencedoliveiro9104

    @lawrencedoliveiro9104

    10 ай бұрын

    But given that entropy tends to increase, does that mean the amount of information tends to increase or decrease? Consider the case of a gas confined to one half of a container by a barrier. Then the barrier is opened and the gas escapes to fill both halves of the container. Do we have more or less information about the positions of the gas particles than we did before?

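For the standard textbook version of that thought experiment (isothermal free expansion of an ideal gas into the doubled volume), the thermodynamic entropy rises by ΔS = n·R·ln 2, and in information terms you know less afterwards: one bit of "which half?" information per particle is gone. A rough Python sketch for one mole:

    import math

    R = 8.314          # gas constant, J/(mol*K)
    N_A = 6.022e23     # particles per mole
    n_moles = 1.0

    delta_S = n_moles * R * math.log(2)   # ~5.76 J/K once the barrier is opened
    bits_lost = n_moles * N_A             # one positional bit per particle is lost

    print(delta_S, bits_lost)
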
  • @jarnorajala

    @jarnorajala

    10 ай бұрын

    This bothered me too. Maybe Sabine meant information in the colloquial sense, not how it's defined in Information Theory. Either way it's confusing.

  • @noneofyourbusiness-qd7xi

    @noneofyourbusiness-qd7xi

    10 ай бұрын

    You are perfectly right and she got it wrong. Information and entropy are perfectly positively correlated, not negatively.

  • @charlesthomas8450
    @charlesthomas8450 · 25 days ago

    She’s an excellent teacher and her dry humor is hilarious! Love her!

  • @undercoveragent9889

    @undercoveragent9889

    6 күн бұрын

    "She’s an excellent -teacher- _propagandist_ and her dry humor is hilarious! Love her!" FYP!

  • @LiamCraft02
    @LiamCraft02 · 9 months ago

    This is the best explanation of entropy I’ve heard yet

  • @Anaesify
    @Anaesify · 10 months ago

    Sabine your work has changed the very way I view life and physics in a way that's nearly spiritual. I feel so lucky to be able to learn about the beauty of the universe and physics from you

  • @matthewparker9276
    @matthewparker9276 · 10 months ago

    The statistical description of entropy has never resonated with me, but your talk about other life with access to different macrostates is very interesting, and warrants further thought.

  • @kristianshreiner6893

    @kristianshreiner6893

    10 ай бұрын

    It's vague and insubstantial. She needs to produce some scenario that could meaningfully demonstrate how one sidesteps Boltzmann's formulation of the second law - not the order/disorder pop-science stuff, the physics. Proposing some race of hypothetical beings that rely on different macrostates is about as meaningful as proposing that magical beings will eventually come into existence who live differently. I love her, but this argument isn't serious.

  • @demonicakane2083

    @demonicakane2083

    10 ай бұрын

    I didn't get what she means by complex systems with different microstates which we can't use, and further that the entropy is small for them... Can you explain?

  • @ywtcc

    @ywtcc

    10 ай бұрын

    Could Entropy just be a matter of perspective? If a horizon from one perspective can appear to be a singularity from another perspective... A horizon takes an eternity, and a singularity happens in an instant.... And, a horizon is a state of maximum entropy, and a singularity is a state of minimum entropy... Entropy should be dependent on your reference frame. If your horizon is growing, then your entropy should be growing. I've been thinking about it like this after listening to Leonard Susskind. I don't know if I'm getting it quite right, but it's really interesting to think about in this way.

  • @ThePowerLover

    @ThePowerLover

    10 ай бұрын

    @@kristianshreiner6893 _"Proposing some race of hypothetical beings that rely on different macrostates is about as meaningful as proposing that magical beings will eventually come into existence who live differently."_ Well, if you understand "magic" as "fairy tales", like many do, then it's a straw-man fallacy, because Sabine didn't think of that in terms of "something that doesn't exist nor can exist". And she didn't use the order/disorder stuff, except to criticize it. Boltzmann's formulation, as Sabine hinted in this video, is only useful for things that we "know", not stuff we clearly don't know, such as information not accessible to us. Stop lying pls.

  • @ThePowerLover

    @ThePowerLover

    10 ай бұрын

    @@ywtcc All statistics is clearly a "matter of perspective".

  • @Turbohh
    @Turbohh · 9 months ago

    Entropy change expresses the potential for change of the state of life. It seems more interesting and philosophical than anything else. I do like your description of changed entropy for the future and what it could mean....a crazy world indeed. Very good views...thank you!

  • @EminezArtus
    @EminezArtus · 1 month ago

    Probably one of the best explanations of entropy. Thanks for the video.

  • @juancarlos-cl7cs
    @juancarlos-cl7cs · 10 months ago

    I love the depth and clarity you have; this kind of content is gold. I studied Physics and Philosophy, and a conversation with you would have saved me years of doubts and confusion. Thank you for this.

  • @exciton007
    @exciton007 · 10 months ago

    Intuitive explanation of entropy. Thanks a lot

  • @Donald-fg2ew
    @Donald-fg2ew · 14 days ago

    For the life of me I don't remember where I first heard "life is the postponement of entropy" but that quote has stuck with me since I was about 15 or 16 years old so had been a factor in my thinking for over 30 years now. It is nice to see someone talk about life vs entropy.

  • @Viky.A.V.
    @Viky.A.V. · 9 months ago

    Thank you so much, it sheds some light on the entropy for me =)

  • @live_free_or_perish
    @live_free_or_perish · 10 months ago

    Thank you for delivering such great content 👏 always interesting, humorous, and informative. My field is engineering, but sometimes, I regret not pursuing physics. Watching your videos gives me a chance to see what I missed.

  • @SabineHossenfelder

    @SabineHossenfelder

    10 ай бұрын

    Many thanks from the entire team!

  • @barnsisback8524

    @barnsisback8524

    10 ай бұрын

    @@SabineHossenfelder Time does not pass; only you are moving through time. Time is a coordinate of the state of the moving entropy. From the past state to the present state you can aim at the future state.

  • @enk335

    @enk335

    10 ай бұрын

    you still have time!

  • @effectingcause5484

    @effectingcause5484

    10 ай бұрын

    Engineering is good experience for a prospective physicist. Nikola Tesla, perhaps the greatest engineer, was by extension, also one of the greatest physicists who ever lived, especially in the field of electromagnetism.

  • @luipaardprint
    @luipaardprint · 10 months ago

    I've always read the term heat death as the death of heat, which made a lot of sense to me.

  • @SchgurmTewehr

    @SchgurmTewehr

    10 ай бұрын

    Or a death caused by heat. It could mean both things, and language is just inaccurate in many ways.

  • @AdaptiveApeHybrid

    @AdaptiveApeHybrid

    10 ай бұрын

    I thought so too lol

  • @vikiai4241

    @vikiai4241

    10 ай бұрын

    Yes, the word order in English defaults to "Death by heat" , but "Death of heat" is also a valid interpretation of the words, though not the default.

  • @gnomiefirst9201

    @gnomiefirst9201

    10 ай бұрын

    @@vikiai4241 that's interesting because I am a native English speaker and frequently come onto this but never was sure about my way of thinking about it. Thanks for the validation lol.

  • @FredMaverik

    @FredMaverik

    10 ай бұрын

    @@vikiai4241 Why though? What rule is there

  • @42_universe
    @42_universe · 1 month ago

    Finally, a video that makes the topic crystal clear. And the comments about macro vs micro states were incredibly logical and very interesting. This should be the ultimate video on entropy.

  • @vieiradelimafilho
    @vieiradelimafilho · 1 month ago

    Great video, as usual. Sabine really puts things in perspective with her very humble scientific admission of gaps. Around the 7th minute we can visualize clearly the greatest bias (blindness?) that will make our grandkids cringe in a century: "how could you not understand this emergent order?" To me it feels like it's way past time we start conceptualizing things like "syntropy" and "eutropy", and take Metaphysics seriously again. Restore it to its natural place since Aristotle: the real "theory of everything". Intuition (idealism) is the way we understand things so that science can make strides again. Empiricism depends on it, and is basically a secondary development.

  • @scifieric
    @scifieric · 10 months ago

    "It can only go downhill from here" made me laugh out loud. Another excellent video that teaches us about science. Lovely.

  • @homeworld22
    @homeworld22 · 10 months ago

    For a mortal being with a life expectancy of ~80yrs I confess I spend an irrationally large portion of my life brooding on the eventual heat death of the universe. Glad to find videos like this on occasion which at least try and come up with a philosophical explanation for why we shouldn't feel depressed over the stars eventually going out. Kudos Sabine!

  • @edh.9584

    @edh.9584

    10 ай бұрын

    It almost makes one consider praying to God.

  • @siddified

    @siddified

    10 ай бұрын

    @@edh.9584 which one.. There's so many to choose from...

  • @jongyon7192p

    @jongyon7192p

    10 ай бұрын

    @siddified Definitely the Pasta One Although I personally like the machine god that came from the future

  • @edh.9584

    @edh.9584

    10 ай бұрын

    @@siddified Well, choose one.

  • @jongyon7192p

    @jongyon7192p

    10 ай бұрын

    @@edh.9584 The "satanic temple" has some surprisingly good doctrine

  • @EricBittner
    @EricBittner · 8 months ago

    Very nice! It's refreshing to hear someone talk about entropy in this manner. When I teach thermo, I always emphasize that entropy is not about order or disorder; besides, I don't know what those terms mean in a quantitative sense. Second, the analogy of the universe being in 1 microstate per macrostate is spot-on correct. The thermodynamic definition of entropy, dS = dq/T (for a reversible process), also ties into this analogy. If the universe's entropy increases, then heat (dq) must be added... but from where? That "where" can be incorporated into your definition of the universe, which moves the "where" somewhere else, and so on. So one is faced with the notion that the universe's expansion must be adiabatic (dS = 0)... which is a beautiful problem in Callen's Thermodynamics book.

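A worked instance of that dS = dq/T definition (a sketch, using the standard latent heat of fusion for water): melting 1 kg of ice reversibly at 0 °C.

    latent_heat_fusion = 3.34e5     # J/kg for water ice
    T_melt = 273.15                 # K
    mass = 1.0                      # kg

    Q = mass * latent_heat_fusion   # heat absorbed reversibly at constant temperature
    delta_S = Q / T_melt            # ~1220 J/K gained by the melting ice

    print(delta_S)
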
  • @mechez774
    @mechez774 · 7 months ago

    I recently learned that Planck was highly motivated by believing in the 2nd law, but was still reluctant to embrace atomism. What I seem to remember from either a P Chem or a Chem E course is that entropy, or was it enthalpy(?), is a calculable quantity that plays a role in the equations much the same as energy. We can recognize that it is a construct, but it still plays a role in our modern frame of reference. I think it is something still worth pondering, as Planck did, so I thank you for the video, although the first 8 minutes of textbook regurgitation made me want to pound my head on the desk. Actually I don't own a desk, just a sewing machine table so I can make my sails and escape to my ocean sabbatical... maybe then I will understand entropy.

  • @willdon.1279
    @willdon.1279 · 10 months ago

    She is wonderful at explaining very difficult or (almost) impossible ideas in an understandable way. Alas, even Sabine leaves my poor brain often mystified, but always stretched, curious, and entertained. 🙂

  • @hannahbaxter8825

    @hannahbaxter8825

    10 ай бұрын

    Same same

  • @warrenjoseph76

    @warrenjoseph76

    10 ай бұрын

    When it’s a topic like this I need to watch a few times over a few days and dig more into some of the ideas with other videos. I like that she doesn’t completely dumb it down for people like us but opens the door for us to learn more (such as that entropy logarithm formula). But yeah at the end of this video I STILL don’t QUITE understand a simple way to explain entropy to someone. Yet. But I will after a few views 😂

  • @aaronreyes7645

    @aaronreyes7645

    10 ай бұрын

    She is wonderful Without the gobblety gook

  • @jackcarswell4515

    @jackcarswell4515

    10 ай бұрын

    @@aaronreyes7645 In my opinion, it was all gobblety gook. She talked in circles and never really said anything at all

  • @TheBsavage

    @TheBsavage

    10 ай бұрын

    I know, right? I'm totally in love with her. With her MIND, but I suspect it's a package deal. I'm still game.

  • @Dron008
    @Dron008 · 10 months ago

    I have been trying to understand entropy for decades, now I am a little closer to it.

  • @treadwell1917

    @treadwell1917

    10 ай бұрын

    Entropy is simply a lack of data. In a “real” sense though. Information or data is better to say than order because things being “ordered” are subjective. So entropy happens when you begin to lose information of a system. If something is breaking down it’s order or information is also separating. Therefore it’s entropy is increasing. In a way entropy could be seen as time. Because time moves forward as things spread out or break down. It’s even how we quantify the “second” is by using the measure of an atom breaking down. These two things are the same though. We measure time by our perception of all systems around us progressing towards entropy. So time moves based on the frame rate we perceive it which is again measured or quantized by entropy.

  • @treadwell1917

    @treadwell1917

    10 ай бұрын

    I wasn't far enough along in the video; she pretty much says the same thing.

  • @cinegraphics

    @cinegraphics

    10 ай бұрын

    Entropy is simply the rise of equality. Which is a result of calculations the nature does to produce the next moment of time. So, entropy is simply the result of calculation. It's real, but it's wrongly defined. Entropy is not a measure of disorder. It's a measure of equality. Because once the full equality is reached, the universe stops. The formulas still work, but their inputs and outputs become the same, hence there's no change, hence the passage of time makes no difference. That's maximum entropy. Or equality. So, equality is bad 😊

  • @snaawflake

    @snaawflake

    10 ай бұрын

    ​@@cinegraphics No their inputs and outputs don't become the same, only when you're working with an information discarding model of the universe (macrostate). There is still exactly one microstate leading to exactly one next microstate, and thus each microstate has exactly one preceding microstate, so the universe in terms of microstates cannot advance to a point where the inputs and outputs become the same; as then that final microstate that would be reached would have more than one way for it to be reached: first the last state of the universe in which inputs and outputs were not the same, and second the final state itself where inputs and outputs are the same. But this is in contradiction with the assumption that there is only exactly one microstate leading to exactly one next microstate.

  • @cinegraphics

    @cinegraphics

    10 ай бұрын

    @@snaawflake you're forgetting the rounding errors. At one moment the error level drops below the computation precision and the output becomes same as input. And that's the end of computation. Death of the universe. Stable state has been reached.

  • @mikey1836
    @mikey1836 · 3 months ago

    I’m not old, I’m just high entropy.

  • @4c00h
    @4c00h · 4 months ago

    13:37 this is key, thanks for all the clear explanations and happy to see you got ready early for movember

  • @gaemlinsidoharthi
    @gaemlinsidoharthi · 10 months ago

    This reminds me of Roger Penrose’s ideas. I think you shared a stage with him at one point. If we change our scale of looking at the universe, in both space *and* time, the almost uniform distribution after 10^100 years (or whatever the number is) becomes recognisably clumpy again. The almost imperceptible effects of gravity speed up and become perceptible again.

  • @CodepageNet

    @CodepageNet

    10 ай бұрын

    I don't think this is possible. If the universe at that point is nothing but a diluted, even gas, there may be some random "clumping", but there's just not enough matter nearby for anything more than a few particles hooking up.

  • @akidnag

    @akidnag

    10 ай бұрын

    The most probable state for the universe is a gigantic black hole, which has the largest entropy given by the Bekenstein formula.

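The formula referred to above is the Bekenstein-Hawking entropy S = k·c^3·A / (4·G·ħ), with A the horizon area. An order-of-magnitude sketch in Python for a one-solar-mass Schwarzschild black hole:

    import math

    G, c, hbar, k_B = 6.674e-11, 2.998e8, 1.055e-34, 1.381e-23   # SI units
    M = 1.989e30                                                  # kg, one solar mass

    r_s = 2 * G * M / c**2               # Schwarzschild radius, ~3 km
    A = 4 * math.pi * r_s**2             # horizon area

    S = k_B * c**3 * A / (4 * G * hbar)  # ~1.5e54 J/K, i.e. ~1e77 in units of k_B
    print(S, S / k_B)
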
  • @IamPreacherMan

    @IamPreacherMan

    10 ай бұрын

    @@CodepageNet cold things contract. The universe is expanding because it’s still heating as evidenced by the high number of stars currently burning. Turn the heat off and the dark matter contracts as does the distance between all particles and celestial bodies. The crunch will happen as it obviously happened before. Unless you believe the universe poofed into existed out of nothing. Which defies all known physics and logic. I think Penrose is brilliant. And wrong about his cyclical universe approach. But I also think hawking was wrong about BH radiation for a number of reasons.

  • @5naxalotl

    @5naxalotl

    10 ай бұрын

    @@CodepageNet the point is that there is eventually no matter, just stray photons. but photons move at the speed of light and from their point of view everything happens instantly. penrose claims that this has peculiar consequences. i can't tell you more than that, or vouch for penrose, but you don't seem to be anticipating the situation fully

  • @wafikiri_

    @wafikiri_

    10 ай бұрын

    Once every remaining particle is beyond the event horizon of every other particle, there is no way they can interact again. Expansion of the universe would end up in a zillion one-particle universes.

  • @wyrmh0le
    @wyrmh0le · 10 months ago

    Very interesting point about 'macrostates' that I've never heard or thought of before. Definitely food for thought. Thanks!

  • @virgild88
    @virgild88 · 6 months ago

    thank you for this wonderful video and for all the others. you have kindled an interest in physics in me.

  • @adrianwright8685
    @adrianwright8685 · 4 months ago

    Love the videos Sabine, especially the hand waving - are you trying to hypnotize us?

  • @adanbrown
    @adanbrown · 10 months ago

    Videos that provide a "simple understanding" of historically complex concepts (like the laws of thermodynamics) are crucial for our kids to hear and learn. I am especially interested when the video explains what we "don't know" or "don't yet understand" to pique their interest in what to solve next!

  • @vazap8662
    @vazap8662 · 10 months ago

    Sabine has just addressed a question that has been tickling me since my teenage years: this notion of pockets of emergent decreases in entropy, such as, say, our brains. I'm so grateful to hear her address this topic.

  • @FredMaverik

    @FredMaverik

    10 ай бұрын

    ...what

  • @hyperduality2838

    @hyperduality2838

    10 ай бұрын

    SYNTROPY (convergence) is dual to increasing ENTROPY (divergence) -- the 4th law of thermodynamics! Making predictions to track targets, goals & objective is a syntropic process -- teleological. Teleological physics (syntropy) is dual to non teleological physics (entropy). Concepts (mathematics) are dual to percepts (physics) -- the mind duality of Immanuel Kant. Mathematicians create new concepts from their perceptions (geometry) all the time! Noumenal (rational, analytic, mathematics) is dual to phenomenal (empirical, synthetic, physics) -- Immanuel Kant. Mathematics (concepts) is dual to physics (measurements, perceptions). Deductive inference is dual to inductive inference -- Immanuel Kant. Inference is dual. The rule of two -- Darth Bane, Sith lord. "Always two there are" -- Yoda. Subgroups (discrete, quantum) are dual to subfields (continuous, classical) -- the Galois Correspondence. Classical reality is dual to quantum reality synthesizes true reality -- Roger Penrose using the Hegelian dialectic.

  • @henrythegreatamerican8136

    @henrythegreatamerican8136

    10 ай бұрын

    Locally you can have decreasing entropy, but the overall entire system remains highly entropic.

  • @Lightning_Lance

    @Lightning_Lance

    10 ай бұрын

    @@hyperduality2838 I'm here to remind you to take your pills.

  • @dugiejoness5197
    @dugiejoness5197 · 8 months ago

    Entropy increases only in hermetically isolated thermodynamic systems. When we are dealing with a flow of energy, the entropy decreases and the atoms organize themselves into more and more complex structures, e.g. life. The universe becomes more orderly rather than chaotic.

  • @irenerosenberg3609
    @irenerosenberg3609 · 5 months ago

    So glad a brilliant person, such as Sabine, was also confused by the use of "order" to describe entropy. "Order" never made sense to me in relation to entropy.

  • @KauTi0N
    @KauTi0N · 10 months ago

    I've been waiting a long time for this video. Sabine, you are an amazing communicator and I think the world needs to hear you. This channel and other sources like it are THE REASON I use the internet. Thank you! ❤

  • @hyperduality2838

    @hyperduality2838

    10 ай бұрын

    SYNTROPY (convergence) is dual to increasing ENTROPY (divergence) -- the 4th law of thermodynamics! Making predictions to track targets, goals & objective is a syntropic process -- teleological. Teleological physics (syntropy) is dual to non teleological physics (entropy). Concepts (mathematics) are dual to percepts (physics) -- the mind duality of Immanuel Kant. Mathematicians create new concepts from their perceptions (geometry) all the time! Noumenal (rational, analytic, mathematics) is dual to phenomenal (empirical, synthetic, physics) -- Immanuel Kant. Mathematics (concepts) is dual to physics (measurements, perceptions). Deductive inference is dual to inductive inference -- Immanuel Kant. Inference is dual. The rule of two -- Darth Bane, Sith lord. "Always two there are" -- Yoda. Subgroups (discrete, quantum) are dual to subfields (continuous, classical) -- the Galois Correspondence. Classical reality is dual to quantum reality synthesizes true reality -- Roger Penrose using the Hegelian dialectic.

  • @JohnPretty1

    @JohnPretty1

    10 ай бұрын

    You are psychic?!

  • @gregwarrener4848

    @gregwarrener4848

    10 ай бұрын

    @@hyperduality2838 Sounds like you're grasping at straws, likely for some sort of affirmation of a biased narrative. Entropy is all around you and can be observed; posing a balance to the force is wishful thinking.

  • @hyperduality2838

    @hyperduality2838

    10 ай бұрын

    @@gregwarrener4848 Your concept of reality is a prediction or model -- syntropic!

  • @MR-ub6sq

    @MR-ub6sq

    10 ай бұрын

    @@hyperduality2838 Really?

  • @cedriclorand1634
    @cedriclorand1634 · 10 months ago

    Sabine's humour and wit are so empowering. Such a wonderful person. Thank you for existing, Sabine 😂

  • @leonardgibney2997

    @leonardgibney2997

    10 ай бұрын

    She doesn't exist. Our reality is an illusion.

  • @peter9477

    @peter9477

    10 ай бұрын

    She couldn't help it. Her existence is a high entropy state.

  • @johnkufeldt3564

    @johnkufeldt3564

    10 ай бұрын

    I agree, but you summed it up with far fewer words. Cheers from Canada.

  • @neanda

    @neanda

    10 ай бұрын

    so true 🤣 it's so subtle as it blends in with her education. imagine her being the teacher, she would grab your attention because you'd be like wtf? then listen more. she's a great teacher. 'if you tell this to a random physicist on the street to see if they agree, you should let them go as they've got better things to do' 🤣

  • @robert48719
    @robert487194 ай бұрын

    I have to say: her jokes are rather as you would expect them to be from a German

  • @fabzy4L
    @fabzy4L4 ай бұрын

    I've been trying to explain this to students for years; this video is spot on 🤝🏻

  • @PaulElmont-fd1xc
    @PaulElmont-fd1xc10 ай бұрын

    I have had major depressive episodes because of entropy and the second law. I am deadly serious. Since learning about it in high school physics class, I have always considered it to be my greatest fear. Thank you for easing my fear a bit. 😊

  • @SabineHossenfelder

    @SabineHossenfelder

    10 ай бұрын

    I totally know what you mean.

  • @MommysGoodPuppy

    @MommysGoodPuppy

    10 ай бұрын

    I like to think life only came about because of entropy because a human that builds a car will generate a lot more entropy than a rock that gets chipped to pieces over time.

  • @joansparky4439

    @joansparky4439

    10 ай бұрын

    If you like SciFi I can suggest a book to you that has this as part of the story.. it's called 'Voyage from Yesteryear' by James P Hogan. Should give you a positive story to carry around wherever you go.

  • @jackkrell4238

    @jackkrell4238

    10 ай бұрын

    @@joansparky4439 What about the concept of entropy daunts you so much?

  • @MommysGoodPuppy

    @MommysGoodPuppy

    10 ай бұрын

    @@PaulElmont-fd1xc also an uplifting story on entropy is "The Last Question" by Isaac Asimov

  • @ilkoderez601
    @ilkoderez60110 ай бұрын

    I love your channel Sabine. I've been following you for a long time (we even had an epic argument on Twitter _many_ years ago) and it makes me happy that your channel is doing good!

  • @skriptzurvorlesung4474
    @skriptzurvorlesung44745 ай бұрын

    Can you explain why sunlight is low in entropy but, for example, the radiation from a woodstove is high in entropy?

  • @justicegear85
    @justicegear857 ай бұрын

    I really like this video; it helps enable a better understanding of physics.

  • @Viewpoint314
    @Viewpoint31410 ай бұрын

    That was one of the best lectures although it is hard to compare lectures. It's maybe the first time entropy makes sense even though I have been reading about it all my life. I also like your dry physics humor. One needs humor in life. So thank you very much.

  • @tnb178
    @tnb17810 ай бұрын

    The analogy of "disorder" can work in the right context. You could say: there are all kinds of messy but only one kind of clean. For that reason your room tends towards getting messy if you make any change not specifically targeting cleanliness. What I like about the example is that it uses cleanliness as a macroscopic state defined by a human and talks about the associated microstate count.
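
    To put rough numbers on "all kinds of messy, only one kind of clean", here is a minimal Python sketch of a toy tidy-room model; the item and spot counts are purely illustrative assumptions, not anything from the video or this thread.

        # Toy model: n_items objects, each either in its proper spot or in one of
        # n_wrong_spots wrong places. "Clean" = every object in its proper spot.
        from math import log

        n_items = 10        # illustrative number of objects in the room
        n_wrong_spots = 5   # illustrative number of wrong places per object

        total_states = (n_wrong_spots + 1) ** n_items   # every possible arrangement
        clean_states = 1                                 # only one fully tidy arrangement
        messy_states = total_states - clean_states

        # Boltzmann-style entropy in units of k_B: S = ln(number of microstates)
        S_clean = log(clean_states)
        S_messy = log(messy_states)

        print(f"clean microstates: {clean_states}, S/k_B = {S_clean:.2f}")
        print(f"messy microstates: {messy_states}, S/k_B = {S_messy:.2f}")

    With these made-up numbers, the "messy" macrostate covers roughly 6^10 - 1 ≈ 6 x 10^7 microstates against exactly one "clean" microstate, which is the asymmetry the comment is pointing at.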

  • @boldCactuslad

    @boldCactuslad

    10 ай бұрын

    i see you've never been to a certain college dorm. as determined with mathematical certainty, this was a space of superlative messiness: the Platonic form of disorder, chaos so intense both the gravitational and nuclear forces had all but given up - trash and antitrash not belonging to our dimensions materialized, ionized, annihilated, sublimated, vaporized, crystallized, and floated about the space with such frequency that any external modification to the region did nudge the system towards cleanliness regardless of the actor's intention. the situation passed when a window broke itself in its frustration. the resulting decompression vented the contents of the room (including furniture, wallpaper, rodentia) into the environment.

  • @petevenuti7355

    @petevenuti7355

    10 ай бұрын

    A single homogeneous pile of ash should be the cleanest state; if it's totally homogeneous, then there truly can be only one state.

  • @off6848

    @off6848

    10 ай бұрын

    @@petevenuti7355 This is correct and similar to what I was going to say about her comment that "entropy is why things break down". Actually, things break apart because usually a "thing" is composed of many things forced together, and it eventually unravels because they don't share the same essence. A house will break down because it's a collection of wood, glue, sand, gypsum, wire, metal, etc. But those things on their own do not break down.

  • @LMarti13

    @LMarti13

    10 ай бұрын

    except that there isn't one kind of clean, there are literally an infinite number of them

  • @wojtek4p4

    @wojtek4p4

    10 ай бұрын

    ​@@LMarti13 If you assume any margin within which two adjacent states would be considered "the same", there's a finite number of states - both "clean" and "dirty". E.g. if you assume that a pair of scissors being moved by less than 1 mm constitutes the same state (that is: you quantize the position), the number of states becomes finite. Importantly, increasing the "precision" doesn't make the number of states infinite - and it doesn't change the proportion of "clean" and "dirty" states significantly. So no matter how small your measurement error is, and unless quantum effects start taking over, the number of "clean" states is smaller than the number of "dirty" states, at any scale.

  • @NinaadDas
    @NinaadDas4 ай бұрын

    I don't know whether you had the channel when I was pursuing a science career, but had I seen your channel then, I would have seriously considered taking up a physics major despite my maths department being a bit weak.

  • @lukebradley3193
    @lukebradley31939 ай бұрын

    That was so good. When you bring in information, it gets so luminous. One question that's very rich is whether there are natural macrostates, natural groupings of microstates. Is the even division of the box somehow more preferred by nature than the squiggly line? We can argue that within human math, the division of odd and even natural numbers below a trillion is more natural than a randomly chosen subset, because it takes less information to specify. However, that is only true within the domain of discourse, the number system as we have defined it. At the heat death of the universe, we are talking about this quantum soup we already know is nonlocal, and it very well may be that what we call spacetime is only an emergent construct of the present configuration of things. That means it may be that anything is possible there, as Boltzmann was mocked for claiming, with "Boltzmann brains" and the like. There could even be a Sabine video there that makes my head explode even more than this, recreating the Big Bang and starting the whole thing over, who knows.

  • @PATRIK67KALLBACK
    @PATRIK67KALLBACK10 ай бұрын

    Great video Sabine! Even though I have an MSc in chemistry and a PhD in pharmacy, one of the hardest things to understand is thermodynamics. You really have to eat and breathe thermodynamics to really understand it, and it doesn't come with intuition, and one of those things is entropy. So thank you Sabine for making entropy more intuitive.

  • @mixerD1-

    @mixerD1-

    10 ай бұрын

    Farmacy? Do you mean agriculture or is this alternative medicine? Sorry... couldn't help myself.😁

  • @andrewmycock2203

    @andrewmycock2203

    10 ай бұрын

    @@mixerD1-or maybe an Fhd.

  • @PATRIK67KALLBACK

    @PATRIK67KALLBACK

    10 ай бұрын

    ​@@mixerD1-ha ha, too much swedish 😊

  • @noneofyourbusiness-qd7xi

    @noneofyourbusiness-qd7xi

    10 ай бұрын

    Your physical chemistry profs will be extremely disappointed if they read your comment (and likely be sorry that they let you pass)

  • @beingsentient
    @beingsentient9 ай бұрын

    I don't understand what Sabine is saying at the very end, about other life forms. Up to that, fine. She says first that the universe is in exactly one microstate, always, and always with zero entropy. I can see that, using the analogy of the partitioned box, whether the partition is in place or not and whether the molecules are in half the box or in the full box. The states of individual molecules (mass + energy) are the same in each case. She then applies that to the universe, declaring that there is only one microstate in the universe, whether or not there are "partitions," such as stars, planets, black holes, etc. She's saying that we humans are artificially looking at the macrostates for our own purposes, and as far as the universe goes, it's only in one microstate at zero entropy. I get that. So she must then be saying that the increase of entropy in conjunction with the arrow of time that physicists teach us is an artifact of our dividing the universe into macrostates. Fine. But what does she mean by saying there's a possibility of other life forms appearing? From where? From the same universe? How can that happen? What region of lower entropy can provide for such life forms (macrostate)? Near Heat Death, there will be next to no concentrated bits of matter or energy from which to construct - or imagine - any macrostate. I do see parallels here with Penrose's idea that at Heat Death, the universe forgets itself, unable to distinguish Heat Death from the instant of creation, and so with no distinction, Heat Death again becomes the first instant of creation. And on and on. But Sabine seems to be wading far more into speculation here, and I don't see the basis for it. Am I wrong?

  • @Nefville
    @Nefville10 ай бұрын

    According to entropy at some point we're going to come back to this video and say "well this didn't age well" 😉

  • @anywallsocket

    @anywallsocket

    10 ай бұрын

    That'd be Poincare's recurrence theorem which only applies to classical closed systems.

  • @ThePowerLover

    @ThePowerLover

    10 ай бұрын

    @@elinope4745 Not quite.

  • @dr.gordontaub1702
    @dr.gordontaub170210 ай бұрын

    I think many of my (undergraduate) students would get a laugh (or a cry) out of the phrase, '...You don't need to know that much maths, just differential equations and probabilities...'

  • @bryck7853

    @bryck7853

    10 ай бұрын

    linear algebra and complex number theory be damned!

  • @afterthesmash

    @afterthesmash

    10 ай бұрын

    There's not a single thing about Special Relativity that Einstein couldn't have explained to Archimedes. Einstein would probably take a detour into explaining the modern notation of algebra. But this wouldn't really _be_ algebra, for the same reason that many undergraduates in computer science can barely distinguish a formula from an equation.
    Archimedes: This algebra thing is cool! I wonder what else you can do with it.
    Einstein: Well, I did once take a year-long detour into the deep technical weeds of Ricci tensors.
    Archimedes: Excellent! Please explain. [Archimedes smooths out some complex geometric diagram in the sand.]
    Einstein: Uh, okay, but we're going to need a _much_ bigger beach.

  • @johnzander9990

    @johnzander9990

    10 ай бұрын

    My college students would laugh at "half the box being filled with particles" being described as a low entropy state, as they would understand that this is just one state that's counted in the entropy of the system. Since your comment implies you have physics students, do you not see anything wrong with her understanding of thermodynamics?

  • @brooklyna007

    @brooklyna007

    10 ай бұрын

    @@johnzander9990 But it clearly has lower entropy relative to being fully spread out in the box. Restricting the particles to half the volume cuts the number of available microstates by a factor of 2^N for N particles, not just by half. Further, it is clearly not an equilibrium state and not a state that can last long.

  • @brooklyna007

    @brooklyna007

    10 ай бұрын

    @@johnzander9990 Sorry, I should add that your understanding of entropy is a bit concerning. The state of the box half filled is the macrostate. It is not one of the contributing microstates to some other macrostate, as your statement implies. Ex. You could instead take the macrostate to be that all molecules are somewhere in the box, and *then* you could consider "all on one side" as a contributing microstate of that macrostate.
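
    For reference, a minimal sketch of that counting argument, coarse-graining each particle to simply "left half" or "right half" of the box (an illustrative simplification; N is an arbitrary number, not anything from the video or this thread):

        # Macrostate "all N particles in the left half" vs "particles anywhere in the box".
        from math import log

        k_B = 1.380649e-23   # Boltzmann constant, J/K
        N = 1_000            # illustrative particle number

        # With 2 coarse-grained cells per particle, "anywhere" has 2**N times more
        # microstates than "all on the left" (one cell per particle).
        ratio = 2 ** N

        # Entropy difference: S_anywhere - S_left = k_B * ln(2**N) = N * k_B * ln 2
        delta_S = N * k_B * log(2)

        print(f"microstate ratio is 2^{N} (about {len(str(ratio))} decimal digits)")
        print(f"entropy difference: {delta_S:.3e} J/K")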

  • @whong09
    @whong094 ай бұрын

    Veritasium's video on entropy is really interesting and maybe aligns with the "uplifting" takeaway here. The universe doesn't just tend towards higher entropy, it also tends towards faster entropy gain. Life as we know it is locally the fastest entropy-increasing mechanism; as entropy further increases, other life-like mechanisms (that may not be recognizable to us) should become likely, because the universe tends towards accelerating entropy growth.

  • @derekboyt3383
    @derekboyt33833 ай бұрын

    The idea that heat death will occur depends on the shape of the universe. If infinite expansion is assumed then that might be a possibility but we can’t be certain of this given the small scale universe we observe. Personally, I believe the universe is toroidal (like the atoms, planets, stars, and galaxies) and that the energy will eventually return to a singularity. At this point the entire universe will be ordered and entropy will not exist.

  • @inciaradible7144
    @inciaradible714410 ай бұрын

    Fascinating video and very well explained; entropy has always been one of these quantities I've found difficult to define or understand. It has mostly been explained to me in the concept of order (or just plain equations), and those mostly just made me go ‘oh that makes sense... wait.’ Minor note, Boltzmann has two n's. 😅

  • @serpentphoenix
    @serpentphoenix10 ай бұрын

    "If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations-then so much the worse for Maxwell’s equations. If it is found to be contradicted by observation-well these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation." - Arthur Eddington, New Pathways in Science

  • @kongtheanwong1849
    @kongtheanwong18494 ай бұрын

    🎯 Key Takeaways for quick navigation:
    00:00 🌌 Introduction to Entropy and the 2nd Law of Thermodynamics
    - Introduction to the concept of life requiring order and structure.
    - Physics perspective on entropy and its connection to the increase in disorder.
    - Questions about the accuracy of the 2nd Law of Thermodynamics and its implications.
    02:26 🚗 The Arrow of Time and Probability
    - Explanation of the "arrow of time" and its connection to the likelihood of events.
    - Discussion on time reversibility in physics equations.
    - Importance of initial states and the probability associated with entropy.
    05:00 ⚙️ Entropy, Equilibrium, and Work
    - Definition of entropy as a measure of system likelihood.
    - Explanation of equilibrium and its role in defining useful energy (work).
    - Practical applications of entropy in physics and daily life.
    07:42 ☀️ Entropy, Reservoirs, and the Fate of the Universe
    - Discussion on the Past Hypothesis and the low entropy start of the universe.
    - The role of entropy in the increasing disorder of the universe.
    - Utilization of low entropy reservoirs like the sun and fossil fuels.
    09:33 ❄️ Heat Death and the End of the Universe
    - Explanation of the concept of "heat death" as the eventual fate of the universe.
    - Clarification that "heat death" doesn't imply a hot end but rather a state of maximum entropy.
    - Discussion on the exhaustion of low entropy reservoirs leading to a cosmic standstill.
    10:25 🌌 Entropy, Order, and Perception
    - Clarification of the confusion surrounding the relationship between entropy and order.
    - Examples of seemingly ordered situations with varying entropy.
    - Discussion on the subjective nature of human perceptions of order.
    12:12 🔄 Entropy, Microstates, and the Second Law
    - Explanation of microstates and macrostates in the context of entropy.
    - Assertion that the second law doesn't strictly state an increase in entropy but rather a lack of decrease.
    - The role of information loss in the perception of increasing entropy.
    14:26 🌐 Entropy, Quantum Mechanics, and the Future
    - Discussion on the relationship between entropy and information in quantum mechanics.
    - The proposition that complex systems may emerge, reducing entropy for those systems.
    - Speculation on the future of life in the universe and its potential forms.
    15:51 📚 Conclusion and Brilliant.org Sponsorship
    - Summary of the video's main points about entropy and the 2nd Law of Thermodynamics.
    - Encouragement to explore further learning on Brilliant.org.
    - Information about Sabine's own course on Brilliant, focusing on quantum mechanics.
    Made with HARPA AI

  • @Galileosays
    @Galileosays9 күн бұрын

    Entropy is a macroscopic property that can only be quantified at equilibrium. In the case of the box, equilibrium is established when the particles bounce equally to each side of the box, otherwise one side would have a lower pressure, i.e. no equilibrium. Once equilibrium is reached, the particles will never be on one side of the box spontaneously. This is impossible from a kinetic point of view, since that would mean that an external force changed the average momentum of the particles from zero (at equilibrium) to a non-zero value. The entropy quantifies the number of ways the energy can be distributed over the particles for a given equilibrium.
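
    To make "the number of ways the energy can be distributed over the particles" concrete, here is a minimal Python sketch using the standard Einstein-solid toy model; the oscillator and quanta counts are purely illustrative assumptions, not numbers tied to the video or this comment.

        # q indistinguishable energy quanta shared among N oscillators can be
        # arranged in C(q + N - 1, q) ways (stars and bars); entropy counts them.
        from math import comb, log

        N = 50                        # illustrative number of oscillators
        for q in (10, 50, 200):       # illustrative amounts of energy (in quanta)
            W = comb(q + N - 1, q)    # multiplicity of the macrostate (N, q)
            S = log(W)                # entropy in units of k_B
            print(f"q = {q:3d}  microstates W = {W:.3e}  S/k_B = {S:.1f}")

    More energy spread over the same oscillators means more ways to distribute it, and hence higher entropy, which is the counting idea the comment describes.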

  • @whyofpsi
    @whyofpsi10 ай бұрын

    I like how complexity emerges somewhere between minimum and maximum entropy :)

  • @anywallsocket

    @anywallsocket

    10 ай бұрын

    Indeed, it is within the inflection point of rate change that new minima can be added.

  • @ahmedtoufahi5198
    @ahmedtoufahi519810 ай бұрын

    I guess Sabine is the best person who has ever explained this relationship between order, life, entropy and those things without either getting carried away in theory or relying on very superficial metaphors. Because maybe physics is both! Intuition and somewhat philosophical insights + rigorous mathematical models.

  • @No-cg9kj

    @No-cg9kj

    10 ай бұрын

    Physics isn't philosophy, it's what we know to be reality.

  • @off6848

    @off6848

    10 ай бұрын

    @@No-cg9kj What you think you know (episteme) is philosophy.

  • @ThePowerLover

    @ThePowerLover

    10 ай бұрын

    @@No-cg9kj Physics is not that despite its name.

  • @ahmedtoufahi5198

    @ahmedtoufahi5198

    10 ай бұрын

    @@No-cg9kj I agree with you actually. Philosophical thinking doesn't mean something detached from reality. There's a section of philosophy about science and modelization and those things. I mean that it's useful to take some time to pose some "meta" questions about what we're trying to look for in nature, what is really fundamental and what stems from our specific human view of nature. In this video Sabine took some time to consider that maybe our definition of entropy is a reflection of what information is accessible to US and what energy WE can use. It is pretty philosophical, it seems to me. It's funny how your user name is NO btw xd, thanks for your comment.

  • @shrub4248

    @shrub4248

    10 ай бұрын

    Science is built up on a bedrock of epistemology.

  • @nilo9456
    @nilo94568 ай бұрын

    I worked as a technician in refrigeration & AC. This video called to mind a different concept: enthalpy, a description of the energy state of a system. (Those who know something about thermodynamics will know how inadequate this description is.)

  • @ramabommaraju2715
    @ramabommaraju27159 ай бұрын

    Simple question: the particles that constitute the body of a living organism are highly ordered and constantly work towards repairing damaged parts. HOW IS THIS POSSIBLE IN A WORLD OF EVER-INCREASING DISORDER?

  • @nissieln
    @nissieln10 ай бұрын

    This is truly your best video IMO. Thank you Sabine! Finally a happy ending, from which, as you said, everything will have to go downhill 😂

  • @NumericFork
    @NumericFork10 ай бұрын

    I read a while ago that they made a tiny membrane out of graphene that could harvest minute amounts of electricity from brownian motion by vibrating the membrane. That's pretty cool because as I understand it, it would mean that if you're in a sealed sphere that doesn't lose energy, you could in theory produce energy out of seemingly nothing by cooling the air to convert the heat to electricity, which in turn would do useful things before inevitably heating the air again.

  • @TheBackyardChemist

    @TheBackyardChemist

    10 ай бұрын

    Sounds a lot like the molecular ratchet or rectifying thermal noise sort of ideas. So far none have worked.

  • @zsmith200

    @zsmith200

    10 ай бұрын

    We actually don’t need a closed system to make accurate predictions with thermo though. You can derive how a system in contact with a thermal reservoir will act by treating the system and reservoir as a closed system. You’ll still see that perpetual motion machines like Brownian ratchets are impossible with heat exchange

  • @dragons_red

    @dragons_red

    10 ай бұрын

    You're not going to power much from Brownian motion.

  • @IVANHOECHAPUT

    @IVANHOECHAPUT

    10 ай бұрын

    Duh... You're vibrating a membrane and expecting electricity. Where do you think the energy to vibrate the membrane came from - the vacuum?

  • @G1vr1x

    @G1vr1x

    10 ай бұрын

    I just checked a presentation from Paul Thibado | Charging Capacitors Using Graphene Fluctuations, where he explains in a nutshell that energy in his experiment is gathered by the diodes, not the graphene. The graphene is acting like a variable capacitor that doesn't need energy input, but is at equilibrium. From his own claims, it doesn't break 2nd law of Th. even though I don't fully get the explanation.

  • @user-hj8uo1zl6k
    @user-hj8uo1zl6k2 ай бұрын

    I liked this video because it resonated with my views on entropy that I gained after reading the inspiring article by Myron Tribus and Edward C. McIrvine, "Energy and Information", Scientific American, Vol. 225, No. 3 (September 1971), pp. 179-190. I strongly recommend that paper, because it provides an insightful complement and an additional explanation and formalization of the points raised by Sabine. Following Shannon's original proposal, lucidly explained by Tribus and McIrvine, entropy is defined as the degree of uncertainty, X, that one has about a certain question Q. A question can be "In which part of the box is the particle, left or right?". Complete uncertainty about that question means that there is equal probability, p_1 = p_2 = 1/2, for the particle being in the left or the right part of the box. The entropy, defined as S(Q|X) = -\sum_i p_i ln p_i, is then S(Q|X) = -ln(1/2) = ln 2. On the contrary, if one knows the answer, namely that the particle is in the left part (denoted 1) and not in the right part (denoted 2), then p_1 = 1 and p_2 = 0. The entropy is then S(Q|X) = 0, which means that there is no uncertainty about the question Q. This is in contrast with S(Q|X) = ln 2, which is the maximal ignorance about the question Q. The authors then remark: "The only state of greater ignorance is not to know Q." And not knowing Q, or even worse, not even being aware of the fact that one must bring Q into the definition of entropy, is a source of all the confusion and all the difficulties people have in understanding the concept of entropy. It was great that Sabine revealed, in her own wonderful and easily graspable way, this important point to the general public. In the article cited above, the simple model with two possible answers that I gave to illustrate the idea was generalized to the case of generic questions and any number of possible answers. Information was defined as the difference between two uncertainties, I = S(Q|X) - S(Q|X'). The relation to the thermodynamic entropy was exposed. Again, a fabulous, inspiring, highly cited paper that removes the longstanding "mystery" and confusion.
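
    The two-outcome example above is easy to reproduce directly. A minimal Python sketch (the function name shannon_entropy is just an illustrative choice, not anything from the paper or the video):

        from math import log

        def shannon_entropy(probs):
            """S(Q|X) = -sum_i p_i ln p_i, in nats; terms with p = 0 contribute 0."""
            return -sum(p * log(p) for p in probs if p > 0)

        # Q: "In which part of the box is the particle, left or right?"
        complete_uncertainty = [0.5, 0.5]   # p_1 = p_2 = 1/2
        answer_known = [1.0, 0.0]           # particle known to be on the left

        print(shannon_entropy(complete_uncertainty))  # ln 2 ≈ 0.693
        print(shannon_entropy(answer_known))          # 0.0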

  • @kaboomboom5967

    @kaboomboom5967

    2 ай бұрын

    I dont understand but i agree 😁

  • @user-hj8uo1zl6k

    @user-hj8uo1zl6k

    2 ай бұрын

    @@kaboomboom5967 You would certainly understand the Scientific American paper "Energy and Information" which is very clear and readable, written for general audience.

  • @joeharker7918
    @joeharker79183 ай бұрын

    Wow, this is so far above my head it’s funny. But it is amazing to hear this kind of analysis done by someone so educated. I agree with the concept of entropy not being the opposite of order. That never sat right with me, either. Love your work! ❤

  • @kencusick6311
    @kencusick631110 ай бұрын

    How did Sabine know I was about to ask, “But what about Quantum Mechanics?” Spooky action at a distance at work?

  • @TheCreativeKruemel
    @TheCreativeKruemel10 ай бұрын

    11:36 Sabine you are wrong. High entropy indeed corresponds to high information content, as per Claude Shannon's definition in information theory. The reasoning behind this is that entropy essentially quantifies the level of uncertainty or randomness within a set of data. The more uncertain or unpredictable the data, the more information it contains. For reference: "en.wikipedia.org/wiki/Entropy_(information_theory)"

  • @Darenimo

    @Darenimo

    10 ай бұрын

    If a highly likely (High entropy) event occurs, the message carries very little information. On the other hand, if a highly unlikely (Low entropy) event occurs, the message is much more informative. This is exactly what Sabine is saying.

  • @TheCreativeKruemel

    @TheCreativeKruemel

    10 ай бұрын

    ​@@Darenimo You twisted some definitions in your comment. Entropy IS average information. If a high-entropy message occurs, then the message carries lots of information. High Entropy is NOT the same as "highly likely". This holds for information theory AND statistical mechanics!

  • @NightmareCourtPictures

    @NightmareCourtPictures

    10 ай бұрын

    ​@@TheCreativeKruemel it's not that anyone here is wrong, it's just that the definitions of stat mech are all screwed up, and always have been. Physicists have been arguing about thermodynamics for goddamn years because of the way it was formulated. First, you have to remember what information really is, and which observer the "message" is being viewed from. If you read the message 000110010101, this tells you some kind of information, but it turned out that this message was just the macro-state description of a more underlying message: 00110000 00110000 00110000 00110001 00110001 00110000 00110000 00110001 00110000 00110001 00110000 00110001. When you compare these two, their information content is the same (binary expansion of a binary set of numbers, where 00110000 is 0 and 00110001 is 1). The second clearly has more bits of information, because it's very obviously longer and can describe a lot more states (higher Shannon entropy). But also, under Shannon, the surprise of the messages is equivalent... in that these messages have the same amount of surprise, and therefore have the same "Shannon" information. Additionally and by contrast, randomness is not well defined. Consider the random string of 0's and 1's again: 000110010101... it contains regularity, and in fact, for any arbitrarily large string, you will never find a truly random sequence; there will always be patterns and repetition in that sequence... 000, 11, 00, 0101, 1010... these are patterns embedded in the seemingly "random" string, and I'd challenge anyone to try and create a truly random string of 0's and 1's. The only time one can is when each bit is unique, at which point the idea of "surprise" gets thrown out the window, because every state would be a surprise if all of its bits are unique (and this is the true extension of what Sabine said, just from an information-theoretic viewpoint). Cheers,

  • @Darenimo

    @Darenimo

    10 ай бұрын

    Such as, if you have an airtight box on your desk, and you request information about the contents of the box, you could get a high entropy response, letting you know all the possible configurations of atoms and molecules that could exist in the box. Or you could get a low entropy message, telling you it contains mustard gas and opt to not open it. And you'd probably find you gained more information from the latter message than the former, even if the former had a higher information content.

  • @TheCreativeKruemel

    @TheCreativeKruemel

    10 ай бұрын

    @@Darenimo Hey good point :)! The low entropy response, saying "it contains mustard gas", is less detailed but more straightforward for us to understand. However, 'information' here doesn't just mean what WE can understand easily (very important!). It includes ALL possible details, even very complex ones (one key idea in Shannons original paper). In principle it could also be possible to use data compression and Character encoding on the high entropy message to reduce the high entropy message to a message we humans can read :). The important thing to understand is that the high entropy message (about all configurations in the box) contains ALL the information of the low entropy message!
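
    To tie this thread's point down, here is a minimal sketch of "entropy is average information", i.e. the expected surprise -log2(p) over a source's outcomes; the probabilities are illustrative assumptions, not taken from anyone's comment.

        from math import log2

        def avg_information_bits(probs):
            """Shannon entropy in bits: H = -sum_i p_i log2 p_i."""
            return -sum(p * log2(p) for p in probs if p > 0)

        fair_coin = [0.5, 0.5]        # maximally unpredictable source
        biased_coin = [0.99, 0.01]    # nearly certain source

        print(avg_information_bits(fair_coin))    # 1.0 bit per outcome
        print(avg_information_bits(biased_coin))  # ~0.08 bits per outcome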

  • @zerkig9058
    @zerkig90583 ай бұрын

    This is what I call the true scientific spirit! You made me interested in this stuff, instead of giving me an existential crisis :D

  • @Kescall01
    @Kescall019 ай бұрын

    The trouble with entropy is that it's the wrong way round. Most entities, when they have a lot of whatever it is, have a high value. If they don't have much they have a low value. Low entropy means that we have a lot of useful energy available, and high entropy means that we have nothing. No wonder I'm confused.

  • @zerocero5850
    @zerocero585010 ай бұрын

    So much food for thought. One of your best yet Sabine.
