The Dangerous Math Used To Predict Criminals

Science & Technology

The criminal justice system is overburdened and expensive. What if we could harness advances in social science and math to predict which criminals are most likely to re-offend? What if we had a better way to sentence offenders efficiently and appropriately, both for them and for society as a whole?
That’s the idea behind risk assessment algorithms like COMPAS. And while the theory is excellent, we’ve hit a few stumbling blocks with accuracy and fairness. The data collection includes questions about an offender’s education, work history, family, friends, and attitudes toward society. We know that these elements correlate with anti-social behavior, so why can’t a complex algorithm using 137 different data points give us an accurate picture of who’s most dangerous?
The problem might be that it's actually too complex: random groups of internet volunteers, given only a few simple pieces of information, yield almost identical predictive results. Researchers have likewise concluded that a handful of basic questions are as predictive as the black-box algorithm that made the Supreme Court shrug.
Is there a way to fine-tune these algorithms to be better than collective human judgment? Can math help to safeguard fairness in the sentencing process and improve outcomes in criminal justice? And if we did develop an accurate math-based model to predict recidivism, how ethical is it to blame current criminals for potential future crimes?
Can human behavior become an equation?
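To make that "a few simple inputs do as well as 137 data points" comparison concrete, here is a minimal, hypothetical sketch in Python with scikit-learn: a two-feature logistic model is pitted against one padded with 20 extra inputs on synthetic data. Every feature, coefficient, and number below is invented for illustration; this is not the COMPAS model or its data.

```python
# Hypothetical sketch (not COMPAS): does adding many extra inputs beat a
# simple two-feature model on synthetic recidivism-style data?
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
age = rng.integers(18, 70, n)
priors = rng.poisson(2, n)
extra = rng.normal(size=(n, 20))                   # stand-ins for the other "data points"
logit = -1.5 + 0.35 * priors - 0.03 * (age - 18)   # assumed, invented relationship
y = rng.random(n) < 1 / (1 + np.exp(-logit))       # 1 = "re-offends" in this toy world

for label, X in [("2 features", np.column_stack([age, priors])),
                 ("22 features", np.column_stack([age, priors, extra]))]:
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{label}: AUC = {auc:.3f}")             # the extra inputs add essentially nothing
```

In this toy setup the 20 extra columns carry no signal, so the two models score about the same, which is the shape of the finding Dressel and Farid reported for COMPAS itself.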
** ADDITIONAL READING **
Sample COMPAS Risk Assessment: www.documentcloud.org/documen...
COMPAS-R Updated Risk Assessment: www.equivant.com/compas-r-cor...
“The accuracy, fairness, and limits of predicting recidivism.” Julia Dressel. www.science.org/doi/10.1126/s...
“Understanding risk assessment instruments in criminal justice,” Brookings Institution: www.brookings.edu/research/un...
“Machine Bias,” Julia Angwin, Jeff Larson, Surya Mattu and Lauren Kirchner, ProPublica: www.propublica.org/article/ma...
“The limits of human predictions of recidivism,” Lin, Jung, Goel and Skeem: www.science.org/doi/full/10.1...
“Even Imperfect Algorithms Can Improve the Criminal Justice System,” New York Times: www.nytimes.com/2017/12/20/up...
Equivant’s response to criticism: www.equivant.com/official-res...
“A Popular Algorithm Is No Better at Predicting Crimes Than Random People,” Ed Yong: www.theatlantic.com/technolog...
“The Age of Secrecy and Unfairness in Recidivism Prediction,” Rudin, Wang, and Coker: hdsr.mitpress.mit.edu/pub/7z1...
“Practitioner’s Guide to COMPAS Core,” s3.documentcloud.org/document...
State v. Loomis summary: harvardlawreview.org/wp-conte...
** LINKS **
Vsauce2:
TikTok: / vsaucetwo
Twitter: / vsaucetwo
Facebook: / vsaucetwo
Talk Vsauce2 in The Create Unknown Discord: / discord
Vsauce2 on Reddit: / vsauce2
Hosted and Produced by Kevin Lieber
Instagram: / kevlieber
Twitter: / kevinlieber
Podcast: / thecreateunknown
Research and Writing by Matthew Tabor
/ tabortcu
Editing by John Swan
/ @johnswanyt
Police Sketches by Art Melt
Twitter: / eeljammin
IG: / jamstamp0
Huge Thanks To Paula Lieber
www.etsy.com/shop/Craftality
Vsauce's Curiosity Box: www.curiositybox.com/
#education #vsauce #crime

Comments: 1,000

  • @DemonixTB
    @DemonixTB 1 year ago

    IBM internal presentation slide, circa 1979: "A COMPUTER CAN NEVER BE HELD ACCOUNTABLE, THEREFORE A COMPUTER MUST NEVER MAKE A MANAGEMENT DECISION." That is the perfect response to any of this. No algorithm should ever decide the fate of who lives and who dies, whose life gets cut by 30 years and whose by 3.

  • @feedbackzaloop

    @feedbackzaloop

    1 year ago

    Even more so, justice must not be based on probability, whether computer-calculated or human-reckoned.

  • @Mikee512

    @Mikee512

    1 year ago

    Juries falsely convict a certain % of the time. Algorithms falsely convict a certain % of the time. Shouldn't you choose the method that falsely convicts less frequently? Or is there something fundamentally important about having people make the decision, even though they falsely convict more often? I don't know the answer, but it's not a cut-and-dried issue, IMO. **Whatever the case, I think any algorithm in use by the justice system (government) should be open-source and subject to public scrutiny. This seems like it should be a non-negotiable minimum.**

  • @feedbackzaloop

    @feedbackzaloop

    1 year ago

    @@Mikee512 An open-source judging algorithm is a disaster, not a non-negotiable minimum! We kind of already have one in the form of written criminal and civil codes, and look at all the loopholes people come up with to get away from justice, absolutely legally. Now imagine how simple it would be to reverse engineer the algorithm, predict your own sentence, and, based on that, commit the crime for maximum profit.

  • @sillyproofs

    @sillyproofs

    1 year ago

    If we little people can see how nonsensical all this is, why can't the higher-ups? I thought they were the more educated...

  • @fedcab4360

    @fedcab4360

    1 year ago

    @@sillyproofs LMAO

  • @DogKacique
    @DogKacique 1 year ago

    That company made a BuzzFeed quiz and is selling it as if it were an advanced Minority Report AI.

  • @bow_and_arrow

    @bow_and_arrow

    1 year ago

    FRRRRR

  • @joshyoung1440

    @joshyoung1440

    10 months ago

    ​@@bow_and_arrow for real real real real real

  • @joshyoung1440

    @joshyoung1440

    10 months ago

    ​@@bow_and_arrow oh sorry FOR REAL REAL REAL REAL REAL

  • @avakining

    @avakining

    4 months ago

    Plus like… the whole point of Minority Report was that those algorithms don’t work anyway

  • @CorvieClearmoon
    @CorvieClearmoon 1 year ago

    FYI - Noom was found to be practicing very shady business behind the scenes. They have been overcharging customers and refusing to allow them to cancel their services. I believe they are currently under investigation. From what I've come to learn, they are actually bragging about their mishandling of services and suggesting other companies do the same. I'd do some digging to see what you can find before accepting their promotions again.

  • @moizkhokhar815

    @moizkhokhar815

    1 year ago

    Yes, more people should read this comment.

  • @ashlinberman4534

    @ashlinberman4534

    1 year ago

    I think they made canceling subscriptions easier after complaints, but I couldn't find anything about the overcharging being resolved. They did get a class action lawsuit over it, though, and all the reports seem to be from 2+ years ago, so that might be resolved as well. I can't vouch for either point, btw; this is just from basic research, so you might be able to find better evidence against what I said.

  • @Games_and_Music

    @Games_and_Music

    1 year ago

    I thought that part of the video really displayed the criminal maths.

  • @thelistener1268

    @thelistener1268

    1 year ago

    Thanks for the tip!

  • @that_rhobot

    @that_rhobot

    1 year ago

    I've seen accounts from people who tried Noom's mental health app saying it pretty much always just recommends dieting, regardless of what you are dealing with. Like, there were people battling anorexia who were being told they were eating too much.

  • @cee8mee
    @cee8mee 1 year ago

    I think using an algorithm to look for possible suspects, or the location of evidence, or areas that might require higher security due to a history of criminal behavior is valid. But as soon as you start asking the subject philosophical questions, you've introduced a wild card that makes the algorithm meaningless. I think we can find areas in the justice system for algorithmic programs, but definitely not proprietary and hidden ones. Open source is a must for transparency.

  • @gewurzgurke4964

    @gewurzgurke4964

    1 year ago

    Any algorithm made for "justice" will reinforce the prejudices of those who make it. What law is, what crime is, and what crime prevention should look like are already deeply philosophical questions.

  • @quintessenceSL

    @quintessenceSL

    1 year ago

    It's a bit more than that, as these same types of tests were/are used in "character profiles" for hiring (I actually had a manager stand behind me and give me answers after I failed the thing for the 5th time. ALL of my references stated I was a great employee. Who ya gonna believe?). It is akin to social credit scores and the like: essentially magic smoke to remove accountability from decision making (and quite possibly to subtly game an algorithm for a result not mentioned in the stated intent). And while claiming the mantle of "science", like many forensic tools, it hasn't been tested for falsifiability or even degree of improvement over existing methods. It's modern-day snake oil, now using computer science as the pitch. Run the test on the management of said companies. Let's see how accurate they really are.

  • @Cajek2

    @Cajek2

    1 year ago

    It’s trying to measure how likely it is that you’ll commit a crime in capitalism. In capitalism it’s a crime to be poor or hungry. And in that sense the algorithm is doing pretty good.

  • @andrasfogarasi5014

    @andrasfogarasi5014

    1 year ago

    @@Cajek2 What the hell are you talking about? Even if we accept for a fact that the entirety of society is structured to enrich a ruling class, being poor wouldn't be a crime. The poor don't cause the rich to become less rich by virtue of existing. Instead, a poor person under such a system would be considered someone whose labour can be easily bought and is thus quite useful. Preventing the poor from working by imprisoning them would be akin to the rich shooting themselves in the foot. And no, prison labour is not profitable. The number of prisoners in the USA is 2.1 million. The value of prison labour per year is $11 billion. This comes out to each inmate producing $5,238 worth of goods and services per year. There is no prison in the developed world which can house a prisoner while spending only $5,238 per year on them. It's clear then that unless someone causes like net $10,000 worth of social damages per year, it does not make purely financial sense to imprison them. And if they do cause a net $10,000 worth of social damage per year, then I do dare say in my humble opinion that they probably *should* go to prison.

  • @notsojharedtroll23

    @notsojharedtroll23

    1 year ago

    Just watch Psycho pass

  • @Cyberlisk
    @Cyberlisk 1 year ago

    We need a law that any algorithm that affects sentences or political decisions must be open source. For me as a computer scientist, that's just common sense and not having that law contradicts every juridical principle in a democracy. Having a black box algorithm influence decisions is literally the equivalent of using investigative results or testimonies without presenting them in court.

  • @mqb3gofjzkko7nzx38

    @mqb3gofjzkko7nzx38

    1 year ago

    @Lawrence Rogers We might as well have secret laws and secret tax codes too so that those can't be easily gamed either.

  • @zafar0132

    @zafar0132

    1 year ago

    If they are using a bog standard convolutional neural network, they might not be able to explain the decisions it makes. The US military used them in deciding what drone targets to attack in Pakistan and ended up bombing and killing ordinary people just going about their business. Using these technologies in certain areas with no oversight is just criminally negligent in my opinion.

  • @joshyoung1440

    @joshyoung1440

    10 months ago

    This is great but I'm pretty sure the word is judicial

  • @user-yx5ry9rj3z

    @user-yx5ry9rj3z

    7 months ago

    Some would argue that's exactly why we don't live in a democracy.

  • @johnmcleodvii

    @johnmcleodvii

    1 month ago

    Any AI model needs to be traceable.

  • @PhilmannDark
    @PhilmannDark 1 year ago

    I first read about this in the book "Weapons of Math Destruction". A major problem with all of these algorithms is that they can't measure the variables they actually want to observe (like what people think, how stable they are emotionally, what their views, experiences and skills are). So companies use second-hand variables which are often only weakly linked to the problem at hand. Laymen just see "a computer came up with the number after doing some very complex math", which they think means "must be correct, since neither math nor computers can be wrong", and they forget the old wisdom "garbage in, garbage out".

  • @garronfish8227

    @garronfish8227

    2 months ago

    I'm sure more frequent criminals will work out how to answer the questions in the best way. The system seems flawed.

  • @stevenboelke6661
    @stevenboelke6661 1 year ago

    There's no way that this machine wasn't trained with data about actual convictions and suspect info. Therefore, the algorithm could at best only accurately replicate justice as it has been done, not as it should be.

  • @quarepercutisproximum9582

    @quarepercutisproximum9582

    1 year ago

    Dang, that's... a *really* good point. I hadn't thought of that. But, who could say what it should be? How would the creator of the algorithm decide what qualities to select for? I'm not sure such a thing is possible, while still working under the supposition that people lie for their own benefit

  • @andershusmo5235

    @andershusmo5235

    1 year ago

    I was thinking the same thing. Algorithms aren't necessarily the objective oracles we commonly think of them as. An algorithm making predictions based on historical data is bound to replicate that data. An algorithm not based on historical data relies on speculation in some form or to some degree, and will reveal (or worse, hide) biases and assumptions on the part of whoever designed it. Like Steven stated so well, an algorithm trained on the data we have will merely replicate justice as it has been done so far, not change it. An algorithm thus only serves to obfuscate issues in the justice system behind a veil of infallibility and unaccountability.

  • @pXnTilde

    @pXnTilde

    1 year ago

    Well, it probably wasn't trained at all. It's not a neural network. It's possible the coefficients were tuned to match historical decisions, and your point is very valid. However, if it's true it's simply reflecting what has happened, then getting rid of it would return to ... the exact thing it was doing.

  • @tweter2

    @tweter2

    1 year ago

    No, the machine algorithms are used at the research level. Studies are done on past convictions to look for common denominators. Researchers use machine learning to look for these correlations. Once a stronger correlation is established, it can be considered for a risk assessment. Risk assessments are ultimately a set of items that show a stronger correlation.

  • @NotQuiteGuru

    @NotQuiteGuru

    1 year ago

    You're correct in your initial assessment, but I think you're incorrect in your last. The algorithm does not predict or force "justice". It does NOT dictate a judge's sentence, or whether the person is guilty of a crime or not. It merely reports its best guess for the likelihood of recidivism. By your reasoning (if I'm correctly understanding your meaning, that is), it could _"at best only accurately"_ determine the chance for recidivism _"as it has been done."_ There is no recidivism _"as it should be."_ It is guessing possible futures based on historical data, plain and simple. It is STILL the responsibility of the judge to set a sentence... mind you, for someone who has already been convicted of the crime.

  • @SupaKoopaTroopa64
    @SupaKoopaTroopa64 1 year ago

    Using AI to predict future crimes is an extremely dangerous idea. If you give an AI access to currently available crime data, and optimize it to predict future crimes, what you are actually doing is asking it to predict who the criminal justice system (with all of its biases) will find guilty of a future crime. It gets even worse when you feed the AI data from crimes that it predicted. The AI can now learn from its past actions and further 'fine tune' its predictions by looking at what traits are more likely to lead to a guilty conviction, and focus its predictions on people with those traits. This leads to a feedback loop where the AI discovers a bias in the justice system, exploits that bias to improve its "accuracy," and thereby generates more crime data which further reinforces its biases. Don't even get me started on what could happen if we use an AI powerful enough to realize that it can 'influence' its own training data.
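A toy simulation of the feedback loop described in the comment above, as a minimal sketch with invented numbers (not real crime or policing data): both groups offend at the same true rate, but one starts out under heavier scrutiny, and retraining on the recorded offenses keeps "confirming" that initial imbalance.

```python
# Toy feedback-loop simulation with invented numbers: both groups offend at the
# same true rate, but group B starts out watched twice as closely. Each round,
# only observed offenses are recorded, and the next round's scrutiny is allocated
# in proportion to what got recorded, so the gap never corrects itself.
import random

random.seed(1)
TRUE_RATE = 0.10                       # identical for both groups
scrutiny = {"A": 0.2, "B": 0.4}        # initial policing imbalance

for generation in range(5):
    recorded = {"A": 0, "B": 0}
    for group in ("A", "B"):
        for _ in range(10_000):
            offended = random.random() < TRUE_RATE
            observed = random.random() < scrutiny[group]
            if offended and observed:
                recorded[group] += 1
    total = recorded["A"] + recorded["B"]
    # "Retrain": next round's scrutiny follows the recorded (biased) counts.
    scrutiny = {g: 0.6 * recorded[g] / total for g in ("A", "B")}
    print(generation, recorded, {g: round(s, 3) for g, s in scrutiny.items()})
```

Group B's recorded offenses stay roughly double group A's in every generation, even though the underlying rates are identical, which is the self-fulfilling-prophecy pattern the comment describes.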

  • @diceblock

    @diceblock

    1 year ago

    That's alarming.

  • @buchelaruzit

    @buchelaruzit

    1 year ago

    exactly. and it very quickly starts sounding like eugenics.

  • @Codexionyx101
    @Codexionyx101 1 year ago

    You'd think that if we were going to recreate Minority Report, we'd at least try to do a good job at it.

  • @orlandomoreno6168

    @orlandomoreno6168

    1 year ago

    This is more like Psycho Pass

  • @tweter2

    @tweter2

    1 year ago

    There is a lot of "minority report" in the sex offender world. For example, in Minnesota every such felon is given a risk assessment at the end of their jail sentence to determine if they need to be civilly committed to treatment. Sex offender assessments basically determine the probability of reoffending in the next five years. If you are labeled as a higher risk, you are often given extra treatment / civil commitment time.

  • @TheVaryox
    @TheVaryox 1 year ago

    Company: "yea you should sentence him harder, and I won't tell you why I think that" Judges: "eh, good enough" Man, if trade secrets get prioritised over a citizen's right to a fair trial, seriously, wtf. This is trial by crystal ball.

  • @tweter2

    @tweter2

    1 year ago

    Research shows sentences are longer in the afternoon or if it's nice weather outside.

  • @jeffreykirkley6475

    @jeffreykirkley6475

    1 year ago

    Honestly, why do we have trade secrets as a protected thing? If no one can know the truth about it, then why should we even agree to its use/consumption?

  • @alperakyuz9702

    @alperakyuz9702

    1 year ago

    @@jeffreykirkley6475 Well, if you spent millions of dollars on developing an algorithm to gain an edge over the competition, would you publish the information freely so that your competition can imitate it for free?

  • @ipadair7345

    @ipadair7345

    1 year ago

    @@alperakyuz9702 No, but the government (courts especially) shouldn't use an algorithm whose workings nobody except the company knows.

  • @legendgames128

    @legendgames128

    1 year ago

    @@ipadair7345 One which the company could use to suppress those who don't like them, perhaps. Or if they are working with the government and the media, we essentially get political opponents being sentenced. In this one, it merely predicted the rate of recidivism. In the one used to actually punish criminals, it could be used to punish political opponents while still being guarded as a trade secret.

  • @imaperson1060
    @imaperson1060 1 year ago

    This is assuming that nobody lies and gives answers they know will lower their score.

  • @fetchstixRHD

    @fetchstixRHD

    1 year ago

    Quite possibly, that may be why the girl got a higher score than the guy. The guy probably knew to think ahead about how the questions might be taken, whereas the girl probably wasn't calculating at all.

  • @jmodified

    @jmodified

    1 year ago

    Hmmm, if I have no financial concerns, is it because I'm independently wealthy or because I know I can always rob a convenience store if I need cash? Probably best to answer "sometimes" on that one.

  • @joaquinBolu
    @joaquinBolu 1 year ago

    This brings back memories of the Psycho-Pass anime, where an AI computer decided who was a threat to society even before they committed a crime. The whole society was ruled by this tech without questioning it, even cops and law enforcers.

  • @feffy380

    @feffy380

    1 year ago

    It wasn't even AI. It was brains of other psychopaths in jars

  • @aicy5170

    @aicy5170

    1 year ago

    course?

  • @tweter2

    @tweter2

    1 year ago

    Oh, by no means is this all "tech." I've done paper and pencil risk assessments that then get shared with courts / probation.

  • @notoriouswhitemoth
    @notoriouswhitemoth 1 year ago

    "determined by the strength of the item's relationship to person's offense recidivism" I was gonna say there was no way those coefficients weren't racist, and the results bear that out. It's almost like predictive algorithms are really good at perpetuating self-fulfilling prophecies.

  • @desfortune

    @desfortune

    1 year ago

    AI and the like just act on the data you provide. If you provide data that contains racist biases, the program will use them. AI is not intelligent; it does what you teach it to do, so as long as faulty humans insert faulty data, most of the time without realizing it, you are not gonna solve anything lol

  • @Oxytail
    @Oxytail 1 year ago

    The fact that many of these questions seem like what you'd ask a person while trying to diagnose them with certain mental illnesses or neurodivergencies is disgusting, let alone the part where these questions are answered with no context or nuanced conversation on the subject. "Do you often feel sad?" The answer: "Yes" The algorithm's thoughts: "this person has nothing to live for and might commit a crime because they don't fear losing their life; their crime and answers indicate they'd be more likely to break the law again" The reality/nuance: "Yes, my mom died 4 months ago of cancer and I've felt down ever since; she helped me keep my life in check and without her I completely forgot to get my car's documents renewed, since she always reminded me to do it as I still lived with her and the mail was received by her" It's SO easy for any answer to mean the complete opposite if you don't allow someone to explain the reason for their emotion. Algorithms and AIs and machines in general should never be in charge of judging people, because they do not, and cannot, guess the nuance behind actions and feelings. It's ludicrous to me that this is even a thing.

  • @DanGRV

    @DanGRV

    1 year ago

    Using that same question: "Do you often feel sad?" "No" "The subject displays shallow affect; more likely to have antisocial tendencies."

  • @HoSza1

    @HoSza1

    1 year ago

    First off, algorithms don't think anything; they are just not able to. AI included. It's the people who create the algorithms who are ultimately making the decisions. Second off, there may be a correlation between mental state and the chance of committing a crime, so why not test for it? What would *you* ask if your job was to decide whether a given suspect was about to commit crimes repeatedly or not?

  • @unliving_ball_of_gas

    @unliving_ball_of_gas

    1 year ago

    @@HoSza1 What would I do? Do a nuanced personal detailed psychological assessment and then decide. But even then, you can never understand 100% of someone's thoughts even if you were given years to do it. So the question becomes, SHOULD we even try to determine recidivism or should we just treat everyone equally regardless of their past because everyone can change?

  • @HoSza1

    @HoSza1

    1 year ago

    @@unliving_ball_of_gas I agree that in an ideal world where resources are unlimited we could do that. Your other question is indeed more difficult to answer, but I think that investing energy in order to reduce the chance of reoccurring criminal tendencies would pay off on the long run.

  • @noahwilliams8996

    @noahwilliams8996

    1 year ago

    Computers can be programmed to understand emotions. That was one of the things Turing proved about them.

  • @felipegabriel9220
    @felipegabriel9220 1 year ago

    Those algorithms sound literally like the Sibyl System from the Psycho-Pass anime, lol. Next step we get a social credit score :D

  • @sirswagabadha4896

    @sirswagabadha4896

    1 year ago

    In a capitalist world, your credit score is pretty much already your social credit score. But of course, some countries go even further than that already...

  • @estebanrodriguez5409

    @estebanrodriguez5409

    4 months ago

    @@sirswagabadha4896 I was about to answer the same thing

  • @awesomecoyote5534
    @awesomecoyote5534 1 year ago

    The worst kinds of judgements are judgements made by someone who can't be held accountable if they are wrong. Judgements that determine how many years someone spends in prison should not be decided by an unaccountable AI.

  • @Klayhamn

    @Klayhamn

    1 year ago

    Humans that determine it aren't accountable either. In fact, the people who design or manage the systems of law and order are rarely, if ever (and most likely never), held accountable for the decisions they make. So, at least based on this fact, it makes no difference whether we use AI or not. Instead, what does matter is how good it is at predicting what it claims to predict.

  • @prajwal9544

    @prajwal9544

    1 year ago

    But algorithms can be changed easily and made better. A biased judge is worse

  • @soulsmanipulatedinc.1682

    @soulsmanipulatedinc.1682

    1 year ago

    Should we desire to hold someone accountable? Sorry. It's just that, if we need to hold someone accountable for wrong judgment, I feel that we would have already failed. I mean, the option to hold someone accountable isn't a means to correct someone's judgment, but instead control a person's judgment. An algorithm always has perfectly controlled judgment, so, like...I don't see the problem here? I mean, yeah, this could be implemented horribly. However, the base idea would theoretically work.

  • @schmarcel4238

    @schmarcel4238

    1 year ago

    If it is a machine learning algorithm, it can be punished for mistakes, thus be held accountable. And it will then try not to make the same mistakes again.

  • @soulsmanipulatedinc.1682

    @soulsmanipulatedinc.1682

    1 year ago

    @@schmarcel4238 I thought about that as well, however, that may cause the program to develop harmful biases that we didn't intend.

  • @ElNerdoLoco
    @ElNerdoLoco 1 year ago

    I'd scrawl, "I plead the 5th" over every question. I mean, you have the right to not be a character witness against yourself too, and how can you tell if you're incriminating yourself with some of these questions? Hell, just participating while black seemed incriminating in one example.

  • @o0Donuts0o

    @o0Donuts0o

    1 year ago

    Not that I agree with software being used to predict potential future criminal activity, but isn't this software used after judgment is reached, and only to determine the sentencing term?

  • @pXnTilde

    @pXnTilde

    1 year ago

    Seriously, this test was used during sentencing, which means there was absolutely no obligation whatsoever for him to complete that test. Remember, too... _he is guilty of his crime_ The judge could have easily decided on the same exact sentence regardless of the algorithm. In fact, often judges have already decided the sentence before hearing the arguments at sentencing.

  • @chestercs111
    @chestercs111 1 year ago

    This reminds me of the study James Fallon did on psychopaths. He would analyze brain scans of known psychopaths and found that all their brains showed similar results. Then, during brain scan testing he did on himself and his family, he found that one of the brains matched that of a psychopath. He thought someone at work was playing a joke on him, but it turned out to be his own brain. This shows that it's more than just how your brain is wired that makes you a psychopath. However, those who match the brain scans may be more susceptible to becoming a psychopath if certain conditions are met.

  • @KenMathis1
    @KenMathis1 1 year ago

    The fundamental problem with this approach is that generalities can't be applied to an individual, and these automated approaches to crime prediction only rely on generalities. They are a codification into law of biases and stereotypes.

  • @mvmlego1212

    @mvmlego1212

    1 year ago

    Well-said. Even if the predictions are statistically valid, they're not individually valid.

  • @luisheinle7071

    @luisheinle7071

    1 year ago

    @@mvmlego1212 yes, it doesn't matter if they are statistically correct because it says nothing about the individual

  • @airiquelmeleroy
    @airiquelmeleroy 1 year ago

    Mathematically, the problem is preeeetty obvious. The number of people who have committed only 0 to 1, or maybe 2, crimes is astoundingly massive. Those who have committed 4 or more have usually committed MANY more than 4, often in the hundreds if we count the times they got away with it before being caught. This means that while one group (the people who have committed many, many crimes) have fairly similar profiles or data points, the other group is literally *everyone* else. So picture this: the algorithm determines that 90% of criminals wear blue pants, which account for 10% of the population; the algorithm will then happily mark any blue-pants-wearing citizen a "potential criminal", despite there being thousands more blue-pants-wearing innocent people than total criminals overall. It also completely misses any criminal who wears white pants, or worse, who chooses to wear white pants to avoid long sentences. The second problem: petty crimes tend to be committed by normal people, so almost any person who commits a crime is "likely" to commit another, since the algorithm will find the pattern "all these criminals are normal people, therefore any normal person could be a criminal!" Way to go, black box...
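The "blue pants" scenario above is the base-rate problem. A quick worked example with invented numbers: even a flag that catches 90% of actual reoffenders, and wrongly flags only 20% of everyone else, is mostly wrong when reoffending is rare.

```python
# Base-rate arithmetic with invented numbers: how often is a "flagged" person
# actually a future reoffender?  P(reoffend | flagged) via Bayes' rule.
base_rate = 0.05            # assume 5% of this hypothetical population reoffends
sensitivity = 0.90          # the flag catches 90% of actual reoffenders
false_positive_rate = 0.20  # ...but also hits 20% of people who never reoffend

p_flagged = sensitivity * base_rate + false_positive_rate * (1 - base_rate)
p_reoffend_given_flag = sensitivity * base_rate / p_flagged
print(round(p_reoffend_given_flag, 3))  # ~0.191: about 4 in 5 flagged people never reoffend
```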

  • @TheEnmineer

    @TheEnmineer

    1 year ago

    For real, it's a clear misunderstanding of the field of statistics. Though the interesting question is: how do we know which criminals who have committed fewer than 4 crimes will go on to commit more than 4? After all, this is supposed to be an algorithm to predict (not just detect) recidivism; pointing at something that's clearly already recidivism isn't what it's supposed to do.

  • @truthboom

    @truthboom

    1 year ago

    it needs neural network training

  • @ichigo_nyanko

    @ichigo_nyanko

    1 year ago

    @@truthboom that will just reinforce biases already present in the justice system, like racism and sexism.

  • @andrasfogarasi5014
    @andrasfogarasi5014 1 year ago

    If you want to develop an effective method for measuring recidivism, here's the plan: Step 1: Make a law requiring all people to buy liability crime insurance. Under the terms of this type of insurance, whenever the client commits a crime, the insurance agency pays for the damages caused and the client is charged nothing. Step 2: Wait 2 months. Step 3: Base prison sentences on people's insurance rates. Insurance companies under this system have a financial incentive to create an effective system for predicting future criminal behaviour and base their liability crime insurance rates on that. As such, the insurance rates become accurate predictors of future criminality. Of course you could argue that this system will cause repeat offenders to have such incredibly high insurance rates that they have no reasonable way of ever paying them, thus making them unable to buy liability crime insurance. Fret not, for I have a solution. Execution. This will drop their rates to precisely $0. Thank you for listening to my very own dystopia concept presentation.

  • @michaellautermilch9185

    @michaellautermilch9185

    1 year ago

    You're just shifting who builds the models and asking insurance companies to be the ones building the black boxes. Yes, insurance companies do have people who build black box algorithms too, but they will basically do the same thing. Actually your plan has a massive flaw: insurance premiums don't only include measures of risk, but also multiple other business considerations. They want to sell more policies after all! So now you would have the justice system being partially influenced by some massive insurance company's 5 year growth plan. Not a great idea.

  • @grapetoad6595
    @grapetoad6595 1 year ago

    The problem is the focus on punishment. I.e. we think you might commit crime again so you should be punished more for your potential future crime. If instead it was built on attempts to rehabilitate, and decided who was most in need of support to avoid recidivism, this would be so much better. The algorithms are a problem, but what's worse is why they are able to cause a problem in the first place.

  • @fetchstixRHD

    @fetchstixRHD

    1 year ago

    Agreed. There's a whole separate discussion on whether punishment should be appropriate, but regardless getting punished for something you haven't done (or attempted to do) is pretty unfair.

  • @michaellautermilch9185

    @michaellautermilch9185

    1 year ago

    No this is backwards. Punishment needs to be proportional to the crime, not to the likelihood of rehabilitation. With your mindset, someone could be rehabilitated for virtually anything, regardless of their actions, if they posed a future risk.

  • @jeremyfarley3872

    @jeremyfarley3872

    5 months ago

    Then there's the difference between punishment and rehabilitation. They aren't the same thing. Are we sending someone to prison for ten years because we want to hurt them or because we want to teach them to be a productive member of society?

  • @DeJay7
    @DeJay7 1 year ago

    "Thanks for watching" No, thank you for making all of these videos, Kevin. I love every single one of your videos, everything you do is great.

  • @themacocko6311
    @themacocko6311 1 year ago

    IDK if it works 100%. There is 0 right to punish anyone for acts that have not been committed.

  • @taodivinity1556

    @taodivinity1556

    1 year ago

    Yet if a time where it really works 100% of the time ever comes to reality, the fact stands that if you ignore the future crime, somebody will suffer, so perhaps rather than a punishment, a pre-emptive rehabilitation might be the compromise.

  • @quarepercutisproximum9582

    @quarepercutisproximum9582

    1 year ago

    Exactly my problem with it. Present punishment should not be allocated based on one potential future (whether "punishment" deserves a place of its own right- outside of rehab- is its own discussion). There will always be variables that may prevent someone from acting on an intention they have to do one thing or the other; to push any forceful action upon a party before they have done anything is a path to thoughtcrime, which is less than a step away from a total lack of real freedom

  • @truthboom

    @truthboom

    1 year ago

    @@taodivinity1556 Future crimes happen because of past injustice, like bullying or racism. If there were no injustice, there would be no crime in the future.

  • @taodivinity1556

    @taodivinity1556

    1 year ago

    @@truthboom So are you saying crime is born out of crime? Then how did the crimes of bullying and racism happen? Was there another crime before them? I think you're honestly oversimplifying the process; humans are way more complex than that. There is always a beginning, one that happens for a reason, which may not come from exterior malice at all.

  • @taodivinity1556

    @taodivinity1556

    1 year ago

    @NatSoc Kaiser Then change it, I don't know what else to tell you, haha. It isn't working to keep society safe.

  • @epiren
    @epiren 1 year ago

    I'm sad that you didn't cover retrophrenology, where you create bumps on people's heads until they acquire the personality traits you want. ;-)

  • @josephsimmons1232

    @josephsimmons1232

    1 year ago

    GNU Terry Pratchett

  • @epiren

    @epiren

    1 year ago

    @@josephsimmons1232 I read it in a novel by Simon R. Green called "Tales From The Nightside"

  • @josephsimmons1232

    @josephsimmons1232

    1 year ago

    @@epiren Oh cool. Pratchett did the same gag in 1993 with "Men At Arms."

  • @moizkhokhar815
    @moizkhokhar815 1 year ago

    Noom has been involved in some controversy recently with a lot of complaints of their free trials being misleading and subscriptions being very hard to cancel. And some of their diets were also triggering eating disorders apparently

  • @chankfreng
    @chankfreng 1 year ago

    If an algorithm told us that lighter sentencing leads to lower recidivism, would the courts treat those results the same way?

  • @buchelaruzit

    @buchelaruzit

    1 year ago

    lol we all know the answer to that question

  • @Epic-so3ek

    @Epic-so3ek

    9 months ago

    Not in the great US of A

  • @zncvmxbv4027
    @zncvmxbv4027 1 year ago

    It’s a Myers Briggs test basically. But the only way to correctly do one of these is to have multiple people who know you do one about you and compare their results to yours. After correlating the data you get a much more correct version of the data.

  • @Eeeeehhh
    @Eeeeehhh 1 year ago

    This test feels scarily similar to an ADHD assessment, I always wonder how algorithms will discriminate against mentally/chronically ill people

  • @weslanstr
    @weslanstr 1 year ago

    My first problem of many with that software is that its mechanics are secret.

  • @bonbondurjdr6553
    @bonbondurjdr6553 1 year ago

    I love those videos man, very thought-provoking! Keep up the great work!

  • @Nylak-Otter
    @Nylak-Otter 1 year ago

    My problem with this evaluation in my own case is that I test high for recidivism, and they're absolutely correct. But in practice I wouldn't show that feedback since I'd be less likely to be caught more than once. I have the same criminal habits that I've had for 20 years, and no one has caught me or bothered to call me out for it yet. If I was caught, I'd continue but be even more careful. The evaluation would be marked down as inaccurate.

  • @GrimMeowning
    @GrimMeowning 1 year ago

    Or they could go the Scandinavian way - where prisoners are not punished (except for very serious crimes) but instead reintegrated into society, where they learn new skills, work with psychologists, and rethink their actions and outlook on life. That decreased recidivism to extremely low levels. Though, as long as there are private prisons in the USA, I doubt it will be possible.

  • @Epic-so3ek

    @Epic-so3ek

    9 months ago

    That system won’t work for people with aspd, and honestly a number of other people. Many people need to be kept incarcerated until they’re not dangerous or with aspd people just forever. A focus on rehabilitation or at least not intentionally torturing prisoners would be a good start though.

  • @EnzoDraws
    @EnzoDraws 1 year ago

    Should've titled this video "The Immoral COMPAS"

  • @The_Privateer
    @The_Privateer 1 year ago

    YAY!! "Pre-crime." I'm sure that will work out well. No risk of dystopian tyranny here... move along.

  • @williamn1055
    @williamn1055 1 year ago

    Oh my god they made me take this test without saying what it was. I'm so glad I assumed it was a test against me and answered whatever sounded best

  • @studentofsmith

    @studentofsmith

    1 year ago

    You mean people might try to game the system by lying? I'm shocked, I tell you, shocked!

  • @buchelaruzit

    @buchelaruzit

    1 year ago

    yeah just looking at these questions tells you that it can and will be used against you whenever convenient

  • @RialVestro
    @RialVestro 1 year ago

    I once got detention for being racist against myself... cause I was speaking in an Irish accent on St. Patrick's Day and I'm actually part Irish... I also got a detention for being late to class when our Teacher was having a parent teacher meeting and locked us out of the classroom during that time but she apparently still took attendance and marked the entire class absent. Apparently that teacher is known for doing stuff like this because when I showed up for detention the lady who runs the detention room took one look at who issued the detention slip and said I could leave. And another time I got a detention because I had left school early to go to work and I had already cleared the absence with the school ahead of time but still ended up getting a detention anyway. Though after I explained that to the principal he threw the detention slip in the trash and told me to just ignore it if it happens again.

  • @o0Donuts0o

    @o0Donuts0o

    1 year ago

    3 detentions. I predict 20 to life for you!

  • @truthboom

    @truthboom

    1 year ago

    If the times you went to detention are recorded in some data, then you have to sue; otherwise it's meaningless.

  • @jampersand0
    @jampersand0 1 year ago

    Never expected there to be what sounds like the Myers-Briggs equivalent of a recidivism assessment. Also, glad to contribute my art to the video ☆ Stoked you reached out to me.

  • @_BangDroid_

    @_BangDroid_

    1 year ago

    And Myers-Briggs is just glorified palm reading

  • @adamplace1414
    @adamplace1414 1 year ago

    "Hey let's take the smartest known computer in the universe - the human brain - out of the equation in favor of some vague questions posed by people the defendants will never meet." "Sounds great!" I get we all have biases and there should be checks in place to offset them. But rules and algorithms are just poor substitutes for common sense in a lot of ways. I wonder if the ongoing labor shortage isn't in part due to so many employers relying on similar questionnaire based algorithms to disqualify worthwhile candidates.

  • @desfortune

    @desfortune

    1 year ago

    The program does what you teach it to do. It's still the human developers at fault, because if you train it using biased data, you end up with a biased program. Also no, the labor shortage is not because of employee questionnaires; it's because we are in a recession.

  • @adamplace1414

    @adamplace1414

    1 year ago

    @@desfortune "...in *part* ..."

  • @yinq5384
    @yinq5384 1 year ago

    The black box reminds me of Minority Report.

  • @orsettomorbido
    @orsettomorbido 1 year ago

    The problem is: We (as world) shouldn't use punitive "justice", but rehabilitative and restorative justice.

  • @ichigo_nyanko

    @ichigo_nyanko

    1 year ago

    Absolutely, why should you punish someone for something they might do? It's innocent until proven guilty, and if you haven't even committed the crime yet it is literally impossible to prove you guilty.

  • @orsettomorbido

    @orsettomorbido

    1 year ago

    @@ichigo_nyanko I'm not talking about weighing whether someone might commit a crime again. I'm talking about not punishing people, but helping them change the motivations that made them commit the crime. And helping the victims too, of course! Whether the person had already committed a crime or not, or whether they might commit another or not.

  • @michaellautermilch9185

    @michaellautermilch9185

    1 year ago

    No, you're asking the justice system to do more than administer justice. This will lead to a totalitarian dystopia where the justice system gets to act like everybody's personal overseer. Punishment should be punitive (deserved) because rehabilitative punishment is allowed to go far beyond what the person deserves, if there's a chance it might "help them".

  • @SgtSupaman
    @SgtSupaman 1 year ago

    Statistics and algorithms can absolutely help predict what people will do but cannot predict what a *person* will do. No one should be trying to predict a single person's actions for anything more than theoretical interest, especially not in any capacity that will affect that person's life.

  • @Lazarosaliths
    @Lazarosaliths 1 year ago

    Amazing video Kevin!!!! That's so dystopian. One more step towards the future

  • @trickdeck
    @trickdeck 1 year ago

    I can't wait for the Sibyl System to be implemented.

  • @keanugump
    @keanugump 1 year ago

    Most of those questions sounded to me like "are you rich?", "are you a stereotypical white person?" or "are you in a vulnerable position in life?"

  • @andrasfogarasi5014

    @andrasfogarasi5014

    1 year ago

    Yeah. Most of the questions on that survey could've been condensed into a single question: "What percentage of your income do you save?" A great predictor of recidivism. Financial strain causes criminality due to obvious reasons. And the simplest way to quantify financial strain is your savings rate. If someone makes $15,000 but saves 30% of it, that person is distinctly good at managing their finances. They may be poor, but they are certainly not the type to have to commit crimes over that. Now imagine someone who makes $100,000 a year and saves none of it. What exactly do you spend $100,000 on per year? Drugs? Alcohol? Gambling? Status symbols? An unemployed spouse and 3 children? Whatever it may be, this person is likely to have a stressful life and/or a terrible personality. I dare say they're probably more likely to commit a crime than our impoverished financial wizard. And while that crime is most likely going to be insurance fraud, it is still crime.

  • @j.matthewwalker1651
    @j.matthewwalker1651 1 year ago

    As odd as it sounds, polling Twitter and taking the average is a pretty good way to validate results. The "wisdom of the masses" concept has repeatedly demonstrated extremely accurate results, much more accurate than a small group of experts.

  • @SkigBiggler

    @SkigBiggler

    1 year ago

    Twitter is not a good representation of people as a whole. Wisdom of the masses is also (as far as I am aware) typically only meaningfully applicable to situations where personal beliefs are unlikely to play a role in decision making. No one is likely to hold a strong opinion on the nature of a jar of jelly beans; they are likely to do so with regards to a criminal.

  • @j.matthewwalker1651

    @j.matthewwalker1651

    1 year ago

    @@SkigBiggler fair points, and obviously Twitter should not become the source for sentences, but as long as the data is presented in a way that reduces the likelihood of sensationalism it's still a good way to corroborate something like the algorithm. Specifically, anything that could link the subject to a trial in the media, and things like race and sexual orientation should be omitted.

  • @buchelaruzit

    @buchelaruzit

    1 year ago

    You cannot ignore the bias element to it. Here it makes sense that the general opinion is the same as the AI's - where do you think the AI learned from? The "wisdom of the masses" also tended to rank black people higher.

  • @kylejramstad
    @kylejramstad 1 year ago

    I love the "code" stock footage that shows the help of the command line command append.

  • @LeetJose
    @LeetJose 1 year ago

    This reminds me of an older book my class read in middle school (2002?) about a computer that could predict crime. I think I remember the book describing a person being led to the room with the device so it could be destroyed. I actually don't remember it too well, and I haven't been able to find it.

  • @Youssii
    @Youssii 1 year ago

    If an accurate algorithm said it was almost certain someone would commit a crime, would it even be fair to punish them for it? After all, it would seem predestined to happen…

  • @michaellautermilch9185

    @michaellautermilch9185

    1 year ago

    Under a fair judicial system, no. Under a rehabilitative system, yes, you can punish anyone for just about any reason if it will "help them" in the long run.

  • @PlaNkie1993
    @PlaNkie1993 1 year ago

    Didn't know the black box was actually real, that's pretty wild and concerning

  • @mykalkelley8315

    @mykalkelley8315

    1 year ago

    It's symbolic

  • @danielhernandezmota225
    @danielhernandezmota225 1 year ago

    One must be careful to include relevant and pertinent data when generating a model. In this case, the model must not have biased features, directly or indirectly; that can be tested alongside a team of experts who carefully evaluate the results. An additional procedure must also be done in order to "open" the black box with model explainability. One can use SHAP values or Anchors, or even LIME, to try to uncover what's inside. Finally, monitoring of the model is a must; performance tracking through detailed audits is imperative to determine whether the model is still functional or whether it is getting worse over time. In this case, since population dynamics change over time, it is safe to assume that the model will eventually stop working correctly.
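A minimal sketch of the kind of audit described above, using scikit-learn's permutation importance as a simple stand-in for the SHAP / LIME / Anchors tools the comment names. The data, feature names, and model here are synthetic and invented purely for illustration.

```python
# Synthetic audit sketch: which inputs does a trained "risk" model actually lean on?
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 3000
feature_names = ["prior_count", "age", "zip_poverty_rate", "questionnaire_score"]  # invented
X = rng.normal(size=(n, 4))
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=n)) > 0   # toy target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

result = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
for name, score in sorted(zip(feature_names, result.importances_mean), key=lambda t: -t[1]):
    print(f"{name:22s} {score:.3f}")   # a large drop means the model relies on that input
```

If a proxy feature like the invented zip_poverty_rate column dominates, that is exactly the kind of hidden dependency a transparency audit is meant to surface, and the same check rerun on fresh data over time doubles as the performance monitoring the comment calls for.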

  • @SuperYoonHo
    @SuperYoonHo 1 year ago

    Vsauce! Glad to have you back!!! Love your videos Kevin! You are so cool as always.

  • @youkofoxy
    @youkofoxy 1 year ago

    They should have watched Minority Report or Psycho-Pass. Just that: one only needs to watch one of those to realise how easily such a system can ruin people's lives.

  • @notme222
    @notme222 1 year ago

    Your question at the beginning isn't about who's more likely to commit a violent crime, or who's more likely to get a conviction in the next 8 years. It's "who's more likely to commit another crime?" And logic backs up the algorithm on that. The person with more years in front of them, who may believe they got away with their last crime, has a higher chance of doing something at some point. No context in that question was about setting parole. An algorithm that makes accurate predictions would still be wrong if the questions being answered aren't what the asker meant to ask.

  • @vgamesx1
    @vgamesx1 1 year ago

    6:00 Right here is where I really noticed the biggest problem with these questions on my own. I do agree with this statement; however, that does NOT mean I think you should always put yourself first. But for someone whose main goal is to climb the corporate ladder or whatever, that would be a perfectly valid response too.

  • @distortedjams
    @distortedjams 1 year ago

    I only chose the bike stealer because they weren't caught, and the other one was in prison so couldn't commit more crimes.

  • @prnzssLuna
    @prnzssLuna 1 year ago

    Not gonna lie, this is genuinely terrifying. The other videos you've made so far mostly showed one-off mistakes that got rectified afterwards, but it doesn't look like anyone is willing to stop the use of unreliable software like this? Terrifying.

  • @sydney9225
    @sydney9225 1 year ago

    Great video! love the way you summarize and explain topics. But that voice crack tho

  • @evil_bratwurst

    @evil_bratwurst

    1 year ago

    when was the voice crack

  • @sydney9225

    @sydney9225

    1 year ago

    @@evil_bratwurst 1:42

  • @evil_bratwurst

    @evil_bratwurst

    1 year ago

    @@sydney9225 lmao

  • @MrTJPAS
    @MrTJPAS 1 year ago

    The Watch Dogs games sure seem more and more prophetic as time passes, with the use of big data and algorithms moving from businesses improving their marketing into more personal and immediately important parts of people's lives - like, in this case, a calculation of one's likelihood to commit a crime, or be the victim of one, being reduced to a simple equation.

  • @daaawnzoom
    @daaawnzoom 1 year ago

    6:30 Remember everyone, if you saw someone stealing food, no you didn't.

  • @bbrandonh
    @bbrandonh 1 year ago

    Minority report moment

  • @nourgaser6838
    @nourgaser6838 1 year ago

    This video to me relates directly to the MBTI and proves that we cannot predict or understand human behavior and personality. Psychology is not a natural science with concrete facts that can be derived mathematically. (Not that the MBTI or that compass software rely on psychology or anything scientific anyways).

  • @feedbackzaloop

    @feedbackzaloop

    1 year ago

    For a 'not a natural science' psychologists learn way too much statistics. Like, near as much as physicists

  • @venkat2277
    @venkat2277 1 year ago

    0:40 yes, I predicted that too, it makes a lot of sense. Think about it, the 40 year old guy who has done armed robbery knows the consequences and probably regrets it and will be very scared to repeat it. While the girl walked away as if nothing happened, faced no consequences hence she is much more likely to repeat it.

  • @michaellautermilch9185

    @michaellautermilch9185

    1 year ago

    The girl should be appropriately punished by her parents, as all children occasionally need. If parents would parent, then the government wouldn't need to become Big Brother and act like everybody's parent.

  • @Mysteroo
    @Mysteroo 1 year ago

    Those darn scooter thieves

  • @raxcentalruthenta1456
    @raxcentalruthenta1456 1 year ago

    This is dystopian. Plain and simple.

  • @Rayzan1000
    @Rayzan1000 1 year ago

    I think you misinterpret the "How often do you worry about financial survival" -question. If you are often worried about your financial survival, then you "probably" either have a rather low wage or fluctuating wage, making you more likely to commit a crime, in order to pay your bills.

  • @sirswagabadha4896

    @sirswagabadha4896

    1 year ago

    In that case, any psych undergrad could tell you how much the ambiguity of the question without any context invalidates its results. There's a whole history of keeping people in prison for being poor, they could have chosen something much better

  • @SeidCivic

    @SeidCivic

    1 year ago

    Thus making the test/algorithm even more unreliable.

  • @Rayzan1000

    @Rayzan1000

    1 year ago

    @@sirswagabadha4896 Well, most (if not all) questions can invalidate the result if taken out of context.

  • @Gerard1971
    @Gerard1971 1 year ago

    The duration of a sentence should be based on the evidence about the crime that happened, not on what might happen in the future according to some black-box algorithm that is based on group statistics rather than on the individual, and that nobody can independently verify. It should only be used to determine whether certain treatment needs to be given before rehabilitation, to decrease recidivism. It is sometimes used to reduce sentences when the risk of recidivism is deemed low, to free up space in prisons, but that is similar to using it to give someone a longer sentence because they have a higher risk of recidivism.

  • @quarepercutisproximum9582

    @quarepercutisproximum9582

    1 year ago

    Exactly! Our system is based not on self-proclaimed rehabilitation, but instead on revenge/ punishment. Therefore, we cannot morally "take revenge" or "punish" that which has yet to actually be done

  • @csolisr
    @csolisr 1 year ago

    One of the parameters in that COMPAS algorithm is basically the skin tone chart from that Family Guy skit, you know the one

  • @Lolstarwar
    @Lolstarwar 1 year ago

    I wanna read the poem

  • @louistennent
    @louistennent 1 year ago

    This is literally the plot of Captain America: The Winter Soldier. Except, of course, with massive aircraft with guns aimed at the high-risk people.

  • @roosterdoodster9220
    @roosterdoodster9220 1 year ago

    I just realized when he went over the questions I have almost definitely done something like this.

  • @FreeDomSy-nk9ue
    @FreeDomSy-nk9ue 1 year ago

    I love your videos; that was awesome, I really enjoyed it. I can't believe COMPAS isn't talked about as much as it should be.

  • @prim16
    @prim16 1 year ago

    This convinces me that COMPAS doesn't just need to be revised or "fixed", it needs to be discontinued. AI may have a future in the world of law. But this has completely tarnished its reliability, and ruined the lives of people. Its untested and inaccurate technology is being used too soon. If you were using machine learning to teach a bot to play chess, you wouldn't throw it up against Magnus Carlsen on its first dozen trials.

  • @tweter2

    @tweter2

    1 year ago

    What would replace it? Gut hunches?

  • @jinolin9062

    @jinolin9062

    1 year ago

    @@tweter2 something that doesn’t ask philosophical questions to base whether or not someone should get 13 or 30 years in prison?

  • @tweter2

    @tweter2

    1 year ago

    @@jinolin9062 That's the county prosecutor and judge. I know of one crime where one judge gave someone 15 years of probation and treatment (for a first conviction), while the prosecutor appealed to get the guy 15 years in prison. (Yes, the prosecutor can appeal your conviction for a harsher sentence.)

  • @tweter2

    @tweter2

    Жыл бұрын

    @@jinolin9062 I think another horrid thing is that judges can decide whether sentences for multiple convictions are served concurrently or consecutively. In other words, if you get convicted of a 3-year crime, a 5-year crime, and a 10-year crime, will you serve 10 years for all three or 18 years for all three? The judge gets to pick!
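
A minimal Python sketch of the arithmetic described in the comment above, using the hypothetical 3-, 5-, and 10-year terms it mentions:

```python
# Concurrent vs. consecutive sentences for the same three hypothetical convictions.
sentences = [3, 5, 10]  # years

concurrent = max(sentences)   # served at the same time -> the longest term governs
consecutive = sum(sentences)  # served back to back -> the terms add up

print(f"Concurrent: {concurrent} years, consecutive: {consecutive} years")
# -> Concurrent: 10 years, consecutive: 18 years
```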

  • @ichigo_nyanko

    @ichigo_nyanko

    Жыл бұрын

    @@tweter2 Nothing: standardised sentencing for the same crime, perhaps with increased sentences for repeat offenders. Why should you punish someone for something they might do? It's innocent until proven guilty, and if you haven't even committed the crime yet, it is literally impossible to prove you guilty.

  • @zeropoint703
    @zeropoint703 Жыл бұрын

    that outro with the box tho 🔥🔥🔥

  • @arnauarnauarnau
    @arnauarnauarnau Жыл бұрын

    Your jumper is awesome. Where'd you buy it?

  • @jamesmiller4487
    @jamesmiller4487 Жыл бұрын

    Excellent and thought-provoking video; clearly algorithms are not, and maybe never will be, ready to judge humans. The problem is that human judgement is just as flawed, varying from person to person, day to day, and situation to situation. You could have created a video on the fallibility of human judges and their inept and biased sentencing, and been equally right and thought-provoking.

  • @meisstupid1831
    @meisstupid1831 Жыл бұрын

    Okay, Kevin, here is the problem: crimes shouldn't be handled by algorithms. Human judgement is basically the closest anyone has come to weighing a crime. Things might be correlated, but that's never always true; people are too hard to predict, in criminology or in basically anything else. Math doesn't conclude crimes, it catches clues, as Kevin already showed in the last video. It's a dumb misconception -- like using a broken compass to find your way back. The real problem is that human nature is too complex, and the best way to reduce crime rates is to find the root cause. It feels odd to judge people using math; it's a tool, but not for something as complex as human beings.

  • @HHHjb_

    @HHHjb_

    Жыл бұрын

    Ye

  • @feedbackzaloop

    @feedbackzaloop

    Жыл бұрын

    Funny you brought up that analogy, when one of the said algorithms is called COMPAS

  • @truthboom

    @truthboom

    Жыл бұрын

    Human nature isn't that complicated lol. People steal food if they have no food. Bosses lower wages because they're greedy and can get away with it.

  • @militantpacifist4087
    @militantpacifist4087 Жыл бұрын

    Reminds me of that one episode of Futurama.

  • @kevinlago1619
    @kevinlago1619 5 ай бұрын

    Awesome video as always Kevin! :D Cool name btw

  • @maxwhite4732
    @maxwhite4732 Жыл бұрын

    This is the equivalent of asking a fortune teller to predict the future and using it as evidence in court.

  • @evil_bratwurst

    @evil_bratwurst

    Жыл бұрын

    Exactly! Nice pfp btw.

  • @themightyquinn1343
    @themightyquinn1343 Жыл бұрын

    There is something extremely concerning to me about an algorithm or artificial intelligence that tells me whether or not I will commit a crime.

  • @tmrogers87
    @tmrogers87 Жыл бұрын

    Liking and commenting to increase engagement and visibility. This is fascinating and more people should know how criminal justice, AND MOST OTHER ASPECTS OF MODERN SOCIETY, are shaped by assumptions made by an algorithm or other model

  • @Crimsaur
    @Crimsaur Жыл бұрын

    I really liked the track you used at the opening; could I ask what it was?

  • @charlierogers5403
    @charlierogers5403 Жыл бұрын

    And this is why algorithms are not good for everything! We shouldn't rely on them 100%.

  • @timojissink4715

    @timojissink4715

    Жыл бұрын

    Algorithms can be amazing, but they need the right unbiased human input.

  • @luc_666jr5

    @luc_666jr5

    Жыл бұрын

    Tell YouTube that, please.

  • @vertigo747
    @vertigo747 Жыл бұрын

    Haven't watched it yet, but I know it's going to be good.

  • @Nillowo

    @Nillowo

    Жыл бұрын

    That’s easy to say for all of Kevin’s videos ;)

  • @feedbackzaloop
    @feedbackzaloop Жыл бұрын

    OK, but are there any recent studies showing that recidivism is relevant to judgment at all? If we fail to assess it, we might as well banish the idea altogether.

  • @theomni1012
    @theomni1012 4 ай бұрын

    It's always been interesting how history can predict the future, but it still varies wildly. For example, take a kid raised by abusive parents. You could say that they'll be an abusive parent when they grow up because that's how they were raised. You could also say that they'd grow up to be a very good parent because they never want to treat their child the way they were treated.

  • @SAUL_GOOFYMAN
    @SAUL_GOOFYMAN Жыл бұрын

    We don't need detectives anymore?

  • @issamasghar5203

    @issamasghar5203

    Жыл бұрын

    This isn't about finding criminals but about predicting whether someone will become one. It's more of a proactive way of stopping crime, rather than waiting for it to happen, and it aims to prevent monetary loss and even loss of life.

  • @timojissink4715
    @timojissink4715 Жыл бұрын

    Instead of using our biased assumptions about what plays a role in committing crimes, we should probably get to the root by letting psychologists treat thousands of criminals and having them form the questions.

  • @earzo7

    @earzo7

    Жыл бұрын

    But in proposing this, you've revealed your own bias: that psychologists are the ones most qualified to form these questions -- that criminal tendency is mostly linked to the psychological -- and not specialists from other fields like sociologists or even people within the justice system itself. Even deciding what fields of study are relevant to such a system is a contentious area, and even the plausibility of such a system should be called into question. If, for example, criminal behavior is strongly linked to economic factors such as poverty or homelessness, the algorithm may not be able to adequately predict changes in this status no matter how good it is.

  • @timojissink4715

    @timojissink4715

    Жыл бұрын

    @@earzo7 You've certainly got a point there; I'm definitely no expert. But I do think psychology can explain a lot. The decision to solve your poverty by committing a crime, for example, has to do with how our brains are wired. Many will have the thought and few will execute it. My point was more that when you're creating such an algorithm, you should do a great amount of research into the drive to commit the crime before having any idea of how to do it successfully.

  • @EmperorShang
    @EmperorShang Жыл бұрын

    Thanks for being part of the problem

  • @kevinlemon3467
    @kevinlemon3467 Жыл бұрын

    I think this is an excellent example of how statistics and large numbers work. A single individual is extremely hard to predict, but large groups of people are relatively easy, meaning you can predict the average behavior of a large group with a fair degree of accuracy. We do this in business all the time to predict people's buying habits. I used to run a small business and would use some rudimentary statistical analysis to predict how we'd do in any given year, set prices, manage inventory, etc., and I could generally predict the year's total profits to within a few percent based on numbers I had at the beginning of the year. I couldn't tell you what a single customer would do, but with the right information I could predict what people would do as a whole months in advance.

    The Twitter results don't surprise me in the least. The number of people involved means extremes are effectively controlled for, so you'll probably get a fairly average group, which means the common wisdom about who is likely to commit crimes is what shows up in the data. If that common wisdom is at all accurate, the data from Twitter will be fairly accurate; if it isn't, it will be inaccurate. It isn't surprising that Twitter ended up with numbers similar to the algorithm's, since the algorithm seemed to take into account things most people would consider (prior convictions, education, socio-economic background, etc.). I don't know how accurate either of them was, but I'd be surprised if the Twitter poll was dramatically different.
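
A small illustrative simulation of that point -- the probabilities below are made up, not real recidivism or sales data -- showing that individual outcomes stay unpredictable while group-level rates settle down as the group grows:

```python
import random

random.seed(0)

def observed_rate_range(group_size, risk=0.3, trials=1000):
    """Simulate many groups of the same size and report how much the
    observed rate of 'positive' outcomes varies across those groups."""
    rates = []
    for _ in range(trials):
        outcomes = [random.random() < risk for _ in range(group_size)]
        rates.append(sum(outcomes) / group_size)
    return min(rates), max(rates)

for n in (1, 10, 1000):
    lo, hi = observed_rate_range(n)
    print(f"group of {n:>4}: observed rate ranges from {lo:.2f} to {hi:.2f}")
```

For a single person the "rate" is just 0 or 1, while for a group of 1,000 it stays close to the underlying 30%; that gap is the difference between predicting individuals and predicting populations.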

  • @martinzg0078
    @martinzg0078 Жыл бұрын

    Vsauce

  • @tom05011996
    @tom05011996 Жыл бұрын

    The COMPAS risk assessment would give a high score to anyone with ADHD!

  • @evil_bratwurst

    @evil_bratwurst

    Жыл бұрын

    I guess I'm gonna be a major criminal, then!

  • @pkmntrainermark8881
    @pkmntrainermark8881 Жыл бұрын

    I'm just gonna take a moment here to voice my appreciation for Kevin still making videos for us. Vsauce 1 and 3 never upload anything, so it's good to still have one around.

  • @quarepercutisproximum9582

    @quarepercutisproximum9582

    Жыл бұрын

    You mean 2! lololol but yeh you right

  • @aloe.0v0
    @aloe.0v0 Жыл бұрын

    These "risk assessments" have a HUGE bias against the neurodivergent. As someone with ADHD, I've faced similar lines of questioning in clinical assessments ("Do you feel bored?", "Do you feel discouraged?", "Is it difficult to keep your mind on one thing for a long time?"). Not to mention I live in an expensive city and live with friends to afford rent. Apparently I'm high risk for repeat criminality 😅

  • @lawlerzwtf
    @lawlerzwtf Жыл бұрын

    Psycho Pass. Or Minority Report, depending on your demographic.

  • @RedIceberg
    @RedIceberg Жыл бұрын

    I feel like the problem is that a young person who has been taken to court is much less likely to commit another crime. COMPAS probably doesn't take this into account, and therefore gave the teenager an inflated score.
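
A toy weighted-sum score with made-up weights -- not the real COMPAS model -- illustrating how the handling of a single input like age can inflate a young first-time offender's score:

```python
# Hypothetical weights chosen only for illustration.
def toy_score(age, priors, employed):
    score = 2 * priors              # each prior conviction adds risk
    score += 0 if employed else 1   # unemployment adds a little
    score += 3 if age < 25 else 0   # youth weighted heavily, with no offsetting factor
    return score

print(toy_score(age=18, priors=0, employed=True))   # 3 -- teenager with no record
print(toy_score(age=45, priors=1, employed=False))  # 3 -- older repeat offender
```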

  • @lloydgush
    @lloydgush Жыл бұрын

    Anyone ever heard of the halting problem? Feels quite like it.

  • @TheRockingChar
    @TheRockingChar Жыл бұрын

    If that risk assessment could be turned into an online test that'd be so dope
