Live Feature Engineering - All Techniques To Handle Missing Values - Day 3

Dataset link: drive.google.com/file/d/1hJgt...
Live Streaming Playlist: • Live stream playlist
Telegram link: t.me/joinchat/N77M7xRvYUd403D...
github link: github.com/krishnaik06/Featur...
Join the Ineuron Affordable course
ineuron1.viewpage.co/Deep-lea...
Please donate if you want to support the channel
Gpay: krishnaik06@okicici
Please join my channel as a member to get additional benefits like Data Science materials, members-only live streams, and many more
/ @krishnaik06
Please do subscribe to my other channel too
/ @krishnaikhindi
Connect with me here:
Twitter: / krishnaik06
Facebook: / krishnaik06
Instagram: / krishnaik06

Comments: 55

  • @akash_a_desai · 3 years ago

    Thanks sir for this series, no one is teaching such techniques, it's really helpful for everyone 🙏

  • @abhi13091985 · 3 years ago

    Commendable job. People can record and do coding offline, but doing it on a live stream is great.

  • @srishtikumari6664 · 3 years ago

    Thank You so much! I really enjoyed this session.

  • @ganeshkharad · 3 years ago

    #respect.... thanks for sharing your knowledge...!!!

  • @pankajkumarbarman765 · 3 years ago

    This is perfect sir, please go ahead with this kind of feature engineering, I truly learn a lot... please continue sir 💖💖😊 The session is going awesome.

  • @ravitanwar9537 · 3 years ago

    god bless you krish, you rock

  • @paneendraprathap1607 · 2 years ago

    Thanks a lot sir, awesome explanation and hats off for your patience and dedication even after your regular office work

  • @rambaldotra2221 · 3 years ago

    Sir, you are taking so much pain to teach the best of your knowledge. Thanks a lot sir.

  • @nothing8919 · 3 years ago

    Thank you a lot, best teacher ever.

  • @harshchindarkar5887 · 3 years ago

    Thank you so much sir, your explanation is very easy to understand... :)

  • @riteshmukhopadhyay6922 · 1 year ago

    From 200K subscribers while recording this video to 600K today (4/7/2022) - kudos to you Krish for your effort.

  • @shivamshinde9810 · 2 years ago

    Very useful sir!! Thank you

  • @marijatosic217 · 3 years ago

    Great class!

  • @Mars7822 · 2 years ago

    perfect explanation.

  • @dipeshjindal2731 · 3 years ago

    Very useful sir, thanks...

  • @InovateTechVerse · 3 years ago

    Thank you so much sir

  • @VenuGopal-dr8ln · 3 years ago

    Nice explanation 💟

  • @017farazbintariq9 · 1 year ago

    thank you sir

  • @jainitafulwadwa8181 · 3 years ago

    We can use embeddings for features with a large number of categories, which encode the feature in a small feature space.
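
    (A minimal sketch of the embedding idea above, assuming PyTorch is available; the column name "X0" and the embedding size of 4 are illustrative, not from the video.)

        import pandas as pd
        import torch
        import torch.nn as nn

        df = pd.DataFrame({"X0": ["az", "bc", "az", "t", "w", "bc"]})

        # Integer-code the categories, then map each code to a dense 4-dimensional
        # vector instead of creating one one-hot column per category.
        codes = torch.tensor(df["X0"].astype("category").cat.codes.values, dtype=torch.long)
        emb = nn.Embedding(num_embeddings=df["X0"].nunique(), embedding_dim=4)

        vectors = emb(codes)      # shape (6, 4); the vectors are learned during model training
        print(vectors.shape)

    Unlike one-hot encoding, the number of new features stays fixed at the embedding dimension no matter how many categories the column has.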

  • @vikasloonia4580 · 3 years ago

    Very useful sir😍😍😍

  • @sandipansarkar9211 · 2 years ago

    Finished watching

  • @karthiksundaram544 · 2 years ago

    Yes

  • @nirajsathe9220 · 2 years ago

    Sir, can we use a technique like filling the missing values with either the previous value or the next value (backward/forward fill)?
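
    (Forward/backward fill is supported directly in pandas; a minimal sketch on a made-up column, not the dataset from the video. It works best when the rows have a meaningful order, e.g. a time series.)

        import pandas as pd
        import numpy as np

        df = pd.DataFrame({"temperature": [21.0, np.nan, 23.5, np.nan, 22.0]})

        df["ffill"] = df["temperature"].ffill()   # carry the previous observed value forward
        df["bfill"] = df["temperature"].bfill()   # pull the next observed value backward
        print(df)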

  • @nikhilparmar9 · 3 years ago

    Super useful 🤟🏻

  • @amponsahwellington3197 · 3 years ago

    Naik, I sent you a mail. Please reply to me.

  • @karanyadav9432 · 2 years ago

    Hey guys, does feature engineering require knowledge of ML as well? Or can you please tell the prerequisites for FE?

  • @prashanthdhananjayan1745 · 2 years ago

    god bless this man lol

  • @MageDigest · 2 years ago

    How do we identify which variables in a dataset are categorical?
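
    (One common heuristic, sketched below: treat object/category dtype columns as categorical and check their cardinality; the CSV path is a placeholder. Numeric columns with only a few distinct values can also be categorical, so the dtype check is a starting point, not a rule.)

        import pandas as pd

        df = pd.read_csv("mercedesbenz.csv")   # placeholder path
        categorical_cols = df.select_dtypes(include=["object", "category"]).columns
        print(categorical_cols)
        print(df[categorical_cols].nunique())  # number of distinct categories per column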

  • @vagheeshmk3156 · 6 months ago

    #KingKrish

  • @hrishi_rich703 · 3 years ago

    Krish sir, please record the video and upload it!!

  • @sahilmalpotra8284 · 1 year ago

    Why are we converting the Mercedes-Benz dataset into a list??

  • @sridhar6358 · 3 years ago

    But we have to do the same for the other categorical variables too, so we may still get 5 x 10 = 50 more columns, right? Which is expected.

  • @nitikeshsaini7655 · 11 months ago

    Right

  • @datafuse32 · 3 years ago

    Can anyone please explain the code: for category in lst_10: df[category] = np.where(...)?
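
    (A hedged reconstruction of the kind of snippet being asked about: it one-hot encodes only the 10 most frequent categories of a high-cardinality column. The column name "X1" and the file path are assumptions based on the session.)

        import pandas as pd
        import numpy as np

        df = pd.read_csv("mercedesbenz.csv")                       # placeholder path
        lst_10 = df["X1"].value_counts().head(10).index.tolist()   # 10 most frequent categories

        for category in lst_10:
            # 1 if the row belongs to this category, 0 otherwise -> 10 new 0/1 columns
            df[category] = np.where(df["X1"] == category, 1, 0)

    Rows whose category is outside the top 10 get 0 in all ten new columns, which is how the rarer categories end up being ignored.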

  • @sridhar6358 · 3 years ago

    Can you explain the winner's solution of the MachineHack competition, is that possible?

  • @harshmakwana8001 · 3 years ago

    I have bought your membership, Krish sir. I saw that it unlocked the Projects playlist, but if you have any other material, then how do I access the Data Science material?

  • @krishnaik06 · 3 years ago

    Hi Harsh, please check the community post. All the info is given there.

  • @harshmakwana8001 · 3 years ago

    @krishnaik06 Thank you sir! ☺️

  • @rich007p · 2 years ago

    One question: when we have 100s of values in a categorical variable and we apply the method of selecting the top 10 of them and then perform the encoding, will we drop that original categorical variable from our dataset?

  • @joeljoseph26 · 5 months ago

    Check the KDD Orange Cup competition from Kaggle. He has also covered a video about picking the top 10 features and ignoring the rest. You can also try mean encoding, frequency/count encoding, or target-guided encoding (if you need a rank for the labels).
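
    (Rough sketches of those alternatives on a made-up DataFrame with a categorical column "X1" and target "y"; in practice the original string column is usually dropped once an encoded version exists, which also answers the question above.)

        import pandas as pd

        df = pd.DataFrame({"X1": ["a", "b", "a", "c", "b", "a"],
                           "y":  [10, 20, 12, 30, 25, 11]})

        # Frequency / count encoding: replace each category by how often it occurs
        df["X1_count"] = df["X1"].map(df["X1"].value_counts().to_dict())

        # Mean (target) encoding: replace each category by the mean of the target
        df["X1_mean_enc"] = df["X1"].map(df.groupby("X1")["y"].mean().to_dict())

        # Target-guided ordinal encoding: rank categories by their target mean
        order = df.groupby("X1")["y"].mean().sort_values().index
        df["X1_target_rank"] = df["X1"].map({cat: rank for rank, cat in enumerate(order, start=1)})
        print(df)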

  • @adityadwivedi9159 · 2 years ago

    Part 2 - Handling Categorical Features starts at 1:12:00.

  • @rebeccakipanga478 · 11 months ago

    Thank you

  • @jaswanthksk8360 · 3 years ago

    Hey Krish, do we have any one method for replacing NaN values which will work for all types of datasets?

  • @zshan101992 · 3 years ago

    I guess there is no such all-in-one method. Usually imputation with mean/median/mode is preferred to get complete data fast, but it also impacts the correlation and distorts the variance, so choose accordingly.
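
    (A quick sketch of the mean/median/mode imputation mentioned above, on a made-up DataFrame; median is the usual choice when the numeric column has outliers, and mode is used for categorical columns.)

        import pandas as pd
        import numpy as np

        df = pd.DataFrame({"Age":  [25, np.nan, 40, 35, np.nan],
                           "City": ["Pune", "Delhi", np.nan, "Delhi", "Pune"]})

        df["Age_mean"]   = df["Age"].fillna(df["Age"].mean())
        df["Age_median"] = df["Age"].fillna(df["Age"].median())      # robust to outliers
        df["City_mode"]  = df["City"].fillna(df["City"].mode()[0])   # most frequent category
        print(df)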

  • @kislaykrishna8918 · 2 years ago

    Can anyone give the code to convert all the other features apart from X1?
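
    (One way to extend the same top-10 idea to every categorical column, not just X1, sketched under the assumption that the dataset is loaded from a placeholder CSV path.)

        import pandas as pd
        import numpy as np

        df = pd.read_csv("mercedesbenz.csv")                 # placeholder path

        for col in df.select_dtypes(include="object").columns:
            top_10 = df[col].value_counts().head(10).index
            for category in top_10:
                # e.g. a new 0/1 column named "X2_aa" for category "aa" of column "X2"
                df[f"{col}_{category}"] = np.where(df[col] == category, 1, 0)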

  • @redroom07 · 2 years ago

    The most frequently occurring category is the mode, so why can't we directly impute NA with the mode? Like this: df.bsmatq.fillna(df.bsmatq.mode()[0])

  • @tirumaleshn8504 · 3 years ago

    Sir! Is it OK to replace null values (0) with imputation?

  • @bruhm0ment767 · 1 year ago

    If the number of rows with null values is small compared to the total dataset, those rows can simply be dropped. However, if the number of rows with nulls is high, we need to impute.
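
    (A sketch of that rule of thumb; the 5% cut-off, the file path, and the column name are arbitrary choices for illustration.)

        import pandas as pd

        df = pd.read_csv("train.csv")                        # placeholder path
        null_fraction = df["LotFrontage"].isnull().mean()    # placeholder column

        if null_fraction < 0.05:
            df = df.dropna(subset=["LotFrontage"])           # few nulls: just drop those rows
        else:
            # many nulls: impute instead of throwing away a large chunk of the data
            df["LotFrontage"] = df["LotFrontage"].fillna(df["LotFrontage"].median())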

  • @dra.talwar4592 · 3 years ago

    I want to join you, but I'm facing a technical hurdle while making the payment.

  • @sandipansarkar9211 · 2 years ago

    finished practice coding

  • @surajkrishnamoorthy7486 · 3 years ago

    If 47% of the data is missing in a column, does it make sense to actually impute the data? Would it not be better to delete the column itself, if almost half the values are synthetic?
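
    (A common threshold-based way to act on that: drop any column whose missing fraction crosses a cut-off; the 40% value below is only an example, and the right cut-off is a judgment call.)

        import pandas as pd

        df = pd.read_csv("train.csv")              # placeholder path
        missing_frac = df.isnull().mean()          # per-column fraction of missing values
        to_drop = missing_frac[missing_frac > 0.40].index
        df = df.drop(columns=to_drop)
        print(f"dropped {len(to_drop)} columns with more than 40% missing values")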

  • @amponsahwellington3197 · 3 years ago

    Naik, I sent you an email. Please check and reply when you are not busy.

  • @asyakatanani8181 · 1 year ago

    i am literally killed... please Krish, stop smacking your mouth...

  • @gh504 · 2 years ago

    thank you sir