CloudAndDataUniverse

The CloudAndDataUniverse channel has been created to help people learn and scale up in the Cloud and Data domain. As of now, playlists have been created for AZURE, AZURE DATA FACTORY, BIG DATA, SPARK, PYSPARK, DATABRICKS, SCALA, PYTHON, SQL, EXCEL and POWER BI, each in English and Hindi. Share within your network. Keep learning!

We offer courses on our portal: www.cloudanddatauniverse.com

49. Recursion

074. Output Clause

005. ACID Properties

Nashukre Insaan !

13. YTD Total

12. QTD Total

11. MTD Total

9. Custom Merging

7. Palindrome

6. Find Alternate Rows

5. Row Numbering

Comments

  • @jayanthkumarg8958 · 18 hours ago

    Is it free or paid?

  • @cloudanddatauniverse · 18 hours ago

    It's a paid course

  • @krishnakanjully4859 · 2 days ago

    Happy learning ❤

  • @ShubhamNaikanavare · 2 days ago

    Best, sir... whenever I listen to you, I feel really good... you are a great human being ❤ Thank you for sharing all of this.

  • @cloudanddatauniverse · 2 days ago

    Welcome 🤗 Glad to hear it. Just keep at it; success is close by.

  • @pranjal86able · 2 days ago

    We only have one life. What, in your view, is worth giving that life to pursue? You and I are about the same age, and at this stage of life it feels like we put more effort into our careers than was necessary. Perhaps that is the regret that will remain at the very end. "Hazaaron khwahishen aisi ke har khwahish pe dam nikle / Bohat niklay mere armaan, lekin phir bhi kam nikle" (Ghalib: thousands of desires, each worth dying for; many of my wishes were fulfilled, and yet they proved too few).

  • @cloudanddatauniverse · 2 days ago

    Very well said! As far as giving one's life goes, there is only one situation for that, and that is for the country; nothing else. That said, it is true that we all poured heart and soul into our careers and lost quite a lot in the process. May the One above make it easy for us.

  • @pranjal86able · 2 days ago

    @cloudanddatauniverse Giving one's life for the country is a very big thing. However, by "giving one's life" I only meant giving one's time; I was using Ghalib's words. When you spoke of giving it our all, I realised that perhaps we should sometimes zoom out and think about what is most important in life. You may have heard of the sunk cost fallacy (kzread.info/dash/bejne/qKSi2sZsYdyWdtI.html). What are your thoughts on that? Thank you.

  • @nafisamulani8498 · 2 days ago

    👍

  • @MuskanShaikh-hz4fe · 2 days ago

    💯

  • @SAli-ld6hh · 3 days ago

    Please make a video on a "roadmap for Azure data engineering": what to learn and what not, plus resources (they must be open source so we can create personal projects). I'm trying to switch my domain but getting confused, because everyone gives different advice when I ask them.

  • @cloudanddatauniverse · 3 days ago

    Here you go: kzread.info/dash/bejne/eIKqtdOcd5WWg8o.html Check this video, as I explain the path for Azure data engineering.

  • @alhasanmohammedbinquraish960 · 3 days ago

    Sir, please make a Kafka video in Hindi.

  • @saifalam3906 · 5 days ago

    🎉

  • @thisissparta4866 · 9 days ago

    Are there any updates to Power BI in 2024? Asking because I'm planning to start this course today.

  • @cloudanddatauniverse · 9 days ago

    Yes, there have been incremental updates in 2024, as Microsoft releases updates for Power BI every single month. But most of it remains the same, and you can certainly go ahead with this playlist.

  • @VictorBoney · 10 days ago

    Nice! Sir, I couldn't find any DML trigger videos in this SQL playlist. Is there a specific reason? Is it because they are no longer widely used, particularly in data engineering projects, or is there another reason you have left them out?

  • @cloudanddatauniverse · 9 days ago

    Welcome. I don't see triggers being used much in data engineering, but that doesn't mean they are not useful; they do have specific use cases.

  • @mallikarjunchintu2442 · 11 days ago

    Will the amount get auto-debited from my credit card?

  • @cloudanddatauniverse · 11 days ago

    Initially it charges only Rs. 2 while signing up, which is refunded. Once bills are generated after upgrading to PAYG, you have to settle them manually.

  • @swamimauli4306 · 11 days ago

    Good explanation, thanks

  • @cloudanddatauniverse · 11 days ago

    Welcome. Thanks for watching

  • @swadeshbanerjee8888 · 12 days ago

    Can we extract the 4th to the 8th elements, instead of the 0th to the 4th?

  • @cloudanddatauniverse · 12 days ago

    Use this: rdd1.take(8)[4:8]. This works because take() returns a Python list, which you can then slice!
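Since the reply above hinges on take() returning a plain list, here is a small stand-alone illustration (a plain Python list stands in for the result of rdd1.take(8); the values are made up):

```python
# take(8) on an RDD returns the first 8 elements as an ordinary Python list,
# so slicing with [4:8] keeps the 5th through 8th elements (0-based indices 4..7).
first_eight = [10, 20, 30, 40, 50, 60, 70, 80]  # stand-in for rdd1.take(8)
subset = first_eight[4:8]
print(subset)
```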

  • @user-yq8it4og2e · 14 days ago

    Awesome explanation and good scenario ❤

  • @cloudanddatauniverse · 14 days ago

    Glad to know. Thank you for watching

  • @ram.grandhi · 15 days ago

    How do we use a stored procedure with an output parameter? Please help.

  • @VictorBoney · 17 days ago

    In previous examples in this video we used the inserted or deleted qualifiers; however, those qualifiers are not present in the MERGE statement. We are simply using OUTPUT S.ProductKey, S.ProductName, yet we still get a result returned. What makes that work in the MERGE statement is not clear.

  • @cloudanddatauniverse · 17 days ago

    In MERGE we have the $action keyword, which returns which action the MERGE statement performed: INSERT, UPDATE, or DELETE.

  • @VictorBoney · 17 days ago

    Sir, It is not clear (and surprising) why a sequence of statements does not work in the IF block yet it does in the ELSE part. As a result, as a general rule, you propose that we always use BEGIN END if there are several statements to be executed. Am I right in understanding your point?

  • @cloudanddatauniverse · 17 days ago

    True

  • @VictorBoney · 18 days ago

    Nice! Sir, what exactly are the situations or use cases in a practical project where we get to use pivot tables? How best can we use this feature, which you have explained so well? Please share your thoughts. Thanks.

  • @cloudanddatauniverse · 18 days ago

    Thanks for watching. As explained, a pivot is a two-dimensional summarised view: one dimension is the rows and the other is the columns. These dimensions can be any columns in your table. Mostly you will see pivots by year and month, year and quarter, country and product, etc.; it could be anything.
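As a rough illustration of the "any two dimensions" point in the reply above, here is a tiny pivot by year and quarter in plain Python (the sales figures are made up for the example):

```python
from collections import defaultdict

# (year, quarter, amount) facts; the pivot summarises amount with year as the
# row dimension and quarter as the column dimension, like a SQL PIVOT would.
sales = [(2023, "Q1", 100), (2023, "Q2", 150), (2024, "Q1", 120), (2024, "Q1", 30)]

pivot = defaultdict(dict)
for year, quarter, amount in sales:
    # Accumulate into the (row, column) cell, summing duplicate facts.
    pivot[year][quarter] = pivot[year].get(quarter, 0) + amount

print(dict(pivot))
```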

  • @apurvjadhav7995 · 18 days ago

    Can you please make a video on what constructors and generators are, and what the difference between them is?

  • @VictorBoney · 18 days ago

    Informative! Sir, I am having difficulty working with two fact tables. Even in Adventure works, there are many fact tables. Consider simply two Fact tables: FactInternetSales and FactResellerSales, which share common dimensions, however, they have differences in their granularity. Simply put, if I need to work on a report that requires information from both fact tables, how would I be able to combine the two tables together and be able to show the required information on the report?

  • @VictorBoney · 19 days ago

    Informative! Sir, I find both of the following equivalent. Is there any specific reason, or a performance reason, for which you are using CONCAT_WS?

        SELECT CONCAT('HELLO', SPACE(1), 'WORLD')
        SELECT CONCAT_WS(SPACE(1), 'HELLO', 'WORLD')
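One behavioral difference worth noting alongside the question above: in T-SQL, CONCAT_WS skips NULL arguments, whereas CONCAT with embedded separators leaves stray separators behind. A Python stand-in for that semantics (concat_ws here is an illustrative helper, not a library function):

```python
def concat_ws(sep, *args):
    # Join only the non-NULL (here: non-None) arguments with the separator,
    # mirroring how T-SQL's CONCAT_WS skips NULL arguments entirely.
    return sep.join(str(a) for a in args if a is not None)

print(concat_ws(" ", "HELLO", "WORLD"))        # same result as plain CONCAT here
print(concat_ws(" ", "HELLO", None, "WORLD"))  # the NULL is skipped cleanly
```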

  • @VictorBoney · 20 days ago

    Sir, can I get the SQL script for this video please? Thanks, much appreciated.

  • @VictorBoney · 20 days ago

    Hi Yusuf, I have seen that administrators generally follow a practice: they always use BEGIN TRANSACTION before deleting one or more records from a table, and then ROLLBACK TRANSACTION or COMMIT TRANSACTION depending on the result of their operation. How are things handled there?

  • @VictorBoney · 20 days ago

    Informative! Yusuf Sir, there are 3 variants for the first parameter (interval) of the DATEPART and DATENAME functions. Are there any differences, or suggestions on which one should be used? All of the following statements return the same result:

        --DATEPART
        SELECT DATEPART(M, GETDATE())
        SELECT DATEPART(MM, GETDATE())
        SELECT DATEPART(MONTH, GETDATE())

        --DATENAME
        SELECT DATENAME(M, GETDATE())
        SELECT DATENAME(MM, GETDATE())
        SELECT DATENAME(MONTH, GETDATE())

  • @VictorBoney · 20 days ago

    Informative! Thank you. This is one of the interview questions that generally gets asked, wherein the candidate is asked to find the 2nd or 3rd occurrence of a character in a given string. Example: the 3rd occurrence of the character 'l':

        select CHARINDEX('l', 'Hello World', CHARINDEX('l', 'Hello World') + 2)

    Thanks!
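The chained-CHARINDEX trick above generalises to any occurrence; a sketch in Python (charindex_nth is an illustrative helper mirroring CHARINDEX's 1-based, 0-when-absent convention):

```python
def charindex_nth(needle, haystack, n):
    """Return the 1-based position of the nth occurrence of needle in haystack,
    or 0 if there is no such occurrence (mirroring CHARINDEX's convention)."""
    pos = 0  # 0-based index to start each search from
    for _ in range(n):
        found = haystack.find(needle, pos)
        if found == -1:
            return 0
        pos = found + 1  # restart the search just past this occurrence
    return pos  # pos is found + 1, i.e. the 1-based position

print(charindex_nth("l", "Hello World", 3))  # matches the T-SQL example: 10
```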

  • @VictorBoney · 20 days ago

    Good explanation of the joins, sir! We generally get to see one other type of join, the natural join. Is that also important to learn? How is it different from the inner join which you explained in your video?

  • @rajendrayegireddi3429 · 20 days ago

    Nice video, bro, but in a real-time scenario we have lots of tables, right? We can't create a source query the way you did; done like that it is a very lengthy process. Please suggest how to overcome this situation.

  • @cloudanddatauniverse · 20 days ago

    Thank you for watching. Totally agree. You can't adopt this practice when file size is huge. But this was an example to get started with so people know the process and limitations of this approach

  • @rajendrayegireddi3429 · 20 days ago

    @cloudanddatauniverse Bro, how can we overcome this situation?

  • @cloudanddatauniverse · 20 days ago

    From this video you need to understand and conclude one thing: a data lake holds raw files, which cannot be queried directly, hence you have to load the data into a database and then query it. But later developments like Synapse serverless and Delta Lake make it easy to overcome this limitation.

  • @rajendrayegireddi3429 · 20 days ago

    @cloudanddatauniverse Oh, so we overcome this by creating Synapse, correct, bro?

  • @rajendrayegireddi3429 · 20 days ago

    Bro, could you please make that session? That would be very helpful for us.

  • @VictorBoney · 20 days ago

    Sir, when we create a single-column or multi-column primary key, some indexes get created automatically on the table. Are they covered in any of your other videos? Please suggest. Thanks.

  • @cloudanddatauniverse · 20 days ago

    Yes, by default a primary key creates an index in SQL Server. Indexes have not yet been covered in this playlist.

  • @VictorBoney · 21 days ago

    Sir, I want to find records (for example, failed records) which do not have underscores, but the following is not working. What should I do?

        create table student(
            id int,
            student_name varchar(30),
            student_comments varchar(100)
        )

        insert into student(id, student_name, student_comments) values(1, 'Deepak', 'Passed_90_percent')
        insert into student(id, student_name, student_comments) values(2, 'Saket', 'Passed_60_percent')
        insert into student(id, student_name, student_comments) values(3, 'Shekhar', 'Failed')
        insert into student(id, student_name, student_comments) values(4, 'Saket', 'Passed_%_percent')

        select * from student where student_comments not like '%_%'

  • @cloudanddatauniverse · 21 days ago

    Use this:

        select * from student where student_comments not like '%[_]%'

    For complete details on LIKE, check this link: learn.microsoft.com/en-us/sql/t-sql/language-elements/like-transact-sql?view=sql-server-ver16&f1url=%3FappId%3DDev15IDEF1%26l%3DEN-US%26k%3Dk(like_TSQL)%3Bk(sql13.swb.tsqlresults.f1)%3Bk(sql13.swb.tsqlquery.f1)%3Bk(MiscellaneousFilesProject)%3Bk(DevLang-TSQL)%26rd%3Dtrue

  • @VictorBoney · 21 days ago

    @cloudanddatauniverse So it is all about escaping the wildcard so it is treated as a literal first, and then comparing. Perfect. Thank you, sir! I have changed the insert statements above, and now the following SQL statements work as expected:

        select * from student where student_comments not like '%[_]%'
        select * from student where student_comments like '%[%]%'
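For readers following along outside SQL Server: the same escaping idea can be expressed with the portable ESCAPE clause (shown here against an in-memory SQLite database, since the [_] character-class syntax in the reply above is T-SQL specific):

```python
import sqlite3

# In-memory database with the same sample rows as the question above.
conn = sqlite3.connect(":memory:")
conn.execute("create table student(id int, student_name text, student_comments text)")
conn.executemany(
    "insert into student values (?, ?, ?)",
    [
        (1, "Deepak", "Passed_90_percent"),
        (2, "Saket", "Passed_60_percent"),
        (3, "Shekhar", "Failed"),
        (4, "Saket", "Passed_%_percent"),
    ],
)

# '_' is a single-character wildcard in LIKE, so '%_%' matches almost anything.
# Escaping it (here via ESCAPE; in T-SQL also via '[_]') treats it literally.
rows = conn.execute(
    "select student_name from student "
    "where student_comments not like '%!_%' escape '!'"
).fetchall()
print(rows)  # only the row whose comments contain no literal underscore
```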

  • @VictorBoney · 22 days ago

    Sir, is it possible to have the NULL values of a column appear at the bottom of the result set, and not at the top? Thanks.

  • @cloudanddatauniverse · 21 days ago

    There is no direct way, as SQL Server doesn't have a keyword for this; other databases may have a NULLS LAST option. Use the query below for SQL Server:

        select * from dimproduct
        order by case when listprice is null then 1 else 0 end, listprice

    But it comes with a performance impact.

  • @VictorBoney · 21 days ago

    @cloudanddatauniverse Thank you. Noted.
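The CASE trick in the reply above is portable across databases; a minimal sketch against an in-memory SQLite table (the table mimics dimproduct/listprice with made-up rows):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table dimproduct(name text, listprice real)")
conn.executemany(
    "insert into dimproduct values (?, ?)",
    [("A", None), ("B", 19.5), ("C", 7.0), ("D", None)],
)

# NULLs normally sort first in ascending order; the CASE expression produces a
# secondary sort key (1 for NULL, 0 otherwise) that pushes them to the bottom.
rows = conn.execute(
    "select name, listprice from dimproduct "
    "order by case when listprice is null then 1 else 0 end, listprice"
).fetchall()
print(rows)
```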

  • @VictorBoney · 22 days ago

    Nice! It was nice to see how hovering the mouse over the asterisk ( * ) pulls up and shows all the column information. 😃💡 Informative!

  • @VictorBoney · 22 days ago

    Sir, is it important to learn about the two database properties Recovery Model and Compatibility Level? How do they help us in a project?

  • @cloudanddatauniverse · 21 days ago

    Good to know.

  • @VictorBoney · 23 days ago

    Loved it! Yusuf Sir, is this approach of showing measures dynamically similar to what we achieve using the Modeling tab / New Parameter / Fields option? Upon clicking that Fields option, a popup window appears where we specify which fields should be included, and we subsequently use that field parameter in visuals to give a dynamically changing effect to a visual/chart. For example, the following kind of isolated table gets created using the Modeling tab / New Parameter / Fields option:

        Dynamic Slicer Orders (Ship) = {
            ("ShipCity", NAMEOF('Orders'[ShipCity]), 0),
            ("ShipCountry", NAMEOF('Orders'[ShipCountry]), 1),
            ("ShipRegion", NAMEOF('Orders'[ShipRegion]), 2),
            ("ShipName", NAMEOF('Orders'[ShipName]), 3)
        }

    Is your approach similar to the above process, or are there differences in the result achieved? Best regards!

  • @VictorBoney · 25 days ago

    Liked it! Sir, One thing that I noted when comparing the Distinct() and Values() functions is that when you supply the complete table to both, DISTINCT() returns the distinct rows, however VALUES() does not remove duplicates and returns the entire table data as is. It appears that both functions eliminate duplicates when a column is supplied, but behave differently when a table is passed as a parameter. Am I missing something? What do you think?

  • @pujithan9032 · 26 days ago

    Too many ads. There are 3 or 4 ads within a 6 minute video. This is just too distracting. Not worth watching

  • @VictorBoney · 26 days ago

    Nice! Sir, for SCD-2, do we really need StartDate, EndDate, and IsActive? Once we set or update EndDate, isn't that automatically the end of that record for record-history purposes? Later, if we need to pick records, we pick those where EndDate is still null for all active records, and those where EndDate is not null for inactive records.

  • @cloudanddatauniverse · 26 days ago

    Thanks for watching. The StartDate and EndDate ease future calculations. Let's say I want to see the period of activeness for each record: I can simply subtract the StartDate from the EndDate, and more such things that ease our development.

  • @VictorBoney · 26 days ago

    @cloudanddatauniverse Yes, that's right. However, I was curious whether "IsActive" is really required as an additional column when we already have StartDate and EndDate specifically for SCD-2.
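A small sketch of the trade-off discussed above: with StartDate/EndDate alone, IsActive is derivable (EndDate IS NULL marks the open row), while storing it explicitly is a convenience for filtering. Field names and dates here are made up:

```python
from datetime import date

# SCD-2 style version rows for one business key; end_date None marks the open row.
versions = [
    {"city": "Pune",   "start_date": date(2022, 1, 1), "end_date": date(2023, 6, 30)},
    {"city": "Mumbai", "start_date": date(2023, 7, 1), "end_date": None},
]

# is_active need not be stored: it is exactly the predicate "end_date is None".
active = [v for v in versions if v["end_date"] is None]

# Start/end dates also make duration calculations trivial, as the reply notes.
closed = versions[0]
days_active = (closed["end_date"] - closed["start_date"]).days
print(active[0]["city"], days_active)
```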

  • @muskanarora2671 · 26 days ago

    Your way of explaining each and every thing makes you different from everyone else... you are doing great things for us.

  • @cloudanddatauniverse · 26 days ago

    Thank you, glad to know

  • @muskanarora2671 · 26 days ago

    Very informative

  • @cloudanddatauniverse · 26 days ago

    Thank you for watching

  • @VictorBoney · 28 days ago

    Nice! Yes, you rightly pointed out: new offerings like Delta Lake make it important for us to be aware of these things. Sir, is there a video which talks about the following 4 isolation levels?

        1 - set transaction isolation level read committed
        2 - set transaction isolation level read uncommitted
        3 - set transaction isolation level repeatable read
        4 - set transaction isolation level serializable

    The 1st and 2nd are somewhat easy to understand; however, the 3rd and 4th sometimes give trouble. Could you suggest something, please?

  • @cloudanddatauniverse · 26 days ago

    Thank you for watching. Will create a video on concurrency.

  • @VictorBoney · 26 days ago

    @cloudanddatauniverse Thank you. Really appreciate it.

  • @user-mt3lt5gw1k · 28 days ago

    Nice explanation... keep posting... thank you.

  • @cloudanddatauniverse · 28 days ago

    Thanks for watching

  • @ArrowAAA · 29 days ago

    What if our data is more than 5000 rows or greater than 4 MB? How do we use the Lookup activity in that case?

  • @cloudanddatauniverse · 28 days ago

    We generally don't need that. Even if we do, we can use Lookup inside a ForEach activity and pull in batches of 5000. The reason we don't need it is that there is no activity to perform transformations on those rows once we pull them; instead, we use a Data Flow transformation to do it. The Lookup activity has a typical use case, which we have covered in a later example.

  • @ArrowAAA · 28 days ago

    @cloudanddatauniverse Thanks, bhai, you summed it up pretty well.

  • @cloudanddatauniverse · 28 days ago

    Welcome

  • @ArrowAAA · 29 days ago

    very nice Yusuf bhai

  • @cloudanddatauniverse · 29 days ago

    Thank you, bhai.

  • @SAli-ld6hh · 29 days ago

    Can you suggest some resources where I can find practice questions with datasets to brush up on my Power BI skills after completing your Power BI playlist?

  • @cloudanddatauniverse · 26 days ago

    Will share.

  • @WaseemKhan-gi4eh · 29 days ago

    Hello... do you teach the full Spark course?

  • @SAli-ld6hh · 29 days ago

    Most Underrated PowerBI Tutorial

  • @cloudanddatauniverse · 29 days ago

    Thank you for watching and your heartfelt comments. Keep learning. Hope this helps you.

  • @ArrowAAA · 29 days ago

    Nice one Yusuf bhai

  • @cloudanddatauniverse · 29 days ago

    Thank you for watching

  • @muskanarora2671 · a month ago

    Great

  • @cloudanddatauniverse · 29 days ago

    Thank you

  • @rahsss5601 · a month ago

    Hi, please send me your phone number; I will contact you.

  • @cloudanddatauniverse · a month ago

    91 9028 411 640

  • @hobbyhorse4668 · a month ago

    Please provide me your contact number