Don't make this big O mistake!

Science & Technology

Accidentally quadratic!
Don't fall for this common big O mistake! Python's syntax is short and sweet, so it can be easy to forget the performance implications of little things like "in". These can add up in unexpected ways, even if performance is not high on your priority list.
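The mistake can be sketched in a few lines (an illustrative sketch with made-up function names, not the video's actual source code): summing the unique numbers with a list as the "seen" collection is accidentally quadratic, while a set makes it linear.

```python
def sum_unique_slow(nums):
    # "n not in seen" scans the whole list: O(n) per check, O(n^2) total
    seen = []
    total = 0
    for n in nums:
        if n not in seen:
            seen.append(n)
            total += n
    return total

def sum_unique_fast(nums):
    # "n not in seen" is now a hash lookup: O(1) average, O(n) total
    seen = set()
    total = 0
    for n in nums:
        if n not in seen:
            seen.add(n)
            total += n
    return total

def sum_unique_oneliner(nums):
    # Same idea, letting set() do the deduplication
    return sum(set(nums))

print(sum_unique_slow([1, 2, 2, 3]))  # 6
```

All three agree on the result; only the first one blows up as the input grows.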
*Important Notes*:
1. list_in and set_in are creative pseudo-code and not a good representation of how CPython actually implements these operations in C.
2. O(1) runtime for x in set depends on having a good hash function so that probing "the" spot that x should go is O(1).
3. Python has variable width integers, so seemingly constant time operations like adding or multiplying by an integer (e.g. total += n) are technically not constant time, and you will see this effect if your ints get big enough.
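Note 1's `list_in` and `set_in` can be pictured roughly like this (illustrative Python pseudocode, under the same caveat as note 1: CPython's real C implementation differs):

```python
def list_in(lst, x):
    # x in list: compare against every element until a match -> O(len(lst))
    for item in lst:
        if item == x:
            return True
    return False

def set_in(buckets, x):
    # x in set: hash straight to the one bucket where x could live.
    # With a good hash function each bucket holds O(1) elements (note 2),
    # so the whole check is O(1) on average.
    bucket = buckets[hash(x) % len(buckets)]
    return x in bucket

# A toy 8-bucket "set":
buckets = [[] for _ in range(8)]
for v in (10, 20, 30):
    buckets[hash(v) % len(buckets)].append(v)

print(list_in([10, 20, 30], 20))  # True, after a linear scan
print(set_in(buckets, 20))        # True, after one hash + a tiny bucket scan
```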
― mCoding with James Murphy (mcoding.io)
Source code: github.com/mCodingLLC/VideosS...
CPython time complexities: wiki.python.org/moin/TimeComp...
SUPPORT ME ⭐
---------------------------------------------------
Sign up on Patreon to get your donor role and early access to videos!
/ mcoding
Feeling generous but don't have a Patreon? Donate via PayPal! (No sign up needed.)
www.paypal.com/donate/?hosted...
Want to donate crypto? Check out the rest of my supported donations on my website!
mcoding.io/donate
Top patrons and donors: Jameson, Laura M, Dragos C, Vahnekie, Neel R, Matt R, Johan A, Casey G, Mark M, Mutual Information, Pi
BE ACTIVE IN MY COMMUNITY 😄
---------------------------------------------------
Discord: / discord
Github: github.com/mCodingLLC/
Reddit: / mcoding
Facebook: / james.mcoding
CHAPTERS
---------------------------------------------------
0:00 Intro
0:38 Timing 1k, 10k, 100k
1:30 Performance is relative
2:09 Big O Analysis
2:45 Amortized constant time
3:15 Putting it together

Comments: 145

  • @sophigenitor • 10 months ago

    O(n**2) is the most problematic time complexity. While the problem with O(exp(n)) becomes obvious even with modest test cases, O(n**2) still performs reasonably well for such test cases and only blows up when your application needs to scale to production size.

  • @SodaWithoutSparkles • 10 months ago

    O(N²) is good enough to make into prod, but is bad enough to blow up prod.

  • @FireSiku • 10 months ago

I'm pretty sure O(n!) is the most problematic complexity. It could take AEONS before that code ever finishes. However, you really have to be trying hard to make an algorithm that is THAT bad. It might even go as far as to compete with BogoSort!

  • @qsykip • 10 months ago

@@FireSiku I don't know to what extent I agree with the original assertion, but I think you may be missing the point. They're saying that O(N^2) is insidious precisely because it's not too bad when the test cases are small, so you don't notice that it's a problem until much later.

  • @BytesVsStrings • 10 months ago

Bro explains the difficult O(N^2) concept in O(1) time. Kudos!

  • @vandelayindustries2971 • 10 months ago

    I've been a fan of your videos since you only had a few thousand subscribers, and I have to say each video is becoming more natural! Your explanations are balancing the fine line between being precise/accurate and being easily understandable. Thanks for making these!

  • @mCoding • 10 months ago

    Thank you for coming along for the ride :)

  • @MindlessTurtle • 10 months ago

    Great video. There's a reason why data structures and algorithms go hand in hand.

  • @julians.2597 • 3 months ago

    It would've been good to see the initialisation of the more complex datatypes timed as well, since that is usually where those types' hidden costs are

  • @XxyehezkelxX • 10 months ago

    Although i don't watch as much as i want due to lack of free time, I appreciate your work so much, your videos teach a lot in a concise and accurate manner! Keep up the great work, thank you!

  • @cbunn81 • 10 months ago

    Congrats on your 100th video!

  • @mCoding • 10 months ago

    Thanks!

  • @mayank8387 • 10 months ago

    This is helpful! Thanks for sharing!

  • @volodymyrtruba7016 • 10 months ago

Why didn't I find your channel earlier? Great job!

  • @bigutubefan2738 • 10 months ago

    Really good - thanks James.

  • @xzex2609 • 9 months ago

I remember the days when the subjects on this channel were too much for me; I could understand them, but like I said, it was pro stuff. Now times have changed and I really crave this kind of content, and there isn't much of it. So thank you for your effort teaching advanced programming topics.

  • @kklowd • 10 months ago

    I had a similar issue with using in yesterday in JavaScript. Interesting timing.

  • @robertbrummayer4908 • 10 months ago

    Great video as always :)

  • @yash1152 • 10 months ago

2:46 _"Amortized order of 1"_ Appending 1 element can take O(N), but so do all N appends combined.

  • @mingyi456 • 10 months ago

    Yes, that is literally the definition of amortisation. Are you doubting it?

  • @gurvanbk • 10 months ago

1 element takes O(N) **in the worst case** (i.e., if you need to copy all the data to have more space)
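The occasional O(N) copy described above is observable from Python: CPython over-allocates, so a list's byte size jumps only once in a while, and most appends copy nothing (a quick probe with `sys.getsizeof`; the exact growth pattern is a CPython implementation detail):

```python
import sys

lst = []
resizes = 0
last_size = sys.getsizeof(lst)
for i in range(100_000):
    lst.append(i)
    size = sys.getsizeof(lst)
    if size != last_size:
        # The backing array was reallocated (a possible O(N) copy)
        resizes += 1
        last_size = size

# Only a small fraction of the 100,000 appends triggered a resize,
# which is why append is amortized O(1).
print(resizes, "resizes for", len(lst), "appends")
```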

  • @masheroz • 10 months ago

    When comparing numbers of different magnitude, please don't hide the difference in the units. Write it in full. It's a lot easier to see the difference between 1 ms and 1000 ms, than 1 ms and 1 s.

  • @BrianWoodruff-Jr • 10 months ago

    This

  • @rb1471 • 10 months ago

Usually I just assume using "in" is O(n), so looking at your algorithm it was obviously O(n^2). But I learned that set/dict make a good workaround for it.

  • @Yllipolly • 10 months ago

This just comes down to knowing your basic data structures. If you are uncertain how a container works, try to implement it yourself; then it will become obvious why "in" is linear, logarithmic, or constant for any given type, as it just calls a __contains__ method.

  • @BosonCollider • 10 months ago

More like Python should never have implemented "in" for lists and tuples, or should have made it a .linear_search(x) method on those containers. It allows new programmers to never learn about sets and dictionaries and to repeatedly write horribly slow code, while most languages force you to use hash tables to be concise. Using a dict here is not a "workaround"; it should just be the default data structure you use for that purpose.

  • @vandelayindustries2971 • 10 months ago

    @@BosonCollider I think you mean a set instead of a dict

  • @vandelayindustries2971 • 10 months ago

    @@BosonCollider I disagree that lists shouldn't use the "in" keyword. You're right that it abstracts away from how fast a search operation is for different data structures, but a method name is not the right place to advertise the time complexity of an operation in my opinion (we have docs for that). If you're at the point in your programming journey where performance really matters and/or you're interested in the differences between data structures, you'll likely google that stuff anyway. In most cases, when you need to pick a data structure, you're likely to pick one that's fast for the operation you're doing many times (e.g. element access by index), but very often you will still need the operations it doesn't do very fast (e.g. linear search). In that scenario, it would be painful if every data structure had its own, differently-named method for the different types of operations. A common syntax between all data structures is much easier to remember.

  • @anon_y_mousse • 10 months ago

    @@vandelayindustries2971 I would argue that if you're at the point of caring about the speed your code runs at then you shouldn't be using Python at all.

  • @Nerdimo • 3 months ago

I've seen some cases where you'd prefer the list/tuple when n is pretty small. For example, if you're checking whether something exists in a collection of 2 elements, the list and tuple would be faster than the creation and hashing behind sets/dictionaries.

  • @evlezzz • 10 months ago

It might be surprising, but for really small collections (like 10 elements) it can be faster to use a list instead of a set. Calculating the hash becomes slower than iterating over the whole collection with the "in" operator.

  • @tigrankhachatryan5638 • 10 months ago

    Isn't int.__hash__(self) == self?

  • @0LoneTech • 10 months ago

    @@tigrankhachatryan5638 Depends; e.g. in CPython 3.11.4 on amd64, 1

  • @evlezzz • 10 months ago

@@tigrankhachatryan5638 Mostly (with the exception of -1 and huge numbers). But: 1. We are not always doing this manipulation with numbers; that's just what was tested in the video. 2. For structures like sets and dicts we don't use the hash directly, but something like hash % capacity, to determine the element's actual location in the hashmap. After that, it's possible (and quite common) to hit a collision, so the process runs a couple more times until the actual element is found. 3. There are additional internal operations run when the "in" operation executes (for each structure). Exact constants vary between Python versions, but the main idea is that performance can differ dramatically depending on the number of elements, and for short collections (below a certain threshold) the simplest ones might be the best.
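The exceptions mentioned here are easy to probe on CPython (the exact values are implementation details of 64-bit CPython builds, so treat this as a probe rather than a guarantee):

```python
import sys

print(hash(5))    # small non-negative ints hash to themselves
print(hash(-1))   # -2: a hash of -1 is reserved as a C-level error code
m = sys.hash_info.modulus          # 2**61 - 1 on typical 64-bit builds
print(hash(m))                     # 0: positive ints are reduced modulo m
print(hash(2**200) == 2**200 % m)  # True
```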

  • @TonyHammitt • 10 months ago

    Law of small numbers: Anything's possible! Small numbers of samples can behave however they like, statisticians hate them ;)

  • @megaing1322 • 10 months ago

The point isn't really the calculation of the hash; for the builtin objects that is a completely trivial operation (for ints it's most likely a NOP, for strings it's cached, for general object instances without a custom `__eq__` implementation it's also a NOP). But the overhead of a dict existing at all is relevant at that size. OTOH, it's not that hard to optimize those small cases for dict and set to be just as fast as list and tuple by just iterating through a linear list, but Python might not be doing that for some reason (IDK, I haven't checked the code).
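The crossover this thread debates is straightforward to measure with `timeit` (numbers vary by machine and Python version, so only the large-collection gap is a safe bet):

```python
from timeit import timeit

# Tiny collections: constant factors dominate, results are within noise
small_tuple = (2, 4, 6)
small_set = {2, 4, 6}
t_small_tuple = timeit(lambda: 5 in small_tuple, number=100_000)
t_small_set = timeit(lambda: 5 in small_set, number=100_000)
print(f"3 elements   tuple: {t_small_tuple:.4f}s  set: {t_small_set:.4f}s")

# Large collections: the linear scan loses by orders of magnitude
big_tuple = tuple(range(10_000))
big_set = set(big_tuple)
t_big_tuple = timeit(lambda: -1 in big_tuple, number=1_000)
t_big_set = timeit(lambda: -1 in big_set, number=1_000)
print(f"10k elements tuple: {t_big_tuple:.4f}s  set: {t_big_set:.4f}s")
```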

  • @user-ld8lc4ex4m • 10 months ago

Thank you so much, thank you!

  • @QuantumHistorian • 10 months ago

    Someone needs to show this to Matt Parker :p

  • @naitsab_33 • 10 months ago

To be fair, it's exactly what mCoding said. He did something a single time, so a month of waiting is fine, because you can actually use that month some other way. You could also optimize, but you never know how long that will take; even if it only takes a week to create a near-instant algorithm, you've still spent a week of time which you could have used for something that actually matters.

  • @QuantumHistorian • 10 months ago

@@naitsab_33 Sure... but if you can spend 2h rewriting code to save 3 months of your laptop running hot with the fan on max, you probably should. Even if you're only going to run it once. Even if it ends up taking you 3x longer and saving 3x less time, it's still worth it.

  • @vectoralphaAI • 10 months ago

Awesome. But why is it that you no longer upload as often anymore?

  • @jakdaxter31 • 10 months ago

    Wow I wish I saw this 3 weeks ago. Shortened the retrieval of my dataclass property from 5 minutes to under a second!

  • @knut-olaihelgesen3608 • 10 months ago

How do you type hint an argument that must be both class A and class B? I do not mean Union.

  • @aflous • 10 months ago

    Stay away from multi inheritance, thank me later

  • @whosdr • 10 months ago

    The biggest mistake is just not doing any preliminary performance testing the moment you realise a list can be arbitrarily large.

  • @whosdr • 10 months ago

    ​@@imagudspellr1644 I would still rather be able to prove it in the language I'm targeting. I've been burned a few times by structures that switch algorithm once data becomes so large, and it completely throws out any theoretical model you've got. That's not to mention cache concerns, where a theoretical algorithm might not behave as expected as only a finite amount of memory can be stored in the CPU. But if I can write a test and show that on a similar platform, and for a million elements, it takes X time and Y memory? Those uncertainties vanish. And I can then go back later with the same code and possibly a newer algorithm to test further performance improvements. It seems worth the 5 minutes of work to me at least.

  • @Cyber_Chriis • 9 months ago

    Very good explanation for non CS people :)

  • @gexahedrop8923 • 10 months ago

    2:30 more precisely, x in set is *average* O(1)

  • @megaing1322 • 10 months ago

To get anywhere near O(n), you would have to construct very pathological counterexamples for the set hashing algorithm, so it will almost always be O(1). It can technically be O(n) worst case, but saying "it's O(1)" is basically always correct (i.e., I am making a slightly stronger statement than "average O(1)"). You would have to get very unlucky to get more than 2 or 3 chained overlapping lookups in a row.

  • @user-dh8oi2mk4f • 10 months ago

    I believe the technical term is *amortized* O(1)

  • @megaing1322 • 10 months ago

@@user-dh8oi2mk4f That is not the term Python's performance overview lists, and it is AFAIK something different. Average means that this individual operation might take longer, but shouldn't "in a random case"; amortized is not a statistical term here: it means that if you view it in combination with other operations, it's overall O(1). This for example includes list.append, which is worst case O(n) (it needs to copy), but since it only needs to copy every O(n) appends, it's amortized O(1).

  • @user-dh8oi2mk4f • 10 months ago

    ​@@megaing1322 Is that not the same thing? How is "If you view it in combination with other operations" any different from a running average? An average doesn't say anything about individual operations. It only tells you about the behavior on a large scale, no?

  • @AssemblyWizard • 10 months ago

    ​@@user-dh8oi2mk4f Amortized = average over a sequence of actions performed on the data structure. Average = average of a single use, over all possible inputs. Expected = average over a random choice that happens in the algorithm. With sets/hash-maps, lookup is O(1) in both amortized and expected.

  • @hikaritsumi2123 • 9 months ago

I don't like appending to a list if I'm not going to use it again, and if I have to add another if statement just to check for duplicates, that's such a wasteful operation, so I use a set. That's how I think of it. And like the video says, "if you're using Python you're not going to care about performance", but one thing you should always do is avoid inefficient operations.

  • @bman12three43 • 10 months ago

I had this exact question in a job interview and first offered the naive solution, and the interviewer asked me, "Do you know about big O?"

  • @Raren789 • 10 months ago

    Great video, I wanted to poke around python source code for fun, do you happen to know where to find the code for python built-in operators, such as "in"? I've tried digging into the github repo, but only found some tests. Is it hidden somewhere?

  • @anon_y_mousse • 10 months ago

    Have you looked in the Modules folder and at _operator.c? Anything deeper than that and I'll have to dig into how it runs the bytecode, which I'll eventually end up doing anyhow, just not all in one sitting and not today.

  • @corejake • 10 months ago

The other way around is true too. You should always measure before drawing conclusions, as constants vary greatly and usually matter more than complexity.

  • @kellymoses8566 • 10 months ago

    Python sets are awesome.

  • @enchantedplays7860 • 10 months ago

    Short and sweet

  • @Alister222222 • 10 months ago

    Thank goodness, a mistake I am pleased to say I haven't been making! Sets are awesome.

  • @gardnmi • 10 months ago

Are you hacking my Stack Overflow history? I just made a change the other day to convert all lookups from lists to sets.

  • @unperrier5998 • 10 months ago

    Also remember that *_premature optimization is the root of all evil_* So it's best to implement in the way that is the most readable then optimize where necessary.

  • @Xavier-es4gi • 10 months ago

    Using set when it's appropriate instead of lists is more a best practice than premature optimisation

  • @unperrier5998 • 10 months ago

@@Xavier-es4gi Yes, and it's an implementation detail specific to CPython.

  • @ShankarSivarajan • 10 months ago

    @@unperrier5998 True, but this advice is probably not targeted at people who decide what language/implementation to use based on details like that.

  • @mingyi456 • 10 months ago

    @@unperrier5998 I think choosing the right data structure to use is not just specific to cpython, because data structures are supposed to behave largely the same way across languages?

  • @Y2B123 • 10 months ago

    @@unperrier5998 I beg to differ - any sane Python interpreter on modern hardware will behave in this way unless you count very fancy optimizations.

  • @jullien191 • 10 months ago

Nice explanation. Thank you!

  • @Corncycle • 10 months ago

i love how easy python makes it to use a bevy of data structures right out of the box, and how clean the syntax is to interact with them. i definitely agree with the point of the video that it can obscure how poorly optimized a piece of code is, but the other side of the coin is that it also makes it very pleasant to write code that has visually clear purpose

  • @BosonCollider • 10 months ago

    Imho, python lists and tuples should just not have had the in operator implemented, or it should require the list to be sorted and use binary search. The fact that it makes it somewhat convenient to use an unsorted list as a set is one of the most common sources of beginner mistakes that other languages will not let you do

  • @anon_y_mousse • 10 months ago

    @@BosonCollider Which are these other languages that won't let you make that mistake?

  • @opticalreticle • 9 months ago

    O!

  • @Nekroido • 9 months ago

Holy shit, now that 1+ hour long data export at one of my previous projects makes total sense. That particular part of the code used only lists and tuples. That's what you get when you're too stubborn to jump onto the latest language features, huh.

  • @0LoneTech • 9 months ago

    Latest? collections.Counter has been there since 2010. Even back in Python 1, when we didn't have sets (those are from 2003), we just used dictionaries with all the same values, or value and key identical.

  • @Nekroido • 9 months ago

    @@0LoneTech yet still they used arrays. And were against using SCSS/Typescript instead of vanilla CSS/Javascript on frontend

  • @TrimutiusToo • 10 months ago

    I was like... Wait using list instead of set??? Whenever i don't need to iterate over the data structures or if i do find way more often than iteration, doing dict() or set() is pretty much automatic for me... But i do have 11 years of experience as a software developer

  • @QuantumHistorian • 10 months ago

    The issue is that list is pretty much the first collection anyone learns in python, and in many people's mind the "default" one. Especially for people who aren't full time programmers, but just do it occasionally as an ancillary part of their job (which is a pretty big use case for python). In many cases, sets or dicts are simply better - but unless it's a necessity lots of people wont bother doing something outside their comfort zone. This video is presumably meant to prod those people in that direction, not those like you who already know better.

  • @31redorange08 • 10 months ago

    Set is slower than list. You should know after 11 years.

  • @StainedTag • 10 months ago

    I like this channel because it has a lot of more advanced, niche Python topics. Set and list search time complexity is not that.

  • @TohaBgood2 • 9 months ago

    You have a little mistake in your one-liner solution. sum(set([1, 2, 3])) returns the sum of the elements in a set. What you want here is len(set([1, 2, 3]))

  • @mCoding • 9 months ago

    The code is correct. The example problem is to compute the sum of the unique numbers in the iterable, not the number of unique elements in the iterable.

  • @bereck7735 • 10 months ago

    Discord gang

  • @bswck • 10 months ago

    discord gang 🤙

  • @yash1152 • 10 months ago

I legit thought it was going to be about some mistake regarding OOP (object-oriented programming) in Python lol. Thumbnail: O(OOPS). Thumbnail backdrop: Python. Title: Don't make this big O mistake!

  • @lachlanperrier2851 • 10 months ago

I'm somewhat surprised that a list is that much slower than a dict. Because surely, if that were the case, you would just make a list a special case of the dict, where you restrict the dict keys to integers?

  • @mingyi456 • 10 months ago

    Lists and dictionaries have completely different properties. Dictionary keys are supposed to be immutable and unique, while order of elements is not guaranteed to be preserved, while lists (in python) can contain mutable elements and duplicate elements, and preserve the order of elements. They are not to be used interchangeably, and the video is a prime example of using a list when a dictionary should be used instead.

  • @0LoneTech • 10 months ago

    @@mingyi456 Python dict is order preserving since CPython 3.6, guaranteed since 3.7. sets are not. More details are available e.g. in the talk Modern Dictionaries by Raymond Hettinger. List indexing is even faster than dict lookup, and lists are more compact (in CS terms, they are arrays of references, not to be confused with linked lists). It is value search that is slow, and it would be in dict as well; x in dict.values() is no faster than x in list. Key lookup is optimized, at the expense of insertion time and storage size; set.add() is slower than list.append(). The data structure used for dict is also an array but with a hash table index, whereas I think set just uses the hash table.
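The point above that it is value search that is slow can be demonstrated directly: key lookups are hashed, while `in dict.values()` and `in list` fall back to a linear scan (a small sketch; the asserts show only the semantics, not the timings):

```python
d = {i: i * 2 for i in range(100_000)}
values = list(d.values())

# Hashed lookups: O(1) on average
assert 99_999 in d            # key lookup
assert 99_999 in d.keys()     # same machinery

# Linear scans: O(n) -- every value is compared one by one
assert 199_998 in d.values()
assert 199_998 in values
assert 199_998 not in d       # it's a value here, never a key
```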

  • @enchantedplays7860 • 10 months ago

    401 views

  • @igorlukyanov7434 • 10 months ago

I think 60fps means that one frame takes no more than ~17ms, not 60ms :) Anyway, this is a great video about time complexity and why you should care about it.

  • @orbitalteapot21 • 10 months ago

    I think he said 16 ms

  • @Adventium_ • 10 months ago

    he said 16ms, not 60ms

  • @Djellowman • 10 months ago

I sincerely hope nobody with more than 4 weeks of programming experience would make these silly mistakes: choosing the wrong data structure, or something as elementary as scanning an entire list on every loop iteration.

  • @Djellowman • 10 months ago

    I refuse to believe that somewhat experienced developers use the wrong tool for the job.

  • @johnbennett1465 • 10 months ago

    You wish. TIVO's thumbs/suggest feature works fine with less than 100 shows marked. By the time it has a few hundred, it fails in multiple ways. It stops giving suggestions. It also takes over a minute to delete a single marked show from its list. I wasted a large number of hours because of this.

  • @Berutoron • 10 months ago

    Depends what you call "programming experience". If the person has a full CS degree and only has 4 weeks of professional experience then yeah, but if we're talking about someone who's literally been learning programming for like a month, your expectations are way too high and you've forgotten what it's like to be a total programming novice if you think people shouldn't be making that mistake. None of this is obvious to a learner or even many casual programmers. Especially if they didn't learn the hard way but instead picked a high-level language.

  • @Djellowman • 10 months ago

    @@Berutoron Perhaps my expectations are too high, but I haven't forgotten what it's like to be just starting out. After 4 weeks I made a hashtable in C without stdlib because i wanted to build huge graphs, and iterating over all nodes every time i wanted to set a connection would obviously take forever.

  • @Jeyekomon • 10 months ago

    I sincerely hope this is just a failed joke and not a total overestimate of novice's capabilities.

  • @Diapolo10 • 10 months ago

Unless the sequence of values only contained unique values, or in other words if this function was actually entirely pointless, it's technically O(n*m) and not O(n^2), because the number of cache insertions (and subsequent reads) depends on the number of non-unique values in the original dataset. But I know that this isn't particularly important for getting the point across. You are, of course, free to correct me if I'm the one making a blunder here.

  • @0LoneTech • 10 months ago

    The distinction only helps if you can prove useful properties of your m

  • @BlurryBit • 10 months ago

    Rule one: Use rust. 😂

  • @user-xh9pu2wj6b • 10 months ago

    you're saying this as if this very same thing isn't present in rust. This is about time complexity of an algorithm, a language-independent concept.

  • @BlurryBit • 10 months ago

@@user-xh9pu2wj6b It may or may not be the same for other languages. Totally depends on the compiler/interpreter or whatever imo. Python is slowwwww. Very slowwwwww in general.

  • @screwaccountnames • 10 months ago

    @@BlurryBit Slow to execute, but fast to code. There are still a lot of use cases where Python makes sense to use.

  • @BlurryBit • 10 months ago

    @@screwaccountnames I am a sucker for execution speed haha, sorry. Btw, the comment was meant to be a joke. don’t be offended guys and girls :)

  • @konsth191 • 10 months ago

    @@BlurryBit if that's your level of understanding rust most likely isn't faster lol

  • @pattypeppermint3753 • 10 months ago

    Too much talking

  • @rupen42 • 10 months ago

    From you? I agree.

  • @pattypeppermint3753 • 10 months ago

    @@rupen42 short feedback, no need to cry.

Next