What is it about big O notation that is so hard for people to understand?
Despite being a fundamental concept in CS, so many CS graduates think they understand what big O means, but when they try to explain it they are completely wrong.
For example pic related: almost every statement in the "explanation" is incorrect.
>>107723015Get a job
>>107723096What do I need a second job for?
>>107723160To replace your first job of being bestest good boy point getter.
>>107723015
The hardest thing to understand about "Big O notation" is its use case. It only tells you how well an algorithm scales instead of telling you how well it runs: an algorithm with worse big O notation can run faster. The other issue is that it takes the hardware out of the equation: binary search and the "cache friendly binary search" have the same big O notation, but the cache friendly one not only runs faster, but it also scales faster.
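Rough sketch of the scaling-vs-speed point in Python (the cost functions are completely made up for illustration, not measurements of any real algorithm):

# Hypothetical cost models: algo A is O(n) with a big constant factor,
# algo B is O(n^2) with a small one. Worse big O, yet cheaper for small n.
def cost_a(n):
    return 1000 * n   # pretend: the "better" O(n) algorithm

def cost_b(n):
    return n * n      # pretend: the "worse" O(n^2) algorithm

for n in (10, 100, 1000, 10_000):
    winner = "B (worse big O)" if cost_b(n) < cost_a(n) else "A (better big O)"
    print(f"n={n}: A={cost_a(n)}, B={cost_b(n)}, cheaper: {winner}")

# Below n=1000 the O(n^2) model is cheaper; big O only tells you what
# happens once n gets large enough.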
>>107723160Blacked.com cleanup crew doesn't count.
>>107723205
>It only tells you how well an algorithm scales instead of telling you how well it runs
Wrong.
>>107723187
>>107723238
That is not my job.
>>107723257Oh really, tell me what you can infer from knowing that Quicksort, Mergesort, Shellsort, and Introsort all have the same Big O notation in the best case.
>>107723257Are you the same idiot that ruins every daily programming thread?
>>107723303
Wrong.
>>107723299
>Quicksort, Mergesort, Shellsort, and Introsort all have the same Big O notation in the best case
Meaningless claim. Mathematical gibberish.
>>107723015
I'm not even a programmer but I get it:
O(n) is a function dictating the required processing time based on "n", the size of the input.
So for example, if I were to say O(n), and the processing time is 10s for n=1, here's what you'd get:
n=1, O(1) = 10s*1 = 10s
n=2, O(2) = 10s*2 = 20s
n=9001, O(9001) = 10s*9001 =~ 90ks etc.
If the function was O(n^(3/2)) instead:
n=1, O(1) = 10s*1^(3/2) = 10s
n=2, O(2) = 10s*2^(3/2) =~ 28s
n=9001, O(9001) = 10s*9001^(3/2) =~ 854ks
So if "n" is large you'll probably want to stick to functions that grow slower, such as log(n). But if "n" is small you'll probably want to look at the time necessary to run when n=1.
>>107723367
>O(n) is a function
Wrong
>n=1, O(1) = 10s*1 = 10s
>n=2, O(2) = 10s*2 = 20s
>n=9001, O(9001) = 10s*9001 =~ 90ks etc.
Wrong wrong wrong.
O(g(n)) describes an upper bound on a function: a function f(n) is O(g(n)) if there exist positive constants M and k such that 0 <= f(n) <= M * g(n) for all n >= k.
THERE OP you can stop sperging out now
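If anyone wants to poke at that definition, here's a tiny Python sanity check (the choice of f(n) = 3n + 5, g(n) = n, M = 4, k = 5 is just an example I picked):

# Witnesses for "f is O(g)": the constants M and k from the definition above.
def f(n): return 3 * n + 5   # example function
def g(n): return n           # candidate bound

M, k = 4, 5                  # claim: 0 <= f(n) <= M*g(n) for all n >= k

# You can't check "for all n" by brute force, but a spot check over a
# range shows the inequality holding once n passes the threshold k.
assert all(0 <= f(n) <= M * g(n) for n in range(k, 100_000))
print("f(n) = 3n + 5 is O(n), witnessed by M=4, k=5 (spot-checked for 5 <= n < 100000)")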
>>107723401Get a job.
But how will this help me create a new GET endpoint for return number of sales?
>>107723493
Correct.
>>107723499
see >>107723160
>>107723015
>pic related, almost every statement in the "explanation" is incorrect
No.
You are wrong.
Kill yourself.
>>107723205
>an algorithm with worse big O notation can run faster
can********!!!!!!!!!!!!!!
>>107723401Calm down, autist.
>>107723493
>0 <= f(n)
Do you even know how retarded you are?
assume that n is the size of a dataset.
O(1) constant time, and does not grow based on n. this could be returning the size of the dataset, if it is known.
O(n) is linear time, and it means essentially any constant operations applied to each of the dataset. this could be looping over it 100 times. as n grows so do the operations.
O(logn) is something that is always less than O(n), but still grows as n grows. the classic example is binary search.
O(n^m) is for each element in the dataset n^(m-1) operations are applied to it.
the only other one that is real is O(n!) and this is just brute forcing combinations, an example would be the traveling salesman.
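For anyone who wants concrete code to hang those on, a rough Python sketch (names and sizes made up; these follow the usual informal usage of the classes, not the strict set definition):

from bisect import bisect_left
from itertools import permutations

data = list(range(1_000_000))

# "O(1)": asking for the size; Python lists store their length, no loop over n.
size = len(data)

# "O(n)": a constant amount of work applied to each element of the dataset.
total = sum(x * 2 + 1 for x in data)

# "O(log n)": binary search on sorted data, halving the search range each step.
idx = bisect_left(data, 123_456)

# "O(n^2)": for each element, do O(n) work (here: compare it against every element).
small = data[:1000]
pairs = sum(1 for a in small for b in small if a < b)

# "O(n!)": brute-forcing every ordering, e.g. a naive travelling salesman.
cities = range(8)
tours = sum(1 for _ in permutations(cities))   # 8! = 40320 candidate tours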
>>107723205
>an algorithm with worse big O notation can run faster
Correct. Quicksort is O(n^2) in the worst case and mergesort is O(n log n), yet quicksort usually runs faster.
>>107723652
>You are wrong.
Wrong.
>>107723804
>O(1) constant time, and does not grow based on n
Wrong.
>O(n) is linear time, and it means essentially any constant operations applied to each of the dataset
Wrong.
>O(logn) is something that is always less than O(n)
Wrong.
>but still grows as n grows
Wrong.
Clearly you know next to nothing about big O notation, just like the redditor in OP's pic.
>>107723015BigO notation can be understood in O(LogN)2¥ time. Basically you need to be naturally exponentially asian.
>>107723367
big O has nothing to do with time
you can't make generalizations like this
This autistic shit is for no code retards. (YOU) would save years of your life if you actually just fucking wrote code instead of obsessing over semantic minutiae. Fucking spergs.
>>107723928
What he did was specification, not generalization. The opposite of generalization.
>>107724013
>understanding the most basic CS terms is for no code retards
>>107723850If that's not it, then what is?
>>107723928
>big O has nothing to do with time
It has EVERYTHING to do with time.
>>107723257>UMM AKSHUALLY THAT'S NOT MY JOB OKAYYYYYY
>>107724129What point in particular are you inquiring about? Your question is unclear.
>>107724122This is all you need. Now kill yourself.
>>107724184This image is incredibly wrong and unbelievably stupid. The person who made it didn't have a clue as to what big O notation stands for
>>107724184Actual candidate for one of the worst illustrations of a math concept ever made.
Non-whites can't comprehend Big-O notation. They might be able to recite it, but they never truly understand it.
>>107723015In the real world O(1) can be slower than O(log n). Some trees are faster than hashmaps in practice.
>>107724256In the mathematical world too. Literally nothing about the concept implies that a member of O(1) is faster than a member of O(logn), not even asymptotically.
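To make that concrete (constants picked arbitrarily for the counterexample):
\[
f(n) = 100 \in O(1), \qquad g(n) = 1 \in O(\log n) \quad (\text{since } 1 \le \log_2 n \text{ for } n \ge 2),
\]
\[
g(n) = 1 < 100 = f(n) \quad \text{for every } n,
\]
so the O(logn) member stays below the O(1) member forever. Nothing in the definition forces the opposite.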
>>107723367
>I get it
>every conclusion is wrong
>>107724268
>>107724256
Also you needing to specify "in the real world" implies you don't understand what the concept means, i.e. what big O stands for. It's not just the real world.
>>107724268>not even asymptoticallyYes even asymptotically
>>107724256You can make that claim for all time complexities.
>>107723015This is a good explanation.
>>107724289Wrong.
>>107724316Wrong. Nearly every sentence of the "explanation" is incorrect, the person demonstrates little to no understanding of what big O notation stands for.
>>107724122
>O(n) is a function dictating the required processing time based on "n", the size of the input.
how is this not a generalization of the concept in terms of time
anyhow, it's wrong
>>107724137
no it's not
big-O is just the upper bound as your input becomes sufficiently large. it has nothing to do with time.
>>107724333
>Nearly every sentence of the "explanation" is incorrect,
Prove it
>>107724353
>O(1) is constant time, which means it doesn't take longer as the input size increases
This is incorrect. A function that takes time
f(n) = 1000 - 1000/(n + 1)
is O(1) yet its execution time increases with input.
>>107724401That function would be O(n), not O(1)
>>107724401
>is O(1) yet its execution time increases with input.
But it doesn't. It decreases instead
>>107723015
>>107723257
deja vu
how often do you make this thread op
I remember seeing an anon have a melty over this replying
>WRONG WRONG WRONG
to every post in the thread on at least one earlier occasion
>>107724442Learn basic arithmetic
I feel like I saw this exact OP with the same Wrong. poster like 6 months ago.
Really inane stuff on the website today guys.
>wahhhh the intuitive explanation did not talk about sets constants and limits this will not stand
>>107724415
>That function would be O(n)
Correct.
>not O(1)
Wrong.
>>107724442
Wrong.
>>107724461
>to every post in the thread
Not every post, just the wrong ones.
>>107724523
>>not O(1)
>Wrong.
Wrong
Name one time this explanation has been insufficient in the real world during your career as a software developer.
>>107724548I think anon is suggesting that array lookup is O(n!)
>>107724548f(n) = 1000 - 1000/(n+1) is O(1) because f(n) <= 1000 for all n. Just substitute C=1000 in the definition of O(1) and it directly applies.
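Spelled out against the definition posted upthread (C here playing the role of M, with g(n) = 1):
\[
0 \le f(n) = 1000 - \frac{1000}{n+1} < 1000 = 1000 \cdot 1 \quad \text{for all } n \ge 0,
\]
so M = 1000 and k = 0 witness f being O(1), even though f is strictly increasing in n.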
>>107724567
>that array lookup is O(n!)
This is objectively correct.
If you disagree you either don't know what an array lookup is or you don't know what big O notation means.
>>107724617
I don't disagree, I just don't see the fun in waiting for people to say something accurate and be all
>haha you are wrong
and refuse to elaborate and just feel smug
>>107724652
>accurate
say something inaccurate I mean
>>107724567
wrong :)
>>107724617
wrong :)
>>107724671Whatever person or AI model you were consulting in that image is wrong. Array lookup is absolutely O(n!).
>>107724671small o is different from big O just so you know
People in academia who didn't know what to do invented big O. Meanwhile real programmers were making fast algos to solve actual problems, based on real hardware and software.
Big O is therefore nothing more than a humiliation ritual for the undergrads
>>107724700Sorry, that's just wrong :)
>>107723015Most of them are retards who barely scraped by in basic math classes.
>>107724747
>always
>by definition
kek
>>107724500
Autists love telling the same joke and posting the same bait hundreds of times
No person who respects himself and his time should browse this shithole of a board
>>107723205
>the cache friendly one not only runs faster, but it also scales faster
Until you reach the limit of what the cache can handle; once an algorithm's working set is large enough, caches don't help much.
Point is: choose the right general type of algorithm or you'll be very sorry when the amount of data goes up. Then consider other factors (like whether the input data is already sorted) and measure with something like your real data at real sizes, to avoid surprises.
I've seen people use quadratic algorithms instead of linear algorithms, and then wonder why their code was slow for anything other than their test data. Hmm, could it be because they're idiots?
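The classic accidentally-quadratic case, sketched in Python (made-up data; exact timings will vary by machine, but the shape of the problem is real):

import random
import time

items = [random.randrange(10_000_000) for _ in range(20_000)]

# Accidentally quadratic: "x not in seen" is an O(n) scan over a list,
# performed once per element, so O(n^2) overall.
def dedup_quadratic(xs):
    seen = []
    for x in xs:
        if x not in seen:
            seen.append(x)
    return seen

# Linear (expected): membership tests against a set are O(1) on average.
def dedup_linear(xs):
    seen = set()
    out = []
    for x in xs:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

for fn in (dedup_quadratic, dedup_linear):
    t0 = time.perf_counter()
    fn(items)
    print(fn.__name__, round(time.perf_counter() - t0, 3), "s")

# Both look fine on a 100-element test file; only the linear one
# survives contact with real data sizes.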
>>107724747bet it thinks sorting is nlogn too
>>107724671
You're technically correct in the way that proves you're either a failed mathematician or a perma-NEET.
Or both. Both is good.
>>107724700
>Array lookup is absolutely O(n!)
you made me go find this image in my image archives
SHAME ON YOU
>>107724747
Ask it to explain why it's not O(n!) and watch it glitch out.
>>107724749
As are most people ITT, apparently. Just look, almost nobody knows what big O means.
>>107724797
It's not a joke. Everything I've said is correct and my points are valid.
>>107724842
It's not a troll though. And the fact that array lookup is O(n!) is not just a random useless fact to impress people; such facts are regularly used in analyzing the asymptotic performance of algorithms.
>>107724837
No, he is wrong. He is asserting that accessing an array is not O(n!), which is incorrect.
>>107724579
>f(n) = 1000 - 1000/(n+1) is O(1)
Yes, but if you could read you would see that that's not what we're talking about. Anon described a hypothetical function whose runtime is specified by that given function. We aren't discussing the complexity of the given function itself.
>>107724872
>And the fact that array lookup is O(n!)
can you walk me through those 9 operations that you need to perform to access a[2] in an array of size 3?
>>107724903*6 of course
>>107724145All of them
>>107723015
>Yo dude it's O(1) it's fine
>look behind the scenes
>relies on hashing the input O(n)
every time
>>107724903
Depends on the CPU architecture what kind of microoperations would happen. Your question seems unrelated to the discussion at hand though. Perhaps you need to refresh your memory as to what big O notation means? That might be the source of your misunderstanding.
>>107724908
Your question is confusing. Please ask something more specific.
>>107724845>>107724878They're making much better arguments than you are. Conclusion: You're wrong.
>>107724942
>binary search would be O(n!)
That's correct, it is.
>hash tables would be O(n!)
That's correct, they are.
>heaps would be O(n!)
That's correct, they are (i.e. all operations on heaps are O(n!)).
>dynamic programming tables would be O(n!)
They are.
>That would collapse the entire complexity hierarchy
Wrong.
Ask it to explain how, it can't, because it's wrong.
>>107724930
>Depends on the CPU architecture what kind of microoperations would happen.
What???
ok, it really seems like a troll attempt, so I'm not going to engage further, since you can't show an example even for a trivial case
>w-well actually O(whatever) is also a member of all the faster growing functions
Nobody cares, only the slowest growing upper bound matters.
>>107724963Trolling is not allowed outside of /b/. I'll report you if you don't start making an effort to explain yourself.
>>107725006
Sure.
If a function is O(1) it's also O(n^2) and O(n!). O(1) is a subset of O(n!), and so are O(n^2) and O(n^3) etc.
You can literally just use the definition of big O notation to verify this fact. Someone already posted the definition ITT so idk how this is still confusing to you guys.
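Directly from that definition, for anyone who wants it spelled out:
\[
f \in O(1) \implies \exists\, M, k > 0:\ 0 \le f(n) \le M \cdot 1 \le M \cdot n! \quad \text{for all } n \ge \max(k, 1),
\]
so the same constant M (with threshold max(k, 1)) also witnesses f being in O(n!), i.e. O(1) is a subset of O(n!). The same argument gives O(1) as a subset of O(n^2), which is a subset of O(n!), since n^2 <= n! for n >= 4.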
>>107725026>>If a function is O(1) it's also O(n^2) and O(n!).
>>107725038
This is actually very important in analysis of algorithms and in analytic number theory.
For example you might have terms
(n^2)! + n^2 + n!
and you want to simplify it to
(n^2)! + O(n!),
and that requires you to know that n^2 is O(n!).
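E.g. since n^2 <= n! for every n >= 4:
\[
n^2 + n! \le 2 \cdot n! \quad \text{for } n \ge 4, \qquad \text{so} \qquad (n^2)! + n^2 + n! = (n^2)! + O(n!).
\]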
Are there O(1/n) algorithms?
>>107724930
>>O(1) constant time, and does not grow based on n
>Wrong.
Specifically what's wrong about it?
>>O(n) is linear time, and it means essentially any constant operations applied to each of the dataset
>Wrong.
Specifically what's wrong about it?
>>O(logn) is something that is always less than O(n)
>Wrong.
Specifically what's wrong about it?
>>but still grows as n grows
>Wrong.
Specifically what's wrong about it?
For each of these points, explain what specific thing he said is wrong, and provide a minimal correction to make them right.
>>107725085
incredible insight, anon, thank you
could you please share which university you finished or currently attend?
>>107725077how can you have half an operation
>>107725086
No problem.
I've graduated from Cambridge, but the degree was in maths, not CS (or compsci as they call it).
>>107725077
Yes:
1. NOOP
>>107725099Division but without calculating the remainder
>>107725099Imagine.
I'm studying this stuff right now. Taking Discrete Math 2 and I still don't know what big O means. Something about time complexity and fast binary splitting vs slow quadratic growth or some boring shit like that. I hate logarithms. I don't care, my brain refuses to dedicate mental energy to any of this outmoded pedantic academic nonsense. I just need to copy and paste things from Grok to program my graphing calculator so I have ready-made functions to pass the silly test and get my worthless CS degree. I just want to make AI slop
I don't have time for this garbage
School is so dead, thank God
>>107725109
it's still an operation that takes time
>>107725105
>I've graduated from Cambridge, but the degree was in maths
amazing, anon
I have a bachelors in math and a masters in CS from a non-US university and you put me to shame with this incredible insight of yours
>>107725026Semantically wrong.
>>107725136
Half the time, since it's a normal division operation, but it's cut in half.
>>107725085
For the first point, I've already addressed this, see >>107724401 for an example of how it's incorrect.
To correct the statement, you'd say an O(1) algorithm never takes more than a certain constant time. The time it takes to execute may vary, but it will always stay below a given constant.
For the other points it's similar. For example, an O(n) algorithm doesn't have to grow at all; the noop algorithm is O(n).
Also, something in O(logn) is not always less than something in O(n), not even asymptotically. For example logn is in O(logn) but it's not less than logn/n, which is in O(n).
The redditor simply doesn't understand what big O notation means.
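That last counterexample written out:
\[
\log n \in O(\log n), \qquad \frac{\log n}{n} \in O(n) \quad \left(0 \le \tfrac{\log n}{n} \le n \text{ for } n \ge 1\right),
\]
yet log n >= (log n)/n for every n >= 1, so the O(logn) member is never below the O(n) member.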
hmmm
>>107725136
It's really not some great insight, I don't know why you're making it out to be one. It's just a matter of understanding the definition. Many people think they understand what big O notation means but they don't, as posters ITT and the OP image demonstrate.
>>107725140
Which claim is wrong in particular?
>>107725136Why is there a sudden dip in woofs with a modest increase in bark barks before woofs increases linearly with bark barks?
>>107725150
O notation is specifically about algorithm complexity, it doesn't give a shit about how much a specific operation takes
sum is way easier for a CPU to calculate than division but the O number is going to be the same whether the algorithm is doing sums or divisions or "half divisions"
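Quick sketch of that (the operations are arbitrary; the point is both loops do a constant amount of work per element, so both are O(n)):

def sum_all(xs):
    total = 0
    for x in xs:            # n iterations, one cheap addition each
        total += x
    return total

def halve_all(xs):
    out = []
    for x in xs:            # n iterations, one (pricier) division each
        out.append(x / 2)
    return out

# Different constant factor per element, same O(n) growth:
# doubling len(xs) roughly doubles the work for both.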
>>107725164
>only in a trivial, unhelpful sense
As opposed to what other sense? It's not like there's a competing definition of big O that makes it false.
Is 5+7 = 12 also true only in a trivial, unhelpful sense?
Also it's wrong that it's unhelpful; such facts are very helpful in analyzing the asymptotics of various functions.
>>107725174
Yeah and it's half the complexity for the same operation since it's a normal division operation without the remainder
In fact we can arbitrarily have this all set up so that the algorithms themselves self-modify using time travel ontological reduction in the turing manifold to infinitely subdivide their complexity coproducts
As far as I know, asymptotic notations are sets of functions: a function belongs to a given asymptotic set if its growth is bounded by that set's reference function, in the most generic way possible.
>>107725174Classic case of a transient paws in the signal before the woof function resumes its upward fetch‑jectory
>>107725201
>>107725173
Wrong. Array lookup is O(n^n)
I fucking hate this stupid board SO MUCH
>>107723015
>f is an element of O(g) if there exists a soijak C and a chudjak n0 s.t. for all troonjaks blah blah blah
nobody cares
>>107725206How can we be sure that this data is reliable? How do we know a postman or a cat didn't taint the dataset?
>>107725256Don’t worry, the dataset’s been thoroughly sniff‑validated. No postman interference, and any feline tampering would’ve left clear purr‑turbations in the signal.
thread boils down to
>giga-autistic formal mathematical big O definition vs retarded colloquial zoomer leetcoder big O usage
next