Big O is one of the first concepts introduced in computer science classes. It's also a concept that all professional programmers who have gone through the leetcode interview process have heard of, and are nominally required to know about. So what then explains the complete failure to understand what big O means among the population who talk about it? See pic related for an example. A presumably experienced programmer gets the most upvotes in an attempt to explain big O, yet almost every single sentence in the post is wrong. What is it about the big O concept that makes it so hard for people to understand?
>>108737454
Most learn about it to approximate how fast an algorithm is, not to prove upper and lower bounds.
>>108737454
Wrong.
>>108737568
What part is wrong in what I wrote and why do you think it's wrong?
>>108737454
Because big O notation gets defined as "how an algorithm’s runtime changes as the input grows". Runtime and O() notation are related, but they are not the same thing. That is why O(1) means constant time COMPLEXITY, not constant actual runtime. This makes someone like that redditor think that accessing an item in an array will magically take roughly the same amount of real time regardless of the array’s size. In reality, depending on the array’s size and memory/cache behavior, accessing an item can be 10 times slower than in a small array with few elements, even on the same hardware. Why? Because the operation count stays constant, but each of those operations has its own runtime, and there are physical limits too. Both cases are still O(1) and the same fucking algorithm.
>>108737605
>In reality, depending on the array’s size and memory/cache behavior
That isn't relevant for O notation at all.
>>108737677
Exactly. That’s why you can’t say that just because your algorithm is O(1), it won’t take longer as the input size increases. Big O doesn’t convey that information. The hardware architecture and physical limits aren't included in big O. That's why the redditor's first definition, "it doesn't take longer as the input size increases", is wrong. All being O(1) tells you is that the total operation count stays constant. In fact, there are cases where running the same O(1) algorithm on smaller inputs many times is faster than running it once on one big input. If you trust that redditor, this shouldn't be possible.
>Big O...Big Oooo...Big O
>>108737454
if they're all wrong, would you care to provide a correct description for each?
>>108737605
>Because Big O notation is defined as "how an algorithm’s runtime changes as the input grows"
That's actually not how it's defined at all. Literally nowhere will you find this as a definition of big O notation.
>Runtime and O() notation are related, but they are not the same thing
They are a completely different category of thing. That's like saying an apple and sin(x) are related but not the same thing.
>That is why O(1) means constant time COMPLEXITY, not constant actual runtime
Wrong. Big O notation can be used to describe runtime.
Big O is just a ballpark guesstimate that works reasonably well when not having to ultra optimize to within microseconds. Treat each O() as a card and think this card beats that card and you’ll do fine. It’s mostly useful if you’ve got a really ugly performance bottleneck and you may quickly realize you can apply a better O. Anything more than that is academic wankery
>>108737812
Sure.
O(1) means bounded above by a constant.
O(log n) means bounded above by a function of the form C log n, where C > 0.
O(n) means bounded above by a function of the form Cn, where C > 0.
Do I need to keep going or do you get the idea?
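Those bounds can be spot-checked numerically. A minimal sketch (`is_big_o` is a made-up helper, and a finite check is evidence rather than a proof, since the definition quantifies over all n past some threshold n0):

```python
def is_big_o(f, g, C, n0, n_max=10_000):
    """Empirically check f(n) <= C*g(n) for all n0 <= n < n_max.
    A spot check, not a proof: big O is a claim about all sufficiently large n."""
    return all(f(n) <= C * g(n) for n in range(n0, n_max))

# 3n + 5 is O(n): the constant C = 4 works once n >= 5 (3n + 5 <= 4n iff n >= 5)
print(is_big_o(lambda n: 3 * n + 5, lambda n: n, C=4, n0=5))    # True

# n^2 is not O(n): any candidate C fails once n exceeds C
print(is_big_o(lambda n: n * n, lambda n: n, C=1000, n0=1))     # False
```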
>>108737454
Big O isn't the end-all because of the hardware: the cache, the cache lines, cache misses, etc. Everyone sane and competent will tell you to use a vector instead of a linked list, because while linked lists have very nice behavior in big O notation, they are cache hogs. Traversing a linked list means a cache miss at every node, which adds an additional ~100 ns per access on an Intel CPU. The same way, a red-black tree is nice in big O notation, but an absolute hog in cache misses. Hashtables (in the best case) only have one (1) cache miss and thus are just plain better, even if they're a theoretical mess. Big O is fine for a student to learn what is good/bad code to write. Students should absolutely know this. But they should also know that the data cache is far more important if they want performance, and not to get bogged down by "it looks better on paper" vs "it is better".
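The linked-list point is about memory layout, not operation count: both traversals below are O(n), but the node version follows a dependent pointer per element, which is exactly what costs ~100 ns per node in a compiled language. A Python sketch for illustration only (CPython adds its own indirection everywhere, so it won't exhibit the cache effect itself):

```python
class Node:
    """Singly linked list node: each element is its own heap allocation,
    so traversal is a pointer chase from node to node."""
    __slots__ = ("value", "next")
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def build_list(values):
    """Build a linked list from a sequence, preserving order."""
    head = None
    for v in reversed(values):
        head = Node(v, head)
    return head

def sum_linked(head):
    total = 0
    while head is not None:   # one dependent pointer dereference per element
        total += head.value
        head = head.next
    return total

values = list(range(1000))
print(sum_linked(build_list(values)))  # O(n) traversal, scattered nodes
print(sum(values))                     # O(n) traversal, contiguous storage
```

Same asymptotics, same answer; the difference big O cannot see is where the bytes live.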
>>108737842
okay, so you're just arguing semantics, then
>>108737829
>That's actually not how it's defined at all. Literally nowhere will you find this as a definition of big O notation.
nta but that's literally what wikipedia says.
>>108737838
>Big O is just a ballpark guesstimate
Wrong.
>Treat each O() as a card and think this card beats that card and you’ll do fine
Wrong.
>>108737843
>Big O isn't the end-all because of the hardware, the cache, the cache lines, cache misses, etc
The two things have nothing to do with each other.
>>108737854
No it isn't. Post the quote.
>>108737848
If arguing about
If people bring up "Big O" I get suspicious. None of the engineers I talk with speak in big-O notation; we speak in benchmark differences, measuring real execution, not algorithm theoretics. I'm confident this is something only academics and management talk about, not programmers.
>>108737829
>Other entries are measures of the efficiency of the operations. A const entry means the operation takes an amount of time that does not depend on the number of elements in the container; another conventional notation for constant time is O(1). O(n) means the operation takes time proportional to the number of elements involved. A + suffix indicates that occasionally a significant extra cost is incurred. For example, inserting an element into a list has a fixed cost (so it is listed as const), whereas the same operation on a vector involves moving the elements following the insertion point (so it is listed as O(n)). Occasionally, all elements of a vector must be relocated (so I added a +). The ‘‘big O’’ notation is conventional. I added the + for the benefit of programmers who care about predictability in addition to average performance. A conventional term for O(n)+ is amortized linear time
>>108737857Oh shut the fuck up.
>>108737866
>No it isn't. Post the quote.
>In computer science, big O notation is used to classify algorithms by how their run time or space requirements[a] grow with the input.
>>108737848
>>108737866
If arguing about the meaning of fundamental concepts is just semantics, then sure. The whole point is that nobody seems to understand what big O notation means, i.e. the semantics of big O. And as such, they keep saying wrong things about it, things that would fail them on a CS 101 exam.
>>108737881
This is a correct statement. It is not a definition, however. Do you not know what a definition is? I suggest you look up what a definition is on wikipedia. Also look up the actual definition of big O notation.
>>108737829
>They are a completely different category of thing. That's like saying an apple and sin(x) are related but they are not the same thing.
either you are trolling or retarded.
>>108737875
Who are you quoting? What are you trying to prove?
>>108737869
Is it because none of the engineers you talk with are mathematically capable enough to do a very basic analysis of an algorithm?
>>108737829
>different category of thing
>Wrong. Big O notation can be used to describe runtime.
huh?
>>108737882
Because it’s completely irrelevant in the real world. Caring about O to such an autistic degree is useless. O deals with orders of magnitude, so in practice optimizing O is trivial. People talk about O in general terms because in practice it’s only useful in general. Only in academia might you actually need to know the exact definition.
>>108737454
My guess would be that people intuitively understand the concept but they don't know the correct notation to communicate it. It's like musicians who can't read sheet music but can play a song after hearing it.
>>108737905
>Who are you quoting? What are you trying to prove?
Take a guess, you retarded LARPer.
>>108737857
>>Big O isn't the end-all because of the hardware, the cache, the cache lines, cache misses, etc
>The two things have nothing to do with each other.
Not if you want to write efficient, optimized code. The big O/cache miss dichotomy appears in pretty much everything if you squint a little. I know, I once had a job where one more cache miss would be too slow. When you need to choose your data structure, when you need to optimize, cache misses very often go against naive big O thinking. You can marry both (like the B-trees in the Linux kernel) but it's complicated. Of course if you write python code then it's not really something you need to care about; you're not worrying about ten thousand more cache misses per millisecond. Or, god forbid, Java, the most violently aggressive language against the CPU cache I've ever had the displeasure to use.
>>108737905
Why would you even waste time "analyzing the algorithm"? You have to measure the real time anyway, so why bother trying to guess? What is the value, if any, in theoretics when you need to prove them anyway? Just prove them to begin with. You know which of these 2 implementations is faster? The faster one.
>but what about the difference in big-o between them
I guess the CPU on our target machine thinks differently than your hypothetical constructs.
>>108737905
No, it’s because O by definition only deals with such extremely wide ranges of values that it becomes irrelevant after you identify the category to which it belongs. Once you realize this is O(n^2) or whatever, that's all the useful information you're going to get, because that’s about as much useful information as O can give you. Show me one case where autistically concerning yourself with the pedantries of O notation can be of use.
>>108737898
Explain how. Seems like you are confused about something, I can help clarify.
>>108737908
The person I replied to seemed to imply that big O does not apply to runtime, only to complexity. As if big O were somehow synonymous with complexity, in the same category as runtime, but different. I pointed out that big O has no special relationship to complexity, and no special relationship to complexity over runtime. It can be applied to both, and is a different category/kind of thing from both. Does that help?
>>108737909
>Because it’s completely irrelevant in the real world
Then why do people talk about it constantly?
>O deals with orders of magnitude, so in practice optimizing O is trivial.
What does "O deals with orders of magnitude" mean? Do you know what orders of magnitude means?
>so in practice optimizing O is trivial
I don't think optimizing a sorting algorithm to be O(n log n) is trivial at all.
>Only in academia you actually may need to know the exact definition
The exact definition is simple and understandable by everyone. If you mention and use a concept, you should know what it means. Don't you agree?
>>108737892
>Also look up the actual definition of big O notation.
you mean the mathematical definition? what are you even arguing about? what anon said is correct.
>>108737912
The person in the OP screencap clearly does not intuitively understand the concept either. Intuitive understanding would lead one to make correct statements about it. Yet the statements about big O notation were almost all incorrect, demonstrating that whatever understanding he had of big O notation was false.
>>108737913
I have no idea.
>>108737932
>Explain how. Seems like you are confused about something, I can help clarify.
runtime and big O are, in fact, the same category. nothing to explain.
>>108737936
>you mean mathematical definition?
Is there any other kind? Mathematical concepts have definitions, and you can call them mathematical definitions if you want, sure.
>what anon said is correct.
It's a correct statement about big O notation, but it's not a definition of big O notation. It doesn't even syntactically qualify as a definition.
>>108737947
>Is there any other kind? Mathematical concepts have definitions, and you can call them mathematical definitions if you want, sure.
post the definition and let's see what you are mad about.
>>108737940
>The person in the OP screencap clearly does not intuitively understand the concept either. Intuitive understanding would lead one to make correct statements about it. Yet the statements about big O notation were almost all incorrect. Demonstrating that whatever understanding he had of big O notation, was false.
But they were correct. It's a nice shorthand, from someone who knows his stuff. I don't exactly understand what you're trying to do there. He is literally correct. He didn't take into account cache misses, of course, or corner cases, or cases where the "constant time" is 10 years while the linear time for the same problem is nanoseconds; he didn't explain any trap, but it's basically right. A good student.
>>108737959
They were wrong. For example, O(1) doesn't mean it doesn't take longer as the input increases. O(n^2) doesn't mean that as the input size grows you have to do a square number of things. For example, array access is O(n^2) (since it's bounded above by Cn^2), and you don't have to do a square number of things. Should I keep going?
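The array-access claim follows directly from big O being an upper bound, not an exact growth rate: anything bounded by a constant is also bounded by Cn^2. A quick numeric spot check (`bounded_above` is a hypothetical helper; a finite check is not a proof):

```python
def bounded_above(f, g, C, n0, n_max=10_000):
    """Empirically check f(n) <= C*g(n) on [n0, n_max)."""
    return all(f(n) <= C * g(n) for n in range(n0, n_max))

# model the cost of one array access as a constant, say 1 "operation"
array_access_cost = lambda n: 1

# O(1) trivially...
print(bounded_above(array_access_cost, lambda n: 1, C=1, n0=1))       # True
# ...and therefore also O(n^2): the bound still holds, it's just not tight
print(bounded_above(array_access_cost, lambda n: n * n, C=1, n0=1))   # True
```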
>>108737454
Big O is basically used in interviews, for example, "Hello fellow white person, I would like a job with total compensation of 5 followed by 5 Big Os. Can you hook me up?" "Sure bro" And then we go out for drinks, bromance, fat ladies in caftans, etc.
>>108737454
>Big O is one of the first concepts introduced in computer science classes
wrong, the first concepts are what a bit is, turing machines, what a variable is, an address, loop types, functions, parameters, etc. what you are discussing comes later, in data structures
t. used to teach first year comp sci
>>108737955
For a topological space X, a point c in X (usually omitted and implicit), and two positive functions f, g defined on X - {c}, one defines O(g) by saying that f belongs to O(g) if and only if there is a positive constant C and an open neighborhood of c such that f <= Cg at all points inside of it. In the usual computer science situation, one takes X to be the two point compactification of the natural numbers N with the order topology, and takes c to be positive infinity.
>>108737989
most jeets can't answer basic comp sci questions, only the ones they have memorised answers for via their whatsapp groups. try it yourself: they will be able to answer a question about a sort that has been asked at several other interviews but completely fuck up explaining something trivial like an XOR or what a local variable vs a global is. Try it yourself, most of their claims to undergraduate degrees are fabrications. Throw an SSD out on the table with sata connectors and no label and they won't have a clue what it is either. Nation of cheats, frauds, scammers and liars to a man.
>>108738016
>the two point compactification of the natural numbers
Sorry, meant to write one point compactification. I had in mind Z when writing it, which would have a two point compactification.
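For reference, specializing that topological definition to the usual CS case (the open neighborhood of infinity becomes a threshold n_0) gives the familiar form:

```latex
f \in O(g) \iff \exists C > 0,\ \exists n_0 \in \mathbb{N},\ \forall n \ge n_0:\ f(n) \le C\, g(n)
```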
>>108737983>They were wrong. For example, O(1) doesn't mean it doesn't take longer as the input increases.It literally does. For any n, you do constant amount of operations, meaning it should take roughly the same time. But in reality, you can easily get sublinear growth, like with referencing an item because if you ignore all hardware stuff, still latency increases as input grows.if each abstract operation costs roughly the same, then an O(1) algorithm should take roughly the same time. so, that Redditor is right but that's where the usefulness of big O ends, because it doesn't include information of other things like how said operations slow down as input grows or actual physical limitations.
a jeet with all the cisco qualifications on earth won't have a clue what a console cable is if you throw one on the table, try it. Even in the rare cases they got some certs, they got them running vms. Jeets will also list both academic and industry certs they don't have on the basis that they may want to get them, and submit another jeet's doctored certs if proof is asked. they hire scam centres to act as phone and email inquiry services for the academic qualifications they list on cvs. what they will do is join whatsapp groups that share interview questions and memorise answers and pay bribes to 'aunties' in HR or the manager. Where there are jeets there is bribery.
>>108738055
>For any n, you do a constant amount of operations, meaning it should take roughly the same time.
No, that's not what it means.
>For any n, you do a constant amount of operations
That's not what it means.
>>108737454
Big O is retarded because O(1) is misleading af. For them there is no difference between O(1) and O(472828838382929292929).
>>108738074
O(1) means the count is bounded, fuck off retard.
You guys fall for OP's bait every single time
>>108737932
>Then why do people talk about it constantly?
Who? College kids? They talk about a lot of things of no consequence. Surely you're not attempting an appeal to popularity.
>>108737605
>hardware implementation of machine code is different from computer science algorithmic mathematical analysis
Of course. Hardware-limited systems and realtime systems need to worry about registers, the ALU, cache hits, page faults and other latency, as well as the operating system's task manager, parallel cores, the GPU and other peripherals. However, O(whatever) is still a useful analysis for understanding the expected costs of different algorithms. Mathematicians love recursive functions, which when implemented on a computer are extremely slow and memory intensive, as the stack gets pushed full of register and cache data each time another function is called. Summation notation is another thing mathematicians love, and it also is not great for computers, spending a million clock cycles on what should only be a few ALU calculations.
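The summation point in concrete form: a sketch comparing the "mathematician's" recursive sum with the closed form (Python, assuming the default recursion limit of 1000; function names are illustrative):

```python
def sum_recursive(n):
    """Sum 1..n the way the recurrence is written: one stack frame per term."""
    if n == 0:
        return 0
    return n + sum_recursive(n - 1)

def sum_closed_form(n):
    """Same value in a handful of ALU operations, no stack growth."""
    return n * (n + 1) // 2

print(sum_recursive(500))      # works, but costs 500 stack frames
print(sum_closed_form(500))    # same answer, O(1) operations
print(sum_closed_form(10**9))  # still fine; the recursive version would blow the stack
```

Both are "the sum of 1..n", but one is a million-cycle loop (or a stack overflow) and the other is three machine operations.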
>>108738127
assuming things are bait is just denial used to cope with the reality that it's sincere
>>108737454
>O(1) is constant time, which means it doesnt take longer as the input size increases. For example, referencing an item in an array takes O(1) time.
Literally a few lines of C code debunk this. Fags don't even test it and are making shit up.
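A sketch of the kind of measurement that post is describing, in Python rather than C (the helper name and sizes are illustrative; exact timings depend entirely on your machine's cache hierarchy, so the comparison is indicative, not guaranteed):

```python
import random
import time
from array import array

def time_random_access(n, accesses=200_000):
    """Time `accesses` random reads from a contiguous array of n int64s."""
    data = array("q", range(n))
    indices = [random.randrange(n) for _ in range(accesses)]
    start = time.perf_counter()
    total = 0
    for i in indices:
        total += data[i]   # each read is O(1): one index computation, one load
    return time.perf_counter() - start

small = time_random_access(1_000)       # working set fits comfortably in cache
big = time_random_access(10_000_000)    # working set far exceeds cache
print(f"small: {small:.4f}s  big: {big:.4f}s")
# Both loops do the same number of O(1) reads; on typical hardware the big
# array is noticeably slower per read. Big O is silent about that difference.
```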
>>108738266
>>108738301
Big O should be divided into two categories: hardware-level big O and abstract big O. Because, from that image, you can clearly see that referencing grows linearly. So what is the point of saying it is O(1)? For example, you could put registers, the ALU, cache hits, page faults and other latency in a category where they're dependent on n, and if an algorithm depends on any of them, then on hardware we say the algorithm is actually O(n), while the abstract one is O(1).
>>108738110
Correct. Bounded is not the same as constant.
>>108738258
Everyone interested in analyzing the efficiency of software algorithms.
>>108737780
glad someone posted this
Posting in schizo constantly re-enacted thread
https://desuarchive.org/g/thread/107723015/#q107723015
https://desuarchive.org/g/thread/105527500/#q105527500
https://desuarchive.org/g/thread/105489165/#q105489165
https://desuarchive.org/g/thread/105477243/#q105477243
https://desuarchive.org/g/thread/105126664/#q105126664
https://desuarchive.org/g/thread/103554396/#q103554396
https://desuarchive.org/g/thread/103539705/#q103539705
>>108738301
It's a mathematical statement, you don't need lines of C code to debunk it. Simply knowing what big O notation means is enough to realize he's obviously wrong.
>>108737454
Big O is your invitation to debate. "f is O(n)" isn't true or false because we haven't specified whether it's bounding the runtime, the number of arithmetic operations, or the numeric value of f. It's not necessarily a tight bound, so whoever chooses the implied constants can make any x be O(y).
>>108739379
If it's not specified, usually the runtime is implied.
>It's not necessarily a tight bound so whoever chooses the implied constants can make any x be O(y).
This is not true. You cannot make x^2 be O(x).
>>108739384
>You cannot make x^2 be O(x)
They do that move when claiming that array indexing is O(1) instead of O(log(n)). They index arrays of all sizes that they care about with a single machine word.
https://desuarchive.org/g/search/image/gqACSvB_1wU0FOKVpAybYg
>>108738301
Does that continue to increase in size as the array gets larger, say, to 1TB? (You'll need a moderately large machine to run that test properly. Poorfags need not bother trying.)
>>108737454
Oh look, it's another episode of OP having a fit over programmers not being autistically precise with mathematical formalisms. Yes, we are aware that the way programmers use big O is not fully formally correct and arguably closer to big Theta; no, we don't care, because the informal definition is more useful to everyone not writing a dissertation or having melties on mongolian basket weaving forums. To answer your question, the way O is informally used in computer science is an upper bound on a function's computational complexity, with tighter bounds expressed through elementary functions universally preferred (yes, believe it or not, we are aware that virtually all algorithms can be classed as O(n^n!^n!) or something similarly ridiculous; that doesn't make it useful to anyone). Seriously, how sad does your life have to be that you are entertained by posting this over and over again?
\bread