Big O is an elementary concept in mathematics and computer science, one of the first things you learn when you study algorithms and data structures. Yet for some reason most people who talk about it seem to have no idea what it means. Even people who have no reason not to know what it means, like CS graduates and tech workers.
Take pic related for example. Almost every single sentence in this explanation is incorrect. Yet it's the most upvoted reply in a CS thread.
That's because most programmers today are webshitters and let the SQL database do the heavy lifting. They know how to add 3D shadow to an HTML button but don't know how to sleep sort.
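For the uninitiated, sleep sort is the classic shitpost algorithm mentioned here. A rough Python sketch (the function name and scale factor are mine, not from the thread):

```python
import threading
import time

def sleep_sort(values, scale=0.05):
    """Spawn a thread per value; each sleeps proportionally to its value,
    then appends it. Runtime is roughly O(max(values) * scale) -- not a
    comparison sort at all, and it misorders values that are too close."""
    result = []
    lock = threading.Lock()

    def worker(v):
        time.sleep(v * scale)
        with lock:
            result.append(v)

    threads = [threading.Thread(target=worker, args=(v,)) for v in values]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return result

print(sleep_sort([3, 1, 4, 2]))  # usually [1, 2, 3, 4]
```

It only "works" because the OS scheduler does the comparisons for you; a joke, not an algorithm anyone should ship.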
>>108205219
>Yet for some reason most people who talk about it seem to have no idea what it means.
because the formal definitions are gay and boring
I had a big O with OP's mom last night.
>>108205230
You don't need formal definitions, the informal definition means the same thing. People don't even understand what it means informally, as OP's pic related demonstrates.
>>108205219
But have you considered that O(P) = faggot?
>>108205285
>as OP's pic related demonstrates.
it's mostly shown for the time something takes depending on the input size of some array/list, so people forget that it's a general concept that can be applied to other things as well
>>108205312
That's not the issue with the explanation.
time to nip this thread in the bud
BIG O = ceiling of algorithm. In ENGLISH: "This algorithm cannot POSSIBLY run slower than this"
people get confused and think
BIG O = how much time an algorithm ACTUALLY takes to run. wrong!
Knowledge check: Looping through a 1D array is
>O(n)
>O(n^2)
>O(n!)
>O(n! * n! * n^5)
if you understand this then you understand big O!
:D :) <3 :D :D <3 :) :3
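The knowledge check above can be verified mechanically. A minimal Python sketch (names are mine) that counts the steps of a single pass:

```python
def count_steps(arr):
    # One pass over a 1D array: exactly len(arr) iterations.
    steps = 0
    for _ in arr:
        steps += 1
    return steps

# The step count grows linearly, so the loop is O(n). Because big O is
# only an upper bound, it is then also O(n^2), O(n!), O(n! * n! * n^5)...
# every option in the knowledge check is technically correct.
print(count_steps(list(range(100))))  # 100
```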
>>108205321
in case you are talking about worst case performance, it's usually what matters
I think you're being histrionic; most people understand complexity intuitively, pre-calc is a required class after all.
Of course, CS people are rard'd and pretentious, so they decided to use O() instead of just f(x), or to ditch the function notation altogether.
Outside of that, the limitation is an intuitive understanding of certain functions specifically vs others, which has nothing to do with CS, and you wouldn't expect a CS student to have intuitive knowledge of those. But anyone with a brain knows immediately the difference between log n, n, 2n, n^2, etc.
>>108205336
>Knowledge check:
>Looping through a 1D array is
>>O(n)
>>O(n^2)
>>O(n!)
>>O(n! * n! * n^5)
E) all of the above
>>108205346
>>108205347
>>108205349
Yeah. What's crazy is that even LLMs today get this wrong. For example, if you tell them that array lookup is O(n^2), they will disagree and tell you you're wrong, even the best ones.
Bad CS students have succeeded in poisoning the well of what big O notation means.
>>108205360
Here's Claude Opus 4.6, the supposedly smartest model currently available, in an incognito chat.
Doesn't know what big O notation means.
big o is the reason why seedance can't give me 5 hours of Captain America vs Godzilla. I wish they never invented it
>>108205360
>What's crazy is that even LLMs today get this wrong.
Why would that be crazy? LLMs prioritize how likely a phrase is to follow the prompt, not accurately answering the contents of that prompt.
>>108205374
Because all the big labs boast about the PhD-level intelligence of these latest models.
>>108205360
how is it O(n^2) exactly? what are you taking 'array lookup' to be?
>>108205395
>how is it O(n^2) exactly?
per the definition >>108205230
People in general have a pretty poor understanding of functions. That's because we don't ever work with word problems. Take a kid who got an A in calc 1 and then ask him to model even a basic linear function with a word problem. He will likely struggle, if not be outright unable to do it.
My guess is it's also explained poorly, by professors who themselves have poor modeling intuition.
Negative feedback loop type ish unc.
>>108205404
you need to use the definition with an algorithm, not just have the definition
>>108205395
Let's say you have an array of size n and you want to look up an arbitrary element of that array. By that I mean you want to read an element of the array at an index i, where i is arbitrary. The time it takes to perform this operation is O(n^2)
>>108205420
actually I may be retarded, you're just being a pain about how O(n) is within O(n^2) instead of going for the minimum O
>>108205429
There is no such thing as "the minimum O"
>>108205434
I think you can fully understand what I mean anyway: the function that restricts the upper bound the most (that you know of), in case you're too autistic to figure that out.
>>108205429
>you're just being a pain about how O(n) is within O(n^2)
yeah, it took me a bit too until i noticed what OP was complaining about; theoretically Theta(n) would be more correct, but then who the fuck cares (apart from your theoretical CS professor and OP)
>>108205468
Are you making a joke? It's O(1).
It takes one single operation regardless of n.
>>108205450
>The function that restricts the upper bound the most (that you know),
First of all, there's no unique such function. For every function f, O(f) = O(Cf) for every constant C > 0. Second, even supposing our choice may not be unique, the most obvious function f to make g = O(f) as tight as possible is g itself. So you get answers like "array lookup is O(array lookup)", which is not useful, but that is the natural answer your way of thinking leads to.
>>108205468
No, I'm not joking. And if you actually cared to know the first thing about array lookup and big O notation, you would realize that it's O(n^2).
If reading this thread didn't make it obvious to you yet, consult wikipedia.
>>108205412
>Take a kid who got an A in calc 1 and then ask him to model even a basic linear function with a word problem.
Alice starts at 10m and goes 1m forward every 5 seconds; how long does it take Alice to reach 20m? Good enough?
I couldn't care less how well you know Big O, when every "modern" programmer imports millions of layers of libraries (in JavaScript no less) just to make simple CRUD apps, or to replace existing natively-compiled programs, without a care for how much RAM or storage is used, let alone the CPU overhead and cache misses from all those layers of abstraction.
Oh, and when you mention this, the hipsters tell you to switch to Rust, which takes centuries longer to compile than the most templatized C++ slop.
>>108205492
>no, you're wrong, and no I won't tell you why you're wrong or why your explanation is incorrect
for a person so anal about correctness, you're being exceptionally vague about your "knowledge"
>>108205492
Except you're wrong, so I'm not going to bother looking it up.
This isn't real analysis.
I'll take Big Duo instead.
>>108205510
There's nothing vague about my knowledge and my claim that array lookup is O(n^2). If you don't know why it's true, it means you either:
1. Don't know what big O means. In this case, look it up on wikipedia or take a look at >>108205230
2. Don't know what array lookup means. In that case, simply look it up on wikipedia or ask an LLM.
These are two very basic and easy things you can do right now. The future is in your hands.
>>108205422
advanced trolling
>>108205219
this is totally wrong and you should finish school before posting here
>>108205527
It's not trolling.
1. The statement is objectively true without caveats.
2. It's easy to see why it's true if you know what big O means and what array lookup is, both of which are easy, simple concepts to learn.
3. Many people think the statement is wrong, illustrating their lack of knowledge of the basic concepts involved, proving OP's point.
>>108205537
I have done both, and I've come to the conclusion that you are full of shit and don't know anything about this
you're just baiting with "nuh uhs" and have no semblance of proof about what you're talking about
>>108205543
>I have done both
You are lying. In fact, an explanation for why it's O(n^2) was already provided in this thread, but you were too lazy to read it.
>>108205552
and you're too lazy to even link it
maybe because it doesn't exist
>>108205505
yeah, you should be able to immediately see f(t) = t/5 + 10, but the vast majority will not
>>108205552
You have to be kidding me. It's obviously 50 seconds.
>>108205525
Please go back to /sci/, real analysis schizo. it's constant time to look up array[index], it's just a fact of reality. hope this helps.
>>108205557
not to solve it, dumbass, you can do that in your head — to model it, to write its function.
>>108205535
>>108205422
it is trolling because it's actually O(n log n), you retard
>>108205219
You're wrong.
>>108205285
>You don't need formal definitions
>>108205589
What's confusing or wrong about my post?
>>108205574
You keep ignoring my proof and denying it, just stating "facts" without any logic
how about you try to refute my post and not just ad hominem
>>108205229
I wish webshitters would use SQL.
>>108205608
i'd rather you take your meds, you fuckin brainlet. sage.
>>108205634
I'm not calling you dumb, you don't have to be so defensive
just properly learn the subjects you're misusing, that's all you need to do
>>108205589
>the fact that I can recognize Veruca James
Am I too far gone in porn, or is this just deep knowledge of memes?
>>108205651
you haven't written a proof and you're fucking retarded.
>>108205669
you could at least read the thread before trying to troll
>>108205525schizo nignoggins at it again, kindly choke on my 12 inch dodonkadonka
>>108205369
It's actually right
>>108205681
you haven't even represented the analysis correctly, which is log, let alone the actual modeling. you're a fucking retard and a terrible troll.
>>108205712
Wrong.
I would be impressed if you could clone myth2 soulblighter and add a thing where you can design your own units and add or subtract them from your army
>>108205229
what's a sleep sort?
>>108205395
NTA, but there's a subtle difference here related to the different functions used to indicate orders of functions.
Big O indicates an asymptotic upper bound on a function f. That is, as the inputs to f approach infinity, the result of f will not exceed what is indicated by the surrogate O(...) function.
Because it's an upper bound, if we know array lookup is O(n), then it also by definition is O(n log n), O(n^2), O(n!), etc.
This is not true for Big Theta, which indicates both an asymptotic upper AND LOWER bound on a function f. That is, as the inputs to f approach infinity, the result of f will neither be greater nor smaller than that of the surrogate Θ(...) function.
Array lookup is Θ(n). But it is NOT Θ(n log n), Θ(n^2), Θ(n!), etc.
Usually, when people write Big O notation, they actually mean Big Theta. The wrong notation got more or less grandfathered in because theta is a pain in the ass to type on a US keyboard, so people settled on the capital ASCII O and relied on others to understand what was meant in context.
Of course, newbs to the material who don't have a solid basis of education in computer science theory don't have that context, and that's where the slippery slope into decline starts.
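For reference, the standard textbook definitions this post is paraphrasing (stated for eventually non-negative g and n → ∞):

```latex
f(n) = O(g(n)) \iff \exists\, C > 0,\ n_0 : |f(n)| \le C\, g(n) \ \text{for all}\ n \ge n_0
f(n) = \Theta(g(n)) \iff \exists\, C_1, C_2 > 0,\ n_0 : C_1\, g(n) \le f(n) \le C_2\, g(n) \ \text{for all}\ n \ge n_0
```

The first condition is one-sided, which is exactly why O(n) membership implies O(n^2) membership; the second is two-sided, which is why Θ pins the growth rate down.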
>>108205872
you English and the English of OP are poor and the linked pic
no wonder programmers are so poor in USA n west
I bet the English for occam-pi rmox is more clear.
>>108205583
NTA, but you're wrong.
General array lookup means scanning the elements of an array until you find a matching element. This is worst-case linear in the number of elements in the array: walking from start to end and stopping once you find the first match, which in the worst case is at the very end of the array. Hence O(n). Big O indicates an upper bound - lookup will never perform worse than that.
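A minimal Python sketch of the linear scan being described (names are mine, not from the post):

```python
def linear_search(arr, target):
    """Scan left to right; return the index of the first match, or -1.
    Worst case (target absent or last): len(arr) comparisons -> O(n).
    Best case (target first): 1 comparison."""
    for i, x in enumerate(arr):
        if x == target:
            return i
    return -1

print(linear_search([5, 3, 7], 7))  # 2  (worst case: last element)
print(linear_search([5, 3, 7], 9))  # -1 (worst case: no match)
```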
>>108205903
>Big O indicates an upper bound - lookup will never perform worse than that
lookup will also never perform worse than O(n^2)
>>108205903 (cont.)
That said, it's ALSO O(n log n) AND also O(n^2), because those are also valid upper bounds.
However, it's ONLY Θ(n), as Big Theta indicates both upper and lower bound.
And again - most get those two mixed up.
>>108205909
You beat me to posting my follow-up >>108205914, thanks to the god-fucking-damned post timer.
>>108205535
it's trolling to be insanely pedantic instead of just guiding him to the answer. YES, you could explain big O with the formal definition, but why? If you're upset that "so many" people are missing it, then just give them the easy-to-understand answer and let them work through the math shit later. You are being annoying on purpose
>>108205890
>you English and the English of OP are poor and the linked pic
>no wonder programmers are so poor in USA n west
>you English
"your English"
>in USA n west
"in the USA and the West"
I rest my case.
>>108205924like>Let's say you have an array of size n and you want to look up an arbitrary element of that array. By that I mean you want to read an element of the array at an index i, where i is arbitrary. The time it takes to perform this operation is O(n^2)you wrote an entire essay to say "pick an element from the array" . hate nerd ass niggas like you
>>108205941
>stop being pedantic about math
(You) + AI
am i crazy or did we have this same exact thread with the same exact schizo OP and reddit screenshot on /g/ already at least 5 times in the past
>>108205914
You're talking about array search, not lookup. Lookup is constant time because it's just accessing one index.
>However, it's ONLY Θ(n). As Big Theta indicates both upper and lower bound
nuh-uh, the lower bound for search is Ω(1) because you might get lucky and hit the correct entry first.
>>108205924
You know what's worse than trolling? Feeding them trolls, anon. As soon as you noticed OP was being a faggot, you could have just all fielded and hidden this fucking thread like a normal person. Or stuck around to laugh at the retards getting ragebaited and feeding the trolling OP, which is what I'm doing.
wikipedia sometimes uses this, but only sometimes, which is quite a shame
>>108205280
you lost me
>>108205219
>>108206081
>when you trying to gen a study guide, but the model is overtrained on 1girl
>>108206027
>nuh-uh, the lower bound for search is Ω(1) because you might get lucky and hit the correct entry first.
Array search is both Ω(1) and Θ(n).
And yes - it's definitely Θ(n), because Big Theta is an upper and lower bound on the AVERAGE CASE.
You compute it by taking the average of all possible events, i.e. the target element being located at any of the possible indices in the array, assuming uniform distribution.
Which means you sum Θ(i) for i=1..n+1 and divide the result by n+1
Which is equal to Θ((n+1) ∗ (n+2) / 2) / (n + 1)
Which is equal to Θ(1 + (n / 2))
And then we remove constants and get Θ(n)
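That averaging argument can be checked empirically. A quick Python sketch (helper names are mine), averaging the comparison count over every possible target position:

```python
def comparisons_to_find(arr, target):
    # Comparisons a linear scan makes before hitting target.
    for count, x in enumerate(arr, start=1):
        if x == target:
            return count
    return len(arr)

n = 1000
arr = list(range(n))
# Average over the target sitting at each possible index, uniform distribution.
avg = sum(comparisons_to_find(arr, t) for t in arr) / n
print(avg)  # 500.5, i.e. (n + 1) / 2 -- linear in n, consistent with Theta(n)
```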
>low effort bait thread
100% of Newtonian mechanics is 100% false. 100% of Newtonian optics is 100% false.
We still teach both to undergrads.
>>108205219
Big O is a relic of how computer science worked in the 70's; nowadays it's very common for algorithms that are better according to Big O to be slower in practice. It wouldn't be as bad if people tried to count uncached memory accesses, but people insist on counting """operations""" as if that means anything.
Anyway this thread is just bait from a schizo that posts this same thread every few months, don't fall for it
>>108205230
You just take the precise growth formula for your algorithm and chop off all the low-order stuff. That's it.
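For instance (illustrative polynomial, not from the thread), the chopping step is just bounding every term by the leading one:

```latex
T(n) = 3n^2 + 5n + 7 \le 3n^2 + 5n^2 + 7n^2 = 15n^2 \quad (n \ge 1), \ \text{so}\ T(n) = O(n^2)
```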
>>108206248
This is like complaining about integral calculus because people are being retards about how they do physics.
>>108206248
>it's very common for algorithms that are better according to Big O to be slower in practice
People knew that in the 70s too, you fucking retard, you just don't understand what the notation means
>Is my algorithm retarded?
>Does that make it slow?
It's not that complicated
>>108205229
Most programmers still have degrees, and the degrees still make you learn this >>108205230
>>108205422Nigger are you factoring cache misses into this or something?
>>108205219
i was taught data structures and algorithms by a spanish woman with such bad english that over 75% of the class formally complained about her (she was given a promotion to dept. head the next year and the school finally gained its athena swan silver award)
i think big o is worst case time complexity.
array lookup is O(1) because with an array of fixed-size objects you can work out where it is in linear memory and go straight there, you literally need to look at one thing only.
O(n) means in the worst case you need to look at every item in the array, for example finding a single value in a random list is O(n) because in the worst case (the last place you look) you have to check every (n) element.
so that's basically what it means, and all the other numbers just mean different things like that, n^2 means the lookups are roughly square in shape etc etc, that you have to compare every element against every other element for example.
little o is best case? is that right?
anyway the most important part of big O is that it doesn't matter in the slightest.
i rewrote some medical device software last week with an sql select * statement and the rest of the function assuming the column order and doing a bunch of string compares against a list to find a serial number. that's the kind of thing that real living breathing people with uni degrees in software development apparently think is a reasonable thing to do. so don't even fucking worry about big O.
especially at a job interview, focus instead on highlighting any minority status you have.
>>108206699
>i think big o is worst case time complexity.
Wrong.
>O(n) means in the worst case you need to look at every item in the array,
Wrong.
>little o is best case? is that right?
Wrong.
>anyway the most important part of big O is that it doesn't matter in the slightest.
Wrong.
>>108206630
Wrong.
>>108206788
thanks, really helpful and informative reply lol
>>108206248
>Big O is a relic of how computer science worked in the 70's, nowadays it's very common for algorithms that are better according to Big O to be slower in practice. It wouldn't be as bad if people tried to count uncached memory accesses, but people insist on counting """operations""" as if that means anything.
Algorithms can have not just runtime complexity but also I/O complexity measured.
Big O notation is also used with that.
In fact, there's an entire subfield in research dedicated to I/O-optimized algorithms, which became very, VERY much more relevant with distributed computing.
>>108205219
God WILL punish you, retard.
>>108205422
>n size of the array, i index
>the TIME(?????) it takes to look at i is n^2 (which unit? second? burgers per bald eagle???)
lmao what?
care to explain how you reached this insane result?
>>108205492
>If reading this thread didn't make it obvious to you yet, consult wikipedia.
this thread makes no fucking sense, you retards come here pretending billions of people are suddenly wrong about complexity and you refuse to elaborate, the fuck is your problem? are you even human and not some shitty AI on trial on /g/?
>>108205508
beyond that, most operations in the real world are io-bound, it does not really matter how fast your algos are because you'll spend an unholy amount of time just waiting for something to happen, interrupt, socket, disk io, user input, all this shit hides algorithmic complexity anyway.
also most of the time the "algorithms" we write aren't really "math"-like, we don't sort trees, compute matrices, etc, we open files, read files, move memory around, etc.
how would you compute complexity for these tasks? there are micro-optimisations to do left and right, but most of the time it would be retarded to waste time trying to understand the big hoe notations of these things.
there are obviously cases where it matters, but most performance gains come from less io, better caching and less code, not O(1) everywhere...
>>108205525
>There's nothing vague about my knowledge and my claim that array lookup is O(n^2).
the entire internet says you're wrong, why can't you just explain it simply so us lowly humans and AIs can improve?
>>108205535
>1. The statement is objectively true without caveats.
you're mixing size and time, so obviously it's wrong from the get-go; even if we're charitable and you meant space instead of time, it would still be wrong...
>illustrating their lack of knowledge of the basic concepts involved
ENLIGHTEN US FFS
All this Big O crap is pointless when 99% of the time programmers don't give a shit about optimization.
>>108206909
NTA, but O doesn't necessarily mean we know the optimal bound, so if you're a retard it's not worst case, it's "we know that complexity can't grow worse than this" - maybe this is what he meant
and the notation does matter if you do shit like HPC or cryptography, at least for research
>>108205219
Determinism is flawed, purely out of spite for you personally.
>>108205492
>>108205360
>>108205219
>>108205422
Either you guys have brain damage or this is bait.
>>108205749
You sleep until the array magically sorts itself. It's technically O(1) since it doesn't depend on the input size.
>>108205375
The bar for PhDs is that low, huh?
>>108205505
Lmao, these were exactly my thoughts.
I never took "calc 1" though, so this doesn't prove anything.
>>108205535
>worst case
So that would be a cache miss, right?
How does big O apply to cache misses?
>>108205552
>>108205557
This is how you end up with off-by-1 errors. You're correct in this case, but with a problem like this you always want to verify in a more fool-proof way in case it's one of those times.
>>108205219
Dorothy best Big O.
>>108206797
Am I retarded then? Isn't the optimistic case for array access like this
>we want array[5]
>okay, this array begins at logical address 500
>one piece of data spans 7 bytes
>therefore, array[5] is at logical address 500+5*7 = 535, this arithmetic is O(1) because there is a finite address space
>let's check the TLB
>it's in the TLB! I now found the physical address in O(1) since the TLB has a fixed number of entries
>I can now go to the physical address in O(1) because memory supports random access
>I now read array[5] in O(1) time!
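The addressing step in that walkthrough is one multiply and one add, independent of array length. Sketched in Python with the post's made-up numbers (base address 500, 7-byte elements - both hypothetical):

```python
BASE_ADDR = 500   # hypothetical start address from the post
ELEM_SIZE = 7     # hypothetical bytes per element from the post

def element_address(index):
    # One multiply and one add regardless of array length: O(1).
    # (And, since big O is only an upper bound, vacuously O(n^2) too.)
    return BASE_ADDR + index * ELEM_SIZE

print(element_address(5))  # 535, matching the walkthrough
```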
>>108208895
No, you're just arguing with a troll or bot
>>108205872
yeah, I realized shortly after that that's what his point was (>>108205429)
he's mostly being a pain; your big O is typically going to be the one that limits the upper bound the most, because it gives the most information. Array access being O(1) is a lot more informative and useful for algorithm design than basing it off array access also falling within O(n), O(n^2), O(2^n), etc.
Big Theta isn't necessarily as general - not all algorithms can have it, but most simple ones do. Big O is just all you typically need for most things, since you usually care about worst case performance instead of best case, especially in proofs (but conditions vary, so it's good to know both)
OP not actually explaining things for a mostly simple misunderstanding makes it seem like he's just trolling, and I'm surprised this thread is still active. Shouldn't be that hard for everyone else to figure out, though; I had just woken up but it's quick to see, and it's been explained in the thread a few times already.
>big O
Literally always a top-down approach to problem solving, like most models are. No amount of metrics or stuffing the nomenclature will fix this, and fuck the type of person who wants to.
>>108205219
Big O is what I give your mom regularly.
programming isn't an engineering or science discipline, it's a pop culture
>>108206248
Flash attention was huge because of how it reduced space complexity.
>>108207735
>>the TIME(?????) it takes to look at i is n^2 (which unit? second? burgers per bald eagle???)
If you knew the first thing about big O notation, you would know that the units don't matter, since a change of units only adds a multiplicative factor or a constant term, neither of which alters the big O class.
>care to explain how you reached this insane result?
It's not insane but trivially true. You just need to know what big O notation is and what an array lookup is to see why it's true.
But you don't know what those things are.
>this thread makes no fucking sense, you retards come here pretending billions of people are suddenly wrong about complexity and you refuse to elaborate
Not billions of people, just uneducated codemonkeys, CS people who didn't pay attention in class, and redditors.
The people who actually read textbooks or wikipedia, or pay attention in school, know what big O means.
And nobody's refusing to elaborate; the explanation was already given several times ITT.
>are you even human and not some shitty AI on trial on /g/?
Yes, I am a human.
>the entire internet says you're wrong,
A few uneducated codemonkeys who refuse to read and a few dumb redditors are not "the entire internet", but ok.
>>108207739
The point is not that big O matters a lot in programming, but that a ton of people who think they know what big O means don't have a clue. And this thread proves it perfectly.
>>108208660
To be fair, the current generation of AI models is able to get gold medals in the IMO. That's really impressive.
>>108208895
You don't seem retarded; everything in this post seems correct. Array lookup is O(n^2) though.
>>108208929
Wrong.
>>108209214
>your big O is typically going to be the one that limits the upper bound the most because
Wrong.
>Array access being O(1) is a lot more informative and useful for algorithm design than basing it off array access also falling within O(n), O(n^2), O(2^n) etc.
Wrong.
>Big Theta isn't necessarily as general, not all algorithms can have it
Wrong.
>Big O is just all you typically need for most things since you usually care about worst case performance instead of best case
Big O has nothing to do with worst case.
>OP not actually explaining things for a mostly simple misunderstanding
Yeah, most people ITT have a simple misunderstanding of what big O notation is, including yourself. Which is the whole point of the thread.
>it's been explained in the thread a few times already.
Correct.
>>108205230
>>108205219
OP is vague but reasonably correct. This all stems from the notation using equality instead of set notation (it really is a set of functions)
>>108206630
Would be kinda based if he was
>>108210248
>This all stems from the notation using equality instead of set notation (it really is a set of functions)
Wrong.
It's perfectly reasonable to use equality to write statements like f = O(g). Mathematicians and computer scientists do that all the time. In fact, equality is preferred here because it makes algebraic manipulations easier and clearer.
There's no problem with using equality, just as there is no problem with using plus-minus in equations, to say for example that x+y = 1 ± 2. Perfectly reasonable notation, which in set notation would be written as x+y-1 ∈ {+2, -2}, which is obviously more unwieldy and less pleasant to look at.
Expr_1 = Expr_2 + O(f) simply means logically that there exists g in O(f) s.t. Expr_1 = Expr_2 + g, etc.
>>108207735
>the entire internet says you're wrong, why can't you just explain it simply so us lowly humans and AIs can improve?
Array lookup is probably O(n^2) when you consider truly arbitrary-size arrays on external storage, because you can't even assume you can store the full index without swapping disks/tape.
>>108210280
>>108210248
The problem stems from people not paying attention in school or not actually attempting to read what big O notation means, instead simply going off vibes and their own retarded half-conceived conceptions.
>>108210288
I think in most colleges the informal (wrong in the strict sense) notion is taught, no?
>>108205422
You cannot troll outside >>>/pol/, you have to go back
>>108210335
Please explain how that is trolling. It's true, and you challenging it demonstrates the point of the thread, which is that most people don't know what big O means.
>>108205219
How many times are you gonna post this fucking thread? You've posted it 22 fucking times already.
https://desuarchive.org/g/search/image/gqACSvB_1wU0FOKVpAybYg/
>>108210536
Might post it a few more times.
>>108210280
It's not reasonable. At a fundamental level it's crazy that you're writing that an element of a set equals a set. What you mean is that people are lazy and don't want to do the proper proofs of bounds with functions, then simply state that, as it's bounded, f is an element of O(g). If you carefully define and state how you overload existing operations like + or -, maybe. But you could just state much more clearly that g is some element of O(f), or, even better, some definition of the smallest amortized function in O(f), but nooo, one extra line is too hard. That's what you're really doing. If something is O(1) and you write f + O(1), you don't mean something like f + n!.
>>108205219
>it's this thread again
OP kys
>>108210294
Sometimes they explain that O is really a set, but basically the abused notation is what's taught
>>108210628
We always used proper set operators (∈) in uni
>>108210227
>Big O has nothing to do with worst case
You're actually wrong here. If you dig into the definition, it's defined as the maximum number of steps taken for some string of length n. So there it is a necessary part of the definition.
>>108205360
>if you tell them that array lookup is O(n^2)
they will rightfully tell you that you're retarded
>>108206630
a cache miss is still O(1)
>>108208895
the thing you're debating is an LLM that was proompted to create a thread with the maximum amount of seething (You)s possible
>>108210660
no, it isn't. mathematicians use it all the time with upper bound functions that have wildly faster scaling, because they don't care what the theta is
it's technically correct to say that array lookup is O(2^n), but it's also needlessly obtuse if you're a programmer talking to other programmers
Funny to see people have internalized big O as meaning the biggest factor in time complexity, when really it's satisfied by any function that grows at least as fast in the long run. So yes, obviously indexing an array is O(n^2), but it's also O(1). But also obviously, when talking about big O, the most useful/interesting bound is the lowest proven big O and not some random larger one.
>>108210719
>people internalized big O meaning the biggest factor in time complexity
that's because this is what matters to programmers. programmers don't care about the exact complexity bounds of algorithms, they just need to be able to do a quick mental heuristic check to make sure the way they're doing the thing isn't wildly suboptimal. the standard way to do this is just to know a couple of discrete buckets and learn to tell at a glance which one your implementation falls into
>>108206027
>>108206167
Also another victim of big O equality notation. Linear scan is Ω(1), but not because you can get lucky; it's because f = 1 is an asymptotic lower bound for all other f. The f(n) itself is always the maximum number of steps for an input of length n. In a linear scan f(n) = n, which is trivially Ω(1), and more precisely Ω(n) and Θ(n).
That big theta is the average case is a kind of hand-wavy intuition of what it is. What you're describing is average case analysis. Average case analysis has f(n) = expected value of steps taken over all inputs of length n. This average case analysis for linear scan is f(n) = (1+2+3+...+n)/n, which is (n+1)/2. If you use this different definition of f(n) you can also do bounds analysis. They often converge (if a big theta exists), and have the same intuition, but are different things.
>>108210730
>>108206167
>>108206027
hope you guys are having fun in data structures & algorithms class, because it won't last when it's your turn to enter the job market :^)
>>108210740
It really is just part of the definition. For a function, f(n) = max(steps taken for input of length n). I mean, obviously, if you don't specify this, f(n) can have multiple values. Like for linear scan, f(1000) could be 1 or 50 or 200.
>>108210740
>obviously if you don't specify this
by having a job it's been implicitly specified for you, kiddo. it's not like we didn't have to do the same complexity analysis proofs that you're doing now. but it was a fucking decade ago and nobody cares when you gotta push to prod
>>108210719
Ironically there's a very simple solution: they're all describing big theta.
>>108210610
>It's not reasonable. At a fundamental level it's crazy that you're writing an element of a set equals a set
It's not crazy.
f = O(g) means f equals some element of O(g). Very reasonable.
>What you mean is that people are lazy and don't want to do the proper proofs of bounds with functions
There's nothing improper about the notation. It's rigorous notation. And it allows you to express more complicated things more easily.
For example, the mathematical statement
(sin(x) + O(x^2))^2 = sin^2(x) + O(x^2)
is perfectly valid, but very unwieldy to express using set notation.
>>108210628
It's not abuse of notation.
f = O(g) is perfectly valid notation, and very useful.
>>108210660
I'm not wrong. You can say for example in sorting algorithms that worst case performance is O(n^2) and best case is O(n). O is not used exclusively for the worst case.
>If you dig into the definition, it's defined as the maximum number of steps taken for some string of length n
That's not how it's defined, though. You're imagining a definition that doesn't exist.
>>108210776
You can specify a set of other rules to make it correct, yes. But it's a really big abuse of notation. You are overloading the meaning of every operation, including equals, plus, etc. In your case it would be equally correct, and would not require any carefully constructed additional axioms, to just replace O(x^2) with g, where g is defined as an element of that set. In fact this is how every other field uses set notation; complexity analysis just decides not to do this.
>>108210684
Then you would be wrong.
>>108210719
>when really it's just any function that grows faster in the long run
Wrong.
>O the most useful/interesting is the lowest proven big O and not some random larger big O.
Absolutely wrong.
>>108210755
>Very simple
Good luck proving big theta on any real problem lmao
>>108210798
If you read the foundational work, it was based on f(n) being defined the way I mentioned. Subsequent work did indeed modify f(n) to be a number of other things. The implicit definition uses the original worst case variant. If you deny the implicit definition, of course none of your statements mean anything, because you haven't even defined what f(n) is.
>>108205219
this is literally all correct
>>108210784
Possibly consistent if you add a large number of other, different definitions of equality and so on, yes. Good notation in any way? No. You can see how many misconceptions this has caused among people.
>>108210865
Lol, this is OP. I mean, all those algorithms are indeed elements of the Big O sets specified. The only possible thing wrong is that, by the precise definition, O contains the mentioned functions as elements. But their phrasing is just the translation of the notation used, where there can be equality between a set and an element of that set.