So let's say you want to operate on bits, like 1111 + 1111. But instead of manually calculating them every time, why not store every possible solution? So instead of flipping each individual bit, we just pull the solution from a static register.
In terms of addition it's numerically equivalent
>>16799974
This is already done for certain high-complexity operations.
>>16799974
https://en.wikipedia.org/wiki/Lookup_table?useskin=vector
>>16799974
You're so close, but no, that's not it. Keep trying.
Damn, you're so close but also low key so fucking far away, but so SO CLOSE HOLY SHIT
It's called a lookup table, very normal in embedded programming on bare-bones microcontrollers. It's getting a bit less intense now that an average micro has something like 64 MB of flash and more SRAM than you know what to do with.
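For a flavor of the technique, here's a minimal sketch (names are made up) of a classic embedded-style LUT: a 256-entry table for counting set bits in a byte, built at compile time so it lives in read-only memory.

```cpp
#include <array>
#include <cstdint>

// Build a 256-entry popcount table once, instead of looping over bits
// on every call. Costs 256 bytes of flash/ROM, saves cycles per lookup.
constexpr std::array<std::uint8_t, 256> make_popcount_table() {
    std::array<std::uint8_t, 256> t{};
    for (int i = 0; i < 256; ++i) {
        int n = 0;
        for (int b = i; b != 0; b >>= 1) n += b & 1;
        t[i] = static_cast<std::uint8_t>(n);
    }
    return t;
}

constexpr auto kPopcount = make_popcount_table();

// One array index replaces an 8-iteration bit loop.
inline int popcount8(std::uint8_t x) { return kPopcount[x]; }
```

Needs C++17 for the constexpr `std::array` writes; on a real micro the same table is often just a hand-written `const` array.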
>>16799974
ok, fuck around with logic gates and see if it's faster for addition.
>>16799974
But then they wouldn't be computers anymore. If you don't compute anything ever, what's the point of having a computer?
>>16799974
This is what mathematics does, except the pointer to that register is some F(x) and it (the register) is purely conceptual. But it can still be addressed and accessed.
>>16799974
In a large enough array, it would take significantly more resources to store and look up than to just calculate.
Question for all anons here:
Have you memorized your multiplication tables? How much have you memorized?
>>16801752
Also this is relevant ITT because your brain is a meat computer that you have on you at all times.
>>16801752
I only memorized the square numbers. For other multiplications I just use
[eqn]a \cdot b = \left( \frac{a + b}{2} \right)^2 - \left( \frac{a - b}{2} \right)^2[/eqn]
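That identity is the old "quarter squares" trick. With integer floor division it stays exact even when a+b is odd, since the two remainders cancel; a sketch (hypothetical function name):

```cpp
#include <cstdint>

// Quarter-square multiplication: a*b = floor((a+b)^2/4) - floor((a-b)^2/4).
// Squares are non-negative, so C++ truncating division here equals floor,
// and the identity is exact for all integer inputs.
std::int64_t quarter_square_mul(std::int64_t a, std::int64_t b) {
    std::int64_t s = a + b;
    std::int64_t d = a - b;
    return (s * s) / 4 - (d * d) / 4;
}
```

Historically this was done with a printed (or stored) table of quarter squares, turning every multiplication into two lookups and a subtraction.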
>>16801771
chad move
Hrmm? I thought it was more
{x|φ(x)} {x|x(φ)} {φ|x(φ)} {φ|φ(x)}
but I can't be bothered to care about money or importance or stuff. Should be the actual solution to the P vs NP problem and not whatever fag shit OP is talking about. Please do verify if you must.
>>16799974
https://en.wikipedia.org/wiki/The_Library_of_Babel
>>16801752
How do you guys store theorems and formulas in your brain? Personally, I use a disjoint set (union find) and a bloom filter.
>>16799974
Arithmetic is so simple that there's not really a reason to do that, but older software usually stored the results of trig functions in tables.
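Roughly how those old sine tables worked, as a sketch (256 entries covering one full turn, nearest-entry lookup; real implementations varied in size and interpolation):

```cpp
#include <cmath>
#include <cstddef>

// A 256-entry sine table covering one full turn. Old software traded a
// chunk of memory for skipping the (then expensive) sin() computation.
constexpr std::size_t kN = 256;
constexpr double kTau = 6.283185307179586;  // 2*pi
double g_sine[kN];

void init_sine_table() {
    for (std::size_t i = 0; i < kN; ++i)
        g_sine[i] = std::sin(kTau * static_cast<double>(i) / kN);
}

// Nearest-entry lookup; accuracy is limited by the table resolution.
double fast_sin(double radians) {
    double turns = radians / kTau;
    turns -= std::floor(turns);                       // wrap into [0, 1)
    return g_sine[static_cast<std::size_t>(turns * kN) % kN];
}
```

The table is filled at startup here; in ROM-constrained systems it was often precomputed and baked into the binary instead.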
With C++ you can use the constexpr keyword for expressions that are evaluated at compile time, with the result stored in read-only static memory.
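A minimal sketch of that (C++17, made-up table of cubes):

```cpp
#include <array>
#include <cstdint>

// constexpr forces evaluation at compile time: the whole table of cubes
// is baked into the binary's read-only data, nothing computed at runtime.
constexpr std::array<std::uint32_t, 16> make_cubes() {
    std::array<std::uint32_t, 16> t{};
    for (std::uint32_t i = 0; i < 16; ++i) t[i] = i * i * i;
    return t;
}

constexpr auto kCubes = make_cubes();

static_assert(kCubes[3] == 27, "evaluated by the compiler, not at runtime");
```

The static_assert proving the value is available during compilation is the whole point: the lookup at runtime is just an indexed load from static memory.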
>>16800833
This. SO much this
>>16799974
Classic computational time/space tradeoff.
You can either use up a lot of space (i.e. memory) to store a lookup table, which takes minimal time (i.e. cycles) to compute a function.
Alternatively, you can use up a lot of time (i.e. cycles) to compute intermediate results that get tossed after being used, to minimize space (i.e. memory).
Of course, you can have the worst of both worlds and have a shitty program that eats memory and takes forever (i.e. Python).
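The tradeoff in miniature: memoization spends memory on a cache so results are never recomputed. A sketch using the stock Fibonacci example (naive recursion is exponential time with no extra space; the cache makes it linear in both):

```cpp
#include <cstdint>
#include <unordered_map>

// Space-for-time: cache every result so each value is computed once.
// Naive recursion redoes exponentially much work; the cache costs O(n)
// memory and cuts the time to O(n).
std::uint64_t fib_memo(unsigned n,
                       std::unordered_map<unsigned, std::uint64_t>& cache) {
    if (n < 2) return n;
    auto it = cache.find(n);
    if (it != cache.end()) return it->second;   // table hit: no recomputation
    std::uint64_t v = fib_memo(n - 1, cache) + fib_memo(n - 2, cache);
    cache[n] = v;
    return v;
}
```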
>>16800833
A lookup table is the fastest type of computation (ignoring cache) and has complexity O(1).
Stepping back, computers only do 3 things: some basic math, moving data, and conditional execution based on the results of the previous two.
>>16799974
It's not worth it. A lookup table of n bits would already be 2^n in size. Storing register values as flip-flops costs a lot of energy because of leakage. Parallel prefix adders are the way. They come in many different flavors, depending on your energy/speed/space needs. Don't forget that the adder shares its CPU with many other modules, and they all fight for space, so there's always gonna be some variation, but in general you're looking for O(log n) when implementing an adder.
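The O(log n) shape of a parallel prefix adder can be simulated in software. Here's a sketch of Kogge-Stone-style carry propagation for 8-bit operands (in hardware, each combining round is one layer of gates running in parallel):

```cpp
#include <cstdint>

// Software model of a Kogge-Stone (parallel prefix) adder for 8-bit
// inputs: log2(8) = 3 combining rounds instead of an 8-step ripple carry.
std::uint32_t kogge_stone_add8(std::uint8_t a, std::uint8_t b) {
    std::uint32_t p = a ^ b;   // propagate bits
    std::uint32_t g = a & b;   // generate bits
    std::uint32_t p0 = p;      // keep the original propagate for the sum
    for (int d = 1; d <= 4; d <<= 1) {  // rounds: d = 1, 2, 4
        g |= p & (g << d);     // extend generate across a span of d bits
        p &= p << d;           // merge propagate spans
    }
    return p0 ^ (g << 1);      // sum bits = propagate XOR incoming carries
}
```

Same answer as `a + b`, but the loop runs log2(width) times regardless of how long the carry chain is, which is the whole selling point in silicon.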
>>16799974
why read a book when you could just skim around a dictionary?
Even if you somehow had the means to create such an astronomically huge database of possible solutions and store it somewhere, you'd still need to implement some sort of lookup procedure for every bit operation so that correct solutions could be retrieved from it given any arbitrary pair of integers. I can't imagine any way this could NOT be orders of magnitude slower than just doing the operations on the fly.
To put it in a simple analogy: imagine that there is a monstrously huge library, with uncountable shelves of books containing the result of every addition operation up to the 32-bit limit (around 4 billion). What would be faster: adding 1,391,400 + 149,597,870 normally with a piece of paper and a pencil, or going all the way to the specific book on the specific shelf where the answer is located, flipping through the pages, and retrieving the answer from there? By the time you grabbed a ladder you could already have done the addition on paper.
>>16805445
I know we are just having fun in this thread, but the lookup procedure for addition is simply concatenating the addends and interpreting that as an address/index into the lookup table.
>>16804806
>>16805445
More seriously, though, it might scale badly, but that doesn't mean extremely small tables couldn't be useful (like adding 2-bit numbers).
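The concatenation-as-address idea for 2-bit addends, sketched out:

```cpp
#include <cstdint>

// Full lookup table for adding two 2-bit numbers: 16 entries, indexed by
// concatenating the addends into a 4-bit address (a in the high 2 bits).
constexpr std::uint8_t kAdd2[16] = {
    0, 1, 2, 3,   // 0 + b
    1, 2, 3, 4,   // 1 + b
    2, 3, 4, 5,   // 2 + b
    3, 4, 5, 6,   // 3 + b
};

std::uint8_t add2(std::uint8_t a, std::uint8_t b) {
    return kAdd2[(a << 2) | b];  // the "lookup" is a single indexed load
}
```

This is also essentially what an FPGA LUT does in hardware: the inputs drive the address lines of a tiny memory whose contents encode the truth table.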
>>16801771
>Hey, what's 17x10?
>Why that's easy, it's just (27/2)^2 - (7/2)^2 = 729/4 - 49/4 = 680/4 = 170
>>16801771
but anon, you are multiplying numbers by 1/2
>>16804790
It has complexity O(n), you fucking retard.
>>16805721
oh really? explain your reasoning if you think so.
I was talking about time complexity.
I suppose you are talking about space complexity.
>>16805724
They just discovered that spaghetti code can save space at the expense of time...
https://www.youtube.com/watch?v=8JuWdXrCmWg
This reminds me of that hypothetical distributed filesystem where everyone's data is XORed with everyone else's and accessed with keys pointing to the random-looking data, so that yeah, your hard drive IS filled with CP, but you have no idea or any way to access it, yet you need it to reconstruct your music collection, which may be all pirated, but the people mirroring it have no idea because they don't have the key and are using it to store their holocaust denial information, which you are unknowingly mirroring for them.
https://www.youtube.com/watch?v=8JuWdXrCmWg
>>16805979
Ignore the second link; the URI I meant to have in my paste buffer was this: http://www.madore.org/~david/misc/freespeech.html (re: the filesystem of random illicit data)
>>16804790
[math]\begin{array}{l}\textbf{Input: } \boxed{0} \; \boxed{0} \; \boxed{+} \; \boxed{0} \; \boxed{1}\\[2pt]\textbf{Steps:}\\\quad 1.\ \text{read } 0\\\quad 2.\ \text{read } 0\\\quad 3.\ \text{read } +\\\quad 4.\ \text{read } 0\\\quad 5.\ \text{read } 1\\\quad 6.\ \text{lookup } S(00,01)=01\\[4pt]\textbf{Output: } 01\\\textbf{Input size: } 5\\\textbf{Computation steps: } 6\\\textbf{Time complexity: } \mathcal{O}(n)\end{array}[/math]
>>16805990
When doing complexity analysis, you assume that the inputs have already been read.
>>16804790
>>16805990
>>16806076
This is the most sophomoric trash I've seen related to computer science in a while.
>>16799974
OP, if you want your question answered, go to >>>/g/
>>16799974
Lay out the transistors and wiring for your 4-bit lookup table for addition, subtraction, multiplication, and division. Then compare to an ALU.
>>16804044
The Library of Babel is easy: you just miraculously pick up the book that says exactly what you were looking for and includes instructions to find the next book, which is also exactly correct and also references the next book, ad infinitum (finitum, actually, because there's a limited number of unique books and bookshelf arrangements).
>>16806515
>you assume that the inputs have already been read
???
>yeah, searching for a value in an array is O(1) because the input has already been read!
>>16806521
If you don't assume inputs have already been read, then you can't have sub-linear complexity, i.e. O(1) is impossible, even for a function that e.g. just returns a constant number.
Reading a value at a given address is generally modeled as having constant time complexity O(1). If you put a lookup table at address zero, you don't even have to add a base to the array index. I'm just saying that if you interpret the input as an address, you can look up the value at that address in constant time O(1). That's it. On the other hand, the table will (assuming binary data) have a space complexity of O(2^n).
This is CS 101 shit. Lookup tables are a dirty secret behind all sorts of high-performance code. It just tends to catch people off guard that a lookup is considered a function, because of how simple it is.
>>16806521
Hmm. Decoding binary into unary would have exponential complexity... I therefore propose that all bijective functions are O(1). Simply encode the input as the answer and have the machine do nothing.
QED
>>16804790
>this is the fastest thing if we ignore the slowest element in the process
LOL GENIUS
>>16807186
Feel free to search Google for LUT complexity.
>>16799974
CONGRATS! YOU JUST INVENTED LUTs IN FPGAs IN 1971. WELL FUCKING DONE.