/sci/ - Science & Math

File: 1729556099957437.png (310 KB, 467x260)
So let's say you want to operate on bits, like
1111 + 1111

But instead of manually calculating them every time, why not store every possible solution?

So instead of flipping each individual bit, we just pull the solution from a static register
>>
In terms of addition it’s numerically equivalent
>>
>>16799974
This is already done for certain high complexity operations.
>>
>>16799974
https://en.wikipedia.org/wiki/Lookup_table?useskin=vector
>>
>>16799974
You're so close but no that's not it keep trying
Damn you're so close but also low key so fucking far away but so SO CLOSE HOLY SHIT
>>
It's called a lookup table, very normal in embedded programming on bare-bones microcontrollers. It's getting a bit less intense now that an average micro has something like 64 MB of flash and more SRAM than you know what to do with.
>>
>>16799974
ok fuck around with logic gates and see if it's faster for addition.
>>
>>16799974
but then they wouldn't be computers anymore. if you don't compute anything ever, what's the point of having a computer?
>>
>>16799974
This is what mathematics does, except the point to that register is some F(x) and it(the register) is purely conceptual. But it can be addressed and accessed.
>>
>>16799974
In a large enough array, it would take significantly more resources to store and lookup than to just calculate.
>>
File: IMG_4434.jpg (750 KB, 1600x1600)
Question for all anons here:
Have you memorized your multiplication tables? How much have you memorized?
>>
>>16801752
Also this is relevant ITT because your brain is a meat computer that you have on you at all times.
>>
>>16801752
I only memorize the square numbers. For other multiplications I just use:
[eqn]a \cdot b = \left( \frac{a + b}{2} \right)^2 - \left( \frac{a - b}{2} \right)^2[/eqn]
>>
>>16801771
chad move
>>
Hrmm? I thought it was more {x|Ο†(x)} {x|x(Ο†)} {Ο†|x(Ο†)} {Ο†|Ο†(x)}

but I can't be bothered to care about money and/or importance or stuff. Should be the actual solution to the P vs NP problem and not whatever fag shit OP is talking about. Please do verify if you must.
>>
>>16799974
https://en.wikipedia.org/wiki/The_Library_of_Babel
>>
>>16801752
How do you guys store theorems and formulas in your brain? Personally, I use a disjoint set (union find) and a bloom filter.
>>
>>16799974
Arithmetic is so simple that there's not really a reason to do that, but older software usually stored the results of trig. functions in tables
>>
With C++ you can use the constexpr keyword for expressions that are evaluated at compile time; the result is stored in read-only static memory.
>>
>>16800833
This. SO much this
>>
File: 1756147352509076.jpg (69 KB, 640x743)
>>16799974
classic computational time/space tradeoff
you can either use up a lot of space (i.e. memory) to store a lookup table which takes minimal time (i.e. cycles) to compute a function
alternatively, you can use up a lot of time (i.e. cycles) to compute intermediate results that get tossed after being used to minimize space (i.e. memory)
of course, you can have the worst of both worlds and have a shitty program that eats memory and takes forever (i.e. python)
>>
>>16800833
a lookup table is the fastest type of computation (ignoring cache effects) and has complexity O(1)
stepping back, computers only do 3 things: some basic math, moving data, and conditional execution based on the results of the previous two
>>
>>16799974
It's not worth it. A lookup table of n bits would already be 2^n in size. Storing register values as flip-flops costs a lot of energy because of leakage. Parallel prefix adders are the way. They come in many different flavors, depending on your energy/speed/space needs. Don't forget that the adder shares its CPU with many other modules, they all fight for space, so there's always gonna be some variation, but in general you're looking for O(log n) when implementing an adder.
>>
>>16799974
why read a book when you could just skim around a dictionary?
>>
Even if you somehow had the means to create such an astronomically huge database of possible solutions and store it somewhere, you'd still need to implement some sort of lookup procedure for every bit operation so that correct solutions could be retrieved from it given any arbitrary pair of integers. I can't imagine any way this could NOT be orders of magnitude slower than just doing the operations on the go.

To put it in a simple analogy: imagine that there is a monstrously huge library, with uncountable shelves of books containing the result for every addition operation up to the 32-bit limit (around 4 billion). What would be faster, adding 1,391,400 + 149,597,870 normally with a piece of paper and a pencil, or going all the way to the specific book from the specific shelf where the answer is located, flipping through the pages, and retrieving the answer from there? By the time you grabbed a ladder you could already have done that addition on paper.
>>
>>16805445
i know we are just having fun in this thread, but the look up procedure for addition is simply concatenating the addends and interpreting that as an address/index into the lookup table

>>16804806
>>16805445
more seriously, though, it might scale badly, but that doesn't mean extremely small tables could not be useful (like adding 2bit numbers)
>>
>>16801771
>Hey, what's 17x10?
>Why that's easy, it's just (27/2)^2 - (7/2)^2 = 729/4 - 49/4 = 680/4 = 170
>>
>>16801771
but anon, you are multiplying numbers by 1/2
>>
>>16804790
it has complexity O(n) you fucking retard.
>>
>>16805721
oh really? explain your reasoning if you think so.
>>
i was talking about time complexity
i suppose you are talking about space complexity
>>
>>16805724
They just discovered that spaghetti code can save space at the expense of time... https://www.youtube.com/watch?v=8JuWdXrCmWg

This reminds me of that hypothetical distributed filesystem where everyone's data is XORed with everyone else's and accessed with keys pointing to the random-looking data, so that yeah, your hard drive IS filled with CP but you have no idea or any way to access it, but you need it to reconstruct your music collection, which may be all pirated but the people mirroring it have no idea because they don't have the key and are using it to store their holocaust denial information which you are unknowingly mirroring for them.

https://www.youtube.com/watch?v=8JuWdXrCmWg
>>
>>16805979
Ignore the second link, the URI I meant to be in my paste buffer was this: http://www.madore.org/~david/misc/freespeech.html re: the filesystem of random illicit data
>>
>>16804790
[math]
\begin{array}{l}
\textbf{Input: } \boxed{0} \; \boxed{0} \; \boxed{+} \; \boxed{0} \; \boxed{1}\\[2pt]
\textbf{Steps:}\\
\quad 1.\ \text{read } 0\\
\quad 2.\ \text{read } 0\\
\quad 3.\ \text{read } +\\
\quad 4.\ \text{read } 0\\
\quad 5.\ \text{read } 1\\
\quad 6.\ \text{lookup } S(00,01)=01\\[4pt]
\textbf{Output: } 01\\
\textbf{Input size: } 5\\
\textbf{Computation steps: } 6\\
\textbf{Time complexity: } \mathcal{O}(n)
\end{array}
[/math]
>>
>>16805990
when doing complexity analysis, you assume that the inputs have already been read
>>
>>16804790
>>16805990
>>16806076
This is the most sophomoric trash I've seen related to computer science in a while.
>>16799974
OP, if you want your question answered, go to >>>/g/
>>
>>16799974
Lay out the transistors and wiring for your 4-bit lookup table for addition, subtraction, multiplication and division. Then compare to an ALU.
>>
>>16804044
Library of Babel is easy, you just miraculously pick up the book that says exactly what you were looking for and includes instructions to find the next book, which is also exactly correct and also references the next book ad infinitum (finitum, really, since there's a limited number of unique books and bookshelf arrangements).
>>
>>16806076
>you assume that the inputs have already been read
???
>yeah, searching for a value in an array is O(1) because the input has already been read!
>>
>>16806515
if you don't assume inputs have already been read, then you can't have sub-linear complexity i.e. O(1) is impossible, even for a function that e.g. just returns a constant number.

reading a value at a given address is generally modeled as having constant time complexity O(1). if you put a lookup table at address zero, you don't have to add a base to the array index. i'm just saying that if you interpret the input as an address, you can look up the value at that address in constant time O(1). that's it. on the other hand, the table will (assuming binary data) have a space complexity of O(2^n).

this is cs 101 shit. lookup tables are a dirty secret behind all sorts of high performance code. it just tends to catch people off guard that a lookup is considered a function because of how simple it is.
>>
>>16806521
Hmm, decoding binary into unary would have exponential complexity... I therefore propose that all bijective functions are O(1). Simply encode the input as the answer and have the machine do nothing.

QED
>>
>>16804790
>this is the fastest thing if we ignore the slowest element in the process
LOL GENIUS
>>
>>16807186
feel free to search google for LUT complexity
>>
File: m608.png (176 KB, 284x258)
>>16799974
CONGRATS!
YOU JUST INVENTED LUTs IN FPGAs IN 1971.
WELL FUCKING DONE.


