What is the fastest way to check that I've orthonormalized a basis? I usually have to do O(n^2) dot products of size n to verify that the cross terms are all 0 (and the self-products are all 1). Is there a faster way, and what kind of math do I have to study to prove that there is or isn't?
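For reference, the check described above amounts to building the Gram matrix V^T V and comparing it to the identity. A minimal NumPy sketch (function name and tolerance are my own, not from the thread):

```python
import numpy as np

def is_orthonormal(V, tol=1e-10):
    """Check that the columns of V form an orthonormal set.

    Computes the full Gram matrix V^T V (the n^2 dot products of
    size n described above) and compares it to the identity.
    """
    G = V.T @ V
    return np.allclose(G, np.eye(V.shape[1]), atol=tol)

# An orthonormal basis from a QR factorization of a random matrix.
Q, _ = np.linalg.qr(np.random.rand(4, 4))
print(is_orthonormal(Q))      # True
print(is_orthonormal(2 * Q))  # False: scaled columns aren't unit length
```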
>>16452449
do it well so you don’t have to check
>>16452464
>do it well
I want to do it fast. My method of checking being the fastest would imply that Gram-Schmidt is the fastest way. This is O(n^3), not to mention all the fast inverse square roots I have to do.
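For concreteness, here is a sketch of classical Gram-Schmidt showing the O(n^3) double loop and the per-column normalization (the square-root step the post complains about). This is an illustration, not anyone's posted code:

```python
import numpy as np

def gram_schmidt(V):
    """Classical Gram-Schmidt on the columns of V.

    Two nested loops over n columns, each doing size-n dot products:
    O(n^3) total, plus one normalization (square root) per column.
    """
    Q = np.zeros_like(V, dtype=float)
    for j in range(V.shape[1]):
        q = V[:, j].astype(float)
        for i in range(j):
            # subtract the projection onto each earlier basis vector
            q -= (Q[:, i] @ V[:, j]) * Q[:, i]
        Q[:, j] = q / np.linalg.norm(q)  # the square-root step
    return Q

Q = gram_schmidt(np.array([[1., 1.], [0., 1.]]))
```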
>>16452473
https://en.wikipedia.org/wiki/Gram%E2%80%93Schmidt_process
just check wikipedia. It has all the references. Besides, don’t bother with all this crap when numerical linear algebra libraries like Eigen already exist. I can guarantee you that they have implementations that run faster than yours.
>>16452481
My pursuit is theoretical. I don't care about libraries. I just want to be sure there's no faster way.
>>16452449
>Is there a faster way, and what kind of math do I have to study to prove that there is or isn't?
No, there is no faster way. The kind of math is differential geometry. Consider the following. Orthonormality properties of a coordinate system are entirely contained in its metric tensor [math]g_{ij}=v_i\cdot v_j[/math]. If the metric tensor is the identity matrix, then you have an orthonormal basis. Since this is a rank-2 tensor, it has O(N^2) entries, i.e. it requires O(N^2) dot products to construct.
>>16452493
>I just want to be sure there's no faster way
nigga, people still don't know the lower bounds of the complexity of matrix multiplication
also, it's important to know how big the basis is, as that may influence which algorithm is best
>>16452473
this is a numerically unstable implementation of GS, not good to program on a computer except for the smallest of bases
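The stable variant this post is alluding to is usually taken to be modified Gram-Schmidt, which has the same O(n^3) cost but orthogonalizes against the already-computed columns so rounding errors don't compound. A hedged sketch (my own, not from the thread):

```python
import numpy as np

def modified_gram_schmidt(V):
    """Modified Gram-Schmidt on the columns of V.

    Same asymptotic cost as the classical version, but each remaining
    column is updated against the freshly normalized column, which is
    much better behaved in floating point.
    """
    Q = V.astype(float).copy()
    n = Q.shape[1]
    for j in range(n):
        Q[:, j] /= np.linalg.norm(Q[:, j])
        for k in range(j + 1, n):
            # remove the component of column k along column j
            Q[:, k] -= (Q[:, j] @ Q[:, k]) * Q[:, j]
    return Q

rng = np.random.default_rng(0)
V = rng.random((5, 5)) + np.eye(5)  # a generic nonsingular matrix
Q = modified_gram_schmidt(V)
```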
>>16452539
Ok, I'll look into it. I was honestly expecting something related to automata.
>>16452554
If you aren't convinced by my post, I can justify why the metric tensor must be of rank 2. The inner product of two arbitrary vectors is given by [math](a,b) = \sum_{i=1}^N\sum_{j=1}^N g_{ij}a_ib_j[/math]. The inner product is invariant under coordinate transformations. If the metric were given by a rank-1 tensor instead, contracting it with the two vectors would leave one index free, so the resulting expression for the inner product would transform as a vector, which isn't invariant. Contradiction.
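As a quick numeric check of the formula in this post, one can verify that contracting coordinates with [math]g_{ij}[/math] built from a non-orthonormal basis reproduces the ordinary Cartesian dot product (illustration only; all names are mine):

```python
import numpy as np

# Inner product via the metric: (a, b) = sum_ij g_ij a_i b_j,
# with g_ij = v_i . v_j built from a non-orthonormal basis B.
B = np.array([[1., 1.],
              [0., 1.]])          # basis vectors as columns
g = B.T @ B                       # metric tensor g_ij = v_i . v_j

a_coords = np.array([1., 2.])     # coordinates of a in basis B
b_coords = np.array([3., -1.])    # coordinates of b in basis B

via_metric = a_coords @ g @ b_coords
via_cartesian = (B @ a_coords) @ (B @ b_coords)  # same vectors, Cartesian
```

Both expressions evaluate the same invariant scalar, which is the coordinate-independence the post appeals to.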
>>16452539
no one determines if they have an orthonormal basis this way on a computer
>>16452562
I don't care and neither does OP (>>16452493). This is a mathematical proof. Fuck off to >>>/g/ if you don't like it.
>>16452555
So I have to take O(n^2) dot products, and there's no transformation that will do any better. Got it. But if I'm working in a finite field, I bet there's a fast way.
>>16452565
Oh, very interesting. I believe vector spaces over finite fields aren't metric spaces, no? I'm not even sure one can come up with a well-defined notion of an inner product on such spaces. Such definitions fail even for the rationals, because the space isn't complete in the topological sense. So it seems that this problem is ill-posed, because there can be no notions of orthogonality and normality in such vector spaces.
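One concrete obstruction, for what it's worth: over a finite field the standard bilinear form admits nonzero self-orthogonal vectors, so "norm" can't behave as an inner-product norm should. A small example over GF(5) (my example, not from the thread):

```python
# Over GF(5), the vector (1, 2) is nonzero yet "orthogonal to itself":
# 1*1 + 2*2 = 5 = 0 (mod 5). The standard bilinear form on a finite
# field is not positive definite, so length and orthonormality lose
# their usual meaning.
p = 5

def dot_mod(a, b, p=p):
    """Standard bilinear form, reduced mod p."""
    return sum(x * y for x, y in zip(a, b)) % p

v = (1, 2)
print(dot_mod(v, v))  # 0, even though v != (0, 0)
```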
>>16452550
>nigga, people still don't know the lower bounds of the complexity of matrix multiplication
We practically do, it's O(irrelevant) because the algorithms are galactic anyway.