In-memory compute, thoughts?
Retardation
>>106569187
It's inevitable, but only for highly specialized applications (e.g. GPUs, AI).
>>106569187
Computing in memory. It's no longer some hypothetical. Qubit annealing machines like the D-Wave do it, and some NAND vendors are now shipping memory module stacks with a die at the base that has a processor in it. We live in the future. A very gay dystopian future.
>>106569196
I see a use case for medical wearables though. You could have low-power models that detect heart attacks, strokes and other biomarkers early enough to prevent the worst.
>>106569208
AI
>>106569242
You can already have that. The point of in-memory compute is that it's faster, which makes no difference in a medical wearable.
>>106569811
Faster surely equals less computation time, which equals more power saved, though? Especially with less time spent moving data between memory and cache.
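Back-of-the-envelope version of that argument, using the per-operation energy figures people usually cite for ~45 nm silicon. The numbers are rough assumptions, not datasheet values; the ratio is the point:

# Why data movement, not arithmetic, dominates the energy bill.
# Figures are rough, commonly cited ~45 nm estimates (assumptions).
ENERGY_PJ = {
    "32-bit float multiply":          3.7,
    "32-bit read from on-chip SRAM":  5.0,
    "32-bit read from off-chip DRAM": 640.0,
}

N = 1_000_000  # hypothetical workload: one operation/access, a million times

for what, pj in ENERGY_PJ.items():
    print(f"{what:32s} x {N:,} = {N * pj / 1e6:8.1f} uJ")

# Fetching an operand from off-chip DRAM costs ~100x the multiply it feeds,
# so doing the compute next to (or inside) the memory saves more energy than
# making the ALU faster ever could.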
>>106569187
The diagram is hilarious. Before von Neumann dominated everything, the diagram looked the same, except there was a ROM on the other side that stored the program. Problem being, ROM is waaay slower than RAM, so in the early PC days ISA cards would copy their ROM into RAM and execute it there.
> in memory compute
Each SPE core in the PS3's Cell B.E. has its own 256 KB of local memory, which kind of starts to approach the in-memory compute idea. How'd that work out?
>>106569187
I watched a talk about it recently; it does make sense after a certain amount of data.
https://youtu.be/Ct9GhK32tVo?t=667
>>106570169
The first real computers used the Eckert architecture, which stored the program and data together. The only thing that's hilarious is how dumb you are. All PCs are Eckert machines (what idiots like you call von Neumann); they just load their program and data into RAM. It's all stored together. Only misc old computers and microcontrollers kept the program in a separate ROM. ROM is not "waaay slower" than RAM, that's retarded; it depends on your ROM and your RAM. Eckert dominated everything from the beginning. Harvard was always a niche of a niche.
>>106569844
>>106569811
Longer-lasting medical wearables go a long way.
>>106570945
Bullshit, queer. In-memory compute has nothing to do with microcontroller SoCs with smol buses that sip microwatts. This is about putting a third of your system memory across a PCIe bus, next to a huge SIMD core.
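Rough numbers for why that matters, assuming a hypothetical 64 GB box and nominal link/device bandwidths (all of these are assumptions, not measurements of any real part):

# Why you bolt the SIMD core onto the memory instead of hauling data over.
# Hypothetical 64 GB system; bandwidths below are nominal assumptions.
data_gb       = 64 / 3     # "a third of your system memory"
pcie4_x16_gbs = 25.0       # ~practical PCIe 4.0 x16 throughput, GB/s
on_device_gbs = 400.0      # assumed aggregate bandwidth inside the memory stack

t_over_bus = data_gb / pcie4_x16_gbs
t_in_place = data_gb / on_device_gbs

print(f"streaming {data_gb:.1f} GB over PCIe:     {t_over_bus:.2f} s per pass")
print(f"scanning it where it already lives: {t_in_place:.2f} s per pass")
# The arithmetic is the same either way; the win is never crossing the bus.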
>>106570945
Huh
We should put CPUs in every component, from sound cards to input devices, until a computer no longer needs a central CPU to run, like an insect. The operating system should be able to fall back to any HDD microcontroller, through the firmware blob, for ring 0 instructions.
ask an llm, retard
>>106570983
>let's make a non-distributed problem distributed
This is why web retards are constantly making their own lives harder. I hope you rope yourself of your own volition.
>>106570983
>We should put CPUs in every component, from sound cards to the input devices, until a computer no longer needs the CPU to run, like an insect
Insects aren't known for being very smart, so I think your idea is retarded. Maybe you are an insect too.
>>106569187
https://arxiv.org/abs/2105.12839
>>106570983
>>106571052
>>106571325
>We should put CPUs in every component, from sound cards to the input devices
Anons... not sure if you retards know this, but most devices, even fucking CABLES, have CPUs nowadays. Check this (decade-old) HDD hack:
https://spritesmods.com/?art=hddhack&page=3
Just make the L1 cache bigger and load the whole program into it.
>>106571417
How do you people find random papers, unless you're promoting yourself?
>>106571484
sqt is that way, anon
>>106570169
https://arxiv.org/pdf/2509.08542
You couldn't be more wrong.
>>106571439
Yeah, and in the operating systems community it's widely debated whether this is a good thing. It means that every component is individually responsible for synchronization, which is in fact more complicated and slower. Appealing to authority (in this case precedent) is a logical fallacy.
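If you want the coordination tax in miniature: summing a list in one process vs. farming chunks out to worker processes. Sizes and worker count are made up; the point is only that spawning, serializing and joining can cost more than the work being distributed:

# Toy illustration of coordination overhead: a job that needs no
# distribution, distributed anyway. Sizes and worker count are arbitrary.
import time
from multiprocessing import Pool

def chunk_sum(chunk):
    return sum(chunk)

if __name__ == "__main__":
    data = list(range(2_000_000))

    t0 = time.perf_counter()
    local = sum(data)                       # one core, no coordination
    t1 = time.perf_counter()

    workers = 4
    step = len(data) // workers
    chunks = [data[i * step:(i + 1) * step] for i in range(workers)]
    with Pool(workers) as pool:
        spread = sum(pool.map(chunk_sum, chunks))  # spawn + pickle + IPC + join
    t2 = time.perf_counter()

    assert local == spread
    print(f"single process: {t1 - t0:.3f} s")
    print(f"{workers} workers     : {t2 - t1:.3f} s (includes all the coordination)")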
>>106571868
>appealing to authority (in this case precedent) is a logical fallacy.
Are you retarded or just pretending? What "authority" are you talking about? It's a reality now; there is nothing to discuss because there is no point in arguing against reality. Manufacturers are already making things work like that. Even USB-C cables have microprocessors in them. Who cares about your discussion about theory?