>"This is similar to the description of how the Dungeon Master fuzzy bits are written. As can be seen in the screenshot, the 0x88 data bytes soon start reading back incorrectly and non-deterministically. But the variance isn't 100% random like weak bits -- the variance is whether the 0x8 bit is late enough to have a chance of being missed. If missed, you can still eyeball that there are patterns and themes to the madness."
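The quoted behaviour can be sketched in a few lines of Python. This is only an illustration of the principle, not the actual drive hardware: a transition is written at a finer time resolution than the reader can resolve (parked between two bit-cell centres), and read-to-read analogue jitter makes the reader snap it to one cell or the other nondeterministically. All timings and the jitter model here are made up for illustration.

```python
import random

# Illustrative sketch of a "fuzzy" transition: written between two valid
# bit-cell centres, so a coarse reader must round it one way or the other.

CELL = 4.0      # assumed reader resolution: one bit cell (arbitrary units)
JITTER = 0.3    # assumed read-to-read timing noise, as a fraction of a cell

def write_fuzzy_transition():
    # Writer resolution is finer than a cell: park the transition exactly
    # halfway between cell centre 1 (t=4.0) and cell centre 2 (t=8.0).
    return 6.0

def read_transition(position):
    # Each read sees the transition with some analogue jitter, then snaps
    # it to the nearest whole cell the hardware can resolve.
    observed = position + random.gauss(0, JITTER * CELL)
    return round(observed / CELL)

pos = write_fuzzy_transition()
print([read_transition(pos) for _ in range(10)])  # a mix of 1s and 2s
```

The same written data reads back differently on every pass, yet the variance is confined to the two adjacent cells -- patterned, not fully random, much like the screenshot described above.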
You know, Dungeon Master and copy protection aside, my intuition says there is some as-yet unobserved, not-yet-understood similarity between the concept of "fuzzy bits" on old-school floppy disks and that of qubits from relatively new-school quantum physics...
Perhaps qubits could be constructed in the same way fuzzy bits are -- use a device which writes multiple pieces of data at a higher resolution -- then measure that data with a device that reads at a lower resolution.
Applied to qubits, then: whatever the particle size, whatever the region of space the qubit occupies, attempt to "write" one by writing multiple points within that same space, using a device that can modify smaller points within it (if the qubit is a particle, these would be sub-particles). Then read it with a device that reads the entire space. That read should now be probabilistic, because the reader lacks the resolution of those smaller points necessary for a true, repeatable reading...
I could be completely wrong about this, of course.
But intuitively, I sense something there...
>"The above results are actually the application of fuzzy bit principles to FM encoded data. In FM encoding, every data bit is interleaved with a clock bit. This results in the bleeding of clock bits in to the data stream on occasion (see the 0xFF bytes in the first run above -- they are likely clock bits). The Dungeon Master protection uses fuzzy bits in conjunction with MFM. This leads to a calmer situation where the fuzzy bit drifts between two valid data bit encodings and does not mess up the clock!"
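The FM/MFM distinction in that quote is easy to see by encoding the same data bits both ways. A minimal sketch (standard encoding rules, simplified to ignore byte-boundary state): in FM every data bit is preceded by an unconditional clock pulse, so a mistimed fuzzy transition can land where a clock bit belongs; in MFM a clock pulse is inserted only between two zero data bits, which is why a fuzzy data bit can drift between two valid encodings without disturbing the clock.

```python
def fm_encode(bits):
    # FM: unconditional clock bit (always 1) before every data bit.
    out = []
    for b in bits:
        out += [1, b]
    return out

def mfm_encode(bits):
    # MFM: clock bit is 1 only between two consecutive 0 data bits.
    out = []
    prev = 0
    for b in bits:
        clock = 1 if (prev == 0 and b == 0) else 0
        out += [clock, b]
        prev = b
    return out

data = [1, 0, 0, 0, 1, 0, 0, 0]   # the 0x88 pattern discussed above
print(fm_encode(data))
print(mfm_encode(data))
```

Note how the FM stream is dense with clock pulses while the MFM stream has far fewer transitions, leaving room for a fuzzy data bit to wander without colliding with the clock.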
I wonder what could be learned in physics if we tried similar techniques with small particles, small regions of space, or small regions on substrates...
Anyway, physics aside, a truly fascinating article!
> Perhaps qubits could be constructed in the same way fuzzy bits are -- use a device which writes multiple pieces of data at a higher resolution -- then measure that data with a device that reads at a lower resolution.
If I understand correctly, you mean all that as a way to achieve quantum computing more cheaply. As far as I know, the effects used and needed in quantum computing aren't just the "randomness" of reading something but much more. Physicists do attempt to achieve "quantum simulation" effects as cheaply as possible, but until now they have had to depend on complex setups:
"These quantum devices can be implemented in a large number of ways, for example, using ultracold trapped ions [6-11], cavity quantum electrodynamics (QED) [12-15], photonic circuits [16-18], silicon quantum dots [19-21], and theoretically even by braiding, as yet unobserved, exotic collective excitations called non-abelian anyons [22-24]. One of the most promising approaches is using superconducting circuits [25-27]"
It doesn't appear to me that the "unreliable bits" read from a magnetic medium could be enough. (Disclaimer: not in that field.) However, it's always interesting to learn more while trying to make an idea work.