TheDudeV2@lemmy.ca (OP) to Technology@lemmy.world • Neuralink looks to the public to solve a seemingly impossible problem (English)
5 months ago

I'm not an Information Theory expert, but I am aware that, no matter how clever one hopes to be, there is a theoretical limit on how far any given set of information can be compressed; this is especially true for the lossless compression this challenge demands.
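That theoretical limit is Shannon's source coding theorem: a lossless code can't average fewer bits per symbol than the source's entropy. As a sketch (my own illustration, not from the article), here's the entropy of a single bit for different biases:

```python
import math

def entropy_bits(p_one: float) -> float:
    """Shannon entropy of one bit with P(1) = p_one, in bits."""
    h = 0.0
    for p in (p_one, 1.0 - p_one):
        if p > 0:
            h -= p * math.log2(p)
    return h

# A fair (fully random) bit carries a full bit of information,
# so no lossless code can average fewer than 1 bit per symbol.
print(entropy_bits(0.5))  # 1.0

# A biased bit (P(1) = 0.9) carries ~0.47 bits, so a stream of
# such bits is compressible to roughly half its raw size -- but
# no further.
print(entropy_bits(0.9))
```

The takeaway: the compressibility of a signal is fixed by its statistics, not by the cleverness of the compressor.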
Quote from the article:
The skepticism is well-founded, said Karl Martin, chief technology officer of data science company Integrate.ai. Martin’s PhD thesis at the University of Toronto focused on data compression and security.
Neuralink’s brainwave signals are compressible at ratios of around 2 to 1 and up to 7 to 1, he said in an email. But 200 to 1 “is far beyond what we expect to be the fundamental limit of possibility.”
I can try to explain, but there are people who know much more about this than I do, so hopefully someone more knowledgeable will step in to check my work.
What does ‘random’ or ‘noise’ mean? In this context, ‘random’ means that any given bit of information is equally likely to be a 1 or a 0. ‘Noise’ means a collection of information that is either random or unimportant/non-useful.
So, you say “Compression saves on redundant data”. If we think that through with the definitions above, we can reason that ‘random noise’ either contains no redundant information to remove (because it’s random), or contains information that isn’t useful in the first place (because it’s noise). Either way, lossless compression has almost nothing to work with.
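You can see this directly with any general-purpose lossless compressor. This quick sketch (using Python's zlib as a stand-in for whatever codec you like) compares random bytes against highly redundant bytes:

```python
import os
import zlib

def ratio(data: bytes) -> float:
    """Compression ratio (original size : compressed size) using zlib."""
    return len(data) / len(zlib.compress(data, level=9))

random_data = os.urandom(100_000)   # no redundancy to exploit
redundant_data = b"abc" * 33_000    # extremely redundant

# Random data won't compress at all (the "compressed" copy is
# actually slightly larger, since the format adds overhead).
print(f"random:    {ratio(random_data):.2f}:1")

# Redundant data compresses by hundreds to one.
print(f"redundant: {ratio(redundant_data):.2f}:1")
```

That's why a 200-to-1 lossless target on a noisy signal raises eyebrows: you only get ratios like that when the data is mostly redundancy, and brainwave recordings apparently aren't.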
I think that’s what the person is describing. Does that help?