SyntaxBomb - Indie Coders
General Category => General Discussion => Topic started by: Matty on January 03, 2021, 07:25:20 PM
-
Alternative compression algorithm
Take a datastream; let's call it A.
Record the length of A in bytes and store it in output file OUT.
Generate n random-number sequences, each with a unique seed.
For bytes a to b of A, beginning with a = 1 and b = length of A, look for a same-length section of one of the random sequences that equals A[a..b]. If none is found, reduce b by 1 and try again.
If a match is found:
Record the seed s, the start of the matching section s0, and its length s1 in output file OUT.
Move the pointer in A past the match: set a = b + 1 and b = length of A.
Repeat the process until the end of A is reached.
Given sufficient compression time, the file ought to be smaller.
Note: it should even be able to compress white-noise datastreams.
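The loop above can be sketched in Python. This is only a toy under assumed parameters the post doesn't specify: the number of seeds, the amount of PRNG output searched per seed, a cap on match length, and a literal fallback for bytes that never appear in any stream.

```python
import random

WINDOW = 4096  # bytes of PRNG output generated per seed (illustrative choice)

def prng_bytes(seed, n=WINDOW):
    # One reproducible pseudo-random byte sequence per seed.
    rng = random.Random(seed)
    return bytes(rng.randrange(256) for _ in range(n))

def compress(data, num_seeds=4, max_run=16):
    streams = {s: prng_bytes(s) for s in range(num_seeds)}
    tokens = [len(data)]                     # record the length of A first
    a = 0
    while a < len(data):
        match = None
        # Try the longest candidate run first and shrink it,
        # as in the "reduce b by 1 and try again" step above.
        for run in range(min(max_run, len(data) - a), 0, -1):
            chunk = data[a:a + run]
            for seed, stream in streams.items():
                pos = stream.find(chunk)
                if pos != -1:
                    match = (seed, pos, run)  # the (s, s0, s1) triple
                    break
            if match:
                break
        if match is None:
            # Not even a 1-byte match in any stream: store a literal.
            match = ('lit', data[a], 1)
        tokens.append(match)
        a += match[2]
    return tokens

def decompress(tokens):
    streams = {}
    out = bytearray()
    for seed, pos, run in tokens[1:]:
        if seed == 'lit':
            out.append(pos)
        else:
            if seed not in streams:
                streams[seed] = prng_bytes(seed)
            out += streams[seed][pos:pos + run]
    assert len(out) == tokens[0]
    return bytes(out)
```

Note that each (seed, start, length) triple itself costs space once encoded, so whether the output is actually smaller depends on the typical match length versus the size of a triple; for incompressible input the triples tend to cost at least as much as the raw bytes they replace.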
-
Google has been using its Deep Learning technology to improve compression of its videos on YouTube. Not sure how fast this is going to show results. A scientist in an article somewhere said that CPUs need to be a billion billion times faster for current AI to be effective.
I have some brute-force pathfinding code which gets really slow on paths of more than a few steps.
Just imagine alien civilizations having super computing power and solving and simulating everything they can imagine!
-
Just imagine alien civilizations having super computing power and solving and simulating everything they can imagine!
Imagine if we were in one of these simulations, and came from the imagination of one of those things "out there"... I can only conclude, by looking around our world, that these all-powerful things are... complete and utter twats with the poorest of humour! :D
Dabz
-
I don't know - if so, they certainly generated a lot of funny British comedies over the years. ...
-
Very true Matty! ;)
Dabz