This repository was archived by the owner on Sep 13, 2025. It is now read-only.
One idea that's been batted around is creating "multi-trit" codecs, where a binary value can be represented using a variable number of trits (in the same way that UTF-8 may use 1, 2, 3 or 4 bytes to represent a particular character).
Here's an example, put forward (as a thought experiment) by @paulhandy:
> I commented here iotaledger-archive/iota.js#130 regarding a Unicode encoding scheme, in which I suggested that perhaps the first trit of a sequence could give the length. For example, we could define [0+-] as [end, 3, 6] trits, so that any byte value up to 3^4 could be represented in 4 trits, with a maximum of 7 trits. I'm not sure this is best or anything, but something like this could be considered.
I feel like this might be an interesting approach to the "square peg / round hole" problem.
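To make the idea concrete, here is a minimal sketch of such a length-prefixed trit codec. Everything below is hypothetical: it assumes the mapping from the quote (first trit `0` = end marker, `+1` = 3 payload trits, `-1` = 6 payload trits) and uses unbalanced payload digits 0–2 for simplicity, whereas a real IOTA codec would presumably use balanced trits.

```python
# Hypothetical sketch of the variable-length trit scheme quoted above.
# The first trit selects the payload length: 0 = end marker,
# +1 = 3 payload trits, -1 = 6 payload trits (4 or 7 trits total).

def encode_value(n):
    """Encode a non-negative integer as a length-prefixed trit list."""
    if n < 3 ** 3:                 # fits in 3 unbalanced trits (0..26)
        prefix, width = [1], 3
    elif n < 3 ** 6:               # fits in 6 unbalanced trits (0..728)
        prefix, width = [-1], 6
    else:
        raise ValueError("value too large for this toy scheme")
    payload = []
    for _ in range(width):
        payload.append(n % 3)      # least-significant trit first
        n //= 3
    return prefix + payload

def decode_values(trits):
    """Decode a stream of length-prefixed trit sequences."""
    values, i = [], 0
    while i < len(trits):
        if trits[i] == 0:          # end marker: stop decoding
            break
        width = 3 if trits[i] == 1 else 6
        i += 1
        chunk = trits[i:i + width]
        values.append(sum(t * 3 ** k for k, t in enumerate(chunk)))
        i += width
    return values
```

For example, `decode_values(encode_value(20) + encode_value(200) + [0])` recovers `[20, 200]`: 20 fits in the short 4-trit form, while 200 forces the 7-trit form, mirroring how UTF-8 spends fewer bytes on smaller code points.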