This repository was archived by the owner on Sep 13, 2025. It is now read-only.
How does the process of encoding trits into bytes work when the trits represent a negative value?
For example, consider the following sequence of little-endian* trits:
00++000++--+- (bal3)
These trits represent the value -424242₁₀.
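As a sanity check, the little-endian trit string can be decoded with a short sketch (Python here; the helper name `trits_to_int` is my own, not from any library):

```python
def trits_to_int(trits: str) -> int:
    """Decode a little-endian balanced-ternary string ('-', '0', '+') to an int."""
    value = 0
    for i, t in enumerate(trits):
        # Trit at position i carries weight 3**i (little-endian).
        value += {"-": -1, "0": 0, "+": 1}[t] * 3**i
    return value

print(trits_to_int("00++000++--+-"))  # → -424242
```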
How do we determine the length of the resulting binary sequence? +424242₁₀ can be represented using 19 bits (1100111100100110010₂). But should the encoded result occupy exactly 19 bits, a whole number of bytes, or a fixed width such as 64 bits (a signed long)?*

*In this context, assume little-endian trits and bits; that's a whole 'nother can of worms!
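One way to compare the candidate widths is to look at the minimal two's-complement size the value needs. A sketch (the helper `twos_complement_width` is hypothetical, for illustration only): 19 bits cover the magnitude, but a sign-carrying two's-complement encoding of -424242 needs at least 20 bits, so a byte-aligned encoding would round up to 24 bits (3 bytes).

```python
def twos_complement_width(n: int) -> int:
    """Minimal number of bits to hold n in two's complement (hypothetical helper)."""
    if n >= 0:
        return n.bit_length() + 1      # one extra bit for the sign
    return (-n - 1).bit_length() + 1   # e.g. -2**18 fits in 19 bits

print((424242).bit_length())           # 19 bits of magnitude
print(twos_complement_width(-424242))  # 20 bits including the sign
```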