I am using std::bitset to represent short DNA sequences (haplotypes). For my purposes, I need to be able to convert each such sequence into a single integer. At the moment, this requirement seems to limit my sequences to length <= 64, because of the way std::bitset::to_ullong works.
Since there are ways to represent 128-bit integers in C++ (Is there a 128 bit integer in C++?), I am wondering how long it will take before std::bitset supports direct conversion to 128-bit integers as well.
Of course I would be more than happy to learn that I am wrong and one can already do that...
Thanks!
You could always provide your own:
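For example, something along these lines (a sketch with `to_uint128` and `my_uint128_t` as made-up names, assuming gcc/clang's non-standard `unsigned __int128` and C++17):

```cpp
#include <bitset>
#include <climits>
#include <cstddef>

// Placeholder 128-bit type; gcc and clang provide unsigned __int128.
using my_uint128_t = unsigned __int128;

template <std::size_t N>
my_uint128_t to_uint128(const std::bitset<N>& bs) {
    static_assert(N <= 128, "bitset does not fit into 128 bits");
    static_assert(sizeof(unsigned long long) * CHAR_BIT == 64,
                  "unsigned long long is assumed to be exactly 64 bits");

    // Low 64 bits first, then (for N > 64) the high bits shifted into place.
    constexpr std::bitset<N> low_mask{~0ULL};
    my_uint128_t result = (bs & low_mask).to_ullong();
    if constexpr (N > 64) {
        result |= static_cast<my_uint128_t>((bs >> 64).to_ullong()) << 64;
    }
    return result;
}
```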
However, this makes a few technically unsafe assumptions, such as `unsigned long long` being 64 bits. The `static_assert` helps, but a better solution is to actually account for the possible discrepancies:
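Sketching that idea (again with the hypothetical `to_uint128`/`my_uint128_t` names), one can loop over the bitset in chunks of whatever width `unsigned long long` actually has:

```cpp
#include <bitset>
#include <climits>
#include <cstddef>

// Placeholder 128-bit type; gcc and clang provide unsigned __int128.
using my_uint128_t = unsigned __int128;

template <std::size_t N>
my_uint128_t to_uint128(const std::bitset<N>& bs) {
    static_assert(N <= 128, "bitset does not fit into 128 bits");

    // Chunk width derived from the actual type, not hard-coded to 64.
    constexpr std::size_t chunk_bits = sizeof(unsigned long long) * CHAR_BIT;
    constexpr std::bitset<N> chunk_mask{~0ULL}; // low chunk_bits (at most) set

    my_uint128_t result = 0;
    for (std::size_t low = 0; low < N; low += chunk_bits) {
        // bitset shifts are well-defined for any shift count, so the last,
        // possibly partial chunk needs no special casing.
        result |= static_cast<my_uint128_t>(((bs >> low) & chunk_mask).to_ullong()) << low;
    }
    return result;
}
```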
Both gcc and clang optimize this down to adequately clean assembly on x86-64: see on godbolt
Notes:
`my_uint128_t` is a bit of a placeholder type here. There's still some endianness handling required to use this correctly as a portable 128-bit integer.
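For the haplotype use case in the question, usage would then look something like this (hypothetical example, building on the `to_uint128` sketch above):

```cpp
#include <bitset>

// Assumes the to_uint128 / my_uint128_t definitions from above.
int main() {
    std::bitset<100> haplotype;     // a 100-site binary sequence
    haplotype.set(0);
    haplotype.set(99);
    my_uint128_t key = to_uint128(haplotype); // single 128-bit integer key
    (void)key; // silence unused-variable warnings in this sketch
}
```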