Place least significant bits of int into char in C++


I am trying to find a maximally efficient way to compute a char that contains the least significant bits of an int in C++11. The solution must work with any possible standards-compliant compiler. (I'm using the N3290 C++ draft spec, which is essentially C++11.)

The reason is that I'm writing a fuzz tester, and I want to check libraries that require std::string input. So I need to generate random characters for the strings. The pseudo-random generator I'm using provides ints whose low bits are pretty uniformly random, but I'm not sure of their exact range. (Basically the exact range depends on a "size of test case" runtime parameter.)

If I didn't care about working on every compiler, this would be as simple as:

inline char int2char(int i) { return i; } 

Before you dismiss this as a trivial question, consider that:

  • you don't know whether char is a signed or unsigned type.

  • if char is signed, the conversion of an unrepresentable int to char is "implementation-defined" (§4.7/3). That's far better than undefined, but for a solution here I'd need to see some evidence that the standard prohibits things like converting every int not between CHAR_MIN and CHAR_MAX to '\0'.

  • reinterpret_cast is not permitted between a signed and unsigned char (§5.2.10). static_cast performs the same conversion as in the previous point.

  • char c = i & 0xff;, though it silences compiler warnings, is not any more correct, because of the implementation-defined conversions. In particular, i & 0xff is always a positive number, so in the case that c is signed it quite plausibly will not convert negative values of i to negative values of c. (A tiny illustration follows this list.)
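
As a rough illustration of that last point (my own sketch, not from the original question): with a negative i, masking yields a positive value, and converting that back to a signed 8-bit char is implementation-defined rather than guaranteed to recover the original low byte as a negative value.

#include <cstdio>

int main() {
  int i = -1;
  int masked = i & 0xff;   // 255 on a two's-complement machine, never negative
  char c = masked;         // implementation-defined result if char is signed
  std::printf("masked = %d, c = %d\n", masked, static_cast<int>(c));
  return 0;
}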

Here are some solutions that do work, but in all of these cases I'm worried that they won't be as efficient as a simple conversion. They also seem too complicated for something this simple (rough sketches of them follow the list):

  • using reinterpret_cast on a pointer or reference, since you are allowed to convert unsigned char * or unsigned char & to char * or char & (but at the possible cost of runtime overhead).

  • using a union of char and unsigned char, where I first assign the int to the unsigned char, then extract the char (which again may be slower).

  • shifting left and right to sign-extend the int. E.g., if i is the int, running c = (i << 8 * (sizeof(i) - sizeof(c))) >> (8 * (sizeof(i) - sizeof(c))) (but that's inelegant, and if the compiler doesn't optimize away the shifts, quite slow).
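
For concreteness, here is a rough sketch of those three workarounds (my own illustration, not claimed to be optimal; none of these are guaranteed to compile down to a plain conversion, and the shift version has its own caveat noted in the comment):

#include <climits>

// 1. reinterpret_cast through a reference: unsigned char& -> char& is allowed.
inline char int2char_ref(int i) {
  unsigned char u = static_cast<unsigned char>(i);  // well-defined, reduces modulo 2^CHAR_BIT
  return reinterpret_cast<char&>(u);
}

// 2. Union of char and unsigned char (type punning; explicitly allowed in C,
//    widely supported in practice by C++ compilers).
inline char int2char_union(int i) {
  union { unsigned char u; char c; } x;
  x.u = static_cast<unsigned char>(i);
  return x.c;
}

// 3. Shift left then right to sign-extend. Caveat: left-shifting a negative
//    int is itself undefined behavior in C++11, so this sketch is dubious as written.
inline char int2char_shift(int i) {
  const int shift = (sizeof(int) - sizeof(char)) * CHAR_BIT;
  return static_cast<char>((i << shift) >> shift);
}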

Here's a minimal working example. The goal is to argue that its assertions can never fail on any compiler, or to define an alternate int2char for which the assertions can never fail.

#include <algorithm>
#include <cassert>
#include <cstdio>
#include <cstdlib>

using namespace std;

constexpr char int2char(int i) { return i; }

int main(int argc, char **argv) {
  for (int n = 1; n < min(argc, 127); n++) {
    char c = -n;
    int i = (atoi(argv[n]) << 8) ^ -n;
    assert(c == int2char(i));
  }
  return 0;
}

I've phrased the question in terms of C++ because its standards are easier to find on the web, but I am equally interested in a solution in C. Here's the MWE in C:

#include <assert.h>
#include <stdlib.h>

static char int2char(int i) { return i; }

int main(int argc, char **argv) {
  for (int n = 1; n < argc && n < 127; n++) {
    char c = -n;
    int i = (atoi(argv[n]) << 8) ^ -n;
    assert(c == int2char(i));
  }
  return 0;
}

A far better way would be to have an array of chars and generate a random number to pick a char from that array. That way you only get 'well behaved' characters, or at least characters with well-defined badness. If you really want all 256 chars (note the 8-bit assumption) then create an array with 256 entries in it ('a', 'b', ..., '\t', '\n', ...).

This is portable too.
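
A minimal sketch of that lookup-table idea (my own illustration, not the answerer's code; std::mt19937 stands in for the questioner's PRNG, and the table lists only a small, illustrative set of "well behaved" characters):

#include <random>
#include <string>

int main() {
  // An explicitly listed table of "well behaved" characters; list all 256
  // values instead if you need every char (8-bit assumption).
  static const char table[] =
      "abcdefghijklmnopqrstuvwxyz"
      "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
      "0123456789"
      " \t\n";
  const int table_size = sizeof(table) - 1;  // exclude the trailing '\0'

  std::mt19937 rng(12345);                                   // stand-in PRNG
  std::uniform_int_distribution<int> pick(0, table_size - 1);

  std::string s;
  for (int k = 0; k < 16; ++k)
    s.push_back(table[pick(rng)]);  // indexing picks a char; no int-to-char conversion of arbitrary values
  return 0;
}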

