Char to binary
How do I convert a char to binary?
4 answers
𝕾𝖆𝖌𝖆𝕿𝖍𝖊𝕲𝖗𝖊𝖆𝖙 💯
A movable mask gives you fine control over the format of the binary output, though it takes a bit more work to print binary equivalents for the full ASCII table.
To make sure the OP grasps the essence of the logic, here is a step-through of each loop iteration (for a = 'A'):
1st cycle
i = 7
1<<i = 1000 0000
a = 0100 0001
& = 0000 0000
if(false)
2nd cycle
i = 6
1<<i = 0100 0000
a = 0100 0001
& = 0100 0000
if(true)
3rd cycle
i = 5
1<<i = 0010 0000
a = 0100 0001
& = 0000 0000
if(false)
4th cycle
i = 4
1<<i = 0001 0000
a = 0100 0001
& = 0000 0000
if(false)
5th cycle
i = 3
1<<i = 0000 1000
a = 0100 0001
& = 0000 0000
if(false)
6th cycle
i = 2
1<<i = 0000 0100
a = 0100 0001
& = 0000 0000
if(false)
7th cycle
i = 1
1<<i = 0000 0010
a = 0100 0001
& = 0000 0000
if(false)
8th cycle
i = 0
1<<i = 0000 0001
a = 0100 0001
& = 0000 0001
if(true)
Output: 01000001
A char-to-binary conversion needs 8 bits to represent each character. One way to implement a simple converter is the following:
#include <iostream>
#include <bitset>
using namespace std;

// Passing a character to this macro yields its 8-bit binary equivalent
#define BIN(X) bitset<8>(X)

int main() {
    // Printable ASCII characters run from ' ' (space, 32) to '~' (tilde, 126)
    for (char ch = 32; ch < 127; ++ch)
        cout << ch << " --BIN-> " << BIN(ch) << endl;
}
Part of the output:
...
+ --BIN-> 00101011
, --BIN-> 00101100
- --BIN-> 00101101
...
Here is a basic way to do it:
https://code.sololearn.com/cJfJx90Rwkze/?ref=app