+ 3
ASCII is a character encoding, like UTF-8, UTF-16 (BE & LE), and UTF-32.
But it covers only 128 characters, not every character you might need.
+ 2
ASCII was the first character set (encoding standard) used between computers on the Internet.
Both ISO-8859-1 (the default in HTML 4.01) and UTF-8 (the default in HTML5) are built on ASCII.
I'd say it's like a variable that contains an array of characters. Each character has a number; when you look up the number you get the character. So number 38 is the & symbol, 33 is !, and so on.
Example: const ASCII = [];
ASCII[33] = "!";
ASCII[34] = '"';
ASCII[38] = "&";
To type a character by its code, hold the ALT key and type the value on the numeric keypad.
Many keys on your keyboard are already mapped.
Not sure if that helps, but if you've already googled everything, this is the best explanation I could come up with.
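Here's roughly the same lookup idea as a Python sketch (the table and variable names here are just for illustration; Python's built-in chr() and ord() already do this mapping for you):

```python
# Build a lookup table of printable ASCII: code number -> character.
# chr(n) returns the character for code n; ord(c) goes the other way.
ascii_table = {code: chr(code) for code in range(32, 127)}

print(ascii_table[33])  # !
print(ascii_table[38])  # &
print(ord("&"))         # 38
```

So "calling the number" is just indexing into the table.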
+ 1
Yes, everything in a computer is binary.
Think about it this way: everything in a computer is defined by electric signals.
Electricity can either be flowing or not. In other words, it's binary.
Therefore one can say everything in a computer is defined by binary values.
So basically each character is defined by a binary pattern, and that pattern is what the computer stores in memory to represent the character.
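To see that each character really is just a pattern of on/off bits, here's a small Python sketch (purely illustrative):

```python
# Print each character of a string alongside its 8-bit binary pattern.
for ch in "Hi":
    print(ch, format(ord(ch), "08b"))
# H 01001000
# i 01101001
```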
+ 1
All of it correlates. Computers see 1s and 0s. To make electronic communication easier we have character encodings such as ASCII, which are used to represent text: symbols, letters, digits, etc. Think of it as translating the 1s and 0s into easy-to-read characters. There are different encodings for different purposes. I wouldn't think too hard about it.
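A quick Python sketch of that "translation" in both directions, using the built-in encode/decode (just an illustration):

```python
# Text -> numbers (bytes): each character becomes its ASCII code.
data = "Hi!".encode("ascii")
print(list(data))            # [72, 105, 33]

# Numbers -> text: decode the bytes back into a string.
print(data.decode("ascii"))  # Hi!
```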
+ 1
Excerpt from Wikipedia:
ASCII (/ˈæski/ ASS-kee), abbreviated from American Standard Code for Information Interchange, is a character encoding standard for electronic communication. ASCII codes represent text in computers, telecommunications equipment, and other devices. Most modern character-encoding schemes are based on ASCII, although they support many additional characters.
The concept of electricity being ON or OFF, giving us 1 and 0, then binary, and then ASCII, is what got me interested in learning how to code.
Please find below two codes which explore these concepts:
https://code.sololearn.com/c9JnEuWyx4UA/?ref=app
https://code.sololearn.com/cR8q6PcSDb0A/?ref=app
+ 1
Manav Roy
You don't need to memorise binary numbers.
If you want to understand why, then start investigating binary numbers & how they work.
You might also wish to investigate octal & hexadecimal systems while you are at it.
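A quick Python sketch showing the same number written in binary, octal, and hexadecimal (just for illustration):

```python
n = 65  # the ASCII code for 'A'
print(bin(n), oct(n), hex(n))  # 0b1000001 0o101 0x41
```

Same value, three different number systems.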
0
It is a unique number for each character.
0
Manav Roy That's 65 in binary. ASCII is, as its name indicates, a table. The table entry for 'A' is at 65; that's why 'A' is 1000001, which is 65 in binary. The binary code is necessary because computers operate on binary, not decimal.
- 1
Manav Roy
The binary number for 'A' is 1000001, which is 65 when converted to a decimal number.
print(int('1000001',2)) #65 -> A
I would not bother memorising the ASCII values.
Copy the A-Z Comprehension code I posted & review it when required
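(I don't have that original post here, so this is just a guess at what such an A-Z comprehension might look like in Python; the names are made up:)

```python
# Build a letter -> ASCII code table for A..Z with a dict comprehension.
a_to_z = {chr(code): code for code in range(ord("A"), ord("Z") + 1)}

print(a_to_z["A"])  # 65
print(a_to_z["Z"])  # 90
```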