+ 3
ASCII is a character encoding, like UTF-8, UTF-16 (BE & LE), and UTF-32, but it does not include all character types.
18th May 2022, 4:36 PM
aewerdev
+ 2
ASCII was the first character set (encoding standard) used between computers on the Internet. Both ISO-8859-1 (the default in HTML 4.01) and UTF-8 (the default in HTML5) are built on ASCII. I'd say it's like a variable that contains an array of characters. Each character has a number; when you look up the number, you get the character. So the number 38 is the & symbol, 33 is !, etc. Example: const ASCII = []; ASCII[33] = "!"; ASCII[34] = '"'; ASCII[38] = "&"; To type a character by its value, you can hold the ALT key and enter the value on the numeric keypad. Many keys on your keyboard are already mapped. Not sure if that helps, but if you've googled everything, this is the best explanation I could come up with.
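A quick Python sketch of the lookup idea above: the built-ins ord() and chr() expose the character-to-number mapping directly (ASCII is the first 128 code points of Unicode).

```python
# ord() gives the code number for a character; chr() goes the other way.
print(ord("!"))  # 33
print(ord("&"))  # 38
print(chr(65))   # A
```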
17th May 2022, 7:01 AM
Chris Coder
17th May 2022, 6:55 AM
Mustafa A
+ 1
Yes, everything in a computer is binary. Think about it this way: everything in a computer is defined by electric signals, and electricity can either be flowing or not. In other words, it's binary. Therefore everything in a computer is defined by binary values. So basically a binary pattern defines each character, and that pattern tells the CPU which character it represents.
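A small Python sketch of that idea: format() can show the 8-bit pattern that stands for each character.

```python
# Print each character of a string next to its 8-bit binary pattern.
for ch in "Hi":
    print(ch, format(ord(ch), "08b"))
# H 01001000
# i 01101001
```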
17th May 2022, 7:16 AM
Raul Ramirez
+ 1
All of it correlates. Computers see 1s and 0s. To make electronic communication easier, we have character encodings such as ASCII, which represent text: symbols, letters, digits, etc. Think of it as translating the 1s and 0s into easy-to-read characters. There are different encodings for different purposes. I wouldn't think too hard about it.
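A minimal Python sketch of that "translation" step, decoding a string of bits back into readable characters (the bit string here is just an illustrative example):

```python
bits = "01001000 01101001"  # two 8-bit groups
# Convert each group from binary to a number, then to its character.
text = "".join(chr(int(group, 2)) for group in bits.split())
print(text)  # Hi
```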
17th May 2022, 7:26 AM
Chris Coder
+ 1
Excerpt from Wikipedia: ASCII (/ˈæskiː/ ASS-kee), abbreviated from American Standard Code for Information Interchange, is a character encoding standard for electronic communication. ASCII codes represent text in computers, telecommunications equipment, and other devices. Most modern character-encoding schemes are based on ASCII, although they support many additional characters. The concept of electricity being ON or OFF, creating the concept of 1 & 0, then being converted into binary and on to ASCII, is what got me interested in learning how to code. Please find below 2 codes which explore these concepts: https://code.sololearn.com/c9JnEuWyx4UA/?ref=app https://code.sololearn.com/cR8q6PcSDb0A/?ref=app
17th May 2022, 9:29 AM
Rik Wittkopp
+ 1
Manav Roy You don't need to memorise binary numbers. If you want to understand why, start by investigating binary numbers and how they work. You might also wish to investigate the octal and hexadecimal systems while you are at it.
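If you do start investigating, Python's built-ins make it easy to compare the systems side by side (a quick sketch, using 65 — the code for 'A' — as the example):

```python
n = 65  # ASCII code for 'A'
print(bin(n))         # 0b1000001 (binary)
print(oct(n))         # 0o101 (octal)
print(hex(n))         # 0x41 (hexadecimal)
print(int("41", 16))  # 65, parsed back from the hex digits
```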
21st May 2022, 7:41 AM
Rik Wittkopp
0
It is a unique number for each character.
19th May 2022, 2:20 AM
Numan
0
Manav Roy That's 65 in binary. ASCII is, as its name indicates, a table. The table entry for 'A' is at 65; that's why 'A' is 1000001, which is 65 in binary. The binary form is necessary because computers operate on binary, not decimal.
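The round trip described above, sketched in Python:

```python
print(ord("A"))        # 65: the table entry for 'A'
print(bin(65))         # 0b1000001: the same number in binary
print(chr(0b1000001))  # A: back from the binary value to the character
```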
21st May 2022, 7:51 AM
Mustafa A
- 1
Manav Roy The binary number for 'A' is 1000001, which is 65 when converted to a decimal number: print(int('1000001', 2)) # 65 -> A I would not bother memorising the ASCII values. Copy the A-Z comprehension code I posted and review it when required.
20th May 2022, 10:21 PM
Rik Wittkopp