+ 6

How does a machine/computer understand language? By only using 1s and 0s.

If a machine/computer only understands 0s (zeroes) and 1s (ones), which are part of human language, then why doesn't a computer understand other letters or numbers? Why does code need to be compiled for the machine to understand it? How can it understand anything using only 'on' and 'off'? Wouldn't it be better if we could communicate directly with machines/computers through letters and numbers, as humans do, rather than using different computer languages with strict rules for communicating? And why does a computer only understand 0s and 1s?

21st Aug 2019, 5:18 AM
😈😈Devil
13 answers
+ 10
First of all, the 1s and 0s are not "our" 1 and 0. They actually stand for an electrical state, on or off, which is why computers can only work with 1s and 0s. Second, computers can represent everything with only 1s and 0s because of the instruction set. The instruction set defines operations as combinations of 1s and 0s, and it depends on the processor itself. For example, on one processor the code 00100 might be the operation code for addition, while on another it could mean subtraction (this is only an example and may not match any real processor). Third, yes, it would be better if we could communicate with computers directly. That was impossible back then; today we can talk to systems that understand natural language, but even those ultimately run on 1s and 0s.
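To make that concrete, here is a tiny sketch (Python, purely for illustration; the opcodes and the dispatch table are invented, not taken from any real processor):

# Toy "instruction set": the same bit pattern means whatever the
# processor's designers decided it means. These opcodes are made up
# for the example and do not correspond to any real CPU.
OPCODES = {
    "00100": lambda a, b: a + b,   # on this imaginary chip, 00100 = add
    "00101": lambda a, b: a - b,   # another chip could map 00101 to subtract
}

def execute(opcode_bits, a, b):
    """Look up the bit pattern and apply the operation it stands for."""
    return OPCODES[opcode_bits](a, b)

print(execute("00100", 6, 2))  # 8
print(execute("00101", 6, 2))  # 4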
21st Aug 2019, 6:08 AM
Agent_I
+ 7
At the basic level a processor consists of different processing units: CPU, GPU, modem, co-processors, etc. The CPU consists of transistors (millions or billions of them). Each transistor can only be in one of two states, ON (1) or OFF (0): when current passes through it, it's on; otherwise it's off. Using this mechanism you can send signals, and each signal corresponds to a different instruction supported by the processor. We need to compile high-level code to machine code for it to be executed by the processor, because today's computers only understand binary (on or off). This is practical and it's what works now. ...continued
21st Aug 2019, 6:50 AM
Lord Krishna
+ 7
...continued To represent alphanumeric characters, or even just base-10 numbers, directly, the hardware would have to support 10 or more states, and today's hardware doesn't. Quantum computers can support more, but they are limited and the technology hasn't reached a point where it can replace regular computers. According to HTG, ternary computers did exist, but no one built anything better than the current model; progress stagnated, so we continue to use what already works. It's not necessarily a problem, though, because we can convert numbers and characters to binary or any base really (do you know what 21523 is in base 128?). Two states are simpler and practical. https://www.howtogeek.com/367621/what-is-binary-and-why-do-computers-use-it/ https://nookkin.com/articles/computer-science/why-computers-use-binary.ndoc
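As a quick illustration of that last point, here is a small sketch (Python, assumed just for demonstration) of converting a number to any base by repeated division, which is exactly how a value ends up in binary:

# Convert n to an arbitrary base by repeated division; base 2 gives binary.
def to_base(n, base):
    """Return the digits of n in the given base, most significant first."""
    digits = []
    while n > 0:
        n, remainder = divmod(n, base)
        digits.append(remainder)
    return digits[::-1] or [0]

print(to_base(21523, 2))    # the binary digits of 21523
print(to_base(21523, 128))  # [1, 40, 19], i.e. 21523 in base 128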
21st Aug 2019, 6:50 AM
Lord Krishna
+ 6
A computer is made of electronic components, mainly transistors and diodes. These elements can have two electric states: logic 1, which is often represented by approximately +5V, and logic 0, which is represented by the ground voltage (0V). Simply speaking, you can turn an element on or off. This is what information scientists call the smallest unit of information (a "bit"). If you want to give instructions to a processor, you have to set the correct bits to 0 or 1. Therefore, manufacturers of processors publish tables of operation codes (opcodes) which specify the operations to be performed. Look here for an example: https://www.cs.unm.edu/~maccabe/classes/341/labman/img75.gif As you can imagine, it is not very practical to learn the correct combinations of zeros and ones to write a working program. So early computer scientists developed human-readable abbreviations for the opcodes, called mnemonics. These are symbolic names for single executable instructions; typical mnemonics are AND, OR, ADD, NOP, MOV, ... This is a very primitive programming language, which is not easy to learn and handle. So what modern high-level programming languages like C, Java or Python do (with a compiler or an interpreter) is convert high-level instructions into sequences of opcodes which the processor can handle.
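A rough sketch of the mnemonic idea (Python, with an opcode table invented for illustration, not taken from any manufacturer's real documentation):

# Toy "assembler": mnemonics are human-readable names that simply get
# looked up and replaced by bit patterns. The opcode values are made up.
MNEMONIC_TO_OPCODE = {
    "NOP": "0000",
    "ADD": "0001",
    "MOV": "0010",
    "AND": "0011",
}

def assemble(program):
    """Translate a list of mnemonics into a string of opcode bits."""
    return " ".join(MNEMONIC_TO_OPCODE[instr] for instr in program)

print(assemble(["MOV", "ADD", "NOP"]))  # 0010 0001 0000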
21st Aug 2019, 6:38 AM
Michael
+ 5
There are some great answers here already, so I'll just fill in the gaps. The binary system is connected to several major concepts in ICT. 1) Truth tables: a two-input truth table has four output rows (a nibble's worth of bits), so there are 2^4 = 16 possible two-input logical operators a machine can use, such as AND, OR, NOR, XOR, XNOR, etc. 2) ASCII uses 7 bits to represent letters, numbers and control characters such as return and delete on your keyboard. That gives 2^7 = 128 total combinations, meaning there are 128 ASCII characters; UTF-8 is built on ASCII and overcomes this limitation. 3) Networking: an IPv4 address is a 32-bit number used to identify computers on a network. It consists of 4 octets/bytes separated by dots, giving 2^32 total combinations. 4) Hexadecimal numbers are preferred over decimal for representing binary numbers because four bits (a nibble) map directly to one hex digit.
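A few one-liners (Python, used here only as a demonstration) that show points 2) to 4) directly:

# ASCII: 'A' is code point 65, which fits in 7 bits.
print(ord("A"), format(ord("A"), "07b"))             # 65 1000001

# Hexadecimal: each hex digit corresponds to exactly one 4-bit nibble.
print(format(0b1010, "x"), format(0b11111010, "x"))  # a fa

# IPv4: a 32-bit address space allows 2**32 distinct addresses.
print(2 ** 32)                                       # 4294967296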
21st Aug 2019, 8:53 AM
Logomonic Learning
+ 4
The code you write is called a high-level language. It is converted by an interpreter or compiler into machine language, which is binary but is usually displayed as hexadecimal for readability. This is the processor-friendly form: commands that the CPU can execute directly. 🐒
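A small sketch of that hexadecimal point (Python; the byte values below are arbitrary examples, not a real compiled program):

# Machine code is ultimately binary; hexadecimal is just a compact way
# for humans to read and write those same bytes.
machine_bytes = bytes([0x48, 0x89, 0xE5])
print(machine_bytes.hex())                                # 4889e5
print(" ".join(format(b, "08b") for b in machine_bytes))  # 01001000 10001001 11100101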
22nd Aug 2019, 2:45 AM
Sanjay Kamath
+ 3
Thank you, guys, for your kind answers!! 😃😇😑😉😉
21st Aug 2019, 7:38 AM
😈😈Devil
+ 3
Lately, due to advances in AI and natural language processing, humans are better able to communicate with computers using human language, and this ability will improve over time.
21st Aug 2019, 11:31 AM
Sonic
+ 3
To shed more light on this, read up on logic gates => transistors... electronics. This is why electrical engineering basics are taught to college students.
21st Aug 2019, 6:35 PM
Fogwe
+ 2
1. It's very expensive. 2. It's hard to make. 3. It's not fast.
21st Aug 2019, 3:59 PM
arvin
+ 2
The languages we write in are not understood directly by the computer. The code we write is converted by an interpreter or compiler into bytecode or machine code, and that 0-and-1 representation is what the computer understands and executes to produce the output of our commands... I hope you understand.
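For example, CPython lets you look at the bytecode it produces from your source (a small demonstration, assuming Python; compiled languages go further and produce machine code instead):

# The dis module prints the bytecode the interpreter will actually run.
import dis

def add(a, b):
    return a + b

dis.dis(add)  # lists the bytecode instructions behind add()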
22nd Aug 2019, 2:55 AM
Pritish Mishra
+ 2
If you want to gain a good understanding of the "inner logic" of a computer, I can highly recommend the course "From Nand to Tetris" on Coursera. Starting with a simple logic gate (NAND), you build up the logic of a full computer. I finished the first part and I am really looking forward to doing the second part.
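In that spirit, here is a minimal sketch (Python, only mimicking the truth tables; real gates are built from transistors) showing how other gates can be derived from NAND alone:

# Everything below is defined in terms of a single NAND function.
def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b): return and_(or_(a, b), nand(a, b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", and_(a, b), "OR:", or_(a, b), "XOR:", xor_(a, b))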
22nd Aug 2019, 8:39 AM
Thoq!
0
Yes, code written using just 1s and 0s is called binary code.
6th Nov 2019, 1:12 PM
Joshua