What is ASCII (American Standard Code for Information Interchange)?

    One of the most common character encoding formats for text data in computers and on the internet is ASCII, which stands for American Standard Code for Information Interchange. Standard ASCII assigns a unique value to each of 128 characters: alphabetic characters, numeric characters, punctuation and other special characters, and control codes.

    ASCII grew out of character encodings used for telegraph data. It was first published as a computing standard in 1963 by the American Standards Association, which later became the American National Standards Institute (ANSI).

    The character set includes the uppercase and lowercase letters A to Z, the numerals 0 to 9, and the main punctuation symbols. It also includes some non-printing control characters that were originally intended for use with teletype printing terminals.

    One could represent the ASCII characters in the following ways:

    • in the form of pairs of hexadecimal digits (base-16 numbers, using the digits 0 through 9 and the letters A through F for the decimal values 10 through 15);

    • in the form of three-digit octal (base-8) numbers;

    • in the form of decimal numbers from 0 to 127; or

    • in the form of 7-bit or 8-bit binary numbers.

    For instance, the ASCII encoding for the character "m" is represented in the following ways:

    Character   Hexadecimal   Octal   Decimal   Binary (7-bit)   Binary (8-bit)
    m           0x6D          0155    109       110 1101         0110 1101
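
    To sanity-check these values, here is a minimal Python sketch; ord() and the format specifiers are standard Python, and the character "m" is simply the example from the table above:

        # Print the ASCII code of "m" in each of the notations above.
        code = ord("m")                        # decimal ASCII value: 109

        print(f"Hexadecimal:  {code:#04x}")    # 0x6d
        print(f"Octal:        {code:#05o}")    # 0o155
        print(f"Decimal:      {code}")         # 109
        print(f"Binary (7):   {code:07b}")     # 1101101
        print(f"Binary (8):   {code:08b}")     # 01101101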

    Why is ASCII important?

    ASCII was the first major character encoding standard for data processing. Most modern computer systems use the Unicode Worldwide Character Standard (Unicode), a character encoding standard that includes the ASCII encodings as its first 128 code points.

    ASCII was adopted as the standard for internet data when "ASCII format for Network Interchange" was published as RFC 20 in 1969. The Internet Engineering Task Force (IETF) accepted that request for comments (RFC) as a full standard in 2015.

    Today, almost all computers use ASCII or Unicode encoding.

    How does ASCII work?

    For basic data communications, ASCII offers a universally accepted and understood character set. Developers can use it to design interfaces that both humans and computers can understand. Data is encoded as a string of ASCII character codes, which can be interpreted and displayed as readable plain text for people and processed as numeric data by computers.
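
    As a small illustration of this dual view, the Python sketch below encodes a short string (the message text is just a made-up example) and shows the same bytes both as numbers and as readable text:

        # One string, two views: numeric data for the machine,
        # readable plain text for the person.
        message = "Hi"

        data = message.encode("ascii")   # raw ASCII bytes
        print(list(data))                # [72, 105] -- what the computer processes
        print(data.decode("ascii"))      # Hi        -- what the person reads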

    The design of the ASCII character set simplifies certain programming tasks. For instance, changing a single bit in an ASCII character code converts text from uppercase to lowercase.

    The binary value of the capital letter "A" is represented by:

    0100 0001

    The binary value of the lowercase letter "a" is represented by:

    0110 0001

    The only difference is the third most significant bit, which has the value 32. In hexadecimal and decimal, this comes down to:

    Character   Binary      Decimal   Hexadecimal
    A           0100 0001   65        0x41
    a           0110 0001   97        0x61

    You will see that the difference between the uppercase and lowercase codes is always 32, so to convert uppercase to lowercase or vice versa you simply add or subtract 32 from the ASCII character code, as the sketch below shows.
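
    A minimal Python sketch of both approaches (adding or subtracting 32, or flipping that single bit with an XOR against 0x20):

        # Convert letter case by shifting the ASCII code by 32,
        # or equivalently by toggling the 0x20 bit.
        print(chr(ord("A") + 32))    # a -- uppercase to lowercase
        print(chr(ord("a") - 32))    # A -- lowercase to uppercase
        print(chr(ord("A") ^ 0x20))  # a -- same result with a one-bit XOR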

    Similarly, for the digits 0 through 9, the ASCII codes are as follows:

    Character   Binary      Decimal   Hexadecimal
    0           0011 0000   48        0x30
    1           0011 0001   49        0x31
    2           0011 0010   50        0x32
    3           0011 0011   51        0x33
    4           0011 0100   52        0x34
    5           0011 0101   53        0x35
    6           0011 0110   54        0x36
    7           0011 0111   55        0x37
    8           0011 1000   56        0x38
    9           0011 1001   57        0x39

    Developers using this encoding can easily convert ASCII digits to their numerical values by stripping off the four most significant bits of the binary ASCII value. The same calculation can be done by dropping the first hexadecimal digit or by subtracting 48 from the decimal ASCII code, as shown below.
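
    A short Python sketch of the three equivalent conversions (the digit "7" is an arbitrary example):

        # Turn the ASCII digit "7" (code 0x37) into the number 7.
        c = "7"

        print(ord(c) & 0x0F)      # mask off the four most significant bits -> 7
        print(ord(c) - 48)        # subtract the decimal ASCII code of "0"  -> 7
        print(ord(c) - ord("0"))  # the same subtraction, written symbolically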

    Also, to verify that a data stream, string, or file contains only ASCII values, developers can check the most significant bit of each character in the sequence. If that bit is 1, the character is not an ASCII-encoded character, because the most significant bit of basic ASCII characters is always 0.
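
    A minimal sketch of that check in Python (is_ascii is a hypothetical helper name, not a standard function):

        # A byte is basic ASCII only if its most significant bit is 0,
        # i.e. its value is at most 127 (0x7F).
        def is_ascii(data: bytes) -> bool:
            return all(byte < 0x80 for byte in data)

        print(is_ascii(b"plain text"))            # True
        print(is_ascii("café".encode("utf-8")))   # False -- é encodes above 0x7F

    Python in fact ships this check as bytes.isascii(), but the explicit loop shows the most-significant-bit test described above.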

    ASCII advantages and disadvantages

    The advantages and disadvantages of ASCII character encoding are very well understood today, after being used for more than half a century.

    Advantages

    • ASCII is universally accepted.

    • Compact character encoding: the standard codes can be expressed in 7 bits, so ASCII text requires very little storage or bandwidth.

    • Efficient for programming.

    Disadvantages

    • Limited character set: even extended ASCII offers at most 256 characters, far too few to represent most of the world's writing systems.

    • Inefficient character encoding for anything beyond those codes; accented and non-Latin characters require extensions or a different standard such as Unicode.

    Published by: Amundra