072 101 108 108 111 032 087 111 114 108 100... with Optional 032 or 046.

There seems to be no other thread (on NAS) discussing this as a topic or hobby in its own right... despite everyone who can read this actually doing it or using it.
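For anyone curious, the thread title can be decoded mechanically: each three-digit group is a decimal ASCII code. A quick Python sketch (illustrative, not part of the original post):

```python
# Each space-separated group is a decimal ASCII code point.
title = "072 101 108 108 111 032 087 111 114 108 100"
decoded = "".join(chr(int(code)) for code in title.split())
print(decoded)  # -> Hello World
```

The optional 032 or 046 would append a space or a full stop, respectively.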

  • Computers can't really work with letters in the way we do. Instead, everything is numbers - long strings of zeroes and ones.

    To be able to interact with computers using letters, systems of glyphs (think: hieroglyphics) were developed. In the western world the current "lowest common denominator" system of glyphs is still ASCII, but there have been others, such as https://en.wikipedia.org/wiki/EBCDIC. Unicode is replacing ASCII in many applications this century, because representing languages other than English quickly becomes problematic if all you have are 128 or 256 different glyphs to use.
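You can see that limitation directly in Python, which will refuse to squeeze a non-English character into ASCII but handles it fine in UTF-8 (a Unicode encoding). A small sketch:

```python
text = "café"

# ASCII has only 128 code points, so 'é' cannot be encoded.
try:
    text.encode("ascii")
except UnicodeEncodeError as err:
    print("not representable in ASCII:", err)

# UTF-8 can represent it, using two bytes for the single glyph 'é'.
data = text.encode("utf-8")
print(data)  # -> b'caf\xc3\xa9'
```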

    So how do these glyphs really work? Put simply, a grid of dots representing a picture, such as the capital letter 'A', is assigned a particular number code. The letter 'B' is assigned the next code, and so on.

    Now, computers tend to have a little bit more trouble with decimal numbers than we do, because they don't have eight fingers and two thumbs. The computer's equivalent "convenient" system is based on a power of 2. This gives rise to the following numbering schemes:

    1. Decimal, used when outputting numbers for people to use. In reality, this is just the correct sequence of glyphs to represent a decimal number that people will recognise.
    2. Binary, the raw sets of zeroes and ones, but too awkward for people to use (see example below!)
    3. Octal, which is now rarely used; its glyphs are only 0..7, so you need a second glyph before you can even count up to 8 or 9.
    4. Hexadecimal, which gets you all the way to 15 before you need an extra glyph. This lets you display four consecutive "binary digits", aka "bits", with a single hexadecimal glyph. The glyphs are 0..9 and A..F. Two consecutive hexadecimal glyphs can represent all numbers from 0..255, i.e. an 8-bit byte.
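The four schemes above are easy to play with in Python, which has built-in conversions for all of them. A quick sketch using 65, the code for 'A':

```python
n = 65  # decimal value of ASCII 'A'

print(bin(n))  # -> 0b1000001  (binary)
print(oct(n))  # -> 0o101      (octal)
print(hex(n))  # -> 0x41       (hexadecimal)

# One hex glyph per four bits: 0x41 is 0100 0001.
print(f"{n:08b}")  # -> 01000001
```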

    Example:

    • ASCII glyphs: ABC
    • Decimal ASCII: 65 66 67
    • Binary ASCII: 01000001 01000010 01000011
    • Octal ASCII: 101 102 103
    • Hexadecimal ASCII: 0x41 0x42 0x43
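The table above can be reproduced programmatically: `ord()` gives the code for a glyph, and format specifiers render it in each base. A sketch:

```python
# Print each glyph of "ABC" with its decimal, binary, octal and hex codes.
for ch in "ABC":
    code = ord(ch)
    print(ch, code, f"{code:08b}", f"{code:03o}", f"{code:#04x}")
# -> A 65 01000001 101 0x41
# -> B 66 01000010 102 0x42
# -> C 67 01000011 103 0x43
```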

    So, if we want to create a file on disk containing "ABC", the sequence of numbers 65, 66, 67 is stored sequentially. If you ask the computer to read the file and display its contents on the screen, it reads 65, 66, 67 and outputs the appropriate glyph for each number, so you see "ABC" on the screen.
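That round trip can be demonstrated directly: write the three raw byte values, then read the file back as text. A sketch (the filename is just an arbitrary example):

```python
import os
import tempfile

# Write the raw byte values 65, 66, 67 to a file.
path = os.path.join(tempfile.gettempdir(), "abc_demo.txt")
with open(path, "wb") as f:
    f.write(bytes([65, 66, 67]))

# Read the same file back as text: the numbers come out as glyphs.
with open(path, "r") as f:
    print(f.read())  # -> ABC

os.remove(path)
```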

    If you work at this level for long enough(!), you learn that:

    • 'A' is decimal 65, hexadecimal 0x41
    • 'a' is decimal 97, hexadecimal 0x61
    • '0'..'9' are decimal 48..57, hexadecimal 0x30..0x39
    • LF (line feed) is decimal 10, hexadecimal 0x0A (thanks for the correction)
    • CR (carriage return) is decimal 13, hexadecimal 0x0D
    • SPACE is decimal 32, hexadecimal 0x20
    • '.' is decimal 46, hexadecimal 0x2E
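If you'd rather not memorise these, Python's `ord()` will confirm every one of them. A quick check:

```python
# Verify the code points listed above.
assert ord('A') == 65 == 0x41
assert ord('a') == 0x61
assert ord('0') == 0x30 and ord('9') == 0x39
assert ord('\n') == 0x0A  # LF, line feed
assert ord('\r') == 0x0D  # CR, carriage return
assert ord(' ') == 0x20   # SPACE
assert ord('.') == 0x2E   # full stop
print("all values check out")
```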