Published on 12 July 2020
There seems to be no other Thread (upon NAS) discussing this as a Topic, or Hobby, or its fancies... despite everyone who can read this actually doing it or using it.
What is it? I came across it on the Christmas thread and got intimidated, so retreated to my shell.
Computers can't really work with letters in the way we do. Instead, everything is numbers - long strings of zeroes and ones.
To be able to interact with computers using letters, systems of glyphs (think: hieroglyphics) were developed. In the western world the current "lowest common denominator" system of glyphs is still ASCII, but there have been others, such as https://en.wikipedia.org/wiki/EBCDIC. Unicode is replacing ASCII in many applications this century, because representing languages other than English quickly becomes problematic if all you have are 128 or 256 different glyphs to use.
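You can see that limit for yourself. As a quick sketch (the word "café" is just an illustration), Python will refuse to encode a non-English character into ASCII, while the UTF-8 Unicode encoding happily spends two bytes on it:

```python
text = "café"

# ASCII only covers code points 0..127, so 'é' cannot be encoded.
try:
    text.encode("ascii")
except UnicodeEncodeError:
    print("'é' has no ASCII code")

# UTF-8 represents 'é' as two bytes: 0xC3, 0xA9.
utf8_bytes = text.encode("utf-8")
print(list(utf8_bytes))  # [99, 97, 102, 195, 169]
```

Note how the first three bytes (99, 97, 102) are plain ASCII codes for 'c', 'a', 'f'; only the character outside ASCII's range needs extra bytes.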
So how do these glyphs really work? Put simply, a grid of dots representing a picture, such as the capital letter 'A' is assigned a particular number code. The letter 'B' is assigned the next code, and so on.
Now, computers tend to have a little bit more trouble with decimal numbers than we do, because they don't have eight fingers and two thumbs. The computer's equivalent "convenient" systems are based on powers of 2. This gives rise to the following numbering schemes: binary (base 2, digits 0 and 1), octal (base 8, digits 0..7) and hexadecimal (base 16, digits 0..9 and A..F, usually written with a "0x" prefix).
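The same number just looks different in each base. Python's built-in bin(), oct() and hex() functions make this easy to check, using the code for 'A' as the value:

```python
n = 65  # the code assigned to the glyph 'A'

print(bin(n))  # 0b1000001 -- binary, base 2
print(oct(n))  # 0o101     -- octal, base 8
print(hex(n))  # 0x41      -- hexadecimal, base 16
```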
Example:
So, if we want to create a file on disk containing "ABC", the sequence of numbers 65, 66, 67 is stored sequentially. If you ask a computer to read the file on disk and display the contents on the screen, the computer reads 65, 66, 67 and outputs the appropriate glyph for each number, so you see "ABC" on the screen.
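That round trip can be sketched in a few lines of Python (the file name "demo.txt" is just for illustration):

```python
# Write the codes for 'A', 'B', 'C' to disk as raw bytes.
with open("demo.txt", "wb") as f:
    f.write(bytes([65, 66, 67]))

# Read the bytes back and map each code to its glyph.
with open("demo.txt", "rb") as f:
    data = f.read()

print(list(data))            # [65, 66, 67]
print(data.decode("ascii"))  # ABC
```

Opening the file in binary mode ("wb"/"rb") shows that, underneath, it really is just those three numbers; the "ABC" you see is the decoding step.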
If you work at this level for long enough(!), you learn that:
'A' is decimal 65, hexadecimal 0x41
'a' is hexadecimal 0x61
'0'..'9' are 0x30..0x39
LF (line feed) is 0x0A (thanks to Robert124 for correcting this).
CR (carriage return) is 0x0D
SPACE is 0x20
'.' is 0x2E
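You don't have to take these on trust (or memorise them): Python's ord() gives the code for any character, and hex() prints it in the familiar hexadecimal form:

```python
# Print the hexadecimal code for each character in the list above.
for ch in ["A", "a", "0", "9", "\n", "\r", " ", "."]:
    print(repr(ch), "=", hex(ord(ch)))
```

Running this confirms the table: 'A' is 0x41, 'a' is 0x61, '0'..'9' are 0x30..0x39, LF ('\n') is 0xA, CR ('\r') is 0xD, space is 0x20 and '.' is 0x2E.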