How the C programming language bridges the man-machine gap

Roger L Costello costello at
Fri Apr 15 07:46:51 CDT 2022

Hi Folks,

I learned two neat things this week.

First, I learned how C bridges the gap between man and machine.

The C programming language allows you to create a char variable:

char ch;

and assign it a character:

ch = 'a';

Interestingly, you can do addition and subtraction on the char variable. That always puzzled me. Until today.

Humans interact with the computer (mostly) using characters. The computer, however, (mostly) uses numbers. How does the C programming language bridge the gap between human-friendly characters and machine-friendly numbers? Answer: by allowing addition and subtraction on char variables.

For example, what is this: 1

You might answer: It is the number 1.

And you would be wrong. It is a character, not a number.

Programs are written using characters. To instruct the computer that we really mean the number 1, not the character 1, we must convert the character 1 to the number 1. That's where addition and subtraction of characters come in.

(As you know) Every character is represented inside the computer as a number. To convert the character 1 to the number 1 we subtract the numeric representation of the character 1 from the numeric representation of the character 0:

int val = ch - '0';

To convert the sequence of characters 123 to the number 123 we iterate through the characters, converting each character to a number and accumulating: multiply the running total by 10, then add the new digit. After three digits, the first digit has effectively been multiplied by 100, the second by 10, and the third by 1:

int ch, val = 0;
while ((ch = getchar()) >= '0' && ch <= '9')
    val = val * 10 + ch - '0';

That little loop bridges the gap between the human-friendly string 123 and the machine-friendly number 123. Neat!

The second thing I learned this week is to liberally sprinkle assertions throughout a program. Oddly enough, this is why mastery of Boolean logic is essential.

In his book The Science of Programming, David Gries has a great example of converting degrees Fahrenheit to degrees Celsius. He shows the value of adding assertions throughout the code to ensure that the code behaves as expected. He says that assertions provide a way to "develop a program and its proof hand-in-hand, with the proof ideas leading the way!"

When I went to school the computer science department emphasized learning Boolean logic. While I enjoyed learning Boolean logic, it puzzled me why there would be such emphasis on it. 

As I was reading David Gries' book the answer dawned on me: assertions are Boolean expressions; master Boolean expressions and it will take you a long way toward writing provably correct programs. Neat!

Well, those are the two revelations for me this week. I realize they are pretty basic. Sometimes I don't fully internalize a concept until I see the right use case. This week I saw the right use case.

