That's not entirely true. In MySQL, with strict SQL mode disabled, NOT NULL doesn't raise an error when an INSERT supplies no value for the column. Instead the column is set to its implicit default: 0 for numeric columns, or an empty string for string columns. (With strict mode enabled, the INSERT fails instead.)
Suppose you set up a table:
CREATE TABLE myTable (
    someColumn varchar(5),
    testNullColumn int
);
Suppose you had this insert, which creates a single row in the table:
INSERT INTO myTable
SET someColumn='test'
Since NULLs are allowed in testNullColumn, the value of that column in this row would be NULL.
So the following query:
SELECT *
FROM myTable
WHERE testNullColumn=0
Would return no rows.
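As a quick sketch of that comparison behavior, here's the same scenario using SQLite through Python's built-in sqlite3 module. The NULL-comparison semantics shown here are standard SQL, so MySQL behaves the same way:

```python
import sqlite3

# In-memory SQLite database standing in for the MySQL table above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE myTable (someColumn TEXT, testNullColumn INTEGER)")

# No value supplied for testNullColumn, so it is stored as NULL.
conn.execute("INSERT INTO myTable (someColumn) VALUES ('test')")

# NULL = 0 evaluates to NULL (not true), so this matches nothing.
rows = conn.execute("SELECT * FROM myTable WHERE testNullColumn = 0").fetchall()
print(rows)  # → []
```

The query returns an empty result set even though the table has a row, because NULL never compares equal to anything, including 0.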
Now suppose you change testNullColumn to NOT NULL:
ALTER TABLE myTable MODIFY testNullColumn int NOT NULL;
Now with this insert:
INSERT INTO myTable
SET someColumn='test'
The value of testNullColumn would be set to 0 (zero), because the column is defined NOT NULL and MySQL (with strict mode off) fills in the implicit default.
So the query:
SELECT *
FROM myTable
WHERE testNullColumn=0
Would return that row.
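Sketching this case in SQLite takes one adjustment: SQLite has no implicit defaults for NOT NULL columns, so the DEFAULT 0 that MySQL supplies automatically (in non-strict mode) has to be spelled out in the DDL. With that assumption made explicit, the query now matches:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# DEFAULT 0 is written out here because SQLite won't infer it;
# MySQL with strict mode off applies this implicit default itself.
conn.execute(
    "CREATE TABLE myTable (someColumn TEXT, testNullColumn INTEGER NOT NULL DEFAULT 0)"
)
conn.execute("INSERT INTO myTable (someColumn) VALUES ('test')")

# The omitted column was filled with 0, so 0 = 0 matches the row.
rows = conn.execute("SELECT * FROM myTable WHERE testNullColumn = 0").fetchall()
print(rows)  # → [('test', 0)]
```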
The same is true for string columns. With a column defined as
textNotNullTest varchar(20) NOT NULL
the query
SELECT * FROM myTable WHERE textNotNullTest=''
would select the rows where no value was supplied.
If NULLs were allowed in the column, only rows whose textNotNullTest was explicitly set to an empty string (INSERT INTO myTable SET textNotNullTest='') would be selected; rows left at the default NULL would be ignored.
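A small sqlite3 sketch of the nullable string column makes the distinction concrete (again, the empty-string-vs-NULL comparison semantics are standard SQL):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE myTable (textNotNullTest TEXT)")  # NULLs allowed

conn.execute("INSERT INTO myTable (textNotNullTest) VALUES ('')")  # explicit empty string
conn.execute("INSERT INTO myTable DEFAULT VALUES")                 # left at NULL

# Only the explicitly-empty row matches; the NULL row is ignored.
rows = conn.execute(
    "SELECT COUNT(*) FROM myTable WHERE textNotNullTest = ''"
).fetchone()
print(rows[0])  # → 1
```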
If I recall correctly, it's actually the nullable columns that add storage overhead: InnoDB tracks NULLs in a per-row bitmap, about one bit per nullable column, so NOT NULL columns avoid that cost.
If a column allows NULLs, every query that touches it must account for the NULL case in addition to the expected values (e.g. with IS NULL or COALESCE).
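For instance, to treat NULL as zero you have to say so explicitly; a quick sqlite3 sketch:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE myTable (testNullColumn INTEGER)")
conn.execute("INSERT INTO myTable (testNullColumn) VALUES (NULL)")
conn.execute("INSERT INTO myTable (testNullColumn) VALUES (0)")

# WHERE testNullColumn = 0 alone would find only one row;
# COALESCE maps the NULL to 0 so both rows are counted.
n = conn.execute(
    "SELECT COUNT(*) FROM myTable WHERE COALESCE(testNullColumn, 0) = 0"
).fetchone()[0]
print(n)  # → 2
```

That extra COALESCE (or an IS NULL branch) is exactly the kind of bookkeeping a NOT NULL column spares you.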
I was taught, vehemently, to always create columns as NOT NULL, and I follow this practice. It makes life a lot easier and doesn't create much overhead.