For the last 100 years, logic has been grappling with the fact that true and false are not enough to describe the world of computation. You also need a third value, undecidable (or null), for statements that can be proved never to terminate.
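As a rough illustration (my own sketch, not part of the argument above), Kleene's strong three-valued logic captures this idea: `None` stands in for "unknown/undecidable", and the connectives only commit to true or false when the known inputs already force the answer.

```python
# Sketch of Kleene's strong three-valued logic, using Python's
# None as the third "unknown/undecidable" truth value.

def k_and(a, b):
    # False dominates: False AND anything is False, even unknown.
    if a is False or b is False:
        return False
    if a is True and b is True:
        return True
    return None  # unknown

def k_or(a, b):
    # True dominates: True OR anything is True, even unknown.
    if a is True or b is True:
        return True
    if a is False and b is False:
        return False
    return None  # unknown

def k_not(a):
    # Negation of unknown stays unknown.
    return None if a is None else not a
```

Note this is also roughly how NULL behaves in SQL: `TRUE OR NULL` is `TRUE`, but `TRUE AND NULL` stays `NULL`.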
It is very much still an open question whether a statement can be absolutely undecidable, and whether we need to add something like null to all of logic [0].
As for your arguments for why we don't need null, they sound exactly like the medieval arguments against zero [1]:
>Just as the rag doll wanted to be an eagle, the donkey a lion and the monkey a queen, the zero put on airs and pretended to be a digit.
[0] http://logic.harvard.edu/koellner/QAU_reprint.pdf
[1] Menninger, Karl (1969), Number Words and Number Symbols. Cambridge, Mass.: The M.I.T. Press.