"While lex can be used to generate sequences of tokens for a
parser, it can also be used to perform simple input operations. A
number of programs have been written to convert simple English text
into dialects using only lex, for instance. The most famous is
probably the classic Swedish Chef translator, widely distributed on
the Internet. These work by recognizing simple bits and pieces of
patterns, and acting on them immediately. A good lexer example can
help a lot with learning how to write a tokenizer...
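The pattern-and-action style described above can be sketched without lex itself. The following Python snippet (an illustration, not the book's code) mimics how a lex-generated scanner works: at each input position it tries an ordered list of rules, fires the action of the first match, and moves on. The Swedish Chef substitution rules here are made up for the example.

```python
import re

# Hypothetical substitution rules, ordered longest-first to mimic
# lex's preference for the longest match. A rule is (pattern, action);
# an action of None means "echo the matched text unchanged".
RULES = [
    (re.compile(r"the"), "zee"),
    (re.compile(r"w"), "v"),
    (re.compile(r"."), None),  # default rule: copy one character
]

def chef(text):
    out, i = [], 0
    while i < len(text):
        # Like a lex scanner: try each rule at the current position,
        # act on the first match immediately, then advance past it.
        for pattern, replacement in RULES:
            m = pattern.match(text, i)
            if m:
                out.append(replacement if replacement is not None else m.group())
                i = m.end()
                break
    return "".join(out)

print(chef("the weather"))  # prints "zee veazeer"
```

The catch-all default rule at the end plays the role of lex's implicit "copy unmatched input to the output" behavior, which is what makes these one-pass text translators so short to write.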
"Most simple programming projects, of course, can get by with
very trivial lexers. A bit of care put into designing a language
can help simplify this substantially, by guaranteeing that tokens
can be recognized reliably without needing to know about context.
Of course, not all languages are this congenial; for instance, lex
has a hard time with C string constants, where a backslash changes
the meaning of the character that follows it..."
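The C string difficulty can be made concrete. In the sketch below (Python regular expressions standing in for lex patterns, as an illustration rather than the book's own example), a naive pattern that matches "quote, non-quotes, quote" stops at the first escaped quote, while a pattern that treats a backslash plus the following character as a single unit matches the whole constant.

```python
import re

# Naive pattern: a quote, any non-quote characters, a quote.
# It wrongly ends the string at an escaped quote like \".
NAIVE = re.compile(r'"[^"]*"')

# Inside the quotes, match either an escape sequence (backslash
# followed by any character) or any character that is neither a
# quote nor a backslash; repeat until the closing quote.
STRING = re.compile(r'"(\\.|[^"\\])*"')

src = r'printf("say \"hi\"");'
print(NAIVE.search(src).group())   # prints "say \"  (cut short)
print(STRING.search(src).group())  # prints "say \"hi\""  (complete)
```

The same fix carries over to lex: the pattern must name the escape sequence explicitly so that a backslash and the character after it are consumed together, never examined separately.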