Summary
Here, the chapter of coffee and cookies comes to an end. It began with reflections on abstract ideas about lexical structures, paying homage to them and to their importance for civilization and for cooperating organisms. It then provided a rationale for the application of sequential data streams and for the use of state machines to interpret them. The goal of these discussions was to provide intellectual comfort and, hopefully, to spark some interest in the subject of lexical analysis.
To present the process of lexical analysis properly, the concept of a DFA (deterministic finite automaton) was indispensable. The term ‘lexatom’ was introduced to reflect the multitude of events that may trigger DFA state transitions. After a set of basic terms was defined, the procedure by which lexatom streams are translated into token streams was explained. Finally, the work stages necessary to accomplish a functional lexical analyzer were discussed.
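The core idea recapped above, a DFA consuming a lexatom stream and emitting a token stream, can be illustrated with a minimal sketch. The states, transition table, and token names below are hypothetical examples for illustration only; they are not the output of any particular analyzer generator.

```python
# Minimal sketch: a DFA that turns a lexatom stream into a token stream.
# States ("INIT", "IN_ID", "IN_NUM"), categories, and token names are
# hypothetical illustrations, not part of any real generated analyzer.

DELTA = {                       # transition table: (state, category) -> state
    ("INIT",   "alpha"): "IN_ID",
    ("INIT",   "digit"): "IN_NUM",
    ("IN_ID",  "alpha"): "IN_ID",
    ("IN_ID",  "digit"): "IN_ID",
    ("IN_NUM", "digit"): "IN_NUM",
}
ACCEPT = {"IN_ID": "IDENTIFIER", "IN_NUM": "NUMBER"}   # accepting states

def classify(lexatom):
    # Map a lexatom to the category that triggers a state transition.
    if lexatom.isalpha():
        return "alpha"
    if lexatom.isdigit():
        return "digit"
    return "other"

def tokenize(lexatoms):
    tokens, state, lexeme, i = [], "INIT", "", 0
    while i < len(lexatoms):
        nxt = DELTA.get((state, classify(lexatoms[i])))
        if nxt is None:
            # No transition: emit a token if the state accepts, then restart.
            if state in ACCEPT:
                tokens.append((ACCEPT[state], lexeme))
            state, lexeme = "INIT", ""
            if classify(lexatoms[i]) == "other":
                i += 1          # skip separators such as whitespace
        else:
            state, lexeme = nxt, lexeme + lexatoms[i]
            i += 1
    if state in ACCEPT:         # flush the final lexeme at end of stream
        tokens.append((ACCEPT[state], lexeme))
    return tokens
```

Feeding the stream `"abc 42"` through `tokenize` yields the token stream `[("IDENTIFIER", "abc"), ("NUMBER", "42")]`, which mirrors the translation procedure described in the chapter.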