Question

Salutations!

While reading chapter 3 of the Dragon Book (lexical analysis), I understood pretty much everything (how they specify tokens with regular expressions) until they started talking about finite automata. And it seems to be a huge part of describing the lexical analyzer.

Now I understand the concepts of finite automata, but I don't understand their role and use in the lexical analyzer. Why not just specify tokens with regular expressions?

Thanks in advance.


Solution

A regular expression can be represented by a finite automaton, and more precisely by a deterministic one (a DFA).

When you write a regex, the lexer converts it into a DFA and then simulates that automaton to find matches in the text. So finite automata very much have a role in the lexical analyzer: the regex is the specification of a token, and the automaton is the machine that actually recognizes it (see the sketch below).
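Here is a minimal sketch of what "simulating a DFA" looks like for one token kind, identifiers matching the regex `[a-z][a-z0-9]*`. The transition table, the state numbers, and the `longest_match` helper are all illustrative assumptions, not code from the book or from any particular lexer generator:

```python
# Hand-built DFA for the regex [a-z][a-z0-9]* (illustrative example).
# State 0 is the start state; state 1 is accepting.

def char_class(c):
    # Map a character to the input alphabet the table is indexed by.
    if c.islower():
        return "letter"
    if c.isdigit():
        return "digit"
    return "other"

TABLE = {
    0: {"letter": 1},
    1: {"letter": 1, "digit": 1},
}
ACCEPTING = {1}

def longest_match(text, pos=0):
    """Run the DFA from `pos` and return the end index of the longest
    identifier match, or None if no identifier starts here."""
    state, last_accept, i = 0, None, pos
    while i < len(text):
        nxt = TABLE.get(state, {}).get(char_class(text[i]))
        if nxt is None:
            break                 # no transition: the DFA is stuck
        state = nxt
        i += 1
        if state in ACCEPTING:
            last_accept = i       # remember the longest accepting prefix
    return last_accept

print(longest_match("foo42+bar"))  # 5 -> the lexeme is "foo42"
```

This table-driven loop is why lexers want automata: matching is a single linear pass over the input, one table lookup per character, with no backtracking.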

There are very simple algorithms to convert a regex to a finite automaton, like Thompson's construction algorithm (http://en.wikipedia.org/wiki/Thompson's_construction_algorithm), which produces an NFA; the subset construction then turns that NFA into a DFA, and you can find more optimized algorithms as well. A sketch of Thompson's construction follows.
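As a rough illustration, here is a minimal sketch of Thompson's construction for a tiny regex subset (literal characters, concatenation, union, and star), together with a direct NFA simulation. The class and function names (`State`, `NFA`, `literal`, `concat`, `union`, `star`, `matches`) are my own, not a standard API:

```python
class State:
    def __init__(self):
        self.edges = []  # list of (symbol, target); symbol None = epsilon

class NFA:
    def __init__(self, start, accept):
        self.start, self.accept = start, accept

def literal(c):
    s, a = State(), State()
    s.edges.append((c, a))
    return NFA(s, a)

def concat(n1, n2):
    n1.accept.edges.append((None, n2.start))  # glue with an epsilon edge
    return NFA(n1.start, n2.accept)

def union(n1, n2):
    s, a = State(), State()
    s.edges += [(None, n1.start), (None, n2.start)]
    n1.accept.edges.append((None, a))
    n2.accept.edges.append((None, a))
    return NFA(s, a)

def star(n):
    s, a = State(), State()
    s.edges += [(None, n.start), (None, a)]        # enter loop, or skip it
    n.accept.edges += [(None, n.start), (None, a)]  # repeat, or leave
    return NFA(s, a)

def eps_closure(states):
    stack, seen = list(states), set(states)
    while stack:
        for sym, tgt in stack.pop().edges:
            if sym is None and tgt not in seen:
                seen.add(tgt)
                stack.append(tgt)
    return seen

def matches(nfa, text):
    current = eps_closure({nfa.start})
    for c in text:
        moved = {t for st in current for sym, t in st.edges if sym == c}
        current = eps_closure(moved)
    return nfa.accept in current

# (a|b)*abb -- the classic example from the Dragon Book
pattern = concat(concat(concat(star(union(literal('a'), literal('b'))),
                               literal('a')), literal('b')), literal('b'))
print(matches(pattern, "abb"))    # True
print(matches(pattern, "ababb"))  # True
print(matches(pattern, "abab"))   # False
```

A production lexer generator would go one step further and apply the subset construction to this NFA, producing a DFA table like the one in the first sketch, so that matching needs no set-of-states bookkeeping at runtime.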

Licensed under: CC-BY-SA with attribution