Lexical analysis is the first phase of a compiler; the component that performs it is also known as the scanner. It converts the input program into a sequence of tokens.
Lexical analysis can be implemented with a deterministic finite automaton (DFA).
What is a token?
A lexical token is a sequence of characters that can be treated as a unit in the grammar of a programming language.
Example of tokens:
- Type tokens (id, number, real, ...)
- Punctuation tokens (IF, void, return, ...)
- Alphabetic tokens (keywords)
- Keywords; examples: for, while, if, etc.
- Identifiers; examples: variable names, function names, etc.
- Operators; examples: '+', '++', '-', etc.
- Separators; examples: ',', ';', etc.
Example of Non-Tokens:
- Comments, preprocessor directives, macros, blanks, tabs, newlines, etc.
How a Lexical Analyzer functions
- Tokenization, i.e., dividing the program into valid tokens.
- Removing whitespace characters.
- Removing comments.
- Helping generate error messages by providing the row and column number of each token.
The lexical analyzer identifies errors with the help of the underlying automaton and the grammar of the language it is built for, such as C or C++.
Suppose we pass the statement a = b + c ; through the lexical analyzer. It will generate a token sequence like this:
id = id + id ;
where each id refers to its variable's entry in the symbol table, which stores all of its details.
For example, an entire program can be scanned the same way: the analyzer reads it character by character and produces the list of all its valid tokens.
As another example, consider a printf statement with a single string argument. Such a statement contains 5 valid tokens.
Exercise 1:
Count the number of tokens:
Answer: Total number of tokens: 27.
Exercise 2:
Count the number of tokens in:
int max(int i);
- The lexical analyzer first reads int, finds it to be a valid keyword, and accepts it as a token.
- max is read next and, after ( is read, is recognized as a valid function name.
- int is also a token, then i is another token, then ), and finally ;.
Answer: Total number of tokens is 7: int, max, (, int, i, ), ;