Unit 1 Flashcards
(20 cards)
What is a compiler?
A compiler is a program that translates source code written in a high-level programming language into an equivalent program in a target language, typically machine code.
This translation proceeds through several phases.
What are the phases of a compiler?
The phases of a compiler are lexical analysis, syntax analysis, semantic analysis, intermediate code generation, code optimization, and code generation.
What are the types of compilers?
Types of compilers include single-pass compilers, multi-pass compilers, and just-in-time compilers.
What is lexical analysis?
Lexical analysis is the first phase of a compiler that converts a sequence of characters into a sequence of tokens.
What is the role of the lexical analyzer?
The lexical analyzer reads the source code character by character, groups the characters into lexemes, and produces a stream of tokens for the syntax analyzer to process.
What are tokens?
Tokens are the smallest units of meaning in the source code, such as keywords, operators, and identifiers.
What are patterns in lexical analysis?
A pattern is a rule, usually written as a regular expression, that describes the set of lexemes which can represent a given token.
What are lexemes?
Lexemes are the actual character sequences in the source code that match a token’s pattern.
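A minimal sketch in Python makes the token/pattern/lexeme distinction concrete; the token names and patterns below are invented for illustration, not taken from any particular compiler:

```python
import re

# Hypothetical token specification: each token type (left) is paired
# with the regular-expression pattern (right) that defines it.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("ID",     r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("SKIP",   r"\s+"),
]

MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(source):
    """Yield (token_type, lexeme) pairs: the lexeme is the exact
    character sequence that matched the token's pattern."""
    for m in MASTER.finditer(source):
        if m.lastgroup != "SKIP":
            yield (m.lastgroup, m.group())

print(list(tokenize("count = count + 1")))
```

Here "count" and "1" are lexemes; ID and NUMBER are the tokens they are classified as; the regular expressions are the patterns.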
What is input buffering?
Input buffering is a technique for reading source characters in large blocks rather than one at a time, reducing I/O overhead during lexical analysis.
What are buffer pairs?
Buffer pairs are two half-buffers used in input buffering: while the scanner processes one half, the other is refilled, so a lexeme can span a buffer boundary without re-reading input.
What are sentinels in input buffering?
Sentinels are special characters (typically eof) placed at the end of each buffer half so that a single test detects both the end of a buffer and the end of the input.
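The buffer-pair and sentinel ideas can be sketched in Python (an illustrative model, not the layout of any real scanner; the buffer size and sentinel character are arbitrary choices):

```python
# Each refilled buffer ends with a sentinel character (here "\0"),
# so the scanner's inner loop needs only one end test per character
# instead of separately checking for buffer end and input end.
BUF_SIZE = 8
SENTINEL = "\0"

def read_with_sentinels(text):
    """Yield the characters of `text`, refilling a buffer on demand."""
    pos = 0
    while pos < len(text):
        # Fill the next buffer and append the sentinel.
        chunk = text[pos:pos + BUF_SIZE] + SENTINEL
        i = 0
        while True:
            ch = chunk[i]
            if ch == SENTINEL:   # single test: buffer exhausted,
                break            # outer loop decides if input remains
            yield ch
            i += 1
        pos += BUF_SIZE

print("".join(read_with_sentinels("if x == 10:")))
```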
What are regular sets?
Regular sets are the sets of strings that can be described by regular expressions; each regular expression denotes a regular set (a regular language).
What are regular expressions?
Regular expressions are patterns used to match character combinations in strings.
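For example, the regular expression (a|b)*abb denotes the regular set of all strings over {a, b} that end in "abb"; a quick check in Python (using its `re` module as a stand-in for the formal notation):

```python
import re

# (a|b)*abb denotes the regular set of strings over {a, b} ending in "abb".
pattern = re.compile(r"(a|b)*abb")

for s in ["abb", "aabb", "babb", "ab", "abba"]:
    print(s, bool(pattern.fullmatch(s)))
```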
What is the specification of tokens?
The specification of tokens describes the patterns that define each token type.
What is token recognition?
Token recognition is the process of identifying tokens from the input stream based on their specifications.
What is the conversion from regular expression (RE) to DFA?
The conversion from a regular expression to a deterministic finite automaton (DFA) produces a DFA that recognizes the same language as the regular expression, typically by first building a nondeterministic finite automaton (NFA) with Thompson's construction and then applying the subset construction.
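The result of such a conversion is a transition table. Below is the textbook DFA for (a|b)*abb together with a simulator, as a sketch of what the construction produces:

```python
# Transition table of the DFA for (a|b)*abb, as obtained from the
# subset construction (the classic textbook example).
DFA = {
    0: {"a": 1, "b": 0},
    1: {"a": 1, "b": 2},
    2: {"a": 1, "b": 3},
    3: {"a": 1, "b": 0},
}
START, ACCEPT = 0, {3}

def accepts(s):
    """Run the DFA over s and report whether it ends in an accepting state."""
    state = START
    for ch in s:
        state = DFA[state][ch]
    return state in ACCEPT

print(accepts("aabb"), accepts("abab"))  # True False
```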
What is minimization of finite state machines?
Minimization of finite state machines is the process of reducing the number of states in a finite state machine while preserving the language it accepts, typically by merging states that are indistinguishable from one another.
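A sketch of minimization by partition refinement (Moore's algorithm, simplified): start from the partition {accepting, non-accepting} and keep splitting groups whose states transition into different groups. The input is the unminimized 5-state DFA for (a|b)*abb, in which states A and C are indistinguishable:

```python
def minimize(states, alphabet, delta, accepting):
    """Merge indistinguishable DFA states by refining the partition
    {accepting, non-accepting} until no group can be split further."""
    partition = [set(accepting), set(states) - set(accepting)]
    changed = True
    while changed:
        changed = False
        new_partition = []
        for group in partition:
            # Split the group by which groups each state's transitions reach.
            buckets = {}
            for s in group:
                key = tuple(
                    next(i for i, g in enumerate(partition) if delta[s][c] in g)
                    for c in alphabet
                )
                buckets.setdefault(key, set()).add(s)
            new_partition.extend(buckets.values())
            if len(buckets) > 1:
                changed = True
        partition = new_partition
    return partition

# Unminimized DFA for (a|b)*abb; A and C behave identically.
delta = {
    "A": {"a": "B", "b": "C"},
    "B": {"a": "B", "b": "D"},
    "C": {"a": "B", "b": "C"},
    "D": {"a": "B", "b": "E"},
    "E": {"a": "B", "b": "C"},
}
groups = minimize(set(delta), "ab", delta, {"E"})
print(sorted(sorted(g) for g in groups))  # [['A', 'C'], ['B'], ['D'], ['E']]
```

The five states collapse to four: A and C are merged because no input string can distinguish them.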
What is the Lex tool?
Lex is a lexical-analyzer generator: given a specification pairing regular expressions with actions, it produces a lexical analyzer (as a C program).
What is the Yacc tool?
Yacc (Yet Another Compiler-Compiler) is a parser generator: given a context-free grammar with actions, it produces an LALR parser (as a C program).
What is a language processing system?
A language processing system is the complete toolchain that turns a source program into a running program, typically a preprocessor, a compiler, an assembler, and a linker/loader.