450 exam 2 Flashcards

(139 cards)

1
Q

static semantics vs dynamic semantics

A

static: restrictions on the legal forms of programs that can be checked before run time (e.g., type constraints)
dynamic: the meaning of expressions, statements, and program units

2
Q

attribute grammar is a ? grammar with additions

A

context-free

3
Q

what are the three additions to context-free grammars that make an attribute grammar? (“need to understand but not write all down”)

A
  1. each grammar symbol has a set of attributes
  2. each rule has function(s) that define certain attributes of the nonterminals in the rule
  3. each rule has predicate(s) to check for attribute consistency
4
Q

fully attributed parse tree

A

if all the attribute values in a parse tree have been computed

5
Q

three approaches to describe dynamic semantics

A

operational, denotational, axiomatic

6
Q

dynamic semantics: operational vs denotational vs axiomatic

A

operational: describe meaning by executing its statements on a machine
denotational: meaning defined by the values of the program’s variables
axiomatic: based on mathematical logic (predicate calculus); axioms (inference rules) defined for each statement type in the language

7
Q

example of operational semantics

A

JUnit Testing

@Test
public void testBinarySearch() {
    // run the code under test and check the resulting values
}
8
Q

example of denotational semantics

A

grammar rules paired with semantic (mapping) functions that map each syntactic construct to a mathematical object, e.g., mapping a binary numeral to the number it denotes

9
Q

example of axiomatic semantics

A

{x > 5} x = x - 3 {x > 0}

precondition, statement, postcondition

10
Q

assertions

A

the logical expressions used in axiomatic semantics

11
Q

precondition

A

an assertion before a statement that states the relationships and constraints among variables that are true at that point in execution

12
Q

weakest precondition

A

the least restrictive precondition that will guarantee the postcondition

13
Q

postcondition

A

an assertion following a statement that describes the state of the program after the statement executes

14
Q

what is the weakest precondition for x = x - 3 {x > 0} ?

A

{x > 3}

(the assignment axiom substitutes x - 3 for x in the postcondition: x - 3 > 0, so x > 3)

15
Q

what is the weakest precondition for a = b + 1 {a > 1} ?

A

{b > 0}

(substitute b + 1 for a in the postcondition: b + 1 > 1, so b > 0)

16
Q

the syntax analysis portion of a language nearly always consists of two parts

A

lexical analysis (the low-level part) and syntax analysis, or parsing (the high-level part)

17
Q

lexical analyzer

A

converts characters in the source program into lexical units (lexemes and tokens)

18
Q

lexeme

A

the lowest level syntactic unit (such as =)

19
Q

token

A

a category of lexemes (such as ASSIGN_OP)

20
Q

syntax analyzer (or parser)

A

analyzes the syntactical structure of the given input and checks if the input is in the correct syntax of the language

21
Q

two categories of parsers

A

top-down and bottom-up

22
Q

top-down parser

A

produces the parse tree, beginning at the root (downward to the leaves) – order is that of a leftmost derivation

23
Q

bottom-up parser

A

produces the parse tree, beginning at the leaves (upward to the root) – order is that of the REVERSE of a rightmost derivation

24
Q

two most common top-down parsing algorithms (LL algorithms)

A

recursive-descent parser and LL parser

25
what does LL stand for in LL algorithms?
Left-to-right scan of the input, Leftmost derivation
26
recursive-descent parser
there is a subprogram for each nonterminal in the grammar, which can parse sentences that can be generated by that nonterminal; a coded implementation based on BNF or EBNF
27
recursive-descent parser coding process
for each terminal symbol in the RHS, compare it with the next input token (if they match, continue; otherwise, there is a syntax error); for each nonterminal symbol in the RHS, call its associated parsing subprogram
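A minimal C++ sketch of this process, assuming the toy grammar <expr> -> <term> { + <term> }, <term> -> id (the grammar, token names, and helper functions here are invented for illustration):
```
#include <cstdlib>
#include <iostream>
#include <string>
#include <vector>

std::vector<std::string> tokens;   // token stream produced by a lexer (assumed)
size_t pos = 0;

std::string next() { return pos < tokens.size() ? tokens[pos] : "$"; }

void error(const std::string& msg) {
    std::cerr << "syntax error: " << msg << "\n";
    std::exit(1);
}

// <term> -> id : compare the terminal with the next input token
void term() {
    if (next() == "id") ++pos;     // match: consume the token and continue
    else error("expected id");     // mismatch: report an error
}

// <expr> -> <term> { + <term> } : call the subprogram for each nonterminal
void expr() {
    term();
    while (next() == "+") {        // terminal + in the RHS
        ++pos;                     // match +
        term();                    // nonterminal: call its parsing subprogram
    }
}

int main() {
    tokens = {"id", "+", "id"};    // tokens for an input such as a + b
    expr();
    if (next() != "$") error("unexpected extra input");
    std::cout << "parsed successfully\n";
}
```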
28
LL parser
table- (or stack-) driven implementation
29
LL parser procedure
in each step, the parser looks at the next available symbol from the input stream and the top-most symbol on the stack; if the stack top is a nonterminal, it is replaced by the RHS of one of its rules, chosen using the next input symbol (application of a grammar rule); if the stack top is a terminal that matches the input symbol, the parser discards them both, leaving only the unmatched symbols in the input stream and on the stack
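A minimal C++ sketch of the stack-driven loop, assuming the toy LL(1) grammar S -> ( S ) S | ε for balanced parentheses (the grammar and parse-table entries are invented for illustration; the table is hard-coded as an if/else chain):
```
#include <iostream>
#include <stack>
#include <string>

bool parse(const std::string& input) {
    std::string in = input + "$";          // append an end-of-input marker
    std::stack<char> st;                   // the parse stack
    st.push('$');
    st.push('S');                          // push the start symbol
    size_t i = 0;
    while (!st.empty()) {
        char top = st.top();
        char look = in[i];                 // next available input symbol
        if (top == look) {                 // match: discard both symbols
            st.pop();
            ++i;
        } else if (top == 'S') {           // nonterminal on top: apply a grammar rule
            st.pop();
            if (look == '(') {             // table entry M[S, ( ] = ( S ) S
                st.push('S'); st.push(')'); st.push('S'); st.push('(');
            } else if (look != ')' && look != '$') {
                return false;              // no table entry: syntax error
            }                              // M[S, ) ] = M[S, $ ] = epsilon: push nothing
        } else {
            return false;                  // terminal mismatch: syntax error
        }
    }
    return i == in.size();                 // accepted only if all input was consumed
}

int main() {
    std::cout << parse("(())()") << "\n";  // prints 1 (valid)
    std::cout << parse("(()") << "\n";     // prints 0 (syntax error)
}
```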
30
purpose of the pairwise disjointness test
indicates whether the parser can always choose the correct RHS on the basis of the next token of input
31
pairwise disjointness test
for each nonterminal A in the grammar that has more than one RHS, for each pair of rules A -> ai and A -> aj, it must be true that FIRST(ai) ∩ FIRST(aj) = empty set
32
pairwise disjointness test on A -> a | bB | cAb
```
FIRST(a) = {a}
FIRST(bB) = {b}
FIRST(cAb) = {c}
```
pass
33
pairwise disjointness test on A -> a | aB
```
FIRST(a) = {a}
FIRST(aB) = {a}
```
fail
34
bottom-up parser (LR algorithm)
reads input text from left to right and produces a rightmost derivation in reverse (starts with the input and attempts to rewrite it to the start symbol)
35
two most common actions for bottom-up parsers
shift and reduce
36
shift
advances in the input stream by one symbol; that shifted symbol becomes a new node in the parse tree
37
reduce
applies a grammar rule to some of the recent parse trees, joining them together as one tree with a new root symbol
38
shift-reduce algorithm
if the input has no syntax errors, the parser continues to shift and reduce until all the input has been consumed and all of the parse trees have been reduced to a single tree representing an entire legal (valid) input
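A short worked trace, assuming the toy grammar E -> E + id | id and the input id + id: shift id; reduce by E -> id; shift +; shift id; reduce by E -> E + id. All input has been consumed and a single tree rooted at E remains, so the input is accepted; read in order, the reductions are the reverse of the rightmost derivation E => E + id => id + id.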
39
l-value of a variable
address
40
r-value of a variable
contents
41
what does x represent in the following statement? x = 5;
the l-value or address of the variable
42
what does x represent in the following statement? y = x + z;
the r-value or contents of the variable x
43
a binding is static if
it first occurs before run time and remains unchanged throughout the program execution
44
static binding is also called
early binding
45
dynamic binding is also called
late binding
46
a binding is dynamic if
it first occurs during execution or can change during the execution of the program
47
static type binding: explicit declaration definition + example
a statement in a program that lists variable names and specifies that they are a particular data type; e.g., in Java or C++: int a = 5;
48
static type binding: implicit declaration definition
a means of associating variables with types through context (type inference) or default conventions
49
Example of type inference (static type binding - implicit declaration)
In C#, a variable can be declared with var and an initial value, which sets the type of the variable.
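C++'s auto works the same way (shown only as an analogous sketch, not the card's C# example):
```
#include <iostream>

int main() {
    auto n = 5;        // the compiler infers int from the initializer
    n = 7;             // the binding is still static: n remains an int
    // n = "hi";       // would be a compile-time type error
    std::cout << n << "\n";
}
```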
50
Example of default conventions (static type binding - implicit declaration)
In Perl, any name that begins with $ is a scalar, a name that begins with @ is an array, so $apple and @apple are variables with different types.
51
for statically typed variables, their types are
fixed for the lifetime of the unit in which they are declared (that is, you cannot change the data type)
52
dynamic type binding: any variable can be assigned any type value, and a variable's type (can/cannot) change during execution
can
53
example of dynamic type binding (language such as Python, Ruby, PHP, etc. use it)
in Python:
```
a = 17
a = 'hi'
```
54
lifetime of a variable
the time during which it is bound to a particular memory cell (how long it exists in memory)
55
allocation
getting a memory cell from some pool of available cells
56
deallocation
putting a memory cell back into the pool
57
four categories of variables by lifetime
static, stack-dynamic, explicit heap-dynamic, implicit heap-dynamic
58
static variables + example
variables that are bound to memory cells before execution begins and stay bound to the same cells through the entire run (declared with the keyword static); e.g., static variables in Java and C++
59
stack-dynamic variables + example
come into existence when a method (or function) is called and disappear when the method ends; they are either in the parameter list or declared inside the method; e.g., the parameters and local variables of a Java or C++ method
60
explicit heap-dynamic variables + example
allocated and deallocated by explicit instructions written by the programmer; e.g., objects in Java and C++, which come into being via the new operator and are referenced through references or pointers (in C++ they must also be explicitly deallocated with the delete operator)
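A minimal C++ sketch (the variable name is invented for illustration):
```
#include <iostream>

int main() {
    int *count = new int(0);   // explicit allocation of a heap-dynamic variable via new
    *count = *count + 1;       // the variable is accessed through the pointer
    std::cout << *count;       // prints 1
    delete count;              // explicit deallocation by the programmer via delete
}
```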
61
implicit heap-dynamic variables + example
bound to heap storage only when they are assigned values; e.g., in Python, list = [74, 84, 86, 90, 71] makes list a list of 5 numeric values, regardless of what (if anything) it was previously
62
scope of a variable
the range of statements over which it is visible
63
local variables of a program unit
those that are declared in that unit
64
nonlocal variables of a program unit
those that are visible in the unit but not declared there
65
global variables
a special category (subset) of nonlocal variables
66
static scoping (also called lexical scoping)
a reference is resolved from the program text: the local scope is searched first, then its static parent (the unit that textually encloses it), and so on outward -- a property of the program text, not the runtime stack
67
(dynamic/static) scoping is used by most of the major programming languages
static
68
dynamic scoping
based on the calling sequences of program units, not their textual layout; a unit's dynamic parent is the unit that called it
69
how does the search process for dynamic scoping work?
the locals of the currently executing unit are searched first, then successively all of the calling functions (resolved at run time)
70
dynamic scoping results in (less/more) reliable programs than static scoping
less
71
referencing environment of a statement
the collection of all names that are visible in the statement
72
what is the referencing environment in a static-scoped language?
the local variables plus all of the visible variables in all of the enclosing scopes
73
what is the referencing environment in a dynamic-scoped language?
the local variables plus all visible variables in all active subprograms
74
primitive data types
those that are not defined in terms of other data types; provided by almost all programming languages
75
ordinal type
one in which the range of possible values can be easily associated with a set of integers
76
examples of primitive ordinal types in Java
integer, char, boolean
77
boolean in Java versus C/C++
In Java, boolean is a distinct type whose only values are true and false (an integer cannot be used as a boolean); in C/C++, any nonzero value is true and zero is false
78
enumeration type
one in which all of the possible values, which are named constants, are provided (or enumerated) in the definition
79
Java example of enumeration type for Days
```
enum Day {SUN, MON, TUE, WED, THU, FRI, SAT}
Day workday = Day.WED;
```
80
array
a homogeneous aggregate of data elements in which an individual element is identified by its position in the aggregate, relative to the first element
81
static array
subscript ranges are statically bound and storage allocation is static (before run-time)
82
advantage to static arrays
efficiency
83
four categories of arrays
static, fixed stack-dynamic, fixed heap-dynamic, heap-dynamic
84
fixed stack-dynamic array
subscript ranges are statically bound, but the allocation is done during execution
85
advantage to fixed stack-dynamic arrays
space efficiency (just allocate enough memory space)
86
fixed heap-dynamic array
similar to fixed stack-dynamic, storage binding is done when requested, but storage is allocated from heap, not stack
87
heap-dynamic array
binding of subscript ranges and storage allocation is dynamic and can change any number of times
88
advantage to heap-dynamic arrays
flexibility (arrays can grow or shrink at any time during program execution)
89
arrays declared in C and C++ functions that include the static modifier are
static
90
arrays declared in C and C++ functions without the static modifier are
fixed stack-dynamic
91
fixed heap-dynamic arrays use ? to manage heap storage; an array is treated as ?
new and delete; a pointer to a collection of storage cells
92
in Java, all non-generic arrays are
fixed heap-dynamic
93
Perl, JavaScript, Python, and Ruby support
heap-dynamic arrays
94
static array example (C++)
static int nums[10] = {1, 3, 5, 7};
95
fixed stack-dynamic array example (C++)
int nums[] = {1, 3, 5, 7};
96
fixed heap-dynamic array example (C++)
int *nums = new int[10]; ... delete[] nums;
97
heap-dynamic array example (Python)
```
myList = [1, 3, 5]
myList = [2, 4, 6, 8, 10]
```
98
slice
some substructure of an array
99
Python slice example vector = [2, 4, 6, 8, 10, 12, 14, 16] what does vector[3:6] return?
[8, 10, 12] gets 3-element array from index 3 (inclusive) to 6 (exclusive)
100
Python 2D slice example mat = [[1, 2, 3], [4, 5, 6], [7, 8, 9]] what does mat[0][0:2] return?
[1, 2] gets the first and second element of the first row of mat
101
Ruby slice example list = [2, 4, 6, 8, 10] what does list.slice(2, 2) return?
[6, 8] start at index 2, get 2 elements; returns the third and fourth elements of list
102
associative array
an unordered collection of data elements that are indexed by an equal number of values called keys
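A small C++ example using std::map, one common associative-array type (keys and values are invented for illustration):
```
#include <iostream>
#include <map>
#include <string>

int main() {
    std::map<std::string, int> age;   // keys are names, values are integers
    age["alice"] = 30;                // insert key-value pairs
    age["bob"] = 25;
    std::cout << age["alice"];        // look up an element by its key; prints 30
}
```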
103
associative arrays are also called ? and they contain ?
dictionaries, maps, or hashes; key-value pairs
104
record
a heterogeneous aggregate of data elements in which the individual elements are identified by names
105
tuple
a data type that is similar to a record, except that the elements are not named
106
Python's tuples are closely related to its lists, but they are
immutable (cannot change anything in the tuple); uses parentheses instead of square brackets
107
list
serves as Python's array type as well; lists are mutable and their elements can be of any type
108
[x ** 2 for x in range(6) if x % 3 == 0]
[0, 9]; range(6) yields 0 through 5, and only 0 and 3 satisfy x % 3 == 0, so the result is 0 ** 2 and 3 ** 2
109
[y * 3 for y in range(1, 20, 3) if y % 4 == 0]
[12, 48]; range(1, 20, 3) yields 1, 4, 7, 10, 13, 16, 19; only 4 and 16 satisfy y % 4 == 0, giving 4 * 3 and 16 * 3
110
[y * 3 for y in range (20, 1, -3) if y % 4 == 0]
[60, 24]; range(20, 1, -3) yields 20, 17, 14, 11, 8, 5, 2; only 20 and 8 satisfy y % 4 == 0, giving 20 * 3 and 8 * 3
111
union
a type whose variables are allowed to store different type values at different times during execution
112
free union
no language support for type checking; unsafe
113
discriminated union
supports type checking
114
why do Java and C# not support unions? what do they use?
growing concerns for safety in programming languages; use classes instead
115
pointer
a variable that stores the memory address of another value located in the memory
116
reference types
reference variables refer to objects rather than holding raw memory addresses; Java uses reference variables and allows them to replace pointers entirely (because pointers are not safe)
117
two fundamental pointer operations
assignment and dereferencing
118
assignment
used to set a pointer variable's value to some useful memory address
119
address-of-operator
&
120
dereference operator
*
121
dereferencing
yields the value stored at the location represented by the pointer’s value
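Both operations in one C++ sketch:
```
#include <iostream>

int main() {
    int x = 7;
    int *p = &x;              // assignment: & yields the address of x, stored in p
    std::cout << *p << "\n";  // dereferencing: * yields the value stored at that address (7)
    *p = 10;                  // assigning through the dereferenced pointer changes x
    std::cout << x << "\n";   // prints 10
}
```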
122
problems with pointers
dangling pointers and lost heap-dynamic variables (and hence memory leakage)
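Both problems in one C++ sketch:
```
int main() {
    int *p = new int(5);
    int *q = p;          // two pointers to the same heap-dynamic variable
    delete p;            // the variable is deallocated
    // q is now a dangling pointer: dereferencing *q would be undefined behavior

    int *r = new int(1);
    r = new int(2);      // the first cell is no longer reachable:
                         // a lost heap-dynamic variable, i.e., memory leakage
    delete r;
}
```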
123
solution to dangling pointer problem
two algorithms: tombstone and locks-and-keys
124
tombstone algorithm
every heap-dynamic variable includes an extra cell, called a tombstone, that points at the variable; pointer variables point at the tombstone; when the variable is deallocated, the tombstone remains and is set to nil, marking that the variable is no longer there (dead), so dangling references can be detected
125
locks-and-keys algorithm
pointers are represented as ordered pairs (key, address), where the key is an integer value; the heap-dynamic variable stores a matching lock value in a header cell; every access through the pointer compares the key in the pointer's cell with the lock value in the variable's cell, and access is allowed only if they match
126
solution to heap management problem
two approaches (algorithms) to reclaim garbage: reference counters (eager) and mark-sweep (lazy)
127
reference counter algorithm
this algorithm maintains, for each heap cell, a counter that stores the number of pointers currently pointing at the cell; if the reference counter reaches zero, the cell has become garbage and can be reclaimed
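C++'s std::shared_ptr is a familiar reference-counted (eager) scheme, shown here only as an illustration:
```
#include <iostream>
#include <memory>

int main() {
    auto p = std::make_shared<int>(42);     // the cell's counter is 1
    {
        auto q = p;                         // a second pointer to the same cell: counter is 2
        std::cout << p.use_count() << "\n"; // prints 2
    }                                       // q is destroyed: counter drops back to 1
    std::cout << p.use_count() << "\n";     // prints 1
}                                           // counter reaches 0: the cell is reclaimed
```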
128
mark-sweep algorithm
mark phase: trace every pointer reachable from the program's variables and mark each cell that is reached (any cell left unmarked is garbage) | sweep phase: reclaim the heap space used by the garbage (unmarked) cells and make the space available to the program again
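A minimal C++ sketch of the two phases, assuming a simplified object model invented for illustration (not a production collector):
```
#include <vector>

struct Cell {
    bool marked = false;
    std::vector<Cell*> refs;     // cells this cell points at
};

std::vector<Cell*> heap;         // every allocated cell
std::vector<Cell*> roots;        // cells directly reachable from the program's variables

// mark phase: mark every cell reachable from the roots
void mark(Cell* c) {
    if (c == nullptr || c->marked) return;
    c->marked = true;
    for (Cell* r : c->refs) mark(r);
}

// sweep phase: reclaim every unmarked (garbage) cell; clear marks for the next collection
void sweep() {
    std::vector<Cell*> live;
    for (Cell* c : heap) {
        if (c->marked) { c->marked = false; live.push_back(c); }
        else delete c;           // unreachable: return its space to the pool
    }
    heap = live;
}

void collect() {
    for (Cell* r : roots) mark(r);
    sweep();
}

int main() {
    Cell* a = new Cell; heap.push_back(a); roots.push_back(a);   // reachable from a root
    Cell* b = new Cell; heap.push_back(b); a->refs.push_back(b); // reachable through a
    Cell* c = new Cell; heap.push_back(c);                       // unreachable: garbage
    collect();                                                   // reclaims only c
}
```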
129
garbage
an allocated heap-dynamic variable that is no longer accessible to the user program
130
memory leakage
the process of losing heap-dynamic variables
131
coercion
automatic implicit conversion to a legal type
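A one-line C++ example:
```
#include <iostream>

int main() {
    int i = 3;
    double d = i + 2.5;      // i is implicitly coerced from int to double before the addition
    std::cout << d << "\n";  // prints 5.5
}
```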
132
a programming language is strongly typed if ?
type errors are always detected
133
a programming language is weakly (loosely) typed if ?
type errors are not always detected
134
type equivalence
determining whether two types are considered the same; two standard ways: name type equivalence and structure type equivalence
135
name type equivalence
two types are equal if and only if they have the same name (e.g., both integers, both Puppy objects)
136
(name/structure) type equivalence is easier to implement and is the form used by Java
name
137
structure type equivalence
two types are equal if and only if they have the same “structure”
138
do most languages have a mixture of name/structure type equivalence? if so, give an example
yes; e.g., C++ uses structure equivalence except for structs and classes (where name equivalence is used)
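A small C++ illustration (type names invented for this card):
```
#include <iostream>

typedef double Length;                   // Length is just another name for double

struct Celsius    { double degrees; };
struct Fahrenheit { double degrees; };   // same structure as Celsius, different name

int main() {
    Length len = 2.5;
    double d = len;                      // OK: Length and double are treated as the same type
    Celsius c{20.0};
    // Fahrenheit f = c;                 // error: name equivalence applies to structs/classes
    std::cout << d << " " << c.degrees << "\n";
}
```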
139
which type of variable is only found in a dynamic type binding language?
implicit heap-dynamic variables