Concept Questions 160119
note: the questions are addressed in the order of the page numbers where
the term or phrase they are concerned with appears (not always the page
number the original asker cited).
- pp 32: what does `dummy parameter' mean here? [last sentence on the
- Here they are referring to what will later be called `formal parameters'
as opposed to `actual parameters'. In common languages, formal parameters
are the parameter names that appear in the function header, whereas the
actual parameter values are the values being passed to the function on
the invocation line -- where the function was called from.
- `Formal parameter' appears 15 times in
the text. `Dummy parameter' is used 22 times in the text. The formal versus
actual parameter distinction is first drawn on page 52 (Section 3.1).
- Specifically, when we look at Exhibit 2.16, we see something
called `list of dummy parameter names', separated from the actual
values by something called `body of function'.
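- In a modern language the distinction looks like this (a small sketch in Python, not the textbook's notation; the names `area', `width', and `height' are my own):

```python
# The names in the function header are the formal (dummy) parameters.
def area(width, height):   # width, height: formal parameters
    return width * height

# The values on the invocation line are the actual parameters.
w = 3
print(area(w, 7))          # w and 7: actual parameters; prints 21
```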
- pp 32: more on the principle of lexical coherence?
- as questions go, that is sort of vague.
- the term does come up again in the textbook with regards to various
control structures (e.g., pages 278 and 279 in Chapter 10)
- The notion that it is easier to understand a piece of code when
all the pieces related to it are physically nearby is an important
idea in programming language design. Calling it `lexical coherence'
is much less common (usually the idea goes unnamed). Giving things
names helps us remember them and communicate about them (see the
- The term `lexical coherence' is also used in the linguistics
community, but there it refers to things like maintaining flow
in a paragraph by repeating words that tie the discourse together.
That is not quite the same notion as the one here, but both are
motivated by a common observation about how people read: nearby text
is treated as if it were more important than far-away text.
- pp 47: should we use LaTeX to hand in our assignment reports?
- Your assignment reports are handed in on paper, so in principle
I don't know how you created them. It is fine to hand them in
handwritten as long as they are very readable -- although you would
still want to print out the code example (part 11, see
- If you like learning new systems, latex/pdflatex is installed on the
department linux boxes and is usually part of the standard linux distribution
(it can, in theory, be installed on other systems as well -- a lot of effort
has gone into making it portable and it is widely used by people who are
interested in typesetting mathematics or computer science).
As you may have noticed, most of the material I prepare for class is
typeset using HTML and then saved as a PDF file from the web browser
(less of a hassle than LaTeX, and we usually don't need its fancy
typesetting capabilities).
- TeX was designed by Donald Knuth to make it easier to typeset
the mathematical equations in his classic `The Art of Computer
Programming' (copies in the library -- Wikipedia entry). While
implementing TeX he also invented a new programming style called
literate programming (Wikipedia entry). People at Stanford who were
into computer typesetting had been using a language called `Scribe'
(Wikipedia entry) to write their documents. Leslie Lamport used the
extension capabilities of TeX to implement the Scribe features people
liked, and the result was called LaTeX. TeX/LaTeX was free and
distributed with source; Scribe was a commercial product from a small
startup. See https://www.latex-project.org/ and
https://tug.org/ (the TeX Users Group).
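- For the curious: a complete LaTeX document is quite small. Something like the following (a minimal sketch; the formula is just an illustration) is roughly the smallest file pdflatex will compile into a typeset page with mathematics:

```latex
\documentclass{article}
\begin{document}
The quadratic formula:
\[ x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a} \]
\end{document}
```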
- pp 52: What are considered first class objects?
- Basically it is the list of capabilities in the last sentence of
the second paragraph on page 52. These are properties that most languages
give integers and floating-point numbers. By now the same could be said
about strings (although early languages had restricted notions of what
you could do with a string). For objects and functions, this came later.
- The term `first class' is very common in the programming language
literature. For a general discussion see the Wikipedia entry for
first-class citizen. It is most often used with regard to
functions (Wikipedia entry for first-class function), but
it is equally applicable to many other concepts in a programming language.
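- A quick illustration of those capabilities (a sketch in Python, where functions happen to be first class; the names `square', `twice', and `make_adder' are my own):

```python
# First-class values can be stored in variables, passed as
# arguments, and returned from other functions -- just like
# integers or floats.
def square(x):
    return x * x

f = square                 # stored in a variable
print(f(4))                # called through the variable: 16

def twice(g, x):           # a function passed as an argument
    return g(g(x))

print(twice(square, 3))    # square(square(3)) = 81

def make_adder(n):         # a function returned from a function
    def add(x):
        return x + n
    return add

add5 = make_adder(5)
print(add5(10))            # 15
```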
- page 60: how does the lexical generator work?
- see Exhibit 4.3 on page 81 for the big picture.
- you can think of the generated lexer as a function that takes in a
file or string and outputs a sequence of objects for further analysis.
For example, lexer("ab=379+ab * y") might output
["ab", "=", "379", "+", "ab", "*", "y"]. The entries
in the output array are generally referred to as tokens.
The question is: what should the tokens you are interested in look like?
A lexical generator takes a table of regular expressions with their
corresponding token types (e.g., [0-9]+ INTEGER) and generates a short
program that matches those descriptions against the input stream,
returning the matched string and its token type each time it finds one.
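- A toy version of what such a generated lexer does can be sketched in Python with the standard `re' module (a hand-written sketch of the idea, not what a real generator emits; the token names and the `lexer' function are my own):

```python
import re

# Table of (token type, regular expression), like a lex specification.
TOKEN_SPEC = [
    ("INTEGER", r"[0-9]+"),
    ("IDENT",   r"[A-Za-z_][A-Za-z0-9_]*"),
    ("OP",      r"[=+\-*/]"),
    ("SKIP",    r"\s+"),      # whitespace: matched but not emitted
]
# One master pattern; each alternative is a named group.
MASTER = re.compile("|".join(f"(?P<{name}>{rx})"
                             for name, rx in TOKEN_SPEC))

def lexer(text):
    """Yield (token_type, matched_string) pairs from the input."""
    for m in MASTER.finditer(text):
        if m.lastgroup != "SKIP":
            yield (m.lastgroup, m.group())

print(list(lexer("ab=379+ab * y")))
# [('IDENT', 'ab'), ('OP', '='), ('INTEGER', '379'), ('OP', '+'),
#  ('IDENT', 'ab'), ('OP', '*'), ('IDENT', 'y')]
```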
- The most common example is lex (Wikipedia entry), part of the
standard unix/linux distributions: you specify token types using
regular expressions and lex writes C code for a routine you can call
to tokenize an input stream along the lines you want. A faster, free
version of lex called `flex' (Wikipedia entry) was one of the early
efforts to reimplement Unix tools outside the standard Unix license,
efforts that led to the GNU/Linux world of today.
- In Java, the class StreamTokenizer automates this sort of
processing (it even handles C-style and C++-style comments).