TaskJuggler::TextParser::Pattern

This class models the most crucial element of a syntax description: the pattern. A TextParserPattern primarily consists of a set of tokens. Tokens are Strings whose first character determines the type of the token. There are four known types.

Terminal token: In the syntax declaration the terminal token is prefixed by an underscore. Terminal tokens are terminal symbols of the syntax tree. They just represent themselves.

Variable token: The variable token describes values of a certain class such as strings or numbers. In the syntax declaration the token is prefixed by a dollar sign and the text of the token specifies the variable type. See ProjectFileParser for a complete list of variable types.

Reference token: The reference token specifies a reference to another parser rule. In the syntax declaration the token is prefixed by a bang and the text matches the name of the rule. See TextParserRule for details.

End token: The . token marks the expected end of the input stream.

In addition to the pure syntax tree information, the pattern also holds documentation about itself.
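As a rough illustration of this prefix scheme, the following self-contained Ruby sketch (using hypothetical token strings, not actual TaskJuggler syntax) splits tokens the way the parser does:

```ruby
# Split a pattern token string into a [ type, name ] pair based on its
# one-character type prefix, mirroring the scheme described above.
def split_token(token)
  # '!' => reference, '$' => variable, '_' => literal, '.' => end of input
  idx = '!$_.'.index(token[0])
  raise "All pattern tokens must start with a type identifier [!$_.]" unless idx
  type = [ :reference, :variable, :literal, :eof ][idx]
  name = type == :literal ? token[1..-1] :
         (type == :eof ? '<END>' : token[1..-1].intern)
  [ type, name ]
end

split_token('_task')     # => [:literal, "task"]
split_token('$ID')       # => [:variable, :ID]
split_token('!taskBody') # => [:reference, :taskBody]
split_token('.')         # => [:eof, "<END>"]
```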

Attributes

keyword[R]
doc[R]
seeAlso[R]
exampleFile[R]
exampleTag[R]
tokens[R]
function[R]

Public Class Methods

new(tokens, function = nil)

Create a new Pattern object. tokens must be an Array of String objects that describe the Pattern. function can be a reference to a method that should be called when the Pattern has been recognized by the parser.

    # File lib/taskjuggler/TextParser/Pattern.rb, line 49
49:     def initialize(tokens, function = nil)
50:       # A unique name for the pattern that is used in the documentation.
51:       @keyword = nil
52:       # Initialize pattern doc as empty.
53:       @doc = nil
54:       # A list of TokenDoc elements that describe the meaning of variable
55:       # tokens. The order of the tokens and entries in the Array must correlate.
56:       @args = []
57:       # A list of references to other patterns that are related to this pattern.
58:       @seeAlso = []
59:       # A reference to a file under test/TestSuite/Syntax/Correct and a tag
60:       # within that file. This identifies example TJP code to be included with
61:       # the reference manual.
62:       @exampleFile = nil
63:       @exampleTag = nil
64: 
65:       @tokens = []
66:       tokens.each do |token|
67:         unless '!$_.'.include?(token[0])
68:           raise "Fatal Error: All pattern tokens must start with a type " +
69:                 "identifier [!$_.]: #{tokens.join(', ')}"
70:         end
71:         # For the syntax specification using a prefix character is more
72:         # convenient. But for further processing, we need to split the string
73:         # into two symbols. The prefix determines the token type, the rest is
74:         # the token name. There are 4 types of tokens:
75:         # :reference : a reference to another rule
76:         # :variable : a terminal symbol
77:         # :literal : a user defined string
78:         # :eof : marks the end of an input stream
79:         type = [ :reference, :variable, :literal, :eof ]['!$_.'.index(token[0])]
80:         # For literals we use a String to store the token content. For others,
81:         # a symbol is better suited.
82:         name = type == :literal ?
83:                token[1..-1] : (type == :eof ? '<END>' : token[1..-1].intern)
84:         # We favor an Array to store the 2 elements over a Hash for
85:         # performance reasons.
86:         @tokens << [ type, name ]
87:         # Initialize pattern argument descriptions as empty.
88:         @args << nil
89:       end
90:       @function = function
91:       # In some cases we don't want to show all tokens in the syntax
92:       # documentation. This value specifies the index of the last shown token.
93:       @lastSyntaxToken = @tokens.length - 1
94: 
95:       @transitions = []
96:     end

Public Instance Methods

[](i)

Convenience function to access individual tokens by index.

     # File lib/taskjuggler/TextParser/Pattern.rb, line 214
214:     def [](i)
215:       @tokens[i]
216:     end

addTransitionsToState(states, rules, stateStack, sourceState, destRule, destIndex, loopBack)

Add the transitions to the State objects of this pattern. states is a Hash with all State objects. rules is a Hash with the Rule objects of the syntax. stateStack is an Array of State objects that have been traversed before reaching this pattern. sourceState is the State that the transition originates from. destRule, this pattern and destIndex describe the State that the transition leads to. loopBack is a boolean flag that is set to true when the transition describes a loop back to the start of the Rule.

     # File lib/taskjuggler/TextParser/Pattern.rb, line 115
115:     def addTransitionsToState(states, rules, stateStack, sourceState,
116:                               destRule, destIndex, loopBack)
117:       # If we hit a token in the pattern that is optional, we need to consider
118:       # the next token of the pattern as well.
119:       loop do
120:         if destIndex >= @tokens.length
121:           # Have we reached the end of the pattern? Such states always
122:           # trigger a reduce operation.
123:           sourceState.noReduce = false
124:           if sourceState.rule == destRule
125:             if destRule.repeatable
126:               # The transition leads us back to the start of the Rule. This
127:               # will generate transitions to the first token of all patterns
128:               # of this Rule.
129:               destRule.addTransitionsToState(states, rules, [], sourceState,
130:                                              true)
131:             end
132:           end
133:           # We've reached the end of the pattern. No more transitions to
134:           # consider.
135:           return
136:         end
137: 
138:         # The token descriptor tells us where the transition(s) need to go to.
139:         tokenType, tokenName = token = @tokens[destIndex]
140: 
141:         case tokenType
142:         when :reference
143:           # The descriptor references another rule.
144:           unless (refRule = rules[tokenName])
145:           raise "Unknown rule #{tokenName} referenced in rule #{destRule.name}"
146:           end
147:           # If we reference another rule from a pattern, we need to come back
148:           # to the pattern once we are done with the referenced rule. To be
149:           # able to come back, we collect a list of all the States that we
150:           # have passed during a reference resolution. This list forms a stack
151:           # that is popped during reduce operations of the parser FSM.
152:           skippedState = states[[ destRule, self, destIndex ]]
153:           # Rules may reference themselves directly or indirectly. To avoid
154:           # endless recursions of this algorithm, we stop once we have
155:           # detected a recursion. All necessary transitions have already
156:           # been collected. The recursion will be unrolled in the parser FSM.
157:           unless stateStack.include?(skippedState)
158:             # Push the skipped state on the stateStack before recursing.
159:             stateStack.push(skippedState)
160:             refRule.addTransitionsToState(states, rules, stateStack,
161:                                           sourceState, loopBack)
162:             # Once we're done, remove the State from the stateStack again.
163:             stateStack.pop
164:           end
165: 
166:           # If the referenced rule is not optional, we have no further
167:           # transitions for this pattern at this destIndex.
168:           break unless refRule.optional?(rules)
169:         else
170:           unless (destState = states[[ destRule, self, destIndex ]])
171:             raise "Destination state not found"
172:           end
173:           # We've found a transition to a terminal token. Add the transition
174:           # to the source State.
175:           sourceState.addTransition(@tokens[destIndex], destState, stateStack,
176:                                     loopBack)
177:           # Fixed tokens are never optional. There are no more transitions for
178:           # this pattern at this index.
179:           break
180:         end
181: 
182:         destIndex += 1
183:       end
184:     end

each()

Iterator for tokens.

     # File lib/taskjuggler/TextParser/Pattern.rb, line 219
219:     def each
220:       @tokens.each { |type, name| yield(type, name) }
221:     end

empty?()

Returns true if the pattern is empty.

     # File lib/taskjuggler/TextParser/Pattern.rb, line 224
224:     def empty?
225:       @tokens.empty?
226:     end

generateStates(rule)

Generate the state machine states for the pattern. rule is the Rule that the pattern belongs to. A list of generated State objects will be returned.

     # File lib/taskjuggler/TextParser/Pattern.rb, line 101
101:     def generateStates(rule)
102:       states = []
103:       @tokens.length.times { |i| states << State.new(rule, self, i) }
104:       states
105:     end

length()

Returns the number of tokens in the pattern.

     # File lib/taskjuggler/TextParser/Pattern.rb, line 229
229:     def length
230:       @tokens.length
231:     end

optional?(rules)

Return true if all tokens of the pattern are optional. If a token references a rule, this rule is followed for the check.

     # File lib/taskjuggler/TextParser/Pattern.rb, line 235
235:     def optional?(rules)
236:       @tokens.each do |type, name|
237:         if type == :literal || type == :variable
238:           return false
239:         elsif type == :reference
240:           if !rules[name].optional?(rules)
241:             return false
242:           end
243:         end
244:       end
245:       true
246:     end
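The recursive check can be sketched standalone. Here rules is assumed to be a plain Hash mapping rule names to Arrays of patterns (each a list of [ type, name ] tokens), a hypothetical stand-in for the actual Rule objects:

```ruby
# A pattern is optional if none of its tokens forces input: literals and
# variables are mandatory, and a reference is optional if the referenced
# rule has at least one fully optional pattern. (Simplified sketch;
# cyclic rule references would recurse forever here.)
def pattern_optional?(tokens, rules)
  tokens.all? do |type, name|
    case type
    when :literal, :variable
      false
    when :reference
      rules[name].any? { |pattern| pattern_optional?(pattern, rules) }
    else
      true # :eof does not make the pattern mandatory
    end
  end
end

rules = { maybe: [[]] }  # a rule with an empty (hence optional) pattern
pattern_optional?([[:reference, :maybe]], rules)  # => true
pattern_optional?([[:literal, 'task']], rules)    # => false
```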

setArg(idx, doc)

Set the documentation text for the idx-th variable.

     # File lib/taskjuggler/TextParser/Pattern.rb, line 193
193:     def setArg(idx, doc)
194:       @args[idx] = doc
195:     end

setDoc(keyword, doc)

Set the keyword and documentation text for the pattern.

     # File lib/taskjuggler/TextParser/Pattern.rb, line 187
187:     def setDoc(keyword, doc)
188:       @keyword = keyword
189:       @doc = doc
190:     end

setExample(file, tag)

Set the file and tag for the TJP code example.

     # File lib/taskjuggler/TextParser/Pattern.rb, line 208
208:     def setExample(file, tag)
209:       @exampleFile = file
210:       @exampleTag = tag
211:     end

setLastSyntaxToken(idx)

Restrict the syntax documentation to the tokens up to and including the idx-th token.

     # File lib/taskjuggler/TextParser/Pattern.rb, line 198
198:     def setLastSyntaxToken(idx)
199:       @lastSyntaxToken = idx
200:     end

setSeeAlso(also)

Set the references to related patterns.

     # File lib/taskjuggler/TextParser/Pattern.rb, line 203
203:     def setSeeAlso(also)
204:       @seeAlso = also
205:     end

terminalSymbol?(i)

Returns true if the i-th token is a terminal symbol.

     # File lib/taskjuggler/TextParser/Pattern.rb, line 249
249:     def terminalSymbol?(i)
250:       @tokens[i][0] == :variable || @tokens[i][0] == :literal
251:     end

terminalTokens(rules, index = 0)

Recursively find the first terminal tokens of this pattern. If an index is specified, start the search at that pattern token instead of the first one. The return value is an Array of [ token, pattern ] tuples.

     # File lib/taskjuggler/TextParser/Pattern.rb, line 256
256:     def terminalTokens(rules, index = 0)
257:       type, name = @tokens[index]
258:       # Terminal tokens start with an underscore or dollar character.
259:       if type == :literal
260:         return [ [ name, self ] ]
261:       elsif type == :variable
262:         return []
263:       elsif type == :reference
264:         # We have to continue the search at this rule.
265:         rule = rules[name]
266:         # The referenced rule may have multiple patterns. Collect the
267:         # terminal tokens of all of them.
268:         tts = []
269:         rule.patterns.each { |p| tts += p.terminalTokens(rules, 0) }
270:         return tts
271:       else
272:         raise "Unexpected token #{type} #{name}"
273:       end
274:     end
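The recursion can be sketched standalone, assuming a simplified rules Hash that maps rule names to Arrays of token lists (a hypothetical stand-in for the actual Rule objects):

```ruby
# Recursively collect the first terminal tokens reachable from the token
# at the given index. Literals are terminal, variables yield nothing, and
# references are followed into every pattern of the referenced rule.
def terminal_tokens(tokens, rules, index = 0)
  type, name = tokens[index]
  case type
  when :literal  then [name]
  when :variable then []
  when :reference
    rules[name].flat_map { |pattern| terminal_tokens(pattern, rules, 0) }
  else
    raise "Unexpected token #{type} #{name}"
  end
end

rules = { body: [ [[:literal, '{']], [[:variable, :ID]] ] }
terminal_tokens([[:reference, :body]], rules)  # => ["{"]
```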

to_s()

Generate a text form of the pattern. This is similar to the syntax in the original syntax description.

     # File lib/taskjuggler/TextParser/Pattern.rb, line 358
358:     def to_s
359:       str = ""
360:       @tokens.each do |type, name|
361:         case type
362:         when :reference
363:           str += "!#{name} "
364:         when :variable
365:           str += "$#{name} "
366:         when :literal
367:           str += "#{name} "
368:         when :eof
369:           str += ". "
370:         else
371:           raise "Unknown type #{type}"
372:         end
373:       end
374: 
375:       str
376:     end
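Given the [ type, name ] pairs produced during initialization, the reconstruction can be shown standalone (the token values here are hypothetical):

```ruby
# Rebuild a syntax-like string from [ type, name ] token pairs, using the
# same prefixes as the original syntax declaration.
def pattern_to_s(tokens)
  tokens.map do |type, name|
    case type
    when :reference then "!#{name} "
    when :variable  then "$#{name} "
    when :literal   then "#{name} "
    when :eof       then '. '
    else raise "Unknown type #{type}"
    end
  end.join
end

pattern_to_s([[:reference, :task], [:variable, :ID], [:eof, '<END>']])
# => "!task $ID . "
```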

to_syntax(argDocs, rules, skip = 0)

Returns a string that expresses the elements of the pattern in an EBNF-like fashion. The resolution of the pattern is done recursively. This is just the wrapper function that sets up the stack.

     # File lib/taskjuggler/TextParser/Pattern.rb, line 279
279:     def to_syntax(argDocs, rules, skip = 0)
280:       to_syntax_r({}, argDocs, rules, skip)
281:     end

to_syntax_r(stack, argDocs, rules, skip)

Generate a syntax description for this pattern.

     # File lib/taskjuggler/TextParser/Pattern.rb, line 284
284:     def to_syntax_r(stack, argDocs, rules, skip)
285:       # If we find ourselves on the stack, we have hit a recursive pattern.
286:       # This is used in repetitions.
287:       if stack[self]
288:         return '[, ... ]'
289:       end
290: 
291:       # "Push" us on the stack.
292:       stack[self] = true
293: 
294:       str = ''
295:       first = true
296:       # Analyze the tokens of the pattern skipping the first 'skip' tokens.
297:       skip.upto(@lastSyntaxToken) do |i|
298:         type, name = @tokens[i]
299:         # If the first token is a _{ the pattern describes optional attributes.
300:         # They are represented by a standard idiom.
301:         if first
302:           first = false
303:           return '{ <attributes> }' if name == '{'
304:         else
305:           # Separate the syntax elements with a whitespace.
306:           str << ' '
307:         end
308: 
309:         if @args[i]
310:           # The argument is documented in the syntax definition. We copy the
311:           # entry as we need to modify it.
312:           argDoc = @args[i].dup
313: 
314:           # A documented argument without a name is a terminal token. We use the
315:           # terminal symbol as name.
316:           if @args[i].name.nil?
317:             str << "#{name}"
318:             argDoc.name = name
319:           else
320:             str << "<#{@args[i].name}>"
321:           end
322:           addArgDoc(argDocs, argDoc)
323: 
324:           # Documented arguments don't have the type set yet. Use the token
325:           # value for that.
326:           if type == :variable
327:             argDoc.typeSpec = "<#{name}>"
328:           end
329:         else
330:           # Undocumented tokens are recursively expanded.
331:           case type
332:           when :literal
333:             # Literals are shown as such.
334:             str << name.to_s
335:           when :variable
336:             # Variables are enclosed by angle brackets.
337:             str << "<#{name}>"
338:           when :reference
339:             if rules[name].patterns.length == 1 &&
340:                !rules[name].patterns[0].doc.nil?
341:               addArgDoc(argDocs, TokenDoc.new(rules[name].patterns[0].keyword,
342:                                               rules[name].patterns[0]))
343:               str << '<' + rules[name].patterns[0].keyword + '>'
344:             else
345:               # References are followed recursively.
346:               str << rules[name].to_syntax(stack, argDocs, rules, 0)
347:             end
348:           end
349:         end
350:       end
351:       # Remove us from the "stack" again.
352:       stack.delete(self)
353:       str
354:     end
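The recursion guard deserves a note: a Hash keyed by the object itself serves as the "stack", so a pattern that reaches itself again is rendered as a repetition marker instead of looping forever. A minimal standalone sketch of this idiom, using a hypothetical node structure rather than the actual Pattern objects:

```ruby
# Render a tree of nodes, where a Symbol child is a reference to another
# node. A node already on the "stack" Hash is a cycle and is rendered as
# the repetition idiom '[, ... ]' instead of being expanded again.
def render(node, nodes, stack = {})
  return '[, ... ]' if stack[node]  # recursion detected
  stack[node] = true                # "push"
  parts = nodes[node].map do |child|
    child.is_a?(Symbol) ? render(child, nodes, stack) : child
  end
  stack.delete(node)                # "pop"
  parts.join(' ')
end

nodes = { list: ['item', :list] }   # a rule that references itself
render(:list, nodes)                # => "item [, ... ]"
```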

Private Instance Methods

addArgDoc(argDocs, argDoc)
     # File lib/taskjuggler/TextParser/Pattern.rb, line 380
380:     def addArgDoc(argDocs, argDoc)
381:       raise 'Error' if argDoc.name.nil?
382:       argDocs.each do |ad|
383:         return if ad.name == argDoc.name
384:       end
385:       argDocs << argDoc
386:     end
