Represents a run of a lexer on a given input string.
Constructs a Token of type symbol_name from the current lexeme. The token's value defaults to the characters of the current lexeme joined into a string.
# File lib/dhaka/lexer/lexer_run.rb, line 15
def create_token(symbol_name, value = current_lexeme.characters.join)
  Token.new(symbol_name, value, current_lexeme.input_position)
end
Yields each token as it is recognized. Returns a TokenizerErrorResult if an error occurs during tokenization.
# File lib/dhaka/lexer/lexer_run.rb, line 20
def each
  reset_and_rewind
  loop do
    c = curr_char
    break if (c == "\0" && @not_yet_accepted_chars.empty? && !@current_lexeme.accepted?)
    dest_state = @curr_state.transitions[c]
    unless dest_state
      return TokenizerErrorResult.new(@input_position) unless @current_lexeme.accepted?
      token = get_token
      yield token if token
      reset_and_rewind
    else
      @curr_state = dest_state
      @not_yet_accepted_chars << c
      @curr_state.process(self)
      advance
    end
  end
  yield Token.new(END_SYMBOL_NAME, nil, nil)
end
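The loop above implements maximal-munch tokenization: consume characters while a DFA transition exists, and when the automaton gets stuck, emit the last accepted lexeme and rewind. A simplified, self-contained sketch of that idea (using plain regexes instead of Dhaka's DFA; all names here are illustrative, not part of the Dhaka API):

```ruby
Token = Struct.new(:symbol, :value, :position)

# Toy pattern table standing in for the lexer's DFA; each regex is
# anchored so it only matches at the current input position.
PATTERNS = {
  number:     /\A\d+/,
  identifier: /\A[a-z]+/,
  whitespace: /\A\s+/
}

def tokenize(input)
  tokens = []
  pos = 0
  until pos == input.length
    rest = input[pos..]
    # Pick the pattern with the longest accepted lexeme (maximal munch).
    symbol, lexeme = PATTERNS.filter_map { |sym, re|
      m = re.match(rest) and [sym, m[0]]
    }.max_by { |_, text| text.length }
    return nil unless lexeme # no pattern accepts here: tokenizer error
    tokens << Token.new(symbol, lexeme, pos) unless symbol == :whitespace
    pos += lexeme.length     # "reset and rewind" past the accepted lexeme
  end
  tokens
end
```

Unlike this sketch, the real LexerRun never re-scans from the start of a pattern: it tracks `@not_yet_accepted_chars` so it can rewind only to the end of the last accepted lexeme.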