Definition at line 56 of file DocumentMarkup.h.
TokenStream (const std::string &s)
    Create a token stream from a string.

Token scanNextToken (const Container::LineVector &content, size_t &at)
    Function that obtains the next token.

TokenStream (const boost::filesystem::path &fileName)
    Create a token stream from the contents of a file.

TokenStream (const std::string &inputString)
    Create a token stream from a string.

TokenStream (const Container::Buffer< size_t, char >::Ptr &buffer)
    Create a token stream from a buffer.

const std::string & name () const
    Property: Name of stream.

const Token & current ()
    Return the current token.

bool atEof ()
    Return true if the stream is at the end of its input.

const Token & operator[] (size_t lookahead)
    Return the current or a future token.

void consume (size_t n=1)
    Consume some tokens.

std::pair< size_t, size_t > location (size_t position)
    Return the line number and offset for an input position.

std::pair< size_t, size_t > locationEof ()
    Return the last line index and character offset.

std::string lineString (size_t lineIdx)
    Return the entire string for some line index.

std::string lexeme (const Token &t)
    Return the lexeme for a token.

std::string lexeme ()
    Return the lexeme for the current token.

bool isa (const Token &t, typename Token::TokenEnum type)
    Determine whether a token is a specific type.

bool isa (typename Token::TokenEnum type)
    Determine whether the current token is a specific type.

bool match (const Token &t, const char *s)
    Determine whether a token matches a string.

bool match (const char *s)
    Determine whether the current token matches a string.
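The members above are typically used together to walk the input one token at a time. The following is a minimal sketch of such a loop; the include path, the sample markup string, and the "@b" tag check are illustrative assumptions rather than details taken from this documentation.

    #include <Sawyer/DocumentMarkup.h>
    #include <iostream>
    #include <string>

    int main() {
        // The input text is illustrative only.
        Sawyer::Document::Markup::TokenStream tokens("@b{bold} words");

        while (!tokens.atEof()) {
            // operator[](0) is the current token; larger arguments peek ahead.
            const auto &tok = tokens[0];

            // match() determines whether a token matches a string; whether
            // "@b" arrives as a single token depends on how the markup is
            // tokenized and is assumed here for illustration.
            if (tokens.match(tok, "@b"))
                std::cout << "saw a \"@b\" token\n";

            std::cout << "lexeme: \"" << tokens.lexeme(tok) << "\"\n";
            tokens.consume();                     // advance to the next token
        }
    }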
virtual Token Sawyer::Document::Markup::TokenStream::scanNextToken (const Container::LineVector &content, size_t &at)
Function that obtains the next token.
Subclasses implement this function to obtain the next token that starts at or after the specified input position. Upon return, the function should adjust at to point to the next position for scanning a token, which is usually the first character after the returned token's lexeme. If the scanner reaches the end of input, or any condition that it deems to be the end, it should return the EOF token (a default-constructed token), after which this function will not be called again.
Implements Sawyer::Lexer::TokenStream< Token >.
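As a concrete illustration of this protocol, here is a minimal sketch of a scanner for whitespace-separated words built on the same Lexer::TokenStream base class. The token enumeration and class names are hypothetical, and the Token constructor signature and the assumption that Container::LineVector::character() returns EOF past the end of input are taken from memory of the Lexer layer rather than from this page; treat them as assumptions to verify against the headers.

    #include <Sawyer/Lexer.h>
    #include <Sawyer/LineVector.h>
    #include <cctype>
    #include <string>

    // Hypothetical token type: every token is either a WORD or the EOF token.
    enum class WordType { WORD };
    typedef Sawyer::Lexer::Token<WordType> WordToken;

    class WordStream: public Sawyer::Lexer::TokenStream<WordToken> {
    public:
        explicit WordStream(const std::string &s)
            : Sawyer::Lexer::TokenStream<WordToken>(s) {}

        WordToken scanNextToken(const Sawyer::Container::LineVector &content, size_t &at) override {
            // Skip white space preceding the token.
            while (std::isspace(content.character(at)))
                ++at;

            // End of input: return the EOF token (a default-constructed token).
            if (content.character(at) < 0)
                return WordToken();

            // Consume characters up to the next white space or end of input.
            size_t begin = at;
            while (content.character(at) >= 0 && !std::isspace(content.character(at)))
                ++at;

            // 'at' now points just past the lexeme, ready for the next call.
            return WordToken(WordType::WORD, begin, at);
        }
    };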