| Package | Description |
|---|---|
| com.mitchellbosecke.pebble.lexer | |
| com.mitchellbosecke.pebble.parser | |
| com.mitchellbosecke.pebble.tokenParser | |
| Modifier and Type | Method and Description |
|---|---|
| `Token` | `TokenStream.current()`<br>Looks at the current token without consuming it. |
| `Token` | `TokenStream.expect(Token.Type type)`<br>Checks the current token to see if it matches the provided type. |
| `Token` | `TokenStream.expect(Token.Type type, String value)`<br>Checks the current token to see if it matches the provided type and value. |
| `Token` | `TokenStream.next()`<br>Consumes and returns the next token in the stream. |
| `Token` | `TokenStream.peek()`<br>Returns the next token in the stream without consuming it. |
| `Token` | `TokenStream.peek(int number)`<br>Returns a future token in the stream without consuming any. |
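Taken together, these methods form the cursor API a parser uses to walk the stream: `current()` and the `peek(...)` overloads inspect tokens without advancing, while `next()` and `expect(...)` advance past them. A minimal usage sketch follows; the `Token.Type.EXECUTE_END` constant and the tag being consumed are illustrative assumptions rather than details documented on this page:

```java
import com.mitchellbosecke.pebble.lexer.Token;
import com.mitchellbosecke.pebble.lexer.TokenStream;

public class StreamWalkSketch {

    // Sketch: walk past a simple tag such as "{% flush %}".
    // Token.Type.EXECUTE_END is an assumed enum constant for the "%}" delimiter.
    static void consumeTag(TokenStream stream) {
        // Inspect the current token without consuming it.
        Token current = stream.current();

        // Look ahead one token, or n tokens with peek(n), without consuming any.
        Token lookahead = stream.peek();
        Token farther = stream.peek(2);

        // Consume and return the next token in the stream.
        Token consumed = stream.next();

        // Assert that the current token is the closing "%}" delimiter;
        // expect(...) signals a parse error if the token does not match.
        stream.expect(Token.Type.EXECUTE_END);
    }
}
```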
| Modifier and Type | Method and Description |
|---|---|
| `ArrayList<Token>` | `TokenStream.getTokens()`<br>Used for testing purposes. |
| Constructor and Description |
|---|
| `TokenStream(Collection<Token> tokens, String name)`<br>Constructor for a TokenStream. |
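As the signature shows, a stream is created from an ordered collection of tokens plus the name of the originating template. The sketch below builds one by hand; the `Token(Token.Type, String, int)` constructor and the `Token.Type` constants are assumptions inferred from the lexer package, not documented here:

```java
import java.util.Arrays;
import java.util.Collection;

import com.mitchellbosecke.pebble.lexer.Token;
import com.mitchellbosecke.pebble.lexer.TokenStream;

public class StreamConstructionSketch {
    public static void main(String[] args) {
        // Assumed constructor: Token(Token.Type type, String value, int lineNumber).
        Collection<Token> tokens = Arrays.asList(
                new Token(Token.Type.TEXT, "Hello", 1),
                new Token(Token.Type.EOF, null, 1));

        // The second argument names the template, presumably for error reporting.
        TokenStream stream = new TokenStream(tokens, "hello.peb");

        // getTokens() exposes the backing list; per the docs it exists for testing only.
        System.out.println(stream.getTokens().size()); // prints 2
    }
}
```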
| Modifier and Type | Method and Description |
|---|---|
| `boolean` | `StoppingCondition.evaluate(Token data)` |
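`StoppingCondition` is the callback the parser uses while sub-parsing a block body: consumption continues until `evaluate` returns `true` for a token. A sketch of a condition that stops at an `endif` tag follows; the `getType()`/`getValue()` accessors and the `Token.Type.NAME` constant are assumptions, since this page documents only `evaluate(Token)` itself:

```java
import com.mitchellbosecke.pebble.lexer.Token;
import com.mitchellbosecke.pebble.parser.StoppingCondition;

// Stops sub-parsing once the "endif" tag name is reached.
public class EndIfStoppingCondition implements StoppingCondition {

    @Override
    public boolean evaluate(Token token) {
        // getType() and getValue() are assumed accessors on Token.
        return token.getType() == Token.Type.NAME
                && "endif".equals(token.getValue());
    }
}
```

If `StoppingCondition` is a plain single-method interface, the same check could also be supplied as a lambda wherever a condition is expected.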
| Modifier and Type | Method and Description |
|---|---|
| `RenderableNode` | `MacroTokenParser.parse(Token token)` |
| `RenderableNode` | `SetTokenParser.parse(Token token)` |
| `RenderableNode` | `ParallelTokenParser.parse(Token token)` |
| `RenderableNode` | `TokenParser.parse(Token token)`<br>The TokenParser is responsible for converting all the necessary tokens into the appropriate Nodes. |
| `RenderableNode` | `IfTokenParser.parse(Token token)` |
| `RenderableNode` | `ImportTokenParser.parse(Token token)` |
| `RenderableNode` | `BlockTokenParser.parse(Token token)` |
| `RenderableNode` | `VerbatimTokenParser.parse(Token token)` |
| `RenderableNode` | `FlushTokenParser.parse(Token token)` |
| `RenderableNode` | `AutoEscapeTokenParser.parse(Token token)` |
| `RenderableNode` | `ForTokenParser.parse(Token token)` |
| `RenderableNode` | `FilterTokenParser.parse(Token token)` |
| `RenderableNode` | `IncludeTokenParser.parse(Token token)` |
| `RenderableNode` | `ExtendsTokenParser.parse(Token token)` |
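Each concrete class above handles one tag: `parse(Token)` receives the token that triggered the parser and returns the `RenderableNode` for that tag. The outline below sketches a custom tag parser; `AbstractTokenParser`, the inherited `parser` field with its `getStream()` accessor, and the `getTag()` hook are assumptions modeled on the built-in parsers, and the node construction is left hypothetical:

```java
import com.mitchellbosecke.pebble.lexer.Token;
import com.mitchellbosecke.pebble.lexer.TokenStream;
import com.mitchellbosecke.pebble.node.RenderableNode;
import com.mitchellbosecke.pebble.tokenParser.AbstractTokenParser;

// Sketch of a parser for a hypothetical "{% greet %}" tag.
public class GreetTokenParser extends AbstractTokenParser {

    @Override
    public RenderableNode parse(Token token) {
        // The stream is assumed to be reachable through the enclosing Parser.
        TokenStream stream = this.parser.getStream();

        stream.next();                         // consume the "greet" tag name
        stream.expect(Token.Type.EXECUTE_END); // require the closing "%}"

        // A real parser would build and return its node here, e.g.
        // return new GreetNode(token.getLineNumber()); // GreetNode is hypothetical
        throw new UnsupportedOperationException("node construction omitted from sketch");
    }

    @Override
    public String getTag() {
        // Assumed hook that routes "{% greet %}" tokens to this parser.
        return "greet";
    }
}
```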