Package org.apache.lucene.analysis.ngram
Class NGramTokenizer
- java.lang.Object
  - org.apache.lucene.util.AttributeSource
    - org.apache.lucene.analysis.TokenStream
      - org.apache.lucene.analysis.Tokenizer
        - org.apache.lucene.analysis.ngram.NGramTokenizer

All Implemented Interfaces:
Closeable, AutoCloseable
public final class NGramTokenizer extends Tokenizer
Tokenizes the input into n-grams of the given size(s).
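For orientation, a minimal usage sketch follows. It assumes a Lucene release in which CharTermAttribute is available (3.1 or later; older releases expose TermAttribute instead); the input text and gram sizes are arbitrary, not part of this class's API.

  import java.io.IOException;
  import java.io.StringReader;

  import org.apache.lucene.analysis.ngram.NGramTokenizer;
  import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

  public class NGramTokenizerExample {
    public static void main(String[] args) throws IOException {
      // Emit all 2-grams and 3-grams of the input text.
      NGramTokenizer tokenizer = new NGramTokenizer(new StringReader("lucene"), 2, 3);
      CharTermAttribute term = tokenizer.addAttribute(CharTermAttribute.class);
      while (tokenizer.incrementToken()) {
        // Typically prints the bigrams first ("lu", "uc", ...), then the trigrams.
        System.out.println(term.toString());
      }
      tokenizer.end();
      tokenizer.close();
    }
  }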
Nested Class Summary

Nested classes/interfaces inherited from class org.apache.lucene.util.AttributeSource:
AttributeSource.AttributeFactory, AttributeSource.State
Field Summary

static int DEFAULT_MAX_NGRAM_SIZE
static int DEFAULT_MIN_NGRAM_SIZE
Constructor Summary

NGramTokenizer(Reader input)
  Creates NGramTokenizer with default min and max n-grams.
NGramTokenizer(Reader input, int minGram, int maxGram)
  Creates NGramTokenizer with given min and max n-grams.
NGramTokenizer(AttributeSource.AttributeFactory factory, Reader input, int minGram, int maxGram)
  Creates NGramTokenizer with given min and max n-grams.
NGramTokenizer(AttributeSource source, Reader input, int minGram, int maxGram)
  Creates NGramTokenizer with given min and max n-grams.
Method Summary

void end()
  Called by the consumer after the last token has been consumed, i.e. after TokenStream.incrementToken() has returned false (using the new TokenStream API).
boolean incrementToken()
  Advances to the next token in the stream; returns false at end of stream.
void reset()
  Resets this stream to the beginning.
Methods inherited from class org.apache.lucene.analysis.Tokenizer
close, correctOffset, reset
-
Methods inherited from class org.apache.lucene.util.AttributeSource
addAttribute, addAttributeImpl, captureState, clearAttributes, cloneAttributes, copyTo, equals, getAttribute, getAttributeClassesIterator, getAttributeFactory, getAttributeImplsIterator, hasAttribute, hasAttributes, hashCode, reflectAsString, reflectWith, restoreState, toString
-
Field Detail
-
DEFAULT_MIN_NGRAM_SIZE
public static final int DEFAULT_MIN_NGRAM_SIZE
See Also: Constant Field Values
-
DEFAULT_MAX_NGRAM_SIZE
public static final int DEFAULT_MAX_NGRAM_SIZE
See Also: Constant Field Values
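For illustration, the single-Reader constructor is equivalent to passing these two constants explicitly. In the Lucene releases I have checked the defaults are 1 and 2, but treat the exact values as version-dependent and confirm them via Constant Field Values. Imports as in the sketch above.

  // Equivalent tokenizers: the defaults are DEFAULT_MIN_NGRAM_SIZE and
  // DEFAULT_MAX_NGRAM_SIZE (believed to be 1 and 2; see Constant Field Values).
  NGramTokenizer withDefaults = new NGramTokenizer(new StringReader("abc"));
  NGramTokenizer explicit = new NGramTokenizer(new StringReader("abc"),
      NGramTokenizer.DEFAULT_MIN_NGRAM_SIZE, NGramTokenizer.DEFAULT_MAX_NGRAM_SIZE);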
-
Constructor Detail
-
NGramTokenizer
public NGramTokenizer(Reader input, int minGram, int maxGram)
Creates NGramTokenizer with given min and max n-grams.
Parameters:
  input - Reader holding the input to be tokenized
  minGram - the smallest n-gram to generate
  maxGram - the largest n-gram to generate
-
NGramTokenizer
public NGramTokenizer(AttributeSource source, Reader input, int minGram, int maxGram)
Creates NGramTokenizer with given min and max n-grams.
Parameters:
  source - AttributeSource to use
  input - Reader holding the input to be tokenized
  minGram - the smallest n-gram to generate
  maxGram - the largest n-gram to generate
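A short sketch of what passing an existing AttributeSource buys you: since every TokenStream is itself an AttributeSource, the second tokenizer below stores its terms and offsets in the same attribute instances as the first instead of creating its own. The inputs and names are illustrative only; imports as in the first sketch.

  // Both streams share the same attribute instances.
  NGramTokenizer bigrams = new NGramTokenizer(new StringReader("shared"), 2, 2);
  NGramTokenizer trigrams = new NGramTokenizer(bigrams, new StringReader("shared"), 3, 3);
  CharTermAttribute term = bigrams.addAttribute(CharTermAttribute.class);
  // After advancing either stream, 'term' reflects whichever was advanced last.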
-
NGramTokenizer
public NGramTokenizer(AttributeSource.AttributeFactory factory, Reader input, int minGram, int maxGram)
Creates NGramTokenizer with given min and max n-grams.
Parameters:
  factory - AttributeSource.AttributeFactory to use
  input - Reader holding the input to be tokenized
  minGram - the smallest n-gram to generate
  maxGram - the largest n-gram to generate
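For completeness, a sketch of the factory-based constructor. Here it merely passes the default factory (which the other constructors use implicitly); a custom AttributeSource.AttributeFactory would let you control which attribute implementations are instantiated. Add org.apache.lucene.util.AttributeSource to the imports above.

  // Explicitly passing the default factory; a custom factory could supply
  // alternative Attribute implementations.
  NGramTokenizer tokenizer = new NGramTokenizer(
      AttributeSource.AttributeFactory.DEFAULT_ATTRIBUTE_FACTORY,
      new StringReader("factory"), 1, 3);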
-
Method Detail
-
incrementToken
public boolean incrementToken() throws IOException
Advances to the next token in the stream; returns false at end of stream (EOS).
Specified by:
  incrementToken in class TokenStream
Returns:
  false for end of stream; true otherwise
Throws:
  IOException
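A sketch of the full consumer contract around incrementToken(): advance until it returns false, then call end() and close(). Offsets are read through OffsetAttribute (org.apache.lucene.analysis.tokenattributes.OffsetAttribute); other imports as in the first sketch.

  NGramTokenizer tokenizer = new NGramTokenizer(new StringReader("abcd"), 2, 2);
  CharTermAttribute term = tokenizer.addAttribute(CharTermAttribute.class);
  OffsetAttribute offsets = tokenizer.addAttribute(OffsetAttribute.class);
  while (tokenizer.incrementToken()) {            // true while tokens remain
    System.out.println(term + " [" + offsets.startOffset()
        + "," + offsets.endOffset() + ")");       // e.g. "ab [0,2)", "bc [1,3)", ...
  }
  tokenizer.end();    // end-of-stream bookkeeping (see end() below)
  tokenizer.close();  // releases the underlying Reader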
-
end
public void end()
Description copied from class: TokenStream
This method is called by the consumer after the last token has been consumed, i.e. after TokenStream.incrementToken() has returned false (using the new TokenStream API). Streams implementing the old API should upgrade to use this feature. This method can be used to perform any end-of-stream operations, such as setting the final offset of a stream. The final offset of a stream might differ from the offset of the last token, e.g. when one or more whitespace characters followed the last token but a WhitespaceTokenizer was used.
Overrides:
  end in class TokenStream
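Tying this to the attributes: after the stream is exhausted and end() has been called, the OffsetAttribute is conventionally left holding the final offset. The following is a hedged sketch of that convention, not a guarantee about this particular override; imports as above.

  NGramTokenizer tokenizer = new NGramTokenizer(new StringReader("abc"), 1, 1);
  OffsetAttribute offsets = tokenizer.addAttribute(OffsetAttribute.class);
  while (tokenizer.incrementToken()) { /* drain the stream */ }
  tokenizer.end();
  // Assumption: following the usual Tokenizer convention, endOffset() now
  // reports the final offset of the stream rather than that of the last token.
  int finalOffset = offsets.endOffset();
  tokenizer.close();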
-
reset
public void reset() throws IOException
Description copied from class: TokenStream
Resets this stream to the beginning. This is an optional operation, so subclasses may or may not implement this method. TokenStream.reset() is not needed for the standard indexing process. However, if the tokens of a TokenStream are intended to be consumed more than once, it is necessary to implement TokenStream.reset(). Note that if your TokenStream caches tokens and feeds them back again after a reset, it is imperative that you clone the tokens when you store them away (on the first pass) as well as when you return them (on future passes after TokenStream.reset()).
Overrides:
  reset in class TokenStream
Throws:
  IOException
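A sketch of reuse under this contract. It assumes the inherited Tokenizer reset(Reader) method swaps in fresh input; the extra no-arg reset() call is defensive, in case the Reader-accepting overload does not already rewind this tokenizer's internal position. Imports as in the first sketch.

  NGramTokenizer tokenizer = new NGramTokenizer(new StringReader("first"), 2, 2);
  CharTermAttribute term = tokenizer.addAttribute(CharTermAttribute.class);
  while (tokenizer.incrementToken()) { System.out.println(term); }
  tokenizer.end();

  // Reuse the same instance on new input (assumption: reset(Reader) replaces
  // the input, reset() rewinds this tokenizer's own state).
  tokenizer.reset(new StringReader("second"));
  tokenizer.reset();
  while (tokenizer.incrementToken()) { System.out.println(term); }
  tokenizer.end();
  tokenizer.close();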