1. Demo
import java.io.IOException;
import java.io.StringReader;

import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.cjk.CJKAnalyzer;
import org.apache.lucene.analysis.core.KeywordAnalyzer;
import org.apache.lucene.analysis.core.SimpleAnalyzer;
import org.apache.lucene.analysis.core.StopAnalyzer;
import org.apache.lucene.analysis.core.WhitespaceAnalyzer;
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

public class AnalyzerStudy {

    private static String str = "分词器测试 Lucene 案例 开发 by future for study";

    private static void print(Analyzer analyzer) {
        StringReader reader = new StringReader(str);
        try {
            TokenStream stream = analyzer.tokenStream("", reader);
            stream.reset();
            CharTermAttribute term = stream.getAttribute(CharTermAttribute.class);
            System.out.println("Analyzer: " + analyzer.getClass());
            while (stream.incrementToken()) {
                System.out.print(term.toString() + "|");
            }
            System.out.println();
            System.out.println("===========================================");
            stream.end();
            stream.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        /*
         * StandardAnalyzer: each Chinese character becomes its own token, each
         * run of English letters is one token; common stop words are dropped
         */
        print(new StandardAnalyzer());
        // WhitespaceAnalyzer: splits on whitespace only
        print(new WhitespaceAnalyzer());
        // SimpleAnalyzer: similar result to the whitespace analyzer, but lowercased
        print(new SimpleAnalyzer());
        // CJKAnalyzer: English is split on whitespace; Chinese characters are paired
        // with their neighbors into overlapping bigrams; default stop words are dropped
        print(new CJKAnalyzer());
        // KeywordAnalyzer: the whole input is emitted as a single token
        print(new KeywordAnalyzer());
        /*
         * StopAnalyzer: drops the default English stop words:
         * "a", "an", "and", "are", "as", "at", "be", "but", "by",
         * "for", "if", "in", "into", "is", "it",
         * "no", "not", "of", "on", "or", "such",
         * "that", "the", "their", "then", "there", "these",
         * "they", "this", "to", "was", "will", "with"
         */
        print(new StopAnalyzer());
    }
}
2. Output
Analyzer: class org.apache.lucene.analysis.standard.StandardAnalyzer
分|词|器|测|试|lucene|案|例|开|发|future|study|
===========================================
Analyzer: class org.apache.lucene.analysis.core.WhitespaceAnalyzer
分词器测试|Lucene|案例|开发|by|future|for|study|
===========================================
Analyzer: class org.apache.lucene.analysis.core.SimpleAnalyzer
分词器测试|lucene|案例|开发|by|future|for|study|
===========================================
Analyzer: class org.apache.lucene.analysis.cjk.CJKAnalyzer
分词|词器|器测|测试|lucene|案例|开发|future|study|
===========================================
Analyzer: class org.apache.lucene.analysis.core.KeywordAnalyzer
分词器测试 Lucene 案例 开发 by future for study|
===========================================
Analyzer: class org.apache.lucene.analysis.core.StopAnalyzer
分词器测试|lucene|案例|开发|future|study|
===========================================
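The CJKAnalyzer result above can be approximated without Lucene at all, which makes the bigram rule easy to see. The sketch below is a minimal assumption-laden stand-in, not Lucene's implementation: the class name `CjkSketch`, the method `cjkTokenize`, and the tiny two-word stop set are all hypothetical (Lucene's CJKAnalyzer actually uses its full default English stop word list). It splits on whitespace, lowercases Latin runs and drops stop words, and turns each run of Chinese characters into overlapping two-character bigrams.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class CjkSketch {
    // Hypothetical stop set for this sketch only; Lucene's CJKAnalyzer
    // uses a much larger default English stop word list.
    private static final Set<String> STOP = new HashSet<>(Arrays.asList("by", "for"));

    static List<String> cjkTokenize(String text) {
        List<String> tokens = new ArrayList<>();
        for (String seg : text.split("\\s+")) {
            if (seg.isEmpty()) continue;
            boolean isHan = seg.chars()
                    .allMatch(c -> Character.UnicodeScript.of(c) == Character.UnicodeScript.HAN);
            if (isHan) {
                if (seg.length() == 1) {
                    tokens.add(seg); // a lone CJK character is emitted as-is
                } else {
                    for (int i = 0; i + 1 < seg.length(); i++) {
                        tokens.add(seg.substring(i, i + 2)); // overlapping bigrams
                    }
                }
            } else {
                String lower = seg.toLowerCase();
                if (!STOP.contains(lower)) tokens.add(lower); // lowercase Latin, drop stop words
            }
        }
        return tokens;
    }

    public static void main(String[] args) {
        System.out.println(String.join("|",
                cjkTokenize("分词器测试 Lucene 案例 开发 by future for study")));
        // prints: 分词|词器|器测|测试|lucene|案例|开发|future|study
    }
}
```

Running it on the demo string reproduces the CJKAnalyzer line of the output above: the five-character run 分词器测试 yields four bigrams, while the two-character runs 案例 and 开发 each yield a single bigram.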