
Java StringTokenizer

In Java, StringTokenizer is used to break a string into tokens based on a provided delimiter. The delimiter can be specified either at the time of object creation or on a per-token basis.

A StringTokenizer object internally maintains a current position within the string to be tokenized. The class is located in the java.util package.

Each call that requests the next token advances this position and returns the token as a substring of the given string.
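As mentioned above, the delimiter can also be changed on a per-token basis by passing it to nextToken(String delim). A minimal sketch (the class name TokenDelimSwitch and the sample string are our own) is shown below; note that the delimiter set passed to nextToken() stays in effect for the following calls as well.

import java.util.StringTokenizer;

public class TokenDelimSwitch
{
	public static void main(String args[])
	{
		// fields are separated by a space first, then by commas
		StringTokenizer st = new StringTokenizer("Welcome to,studytonight,blog", " ");
		System.out.println(st.nextToken());      // Welcome   (space delimiter)
		// switch to a set containing both the old and the new delimiter,
		// so the space we are currently standing on is skipped as well
		System.out.println(st.nextToken(" ,"));  // to
		System.out.println(st.nextToken());      // studytonight  (the new set stays in effect)
		System.out.println(st.nextToken());      // blog
	}
}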

Note: StringTokenizer is a legacy class that is retained only for compatibility reasons; its use is discouraged in new code, where the split() method of String (or the java.util.regex package) is generally preferred.
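For comparison, splitting the same kind of string with String.split() looks like the sketch below (the class name SplitDemo is ours); split() takes a regular expression and returns the tokens as an array.

public class SplitDemo
{
	public static void main(String args[])
	{
		// split() takes a regular expression; a plain space works as the delimiter here
		String[] tokens = "Welcome to studytonight".split(" ");
		for (String token : tokens)
		{
			System.out.println(token);
		}
	}
}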

The image below illustrates how the tokenizer breaks a string into tokens.

[Image: string-tokenizer]

Following are the constructors of the StringTokenizer class (a small sketch of the third one follows the list):

1. StringTokenizer(String str)

2. StringTokenizer(String str, String delim)

3. StringTokenizer(String str, String delim, boolean returnDelims)
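When the third argument (returnDelims) is true, the delimiter characters themselves are also returned as tokens. A minimal sketch, assuming a class name ConstructorDemo of our own choosing:

import java.util.StringTokenizer;

public class ConstructorDemo
{
	public static void main(String args[])
	{
		// returnDelims = true: each delimiter character is returned as a token of its own
		StringTokenizer st = new StringTokenizer("a,b,c", ",", true);
		while (st.hasMoreTokens())
		{
			// prints five tokens, one per line: a , b , c (the commas are tokens too)
			System.out.println(st.nextToken());
		}
	}
}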

Following are the methods of the StringTokenizer class (a short sketch of some of them follows the list):

1. boolean hasMoreTokens()

2. String nextToken()

3. String nextToken(String delim)

4. boolean hasMoreElements()

5. Object nextElement()

6. int countTokens()
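countTokens() reports how many tokens are still left, while hasMoreElements() and nextElement() are Enumeration-style counterparts of hasMoreTokens() and nextToken() that return Object. A small sketch (the class name MethodsDemo is ours) showing them together:

import java.util.StringTokenizer;

public class MethodsDemo
{
	public static void main(String args[])
	{
		// with the single-argument constructor, whitespace is the default delimiter
		StringTokenizer st = new StringTokenizer("Java is fun");
		// countTokens() tells how many tokens are still available
		System.out.println("Tokens left: " + st.countTokens());   // 3
		// the Enumeration-style methods behave like hasMoreTokens()/nextToken() but return Object
		while (st.hasMoreElements())
		{
			System.out.println(st.nextElement());
		}
	}
}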

Example:

In this example, we use StringTokenizer to break a string into tokens using a space as the delimiter.

import java.util.StringTokenizer;

public class TokenDemo1
{
	public static void main(String args[])
	{
		// break the string into tokens using a space as the delimiter
		StringTokenizer obj = new StringTokenizer("Welcome to studytonight", " ");
		while (obj.hasMoreTokens())
		{
			// print each token on its own line
			System.out.println(obj.nextToken());
		}
	}
}

Output:

Welcome
to
studytonight

Example:

Let's take another example to understand the tokenizer. Here, we break a string into tokens using the colon (:) and the space character as delimiters.

import java.util.StringTokenizer;

public class TokenDemo2
{
	public static void main(String args[])
	{
		String a = " : ";
		String b = "Welcome : to : studytonight : . : How : are : You : ?";
		StringTokenizer c = new StringTokenizer(b, a);

		// countTokens() gives the number of available tokens up front
		int count1 = c.countTokens();
		for (int i = 0; i < count1; i++)
			System.out.println("token [" + i + "] : " + c.nextToken());
	}
}

Output:

token [0] : Welcome
token [1] : to
token [2] : studytonight
token [3] : .
token [4] : How
token [5] : are
token [6] : You
token [7] : ?