Problem Statement: Write a Python function to tokenize a given string. A token is defined as a sequence of non-whitespace characters. The function should return a list of tokens.
Examples:
tokenize("Hello World")     # ["Hello", "World"]
tokenize(" Foo Bar ")       # ["Foo", "Bar"]
tokenize(" Foo\tBar\nBaz")  # ["Foo", "Bar", "Baz"]
Constraints:
Hints:
- Use the str.split method to split the string into tokens.
- The re.split function can take a regular expression as an argument to split the string based on a pattern.

Solution:

```python
import re

def tokenize(s):
    # Split on runs of whitespace; filter out the empty strings
    # that re.split produces for leading/trailing whitespace.
    return [tok for tok in re.split(r'\s+', s) if tok]
```
This solution uses the re.split function from the re module to split the input string s on runs of one or more whitespace characters (\s+). Because leading or trailing whitespace makes re.split emit empty strings at the ends of the result, the list comprehension filters them out before returning the list of tokens.
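As a quick sanity check, note that Python's built-in str.split, called with no separator, already splits on arbitrary whitespace and never returns empty strings, so it matches the regex-based solution on all the examples. The helper name tokenize_builtin below is illustrative, not part of the problem statement:

```python
import re

def tokenize(s):
    # Regex-based version: split on whitespace runs, drop empties.
    return [tok for tok in re.split(r'\s+', s) if tok]

def tokenize_builtin(s):
    # str.split() with no argument splits on any whitespace run
    # and discards leading/trailing empty tokens automatically.
    return s.split()

# Both versions agree on the problem's examples.
for s in ["Hello World", "  Foo  Bar  ", " Foo\tBar\nBaz"]:
    assert tokenize(s) == tokenize_builtin(s)

print(tokenize(" Foo\tBar\nBaz"))  # ['Foo', 'Bar', 'Baz']
```

In an interview, reaching for s.split() first and mentioning the regex form as the generalization (e.g. for custom delimiters) is usually the stronger answer.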