tokenizer a module of the standardization process that determines the boundaries within a character string marking the start and end of each name piece; cf. Chapter 4, §1, ¶4
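
A minimal sketch of such a module, assuming for illustration that whitespace and commas delimit name pieces (the actual delimiter rules are those given in Chapter 4, §1, ¶4):

    import re

    def tokenize_name(raw: str) -> list[tuple[int, int, str]]:
        """Return (start, end, piece) tuples marking the boundaries of each
        name piece within the raw character string.

        Assumes whitespace and commas are the delimiters; this is a
        hypothetical simplification of the rules in Chapter 4, §1, ¶4.
        """
        return [(m.start(), m.end(), m.group())
                for m in re.finditer(r"[^\s,]+", raw)]

    # Example: boundaries within "de la Cruz, Maria"
    # -> [(0, 2, 'de'), (3, 5, 'la'), (6, 10, 'Cruz'), (12, 17, 'Maria')]
    print(tokenize_name("de la Cruz, Maria"))

The sketch reports character offsets rather than only the extracted pieces, since the definition above concerns the boundaries themselves rather than the resulting substrings.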