Writing a database in Go: Part 2

This is a continuation of the first part of this series, where we discussed the data model and query language for this new database. In this second part we will set up our lexer.

Lexing

Lexing is the process of transforming textual input into tokens representing each keyword, identifier, or piece of punctuation. Fortunately, there is little we have to do to lex the input, thanks to a library I have written previously.
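To make the idea concrete, here is a minimal sketch of what a lexer produces. The token names, keyword set, and lexing rules below are invented for illustration; they are not the API of the library mentioned above, which handles much more (string literals, numbers, operators, error positions, and so on).

```go
package main

import (
	"fmt"
	"strings"
	"unicode"
)

// tokenKind and token are hypothetical names used only for this sketch.
type tokenKind int

const (
	keywordToken tokenKind = iota
	identifierToken
	symbolToken
)

type token struct {
	kind  tokenKind
	value string
}

// A tiny, illustrative keyword set.
var keywords = map[string]bool{"select": true, "from": true, "where": true}

// lex splits the input into keyword, identifier, and symbol tokens,
// skipping whitespace.
func lex(input string) []token {
	var tokens []token
	i := 0
	for i < len(input) {
		c := rune(input[i])
		switch {
		case unicode.IsSpace(c):
			i++
		case unicode.IsLetter(c):
			// Consume a run of letters/digits as one word.
			start := i
			for i < len(input) && (unicode.IsLetter(rune(input[i])) || unicode.IsDigit(rune(input[i]))) {
				i++
			}
			word := input[start:i]
			kind := identifierToken
			if keywords[strings.ToLower(word)] {
				kind = keywordToken
			}
			tokens = append(tokens, token{kind, word})
		default:
			// Every other character becomes a single-character symbol token.
			tokens = append(tokens, token{symbolToken, string(c)})
			i++
		}
	}
	return tokens
}

func main() {
	for _, t := range lex("SELECT name FROM users;") {
		fmt.Printf("%d:%s\n", t.kind, t.value)
	}
}
```

Running this splits `SELECT name FROM users;` into five tokens: two keywords, two identifiers, and one symbol. A real lexer additionally tracks source positions so later stages can report useful errors.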