Writing a database in Go: Part 3

This is a continuation of the second part of this series, where we discussed lexing the query language for this new database. In this third part we will write our parser. We start by defining the Parser. A parser takes the tokens produced by the lexer and creates AST nodes, which can then be used by a query executor. For PrefixDB, our parser is a simple struct with a TokenBuffer.

   Go, Databases, Parsing

Writing a database in Go: Part 2

This is a continuation of the first part of this series, where we discussed the data model and query language for this new database. In this second part we will set up our lexer. Lexing is the process of transforming textual input into tokens representing each keyword, character, or punctuation mark. Fortunately, there's little we have to do to lex the input, thanks to a library I have written previously.

   Go, Databases, Lexical analysis

Writing a database in Go: Part 1

Databases are a dime a dozen these days. So why in the world would I write another one? Well, mainly for fun, but also for specialization. Building a database which specializes in one thing can reap huge benefits in areas such as performance, data modeling, or even operations. So what kind of database are we building? We're going to build yet another key-value database. However, this one is going to be geared specifically for range scans and is called PrefixDB.