# SPL Lexer
A lexer (tokenizer) implementation in Go for parsing custom language/query syntax.
## Overview
This project implements a lexical analyzer that converts input strings into tokens for a custom language. The lexer recognizes various token types including symbols, operators, keywords, literals, and punctuation.
## Features
- Tokenizes input strings into meaningful components
- Supports various token types:
  - Symbols (identifiers)
  - Operators (comparison, logical)
  - Literals (strings, numbers)
  - Punctuation (braces, colons, pipes)
  - Comments
  - Whitespace
## Token Types
The lexer supports the following token types:

| Token type | Matches |
| --- | --- |
| `SYMBOL` | Identifiers |
| `COLON` | `:` |
| `EXCLAMATION` | `!` |
| `EQUAL` | `=` |
| `NOT_EQUAL` | `!=` |
| `AND` | `and` |
| `NOT` | `not` |
| `OR` | `or` |
| `MORE` | `>` |
| `LESS` | `<` |
| `MORE_EQUAL` | `>=` |
| `LESS_EQUAL` | `<=` |
| `OPEN_BRACE` | `(` |
| `CLOSED_BRACE` | `)` |
| `COMMENT` | `//` comments |
| `PIPE` | `\|` |
| `NUMBER` | Integer numbers |
| `FLOAT_NUMBER` | Floating-point numbers |
| `STRING_LITERAL` | String literals in quotes |
| `SPACE` | Whitespace |
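Two-character operators such as `!=` and `>=` share a first character with the single-character tokens `!` and `>`, so recognizing them takes one character of lookahead. The following is a minimal, self-contained sketch of that idea; it is independent of this repository's actual `lexer` package (whose internals are not shown here), and the `Token` struct and `lex` function are illustrative assumptions, not the project's real API:

```go
package main

import "fmt"

// Token pairs a token type name with the matched text.
// (Hypothetical type for illustration; not the repository's actual definition.)
type Token struct {
	Type  string
	Value string
}

// lex scans input and emits tokens for a small subset of the table above,
// using one character of lookahead to tell "!" from "!=" and ">" from ">=".
func lex(input string) []Token {
	var tokens []Token
	i := 0
	for i < len(input) {
		c := input[i]
		switch {
		case c == ' ' || c == '\t':
			i++ // this sketch skips whitespace instead of emitting SPACE tokens
		case c == '!':
			if i+1 < len(input) && input[i+1] == '=' {
				tokens = append(tokens, Token{"NOT_EQUAL", "!="})
				i += 2
			} else {
				tokens = append(tokens, Token{"EXCLAMATION", "!"})
				i++
			}
		case c == '>':
			if i+1 < len(input) && input[i+1] == '=' {
				tokens = append(tokens, Token{"MORE_EQUAL", ">="})
				i += 2
			} else {
				tokens = append(tokens, Token{"MORE", ">"})
				i++
			}
		case c >= 'a' && c <= 'z':
			// Consume a run of lowercase letters as a SYMBOL (identifier).
			start := i
			for i < len(input) && input[i] >= 'a' && input[i] <= 'z' {
				i++
			}
			tokens = append(tokens, Token{"SYMBOL", input[start:i]})
		default:
			i++ // ignore characters this sketch does not handle
		}
	}
	return tokens
}

func main() {
	for _, t := range lex("count >= limit") {
		fmt.Printf("%s(%s) ", t.Type, t.Value)
	}
	fmt.Println()
	// prints: SYMBOL(count) MORE_EQUAL(>=) SYMBOL(limit)
}
```

The same lookahead pattern extends to `<=` and to distinguishing `/` from a `//` comment.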
## Usage
To use the lexer, call the `Parse` function from the `lexer` package:
```go
package main

import (
	"fmt"

	"github.com/e1lama/spl/lexer"
)

func main() {
	tokens := lexer.Parse("some input string")
	fmt.Println(tokens) // process tokens as needed
}
```
## Project Structure
- `main.go` - Main entry point (currently empty)
- `lexer/lexer.go` - Core lexer implementation
- `lexer/tokens.go` - Token definitions and types
## License
MIT