# CSV
| Module | csv |
|---|---|
| Author | Luma Contributors |
| Version | 1.0.0 |
| License | MIT |
A pure Luma CSV library for parsing, querying, and converting comma-separated data. No external dependencies — works with Luma’s built-in file I/O and JSON support.
## Features
- Read CSV files or parse CSV strings
- Custom delimiters (tabs, semicolons, etc.)
- Quoted field handling (commas inside quotes)
- Column access by name
- Filtering rows by column value
- Convert to maps or JSON
- Write back to CSV with proper quoting
## Quick Start
```luma
csv = import "csv"

// Read a CSV file
table: [[str]] = csv.read("data.csv")

// Access data
print(csv.headers(table))        // [name age city]
print(csv.col(table, "name"))    // [Alice Bob Charlie]
print(csv.get(table, 0, "name")) // Alice

// Filter rows
engineers: [[str]] = csv.where(table, "role", "engineer")

// Convert to JSON
print(csv.to_json(table))
```

## Installation
- Download the module file: `csv.luma`
- Place it in your project directory (or a `lib/` folder)
- Import it in your code:

```luma
csv = import "csv"
```

## Design
A CSV table is [[str]] — a list of rows, where each row is a list of field strings. The first row (table[0]) is always the header. Fields are split during parsing, so all accessor functions work with already-parsed data.
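To make the layout concrete, here is a small sketch using a hand-written table literal (hypothetical data, assuming `csv` is already imported; syntax follows the examples elsewhere in this README):

```luma
// Row 0 is the header; every later row is one record.
table: [[str]] = [
    ["name", "age", "city"],
    ["Alice", "30", "Tallinn"],
    ["Bob", "25", "Riga"]
]

print(csv.headers(table))        // [name age city]
print(csv.count(table))          // 2 (data rows only)
print(csv.get(table, 0, "name")) // Alice (row indices skip the header)
```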
## API Reference
| Function | Description |
|---|---|
| `read(path: str) -> [[str]]` | Read a CSV file (comma-delimited) |
| `read_delim(path: str, delim: str) -> [[str]]` | Read a file with a custom delimiter |
| `parse(text: str) -> [[str]]` | Parse a CSV string (comma-delimited) |
| `parse_delim(text: str, delim: str) -> [[str]]` | Parse a string with a custom delimiter |
| `headers(table: [[str]]) -> [str]` | Get header names |
| `rows(table: [[str]]) -> [[str]]` | Get all data rows (without header) |
| `row(table: [[str]], index: int) -> [str]` | Get a single data row by index (0-based) |
| `col(table: [[str]], name: str) -> [str]` | Get all values in a column by name |
| `get(table: [[str]], row: int, col: str) -> str` | Get a single cell by row index and column name |
| `count(table: [[str]]) -> int` | Count data rows (excluding header) |
| `to_maps(table: [[str]]) -> [{str: str}]` | Convert to a list of header-keyed maps |
| `to_json(table: [[str]]) -> str` | Convert to a JSON array of objects |
| `where(table: [[str]], col: str, val: str) -> [[str]]` | Filter rows where a column matches a value |
| `to_string(table: [[str]]) -> str` | Convert table back to a CSV string |
| `write(path: str, table: [[str]])` | Write table to a CSV file |
## Usage
### Reading CSV
```luma
// From a file
table: [[str]] = csv.read("data.csv")

// From a string
text: str = "name,age,city\nAlice,30,Tallinn\nBob,25,Riga"
table: [[str]] = csv.parse(text)

// Tab-separated
tsv: [[str]] = csv.read_delim("data.tsv", "\t")
```

### Accessing data
```luma
h: [str] = csv.headers(table)         // [name age city]
all: [[str]] = csv.rows(table)        // all data rows
first: [str] = csv.row(table, 0)      // first data row
names: [str] = csv.col(table, "name") // all values in "name" column
val: str = csv.get(table, 0, "name")  // single cell
n: int = csv.count(table)             // number of data rows
```

### Filtering
`where` returns a new table (with header) containing only matching rows:
```luma
engineers: [[str]] = csv.where(table, "role", "engineer")
csv.rows(engineers).walk(r) -> print(sprint(r))
```

### Converting
```luma
// To maps (header name -> value)
maps: [{str: str}] = csv.to_maps(table)

// To JSON
j: str = csv.to_json(table)
print(j)
```

### Writing
```luma
// To string
output: str = csv.to_string(table)

// To file
csv.write("output.csv", table)
```

### Quoted fields
The parser handles quoted fields correctly — commas inside quotes are preserved:
```luma
text: str = "name,note\nAlice,\"likes commas, really\""
table: [[str]] = csv.parse(text)
print(csv.get(table, 0, "note")) // likes commas, really
```

When writing, fields containing commas, quotes, or newlines are automatically quoted.
## Complete Example
```luma
csv = import "csv"

// Read the data
table: [[str]] = csv.read("employees.csv")
print("Loaded ${csv.count(table)} employees")

// Show all names
names: [str] = csv.col(table, "name")
print("Names: " + sprint(names))

// Filter engineers
engineers: [[str]] = csv.where(table, "role", "engineer")
print("Engineers: ${csv.count(engineers)}")

// Convert to JSON and save
file("engineers.json").write(csv.to_json(engineers))

// Write filtered CSV
csv.write("engineers.csv", engineers)
```