At least until Tractor Loader takes off, my day job involves building the SingleStore database and managed service. SingleStore is the evolution and new name for MemSQL, which began years ago as a distributed in-memory SQL database. It's an independent engine, which makes it particularly interesting to work on and gives it unique strengths compared to PostgreSQL, MySQL, and other SQL databases.
One of the downsides to being a unique engine, as opposed to, say, a PostgreSQL plugin, is that there's less written about when and how to use SingleStore. So, I'm going to cover a few use cases where SingleStore works well, from the ground up, and apply them to my own web sites as examples.
So what's one good use case? When you need analytics!
My web sites mostly run on Vercel. I'd like an independent count of page visits, since Vercel's own analytics are mostly an upsell.
SingleStore is great for this scenario thanks to its two-tier table storage: a rowstore segment holds fresh data for fast writes and reads, and is periodically flushed to a columnar format and pushed to bottomless object storage. The application gets the best of both worlds, fast analytics over fresh data, and the database doesn't have to work very hard.
Here's a real-time view of this page's traffic, served from SingleStore, since I wrote this article. Try refreshing! Try to overwhelm my system by sharing this article with all of your friends.
Since this blog is more aspirational than popular, most pages don't get many views. But since I wrote this article, one did: DeepSeek My User Agent. Check out the traffic for that page!
OK, first: if you just want analytics, you should probably just buy something like Vercel's analytics or Fathom. But if you want to BUILD an analytics SaaS, or strictly do it yourself, then SingleStore can be the database for you.
First, you'll need a SingleStore instance. I've previously set up an account for my personal projects on SingleStore's Free Tier, which I currently use as a caching layer for weather.bingo. I'll use it for my analytics backend as well.
I'll be connecting to SingleStore using the SingleStore Kai endpoint, which supports MongoDB client drivers. Why MongoDB? I am a shill for SingleStore Kai, and I also want a client-first schema because it's easier. The same thing can be accomplished through the SQL interface, so use whichever you prefer.
I'll store my data in an analytics collection. The SingleStore Kai proxy will create the backing table with just two main columns, an _id and a _more. This can be ported to a more optimized schema later (the subject of a future article…).
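To make that shape concrete, here's a sketch of the document each page view will insert. The field names match the tracking route handler below; the Visit type name is my own invention, not anything Kai requires.

```typescript
// A sketch of the document each page view inserts. Field names follow the
// tracking route handler; the Visit type is hypothetical, for illustration.
type Visit = {
  hostname: string;
  path: string;
  country: string | null;
  region: string | null;
  city: string | null;
  ua: unknown; // parsed user-agent object from next/server's userAgent()
  ts: Date;
};

// Kai generates an ObjectId for the _id column; everything else lands in
// the _more column as a single BSON document.
const exampleVisit: Visit = {
  hostname: "example.com",
  path: "/deepseek-my-user-agent",
  country: "US",
  region: "CA",
  city: "San Francisco",
  ua: { browser: { name: "Chrome" } },
  ts: new Date(),
};
```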
My Vercel site is static and cached in many places, so I'll need to trigger the user's browser to make an API call on every page to be counted. I'll use a tried-and-true option: an invisible tracking pixel. This will call a server API where I can log the request.
Here's the client code I'll add to my root layout.tsx:
<img src="/api/tracking" alt="" width={1} height={1} className="hidden" />
Here's the API route handler.
import clientPromise from "../../mongodb";
import { waitUntil } from "@vercel/functions";
import { NextRequest, NextResponse, userAgent } from "next/server";

const pixel = Buffer.from(
  // curl https://upload.wikimedia.org/wikipedia/commons/c/c0/Blank.gif | base64
  "R0lGODlhAQABAIAAAP///wAAACH5BAAAAAAALAAAAAABAAEAAAICRAEAOw==",
  "base64"
);

export async function GET(req: NextRequest): Promise<NextResponse> {
  let path = "";
  let hostname = "";
  try {
    const url = new URL(req.headers.get("referer")!);
    path = url.pathname;
    hostname = url.hostname;
  } catch {
    // OK: missing or malformed referer; log the visit anyway
  }
  // send to DB in background
  waitUntil(
    (await clientPromise)
      .db()
      .collection<any>("analytics")
      .insertOne({
        hostname: hostname,
        path: path,
        country: req.headers.get("x-vercel-ip-country"),
        region: req.headers.get("x-vercel-ip-country-region"),
        city: req.headers.get("x-vercel-ip-city"),
        ua: userAgent(req),
        ts: new Date(),
      })
  );
  return new NextResponse(pixel, {
    headers: {
      "Content-Type": "image/gif",
      // prevent CDN/browser caching so every page view reaches this handler
      "Cache-Control": "no-store",
    },
  });
}
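The try/catch around the referer is doing quiet but important work: direct visits and many bots send no Referer header at all, and the handler should degrade to empty strings rather than fail the request. Pulled out as a pure helper (hypothetical, not part of the actual route), the behavior looks like this:

```typescript
// Hypothetical helper mirroring the try/catch in the route above: a missing
// or malformed Referer header yields empty strings instead of throwing, so
// the visit still gets logged.
function parseReferer(referer: string | null): { hostname: string; path: string } {
  try {
    const url = new URL(referer!); // throws on null or invalid input
    return { hostname: url.hostname, path: url.pathname };
  } catch {
    return { hostname: "", path: "" };
  }
}
```

For example, parseReferer("https://example.com/post?x=1") yields hostname "example.com" and path "/post", while parseReferer(null) falls through to empty strings.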
I want to display up-to-the-second stats for visitors to my page, so I'll create an API route to serve the data. I asked ChatGPT to write this MongoDB aggregation; GPT-4o is quite good at MongoDB queries.
import clientPromise from "../../mongodb";
import { NextRequest, NextResponse } from "next/server";

export async function GET(req: NextRequest): Promise<NextResponse> {
  const { searchParams } = new URL(req.url);
  const path = searchParams.get("path");
  const col = (await clientPromise).db().collection("analytics");
  const response = await col
    .aggregate([
      { $match: { path } },
      {
        $group: {
          _id: "$country",
          count: { $sum: 1 },
        },
      },
      {
        $unionWith: {
          coll: "analytics",
          pipeline: [
            { $match: { path } },
            { $group: { _id: "Total", count: { $sum: 1 } } },
          ],
        },
      },
      { $project: { _id: { $ifNull: ["$_id", "Other"] }, count: 1 } },
      { $sort: { count: -1 } },
      { $limit: 10 },
    ])
    .toArray();
  return new NextResponse(JSON.stringify(response), {
    headers: {
      "Content-Type": "application/json",
    },
  });
}
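On the client, the page fetches this route and separates the Total row from the per-country rows before charting. A minimal sketch, assuming the route is mounted at /api/stats (the article doesn't name the path) and using the row shape the aggregation returns:

```typescript
// Row shape returned by the aggregation: one row per country, plus a
// "Total" row (names taken from the pipeline above).
type CountryCount = { _id: string; count: number };

// Hypothetical loader; /api/stats is an assumed mount point for the route.
async function loadStats(path: string): Promise<CountryCount[]> {
  const res = await fetch(`/api/stats?path=${encodeURIComponent(path)}`);
  return res.json();
}

// Separate the synthetic "Total" row from the per-country rows for display.
function splitTotal(rows: CountryCount[]): { total: number; countries: CountryCount[] } {
  const total = rows.find((r) => r._id === "Total")?.count ?? 0;
  return { total, countries: rows.filter((r) => r._id !== "Total") };
}
```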
Here is the data rendered with chart.js:
<Bar
  data={{
    labels: data.map((item: any) => item._id),
    datasets: [
      {
        label: "Views",
        data: data.map((item: any) => item.count),
        backgroundColor: "rgba(54, 162, 235, 0.5)",
      },
    ],
  }}
  options={{
    indexAxis: "y",
    animation: false,
    plugins: {
      tooltip: { enabled: false },
    },
    hover: {
      mode: null,
    },
  }}
/>
The analytics use case is a proven one for SingleStore, even at very large, multi-terabyte scale. For my own site, I won't need to scale beyond the free tier, even when I (hopefully) do start to get tens of thousands of views per article.
One of SingleStore's competitors, ClickHouse, publishes a benchmark in which SingleStore ranks among the best-performing databases for web-style analytics. But what that benchmark doesn't tell you is that SingleStore sustains that performance over arbitrary JSON and BSON data, maintains it under constant real-time ingestion, supports many joins in analytics queries, and includes many other features that make the same system suitable for a wide variety of use cases without complex ETL constantly moving data around.
I'll touch on more of those capabilities in some future articles. Thanks for reading!