Flaux
Unlisted · analytics · business · internal-tool

P&L Dashboards

Local-first weekly analytics for a service business

Customer, timesheet, and booking data from Launch27 and QuickBooks dropped into a single local HTML file. Runs in the browser, no upload, no cloud, no AI in the loop at runtime. Replaced a brittle spreadsheet that was eating an hour a week.

Available on request
Next.js · Tailwind

What it is

A local-only dashboard for the financial side of The Naturally Clean Co. Pulls in customer information, timesheet data, and booking data, runs the weekly analytics I care about, and never sends a byte of customer data anywhere.

Why local-only

Customer data should stay local. Addresses, contact info, sometimes access codes. That lives in our booking system (Launch27) and our timesheets (QuickBooks), and that's where it should stay. Pulling it into a third-party analytics tool, or even into a cloud spreadsheet, adds risk that doesn't pay back for what I'm using it for.

So the rule going in: no cloud, no AI in the runtime loop, no data leaves my machine.

How it works

It's literally one HTML file. The whole thing.

  1. Download the latest CSVs from Launch27 (bookings) and QuickBooks (timesheets). Both have native export buttons.
  2. Open the HTML file in a browser.
  3. Drag and drop the CSVs onto the page.
  4. The dashboard parses them in the browser, runs the joins and aggregations, and renders the views.
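Steps 3 and 4 come down to a drop handler feeding file contents into the parser. A minimal sketch of that wiring, assuming a drop-zone element and a `handleCsv` callback — both names are illustrative, not the dashboard's actual identifiers:

```javascript
// Wire a drop zone so dropped CSV files are read locally via FileReader.
// Nothing here makes a network request; the file contents stay in memory.
function wireDropZone(zone, handleCsv) {
  zone.addEventListener("dragover", (e) => e.preventDefault()); // allow drop
  zone.addEventListener("drop", (e) => {
    e.preventDefault();
    for (const file of e.dataTransfer.files) {
      const reader = new FileReader();
      reader.onload = () => handleCsv(file.name, reader.result);
      reader.readAsText(file); // read happens entirely on this machine
    }
  });
}

// Only runs in a browser; the guard keeps the file inert elsewhere.
if (typeof document !== "undefined") {
  const zone = document.getElementById("drop-zone"); // hypothetical element id
  if (zone) {
    wireDropZone(zone, (name, text) => {
      console.log(`${name}: ${text.length} characters read`);
    });
  }
}
```

The `dragover` preventDefault is the easy-to-forget part: without it the browser navigates to the dropped file instead of firing the `drop` event.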

No upload, no server, no API. Just JavaScript reading FileReader input, doing the maths in memory, drawing the charts.
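The parse-and-aggregate step can be sketched like this. The column names (`Date`, `Amount`) and function names are mine for illustration, not the real Launch27/QuickBooks headers, and the naive comma split assumes exports without quoted fields:

```javascript
// Naive CSV parse: header row to keys, each data row to an object.
// Fine for clean exports; would need a real parser for quoted commas.
function parseCsv(text) {
  const [header, ...rows] = text.trim().split("\n").map((l) => l.split(","));
  return rows.map((r) => Object.fromEntries(header.map((h, i) => [h, r[i]])));
}

// Monday-of-week key, so bookings and timesheets land on the same axis.
function weekKey(dateStr) {
  const d = new Date(dateStr);
  const day = (d.getUTCDay() + 6) % 7; // Monday = 0
  d.setUTCDate(d.getUTCDate() - day);  // roll back to Monday
  return d.toISOString().slice(0, 10);
}

// Sum one numeric column per week: the core aggregation primitive.
function totalsByWeek(rows, dateCol, valueCol) {
  const totals = {};
  for (const row of rows) {
    const key = weekKey(row[dateCol]);
    totals[key] = (totals[key] || 0) + Number(row[valueCol]);
  }
  return totals;
}
```

A weekly "join" is then just indexing two of these maps by the same week key, e.g. `revenue[week] / hours[week]` for an effective hourly rate.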

The shape this replaced

The previous version was a spreadsheet. A big one. Years of formulas, references between sheets, ad-hoc tabs for new questions, and the usual spreadsheet rot. When something broke, it took an hour to find the cell that lied.

The HTML version is easier to maintain because every "what if I want to see X this week" change is just code, with comments and version control. When I want a new chart or a different cut, I edit the file. The next time I drop the CSVs in, the new view is there.

What's interesting about how it was built

AI built most of the parsing and rendering code. But there's no AI in the loop at runtime. The deployed file has no API keys, no external requests, no inference. The model was a co-pilot during construction, not a dependency afterwards. Which is exactly the right division when customer data is involved.

Status

In active weekly use. I'm the only user. The "deployment" is a file on my laptop with a backup in a synced folder. There's no app to share and no plan to make one. The case study is here because the architecture (local-first, AI-in-the-build but not in the runtime) is worth pointing at, especially for any business that handles customer data.