How I Built a Sales Audit System Using a SQLite Backup in Capital City
The Problem: Auditing Sales Data Without a Live Connection
I was handed a challenge in the Capital City project: analyze two years of historical sales data for compliance reporting—but the original system was offline, and the only artifact I had was a single SQLite database backup. No APIs, no replication, no live access. Just a .db file sitting in cold storage.
My goal wasn’t to restore operations. It was to answer real business questions: What were the top-selling items by quarter? Were there any suspicious spikes in refunds? How did regional performance shift over time? And I needed to deliver this insight securely to stakeholders—without risking production data integrity.
The catch? I couldn’t connect to the live app database, and I couldn’t modify the backup. So I had to build a read-only analysis layer that treated the SQLite file like a time capsule: observe, query, visualize—but never alter.
Restoring the Past: Read-Only SQLite in Laravel
My stack for Capital City is Laravel-based, so I looked for a way to mount the SQLite backup as a secondary, isolated database connection. The key was ensuring immutability—both technically and culturally—so no developer or admin could accidentally write to it.
First, I added a new connection in config/database.php:
'backup_sales' => [
    'driver' => 'sqlite',
    'url' => env('BACKUP_SALES_DATABASE_URL'),
    'database' => storage_path('backups/sales-archive-2023.db'),
    'prefix' => '',
    'foreign_key_constraints' => false,
    'busy_timeout' => 10000,
    'synchronous' => 'off',
    'immutable' => true, // Critical: enables SQLite's immutable mode
],
Setting 'immutable' => true was the game-changer: it tells SQLite to open the file in immutable (read-only) mode, so every write operation fails, even an INSERT issued through a raw query. I also set foreign_key_constraints to false, since the backup was created without enforcing them, and turned synchronous off, which is safe here because this connection never writes anyway.
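Before trusting the connection, I like a quick sanity check: verify the file isn't corrupt and confirm that writes really do fail rather than silently succeed. A minimal sketch against the backup_sales connection above (PRAGMA quick_check is standard SQLite; if your Laravel version doesn't recognize an immutable key, SQLite also exposes immutable mode through URI-style filenames like file:...?immutable=1, depending on driver support):

```php
use Illuminate\Support\Facades\DB;

// Run SQLite's built-in corruption check against the archive.
// quick_check is faster than integrity_check and returns 'ok' on success.
$result = DB::connection('backup_sales')->select('PRAGMA quick_check');

if (($result[0]->quick_check ?? null) !== 'ok') {
    throw new RuntimeException('Sales archive failed its integrity check.');
}

// Confirm immutability: any write should throw, never succeed.
try {
    DB::connection('backup_sales')->statement('DELETE FROM sales WHERE 1 = 0');
    report(new RuntimeException('Backup connection accepted a write!'));
} catch (\Illuminate\Database\QueryException $e) {
    // Expected: SQLite rejects writes on an immutable database.
}
```

Running this once at deploy time turns the "never alter" rule from a convention into an enforced invariant.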
Next, I created a dedicated Eloquent model:
namespace App\Models\Analysis;

use Illuminate\Database\Eloquent\Model;

class HistoricalSale extends Model
{
    protected $connection = 'backup_sales';
    protected $table = 'sales';

    public $timestamps = false;
    protected $guarded = [];
}
This model became my window into the past. I could now write standard Laravel queries:
HistoricalSale::where('region', 'North')
    ->whereBetween('sale_date', [$start, $end])
    ->sum('amount');
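Those one-liners scale naturally to the audit questions from the start. The quarterly top-sellers question, for example, reduces to a grouped aggregate. A sketch, assuming hypothetical item and amount columns and leaning on SQLite's strftime for date bucketing:

```php
use App\Models\Analysis\HistoricalSale;

// Top-selling items per quarter. Column names (`item`, `amount`,
// `sale_date`) are assumptions about the archived schema.
$quarterlyTopSellers = HistoricalSale::query()
    ->selectRaw("item, strftime('%Y', sale_date) AS year")
    ->selectRaw("((strftime('%m', sale_date) - 1) / 3) + 1 AS quarter")
    ->selectRaw('SUM(amount) AS total')
    ->groupBy('item', 'year', 'quarter')
    ->orderByDesc('total')
    ->get();
```

Note the dialect dependency: strftime is SQLite-specific, which is fine here because this connection will only ever point at the archive.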
But I didn’t stop at raw queries. I wanted stakeholders—product leads, finance, compliance—to explore the data themselves.
Querying the Past: Secure Filament Admin Integration
Enter Filament PHP. I used it to build a lightweight, role-gated admin panel where authorized users could explore sales trends without writing SQL.
I created a Filament Resource for HistoricalSale with custom filters for date ranges, regions, and product categories. I also added summary widgets to show high-level KPIs: total sales, average order value, refund rates.
// app/Filament/Resources/HistoricalSaleResource.php
public static function getWidgets(): array
{
    return [
        SaleStatsOverview::class,
    ];
}
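The SaleStatsOverview widget registered above is a small stats widget covering the KPIs I mentioned. A sketch of roughly what mine looks like, assuming Filament v3's StatsOverviewWidget API and a hypothetical refunded flag on the sales table:

```php
// app/Filament/Widgets/SaleStatsOverview.php
namespace App\Filament\Widgets;

use App\Models\Analysis\HistoricalSale;
use Filament\Widgets\StatsOverviewWidget as BaseWidget;
use Filament\Widgets\StatsOverviewWidget\Stat;

class SaleStatsOverview extends BaseWidget
{
    protected function getStats(): array
    {
        $total   = HistoricalSale::sum('amount');
        $orders  = HistoricalSale::count();
        $refunds = HistoricalSale::where('refunded', true)->count(); // hypothetical column

        return [
            Stat::make('Total sales', number_format($total, 2)),
            Stat::make('Average order value', $orders ? number_format($total / $orders, 2) : '0.00'),
            Stat::make('Refund rate', $orders ? round(100 * $refunds / $orders, 1) . '%' : 'n/a'),
        ];
    }
}
```

Because every query runs against the immutable connection, the widget can be cached aggressively: the numbers literally cannot change.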
The real win was access control. Using Laravel’s gate system, I restricted the sales-audit panel to users with the compliance or finance-lead roles:
// In AuthServiceProvider
Gate::define('viewSalesAudit', function ($user) {
    return $user->hasRole('compliance') || $user->hasRole('finance_lead');
});
Then in Filament:
public static function canViewAny(): bool
{
    // Fall back to false for guests, since can() is never reached on null.
    return auth()->user()?->can('viewSalesAudit') ?? false;
}
Now, instead of developers running manual queries and emailing spreadsheets, stakeholders could log in and explore trends themselves. All of it backed by a static file: zero writes, zero risk.
This pattern has since inspired similar approaches in my Subscription Platform project, where I’m exploring ways to audit billing history using point-in-time backups.
Why This Matters Now
As SaaS platforms mature, the demand for auditability grows. Regulators want proof. Executives want insights. Developers want safety.
By treating backups as first-class data sources—and building secure, read-only analysis layers on top—I can deliver all three. No ETL pipelines. No data lakes. Just a .db file, Laravel, and a clear boundary between observation and operation.
If you're sitting on legacy data and wondering how to extract value without reactivating old systems, consider this: sometimes the best database for analysis is one that can’t be changed at all.