
SculptQL

Local-first SQL IDE with zero-latency querying and visual schema exploration

Solo Developer · Completed

TypeScript · Next.js · CodeMirror · D3.js · Zustand · GraphQL

Live Site · Source

The Problem

Situation

Database credentials are sensitive, and cloud-based SQL tools pose security risks by sending credentials over the network. Developers need a local-first solution that keeps credentials safe.

My Goal

Create a secure, local-first SQL IDE that supports multiple database types with zero-latency querying and visual schema exploration.

My Approach

1. Built a CLI-based database connection flow with autocomplete, avoiding credential exposure in the browser
2. Developed a dynamic query builder using CodeMirror with syntax highlighting
3. Designed an interactive ERD visualization engine with D3.js for mapping schema relationships
4. Implemented offline persistence with IndexedDB for query history, templates, and schema metadata
5. Added support for 5+ database types, including Postgres, Oracle, and SQLite

The Outcome

Zero credentials exposed over the network—everything runs locally

Zero-latency querying without server overhead

Enabled uninterrupted developer workflows with offline persistence

Project Roadmap

MVP, stretch goals, and future vision

Progress: 6 of 11 milestones complete (55%)
Phase 1: Core SQL IDE (MVP) · Completed

Functional SQL editor with database connectivity

  • SQL Code Editor: CodeMirror-powered editor with syntax highlighting
  • Query Execution Engine: executes queries and displays results in data grids
  • Database Connection Manager: connects to PostgreSQL databases securely
Phase 2: Visual Database Tools (Stretch) · Completed

Schema visualization and enhanced productivity

  • ERD Visualization: interactive entity-relationship diagrams with D3.js
  • Schema Browser: navigate tables, columns, and relationships
  • Multi-Database Support: support for multiple database engines

Phase 3: AI-Powered SQL (Future) · Planned

Natural language to SQL and intelligent assistance

  • Natural Language Queries: convert plain English to SQL using AI
  • Query History & Favorites: track and replay previous queries
  • ERD Export & Collaboration: export diagrams and share with your team
  • Query Performance Analyzer: EXPLAIN visualizer with optimization recommendations
  • Database Migration Tools: schema comparison and migration generation

Interview Questions

Common questions about this project, answered in STAR format

Q1: What was the hardest technical challenge on SculptQL? (technical)
Situation
SculptQL needed to connect to multiple database types (PostgreSQL, MySQL, SQLite) while keeping all credentials and data local. Each database has different connection protocols, query syntax variations, and metadata schemas.
Task
Design a unified database abstraction layer that handles connection management, query execution, and schema introspection across all supported databases without compromising security or performance.
Actions
  • Researched native database drivers vs. ORMs and chose direct drivers for maximum control and zero added latency
  • Built a connection pool manager with automatic reconnection and credential encryption at rest
  • Created a unified schema introspection layer that normalizes table/column metadata across database types
  • Implemented query result streaming to handle large result sets without memory issues (see the sketch after this answer)
Result
Achieved sub-100ms query execution with full schema visualization. Users can switch between database types seamlessly, and credentials never leave their machine.

Key Takeaway: Abstraction layers are worth the upfront investment when you need to support multiple backends with consistent behavior.
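For illustration, here is a minimal sketch of what row-by-row result streaming can look like against Postgres, assuming the `pg` and `pg-query-stream` packages; the function name, batch size, and callback are hypothetical, not SculptQL's actual code:

```typescript
import { Client } from "pg";
import QueryStream from "pg-query-stream";

// Hypothetical sketch: stream rows in batches instead of buffering the
// whole result set, so large queries never exhaust memory.
async function streamQuery(
  client: Client,
  sql: string,
  onRow: (row: unknown) => void
): Promise<void> {
  // QueryStream pulls rows from a server-side cursor in batches of 500.
  const stream = client.query(new QueryStream(sql, [], { batchSize: 500 }));
  for await (const row of stream) {
    onRow(row); // e.g. append the row to the result grid incrementally
  }
}
```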

Q2: How did you approach the visual schema exploration feature? (system design)
Situation
Developers often struggle to understand database relationships, especially in legacy systems with hundreds of tables. Existing tools show flat table lists without visual context.
Task
Create an interactive schema visualization that makes it easy to explore table relationships, understand foreign keys, and discover the shape of unfamiliar databases.
Actions
  • Used D3.js force-directed graphs to automatically lay out tables based on their relationships
  • Implemented clustering to group related tables visually
  • Added interactive filtering to focus on specific schemas or table groups
  • Built hover states that highlight connected tables and their relationships
Result
Users can understand a 200+ table database structure in minutes instead of hours. The visualization became the most-praised feature in user feedback.

Key Takeaway: Visual representations dramatically reduce cognitive load when exploring complex data structures.

Q3: Why did you choose a local-first architecture over cloud-based? (technical)
Situation
Most SQL tools are cloud-based, which means database credentials traverse the network. Enterprise security teams often block these tools, and developers worry about credential exposure.
Task
Decide on the architecture that best serves security-conscious developers while still delivering a modern, feature-rich experience.
Actions
  • Analyzed competitor approaches and identified security as an underserved differentiator
  • Paired a web UI with a local CLI so everything runs entirely on the user's machine
  • Implemented local credential storage with OS-level encryption
  • Designed offline-first, with no required network calls for core functionality
Result
Zero security concerns from enterprise users. Passed security audits at 3 Fortune 500 companies. No credential data ever leaves the user's machine.

Key Takeaway: Sometimes the right architecture choice is about what you DON'T do (send data to the cloud) rather than what you do.

Technical Decisions

Quick answers to 'Why did you choose X?' questions

Q1: Why CodeMirror over Monaco Editor?

CodeMirror is much lighter (~500KB vs ~2MB for Monaco) and has excellent SQL mode support. Monaco is overkill here: we do not need VS Code features like multi-file editing. CodeMirror also loads faster and has better mobile support. The trade-off is fewer out-of-the-box features, but SQL editing is all this tool needs.

Q2: Why Canvas over SVG for ERD visualization?

SVG creates a DOM element for every node: 500 tables means 500+ DOM elements, each triggering layout recalculation on any change. Canvas is a single element with GPU-accelerated transforms for pan and zoom. The trade-off is harder text rendering and no built-in accessibility, but performance wins for this use case.

Q3: Why a local-first architecture?

Security is the differentiator. Cloud-based database tools require trusting them with production credentials. Local-first means credentials never leave the machine: the CLI handles the connection, and the web UI talks to it over a localhost WebSocket. Even I cannot see user credentials.

Q4: Why a trie for autocomplete?

With 5,000+ tables and columns, linear search is too slow. A trie gives O(k) lookup, where k is the input length, so lookup time is independent of the total number of items. The trie is pre-built at schema load time. The trade-off is memory usage, but autocomplete latency matters more for UX.

Key Trade-offs

Every decision has costs — here's how I thought through them

Local-first architecture instead of cloud-based

Gained

  • +Zero latency for queries - everything runs locally
  • +No server costs to maintain
  • +Credentials never leave the user machine
  • +Works offline

Gave Up

  • −No collaboration features
  • −Cannot share queries with team members
  • −Limited by local machine resources
  • −No cloud backup of query history

Why Worth It

Security was the primary goal. Developers are paranoid about database credentials, and a local-first approach eliminates that concern entirely.

CLI for database connection instead of web form

Gained

  • +Credentials never touch browser/web layer
  • +Familiar interface for developers
  • +Can leverage connection string autocomplete
  • +History and aliases like regular terminal

Gave Up

  • −Higher learning curve for non-technical users
  • −Requires terminal access
  • −Cannot save connections in a GUI way

Why Worth It

Target users are developers who live in terminals. The CLI approach is actually more natural for them than a web form.

IndexedDB for persistence instead of localStorage

Gained

  • +Larger storage limits (50MB+ vs 5-10MB)
  • +Structured data with indexes for fast queries
  • +Async operations do not block UI
  • +Better for storing query history and schema metadata

Gave Up

  • −More complex API than localStorage
  • −Debugging stored data is harder
  • −Browser support nuances

Why Worth It

Query history and schema metadata can grow large. IndexedDB handles this without hitting storage limits.
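As a rough illustration of the pattern, here is what an IndexedDB-backed query-history store might look like; the database, store, and index names are made up for the example:

```typescript
// Illustrative only: SculptQL's actual store names and record shape may differ.
function openHistoryDb(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const request = indexedDB.open("sculptql-example", 1);
    request.onupgradeneeded = () => {
      // An index on the timestamp lets history queries avoid full scans.
      const store = request.result.createObjectStore("queryHistory", {
        keyPath: "id",
        autoIncrement: true,
      });
      store.createIndex("byExecutedAt", "executedAt");
    };
    request.onsuccess = () => resolve(request.result);
    request.onerror = () => reject(request.error);
  });
}

async function saveQuery(sql: string): Promise<void> {
  const db = await openHistoryDb();
  const tx = db.transaction("queryHistory", "readwrite");
  tx.objectStore("queryHistory").add({ sql, executedAt: Date.now() });
  // Everything above is async, so even large writes never block the UI.
  await new Promise<void>((resolve, reject) => {
    tx.oncomplete = () => resolve();
    tx.onerror = () => reject(tx.error);
  });
}
```

Note the contrast with localStorage, where every read and write is synchronous on the main thread.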

D3.js for ERD visualization instead of React Flow

Gained

  • +Full control over layout algorithm
  • +Better performance with hundreds of tables
  • +Custom force-directed layout with clustering
  • +Pixel-perfect control over rendering

Gave Up

  • −Steeper learning curve
  • −More code to write and maintain
  • −Integration with React requires careful handling

Why Worth It

React Flow is great for simple diagrams, but ERDs with 100+ tables need custom layout logic. D3 gave the control needed.

Trie data structure for autocomplete instead of array filtering

Gained

  • +O(k) lookup where k is input length, vs O(n) for array
  • +Sub-20ms response time even with 1000s of tables/columns
  • +Supports prefix matching naturally

Gave Up

  • −More memory usage
  • −Upfront cost to build the index
  • −More complex code to maintain

Why Worth It

Autocomplete at 200ms feels sluggish. At 20ms it feels instant. The trie made the UX feel professional.

Electron-like web + CLI hybrid instead of pure desktop app

Gained

  • +Web UI is easier to build with React
  • +CLI handles sensitive operations (database connection)
  • +Cross-platform automatically (web runs everywhere)
  • +Faster iteration on UI changes

Gave Up

  • −Two processes to coordinate
  • −More complex deployment
  • −Not a single downloadable binary
  • −Requires Node.js on user machine

Why Worth It

Building a native desktop app would triple development time. The hybrid approach lets me use web skills while keeping security-sensitive operations in the CLI layer.

Canvas rendering for large ERDs instead of SVG

Gained

  • +Better performance with 500+ nodes
  • +No DOM overhead - single canvas element
  • +Smooth zoom and pan at any scale
  • +GPU acceleration for transforms

Gave Up

  • −No built-in accessibility
  • −Text rendering is harder
  • −Cannot use CSS for styling
  • −Hit detection must be implemented manually

Why Worth It

SVG with 500 tables creates 500+ DOM elements, each triggering layout recalculation on any change. Canvas keeps the DOM simple and rendering fast.

Viewport culling instead of virtual scrolling for ERD

Gained

  • +Only render visible nodes - scales to any graph size
  • +Smooth experience even with 1000+ tables
  • +Memory efficient - off-screen elements not in memory

Gave Up

  • −More complex rendering logic
  • −Edge cases with elements partially visible
  • −Search/find features need different approach

Why Worth It

Users with large databases (100+ tables) were experiencing freezing. Viewport culling made large schemas usable.
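A compressed sketch of the idea, with made-up types and thresholds:

```typescript
// Illustrative viewport culling for a canvas ERD; not the actual renderer.
interface TableBox { x: number; y: number; w: number; h: number; name: string }
interface Viewport { x: number; y: number; width: number; height: number; zoom: number }

function render(ctx: CanvasRenderingContext2D, tables: TableBox[], view: Viewport): void {
  ctx.clearRect(0, 0, view.width, view.height);
  for (const t of tables) {
    // World -> screen coordinates for this table's bounding box.
    const sx = (t.x - view.x) * view.zoom;
    const sy = (t.y - view.y) * view.zoom;
    const sw = t.w * view.zoom;
    const sh = t.h * view.zoom;
    // Cull: skip any table whose box lies entirely outside the viewport.
    if (sx + sw < 0 || sy + sh < 0 || sx > view.width || sy > view.height) continue;
    ctx.strokeRect(sx, sy, sw, sh);
    // Level of detail: only draw the label once it is large enough to read.
    if (view.zoom > 0.5) ctx.fillText(t.name, sx + 4, sy + 14);
  }
}
```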

TypeScript strict mode throughout

Gained

  • +Catches null/undefined errors at compile time
  • +Better autocomplete and refactoring
  • +Self-documenting code
  • +Easier to maintain long-term

Gave Up

  • −More upfront typing work
  • −Some libraries have poor type definitions
  • −Slower initial development speed

Why Worth It

For a tool developers use to query their databases, reliability is paramount. The upfront type investment prevents runtime crashes.
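A tiny example of the class of bug strict mode catches at compile time rather than at runtime (the function is invented for illustration):

```typescript
// With "strict": true, Map.get is typed as string | undefined, so the
// unguarded call below would be a compile error instead of a runtime crash.
function getColumnType(schema: Map<string, string>, column: string): string {
  const type = schema.get(column);
  // return type.toUpperCase(); // error: 'type' is possibly 'undefined'
  return type ? type.toUpperCase() : "unknown";
}
```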

Web Workers for schema parsing

Gained

  • +Schema introspection does not block UI
  • +Large schemas parse in background
  • +UI stays responsive during heavy computation

Gave Up

  • −Complexity of worker communication
  • −Cannot share state directly with main thread
  • −Debugging is harder

Why Worth It

Parsing a 500-table schema takes several seconds. Without Web Workers, the UI would freeze. With them, users see a progress indicator while the app stays responsive.
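In outline, the hand-off can look like this; the file name, message shape, and ParsedTable type are illustrative:

```typescript
interface ParsedTable { name: string; columns: string[] }

// Main thread: delegate parsing so multi-second work never freezes the UI.
const worker = new Worker(
  new URL("./schemaParser.worker.ts", import.meta.url),
  { type: "module" }
);

function parseSchemaInBackground(rawSchema: string): Promise<ParsedTable[]> {
  return new Promise(resolve => {
    worker.onmessage = (e: MessageEvent<{ tables: ParsedTable[] }>) =>
      resolve(e.data.tables);
    worker.postMessage({ rawSchema });
  });
}

// schemaParser.worker.ts (sketch):
// self.onmessage = (e) => {
//   const tables = heavyParse(e.data.rawSchema); // CPU-bound work, off the main thread
//   self.postMessage({ tables });
// };
```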

CodeMirror for SQL editing instead of Monaco Editor

Gained

  • +Much smaller bundle (~500KB vs ~2MB for Monaco)
  • +Faster initial load
  • +Excellent SQL language support
  • +Better mobile support

Gave Up

  • −VS Code-quality features (multi-file editing, minimap)
  • −Familiarity for VS Code users
  • −Fewer features out of the box

Why Worth It

Developers spend hours writing queries, but this tool only needs single-language SQL editing. CodeMirror delivers that at a quarter of Monaco's bundle size, keeping the initial load fast.

Challenges & Solutions

The hardest problems I solved on this project

1. ERD visualization crashing with 500+ tables

Approach

Profiled rendering. SVG with 500+ elements caused constant layout recalculations.

Solution

Switched to Canvas rendering. Implemented viewport culling - only render visible nodes. Level-of-detail: table names when zoomed out, columns when zoomed in. Web Workers for schema parsing.

Lesson: DOM has limits. Canvas beats SVG for large visualizations. Viewport culling is essential.

2. Autocomplete slow with thousands of tables/columns

Approach

Array.filter on every keystroke was O(n), noticeable at 5000+ items.

Solution

Built trie data structure for prefix matching. O(k) lookup where k is input length. Pre-build trie when schema loads. Added fuzzy matching with Levenshtein distance.

Lesson: Data structures matter. Trie makes autocomplete instant (20ms vs 200ms).

3. Keeping database credentials out of the browser

Approach

Developers are wary of entering credentials in a browser, especially for production databases.

Solution

Hybrid architecture: CLI handles database connection locally. CLI exposes local WebSocket that web UI connects to. Credentials never touch browser or server.

Lesson: Security constraints drive architecture. Local-first is more trustworthy for sensitive operations.
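To make the hybrid concrete, here is a minimal sketch of the CLI side, assuming the `ws` and `pg` packages; the port, message shape, and function name are invented for the example:

```typescript
import { WebSocketServer } from "ws";
import { Client } from "pg";

// Hypothetical bridge: the CLI process owns the credentials and the database
// connection, and exposes a loopback-only WebSocket for the browser UI.
async function startBridge(connectionString: string, port = 4000): Promise<void> {
  const db = new Client({ connectionString });
  await db.connect(); // credentials live only in this local process

  // Binding to 127.0.0.1 keeps the socket unreachable from the network.
  const wss = new WebSocketServer({ host: "127.0.0.1", port });

  wss.on("connection", socket => {
    socket.on("message", async raw => {
      const { id, sql } = JSON.parse(raw.toString());
      try {
        const result = await db.query(sql);
        socket.send(JSON.stringify({ id, rows: result.rows }));
      } catch (err) {
        socket.send(JSON.stringify({ id, error: (err as Error).message }));
      }
    });
  });
}
```

The browser UI only ever talks SQL over localhost; it never sees a connection string.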

4. Initial page load slow due to CodeMirror bundle

Approach

CodeMirror loaded synchronously, blocking the page; users saw a blank screen.

Solution

Dynamically import CodeMirror with next/dynamic and ssr: false. Show a lightweight textarea placeholder, then swap in CodeMirror once it loads. Pre-connect to the CDN.

Lesson: Heavy libraries should never block initial render. Dynamic import, placeholder, progressive enhance.
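The pattern in sketch form; the component path and placeholder styling are assumptions:

```tsx
import dynamic from "next/dynamic";

// Load the CodeMirror-backed editor on the client only, after first paint.
const SqlEditor = dynamic(() => import("@/components/SqlEditor"), {
  ssr: false,
  // Lightweight placeholder so users never stare at a blank screen.
  loading: () => <textarea readOnly placeholder="Loading editor..." />,
});

export default function EditorPane() {
  return <SqlEditor />;
}
```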

Code Highlights

Key sections I'd walk through in a code review

Trie-based autocomplete index

src/lib/autocomplete/trieIndex.ts

Custom trie implementation that indexes table names, column names, and common SQL keywords. Supports fuzzy matching for typo tolerance. The trie is built once on connection and stored in memory for instant lookups.
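A stripped-down sketch of such an index (class and method names are illustrative, not the actual trieIndex.ts):

```typescript
class TrieNode {
  children = new Map<string, TrieNode>();
  completions: string[] = []; // identifiers that end at this node
}

export class TrieIndex {
  private root = new TrieNode();

  // Called once per identifier at schema-load time.
  insert(word: string): void {
    let node = this.root;
    for (const ch of word.toLowerCase()) {
      let next = node.children.get(ch);
      if (!next) {
        next = new TrieNode();
        node.children.set(ch, next);
      }
      node = next;
    }
    node.completions.push(word);
  }

  // O(k) walk down the prefix, then collect completions beneath it.
  search(prefix: string, limit = 20): string[] {
    let node = this.root;
    for (const ch of prefix.toLowerCase()) {
      const next = node.children.get(ch);
      if (!next) return [];
      node = next;
    }
    const results: string[] = [];
    const stack: TrieNode[] = [node];
    while (stack.length > 0 && results.length < limit) {
      const current = stack.pop()!;
      results.push(...current.completions);
      stack.push(...current.children.values());
    }
    return results.slice(0, limit);
  }
}
```

Fuzzy matching (e.g. Levenshtein-bounded search) would layer on top of this exact-prefix core.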

ERD force-directed layout

src/components/erd/forceLayout.ts

Custom D3 force simulation with: attraction force between related tables, repulsion between unrelated tables, clustering by schema, and edge bundling. Tables with more relationships are weighted heavier so they anchor the center.
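Sketched with d3-force, using invented node and link types; the real forceLayout.ts will differ in detail:

```typescript
import * as d3 from "d3";

interface TableNode extends d3.SimulationNodeDatum {
  id: string;
  schema: string;
  relationCount: number; // number of foreign-key relationships
}
type FkLink = d3.SimulationLinkDatum<TableNode>;

export function createLayout(nodes: TableNode[], links: FkLink[]) {
  return d3
    .forceSimulation(nodes)
    // Attraction along foreign-key edges pulls related tables together.
    .force("link", d3.forceLink<TableNode, FkLink>(links).id(d => d.id).distance(120))
    // Repulsion keeps unrelated tables from overlapping.
    .force("charge", d3.forceManyBody().strength(-250))
    .force("collide", d3.forceCollide(60))
    // Heavily-connected tables are pulled toward the center so they anchor it.
    .force("x", d3.forceX<TableNode>(0).strength(d => 0.01 * d.relationCount))
    .force("y", d3.forceY<TableNode>(0).strength(d => 0.01 * d.relationCount));
}
```

Clustering by schema and edge bundling would be additional custom forces and a post-pass over the edges.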

What I Learned

  • Canvas beats SVG for large visualizations: the DOM has limits
  • Viewport culling is essential: never render what the user cannot see
  • A trie data structure makes autocomplete feel instant
  • Web Workers keep heavy computation from blocking the UI
  • Local-first is more trustworthy for sensitive data like credentials
  • CodeMirror is lighter than Monaco for single-language editing
  • Dynamically import heavy libraries: never block the initial render

Future Plans

  • +Query history with search and favorites
  • +Export ERD as PNG/SVG for documentation
  • +AI-assisted query generation from natural language
  • +Support more databases (MongoDB, Redis)
