Insight206

TOOLS & PATTERNS

Airtable patterns

Data-first workflow systems, designed to evolve

Approach

I start Airtable work with the data. As systems take shape, I continuously test and refine table structure, relationships, and fields so that workflows, decision-making, and reporting can all be driven cleanly from the same foundation.

The aim is not just to support today’s use cases, but to arrive at a model that remains understandable and extensible as requirements change.

That modeling work is often paired with JavaScript-based automations. Being comfortable with scripting allows me to process, reshape, and validate data in ways that go beyond Airtable’s out-of-the-box automation steps—particularly where interface constraints, sequencing limitations, or cross-table coordination would otherwise become friction points. When needed, I also integrate external systems via APIs to enrich records or orchestrate multi-step workflows.
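To make the idea concrete, here is a minimal sketch of the kind of validate-and-reshape step an automation script might run before records move on. The field names (`Email`, `Status`) and record shapes are illustrative, not from a real base; in an Airtable script the records would come from `selectRecordsAsync` rather than literals.

```javascript
// Sketch of a validate/reshape step an Airtable automation script might run.
// Field names ("Email", "Status") are illustrative, not from a real base.

function partitionRecords(records) {
  const valid = [];
  const problems = [];
  for (const rec of records) {
    const email = (rec.Email || "").trim().toLowerCase();
    if (!email.includes("@")) {
      // Collect problems for reporting instead of failing the whole run.
      problems.push({ id: rec.id, reason: "missing or malformed Email" });
      continue;
    }
    // Reshape into the exact field payload a later update step expects.
    valid.push({ id: rec.id, fields: { Email: email, Status: "Ready" } });
  }
  return { valid, problems };
}

// In an Airtable script, `records` would come from
// `await table.selectRecordsAsync(...)`; plain objects stand in here.
const result = partitionRecords([
  { id: "rec1", Email: " Ada@Example.com " },
  { id: "rec2", Email: "" },
]);
// -> one cleaned, update-ready record and one flagged problem
```

Separating validation from the update itself keeps the automation debuggable: problem rows become visible output rather than silent skips.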

How the system feels to use matters just as much.

I work closely with stakeholders as systems take shape—testing assumptions, refining workflows, and aligning on what “good” looks like before patterns harden. I pay close attention to wayfinding, guardrails, and explanatory cues in interfaces, and I deliberately design to reduce cognitive load so users can understand where they are, what they can do next, and why the system behaves the way it does.

I take the same pragmatic stance with AI inside Airtable. Capabilities like Omni are an important part of Airtable’s direction, and they work best when the underlying data model carries clear meaning and shared context. When tables, relationships, and states align cleanly with Airtable’s concepts, AI can reason effectively within the system.

More complex scenarios—such as coordinated updates across multiple tables or recommendations that depend on timing, role, or operational state—require additional structure to behave reliably.

Shape matters. Context matters. Everything else builds on that.

Selected builds

Representative Airtable products that combine structured data, thoughtful interfaces, and practical automation.

Program and Project Management

Designed to replace Smartsheet with a more rigorous, unified framework for intake and execution—giving leaders clear visibility into capacity, utilization, and priorities while preparing the organization for a future ServiceNow transition.


This system was built to move a growing organization off of ad hoc planning tools and toward a more durable operating model. Work intake began as a structured request process, designed to capture the right information up front and guide requesters through a decision-aware flow. Conditional inputs and lifecycle states helped distinguish exploratory requests from work that was ready to move into execution, while giving stakeholders a shared view of demand before commitments were made.
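The lifecycle-state idea above can be sketched as a simple transition guard: a request may only move between allowed states, which is what kept exploratory requests from slipping straight into execution. The state names here are illustrative stand-ins, not the actual ones used.

```javascript
// Sketch of a lifecycle guard for intake requests. State names are
// illustrative; the real system defined its own vocabulary.

const TRANSITIONS = {
  Exploratory: ["Ready for Review"],
  "Ready for Review": ["Approved", "Exploratory"],
  Approved: ["In Execution"],
  "In Execution": [],
};

function canTransition(from, to) {
  return (TRANSITIONS[from] || []).includes(to);
}

canTransition("Exploratory", "Approved");      // false: must be reviewed first
canTransition("Ready for Review", "Approved"); // true
```

Encoding the allowed moves in one table-like structure means interfaces and automations can share a single source of truth about what "ready to move" means.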

Once approved, intake requests transitioned into projects that could be grouped under broader programs. Project managers built work breakdown structures directly in the system, defining activities, dependencies, and timelines in a way that supported both execution and reporting. Projects were not treated as static plans, but as living structures that could adapt as priorities shifted and scope evolved.

The hardest problem to solve was resourcing.

Multiple models were explored, including explicit time entry and manual allocation, before settling on an approach where resource utilization was derived directly from task assignments. This required significant data shaping beyond what Airtable provides out of the box—translating multi-week assignments, partial allocations, holidays, and PTO into usable weekly availability and utilization signals. Because the underlying data model had been designed with flexibility in mind, this late-stage pivot did not require a rewrite.
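The core of that derivation can be sketched as follows, assuming simplified shapes: each assignment carries a person, a set of week keys, and a fractional allocation, and PTO weeks zero out availability. In the real base these inputs came from linked-record queries, not literals.

```javascript
// Sketch: derive weekly utilization from task assignments, treating each
// week as 1.0 of capacity and PTO weeks as zero availability.
// Data shapes are illustrative simplifications of the real line items.

function weeklyUtilization(assignments, ptoWeeks) {
  const load = {}; // person -> week -> summed allocation
  for (const a of assignments) {
    load[a.person] = load[a.person] || {};
    for (const week of a.weeks) {
      load[a.person][week] = (load[a.person][week] || 0) + a.allocation;
    }
  }
  const report = [];
  for (const [person, weeks] of Object.entries(load)) {
    for (const [week, allocated] of Object.entries(weeks)) {
      const available = (ptoWeeks[person] || []).includes(week) ? 0 : 1;
      report.push({
        person,
        week,
        allocated,
        overAllocated: allocated > available, // also flags work booked over PTO
      });
    }
  }
  return report;
}

const report = weeklyUtilization(
  [
    { person: "Sam", weeks: ["2024-W01", "2024-W02"], allocation: 0.5 },
    { person: "Sam", weeks: ["2024-W02"], allocation: 0.75 },
  ],
  { Sam: ["2024-W01"] }
);
// W01: 0.5 booked during PTO (over); W02: 1.25 total allocation (over).
```

Deriving these signals from assignments, rather than asking people to log time, is what made the model sustainable: utilization stayed current as plans changed, with no extra data entry.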

The result was a resource management layer that allowed both individual contributors and managers to see load, capacity, and constraints clearly. These signals rolled up into organizational views showing utilization by role, team, and program, alongside intake throughput and project health. Project managers could baseline plans, report status, and surface risk, while leadership gained a coherent picture of delivery and demand across the organization.

In practice, the system covered a substantial portion of the capabilities organizations often rely on service management platforms to provide—while remaining closely aligned to how the team actually worked and adaptable as the organization prepared for a future transition to ServiceNow.


Meal Planning

Built to replace a spreadsheet-heavy, high-information planning process that was consuming weeks of manual effort and preventing the team from moving forward.


This system began as an evaluation exercise. The team was assessing external meal planning and nutrition platforms, but none fully addressed the combination of operational complexity, variability, and integration requirements involved. To explore whether a custom approach could work, I built an early proof of concept in Airtable focused on recipe management and meal planning, grounded in a clear semantic data model and supported by AI-assisted workflows.

That prototype made one thing clear: the challenge wasn’t generating meals; it was coordinating information.

From there, the system expanded into a full end-to-end planning platform. At its foundation was structured recipe and ingredient management, paired with condition-based nutrition requirements that could be applied across meals, days, and plans. The data model supported multi-day, multi-meal scaffolding, allowing planners to balance variety, nutritional constraints, and operational feasibility across an entire planning horizon.

Nutrition analysis and validation were embedded directly into this workflow. Through integration with Spoonacular, recipes and dishes could be analyzed for nutritional content, with results rolled up by meal and by day. These rollups were evaluated against defined condition requirements, turning nutrition from a manual reconciliation task into a first-class system behavior.
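The rollup-and-validate behavior can be sketched like this, with illustrative nutrient names and limits standing in for the real condition requirements; in the system, the per-recipe values came from the Spoonacular analysis step.

```javascript
// Sketch: roll recipe-level nutrition up to the day and evaluate it
// against condition-based requirements. Nutrients and limits are
// illustrative; real values came from the nutrition analysis step.

function checkDay(recipes, requirements) {
  const totals = {};
  for (const recipe of recipes) {
    for (const [nutrient, amount] of Object.entries(recipe.nutrition)) {
      totals[nutrient] = (totals[nutrient] || 0) + amount;
    }
  }
  const violations = [];
  for (const [nutrient, { min = 0, max = Infinity }] of Object.entries(requirements)) {
    const value = totals[nutrient] || 0;
    if (value < min) violations.push(`${nutrient} below ${min}`);
    if (value > max) violations.push(`${nutrient} above ${max}`);
  }
  return { totals, violations };
}

const day = checkDay(
  [
    { name: "Oatmeal", nutrition: { calories: 350, sodiumMg: 120 } },
    { name: "Lentil soup", nutrition: { calories: 500, sodiumMg: 900 } },
  ],
  { calories: { min: 1500, max: 2200 }, sodiumMg: { max: 1500 } }
);
// calories total 850 -> below-minimum violation; sodium 1020 passes.
```

Because validation runs against rollups rather than individual recipes, a planner can swap one dish and immediately see whether the whole day still satisfies its conditions.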

Equally important, the system was designed from the outset to connect planning decisions to procurement and execution. Grocery and ingredient requirements were derived directly from meal plans, consolidated across days and meals, and translated into structured ordering inputs. This ensured that planning outputs were immediately actionable, rather than becoming yet another intermediate artifact.
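The consolidation step amounts to summing ingredient quantities across every meal and day of a plan. The sketch below assumes units are already normalized; the real workflow handled unit conversion upstream, and the data shapes here are illustrative.

```javascript
// Sketch: derive a consolidated grocery list from a meal plan by summing
// ingredient quantities across all days and meals. Assumes units are
// normalized upstream; real plans required unit conversion first.

function consolidateGroceries(plan) {
  const needed = {}; // "ingredient|unit" -> total quantity
  for (const day of plan.days) {
    for (const meal of day.meals) {
      for (const ing of meal.ingredients) {
        const key = `${ing.name}|${ing.unit}`;
        needed[key] = (needed[key] || 0) + ing.quantity;
      }
    }
  }
  // Unpack the keys into structured ordering inputs.
  return Object.entries(needed).map(([key, quantity]) => {
    const [name, unit] = key.split("|");
    return { name, unit, quantity };
  });
}

const order = consolidateGroceries({
  days: [
    { meals: [{ ingredients: [{ name: "rice", unit: "g", quantity: 200 }] }] },
    { meals: [{ ingredients: [{ name: "rice", unit: "g", quantity: 150 }] }] },
  ],
});
// -> one line item: 350 g of rice
```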

The platform also supported the workflow of curation and iteration. Planners could review, adjust, and approve plans through purpose-built interfaces designed to reduce cognitive load, while backend automation coordinated with the company’s core ordering and fulfillment systems. The same foundation enabled rapid creation of recipe and meal variations to accommodate individual needs and allergies without duplicating data or fragmenting the model.

AI-assisted interactions were layered in to help planners navigate a high-information planning process—surfacing options, highlighting tradeoffs, and assisting with exploration—while remaining grounded in the system’s data model and operational constraints. The result was a substantial planning platform, rivaling modern meal planning products in scope, but tailored precisely to the organization’s workflows and realities.


Operational Systems for Multi-Site Programs

A connected suite of Airtable systems coordinating finance, programming, inventory, enrollment, and staffing across a multi-site organization.


I joined a team that had already recognized that continued growth couldn’t rely on spreadsheets alone, and that they needed to rethink how they planned, staffed, supplied, and operated ten weeks of programming across an expanding set of sites. My role was to work alongside them as they made that transition—helping translate operational reality into durable systems, and providing a steady, practical point of view on how to design workflows that could scale without losing clarity. The focus was on building systems that could support the organization’s current operations while remaining adaptable as it grew.

The first major constraint was inventory planning and material needs. Programming decisions directly determined what materials were required at each site, and the team needed a reliable way to answer questions like how many of a given item were needed, where it should be packed, and whether it already existed in inventory. I built an inventory planning system capable of processing large volumes of line items and translating program requirements into site-level packing and ordering needs. That planning workflow was tied into inventory tracking and ordering, providing visibility into what was needed, what was on hand, and what needed to be purchased, as well as what inventory returned at the end of the season.
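At its core, that planning logic nets requirements against on-hand stock: aggregate what each site needs, total it per item, and purchase only the shortfall. A minimal sketch, with data shapes standing in for the Airtable line items:

```javascript
// Sketch: translate program requirements into per-site packing needs and
// a purchase list by netting totals against on-hand inventory.
// Data shapes are illustrative stand-ins for the real line items.

function planInventory(requirements, onHand) {
  const bySite = {};      // site -> item -> quantity to pack
  const totalNeeded = {}; // item -> quantity across all sites
  for (const req of requirements) {
    bySite[req.site] = bySite[req.site] || {};
    bySite[req.site][req.item] = (bySite[req.site][req.item] || 0) + req.quantity;
    totalNeeded[req.item] = (totalNeeded[req.item] || 0) + req.quantity;
  }
  const toPurchase = {};
  for (const [item, needed] of Object.entries(totalNeeded)) {
    const shortfall = needed - (onHand[item] || 0);
    if (shortfall > 0) toPurchase[item] = shortfall;
  }
  return { bySite, toPurchase };
}

const plan = planInventory(
  [
    { site: "North", item: "glue sticks", quantity: 40 },
    { site: "South", item: "glue sticks", quantity: 25 },
  ],
  { "glue sticks": 50 }
);
// Pack 40 north and 25 south; buy only the 15-unit shortfall.
```

Keeping packing and purchasing as two outputs of one pass is what let a single planning run answer both "where does it go" and "what do we still need to buy".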

Inventory planning made another issue visible: programming definition was the upstream source of many downstream errors. To address this, I built a dedicated programming base that codified how programs were defined, scheduled, and maintained. Interfaces helped program creators shape their guides consistently, and curated definitions were published to downstream systems—particularly inventory and operations—so planning could rely on stable, well-structured inputs rather than ad hoc interpretation.

Upstream of these operational systems sat financial management. Early on, financial planning and controls were spread across multiple tools, making it difficult to connect planning assumptions to operational execution. A financial operations base was created to serve as the hub for planning, selling, estimating, and tracking. Data from this base set the operating plan for the season, while other bases published relevant signals back so consolidated reporting could be done without manual reconciliation.

A deliberate decision was made to use a multi-base architecture rather than force everything into a single system. Separation of concerns helped maintain role clarity across finance, programming, inventory, enrollment, and staffing, while still allowing the organization to operate as a coordinated whole. To make this work, the systems relied on Airtable sync, shared identifiers, and carefully designed data flows.
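The shared-identifier pattern can be sketched as a merge keyed on a stable ID: each base publishes its slice of a record, and a consolidating step combines them without manual reconciliation. Field names here are illustrative.

```javascript
// Sketch of the shared-identifier pattern: each base publishes records
// keyed by a stable ID, and a consolidating step merges them.
// Field names and IDs are illustrative.

function mergeBySharedId(...publishedSets) {
  const merged = {};
  for (const records of publishedSets) {
    for (const rec of records) {
      // Later publications layer on top of earlier ones, field by field.
      merged[rec.sharedId] = { ...(merged[rec.sharedId] || {}), ...rec };
    }
  }
  return Object.values(merged);
}

const consolidated = mergeBySharedId(
  [{ sharedId: "PRG-001", name: "Robotics", budget: 12000 }], // from finance
  [{ sharedId: "PRG-001", sitesScheduled: 4 }]                // from programming
);
// -> one record carrying both the budget and the scheduling signal
```

The stable ID is what makes the multi-base separation workable: each base owns its fields, and consolidation stays mechanical rather than interpretive.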

The result was an operational backbone that could scale with growth while remaining understandable to the teams running it. By intentionally shaping the data model and designing for real operational complexity, the organization gained better visibility, reduced manual effort, and a foundation that could continue to evolve as programs and sites expanded.


Cohort Management

Reworking an early Airtable implementation that couldn’t scale as the number of programs grew, turning it into a durable, multi-program system.


This system was rebuilt as the organization expanded the number and types of programs it needed to run. The original implementation was designed around a single program, with cohorts and events loosely connected and most workflows optimized for a straightforward “happy path” experience. As long as usage stayed within those bounds, it worked reasonably well.

That changed as the program footprint grew.

The core problem wasn’t tooling—it was structure. The existing data model had evolved incrementally, with relationships and lookup fields added as needs emerged. Wide tables made early progress easy, but over time they became a constraint: adding a new concept often meant adding new columns and then projecting those changes through a growing web of automations. This made the system increasingly difficult to extend and reason about as requirements expanded beyond a single program.

The rewrite started by stepping back and rethinking the underlying model. Programs, cohorts, events, and participants were deliberately separated and reconnected with clearer, more explicit relationships. This allowed the same enrollment, scheduling, and communication patterns to be reused across multiple programs without duplicating logic or introducing fragile exceptions.

The new system accounted for operational edge cases that the original design struggled with—participants missing events and needing reassignment, cohorts changing midstream, events being rescheduled or canceled, and individuals moving between cohorts. Rather than handling these situations manually or through one-off fixes, the model and automation were designed to absorb them predictably.

A significant focus was placed on giving program operators more direct control. Administrative interfaces were added to manage cohort assignments, events, and communications without requiring changes to automation code. Visibility into invitations and notifications was improved, making it possible to track delivery, identify failures, and retry or resend when needed—reducing the operational load on the team running the programs.

The system also addressed a practical friction point in day-to-day use. Previously, certain tasks—such as importing participant data—required users to switch from interfaces back into the base, increasing the risk of mistakes. This was replaced with interface-based CSV uploads and supporting automation, allowing data intake to happen safely within the same controlled environment where the rest of the work occurred.
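The parsing behind such an upload step can be sketched as below: split the text, map headers to fields, and flag rows that do not match rather than importing them. This is a simplification with illustrative headers; real CSV input with quoted fields needs a proper parser, and the actual automation did more validation than this.

```javascript
// Sketch of the parsing an interface-based CSV upload automation might do
// before creating records. Headers are illustrative; quoted fields would
// need a real CSV parser, not a naive split.

function parseCsv(text) {
  const [headerLine, ...lines] = text.trim().split("\n");
  const headers = headerLine.split(",").map((h) => h.trim());
  const rows = [];
  const rejected = [];
  lines.forEach((line, i) => {
    const cells = line.split(",").map((c) => c.trim());
    if (cells.length !== headers.length) {
      // Report the 1-based file line so operators can find and fix it.
      rejected.push({ line: i + 2, reason: "column count mismatch" });
      return;
    }
    rows.push(Object.fromEntries(headers.map((h, j) => [h, cells[j]])));
  });
  return { rows, rejected };
}

const upload = parseCsv("Name,Cohort\nAda,Spring\nBadRow");
// -> one parsed row ({ Name: "Ada", Cohort: "Spring" }) and one rejection
```

Surfacing rejected rows back through the interface is the piece that made intake safe: mistakes became visible feedback instead of silently corrupted records.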

The result was a cohort management platform that could support multiple programs with different structures and schedules, while remaining understandable and operable by the teams who relied on it.


CRM

A lightweight, domain-aware CRM built to support platform integration, product exploration, and realistic end-to-end demonstrations.


This CRM emerged out of a broader platform challenge rather than a desire to compete directly with established CRM products. The organization was consolidating activity data from many external partners in order to provide centralized reporting to another entity. Some partners operated mature CRM systems, while others relied on spreadsheets or ad hoc tools to manage their relationships.

At the same time, the backend product itself needed a realistic stand-in for external CRMs during development. Rather than relying on mock data or overly simplified assumptions, I built a working CRM on Airtable that could behave like a real client system—supporting ingestion, synchronization, and orchestration through the same APIs the platform would ultimately rely on.

The build started with familiar CRM concepts—accounts, contacts, and opportunities—but extended them with domain-specific structures relevant to the platform’s use cases. The CRM acted as an active participant in the system, coordinating data exchange via API calls, tracking state changes, and reflecting how information would move between client systems and the central platform in practice.

To support business discovery and enrichment, the system integrated with the Dun & Bradstreet API, allowing firmographic data to be pulled into the CRM and aligned with existing records. This helped test how external data sources could be incorporated into broader workflows and surfaced questions about data quality, matching, and ownership early in the product’s development.
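The matching problem that surfaced can be illustrated with a simple name-normalization sketch: align external firmographic records with existing accounts by a normalized company name. The normalization rules, fields, and IDs below are illustrative only; real matching relied on more signals than a name.

```javascript
// Sketch of aligning external firmographic records with existing CRM
// accounts by normalized name. Rules and fields are illustrative; real
// matching used more signals than a company name.

function normalizeName(name) {
  return name
    .toLowerCase()
    .replace(/[.,]/g, "")                   // drop punctuation
    .replace(/\b(inc|llc|ltd|corp)\b/g, "") // drop legal suffixes
    .replace(/\s+/g, " ")
    .trim();
}

function matchAccounts(externalRecords, accounts) {
  const byName = new Map(accounts.map((a) => [normalizeName(a.name), a]));
  const matched = [];
  const unmatched = [];
  for (const ext of externalRecords) {
    const account = byName.get(normalizeName(ext.companyName));
    if (account) matched.push({ accountId: account.id, enrichment: ext });
    else unmatched.push(ext); // queue for human review
  }
  return { matched, unmatched };
}

const result = matchAccounts(
  [{ companyName: "Acme, Inc.", employees: 120 }],
  [{ id: "acct1", name: "ACME Inc" }]
);
// "Acme, Inc." and "ACME Inc" both normalize to "acme", so they match.
```

Routing the unmatched set to human review, rather than forcing a match, is what surfaced the data quality and ownership questions early.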

In parallel, I explored what the CRM might look like outside of Airtable’s native interface. Using Softr as a frontend layer, I built a non-Airtable user experience backed by the same data and automation. This allowed the team to evaluate alternative interaction models and better understand how the backend system could support different user surfaces without changing its core logic.

The result was not a full-scale CRM replacement, but a purpose-built system that served multiple roles: a realistic integration partner for backend development, a demonstration asset for client conversations, and a way to explore where lightweight CRM capabilities might add value without overcommitting to a separate product.

Want to talk this through?

If you’re planning an Airtable build or refining a live system, I’m happy to compare notes and share practical patterns.