FnHCI And Penpot Abstraction Boundary
This note captures the current NEXUS direction for how Penpot relates to FnHCI, FnUI, and later cross-platform UI generation.
The short version is:
- business behavior is not modeled in terms of buttons
- Penpot components are not the canonical UI abstraction either
- `FnHCI` should own the reusable interaction primitives at the right semantic level
- Penpot, Blazor, HTML, Android, iOS, and similar targets should adapt or project those primitives rather than owning them
Why This Matters
The current design pressure is clear:
- we want a modeled "button" concept that is still meaningful across platforms
- we do not want HTML tags or Android widget classes to be the abstraction owner
- we do not want Penpot component structure to become the only source of truth
- we do want design surfaces like Penpot to stay useful and mappable
The imported discussion history already points in this direction:
- `019d174f-2ce1-7496-a7f3-2e5cae80727e.toml` says `FnUI` becomes a platform-neutral projection model and names lines such as `FnUI.Blazor`, `FnUI.HTML`, and `FnUI.android`
- `019d174f-2cd6-772c-97db-8fdcb16a0050.toml` says Penpot belongs in the target projection layer too
- `019d174e-eaa9-7b82-9c3f-c55499fe9fd6.toml` reinforces that business behavior does not know about buttons
Layering
The current intended layering is:
- business/domain behavior: commands, events, policies, and invariants
- `FnHCI` interaction primitive layer: reusable semantic controls and view composition
- design and authoring projections: Penpot files, components, variants, tokens, and board conventions
- runtime adapters: Blazor, HTML, Android, iOS, and later others
This means:
- domain behavior should not depend on `Button`
- a platform widget such as `<button>` or `android.widget.Button` should not be the canonical concept either
- Penpot's `Button.Counter` or other component names should map to a primitive, not define the primitive
The "Correct Level" For A Button
The current working hypothesis is that a Button is a valid cross-platform UI primitive if it is modeled at the interaction level instead of the runtime-widget level.
That means the primitive should capture things like:
- semantic role: what the button means in the interaction
- content: label, icon, or both
- state: enabled, disabled, busy, selected, or similar
- emphasis or affordance: primary, secondary, destructive, quiet, counter-style, and similar visual intent
- activation behavior: which interaction command it invokes
- accessibility-facing text: name, description, hint, or similar metadata
It should avoid hard-coding things like:
- HTML element names
- CSS class names
- Android or iOS control types
- Penpot shape ids
- Penpot variant ids
- renderer-specific event APIs
So the "button" concept we want is not:
- a DOM button
- a Material button
- a Penpot frame
It is a reusable FnHCI interaction primitive that those targets can render or project.
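As a sketch only, the interaction-level shape described above might be modeled like this. Every name here (`ButtonPrimitive`, `ButtonState`, the field names, and the example command `IncrementCount`) is hypothetical and illustrates the idea, not an existing FnHCI API:

```typescript
// Hypothetical sketch of an FnHCI-level button primitive.
// Nothing here names a DOM element, CSS class, Android control,
// or Penpot shape/variant id.

type ButtonState = "enabled" | "disabled" | "busy" | "selected";
type ButtonEmphasis =
  | "primary"
  | "secondary"
  | "destructive"
  | "quiet"
  | "counter";

interface ButtonPrimitive {
  role: string;            // semantic role in the interaction
  label?: string;          // textual content
  icon?: string;           // iconic content, as an abstract name
  state: ButtonState;
  emphasis: ButtonEmphasis;
  activates: string;       // the interaction command it invokes
  accessibleName: string;  // accessibility-facing name
  accessibleHint?: string; // accessibility-facing hint
}

// Example instance: an increment button for a counter interaction.
const incrementButton: ButtonPrimitive = {
  role: "counter-increment",
  label: "+",
  state: "enabled",
  emphasis: "counter",
  activates: "IncrementCount",
  accessibleName: "Increment count",
};
```

The point of the sketch is that every field stays at the interaction level, so any of the listed targets can render it without the primitive knowing about them.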
Penpot's Role
Penpot remains important, but its role is not that of the canonical abstraction.
Penpot is valuable as:
- a visual authoring surface
- a design-system and token surface
- a collaboration and review canvas
- a projection target and projection source
- a bridge while the dedicated Event Modeling and UI tooling is still emerging
Penpot should therefore be treated as:
- an inspectable artifact surface
- a design projection surface
- an adapter boundary
- a likely shared token and theming surface
not as:
- the owner of domain behavior
- the owner of the cross-platform interaction primitive model
Current LaundryLog Pressure
The current LaundryLog.penpot file already gives a good example of this split.
Observed Penpot structures include component-like shapes such as:
`Button.Counter`
That is useful evidence, but `Button.Counter` should likely map to a deeper FnHCI primitive shape such as:
`Button` with a `counter` or `increment/decrement` affordance variant
The reusable primitive would then project into:
- a Penpot component/variant arrangement
- a Blazor render tree
- an HTML element structure
- an Android native control
- an iOS native control
without changing the underlying interaction meaning.
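One way to illustrate that split (all names hypothetical, including the `Button.` component-naming convention): each target becomes a pure function from the shared primitive to a target-specific description, so platform details appear only inside the adapter:

```typescript
// Hypothetical target projections for a shared button primitive.
interface ButtonPrimitive {
  label: string;
  disabled: boolean;
  activates: string; // interaction command name
}

// HTML projection: only here does <button> appear.
function toHtml(b: ButtonPrimitive): string {
  const dis = b.disabled ? " disabled" : "";
  return `<button data-command="${b.activates}"${dis}>${b.label}</button>`;
}

// Penpot projection: only here does a component naming convention appear.
function toPenpotComponentName(b: ButtonPrimitive): string {
  return `Button.${b.activates}`;
}

const inc: ButtonPrimitive = {
  label: "+",
  disabled: false,
  activates: "IncrementCount",
};
// toHtml(inc) → '<button data-command="IncrementCount">+</button>'
// toPenpotComponentName(inc) → 'Button.IncrementCount'
```

A Blazor, Android, or iOS adapter would follow the same pattern: another function over the same primitive, with no change to the primitive itself.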
What This Suggests For FnTools
The first durable reusable code line should likely separate:
- `FnTools.FnHCI`: semantic interaction primitives and shared cross-platform meanings
- `FnTools.FnHCI.UI`: view structure, layout, composition, and state
- `FnAPI.Penpot`: Penpot artifact and backend access
- `FnMCP.Penpot`: higher-level live Penpot interaction helpers
The current design-token evidence also suggests a future shared token line, likely owned outside raw Penpot files but mappable to and from them.
That future line will likely need to distinguish:
- foundation tokens: raw scales, colors, spacing, opacity, radius, typography families, and similar bases
- semantic tokens: interaction-facing meanings such as button background, layer text, input border, and similar roles
- theme axes: dimensions such as density, color mode, and color theme
- active theme selection: which orthogonal theme combinations are currently applied
The current Penpot token examples also suggest that semantic tokens may need to sit above raw foundations in a structured way. Real examples include meanings such as:
- `buttonPrimary.background.default`
- `buttonSecondary.border.focused`
- `layerBase.text`
- `input.border`
That is useful because it is already closer to FnHCI and FnUI concerns than a raw palette alone, while still remaining independent of one renderer.
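A minimal sketch of that layering, assuming a simple name-reference scheme (the token names come from the examples above; the resolution mechanism and raw values are hypothetical):

```typescript
// Hypothetical two-layer token model: semantic tokens reference
// foundation tokens by name instead of hard-coding raw values.
const foundation: Record<string, string> = {
  "color.blue.600": "#2563eb",
  "color.gray.900": "#111827",
};

const semantic: Record<string, string> = {
  "buttonPrimary.background.default": "color.blue.600",
  "layerBase.text": "color.gray.900",
};

// Resolve a semantic token down to its foundation value,
// failing loudly on unknown names or dangling references.
function resolve(token: string): string {
  const ref = semantic[token];
  if (ref === undefined) throw new Error(`unknown semantic token: ${token}`);
  const value = foundation[ref];
  if (value === undefined) throw new Error(`dangling foundation reference: ${ref}`);
  return value;
}

// resolve("buttonPrimary.background.default") → "#2563eb"
```

Because the semantic layer only holds names, a theme can swap the foundation layer underneath it without touching any semantic meaning.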
The breakpoint-theme example suggests another important requirement:
- theme axes should remain orthogonal where possible
So instead of flattening everything into one combined token space such as:
- `light-mobile`
- `dark-mobile`
- `light-desktop`
- `dark-desktop`
the better model is likely:
- a color-mode axis
- a breakpoint axis
- later possibly brand, density, or contrast axes
with active theme selection combining them as needed.
That is closer both to Penpot's current theme model and to how CSS conditions compose.
That lets Penpot integration become strong without making Penpot the abstraction owner.
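The axis idea can be sketched as follows, with each axis varying a value independently (the axis names match the discussion above; the concrete values and functions are hypothetical):

```typescript
// Hypothetical orthogonal theme axes with an active selection,
// instead of one flattened light-mobile / dark-desktop token space.
type ColorMode = "light" | "dark";
type Breakpoint = "mobile" | "desktop";

interface ActiveTheme {
  colorMode: ColorMode;
  breakpoint: Breakpoint;
}

// A token value may depend on only one axis at a time.
function buttonBackground(t: ActiveTheme): string {
  return t.colorMode === "light" ? "#2563eb" : "#3b82f6";
}

function buttonMinHeight(t: ActiveTheme): number {
  return t.breakpoint === "mobile" ? 44 : 32;
}

const active: ActiveTheme = { colorMode: "dark", breakpoint: "mobile" };
// buttonBackground(active) → "#3b82f6"
// buttonMinHeight(active) → 44
```

With two axes of two values each, the flattened model needs four combined themes; with an added brand or contrast axis it multiplies again, while the orthogonal model only adds one axis.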
First Deterministic Surfaces To Aim For
This should gradually become deterministic tooling such as:
- a normalized Penpot component extractor
- a Penpot-to-`FnHCI` mapping surface
- a `FnHCI` primitive catalog
- runtime adapters from `FnHCI` primitives into specific host/render targets
The key goal is:
- the mapping rules become reviewable and toolable
- they are not trapped inside one AI model or one Penpot GUI session
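One way to keep the rules reviewable is to hold the Penpot-to-primitive mapping as plain data that can be diffed, validated, and tooled. A sketch under that assumption (the table shape and `mapComponent` helper are hypothetical; the `Button.Counter` entry comes from the LaundryLog observation above):

```typescript
// Hypothetical reviewable mapping from Penpot component names to
// FnHCI primitive descriptors, kept as plain data rather than
// inside one AI model or one Penpot GUI session.
interface PrimitiveRef {
  primitive: string;
  variant?: string;
}

const penpotToFnHci: Record<string, PrimitiveRef> = {
  "Button.Counter": { primitive: "Button", variant: "counter" },
};

function mapComponent(name: string): PrimitiveRef | undefined {
  return penpotToFnHci[name];
}

// mapComponent("Button.Counter") → { primitive: "Button", variant: "counter" }
// mapComponent("Unknown.Shape") → undefined
```

An unmapped component returning `undefined` is itself useful tooling output: it flags design artifacts that have drifted away from the primitive catalog.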
Near-Term Next Question
The next concrete modeling step is probably:
- define the first small primitive catalog for `FnHCI.UI`
Likely first candidates include:
- `Button`
- `TextInput`
- `Label`
- `List`
- `Section`
with Button as the clearest first pressure point because it is already visible in both the recorded discussions and the current Penpot artifact.
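A minimal sketch of that first catalog as a closed union (the kind names come from the candidate list above; the `CatalogEntry` shape and descriptions are hypothetical, not an existing `FnHCI.UI` API):

```typescript
// Hypothetical first FnHCI.UI primitive catalog as a closed union,
// so every mapping and adapter can be checked against it.
type PrimitiveKind = "Button" | "TextInput" | "Label" | "List" | "Section";

interface CatalogEntry {
  kind: PrimitiveKind;
  description: string;
}

const catalog: CatalogEntry[] = [
  { kind: "Button", description: "activates an interaction command" },
  { kind: "TextInput", description: "captures free-form text" },
  { kind: "Label", description: "names or annotates another primitive" },
  { kind: "List", description: "repeats a primitive over a collection" },
  { kind: "Section", description: "groups related primitives" },
];
```

Keeping the catalog closed means an unknown primitive name is a compile-time or validation error rather than a silent fallthrough in some adapter.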