
Unique Value Enforcement

Swifty Team Mar 12, 2026 3 min read

Duplicate data is a persistent problem in any system where multiple people enter records. A customer entered twice by different team members. A product code duplicated because a search didn't surface the existing entry. A reference number that was supposed to be unique but wasn't enforced. Cleaning up duplicates is tedious work, and it happens because the system didn't prevent the problem at entry time.

Unique value enforcement prevents duplicates at the data level, before they happen.

Field-Level Uniqueness

Any field can be configured to require unique values across all records in the object type. Enable the "unique" constraint in the field configuration, and saving a record with a value that already exists in that field on another record will be rejected with a clear error message.

Common candidates: reference numbers, email addresses, product codes, usernames, national ID numbers, registration numbers. Any field where the value should uniquely identify a record.
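The mechanics can be pictured with a minimal in-memory sketch. All names here (`FieldConfig`, `ObjectType`, `save`) are hypothetical, invented for illustration; the actual product configures this through the field settings UI rather than code:

```python
from dataclasses import dataclass

@dataclass
class FieldConfig:
    name: str
    unique: bool = False  # the "unique" constraint toggle

class ObjectType:
    """In-memory stand-in for an object type with unique fields."""

    def __init__(self, fields):
        self.fields = list(fields)
        self.records = []

    def save(self, record):
        # Reject the save if a unique field's value already exists on another record.
        for cfg in self.fields:
            if not cfg.unique:
                continue
            value = record.get(cfg.name)
            if any(r.get(cfg.name) == value for r in self.records):
                raise ValueError(f"A record with this {cfg.name} already exists.")
        self.records.append(record)
```

The key point the sketch captures: the check runs on every save, so the duplicate never reaches storage.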

Compound Uniqueness

Some uniqueness requirements span multiple fields. A customer's order number might be unique per customer but not globally — order #1001 can exist for two different customers, but not twice for the same customer. Compound uniqueness expresses this as a rule: the combination of "customer" and "order number" must be unique.

Compound uniqueness constraints are configured in the object settings, specifying the set of fields whose combination must be unique.
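Conceptually, a compound constraint reduces each record to a key tuple over the chosen fields and rejects a save whose tuple already exists. A minimal sketch, with hypothetical names:

```python
class CompoundUnique:
    """Uniqueness over a combination of fields, e.g. (customer, order_number)."""

    def __init__(self, *fields):
        self.fields = fields

    def key(self, record):
        # The combination that must be unique, as a tuple.
        return tuple(record.get(f) for f in self.fields)

    def violated_by(self, records, new_record):
        new_key = self.key(new_record)
        return any(self.key(r) == new_key for r in records)
```

With `CompoundUnique("customer", "order_number")`, order #1001 passes for two different customers but fails for the same customer twice, matching the example above.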

Clear Feedback on Violation

When a save is attempted with a value that violates a unique constraint, the error is specific: "A record with this reference number already exists." The error appears next to the relevant field in the form, not as a generic failure message.

For compound constraints, the error identifies which combination of fields is the source of the conflict.
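One way to model field-scoped feedback is to return errors keyed by field name, so the form can render each message next to the offending input. A hypothetical sketch (the real product's error format may differ):

```python
def validation_errors(records, unique_fields, compound_fields, new_record):
    """Return per-field error messages instead of one generic failure."""
    errors = {}
    for f in unique_fields:
        if any(r.get(f) == new_record.get(f) for r in records):
            errors[f] = f"A record with this {f} already exists."
    if compound_fields:
        key = tuple(new_record.get(f) for f in compound_fields)
        if any(tuple(r.get(f) for f in compound_fields) == key for r in records):
            # Attach the same message to every field in the conflicting combination.
            msg = "The combination of " + " and ".join(compound_fields) + " already exists."
            for f in compound_fields:
                errors.setdefault(f, msg)
    return errors
```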

Case Sensitivity Control

Text field uniqueness can be configured as case-sensitive or case-insensitive. Case-insensitive uniqueness treats "ACME Corp" and "acme corp" as duplicates — appropriate for most business identifiers. Case-sensitive uniqueness treats them as distinct — appropriate for identifiers where casing is meaningful.
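In effect, case-insensitive uniqueness compares normalized values rather than raw ones. A one-function sketch of that normalization (hypothetical helper, not the product's API):

```python
def uniqueness_key(value, case_sensitive=False):
    """Normalize a text value for uniqueness comparison."""
    if isinstance(value, str) and not case_sensitive:
        # casefold() handles non-ASCII casing more thoroughly than lower()
        return value.casefold()
    return value
```

Under this scheme, "ACME Corp" and "acme corp" collide when `case_sensitive=False` and remain distinct when `case_sensitive=True`.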

Imports and Integrations

Unique constraints apply consistently, regardless of how data enters the system. Records imported from a file, synchronized from an integration, or created through an API call are all subject to the same uniqueness rules. There are no "backdoors" that bypass the constraint.

When an import contains duplicate values that would violate uniqueness, the import either rejects the conflicting rows with error detail or updates the existing records (depending on the configured import behavior). The unique constraint is always enforced.
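The two import behaviors can be sketched as a single loop over incoming rows, with a hypothetical `on_conflict` switch standing in for the configured import behavior:

```python
def import_rows(existing, rows, key_field, on_conflict="reject"):
    """Import rows, enforcing uniqueness on key_field.

    on_conflict="reject": conflicting rows are skipped and reported with detail.
    on_conflict="update": conflicting rows update the existing record instead.
    """
    index = {r[key_field]: r for r in existing}
    imported, errors = [], []
    for i, row in enumerate(rows):
        k = row[key_field]
        if k in index:
            if on_conflict == "update":
                index[k].update(row)
            else:
                errors.append((i, f"Duplicate {key_field}: {k!r}"))
        else:
            existing.append(row)
            index[k] = row  # so duplicates within the same file are caught too
            imported.append(row)
    return imported, errors
```

Either way, no path lets a duplicate key slip into `existing`, which is the guarantee the post describes: the constraint is always enforced.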
